Extracting data from SAP ECC

WRITTEN BY: supportmymoto.com STAFF

Many SAP Data Services (DS) applications use SAP ECC data as a source. As it happens, DS supports several mechanisms to extract data from SAP ECC. Which one to choose? There usually is a preferred method, depending on the functionality required and on the capabilities provided within the specific context of the source system. In this blog, I discuss all the different options.

Start with configuring a datastore for the SAP source system:

Figure 1: SAP Datastore definition


Important settings are the User Name and Password, the name (or IP address) of the SAP Application Server, and the Client and System (instance) number.

1/. “Regular” data flow

When extracting from a single table, do that in a regular data flow. Import the table definition from the Metadata Repository in the Datastore Explorer and use it as source in a data flow.

An extraction from the KNA1 customer master data table looks like this:

Figure 2: Data flow – extract from KNA1


Note the little red triangle in the source table icon. It is an indication of a direct extract from the SAP data layer.

This is the simplest method. No special measures are necessary at the level of the source system. The extraction runs as an SAP dialog job. Note that there is a time-out setting (system-configurable, typically 10, 30 or 60 minutes) for dialog jobs. The DS job will fail when the data extraction takes longer.

The where-conditions are pushed down to the underlying database, which is beneficial for job performance. Also, make sure to extract only the columns that are really required. The duration of the extraction process is linearly related to the data volume, which equals number of records * average record size. The less data transferred, the faster the DS job runs.

Figure 3: Query transform – extract from KNA1, excluding obsolete records

Figure 4: Generated SQL code – the where-clause is pushed down


This approach is especially helpful when implementing incremental loads. When the source table contains a column with a last-modification timestamp, you can easily implement source-based changed-data capture (CDC). Keep track of the timestamps you used in the previous incremental extraction (use a control table for that), initialize global variables with those values and use them in the where-clause of your Query transform.

Figure 5: Query transform – extract recently created or modified records from MARA

Figure 6: Generated SQL code – source-based CDC
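The bookkeeping behind this pattern can be sketched in Python. This is a minimal sketch under stated assumptions: the dict stands in for the DS control table, MARA's last-change date field LAEDA is used as the timestamp column, and in a real job the window boundaries would be loaded into global variables and substituted into the Query transform's where-clause.

```python
from datetime import date

def build_cdc_where(control: dict, table: str, today: date) -> tuple[str, date]:
    """Build the incremental where-clause from the timestamp recorded for
    this table in the control table; on the first run, extract everything."""
    start = control.get(table, date(1900, 1, 1))
    # DS pushes this predicate down to the underlying database (Figure 6).
    where = f"LAEDA >= '{start:%Y%m%d}' AND LAEDA < '{today:%Y%m%d}'"
    return where, today

def commit_watermark(control: dict, table: str, new_start: date) -> None:
    """After a successful load, save the end of the extracted window so the
    next run only picks up records created or changed since then."""
    control[table] = new_start
```

On the first run the predicate covers everything since 1900; once the watermark has been committed, each subsequent run only reads the delta since the previous one.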


2/. ABAP data flow

Although where-conditions are pushed down to the underlying database from a regular data flow, joins are not (sort and group-by operations aren't either!), often leading to abominable performance, especially when dealing with larger data volumes.

If you want to extract material master data from MARA and supplement each record with the English material description from MAKT, you can build a data flow like this:


Figure 7: Data flow – extract from MARA and MAKT

Figure 8: Query transform – join MARA and MAKT on the MANDT and MATNR columns

Figure 9: Generated SQL code – no join

DS generates two SQL statements. It first extracts all current records from MARA. Then, for each individual record, it retrieves the corresponding English description (MATNR = AIVariable_1 and MANDT = AIVariable_2). That approach leads to as many round-trips to the underlying database as there are records in the MARA table! It is only a valid approach when dealing with smaller data sets.

You can improve performance by changing the properties of the source tables.

The default settings are:

Figure 10: Default source table properties


Making MARA the driving table (by giving it a higher Join Rank than MAKT) and caching MAKT in memory leads to the generation of completely different SQL code, without individual round-trips to the database. The MARA table streams through the data flow, the MAKT table is cached, and the join is resolved in DS memory.

Figure 11: Modified source table properties

Figure 12: Generated SQL code – cache MAKT
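The effect of the property change can be illustrated with a small Python sketch, using hypothetical row dicts; `lookup` stands in for the per-record SQL statement DS fires against MAKT in the default plan.

```python
def join_per_record(mara_rows, lookup):
    """Default plan: one database round-trip per MARA record."""
    return [dict(r, MAKTX=lookup(r["MANDT"], r["MATNR"])) for r in mara_rows]

def join_cached(mara_rows, makt_rows):
    """Tuned plan: read MAKT once, build an in-memory hash table keyed on
    MANDT/MATNR, and resolve the join in DS memory while MARA streams through."""
    cache = {(m["MANDT"], m["MATNR"]): m["MAKTX"] for m in makt_rows}
    return [dict(r, MAKTX=cache.get((r["MANDT"], r["MATNR"]))) for r in mara_rows]
```

Both plans produce the same result, but the first issues as many lookups as there are MARA records, while the second issues a single scan of MAKT; that is also why the cached plan only pays off while MAKT fits in memory.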


The feasibility of this approach is impacted by 2 parameters:

  • The amount of memory available for caching
  • The time it takes to cache the MAKT table

This works perfectly for smaller tables. But it is not a well-performing solution either when MAKT and MARA are too large.

The recommended solution for extracting from a join of SAP tables is through the use of an ABAP data flow.

Figure 13: ABAP data flow – extract from MARA and MAKT


DS generates ABAP code corresponding to the properties of the source tables and the logic of the data flow. The table with the highest Join Rank becomes the driving table. In this case, too, the duration of the extraction process is linearly related to the data volume: the less data transferred, the faster the DS job runs.

Figure 14: Generated ABAP code


The ABAP code is pushed to the SAP system and executed there. Only the program results are sent back to DS for further processing. This approach only works when the SAP system is open for development! Also, make sure that:

  • The ABAP execution option is set to Generate and Execute.
  • The Execute in Background (batch) property is set to Yes, to avoid the time-out on SAP dialog jobs.
  • The Data transfer method is set to RFC. The RFC destination must be defined in the SAP system. The other data transfer methods exist for upward-compatibility reasons only and should not be used anymore; they all lead to inferior performance.

Figure 15: SAP Datastore definition


To execute the same DS job in the non-development tiers of your landscape, transport the ABAP program from DEV to TST, PRD… first. Set the ABAP execution option to Execute Preloaded and run the DS job. It will not generate the ABAP code again, but execute the transported code.


ABAP data flows are also a convenient solution for implementing incremental loads from ECC tables that do not contain a last-modification timestamp column. Inserts and updates in KNA1 are registered in the CDHDR and CDPOS change-log tables. Use an ABAP data flow to join KNA1 to those tables. Make sure CDHDR gets the highest Join Rank and KNA1 the lowest, in order to get the most efficient code generated. And include the where-conditions to:

  • Filter the current customers
  • Get the recently modified records only
  • From the correct entries in the log tables
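The generated ABAP implements roughly the following logic, sketched here in Python with toy rows. This is a sketch under stated assumptions: customer change documents are assumed to use object class DEBI, and OBJECTID is matched directly to KUNNR, ignoring the number padding a real system would need.

```python
def changed_customers(cdhdr, cdpos, kna1, since):
    """Drive from CDHDR (highest Join Rank), keep only recent customer
    change documents, check they have items in CDPOS, then pick up the
    current master record from KNA1."""
    by_kunnr = {r["KUNNR"]: r for r in kna1}
    with_items = {p["CHANGENR"] for p in cdpos}
    out = []
    for hdr in cdhdr:
        if hdr["OBJECTCLAS"] != "DEBI":        # customer change documents only
            continue
        if hdr["UDATE"] < since:               # recently modified records only
            continue
        if hdr["CHANGENR"] not in with_items:  # correct entries in the log tables
            continue
        rec = by_kunnr.get(hdr["OBJECTID"])    # current customers only
        if rec is not None:
            out.append(rec)
    return out
```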

Figure 16: ABAP data flow – extract from KNA1

Figure 17: Query transform – join KNA1 with CDHDR and CDPOS

Figure 18: Query transform – extract recently created or modified records from KNA1

3/. SAP Extractor

SAP Extractors are tailored to BW Business Content. They already contain all the logic for common business transformations and possible aggregations, and they know how to identify changes, which makes them very suitable for implementing incremental loads.

The DS Extractor feature is built on top of the Operational Data Provisioning (ODP) API. DS supports all types of ODP sources supported by the ODP API, including CDC functionality.

In a regular data flow, DS uses RFC to call the ODP data replication API. The where-conditions are not pushed down to the extractor, which means that all records are pulled from the extractor and filtering is done afterwards in DS. Import the ODP object definition from the Metadata Repository in the Datastore Explorer.

Figure 19: Import extractor 0CUSTOMER_ATTR


Make sure to set the Extraction mode to Query. Then use it as source in a data flow. An extraction from the 0CUSTOMER_ATTR ODP object looks like this:

Figure 20: Data flow – extract from 0CUSTOMER_ATTR


If you want to extract only a minor subset of the data, use the ODP object as source in an ABAP data flow.

Figure 21: ABAP data flow – extract from 0CUSTOMER_ATTR


Add the where-clause to the Query transform.

Figure 22: Query transform – extract current American customers from 0CUSTOMER_ATTR


DS generates the ABAP that calls the ODP data replication API. The generated code contains the logic for filtering out unnecessary records.

Figure 23: Generated ABAP code


Implementing CDC is really a piece of cake for extractors that are “delta-enabled”. Make sure to set the Extraction mode to Changed-data capture (CDC) when importing the ODP object.

Figure 24: Import extractor 0PROJECT_ATTR


Then use it as source in a data flow. There is no need to add any time-based condition in the where-clause: the extractor logic guarantees that only new and modified records are passed to DS. Just make sure the Initial load property of the ODP object is set to No. Only set it to Yes when you want the target table to be re-initialized.

Figure 25: ODP object properties


Include a Map_CDC_Operation transform to automatically synchronize the target table with the source object. The transform translates the Row Operation value into the corresponding DS row type:

  • I > Insert
  • B & U > before- and after-image of an Update
  • D > Delete
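The effect of the transform on the target table can be sketched as follows. This is a minimal Python sketch, not the transform itself: `rows` carries (Row Operation, record) pairs as delivered by the delta-enabled extractor, and the field names are hypothetical.

```python
def apply_cdc(target, rows, key):
    """Mimic Map_CDC_Operation: map each Row Operation onto the DS row type
    applied to the target table."""
    for op, rec in rows:
        if op in ("I", "U"):       # Insert, or after-image of an Update
            target[rec[key]] = rec
        elif op == "D":            # Delete
            target.pop(rec[key], None)
        # "B" rows carry the before-image of an Update and change nothing here
    return target
```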

Figure 26: Data flow – delta extract from 0PROJECT_ATTR


4/. ECC function

DS can call RFC-enabled ECC functions that return tables as data flow sources, as well. If a standard ECC function is not RFC-enabled, you need an RFC-enabled wrapper function that passes the parameters to the standard function, calls it and forwards the results to DS.

You can only import a function's metadata by name. Call it from a Query transform by selecting New Function Call… from the pop-up menu in its output schema. Select the function from the ECC datastore. Define the Input Parameter(s) and select the Output Parameter. The function call is added to the output schema.

Figure 27: Query transform – call an ECC function


Then unnest the return results in a subsequent Query transform before writing them to a target table.

Figure 28: Query transform – unnest the function output schema
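Conceptually, the unnest step flattens the nested return table into plain rows, as in this Python sketch (the column names are hypothetical):

```python
def unnest(rows, nested_col):
    """Flatten a nested output schema: every entry of the nested table
    becomes one output row, combined with the parent-level columns."""
    flat = []
    for row in rows:
        parent = {k: v for k, v in row.items() if k != nested_col}
        for item in row[nested_col]:
            flat.append({**parent, **item})
    return flat
```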

5/. SAP LT (SLT) Replication Server

DS can use SLT Replication Server as a source. It is a very neat and elegant way to build CDC jobs in DS. Working with SLT objects is similar to the way DS works with SAP extractors.

Define the SLT datastore like any other SAP datastore. Just make sure you select the right ODP context in the definition.

Figure 29: SLT Datastore definition


You can then import the tables you need to extract data from.

Figure 30: SLT table metadata


Use the ODP object as source in a data flow. The first time the data flow is executed, it will do a full extract of the underlying table. All successive runs will automatically perform an incremental extract and only forward the delta.

Figure 31: Data flow – delta extract from MARA through SLT


6/. IDOC

DS real-time jobs can read from IDoc messages and IDoc files. DS batch jobs can only read from IDoc files.

Import the IDoc metadata definition by name from the SAP datastore. Use the IDoc as a file source in a data flow within a batch job.

Figure 32: Data flow – delta extract from an IDoc File Source


Double-click the IDoc icon to open its definition and specify the name of the IDoc file. You can use wildcards (? and *) or list multiple file names, separated by commas, if you want to process multiple files in a single data flow.

Figure 33: IDoc properties


Generate the IDoc file(s) from SAP and run your DS job.
