How to create an effective custom RFC function to integrate millions of records super fast with SAP BODS

Transfer millions of entries fast

As an ABAPer, I have had the opportunity to support several systems and different teams in every project. SAP offers many tools to integrate SAP with legacy systems. One such tool is SAP BO Data Services (BODS), an ETL (Extract, Transform and Load) tool for SAP. SAP Data Services is a wonderful tool for data integration, data profiling and data processing. It helps to integrate and transform trusted data into a data warehouse system for analytical reporting. SAP BODS also serves as the management console for scheduling jobs, and it comprises a smooth UI development interface, a metadata repository and data connectivity to source and target systems. If you want to learn more about SAP BODS, please let us know in the comments section so that we can plan some useful articles on the subject.

We can use SAP remote function calls (RFCs) in queries created in Data Services data flows. In addition, Data Services provides the SAP application BAPI interface to support remote function calls designed for business transactions (BAPIs). We use the SAP application datastore to import BAPI function metadata. SAP functions that are not RFC-enabled can be used in ABAP data flows only with the following restrictions:

  • The function can have multiple input parameters, but they must all be scalar. The function cannot use table parameters.
  • For the output, you can select only one scalar parameter. Data Services cannot use normal functions in regular data flows, because a normal SAP application function is not an RFC function.

However, we can write a wrapper for a normal function, which turns it into an RFC-enabled one. In that case the normal function, wrapped inside an RFC-enabled function module (FM), is supported in data flows, including its table parameters.
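Here is a minimal sketch of such a wrapper, assuming a normal (non-RFC) function Z_READ_PRICES and a flat structure ZPRICE; those names, like the wrapper name Z_RFC_WRAP_READ_PRICES, are hypothetical and only illustrate the pattern.

```abap
FUNCTION z_rfc_wrap_read_prices.
*"----------------------------------------------------------------------
*" Hypothetical RFC-enabled wrapper (processing type: Remote-Enabled).
*"  IMPORTING
*"     VALUE(IV_MATNR) TYPE  MATNR
*"  TABLES
*"      ET_PRICES STRUCTURE  ZPRICE OPTIONAL
*"  EXCEPTIONS
*"      NOT_FOUND
*"----------------------------------------------------------------------

  " Delegate to the normal function; only this wrapper is exposed via RFC,
  " so BODS can call it, table parameter included.
  CALL FUNCTION 'Z_READ_PRICES'
    EXPORTING
      iv_matnr  = iv_matnr
    TABLES
      et_prices = et_prices
    EXCEPTIONS
      not_found = 1
      OTHERS    = 2.

  IF sy-subrc <> 0.
    RAISE not_found.
  ENDIF.

ENDFUNCTION.
```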

Also read: Applying Enterprise Integration Patterns in SAP ABAP, from the same author.

So what is so special about this? Well, we are talking about loading millions of records into SAP with SAP BODS. Imagine you have to load 20 million sales movements, and it has to be done the fastest way possible.

Here is where we get into action. To create an RFC that will be called from SAP BODS, there are some special considerations:

  1. First and foremost, the function module should be remote-enabled.
    RFC FM
  2. All the input parameters should be marked as Pass Value.
    Pass by Value
  3. If you have any table parameters, they need to be defined in the Tables section when you create the function module in transaction SE37, and marked as optional.

Table Parameters

4. The table parameters should be typed with LIKE, even if this option doesn’t appear in the drop-down.

LIKE not in dropdown

This kind of typing (i.e. LIKE typing) is marked as obsolete in the SAP ABAP help documentation, but that does not apply to SAP BODS integration. One very common error developers face when consuming an RFC from BODS happens because the ABAPer created the RFC FM with the formal parameter typed as CHANGING, and it does not work that way from BODS.

5. Define basic exceptions so that you can raise an error if the parameters are not in the correct format or the tables are empty (a combined sketch of considerations 1 to 5 follows after the screenshot below).

Exceptions in FM
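Pulling considerations 1 to 5 together, the interface of such a remote-enabled function module could look like the sketch below. Z_RFC_STAGE_SALES, ZSTG_SALES and the parameter names are assumptions used only for illustration; the real names come from your own project.

```abap
FUNCTION z_rfc_stage_sales.
*"----------------------------------------------------------------------
*" Hypothetical remote-enabled FM to be called from BODS.
*"  IMPORTING
*"     VALUE(IV_LOAD_DATE) TYPE  DATUM          " pass by value
*"  TABLES
*"      IT_SALES STRUCTURE  ZSTG_SALES OPTIONAL " typed with LIKE, optional,
*"                                              " not a CHANGING parameter
*"  EXCEPTIONS
*"      NO_DATA
*"      INVALID_DATE
*"      INSERT_FAILED
*"----------------------------------------------------------------------

  " Basic validations raise classic exceptions that BODS can evaluate.
  IF it_sales[] IS INITIAL.
    RAISE no_data.
  ENDIF.

  IF iv_load_date IS INITIAL.
    RAISE invalid_date.
  ENDIF.

  " The staging logic for consideration 6 follows in a later sketch.

ENDFUNCTION.
```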

6. Most of the time, when legacy information is integrated with SAP, two-phase staging is recommended for high-volume interfaces. Just as inbound IDocs are usually not processed immediately, in BODS a direct call to the BAPI or transaction can cause overhead on the system or block other users. So you should consolidate the information in a Z custom table, and every time before you load data you should empty that table with a truncate:

Truncated Table

This is better than trying to DELETE. Remember, we are loading millions of records into SAP, and every day the data loaded the day before has to be removed; deleting a table of 20 million rows with a DELETE statement is not the best option for system performance (a sketch follows after the footnotes below).

** Staging is a commonly used term in SAP BODS. It means the place the information flows through before it reaches the final target. For example, when you load information from legacy systems to SAP, you might first save it, or stage it, in some other database (commonly a SQL database), work on it (transform it if necessary) and then persist it in SAP. After all, SAP BODS is another ETL (Extract, Transform and Load) tool. 🙂

** Truncate is better than Delete because it does not generate a log in the database and it is faster.
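Continuing the hypothetical Z_RFC_STAGE_SALES sketch, the staging part only inserts the rows into the Z custom table, while a separate cleanup step empties that table with a native TRUNCATE before each daily load. The ADBC call shown here is one common way to issue a TRUNCATE from ABAP; ZSTG_SALES remains an assumed table name.

```abap
  " --- Inside Z_RFC_STAGE_SALES: stage the rows, do not post them yet. ---
  " Posting via the BAPI or transaction runs later as a separate batch step.
  INSERT zstg_sales FROM TABLE it_sales ACCEPTING DUPLICATE KEYS.
  IF sy-subrc <> 0.
    RAISE insert_failed.
  ENDIF.
  COMMIT WORK.

  " --- Separate cleanup step before each daily load: TRUNCATE via ADBC. ---
  " TRUNCATE does not write row-level database logs, so it is much cheaper
  " than DELETE FROM zstg_sales for tables with millions of entries.
  TRY.
      NEW cl_sql_statement( )->execute_ddl( 'TRUNCATE TABLE ZSTG_SALES' ).
    CATCH cx_sql_exception INTO DATA(lx_sql).
      MESSAGE lx_sql->get_text( ) TYPE 'I'.
  ENDTRY.
```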

Also read: DELETE in LOOP. Is it still a TABOO?

After that, you need to activate the function module and notify the SAP BODS developer so they can consume your function.

RFC FM for BODS

This is the developer logon screen on the BODS side.

BODS log on

You will need to import the RFC function by its name.

Function call in BODS

Once this is done, it can be used in any ETL job in the project. In this case, we used the function to achieve parallelism, calling it from three different workflows.

ETL in BODS

Each workflow queries a range of information from the legacy system and then loads it into SAP using the function. The input parameters for the function call are defined inside each workflow.

Function in BODS

Once every step has a green signal, we execute the job. For this run, we are going to load 15 million records.

BODS run for 15 million entries

Once the job has completed successfully, we can check the target table in SAP to validate that the 15.7 million entries were populated, which the image below confirms.
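A quick way to do that check from the ABAP side is a simple row count on the staging table (again using the assumed name ZSTG_SALES); the "Number of Entries" button in SE16 gives the same answer.

```abap
" Count the rows that arrived in the staging table after the BODS job.
SELECT COUNT(*) FROM zstg_sales INTO @DATA(lv_rows).
WRITE: / |Rows staged: { lv_rows }|.
```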

Is HANA an alternative?

So…What Do You Think?

Now we want to hear from you.

What do you think of this integration? Do you see any points for improvement? Have you done anything similar?

Either way, please leave a quick comment below right now to let us know.


About the Author

Carlos Alberto Ron C.
Carlos is an SAP Certified Consultant who holds a Computer Sciences degree from State University, Monterrey Mexico UNAL. He has more than 15 years of IT experience and a diverse résumé. He has worked mostly in the retail, Afore and consulting industries. His core skills are data integration and software development. He loves the challenge of leading new technology projects and learning new development platforms, proactively and with discipline. Find more about him on LinkedIn.

4 Comments on "How to create effective custom RFC function to integrate millions of records super fast with SAP BODS"

  1. Hi Kumar, loading to SAP HANA is similar to any other datastore configuration; it just has a few more steps because you also need to configure the import server and create and import the target structure. Please refer to the guide ds_42_sap_en.pdf

  2. Hi Team, we have a scenario where we need to upload material master data through the HANA migration cockpit, where BODS is also required. Has anyone done this?

  3. You are correct, Haarish; with SAP HANA and the SAP BODS Workbench you can do it in just a glimpse. Thanks for the comment.

  4. Mohamed Haarish | July 4, 2017 at 11:22 pm | Reply

    Hello Carlos,

    I appreciate your article on BODS. Adding to this, there is a way to load millions of records in almost no time through BODS and HANA integration, which is very easy even for naïve users like many of us. Hope to see more articles on BODS on HANA in the future!

    Regards,
    Haarish
    Fujitsu Japan
