
tSAPDataSourceReceiver Standard properties

These properties are used to configure tSAPDataSourceReceiver running in the Standard Job framework.

The Standard tSAPDataSourceReceiver component belongs to the Business family.

The component in this framework is available in all Talend products.

Basic settings

Failover

Select this option to use the RFC server failover feature. This feature is implemented by deploying brokers through the rfc.server.remote.broker.url parameter in the Remote broker section of the tsap-rfc-server.properties file. After selecting this option, provide the IP address and port number of each broker in the Host and Port columns of the table that appears; you can find these values by checking the rfc.server.remote.broker.url parameter in that file. See Configuring the tsap-rfc-server.properties file for related information.
Note: This option is available only if you have installed the R2021-01 Studio Monthly update or a later one delivered by Talend. For more information, check with your administrator.
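
For example, assuming the Remote broker section of your tsap-rfc-server.properties file declares two brokers as follows (the addresses, ports, and value format are illustrative only; use the values actually present in your file):

  rfc.server.remote.broker.url=tcp://192.168.0.11:61616,tcp://192.168.0.12:61616

In that case, you would add two rows to the table: 192.168.0.11 and 61616 in the first row, and 192.168.0.12 and 61616 in the second.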

Talend SAP RFC Server host

Enter the IP address of the Talend SAP RFC server, or localhost if it is installed on your local machine.

Port

Enter the port used by the Talend SAP RFC server.

User and Password

Enter the username and password used to connect to the Talend SAP RFC server.

To enter the password, click the [...] button next to the password field, and then, in the pop-up dialog box, enter the password between double quotes and click OK to save the settings.
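
For example, if the password is myPassword (an illustrative value), enter the following in the pop-up dialog box:

  "myPassword"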

Use SSL Transport

Select this check box to use the SSL transport mechanism. For more information about SSL transport, see SSL Transport Reference.

Datasource name

Specify the name of the Data Source system.

Mode

Select one of the following modes for this component from the drop-down list (see the sketch after this list).
  • Keep running forever: Select this mode if you want this component to keep listening for the coming data requests until you stop the execution of the Job. With this mode selected, you can also specify the following two properties:
    • Maximum duration (seconds): Specify the period of time (in seconds) for which the Job should continue to consume any data requests on the RFC Server. Once this time limit is reached, the Job stops consuming and closes the connection to the RFC Server.

    • Maximum number of data to consume: Specify the maximum number of data requests to be consumed by the Job. Once this limit is reached, the Job stops consuming and closes the connection to the RFC Server.

  • Batch (consume all data available then terminate): Select this mode if you want the Job to stop automatically once all data requests available on the RFC server are consumed. With this mode selected, you can also specify, in the Sleep time (seconds) field, how long (in seconds) the process waits for the first message to be read.
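
The following is a minimal, self-contained Java sketch of the termination behavior described above. It is an illustration only: the class, the fetchRequest() stub, and the field values are assumptions, not the component's actual generated code.

  public class ModeSketch {

      // Stub standing in for "read the next data request from the RFC server";
      // it returns null when no request is currently available.
      static String fetchRequest() {
          return null;
      }

      public static void main(String[] args) throws InterruptedException {
          boolean batchMode = false;     // false = Keep running forever, true = Batch
          long maxDurationSeconds = 60;  // Maximum duration (seconds), 0 = no limit
          int maxNumberOfData = 100;     // Maximum number of data to consume, 0 = no limit
          long sleepTimeSeconds = 5;     // Batch only: Sleep time (seconds)

          if (batchMode) {
              // Batch mode: wait for the first message to become available.
              Thread.sleep(sleepTimeSeconds * 1000L);
          }

          long start = System.currentTimeMillis();
          int consumed = 0;
          while (true) {
              String request = fetchRequest();
              if (request == null) {
                  if (batchMode) {
                      break; // Batch: everything available has been consumed.
                  }
                  Thread.sleep(100L); // Keep running forever: poll again shortly.
              } else {
                  consumed++; // Process the data request here.
              }
              if (!batchMode) {
                  if (maxDurationSeconds > 0
                          && System.currentTimeMillis() - start >= maxDurationSeconds * 1000L) {
                      break; // Maximum duration reached: stop consuming.
                  }
                  if (maxNumberOfData > 0 && consumed >= maxNumberOfData) {
                      break; // Maximum number of data reached: stop consuming.
                  }
              }
          }
          // When the loop ends, the Job closes the connection to the RFC server.
      }
  }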

Schema and Edit schema

A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. When you create a Spark Job, avoid the reserved word line when naming the fields.

  • Built-In: You create and store the schema locally for this component only.

  • Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs.

Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available:

  • View schema: choose this option to view the schema only.

  • Change to built-in property: choose this option to change the schema to Built-in for local changes.

  • Update repository connection: choose this option to change the schema stored in the repository and decide whether to propagate the changes to all the Jobs upon completion. If you just want to propagate the changes to the current Job, you can select No upon completion and choose this schema metadata again in the Repository Content window.

Advanced settings

tStatCatcher Statistics

Select this check box to gather the Job processing metadata at the Job level as well as at each component level.

Global Variables

ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared, for components that have this check box.

A Flow variable functions during the execution of a component while an After variable functions after the execution of the component.

To fill in a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
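
For example, in a tJava component placed downstream in the same Job, you can read the ERROR_MESSAGE After variable through the globalMap map that Talend Studio provides in the generated Job code. The component label tSAPDataSourceReceiver_1 below is an assumption; use the label shown in your own Job.

  // Retrieve the After variable produced by the receiver component and print it.
  String errorMessage = (String) globalMap.get("tSAPDataSourceReceiver_1_ERROR_MESSAGE");
  System.out.println(errorMessage);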

For further information about variables, see Talend Studio User Guide.

Usage

Usage rule

This component is usually used as a start component of a Job or subJob and it always needs an output link.
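
For example, a hypothetical subJob could start with this component and send the received data to a tLogRow component through its output row link:

  tSAPDataSourceReceiver_1 --(Main)--> tLogRow_1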

Limitation

Specific JAR and DLL files provided by SAP must be added to your Studio and to the workstation hosting your Studio. The exact procedure depends on your platform, such as 32-bit or 64-bit Windows, or Linux.
  • For more information, see How to install the SAP Java Connector and the "Centralizing SAP metadata" section in Talend Studio User Guide.

  • You can find an example of how to install the SAP Java Connector in Talend Help Center (https://help.talend.com).
