
What's new in R2022-05

Big Data: new features

Feature

Description

Available in

Support of the latest Delta Lake API versions Talend Studio now supports the following Delta Lake API versions, aligning with the latest supported Databricks runtime versions:
  • 1.0.0 for Spark 3.1.x
  • 1.1.0 for Spark 3.2.x

All subscription-based Talend products with Big Data

Support of Google Dataproc 2.0.x on Spark Universal 3.1.x with Hive components in Standard Jobs You can now run your Standard Jobs containing Hive components on a Google Dataproc cluster using Spark Universal with Spark 3.1.x. You can configure it in the Basic settings view of the following Hive components:
  • tHiveConnection
  • tHiveCreateTable
  • tHiveLoad
  • tHiveInput
  • tHiveRow

All subscription-based Talend products with Big Data

Beta
Support of Databricks runtime 10.x and onwards on Spark Universal 3.2.x
You can now run your Spark Batch and Streaming Jobs on job and all-purpose Databricks clusters, both on AWS and Azure, using Spark Universal with Spark 3.2.x. You can configure it either in the Spark Configuration view of your Spark Jobs or in the Hadoop Cluster Connection metadata wizard.

When you select this mode, Talend Studio is compatible with Databricks 10.x versions.

All subscription-based Talend products with Big Data

Support of CDP Private Cloud Base with Knox and Impala in Standard Jobs If you use CDP Private Cloud Base to run your Standard Jobs, you can now make use of both Knox and Impala.

All subscription-based Talend products with Big Data

Possibility to open other versions of Spark Batch and Streaming Jobs/Joblets in Talend Studio You can now open another version of your Spark Batch and Streaming Jobs and Joblets in Talend Studio. The Open another version option, available in the Repository tree view when you right-click a Spark Job or Joblet, lets you access older versions or create a new one.

All subscription-based Talend products with Big Data

Data Integration: new features

Feature

Description

Available in

Management of Talend Studio update settings from Talend Management Console

The update settings of Talend Studio can now be managed by your administrator from Talend Management Console. Talend Studio automatically retrieves the update settings that your administrator configured when you log in to a project managed by Talend Management Console.

For more information, see Configuring update repositories.

All Talend Cloud products and Talend Data Fabric

Enhancement of the database connection wizard to support the prompt functionality on context variables The database connection metadata wizard now supports the prompt functionality on context variables. When you perform an operation that requires the database connection, you are first prompted to edit the value of any prompt-enabled context variable used as a database connection parameter.
Note: This feature is not supported for HBase, Hive, and Impala.

All subscription-based Talend products with Talend Studio

Support of multi-line input for credential properties of components and metadata wizards created in the Talend Component Kit framework Talend Studio now provides the following two modes for entering the value of a credential property for components and metadata wizards created in the Talend Component Kit framework.
  • Pure password mode (for both components and metadata wizards): allows you to enter a multi-line value. Everything you enter is taken as the value of the credential property.
  • Java mode (for components only): allows you to enter the value inside double quotes, following Java string conventions.

For more information about the Talend Component Kit framework and the components and metadata wizards created in this framework, see Developing a component using Talend Component Kit.
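The difference between the two modes can be illustrated with a short sketch. The class and values below are hypothetical, not Talend code; they only show how raw input (pure password mode) differs from a Java string literal (Java mode), where double quotes and backslashes must be escaped:

```java
// Hypothetical sketch (not Talend code) contrasting the two modes.
public class CredentialModes {
    // Pure password mode: every character typed, including line
    // breaks, is taken verbatim as the credential value.
    public static String purePassword() {
        return "first-line\nsecond-line";
    }

    // Java mode: the field holds a Java string literal, so double
    // quotes and backslashes must be escaped per Java conventions.
    public static String javaMode() {
        return "p@ss\"word";   // typed with the escaped quote in the field
    }

    public static void main(String[] args) {
        System.out.println(purePassword().lines().count()); // 2
        System.out.println(javaMode());                     // p@ss"word
    }
}
```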

All subscription-based Talend products with Talend Studio

Enhancement of tSAPTableInput to support multi-table query with Talend SAP RFC Server using dynamic schema

tSAPTableInput now supports multi-table queries with Talend SAP RFC Server through a dynamic schema enhancement.

All subscription-based Talend products with Talend Studio

Support of zip4j 2.x libraries in tFileArchive and tFileUnarchive

tFileArchive and tFileUnarchive now support zip4j 2.x libraries.

All subscription-based Talend products with Talend Studio

Enhancements in tFileOutputJSON and tWriteJSONField to support dynamic datatype and avoid scientific notation

This release provides the following two enhancements for the tFileOutputJSON and tWriteJSONField components:

  • Support of the dynamic datatype.
  • An option to avoid scientific notation for numbers.
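The scientific-notation option matters because Java renders large (and very small) doubles in scientific notation by default. A minimal sketch of the underlying behavior and the usual remedy; the class name and values are hypothetical, not Talend code:

```java
import java.math.BigDecimal;

// Hypothetical sketch (not Talend code): Java renders doubles with a
// magnitude of 10^7 or more in scientific notation by default, while
// BigDecimal.toPlainString() never uses scientific notation.
public class PlainNumberDemo {
    // Render a double without scientific notation.
    public static String plain(double value) {
        return new BigDecimal(Double.toString(value)).toPlainString();
    }

    public static void main(String[] args) {
        double amount = 12000000.0;
        System.out.println(Double.toString(amount)); // 1.2E7
        System.out.println(plain(amount));           // 12000000
    }
}
```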

All subscription-based Talend products with Talend Studio

HttpClient library upgraded to version 4.5.13 in tFileFetch

The HttpClient library is upgraded to version 4.5.13 in tFileFetch.

All subscription-based Talend products with Talend Studio

Data Quality: new features

Feature

Description

Available in

Support of Apache Spark 3.1 in local mode DQ components now support Apache Spark 3.1 in local mode, except for tMatchIndexPredict and tMatchIndex.

All Talend Platform and Data Fabric products

Application Integration: new features

Feature

Description

Available in

New option in cSetHeader to remove headers

A new Headers to remove option is added to the cSetHeader component, allowing you to remove message headers.

All subscription-based Talend products with ESB

Continuous Integration: new features

Feature

Description

Available in

Talend CI Builder available from the Talend repository from version 8.0.4 onwards From version 8.0.4 onwards, Talend CI Builder is available in the official Talend repository.

All subscription-based Talend products with Talend Studio

Support of CVE detection for Routes

The detection of fixed CVEs for Route artifacts is now supported.

All subscription-based Talend products with Talend Studio
