Talend Data Integration v7 Certified Developer
Add-on: This service comes with an additional cost.
Talend certification exams are designed to be challenging to ensure that you have the skills to successfully implement quality projects. Preparation is critical to passing.
This certification exam covers topics related to the development of Talend Studio data integration Jobs. Topics include, but are not limited to: using Talend Studio in stand-alone and collaborative team environments to build, analyze, and test Jobs; accessing files and databases; joining and filtering data; orchestrating complex tasks; and following best practices.
Certification exam details
Duration: 90 minutes
Number of questions: 55
Passing score: 70%
Exam content is updated periodically. The number and difficulty of questions may change. The passing score is adjusted to maintain a consistent standard.
Recommended experience
- At least six months of experience using Talend products
- General knowledge of data integration architecture and advanced features such as parallelization
- Experience with Talend Data Integration 7.x solutions, including manual installation and configuration, project management, user management, Job deployment strategies, and troubleshooting of common issues
Preparation
To prepare for this certification exam, Talend recommends:
- Taking the Data Integration Basics and Data Integration Advanced learning plans
- Studying the training material in the Talend Data Integration Certification Preparation training modules
- Reading product documentation and knowledge base articles on Talend Community
Badge
After passing this certification exam, you are awarded the Talend Data Integration Developer Certified badge. To learn more about the criteria for earning this badge, refer to the Talend Academy Badging program page.
Ready to register for your exam?
Connect to Talend Exam to register.
Certification exam topics
Getting started with Talend Data Integration
- Define Talend Data Integration
- Describe the Talend Studio UI
- Create a simple Job
Working with files
- Configure basic component properties
- Create and configure a schema
- Use the tMap component and configure a simple mapping
- Use pre-defined Talend Java functions
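As a quick illustration of the last two topics above, the following sketch shows the kind of Java expressions you might type into tMap output columns using Talend's predefined routines (StringHandling, TalendDate). The input flow name row1 and its columns are hypothetical, and these are expression fragments entered in component fields, not a standalone program.

```java
// Hypothetical tMap output-column expressions for an input flow "row1"
// with columns firstName, lastName, and birthDate.

// Normalize the last name to upper case:
StringHandling.UPCASE(row1.lastName)

// Concatenate first and last name into a single output column:
row1.firstName + " " + StringHandling.UPCASE(row1.lastName)

// Format the birth date as a String, and stamp the processing date:
TalendDate.formatDate("yyyy-MM-dd", row1.birthDate)
TalendDate.getCurrentDate()
```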
Joining and filtering data
- Define and configure Talend metadata
- Join two sources of data using the tMap component
- Define the tMap Join settings and the join reject capture mechanism
- Create a filter condition
- Configure a filter reject output along with multiple filtered outputs
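For instance, a tMap filter is a plain Java expression that returns a boolean, and rows that fail it can be routed to a reject output. A minimal sketch, assuming a hypothetical input flow row1 with country and amount columns:

```java
// Hypothetical filter expression typed into a tMap output's filter bar:
// keep only US rows with an amount above 100, guarding against null values.
row1.country != null && "US".equals(row1.country) && row1.amount > 100
```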
Using context variables
- Define a standard context variable use case
- Use context variables in a Job
- Run Jobs in multiple contexts
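Context variables declared in a Job's Contexts tab are exposed to component settings and Java code as fields of the context object. A minimal sketch, assuming hypothetical variables named inputDir and dbUser:

```java
// In a component field, e.g. the File name of a file input component:
context.inputDir + "/customers.csv"

// In a tJava component, context variables behave like ordinary Java fields:
System.out.println("Running with inputDir=" + context.inputDir
        + " and dbUser=" + context.dbUser);
```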
Error handling
- Use triggers to create a sequence of subJobs
- Use logging components in a Job design
- Create 'if' triggers based on component variables
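An 'if' trigger condition is a Java boolean expression that typically reads a component's return variables from globalMap. A minimal sketch, assuming a hypothetical tFileInputDelimited_1 component, runs the next subJob only when at least one row was read:

```java
// "Run if" trigger condition: NB_LINE is the component's row count,
// published to globalMap once the previous subJob completes.
((Integer) globalMap.get("tFileInputDelimited_1_NB_LINE")) > 0
```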
Working with databases
- Define a database metadata
- Set actions on tables and data
- Customize SQL queries in database components
- Use metadata, generic schemas, and context variables in database components
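The Query field of a database input component holds a Java String, so a customized SQL statement can be assembled with context variables. A minimal sketch, assuming hypothetical context variables tableName and minDate:

```java
// Hypothetical customized query for a database input component.
"SELECT id, name, created_at FROM " + context.tableName +
" WHERE created_at >= '" + context.minDate + "'"
```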
Orchestrating Jobs
- Explain a typical master Job use case
- Describe the order of precedence when passing parameters through context variables
- Send dynamic parameters to a child Job by overriding the context variables
- Explain Joblets and compare them to other orchestration primitives
- Refactor and create a Joblet from an existing Job
- Create a Joblet from the ground up
- Incorporate a Joblet into a Job
- Explain the different parallelization options available in Talend Studio
- Profile the execution of a Job with and without parallelism applied
Deploying Jobs
- Build a Job and understand its options
- Run a standalone Job
- Handle context variables in standalone Jobs
- Configure remote hosts in Talend Studio
- Launch a Job on a remote host
Project management
- Explain key differences between a local and remote connection
- Configure a remote connection in Talend Studio
- Explain the key concepts of revision control (Git)
- Perform revision control operations (switch, copy, and compare between branches)
- Define a reference project and use items from a reference project
Debugging
- Debug a Job using Traces Debug
Getting started with Pipeline Designer
- Define Pipeline Designer
- Understand the Pipeline Designer UI
- Create a simple pipeline
Managing connections, datasets, and pipelines
- Manage connections
- Manage datasets
- Manage pipelines
Using processors in Pipeline Designer
- Join data sources
- Filter data
- Aggregate data
- Transform data using Data Preparation processors
Using context variables in Pipeline Designer
- Define a standard context variable
- Use context variables in processors
- Use context variables in connection and dataset configurations
Getting started with Stitch
- Define Stitch
- Configure Stitch destinations and integrations
- Extract data using the Stitch replication process
- Load data using the Stitch replication process
- Integrate with the Stitch API