Basic Information
Ref Number
Last day to apply
Primary Location
Country
Job Type
Work Style
Description and Requirements
4+ years of industry experience in the field of Data Engineering.
Proficient in Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Cloud Storage, and Pub/Sub.
Strong understanding of data pipeline architectures and ETL processes.
Experience with the Python programming language for data processing.
Knowledge of SQL and experience with relational databases.
Familiarity with version control systems like Git.
Ability to analyze, troubleshoot, and resolve complex data pipeline issues.
Software engineering experience in optimizing data pipelines to improve performance and reliability.
Continuously optimize data pipeline efficiency, reducing operational costs and the number of issues and failures.
Automate repetitive tasks in data processing and management.
Experience in monitoring and alerting for data pipelines.
Continuously improve data pipeline reliability through analysis and testing.
Perform SLA-oriented monitoring for critical pipelines; suggest improvements and, upon business approval, implement them to maintain SLA adherence.
Monitor performance and reliability of GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
Maintain infrastructure reliability for GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
Conduct post-incident reviews and implement improvements for data pipelines.
Develop and maintain documentation for data pipeline systems and processes.
Excellent communication and documentation skills.
Strong problem-solving and analytical skills.
Open to working in a 24x7 shift environment.
Additional Job Description
Venue: TELUS International, UG Floor, Tower No. 6, Candor Tech Space, Sector 135, Noida.
Date: 19th July 2025 (Saturday)
Time: 11:00 AM to 3:00 PM
EEO Statement