Basic Information

Ref Number

Req_00159843

Last day to apply

12-Jul-2025

Primary Location

IN - Noida 54 - NSEZ

Country

India

Job Type

Digital Solutions

Work Style

Hybrid, On Site

Description and Requirements

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments and requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:

  • Design, develop, and support data pipelines and related data products and platforms.

  • Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.

  • Perform application impact assessments and requirements reviews, and develop work estimates.

  • Develop test strategies and site reliability engineering measures for data products and solutions.

  • Participate in agile development "scrums" and solution reviews.

  • Mentor junior Data Engineers.

  • Lead the resolution of critical operations issues, including post-implementation reviews.

  • Perform technical data stewardship tasks, including metadata management, security, and privacy by design.

  • Design and build data extraction, loading, and transformation pipelines using Python and GCP data technologies.

  • Demonstrate SQL and database proficiency in various data engineering tasks.

  • Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (a minimal Airflow sketch follows this list).

  • Develop Unix scripts to support various data operations.

  • Model data to support business intelligence and analytics initiatives.

  • Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.

  • Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).
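
To illustrate the pipeline-automation and GCP items above, here is a minimal sketch of an Airflow DAG that loads daily CSV extracts from Cloud Storage into a BigQuery staging table and then aggregates them with a SQL job. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; the project, bucket, dataset, and table names (example-project, example-landing-bucket, staging.sales, mart.daily_sales) are hypothetical and do not describe any specific TELUS Digital pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_elt",            # hypothetical pipeline name
    schedule="@daily",                   # assumes Airflow 2.4+; use schedule_interval on older releases
    start_date=datetime(2025, 1, 1),
    catchup=False,
    tags=["example", "elt"],
) as dag:
    # Extract/load: copy the run date's CSV extracts from Cloud Storage into a BigQuery staging table.
    load_to_staging = GCSToBigQueryOperator(
        task_id="load_sales_to_staging",
        bucket="example-landing-bucket",                 # hypothetical landing bucket
        source_objects=["sales/{{ ds }}/*.csv"],         # templated on the DAG run date
        destination_project_dataset_table="example-project.staging.sales",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: aggregate the staging data into a reporting table with a BigQuery SQL job.
    transform_to_mart = BigQueryInsertJobOperator(
        task_id="aggregate_daily_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.mart.daily_sales` AS
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `example-project.staging.sales`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )

    load_to_staging >> transform_to_mart
```

The same load-then-transform pattern extends to additional sources by adding parallel load tasks upstream of the transform step.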

Additional Job Description

Qualifications:

  • Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field.

  • 4+ years of data engineering experience.

  • 2 years of data solution architecture and design experience.

  • GCP Certified Data Engineer (preferred).

EEO Statement

At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Equal Opportunity Employer

At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants’ qualifications, merits, competence and performance without regard to any characteristic related to diversity.