Basic Information
Ref Number
Primary Location
Country
Employment Types
Work Style
Description and Requirements
Welcome to TELUS Digital — where innovation drives impact at a global scale. As an award-winning digital product consultancy and the digital division of TELUS, one of Canada’s largest telecommunications providers, we design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.
With a global team across North America, South America, Central America, Europe, and APAC, we offer end-to-end expertise across eight core service areas: Digital Product Consulting, Digital Marketing Services, Data & AI, Strategy Consulting, Business Operations Modernization, Enterprise Applications, Cloud Engineering, and QA & Test Engineering.
From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are — all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.
Work Mode / Location: Remote
Note: For this role, experience with Snowflake Data Clean Room is mandatory.
Responsibilities
Design, develop and optimize scalable data pipelines and ETL workflows using Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc and Pub/Sub.
Design and manage secure, efficient data integrations involving Snowflake and BigQuery.
Write, test and maintain high-quality Python code for data extraction, transformation and loading (ETL), analytics and automation tasks.
Use Git for collaborative version control, code reviews and managing data engineering projects.
Implement infrastructure-as-code practices using Pulumi for cloud resource management and automation within GCP environments (a brief illustrative sketch follows this list).
Apply clean room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements.
Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues and ensure data integrity throughout the lifecycle.
Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability.
Maintain documentation on processes, data flows and configurations for operational transparency.
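Illustrative example (not part of the role description): the following is a minimal sketch, assuming Python with the pulumi and pulumi_gcp packages, of how infrastructure-as-code can declare GCP data resources such as a BigQuery dataset and a Pub/Sub topic. The resource names, dataset ID and location are placeholder assumptions, not details taken from this posting.

import pulumi
import pulumi_gcp as gcp

# Hypothetical BigQuery dataset that ETL jobs could load into.
analytics = gcp.bigquery.Dataset(
    "analytics-dataset",
    dataset_id="analytics",
    location="EU",
)

# Hypothetical Pub/Sub topic that could feed a streaming pipeline (e.g. Dataflow).
events = gcp.pubsub.Topic("ingest-events")

# Export identifiers so other stacks or pipelines can reference them.
pulumi.export("dataset_id", analytics.dataset_id)
pulumi.export("topic_name", events.name)

Running pulumi up on such a program creates or updates these resources declaratively, which is the sense in which Pulumi supports cloud resource management and automation in the responsibilities above.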
Qualifications
Strong hands-on experience with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub.
Proficiency in data engineering development using Python.
Deep familiarity with Snowflake—data modeling, secure data sharing and advanced query optimization.
Proven experience with Git for source code management and collaborative development.
Demonstrated ability with Pulumi (or similar IaC tools) for deploying and supporting cloud infrastructure.
Practical understanding of data clean room concepts in cloud data warehousing, including privacy and compliance considerations.
Solid skills in debugging complex issues within data pipelines and cloud environments.
Effective communication and documentation skills.
Skills
Snowflake Data Clean Room
GCP
ETL
Educational Background
Bachelor's degree in IT, computer science, software engineering, or equivalent experience.
EEO Statement