Basic Information
Ref Number
Last day to apply
Primary Location
Country
Job Type
Work Style
Description and Requirements
As a Data Engineer, you will architect and design data pipelines, data products, and platforms, building extraction, loading, and transformation pipelines across on-prem and cloud environments. Beyond hands-on development, you will perform application impact assessments, requirements reviews, and work estimates; develop test strategies and site reliability engineering measures; lead agile development "scrums" and solution reviews; mentor junior Data Engineering Specialists; lead the resolution of critical operations issues; and perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Required Skills:
Architect, design, and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments, requirements reviews, and develop work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Lead agile development "scrums" and solution reviews.
Mentor junior Data Engineering Specialists.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Demonstrate expertise in SQL and databases across a range of data engineering tasks.
Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
Develop and manage Unix scripts for data engineering tasks.
Apply intermediate proficiency in infrastructure-as-code tools such as Terraform, Puppet, and Ansible to automate infrastructure deployment.
Apply data modeling skills to support analytics and business intelligence.
Apply working knowledge of MLOps to integrate machine learning workflows with data pipelines.
Demonstrate extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and Bigtable.
Demonstrate advanced proficiency in Python programming.
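The DAG-based orchestration skill listed above shares one core idea across Control-M, Airflow, and Prefect: tasks run in an order that respects their dependencies. A minimal sketch of that idea, using only the Python standard library (the task names are hypothetical, not from any specific tool):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT workflow: each task maps to the set of tasks it
# depends on ("load" runs only after both upstream tasks finish).
pipeline = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

# static_order() yields every task in an order that respects the
# dependencies -- the same guarantee a DAG scheduler enforces at runtime.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # "extract" comes first, "load" comes last
```

Tools like Airflow layer scheduling, retries, and monitoring on top of this ordering guarantee, but the dependency graph itself is the part a data engineer designs.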
Additional Job Description
Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field.
Analytics certification in BI or AI/ML.
6+ years of data engineering experience.
4 years of data platform solution architecture and design experience.
GCP Certified Data Engineer (preferred).
EEO Statement