Basic Information
Ref Number
Last day to apply
Primary Location
Country
Job Type
Work Style
Description and Requirements
Required Skills:
- 4+ years of industry experience in software development, data engineering, business intelligence, or a related field, including experience manipulating, processing, and extracting value from datasets.
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among other areas.
- Experience building modular, reusable pipeline code and complex ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources (see the sketch after this list).
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Hands-on coding experience with scripting and query languages (Python, SQL, PySpark).
- Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, Vertex AI).
- Maintain the highest standards of development practice, including technical design, solution development, systems configuration, test documentation and execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Experience with SAS/SQL Server/SSIS is an added advantage.
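
For context, the following is a minimal sketch of the kind of modular ingestion step described above: loading a CSV file from Google Cloud Storage into BigQuery using the google-cloud-bigquery Python client. The bucket, project, dataset, and table names are illustrative placeholders, not details from this posting.

    # Minimal sketch of one reusable ingestion step: load a CSV from
    # Cloud Storage into a BigQuery table. Names below are hypothetical.
    from google.cloud import bigquery

    def load_csv_to_bigquery(gcs_uri: str, table_id: str) -> None:
        """Load a CSV from GCS into BigQuery, inferring the schema."""
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,  # skip the CSV header row
            autodetect=True,      # infer the table schema from the data
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        )
        load_job = client.load_table_from_uri(
            gcs_uri, table_id, job_config=job_config
        )
        load_job.result()  # block until the load job completes

    if __name__ == "__main__":
        load_csv_to_bigquery(
            "gs://example-bucket/raw/orders.csv",   # hypothetical source
            "example-project.analytics.orders",     # hypothetical target
        )

In practice, a step like this would be parameterized per source and scheduled or orchestrated (for example, via Cloud Scheduler or GCP Workflows, as listed above) rather than run by hand.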
Additional Job Description
- Bachelor's degree in Computer Science or related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred)
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
EEO Statement