Basic Information
Ref Number
Last day to apply
Primary Location
Country
Job Type
Work Style
Description and Requirements
Required skills:
- 4+ years of industry experience in Data Engineering support and enhancement.
- Proficiency with the services of at least one cloud platform (GCP, Azure, AWS, etc.).
- Strong understanding of data pipeline architectures and ETL processes.
- Excellent Python programming skills for data processing and automation.
- Excellent SQL query-writing skills for data analysis and experience with relational databases.
- Familiarity with version control systems such as Git.
- Ability to analyze, troubleshoot, and resolve complex data pipeline issues.
- Software engineering experience in optimizing data pipelines to improve performance and reliability.
- Continuously optimize data pipeline efficiency, reduce operational costs, and reduce the number of issues/failures.
- Automate repetitive tasks in data processing and management
- Experience in monitoring and alerting for Data Pipelines
- Continuously improve data pipeline reliability through analysis and testing
- Perform SLA-oriented monitoring for critical pipelines; suggest improvements for SLA adherence and, where needed, implement them after business approval (a minimal monitoring sketch follows this list).
- Monitor performance and reliability of data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
- Maintain infrastructure reliability for data pipelines, Informatica ETL workflows, MDM and Control-M jobs.
- Conduct post-incident reviews and implement improvements for data pipelines.
- Develop and maintain documentation for data pipeline systems and processes
- Experience with Data Visualization using Google Looker Studio, Tableau, Domo, Power BI, or similar tools is an added advantage
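To illustrate the SLA-oriented monitoring and alerting responsibilities above, here is a minimal Python sketch. The SLA table and the helpers fetch_last_run and send_alert are hypothetical placeholders, not part of Informatica, Control-M, or any specific cloud service named in this posting.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional


@dataclass
class PipelineRun:
    name: str
    finished_at: Optional[datetime]  # None means the run has not completed yet
    succeeded: bool


# Hypothetical SLA table: each pipeline must finish within this window
# after its scheduled window start.
PIPELINE_SLAS = {
    "daily_sales_load": timedelta(hours=6),
    "mdm_customer_sync": timedelta(hours=2),
}


def fetch_last_run(pipeline: str) -> PipelineRun:
    """Placeholder: in practice this would query the scheduler or
    orchestration tool for the latest run's metadata."""
    raise NotImplementedError


def send_alert(message: str) -> None:
    """Placeholder: in practice this could post to email, chat, or a pager."""
    print(f"ALERT: {message}")


def check_sla(pipeline: str, window_start: datetime) -> None:
    """Alert when a pipeline run is late, failed, or missed its SLA deadline."""
    sla = PIPELINE_SLAS[pipeline]
    deadline = window_start + sla
    run = fetch_last_run(pipeline)
    now = datetime.now(timezone.utc)

    if run.finished_at is None:
        if now > deadline:
            send_alert(f"{pipeline} has not completed; SLA window of {sla} breached.")
    elif not run.succeeded:
        send_alert(f"{pipeline} failed; investigate before the next run window.")
    elif run.finished_at > deadline:
        send_alert(f"{pipeline} finished late at {run.finished_at:%Y-%m-%d %H:%M} UTC.")
```

In practice such a check would itself be scheduled, with alerts routed to the on-call rotation that underpins the 24x7 support model described below.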
Additional Job Description
- Excellent communication and documentation skills
- Strong problem-solving and analytical skills
- Willingness to work in a 24x7 shift model
EEO Statement