We are looking for a talented DevOps Engineer to join our team and support DevOps and data engineering work within our cloud-based infrastructure. The ideal candidate will have expertise in DevOps practices, containerization, data pipeline development, and troubleshooting complex data workflows.
Responsibilities:
- Provide DevOps support with a focus on GCP infrastructure.
- Develop, maintain, and optimize data pipelines for high-performance processing.
- Debug and optimize SQL queries and Python- or R-based scripts.
- Manage and maintain databases (Redshift, PostgreSQL, MS SQL – preferred).
- Work with Kubernetes (K8s) for container orchestration and resource management.
- Utilize Docker and Argo for containerization and workflow orchestration.
- Support Tableau users by ensuring efficient data access and integration.
- Troubleshoot and optimize ETL processes, including Pentaho/Carte workflows.
- Debug and enhance Java-based components when required.
- Other duties as needed.
Requirements:
- 4-5 years of experience in DevOps and data engineering roles.
- Strong experience with GCP infrastructure, including cloud networking, IAM, and compute resources.
- Proficiency in Python for scripting, automation, and debugging.
- Hands-on experience with SQL query optimization across multiple database engines.
- Experience managing Kubernetes clusters and containerized workloads.
- Strong understanding of Docker and Argo Workflows.
- Familiarity with Tableau for data visualization and reporting.
- Experience with Pentaho/Carte for ETL process debugging.
- Basic Java knowledge for troubleshooting integrations.
Nice to have:
- Experience with AWS services related to data processing and orchestration.
- Familiarity with additional BI tools and ETL platforms.
- Experience working in hybrid cloud environments.
Perks and Benefits:
- Provident Fund and Medical Allowances
- Professional Training and Certifications
- Paid Time Off
- Semi-Annual Performance Bonus and Awards