Data Engineer – Cloud Platforms (Azure | GCP | AWS)

We are seeking an experienced Data Engineer with over 5 years of expertise in designing and implementing modern, cloud‑based data solutions, primarily focused on Google Cloud Platform (GCP) and Microsoft Azure, with additional exposure to AWS. The ideal candidate is skilled in building scalable data platforms, orchestrating efficient data pipelines, and enabling analytics through Microsoft Fabric, BigQuery, and related technologies.
Technical Skills
GCP Focus:
Hands‑on experience with BigQuery, Dataflow, Pub/Sub, Composer (Airflow), Cloud Storage, and Cloud Functions, building end‑to‑end ingestion, transformation, and analytics solutions.
Azure Expertise:
Proficiency in Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (ADLS), and Azure SQL Database.
Fabric Integration:
Familiarity with Microsoft Fabric – integrating Data Pipelines, Dataflows Gen2, Lakehouses, and Notebooks to streamline ingestion, transformation, and reporting.
AWS Exposure:
Working knowledge of AWS Glue, S3, Lambda, and Redshift for data processing and analytics.
Data Engineering Excellence
Proven ability to design, build, and maintain ETL/ELT pipelines, data models, and data warehouses across multi‑cloud environments (GCP, Azure, AWS).
Experience implementing data lakehouse architectures and Fabric‑based workflows to enable unified analytics.
Analytics & Visualization
Strong experience with Power BI and Looker Studio to deliver dynamic data models, semantic layers, and interactive dashboards.
Ability to integrate reporting across Fabric, BigQuery, and Synapse ecosystems.
Programming & Automation
Proficient in SQL and Python, developing reusable, scalable scripts for data transformation, validation, and orchestration.
Experience with Airflow (Composer), ADF, and Databricks Workflows for job orchestration.
Database Systems
Experienced with Azure SQL, PostgreSQL, BigQuery, MySQL, and SQL Server, ensuring data accuracy, security, and performance optimization.
Version Control & DevOps
Proficient in Git, with exposure to CI/CD practices for deploying and maintaining data pipelines and Fabric/GCP artifacts in multi‑environment setups.
Core Attributes
Analytical thinker who translates complex business needs into data‑driven solutions.
Excellent communication and stakeholder management skills across business and technical teams.
Collaborative, proactive, and detail‑oriented professional who thrives in fast‑paced, multi‑cloud environments.
Seniority level Mid‑Senior level
Employment type Full‑time
Job function Other
Industries IT Services and IT Consulting
Senior Data Engineer • Lahore, Pakistan