Key Responsibilities
Design and architect agentic AI pipelines and multi-agent systems for data ingestion, transformation, processing, and analytics.
Build and optimize Lakehouse-based solutions on Databricks, including ETL/ELT pipelines, Delta Lake storage, and ML model operationalization.
Implement and orchestrate AI agents using tools such as Google Vertex AI Agent Builder, Aera/AREA, or similar agentic platforms.
Integrate LLMs and foundation models into autonomous workflows for analytics and decisioning.
Develop, test, and deploy RAG pipelines and LLMOps workflows for domain-specific knowledge and reasoning tasks.
Implement best practices for prompt engineering, fine-tuning, responsible AI, and guardrails in production.
Collaborate with Data Engineering, MLOps, and Product teams to ensure scalability, performance, and security of the agentic platform.
Monitor and optimize the performance of deployed agents and AI workflows.
Required Skills & Qualifications
5–8 years of experience in AI/ML Engineering, Data Engineering, or Applied AI roles.
Strong expertise in Databricks (PySpark, Delta Lake, MLflow, Unity Catalog).
Hands-on experience with agentic frameworks and toolkits, such as Google Vertex AI Agent Builder, the Aera/AREA Agentic Platform, or open-source frameworks (LangChain, AutoGen, CrewAI, Hugging Face ecosystem).
Strong Python skills with experience developing scalable AI/ML applications.
Experience deploying LLMs and RAG workflows, and working with vector databases (Pinecone, Weaviate, Chroma, BigQuery Vector Search).
Good understanding of MLOps/LLMOps: CI/CD, model versioning, experiment tracking, and model monitoring.
Familiarity with cloud environments (GCP preferred; AWS/Azure a plus).
Strong system design, problem-solving, and debugging abilities.
Nice-to-Have
Experience developing autonomous decisioning systems for manufacturing, supply chain, finance, or enterprise automation.
Exposure to streaming platforms such as Kafka, Kinesis, and Pub/Sub.
Knowledge of graph-based reasoning, enterprise knowledge models, or knowledge graphs.
Experience integrating with enterprise platforms such as SAP, Salesforce, and ServiceNow.
Soft Skills
Excellent verbal and written communication.
Strong ownership mindset with the ability to work independently.
Ability to collaborate effectively with diverse technical and business teams.
Education
Bachelor’s or Master’s degree in Computer Science, Software Engineering, Data Science, AI/ML, or other relevant fields (or equivalent experience).