Key Responsibilities:
· Design and architect end-to-end cloud data platform solutions across AWS, Snowflake, Airflow, dbt, and lakehouse components.
· Analyse system requirements and develop enterprise data models aligned with business needs and scalable architecture principles.
· Define and oversee the implementation of data integration architectures, including ingestion pipelines, data flows, orchestration, and ELT/ETL processes.
· Build, optimize, and maintain high-performance data pipelines for ingestion, transformation, storage, and delivery across warehouse and lakehouse layers.
· Apply hands-on tuning and optimization to Snowflake, S3-based lakehouse structures (Iceberg), Airflow DAGs, and dbt transformation models.
· Ensure data governance, lineage, quality, observability, access control, and compliance throughout the entire data lifecycle.
· Evaluate, recommend, and implement emerging data technologies and architectural improvements to enhance platform scalability and performance.
· Monitor and troubleshoot data workflows, compute layers, storage layers, and orchestration to ensure platform reliability.
· Produce high-quality Architecture Design Documents (ADDs), solution designs, and technical standards aligned with enterprise architecture principles.
· Collaborate with engineering teams to maintain platform reliability through automation, observability, and continuous improvement.
· Provide production support, guiding teams through root-cause analysis, pipeline recovery, and optimization strategies.
· Champion best practices in data engineering, lakehouse design, ELT architecture, CI/CD, and cloud-native patterns.
Technical Qualifications:
· Minimum 7 years’ experience in a similar role within a dynamic, fast-paced environment at enterprise production scale.
· Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
· Strong proficiency in SQL, data integration (IBM StreamSets, Informatica), and data transformation (Airflow + dbt), with a thorough understanding of data transformation processes.
· Strong experience with AWS services relevant to data architecture and analytics, including IAM, Lambda, EKS, S3 (data lake), EMR (PySpark), lakehouse storage (Iceberg), and MWAA.
· Expertise in Snowflake, with a proven ability to optimize queries for efficient data retrieval.
· Experience with both batch (StreamSets) and streaming (Kafka) data solutions.
· Familiarity with data modelling concepts (Inmon, Kimball, Data Vault, etc.).
· Exposure to CI/CD or Infrastructure-as-Code tooling (Terraform or similar) is a plus.
· Experience with Application Performance Monitoring (APM) frameworks.
· Knowledge of data quality and security best practices is advantageous.
· Proven experience producing sound, comprehensive Architecture Design Documents and the ability to align with enterprise architecture principles.
· Understanding of classic software architecture patterns.
Non-Technical Skills:
· Strong attention to detail, with a proactive approach to identifying and resolving problems.
· A genuine interest in emerging data technologies and a curiosity for continuous improvement.
· Excellent verbal and written communication skills.
· Experience leveraging AI-powered tools and workflows.