Superb opportunity for an ETL Test Engineer to design and implement automated testing frameworks across Databricks-based data platforms in a modern data environment. You’ll work closely with Data Engineering teams to ensure data accuracy, quality, and performance across complex data pipelines and Lakehouse architectures. Databricks experience is essential for this vacancy.
Key Responsibilities
- Develop automated testing for Databricks notebooks, Delta Lake tables, and structured streaming pipelines (see the sketch after this list).
- Build and maintain test frameworks for ETL/ELT processes, data transformations, and ingestion workflows.
- Implement automated validation for Delta Live Tables, Unity Catalog, and Spark performance testing.
- Design and execute data quality test strategies, reconciliation processes, and performance benchmarks.
- Create monitoring frameworks, test dashboards, and quality metrics for reporting and continuous improvement.
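To give a flavour of the work, here is a minimal sketch of the kind of automated Delta Lake data-quality check the role involves, written with pytest and PySpark. The table names, column names, and load date below are hypothetical and purely illustrative; real checks would be tailored to the platform's own schemas and typically run as CI jobs or scheduled Databricks workflows.

```python
# Minimal sketch of automated Delta Lake quality checks with pytest + PySpark.
# Table and column names are illustrative assumptions, not the actual platform.
import pytest
from pyspark.sql import SparkSession, functions as F


@pytest.fixture(scope="session")
def spark():
    # On Databricks a SparkSession already exists; locally this assumes
    # the delta-spark package is installed and on the classpath.
    return (
        SparkSession.builder
        .appName("delta-quality-tests")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )


def test_orders_has_no_null_keys(spark):
    # Basic data-quality rule: every curated row must carry a primary key.
    df = spark.read.table("analytics.orders")
    assert df.filter(F.col("order_id").isNull()).count() == 0


def test_orders_reconciles_with_raw_layer(spark):
    # Simple reconciliation: curated row counts should match the raw
    # ingestion layer for the same (hypothetical) load date.
    raw = spark.read.table("raw.orders").filter(F.col("load_date") == "2024-01-01")
    curated = spark.read.table("analytics.orders").filter(F.col("load_date") == "2024-01-01")
    assert curated.count() == raw.count()
```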
Skills & Experience
- 2+ years’ hands-on experience with Databricks (Spark).
- Strong programming skills in Python (PySpark) and SQL.
- Proven experience with data testing frameworks and automation tools.
- Knowledge of AWS (S3, Glue, Lambda) and Lakehouse / Delta Lake architecture.
- Experience with Git, data governance, and Agile delivery environments.