Senior Data Engineer - 6-month daily rate contract
€500 per day - Dublin (remote)
Key Responsibilities
· Maintain and develop data pipelines for the extraction, transformation, cleaning, pre-processing, aggregation, and loading of data from a wide variety of data sources using Python, SQL, DBT, and other data technologies
· Design, implement, test, and maintain data pipelines and new features based on stakeholders' requirements
· Develop and maintain scalable, highly available, quality-assured analytical building blocks and datasets in close coordination with data analysts
· Optimize and maintain workflows and scripts on existing data warehouses and ETL processes
· Design, develop, and maintain components of data processing frameworks
· Build and maintain data quality and durability tracking mechanisms that provide visibility into, and help address, inevitable changes in data ingestion, processing, and storage
· Collaborate with stakeholders to define data requirements and objectives
· Translate technical designs into business-appropriate representations, and analyse business needs and requirements to ensure that data services directly support the strategy and growth of the business
· Address questions from downstream data consumers through appropriate channels
· Create data tools for the analytics and BI teams that help them build and optimize our product into an innovative industry leader
· Stay up to date with data engineering best practices and patterns; evaluate and analyse new technologies, capabilities, and open-source software in the context of our data strategy to ensure our core technologies stay ahead of the industry
· Contribute to the analytics engineering process
Required Qualifications
5+ years of relevant work experience
BA / BS in Data Science, Computer Science, Statistics, Mathematics, or a related field
Experience building processes supporting data transformation, data structures, metadata, dependency management, data quality, and workload management
Experience with Snowflake, including hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; must have worked on Snowflake cost optimization scenarios
Solid overall programming skills with the ability to write modular, maintainable code, preferably in Python and SQL
Experience with workflow management solutions such as Airflow
Experience with data transformation tools such as DBT
Experience working with Git
Experience working in big data environments such as Hive, Spark, and Presto
Willingness to work flexible hours
Preferred Qualifications
Experience supporting Support and Customer Success teams
Experience with Airflow DAGs
Knowledge of natural language processing (NLP) and computer vision techniques
Working knowledge of Power BI
Experience with the AWS environment, for example S3, Lambda, Glue, and CloudWatch
Basic understanding of Salesforce
Experience working with remote teams spread across multiple time zones
A hunger to learn and the ability to operate in a self-guided manner
For more information, please contact Michael on 01 6146058 / [email protected]
#LI-MF7