Data Engineer, Data Extraction and ETL
Summary
- At least 3 years of experience in DWH and Data Platform development
- Good knowledge of SQL and Python
- Experience with at least one of the following technology stacks: Databricks, Data Factory, Fabric, AWS Glue, Apache Spark
- Experience developing REST APIs
- Experience with GitLab, JIRA, Azure DevOps
- Intermediate English
Requirements
- Bachelor’s degree in Computer Science or a related technical field.
- At least 3 years of experience in DWH and Data Platform development.
- Familiarity with data architecture and data modeling concepts.
- Good knowledge of SQL and Python.
- Experience with at least one of the following technology stacks: Databricks, Data Factory, Fabric, AWS Glue, Apache Spark.
- Readiness to master new technologies and tools.
- Experience developing REST APIs.
- Experience with GitLab, JIRA, Azure DevOps.
- Strong soft skills and a high standard of work and professional ethics.
- English at Intermediate level or higher.
Responsibilities
- Eliciting, gathering, and documenting requirements
- Creating data models
- Defining, configuring, and developing ETL pipelines and CI/CD
- Preparing data for quality assurance
- Automating deployment and test processes
- Creating documentation in Confluence and GitLab
- Creating APIs
- Testing the developed solutions
- Maintaining the solution