Alex K.
🇷🇴 Romania (UTC+02:00)
Upstaffer since September 2023

Alex K. — Data Engineer

Expertise in Data Engineering and Data Science.

Last verified in November 2023

Core Skills

Bio Summary

- Senior Data Engineer with a strong technical background in companies focused on data collection, management, and analysis.
- Proficient in SQL, NoSQL, Python, PySpark, Oracle PL/SQL, Microsoft T-SQL, and Perl/Bash.
- Experienced in working with AWS stack (Redshift, Aurora, PostgreSQL, Lambda, S3, Glue, Terraform, CodePipeline) and GCP stack (BigQuery, Dataflow, Dataproc, Pub/Sub, Data Studio, Terraform, Cloud Build).
- Skilled in working with RDBMS such as Oracle, MySQL, PostgreSQL, MsSQL, and DB2.
- Familiar with Big Data technologies like AWS Redshift, GCP BigQuery, MongoDB, Apache Hadoop, AWS DynamoDB, and Neo4j.
- Proficient in ETL tools such as Talend Data Integration, Informatica, Oracle Data Integrator (ODI), IBM Datastage, and Apache Airflow.
- Experienced in using Git, Bitbucket, SVN, and Terraform for version control and infrastructure management.
- Holds a Master's degree in Environmental Engineering and has several years of experience in the field.
- Has worked on various projects as a data engineer, including operational data warehousing, data integration for crypto wallets / DeFi, cloud data hub architecture, data lake migration, GDPR reporting, CRM migration, and legacy data warehouse migration.
- Strong expertise in designing and developing ETL processes, performance tuning, troubleshooting, and providing technical consulting to business users.
- Familiar with agile methodologies and has experience working in agile environments.
- Has experience with Oracle, Microsoft SQL Server, and MongoDB databases.
- Has worked in various industries including financial services, automotive, marketing, and gaming.
- Advanced English
- Available 4 weeks after project approval

Technical Skills

Programming Languages: Python
Python Libraries and Tools: PySpark
Data Analysis and Visualization Technologies: Apache Airflow
Databases & Management Systems / ORM: Apache Hadoop, AWS DynamoDB, AWS Redshift, Data Lake, IBM DB2, Microsoft SQL Server, MongoDB, MySQL, Neo4j, NoSQL, Oracle Database, PL/SQL, PostgreSQL, RDBMS, SQL, T-SQL
Cloud Platforms, Services & Computing: AWS, GCP, Informatica
Amazon Web Services: AWS Aurora, AWS CodePipeline, AWS DynamoDB, AWS Glue, AWS Lambda, AWS Redshift, AWS S3
Google Cloud Platform: Dataflow, Dataproc, Google BigQuery, Google Data Studio
Scripting and Command Line Interfaces: Bash, Perl
Version Control: Bitbucket, Git, SVN
Methodologies, Paradigms and Patterns: Publish/Subscribe Architectural Pattern
Virtualization, Containers and Orchestration: Terraform
Other Technical Skills: Financial Services

Work Experience

Senior Data Engineer, NY, USA

Duration: 09.2022 - present
Summary: Operational DWH
Responsibilities: Architect and integrate the financial datasets into the Operational Data Warehouse using a snowflake-schema architecture approach (see the sketch below)
Technologies: GCP infrastructure (VM, Storage, BigQuery, Cloud SQL); Talend Cloud Data Integration; data modelling; DWH architecture
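
For illustration, a minimal sketch of what one such load step could look like using the google-cloud-bigquery Python client; the dataset, table, and column names are hypothetical, and the project's actual pipelines were built in Talend Cloud Data Integration.

```python
# Hypothetical sketch: MERGE newly staged financial transactions into a
# snowflake-schema fact table in BigQuery. Dataset/table names are invented.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
MERGE `ops_dwh.fact_transactions` AS t
USING (
  -- Resolve the surrogate key through the normalized account dimension.
  SELECT s.tx_id, a.account_sk, s.amount, s.tx_ts
  FROM `staging.transactions` AS s
  JOIN `ops_dwh.dim_account` AS a ON a.account_id = s.account_id
) AS src
ON t.tx_id = src.tx_id
WHEN NOT MATCHED THEN
  INSERT (tx_id, account_sk, amount, tx_ts)
  VALUES (src.tx_id, src.account_sk, src.amount, src.tx_ts)
"""
client.query(sql).result()  # block until the merge completes
```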

Senior Data Engineer, Ta’Xbiex, Malta

Duration: 09.2021 – 09.2022
Summary: Crypto Wallet / De-Fi
Responsibilities: Architect and develop the data integration framework for the IPFS data using a caching mechanism in Neo4j; migrate the application data out of BFDB (graph database) into the newly built Neo4j solution; develop an ETL pipeline in AWS S3, Lambda, and Python (see the sketch below); architect and implement a new reporting Data Warehouse (Kimball) solution in AWS Aurora; integrate data out of AWS DynamoDB into Neo4j and the Aurora DWH
Technologies: Neo4j, AWS S3, Lambda, Python, AWS Aurora, AWS DynamoDB
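
A minimal sketch of what the S3 -> Lambda -> Neo4j leg of such a pipeline could look like; the bucket layout, node labels, properties, and environment variables are hypothetical, not the project's actual code.

```python
# Hypothetical sketch: an S3-triggered Lambda that parses a landed IPFS
# metadata file and upserts it into the Neo4j cache. Names are invented.
import json
import os

import boto3
from neo4j import GraphDatabase

s3 = boto3.client("s3")
driver = GraphDatabase.driver(
    os.environ["NEO4J_URI"],
    auth=(os.environ["NEO4J_USER"], os.environ["NEO4J_PASSWORD"]),
)

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        doc = json.loads(s3.get_object(Bucket=bucket, Key=key)["Body"].read())
        with driver.session() as session:
            # MERGE keeps the cache idempotent across S3 event re-deliveries.
            session.run(
                "MERGE (w:Wallet {address: $addr}) SET w.last_seen = $ts",
                addr=doc["address"],
                ts=doc["timestamp"],
            )
```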

Senior Data Engineer, Accenture, Romania

Duration: 04.2021 – 09.2021
Summary: Cloud Data Hub – Core architecture team
Responsibilities: Develop AWS Glue + PySpark ingestion blueprints that became the framework for all data engineering teams (see the sketch below); architect a new dockerized file-ingest blueprint: Terraform -> SFTP -> S3 -> Glue/PySpark -> target; expand the analytics blueprints: MongoDB, PySpark, Terraform, Docker
Technologies: AWS Glue, PySpark, Terraform, MongoDB, Docker
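
A minimal sketch of the shape such a Glue + PySpark ingestion blueprint could take; the job-argument names, paths, and partitioning column are hypothetical.

```python
# Hypothetical sketch of a reusable Glue ingestion job: read files landed via
# SFTP -> S3, stamp a load date, write partitioned Parquet to the target zone.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Job arguments ("source_path", "target_path") are invented names.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

df = (glue_context.spark_session.read
      .option("header", "true")
      .csv(args["source_path"])
      .withColumn("ingest_date", F.current_date()))

df.write.mode("append").partitionBy("ingest_date").parquet(args["target_path"])

job.commit()
```

Parameterizing the source and target paths is what lets one blueprint serve many ingestion teams: each team supplies its own job arguments instead of writing new ETL code.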

Senior Data Engineer, France

Duration: 04.2020 – 04.2021
Summary: Data Lake
Responsibilities: Develop data ingestion pipelines using PySpark, Dataproc, Cloud Storage, BigQuery, Airflow, and Cloud Build; migrate the existing Data Lake from Apache Hadoop to GCP using Dataproc, PySpark, and BigQuery (see the sketch below)
Technologies: PySpark, Dataproc, Cloud Storage, BigQuery, Airflow, Cloud Build
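
A minimal sketch of one such Dataproc migration job, assuming the legacy HDFS data was first staged to Cloud Storage; bucket and table names are hypothetical.

```python
# Hypothetical sketch: read a staged legacy-lake table from Cloud Storage and
# write it to BigQuery via the spark-bigquery connector (bundled on Dataproc).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hadoop-lake-to-bigquery").getOrCreate()

df = spark.read.parquet("gs://legacy-lake-export/events/")

(df.write.format("bigquery")
   .option("table", "analytics.events")
   .option("temporaryGcsBucket", "dataproc-staging-bucket")
   .mode("overwrite")
   .save())
```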

Senior Data Engineer, Munich, Germany

Duration: 04.2018 - 04.2020
Summary: DWH Migration & BAU
Responsibilities: Migrate the company’s EDW (data vault architecture) ETL processes from Talend into Informatica Power Center 10; Maintain and document all existing Talend ETL pipelines until they were migrated and decommissioned; QA the newly migrated ETL pipelines in Informatica Data Quality; BAU activities for the overnight ETL processing in both Talend and Informatica
Technologies: Informatica Power Center, Talend, Informatica Data Quality

Senior Data Engineer, Leverkusen, Germany

Duration: 10.2017 – 04.2018
Summary: GDPR Neo4j (graph) reporting
Responsibilities: Talend Data Integration ETL development defining the OWL 2 RDF ontology of the Neo4j graph data model that underpins the GDPR queries; develop Cypher statements to create and populate the supplied ontology and graph structure (see the sketch below); dynamic data processing from various systems (RDBMS, Apache Hadoop, flat files, JSON, XML)
Technologies: Talend Data Integration, Neo4j, RDBMS, Apache Hadoop
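
A minimal sketch of applying such a graph structure from Python; the labels, properties, and credentials are hypothetical, and the constraint statement uses current Cypher syntax rather than that of the Neo4j version in use at the time.

```python
# Hypothetical sketch: enforce identity on the Person label and link a data
# subject to the systems holding their data, so a GDPR subject-access query
# becomes a single Person -> STORED_IN -> System traversal.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

with driver.session() as session:
    session.run(
        "CREATE CONSTRAINT person_id_unique IF NOT EXISTS "
        "FOR (p:Person) REQUIRE p.person_id IS UNIQUE"
    )
    session.run(
        "MATCH (p:Person {person_id: $pid}) "
        "MERGE (s:System {name: $system}) "
        "MERGE (p)-[:STORED_IN]->(s)",
        pid="12345",
        system="CRM",
    )
```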

Senior Data Engineer, Nuremberg, Germany

Duration: 04.2017 – 10.2017
Summary: CRM Migration
Responsibilities: Coordinate a small team of two developers (myself included) and one QA across two projects: extracting CRM attributes for SUN analysis from the new CRM solution; Talend Data Integration development of the customer migration and business data validation processes driven by the replacement of the legacy CRM
Technologies: Talend Data Integration

Senior Data Warehouse Developer, Cluj-Napoca, Romania

Duration: 11.2016 – 04.2017
Summary: Legacy DWH to Data Lake migration
Responsibilities: IDP: migrate two legacy DWHs (Oracle and MySQL) into a new AWS Redshift Data Lake; Talend Data Integration ETL development for the DWH merger using a metadata-driven ETL engine (see the sketch below); Python development for the IDP (integrated data platform) CRM milestones
Technologies: Talend Data Integration, AWS Redshift, Python
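
A minimal sketch of the metadata-driven idea: a control table describes each source-to-Redshift load, and one generic runner executes them all. Table names, S3 prefixes, and the IAM role are hypothetical, and the project implemented its engine in Talend rather than plain Python.

```python
# Hypothetical sketch: each control row describes one source -> Redshift load;
# a single generic runner turns the metadata into COPY statements.
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

CONTROL_ROWS = [
    # (target_table, s3_prefix, iam_role) -- all values invented
    ("dwh.customers", "s3://idp-staging/customers/", "arn:aws:iam::123456789012:role/load"),
    ("dwh.orders", "s3://idp-staging/orders/", "arn:aws:iam::123456789012:role/load"),
]

def run_loads(conn):
    with conn.cursor() as cur:
        for table, prefix, role in CONTROL_ROWS:
            # Table names come from trusted control metadata, not user input.
            cur.execute(
                f"COPY {table} FROM %s IAM_ROLE %s FORMAT AS PARQUET",
                (prefix, role),
            )
    conn.commit()
```

Adding a new source then means inserting one metadata row rather than writing a new per-table job.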

Senior ETL Developer, Phoenix, AZ, US

Duration: 07.2015 – 11.2016
Summary: Mainframe migration and DWH Migration
Responsibilities: ETL migration from Pentaho Data Integration (PDI) to Oracle Data Integrator (ODI 11g) for the DWH processes: design a new DWH snowflake model as the main DataHub across the company; IBM DB2 ERP OLTP dataset normalization; PDI legacy ETL process support (BAU); historical reporting with SCD type 2 and snapshot facts (see the sketch below)
Technologies: Pentaho Data Integration, Oracle Data Integrator (ODI 11g), IBM DB2
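
A minimal sketch of the SCD type 2 pattern mentioned above, illustrated in PySpark for brevity (the project itself used PDI/ODI); the table and column names are hypothetical.

```python
# Hypothetical sketch of SCD type 2: when a tracked attribute changes, close
# the current dimension row's validity interval and append a new open version.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.table("dwh.dim_customer").filter("is_current").alias("d")
stg = spark.table("staging.customer_updates").alias("s")

# Rows whose tracked attribute changed since the current dimension version.
changed = dim.join(
    stg, F.col("d.customer_id") == F.col("s.customer_id")
).filter(F.col("d.address") != F.col("s.address"))

# 1) Expire the current versions.
expired = (changed.select("d.*")
           .withColumn("valid_to", F.current_date())
           .withColumn("is_current", F.lit(False)))

# 2) Open new versions for the changed rows.
fresh = (changed.select("s.*")
         .withColumn("valid_from", F.current_date())
         .withColumn("valid_to", F.lit(None).cast("date"))
         .withColumn("is_current", F.lit(True)))

# Writing expired + fresh back to the dimension (e.g. via MERGE) is omitted.
```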

Senior Data Warehouse Consultant, Zurich, Switzerland

Duration: 2014 - 2015
Summary: Senior Data Warehouse Consultant
Responsibilities: Design & develop ETL & ELT processes using PL/SQL; Performance tuning; Testing new releases in DEV and UAT environments; Preparation of release packages for Production; Answering Ad-Hoc queries and providing technical know-how for the business users
Technologies: PL/SQL

Senior Data Warehouse Developer, Gibraltar

Duration: 2013 - 2014
Summary: Senior Data Warehouse Developer
Responsibilities: Developed ETL & ELT processes in Oracle ODI following the Kimball methodology; designed and developed objects using Oracle Warehouse Builder 10g; ETL development using Oracle Data Integrator (ODI) and Microsoft SSIS; development and BAU for Microsoft OLAP Cubes (MDX queries) and Microsoft SSAS Tabular
Technologies: Oracle ODI, Oracle Warehouse Builder 10g, Microsoft SSIS, Microsoft SSAS

Senior Data Warehouse Developer, Cluj-Napoca, Romania

Duration: 2012 - 2013
Summary: Senior Data Warehouse Developer
Responsibilities: Design & develop new business functionalities in an agile environment; Develop, monitor and maintain warehouse critical alerts and processes; Write and maintain functional and technical specifications; Monitor, optimize and troubleshoot database and cube performance
Technologies: Oracle ODI, Oracle Warehouse Builder 10g

Database Developer, Cluj-Napoca, Romania

Duration: 2011 - 2012
Summary: Database Developer
Responsibilities: Design & develop new ETL processes and integrate them into the existing model using OWB 10g; develop stored procedures using the ORM method, called from .NET by the AR module; develop business reports in an Oracle Applications 11i environment using Oracle Reports 6i; develop ETL/ELT processes in OWB and Microsoft SSIS
Technologies: OWB 10g, Microsoft SSIS

Oracle DBA, Timisoara, Romania

Duration: 2010 - 2011
Summary: Oracle DBA
Responsibilities: Database administration on different versions: 8i, 9i, 10g, 11g, 11gR2; design and document the existing database architecture; provide periodic on-call support for critical situations within the SLA response & resolve times; implement and use various methods of connectivity and tunneling through the VPN; apply Oracle CPU updates regularly, in accordance with their release dates
Technologies: Oracle DB

Junior Database Developer, Oradea, Romania

Duration: 2009 - 2010
Summary: Junior Database Developer
Responsibilities: Develop functions, triggers, stored procedures, indexes, and views; Microsoft SSIS ETL development; data management and architecture of all the DB servers; monitor the replication status from our distribution database to all subscribed databases; define and implement administrative procedures for billing endorsement, cash books, etc.
Technologies: Microsoft SSIS

Junior Database Developer, Oradea, Romania

Duration: 2008 - 2009
Summary: Junior Database Developer
Responsibilities: OLTP ETL development using Microsoft SSIS and SQL Server Transact-SQL; developed BI reports using Microsoft SSRS; ensure that the services under my responsibility are delivered according to the service description; data planning, development, deployment, and administration; ensure the supported components are maintained, optimized, and used effectively
Technologies: Microsoft SSIS, Microsoft SSRS

Education

  • University of Oradea
    Master's degree, Environmental Engineering
    2009 - 2012
  • University of Oradea
    Graduate degree, Environmental Protection
    2005 - 2009
  • High School "Lucian Blaga"
    Information Technology
    2001 - 2005

How to hire with Upstaff

1

Talk to Our Talent Expert

Our journey starts with a 30-min discovery call to explore your project challenges, technical needs and team diversity.

2

Meet Carefully Matched Talents

Within 1-3 days, we'll share profiles and connect you with the right talent for your project. Schedule a call to meet the engineers in person.

3

Validate Your Choice

Bring new talent on board with a trial period to confirm you've hired the right person. There are no termination fees or hidden costs.

Why Upstaff

Upstaff is a technology partner with expertise in AI, Web3, Software, and Data. We help businesses gain a competitive edge by optimizing existing systems and utilizing modern technology to fuel business growth.

Real-time project team launch

<24h

Interview First Engineers

Upstaff's network enables clients to access specialists within hours or days, streamlining the hiring process to 24-48 hours so engineers can start ASAP.

x10

Faster Talent Acquisition

Upstaff's network & platform enable clients to scale up and down blazingly fast. Every hire is typically 10x faster compared to a regular recruitment workflow.

Vetted and Trusted Engineers

100%

Security And Vetting-First

AI tools and expert human reviewers in the vetting process are combined with track records and historical feedback collected from clients and teammates.

~50h

Save Time For Deep Vetting

On average, we save client teams over 50 hours of candidate interviewing per position. We are fueled by a passion for tech expertise, drawn from our deep understanding of the industry.

Flexible Engagement Models


Custom Engagement Models

Flexible staffing solutions accommodating both short-term projects and longer-term engagements, full-time and part-time.


Unique Talent Ecosystem

Our candidate staffing platform stores data about past and present candidates, enabling fast work and scalability and providing clients with valuable insights into their talent pipeline.

Transparent

$0

No Hidden Costs

The price quoted is the total price to you. No hidden or unexpected costs for candidate placement.

x1

One Consolidated Invoice

No matter how many engineers you employ, there is only one monthly consolidated invoice.

Ready to hire Alex K. or someone with similar skills?
Looking for someone else? Join Upstaff for access to all profiles and individual matching.
Start Hiring