Ankit A.
🇮🇳 India (UTC+01:00)
Upstaffer since January 2023

Ankit A. — Lead Data Quality Engineer

Expertise in Data Engineering.

Last verified in July 2023

Core Skills

SQL
Talend

Bio Summary

- 5+ years of experience in the IT industry as a Software Developer, with hands-on experience in technologies such as Talend Studio, MySQL, Postgres, Google Data Studio, Snowflake, Matillion, and Supermetrics.
- Developed Talend and Matillion jobs for extracting, transforming, and loading (ETL) data in accordance with the product's business logic; analyzed, understood, and updated existing Talend and Matillion jobs.
- Implemented automation of SQL query scripts for daily, weekly, and monthly reports.
- Worked in Snowflake creating Stored Procedures, Streams, and Tasks; performed data validation and data quality checks on the backend with SQL.
- Worked with cloud sources such as Google Analytics, Google Ads, Facebook Ads, and Google Search Console.
- Upper intermediate English
- Available for buyout
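The SQL-based data-quality checks mentioned above can be illustrated with a minimal sketch. The profile's actual work was in Snowflake; SQLite stands in here purely so the example runs anywhere, and the table and column names are hypothetical.

```python
import sqlite3

# Illustrative only: SQLite stands in for Snowflake; "orders" and its columns
# are hypothetical names, not from the engineer's actual projects.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, 11, NULL), (2, 11, NULL), (3, 12, 15.0);
""")

def run_quality_checks(conn):
    """Return a dict of check name -> number of offending rows."""
    checks = {
        # Rows missing a required value
        "null_amount": "SELECT COUNT(*) FROM orders WHERE amount IS NULL",
        # Extra rows sharing a primary-key value
        "duplicate_ids": """
            SELECT COALESCE(SUM(cnt - 1), 0) FROM
            (SELECT COUNT(*) AS cnt FROM orders GROUP BY order_id HAVING COUNT(*) > 1)
        """,
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_quality_checks(conn)
print(results)  # {'null_amount': 2, 'duplicate_ids': 1}
```

Checks like these are typically scheduled to run after each load so that missing or duplicated rows are caught before reports are published.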

Technical Skills

Programming Languages: Java, Python
.NET Platform: Azure
Data Analysis and Visualization Technologies: Tableau, Talend
Databases & Management Systems / ORM: AWS Redshift, Microsoft SQL Server, MySQL, PostgreSQL, Snowflake, SQL
Cloud Platforms, Services & Computing: AWS, Azure, Matillion, Supermetrics
Amazon Web Services: AWS Redshift, AWS S3
Google Cloud Platform: Google Data Studio
Methodologies, Paradigms and Patterns: Agile, Scrum
Collaboration, Task & Issue Tracking: Atlassian Confluence, Atlassian Trello, Jira
Operating Systems: Linux, Windows
QA, Test Automation, Security: Postman

Projects

Northwestern Mutual, Data Engineer

October 2022 - Present

Responsibilities:

  • Code, test, implement, and maintain medium to highly complex ETL mappings and scripts to build and maintain automated ETL processes.
  • Data extraction from different sources like Adobe Analytics, Google Search Console, Ahrefs, Semrush, and Alps.
  • Data loading into the Snowflake database using Matillion.

Technologies: SQL, Snowflake, AWS S3, Matillion, Supermetrics.

Diagonal Matrix, Data Engineer

April 2022 - September 2022

Responsibilities:

  • Created Snowflake scripts for extracting and loading data from one table to another.
  • Created Stored Procedures, Streams, and Tasks to schedule Stored Procedures in Snowflake.
  • Used GitHub to store all SQL scripts for future reference.

Technologies: Snowflake, AWS S3, GitHub.
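The table-to-table extract-and-load scripts described above follow the common `INSERT INTO ... SELECT` pattern. As a hedged sketch (the real scripts ran in Snowflake; SQLite and the staging/final table names here are stand-ins):

```python
import sqlite3

# Hypothetical stand-in for the Snowflake table-to-table scripts: copy rows
# from a staging table to a final table with a small transform along the way.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_sales (id INTEGER, amount_cents INTEGER);
    CREATE TABLE final_sales (id INTEGER, amount_dollars REAL);
    INSERT INTO staging_sales VALUES (1, 1050), (2, 2500);
""")

# Extract from staging, transform cents -> dollars, load into the final table.
conn.execute("""
    INSERT INTO final_sales (id, amount_dollars)
    SELECT id, amount_cents / 100.0 FROM staging_sales
""")

rows = conn.execute("SELECT id, amount_dollars FROM final_sales ORDER BY id").fetchall()
print(rows)  # [(1, 10.5), (2, 25.0)]
```

In Snowflake, scripts like this would typically be wrapped in a Stored Procedure and run on a schedule via a Task, as the responsibilities above describe.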

Sunrise, Data Engineer

October 2021 - March 2022

Responsibilities:

  • Code, test, implement, and maintain medium to highly complex ETL mappings and scripts to build and maintain automated ETL processes.
  • Data extraction from different sources like Google Ads, Facebook Ads, and Criteo using Supermetrics and email reports.
  • Data loading into the Snowflake database using Matillion.

Technologies: Snowflake, AWS S3, Matillion, Supermetrics.

HEB_CHIP, Data Engineer

April 2021 - September 2021

Responsibilities:

  • Code, test, implement, and maintain medium to highly complex ETL mappings and scripts to build and maintain automated ETL processes.
  • Data extraction from different sources like Google Ads, Facebook Ads, and Taboola using Supermetrics.
  • Data loading into the Snowflake database using Matillion.

Technologies: Snowflake, AWS S3, Matillion, Supermetrics.

Vanquis, Data Engineer

October 2020 - March 2021  

Responsibilities:

  • Quality analysis and quality checks of quarterly (Tableau) and weekly (Google Data Studio) dashboards.
  • Loaded the extracted data into the Data Lake.
  • Loaded data from the Data Lake into the Snowflake staging area, then mapped, transformed, and moved it to the final tables.
  • Extracted clicks and impressions data from Google Search Console (GSC) at page level for display in the Tableau dashboard.
  • Data extraction from Adobe Analytics using email reports.

Technologies: Snowflake, AWS S3, Matillion, Supermetrics.

SEO_Newsletter, Data Engineer

June 2020 - November 2020

Responsibilities:

  • Code, test, implement, and maintain medium to highly complex ETL mappings and scripts to build and maintain automated ETL processes.
  • Data extraction from Google Trends in CSV format.
  • Data loading into the Snowflake database using Matillion.

Technologies: Snowflake, Matillion, Google Trends, AWS S3.

Webster, Data Engineer

December 2019 - May 2020

Responsibilities:

  • Data extraction on a monthly basis from Google Analytics and Google Search Console.
  • Cleansed, de-duplicated, and transformed the data, then loaded it into the Snowflake database using Matillion.

Technologies: Snowflake, Matillion, Facebook Ads Manager, AWS S3, Supermetrics.
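The cleanse-and-de-duplicate step mentioned in the Webster project can be sketched in plain Python. This is illustrative only: the field names (`page`, `date`, `sessions`) and the keep-highest-sessions rule are assumptions, not details from the actual pipeline.

```python
from datetime import date

# Hypothetical raw analytics rows; the real pipeline pulled from Google
# Analytics / Google Search Console and loaded Snowflake via Matillion.
raw_rows = [
    {"page": "/Home ", "date": date(2020, 1, 5), "sessions": 120},
    {"page": "/home", "date": date(2020, 1, 5), "sessions": 125},  # duplicate after cleansing
    {"page": "/pricing", "date": date(2020, 1, 5), "sessions": 40},
    {"page": "/blog", "date": date(2020, 1, 5), "sessions": None},  # dropped: missing metric
]

def cleanse_and_dedupe(rows):
    """Collapse rows to one record per (page, date), keeping the highest session count."""
    deduped = {}
    for row in rows:
        if row["sessions"] is None:          # cleanse: drop rows missing the metric
            continue
        page = row["page"].strip().lower()   # cleanse: normalise the join key
        key = (page, row["date"])
        best = deduped.get(key)
        if best is None or row["sessions"] > best["sessions"]:
            deduped[key] = {"page": page, "date": row["date"], "sessions": row["sessions"]}
    return list(deduped.values())

clean = cleanse_and_dedupe(raw_rows)
print(len(clean))  # 2 rows survive
```

Normalising keys before de-duplication matters because the same page can arrive under several spellings from different sources.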

ZUMBA, Data Engineer

June 2019 - November 2019

Responsibilities:

  • Data extraction from Facebook Ads, Google Analytics, and Google Ads using APIs.
  • Data loading into the Snowflake database using Matillion.
  • Wrote and executed unit test cases to verify the application met requirements.
  • Developed a new job flow to load data at a weekly level.
  • Made monthly job-flow changes according to change requests; worked on complex SQL queries to combine the data from all sources.

Technologies: Snowflake, Matillion, Facebook Ads Manager, AWS S3, Supermetrics.

AFS Integration, Data Engineer

December 2018 - May 2019

Responsibilities:

  • Migrated tables and data from MSSQL to the Redshift cloud DB; created queries for data validation between source and target.
  • Performed data validation, cleansing, and analysis to load data from legacy systems into Redshift.
  • Tested the developed code from a development perspective.

Technologies: Postman, Talend, Redshift, AWS S3.
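Source-vs-target validation queries of the kind used in the MSSQL-to-Redshift migration usually compare row counts and a simple checksum. A hedged, runnable sketch (SQLite stands in for both databases; table and column names are hypothetical):

```python
import sqlite3

# Illustrative only: in the real project the source was MSSQL and the target
# Redshift; SQLite plays both roles so the sketch runs locally.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customers (id INTEGER, balance REAL);
    CREATE TABLE tgt_customers (id INTEGER, balance REAL);
    INSERT INTO src_customers VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_customers VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def tables_match(conn, src, tgt):
    """Compare row counts and a numeric checksum between two tables."""
    query = "SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM {}"
    src_count, src_sum = conn.execute(query.format(src)).fetchone()
    tgt_count, tgt_sum = conn.execute(query.format(tgt)).fetchone()
    return src_count == tgt_count and src_sum == tgt_sum

ok = tables_match(conn, "src_customers", "tgt_customers")
print(ok)  # True only when counts and checksums agree
```

Count-plus-checksum comparisons are cheap enough to run per table after each migration batch; mismatches then trigger a row-level diff.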

RedHaww, Data Engineer

June 2018 - November 2018

Responsibilities:

  • Created Scripts in Talend to extract the data from API, load it in MS SQL Server, extract data from tables in CSV format, and place it in Azure Storage.
  • Created SQL Queries for Data Validation in source and targets.
  • Deployment & Scheduling of Talend jobs on TMC.
  • Monitoring of the ETL Processes.
  • Exception handling: handled failure events and sent failure notification emails as required.

Technologies: Talend, Talend Cloud Management Platform, Postman, MS SQL Server, Azure.

SP&P, Data Engineer

December 2017 - May 2018

Responsibilities:

  • Job development in Talend to extract data from the Xero, SKU Vault, Shopify, and Amazon APIs and load it into a Postgres DB.
  • Validated data among source, staging, and target.
  • Unit tested the scripts to ensure the data loaded correctly.
  • Implemented error handling in the scripts to log errors.
  • Deployed and scheduled Talend jobs on the server.
  • Created Revenue and Profit & Loss dashboards in Google Data Studio.
  • Participated in daily meetings to discuss upcoming work, schedules, and status; attended review meetings and walkthroughs with the Architect, Quality team, and Developers to discuss queries related to defects.
  • Prepared job documentation, resolved bugs, and reported status to management.

Technologies: Talend, Postgres, Tableau, Postman.

Education

BS (Computer Science)

How to hire with Upstaff

1

Talk to Our Talent Expert

Our journey starts with a 30-min discovery call to explore your project challenges, technical needs and team diversity.

2

Meet Carefully Matched Talents

Within 1-3 days, we’ll share profiles and connect you with the right talents for your project. Schedule a call to meet engineers in person.

3

Validate Your Choice

Bring new talent on board with a trial period to confirm you've hired the right person. There are no termination fees or hidden costs.

Why Upstaff

Upstaff is a technology partner with expertise in AI, Web3, Software, and Data. We help businesses gain a competitive edge by optimizing existing systems and utilizing modern technology to fuel business growth.

Real-time project team launch

<24h

Interview First Engineers

Upstaff's network enables clients to access specialists within hours or days, streamlining the hiring process to 24-48 hours so you can start ASAP.

x10

Faster Talent Acquisition

Upstaff's network & platform enables clients to scale up and down blazing fast. Every hire is typically 10x faster compared to a regular recruitment workflow.

Vetted and Trusted Engineers

100%

Security And Vetting-First

AI tools and expert human reviewers in the vetting process are combined with track records and historically collected feedback from clients and teammates.

~50h

Save Time For Deep Vetting

On average, we save client teams over 50 hours of candidate interviews for each job position. We are fueled by a passion for tech expertise, drawn from our deep understanding of the industry.

Flexible Engagement Models


Custom Engagement Models

Flexible staffing solutions, accommodating both short-term projects and longer-term engagements, full-time & part-time


Unique Talent Ecosystem

The Candidate Staffing Platform stores data about past and present candidates, enabling fast work and scalability and providing clients with valuable insights into their talent pipeline.

Transparent

$0

No Hidden Costs

Price quoted is the total price to you. No hidden or unexpected costs for candidate placement.

x1

One Consolidated Invoice

No matter how many engineers you employ, there is only one monthly consolidated invoice.

Ready to hire Ankit A.
or someone with similar Skills?
Looking for someone else? Join Upstaff for access to all profiles and individual matching.
Start Hiring