Artur

Solutions Architect

Summary

- Solution Architect with 7 years of experience architecting, managing, and implementing platform solutions. Extensive experience designing complex data solutions; I understand the trade-offs between commercial and open-source data solutions and have helped clients implement the best platform for their business requirements.
- IT engineer with 15 years of solid experience across the full development life cycle, CI/CD, software development processes and methodologies, architecture and infrastructure design, and disaster recovery planning.
- Outstanding analytical and problem-solving skills. Results-oriented professional with leadership and communication skills.

Experience

Lead Solution Architect, CTDev, Reinsurance project

June 2020 – March 2023
Responsibilities:

  • Requirements gathering.
  • Designing and documenting end-to-end production solution architecture.
  • Collaborating with the client's top management and enterprise architects to design the integration architecture.
  • Designing a solution to replace production applications with new ones that fulfill current business needs.
  • Designing a unified split-brain protection solution for existing on-prem and cloud-based clusters.
  • Redesigning a monolithic architecture into microservices.
  • Reviewing proposed architecture solutions.
  • Technical supervision.
  • Acting as the single point of responsibility for all software engineering matters.
  • Addressing client issues; identifying and managing risks.
  • Coordinating between multiple disciplines and stakeholders.
  • Building the delivery plan along with timeframe estimates.
  • Hands-on experience:
  • PoC and key-component implementation.
  • Troubleshooting concurrency issues; tuning the Hazelcast cluster with a focus on split-brain protection.
  • Troubleshooting client-specific issues.

Technologies:  AWS Cloud (CodePipeline, KMS, Glue, CloudWatch, IAM, EKS), Hazelcast, Kafka, Java (Spring Boot), PostgreSQL, MongoDB, Angular, GWT

BigData Solution Architect, BigData platform for ADNOC

Dec 2017 – June 2020
Description: Process more than 20 data sources and build a notification system. We implemented near-real-time Spark streaming and improved the reaction time to certain events by a factor of 30. One challenge was the requirement that all data stay within a specific location. The cloud vendor could not deploy the full solution there, so we split the runtime environments and control portals across different locations to meet the business needs.
Responsibilities:

  • Requirements gathering
  • Designing and documenting end-to-end production solution architecture
  • Preparing work breakdown structure for required technical changes
  • Architecture design review
  • Technical supervision
  • PoC implementation
  • Setting up Data Factory pipelines integrated with an Azure Batch account and sources (SQL databases and custom services)
  • Setting up a Databricks Spark cluster
  • Securing REST endpoints with Azure API Management
  • Setting up DevOps pipelines for CI/CD
  • Troubleshooting related issues

Technologies:  Azure Cloud (Data Factory, KeyVault, Insights, Azure Batch, Azure API Management, AAD, Event Hub, Databricks), Kubernetes

BigData Solution Architect, Machine Learning platform for Procter & Gamble US

Description: Perform sales predictions for different regions such as the UK and Russia. We exported four years of sales history and built an ML platform with a Spark cluster for ETL jobs. The project improved prediction accuracy by 23% compared with human analysts. The solution was later easily scaled to many other countries around the globe.
Responsibilities:

  • Requirements gathering
  • Designing and documenting end-to-end production solution architecture
  • Preparing work breakdown structure for required technical changes
  • Technical supervision
  • PoC implementation
  • Setting up ML pipelines leveraging Azure ML services
  • Setting up ETL pipelines leveraging Apache Airflow
  • Setting up an exploratory environment (JupyterLab integrated with the pipelines) for the Data Science team
  • Setting up monitoring tools for the environments
  • Collaborating with the Data Science team and tuning the environment to improve and speed up their work

Technologies:  Azure Cloud, Azure ML services, Azure KeyVault, Azure Insights, Apache Airflow, Jenkins, Jupyter Lab, Python (Anaconda), MLFlow, Ansible, Bash, Git

BigData Solution Architect, Healthcare BigData platform

Description: Build a platform to process and match donors and patients.

We used Kafka to integrate more than 10 data sources into an ingestion layer. To match more than 40 million donors and patients, we built a seven-machine Databricks Spark cluster. This reduced the matching time from 2 hours to 15 minutes.
Responsibilities:

  • Requirements gathering
  • Designing and documenting end-to-end production solution architecture
  • Reviewing the existing implementation
  • Preparing the work breakdown structure for the technical changes required to integrate new services with the existing solution
  • Technical supervision
  • Hands-on experience:
  • PoC implementation
  • Setting up Cloudera Kafka (on-premises)
  • Integrating existing microservices and tools with the new Kafka installation
  • Tuning the on-premises Spark cluster to resolve performance issues (GC)
  • Focusing on an Infrastructure as Code approach to enable quick infrastructure setup for new installations
  • Tuning existing Spark pipelines to optimize data processing

Technologies: AWS Cloud, Cloudera Kafka, Spark, Java (Spring Cloud), Apache Ignite, Apache Kudu, Terraform, Ansible, Bash

Solution Architect, WorkFusion, WorkFusion platform

Sep 2016 - Dec 2017
Responsibilities:

  • Enhancing the existing product architecture
  • Designing sets of customizations on top of the WorkFusion platform to address specific clients' needs, using a PoC approach
  • Designing and documenting solution architecture
  • Cooperating with the Data Science team

Transition & Rollout Support

  • Planning and conducting training for technical and business users to enable the UAT launch
  • Triaging UAT issues, including those related to the DevOps area
  • Processing business-critical change requests revealed during this stage
  • Consulting the partner organization on the technical implementation of the product (the WorkFusion platform)

Contribution to pre-sale:

  • Requirements gathering and conducting Q&A sessions with clients
  • Reviewing the SOW from a functional and technical consulting standpoint
  • Creating the Solution Overview for the SOW

Hands-on experience:

  • Setting up a Mesos cluster for ML tasks, integrated with management tools
  • Setting up on-prem solutions for customers and aligning them with the cloud-based production installation
  • Designing disaster recovery solutions tailored to customers' specific requirements
  • Troubleshooting critical production issues and teaching the customer's team to handle them
  • Preparing documentation

Technologies:  AWS, Apache Mesos, Mesos Marathon, ELK (Elasticsearch, Logstash, Kibana), Kafka,
Spring Cloud, Spring Boot, Amazon S3, Riak CS, PostgreSQL, MongoDB, Nginx,
Selenium grid, Ansible, Python, Bash

Lead Software Engineer/Team Lead, Strevus 

Aug 2013 - May 2015
Responsibilities:

  • Designing and implementing solutions to meet application needs
  • Architectural design and implementation of frameworks (an in-house ORM-like framework, a data-model conversion framework)
  • Evaluating technologies and approaches and applying them to the application
  • Discussing and clarifying requirements with Business Analysts from the business side
  • Participating in setting up and improving software development and support processes
  • Team leadership (4–20 team members)

Technologies:  Spring, AMQP (RabbitMQ), Thrift, Cassandra, ElasticSearch, CXF, Google Guava, Nginx, Tomcat

Lead Software Engineer/Team lead, EPAM Systems 

Aug 2011 - May 2013
Responsibilities:

  • Application implementation
  • Supporting legacy code
  • Making key technology decisions and monitoring their implementation
  • Team leadership (5 team members)

Technologies:  Java EE, ATG, Oracle, JSP, JavaScript, JUnit

Software Engineer, Itision, Minsk, Belarus

Dec 2008 - May 2011
Responsibilities:

  • Application implementation

Technologies: Hibernate, Spring, MySQL, Guice, Lucene, Apache Velocity, Google Protobuf, GWT (RPC), GIN, JSP, JSPX, JavaScript, CSS, HTML

Education

Bachelor’s Degree, Belarusian National Technical University, Minsk, Belarus
2007