Senior Azure DevOps Engineer for a Data Spaces Platform
Summary
- We are looking for a Senior Azure DevOps Engineer to help build a full-cycle data management platform covering data ingestion, ETL, data quality, data enrichment, and data processing pipelines orchestrated into an "elastic data fabric" and, most importantly, utilizing federated learning.
- Candidates with AWS/GCP DevOps experience are also considered, provided they are familiar with Azure and data processing.
- Azure Purview, Active Directory, Azure Key Vault, Databricks, AKS, Docker, Azure Arc, and CI/CD experience would be a plus.
- Start: ASAP
- Duration: Long-term
Project Description
We are looking for a Senior Azure DevOps Engineer to work on a platform that automates data ingestion, processing, and sharing through user-friendly, privacy-preserving, and scalable solutions for industrial manufacturing.
The platform will incorporate scalable and dynamic tools for creating and managing data spaces, handling complex data workflows, and ensuring modularity and privacy compliance.
Preliminary Project Stack:
- Backend: Python, Flask/FastAPI, Go
- Frontend: ReactJS, Angular
- AI/ML: Azure Machine Learning, Azure Databricks, TensorFlow Federated, PyTorch, and privacy-enhancing techniques
- Cloud and DevOps: Kubernetes, Docker, Azure DevOps, CI/CD, data pipelines on Azure
- Data Engineering: Apache NiFi, Kafka Connect, Databricks (on Azure)
- Database: Cosmos DB, Postgres/Hyperscale, or MySQL/HeatWave
*The stack may change as qualified specialists are hired in their respective areas
Responsibilities:
- Infrastructure setup
- CI/CD pipelines (a sketch of such a pipeline follows this list)
- System scalability
- Real-time deployment of modular components and infrastructure stability
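To illustrate the CI/CD responsibility above, here is a minimal Azure Pipelines sketch that builds a container image, pushes it to a registry, and deploys it to AKS with Helm. All names below (service connections, resource group, cluster, chart path) are placeholders for illustration, not actual project resources:

```yaml
# Minimal sketch: build and push an image, then deploy it to AKS via Helm.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

variables:
  imageRepository: data-fabric/ingestion-service   # hypothetical service name
  tag: $(Build.BuildId)

steps:
  - task: Docker@2
    displayName: Build and push image to ACR
    inputs:
      command: buildAndPush
      containerRegistry: acr-service-connection    # assumed service connection
      repository: $(imageRepository)
      dockerfile: Dockerfile
      tags: $(tag)

  - task: HelmDeploy@0
    displayName: Deploy chart to AKS
    inputs:
      connectionType: 'Azure Resource Manager'
      azureSubscription: azure-service-connection  # assumed service connection
      azureResourceGroup: rg-data-platform         # assumed resource group
      kubernetesCluster: aks-data-platform         # assumed AKS cluster
      command: upgrade
      chartType: FilePath
      chartPath: charts/ingestion-service          # assumed chart location
      releaseName: ingestion-service
      overrideValues: image.tag=$(tag)
      install: true
```

In practice each modular platform component (ingestion, ETL, enrichment) would get its own chart and release, so components can be rolled out and rolled back independently.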
Requirements:
- Azure DevOps / GitHub Actions / Jenkins; container orchestration (AKS, Helm)
- Terraform, ARM/Bicep templates, or other infrastructure-as-code experience is essential to ensure consistent deployments across environments (see the sketch after this list)
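As a sketch of the infrastructure-as-code requirement, the pipeline fragment below applies one Terraform configuration to any environment, with only the backend and variable files differing per environment. The directory layout, file names, and environment list are assumptions for illustration:

```yaml
# Minimal sketch: one Terraform configuration, parameterised per environment.
parameters:
  - name: environment
    type: string
    default: dev
    values: [dev, staging, prod]

pool:
  vmImage: ubuntu-latest

steps:
  # Per-environment state backend config (assumed backends/ layout).
  - script: terraform init -backend-config=backends/${{ parameters.environment }}.tfbackend
    displayName: terraform init
    workingDirectory: infra

  # Same configuration everywhere; only the tfvars file differs.
  - script: terraform plan -var-file=env/${{ parameters.environment }}.tfvars -out=tfplan
    displayName: terraform plan
    workingDirectory: infra

  - script: terraform apply -auto-approve tfplan
    displayName: terraform apply
    workingDirectory: infra
```

Because every environment runs the same configuration with its own variable file, drift between dev, staging, and prod is limited to the values that are deliberately parameterised, which is what keeps deployments consistent.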
Would be a plus:
- ADLS Gen2 or an equivalent data lake
- Azure Purview, Active Directory, Azure Key Vault, Databricks, AKS, Docker, Azure Arc, CI/CD