Data Platform Software Engineer for multinational project – remote!
Last update: 13.03.2023 13:51
Job Type: Full-time, Freelance
Sector: Development & Engineering, Infrastructure, Data Science & Analytics
Benefits: Meal vouchers, Sick Days, Extra vacation, Flexible Schedule, Cafeteria system, Pension Insurance Contribution, Multisport Card Contribution, Partial Remote Work, Czech language not required, Educational courses and training
Hiring: Men and Women
I’m looking for an experienced Software Engineer for a new Data Platform team. You’ll play a critical role in deploying and expanding the new data platform across Europe and migrating legacy data initiatives. You’ll provide hands-on support to country data teams, collaborate with the product team, and develop components on top of the new platform. This is a key position, located in Prague near the Data Science and Data Modelling teams and working closely with the team in the Netherlands.
Join the new Data Platform Catapult team as a Software Engineer and help build and automate new data platform infrastructure. You will be responsible for building and automating data platform infrastructure and functional data components, and for enabling and actively collaborating with platform teams in the business units to deliver high-quality software components. You’ll work with over 75 data engineers across 10 teams in 10 countries, which means regular direct contact with data engineers from different teams, cloud engineers, MLOps engineers, the product owner, and the product manager.
The tech stack is Microsoft Azure for cloud infrastructure, Terraform for IaC, Terratest (Go) for testing, Azure Data Factory for pipelines and orchestration, and Databricks for processing and transforming data. Come be a part of the team and help drive innovation through data!
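For a flavour of day-to-day work with this stack, infrastructure is declared in Terraform and deployed to Azure. The sketch below is purely illustrative; the resource names, region, and settings are hypothetical, not taken from the actual platform:

```terraform
# Hypothetical sketch: a resource group and a Data Lake Gen2 storage
# account for a data platform landing zone. Names are illustrative.
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "data_platform" {
  name     = "rg-data-platform-dev"
  location = "westeurope"
}

resource "azurerm_storage_account" "datalake" {
  name                     = "stdataplatformdev"
  resource_group_name      = azurerm_resource_group.data_platform.name
  location                 = azurerm_resource_group.data_platform.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true # hierarchical namespace for Data Lake Gen2
}
```

Modules like this are typically exercised with Terratest, which spins the infrastructure up in a Go test, asserts on the result, and tears it down again.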
- You have relevant software and data platform engineering experience, building platforms that are modular, testable, scalable, and easily consumable.
- You have hands-on experience with one or more cloud providers (Azure/GCP/AWS) and their services: ADF, Data Lake, Delta Lake, Databricks, Key Vault, BigQuery, Cloud Dataflow, Data Pipeline, etc.
- Demonstrated programming and debugging experience with Python, PySpark, SQL, and Go.
- Experience building API and microservice solutions.
- Experience with Data as Code: version control, small and regular commits, unit tests, CI/CD, packaging, branching, containerization, etc.
- Preferably, experience with open-source projects run with a “build once, deploy often” mindset, and experience with or interest in Domain-Driven Design.
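The “Data as Code” practice above — data transformations kept under version control and covered by unit tests like any other software — can be sketched in plain Python. The function and test below are hypothetical examples, not part of the actual platform:

```python
# Hypothetical sketch: a small data transformation treated as code,
# with a unit test alongside it (the "Data as Code" practice).

def deduplicate_events(events: list[dict]) -> list[dict]:
    """Keep only the latest record per event_id, by timestamp."""
    latest: dict[str, dict] = {}
    for event in sorted(events, key=lambda e: e["timestamp"]):
        latest[event["event_id"]] = event  # later timestamps overwrite earlier
    return list(latest.values())


def test_deduplicate_events() -> None:
    events = [
        {"event_id": "a", "timestamp": 1, "value": 10},
        {"event_id": "a", "timestamp": 2, "value": 20},
        {"event_id": "b", "timestamp": 1, "value": 30},
    ]
    result = deduplicate_events(events)
    assert len(result) == 2
    assert next(e for e in result if e["event_id"] == "a")["value"] == 20


test_deduplicate_events()
```

In practice such tests would run in CI on every commit, with the transformation packaged and deployed through the same pipeline as application code.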
- Plenty of training and education opportunities in the Learning & Development Centre
- A large (international) network of colleagues who are happy to share their knowledge with you
- The autonomy to determine your own development path
- Flexible work from home policy
District: Praha hlavní město