Data Engineer - Metica
The Role
We’re looking for a skilled data engineer to join our team and help us build and scale the data infrastructure that powers Metica’s cutting-edge personalization and optimization solutions. In this role, you’ll design and implement robust data platforms, pipelines, and systems that process and analyze vast amounts of data in real time. Your work will directly impact our ability to deliver game-changing insights and tools to some of the world’s leading gaming studios and publishers.
This is an exciting opportunity for a hands-on data engineer who thrives in a startup environment, loves solving complex problems, and is passionate about gaming and data-driven technology. The role is based in our London office, with the flexibility to work from home one day per week.
You Will
- Design, build, and maintain scalable data pipelines using tools like Spark, Dask, Pandas, or Ray
- Develop data platforms such as data lakehouses, data lakes, and data warehouses
- Work with big data platforms like EMR, Databricks, or Dataproc to process and manage large datasets
- Leverage your expertise in cloud environments (AWS, Azure, or Google Cloud) to ensure the scalability and reliability of our systems
- Support and operate production systems, ensuring data pipelines run smoothly and efficiently
- Collaborate with product, engineering, and data science teams to optimize platform performance and functionality
You Need
- Demonstrable experience building data pipelines with Spark, Dask, Pandas, or Ray
- Proficiency in Python (preferred), Scala (preferred), or Java (desirable)
- Hands-on experience working with cloud providers like AWS, Azure, or Google Cloud
- Proven knowledge of big data platforms like EMR, Databricks, or Dataproc
- Expertise in building and managing data platforms, including data lakehouses, lakes, or warehouses
Bonus Points
- Experience with automated data quality frameworks
- Familiarity with CI/CD pipelines
- Understanding of machine learning models
- Knowledge of infrastructure-as-code tools
- Experience with data visualization tools