We are Kaizen Gaming
Kaizen Gaming, the team powering Betano, is one of the biggest GameTech companies in the world, operating in 19 markets. We always aim to leverage cutting-edge technology, providing the best experience to our millions of customers who trust us for their entertainment.
We are a diverse team of more than 2,700 Kaizeners from 40+ nationalities, spread across 3 continents. We are an equal opportunity employer committed to fostering a diverse and inclusive workplace. We welcome applications from individuals of all backgrounds, regardless of race, gender, religion, sexual orientation, or age.
Our #oneteam is proud to be among the Best Workplaces in Europe and certified Great Place to Work across our offices. Here, there’ll be no average day for you. Ready to Press Play on Potential?
Let's start with the role
Our Lakehouse infrastructure has been a foundational pillar, seamlessly supporting teams ranging from BI, Data Science, and AI to our innovative product squads. As we move toward the next frontier, defined by Data Products and insights delivered through a Data Mesh paradigm, we anticipate transformative outcomes. Central to our mission, we aim to enhance the robustness of our data pipelines, the agility of our data products, and the precision of our solutions, all while upholding the highest standards of data governance.
As a Senior Analytics Engineer at Kaizen Gaming, you will play a critical role in shaping our data ecosystem. Operating at the silver and gold layers of our medallion architecture, you will lead by example in ensuring data is discoverable, secure, high-quality, and observable. You will act as a key driver in elevating our data infrastructure, mentoring peers, and influencing cross-functional initiatives.
Our Analytics Engineering teams consist of members with diverse backgrounds, and our tech stack includes, but is not limited to, Databricks, Delta Lake, dbt Cloud, SQL Server, Azure Data Warehouse, SSIS, SSAS, SSRS, Spark, Apache Airflow, and Apache Kafka.
As a Senior Analytics Engineer, you will:
- Lead the design and implementation of robust, scalable data models and transformation pipelines across key domains.
- Architect and maintain high-performance ELT workflows, ensuring efficiency, reliability, and observability at scale.
- Translate complex business problems into elegant technical solutions that align with our long-term data strategy.
- Establish and champion best practices in data governance, modeling, testing, and documentation across teams.
- Mentor mid-level analytics engineers and contribute to the development of team-wide standards and frameworks.
- Collaborate cross-functionally with data, product, and engineering peers to drive strategic initiatives and build innovative data products.
What you'll bring
- 6+ years of hands-on experience writing complex, highly optimized SQL code for data manipulation, transformation, and reporting.
- 4+ years of experience in ETL/ELT development, data modeling, data warehouse architecture, and reporting tools.
- Proven ability to design and maintain scalable ELT workflows in modern, cloud-native environments.
- Deep understanding of data modeling techniques and enterprise data warehouse principles.
- A track record of technical leadership: driving initiatives, mentoring peers, and establishing best practices.
- Hands-on experience with the modern data stack, including tools like dbt, Airflow, and Databricks.
- Familiarity with CI/CD practices in data engineering and deployment pipelines.
- Strong communication skills, with the ability to effectively bridge technical and non-technical audiences.
It’s a plus if you bring
- Degree in a quantitative/technical field (e.g. Computer Science, Statistics, Engineering) or equivalent industry experience.
- Experience with programming languages, preferably Python or Scala.
- Experience working with modern OLAP systems (e.g. Apache Pinot, ClickHouse, StarRocks).
- Familiarity with data observability and data quality tools such as Great Expectations, Monte Carlo, Validio, or Soda, and a passion for building trust in data.
- Knowledge of data governance principles and best practices.
- Exposure to an Agile team-working environment.