
Data Engineer

Remote (Europe) · Saint Julian's, Malta · €35,000 - €64,350 per year · Data Team

Job description

At BlueLabs we're combining the buzzing world of sports betting with modern tech and a great engineering culture. We own our multi-tenant sports betting platform end-to-end. It consists of tens of microservices in a handful of decoupled domains, orchestrated by a Terraform-provisioned Kubernetes cluster, achieving high scalability thanks to an event-driven architecture based on Apache Pulsar. We follow modern CI/CD and agile methodologies to deploy into production multiple times daily, and we use Grafana to monitor our infrastructure and applications.

We're now looking for a Data Engineer to join our Data Team. The team covers a wide range of skills to drive data-related initiatives and impact various parts of the organisation and its tenants. We aim to provide a solid, modern data platform, discover insights, promote data-driven decisions, and collaborate with other teams to optimize, innovate, and enhance our product. The team's mission is to provide an ecosystem where data is transmitted, modelled, stored, processed, and analyzed in a fast, efficient, reliable, and secure way. We want to provide our company with a competitive advantage by leveraging real-time data analytics and ML-powered products.


    • Identifying problems, designing solutions, implementing them, performing code reviews, and maintaining services in the production environment

    • Modelling the data storage layer carefully, ensuring reliable and swift message transfer, and building high-performance data pipelines and integrations, while supporting real-time analytics and data science flows

    • Applying simple and effective solutions, and a “getting things done” mentality

However, that's not all! At BlueLabs, we encourage you to contribute wherever your interests take you — and shape your role and our product accordingly.


The compensation range for this role is 35,000 EUR - 64,350 EUR annually, depending on your skills and experience. We encourage you to read our Recruitment FAQs for further details. In addition to the monetary compensation, we provide several perks, including a shiny new 16" MacBook Pro (M1 Pro) or Linux laptop.

Job requirements

  • BS degree in Computer Science or a similar technical field

  • 2+ years of professional software engineering experience

  • 1+ years of experience working with relational databases (Postgres, MySQL) and writing complex SQL queries

  • Deep understanding of modern back-end systems, microservices, message-driven architecture, distributed systems, and data replication

  • Background in building ETL and ELT processes - knowledge of Flink, Spark, and dbt is highly appreciated

  • Understanding of data streaming concepts and technologies such as Kafka, Pulsar, and RabbitMQ

  • Familiarity with Agile methodology, containerization, continuous integration/deployment, cloud environment, and monitoring

  • Ability to write clean, efficient, maintainable, and well-tested code; Python/Java/Golang skills are a big plus

  • Analytical thinking, troubleshooting skills, and attention to detail

  • Good communication skills in verbal and written English

Nice to have

    • Experience with Data Warehouses (BigQuery, SingleStore), Bitemporal Modelling, Data Lakes, Stream Processing (Apache Flink, Beam), Batch Processing (Spark), Workflow Management Tools (Prefect, Argo), BI Tools (Looker), or other Big Data solutions is highly appreciated

    • Knowledge of Terraform/IaC and Kubernetes

    • Experience setting up dashboards and alerting in Grafana/DataDog