Remote | Malta, Saint Julian's, Europe | Data Team
At BlueLabs we're combining the buzzing world of sports betting with modern tech and a great engineering culture. We own our multi-tenant sports betting platform end-to-end. It consists of tens of microservices in a handful of decoupled domains, orchestrated on a Terraform-provisioned Kubernetes cluster, and achieves high scalability thanks to an event-driven architecture based on Apache Pulsar. We follow modern CI/CD and agile methodologies to deploy to production multiple times per day, and we use Datadog to monitor our infrastructure and applications.
We're now looking for a Data Engineer to join our Data Team. The team covers a wide range of skills to drive data-related initiatives and impact different parts of the organization. Our aim is to provide a solid data platform, discover insights, promote data-driven decisions, and collaborate with other teams to optimize, innovate, and enhance our services. The team's mission is to provide an ecosystem where data is transmitted, stored, processed, and analyzed in a fast, stable, reliable, and secure way. By applying data science to the gathered data and building ML-powered applications, we want to give our company a competitive advantage.
- Identifying problems, designing solutions, implementing them, performing code reviews, and maintaining services in the production environment
- Careful modeling of the data storage layer, ensuring reliable and swift message transfer, building high-performance data pipelines, and supporting Analytics and Data Science flows
- Applying simple and effective solutions, and a “getting things done” mentality
However, that's not all! At BlueLabs, we encourage you to contribute wherever your interests take you — and shape your role and our product accordingly.
The compensation range for this role is €50,000 to €80,000 annually, depending on your skills and experience. We encourage you to read our Recruitment FAQs for further details. In addition to the monetary compensation, we provide several perks, including a shiny new MacBook 16" M1 Pro or Linux laptop.
- BS degree in Computer Science or a similar technical field
- 2+ years of professional software engineering experience
- 1+ years of experience working with relational databases (Postgres, MariaDB, Oracle) and writing complex SQL queries
- Deep understanding of modern back-end systems, microservices, message-driven architecture, distributed systems, and replication
- Background in building data transformation pipelines; knowledge of dbt is highly appreciated
- Understanding of data streaming concepts and technologies such as Kafka, Pulsar, and RabbitMQ
- Familiarity with Agile methodology, containerization, continuous integration/deployment, cloud environment, and monitoring
- Ability to write clean, efficient, maintainable, and well-tested code; Golang/Java/Python skills are a big plus
- Analytical thinking, troubleshooting skills, and attention to detail
- Good communication skills in verbal and written English
Nice to have
- Experience with Data Warehouses (BigQuery, Snowflake, SingleStore), Data Lakes, NoSQL, Stream Processing (Apache Beam, Flink), Workflow Management Tools (Prefect, Airflow, Argo), BI Tools (Looker), or other Big Data solutions is highly appreciated
- Knowledge of Terraform/IaC and Kubernetes
- Experience setting up dashboards and alerting in Grafana/Datadog