Senior Data Engineer

About the Role

Is data engineering your passion? Do you love turning data into valuable assets?
Do you enjoy creating data products that boost productivity across teams and organisations?
Do you have solid experience in designing and maintaining data architectures, an engineering mindset, and a passion for analysing complex datasets?
Do you thrive in agile environments and deliver in short iterations?
If this sounds like you, we need YOU on our team!

What You Will Do

  • Design, develop, optimize, and maintain data architectures.
  • Design and maintain the ingestion of data from multiple sources.
  • Analyze, manipulate, transform, and process large and complex datasets.
  • Enable training and running of machine learning models.
  • Build real-time data pipelines.
  • Help customers become cloud-native and data-driven companies.
  • Support the team with active knowledge transfer.
  • Influence the introduction of new tools, methodologies, and techniques.
  • Work in an agile and cross-functional team.

This Is You

At heart, you are a passionate team player who respects others’ opinions:

  • You know what it takes to be a great team player.
  • You have a strong eye for detail and excel at documentation.
  • You base your decisions on metrics.
  • You work in a structured way and set a benchmark for quality.
  • You are open to new technologies.
  • 5+ years of experience as a Data Engineer.
  • 3+ years of experience in Python or Scala and SQL.
  • A bachelor’s degree in computer science, data science, data engineering, or a related field such as mathematics or physics.
  • Experience in semantic modelling and familiarity with Data Lake, Data Warehouse, Data Vault, and Data Mart concepts.
  • Deep understanding of structured and unstructured data stores (distributed filesystems, SQL, NoSQL).
  • Knowledge of how to structure data pipelines for reliability, scalability, and performance.
  • Comfortable with analytics processing engines (e.g., Spark, Flink).
  • Experience with multiple storage formats (e.g., JSON, Parquet, ORC).
  • Fluent in English (German is a plus).

Bonus Experience (Nice to Have)

  • ML Engineering & MLOps, including deploying, tracking, and monitoring models (MLFlow, Kubeflow, TensorFlow Serving, etc.).
  • Experience with cloud technologies such as Azure Databricks, Fabric, Snowflake, AWS Athena, or Google BigQuery.
  • Experience building real-time data pipelines (Azure Stream Analytics, Kinesis, Dataflow, Kafka, RabbitMQ).
  • Familiarity with CI/CD for data pipelines and ML models (GitHub Actions, Jenkins, Airflow).

About Us

At MobiLab, we empower our employees to bring their creative mindset into action while guiding our customers toward unlocking their full data potential and becoming cloud-native organisations.

We are a diverse and dynamic team united by engineering excellence. We value inclusivity and welcome individuals of all backgrounds, including those with disabilities. Here, you will directly influence the future of business and contribute to industry-leading companies.

We are dedicated to your growth. Our culture encourages continuous learning and knowledge sharing through our MobiLab Career Development framework. Our headquarters in the heart of Cologne provides a creative, inspiring workspace.

We offer a range of benefits, including a public transport ticket, access to industry conferences, a company pension scheme, and more.

If you’re passionate about Cloud Integration and strive for engineering perfection, we invite you to join the MobiLab Team. Let’s grow together!
