
Senior Data Engineer - German Speaking (f/m/d)

VOIDS

Software Engineering, Data Science
Germany
EUR 90k-110k / year + Equity
Posted on Mar 25, 2026

We maximize product availability with minimal cashflow investment in 1/10 of the time.

We solve a real problem for SMEs. With AI.

VOIDS is the AI brain for mid-size Shopify brands' inventory. We forecast demand at the product level, catch stockouts and inefficiencies before they happen, and give e-commerce teams exactly the right action — or execute it automatically with a single click.

The result: 98% inventory efficiency, 20x ROI, and six-figure cash unlocked. Within weeks.

We launched in June 2023. Since then: 300% growth, 1B+ data points processed, €2M ARR, and 50+ brands live — including Hyrox, 6pm, Creamyfabrics, and NatureHeart. Now, we're targeting €10M ARR by 2027.

Today we own demand forecasting and stock management. Our vision for tomorrow: AI handles procurement end-to-end — fully autonomous.

This is where you come in. We're a small, fast team and every hire shapes the trajectory of the company. You'll shape how we ingest, process, and activate 1B+ data points, and help us build the data foundation for a fully AI-driven procurement future. Work directly with Jannik and Tobias, who live and breathe e-commerce and AI.

We're just getting started — want to build it with us?

🚀 The Mission

At VOIDS, data is the product. Our pipelines already move over €1,000,000,000 in yearly e-commerce revenue through complex transactional, behavioral, and marketing datasets. We're scaling fast — and we need an engineer who can understand customer needs intuitively, expand the flexibility of our data pipelines, and push the boundaries of what's possible when you delegate entire data and engineering workflows to AI.

🛠 What You'll Do

You'll own the reliability and growth of our data infrastructure end-to-end. This isn't a ticket-execution role — you'll identify problems, design solutions, and ship them yourself.

Connectivity Expansion & Integrations

  • Expand our data connector ecosystem far beyond Shopify and Amazon, paving the way for complete AI-driven custom integrations.
  • Evaluate, implement, and maintain new data sources in a way that works with existing flows — system stability and customization tolerance are non-negotiable.
  • Work closely with customers to understand their data sources, requirements, and edge cases — you are the first technical contact when it comes to what data goes into our system.

Customer & Team Collaboration

  • Communicate fluently in German and English — with customers during onboarding and pilot projects, and async with the internal team.
  • Act as a bridge between customer needs and technical implementation, translating real-world data messiness into clean, reliable pipelines.
  • Understand the e-commerce space intuitively: suggest solutions to customers and implement them before the customer even asks.

Data Pipeline Architecture

  • Take ownership of our Bronze → Silver → Gold medallion architecture: the logic between layers needs to be airtight, well-documented, and consistent.
  • Scale the pipelines to new heights: more data, faster pipelines, lower costs. Find abstraction layers that let us scale across many customers with very different requirements.
  • Improve developer experience: enable fast iteration cycles and a smooth developer experience when working with existing systems or building new things on top of them.
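To make the Bronze → Silver → Gold flow above concrete, here is a minimal, illustrative sketch in Pandas (from the stack below). All column names and values are hypothetical — real layer definitions at VOIDS will differ:

```python
import pandas as pd

# Bronze: raw order events as a connector might deliver them --
# append-only, stringly-typed, duplicates possible (illustrative data).
bronze = pd.DataFrame({
    "order_id": ["A1", "A1", "A2", "A3"],
    "sku": ["shirt-s", "shirt-s", "shirt-m", "shirt-s"],
    "qty": ["2", "2", "1", "3"],  # strings straight from the API
    "ordered_at": ["2026-03-01", "2026-03-01", "2026-03-02", "2026-03-02"],
})

# Silver: deduplicated and typed -- the layer downstream jobs can trust.
silver = (
    bronze
    .drop_duplicates(subset=["order_id", "sku"])
    .assign(
        qty=lambda d: d["qty"].astype(int),
        ordered_at=lambda d: pd.to_datetime(d["ordered_at"]),
    )
)

# Gold: business-level aggregate, e.g. daily units sold per SKU,
# ready as an input for demand forecasting.
gold = (
    silver
    .groupby(["sku", "ordered_at"], as_index=False)["qty"]
    .sum()
    .rename(columns={"qty": "units_sold"})
)
```

The point of the layering is exactly what the bullet above asks for: the logic between layers is explicit, documented in one place, and reusable across customers with different raw schemas.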

AI-Delegated Development Workflows

  • Fully embrace AI tooling — not just as a productivity booster, but as a core part of how you work: delegate end-to-end workflows (testing, development, staging, production) to AI agents where possible.
  • Build and maintain AI-driven pipelines that can handle deep customization without system failures — the architecture must be robust enough that AI-generated changes don't break production.
  • Push the limits of what's achievable by combining your engineering judgment with AI automation. 10x yourself every year.

Data Quality, Testing & Reliability

  • Own the full development lifecycle: testing → development → staging → production, with automated checks at every layer.
  • Set up and maintain robust testing environments and DataOps/MLOps workflows to enable rapid iteration.
  • Proactively identify bottlenecks, inconsistencies, and schema drift — and fix them before they reach downstream consumers.
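As a sketch of the schema-drift checks mentioned above, a lightweight guard could compare each incoming batch against an expected contract before it reaches downstream consumers. The contract and helper below are hypothetical, not VOIDS's actual tooling:

```python
import pandas as pd

# Illustrative schema contract for one incoming feed
# (in practice this would live alongside the pipeline definition).
EXPECTED = {"order_id": "object", "qty": "int64", "ordered_at": "datetime64[ns]"}

def schema_drift(df: pd.DataFrame, expected: dict) -> list:
    """Return human-readable drift findings; an empty list means no drift."""
    findings = []
    for col, dtype in expected.items():
        if col not in df.columns:
            findings.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            findings.append(f"type drift on {col}: expected {dtype}, got {df[col].dtype}")
    for col in df.columns:
        if col not in expected:
            findings.append(f"unexpected column: {col}")
    return findings
```

Run at the Bronze → Silver boundary, a check like this turns silent upstream API changes into an explicit, actionable alert instead of a broken forecast.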

✅ Must-Have Skills

  • Fluent German and English — both written and spoken (customer-facing communication required)
  • 3+ years of experience in Data Engineering or closely related roles
  • 3+ years experience in Python, particularly with data manipulation libraries (Pandas, Polars) for efficient data processing
  • Deep proficiency in SQL and PostgreSQL for structured data
  • Hands-on experience building and maintaining scalable streaming, event-driven, and batch data pipelines and workflows as inputs for web applications and AI models
  • Proven ability to set up and maintain robust testing environments, and manage efficient DataOps/MLOps workflows to enable rapid iteration
  • Familiarity with infrastructure and containerization frameworks (Kubernetes, Docker, Terraform)
  • End-to-end expertise in designing and operating scalable data platforms, including storage (S3/Parquet), data pipelines, APIs, and connectors, with a strong grasp of layered data architectures.
  • Daily, fluent use of AI tools — you actively delegate end-to-end workflows to AI: from testing and development through to staging and production. AI is not a helper tool; it's how you multiply your output.
  • Strong product intuition and understanding with a proactive, ownership-oriented mindset
  • Comfortable with ambiguity, autonomous decision-making, and direct customer contact

🌟 Bonus / Nice-to-Have

  • Experience in B2B AI startups / scale-ups
  • Experience with eCommerce data sets and solutions (Shopify, Amazon Seller Central, Google Ads, Meta Ads, Klaviyo, Channable, etc.)
  • Familiarity with scalable big data tools and frameworks (dbt, dask, Apache Spark, EMR, Databricks, AWS Glue)
  • Familiarity or interest in Data Science workflows, especially related to time series forecasting (Nixtla, Darts, statsmodels, sktime)
  • Contributions to developer experience, data observability, or internal tooling improvements

🧱 Tech Stack

  • Programming: Python (Pandas, Polars), SQL
  • Data Storage & Management: PostgreSQL, AWS S3 (Parquet), BigQuery
  • Orchestration: Airflow, EventBridge, cron jobs
  • AI Tools: Claude Code, CursorAI Agents
  • Containerization: Docker, Kubernetes, Terraform
  • Data Integration: Airbyte (self-hosted on Kubernetes)
  • Processing & ML: AWS SageMaker, AWS Lambda, MLflow

Optional, if you're interested in expanding into data science tasks (full-stack mindset appreciated):

  • Modeling & Analytics: Statistical, ML, and neural time series forecasting (Nixtla, statsmodels, XGBoost)

🎯 What Success Looks Like (First 3 Months)

  • You understand the existing medallion architecture deeply and have made significant improvements to increase developer experience and pipeline performance.
  • You've shipped at least one new integration with a third-party data source that's live in production.
  • The logical definitions between architecture layers are documented and consistent.
  • You've had real conversations with customers and turned their feedback into technical improvements.
  • Everyone in the team knows what you're working on, and you're proactively unblocking others.
  • You've increased the team's productivity by implementing AI workflows.

🤖 How We Work

  • AI-first engineering: We don't just use AI tools — we delegate entire workflows to them. You're expected to embrace this fully and help us push it further.
  • Fast-paced, high-impact, no overhead: short daily stand-ups (15 min), efficient weekly planning (30 min), autonomous decisions, ship daily
  • Pragmatic engineering values: simplicity, maintainability, customer focus — no over-engineering.
  • Customer proximity: You'll be in direct contact with customers in pilot projects. Good communication matters as much as good code.
  • 50/50 hybrid: Remote flexibility combined with our office in Hamburg city centre with drinks and snacks.
  • Autonomous decision making: We trust engineers to own their work and loop others in when needed; typically there's only lightweight consultation with the CTO and fellow engineers

🎁 What You’ll Get

  • Permanent full-time contract (no B2B)
  • Competitive salary (€90,000–€110,000)
  • Equity available for senior hires
  • 30 days paid vacation
  • Any AI subscriptions you want, with unlimited usage
  • New MacBook Pro & at least two monitors in the office ;)
  • Regular team events and quarterly off-sites
  • Real ownership and influence
  • A calm, focused work environment that rewards initiative
  • Wellpass membership to unlimited fitness, yoga, swimming, climbing, and more

🧑‍🏫 Hiring Process

We move fast and keep it simple.

  • Initial Screening (30 min)
  • Technical Interview with CTO (30 min)
  • Realistic Live Coding Challenge (90 min)
  • Meet the Team in Hamburg
  • Offer within 2 weeks from start to decision

💡 How to apply

We care less about titles and more about impact.

When you apply, tell us:

  • A connector or integration you built and what complexity you dealt with
  • How you currently use AI in your daily engineering workflow — concretely, not in theory
  • What motivates you, and what kinds of data problems you find genuinely interesting
  • 👉 Send us your answers and your CV: jobs@voids.ai

    Or shoot us a message on LinkedIn.