
AI Developer - Data Aggregation & Intelligence

We Make Change

Software Engineering, Data Science
Paris, Île-de-France, France
Posted on Feb 23, 2026

Volunteer for a startup building AI-powered infrastructure for sustainable business! 📊

Intersektion provides an AI-powered ESG infrastructure that enables companies to turn sustainability into a structured, measurable, and value-generating system.

Companies today face accelerating environmental and social expectations from regulators and investors, yet most struggle to keep up. ESG data is often scattered across departments, collection is manual and time-consuming, and there is significant uncertainty regarding complex regulatory compliance. As sustainability becomes a core business requirement, organizations lack the operational systems needed to connect their actions to real-world impact and financial performance.

Intersektion provides a centralized platform that automatically imports financial and extra-financial data, using AI to extract and consolidate key ESG indicators. By structuring governance workflows and centralizing documentation for audits, we help companies generate compliant reports (such as CSRD and ISO standards). Our platform moves sustainability from a communication topic to a structured data system, reducing risk and ensuring every action is documented and measurable.

Role (Volunteer, unpaid): AI Developer – Data Aggregation & Intelligence

Role Description: We are looking for an AI Developer specialized in large-scale data aggregation, structuring, and intelligent processing. The core mission of this role is to design and build robust data architectures capable of ingesting, cleaning, normalizing, and transforming heterogeneous data sources into structured, high-value assets that power AI-driven products and decision systems. This is not just a backend role. The objective is to turn raw, fragmented data into scalable intelligence that directly impacts product performance and business outcomes.

Key responsibilities:

  • Design and maintain scalable data ingestion pipelines (APIs, structured files, unstructured documents, external feeds).
  • Build automated systems for data cleaning, normalization, enrichment, and validation.
  • Architect data infrastructures optimized for AI usage (vector databases, indexing systems, semantic search when relevant).
  • Implement continuous data synchronization and update mechanisms.
  • Ensure data quality, consistency, traceability, and governance.
  • Collaborate closely with product and engineering teams to translate business needs into scalable data systems.
  • Optimize performance, reliability, and scalability of data workflows.
  • Implement monitoring and alerting systems to ensure pipeline stability.
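To make the ingestion and normalization responsibilities above concrete, here is a minimal, hypothetical sketch of the kind of pipeline step involved: heterogeneous raw records (as they might arrive from an API, a CSV export, and a document parser) are cleaned, unit-normalized, and validated into one consistent schema. All names, fields, and unit factors are illustrative assumptions, not part of Intersektion's actual stack.

```python
# Hypothetical sketch: normalizing heterogeneous ESG-style records
# from different sources into one validated schema.
from dataclasses import dataclass

@dataclass
class Indicator:
    source: str
    name: str
    value: float  # always stored in the canonical unit (tCO2e here)
    unit: str

# Raw records as they might arrive from three different sources.
RAW = [
    {"src": "api", "metric": "scope1_emissions", "val": "120.5", "unit": "tCO2e"},
    {"src": "csv", "metric": "Scope1_Emissions", "val": "0.2",   "unit": "ktCO2e"},
    {"src": "doc", "metric": "scope1 emissions", "val": "n/a",   "unit": "tCO2e"},
]

# Conversion factors to the canonical unit (illustrative).
UNIT_FACTORS = {"tCO2e": 1.0, "ktCO2e": 1000.0}

def normalize(record):
    """Clean one raw record; return an Indicator, or None if validation fails."""
    try:
        value = float(record["val"]) * UNIT_FACTORS[record["unit"]]
    except (ValueError, KeyError):
        return None  # reject unparsable values or unknown units
    # Canonicalize the metric name so the same indicator aggregates correctly.
    name = record["metric"].lower().replace(" ", "_")
    return Indicator(record["src"], name, value, "tCO2e")

clean = [ind for r in RAW if (ind := normalize(r)) is not None]
for ind in clean:
    print(ind)
```

In this sketch the third record is dropped by validation ("n/a" is not a number) and the second is converted from kilotonnes to tonnes, so both surviving records share the same indicator name and unit and can be consolidated downstream.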

The ideal candidate understands that raw data has no value without structure and intelligence. The goal is to build a resilient, automated, and scalable data foundation capable of powering advanced AI models and high-impact product features.

Time Commitment: Volunteer 7–9 hours per week for 6+ months, remotely 💻

If you want to make change happen, apply to volunteer with Intersektion now!