Data Developer
HeronAI
What You’ll Do
- Design, implement, and maintain scalable ETL pipelines for data ingestion and transformation from QuickBooks, Excel, and other financial platforms.
- Collaborate with engineering teams to ensure seamless data flow between backend APIs, databases, and visualization tools.
- Optimize the performance of databases (e.g., Postgres, DynamoDB), ensuring efficient handling of large-scale, complex datasets.
- Develop and maintain data models that align with analytics and reporting needs.
- Work closely with the DevOps team to ensure data pipelines are secure, reliable, and cost-efficient.
- Conduct data quality checks and implement automated validation processes to ensure accuracy and consistency.
- Troubleshoot and resolve data-related issues, ensuring minimal downtime and disruption.
- Contribute to the company’s SOC 2 compliance efforts, ensuring data security and privacy protocols are followed.
What Makes This Role Exciting
- As one of our first hires, you’ll have the freedom to define how we build solutions, choosing the methodologies and tools that best meet our customers’ needs.
- Work on a high-growth platform backed by MIT, Harvard, and Techstars, with innovation recognized by Forbes.
- Join us at an exciting time: we’ve grown our waitlist from 250 to 1,700 users in 10 months and recently secured $1.5M in seed funding.
- Close critical gaps in the data analytics market and automate workflows for industries facing major workforce transformations.
Who You Are
- A professional with 5+ years of experience in data engineering, ETL development, or a related field.
- Proficient in Python or another programming language for data processing.
- Skilled in designing, optimizing, and managing databases like Postgres or DynamoDB.
- Experienced in building scalable ETL pipelines for data ingestion, transformation, and storage.
- Knowledgeable about cloud platforms (AWS preferred) and tools like Lambda, S3, and Redshift.
- Comfortable with data modeling and understanding the needs of analytics platforms.
- Passionate about ensuring data accuracy, security, and reliability in fast-paced environments.
- Familiar with SOC 2 compliance or other data security frameworks.
Exceptional Candidates Will Bring
- Experience integrating data from financial systems (e.g., QuickBooks, Excel).
- A strong understanding of data visualization requirements, including performance optimization for dashboards.
- Familiarity with real-time or near-real-time data processing pipelines.
- A history of contributing to high-growth startups or scaling data-intensive platforms.
What Good Looks Like
Q1 Targets:
- Build ETL pipelines for QuickBooks and Google Sheets with robust data validation.
- Deliver backend support for 5 pre-designed dashboard templates.
- Ensure dashboards generate actionable insights with no data errors.
Q2 Targets:
- Support incremental API integrations for Xero and HubSpot.
- Optimize ETL performance to reduce load times by 30%.
- Develop automated data cleaning and deduplication tools to reduce user intervention.
Q3 Targets:
- Scale ETL pipelines to support 3,000 concurrent users.
- Add support for multi-dataset analytics with "Ask Jules".
- Deliver predictive analytics capabilities for proprietary metrics like Variance Analysis.
Q4 Targets:
- Support enterprise-level integrations (e.g., SAP and Salesforce) with scalable ETL processes.
- Build new ETL templates for Predictive Churn and Competitor Comparison metrics.
- Ensure data pipelines can handle unstructured data sources with minimal user effort.
Why You’ll Love Working With Us
- Be part of a company that’s saving businesses hours every week and driving smarter growth.
- Work with a team of passionate innovators who see this as a once-in-a-lifetime opportunity to change the game.
- Develop alongside some of the best minds in tech, with connections to MIT, Harvard, and Techstars.
- We believe in working hard, celebrating wins, enjoying plenty of ‘Fika’, and building a strong team dynamic that fosters creativity and collaboration.