Data Engineer

Samaritan

Software Engineering, Data Science
Seattle, WA, USA
Posted on Mar 1, 2025
| about |

Samaritan is a platform built to help people experiencing homelessness gain the social and financial support needed to find a home. If using technology to solve real problems for real people is your passion and you're skilled in data engineering, you're in the right place.

Health and human services providers offer Samaritan Memberships as a resource to their patients/clients experiencing homelessness. New Members set goals with their case managers, and can both receive rewards for taking positive steps forward on their care plans (such as meeting a housing specialist, applying for work, or accessing medical care) and receive direct support from the community to overcome blockers to their progress (like phone service, medications, or proper clothing). Over time, more than 50% of Samaritan Members measurably improve their access to care, housing, employment, or progress on another goal.

We've practiced the model in Seattle, New York, Los Angeles, and a few other cities across the US, with new pilots starting in AZ and expansions throughout CA. For now, we have a mighty team of 15 (iOS, Android, Backend, plus non-technical), but have just closed a fundraising round to bring on more passionate talent. The 15-year vision (we're on year 5) is to have helped 100,000 adults and families towards housing (and to have the next 100,000 Members enrolled!). So we've got some work to do and would love to hear from you!

| opportunities |

  • support homeless Samaritan Members, volunteer samaritans, and frontline care partners by improving our PostgreSQL database, API, and Rails backend
  • build data visualizations and reports, both for internal tools that support the team and for live data views shared with partners
  • lead information and data security efforts to protect patient PHI/PII as Samaritan scales our healthcare contracts to serve thousands of new Members (see the sketch after this list)
  • generate insights and recommendations from our external data sets to help optimize Samaritan’s operations and maximize outcomes for our Members
  • lead Samaritan’s use of AI and ML to maximize outcomes for our Members
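
The PHI/PII work called out above is largely about keeping identifying fields out of partner-facing views and reports. Below is a minimal sketch, in Python, of one common pattern (hashing identifying columns before rows leave the reporting layer); the field names and the hashing approach are illustrative assumptions, not Samaritan's actual implementation.

```python
# Minimal sketch: redact PII columns before a record reaches a
# partner-facing view. Field names below are hypothetical.
import hashlib

PII_FIELDS = {"member_name", "phone", "email", "date_of_birth"}

def redact_record(record: dict) -> dict:
    """Replace PII values with a short one-way hash so rows can still be
    joined and counted without exposing the underlying values."""
    redacted = {}
    for key, value in record.items():
        if key in PII_FIELDS and value is not None:
            digest = hashlib.sha256(str(value).encode("utf-8")).hexdigest()
            redacted[key] = digest[:12]  # short stable token for display
        else:
            redacted[key] = value
    return redacted

if __name__ == "__main__":
    row = {"member_id": 42, "member_name": "Jane Doe", "goal": "housing intake"}
    print(redact_record(row))
```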

| responsibilities |

  • Own building scalable analytical solutions that provide us with actionable insights to make data-driven decisions
  • Drive efficiency in data-handling processes, set up product analytics, perform advanced analysis, and build new metrics that are key inputs to improving Member outcomes, scale, and growth
  • Work collaboratively with both the operations and engineering teams on non-standard and unique business problems, and support business initiatives by collecting the required data from external and internal sources
  • Build and proactively improve the consistency and integration between Samaritan’s BI solutions and related systems
  • Own the design, development, and maintenance of scalable solutions for ongoing metrics, reports, analyses, dashboards, etc. to support analytical and business needs
  • Interface with other teams to extract, transform, and load data from a wide variety of data sources using AWS services and internal tools
  • Build and deliver high-quality data sets to support data science research needs as well as internal and external customer reporting needs
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service reporting for customers
  • Translate basic business problem statements into analysis requirements
  • Use analytical and statistical rigor to answer business questions and drive business decisions
  • Find and create ways to measure the customer experience to drive business outcomes
  • Develop queries and visualizations for ad-hoc requests and projects, as well as ongoing reporting
  • Write queries and output efficiently, and maintain in-depth knowledge of the data available in your area of expertise
  • Pull the data needed with standard query syntax; periodically identify more advanced methods of query optimization (a sketch of this kind of query-and-reporting workflow follows this list)
  • Convert raw data to make it analysis-ready
  • Recognize and adopt best practices in reporting and analysis: integrity, design, analysis, validation, and documentation
  • Troubleshoot operational quality issues
  • Review and audit existing jobs and queries
  • Recommend improvements to back-end sources for increased accuracy and simplicity
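
To make the query-and-reporting bullets above concrete, here is a minimal extract-transform-load sketch, assuming PostgreSQL reached through the psycopg2 driver and hypothetical table and column names (milestones, milestone_rollup); the real pipelines would rely on whatever AWS services and internal tools the team already uses.

```python
# Minimal ETL sketch: pull recent milestone rows from PostgreSQL,
# aggregate them, and load the result into a reporting table.
# Table and column names are hypothetical.
import os

import psycopg2

EXTRACT_SQL = """
    SELECT member_id, milestone_type, completed_at::date AS completed_on
    FROM milestones
    WHERE completed_at >= %s
"""

LOAD_SQL = """
    INSERT INTO milestone_rollup (member_id, milestone_type, completed_on, n)
    VALUES (%s, %s, %s, %s)
"""

def run_rollup(since: str) -> None:
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn.cursor() as cur:
            # Extract: a parameterized query pulls only the rows needed.
            cur.execute(EXTRACT_SQL, (since,))
            rows = cur.fetchall()

            # Transform: count milestones per member, type, and day.
            counts = {}
            for member_id, milestone_type, completed_on in rows:
                key = (member_id, milestone_type, completed_on)
                counts[key] = counts.get(key, 0) + 1

            # Load: write the aggregated metric to the reporting table.
            cur.executemany(
                LOAD_SQL,
                [(m, t, d, n) for (m, t, d), n in counts.items()],
            )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    run_rollup("2025-01-01")
```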

| experience & qualifications |

  • The ideal candidate has strong communication skills and the ability to prioritize effectively to ensure timelines are met
  • You should be a self-starter, comfortable with ambiguity, able to think big and be creative (while still paying careful attention to detail)
  • You think in terms of architecture, not just code
  • 3+ years of data engineering experience
  • Experience with data modeling, warehousing and building ETL pipelines
  • Experience with SQL (we use PostgreSQL)
  • Experience in at least one modern scripting or programming language, such as Python, Ruby, or NodeJS (we use Ruby on Rails for our backend API)
  • The role's requirements include the duties and responsibilities listed above, as well as the ability to adhere to company policies, exercise sound judgment, effectively manage stress, work safely and respectfully with others, exhibit trustworthiness and professionalism, and safeguard business operations