Lead Software Engineer - Python, GenAI, AWS
Hackajob
Job Description
Job Overview
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorganChase within Consumer and Community Banking, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job Responsibilities
- Collaborate with cross-functional teams to identify business requirements and develop data-driven solutions using Agentic/GenAI frameworks in a fast-paced environment.
- Conduct research on prompt and context engineering techniques to enhance the performance of LLM-based solutions.
- Design and implement scalable and reliable data processing pipelines, performing analysis and deriving insights to optimize business outcomes.
- Build and maintain Data Lakes and data processing workflows using Databricks to support machine learning operations.
- Communicate technical concepts and results effectively to both technical and non-technical stakeholders.
- Utilize AWS services including S3, Lambda, Redshift, Athena, Step Functions, MSK, EKS, and Data Lake architectures.
- Collaborate with data scientists, engineers, and business stakeholders to deliver high-quality data solutions.
- Act as a self-starter, independently driving assignments to completion and solving problems without escalation.
Required Qualifications, Capabilities, and Skills
- Advanced degree in Computer Science, Data Science, Mathematics, or related field.
- 5+ years of applied experience in data science and machine learning.
- Strong Python skills with PySpark, Spark SQL, and DataFrames for large-scale data processing.
- Proficient with GenAI models (e.g., OpenAI), including retrieval-augmented generation (RAG) and fine-tuning where appropriate.
- Experience with LLM orchestration and agentic AI libraries, including hands-on experience building AI agents, agentic frameworks, and MCP servers.
- Databricks expertise building and managing data lakes and end-to-end data processing workflows.
- Strong communication, mentoring, and troubleshooting skills; able to rapidly move proofs of concept into production and improve productivity using tools such as Copilot.
- Proficiency with a broad range of other AWS services; AWS certification preferred.
- Experience integrating AI/ML models into data pipelines is a plus.
- Experience with version control (Git) and CI/CD pipelines.