Data Leader
Discover Dollar
Roles and Responsibilities
- Team Development and Leadership: Build and mentor high-performing data engineering, data science, and data analysis teams by promoting a collaborative, open, and growth-oriented culture.
- Strategic Planning: Define the strategy and roadmap for the Discover Dollar data platform, ensuring alignment with business and technological objectives.
- System Architecture: Design and implement scalable, fault-tolerant distributed systems for data ingestion, processing, and analytics.
- Cross-Functional Collaboration: Partner with product, business, and data science teams to gather requirements, prioritize tasks, and develop impactful solutions.
- Hands-On Problem Solving: Contribute code when necessary to resolve complex challenges or provide architectural insights.
- Operational Excellence: Advocate for and implement best practices in software development, deployment, monitoring, and incident management.
- Innovation and Growth: Drive initiatives to adopt emerging technologies and enhance the capabilities of the Discover Dollar data platform.
- Project Ownership: Take full ownership of projects, ensuring high quality, scalability, and timely delivery.
Skills and Expertise
- Data Engineering and Architecture: Proficient in ETL processes, analytical platforms, and event-driven architectures.
- Programming Proficiency: Advanced knowledge of Python, PySpark, and SQL, with a strong understanding of distributed data systems.
- Pipeline Design: Experienced in building real-time and batch data pipelines with tools such as Kafka, Spark, Azure Data Factory, Databricks, and Azure Synapse Analytics.
- Cloud and DevOps Tools: Hands-on experience with cloud platforms such as Azure, along with tools like Docker.
- DataOps and Data Governance: Well-versed in DataOps practices, version control, and data governance platforms such as Unity Catalog.
Qualifications
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
- 6+ years of experience in data engineering, analytics, or related fields.
- Strong expertise in ETL processes, SQL, Python, Spark, and cloud-based data solutions.
- Proficiency in data warehousing solutions, including Azure Synapse, Redshift, Snowflake, or similar.
- Solid understanding of API development and integration.
- Strong problem-solving and communication skills, with the ability to collaborate effectively with cross-functional teams.
- Experience with advanced machine learning models and deployment strategies.