
Data Analyst Intern

Discover Dollar

Data Science
Mangaluru, Karnataka, India
Posted on Feb 2, 2026
About the Role
We are seeking a motivated and detail-oriented Data Analyst Intern who is eager to learn and contribute to real-world data projects. As an intern, you will support the data team in transforming raw data into meaningful insights, building foundational data pipelines, and delivering accurate reports that enable data-driven decision-making across the organization.
Key Responsibilities
  • Assist in building and maintaining data pipelines using Python, Pandas, and Polars, under guidance from senior team members.
  • Support data cleaning, transformation, and validation, ensuring high data quality and consistency.
  • Help develop and maintain data models used for reporting and analytics.
  • Work with analysts and business stakeholders to understand data requirements and translate them into datasets or reports.
  • Contribute to automated reporting and dashboard workflows, where applicable.
  • Perform exploratory data analysis to identify trends, anomalies, and actionable insights.
  • Clearly document data logic, assumptions, and processes for reusability and transparency.
  • Take ownership of assigned tasks, proactively flag issues, and suggest improvements rather than waiting for instructions.
  • Demonstrate a problem-solver’s mindset: dig deeper into data issues and follow through until resolution.
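The cleaning and validation work described above might look something like this small sketch (the column names, quality rules, and sample values here are purely hypothetical, not part of the role description):

```python
import pandas as pd

# Hypothetical raw order data with common quality issues:
# duplicate rows, inconsistent casing, and missing values.
raw = pd.DataFrame({
    "order_id": [101, 101, 102, 103],
    "region": ["south", "south", "NORTH", None],
    "amount": [250.0, 250.0, None, 400.0],
})

clean = (
    raw.drop_duplicates(subset="order_id")  # keep the first row per order
       .assign(
           region=lambda d: d["region"].str.title().fillna("Unknown"),
           amount=lambda d: d["amount"].fillna(0.0),
       )
)

# Simple validation check: every order_id should now be unique.
assert clean["order_id"].is_unique
print(clean)
```

Documenting each rule (why duplicates are dropped, what "Unknown" means) alongside code like this is exactly the kind of transparency the role calls for.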
Must-Have Skills
  • Good knowledge of Python.
  • Working knowledge of SQL (joins, aggregations, CTEs, window functions).
  • Familiarity with Pandas for data manipulation.
  • Strong analytical and problem-solving skills.
  • A go-getter attitude with the ability to take initiative and deliver outcomes, not just complete tasks.
  • Willingness to learn new tools and technologies quickly.
  • Ability to work independently while taking feedback constructively.
  • Good communication skills and strong attention to detail.
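As a rough illustration of the SQL level expected (CTEs and window functions), here is a self-contained sketch using Python's built-in sqlite3 module; the table and values are invented for the example:

```python
import sqlite3

# Hypothetical sales table, used only to demonstrate the SQL concepts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 100), ('North', 300), ('South', 200), ('South', 50);
""")

# A CTE plus a window function: rank each sale within its region,
# then keep only the top sale per region.
query = """
WITH ranked AS (
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
)
SELECT region, amount FROM ranked WHERE rnk = 1 ORDER BY region;
"""
top_sales = conn.execute(query).fetchall()
print(top_sales)  # -> [('North', 300.0), ('South', 200.0)]
```

The same pattern (partition, order, rank, filter) covers many common reporting questions, such as "latest record per customer" or "top N per category".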
Nice to Have
  • Exposure to PySpark or Polars.
  • Basic understanding of ETL/ELT concepts and data workflows.
  • Familiarity with Git or version control systems.
  • Awareness of cloud platforms such as Azure or tools like Databricks.
  • Fundamental understanding of data structures and algorithms.