
Data Engineer

  • Full-time, Mid Level
  • Data
  • Ann Arbor

AI generated summary

  • You need a Bachelor’s degree, 3+ years in data engineering, knowledge of dimensional modeling, experience with ELT tools and cloud platforms, strong problem-solving skills, and a proactive mindset.
  • You will build and optimize data pipelines, support ELT processes, manage data models, collaborate with teams, and enhance data quality and engineering practices in a cloud environment.

Requirements

  • Bachelor's degree required. Degree in Computer Science, Data Engineering, Information Systems, or a related field preferred.
  • 3+ years of experience in data engineering, data warehousing, ELT development, or related technical roles.
  • Working knowledge of dimensional modeling concepts (Star and Snowflake schemas).
  • Experience building and maintaining data pipelines across multiple source and target systems.
  • Experience with ELT tools and cloud data platforms (e.g., Snowflake, Matillion).
  • Strong problem-solving skills and ability to work collaboratively in cross-functional environments.
  • Proactive, results-oriented mindset and eagerness to learn new data technologies.

Responsibilities

  • Build, test, and maintain reliable and scalable data pipelines that integrate data from diverse source systems into centralized data platforms.
  • Implement and optimize ELT processes to deliver high-quality, analytics-ready data.
  • Monitor, troubleshoot, and optimize pipeline performance, reliability, and cost efficiency.
  • Support the development of logical and physical data models using dimensional modeling techniques (e.g., Star and Snowflake schemas).
  • Contribute to curated data layers that enable reporting, dashboards, and AI/ML applications.
  • Work within cloud-based data platforms like Snowflake to support performance tuning, cost management, and operational stability.
  • Assist with managing structured and unstructured datasets across multiple systems and environments.
  • Partner with cross-functional stakeholders (Engineering, Online Learning, XR, and others) to understand data requirements and translate them into technical solutions.
  • Identify opportunities to improve data quality, reliability, documentation, and engineering processes.
  • Participate in team knowledge sharing and contribute to community initiatives such as Data Hours or the BI+AI Community of Practice.

FAQs

What is the work schedule for this position?

The position is hybrid, requiring a minimum of 4 days in the office per week (Monday through Thursday), with the option for remote work on Fridays.

How long is the term for this position?

This is a 5-year term limited position with the possibility of renewal depending on funding.

What are the required qualifications for this role?

The required qualifications include a Bachelor's degree (preferably in Computer Science, Data Engineering, Information Systems, or a related field), 3+ years of experience in data engineering or related technical roles, and working knowledge of dimensional modeling concepts.

What is the salary range for this position?

The general salary range for this position is $96,000 - $105,000, with compensation based on education level, experience, knowledge, and skills.

Is there a specific application process for candidates?

Yes, candidates must submit a cover letter and resume as one file, explaining how the role aligns with their career aspirations and skill set.

Does the position require collaboration with other teams?

Yes, the role involves collaboration with cross-functional stakeholders including Engineering, Online Learning, and more to understand data requirements and translate them into technical solutions.

Are there opportunities for professional growth in this role?

Yes, the role emphasizes continuous learning and offers opportunities to grow technical leadership skills over time.

What is the focus of the Data and Business Intelligence Team?

The focus is on driving CAI's data strategy and ensuring responsible, ethical, and secure use of data while leveraging it for analytics, insights, and decision-making.

What technologies should candidates be familiar with for this position?

Candidates should have experience with ELT tools and cloud data platforms, such as Snowflake and Matillion, and knowledge of building and maintaining data pipelines.

What resources are available regarding benefits?

For details on benefits, candidates can visit http://benefits.umich.edu/.

Will there be background checks conducted for this position?

Yes, the University of Michigan conducts background checks on all job candidates upon acceptance of a contingent offer.

A world-renowned public institution, fostering excellence for all. #LeadersAndBest

Industry: Education
Employees: 10,001+
Founded: 1817

Mission & Purpose

The mission of the University of Michigan is to serve the people of Michigan and the world through preeminence in creating, communicating, preserving and applying knowledge, art, and academic values, and in developing leaders and citizens who will challenge the present and enrich the future.