GradTracker Website Redesign
2021 · 4 months · Team-Based Academic Project
Redesigning a graduate school academic progress tracking system through iterative prototyping and usability testing.
Overview
This project examined the usability of the University of Utah's GradTracker system—a tool graduate students use to track degree requirements, submit progress forms, and plan coursework. Working in a team of three, we conducted contextual inquiries and heuristic evaluations to identify usability failures, then designed and tested a redesigned interface through iterative prototyping. The project covered the full design lifecycle from problem discovery through validated design recommendations.
The Problem
GradTracker presented graduate students with a fragmented, opaque experience. Students struggled to understand what requirements they still needed to fulfill and by when, had difficulty navigating to the correct forms, and often didn't understand why certain information was being requested or who would read it. The system required users to cross-reference multiple sources—including external handbooks and degree audit tools—to accomplish basic planning and reporting tasks.
Our heuristic evaluation identified specific violations of established usability principles: poor visibility of system status, insufficient feedback, inconsistent navigation, and a lack of contextual help. These were not just aesthetic issues—they created real barriers to students reporting academic progress.
Execution & Iteration
We followed an iterative design process grounded in user research:
- Contextual Inquiry: Interviewed graduate students and advisors to understand their workflows, goals, and frustrations with the current system. We found that advisors were generally satisfied with GradTracker while students bore the burden of its usability failures.
- Heuristic Evaluation: Systematically evaluated the existing interface against Nielsen's usability heuristics, surfacing violations across visibility, feedback, navigation consistency, and error prevention.
- Storyboarding: Developed storyboards to map two primary user tasks—checking graduation requirements and submitting a progress form—grounding design decisions in concrete user scenarios.
- Paper Prototyping: Created low-fidelity paper prototypes and conducted usability tests with think-aloud protocols. Iteratively refined the design based on observed friction points.
- Digital Mockups: Translated refined prototypes into mid-to-high fidelity digital mockups, incorporating a redesigned navigation system, contextual tooltips, inline status indicators, and a consolidated progress view.
- Usability Testing: Conducted task-based usability tests on the digital prototype, analyzing think-aloud feedback to validate design decisions and identify remaining gaps.
Findings & Synthesis
Across both rounds of testing, two themes emerged consistently: users needed contextual help embedded at the point of action (not in external documentation), and the information architecture needed to reflect actual student mental models rather than administrative categories.
Key design decisions that addressed these findings:
- A consolidated Progress Report landing page surfacing requirement status, suggested timelines, and course completion at a glance
- A tooltip system providing inline clarification for form fields, status labels, and progress expectations—reducing reliance on the handbook
- Consistent hamburger navigation available from every page, following established conventions (Nielsen's consistency and standards heuristic)
- Course hover previews surfacing catalog descriptions directly in context, eliminating the need to navigate externally
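To make the tooltip decision above concrete, here is a minimal sketch of how inline, point-of-action help could be wired up. The field names, help copy, and function names are hypothetical illustrations, not the team's actual design or implementation:

```typescript
// Hypothetical sketch: a central registry of inline help for GradTracker
// form fields, so explanations live at the point of action rather than in
// an external handbook. All field IDs and copy below are illustrative.

type TooltipEntry = {
  label: string;      // the field or status label the tooltip attaches to
  helpText: string;   // plain-language explanation shown inline
  audience?: string;  // who reads the submitted information, if known
};

const tooltips: Record<string, TooltipEntry> = {
  milestoneStatus: {
    label: "Milestone status",
    helpText:
      "Shows which degree milestones are complete, in progress, or overdue.",
  },
  progressNarrative: {
    label: "Progress narrative",
    helpText: "A short summary of your academic work this semester.",
    audience: "Read by your advisor and the graduate coordinator.",
  },
};

// Resolve the tooltip text for a field, falling back to a generic prompt
// so no field is ever left unexplained.
function tooltipFor(fieldId: string): string {
  const entry = tooltips[fieldId];
  if (!entry) {
    return "See your program handbook for details on this field.";
  }
  return entry.audience
    ? `${entry.helpText} ${entry.audience}`
    : entry.helpText;
}
```

A registry like this also answers the "who will read this?" question our research surfaced: the optional `audience` field makes the reader of each submission explicit in the help text itself.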
One honest limitation: we had insufficient access to advisor workflows, which required us to make assumptions about what information they needed from student submissions. This shaped some form design decisions in ways we couldn't fully validate.
Impact & Results
The redesign addressed the core usability failures identified in our initial evaluation. Usability test participants completed both target tasks—checking graduation requirements and submitting a progress form—with noticeably less confusion than participants reported with the original system. Test participants specifically cited the inline tooltip system as reducing their need to consult external documentation.
Key Takeaways
This project was my first experience with the full UX design lifecycle—from discovery through validated prototypes—and it shaped how I think about the design process. A few things I took away:
- Iterative testing reveals things that no amount of expert review can predict. The tooltip system emerged directly from observing a user go silent while trying to understand a form field.
- Think-aloud protocols require careful facilitation. Knowing when to let silence breathe and when to prompt is a real skill—one we reflected on explicitly in our usability testing debrief.
- Research constraints are worth naming honestly. We made assumptions about advisor needs that we couldn't fully validate, and acknowledging that in our recommendations made them more credible, not less.