ACES Program Explorer

2022-23

aces.illinois.edu/academics/program-explorer

close up of cards in the program explorer

The Problem

Rapidly increasing number of majors, degrees, and certificates

A basic list worked well for a while, but as more programs were added, the list grew too long.

User issues:

  • decision paralysis – too many options to choose from
  • no filtering or sorting options
  • users have to expand each item just to read a short description
  • not enough information to compare programs
  • once expanded, the “Learn more” buttons take users off the main website, creating confusion
Screengrab of the previous layout and design for the ACES majors section, with a long list

The previous solution used sections of accordions grouped by degree level, then certificates.

The Solution

Create a new Drupal content type that lets users filter, sort, and explore all of the options ACES has to offer (a rough filtering sketch follows the image below).

  • Allow users to explore their opportunities by filtering to see a smaller set of options.
  • Mobile-first design, then allow for expansion on larger screens
  • Keep it concise for assistive technology to minimize the number of options to tab or read through.
ACES Program Explorer shown on four devices of various screen sizes
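To give a sense of the filtering behavior described above, here is a minimal TypeScript sketch. The Program shape, field names, and filter logic are illustrative assumptions; the production site implements this through Drupal fields and views rather than hand-written code.

```typescript
// Illustrative sketch only: field names and values are assumptions,
// not the actual Drupal content type behind the Program Explorer.
interface Program {
  title: string;
  type: "undergraduate" | "graduate" | "certificate";
  format: "online" | "hybrid" | "in-person";
  topics: string[]; // e.g. ["Animals", "Food Systems"]
  department: string;
}

interface Filters {
  type?: Program["type"];
  format?: Program["format"];
  topic?: string;
  search?: string;
}

// Return only the cards that match every active filter.
function filterPrograms(programs: Program[], f: Filters): Program[] {
  return programs.filter(
    (p) =>
      (!f.type || p.type === f.type) &&
      (!f.format || p.format === f.format) &&
      (!f.topic || p.topics.includes(f.topic)) &&
      (!f.search || p.title.toLowerCase().includes(f.search.toLowerCase()))
  );
}

// Example: narrow a long list down to online undergraduate programs about animals.
// filterPrograms(allPrograms, { type: "undergraduate", format: "online", topic: "Animals" });
```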

My Roles

  • Project Management
  • UI Design
  • WCAG Compliance
  • User Experience Design and Research
  • Testing

Tools

  • Adobe XD
  • Slack
  • Drupal
  • Optimal Workshop testing

Teammates

  • Drupal developer
  • Content strategist
  • UX Lunch Club members
Multiple screens on phones showing parts and pieces of the program explorer

Challenges

Accessibility

We approached this project with an accessibility-first and accessibility-always mindset.

Solution: Use multiple lenses to capture design and development needs throughout the process.

  • Animation and Effects – there is a lot of information and possible interaction for users sorting and filtering, so we decided that little to no animation is desired.
  • Audio and Video – the program detail pages include a video, but autoplay is intentionally disabled, and subtitles are provided for each video.
  • Color – all color contrast is checked against WCAG 2.1 AA standards, including hover and focus states.
  • Controls – the checkboxes in the filtering options meet WCAG 2.1 standards for spacing around each form element, and the layout changes on smaller devices.
  • Font – style is proportionally sized, easy to read, and scales responsively.
  • Images and Icons – images in cards are used as informational content with correct alt text, and icons are intentionally avoided due to the amount of content.
  • Inclusivity – the wording on cards was reworked to follow plain-writing principles and use inclusive language.
  • Keyboard – all elements follow a logical reading order, skip-to functionality is available on the page, and focus states are applied.
  • Layout – content logically flows and resizes on all devices and screen sizes.
  • Material honesty – elements look and act as expected: the cards use links, and a button is used only for the filter-clearing action.

We Learned…

While testing for keyboard usability, we found that we had originally programmed both the photo and the heading of each card to be links. This doubles the work for a screen reader, since each link is read separately. Instead, we treat the entire card as a single link, so it is read only once. This cuts reading time in half and lessens the cognitive load on the user.
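As a rough illustration of the pattern we landed on (the markup and class names here are assumptions, not the production Drupal template), the whole card becomes one link with a single accessible name:

```typescript
// Sketch of the "whole card is one link" pattern; class names and markup are illustrative.
function buildProgramCard(title: string, imageUrl: string, href: string): HTMLAnchorElement {
  const card = document.createElement("a");
  card.href = href;
  card.className = "program-card";

  const img = document.createElement("img");
  img.src = imageUrl;
  img.alt = ""; // treated as decorative: the card's text already names the program
  card.appendChild(img);

  const heading = document.createElement("h3");
  heading.textContent = title; // the one piece of link text a screen reader announces
  card.appendChild(heading);

  return card;
}
```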

Screen reader and keyboard users alike value having the number of results listed on the page and announced for text-to-speech users. It is very helpful for any user to know how many results they will be looking at. This can help them decide whether to narrow the results further, or simply know where they are in the list of options, such as result 12 of 17.
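A hedged sketch of how that count might be exposed to assistive technology (the element ID and wording are assumptions, not the production code):

```typescript
// Announce the number of matching programs after each filter change.
// The #results-count element and the message wording are illustrative assumptions.
function announceResultCount(count: number): void {
  let region = document.getElementById("results-count");
  if (!region) {
    region = document.createElement("p");
    region.id = "results-count";
    region.setAttribute("aria-live", "polite"); // read aloud without stealing focus
    document.querySelector("main")?.prepend(region);
  }
  region.textContent = `${count} result${count === 1 ? "" : "s"} found`;
}
```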

Topic Areas

Narrowing and finding the right topics to use for our vast array of majors and certificates.

Solution: Treejack user testing in Optimal Workshop

User testing in Optimal Workshop also gave us insights into the verbiage used for program names, academic departments, subject matter categories, search terms, and search parameters.

Screenshot of workshop tasks
LinkedIn post inviting parents and students to take our survey
Screenshot of the Optimal Workshop popular placements matrix

Initial List of Topical Categories

  • Animals
  • Plants & Crops
  • Food Systems & Nutrition
  • Health & Nutrition
  • Natural Resources & Conservation
  • People, Communities, & Families
  • Policy, Law, Economics
  • Sustainability & Environmental Stewardship
  • Technology

Final List of Topical Categories

  • Agriculture
  • *Animals
  • Business, Economics, Policy and Law
  • Data and Technology
  • Food Systems
  • *Health and Nutrition
  • Leadership, Communication and Education
  • *People, Communities, and Families
  • *Plants and Crops
  • Sustainability and Environmental Sciences

*Topics stayed the same throughout testing.

Editing and Sorting Cards

Input, approval, and feedback on similar majors wanted from staff

Solution: Invite staff and department heads to give feedback and be part of the process.

Card sorting activities were conducted with all of our audiences. We gained additional insights into our information architecture, categories, search terms, and search parameters. This was a great way to involve parties who don’t usually jump in, and it created a real buzz around our product.

Printed single agronomy card and handwritten notes on a desk
Printed single agronomy card with handwritten notes and a Post-it note on a desk
Printed version of major cards laid out on conference table ready to be edited

Extend the content type to multiple views

No need to duplicate content or editing time.

Solution: Create multiple views using the same data.

We know that one size doesn’t fit all. Depending on which page users start their journey from, whether through organic search or a marketing funnel, the information displayed on that page may need to be different.

Having multiple views of the same content, rather than recreating it in many places, reduces redundancy, improves consistency, and allows site editors to update content in one place instead of managing it in many.
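Conceptually, the Drupal views behave like the sketch below: one canonical set of program records, with several presentation functions that select and shape the same data differently per page. The names and fields are illustrative assumptions, not the actual view configuration.

```typescript
// One canonical data source, multiple presentations of it.
// Field names and view shapes are illustrative assumptions.
interface ProgramNode {
  title: string;
  department: string;
  type: string;
  url: string;
}

// A compact bulleted-list view, e.g. for a department landing page.
function asBulletedList(programs: ProgramNode[], department: string): string[] {
  return programs
    .filter((p) => p.department === department)
    .map((p) => `${p.title} (${p.type})`);
}

// A table-style view that exposes more fields for comparison.
function asComparisonRows(programs: ProgramNode[]): Array<[string, string, string]> {
  return programs.map((p) => [p.title, p.department, p.type]);
}
```

Either presentation can change independently without touching the underlying content.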

Next steps – user testing to find out which views are the most effective, and what else we can do to improve the experience.

Bulleted list mockup
Table design mockup
Design mockup with tags for departments and icons for format

Design Process

Start with 5 questions

Who? What? Why? When? Where?

When starting any project, it’s important to ask ourselves the following questions:

  • Why are we doing this?
  • What are we trying to accomplish?
  • Who is this product for?

For the ACES Program Explorer project, our team began by defining our goals and understanding our target audience.

  • We knew we needed, and assumed students would want, a tool that allowed them to sort and narrow their options as they explored available majors.
  • We also knew that the list of programs is growing at a very fast rate, making it increasingly difficult for students to navigate their options.
  • As a general guideline, we limit the amount of duplicate content on our websites, which improves the user experience by reducing clutter and confusion.

We started with a clear understanding of our goals and target audience.

Research

We interviewed our college recruiters and advisors in lunch-and-learn sessions, since they are our boots on the ground, talking with prospective students and parents daily. We listened to the pain points they encounter and the feedback they receive during campus visits and recruitment calls.

Additional research needed:

  • What are competitive institutions doing?
  • What have other departments at UIUC produced, and have they done any research?

Map the user journey

In Figma, I started mapping out the journey that users might take.

  • If students start with a massive list of all our programs, they can select a filter and see a narrower set of options.
  • Based on those results, they can refine further if needed.
  • Once they click a card to investigate further, they get more details.
    • Must include some of our conversion actions – apply now, contact an advisor.
  • Define where they will go from there. Give them options for the different stages of their journey.
    • More details
    • Contact info for questions
    • Suggestions for other similar programs
    • Return and keep exploring
Program Explorer User Journey

Wireframe the basics

Create low-fidelity mockups

I created mockups and tested the concepts with staff and internal student ambassadors.

The team started defining the search parameters during this stage (a sketch of deriving these filter options follows the wireframe image below).

  • Type – undergraduate, graduate, certificates, etc.
  • Format – online, hybrid, in-person
  • Topics – how few do we need to cover everything ACES offers?
  • Department – list
  • Additional items for certificates only?
  • Search
Layout of wireframes in Adobe XD design program
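Those search parameters can also drive the filter UI itself. As a hedged sketch (assumed field names; the real options live in Drupal field and view configuration), the facet options can be derived from the program data so they stay current as programs are added:

```typescript
// Derive the available filter options from the content itself,
// so facet lists stay in sync as new programs are added.
// Field names are illustrative assumptions.
interface ProgramRecord {
  type: string; // undergraduate, graduate, certificate, ...
  format: string; // online, hybrid, in-person
  topics: string[];
  department: string;
}

function facetOptions(programs: ProgramRecord[]) {
  const unique = (values: string[]) => [...new Set(values)].sort();
  return {
    types: unique(programs.map((p) => p.type)),
    formats: unique(programs.map((p) => p.format)),
    topics: unique(programs.flatMap((p) => p.topics)),
    departments: unique(programs.map((p) => p.department)),
  };
}
```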

Refine based on feedback and create a high-fidelity mockup

Finalize the design and journey based on feedback

Define the analytics goals and the development needs

Here’s a screenshot of sample analytics, taken after the product was built, showing whether users are going back to the explorer or clicking the similar-major cards.
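As an illustration of the kind of instrumentation those goals imply (the selectors and the trackEvent helper below are hypothetical, not the analytics setup actually used):

```typescript
// Hypothetical instrumentation sketch: the selectors and trackEvent helper are assumptions.
declare function trackEvent(name: string, data: Record<string, string>): void;

document.addEventListener("click", (event) => {
  const el = event.target instanceof HTMLElement ? event.target.closest("a") : null;
  if (!el) return;

  if (el.matches(".similar-major-card a")) {
    // Did the user move sideways to a related program?
    trackEvent("similar_major_click", { destination: el.href });
  } else if (el.matches("a.back-to-explorer")) {
    // Or did they return to the explorer to keep filtering?
    trackEvent("return_to_explorer", { from: location.pathname });
  }
});
```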

Build a first product

Finalize content for all majors, degrees, and certificates

The final design of the ACES Program Explorer features a clean and modern interface that prioritizes accessibility and ease of use. The landing page provides a welcoming introduction to the tool and invites students to explore available majors. The program options are presented in a filterable grid that allows users to compare and select the program that best fits their needs. Each major has a detailed description that includes information on course requirements, potential career paths, and advisor contact information.

Test. Refine. Rinse and repeat.