EAS 1220: Earthquake!

ASTRO 1101: From New Worlds to Black Holes

EAS 1700: Evolution of Life and the Earth

BIOEE 1540: Introductory Oceanography

Reviews

CU Reviews @ Cornell University

ROLE

Lead Product Designer

TEAM

1 PM

3 Designers

5 Engineers

1 Product Marketer

TIMELINE

Feb 2024 - Present

SKILLS

Interaction Design

Visual Design

Prototyping

Design Systems

UX Research

Cross-Functional Work

TOOLS

Figma

FigJam

Adobe Creative Suite

CONTEXT

What is CU Reviews @ Cornell?

CU Reviews is a course review web application created by Cornell Digital Technology & Innovation (DTI), a student-run, open-source software design and development team. The platform strives to simplify the course enrollment process and help Cornell students make well-informed enrollment decisions. Since its launch in Fall 2019, CU Reviews has garnered over 17,000 users and currently houses over 3,000 active users.

THE STORY OF CU REVIEWS

Every semester, students face the daunting task of choosing from thousands of classes and professors to fulfill the academic requirements they need to graduate.

Cornell’s motto, “any person, any study,” emphasizes providing a vast array of course opportunities. While such a wide selection comes with incredible career and academic growth opportunities, this process can also be extremely stressful for students hoping to make the most of their Cornell education. CU Reviews aims to ease this process by providing a platform where students can browse courses, read synthesized reviews, and evaluate key metrics to make smarter, more informed decisions.

PRODUCT PROBLEM

CU Reviews currently houses 4,717 informative, yet lengthy and subjective course reviews.

While the reviews on CU Reviews provide valuable insights, they tend to be lengthy, subjective, and inconsistent in structure. This creates several key challenges for users, as the course selection process becomes:

Time-Consuming

Too Subjective

Overwhelming

USER RESEARCH

How might we help users quickly extract key insights and understand overall sentiments from course reviews to make informed decisions?

CU Reviews’ large database of reviews lacked a simple way for students to quickly grasp key themes and sentiments. This created an opportunity for me to explore a new feature that could give students a more informative, efficient course selection experience. To approach this design process, I first conducted user interviews to understand how Cornellians choose their courses and what factors matter most to them during selection. Here’s what some students shared:

“I think the professor who is teaching a course can make or break your course experience. Even then, I can’t always find their information on Rate My Professor....”

“When reading reviews, I find descriptions of the professor, how hard the coursework is, and the workload to be the most helpful”

“I like to see skills and topics that can be learned from the class, along with accessibility of resources such as TA office hours”

Upon compiling and analyzing user research insights, we decided to focus on 5 key factors.

After conducting user interviews and compiling the insights into an affinity map, we gained a better understanding of how students navigate the course selection process.

These insights led us to extract 5 key factors that are difficult to access information on, yet are most important to students when determining their course selections.

IDEATION

Time to brainstorm!

I led a brainstorming session with my team of 10 engineers, product managers, marketers, and designers to ideate possible features, making sure to reference our essential HMW question: How might we help users quickly extract key insights and understand overall sentiments from course reviews to make informed decisions? Below is an overview of our main brainstorming results.

We landed on building an AI-driven solution that could summarize student review insights into a concise and user-friendly format.

We named this summary-of-reviews solution the "Cornellians Say" feature, highlighting its focus on capturing and summarizing student perspectives on course selection. This feature aligned perfectly with our goals of boosting user retention and engagement, while addressing key user pain points, such as the time-consuming process of identifying the five most important factors for course selection.

Initial Explorations

With this direction focused on designing Cornellians Say, an AI-generated comprehensive summary of reviews, I explored the following low-fidelity sketches. During this process, I met with my team’s developers to ensure feasibility, discussing the use and constraints of OpenAI’s API so that we could create a cohesive vision for this feature on CU Reviews.
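To give a sense of the kind of feasibility discussion above, here is a minimal sketch of how review summarization through OpenAI’s API might be structured. The prompt wording, the factor list, the model name, and the function names are all illustrative assumptions, not the team’s actual implementation; the prompt-building step is kept separate from the API call so it can be reasoned about (and tested) without network access.

```python
# Sketch only: prompt structure, factor list, and model choice are assumptions.
KEY_FACTORS = ["professor", "difficulty", "workload", "skills learned", "resources"]

def build_summary_prompt(reviews: list[str], max_reviews: int = 50) -> str:
    """Assemble one prompt asking the model to summarize course reviews
    around the key factors students care about."""
    joined = "\n".join(f"- {r}" for r in reviews[:max_reviews])
    return (
        "Summarize the following course reviews into a short, neutral "
        f"overview covering these factors: {', '.join(KEY_FACTORS)}.\n\n"
        f"Reviews:\n{joined}"
    )

def summarize_reviews(reviews: list[str], client) -> str:
    # 'client' would be an openai.OpenAI() instance; passing it in keeps
    # the prompt-building logic testable without hitting the API.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # model choice is a placeholder assumption
        messages=[{"role": "user", "content": build_summary_prompt(reviews)}],
    )
    return response.choices[0].message.content
```

One practical constraint this surfaces is the token limit: with thousands of reviews per course, the `max_reviews` cap (or a batching/sampling strategy) is what keeps a single prompt within bounds.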

ITERATION & USER TESTING

First, a change in the main CU Reviews page layout to accommodate incoming features.

Because this feature would be integrated into the main course reviews page, I explored various entry points that would work around the page’s existing components. I ultimately decided to transition the page from a singular column to a two-column layout, with the left sidebar dedicated to the respective new features my co-designer and I would design.

Exploring summary formats to simplify and optimize the course selection process.

After solidifying an entry point, I moved on to mid-fidelity wireframes to better envision this product feature. During this process, I set up a critique session with my fellow designers on Cornell DTI and conducted user testing, narrowing in on a few top iterations that met both user needs and our team’s product goals.

V1. Dropdown w/ Top Tags

detailed, informative summary about each key tag for the course

visually dense w/ multiple paragraphs to read

users may feel overwhelmed when reading information about each tag

V2. Modal w/ Top Tags

allows for distinct separation between sidebar and tag details

requires more clicks, resulting in a higher interaction cost

not intuitive for users to click into a tag and expect a modal

V3. Filter By Top Tag

allows for users to quickly arrive at specific information about a course

no overwhelming tag details, leads to the actual reviews instead

not intuitive for users to click tags and be led to a filter option

Always designing with the user in mind, via usability tests.

After conducting three usability tests, I found that while the dropdown and modal versions provided detailed summaries, they overwhelmed users and were hard to navigate. In contrast, the "filter by tag" iteration offered a more interactive experience, allowing users to click a tag and view relevant reviews. Based on this feedback, I chose to move forward with the filter iteration.

Filtering by tag is less overwhelming than reading tag details! But something’s off...

After moving forward with the filtering concept, I realized there was still room for improvement. While clicking on top tags to help users find reviews related to key course decision factors was useful, the process of clicking a tag to access filtering options still felt unintuitive. To address this, I shifted my focus towards refining new filtering and sorting options to make the process more seamless and intuitive for users.

REVISIONS

Original Filtering Concept

Clicking Top Tags To Filter Options

All Filter Options Selected

Final Filtering Concept

Rather than relying on the less intuitive option of clicking top tags to filter, I chose to remove that feature. Instead, I allowed users to filter directly by the top five key factors most important to students when selecting courses, based on my earlier user research insights.

Removed Unintuitive “Click Tags to Filter”

Filter By Key Factors Important to Course Selection

Touch ups to create a more intuitive, interactive, and informative course selection experience.

To conclude the design of the Cornellians Say feature, I added sorting options to improve user interaction and make it easier to find relevant information. Based on user research, I incorporated three options: by professor, recency, and most helpful reviews. These were designed to address varying user needs, as some prioritize recent feedback, others focus on professors, and many look for the most helpful reviews.
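The three sorting options described above can be sketched in a few lines. The `Review` fields and function names here are illustrative assumptions about how the data might be modeled, not CU Reviews’ actual schema:

```python
# Hypothetical sketch of the three sorting options (professor, recency,
# most helpful); field names are assumptions, not the real CU Reviews model.
from dataclasses import dataclass
from datetime import date

@dataclass
class Review:
    professor: str
    posted: date
    helpful_votes: int

def sort_reviews(reviews: list[Review], by: str = "recency") -> list[Review]:
    if by == "recency":
        return sorted(reviews, key=lambda r: r.posted, reverse=True)
    if by == "helpful":
        return sorted(reviews, key=lambda r: r.helpful_votes, reverse=True)
    if by == "professor":
        return sorted(reviews, key=lambda r: r.professor.lower())
    raise ValueError(f"unknown sort option: {by}")
```

Each option maps to a single stable sort key, so combining a sort with the key-factor filters is just a filter pass followed by one `sorted` call.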

FINAL DESIGNS

Finalizing designs & shipping the feature to developers via design-dev handoff.

I have handed off the Cornellians Say feature to the developers on my team, and it is expected to ship by Fall 2024!

REFLECTION & IMPACT

The Magic of Many

The Cornellians Say feature was a true cross-functional effort. Guided by user insights, I collaborated closely with my co-designers, developers, product marketing manager, and PM on my beloved CU Reviews team to bring the feature to life. Each role’s input was crucial, ensuring the final solution was both technically feasible and user-centered. This project highlights how great design emerges when expertise and ideas from all sides come together. I'm thrilled to see the impact of what we’ve built as a team!

User-Centered Design is Key

Throughout this project, I learned how crucial it was to continuously revisit the feedback we received from Cornell students. Each adjustment and iteration of the Cornellians Say feature was shaped by their input, ensuring the final solution addressed the actual pain points they faced during course selection. This experience reinforced the importance of staying user-centered and grounding every design decision in real user needs.

Moving Forward...

As I continue working with the CU Reviews team within Cornell DTI, there are several plans I’m excited to explore and expand upon to further enhance the platform’s usability and impact.


Enhanced Personalization

I plan to explore how we can further tailor the AI summaries to individual user preferences, allowing students to prioritize what matters most to them, such as course workload, teaching style, or grading policies.


Interactive Visual Summaries

Incorporating more data visualizations, like graphs or charts, to display sentiment trends or highlight key themes would offer users a quick, visual way to digest course insights, making the review process even more efficient.


User Testing & Usability Metrics

I also plan to conduct additional user tests using the Single Ease Question (SEQ) to measure how easily users can interact with the new feature once it is fully developed. This would provide more quantitative insights into the feature’s usability.
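For context, the SEQ is a single 7-point rating (1 = very difficult, 7 = very easy) collected after each task, and results are commonly reported as the mean across participants. A minimal aggregation sketch, not tied to any particular analysis the team will run:

```python
# SEQ responses are single 7-point ratings (1 = very difficult, 7 = very easy).
# Reporting the mean across participants is one common convention.
def mean_seq_score(responses: list[int]) -> float:
    if not responses:
        raise ValueError("need at least one SEQ response")
    if not all(1 <= r <= 7 for r in responses):
        raise ValueError("SEQ responses must be on a 1-7 scale")
    return sum(responses) / len(responses)
```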


Monitoring KPIs for Continued Improvement

To ensure ongoing success, I intend to monitor key performance indicators (KPIs), including user activity during course enrollment, overall usage throughout the semester, and the average time spent on the platform.


Shoutout to my amazing CU Reviews team & post-work session dinners <3


Made with ♥ by Erica © 2025

Thanks for stopping by!

Curious to collaborate? Let's make it happen!
