Interaction Design Studio Client Project
Sketch | Illustrator | Photoshop
UX Designer | UI Designer | UX Researcher
5 weeks
The Problem
Our client, RyeCatcher Education, works to ensure that all students, especially those with special needs or who come from at-risk environments, have access to the services they need to flourish in school. Their software is used to connect students to resources that support the full range of student needs both in and out of the classroom. RyeCatcher needed a tool for tracking and monitoring student behavioral progress over time. The goal of this project was to design a tablet optimized application that assists educators in tracking student and class behaviors.
How might we help educators track student behavior in a real-time classroom environment?​​​​​​​
Research + Exploration
To better understand the landscape, we assessed client expectations against the interaction design requirements and use case scenarios put forth by the client. Because the use case scenarios and project requirements were predetermined, our exploration phase focused on thoroughly understanding the project scope.
stakeholder interview
We sourced stakeholder input from RyeCatcher CEO Arthi Krishnaswami to understand what the tool currently does, how teachers and teaching aides use it, and what primary pain points users face.

interaction requirements & use case scenarios
The client also provided us with a list of tasks that users should be able to complete in the application, as well as a series of use case scenarios to contextualize the required interactions.
Information Architecture
Once we had a preliminary grasp of the project scope, we began mapping out the basic information architecture. Our goal was to organize and structure all relevant functionality in a way that was most efficient for in-the-classroom, on-the-spot tracking. We deliberated between two different entry points to the tracking function: through the student versus through the tracked behavior. We explored having the UI support both, but realized that the student-based entry point was the most intuitive approach.
We started broad, exploring an entire ecosystem of possibilities: tracking an entire incident or event involving multiple students, leveraging AI and natural language processing to facilitate rapid documentation of complex incident reports, and logging demographics, family histories, and classroom attendance. We had to cover a lot of ground before homing in on the primary function of the app: behavior tracking. Based on our exploration of what was possible within scope, we determined that we needed to define three major sections of the tool:
With these three sections to anchor our designs, our team began mocking up drafts of both the visual and interaction design details for these screens.
Whiteboard sketches of tracker screen ideas
We prototyped the ancillary functionality of the application first (the class and student profile screens) before homing in on the tracker. Our initial approach to the app's key function was driven by data visualization. This created a new way to think about designing this screen, but it also opened a conversation about visual connotations and how they relate to the app's overall goals. Different sizes, orientations, and colors of trackers were interpreted in different ways by different people, which led to important dialogues about meaning-making and signifiers.
For example, the size of a tracker can represent a behavior that a student is actively working on personally, rather than a general scale of relevance. If negative trackers are prominent, how noticeable should they be, and for how long? When icons serve as a visual mnemonic device, even if people read the textual labels to discern specific trackers, the icons still play their part in making those distinctions easy. Redundant yet diverse signifiers are often helpful for accommodating different types of users, as long as they do not add cognitive load.
Usability Testing
We structured the interview process in two parts:

freeform exploration + think aloud protocol
We encouraged users to navigate the application freely and used the think-aloud protocol to capture how many functionalities they understood without guidance and how many features they were able to discover on their own.
guided exploration + individual task performance
We guided them through each of the three primary user flows and asked them to perform a set of tasks on each screen. This helped us understand how many features and tasks were easy for the user to perform, where they struggled, and why they struggled.

users make meaning out of everything
Every element we added to our screens was analyzed and interpreted by users as signifying either a message or an interaction, so it is crucial to think intentionally about every UX, UI, and visual design decision.
discretion is key
The context of use for our application is sensitive and requires attention to discretion in the visual design. Since students may unintentionally view the screens while the educators are using the application, they shouldn't be able to easily deduce anything about their tracked behavioral information. We utilized a neutral color scheme and subtle signifiers to strike a balance between discretion and usage indicators.
a slight learning curve is okay
The end users of this product were educators with varying levels of technological proficiency, so we understood that the app's functions would be easier for some than for others. We believe this is acceptable as long as error recovery allows users to learn a function's correct use through interaction.