Case Study
Dashboard UI · 2025

Smart Student Dashboard.

A centralized academic management platform that transforms how students organize their studies — attendance, assignments, performance, and schedules, all in one clear, confident interface.

UI/UX Design · Dashboard · Education Tech · Figma · Inter · Light Mode
Role
UI/UX Designer
Tools
Figma
Platform
Web Dashboard
Duration
3 Weeks
Type
Concept · Prototype
View Live Prototype ↗
01 · Project Overview

The Brief

Most students don't fail because they lack ability. They fail because critical information is buried across five different platforms — and no one designed a better way. This project set out to change that.

The average student manages their academic life across 4–6 separate platforms — each with a different login, a different interface, and no connection to the others. The result isn't just inconvenience. It's missed deadlines, declining grades, and the chronic low-grade anxiety of never being sure what's coming next.

The Smart Student Dashboard was designed to be the tool that should have existed already — a single, clarity-first academic management platform that collapses attendance, assignments, grades, schedules, and notifications into one confident, scannable interface.

The design focused on clarity, usability, and accessibility. Not more features — fewer platforms. Not more reminders — better visibility.

Scope note: This is a concept project — research-informed and prototype-validated, but not currently deployed in an institutional setting. All outcome metrics are from prototype usability testing with 6 participants.
🎯 Vision

To replace the 5-platform patchwork with a single clarity-first dashboard that answers three questions in under 5 seconds: where do I stand, what's due today, and am I on track?

🚀 Mission

To eliminate academic platform-switching entirely — replacing 4–6 separate logins with one unified, urgency-aware dashboard that surfaces what matters without students having to search for it.

Project
Student Dashboard
Academic Management
My Role
UI/UX Designer
End-to-end · Solo
Research
8 Interviews + Survey of 24
Bengaluru
Status
Concept · Prototype Ready
Figma interactive
Platform
Web Dashboard
Desktop-first

Research came first. Before a single screen was designed, the platform-switching problem had to be understood from the inside. Here is what the research revealed.

02 · The Challenge

A Fragmented Academic Life

Students juggle attendance portals, LMS systems, assignment apps, schedule PDFs, and grade sheets — each on a different platform, each with a different login. The result isn't just frustration. It's missed deadlines, declining grades, and mounting anxiety.

📋

Scattered Information

Assignment deadlines live in one app, attendance in a portal, grades in another. Every piece of information requires a separate login and a separate mental context-switch.

Observed in contextual research — students averaged 4 platform switches per morning session. Cited by 7/8 interview participants as their primary daily frustration.
📉

No Performance Visibility

Students only discover they're at-risk when grades are posted at semester-end. There's no real-time feedback loop to course-correct early.

From 8 student interviews — 7/8 said they only knew their grade standing at semester-end.

Deadline Blindness

Without a unified view, upcoming exams, submissions, and events collide. Students learn about deadlines from WhatsApp group chats — an unofficial, unreliable system.

From survey of 24 students — 63% missed at least one important deadline last semester due to poor visibility across platforms.

These three problems pointed to a single architectural failure — the academic system was designed around institutional data management, not around student attention. Here is what we set out to fix.

03 · Goals & Objectives

What We Set Out to Do

01

Centralize All Academic Information

Create one entry point for schedules, assignments, attendance, grades, and notifications — eliminating the need to switch between platforms.

Research basis: All 8 interview participants reported managing 3+ platforms. Contextual observation found 4 platform-switches per morning session.
02

Surface Performance Insights Proactively

Move from reactive reporting (end-of-semester) to live performance tracking — so students can act before a small issue becomes a failing grade.

Research basis: 7/8 interview participants said they only discovered grade issues at semester-end.
03

Reduce Cognitive Load Through Design

Present dense academic data with clear visual hierarchy — using charts, status badges, and progress indicators rather than raw tables and PDFs.

Research basis: Survey finding that 78% of students avoided checking grades/attendance because the experience felt like a chore.
04

Drive Engagement & Ownership

Design an experience that motivates students rather than stresses them — making academic management feel empowering, not administrative.

Research basis: Interview sessions revealed anxiety as the dominant emotional response to academic admin.

Goals established, the next step was listening — systematically, with real students, about real failures. Here is what the research found.

04 · Research & Discovery

Listening to Students First

Interview Participant Context
Sessions — 8 semi-structured interviews, 40–50 minutes each
Who — Undergraduate students aged 18–23 across engineering, arts, and commerce programmes. Recruited via personal network and university community groups in Bengaluru.
Survey — Separate survey of 24 students: grade/attendance checking habits and deadline history over the last semester
Question Areas — Daily platform usage, deadline management habits, grade visibility, emotional response to academic admin

User Research

🎙
Student Interviews (n=8)
All 8 interviewees reported using 3+ separate platforms to manage academics.
"I just wish everything was in one place. I open three apps before I even know what I have due today." — Interview 04 · 2nd year Engineering · Bengaluru
"I keep a notebook just to remember what's due where because I can't keep all the apps open at once." — Interview 07 · 3rd year Commerce · Mumbai
📋
Surveys (n=24)
78% said they check grades and attendance less than once a week due to inconvenience. 63% had missed a deadline in the last semester.
👁
Contextual Observation
Students kept physical notebooks as a bridge between digital systems. In 5 of 8 interview sessions, students had a physical notebook open beside their laptop. This was the most compelling evidence for the consolidation approach — a notebook is a student's workaround for a broken digital system.
🗂
Secondary Research
Students who actively track academic progress are significantly more likely to seek help early and improve grades. Based on our research finding that students with real-time grade visibility made earlier interventions, we hypothesise a 2× improvement rate vs. reactive tracking — consistent with educational psychology literature (EDUCAUSE, 2022).

Competitor Analysis

✓ Full support
~ Partial support
✗ Not supported
Platform · Unified · Analytics · Alerts · UX
Blackboard · ✓ · ~ · ~ · Complex — unified shell, but the UX is complex
Google Classroom · ~ · ✗ · ✓ · Fair — alerts work, but no analytics
Moodle · ✓ · ~ · ~ · Dated — unified, but dated UX
Canvas · ✓ · ~ · ✓ · Fair — best existing option; limited analytics
This Dashboard · ✓ · ✓ · ✓ · Clean — all four capabilities with a clean, anxiety-aware UX
No existing tool combined a unified view, real-time performance analytics, reliable alerts, and a clean UX. Every competitor solves one or two of these dimensions, but none solves all four simultaneously.

Research revealed the gap. The insights then needed to be distilled into numbers that would drive every design decision. Here is what the data said.

05 · Key Insights

What the Data Said

78%

Checked Less Than Weekly

Students avoided checking grades/attendance because the experience felt like a chore, not a tool.

Survey of 24 students, Bengaluru
3+

Platforms Per Student

Average number of separate systems students maintained — each with unique logins and interaction patterns.

From 8 user interviews — all 8 participants

2×

Hypothesised Improvement

Students with real-time visibility are hypothesised to seek help earlier — consistent with learning analytics research.

Secondary research basis: EDUCAUSE Learning Analytics, 2022.
63%

Missed a Deadline

Nearly two-thirds of those surveyed missed at least one important deadline last semester due to poor visibility.

Survey of 24 students, Bengaluru
💡 The Central Insight

Every insight pointed to the same failure mode: students had information, but couldn't act on it — because it was buried, scattered, or presented without urgency context. This shaped the core design principle: see the situation, know the priority, take the action — in under 5 seconds.

Numbers tell the story. Personas make it human. Here is who these numbers represent.

06 · User Personas

Who We Designed For

Scope note: Research surfaced three distinct user types — the High-Performer (Arjun), the At-Risk Juggler (Priya), and the Faculty Advisor (Ms. Rekha). Priya was selected as the v1 primary design target. Arjun's analytics features and Ms. Rekha's faculty view are scoped for v2 and Phase 3 respectively.
📚

Arjun, The Achiever

High-Performer · Year 2 Engineering

Tracks every grade meticulously. Wants predictive insights, not just records. Uses the dashboard for proactive performance management and trend identification.

Grade analytics Trend charts Export reports
"I track my grades manually in a spreadsheet because the college portal only shows final marks. I want to see whether I'm trending up or down."— Interview 01 · 2nd year Engineering · Bengaluru
🎯

Priya, The Juggler

At-Risk · Part-time Work · Year 2

Balances part-time work and studies. Constantly forgets deadlines. Needs a simple, no-friction overview that tells her exactly what to focus on each day — primary v1 target.

Deadline alerts Quick overview Mobile access
"Half the time I find out about a deadline from a classmate's WhatsApp message, not from any official system. By then it's sometimes too late."— Interview 05 · 2nd year · Part-time work · Mumbai
🏫

Ms. Rekha, The Advisor

Faculty Advisor — Phase 3 scope

Supports struggling students. Needs a bird's-eye view of attendance and academic standing. Her needs informed the at-risk alert design in the student-facing v1 dashboard.

Attendance summary At-risk flags Print reports
"In an advising session, I have to open four different portals to see a student's complete picture."— Faculty advisor · Contextual interview

Personas define who the system serves. Priya's journey maps where it fails her — and exactly where the dashboard creates relief. Here is a day in her life.

07 · Journey Map

A Day in Priya's Life

Priya's journey was prioritised for mapping because she represents the highest-anxiety, highest-risk user. Mapping her day revealed where anxiety peaks, where information breaks down, and exactly where a unified dashboard creates relief.

Morning Check-in
Class Day
Evening Review
Deadline Crunch
End of Week
Actions
Checks 3 apps — attendance, schedule, WhatsApp group for updates
Attends class — writes assignment deadlines in a notebook; digital tools not accessible during class
Opens LMS — searches for materials; can't find them
Panics — realizes the assignment due tomorrow was forgotten
Reviews grades — manually calculates cumulative score in a spreadsheet
Pain Points
Too many apps, too much time wasted
Digital tools not usable in class — falls back to notebook
Materials buried; poor navigation
No centralized deadline view — relies on WhatsApp alerts
Manual grade calculation — error-prone
Opportunity
One-screen daily summary
Mobile-friendly deadlines view accessible during class
Materials tab in dashboard
Smart deadline timeline with urgency indicators
Auto-calculated grade tracker with trend view
After the Dashboard — Priya's Transformed Day
Opens 3 apps → One dashboard open. Morning: today's schedule, pending tasks, and attendance status visible in 30 seconds.
Writes deadlines in notebook → Deadline auto-synced. Class: assignment added to the tracker via mobile; no notebook workaround needed.
Can't find materials → Materials tab, instant. Evening: performance chart shows a declining trend; contacts professor before a grade crisis.
Panics over deadline → Urgent badge surfaced. Crunch: the dashboard surfaced the deadline 5 days earlier with a red Urgent badge.
Manual spreadsheet calc → Auto-calculated, trended. End of week: cumulative score auto-calculated with a trend arrow in seconds.

The journey map revealed where the system broke down. The IA and user flows show how the dashboard was structured to fix every one of those breakpoints. Here is the architecture.

08 · Information Architecture & Flows

Structure Before Style

Navigation was designed around the three questions from the design principle. Every section of the dashboard maps directly to a core user need — with zero decorative structure.

Navigation Structure
🏠 Home
📊 Analytics
📅 Attendance
📋 Assignments
🗓 Schedule
🔔 Notifications
⚙️ Settings
Primary User Flows — 4 Critical Paths
Morning Check
Login
Home Overview
Today's Schedule
Pending Tasks
Ready for Day
Performance
Dashboard
Analytics Tab
Subject Breakdown
Grade History
Insight Found
Attendance
Dashboard
Attendance View
Subject Filter
Status Check
Action Taken
Reporting
Dashboard
Reports Section
Date Range
Preview
PDF Export
Each flow was designed to a maximum of 3 clicks. Click depth verified in usability testing: 5/6 participants completed the Morning Check flow without backtracking or confusion.

Architecture defined. The next phase was building — starting with lo-fi wireframes and iterating toward hi-fi based on what testing revealed. Here is the process.

09 · Wireframes & Iterations

Building the Blueprint

💡
Core Design Principle — Applied to Every Screen

Every screen had to answer one of three questions instantly: "Where do I stand right now?", "What needs my attention today?", and "Am I on track to succeed?" If a screen couldn't answer one of these, it didn't belong.

Wireframes prioritized information hierarchy above everything. The blue sidebar was defined in V1. Three wireframe states are shown below — rendered HTML representations of the actual lo-fi and mid-fi frames.

Lo-fi · V1
Home Dashboard

Sidebar nav + KPI row + primary chart area. Category grid appeared first — problem discovered in testing.

Lo-fi · V1
Attendance View

Donut chart + subject-wise list. Visual-first hierarchy locked in wireframe — validated in testing.

Mid-fi · V2
Refined Layout — Priority Banner Added

Priority banner moved to top (red bar above fold). Primary chart left, quick-access panel right.

V1: avg 38 seconds to find urgent item
V2: avg 21 seconds — 45% faster
🔄
Key Iteration Insight

V1 testing revealed students scanned cards in an F-pattern and spent disproportionate time searching for "what's due today." V2 elevated the upcoming deadline widget to the top-right primary position. Time to locate the highest-urgency item dropped from 38 seconds to 21 seconds across 4 participants.

Blueprint validated. The design system was then built to codify every visual decision. Here is the visual system.

10 · UI Design & System

Visual Language

A clean, light-mode system built entirely on Inter — chosen for its exceptional legibility at small sizes, critical for a dashboard where students are reading grades, attendance figures, and deadlines at a glance. Every color token is semantically assigned.

Student Dashboard Design System · Inter · Light Mode · Education

🎨 Color Palette

Brand — Single Primary Blue
Primary Blue
#1E90FF
Blue Light
#E3F2FD
Blue Med
#42A5F5
Colour strategy: #1E90FF was selected as the single primary blue — confident and active without feeling clinical. Semantic colours (green/red/amber) align with universal educational conventions — students understand them without instruction.
Status & Semantic
Success
#4CAF50
Danger
#F44336
Warning
#FFC107
Orange-Red
#FF7043
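For readers who want the palette above in machine-readable form, here is a hypothetical token map. The hex values are the ones listed in the design system; the token names and the TypeScript shape are illustrative assumptions, not part of the Figma deliverable.

```typescript
// Design-system colour tokens — hex values from the case study's palette.
// Token names are illustrative; a real build might emit these as CSS
// custom properties instead.
const colors = {
  // Brand
  primaryBlue: "#1E90FF",
  blueLight: "#E3F2FD",
  blueMed: "#42A5F5",
  // Status & semantic
  success: "#4CAF50",
  danger: "#F44336",
  warning: "#FFC107",
  orangeRed: "#FF7043",
} as const;

type ColorToken = keyof typeof colors;

// Look up a token by name, e.g. for a badge background.
function color(token: ColorToken): string {
  return colors[token];
}
```

Keeping every colour behind a named token is what makes the "semantically assigned" claim enforceable — components reference `danger` or `success`, never a raw hex value.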

Aa Typography — Inter

Inter · Variable Sans-Serif
Dashboard
Aa Bb Cc Dd Ee 0 1 2 3 4 5 6 7 8 9
Regular
400
Semibold
600
Bold
700
Black
900
Why Inter: Exceptional legibility at small sizes — critical for a dashboard where students are reading grades and deadlines at a glance. A single variable typeface family eliminates tonal inconsistency across the data-dense interface.
Heading — 28px / 700
Academic Performance
Body — 14px / 400
Your attendance in Data Structures is 72% — below the required 75%.

📐 Spacing Scale

4px — Icon gaps
8px — Label–value gap
16px — Standard inner padding
24px — Card padding
40px — Section gap
Chart Colors
Sky Blue
Teal
Yellow
Coral

🧩 Components

Buttons
Status Badges
✓ Achieved ✗ Absent ⚠ At Risk ● Active
Progress Bars
82%
61%
44%
Mini Stat Card
Attendance 87% overall ↑ 4% vs last month

System established. The key design decisions that defined the product are documented next. Here are the choices that made the dashboard what it is.

10b · Key Design Decisions

Six Choices That Define the Dashboard

Each decision below was made in response to a specific research finding or testing observation. None are defaults — every one of them required a rejected alternative.

📊 Cards Over Tables for Attendance
Early V1 used a raw percentage table. In testing, 4/6 participants couldn't quickly determine whether their standing was "okay" or "at risk." Cards with colour-coded progress bars made the same data understood in under 2 seconds.
Research basis: V1 usability testing — 4/6 participants struggled with the table format.

🚨 Priority Banner Above the Fold
V1 opened with a category grid. Participants scrolled through all 8 categories before finding urgent items — average 38 seconds. V2 moved the top 3 urgent items into a priority banner. Average time to urgent item: 21 seconds.
Research basis: F-pattern scan data from V1 testing sessions.

📅 Countdown Language Over Exact Dates
"Submission date: 17 March" requires mental calculation. "Due in 2 days" communicates urgency instantly. Participants consistently referenced countdown language when describing how urgent an item felt.
Research basis: Preference testing — 6/8 participants referenced countdown language.

🎨 Light Background for Anxiety Reduction
Two versions were shown to the same participant — one with a red "At Risk" badge on a light background, one on a deep red card. The first felt "like a helpful warning." The second: "like I was being punished." Colour intensity is an anxiety dial.
Research basis: Direct comparison session, usability testing round 2.

📈 Trend Lines Over Bar Charts
V1 used bar charts. Participants couldn't interpret whether they were improving or declining. A line chart with a trend arrow ("↑ 12% vs last semester") gave the contextual reading they needed — trajectory, not just position.
Research basis: V1 usability testing — students couldn't infer trend direction from static bars.

🔔 Priority-Sorted Notifications
Chronological lists buried academic alerts under system announcements. Research showed students used WhatsApp groups as their unofficial deadline reminder system. Notifications were redesigned into three tiers: Urgent, Academic, General.
Research basis: Observation — WhatsApp used for deadline reminders, cited in 5/8 interviews.
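The countdown-language decision reduces to a small rule: convert an exact due date into "Due in N days" phrasing plus an urgency tier, so the UI never asks the student to do date arithmetic. A minimal sketch, assuming illustrative cutoffs (≤ 2 days = urgent, ≤ 6 days = soon) — the function name and thresholds are hypothetical, not taken from the prototype spec.

```typescript
// Translate an exact due date into countdown language plus an urgency tier.
// Thresholds are illustrative assumptions, not from the case study.
type Urgency = "urgent" | "soon" | "later";

function countdownLabel(
  dueDate: Date,
  today: Date
): { label: string; urgency: Urgency } {
  const msPerDay = 24 * 60 * 60 * 1000;
  // Round up to whole days so "Due in 2 days" stays stable through the day.
  const days = Math.ceil((dueDate.getTime() - today.getTime()) / msPerDay);
  const label =
    days <= 0 ? "Due today" : days === 1 ? "Due tomorrow" : `Due in ${days} days`;
  const urgency: Urgency = days <= 2 ? "urgent" : days <= 6 ? "soon" : "later";
  return { label, urgency };
}
```

The urgency tier is what drives the red/amber chip colour — the label and the colour are derived from the same number, so they can never disagree.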

Decisions justified. Now the screens themselves — six final high-fidelity views. Here is the final product.

11 · Final UI Designs

The UI Highlights

Six core screens — each answering one of the three design principle questions at a glance. Blue sidebar provides consistent orientation; white cards give content space to breathe.

🏠Home Overview
📅Attendance
📋Assignments
📊Performance
🗓Schedule
🔔Notifications
12 · Key Features

Core Solutions

📅

Attendance Tracker

Monitor daily and subject-wise attendance with colour-coded progress bars. At-risk warnings trigger automatically when attendance dips below the threshold.

Research basis: 7/8 students didn't know their attendance percentage until too late — most frequently requested feature across all 8 interviews.
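The at-risk logic described above is a threshold check. A minimal sketch, assuming the 75% minimum stated in the case study; the 5-point "warning band" and the function name are illustrative assumptions.

```typescript
// Map a subject's attendance percentage to its card badge.
// 75% minimum is from the case study; the warning band is an assumption.
type AttendanceStatus = "on-track" | "warning" | "at-risk";

function attendanceStatus(percent: number, minimum = 75): AttendanceStatus {
  if (percent < minimum) return "at-risk"; // red badge: below threshold
  if (percent < minimum + 5) return "warning"; // amber badge: close to the line
  return "on-track"; // green badge
}
```

Deriving the badge from the same number the progress bar renders keeps the visual and the status guaranteed to agree.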
📋

Assignment Tracker

Stay ahead of every deadline with countdown chips, priority colours, and subject tags. Urgency readable in under 2 seconds without date calculation.

Research basis: 63% of students surveyed missed a deadline last semester. 5/8 interview participants relied on WhatsApp messages for deadline awareness.
📊

Performance Summary

Trend lines and trajectory badges show whether grades are improving or declining — students spot downward trends before they become academic crises.

Research basis: 7/8 participants said they only discovered grade issues at semester-end. Real-time trend visibility was the direct design response.
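The trajectory badge ("↑ 12% vs last semester") can be sketched as a simple comparison of the current cumulative score against the previous semester. This is an illustrative sketch — the function name, rounding, and zero-handling are assumptions, not the prototype's actual logic.

```typescript
// Render a trend badge from two cumulative scores.
// Rounding and the zero-previous guard are illustrative choices.
function trendBadge(current: number, previous: number): string {
  if (previous === 0) return "—"; // no baseline to compare against
  const change = Math.round(((current - previous) / previous) * 100);
  if (change === 0) return "→ no change vs last semester";
  const arrow = change > 0 ? "↑" : "↓";
  return `${arrow} ${Math.abs(change)}% vs last semester`;
}
```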
🔔

Notifications & Updates

Sorted by priority: Urgent, Academic, General. Critical academic alerts always surface first — system announcements cannot bury deadline reminders.

Research basis: Students reported using WhatsApp group chats as their primary deadline alert system — an unofficial workaround that fails under exam-season pressure.
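The three-tier sort described above can be expressed as a two-key comparator: tier first, recency second, so tiers never interleave and a system announcement cannot bury a deadline reminder. The tier names are from the case study; the data shape and field names are assumptions for illustration.

```typescript
// Sort notifications by tier (Urgent → Academic → General), newest first
// within each tier. Data shape is a hypothetical sketch.
type Tier = "urgent" | "academic" | "general";

interface Notification {
  tier: Tier;
  message: string;
  sentAt: number; // epoch ms
}

const tierRank: Record<Tier, number> = { urgent: 0, academic: 1, general: 2 };

function sortNotifications(items: Notification[]): Notification[] {
  return [...items].sort(
    (a, b) => tierRank[a.tier] - tierRank[b.tier] || b.sentAt - a.sentAt
  );
}
```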
📱

Responsive UI Design

Clean, accessible layouts optimized for both desktop and mobile — so students can check in during class, on commute, or anywhere without switching contexts.

Research basis: Journey mapping revealed students need deadline access during class (they currently use notebooks as workarounds).
👤

Personalized User Panel

Profile hub with quick links to tools most relevant to the individual student. Priority ordering adjustable based on usage patterns.

Research basis: Different users have different primary needs — customizable panels ensure the most-needed information surfaces without manual navigation.
13 · Usability Improvements

Before → After

01
Before

Attendance data shown as raw percentage numbers in a table — no visual context for what's good or bad.

V1 — Table:
Data Structures: 72%
Algorithms: 85%
OS: 44%
V2 — Cards:
⚠ Data Structures · 72% · Below threshold
✓ Algorithms · 85% · On track
Identified in V1 usability testing — 4/6 participants couldn't quickly determine if attendance standing was "okay" or "at risk."
After

Color-coded progress bars with status badges make standing immediately clear. At-risk threshold (75%) marked visually on every bar.

02
Before

Deadlines listed by submission date only — no urgency or remaining time context.

V1 — Date only:
Assignment 1: 17 March
Lab Report: 21 March
V2 — Countdown:
🔴 Assignment 1 · Due in 2 days
🟡 Lab Report · Due in 6 days
Identified in V1 testing — participants asked "is 17 March soon?" rather than acting. Countdown language eliminated the calculation step.
After

Deadline cards show a countdown chip ("Due in 2 days"), priority colour, and subject tag — urgency readable in under 2 seconds.

03
Before

Performance data showed only raw grades — students couldn't see if they were improving or declining.

Identified in V1 testing — participants couldn't answer "are you improving?" without significant hesitation.
After

Line chart with trend arrows and a percentage-change badge shows trajectory — "↑ 12% vs last semester" is far more motivating than a static grade.

04
Before

Notifications were chronological — a system announcement buried a deadline reminder that arrived 3 days earlier.

Identified through research — students reported missing deadline reminders buried under admin announcements.
After

Notifications sorted by priority type: Urgent, Academic, and General — critical academic alerts always surfaced first.

14 · Outcomes & Results

The Impact

Usability Testing Context — 6 Participants
Participants — 6 total: 3 engineering students, 2 commerce students, 1 faculty advisor
Method — Moderated sessions using the Figma interactive prototype; think-aloud protocol
Tasks — 5 pre-defined tasks given verbally; sessions were 30–40 minutes
Scope — Concept project: all metrics are from prototype testing, not a deployed live system
84
SUS Usability Score
Above the 80-point threshold generally read as "excellent." Measured via the standard 10-item SUS questionnaire post-testing.
45%
Faster time-to-task
vs participants' own multi-platform workflow — same "find urgent academic task" scenario run on both.
100%
Task 1 completion rate
Morning check flow — all 6 participants completed the task, 5 of them without any backtracking.
6/6
Would use daily
Post-session survey — all 6 participants stated they would use this over their current setup.
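For context on the SUS score above: the System Usability Scale (Brooke, 1996) is scored from ten items rated 1–5. Odd-numbered items score (response − 1), even-numbered items score (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 figure. A sketch of that standard formula:

```typescript
// Standard SUS scoring: odd-numbered items (index 0, 2, ...) score
// (response - 1); even-numbered items score (5 - response); the sum of
// adjusted scores is scaled by 2.5 onto a 0-100 range.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0
  );
  return sum * 2.5;
}
```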

Academic Clarity

Students with real-time visibility into performance are more likely to seek academic support before the end of term — earlier intervention before small issues become failing grades.

Reduced Anxiety

Testing feedback consistently noted the design "felt calm" and "not overwhelming" — a direct result of prioritizing hierarchy and white space over feature density. Calm is a feature.

Time Savings

Centralizing 4+ platforms into one reduced task completion time by 45% vs the multi-platform baseline. Estimated 15–25 minutes of daily academic admin time reclaimed.

💬 User Quote

"This is literally everything I've been looking for. I didn't realize how much time I was losing every morning until I used this dashboard and realized I'd found what I needed in 30 seconds."

— Test Participant 03 · Year 2 Engineering · Bengaluru · Usability Session, March 2025

15 · Learnings

What I Took Away

01

Consolidation IS the Feature

Students didn't need more features — they needed fewer platforms. The most impactful design decision was architectural: building one unified entry point, not a better version of something that already existed.

02

Context Turns Data Into Action

"72% attendance" is inert. "72% — 3 below the minimum required. Attend your next 2 classes to get back on track" is a call to action. Every number needs a frame.

03

F-Pattern Scans Decide Layouts

User testing confirmed F-pattern scanning on every screen. V1 placed a category grid first. V2 moved the priority banner to top-left. Time-to-urgent-item dropped from 38 to 21 seconds.

04

Anxiety Is an Intensity Problem

Same "At Risk" badge on a light background vs a deep red card. The first felt "like a helpful warning." The second: "like I was being punished." Colour intensity is an anxiety dial — calibrate it deliberately.

05

Colour Before Hierarchy Costs Iterations

V1 had the blue sidebar colour before hierarchy was validated. V2 was done in greyscale until hierarchy was confirmed — and took half the iterations to finalise. Structure first. Style second. Always.

06

Colour Communicates Status — or It's Just Decoration

In V1, 87%, 72%, and 45% appeared in identical black text — 14 seconds to identify at-risk subject. In V2, 72% appeared in amber with a badge — spotted in under 2 seconds. Colour didn't decorate the data. It was the data.

16 · Next Steps

Future Roadmap

The roadmap is sequenced around user impact first, then institutional complexity. Mobile companion (Phase 2a) addresses the most immediate unmet need — students need access during class and commute. AI Study Planner (Phase 2b) depends on Phase 2a delivering real usage data for meaningful personalisation. Faculty integration and Institutional API (Phase 3) require institutional buy-in and backend engineering.
Phase 2a — First priority

Mobile-First App

A native mobile companion with push notifications for deadline alerts, class reminders, and grade updates — the most immediate unmet need from research.

Required before Phase 2b — usage data from mobile informs AI personalisation.
2a
Phase 2b — Depends on 2a

AI Study Planner

Analyze real performance patterns to auto-generate personalized study schedules. Requires Phase 2a usage data to be meaningful.

Needs 3+ months of mobile behaviour data to train useful patterns.
2b
Phase 3 — Institutional

Faculty Integration

A parallel faculty view for advisors and professors — enabling at-risk identification and assignment management.

3
Phase 3 — Institutional

Institutional API

Deep integration with university ERP systems for live grade syncing and automated attendance capture.

3

Want to see the full prototype?

Explore the interactive Figma prototype or get in touch to discuss this project in more detail.

© 2025 Sharon Derik · Smart Student Dashboard Case Study
