DCO.ai

product management
ux design
ux research

Redesigned a video learning tool interface by leveraging timestamped tags to highlight key moments, enhancing navigation and reducing cognitive load for educational users.

Roles & Responsibilities

Lead Designer | User Researcher | UX Manager

Project Context

Client: DCO.ai
Time: 1 yr 2 mos
Team: Andrea Estrada, Marina Castellenos

Tools

Figma
Miro
Qualtrics
Adobe Premiere Pro

01

Project Overview

Problem

DCO.ai is a start-up aiming to help users find and retain information through its AI-powered language and image processing interface. Already used by several large media companies, the company wanted to explore the product’s potential in educational contexts. However, the text-heavy, cluttered interface discouraged learners, despite its utility in processing information from videos. This created a barrier to entry in the educational market, where usability and simplicity are paramount.

Goal

To design an interactive, simple, and intuitive video interface that:

01. Improves the experience of interacting with videos for educational purposes.
02. Helps expert and non-expert users easily find or discover information in long-form videos.
03. Reduces cognitive load and encourages deeper engagement with educational content.

02

Solution

We created a new design that provided a virtual learning space, simplifying and highlighting information processed from videos. Key moments were organized into time tags (timestamps) and chapters, with a reactive design visualizing connections between the video, transcript, and timeline. The interface was decluttered to reduce cognitive load, colors were chosen to align with the brand identity, and missing video player features like scrubbing and playback speed controls were added.

The new design allows users to:

• Quickly locate relevant sections in long videos.
• Interact with the information intuitively through visualized timestamps and chapters.
• Experience a seamless flow between video content and related textual information.

03

Process Review

Research
literature review
competitive analysis
semi-structured interviews
affinity mapping
persona development

Design
brainstorming workshops
participatory design
sketching & mockups
prototypes (low- and high-fidelity)

Testing
concept validation
A/B testing
first-click testing
design review
heuristic evaluation
04

Research

Semi-structured interviews

We interviewed 11 users to understand their attitudes toward educational videos, frustrations with similar technologies, and common behaviors. To capture a broad range of perspectives, we recruited undergraduate and graduate students, self-learners, working professionals, and designers.

Key findings:
• Skimming behavior: Most users skim or avoid hour-long educational videos, finding it hard to locate relevant content.
• Standard controls: All users rely on common video player features such as pausing, rewinding, playback speed, and scrubbing.
• Importance of transcripts: 8 participants emphasized the necessity of transcripts, especially for long-form educational videos.

Affinity mapping

We synthesized the interview data into an affinity map using over 350 sticky notes. This process allowed us to visualize common user behaviors and pain points, which informed our design decisions. From this, we identified key insights and created personas to empathize with users and define specific design goals.

Personas

We developed two personas, representing:

1. A self-learner: Focused on efficiency and quick access to relevant information.
2. A student: Seeking deeper engagement and structured navigation for comprehensive learning.

Key findings & design recommendations

01. Overwhelm from long videos:
> Visualize information density within a video to make key concepts easily identifiable.
02. Reliance on standard video controls:
> Implement core video player features like scrubbing and playback speed adjustment.
03. Value of video visuals:
> Suggest tagging screenshots from the video to highlight important visuals (future scope).
04. Collaborative learning needs:
> Allow shared tags/notes for peer interaction (future scope).

05

Ideation

Brainstorming workshops

We facilitated multiple brainstorming workshops with designers, students, engineers, and product managers to explore potential solutions. After each session, we conducted surveys to prioritize features and identify the most viable design directions.

Key ideas included:
• Visualizing timestamps as part of the video timeline.
• Interactive transcripts to bridge text and video content.
• Collaborative tagging and note-sharing for group learning (deferred).

Mockups

To align with stakeholders, I created mid-fidelity mockups to visualize potential solutions. These mockups helped scope the project by focusing on:

• Visualizing video content through time tags and chapters.
• Implementing standard video player features.

Sketches

I created low-fidelity sketches in Procreate to encourage feedback and co-creation during user testing. Participants were more willing to share criticisms and suggest ideas when reviewing these sketches.

06

Design

We created multiple low-fidelity prototypes in Figma, iterating on user feedback throughout the design phase. The high-fidelity prototypes refined navigation, layout, and interaction patterns to address user pain points.

Low-fidelity prototype

High-fidelity prototype

Final Design Focus

• Visualizing tags and chapters: A cleaner, more intuitive video timeline.
• Video player enhancements: Adding missing controls like scrubbing, speed adjustment, and bookmarking.
• Updated visual design: A decluttered layout with rounded edges, softer borders, and improved text hierarchy.

Before

After

07

Evaluation

Concept testing

We conducted two rounds of user flow testing with 8 participants:

• 95% found the tag concept useful.
• Chapters preferred: 6 of 8 users appreciated chapter markers for navigation.
• Increased feedback: Users requested more visual feedback when interacting with tags.

A/B testing

We compared two interface layouts and tag visualizations to finalize the design. We purposely used sketches rather than the high-fidelity prototype so participants would feel comfortable suggesting changes and sharing critiques.

• Transcript placement: 100% preferred Layout B, placing the transcript at the bottom.
• Tag visualization: 75% favored a consistent placement for tag details on the right.

User flow test

Findings:

1. Users requested clearer feedback for selected tags (e.g., color highlights).
2. Borders were deemed too harsh; softer edges were preferred.
3. Tags and chapters were placed too closely together, requiring more spacing.

Final design changes

• Added color feedback for tag selections.
• Rounded edges and softened borders.
• Increased text size and spacing between components.

08

Hand-off

We delivered responsive prototypes and detailed design artifacts to the client’s engineering team, including our annotated Figma files and documentation of research findings. To ensure a smooth handoff, we met with the engineers to discuss implementation feasibility.

Suggested Future Opportunities:
• Tagging screenshots: Enhance visual learning by allowing users to tag key visuals.
• Collaborative features: Enable shared tags and notes for group learning.
• Improved transcription: Ensure ADA compliance by enhancing transcript accuracy.

09

Retrospective

What Went Well

Early testing provided actionable feedback, driving iterative design improvements.
Clear project scope ensured deliverables were realistic and aligned with stakeholder expectations.

Challenges

Balancing stakeholder input with user needs required careful prioritization.
Time constraints limited the scope of collaborative features.
Limited opportunities to communicate with the engineering team.

Lessons Learned

1. Begin testing early to validate ideas and iterate effectively.
2. Define scope realistically to ensure feasibility and quality outcomes.

10

Personal Reflection

Working on the redesign of the DCO.ai interface for educational use was both a challenging and rewarding experience. It gave me the opportunity to navigate complex user behaviors and expectations around long-form educational videos, balancing their needs with the constraints of a fast-paced startup environment. I particularly enjoyed the collaborative workshops, where brainstorming with cross-functional teams led to innovative ideas, such as the interactive tags and chapters feature.

The project also reinforced the importance of empathy in design. Through user interviews, I gained a deeper understanding of how overwhelming dense information can feel, and this insight guided my focus on decluttering and simplifying the interface. The iterative testing process taught me how even small design choices—like adjusting the placement of a transcript or softening borders—can significantly impact usability and user perception.

Through this project, I grew as a user experience professional by improving my ability to communicate design decisions effectively to both technical and non-technical stakeholders. It also deepened my appreciation for the power of thoughtful visual and interaction design in making complex tools accessible to a wider audience. Most importantly, it underscored the value of designing for clarity and engagement, a principle I carry forward into all my future projects.
