Mixed Methods | End-to-end research | Product Strategy | Cross-Functional Collaboration | Stakeholder Alignment
The Cisco Licensing ecosystem is complex and often frustrating, driving users away from key actions and tools.
I conducted mixed-methods research to uncover the root causes of low adoption for one of Cisco's key licensing platforms and to inform improvements. Through cross-functional collaboration across Cisco and Cisco Meraki, my research delivered actionable recommendations that drove immediate interface enhancements and informed the greater product strategy.
Role
UX Researcher
Stakeholder Teams
Engineering, UX Design, Product Management, Data Science
Timeline
12 Weeks (6/10-9/10)
Relevant Tools
Qualtrics, Miro, Webex, Figma, Excel
A key Cisco licensing platform faced low adoption and limited knowledge of its current user experience. The platform lacked a cohesive vision, making this both a challenge and an opportunity for research.
Why it Mattered
• Low adoption was impacting critical business metrics.
• Frustrated customers were turning to alternative solutions.
• Leadership recognized the need for a user-centered approach to improve the platform experience.
Given the platform's low adoption, the limited knowledge of its user experience, and stakeholder priorities, I ran an exploratory research project with three main objectives:
1. Evaluate the current user experience and identify improvements
2. Understand how and why customers are using other platforms
3. Inform the greater product ecosystem strategy
Tasked with leading this research effort, I had to learn a new and complex domain of network devices and licensing models.
I held initial collaborative meetings with stakeholders to understand the products and strategy roadmaps and to define the problem space.
Mapping out which stakeholders to learn from and involve in the project. Initial research notes.
I kicked off the project with an interactive meeting with key stakeholders to solidify the project goals, research questions, and proposed plan.
This helped ensure alignment with stakeholder priorities and finalize my research plan.
Kickoff meeting board for stakeholder collaboration
To paint a complete picture, I chose a combination of qualitative and quantitative methods. This approach allowed me to triangulate findings and build more impactful recommendations.
1:1 Virtual Interviews
To identify preferences, pain points, and use cases of licensing tool(s)
User Walkthrough
To assess customer experience and usage via user demonstration
SUS Survey
To measure user satisfaction & perceived usability of the system
Strategic Participant Recruitment - I designed a Qualtrics screener to recruit both active users and non-users who had turned to alternative platforms. This helped reveal not only the 'what' but also the 'why' behind usage gaps.
Goal-oriented Moderator Guides - I created tailored moderator guides for the two user types, focusing on platform experience, alternate platform use, and licensing mental models. Stakeholder feedback ensured alignment with key tasks for walkthroughs.
Participants had varying experience and inconsistent use cases, which challenged me to adapt beyond the moderator guide. This helped me explore deeper emotional drivers behind user behavior. I began to discover that the platform’s challenges weren’t just about functionality — users didn’t trust it.
This dynamic approach allowed me to dive into deeper causes while staying aligned with the study's broader goals.
With stakeholders spanning both Cisco and Cisco Meraki, including engineers who primarily designed the platform, I prioritized ongoing engagement throughout the research process. I shared interim updates to keep teams informed, met with engineers to clarify knowledge gaps, and tailored my insights to address cross-functional priorities.
This ensured that my recommendations were impactful and meaningful to all stakeholders while staying grounded in their strategic objectives.
Analyzing the data required careful attention to nuanced behaviors and feedback. I employed a mixed-methods synthesis approach to ensure my findings were both accurate and meaningful.
Affinity Mapping
Grouping observations from the interviews and walkthroughs to identify recurring pain points, behaviors, and emotional drivers such as mistrust.
User Flow Analysis
Breaking down participant workflows during walkthroughs to pinpoint where they left the platform and why.
SUS Survey Results
Layering quantitative satisfaction metrics with qualitative insights to highlight gaps between user expectations and platform performance.
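For context on the quantitative layer above: SUS results are typically computed with the standard Brooke scoring rule before being compared against benchmarks. A minimal sketch, assuming ten responses on a 1–5 Likert scale (the function name `sus_score` is illustrative, not from the study):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    1-5 Likert responses, using the standard SUS scoring rule."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded: contribution = r - 1.
        # Even-numbered items are negatively worded: contribution = 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw total to 0-100

# A fully neutral response set (all 3s) lands at the midpoint.
print(sus_score([3] * 10))  # → 50.0
```

Scores above roughly 68 are conventionally read as above-average usability, which is what makes the metric useful to layer against qualitative findings.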
I aimed for my insights to inform both granular interface refinements and overarching strategic decisions. Presenting clear evidence and visualizations to illustrate findings was key when communicating with diverse stakeholders.
Customers visit this platform to stay updated and take action, but data inconsistencies across features cause mistrust, driving them toward alternative platforms they perceive as more trustworthy.
Pinpointing features that cause mistrust
User flows showcasing expected steps vs. reality
Highlighting points of platform abandonment
Actions that require more steps than customers expect lead to frustration and prompt users to leave the platform.
Customers require enhanced visibility of key elements, familiar terminology, and clear information hierarchy to successfully complete key licensing tasks.
*measured by task completion during user walkthroughs
Identifying UI elements that have clear visibility and terminology
Illustrating license ecosystem complexity
Customers overlook the platform because its purpose is unclear among the many platforms they use for licensing. A clear definition of the platform's role is critical to justify its value and improve the Cisco licensing experience.
This project taught me several important lessons about collaboration and research success.
Stakeholder collaboration is key
I learned the importance of fostering ongoing alignment across cross-functional teams to ensure research delivers maximum value. Regular updates and collaborative synthesis sessions helped bridge gaps between disciplines and made my recommendations more actionable.
Always stay flexible
Working with participants who had inconsistent experiences taught me to stay flexible, dig deeper into emotional drivers, and go beyond the script to uncover more meaningful insights.
Trust is a powerful part of the user experience
I realized through the study how important trust is for user engagement and retention. Addressing seemingly small issues, such as data inconsistencies, can have a profound impact on user behavior and perceptions.