
I’m a mixed-methods UX researcher and strategist with over 10 years of experience working across consumer and enterprise products. I’ve led research to inform product strategy, evaluate new experiences, and support launch decisions, particularly in 0→1 and emerging product spaces. I focus on helping teams clarify priorities, assess risk, and make durable decisions grounded in insights. I'm currently open to work.

I often say research is only 10% of the job. Below is a high-level view of my process to ensure research effectively serves teams and organizations.
Evaluate the organization and operating model. Assess structure, values, and processes, including how research and insights function today
Identify stakeholders and product priorities. Understand product strategy, goals, and prior experience with research
Assess research needs at multiple levels. Define needs across the macro (market, ecosystem) and micro (features, experiences)
Leverage existing knowledge. Use prior research and subject-matter expertise to avoid duplicative work
Prioritize with the team. Align on what to learn based on impact, risk, and timelines
Scale research when needed. Establish programs (e.g., research communities, rolling research) to support ongoing learning alongside foundational work
Engage stakeholders throughout. Keep teams aligned by sharing early signals and emerging insights as the work evolves
Analyze and translate insights. Synthesize findings and develop frameworks (where relevant) to support strategic decisions
Support ongoing integration. Ensure insights are carried forward into product development and organizational practices
Courtney Barton
Examples of work informing product strategy, experience evaluation, and launch decisions.

Rift S:
Launch Readiness
Led development of a UX framework to evaluate product readiness and support launch decisions.

Portal for Business:
Product Strategy
Defined product direction using jobs-to-be-done research to inform feature development and GTM strategy.

VR UI Redesign:
UX Metrics Evaluation
Crafted UX metrics to evaluate a system UI redesign, comparing performance to the existing experience.

CONTEXT
Rift S was a new VR headset requiring evaluation of both hardware and system experiences ahead of launch. The team needed a structured way to assess readiness across the end-to-end experience and identify remaining user risks prior to release.
PROBLEM
How do we determine whether the Rift S experience is ready to launch, and what UX issues must be addressed before release?
MY ROLE
Led UX research for Rift S, owning the end-to-end software experience and coordinating closely with the Rift S hardware research team. Partnered with a quantitative researcher to develop the measurement framework and led its application in product readiness reviews.
APPROACH
Developed a multidimensional hardware and software UX measurement framework evaluating effectiveness, understanding, efficiency, and sentiment across the Rift S experience
Applied the framework across key product areas including onboarding, safety and guardian setup, content discovery, navigation, and system controls
Conducted pre-launch research through lab studies combining survey data and observational methods
Partnered with data engineering to enable post-launch measurement through surveys and behavioral data logging
Integrated UX evaluation into product reviews alongside engineering metrics and risk assessments
IMPACT
Decision Support
Provided a structured view of UX readiness across the end-to-end experience
Enabled identification and prioritization of UX risks ahead of launch
Integrated UX evaluation into product reviews alongside engineering metrics
Benchmarked against the prior Rift to assess parity and improvement
Outcomes
Enabled go/no-go launch decision
Established a UX risk backlog to prioritize pre- and post-launch work
Scaled the framework across hardware and software experiences
Informed follow-on workstreams addressing user education and system understanding

CONTEXT
Portal was expanding beyond consumer video calling into business use cases. Multiple teams were exploring overlapping opportunities without a clearly defined product direction or shared understanding of the problem space.
PROBLEM
What role should Portal play in business environments, and how should the organization prioritize product development across competing opportunities?
MY ROLE
Led research to define the problem space, align cross-functional teams, and inform product strategy for Portal for Business.
Synthesized existing research and prior product efforts across teams
Interviewed internal stakeholders across product, design, engineering, and data
Identified overlapping workstreams across Meta and gaps in understanding
Defined key open questions to guide research and decision-making
Established a research roadmap aligned with leadership priorities
Built systems to keep stakeholders engaged and aligned (research library, working sessions, ongoing communications)
APPROACH
Define the problem space
Synthesized prior research, product efforts, and adjacent initiatives
Clarified what was known, assumed, and still needed to be answered
Align the organization
Engaged stakeholders across teams to build shared context
Identified overlapping efforts and opportunities for coordination
Prioritize and execute research
Developed and aligned on a research roadmap across product areas
Designed and led a mixed-methods study to understand how remote workers manage tools, communication, and workflows
Conducted qualitative research with both newly remote and established remote workers to capture differences in behavior and needs
Followed with quantitative research to size and prioritize key jobs and unmet needs
Developed a JTBD framework focused specifically on the role of a secondary screen in supporting productivity, organization, and communication
Activate insights into strategy
Translated research into product direction, mission, and principles
Partnered with product and design to shape feature prioritization and planning
Supported go-to-market and positioning decisions
IMPACT
Established product direction
Defined how Portal could serve business use cases across remote and hybrid work
Clarified priority opportunities across product areas and user needs
Aligned cross-functional teams
Created shared understanding across product, design, engineering, and leadership
Reduced fragmentation across parallel efforts in other Meta product orgs
Scaled research as a function
Built research infrastructure (roadmap, channels, library) to support ongoing decision-making
Enabled continuous insight generation across prototype, alpha, and beta phases
Outcomes
Informed product strategy and feature prioritization
Contributed to mission, vision, and principles for Portal for Business
Supported go-to-market planning and product positioning

CONTEXT
The system interface for Oculus was redesigned to help users access content more quickly and support future capabilities like multitasking. Early usability testing identified friction, but the expectation was that these challenges would diminish as users adapted to the new interface over time.
PROBLEM
Were early usability issues simply a function of unfamiliarity, or did the redesigned system introduce persistent usability challenges that would impact long-term experience?
MY ROLE
Led the evaluation of the redesigned system interface and developed a UX metrics framework to assess performance across key dimensions.
Designed and implemented a multi-dimensional UX measurement framework (E.A.S.E: effectiveness, awareness, sentiment, efficiency)
Led quantitative and qualitative evaluation of the redesigned system
Defined UX KPIs across core system components (navigation, panels, system controls)
Synthesized findings across survey and follow-up interviews
APPROACH
Designed a survey to evaluate system experience after sustained use (~14 days)
Surveyed three user groups to capture different perspectives on system experience:
New users on the original system (~14 days of use)
Existing Rift owners using the current system
Users who opted into the redesigned system (~14 days of use with new system interface)
Used this structure to approximate comparative performance between original and redesigned systems
Conducted follow-up interviews to understand perceptions and pain points with those using the redesigned system
KEY FINDINGS
Usability challenges persisted beyond initial exposure. Core issues observed in early usability testing (e.g., getting lost, difficulty navigating) remained after extended use
Redesigned system underperformed relative to the original. All UX metrics for the redesigned system lagged behind the original interface benchmarks
Navigation was the primary driver of experience breakdown. Ease of navigation strongly correlated with: ability to find content, confidence using the system, and frequency of feeling lost
Directional signal vs. long-term potential. Despite the friction, most users still viewed the redesign positively, as a step in the right direction
IMPACT
De-risked full release of the redesigned system. Surfaced critical UX gaps, informing the decision to continue iteration before expanding to all users
Challenged a critical product assumption. Demonstrated that usability issues were not solely due to novelty and did not resolve with time
Prioritized core system improvements. Identified navigation and information architecture as key areas requiring refinement, which informed follow-up design sprints
Established a scalable evaluation framework. Provided a structured approach for ongoing system measurement and iteration