Behavioral Health Data Dashboard

UX Research

The Behavioral Health Index (BHI) is a data dashboard website that lets health service providers see and act on behavioral health needs within their area and across the state.

What is behavioral health, and what services and data relate to it? Behavioral health refers to conditions related to both mental and physical wellbeing. Mental illness, intellectual disabilities, substance abuse, eating disorders, and other addictions are some of the conditions that fall under behavioral health treatment. In Virginia, those public services are provided at the state level by the Department of Behavioral Health and Developmental Services (DBHDS) and at the local level by Community Service Boards (CSBs). Behavioral health data typically measures the rates of behavioral health conditions in an area (data referred to as 'outcomes'). For this project, the data also measures social determinants, societal factors like education levels that are shown to influence behavioral health outcomes (data referred to as 'indicators').

Project's goal: Create a data dashboard website that provides analysis and data visualizations of behavioral health across Virginia for policy advocacy, programming, and fundraising.

Execution:  I was the project's UX researcher and product designer, working with an internal team, external stakeholders, and users to create the site.

Virginia Department of Behavioral Health and Developmental Services (DBHDS)
Project Year
2022 (Ongoing)
Sole Designer
Proposal Writing
Knowledge Mapping
User & Stakeholder Interviews
Affinity Mapping
Comparative Assessment
Strategy Workshop
Data Visualization
Project link


Project Objectives

  • Provide service-provider leadership (and related non-profits) a unique way to analyze and visualize behavioral health data and determine community needs, identifying services and programs to offer and gaps in current care.
  • Translate data into an accessible public health 'story' about the areas they serve and those they overlook.
  • Help justify the client's funding re-allocations across the state.
  • Shift the state's conversation about behavioral health upstream to prevention.


Stakeholders may assume internal users have a different level of knowledge than they actually do. Internal users, Community Service Board (CSB) staff, will be the site's primary users, while external users will be secondary. Primary (internal) users could be hostile to this project because it uses the same data that has changed how much funding the state agency has given each regional office (CSB) in recent years. Primary users will also need more data than this website can provide.


Research Methods

In January 2022, I outlined a set of learning goals based on the client's early objectives for the project.

In total, I conducted the following research activities:

  • Knowledge Mapping
  • User & Stakeholder Interviews
  • Affinity Mapping
  • Personas
  • Comparative assessment
  • Content Strategy

 Knowledge Mapping 

What do we think we know? What do we want to learn?

Creating a knowledge map early in the process helped me identify gaps in knowledge about our users, from which I developed our interview questions.

Spreadsheet of a knowledge map outlining early knowledge and assumptions about our users, so we could determine what we should find out in interviews.

 Conducting Interviews

Gathering requirements from users, project stakeholders, and subject matter experts

Writing Interview Questions: Having identified assumptions and gaps in my knowledge about users, speaking to them was the obvious next step. I drafted user interview questions aimed at filling those gaps and verifying those assumptions. For stakeholder interviews, questions were aimed at understanding the project goals and metrics for success from the state agency's perspective. In both cases, I was careful to keep the questions open-ended and non-leading.

Fourteen interviews captured varying perspectives inside and outside the organization: I interviewed stakeholders, internal and external users, and our data analyst.

Stakeholder Interviews: For an organizational perspective, I interviewed the project leader within the state agency (DBHDS). She helped me understand the essential structure of the organization, their goals for the project and how those fit into the organization's mission, and the politics surrounding the project's subject matter.

Subject Matter Expert Interviews: For essential background information, I interviewed our internal data analyst. He served as a subject matter expert on the data, helping me to understand its breadth and limitations.

User Interviews: Our internal user interviewees were CSB workers. User interviews are always invaluable, but for this project specifically I wanted to home in on which types of CSB workers would use the website and how, gauge the knowledge levels our stakeholders assumed, and see what opportunities their expertise could lend to the design. It was also necessary to get perspectives from people working in different (urban, suburban, and rural) regional offices across the state, since we were told that opinions varied dramatically between locations.

Interviews Part Two, External Users: Following our stakeholder interviews, I interviewed our data analyst to learn how our dataset stacked up against our users' and stakeholders' hopes for what the data might convey. Lastly, in response to our research summary (described below), which identified a lack of data on external users, we returned to interviewing people the project stakeholder identified as representative of that user group. I drafted another set of interview questions and interviewed people in various professions related to behavioral health, creating transcripts of each interview.

Screenshots of Zoom interviews with users.

Building Goodwill for Adoption: It’s important to note that this exercise established the relationships needed to do future user testing and built goodwill towards DBHDS and the project. Most of the users we talked to were eager participants and expressed gratitude for being able to share their opinions. They said they had not been asked for their opinions or listened to in a long time. That goodwill, paired with catering to user needs in design, will create the greatest likelihood of user adoption and engagement.

A spreadsheet organizing user interview answers.

Key Insights from User Interviews:

 Interview Synthesis

Turning Transcripts into Product Mandates

Affinity Mapping: Once the interviews had concluded, we analyzed the qualitative data. With fourteen interviews completed, the amount of data was overwhelming; it wouldn't have been possible to pull out insights from memory. To synthesize what we heard, I set up marathon work sessions with my project manager/content writer. We reviewed the interview transcripts and used a digital collaboration space to put notable comments onto virtual post-its. Afterward, we analyzed them, grouped them into themes, and extrapolated key insights and product mandates.

Digital post-it notes with takeaways from user interviews are organized into sections, such as data literacy or data needs.

Image of a project brief outlining the project's scope, goals, and people involved.

Project Brief: To make sure we agreed on the implications of our interview findings, we distilled what we heard into a project brief. It was also a valuable artifact to present to stakeholders to check whether our ideas about the project aligned with their vision, so we included it in a report summarizing our interview findings for stakeholders.

 Strategy Workshop 

Focusing on what's feasible and most impactful for all

Prioritizing features and focus areas: Users gave us a long wishlist of features and information they wanted to be included in the website. Our resources, time, and technical constraints wouldn't allow us to implement them all. We needed to figure out what was feasible, so we made a 'yes, no, maybe so' list that balanced business goals and user needs against constraints.

Image showing a list of ideas for the website, divided into three categories: ideas we can and should implement; ideas we may be able to implement in this scope or a future scope of work; and ideas we should not implement based on user feedback or are unable to implement because of constraints.
Our 'yes, no, maybe so' list balanced business goals and user needs against constraints — and identified what was feasible.

Catering to our primary users: Another challenge we faced was that our stakeholders wanted to appeal to a broader external audience than we thought would realistically use the website. Our internal users were the most likely to use the site, had the most to gain, and would deliver the most impact for the organization, so I determined that they should be prioritized. If we prioritized both internal and external users, we risked watering down the content to the point where it wouldn't appeal to our primary users. So we had to figure out how to balance catering to each group appropriately without alienating the other. Still, we also needed stakeholder buy-in on prioritizing the internal users. The next step would be presenting our takeaways to our client.

 User Research Report 

Reporting the results to align stakeholder wants with user needs

What did I do? I made two reports of interview findings: one for internal use, with a longer recommendations list and a parking lot for ideas, and a more succinct version for the stakeholder. Both outlined the information gathered and analyzed from user and stakeholder interviews, and both included evidence-based research pulled from a literature review I assembled to back up the design strategy.

Why did I do it? To get alignment with stakeholders about how we intended to move forward with the project. The report served as an overview of how the insights from interviews would inform both the design and content development of the DBHDS web tool. But we also did it to get stakeholders thinking more soberly about the project's users and constraints. We needed to get across the idea that CSB workers would be our primary audience: they would likely far outnumber external users, their needs were greater and easier to address by catering the site's content to them, and, unlike with external audiences, the client had direct lines of communication with them to push for adoption. Without an outreach campaign for the external audiences, it was unrealistic to expect them to become a primary user group. So while we would accommodate an external audience, the site's content would be geared to our primary internal audience: CSB workers. Not only did we present the merits of appealing to this group, we also described how we would cater content to knowledge levels and interest areas that our stakeholders weren't aware of.

What did it accomplish? By presenting the realities of CSB workers in the context of this web tool, the report moved stakeholders closer to internal user needs. Although they reasserted their goals for an external user base, the conversation resulted in our being connected with external users to interview. The project stakeholder confirmed that the report "did an excellent job at extracting key components to hone in on for the website build-out."

Image showing the user research and literature review documents.

“The report did an excellent job at extracting key components to hone in on for the website build-out.”

DBHDS Project Stakeholder
The ten-page user research report, with strategy backed up by evidence-based reasoning in the literature review.


 Personas 

Guiding Design Decisions with Empathy

What did I do? I created a set of personas to represent our users. Four of the five are internal CSB workers with different roles, and therefore differing priorities and behaviors for using the website. The fifth persona (purple) represents the external audience, characterized by an advocacy strategist who works at a non-profit in a related field (veterans affairs, housing coalitions, etc.).

How was it helpful? As I began sketching out the structure of pages and outlining content areas, it was easier to validate my ideas by asking whether a decision would serve the users. Rather than poring over interview transcripts or tracking down a post-it note, personas are a much more succinct encapsulation of users' needs to refer back to. They were a useful tool for remembering both user groups and how to prioritize their needs appropriately.

 Comparative Assessment

Identifying the Strengths of Existing Products and Websites

Software: As I was working with a developer to explore software solutions for the site, I looked through various options. Users told us about many types of software they used to organize and analyze data, so if we did integrate software, I figured it best to use one that would be free to the client or familiar to the users. Ultimately, we decided against BI software integration, but it was helpful to explore the features users would already have come to expect from a data dashboard.

Websites: I reviewed websites with similar data sets and identified opportunities to provide something uniquely valuable. As with the software, I identified what relevant features we could consider adding to our dashboard.

Overlapping screenshots of the software interfaces of Yellowfin and Power BI. In front of them is an image of a spreadsheet outlining the strengths, weaknesses, and opportunities of those products as they related to our project.
I analyzed software that was comparable or familiar to the users, as well as websites that were similar in scope to our project.


Exploring Different Forms the Design Could Take

Supporting Secondary Users but Catering to Primary Users: I created initial sketches of the main pages. I made the front page lay-person friendly, but the deeper you venture into the website, the more targeted and complex the information becomes, knowing that internal CSB users, with their greater interest in and knowledge of behavioral health, will be the ones digging around the site. Keeping in mind that CSBs tend to think of themselves and their problems as singular, and that they have a lot of hometown pride, CSB users needed to feel the content and features provided were unique to them.

Content for Varied Time and Data-Savviness: We learned that some CSB workers lacked an understanding of the data, some would browse, and some would look for something specific, but all of them lacked time. CSB pages provide a snapshot of the region's performance on a selected set of indicators for those who don't have time to browse but want to look at the most relevant data on their area. For those with limited data knowledge who needed a better understanding of what they should be looking for, a custom query would allow them to browse outcomes or indicators correlated with behavioral health.

Unique page backgrounds: CSBs are not likely to be satisfied with a one-size-fits-all approach to CSB pages. Even though the pages are templated, we needed to be very intentional about opportunities for individual character to come through so that these pages are more likely to be embraced. That could mean an image or illustration that characterizes their region, an introduction that highlights demographic information along with the executive summary, or varied style templates (color schemes, textures, icons) for urban, rural, and suburban CSBs. For example, a weather app uses the same layout template with backgrounds that change with the weather and time of day.

Highly applicable and sortable content: Instead of giving users a generic data guide and a separate primer on social determinants, we combined them. Our user interviews suggested a higher-than-average understanding of social determinants, making a primer redundant. Instead, I thought it better to provide a data guide with a social determinants lens that explained how users could apply social determinants concepts to their work. In addition, the examples given in the guide change based on the user's selection of what type of area they're working in (e.g., rural vs. urban).

My sketches could only be a starting point for feedback; we'd need to approach the rest as a team.

Image showing early sketches of the site's framework.


Settling Design Decisions as a Group Before Moving Further

Teamwork makes the dream work! Despite all the research we had done, we still needed to come to a consensus about how to apply it to the design. There were decisions about content and data that I couldn't make alone. Sketchboards would allow us to work faster and more collaboratively while giving me what I needed to move forward. I set up the conference room so our most important research was displayed next to blank wall space for sketching. I presented blown-up versions of my initial sketches for feedback, outlining the areas and decisions that needed more input. I talked through content decisions with writers and talked with EPIs and analysts about my ideas for the data. Together we started making edits to existing ideas and sketching new ones.

An image of an office wall with taped-up notes and diagrams.



We were able to glean from user research a set of opportunities that extended beyond the bounds of the project and into organizational and operational changes. Although many ideas were clearly out of scope, I provided the list of problems and recommendations to the client for their consideration. This gave the client additional value and may lead to more funding opportunities for the Center. So, although we were not funded to create custom CSB pages, we're hoping to build these in the next grant cycle with added user-requested datasets.

Lessons Learned

For the sake of the project's flow, I should have asked earlier to be connected with external users to interview; I didn't because I underestimated stakeholder interest in engaging this user group. Interviews backed up my belief that external, public-sector folks would not be the primary users of the site, both because there was no formal dissemination strategy in place for this group and because interviewees told us the site would not be often frequented unless additional information or datasets were added.

 Next Steps 

  • Content Inventory
  • Wireframing
  • Card Sorting and User Testing Sessions
Whiteboard showing the thought process behind balancing the organization's needs and the user's needs by examining reach and impact based on appealing to general vs. specific knowledge levels.