Background: I created a website that provides analysis and visualizations of public health data at the state and national levels, using data analysis from the Center on Society and Health and funded by The Robert Wood Johnson Foundation.
Project's goal: Deliver educational content alongside a large, complex data set in a simple user interface that government agencies and non-profit organizations could use to advocate for health policies to state policymakers.
Execution: I partnered closely with my interdisciplinary teammates inside and outside the organization (including epidemiologists, data analysts, geospatial engineers, marketers, writers, and developers). I also assembled an advisory board of former health officials and state policymakers to ensure that the product provided value to both sets of users. We delivered an elegant solution on time and within budget.
A little background on spearheading the project’s move from print to digital: In early discussions of how to communicate our updated findings in phase II of the project, I reminded the team that phase I’s plan of a single report had ballooned into a whopping ten reports of content, which had taken years of back and forth to complete. Not only did a move to digital represent space-saving and dissemination advantages over a finite medium like print, but it also had the pragmatic advantage of being more likely to be completed on time if content were to expand again.
To determine any process or plan for the project, I needed to understand the project goals and subject matter before I could define the important decisions to be made. I drafted a set of questions about the project's purpose and topic, which came to be the basis of my generative research plan.
In order to get answers to my questions, I conducted the following research activities:
Understanding the Data: I conducted stakeholder interviews with epidemiologists and data scientists. In addition to understanding the value and limitations of the dataset, we identified who would be the likely users of a dataset like this.
Identifying Our Users: The primary users of the site would be policy influencers (i.e., agency analysts, health department epidemiologists, and advocacy groups). They would be able to get the information into the hands of state-level leadership (i.e., governors, state legislators, agency heads) with the end goal of influencing policy decisions in a way that improves environmental conditions/health outcomes.
Defining Internal Goals: Rather than emphasizing the state rankings, The Center wanted to emphasize the data's 'big picture' implications, how variables were interconnected, and the drivers of health outcomes. The challenge, they believed, would be explaining what some might see as "abnormalities" by unpacking how to interpret the data using a social/ecological model — better known as ‘the social determinants of health.’ This model also explains how certain seemingly-unrelated variables can be proxies for one another.
Questioning Assumptions About Users: Realizing this concern rested on another assumption about our users' knowledge level, I made user research an essential next step. To figure out how best to present the data to them, my job was to evaluate our primary users’ data-savviness and gauge their understanding of the social determinants of health.
What We Did: With the connections of our director emeritus, we assembled an advisory board of three former health officials and state policymakers. Surveys captured insights from users about the project. Not only did I engage users at the outset of the project, but I also sought feedback from them throughout, following every significant design phase.
What We Learned: Although awareness of social determinants was high in some policymaking circles, it wasn’t universally so. Interest groups and advocacy users were even less likely to have a comprehensive understanding of the policy implications of social determinants. We also learned that users prioritized ease of use, breadth of data, exportable data/visuals, and global social determinants messaging.
Looking to set ourselves apart: After reviewing websites with similar data sets and looking for opportunities to provide something unique/valuable, I found we could distinguish ourselves in a few different ways.
The Breadth of Data: For example, the websites Measure of America and America’s Health Rankings offer state-level health data but limited variable options.
Ease of use: Their interfaces required a decent amount of digging to get information about a particular state or variable.
Relating Social Determinants of Health to Health Outcomes: Additionally, few attempted to shed light on interrelationships between outcomes and indicators related to the social determinants — i.e., how education (indicator) relates to poverty (outcome).
Access to the Data: In order to apply the data, users cited exportable data/visuals as a top priority. Aside from The Opportunity Atlas, most other websites didn't make it easy to download or distribute graphics.
The breadth of data, ease of use, ‘shareability,’ and information on interrelationships between social factors and health outcomes were areas I identified that would set our website apart and add value to what currently existed.
Once research had provided the answers I needed to feel sufficiently informed on what I was solving for, it was time to move on to the design phase of the project.
To bring the project to life, I used the following methods:
Addressing Stakeholder Fears: I created a sitemap because stakeholders had many internal conversations about how much content could or should be included without exceeding our scope.
Getting Alignment on Scope: The sitemap showed stakeholders the site's size, organization, and purpose. This shaped expectations, generated consensus on what would be built, and created a rough cost estimate. Their reactions also helped give me clarity and validated my assumptions about how many pages were needed to do the jobs required to meet all of our objectives — without going over budget.
Figuring out Navigational Flow: Five main pages were needed to serve our users. The homepage would bring users straight to the national data, which would click through to a state page template with data specific to each state. A third page would offer policy resources for taking action on the data. A fourth page would provide a primer on the social determinants, and a fifth page would explain more about the project and offer the raw data for download.
Spotting Gaps: Wireframing was done in tandem with content strategy. I approached the process iteratively with a content writer; switching between the two helped us outline a content inventory and identify problematic content areas that needed an additional strategy.
Identifying Misalignment with Project Objectives: Some content we developed was misaligned with the project's overall goals; content areas needed to provide connective tissue were missing; and some content areas were so abundant that housing them would needlessly complicate the site structure.
Keeping the Project On-Track: This approach offered a straightforward way for the team to focus on what other information they needed to produce and provide to move forward. The sketchy style also allowed a group of very literal thinkers to see past the aesthetic presentation and focus on information architecture.
Turning Numbers into a Website: Raw data does not make a website, so I knew it would take many rounds of review to take raw data, methods, and principles and turn them into the website our users wanted. So while I was wireframing and prototyping, I was also working with content writers on a content strategy determined by our users’ priorities and knowledge levels and guided by best practices from NN/g and our data analysts. I produced content that was needed but didn't exist, edited existing content that was unclear and esoteric, reduced needless content, and reorganized disordered content.
For example, variable reduction, renaming, and categorization had a real impact: rather than scrolling through an endless list of inconsistently written variables, these changes helped both browsing and seeking users find information among many options more quickly and intuitively. Instead of needing to know the subjective name of a variable precisely as we’d written it, categorization helped browsers who were concerned with a subject find more than one variable of interest, while fuzzy search helped a seeker locate specific variables immediately.
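As a rough illustration of the fuzzy-search idea described above, the sketch below uses Python's standard-library `difflib` to match a misspelled query against a variable list. The variable names and the `fuzzy_find` helper are hypothetical, invented for this example; the site's actual implementation is not described in this case study.

```python
import difflib

# Hypothetical variable names, for illustration only.
VARIABLES = [
    "Child Poverty",
    "High School Graduation",
    "Adult Obesity",
    "Air Pollution",
]

def fuzzy_find(query, variables=VARIABLES, n=3, cutoff=0.5):
    """Return variables whose names loosely match the query, best first."""
    # Match case-insensitively, but return the original display names.
    lowered = {v.lower(): v for v in variables}
    hits = difflib.get_close_matches(query.lower(), lowered, n=n, cutoff=cutoff)
    return [lowered[h] for h in hits]

# A misspelled query still surfaces the intended variable:
# fuzzy_find("child povrty") -> ["Child Poverty"]
```

The point is the user experience, not the algorithm: a seeker who remembers a variable only approximately still lands on it without scanning the full list.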
In the end, my efforts provided:
Translating Data Through Storytelling: The storytelling on our primer on social determinants (the “learn more” page) necessitated an illustrated approach to reinforce the points made throughout the narrative. However, as a solo practitioner at the time, I had neither the capacity nor the time to create illustrations from scratch. With a stretched budget, I sought out existing illustration work that I could purchase and customize to fit our narrative. Italian illustrator Mirko Grisendi’s style and themes blended perfectly with the project.
Researching Map Design for Data Visualization: Traditional map projections of the United States present interactivity and data visualization challenges. By researching various methods of geographic data visualization, I discovered the Visibility Basemap. The map balanced the scale between large and smaller Northeastern states while simplifying their shapes. States were instantly recognizable by their relative location and characteristic shape. Besides making smaller states easier to see and compare, it eased the challenges of interactivity and labeling. Once a user navigated to a state’s page, a higher-fidelity, zoomed-in map of each state was needed. Unable to find a map that worked, I re-drew a custom map combining elements of the Visibility Basemap and a traditional projection. It had the benefits of making states recognizable up close and far away while keeping the geometric sensibility. The custom map fulfilled a dual purpose, creating a recognizable landmark on the state page and a more cohesive transition for the user’s experience.
Creating a new style guide: Developed in tandem with designing the lo-fi prototype, I created a logo and style guide for the user interface and determined the infographic and illustration approach. I kept the aesthetic friendly but earnest.
The visual design was a continuation of the design work I had done for phase I of the project, then called “Health of the States.”
Typography: I developed a type system with a font trio that complemented one another. Each typeface had a role: a condensed sans for saving space with lengthy variable names, a typeface our users were likely to associate with grassroots political movements, and a clean and simple sans workhorse.
Color Palette: The challenge was to use national themes without implying political parties, especially in the use of colors. I intentionally developed an alternative to red, white, and blue that was reminiscent of our national colors, then added hues like yellow to steer away from a partisan tone.
Steering Stakeholders in Reviews: I chose to create two phases of prototyping, starting with lo-fi prototypes while working on the content and style guide, then moving into hi-fi prototyping. Each stage of prototyping had a number of iterations. With each iteration, I added a new layer of information to evaluate, reducing reviewers' cognitive load. As a result, reviews went more smoothly and were mutually beneficial. By prolonging the “draft” stages of the website for reviews, I could steer the focus to more critical topics without folks getting distracted by window dressing.
Getting what I needed to move forward: For instance, lo-fi interactive prototypes facilitated conversations about content, the structure of the pages, and how the experience would unfold. This approach saved everyone time, effort, and frustration that would have typically resulted from unnecessary back-and-forth about style. The style guide provided an opportunity to get feedback on the aesthetic before applying it.
By the time I created a hi-fi prototype, reviewers could be confident that everything else had been taken care of, so they only needed to look at how well the interactions were performing.
Although user feedback celebrated the site's simplicity and ease of use, there was an obstacle to adoption that could not be overcome. Data analysis took three years out of the four-year grant period. By the time the site launched, the data was considered too old, making it challenging to promote and gain traction. In addition, without an ongoing funding source, the project lacked the resources to update the data, and the site ultimately languished. If I had asked users how long the data would be considered relevant, rather than assuming our researchers knew, I would not have pursued a website until the data was updated or we had already secured funding for an update.