Lifelong Analytics: Equity, Data Viz & Precision Medicine

About the Author: Emily Chu is a Data Visualization Developer and Motion Designer currently focusing on data visualization engineering, machine learning and interactivity. Her background spans program management, research, business, design and technology. (MS Data Visualization, Parsons School of Design) | Website: 3milychu.github.io

 

The Spotlight on Healthcare Data

The tidal wave of new healthcare technologies is dizzying. Telehealth, artificial intelligence, AR/VR, 3D printing and patient monitoring promise that the future of medicine will be more efficient, affordable, secure and personal. Most advancements spotlight healthcare data as the foundation: we are capturing it, sharing it, making use of it and disseminating it in ways we’ve never done before.

Healthcare Data’s Rising Value and Impending Inequity

Consider this year’s World Economic Forum annual meeting in Davos, where key industry leaders stated that, using the global healthcare data available to them, machine learning will be able to pinpoint the most effective treatment for an individual. At the current rate of data representation, however, health systems will be far poorer at offering efficient, highly accurate treatment for individuals who are not of European or East Asian descent.

Meanwhile, the momentum behind capturing healthcare information is driven by heightened awareness of its value and security. Companies like Nebula Genomics, for instance, offer to sequence your genome and secure it on a blockchain, where you control the transfer of that packet of information and it goes only where you send it. In a consumer-driven healthcare world, which customers will have the privilege of understanding what this even means?

What we can do with healthcare data can level the playing field.

We can make it secure and affordable for everyone, regardless of condition, race, or socioeconomic background, to receive the most effective treatment available. Looking at the typical health system infrastructure, where do we start?

Enter Electronic Health Records

Electronic Health Records and Electronic Medical Records (EHRs/EMRs) are now a standard method of healthcare data storage and exchange. Patients can view an electronic copy of their medical records, and physicians can share test results with patients. This can be thought of as the start of healthcare data consumerization. It is perhaps the perfect training ground to help the most vulnerable populations understand –

  1. how valuable their healthcare data is and
  2. how to harness it to improve their health and receive the most affordable, effective treatments in the future.

Since its inception, we have learned that approximately half of the U.S. population encounters difficulties in comprehending and utilizing their health information, ushering in the need for a “visual vocabulary of electronic medical information to improve health literacy”. In 2014, a study revealed that 63.9% of EMR survey respondents complained that note-writing took longer, and as of 2017, 94% of physicians in a survey were overwhelmed by what they believe to be “useless data”.

Visualizing Healthcare Data for the Most Vulnerable: From Collection and Adoption to Accuracy and Feedback

One important step is to get the most vulnerable populations – lower-literacy individuals, patients with chronic or debilitating conditions, the elderly – to find real use in capturing data, and enjoyment in doing so. The following demonstrates an example of how this updated electronic health record might function.

From Integrated Treatment Adherence to Responsive Feedback to Lifelong Analytics

In Visual 1.0: Simple Gamification of Healthcare Activities (below), for example, the patient is first shown how medications and healthcare tasks such as “take your blood pressure” can be gamified in a simple user experience to encourage data collection.  

Visual 1.1: Progress Over Time (below) shows how collecting vitals and treatment plan adherence might then be synced and displayed in the record shared with physicians. 

In Visual 1.2: Breakout View of Healthcare Activity or Biometric Marker (below), consider that the main dashboard view can be broken down and analyzed easily by physicians.

Visual 1.3: Condensed Progress Summary and Feedback for the Patient (below) then illustrates closing the feedback and health-comprehension gap that is often left open after treatment by condensing the analytics into a simple progress view over time. Recommendations for the medical aspect (i.e., treatment plans) or maintenance behaviors (e.g., exercise) are adaptive. For example, at predetermined check-in intervals, or when a tracked metric crosses a certain threshold, the treatment plan adapts based on the level of adherence or on other care plans that were implemented. Finally, consider that patients should be able to view future states assigned to them by predictive analytics (not pictured). In this scenario, which I would call Lifelong Analytics, individuals securely own all their healthcare information and can compare how predictive models place them in the future.
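The threshold-triggered adaptation described above can be sketched in code. This is a minimal illustrative sketch, not part of any real EHR system; the field names, thresholds, and recommended actions are all hypothetical assumptions:

```python
from dataclasses import dataclass

@dataclass
class CheckIn:
    """One patient check-in (hypothetical fields, for illustration only)."""
    adherence_rate: float   # fraction of scheduled tasks completed, 0.0-1.0
    systolic_bp: float      # one example of a tracked biometric, in mmHg

def adapt_plan(checkin: CheckIn,
               adherence_floor: float = 0.6,
               bp_ceiling: float = 140.0) -> list:
    """Return plan adaptations triggered at a check-in.

    Runs at predetermined intervals or whenever a tracked metric
    crosses a threshold (both thresholds here are made-up defaults).
    """
    actions = []
    if checkin.adherence_rate < adherence_floor:
        actions.append("simplify regimen / add reminders")
    if checkin.systolic_bp > bp_ceiling:
        actions.append("flag provider for treatment review")
    if not actions:
        actions.append("no change; continue current plan")
    return actions
```

A patient who completed half their tasks and logged an elevated reading would trigger both adaptations, while a fully adherent patient with normal readings would see no change.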

By using the electronic health record as a catalyst to drive data collection and adoption among the most vulnerable, we are securing a pool of representative information for groups that may otherwise be left behind in the race for precise treatment at near-to-no cost. Along the way, through digestible habits and user-friendly actions, patients will be exposed to the power of documenting their healthcare information. Once individuals are empowered with their data and what it really means, we can imagine a future where people are quick to stand up for ownership of their data – and to ensure that advancements are made with their footprint in mind.

Takeaways

The poor, the elderly, the sick and the underrepresented have much to offer to the future of medical practice. They offer interesting challenges and high payoffs in cost efficiencies. When we consider a future where data will be dynamically classified and trends predicted, it is important to concentrate adoption among these groups. Some methods we discussed in this article:

Making treatment plans easy to track and adaptable

Treatment plans should be easy to track. Monitoring can be easily integrated into our routines, or in the future – automatically reported back to us. Providers should be able to understand what adaptive measures need to be taken should we miss a dose, or life interferes with our rehabilitation plan.

Making our medical history secure, transparent and shareable

Technologies currently exist to ensure that our healthcare information belongs to us and that we have ownership over where it is transferred virtually. Visualizing healthcare information with a visual vocabulary that demystifies our health history, and sharing it among all providers in our care network, can strengthen this transparency.

From responsive feedback to lifelong analytics

Consider a future where individuals with secure ownership of their healthcare data can access not only responsive feedback from their care providers, but also see how their lifelong analytics shift with each stretch of perfect treatment-plan adherence or each alternative care plan. In other words, imagine that what predictive analytics has to say about us eventually becomes comprehensible and accessible to us as individuals.

By visualizing healthcare information and making it readily accessible and comprehensible for the most vulnerable, we make it possible to deliver the most difficult treatments responsively and potentially risky treatments with transparency. In the end, this can teach an entire population how to build an infrastructure that prepares for and cares for us when we, too, age, get sick, or fall into disadvantaged circumstances.

Data Stories: What Space Oddity Looks Like

In the land of popular music, there has been no scarcity of fashion experiments, and David Bowie’s visual legacy takes up a large piece of that terrain. But what does a David Bowie song look like? Valentina D’Efilippo and Miriam Quick answer this question in their remarkable project.

Aladdin Sane cover. Outfit by Kansai Yamamoto; photo by Masayoshi Sukita, 1973

OddityViz – a data tribute to David Bowie – is a visualization project that gives ‘form to what we hear, imagine and feel while listening to’ Bowie’s hit Space Oddity. The project, a combination of ten engraved records, large-scale prints and projections, deconstructs data extracted from the song – narrative, texture, rhythm, melody, harmony, lyrics, structure, trip and emotion. The inquiries that went into the making of each of these records are even more interesting.

When making this, the ninth disc in the Oddityviz series, we asked ourselves: how can we tell the story of Major Tom so it could be understood by an alien?

The project took inspiration from a variety of references from popular culture, while the colour palette naturally recalls the darkness of space (black) and the stars (white). One can also see a reference to the Voyager Golden Records in the engraved dataviz format.

The final disc of the series illustrates the central themes of the song: the destruction of its main character, the bittersweet nature of triumph, the smallness of humanity in a vast, extended universe.

In her article on Muzli, D’Efilippo breaks down the process of creating this piece, comparing the ‘system’ of data visualization to music – one that is largely subjective and that which becomes more ‘meaningful and legible’ as we learn how to read it.

In my opinion, dataviz is more than a tool to render numbers, it’s a way to make sense of any experience and communicate the underpinning stories.

Read the full article here.

Evolution of the Data Artist

Defining Data Art is tricky. And for good reason. The mediascape that breathes around us is a terrain that shifts, distorts and transforms before it can be drawn. In such a space, defining can only be limiting. Jacoba Urist, in her comprehensive 2015 article in The Atlantic, explored the multifarious ways of the Data Artist.

Art is as much a product of the technologies available to artists as it is of the sociopolitical time it was made in, and the current world is no exception. A growing community of “data artists” is creating conceptual works using information collected by mobile apps, GPS trackers, scientists, and more.

                                                      Liberté (1963) – Joaquim Rodrigo 

In a series called Moodjam, (Laurie) Frick took thousands of Italian laminate countertop samples from a recycling center and created a series of canvases and billboard-sized murals based on her temperament … Frick is adamant that her work is about more than simply visualizing information—that it serves as a metaphor for human experience, and thus belongs firmly in the art world.

As Urist deftly puts it, working with (this) data isn’t just a matter of reducing human beings to numbers, but also of achieving greater awareness of complex matters in a modern world. Fast forward two years, and Cynthia Andrews speaks about the role of context in Data Art.

If you look at neural networks created by scientists with a creative eye you might see it as art. If you take it out of context, it could be a subway map or a series of rivers. It could be anything. It’s the non-creative context in which things are placed that makes people think they can’t be considered art.

Andrews expands on a specific genre of Data Art that Urist mentions –

Artists influenced by self-tracking.

‘Waiting for Earthquakes’ by Moon Ribas. She has a sensor embedded into her skin that, using seismic data, vibrates every time there is an earthquake in the world, from anywhere, any magnitude. ‘Waiting for Earthquakes’ is a performance piece in which she literally just stands on stage and waits for an earthquake to happen and then interprets the feeling that she gets into movement. I don’t know if she considers it data art, but I do.

And then there are artists like Shelita Burke, a pop musician who decided to use blockchain and music metadata not only to get paid on time, but to organize a decentralized system for distributing royalties across the production spectrum to the producers and writers involved.

Burke thinks it also has something to do with her use of data to her advantage, like when she determined that 90 days was the perfect interval for releasing new music in order to keep fans engaged.

“I really believe that every artist needs to understand data,” Burke says.

Internet Feudalism vs. Net Neutrality – Who Wins?

A few weeks ago, the FCC under the chairmanship of Ajit Pai voted to repeal net neutrality, a topic that soared in Google’s search trends this past December. Interest in the subject ranked by state (sub-region) is also quite unexpected, with Nebraska at the top – a state that has since become the first red state to introduce pro-net-neutrality legislation.

While much is being said on the subject, including recent legal resistance from several advocacy groups, the Internet Association, and corporations like Amazon, Google and Netflix, the debate within the wider media remains largely polarized, taking little account of the nuances and hidden realities of the current power structures within the world wide web.

Lana Polansky, in her article dives into ‘the emptiness of the myth of the internet as some great equalizer’ and what these feudalistic dynamics mean for independent artists, creators and small businesses even with the existing open internet.

large sections of the internet have been carved out and wholly controlled by major corporations and crowdsourcing and marketplace platforms. The virtual land is farmed for content, from which platform holders skim off profit in exchange for use of the platform.

It has always been difficult for people outside the more privileged classes to hack it as artists and intellectuals, but the break with tradition that the internet was originally believed to represent has now given way to a form of virtual feudalism.

Read the full article here.

The Trouble With Election Maps

The election map in the New York Times was the subject of plenty of conversations in the data visualization and cartography world yesterday. As much as we here at CDA love a good conversation about visual representation (and apparently, we like to do it in rhyme), this map did raise a lot of questions and concerns. In a post for CityLab, Andrew Small writes: “America needs a voting map that actually looks like America.”

Small continues:

But as people tee up to argue and theorize about what the electoral map means for the country, I’m reminded of a recent point of wisdom my colleague Laura Bliss made recently—maps aren’t facts, they’re starting points.

Read Small’s full post for his thoughts on where we can start.

A History of Data Journalism

We have been using data to explain our world for a long time, and data journalism is no exception. We have, as marketing strategist Andrea Lehr explains, been using data to help us tell stories for perhaps even longer than we’ve thought. In this interview with Kristen Hare at Poynter, Lehr shares some of the findings from her recent report on the history of data journalism.

When staffers at the marketing agency Fractl decided to look into data journalism, they went way back. Way back. As they note, a kind of data journalism was used in the Han dynasty.

“I was most surprised to learn just how long the concept has been around,” said Andrea Lehr, a strategist at Fractl.

In 1849, for instance, The New York Tribune used a chart to show how many lives were being lost to cholera.

Fractl has seen an increase in data journalism among the publishers it works with, so staffers compiled a report on the storytelling method. The agency also spoke with several data journalists as part of the project, including FiveThirtyEight’s Allison McCann and Nathaniel Lash of the Poynter-owned Tampa Bay Times.

Lehr spoke with Poynter about the report via email.

Read the rest of the interview with Lehr at Poynter.

Data Matters Interview Series: Kiersten Nash

Designer, artist, and educator Kiersten Nash likes asking questions. Asking the right questions has changed a lot for her, and getting the people who engage with her work to ask questions, too, is a big part of why she does the work she does. The question she’s been asking lately is “How can we raise awareness about groundwater?” She and her colleagues in the design collective Public Works Collaborative have been attempting to answer that through their recently completed project Livestream.

Livestream, an interactive sound sculpture installed in Lexington, KY’s Jacobson Park, is a project designed to get people asking questions about water – where it’s coming from, what’s in it, how it’s being monitored. It isn’t just an artwork, though; Livestream is designed to actively monitor the state’s groundwater using a custom-designed toolkit. This first iteration of the project, featuring sounds composed by musician Ben Sollee, “translates data measuring each spring’s conductivity, temperature and flow into sound.” I spoke to Kiersten recently about Livestream, her design process, and how “[un]learning” can be the key to asking the right question.

This interview has been condensed and edited for clarity.

 

Continue reading “Data Matters Interview Series: Kiersten Nash”

How Data-driven Programs are Reducing Gun Violence

Ted Alcorn at Wired brings us this great piece on how data-driven programs are being used in several US cities as a way to reduce gun violence:

At their core, data tell stories. They reveal patterns, show changes over time, and confirm or challenge our theories. And in cities across the country, mayors, police chiefs, and other local leaders are turning to data to help them understand and address gun violence, one of the most persistent crises they face.

Innovative, data-driven programs are showing encouraging results. To keep high school students on the right track, the city of Chicago scaled up a school-based program called Becoming a Man for seventh through tenth graders living in neighborhoods with high rates of violence. The students reflect on their life goals, observe how their automatic responses inside school and outside school differ, and learn to slow down and react more thoughtfully to these sometimes divergent social environments. An adaptive behavior on the street, like fighting back to develop a reputation of toughness that could deter future victimization, will be maladaptive in other social situations. To test the impact of the program, the University of Chicago Crime Lab built a rigorous evaluation into its rollout. After two years, they were able to show that participants were 50 percent less likely to be arrested for a violent crime than students in a control group, and those students graduated at a rate 19 percent higher than those who did not participate. This close analysis of the program affords new insight into what makes the program work, and how to enhance it and apply it in other settings.

Read the whole article at Wired: One Great Way to Reduce Gun Violence? A Whole Lot of Data