The Blurred Lines of Data Sovereignty

Data is often thought of as digital. It is also discrete, spatial, lateral and seemingly non-subjective. Data demands tags, classes, branches and categories in order to be meaningful. Data needs to exist in parts to be whole. The biases that inherently accompany the creation of Data, the choices of tags, the design agendas, the economic entities that commissioned the datasets, tend to dwindle into the ether around the captivating visualizations that take center stage.

To be visualized coherently, Data needs ‘cleaning up’, and clean datasets project outwards: they raise questions about everything but themselves. Clean datasets afford us the convenience of neat, untarnished algorithms. When these datasets are made open, placed in the hands of the public domain, the resulting algorithmic universe amplifies. We applaud and embrace the ideals of Open Data as a reflection and reassurance of digital democracy.

But what happens when Data wants (and needs) to be protected? What happens when communities that have collected, created, owned, applied and disseminated knowledge for generations employ methods of preservation that Data-as-we-know-it resists? What happens when it is crucial for certain kinds of Data to be both sheltered and communicated? What happens when Data refuses qualitative distillation, and its quantitative bulk is intricately tethered to lived experience that cannot be discounted?

Indigenous Data Sovereignty is one such domain, specifically the data concerning sexual violence against Indigenous women.

Crosscut interviewed Abigail Echo-Hawk, the Chief Research Officer of the Seattle Indian Health Board, about data collection and knowledge creation that are separate from, and not rooted in, Western methods of understanding Data.

The following are excerpts from the article.

“When we think about data, and how it’s been gathered, is that, from marginalized communities, it was never gathered to help or serve us. It was primarily done to show the deficits in our communities, to show where there are gaps. And it’s always done from a deficit-based framework.

“As indigenous peoples, we have always been gatherers of data, of information. We’ve always been creators of original technology.

“When I went to the University of Washington, I was able to take some of the Western knowledge systems and understand how that related to the indigenous. I recognized that the systems that were currently working towards evaluation, data collection, technology, science, and the way that we looked at the health of Native people weren’t serving my people, because they didn’t have the indigenous framework.

“Decolonizing data means that the community itself is the one determining what is the information they want us to gather. Why are we gathering it? Who’s interpreting it? And are we interpreting it in a way that truly serves our communities?

“[The Seattle Indian Health Board] had decided to not publish this information because of how drastic the data was showing the rates of sexual violence against Native women. There were fears that it could stigmatize Native women, and that would cause more harm than good. But those women had shared their story, and we had a responsibility to them, and to the story, and I take that very seriously.

“One of the ways that there is a continuing genocide against American Indians/Alaska Natives is through data. When we are invisible in the data, we no longer exist. When I see an asterisk that says “not statistically significant,” or they lump us together with Pacific Islanders and Asian Americans — you can’t lump racial groups together. That is bad data practice.

“I always think about the data as story, and each person who contributed to that data as storytellers. What is our responsibility to the story and our responsibility to the storyteller? Those are all indigenous concepts, that we always care for our storytellers, and we always have a responsibility to our stories.

Read the full article here.

How Fake News Really Spreads

Post-truth politics is a “political culture in which debate is framed largely by appeals to emotion disconnected from the details of policy”. The narrative takes center stage and factual rebuttals to the political agenda are consciously disregarded. Demagoguery assumes the role of the protagonist. Fake news, disinformation and hoaxes become the setting. And the Internet is the theatre. Since 2016, however, these post-truth plays have been staged largely on Twitter.

In a study commissioned by the Knight Foundation, Matthew Hindman of George Washington University and Vlad Barash of Graphika examined how ‘fake news’ actually spread across tweets in the months before and after the 2016 US presidential election.

A vivid interactive by Accurat carefully captures the fluid temporality of the Twitterverse, depicting the study’s more than “10 million tweets from 700,000 Twitter accounts that linked to more than 600 fake and conspiracy news sites.”

One myth this study debunks is that fake news is spread by thousands of small, independent sites. In reality, it is largely concentrated around a handful of websites; in the case of the 2016 elections, just 24 of them. A pattern that runs throughout these coordinated Twitter campaigns is that of clusters: networks of Twitter users who tweet at one another and link to disinformation from these sources. The study samples 13,861 such accounts as the most crucial in spreading fake news, and half of them were found to be automated based on their posting cycles.
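As a loose illustration of that “posting cycles” idea, here is a minimal, hypothetical sketch of how automation might be flagged when an account’s gaps between posts are suspiciously regular. It is not the researchers’ actual method, and the threshold values are arbitrary assumptions.

```python
# Hypothetical posting-cycle heuristic: flag an account whose gaps between
# consecutive posts are nearly constant. Illustration only, not the method
# used in the Knight Foundation study; thresholds are arbitrary assumptions.
from datetime import datetime, timedelta
from statistics import mean, pstdev

def looks_automated(timestamps, min_posts=20, max_cv=0.2):
    """Return True if the account's inter-post gaps are nearly constant."""
    if len(timestamps) < min_posts:
        return False  # too few posts to judge either way
    ts = sorted(timestamps)
    gaps = [(b - a).total_seconds() for a, b in zip(ts, ts[1:])]
    avg = mean(gaps)
    if avg == 0:
        return True  # many posts at the exact same instant
    # Coefficient of variation: human posting tends to be far burstier.
    return pstdev(gaps) / avg < max_cv

# Example: an account tweeting exactly every 30 minutes gets flagged.
start = datetime(2016, 10, 1)
clockwork = [start + timedelta(minutes=30 * i) for i in range(50)]
print(looks_automated(clockwork))  # True
```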

Check out the full interactive here.

Journey of a Meme: Culture Jamming & Elections

The meme is the basic unit of culture jamming, an idea that utilizes the conventions of mainstream media to disrupt, subvert and play upon the emotions of political bystanders so as to evoke change and activism.

On October 11, 2018, the hashtag #JobsNotMobs came into being, anchored by a viral ‘supercut’ video juxtaposing cable news’ use of the word ‘mob’ with footage of various recent protests. The meme slowly made its way through the crevices of social media, across 4chan, Reddit, Facebook and Twitter, into President Trump’s Twitter feed.

In their New York Times article, Keith Collins and Kevin Roose visualize the birth and spread of #JobsNotMobs and how it rapidly became part of the Republican campaign narrative in the midterm elections.

The creator of the meme, who goes by the pen name “Bryan Machiavelli,” told The New York Times he charges $200 an hour for his “memetic warfare consulting” services.

Check out the visualization here.

What Improv Storytelling Has to Offer Data Artists

In 2015, Ben Wellington gave a TEDx talk on how he borrowed principles from his lifelong love of Improv Comedy and applied them to his Data Visualization practice. “I accidentally became a data storyteller,” he says.

“The Open Data Laws are really exciting for people like me because it takes data that is inside City Government, and suddenly allows anyone to look at it.”

The narrative that came out of contextualizing this data spotted zones that fervent NYC cyclists are better off avoiding and shed some light on the battle strategies of New Yorkers’ favorite pharmacies. Wellington closes the distance between Data Viz and Improv by ‘Connecting with People’s Experiences’ and ‘Conveying one simple (and powerful) idea at a time’.
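For a sense of what “suddenly allows anyone to look at it” means in practice, here is a minimal sketch of pulling a few rows from an NYC Open Data dataset through its public Socrata endpoint. The dataset ID below is a hypothetical placeholder to be swapped for a real one from data.cityofnewyork.us, and the code is not Wellington’s own.

```python
# Minimal sketch: fetch a few rows from NYC Open Data's public Socrata API.
# The dataset ID is a hypothetical placeholder; browse data.cityofnewyork.us
# for real dataset IDs. Small anonymous requests need no API key.
import requests

BASE_URL = "https://data.cityofnewyork.us/resource"
DATASET_ID = "xxxx-xxxx"  # placeholder: substitute a real dataset ID

response = requests.get(f"{BASE_URL}/{DATASET_ID}.json", params={"$limit": 5})
response.raise_for_status()
for row in response.json():
    print(row)
```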

In 2016, Alan Alda, the seven-time Emmy-winning actor of M*A*S*H, and Dr. Christine O’Connell, Ocean and Environmental Scientist and Associate Director at the Alan Alda Center for Communicating Science, ran a workshop with a group of scientists, doctors and engineers to employ Improv Storytelling in communicating their research.

“I think anybody that studies something so deeply, whether you’re an engineer, whether you’re an artist, whether you’re in business, you forget what it’s like not to know” – O’Connell

Empathy lies at the heart of Improv and, therefore, at the heart of good communication. The idea of speaking to your audience, working with them to create a common language, and evolving toward clarity is especially relevant for Data Scientists and Data Artists.

The Data Artist creates an imaginary, artificial environment not dissimilar to that of an Improv actor, where certain cues are visible and others have to be made up. The logic of this environment, however, needs to be consistent, and is as important as the trust established within it.

“Even small breaks can affect credibility. When we visualize data, we are asking our audience to suspend their understanding of reality for a moment and accept new rules and conditions. We are asking our audience to understand shapes and forms on a digital screen to be something other than what they are.” – Ryan Morrill, Storybench, October 2017.

The Data Viz equivalent of laughter in an Improv comedy scene, says Morrill, is the moment of insight, when the logic reveals a reward.

‘Data Matters’ Goes Public!

Today marks the official public launch of Data Matters, a weekly online publication from the Center for Data Arts, where you will now find original postings.

With information technology rapidly woven into every layer of modern life, influencing what we eat, how we learn, how wars are waged, and how our societies are governed, data has never mattered more than it does today. To shed light on data’s many crucial and disparate roles, Data Matters will embrace perspectives from journalism, science, the humanities, and the arts, publishing pieces in forms that include articles, essays, research papers, and experimental digital media.

Our goal is to make Data Matters a wide-open platform for examining the data landscape from every angle, providing our audience with information and critical insight into this complex and fast-changing subject.


A History of Data Journalism

We have been using data to explain our world for a long time, and data journalism is no exception. As marketing strategist Andrea Lehr explains, we have been looking at data to help us tell stories for perhaps even longer than we thought. In this interview with Kristen Hare at Poynter, Lehr shares some of the findings from her recent report on the history of data journalism.

When staffers at the marketing agency Fractl decided to look into data journalism, they went way back. Way back. As they note, a kind of data journalism was used in the Han dynasty.

“I was most surprised to learn just how long the concept has been around,” said Andrea Lehr, a strategist at Fractl.

In 1849, for instance, The New York Tribune used a chart to show how many lives were being lost to cholera.

Fractl has seen an increase in data journalism among the publishers it works with, so staffers compiled a report on the storytelling method. The agency also spoke with several data journalists as part of the project, including FiveThirtyEight’s Allison McCann and Nathaniel Lash of the Poynter-owned Tampa Bay Times.

Lehr spoke with Poynter about the report via email.

Read the rest of the interview with Lehr at Poynter.