Big Data Won’t Save You From Coronavirus

David Fickling:

That’s not a comforting thought. We live in an era where everything seems quantifiable, from our daily movements to our internet search habits and even our heartbeats. At a time when people are scared and seeking certainty, it’s alarming that the knowledge we have on this most important issue is at best an approximate guide to what’s happening.

“It’s so easy these days to capture data on anything, but to make meaning of it is not easy at all,” said John Carlin, a professor at the University of Melbourne specializing in medical statistics and epidemiology. “There’s genuinely a lot of uncertainty, but that’s not what people want to know. They want to know it’s under control.”

That’s most visible in the contradictory information we’re seeing around how many people have been infected, and what share of them have died. While those figures are essential for getting a handle on the situation, as we’ve argued, they’re subject to errors in sampling and measurement that are compounded in high-pressure circumstances. The physical capacity to do timely testing and diagnosis can’t be taken for granted either, as my colleague Max Nisen has written.
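To get a feel for how much sampling error alone contributes, here is a minimal sketch (all counts are hypothetical, not real outbreak data) of a 95% Wilson score interval around an observed fatality rate. The observed rate is 20% in every case, but with only 20 confirmed cases it is consistent with anything from roughly 8% to 42%, narrowing sharply as counts grow:

```python
from math import sqrt

def wilson_interval(deaths: int, cases: int, z: float = 1.96) -> tuple:
    """95% Wilson score interval for the proportion deaths / cases."""
    p = deaths / cases
    denom = 1 + z**2 / cases
    centre = (p + z**2 / (2 * cases)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / cases + z**2 / (4 * cases**2))
    return centre - half, centre + half

# Hypothetical counts: the observed rate is 20% in every row, but the
# plausible range around it depends heavily on how many cases exist.
for deaths, cases in [(4, 20), (40, 200), (400, 2000)]:
    lo, hi = wilson_interval(deaths, cases)
    print(f"{deaths}/{cases}: observed {deaths/cases:.0%}, "
          f"95% CI {lo:.1%} to {hi:.1%}")
```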

Early case fatality rates for Severe Acute Respiratory Syndrome were often 40% or higher before settling down to figures in the region of 15% or less. The age of patients, whether they get sick in the community or in a hospital, and doctors’ capacity and experience in offering treatment can all affect those numbers dramatically.
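One mechanism behind those early swings: the naive rate divides deaths by *confirmed* cases, and early in an outbreak mostly the sickest patients get confirmed, so the denominator is undercounted while deaths largely are not. A rough sketch of that arithmetic, with purely illustrative numbers rather than SARS data:

```python
# Illustrative assumptions only, not SARS or Covid-19 figures.
TRUE_IFR = 0.02                   # assumed fatality rate across ALL infections
INFECTIONS = 10_000
DEATHS = INFECTIONS * TRUE_IFR    # assume every death is eventually counted

def naive_cfr(detected_share: float) -> float:
    """Deaths divided by confirmed cases: the rate usually reported early on."""
    confirmed = INFECTIONS * detected_share
    return DEATHS / confirmed

for share in (0.05, 0.25, 1.0):
    print(f"{share:.0%} of infections confirmed -> "
          f"naive CFR {naive_cfr(share):.0%}")
# 5% -> 40%, 25% -> 8%, 100% -> 2%: the underlying risk never changed.
```

The same arithmetic runs in reverse as testing widens, which is one reason early estimates tend to fall over time.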

Even the way that coronavirus cases are defined and counted has changed several times, said Professor Raina MacIntyre, head of the University of New South Wales’s Biosecurity Research Program: from “pneumonia of unknown cause” in the early days, through laboratory-confirmed cases once a virus was identified, to the current standard that includes lung scans. That’s a common phenomenon during outbreaks, she said.

Related: the hype cycle.