Since 2021 has come to a close, we thought it apt to look back at the year one last time and reveal some of the biggest data facts it had to offer.
Spoiler: the creation and application of data continue to skyrocket, as expected!
Here are a few stats you might find interesting from TechJury:
- Internet users created 2.5 quintillion bytes of data every day
- The Big Data analytics market is predicted to reach a whopping $103 billion by 2023!
Food for thought: If so many people are creating so much data every day, then there must be a viable story behind at least some of it, right?
If your answer leans toward yes, you are not alone. An entire industry is built on stories extracted from data, and data is set to reshape journalism in the coming years. The importance of data in journalism will only continue to grow.
Data in journalism
In the 21st century, we don't just have enormous amounts of data at our disposal, but also advanced tools to analyze it, some of them available for free on the internet.
By scraping data from all over the web, writers at The Pudding were able to publish a riveting visual essay that looked at the nonconsensual use of a photo, which later became a phenomenon.
Without data and the creative ways to visualize them, the article would have probably gone on for pages.
Not to mention the inordinate number of readers who would've dropped off with a simple TL;DR (Too Long; Didn't Read).
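The scraping behind a piece like that can start very small. As a minimal sketch, here is how headline text might be pulled out of raw HTML using only Python's standard library; the page markup and the `headline` class are hypothetical, stand-ins for whatever structure the target site actually uses.

```python
from html.parser import HTMLParser

class HeadlineScraper(HTMLParser):
    """Collects the text of every <h2 class="headline"> element."""

    def __init__(self):
        super().__init__()
        self.headlines = []
        self._in_headline = False

    def handle_starttag(self, tag, attrs):
        # Only track h2 tags carrying the (hypothetical) headline class.
        if tag == "h2" and ("class", "headline") in attrs:
            self._in_headline = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_headline = False

    def handle_data(self, data):
        if self._in_headline and data.strip():
            self.headlines.append(data.strip())

# A made-up page, in place of HTML fetched from a real site.
page = """
<html><body>
  <h2 class="headline">Riots shake the city</h2>
  <p>Story text...</p>
  <h2 class="headline">Data reveals the pattern</h2>
</body></html>
"""

scraper = HeadlineScraper()
scraper.feed(page)
print(scraper.headlines)
```

Real projects typically swap in an HTTP client and a more forgiving parser, but the shape of the work is the same: fetch, parse, keep the fields the story needs.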
Journalism: Then & Now
If you were working as a reporter in the 80s, all you needed to publish a sensational story was a notebook, a pen, a cassette recorder, and of course an uncanny ability to detect where the seeds of a story lie.
Well, now things have changed.
What started as an obscure attempt by Philip Meyer, a journalist at the Detroit Free Press, to use data to uncover the causes of the 1967 Detroit riots turned, over four decades, into a force to be reckoned with.
In a time when the best news media are constantly derided as being fake, when the level of trust readers have in news publications is at an all-time low, data can go a long way in regaining that credibility.
Case in point: The Panama Papers
It was a quintessential example of how a large dataset (2.6 terabytes, spanning some 11.5 million documents) was analyzed to obtain crucial information, leading to the exposure of one of the greatest financial scandals of the 21st century.
The prime ministers of Pakistan and Iceland had to resign as a consequence, among other far-reaching repercussions that are still being felt to this day.
Around 400 journalists from 80 countries worked on this mammoth dataset for two years, using tools available mostly for free on the internet. Nearly 5,000 articles were published as a result.
While the Panama Papers was a classic example of extensive data analysis revealing intriguing facts, it was also a prelude to something much larger.

It brought to light the importance of data in journalism. Today, the work is no longer just about going through data that is already available, but about actively seeking it out.
It’s about looking at uncharacteristic sources and vetting large datasets to get what you want.
That’s where we come in.
Company blogs, white papers, forums, journals, news aggregators, even the dreaded PDFs (where data goes to die): if you need data from any of these sources to make sense of a story, or to start one from scratch, we can get it for you.
Our unparalleled experience in web data extraction can help you find the next big juicy news piece, whatever your beat.
There is a story out there that needs to be told. Let us help you find it!