- Humankind has been storing data for tens of thousands of years, from cave paintings to books to, more recently, super-sized data centers.
- Technological advancements have increased our ability to create and store data.
- Each day on Earth we generate 500 million tweets, 294 billion emails and 4 million gigabytes of Facebook data.
- At current growth rates, within roughly 150 years the number of digital bits would exceed the number of atoms on Earth.
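The 150-year figure in the last bullet is a simple exponential projection. A minimal sketch of that arithmetic (the starting bit count, the 50% annual growth rate, and the atom count are illustrative assumptions, not figures from the text):

```python
import math

ATOMS_ON_EARTH = 1.33e50   # rough estimate of atoms on Earth (assumption)
bits_today = 4.7e23        # ~59 zettabytes of data in 2020, in bits (assumption)
annual_growth = 0.50       # assumed 50% growth in stored bits per year

# Solve bits_today * (1 + g)^t = ATOMS_ON_EARTH for t
years = math.log(ATOMS_ON_EARTH / bits_today) / math.log(1 + annual_growth)
print(f"Bits would outnumber atoms in about {years:.0f} years")  # ~150
```

Slower assumed growth stretches the horizon considerably (at 20% per year the same calculation gives roughly 330 years), which is why such projections are sensitive to the growth rate chosen.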
In a May 2011 special research report, Big data: The next frontier for innovation, competition, and productivity, the management consulting firm McKinsey put forth the case that “Big data will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus.” The McKinsey report went on to note that, “The amount of data in our world has been exploding. Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers. The increasing volume and detail of information captured by enterprises, the rise of multimedia, social media, and the Internet of Things will fuel exponential growth in data for the foreseeable future.”
General usage of the term “Big Data” can be traced to the McKinsey report and similar reports from IBM that followed around the same time. The McKinsey report was prescient in its observation that “Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers.” In retrospect, this was the key insight. From this point forward, interest in data would no longer be limited to the purview of “a few data-oriented managers,” but rather would become the purview of “leaders in every sector.” The McKinsey report went on to describe the advent of the era of Big Data as heralding “new waves of productivity growth, innovation, and consumer surplus.” The report contained one important caveat, however, noting that these advances would materialize only “as long as the right policies and enablers are in place.”
An Indian farmer dries harvested rice from a paddy field in Assam.
Ending hunger is one of the top priorities of the United Nations this decade. Yet the world appears to be backsliding, with an uptick of 60 million people experiencing hunger in the last five years to an estimated 690 million worldwide.
To help turn this trend around, a team of 70 researchers published a landmark series of eight studies in Nature Food, Nature Plants, and Nature Sustainability on Monday. The scientists turned to machine learning to comb 500,000 studies and white papers chronicling the world’s food system. The results show that there are routes to address world hunger this decade, but also that there are huge gaps in knowledge we need to fill to ensure those routes are equitable and don’t destroy the biosphere.
Before the global pandemic struck in 2020 and the world was turned on its head, artificial intelligence (AI), and specifically the branch of AI known as machine learning (ML), were already causing widespread disruption in almost every industry.
The Covid-19 pandemic has impacted many aspects of how we do business, but it hasn’t diminished the impact AI is having on our lives. In fact, it’s become apparent that self-teaching algorithms and smart machines will play a big part in the ongoing fight against this outbreak as well as others we may face in the future.
AI undoubtedly remains a key trend when it comes to picking the technologies that will change how we live, work, and play in the near future. So, here’s an overview of what we can expect during what will be a year of rebuilding our lives as well as rethinking business strategies and priorities.
The coronavirus is proving that we have to move faster in identifying and mitigating epidemics before they become pandemics because, in today’s global world, viruses spread much faster, further, and more frequently than ever before.
If COVID-19 has taught us anything, it’s that while our ability to identify and treat pandemics has improved greatly since the outbreak of the Spanish Flu in 1918, there is still a lot of room for improvement. Over the past few decades, we’ve taken huge strides to improve quick detection capabilities. It took a mere 12 days to map the outer “spike” protein of the COVID-19 virus using new techniques. In the 1980s, a similar structural analysis for HIV took four years.
But developing a cure or vaccine still takes a long time and involves such high costs that big pharma doesn’t always have incentive to try.
The technique, inspired by quantum cryptography, would allow large medical databases to be tapped for causal links
Understanding how the world works means understanding cause and effect. Why are things like this? What will happen if I do that? Correlations tell you that certain phenomena go together. Only causal links tell you why a system is as it is or how it might evolve. Correlation is not causation, as the slogan goes.
This is a big problem for medicine, where a vast number of variables can be interlinked. Diagnosing diseases depends on knowing which conditions cause what symptoms; treating diseases depends on knowing the effects of different drugs or lifestyle changes. Untangling such knotty questions is typically done via rigorous observational studies or randomized controlled trials.
These create a wealth of medical data, but it is spread across different data sets, which leaves many questions unanswered. If one data set shows a correlation between obesity and heart disease and another shows a correlation between low vitamin D and obesity, what’s the link between low vitamin D and heart disease? Finding out typically requires another clinical trial.
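The gap the passage describes, where pairwise correlations across data sets do not determine a causal link, can be illustrated with a toy simulation. A minimal sketch (the variables and noise levels are invented for illustration, not taken from any study): a hidden confounder makes two variables correlate strongly even though neither causes the other, and the correlation vanishes once one variable is assigned independently, as a randomized trial would do.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n = 10_000
# Hidden confounder, e.g. an unobserved lifestyle factor
z = [random.gauss(0, 1) for _ in range(n)]
# Both observed variables are driven by z plus independent noise;
# neither causes the other.
x = [zi + random.gauss(0, 0.5) for zi in z]   # e.g. low vitamin D
y = [zi + random.gauss(0, 0.5) for zi in z]   # e.g. heart disease risk

print(f"Observed correlation:           {pearson(x, y):.2f}")   # strong

# "Intervention": assign x independently of z, as a randomized trial would
# (std 1.1 roughly matches the observational spread of x)
x_do = [random.gauss(0, 1.1) for _ in range(n)]
print(f"Correlation under intervention: {pearson(x_do, y):.2f}")  # near zero
```

This is exactly why pooling observational data sets cannot by itself answer the vitamin D/heart disease question: the same observed correlations are compatible with very different causal structures.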
It may not be long before you’ll have to forget about walking down the street anonymously, says a New York Times report.
“Just a face in the crowd.” That figure of speech may one day need a footnote to explain it.
What if a stranger could snap your picture on the sidewalk then use an app to quickly discover your name, address and other details? A startup called Clearview AI has made that possible, and its app is currently being used by hundreds of law enforcement agencies in the US, including the FBI, says a Saturday report in The New York Times.
The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online.
2020 predictions from IDC
IDC released today its worldwide IT industry predictions for 2020 in a webcast with Frank Gens, IDC’s senior vice president and chief analyst.
The focus for the 10 predictions for next year and beyond is the rise of the digital economy. By 2023, IDC predicts, over half (52%) of global GDP will be accounted for by digitally transformed enterprises. This digital tipping point heralds the emergence of a new enterprise species, the digital-first enterprise.
To drive digital supremacy, an enterprise must devote half of its budget to supporting digital innovation, establishing large-scale, high-performing digital innovation factories and a third-party ecosystem to produce digital products and provide fee-based wholesale digital services to other enterprises. The latter will be an entirely new enterprise competency, similar to the management of Amazon’s platform for third-party sellers. IT resources will continue their migration to the cloud (and multi-clouds), and there will be heavy investment in automation and orchestration systems using artificial intelligence and machine learning.
Will work for data
We’ve been running a data science experiment over the past few months. Our first goal was to compare and contrast the amount of data we could actively gather using a link to an online survey vs. the amount of data we could passively gather using our cookies and pixel-monitoring tools. Our second goal was to compare and contrast the value of self-reported data vs. observed behavioral data. Our final goal was to turn both data sets into actionable insights and analyze the results. We were shocked, but not surprised, by what we learned.
Gatwick first trialled facial-recognition-based checks at some of its departure gates last year
Gatwick has become the UK’s first airport to confirm it will use facial-recognition cameras on a permanent basis for ID checks before passengers board planes.
It follows a self-boarding trial carried out in partnership with EasyJet last year.
The London airport said the technology should reduce queuing times but travellers would still need to carry passports.
Privacy campaigners are concerned.
Sidewalk Labs says it will spend $1.3 billion on the project in the hopes of spurring $38 billion in private sector investment by 2040
Sidewalk Labs, Alphabet’s smart city subsidiary, released its massive plan Monday to transform a slice of Toronto’s waterfront into a high-tech utopia. Eighteen months in the making and clocking in at 1,524 pages, the plan represents Alphabet’s first high-stakes effort to realize Alphabet CEO Larry Page’s long-held dream of a city within a city to experiment with innovations like self-driving cars, public Wi-Fi, new health care delivery solutions, and other city planning advances that modern technology makes possible.
Previously, Sidewalk Labs called it “a neighborhood built from the internet up.” But on Monday, Sidewalk Labs CEO Dan Doctoroff went a step further to describe it as “the most innovative district in the world.”
‘Being smart is about working in a smarter way with different partners and empowering citizens’
Stockholm is one of the world’s most connected cities, and a beacon for innovators and international talent. We are also a forward-looking city, leading the environmental and smart city agendas. By 2040, we have the ambition to be both carbon neutral and the smartest city in the world.