A new form of empire-building is underway, and your personal information is the territory
We are witnessing the emergence of a new form of colonialism—one that doesn’t require gunboats or territorial occupation, but instead harvests the most intimate resource of the 21st century: human data. While we debate traditional geopolitics, a silent war is raging for control over the digital essence of humanity itself.
The battleground is no longer geographic—it’s neurographic. AI companies aren’t just collecting data; they’re mapping the collective unconscious of our species, one interaction at a time.
For decades, engineers have dreamed of a single device that could fluently translate between the lightning-fast language of light and the high-bandwidth whisper of terahertz waves. Now, a team at EPFL and Harvard has done exactly that—on a chip so small it could ride on your fingernail.
Terahertz (THz) radiation sits in the electromagnetic no man’s land between microwaves and infrared light—too fast for conventional radio tech, too tricky for optical systems to harness directly. But if you could get THz signals to talk to existing optical networks, you’d open the door to ultra-secure 6G communications, millimeter-precision radar, and data transfer speeds that make today’s fiber optics look like dial-up.
In the intricate landscape of modern data management, “data observability” emerges as a vital practice, extending beyond the mere monitoring of data pipelines’ “health and state.” This practice involves deploying technologies and activities that empower business operators to proactively identify, examine, and resolve data-related challenges in near real time.
The Significance of Data Observability:
Organizations heavily reliant on accurate and reliable data for decision-making face challenges in ensuring data quality. This is precisely where data observability becomes indispensable. It is the practice of real-time monitoring and understanding of the health, performance, and reliability of data pipelines. By embracing data observability practices, businesses gain enhanced visibility into their data infrastructure, ensuring smooth operations and consistently delivering high-quality insights.
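To make the idea less abstract, here is a minimal sketch of what a basic observability check on a batch of pipeline output might look like. The function name, thresholds, and pandas-based approach are illustrative assumptions, not a reference to any particular observability product; real platforms typically also track schema changes, lineage, and distribution drift.

```python
import pandas as pd

def check_pipeline_health(df, timestamp_col, min_rows=1000,
                          max_null_rate=0.05, max_staleness_hours=24.0):
    """Illustrative observability checks: volume, completeness, freshness."""
    issues = {}

    # Volume: did this batch deliver roughly as many rows as expected?
    if len(df) < min_rows:
        issues["volume"] = f"only {len(df)} rows, expected at least {min_rows}"

    # Completeness: what fraction of all cells in the batch are missing?
    null_rate = float(df.isna().mean().mean())
    if null_rate > max_null_rate:
        issues["completeness"] = f"null rate {null_rate:.1%} exceeds {max_null_rate:.0%}"

    # Freshness: how old is the newest record? (timestamps assumed naive UTC)
    newest = pd.to_datetime(df[timestamp_col]).max()
    staleness = pd.Timestamp.utcnow().tz_localize(None) - newest
    if staleness > pd.Timedelta(hours=max_staleness_hours):
        issues["freshness"] = f"newest record is {staleness} old"

    return issues  # an empty dict means the batch passed every check
```

A scheduler or orchestration tool would run a check like this after each pipeline run and alert operators when the returned dict is non-empty, which is the “near real time” identification the practice is about.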
In a May 2011 special research report, Big data: The next frontier for innovation, competition, and productivity, the management consulting firm McKinsey put forth the case that “Big data will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus.” The McKinsey report went on to note that, “The amount of data in our world has been exploding. Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers. The increasing volume and detail of information captured by enterprises, the rise of multimedia, social media, and the Internet of Things will fuel exponential growth in data for the foreseeable future.”
General usage of the term “Big Data” can be traced to the McKinsey report and similar reports from IBM that ensued around this time. The McKinsey report was prescient in its observations that “Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers.” In retrospect, this was the key insight. From this point forward, interest in data would no longer be limited to the purview of “a few data-oriented managers,” but rather would become the purview of “leaders in every sector.” The McKinsey report went on to describe the advent of the era of Big Data as heralding “new waves of productivity growth, innovation, and consumer surplus.” The report contained one important caveat, however, noting that these advances would come to pass only “as long as the right policies and enablers are in place.”
An Indian farmer dries harvested rice from a paddy field in Assam.
Ending hunger is one of the top priorities of the United Nations this decade. Yet the world appears to be backsliding: the number of people experiencing hunger has risen by 60 million over the last five years, to an estimated 690 million worldwide.
To help turn this trend around, a team of 70 researchers published a landmark series of eight studies in Nature Food, Nature Plants, and Nature Sustainability on Monday. The scientists turned to machine learning to comb 500,000 studies and white papers chronicling the world’s food system. The results show that there are routes to address world hunger this decade, but also that there are huge gaps in knowledge we need to fill to ensure those routes are equitable and don’t destroy the biosphere.
Before the global pandemic struck in 2020 and the world was turned on its head, artificial intelligence (AI), and specifically the branch of AI known as machine learning (ML), were already causing widespread disruption in almost every industry.
The Covid-19 pandemic has changed many aspects of how we do business, but it hasn’t diminished the impact AI is having on our lives. In fact, it’s become apparent that self-teaching algorithms and smart machines will play a big part in the ongoing fight against this outbreak as well as others we may face in the future.
AI undoubtedly remains a key trend when it comes to picking the technologies that will change how we live, work, and play in the near future. So, here’s an overview of what we can expect during what will be a year of rebuilding our lives as well as rethinking business strategies and priorities.
The coronavirus is proving that we have to move faster in identifying and mitigating epidemics before they become pandemics because, in today’s interconnected world, viruses spread faster, farther, and more frequently than ever before.
If COVID-19 has taught us anything, it’s that while our ability to identify and treat pandemics has improved greatly since the outbreak of the Spanish Flu in 1918, there is still a lot of room for improvement. Over the past few decades, we’ve taken huge strides to improve quick detection capabilities. It took a mere 12 days to map the outer “spike” protein of the virus that causes COVID-19 using new techniques. In the 1980s, a similar structural analysis for HIV took four years.
But developing a cure or vaccine still takes a long time and involves such high costs that big pharma doesn’t always have incentive to try.
The technique, inspired by quantum cryptography, would allow large medical databases to be tapped for causal links
Understanding how the world works means understanding cause and effect. Why are things like this? What will happen if I do that? Correlations tell you that certain phenomena go together. Only causal links tell you why a system is as it is or how it might evolve. Correlation is not causation, as the slogan goes.
This is a big problem for medicine, where a vast number of variables can be interlinked. Diagnosing diseases depends on knowing which conditions cause what symptoms; treating diseases depends on knowing the effects of different drugs or lifestyle changes. Untangling such knotty questions is typically done via rigorous observational studies or randomized controlled trials.
These create a wealth of medical data, but it is spread across different data sets, which leaves many questions unanswered. If one data set shows a correlation between obesity and heart disease and another shows a correlation between low vitamin D and obesity, what’s the link between low vitamin D and heart disease? Finding out typically requires another clinical trial.
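To see why stitched-together correlations are not enough, here is a small toy simulation (my own illustration, not drawn from the article). Two data-generating processes produce the same qualitative pattern of pairwise correlations, yet only in the first would raising vitamin D actually reduce heart disease:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Scenario 1: vitamin D -> obesity -> heart disease (a real indirect effect).
vit_d1 = rng.normal(size=n)
obesity1 = -0.5 * vit_d1 + rng.normal(size=n)
heart1 = 0.5 * obesity1 + rng.normal(size=n)

# Scenario 2: a hidden factor lowers vitamin D and raises obesity;
# obesity still drives heart disease, but vitamin D itself does nothing.
hidden = rng.normal(size=n)
vit_d2 = -0.5 * hidden + rng.normal(size=n)
obesity2 = 0.5 * hidden + rng.normal(size=n)
heart2 = 0.5 * obesity2 + rng.normal(size=n)

for label, v, o, h in [("direct chain", vit_d1, obesity1, heart1),
                       ("hidden confounder", vit_d2, obesity2, heart2)]:
    print(label,
          "corr(vitD, obesity) =", round(np.corrcoef(v, o)[0, 1], 2),
          "corr(obesity, heart) =", round(np.corrcoef(o, h)[0, 1], 2),
          "corr(vitD, heart) =", round(np.corrcoef(v, h)[0, 1], 2))
```

In both scenarios every pairwise correlation is non-zero with the same signs, so separate observational data sets cannot distinguish them; answering the intervention question normally requires another trial, or the kind of causal-inference machinery the new technique aims to provide.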
It may not be long before you’ll have to forget about walking down the street anonymously, says a New York Times report.
“Just a face in the crowd.” That figure of speech may one day need a footnote to explain it.
What if a stranger could snap your picture on the sidewalk, then use an app to quickly discover your name, address, and other details? A startup called Clearview AI has made that possible, and its app is currently being used by hundreds of law enforcement agencies in the US, including the FBI, says a Saturday report in The New York Times.
The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online.
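Clearview has not published its pipeline, but face-search systems of this kind generally work by mapping each face to a numeric embedding and returning the nearest stored embeddings. The sketch below is a generic illustration under that assumption; embed_face is a hypothetical placeholder for a trained face-recognition model, and the brute-force scan stands in for whatever index a production system would use.

```python
import numpy as np

def embed_face(image) -> np.ndarray:
    """Hypothetical placeholder: a real system would run a trained
    face-recognition network that maps a face image to a fixed-length vector."""
    raise NotImplementedError

def search(query_embedding: np.ndarray,
           db_embeddings: np.ndarray,   # shape (n_photos, dim), precomputed from scraped photos
           db_urls: list,               # source URL for each stored photo
           top_k: int = 5):
    """Return the source URLs of the stored photos closest to the query face."""
    # Cosine similarity between the query and every stored embedding.
    q = query_embedding / np.linalg.norm(query_embedding)
    db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
    sims = db @ q
    best = np.argsort(sims)[::-1][:top_k]
    return [(db_urls[i], float(sims[i])) for i in best]
```

At the reported scale of more than 3 billion photos, the exhaustive comparison above would be replaced by an approximate nearest-neighbour index, but the principle is the same: one sidewalk snapshot becomes a query against everything ever scraped.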
IDC released today its worldwide IT industry predictions for 2020 in a webcast with Frank Gens, IDC’s senior vice president and chief analyst.
The focus for the 10 predictions for next year and beyond is the rise of the digital economy. By 2023, IDC predicts, over half (52%) of global GDP will be accounted for by digitally transformed enterprises. This digital tipping point heralds the emergence of a new enterprise species, the digital-first enterprise.
To drive digital supremacy, an enterprise must devote half of its budget to supporting digital innovation, establishing large-scale, high-performing digital innovation factories and a third-party ecosystem to produce digital products and provide fee-based wholesale digital services to other enterprises. The latter will be an entirely new enterprise competency, similar to the management of Amazon’s platform for third-party sellers. IT resources will continue their migration to the cloud (and multi-clouds), and there will be heavy investment in automation and orchestration systems using artificial intelligence and machine learning.
We’ve been running a data science experiment over the past few months. Our first goal was to compare and contrast the amount of data we could actively gather using a link to an online survey vs. the amount of data we could passively gather using our cookies and pixel-monitoring tools. Our second goal was to compare and contrast the value of self-reported data vs. observed behavioral data. Our final goal was to turn both data sets into actionable insights and analyze the results. We were shocked, but not surprised, by what we learned.