Explained: How 1.4 Petabytes of data from a brain speck can ‘mass-humble’ AI labs
The comparison underscores a paradox at the heart of modern AI development: today’s neural networks are inspired by the brain, yet researchers still cannot fully decode the wiring of even a microscopic portion of the biological system they aim to emulate.

Feb 22, 2026 | Updated Feb 22, 2026, 2:09 PM IST
A striking comparison between the scale of the human brain and the ambitions of artificial intelligence research is sparking conversation across the tech community, after industry commentator Aakash Gupta highlighted the sheer complexity of mapping even the tiniest fragment of neural tissue.
In a recent post on X (formerly Twitter), Gupta pointed to a decade-long scientific effort led by researchers at Harvard University in collaboration with Google, arguing that the findings should “mass-humble every AI lab on the planet.”
A cubic millimeter, 10 years of work
The project focused on reconstructing just one cubic millimeter of human brain tissue — roughly one-millionth the volume of an entire brain. Yet the scale of the undertaking rivaled that of major industrial research programs.
Scientists spent 10 years mapping the sample. The imaging process alone ran continuously for 326 days, during which the tissue was sliced into 5,000 ultra-thin sections, each only 30 nanometers thick. These were scanned using a $6-million electron microscope.
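For readers who want to sanity-check those figures, here is a quick back-of-envelope sketch (ours, not the study's). The 5,000 sections stack to only about 0.15 mm of depth, which suggests a slab-shaped sample of roughly a cubic millimeter in volume rather than a literal 1 mm cube — an inference on our part — and the 326-day run works out to an average of about 15 sections imaged per day:

```python
# Back-of-envelope check on the imaging figures quoted above.
# Inputs come from the article; the slab-shape note is an inference,
# since 5,000 sections at 30 nm stack to far less than 1 mm of depth.

SECTIONS = 5_000       # ultra-thin tissue sections
SECTION_NM = 30        # thickness of each section, in nanometers
IMAGING_DAYS = 326     # length of the continuous imaging run

depth_mm = SECTIONS * SECTION_NM * 1e-6   # nanometers -> millimeters
pace = SECTIONS / IMAGING_DAYS            # average sections per day

print(f"Stacked depth: {depth_mm:.2f} mm")        # ~0.15 mm
print(f"Average pace: {pace:.1f} sections/day")   # ~15.3
```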
Even after imaging, the challenge had only begun. The dataset was so vast that automated machine-learning systems were required to stitch the images into a coherent three-dimensional reconstruction — something Gupta noted “no human team could process” manually.
Massive data from a microscopic sample
From that speck of tissue, researchers identified:
- 57,000 cells
- 150 million synapses
- 230 millimeters of blood vessels
All of it translated into 1.4 petabytes of raw data — roughly 1.4 million gigabytes — drawn from a fragment smaller than a grain of rice.
Gupta extrapolated the implications: mapping the entire human brain at the same resolution would generate an estimated 1.4 zettabytes of data, comparable to the total amount of data produced globally in a year. Storing such information could cost tens of billions of dollars and require a data-center footprint measured in hundreds of acres.
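The extrapolation is simple volume scaling. Here is a sketch of that math; the brain-volume figure is our assumption (a typical textbook estimate, not from the article), and the article's roughly 1.4 zettabytes corresponds to treating the brain as about one million cubic millimeters:

```python
# Hedged version of the article's storage extrapolation: scale the
# 1.4 PB measured for ~1 mm^3 of tissue up to a whole human brain.

SAMPLE_PB = 1.4            # petabytes for ~1 mm^3 of tissue (article)
BRAIN_VOLUME_MM3 = 1.2e6   # approx. adult brain volume (our assumption)

total_pb = SAMPLE_PB * BRAIN_VOLUME_MM3   # same-resolution data volume
total_zb = total_pb / 1e6                 # 1 ZB = 1,000,000 PB

print(f"Whole brain: ~{total_pb:,.0f} PB (~{total_zb:.1f} ZB)")
# -> ~1.7 ZB with this volume estimate; the article's ~1.4 ZB follows
#    from rounding the brain down to ~1 million mm^3.
```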
Discoveries that defy textbooks
Beyond the engineering feat, researchers encountered biological structures that remain poorly understood.
According to neuroscientist Jeff Lichtman, who led the Harvard effort, the work revealed “a chasm between what we already know and what we need to know.”
Among the unexpected observations:
- A single neuron forming more than 5,000 connection points
- Axons curling into tightly coiled whorls with no clear explanation
- Cell clusters arranged in mirrored patterns
Such findings suggest that even foundational assumptions about neural wiring may be incomplete.
Given the immense technical burden, scientists are not attempting a full human brain map next. Instead, the field’s immediate target is a mouse hippocampus — about 10 cubic millimeters of tissue — over the coming five years.
Even that modest step represents a roughly tenfold increase in volume over the tissue already analyzed — with a correspondingly larger data burden — positioning it as a crucial proof of concept for connectomics, the science of mapping neural connections.
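At the same data density as the human sample — an assumption on our part, not a study figure — the projected storage need looks like this:

```python
# Rough projection for the mouse-hippocampus target, assuming the same
# data density per cubic millimeter as the Harvard human sample.

PB_PER_MM3 = 1.4   # petabytes per mm^3, from the human-tissue sample
TARGET_MM3 = 10    # approximate mouse hippocampus volume cited above

target_pb = PB_PER_MM3 * TARGET_MM3
print(f"Mouse hippocampus at the same resolution: ~{target_pb:.0f} PB")
# -> ~14 PB, a tenfold step up from the 1.4 PB already collected
```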
A reality check for AI ambitions
Gupta used the comparison to underscore a paradox at the heart of modern AI development: today’s neural networks are inspired by the brain, yet researchers still cannot fully decode the wiring of even a microscopic portion of the biological system they aim to emulate.
While cutting-edge AI models run on vast computing clusters, the human brain operates on about 20 watts of power — less than a household light bulb — raising enduring questions about efficiency, architecture, and what intelligence truly requires.
The takeaway, Gupta argued, is less about diminishing AI’s progress and more about recognising the extraordinary complexity of its biological template — one that science is only beginning to map.
