Research

The Limits of Our Brains: How Feedback Can Influence Behavior on a Cellular Scale
A feature-generalizable technique for neural conditioning
READ THE PAPER
It is not surprising to anyone that our brains are capable of tremendous change. Indeed, our neural plasticity is the attribute that allows us to adapt to our world every day, learn new skills, and form memories. Neurophysiologically, this happens through modulation of the potentiation of synapses, the information-transfer mechanism in our brains. Our brain does a pretty good job of regulating this on its own, but mental illnesses such as depression and PTSD, as well as neurodegeneration caused by strokes and seizures, still go untreated by the brain's own machinery. What, then, if researchers, scientists, and doctors had the power to manipulate this potentiation externally?
Shrinking the World with Data Compression
Geospatial compression through quad-tree raster decomposition
READ THE PAPER
Picture a self-driving car navigating a bustling cityscape, drones buzzing over a field to monitor crop health, or robots moving with precision in a busy warehouse. What's the common thread tying all these scenes of the near future together? It's LiDAR, the technology that lets machines see the world in 3D. But there's a snag: LiDAR creates huge piles of data that our current tech struggles to handle. That's where General Purpose Geospatial Compression (GPGC) comes into play.
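To give a rough feel for the quad-tree idea named in the subtitle (a minimal sketch under my own assumptions, not GPGC's actual codec, error model, or encoding): recursively split a raster into quadrants, and stop splitting wherever a block's values are nearly uniform, so large flat regions collapse to a single stored value.

```python
# Illustrative quad-tree raster decomposition (not the GPGC implementation).

def quadtree(raster, x, y, size, tol):
    """Recursively decompose a size x size block of `raster`.

    Returns a single value when the block is uniform within `tol`,
    otherwise a 4-tuple of child blocks (TL, TR, BL, BR).
    """
    vals = [raster[y + j][x + i] for j in range(size) for i in range(size)]
    if size == 1 or max(vals) - min(vals) <= tol:
        # Uniform enough: one value represents the whole block.
        return sum(vals) / len(vals)
    half = size // 2
    return (
        quadtree(raster, x, y, half, tol),                 # top-left
        quadtree(raster, x + half, y, half, tol),          # top-right
        quadtree(raster, x, y + half, half, tol),          # bottom-left
        quadtree(raster, x + half, y + half, half, tol),   # bottom-right
    )

def count_leaves(node):
    """Count stored values; fewer leaves means better compression."""
    if isinstance(node, tuple):
        return sum(count_leaves(child) for child in node)
    return 1

# A mostly flat 8x8 elevation grid with one bumpy corner compresses well:
grid = [[0.0] * 8 for _ in range(8)]
grid[0][0], grid[1][1] = 5.0, 3.0
tree = quadtree(grid, 0, 0, 8, tol=0.5)
print(count_leaves(tree), "leaves instead of", 8 * 8, "cells")  # 10 vs 64
```

The `tol` parameter is the lossy knob: a larger tolerance merges more blocks and shrinks the output at the cost of fidelity.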
Bad AI Teaching Badly Trains Better AIs
Underfitting heuristic segmentation models for superior neural results
READ THE PAPER
The use of machine learning for simple tasks is often inhibited by the expense and difficulty of assembling a high-quality training dataset. ML as a field has followed a consistent paradigm of devoting extensive effort to curating a reliable, robust, and extensive dataset to train a model on. Much of the time this dataset is more interesting than the model itself, and it is certainly more expensive and time-consuming to create!
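The subtitle's idea can be caricatured as follows (a toy sketch under my own assumptions, not the paper's actual pipeline or models): a cheap heuristic labeler, here a noisy intensity threshold, produces imperfect labels at no annotation cost, and a learned model trained on those noisy labels can average out the heuristic's individual mistakes.

```python
import math
import random

# Toy sketch: learn a pixel classifier from cheap heuristic labels instead
# of hand annotation. The "teacher" is a noisy intensity threshold; the
# "student" is a tiny logistic regression over pixel intensity.
random.seed(0)

def heuristic_label(intensity):
    # Cheap, imperfect rule: bright pixels are foreground, with 10% noise.
    label = 1 if intensity > 0.5 else 0
    return 1 - label if random.random() < 0.1 else label

# Synthetic "pixels": the true foreground is intensity > 0.5.
pixels = [random.random() for _ in range(2000)]
noisy_labels = [heuristic_label(p) for p in pixels]

# Train the student on the noisy heuristic labels with plain SGD.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    for x, y in zip(pixels, noisy_labels):
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        w += lr * (y - p) * x
        b += lr * (y - p)

def student(x):
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5 else 0

# Despite 10% label noise in training, the student recovers a decision
# boundary near the clean one at 0.5.
clean_truth = [1 if x > 0.5 else 0 for x in pixels]
acc = sum(student(x) == t for x, t in zip(pixels, clean_truth)) / len(pixels)
print(f"student accuracy vs clean truth: {acc:.2f}")
```

Because the heuristic's errors are scattered rather than systematic, the fitted model's smooth decision boundary ends up cleaner than any single heuristic labeling, which is the intuition behind training a "better" model from a "bad" teacher.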