Jake Barrera

Product | AI | Software | Innovation

What Physiology Taught Me About Neural Networks

When I was studying biomedical engineering, one idea stayed with me: the human body runs on feedback loops. Physiology is not just chemistry or anatomy in isolation. It is sensing, signaling, correction, and adaptation happening across the nervous system, endocrine system, cardiovascular system, immune system, and musculoskeletal system.

Most homeostatic loops are negative feedback loops. They detect deviation from a target and push the body back toward range. Blood pressure, temperature, glucose, and breathing all work this way. But positive feedback loops matter too. They amplify a signal when the body needs commitment instead of stability.
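The negative feedback pattern described above can be sketched in a few lines. This is a toy illustration, not a physiological model; the set point, gain, and starting value are invented.

```python
# A minimal negative feedback loop: sense the deviation from a
# set point, then correct against it. All numbers are illustrative.

def correct(value, set_point=37.0, gain=0.3):
    """One sensing/correction cycle: the response opposes the error."""
    error = set_point - value
    return value + gain * error  # correction pushes back toward range

v = 39.0  # perturbed above range
for _ in range(20):
    v = correct(v)
print(round(v, 2))  # back near the 37.0 set point
```

Each pass shrinks the remaining error by a constant factor, which is why the value settles back into range instead of oscillating away.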

Blood Pressure Is a Multi Layer Control System

A simple example is standing up too quickly. Blood briefly pools in the legs, arterial pressure falls, and stretch receptors in the carotid sinus and aortic arch detect it. Their signals travel through the glossopharyngeal and vagus nerves to the nucleus tractus solitarius in the brainstem. The autonomic nervous system responds almost immediately. Parasympathetic tone drops, sympathetic tone rises, heart rate increases, and blood vessels constrict.

That is the fast loop. If the pressure problem persists, the kidneys join in. Reduced renal perfusion and sympathetic stimulation can increase renin release. Renin helps generate angiotensin II, which constricts blood vessels and stimulates aldosterone release. Aldosterone promotes sodium retention, and water follows. The same problem can recruit a neural reflex in seconds and a hormonal volume conserving system over a longer window.
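The layered response above can be caricatured as two controllers on different time scales: a fast baroreflex-like correction plus a slowly accumulating volume term, loosely echoing the renin-angiotensin-aldosterone axis. The gains, units, and structure here are invented for illustration.

```python
# A hedged sketch of layered pressure control: a fast "neural" loop
# acts on the error immediately, while a slow "hormonal" loop
# accumulates a volume effect over many steps. Numbers are arbitrary.

TARGET = 100.0  # arbitrary pressure units

def simulate(drop=20.0, steps=60, fast_gain=0.4, slow_gain=0.02):
    pressure = TARGET - drop  # e.g. standing up too quickly
    volume_effect = 0.0       # slow, volume-conserving contribution
    for _ in range(steps):
        error = TARGET - pressure
        fast = fast_gain * error            # reflex-like, acts each step
        volume_effect += slow_gain * error  # accumulates like sodium/water retention
        pressure += fast + 0.1 * volume_effect
    return pressure

print(round(simulate(), 1))  # close to the 100.0 target
```

The fast term handles most of the deviation within a few steps, while the accumulated term mops up the residual error over a longer window, which is the same division of labor the paragraph describes.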

That is what I still find beautiful about physiology. The body does not solve regulation with one pathway. It layers control loops with different sensors, time scales, and effectors.

Glucose Regulation Is a Conversation, Not a Single Hormone

Glucose control is another great example. After a meal, rising blood glucose stimulates pancreatic beta cells to release insulin. Insulin promotes glucose uptake in tissues like muscle and fat and shifts the liver toward glycogen storage. As glucose falls back toward range, insulin secretion decreases.

But the reverse side matters just as much. During fasting, pancreatic alpha cells release glucagon, which tells the liver to increase glycogenolysis and gluconeogenesis. And in real life, it is not just insulin versus glucagon. The autonomic nervous system and stress hormones also shape the response. Epinephrine can raise glucose quickly, and cortisol shifts metabolism over a longer window.

Even one variable like glucose is controlled by a distributed network, not a single wire.
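The insulin/glucagon conversation above can be sketched as two opposing controllers acting on one variable, each active only on its own side of the range. The thresholds and rates below are invented, not physiological values.

```python
# A toy dual-hormone loop: insulin-like correction above range,
# glucagon-like correction below range. Units are mg/dL-flavored
# but the numbers are illustrative only.

def glucose_step(glucose, intake=0.0):
    """One time step of a crude two-controller model."""
    insulin = max(0.0, glucose - 100.0) * 0.05   # beta-cell-like: acts above range
    glucagon = max(0.0, 80.0 - glucose) * 0.05   # alpha-cell-like: acts below range
    return glucose + intake - insulin + glucagon

# After a "meal", the insulin-like term pulls glucose back down...
g = 180.0
for _ in range(200):
    g = glucose_step(g)
print(round(g))  # settles near 100

# ...and during "fasting", the glucagon-like term pushes it back up.
g = 60.0
for _ in range(200):
    g = glucose_step(g)
print(round(g))  # settles near 80
```

Notice that neither controller alone could keep the variable in range from both directions; the stability comes from the pair, which is the point of the section.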

Temperature Control Combines Sensors, Brain, Vessels, and Skin

Thermoregulation is equally elegant. Peripheral and central thermoreceptors send information to the hypothalamus, especially the preoptic area. If temperature rises, the body can increase sweating, dilate skin blood vessels, and change behavior by making us seek shade, drink water, or slow down.

If temperature falls, the body can constrict peripheral vessels, trigger shivering, and recruit endocrine support for heat production. Again, one sensed error spreads into vascular, neural, muscular, and behavioral outputs at once.
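The fan-out described above, where one sensed error recruits several effectors at once, can be sketched as a simple threshold dispatcher. The effector names and the 0.5-degree thresholds are illustrative assumptions, not physiological constants.

```python
# One sensed error spreading into multiple outputs: a crude
# hypothalamus-style dispatcher. Thresholds and labels are invented.

def thermo_response(core_temp_c, set_point=37.0):
    """Return the responses recruited for a given core temperature."""
    responses = []
    error = core_temp_c - set_point
    if error > 0.5:    # too hot: shed heat through several channels at once
        responses += ["sweating", "skin vasodilation", "seek shade"]
    elif error < -0.5: # too cold: conserve and generate heat
        responses += ["peripheral vasoconstriction", "shivering"]
    return responses

print(thermo_response(38.2))
print(thermo_response(35.9))
print(thermo_response(37.1))  # within range: no effectors recruited
```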

Pain Shows How Fast Several Systems Can Fire Together

The example that still best connects physiology to learning is touching something hot. Heat activates thermoreceptors and nociceptors in the skin. Those signals enter the spinal cord, and before the brain has fully interpreted what happened, local spinal circuits can trigger a withdrawal reflex. At the same time, ascending pathways send the signal to the brain, where it becomes conscious perception.

But it does not stop there. Pain can also raise heart rate, dilate pupils, activate sympathetic output, trigger adrenal release of epinephrine and norepinephrine, and start local inflammatory signaling. In one moment, sensory, motor, autonomic, endocrine, and immune related processes begin coordinating around the same event.

A single stimulus does not travel down one line. It branches through a network of loops with different priorities. Some protect immediately. Some make the event memorable. Some help restore equilibrium after the danger passes.
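The branching above looks a lot like one event delivered to several subscribed loops with different latencies. As a sketch, with latencies and system names that are purely illustrative:

```python
# One stimulus, many loops: a toy event dispatcher ordered by
# response latency. The millisecond values are invented, chosen
# only to convey "reflex before perception before hormones".

HANDLERS = [
    # (latency in arbitrary ms, system, response)
    (20,    "spinal reflex", "withdraw hand"),
    (100,   "autonomic",     "raise heart rate, dilate pupils"),
    (200,   "conscious",     "perceive pain"),
    (500,   "endocrine",     "release epinephrine and norepinephrine"),
    (60000, "immune",        "begin local inflammatory signaling"),
]

def dispatch(stimulus):
    """Deliver one stimulus to every loop, fastest first."""
    ordered = sorted(HANDLERS)  # tuples sort by their first element, the latency
    for latency, system, response in ordered:
        print(f"{stimulus}: +{latency}ms {system} -> {response}")
    return [system for _, system, _ in ordered]

dispatch("hot surface")
```

The key property is that the reflex loop fires before the "conscious" handler even runs, which mirrors the withdrawal-before-perception ordering in the paragraph above.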

Positive Feedback Is Real, and It Matters

Positive feedback loops are less common, but they are some of the clearest examples of physiological amplification. In labor, cervical stretch promotes oxytocin release. Oxytocin strengthens uterine contractions, which increases cervical stretch and promotes more oxytocin, until delivery interrupts the loop.

Blood clotting also has this flavor. Vascular injury triggers platelet activation and a coagulation cascade in which activated factors help activate more downstream factors. The point is to build a fast, reinforcing response until bleeding is controlled.

Lactation also combines several layers of regulation. Suckling activates sensory nerves that signal the hypothalamus. Oxytocin supports milk ejection, prolactin supports milk production, and milk removal itself helps sustain future production.
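The labor loop above is the cleanest of these to sketch: each pass through the loop strengthens the signal, and only an external event, delivery, terminates it. The gain and threshold below are invented.

```python
# A toy positive feedback loop: stretch promotes oxytocin, oxytocin
# strengthens contractions, contractions increase stretch, until an
# external event (delivery) interrupts. Numbers are illustrative.

def labor_loop(stretch=1.0, gain=1.3, delivery_threshold=50.0):
    """Count amplification cycles until delivery ends the loop."""
    cycles = 0
    while stretch < delivery_threshold:
        stretch *= gain  # each pass reinforces the signal
        cycles += 1
    return cycles

print(labor_loop())  # a finite number of cycles, then the loop stops
```

Unlike the negative feedback examples, nothing inside this loop pulls the signal back toward a set point; stability has to come from outside it, which is exactly why positive feedback is reserved for commitment rather than regulation.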

Why This Still Reminds Me of Neural Networks

This is part of why neural networks still feel intuitive to me. The body is a giant sensor array that turns heat, pressure, stretch, chemistry, and light into electrochemical signals. Those signals move through layers of processing, thresholds, inhibition, amplification, and memory. The hidden layers in an artificial neural network are not organs, hormones, or synapses, but the analogy still helps me think.

A child touches fire once and learns quickly because the signal is strong, costly, and reinforced by multiple systems at once. Other lessons come from weaker signals that need repetition. That also feels familiar in machine learning. Some patterns are learned quickly because the error signal is large and unambiguous. Others need many passes before the system forms a stable representation.
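That contrast can be made concrete in gradient descent terms: a large, unambiguous error moves a parameter in a few updates, while a weak signal needs many more passes. This is a one-parameter sketch under invented numbers, not a claim about any real model.

```python
# Strong vs weak error signals: count how many gradient-style
# updates a single parameter needs to reach a target. The learning
# rate, tolerance, and signal strengths are illustrative.

def steps_to_learn(target, signal_strength, lr=0.1, tol=0.05):
    """Count updates until parameter w is within tol of the target."""
    w, steps = 0.0, 0
    while abs(target - w) > tol:
        error = signal_strength * (target - w)  # scaled error signal
        w += lr * error                          # gradient-style update
        steps += 1
    return steps

print(steps_to_learn(1.0, signal_strength=1.0))  # strong signal: few passes
print(steps_to_learn(1.0, signal_strength=0.1))  # weak signal: many more passes
```

The "touch fire once" lesson corresponds to the first call; the slowly repeated lessons correspond to the second.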

The analogy is not perfect. Human learning is embodied, emotional, hormonal, social, and deeply context dependent. Systems like ChatGPT do not feel pain or regulate blood pressure. But the comparison is still useful because both biological and artificial systems improve by receiving signals, adjusting internal parameters, and changing future responses.

What makes the body more fascinating to me is that it does all of this with extraordinary efficiency. It is slower than a digital processor at many explicit calculations, but it is massively parallel, adaptive, fault tolerant, and astonishingly low power. It is not just computing. It is sensing, deciding, repairing, predicting, and surviving all at once.

That idea has stayed with me since those early physiology classes. Intelligence is not only about speed. Sometimes it is about what a system can do with very little energy, many overlapping signals, and a lifetime of feedback.

If machine learning wants to move closer to what living systems can do, I think it has to improve in a few important ways.

  1. It needs better continual learning without forgetting what it already knows. The body does not usually retrain from scratch. It keeps adapting while preserving useful priors.

  2. AI needs richer embodiment, or at least richer grounding. Living beings learn through action, sensation, consequence, and context. They do not just process symbols. They feel temperature, force, hunger, fatigue, risk, and reward through a body that is always in the loop.

  3. Our models need to become far more energy efficient. The human brain and body perform extraordinary real time coordination on a tiny power budget compared with modern large scale training and inference systems.

  4. AI needs to become better at integrating many control loops at once. In biology, perception, memory, movement, endocrine signaling, autonomic regulation, and repair processes can all influence one another continuously. Our models are powerful, but they are still much more modular and much less self maintaining than a living organism.

  5. Intelligence in nature is not only pattern recognition. It is self preservation, adaptation under uncertainty, graceful degradation, and learning from sparse but meaningful signals. AI has made huge progress, but living systems still set the deeper benchmark. They do not just compute. They persist.

Sources

  1. NCBI Bookshelf: Physiology, Baroreceptors
  2. PubMed: Regulation of blood pressure by the arterial baroreflex and autonomic nervous system
  3. NCBI Bookshelf: Glucagon Physiology
  4. NCBI Bookshelf: Physiology, Thermal Regulation
  5. NCBI Bookshelf: Physiology, Stress Reaction
  6. NCBI Bookshelf: Physiology, Lactation
  7. NCBI Bookshelf: Physiology, Hemostasis
  8. PubMed Central: Macula densa sensing and signaling mechanisms of renin release