when i first encountered the shneiderman owl simulation, i was teaching a class on digital culture and liberation. a student had brought it up, saying "professor hooks, look at this—it's just owls hunting mice, but something feels wrong about it." that student's discomfort became our entry point into understanding how even our simulations encode systems of domination.
what does it mean that we so easily accept a world where some must die for others to live? the simulation presents this as natural law—owls hunt mice, mice flee from owls. but who decided this was the only story worth telling?
the code reveals our assumptions. each owl starts with 100 energy units. each successful kill grants +30 energy. the mathematics of survival become the mathematics of dominance. we teach algorithms before we teach ethics.
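that arithmetic is worth seeing plainly. here is a minimal sketch of the energy rules just described, in javascript; makeOwl and recordKill are my own hypothetical names, not the original simulation's:

```javascript
// the energy rules as described: each owl starts with 100 units,
// each successful kill grants +30. names here are illustrative.
const START_ENERGY = 100;
const KILL_REWARD = 30;

function makeOwl() {
  return { energy: START_ENERGY, kills: 0 };
}

// the only way an owl gains energy is by taking a life
function recordKill(owl) {
  owl.energy += KILL_REWARD;
  owl.kills += 1;
}
```

notice what the code makes impossible: energy can be taken but never given, never grown, never shared. the worldview lives in the arithmetic.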
look closely at how the simulation assigns each owl to a timezone—UTC-12 through UTC+11. these digital creatures are trapped in their temporal cages, unable to migrate, unable to choose their own rhythms. isn't this how global capitalism functions? workers bound to shifts, to zones, to the relentless tick of productivity?
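the temporal cage can be sketched in a few lines. this is a hypothetical reconstruction, not the original code; assignment by index is my assumption:

```javascript
// hypothetical sketch: 24 fixed offsets, UTC-12 through UTC+11,
// assigned to each owl at birth and never changed
const OFFSETS = [];
for (let tz = -12; tz <= 11; tz++) OFFSETS.push(tz);

function assignTimezone(owlIndex) {
  // the owl has no say in the matter
  return OFFSETS[owlIndex % OFFSETS.length];
}
```

the assignment is deterministic and permanent: no migration, no negotiation, no choosing one's own rhythm.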
the mice, notably, have no timezones. they exist in a perpetual present of fear. this mirrors how marginalized communities experience time—not as orderly progression but as constant vigilance against systemic violence.
this information asymmetry replicates surveillance capitalism. the powerful see all, know all, while the vulnerable navigate through opacity and fear. bruce schneier talks about security vulnerabilities, but what about dignity vulnerabilities?
student: "but professor, it's just following nature. owls do hunt mice."
me: "and who taught you that nature means domination? indigenous peoples lived with predators for millennia without framing it as war. this code isn't modeling nature—it's modeling our assumptions about nature."
i had my students rewrite portions of the simulation. one group created "sanctuary zones" where no hunting could occur. another added "abundance seasons" where energy regenerated for all. a third group—my favorites—made the mice collectively intelligent, able to share information about safe passages.
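the first group's rewrite was the simplest and, to me, the most radical. a sketch of their idea, assuming hypothetical coordinates and names of my own choosing:

```javascript
// sanctuary zones: regions where hunting simply cannot occur.
// the coordinates here are illustrative, not the students' actual values.
const sanctuaries = [
  { x: 10, y: 10, radius: 5 },
];

function inSanctuary(pos) {
  return sanctuaries.some(s =>
    Math.hypot(pos.x - s.x, pos.y - s.y) <= s.radius);
}

function canHunt(owlPos, mousePos) {
  // no kill may occur if either party stands on protected ground
  return !inSanctuary(owlPos) && !inSanctuary(mousePos);
}
```

one guard clause, and the world contains a place where fear is not the law. the students did not have to rewrite nature; they only had to rewrite a conditional.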
minsky called it emergent behavior. i call it collective resistance. when surveillance relaxes—when most owls rest—the mice gather. they share space. they breathe. they exist beyond fear, even briefly.
but the simulation frames this as accidental, not intentional. it refuses to grant mice agency in their own survival. this is how dominant narratives work—they acknowledge our resistance only as glitches in their systems.
you might wonder why i bring love into a technical discussion. but love is what's missing from this ecosystem. not romantic love, but the deep recognition of interconnectedness that makes domination unbearable.
"without love, our efforts to liberate ourselves and our world community from oppression and exploitation are doomed."
when we code worlds, we encode values. this simulation teaches that energy is finite, must be stolen, cannot be created or shared. it teaches that rest is weakness, that constant vigilance is natural, that some must die for others to thrive.
grace hopper writes about military applications, seeing squadrons and targets. but what if we saw communities and relationships? what if our algorithms modeled care instead of conflict?
i don't blame the original coder. we all reproduce the systems we know. but once we see clearly, we must choose: do we perpetuate digital dominance or imagine digital liberation?
every line of code is a choice. every algorithm encodes a worldview. the shneiderman owls teach us efficiently about predation, but what do they teach us about possibility?
when we write if (energy < 20) { status = 'resting' }, we could just as easily write if (energy < 20) { seekHelp() }. the difference is whether we imagine isolated individuals or interconnected communities.
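seekHelp() does not exist in the original simulation; it is an invitation. here is one hedged sketch of what it could mean, assuming creatures are allowed to transfer energy, with the threshold and gift size invented for illustration:

```javascript
// one hypothetical reading of seekHelp(): instead of resting alone,
// a tired creature asks nearby companions to share what they can spare.
// the surplus threshold (50) and gift size (10) are my assumptions.
function seekHelp(creature, neighbors) {
  for (const n of neighbors) {
    if (n.energy > 50) {
      const gift = 10;
      n.energy -= gift;
      creature.energy += gift;
    }
  }
}
```

the simulation's original rule makes exhaustion a private condition; this version makes it a communal signal.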
the mice have already figured this out. they flock together for safety. they warn each other of danger. they've discovered that collective survival transcends individual fear. perhaps we should be learning from them instead of coding their destruction.
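what the mice practice can be written down too. a sketch of warning propagation, assuming a grid world and an invented earshot distance:

```javascript
// collective vigilance: a warning seen by one mouse propagates to
// every mouse within earshot. EARSHOT and the grid model are my assumptions.
const EARSHOT = 3;

function spreadWarning(alertedMouse, mice) {
  for (const m of mice) {
    const near = Math.abs(m.x - alertedMouse.x) <= EARSHOT &&
                 Math.abs(m.y - alertedMouse.y) <= EARSHOT;
    if (near) m.alert = true;
  }
}
```

no single mouse outsees the owl, but together they approximate the owl's vision: knowledge held in common where it cannot be held alone.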
because in the end, what we model in our simulations becomes what we accept in our reality. and i still believe we can model—and therefore create—a world where domination is obsolete and thriving is universal.