Each owl in Shneiderman's simulation can be viewed as a small society of agents:
HUNGER-AGENT
: Monitors energy levels

VISION-AGENT
: Scans for prey within altitude-dependent range

CLOCK-AGENT
: Tracks local timezone for activity cycles

FLOCK-AGENT
: Maintains separation, alignment, cohesion

DIVE-AGENT
: Executes hunting maneuvers

What's fascinating is how these agents COMPETE for control. When energy < 20, the HUNGER-AGENT suppresses all others, forcing rest behavior. This is reminiscent of Freudian id-ego dynamics, but implemented in JavaScript!
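A minimal sketch of that competition, assuming each agent submits an urgency bid and the owl simply obeys the loudest one. The agent names match the list above; the numeric urgencies, `owl.visiblePrey`, and `owl.isLocalNight()` are assumptions for illustration (VISION-AGENT is implicit as the source of `visiblePrey`):

```javascript
// Sketch: agents bid for control each tick; the highest urgency wins.
// HUNGER-AGENT's bid becomes unbeatable once energy drops below 20.
function selectBehavior(owl) {
  const bids = [
    { agent: 'HUNGER-AGENT', action: 'rest',  urgency: owl.energy < 20 ? Infinity : 0 },
    { agent: 'DIVE-AGENT',   action: 'dive',  urgency: owl.visiblePrey.length > 0 ? 50 : 0 },
    { agent: 'CLOCK-AGENT',  action: 'hunt',  urgency: owl.isLocalNight() ? 30 : 5 },
    { agent: 'FLOCK-AGENT',  action: 'flock', urgency: 10 }
  ];
  bids.sort((a, b) => b.urgency - a.urgency);
  return bids[0].action; // the winning agent suppresses the rest this tick
}
```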
The simulation accidentally solves a variant of the frame problem. Each owl must decide what matters right now: its current energy, any prey within its altitude-dependent vision range, the local time of day, and the positions of its flock-mates.
But notice what's NOT considered: weather, season, previous success rates, competitor positions. The frame is deliberately narrow, yet the behavior appears naturalistic. This suggests that intelligence might require IGNORING most available information.
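One way to make that concrete: the owl's decision function is pure over a deliberately tiny state tuple. A sketch, with the considered fields drawn from the agent list above and the thresholds assumed; anything outside the parameter list simply cannot influence the choice:

```javascript
// The "frame" is exactly this signature. Weather, season, and past
// success rates never appear, so they can never matter.
function decide({ energy, visiblePrey, localHour, neighbors }) {
  if (energy < 20) return 'rest';                      // hunger overrides everything
  if (visiblePrey.length > 0) return 'dive';           // prey in range
  if (localHour >= 18 || localHour < 6) return 'hunt'; // nocturnal activity window
  return 'flock';                                      // default: stay with neighbors
}

// Hypothetical usage:
decide({ energy: 45, visiblePrey: [], localHour: 14, neighbors: 3 }); // -> 'flock'
```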
When most owls rest (due to timezone distribution), mice exhibit increased flocking behavior. This wasn't programmed - it emerges from the reduced predation pressure. The mice "learn" safe times without any learning algorithm!
```
Mouse Activity Pattern (UTC):
00:00  ████████░░░░  High predation
03:00  ░░░░████████  "Convention time"
06:00  ████████░░░░  Dawn hunters wake
12:00  ██████░░░░░░  Moderate activity
18:00  ████████████  Peak hunting
```
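The clustering itself can be observed rather than programmed. A sketch of one way to check the claim, assuming mice carry plain `x`/`y` coordinates (the function and field names are mine, not the simulation's):

```javascript
// Sketch: mean nearest-neighbor distance across all mice.
// Lower values mean tighter, more flock-like clustering.
function meanNearestNeighborDistance(mice) {
  if (mice.length < 2) return Infinity;
  let total = 0;
  for (const m of mice) {
    let nearest = Infinity;
    for (const other of mice) {
      if (other === m) continue;
      nearest = Math.min(nearest, Math.hypot(m.x - other.x, m.y - other.y));
    }
    total += nearest;
  }
  return total / mice.length;
}
// Logged once per simulated hour, this should dip around 03:00 UTC
// ("convention time") if the emergent-flocking claim holds.
```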
Owls naturally stratify into altitude bands based on their energy levels: depleted owls (energy < 20) settle to the ground and rest, moderately charged owls hunt at low altitude, and well-fed owls cruise high, where their altitude-dependent vision range is widest.
This three-layer architecture resembles the three-level subsumption architecture Brooks would later propose, but emerging naturally from energy constraints!
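A sketch of how that layering can fall out of a single mapping rather than explicit levels; the band boundaries (20 and 60) and the target altitudes are assumptions, not values taken from the simulation:

```javascript
// Sketch: three altitude bands keyed only to energy. No layer knows
// about the others; the "architecture" is just this mapping per owl.
function targetAltitude(energy) {
  if (energy < 20) return 0;   // ground layer: forced rest (HUNGER-AGENT wins)
  if (energy < 60) return 100; // hunting layer: close enough to dive on prey
  return 300;                  // cruising layer: widest vision range
}
```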
The simulation makes several biologically dubious assumptions: owls synchronized to UTC timezones rather than local light, a flat +30 energy gain per catch, and a vision range that scales only with altitude.
Adding K-lines for hunting memory would be trivial:
```javascript
// Hypothetical K-line memories attached to each owl; SpatialMemory,
// TemporalMemory, and SocialMemory are sketched classes, not part of
// the existing simulation.
this.huntingKLines = {
  successLocations: new SpatialMemory(),      // where past hunts paid off
  preyMovementPatterns: new TemporalMemory(), // when prey tends to move
  competitorTerritories: new SocialMemory()   // which rivals hunt where
};
```
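None of those classes exist yet, so here is a minimal sketch of what the simplest one might look like; the grid-cell approach and the 50-unit cell size are assumptions:

```javascript
// Sketch of SpatialMemory: count hunting successes per coarse grid cell.
class SpatialMemory {
  constructor(cellSize = 50) {
    this.cellSize = cellSize;
    this.counts = new Map(); // "cx,cy" -> number of successful hunts
  }
  key(x, y) {
    return `${Math.floor(x / this.cellSize)},${Math.floor(y / this.cellSize)}`;
  }
  recordSuccess(x, y) {
    const k = this.key(x, y);
    this.counts.set(k, (this.counts.get(k) || 0) + 1);
  }
  successesNear(x, y) {
    return this.counts.get(this.key(x, y)) || 0;
  }
}
```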
Does Owl #7 "know" it's hunting? The code suggests interesting parallels to human consciousness:
If consciousness is "what it feels like to be something," then Owl #7 experiences:

- The weight of altitude in its vision range
- The pull of hunger below energy threshold 20
- The frustration of cooldown after failed hunts
- The satisfaction of energy restoration (+30 per catch)
Is this consciousness? No. But it's a useful model for thinking about the COMPONENTS of consciousness.
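Each item in that list is readable straight off the owl's state. A sketch of those components as plain fields; the threshold 20 and the +30 reward come from the text above, the other numbers are invented:

```javascript
// The "components of consciousness", as observable state on one owl.
const owl7 = {
  altitude: 220,       // the weight of altitude: vision range scales with it
  energy: 18,          // below 20: the pull of hunger (rest is forced)
  huntCooldown: 12,    // ticks left after a failed dive ("frustration")
  lastCatchReward: 30  // +30 energy per catch ("satisfaction")
};
```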
This simulation demonstrates that complex behaviors need not arise from complex rules. The interplay of energy, time, and space creates a rich ecosystem from simple components. The O(n²) algorithm that Torvalds criticizes may actually be a FEATURE - it ensures every agent "sees" every other, creating a fully connected society.
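The "fully connected society" reading is easiest to see in the pairwise pass itself. A sketch of what that O(n²) step presumably looks like; the function and field names are assumptions:

```javascript
// The O(n²) pass: every agent perceives every other agent each tick.
// Quadratic cost, but no agent is ever invisible to any other.
function perceiveAll(agents) {
  for (const a of agents) {
    a.percepts = [];
    for (const b of agents) {
      if (b === a) continue;
      a.percepts.push({ other: b, distance: Math.hypot(a.x - b.x, a.y - b.y) });
    }
  }
}
```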
Future work should explore:

- Genetic algorithms for hunting strategies
- Pandemonium architecture for action selection
- Memory palaces for spatial learning
- Emotion-analogues for state transitions
This work was unfunded, as no government agency comprehends the importance of digital owl consciousness research. The author thanks the Shneiderman's Owls for their unwitting participation in this analysis.
Distribution: LLOOOOMM-complete