Firefly synchronization in nature relies on individuals visually perceiving one another's flashes; however, because fireflies tend to remain near tall grasses, bushes, or trees, their flashes may be obstructed by such flora. Visual obstructions between fireflies affect which flashes an individual can see and, in turn, how the swarm synchronizes. This raises the question of how extended reality (XR) technologies can be applied to the modeling of firefly synchronization, and what advantages they could provide. Sync is an XR experience in which virtual firefly agents interact with the viewers’ immediate surroundings. These agents move around the space, ‘land’ on surfaces, and use a custom iteration of the integrate-and-fire and Kuramoto models to synchronize their flashes. By spatially mapping the viewers’ environment and raycasting between agents to determine sightlines, physical objects become digital obstructions that can visually separate groups of fireflies; much like foliage or other opaque objects in nature, these obstructions alter how the virtual groups synchronize. This work is being developed in conjunction with researchers at the Peleg Lab, with the aim of recreating the patterns observed in natural fireflies and incorporating three-dimensionality and visual occlusion into the testing of current synchronization models.
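
The sketch below (Python with NumPy) is a minimal, hypothetical illustration of the kind of occlusion-gated synchronization described above: each virtual firefly advances a Kuramoto-style phase, couples only to the neighbors it can currently see, and registers a flash when its phase wraps, in the spirit of an integrate-and-fire reset. In the actual piece, visibility comes from raycasts against the spatially mapped room; here it is replaced by an assumed boolean matrix, and every name and parameter is illustrative rather than the project's implementation.

    # Hypothetical sketch: occlusion-gated Kuramoto-style synchronization.
    # Visibility stands in for per-agent raycasts against mapped geometry.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 20                                 # number of virtual fireflies
    omega = rng.normal(1.0, 0.05, N)       # natural flash frequencies (rad/s)
    theta = rng.uniform(0, 2 * np.pi, N)   # initial phases
    K = 1.5                                # coupling strength
    dt = 0.01                              # integration step (s)

    # Assumed visibility matrix: visible[i, j] is True when a raycast from
    # agent i to agent j would hit no obstruction. Here, two occluded groups.
    visible = np.zeros((N, N), dtype=bool)
    visible[:10, :10] = True
    visible[10:, 10:] = True
    np.fill_diagonal(visible, False)

    for step in range(5000):
        # Kuramoto coupling, summed only over neighbors each agent can see.
        diff = np.sin(theta[None, :] - theta[:, None])   # diff[i, j] = sin(theta_j - theta_i)
        n_vis = np.maximum(visible.sum(axis=1), 1)
        coupling = (K / n_vis) * np.where(visible, diff, 0.0).sum(axis=1)
        theta += (omega + coupling) * dt

        # Integrate-and-fire style flash: a phase crossing 2*pi counts as a
        # flash and resets (wraps) the phase.
        flashed = theta >= 2 * np.pi
        theta[flashed] -= 2 * np.pi

    # Each visually isolated group tends toward its own internal synchrony.
    for group in (slice(0, 10), slice(10, None)):
        r = np.abs(np.mean(np.exp(1j * theta[group])))
        print(f"group order parameter: {r:.2f}")

In this toy setup, the two mutually occluded groups each approach an order parameter near 1 while remaining free to drift apart from one another, which is the behavior the XR obstructions are meant to induce.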
