The project MediaFlies implements an interactive multi-agent system that incorporates flocking and synchronization to generate constantly changing visual output. It relies on prerecorded or live video material, which is fragmented and recombined through the agents’ activities. Users influence these visuals via video-based tracking, thereby controlling the degree to which the original material is disturbed. The project draws its inspiration from the biological phenomena of flocking and synchronization; simulations of these phenomena form the basis for the generative behavior of MediaFlies. In its current implementation, MediaFlies does not support audio. Future improvements will address this shortcoming and will also implement feedback mechanisms by which the agents’ behavior is influenced by qualitative properties of the media material itself.
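The flocking simulations mentioned above are not specified in detail here; as an illustrative sketch only, a boids-style update of the kind commonly used for such agent systems combines cohesion, alignment, and separation rules. All names and parameters below are assumptions, not the MediaFlies implementation:

```python
def flock_step(positions, velocities, dt=0.1, radius=1.0,
               w_coh=0.01, w_ali=0.05, w_sep=0.1):
    """One boids-style update step (illustrative sketch).

    positions, velocities: lists of (x, y) tuples.
    Returns the updated (positions, velocities).
    """
    new_vel = []
    for i, (px, py) in enumerate(positions):
        # Neighbours within the interaction radius.
        nbrs = [j for j, (qx, qy) in enumerate(positions)
                if j != i and (qx - px) ** 2 + (qy - py) ** 2 < radius ** 2]
        vx, vy = velocities[i]
        if nbrs:
            # Cohesion: steer toward the neighbours' centre of mass.
            cx = sum(positions[j][0] for j in nbrs) / len(nbrs)
            cy = sum(positions[j][1] for j in nbrs) / len(nbrs)
            vx += w_coh * (cx - px)
            vy += w_coh * (cy - py)
            # Alignment: match the neighbours' average velocity.
            ax = sum(velocities[j][0] for j in nbrs) / len(nbrs)
            ay = sum(velocities[j][1] for j in nbrs) / len(nbrs)
            vx += w_ali * (ax - vx)
            vy += w_ali * (ay - vy)
            # Separation: push away from neighbours that are too close.
            for j in nbrs:
                dx, dy = px - positions[j][0], py - positions[j][1]
                d2 = dx * dx + dy * dy + 1e-9
                if d2 < (radius / 2) ** 2:
                    vx += w_sep * dx / d2
                    vy += w_sep * dy / d2
        new_vel.append((vx, vy))
    new_pos = [(px + vx * dt, py + vy * dt)
               for (px, py), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel
```

In a system like the one described, each agent would additionally carry a fragment of the video material, so that the emergent motion of the flock recombines the imagery on screen.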