Moving Light: can light steer a crowd?

Moving Light is an unprecedented real-life crowd-light interaction experiment, performed during the Glow light festival in Eindhoven (11-18 Nov. 2017).

Introduction

Light, with its variations in intensity, color and pattern, is among the most basic and instantaneous forms of communication.
Emergency conditions, evacuation routes, and the availability of services are quickly conveyed through, e.g., flashing and/or colored illuminated signage.

Dynamic illumination, smartly adapting to crowding conditions and infrastructural needs, is thus a natural candidate for steering walking pedestrians.

  • Which light stimuli are most effective at steering crowds?
  • Can light dimming outperform conventional signage (e.g. arrows)?
  • Does crowd density disrupt light-based steering?
Driven by these questions we created “Moving Light”, a crowd dynamics experiment and interactive exhibit, as part of the Glow light festival 2017 (11-18 Nov.).

#movingLightGlow 

The Moving Light exhibit is a 23 m x 6 m corridor crossed by about 20,000 visitors per day. After a one-minute interactive experience, visitors leave the exhibit by choosing on which side to bypass a central elongated obstacle.

In the absence of external stimuli, we expect pedestrians to choose either side with 50:50 probability.

Can illuminated signage and uneven lighting sway this decision?
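Whether an observed side split deviates significantly from 50:50 can be checked with an exact two-sided binomial test. A minimal self-contained sketch, using only the Python standard library and invented counts (620 of 1,000 hypothetical visitors passing on one side; these numbers are purely illustrative, not results of the experiment):

```python
from math import comb

def two_sided_binom_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: p-value for k successes in n trials."""
    pk = comb(n, k) * p**k * (1 - p)**(n - k)
    # sum the probabilities of all outcomes at most as likely as the observed one
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n + 1)
               if comb(n, i) * p**i * (1 - p)**(n - i) <= pk * (1 + 1e-12))

# Hypothetical counts: 620 of 1000 visitors pass on the left
p_value = two_sided_binom_p(620, 1000)
print(p_value)  # far below 0.05: strong evidence against a 50:50 split
```

With thousands of visitors per day, even a few-percent bias induced by lighting would show up as a vanishingly small p-value.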

We share the design and very preliminary results at the ILIAD 17 conference in Eindhoven with the poster below.

See the high-resolution version on ResearchGate or contact us.


About Moving Light

Moving Light has been created by Alessandro Corbetta1, Maurice Donners2, Antal Haans3, Fedosja van der Heijden2, Matthijs Hoekstra3, Martijn Hultermans2, Ion Iuncu2, Werner Kroneman1, Timo LeJeune2, Bert Maas4, Sjoerd Mentink2, Randi Nuij3, Philip Ross5, Sara Schippers3, Dragan Sekulovski2, Federico Toschi1, Marius Trouwborst2, Sander van de Wijdeven2 and Walter Willaert2

  1.  Crowdflow Research Group, TU/e, Dept. of Applied Physics
  2.  Philips Lighting Research
  3.  Intelligent Lighting Institute, TU/e
  4.  Studio Lucifer, Eindhoven
  5.  Studio Philip Ross, Eindhoven

Moving Light has been further supported by

 

 

Dataset on diluted pedestrian dynamics is available for download

Our dataset on diluted pedestrian dynamics is available for download at the 4TU datacentrum repository at 

    https://doi.org/10.4121/uuid:25289586-4fda-4931-8904-d63efe4aa0b8 

The dataset has been employed in our previous publication

[1] A. Corbetta, C. Lee, R. Benzi, A. Muntean, F. Toschi. Fluctuations around mean walking behaviours in diluted pedestrian flows. Phys. Rev. E. 95, 032316, 2017.

Basic scripts for using the dataset are available on GitHub at https://github.com/crowdflowTUe/MF_landing_data_analysis

 

Dataset description

This is a dataset of pedestrian trajectories recorded on a nearly 24/7 schedule in a landing in the Metaforum building at Eindhoven University of Technology.  The purpose of the dataset is to enable ensemble analyses of diluted pedestrian motion.

The data acquisition spanned over a year and, overall, we collected about 250,000 trajectories. Using an overhead Microsoft Kinect sensor we first obtained depth images; we then employed ad hoc localization algorithms and Particle Tracking Velocimetry to estimate the trajectories of individual heads (cf. [1]). The current dataset includes 20,000 trajectories of pedestrians walking undisturbed, i.e. in diluted conditions (see Figure); in other words, individuals walking alone in the facility.

The dataset includes 10,000 trajectories of pedestrians crossing the landing entering from the left-hand side (file: “left-to-right.ssv”) and 10,000 trajectories of pedestrians entering from the opposite side (file: “right-to-left.ssv”; the left-right reference follows [1]).
 
The trajectories are in the following table format: 
 
Pid Rstep X Y X_SG Y_SG U_SG V_SG
 
where:
 

  • Pid: unique identifier of a trajectory
  • Rstep: identifier of the time step (starting from zero; the first 5 and last 5 samples are discarded as they are typically less precise)
  • X,Y: position in Cartesian coordinates (in meters)
  • X_SG, Y_SG: position in Cartesian coordinates after Savitzky-Golay smoothing (in meters, cf. paper)
  • U_SG, V_SG: velocity in Cartesian coordinates after Savitzky-Golay smoothing (in meters per second, cf. paper).
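A minimal sketch of reading rows in the documented column format. The sample rows below are invented for illustration, and we assume whitespace-separated values with no header line, as the .ssv extension suggests; for real use, point np.loadtxt at “left-to-right.ssv” or “right-to-left.ssv” instead:

```python
import io
import numpy as np

COLUMNS = ["Pid", "Rstep", "X", "Y", "X_SG", "Y_SG", "U_SG", "V_SG"]

# Invented sample rows in the documented 8-column format
sample = io.StringIO(
    "1 0 0.10 2.00 0.11 2.01 1.30 0.02\n"
    "1 1 0.18 2.00 0.19 2.01 1.31 0.01\n"
    "2 0 0.05 1.50 0.06 1.51 1.25 -0.03\n"
)

data = np.loadtxt(sample)                # shape: (n_rows, 8)
u = data[:, COLUMNS.index("U_SG")]       # smoothed velocity components (m/s)
v = data[:, COLUMNS.index("V_SG")]
speed = np.hypot(u, v)                   # walking speed magnitude in m/s
print(speed.round(2))
```

Grouping rows by Pid then recovers individual trajectories for ensemble statistics.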

To use the dataset, please cite [1] as well as the dataset itself (DOI: https://doi.org/10.4121/uuid:25289586-4fda-4931-8904-d63efe4aa0b8).
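As a rough illustration of how Savitzky-Golay smoothing turns noisy positions into the smoothed velocities stored in the U_SG/V_SG columns, one can use scipy's savgol_filter with deriv=1. The window length, polynomial order, sampling rate, and synthetic trajectory below are illustrative guesses, not the parameters used in [1]:

```python
import numpy as np
from scipy.signal import savgol_filter

dt = 1.0 / 15.0                      # assumed sampling interval (illustrative)
t = np.arange(60) * dt
rng = np.random.default_rng(0)
# synthetic noisy x-position of a pedestrian walking at 1.3 m/s
x = 1.3 * t + 0.01 * rng.normal(size=t.size)

# smoothed position (the X_SG analogue) and smoothed velocity (the U_SG analogue)
x_sg = savgol_filter(x, window_length=11, polyorder=2)
u_sg = savgol_filter(x, window_length=11, polyorder=2, deriv=1, delta=dt)

print(round(float(u_sg[30]), 2))     # close to the true 1.3 m/s
```

Differentiating inside the smoothing filter (deriv=1) avoids the noise amplification of naive finite differences on the raw positions.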