Moving Light: can light steer a crowd?

Moving Light is an unprecedented crowd-light interaction experiment performed in real life during the Glow light festival in Eindhoven (11-18 Nov. 2017).

Introduction

Light, with its variations in intensity, color and pattern, is one of the most basic and instantaneous forms of communication.
Emergency conditions, evacuation routes and the availability of services are quickly understood through, e.g., flashing and/or colored illuminated signage.

Dynamic illumination, smartly adapting to crowding conditions and infrastructural needs, is thus a natural candidate for steering walking pedestrians.

  • Which light stimuli are most effective at steering crowds?
  • Can light dimming outperform conventional signage (e.g. arrows)?
  • Does crowd density disrupt light-based steering?

Driven by these questions we created “Moving Light”, a crowd dynamics experiment and interactive exhibit, as part of the Glow light festival 2017 (11-18 Nov.).

#movingLightGlow 

The Moving Light exhibit is a 23 m x 6 m corridor crossed by about 20.000 visitors per day. After a one-minute interactive experience, visitors leave the exhibit by choosing on which side to bypass a central elongated obstacle.

In the absence of external stimuli we expect pedestrians to choose either side with 50:50 probability.

Can illuminated signage and uneven lighting sway this decision?
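Whether an observed split deviates significantly from 50:50 can be checked with a simple binomial test. A minimal sketch (the function name and the visitor counts below are purely hypothetical, not measured results):

```python
import math

def side_bias_zscore(n_left, n_right):
    """Z-score of an observed left/right split against the fair-coin
    (50:50) null hypothesis, via the normal approximation: under the
    null, n_left ~ Binomial(n, 1/2), so mean = n/2, std = sqrt(n/4)."""
    n = n_left + n_right
    return (n_left - n / 2) / math.sqrt(n / 4)

# Hypothetical counts: 5600 of 10000 visitors choose the left side.
z = side_bias_zscore(5600, 4400)  # z = 12.0: far beyond chance
```

A |z| above roughly 2-3 would already indicate a bias that is very unlikely under the fair-coin null.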

We share the design and very preliminary results at the ILIAD 17 conference in Eindhoven with the poster below.

See the high-resolution version on ResearchGate or contact us.


About Moving Light

Moving Light has been created by Alessandro Corbetta1, Maurice Donners2, Antal Haans3, Fedosja van der Heijden2, Matthijs Hoekstra3, Martijn Hultermans2, Ion Iuncu2, Werner Kroneman1, Timo LeJeune2, Bert Maas4, Sjoerd Mentink2, Randi Nuij3, Philip Ross5, Sara Schippers3, Dragan Sekulovski2, Federico Toschi1, Marius Trouwborst2, Sander van de Wijdeven2 and Walter Willaert2

  1.  Crowdflow Research Group, TU/e, Dept. of Applied Physics
  2.  Philips Lighting Research
  3.  Intelligent Lighting Institute, TU/e
  4.  Studio Lucifer, Eindhoven
  5.  Studio Philip Ross, Eindhoven

Moving Light has been further supported by

 

 

Glow 2017: the “Moving Light” exhibit/experiment is in development!

We’ll proudly present the “Moving Light” exhibit at the light festival GLOW2017. GLOW2017 is open between the 11th and the 18th of November and, as usual, its route spans the entire city (see map below).

As with INFLUX at GLOW2016, the exhibit will conceal a massive crowd dynamics experiment. We’ll investigate whether light stimuli can influence individuals’ turning decisions.
Stay tuned! We are getting ready…

 

 


Dataset on diluted pedestrian dynamics is available for download

Our dataset on diluted pedestrian dynamics is available for download from the 4TU datacentrum repository at

    https://doi.org/10.4121/uuid:25289586-4fda-4931-8904-d63efe4aa0b8 

The dataset has been employed in our previous publication

[1] A. Corbetta, C. Lee, R. Benzi, A. Muntean, F. Toschi. Fluctuations around mean walking behaviours in diluted pedestrian flows. Phys. Rev. E. 95, 032316, 2017.

Basic scripts for using the dataset are available on GitHub at https://github.com/crowdflowTUe/MF_landing_data_analysis

 

Dataset description

This is a dataset of pedestrian trajectories recorded on a nearly 24/7 schedule in a landing in the Metaforum building at Eindhoven University of Technology.  The purpose of the dataset is to enable ensemble analyses of diluted pedestrian motion.

The data acquisition spanned over a year and, overall, we collected about 250.000 trajectories. Via an overhead Microsoft Kinect sensor we first obtained depth imaging, then we employed ad hoc localization algorithms and Particle Tracking Velocimetry to estimate the trajectory of individual heads (cf. [1]). The current dataset includes 20.000 trajectories from pedestrians walking undisturbed, i.e. in diluted conditions (see Figure). In other words, we considered individuals walking alone in the facility.

The dataset includes 10.000 trajectories of pedestrians crossing the landing entering from the left-hand side (file: “left-to-right.ssv”) and 10.000 trajectories of pedestrians entering from the opposite side (file: “right-to-left.ssv”; the left-right reference is defined as in [1]).
 
The trajectories are in the following table format: 
 
Pid Rstep X Y X_SG Y_SG U_SG V_SG
 
where:
 

  • Pid: unique identifier of a trajectory
  • Rstep: identifier of the timestep (starts from zero, the first 5 and last 5 samples are eliminated as typically less precise)
  • X,Y: position in Cartesian coordinates (in meters)
  • X_SG, Y_SG: position in Cartesian coordinates after Savitzky-Golay smoothing (in meters, cf. paper)
  • U_SG, V_SG: velocity in Cartesian coordinates after Savitzky-Golay smoothing (in meters per second, cf. paper). 
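Assuming the .ssv files contain plain whitespace-separated rows in the column order above (and no header row; skip the first line if your copy has one), they can be parsed with the Python standard library alone, e.g.:

```python
COLUMNS = ["Pid", "Rstep", "X", "Y", "X_SG", "Y_SG", "U_SG", "V_SG"]

def load_trajectories(lines):
    """Parse whitespace-separated trajectory rows into a dict
    mapping Pid -> list of row dicts, one per time step."""
    trajectories = {}
    for line in lines:
        parts = line.split()
        if len(parts) != len(COLUMNS):
            continue  # skip blank or malformed lines
        row = dict(zip(COLUMNS, parts))
        pid = int(row["Pid"])
        row["Pid"], row["Rstep"] = pid, int(row["Rstep"])
        for key in COLUMNS[2:]:  # positions and velocities are floats
            row[key] = float(row[key])
        trajectories.setdefault(pid, []).append(row)
    return trajectories

# Usage:
#   with open("left-to-right.ssv") as f:
#       trajectories = load_trajectories(f)
```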

To use the dataset please cite [1] as well as this dataset (DOI: https://doi.org/10.4121/uuid:25289586-4fda-4931-8904-d63efe4aa0b8).

[UPDATE] Glow 2017: looking for student assistants for large crowd management experiment

TU/e and Philips Lighting Research are preparing an unprecedented crowd management experiment to be conducted at the Glow light festival in November 2017 in Eindhoven. The Crowdflow Research Group, TU/e, has developed a unique pedestrian tracking system based on overhead Kinect sensors and custom processing software. This set-up will be deployed at Glow to record and analyse a flow of several hundred thousand people. We are looking for student assistants to help set up the system for Glow and get it operational. If you are pragmatic, proficient in Python, web and Linux programming, and can work under the pressure of a big festival, please contact Alessandro Corbetta (a.corbetta@tue.nl).

Minimum required commitment is one day per week.


You can read about our experience at last year’s GLOW here.

 

 

Presenting @ AVSS 17, Tuesday 29th of August

On Tuesday the 29th of August I’ll be in Lecce at the “14th IEEE International Conference on Advanced Video and Signal-Based Surveillance” presenting our work 

  • Weakly supervised training of deep convolutional neural networks for overhead pedestrian localization in depth fields
  • A. Corbetta, V. Menkovski, F. Toschi
  • Link

The presentation will be at the “Signal Processing for Understanding Crowd Dynamics” workshop. 

Come by if you are around!

 

MSc final projects available!

We are looking for enthusiastic MSc students in Physics or Applied Mathematics for final projects in statistical crowd dynamics.

The projects, developed within the group of Prof. F. Toschi (WDY), involve an exciting mixture of fundamental physics research and technological development. They are aimed at IT & computing enthusiasts willing to tackle state-of-the-art research challenges.

For further information please contact us via email.

Dense crowd dynamics analysis: from agglomerative clustering to deep learning

We employ clustering algorithms to isolate individual pedestrians in Kinect 3D depth maps. High crowd densities or unusual body shapes increase the error rate of this approach. Deep-learning-based image recognition techniques have shown promising results in the analysis of dense crowds. A systematic development of such techniques in our specific context, for real-time and offline data analysis, is a natural next step. The main objectives of this project are:

  • building datasets from our extensive recordings at Eindhoven station and/or at the Naturalis museum for systematic benchmarking;
  • formulating heuristics to improve the performance of current agglomerative clustering approaches;
  • employing deep learning image recognition techniques such as Faster R-CNN to reduce the error rate and classify unusual detections (children, bikes, …).

Keywords: dense crowd dynamics | agglomerative clustering | GPU/CUDA-based deep learning | convolutional nets | parallel computing.
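As a toy stand-in for the agglomerative step (points, threshold, and names are all hypothetical, not our production pipeline), single-linkage grouping of 2D head-candidate points can be sketched with a union-find pass:

```python
import math

def cluster_points(points, max_dist):
    """Single-linkage agglomerative grouping: points closer than
    max_dist end up in the same cluster. Toy illustration of
    grouping foreground depth-map pixels into pedestrians."""
    parent = list(range(len(points)))

    def find(i):  # union-find root with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Merge every pair of points within the linkage distance.
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= max_dist:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())
```

At high densities, neighboring pedestrians fall within the linkage distance and merge into one cluster, which is exactly the failure mode the project targets.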

Far-range crowd dynamics

Microsoft Kinect sensors are quite limited in depth range (~6 m) and produce depth maps at VGA resolution. Recent depth cameras, such as the Zed by Stereolabs, deliver FullHD depth maps with a range of up to 20 m, even in sunlight. These specifications enable more flexible outdoor crowd tracking setups with fewer sensors, possibly not strictly overhead. The main objectives of this project are:

  • integrate Zed sensors into our current real-time crowd tracking environment;
  • devise algorithms for non-overhead crowd dynamics analysis;
  • pursue real-time crowd analyses in outdoor crowded scenes.

Successful Zed sensor setups will upgrade our crowd tracking system in Stratumseind.
Keywords: outdoor crowd analysis | Zed stereo cameras | GPU-based depth maps processing | projective geometry | parallel computing.
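For non-overhead setups, the basic geometric ingredient is back-projecting a depth pixel into camera-frame coordinates via the standard pinhole model. A sketch (the intrinsics fx, fy, cx, cy below are hypothetical placeholders, not calibrated values):

```python
def depth_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth (meters) to 3D
    camera-frame coordinates using the pinhole camera model.
    fx, fy: focal lengths in pixels; (cx, cy): principal point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point maps straight onto the optical axis:
p = depth_to_point(320, 240, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
# p == (0.0, 0.0, 2.0)
```

A further extrinsic rotation/translation then maps these camera-frame points to floor coordinates for a tilted, non-overhead mount.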

Crowd flux analytics

Robust estimates of pedestrian fluxes are paramount in any pedestrian facility to assess, e.g., current occupancy. Reliable automatic estimates are a challenge at high densities or in the presence of, e.g., school groups of children. We expect our real-time pedestrian tracking to improve state-of-the-art flux assessments, with both scientific and technological aims. The main objectives of this project are:

  • devise algorithms for robust real-time crowd flux estimation from noisy Kinect-based tracking;
  • analyze and model real-life stochastic pedestrian arrival processes;
  • devise heuristics to detect irregular/rare/dangerous events from tracking.

This project involves fast-paced testing and deployment in our measurement locations, among others the Naturalis museum in Leiden.

Keywords: arrival processes | flux assessment from noisy tracking | real time data analysis.
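As a toy illustration of flux estimation from tracking (names and values hypothetical): counting directed crossings of a virtual line and reporting the net count makes the estimate robust to tracking noise, since jitter around the line produces crossing pairs that cancel.

```python
def count_crossings(xs, x_line):
    """Count directed crossings of the virtual line x = x_line for one
    trajectory given as a sequence of x positions. Returns (rightward,
    leftward) counts; the net flux contribution rightward - leftward
    cancels spurious back-and-forth jitter around the line."""
    rightward = leftward = 0
    for x0, x1 in zip(xs, xs[1:]):
        if x0 < x_line <= x1:
            rightward += 1
        elif x1 < x_line <= x0:
            leftward += 1
    return rightward, leftward

# A noisy track jitters across the line but contributes net flux 1:
r, l = count_crossings([0.0, 2.0, 1.0, 2.0, 3.0], x_line=1.5)
# r - l == 1
```

Summing net crossings over all tracked trajectories per time window yields a flux estimate; modeling the resulting counts as a stochastic arrival process is the second objective above.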

Dense and diluted statistical crowd dynamics on a wide corridor

In the period September 2014-April 2015 we performed 24/7 crowd tracking at Eindhoven train station aiming at an unprecedented crowd data collection for statistical analysis. Such measurements are enabling data-driven particle-based models at all density levels.
The main objectives of this project are:

  • map the dataset for homogeneous traffic conditions employing both typical observables and pattern matching;
  • improve current particle-based models for pedestrian dynamics to reproduce (statistically) the observed dynamics;
  • devise heuristics to detect irregular/rare/dangerous events from tracking.

Keywords: statistical mechanics | mathematical modeling | large-scale data analysis | parallel computing.
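To fix ideas on what “particle-based model” means here, a minimal social-force-style update can be sketched as below (all parameters and names are hypothetical illustrations; the actual project calibrates such models on the measured data):

```python
import math

def social_force_step(pos, vel, desired_vel, neighbors, dt=0.05,
                      tau=0.5, A=2.0, B=0.3):
    """One explicit Euler step of a minimal social-force-style model:
    relaxation toward a desired velocity on time scale tau, plus
    exponentially decaying repulsion from neighboring pedestrians."""
    ax = (desired_vel[0] - vel[0]) / tau
    ay = (desired_vel[1] - vel[1]) / tau
    for nx, ny in neighbors:
        dx, dy = pos[0] - nx, pos[1] - ny
        d = math.hypot(dx, dy) or 1e-9  # avoid division by zero
        f = A * math.exp(-d / B)        # repulsion magnitude
        ax += f * dx / d
        ay += f * dy / d
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel
```

The project's goal is then to tune and extend such dynamics so that simulated ensembles reproduce the fluctuation statistics observed in the station data, at all density levels.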
