Agent Perception Range and Local Interaction Modeling in Crowd Evacuation

1. Problem Description

In crowd evacuation simulation, an individual's movement decisions depend on perception of the surrounding environment (other pedestrians, obstacles, exit locations, and so on). However, perceptual capabilities are limited (e.g., field of view, attention allocation), and the perception range may vary among individuals due to physiological conditions, differences in experience, or environmental interference. The core modeling question is how to represent an agent's perception range and local interaction mechanisms so that the model both reflects real behavioral characteristics and avoids an explosion in computational cost. This is one of the central problems in evacuation simulation.


2. Key Parameters for Perception Range Modeling

An agent's perception range is typically defined by the following parameters:

  1. Perception Radius (r): The maximum perception distance centered on the agent; the perceived region is typically circular or sector-shaped (a sector captures the directionality of gaze).
  2. Field of View Angle (θ): Reflects the width of the visual field (e.g., human horizontal field of view is approximately 200°).
  3. Perception Priority: Agents may prioritize attention to specific objects (e.g., exit signs, dense crowds, anomalous movements).
  4. Information Attenuation: The reliability of perceptual information decreases with increasing distance or more obstacle occlusion.
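The four parameters above can be collected into a small structure. The sketch below is a hypothetical Python container: the names `radius`, `fov`, and `decay`, and the exponential reliability falloff, are illustrative assumptions rather than definitions from the text.

```python
import math
from dataclasses import dataclass

@dataclass
class Perception:
    """Illustrative bundle of an agent's perception parameters."""
    radius: float = 5.0                  # r: maximum perception distance (m)
    fov: float = math.radians(200)       # theta: field-of-view angle (rad)
    decay: float = 0.3                   # assumed reliability decay per meter

    def reliability(self, distance: float) -> float:
        """Information reliability attenuates with distance; zero beyond r."""
        if distance > self.radius:
            return 0.0
        return math.exp(-self.decay * distance)
```

Any monotonically decreasing function would serve for attenuation; the exponential form here is simply a common, easily calibrated choice.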

3. Common Modeling Methods for Local Interaction

(1) Geometry-Based Perception Models

  • Circular Perception Model: The simplest approach, where all objects within the agent's perception range are treated equally.
    • Disadvantage: Cannot reflect the directionality of the field of view, potentially overestimating rearward perception capability.
  • Sector Perception Model: Combines perception radius and field of view angle, better aligning with human visual characteristics.
    • Implementation: Determines if a target is within the sector area via geometric calculations.

(2) Physics-Based Occlusion Handling

  • Detects line-of-sight occlusion via Ray Casting or Ray Tracing:
    • If an obstacle exists between the agent and the target, the target is invisible.
    • Examples: Wall occlusion, partial occlusion by crowds.
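A minimal ray-casting visibility test can be sketched as a segment–segment intersection check. A simplifying assumption here: obstacles are straight wall segments, and a sight line that merely grazes an endpoint still counts as visible.

```python
def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(q1, q2, p1)
    d2 = cross(q1, q2, p2)
    d3 = cross(p1, p2, q1)
    d4 = cross(p1, p2, q2)
    # Endpoints lie on opposite sides of each other's segment.
    return d1 * d2 < 0 and d3 * d4 < 0

def is_visible(agent, target, wall_segments):
    """Target is visible if the sight line hits no wall segment."""
    return not any(
        segments_intersect(agent, target, w0, w1) for w0, w1 in wall_segments
    )
```

In practice the same routine also approximates partial crowd occlusion if nearby bodies are modeled as short segments or discs.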

(3) Attention-Based Dynamic Filtering

  • Agents dynamically adjust their perceptual focus based on current state:
    • Panic State: May pay more attention to the movement direction of others while ignoring exit signs.
    • Herd Behavior: Prioritizes perceiving the movement trends of the majority.

4. Design of Local Interaction Rules

Perceptual information needs to be translated into behavioral decisions. Common interaction logic includes:

  1. Collision Avoidance Rule: Uses the positions and velocities of other agents within the perception range to calculate collision risk and adjust direction.
  2. Following Behavior: If the movement direction of an agent ahead aligns with one's own goal, one may choose to follow to reduce decision-making cost.
  3. Crowd Avoidance: Actively detours or slows down when perceiving local density exceeding a threshold.
  4. Information Transmission: Influences decisions of nearby agents by transmitting emergency information (e.g., "exit closed") via auditory or visual means.
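Rule 1 relies on the positions and velocities of perceived neighbors. One common ingredient, sketched below under the assumption of constant velocities, is the time of closest approach between two agents; a small, non-negative value signals imminent collision risk.

```python
def time_to_closest_approach(pi, vi, pj, vj):
    """Time at which two constant-velocity agents are closest (>= 0)."""
    rx, ry = pj[0] - pi[0], pj[1] - pi[1]   # relative position
    vx, vy = vj[0] - vi[0], vj[1] - vi[1]   # relative velocity
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        return 0.0                          # same velocity: distance constant
    t = -(rx * vx + ry * vy) / v2
    return max(t, 0.0)                      # approaches in the past pose no risk
```

A full avoidance rule would compare the separation at that time against the agents' combined radii before adjusting direction.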

5. Model Implementation Steps (Taking Sector Perception as an Example)

Step 1: Determine Agent State

  • Input: Agent position \((x_i, y_i)\), orientation angle \(\phi_i\), visual field parameters \((r, \theta)\).

Step 2: Filter Objects Within Perception Range

  • For another agent or obstacle \(j\):
    • Calculate distance \(d_{ij} = \sqrt{(x_j - x_i)^2 + (y_j - y_i)^2}\).
    • If \(d_{ij} > r\), ignore object \(j\).
    • Calculate relative angle \(\alpha_{ij} = \text{atan2}(y_j - y_i, x_j - x_i)\).
    • Normalize the angular difference \(\alpha_{ij} - \phi_i\) to \([-\pi, \pi]\); if its absolute value exceeds \(\theta/2\), ignore object \(j\) (outside the field of view angle).
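Steps 1 and 2 can be sketched as follows. Note that the angular difference must be wrapped into \([-\pi, \pi]\) before comparison; without wrapping, agents oriented near \(\pm\pi\) misjudge their field of view.

```python
import math

def in_sector(xi, yi, phi, r, theta, xj, yj):
    """True if point j lies inside agent i's sector field of view."""
    dx, dy = xj - xi, yj - yi
    if math.hypot(dx, dy) > r:
        return False                       # beyond perception radius
    alpha = math.atan2(dy, dx)             # bearing of j as seen from i
    # Wrap the angular difference into [-pi, pi] before comparing.
    diff = (alpha - phi + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= theta / 2
```

For example, an agent at the origin facing \(\phi = 0\) with \(r = 5\) and \(\theta = 200°\) sees a point at \((3, 0)\) but not one directly behind it at \((-3, 0)\).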

Step 3: Handle Occlusion Effects

  • Cast a ray from agent \(i\) to object \(j\). If the ray intersects with an obstacle, object \(j\) is invisible.

Step 4: Information Weighting and Decision Making

  • Assign a weight to each visible object (e.g., higher weight for closer distance).
  • Comprehensively calculate the movement direction based on weights (e.g., superposition of collision avoidance vectors, attraction to the target point).
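Step 4 can be illustrated by superposing a unit attraction vector toward the goal with distance-weighted repulsion from visible neighbors. The linear weight \((r - d)/r\) used below is an illustrative choice, not a prescription from the text.

```python
import math

def steering_direction(agent_pos, goal_pos, visible_neighbors, r):
    """Combine goal attraction with distance-weighted avoidance vectors."""
    gx, gy = goal_pos[0] - agent_pos[0], goal_pos[1] - agent_pos[1]
    norm = math.hypot(gx, gy) or 1.0
    vx, vy = gx / norm, gy / norm          # unit attraction toward the goal
    for nx, ny in visible_neighbors:
        dx, dy = agent_pos[0] - nx, agent_pos[1] - ny
        d = math.hypot(dx, dy)
        if 0.0 < d < r:
            w = (r - d) / r                # closer neighbors weigh more
            vx += w * dx / d               # repulsion away from the neighbor
            vy += w * dy / d
    norm = math.hypot(vx, vy) or 1.0
    return vx / norm, vy / norm            # unit movement direction
```

With no visible neighbors the agent heads straight for the goal; a neighbor slightly off the direct path deflects the direction away from it.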

6. Challenges and Optimization Directions

  1. Computational Efficiency: Pairwise detection complexity in large-scale crowds is \(O(n^2)\). Spatial partitioning (e.g., quadtrees, grid indexing) is needed to accelerate queries.
  2. Behavioral Realism: Introduce psychological experimental data to calibrate perception parameters (e.g., field of view contraction in panic states).
  3. Dynamic Environment Adaptation: Perception range adapts to environmental changes (e.g., visual range shrinks in dense smoke, reliance on hearing increases).
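The grid indexing mentioned in point 1 can be sketched as follows. Assuming the cell side is at least the query radius \(r\), only the 3×3 block of cells around an agent needs to be scanned, so each neighbor query touches a bounded region instead of all \(n\) agents.

```python
from collections import defaultdict

def build_grid(positions, cell):
    """Hash each agent index into a square grid cell of side `cell`."""
    grid = defaultdict(list)
    for idx, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(idx)
    return grid

def neighbors_within(grid, positions, i, r, cell):
    """Exact neighbors of agent i within r; requires cell >= r."""
    x, y = positions[i]
    cx, cy = int(x // cell), int(y // cell)
    out = []
    for gx in (cx - 1, cx, cx + 1):        # 3x3 cell neighborhood
        for gy in (cy - 1, cy, cy + 1):
            for j in grid.get((gx, gy), []):
                if j != i and (positions[j][0] - x) ** 2 + (positions[j][1] - y) ** 2 <= r * r:
                    out.append(j)
    return out
```

The grid is rebuilt (or incrementally updated) each simulation step; a quadtree serves the same role when agent density is highly non-uniform.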

7. Application Example

In a fire evacuation simulation:

  • Agents' perception range is affected by smoke concentration, possibly allowing identification of only nearby exit signs.
  • Local interaction leads to "blind following": some agents follow the crowd because they cannot see the exit, even if the crowd direction is suboptimal.
  • By adjusting perception parameters, the effectiveness of different guidance strategies (e.g., audible and visual alarms) can be evaluated.

Through fine-grained modeling of perception and interaction, simulations can more realistically predict crowd dynamics, providing a basis for emergency management.