Idea
FlameFighterVR is a drone-assisted, immersive firefighting training and engagement platform. It addresses four persistent constraints in drills and live operations: crews rarely train in the exact places they serve, scenarios repeat and lose instructional value over time, practice often requires exposure to hazardous conditions, and budgets limit access to complex multi-agency exercises. FlameFighterVR delivers incident-realistic workflows from the first alarm to coordinated response, including rapid drone launch, aerial assessment, and team execution. City layouts, building envelopes, weather conditions, and fire behavior are rendered with high fidelity to pressure-test decision making and coordination.
EU space technologies
The simulator uses Copernicus satellite imagery to reconstruct realistic environments that match the streets and structures where users operate. Sentinel-2 Level-2A optical imagery provides surface reflectance at up to 10 m resolution for true-to-place textures and material cues. Sentinel-1 radar imagery improves mapping in cloud-prone regions and supports change detection for scenario updates. EU-DEM supplies elevation and slope for terrain-aware path planning and plume dynamics. Copernicus Atmosphere Monitoring Service (CAMS) fields inform wind, humidity, and smoke dispersion, so visibility and fire spread respond to local conditions. These datasets are orthorectified and tiled into georeferenced scene layers inside the engine, producing photoreal ground-truth backdrops, accurate heightmaps, and repeatable mission waypoints. The same layers guide autonomous drone routes, hotspot geo-tagging, and post-mission review, increasing both realism and training transfer.
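A minimal sketch of the tiling step described above, assuming the Copernicus layers have already been downloaded and reprojected to a common CRS as GeoTIFFs. The use of rasterio, the file name, the tile size, and the dictionary output format are illustrative assumptions, not the project's actual engine pipeline.

```python
# Sketch: cut a DEM GeoTIFF into square, georeferenced heightmap tiles
# that a game engine could stream. Assumes the raster is already
# orthorectified and reprojected; only full tiles are emitted.
import numpy as np
import rasterio
from rasterio.windows import Window, bounds

TILE = 512  # tile edge in pixels (assumed engine streaming size)

def tile_heightmap(dem_path: str):
    """Yield 16-bit heightmap tiles plus their georeferencing metadata."""
    with rasterio.open(dem_path) as src:
        for row in range(0, src.height - TILE + 1, TILE):
            for col in range(0, src.width - TILE + 1, TILE):
                window = Window(col, row, TILE, TILE)
                elevation = src.read(1, window=window).astype(np.float32)

                # Normalise to 16-bit so the engine can ingest the tile as a
                # standard heightmap texture; keep the range for rescaling.
                lo, hi = float(elevation.min()), float(elevation.max())
                scale = 65535.0 / (hi - lo) if hi > lo else 0.0
                height16 = ((elevation - lo) * scale).astype(np.uint16)

                yield {
                    "tile": height16,
                    # Map-coordinate bounds: used for waypoints and geo-tags.
                    "bounds": bounds(window, src.transform),
                    "elevation_range": (lo, hi),
                    # Affine transform from tile pixels back to map coordinates.
                    "transform": src.window_transform(window),
                }

# Example usage with a hypothetical EU-DEM extract:
# for t in tile_heightmap("eu_dem_extract.tif"):
#     print(t["bounds"], t["elevation_range"])
```

The same windowed-read pattern would apply to the Sentinel-2 texture layers, with the tile bounds providing the shared georeference that keeps imagery, heightmaps, and drone waypoints aligned.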
EU Space for Consumer Experience
The selected challenge is "Immersive gameplay with space technology and data". Copernicus-powered scenes turn professional training into missions that feel alive, local, and consequential. Users choose a city, load recent satellite layers, and face evolving fire fronts and weather that mirror reality. Success depends on reading actual wind corridors, terrain constraints, and access limitations rather than scripted patterns. This raises engagement for the public, improves preparedness for volunteers, and strengthens confidence and performance for professionals.
Team:
Labeat Matoshi, VR developer. Responsible for product vision, core gameplay systems, and systems integration.
Kastriot Parduzi. Responsible for art direction, graphic design, 3D assets, and scene quality based on Copernicus textures.
Visar Dabiqaj. Responsible for pitching and for managing real-life simulations through gamification of space technology.