Captain Ted Kalnas (Fire Captain II & UAS Pilot, USAR Task Force 5 A Platoon, Los Angeles Fire Department)
Bill Schrier (Public Safety Consultant, Seattle Police Department)
Walt Magnussen (Director of the Internet2 Technology Evaluation Center, Texas A&M University)
Location: N259
Date: Monday, March 27
Time: 1:30 pm - 2:30 pm
Pass Type: All Access, Quick Pass Monday
Track: IoT, Smart-X and AI, Situational Awareness
Format: Panel Session
Vault Recording: TBD
For decades, the only way for responders to learn what was happening at an incident was to be physically at the location, even when doing so was hazardous. Today, various types of drones and robots can enter even the most dangerous areas via air, land or water, carrying multiple sensors that provide unprecedented levels of situational awareness, including video, photos, sounds and valuable environmental information, without putting the health of first responders at risk.
While drone usage has been highlighted in high-profile scenarios like the massive Notre Dame fire in France, drones also are increasingly being leveraged on a daily basis, with some jurisdictions sending drones to the locations of 911 callers to gather additional intelligence before officers arrive on the scene.
But challenges accompany the capabilities of drones and robots. Operationally, what are the best practices for controlling these assets, particularly when they are used in beyond-visual-line-of-sight (BVLOS) situations? What are the limits to the information that can be gathered? How can the data be processed quickly enough to drive actionable decisions?
From a policy perspective, what impact do geopolitical issues surrounding countries like China have on the cost and training needs associated with a first-responder drone program? Finally, should drones and robots be used only to gather situational-awareness information, or should they also be used to take action, including actions that involve the use of lethal force?