The US Department of Defense (DoD) has launched the Defense Innovation Unit's (DIU) xView2 Challenge, seeking AI innovations for disaster relief that can assess damage automatically from the sky. The challenge calls for machine learning algorithms that can quickly assess building damage from overhead imagery following natural disasters, speeding up the crucial search and rescue process.
The competition follows on from xView1, which sought computer vision algorithms to locate objects on the ground that are useful for search and rescue. DIU has led a team from academia and industry to create a building damage dataset for post-disaster assessment, which forms the launch point of the new contest.
The largest and most diverse annotated building damage dataset to date, xBD allows ML/AI practitioners to build and test models that help automate building damage assessment. The open-source dataset consists of electro-optical imagery at 0.3 m resolution and will encompass 700,000 building annotations across 5,000 square kilometres of freely available imagery from 15 countries.
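For practitioners wondering how such a dataset might be consumed, the sketch below shows one way to iterate over pre/post-disaster image pairs and tally building annotations per damage class. The directory layout, file-naming pattern, and JSON field names ("features" → "xy" → "properties" → "subtype") are assumptions based on the publicly released xBD format, not details given in this article.

```python
import json
from collections import Counter
from pathlib import Path

from PIL import Image  # pip install pillow

# Hypothetical local copy of an xBD-style dataset; layout is an assumption.
DATA_ROOT = Path("xBD/train")


def iter_image_pairs(root: Path):
    """Yield (pre_image, post_image, post_label) path triples."""
    for post_img in sorted((root / "images").glob("*_post_disaster.png")):
        pre_img = post_img.with_name(post_img.name.replace("_post_", "_pre_"))
        label = root / "labels" / post_img.name.replace(".png", ".json")
        if pre_img.exists() and label.exists():
            yield pre_img, post_img, label


def damage_counts(label_path: Path) -> Counter:
    """Count building polygons per damage subtype in one label file."""
    with open(label_path) as f:
        record = json.load(f)
    return Counter(
        feat["properties"].get("subtype", "unclassified")
        for feat in record["features"]["xy"]
    )


if __name__ == "__main__":
    totals = Counter()
    for pre_path, post_path, label_path in iter_image_pairs(DATA_ROOT):
        pre = Image.open(pre_path)    # pre-disaster tile
        post = Image.open(post_path)  # post-disaster tile of the same footprint
        assert pre.size == post.size  # pairs cover identical ground footprints
        totals += damage_counts(label_path)
    print(totals)
```

A loop like this would typically feed a pipeline that crops each annotated building from the pre- and post-disaster tiles before training a damage-classification model.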
Seven disaster types are included: wildfire, landslides, dam collapses, volcanic eruptions, earthquakes/tsunamis, wind, and flooding. The competition is split into three categories.
A share of $150,000 will be awarded to the winners of each category. Findings will be put to use in disaster relief, including routing around obstructed roads, identifying natural disasters, allocating resources, supporting decision-making, and identifying objects, as well as going towards algorithms that bring humanitarian aid to people quickly and efficiently.