Case Study

Using AI to Enhance Disaster Relief with an Eye in the Sky

Client

Department of Defense

Market

Federal

Industry

National Security

Offerings

Artificial Intelligence & Analytics

Last fall, Excella participated in the Department of Defense’s (DoD) Eye in the Sky Challenge. The goal of developing Machine Learning (ML) and computer vision algorithms to accelerate the analysis of satellite imagery and enhance disaster response was a perfect fit for our vision and mission. At Excella, we use technology to make bold new visions a reality, and Eye in the Sky was a great opportunity to showcase our talents while making an impact in the community. Through our participation in the competition, we developed new technologies, demonstrated how they could enhance disaster response, and validated the Agile and DevSecOps practices we’ve been using to strengthen our clients’ Artificial Intelligence (AI) efforts.

What We Did

We formed a small team of experts, gave them a limited set of hours—so as not to interfere with existing client obligations—and asked them to work under Excella Labs, our internal Innovation Engine. We’ve shown that Agile techniques can improve AI and ML work for our clients, so we leveraged that experience. We established cadenced timeboxes, one day each week, when the team could come together and collaborate. Each day was divided into a series of iterations where we focused on a small set of deliverables, demonstrated progress, gathered feedback, and planned our next set of objectives. This allowed us to fully capitalize on the creative potential of our team. We gave them a high-level objective and allowed them to self-organize. Rapid feedback and learning throughout the process allowed them to explore various approaches and identify the best path for implementation.

The team emphasized rapid learning throughout the process. We used DevSecOps to enhance our ML expertise by creating an automated pipeline. The pipeline gave us rapid feedback on our work, its progress, and its quality. We identified unknowns—our largest risks—and prioritized work that would bridge these knowledge gaps. We reduced risk and steadily gained confidence in our approaches and algorithms. We pursued different ML solutions in parallel, to keep our options open and make the most of our limited time. We know from experience that the pace of ML work is constrained by the speed of learning; a sophisticated DevSecOps infrastructure meant that our team could learn more quickly, pursue alternative options, and compare results before committing to a specific path.
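The parallel, pipeline-driven evaluation described above can be illustrated with a minimal Python sketch. This is not the team’s actual pipeline code: the candidate model names, the run_short_training stub, and the IoU metric are assumptions used only to show the pattern of training alternatives in parallel and comparing results before committing to one path.

```python
# Hypothetical sketch: evaluate several candidate model configurations in
# parallel and compare validation scores before committing to one approach.
# All names and values here are illustrative placeholders.
from concurrent.futures import ProcessPoolExecutor
import random

CANDIDATES = {
    "unet_resnet34": {"arch": "unet", "backbone": "resnet34", "epochs": 5},
    "unet_resnet50": {"arch": "unet", "backbone": "resnet50", "epochs": 5},
    "mask_rcnn_r50": {"arch": "mask_rcnn", "backbone": "resnet50", "epochs": 5},
}

def run_short_training(config):
    """Stand-in for a real training run; returns a dummy validation score."""
    rng = random.Random(str(sorted(config.items())))
    return rng.uniform(0.5, 0.9)

def train_and_score(item):
    """Train one candidate on a small data slice and return (name, score)."""
    name, config = item
    return name, run_short_training(config)

if __name__ == "__main__":
    # Run the candidates concurrently, then rank them by validation score.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(train_and_score, CANDIDATES.items()))
    for name, score in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: validation IoU = {score:.3f}")
```

In a pipeline, a step like this would run automatically on each change, logging scores so regressions and promising alternatives surface quickly.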


We used our increasing knowledge and automated infrastructure to determine that two passes through the images would work best. In the first, we used a well-trained algorithm to identify all the buildings; it proved to be very accurate. In the second, we took on the more difficult challenge of assessing the level of damage to each building. Unless a building was utterly demolished or had suffered obvious visible damage, it was difficult to accurately assess what had happened to it, and flooding damage was particularly hard to categorize reliably. Although our damage recognition algorithm steadily improved, there were still enhancements we wanted to make and training we wanted to perform when the competition ended, which kept our score from being higher. We prioritized building a very solid technical infrastructure that allowed frequent iteration, rapid learning, and continuous improvement, giving us, the DoD, and future clients an effective, scalable foundation for rapidly deploying AI/ML solutions.
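To make the two-pass approach concrete, here is a minimal, hypothetical Python sketch. The detector, the damage classifier, and every name in it are placeholders standing in for the actual models, which the case study does not detail; the point is only the structure of pass one (find buildings) feeding pass two (grade damage).

```python
# Hypothetical two-pass structure: detect building footprints, then classify
# damage for each detected footprint. Placeholder logic stands in for models.
import numpy as np
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Building:
    bbox: Tuple[int, int, int, int]   # (x_min, y_min, x_max, y_max) in pixels
    damage: str = "unclassified"      # e.g. "none", "minor", "major", "destroyed"

def detect_buildings(tile: np.ndarray) -> List[Building]:
    """Pass 1: a trained detector would propose building footprints here (placeholder)."""
    return [Building(bbox=(10, 10, 60, 60)), Building(bbox=(100, 40, 150, 90))]

def classify_damage(tile: np.ndarray, building: Building) -> str:
    """Pass 2: a separate classifier would grade damage on the cropped footprint (placeholder)."""
    x0, y0, x1, y1 = building.bbox
    crop = tile[y0:y1, x0:x1]         # crop the detected footprint
    return "minor"                    # placeholder prediction

def assess_tile(tile: np.ndarray) -> List[Building]:
    """Run both passes over one image tile."""
    buildings = detect_buildings(tile)
    for b in buildings:
        b.damage = classify_damage(tile, b)
    return buildings

if __name__ == "__main__":
    tile = np.zeros((256, 256, 3), dtype=np.uint8)   # stand-in for a satellite image tile
    for b in assess_tile(tile):
        print(b)
```

Splitting the problem this way lets the accurate building detector be trained and validated independently of the harder damage classifier, which can then keep improving without touching the first pass.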

[Figure: map imagery in which the AI model predicts which objects are buildings]

By participating in Eye in the Sky, we learned that we can play with the best. Although our entry did not win, in just a short amount of time and with a small team, we built an effective infrastructure for validating, training, and deploying an AI solution. We proved that our Agile techniques for AI, ML, and data work well and promote rapid learning. We know how to adapt our effective principles to introduce specific, contextually appropriate practices, whether we’re participating in a DoD competition or solving the latest problem from one of our clients. Our well-designed DevSecOps infrastructure allows fast feedback, accelerating the learning and tuning that are critical to making AI effective. This enhances the performance of AI algorithms, validates them more quickly, and accelerates time to market. As one of our team members noted:

“While we definitely want to place as high on the leaderboard as we can, we recognize for this to be implemented at scale, it’s going to take more than a high score. You’re going to need other things that help you deploy your software or integrate with other systems. That’s one of the special things about Excella. We value the process, the craftsmanship, and the quality of what we’re producing.”

With Eye in the Sky, we proved that Agile and DevSecOps for AI are real and that we’re at the forefront of how best to make them work. Our approach had the following demonstrable benefits:


More rapid learning and validation: by using automated DevSecOps pipelines, we explored alternative AI and ML algorithms in parallel and quickly determined which would be most effective.

Faster time to market: by using Agile techniques to establish a regular cadence of feedback and delivery, we capitalized on our identification of the most effective approaches and produced working software very quickly.

Improved extensibility: by coupling DevSecOps, Agile, and AI/ML expertise, we created a solution that could be rapidly extended and continuously improved based on frequent feedback and training.

We’ve enjoyed bringing these advantages to our clients’ AI and ML challenges; it was great to demonstrate what we’ve learned in a public competition.

Conclusion

For our team, Eye in the Sky was a great experience. It was an extremely effective example of how we regularly invest in the skills of our team members and experiment with new techniques that we then use with our most sophisticated clients. If one of our small teams can build a sophisticated infrastructure, select and tune complex algorithms, and submit a worthy entry to this high-profile competition in just 10 days, imagine what can happen with a fully committed team on a real-life project. If you want to learn more about how we’re bringing Agile, DevSecOps, and AI expertise to our clients outside of these competitions, read our JAIC Press Release.