
June 07, 2021

Why Government Agencies Need to Incorporate Explainable AI in 2021

1 min read

Written by

Claire Walsh

In a world fueled by digital data, the use of artificial intelligence is prolific—from the automation of human processes to discovering hidden insights at scale and speed. Machines can do many tasks far more efficiently and reliably than humans, resulting in everyday life that increasingly resembles science fiction. This inevitably sparks concern about controls—or lack thereof—to inspect and ensure these advanced technologies are used responsibly.

Consumers want reassurance about ethical use and fairness related to AI. Businesses need to mitigate the risk of unintended consequences when employing these advanced, complex solutions. Enter: Explainable AI, or XAI, an attempt to create transparency in the “black box” of artificial intelligence.

Can you confidently answer the simple questions below about your current AI solutions?

  • Why did the AI model make a specific decision or prediction?
  • When the result is unexpected, why did the model choose that alternative?
  • How much confidence can be placed in the AI model results?
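The questions above map to concrete model diagnostics. As a minimal sketch (the article names no specific tooling; scikit-learn and a decision tree classifier are assumptions here), global feature importances speak to *why* a model decides, and predicted class probabilities to *how much confidence* can be placed in a result:

```python
# Minimal explainability sketch -- assumed stack: scikit-learn.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

data = load_iris()
X, y = data.data, data.target
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# "Why did the model make this decision?" -- inspect which inputs
# drive the model's splits via global feature importances.
for name, importance in zip(data.feature_names, model.feature_importances_):
    print(f"{name}: {importance:.2f}")

# "How much confidence can be placed in the result?" -- per-class
# predicted probabilities for a single sample.
print(model.predict_proba(X[:1]))
```

Interpretable models like shallow trees answer these questions directly; for opaque models, post-hoc XAI techniques (e.g., SHAP or LIME) estimate comparable per-prediction attributions.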

Continue reading on NextGov.

Learn more about Explainable AI via our complimentary XAI eBook.

Download eBook
