Extreme events occur rarely but can have a significant market impact.  These are sometimes called "black swan" events; in Artificial Intelligence, "edge cases" or "out-of-distribution" datapoints.  The financial crisis of 2008 was one such event; Covid-19 is the latest.

However, systems that rely on AI may not be designed for such events. For example, the training data may not reach back to similar events (if similar events exist at all), so the AI will not have learned what happens in a market shock.  Further, the processes and human oversight in place may be designed only for "business as usual".  Peaks in demand, such as those seen during the Covid-19 pandemic, may instead be treated as denial-of-service attacks, causing the system to slow or even shut down.  When AI struggles to cope with a "black swan" event, the business impact can be swift, as shown by online retailers and supermarkets struggling to meet increased demand during the pandemic.

"Black swan" events emphasise the importance of human input at every stage of the design and deployment of AI.  AI may need to be designed for a wider variety of circumstances and, where events fall outside those parameters, humans need to be ready to take control.
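The principle of detecting events outside an AI system's design parameters and handing control to a human can be sketched in code.  The figures, the `handle_demand` function and the three-standard-deviation threshold below are all illustrative assumptions for the purpose of this sketch, not any particular system's implementation:

```python
import statistics

# Illustrative "business as usual" training observations, e.g. daily
# order volumes (assumed numbers, for demonstration only).
TRAINING_DEMAND = [980, 1010, 995, 1020, 1005, 990, 1015, 1000]

MEAN = statistics.mean(TRAINING_DEMAND)
STDEV = statistics.stdev(TRAINING_DEMAND)

def handle_demand(observed: float, z_threshold: float = 3.0) -> str:
    """Route an observation to the automated model or to a human.

    If the observation lies more than z_threshold standard deviations
    from the training mean, treat it as out-of-distribution and do not
    let the model act on it automatically.
    """
    z = abs(observed - MEAN) / STDEV
    if z > z_threshold:
        # Outside the parameters the AI was designed for: escalate.
        return "escalate_to_human"
    return "automated_processing"

print(handle_demand(1005))   # typical demand -> automated_processing
print(handle_demand(5000))   # pandemic-scale spike -> escalate_to_human
```

A real deployment would use a richer out-of-distribution test than a single z-score, but the governance point is the same: the system should recognise when an input falls outside what it was trained on and route the decision to a person rather than act on it.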

Those who design and deploy AI are expected to understand how it works.  This is partly because their business depends on it, and partly because various governmental institutions (e.g. the European Commission, the US White House and the OECD) and regulators (e.g. the Information Commissioner's Office) expect it too.

In April 2020 we expect judgment to be handed down in Tyndaris v VWM, the first English case to consider liability when AI-powered algorithmic trading goes wrong.  Two key questions will be whether it can be explained how the AI operates, and what the parties understood the AI to be capable of.

Directors of businesses that use AI should be taking stock of how AI operates within their companies, and of what contingency plans exist if unexpected events occur or the AI does not behave as expected.