What Boards and Executives Should Know About A.I. Risks and Opportunities
March 29, 2021
In the February 9th session of the McMaster Collaboratorium, Principal of The Directors College Michael Hartmann was joined by Carole Piovesan (Partner & Co-Founder, INQ Data Law) and Lloyd Komori, C.Dir. (Faculty at The Directors College and former Chief Risk Officer, OPG) for an interactive session exploring the real and emerging A.I. risks boards should be paying attention to, and how to ensure a plan is in place for A.I. oversight.
What exactly is A.I.? As defined in U.S. federal legislation, A.I. includes:
A system that performs tasks under varying and unpredictable circumstances without significant human oversight, or that can learn from experience and improve in performance when exposed to data sets.
An artificial system developed in computer software, physical hardware, or other context that solves tasks requiring human-like perception, cognition, planning, learning, communication, or physical action.
An artificial system designed to think or act like a human, including cognitive architectures and neural networks.
A set of techniques, including machine learning, that is designed to approximate a cognitive task.
An artificial system designed to act rationally, including an intelligent software agent or embodied robot that achieves goals using perception, planning, reasoning, learning, communicating, decision making, and acting.
COVID has accelerated digital transformation, driving the development of technology that can accommodate human interactions and the collection and dissemination of massive amounts of data. A.I. outputs are only as good as the data the system receives.
The regulatory environment is fluid and catching up. There has been a movement toward greater accountability for A.I. systems and greater diligence for entities that choose to adopt them, because of their profound risk. Guidance is emerging around fairness and bias, and liability law is being rewritten.
Oversight of A.I. implementation must account first for people and culture. Organizational culture is very important, but there is friction between traditional processes and the creation of digital processes. Talent requirements need to be examined.
Second, it must account for data practices and data ethics. A.I. is all about the data, so think dynamically about personal information and weigh how the data will be used against any ethical issues. Document the process to satisfy any defence requirements.
Third, it must account for A.I. risk stratification, exploring the explainability of the A.I. system. Define explainability internally and create governance around those standards. Digital literacy is important on the board, as A.I. is the new cyber – it is where we were with cybersecurity 5 years ago in terms of regulation, mitigation standards, and valuation.
Finally, controls and training must be examined to determine requirements for reporting and communication to the boardroom.
Pre-pandemic, digital innovation projects involving A.I. focused largely on data management and analytics, and the risks associated with any project were handled by technology teams and regulatory compliance professionals. Innovation projects did not require the input of the board.
The pandemic has shifted the risk landscape in two ways. The first is working from anywhere and from any device, as the workforce is distributed to escape the virus; without exception, this shift has increased security issues exponentially. The second is privacy – the collection, distribution, and use of personally identifiable information – as digital data collection expands swiftly.
Ransomware and hacking criminals have changed their behaviour. Ransomware has been industrialized: anyone can hire an attack, and being attacked once does not preclude subsequent attacks. Cybercrime is estimated to cost $6.5 trillion in 2021 – measured as GDP, that would be the third-largest economy in the world.
Boards should question their cybersecurity maturity. No organization is fully prepared. Get an independent analysis and get educated about what needs to happen, by whom, and when. Focus on protecting your crown-jewel assets; money spent that does not protect the main assets is not well spent.
Effective oversight requires an interdisciplinary approach, with input from board members of varied experience and expertise. The board needs digitally savvy directors, but effective governance and oversight of cyber risk benefit from multiple, diverse viewpoints to create a robust framework.