The Value of Artificially Intelligent Members on Corporate Boards
Business getting more artificial
As AI becomes more intelligent and astute, it is being implemented in a variety of settings and industries, including money-laundering detection, pricing strategy, drone package delivery, facial recognition, and more. We all know that AI can detect patterns and trends in vast amounts of seemingly incomprehensible data. This capability has already generated insights behind managerial recommendations that have proven game-changing in corporate decision-making. What if AI were involved in more abstract decisions at the highest corporate level, drawing conclusions from historical data to predict each potential decision’s most likely outcome?
Should you invite a robot algorithm to your corporate board?
The people typically selected to serve on corporations’ executive boards are directors, vice presidents, founders, and CEOs, who are notoriously short on time and resources. They can serve on only so many boards before literally running out of hours in the day. So what if you desperately need this level of insight to steer your company to success but struggle to secure the humans who can deliver it? Enter artificial intelligence, everyone’s favorite replacement for all things human. But can AI replace humans on executive boards?
In recent years we have seen how corporate success can be easily tainted by the excesses of one person (typically the CEO). Boards are frequently tasked with overseeing CEOs and their teams, while also needing to understand their work and trust their assessments. Those assessments haven’t always been accurate or fully objective, which has led to the idea that having impartial machine intelligence in place of a board member might not be such a bad option after all.
AI can aid boards by sifting through vast amounts of varied data before recommending the action most likely to produce the desired outcome. It can serve as a trusted counselor for many types of decisions, including financial planning, investment strategy, and risk mitigation.
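As a minimal sketch of what such a data-driven recommendation could look like (the actions, outcomes, and data here are entirely hypothetical, and a real system would use far richer models), one simple approach is to rank candidate board actions by their historical success rate:

```python
from collections import defaultdict

# Hypothetical historical records: (action taken, outcome observed)
history = [
    ("acquire", "success"), ("acquire", "failure"),
    ("partner", "success"), ("partner", "success"),
    ("divest", "failure"), ("partner", "failure"),
]

def recommend(history):
    """Return the action with the highest historical success rate."""
    totals = defaultdict(int)
    successes = defaultdict(int)
    for action, outcome in history:
        totals[action] += 1
        if outcome == "success":
            successes[action] += 1
    # Rank each action by its fraction of successful outcomes
    return max(totals, key=lambda a: successes[a] / totals[a])

print(recommend(history))  # "partner": 2 of 3 partnerships succeeded
```

A production system would of course weigh many more features than a single success count, but the principle is the same: let the historical record, rather than intuition, rank the options.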
Bias built into AI
One thing is certain: before ‘inviting’ any robot algorithms to their boards, members must be fully aware of how this course may affect their business models and decision-making, and must evaluate the risks it carries.
As most executives until recently have been largely white, middle-aged, and male, the datasets used to train AI have likely consisted mostly of data generated by this type of profile. This may be problematic for companies that aim to closely meet the needs of their increasingly diverse customers. Of course, bias is inherent in humans, but the whole premise of AI is that it should improve on traditional decision-making by avoiding the pitfalls of error-prone human thinking.
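One basic safeguard against this kind of dataset bias is simply to measure it. The sketch below (group names, counts, and target shares are all hypothetical) flags groups whose share of the training data deviates noticeably from their share of the customer base:

```python
from collections import Counter

# Hypothetical demographic labels attached to each training record:
# 80% from one dominant profile, the rest underrepresented.
training_profiles = ["group_a"] * 80 + ["group_b"] * 15 + ["group_c"] * 5

def representation_gaps(profiles, target_shares, tolerance=0.05):
    """Return groups whose share of the data deviates from the target
    share by more than the tolerance, with the signed gap for each."""
    counts = Counter(profiles)
    total = len(profiles)
    gaps = {}
    for group, target in target_shares.items():
        share = counts.get(group, 0) / total
        if abs(share - target) > tolerance:
            gaps[group] = round(share - target, 2)
    return gaps

# Suppose the customer base is evenly split across the three groups.
print(representation_gaps(training_profiles,
                          {"group_a": 1/3, "group_b": 1/3, "group_c": 1/3}))
```

An audit like this will not remove bias on its own, but it tells the board which voices the model has barely heard before anyone acts on its recommendations.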
To oversee AI ethics, Google recently created a special council of human advisors. Known as the Advanced Technology External Advisory Council (ATEAC), its goal is to determine whether the AI programs and advances the company undertakes have the potential to cause inadvertent harm to humans in any way.
Additional (unexpected) applications of AI
AI is an ever-evolving technology. Just like humans, robot intelligence must learn and adjust based on new knowledge, past mistakes, and ongoing trends. Ensuring AI is fed correct, recent data is key to its effectiveness, not only on executive boards but in general. An added advantage of AI over human executives is its ability to act as a mediator whenever board members disagree – a frequent occurrence on most boards – but only if its assumptions are valid.
Today, more companies are starting to experiment with introducing artificially intelligent board members. One example is venture capital firm Deep Knowledge Ventures, whose AI-based software Vital helps make complex decisions based on numbers and data rather than the subjective assessments that we humans are prone to producing.