How should boards approach AI?
Being a director is about doing, not talking. Two simple things: choose priorities and make decisions. So should AI be a priority? That depends on context.
How should boards approach AI? What is your role in this as a non-executive director?
The lazy answer is "AI should be a strategic priority" or some similar form of words.
If you are a company director, maybe especially a non-executive, LinkedIn and the business press are bombarding you with this stuff right now.
Surely the message is clear. Your job as a non-exec is to speak up about this stuff in board meetings.
It is easy to think the job of a director is talking. The image of men in suits sitting at a formal table in an ostentatious meeting room is hard to shake.
Being a director is not about talking. You are responsible for the whole business. Being a director is about doing, not talking.
Being responsible for everything is an impossible place to start. So the first thing the board needs to do is choose priorities. (By the way, this is the hardest thing!)
Should AI be one of those priorities?
What does that even mean? AI is a bunch of technologies with a whole raft of opportunities, threats and risks. Popping up in a board meeting and asking "What about AI?" doesn't really help much.
All that hype and noise can create a false sense of urgency, importance and certainty. Certainty in the sense that an awful lot of people are certain about what is going to happen. To be fair, those voices are pretty easy to ignore.
In the case of AI, a moral/ethical panic is also in the air.
You need to stay grounded in what AI actually means for your business, which brings us to my favourite word: context.
So the answer to the question depends on the real-world opportunities, problems and risks of the business whose board you sit on.
That probably means some work for the executives and the team.
That exposes another key element of the director's role. Everything is dynamic. You can't actually decide if AI is a priority without diverting some resources to asking the question.
So the first question is really: "Is it worth exploring AI deeply enough to decide whether it is a priority and, if so, what specific areas we need to focus on?"
What should you stop doing?
Running a business is dynamic. Boards are always making decisions about where to allocate resources. You never have all the information you need and getting more information is a choice about resource allocation in itself.
I could easily work my way down from that framing in ever finer levels of detail. It would not help anyone.
Instead, think about just one question:
"What will we stop doing to make time for this?"
In the first, investigative phase that might be something small. For example: stop work on a new product roadmap until you have figured out where AI fits.
Ultimately, it might mean betting the whole business on AI. Or it might mean saying to the team: let's just wait and revisit this topic in a couple of years.
And that's really it. There are a whole lot of uncertainties and unknowns. The role of the board is not to list all of these in a lovely colourful risk register and forget about them. The role of a director is not to cover your arse by raising the "question" in the boardroom and then heading through for lunch.
The job is simple: Choose priorities and make decisions.
My job is helping business leaders and entrepreneurs make better decisions. If you think I could help you do that around AI, please get in touch.
Thanks for reading