Every organisation has leadership and governance structures. Most of the time, they work against the people they’re supposed to help. Committees that slow decisions to a crawl. Approval chains designed to protect the approver, not the outcome. AI governance policies that say “be careful” without specifying what careful looks like.
I’ve sat in rooms where a team waited three months for sign-off on a dashboard tool. Reversible, low-risk, zero financial exposure. Three months.
I run 2Stallions, a 40+ person digital marketing agency across Singapore, Malaysia, Indonesia, and India. I’ve advised 30+ startups through Accelerating Asia and Plug and Play APAC, worked with boards evaluating AI investments and technology oversight, and built AI-powered products from scratch. My view of governance comes from the operating side: I’ve been on both sides of the approval chain, and I know what happens when governance is designed by people who’ve never shipped anything.
This page covers the governance areas I spend most of my time on: data governance, AI oversight for boards, digital transformation, and decision architecture. Each section gives you the operating philosophy. For the detail, follow the links to specific articles.
Why Governance Matters More When You Build Things
Governance gets a bad reputation because most people experience it as friction. But the problem isn’t governance itself. It’s governance designed without input from the people doing the work.
When it works, governance does three things. It clarifies who decides what, so teams aren’t waiting for permission they don’t need. It concentrates oversight where the stakes are genuinely high. And it gets out of the way everywhere else.
The companies I advise that move fastest aren’t the ones with the least governance. They’re the ones where governance is precise. The board focuses on decisions that actually need board-level attention. Managers handle cross-functional alignment. Teams closest to the customer handle everything reversible.
My operating philosophy: centralise the guardrails, decentralise the decisions within them. Every governance conversation I have starts there.
Data Governance: Give Teams the Authority to Act
Data governance is how an organisation manages the availability, usability, integrity, and security of its data. Most companies treat it as a control function, and that framing is the problem.
Wavestone’s 2024 survey of Fortune 1000 companies found that 48 percent now consider themselves “data-driven,” double the figure from the year before (Wavestone, 2024). But the same survey found that cultural challenges, not technology, remain the top barrier. The tools exist. The permission structures often don’t.
At 2Stallions, we operate across four countries. If every data decision had to route through Singapore headquarters, we’d lose weeks on things that should take hours. So we use a simple filter: one-way doors (irreversible or costly decisions like choosing a core data platform) get careful deliberation. Two-way doors (reversible, low-risk decisions like testing a dashboard tool) are made by the team closest to the work. That single filter strips out roughly 80 percent of approvals, the ones that were never necessary in the first place.
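The routing rule above can be sketched in a few lines. This is an illustrative sketch only: the field names, the cost threshold, and the two escalation targets are my assumptions, not a prescribed implementation.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    name: str
    reversible: bool       # can we undo it cheaply if it's wrong?
    cost_exposure: float   # committed spend, in dollars (assumed unit)
    legal_exposure: bool   # contracts, compliance, or data-protection impact

def route(decision: Decision, cost_threshold: float = 10_000) -> str:
    """Return who decides: the local team or a central review.

    Two-way doors (reversible, low-cost, no legal exposure) stay with
    the team closest to the work; everything else escalates.
    The 10k threshold is a placeholder, not a recommendation.
    """
    if (decision.reversible
            and decision.cost_exposure < cost_threshold
            and not decision.legal_exposure):
        return "team"
    return "central review"

# A reversible dashboard trial stays local; a platform commitment escalates.
print(route(Decision("dashboard tool trial", True, 500, False)))    # team
print(route(Decision("core data platform", False, 250_000, True)))  # central review
```

The point of encoding it at all is that the filter becomes an explicit, arguable default rather than a judgment call made fresh in every meeting.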
What should boards track instead of individual data decisions? Data quality scores, tool adoption rates, time-to-insight (how long from question to answer), and incident frequency. These give you genuine oversight without requiring the board to inspect every tree.
I wrote about this framework in detail: Data-Driven Decision Making Belongs With Teams, Not Boards. For startups building data culture from scratch: How to Build a Data-Driven Culture When You’re Still Figuring Things Out.
AI Governance: What Boards Can’t Afford to Get Wrong
Global AI spending is on track to reach $2.5 trillion by 2026, up from $1.5 trillion in 2025 (Gartner, 2026). Boards that can’t evaluate these investments aren’t governing. They’re spectating.
The numbers are stark. Deloitte’s 2025 survey of 695 board members across 56 countries found that 66 percent of boards have “limited to no knowledge or experience” with AI (Deloitte, 2025). Only 2 percent described their boards as highly knowledgeable. And 31 percent still don’t have AI on the agenda at all.
This isn’t just a knowledge gap. McKinsey found that only 18 percent of organisations have an enterprise-wide council with authority to make decisions on responsible AI (McKinsey, 2024). Most companies are deploying AI without anyone clearly accountable for how it’s used, what data it touches, or what happens when something goes wrong.
What boards need to ask
Directors don’t need to understand how transformer models work. They need to evaluate four things:
- Is this solving a real business problem? Most AI projects that fail do so because they solved the wrong problem, not because the technology broke.
- Do we have the data? Data quality and access are the bottlenecks in most AI implementations, not the models.
- Are the risks managed? Bias, privacy, accuracy, security. Each is real, each is manageable, each needs to be visible at board level.
- How will we know it worked? Not “we’re using AI.” What metric moves, by how much, by when?
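The four questions above can be treated as a pre-approval gate: no AI proposal reaches the board agenda until each one has a substantive answer. A minimal sketch, with the understanding that the question wording and the all-four-answered rule are my framing of the list, not a formal standard:

```python
# The four board questions, paraphrased from the evaluation list above.
BOARD_QUESTIONS = [
    "Which business problem does this solve, and for whom?",
    "Do we have the data (quality and access) this system needs?",
    "How are bias, privacy, accuracy, and security risks managed?",
    "Which metric should move, by how much, and by when?",
]

def unanswered(answers: dict[str, str]) -> list[str]:
    """Return the questions that still lack a substantive answer."""
    return [q for q in BOARD_QUESTIONS if not answers.get(q, "").strip()]

# Hypothetical proposal with only the first question answered.
answers = {BOARD_QUESTIONS[0]: "Cut invoice-processing time for finance ops"}
missing = unanswered(answers)
print(len(missing))  # 3 questions still open, so the proposal isn't board-ready
```

Nothing here requires technical depth from directors; it only requires that the answers exist before the money moves.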
I covered this gap in depth: Every Director Needs AI Fluency.
Singapore’s AI governance framework
Singapore has been ahead on this. The IMDA’s Model AI Governance Framework for Generative AI, finalised in May 2024, covers nine governance dimensions and received input from over 70 organisations including Microsoft, OpenAI, Google, and the US Department of Commerce (IMDA, 2024). In February 2025, IMDA launched the Global AI Assurance Pilot for technical testing of GenAI applications in production.
For companies in Singapore, it’s a ready-made foundation. For companies elsewhere in Asia, it’s still one of the most practical frameworks available. It’s principles-based guidance rather than legislation, which means companies can adopt it without waiting for regulators.
I cover the broader AI strategy and implementation side on the AI for Business topic page.
Digital Transformation: Why Three Out of Four Fail
BCG’s research on corporate transformations found that only one in four delivers enduring, value-creating change (BCG, 2025). Three out of four either stall, get abandoned, or produce results that don’t stick. When you look at why, governance is almost always part of the story.
Transformations fail when the oversight model doesn’t match the pace of change required. If your AI pilot needs six months of committee approvals before it can go live, you’ll always trail competitors who can test in six weeks.
But the opposite failure is equally common: transformations launched without clear ownership, without defined metrics, without anyone accountable for outcomes. That isn’t speed. It’s chaos that looks like agility until the quarterly review.
The boards I work with that get transformation right share three things. They define success metrics before launch, not after. They assign a single accountable owner per initiative, not a committee. And they build structured check-ins that track progress without turning every meeting into a review gate.
Board Governance in the Age of AI
Deloitte’s 2025 survey found that 40 percent of boards are reconsidering their composition because of AI (Deloitte, 2025). That’s a signal worth watching.
Most boards were assembled for an era where the primary governance challenges were financial: audit committees, compensation oversight, risk functions focused on market and credit exposure. The challenges shifting to the top of the agenda now are different. AI investment evaluation. Data strategy oversight. Cybersecurity. Digital transformation governance. These require people who’ve built and operated systems in these areas, not just reviewed reports about them.
Someone who’s deployed AI in their own operations asks different questions than someone who’s only read the vendor pitch deck. Running a multi-country team where data inconsistency between your Singapore and Jakarta offices causes real problems gives you a different view of data governance than reading about it in a policy document. The operating perspective changes what a board asks, and the questions a board asks determine whether governance enables the organisation or holds it back.
Patterns I See Repeatedly
After years of advising companies and working with boards across Southeast Asia, five governance mistakes keep showing up.
The most common is designing governance for control instead of speed. The instinct is always to add approval layers. Each one feels like it reduces risk. In practice, each one reduces the chance that anything actually happens. The goal isn’t zero risk. It’s managed risk at pace.
Boards also govern things they don’t understand. A board that lacks AI literacy will either rubber-stamp AI initiatives or block them out of uncertainty. Neither response is governance. Invest in board education before building board-level oversight structures.
Another pattern is treating all decisions as equal. A team testing a dashboard tool does not need the same sign-off process as a company committing to a three-year data platform contract. Most governance frameworks fail because they apply uniform process to decisions of vastly different weight and reversibility.
People also confuse governance with compliance. Compliance is the legal minimum. Governance is how you build an organisation that consistently makes good decisions. A fully compliant company can still have terrible governance, with approvals that take months, nobody owning data quality, and AI initiatives dying in committee.
The last one is copying enterprise governance at startup scale. A 30-person company doesn’t need a data governance committee. It needs shared metric definitions and a weekly meeting where people look at the numbers together.
Frequently Asked Questions
What is data governance?
Data governance is how an organisation manages the availability, usability, integrity, and security of its data. Effective data governance gives teams closest to the customer the authority and tools to act on data within clear boundaries, enabling faster decisions rather than routing everything through central committees.
What is AI governance?
AI governance is the set of policies, practices, and oversight structures that ensure AI systems are deployed responsibly. It covers accountability for AI decisions, bias monitoring, data privacy, transparency, and human oversight. For boards, AI governance has become as immediate as financial governance, especially as global AI spending approaches $2.5 trillion annually.
Why do boards struggle with AI oversight?
Most board members lack the practical AI experience needed to evaluate investments and risks. Deloitte's 2025 survey found 66 percent of boards have limited to no AI knowledge, with only 2 percent describing themselves as highly knowledgeable. This leads to either rubber-stamping proposals or blocking them out of uncertainty. Board education focused on evaluation frameworks, not technical depth, closes the gap.
What is Singapore's IMDA AI governance framework?
The Model AI Governance Framework for Generative AI, developed by Singapore's IMDA and the AI Verify Foundation, is a principles-based guide covering nine governance dimensions for responsible AI deployment. Finalised in May 2024 with input from over 70 organisations, it's guidance rather than legislation, making it a practical starting point for companies across Southeast Asia.
How do you decide which decisions to centralise vs decentralise?
Use reversibility as your filter. Decisions that are easily reversed with limited downside, like testing a tool or adjusting campaign targeting, should be made by the team closest to the work. Decisions that are irreversible or carry significant financial or legal exposure, like core platform choices or vendor contracts, need escalation. In most organisations, 80 percent of decisions are reversible and don't need senior approval.
Why do digital transformations fail?
BCG found that three out of four corporate transformations fail to deliver enduring change. The most common governance-related causes are mismatched oversight (too slow for the pace required), diffused accountability (committee ownership instead of a single owner), and absent success metrics at launch. Boards that define measurement criteria upfront and assign individual owners significantly improve their odds.
What skills do modern boards need?
Boards increasingly need members with practical experience in AI deployment, data strategy, cybersecurity, and digital transformation governance. Deloitte found 40 percent of boards are reconsidering composition because of AI. The key gap is between directors who can evaluate technology investments from operating experience and those who rely solely on management presentations or vendor briefings.
I advise founders and boards on governance that enables speed without losing control. If your organisation’s decision-making needs a reset, let’s talk.