THE BOARD BRIEF
Weekly Intelligence for Directors Who Want to See What's Coming
February 11, 2026 | Issue #2
THE BIG STORY
The AI Literacy Gap: Why "Understanding AI" Isn't Enough for Competent Oversight
Two-thirds of directors now report using AI for board work. Forty-four percent of companies listed AI experience as a board qualification last year, up from 26% the year before. Thirty-five percent of boards say they've integrated AI into their oversight activities.
By every surface metric, boards are embracing AI. But there is a widening gap between using AI and understanding it well enough to govern it, and that gap is becoming a fiduciary liability.
Consider a simple test: How many directors approving AI capital expenditure budgets this quarter can explain why those budgets are suddenly 30% to 50% higher than projected? The answer, in most boardrooms, is very few. And the reason is a global memory crisis that most directors have never heard of.
The semiconductor memory market is experiencing what analysts are calling "RAMmageddon." DRAM prices have surged 171% year-over-year. DDR5 spot prices have quadrupled since September. Micron Technology has confirmed that its entire production capacity for High Bandwidth Memory (the specialized chips that power AI accelerators) is sold out through the end of 2026. The OpenAI Stargate project alone could consume up to 40% of global DRAM output.
This isn't an obscure supply chain detail. It is the single largest variable in whether a company's AI strategy will hit its cost targets this year. Yet it lives in a technical layer that most boards never see, buried under management presentations about "AI transformation" and "digital acceleration."
That's the AI literacy gap in action. Not a lack of enthusiasm. A lack of the specific, technical knowledge required to ask the right questions.
Why this is a governance problem, not just an education problem:
The WilmerHale/EqualAI Governance Playbook, published last month, was direct: "AI governance isn't just good practice; it has quickly become a legal and strategic imperative. Boards that take the time now to assess governance structures and elevate AI literacy will meet Delaware's oversight standards." The implication is clear. Boards that don't elevate AI literacy may not meet those standards.
NACD's Shelly Palmer put it more bluntly: "2026 is not the year to delegate technology understanding. It's the year every director becomes fluent in AI strategy, data ethics, and digital accountability. This is a fundamental fiduciary responsibility."
And Korn Ferry's research has surfaced a subtler problem. AI is now giving directors the ability to look beyond what management provides, using AI tools to independently analyze data and challenge assumptions. That capability may be creating a new obligation. If a director could have used AI to verify management's claims and didn't, the traditional business judgment rule defense (which protects directors who rely in good faith on management's information) gets harder to invoke.
Three dimensions of the literacy gap boards should address now:
1. Technical literacy. Not coding. Not data science. But the ability to understand the infrastructure, cost drivers, and constraints behind AI initiatives. A director who can't distinguish between training and inference costs, or who doesn't know that AI memory chips consume three times the manufacturing capacity of standard memory, cannot meaningfully evaluate an AI budget.
2. Risk literacy. Only 22% of companies using AI have usage policies in place. AI-related shareholder proposals tripled in 2025, according to the Harvard Law School Forum on Corporate Governance. State AI regulations are proliferating. The Colorado AI Act (delayed to June 30) and the EU AI Act (timelines in flux) represent near-term compliance obligations that most boards have not discussed in detail.
3. Oversight literacy. Understanding what questions to ask, what metrics to demand, and what governance structures to put in place. The shift from "passive awareness" to "active oversight" that governance experts are calling for requires directors to know what good AI governance looks like, not just that it matters.
Questions to ask in your next board meeting:
→ "Can each of us articulate, without management's help, the three biggest technical constraints on our AI roadmap?"
→ "What is our board's plan to close the gap between AI usage and AI competence? Do we have a structured AI education program for directors, or are we relying on ad hoc learning?"
→ "Have we evaluated whether our directors' AI literacy meets the fiduciary standard that regulators, courts, and proxy advisors are moving toward?"
→ "If our largest AI initiative failed to deliver ROI within 18 months, would this board have the technical literacy to diagnose why, or would we be entirely dependent on management's explanation?"
The 66% of directors using AI and the 22% of companies with usage policies represent a 44-point gap between adoption and oversight. Closing that gap is not optional. It is, as Palmer said, a fundamental fiduciary responsibility.
ON THE RADAR
Five signals board members should track this week
1. New York Proposes the Strongest Data Center Moratorium in the Country
On February 6, State Senator Liz Krueger and Assemblymember Anna Kelles introduced S.9144, a bill that would freeze permitting for new data centers in New York for at least three years while state agencies conduct environmental and economic impact reviews. New York is at least the sixth state to propose a moratorium, joining bipartisan efforts in Maryland, Georgia, Oklahoma, Virginia, and Vermont. The bill targets hyperscale facilities over 20 megawatts (exempting public research projects like Empire AI in Buffalo) and would require the Public Service Commission to report on ratepayer cost impacts. New York's grid could face a 1.6 gigawatt shortfall, and the state's interconnection queue for large load projects doubled from 6,800 MW to 12,000 MW between September 2025 and January 2026. For directors: if your AI strategy depends on expanding data center capacity, state-level regulatory risk is no longer hypothetical. Both parties are finding common ground here; Florida's Governor DeSantis and Vermont's Senator Sanders have both called for restrictions.
2. The Memory Crisis Boards Aren't Seeing
The global memory shortage is hitting AI budgets harder than most boards realize. Three companies (Samsung, SK Hynix, Micron) control 95% of DRAM production. Manufacturing one gigabyte of High Bandwidth Memory consumes roughly three times the wafer capacity of standard DRAM, and all three manufacturers have shifted production toward HBM's higher margins. The result: Gartner forecasts a further 47% DRAM price increase in 2026. PC and device prices are expected to rise 15% to 20%. Companies that secured long-term memory supply agreements are insulated; companies buying on spot markets face real budget risk. Boards approving AI infrastructure spending should be asking management whether memory costs are locked in or exposed to further escalation.
3. Proxy Advisors Face Unprecedented Political Pressure
President Trump's December 2025 executive order, combined with House committee hearings scrutinizing ISS and Glass Lewis and state Attorney General investigations, is reshaping the proxy landscape for 2026. The Harvard Law School Forum on Corporate Governance flagged the key question: "Who will be driving voting outcomes, and how should companies respond?" With the SEC no longer providing substantive guidance on shareholder proposal exclusions (as noted in Issue #1) and proxy advisors under political siege, companies face a proxy season where voting dynamics are less predictable than any in recent memory. Boards should be stress-testing their shareholder engagement strategies now.
4. Government as Shareholder: A New Governance Variable
The U.S. government's growing portfolio of equity stakes in strategic companies is creating governance dynamics that most boards have never navigated. The golden-share provision in the US Steel/Nippon Steel transaction gives the government veto rights over a range of corporate decisions. Stakes in Intel, Lithium Americas, MP Materials, and others primarily carry economic rights, but the trend line is clear: government involvement in companies deemed strategically important is expanding, and it comes with constraints on M&A, operations, and partnerships. The Harvard Law School Forum's Wachtell Lipton commentary (February 3) identified this as a defining governance issue for 2026. If your industry touches national security, semiconductors, critical minerals, or energy infrastructure, the question is not whether government will take an interest, but when.
5. Delaware's SB 21: The Biggest Governance Reform in Years
Amendments to the Delaware General Corporation Law have expanded safe harbor protections for directors, officers, and controlling shareholders in conflicted and controlling shareholder transactions, while limiting the scope of Section 220 books-and-records demands. This is Delaware's direct response to the competitive challenge from Texas and Nevada, which have been attracting incorporations with more management-friendly governance frameworks. For directors at Delaware-incorporated companies, liability exposure has shifted. Work with counsel to understand whether your board's processes and protections need updating in light of the new provisions.
THE BOARDROOM QUESTION
Each week, one question worth raising at your next meeting.
"What is the total cost of our AI infrastructure, including the hardware, energy, memory, and talent components, and how has that cost changed in the last six months?"
Most AI budgets are presented to boards as software and cloud services line items. But the real cost of AI includes specialized memory chips (up 171% year-over-year), data center capacity (increasingly constrained by moratorium proposals and grid limitations), energy consumption (rising fast enough to trigger regulatory backlash), and scarce talent in MLOps, AI governance, and agent orchestration. A board that only sees the software line is governing with partial information. This question forces a full-stack cost picture and exposes whether management's AI ROI projections account for the infrastructure reality.
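One way to make the question concrete is to ask management to decompose the AI line item into these components. The sketch below uses entirely hypothetical figures; its only purpose is to show how a software-only view can understate the full-stack cost.

```python
# Hypothetical full-stack AI cost decomposition (all figures illustrative).
# The point: the cloud/software line a board typically sees can be a
# minority of the total cost of the AI program.

costs = {
    "cloud and software": 4_000_000,   # the line item boards usually see
    "specialized memory/hardware": 3_500_000,
    "data center capacity": 2_000_000,
    "energy": 1_500_000,
    "MLOps and governance talent": 2_500_000,
}

visible = costs["cloud and software"]
total = sum(costs.values())

print(f"visible to board: ${visible:,} ({visible / total:.0%} of total)")
for item, amount in sorted(costs.items(), key=lambda kv: -kv[1]):
    print(f"  {item:<30} ${amount:>12,}")
print(f"  {'total':<30} ${total:>12,}")
```

In this illustration the software line is about 30% of total program cost; the actual split will vary by company, which is exactly what the boardroom question is designed to reveal.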
REGULATORY WATCH
What's moving in Washington and beyond
Commerce Department AI Review Due March 11
The administration's evaluation of state AI laws, mandated by the December executive order proposing federal preemption, could reshape the regulatory landscape in a single report. If the Commerce Department recommends preemption, state laws like Colorado's AI Act face potential invalidation. If it doesn't, the current patchwork intensifies. Either outcome demands board attention.
Foreign Insider Reporting Rule: March 18 Deadline Approaching
As noted in Issue #1, the Holding Foreign Insiders Accountable Act makes officers and directors of foreign private issuers subject to Section 16(a) public reporting requirements starting March 18. If your board includes directors of FPI-status companies, compliance should be confirmed now, not at the deadline.
California SB 253 Climate Reporting: August 10
First reporting requirements under the Climate Corporate Data Accountability Act take effect August 10. Draft reporting templates have been available since October. If your company has California operations and hasn't begun compliance planning, the timeline is getting tight, particularly as legal challenges remain unresolved.
WHAT'S AHEAD
Next week: When the government becomes your shareholder. What boards need to know when strategic importance attracts state involvement, and how governance changes when your shareholder has sovereign interests.
COMING SOON FOR MEMBERS
BoardroomRadar Premium is launching soon with tools built for directors:
→ AI Advisor: Get instant answers to your oversight questions, from fiduciary duties to committee responsibilities to emerging risks, 24/7.
→ Board Prep Generator: Custom meeting preparation materials tailored to your industry and committee focus. Minutes, not hours.
→ Question Bank: 500+ oversight questions searchable by topic, committee, and situation. Never walk into a meeting unprepared.
Researched, written, and edited in collaboration with Claude by Anthropic.