The Programmable Capital Revolution
Tokenisation will fundamentally restructure global finance, but the shift will take a decade
In 1971, President Nixon ended the dollar's convertibility to gold. This decision launched a fifteen-year transformation of global finance. Goldman Sachs recognised the opportunity early, building a block trading business that would become the foundation of its dominance.
Today, tokenisation represents another systemic reset. It enables investors to separate steady income from volatile gains, makes financial engineering accessible beyond investment banks, and embeds complex transactions directly into programmable code. The same forces that reshaped finance in the 1970s and 1980s are converging: regulatory change, technology, and structural innovation.
Michael Green, portfolio strategist at Simplify and a noted critic of passive investing, recently declared that “tokenization is starting to actually matter.” His assessment carries weight precisely because of his scepticism towards market fads. In a recent newsletter, he demonstrated how tokenisation could split a high-yield bond portfolio into two products: one delivering steady 10% returns, another capturing the remaining upside whilst bearing all volatility. Smart contracts handle what once required armies of lawyers, trustees, custodians, and accountants.
This transformation extends beyond technical innovation. Just as the 1971 monetary shift freed capital from gold, tokenisation frees ownership from the binary choice between public and private markets. With tokenisation, companies can provide liquidity whilst maintaining flexibility. Smart contracts automate governance. Real-time settlement replaces quarterly marks.
History shows that financial resets reward early positioning. Firms that built capabilities before the 1975 deregulation of commissions on the New York Stock Exchange captured lasting advantages. Those that understood junk bonds before pension funds could buy them dominated that market. The pattern repeats: recognise the shift, build infrastructure, position for the new ecosystem.
The following analysis examines how the last major market reset unfolded, what tokenisation changes about market structure, and which jurisdictions and institutions are positioning to lead programmable finance. Read on 👇
1/ Tokenisation enables investors to separate steady income from volatile gains
I am usually a late adopter. I prefer to wait months, sometimes years, before deciding whether a new idea is worth my attention.
This was true of tokenisation. Until recently, I paid little attention to it. Then Michael Green, portfolio strategist at Simplify and a well-known critic of passive investing, caught my interest. In an edition of his newsletter Yes, I give a fig a few months ago, he highlighted a tweet he had posted earlier: “Tokenization is starting to actually matter. Incredibly exciting.” He then explained how tokenisation could apply to structured financial products, complete with detailed calculations and simulations.
Michael started with a high-yield corporate bond portfolio generating around 15% annually, but with large swings in value. He wanted to create two products from it: one providing steady 10% returns, the other capturing the remaining upside while bearing all of the portfolio’s volatility.
Traditional approaches struggle with this. The expensive route is to build structured products using lawyers, trustees, and calculation agents. It works, but only at a scale most investors cannot reach. The cheap route is to adjust the portfolio’s leverage or cash allocation. For example, you could borrow money to increase exposure or hold more cash to reduce risk. These tweaks can make returns more stable or more volatile, but they cannot separate steady income from risky gains—the upside and the downside always move together.
Michael’s tokenised solution splits the portfolio into two types of tokens: senior and junior. Think of it like slicing a cake: senior token holders get the first, guaranteed slice (10% returns each year, with any shortfall covered by reserves), while junior token holders get whatever is left over, which could be bigger but comes with more uncertainty. Smart contracts automatically handle the calculations and payments, replacing the army of intermediaries that would be needed in traditional finance. What once required hundreds of millions to be viable can now launch with just thousands, and every step is fully transparent on the shared ledger known as the blockchain.
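To make the cake-slicing concrete, here is a minimal sketch of the waterfall logic such a smart contract would encode. Only the 10% senior coupon comes from Michael’s example; the 70/30 capital split, the reserve size, and the sample returns are my own illustrative assumptions.

```python
# A minimal sketch of the senior/junior waterfall. Only the 10% senior
# coupon comes from Michael's example; the capital split, reserve size,
# and sample returns are illustrative assumptions.

SENIOR_COUPON = 0.10  # guaranteed annual return for senior token holders

def distribute(portfolio_return, senior_capital, junior_capital, reserve):
    """Split one year's gains: seniors are paid first, juniors get the rest."""
    gains = (senior_capital + junior_capital) * portfolio_return
    senior_due = senior_capital * SENIOR_COUPON

    if gains >= senior_due:
        # Good year: seniors take their fixed slice, juniors keep the residual.
        return senior_due, gains - senior_due, reserve

    # Shortfall: top up the senior coupon from the reserve. Any capital loss
    # beyond that falls on the junior tranche (not modelled in this sketch).
    available = max(gains, 0.0)
    drawn = min(senior_due - available, reserve)
    return available + drawn, 0.0, reserve - drawn

# A strong year (+15%) and a weak year (-5%) on a 70/30 capital split
for r in (0.15, -0.05):
    s, j, res = distribute(r, senior_capital=70.0, junior_capital=30.0, reserve=10.0)
    print(f"return {r:+.0%}: senior {s:.1f}, junior {j:.1f}, reserve left {res:.1f}")
```

The specific numbers matter less than the observation that the entire waterfall fits in a few dozen lines of deterministic logic, which is exactly what a smart contract can execute without intermediaries.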
The key innovation here lies in making sophisticated financial structuring accessible to many more investors. With such an elegant approach, a property developer can issue senior tokens paying steady yields to pension funds while selling junior tokens capturing appreciation to hedge funds. A venture fund can offer predictable returns to conservative investors while preserving upside for risk-seekers. An infrastructure project can separate operational cash flows from development gains.
In short, tokenisation turns financial structuring from an artisanal craft practised by investment banks for their biggest clients into programmable logic anyone can use.
It is similar to what Goldman Sachs pioneered decades ago with block trading, which allowed institutional investors to buy or sell large amounts of stock with minimal impact on the market price. By mastering the mechanics of matching large buyers and sellers, Goldman Sachs turned a complex process into a service available to a much wider group of investors.
Tokenisation does the same for structured finance: it replaces complicated legal documents and multiple layers of intermediaries with code embedded in the token itself, opening up opportunities that were once reserved for large institutions. And like block trading in the 1960s, tokenisation will likely begin with overlooked market segments before external forces drive rapid growth.
2/ Digital markets still rely on analogue design
One reason tokenisation is hard to explain is that financial markets have been digital for decades, and many people don’t see the difference between digitisation and tokenisation.
Yes, finance today is already digital; however, we don’t get the full benefit of it because records remain fragmented across separate ledgers. When you buy shares, your broker updates one database, the exchange another, the clearing house a third, the custodian a fourth. Each institution keeps its own records, and they must be reconciled. A single equity trade can generate dozens of messages across incompatible systems. When Lehman Brothers collapsed in 2008, it took years to untangle ownership across these ledgers.
This friction exists because paper-based processes were digitised without redesigning the underlying structure. SWIFT, for example, is the secure messaging system banks use to send payment instructions to one another. It carries roughly 40 million messages a day, yet those messages merely coordinate between institutions; they do not themselves move money. Settlement still takes days as systems struggle to agree, and each boundary between institutions adds delay, cost, and risk of failure.
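A toy example makes the reconciliation burden tangible. Everything below is invented, but the shape of the problem is real: four institutions each hold their own copy of the same trade, and settlement waits until every copy agrees.

```python
# Toy illustration of post-trade reconciliation: the same trade recorded
# independently by four institutions. All figures here are invented.

trade = "T-1001"
ledgers = {
    "broker":         {trade: {"shares": 500, "price": 101.25}},
    "exchange":       {trade: {"shares": 500, "price": 101.25}},
    "clearing house": {trade: {"shares": 500, "price": 101.20}},  # stale feed
    "custodian":      {trade: {"shares": 500, "price": 101.25}},
}

# Settlement can only finalise once every copy of the record agrees.
reference = ledgers["broker"][trade]
for name, ledger in ledgers.items():
    if ledger[trade] != reference:
        print(f"break at {name}: manual investigation before settlement")
```

Multiply this check across millions of trades and dozens of incompatible message formats, and days-long settlement stops looking surprising.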
Tokenisation offers a different architecture. In a tokenised market, the token itself is the ownership record on a shared ledger visible to all participants. Rules that are currently enforced by separate institutions embed directly into token software. Instead of coordinating between fragmented databases, participants rely on the same source of truth—the token itself.
Take Michael’s structured financial product, which I described above:
In the traditional model, legal documents define the terms, calculation agents compute payouts, trustees manage reserves, transfer agents record ownership, and paying agents distribute cash. Each party maintains separate records, and disputes require human interpretation of ambiguous text.
The tokenised version encodes these functions in software. The smart contract calculates values from price feeds, manages reserves, records ownership, and distributes payments. Legal terms become algorithmic rules that execute without interpretation. Settlement is instant because there are no separate ledgers to reconcile.
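As a sketch of what “the token itself is the ownership record” means in practice, here is a minimal tranche token in the same spirit. The class, its methods, and the holder names are my own illustrative inventions, not a real smart-contract standard:

```python
# Minimal sketch of a token acting as the single source of truth:
# transfer-agent and paying-agent roles collapse into one shared record.
# Class, method, and holder names are illustrative inventions.

class TrancheToken:
    def __init__(self, holders):
        self.balances = dict(holders)  # holder -> token count: the ownership record

    def transfer(self, sender, receiver, amount):
        # The transfer is the settlement: no second ledger to reconcile.
        if self.balances.get(sender, 0.0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0.0) + amount

    def distribute(self, cash):
        # Paying-agent logic: split cash pro rata across current holders.
        supply = sum(self.balances.values())
        return {h: cash * b / supply for h, b in self.balances.items()}

token = TrancheToken({"pension_fund": 70.0, "hedge_fund": 30.0})
token.transfer("hedge_fund", "family_office", 10.0)
print(token.distribute(7.0))  # payouts track ownership the moment it changes
```

A production contract would add compliance checks and the waterfall logic sketched earlier, but the architectural point stands: one record, read and updated by every participant.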
Of course, tokens do not freeze rules forever. If regulators change requirements, lawmakers alter the framework, or the issuer and holders agree to amend the terms, the token or smart contract must be updated to reflect those changes. Tokenisation reduces the need for ongoing reconciliation between institutions, but it does not remove the need to adapt when the rulebook or the terms themselves evolve.
Finally, tokenising assets alone is not enough if cash still moves through traditional rails. I am usually wary of the argument that a system only disappoints because the ‘true’ version has never been tried; it sounds too much like the claim that true communism was never attempted and therefore cannot be said to have failed. Yet in this case, tokenisation really does need a systemic reset to deliver its full benefits in financial markets. And that is not a utopian idea: financial markets have gone through systemic resets before, and there is no reason to think it could not happen again.
3/ History provides the transformation template
Tokenisation is not the first change to demand a full reset rather than incremental fixes: between 1971 and 1986, markets underwent precisely such a reset. It began when US President Richard Nixon ended gold convertibility in 1971 and concluded with London’s Big Bang deregulating the City in 1986. This fifteen-year sequence offers a useful template for understanding how tokenisation may unfold.
There are many similarities between past and present constraints. In 1970, fixed commissions and gold backing limited capital flows while institutional investors demanded new services. Today, passive indexing dominates public markets, while private equity locks capital in decade-long funds. In both periods, technology created new possibilities: then, computers automated trading and telecommunications connected global markets; today, blockchain enables programmable assets and smart contracts automate complex transactions.
Crucially, both resets required the entire ecosystem to adjust together. The 1970s succeeded because multiple elements evolved simultaneously: monetary policy abandoned gold, new regulations enabled pension fund investing, commissions deregulated, antitrust rules relaxed, and international barriers fell. Each change reinforced the others, producing compound effects.
Tokenisation requires the same systemic alignment. Digital currencies must provide payment rails. Securities laws must recognise tokens. Custody rules must accommodate cryptographic ownership. Tax codes must clarify treatment. Trading infrastructure must support instant settlement. Piecemeal adoption will likely disappoint because each component depends on the others.
The speed and sequence of change also matter. The 1971–1986 reset took fifteen years from the Nixon Shock to London’s Big Bang. Firms that understood the unfolding pattern, such as Goldman Sachs with block trading or Drexel Burnham Lambert with junk bonds (see below), captured lasting advantages. For tokenisation, organisations that invest now in capabilities, experiment responsibly, and prepare for regulatory developments will likely be better positioned as the ecosystem evolves.
That said, success depends less on being first and more on building the infrastructure and expertise the system ultimately requires. To understand how such a systemic reset unfolds in practice, it is worth examining the last major market reset in detail: how money, institutions, and corporate structures were transformed over the course of fifteen years, and what parallels this holds for the challenges tokenisation faces today.