Australia Can Become an Artificial Intelligence Powerhouse — by Luring Algorithms to its Shores
Of late there has been much talk about how countries should prepare themselves for the AI race. Notably, Nuria Oliver, Bernhard Schölkopf and Yann LeCun published an editorial in several European publications (Les Echos included) on how the European Union might help researchers. And of course, the Trump administration has announced a $500 billion project (Reuters).
One question is whether smaller countries, like Australia, can possibly play in this game. They can, but it will take some inventiveness, not just following the leader.
Words aren’t enough
I wish to briefly state my premise: that no major party has any clue. As a mere example: the only thing lamer than the Australian AI Action Plan published by a past government in 2021 (which you can read here) is the summary of accomplishments provided by ChatGPT for the same. You know that list is pretty dismal when “banning DeepSeek” nears the top.
This was a predictable result for a collection of platitudes and bumbling nothing-speak. Nor should it be entirely surprising for a self-declared clever country that fastidiously prepared itself for the age of AI by slashing mathematical and fundamental research over several decades. Words don’t make up for that.
As a one-time card-carrying member of the ALP I’m not trying to be partisan. I am bemoaning the slide into mathematical defeatism by my country of birth — a source of serious bitterness among those who taught me. Let’s face it: if Australia knew which way was up, and what was going to prove critical in the 21st Century, it would have made Terry Tao Australian of the Year a long time ago.
Step 1: Find an angle
You don’t become a world leader by only saying or doing the things that are — if you’ll excuse the Australian vernacular — bloody obvious. You need an edge. Something different. You might get it from listening very carefully to intellectual giants like Michael Jordan. I refer to the pioneering contributor known for Bayesian networks and variational inference — not so much fadeaway jump shots.
In the interest of an unlikely world domination of artificial intelligence I invite [Gough Whitlam voice] my fellow Australians to watch the “other” Michael Jordan’s address at the recent AI Action Summit in Paris. Scroll to 1h 10mins into that particular video, where he states the following:
Microeconomics generates a complementary kind of intelligence to that of prediction and optimization
This might be the most important sentence anyone important has said about AI this year. He said it to other important people during an important event, so time might be ticking.
Jordan goes on to cut down the reigning paradigm of the “central processor” … the one whose task it is to bring all the information from the periphery into a single place where it can be churned. Jordan mocks the modern Silicon Valley version of the planned economy, one where knowledge is bestowed upon us lucky civilians as it gushes forth from this central calculation.
The position might seem like orthodox AI in the age of LLMs, but it is unorthodox economics — flying in the face of Hayek, who famously pointed out the fallacy and suggested that our best bet was to let market forces surface knowledge held in part by many and wholly by none.
(In Bayesian networks, messages passed between neighbours achieve a global ambition, in analogy to the miracle of the price mechanism discussed by Hayek. So the irony of modern AI is not lost on Jordan.)
Jordan does admit that there is perhaps more truth to this “central processor” position than we might have anticipated, given the surprising efficacy of large language models. LLMs provide interactions akin to “talking to everyone, not to a single entity”, as he puts it. This partial and incomplete miracle does not, however, permanently resolve the contradiction inherent in assembling local knowledge in one place.
Because there is so little discussion of the role of microeconomics in intelligence, this is a gap that Australia could seize on.
Step 2: Find a tragedy of the commons
Since we are talking economics, the next step would be for the government to remind itself that governments exist to assist with market failure, not to compete with the private sector in places where they already have incentive to build.
Instead, the government should be thinking about the kind of infrastructure that, if built, would benefit every business — but that no business in and of itself would ever contemplate. Obviously we have electricity … check. Fast internet … the less said the better. Running water, roads, law and order and sewage. These are all good things that help engineers stay up late writing code. But they don’t really scream “unfair advantage”, do they?
Need a clue?
Start thinking about machines as agents. That should be a lot easier now that the whole world is screaming “agentic AI”, even if few stop to think about what kind of agency might be the most critical. A useful algorithm can be wrapped in “agency” of various kinds, after all, such as being given a mission or purpose.
But what is the point of being an agent if you can’t go places? Computer programmers around the world are busy writing code to help algorithms navigate the web, find their way to APIs, fetch useful data, and so forth. Algorithms are trying to get around.
So, let’s pivot to a concept that might give Australia a real shot at generating a new class of AI applications leveraging local knowledge, orchestrating small-scale but repeated acts of artificial intelligence, and hammering down costs.
Start building a kind of highway system for algorithms. Start building an intelligent infrastructure that makes it much easier for algorithms to navigate around Australia and to Australia than anywhere else. Start assembling the protocols and the games and the incentives to kick start it all. Australia should build a prediction web to address the central economic problem of AI. The concept was described in this book.
Think of it as the ability to attract low-cost predictive models that inform decisions in real time (or near enough). Think of all those baristas serving each other lattes who will now have at their fingertips the best predictive models in the world — not because Australians invented those algorithms, although they well might in many cases, but simply because we introduced a few signposts here and there that made it easy for algorithms to travel to Australia and do the work, without waiting for a human chaperone.
The central question in AI is how to make it economically viable for everyone — not just well-funded behemoths. Right now, if you’re running a small bookstore, you can’t exactly parachute in a platoon of PhDs to revamp your inventory forecast. That’s the bespoke AI approach that’s too pricey for the little guy. And the little guy certainly doesn’t want to start building a prediction network any more than he wants to roll out his own mile of fiber.
That’s the tragedy of the commons. That’s the place where a government can step in. That’s how you can get real use of mathematics. The rubber hits the road when you forget about AGI and start predicting a million unglamorous things like when the Manly Ferry will arrive. There’s an extremely long tail of AI that no country can currently reach because of the fixed cost of algorithms finding their way to problems (usually, again, because of the relatively high cost of involving a human).
Hayek, and now Jordan, remind us there is no alternative.
A centralized, big-data, big-brain approach can’t easily soak up every micro-nugget of knowledge at every petrol station or bookstore. But a system that incentivizes small, competing predictive agents — think of them as micro-managers — can. If they’re rewarded in real-time for accurate forecasts, they’ll collectively probe every corner of data, sharing what they learn without shipping all of it to a central black hole.
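To make that incentive concrete, here is a minimal sketch in Python — with entirely hypothetical names and a deliberately naive model — of competing micro-managers whose influence is updated by a standard multiplicative-weights rule, so that accurate forecasters earn a larger say in the consensus:

```python
import math

# Toy sketch only: each "micro-manager" sees just its own local signal
# (here reduced to a single bias term) and submits a forecast. Agents
# are rewarded in proportion to recent accuracy via multiplicative
# weights -- a textbook online-learning rule, not a production design.

class MicroManager:
    def __init__(self, bias):
        self.bias = bias  # stands in for an agent's local knowledge

    def forecast(self, last_value):
        # naive local model: persistence plus a local adjustment
        return last_value + self.bias

def pooled_forecast(agents, weights, last_value):
    """Weighted consensus of all agents' forecasts."""
    total = sum(weights)
    return sum(w * a.forecast(last_value)
               for a, w in zip(agents, weights)) / total

def reward(agents, weights, last_value, truth, eta=0.5):
    """Multiplicative-weights update: accurate agents gain influence."""
    return [w * math.exp(-eta * (a.forecast(last_value) - truth) ** 2)
            for a, w in zip(agents, weights)]
```

Run this against any data stream and the agent whose local knowledge happens to match reality accumulates the weight — no central authority ever needs to inspect the data each agent used.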
Sure, we can keep throwing bigger neural nets at bigger server farms. But this won’t ever address Hayek’s objection as it applies to data modeling: we keep pushing the cost curve up and up (compute, labor, capital) rather than focusing on a more frictionless, distributed framework where predictions become a shared, replicable resource. The real economic magic is in how quickly and cheaply you can get tens of thousands of those little micropredictions served up — and how fluidly they borrow from each other’s insights.
Step 3: Just Do It
Enough with the lowest common denominator nonsense. Enough with the panels and the flood of technically illiterate people crowding out the discussion with “responsible AI” because that’s the only thing they can talk about. It’s important, but not the only thing.
You want Australia to stand out? Start a live nano-market tomorrow to predict every train arrival; another for electrical grid loads; another for windspeeds and so forth. Instrument the entire country and invite anyone to bring the best models and data to the task at hand. And do it before bushfire season, for the love of God (related article).
It doesn’t even matter exactly what is predicted first. The point is to build momentum, establish conventions and reuse, and engage a critical mass of people and businesses who can launch algorithms on the platform. The point is to drive down the marginal cost of using algorithms so that businesses, not-for-profits and public institutions of all kinds can sooner or later plug into the same source of power.
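For flavour, here is one toy settlement rule such a nano-market could use — the names, stakes and tolerance are illustrative, not a proposal. Entrants each stake a unit on a point forecast of, say, the next train’s arrival delay, and when the outcome is observed the pot is split pari-mutuel style among the entrants closest to the truth:

```python
# Toy nano-market settlement (illustrative, not a spec). Each entrant
# stakes one unit on a point forecast; the pot goes to whoever came
# closest, shared among near-ties within a tolerance.

def settle(forecasts, truth, stake=1.0, tol=0.5):
    """Return payout per entrant. forecasts: {name: predicted_value}."""
    pot = stake * len(forecasts)
    errors = {name: abs(f - truth) for name, f in forecasts.items()}
    best = min(errors.values())
    # winners: every entrant within `tol` of the best absolute error
    winners = [n for n, e in errors.items() if e <= best + tol]
    share = pot / len(winners)
    return {n: (share if n in winners else 0.0) for n in forecasts}
```

For example, `settle({"alice": 3.0, "bot7": 2.2, "naive": 8.0}, truth=2.0)` pays the whole three-unit pot to `bot7`. The important property is that reward is tied directly and immediately to out-of-sample accuracy, which is exactly what lures good models in and starves bad ones of influence.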
Start funding open source code advancing the network. Let people fork and use it. Look for ways to reduce friction in building, trading, or sharing these agents and their micro-calculations. Make it trivial to give them economic agency.
One day the Prince of Wales Hospital will not need their own neural networks for image recognition — they’ll simply plug into the network of models. They’ll simply say “we need this detected” and they’ll select a privacy-preserving technology that is part of the well-maintained public AI infrastructure. (Hi Dan, if you’re reading this).
The government can incentivize an open AI “grid,” akin to the way electricity is delivered. Or if you prefer a different analogy, think of it as the artificial intelligence equivalent of the U.S. strategy of bringing all the clever kids from around the world to their graduate school system. The only difference is that we’re talking about itinerant algorithms instead of people.
It’s a lot cheaper to move algorithms to Australia than people, however. Merely by seeding the system with some ongoing performance-based rewards, you’ll establish a pool of algorithms of all kinds, not to mention a live feature space that can also be reused. And suddenly, the small transport company can tap into an entire ecosystem of cheaply produced, highly accurate short-term forecasts.
In analogy to the electricity grid, companies could also send power back into the grid, in the form of their own data exhaust. A carwash’s local “nuggets” of knowledge — like that midweek lull in foot traffic — can benefit a vendor around the corner.
In that sense, the government’s role in creating an actual AI advantage is to build or sponsor the baseline architecture: subsidize everything from universal statistical game design to privacy preservation protocols, secure compute, ways of comparing model performance and so forth — so that competing AI agents (yes, micro-managers from all over the world) have a place to gather and a reason to try. Do that, and you’ve not just solved a niche AI challenge; you’ll have created a novel marketplace and a micro-economy in the truest sense.
So, if you truly want to see Australia punch above its weight in AI, skip the lame bullet points about “ensuring digital literacy” and build the infrastructure that helps real people and small businesses harness local knowledge in a big way.
Listen to Hayek, channel your inner Michael Jordan (both, as needed), and create a frictionless micro-economy of intelligence. Maybe — just maybe — people will look at Australia’s approach to AI and say: “That’s cunning. That’s actually different. And it just might work.”
It’s also cheap as chips.