Beyond Moore: Part 2, Robots in Disguise with Robot Pigeon
Part 2 of 2 - New frontiers in a generation-defining industry.
Calling all Pigeon fanciers!
In this edition:
7 Things We Like (2 min read).
Portfolio News: Including an electrified race, and a new chip with 2x efficiency (3 min read).
Deepdive Part Two: Beyond Moore. We outline the transformation beyond transformer models, a new frontier for semiconductor hardware, and the embodiment of artificial intelligence (7 min read). Full article here.
Frontierspeople: Podcasts, posts and more!
Live long and prosper,
AndrewJS, AndrewG, Harry, Rebecca, Wayne and Grant.
Unsubscribe any time here.
7 Things We Like
Marc Andreessen’s recent comments - on the a16z Podcast - discussing AI changing the generalists vs specialists debate when it comes to building generation-defining companies.
Dealroom’s Global Tech Ecosystem Index 2025, which crucially highlights that Paris has overtaken London as Europe’s biggest tech ecosystem.
A recent conversation by Freethink on why the future of pharma is off-planet (we agree, and have backed BioOrbit for the future of in-space drug manufacture, starting with antibodies for intravenous cancer treatment).
The Performance Paradox in Venture Capital by Equidam, discussing the value of larger portfolios and high-conviction funds at pre/seed stage.
Josh Kopelman discussing the ‘Venture Arrogance Index’ on the Uncapped Podcast with Jack Altman, as a simple formula to evaluate a VC firm's business model. It uses five core variables: Fund Size [$], Investment Window [Years], Percent Ownership on Exit [%], Target Investment Multiple, and Total Value of Venture Startups [$/Year].
This new tool - for UK readers - to check how much of your pension is invested into UK growth assets - including venture. In the UK and Ireland, just 0.007% of pension fund assets under management are invested into VC - lower than the European average, and a world away from North America, where 15% of pension fund assets are allocated to private equity, including VC.
Not new, but still a great piece from Omri Drory on How VCs Become Assholes - and how to stay focused on the bigger picture in an industry dominated by the power law, which brings plenty of negativity and failure along with it.
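Kopelman's "Venture Arrogance Index" above can be sketched as a back-of-envelope calculation. The exact formula isn't spelled out in the episode summary, so this is one hypothetical reading of how the five variables might combine: the share of all venture value created during the fund's window that the fund implicitly claims it will capture.

```python
# Hypothetical sketch of the "Venture Arrogance Index" using the five
# variables named above. The exact formula isn't given here, so this is
# one plausible reading, not Kopelman's definition.

def venture_arrogance_index(
    fund_size,             # $ committed to the fund
    investment_window,     # years over which the fund deploys
    pct_ownership_exit,    # fraction of portfolio companies owned at exit
    target_multiple,       # desired return multiple on the whole fund
    venture_value_per_yr,  # $ of total venture startup value created per year
):
    # Exit value the portfolio must produce for the fund to hit its
    # target: returned capital divided by the stake the fund owns.
    required_exit_value = fund_size * target_multiple / pct_ownership_exit
    # Total venture value created during the fund's investment window.
    market_value = venture_value_per_yr * investment_window
    # Share of all venture value the fund implicitly claims it will capture.
    return required_exit_value / market_value

# Illustrative numbers: a $1B fund, 3-year window, 10% ownership at exit,
# 3x target, against ~$300B of venture startup value created per year.
share = venture_arrogance_index(1e9, 3, 0.10, 3, 300e9)
print(f"{share:.1%} of all venture value")  # ~3.3%
```

The "arrogance" reading: the bigger the fund and target multiple relative to the market, the larger the slice of all venture outcomes the model quietly assumes.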
Portfolio News
Is it a car? Is it a plane?
Aerovolt starred in the UK’s first race between an electric car and an electric plane, pitting Top Gear / Grand Tour's James May against Richard Hammond. The race - featured on the DriveTribe channel - sees a Porsche take on a Pipistrel aircraft (piloted by Philip Kingsley-Dobson), using Aerovolt’s easy-to-use charging network to showcase point-to-point flying between multiple airports.
Reversing into a new era of compute
Vaire completed initial tests of its new chip components, demonstrating their potential to halve the electricity required to run many computations, including those used in artificial intelligence. Vaire Computing has the potential to completely decouple our demand for compute from our demand for energy, with reversible computing systems that are up to 5,000x more efficient than today’s "irreversible" computer processors. This has boundless potential to transform the future of AI.
Dodging DOGE
Stotles unveiled its $13m Series A funding round, led by Headline and Acton with participation from Form Ventures and Seedcamp, as they continue to build the first platform to cover the entire commercial process of selling to the public sector. The team have recently redesigned and relaunched their platform, becoming the leading partner of private sector suppliers in the space. The latest version supports the end-to-end commercial process: creating strategy, building qualified pipelines, finding the right tenders, and qualifying those opportunities to generate a first draft of a bid.
The machine is alive
Biological Black Box (BBB-Tech) emerged from stealth with its Bionode platform - a computing system that integrates living, lab-grown neurons with traditional processors. “Over the last 20 years, three independent fields - biology, hardware, and computational tools - have advanced to the point where biological computing is now possible,” said Alex Ksendzovsky, BBB’s co-founder. The platform offers the potential for order-of-magnitude shifts in efficiency and adaptability compared to conventional GPUs.
Speaking of change…
Metavoice shifted the business model for its hyperrealistic voice AI to a developer-focused platform. The team has achieved technical breakthroughs to enable a step change in efficiency for speech-to-speech processing, as it continues to deliver a voice experience that sounds as familiar and natural as a human agent.
Talking things through
Limbic started rolling out a new voice AI agent - Limbic Intake Agent - to help connect patients to behavioral health services. It will initially help US partners with overflow, answering calls when an existing center’s lines are busy or closed. The AI responds to patients in real time and can field an unlimited number of calls, freeing up overburdened staff. Its name can be customized to align with a provider's brand, but it will always introduce itself as an AI.
Sharing the load
Gensyn’s Testnet continued to signal a defining shift towards decentralized ML training. In just two months, the team has connected more than 56k on-chain accounts, processed 6.7M on-chain transactions, and trained >45k decentralised models through RL Swarms.
DEEPDIVE
Beyond Moore: Part 2, Robots in Disguise
In this second half of our two-part post, we look at the opportunities and challenges as AI evolves. Read part 1 here.
Back in the early years of the world wide web, everyone thought it was all about “portals” like Hotbot, Yahoo and AltaVista.
They delivered stock news, weather, sports AND search. It wasn’t until Google blew them away - with better technology, and the key insight that it was critical to get people OFF its website as soon as possible by delivering a good search result - that everything changed. Essentially overnight in business terms, all the competitors went the way of the horse and cart.
So with AI, the jury is still out on whether the large-model companies (OpenAI, Google, Meta, Inflection) and the startups building on top of them will win out. Clumsy analogy it may be, but as Rory O'Driscoll of Scale Ventures said recently, “No one knows nothing right now”.
So it's possible the defensive moat of today’s leaders is more fragile than we assume, especially given the amount of work being done in the open-source domain.

With go-to-market timelines compressing rapidly, companies are now able to go from research to billion-dollar products in less than 12 months. The number of new foundation models released has doubled in the past year alone, with 65% of them open-source.
Transformation Beyond Transformers
We’re also incredibly early in the drive towards artificial ‘super intelligence’ - an AI that surpasses human intelligence in all aspects. Though here things become murky. Surpassing all human knowledge has arguably been achieved already: most of the major LLMs “know” more (cribbed from the internet) than any human possibly can. But while rapid advances toward next-generation AI seem inevitable, transformers can only “see” a fixed number of tokens at a time, and compute costs grow quadratically with context length. Some would argue that while LLMs will continue to sound smarter and smarter, they will never reach truly human-level “intelligence” (depending, of course, on how you define intelligence).
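That context-length cost can be made concrete. A rough sketch (the constant factor is a common approximation, not an exact figure): standard self-attention compares every token with every other, so compute grows with the square of context length.

```python
# Rough sketch of why longer context windows are expensive: vanilla
# self-attention forms an n x n score matrix, so compute grows ~n^2.
# The "2 * n^2 * d" constant is a common rough estimate, not exact.

def attention_flops(n_tokens, d_model):
    # QK^T scores plus the weighted sum over values: ~2 * n^2 * d FLOPs.
    return 2 * n_tokens**2 * d_model

d = 4096  # hidden size (illustrative)
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} tokens -> {attention_flops(n, d):.2e} attention FLOPs per layer")
```

Each 10x increase in context costs roughly 100x more attention compute, which is why "just make the window bigger" is not a free lunch.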
“As the amount of training data increases and models get larger, it becomes harder to differentiate between memorization versus real learning or intelligence. From a product standpoint, this may not matter, but it will cause huge frustration as soon as you ask about something the model wasn't trained on, but could be worked out with common sense,” says Zehan Wang, CEO of Paddington Robotics.
A huge part of the problem is that common sense involves things that we take for granted, and often don’t write down in the content that AI models are trained on.
A lack of native memory, limited interpretability, and high retraining costs present opportunities for new architectures like State Space Models (SSMs), Mixture of Experts (MoE), Neural Module Networks (NMNs), and neurosymbolic architectures.
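Of those, Mixture of Experts is the easiest to sketch: instead of running every parameter on every input, a small gating network routes each input to one of several specialist sub-networks. A minimal, illustrative version (the tiny linear "experts", shapes, and top-1 routing are our simplifying assumptions, not any particular production system):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts = 8, 4

# Each "expert" is a tiny linear layer; the gate scores experts per input.
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))

def moe_forward(x):
    # The gate picks the single best expert for this input (top-1 routing),
    # so only 1/n_experts of the parameters are touched per token.
    scores = x @ gate_w
    expert_idx = int(np.argmax(scores))
    return experts[expert_idx] @ x, expert_idx

x = rng.standard_normal(d)
y, chosen = moe_forward(x)
print(f"routed to expert {chosen}, output shape {y.shape}")
```

The appeal is exactly the economics discussed in this piece: total parameter count (capacity) can grow without the per-token compute growing with it.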
“I don’t think we’ve yet seen the technology which will take us to artificial general intelligence - true AGI,” says Andrew J Scott, 7percent’s Founding Partner.
A centralized, data-center-focused model of development may also prove simply too limited for the scale required. Decentralized systems being developed by GensynAI may provide a solution to the vast compute capacity needed.

Embodied AI and robotic systems will need on-the-fly learning, to create compelling, useful robots which learn and adapt in real time based on the world around them.
Ben Fielding suggests that “...we’ll likely move past benchmarks as the standards for assessment and over to pure productisation…the big labs are still competing on benchmarks as a customer acquisition tool, but they don't matter. What matters is usability and whoever acknowledges that first and doubles down hard will win more users.”
New Physical Limits
The progress of computing until now has been facilitated by relentless improvement in processing power.
In 1965, Moore’s law predicted that computing power would double every two years, driven by the rapid growth in the number of transistors on a microchip. Remarkably, this has held broadly true for half a century. Increasingly powerful mobile devices and the falling cost of technology in general have all depended on this progress, along with increased storage capacity (Kryder’s law), data transmission (e.g., Butters’ law) and broader economies of scale (Wright’s law).
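The compounding implied by that doubling is worth spelling out: half a century at one doubling every two years is about 25 doublings.

```python
# Doubling every two years compounds dramatically: ~50 years is 25
# doublings, a factor of 2^25 (about 33 million).

def moores_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(f"{moores_factor(50):,.0f}x")  # 33,554,432x
```

That ~33-million-fold improvement is the quiet engine behind everything from smartphones to cheap cloud compute.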
But as transistors reach atomic scale, they are nearing the minimum size physics will allow. IBM’s current experimental transistors are only around one order of magnitude larger than the silicon atoms used to make them!
In addition, with today’s architectures using logic gates that destroy a bit of information in every operation (releasing the corresponding energy as heat), the Landauer limit places a hard physical floor on how energy-efficient existing semiconductor architectures can be.
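The Landauer limit can be stated precisely: erasing one bit of information at temperature T dissipates at least k_B·T·ln 2 of energy. A quick calculation at room temperature shows the floor - today’s chips sit orders of magnitude above it, which is the headroom reversible computing targets:

```python
import math

# Landauer limit: minimum energy to erase one bit is k_B * T * ln(2).
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI definition)
T = 300.0           # room temperature, K

e_min = K_B * T * math.log(2)
print(f"{e_min:.2e} J per bit erased")  # ~2.87e-21 J
```

Reversible architectures sidestep the limit entirely by not erasing bits in the first place, which is why they are not bound by this floor.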
There is now heavy investment in sticking a bandage over this energy problem, with a raft of startups focused on novel cooling or energy solutions that may provide incremental improvements. But these are unlikely to be the big winners of the next decade, dependent as they are on the existing architecture.
For new entrants to win, they must deliver a real step change in capability, not incremental progress. It’s very hard, for example, to beat Nvidia on GPUs for training models, with its CUDA ecosystem fighting off competition from an increasingly well-funded open-source ecosystem.
And today’s hardware is expensive: Meta’s 2023 LLaMA model was trained on $30m of GPU hardware. Training models and re-training them for certain tasks is currently the most capital-intensive part of the revolution.
Fundamentally new hardware approaches (analog, photonic, reversible, neuromorphic, quantum, biological and 3D chips) are poised to disrupt the status quo. Leveraging scientific breakthroughs - materials such as graphene and silicon carbide, each with unique properties - could enable smaller transistors or entirely new types of chip. The big winner(s) will be those who transcend the current stack.
However, hardware alone is not enough. Innovators will have to bridge the gap between their hardware and the software which will use it, in order to compete.
Funding in AI chips and processors has been dominated by the US (45%) and China (27%), with Europe accounting for just 6% of investment in the sector.
The UK and Europe also missed the starting gun for AI, despite decades of academic research in the field. The clock is now ticking on the chance to catch up.
Europe and the UK must focus on areas of excellence - quantum, next-generation compute and AI - to avoid the US owning the next digital decade, as it has dominated the internet, cloud storage, search, email and, thus far, AI.
Technology Is The Future Of Everything
Whether and how we are able to solve humanity’s hardest problems, from curing disease to climate change, will hinge on the success of these new technologies. The companies providing that technology will become the drivers of the global economy.
For society, technological resilience means investing heavily in technologies that span the compute stack, as well as the software and hardware infrastructure that supports it. There’s little point having a data centre if someone can take out your power supply with a software hack.
We must retool for the coming age of embodied AI: intelligence moving off our computer screens and into physical-world robotics we interact with day to day. Leadership by free, democratic nations in these fields is critical to avoiding not just long-term economic decline, but also societal manipulation or loss of liberty.
AI represents a potential challenge to our existing way of life - from fake news and financial scams to future robots gone rogue - so it’s vital we get to set the regulations that ensure our safety. That’s difficult to do if you're beholden to others.
We are witnessing a tidal wave of ambitious founders tackling these problems at the intersection of compute and AI. Those breakthroughs offer the most visionary investors opportunities for outsized returns, and the chance to help shape the society of the future.
At 7percent, over 25% of our investments have been made as part of our Future Compute thesis, focused on next-generation architectures and foundational intelligence, with portfolio companies including Gensyn, Nu Quantum, Magic Pony Technology (Exit to Twitter), Plumerai, Universal Quantum, Vaire, and (most recently) Paddington Robotics.
Special thanks to Ben Fielding of Gensyn and Zehan Wang of Paddington Robotics for adding their insights.
Our Frontierspeople
Andrew and Wayne attended the 2025 edition of DragonChasers - cofounded by Andrew - bringing together >80 of Europe’s leading GP/LPs in Marrakech and the expansive Moroccan desert.
The latest editions of the Upside Podcast with Andrew J Scott are out now.
Andrew Scott featured on BVCA Accelerate’s panel - From Lab to Market: Accelerating UK Spinouts.
Harry featured on a panel at Barclay’s Eagle Labs, discussing ‘What VCs Look For?’ - along with Concept Ventures, Superseed, Amadeus, and Philip Hare & Associates.
Harry and Wayne both featured at LVCN Summit - discussing ‘Succeeding in Deeptech’ and ‘The Gen AI debate’ respectively.
Wayne attended the 2nd installment of Mashup Malmo - a 3 day event bringing together leading investors and entrepreneurs from the Nordic region, including a 24 hour AI hackathon.
Where we’ll be in June
Attending SuperReturn, Berlin, 2nd-6th June
Attending London Tech Week, London, 9th-13th June
Attending GVC Symposium, London, 17th-19th June
Attending Blue Earth Forum, London, 24th-26th June
Onwards and upwards!
You’ve received this email as a known contact of the 7percent team or because you requested them. Click here if you would not like to hear from us in future.