Green Ash Horizon Fund Monthly Factsheet - July 2025
The Horizon Fund’s USD IA shareclass rose +4.39% in July (GBP IA +4.43% and AUD IA +4.34%), versus +1.29% for the MSCI World (M1WO)
- Markets broadly finished July in the green, on the back of a strong Q2 earnings season. With around two-thirds of the S&P 500 having reported, sales have grown +6.0% YoY and earnings +9.4%, beating estimates by +2.6% and +8.2% respectively
- Equities finished the month off the highs, following hawkish commentary from Fed Chair Powell and a variety of tariff announcements from the White House
- The fund was recently awarded an A rating from Citywire and a 5-star rating from Morningstar
Please click below for the monthly factsheet and commentary:
Source: Bloomberg; Green Ash Partners. The Green Ash Horizon Strategy track record runs from 30/11/17 to 08/07/21. Fund performance is reported from the 09/07/21 launch onwards (USD IA: LU2344660977; performance of other share classes on page 3). Strategy track record based on a managed account held at Interactive Brokers Group Inc. Performance calculated using Broadridge Paladyne Risk Management software. Performance has not been independently audited and is for illustrative purposes only. Past performance is no guarantee of current or future returns and you may consequently get back less than you invested. Benchmark used is M1WO Index
Here are some tidbits on the themes.
- This time last year hyperscaler stocks had a wobble following the raised capex plans outlined on 2Q24 earnings calls, and we wrote a rebuttal (Is there a $600BN Hole in GenAI) to a Sequoia blog post arguing that hyperscalers were overbuilding AI infrastructure. A year on, FY25e capex estimates have risen +70% to $355BN and FY26e estimates have risen +90% to $424BN
- As we wrote last month in We Are Building Pyramids Again, none of us are naturally wired to think in exponentials, but AI labs at the frontier remain fully committed to pursuing scaling laws across pre-training, synthetic data generation and reinforcement learning. We should get another data point along this curve shortly with the imminent release of GPT-5 by OpenAI, today at 6pm GMT (coinciding with a reported share sale at a $500BN valuation)
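The scale of the revisions in the first bullet can be back-checked with simple arithmetic. A quick sketch (the year-ago base estimates below are implied by the quoted figures, not stated in the text):

```python
# Back out the implied year-ago consensus capex estimates from the
# current estimates and revision percentages quoted above ($BN).
fy25_now, fy25_rev = 355, 0.70
fy26_now, fy26_rev = 424, 0.90

fy25_base = fy25_now / (1 + fy25_rev)  # implied FY25e a year ago
fy26_base = fy26_now / (1 + fy26_rev)  # implied FY26e a year ago

print(f"Implied FY25e a year ago: ${fy25_base:.0f}BN")  # ~$209BN
print(f"Implied FY26e a year ago: ${fy26_base:.0f}BN")  # ~$223BN
```

In other words, consensus a year ago was pricing roughly $210-225BN of annual hyperscaler capex; today's estimates are nearly double that.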
There have been huge upwards revisions to FY25e and FY26e capex plans, with 2026 looking increasingly locked-in as another strong growth year for the AI infrastructure theme
Source: Bloomberg; Green Ash Partners
AI capex, defined as information processing equipment + software, has added more to US GDP growth this year than consumer spending
Source: Renaissance Macro, Bloomberg
- The ROI on these investments is also becoming apparent, in three forms:
- Driving growth in hyperscalers' core businesses - Meta just reported ad impressions grew +11% YoY and average price per ad +9%, driven by new AI ranking models (Andromeda, GEM, Lattice) and Advantage+ roll-out. GEM increased conversions by approximately +5% on Instagram and +3% on Facebook Feed and Reels. On the engagement side, AI-powered recommendation systems led to a +5% increase in time spent on Facebook and +6% on Instagram this quarter alone
- Increasing operating leverage - On the Microsoft earnings call, CFO Amy Hood pointed to the "compounding S-curves" of efficiency gains at every layer of the tech stack as a key lever for offsetting investment costs
- Accelerating cloud revenues - Microsoft Azure and Google Cloud both reported significant accelerations in cloud revenue growth. Azure, Google Cloud and AWS all reported being capacity constrained - a bottleneck expected to ease in H2 now that NVIDIA's Blackwell production is fully ramped
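Since impression growth and pricing growth compound multiplicatively, Meta's figures above imply roughly +21% YoY ad revenue growth. A quick sketch of the composition:

```python
# Ad revenue ~= impressions x price per ad, so YoY growth rates
# compound multiplicatively rather than add.
impressions_growth = 0.11  # +11% YoY ad impressions (from the text)
price_growth = 0.09        # +9% YoY average price per ad (from the text)

revenue_growth = (1 + impressions_growth) * (1 + price_growth) - 1
print(f"Implied ad revenue growth: {revenue_growth:+.1%}")  # +21.0%
```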
Azure and Google Cloud showed significant growth inflections in 2Q25. AWS showed only modest sequential improvements
Source: Bloomberg; Green Ash Partners
AWS's lag may reflect efforts to promote its own AI chips - by prioritising the Trainium programme (including pushing Anthropic to use it for model training), Amazon has under-ordered NVIDIA GPUs, which are likely to remain the preferred option for its cloud customers
Source: Coatue
- OpenAI released their long-anticipated open-source models, in two sizes: a 120BN parameter model that can run on a single H100 GPU, and a smaller 20BN parameter model that can run on a PC/laptop with >16GB of RAM
- The models are state of the art for their size, and the 120BN version is competitive with some of the frontier proprietary models (including some of those available commercially from OpenAI via API)
- This is something of a power move, undercutting the competition with a free alternative, and suggests the upcoming GPT-5 will be enough of a step up that users will want to pay for it
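The hardware claims in the first bullet are consistent with aggressive weight quantisation. A rough footprint check, assuming ~4 bits per parameter (our assumption, not stated in the text) and ignoring activation/KV-cache overhead:

```python
# Rough weight-memory footprint for the two model sizes above.
# Assumption: weights quantised to ~4 bits/param (not stated in the text);
# activation and KV-cache memory is ignored.
def weight_gb(params_bn: float, bits_per_param: float = 4.0) -> float:
    """Approximate weight storage in GB for a model of params_bn billion parameters."""
    return params_bn * 1e9 * bits_per_param / 8 / 1e9

print(f"120BN model: ~{weight_gb(120):.0f} GB (fits a single 80GB H100)")
print(f" 20BN model: ~{weight_gb(20):.0f} GB (fits a machine with >16GB RAM)")
```

At full 16-bit precision the 120BN model would need ~240GB, so the single-GPU claim only pencils out with low-bit quantisation.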
Both models are highly optimised for the intelligence/parameter-count trade-off, especially compared to other open-source peers
Source: Artificial Analysis
- The US passed the GENIUS Act (Guiding and Establishing National Innovation for U.S. Stablecoins), blessing a novel store of value that acts as a bridge between the world of crypto and traditional finance. The stablecoin market capitalisation sits at $250BN (equivalent to about 1% of global M2 money supply, up from $20BN five years ago), and by some estimates could expand into the trillions in the next few years
- Amazingly, stablecoin volumes now exceed Visa and Mastercard combined; however, around 90% of this volume relates to crypto trading
- Stablecoins are a clear win for the US government, as they bolster the US dollar's status as the world's reserve currency and create a major new buyer of short-dated US Treasuries
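The growth rate implied by the figures above ($20BN to $250BN in five years) is worth making explicit:

```python
# Compound annual growth rate implied by the stablecoin market cap
# figures quoted above ($BN).
start, end, years = 20, 250, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%} per year")  # ~66% per year
```

Sustaining even a fraction of that rate would put the multi-trillion-dollar estimates within reach this decade.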
Total stablecoin volumes now exceed Visa and Mastercard combined
Source: BCG, Apollo Chief Economist
The use of stablecoins in payments is still substantially smaller than the dominant payment networks
Source: Cex.io, Visa, Mastercard, Apollo Chief Economist
If the stablecoin market expands to the $2-3 trillion estimates, it would be a larger buyer of short-dated US Treasuries than Japan and China combined
Source: US Treasury, Macrobond, Circle, Tether, Apollo Chief Economist. Note: USDT is as of Q125 and USDC as of May 2025
- Quantum computing company IonQ partnered with AstraZeneca, AWS and NVIDIA, successfully applying a quantum-classical hybrid workflow to a class of chemical transformations used in the synthesis of small molecule drugs. By integrating IonQ's Forte quantum processing unit (QPU) with NVIDIA's CUDA-Q platform, the team achieved a >20x improvement in time-to-solution compared to previous implementations. The technique maintained accuracy while reducing the overall expected runtime from months to days
- Quantum computers are currently too small and error-prone for commercial drug discovery; however, quantum-classical hybrid architectures may help speed time to market, and IonQ expects the first QCs capable of narrow quantum advantage to arrive later this year
- Adding to calls from hyperscalers and frontier AI labs for more power generation, Anthropic published a paper entitled Build AI in America, which estimates 50GW of new power capacity will be required by 2028 for the US to maintain its leadership in AI
- Utilities do seem to be rising to the challenge - American Electric Power (AEP) increased its five-year capital plan by 30% to $70BN, citing 24GW of contract-backed customer demand through 2029, three-quarters of which comes from AI datacentres
- The EEI expects its member utility companies to invest $1.1TN to support growing electricity demand over the five-year period 2025-29 - almost as much as over the previous ten-year period, 2015-24
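Annualising the EEI figures above makes the acceleration clearer. A sketch - the prior ten-year total is an assumption of ours (taken as ~$1.2TN, consistent with "almost as much"), not a figure quoted in the text:

```python
# Annualised run-rate comparison of utility capex ($TN totals).
five_yr_total, five_yr_span = 1.1, 5    # 2025-29 (from the text)
ten_yr_total, ten_yr_span = 1.2, 10     # 2015-24 (assumed total, not quoted)

print(f"2025-29 run rate: ${five_yr_total / five_yr_span * 1000:.0f}BN/yr")
print(f"2015-24 run rate: ${ten_yr_total / ten_yr_span * 1000:.0f}BN/yr")
```

On these assumptions the annual investment rate roughly doubles, from ~$120BN/yr to ~$220BN/yr.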
US electric loads were stagnant during the 2010s, but are growing again due to AI datacentre expansion
Source: Energy Institute; BofA Global Research