Uplink: AI, Data Center, and Cloud Innovation Podcast

Navigating the Future of Data Infrastructure with Jonathan Atkin

Megaport Season 1 Episode 7

AI is driving data infrastructure to a breaking point.

In this episode of Uplink, Jonathan Atkin—Managing Director and Global Head of Communications Infrastructure Investment Research at RBC Capital Markets—joins host Michael Reid to break down the shifting landscape of digital infrastructure.

They explore how:

  • AI is fueling massive increases in capital expenditure
  • Hyperscalers are adapting their strategies to stay competitive
  • Starlink and satellite connectivity could reshape rural access
  • Power and cooling innovations are redefining the data center
  • Private infrastructure developers are gaining momentum

Jonathan also offers his perspective on the ROI challenges of AI infrastructure—and why companies are still betting big.

--

🚀 Uplink explores the future of connectivity, cloud, and AI—with the people shaping it. Hosted by Michael Reid, we dive into cutting-edge trends with top industry experts.

👍 Follow the show, leave a rating, and let us know what you think. New episodes every two weeks.

🎧 Listen on Spotify, Apple Podcasts, or wherever you get your podcasts. All streaming links available at: https://www.uplinkpod.com/

📺 Watch video episodes on YouTube: https://mp1.tech/uplink-on-youtube

🔗 Learn more:
Megaport – https://www.megaport.com/


Speaker 1:

Welcome to Uplink, where we explore the world of digital infrastructure, uncovering the technology fueling AI and cloud innovation, with the leaders making it happen.

Speaker 2:

Well, John, we finally get to catch up — absolutely — at ITW. And now you're — where are you? Are you in Singapore? Not just ITW, but I went to, I think it was, your world tour in San Jose. Was that the first stop, or?

Speaker 1:

It's true, that's true. You travel more than me, which is impressive. Most people would say unhealthy, actually.

Speaker 2:

I'm in Singapore. So Hong Kong and Singapore this week, so I'm in Singapore right now.

Speaker 1:

Fantastic. Well, thank you for taking the time to do the session. We've been rolling out a whole lot of these Uplink podcasts, getting a sense for what's going on in the world. We're talking all things AI, data centers, and everything in between, including the connectivity components. I think it's probably worthwhile to do a quick intro from your side on your role, and then I can add some color to it as well — certainly how the world sees John Atkin. Maybe just intro your position and then I'll add that color to it.

Speaker 2:

So, I'm based in San Francisco, and I head up at RBC the investment research team that's focused on all things data infrastructure — so that's mobile infrastructure, towers, small cells, connectivity, which is principally fiber, data centers, and AI. And then, tangentially — if you want to learn about AI, you have to kind of become an involuntary expert in things like the transmission grid, liquid cooling, and so forth. My job has gotten a lot more interesting over the last two or three years with some of these tangential sectors around liquid cooling and software and chips, and you see a lot of capital coming into the sector from real estate and other adjacent sectors. So a lot of it's coming together, and I'm happy to spend time with you.

Speaker 1:

Yeah. Well, in your world, I'd say it certainly feels like the hottest moment in time, probably ever, for your particular world — and I know you've been in this game for a while, particularly focused on your area of expertise. But, as you just explained some of the facets that have changed in that space: is the change off the charts, or did you feel something similar before?

Speaker 2:

So I think if you go way back to prior decades: social networking — Facebook at the time — was at one point the leading lessee of third-party data center space on the planet. They went public and then used their balance sheet to build gigantic campuses. At the time, an eight-megawatt lease — in fact, here in Singapore they signed an eight-megawatt lease — was considered absolutely beyond all proportion. And then we got cloud a couple of years later — Amazon principally, followed by Microsoft and Oracle and Google — so you had those four companies with various types of cloud platforms. Social networking, by the way, never stopped, so a lot of ancillary drivers drove us through the last decade, and those continue. I mean, the cloud companies are still putting up double-digit percentage growth in revenues for their cloud divisions. And then AI kind of came along towards the end of 2022, and that's just put a punctuation mark on some of these previous cycles. Yeah.

Speaker 1:

And probably, as you were just talking there, what's interesting is that even though double-digit growth doesn't sound like an insane amount, the reality is it's off such a massive base. If you compared, say, 20 points of growth now to three years ago, when they were doing 200% growth or something, it's still significantly more that they're adding — into the grid, into the power they need, into the data centers they need access to. So even though, percentage-wise, it doesn't sound insane, it still is an insane amount of requirement. And that's without, like you said, even thinking about AI.

Speaker 2:

AI — that's correct. So a gigantic base and still, in some cases, multiple double-digit growth. What AI has brought is the need to invest in even more powerful GPUs. I mean, GPUs do run part of the cloud, but GPUs are absolutely essential for things like AI training. So if you look at the cost of a data center versus the cost of the equipment inside a cloud data center, those are comparable — but the kit is more expensive than the data center itself, and that's even more distorted with AI training. And so the CapEx budgets of these big hyperscalers — as they need to even enter the AI race, much less stay competitive — have gone quite a bit higher. I think the smallest hyperscaler CapEx budget we're seeing is in the $65 billion range, and then it goes higher from there.

Speaker 1:

It's amazing. And so, for folks listening, John is an analyst for RBC — probably one of, if not the, most world-renowned analysts in this space globally. I think the majority of your focus is in the US, but you also cover all of the global DCs and towers, and you look after us as well in terms of connectivity. You're across all these different trends and environments, and your job is to actually model this and provide guidance and thoughts for investment firms and so forth as to how to think about it. When the fundamentals of the CapEx requirements that these companies need change, what does it do to your models? How do you change how you think about that? Do you have to fundamentally think more long-term in terms of what it could be in the future? Do you have to predict further out — even though that sounds scary, trying to predict more than two years out? How does it play out for you?

Speaker 2:

There are different clients. There are those that are long-term focused — pension funds and sovereign wealth funds — so they're thinking in terms of half a decade, a decade plus. And then you have other types of clients that are thinking about the next quarterly results. They're also thinking big picture, but quarterly or half-yearly changes in some of the KPIs can definitely move how valuations are perceived in the public market. So I have to think both short-term and long-term.

Speaker 2:

Despite the staggering amounts of capital that these big software and content and internet companies are spending, they generally are cash-flow-positive companies. They can afford to spend it for the foreseeable number of years, so it's not necessarily stressing the balance sheets of these hyperscalers. The question that's going to become more and more important to their investors — and that's not a space that I cover directly — is: are they getting a sufficient return on the triple-digit billions of investment that have gone into AI infrastructure? They get asked this question every once in a while on their earnings calls, and there's no answer that they're willing to quantify at this point. I think Microsoft is the one company that gives you the most precision as to what percentage of their cloud revenue growth they attribute to AI, so they're somewhat helpful, but the actual return on investment is something they're not communicating at this point. And, frankly, all of them still view themselves as in the early innings of this race, where you need to be competitive by having a large foundation model. But we're pivoting more towards inference — if you view training as a cost center, inference is how you're going to actually run these models and make money.

Speaker 2:

Lots of different approaches are being tried, and nobody really knows to what extent this is going to drive returns five or six years from now. And not to get too detailed, but I think the important thing to keep in mind is that this expensive GPU kit they've been spending money on doesn't last forever — the useful life is finite. And so, as we get closer to the end of the decade, they're not only going to have to spend for growth to stay competitive in terms of their model infrastructure, but they're going to have to replenish stuff that wears out. At that point, I think they're going to be asking themselves the question as much as investors are asking them now: what return did we get? And then we're going to see a bit of a fork in the road, where some companies conclude that they're making money and continue to spend a lot, and others might pull back. But that's really a three-to-five-year-out question that nobody frankly knows the answer to at this point.

Speaker 2:

Yeah. On the fiber side, I would say that, as you would well appreciate, AI is clearly leading to a lot more data center construction and data center leasing by third-party providers. But also, if you're in the fiber business, you're seeing a big jump in both wavelength sales and dark fiber sales, so there's been a direct impact there. Mobile infrastructure, which is the third area that I focus on — there's not as much of a direct linkage. That's just more based on mobile data traffic, and AI doesn't really affect mobile traffic, at least at this point.

Speaker 1:

So you've seen a change, particularly in fiber and wavelengths, in the last year or so? When would you say AI started to have some effect? Or is it actually just the broader market starting to gain some momentum post-COVID or something like that?

Speaker 2:

You know, I think post-COVID there was a catch-up, and Cogent is a company that's talked about how, during the pandemic, they saw multiple years' worth of gap-up in traffic usage within a short amount of time — maybe three to five years' worth of growth, I forget exactly what they said, compressed into one year — and then towards the end of the pandemic they continued on their regular growth path. But in terms of things like strand count connecting data centers across the metro and between cities, and then subsea capacity, I'd say beginning in 2024 it became quite apparent — and even during middle to late 2023 — that the demands around model building and running the models were just going to consume more bandwidth resources.

Speaker 1:

Yeah. I mean, obviously you've seen the number of chips that have been procured, the data center space they've gone and taken up, the data centers that have been built, the CapEx that's been put in to build more data centers to try to get access to more of those, and the chips that have been bought by every other company that's trying to hustle after it. We talk about this a lot: a data center is an island if you don't connect it, and so the connectivity is coming fast behind that. We've seen that pick up, certainly in terms of the number of AI GPU-as-a-service providers that Megaport's on-ramping.

Speaker 1:

And I saw CoreWeave — I think they announced some results just recently. That thing was like some insane number — 4 million percent growth year on year in revenue — and billions of dollars of backlog. And what I didn't realize about CoreWeave was — I was thinking CoreWeave was far more about servicing one large customer, being Microsoft, and maybe ChatGPT or whatever it is, but I think they actually reported some 1,600 customers. So what's interesting is that it looks like more and more customers are starting to adopt GPU-as-a-service, or their own training models, or whatever it may be — which, a year ago, they weren't sure about; the enterprises were still struggling to figure out what they were doing with these models. It certainly feels like it's come over the hype and gotten a bit more realistic for enterprises beyond just three or four players.

Speaker 1:

And then, for us, we're seeing this inference piece play out. So, you know, if you follow Groq — G-R-O-Q, a chip company, different to NVIDIA, that's very focused on inference at the edge — I just saw them publish another result, and it's just incredible speed. So then it becomes about low latency and how quickly you can process these AI experiences, and they're rolling out everywhere as well. So have you got a perspective on this inference component? I don't know — it actually feels very hard to keep across, to be honest; it's pretty fast-moving. How do you look at it?

Speaker 2:

Yeah — so in terms of actual power draw, Groq actually uses lower-power chips, so they have a lot of deployments, but you're not necessarily going to see it in the power metrics the way you would with, say, an NVIDIA Blackwell architecture. But I think it's going to be a continuum, where you see inference at the far edge — that would be on the Waymo vehicle where I live in San Francisco, or maybe on the next version of the iPhone with Apple Intelligence, or on the factory floor. So there's the on-premise far edge, where clearly part of it is going to be deployed, and then a lot of it's going to be in larger data centers close to where the cloud is. So it's an interesting topic to ponder whether inference is going to be deployed at scale in, say, 20-megawatt data centers.

Speaker 2:

To take a Sydney, Australia example: I think Eastern Creek is where you're going to see a lot of inference, because that's where the cloud is, but in Mascot or Alexandria or, closer in, Macquarie Park — maybe not so much. I think inference needs access to cloud databases, cloud applications, cloud storage, and so for latency reasons it's possible that inference will be cloud-adjacent, where the latency is quick enough to most of the GDP and most of the end users that you don't need to be inside the ring road of whatever metro we're talking about. It can be out where there's a lot more contiguous capacity and space in blocks. There will of course be exceptions, and there'll be use cases, I'm sure, where inference populates smaller data centers as well. But if you measure it by megawatts of energy draw, I tend to think it's going to be further out than people think, and the latency argument would support that. And then you've got the far edge, where there clearly will be an imperative in some applications for inference to take place on the device.

Speaker 1:

Yeah. And inference — as you pointed out, some of these chips coming out that are doing the inference component aren't burning much power. They're very, very power-efficient, from what we can see, and heat-efficient as well, so you don't need to cool them the same way, you don't need the same racks. So that does change what I think is required for inference. And I don't know if you've heard some of these rumors around Grok — Elon's AI farms that he's building out. I think the story was that ChatGPT could get 50,000 NVIDIA chips to connect to each other.

Speaker 1:

With Grok, you know, Elon figured out how to do 100,000, and he put it in that crazy factory, and then I heard rumors he's up to pushing 300,000 and has plans for a million GPU chips. Now, that is a very different story — that is an unbelievable amount of power, compute, and heat. You would have to assume that if they're going down that path, ChatGPT is going to go down that path, and all these other folks are going to have to follow — you'd see Google, and even Facebook, or at least Meta with their Llama projects, follow. It seems to be early days, even though people were having this perspective that NVIDIA was going to have this decline now, a bit like Cisco. But I'm almost sensing even more — maybe, I don't know. Do you have a

Speaker 2:

A view? I mean, training and retraining is still going to be a very significant growth vector, and you're going to need massive capacity blocks. So you've got Project Stargate in the middle of Texas; you've got the Tennessee project that you referenced with Grok. Here in Asia, where I am right now, you've got a lot of land banking that has taken place over the last several years — some of this even predates AI — where the hyperscalers haven't even exercised the rights to purchase the land. But there is optionality to build bigger and bigger models to stay competitive in the race.

Speaker 2:

But that'll be, again, for training and retraining. There will be some inference, obviously, but that growth vector is not going to end anytime soon. And then, on top of that, you're going to see this pivot where inference is an even faster growth rate, and that's where you're going to see a lot of deployments closer into the metros for performance reasons. So it's not either/or — it's both — when it comes to where the investment's happening.

Speaker 1:

It's everything now. Everyone freaked out when DeepSeek came out early in the year, and they were like, oh no, no one needs anything, and there was a blip and everything started coming down. And then there were these rumors — I think there was a rumor that Microsoft had turned off one of their data center leases. I think we've now figured out — I'm not even sure I'm asking, I'm not leading you here — but I think we've proven that a lot of that was overhyped, and the reality is they're still taking out space and expanding. DeepSeek was a blip; it certainly opened our eyes to certain things, but in reality we pivoted back to probably where we were prior — and if anything, like I was saying with the Grok piece, it's more and more training. I know you probably know a bit more of the detail around that, but they're not slowing down the leases — is that true, or has that changed?

Speaker 2:

I mean, I can just point to the last two weeks: a lot of these companies that you mentioned have actually reported their calendar first-quarter results and have reaffirmed — or even increased, in the case of Meta — what their CapEx is going to be for the remainder of the year. It's increasing, and probably increases to some degree next year as well. The lease-versus-build mix always shifts, and for some companies in particular regions, I think what you can say is this might be a year of deliveries. In some cases they may be taking a pause on new leasing to fill in the capacity that they've signed up for over the last couple of years.

Speaker 2:

Every company is on a slightly different part of the curve, and so what you find is, if one company is taking a temporary pause in leasing — which I think is fair to say — then others are certainly aware of that and picking up the slack. A company that I cover, Digital Realty, put up their third-biggest leasing quarter in calendar 1Q, and this was immediately during the time when people were panicking, as you alluded to. So I think it could be a mistake to focus on just one or two companies. One thing I've noticed is that the breadth of companies signing double-digit-megawatt leases is actually bigger today than it was several years ago. So it's multifaceted, and focusing on just one or two drivers can sometimes lead you to the wrong conclusion.

Speaker 1:

Yeah, agreed. Well, that's good insight, because I think people over-rotate to the upside and over-rotate on the downside rapidly. The problem is that everything's good when it's a consistent line and everyone knows exactly what's returning, what's working and what's not working, et cetera. The challenge in our space, to your point, is everyone's trying to figure out the killer apps: what's actually solving the problems, what doesn't have this hallucination issue, where does it work? I feel like we're getting closer and closer to customers — or at least enterprises — understanding the value that AI can bring.

Speaker 1:

We're still trying to find, I think, the killer apps — this potential ability to add human-like functionality into the office, be that accounts receivable, accounts payable, all these functions underneath that AI would magically just take control of; or, if you're a law firm, AI could just be leveraged for that. But we're still at this point where there are hallucinations, it can't be guaranteed, and there are all these pieces where you then need to audit the work that's been done by it. So there's this argument over whether it's actually adding a great deal of value or not.

Speaker 1:

I think we're starting to see it cross that — I feel that, anyway, certainly from a coding perspective. I know all of our developers are getting huge leaps from copilots and so forth, but it's not replacing the humans; the humans are now doing the audit work on the code that's been delivered. Arguably you're moving faster than you were before, but you're not replacing people or doing some sort of human arbitrage or whatever it may be. So have you had a think about that space, or got a view on how that's playing out?

Speaker 2:

I mean, in my friend and acquaintance circles, I know folks who are starting companies that, with a fraction of the headcount they thought they needed, can be amazingly productive. So just as we talk about web-native companies, you can maybe talk about AI-native companies, and we're seeing that productivity boost you alluded to in maybe the most pronounced fashion there. And then, the larger the company — and it depends on the industry, of course — everybody's asking themselves the question: what should be our AI strategy? Just like, you know, 14 years ago people were asking what should be our cloud strategy. So there's a lot of pausing, reflecting, seeing what works and what doesn't. But I'd say in some cases, the smaller the company and the more recently it was put into existence, the more nimbly it's able to take advantage of these innovations.

Speaker 1:

I want to pivot and ask you another one, because I know we've only got — what have we got, 10, 15 minutes with you? Do you follow Starlink at all? I've got a couple of questions, particularly on its impact on towers. I hear this rumor a lot and I'd love to get your perspective: if Starlink is successful everywhere, why would we have 4G and 5G towers? I'm sure you can answer that question for the listeners, but it's one that pops up, and there's probably some good technical reason. I'm just curious as to how you would respond to something like that.

Speaker 2:

It's powerful. We could have done this video call had you been in the Grand Canyon, or I had been in Yosemite, over Starlink, with pretty high resolution — and it's gotten better over the last several years. But you do have to be able to see the sky; there has to be line of sight, and the latency is not instantaneous. So what I would say is it's a portable service, a nomadic service.

Speaker 2:

In rural areas, it can be a viable alternative to things like fixed wireless or cable or fiber access, but it's not a substitute, it's an augmentation.

Speaker 2:

So, as it relates to towers — which carry mobile traffic, with handoff, wherever we use our mobile devices — towers are still kind of indispensable. I think when you get to the fringes, the very rural areas where companies might have been thinking about deploying fixed wireless access — which is a thing in the States that T-Mobile, Verizon, and AT&T have really pushed over the last couple of years; a lot of their broadband subscribers, in fact, are fixed wireless access — in a rural area, I think Starlink is a very credible competitive threat, and therefore you don't need to string fiber, or even deploying fixed wireless access might not be as feasible. So to the extent that towers are serving rural areas, you may not need as much capacity on the terrestrial infrastructure due to Starlink. But where people spend most of their time — in suburbs and cities, where towers carry the vast majority of the traffic — I don't see much impact.

Speaker 1:

Yeah, I'd agree. I had this discussion with someone the other day, and they sort of debunked it by saying it doesn't go through buildings — so you need the towers to get through the buildings. The latency of the towers, and then just the sheer throughput they're delivering, is not something that Starlink's taking away. So it's an "and," not an "or," from what I can gather, right?

Speaker 2:

You have to have a repeater. If there's a repeater on the roof that sees the sky and sees the constellation, and then you've got Wi-Fi or whatever serving inside the building — that's how you get around that challenge. But it's not going to be a service that's as flexible as just using, you know, the mobile network.

Speaker 1:

Yes. Okay, so less disruption for the towers. Lots of disruption from a data center company perspective, though — the disruption being liquid cooling and how much power you can get to your racks. Are you seeing data centers respond to that? I mean, you were talking earlier in the call: eight megawatts was this crazy big number — that was what Facebook needed, eight megawatts. What are they talking about now? There are thousand-megawatt data centers, or even 2,000 megawatts, I heard someone say the other day — so we're talking about one-gigawatt data centers. So what's playing out? How are the data center operators thinking about that? Are there some operators that are going to benefit or do better here, or can they all augment so long as they can get access to power? How do you think about that?

Speaker 2:

Access to power is kind of the pinch point, because a data center that's not connected to sufficient grid transmission capacity isn't going to be very useful. So if you're talking about adding new capacity, you have to be able to follow that path to power, which means coordinating with the utility and operating in a jurisdiction where the utility has been able to make that investment — or where the data center operator is able to build their own substation and help solve the problem. So it's going to be experienced operators that have done this for years or decades, rather than newcomers. I think you'll see some mixed success among new investors into the sector who are new to these challenges around site selection and, in particular, power.

Speaker 2:

What you were alluding to in your question, though, I think, relates to existing data centers and can they meet the needs of AI and liquid cooling, and I think the short answer is, in most cases, yes.

Speaker 2:

For designs that are even five — in some cases even ten — years old, as long as they're using chilled water as the basis of their cooling architecture, it's not that complicated to use either containment technology that focuses the cooling into a tighter area, or a secondary loop that employs direct-to-chip liquid cooling.

Speaker 2:

So it's relatively straightforward — the short answer is yes: if it's got chilled-water cooling, which is how most data centers have been designed for the last ten years, it's not that complicated. But one thing I'll point out is that CPUs are still actively being made and deployed in cloud architectures, and they're running profitable cloud workloads. And so, as we come up to the replacement cycle of cloud data centers that are maybe a bit more CPU-centric, they're being replaced with CPUs, so there's no need to upgrade those data centers from a cooling standpoint. Therefore, I think most AI is going to go into new fleets of data centers as opposed to retrofits. But there are retrofits happening, certainly, and because they're so interesting — and because of the question that you pose — they kind of capture people's attention.

Speaker 1:

So the vast majority of AI is going to go into new fleets of data centers that are direct-to-chip or liquid-cooled from the ground up. And that doesn't diminish the fact that — as you pointed out — CPU is still massively growing in terms of what the clouds run on, as opposed to, say, the GPUs, TPUs, and every other PU that's coming out. The growth is not going to slow from a CPU perspective, or it's unlikely, because that continues to scale, and so the data centers they've currently got are still not big enough for what they need.

Speaker 1:

So therefore, they're going to scale, then refresh those with more CPUs — which is not an issue from a power or heat-and-cooling perspective — and then what you've got is this additional build of data centers coming in on top. So I guess it's a good place to be, the data center game. I don't think I've ever seen anything so amazing in terms of how much capital has been raised, how much capital is getting deployed, the companies that are getting acquired. There are only a couple left that are public — that's the other piece that's kind of interesting. In terms of data centers that are publicly traded, a huge number of them are ending up in the private space. Do you cover the private side of things that PE would typically look at? I assume you understand the sector.

Speaker 2:

Yeah. I mean, you do have to pay attention to the whole industry to understand the dynamics, and I think you make a good point: the number of listed data center companies is smaller now than it was five years ago, and this predates gen AI. You had a period when companies like DigitalBridge and Blackstone — and you've got KKR, GIP, add to that Brookfield- and Macquarie-backed platforms — if you add all those together, there are easily six or seven companies that spend more on their development pipelines than Digital Realty and Equinix do. So if Digital Realty and Equinix — or NextDC, also in the public space — are all you focus on, you're looking at just the small tip of the iceberg. Because as public companies there are limits on how much leverage they can have on their balance sheets — some of them are REITs — and you don't want to surprise your investors by playing pinball with your CapEx budget. But when you're a private company, you can lever up a lot more, and then the capital can flow more freely.

Speaker 2:

What I found, then, is that these unlisted companies are really responsible for most of the development. Digital Realty and Equinix are still highly relevant because they're global, but technically they're losing share to these other unlisted companies. It's not really about share gain or share loss, though, because this is an industry where the deals just repeat themselves; it's more about getting favorable unit economics and favorable returns, and that's where Equinix and Digital have both done quite a good job. So we're now in probably year three of a fairly supply-constrained environment. Energy transmission bottlenecks are a reason for that, and so pricing and yield on cost have actually gone higher for most of the industry.

Speaker 1:

Yeah, it's really tricky because it's putting pressure on every part of this development process. Even diesel gensets are hard to get hold of, or there's a backlog on the cooling and so forth. I'm mindful we've got a couple more minutes. I know that reporting season has just rolled through for a lot of the companies you follow, and I've been leading the questions, so I'd love to get any broader thoughts — anything you'd like to riff on and let the folks on the podcast know about.

Speaker 2:

Well, there are a lot of ways to get involved in this AI trend, and you just mentioned the companies that make the electromechanical equipment: transformers, generators, liquid cooling, PDUs. A lot of these sectors have been examined for multiple years now as kind of derivative AI plays, and while I'm not an expert on every one of those companies, it has surprised me how quickly one can ramp up production capacity for some of these big-ticket items, whether it's batteries, transformers, PDUs, CDUs. And in Johor, Malaysia, just across the strait from Singapore, Supermicro spun up a manufacturing plant within a year, where what they're putting out is rack-ready, NVIDIA-compatible designs. So the ability of the manufacturing supply chain to react to this has actually surprised me, because it's been fairly rapid.

Speaker 1:

It is surprising, although I must say, where there's a will, there's a way. When there's a capital opportunity to make money out of something, the capital markets typically respond in some way. So we're seeing humans build stuff faster, more manufacturing plants. I'm with you — I'm surprised how quickly they're moving on one hand; on the other, I'm not surprised that they're hustling. But the piece that goes to the next limit is power, which is typically driven by governments. That is changing as well, and that's where I think this capitalistic mindset is pushing companies, maybe for the first time in our history, to think about building power — power generation or transmission, or probably generation at small scale so they don't have to build transmission, maybe, I don't know. That's also quite astounding.

Speaker 2:

So the long poles in the tent, if you want to get into this business, are things like generators and transformers; lead times are usually measured in years. But then upstream there's the power transmission fix. Time and money will eventually solve that issue, but you're talking about years, and it depends on which country and which jurisdiction. Where I live in California, it's probably the better part of a decade. In Europe it varies, in Asia it varies; you're talking years to close to a decade, depending. And so that's the longest pole in the tent. Those investments are outside the ability of private capital to solve by themselves; you have to have cooperative public-sector buy-in as well. Yeah, okay, well, I know we're running right out of time. Anything else worth?

Speaker 1:

mentioning? How do people follow you? I get the benefit of reading a lot of your reports. I don't know if you can access them publicly, or is that investors only? How does that

Speaker 2:

work. I am not on social media, unfortunately, so you've got to be a client.

Speaker 1:

Perfect. So be a client and you get access to some incredible research, constantly updated, and not just research on one vector; you're covering every different angle. You're probably one of the smartest, deepest-thinking analysts I've ever come across, and I've never met anyone who travels more or is more in touch on the ground. I remember asking you a question about your perspective on data centers, and you said you just need to look and see if someone's building on the roof — if they're building on the roof, you'll know. So every now and then I ask the builders across the road to tell me if someone's building on the roof. I'm like, you're everywhere. It's impressive. Appreciate you jumping on the pod, appreciate you covering Megaport and all the insight you've provided us. You're an incredible resource. Thanks so much.

People on this episode