
Uplink: AI, Data Center, and Cloud Innovation Podcast
Uplink explores the future of connectivity, cloud, and AI with the people shaping it. Hosted by Michael Reid, we dive into cutting-edge trends with top industry experts.
Sovereignty, Liquid Cooling, and the New Infrastructure Hierarchy with Gavin Dudley
The AI revolution isn't just about algorithms; it's fundamentally reshaping the physical world of data infrastructure. In this episode of Uplink, host Michael Reid is joined by Gavin Dudley, VP of Sales at Macquarie Data Centers, to explore how AI is turning the data center industry upside down.
Gavin shares his expert view on why data centers have suddenly become the "rockstars of tech infrastructure." We dive into the practical shifts happening on the ground, from the move away from generic facilities to purpose-built environments designed for AI's unique demands. You'll learn why advanced liquid cooling is becoming essential to handle the heat from dense AI computations and, as Gavin puts it, why "plumbers are going to rule" tomorrow's data centers.
The conversation also covers the critical role of data sovereignty in a world where AI models are invaluable business assets, and how a new tiered hierarchy of data centers is emerging for training, inference, and edge deployments. The episode offers a clear look at the strategic thinking and engineering shifts required to build and manage the infrastructure of tomorrow.
--
Uplink explores the future of connectivity, cloud, and AI with the people shaping it. Hosted by Michael Reid, we dive into cutting-edge trends with top industry experts.
Follow the show, leave a rating, and let us know what you think. New episodes every two weeks.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts. All streaming links available at: https://www.uplinkpod.com/
Watch video episodes on YouTube: https://mp1.tech/uplink-on-youtube
Learn more:
Megaport: https://www.megaport.com/
Speaker 1:Welcome to Uplink, where we explore the world of digital infrastructure, uncovering the technology fueling AI and cloud innovation with the leaders making it happen. Megaport has this unique lens where we're in 26 countries around the world and we're seeing all these different trends in different countries. We've spoken to Brazil, the US. We look at Europe. We've seen some interesting things in France. So all these different pieces that are playing out, it's possibly the hottest space in the world right now, particularly data centers, as a flow-on to what we're seeing with AI and this exploding space, how hyperscalers are sort of playing out. In the US they're saying there's just an insatiable amount of capital that's been raised, and they're deploying all these new data centers.
Speaker 1:I think everyone has landed, everyone's now in the data center game, it would seem, which I find entertaining. How is that playing out for the Australia region? And I know you guys have just raised a whole bunch of capital to go and do some interesting stuff. So I'd love to get your perspective on how that's playing out for Australia, and I know you've got some views, particularly on the sovereign elements of that, which I think will play out for all locations around the world.
Speaker 2:Correct. Yes, it's not a uniquely Australian thing, but it clearly impacts Australia. And, frankly, thanks for having me here. One of the things I do appreciate is I see Megaport as a great Aussie success story too.
Speaker 1:So, yeah, I'm very happy. The Aussies out there dominating the world, that's it, way to go.
Speaker 2:So how is it impacting us? So I think there's absolutely a lot of interest from a technology and a capital flow perspective. Certainly, if you're in the data center game, everyone wants to talk to you.
Speaker 1:Everyone's got capital for you to deploy.
Speaker 2:I think there's a long queue of people. When people are calling you (and I say "you" to mean our business) to give you money, it's a nice place to be.
Speaker 1:Yeah, it is. I know you've been in the game a while, so it's a different lens, I suppose, or a different landscape right now.
Speaker 2:Yeah, it is. I often joke about the fact that probably in 2019, if you told someone that you were a virologist, no one cared and didn't want to talk to you, and then the pandemic happened and suddenly everyone wanted to have the local virologist talk about COVID-19. And it's a bit like that with data centers, right? Five years ago, you'd say you're in data centers and people would roll their eyes. Now it's, oh really? You're the cool kid on the block.
Speaker 1:Oh, come and talk to our people. That's right, exactly. Get on our podcast.
Speaker 2:There you go. And so what's happening? I think there's definitely a land grab, and there's FOMO for a lot of capital and a lot of businesses that see an AI explosion occurring or on the horizon. Clearly there are people that debate whether or not it's overhyped, and so they're trying to make sure that they don't miss out on whatever AI brings to the economy. When I think about AI, I think the speed at which it will impact our lives is sometimes overestimated, in the sense that I don't think it's going to happen quite as quickly as people think. Yeah, but the impact will be way more profound than people think.
Speaker 2:You know, electricity. When electricity was first brought to market as a product, people didn't buy electricity for their homes, they bought lighting, because they thought that's the only thing electricity could do. Right? So AI is a bit like that. I don't think we've yet seen all the things that AI can do.
Speaker 1:Yeah, we think of AI right now in the lighting sense. That's right, yeah. ChatGPT.
Speaker 2:Well, yeah, that's really a very interesting element of AI, but I don't think it's going to be anywhere near all of what AI can deliver, interestingly. Look, so we're entering the age of agent AI. We'll have agents that work for organizations to do tasks on behalf of that business, but we'll soon enter the age of physical AI. Clearly, Elon Musk demonstrated his robots at the last launch, a few months ago now, and he's building out the whole factory, the old Electrolux factory. They've got 100,000 GPUs and they're going to 200,000 GPUs, and that's meant to be just to train the robots. It's amazing. I'm sure you recall he said that they believe they'll make more money out of robots than they will out of cars. More than likely.
Speaker 1:Yeah, we'll have one. In theory you will, yeah, absolutely. Wash your clothes, cook your breakfast, etc.
Speaker 2:Yeah. So, in a data center sense, to bring it back to the topic, people want to make sure that they capture that. What's happening in the data center world is we're going to change the hierarchy and structure of data centers to some degree. I think there'll be more purpose-driven data centers, as opposed to the generic "I've got a data center, and how big is it?" So there might be a data center that's a training ground. The interesting thing, for example, in a genuine training data center, is that 100% availability might not be that important. Right? But once the training starts, you need it to be up during that time.
Speaker 2:So you need 100% reliability when it's running. Yes, that's right.
Speaker 1:And that changes things like SLAs, how you build.
Speaker 2:Yes. You know, those places might have multiple halls where you stage.
Speaker 2:You might not have all of the power going all of the time. So you might have a finite amount of power, and you've got one hall that you fire up and have going for a period, and then you bring up the next one while they're staging it. Because in a training sense, an AI cluster also isn't all GPUs; there's got to be some storage, there's got to be some network. And in the way AI tasks run, there's a period where the CPUs and the network and storage are working heavily, but they don't consume anywhere near the amount the GPUs do; they're kind of getting the thing ready to go. And so you could have one hall that's focused on that at the moment, while the one next door is doing hardcore training, really churning through the electricity, and then you bring that side down and bring the other one up. So that could be a thing.
Speaker 1:Can I ask you on that one, does it impact the grid? If you spin these things up, have you got enough power that you can deploy? I assume when you turn the thing on, it just starts to draw. I don't know, how do you manage that?
Speaker 2:Yeah, well, so that is a problem. Firstly, if we talk about when the systems kick in: they go from naught, effectively drawing very little power, to drawing 100% of power, in milliseconds sometimes.
Speaker 1:Yeah. I don't even understand how the grid handles this.
Speaker 2:So, because of the way data centers are operated, you have both generators and some type of UPS feeding into the electrical bus, and really they're the ones that get hit the hardest. They're the ones that really get punched. So having that match-fit electrical and cooling infrastructure is absolutely part of our design in data centers now. Talking about how we've got to think about data centers now, we've got to think about different ways of building and designing data centers to cope with AI. And it's not just the hard hit when they fire up; it's also that clearly they run very hot, and so now you need ways to cool them. And at the moment, the clubhouse leader, we believe, will be some type of liquid-to-liquid cooling.
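As a rough back-of-the-envelope sketch of the step load Gavin describes, when a hall jumps from staging to full training draw. All figures here (rack count, kW per rack, idle fraction, ramp time) are illustrative assumptions for the sketch, not Macquarie numbers:

```python
# Back-of-the-envelope: the step load an AI training hall places on the
# electrical plant when GPUs go from near-idle staging to full draw.

def step_load_mw(racks: int, kw_per_rack: float, idle_fraction: float) -> float:
    """Instantaneous jump in demand (MW) when training kicks off."""
    full = racks * kw_per_rack / 1000.0   # MW at full draw
    idle = full * idle_fraction           # MW while staging
    return full - idle

def ramp_rate_mw_per_s(step_mw: float, ramp_ms: float) -> float:
    """Equivalent ramp rate the UPS/generator bus must absorb."""
    return step_mw / (ramp_ms / 1000.0)

# A hypothetical 200-rack hall of ~130 kW liquid-cooled racks:
step = step_load_mw(racks=200, kw_per_rack=130, idle_fraction=0.1)
print(f"step load: {step:.1f} MW")                        # ~23.4 MW
print(f"ramp: {ramp_rate_mw_per_s(step, 50):.0f} MW/s")   # if it lands in 50 ms
```

Even on these modest assumptions, tens of megawatts arriving in tens of milliseconds is the kind of punch the UPS and generator bus have to be designed for.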
Speaker 1:And that's running to the chip, or the chip is submerged, is it?
Speaker 2:What's the mixture, like a radiator? Yeah, so the way it works is that generally the data center operator provides chilled water, which it currently uses to blow air through to cool the room. So the chilled water is already in the data center, but it then runs into coolant distribution units, CDUs. They have effectively a radiator with cold water running through it, and that's attached to another radiator with another form of liquid, generally not water. There are a few different specs of the liquid: it's either water with other things added, or other elements. There are mineral oils, there are water additives; there are a few different ways like that. And then that secondary loop goes through and runs through a manifold in a rack, across a very well-calibrated block of metal with very specific paths that the liquid travels through.
Speaker 2:It's like a heat exchanger, yeah, to extract the heat from the chip, and the design depends on the manufacturer.
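As a hedged sketch of the physics behind that secondary loop: the heat a coolant loop carries equals mass flow times specific heat times temperature rise (Q = m_dot * cp * dT). The rack power, coolant properties, and temperature rise below are illustrative assumptions, not figures from the episode:

```python
# Rough sizing of the rack-side (secondary) coolant loop described above.
# Assumes a water-based coolant; glycol mixes or oils would change cp and rho.

CP_WATER = 4186.0    # J/(kg*K), specific heat of water-based coolant
RHO_WATER = 1000.0   # kg/m^3, coolant density

def coolant_flow_l_per_min(rack_kw: float, delta_t_k: float) -> float:
    """Flow through the rack manifold needed to hold a given temperature rise."""
    mass_flow = rack_kw * 1000.0 / (CP_WATER * delta_t_k)  # kg/s
    return mass_flow / RHO_WATER * 1000.0 * 60.0           # litres per minute

# A hypothetical 100 kW rack with a 10 K rise across the cold plates:
print(f"{coolant_flow_l_per_min(100, 10):.0f} L/min")  # ~143 L/min
```

That is real plumbing: well over a hundred litres a minute per rack, which is why the manifolds, pumps, and pipe runs matter so much in these designs.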
Speaker 1:Can you retrofit this into your data centers, or it's only some, or how does that play out?
Speaker 2:Yeah, so you certainly can put it into a lot of data centers, but if you've got an older data center, it may be very inefficient in the way you consume power. Rather than retrofit a data center, your best bet is actually to have one that's built for it from the get-go. So I've talked about the liquid cooling infrastructure, but you still need to cool the air, and so, for the next few generations at least of this AI infrastructure, we are looking at a blend of air and liquid cooling, which may change depending upon what it's used for.
Speaker 1:Correct, right.
Speaker 2:So you've got the CPUs ramped down because you're staging something, and you might not need as much liquid cooling at that moment. So it really is flexibility in the design. Again, the impact on the AI industry is that we need to build data centers that are more flexible between air cooling and liquid cooling; you need to be able to do both. So that's going to be a massive impact. You've got different cooling technologies, and you've got different skills. There's now a lot more liquid in a data center; plumbers are going to rule. Whoever thought of bringing plumbers into the data center? Clearly we've had water a lot, but it was much more static in the way it was routed. Before it was electricians, now it's plumbers.
Speaker 1:Yes, that's right yeah.
Speaker 2:Or a bit of both. You absolutely still need electricians. It'd be great to see a university step in. There are some organisations that train on the engineering aspects of data centres, but I don't know of a specific degree or university.
Speaker 1:That's really starting it. And it's going to be so fundamental to our life. Data centers are not going away, that's the thing. So it's this explosion right now, but it's kind of funny, because I guess you've always been in this space, you knew how critical data centers were for the entire world. It's now become just so publicly critical. It's exploded really quickly, but it's not like something that's going to just ramp up and then disappear. You said it before, the training is never going to stop. Then there's the inference scene. And then, just like you said, we've kind of figured out that ChatGPT is like the equivalent of screwing a light bulb in. Now we have light, that's great, but now what? What will it allow us to see?
Speaker 2:Yeah, exactly. What can we see?
Speaker 1:And then what can we go to? Air-condition it and all the rest of it. So how does that play out for you guys? I'd love to give the Australian lens to that. We see all these global trends; what does that mean for Australia? Why would we even play in this space? We've got expensive power, so why would there be a need to be there?
Speaker 2:Sure, so there's a couple of things. Firstly, with all due respect to the Americans, I think they're a bit ahead of the world, frankly, in AI. The big hyperscalers are going really, really fast on it, and they've got scale, and so they are a couple of years ahead of the Australian context. But I definitely see people starting to plan and get ready for AI. We're certainly talking in the market about some big AI projects about to kick off that really aren't public yet.
Speaker 1:So I think we will see it. And the importance of being sovereign, I think, is what plays out for us, as opposed to just taking everything from the US. Absolutely.
Speaker 2:So I think we'll see some larger data centers for bulk training. They're typically going to be less latency-sensitive. The interesting thing, I think, about those very large training grounds is the intra-data-center latency. The latency inside the data center is critical, because you want the GPU chips to think they're operating like one unit, and in order to do that you need to have super low latency. That's why they're so hot and dense: if you spread them out, to spread the cooling across a larger area, the physical distance from one chip to the other would mean that it wouldn't operate correctly. That's why they become dense and small. So we're going to see those sites, I think, in less latency-sensitive areas, so they'll go out to the west or to places where it's easier to build a bigger site. Then you'll have another tier of sites that are going to be doing a combination of continual training.
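A quick sketch of why those training halls end up hot and dense: signal propagation through fiber runs at roughly two thirds the speed of light, about 5 nanoseconds per metre, so every metre of physical spread adds directly to GPU-to-GPU latency. The distances below are illustrative:

```python
# Propagation delay through optical fiber: distance / (c / refractive_index).
# Spreading a GPU cluster out to ease cooling directly inflates this delay.

C = 299_792_458.0      # speed of light in vacuum, m/s
FIBER_INDEX = 1.468    # typical refractive index of single-mode fiber (assumed)

def one_way_ns(distance_m: float) -> float:
    """One-way propagation delay through fiber, in nanoseconds."""
    return distance_m / (C / FIBER_INDEX) * 1e9

for d in (10, 100, 1000):
    print(f"{d:>5} m -> {one_way_ns(d):7.1f} ns one-way")
```

Going from a 10 m hall to a 1 km campus adds microseconds per hop, which is enormous next to the sub-microsecond budgets that make thousands of GPUs behave like one unit.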
Speaker 2:And people need to understand that you build a model, but once you've built a model, generally if it's acting as an agent for your business or doing something, you want to continually train it. It's not just a model that's static; last year's data isn't like this year's. So it might not be trained at the same rate, you're not creating tokens at the rate that you do when you're building the initial model, but you're still creating tokens all the way through this process. So you're continually training the model, and that model is then being used for inferencing, or generative AI. Those are still going to be significantly sized, and they're likely to be what I'd say are medium-sized data centers, 100 meg. I mean, 10 years ago, if you said 100 meg, people were like, oh my God, that's huge, but now 100 meg is a medium to large-sized data center. And so you'll see clusters of these medium-sized, you know, 50 to 100 meg data centers being built in capital cities. And yes, it's a phenomenon for Australia, but I think it's a phenomenon for the world, to be clear. And the models that they create, and the agents in those models that are helping businesses be successful, are going to be fundamental to that business.
Speaker 2:Think of it like an accounting firm with a couple of hundred people that might have bought an accounting model off the shelf from overseas. They bring that model in, have it dedicated to their business, and start to train it with the local laws, the local tax rules. In fact, you work there, I don't, and Fred's a customer and Mary's not. So it needs to learn all of that, and it becomes more and more ingrained in that company. And the amount of effort you make in training that model, even a moderate-sized company will spend millions training the model over time. Now it's augmenting 20 or 30 roles, so it will more than pay for itself, but it will become possibly the most valuable asset of that business. So it needs to be protected, and it needs to be updated constantly. And to your point earlier, as these agents start making business-critical decisions, life-and-death decisions, medical decisions, they need to reside in the host country, because you need the laws and jurisdiction of that host country to apply. And that's why we're seeing the hyperscalers also building in-country in a lot of places. They understand that it's going to be important to have it locally, not just from a latency perspective but for the jurisdictional reasons. And we're also seeing the advent of
Speaker 2:I think companies will start to own some of their own AI infrastructure as well. We'll see small pods, so companies will have a little bit of their own AI infrastructure, use some AI resources out of the cloud, and, of course, the general cloud resources continue to grow. So you've got all of these pressures driving the data center business. The perfect storm, yeah. And then, of course, there will be a real rise in the edge data center, the ones right next to the customer. I guess the classic example of that is a self-driving car; it doesn't do that much learning, it makes quick decisions right on the spot. And there'll be a combination of, you know, a couple of meg-sized data centres really close to the customers. So I think we'll see the tiering of data centres. There'll be purpose-built data centres, and you'll see the growth of AI and the continual growth of cloud.
Speaker 1:And this sovereign factor. I mean, you guys are, I don't know how large, but one of the largest suppliers to government in the region, I think. So you'll see that.
Speaker 2:Yeah, absolutely, we do. We have a big part of our business supplying to government. In fact, 42 percent of federal government agencies use us to clean their internet traffic and do some other security things as well. You know, every business needs to have a thing that they stand by, and our thing is about being sovereign and compliant and secure. We are proud of being very Australian, like Megaport. And so that's a factor, and sovereignty really, honestly, will be such a big thing around the world. It's also, to some degree, about ownership of the assets. We're clearly in a time of heightened geopolitical tension.
Speaker 1:And putting your eggs in another country's basket. That's right. How do you pick that? And then how do you get out of it if you ever needed to?
Speaker 2:It's tricky, yeah. And it might be a friendly country, but if we've got some existential problem, like another COVID or something like that, they may just need the resources for their own country. Right, they're not going to be rude about it. It's just a choice of the resources going to you or to me, and if I've got the same problem that you have, I'm going to use them. So there needs to be a little bit of self-reliance, defense, all of those things. So there's definitely a big push to have those AI resources controlled and in a sovereign location. And I think there is a third factor, and that is governments. I often joke about governments: a bit like justice, they grind slowly, but they grind thoroughly. And I think they'll eventually catch on to how transformative AI will be.
Speaker 2:We've talked about the kind of negative aspects, but also, well, think of the productivity gains and how it will drive a country's economy. And as soon as you see something that's so fundamental to the economy and the people of a country, government will regulate it. So there's a tsunami of laws coming your way: regulation, compliance. And so we also see people, a bit like my comment about the hyperscalers, making sure they're putting assets in those countries, because they know that those regulations will eventually arrive. And so we say you might as well plan early for that and go, well, it's probably going to be regulated, I need to have it here. So let's start with, how do I get it to be in my country, in my backyard, kind of thing. So there's definitely a lot going on, and it's exciting. I just can't wait to have my own Tesla robot, or whatever brand it is.
Speaker 1:Maybe a couple of them, yeah, that's right.
Speaker 2:One upstairs, one downstairs. Yeah, that's right. I mean, if it genuinely is $30,000 US, and let's see about that, then in the, you know, unfortunately named Southwest Pacific Peso that we have in Australia, that's more like $50,000. But if it's a $50,000 one-off purchase to have, effectively, a servant in there, and I think I can be derogatory about it, because it's quite a lot.
Speaker 1:Yeah, yeah, yeah.
Speaker 2:I don't know if we should Not every tourist.
Speaker 1:Yeah, so yeah, at $50,000, I think there'd be a lot of homes with someone to wash your car and walk your dog. How many can you have?
Speaker 2:Look, another thing, talking about AI, and the light example we talked about before: people often talk about AI assistants or AI agents being like, oh, imagine you had an assistant and they could call a restaurant and make a booking for you, and you go, yeah, okay.
Speaker 2:You know, that's kind of cute, but that's not how AI would work. The AI agent would call 300 restaurants, know exactly what your preferences are, find exactly what you want. And chances are, in the future, they will be calling AI agents in those restaurants. Well, they'd have to be, if everyone's calling the restaurant.
Speaker 2:If everyone's calling the restaurant, the restaurants will have to have that. It really is not about doing exactly what we're doing now a little bit better; it's about the next leap forward in what you can do with AI. I think we discussed earlier, too, that the move away from, well, clearly large language models, chat-based things, those are going to continue, but there is a shift in research with AI and AI models to move down to models based on mathematics, to a more fundamental building block. I think that will also drive a new wave of speed in the development of AI.
Speaker 1:It plays into healthcare. It plays into how we make decisions on life and death.
Speaker 2:Yeah, there's an article I read recently about how long it took to map a protein that was important in medicine and treating medical conditions. This PhD department had 10 people that worked for 15 months to map one protein. And now they can do it in a day on a GPU. That's insane.
Speaker 1:Yeah, well, that's what I hope we see. It's the light bulb, as you said, but going well and truly beyond that. What improvements for humanity and life should we see out of this, and how quickly can we see that?
Speaker 2:People will get drugs made that treat exactly their condition, specifically, based on your DNA or whatever it is. Cancer is another classic, because cancer, by definition, is where cells have started to divide irregularly, so every person's cancer, even if two people have the same cancer, the cancer cells are very specific to that person. And if you could map those cancer cells in real time and then create something that goes and kills just those cells, I mean, massive advances in cancer treatment.
Speaker 1:One thing I'd love to get your take on is connectivity. I mean, it's the world we live in, certainly Megaport's. How critical is that to the data centers? I think it's one thing that gets sort of forgotten. Everyone's like, we've got to get power, we've got to get space, we've got to build these incredible things, and then we go, now how do we get customers to connect to them?
Speaker 2:Yeah. I mean, I'll shamelessly say that we partner with Megaport, and absolutely the connectivity layer is going to be critical. Because if I go back to the example earlier of the 300 restaurants that the AI agent calls, well, it's not going to be picking up the telephone. So the connectivity out of these data centers, and data-center-to-data-center connectivity, becomes a fundamental enabler. Honestly, nothing happens without connectivity in and out of a data center. You need to move stuff in to stage models, you need models to talk to other things. And it's not just local; it'll be a global phenomenon, and I appreciate the Megaport fabric is global. So it can only get more important. It's like air, like oxygen in the air.
Speaker 2:Yeah, that's right, yeah, it's got to be.
Speaker 1:It's got to be there to enable everything else. Well, what we're seeing, and we see this everywhere, is this sensational demand for speed, but speed at a different level: speed for a period where they're moving things, sort of flexibility and the ability to move up and down, which is kind of what we were originally built upon. So it's playing beautifully into our sweet spot. We'll keep building more and more speed, so, you know, we've gone and rolled out 400 gig backbones so that people can just instantly spin up 100 gig wherever we can get them to. Yeah, of course. And the other piece is we're seeing so many customers in so many different locations needing to access these different data centers that are coming into being. We've had cloud for a while, but now we're seeing, hey, maybe it's an AI data center, or maybe it's an inference space, but they've got to move that data from where they are today.
Speaker 2:So I think gone will be the days of a company, frankly, having all of their stuff in one data center, this is the one place for us. And when they do concentrate in a couple of data centers, those elements will need to talk to lots of other things in the network. And so, yeah, dynamic connections and dynamic bandwidth will really be a thing.
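A small sketch of that "spin up bandwidth for a period" pattern: how long moving a model or dataset between data centers takes at different provisioned speeds. The data size, link speeds, and throughput efficiency below are illustrative assumptions:

```python
# Wall-clock time to move a dataset over a provisioned circuit.
# efficiency accounts roughly for protocol overhead and imperfect utilisation.

def transfer_hours(size_tb: float, link_gbps: float, efficiency: float = 0.9) -> float:
    """Hours to move size_tb terabytes over a link_gbps circuit."""
    bits = size_tb * 1e12 * 8                           # payload in bits
    return bits / (link_gbps * 1e9 * efficiency) / 3600.0

for gbps in (10, 100, 400):
    print(f"{gbps:>3} Gb/s -> {transfer_hours(50, gbps):5.1f} h for 50 TB")
```

The gap between roughly half a day at 10 Gb/s and about an hour at 100 Gb/s is exactly why burstable, on-demand capacity fits these staging workloads better than a fixed circuit sized for the quiet periods.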
Speaker 1:Well, we saw this. It was like one cloud to rule them all, and if that was the case, they'd just build their own data centers and we'd all land in sort of this AWS, and that was it. But what we've seen play out is it's getting more fragmented, it's getting more misty. There's more and more cloud, we're seeing more and more on-premise requirements, we're now seeing GPU-as-a-service companies appear. It's getting far more complex, shall we say, and it's up to us to simplify that for the customers. But that plays out well for our businesses in terms of what we service back to the customers: new locations, new spaces, different types of companies, and different connectivity required to do that. Well, we love your partnership. We really appreciate what we're doing together, and we're going to keep going big with yourselves. I know you look after some of the most critical infrastructure inside Australia that services all Australians, and I know you leverage a lot of Megaport to do that. We can't share what that is, but it's exciting to see what you're doing. We really enjoy that partnership and we're here for the long term. So, thank you. And we love the partnership as well. Awesome, thank you very much. Pleasure. Love that, thank you. Appreciate it, man.
Speaker 1:We're here in Hawaii, we're at PTC Megaport has this sort of unique lens where we're in 26 countries around the world and we're seeing all these different trends in different countries. We've spoken to Brazil, the US. We look at Europe. We've seen some interesting things in France. So all these different pieces that are playing out.
Speaker 1:It's possibly the hottest space in the world right now, particularly data centers, as a flow on to what we're seeing with AI and this exploding space, how hyperscalers are sort of playing out In the US. They're seeing this just insatiable amount of capital that's been raised and they're deploying all these new data centers. I think everyone has landed Everyone's now in the data center game, it would seem, which I find entertaining. How is that playing out for the Australia region? And I know you guys have just raised a whole bunch of capital, go and do some interesting stuff. So I'd love to get your perspective on how that's playing out for Australia, and I know you've got some views, particularly on the sovereign elements of that which I think will play out for all locations around the world.
Speaker 2:Correct. Yes, it's not a uniquely Australian thing. It clearly impacts Australia. And, frankly, thanks for having me here. One of the things I do appreciate is that I see Megaport as a great Aussie success story too, so yes, I'm very happy.
Speaker 1:The Aussies out there dominating the world. That's it, way to go.
Speaker 2:So how is it impacting us? I think there's absolutely a lot of interest, from both a technology and a capital-flow perspective. Certainly, if you're in the data center game, everyone wants to talk to you. Everyone's got capital for you to deploy; I think there's a long queue of people. When people are calling you (and I say "you" to mean our business) to give you money, it's a nice place to be.
Speaker 1:Yeah, it is. I know you've been in the game a while, so it's a different lens, I suppose, or a different landscape right now.
Speaker 2:You know, I often joke about the fact that, probably in 2019, if you told someone you were a virologist, no one cared or wanted to talk to you. Then the pandemic happened and suddenly everyone wanted to have the local virologist talk about COVID-19. And it's a bit like that with data centers, right? Five years ago, you'd say you're in data centers and people would roll their eyes. Now it's, "Oh really? Tell me more."
Speaker 1:You're the cool kid now. Come and talk to our people, do a podcast. There you go. And so what's happening?
Speaker 2:So I think there's definitely a land grab, and there's a FOMO for a lot of capital and a lot of businesses that see an AI explosion occurring, or on the horizon, and clearly there are people that debate whether or not it's overhyped. So they're trying to make sure they don't miss out on whatever AI brings to the economy. When I think about AI, I think the speed at which it will impact our lives is sometimes overestimated, in the sense that I don't think it's going to happen quite as quickly as people think. But the impact will be way more profound than people think.
Speaker 2:You know, electricity. When electricity was first brought to market as a product, people didn't buy electricity for their homes. They bought lighting, because they thought that's the only thing electricity did. Right? So AI is a bit like that. I don't think we've realized the things that AI can do.
Speaker 1:Yeah, we think of AI right now in the lighting sense. That's right, yeah.
Speaker 2:ChatGPT is a very interesting element of AI, but I don't think it's going to be anywhere near all of what AI can deliver. Interestingly, look, we're entering the age of agentic AI. We'll have agents that work for organizations to do tasks on behalf of that business, but we'll soon enter the age of physical AI. Clearly, Elon Musk demonstrated his robots at the last launch, a few months ago now, and he's building out the whole factory, the old Electrolux factory. They've got 100,000 GPUs, and they're going to 200,000 GPUs, and that's meant to be just to train the robots. It's amazing. I'm sure you recall, he said they believe they'll make more money out of robots than they will out of cars, more than likely.
Speaker 1:Yeah, in theory you will. Wash your clothes, cook your breakfast, et cetera, et cetera.
Speaker 2:So, in a data center sense, to bring it back to the topic, people want to make sure they capture that. What's happening in the data center world is that we're going to change the hierarchy and structure of data centers to some degree. I think there'll be more purpose-driven data centers, as opposed to the generic "I've got a data center, and how big is it?" So you might have a data center that's a training ground. The interesting thing is that in a genuine training data center, one hundred percent availability might not be that important. But once the training starts, you need it to be up for the duration. So you need one hundred percent reliability when it's running.
Speaker 1:Yes, that's right. And that changes things like SLAs, how you build.
Speaker 2:Yes. Those places might have multiple halls, where you might not have all of the power going all of the time. So you might have a finite amount of power, and you've got one hall that you fire up and have going for a period. Then, while they're staging the next run (because in a training sense, an AI cluster isn't all GPUs; there's got to be some storage and some network), there's a period in the way AI tasks run where the CPUs and the network and storage are working heavily, but they don't consume anywhere near what the GPUs do. They're getting the thing ready to go.
Speaker 1:Getting it all staged, so to speak.
Speaker 2:And so you could have a hall that's focused on that at the moment, while the one next door is doing hardcore training, really churning through the electricity, and then you bring that side down and bring the other one up. So that could be a thing.
Speaker 1:Yeah. Can I ask you, on that one, does it impact the grid? If you spin these things up, have you got enough power that you can deploy? I assume when you turn the thing on, it just starts to draw. I don't know, how do you manage that?
Speaker 2:Yeah, well, that is a problem. Firstly, when the systems kick in, they go from nought, effectively very little power, to drawing 100% of power in milliseconds.
Speaker 1:I don't understand how the grid handles that.
Speaker 2:Because of the way data centers are operated, you have both generators and some type of UPS feeding into the electrical bus, and really they're the ones that get hit hardest; they're the ones that really get punched. So having that match-fit electrical and cooling infrastructure is absolutely part of our design in data centers now. Talking about how we've got to think about data centers: we've got to think about different ways of building and designing them to cope with AI, and it's not just the hard hit when they fire up. It's also that, clearly, they run very hot, and so now you need ways to cool them. At the moment, the clubhouse leader, we believe, will be some type of liquid-to-liquid cooling.
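To put a rough number on that inrush, here's a back-of-envelope sketch. It's purely illustrative: the 100,000-GPU figure echoes the training-site scale mentioned earlier in the conversation, but the ~700 W per accelerator and the 1.3x overhead factor are assumptions, not figures from the episode.

```python
def step_load_mw(num_gpus: int, watts_per_gpu: float, overhead_factor: float = 1.3) -> float:
    """Approximate the step load (in megawatts) when a GPU cluster goes
    from near-idle to full training draw almost at once.

    overhead_factor is an assumed allowance for the CPUs, network, storage,
    and cooling that ramp up alongside the GPUs.
    """
    return num_gpus * watts_per_gpu * overhead_factor / 1e6

# A hypothetical 100,000-GPU training site at an assumed ~700 W per accelerator:
print(f"{step_load_mw(100_000, 700):.0f} MW")  # prints "91 MW"
```

Stepping from near-idle to a figure of that order in milliseconds is the hit that the generators and UPS plant described here have to absorb.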
Speaker 1:Is that running to the chip, or is it submerged? What's the mixture? Like a radiator?
Speaker 2:Yeah, so the way it works is that generally the data center operator provides chilled water, which it currently uses to blow air through to cool the room. So the chilled water is already in the data center, but it then runs into coolant distribution units, CDUs. They have, effectively, a radiator with cold water running through it, and that's attached to another radiator with another form of liquid, generally not water; another fluid. There are a few different specs for the liquid: it's either water with other things added, or other elements. There are mineral oils, there are water additives; anyway, there are a few differences. That secondary loop then runs through a manifold in a rack and across a very well-calibrated block of metal with very specific paths that the liquid travels through, like a heat exchanger, to extract the heat from the chip, depending on the manufacturer.
Speaker 1:Can you fit this into your data centers, or only some? How does that play out?
Speaker 2:You certainly can put it into a lot of data centers, but if you've got an older data center, it may be very inefficient to retrofit. Your best bet is actually to have one that's built for it from the get-go. Because, as well as the liquid cooling infrastructure I've talked about, you still need to cool the air. So in the next few generations of this AI infrastructure, at least, we're looking at a blend of air and liquid cooling, which may change depending on what it's used for. You might have the GPUs ramped right down because you're staging something, and not need as much liquid cooling at that moment. So it really is flexibility in the design. Again, the impact on the AI industry is that we need to build data centers that are more flexible between air cooling and liquid cooling, and you need to be able to do both. That's going to be a massive impact. You've got different cooling technologies; you've got different skills in a data center. There's now a lot more liquid in a data center. Plumbers are going to rule. Power's going to rule. Whoever thought of bringing plumbers into the data center? But they are. Clearly, we've had water a lot, but it was much more static in the way it was routed. Before, it was electricians; now it's plumbers.
Speaker 1:Yes, that's right, yeah. Or a bit of both.
Speaker 2:Yeah, you absolutely still need electricians. It'd be great to see a university pick this up. There are some organizations that train on the engineering aspects of data centers, but I don't know of a specific degree or university that's really starting it, and it's going to be so fundamental to our lives, how data centers run.
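For a sense of why the plumbing matters, the secondary loop described above is governed by the basic heat-transport relation Q = m·c_p·ΔT. The sketch below is a hypothetical illustration: the 100 kW rack load and the 10 °C rise across the cold plates are assumed values, and the defaults are plain-water properties rather than any specific coolant blend.

```python
def coolant_flow_lpm(rack_kw: float, delta_t_c: float,
                     cp_j_per_kg_c: float = 4186.0,
                     density_kg_per_l: float = 1.0) -> float:
    """Litres per minute of coolant needed to carry rack_kw of heat away
    with a delta_t_c temperature rise across the loop (Q = m_dot * c_p * dT).

    Defaults are for plain water; real secondary loops use water with
    additives or other fluids, which shifts c_p and density.
    """
    kg_per_s = (rack_kw * 1000.0) / (cp_j_per_kg_c * delta_t_c)
    return kg_per_s / density_kg_per_l * 60.0

# A hypothetical 100 kW AI rack with an assumed 10 degree C loop rise:
print(f"{coolant_flow_lpm(100, 10):.0f} L/min")  # prints "143 L/min"
```

Halving the allowed temperature rise doubles the required flow, which is why CDU sizing and pipe runs become a first-order design problem rather than an afterthought.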
Speaker 1:Well, it's not going away. That's the thing. It's this explosion right now, but it's kind of funny, because I guess you've always been in this space; you knew how critical data centers were for the entire world. It's now become so publicly critical. It's exploded really quickly, but it's not something that's going to just ramp up and then disappear. The training is never going to stop. Then there's the inference side, and then, just like you said, we've kind of figured out that ChatGPT is the equivalent of screwing a light bulb in. Now we have light, that's great. But now what? What else will it let us see?
Speaker 2:Yeah, exactly.
Speaker 1:And then what can we go to? Air-condition it and all the rest of it. So how does that play out for you guys? I'd love to get the Australian view. We see all these global trends; what does that mean for Australia? Why would we even play in this space? We've got expensive power. Why would there be a need to be there?
Speaker 2:Sure, so I'll deal with that a couple of ways. Firstly, with all due respect to the Americans, I think they're a bit ahead of the world, frankly, in AI. The big hyperscalers are going really, really fast on it, and they've got scale, so they're a couple of years ahead of the Australian context. But I definitely see people starting to plan and get ready for AI. We're certainly talking in the market about some big AI projects about to kick off that really aren't public yet.
Speaker 1:So I think we will see, and the importance of being sovereign, I think, is what plays out for us, as opposed to just taking everything from the US Absolutely.
Speaker 2:Absolutely. So I think we'll see some larger data centers for bulk training. They're typically going to be less latency-sensitive externally. The interesting thing about those very large training grounds is the intra-data-center latency. The latency inside the data center is critical, because you want the GPU chips to think they're operating as one unit, and in order to do that, you need super-low latency. That's why they're so hot and dense: if you spread them out, to spread the cooling across a larger area, the physical distance from one chip to another would mean it wouldn't operate correctly. That's why they become dense and small. So we're going to see those sites, I think, in less latency-sensitive areas; they'll go out to the west, or to places where it's easier to build a bigger site. Then you'll have another tier of sites that are going to be doing a combination of continual training.
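That density argument is ultimately about propagation delay. As a hedged sketch (the ~5 ns per metre figure is the standard rule of thumb for light in optical fiber, travelling at roughly two-thirds of c; the distances are illustrative, not from the episode):

```python
NS_PER_METRE_FIBER = 5.0  # rule-of-thumb one-way propagation delay in fiber

def round_trip_ns(cable_metres: float) -> float:
    """Round-trip propagation time over a fiber run of the given length.

    Propagation only; switch hops and serialization delay, which usually
    dominate in practice, are deliberately ignored in this sketch.
    """
    return 2 * cable_metres * NS_PER_METRE_FIBER

# Chips a few racks apart versus spread across a big hall:
print(round_trip_ns(10))   # prints 100.0 (nanoseconds)
print(round_trip_ns(100))  # prints 1000.0
```

An extra microsecond per round trip sounds tiny, but training synchronizes the whole cluster constantly, so the penalty is paid on every collective operation; hence hot, dense pods instead of spread-out halls.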
Speaker 2:People need to understand that you build a model, but once you've built it, if it's acting as an agent for your business or doing something, you want to continually train it. It's not just a model trained on last year's data; it needs this year's too. It might not be trained at the same rate (you're not creating tokens at the rate you do when building the initial model), but you're still creating tokens all the way through the process. So you're continually training the model, and that model is then being used for inferencing or generative AI, or averaging, as some people call it. Those models are still going to be significantly sized, and they're likely to sit in what I'd call medium-sized data centers, around the 100-megawatt mark. Ten years ago, at 100 megawatts, people would go, oh my God, that's huge. But now 100 megawatts is a medium-to-large data center, and you'll see clusters of these medium-sized, 100-to-150-megawatt data centers being built in capital cities. Yes, it's a phenomenon for Australia, but I think it's a phenomenon for the world, to be clear. And the models they create, and the agents in those models that are helping businesses be successful, are going to be fundamental to those businesses.
Speaker 2:Think of it like an accounting firm with a couple of hundred people that might have bought an off-the-shelf accounting model from overseas. They bring that model in, dedicate it to their business, and start to train it with the local laws, the local tax rules. In fact, you work there, I don't, and Fred's a customer and Mary's not. It needs to learn all of that, and it becomes more and more ingrained in that company. With the amount of effort that goes into training that model, even a moderate-sized company will spend millions training it over time. Now it's augmenting 20 or 30 roles, so it will be an efficiency gain; it will more than pay for itself. But it will become possibly the most valuable asset of that business, so it needs to be protected, and kept updated constantly.
Speaker 2:And to your point earlier, as these models start making business-critical decisions, life-and-death decisions, medical decisions, they need to reside in the host country, because you need the laws and jurisdiction of that host country to apply. That's why we're seeing, among other things, the hyperscalers building in-country in a lot of places. They understand it's going to be important to have it locally, not just from a latency perspective but for the jurisdictional reasons. And we're also seeing the advent of companies starting to own some of their own AI infrastructure as well.
Speaker 2:We'll see small pods, so companies will have a little bit of their own AI infrastructure, use some AI resources out of the cloud, and of course the general cloud resources continue to grow. So we've got all of these pressures driving the data center business: the perfect storm. And then, of course, there will be a real rise in the edge data center, the ones right next to the customer. I guess the classic example is a self-driving car: it doesn't do that much learning, it makes quick decisions right on the spot. So there'll be a layer of couple-of-megawatt-sized data centers really close to the customers. I think we'll see the tiering of data centers, purpose-built data centers, the growth of AI, and the continual growth of cloud.
Speaker 1:And this sovereign factor: I mean, you guys are, I don't know how large, but one of the largest suppliers to government, I think.
Speaker 2:Yeah, absolutely. A big part of our business supplies government. In fact, 42% of federal government agencies use us to provide clean internet and do some other security things as well. Every business needs a thing that they stand by, and our thing is about being sovereign and compliant and secure. We're proud of being very Australian, like Megaport, so that's a factor. And sovereignty really, honestly, will be such a big thing around the world.
Speaker 1:It's also, to some degree, about ownership of the assets, if you think you know we're clearly in a time of heightened geopolitical tension and putting your eggs in another country's basket, that's right. How do you pick that and then how do you get out of it if they're ever needed to? It's tricky.
Speaker 2:Yeah, and it might not even be about friendship. It might be a friendly country, but if we've got some existential problem, like another COVID, they may just need the resources for their own country. They're not going to be rude about it; it's a choice of the resources going to you or to me, and if I've got the same problem you have, I'm going to use them. So there needs to be a little bit of self-reliance, defense, all of those things. So there's definitely a big push to have those AI resources controlled and in a sovereign location.
Speaker 2:And I think there is a third factor, and that is governments. I often joke that governments, like justice, grind slowly but grind thoroughly, and I think they'll eventually catch on to how transformative AI will be. We've talked about the negative aspects, but also think of the productivity gains and how it will drive a country's economy. As soon as you see something that's so fundamental to the economy and the people of a country, government will regulate it. So there's a tsunami of laws coming your way: regulation, compliance. And, a bit like my comment about the hyperscalers, we see people making sure they're putting assets in those countries, because they know those regulations will eventually arrive. So we say you might as well plan early for that.
Speaker 2:And yeah, well, it's probably going to be regulated. I need to have it here. So let's start with how do I get it to be in my country, in my backyard, kind of thing. So there's definitely a lot going on and it's exciting. I just can't wait to have my own Tesla robot or whatever brand it is, maybe a couple of them yeah, that's right.
Speaker 1:Upstairs one downstairs yeah, that's right. Or upstairs one downstairs yeah, that's right.
Speaker 2:I mean, if it genuinely is $30,000 US (let's see about that), which, unfortunately, with the Southwest Pacific Peso we have in Australia, is more like $50,000. But if it's a $50,000 one-off purchase to have, effectively, a servant in the home, if I can put it that way...
Speaker 1:So yeah, 50,000, I think there'd be a lot of homes with someone to go and make you wash your car and walk your dog, and how many can you have?
Speaker 2:Look, another thing you're talking about another thing. That and the light example we talked about before. People often talk about our assistants or AI agents being like oh, imagine you had an assistant and they could call a restaurant and make a booking for you and you go, yeah okay.
Speaker 2:You know that's kind of cute, but that's not how AI would work. The AI agent would call 300 restaurants. Yeah, fine. Know exactly what your preferences are? Yes, find exactly what you want and chances are in Right. Know exactly what your preferences are? Yes, find exactly what you want and chances are in the future they will be calling AI agents in those restaurants. Well, they have to be if everyone's calling through.
Speaker 1:Exactly right.
Speaker 2:The restaurant's taking, so the restaurants will have to have their. So it's not. It really is not about doing exactly what we're doing now a little bit better. It's about a next leap forward, what you can do with AI. I think we discussed earlier, too that the move away from clearly large language models, chat-sharp IT, those types of things are going to continue, but there is a shift in research with AI and AI models to move down to models based on mathematics, which is a more fundamental building block. I think that will also drive a new wave of quickness of development of AI.
Speaker 1:It plays into healthcare. It plays into how we make decisions on life and death.
Speaker 2:Yeah, there's an article I read recently about how long it took to map a protein which was important in medicine and treating medical conditions, and this PhD department had 10 people that worked for 15 months to map one protein. Yeah, and now they can do it in a day on a GPU. That's insane.
Speaker 1:Yeah Well, that's what I hope we see. It's the light bulb, as you said, but going well and truly beyond that, what improvements for humanity and life should we see out of this, and how quickly can?
Speaker 2:we see that People will get drugs made that treat exactly their condition, down to the very.
Speaker 1:Based on your DNA or whatever the hell it is.
Speaker 2:Well, cancer is another classic, because cancer, by its definition, is where cells have started to divide irregularly, and so every person's cancer, even if two people have the same cancer, the cancer cells are very specific to that person, and so if you could then map those cancer cells in real time and then create something that just goes and kills just that cell, yes, I mean massive advances in cancer treatment.
Speaker 1:One thing I'd love to get your take on is connectivity, and I mean it's the world we live in, certainly Megaport, yeah, how critical is that to the data centers? And I think it's one thing that gets sort of forgotten. Everyone's like we've got to get power, we've got to space, we've got to build these incredible things, and then we go now how do we get customers to connect to them?
Speaker 2:Yeah, I mean I'll shamefully say that we partner with Megaport and absolutely the connectivity layer is going to be critical because if I go back to the example earlier of the 300 restaurants that the AO agent calls, well, it's not going to be picking up the telephone. So you know it's the connectivity out of these data centres and into data centre connectivity that becomes a fundamental enabler. It's an enabler of you know, honestly, nothing happens without connectivity in and out of a data centre. You need to move stuff into stage models, you need models to talk to other things, and it's not just connectivity and it's not just local. You know it would be a global phenomenon and I appreciate the Megaport Fabrics Global, and so you know it can only get more important. It's like air. Think of it as like air.
Speaker 1:Oxygen in the air.
Speaker 2:It's got to be there to enable everything else.
Speaker 1:What we're seeing, and we see this everywhere, is this sensational demand for speed, but speed of a different kind: speed for a period while they're moving things, flexibility and the ability to move up and down, which is kind of what we were originally built upon. So it's playing beautifully into our sweet spot. We'll keep building more and more speed; we've rolled out 400-gig backbones so that people can instantly spin up 100 gig wherever we can get them to. The other piece is that we're seeing so many customers, in so many different locations, needing to access these different data centers that are coming into being. We've had cloud for a while, but now maybe it's an AI data center, or maybe it's an inference space, and they've got to move that data from where they are today.
Speaker 2:I think gone will be the days of a company having all of their stuff in one data center, the one place. And when they do concentrate in a couple of data centers, those elements will need to talk to lots of other things on the network. So dynamic connections and dynamic bandwidth will really be a thing.
Speaker 1:Well, we saw this. It was like one cloud to rule them all, and if that was the case, they'd just build their own data centers, no place for us, and we'd all land in AWS and that was it. But what we've seen play out is that it's getting more fragmented, more misty. There's more and more cloud. We're seeing more and more on-premises requirements. We're now seeing GPU-as-a-service companies appear. It's getting far more complex, shall we say, and it's up to us to simplify that for the customers. But that plays out well for our businesses in terms of what we service back to the customers: new locations, new spaces, different types of companies, and different connectivity required to do that.
Speaker 1:Well, we love your partnership. We really appreciate what we're doing together, and we're going to keep going big with you. I know you look after some of the most critical infrastructure inside Australia, infrastructure that services all Australians, and I know you leverage a lot of Megaport to do that. We can't share what that is, but it's exciting to see what you're doing. We really enjoy the partnership, and we're here for the long term. So thank you.
Speaker 2:And we love the partnership as well.
Speaker 1:Yeah, awesome Very much.
Speaker 2:Pleasure, love that, thank you.
Speaker 1:Appreciate it, man.