TSXV:TUO - Post Discussion
Post by
highper on Apr 10, 2024 3:41pm
Musk describes tomorrow today re: AI electricity demand
The advancement of AI technology, according to Musk, is currently hampered by two main factors: supply shortages of advanced processors like Nvidia's H100 (it's not easy to get 100,000 of them quickly) and the availability of electricity. Nvidia's H100 GPU consumes around 700W when fully utilized, so 100,000 GPUs running AI and HPC workloads could draw a whopping 70 megawatts of power. Since these GPUs also need servers and cooling to operate, it's safe to say that a datacenter with 100,000 Nvidia H100 processors will consume around 100 megawatts of power. That's comparable to the power consumption of a small city.

https://news.yahoo.com/tech/elon-musk-says-next-generation-181954074.html

Musk stressed that while the compute GPU supply has been a significant obstacle so far, the supply of electricity will become increasingly critical in the next year or two. This dual constraint underscores the challenges of scaling AI technologies to meet growing computational demands.
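The 70 MW and ~100 MW figures are easy to sanity-check. A minimal back-of-envelope sketch, assuming the 700W-per-H100 figure from the post and a roughly 1.4x datacenter overhead multiplier (my assumption, to account for servers, networking, and cooling; the article doesn't give an exact factor):

```python
# Back-of-envelope check of the power figures quoted in the post.
GPU_COUNT = 100_000
WATTS_PER_H100 = 700          # per-GPU draw at full utilization, per the post
OVERHEAD_MULTIPLIER = 1.4     # assumed factor for servers + cooling (not from the article)

gpu_power_mw = GPU_COUNT * WATTS_PER_H100 / 1_000_000
total_power_mw = gpu_power_mw * OVERHEAD_MULTIPLIER

print(f"GPUs alone:    {gpu_power_mw:.0f} MW")    # 70 MW
print(f"With overhead: {total_power_mw:.0f} MW")  # 98 MW, in line with ~100 MW
```

With those assumptions the GPUs alone land at exactly 70 MW, and the overhead multiplier brings the total to roughly the 100 MW the post cites.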