Conrad Egusa
Entrepreneur

As Nvidia expands in Israel, this VC believes in these building blocks for AI

Sumedh Nadendla (image courtesy of Sumedh Nadendla)

Last week, global tech giant Nvidia held GTC 2025, its annual developers’ conference, in San Jose, an event often fondly referred to as the ‘Woodstock of AI’.

As ever, the keynote from CEO Jensen Huang was one of the most hotly anticipated parts of the conference and a firm favorite among enthusiasts, given Huang’s tendency to reveal game-changing insights during his presentations.

This year, we learned that data centers will play an increasingly important role in the company’s future. Huang’s session outlined a future where innovation is powered by an AI ecosystem of enablers spanning both hardware and software. He also reintroduced the concept of AI factories, which are essentially gigantic data centers.

On this note, Israel is expected to play a part in building these ecosystems of the future. Earlier this year, the tech giant invested $500 million to build the country’s most powerful data center, a 10,000 square-meter facility in the Mevo Carmel Science and Industry Park.

In addition, Nvidia’s flagship event offered us a glimpse into the future of chip technology. Its latest release, the Blackwell Ultra, has been “built for the age of AI reasoning” and could democratize AI adoption by making advanced compute power available to more organizations and initiatives.

Although Nvidia dominates when it comes to chip technology, tech companies like Google and Amazon are aiming to reduce their reliance on Nvidia’s products and pricing. For example, Google is expanding its chip development operations in Israel to develop a new type of communications chip – a critical component for GPUs within giant data centers.

From this, it’s clear that Israel is a key partner for the tech sector, helping to expand data capacity and develop new chip technologies that will fuel the next wave of AI developments.

Sumedh Nadendla, investment lead at Pacific Alliance Ventures, has keen insights into what we can expect from AI enablers more broadly: the technologies that serve as a foundation for scaling AI and overcoming common development challenges.

The next phase of AI 

The past few months have seen a rapid succession of announcements about AI models becoming faster and cheaper to run, from Google training AI that’s 3 times faster and 10 times more power-efficient, to DeepSeek’s model built at a fraction of the cost of its competitors.

According to Sumedh, this is a trend that’s set to continue. 

“As AI models become more sophisticated by orders of magnitude each month, and the costs for powerful LLMs continue to come down, we can expect to see even more adoption of AI across industries. 

“Already the technology is rapidly expanding the scope of software from productivity enhancement to the work itself, opening up a total addressable market of over $10Tn. This nascent, vibrant phase for AI is marked by groundbreaking innovations, bold investments, and transformative potential.”

However, as seen with the likes of Nvidia and Google expanding operations in Israel, certain so-called building blocks will be needed to ensure AI innovation can continue. 

“A recent IBM survey highlighted the challenges developers face when building GenAI apps. The problems are three-fold: a skills gap in AI, a lack of clarity on frameworks and toolkits, and the fast pace of change surrounding the tech.

“AI enablers can play a key role in this regard for the ecosystem. While they’re not as well-known as AI adopters, these enablers lay the critical groundwork needed to run any AI application and software – they range from data centers and cloud providers to data and analytics partners and chipmakers.”

How AI fuels other areas of industry growth 

Building AI ecosystems isn’t just good news for the likes of OpenAI; it’s also fueling waves of new innovation in adjacent industries like robotics.

This was clearly demonstrated at Nvidia GTC, where the company unveiled Isaac GR00T N1, described as the world’s first open humanoid robot foundation model.

Here, Sumedh believes that lower-cost AI is making the technology more economically feasible to deploy.

He explains, “Autonomous robotic solutions will rely on a combination of AI solutions, from neural networks for vision and mapping to large-scale solvers for complicated optimal control problems. GPUs are helping to address the huge demand for processing power, but the industry still needs a good robotics operating system that allows for higher-level applications.”

Even so, the combined power of AI and robotics promises to drive adoption across industries and into new use cases. However, this will still hinge on improved infrastructure and standardization.

The parallels between AI and the internet

As AI sits at this pivotal moment, Sumedh believes we can see clear parallels with the early days of the internet.

“The dominant telcos of the time had to invest heavily in building the physical backbone required for widespread internet adoption, tearing up roads to lay down fiber cables and building costly data centers,” he explained.

“Commercial enterprises then created useful internet-adjacent products like networking infrastructure, security solutions and enterprise-grade storage. Over time, these foundations paved the way for software infrastructure companies such as Cisco, Sun Microsystems, Microsoft and Oracle to become pivotal internet enablers through system design, protocols and standardization.”

Looking forward, he believes software infrastructure will be as pivotal as the availability of data centers and advanced processing chips. It will need to solve challenges like data conflicts, ensure developers can focus on the business logic that makes AI apps robust and reliable, and provide resource management tools that help cloud infrastructure adapt to the needs of this emerging technology.

“Leveraging best practices from DevOps will help companies get off the ground more quickly when developing AI applications, through efficient testing and compiling tools that reduce build times and deliver consistent, reliable deployments,” the investor concluded.

About the Author
Conrad Egusa is a Global Mentor at 500 Startups, Founder Institute, Techstars, Cardinal Ventures of Stanford University, Oxford Entrepreneurs and more, and has contributed to TechCrunch, VentureBeat, Forbes and TheNextWeb. Conrad is also an Advisory Board Member at SXSW Pitch, an Advisor at Microsoft Startup Growth Partners and Horasis, and a Judge at Start-Up Chile and Parallel18.