Nvidia introduces computing platform for AI data processing

Nvidia has created a computing platform, NVIDIA EGX, that can perform low-latency AI at the edge and act in real time on continuous streaming data.

The American computer processing technology giant designed NVIDIA EGX to meet growing demand for instantaneous, high-throughput AI at the edge, where data is created, for applications such as data exchanges between 5G base stations and enterprises operating warehouses, retail stores and factories.

The platform's edge servers are designed to handle the data volumes expected within the next five years, based on research by market intelligence firms such as IDC, which estimated in 2018 that 150 billion machine sensors and IoT devices will be streaming continuous data by 2025.

NVIDIA EGX is designed to deliver guaranteed response times while reducing the amount of data that must be sent to the cloud.
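The idea behind that reduction is straightforward: score incoming data on the device itself and only forward the events that matter. The following is a minimal sketch of that pattern, not NVIDIA's actual software; the `score` function is a hypothetical stand-in for on-device model inference.

```python
def score(reading: float) -> float:
    """Hypothetical stand-in for on-device AI inference.

    A real edge system would run a neural network here; for illustration
    the raw reading is treated as its own anomaly score.
    """
    return reading


def filter_at_edge(stream, threshold=0.8):
    """Keep only readings whose score warrants sending to the cloud."""
    return [r for r in stream if score(r) >= threshold]


# A ten-element sensor stream shrinks to just the significant readings,
# so only a fraction of the raw data ever leaves the device.
readings = [0.1, 0.95, 0.2, 0.3, 0.85, 0.1, 0.05, 0.4, 0.9, 0.2]
uploaded = filter_at_edge(readings)
print(uploaded)  # three of ten readings are forwarded
```

In a deployed system the scoring step would be the latency-critical part, which is why platforms like EGX put GPU-accelerated inference next to the data source instead of in a remote data centre.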

“Enterprises demand more powerful computing at the edge to process their oceans of raw data — streaming in from countless interactions with customers and facilities — to make rapid, AI-enhanced decisions that can drive their business,” commented Bob Pette, vice president and general manager of Enterprise and Edge Computing at NVIDIA.

“A scalable platform like NVIDIA EGX allows them to easily deploy systems to meet their needs on premises, in the cloud or both.”

At its smallest, EGX uses an NVIDIA Jetson Nano, a compact computer that can run multiple neural networks in parallel and, in just a few watts, delivers one-half trillion operations per second (0.5 TOPS) of processing for tasks such as image recognition.

It spans all the way to a full rack of NVIDIA T4 servers, delivering more than 10,000 TOPS for real-time speech recognition and other real-time AI tasks.

EGX combines the full range of NVIDIA AI computing technologies with Red Hat OpenShift and NVIDIA Edge Stack together with Mellanox and Cisco security, networking and storage technologies.

Nvidia claims this will allow companies in some of the largest industries — telecom, manufacturing, retail, healthcare and transportation — to quickly stand up secure, enterprise-grade AI infrastructure.

“Mellanox Smart NICs and switches provide the ideal I/O connectivity for data access that scale from the edge to hyperscale data centers,” said Michael Kagan, chief technology officer at Mellanox Technologies. “The combination of high-performance, low-latency and accelerated networking provides a new infrastructure tier of computing that is critical to efficiently access and supply the data needed to fuel the next generation of advanced AI solutions on edge platforms such as NVIDIA EGX.”

NVIDIA EGX optimises AI at the edge for a growing ecosystem of software solutions, such as video analytics applications suited to large retail chains and smart cities.

NVIDIA EGX servers are also tuned for NVIDIA Edge Stack and NGC-Ready validated for CUDA-accelerated containers.

Early adopters include more than 40 companies and organisations.

BMW Group Logistics will use NVIDIA’s EGX edge computing and Isaac robotic platforms to bring AI directly to the edge of its logistics processes and handle increasingly complex logistics with real-time efficiency.

Another industry leader, GE Healthcare, is also adopting EGX for its medical devices.

Jason Polzin, Ph.D., general manager of MR Applications, GE Healthcare, commented: “AI is fundamental to achieving precision health and must be pervasively available from the cloud to the edge and directly on medical devices.

“NVIDIA’s EGX enables GE Healthcare to deliver rapid MR acquisition times, improves image quality and reduces variability by embedding NVIDIA T4 GPUs directly into our medical devices — all to further our goal of improving patient outcomes.

“Real-time, critical-care use cases demand AI at the edge.”

A recent article by Jason McGee-Abe on how edge computing will revolutionise mobile gaming explored how new technologies will use edge devices to carry out a substantial amount of computation, minimising latency and response times.
