Gcore unveils Inference at the Edge

Gcore has launched Gcore Inference at the Edge, a breakthrough solution that provides ultra-low latency experiences for AI applications. This innovative solution enables the distributed deployment of pre-trained machine learning (ML) models to edge inference nodes, ensuring seamless, real-time inference.

  • Sunday, 9th June 2024 · Posted 6 months ago by Phil Alsop

Gcore Inference at the Edge empowers businesses across diverse industries—including automotive, manufacturing, retail, and technology—with cost-effective, scalable, and secure AI model deployment. Use cases such as generative AI, object recognition, real-time behavioural analysis, virtual assistants, and production monitoring can now be rapidly realised on a global scale.

Gcore Inference at the Edge runs on Gcore's extensive global network of 180+ edge nodes, all interconnected by Gcore’s sophisticated low-latency smart routing technology. Each high-performance node sits at the edge of the Gcore network, strategically placing servers close to end users. Inference at the Edge runs on NVIDIA L40S GPUs, the market-leading chip designed specifically for AI inference. When a user sends a request, an edge node determines the route to the nearest available inference region with the lowest latency, achieving a typical response time of under 30 ms.
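The routing step described above can be sketched as a simple selection over measured region latencies. This is a minimal illustration only; the region names, latency figures, and function are hypothetical and do not represent Gcore's actual topology or routing implementation.

```python
# Illustrative sketch of latency-based smart routing: an edge node picks the
# nearest *available* inference region, i.e. the one with the lowest measured
# latency. All region names and numbers below are hypothetical examples.

def pick_region(latencies_ms, available):
    """Return the available region with the lowest measured latency (ms)."""
    candidates = {region: ms for region, ms in latencies_ms.items()
                  if region in available}
    if not candidates:
        raise RuntimeError("no inference region currently available")
    return min(candidates, key=candidates.get)

# Example: latencies measured from one edge node to each inference region.
latencies = {"frankfurt": 12.0, "amsterdam": 9.5, "warsaw": 21.0}
print(pick_region(latencies, available={"frankfurt", "warsaw"}))  # frankfurt
```

If the lowest-latency region is temporarily unavailable, the selection simply falls through to the next-best candidate, which is how a sub-30 ms typical response time can be sustained across a large node fleet.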

The new solution supports a wide range of foundational and custom ML models. Open-source foundation models available in the Gcore ML Model Hub include LLaMA Pro 8B, Mistral 7B, and Stable Diffusion XL. Models can be selected and trained to suit any use case, then distributed globally to Gcore Inference at the Edge nodes. This addresses a significant challenge faced by development teams: AI models are typically served from the same servers on which they were trained, far from end users, resulting in poor inference performance.

Benefits of Gcore Inference at the Edge include:

· Cost-effective deployment: A flexible pricing structure ensures customers only pay for the resources they use.

· Inbuilt DDoS protection: ML endpoints are automatically protected from DDoS attacks through Gcore’s infrastructure.

· Outstanding data privacy and security: The solution features built-in compliance with GDPR, PCI DSS, and ISO/IEC 27001 standards.

· Model autoscaling: Autoscaling is available to handle load spikes, so a model is always ready to support peak demand and unexpected surges.

· Unlimited object storage: Scalable S3-compatible cloud storage that grows with evolving model needs.
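The autoscaling behaviour described above can be illustrated with a simple replica-count rule: provision enough model replicas to absorb the current request rate, clamped to a configured range. The function, thresholds, and capacity figures below are hypothetical examples, not Gcore's actual scaling policy.

```python
import math

def replicas_for_load(requests_per_sec, per_replica_capacity=50,
                      min_replicas=1, max_replicas=20):
    """Hypothetical autoscaling rule: enough replicas to absorb the current
    request rate, clamped to [min_replicas, max_replicas]. The capacity of
    50 requests/sec per replica is an illustrative assumption."""
    needed = math.ceil(requests_per_sec / per_replica_capacity)
    return max(min_replicas, min(max_replicas, needed))

print(replicas_for_load(10))   # 1  (quiet period, floor applies)
print(replicas_for_load(480))  # 10 (load spike scales out)
```

The clamp keeps a baseline replica warm for sudden surges while capping spend during extreme spikes, which is the trade-off a pay-for-what-you-use pricing model rewards.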

Andre Reitenbach, CEO at Gcore comments: “Gcore Inference at the Edge empowers customers to focus on getting their machine learning models trained, rather than worrying about the costs, skills, and infrastructure required to deploy AI applications globally. At Gcore, we believe the edge is where the best performance and end-user experiences are achieved, and that is why we are continuously innovating to ensure every customer receives unparalleled scale and performance. Gcore Inference at the Edge delivers all the power with none of the headache, providing a modern, effective, and efficient AI inference experience.”
