F5 has announced early access to F5 AI Gateway, a solution that streamlines interactions between the applications, APIs, and large language models (LLMs) driving enterprise AI adoption. The containerised solution optimises performance, observability, and protection, all of which help reduce costs. Integrated with F5’s portfolio, AI Gateway gives security and operations teams a seamless path to adopting AI services while improving data output quality and the user experience.
According to F5’s State of AI Application Strategy Report, 75% of enterprises are implementing AI. Like countless modern applications, AI services are largely delivered and consumed via APIs. However, enterprises face additional challenges in architecting and scaling AI-enabled apps and services. For example, efficient operations require close monitoring of increasingly relevant metrics such as GPU compute costs and system responsiveness, as well as attention to emerging regulatory compliance requirements.
“LLMs are unlocking new levels of productivity and enhanced user experiences for customers, but they also require oversight, deep inspection at inference time, and defence against new types of threats,” said Kunal Anand, Chief Innovation Officer at F5. “By addressing these new requirements and integrating with F5’s trusted solutions for API traffic management, we’re enabling customers to confidently and efficiently deploy AI-powered applications in a massively larger threat landscape.”
Real-world AI solutions require optimised request, response, and prompt interactions across the entire data ecosystem. F5 AI Gateway observes, optimises, and secures this traffic, spanning a vast number of user-driven and automated variables, to reduce costs, mitigate malicious threats, and help ensure regulatory compliance.
F5 AI Gateway is designed to meet customers, and their apps, wherever they are in their AI journey. It can be deployed in any cloud or data centre and will natively integrate with F5’s NGINX and BIG-IP platforms to take advantage of F5’s leading app security and delivery services in traditional, multicloud, or edge deployments. In addition, the solution’s open extensibility enables organisations to develop and customise the programmable security policies and controls that F5 AI Gateway enforces. These policies can be easily updated and applied dynamically to maintain adherence to security and compliance mandates.
“AI-powered applications will become a cornerstone for nearly every business and organisation in the coming years,” said Shari Lava, Senior Director, AI and Automation at IDC. “F5’s introduction of an AI gateway to its stack of application services gives customers greater flexibility in how they build their AI application architecture while still providing enhanced protection and model optimisation.”
F5 AI Gateway:
Delivers security and compliance policy enforcement with automated detection and remediation against the risks identified in the OWASP Top Ten for LLM Applications.
Offloads duplicate tasks from LLMs with semantic caching, enhancing the user experience and reducing operational costs (a simplified illustration of the idea follows this list).
Streamlines integration processes, allowing developers to focus on building out AI-powered applications rather than managing complex infrastructure.
Optimises load balancing, traffic routing, and rate limiting for local and third-party LLMs to maintain service availability and enhance performance.
Provides a single API interface that developers can use to access their AI model of choice.
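To make the semantic caching capability above more concrete, the following is a minimal, illustrative sketch of the general idea: semantically similar prompts are served from a cache instead of being sent back to the model. It is not F5’s implementation; the embed() and call_llm() helpers are hypothetical stand-ins for a real embedding model and LLM endpoint, and a production gateway would add cache eviction, freshness checks, and privacy controls.

```python
# Illustrative sketch of semantic caching in front of an LLM (not F5's implementation).
# embed() and call_llm() are hypothetical stand-ins for a real embedding model and model endpoint.
import hashlib
import math

CACHE: list[tuple[list[float], str]] = []  # (prompt embedding, cached response)
SIMILARITY_THRESHOLD = 0.9                 # minimum cosine similarity for a cache hit

def embed(text: str, dims: int = 64) -> list[float]:
    """Toy hashed bag-of-words embedding; a real gateway would use a proper embedding model."""
    vec = [0.0] * dims
    for token in text.lower().split():
        token = token.strip(".,!?")
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dims
        vec[idx] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalised, so the dot product equals the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def call_llm(prompt: str) -> str:
    """Placeholder for the real model call (e.g. an HTTP request to an LLM endpoint)."""
    return f"LLM answer to: {prompt}"

def answer(prompt: str) -> str:
    query_vec = embed(prompt)
    # Serve semantically similar prompts from the cache instead of re-querying the LLM.
    for cached_vec, cached_response in CACHE:
        if cosine(query_vec, cached_vec) >= SIMILARITY_THRESHOLD:
            return cached_response
    response = call_llm(prompt)
    CACHE.append((query_vec, response))
    return response

print(answer("What are our refund policies?"))   # cache miss: the LLM is queried
print(answer("what are our refund policies"))    # near-duplicate: served from the cache
```

In a real deployment the similarity threshold trades cache hit rate against the risk of returning a mismatched answer, which is one reason the observability capabilities described above matter.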
“F5’s AI Gateway is an integral part of our AI strategy,” said Austin Geraci, CTO of WorldTech IT. “With this technology, our customers are able to develop both internal and external-facing AI applications capable of handling a surge in queries without degradation to site and application performance. F5 brings leading app security and delivery capabilities to accelerate AI experiences at scale. With F5 AI Gateway, semantic caching and intelligent traffic routing alone represent significant cost savings, and the unification of F5 services will save customers hundreds of hours of integration work.”