Exasol launches enterprise cloud data warehouse on AWS

The world’s fastest in-memory database is now available on Amazon Web Services with 24/7 enterprise support.

  • Friday, 1st March 2019, by Phil Alsop
Exasol has announced the immediate availability of its analytic database as an enterprise-grade pay-as-you-go service on Amazon Web Services (AWS). The new service offers businesses easy native deployment of the world’s fastest and most flexible cloud data warehouse and data analytics solution on AWS.    


AWS Marketplace customers can now choose from pre-built Amazon Machine Images (AMIs) to build a data analysis infrastructure that fully exploits Exasol’s market-leading performance: for example, to accelerate analysis of large, complex datasets, or to enable self-service reporting against a central repository of structured and unstructured sources.


Jens Graupmann, Vice President of Product Management at Exasol, said: “Building a modern data infrastructure is essential for data-driven decision making, but it doesn’t have to be difficult or expensive. Deploying Exasol through AWS Marketplace is an attractive proposition – businesses can scale up their data analytics infrastructure with cost effective ‘no surprises’ pricing.”


AWS Marketplace customers can easily resize instances or add nodes to their AWS deployment, and Exasol’s Massively Parallel Processing (MPP) database will self-tune the resulting cluster, automatically delivering near-linear performance gains. An Exasol data warehouse provides high performance from 10 GB to 100 TB or more, enabling BI teams to analyse data with SQL and BI tools while data science teams concurrently process the same data with user-defined functions (UDFs).
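To illustrate the UDF workflow mentioned above: Exasol lets data science teams register Python scalar scripts directly in the database with a `CREATE ... SCRIPT` statement, and the script’s `run(ctx)` function is then called once per row alongside ordinary SQL queries. The sketch below is a minimal, hypothetical example (the table, column, and script names are illustrative, not from the article); the per-row logic is written as plain Python so it can be tested outside the database.

```python
# Minimal sketch of an Exasol Python scalar UDF (hypothetical names).
# Inside Exasol, the logic below would typically be registered with DDL such as:
#
#   CREATE OR REPLACE PYTHON SCALAR SCRIPT margin_pct(revenue DOUBLE, cost DOUBLE)
#   RETURNS DOUBLE AS
#   def run(ctx):
#       ...
#   /
#
# and then invoked from SQL, e.g.:
#   SELECT product_id, margin_pct(revenue, cost) FROM sales;
#
# Here the same per-row logic is expressed as standalone Python.

class Ctx:
    """Stand-in for the row context object Exasol passes to a scalar UDF."""
    def __init__(self, revenue, cost):
        self.revenue = revenue
        self.cost = cost

def run(ctx):
    # Compute the margin percentage for one row; rows with no revenue
    # yield None (which Exasol would emit as SQL NULL).
    if not ctx.revenue:
        return None
    return 100.0 * (ctx.revenue - ctx.cost) / ctx.revenue

print(run(Ctx(200.0, 150.0)))  # 25.0
```

Because the UDF runs inside the MPP cluster, each node applies `run` to its own slice of the data, which is how BI queries and data science workloads can operate on the same tables concurrently.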


Using the AWS service, customers can deploy Exasol clusters of any size to optimise cost and performance, and combine them with their on-premises servers to meet data governance requirements. Exasol also integrates easily with Hadoop and Spark infrastructure, augmenting existing data lakes and big data projects to cost-effectively derive value from existing corporate data sources.