Ampere and Oracle Cloud Infrastructure (OCI) are introducing second-generation Ampere-based compute instances, OCI Ampere A2 Compute, built on the AmpereOne family of processors.
OCI Ampere A2 Compute builds on the success of the OCI Ampere A1 Compute instances, which are deployed across more than 100 OCI services, including database services like HeatWave and MySQL as well as Oracle Cloud Applications. The new product provides higher-core-count virtual machines and high-density containers within a single server, delivering more performance, scalability, and cost efficiency for cloud native workloads.
Additionally, the OCI Ampere A2 Compute series extends OCI’s lead in both Arm-based cloud computing and price-performance.
“OCI and Ampere began our collaboration with the ground-breaking A1 shapes. We’ve demonstrated the versatility of these shapes on a wide range of workloads from general purpose applications and OCI services to the most recently announced and highly demanding use case: Llama3 generative AI services,” said Jeff Wittich, Chief Product Officer at Ampere. “Building on this momentum, the new OCI Ampere A2 Compute shapes using our AmpereOne® processors are set to create a new baseline in price-performance for the cloud industry across an ever-expanding variety of cloud native workloads and instance types.”
The OCI Ampere A2 Compute instances offer up to 78 OCPUs (1 OCPU = 2 AmpereOne cores, for 156 cores total) and up to 946 GB of DDR5 memory with 25% more bandwidth than A1. They feature flexible VM sizes, block storage boot volumes up to 32 TB, networking bandwidth of up to 78 Gbps Ethernet, and up to 24 VNICs. Testing shows up to 2x better price-performance versus comparable x86-based shapes. Pricing is $0.014 per OCPU per hour and $0.002 per GB of memory per hour. Oracle’s Flex Shapes allow customers to adjust the core count and memory based on their workloads for additional savings.
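To illustrate how a flexible shape is sized, here is a minimal sketch of launching an A2 flex VM with the OCI Python SDK. The shape name "VM.Standard.A2.Flex", the availability domain, and all OCIDs are placeholders and assumptions, not values from this announcement; confirm the exact shape name and resources available in your tenancy before use.

```python
# Sketch: launch a flexible A2 VM and choose OCPU/memory independently.
# Shape name, availability domain, and OCIDs below are placeholders.
import oci

config = oci.config.from_file()          # reads ~/.oci/config
compute = oci.core.ComputeClient(config)

launch_details = oci.core.models.LaunchInstanceDetails(
    availability_domain="Uocm:PHX-AD-1",              # placeholder AD
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    display_name="a2-flex-demo",
    shape="VM.Standard.A2.Flex",                      # assumed shape name
    shape_config=oci.core.models.LaunchInstanceShapeConfigDetails(
        ocpus=8,            # 8 OCPUs = 16 AmpereOne cores
        memory_in_gbs=96,   # memory sized separately from core count
    ),
    source_details=oci.core.models.InstanceSourceViaImageDetails(
        image_id="ocid1.image.oc1.phx.example",       # placeholder image OCID
    ),
    create_vnic_details=oci.core.models.CreateVnicDetails(
        subnet_id="ocid1.subnet.oc1.phx.example",     # placeholder subnet OCID
    ),
)

instance = compute.launch_instance(launch_details).data
print(instance.id, instance.lifecycle_state)
```

Because OCPUs and memory are set independently in the shape configuration, the per-OCPU and per-GB hourly rates above can be combined to estimate the cost of any given flex sizing.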
OCI Ampere A2 Compute instances show strong performance across multiple AI functions, including generative AI, enabled by joint development between Ampere and OCI. Together, the two companies have delivered up to a 152% performance gain over the previous upstream open-source llama.cpp implementation.
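As a rough sketch of what CPU-only generative AI inference on these instances looks like, the snippet below uses llama-cpp-python, the Python bindings for llama.cpp. The model file, thread count, and context size are illustrative assumptions, not part of the announcement; the optimized builds referenced above would be a drop-in replacement for the upstream library.

```python
# Sketch: CPU-only LLM inference with llama-cpp-python on an Arm VM.
# Model path and parameters are placeholders for illustration only.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_0.gguf",  # placeholder GGUF file
    n_threads=16,   # e.g. an 8-OCPU (16-core) A2 flex VM
    n_ctx=4096,     # context window
)

output = llm(
    "Summarize the benefits of Arm-based cloud instances in one sentence.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```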
OCI Ampere A1 and A2 Compute instances are also well-suited for other cloud native workloads, such as analytics and databases, media services, video streaming, and web services. These instances offer the linear scalability, low latency, density, and predictable performance these workloads need, resulting in higher performance and cost savings.