Amazon EC2 C5 instances are very popular for running compute-heavy workloads like batch processing, distributed analytics, high-performance computing, machine/deep learning inference, ad serving, highly scalable multiplayer gaming, and video encoding.
In 2018, we added blazing fast local NVMe storage, and named these new instances C5d. They are a great fit for applications that need high-speed, low-latency local storage, such as video encoding, image manipulation, and other forms of media processing. They also benefit applications that need temporary storage of data, such as batch and log processing, as well as applications that need caches and scratch files.
Just a few weeks ago, we launched new instance sizes and a bare metal option for C5 instances. Today, we are happy to add the same capabilities to the C5d family: 12xlarge, 24xlarge, and a bare metal option.
The new C5d instance sizes run on Intel’s Second Generation Xeon Scalable processors (code-named Cascade Lake) with a sustained all-core turbo frequency of 3.6 GHz and a maximum single-core turbo frequency of 3.9 GHz.
The new processors also enable a new feature called Intel Deep Learning Boost, a capability based on the AVX-512 instruction set. Thanks to the new Vector Neural Network Instructions (AVX-512 VNNI), deep learning frameworks will speed up typical machine learning operations like convolution, and automatically improve inference performance over a wide range of workloads.
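If you want to confirm that an instance actually exposes VNNI before relying on it, one simple approach on Linux is to look for the `avx512_vnni` flag in `/proc/cpuinfo`. Here is a minimal sketch (not from the original post; the helper name and the Linux-only check are illustrative assumptions):

```python
# Minimal check (Linux-only) for the AVX-512 VNNI CPU flag.
# Reads /proc/cpuinfo; returns False on platforms without that file.
def has_avx512_vnni(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                # Flag lines look like: "flags : fpu vme ... avx512_vnni ..."
                if line.startswith("flags") and "avx512_vnni" in line.split():
                    return True
    except OSError:
        pass
    return False

if __name__ == "__main__":
    print("AVX-512 VNNI available:", has_avx512_vnni())
```

Deep learning frameworks that are compiled with VNNI support will use the instructions automatically, so this kind of check is only needed for diagnostics.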
These instances are based on the AWS Nitro System, with dedicated hardware accelerators for EBS processing (including crypto operations), the software-defined network inside of each Virtual Private Cloud (VPC), and ENA networking.
New Instance Sizes for C5d: 12xlarge and 24xlarge
Here are the specs:
| Instance Name | Logical Processors | Memory | Local Storage | EBS-Optimized Bandwidth | Network Bandwidth |
|---|---|---|---|---|---|
| c5d.12xlarge | 48 | 96 GiB | 2 x 900 GB NVMe SSD | 7 Gbps | 12 Gbps |
| c5d.24xlarge | 96 | 192 GiB | 4 x 900 GB NVMe SSD | 14 Gbps | 25 Gbps |
Previously, the largest C5d instance available was c5d.18xlarge, with 72 logical processors, 144 GiB of memory, and 1.8 TB of local storage. As you can see, the new 24xlarge size increases available vCPU and memory resources by 33%, to help you crunch those super heavy workloads. Last but not least, both the 12xlarge and 24xlarge sizes give you 50% more NVMe storage per logical processor, with up to 3.6 TB of local storage in total!
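The storage-per-vCPU claim above is easy to verify with a little arithmetic, using the figures quoted in this post (this worked example is ours, not from the original announcement):

```python
# Local NVMe storage per logical processor, using the figures quoted above.
specs = {
    "c5d.18xlarge": {"vcpus": 72, "storage_gb": 2 * 900},  # previous largest size
    "c5d.12xlarge": {"vcpus": 48, "storage_gb": 2 * 900},
    "c5d.24xlarge": {"vcpus": 96, "storage_gb": 4 * 900},
}

per_vcpu = {name: s["storage_gb"] / s["vcpus"] for name, s in specs.items()}

# 25.0 GB per vCPU on 18xlarge vs 37.5 GB on both new sizes: a 50% increase.
for name, gb in per_vcpu.items():
    print(f"{name}: {gb:.1f} GB of NVMe storage per logical processor")
```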
Bare Metal C5d
As is the case with the existing bare metal instances (M5, M5d, R5, R5d, z1d, and so forth), your operating system runs on the underlying hardware and has direct access to the processor and other hardware.
Bare metal instances can be used to run software with specific requirements, for example applications that are licensed exclusively for use on physical, non-virtualized hardware. These instances can also be used to run tools and applications that require access to low-level processor features such as performance counters.
Here are the specs:
| Instance Name | Logical Processors | Memory | Local Storage | EBS-Optimized Bandwidth | Network Bandwidth |
|---|---|---|---|---|---|
| c5d.metal | 96 | 192 GiB | 4 x 900 GB NVMe SSD | 14 Gbps | 25 Gbps |
Bare metal instances can also take advantage of Elastic Load Balancing, Auto Scaling, Amazon CloudWatch, and other AWS services.
Now Available!
You can start using these new instances today in the following regions: US East (N. Virginia), US East (Ohio), US West (N. California), US West (Oregon), Canada (Central), Europe (Ireland), Europe (Frankfurt), Europe (Stockholm), Europe (London), Asia Pacific (Tokyo), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), South America (São Paulo), and AWS GovCloud (US-West).
Please send us feedback, either on the AWS forum for Amazon EC2, or through your usual AWS support contacts.
— Julien;