AAEON’s MAXER-2100 Inference Server Integrates Both Intel CPU and NVIDIA GPU Technologies

Jul 11, 2024


AAEON has released the MAXER-2100, the inaugural offering of its AI Inference Server product line. The MAXER-2100 is a 2U rackmount AI inference server powered by the Intel® Core™ i9-13900 Processor and designed to meet high-performance computing needs.

The MAXER-2100 also supports both 12th and 13th Generation Intel® Core™ LGA 1700 socket-type CPUs up to 125W, and features an integrated NVIDIA® GeForce RTX™ 4080 SUPER GPU. While the default configuration ships with the NVIDIA® GeForce RTX™ 4080 SUPER, the system is also compatible with, and an NVIDIA-Certified Edge System for, both the NVIDIA L4 Tensor Core and NVIDIA RTX™ 6000 Ada GPUs.

Because the MAXER-2100 is equipped with both a high-performance CPU and an industry-leading GPU, a key feature highlighted by AAEON at the product’s launch is its capacity to run complex AI algorithms on large datasets, process multiple high-definition video streams simultaneously, and use machine learning to refine large language models (LLMs) and inference models.

Given the need for low-latency operation in these workloads, the MAXER-2100 offers up to 128GB of DDR5 system memory via dual-channel SODIMM slots. For storage, it includes an M.2 2280 M-Key slot for NVMe drives and two hot-swappable 2.5” SATA SSD bays with RAID support. The system also provides extensive expansion options, including one PCIe [x16] slot, an M.2 2230 E-Key slot for Wi-Fi, and an M.2 3042/3052 B-Key slot with a micro SIM slot.

For peripheral connectivity, the server provides four RJ-45 ports, two at 2.5GbE and two at 1GbE, along with four USB 3.2 Gen 2 ports running at 10Gbps. For industrial communication, the MAXER-2100 offers RS-232/422/485 via a DB-9 port. Multiple display interfaces are available through HDMI 2.0, DP 1.4, and VGA ports, which leverage the exceptional graphics capability of the server’s NVIDIA® GeForce RTX™ 4080 SUPER GPU.

Despite the combined thermal output of its 1000W power supply, 125W CPU, integrated NVIDIA® GeForce RTX™ 4080 SUPER GPU, and any additional add-on cards, the MAXER-2100 is remarkably compact at just 17″ x 3.46″ x 17.6″ (431.8mm x 88mm x 448mm). This is made possible by a novel cooling architecture utilizing three fans, prioritizing airflow around the CPU and key chassis components. Fan placement within the chassis also serves to reduce system noise.

AAEON has indicated that the system caters to three primary user bases – edge computing clients, central management clients, and enterprise AI clients.

The first of these refers to organizations and businesses that require scalable, server-grade edge inferencing for applications such as automated optical inspection (AOI) and smart city solutions.

“The MAXER-2100 can be used to run multiple AI models across multiple high-definition video streams simultaneously, via either its onboard peripheral interfaces or scaled up via network port integration,” said Alex Hsueh, Associate Vice President of AAEON’s Smart Platform Division. “Its high-performance CPU, powerful GPU, large memory capacity, and high-speed network interfaces make it well-equipped to handle the acquisition and processing of 50-100+ high-definition video streams, making it an ideal solution for applications requiring real-time video analysis,” Hsueh added.
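
To give a sense of how such a multi-stream inference workload might be organized in software, the following is a minimal sketch (not AAEON’s own stack) assuming Python with OpenCV and the Ultralytics YOLO package installed and a CUDA-capable GPU; the RTSP camera URLs are placeholders.

    # Minimal sketch: run one detection model per camera stream on the local GPU.
    # Assumes opencv-python and ultralytics are installed; URLs are placeholders.
    import threading
    import cv2
    from ultralytics import YOLO

    STREAMS = [                        # hypothetical camera endpoints
        "rtsp://camera-01/stream",
        "rtsp://camera-02/stream",
    ]

    def run_stream(url: str) -> None:
        model = YOLO("yolov8n.pt")     # one model instance per stream
        cap = cv2.VideoCapture(url)
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            results = model(frame, device=0, verbose=False)  # inference on GPU 0
            print(url, len(results[0].boxes), "detections")
        cap.release()

    threads = [threading.Thread(target=run_stream, args=(u,)) for u in STREAMS]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

In practice, scaling to the 50-100+ streams mentioned above would involve batching frames and sharing model instances rather than one thread per camera, but the structure is the same.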

AAEON’s second target market comprises clients seeking remote multi-device management functions, such as running diagnostics, deploying or refining AI models, or storing local data on edge devices. On the product’s suitability for such clients, Mr. Hsueh remarked, “With the MAXER-2100, our customers can utilize K8S, over-the-air, and out-of-band management to update and scale edge device operations across smart city, transportation, and enterprise AI applications, addressing key challenges faced by our customers when managing multiple AI workloads at the edge.”
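
As a rough illustration of the Kubernetes-based fleet management Hsueh refers to, the sketch below uses the official Kubernetes Python client to roll a new inference-model image out to edge nodes; the deployment name, namespace, and container image are hypothetical, and this is not AAEON’s management tooling.

    # Illustrative sketch: trigger a rolling update of an edge inference deployment.
    # Assumes the kubernetes Python package and a configured kubeconfig.
    from kubernetes import client, config

    config.load_kube_config()                      # or load_incluster_config()
    apps = client.AppsV1Api()

    patch = {"spec": {"template": {"spec": {"containers": [
        {"name": "inference", "image": "registry.example.com/inference:v2"}  # hypothetical image
    ]}}}}

    apps.patch_namespaced_deployment(
        name="video-inference",                    # hypothetical deployment name
        namespace="edge",
        body=patch,
    )
    print("Rolling update triggered across edge nodes")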

For enterprise AI clients, AAEON indicates that by leveraging the MAXER-2100, companies can effectively harness their data to build and deploy advanced AI solutions powered by LLMs. This includes applications in natural language processing, content generation, and customer interaction automation. The key benefits the MAXER-2100 brings to such deployments are the security of data being stored and processed at the edge and the system’s ability to train and refine inference models during operation.

For more information and detailed specifications, please visit the MAXER-2100 product page, or contact your AAEON representative via the contact form on the AAEON website.
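
As a final illustration of the enterprise AI scenario described above, a minimal sketch of serving an LLM locally on the server’s GPU might look like the following; it assumes the Hugging Face Transformers and PyTorch packages, and the model name and prompt are placeholders rather than an AAEON-supplied configuration.

    # Minimal sketch: local LLM inference on the server's GPU.
    # Model and prompt are illustrative placeholders.
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed example model
        torch_dtype=torch.float16,                   # keep weights within GPU memory
        device=0,                                    # local NVIDIA GPU
    )

    reply = generator(
        "Summarize today's automated optical inspection results in two sentences.",
        max_new_tokens=128,
    )
    print(reply[0]["generated_text"])

Because the model weights and the prompts stay on the edge server, no customer data needs to leave the premises, which is the security benefit the vendor highlights.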
