Revolutionary Analog In-Memory Compute Enabling High-Performance and Ultra Low-Power Edge AI Inference Applications

S. Nayak
Sagence AI, California, United States

Keywords: Artificial Intelligence, Inference, Edge AI, Neural Networks, Computer Vision

Many edge AI inference applications are constrained by power and cost budgets, restricting them to small models. However, there is a growing need for these devices to run larger, more complex models to support advanced applications. For instance, while the number and resolution of security cameras have surged, the AI models processing this data are either basic and run on-premises, or more sophisticated but cloud-based and subject to high latency. New approaches leveraging analog in-memory computing aim to close the gap between performance demands and the power available for edge applications. When applied to real-time AI video analytics, the power and performance improvements are significant. Sagence AI's revolutionary In-Memory Compute technology for AI directly addresses these critical challenges, offering a sustainable, scalable solution that meets the evolving demands of modern AI applications. This poster presentation summarizes the benefits across performance, power, and cost metrics.
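As a loose illustration of the general technique (not Sagence AI's actual implementation, whose details are not given here), analog in-memory compute can be modeled as a matrix-vector multiply carried out by an array of programmable conductances: weights are quantized to the cell resolution, and circuit non-idealities appear as additive read noise on the summed outputs. The bit width and noise level below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_mvm(weights, x, bits=8, noise_std=0.01):
    """Toy model of an analog in-memory matrix-vector multiply.

    Weights are quantized to the resolution of the analog cells
    (conductance levels), and Gaussian noise stands in for circuit
    non-idealities. Illustrative only; parameters are assumptions,
    not vendor specifications.
    """
    levels = 2 ** bits - 1
    w_max = np.abs(weights).max()
    # Quantize weights to the available conductance levels.
    w_q = np.round(weights / w_max * (levels / 2)) * (2 / levels) * w_max
    # Analog summation along the array lines (Kirchhoff's current law),
    # with additive read noise on each accumulated output.
    y = w_q @ x
    return y + rng.normal(0.0, noise_std * np.abs(y).max(), size=y.shape)

W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)
ideal = W @ x
approx = analog_mvm(W, x)
print(np.abs(ideal - approx).max())  # small quantization-plus-noise error
```

Because the multiply-accumulate happens in place in the memory array, no weight data moves across a bus per inference, which is the source of the power advantage the abstract claims for edge workloads.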