AI Inference Everywhere
Untether AI delivers energy-centric AI inference acceleration from the cloud to the edge, supporting any type of neural network model. With our AI accelerator ICs and PCIe cards, we break the barriers to running AI workloads faster, cooler, and more cost-effectively.
The Leader in Faster, Cooler AI Inference
Untether AI was founded in 2018 with a mission to break AI out of the data center and make it available all the way to the network edge. We've developed a unique at-memory architecture for deep learning that places processing next to memory, minimizing data movement and power consumption. Using our hardware, enterprises become untethered from the data center, free to run AI workloads nearly anywhere. Other AI inference accelerator companies make promises; Untether AI makes products.
Recent News and Insights from Untether AI
Untether AI Releases Early Access to imAIgine Software Development Kit Supporting speedAI Inference Acceleration Solutions
The Need for Chiplets and UCIe in Automotive
Backed by Prominent Investors in the Field
With a leadership team executing on our corporate vision, Untether AI has attracted the backing of some of the most prominent investors in the venture community, including CPPIB, GM Ventures, Intel Capital, Radical Ventures, and Tracker Capital.