Kove's Innovative Software-Defined Memory Revolutionizes AI Inference Workloads with Unmatched Performance

Kove's Revolutionary Software-Defined Memory (SDM)

Kove, a pioneer in software-defined memory (SDM), has unveiled benchmark results demonstrating the capabilities of its KoveSDM™ technology. The solution allows Redis and Valkey, two leading engines in AI inference, to handle workloads up to five times larger than traditional local DRAM while also achieving lower latencies. The advance is significant as artificial intelligence applications come to depend ever more heavily on memory efficiency.

The Power of KoveSDM™

KoveSDM™ is the world's first commercially available software-defined memory solution, pooling memory across servers. Running on any hardware that supports Linux, it allocates memory dynamically to match application needs, yielding faster processing, quicker time to solution, improved memory resilience, and notable energy savings. Presenting at the AI Infra Summit 2025, CEO John Overton argued that memory constraints, rather than computing power, are becoming the critical bottleneck for AI inference scalability.

While GPUs and CPUs continue to evolve, DRAM has remained comparatively static, often stranding compute and adding cost. Benchmark tests indicated that with KoveSDM™, businesses can expect improved performance while lowering GPU-related expenses through better memory management. KoveSDM™ achieves this by virtualizing memory across multiple servers, creating large elastic memory pools that match the function and performance of local DRAM.

Benchmark Results and Performance Metrics

In an independent benchmark conducted on Oracle Cloud Infrastructure, performance was measured first without KoveSDM™ and then with it. The results:

Redis Benchmark (v7.2.4)

- 50th percentile: SET 11% faster, GET 42% faster
- 100th percentile: SET 16% slower, GET 14% faster

Valkey Benchmark (v8.0.4)

- 50th percentile: SET 10% faster, GET 1% faster
- 100th percentile: SET 6% faster, GET 25% faster
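
Percentile deltas like those above compare the same latency percentile across two benchmark runs, with a negative delta meaning the KoveSDM™ run was faster. The sketch below shows the arithmetic with a nearest-rank percentile; the helper names and latency samples are illustrative placeholders, not Kove's measurements.

```python
# Hypothetical helper for expressing benchmark deltas the way the Redis and
# Valkey results above are reported (a negative delta means "faster").
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, math.ceil(pct / 100 * len(ordered)) - 1))
    return ordered[k]

def delta_pct(baseline, candidate, pct):
    """Percent change at a percentile; negative means candidate is faster."""
    b = percentile(baseline, pct)
    return 100.0 * (percentile(candidate, pct) - b) / b

# Illustrative GET latencies (ms): baseline run vs. pooled-memory run.
baseline = [0.90, 1.00, 1.10, 1.20, 2.00]
pooled   = [0.80, 0.85, 0.95, 1.05, 2.10]

print(f"p50:  {delta_pct(baseline, pooled, 50):+.1f}%")   # negative = faster
print(f"p100: {delta_pct(baseline, pooled, 100):+.1f}%")
```

Note that the 50th and 100th percentiles can move in opposite directions, as in the Redis SET results, since the tail of a latency distribution behaves differently from its median.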

John Overton noted, "These results highlight the potential of software-defined memory in accelerating AI inference by removing KV cache evictions and repetitive GPU calculations. Each GET that doesn't result in recomputation saves vital GPU resources, ultimately translating to substantial savings for enterprises engaged in large-scale inference operations."

Business Implications of KoveSDM™

For companies investing in AI technology, the financial benefits of implementing KoveSDM™ are clear:
- Expected annual savings of $30 to $40 million for large deployments.
- 20 to 30% reduction in hardware costs by postponing expensive high-memory server upgrades.
- 25 to 54% decrease in power and cooling expenses due to improved memory efficiency.
- Significantly lower downtime costs, as KoveSDM™ minimizes memory-related failures.

Beth Rothwell, Kove's Director of GTM Strategy, remarked, "Kove has successfully established a new category — software-defined memory — which enhances AI infrastructure. It empowers businesses to optimize their operations by maximizing performance while ensuring economic viability. Without this capability, AI scalability faces challenges; with it, AI inference scales efficiently, ensuring GPUs are optimally utilized, leading to massive savings for enterprises."

Timeliness of the Development

As AI demand surges, roughly doubling every six to twelve months, standard DRAM deployments struggle to keep pace. Existing alternatives, such as tiered KV caching, introduce latency or inefficiency. KoveSDM™ addresses these challenges by pooling DRAM across servers while delivering performance comparable to local memory, so KV caching avoids tiering to storage, which can degrade performance by 100 to 1,000 times.
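
The GET-versus-recompute economics Overton describes can be illustrated with a toy KV cache: every hit returns a stored result, while an eviction-driven miss forces the expensive recomputation the cache exists to avoid, and a larger memory pool means fewer evictions. The class, eviction policy, and costs below are hypothetical, not Kove's implementation.

```python
# Toy illustration of KV-cache economics: each cache hit (a successful GET)
# avoids one expensive recomputation. Names and policies are hypothetical.
class ToyKVCache:
    def __init__(self, capacity):
        self.capacity = capacity      # larger pooled memory => fewer evictions
        self.store = {}
        self.recomputes = 0           # stand-in for wasted GPU work

    def get(self, key, recompute_fn):
        if key in self.store:
            return self.store[key]    # hit: no GPU work needed
        value = recompute_fn(key)     # miss: pay the recomputation cost
        self.recomputes += 1
        if len(self.store) >= self.capacity:
            self.store.pop(next(iter(self.store)))  # naive FIFO eviction
        self.store[key] = value
        return value

def expensive(key):
    return key * 2  # placeholder for a real GPU recomputation

small = ToyKVCache(capacity=2)   # cramped local DRAM
large = ToyKVCache(capacity=8)   # elastic pooled memory

for k in [1, 2, 3, 1, 2, 3, 1, 2, 3]:
    small.get(k, expensive)
    large.get(k, expensive)

print(small.recomputes)  # -> 9: a 3-key working set thrashes a 2-slot cache
print(large.recomputes)  # -> 3: only the three cold misses recompute
```

The point of the sketch is the ratio: the small cache recomputes on every access, while the large one recomputes only on first touch, which is the saving a bigger memory pool buys.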

KoveSDM™ is available for immediate deployment, with no code changes or application modifications, on any Linux-compatible x86 hardware.

About Kove

Founded in 2003, Kove has consistently aimed to solve complex technological challenges, ranging from high-speed backups for large databases to creating record-setting sustained storage speeds. The company’s innovative use of distributed hash tables facilitated significant advances in cloud storage and database scaling. With their latest development, Kove delivers the first comprehensive software-defined memory solution, empowering organizations to unlock greater potential through enhanced infrastructure efficiency. Committed to providing high-performance computing products and dedicated services, Kove is ready to assist enterprises across various sectors, including financial services, life sciences, energy, and defense. For more details, visit kove.com.
