Insights from our SGAI-AI 2025 publication

Memory as a foundation for cognitive AI

Dec 15, 2025

Large language models are impressive at generating fluent responses, but they still struggle with one essential capability: memory.

Most conversational systems rely on short-lived context windows. Once interactions grow longer or span multiple sessions, information is lost, personalization degrades, and coherence breaks down.

At SGAI-AI 2025 in Cambridge, we presented our paper "Comparative Analysis of ChatGPT Memory and an External Conversational Memory Architecture", which explores this limitation through an architectural lens.

Instead of comparing models, we compared paradigms.

We evaluated ChatGPT’s native memory against an external conversational memory system designed to persist user profiles and interaction histories beyond the LLM context window. The study focused on two dimensions: factual recall accuracy and the quality of long-term, hyper-personalized responses.
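To make the idea of an external memory layer concrete, here is a minimal, hypothetical Python sketch: it persists profile facts and interaction history outside the model's context window and injects only a relevant slice into the prompt at generation time. The names (ExternalMemory, remember, recall, build_prompt) and the naive keyword-overlap retrieval are our own illustration, not the architecture evaluated in the paper, which would typically use a proper store and embedding-based retrieval.

```python
from dataclasses import dataclass, field


@dataclass
class ExternalMemory:
    """Illustrative external conversational memory layer (not the paper's system).

    Keeps durable user facts and past exchanges outside the LLM context window,
    then builds a prompt containing only the relevant slice.
    """
    profile: dict = field(default_factory=dict)   # durable user facts
    history: list = field(default_factory=list)   # (user, assistant) turns

    def remember(self, key: str, value: str) -> None:
        # Persist a profile fact across sessions.
        self.profile[key] = value

    def log_turn(self, user_msg: str, assistant_msg: str) -> None:
        # Append one exchange to the interaction history.
        self.history.append((user_msg, assistant_msg))

    def recall(self, query: str, k: int = 3) -> list:
        # Naive keyword-overlap retrieval; a real system would use embeddings or a database.
        scored = [
            (sum(w in u.lower() for w in query.lower().split()), f"{u} -> {a}")
            for u, a in self.history
        ]
        scored.sort(reverse=True)
        return [text for score, text in scored[:k] if score > 0]

    def build_prompt(self, query: str) -> str:
        # Inject persisted facts and recalled turns ahead of the new user message.
        facts = "\n".join(f"- {k}: {v}" for k, v in self.profile.items())
        recalled = "\n".join(self.recall(query))
        return (
            f"Known user facts:\n{facts}\n\n"
            f"Relevant history:\n{recalled}\n\n"
            f"User: {query}"
        )


# Usage: the prompt carries persisted state even after the session restarts.
memory = ExternalMemory()
memory.remember("preferred_language", "French")
memory.log_turn("I run a bakery in Lyon.", "Noted, a bakery in Lyon.")
print(memory.build_prompt("What do you remember about my business?"))
```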

The study combined a synthetic benchmark with a blind human evaluation, and the results were clear. When personalization needs to persist over time, an external memory module delivers measurable gains in recall, consistency, and response quality.

The takeaway is simple: memory should be treated as a first-class architectural layer, not an auxiliary capability embedded in the model.

Separating memory from generation enables more coherent, traceable, and reliable AI systems, especially for real-world use cases where interactions are continuous rather than transactional.

Cognitive AI will not emerge from larger context windows alone, but from better system design.

The full paper is available here:
https://link.springer.com/chapter/10.1007/978-3-032-11442-6_30

