Wednesday, February 21, 2024

Call for Papers: Workshop on CBR and LLMs

[Image generated by Gemini]
Last year was the most remarkable year in AI that I can recall. Large Language Models (LLMs) like ChatGPT changed the public perception of AI, and what had previously seemed like science fiction became a reality. I was only tangentially familiar with LLM research, having been working on emotion recognition in speech with a PhD student. Last year, however, I started diving into LLM research in depth, which, as one commentator put it, was like trying to drink water from a fire hydrant, such was the volume of publications appearing in places like arXiv.

I view all problems through a lens coloured by case-based reasoning (CBR), my long-term AI research speciality. I quickly saw synergies between CBR and LLMs where both could benefit from each other's approaches, and I wrote up my initial thoughts and published them on arXiv.

CBR has an annual international conference, and I proposed the idea of a workshop at the conference on CBR-LLM synergies to some colleagues, who all thought it was a great idea and agreed to co-organise the workshop with me. The Case-Based Reasoning and Large Language Models Synergies Workshop will take place at ICCBR 2024 in Mérida, Yucatán, México on July 1st 2024. The call for papers can be accessed here, and submissions are via EasyChair.

Thursday, February 15, 2024

A Long-term Memory for ChatGPT

[Image generated by Gemini]

In October last year, I published a short position paper, A Case-Based Persistent Memory for a Large Language Model, arguing that ChatGPT and other LLMs need a persistent long-term memory of their interactions with a user to be truly useful. It seems OpenAI was listening, because a couple of days ago they announced that ChatGPT would retain a persistent memory of chats across multiple conversations. As reported in Wired, the memory will be used to add helpful background context to your prompts, improving their specificity to you over time.

I argued in my October paper that the LLM community should look to the Case-Based Reasoning community for help with memory, since we are the discipline within AI that has been explicitly concerned with memory for decades. For example, we long ago realised that while remembering is vital, a memory must also be able to forget some things to remain functional. This is a non-trivial problem discussed in Smyth and Keane's 1997 paper Remembering To Forget: A Competence-Preserving Case Deletion Policy for Case-Based Reasoning Systems. The synergies between CBR and LLMs will be the focus of a workshop at ICCBR-24 in July in Mérida, Yucatán, México.
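To give a flavour of what competence-preserving forgetting means in practice, here is a minimal sketch, loosely inspired by the idea in Smyth and Keane's paper rather than their actual algorithm: a case may be forgotten only if every problem it can solve is also covered by the cases that remain. The toy similarity model (cases and problems as numbers, solved when within a distance threshold) and all function names are illustrative assumptions, not from the paper.

```python
# Sketch of a competence-preserving deletion policy for a tiny case base.
# Cases and problems are plain numbers; the similarity model is a toy.

def solves(case, problem, threshold=1.0):
    # Illustrative assumption: a case solves a problem if it is close enough.
    return abs(case - problem) <= threshold

def coverage(case, problems):
    # The set of problems this case can solve.
    return {p for p in problems if solves(case, p)}

def prune(case_base, problems):
    """Forget cases whose coverage is fully provided by the remaining cases."""
    kept = list(case_base)
    for c in list(kept):
        others = [o for o in kept if o != c]
        covered_by_others = set()
        for o in others:
            covered_by_others |= coverage(o, problems)
        if coverage(c, problems) <= covered_by_others:
            kept.remove(c)  # forgetting c loses no competence
    return kept

base = [1, 2, 3, 10]
problems = [1, 2, 3, 10]
print(prune(base, problems))  # → [2, 10]
```

Here cases 1 and 3 can be forgotten because case 2 covers everything they solve, while case 10 must be kept: no other case comes close to it. Deciding deletion order and measuring competence properly is exactly where the non-trivial work lies.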