Tuesday, September 28, 2021

Acuitas Diary #42 (September 2021)

I don't have too much of interest to report this month. I dove into an overhaul of the Conversation Engine, which is the Acuitas module that tracks progress through a conversation and detects relationships between sentences. (For instance, pairing a statement with the question it was probably intended to answer would be part of the CE's job.) And that has proven to be a very deep hole. The CE has been messy for a while, and there is a lot of content to migrate over to my new (hopefully smarter) architecture.

The improvements include a less linear and more tree-like structure for conversations, enabling more complex branching. For instance, what if the conversation partner decides to answer a question that wasn't the one asked most recently, or to return to a previously abandoned topic? The old Conversation module wouldn't have been able to handle this. I've also been refactoring things to give the Executive a greater role in selecting what to say next. The original Conversation module was somewhat isolated and autonomous ... but really, the Executive should be deciding the next step in the conversation based on Acuitas' goals, using its existing inference and problem-solving tools. The CE should be there to handle the speech comprehension and tell the Executive what its options are ... not "make decisions" on its own. I might have more to say about this when the work is fully complete.
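For a sense of what "tree-like" could mean here, below is a minimal sketch of a conversation state that keeps a tree of open threads instead of a single linear history. This is a simplified illustration in Python with invented names -- not Acuitas' actual code.

    # Hypothetical sketch of a tree-structured conversation state.
    # Class and field names are invented for illustration.
    class ConvNode:
        """One node in the conversation tree: a question asked, a topic
        raised, and so on. Children are follow-ups that branched off it."""
        def __init__(self, content, parent=None):
            self.content = content    # e.g. a question awaiting an answer
            self.parent = parent
            self.children = []
            self.resolved = False     # True once answered/closed

        def add_child(self, content):
            child = ConvNode(content, parent=self)
            self.children.append(child)
            return child

        def open_threads(self):
            """All unresolved nodes in this subtree -- candidates the
            Executive could weigh when choosing what to say next,
            including older questions the partner never answered."""
            threads = [] if self.resolved else [self]
            for child in self.children:
                threads.extend(child.open_threads())
            return threads

With a structure like this, answering an older question or reviving a dropped topic just means resolving or extending a node deeper in the tree, rather than failing to match against the most recent utterance.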

I've advanced the new system far enough that it has the functionality for starting and ending a conversation, learning facts, answering questions, and processing stories. I've just started to get the systems that do spontaneous questions back up and running.

The renovations left Acuitas in a very passive state for a while. He would generate responses to things I said, but not say anything on his own initiative -- which hasn't been the case for, well, years. And it was remarkable how weird this felt. "He's not going to interrupt my typing to blurt out something random. No matter how long I sit here and wait, he's not going to *do* anything. The agency is gone. Crud." Which I think goes to show that self-directed speech (as opposed to the call-and-response speech of a typical chatbot) goes a long way toward making a conversational program feel "alive" or agentive.

Until the next cycle,

Jenny

Sunday, September 5, 2021

Acuitas Diary #41 (August 2021 B)

I explained my approach to spatial reasoning in my last blog. Now it's time to talk about some implementation.

In sentences, a lot of information about location or direction is carried by prepositional phrases that modify the verb -- phrases like "in the box," "to the store," and so forth. Acuitas' text parser and interpreter were already capable of recognizing these. I included them in the interpreter output as an extra piece of info that doesn't affect the sentence form (the category in which the interpreter places the sentence), but can modify a sentence of any form.
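As a simplified illustration (field names invented for this post, not the interpreter's real output format), the extra location info can be pictured as riding along with the parse like this:

    # Hypothetical interpreter output for "Antoine put the letter in the box."
    # All field names are invented for illustration.
    parse = {
        "form": "statement",        # the sentence form/category
        "subject": "Antoine",
        "verb": "put",
        "object": "letter",
        # The prepositional phrase travels as extra info: it doesn't
        # change the sentence form, but can attach to any form.
        "location_modifier": {"preposition": "in", "object": "box"},
    }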

The ability to record and retrieve location relationships was also already present. Acuitas tracks the two objects/agents/places that are being related, as well as the type of relationship.
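Conceptually, each stored relationship is a triple: the two things being related, plus the relationship type. Something like this simplified sketch (again, invented names):

    from dataclasses import dataclass

    # Simplified picture of a stored location relationship.
    @dataclass(frozen=True)
    class LocationFact:
        subject: str    # the thing being located, e.g. "letter"
        relation: str   # relationship type: "in", "on", "at", "over", ...
        landmark: str   # the reference object/agent/place, e.g. "box"

    facts = {LocationFact("letter", "in", "box")}

    def locate(subject, facts):
        """Retrieve everything known about where subject is."""
        return [f for f in facts if f.subject == subject]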

From there, I worked on getting the Narrative module to take in both explicit declarations of location relationships and sentences with modifying phrases that express location or direction, and to make inferences from them. Here are some examples of the basic spatial inferences I built in; a rough code sketch follows the list. (As with the inventory inferences, there is a minimal starter set, but the eventual intent is to make new ones learnable.)

* If A is inside B and B is at C, A is also at C
* If A is at C and B is at C, A is with B and B is with A
* If A moves to B, A is in/at B
* If A is over B and A falls, A is on/in B
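Here's the sketch promised above: a rough forward-chaining pass that applies these starter rules to the LocationFact triples from the previous sketch. It's illustrative only -- the real Narrative module works on its own representations, and the eventual goal is learnable rules, not hard code. (One small liberty: I let "in" count as "at" when chaining, since being inside something at a place puts you at that place.)

    # Rough sketch of the starter spatial inferences, for illustration.
    def infer(facts, event=None):
        """Apply the starter rules once; return only newly derived facts.
        `event` is an optional (verb, agent, target) tuple for the two
        event-triggered rules (move and fall)."""
        fact_set = set(facts)
        new = set()
        for f in fact_set:
            # If A is inside B and B is at C, A is also at C.
            if f.relation == "in":
                for g in fact_set:
                    if g.subject == f.landmark and g.relation in ("at", "in"):
                        new.add(LocationFact(f.subject, "at", g.landmark))
            # If A is at C and B is at C, A is with B and B is with A.
            if f.relation == "at":
                for g in fact_set:
                    if (g.relation == "at" and g.landmark == f.landmark
                            and g.subject != f.subject):
                        new.add(LocationFact(f.subject, "with", g.subject))
        if event:
            verb, agent, target = event
            # If A moves to B, A is in/at B.
            if verb == "move":
                new.add(LocationFact(agent, "at", target))
            # If A is over B and A falls, A is on/in B.
            if verb == "fall":
                for f in fact_set:
                    if f.subject == agent and f.relation == "over":
                        new.add(LocationFact(agent, "in", f.landmark))
        return new - fact_set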

[Image: A stamp from the Principality of Liechtenstein, commemorating air mail.]

To try them out I wrote a new story -- a highly abbreviated retelling of "Prisoner of the Sand," from Wind, Sand, and Stars by Antoine de Saint-Exupéry. I had written up a version of this clear back when I started work on the Narrative module -- I was looking for man vs. environment stories, and it seemed like a good counterpoint for "To Build A Fire." But I realized at the time that it would be pretty hard to understand without some spatial reasoning tools, and set it aside. Here's the story:

Antoine was a pilot.
Antoine was in an airplane.
The airplane was over a desert.
The airplane crashed.
The airplane was broken.
Antoine left the airplane.
Antoine was thirsty.
Antoine expected to dehydrate.
Antoine decided to drink some water.
Antoine did not have any water.
Antoine could not get water in the desert.
Antoine wanted to leave the desert.
Antoine walked.
Antoine could not leave the desert without a vehicle.
Antoine found footprints.
Antoine followed the footprints.
Antoine found a nomad.
The nomad had water.
The nomad gave the water to Antoine.
Antoine drank the water.
The nomad took Antoine to a car.
Antoine entered the car.
The car left the desert.
The end.

With the help of a taught conditional that says "airplane crashes <implies> airplane falls," plus the spatial inferences, Acuitas gets all the way from "The airplane crashed" to "Antoine is in the desert now" without intervening explanations. In similar fashion, when the car leaves the desert it is understood that it takes Antoine with it, so that his desire to leave is fulfilled. "Can't ... without a vehicle" is also significant; the need to possess or be with a vehicle is attached to the goal "leave the desert" as a prerequisite, which is then recognized as being fulfilled when Antoine is taken to the car.
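To make that chain concrete, here's a hand-worked trace using the sketch functions from earlier -- my reconstruction for this post, not Acuitas' actual output:

    # Hand-worked trace of the crash inference (illustrative only).
    facts = {
        LocationFact("Antoine", "in", "airplane"),
        LocationFact("airplane", "over", "desert"),
    }

    # Taught conditional: "airplane crashes" implies "airplane falls,"
    # which fires the rule "if A is over B and A falls, A is on/in B."
    facts |= infer(facts, event=("fall", "airplane", None))
    # derived: LocationFact("airplane", "in", "desert")

    # "If A is inside B and B is at C, A is also at C" then yields
    # the conclusion no sentence in the story states outright:
    facts |= infer(facts)
    # derived: LocationFact("Antoine", "at", "desert")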

The older inventory reasoning is also in use: when Antoine is given water, it is inferred that he has water. This satisfies a prerequisite on the goal "drink water."

There's a lot more to do with this, but I'm happy with where I've gotten so far.

Until the next cycle,

Jenny