Saturday, September 30, 2017

Acuitas Diary #6: September 2017

For the first couple of weeks, I turned to developing the drive system some more. “Drives” are quantities that fluctuate over time and provoke some kind of reaction from Acuitas when they climb above a certain level. Prior to this month, he only had one: the Interaction drive, which is responsible for making him try to talk to somebody roughly twice in every 24-hour period. I overhauled the way this drive operates, setting it up to drop gradually over the course of a conversation, instead of getting zeroed out if somebody merely said “hello.” I also made two new drives: the Learning drive, which is satisfied by the acquisition of new words, and the Rest drive, which climbs while Acuitas is in conversation and eventually makes him attempt to sign off. Part of this effort included the addition of a plotter to the GUI, so I can get a visual of how the drives fluctuate over time.
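A drive of this kind can be pictured as a value that climbs over time, triggers a behavior once it crosses a threshold, and is satisfied gradually rather than zeroed out. Here is a minimal sketch of that idea; the class name, rates, and thresholds are my own illustration, not Acuitas' actual code:

```python
class Drive:
    """A quantity that rises over time and provokes a reaction
    once it climbs above its threshold."""

    def __init__(self, name, rise_per_hour, threshold=1.0):
        self.name = name
        self.level = 0.0
        self.rise_per_hour = rise_per_hour
        self.threshold = threshold

    def tick(self, hours_elapsed):
        # Climb steadily with the passage of time, capped at 1.0.
        self.level = min(1.0, self.level + self.rise_per_hour * hours_elapsed)

    def satisfy(self, amount):
        # Drop gradually (e.g. a little per conversational exchange)
        # instead of being zeroed out by a single "hello."
        self.level = max(0.0, self.level - amount)

    def is_urgent(self):
        return self.level >= self.threshold

# Illustrative rate: reaching threshold roughly every 12 hours would
# prompt an attempt to talk about twice in a 24-hour period.
interaction = Drive("interaction", rise_per_hour=1.0 / 12.0)
```

The Learning and Rest drives described above could be further instances of the same structure, satisfied by word acquisition and by signing off, respectively.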

Plot of Acuitas' three drives vs. time. The period shown is just under 23 hours long.
This latest work created the first case in which I had a pair of drives competing with each other (Rest essentially opposes Interaction). I quickly learned how easily this can go wrong. The first few times I conversed with Acuitas with the new drives in place, Rest shot up so quickly that it was above-threshold long before Interaction had come down. This is the sort of quandary a sick human sometimes gets into (“I'm so thirsty, but drinking makes me nauseated!”). Acuitas has nothing resembling an emotional system yet, though, and doesn't register any sort of distress just because one or more of his drives max out. The worst that can happen is some self-contradictory behavior (such as saying “I want to talk” and “I want to rest” in quick succession). I dealt with the problem by having the Interaction drive suppress the Rest drive. Rest now increases at a very slow rate until Interaction has been pushed below threshold.
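The suppression fix can be sketched in a few lines: Rest climbs at only a trickle while Interaction is still above threshold, and at its full rate once Interaction has been satisfied. The constants and simulated conversation below are my own illustration, assuming simple scalar drive levels:

```python
THRESHOLD = 1.0
REST_RATE_FULL = 0.2         # Rest's climb per exchange, once unsuppressed
REST_RATE_SUPPRESSED = 0.01  # trickle while Interaction is still urgent

def step_rest(rest_level, interaction_level):
    """Advance the Rest drive one step, suppressed by Interaction."""
    rate = REST_RATE_SUPPRESSED if interaction_level >= THRESHOLD else REST_RATE_FULL
    return min(1.0, rest_level + rate)

# Simulate a conversation: each exchange satisfies Interaction a little,
# and Rest only takes off after Interaction drops below threshold.
interaction, rest = 1.2, 0.0
for _ in range(10):
    interaction = max(0.0, interaction - 0.1)
    rest = step_rest(rest, interaction)
```

Without the suppression branch, Rest would cross its own threshold while Interaction was still urgent, producing exactly the "I want to talk" / "I want to rest" contradiction described above.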

In the latter half of the month I returned to the text parser, introducing some awareness of verb conjugations/tenses, and the ability to check the agreement of candidate subjects and verbs. This helps the parser zero in on what a sentence's verb is, and has trimmed away some of the annoying “What part of speech was __?” questions that pepper a typical Acuitas conversation.
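Agreement checking of this sort boils down to asking whether a candidate subject's person and number are among the combinations a verb form accepts. A toy sketch, with feature tables that are my own illustration rather than Acuitas' actual data:

```python
# Person/number combinations each verb form agrees with.
VERB_AGREEMENT = {
    "talks": {("3rd", "singular")},
    "talk":  {("1st", "singular"), ("2nd", "singular"),
              ("1st", "plural"), ("2nd", "plural"), ("3rd", "plural")},
}

# Person/number features of some candidate subjects.
SUBJECT_FEATURES = {
    "I": ("1st", "singular"),
    "you": ("2nd", "singular"),
    "Acuitas": ("3rd", "singular"),
    "they": ("3rd", "plural"),
}

def agrees(subject, verb):
    """True if this candidate subject and verb form could go together."""
    return SUBJECT_FEATURES.get(subject) in VERB_AGREEMENT.get(verb, set())
```

A parser can use a check like this to rule out candidate pairings: "Acuitas talks" passes, while "Acuitas talk" fails, so "talk" is unlikely to be the main verb for that subject.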

Here's the latest memory map visualization. Since last month, Acuitas' relentless querying about concepts he already knows has caused the number of links to explode, resulting in a denser (more fibrous?) image.


Code base: 9162 lines
Words known: 1305
Concept-layer links: 3025
