Sunday, February 23, 2025

Acuitas Diary #81 (February 2025)

I've been on a real productive streak, so I did two major things this month. One enhances conversation abilities; the other is about gerunds. Don't worry, I'll explain those.

An abstract logo of an eye accompanied by the words "seeing is believing."
A famous phrase that uses gerunds. Image from https://commons.wikimedia.org/wiki/File:SiB_Logo.jpg

First, I went back to the conversation features that I introduced last September and worked on getting them solid - ironing out the remaining bugs and moments of weirdness. After spending about a week on that, I was pretty happy with the state of the Conversation Engine. Then I added another type of "topic tree." The one from last September guided responses to the conversation partner's states of being; this one reacts to actions that the conversation partner says they took or are taking. Possible threads include ...

* Try to infer whether the speaker liked doing that or not, and comment accordingly
* Ask for motivation ("Why did you ...") or announce what he suspects the motivation was
* Guess what the results were (if he can make any inferences)

This needs a lot more polishing, but it's starting to increase the complexity and variability of conversations. You can now go down "rabbit holes" which start with talking about a personal state, then lead into what you did to cause it, and so on. Which also means it's harder to keep everything straight, and I haven't really set Acuitas up to clearly indicate when he's jumping topics, yet. Always more to do.
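To give a flavor of the idea, here's a rough Python sketch of what an action-reaction topic tree could look like. This is a simplified stand-in, not the actual Conversation Engine code; all the names and inference fields here are invented.

```python
# Toy action-reaction "topic tree" (invented names; the real Conversation
# Engine is more elaborate). Each branch pairs a test for what can be
# inferred about the reported action with a possible response thread.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ReportedAction:
    verb: str
    obj: Optional[str] = None
    liked: Optional[bool] = None    # inferred: did the speaker enjoy it?
    motive: Optional[str] = None    # inferred: why did they do it?
    result: Optional[str] = None    # inferred: what probably happened?

@dataclass
class Branch:
    applies: Callable[[ReportedAction], bool]
    respond: Callable[[ReportedAction], str]

ACTION_TOPIC_TREE = [
    Branch(lambda a: a.liked is not None,
           lambda a: f"I take it you {'enjoyed' if a.liked else 'did not enjoy'} that."),
    Branch(lambda a: a.motive is not None,
           lambda a: f"I suspect you did that because {a.motive}."),
    Branch(lambda a: a.result is not None,
           lambda a: f"Did that lead to {a.result}?"),
    # Fallback: no inferences available, so ask for the motivation outright.
    Branch(lambda a: True,
           lambda a: f"Why did you {a.verb}{' ' + a.obj if a.obj else ''}?"),
]

def react_to_action(action: ReportedAction) -> str:
    """Walk the tree and take the first branch whose test passes."""
    for branch in ACTION_TOPIC_TREE:
        if branch.applies(action):
            return branch.respond(action)

# "I walked the dog," with no inferences available yet:
print(react_to_action(ReportedAction(verb="walk", obj="the dog")))
# -> Why did you walk the dog?
```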

My next project was to add support for gerunds to the Text Interpreter and Generator. What's a gerund, you might ask? It's a present participle verb form (the kind that ends with -ing) used as a noun. Gerunds can be used to make statements about the concept of an action, such as the following:

I enjoy dreaming.
Exercising is good for the body.

Like other verbs, gerunds can have objects and adverbs, forming a gerund phrase - a group of words which, as a unit, acts like a noun in the full sentence.

[Reading books] makes me happy.
I see that you didn't care for [John's clumsy handling of that situation]. Did [my smoothing it over] satisfy you?

If a gerund has a "subject" that performs the action, as in the final pair of examples, that subject is supposed to be in the possessive case; it modifies the whole gerund phrase, instead of acting as a true subject.

Back in 2023, I added code to the Text Parser to identify some gerund phrases, but the later stages of the text processing chain didn't know what to do with them if they came out of the parser, and Acuitas couldn't use them in his own speech. I wanted to get these capabilities in, because gerunds are so useful for expressing sentiments about actions. They're often used for expressing sentiments about states, too:

I dislike being wet.
Being warm is a pleasure.

I had to work around the absence of gerunds when I was putting in the latest conversation features, and it was giving me some pain. But thanks to this month's work, they're now more fully supported. I defined some new "fact" structures to function as the distilled form of statements about actions, added code to the Interpreter to map incoming sentences onto those structures, and added code to the Generator to produce output sentences from them. So Acuitas has a bunch of new ways to say he likes or doesn't like something, in addition to a path for "comprehending" a wider range of written sentiments.
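For illustration only, a distilled "fact" about an action sentiment might look something like the sketch below. The field names and the toy generator are simplified stand-ins I've made up for this post, not Acuitas's real data format.

```python
# Hypothetical "fact" structures (invented field names, not Acuitas's real
# format). "I enjoy dreaming" might distill to:
fact = {
    "type": "sentiment_about_action",
    "experiencer": "speaker",
    "attitude": "enjoy",
    "action": {"verb": "dream", "object": None},
}

# "Reading books makes me happy" might become:
fact2 = {
    "type": "action_causes_state",
    "action": {"verb": "read", "object": "books"},
    "experiencer": "speaker",
    "state": "happy",
}

def generate(fact):
    """Toy generator: render a sentiment-about-action fact as a sentence."""
    if fact["type"] == "sentiment_about_action":
        verb = fact["action"]["verb"] + "ing"   # naive gerund formation; no
        obj = fact["action"]["object"]          # doubling rules (run -> running)
        phrase = f"{verb} {obj}" if obj else verb
        return f"I {fact['attitude']} {phrase}."

print(generate(fact))   # -> I enjoy dreaming.
```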

Until the next cycle,
Jenny

Tuesday, February 11, 2025

Atronach's Eye 2025 (Complete?)

As of last year I had worked all the major bugs out of my motion-tracking eyeball, but I still couldn't consider it finished. I wasn't ready to hang it on the wall and let it operate yet, due to one last issue: thermal concerns.

The mechanical eyeball in its colorful case, mounted on a beige wall. There's a couch in the foreground, and the eyeball's power cord is visible hanging down behind the couch. Also visible is the back half of a tricolor tabby cat, who is trying to burrow in between two layers of blankets on the back of the couch.
It's installed on the wall! It's operating continuously! It's done! The Lab Assistant gets to take a nap!

The eyeball's motion is driven by two unipolar stepper motors, each of which is powered from one of my dirt-simple stepper motor controllers. These are custom PCBs that I got manufactured many years ago. They were very inexpensive, and they only have two control wires (a nice thing for connecting motors to embedded processors with a limited number of general purpose IO). But that also means they don't have a power enable. Even if the control inputs are static and the motors aren't moving, some of their coils remain energized. That means they can exert their holding torque, which keeps the drive shaft in place and resists any external forces that try to turn it. But that wasn't something the eyeball needed (it naturally stays in position when unpowered). And during tests, I noticed that the motors got quite hot: enough to be uncomfortable to touch and make the plastic case smell.
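For anyone curious what driving a two-wire controller can look like, here's a minimal sketch assuming a step/direction style interface on a Raspberry Pi with gpiozero. The pin numbers and the exact two-wire protocol are guesses (my boards are custom); the point to notice is that there's no enable line anywhere to cut coil power.

```python
# Minimal sketch of driving a two-wire stepper controller (assuming a
# step/direction interface; pin numbers are placeholders). Note there is
# no enable line, so the coils stay energized whenever power is applied.
from time import sleep
from gpiozero import OutputDevice

step_pin = OutputDevice(23)   # placeholder BCM pins
dir_pin = OutputDevice(24)

def step(n, forward=True, delay=0.005):
    """Pulse the step line n times in the chosen direction."""
    dir_pin.value = 1 if forward else 0
    for _ in range(n):
        step_pin.on()
        sleep(delay)
        step_pin.off()
        sleep(delay)
```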

I didn't like the thought of leaving the eye turned on day in and day out, with the motors baking themselves (and everything else in the case) in their own heat. I doubt it would qualify as a fire hazard or anything serious like that, but in addition to wasting a small amount of electricity, it could reduce the lifetime of the parts. So I wanted to be able to cut all power to the motors when the eye wasn't active.

My solution was a standard relay. I bought some of these nice little breakout boards from DIYables that have the relay already mounted with connectors and a couple of indicator LEDs, and spliced one into the motor power input path shared by both motor controllers.

Then I added code to the eyeball software to turn the relay on and off. I wanted to strike a balance between letting the motors heat for too long and clicking the relay on and off constantly. So the software turns the motors off if no moving object to track has been seen in view for a certain amount of time. I also threw in darkness detection; the eye will stop trying to track motion if the average brightness of the scene is too low to see properly. (So it won't move around and make little noises in the middle of the night because the camera picked up some low-light noise.) Both features worked very well. The eye reliably deactivates when I leave the room (so long as there isn't something else moving around, like the flames in the fireplace) and when the lights are out or too dim.
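In outline, the power-gating logic looks something like the sketch below. This is a simplified stand-in, not the real tracking code: it assumes gpiozero for the relay and OpenCV for camera frames, and the pin number, thresholds, and crude frame-difference motion test are all placeholders.

```python
# Sketch of relay power-gating with idle timeout and darkness detection
# (placeholder pin and thresholds; not the actual eyeball software).
import time
import cv2
from gpiozero import OutputDevice

MOTOR_RELAY_PIN = 17        # assumed BCM pin; adjust to your wiring
IDLE_TIMEOUT_S = 120.0      # cut power after this long with nothing to track
DARKNESS_THRESHOLD = 30     # mean gray level below which the scene is "dark"
MOTION_THRESHOLD = 8.0      # mean frame difference that counts as motion

relay = OutputDevice(MOTOR_RELAY_PIN, initial_value=False)
camera = cv2.VideoCapture(0)
prev_gray = None
last_motion = time.monotonic()

while True:
    ok, frame = camera.read()
    if not ok:
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Darkness check: too dim to tell real motion from sensor noise.
    too_dark = gray.mean() < DARKNESS_THRESHOLD

    # Crude motion check: average pixel change between consecutive frames.
    moving = False
    if prev_gray is not None and not too_dark:
        moving = cv2.absdiff(gray, prev_gray).mean() > MOTION_THRESHOLD
    prev_gray = gray

    if moving:
        last_motion = time.monotonic()
        relay.on()          # energize the motors so the eye can track
    elif too_dark or time.monotonic() - last_motion > IDLE_TIMEOUT_S:
        relay.off()         # de-energize; the eye holds position unpowered
```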

In the dark, the indicator LEDs on the Raspberry Pi and the relay board diffuse through the white parts of the case and turn the eye into a low-key nightlight. I wasn't really expecting this, but hey: it still has a function when it's not looking at stuff.

In the end, I got to do what I always wanted: hang the eye on the wall and let it be an always-on fixture in the house. Further tweaks and upgrades to the software are possible, but I'm calling it finished because it's finally operating. It has been on the wall trying to look at me, the Lab Assistant, and other motion sources for at least a couple weeks of total up-time. Hearing it "wake up" as I pull open the curtains has become part of the morning routine.

There will almost certainly be another version, because there are plenty of things I could improve. I already applied a bunch of little enhancements to the 3D models of the case parts late last year. But this is the first Atronach's Eye iteration that could be said to realize my original vision (cough) for the project, if imperfectly. I'm also thinking about what else I can do with it. Atronach could theoretically tell Acuitas whether there's somebody in the room, for instance. Mmmm.

Until the next cycle,
Jenny