Discussion about this post

Claudine Notacat

I noticed this phenomenon after learning how to knit. I told a friend that sweaters in a store were “screaming information at me,” which is an insane-sounding way of putting it, but it was just pretty neat how I could *see* all this information I didn’t know was there before, about fiber content, construction techniques, etc.

David Kiferbaum

"Ironically, if I were trying to create an AI system that could see, and reason about, the world more effectively (and who says I’m not!) I would be trying to find ways to inject more of this kind of embodied, valence-laden affect into its perceptual apparatus." I think this is exactly what people do when they assign a persona to a chatbot (e.g., "you are an expert at Jungian dream analysis. Last night I dreamt that..."). The persona seems to more effectively "constrain" the statistical space of token prediction, often improving output quality. It's possible that we're all socially encouraged to do this, take on expertise that implicitly narrows our affect in order to improve the quality of our labor output. The humanities sort of acknowledge this by (ideally) broadening our conceptual space and supposedly enriching the experience of life (before eventually being pushed into expertise). To that end, seems like market forces might be a drag for humans and for AI.

