
Building Effective "Agents"

(www.anthropic.com)
596 points | jascha_eng | 1 comment
simonw ◴[] No.42475700[source]
This is by far the most practical piece of writing I've seen on the subject of "agents" - it includes actionable definitions, then splits most of the value out into "workflows" and describes those in depth with example applications.

There's also a cookbook with useful code examples: https://github.com/anthropics/anthropic-cookbook/tree/main/p...

Blogged about this here: https://simonwillison.net/2024/Dec/20/building-effective-age...

replies(6): >>42475903 #>>42476486 #>>42477016 #>>42478039 #>>42478786 #>>42479343 #
Animats ◴[] No.42478039[source]
Yes, they have actionable definitions, but they are defining something quite different than the normal definition of an "agent". An agent is a party who acts for another. Often this comes from an employer-employee relationship.

This matters mostly when things go wrong. Who's responsible? The airline whose AI agent gave out wrong info about airline policies found, in court, that their "intelligent agent" was considered an agent in legal terms. Which meant the airline was stuck paying for their mistake.

Anthropic's definition: Some customers define agents as fully autonomous systems that operate independently over extended periods, using various tools to accomplish complex tasks.

That's an autonomous system, not an agent. Autonomy is about how much something can do without outside help. Agency is about who's doing what for whom, and for whose benefit and with what authority. Those are independent concepts.
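The distinction is easier to see in code. Below is a hypothetical minimal sketch (not from the article or cookbook; the `run_agent`, `model`, and `tools` names are illustrative) of the "fully autonomous" pattern Anthropic's definition describes: a model loop that picks its own tools and decides when it is done, as opposed to a workflow where the sequence of steps is fixed in advance.

```python
# Hypothetical sketch of an autonomous, tool-using loop.
# The model chooses its own next action each turn; nothing
# outside the loop dictates the sequence of tool calls.

def run_agent(task, model, tools, max_steps=10):
    """Iterate until the model declares the task finished or we hit max_steps."""
    history = [task]
    for _ in range(max_steps):
        action = model(history)           # model decides the next step itself
        if action["type"] == "finish":
            return action["result"]
        tool = tools[action["tool"]]      # model-selected tool, not a fixed pipeline
        history.append(tool(action["input"]))
    return None                           # gave up: no answer within max_steps
```

A fixed workflow would replace the loop with a hard-coded chain of calls; the "autonomy" here is exactly that the control flow is delegated to the model.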

replies(5): >>42478093 #>>42478201 #>>42479305 #>>42480149 #>>42481749 #
simonw ◴[] No.42478201[source]
Where did you get the idea that your definition there is the "normal" definition of agent, especially in the context of AI?

I ask because you seem very confident in it - and my biggest frustration about the term "agent" is that so many people are confident that their personal definition is clearly the one everyone else should be using.

replies(2): >>42478826 #>>42478885 #
PhilippGille ◴[] No.42478826[source]
Didn't he mention it was the court's definition?

But I'm not sure that's true. The court didn't define anything; on the contrary, it only said (in simplified terms) that the chatbot was part of the website and that it's reasonable to expect the info on their website to be accurate.

The closest I could find to the chatbot being considered an agent in legal terms (an entity like an employee) is this:

> Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.

Source: https://www.canlii.org/en/bc/bccrt/doc/2024/2024bccrt149/202...