
419 points serjester | 2 comments | | HN request time: 0.399s | source
wiradikusuma ◴[] No.43536155[source]
Booking a flight is actually a task I cannot outsource to a human assistant, let alone AI. Maybe it's a third-world problem or just me being cheap, but there are heuristics involved when booking flights for a family trip or even just for myself.

Check the official website, compare pricing with aggregators, check other dates, check people's availability on the cheap dates. Sometimes I only do the first step, if the official price is reasonable (I travel 1-2x a month, so I have an expectation of how much it should cost).

Don't get me started if I also consider which credit card to use for the points rewards.

replies(8): >>43536336 #>>43536536 #>>43537460 #>>43538122 #>>43539691 #>>43540935 #>>43542337 #>>43547348 #
victorbjorklund ◴[] No.43536536[source]
I don't really need an AI agent to book flights for me (I just don't travel enough for it to be a burden), but aren't those arguments for an AI agent? If you just wanna book the next flight London to New York, it isn't that hard. A few minutes of clicking.

But if you wanna find the cheapest way to get to A, compare different retailers, check multiple people's availability, calculate the effects of credit cards, etc., it takes time. Aren't those things that could be automated with an agent that can find the cheapest flights, propose dates, check availability with multiple people via a messaging app, calculate which credit card to use, etc.?

replies(2): >>43536692 #>>43536703 #
1. Jianghong94 ◴[] No.43536703[source]
Yep, that's what I've been thinking. This shouldn't be that hard: at this point LLMs should already have all the 'rules' (e.g. credit card A buys flight X, gives you m points, which can be converted into n miles) in their params, or can easily query the web to get them. Devs need to encode the whole thing into a decision mechanism and, once executed, ask the LLM to chase down the specific path (e.g. bombard the ticket office with emails).
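A minimal sketch of what that "decision mechanism" could look like, separate from the LLM itself: encode the reward rules as plain data and pick the card that minimizes effective out-of-pocket cost. All card names, multipliers, and point values here are made-up assumptions for illustration.

```python
# Hypothetical card reward rules: card -> (points earned per dollar,
# estimated dollar value per point). These numbers are invented.
CARD_RULES = {
    "card_a": (2.0, 0.0125),  # 2x points, ~1.25 cents/point when converted to miles
    "card_b": (1.0, 0.02),
    "card_c": (3.0, 0.01),
}

def effective_cost(fare: float, points_per_dollar: float, value_per_point: float) -> float:
    """Fare minus the estimated dollar value of the points it earns."""
    return fare - fare * points_per_dollar * value_per_point

def best_card(fare: float, rules: dict = CARD_RULES) -> tuple[str, float]:
    """Return (card, effective_cost) for the card with the lowest effective price."""
    return min(
        ((card, effective_cost(fare, ppd, vpp)) for card, (ppd, vpp) in rules.items()),
        key=lambda pair: pair[1],
    )

card, cost = best_card(400.0)
```

The point being: the rules live in ordinary code the user can audit, and the LLM's job is only the fuzzy parts (scraping fares, chasing the ticket office), not the arithmetic.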
replies(1): >>43541892 #
2. antihipocrat ◴[] No.43541892[source]
And what happens in the 1% of cases where this fails? At the moment the responsibility is on the person. If I incorrectly book my flight for date X, then receive the itinerary and realise I chose the wrong month, well, damn, I made a mistake and will have to rectify it.

An LLM could organise flights with a lower error rate, however, when it goes wrong what is the recourse? I imagine it's anger and a self-promise never to use AI for this again.

If you're saying that the AI just supplies suggestions, then maybe it's useful. Though wouldn't people still be double-checking everything anyway? Not sure how much effort this actually saves.