Doesn't it feel wrong to the author to snoop through that private information? And publishing it in a news article definitely crosses a line.
It feels a little strange at first, but I suspect (correctly or not) that he sought and received the daughter's permission first, although I didn't see any direct statement. The daughter is 18 or so now (maybe, adding up the timeline).
The article is as much about the humdrumness of family life as about what Alexa and Amazon hear. I am glad I read it. Puts life (and some parts of technology) in perspective.
Of course. The point is to snoop on people to make better "recommendations". Dystopian.
Then they get all that juicy "accidental activation" data on top of that.
Amazon is also an ecosystem. Alexa shows you notifications from Amazon, like the status of a delivery. It's able to call others (great for family).
Amazon also has the Fire Kids tablet, Fire TV, etc.
And since I already use Amazon anyway, I'd be quite happy if Amazon recommended good products I'd like.
For plenty of things, Alexa is a very good UI.
1) What sensitive information was published in this article besides some superficial listening preferences and some Alexa interactions we have all had? I'm not sure identifying the extent of the use of the prefix "omni" is particularly sensitive information. It's not like anyone was divulging personal preference by asking for definitions.
2) What makes you think the author didn't run it by their family before submitting the story?
Well of course, only Amazon should have this info 8-/
This whole thing is truly disturbing.
And the millennial expectation that "OF COURSE the monopolistic corps should know everything" is by far the most disturbing part of all.
When in the next decade or two, people find themselves truly and irreversibly f_cked by corporate over-dominance, it will largely be their own fault...
> What is this marketing, why is it happening, and is it widely accepted in that society?
What?
Your discomfort or offense makes you think that someone encountering a word means it was "marketed" to them, as opposed to that concept simply existing in society, where others will encounter it. Identities other than yours can exist just as openly as yours, without their acceptance needing to be justified to you.
The piece of the puzzle you're missing is that a child understanding what homosexuality is (for example) is just as mundane as their learning what heterosexuality is. The world is not going back to these other identities hiding themselves, so you can either accept it or spend the rest of your life uncomfortable about it. You have free will.
Your experiences are very different from my own. I struggle to remember meeting anyone that thought this. Mostly people are just apathetic.
And apathy is what allowed all of history's greatest crimes to happen, no matter the political ideology, the skin color, the age.
As for the argument of "OF COURSE the monopolistic corps should know everything" itself... I kinda get it. Google at least used to provide a decent service to the end users in exchange for all the data, but they've gone completely off the rails the last few years.
Ever since Google fucked up social media by requiring verification with Google+ they've been pretty bad. That was 14 years ago.
As an example, the part of the article about questions his daughter has asked Alexa reflects things no different from what you might type into a search engine. But he describes it as "Coco’s relationship with Alexa...", a term I'm confident he wouldn't use to describe her typing the same things into Google. You could maybe make the argument that it's different because people ask Alexa things they wouldn't just search for, but that potentially interesting distinction is unexplored by the author.
I'm not aware of anything covering this, but I think there's some interesting potential looking into how humans see technology as more human if they can communicate with it in a human way, regardless of whether or not it otherwise displays aspects of humanity. Generative AI falls into this category too I think. People view it as way more intelligent than it actually is because you can sort of converse with it like a human.
> Mostly people are just apathetic
and
> OF COURSE the monopolistic corps should know everything
Are, de facto, equivalent.
Not caring allows the profit-motivated non-human legal entity to pursue whatever course of action it desires.
Not caring is granting permission for that action.
I'm going out on a limb here, and I hope I don't ruin my argument with this stretched analogy:
This reminds me of the situation people enter into when they are legally married. They agree to the legal terms and liabilities of a contract without disclosure of those terms. People are informed by family, peers and society that marriage is an expression of their love and devotion to each other. Then, those unfortunate enough to find themselves in a contentious divorce, discover that the many volumes of their state's family legal code don't actually contain any language at all about love and/or devotion.
There is a lot of language about who gets the money and/or the kids.
They're now subject to the rules that they agreed to when they said "I do", even though they had no idea about those rules at the time.
Is this analogy just bitter divorce vitriol? Yes, yes it is. And I hope you never have to experience seeing things from this perspective.
But to explicitly complete the analogy: the owners of whatever snoop device cared enough to buy it, they cared enough to AGREE TO THE EULA!, they cared enough to let that thing monitor their home conversations for years, but they were apathetic about the long-term consequences of granting a profit-motivated non-human legal entity all the rights in that many-paged EULA that they so eagerly clicked OK to without reading.
Now we get to see a little glimmer of the consequences of that OK. And trust me, this story is by no means the end of those consequences.
This is facilitated by, and will continue to get worse because of, the same thing: mostly, people are just apathetic.
I don't know why, but certain types of people seem easily fooled into thinking that LLMs really are like a real person. I have to imagine that either these people don't actually need things that currently only a real person can provide, or they're just happy enough with what the LLM spits out to be unable to tell the difference.
Which doesn't make any sense to me, because whenever I talk to an LLM, I can pretty easily tell that it's nowhere close to a real person. As an example, I never use LLMs for conversation, because speaking to one is not in any way fulfilling to me the way speaking to another real person is. I usually use LLMs for creative writing instead, but they're terrible at anything that hasn't already appeared more or less verbatim in their training data. They're not nearly as generalizable as the media would have you believe. All they can do is spit out sentences that look like sentences from real stories; they don't actually have any conception of the story, or a visualization of the scene that they can then describe the way a person would. They don't simulate any of the story or imagine anything like I do.
I have to wonder if the people who are so fooled by LLMs are just non-autistic people. If non-autistic brains work based on patterns rather than how I work based on strict logic, that could explain why something that appears to show patterns of a person would then be perceived as a person to them.
But I dunno, that suggests that non-autistic people are somehow generally simpler/dumber than autistics and I wouldn't want to just assume that.
https://www.amazon.com/dp/B07GBM7TZJ seems like a totally normal price to pay for eggs these days, although you probably wouldn't just buy a dozen eggs in isolation, given delivery fees and driver tip.
https://en.wikipedia.org/wiki/Room_641A
No one needs to know what else you do.
It's a lot more suspicious to not tell big brother what he wants to hear.
And besides that strategy, dystopian stories sell better. No one would read feel-good descriptions.
You can't build, for example, the US' system of slavery without the apathy of a bunch of white people. And that's the only thing that really maintained that system - as soon as a good chunk of white people started caring, it collapsed.