1013 points QuinnyPig | 9 comments
consumer451 ◴[] No.44564348[source]
Important details from the FAQ, emphasis mine:

> For users who access Kiro with Pro or Pro+ tiers once they are available, your content is not used to train any underlying foundation models (FMs). AWS might collect and use client-side telemetry and usage metrics for service improvement purposes. You can opt out of this data collection by adjusting your settings in the IDE. For the Kiro Free tier and during preview, your content, including code snippets, conversations, and file contents open in the IDE, unless explicitly opted out, may be used to enhance and improve the quality of FMs. Your content will not be used if you use the opt-out mechanism described in the documentation. If you have an Amazon Q Developer Pro subscription and access Kiro through your AWS account with the Amazon Q Developer Pro subscription, then Kiro will not use your content for service improvement. For more information, see Service Improvement.

https://kiro.dev/faq/

replies(4): >>44565507 #>>44565980 #>>44567912 #>>44569042 #
srhngpr ◴[] No.44565507[source]
To opt out of sharing your telemetry data in Kiro, use this procedure:

1. Open Settings in Kiro.

2. Switch to the User sub-tab.

3. Choose Application, and from the drop-down choose Telemetry and Content.

4. In the Telemetry and Content drop-down field, select Disabled to disable all product telemetry and user data collection.

source: https://kiro.dev/docs/reference/privacy-and-security/#opt-ou...

replies(1): >>44566830 #
1. m0llusk ◴[] No.44566830[source]
Is there a way to confirm this works or do we just have to trust that settings will be honored?
replies(3): >>44566856 #>>44567741 #>>44568474 #
2. consumer451 ◴[] No.44566856[source]
You could place some unique canary strings in your code, then check whether they appear as completions in future foundation models? Maybe?
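The "unique strings" idea above is essentially a canary token: embed a string that could not plausibly occur anywhere else, then grep for it in model output later. A minimal sketch of generating one (the `CANARY-` prefix and project label are just illustrative conventions, not anything Kiro or AWS defines):

```python
import uuid

def make_canary(project: str) -> str:
    """Generate a unique, greppable canary string to embed in a source comment.

    If this exact token ever surfaces in a model's completions, the file
    it was embedded in was likely used as training data.
    """
    # uuid4 gives 122 bits of randomness, so an exact match is never a coincidence
    return f"CANARY-{project}-{uuid.uuid4().hex}"

print(make_canary("myproject"))
```

You would paste the output into a comment in your repo and record it somewhere private, since losing the token defeats the test.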

I am nowhere near being a lawyer, but I believe the promise would be more legally binding, and more likely to be adhered to, if money were exchanged. Maybe?

The "Amazon Q Developer Pro" sub they mention appears to be very inexpensive. https://aws.amazon.com/q/pricing/

3. Waterluvian ◴[] No.44567741[source]
Just like using an AI model, you can’t actually know for sure that it won’t do anything malicious with the interfaces you give it access to. You just have to trust it.
replies(1): >>44569703 #
4. pmontra ◴[] No.44568474[source]
As for everything else: trust, possibly enhanced by the fear of consequences for the other party.

How do we know whether some random internet service sells our email/password pair? They probably store the password hashed, because using a library is easier than rolling their own, but they still receive it as cleartext every time we type it in.
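The pattern described above, in a minimal stdlib sketch: the server derives and stores only a salted hash, yet the plaintext password passes through its hands on every login, so you are still trusting it not to log or sell that plaintext. (Parameter choices here, like the iteration count, are illustrative, not a recommendation from the thread.)

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Derive a salted PBKDF2 hash; only (salt, digest) needs to be stored."""
    # NB: the plaintext still reaches this function on every login attempt
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash from the submitted plaintext and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, stored)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))
print(verify_password("wrong", salt, digest))
```

The stored digest is useless to a database thief, but that protects you only against leaks of storage, not against a dishonest service that keeps the plaintext it receives.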

replies(2): >>44568663 #>>44569692 #
5. Quekid5 ◴[] No.44568663[source]
> How do we know whether some random internet service sells our email/password pair? They probably store the password hashed, because using a library is easier than rolling their own, but they still receive it as cleartext every time we type it in.

For that, we can just use a unique password per service. That's not really a thing for code.

6. rusk ◴[] No.44569692[source]
> How do we know if random internet service

Audits. Obviously not every service is going to be in a jurisdiction that proactively audits data processors and controllers. Another thing to consider before you hand over your data.

7. dkga ◴[] No.44569703[source]
Well, you can at least check if there is network traffic to AWS or something similar.
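A rough sketch of that check with standard tools. The process name `kiro` and the `amazonaws.com` hostnames are assumptions about what the IDE's traffic would look like, not documented endpoints:

```shell
# Established TCP connections from the IDE process, with hostnames resolved
# (drop lsof's -n so addresses show as names we can filter on):
lsof -i TCP -P | grep -i kiro | grep ESTABLISHED

# Or watch DNS lookups to see which hosts the app resolves at all:
sudo tcpdump -l -n -i any port 53 | grep -i amazonaws
```

As the reply below notes, this only tells you traffic exists, not what is in it; the telemetry and the model queries would look the same on the wire.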
replies(2): >>44569735 #>>44570311 #
8. yurishimo ◴[] No.44569735{3}[source]
But wouldn't that look the same as actually querying the model? Or am I missing the joke?
9. Waterluvian ◴[] No.44570311{3}[source]
There are always ways to mitigate malicious behaviour once it’s already happening.