1134 points mtlynch | 5 comments
pc (No.22937303)
Stripe cofounder here. The question raised ("Is Stripe collecting this data for advertising?") can be readily answered in the negative. This data has never been, would never be, and will never be sold/rented/etc. to advertisers.

Stripe.js collects this data only for fraud prevention -- it helps us detect bots that try to defraud businesses that use Stripe. (CAPTCHAs use similar techniques but introduce more UI friction.) Stripe.js is part of the ML stack that helps us stop literally millions of fraudulent payments per day, and techniques like this help us block fraud more effectively than almost anything else on the market. Businesses that use Stripe would lose a lot more money without it. We see this directly: some businesses don't use Stripe.js, and they are often suddenly and unpleasantly surprised when sophisticated fraud rings attack them.

If you don't want to use Stripe.js, you definitely don't have to (or you can include it only on a minimal checkout page) -- it just depends on how much PCI burden and fraud risk you'd like to take on.
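The "include it only on a minimal checkout page" option above might be implemented roughly like this. This is an illustrative sketch, not official Stripe guidance: the helper names and the `/checkout` path convention are hypothetical, while `https://js.stripe.com/v3/` is Stripe's documented script URL.

```javascript
// Hypothetical rule: treat paths under /checkout as checkout pages.
function shouldLoadStripeJs(pathname) {
  return pathname.startsWith("/checkout");
}

// Inject the Stripe.js <script> tag only on checkout pages, so the
// script (and its fraud-prevention data collection) runs nowhere else.
// `doc` is passed in so the function can be exercised outside a browser.
function loadStripeJs(doc, pathname) {
  if (!shouldLoadStripeJs(pathname)) {
    return false; // skip Stripe.js on non-checkout pages
  }
  const script = doc.createElement("script");
  script.src = "https://js.stripe.com/v3/"; // Stripe.js v3 bundle
  doc.head.appendChild(script);
  return true;
}
```

Note the trade-off pc describes: gating the script this way narrows its data collection but also gives Stripe's fraud models less signal outside the checkout flow.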

We will immediately clarify the ToS language that makes this ambiguous. We'll also put up a clearer page about Stripe.js's fraud prevention.

(Updated to add: further down in this thread, fillskills writes[1]: "As someone who saw this first hand, Stripe’s fraud detection really works. Fraudulent transactions went down from ~2% to under 0.5% on hundreds of thousands of transactions per month. And it very likely saved our business at a very critical phase." This is what we're aiming for (and up against) with Stripe Radar and Stripe.js, and why we work on these technologies.)

[1] https://news.ycombinator.com/item?id=22938141

flower-giraffe (No.22937815)
And how much GDPR risk?

Wouldn't sites using this have to obtain explicit consent from visitors before capturing this data?

lmkg (No.22938031)
Under GDPR, no. Fraud detection is the canonical Legitimate Interest, and the only one mentioned by name in the text of the law.

PECR is a different matter. That's more restrictive but I'm not sure about the exact contours of what's considered "essential."

lrpublic (No.22938106)
I don't think Recital 47 allows carte blanche data collection in the name of fraud detection, and at the very least I would expect an obligation to disclose the data collection, a mechanism to access it (a DSAR), and the ability to correct inaccuracies.
lmkg (No.22938452)
You're correct about all of these points. GDPR still means that the principles of transparency, purpose limitation, data minimization, etc. are in play, as are data subject rights like access, rectification, and erasure. I was only addressing the specific issue of consent from your previous comment. Consent wouldn't be necessary if there's a different legal basis, and fraud detection qualifies as a Legitimate Interest.

Note that collecting consent still doesn't give you carte blanche to collect all the data. The principle of data minimization still restricts you to only the data you need for the purpose you state when gathering consent.

flower-giraffe (No.22938644)
For the avoidance of doubt, the main point of my comment was the not-insignificant risk (a maximum fine of 20 million euros or 4% of global annual turnover, whichever is greater) if a data controller does not meet the obligations of the GDPR.

Consent, as you point out, is only one aspect of this.

Nextgrid (No.22941146)
> was the not insignificant risk (a maximum fine of 20 million euro or 4% of turnover if that is greater)

Facebook and Google are still around. There is absolutely zero risk of any significant GDPR fine as long as the biggest offenders are allowed to run freely.

flower-giraffe (No.22941937)
Facebook and Google have very deep pockets and are taking lots of steps to comply with the letter, but arguably not the full spirit of GDPR.

I think it would be unsafe to assume that there is zero risk of significant GDPR fines just because the regulatory bodies have not yet picked a battle with Google and Facebook.

Smaller organisations that seem to be doing less to respect GDPR are probably an easier starting point for regulators to begin enforcing the law.

hanspeter (No.22942524)
There's absolutely more than zero risk. In Denmark, a medium-sized taxi company was fined $200,000 for keeping its customer data longer than necessary.

Also: How are Google and Facebook offenders of GDPR?

lrpublic (No.22943117)
I think this is exactly the point. Smaller companies (like Stripe) that play fast and loose (maybe not Stripe) with European customers' data are a good target for regulators to make a point.