1134 points mtlynch | 51 comments
pc ◴[] No.22937303[source]
Stripe cofounder here. The question raised ("Is Stripe collecting this data for advertising?") can be readily answered in the negative. This data has never been, would never be, and will never be sold/rented/etc. to advertisers.

Stripe.js collects this data only for fraud prevention -- it helps us detect bots who try to defraud businesses that use Stripe. (CAPTCHAs use similar techniques but result in more UI friction.) Stripe.js is part of the ML stack that helps us stop literally millions of fraudulent payments per day and techniques like this help us block fraud more effectively than almost anything else on the market. Businesses that use Stripe would lose a lot more money if it didn't exist. We see this directly: some businesses don't use Stripe.js and they are often suddenly and unpleasantly surprised when attacked by sophisticated fraud rings.

If you don't want to use Stripe.js, you definitely don't have to (or you can include it only on a minimal checkout page) -- it just depends how much PCI burden and fraud risk you'd like to take on.

We will immediately clarify the ToS language that makes this ambiguous. We'll also put up a clearer page about Stripe.js's fraud prevention.

(Updated to add: further down in this thread, fillskills writes[1]: "As someone who saw this first hand, Stripe’s fraud detection really works. Fraudulent transactions went down from ~2% to under 0.5% on hundreds of thousands of transactions per month. And it very likely saved our business at a very critical phase." This is what we're aiming for (and up against) with Stripe Radar and Stripe.js, and why we work on these technologies.)

[1] https://news.ycombinator.com/item?id=22938141

replies(52): >>22937327 #>>22937331 #>>22937352 #>>22937362 #>>22937385 #>>22937475 #>>22937518 #>>22937526 #>>22937559 #>>22937599 #>>22937775 #>>22937815 #>>22937962 #>>22938015 #>>22938068 #>>22938208 #>>22938310 #>>22938383 #>>22938533 #>>22938646 #>>22938728 #>>22938777 #>>22938855 #>>22938884 #>>22939026 #>>22939035 #>>22939376 #>>22939803 #>>22939814 #>>22939916 #>>22939952 #>>22940051 #>>22940090 #>>22940177 #>>22940282 #>>22940315 #>>22940317 #>>22940352 #>>22940686 #>>22940751 #>>22941252 #>>22942502 #>>22942538 #>>22942710 #>>22942907 #>>22943100 #>>22943453 #>>22944163 #>>22944509 #>>22944652 #>>22945170 #>>22946136 #
threepio ◴[] No.22938646[source]
Stripe customer here. The question raised is, more broadly, "Is Stripe collecting this data in a legal and ethical way?" This too can be readily answered in the negative.

It doesn't matter whether "Stripe.js collects this data only for fraud prevention" or if it works in practice. Under CalOPPA [1], Stripe still has to disclose the collection of the data, and (among other things) allow customers to opt out of collection of this data, and allow customers to inspect the data collected. Stripe's privacy policy refers to opt-out and inspection rights about certain data, but AFAICT not this.

[This is not legal advice]

[1] http://leginfo.legislature.ca.gov/faces/codes_displayText.xh...

[2] https://stripe.com/privacy#your-rights-and-choices

replies(9): >>22938707 #>>22938733 #>>22938916 #>>22939641 #>>22940272 #>>22940285 #>>22940307 #>>22940438 #>>22943152 #
1. Kalium ◴[] No.22938916[source]
I am not an attorney. This is not legal advice.

Based on a plain reading of the law, several things about CalOPPA stand out to me. For one, it's not clear to me that the mouse movements in question qualify as "personally identifiable information". Mouse movements are not a first or last name, physical or email address, SSN, telephone number, or any contact method I am familiar with (maybe you know a way?).

Second, it seems to me that opt-out, right to inspect and update, and more are all contingent upon the data being PII within the scope of CalOPPA. Perhaps you can help me with something I've overlooked that would show me where I've erred?

Further, what do you think the correct legal and ethical way for Stripe to use mouse movement data would be? From your comment I can guess that you believe it should be treated as PII. Is that correct?

replies(6): >>22939055 #>>22939773 #>>22939959 #>>22940138 #>>22942576 #>>22942757 #
2. techbio ◴[] No.22939055[source]
> first or last name, physical or email address, SSN, telephone number, or any contact method I am familiar with (maybe you know a way?)

What about a face? Fingerprints? Voice? Aren't those identifiable information even though they didn't make your (common-sense) short list? Mouse movements are on the same order of specificity.

Edit: Also not giving legal advice.

Edit2: Please see https://news.ycombinator.com/item?id=22939145

replies(2): >>22939076 #>>22939091 #
3. swsieber ◴[] No.22939076[source]
I have yet to hear of a legally binding definition of PII that involves mouse movements.
replies(1): >>22939145 #
4. Kalium ◴[] No.22939091[source]
It's less my short list and more the one in the text of the law being cited. Other things, such as finger-, voice-, and face-prints were probably not contemplated by lawmakers in 2003 and thus go unmentioned. They may fall under the "maintains in personally identifiable form in combination with an identifier" clause, though.

Of course, that also provides an easy way to comply. Don't store mouse movements in a way that ties them to PII under CalOPPA, and you don't meet any criteria.

replies(1): >>22939160 #
5. techbio ◴[] No.22939145{3}[source]
Not a lawyer, but not that surprised that the laws you refer to are growing technical loopholes. Here are a couple of things that mouse movements can identify, in case no one knows what I'm talking about:

https://www.researchgate.net/publication/221325920_User_re-a...

https://medium.com/stanford-magazine/your-computer-may-know-...

replies(1): >>22939191 #
6. techbio ◴[] No.22939160{3}[source]
Makes sense, but I don't trust it to never be tied to PII.
replies(1): >>22939223 #
7. Kalium ◴[] No.22939191{4}[source]
Thank you for bringing hard research to this discussion!

I find it interesting that the one that contemplates authentication requires supervised machine learning and goes on to explicitly state that "analyzing mouse movements alone is not sufficient for a stand-alone user re-authentication system". Taken together, this suggests that a sizable corpus of mouse movement data known to be associated with one user may qualify as PII under some definitions.

Again, thank you for sharing this timely information.

replies(2): >>22941465 #>>22956322 #
8. Kalium ◴[] No.22939223{4}[source]
That's definitely a question of implementation, policy, compliance, and liability. You are absolutely correct.

The law in question also requires data to be maintained in personally identifiable form. I am uncertain if a small number of mouse movements is likely to reach this. I do not see how, but that's not a reason why it cannot be so.

9. Xelbair ◴[] No.22939773[source]
You have to account for someone using an on-screen keyboard to input their credentials.
replies(3): >>22940101 #>>22940113 #>>22940176 #
10. throwaway284629 ◴[] No.22939959[source]
Mouse movements will contain personally identifiable information if the user has any kind of writing-to-text system turned on. You definitely can't rule it out. (Not a lawyer.) I think what Stripe is doing is illegal.
replies(1): >>22939977 #
11. three_seagrass ◴[] No.22939977[source]
Writing-to-text tools are technographic data, not really personally identifiable information (PII), and not run in-browser.

When used with other technographic data it can be used to fingerprint a user, but without any PII, you don't know who that user is.

replies(1): >>22949455 #
12. Scaevolus ◴[] No.22940101[source]
JavaScript code cannot see mouse movements outside of its page, such as in an on-screen keyboard or another webpage.
13. mattigames ◴[] No.22940113[source]
They are a minority, so it's likely easy to account for them, e.g. by learning their IP and transaction history and marking them with a certain degree of trustworthiness; on the other hand, tracking mouse movements and other techniques are essential for users you have no record of (new IP, new user, new card, etc.).
14. lucb1e ◴[] No.22940138[source]
> Mouse movements are not a first or last name, physical or email address, [or one of a dozen other obvious examples]

You misunderstand what personally identifiable information is. Each individual letter of my name is not identifiable on its own, and the letters of the alphabet are not PII, but when stored in the same database row the separate letters do form PII, no matter that you stored them separately or even hashed or encrypted them. My phone number is also not something that just anyone could trace to my name, but since my carrier stores my personal data together with the number (not to mention the CIOT database, where law enforcement can look it up at will), there exists a way to link the number to my person, making it PII. Everything about me is PII, unless you make it no longer about me.

Mouse movements may not be PII if you don't link them to a session ID, but then they would be useless for fraud detection, because you wouldn't know whose transaction to block or allow since the data is no longer traceable to a person.

Another example[1] mentioned on a website that the Dutch DPA links to (from [2]) is location data. Coordinates that point to somewhere in a forest aren't personal data, but if you store them with a user ID...

[1] (English) https://www.privacy-regulation.eu/en/4.htm

[2] (Dutch) https://autoriteitpersoonsgegevens.nl/nl/over-privacy/persoo...

replies(3): >>22940210 #>>22942777 #>>22945387 #
15. lucb1e ◴[] No.22940176[source]
You mean on a touchscreen device, or because of a physical disability? Because the latter case seems exceptional enough that I'm not sure how that would legally work (do you have to think of all possible edge cases? What if someone uses dictation because they can't type, does that mean you'd potentially capture social security numbers if you use the microphone for gunshot detection and process the sound server-side?) and in the former case I'm pretty sure taps on a keyboard are not registered as a mouse movement in JavaScript.
replies(1): >>22942899 #
16. Kalium ◴[] No.22940210[source]
> You misunderstand what personally identifiable information is.

Not to belabor a point discussed elsewhere, but those were not arbitrarily chosen types of PII. They are how PII is defined in the specific law that was cited - CalOPPA. The comment to which I responded contains a link. The text of the law contains its definition of PII.

Please accept my apologies. I can see I failed to communicate clearly and readers interpreted my statements as a broad comment about what is or isn't PII across a union of all potentially relevant laws and jurisdictions. This was in no way, shape, form, or manner my intended meaning. Again, please accept my apologies for failing to be clear.

> Mouse movements may not be PII if you don't link it to a session ID, but then it would be useless in fraud detection because you don't know whose transaction you should be blocking or allowing since it's no longer traceable to a person.

Maybe it's just me, but I was under the distinct impression that some patterns of input are characteristic of humans and others of inhuman actors. Is it possible that a user could be identifiable as human or inhuman without having to know which specific human an input pattern corresponds to? Have I misunderstood something?

replies(1): >>22940376 #
17. lucb1e ◴[] No.22940376{3}[source]
> [could one distinguish] human or inhuman without having to know which specific human an input pattern corresponds to?

You can't rely on the client asking the server anonymously and adhering to the response. If you want to avoid a connection to a "specific human", it would go like this:

Fraudulent client: POST /are/these/mouse_movements/human HTTP/1.0 \r\n Content-Type: application/json \r\n [{"x":13,"y":148},...]

Server: that's a robot

Fraudulent client: discards server response and submits transaction anyway

To make sure the server knows to block the transaction, it has to tie the mouse movements to the transaction, and thereby to a credit card number (afaik Stripe does only credit cards as payment option), at least during the processing of the submission before discarding the mouse movement data.

I'm not arguing this is evil or mistrusting Stripe or anything, just that this is considered PII in my part of the world.
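
For illustration, here is a minimal sketch of the flow described above, in which the mouse movements necessarily arrive in the same request as the payment details and are only discarded after the decision is made. All endpoint, field, and function names are hypothetical; this is not Stripe's actual API.

    // Hypothetical sketch: the server only trusts a bot score it computes itself,
    // so the mouse movements arrive in the same request as the charge details
    // and are transiently tied to the card token until the decision is made.
    interface MousePoint { x: number; y: number; t: number }

    interface ChargeRequest {
      cardToken: string;            // opaque token standing in for the card number
      amountCents: number;
      mouseMovements: MousePoint[]; // linked to the transaction for the duration of processing
    }

    // Toy scoring: perfectly uniform timing between samples looks scripted.
    function botScore(points: MousePoint[]): number {
      if (points.length < 3) return 1.0; // too little data: treat as risky
      const gaps = points.slice(1).map((p, i) => p.t - points[i].t);
      const mean = gaps.reduce((a, b) => a + b, 0) / gaps.length;
      const variance = gaps.reduce((a, b) => a + (b - mean) ** 2, 0) / gaps.length;
      return variance < 1 ? 1.0 : 0.0; // near-zero jitter => likely a bot
    }

    function handleCharge(req: ChargeRequest): { approved: boolean } {
      const score = botScore(req.mouseMovements);
      // The movement data is only needed for this decision and is discarded here,
      // but for this moment it is necessarily tied to the card token.
      req.mouseMovements = [];
      return { approved: score < 0.5 };
    }

    // A scripted client with metronome-perfect 10 ms gaps gets declined.
    const scripted: MousePoint[] =
      Array.from({ length: 20 }, (_, i) => ({ x: i * 5, y: i * 5, t: i * 10 }));
    console.log(handleCharge({ cardToken: "tok_test", amountCents: 999, mouseMovements: scripted }));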

replies(6): >>22940539 #>>22940557 #>>22940567 #>>22940578 #>>22940589 #>>22941143 #
18. ◴[] No.22940539{4}[source]
19. nl ◴[] No.22940557{4}[source]
> To make sure the server knows to block the transaction, it has to tie the mouse movements to the transaction, and thereby to a credit card number (afaik Stripe does only credit cards as payment option), at least during the processing of the submission before discarding the mouse movement data.

Which is absolutely fine by the law if it isn't stored tied to PII.

replies(1): >>22948127 #
20. DevKoala ◴[] No.22940567{4}[source]
As someone who had to implement GDPR for a DSP, that doesn't make the data PII.
replies(1): >>22948083 #
21. Kalium ◴[] No.22940578{4}[source]
> You can't rely on the client asking the server anonymously and adhering to the response. If you want to avoid a connection to a "specific human", it would go like this:

I'm afraid I don't understand. Maybe you can help me? Seems to me you could simply not store the data, you could require a signed and expiring token from the /are/these/mouse_movements/human service, or you could treat the request as super risky without that signed token. I'm sure there are others; I am known to suffer failures of imagination at times.

> To make sure the server knows to block the transaction, it has to tie the mouse movements to the transaction, and thereby to a credit card number (afaik Stripe does only credit cards as payment option), at least during the processing of the submission before discarding the mouse movement data.

I'm clearly wrong, but doesn't the logic here only work if the mouse movements are identifiable in the same sort of way that a phone number is? What happens if that's not accurate and mouse movements from a session are not so personally identifiable? What have I failed to understand? Wouldn't this logic also make transaction timestamps PII?

replies(2): >>22941656 #>>22947957 #
22. disiplus ◴[] No.22940589{4}[source]
But you are giving Stripe PII when you buy something directly; at that point the mouse movement is nothing. And if you don't buy something, the mouse movement is not PII.
replies(1): >>22948028 #
23. mistercow ◴[] No.22941143{4}[source]
Huh? Client sends data to bot-detection server, server sends back a signed response with a nonce and an expiration date saying "Yep, this is a human". Server stores the nonce to prevent replays. Client attaches the signed validation when submitting the transaction. The server that receives that verifies the signature and expiration date, then checks and invalidates the nonce. No association between the transaction and the mouse data necessary.

I don't know if that's how Stripe is doing it, but you could do it that way.
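
A rough sketch of such a scheme, assuming an HMAC-signed token that carries a nonce and an expiry; the names and layout are illustrative only, and nothing here is claimed to be Stripe's implementation:

    import { createHmac, randomUUID } from "crypto";

    const SECRET = "server-side-signing-key";  // shared by the two services (illustrative)
    const usedNonces = new Set<string>();      // spent nonces, for replay protection

    function sign(payload: string): string {
      return createHmac("sha256", SECRET).update(payload).digest("hex");
    }

    // Bot-detection service: inspects the mouse data, returns only a signed verdict.
    function issueHumanToken(mouseDataLooksHuman: boolean): string | null {
      if (!mouseDataLooksHuman) return null;
      const nonce = randomUUID();
      const expires = Date.now() + 5 * 60_000;  // valid for five minutes
      const payload = `${nonce}.${expires}`;
      return `${payload}.${sign(payload)}`;
    }

    // Payment service: verifies the token without ever seeing the mouse data.
    function verifyHumanToken(token: string): boolean {
      const [nonce, expires, mac] = token.split(".");
      if (sign(`${nonce}.${expires}`) !== mac) return false; // signature check
      if (Date.now() > Number(expires)) return false;        // expiry check
      if (usedNonces.has(nonce)) return false;                // replay check
      usedNonces.add(nonce);                                  // single use
      return true;
    }

    // The client attaches the token to the charge request instead of raw mouse data.
    const token = issueHumanToken(true);
    console.log(token !== null && verifyHumanToken(token)); // true
    console.log(token !== null && verifyHumanToken(token)); // false: nonce already spent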

replies(1): >>22941597 #
24. ◴[] No.22941465{5}[source]
25. TheDong ◴[] No.22941597{5}[source]
How is the nonce not an association?

We have two possible options here:

1. Client sends mouse-data + card info to a server, server checks the mouse data, turns it into a fraudPercent, and only stores that percent. That seems to be what they're doing now.

2. Client sends mouse data, gets back a unique nonce, and then sends that nonce to the server with card info. The server could have either stored or discarded the mouse info. It's perfectly possible the nonce was stored with the mouse info.

Those two things seem totally identical. The nonce by necessity must be unique (or else one person could wiggle their mouse, and then use that one nonce to try 1000 cards at once), and you can't know that they don't store the full mouse movement info with the nonce.

You gain nothing by adding that extra step other than some illusion of security.

Note: Cloudflare + Tor have a similar problem that they tried to solve with blind signatures (see https://blog.cloudflare.com/the-trouble-with-tor/), but that hasn't gone anywhere and requires a browser plugin anyway. It's not a viable solution yet.

replies(1): >>22942214 #
26. TheDong ◴[] No.22941656{5}[source]
Your feigned "maybe you can help me?" reads more like sealioning than like a genuine lack of understanding.

However, sure, I'll humour you. A "signed and expiring token" is not sufficient because then a single attacker could use that token to try 1000s of cards before it expires.

Thus, you need a unique token, and wherever you store that unique token (to invalidate it, akin to a database session), you can optionally store the mouse movements or not. The association still exists. A unique token isn't functionally different from just sending the data along in the first place.

replies(1): >>22942136 #
27. snowwrestler ◴[] No.22942136{6}[source]
I think he/she is being very patient with people who don't seem to have a good understanding of the law they're citing.
replies(2): >>22943606 #>>22947986 #
28. mistercow ◴[] No.22942214{6}[source]
If you're going to go as far as "it's perfectly possible that the nonce was stored with the mouse info", then your example following:

> If you want to avoid a connection to a "specific human", it would go like this:

doesn't work either. It's perfectly possible that the server stored that info with the IP address and session information, since it also has access to those, and that could then be connected up with the transaction. I don't understand at this point what standard you're trying to meet, because it sounds like by what you're saying, literally any data sent to a server is "PII" if at some point that server also can, in principle, know your name.

replies(1): >>22942545 #
29. TheDong ◴[] No.22942545{7}[source]
I don't think it's PII. My point is just that your scheme of signed tokens doesn't avoid an association. There isn't a way to.

And that's fine because it's not PII and it's the only way to implement this (in my mind). What you're proposing is just shuffling around deck chairs, not actually sinking the ship.

replies(1): >>22942580 #
30. masterfooo ◴[] No.22942576[source]
Yeah? It is clearly personally identifiable. In fact it is psychologically identifiable when Stripe can associate it with your name, credit card, IP address, time of the purchase, the vendor, the type of item, how you got to the store, the items you are paying for, how much time you spent on the item or the store, which links you clicked, the browser you are using, the device you are on, your location, etc. Do you want me to list all the possibilities they are recording? You are out of touch with reality here.
31. mistercow ◴[] No.22942580{8}[source]
Oh, I mistook you for the previous commenter. Yeah, I agree that what I proposed doesn't really buy you anything unless you for some reason need the mouse data not to touch the server that's processing the transaction, which seemed to be what they were saying was required. There are multiple layers to why what they're saying doesn't make sense.
32. wpearse ◴[] No.22942757[source]
Counter-point: if your business is selling digital or physical products into New Zealand, the NZ tax department requires you to collect two different types of data for the transaction that prove the customer is located in NZ. This can include IP address, phone number, and address.

So, in some instances, Stripe is legally required to collect some of this data.

replies(1): >>22945722 #
33. onion2k ◴[] No.22942777[source]
> Mouse movements may not be PII if you don't link them to a session ID, but then they would be useless for fraud detection, because you wouldn't know whose transaction to block or allow since the data is no longer traceable to a person.

Surely the point of mouse movement detection for anti-fraud is more "did the mouse move in an exact straight line to the exact center of an element, and therefore this isn't a human" or "the last 3 orders on this site used exactly the same pattern of mouse movements, therefore this is a recording", rather than some sort of "gait detection" to tell who someone is.
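
A toy sketch of that kind of check, flagging a path that is implausibly straight rather than trying to identify the user; the threshold and names are made up for illustration:

    interface Point { x: number; y: number }

    // Largest distance between any sampled point and the straight line joining the endpoints.
    function maxDeviationFromStraightLine(path: Point[]): number {
      const a = path[0], b = path[path.length - 1];
      const len = Math.hypot(b.x - a.x, b.y - a.y) || 1;
      return Math.max(...path.map(p =>
        Math.abs((b.y - a.y) * p.x - (b.x - a.x) * p.y + b.x * a.y - b.y * a.x) / len
      ));
    }

    function looksScripted(path: Point[]): boolean {
      // Real hands wobble; sub-pixel deviation over a long move is suspicious.
      return path.length >= 5 && maxDeviationFromStraightLine(path) < 0.5;
    }

    // A perfectly linear move is flagged; a slightly noisy one usually is not.
    const robotic = Array.from({ length: 50 }, (_, i) => ({ x: i * 4, y: i * 3 }));
    const humanish = robotic.map(p => ({ x: p.x + Math.random() * 3, y: p.y + Math.random() * 3 }));
    console.log(looksScripted(robotic), looksScripted(humanish)); // true, false (typically)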

replies(1): >>22948214 #
34. shakna ◴[] No.22942899{3}[source]
> or because of a physical disability? Because the latter case seems exceptional enough that I'm not sure how that would legally work

There have been a number of accessibility-based lawsuits recently. Generally speaking, yes, you absolutely have to allow for them to use an alternative system without locking them out.

Because if your particular methodology breaks things for a group of people that way, all kinds of discrimination laws become a hammer that someone can toss your way.

replies(1): >>22947879 #
35. SkyPuncher ◴[] No.22945387[source]
> Each individual letter of my name is not identifiable on its own, and the letters of the alphabet are not PII, but when stored in the same database row the separate letters do form PII, no matter that you stored them separately or even hashed or encrypted them.

This is a correct statement, but its implied suggestion that Stripe is doing this is incorrect. There are lots of ways around this: not storing specific keys and hashing input would be my initial thoughts.

My guess is Stripe is more concerned about the action patterns than the specific keys that are being pressed.

> Mouse movements may not be PII if you don't link them to a session ID, but then they would be useless for fraud detection, because you wouldn't know whose transaction to block or allow since the data is no longer traceable to a person.

This is an opinion and not a fact.

I don't need to know the identity of the guy wearing a balaclava and carrying a pillowcase to know that if that guy is in a bank and reaching into his jacket pocket, there's a high likelihood he's robbing the place.

When he shows up at the next place to rob, I don't have to have any PII on him to identify him as a robber. Might not be the same robber at both banks, but they both exhibit similar patterns. If they both limp or talk with a slur, I can reasonably connect the two without knowing the underlying identity.

replies(1): >>22948178 #
36. sib ◴[] No.22945722[source]
How does an IP address (in a world with VPN) or a phone number (which is likely to be mobile and could be located anywhere in the world) "prove that the customer is located in NZ"?
replies(1): >>22949623 #
37. lucb1e ◴[] No.22947879{4}[source]
> allow for them to use an alternative system without locking them out

That's not what I'm arguing against, though. I was not saying: forbid screen readers. I said:

> do you have to think of all possible edge cases? What if someone uses dictation because they can't type, does that mean you'd potentially capture social security numbers if you use the microphone for gunshot detection and process the sound server-side?

replies(1): >>22951749 #
38. lucb1e ◴[] No.22947957{5}[source]
You keep using that ridiculously apologetic tone that really rubs me the wrong way while making constructive remarks. If you could lose the former without the latter, I might actually appreciate your replies. But then, I'm reasonably sure that it's meant to annoy.

> Seems to me you could not store things, you could require a signed and expiring token

That's actually a good idea.

replies(1): >>22949616 #
39. lucb1e ◴[] No.22947986{7}[source]
Really, you read that as being patient? To me it seems to be an obvious attempt to rub the person they're replying to entirely the wrong way while feigning ignorance.

I would flag it as attempting to trigger others if each reply did not also contain one or two constructive sentences.

> with people who don't seem to have a good understanding of the law

"People" had a fine understanding of applicable PII law, but the person clarified (in between a bunch of bullshit about how godforsaken sorry they are) that they were talking about some USA thing specifically and not the broader definition.

40. lucb1e ◴[] No.22948028{5}[source]
> But you are giving Stripe PII when you buy something directly; at that point the mouse movement is nothing

1) but that's not how the law works

2) law aside, I'm also not sure it holds up ethically to say "you're giving them <some info necessary to fulfill your payment>, what's wrong with giving them <unnecessary other data>". Now, if you say "but it's not unnecessary, it's for anti-fraud!" then sure, that's a different argument: then the argument is not that you might as well give it because of something else you gave, but that it's necessary for fraud prevention. They could still do the courtesy of telling users before tracking them (which might bring us back to the legal argument, which tells us that it is indeed necessary to do so).

41. lucb1e ◴[] No.22948083{5}[source]
Digital Signal Processor? Delaware State Police? Defense Support Program? DSP is a little ambiguous here.
replies(1): >>22948838 #
42. lucb1e ◴[] No.22948127{5}[source]
GDPR doesn't apply only to storage, though? Maybe I'm confusing it with the previous data protection directive, but I'm pretty sure the GDPR also defines processing of personal data to include things like transmitting it and operating on it.

But if there is some source (e.g. case law, data protection authority) that confirms that you can process two pieces of data and keep one as non-PII if you promise not to connect them in storage or forward them to another place in an identifiable manner, that would be interesting.

replies(1): >>22949250 #
43. lucb1e ◴[] No.22948178{3}[source]
> My guess is Stripe is more concerned about the action patterns than the specific keys that a being pressed.

Don't they still need to process the data server-side to derive that pattern to make a decision on it?

44. lucb1e ◴[] No.22948214{3}[source]
The purpose of processing the individual mouse positions over time may be exactly that, but I'm not sure that the intent matters. For example, a website asking for my social security number for the sole purpose of verifying whether it matches the checksum (Dutch SSNs contain a checksum) would still be processing my SSN, no? I'd be interested if I'm wrong, though.
45. ceefry ◴[] No.22948838{6}[source]
Given context of GDPR, I'm assuming Demand Side Platform ("buying" half of programmatic advertising).
46. joshuamorton ◴[] No.22949250{6}[source]
> But if there is some source (e.g. case law, data protection authority) that confirms that you can process two pieces of data and keep one as non-PII if you promise not to connect them in storage or forward them to another place in an identifiable manner, that would be interesting.

It would be impossible to follow the GDPR otherwise: all data would implicitly be PII, since all data is associated with an IP address and the GDPR defines an IP address as PII.

> GDPR doesn't apply only to storage, though?

This doesn't matter, because you can always collect data for business-critical purposes, which fraud protection reasonably is.

47. throwaway284629 ◴[] No.22949455{3}[source]
If they write their SSN in the tool and it gets recorded in the mouse movements, how is that not PII?
48. Kalium ◴[] No.22949616{6}[source]
OK.

You didn't read the law I was talking about that was specifically and clearly linked in the initial comment to which I responded. The comment in question made a specific claim about a specific law in a specific jurisdiction to which I responded narrowly and specifically. My comment referred clearly to the law in question and summarized points from it.

All points about other laws in other locations are irrelevant to the specific points I was offering discussion of.

> That's actually a good idea.

It is... provided that a handful of mouse movements actually qualify as PII. Which, as claimed here under CalOPPA, seems like it might be doubtful. As others have pointed out, there's room to doubt that a few mouse movements would be considered PII under any current regulatory regime (there are multiple notable ones, they don't agree on all points).

As an approach, it's useful for things like SAML and OAuth protocols when you're dealing with different systems controlled by different parties and need to delegate trust through an untrusted party. It's rarely the best way to move data around inside a system, though, unless you have some compelling reason to introduce this level of blinding.

49. ◴[] No.22949623{3}[source]
50. shakna ◴[] No.22951749{5}[source]
Inadvertently capturing social security numbers does actually open you up to a lot of PII laws. So yes, that is still a problem.

Any time you get data from a user, you need to be careful about what you're grabbing.

51. DrShila ◴[] No.22956322{5}[source]
Here is how mouse movements can lead to a privacy violation: mouse movements as such don't contain PII like a name, zip code, or gender. But when mouse movements are run through a machine learning algorithm, they can not only help identify the person (mouse dynamics are behavioral traits that can be mapped across different sites; by mapping across sites you learn that basically the same person is surfing these three sites, which is valuable information for the advertising world, as an example), they can also be analyzed to infer health issues. You can then take this information and link it to other publicly available databases to identify the person. So, overall, if Stripe doesn't sell this data or analyze other patterns like identity or health issues, it's fine... but guaranteeing that is hard.

At Unknot.id, we learn similar patterns to detect fraud, but using smartphones. We make sure that only the needed result (that is, fraud or not) can be derived, and not health issues or other privacy-related information.