110 points by jonbaer | 18 comments
1. utilize1808 No.45073112
I feel this is not the scalable/right way to approach this. The right way would be for human creators to apply their own digital signatures to the original pieces they created (specialised chips on camera/in software to inject hidden pixel patterns that are verifiable). If a piece of work lacks such signature, it should be considered AI-generated by default.
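To make that concrete, here is a toy sketch (Python, with an invented per-device key) of hiding a verifiable tag in an image's least-significant bits. A real scheme would need a robust watermark that survives compression, and hardware-protected asymmetric keys rather than a shared secret:

    import hmac, hashlib
    import numpy as np

    DEVICE_KEY = b"per-device secret"  # hypothetical; would live in a secure chip

    def embed_tag(pixels: np.ndarray) -> np.ndarray:
        """Hide an HMAC of the image's non-LSB content in the first 256 LSBs."""
        flat = pixels.flatten()
        content = (flat >> 1).tobytes()  # image content with LSBs masked off
        tag = hmac.new(DEVICE_KEY, content, hashlib.sha256).digest()
        bits = np.unpackbits(np.frombuffer(tag, dtype=np.uint8))
        out = flat.copy()
        out[:bits.size] = (out[:bits.size] & 0xFE) | bits
        return out.reshape(pixels.shape)

    def verify_tag(pixels: np.ndarray) -> bool:
        flat = pixels.flatten()
        content = (flat >> 1).tobytes()
        tag = hmac.new(DEVICE_KEY, content, hashlib.sha256).digest()
        bits = np.unpackbits(np.frombuffer(tag, dtype=np.uint8))
        return bool(np.array_equal(flat[:bits.size] & 1, bits))

    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    print(verify_tag(embed_tag(img)))  # True
    print(verify_tag(img))             # False (overwhelmingly likely)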
replies(3): >>45073155 >>45073302 >>45073834
2. shkkmo No.45073155
That seems like a horrible blow to anonymity and pseudonymity that would also empower identity thieves.
replies(3): >>45073244 >>45073831 >>45074209
3. utilize1808 No.45073244
Not necessarily. It’s basically document signing with key pairs: old tech that is known to work. Its purpose is not to identify individual creators, but to verify that a piece of work was created by a process/device untouched by AI.
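A minimal sketch of that sign/verify flow with Ed25519 via the `cryptography` package (the assumption that the private key is sealed inside a camera chip is mine, not something the library provides):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()  # would be burned into the device
    public_key = private_key.public_key()       # published for verifiers

    work = b"raw capture or document bytes"
    signature = private_key.sign(work)

    try:
        public_key.verify(signature, work)  # raises InvalidSignature on tampering
        print("signed by the claimed device/process")
    except InvalidSignature:
        print("no valid signature: treat as AI-generated by default")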
replies(2): >>45073863 >>45076968
4. HPsquared No.45073302
Then you just point the special camera at a screen showing the AI content.
replies(1): >>45073754
5. utilize1808 No.45073754
Sure. But then it will receive more scrutiny because you are showing a "capture" rather than the raw content.
replies(1): >>45074633
6. taminka No.45073831
there's very likely already some sort of fingerprinting in camera chips, à la printer yellow dot watermarks that uniquely identify a printer and a print job...
replies(1): >>45077007
7. jay-barronville No.45073834
> If a piece of work lacks such signature, it should be considered AI-generated by default.

That sounds like a nightmare to me.

replies(1): >>45074189
8. BoiledCabbage No.45073863
And what happens when someone uses their digital signature to sign an essay that was generated by AI?
replies(1): >>45073997
9. utilize1808 No.45073997
You can’t. It may be set up such that your advisor could sign it if they know for sure that you wrote it yourself without using AI.
replies(1): >>45074751
10. xpe No.45074189
You aren’t specifying your point of comparison. A nightmare relative to what? You might be saying a nightmare relative to what we have now. Are you?

We once considered text to be generated exclusively by humans, but this assumption must be tossed out now.

I usually reject arguments based on an assumption of some status quo that somehow just continues.

Why? I’ll give two responses, which are similar but use different language.

1. There is a fallacy where people compare a future state to the present state. That comparison is wrong: one has to compare two future states, because you don’t get to go back in time.

2. The “status quo” isn’t necessarily a stable equilibrium. The state of things now is not necessarily special nor guaranteed.

I’m now inclined to ask for a supporting model (not just a single rationale) for any prediction, even ones that seem like common sense. Common sense can be a major blind spot.

replies(1): >>45075211
11. xpe No.45074209
Maybe as the direct effect, maybe not. Also think about second-order effects: how would various interests respond? The desire for privacy is strong, and people will search for ways to get it.

Have you looked into the kinds of mitigations that cryptography offers? I’m not an expert, but I would expect there are ways to balance some degree of anonymity with some degree of human identity verification (a toy sketch follows below).

Perhaps there are some experts out there who can comment?
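One classic building block here is a Chaum-style blind signature: an issuer attests that a verified human submitted something without ever seeing the content, so the attestation can't be linked back to the work. A toy sketch with textbook RSA (no padding; a real deployment would use something like the RSA blind signatures of RFC 9474, or full anonymous credentials):

    import hashlib, secrets
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pub = key.public_key().public_numbers()
    n, e = pub.n, pub.e
    d = key.private_numbers().d

    msg = b"pseudonymous work"
    m = int.from_bytes(hashlib.sha256(msg).digest(), "big")

    r = secrets.randbelow(n - 2) + 2       # blinding factor
    blinded = (m * pow(r, e, n)) % n       # user blinds the hash
    blind_sig = pow(blinded, d, n)         # issuer signs, sees only `blinded`
    sig = (blind_sig * pow(r, -1, n)) % n  # user unblinds

    print(pow(sig, e, n) == m)             # anyone can verify: True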

12. HPsquared No.45074633
Actually, come to think of it, I suppose a "special camera" could also record things like focusing distance, zoom, and acceleration/rotation rates. These could be correlated with the captured image to detect this kind of thing.
replies(1): >>45080018
13. akoboldfrying No.45074751
> You can’t.

I like the digital signature approach in general, and have argued for it before, but this is the weak link. For photos and video, this might be OK if there's a way to reliably distinguish "photos of real things" from "photos of AI images"; for plain text, you basically need a keystroke-authenticating keyboard on a computer with both internet access and copy and paste functionality securely disabled -- and then you still need an authenticating camera on the user the whole time to make sure they aren't just asking Gemini on their phone and typing its answer in.

replies(1): >>45077091
14. jay-barronville No.45075211
> You aren’t specifying your point of comparison. A nightmare relative to what? You might be saying a nightmare relative to what we have now. Are you?

Very fair point.

And no, it’s less about the status quo and more about AI being the default. There are just too many reasons why this proposal, on its face, seems problematic to me. The following are some questions to highlight just a few of them:

- How exactly would “human creators [applying] their own digital signatures to the original pieces they created” work for creators who have already passed away?

- How fair exactly would it be to impose such a requirement when large portions of the world’s creators (especially in underdeveloped areas) would likely not be able to access and use the necessary software?

- How exactly do anonymous and pseudonymous creators survive such a requirement?

15. shkkmo No.45076968
> It’s basically document signing with key pairs —- old tech that is known to work.

I understand the technical side of the suggestion. The social and practical side is inevitably flawed.

You need some sort of global registry of public keys. Not only does each registrar have to be trusted, but you also need to trust every single real person to protect and not misuse their keys.

Leaving aside that complete practical infeasibility: even if you accomplish it, you now have a unique identifier tied to every piece of text. There will inevitably be both legal processes to identify who produced a signed work and data-analysis approaches to deanonymize the public keys.

The end result is pretty clearly that anyone wishing to present material that purports to be human made has to forgo anonymity/pseudonymity. Claiming otherwise is like claiming we can have a secure government backdoor for encryption.

16. shkkmo No.45077007
The way those work is primarily through a combination of obscurity (most people don't know they exist) and a lack of real financial incentive to break them at scale.

I would also argue that those techniques do greatly reduce privacy and anonymity.

17. shkkmo No.45077091
> for plain text, you basically need a keystroke-authenticating keyboard on a computer with both internet access and copy and paste functionality securely disabled -- and then you still need an authenticating camera on the user the whole time to make sure they aren't just asking Gemini on their phone and typing its answer in.

Which is why I say it would destroy privacy/pseudonymity.

> For photos and video, this might be OK if there's a way to reliably distinguish "photos of real things" from "photos of AI images";

I suspect if you think about it, many of the issues with text also apply to images and videos.

You'd need a secure enclave. You'd need a chain of signatures and images to allow human editing. You'd need a way of revoking the public keys of not just insecure software but also bad actors. You'd need verified devices to keep AI tooling out of the software used to edit the image... etc.
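Just to give a flavor of the chain-of-signatures piece alone, a stripped-down sketch (invented structure; real provenance systems like C2PA are far more involved):

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def sign_link(signer: Ed25519PrivateKey, content: bytes, prev_sig: bytes) -> bytes:
        # Each link signs the new content hash chained to the previous signature.
        return signer.sign(prev_sig + hashlib.sha256(content).digest())

    camera_key = Ed25519PrivateKey.generate()  # hypothetical in-camera key
    editor_key = Ed25519PrivateKey.generate()  # hypothetical editing-app key

    raw, cropped = b"raw capture", b"cropped version"
    sig0 = sign_link(camera_key, raw, b"")       # capture link
    sig1 = sign_link(editor_key, cropped, sig0)  # edit link, chained to capture

    # A verifier walks the chain with the published public keys; raises on tampering.
    camera_key.public_key().verify(sig0, hashlib.sha256(raw).digest())
    editor_key.public_key().verify(sig1, sig0 + hashlib.sha256(cropped).digest())
    print("provenance chain verifies")

Even this toy version shows the trust problem: every signer's key becomes an attack target, and a single compromised editor key taints every link downstream until it is revoked.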

These are only the flaws I can think of in like 5 minutes. You've created a huge incentive to break an incredibly complex system. I'm comfortable saying the end result is a complete lack of privacy for most people, while those with power/knowledge would still be able to circumvent it.

18. tough No.45080018
ROC Camera does exactly this

> Creates a Zero Knowledge (ZK) Proof of the camera sensor data and other metadata

https://roc.camera/