
423 points sungam | 5 comments

Coded using Gemini 2.5 Pro (free version) in about 2-3 hours.

Single file including all HTML/JS/CSS, vanilla JS, no backend, scores persisted with localStorage.
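For anyone curious how little code localStorage score persistence takes, here's a minimal sketch. The key name and score shape are assumptions, not taken from the actual app; in the browser `storage` is just `window.localStorage`, and the in-memory stand-in only exists to keep the sketch runnable elsewhere.

```javascript
// Minimal sketch of score persistence with localStorage (key name and
// score shape are assumptions, not taken from the actual app).
const storage = typeof localStorage !== "undefined" ? localStorage : (() => {
  // In-memory stand-in with the same getItem/setItem contract,
  // so the sketch also runs outside a browser.
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    setItem: (k, v) => m.set(k, String(v)),
  };
})();

const STORAGE_KEY = "scores";

function loadScores() {
  try {
    // localStorage only stores strings, so scores are serialized as JSON.
    return JSON.parse(storage.getItem(STORAGE_KEY)) || [];
  } catch (e) {
    return []; // missing or corrupted data falls back to an empty history
  }
}

function saveScore(correct, total) {
  const scores = loadScores();
  scores.push({ correct, total, when: Date.now() });
  storage.setItem(STORAGE_KEY, JSON.stringify(scores));
}
```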

Deployed using ubuntu/apache2/python/flask on a £5 Digital Ocean server (but could have been hosted on a static hosting provider as it's just a single page with no backend).

Images / metadata stored in an AWS S3 bucket.

1. jacquesm No.45160857
Nice job. Now you really need to study up on the statistics behind this, and you'll quickly come to the conclusion that this was the easy part. What to do with the output is the hard part. I've seen a start-up that made their bread and butter on such classifications; they did an absolutely great job of it, but found that the problem of deciding what to do with such an application without ending up with net negative patient outcomes was far, far harder than the classification problem itself. The error rates, no matter how low, are going to be your main challenge: both false positives and false negatives can be extremely expensive, in terms of finance and in terms of emotion.
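To put a number on why low error rates still bite: when the condition is rare, even a good classifier produces mostly false alarms. A quick sketch of positive predictive value (the 95%/95%/1% figures below are illustrative, not from any real classifier):

```javascript
// Positive predictive value (PPV): of the cases flagged positive,
// what fraction are actually positive? All figures are illustrative.
function ppv(sensitivity, specificity, prevalence) {
  const truePos = sensitivity * prevalence;            // sick and flagged
  const falsePos = (1 - specificity) * (1 - prevalence); // healthy but flagged
  return truePos / (truePos + falsePos);
}

// A classifier that is 95% sensitive and 95% specific, applied where
// only 1% of lesions are malignant:
console.log(ppv(0.95, 0.95, 0.01).toFixed(3)); // ~0.161: most positives are false alarms
```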
replies(1): >>45161691 #
2. sungam No.45161691
Thanks for your comment - the purpose of this app is patient education rather than diagnosis but I will definitely have a look at the relevant stats in more detail!
replies(2): >>45161870 #>>45163114 #
3. jacquesm No.45161870
The risk, I think, is that people will not understand that that is your goal; instead they will use it to help them diagnose something they think looks suspicious.

They will go through your images until they get a good score, believe themselves an expert, and proceed to diagnose themselves (and their friends).

By the time you have an image set that is representative, and that will actually educate people to the point where they know what to do and what not to do, you've created a whole raft of amateur dermatologists. The result will be a lot of people knocking on the doors of real dermatologists who might tell them not to worry about something, when they are now primed to argue.

I've seen this pattern before with self-diagnosis.

replies(1): >>45167839 #
4. thebeardisred No.45163114
To that end, I quickly learned something that AI models would learn as well (which isn't your intention):

Pictures with purple circles (e.g. faded pen ink on light skin outlining the area of concern) are a strong indicator of cancer. :wink:

5. nextaccountic No.45167839
So what? Are you arguing that giving patients less information about diseases leads to better outcomes? What's your take on public campaigns about self-diagnosing breast cancer by touch? (Very common where I live.)

As a patient, I'd rather have more information available to me, even if I ultimately defer to specialists.

Also, it's common for medical professionals to dismiss the symptoms of certain demographics; in those cases, enabling patients to advocate for themselves is essential: https://www.nytimes.com/2022/07/29/well/mind/medical-gasligh...

A personal anecdote: a friend of mine had abdominal pain for months. She had comorbidities that made it easy for doctors to dismiss her symptoms. After visits to various doctors, she only got adequate treatment because I went with her and advocated for her. After multiple possibilities were ruled out, a renal infection was eventually diagnosed and treated. If she had gone with the first doctor's opinion, the underlying condition would still be untreated.