
422 points | sungam | 1 comment

Coded using Gemini 2.5 Pro (free version) in about 2-3 hours.

Single file including all HTML/JS/CSS, vanilla JS, no backend, scores persisted with localStorage.
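The score-persistence part can be sketched roughly as below. This is a minimal sketch, not the app's actual code: the key name and score shape are assumptions. The functions take any object with `getItem`/`setItem`, so in the browser you'd pass `window.localStorage`, and the same logic runs outside a browser for testing.

```javascript
// Minimal sketch of localStorage score persistence (key name and score
// shape are assumptions, not the app's actual schema). `store` is any
// object with getItem/setItem; in the browser, pass window.localStorage.
function loadScores(store, key = "scores") {
  const raw = store.getItem(key);
  return raw ? JSON.parse(raw) : { correct: 0, total: 0 };
}

function recordAnswer(store, wasCorrect, key = "scores") {
  const scores = loadScores(store, key);
  scores.total += 1;
  if (wasCorrect) scores.correct += 1;
  store.setItem(key, JSON.stringify(scores)); // survives page reloads
  return scores;
}
```

Because everything is serialized as JSON under one key, the "backend" really is just the browser's own storage — consistent with hosting the page anywhere static.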

Deployed using Ubuntu/Apache2/Python/Flask on a £5 DigitalOcean server (though it could have been hosted on a static hosting provider, since it's just a single page with no backend).

Images / metadata stored in an AWS S3 bucket.

lazarus01 [No.45159105]
What you created is a version of "am I hot or not" for skin cancer. The idea is constrained by the limitations of your programming capability. Showing a photo and creating three buttons with a static response is not very helpful. These are the limits of vibe coding.

I was thinking of training a convnet to accurately classify pictures of moles as normal vs. abnormal. The user could take a photo, upload it to a diagnostic website, and get a diagnosis.

It doesn't seem like an overly complex model to develop, and there is plenty of photo data showing normal vs. abnormal moles.

I wonder why such a product hasn't been developed, where image detection on our phones actively screens for skin cancer. Seems like a no-brainer.

My thinking is that there are not enough deaths to motivate the work. Dying from melanoma is nasty.

hombre_fatal [No.45159367]
Every dermatologist (and every developer with a dermatologist relative) in the world has had that app idea, since most daily checkups are moles that get categorized in seconds.

The app already exists, btw. Did nobody in this thread google it before saying it couldn't work?