https://georgetownvoice.com/2025/04/06/georgetown-students-c...
They should feel bad about not communicating with the "researcher" after the fact, too. If I had been blown off by a "company" after telling them everything was wide open to the world for the taking, the resulting "blog post" would not be so polite.
STOP. MAKING. APPS.
Nonetheless: "two-month-old vulnerability" and "two-month-old, student-made app/service".
It's hard to tell these days what is real.
LinkedIn shows 2024 founded, and 2-10 employees. And that same LinkedIn page has a post which directly links to this blurb: https://www.readfeedme.com/p/three-college-seniors-solved-th...
The date of this article is May 2025, and it references an interview with the founders.
There's nothing wrong with making your POC/MVP with all of the cool logic that shows what the app will do. That's usually done to gain funding of some sort, before releasing. Part of the release stage should be a revamped, hardened version of the POC, and not the damn POC itself; the hardened version is where the security work gets added.
That's much better than telling people stop making apps.
You know what else was an app built by university students? The Facebook. We're all familiar with the "dumb fucks" quote, with Meta's long history of abusing their users' PII, and their poor security practices that allowed other companies to abuse it.
So, no. This type of behavior must not be excused, and should ideally be strongly regulated and fined appropriately, regardless of the age or experience of the founders.
If all of the developers were named and shamed, would you, as a hiring manager, ever hire them to develop an app for you? Or would you, in fact, tell them to stop making apps?
They enabled stalkers. There's no way to argue that they didn't or that you can't know: some random person looked into it just because their friends mentioned the app, and found all of this. I guarantee that if anyone with a modicum of security knowledge looks the platform over, there are going to be a lot more issues.
It's one thing to be curious and develop something. It's another to seek VC/investments to "build out the service" by collecting PII and not treating it as such. Stop. Making. Apps.
This can only be solved by regulation.
Also, if we're talking about a company that had a hiring manager while building the app, and did not hire anyone with security knowledge somewhere in the process, then the entire company is rotten.
Let me flip this on its head with your same logic: if you're the type of person who would be willing to give an app your passport information. Stop. Using. Apps.
The disclosure didn't show every API endpoint, just a few dealing with auth and profiles. It also mentioned only a few pieces of PII; you can tell because there were multiple screenshots spread throughout the post. I'm harping on the passport for the reason you specify, too; but mostly because that information shouldn't be stored...
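The class of flaw described in disclosures like this one, API endpoints that hand back other users' PII to any caller, is typically a broken object-level authorization bug. Here is a minimal sketch of the server-side check that prevents it (the data model and names are hypothetical, not the app's actual code):

```python
# Hypothetical in-memory profile store. "passport_no" stands in for any
# sensitive field that should never be served to anyone but its owner.
PROFILES = {
    "u1": {"owner": "u1", "name": "Alice", "passport_no": "X123"},
    "u2": {"owner": "u2", "name": "Bob", "passport_no": "Y456"},
}

def get_profile(requester_id: str, profile_id: str) -> dict:
    """Return a profile, filtered by who is asking.

    The crucial step is comparing the authenticated requester against the
    record's owner BEFORE serializing sensitive fields -- the bug class in
    question is returning the full record to any authenticated (or even
    unauthenticated) caller.
    """
    record = PROFILES.get(profile_id)
    if record is None:
        raise KeyError("profile not found")
    if record["owner"] != requester_id:
        # Other users only get the public subset.
        return {"name": record["name"]}
    return dict(record)
```

The point is that authentication ("who are you?") is not authorization ("may *you* see *this* record?"); the disclosure suggests the second check was missing.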
"Class Immobility" (95% of users unlock this without trying!)
How to unlock: Be denied access to an accredited education. Work twice as hard for half the recognition. Watch opportunities pass you by while gatekeepers congratulate themselves!
At the end of the day the masses will finally get tired of the fuckery of programmers doing whatever they want and start putting laws in place, and the laws will be passed by the stupidest people among us.
Programmers should start establishing standards of professional behavior now, before those standards are forced on them by law.
And sure, if your follow-up is "that won’t change," I get it, but that doesn’t mean the open nature of programming is the problem.
>At the end of the day the masses will finally get tired of the fuckery of programmers doing whatever they want and start putting laws in place, and the laws will be passed by the stupidest people among us.
I agree laws will pass eventually, but it won't start from the people. They rarely think or even hear about software security as anything other than an amorphous boogeyman, and since there are no repercussions, any voices are easily forgotten. Eventually it will be some big-tech executive or politician moving into government who convinces it to create a security auditing authority to extract money from these companies and/or shut them down.
I'm sure we can find some holier-than-thou types to fill the security-auditor chairs for the new "SSC" once it's greenlit.
Perhaps, like GDPR, HIPAA, and similar, any (web|platform) apps that contain login details and/or PII must thoroughly distance themselves from haphazard, organic, unprofessional, and amateurish processes and technologies, and conform to trusted, proven patterns, processes, and technologies that are tested, audited, and preferably formally proven correct. Without formalization and professional standards there are no standards, and these preventable, reinvent-the-wheel-badly shops will keep doing the same thing and expecting a different result™. Massive hacks, circumvention, scary bugs, and other attacks will continue. And I think this means a proper amount of accreditation, routine auditing, and (the scary word, but applied smartly) regulation to drag the industry, kicking and screaming if need be, with appropriate leadership on the government/NGO side, from an under-structured wild west™ into professionalism.
Way back when I last used a dating site, a significant percentage of profiles ended up being placeholders for scams of some sort.
In fact, several texted me a link to some bogus "identity verification" site under the guise of "I get too many fake bot profile hits"... Read the fine print, and you're actually signing up for hundreds of dollars worth of pron subscriptions.
If the dating app itself verified people were real, AND took reports of spam seriously, AND kept that information in a way that wasn't insecure, it'd be worth it.
Until the balance of incentives changes, I don't see any meaningful change in behavior unfortunately.
It's astonishing to me the ease with which software developers can wreak _real_, measurable damage on billions of lives and bear no real liability for it.
Software developers shouldn't call themselves engineers unless they're licensed, insured and able to be held liable for their work in the same way a building engineer is.
Civil engineering requires licensing because there are specific activities that are reserved for licensed engineers, namely things that can result in many people dying.
If a major screwup doesn't even motivate victims to sue a company, then a license is not justified.
Nonsense. I've met PhDs in computer science who were easily out-performed by kids fresh out of coding bootcamps. Do you think that spending 5 years and passing a few written exams makes you competent at cybersecurity? Absurd.
Instead, I think this is the fair approach: anyone is free to make a website/app/VR world whatever, but if it stores any kind of PII, you had better know what you are doing. The problem is not security. The problem is PII. If someone's AWS key got hacked, leaked and used by others, well it's bad, but that's different from my personal information getting leaked and someone applying for a credit card on my behalf.
Observing that each individual harm may not be worth the effort of suing over is evidence that the justice system is not effective at addressing harm in the aggregate, not evidence of lack of major harm.
The distinction between creating virtual software and physical structures is fairly obvious.
Of course physical engineers that create buildings and roads need to be regulated for safety.
And there are restrictions already for certain software industries, such as healthcare.
Many other forms of software do not carry the same hazards, so no license should be needed; it would be prone to abuse.
You don't know what you don't know; sometimes people think they know what they're doing simply because they haven't yet encountered a situation that proves otherwise. We were all new to programming once; no one would ever become a solid engineer if fear of making an inexperienced mistake stopped them from building anything.
Civil engineering works well because we mostly figured it out anyway. But looking at PCI, SOX and others, we'd probably just require people to produce a book's worth of documentation and audit trail that comes with their broken software.
US tech is built on the "go fast, break things" mentality. Companies with huge backers routinely fail at security, and some of them actually spend money to suppress those who expose the companies' poor privacy/security practices.
If anything, college kids could at least reasonably claim ignorance, whereas a lot of HN folks here work for companies who do far worse and get away with it.
Some companies, some unicorns, knowingly and wilfully break laws to get ahead. But they're big, and people are getting rich working for them, so we don't crucify them.
https://en.wikipedia.org/wiki/2017_Equifax_data_breach
Or how about four suicides and 900+ wrongful convictions?
https://en.wikipedia.org/wiki/British_Post_Office_scandal
Not to mention the various dating app leaks that led to extortion, suicides, and the leaking of medical information like HIV status. And let's not forget the famous Therac-25, which killed people as a direct result of a race condition.
Where's the threshold for you?
I don't think anyone is proposing that Flappy Bird or Python scripts on Github should be outlawed. Just like you can still build a robot at home but not a bridge in the town center.
You can sign a liability waiver and do all sorts of dangerous things.
>most “real” engineering field have had licensing requirements for a century, without any real complaints against that process).
Most newer engineering fields are trending away from licensing, not towards it. For example, medical device and drug engineering doesn't use it at all.
There are all sorts of failures in the structural space. How many pumped reinforced concrete buildings are being built in Miami right now? How many of them will be sound in 50-75 years? How likely is the architect/PE’s ghost to get sued?
PE’s are smart professionals and do a valuable service. But they aren’t magic, and they all have a boss.
You can't stop someone from doing electrical repairs on their own home but if the house burns down as a result, the homeowners' insurance will probably just deny the claim, and then they risk losing their mortgage. Basically, if you make it bureaucratically difficult to do the wrong thing, you'll encourage more of the right thing.
No mention of PII or any specifics.
SWE already has regulations. I see no need for a license requirement...
Concerning PII, it's kind of hypocritical for the gov to regulate when the NSA was proven to be collecting data on everyone against their will or knowledge.
We had two security teams: security and compliance. It was not possible to be both secure and compliant, so the compliance team had to document every deviation from the IRS standard and why, then self-report us and the customer for audit in the areas where we were outside the lines. That took a dozen people almost a year.
All of that existed because a US state (S Carolina iirc) was egregiously incompetent and ended up getting breached. Congress “did something” about it.
Should've known when they said interpreters and compilers.
Incidentally I replied with sarcasm to theirs as well so it all works out.
I'm not saying I'm pro identity theft or data breaches or anything, but the industry culture is vastly different.
People here are pro "move fast and break things" to some degree; I just don't think you can change that, tbh.
If you're looking for a regulatory fix, I would prefer something like a EU-style requirement on handling PII. Even the US model--suing in cases of privacy breaches--seems like it could be pretty effective in theory, if only the current state of privacy law was a little less pro-corporate. Civil suits could make life miserable for the students who developed this app.
They all update their recommendation and standards routinely, and do a reasonably good job at being professional organizations.
The current state of this as regards the tech sector doesn't mean it's impossible to implement.
That's why all the usual standards (PCI and SOC 2 in particular) are performative in practice. There's nothing that holds the industry accountable to do better, and nothing, from a legal standpoint, that backs up members of the association if they flag an entity or individual for what would effectively be malpractice.
It’s a trade-off between shipping fast and courting risk. I’m not judging one over the other; it comes down to what you’re willing to accept, not what you wish for.
These students may be liable for things after the fact, but that is hardly any consolation to the people that may have had their intimate personal data leaked. Even if they are successfully sued by everybody on the site, how much money could they possibly squeeze out of a bunch of college students? I don’t know how you can prevent this without some up front thing, such as a license, rather than making them liable after the fact.
That is a special-case exception, where rather than requiring licensing for the engineers building the product, we put detailed restrictions and regulations on what must be done (extensive testing, detailed evidence, monitoring programs, etc.) before the product can be sold or marketed.
That is hardly an example of a field where risk-taking is encouraged and unlicensed persons are able to unleash their half-developed ideas on the public.
Do you have any other examples of fields which are "trending away" from licensing?
Anyways, I’m not the one who should be deciding the specifics here, it should be a collaboration between lots of different parties, even if I may have a seat at that table. But we have got to get away from the notion (as seen in other comments in this thread) that any sort of attempt to prevent this kind of harm before it happens is authoritarianism.
It is also a commercial product, not something they made for fun:
In-App Purchases
- Cerca App $9.99
- Cerca App 3 month $9.99
- 10 Swipes $2.99
- 3 Swipes $0.99
- 5 swipes $1.99
- 3 Searches $1.99
- 10 Searches $3.99
- 5 Searches $2.99
I enforced a no-login policy, because I didn't want potential users to even think about entering a password into a form on the website. I didn't trust myself or my group to handle it correctly, so I decided it was best to just side-step the problem. Naturally this made the application a lot less useful - but it was a student project, who cares.
Software engineering students have an obligation to ethics just like all other engineers. We need to think these things through, and decide if we even want to implement features. And we need to be thinking in terms of risk, not design.
Storing sensitive data is risky, even if you're really talented. Companies will try to put processes in place to mitigate that risk. But students are almost certainly not doing that, so they should be questioning if they should even be doing what they're doing in the first place.
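Sidestepping credential storage entirely, as the no-login policy above does, is the safest choice for a student project. If credentials genuinely must be stored, the standard mitigation is a salted, memory-hard key-derivation function rather than a plain hash. A minimal sketch using only Python's standard library (the cost parameters are illustrative, not a vetted policy):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a storage-safe digest from a password with scrypt.

    A fresh random salt per user defeats precomputed (rainbow-table)
    attacks; scrypt's memory-hardness slows down brute-force attempts
    even on GPUs. Cost parameters (n, r, p) shown here are illustrative.
    """
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

Even with this in place, the broader point stands: the lowest-risk data is the data you never collected.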
Will I need a license if Flappy Bird has an online function for uploading high scores to a leaderboard stored online somewhere?
Will I need a license to put a PR on Github?
You also haven't thought about how many unintended consequences it will have. It will affect things like open source, hiring and how it will affect smaller niche cultures that rely on pseudo-anonymity or just want to do fun things.
Just off the top of my head:
Am I going to need a license to build an EDuke32 package for the AUR?
Am I going to need a license to add a plugin to a piece of software?
Will I need a license to stick a gist on github?
Many of the people who currently make the laws for the industry (just look at the UK Online Safety Act) don't understand, and won't care about, any of this nuance.
>I just think there has been entirely too much demonstrated harm to start with the premise of “anyone can build any software they want at any time, with zero liability”.
Actually, it is the opposite. I and many others would argue that it has improved the world immensely. I can talk to people who share my interests from all around the globe, and I can work internationally without ever leaving my home. Just recently I've taught myself how to fix many of my own vehicle problems at home using YouTube, and how to do some basic maintenance around the house.
I can get any niche product delivered to my door in a matter of days. All of these are massively positives that have benefited the world immeasurably.
> These students may be liable for things after the fact, but that is hardly any consolation to the people that may have had their intimate personal data leaked. Even if they are successfully sued by everybody on the site, how much money could they possibly squeeze out of a bunch of college students? I don’t know how you can prevent this without some up front thing, such as a license, rather than making them liable after the fact.
A license will guarantee nothing. You should assume that anything you put online can be leaked. I can control the amount of information I put on most sites by either giving them false information or being pseudo-anonymous / anonymous.
However, regulation in my country is going to force photo ID for platforms such as Discord (and many others) under the guise of age checks. This means I have to give a third party my ID, which carries all my data, or stop using the service, and it ties my real identity to my pseudo-anonymous Discord account.
So licensing/regulation actually guarantees more data leaks: I can't vet the company that handles the ID check, nor can I easily circumvent the information gathering. Sure, I will probably be able to defeat most of this with a VPN, but it's more of a PITA.
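One concrete way services can act on "assume anything you put online can be leaked" is to store a keyed pseudonym of an identifier instead of the raw value, so a leaked database table reveals nothing directly. A minimal sketch (key handling is deliberately simplified; `SERVER_KEY` is a stand-in for a real secret store, and the names are hypothetical):

```python
import hashlib
import hmac

# Stand-in for a key held in a proper secret store, never in the database
# alongside the tokens it protects.
SERVER_KEY = b"example-key-rotate-me"

def pseudonymize(identifier: str) -> str:
    """Return a keyed pseudonym to store in place of the raw identifier.

    Without SERVER_KEY, a leaked table of tokens cannot be reversed to the
    original identifiers, and tokens produced under different keys cannot
    be correlated across services. Lookups still work: recompute the HMAC
    of the incoming identifier and match on the token.
    """
    return hmac.new(SERVER_KEY, identifier.encode(), hashlib.sha256).hexdigest()
```

This doesn't help when the raw value itself is needed (as with a mandated photo-ID check), which is exactly why forced ID collection widens the blast radius of any leak.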
Maybe you are allowed to build that faulty bridge in, I dunno, Laos or whatever, and if people go to Laos specifically to drive on your bridge, then that’s on them if it collapses. But countries can and do successfully regulate how software is handled in their jurisdiction, see GDPR for example. It’s not an unsolvable problem, and even if there are cracks (like there are with GDPR), the solution isn’t to throw our hands up and say “welp, nothing to be done, just have to accept that sometimes people’s intimate personal details gets leaked.”
If you think my suggestion is bad (which it very well may be), I'm happy to hear your take on how to prevent things like this and other negligent software.
That's merely an unconstructive statement. Developers have free will; they spent time and money to make the app, and customers spent time and money to use it. Unless you prove that they intentionally harmed the customer, or violated the data-safety contract between the app and the customer, they are free to keep their business. The free market will decide what happens next.
And the link you gave as an example just makes no sense. The victim was fired from a government security position because he was not honest from the start: he did not disclose that he used a dating app. With his private data in a dating app, even if it were never leaked, the data could be traded illegally in the background, which can lead to social engineering and harm to the government and nation he works for. Actually, that firm and the nation were lucky his data was leaked, on purpose, by someone. It was his fault.
Perhaps they could move even faster and scale better by collecting and storing less data. Moving forward fast instead of moving frantically while looking for things to break seems more reasonable to me. But then again I'm not the kind of person to become a billionaire tech CEO who's unironically bragging about being called the Eye of Sauron, so what do I know.
As you point out, the trend is toward self-certification and government review, as is done for medicines and aircraft.
I don't think these are special cases, but the norm for any field developed after the 60's or so.
>risk-taking is encouraged and unlicensed persons are able to unleash their half-developed
That's your hostile strawman, not mine.
I do imagine a technical organization that strives to do its best and would have sufficient scope to protect its members legally if need be, so members would be empowered to make the best decisions possible.