https://news.ycombinator.com/submitted?id=nickprophet (submitted 3x, no other HN activity)
https://news.ycombinator.com/submitted?id=ocbcordoba (submitted 1x, no other HN activity)
https://news.ycombinator.com/submitted?id=pepelopez10 (submitted 1x, no other HN activity)
https://news.ycombinator.com/submitted?id=gptprophet (submitted 4x, no other HN activity)
https://news.ycombinator.com/submitted?id=rebeca420 (submitted 2x, no other HN activity)
https://news.ycombinator.com/submitted?id=fireofmachines (submitted 2x, no other HN activity)
Beyond any interpretation of HN rules, I'm concerned that you might actually believe this stuff. If GPT can hallucinate nonsense about well-documented software APIs, it's certainly capable of making up random stuff that sounds spiritual and profound but is just as nonsensical as when it invents window.fetchJson().