If you are seriously close to ending your own life, a great deal around you has already gone wrong. People generally don't want to die. Consider: if an acquaintance explained to you how to tie a noose, would you go and hang yourself? Probably not. It takes a lot of suffering to reach a point in life where ending it all seems like an appealing option.
Life failed that guy, and that's why he took his own life, not because a chatbot told him to. The mere fact that a chatbot was his closest confidant is a huge red flag for his wellbeing. The article notes how happy he appeared, which is itself an indicator of how disconnected he was from the people around him. He wasn't sharing how he truly felt with anyone; he probably felt significant shame about it. That's sad. What else may have gone amiss to bring him to such a point? Health issues? Social troubles? Childhood problems? Again, considering suicide is not a healthy state of affairs, even accounting for typical teenage moodiness. His case is a failure of family, friends, and society. Treating ChatGPT as the cause of his death ignores so many significant factors.