The essential point of this blog post is to avoid "shotgun parsing", where parsing and validation are scattered procedurally through the code, so that correctness depends on exactly when each check happens. The paper "Out of the Tar Pit" argues that this leads to "accidental complexity" (AKA "pain and anxiety"), something every programmer has experienced, possibly many times.
I've become a fan of declarative schemas (JSON Schema/OpenAPI, Clojure spec, etc.) for expressing this kind of thing. Usually these are used at the boundaries of an application (configuration, web requests, etc.), but there are many more uses for them within the flow of data transformations. If you apply the "parse, don't validate" principle, you turn schema-validated (sic!) data into a new thing. Whether that is a "ValidatedData" type, metadata, a pre-condition, or a runtime check says more about the environment you program in than about the principle under discussion. The benefit, however, is clear: your code asserts that it requires parsed/validated data where it is needed, instead of relying on when the validation happened.
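As a minimal sketch of that "turn validated data into a new thing" idea, here is one way it might look in TypeScript using a branded type. The names `Email`, `parseEmail`, and `sendWelcome` are hypothetical, and the regex stands in for whatever schema library would do the real validation; the point is only that downstream code states its requirement in the type signature.

```typescript
// "parse, don't validate": parsing produces a new type, so the
// requirement lives in signatures, not in when checks happen to run.
type Email = string & { readonly __brand: "Email" };

function parseEmail(raw: string): Email {
  // Deliberately simple check; a schema validator would go here.
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(raw)) {
    throw new Error(`not a valid email: ${raw}`);
  }
  return raw as Email;
}

// This function asserts that it requires parsed data: it can only
// be called with an Email, never with a raw, unchecked string.
function sendWelcome(to: Email): string {
  return `welcome mail queued for ${to}`;
}

const email = parseEmail("ada@example.com");
console.log(sendWelcome(email));
// sendWelcome("raw string") would be a compile-time error.
```

Whether the brand is a nominal type, a wrapper class, or runtime metadata is, as above, an environment question; the signature-level assertion is the part that carries over.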