[0] E.g. arxiv/0812.0848: "This paper has been withdrawn by the author due to a crucial definition error of Triebel space".
Peer review is not well equipped to catch fraud and deliberate deception. Even a half-competent fraudster will produce data that looks reasonable at first glance, and peer reviewers aren't in the business of replicating studies and results.
Instead, peer review is better at catching papers that either have internal quality problems (e.g. proofs or arguments that don't prove what they claim to prove) or are missing links to a crucial part of the literature (e.g. claiming an already-known result as novel). Here, the value of peer review is more ambiguous. It certainly improves the quality of the paper, but it also delays its publication by a few months.
The machine learning literature gets around this by making almost everything available as preprints, with peer-reviewed conferences acting as post-facto gatekeepers, but that just reintroduces the problem of non-peer-reviewed research being seen and cited.