It’s like Apify, but my goal is to make it easier to use.
Currently, I have APIs and scrapers that work with platforms like Google Maps, Yelp, and Amazon. The APIs are useful for getting data immediately, and the scrapers for extracting information from many URLs at once.
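To give a feel for the data APIs, here’s roughly what a call looks like (a minimal sketch only; the endpoint, parameter names, and response fields below are placeholders I made up for illustration, not the actual Unwrangle API):

```python
import requests

# Hypothetical synchronous data API call. The endpoint, params, and
# response shape are illustrative assumptions, not the real API.
API_KEY = "your_api_key"

resp = requests.get(
    "https://api.example.com/amazon/search",  # placeholder endpoint
    params={"query": "standing desk", "page": 1, "api_key": API_KEY},
    timeout=30,
)
resp.raise_for_status()

# One request in, structured results out -- no proxies or parsing to manage.
for item in resp.json().get("results", []):
    print(item.get("name"), item.get("price"))
```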
The plan is to add more general-purpose APIs, like an HTML API and a Markdown API, and eventually features to build your own APIs and scrapers with AI.
There are a lot of tools in the space nowadays, but IMO they are all flaky. My intention with unwrangle is to offer a way to scrape any site that just works, without any config or complicated pricing. The project is at a little over $1,000 MRR. Marketing it has been, and continues to be, a big challenge. I’m bootstrapping solo and hoping to reach $5-10k MRR in the coming months. The plan for that is to consistently improve the offering and run marketing experiments.
What I find interesting about it: I’m offering easy ways to scrape sites whose anti-bot protection is really hard to bypass, like Twitter, paywalled sites, LinkedIn, etc. The ability to build crawlers without writing any code is kinda cool; users who would normally not have used web data are scheduling scraping jobs and using the data for analysis. For the HTML API, I’m thinking of putting an interesting spin on what others like ScrapingBee are doing: abstracting away the needless config (premium proxies, etc.) and just effectively offering a higher number of requests. And for the build-your-own-scraper feature, letting users create a parser from a prompt and use it in a single browser session to collect data from many pages saves a lot of hassle compared to a synchronous API approach; a rough sketch of that flow is below.
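Roughly, the prompt-built parser flow could look like this (again just a sketch of the idea; every endpoint and field name here is a made-up assumption to show the shape, not the real API):

```python
import requests

# Hypothetical "build your own scraper" flow: create a parser from a
# natural-language prompt once, then apply it to many URLs in a single
# batch job instead of one synchronous API call per page. All endpoints
# and fields are illustrative assumptions.
API_KEY = "your_api_key"
BASE = "https://api.example.com"  # placeholder base URL

# 1. Create a parser from a prompt (one-time step).
parser = requests.post(
    f"{BASE}/parsers",
    json={"prompt": "Extract product name, price, and rating",
          "api_key": API_KEY},
    timeout=60,
).json()

# 2. Submit a batch job that runs the parser over many pages in one session.
job = requests.post(
    f"{BASE}/jobs",
    json={
        "parser_id": parser["id"],
        "urls": ["https://example.com/p/1", "https://example.com/p/2"],
        "api_key": API_KEY,
    },
    timeout=60,
).json()

print("job id:", job["id"])
```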