I build tons of scrapers and things that pretend to be a browser (handcoded, not recorded from the browser, but much lighter than spinning up a real one), and the hard part is keeping the flows maintained. Some websites are particularly annoying to work with because of random captchas jumping in your face, but that's something you can handle by coding support for the captcha into the flow and presenting the captcha to a real user.
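Roughly, the hand-off pattern looks like this (a minimal Python sketch; the URL, the captcha endpoint and the form field name are made-up placeholders, and real flows would match the site's actual markup instead of grepping for "captcha"):

    import requests

    session = requests.Session()
    # Pretend to be a browser without spinning one up
    session.headers["User-Agent"] = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

    def fetch_with_captcha_support(url: str) -> requests.Response:
        resp = session.get(url)
        if "captcha" in resp.text.lower():  # crude detection, placeholder
            # Hand the captcha to a real user and feed the answer back in
            img = session.get(url + "/captcha.png")  # hypothetical endpoint
            with open("captcha.png", "wb") as f:
                f.write(img.content)
            answer = input("Captcha saved to captcha.png - type the solution: ")
            resp = session.post(url, data={"captcha": answer})  # hypothetical field
        return resp

    page = fetch_with_captcha_support("https://example.com/search?q=test")
    print(page.status_code)

The flow stays fully automated until the site pushes back, and only then does a human get pulled in for the one step a script can't do.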
One problem with logging in from the cloud is IP checks: if the login comes from an unfamiliar IP, you may be asked to confirm it's really you.
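One way to sidestep that is to keep the egress IP stable, e.g. by routing the cloud-hosted flow through a proxy you control so the site keeps seeing the address the account usually logs in from. A sketch, assuming such a proxy exists (the proxy URL and login form are placeholders, not a real service):

    import requests

    PROXY = "http://user:pass@my-proxy.example:8080"  # placeholder proxy

    session = requests.Session()
    session.proxies = {"http": PROXY, "https": PROXY}

    # Login traffic now leaves from the proxy's IP instead of the cloud
    # host's, which avoids tripping "confirm this new location" checks.
    resp = session.post("https://example.com/login",
                        data={"user": "me", "pass": "secret"})  # hypothetical form
    print(resp.status_code)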
If you want to look into these issues, I'd recommend scraping Yandex for dealing with captchas being thrown in your face, and authed Google or Facebook for IP restrictions and weird authentication requests.
Again, I think a marketplace could outsource these problems to a community of developers maintaining flows.
Security could be another concern, but you always have the option of running things locally.