In this article, we'll show you how to transfer cookies from your web browser to any Apify actor that accepts initial cookies, so you can crawl a website that requires a login. Example actors that allow you to log in with cookies include Web Scraper, Puppeteer Scraper, Instagram Scraper, and many more. This is the quickest and simplest solution, but note that other approaches may be more reliable; for example, you can fill in the login form directly in the code, as explained in our other article.
First, you'll need to install the EditThisCookie extension in your web browser. After you install it, go to the website you'd like to crawl and log in with your credentials.
Click the EditThisCookie button right next to the URL bar and click Export. The cookies will be copied to the clipboard as a JSON array that is compatible with the cookie format used by Puppeteer/Headless Chrome (the headless web browser we use for crawling).
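The exported array looks roughly like the following (a shortened sketch with hypothetical values; the exact fields vary by site, and persistent cookies carry an `expirationDate` in seconds since the Unix epoch):

```json
[
    {
        "domain": ".example.com",
        "expirationDate": 1735689600,
        "httpOnly": true,
        "name": "sessionid",
        "path": "/",
        "secure": true,
        "value": "abc123"
    }
]
```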
Let's look at how to pass these cookies to Web Scraper.
The Initial cookies field is located in the Proxy and browser configuration tab, so open that tab.
Now click into the Initial cookies field and press Ctrl+V (Cmd+V on macOS) to paste the cookies.
And that's it! When you run the scraper, it will start already logged in. Keep in mind that if the cookies are short-lived, this might not work, and you will need to implement the login in code instead.
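If you are unsure whether your exported cookies are still valid, you can check their expiration timestamps before pasting them. Here is a minimal sketch, assuming the EditThisCookie export format, where persistent cookies have an `expirationDate` in seconds since the Unix epoch and session cookies omit it (the cookie names and values below are hypothetical):

```javascript
// Keep session cookies (no expirationDate) and cookies that have not expired yet.
function validCookies(cookies) {
    const nowSeconds = Date.now() / 1000;
    return cookies.filter(
        (c) => c.expirationDate === undefined || c.expirationDate > nowSeconds,
    );
}

// Example export with one expired and one long-lived cookie (hypothetical values):
const exported = [
    { name: 'sessionid', value: 'abc', expirationDate: 0 },            // already expired
    { name: 'remember_me', value: 'xyz', expirationDate: 4102444800 }, // far in the future
];

console.log(validCookies(exported).map((c) => c.name)); // [ 'remember_me' ]
```

If the important cookies disappear from the valid list shortly after you log in, the site uses short-lived sessions and you should log in directly in the scraper code instead.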
Happy logged-in crawling!