
To make sure that you don't crawl the same page multiple times by accident, Apify actors are set to avoid visiting the same URL twice. To do this, each request is assigned a uniqueKey, which acts as its identifier. The uniqueKey is usually derived from the request's URL, so two requests with the same uniqueKey are considered to point to the same URL.
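
To illustrate, here is a quick sketch using the Apify SDK (covered in more detail below); the example URL is just a placeholder:

const Apify = require('apify');

// When you don't set a uniqueKey yourself, it is derived from the URL,
// so these two requests are treated as duplicates of each other.
const first = new Apify.Request({ url: "https://www.example.com/test.html" });
const second = new Apify.Request({ url: "https://www.example.com/test.html" });

console.log(first.uniqueKey === second.uniqueKey); // true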

Fortunately for us, Apify actors automatically assign different uniqueKey values to POST requests with unique payloads. So, all we need to do is change the request method from the default GET to POST and provide a payload.

There are two ways to do this, depending on the type of actor you're using:

  1. Through the GUI or API in actors that use StartURLs 
  2. Programmatically in actors that do not use StartURLs

Actors with StartURLs  

When creating a new task for an actor such as Web Scraper (apify/web-scraper), you are asked to set one or more StartURLs. By default, they are set to use the GET method. To change this, click on the green Details button to display the additional options. Then, set the Method to POST and add the data you want to send. 


To add more URLs, just click on the green Add URL  button in the bottom-left corner and repeat.

If you prefer to use the API instead, you can send your StartURLs  in the body of your request, using the following structure:

"startUrls": [
    {
      "url": "https://www.example.com/test.html",
      "method": "POST",
      "payload": {
        "greeting": "Hi mom, I'm using Apify!"
      },
      "userData": {
        "label": "START"
      }
    }
  ]
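
Because each entry carries its own payload, you can list the same URL several times with different payloads, and each one will be treated as a separate request. A minimal sketch of such an input (the payload contents are placeholders):

"startUrls": [
  {
    "url": "https://www.example.com/test.html",
    "method": "POST",
    "payload": "{ \"greeting\": \"Hi, mom!\" }"
  },
  {
    "url": "https://www.example.com/test.html",
    "method": "POST",
    "payload": "{ \"greeting\": \"Hi, dad!\" }"
  }
]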

Set requests programmatically

In actors that do not use StartURLs, you need to set your request methods programmatically. To do this, add your requests to the request queue, setting the method to POST and adding properties such as payload, headers, and userData.

In scrapers, use context.enqueueRequest() in your Page function:

// Enqueue another request to the same URL with its own payload
await context.enqueueRequest({
    url: "https://www.example.com/test.html",
    method: "POST",
    payload: "Hi, mom! I'm using an Apify scraper!",
});

In the Apify SDK:  

// Inside your actor's main function:

// Open the default request queue
const requestQueue = await Apify.openRequestQueue();

// Enqueue the page and set the data
await requestQueue.addRequest({
    url: "https://www.example.com/test.html",
    method: "POST",
    payload: "Hi, mom! I'm using the Apify SDK!",
});

You can do this manually for each request, or, if you have a long list of URLs, you can iterate over it, adding the necessary data during the iteration. 
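
For example, here is a minimal sketch in the style of the SDK snippet above; the payloads array is just a stand-in for your own list of data:

// Inside your actor's main function. The `payloads` array below is a
// hypothetical example; replace it with your own list of data.
const payloads = [
    { greeting: "Hi, mom!" },
    { greeting: "Hi, dad!" },
];

const requestQueue = await Apify.openRequestQueue();

// Each payload produces a request with its own uniqueKey,
// so all of them will be processed even though the URL is the same.
for (const payload of payloads) {
    await requestQueue.addRequest({
        url: "https://www.example.com/test.html",
        method: "POST",
        payload: JSON.stringify(payload),
    });
}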

Please note that context.enqueueRequest() and requestQueue.addRequest() have the same functionality; however, they cannot be used interchangeably between scrapers and the SDK.
  

And that's it! You are now ready to send multiple POST requests to the same URL.

