Are you trying to use actors for the first time, or moving part of your crawler to an actor and wondering how to deal with the request label, or how to pass data along with a request as easily as in a crawler?

Here's how to do it.

If you are using a RequestQueue, you can do it this way.

When you add a request to the queue, set the userData attribute.

// Create a request queue.
const requestQueue = await Apify.openRequestQueue();
// Add the request to the queue.
await requestQueue.addRequest(new Apify.Request({
    url: 'https://www.example.com/',
    userData: {
        label: "START"
    }
}));

So right now, we have one request in the queue with the label "START". Now we can specify which code should be executed for this request in the handlePageFunction.

if (request.userData.label === "START") {
    // Your code for the start request,
    // e.g. enqueue the items of a shop.
} else if (request.userData.label === "ITEM") {
    // Other code for an item of the shop.
}

And in the same way, you can keep adding requests inside the handlePageFunction.
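To make the label-routing flow concrete, here is a small, dependency-free sketch. The in-memory queue and handler below are simplified stand-ins for Apify's RequestQueue and handlePageFunction (the item URLs are made up for illustration):

```javascript
// Simplified stand-in for a RequestQueue: a plain array.
const queue = [];
const addRequest = (request) => queue.push(request);

// Records which labels were handled, in order.
const processed = [];

// Seed the queue with the start request.
addRequest({ url: 'https://www.example.com/', userData: { label: 'START' } });

// Simplified stand-in for the handlePageFunction.
const handlePageFunction = (request) => {
    processed.push(request.userData.label);
    if (request.userData.label === 'START') {
        // Enqueue the items of the shop (hypothetical URLs).
        addRequest({ url: 'https://www.example.com/item/1', userData: { label: 'ITEM' } });
        addRequest({ url: 'https://www.example.com/item/2', userData: { label: 'ITEM' } });
    } else if (request.userData.label === 'ITEM') {
        // Here you would extract the item data from the page.
        console.log(`Handling item page: ${request.url}`);
    }
};

// Drain the queue, the way the crawler would.
while (queue.length > 0) {
    handlePageFunction(queue.shift());
}
```

The start request enqueues two item requests, and each request is routed to the right branch by its label.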

You can pass data to a request in a similar way. For example, once we have extracted an item from the shop above, we want to extract some information about its seller. So we pass the item object to the seller page, where we then save the seller's rating:

await requestQueue.addRequest(new Apify.Request({
    url: sellerDetailUrl,
    userData: {
        label: "SELLERDETAIL",
        data: itemObject
    }
}));

Now, on the "SELLERDETAIL" page, we can evaluate the page and merge the extracted data into the object from the item detail, for example like this:

const result = Object.assign({}, request.userData.data, sellerDetail);
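To see what that merge produces, here is a self-contained example with made-up item and seller objects (the field names are assumptions, purely for illustration):

```javascript
// Hypothetical item data passed along in request.userData.data.
const itemObject = { title: 'Blue T-shirt', price: 19.99 };

// Hypothetical data extracted on the seller detail page.
const sellerDetail = { sellerName: 'Best Shop', rating: 4.8 };

// Merge into a new object; later sources win on conflicting keys,
// and the empty {} target keeps both inputs unmodified.
const result = Object.assign({}, itemObject, sellerDetail);
// result: { title: 'Blue T-shirt', price: 19.99, sellerName: 'Best Shop', rating: 4.8 }
```

Passing `{}` as the first argument matters: it makes `Object.assign` write into a fresh object instead of mutating the item data stored on the request.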

Then we just save the result and we're done!

await Apify.pushData(result);

If you have any questions, let us know.
