Apify users sometimes need to submit a form on pages created with ASP.NET (the URL typically ends with .aspx). These pages handle form submission and page navigation differently from regular HTML pages, so this tutorial shows you how to deal with them. The approach is based on a blog post by Todd Hayton, where he explains how crawlers for ASP.NET pages should work.

First of all, copy and paste this function into your crawler's Page function:

var enqueueAspxForm = function(request, formSelector, submitButtonSelector, async) {
    // Serialize all form fields, including ASP.NET's hidden __VIEWSTATE and
    // __EVENTVALIDATION inputs, into the POST body.
    request.postData = $(formSelector).serialize();
    // ASP.NET also expects the name/value pair of the button that "clicked" the form.
    if ($(submitButtonSelector).length) {
        request.postData += decodeURIComponent("&" + $(submitButtonSelector).attr("name") + "=" + $(submitButtonSelector).attr("value"));
    }
    // __ASYNCPOST tells the server whether this is a partial (AJAX) postback.
    request.postData += decodeURIComponent("&__ASYNCPOST=" + async.toString());
    request.method = "POST";
    // Randomize the unique key so the same URL can be enqueued repeatedly
    // with different POST data.
    request.uniqueKey = Math.random().toString();
    context.enqueuePage(request);
    return request;
};
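For illustration, here is roughly what the constructed postData looks like. The field names and values below, including the view state, the validation token and the button name, are hypothetical placeholders; the real values come from the hidden inputs and the submit button of the page you are crawling:

// Hypothetical shape of request.postData built by enqueueAspxForm:
var examplePostData =
    "__VIEWSTATE=%2FwEPDwUKLTk2Njk3..." +           // serialized page state from the hidden __VIEWSTATE input
    "&__EVENTVALIDATION=%2FwEWBAK..." +              // validation token from the hidden __EVENTVALIDATION input
    "&ctl00$ContentPlaceHolder1$btnSearch=Search" +  // name/value of the submit button that was "clicked"
    "&__ASYNCPOST=false";                            // whether this is a partial (AJAX) postback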

The function has these parameters:

request - the same object you would pass to context.enqueuePage()

formSelector - selector for the form to be submitted, e.g. 'form[name="test"]'

submitButtonSelector - selector for the button that submits the form, e.g. '#nextPageButton'

async - if true, the server returns only the partial (AJAX) postback data instead of the full HTML page

Then you can use it in your Page function as follows:

enqueueAspxForm({
        url: "http://architectfinder.aia.org/frmSearch.aspx",
        label: "searchResult"
    }, 'form[name="aspnetForm"]', '#ctl00_ContentPlaceHolder1_btnSearch', false);
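To show how this fits together, below is a rough sketch of a complete Page function for the legacy Apify Crawler. It assumes jQuery injection is enabled (so $ is available), that result pages are enqueued with the label "searchResult", and that the results table and next-page button use the hypothetical selectors '#ctl00_ContentPlaceHolder1_grdResults' and '#ctl00_ContentPlaceHolder1_btnNext'; adjust the labels and selectors to match your target page:

function pageFunction(context) {
    var result = [];

    // Helper from above; paste its full body here.
    var enqueueAspxForm = function(request, formSelector, submitButtonSelector, async) {
        request.postData = $(formSelector).serialize();
        if ($(submitButtonSelector).length) {
            request.postData += decodeURIComponent("&" + $(submitButtonSelector).attr("name") + "=" + $(submitButtonSelector).attr("value"));
        }
        request.postData += decodeURIComponent("&__ASYNCPOST=" + async.toString());
        request.method = "POST";
        request.uniqueKey = Math.random().toString();
        context.enqueuePage(request);
        return request;
    };

    if (context.request.label === "searchResult") {
        // Result page: extract rows from the (hypothetical) results table.
        $("#ctl00_ContentPlaceHolder1_grdResults tr").each(function() {
            result.push({ text: $(this).text().trim() });
        });

        // Paginate by "clicking" the (hypothetical) next-page button via another postback.
        if ($("#ctl00_ContentPlaceHolder1_btnNext").length) {
            enqueueAspxForm({
                url: context.request.url,
                label: "searchResult"
            }, 'form[name="aspnetForm"]', '#ctl00_ContentPlaceHolder1_btnNext', false);
        }
    } else {
        // Start page: submit the search form, as in the example above.
        enqueueAspxForm({
            url: "http://architectfinder.aia.org/frmSearch.aspx",
            label: "searchResult"
        }, 'form[name="aspnetForm"]', '#ctl00_ContentPlaceHolder1_btnSearch', false);
    }

    return result;
}

The randomized uniqueKey inside the helper is what allows the same URL to be enqueued repeatedly as you page through the results.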

As a template, you can use this community crawler we've shared. As always, if you have any questions, we're just an email away.
