The Apify API limits each request to 9 MB, so uploading a large array of items into a key-value store or a dataset in a single call will fail. The following function splits the array into smaller chunks and uploads them separately:

const chunkUpload = async (
  items,
  uploadFunction,
  size = 1000,
  count = 0
) => {
  if (items.length) {
    const chunk = items.slice(0, size)
    const nextCount = count + chunk.length
    console.log(`Uploading ${chunk.length} items (${nextCount} in total).`)
    await uploadFunction(chunk)
    return chunkUpload(
      items.slice(size),
      uploadFunction,
      size,
      nextCount
    )
  }
  console.log('Finished chunkUpload.')
  return null
}

Now we can call the function with the array of items we want to upload to our dataset or key-value store. The function slices the items into small chunks and uploads them one by one (make sure each chunk serializes to less than 9 MB).
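If your items vary widely in size, a fixed item count can still produce a chunk over the limit. One way around this is to split by serialized byte size instead of item count. This is only a sketch: the helper name chunkBySize and the way it estimates size with JSON.stringify are our own assumptions, not part of the Apify client:

```javascript
// Sketch of a size-aware splitter (not an Apify API): groups items into
// chunks whose JSON-serialized size stays under maxBytes.
const chunkBySize = (items, maxBytes = 9 * 1024 * 1024) => {
  const chunks = []
  let current = []
  let currentBytes = 2 // account for the surrounding "[]"
  for (const item of items) {
    // +1 for the "," separator between serialized items
    const itemBytes = Buffer.byteLength(JSON.stringify(item)) + 1
    if (current.length && currentBytes + itemBytes > maxBytes) {
      chunks.push(current)
      current = []
      currentBytes = 2
    }
    current.push(item)
    currentBytes += itemBytes
  }
  if (current.length) chunks.push(current)
  return chunks
}
```

You could then upload each chunk in turn, e.g. `for (const chunk of chunkBySize(allData)) { await uploadFunction(chunk) }`.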

Example:

...

const uploadToDataset = (data) => datasets.putItems({ data })

// Call the chunkUpload function with default parameters
// size = 1000 & count = 0
await chunkUpload(allData, uploadToDataset)

...
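To see the batching in action without touching the Apify API, you can run chunkUpload against a stub upload function. The stub and the tiny chunk size below are purely illustrative:

```javascript
// chunkUpload as defined above, repeated so this demo is self-contained.
const chunkUpload = async (items, uploadFunction, size = 1000, count = 0) => {
  if (!items.length) return null
  const chunk = items.slice(0, size)
  await uploadFunction(chunk)
  return chunkUpload(items.slice(size), uploadFunction, size, count + chunk.length)
}

// Stub "upload" that records each chunk's length instead of calling the API.
const uploaded = []
const fakeUpload = async (chunk) => { uploaded.push(chunk.length) }

chunkUpload([...Array(25).keys()], fakeUpload, 10).then(() => {
  console.log(uploaded) // logs [ 10, 10, 5 ]
})
```

Note that chunkUpload recurses once per chunk; for typical chunk counts this is fine, and it can be rewritten as a simple loop if you expect many thousands of chunks.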

Voila!
