Try/Catch function inconsistency when "batching" using ScriptProperties

  • Thread starter: DJ Luke (Guest)
So, this post is a continuation of a post I made 10 months ago. I'll reference it multiple times, so here is the link to it: Import of a Large Quantity of API Data

To start, I have been working on a database that pulls from thousands of API/JSON links to create data. After many attempts to fetch them individually caused the code to error out, time out, or outright fail, I made a post looking for an answer, and that is what led me to "batching". Here is the code that was provided; with a few minor tweaks to fit my data (none to the "batching" part), it worked:

Code:
var scriptProperties = PropertiesService.getScriptProperties();

function dataImport() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName("Import1");
  var exportSheet = ss.getSheetByName("Data1");
  var reqs = sheet.getRange("A2:A" + sheet.getLastRow()).getDisplayValues().reduce((ar, [url]) => {
    if (url) {
      ar.push({ url, muteHttpExceptions: true });
    }
    return ar;
  }, []);

  //Storage of current data
  var bucket = [];
  var batchSize = 200;
  var batches = batchArray(reqs, batchSize);
  var startingBatch = scriptProperties.getProperty("batchNumber") == null ? 0 : parseInt(scriptProperties.getProperty("batchNumber"));
  var processedBatches = scriptProperties.getProperty("processedBatches") == null ? 0 : parseInt(scriptProperties.getProperty("processedBatches"));

  console.log(`Total: ${reqs.length}.\n${batches.length} batches.`)

  if (processedBatches >= (batches.length - 1)) {
    console.log('All data has been processed already.');
  } else {
    //Start from the very last batch that stopped that needs to be processed.
    for (let i = startingBatch; i < batches.length; i++) {
      console.log(`Processing batch index #${parseInt(i)}`);
      try {
        var responses = UrlFetchApp.fetchAll(batches[i]);
        bucket.push(responses);
        //Remove previous batch index number
        scriptProperties.deleteProperty("processedBatches");
        //Store latest successful batch index number
        scriptProperties.setProperty("processedBatches", parseInt(i));
      }
      //Catch the last batch index number where it stopped due to URL fetch exception
      catch (e) {
        //Remove the old batch number to be replaced with new batch number.
        scriptProperties.deleteProperty("batchNumber");
        //Remember the last batch that encountered an error, to be processed again in the next call.
        scriptProperties.setProperty("batchNumber", parseInt(i));
        console.log(`Batch index #${parseInt(i)} stopped`);
        break;
      }
    }

    const initialRes = [].concat.apply([], bucket);

    var temp = initialRes.reduce((ar, r) => {
      if (r.getResponseCode() == 200) {
        var { id, firstName, lastName, fullName, displayName, shortName, weight, height, position: { abbreviation }, team: { $ref }, birthPlace: { city }, birthPlace: { state, country }, experience: { years, displayValue }, jersey, active } = JSON.parse(r.getContentText());
        ar.push([id, firstName, lastName, fullName, displayName, shortName, weight, height, abbreviation, $ref, city, state, country, years, displayValue, jersey, active]);
      }
      return ar;
    }, []);
    var res = [...temp];

    //Add table headers
    exportSheet.getLastRow() == 0 && exportSheet.appendRow(['IDs', 'First Name', 'Last Name', 'Full Name', 'Display Name', 'Short Name', 'Weight', 'Height', 'Position', 'Team URL', 'City', 'State', 'Country', 'Years', 'Class', 'jersey', 'Active']);

    //Add table data
    var result = () => {
      return temp.length != 0 && exportSheet.getRange(exportSheet.getLastRow() + 1, 1, res.length, res[0].length).setValues(res);
    }
    result() && console.log(`Processed: ${res.length}.`);
  }
}

//Function to chunk the request data based on batch sizes
function batchArray(arr, batchSize) {
  var batches = [];

  for (var i = 0; i < arr.length; i += batchSize) {
    batches.push(arr.slice(i, i + batchSize));
  }

  return batches;
}

//Optional function to check the latest processed batch
function checkBatch() {
 console.log(`${parseInt(scriptProperties.getProperty("processedBatches"))+1} batches have been processed.`);
}

This code is supposed to process each batch and then append it to the "Import Data" sheet. As I ran the code repeatedly, it would increase both the batchNumber and processedBatches script properties. I added a time-driven trigger in Apps Script that ran the code every minute until it had worked through all the batches. By saving the properties, the goal is to save progress between runs so the code avoids timing out. When it was working, each run took around 10-15 seconds, and the per-minute trigger eventually got all the batches applied. The answer in the original post goes into greater detail on how it is supposed to work, including a GIF demo, so if you would like more detail, it explains it better than I can!
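For anyone skimming, the resume mechanism can be boiled down to a small sketch. This is plain JavaScript with an in-memory object standing in for ScriptProperties, so it runs anywhere; the names `store`, `runOnce`, and `fetchFn` are mine, not from the script above:

```javascript
// In-memory stand-in for ScriptProperties, so this sketch runs outside Apps Script.
const store = {};

// Chunk an array into batches of at most batchSize (same idea as batchArray above).
function batchArray(arr, batchSize) {
  const batches = [];
  for (let i = 0; i < arr.length; i += batchSize) {
    batches.push(arr.slice(i, i + batchSize));
  }
  return batches;
}

// One "trigger run": start from the saved checkpoint, process batches until
// one fails, and record where we stopped so the next run resumes there.
function runOnce(batches, fetchFn) {
  const start = store.batchNumber == null ? 0 : store.batchNumber;
  const results = [];
  for (let i = start; i < batches.length; i++) {
    try {
      results.push(fetchFn(batches[i]));
      store.processedBatches = i;   // last batch that succeeded
    } catch (e) {
      store.batchNumber = i;        // resume point for the next run
      break;
    }
  }
  return results;
}
```

Each per-minute trigger run calls `runOnce` again; because the checkpoint lives outside the execution, progress survives a timeout.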

Fast forward to the present: I wanted to update my data with the latest information and tried to use the same code as before. To my dismay, the "batching" that previously worked no longer functions correctly. The behavior is hard to describe as anything other than inconsistent, because it has done several different things without my making any real changes to the code:

  • Most often, it gets stuck in a processing loop: it keeps going through the "try" branch, logging batch after batch as "processing", but never actually writes the data. This continues until the script itself times out, and it only updates the processedBatches property, never batchNumber.
  • Other times it processes whatever batch it is on, then just stops, writing no data after a single run.
  • In rare instances it does write a batch to the sheet, but I have never gotten it to do so twice in a row. Usually, the run after a successful batch gets stuck in a processing loop again.

To try to reset things, I have tried both setting the two properties to zero and deleting them entirely, but neither has yielded successful results. For additional reference, when the code was working correctly, I never needed to adjust the properties manually until all the batches had been applied, at which point I would reset them; in this current attempt I haven't gotten anywhere close to that.
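I don't know if this is the cause, but one thing I noticed while staring at the symptoms: only the catch branch ever writes batchNumber, while the success path only writes processedBatches. A tiny simulation (plain JavaScript; `props` is my stand-in for ScriptProperties, and `simulateRun`/`failAt` are made-up names) shows how a run that never throws leaves the resume point untouched, so the next run starts over from the top:

```javascript
// Stand-in for ScriptProperties (an assumption for the sketch; the real code
// uses PropertiesService).
const props = {};

// Mirror of the script's resume logic: start where batchNumber points.
function startingBatch() {
  return props.batchNumber == null ? 0 : parseInt(props.batchNumber);
}

// failAt: index where the pretend fetchAll throws (-1 means it never throws).
function simulateRun(batchCount, failAt) {
  const processed = [];
  for (let i = startingBatch(); i < batchCount; i++) {
    if (i === failAt) {                  // pretend UrlFetchApp.fetchAll threw
      props.batchNumber = String(i);     // resume point is only set here
      break;
    }
    processed.push(i);
    props.processedBatches = String(i);  // success path never touches batchNumber
  }
  return processed;
}
```

Two fully successful runs both process batches 0 through the end: nothing on the success path advances batchNumber, so only processedBatches records that any work was done.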

I'm not sure where to go from here to fix this. I already felt lost when putting it together, and now I have no clue why it stopped working. I did go in and add extra console.log statements to try to find where the error occurs, and that is how I found that the "catch" wasn't firing consistently.

For reference, here is the current version of the code I have been working with, to show that I don't believe most of the internals have been altered. In fact, I even went back and re-pasted the "batching" sections into the code to try to get it working:

Code:
var scriptProperties = PropertiesService.getScriptProperties();

function dataImport() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName("Base JSON Import");
  var exportSheet = ss.getSheetByName("Import Data");
  var reqs = sheet.getRange("A2:A" + sheet.getLastRow()).getDisplayValues().reduce((ar, [url]) => {
    if (url) {
      ar.push({ url, muteHttpExceptions: true });
    }
    return ar;
  }, []);

  //Storage of current data
  var bucket = [];
  var batchSize = 200;
  var batches = batchArray(reqs, batchSize);
  var startingBatch = scriptProperties.getProperty("batchNumber") == null ? 0 : parseInt(scriptProperties.getProperty("batchNumber"));
  var processedBatches = scriptProperties.getProperty("processedBatches") == null ? 0 : parseInt(scriptProperties.getProperty("processedBatches"));

  console.log(`Total: ${reqs.length}.\n${batches.length} batches.`)

  if (processedBatches >= (batches.length - 1)) {
    console.log('All data has been processed already.');
  } else {
    //Start from the very last batch that stopped that needs to be processed.
    for (let i = startingBatch; i < batches.length; i++) {
      console.log(`Processing batch index #${parseInt(i)}`);
      try {
        var responses = UrlFetchApp.fetchAll(batches[i]);
        bucket.push(responses);
        console.log(`Fetched batches, pushed responses`);
        //Remove previous batch index number
        scriptProperties.deleteProperty("processedBatches");
        console.log(`Removed previous batch index number`);
        //Store latest successful batch index number
        scriptProperties.setProperty("processedBatches", parseInt(i));
        console.log(`Stored latest successful batch index number`);
      }
      //Catch the last batch index number where it stopped due to URL fetch exception
      catch (e) {
        console.log(`Catching the last batch index number`);
        //Remove the old batch number to be replaced with new batch number.
        scriptProperties.deleteProperty("batchNumber");
        console.log(`Removed old batch number`);
        //Remember the last batch that encountered an error, to be processed again in the next call.
        scriptProperties.setProperty("batchNumber", parseInt(i));
        console.log(`Remember the last batch number`);
        console.log(`Batch index #${parseInt(i)} stopped`);
        break;
      }
    }

    const initialRes = [].concat.apply([], bucket);

    var temp = initialRes.reduce((ar, r) => {
      if (r.getResponseCode() == 200) {
        var { id, firstName, lastName, fullName, displayName, shortName, weight, height, position, dateOfBirth, hand, jersey, debutYear, birthPlace, experience, active } = JSON.parse(r.getContentText());
          var { abbreviation = null } = position || {};
          var { displayValue = null } = hand || {};
          var { city = null, state = null, country = null } = birthPlace || {};
          var { years = null } = experience || {};
        ar.push([id, firstName, lastName, fullName, displayName, shortName, weight, height, abbreviation, dateOfBirth, displayValue, jersey, debutYear, city, state, country, years, active]);
      }
      return ar;
    }, []);
    var res = [...temp];

    //Add table headers
    exportSheet.getLastRow() == 0 && exportSheet.appendRow(['IDs', 'First Name', 'Last Name', 'Full Name', 'Display Name', 'Short Name', 'Weight', 'Height', 'Position', 'DOB', 'Hand', 'Jersey', 'Debut Year', 'City', 'State', 'Country', 'Years', 'Active']);

    //Add table data
    var result = () => {
      return temp.length != 0 && exportSheet.getRange(exportSheet.getLastRow() + 1, 1, res.length, res[0].length).setValues(res);
    }
    result() && console.log(`Processed: ${res.length}.`);
  }
}

//Function to chunk the request data based on batch sizes
function batchArray(arr, batchSize) {
  var batches = [];

  for (var i = 0; i < arr.length; i += batchSize) {
    batches.push(arr.slice(i, i + batchSize));
  }

  return batches;
}

//Optional function to check the latest processed batch
function checkBatch() {
 console.log(`${parseInt(scriptProperties.getProperty("processedBatches"))+1} batches have been processed.`);
}
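As an aside on the parsing step: the one change from the older version is the `|| {}` guards before destructuring, so a record with a missing sub-object (e.g. no birthPlace) yields nulls instead of throwing. A minimal sketch of that pattern, with a made-up `flatten` helper that is not part of the actual script:

```javascript
// Hypothetical helper illustrating nested destructuring with "|| {}" guards,
// the same pattern used in the reduce above.
function flatten(record) {
  const { id, position, birthPlace } = record;
  const { abbreviation = null } = position || {};   // safe even if position is missing
  const { city = null, state = null, country = null } = birthPlace || {};
  return [id, abbreviation, city, state, country];
}
```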

In my searches I hadn't found great examples of importing API data to begin with, and I still haven't found much of anything that tackles this inconsistency when pairing "batching" with try/catch. So I now come here to ask for help. I have attached a test sheet in case it makes the issue easier to find, and I appreciate any help I can get! If any more details are needed or desired, please let me know.

Test Sheet: https://docs.google.com/spreadsheets/d/1yb-AeTTv7Z5pJNsc13s90bmT3uIOHSovswVzLwEQVkM/edit?usp=sharing
