Create custom imports

  • 09/06/2022
  • Jaime Requena

Introduction

This article will detail how you can create your own custom import tool using custom lists, custom records, SuiteScript, and a library called Papa Parse.

Custom Lists

We will create two custom lists:

  1. CSV Import Type
    • One value for now – “Sales Order” – This is the type of CSV import being run, and you can add values to this list as you create new custom imports.
  2. CSV Import Type Status
    • Three values – “Ready for Processing”, “Errors Occurred While Processing”, “Fully Processed”

Custom Records

We will create two custom records:

  1. CSV Import Job
    • Fields:
      • CSV Import Type – Type List/Record of CSV Import Type (from above).
      • CSV File – Type Document
      • Status – Type List/Record of CSV Import Type Status – Should default to “Ready for Processing”
  2. CSV Import Record – Child record of CSV Import Job
    • Fields:
      • CSV Import Job – Type List/Record of CSV Import Job (mark “Record is Parent”)
      • Payload – Type Text Area – This will be the data we will use to import the record.  It is in JSON format.
      • Transaction – Type List/Record Transaction – This will be the resulting record that is created.  This could be another type of record and not necessarily a transaction.
      • Error – Text Area – If any errors occur while creating the record, they will be logged here.
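To make the Payload field concrete, here is a plain-JavaScript sketch (outside of SuiteScript) of how one CSV row with a header row becomes the JSON string stored in Payload. The column names here are invented for illustration:

```javascript
// Hypothetical CSV header row and one data row (column names are examples).
const headers = ["Customer", "Item", "Quantity"];
const row = ["ACME Corp", "Widget-001", "5"];

// Build the payload object: header name -> cell value, which is then
// stringified and stored in the Payload field of a CSV Import Record.
const payload = {};
headers.forEach((name, i) => { payload[name] = row[i]; });

const payloadJson = JSON.stringify(payload);
console.log(payloadJson);
// {"Customer":"ACME Corp","Item":"Widget-001","Quantity":"5"}
```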

 

SuiteScript

Two scripts will process the CSV file: one creates the custom records, and the other processes them. A user event script is also needed to update the CSV Import Job record.

Create Custom Records Script

The first script will look for CSV files in a directory and then create the custom records.  It uses a free library called Papa Parse (https://www.papaparse.com/) to parse the CSV file.

Script summary:

  1. Get all files to be processed by searching for all files in a folder designated by a script parameter.
  2. For each file:
    • Verify the file hasn’t already been processed by searching for CSV Import Job records that already have this file attached.  If it has, log an error and skip the file.
    • Create a CSV Import Job record.
    • Parse the CSV file using Papa Parse.
    • Create a CSV Import Record for each line in the CSV file.
    • Move the file to the Processed folder.
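As a rough, plain-JavaScript illustration of what Papa Parse does for us with `header: true` and `skipEmptyLines: true`, here is a naive version of the parse step. It does NOT handle quoted fields or embedded commas, which is exactly why the script uses Papa Parse instead:

```javascript
// Naive CSV parse: first line is the header row, each remaining
// non-empty line becomes an object keyed by the header names.
// (Illustration only -- no support for quoting or escaped delimiters.)
function parseCsv(text) {
  const lines = text.split(/\r?\n/).filter((line) => line.trim() !== "");
  const fields = lines[0].split(",");
  const data = lines.slice(1).map((line) => {
    const cells = line.split(",");
    const row = {};
    fields.forEach((name, i) => { row[name] = cells[i]; });
    return row;
  });
  // Mirror Papa Parse's result shape: { data, meta: { fields } }
  return { data, meta: { fields } };
}

const result = parseCsv("Customer,Item,Quantity\nACME Corp,Widget-001,5\n");
console.log(result.data[0].Item); // Widget-001
```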

 

The code:

/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 * @NModuleScope SameAccount
 */
//------------------------------------------------------------------
//Script: 		CreateCSVJob_MR.js
//------------------------------------------------------------------

define([ 'N/runtime', 'N/record', 'N/search', 'N/file', './papaparse'],
(runtime, record, search, file, papa) => {

    const getInputData = (context) => {
        log.debug('==START==','==START==');

        const scriptObj = runtime.getCurrentScript();
        const csvFolderId = scriptObj.getParameter({name: "custscript_csv_folder_id"});
        const processedFolderId = scriptObj.getParameter({name: "custscript_processed_csv_folder_id"});

        let filters = [];
        filters.push(["internalidnumber","equalto", csvFolderId]);

        const folderSearchObj = search.create({
            type: "folder",
            filters: filters,
            columns: [
                search.createColumn({
                    name: "internalid",
                    join: "file",
                    label: "Internal ID"
                })
            ]
        });
        let fileIds = [];
        folderSearchObj.run().each(function(result){
            const fileId = result.getValue({
                name: "internalid",
                join: "file",
                label: "Internal ID"
            });
            fileIds.push(fileId);
            return true;
        });

        // get all lines in the file and create array
        let allLines = [];
        for (let y = 0; y < fileIds.length; y++) {
            let fileId = fileIds[y];
            if (doesImportAlreadyExist(fileId)) {
                log.error('The file has already been processed.', 'File ID : ' + fileId);
                continue; // skip this file, but keep processing the rest
            }

            // create CSV Job record
            let csvJobRecord = record.create({
                type: 'customrecord_csv_import_job',
                isDynamic: true
            });
            csvJobRecord.setValue('custrecord_csv_import_type', 1);  // for instance, sales order
            csvJobRecord.setValue('custrecord_csv_import_file', fileId);
            csvJobRecord.setValue('custrecord_csv_import_job_status', 1); // ready for processing
            let csvJobRecordId = csvJobRecord.save();

            let myFile = file.load({
                id: fileId
            });

            let fileContents = myFile.getContents();

            let config =
            {
                delimiter: "",	// auto-detect
                newline: "",	// auto-detect
                quoteChar: '"',
                escapeChar: '"',
                header: true,
                dynamicTyping: false,
                preview: 0,
                encoding: "",
                worker: false,
                comments: false,
                step: undefined,
                complete: undefined,
                error: undefined,
                download: false,
                skipEmptyLines: true,
                chunk: undefined,
                fastMode: undefined,
                beforeFirstChunk: undefined,
                withCredentials: undefined
            };

            let records = papa.parse(fileContents, config);

            for(let i = 0; i < records.data.length; i++) {
                let payload = {};
                for(let x = 0; x < records.meta.fields.length; x++) {
                    payload[records.meta.fields[x]] = records.data[i][records.meta.fields[x]];
                }
                allLines.push({
                    payload: JSON.stringify(payload),
                    csvJobRecordId: csvJobRecordId
                });
            }

            // move the file to the processed folder
            myFile.folder = processedFolderId;
            myFile.save();
        }
        return allLines;
    }

    const doesImportAlreadyExist = (fileId) => {
        const jobSearchObj = search.create({
            type: "customrecord_csv_import_job",
            filters: [
                ["custrecord_csv_import_file", "anyof", fileId]
            ],
            columns: [
                'internalid'
            ]
        });
        const searchResultCount = jobSearchObj.runPaged().count;
        return searchResultCount > 0;
    }

    const map = (context) => {
        log.debug('map context', context);
        try {
            const value = JSON.parse(context.value);
            const csvImportRecord = record.create({
                type: 'customrecord_csv_import_record',
                isDynamic: true
            });
            csvImportRecord.setValue('custrecord_csv_import_record_job', value.csvJobRecordId);
            csvImportRecord.setValue('custrecord_csv_import_record_payload', value.payload);
            const recordId = csvImportRecord.save();
            log.debug('Created csv import record with id', recordId);
        }
        catch (e) {
            log.error('Error', e.message + ' : ' + JSON.stringify(e));
        }
    }

    const summarize = (summary) => {
        log.debug('==END==','==END==');
    }

    return {
        getInputData: getInputData,
        map: map,
        summarize: summarize
    };

});

Process Custom Records Script

The second script will get all the CSV Import Records which need to be processed and will process them.

Script summary:

  1. Get all CSV Import Records to process.
  2. Using the payload field, create a new transaction (or another record type, depending on your need) using any custom logic you desire.
  3. Update the CSV Import Record:
    • If Successful, set the Transaction field.
    • If Error, set the Error field.

The code:

 

/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 * @NModuleScope SameAccount
 */
//------------------------------------------------------------------
//Script: 		ProcessCSVRecords_MR.js
//------------------------------------------------------------------

define([ 'N/runtime', 'N/record', 'N/search', 'N/file'],
(runtime, record, search, file) => {

    const getInputData = (context) => {
        log.debug('==START==','==START==');

        let filters = [];
        filters.push(["custrecord_csv_import_record_job.custrecord_csv_import_job_status","anyof",["1","2"]],  // ready for processing or error
        "AND",
        ["custrecord_csv_import_record_job.custrecord_csv_import_type","anyof","1"], // for instance - 'sales order'
        "AND",
        ["custrecord_csv_import_record_transaction.internalidnumber","isempty",""]);

        const csvSearchObj = search.create({
            type: "customrecord_csv_import_record",
            filters: filters,
            columns:
            [
                search.createColumn({name: "custrecord_csv_import_record_payload", label: "Payload"}),
                search.createColumn({name: "custrecord_csv_import_record_job", label: "Job"}),
            ]
        });

        return csvSearchObj;
    }

    const reduce = (context) => {
        log.debug('reduce context', context);
        let values = {};
		let payload = null;
        try {
            log.debug('search result', context.values[0]);
            // context.values[0] is the serialized search result; the payload
            // JSON is nested inside its "values" property, so it needs a
            // second parse.
            const result = JSON.parse(context.values[0]);
            payload = JSON.parse(result.values.custrecord_csv_import_record_payload);

            // ADD CUSTOM LOGIC HERE TO CREATE THE RECORD
            // create your own function here instead of createSalesOrder
            const salesOrderId = createSalesOrder(payload);
            values.custrecord_csv_import_record_transaction = salesOrderId;
            values.custrecord_csv_import_record_error = '';
        }
        catch(e) {
            const message = e.message + ' : ' + JSON.stringify(e);
            log.error('Error', message);
            values.custrecord_csv_import_record_error = message;
        }

        try {
			record.submitFields({
				type: 'customrecord_csv_import_record',
				id: context.key,
				values: values,
				options: {
					enableSourcing: false,
					ignoreMandatoryFields: true
				}
            });
        }
        catch(e) {
            log.error('Failed to update CSV Import record with id ' + context.key, JSON.stringify(e));
        }
    }

    const summarize = (summary) => {
        log.debug('==END==','==END==');
    }

    return {
        getInputData: getInputData,
        reduce: reduce,
        summarize: summarize
    };

});
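One detail worth calling out: when getInputData returns a search, each reduce invocation receives the serialized search result, not the payload itself, so unwrapping it takes two parse steps. A plain-JavaScript sketch of the shape (the payload contents and IDs here are made up):

```javascript
// Hypothetical shape of context.values[0] when getInputData returns a
// search: a JSON string of the search result, with column values nested
// under "values".
const serialized = JSON.stringify({
  recordType: "customrecord_csv_import_record",
  id: "42",
  values: {
    custrecord_csv_import_record_payload: '{"Customer":"ACME Corp","Quantity":"5"}',
    custrecord_csv_import_record_job: [{ value: "7", text: "CSV Import Job 7" }]
  }
});

// Two-step unwrap: parse the search result first, then parse the
// payload field it contains.
const result = JSON.parse(serialized);
const payload = JSON.parse(result.values.custrecord_csv_import_record_payload);
console.log(payload.Customer); // ACME Corp
```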

CSV Import Job Custom Record User Event Script

Here is the pseudo-code:

  1.  On Before Submit:
    • Cycle through all CSV Import Records pertaining to this CSV Import Job.  If all have transaction records assigned, set the status to “Fully Processed”.  If any record has an error, set the status to “Errors Occurred While Processing”.
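The before-submit decision can be sketched as a pure function in plain JavaScript. The status internal IDs below assume the list order given earlier (1 = Ready for Processing, 2 = Errors Occurred While Processing, 3 = Fully Processed) -- verify them in your own account:

```javascript
// Given the child CSV Import Records for a job (each with a
// transactionId and an error string), decide the job's status ID.
function computeJobStatus(importRecords) {
  if (importRecords.some((r) => r.error)) return 2;          // errors occurred
  if (importRecords.every((r) => r.transactionId)) return 3; // fully processed
  return 1; // some records still waiting: leave as Ready for Processing
}

console.log(computeJobStatus([
  { transactionId: 101, error: "" },
  { transactionId: 102, error: "" }
])); // 3
```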

 

Conclusion

The solution outlined here is a starting point.  I hope I have given you a good base to start from to create your own custom import.  If you need any help creating a custom import, please contact Suite Tooth Consulting.  We would be happy to help you!

Jaime Requena, Chief Executive Officer

Jaime Requena is a seasoned NetSuite Consultant and Solutions Architect, known for delivering WHITE GLOVE service to businesses. With 15+ years of experience and 3x certifications in ERP, Developer, and Admin, Jaime specializes in highly customized NetSuite accounts, transforming operations for 200+ satisfied customers all across the globe.
