Import Data from Excel to Dataverse/SharePoint List using Power Automate
One of the most common requirements in any business use case is to read an Excel file received from an external system via email or uploaded directly by a user. In the Power Platform, this can be implemented in multiple ways.
Dataflows are a great way to transform and import data: an Excel file can be configured to be imported into Dataverse on a recurring schedule. However, only one fixed file can be configured.
Excel Online (Business) Connector
Using the Excel connector in Power Automate is a great way to loop through dynamic Excel files in a recurring process, but it comes with a set of limitations.
Limitations of the Excel connector
The maximum file size is 25 MB, and only .xlsx files are supported by the connector.
Only tables can be read from Excel, so any data to be extracted must be placed inside a table.
The Excel file is locked for up to 6 minutes while it is being used by the connector.
The Excel connector can read files from SharePoint or OneDrive. In our example, we provide an app for users to upload files to SharePoint; alternatively, users can upload files to SharePoint directly.
An attachment control can be added from a SharePoint-generated form. To pass the Excel content to the workflow, it must be converted to a base64 string using an Image control and the JSON function.
On the Button click:
// Serialize the image (attachment) to a data URI with embedded base64 content
Set(ExcelFile, JSON(FileImageHide.Image, JSONFormat.IncludeBinaryData));
// Strip the quotes and the "data:...;base64," prefix to keep only the base64 payload
Set(FileBase64, Mid(ExcelFile, Find(",", ExcelFile) + 1, Len(ExcelFile) - Find(",", ExcelFile) - 1));
// Call the UploadFile flow with the content and a prefixed file name, capturing the SharePoint link
Set(VarFileLink, UploadFile.Run(FileBase64, Concatenate(prefix, Last(attUpload.Attachments).Name)).sharepointlink)
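On the flow side, the UploadFile flow (a Power Apps-triggered flow) has to turn the base64 string back into binary before saving it. A minimal sketch of a SharePoint "Create file" action is shown below; the folder path and trigger input names are assumptions for illustration, not the article's actual values:

```
Site Address: <your SharePoint site URL>
Folder Path:  /Shared Documents/Uploads
File Name:    @{triggerBody()['FileName']}
File Content: @{base64ToBinary(triggerBody()['FileContent'])}
```

The base64ToBinary() expression undoes the base64 encoding performed in the app before the file is written to the document library.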
File Upload Workflow:
When a file is created in the SharePoint document library, the main workflow runs; it reads the Excel rows and pushes them to Dataverse.
Initializing the required variables
Remember the limitation: only tables can be read from Excel. Either the user sends the file with the data already encapsulated in a table, or, as in our case, where the report's row count varies each month, we create a table covering the whole sheet.
We provide a table range to create a table for the sheet, along with the column names to be read from the Excel file.
Once all the records are created, the “boolContinue” variable is set to true to stop the Do Until loop.
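The overall loop that the following sections walk through can be sketched as pseudocode (variable and action names follow the article where given; the rest are illustrative):

```
Initialize variable boolContinue (Boolean) = false
Initialize variable intCount     (Integer) = 0   // used as the Skip Count

Do Until boolContinue = true
    List rows present in a table   // Skip Count = intCount, pagination threshold = 50,000
    Filter array                   // drop blank rows (S.No is empty)
    Apply to each filtered row:
        Add a new row              // create the Dataverse record
    Increment intCount by 50,000
    If fewer than 50,000 rows were returned:
        Set boolContinue = true    // last page reached, loop ends
```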
By default, the “List rows present in a table” action returns only 256 rows. We need to enable pagination and raise the threshold to read more rows in each iteration. In our case, we set it to 50,000 because the report returns more than 100,000 rows each month.
The “List Rows” action supports filtering tables, but it did not work consistently when we tried it, so we used the “Filter array” action to filter the returned rows instead.
Skip Count is used to skip rows: initially it is 0, and on each iteration it is incremented by 50,000 so that rows already imported are skipped.
Since we are creating a table for the whole sheet, we need to eliminate the blank rows in it, so we filter on the first column (the S.No column) being not equal to Blank().
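In advanced mode, the Filter array condition can be written as an expression; the column name “S.No” below is an assumption that must match the table's actual header:

```
@not(empty(item()?['S.No']))
```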
Because pagination is set to 50,000, a result of fewer than 50,000 rows means this is the last iteration, and the “boolContinue” variable is set to true.
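A Condition action can detect the last page by comparing the returned row count with the page size. The action name in the expression below is an assumption; adjust it to match your flow:

```
@less(length(body('List_rows_present_in_a_table')?['value']), 50000)
```

When this evaluates to true, a “Set variable” action sets boolContinue to true and the Do Until loop exits.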
The rows returned from the Filter array action are then passed to an “Apply to each” action. For each row, the “Add a new row” action creates a record in the Dataverse table.
Once the “Apply to Each” action completes, the intCount variable is incremented by 50,000. Since the workflow deals with a large amount of data, Concurrency Control should be turned on for “Apply to Each” and set to the maximum degree of parallelism; otherwise the workflow can run for a whole day and may time out on large data sets.
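If you prefer an expression over the “Increment variable” action, the new Skip Count value can be computed as:

```
@add(variables('intCount'), 50000)
```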
The user should be notified once the process is completed.
Post out your questions in the comment section. Happy Building 🙂