Azure Functions and Conversion of Base64-Encoded Files from Salesforce

Mukul Mahawariya
Aug 10, 2020

To move a file from one cloud platform to another, you need a connection between them.

To understand these connections, I have written a blog explaining the connection between Azure and Salesforce and the related terminology, such as Blob Storage, Datasets, and Linked Services, followed by how to connect Salesforce to Azure Blob Storage, fetch data from Salesforce, and store it in a blob container. Here is the link to that blog: “Connecting Salesforce and Azure Blob Storage”.

Salesforce stores files in base64-encoded format in the ContentVersion object, so when you retrieve the files, their content is not human-readable. Therefore, we need custom logic to convert each object in the exported JSON file back into its original, human-readable format.
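To make this concrete, here is a simplified sketch of what one record in such an export might look like. The field names, VersionData and PathOnClient, come from the ContentVersion object; the values are purely illustrative:

[
  {
    "PathOnClient": "invoice.pdf",
    "VersionData": "JVBERi0xLjQK..."
  }
]

VersionData holds the base64-encoded file body, and PathOnClient holds the original file name with its extension.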

Let’s understand how we can effectively process this with the help of Azure Functions:

Azure Functions

Azure Functions is a serverless compute service that lets you run event-triggered code without having to explicitly provision or manage infrastructure. In simple words, an Azure Function is a small piece of code that runs in response to an event, which makes it well suited to custom processing tasks like this one.

Creating Azure Functions -

  1. First, you need to create a resource (follow this link). After creating the resource, click on “Compute”. If you do not find Compute, click “Add”, select “Function App”, fill in the details, and select Java as the runtime stack.
  2. You cannot develop Java functions in the Azure portal, so you need to install Visual Studio Code. Click here and select Java to set up Visual Studio Code, then specify the language and project name as you prefer (or scaffold the project from the terminal, as sketched below).
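If you prefer the terminal, the same kind of project can be scaffolded with Microsoft’s Maven archetype for Azure Functions; Maven will then prompt you for the groupId, artifactId, and other project details:

mvn archetype:generate -DarchetypeGroupId=com.microsoft.azure -DarchetypeArtifactId=azure-functions-archetype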

Once the project is created, the next step is to create a BlobTrigger function by clicking the thunderbolt icon in the Azure extension window.

Specify the following details:

1. The backup file path should be in the following format:

"path":<Container name>/<folder name if any>/{file-name}.json

2. Connection: “AzureWebJobsStorage”

3. Once created, verify the function.json.

The function.json file contains the blob path pattern and the name of the connection string, AzureWebJobsStorage.
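For reference, the generated function.json should look roughly like this (for Java projects this file is generated from the annotations at build time; container and folder names are placeholders):

{
  "bindings": [
    {
      "type": "blobTrigger",
      "direction": "in",
      "name": "content",
      "dataType": "binary",
      "path": "<Container name>/<folder name if any>/{file-name}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}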

AzureWebJobsStorage

The Azure Functions runtime uses this storage account connection string for all functions except for HTTP triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.

Now, in your project folder, open local.settings.json and specify the details of your storage account in AzureWebJobsStorage.

It should contain the following settings:

  1. DefaultEndpointsProtocol
  2. AccountName
  3. AccountKey
  4. BlobEndpoint
  5. QueueEndpoint
  6. TableEndpoint
  7. FileEndpoint

For example:

"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>;BlobEndpoint=<BlobServiceEndpoint>;QueueEndpoint=<QueueServiceEndpoint>;TableEndpoint=<TableServiceEndpoint>;FileEndpoint=<FileServiceEndpoint>"

Specify the primary endpoint for each of the endpoints. You can find the account key under “Access keys” and the endpoints under the properties of the storage account.
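Putting it together, the whole file should look something like this (all values are placeholders; FUNCTIONS_WORKER_RUNTIME is set by the project template):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>;BlobEndpoint=<BlobServiceEndpoint>;QueueEndpoint=<QueueServiceEndpoint>;TableEndpoint=<TableServiceEndpoint>;FileEndpoint=<FileServiceEndpoint>",
    "FUNCTIONS_WORKER_RUNTIME": "java"
  }
}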

Now it’s time to code. Open the Explorer, find your function under src\main\java\com\function, open it, and perform the following steps:

  1. Check the @BlobTrigger annotation; it should contain the following details:
@BlobTrigger(name = "content", dataType = "binary", path = "<Container Name>/<Folder Name if any>/{name}.json", connection = "AzureWebJobsStorage") byte[] content,

The content of the file is passed in as byte[] content.
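For context, the complete generated handler looks roughly like this; the function name and log message below are illustrative:

@FunctionName("BlobTriggerJava")
public void run(
    @BlobTrigger(name = "content", dataType = "binary", path = "<Container Name>/<Folder Name if any>/{name}.json", connection = "AzureWebJobsStorage") byte[] content,
    @BindingName("name") String name,
    final ExecutionContext context
) {
    // The runtime invokes this method whenever a matching .json blob appears in the container.
    context.getLogger().info("Processing blob: " + name + " (" + content.length + " bytes)");
}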

2. The project you have created is a Maven project, so you need dependencies for reading the JSON file and for creating separate files from the objects in the JSON. Add the dependencies below to your pom.xml.

<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-storage-blob</artifactId>
  <version>12.5.0</version>
</dependency>
<dependency>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-storage</artifactId>
  <version>4.0.0</version>
</dependency>
<dependency>
  <groupId>commons-io</groupId>
  <artifactId>commons-io</artifactId>
  <version>2.6</version>
</dependency>
<dependency>
  <groupId>com.googlecode.json-simple</groupId>
  <artifactId>json-simple</artifactId>
  <version>1.1.1</version>
</dependency>
<dependency>
  <groupId>commons-codec</groupId>
  <artifactId>commons-codec</artifactId>
  <version>20041127.091804</version>
</dependency>
<dependency>
  <groupId>com.microsoft.azure.functions</groupId>
  <artifactId>azure-functions-java-library</artifactId>
</dependency>

3. Import the following packages

import com.microsoft.azure.functions.annotation.*;
import com.microsoft.azure.functions.*;
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.*;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.io.*;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;

4. Create a JSON parser

JSONParser jsonParser = new JSONParser();

5. Create a connection to the storage account and container using the CloudStorage classes of the com.microsoft.azure.storage package.

final String storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>;BlobEndpoint=<BlobServiceEndpoint>;QueueEndpoint=<QueueServiceEndpoint>;TableEndpoint=<TableServiceEndpoint>;FileEndpoint=<FileServiceEndpoint>";
CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
CloudBlobClient blobClient = account.createCloudBlobClient();
CloudBlobContainer container = blobClient.getContainerReference("<container name>");

6. Now you need to convert the byte array content to a string and parse the JSON array.

String byteContent = new String(content, StandardCharsets.UTF_8);
// Strip unwanted characters before parsing; adjust the pattern to match any stray characters in your export
String standard = byteContent.replaceAll("", "");
InputStreamReader inputStreamReader = new InputStreamReader(IOUtils.toInputStream(standard, StandardCharsets.UTF_8));
JSONArray versionDataList = (JSONArray) jsonParser.parse(inputStreamReader);

7. Iterate over the obtained JSON array

Iterator<JSONObject> iterator = versionDataList.iterator();
JSONObject iterate;
String fileData, fileName;
CloudBlockBlob blob;
while (iterator.hasNext()) {
    iterate = iterator.next();
    fileData = iterate.get("VersionData").toString(); // base64-encoded file content
    fileName = iterate.get("PathOnClient").toString(); // original file name and extension
    blob = container.getBlockBlobReference(fileName);
    byte[] byteArray = Base64.decodeBase64(fileData.getBytes(StandardCharsets.UTF_8));
    blob.upload(new ByteArrayInputStream(byteArray), -1); // -1 = content length unknown
}

8. Wrap the code from steps 5 to 7 in a try-catch block, since CloudStorageAccount.parse, the JSON parser, and blob.upload all throw checked exceptions.
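A minimal sketch of that structure; the log message is illustrative:

try {
    // Steps 5-7: connect to storage, parse the JSON array, decode and upload each file
} catch (Exception e) {
    context.getLogger().severe("File conversion failed: " + e.getMessage());
}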

For more information on the CloudStorage classes, click here.

Now, deploy the function through the Azure extension.

Once the function is deployed, go back to the Azure portal and navigate to:

Function App > Functions > your function > Code + Test.

  1. Click on Test/Run, specify the path of the JSON file that was retrieved from Salesforce, and click Run.
  2. Go back to the container, and there you can see all the converted files.
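If you prefer the command line, you can also verify the output with the Azure CLI; the container and account names are placeholders:

az storage blob list --container-name <container name> --account-name <name> --output table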

Conclusion -

The above steps help you convert the encoded files from Salesforce back to their original formats.

  • Files in Salesforce are stored in base64-encoded format.
  • The backup of files from Salesforce is stored in a JSON file in Azure, which is not human-readable.
  • Using an Azure Function, the JSON objects in the backup file are converted back to their original formats.

After reading this blog, I hope you have a better understanding of Azure Functions and Salesforce files, the key points to keep in mind, and the overall process. If you found this blog helpful, share it with your friends or colleagues. Thanks for reading.


Mukul Mahawariya

4x Salesforce Certified | Trailhead Ranger | Salesforce Enthusiast