
28 December, 2020

Mocking data in Azure Logic Apps

In any distributed system development, various pieces of an application get developed by multiple teams spread across geographies. This is fine until you hit a roadblock where you cannot proceed until another team finishes their work and provides the data you need. In today's microservices design model this is even more relevant, where you need a dependent service to return data so that you can complete your own work.

This type of scenario is inevitable; you cannot avoid it and have to live with it. So what to do? 😒 

Mocking!! Yes, you are right. Why not mock the data you need to finish your work, and later do the plumbing against the actual service once it's up and ready to return actual data? This is a great idea: teams can work on their own schedules without waiting on or running after each other.

In Azure, when it comes to mocking, developers often fall back on API Management and implement it with the help of APIM policies. But why go for such a costly option when the need is only to mock data, and you can use your other friend, Logic Apps, to achieve the same? Let's check it out 👍

Scenario:

We need to get the employee details from the database service (being developed by the back-end team) by passing the employee code. But the back-end service is not ready yet, and hence we need to mock the employee details.

Let's get started.


Define an HTTP endpoint to which we will send the employee code, as below


Add a new action and search for the HTTP action


Fill in the details as per your requirement. In my case it's a GET call with the following details.
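Since the back-end service does not exist yet, the exact URI hardly matters. For illustration only, the action's code view could look something like this (the URI and the employeeCode property are hypothetical placeholders, not the real back-end contract):

"HTTP": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://employee-db.example.com/api/employees/@{triggerBody()?['employeeCode']}"
  },
  "runAfter": {}
}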


As mentioned, the URI points to the service being developed by the back-end team, which is not yet ready. So, in this action we have to inject our mocking logic. 

Click the "..." eclipse at the top right section of the action and select "Static result (Preview)" from the context menu. Yeah it's preview but works just fine. Go ahead!

Enable the static result to get the following window. The default selections are good to go for our mocking purpose.


But strangely there's no place to provide the static JSON response content here. Don't worry! It's there, but we need to tweak things a bit 😉

Hit the JSON mode icon to switch to the JSON view, as below


Now, here in the outputs section, paste your return JSON inside a body tag as below. I just need the employee's first_name, last_name, dept and rank from the database service. So I have mocked that piece, and since the back-end service is not yet ready I can continue with my development 😃
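For reference, the outputs section ends up looking roughly like this in JSON view (the employee values here are made up for illustration; "status" and "statusCode" are the defaults the designer generates):

{
  "status": "Succeeded",
  "outputs": {
    "statusCode": "OK",
    "headers": {
      "Content-Type": "application/json"
    },
    "body": {
      "first_name": "John",
      "last_name": "Doe",
      "dept": "Engineering",
      "rank": "Senior"
    }
  }
}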


Hit Done to return to the workflow. Now you will find a new icon added to the action, which denotes that this action has a static result configured.


As a last step, let's add a "Response" action to the workflow and return the employee details to the caller. 


Save your logic app.

Done! Let's take a bird's-eye view of our Logic App workflow. This will help you check your work if you are building along with this article.


Testing time! Copy the HTTP endpoint from the trigger (the first action) and paste it into any REST client to make the call. I have used the Postman REST client, as below. Hit Send and get your static result back.


That's it. You have successfully implemented mocking, and what a cool way to do so. 

Isn't that great! 

Congratulations on coming this far! I hope this article helps you explore this feature further.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.

14 December, 2020

Inject inline JavaScript code into your Azure Logic App workflow

JavaScript is web developers' all-time favourite. Writing small snippets of JS code into your application to implement a few cool pieces of functionality is very tempting, and I know one cannot think of any web application that does not have some JS flavour in one form or another 😀 

Wouldn't it be great if we could use our JavaScript knowledge, inject a JS snippet into an Azure Logic App workflow, and pull some magic out of it? I know you are thinking "Yessss! Why not?", so let's see how it can be done. 

[Please note that this article expects you to have a basic understanding of Azure Logic Apps and Functions. You can check the Microsoft Azure documentation for these.] 

Our scenario for this article:

We need an HTTP endpoint which will accept a temperature and do the Celsius or Fahrenheit conversion accordingly. If we send the temperature in Celsius then it will give us its Fahrenheit value, and vice versa.

In most cases I have seen developers quickly jump into writing an Azure Function to do this kind of simple stuff, but after going through this article I hope you will think twice.

Our JSON payload will be as below, which we will send while calling the Logic App endpoint. It just sends the temperature in Celsius and expects its respective Fahrenheit value.

{
   "temperature":"40",
   "metric":"C"
}

Now, the JavaScript snippet to do the conversion is as below. It just applies the formula and does the respective Celsius or Fahrenheit conversion after reading the temperature and metric from the HTTP request. If the metric comes in as "C" then the result will be its respective Fahrenheit value, and vice versa.

// Read the temperature and metric from the HTTP trigger body
var temp = parseFloat(workflowContext.trigger.outputs.body.temperature);
var metric = workflowContext.trigger.outputs.body.metric;
var cf;
if (metric == "C")
    cf = (temp * 9 / 5) + 32 + " degree Fahrenheit"; // Celsius to Fahrenheit
else if (metric == "F")
    cf = (temp - 32) * 5 / 9 + " degree Celsius";    // Fahrenheit to Celsius
else
    cf = "Invalid input";
return cf;

That's it! really 😲

Now let's create a Logic App and inject this piece of JavaScript. There is an "Inline Code" operation with an action "Execute JavaScript Code (preview)" in the Logic App workflow, which we will use to achieve our result. Yes, it's in preview, but it works just fine. Don't worry... you can safely use it 👍


Below is the full workflow, and by looking at it you can pretty well understand what it's doing. So simple! 
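In code view, the three steps boil down to something like the sketch below (trimmed for brevity; the action names and the exact shape of the Result reference may differ slightly in your app):

{
  "triggers": {
    "manual": { "type": "Request", "kind": "Http" }
  },
  "actions": {
    "Execute_JavaScript_Code": {
      "type": "JavaScriptCode",
      "inputs": { "code": "<the conversion snippet above>" },
      "runAfter": {}
    },
    "Response": {
      "type": "Response",
      "inputs": {
        "statusCode": 200,
        "body": "@body('Execute_JavaScript_Code')"
      },
      "runAfter": { "Execute_JavaScript_Code": [ "Succeeded" ] }
    }
  }
}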


Save your Logic App and copy the HTTP POST URL to test it. I will use the Postman REST client to test the endpoint, but you can use any REST client of your choice.

Test 1: Sending 20 degrees Celsius and getting its respective Fahrenheit value


Test 2: Sending 68 degrees Fahrenheit and getting its respective Celsius value
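If you are following along, the two calls should behave roughly like this:

Test 1 request body: {"temperature":"20","metric":"C"}   ->   response: 68 degree Fahrenheit
Test 2 request body: {"temperature":"68","metric":"F"}   ->   response: 20 degree Celsius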


Cool! The outputs are as expected. With a few lines of JS you are able to pull off the magic of temperature conversion, which would otherwise need an Azure Function.

Isn't that great! 

Great job, and congratulations on coming this far! I hope this article helps you explore this feature further.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.

30 November, 2020

Implementing Fan-Out/Fan-In Design Pattern using Azure Logic Apps

Azure Logic Apps are great for building system integrations. It doesn't matter whether your systems run on-premises or in any cloud; with this great iPaaS service you can bring them all together on a common platform very quickly. 
But did you ever think that you can also use this service to front multiple web services working in parallel to produce a desired output? 

Think of a situation below:
"You have three web services each destined to perform a special task and return back a result. Now this result from each service individually doesn't makes any sense. You need to collate all the results, do some data massaging on it and return back the end result to the caller.". Very common scenario nowadays. Right?

Yes, you can achieve this by
  • Calling each service one by one, storing the individual results, and then writing code to prepare the end result
  • Azure Durable Functions with an orchestrator function
  • Azure API Management with APIM policies
All are right answers! 
This article is not meant to discourage you from using any of the above, but rather to give you one more approach: achieving the same with Azure Logic Apps, which is worth a shot. Let's check it out.

[Please note that this article expects you to have a basic understanding of Azure Logic Apps and Functions. You can check the Microsoft Azure documentation for these. 👍]

I will use three very simple web services for this article. The services are nothing but Azure Functions that simply return an integer value to the caller (in this case the Logic App), where all the values get summed before the end result is flushed out.

I have used one .NET and two Node.js HttpTrigger function apps just to give it a multi-language web service flavour. The apps simply return 10, 20 and 30 respectively as their output.

.NET Function app - 01
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;

// run.csx: always returns the hard-coded value 10 to the caller
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{    
    log.LogInformation("C# HTTP trigger function processed a request.");
    return new OkObjectResult(10);
}
Node.js Function app - 02
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');    
    var responseMessage = 20;     // hard-coded result for this demo
    context.res = {
        body: responseMessage     // already a number, no parseInt needed
    };
}
Node.js Function app - 03
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');
    var responseMessage = 30;     // hard-coded result for this demo
    context.res = {
        body: responseMessage     // already a number, no parseInt needed
    };
}
We are done with our three web services hosted as Azure Function apps. Cool 😊

Now let's put a Logic App in front, which will act as an abstraction layer over all the web services and give us a single HTTP endpoint to consume. The Logic App will call all the web services in parallel (fan-out), collect all the results (fan-in), do a summation, and flush out the end result. So, in our case the end result would be 60, right?

Let's take a bird's eye view of our Logic App


The first three variables store the respective values returned by the three web services; the fourth and fifth hold the summation and the final result.
The three web service calls are fanned out in parallel, as there is no dependency between them; running them concurrently rather than one after another speeds things up. When all three branches complete, the flow fans back in to perform the data massaging (i.e. adding the three results), store the summation, and finally flush out the end result as a response to the caller.
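In the Logic App code view, the fan-out/fan-in shape is driven by the runAfter property: the three HTTP actions have an empty runAfter (so they all start right after the trigger and run in parallel), while the fan-in action lists all three as dependencies. A trimmed, illustrative sketch follows; the function URLs are placeholders, and where the article's workflow uses variables, a single Compose with add() is shown here just as a compact stand-in for the fan-in step:

{
  "actions": {
    "Call_Service_01": { "type": "Http", "inputs": { "method": "GET", "uri": "https://func01.example.net/api/svc" }, "runAfter": {} },
    "Call_Service_02": { "type": "Http", "inputs": { "method": "GET", "uri": "https://func02.example.net/api/svc" }, "runAfter": {} },
    "Call_Service_03": { "type": "Http", "inputs": { "method": "GET", "uri": "https://func03.example.net/api/svc" }, "runAfter": {} },
    "Sum_Results": {
      "type": "Compose",
      "inputs": "@add(add(body('Call_Service_01'), body('Call_Service_02')), body('Call_Service_03'))",
      "runAfter": {
        "Call_Service_01": [ "Succeeded" ],
        "Call_Service_02": [ "Succeeded" ],
        "Call_Service_03": [ "Succeeded" ]
      }
    }
  }
}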

Now our Logic App is in place and we are ready to test. Just copy the Logic App HTTP endpoint from the workflow and call it from Postman (or any other REST client). In a moment you will find the output of 60 flashing on your screen. Voila! It worked as expected, and we have successfully showcased a very useful design pattern with the help of an Azure Logic App. 

Great job, and congratulations on coming this far! I hope this article helps you explore this feature further.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.

29 September, 2020

Azure Logic App Integration: Transform XML to JSON

In enterprise integration, or iPaaS, a scenario that often comes up is converting an XML payload to JSON. You might be receiving XML data from one system but need to convert it to JSON before pushing it to another system. If both the source and destination systems understand the same format, it's fine, but that is not always the case. So, there's a challenge.

Azure Logic Apps have some cool powers, and we will leverage them in our integration scenario. Here our legacy source system can only send XML data, but our destination system only understands JSON. 

Prerequisites:

  • Azure Subscription (Free subscription will also do)
  • Basic XML, XSLT understanding
  • Basic JSON understanding
  • Azure Logic App basic Request/Response understanding
Let's get Started
  • Create a simple XML document (Vendor.xml). This is the payload our source system will send.
<?xml version="1.0" encoding="UTF-8"?>
<Vendor>
   <VendorNumber>VND1</VendorNumber>
   <VendorName>My Vendor</VendorName>
   <GroupId>10</GroupId>
   <Company>USRT</Company>
   <VendorCity>My City</VendorCity>
   <VendorContactPhone>1234567890</VendorContactPhone>
   <VendorContactEmail>myvendor@vendoremail.com</VendorContactEmail>
   <VendorCountryRgn>USA</VendorCountryRgn>
</Vendor>
  • But our destination system only understands JSON. Now we need to transform the above XML into JSON with the fields the destination expects. So after transformation the data should look like this
{
   "VendorNumber":"VND1",
   "VendorName":"My Vendor",
   "Company":"USRT",
   "VendorCity":"My City",
   "VendorContactPhone":"1234567890",
   "VendorContactEmail":"myvendor@vendoremail.com",
   "VendorCountryRgn":"USA"
}
  • Looks simple, but it's not that simple in a Logic App. There are a few marketplace connectors available, but they won't help much when it comes to customization; they are also not free and come with licensing and pricing model complexities. So why go that route when we have a free yet powerful friend at hand that can do this task quite comfortably and is a first-class citizen in Azure Logic Apps: XSLT transformation. 
Here goes our Vendor.xsl document. Save it on your machine, as you will need it later. It's quite simple and self-explanatory. XSLT has immense power, so you can imagine the level of control you will have in handling XML complexities.
<?xml version="1.0" encoding="UTF-8"?>

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:output omit-xml-declaration="yes" method="text" media-type="application/json"/>
    <xsl:template match="/">
        <xsl:text>{ </xsl:text>
        
        <xsl:text>"VendorNumber" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorNumber"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"VendorName" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorName"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"Company" : "</xsl:text>
        <xsl:value-of select="Vendor/Company"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"VendorCity" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorCity"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"VendorContactPhone" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorContactPhone"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"VendorContactEmail" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorContactEmail"/>
        <xsl:text>", </xsl:text>
        
        <xsl:text>"VendorCountryRgn" : "</xsl:text>
        <xsl:value-of select="Vendor/VendorCountryRgn"/>
        <xsl:text>" </xsl:text>
        
        <xsl:text> }</xsl:text>
    </xsl:template>
</xsl:stylesheet>
  • We are done with our basic ground work. Now it's time to step into Azure. 😃
  • Login to Azure portal 
  • We cannot apply this XSLT file straight away to our input XML in the Logic App. For that we need to create an "Integration Account". Let's do that. Search for "integration accounts" in the Azure search bar and select it.
  • Click +Add at the top and fill up the necessary details as per your choice. Create the account.
  • Once the integration account is created it will take you to the following landing page. There are lots of components available, but for the sake of simplicity I won't go into those, as they are not required for our case; they are meant for typical B2B scenarios, and if you are keen to learn more, Azure has good documentation around them. Here we only need the Maps component, so click on that.
  • Click +Add at the top and upload your Vendor.xsl file. This is the XSLT file we have created earlier. Click OK
  • Great. Now let's create the Logic App for some real action. Search for Logic App in the Azure search bar and select it.
  • Click +Add at the top and fill in the details as per your choice. One important point to remember here: make sure your Logic App location is the same as your integration account location. This is a requirement. Create your Logic App.
  • Now the most important part. We need to tell the Logic App to use the integration account so that it can pick up the XSLT transformation map from there. Go to Settings >> Workflow Settings and map your integration account. Click Save.
  • Click Logic App designer under Development Tools and choose the "When a HTTP request is received" trigger from the list of common triggers.
  • This will take you to the Logic App workflow designer with the above trigger as the first element in the workflow. Do nothing here and click "+ New step".
  • Search for "Transform XML" and select Transform XML from the list of actions.
  • For Content, select the Body from the dynamic content list. This is the HTTP request body which we are going to send as an XML payload later.
  • Choose the XSLT map from the "Map" dropdown. This is coming from your integration account where we have uploaded the Vendor.xsl file.
  • Now select the "Add new parameter" dropdown, select "Transform options", and set the value to "Apply XSLT output attributes". This is very important; otherwise you will get a runtime error: "Code": "InvalidXsltContent", "Message": "An error occurred while processing map. 'Token Text in state Start would result in an invalid XML document. Make sure that the ConformanceLevel setting is set to ConformanceLevel.Fragment or ConformanceLevel.Auto if you want to write an XML fragment. ". This just tells the runtime to use the output format defined in the XSLT file, which in our case is application/json. So make sure you do not miss this step 👍 (a code-view sketch of this action appears after this list)
  • Now, we do not have a real source or destination system for this demo, so we will return the transformed JSON to the caller to simulate the XML-to-JSON conversion. Click "+ New Step", search for "Response" and select the Response action
  • Select the "Transformed XML" from the Dynamic content selection window 
  • So, we have 3 steps in the workflow. Save the Logic App. This will generate the HTTP endpoint for us to test. Copy the HTTP POST URL and open any REST client of your choice. My favorite is Postman.
  • Paste the URL to make a POST call. In the body, copy-paste the XML data from the Vendor.xml file we created previously. Hit Send and voila!! We successfully get back a JSON response by sending an XML request. Isn't that great stuff 😚
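If you want to cross-check your workflow against the code view, the two actions should look roughly like this (the action names and the map name may differ in your app; this is an illustrative sketch, not the generated definition):

"Transform_XML": {
  "type": "Xslt",
  "inputs": {
    "content": "@triggerBody()",
    "integrationAccount": { "map": { "name": "Vendor" } },
    "transformOptions": "ApplyXsltOutputAttributes"
  },
  "runAfter": {}
},
"Response": {
  "type": "Response",
  "inputs": {
    "statusCode": 200,
    "body": "@body('Transform_XML')"
  },
  "runAfter": { "Transform_XML": [ "Succeeded" ] }
}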

Great job, and congratulations on coming this far! I hope this article helps you explore this feature further.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.

07 September, 2020

Creating ARM Templates with Azure Resource Group Project

Automation is the buzzword nowadays, and when it comes to the cloud it becomes even more inevitable. Creating various Azure resources and then logically grouping them inside a resource group is a very common task we perform while working in Azure. We can hit the Azure portal, create all the necessary resources with a few clicks, and get things running in no time. Cool!

This may be acceptable for your development environment, but what about other environments such as QA, staging or even production? Is it feasible to repeat the same task over and over in each environment and create exactly the same resources? What about human error? Manual processes are largely error-prone, and a slight hiccup can lead to a big, costly mistake. Moreover, you might not have permission to all the environments, and you might well forget the exact resource configurations a few months down the line. In short, a lot can go wrong when things are manual.

Hence automation. In this article I will walk you through the steps of creating a simple ARM template containing an Azure Storage resource in a Visual Studio 2017 solution and then deploying it to Azure with the help of PowerShell. Hope you all enjoy the journey!

Please note that I have purposefully left out some complex areas and haven't deep-dived into details in some places. This is just to make this article an easy read and a quick guide on where to start in creating and deploying ARM templates.

Prerequisites:

  • Microsoft Visual Studio 2017 with Azure SDK installed.
  • Azure Subscription (Free subscription of 30 days will also do)
  • Basic JSON understanding
  • Windows PowerShell (Command Line or ISE)

Let's get Started

  • Fire up your Visual Studio 2017
  • File > New > Project. Click on "Cloud" in the left panel and then select the "Azure Resource Group" project type. Give the project a name and location of your choice. Click OK
  • Select "Blank Template" from the list of templates as of now. We will add resources later. Click OK
  • A Resource Group project will be created as below.
Solution Explorer

The main entry point is azuredeploy.json. This is where we will define all the resources along with their respective configurations. Next is azuredeploy.parameters.json, which is used to override any parameter defined in the azuredeploy.json file; anything that needs to be customized on a per-item basis should be placed here. Deploy-AzureResourceGroup.ps1 is the PowerShell script required to deploy the resources.

We will only make use of azuredeploy.json in this demo and perform the other two steps a bit differently, to avoid complexity and aid understanding.

  • Double-Click on the azuredeploy.json file to open the code window.

There are four main sections in the JSON file.

parameters : Here we define all the parameters for a resource as key-value pairs instead of hard-coding values in the resource body.

variables : Optional. These are pieces of information the ARM template can work out on the fly, especially useful for deploying similar resources to different environments.

resources : All resource code blocks go into this section, following the same structure and pattern.

outputs : Optional. Used when linking other ARM templates with a master template, i.e. when creating linked templates. We will not use this in this demo.
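Putting these together, the blank template Visual Studio generates is essentially just these four (mostly empty) sections plus schema metadata, along the lines of:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [],
  "outputs": {}
}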

  • Once you open the code view of the azuredeploy.json file, Visual Studio 2017 will automatically open a JSON Outline window for easy navigation.

If you do not see this window, go to View > Other Windows > JSON Outline

  • Now our target is to create an Azure Storage resource here. Right-click the resources section inside the JSON Outline window and select "Add New Resource"
  • Scroll down to find the "Storage Account" resource. Give it a name of your choice. I gave demoStorage but it can be any name. Click "Add"
  • Visual Studio will add all the necessary boilerplate code required for this resource type and will populate all the required sections of the ARM template with some default values.
  • Now let's make some minor modifications. The storage account name must be globally unique, like many other Azure resource names. Visual Studio has tried to make it unique in its own way, but what if I want the storage name to start with "demo" as a prefix? Let's do this.

We will add one parameter named "demoStoragePrefix", as below, in the parameters section

"demoStoragePrefix": {
        "type": "string",
        "minLength": 1,
        "defaultValue": "demo"
      }

In the variables section, let's modify the demoStorageName variable to read the prefix from the parameter we just defined above.

"demoStorageName": "[concat(parameters('demoStoragePrefix'), uniqueString(resourceGroup().id))]"

Now also add an additional parameter named "metadata" in the parameters section. This is optional and most people overlook it, but I strongly recommend adding it to every template, as it makes the code more readable.

"metadata": {
      "type": "string",
      "defaultValue": "Creation of Storage Account"
    }
  • Now the modified JSON looks like this with the above changes in place.
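Only the sections we touched are shown below; the storage resource block and the other parameters Visual Studio generated remain unchanged:

{
  "parameters": {
    "demoStoragePrefix": {
      "type": "string",
      "minLength": 1,
      "defaultValue": "demo"
    },
    "metadata": {
      "type": "string",
      "defaultValue": "Creation of Storage Account"
    }
  },
  "variables": {
    "demoStorageName": "[concat(parameters('demoStoragePrefix'), uniqueString(resourceGroup().id))]"
  }
}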
  • Save the file and the solution. We are done with our ARM template. Now let's dive into the deployment part.

Open a PowerShell window (command line or ISE will do)

  • Connect to your Azure account. Execute the below command.
Connect-AzureRmAccount

This will prompt for your Azure username and password. Provide the details and log in. A successful login will show all your subscriptions. I have only one subscription, and hence it shows the details related to that.

Account          : XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
SubscriptionName : XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
SubscriptionId   : XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
TenantId         : XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Environment      : XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
  • Next, set some deployment variables which we will use in the later steps. Select these lines and execute them.
## Define Deployment Variables

   ## Your resource location. In my case I placed it in SE Asia
   $location = 'SouthEast Asia' 
   ## Resource Group Name
   $resourceGroupName = 'ARMdemo-simple-storage'
   ## Resource Group Deployment Name
   $resourceDeploymentName = 'ARMdemo-simple-storage-deployment'
   ## The path of the template file in your local drive. Your location folder path can be different
   $templatePath = $env:SystemDrive + '\' + 'Personal\Projects\DemoAzureRmStorageDeployment'
   ## The template filename. It will be same if you have taken the default filename which Visual Studio provides
   $templateFile = 'azuredeploy.json'
   ## The full template path along with the file
   $template = $templatePath + '\' + $templateFile

  • Create a resource group in Azure under which our storage resource will be placed. You might be wondering why we are not creating this resource group through the same ARM template. You absolutely could, but I have purposely kept it separate to keep our ARM template simple; otherwise we would have to complicate the template with dependencies describing which resource gets created first and what it depends on. Hope you understand.
### Create Resource Group
 New-AzureRmResourceGroup `
    -Name $resourceGroupName `
    -Location $location `
    -Verbose -Force

Execute the above script. It will create a resource group named "ARMdemo-simple-storage" in the "Southeast Asia" region. These come from the deployment variables we set before. Your name and region may differ based on what you set in the variables.

  • Now the most important step: deploying our ARM template with PowerShell. Execute the following command. "New-AzureRmResourceGroupDeployment" will pick up the ARM template from our local drive and deploy the resource under the resource group created above. The parameters passed are self-explanatory.
### Deploy Resources

New-AzureRmResourceGroupDeployment -Name $resourceDeploymentName -ResourceGroupName $resourceGroupName -TemplateFile $template -Verbose -Force
  • Wait for the deployment to finish. The command also validates the ARM template before deployment; if everything looks fine it will display "- Template is valid", otherwise it will throw an error message describing what went wrong and where.
  • DONE with our automation.

Login to Azure portal and check if everything is created as expected.

First let's check the resource group. Our "ARMdemo-simple-storage" is there as expected.


Let's dive into it. Yes, our storage account (provisioned through the ARM template) is showing up under the resource group in location: Southeast Asia.


One thing to notice here is the unique storage name. The prefix is "demo", which we set in the ARM template, and the rest is a unique string derived from the resource group id. Recall the variable we set in the template. Amazing!

"variables": {
     "demoStorageName": "[concat(parameters('demoStoragePrefix'), uniqueString(resourceGroup().id))]"
}

Click on the storage and you can view more details.


Great stuff, and well done! No more manual steps. Going forward, try to automate any repetitive work you do. It will be a blessing.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.

06 July, 2020

Azure Logic App : A must have weapon in your armoury

I often get involved in discussions where people talk about automation and orchestration so that manual operations can be reduced. The discussion generally leads towards implementing DevOps tools and practices, setting up a sound CI/CD pipeline, and so on. But I rarely come across a situation where people bring Logic Apps into the discussion. It's a great service and should be part of any automation conversation. 

Take an example: order handling in an arbitrary eCommerce scenario. The situation is:
  1. Receiving the order in the system
  2. Save the order details in a database 
  3. Generate a service ticket (ITSM) so that the order can be handled
  4. Generate an SMS and send it to the customer confirming the order 
A very common workflow, right? What would you prefer here: DevOps automation or workflow automation? Obviously the second one, and in this article we will implement the above scenario with the help of a Logic App workflow and get a feel for the power of this awesome service. 

Please note that, for simplicity and to make this article an easy read, we will go with a waterfall workflow model with sequential steps. 

Let's revisit our scenario once again, and this time we will add the integration pointers. 
  1. Receiving the order in the system 👉 Postman REST API client
  2. Save the order details in a database 👉 Azure Table Storage
  3. Generate a service ticket (ITSM) so that the order can be handled 👉 FreshDesk (free account will do)
  4. Generate an SMS and send it to the customer confirming the order 👉 Twilio (free account will do)
Make sure you have the above accounts and resources set up before moving forward.

OK, now we are all set. Let's dive.
  • First, let's define the order information. We will use a simple JSON payload for this demo just to simulate an order. Please replace the phone number with yours (with the proper +country code) 📱
{
   "orderNo": "1",
   "orderItem" : "Headphone",
   "orderCustName": "My Good Customer",
   "orderCustMobile" : "+91XXXXXXXXXX",
   "orderCity": "Bangalore",
   "orderState": "Karnataka"
}
  • Log in to Azure and search for Logic Apps. 
  • Select your subscription, fill in the necessary details as per your choice, then review and create the Logic App.
  • Select "When a HTTP request is received" trigger in the designer
  • Click the "Use sample payload to generate schema" link at the bottom and paste the order JSON payload to generate the JSON schema. Save the Logic App to generate the HTTP POST URL.
  • Click "+ New Step" to add an Azure Storage table data insert step. Make sure you have already created a blank table in Azure Storage account so that you can insert data in that table directly using this step. You can create that table from Logic app as well, but it's upto you. Here I have already created that table inorder to reduce an extra step
          Search for Azure Table Storage and select the "Insert Entity" action

  • Give a name for your table storage connection, select your storage account and hit "Create"
  • Select your table name from the Table dropdown. Paste the JSON payload here and replace the values with parameters from the dynamic content selection box. As we are saving data into an Azure Storage table, make sure to extend your payload with PartitionKey and RowKey properties that form a unique combination for each row (see the payload sketch after this list). I have used the orderNo and orderCustMobile fields, but the choice is yours 😐
  • Click "+ New Step" to add a FreshDesk service ticket action.
  • Provide your FreshDesk connection parameters and hit Create
  • Fill in the ticket details to be logged. You can decorate them as per your requirement. For the Email dropdown, choose any dummy name, as it doesn't matter for this demo; in a real-life scenario you would have your user list. Here our purpose is to log a service ticket.
  • Click "+ New Step" to add a Twilio SMS action. Select "Send Text Message (SMS)" action
  • Enter the Connection Name of your choice, Twilio Account id (the ACCOUNT SID value in the Twilio portal) and the Twilio Access Token (the AUTH TOKEN value in the Twilio portal) and hit "Create"
  • Select the From Phone number which will be automatically populated from your Twilio account. Decorate your SMS message as you wish. 
  • Click "Save" at the top to save your logic app. This step is very important 😀

Show Time !!!!!!!

Copy the HTTP POST URL from the first step of your Logic App and open Postman. We are about to make a REST call. 
Paste the URL into the address bar, select JSON (application/json) as the content type, and put your JSON payload in the body as raw input. Hit Send and you should get a 202 Accepted response back.


You have successfully made a REST call. Now three things should happen: a record lands in Table Storage, a service ticket gets logged in FreshDesk, and an SMS arrives on our mobile phone.

Open Azure Storage Explorer and aha! We have a new record in the table.


Open your FreshDesk ticketing dashboard and voila! We have a new service ticket 😲


Click the ticket and drill down into its details. Yep, it's showing the order details just as we configured 👍


Awesome! Now, as a last step, let's check the mobile phone, only to find the SMS sitting in our inbox. Now how cool is that!!! 👏👏👏


We have successfully achieved our goal, and the power of Logic Apps is simply amazing. We have built a complete, cost-effective, integrated solution using the serverless flavour of Azure with NO CODE. Good job! 

Congratulations on coming this far! I hope this article helps you explore this feature further.

Do share your experience and what you have built upon this foundation. You can take it up to any level and integrate further. I would love to hear from you.