Using Logic Apps to transfer files from FTP into Azure Blob Storage

As increasing numbers of people look to move their data integration and repositories from legacy on-premises systems to the cloud, a common question is “How do I get FTP data into cloud storage?”.  In this post I will take you through a sample I put together which monitors an FTP server for new files and then transfers them to Azure Blob storage.

Today’s solution makes use of Azure Logic Apps and Azure Blob storage. By the end of the post we will have a Logic App that connects to an FTP server and copies each new file to an “inbound” container in Azure Blob storage.  You cannot nest containers in Blob storage, so to provide a way to recognise when files arrive we use built-in functions to build a blob name that incorporates today’s date and mimics a more traditional folder structure.

Pre-Reqs

In order to complete this demo you need:

  1. An Azure Subscription
  2. An FTP server that supports passive mode to act as the remote system we are polling (details for the FTP server I built for this are available here). You will need a valid username and password for connecting to the FTP server.
  3. A storage account (see https://azure.microsoft.com/en-gb/documentation/articles/storage-create-storage-account/#create-a-storage-account for setting one up). You will need the Storage Account Name and the Access Key – a quick way to check these is sketched below.
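
If you want to confirm the storage account name and key up front, a minimal sketch along these lines works, assuming the azure-storage-blob Python package; the account name and key below are placeholders:

```python
# Confirm the Storage Account Name and Access Key work (placeholder values below)
from azure.storage.blob import BlobServiceClient

ACCOUNT_NAME = "mystorageaccount"   # your Storage Account Name
ACCOUNT_KEY = "<your access key>"   # Key1 from the Access keys blade

service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=ACCOUNT_KEY,
)

# Listing containers proves the name/key pair is valid
for container in service.list_containers():
    print(container.name)

# Optionally create the container used later in this post
# (create_container raises an error if it already exists)
# service.create_container("inbound")
```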

Let’s do it!

To start we need to create a Logic App (surprise) so click New -> search for Logic App and select Logic App published by Microsoft

Click Create

Enter:

 

  • Name: <Name of your Logic App, e.g., FTPDemoLogicApp>
  • Resource Group: Create new: <Name of your resource Group, e.g., rgFTPLogicAppDemo>
  • Location: <Choose your region, e.g., North Europe>
  • Pin to Dashboard: Optional – I tend to check this to make it easy to find

Open the Logic App and under DEVELOPMENT TOOLS select Logic App Designer, then from the list of Templates click on Blank Logic App

We start by configuring the connection details for the remote FTP server we wish to connect to.

Search for FTP – When a file is added or modified

  • CONNECTION NAME: <Name of your FTP Server>
  • SERVER ADDRESS: <Public IP address of your FTP server>
  • USER NAME: <User name for logging on to FTP server>
  • PASSWORD: <Password>
  • FTP SERVER PORT: 21

Click Create
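
Before going further it can be worth a quick sanity check that the server address, credentials and passive-mode support are correct. Below is a minimal sketch using Python’s standard-library ftplib; the server address, user name and password are placeholders for whatever you entered above:

```python
# Quick sanity check of the FTP details entered above (placeholder values below)
from ftplib import FTP

ftp = FTP()
ftp.connect("203.0.113.10", 21)     # SERVER ADDRESS and FTP SERVER PORT
ftp.login("ftpuser", "MyP@ssw0rd")  # USER NAME and PASSWORD
ftp.set_pasv(True)                  # the connector relies on passive mode
print(ftp.nlst("/"))                # list the folder the trigger will watch
ftp.quit()
```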

Once the connection is created we need to specify the folder in which the files will reside – for simplicity I have selected the root folder, so in the next box enter:

  • Folder: /

Click New step and Add an action

OK, now we need to configure the target Blob storage account to transfer the FTP file to.  To start, search for Blob and select AzureBlobStorage – Create blob

Enter:

  • CONNECTION NAME: <Name of your connection>
  • AZURE STORAGE ACCOUNT NAME: <Name of your storage account>
  • AZURE STORAGE ACCOUNT ACCESS KEY: <Key1 of your storage account>

Click Create

Now we have the connection configured we need to specify which container we would like the file transferred to. In my example the files are transferred to the “inbound” container.

For the Blob name we can start by just dragging in “File name”, which cascades down from the FTP connection (see the image above). However, in my example I have marked up the name so that it appears as if a folder is created per day for the files. In my case this means that the file address.csv becomes 2016-11-19/address.csv – this is useful because in Azure Storage you cannot nest containers, but “/” is a valid character in a blob name so you can create a pseudo folder structure (a Python sketch of the same naming logic follows the list below).

  • Folder Path: /inbound
  • Blob name: @{substring(utcnow(),0,10)}/@{triggerOutputs()['headers']['x-ms-file-name']}
  • Blob content: Select File content
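
For illustration only, here is the equivalent of that Blob name expression in Python – substring(utcnow(),0,10) takes the first ten characters of the UTC timestamp, i.e. the date in yyyy-MM-dd form, and the “/” acts as a pseudo folder:

```python
# Illustration of the Blob name expression: yyyy-MM-dd prefix plus the original file name
from datetime import datetime, timezone

def blob_name_for(file_name: str) -> str:
    date_prefix = datetime.now(timezone.utc).strftime("%Y-%m-%d")  # same as substring(utcnow(),0,10)
    return f"{date_prefix}/{file_name}"

print(blob_name_for("address.csv"))  # e.g. 2016-11-19/address.csv
```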

Click + New step

OK, so at this stage we have logged on to the remote FTP server, checked for new files, copied the file to the desired Blob storage account and container, and marked up the name to include the date. In the real world we might well want to remove the file from the FTP server as part of the process:

Search for ftp and select FTP – Delete file

For File: Select File name

Click Save (top left of the designer pane)

Now we simply need to FTP a file into our FTP server
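
Any FTP client will do. As a sketch, here is one way to push a test file with Python’s ftplib, using the same placeholder connection details as the earlier check:

```python
# Push a test file into the folder the Logic App trigger is watching (placeholder details below)
from ftplib import FTP

with FTP() as ftp:
    ftp.connect("203.0.113.10", 21)
    ftp.login("ftpuser", "MyP@ssw0rd")
    with open("address.csv", "rb") as local_file:
        ftp.storbinary("STOR address.csv", local_file)  # upload into the watched root folder
    print(ftp.nlst("/"))  # the new file should now appear in the listing
```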

You can trigger the Logic App manually via the Run button in the Logic App Designer

Alternatively you can check the status via the Overview blade:

In the above screenshot you can see the Succeeded run, which is where the app found a new file and processed it. To the right you can see a series of Skipped runs – this is where the Logic App polled the remote FTP server and did not find anything to process.

We can check that the Logic App worked as expected within the Azure portal – by opening the storage account we can view the files in the container:

If you look closely at the screenshot above you can see that the portal has presented the file as “address.csv” in a folder called “2016-11-19” (Azure Storage Explorer does the same); in reality the blob name is “2016-11-19/address.csv” in the container “inbound”.
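
If you prefer to check from code rather than the portal, a small sketch along these lines (again assuming the azure-storage-blob package, with placeholder account details) lists what landed in the container:

```python
# List the blobs that the Logic App created in the "inbound" container (placeholder details below)
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    credential="<your access key>",
)

for blob in service.get_container_client("inbound").list_blobs():
    print(blob.name)  # e.g. 2016-11-19/address.csv - the "/" is part of the blob name
```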

Check files have been deleted from the FTP server – in the screenshot below you can see the final “ls” command returns no files:

Job Done!

Where can you take this?  We could perhaps have an Azure Function app monitoring new blobs being created in the Storage Account, or perhaps consume the files through Azure Data Factory (although with ADF you can FTP stuff in directly). If you need to FTP out of Azure you could reverse this process and move files from Blob storage to a remote FTP server.
