Azure SQL
How to get your KPN Things data to Azure SQL.
Get your KPN Things device data into an Azure SQL database. This QuickStart will help you set up a KPN Things flow towards an Azure Event Hub, which is used to ingest data into the Azure platform. An Azure Stream Analytics job will process the incoming data and store it in an Azure SQL database. For this QuickStart you’ll need a KPN Things account with an active device (or download the device simulator app) and an active Azure subscription.
In this QuickStart you’ll learn how to:
Create and configure the necessary Azure resources
Create and configure an Azure Event Hub
Create an Azure SQL database with appropriate data tables
Create and configure a Stream Analytics job
Set up a KPN Things Flow with an Azure Event Hub destination
KPN Things Account with active device
If you don’t have a KPN Things account, create a free developer account.
Azure subscription
If you don't have an Azure subscription, create a free account.
In this section you’ll be guided through the creation and configuration of an Azure Event Hub, a Stream Analytics job and an Azure SQL database. It’s advised to create a new resource group in which to bundle these newly created resources.
The Azure Event Hub is used for the ingestion of data. Don’t have an Azure Event Hub yet? Head over to the Microsoft documentation to learn how to create one:
After following the linked instructions, you have created everything needed to ingest data into Azure: a resource group, an Event Hubs namespace, and an event hub.
Now that we have prepared the data input by creating the Event Hub, we can proceed to the creation of the Azure SQL database for the data output. Don’t have an Azure SQL database yet? Head over to the Microsoft documentation to learn how to create one:
At creation, remember to select the resource group created earlier, so that all resources for this QuickStart are grouped together. Furthermore, make sure to write down the ‘Server admin login’ and ‘Password’. Under additional settings, don’t select the sample data, as we want to create our own data table later on.
When the creation of the database is complete, we can create the necessary table. Go to the query editor in the Azure portal as described in the documentation link:
In the query window, run a query like the example underneath to create a new data table in the database:
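A minimal sketch of such a query is shown below. It assumes the table stores the SenML fields that KPN Things delivers (bn, bt, n, u, v, vs, t); the column types and sizes are assumptions, so adjust them to match your own payload.

```sql
-- Sketch: table for incoming SenML records (column types are assumptions).
CREATE TABLE dbo.eventhub (
    bn NVARCHAR(255) NULL,   -- base name: KPN Things device identifier
    bt FLOAT NULL,           -- base time: measurement timestamp (Unix epoch)
    n  NVARCHAR(100) NULL,   -- measurement name
    u  NVARCHAR(50)  NULL,   -- unit
    v  FLOAT NULL,           -- numeric value
    vs NVARCHAR(MAX) NULL,   -- string value
    t  FLOAT NULL            -- time offset relative to bt
);

-- Indexes that enable quick searches on device identifier and timestamp.
CREATE NONCLUSTERED INDEX ix_eventhub_bn ON dbo.eventhub (bn);
CREATE NONCLUSTERED INDEX ix_eventhub_bt ON dbo.eventhub (bt);
```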
When completed, a new data table called ‘eventhub’ will have been created, along with two indexes that enable quick searches on ‘bn’ and ‘bt’. These columns hold the KPN Things device identifier and the measurement timestamp. The newly created data table will be visible under Tables in the left pane.
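Once data is flowing, a lookup for a single device can use these indexed columns. For example (the device identifier below is a placeholder):

```sql
-- Example lookup on the indexed columns: all measurements for one device, newest first.
SELECT bn, bt, n, v, vs
FROM dbo.eventhub
WHERE bn = 'your-device-identifier'  -- placeholder: use the bn value of your device
ORDER BY bt DESC;
```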
At this point you’ve created the data input and the data output by leveraging an Event Hub and an Azure SQL database. In the next step we’ll set up the data stream from input to output by using Azure Stream Analytics.
We have now prepared the data input by creating the Event Hub and the data output by creating an Azure SQL database. Let’s set up Stream Analytics to tie it all together.
The following Microsoft quickstart will guide you through the creation of a Stream Analytics job. That article describes the use of an IoT Hub for input data and Blob storage for output. Remember that in this QuickStart we will be using the Event Hub as input and the Azure SQL database as output.
At ‘Configure job input’, select Event Hub and point to the Event Hub created earlier. Use ‘eventhubinput’ as the alias.
At ‘Configure job output’, select SQL database and point to the Azure SQL database created earlier. Set the database table to ‘eventhub’ to match the table created in the previous step. Use ‘sqloutput’ as the alias.
At ‘Define the transformation query’, select Query and update it with a query like the example underneath. Make sure you use the input and output aliases created above.
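The exact transformation depends on how your messages arrive. As a minimal sketch, assuming the incoming events expose the same SenML fields as the table created earlier, a simple pass-through could look like this (the aliases must match ‘eventhubinput’ and ‘sqloutput’):

```sql
-- Sketch: copy the SenML fields from the Event Hub input into the SQL output.
SELECT
    bn, bt, n, u, v, vs, t
INTO
    [sqloutput]
FROM
    [eventhubinput]
```

If your messages arrive as nested SenML arrays instead of flat records, Stream Analytics functions such as GetArrayElements combined with CROSS APPLY can be used to flatten them before writing to the database.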
Now save your changes and start the Stream Analytics job.
At this point you’ve created a data ingestion point (Event Hub), a streaming data processor (Stream Analytics job) and data storage (Azure SQL database). In the next step we’ll connect KPN Things to Azure!
Let’s put things together! Now that everything is set up in Azure, all we need to do is set up a data flow in KPN Things. This Flow will send device data towards an Azure Event Hub destination.
Log in to the KPN Things Portal using your account credentials (https://portal.kpnthings.com)
In the left pane navigate to ‘My first project’
Add a flow by giving it a name and description
Now link your device (don’t have any devices yet? Go to ……)
Now go back to the flow and set up the data processing by enabling the decoders for the selected device
Add a destination for the flow. Select ‘Create new Destination’. Fill in all necessary details related to the Azure Event Hub created in Step 1. Don’t know where to find all the required parameters? Follow the instructions below:
When all parameters are set, the destination should become active. You now have a working KPN Things Flow towards the Azure Event Hub.
You’ve created your first flow towards Azure. All data produced by the devices in the flow will now be sent to Azure, processed by the Stream Analytics job and stored in the Azure SQL database.
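To verify the end-to-end flow, you could run a quick check in the database’s query editor, assuming the ‘eventhub’ table created earlier:

```sql
-- Quick end-to-end check: count the stored measurements per device.
SELECT bn AS device, COUNT(*) AS measurements
FROM dbo.eventhub
GROUP BY bn;
```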