In this lab, you will build such a system using Microsoft Azure and Microsoft Cognitive Services. Specifically, you will use an Azure IoT hub to ingest streaming data from simulated cameras, Azure Storage to store photographs, Azure Stream Analytics to process real-time data streams, Azure Functions to process output from Stream Analytics, Microsoft’s Custom Vision Service to analyze photographs for polar bears, Microsoft Power BI to build a dashboard for visualizing results, and Azure SQL Database as a data source for Power BI.
Scenarios for the application of real-time data analytics are many and include fraud detection, identity-theft protection, optimizing the allocation of resources (think of an Uber-like transportation service that sends drivers to areas of increasing demand before that demand peaks), click-stream analysis on Web sites, and countless others. Having the ability to process data as it comes in rather than waiting until after it has been aggregated offers a competitive advantage to businesses that are agile enough to make adjustments on the fly.
In this lab, the second of four in a series, you will create an Azure Stream Analytics job and connect it to the IoT hub you created in the previous lab. Then you will write an Azure Function to receive the job's output, and use the two together to analyze data streaming in from a simulated camera array.
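The hand-off between the two services is plain JSON: Stream Analytics batches the rows selected by its query and POSTs them to the Function. Below is a minimal sketch of the receiving side; the field names `deviceId` and `url` are assumptions standing in for whatever columns your job's SELECT list actually emits.

```python
import json

def parse_camera_events(body: str) -> list:
    """Parse one batch of Stream Analytics output.

    Stream Analytics delivers query results to an Azure Function as a
    JSON array of records. The field names used here (deviceId, url)
    are placeholders; the real names come from your job's query.
    """
    events = json.loads(body)
    return [(e["deviceId"], e["url"]) for e in events]

# Inside an HTTP-triggered Azure Function, the helper would be applied
# to the request body, roughly like this (sketch, not verified code):
#
#   import azure.functions as func
#
#   def main(req: func.HttpRequest) -> func.HttpResponse:
#       for device_id, url in parse_camera_events(req.get_body().decode()):
#           ...  # hand the photograph at `url` to an image classifier
#       return func.HttpResponse(status_code=200)

sample = ('[{"deviceId": "polar_cam_001", '
          '"url": "https://example.blob.core.windows.net/photos/1.jpg"}]')
print(parse_camera_events(sample))
# → [('polar_cam_001', 'https://example.blob.core.windows.net/photos/1.jpg')]
```

Keeping the parsing in a small pure function makes it easy to unit-test the Function without deploying it.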
Microsoft Cognitive Services is a suite of services and APIs backed by machine learning that enables developers to build applications that incorporate intelligent features such as recognizing faces in photos and videos, detecting sentiment in text, and understanding natural language. Microsoft’s Custom Vision Service is among the newest members of the Cognitive Services suite. Its purpose is to create image-classification models that “learn” from labeled images you provide. Want to know whether a photo contains a flower? Train the Custom Vision Service with a collection of flower images, and it can tell you whether the next image includes a flower — or even what type of flower it is.
The Custom Vision Service exposes two APIs: the Custom Vision Training API and the Custom Vision Prediction API. You can build, train, and test image-classification models interactively in the Custom Vision Service portal, or do the same programmatically with the Custom Vision Training API. Once a model is trained, you can use the Custom Vision Prediction API to build apps that utilize it. Both are REST APIs that can be called from a variety of programming languages.
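To make the Prediction API concrete, here is a hedged sketch of calling it from Python with only the standard library. The endpoint URL and key below are placeholders — the real values come from the prediction URL the portal shows after you publish a trained iteration — and the response fields (`predictions`, `tagName`, `probability`) reflect the v3.0 response shape.

```python
import json
import urllib.request

# Placeholders: copy the real values from the Custom Vision portal
# after publishing an iteration of your model.
PREDICTION_URL = ("https://<region>.api.cognitive.microsoft.com/customvision/"
                  "v3.0/Prediction/<project-id>/classify/iterations/<iteration>/url")
PREDICTION_KEY = "<your-prediction-key>"

def classify_image_url(image_url: str) -> dict:
    """POST an image URL to the Prediction API and return the parsed JSON."""
    req = urllib.request.Request(
        PREDICTION_URL,
        data=json.dumps({"Url": image_url}).encode(),
        headers={"Prediction-Key": PREDICTION_KEY,
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def top_tag(prediction: dict) -> tuple:
    """Pick the highest-probability tag from a prediction response."""
    best = max(prediction["predictions"], key=lambda p: p["probability"])
    return best["tagName"], best["probability"]

# A fabricated response used only to demonstrate top_tag:
sample = {"predictions": [
    {"tagName": "polar-bear", "probability": 0.97},
    {"tagName": "arctic-fox", "probability": 0.02},
]}
print(top_tag(sample))  # → ('polar-bear', 0.97)
```

In the lab's pipeline, a call like `classify_image_url` would sit inside the Azure Function, scoring each photograph as its URL arrives from Stream Analytics.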
In this exercise, you will create a Custom Vision Service model and train it to differentiate between various types of Arctic wildlife. Then you will connect it to the Stream Analytics job you created in the previous lab.
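If you prefer to script the training side rather than use the portal, the workflow maps onto a handful of Training API calls. The sketch below is an assumption-laden outline, not a verified implementation: the endpoint, API version path, key, and project/tag identifiers are all placeholders you would replace with values from your own Custom Vision resource.

```python
import json
import urllib.request

# Placeholders — substitute the endpoint and key from your
# Custom Vision resource in the Azure portal.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com"
TRAINING_KEY = "<your-training-key>"

def training_url(path: str) -> str:
    """Build a full Training API URL from a relative path."""
    return f"{ENDPOINT}/customvision/v3.3/Training{path}"

def training_request(path: str, body: bytes = b"",
                     content_type: str = "application/json") -> dict:
    """Send an authenticated POST to the Training API and parse the reply."""
    req = urllib.request.Request(
        training_url(path),
        data=body,
        headers={"Training-Key": TRAINING_KEY, "Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Typical workflow, one REST call per step (paths are illustrative):
# 1. Create a project:       training_request("/projects?name=ArcticWildlife")
# 2. Create a tag per class: training_request(f"/projects/{pid}/tags?name=polar-bear")
# 3. Upload labeled images:  training_request(f"/projects/{pid}/images?tagIds={tag_id}",
#                                             body=image_bytes,
#                                             content_type="application/octet-stream")
# 4. Kick off training:      training_request(f"/projects/{pid}/train")

print(training_url("/projects?name=ArcticWildlife"))
```

The portal performs these same steps behind the scenes; scripting them is useful when you need to retrain the model on a schedule as new labeled photographs accumulate.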
Microsoft Power BI was created to address the data explosion in commercial and academic organizations, the need to analyze that data, and the need for rich, interactive visuals to represent the data and reveal key insights. It contains a suite of tools that assist in data analysis, from data discovery and collection to data transformation, aggregation, sharing, and collaboration. Moreover, it allows you to create rich visualizations and package them in interactive dashboards.
In this exercise, you will connect Microsoft Power BI to the Azure SQL database you created in the previous exercise to capture information emanating from the virtual camera array you deployed in the Arctic. Then you will use Power BI to build a report that shows in near real-time where polar bears are being spotted.