In this learning path, you will learn how to build and architect big data solutions in Microsoft Azure. Topics will include architecting solutions using HDInsight, machine learning, visualizing data with Power BI, understanding lambda architecture patterns, and IoT data ingestion. This path will help you prepare for Exam 70-475: Designing and Implementing Big Data Platform Solutions, and for your Microsoft certification.
In this module, attendees will learn how to use features and capabilities within Azure to architect solutions that apply governance at scale. This will include architecting the Azure EA portal for delegated access and chargeback, and will discuss features such as role-based access control (RBAC) and Resource Manager policies that enable enterprise control of an Azure deployment.
In this module, the attendee will learn the core capabilities and use cases of Azure Active Directory (Azure AD). This module will emphasize strategies and techniques for integrating on-premises Active Directory with an Azure AD environment.
In this module, attendees will learn about the capabilities of the Azure networking stack for connecting networks. This module will focus on capabilities and use cases so that attendees will be able to make an educated decision about connectivity requirements.
In this module, attendees will learn how to design solutions using Azure Infrastructure-as-a-Service (IaaS) components. This module will focus on core capabilities, use cases, and general best practices, and will also discuss peripheral services such as Azure Backup and Azure Site Recovery.
The Architecting Cloud Connectivity course covers general information about the Azure network, and specific topics related to designing hybrid connectivity solutions. Several example architectures are considered, evaluating key design criteria, such as performance and scale, security and compliance, and cost optimization. This course should help in preparation for the 70-534 exam, Architecting Microsoft Azure Solutions.
The Architecting Global Solutions course covers general information about scaling and architecting for scale. After the overview, the course dives deeper into the techniques of scaling solutions globally using Azure services. The topics and services discussed in this course include: ARR Affinity, Azure Redis Cache, Azure Content Delivery Network (CDN), Azure Traffic Manager, Auto-Healing, and Asynchronous Programming. Finally, this course finishes with an overview of a few example architectures to give a better perspective on architecting global solutions in the cloud. This course should help in preparation for the 70-534 exam, Architecting Microsoft Azure Solutions.
This module will cover all aspects of big data storage and batch processing. We will start by making the case for big data in Azure. Then we will look at Azure service topics including Blob Storage, Azure Data Lake Store, Azure Data Lake Analytics, and HDInsight clusters running Hadoop, Hive, Interactive Hive (LLAP), and Spark. Storage topics will focus on choosing the right storage, configuring storage, and storage optimization. We will also cover big data scenarios including batch processing, interactive clusters, multi-cluster deployments, and on-demand clusters.
In this module, attendees will learn about the various storage options, from SQL Database to NoSQL and document-based database technologies. This module is focused on choosing the right tool for the right job and considering the decision points architects will face when designing storage for their apps.
In this course, you will learn how to create web apps by using various Azure Platform-as-a-Service (PaaS) components, as well as how to use Azure container-based services. This course is part of the AZ-300 learning path for Microsoft Azure Architect Technologies.
Students will learn how to analyze resource utilization and consumption, create and configure storage accounts, create and configure a VM for Windows and Linux, create connectivity between virtual networks, implement and manage virtual networking, manage Azure Active Directory, and implement and manage hybrid identities. This course is part of the AZ-300 learning path for Microsoft Azure Architect Technologies.
This is course four of the exam prep for AZ-301: Microsoft Azure Architect Design. Students will learn to design a site recovery strategy, design for high availability, design a disaster recovery strategy for individual workloads, and design a data archiving strategy.
In this course, the student will learn how to design a data management strategy, design a data protection strategy, design and document data flows, and design a monitoring strategy for the data platform.
In this course, the student will learn about designing a storage strategy, a networking strategy, a compute strategy, and a monitoring strategy for infrastructure.
This is course five of the exam prep for AZ-301: Microsoft Azure Architect Design. Students will learn how to design deployments, design migrations, and design an API integration strategy in Microsoft Azure.
This is course two of the exam prep for AZ-301: Microsoft Azure Architect Design. The Design for Identity and Security course teaches how to design identity management, authentication, authorization, risk prevention for identity, and a monitoring strategy for identity and security.
In this module, attendees will learn how to develop media- and video-based solutions and services in Microsoft Azure. This will include Azure Media Services, Video Indexer, the Video API, the Computer Vision API, and other media-related services.
In this course, you will learn the ins and outs of using Azure Functions to design highly scalable solutions using a serverless design. This course will teach you how to deploy your code as well as how to monitor it once it is in production, along with general best practices for writing solutions with Azure Functions.
This is course one of the exam prep for AZ-301: Microsoft Azure Architect Design. This course covers a range of topics, including the gathering of information and workload requirements, how to optimize a consumption strategy, and how to design an auditing and monitoring strategy.
This course will expose the student to developing solutions that use Cosmos DB storage, developing solutions that use a relational database, configuring a message-based integration architecture, and developing for autoscaling. This course is part of the AZ-300 learning path for Microsoft Azure Architect Technologies.
Learn how to implement authentication in applications (certificates, Azure AD, Azure AD Connect, token-based), implement secure data transmission (SSL and TLS), and manage cryptographic keys in Azure Key Vault. This course is part of the AZ-300 learning path for Microsoft Azure Architect Technologies.
Students will learn how to migrate servers to Azure, configure serverless computing, implement application load balancing, integrate an on-premises network with an Azure virtual network, manage role-based access control (RBAC), and implement Multi-Factor Authentication (MFA). This course is part of the AZ-300 learning path for Microsoft Azure Architect Technologies.
This course explores the NoSQL storage options available within the Microsoft Azure Cosmos DB database service. Formerly DocumentDB, Azure Cosmos DB is no longer just a document-based NoSQL store; it includes support for all four primary NoSQL data models (document, graph, key/value, and column). In addition to learning about NoSQL with Cosmos DB, students will also learn about the cloud-native features that make Cosmos DB a great NoSQL database-as-a-service in the Microsoft Azure cloud.
In this module, attendees will learn how to use services in Azure to monitor their services and solutions, and to compose solutions that will effectively alert and trigger actions based on established parameters. This module will discuss the following services and solutions: monitoring (Azure Monitor, Azure Service Health, Log Analytics, Security Center, Application Insights, and Network Watcher) and automation (Chef, Puppet, PowerShell DSC, Logic Apps, and Event Grid).
The Real-Time Ingestion and Processing in Azure course covers information about implementing real-time event stream ingestion and processing within Microsoft Azure. The course starts with an overview of the Lambda Architecture and what a Message Broker is used for. The course continues to cover the Azure Event Hubs and Azure IoT Hub services used for event stream ingestion, and Azure Stream Analytics and HDInsight for integrating real-time event processing. Finally, the course finishes with an overview of a few example architectures to give a better perspective on architecting Real-Time Ingestion and Processing solutions within the Microsoft Azure cloud. This course should help in preparation for the 70-534 exam, Architecting Microsoft Azure Solutions.
In this module, you will focus on the pricing and support models available from Microsoft, including Azure subscriptions, planning and managing costs, the support options available with Azure, and the service lifecycle in Azure.
In this module, you will learn basic cloud concepts, including: why cloud services, Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS), and the public, private, and hybrid cloud models.
In this module, you will learn the basics of the core services available within Microsoft Azure, including core Azure architectural components, core Azure services and products, Azure solutions, and Azure management tools.
In this module, you will learn about security, privacy, compliance, and trust with Microsoft Azure. You will become familiar with the following topics: securing network connectivity in Azure, core Azure identity services, security tools and features, Azure governance methodologies, monitoring and reporting in Azure, and privacy, compliance, and data protection standards in Azure.
In this lab, you will create a new Azure Function that exposes an HTTP endpoint to enable the function to be triggered on-demand. The HTTP endpoint accepts two query string parameters from the HTTP request. The function outputs a calculated value based on the input parameters.
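The trigger logic described above can be sketched with the standard library alone. This is a minimal stand-in, not the Azure Functions runtime: the parameter names (`a`, `b`) and the "calculated value" (their product) are invented here for illustration, and the real lab wires this logic into an HTTP-triggered function via an Azure Functions binding.

```python
from urllib.parse import urlparse, parse_qs

def handle_request(url: str) -> str:
    """Simulates an HTTP-triggered function: reads two query string
    parameters from the request URL and returns a calculated value."""
    params = parse_qs(urlparse(url).query)
    # Parameter names and the calculation itself are illustrative assumptions.
    a = float(params["a"][0])
    b = float(params["b"][0])
    return f"result: {a * b}"

# Hypothetical function URL; only the query string matters here.
print(handle_request("https://myfunc.azurewebsites.net/api/calc?a=6&b=7"))
```

In the actual lab, the same parsing happens against the incoming HTTP request object that the Functions host passes to your code, and the return value becomes the HTTP response body.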
In this lab, you will use the Azure Migrate service to migrate the SmartHotel app, which currently runs on on-premises Hyper-V infrastructure, to Azure Virtual Machines. During the lab, you will migrate this entire application stack to Azure using the Azure Migrate service. Note: this lab takes 60-75 minutes to fully deploy.
In this lab, you will create an Azure SQL Database using the Azure Portal and connect to it using SQL Server Management Studio. You will then migrate a SQL Server database hosted on a virtual machine to an Azure SQL Database.
In this lab, you will create a Windows virtual machine running in Azure, and connect to it using Remote Desktop. You will then delete the virtual machine, and clean up associated resources.
Today, data is being collected in ever-increasing amounts, at ever-increasing velocities, and in an ever-expanding variety of formats. This explosion of data is colloquially known as the Big Data phenomenon. In order to gain actionable insights into big-data sources, new tools need to be leveraged that allow the data to be cleaned, analyzed, and visualized quickly and efficiently. Azure HDInsight provides a solution to this problem by making it exceedingly simple to create high-performance computing clusters provisioned with Apache Spark and members of the Spark ecosystem. Rather than spend time deploying hardware and installing, configuring, and maintaining software, you can focus on your research and apply your expertise to the data rather than the resources required to analyze that data. Apache Spark is an open-source parallel-processing platform that excels at running large-scale data analytics jobs. Spark's combined use of in-memory and disk data storage delivers performance improvements that allow it to process some tasks up to 100 times faster than Hadoop. With Microsoft Azure, deploying Apache Spark clusters becomes significantly simpler and gets you working on your data analysis that much sooner. In this lab, you will experience HDInsight with Spark first-hand. After provisioning a Spark cluster, you will use the Microsoft Azure Storage Explorer to upload several Jupyter notebooks to the cluster. You will then use these notebooks to explore, visualize, and build a machine-learning model from food-inspection data — more than 100,000 rows of it — collected by the city of Chicago. The goal is to learn how to create and utilize your own Spark clusters, experience the ease with which they are provisioned in Azure, and, if you're new to Spark, get a working introduction to Spark data analytics.
In this lab, you will learn to build, monitor, manage, and troubleshoot data pipelines with Azure Data Factory V2. You will learn to use the Copy Data wizard to build a pipeline with no coding. You will build a custom pipeline to copy data from Blob storage to a table in Azure SQL Database. You will build a tumbling window pipeline to pick up data on a daily basis. Finally, you will learn to use the monitoring and management tools to troubleshoot pipeline failures.
In this lab, an AKS cluster is deployed using the Azure CLI. A multi-container application consisting of a web front end and a Redis instance is then run on the cluster. Once completed, the application is accessible over the internet.
In this lab, an Azure Virtual Machine disk will be encrypted using the following steps: deploy a VM into Azure that is not encrypted, obtain and run the Azure Disk Encryption Prerequisites Azure PowerShell script, and encrypt your virtual machines.
In this lab, you will create a virtual network that will allow the virtual machines you create to securely connect with each other. You will then create two virtual machines and specify the virtual network configuration and the availability set configuration along with storage for the virtual machine.
In this lab, you will create an Azure Web App and a SQL Database, and configure the popular Orchard content management system (CMS). You will then configure the web app to automatically scale based on actual CPU usage.
This lab is designed to help you become familiar with several features of Microsoft Azure Log Analytics. You will learn how to set up a Log Analytics workspace and install the agent on several VMs. From there, you will configure data sources from Azure as well as diagnostic data from the VMs, and learn the fundamentals of querying data and events using the Log Analytics query language. This lab pre-provisions several resources in Microsoft Azure and will take 15-20 minutes to start before it is ready.
In this lab, you will configure Azure Site Recovery to protect a sample n-tier application by configuring replication from the source Azure region to a target Azure region. Once the initial replication has completed and the application is protected, you will perform a test failover and validate application functionality. Finally, you will clean up the test failover resources. Note: This lab pre-deploys several resources and will take 30-45 minutes to start.
In this lab, you will create a Web API using ASP.NET MVC that will then be deployed into Azure API Apps. You will also integrate Swagger using the Swashbuckle NuGet package to automatically generate usage documentation for the Web API. From there, you will set up a new API Management service within Azure, and publish a custom Web API deployed to an Azure API App as a managed API.
In this lab, you will be introduced to basic concepts for developing with Azure Storage using Visual Studio and C#.
In this lab, the student will learn the basics of messaging patterns between software systems and how to use the Azure Service Bus as a messaging solution.
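The queue-based messaging pattern this lab introduces can be sketched with the standard library alone. Here `queue.Queue` stands in for a Service Bus queue and the message shape (`order_id`) is invented for illustration; the lab itself uses the Azure Service Bus SDK rather than this local stand-in.

```python
import queue
import threading

# A local stand-in for a Service Bus queue: a producer enqueues messages,
# a consumer dequeues and processes them, and the two are decoupled in time.
broker = queue.Queue()
processed = []

def producer():
    for i in range(3):
        broker.put({"order_id": i})   # send a message
    broker.put(None)                  # sentinel: no more messages

def consumer():
    while True:
        msg = broker.get()            # receive (blocks until a message arrives)
        if msg is None:
            break
        processed.append(msg["order_id"])

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start()
t2.start()
t1.join()
t2.join()
print(processed)   # the consumer saw every message exactly once, in order
```

The key design point the pattern illustrates: because the broker sits between sender and receiver, either side can run, pause, or scale independently — which is exactly what a hosted broker like Service Bus provides across processes and machines.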
In this lab, you will use Visual Studio and ASP.NET to learn how to use Cosmos DB as a backend for an MVC application. You will learn how to programmatically read and write data, create and call user-defined functions, and understand management capabilities such as users and permissions, monitoring, and scalability options.
In this lab, you will learn how to configure and manage an Azure Cosmos DB account (formerly Azure DocumentDB), including how to query and manage JSON documents within a collection. Among the topics covered are using SQL language syntax to perform document queries that return JSON results, and implementing and testing global data replication and failover.
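The SQL-over-JSON queries covered in the lab have roughly the following semantics, sketched here as plain Python over a list of JSON documents. The sample documents and the filter are invented for illustration; in the lab itself the query runs against a Cosmos DB collection, not a local list.

```python
import json

# Invented sample documents, standing in for items in a Cosmos DB collection.
docs = [
    {"id": "1", "name": "Ada",   "city": "Seattle"},
    {"id": "2", "name": "Alan",  "city": "London"},
    {"id": "3", "name": "Grace", "city": "Seattle"},
]

# Rough local equivalent of the Cosmos DB query:
#   SELECT c.name FROM c WHERE c.city = 'Seattle'
results = [{"name": d["name"]} for d in docs if d["city"] == "Seattle"]

# The service returns results as JSON, projected to the selected fields.
print(json.dumps(results))
```

Note how the projection (`SELECT c.name`) returns reshaped JSON documents rather than rows, which is the main difference from querying a relational table.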
In this lab, you will learn how to migrate a traditional three-tier web application (web, business logic, and data) from on-premises to Azure using Azure Site Recovery and the Azure Database Migration Service. Note: This lab pre-deploys several resources and will take 20-30 minutes to start.
In this lab, you will learn how to configure virtual network peering. Virtual network peering enables you to seamlessly connect two Azure virtual networks. Once peered, the virtual networks appear as one for connectivity purposes. Traffic between virtual machines in the peered virtual networks is routed through the Microsoft backbone infrastructure, much like traffic between virtual machines in the same virtual network, through private IP addresses only. Note 1: This lab will connect two virtual networks within the same region; peering across regions is currently in preview. Note 2: If you want a more in-depth view of virtual network connectivity (including site-to-site and point-to-site), try the Introduction to Virtual Network Connectivity lab.
In this lab, you will use Java to write a back-end console application and register it with Azure Active Directory. You will then create a Key for the Registered app, and write code to generate an Access Token for the application to use when calling the Azure AD Graph API. Code will also be written to call the Azure AD Graph REST API from within Java using the Access Token for authentication.