Continuous deployment enables teams to release code to production automatically, which is indicative of high DevOps maturity and rock-solid automation practices. Because code flows to the production environment without manual gates, this approach requires deep integration into how the software stack functions. All ongoing operations and customer requests are automatically taken care of, and […]
Exam Essentials – Troubleshoot Data Storage Processing
Azure Well‐Architected Framework. The Azure Well‐Architected Framework is a set of best practices for running your workloads on the Azure platform. Its pillars are cost optimization, security, reliability, operational excellence, and performance efficiency. Result set cache. When the same SQL queries are executed many times, performance can be improved by enabling the result set […]
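To make the result set cache concrete, the following is a minimal sketch of enabling it on an Azure Synapse Analytics dedicated SQL pool. The feature is switched on with a single T‑SQL command executed against the master database; the server, pool name, and credentials below are placeholder assumptions.

```python
# Minimal sketch: enable result set caching on a dedicated SQL pool.
# Server, pool name, and credentials are hypothetical placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<workspace>.sql.azuresynapse.net,1433;"
    "Database=master;"            # the ALTER DATABASE command runs in master
    "Uid=<admin>;Pwd=<password>;Encrypt=yes;"
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    # Once enabled, repeated identical queries whose underlying data has not
    # changed can be answered from the cache instead of being recomputed.
    conn.cursor().execute("ALTER DATABASE <sql-pool> SET RESULT_SET_CACHING ON;")
```

You can confirm whether a given query was served from the cache by checking the result_cache_hit column in the sys.dm_pdw_exec_requests DMV.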
Handle Interruptions – Troubleshoot Data Storage Processing
An interruption to the processing of your data stream flowing through your Azure Stream Analytics job can occur in many forms. One of the most catastrophic is an event, such as a severe storm, that results in the closure of all datacenters in a given Azure region. Although these events […]
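When the job can be brought back online, one recovery option is to restart it from the point where it last produced output, so no events are skipped. The sketch below shows this with the Azure SDK for Python; the subscription, resource group, and job names are placeholder assumptions.

```python
# Minimal sketch: restart an interrupted Stream Analytics job so it resumes
# from the last event for which output was produced. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.streamanalytics import StreamAnalyticsManagementClient
from azure.mgmt.streamanalytics.models import StartStreamingJobParameters

client = StreamAnalyticsManagementClient(
    DefaultAzureCredential(), "<subscription-id>"
)

# LastOutputEventTime picks up where the job stopped; JobStartTime or
# CustomTime are the alternatives when replay or a fixed point is needed.
poller = client.streaming_jobs.begin_start(
    "<resource-group>",
    "<job-name>",
    StartStreamingJobParameters(output_start_mode="LastOutputEventTime"),
)
poller.wait()
```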
Monitor Batches and Pipelines – Troubleshoot Data Storage Processing
This section is a follow‐up to Chapter 6. It is placed here so that you can recall that content while reading about the logging, monitoring, optimizing, and troubleshooting techniques in this chapter and Chapter 9.
Handle Failed Batch Loads
There are many actions you can take within the Azure Batch job itself from a coding perspective. In […]
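One such coding-level action is to let the Batch service retry a task before declaring the load failed, and then to enumerate any tasks that still failed so they can be investigated or resubmitted. The following is a minimal sketch using the azure-batch package; the account, job, and script names are placeholder assumptions.

```python
# Minimal sketch: submit a Batch task with retries, then list tasks that
# still failed. Account, job, and script names are hypothetical.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
from azure.batch.models import TaskAddParameter, TaskConstraints

credentials = SharedKeyCredentials("<account-name>", "<account-key>")
client = BatchServiceClient(
    credentials, batch_url="https://<account>.<region>.batch.azure.com"
)

# Retry up to three times before the task is marked as failed.
client.task.add(
    job_id="<job-id>",
    task=TaskAddParameter(
        id="load-task-001",
        command_line="/bin/bash -c 'python load.py'",
        constraints=TaskConstraints(max_task_retry_count=3),
    ),
)

# After the job runs, surface any tasks that failed despite the retries.
for task in client.task.list("<job-id>"):
    info = task.execution_info
    if info is not None and info.result == "failure":
        reason = info.failure_info.message if info.failure_info else "unknown"
        print(task.id, reason)
```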
Design and Develop a Batch Processing Solution – Troubleshoot Data Storage Processing
The processes discussed here were introduced in Chapter 6. The reason for introducing them there and following up here is that, at that point, many of the concepts, such as logging, monitoring, and error handling, had not yet been covered. At this point, however, it is just a matter of connecting the dots and providing more detail within the context […]
Troubleshoot a Failed Pipeline Run – Troubleshoot Data Storage Processing-3
The pop‐out window enables you to select the integration runtime to use for the pipeline execution. Data flows often perform very large ingestion and transformation activities, and additional compute power is required to process them. The default amount of time to keep the IR active is 1 hour, but if you need […]
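The time to live (TTL) is configurable on the integration runtime itself. The sketch below provisions a managed IR whose data flow cluster stays warm for two hours, so back-to-back data flow runs avoid cluster startup time; the resource names and the 120-minute value are placeholder assumptions.

```python
# Minimal sketch: create an Azure IR with a longer data flow TTL so the
# Spark cluster stays warm between runs. Names and values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeComputeProperties,
    IntegrationRuntimeDataFlowProperties,
    IntegrationRuntimeResource,
    ManagedIntegrationRuntime,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

ir = ManagedIntegrationRuntime(
    compute_properties=IntegrationRuntimeComputeProperties(
        data_flow_properties=IntegrationRuntimeDataFlowProperties(
            compute_type="General",
            core_count=8,
            time_to_live=120,  # minutes to keep the cluster alive after a run
        )
    )
)

adf.integration_runtimes.create_or_update(
    "<resource-group>", "<factory-name>", "DataFlowIR",
    IntegrationRuntimeResource(properties=ir),
)
```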
Troubleshoot a Failed Pipeline Run – Troubleshoot Data Storage Processing-2
The last topic to mention in the context of slowness has to do with overutilized compute resources. This is one of the most common scenarios you will encounter. The metrics you configure to monitor the health of your data analytics pipeline should target this specifically. When those metrics show that compute resources are under pressure, […]
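A straightforward way to watch for this is to query the platform metrics for the resources in the pipeline and flag sustained pressure. The sketch below uses the azure-monitor-query package; the resource ID, the cpu_percent metric name, and the 90 percent threshold are placeholder assumptions for a dedicated SQL pool.

```python
# Minimal sketch: check the last hour of CPU metrics for a resource and
# flag sustained pressure. Resource ID, metric, and threshold are assumptions.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricAggregationType, MetricsQueryClient

client = MetricsQueryClient(DefaultAzureCredential())

response = client.query_resource(
    "<resource-id-of-the-dedicated-sql-pool>",
    metric_names=["cpu_percent"],
    timespan=timedelta(hours=1),
    aggregations=[MetricAggregationType.AVERAGE, MetricAggregationType.MAXIMUM],
)

for metric in response.metrics:
    for series in metric.timeseries:
        for point in series.data:
            # Flag intervals where CPU peaked above 90 percent.
            if point.maximum is not None and point.maximum > 90:
                print(point.timestamp, point.average, point.maximum)
```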
Troubleshoot a Failed Pipeline Run – Troubleshoot Data Storage Processing-1
It is almost certain that at some point while running your data analytics procedures, something unexpected will happen. When it does, you must gather information, such as the symptoms experienced and the relevant log files, that will help you get to the root cause of the behavior. Knowing what you read in the last section about the […]
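For a failed Azure Data Factory pipeline run, the run ID is the handle to that information: it retrieves the run's overall status and error message, plus the per-activity errors underneath it. The sketch below uses the azure-mgmt-datafactory package; the run ID and resource names are placeholder assumptions.

```python
# Minimal sketch: fetch a failed pipeline run and the activity that failed.
# The run ID, resource group, and factory name are hypothetical placeholders.
from datetime import datetime, timedelta
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = adf.pipeline_runs.get("<resource-group>", "<factory-name>", "<run-id>")
print(run.status, run.message)  # e.g., "Failed" plus the top-level error text

# Drill into the activities to find which one raised the error.
activities = adf.activity_runs.query_by_pipeline_run(
    "<resource-group>", "<factory-name>", "<run-id>",
    RunFilterParameters(
        last_updated_after=datetime.utcnow() - timedelta(days=1),
        last_updated_before=datetime.utcnow(),
    ),
)
for activity in activities.value:
    if activity.status == "Failed":
        print(activity.activity_name, activity.error)
```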
Optimize Pipeline for Descriptive versus Analytical Workloads – Troubleshoot Data Storage Processing
The “Analytics Types” section in Chapter 2 described the numerous categories of data analytics (descriptive, diagnostic, predictive, preemptive, and prescriptive), each of which is an analytical workload. Add to that what you learned in the previous section: OLTP operations are transactional, and OLAP operations are analytical. With the review of those five data analytics types, […]