Netherlands: Software

Introducing Microsoft SQL Server 2016

CHAPTER 9 | Introducing Azure SQL Data Warehouse

Deciding how to scale

With all of these factors in mind, how do you know how to size your initial data warehouse, and when to scale up or down to match your workload and optimize costs? The concept of DWUs is designed to keep sizing as simple as possible: when you need faster results from your queries, increase the number of DWUs; when you need less power, decrease the number to reduce costs. As you plan your scaling strategy, consider the following scenarios:

- Data-warehouse usage patterns tend to correlate with business-activity patterns. For example, usage typically increases at the end of each week and month, and during those times you need more DWUs to service users than at other times. You can automate elastic scaling by creating a script that increases the number of DWUs during peak usage and decreases it at other times.
- You can pause your data warehouse at night, and thereby reduce overall compute charges, while continuing to maintain the data in storage until it is needed during business hours.
- During a period of heavy data-loading operations, you might increase DWUs so that users can access data more quickly.
- Each time you change the number of DWUs, run a few queries to baseline the new performance level. The process of scaling up or down is easy and fast, which means you can experiment with different settings without committing to more than an hour of compute time at a specific setting.

Note: If your database is less than 1 TB in size, you might not see the expected performance benefits from SQL Data Warehouse, because most of its optimizations are designed for larger data sets. Microsoft recommends that you use SQL Database for data volumes of less than 1 TB.

To scale your SQL Data Warehouse up or down, open the SQL Data Warehouse blade and click Scale. In the Scale blade, use the slider to adjust the number of DWUs, or type a number in the box to the right of the slider. You must use a value that is divisible by 100, or the performance level does not change in the blade. Click Save to apply the new performance level.

Data loads

Traditional data warehousing has two distinct usage patterns: large analytics queries and bulk loading of data from operational systems. The process of loading data is commonly known as extract, transform, and load (ETL). To perform ETL processing for SQL Data Warehouse, you have the following options:

- Azure Data Factory
- PolyBase
- SQL Server Integration Services (SSIS)

Note: To use the first two options, you might need the AzCopy command-line utility, which you use to copy data from your on-premises environment to blob storage. If you need to load five or more terabytes into blob storage, you should instead use the Azure Import/Export service. When you use this service, you ship hard drives to Microsoft for loading directly into storage, which might be necessary for the initial loading of your data warehouse. You can learn more about this service in "Use the Microsoft Azure Import/Export Service to Transfer Data to Blob Storage," us/documentation/articles/storage-import-export-service/.
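Returning to the scaling procedure described earlier: as an alternative to the portal slider, a DWU change can also be scripted with an ALTER DATABASE statement, which is how you would automate the peak/off-peak pattern. A minimal sketch, in which the data warehouse name ContosoDW and the DWU targets are hypothetical:

```sql
-- Hedged sketch: ContosoDW is a hypothetical data warehouse name.
-- Run these statements against the master database of the logical server.

-- Scale up before a peak-usage or heavy-loading window.
ALTER DATABASE ContosoDW MODIFY (SERVICE_OBJECTIVE = 'DW400');

-- Scale back down afterward to reduce compute costs.
ALTER DATABASE ContosoDW MODIFY (SERVICE_OBJECTIVE = 'DW100');
```

A scheduled job could issue these statements at the start and end of each business day, pairing them with a baseline query after each change as suggested above.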
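The AzCopy step mentioned in the note might look like the following. This is a sketch only: the storage account, container, local path, and key placeholder are all hypothetical, and the flag syntax shown is the Windows-style syntax used by the AzCopy versions current when this book was written.

```
AzCopy /Source:C:\StagingData\Sales ^
       /Dest:https://contosostorage.blob.core.windows.net/datacontainer/sales ^
       /DestKey:<storage-account-access-key> ^
       /Pattern:*.csv /S
```

The /S switch copies the directory recursively, so the blob paths mirror the local folder layout that a later load step can point at.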
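Of the three ETL options, PolyBase is the one driven entirely from T-SQL: you define an external table over files staged in blob storage and then load from it. A hedged sketch, in which every object name, the storage account, the container, and the column list are hypothetical placeholders:

```sql
-- Hedged sketch of a PolyBase load; assumes an "ext" schema already exists.
CREATE MASTER KEY;  -- required once per database before creating a credential

CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'user',
     SECRET = '<storage-account-access-key>';

CREATE EXTERNAL DATA SOURCE AzureBlobStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://datacontainer@contosostorage.blob.core.windows.net',
    CREDENTIAL = BlobStorageCredential
);

CREATE EXTERNAL FILE FORMAT CsvFileFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- External table over the CSV files staged in blob storage.
CREATE EXTERNAL TABLE ext.FactSales (
    SaleKey  INT,
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = AzureBlobStorage,
    FILE_FORMAT = CsvFileFormat
);

-- CREATE TABLE AS SELECT reads the external files in parallel
-- and loads the rows into a distributed warehouse table.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(SaleKey))
AS SELECT * FROM ext.FactSales;
```

Because the load runs as a set-based CTAS statement, this is also a scenario where temporarily increasing DWUs, as described in the scaling section, can shorten the load window.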

Articles in this issue

Links on this page

Archives of this issue

view archives of Netherlands: Software - Introductie van Micorosoft SQL Server 2016