
It is faster and easier to pass the Microsoft 70-534 exam by using printable Architecting Microsoft Azure Solutions questions and answers. Get immediate access to the leading 70-534 exam material, work through core-area Architecting Microsoft Azure Solutions questions with professionally verified answers, and pass your exam with a high score.
Q1. - (Topic 4)
You need to select the appropriate solution for monitoring the .NET application. What should you recommend?
A. Visual Studio IntelliTrace
B. Application Insights
C. Data Factory
D. Microsoft Analytics Platform
Answer: B
Explanation: Application Insights is the Azure service for monitoring live web applications, including .NET applications; it detects performance anomalies and tracks usage. The Microsoft Analytics Platform System is a data warehouse appliance, not an application monitoring solution.
Topic 5, Northwind traders
Background
Overview
Northwind Electric Cars is the premier provider of private, low-cost transportation in Denver. Northwind drivers are company employees who work together as a team. The founding partners believe that by hiring their drivers as employees, their drivers focus on providing a great customer experience. Northwind Electric Cars has a reputation for offering fast, reliable, and friendly service, due largely to their extensive network of drivers and their proprietary dispatching software named NorthRide.
Northwind Electric Cars drivers depend on frequent, automatic updates for the NorthRide mobile app. The Northwind management team is concerned about unplanned system downtime and slow connection speeds caused by high usage. Additionally, Northwind's in-house data storage solution is unsustainable because of the new influx of customer data that is retained. Data backups are made periodically on DVDs and stored on-premises at corporate headquarters.
Apps
NorthRide App
Northwind drivers use the NorthRide app to meet customer pickup requests. The app uses
a GPS transponder in each Northwind vehicle and Bing Maps APIs to monitor the location of each vehicle in the fleet in real time. NorthRide allows Northwind dispatchers to optimize their driver coverage throughout the city.
When new customers call, the dispatcher enters their pickup locations into NorthRide. NorthRide identifies the closest available driver. The dispatcher then contacts the driver with the pick-up details. This process usually results in a pick-up time that is far faster than the industry average.
Drivers use NorthRide to track the number of miles they drive and the number of customers they transport. Drivers also track their progress towards their established goals, which are measured by using key performance indicators (KPIs).
NorthRide App 2.0
Northwind Electric Cars is growing quickly. New callers often wait for their calls to be answered because the dispatchers are contacting their drivers to arrange pickups for other customers.
To support the growth of the business, Northwind's development team completes an overhaul of the NorthRide system that it has named NorthRide 2.0. When a dispatcher enters a customer's pickup location, the address and driving directions are automatically sent to the driver who is closest to the customer's pickup location.
Drivers indicate their availability on the NorthRide mobile app and can view progress towards their KPIs in real time. Drivers can also record customer ratings and feedback for each pickup.
Business Requirements
Apps
NorthRideFinder App
Northwind Electric Cars needs a customer-facing website and mobile app that allows customers to schedule pickups. Customers should also be able to create profiles that will help ensure the customer gets a ride faster by storing customer information.
Predictor App
Northwind Electric Cars needs a new solution named Predictor. Predictor is an employee-facing mobile app. The app predicts periods of high usage and popular pickup locations and provides various ways to view this predictive data. Northwind uses this information to
better distribute its drivers. Northwind wants to use the latest Azure technology to create this solution.
Other Requirements
✑ On-premises data must be constantly backed up.
✑ Mobile data must be protected from loss, even if connectivity with the backend is lost.
✑ Dispatch offices need to have seamless access to both their primary data center and the applications and services that are hosted in the Azure cloud.
✑ Connectivity needs to be redundant to on-premises and cloud services, while providing a way for each dispatch office to continue to operate even if one or all of the connection options fail.
✑ The management team requires that operational data is accessible 24/7 from any office location.
Technical Requirements
Apps and Website
NorthRide / NorthRideFinder Apps:
✑ The solution must support on-premises and Azure data storage.
✑ The solution must scale as necessary based on the current number of concurrent users.
✑ Customer pickup requests from NorthRideFinder must be asynchronous.
✑ The customer pickup request system will be high in volume, and each request will have a short life span.
✑ Data for NorthRideFinder must be protected during a loss of connectivity.
✑ NorthRide users must authenticate to the company's Azure Active Directory.
Northwind Public Website
✑ The customer website must use a WebJob to process profile images into thumbnails.
✑ The customer website must be developed with lowest cost and difficulty in mind.
✑ The customer website must automatically scale to minimize response times for customers.
Other Requirements
Data Storage:
✑ The data storage must interface with an on-premises Microsoft SQL backend database.
✑ A disaster recovery system needs to be in place for large amounts of data that will be backed up to Azure.
✑ Backups must be fully automated and managed through the Azure Management Portal.
✑ The recovery system for company data must use a hybrid solution to back up both the on-premises Microsoft SQL backend and any Azure storage.
Predictive Routing:
✑ An Azure solution must be used for prediction systems.
✑ Predictive analytics must be published as a web service and accessible by using the REST API.
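The predictive-routing requirement maps naturally to a published Azure Machine Learning web service. As a hedged sketch only (the endpoint URL, API key, and input columns below are placeholder assumptions, not values from the case study), a published experiment can be scored over the REST API like this:

```powershell
# Placeholder endpoint and key -- substitute the values shown on the web
# service's dashboard after the experiment is published.
$endpoint = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/execute?api-version=2.0"
$apiKey   = "<api-key>"

# Hypothetical input schema: hour of day and pickup zone.
$body = @{
    Inputs = @{
        input1 = @{
            ColumnNames = @("HourOfDay", "PickupZone")
            Values      = @(, @("17", "Downtown"))
        }
    }
} | ConvertTo-Json -Depth 6

Invoke-RestMethod -Uri $endpoint -Method Post -Body $body `
    -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $apiKey" }
```

Any REST-capable client (including the Predictor mobile app's backend) can call the same endpoint, which satisfies the "accessible by using the REST API" requirement.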
Security:
✑ The NorthRide app must use an additional level of authentication other than the employee's password.
✑ Access must be secured in NorthRide without opening a firewall port.
✑ Company policy prohibits inbound connections from internet callers to the on-premises network.
✑ Customer usernames in NorthRideFinder cannot exceed 10 characters.
✑ Customer data in NorthRideFinder can be received only by the user ID that is associated with the data.
Q2. DRAG DROP - (Topic 3)
You need to deploy the virtual machines to Azure.
Which four Azure PowerShell scripts should you run in sequence? To answer, move the appropriate scripts from the list of scripts to the answer area and arrange them in the correct order.
Answer:
Explanation:
Box 1:
Box 2:
Box 3:
Box 4:
Note:
* In order to upload a VHD file to Azure, we need :
✑ Azure PowerShell SDK
✑ A publish setting file
✑ An affinity group
✑ A Storage account
✑ A container
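Under the classic (Service Management) Azure PowerShell module, those prerequisites translate into a sequence along the following lines. All names below are placeholders and exact parameters may vary by module version; treat this as an illustrative sketch, not the exam's answer boxes:

```powershell
# 1. Import the publish settings file downloaded from the portal.
Import-AzurePublishSettingsFile "C:\azure\subscription.publishsettings"

# 2. Create an affinity group and a storage account inside it.
New-AzureAffinityGroup -Name "vmaffinity" -Location "West US"
New-AzureStorageAccount -StorageAccountName "vmstorage01" -AffinityGroup "vmaffinity"

# 3. Create a container for VHDs in that storage account.
$ctx = New-AzureStorageContext -StorageAccountName "vmstorage01" `
    -StorageAccountKey (Get-AzureStorageKey "vmstorage01").Primary
New-AzureStorageContainer -Name "vhds" -Context $ctx

# 4. Upload the local VHD as a page blob.
Add-AzureVhd -LocalFilePath "C:\vhds\server01.vhd" `
    -Destination "https://vmstorage01.blob.core.windows.net/vhds/server01.vhd"
```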
Topic 4, Lucerne Publishing
Background
Overview
Lucerne Publishing creates, stores, and delivers online media for advertising companies. This media is streamed to computers by using the web, and to mobile devices around the world by using native applications. The company currently supports the iOS, Android, and Windows Phone 8.1 platforms.
Lucerne Publishing uses proprietary software to manage its media workflow. This software has reached the end of its lifecycle. The company plans to move its media workflows to the cloud. Lucerne Publishing provides access to its customers, who are third-party companies, so that they can download, upload, search, and index media that is stored on Lucerne Publishing servers.
Apps and Applications
Lucerne Publishing develops the applications that customers use to deliver media. The
company currently provides the following media delivery applications:
✑ Lucerne Media W - a web application that delivers media by using any browser
✑ Lucerne Media M - a mobile app that delivers media by using Windows Phone 8.1
✑ Lucerne Media A - a mobile app that delivers media by using an iOS device
✑ Lucerne Media N - a mobile app that delivers media by using an Android device
✑ Lucerne Media D - a desktop client application that customers install on their local computer
Business Requirements
Lucerne Publishing's customers and their consumers have the following requirements:
✑ Access to media must be time-restricted once media is delivered to a consumer.
✑ The time required to download media to mobile devices must be minimized.
✑ Customers must have 24-hour access to media downloads regardless of their location or time zone.
✑ Lucerne Publishing must be able to monitor the performance and usage of its customer-facing app.
✑ Lucerne Publishing wants to make its asset catalog searchable without requiring a database redesign.
✑ Customers must be able to access all data by using a web application. They must
also be able to access data by using a mobile app that is provided by Lucerne Publishing.
✑ Customers must be able to search for media assets by key words and media type.
✑ Lucerne Publishing wants to move the asset catalog database to the cloud without formatting the source data.
Other Requirements
Development
Code and current development documents must be backed up at all times. All solutions
must be automatically built and deployed to Azure when code is checked in to source control.
Network Optimization
Lucerne Publishing has a .NET web application that runs on Azure. The web application analyzes storage and the distribution of its media assets. It needs to monitor the utilization of the web application. Ultimately, Lucerne Publishing hopes to cut its costs by reducing data replication without sacrificing its quality of service to its customers. The solution has the following requirements:
✑ Optimize the storage location and amount of duplication of media.
✑ Vary several parameters including the number of data nodes and the distance from node to customers.
✑ Minimize network bandwidth.
✑ Lucerne Publishing wants to be notified of exceptions in the web application.
Technical Requirements
Data Mining
Lucerne Publishing constantly mines its data to identify customer patterns. The company
plans to replace the existing on-premises cluster with a cloud-based solution. Lucerne Publishing has the following requirements:
Virtual machines:
✑ The data mining solution must support the use of hundreds to thousands of processing cores.
✑ Minimize the number of virtual machines by using more powerful virtual machines.
Each virtual machine must always have eight or more processor cores available.
✑ Allow the number of processor cores dedicated to an analysis to grow and shrink automatically based on the demand of the analysis.
✑ Virtual machines must use remote memory direct access to improve performance.
Task scheduling:
The solution must automatically schedule jobs. The scheduler must distribute the jobs based on the demand and available resources.
Data analysis results:
The solution must provide a web service that allows applications to access the results of analyses.
Other Requirements
Feature Support
✑ Ad copy data must be searchable in full text.
✑ Ad copy data must be indexed to optimize search speed.
✑ Media metadata must be stored in Azure Table storage.
✑ Media files must be stored in Azure BLOB storage.
✑ The customer-facing website must have access to all ad copy and media.
✑ The customer-facing website must automatically scale and replicate to locations around the world.
✑ Media and data must be replicated around the world to decrease the latency of data transfers.
✑ Media uploads must have fast data transfer rates (low latency) without the need to upload the data offline.
Security
✑ Customer access must be managed by using Active Directory.
✑ Media files must be encrypted by using the PlayReady encryption method.
✑ Customers must be able to upload media quickly and securely over a private connection with no opportunity for internet snooping.
Q3. HOTSPOT - (Topic 6)
You are migrating an on-premises application to Azure. The application requires secure storage of database connection strings. When the application is running locally, the connection strings are encrypted with an X509 certificate prior to being stored on disk. The X509 certificate is part of a trust chain to allow the certificate to be revoked by the Certificate Authority if a security breach is suspected.
The application must run on Azure. The X509 certificate must never be stored on disk or in RAM memory. A Certificate Authority must be able to revoke the certificate.
You need to configure Azure Key Vault.
How should you construct the Azure PowerShell script? To answer, select the appropriate Azure PowerShell commands in the answer area.
Answer:
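Although the hotspot answer area is not reproduced here, the requirements point to a Premium-tier Key Vault holding an HSM-protected key, because HSM-backed key material never leaves the hardware security module. A hedged sketch with placeholder names (cmdlet parameters may vary by Azure PowerShell version):

```powershell
# Premium SKU is required for HSM-protected keys.
New-AzureKeyVault -VaultName "AppSecretsVault" -ResourceGroupName "AppRG" `
    -Location "West US" -SKU "Premium"

# -Destination HSM keeps the key material inside the HSM, so it is never
# written to disk or held in application RAM; the CA can still revoke the
# associated certificate through its normal trust chain.
Add-AzureKeyVaultKey -VaultName "AppSecretsVault" -Name "ConnStringKey" `
    -Destination "HSM"
```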
Q4. - (Topic 6)
You are evaluating an Azure application. The application includes the following elements:
✑ A web role that provides the ASP.NET user interface and business logic
✑ A single SQL database that contains all application data
Each webpage must receive data from the business logic layer before returning results to the client. Traffic has increased significantly. The business logic is causing high CPU usage.
You need to recommend an approach for scaling the application. What should you recommend?
A. Store the business logic results in Azure Table storage.
B. Vertically partition the SQL database.
C. Move the business logic to a worker role.
D. Store the business logic results in Azure local storage.
Answer: C
Explanation: For Cloud Services in Azure applications need both web and worker roles to
scale well.
Reference: Application Patterns and Development Strategies for SQL Server in Azure Virtual Machines
https://msdn.microsoft.com/en-us/library/azure/dn574746.aspx
Topic 7, Woodgrove Bank
Overview
Woodgrove Bank has 20 regional offices and operates 1,500 branch office locations. Each regional office hosts the servers, infrastructure, and applications that support that region. Woodgrove Bank plans to move all of their on-premises resources to Azure, including virtual machine (VM)-based, line-of-business workloads and SQL databases. You are the owner of the Azure subscription that Woodgrove Bank is using. Your team is using Git repositories hosted on GitHub for source control.
Security
Currently, Woodgrove Bank's Computer Security Incident Response Team (CSIRT) has a problem investigating security issues due to the lack of security intelligence integrated with their current incident response tools. This lack of integration introduces a problem during the detection (too many false positives), assessment, and diagnose stages. You decide to use Azure Security Center to help address this problem.
Woodgrove Bank has several apps with regulated data such as Personally Identifiable Information (PII) that require a higher level of security. All apps are currently secured by using an on-premises Active Directory Domain Services (AD DS) deployment. The company depends on the following mission-critical apps: WGBLoanMaster, WGBLeaseLeader, and WGBCreditCruncher. You plan to move each of these apps to Azure as part of an app migration project.
Apps
The WGBLoanMaster app has been audited for transaction loss. Many transactions have been lost in processing and monetary write-offs have cost the bank. The app runs on two VMs that include several public endpoints.
The WGBLeaseLeader app has been audited for several data breaches. The app includes a SQL Server database and a web-based portal. The portal uses an ASP.NET Web API function to generate a monthly aggregate report from the database.
The WGBCreditCruncher app runs on a VM and is load balanced at the network level. The app includes several stateless components and must accommodate scaling of increased credit processing. The app runs on a nightly basis to process credit transactions that are batched during the day. The app includes a web-based portal where customers can check their credit information. A mobile version of the app allows users to upload check images.
Business Requirements: WGBLoanMaster App
The app audit revealed a need for zero transaction loss. The business is losing money due to the app losing and not processing loan information. In addition, transactions fail to process after running for a long time. The business has requested the aggregation processing to be scheduled for 01:00 to prevent system slowdown.
WGBLeaseLeader App
The app should be secured to stop data breaches. If the data is breached, it must not be readable. The app is continuing to see increased volume and the business does not want the issues presented in the WGBLoanMaster app. Transaction loss is unacceptable, and although the lease monetary amounts are smaller than loans, they are still an important profit center for Woodgrove Bank. The business would also like the monthly report to be automatically generated on the first of the month. Currently, a user must log in to the portal and click a button to generate the report.
WGBCreditCruncher app
The web-based portal area of the app must allow users to sign in with their Facebook credentials. The bank would like to allow this feature to enable more users to check their credit within the app.
Woodgrove Bank needs to develop a new financial risk modeling feature that they can include in the WGBCreditCruncher app. The financial risk modeling feature has not been developed due to costs associated with processing, transforming, and analyzing the large volumes of data that are collected. You need to find a way to implement parallel processing to ensure that the features runs efficiently, reliably, and quickly. The feature must scale based on computing demand to process the large volumes of data and output several financial risk models.
Technical Requirements: WGBLoanMaster App
The app uses several compute-intensive tasks that create long-running requests to the system. The app is critical to the business and must be scalable to increased loan processing demands. The VMs that run the app include a Windows Task Scheduler task that aggregates loan information from the app to send to a third party. This task runs a console app on the VM.
The app requires a messaging system to handle transaction processing. The messaging system must meet the following requirements:
*Allow messages to reside in the queue for up to a month
*Be able to publish and consume batches of messages
*Allow full integration with the Windows Communication Foundation (WCF) communication stack
*Provide a role-based access model to the queues, including different permissions for senders and receivers
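Those four requirements (month-long message TTL, batched publish/consume, WCF integration, and separate sender/receiver permissions) align with Azure Service Bus queues rather than Azure Storage queues. A hedged provisioning sketch with placeholder names:

```powershell
# Classic cmdlet: creates a Service Bus messaging namespace. Queues created in
# it support long TTLs, batched send/receive, WCF bindings, and shared access
# policies that grant Send and Listen rights separately.
New-AzureSBNamespace -Name "wgbloansns" -Location "West US" -NamespaceType Messaging
```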
You develop an Azure Resource Manager (ARM) template to deploy the VMs used to support the app. The template must be deployed to a new resource group and you must validate your deployment settings before creating actual resources.
WGBLeaseLeader App
The app must use Azure SQL Databases as a replacement to the current Microsoft SQL Server environment. The monthly report must be automatically generated.
The app requires a messaging system to handle transaction processing. The messaging system must meet the following requirements:
*Require server-side logs of all of the transactions run against your queues
*Track progress of a message within the queue
*Process the messages within 7 days
*Provide a differing timeout value per message
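In contrast, these requirements (server-side transaction logs, progress tracking within the queue, a 7-day maximum message lifetime, and a per-message timeout) match Azure Storage queues. A hedged sketch with placeholder names:

```powershell
# Storage Analytics logging covers the server-side transaction log requirement;
# UpdateMessage and per-message visibility timeouts cover progress tracking and
# differing timeout values. The account name and key below are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "wgbstorage01" `
    -StorageAccountKey "<account-key>"
New-AzureStorageQueue -Name "leasetransactions" -Context $ctx
```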
WGBCreditCruncher app
The app must:
*Secure inbound and outbound traffic
*Analyze inbound network traffic for vulnerabilities.
*Use an instance-level public IP and allow web traffic on port 443 only.
*Upgrade the portal to a Single Page Application (SPA) that uses JavaScript, Azure Active Directory (Azure AD), and the OAuth 2.0 implicit authorization grant to secure the Web API back end.
*Cache authentication and host the Web API back end using the Open Web Interface for .NET (OWIN) middleware.
*Immediately compress check images received from the mobile web app.
*Schedule processing of the batched credit transactions on a nightly basis.
*Provide parallel processing and scalable computing resources to output financial risk models.
*Use simultaneous compute nodes to enable high performance computing and updating of the financial risk models.
Key Security Areas
Q5. - (Topic 6)
You are designing an Azure web application. The application uses one worker role. It does not use SQL Database. You have the following requirements:
✑ Maximize throughput and system resource availability
✑ Minimize downtime during scaling
You need to recommend an approach for scaling the application. Which approach should you recommend?
A. Increase the role instance size.
B. Set up horizontal partitioning.
C. Increase the number of role instances.
D. Set up vertical partitioning.
Answer: C
Explanation: On the Scale page of the Azure Management Portal, you can manually scale your application or you can set parameters to automatically scale it. You can scale applications that are running Web Roles, Worker Roles, or Virtual Machines. To scale an application that is running instances of Web Roles or Worker Roles, you add or remove role instances to accommodate the work load.
Reference: How to Scale an Application
http://azure.microsoft.com/en-gb/documentation/articles/cloud-services-how-to-scale/
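With the classic Cloud Services PowerShell module, scaling out a deployed role is a one-line instance-count change; the service and role names below are placeholders:

```powershell
# Adds instances without redeploying the package, which minimizes downtime
# during scaling and increases total throughput across instances.
Set-AzureRole -ServiceName "myservice" -Slot "Production" `
    -RoleName "WorkerRole1" -Count 4
```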
Q6. - (Topic 6)
You are designing an Azure web application.
All users must authenticate by using Active Directory Domain Services (AD DS) credentials.
You need to recommend an approach to enable single sign-on to the application for domain-authenticated users.
Which two actions should you recommend? Each correct answer presents part of the solution.
A. Use Forms authentication to generate claims.
B. Use the SQL membership provider in the web application.
C. Use Windows Identity Foundation in the web application.
D. Use Active Directory Federation Services (AD FS) to generate claims.
Answer: C,D
Explanation:
Reference: What is Windows Identity Foundation? https://msdn.microsoft.com/en-us/library/ee748475.aspx
Reference: DirSync with Single Sign-On https://msdn.microsoft.com/en-us/library/azure/dn441213.aspx
Q7. - (Topic 6)
You are designing the deployment of virtual machines (VMs) and web services that run in Azure.
You need to specify the desired state of a node and ensure that the node remains at that state.
What should you use?
A. Microsoft Azure Pack
B. Service Management Automation
C. System Center 2012 Orchestrator
D. Azure Automation
Answer: D
Explanation: Azure Automation Desired State Configuration (DSC) lets you declare the desired state of a node and continuously enforces it. Microsoft Azure Pack is a private-cloud portal, not a configuration-enforcement service.
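Desired State Configuration (DSC), which Azure Automation can apply and enforce, declares a node's state and re-corrects drift so the node remains as specified. A minimal illustrative configuration (the node and feature names are assumptions, not part of the question):

```powershell
Configuration WebNodeState {
    Node "webnode01" {
        # Declares that IIS must be present; the Local Configuration Manager
        # periodically re-applies this state if the node drifts from it.
        WindowsFeature IIS {
            Ensure = "Present"
            Name   = "Web-Server"
        }
    }
}
WebNodeState   # compiles a MOF document that Azure Automation DSC can push or pull to the node
```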
Q8. - (Topic 6)
You are designing a web app deployment in Azure.
You need to ensure that inbound requests to the web app are routed based on the endpoint that has the lowest latency.
What should you use?
A. Azure health probes
B. Azure Fabric Controller
C. Azure Load Balancer
D. Azure Traffic Manager
Answer: D
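Azure Traffic Manager's Performance routing method answers each client's DNS query with the endpoint that has the lowest network latency for that client. A hedged classic-cmdlet sketch with placeholder names:

```powershell
# "Performance" is the load-balancing method that routes by lowest latency;
# the profile and domain names below are placeholders.
New-AzureTrafficManagerProfile -Name "webtm" `
    -DomainName "webtm.trafficmanager.net" `
    -LoadBalancingMethod "Performance" -Ttl 30 `
    -MonitorProtocol "Http" -MonitorPort 80 -MonitorRelativePath "/"
```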
Q9. - (Topic 1)
You are designing a plan to deploy a new application to Azure. The solution must provide a single sign-on experience for users.
You need to recommend an authentication type. Which authentication type should you recommend?
A. SAML credential tokens
B. Azure managed access keys
C. Windows Authentication
D. MS-CHAP
Answer: A
Explanation: A Microsoft cloud service administrator who wants to provide their Azure Active Directory (AD) users with sign-on validation can use a SAML 2.0 compliant SP-Lite profile based Identity Provider as their preferred Security Token Service (STS) / identity provider. This is useful where the solution implementer already has a user directory and password store on-premises that can be accessed using SAML 2.0. This existing user directory can be used for sign-on to Office 365 and other Azure AD-secured resources.
Reference: Use a SAML 2.0 identity provider to implement single sign-on https://msdn.microsoft.com/en-us/library/azure/dn641269.aspx?f=255&MSPPError=-2147217396
Topic 2, Trey Research
Background
Overview
Trey Research conducts agricultural research and sells the results to the agriculture and food industries. The company uses a combination of on-premises and third-party server clusters to meet its storage needs. Trey Research has seasonal demands on its services, with up to 50 percent drops in data capacity and bandwidth demand during low-demand periods. They plan to host their websites in an agile, cloud environment where the company can deploy and remove its websites based on its business requirements rather than the requirements of the hosting company.
A recent fire near the datacenter that Trey Research uses raises the management team's awareness of the vulnerability of hosting all of the company's websites and data at any single location. The management team is concerned about protecting its data from loss as a result of a disaster.
Websites
Trey Research has a portfolio of 300 websites and associated background processes that are currently hosted in a third-party datacenter. All of the websites are written in ASP.NET, and the background processes use Windows Services. The hosting environment costs Trey Research approximately $25 million in hosting and maintenance fees.
Infrastructure
Trey Research also has on-premises servers that run VMs to support line-of-business applications. The company wants to migrate the line-of-business applications to the cloud, one application at a time. The company is migrating most of its production VMs from an aging VMware ESXi farm to a Hyper-V cluster that runs on Windows Server 2012.
Applications
DistributionTracking
Trey Research has a web application named DistributionTracking. This application constantly collects real-time data that tracks worldwide distribution points to customer retail sites. This data is available to customers at all times.
The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information. The system must continue running in the event of VM failures without corrupting data. The system is processor intensive and should be run in a multithreading environment.
HRApp
The company has a human resources (HR) application named HRApp that stores data in an on-premises SQL Server database. The database must have at least two copies, but data to support backups and business continuity must stay in Trey Research locations only. The data must remain on-premises and cannot be stored in the cloud.
HRApp was written by a third party, and the code cannot be modified. The human resources data is used by all business offices, and each office requires access to the entire database. Users report that HRApp takes all night to generate the required payroll reports, and they would like to reduce this time.
MetricsTracking
Trey Research has an application named MetricsTracking that is used to track analytics for the DistributionTracking web application. The data MetricsTracking collects is not customer-facing. Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud. Employees at other locations access this data by using a remote desktop connection to connect to the application, but latency issues degrade the functionality.
Trey Research wants a solution that allows remote employees to access metrics data without using a remote desktop connection. MetricsTracking was written in-house, and the development team is available to make modifications to the application if necessary. However, the company wants to continue to use SQL Server for MetricsTracking.
Business Requirements
Business Continuity
You have the following requirements:
✑ Move all customer-facing data to the cloud.
✑ Web servers should be backed up to geographically separate locations.
✑ If one website becomes unavailable, customers should automatically be routed to websites that are still operational.
✑ Data must be available regardless of the operational status of any particular website.
✑ The HRApp system must remain on-premises and must be backed up.
✑ The MetricsTracking data must be replicated so that it is locally available to all Trey Research offices.
Auditing and Security
You have the following requirements:
✑ Both internal and external consumers should be able to access research results.
✑ Internal users should be able to access data by using their existing company credentials without requiring multiple logins.
✑ Consumers should be able to access the service by using their Microsoft credentials.
✑ Applications written to access the data must be authenticated.
✑ Access and activity must be monitored and audited.
✑ Ensure the security and integrity of the data collected from the worldwide distribution points for the distribution tracking application.
Storage and Processing
You have the following requirements:
✑ Provide real-time analysis of distribution tracking data by geographic location.
✑ Collect and store large datasets in real-time data for customer use.
✑ Locate the distribution tracking data as close to the central office as possible to improve bandwidth.
✑ Co-locate the distribution tracking data as close to the customer as possible based on the customer's location.
✑ Distribution tracking data must be stored in the JSON format and indexed by metadata that is stored in a SQL Server database.
✑ Data in the cloud must be stored in geographically separate locations, but kept within the same political boundaries.
Technical Requirements
Migration
You have the following requirements:
✑ Deploy all websites to Azure.
✑ Replace on-premises and third-party physical server clusters with cloud-based solutions.
✑ Optimize the speed for retrieving existing JSON objects that contain the distribution tracking data.
✑ Recommend strategies for partitioning data for load balancing.
Auditing and Security
You have the following requirements:
✑ Use Active Directory for internal and external authentication.
✑ Use OAuth for application authentication.
Business Continuity
You have the following requirements:
✑ Data must be backed up to separate geographic locations.
✑ Web servers must run concurrent versions of all websites in distinct geographic locations.
✑ Use Azure to back up the on-premises MetricsTracking data.
✑ Use Azure virtual machines as a recovery platform for MetricsTracking and HRApp.
✑ Ensure that there is at least one additional on-premises recovery environment for the HRApp.
Q10. DRAG DROP - (Topic 2)
You need to implement testing for the DataManager mobile application.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Answer:
Topic 3, Contoso, Ltd
Background
Overview
Contoso, Ltd., manufactures and sells golf clubs and golf balls. Contoso also sells golf accessories under the Contoso Golf and Odyssey brands worldwide.
Most of the company's IT infrastructure is located in the company's Carlsbad, California, headquarters. Contoso also has a sizable third-party colocation datacenter that costs the company USD $30,000 to $40,000 a month. Contoso has other servers scattered around the United States.
Contoso, Ltd., has the following goals:
✑ Move many consumer-facing websites, enterprise databases, and enterprise web services to Azure.
✑ Improve the performance for customers and resellers who access company websites from around the world.
✑ Provide support for provisioning resources to meet bursts of demand.
✑ Consolidate and improve the utilization of website- and database-hosting resources.
✑ Avoid downtime, particularly that caused by web and database server updating.
✑ Leverage familiarity with Microsoft server management tools.
Infrastructure
Contoso's datacenters are filled with dozens of smaller web servers and databases that run on under-utilized hardware. This creates issues for data backup. Contoso currently backs up data to tape by using System Center Data Protection Manager. System Center Operations Manager is not deployed in the enterprise.
All of the servers are expensive to acquire and maintain, and scaling the infrastructure takes significant time. Contoso conducts weekly server maintenance, which causes downtime for some of its global offices. Special events, such as high-profile golf tournaments, create a large increase in site traffic. Contoso has difficulty scaling the web-hosting environment fast enough to meet these surges in site traffic.
Contoso has resellers and consumers in Japan and China. These resellers must use applications that run in a datacenter that is located in the state of Texas, in the United States. Because of the physical distance, the resellers experience slow response times and downtime.
Business Requirements
Management and Performance
✑ Web servers and databases must automatically apply updates to the operating system and products.
✑ Automatically monitor the health of worldwide sites, databases, and virtual machines.
✑ Automatically back up the website and databases.
✑ Manage hosted resources by using on-premises tools.
Performance
✑ The management team would like to centralize data backups and eliminate the use of tapes.
✑ The website must automatically scale without code changes or redeployment.
✑ Support changes in service tier without reconfiguration or redeployment.
✑ Site-hosting must automatically scale to accommodate data bandwidth and number of connections.
✑ Scale databases without requiring migration to a larger server.
✑ Migrate business critical applications to Azure.
✑ Migrate databases to the cloud and centralize databases where possible.
Business Continuity and Support
Business Continuity
✑ Minimize downtime in the event of regional disasters.
✑ Recover data if unintentional modifications or deletions are discovered.
✑ Run the website on multiple web server instances to minimize downtime and support a high service level agreement (SLA).
Connectivity
✑ Allow enterprise web services to access data and other services located on-premises.
✑ Provide and monitor lowest latency possible to website visitors.
✑ Automatically balance traffic among all web servers.
✑ Provide secure transactions for users of both legacy and modern browsers.
✑ Provide automated auditing and reporting of web servers and databases.
✑ Support single sign-on from multiple domains.
Development Environment
You identify the following requirements for the development environment:
✑ Support the current development team's knowledge of Microsoft web development and SQL Server tools.
✑ Support building experimental applications by using data from the Azure deployment and on-premises data sources.
✑ Mitigate the need to purchase additional tools for monitoring and debugging.
✑ System designers and architects must be able to create custom Web APIs without requiring any coding.
✑ Support automatic website deployment from source control.
✑ Support automated build verification and testing to mitigate bugs introduced during builds.
✑ Manage website versions across all deployments.
✑ Ensure that website versions are consistent across all deployments.
Technical Requirements
Management and Performance
✑ Use build automation to deploy directly from Visual Studio.
✑ Use build-time versioning of assets and builds/releases.
✑ Automate common IT tasks such as VM creation by using Windows PowerShell workflows.
✑ Use advanced monitoring features and reports of workloads in Azure by using existing Microsoft tools.
Performance
✑ Websites must automatically load balance across multiple servers to adapt to varying traffic.
✑ In production, websites must run on multiple instances.
✑ First-time published websites must be published by using Visual Studio and scaled to a single instance to test publishing.
✑ Data storage must support automatic load balancing across multiple servers.
✑ Websites must adapt to wide increases in traffic during special events.
✑ Azure virtual machines (VMs) must be created in the same datacenter when applicable.
Business Continuity and Support
Business Continuity
✑ Automatically co-locate data and applications in different geographic locations.
✑ Provide real-time reporting of changes to critical data and binaries.
✑ Provide real-time alerts of security exceptions.
✑ Unwanted deletions or modifications of data must be reversible for up to one month, especially in business critical applications and databases.
✑ Any cloud-hosted servers must be highly available.
Enterprise Support
✑ The solution must use stored procedures to access on-premises SQL Server data from Azure.
✑ A debugger must automatically attach to websites on a weekly basis. The scripts that handle the configuration and setup of debugging cannot work if there is a delay in attaching the debugger.