70-776 Questions And Answers

$68

Exam Name: Perform Big Data Engineering on Microsoft Cloud Services (beta)

Updated: 2019-06-26

Q & A: 56

Money Back Guaranteed

70-776 Frequently Asked Questions

Q1: Can I use 70-776 exam Q&As in my phone?
Yes. PassQuestion provides MCSA 70-776 PDF Q&As that you can download and study on your computer or mobile device. We also provide a free 70-776 PDF demo, taken from the full version, so you can check its quality before purchasing.

Q2: What are the formats of your Microsoft 70-776 exam questions?
PassQuestion provides Microsoft 70-776 exam questions in PDF format and software format. The PDF file is sent as an email attachment, and the software is sent as a download link. Please download from the link within a week; it becomes invalid automatically after that.

Q3: How can I download my 70-776 test questions after purchasing?
We will send the MCSA 70-776 test questions to your email as soon as we receive your order. Please make sure your email address is valid, or leave an alternate email.

Q4: How long can I get my MCSA 70-776 questions and answers after purchasing?
We will send the MCSA 70-776 questions and answers to your email within 10 minutes during our working time, and within 12 hours outside our working time.

Working Time:
GMT+8: Monday-Saturday, 8:00 AM-6:00 PM
GMT: Monday-Saturday, 0:00 AM-10:00 AM

Q5: Can I pass my test with your MCSA 70-776 practice questions only?
Sure! All of PassQuestion's MCSA 70-776 practice questions come from the real test. If you practice well and get a good score on our practice Q&As, we are confident you can pass your Perform Big Data Engineering on Microsoft Cloud Services (beta) exam easily.

Q6: How can I know when my 70-776 questions are updated?
You can check the number of questions; if it has changed, that means we have updated this exam. You can contact us anytime to ask for a free update. Our sales email: [email protected]

Q7: What is your refund process if I fail the Microsoft 70-776 test?
If you fail your 70-776 test within 60 days of studying our study material, just scan your score report and send it to us as an attachment. Once we verify it, we will give you a full refund.

Q8. What other payment methods can I use besides PayPal?
If your country does not support PayPal, we offer another payment method, Western Union, which is also safe and fast. Please contact us for the details and we will send them to your email.

Question No : 1

Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of Repeated Scenario:
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transformation, and load (ETL) functions.
For each table in LocalDW, you create a table in AzureDW.
On the on-premises network, you have a Data Management Gateway.
Some source data is stored in Azure Blob storage. Some source data is stored on an on-premises Microsoft SQL Server instance. The instance has a table named Table1.
After data is processed by using AzureDF, the data must be archived and accessible forever. The archived data must meet a Service Level Agreement (SLA) for availability of 99 percent. If an Azure region fails, the archived data must always be available for reading. The storage solution for the archived data must minimize costs.
End of Repeated Scenario
You need to define the schema of Table1 in AzureDF.
What should you create?
A. a gateway
B. a linked service
C. a dataset
D. a pipeline
Answer: C

Question No : 2

HOTSPOT
You have a Microsoft Azure Data Lake Analytics service.
You have a file named Employee.tsv that contains data on employees. Employee.tsv contains seven columns named EmpId, Start, FirstName, LastName, Age, Department, and Title.
You need to create a Data Lake Analytics job to transform Employee.tsv, define a schema for the data, and output the data to a CSV file. The output data must contain only employees who are in the sales department. The Age column must allow NULL.
How should you complete the U-SQL code segment? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.



Answer:
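The answer-area images are not reproduced in this copy. As a rough sketch only (file paths and extractor options are assumptions, not the official answer), a completed U-SQL script for this scenario could look like the following:

```sql
// Hypothetical sketch: define the schema (Age as nullable int?),
// keep only Sales-department rows, and write the result to CSV.
@employees =
    EXTRACT EmpId int,
            Start DateTime,
            FirstName string,
            LastName string,
            Age int?,              // int? allows NULL values
            Department string,
            Title string
    FROM "/input/Employee.tsv"
    USING Extractors.Tsv();

@sales =
    SELECT EmpId, Start, FirstName, LastName, Age, Department, Title
    FROM @employees
    WHERE Department == "Sales";

OUTPUT @sales
    TO "/output/SalesEmployees.csv"
    USING Outputters.Csv();
```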


Question No : 3

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
You are monitoring user queries to a Microsoft Azure SQL data warehouse that has six compute nodes.
You discover that compute node utilization is uneven. The rows_processed column from sys.dm_pdw_dms_workers shows a significant variation in the number of rows being moved among the distributions for the same table for the same query.
You need to ensure that the load is distributed evenly across the compute nodes.
Solution: You change the table to use a column that is not skewed for hash distribution.
Does this meet the goal?
A. Yes
B. No
Answer: A
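For context, redistributing a skewed table in Azure SQL Data Warehouse is typically done with CTAS followed by a rename; a minimal sketch (the table and column names here are assumptions):

```sql
-- Hypothetical sketch: rebuild the table hashed on a high-cardinality,
-- evenly distributed column so rows spread evenly across distributions.
CREATE TABLE dbo.FactSales_New
WITH
(
    DISTRIBUTION = HASH(CustomerKey),   -- a column that is not skewed
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT * FROM dbo.FactSales;

-- Swap the names so existing queries keep working.
RENAME OBJECT dbo.FactSales TO FactSales_Old;
RENAME OBJECT dbo.FactSales_New TO FactSales;
```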

Question No : 4

You plan to add a file from Microsoft Azure Data Lake Store to Azure Data Catalog.
You run the Data Catalog tool and select Data Lake Store as the data source.
Which information should you enter in the Store Account field to connect to the Data Lake Store?
A. an email alias
B. a server name
C. a URL
D. a subscription ID
Answer: D

Question No : 5

You have sensor devices that report data to Microsoft Azure Stream Analytics. Each sensor reports data several times per second.
You need to create a live dashboard in Microsoft Power BI that shows the performance of the sensor devices. The solution must minimize lag when visualizing the data.
Which function should you use for the time-series data element?
A. LAG
B. SlidingWindow
C. System.TimeStamp
D. TumblingWindow
Answer: C
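System.Timestamp returns each event's assigned timestamp without waiting for a window to close, which keeps dashboard lag low. A minimal sketch of a Stream Analytics query using it (the input, output, and column names are assumptions):

```sql
-- Hypothetical sketch: emit each sensor reading with its event time
-- so Power BI can plot the time series with minimal lag.
SELECT
    SensorId,
    Reading,
    System.Timestamp AS EventTime
INTO
    PowerBIOutput
FROM
    SensorInput TIMESTAMP BY ReadingTime
```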

Question No : 6

You are developing an application that uses Microsoft Azure Stream Analytics.
You have data structures that are defined dynamically.
You want to enable consistency between the logical methods used by stream processing and batch processing.
You need to ensure that the data can be integrated by using consistent data points.
What should you use to process the data?
A. a vectorized Microsoft SQL Server Database Engine
B. directed acyclic graph (DAG)
C. Apache Spark queries that use updateStateByKey operators
D. Apache Spark queries that use mapWithState operators
Answer: D

Question No : 7

HOTSPOT
You use Microsoft Visual Studio to develop custom solutions for customers who use Microsoft Azure Data Lake Analytics. You install the Data Lake Tools for Visual Studio.
You need to identify which tasks can be performed from Visual Studio and which tasks can be performed from the Azure portal.
What should you identify for each task? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.



Answer:


Question No : 8


You have a Microsoft Azure SQL data warehouse that has 10 compute nodes.
You need to export 10 TB of data from a data warehouse table to several new flat files in Azure Blob storage. The solution must maximize the use of the available compute nodes.
What should you do?
Answer:
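The published answer text is missing from this copy. In Azure SQL Data Warehouse, a parallel export that engages all compute nodes is typically done with PolyBase via CREATE EXTERNAL TABLE AS SELECT (CETAS); a sketch under assumed names:

```sql
-- Hypothetical sketch: CETAS exports through PolyBase in parallel,
-- writing multiple flat files to the blob container at once.
CREATE EXTERNAL TABLE dbo.FactSales_Export
WITH
(
    LOCATION = '/export/factsales/',
    DATA_SOURCE = AzureBlobStore,    -- an existing external data source
    FILE_FORMAT = TextFileFormat     -- an existing external file format
)
AS
SELECT * FROM dbo.FactSales;
```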

Question No : 9

HOTSPOT
Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of Repeated Scenario:
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transformation, and load (ETL) functions.
For each table in LocalDW, you create a table in AzureDW.
On the on-premises network, you have a Data Management Gateway.
Some source data is stored in Azure Blob storage. Some source data is stored on an on-premises Microsoft SQL Server instance. The instance has a table named Table1.
After data is processed by using AzureDF, the data must be archived and accessible forever. The archived data must meet a Service Level Agreement (SLA) for availability of 99 percent. If an Azure region fails, the archived data must always be available for reading. The storage solution for the archived data must minimize costs.
End of Repeated Scenario
How should you configure the storage to archive the source data? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.



Answer:


Question No : 10

HOTSPOT
You have a Microsoft Azure Data Lake Analytics service.
You have a tab-delimited file named UserActivity.tsv that contains logs of user sessions. The file does not have a header row.
You need to create a table and load the logs into the table. The solution must distribute the data by a column named SessionId.
How should you complete the U-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.



Answer:


Question No : 11

You are using the cognitive capabilities in U-SQL to analyze images that contain different types of objects.
You need to identify which objects might be people.
Which two reference assemblies should you use? Each correct answer presents part of the solution.
A. ExtPython
B. ImageCommon
C. ImageTagging
D. ExtR
E. FaceSdk
Answer: BE
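For reference, the image-cognition assemblies are pulled in with REFERENCE ASSEMBLY statements. A hedged sketch of face detection in U-SQL follows; the processor and output columns are patterned on Microsoft's published samples, but treat them as assumptions:

```sql
// Hypothetical sketch: ImageCommon provides image loading, and FaceSdk
// detects faces, i.e. objects that might be people.
REFERENCE ASSEMBLY ImageCommon;
REFERENCE ASSEMBLY FaceSdk;

@faces =
    PROCESS @images
    PRODUCE FileName,
            NumFaces int,
            FaceAge string,
            FaceGender string
    READONLY FileName
    USING new Cognition.Vision.FaceDetector();
```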

Question No : 12

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a table named Table1 that contains 3 billion rows. Table1 contains data from the last 36 months.
At the end of every month, the oldest month of data is removed based on a column named DateTime.
You need to minimize how long it takes to remove the oldest month of data.
Solution: You implement a columnstore index on the DateTime column.
Does this meet the goal?
A. Yes
B. No
Answer: B
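A columnstore index does not speed up the monthly delete; the usual fast path is to partition Table1 on the DateTime column and switch the oldest partition out, which is a metadata-only operation. A sketch under assumed names:

```sql
-- Hypothetical sketch: switching a partition out moves the oldest month
-- as a metadata operation instead of deleting rows one by one.
ALTER TABLE dbo.Table1
SWITCH PARTITION 1 TO dbo.Table1_Archive;

-- The archive table (same schema and partition scheme) can then be
-- truncated or dropped almost instantly.
TRUNCATE TABLE dbo.Table1_Archive;
```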

Question No : 13

DRAG DROP
You use Microsoft Azure Stream Analytics to analyze data from an Azure event hub in real time and send the output to a table named Table1 in an Azure SQL database. Table1 has three columns named Date, EventID, and User.
You need to prevent duplicate data from being stored in the database.
How should you complete the statement?
To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.



Answer:


Question No : 14

You have a Microsoft Azure SQL data warehouse.
Users discover that reports running in the data warehouse take longer than expected to complete.
You need to review the duration of the queries and which users are running the queries currently.
Which dynamic management view should you review for each requirement? To answer, drag the appropriate dynamic management view to the correct requirements. Each dynamic management view may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.


Answer:
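The answer images are missing from this copy. For reference, query duration is typically read from sys.dm_pdw_exec_requests, and the users behind currently running queries from sys.dm_pdw_exec_sessions; a sketch:

```sql
-- Hypothetical sketch: join current requests to their sessions to see
-- how long each query has been running and which login issued it.
SELECT s.login_name,
       r.command,
       r.total_elapsed_time,
       r.status
FROM sys.dm_pdw_exec_requests AS r
JOIN sys.dm_pdw_exec_sessions AS s
    ON r.session_id = s.session_id
WHERE r.status = 'Running';
```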

Question No : 15

You have a Microsoft Azure SQL data warehouse named DW1. Data is loaded to DW1 once daily at 01:00.
A user accidentally deletes data from a fact table in DW1 at 09:00.
You need to recover the lost data. The solution must prevent the need to change any connection strings and must minimize downtime.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


Answer:
