Data Engineering on Microsoft Azure Exam: Real DP-203 Online Study Guide to Pass

Posted by: Martina

The Data Engineering on Microsoft Azure DP-203 exam is the requirement for the Microsoft Certified: Azure Data Engineer Associate certification. Candidates for the Microsoft DP-203 exam should have subject matter expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions. The real Microsoft DP-203 online study guide will help you pass the Data Engineering on Microsoft Azure DP-203 exam. If you want free Microsoft DP-203 exam questions and answers for learning, the free DP-203 online questions are available. You can read the free ReaLEXAM.ES DP-203 exam questions below.


1. You need to design a data retention solution for the Twitter feed data records. The solution must meet the customer sentiment analytics requirements.

Which Azure Storage functionality should you include in the solution?

2. HOTSPOT

You are building an Azure Stream Analytics job to identify how much time a user spends interacting with a feature on a webpage.

The job receives events based on user actions on the webpage. Each row of data represents an event. Each event has a type of either 'start' or 'end'.

You need to calculate the duration between start and end events.

How should you complete the query? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



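For reference, the pattern below is a minimal sketch in the Stream Analytics query language for pairing 'start' and 'end' events with LAST ... OVER and DATEDIFF; the input name, the Time timestamp column, and the [user]/feature/Event column names are assumptions for illustration, not part of the original question.

    SELECT
        [user],
        feature,
        DATEDIFF(second,
                 LAST(Time) OVER (PARTITION BY [user], feature
                                  LIMIT DURATION(hour, 1)
                                  WHEN Event = 'start'),
                 Time) AS duration
    FROM input TIMESTAMP BY Time
    WHERE Event = 'end'

The LIMIT DURATION clause bounds how far back the job searches for the matching 'start' event.
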
3. You have files and folders in Azure Data Lake Storage Gen2 for an Azure Synapse workspace as shown in the following exhibit.





You create an external table named ExtTable that has LOCATION='/topfolder/'.

When you query ExtTable by using an Azure Synapse Analytics serverless SQL pool, which files are returned?

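For context, an external table of this kind in a serverless SQL pool is typically created with T-SQL along the lines of the sketch below; the data source, file format, and column definitions are placeholders, since the exhibit defines the actual files and layout.

    CREATE EXTERNAL DATA SOURCE MyAdlsSource
    WITH ( LOCATION = 'https://mystorageaccount.dfs.core.windows.net/mycontainer' );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH ( FORMAT_TYPE = DELIMITEDTEXT,
           FORMAT_OPTIONS ( FIELD_TERMINATOR = ',', FIRST_ROW = 2 ) );

    CREATE EXTERNAL TABLE ExtTable
    (
        Col1 NVARCHAR(100),
        Col2 NVARCHAR(100)
    )
    WITH (
        LOCATION = '/topfolder/',     -- folder path relative to the data source
        DATA_SOURCE = MyAdlsSource,
        FILE_FORMAT = CsvFormat
    );
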
4. DRAG DROP

You need to implement versioned changes to the integration pipelines. The solution must meet the data integration requirements.

In which order should you perform the actions? To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.



5. CORRECT TEXT

The storage account container view is shown in the Refdata exhibit. (Click the Refdata tab.) You need to configure the Stream Analytics job to pick up the new reference data.

What should you configure? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



6. HOTSPOT

You need to implement an Azure Synapse Analytics database object for storing the sales transactions data. The solution must meet the sales transaction dataset requirements.

What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



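Because the sales transaction dataset requirements come from the case study and are not reproduced here, the sketch below only illustrates the usual shape of such an answer: a hash-distributed, date-partitioned fact table in a dedicated SQL pool. The table name, columns, and partition boundaries are hypothetical.

    CREATE TABLE dbo.FactSalesTransactions
    (
        TransactionID   BIGINT        NOT NULL,
        CustomerID      INT           NOT NULL,
        TransactionDate DATE          NOT NULL,
        Amount          DECIMAL(19,4) NOT NULL
    )
    WITH
    (
        DISTRIBUTION = HASH (TransactionID),   -- spread rows evenly across distributions
        CLUSTERED COLUMNSTORE INDEX,           -- typical choice for large fact tables
        PARTITION ( TransactionDate RANGE RIGHT FOR VALUES
                    ('2023-01-01', '2024-01-01', '2025-01-01') )
    );
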
7. You need to implement a Type 3 slowly changing dimension (SCD) for product category data in an Azure Synapse Analytics dedicated SQL pool.

You have a table that was created by using the following Transact-SQL statement.





Which two columns should you add to the table? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

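As background, a Type 3 SCD tracks changes by adding columns that hold the original (or prior) value alongside the current one, rather than adding new rows. A minimal T-SQL sketch, assuming a hypothetical dimension table and column names since the exhibit is not shown here:

    ALTER TABLE dbo.DimProductCategory
    ADD OriginalProductCategory NVARCHAR(100) NULL,   -- value as first loaded
        CurrentProductCategory  NVARCHAR(100) NULL;   -- value after the latest change
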
8. HOTSPOT

You develop a dataset named DBTBL1 by using Azure Databricks.

DBTBL1 contains the following columns:

✑ SensorTypeID

✑ GeographyRegionID

✑ Year

✑ Month

✑ Day

✑ Hour

✑ Minute

✑ Temperature

✑ WindSpeed

✑ Other

You need to store the data to support daily incremental load pipelines that vary for each GeographyRegionID. The solution must minimize storage costs.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



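The answer area expects Databricks write options; as a rough Spark SQL equivalent, the sketch below partitions the stored data by region and by date parts so that each daily, per-region increment lands in its own folder. The table name, storage location, and staging view are assumptions for illustration.

    CREATE TABLE IF NOT EXISTS dbtbl1
    USING PARQUET                                           -- columnar format keeps storage small
    PARTITIONED BY (GeographyRegionID, Year, Month, Day)    -- one folder per region per day
    LOCATION '/mnt/datalake/dbtbl1'
    AS
    SELECT SensorTypeID, GeographyRegionID, Year, Month, Day,
           Hour, Minute, Temperature, WindSpeed, Other
    FROM staging_dbtbl1;                                    -- hypothetical staging view
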
9. CORRECT TEXT

You are designing an Azure Data Lake Storage Gen2 container to store data for the human resources (HR) department and the operations department at your company.

You have the following data access requirements:

• After initial processing, the HR department data will be retained for seven years.

• The operations department data will be accessed frequently for the first six months, and then accessed once per month.

You need to design a data retention solution to meet the access requirements. The solution must minimize storage costs.



10. HOTSPOT

You need to implement an Azure Databricks cluster that automatically connects to Azure Data Lake Storage Gen2 by using Azure Active Directory (Azure AD) integration.

How should you configure the new cluster? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.




 
