Microsoft 70-463 Practice Questions 1

Each question is worth a total of 10 points. There are 60 questions in all.
For multiple-choice questions, the points are divided among the correct answers.

One limitation: for multiple-choice questions, selecting every answer appears to award full points, even though Microsoft would not allow that on the real exam. So you do have to use this tool wisely.

Microsoft also sells its own practice test.
You are administering SQL Server Integration Services (SSIS) permissions on a production server that runs SQL Server 2012.
Quality Assurance (QA) testers in the company must have permission to perform the following tasks on specific projects:
– View and validate projects and packages
– View Environments and Environment variables
– Execute packages
You need to grant the minimum possible privileges to the QA testers.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)
In the SSISDB database, add QA Tester logons to the ssis_admin role.
In the msdb database, add QA Tester logons to the db_ssisoperator role.
Grant Modify permission in the projects to the QA Tester logons.
Grant Read permission in the SSIS catalog folder, the projects, and the Environments to the QA Tester logons.
Grant Execute permission in the projects to the QA Tester logons.
In the msdb database, add QA Tester logons to the db_ssisltduser role.
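For reference, the catalog-level Read and Execute grants mentioned in the options can also be scripted through the SSISDB catalog; a minimal sketch using catalog.grant_permission (the object_id and principal_id values are illustrative and environment-specific):

```sql
-- Sketch: scripting a catalog permission grant in SSISDB.
-- object_id and principal_id are illustrative placeholders.
EXEC SSISDB.catalog.grant_permission
     @object_type     = 2,  -- 2 = project
     @object_id       = 1,
     @principal_type  = 2,  -- 2 = database user
     @principal_id    = 5,
     @permission_type = 1;  -- 1 = READ (3 = EXECUTE)
```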
You are designing a data warehouse that uses SQL Server 2012.
The data warehouse contains a table named factSales that stores product sales. The table has a clustered index on the primary key, four foreign keys to dimension tables, and an aggregate column for sales totals. All key columns use the int data type and the aggregate column uses the money data type.
You need to increase the speed of data retrieval from the factSales table.
Which index type should you add to the table?
Clustered
Semantic search
Nonclustered
XML
You are designing a complex SQL Server Integration Services (SSIS) project that uses the Project Deployment model.
The project will contain between 15 and 20 packages. All the packages must connect to the same data source and destination.
You need to define and reuse the connection managers in all the packages by using the least development effort.
What should you do?
Copy and paste the connection manager details into each package.
Implement project connection managers.
Implement package connection managers.
Implement parent package variables in all packages.
You are performance tuning a SQL Server Integration Services (SSIS) package to load sales data from a source system into a data warehouse that is hosted on Windows Azure SQL Database.
The package contains a data flow task that has seven source-to-destination execution trees.
Only three of the source-to-destination execution trees are running in parallel.
You need to ensure that all the execution trees run in parallel.
What should you do?
Set the EngineThreads property of the data flow task to 7.
Set the MaxConcurrentExecutables property of the package to 7.
Create seven data flow tasks that contain one source-to-destination execution tree each.
Place the data flow task in a For Loop container that is configured to execute seven times.
You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy. The package reads data from a source system that uses the SQL Server change data capture (CDC) feature.
You have added a CDC Source component to the data flow to read changed data from the source system.
You need to add a data flow transformation to redirect rows for separate processing of insert, update, and delete operations.
Which data flow transformation should you use?
DQS Cleansing
Merge Join
Pivot
Conditional Split
You are designing a data warehouse that uses SQL Server 2012.
You are preparing to update the contents of a fact table that contains a non-clustered columnstore index.
You need to run an update statement against the table.
What should you do first?
Pause the columnstore index.
Change the recovery model of the database to Bulk-logged.
Change the non-clustered columnstore index to be a clustered columnstore index.
Drop the columnstore index.
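As background for this scenario: in SQL Server 2012, a table with a nonclustered columnstore index is read-only, so the index must be dropped or disabled before any UPDATE. A minimal sketch of the disable/update/rebuild pattern (index, table, and column names are illustrative):

```sql
-- Disable (or drop) the nonclustered columnstore index before updating;
-- names here are illustrative, not from the question.
ALTER INDEX NCCI_factSales ON dbo.factSales DISABLE;

UPDATE dbo.factSales
SET    SalesTotal = SalesTotal * 1.10
WHERE  DateKey = 20120630;

ALTER INDEX NCCI_factSales ON dbo.factSales REBUILD;
```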
Reporting from a Star schema is simpler than reporting from a normalized online transactional processing (OLTP) schema.
What are the reasons for wanting simpler reporting? (Choose all that apply.)
A Star schema typically has fewer tables than a normalized schema. Therefore, queries are simpler because they require fewer joins.
A Star schema has better support for numeric data types than a normalized relational schema; therefore, it is easier to create aggregates.
There are specific Transact-SQL expressions that deal with Star schemas.
A Star schema is standardized and narrative; you can find the information you need for a report quickly.
If you want to switch content from a nonpartitioned table to a partition of a partitioned table, what conditions must the nonpartitioned table meet? (Choose all that apply.)
It must have the same constraints as the partitioned table.
It must have the same compression as the partitioned table.
It must be in a special PartitionedTables schema.
It must have a check constraint on the partitioning column that guarantees that all of the data goes to exactly one partition of the partitioned table.
It must have the same indexes as the partitioned table.
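A minimal sketch of such a switch, assuming a staging table that already matches the partitioned table's structure, indexes, and compression (all names, boundary values, and the partition number are illustrative):

```sql
-- The check constraint guarantees every row belongs to exactly one
-- target partition, which is a prerequisite for the switch.
ALTER TABLE dbo.factSales_Staging WITH CHECK
    ADD CONSTRAINT CK_Staging_DateKey
    CHECK (DateKey >= 20120101 AND DateKey < 20120201);

-- Switch the staging table's contents into partition 2 of the fact table.
ALTER TABLE dbo.factSales_Staging
    SWITCH TO dbo.factSales PARTITION 2;
```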
Which of the following T-SQL functions is not very useful for capturing lineage information?
APP_NAME()
USER_NAME()
DEVICE_STATUS()
SUSER_SNAME()
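For context, three of the four functions listed are real T-SQL system functions that are commonly used to capture lineage metadata:

```sql
-- APP_NAME(), USER_NAME(), and SUSER_SNAME() are built-in T-SQL functions;
-- there is no DEVICE_STATUS() function in T-SQL.
SELECT APP_NAME()    AS ExecutingApplication,
       USER_NAME()   AS DatabaseUserName,
       SUSER_SNAME() AS LoginName;
```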
Which statements best describe SQL Server Development Tools (SSDT)? (Choose all that apply.)
SSDT is an extension of the SQL Server Management Studio that can be used to create SSIS packages by means of a special wizard.
SSDT is a special edition of the SQL Server Management Studio, designed to provide an improved user experience to developers who are not particularly familiar with database administration.
SSDT is a special edition of Visual Studio, distributed with SQL Server 2012, providing a rich database development tool set.
SSDT is a new service in SQL Server 2012 that can be used to perform SQL Server maintenance tasks, such as data movements and similar data management processes.
You are developing a SQL Server Integration Services (SSIS) project with multiple packages to copy data to a Windows Azure SQL Database database.
An automated process must validate all related Environment references, parameter data types, package references, and referenced assemblies.
The automated process must run on a regular schedule.
You need to establish the automated validation process by using the least amount of administrative effort.
What should you do?
Use an event handler for OnError for the package.
Use an event handler for OnError for each data flow task.
Use an event handler for OnTaskFailed for the package.
View the job history for the SQL Server Agent job.
View the All Messages subsection of the All Executions report for the package.
Store the System::SourceID variable in the custom log table.
Store the System::ServerExecutionID variable in the custom log table.
Store the System::ExecutionInstanceGUID variable in the custom log table.
Enable the SSIS log provider for SQL Server for OnError in the package control flow.
Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
Deploy the project by using dtutil.exe with the /COPY DTS option.
Deploy the project by using dtutil.exe with the /COPY SQL option.
Deploy the .ispac file by using the Integration Services Deployment Wizard.
Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Create a table to store error information. Create an error output on each data flow destination that writes OnError event text to the table.
Create a table to store error information. Create an error output on each data flow destination that writes OnTaskFailed event text to the table.
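For reference, a SQL Server Agent job step can run project validation with a single call to the SSISDB catalog; the folder and project names below are illustrative:

```sql
DECLARE @validation_id BIGINT;

-- 'F' requests a full validation of the project and its packages.
EXEC SSISDB.catalog.validate_project
     @folder_name   = N'ETL',
     @project_name  = N'SalesLoad',
     @validate_type = 'F',
     @validation_id = @validation_id OUTPUT;
```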
You are developing a SQL Server Integration Services (SSIS) project by using the Project Deployment Model. All packages in the project must log custom messages.
You need to produce reports that combine the custom log messages with the system generated log messages.
What should you do?
Use an event handler for OnError for the package.
Use an event handler for OnError for each data flow task.
Use an event handler for OnTaskFailed for the package.
View the job history for the SQL Server Agent job.
View the All Messages subsection of the All Executions report for the package.
Store the System::SourceID variable in the custom log table.
Store the System::ServerExecutionID variable in the custom log table.
Store the System::ExecutionInstanceGUID variable in the custom log table.
Enable the SSIS log provider for SQL Server for OnError in the package control flow.
Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.
Deploy the project by using dtutil.exe with the /COPY DTS option.
Deploy the project by using dtutil.exe with the /COPY SQL option.
Deploy the .ispac file by using the Integration Services Deployment Wizard.
Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.
Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.
Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.
Create a table to store error information. Create an error output on each data flow destination that writes OnError event text to the table.
Create a table to store error information. Create an error output on each data flow destination that writes OnTaskFailed event text to the table.
You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy. The package reads data from a source system that uses the SQL Server change data capture (CDC) feature.
You have added a CDC Source component to the data flow to read changed data from the source system.
You need to add a data flow transformation to redirect rows for separate processing of insert, update, and delete operations.
Which data flow transformation should you use?
Audit
Merge Join
Merge
CDC Splitter
You need to extract data from delimited text files. What connection manager type would you choose?
A Flat File connection manager
An OLE DB connection manager
An ADO.NET connection manager
A File connection manager
Some of the data your company processes is sent in from partners via email. How would you configure an SMTP connection manager to extract files from email messages?
In the SMTP connection manager, configure the OperationMode setting to Send And Receive.
It is not possible to use the SMTP connection manager in this way, because it can only be used by SSIS to send email messages.
The SMTP connection manager supports sending and receiving email messages by default, so no additional configuration is necessary.
It is not possible to use the SMTP connection manager for this; use the IMAP (Internet Message Access Protocol) connection manager instead.
You need to extract data from a table in a SQL Server 2012 database. What connection manager types can you use? (Choose all that apply.)
An ODBC connection manager
An OLE DB connection manager
A File connection manager
An ADO.NET connection manager
In your SSIS solution, you need to load a large set of rows into the database as quickly as possible. The rows are stored in a delimited text file, and only one source column needs its data type converted from String (used by the source column) to Decimal (used by the destination column).
What control flow task would be most suitable for this operation?
The File System task would be perfect in this case, because it can read data from files and can be configured to handle data type conversions.
The Bulk Insert task would be the most appropriate, because it is the quickest and can handle data type conversions.
The data flow task would have to be used, because the data needs to be transformed before it can be loaded into the table.
No single control flow task can be used for this operation, because the data needs to be extracted from the source file, transformed, and then loaded into the destination table. At least three different tasks would have to be used: the Bulk Insert task to load the data into a staging database, a Data Conversion task to convert the data appropriately, and finally an Execute SQL task to merge the transformed data with existing destination data.
A part of your data consolidation process involves extracting data from Excel workbooks.
Occasionally, the data contains errors that cannot be corrected automatically.
How can you handle this problem by using SSIS?
Redirect the failed data flow task to an External Process task, open the problematic Excel file in Excel, and prompt the user to correct the file before continuing the data consolidation process.
Redirect the failed data flow task to a File System task that moves the erroneous file to a dedicated location where an information worker can correct it later.
If the error cannot be corrected automatically, there is no way for SSIS to continue with the automated data consolidation process.
None of the answers above are correct. Due to Excel’s strict data validation rules, an Excel file cannot ever contain erroneous data.
In your ETL process, there are three external processes that need to be executed in sequence, but you do not want to stop execution if any of them fails. Can this be achieved by using precedence constraints? If so, which precedence constraints can be used?
(I believe the answer to this one is incorrect; however, it was the chosen answer.)
No, this cannot be achieved just by using precedence constraints.
Yes, this can be achieved by using completion precedence constraints between the first and the second and between the second and the third Execute Process tasks, and by using a success precedence constraint between the third Execute Process task and the following task.
Yes, this can be achieved by using completion precedence constraints between the first and the second, between the second and the third, and between the third Execute Process task and the following task.
Yes, this can be achieved by using failure precedence constraints between the first and the second, and between the second and the third Execute Process tasks, and by using a completion precedence constraint between the third Execute Process task and the following task.
You are designing a data warehouse that contains a customer dimension. The customer dimension contains the original customer attributes as well as the most recent set of updated attributes. When a customer is updated, the original customer attributes must be preserved.
The customer dimension design must:
– Support a Type 3 Slowly Changing Dimension (SCD) process
– Minimize the total storage consumed
You need to design a solution that meets these requirements.
What should you do?
Design two tables. One table will hold the original customer surrogate key and attributes. The other table will hold the most recent customer surrogate key and attributes.
Design a table as a Type 2 SCD dimension that holds the surrogate key for each customer and its attributes. Limit the number of records per customer within the table to two.
Design a table to hold the surrogate key for each customer and its attributes. Add a column to store an indicator that identifies the record as original data or recent data.
Design a table to hold the surrogate key for each customer and its attributes as well as additional columns to store the original attributes.
You are designing a data warehouse hosted on Windows Azure SQL Database. The data warehouse currently includes the dimUser and dimRegion dimension tables and the factSales fact table. The dimUser table contains records for each user permitted to run reports against the warehouse, and the dimRegion table contains information about sales regions. The system is accessed by users from certain regions, as well as by area supervisors and users from the corporate headquarters. You need to design a table structure to ensure that certain users can see sales data for only certain regions. Some users must be permitted to see sales data from multiple regions. What should you do?
For each region, create a view of the factSales table that includes a WHERE clause for the region.
Create a userRegion table that contains primary key columns from the dimUser and dimRegion tables.
Add a region column to the dimUser table.
Partition the factSales table on the region column.
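One of the options describes a classic many-to-many bridge table between users and regions; a minimal sketch, assuming surrogate key columns named UserKey and RegionKey (the actual key column names are not given in the question):

```sql
-- Hypothetical bridge table; one row per permitted (user, region) pair.
CREATE TABLE dbo.userRegion (
    UserKey   INT NOT NULL REFERENCES dbo.dimUser (UserKey),
    RegionKey INT NOT NULL REFERENCES dbo.dimRegion (RegionKey),
    CONSTRAINT PK_userRegion PRIMARY KEY CLUSTERED (UserKey, RegionKey)
);
```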
You are developing a SQL Server Integration Services (SSIS) package. The package uses a data flow task to source data from a SQL Server database for loading into a dimension table in a data warehouse.
You need to create a separate data flow path for data that has been modified since it was last processed.
Which data flow components should you use to identify modified data?
(Each correct answer presents a complete solution. Choose all that apply.)
Multicast
Data Conversion
Lookup
Slowly Changing Dimension
Aggregate
You are designing a SQL Server Integration Services (SSIS) data flow to load sales transactions from a source system into a data warehouse hosted on Windows Azure SQL Database. One of the columns in the data source is named ProductCode. Some of the data to be loaded will reference products that need special processing logic in the data flow.
You need to enable separate processing streams for a subset of rows based on the source product code.
Which data flow transformation should you use?
Source Assistant
Destination Assistant
Audit
Script Component
You are administering SQL Server Integration Services (SSIS) permissions on a production server that runs SQL Server 2012.
Package developers in your company must have permission to perform the following tasks only on their own projects:
– View projects and packages
– View Environments
– Validate packages
– Execute packages
You need to grant rights to the developers without assigning unnecessary privileges.
What should you do? (Each correct answer presents part of a solution. Choose all that apply.)
Add developer logins to the db_ssisltduser role in the msdb database.
Add developer logins to the db_ssisoperator role in the msdb database.
Grant Execute permission in the projects for the developer logins.
Grant Read permission in the SSIS catalog folder, the projects, and the Environments.
Add developer logins to the ssis_admin role in the SSISDB database.
Grant Modify permission in the projects for the developer logins.
To support the implementation of new reports, Active Directory data will be downloaded to a SQL Server database by using a SQL Server Integration Services (SSIS) 2012 package.
The following requirements must be met:
– All the user information for a given Active Directory group must be downloaded to a SQL Server table.
– The download process must traverse the Active Directory hierarchy recursively.
You need to configure the package to meet the requirements by using the least development effort.
Which item should you use?
Script task
Script component configured as a transformation
Script component configured as a source
Script component configured as a destination
You are designing a SQL Server Integration Services (SSIS) 2012 package that imports data from a Windows Azure SQL Database database into a SQL Server database.
The SSIS package has the following requirements:
– Every night, a very large amount of data is imported into the staging database.
– Package processing time must be minimized.
– The package must run on its own dedicated server when it is deployed to production.
– Transaction log activity when data is imported must be minimized.
You need to design the package to meet the requirements.
Which destination component should you use?
Raw File
ODBC
Bulk Insert
OLE DB
You are implementing a SQL Server Integration Services (SSIS) 2012 package that loads data from various flat files and a Windows Azure SQL Database database. Daily transactions must be loaded into a staging database. All the SSIS tasks will use the CurrentDate variable as the transaction date.
You need to set the CurrentDate variable to the date stored in a control table of the Windows Azure SQL Database database when the package starts.
You need to achieve this goal by using the least amount of development effort.
What should you use to set the variable?
an Expression task
an Execute SQL task
a Script component
a Script task
You are preparing to install SQL Server 2012 Master Data Services (MDS).
You need to ensure that the database requirements are met.
What should you install?
Microsoft SharePoint Server 2010 Standard Edition SP1
Microsoft SharePoint Server 2010 Enterprise Edition SP1
SQL Server 2012 Data Center (64-bit) x64 on the database server
SQL Server 2012 Enterprise (64-bit) x64 on the database server
You are implementing a SQL Server Integration Services (SSIS) package that imports Microsoft Excel workbook data into a Windows Azure SQL Database database. The package has been deployed to a production server that runs Windows Server 2008 R2 and SQL Server 2012.
The package fails when executed on the production server.
You need to ensure that the package can load the Excel workbook data without errors.
You need to use the least amount of administrative effort to achieve this goal.
What should you do?
Create a custom SSIS source component that encapsulates the 32-bit driver and compile it in 64-bit mode.
Install a 64-bit ACE driver and execute the package by using the 64-bit run-time option.
Execute the package by using the 32-bit run-time option.
Replace the SSIS Excel source with a SSIS Flat File source.
You are installing the Data Quality Server component of Data Quality Services.
You need to provision the hardware and install the software for the server that runs the Data Quality Server.
You must ensure that the minimum Data Quality Server prerequisites are met.
What should you do?
Install Microsoft Internet Explorer 6.0 SP1 or later.
Install SQL Server 2012 Database Engine.
Make sure the server has at least 4 GB of RAM.
Install Microsoft SharePoint Server 2010 Standard Edition.
You develop a SQL Server Integration Services (SSIS) package that imports Windows Azure SQL Database data into a data warehouse every night.
The Windows Azure SQL Database data contains many misspellings and variations of abbreviations. To import the data, a developer used the Fuzzy Lookup transformation to choose the closest-matching string from a reference table of allowed values.
The number of rows in the reference table is very large.
If no acceptable match is found, the Fuzzy Lookup transformation passes a null value.
The current setting for the Fuzzy Lookup similarity threshold is 0.50.
Many values are incorrectly matched.
You need to ensure that more accurate matches are made by the Fuzzy Lookup transformation without degrading performance.
What should you do?
Change the similarity threshold to 0.40.
Decrease the maximum number of matches per lookup.
Change the similarity threshold to 0.85.
Increase the maximum number of matches per lookup.
You maintain a SQL Server Integration Services (SSIS) package. The package was developed by using SQL Server 2008 Business Intelligence Development Studio (BIDS). The package includes custom scripts that must be upgraded.
You need to upgrade the package to SQL Server 2012.
Which tool should you use?
SQL Server Integration Services Deployment Wizard
SQL Server Configuration Manager
SSIS Upgrade Wizard in SQL Server Management Studio
SSIS Upgrade Wizard in SQL Server 2008 BIDS
You are preparing to install SQL Server 2012 Master Data Services (MDS).
You need to ensure that the database requirements are met.
What should you install?
Microsoft SharePoint Foundation 2010 SP1
SQL Server 2012 Enterprise (64-bit) x64 on the database server
SQL Server 2012 Data Center (64-bit) x64 on the database server
SQL Server 2008 Enterprise (64-bit) x64 on the database server
You are installing the Data Quality Server component of Data Quality Services.
You need to provision the hardware and install the software for the server that runs the Data Quality Server.
You must ensure that the minimum Data Quality Server prerequisites are met.
What should you do?
Install SQL Server 2012 Database Engine.
Install Microsoft SharePoint Server 2010 Enterprise Edition with PowerPivot.
Make sure the server has at least 4 GB of RAM.
Install Microsoft Internet Explorer 6.0 SP1 or later.
You are implementing a SQL Server Integration Services (SSIS) package that imports Microsoft Excel workbook data into a Windows Azure SQL Database database. The package has been deployed to a production server that runs Windows Server 2008 R2 and SQL Server 2012.
The package fails when executed on the production server.
You need to ensure that the package can load the Excel workbook data without errors.
You need to use the least amount of administrative effort to achieve this goal.
What should you do?
Install a 64-bit ACE driver and execute the package by using the 64-bit run-time option.
Enable Address Windowing Extensions (AWE) for the local SQL Server instance.
Replace the SSIS Excel source with a SSIS Flat File source.
Install a 64-bit ACE driver and replace the Excel source with an OLE DB source.
You develop a SQL Server Integration Services (SSIS) project by using the Package Deployment Model. A package in the project extracts data from a Windows Azure SQL Database database. The package is deployed to SQL Server.
The package is not producing the desired results.
You need to generate the .mdmp and .tmp debug files in order to troubleshoot the issues.
What should you do?
Execute the catalog.add_data_tap stored procedure with the package execution_id.
Execute the catalog.create_execution_dump stored procedure with the package execution_id.
Run the DTEXEC utility with the /DumpOnError option.
Run the DTEXEC utility with the /Reporting V option.
You are developing a SQL Server Integration Services (SSIS) project that contains a project Connection Manager and multiple packages.
All packages in the project must connect to the same database. The server name for the database must be set by using a parameter named ParamConnection when any package in the project is executed.
You need to develop this project with the least amount of development effort.
What should you do? (Each answer presents a part of the solution. Choose all that apply.)
Create a package parameter named ConnectionName in each package.
Edit each package Connection Manager. Set the ConnectionName property to @[$Project::ParamConnection].
Edit the project Connection Manager in Solution Explorer. Set the ConnectionName property to @[$Project::ParamConnection].
Set the Sensitive property of the parameter to True.
Create a project parameter named ConnectionName.
Set the Required property of the parameter to True.
You are developing a SQL Server Integration Services (SSIS) package. The package sources data from an HTML web page that lists product stock levels.
You need to implement a data flow task that reads the product stock levels from the HTML web page.
Which data flow source should you use?
Raw File source
XML source
Custom source component
Flat File source
You are deploying a new SQL Server Integration Services (SSIS) package to five servers.
The package must meet the following requirements:
– .NET Common Language Runtime (CLR) integration in SQL Server must not be enabled.
– The Connection Managers used in the package must be configurable without editing and redeploying the package.
– The deployment procedure must be automated as much as possible.
– Performance must be maximized.
You need to set up a deployment strategy that meets the requirements.
What should you do?
Add an OnError event handler to the SSIS project.
Use an msi file to deploy the package on the server.
Open a command prompt and run the gacutil command.
Open a command prompt and run the dtutil /copy command.
Open a command prompt and run the dtexec /rep /conn command.
Open a command prompt and run the dtexec /dumperror /conn command.
Open a command prompt and execute the package by using the SQL Log provider and running the dtexecui.exe utility.
Create a reusable custom logging component and use it in the SSIS project.
Configure the SSIS solution to use the Project Deployment Model.
Configure the output of a component in the package data flow to use a data tap.
Run the dtutil command to deploy the package to the SSIS catalog and store the configuration in SQL Server.
You are designing a SQL Server Integration Services (SSIS) solution that will load multiple Online Transactional Processing (OLTP) data sources into a SQL Server data mart.
You have the following requirements:
– Ensure that the process supports the creation of an exception report that details possible duplicate key values, null ratios within columns, and column-length distributions of values.
– Ensure that users can generate the exception report in an XML format.
– Use the minimum development effort.
You need to design the SSIS solution to meet the requirements.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Use a Data Profiling task. Use a Data Flow task to extract the XML output of the Data Profiling task into a SQL Server table. Query the table to view the exceptions.
Use Data Flow tasks to process the clean data.
Use a Data Profiling task. Read the exceptions in Data Profile Viewer.
Design a stored procedure that examines data for common dirty data patterns. Use an Execute SQL task.
You are designing a SQL Server Integration Services (SSIS) solution.
The solution will contain an SSIS project that includes several SSIS packages.
Each SSIS package will define the same connection managers and variables.
You have the following requirements:
– The deployment model must support changing the content of connection strings by using parameters at execution time.
– The deployment model must automatically log events to the SSISDB database.
– Maximize performance at execution time.
You need to design a solution that meets the requirements.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Use a project deployment model. Modify connection manager properties to use project parameters.
Use a package deployment model. Save each SSIS package to a file share that can be accessed from all environments.
Use a package deployment model. Configure connections in an XML configuration file referenced by an environment variable that corresponds to the SQL Server environment of each SSIS package.
Use a project deployment model. Modify connection manager properties to use package parameters.
You are designing a partitioning strategy for a large fact table in a data warehouse.
Tens of millions of new records are loaded into the data warehouse weekly, outside of business hours. Most queries are generated by reports and by cube processing.
Data is frequently queried at the day level and occasionally at the month level.
You need to partition the table to maximize the performance of queries.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Partition the fact table by month, and compress each partition.
Partition the fact table by week.
Partition the fact table by year.
Partition the fact table by day, and compress each partition.
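As background on the mechanics involved, daily partitioning with per-partition compression can be sketched as below. This is an illustrative fragment only; the partition function, scheme, table, and column names (pfDaily, psDaily, FactSales, SaleDate) are assumptions, not taken from the question.

```sql
-- Hypothetical sketch: daily RANGE RIGHT partitioning with page compression.
CREATE PARTITION FUNCTION pfDaily (date)
AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-01-02', '2012-01-03');

CREATE PARTITION SCHEME psDaily
AS PARTITION pfDaily ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    SaleDate date  NOT NULL,
    Amount   money NOT NULL
) ON psDaily (SaleDate);

-- Compress an individual partition after it is loaded.
ALTER TABLE dbo.FactSales
REBUILD PARTITION = 2
WITH (DATA_COMPRESSION = PAGE);
```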
You are a database developer of a Microsoft SQL Server 2012 database.
You are designing a table that will store Customer data from different sources. The table will include a column that contains the CustomerID from the source system and a column that contains the SourceID. A sample of this data is as shown in the following table.
You need to ensure that the table has no duplicate CustomerID within a SourceID.
You also need to ensure that the data in the table is in the order of SourceID and then CustomerID.
Which Transact-SQL statement should you use?
CREATE TABLE Customer (SourceID int NOT NULL IDENTITY, CustomerID int NOT NULL IDENTITY, CustomerName varchar(255) NOT NULL);
CREATE TABLE Customer (SourceID int NOT NULL, CustomerID int NOT NULL PRIMARY KEY CLUSTERED, CustomerName varchar(255) NOT NULL);
CREATE TABLE Customer (SourceID int NOT NULL PRIMARY KEY CLUSTERED, CustomerID int NOT NULL UNIQUE, CustomerName varchar(255) NOT NULL);
CREATE TABLE Customer (SourceID int NOT NULL, CustomerID int NOT NULL, CustomerName varchar(255) NOT NULL, CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (SourceID, CustomerID));
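For general background on the behavior these options rely on: a composite clustered primary key enforces uniqueness across the column pair and stores rows in key order. A minimal sketch (table and sample values are illustrative):

```sql
-- Sketch: a composite clustered primary key makes (SourceID, CustomerID)
-- unique as a pair and orders rows by SourceID, then CustomerID.
CREATE TABLE dbo.CustomerDemo (
    SourceID     int NOT NULL,
    CustomerID   int NOT NULL,
    CustomerName varchar(255) NOT NULL,
    CONSTRAINT PK_CustomerDemo PRIMARY KEY CLUSTERED (SourceID, CustomerID)
);

-- Succeeds: the same CustomerID may repeat under a different SourceID.
INSERT dbo.CustomerDemo VALUES (1, 100, 'Contoso'), (2, 100, 'Fabrikam');

-- Fails with a primary key violation: duplicate CustomerID within SourceID 1.
-- INSERT dbo.CustomerDemo VALUES (1, 100, 'Duplicate');
```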
You are designing a SQL Server Integration Services (SSIS) solution.
The solution will contain an SSIS project that includes several SSIS packages.
Each SSIS package will define the same connection managers and variables.
You have the following requirements:
– Ensure that the deployment model supports changing the content of connection strings by using parameters at execution time.
– Ensure that the deployment model automatically starts from calls to the catalog.start_execution stored procedure in the SSISDB database.
– Maximize performance at execution time.
– Minimize development effort.
You need to design a solution that meets the requirements.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Use a project deployment model. Modify connection manager properties to use project parameters. Ensure that the SSISDB database is created.
Use a project deployment model. Configure connections in an XML configuration file referenced by an environment variable that corresponds to the SQL Server environment of each SSIS package.
Use a package deployment model. Use a SQL Server package configuration with a common filter. Change the contents of the SSIS Configurations table at runtime.
Use a package deployment model. Save each SSIS package to a file share that can be accessed from all environments.
A SQL Server Integration Services (SSIS) package imports daily transactions from several files into a SQL Server table named Transaction.
Each file corresponds to a different store and is imported in parallel with the other files.
The data flow tasks use OLE DB destinations in fast load data access mode.
The number of daily transactions per store can be very large and is growing. The Transaction table does not have any indexes.
You need to minimize the package execution time.
What should you do?
Partition the table by day and store.
Create a clustered index on the Transaction table.
Run the package in Performance mode.
Increase the value of the Rows per Batch property.
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You have created a SQL Server Integration Services (SSIS) project that you are preparing to deploy using the Integration Services Deployment Wizard.
Which of the following is the third step in the Integration Services Deployment Wizard’s process?
Review
Select Destination
Deploy/Results
Select Source
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You have received reports regarding a current existing SQL Server Integration Services (SSIS) package not completing its execution, and not transferring data.
You have decided to make use of the Package Deployment Model.
Which of the following is TRUE with regards to using the Package Deployment Model?
CLR integration is NOT required on the database engine.
Environment-specific parameter values are stored in environment variables.
Events that are created by the package are captured automatically and saved to the catalog during execution.
Events that are created by the package are NOT captured automatically during execution.
You work as a senior database administrator at Lead2pass.com. The Lead2pass.com network consists of a single
domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are running a training exercise for Microsoft SQL Server 2012 junior administrators.
You are discussing the use of the add_data_tap stored procedure.
Which of the following is TRUE with regards to using add_data_tap stored procedure? (Choose all that apply.)
Limiting the number of rows that are captured during the data tap is not advisable.
The add_data_tap stored procedure should only be used to troubleshoot data issues.
The add data tap process is validated prior to the package being executed.
You must be an administrator on the machine on which the stored procedure is run to open the file that contains the tapped data.
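To make the discussion concrete, a data tap is added to a created (but not yet started) execution via the SSISDB catalog. A hedged sketch follows; the folder, project, package, path, and file names are illustrative assumptions.

```sql
-- Sketch: add a data tap between create_execution and start_execution.
-- All identifier values below are illustrative, not from the question.
DECLARE @execution_id bigint;

EXEC [SSISDB].[catalog].[create_execution]
    @folder_name  = N'ETL',
    @project_name = N'Sales',
    @package_name = N'LoadSales.dtsx',
    @execution_id = @execution_id OUTPUT;

EXEC [SSISDB].[catalog].[add_data_tap]
    @execution_id            = @execution_id,
    @task_package_path       = N'\Package\Data Flow Task',
    @dataflow_path_id_string = N'Paths[Source.Output]',
    @data_filename           = N'TappedRows.csv',
    @max_rows                = 1000;  -- optional cap on captured rows

EXEC [SSISDB].[catalog].[start_execution] @execution_id;
```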
You work as a senior database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are running a training exercise for Microsoft SQL Server 2012 junior administrators.
You are discussing the use of Slowly Changing Dimension Transformation Outputs. One of the output options is used to update the record in the lookup table.
Which of the following is the output option?
Unchanged Output
Inferred Member Updates Output
Historical Attributes Inserts Output
Fixed Attribute Output
Changing Attributes Updates Output
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are in the process of creating a SQL Server Integration Services (SSIS) package.
The package will be configured to upload a file to a table in a SQL Azure database.
You need to make sure that the code includes an event handler that an executable raises when its verification is completed.
Which of the following is the event handler that you should use?
The OnPostExecute event handler.
The OnTaskFailed event handler.
The OnPreExecute event handler.
The OnPostValidate event handler.
You work as a senior database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are running a training exercise for Microsoft SQL Server 2012 junior administrators.
You are discussing replication type that is used when subscribers need to receive data, make changes offline, and later synchronize changes with the Publisher and other Subscribers.
Which of the following is the replication type being discussed?
Snapshot replication.
Transactional replication.
Merge replication.
Peer-to-peer replication
You work as a senior database developer at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are running a training exercise for Microsoft SQL Server 2012 junior developers.
You are discussing the SortKeyPosition property for a data flow.
Which of the following represents the data type for this property?
Integer
IDTSExternalMetadataColumn100
String
Boolean
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You have received reports regarding a current existing SQL Server Integration Services (SSIS) package not completing its execution, and not transferring data.
You have decided to make use of the Project Deployment Model.
Which of the following is TRUE with regards to using the Project Deployment Model?
CLR integration is required on the database engine.
CLR integration is not required on the database engine.
Events that are created by the package are captured automatically and saved to the catalog during execution.
Packages are executed using the dtexec and DTExecUI execution utilities.
You are designing an extract, transform, load (ETL) process for loading data from a SQL Server database into a large fact table in a data warehouse each day with the prior day’s sales data.
The ETL process for the fact table must meet the following requirements:
– Load new data in the shortest possible time.
– Remove data that is more than 36 months old.
– Ensure that data loads correctly.
– Minimize record locking.
– Minimize impact on the transaction log.
You need to design an ETL process that meets the requirements.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Partition the destination fact table by date. Insert new data directly into the fact table and delete old data directly from the fact table.
Partition the destination fact table by date. Use partition switching and staging tables both to remove old data and to load new data.
Partition the destination fact table by customer. Use partition switching both to remove old data and to load new data into each partition.
Partition the destination fact table by date. Use partition switching and a staging table to remove old data. Insert new data directly into the fact table.
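For context on why switching is attractive here: a partition switch is a metadata-only operation, so it avoids row locking and heavy transaction-log activity. A minimal sketch, assuming aligned staging tables and illustrative names (StageSales, StageOldSales, partition numbers):

```sql
-- Sketch: load new rows into a staging table with the same structure and
-- constraints, then switch the staging table into the target partition.
ALTER TABLE dbo.StageSales
SWITCH TO dbo.FactSales PARTITION 37;

-- Remove the oldest data by switching the partition out to an empty
-- staging table, then truncating it (minimally logged, no row locks).
ALTER TABLE dbo.FactSales
SWITCH PARTITION 1 TO dbo.StageOldSales;

TRUNCATE TABLE dbo.StageOldSales;
```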
You are designing a fact table in a SQL Server database.
The fact table must meet the following requirements:
– Include a columnstore index.
– Allow users to choose up to 10 dimension tables and up to five facts at one time.
– Maximize performance of queries that aggregate measures by using any of the 10 dimensions.
– Support billions of rows.
– Use the most efficient design strategy.
You need to design the fact table to meet the requirements.
What should you do?
(More than one answer choice may achieve the goal. Select the BEST answer.)
Design a fact table with 5 dimensional key columns and 10 measure columns. Place the columnstore index on the dimensional key columns.
Design a fact table with 5 dimensional key columns and 10 measure columns. Place the columnstore index on the measure columns.
Design a fact table with 10 dimensional key columns and 5 measure columns. Place the columnstore index on the dimensional key columns and the measure columns.
Design a fact table with 10 dimensional key columns and 5 measure columns. Place the columnstore index on only the measure columns.
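As background on the index type these options discuss: in SQL Server 2012 a columnstore index is nonclustered and makes the table read-only while it exists. A hedged sketch with illustrative column names (the 10 keys and 5 measures below are assumptions):

```sql
-- Sketch: nonclustered columnstore index over both the dimensional key
-- columns and the measure columns of a wide fact table.
CREATE NONCLUSTERED COLUMNSTORE INDEX csx_FactSales
ON dbo.FactSales
(
    DateKey, ProductKey, StoreKey, CustomerKey, PromotionKey,
    CurrencyKey, ChannelKey, GeographyKey, EmployeeKey, ScenarioKey,
    SalesAmount, Quantity, DiscountAmount, TaxAmount, TotalCost
);
```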
You work as a senior database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are hosting a training exercise for Microsoft SQL Server 2012 administrators. You are currently discussing Multicast transformation.
Which of the following is TRUE with regards to Multicast transformation? (Choose two.)
It prevents a package from creating logical copies of data.
It allows a package to create logical copies of data.
It distributes its input to one or more outputs.
It distributes its input to a single output.
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are in the process of creating a SQL Server Integration Services (SSIS) package. The package will be configured to make use of the Generate And Persist New Index Fuzzy Lookup transformation option.
Which of the following is the purpose of this option?
It allows for a current index to be reused.
It allows for a new index to be created, without saving it.
It allows for a new index to be created and saved, without maintaining it.
It allows for a new index to be created, saved, and maintained.
You work as a database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are in the process of creating a SQL Server Integration Services (SSIS) package. You have decided to make use of Expression Task.
Which of the following is TRUE with regards to Expression Task?
It creates and evaluates expressions that set variable values at runtime.
It provides code to perform functions that are not available in the built-in tasks and transformations that SQL Server Integration Services provides.
It runs an application or batch file as part of a SQL Server Integration Services package workflow.
It allows you to run SQL statements during package execution.
You work as a senior database administrator at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You are running a training exercise for Microsoft SQL Server 2012 junior administrators. You are discussing data quality projects.
Which of the following are benefits of data quality projects? (Choose all that apply.)
It enables you to perform data cleansing on your source data by using the knowledge in a DQS knowledge base.
It allows you to perform data matching on your source data by using the knowledge in a DQS knowledge base.
It allows you to perform data matching on your source data by using the matching policy in a knowledge base.
It enables you to perform data cleansing on your source data by using the matching policy in a knowledge base.
You work as a database developer at Lead2pass.com. The Lead2pass.com network consists of a single domain named Lead2pass.com. Lead2pass.com makes use of Microsoft SQL Server 2012 in their environment.
You have been asked to revise a SQL Server Integration Services (SSIS) package.
The package only includes Execute SQL tasks.
You need to make sure that, in the event of an Execute SQL task failing, the changes made by all of the Execute SQL tasks are rolled back.
To do this, you want to alter the TransactionOption property of the package.
Which of the following is the value that must be set for the TransactionOption property of the package?
NotSupported
Supported
NotRequired
Required