Question 1
Reporting from a Star schema is simpler than reporting from a normalized online transactional processing (OLTP) schema. What are the reasons for this simpler reporting? (Choose all that apply.)
Answer
-
A Star schema typically has fewer tables than a normalized schema. Therefore,
queries are simpler because they require fewer joins.
-
Star schema has better support for numeric data types than a normalized relational schema; therefore, it is easier to create aggregates.
-
There are specific Transact-SQL expressions that deal with Star schemas.
-
A Star schema is standardized and narrative; you can find the information you
need for a report quickly.
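The first answer describes the practical effect. A minimal sketch of such a report query, assuming hypothetical dbo.FactSales, dbo.DimDate, and dbo.DimProduct tables: two joins and one aggregate already produce a yearly sales report.

    SELECT d.CalendarYear,
           p.ProductCategory,
           SUM(f.SalesAmount) AS SalesAmount
    FROM dbo.FactSales AS f
        INNER JOIN dbo.DimDate AS d ON d.DateKey = f.OrderDateKey
        INNER JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
    GROUP BY d.CalendarYear, p.ProductCategory
    ORDER BY d.CalendarYear, p.ProductCategory;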
Question 2
You are creating a quick POC project. Which schema is the most suitable for this kind
of a project?
Answer
-
Star schema
-
Normalized schema
-
Snowflake schema
-
XML schema
Question 3
A Star schema has two types of tables. What are those two types? (Choose all that
apply.)
Answer
-
Lookup tables
-
Dimensions
-
Measures
-
Fact tables
Question 4
You implement a Type 2 solution for an SCD problem for a specific column. What do
you actually do when you get a changed value for the column from the source system?
Answer
-
Add a column for the previous value to the table. Move the current value of the
updated column to the new column. Update the current value with the new value
from the source system.
-
Insert a new row for the same dimension member with the new value for the
updated column. Use a surrogate key, because the business key is now duplicated.
Add a flag that denotes which row is current for a member.
-
Do nothing, because in a DW, you maintain history; you do not update dimension data.
-
Update the value of the column just as it was updated in the source system.
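For the second answer, a minimal T-SQL sketch of a Type 2 change, assuming a hypothetical dbo.DimCustomer table whose CustomerKey surrogate key is an IDENTITY column, with a CustomerId business key, an IsCurrent flag, and ValidFrom/ValidTo columns:

    -- Close the currently valid row for the member.
    UPDATE dbo.DimCustomer
    SET IsCurrent = 0,
        ValidTo = SYSDATETIME()
    WHERE CustomerId = @CustomerId
      AND IsCurrent = 1;

    -- Insert a new version of the member carrying the changed value.
    INSERT INTO dbo.DimCustomer (CustomerId, City, ValidFrom, ValidTo, IsCurrent)
    VALUES (@CustomerId, @NewCity, SYSDATETIME(), NULL, 1);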
Question 5
Which kind of a column is not a part of a dimension?
Answer
-
Attribute
-
Measure
-
Key
-
Member property
-
Name
Question 6
How can you spot natural hierarchies in a Snowflake schema?
Answer
-
You need to analyze the content of the attributes of each dimension.
-
Lookup tables for each dimension provide natural hierarchies.
-
A Snowflake schema does not support hierarchies.
-
You should convert the Snowflake schema to the Star schema, and then you would
spot the natural hierarchies immediately.
Question 7
Over which dimension can you not use the SUM aggregate function for semi-additive
measures?
Answer
-
Customer
-
Product
-
Date
-
Employee
Question 8
Which measures would you expect to be non-additive? (Choose all that apply.)
Answer
-
Price
-
Debit
-
SalesAmount
-
DiscountPct
-
UnitBalance
Question 9
Which kind of a column is not part of a fact table?
Answer
-
Lineage
-
Measure
-
Key
-
Member property
Question 10
Which database objects and object properties can you use for autonumbering?
(Choose all that apply.)
Answer
-
IDENTITY property
-
SEQUENCE object
-
PRIMARY KEY constraint
-
CHECK constraint
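A minimal sketch of the two autonumbering options, using hypothetical table and sequence names:

    -- Option 1: IDENTITY property on the surrogate key column.
    CREATE TABLE dbo.DimProduct
    (
        ProductKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ProductName NVARCHAR(50) NOT NULL
    );

    -- Option 2: SEQUENCE object, which can be shared across tables.
    CREATE SEQUENCE dbo.SeqCustomerKey AS INT START WITH 1 INCREMENT BY 1;

    INSERT INTO dbo.DimCustomer (CustomerKey, CustomerName)
    VALUES (NEXT VALUE FOR dbo.SeqCustomerKey, N'Tailspin Toys');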
Question 11
What columns do you add to a table to support Type 2 SCD changes? (Choose all that
apply.)
Answer
-
Member properties
-
Current row flag
-
Lineage columns
-
Surrogate key
Question 12
What is an inferred member?
Answer
-
A row in a fact table added during dimension load
-
A row with aggregated values
-
A row in a dimension added during fact table load
-
A computed column in a fact table
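The inferred member pattern can be sketched in T-SQL, assuming hypothetical stg.FactSales and dbo.DimCustomer tables: while loading the fact table, a placeholder dimension row is inserted for every business key that does not exist in the dimension yet.

    INSERT INTO dbo.DimCustomer (CustomerId, CustomerName, IsInferred)
    SELECT DISTINCT s.CustomerId, N'Unknown', 1
    FROM stg.FactSales AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.DimCustomer AS d
                      WHERE d.CustomerId = s.CustomerId);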
Question 13
Which types of data compression are supported by SQL Server? (Choose all that apply.)
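For tables and indexes, SQL Server offers row and page data compression. A minimal sketch against a hypothetical fact table:

    -- Row compression.
    ALTER TABLE dbo.FactSales REBUILD WITH (DATA_COMPRESSION = ROW);

    -- Page compression (row compression plus prefix and dictionary compression).
    ALTER TABLE dbo.FactSales REBUILD WITH (DATA_COMPRESSION = PAGE);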
Question 14
Which operators can benefit from batch processing? (Choose all that apply.)
Answer
-
Hash Join
-
Merge Join
-
Scan
-
Nested Loops Join
-
Filter
Question 15
Why would you use indexed views? (Choose all that apply.)
Answer
-
To speed up queries that aggregate data
-
To speed up data load
-
To speed up selective queries
-
To speed up queries that involve multiple joins
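A minimal sketch of an indexed view that pre-aggregates a hypothetical fact table, assuming SalesAmount is not nullable; the unique clustered index materializes the view, which is what speeds up aggregate and multi-join queries.

    CREATE VIEW dbo.SalesByDate
    WITH SCHEMABINDING
    AS
    SELECT f.OrderDateKey,
           SUM(f.SalesAmount) AS SalesAmount,
           COUNT_BIG(*) AS NumberOfRows  -- required when the view uses GROUP BY
    FROM dbo.FactSales AS f
    GROUP BY f.OrderDateKey;
    GO
    CREATE UNIQUE CLUSTERED INDEX IX_SalesByDate ON dbo.SalesByDate (OrderDateKey);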
Question 16
The database object that maps partitions of a table to filegroups is called a(n)
Answer
-
Aligned index
-
Partition function
-
Partition column
-
Partition scheme
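To make the distinction concrete, a minimal sketch with hypothetical names: the partition function defines the boundary values, and the partition scheme maps the resulting partitions to filegroups.

    CREATE PARTITION FUNCTION pfOrderDate (DATE)
    AS RANGE RIGHT FOR VALUES ('20120101', '20130101');

    CREATE PARTITION SCHEME psOrderDate
    AS PARTITION pfOrderDate TO (FG2011, FG2012, FG2013);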
Question 17
If you want to switch content from a nonpartitioned table to a partition of a partitioned
table, what conditions must the nonpartitioned table meet? (Choose all that apply.)
Answer
-
It must have the same constraints as the partitioned table.
-
It must have the same compression as the partitioned table.
-
It must be in a special PartitionedTables schema.
-
It must have a check constraint on the partitioning column that guarantees that all
of the data goes to exactly one partition of the partitioned table.
-
It must have the same indexes as the partitioned table.
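A minimal sketch of such a switch, assuming a hypothetical dbo.FactSales_Staging table that already matches the partitioned dbo.FactSales table in structure, constraints, compression, and indexes, and that sits on the same filegroup as target partition 3:

    -- The check constraint guarantees every row belongs to exactly one partition.
    ALTER TABLE dbo.FactSales_Staging
    ADD CONSTRAINT CK_FactSales_Staging_OrderDate
        CHECK (OrderDate >= '20130101' AND OrderDate < '20140101' AND OrderDate IS NOT NULL);

    ALTER TABLE dbo.FactSales_Staging
    SWITCH TO dbo.FactSales PARTITION 3;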
Question 18
Which of the following T-SQL functions is not very useful for capturing lineage
information?
Answer
-
APP_NAME()
-
USER_NAME()
-
DEVICE_STATUS()
-
SUSER_SNAME()
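A minimal sketch of the execution-context values such functions return, which is why they are handy for lineage columns; SYSDATETIME() is added here only to capture the load time.

    SELECT APP_NAME()    AS ApplicationName,  -- for example, the client or SSIS application name
           USER_NAME()   AS DatabaseUserName,
           SUSER_SNAME() AS LoginName,
           SYSDATETIME() AS LoadDateTime;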
Question 19
You need to move data from a production database into a testing database. You need to extract the data from several objects in the source database, but your manager has asked you to only copy about 10 percent of the rows from the largest production tables. The testing database already exists, but without any tables. How would you approach this task?
Answer
-
Use the Import and Export Wizard, copy all tables from the source database to the
empty destination database, and delete the excess rows from the largest tables.
-
Use the Import and Export Wizard multiple times—once for all the smaller tables,
and once for each large table, using the Write A Query To Specify The Data To
Transfer option to restrict the rows
-
Use the Import and Export Wizard, copy all tables from the source database to the
empty destination database, and restrict the number of rows for each large table
by using the Edit SQL option in the Column Mappings window
-
Use the Import and Export Wizard, configure it to copy all tables from the source
database to the empty destination database, save the SSIS package, and then,
before executing it, edit it by using SSDT to restrict the number of rows extracted
from the large tables.
Question 20
You need to move data from an operational database into a data warehouse for the very first time. The data warehouse has already been set up, and it already contains some reference data. You have just finished preparing views in the operational database that correspond to the dimension and fact tables of the data warehouse. How would you approach this task?
Answer
-
Use the Import and Export Wizard and copy data from the dimension and fact
views in the operational database into the tables in the data warehouse, by using
the Drop And Re-create The Destination Table option in the Column Mappings
window for every non-empty destination table.
-
Use the Import and Export Wizard, configure it to copy data from the dimension
and fact views in the operational database into the tables in the data warehouse,
save the SSIS package, and then edit it by using SSDT to add appropriate data
merging functionalities for all destination tables.
-
Use the Import and Export Wizard and copy data from the dimension and fact
views in the operational database into the tables in the data warehouse, by using
the Merge Data Into The Destination Table option in the Column Mappings window for every non-empty destination table.
-
Use SSDT instead of the Import and Export Wizard, because the wizard lacks appropriate data transformation and merging capabilities.
Question 21
When SSIS packages are saved to DTSX files, what format is used to store the SSIS
package definitions?
Answer
-
They are stored as binary files.
-
They are stored as plain text files.
-
They are stored as XML files.
-
They are stored as special Microsoft Word documents.
Question 22
Which statements best describe SQL Server Data Tools (SSDT)? (Choose all that apply.)
Answer
-
SSDT is an extension of the SQL Server Management Studio that can be used to
create SSIS packages by means of a special wizard.
-
SSDT is a special edition of the SQL Server Management Studio, designed to provide an improved user experience to developers who are not particularly familiar with database administration.
-
SSDT is a special edition of Visual Studio, distributed with SQL Server 2012, providing a rich database development tool set.
-
SSDT is a new service in SQL Server 2012 that can be used to perform SQL Server maintenance tasks, such as data movements and similar data management processes.
Question 23
Which of the following statements about simple and complex data movements are
true? (Choose all that apply.)
Answer
-
Simple data movements only have a single data source and a single data
destination
-
Complex data movements require data to be transformed before it can be stored
at the destination.
-
In simple data movements, data transformations are limited to data type
conversion.
-
In complex data movements, additional programmatic logic is required to merge
source data with destination data.
Question 24
Which of the following statements are true? (Choose all that apply.)
Answer
-
An SSIS package can contain one or more SSDT solutions, each performing a
specific data management operation.
-
An SSIS project can contain one or more SSIS packages.
-
An SSIS project can contain exactly one SSIS package.
-
SSIS packages contain programmatic logic used in data movements and data
transformation operations.
Question 25
The Execute SQL Task allows you to execute SQL statements and commands against the data store. What tools do you have at your disposal when developing SSIS packages to develop and test a SQL command? (Choose all that apply.)
Answer
-
SQL Server Management Studio (SSMS)
-
SQL Server Data Tools (SSDT)
-
The Execute SQL Task Editor
-
SQL Server Enterprise Manager (SSEM)
Question 26
You need to execute two data flow operations in parallel after an Execute SQL Task has
been completed. How can you achieve that? (Choose all that apply.)
Answer
-
There is no way for two data flow operations to be executed in parallel in the same
SSIS package.
-
You can place both data flows inside the same data flow task and create a precedence constraint leading from the preceding Execute SQL Task to the data flow task.
-
You can create two separate data flow tasks and create two precedence constraints
leading from the preceding Execute SQL Task to each of the two data flow tasks.
-
You can create two separate data flow tasks, place them inside a third data flow
task, and create a precedence constraint leading from the preceding Execute SQL
Task to the third data flow task.
Question 27
Which precedence constraint can you use to allow Task B to execute after Task A even
if Task A has failed?
Answer
-
The failure precedence constraint, leading from Task A to Task B
-
The success precedence constraint, leading from Task A to Task B.
-
The completion precedence constraint, leading from Task A to Task B.
-
Use two precedence constraints—a success precedence constraint, and a failure
precedence constraint, both leading from Task A to Task B
Question 28
You need to extract data from delimited text files. What connection manager type
would you choose?
Answer
-
A Flat File connection manager
-
An OLE DB connection manager
-
An ADO.NET connection manager
-
A File connection manager
Question 29
Some of the data your company processes is sent in from partners via email.
How would you configure an SMTP connection manager to extract files from email
messages?
Answer
-
In the SMTP connection manager, configure the OperationMode setting to Send
And Receive.
-
It is not possible to use the SMTP connection manager in this way, because it can
only be used by SSIS to send email messages.
-
The SMTP connection manager supports sending and receiving email messages by
default, so no additional configuration is necessary.
-
It is not possible to use the SMTP connection manager for this; use the IMAP (Internet Message Access Protocol) connection manager instead.
Question 30
You need to extract data from a table in a SQL Server 2012 database. What connection
manager types can you use? (Choose all that apply.)
Answer
-
An ODBC connection manager
-
An OLE DB connection manager
-
A File connection manager
-
An ADO.NET connection manager
Question 31
In your SSIS solution, you need to load a large set of rows into the database as quickly
as possible. The rows are stored in a delimited text file, and only one source column
needs its data type converted from String (used by the source column) to Decimal
(used by the destination column). What control flow task would be most suitable for
this operation?
Answer
-
The File System task would be perfect in this case, because it can read data from
files and can be configured to handle data type conversions.
-
The Bulk Insert task would be the most appropriate, because it is the quickest and
can handle data type conversions.
-
The data flow task would have to be used, because the data needs to be transformed before it can be loaded into the table.
-
No single control flow task can be used for this operation, because the data needs to be extracted from the source file, transformed, and then loaded into the destination table. At least three different tasks would have to be used: the Bulk Insert task to load the data into a staging database, a Data Conversion task to convert the data appropriately, and finally, an Execute SQL task to merge the transformed data with existing destination data.
Question 32
A part of your data consolidation process involves extracting data from Excel workbooks. Occasionally, the data contains errors that cannot be corrected automatically. How can you handle this problem by using SSIS?
Answer
-
Redirect the failed data flow task to an External Process task, open the problematic
Excel file in Excel, and prompt the user to correct the file before continuing the
data consolidation process.
-
Redirect the failed data flow task to a File System task that moves the erroneous
file to a dedicated location where an information worker can correct it later.
-
If the error cannot be corrected automatically, there is no way for SSIS to continue
with the automated data consolidation process.
-
None of the answers above are correct. Due to Excel’s strict data validation rules,
an Excel file cannot ever contain erroneous data.
Question 33
In your ETL process, a few values need to be retrieved from a database at run time,
based on another value available at run time, and they cannot be retrieved as part of
any data flow task. Which task can you use in this case?
Question 34
How is the order of execution, or the sequence of operations, defined in an SSIS
package?
Answer
-
The SSIS run time engine determines the order of execution automatically, based
on the type of operations, the available software and hardware resources, and the
size of data.
-
The sequence is defined by using precedence constraints.
-
The sequence is defined by using the Sequence container.
-
The sequence is defined at design time by using precedence constraints and
Sequence containers, but at run time the SSIS engine executes the operations in
the order set by the most appropriate execution plan for maximum performance.
Question 35
How does the Failure constraint affect the execution order?
Answer
-
The following task will only execute after the preceding task has failed.
-
The following task will only execute if the preceding task has not failed.
-
The following task will never execute, because this constraint is only used at
design time.
-
The following task will execute regardless of whether the preceding task has failed,
but an error will be written to the SSIS log.
Question 36
In your ETL process, there are three external processes that need to be executed in
sequence, but you do not want to stop execution if any of them fails. Can this be
achieved by using precedence constraints? If so, which precedence constraints can
be used?
Answer
-
No, this cannot be achieved just by using precedence constraints.
-
Yes, this can be achieved by using completion precedence constraints between the
first and the second and between the second and the third Execute Process tasks,
and by using a success precedence constraint between the third Execute Process
task and the following task.
-
Yes, this can be achieved by using completion precedence constraints between
the first and the second, between the second and the third, and between the third
Execute Process task and the following task.
-
Yes, this can be achieved by using failure precedence constraints between the first
and the second, and between the second and the third Execute Process tasks, and
by using a completion precedence constraint between the third Execute Process
task and the following task.
Question 37
Which data flow source adapters can you use if you would like to read data from SQL
Server? (Choose all that apply.)
Answer
-
ADO NET source
-
Raw File source
-
OLE DB source
-
ODBC source
Question 38
Which data flow destinations can you use if you would like to temporarily stage data to
a file system? (Choose all that apply.)
Answer
-
OLE DB destination
-
Flat File destination
-
Raw File destination
-
Recordset destination
Question 39
Which statements are true regarding data source adapters? (Choose all that apply.)
Answer
-
You can change how source data is mapped to SSIS data types.
-
You can have only one data source adapter per data flow task.
-
You must always select all columns from the source adapter.
-
You can read data from an XML file by using SSIS.
Question 40
Which transformations can you use if you would like to convert data from one data type to another? (Choose all that apply.)
Answer
-
Audit
-
Derived Column
-
Data Conversion
-
Script Component
Question 41
Which transformations are fully blocking? (Choose all that apply.)
Question 42
Which transformations are new in SQL Server 2012 SSIS? (Choose all that apply.)
Answer
-
CDC Splitter
-
Pivot
-
Fuzzy Lookup
-
DQS Cleansing
Question 43
Which data flow transformation would you use if you had to combine data from two different database tables that exist on two different servers? (Choose all that apply.)
Question 44
Suppose that you want to load the data from a flat file and write it to SQL Server, Excel, and Raw File destinations inside one data flow task. How many data source adapters do you need? (Choose all that apply.)
Question 45
Which statements are true regarding the Cache connection manager? (Choose all that apply.)
Answer
-
Cache can be reused by multiple Lookup transformations.
-
You cannot incrementally update the cache while the package is running.
-
The Cache connection manager can be set on a project level.
-
You can do lookups against other (non-OLE DB) sources.
Question 46
In your SSIS package you need to retrieve a scalar value from a table in the destination database, to be used by multiple tasks. What is the most appropriate method to achieve this?
Answer
-
Embed a subquery in every existing query used by the package, so that the database
engine can prepare the most appropriate execution plan to retrieve it at run time
-
Create a variable and use an expression to retrieve the value from the database
once, and then use it throughout the execution.
-
Create a variable and use the Execute SQL task to retrieve the value once, and then
use it throughout the execution.
-
Create a variable and use the Expression task to retrieve the value from the database as many times as needed.
Question 47
In your SSIS package, you created a package-scoped variable to hold a value that you
want to reuse throughout the package. Later you discover that this value must be set
differently in one container, but the original variable should not be affected. What can
you do?
Answer
-
Create a new package-scoped variable with a different name and reconfigure the
tasks accordingly, to either use the new variable or the original one
-
Create a new container-scoped variable with a different name and reconfigure
only the tasks that it contains to use the new variable.
-
Create a new container-scoped variable with the same name, and leave the tasks
unchanged.
-
Use a package-scoped parameter, because this problem cannot be solved by using
variables.
Question 48
In your SSIS process, a specific property will be determined by the administrator in the production environment. The value supplied by the administrator will be used in multiple properties and will have to be overridden automatically if certain conditions are met at run time. What is the most appropriate method to achieve this in SSIS?
Answer
-
Create a parameter and use expressions to assign its value to the corresponding
properties, but use an expression at the beginning of the execution to change the
parameter value if needed.
-
Create a parameter, use an expression at the beginning of the execution to either
assign its value to a variable or override the value if needed, and use expressions to
assign the value of the variable to the properties.
-
Create a read/write variable, use expressions to assign its value to the appropriate
properties, and assign the correct value to the variable via property paths at run
time.
-
Create a parameter and use expressions to either assign its value to the property
or override it if needed.
Question 49
In your SSIS package, there are several data flow tasks, each importing data into the
destination database. You need to log the number of rows that have been inserted or
updated in each data flow. What options provided by SSIS can you use to accomplish
this? (Choose all that apply.)
Answer
-
You can use the Row Count task to count the rows passed through a data flow.
-
You can use the Row Count component to count the rows passed through a
data flow.
-
You can store the values in variables before saving them to the log.
-
You can use an expression to calculate the total number of processed rows.
Question 50
In your SSIS package, you need to set the properties of several tasks based on the information available about the run time environment. Each of the properties you need to compute can be calculated by using a mathematical expression. What would be the most appropriate method?
Answer
-
Use the Expression Builder to build and test the expression, and then copy it to all
of the corresponding task definitions.
-
Place an Expression Task into the control flow, preceding each task whose properties need to be determined dynamically.
-
Use as many Expression tasks as necessary to compute as many variables as there
are different calculations, and then use the variables to assign the values to the
corresponding tasks.
-
Use a single Expression task and store all required computed values in a row set (Object) variable to be used in property expressions to configure the corresponding tasks.
Question 51
In the control flow of your SSIS package, you need to add a data maintenance task that will rebuild the indexes of your dimension tables after they have been populated successfully. You have implemented each dimension table load in an individual data flow. The Execute SQL task containing the index rebuild script must be executed after the preceding data flow has completed successfully, but only on Saturdays. The name of the current day is stored in a variable. Is it possible to achieve this in SSIS control flow? If so, how?
Answer
-
No, this is not possible in the control flow because the conditional split transforma-
tion is only available in a data flow.
-
Yes, it is possible to achieve this in the control flow, but only by using a Script task.
-
Yes, this can be achieved by using a success precedence constraint leading from
the data flow task to the Execute SQL task, with a precedence constraint expression
checking whether the value of the variable is Saturday.
-
Yes, this can be achieved by using a regular success precedence constraint leading
from the data flow task to the Execute SQL task.
Question 52
What methods does SSIS provide for configuring child packages from the master
package?
Answer
-
Project parameters
-
Global variables
-
Package parameters
-
Solution parameters
Question 53
Expressions can be used in precedence constraints to… (one or more answers are
correct)
Question 54
Which of the following statements about master packages are true? (Choose all that
apply.)
Answer
-
Master packages are used to configure child packages.
-
Master packages are used to execute child packages.
-
Master packages prevent child packages from being executed directly.
-
Master packages are used to control the order in which child packages are
executed.
Question 55
Which functionality does the Slowly Changing Dimension transformation support? (Choose all that apply.)
Answer
-
Type 1 SCD
-
Type 2 SCD
-
Type 3 SCD
-
Inferred members
Question 56
Which transformations can you use to determine whether the values of specific columns have changed between the source data and the destination table? (Choose all that apply.)
Answer
-
Data Conversion
-
Derived Column
-
Conditional Split
-
Multicast
Question 57
Which statements are true regarding the Slowly Changing Dimension transformation? (Choose all that apply.)
Answer
-
The Slowly Changing Dimension transformation supports set-based updates.
-
The Slowly Changing Dimension transformation supports inferred members.
-
You can have multiple Slowly Changing Dimension transformations in one data flow
-
The Slowly Changing Dimension Wizard supports only connections with a SQL
Server database.
Question 58
Which process modes in a CDC Source can be used directly to load data without applying the additional ETL process of getting the current value of the row? (Choose all that apply.)
Answer
-
Net
-
All
-
All with old value
-
Net with merge
Question 59
Which methods can you use to dynamically set SQL inside SSIS at run time if you are using an OLE DB source adapter? (Choose all that apply.)
Answer
-
Use a stored procedure as a source
-
Use parameters inside the SQL statement
-
Use expressions
-
Use an SQL command from a variable.
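For the parameter-based answer, a minimal sketch of the kind of statement typed into an OLE DB source when the data access mode is SQL command; the ? placeholder is mapped to an SSIS variable or parameter, and the table name is only an example.

    SELECT SalesOrderID, OrderDate, CustomerID, TotalDue
    FROM Sales.SalesOrderHeader
    WHERE OrderDate >= ?;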
Question 60
Which data flow elements have an error flow? (Choose all that apply.)
Answer
-
The OLE DB Source adapter
-
The Union All transformation
-
The Merge transformation
-
The Lookup transformation
Question 61
Which options are available for handling errors on the row level? (Choose all that apply.)
Answer
-
Ignore failure
-
Fail component
-
Redirect row
-
Delete row
Question 62
Which data source adapters have an error flow? (Choose all that apply.)
Answer
-
OLE DB source
-
Raw File source
-
ODBC source
-
Excel source
Question 63
Which tasks in the control flow support transactions? (Choose all that apply.)
Answer
-
Data flow task
-
Execute SQL task
-
File System task
-
XML task
Question 64
Which transaction isolation levels do not lock the records being read? (Choose all that apply.)
Answer
-
Serializable
-
Snapshot
-
Chaos
-
ReadUncommitted
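A minimal sketch of reading under the SNAPSHOT isolation level, which relies on row versioning instead of shared locks; it assumes a hypothetical TestDW database in which snapshot isolation has been enabled.

    ALTER DATABASE TestDW SET ALLOW_SNAPSHOT_ISOLATION ON;

    SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
    BEGIN TRAN;
    SELECT COUNT(*) FROM dbo.FactSales;  -- reads row versions, takes no shared locks
    COMMIT TRAN;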
Question 65
Which T-SQL statements can you use to manually handle a transaction? (Choose all
that apply.)
Answer
-
BEGIN TRAN
-
ROLLBACK TRAN
-
END TRAN
-
COMMIT TRAN
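A minimal sketch combining these statements, wrapped in TRY...CATCH; the table and values are hypothetical.

    BEGIN TRY
        BEGIN TRAN;
        UPDATE dbo.DimCustomer SET City = N'Vienna' WHERE CustomerKey = 17;
        COMMIT TRAN;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRAN;
    END CATCH;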
Question 66
Which properties do you need to set on the package level to enable checkpoints?
(Choose all that apply.)
Answer
-
CheckpointFileName
-
CheckpointUsage
-
SaveCheckpoints
-
FailPackageOnFailure
Question 67
On which SSIS objects can you set checkpoints to be active? (Choose all that apply.)
Answer
-
Data flow task
-
Control flow tasks
-
Sequence container
-
Sort transformation
Question 68
Which statements are correct regarding SSIS checkpoints? (Choose all that apply.)
Answer
-
You can have multiple checkpoint files per package.
-
If the data flow task fails, you can store rows with errors in the checkpoint file.
-
If the package is successful, the checkpoint file is deleted
-
If you set the CheckpointUsage property to Always, the checkpoint file must be
present or the package will not start
Question 69
Which components in SSIS can act as executable components that trigger an event?
(Choose all that apply.)
Answer
-
Sequence container
-
Package
-
Data flow task
-
Data flow transformation
Question 70
Which event handler types can you use if you want to log all the package errors?
(Choose all that apply.)
Answer
-
OnPostExecute
-
OnError
-
OnWarning
-
OnTaskFailed
Question 71
Which event handler types can you use if you want to log information before the task
starts in the package? (Choose all that apply.)
Answer
-
OnTaskFailed
-
OnProgress
-
OnPreExecute
-
OnWarning
Question 72
Which parameter types are available in SSIS in SQL Server 2012? (Choose all that apply.)
Question 73
Which properties can be set by using build configurations? (Choose all that apply.)
Question 74
Which properties can be set by using property expressions? (Choose all that apply.)
Answer
-
SQL statement for the Execute SQL task
-
Variable values
-
Data flow task properties
-
The Lookup task cache mode property
Question 75
Which configuration types can you use to store configuration values? (Choose all that
apply.)
Question 76
On which objects can you set dynamic properties by using package configurations?
(Choose all that apply.)
Question 77
Which SSIS elements can be configured by using a package configuration? (Choose all
that apply.)
Question 78
Which statements about SSIS logging are correct? (Choose all that apply.)
Answer
-
Every task and every operation automatically reports every error to the output file
by default.
-
SSIS logging can be configured for individual SSIS tasks or containers or at the
package level
-
SSIS logging writes entries into the SQL Server Error Log
-
SSIS log entries are exposed to the environment using a variety of built-in log
providers.
Question 79
Which of the following SSIS properties can be logged? (Choose all that apply.)
Answer
-
The name of the control flow task
-
The name of the event
-
The name of the user who started the process
-
The name of the machine on which the event occurred
Question 80
Which feature allows you to configure logging once on a particular task or container,
and then reuse the same settings on other tasks and/or containers?
Question 81
What information can be added to the data flow buffer by using the Audit transformation? (Choose all that apply.)
Question 82
How can the Audit transformation be configured to provide additional system variables or user variables to the data flow?
Answer
-
The set of properties provided by the Audit transformation is fixed and cannot be
extended.
-
The Audit transformation itself cannot be extended, but by using the Expression
task, you can assign any value to an appropriate system variable used by the Audit
transformation.
-
The Audit transformation can be edited by using the Advanced Editor, which provides unlimited access to system and user variables, as well as the rest of the data flow columns.
-
The set of properties provided by the Audit transformation can be extended at
design time by switching to advanced mode.
Question 83
Which of the following pieces of information represent typical examples of audit data?
(Choose all that apply.)
Answer
-
The date and time that the row was added to the table
-
The date and time that the row was last modified
-
A value designating the success or failure of an SSIS process
-
The name of the SSIS task in which an error has occurred
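A minimal sketch of a dimension table carrying such audit columns; the column and constraint names are hypothetical.

    ALTER TABLE dbo.DimCustomer ADD
        InsertedOn  DATETIME2 NOT NULL
            CONSTRAINT DF_DimCustomer_InsertedOn DEFAULT (SYSDATETIME()),
        ModifiedOn  DATETIME2 NULL,
        ExecutionId UNIQUEIDENTIFIER NULL;  -- identifier of the SSIS execution that last touched the row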
Question 84
What are SSIS package templates used for? (Choose all that apply.)
Answer
-
They can be used to develop new SSIS projects.
-
They can be used to preconfigure SSIS packages
-
They can be used to apply log configurations for SSIS objects.
-
They can be used to create SSIS packages.
Question 85
What type is used for SSIS package template files?
Answer
-
The .dotx file type
-
The .dotsx file type
-
The .dtsx file type
-
The .xml file type
Question 86
Which of the following statements about package templates are true? (Choose all
that apply.)
Answer
-
SSIS package templates can be used to reapply log settings to SSIS tasks and/or
containers.
-
SSIS package templates can be used to reduce the time needed to develop an SSIS
package.
-
SSIS package templates can be used to preconfigure SSIS operations.
-
SSIS package templates can speed up SSIS process execution.