Question 1
Question
Reporting from a Star schema is simpler than reporting from a normalized online
transactional processing (OLTP) schema. What are the reasons for wanting simpler
reporting? (Choose all that apply.)
Answer
-
A Star schema typically has fewer tables than a normalized schema. Therefore,
queries are simpler because they require fewer joins.
-
Star schema has better support for numeric data types than a normalized relational schema; therefore, it is easier to create aggregates.
-
There are specific Transact-SQL expressions that deal with Star schemas.
-
A Star schema is standardized and narrative; you can find the information you
need for a report quickly.
Question 2
Question
You are creating a quick POC project. Which schema is the most suitable for this kind of project?
Answer
-
Star schema
-
Normalized schema
-
Snowflake schema
-
XML schema
Question 3
Question
A Star schema has two types of tables. What are those two types? (Choose all that
apply.)
Answer
-
Lookup tables
-
Dimensions
-
Measures
-
Fact tables
Question 4
Question
You implement a Type 2 solution for an SCD problem for a specific column. What do
you actually do when you get a changed value for the column from the source system?
Answer
-
Add a column for the previous value to the table. Move the current value of the
updated column to the new column. Update the current value with the new value
from the source system.
-
Insert a new row for the same dimension member with the new value for the
updated column. Use a surrogate key, because the business key is now duplicated.
Add a flag that denotes which row is current for a member.
-
Do nothing, because in a DW you maintain history; you do not update dimension data.
-
Update the value of the column just as it was updated in the source system.
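A Type 2 change (inserting a new row for the member and flagging it as current) can be sketched in T-SQL. This is a minimal illustration only; the DimCustomer table, its columns, and the variables are hypothetical:

```sql
-- Hypothetical dimension: DimCustomer(CustomerKey INT IDENTITY, -- surrogate key
--                                     CustomerBK INT,           -- business key
--                                     City NVARCHAR(50), IsCurrent BIT)
-- Close the row that is currently flagged for this member...
UPDATE dbo.DimCustomer
SET    IsCurrent = 0
WHERE  CustomerBK = @CustomerBK
  AND  IsCurrent  = 1;

-- ...and insert a new row with the changed value; the surrogate key
-- (CustomerKey) keeps the now-duplicated business key unambiguous.
INSERT INTO dbo.DimCustomer (CustomerBK, City, IsCurrent)
VALUES (@CustomerBK, @NewCity, 1);
```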
Question 5
Question
Which kind of column is not part of a dimension?
Answer
-
Attribute
-
Measure
-
Key
-
Member property
-
Name
Question 6
Question
How can you spot natural hierarchies in a Snowflake schema?
Answer
-
You need to analyze the content of the attributes of each dimension.
-
Lookup tables for each dimension provide natural hierarchies.
-
A Snowflake schema does not support hierarchies.
-
You should convert the Snowflake schema to the Star schema, and then you would
spot the natural hierarchies immediately.
Question 7
Question
Over which dimension can you not use the SUM aggregate function for semi-additive
measures?
Answer
-
Customer
-
Product
-
Date
-
Employee
Question 8
Question
Which measures would you expect to be non-additive? (Choose all that apply.)
Answer
-
Price
-
Debit
-
SalesAmount
-
DiscountPct
-
UnitBalance
Question 9
Question
Which kind of column is not part of a fact table?
Answer
-
Lineage
-
Measure
-
Key
-
Member property
Question 10
Question
Which database objects and object properties can you use for autonumbering?
(Choose all that apply.)
Answer
-
IDENTITY property
-
SEQUENCE object
-
PRIMARY KEY constraint
-
CHECK constraint
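The two autonumbering mechanisms, the IDENTITY property and the SEQUENCE object, can be sketched as follows; the table and sequence names are hypothetical:

```sql
-- IDENTITY: autonumbering bound to a single table column
CREATE TABLE dbo.DimProduct
(
    ProductKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ProductName NVARCHAR(50) NOT NULL
);

-- SEQUENCE: a standalone object (SQL Server 2012 and later),
-- reusable across tables
CREATE SEQUENCE dbo.SeqCustomerKey AS INT START WITH 1 INCREMENT BY 1;

CREATE TABLE dbo.DimCustomer
(
    CustomerKey INT NOT NULL
        CONSTRAINT DF_CustomerKey DEFAULT (NEXT VALUE FOR dbo.SeqCustomerKey)
        PRIMARY KEY,
    CustomerName NVARCHAR(50) NOT NULL
);
```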
Question 11
Question
What columns do you add to a table to support Type 2 SCD changes? (Choose all that
apply.)
Answer
-
Member properties
-
Current row flag
-
Lineage columns
-
Surrogate key
Question 12
Question
What is an inferred member?
Answer
-
A row in a fact table added during dimension load
-
A row with aggregated values
-
A row in a dimension added during fact table load
-
A computed column in a fact table
Question 13
Question
Which types of data compression are supported by SQL Server? (Choose all that apply.)
Question 14
Question
Which operators can benefit from batch processing? (Choose all that apply.)
Answer
-
Hash Join
-
Merge Join
-
Scan
-
Nested Loops Join
-
Filter
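In SQL Server 2012, batch mode processing is available only for queries that involve a columnstore index, and it benefits operators such as Hash Join, Scan, and Filter. A minimal sketch, assuming a hypothetical fact table and columns:

```sql
-- A nonclustered columnstore index enables batch mode for scans,
-- hash joins, and filters over this table.
-- Note: in SQL Server 2012, it also makes the table read-only.
CREATE NONCLUSTERED COLUMNSTORE INDEX CSI_FactSales
ON dbo.FactSales (DateKey, CustomerKey, SalesAmount);
```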
Question 15
Question
Why would you use indexed views? (Choose all that apply.)
Answer
-
To speed up queries that aggregate data
-
To speed up data load
-
To speed up selective queries
-
To speed up queries that involve multiple joins
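An indexed view that speeds up aggregate and multi-join queries can be sketched like this (the view and table names are hypothetical); note the SCHEMABINDING option and the COUNT_BIG(*) column, both required for an indexed view with aggregation:

```sql
CREATE VIEW dbo.SalesByDate
WITH SCHEMABINDING
AS
SELECT DateKey,
       SUM(SalesAmount) AS TotalSales,
       COUNT_BIG(*)     AS NumRows   -- required when the view uses GROUP BY
FROM   dbo.FactSales
GROUP BY DateKey;
GO
-- Materializing the view: the unique clustered index stores the aggregates.
CREATE UNIQUE CLUSTERED INDEX UX_SalesByDate ON dbo.SalesByDate (DateKey);
```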
Question 16
Question
The database object that maps partitions of a table to filegroups is called a(n)
Answer
-
Aligned index
-
Partition function
-
Partition column
-
Partition scheme
Question 17
Question
If you want to switch content from a nonpartitioned table to a partition of a partitioned
table, what conditions must the nonpartitioned table meet? (Choose all that apply.)
Answer
-
It must have the same constraints as the partitioned table.
-
It must have the same compression as the partitioned table.
-
It must be in a special PartitionedTables schema.
-
It must have a check constraint on the partitioning column that guarantees that all
of the data goes to exactly one partition of the partitioned table.
-
It must have the same indexes as the partitioned table.
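A staging table prepared for switching into a partition can be sketched like this; the table names, constraint name, partition number, and date-key range are hypothetical:

```sql
-- The staging table must match the partitioned table's structure,
-- indexes, and compression, and must prove via a check constraint
-- that every row belongs to the target partition.
ALTER TABLE dbo.FactSales_Staging
ADD CONSTRAINT CK_Staging_DateKey
CHECK (DateKey >= 20120101 AND DateKey < 20120201);

-- Metadata-only operation: no data is physically moved.
ALTER TABLE dbo.FactSales_Staging
SWITCH TO dbo.FactSales PARTITION 5;
```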
Question 18
Question
Which of the following T-SQL functions is not very useful for capturing lineage
information?
Answer
-
APP_NAME()
-
USER_NAME()
-
DEVICE_STATUS()
-
SUSER_SNAME()
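Of the options listed, DEVICE_STATUS() does not exist in T-SQL; the three genuine functions can capture lineage information like this:

```sql
SELECT APP_NAME()    AS ExecutingApplication,  -- name of the client application
       USER_NAME()   AS DatabaseUserName,      -- database user
       SUSER_SNAME() AS LoginName;             -- server login
```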
Question 19
Question
You need to move data from a production database into a testing database. You need
to extract the data from several objects in the source database, but your manager
has asked you to only copy about 10 percent of the rows from the largest production
tables. The testing database already exists, but without any tables. How would you ap-
proach this task?
Answer
-
Use the Import and Export Wizard, copy all tables from the source database to the
empty destination database, and delete the excess rows from the largest tables.
-
Use the Import and Export Wizard multiple times—once for all the smaller tables,
and once for each large table, using the Write A Query To Specify The Data To
Transfer option to restrict the rows.
-
Use the Import and Export Wizard, copy all tables from the source database to the
empty destination database, and restrict the number of rows for each large table
by using the Edit SQL option in the Column Mappings window.
-
Use the Import and Export Wizard, configure it to copy all tables from the source
database to the empty destination database, save the SSIS package, and then,
before executing it, edit it by using SSDT to restrict the number of rows extracted
from the large tables.
Question 20
Question
You need to move data from an operational database into a data warehouse for the
very first time. The data warehouse has already been set up, and it already contains
some reference data. You have just finished preparing views in the operational database that correspond to the dimension and fact tables of the data warehouse. How
would you approach this task?
Answer
-
Use the Import and Export Wizard and copy data from the dimension and fact
views in the operational database into the tables in the data warehouse, by using
the Drop And Re-create The Destination Table option in the Column Mappings
window for every non-empty destination table.
-
Use the Import and Export Wizard, configure it to copy data from the dimension
and fact views in the operational database into the tables in the data warehouse,
save the SSIS package, and then edit it by using SSDT to add appropriate data
merging functionalities for all destination tables.
-
Use the Import and Export Wizard and copy data from the dimension and fact
views in the operational database into the tables in the data warehouse, by using
the Merge Data Into The Destination Table option in the Column Mappings window for every non-empty destination table.
-
Use SSDT instead of the Import and Export Wizard, because the wizard lacks appropriate data transformation and merging capabilities.
Question 21
Question
When SSIS packages are saved to DTSX files, what format is used to store the SSIS
package definitions?
Answer
-
They are stored as binary files.
-
They are stored as plain text files.
-
They are stored as XML files.
-
They are stored as special Microsoft Word documents.
Question 22
Question
Which statements best describe SQL Server Data Tools (SSDT)? (Choose all that apply.)
Answer
-
SSDT is an extension of the SQL Server Management Studio that can be used to
create SSIS packages by means of a special wizard.
-
SSDT is a special edition of the SQL Server Management Studio, designed to provide an improved user experience to developers who are not particularly familiar with database administration.
-
SSDT is a special edition of Visual Studio, distributed with SQL Server 2012, providing a rich database development tool set.
-
SSDT is a new service in SQL Server 2012 that can be used to perform SQL Server maintenance tasks, such as data movements and similar data management processes.
Question 23
Question
Which of the following statements about simple and complex data movements are
true? (Choose all that apply.)
Answer
-
Simple data movements only have a single data source and a single data
destination.
-
Complex data movements require data to be transformed before it can be stored
at the destination.
-
In simple data movements, data transformations are limited to data type
conversion.
-
In complex data movements, additional programmatic logic is required to merge
source data with destination data.
Question 24
Question
Which of the following statements are true? (Choose all that apply.)
Answer
-
An SSIS package can contain one or more SSDT solutions, each performing a
specific data management operation.
-
An SSIS project can contain one or more SSIS packages.
-
An SSIS project can contain exactly one SSIS package.
-
SSIS packages contain programmatic logic used in data movements and data
transformation operations.
Question 25
Question
The Execute SQL Task allows you to execute SQL statements and commands against
the data store. What tools do you have at your disposal when developing SSIS packages to develop and test a SQL command? (Choose all that apply.)
Answer
-
SQL Server Management Studio (SSMS)
-
SQL Server Data Tools (SSDT)
-
The Execute SQL Task Editor
-
SQL Server Enterprise Manager (SSEM)
Question 26
Question
You need to execute two data flow operations in parallel after an Execute SQL Task has
been completed. How can you achieve that? (Choose all that apply.)
Answer
-
There is no way for two data flow operations to be executed in parallel in the same
SSIS package.
-
You can place both data flows inside the same data flow task and create a precedence constraint leading from the preceding Execute SQL Task to the data flow task.
-
You can create two separate data flow tasks and create two precedence constraints
leading from the preceding Execute SQL Task to each of the two data flow tasks.
-
You can create two separate data flow tasks, place them inside a third data flow
task, and create a precedence constraint leading from the preceding Execute SQL
Task to the third data flow task.
Question 27
Question
Which precedence constraint can you use to allow Task B to execute after Task A even
if Task A has failed?
Answer
-
The failure precedence constraint, leading from Task A to Task B.
-
The success precedence constraint, leading from Task A to Task B.
-
The completion precedence constraint, leading from Task A to Task B.
-
Use two precedence constraints—a success precedence constraint and a failure precedence constraint, both leading from Task A to Task B.
Question 28
Question
You need to extract data from delimited text files. What connection manager type
would you choose?
Answer
-
A Flat File connection manager
-
An OLE DB connection manager
-
An ADO.NET connection manager
-
A File connection manager
Question 29
Question
Some of the data your company processes is sent in from partners via email.
How would you configure an SMTP connection manager to extract files from email
messages?
Answer
-
In the SMTP connection manager, configure the OperationMode setting to Send
And Receive.
-
It is not possible to use the SMTP connection manager in this way, because it can
only be used by SSIS to send email messages.
-
The SMTP connection manager supports sending and receiving email messages by
default, so no additional configuration is necessary.
-
It is not possible to use the SMTP connection manager for this; use the IMAP (Internet Message Access Protocol) connection manager instead.
Question 30
Question
You need to extract data from a table in a SQL Server 2012 database. What connection
manager types can you use? (Choose all that apply.)
Answer
-
An ODBC connection manager
-
An OLE DB connection manager
-
A File connection manager
-
An ADO.NET connection manager
Question 31
Question
In your SSIS solution, you need to load a large set of rows into the database as quickly
as possible. The rows are stored in a delimited text file, and only one source column
needs its data type converted from String (used by the source column) to Decimal
(used by the destination column). What control flow task would be most suitable for
this operation?
Answer
-
The File System task would be perfect in this case, because it can read data from
files and can be configured to handle data type conversions.
-
The Bulk Insert task would be the most appropriate, because it is the quickest and
can handle data type conversions.
-
The data flow task would have to be used, because the data needs to be transformed before it can be loaded into the table.
-
No single control flow task can be used for this operation, because the data needs to be extracted from the source file, transformed, and then loaded into the destination table. At least three different tasks would have to be used—the Bulk Insert
task to load the data into a staging database, a Data Conversion task to convert
the data appropriately, and finally, an Execute SQL task to merge the transformed
data with existing destination data.
Question 32
Question
A part of your data consolidation process involves extracting data from Excel workbooks. Occasionally, the data contains errors that cannot be corrected automatically.
How can you handle this problem by using SSIS?
Answer
-
Redirect the failed data flow task to an External Process task, open the problematic
Excel file in Excel, and prompt the user to correct the file before continuing the
data consolidation process.
-
Redirect the failed data flow task to a File System task that moves the erroneous
file to a dedicated location where an information worker can correct it later.
-
If the error cannot be corrected automatically, there is no way for SSIS to continue
with the automated data consolidation process.
-
None of the answers above are correct. Due to Excel’s strict data validation rules,
an Excel file cannot ever contain erroneous data.
Question 33
Question
In your ETL process, a few values need to be retrieved from a database at run time,
based on another value available at run time, and they cannot be retrieved as part of
any data flow task. Which task can you use in this case?
Question 34
Question
How is the order of execution, or the sequence of operations, defined in an SSIS
package?
Answer
-
The SSIS run time engine determines the order of execution automatically, based
on the type of operations, the available software and hardware resources, and the
size of data.
-
The sequence is defined by using precedence constraints.
-
The sequence is defined by using the Sequence container.
-
The sequence is defined at design time by using precedence constraints and
Sequence containers, but at run time the SSIS engine executes the operations in
the order set by the most appropriate execution plan for maximum performance.
Question 35
Question
How does the Failure constraint affect the execution order?
Answer
-
The following task will only execute after the preceding task has failed.
-
The following task will only execute if the preceding task has not failed.
-
The following task will never execute, because this constraint is only used at
design time.
-
The following task will execute regardless of whether the preceding task has failed,
but an error will be written to the SSIS log.
Question 36
Question
In your ETL process, there are three external processes that need to be executed in
sequence, but you do not want to stop execution if any of them fails. Can this be
achieved by using precedence constraints? If so, which precedence constraints can
be used?
Answer
-
No, this cannot be achieved just by using precedence constraints.
-
Yes, this can be achieved by using completion precedence constraints between the
first and the second and between the second and the third Execute Process tasks,
and by using a success precedence constraint between the third Execute Process
task and the following task.
-
Yes, this can be achieved by using completion precedence constraints between
the first and the second, between the second and the third, and between the third
Execute Process task and the following task.
-
Yes, this can be achieved by using failure precedence constraints between the first
and the second, and between the second and the third Execute Process tasks, and
by using a completion precedence constraint between the third Execute Process
task and the following task.
Question 37
Question
Which data flow source adapters can you use if you would like to read data from SQL
Server? (Choose all that apply.)
Answer
-
ADO NET source
-
Raw File source
-
OLE DB source
-
ODBC source
Question 38
Question
Which data flow destinations can you use if you would like to temporarily stage data to
a file system? (Choose all that apply.)
Answer
-
OLE DB destination
-
Flat File destination
-
Raw File destination
-
Recordset destination
Question 39
Question
Which statements are true regarding data source adapters? (Choose all that apply.)
Answer
-
You can change how source data is mapped to SSIS data types.
-
You can have only one data source adapter per data flow task.
-
You must always select all columns from the source adapter.
-
You can read data from an XML file by using SSIS.
Question 40
Question
Which transformation can you use if you would like to convert data from one data
type to another? (Choose all that apply.)
Answer
-
Audit
-
Derived Column
-
Data Conversion
-
Script Component
Question 41
Question
Which transformations are fully blocking? (Choose all that apply.)
Question 42
Question
Which transformations are new in SQL Server 2012 SSIS? (Choose all that apply.)
Answer
-
CDC Splitter
-
Pivot
-
Fuzzy Lookup
-
DQS Cleansing
Question 43
Question
Which data flow transformation would you use if you had to combine data from two different database tables that exist on two different servers? (Choose all that apply.)
Question 44
Question
Suppose that you want to load the data from a flat file and write it to a SQL Server, Excel, and Raw File destination inside one data flow task. How many data source adapters do you need? (Choose all that apply.)
Question 45
Question
Which statements are true regarding the Cache connection manager? (Choose all that apply.)
Answer
-
Cache can be reused by multiple Lookup transformations.
-
You cannot incrementally update the cache while the package is running.
-
The Cache connection manager can be set on a project level.
-
You can do lookups against other (non OLE-DB) sources.
Question 46
Question
In your SSIS package you need to retrieve a scalar value from a table in the destination database, to be used by multiple tasks. What is the most appropriate method to
achieve this?
Answer
-
Embed a subquery in every existing query used by the package, so that the database
engine can prepare the most appropriate execution plan to retrieve it at run time
-
Create a variable and use an expression to retrieve the value from the database
once, and then use it throughout the execution.
-
Create a variable and use the Execute SQL task to retrieve the value once, and then
use it throughout the execution.
-
Create a variable and use the Expression task to retrieve the value from the database as many times as needed.
Question 47
Question
In your SSIS package, you created a package-scoped variable to hold a value that you
want to reuse throughout the package. Later you discover that this value must be set
differently in one container, but the original variable should not be affected. What can
you do?
Answer
-
Create a new package-scoped variable with a different name and reconfigure the tasks accordingly, to either use the new variable or the original one.
-
Create a new container-scoped variable with a different name and reconfigure
only the tasks that it contains to use the new variable.
-
Create a new container-scoped variable with the same name, and leave the tasks
unchanged.
-
Use a package-scoped parameter, because this problem cannot be solved by using
variables.
Question 48
Question
In your SSIS process, a specific property will be determined by the administrator in the
production environment. The value supplied by the administrator will be used in multiple properties and will have to be overridden automatically if certain conditions are
met at run time. What is the most appropriate method to achieve this in SSIS?
Answer
-
Create a parameter and use expressions to assign its value to the corresponding
properties, but use an expression at the beginning of the execution to change the
parameter value if needed.
-
Create a parameter, use an expression at the beginning of the execution to either
assign its value to a variable or override the value if needed, and use expressions to
assign the value of the variable to the properties.
-
Create a read/write variable, use expressions to assign its value to the appropriate
properties, and assign the correct value to the variable via property paths at run
time.
-
Create a parameter and use expressions to either assign its value to the property
or override it if needed.
Question 49
Question
In your SSIS package, there are several data flow tasks, each importing data into the
destination database. You need to log the number of rows that have been inserted or
updated in each data flow. What options provided by SSIS can you use to accomplish
this? (Choose all that apply.)
Answer
-
You can use the Row Count task to count the rows passed through a data flow.
-
You can use the Row Count component to count the rows passed through a
data flow.
-
You can store the values in variables before saving them to the log.
-
You can use an expression to calculate the total number of processed rows.
Question 50
Question
In your SSIS package, you need to set the properties of several tasks based on the information available about the run time environment. Each of the properties you need
to compute can be calculated by using a mathematical expression. What would be the
most appropriate method?
Answer
-
Use the Expression Builder to build and test the expression, and then copy it to all
of the corresponding task definitions.
-
Place an Expression Task into the control flow, preceding each task whose properties need to be determined dynamically.
-
Use as many Expression tasks as necessary to compute as many variables as there
are different calculations, and then use the variables to assign the values to the
corresponding tasks.
-
Use a single Expression task and store all required computed values in a row set (Object) variable to be used in property expressions to configure the corresponding tasks.
Question 51
Question
In the control flow of your SSIS package, you need to add a data maintenance task that
will rebuild the indexes of your dimension tables after they have been populated successfully. You have implemented each dimension table load in an individual data flow.
The Execute SQL task containing the index rebuild script must be executed after the
preceding data flow has completed successfully, but only on Saturdays. The name of
the current day is stored in a variable. Is it possible to achieve this in SSIS control flow?
If so, how?
Answer
-
No, this is not possible in the control flow because the conditional split transformation is only available in a data flow.
-
Yes, it is possible to achieve this in the control flow, but only by using a Script task.
-
Yes, this can be achieved by using a success precedence constraint leading from
the data flow task to the Execute SQL task, with a precedence constraint expression
checking whether the value of the variable is Saturday.
-
Yes, this can be achieved by using a regular success precedence constraint leading
from the data flow task to the Execute SQL task.
Question 52
Question
What methods does SSIS provide for configuring child packages from the master
package?
Answer
-
Project parameters
-
Global variables
-
Package parameters
-
Solution parameters
Question 53
Question
Expressions can be used in precedence constraints to… (one or more answers are
correct)
Question 54
Question
Which of the following statements about master packages are true? (Choose all that
apply.)
Answer
-
Master packages are used to configure child packages.
-
Master packages are used to execute child packages.
-
Master packages prevent child packages from being executed directly.
-
Master packages are used to control the order in which child packages are
executed.
Question 55
Question
Which functionality does the Slowly Changing Dimension transformation support? (Choose all that apply.)
Answer
-
Type 1 SCD
-
Type 2 SCD
-
Type 3 SCD
-
Inferred members
Question 56
Question
Which transformations can you use to determine whether the values of specific columns have changed between the source data and the destination table? (Choose all
that apply.)
Answer
-
Data Conversion
-
Derived Column
-
Conditional Split
-
Multicast
Question 57
Question
Which statements are true regarding the Slowly Changing Dimension transformation? (Choose all that apply.)
Answer
-
The Slowly Changing Dimension transformation supports set-based updates.
-
The Slowly Changing Dimension transformation supports inferred members.
-
You can have multiple Slowly Changing Dimension transformations in one data flow.
-
The Slowly Changing Dimension Wizard supports only connections with a SQL
Server database.
Question 58
Question
Which process modes in a CDC Source can be used directly to load data without applying the additional ETL process of getting the current value of the row? (Choose all
that apply.)
Answer
-
Net
-
All
-
All with old value
-
Net with merge
Question 59
Question
Which method can you use to dynamically set SQL inside SSIS at run time if you are
using an OLE DB source adapter? (Choose all that apply.)
Answer
-
Use a stored procedure as a source
-
Use parameters inside the SQL statement
-
Use expressions
-
Use an SQL command from a variable.
Question 60
Question
Which data flow elements have an error flow? (Choose all that apply.)
Answer
-
The OLE DB Source adapter
-
The Union All transformation
-
The Merge transformation
-
The Lookup transformation
Question 61
Question
Which options are available for handling errors on the row level? (Choose all that apply.)
Answer
-
Ignore failure
-
Fail component
-
Redirect row
-
Delete row
Question 62
Question
Which data source adapters have an error flow? (Choose all that apply.)
Answer
-
OLE DB source
-
Raw File source
-
ODBC source
-
Excel source
Question 63
Question
Which task in the control flow supports transactions? (Choose all that apply.)
Answer
-
Data flow task
-
Execute SQL task
-
File System task
-
XML task
Question 64
Question
Which transaction isolation levels do not lock the records being read? (Choose all that apply.)
Answer
-
Serializable
-
Snapshot
-
Chaos
-
ReadUncommitted
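Setting a non-locking isolation level can be sketched like this; note that SNAPSHOT additionally requires the database option ALLOW_SNAPSHOT_ISOLATION to be ON:

```sql
-- Readers take no shared locks; they may see uncommitted (dirty) data.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- Alternatively, use row versioning instead of locks:
-- SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
```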
Question 65
Question
Which T-SQL statements can you use to manually handle a transaction? (Choose all
that apply.)
Answer
-
BEGIN TRAN
-
ROLLBACK TRAN
-
END TRAN
-
COMMIT TRAN
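A minimal pattern using the valid statements (there is no END TRAN in T-SQL; a transaction ends with COMMIT or ROLLBACK). The table and key value are hypothetical:

```sql
BEGIN TRAN;
BEGIN TRY
    UPDATE dbo.DimCustomer
    SET    City = N'Vienna'
    WHERE  CustomerKey = 17;      -- hypothetical table and key
    COMMIT TRAN;
END TRY
BEGIN CATCH
    ROLLBACK TRAN;                -- undo all work on any error
END CATCH;
```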
Question 66
Question
Which properties do you need to set on the package level to enable checkpoints?
(Choose all that apply.)
Answer
-
CheckpointFileName
-
CheckpointUsage
-
SaveCheckpoints
-
FailPackageOnFailure
Question 67
Question
On which SSIS objects can you set checkpoints to be active? (Choose all that apply.)
Answer
-
Data flow task
-
Control flow tasks
-
Sequence container
-
Sort transformation
Question 68
Question
Which statements are correct regarding SSIS checkpoints? (Choose all that apply.)
Answer
-
You can have multiple checkpoint files per package.
-
If the data flow task fails, you can store rows with errors in the checkpoint file.
-
If the package is successful, the checkpoint file is deleted.
-
If you set the CheckpointUsage property to Always, the checkpoint file must be present or the package will not start.
Question 69
Question
Which components in SSIS can act as executable components that trigger an event?
(Choose all that apply.)
Answer
-
Sequence container
-
Package
-
Data flow task
-
Data flow transformation
Question 70
Question
Which event handler types can you use if you want to log all the package errors?
(Choose all that apply.)
Answer
-
OnPostExecute
-
OnError
-
OnWarning
-
OnTaskFailed
Question 71
Question
Which event handler types can you use if you want to log information before the task
starts in the package? (Choose all that apply.)
Answer
-
OnTaskFailed
-
OnProgress
-
OnPreExecute
-
OnWarning
Question 72
Question
Which parameter types are available in SSIS in SQL Server 2012? (Choose all that apply.)
Question 73
Question
Which properties can be set by using build configurations? (Choose all that apply.)
Question 74
Question
Which properties can be set by using property expressions? (Choose all that apply.)
Answer
-
SQL statement for the Execute SQL task
-
Variable values
-
Data flow task properties
-
The Lookup task cache mode property
Question 75
Question
Which configuration types can you use to store configuration values? (Choose all that
apply.)
Question 76
Question
On which objects can you set dynamic properties by using package configurations?
(Choose all that apply.)
Question 77
Question
Which SSIS elements can be configured by using a package configuration? (Choose all
that apply.)
Question 78
Question
Which statements about SSIS logging are correct? (Choose all that apply.)
Answer
-
Every task and every operation automatically reports every error to the output file
by default.
-
SSIS logging can be configured for individual SSIS tasks or containers or at the package level.
-
SSIS logging writes entries into the SQL Server Error Log.
-
SSIS log entries are exposed to the environment using a variety of built-in log
providers.
Question 79
Question
Which of the following SSIS properties can be logged? (Choose all that apply.)
Answer
-
The name of the control flow task
-
The name of the event
-
The name of the user who started the process
-
The name of the machine on which the event occurred
Question 80
Question
Which feature allows you to configure logging once on a particular task or container,
and then reuse the same settings on other tasks and/or containers?
Question 81
Question
What information can be added to the data flow buffer by using the Audit transformation? (Choose all that apply.)
Question 82
Question
How can the Audit transformation be configured to provide additional system variables or user variables to the data flow?
Answer
-
The set of properties provided by the Audit transformation is fixed and cannot be
extended.
-
The Audit transformation itself cannot be extended, but by using the Expression
task, you can assign any value to an appropriate system variable used by the Audit
transformation.
-
The Audit transformation can be edited by using the Advanced Editor, which provides unlimited access to system and user variables, as well as the rest of the data flow columns.
-
The set of properties provided by the Audit transformation can be extended at
design time by switching to advanced mode.
Question 83
Question
Which of the following pieces of information represent typical examples of audit data?
(Choose all that apply.)
Answer
-
The date and time that the row was added to the table
-
The date and time that the row was last modified
-
A value designating the success or failure of an SSIS process
-
The name of the SSIS task in which an error has occurred
Question 84
Question
What are SSIS package templates used for? (Choose all that apply.)
Answer
-
They can be used to develop new SSIS projects.
-
They can be used to preconfigure SSIS packages.
-
They can be used to apply log configurations for SSIS objects.
-
They can be used to create SSIS packages.
Question 85
Question
What type is used for SSIS package template files?
Answer
-
The .dotx file type
-
The .dotsx file type
-
The .dtsx file type
-
The .xml file type
Question 86
Question
Which of the following statements about package templates are true? (Choose all
that apply.)
Answer
-
SSIS package templates can be used to reapply log settings to SSIS tasks and/or
containers.
-
SSIS package templates can be used to reduce the time needed to develop an SSIS
package.
-
SSIS package templates can be used to preconfigure SSIS operations.
-
SSIS package templates can speed up SSIS process execution.