Frage 1
Antworten
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
-
is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
-
are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
-
Is a field dedicated to the analysis, processing and storage of large collections of data that frequently originate from disparate sources
Frage 2
Antworten
-
queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
-
is a measure for gauging success within a particular context
-
Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
-
are typically required when traditional data analysis, processing and storage technologies and techniques are insufficient
Frage 3
Antworten
-
Arrives at such fast speeds that enormous datasets can accumulate within very short periods of time
-
does not conform to a data model or data schema
-
Data acquired, such as via online customer registrations, usually contains less noise
-
distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
Frage 4
Frage
Using Big Data Solutions
Antworten
-
are closely linked with an enterprise's strategic objectives
-
further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
-
multiple formats and types of data that need to be supported by Big Data Solutions
-
complex analysis tasks can be carried out to arrive at deeply meaningful and insightful analysis results for the benefit of the business
Frage 5
Antworten
-
Some streams are public. Other streams go to vendors and businesses directly
-
Analytics and Data Science
-
are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
-
can process massive quantities of data that arrive at varying speeds, may be of many different varieties and have numerous incompatibilities
Frage 6
Frage
Data within Big Data
Antworten
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
can have multiple data marts
-
is a process of loading data from a source system into a target system. The source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
-
accumulates from being amassed within the enterprise (via applications) or from external sources that are then stored by the big data solution
Frage 7
Frage
Data processed by Big Data
Antworten
-
does generally require special or customized logic when it comes to pre-processing and storage
-
Data acquired, such as blog postings, usually contains more noise
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
can be used by enterprise applications directly, or fed into a data warehouse to enrich existing data. This data is typically analyzed and subjected to analytics
Frage 8
Frage
Processed data and analysis results
Antworten
-
are closely linked with an enterprise's strategic objectives
-
represents the main operation through which data warehouses are fed data
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
-
are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)
Frage 9
Frage
Data processed by Big Data
Antworten
-
Analytics and Data Science
-
actionable intelligence
-
operational optimization
-
can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results
Frage 10
Frage
Human-generated data
Antworten
-
is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
-
each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
-
used to identify problem areas in order to take corrective actions
-
is the result of human interaction with systems, such as online services and digital devices (Ex. Social media, micro blogging, e-mails, photo sharing and messaging)
Frage 11
Frage
Machine-generated data
Antworten
-
represents the main operation through which data warehouses are fed data
-
With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result
-
is defined as the usefulness of data for an enterprise
-
is the result of the automated, event-driven generation of data by software programs or hardware devices (Ex. Web logs, sensor data, telemetry data, smart meter data and appliance usage data)
Frage 12
Frage
BDS processing results
Antworten
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
scientific and research data (Large Hadron Collider, Atacama Large Millimeter/Submillimeter Array telescope)
-
operational optimization
-
actionable intelligence
Frage 13
Frage
BDS processing results
Antworten
-
is crucial to big data processing, storage and analysis
-
With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result
-
identification of new markets
-
accurate predictions
Frage 14
Frage
BDS processing results
Antworten
-
is directly related to the veracity characteristic
-
The required data is first obtained from the sources, after which the extracts are modified by applying rules
-
fault and fraud detection
-
more detailed records
Frage 15
Frage
BDS processing results
Antworten
-
related to collecting and processing large quantities of diverse data has become increasingly affordable
-
simple insert, delete and update operations with sub-second response times
-
improved decision-making
-
scientific discoveries
Frage 16
Antworten
-
improved decision-making
-
representing a common source of structured analytics input
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
Frage 17
Antworten
-
Shares the same set of attributes as others in the same dataset
-
Are the data analysis results being accurately communicated to the appropriate decision-makers?
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
is based on a quantifiable indicator that is identified and agreed upon beforehand
Frage 18
Antworten
-
either exists in textual or binary form
-
is the result of human interaction with systems, such as online services and digital devices (Ex. Social media, micro blogging, e-mails, photo sharing and messaging)
-
is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
-
helps establish patterns and relationships among the data being analyzed
Frage 19
Antworten
-
semi-structured data
-
Can exist as a separate DBMS, as in the case of an OLAP database
-
is the discipline of gaining an understanding of data by analyzing it via a multitude of scientific techniques and automated tools, with a focus on locating hidden patterns and correlations
-
is usually applied using highly scalable distributed technologies and frameworks for analyzing large volumes of data from different sources
Frage 20
Antworten
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
-
Shares the same set of attributes as others in the same dataset
-
attributes providing the file size and resolution of a digital photograph
Frage 21
Frage
In business-oriented environments, analytics results can lower operational costs and facilitate strategic decision-making?
Frage 22
Antworten
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
-
is also dependent on how long data processing takes; value and time are inversely proportional to each other
-
is a data analysis technique that focuses on quantifying the patterns and correlations found in the data
-
analytics can help identify the cause of a phenomenon to improve the accuracy of predictions
Frage 23
Frage
services-based environments
Antworten
-
are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
-
each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
-
are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)
-
analytics can help strengthen the focus on delivering high quality services by driving down cost
Frage 24
Antworten
-
are closely linked with an enterprise's strategic objectives
-
Shares the same set of attributes as others in the same dataset
-
generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
-
enables data-driven decision-making with scientific backing, so that decisions can be based on factual data and not on past experience or intuition alone
Frage 25
Frage
Business Intelligence
Antworten
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
applies analytics to large amounts of data across the enterprise
Frage 26
Frage
Business Intelligence
Antworten
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
can further utilize the consolidated data contained in data warehouses to run analytical queries
Frage 27
Antworten
-
is crucial to big data processing, storage and analysis
-
is mostly machine-generated and automatically appended to the data
-
is a measure for gauging success within a particular context
-
are closely linked with an enterprise's strategic objectives
Frage 28
Antworten
-
Shares the same set of attributes as others in the same dataset
-
ticket reservation systems and banking and POS transactions
-
used to identify problem areas in order to take corrective actions
-
used to achieve regulatory compliance
Frage 29
Antworten
-
more detailed records
-
big data solutions particularly rely on it when processing semi-structured and unstructured data
-
act as quick reference points for measuring the overall performance of the business
-
is based on a quantifiable indicator that is identified and agreed upon beforehand
Frage 30
Frage
primary business and technology drivers
Antworten
-
the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
-
XML tags providing the author and creation date of a document
-
Analytics and Data Science
-
Digitization
Frage 31
Frage
primary business and technology drivers
Antworten
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
-
are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
-
Affordable Technology & Commodity Hardware
-
Social Media
Frage 32
Frage
primary business and technology drivers
Antworten
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
-
is directly related to the veracity characteristic
-
Hyper-Connected Communities & Devices
-
Cloud Computing
Frage 33
Frage
Analytics & Data Science
Antworten
-
generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
-
more detailed records
-
fault and fraud detection
-
The maturity of these fields of practice inspired and enabled much of the core functionality expected from contemporary Big Data solutions and tools
Frage 34
Antworten
-
How well has the data been stored?
-
is always fed with data from multiple OLTP systems using regular batch processing jobs
-
The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
-
Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys
Frage 35
Frage
Collecting secondary data
Antworten
-
accurate predictions
-
Extract Transform Load (ETL)
-
data bearing value leading to meaningful information
-
can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
Frage 36
Frage
Affordable Technology
Antworten
-
Hyper-Connected Communities & Devices
-
is usually applied using highly scalable distributed technologies and frameworks for analyzing large volumes of data from different sources
-
are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
-
related to collecting and processing large quantities of diverse data has become increasingly affordable
Frage 37
Frage
Typical Big Data solutions
Antworten
-
is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
-
The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
-
operational optimization
-
are based on open-source software that requires little more than commodity hardware
Frage 38
Antworten
-
How well has the data been stored?
-
Hyper-Connected Communities & Devices
-
fault and fraud detection
-
makes the adoption of big data solutions accessible to businesses without large capital investments
Frage 39
Antworten
-
does not conform to a data model or data schema
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
provide feedback in near-realtime via open and public mediums
-
Businesses are storing increasing amounts of data on customer interactions and from social media avenues in an attempt to harvest this data to increase sales, enable targeted marketing and create new products and services
Frage 40
Antworten
-
may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
-
Are the data analysis results being accurately communicated to the appropriate decision-makers?
-
operational optimization
-
Businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources
Frage 41
Frage
Hyper-Connected Communities & Devices
Antworten
-
Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
-
is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
-
The broadening coverage of the internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
-
This occurs either directly through online interaction or indirectly through the usage of connected devices, and has resulted in massive data streams
Frage 42
Frage
Hyper-Connected Communities & Devices
Antworten
-
is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
-
can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
-
can also be fed back into OLTPs
-
Some streams are public. Other streams go to vendors and businesses directly
Frage 43
Antworten
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
attributes providing the file size and resolution of a digital photograph
-
have led to the creation of remote environments
-
are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
Frage 44
Antworten
-
multiple formats and types of data that need to be supported by Big Data Solutions
-
applies analytics to large amounts of data across the enterprise
-
Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
-
Can be leveraged for its scaling capabilities to perform Big Data Processing tasks
Frage 45
Antworten
-
either exists in textual or binary form
-
actionable intelligence
-
have a greater noise-to-signal ratio
-
can be leased, which dramatically reduces the required up-front investment of big data projects
Frage 46
Frage
Technologies Related to Big Data
Antworten
-
It also periodically pulls data from other sources for consolidation into a dataset (such as from OLTP, ERP, CRM, and SCM systems).
-
This occurs either directly through online interaction or indirectly through the usage of connected devices, and has resulted in massive data streams
-
Online Transaction Processing (OLTP)
-
Online Analytical Processing (OLAP)
Frage 47
Frage
Technologies Related to Big Data
Antworten
-
each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
-
represents the main operation through which data warehouses are fed data
-
Extract Transform Load (ETL)
-
Data Warehouses
Frage 48
Frage
Technologies Related to Big Data
Antworten
-
are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
-
is the discipline of gaining an understanding of data by analyzing it via a multitude of scientific techniques and automated tools, with a focus on locating hidden patterns and correlations
-
is crucial to big data processing, storage and analysis
-
Hadoop
Frage 49
Antworten
-
further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
-
is a process of loading data from a source system into a target system. The source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
-
store operational data that is fully normalized
-
is a software system that processes transaction-oriented data
Frage 50
Antworten
-
operational optimization
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
-
Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
-
the completion of an activity in realtime and not batch-processed
Frage 51
Antworten
-
representing a common source of structured analytics input
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
require automated data cleansing and data verification when carrying out ETL processes
-
are closely linked with an enterprise's strategic objectives
Frage 52
Frage
Big Data Analysis Results
Antworten
-
used to identify problem areas in order to take corrective actions
-
either exists in textual or binary form
-
enables data-driven decision-making with scientific backing, so that decisions can be based on a factual data and not on past experience or intuition alone
-
can also be fed back into OLTPs
Frage 53
Frage
Queries Supported by OLTP
Antworten
-
mostly exists in textual form, such as XML or JSON files
-
data bearing value leading to meaningful information
-
The broadening coverage of the internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
-
simple insert, delete and update operations with sub-second response times
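To illustrate the kind of workload described in the OLTP answers above, here is a minimal sketch in Python using the standard-library sqlite3 module; the table and values are invented for illustration only:

import sqlite3

# OLTP workloads: simple insert, update and delete operations on
# transaction-oriented data, each completing in well under a second
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

with con:  # a short transaction, committed atomically
    con.execute("INSERT INTO accounts VALUES (1, 500.0)")
    con.execute("UPDATE accounts SET balance = balance - 50 WHERE id = 1")
    con.execute("DELETE FROM accounts WHERE id = 1")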
Frage 54
Frage 55
Antworten
-
related to collecting and processing large quantities of diverse data has become increasingly affordable
-
XML tags providing the author and creation date of a document
-
is a system used for processing data analysis queries
-
form an integral part of business intelligence, data mining and machine learning processes
Frage 56
Antworten
-
Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
-
are used in diagnostic, predictive and prescriptive analysis
Frage 57
Antworten
-
Social Media
-
Sensor Data (RFID, Smart meters, GPS sensors)
-
further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
-
is always fed with data from multiple OLTP systems using regular batch processing jobs
Frage 58
Antworten
-
have a lower noise-to-signal ratio
-
Are the right types of question being asked during data analysis?
-
queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
-
the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
Frage 59
Antworten
-
either exists in textual or binary form
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
is a process of loading data from a source system into a target system. The source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
-
represents the main operation through which data warehouses are fed data
Frage 60
Antworten
-
online transactions (point-of-sale, banking)
-
act as quick reference points for measuring the overall performance of the business
-
A big data solution encompasses this tool feature-set for converting data of different types
-
The required data is first obtained from the sources, after which the extracts are modified by applying rules
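The ETL answers above describe a three-step flow: data is extracted from a source, transformed by applying rules, and loaded into a target system. A minimal illustrative sketch in Python; the file names, column names and cleansing rule are hypothetical, not from the course material:

import csv
import sqlite3

def extract(path):
    # Extract: obtain the required data from a source system (here a CSV flat file)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: modify the extracts by applying rules, e.g. drop incomplete
    # records and normalize the name field
    return [{"customer_id": r["customer_id"], "name": r["name"].strip().title()}
            for r in rows if r.get("customer_id")]

def load(rows, db_path):
    # Load: insert the transformed data into the target system (a database)
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS customers (customer_id TEXT, name TEXT)")
    con.executemany("INSERT INTO customers VALUES (?, ?)",
                    [(r["customer_id"], r["name"]) for r in rows])
    con.commit()
    con.close()

load(transform(extract("customers.csv")), "warehouse.db")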
Frage 61
Antworten
-
analytics results can lower operational costs and facilitate strategic decision-making
-
Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
The data is inserted into a target system
Frage 62
Antworten
-
impose distinct data storage and processing demands, as well as management and access processes
-
is based on a quantifiable indicator that is identified and agreed upon beforehand
-
is a central, enterprise-wide repository, consisting of historical and current data
-
are heavily used by BI to run various analytical queries
Frage 63
Antworten
-
The required data is first obtained from the sources, after which the extracts are modified by applying rules
-
analytics can help identify the cause of a phenomenon to improve the accuracy of predictions
-
usually interface with an OLAP system to support analytical queries
-
It also periodically pulls data from other sources for consolidation into a dataset (such as from OLTP, ERP, CRM, and SCM systems).
Frage 64
Antworten
-
This occurs either directly through online interaction or indirectly through the usage of connected devices, and has resulted in massive data streams
-
conforms to a data model or schema
-
Data pertaining to multiple business entities from different operational systems is periodically extracted, validated, transformed and consolidated into a single database
-
With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result
Frage 65
Antworten
-
can also be fed back into OLTPs
-
helps establish patterns and relationships among the data being analyzed
-
the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
-
Usually contain optimized databases, called analytical databases, to handle reporting and data analysis tasks
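A small sketch of the multidimensional roll-up that such analytical databases and cubes answer. pandas is assumed here purely as an illustration (it is not named by the cards), and the sales figures are invented:

import pandas as pd

# A denormalized "fact table": one row per sale, with region, product and
# year as dimensions and revenue as the measure
sales = pd.DataFrame({
    "region":  ["EU", "EU", "US", "US"],
    "product": ["A",  "B",  "A",  "B"],
    "year":    [2023, 2023, 2024, 2024],
    "revenue": [100,  150,  120,  90],
})

# A roll-up across two of the dimensions -- the kind of multidimensional
# aggregation an OLAP system answers from its cubes
cube = sales.pivot_table(values="revenue", index="region",
                         columns="year", aggfunc="sum", fill_value=0)
print(cube)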
Frage 66
Frage
Analytical Database
Antworten
-
This occurs either directly through online interaction or indirectly through the usage of connected devices, and has resulted in massive data streams
-
Brings challenges for enterprises in terms of data integration, transformation, processing and storage
-
does not conform to a data model or data schema
-
Can exist as a separate DBMS, as in the case of an OLAP database
Frage 67
Antworten
-
act as quick reference points for measuring the overall performance of the business
-
online transactions (point-of-sale, banking)
-
can also be fed back into OLTPs
-
is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
Frage 68
Antworten
-
does generally require special or customized logic when it comes to pre-processing and storage
-
is directly related to the veracity characteristic
-
can have multiple data marts
-
single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
Frage 69
Antworten
-
further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
-
identification of new markets
-
is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
-
has established itself as a de facto industry platform for contemporary Big Data Solutions
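Hadoop distributes storage and processing of this kind across a cluster of commodity machines. The following plain-Python word count only sketches the map/reduce idea behind it; it is not Hadoop API code, and the sample documents are invented:

from collections import defaultdict

documents = ["big data needs storage", "big data needs processing"]

# Map phase: emit a (word, 1) pair for every word in every document
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle/reduce phase: group the pairs by word and sum the counts
counts = defaultdict(int)
for word, one in mapped:
    counts[word] += one

print(dict(counts))  # {'big': 2, 'data': 2, 'needs': 2, 'storage': 1, 'processing': 1}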
Frage 70
Antworten
-
analytics can help strengthen the focus on delivering high quality services by driving down cost
-
have led to the creation of remote environments
-
are closely linked with an enterprise's strategic objectives
-
can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
Frage 71
Frage
Data Characteristics
Antworten
-
does not conform to a data model or data schema
-
Are the data analysis results being accurately communicated to the appropriate decision-makers?
-
is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
-
Volume, Velocity, Variety, Veracity & Value
Frage 72
Antworten
-
scientific and research data (Large Hadron Collider, Atacama Large Millimeter/Submillimeter Array telescope)
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
impose distinct data storage and processing demands, as well as management and access processes
Frage 73
Antworten
-
Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys
-
Digitization
-
online transactions (point-of-sale, banking)
-
Sensor Data (RFID, Smart meters, GPS sensors)
Frage 74
Antworten
-
is crucial to big data processing, storage and analysis
-
can be leased, which dramatically reduces the required up-front investment of big data projects
-
impose distinct data storage and processing demands, as well as management and access processes
-
Social Media (Facebook, Twitter)
Frage 75
Antworten
-
can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results
-
analytics can help strengthen the focus on delivering high quality services by driving down cost
-
Arrives at such fast speeds that enormous datasets can accumulate within very short periods of time
-
translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter
Frage 76
Antworten
-
Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
-
is a measure for gauging success within a particular context
-
Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
-
may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
Frage 77
Antworten
-
data bearing value leading to meaningful information
-
big data solutions particularly rely on it when processing semi-structured and unstructured data
-
multiple formats and types of data that need to be supported by Big Data Solutions
-
Brings challenges for enterprises in terms of data integration, transformation, processing and storage
Frage 78
Antworten
-
Online Transaction Processing (OLTP)
-
Shares the same set of attributes as others in the same dataset
-
generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
-
refers to the quality or fidelity of data
Frage 79
Antworten
-
has a defined level of structure and consistency, but is not relational in nature
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
-
data carrying no value
Frage 80
Antworten
-
is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
-
provide feedback in near-realtime via open and public mediums
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
-
data bearing value leading to meaningful information
Frage 81
Antworten
-
are heavily used by BI to run various analytical queries
-
Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
-
makes the adoption of big data solutions accessible to businesses without large capital investments
-
Data acquired, such as via online customer registrations, usually contains less noise
Frage 82
Frage
uncontrolled source
Antworten
-
Businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources
-
accurate predictions
-
Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
-
Data acquired, such as blog postings, usually contains more noise
Frage 83
Antworten
-
is a measure for gauging success within a particular context
-
act as quick reference points for measuring the overall performance of the business
-
analytics results can lower operational costs and facilitate strategic decision-making
-
Depends on the type of data present
Frage 84
Antworten
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
-
is defined as the usefulness of data for an enterprise
-
is directly related to the veracity characteristic
Frage 85
Antworten
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
Brings challenges for enterprises in terms of data integration, transformation, processing and storage
-
is also dependent on how long data processing takes; value and time are inversely proportional to each other
-
The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
Frage 86
Frage
Value Considerations
Antworten
-
scientific discoveries
-
ticket reservation systems and banking and POS transactions
-
How well has the data been stored?
-
Has the data been stripped of any valuable attributes?
Frage 87
Frage
Value Considerations
Antworten
-
have a lower noise-to-signal ratio
-
attributes providing the file size and resolution of a digital photograph
-
Are the right types of question being asked during data analysis?
-
Are the data analysis results being accurately communicated to the appropriate decision-makers?
Frage 88
Frage 89
Antworten
-
translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter
-
have led to the creation of remote environments
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
semi-structured data
Frage 90
Antworten
-
Are the right types of question being asked during data analysis?
-
makes the adoption of big data solutions accessible to businesses without large capital investments
-
conforms to a data model or schema
-
is stored in a tabular form
Frage 91
Antworten
-
is crucial to big data processing, storage and analysis
-
is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
-
can be relational
-
is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
Frage 92
Antworten
-
can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
-
analytics can help strengthen the focus on delivering high quality services by driving down cost
-
The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
-
does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
Frage 93
Antworten
-
qualitative analysis
-
enables data-driven decision-making with scientific backing, so that decisions can be based on a factual data and not on past experience or intuition alone
-
does not conform to a data model or data schema
-
is generally inconsistent and non-relational
Frage 94
Antworten
-
simple insert, delete and update operations with sub-second response times
-
Shares the same set of attributes as others in the same dataset
-
either exists in textual or binary form
-
generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
Frage 95
Antworten
-
is mostly machine-generated and automatically appended to the data
-
Shares the same set of attributes as others in the same dataset
-
does generally require special or customized logic when it comes to pre-processing and storage
-
cannot be inherently processed or queried using SQL or traditional programming features and is usually an awkward fit with relational databases
Frage 96
Antworten
-
has a defined level of structure and consistency, but is not relational in nature
-
are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
-
A big data solution encompasses this tool feature-set for converting data of different types
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
Frage 97
Frage
semi-structured data
Antworten
-
Are the right types of question being asked during data analysis?
-
How well has the data been stored?
-
has a defined level of structure and consistency, but is not relational in nature
-
mostly exists in textual form, such as XML or JSON files
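A minimal Python sketch, using only the standard library, of reading the XML- and JSON-style records mentioned above; the sample records are invented:

import json
import xml.etree.ElementTree as ET

# Semi-structured records carry their own structure (keys, tags, attributes)
# but do not follow a fixed relational schema
record_json = '{"user": "alice", "tags": ["big data", "nosql"]}'
record_xml = '<post author="alice"><title>Hello</title></post>'

doc = json.loads(record_json)
tree = ET.fromstring(record_xml)

print(doc["tags"])         # a nested list -- awkward to fit into a flat table
print(tree.get("author"))  # an attribute read directly from the XML tag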
Frage 98
Frage
semi-structured data
Antworten
-
is defined as the usefulness of data for an enterprise
-
may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
-
Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
Frage 99
Antworten
-
is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
-
require automated data cleansing and data verification when carrying out ETL processes
-
provide information about a dataset's characteristics and structure
-
is mostly machine-generated and automatically appended to the data
Frage 100
Antworten
-
refers to the quality or fidelity of data
-
Has the data been stripped of any valuable attributes?
-
XML tags providing the author and creation date of a document
-
attributes providing the file size and resolution of a digital photograph
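An illustrative sketch of reading the two kinds of metadata mentioned above with the Python standard library; the file name photo.jpg and the XML attributes are hypothetical:

import os
import xml.etree.ElementTree as ET

# File-level metadata, such as the size of a photograph in bytes
size_bytes = os.stat("photo.jpg").st_size  # hypothetical file name

# Document-level metadata carried in XML tags (author and creation date)
doc = ET.fromstring('<document author="alice" created="2024-01-15"/>')
print(size_bytes, doc.get("author"), doc.get("created"))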
Frage 101
Antworten
-
The data is inserted into a target system
-
semi-structured data
-
single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
-
big data solutions particularly rely on it when processing semi-structured and unstructured data
Frage 102
Frage 103
Frage
semi-structured data and unstructured data
Antworten
-
identification of new markets
-
Are the data analysis results being accurately communicated to the appropriate decision-makers?
-
improved decision-making
-
have a greater noise-to-signal ratio
Frage 104
Antworten
-
A big data solution encompasses this tool feature-set for converting data of different types
-
structured data
-
Brings challenges for enterprises in terms of data integration, transformation, processing and storage
-
require automated data cleansing and data verification when carrying out ETL processes
Frage 105
Frage
Types of data analysis
Frage 106
Frage
Types of data analysis
Antworten
-
does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
-
online transactions (point-of-sale, banking)
-
is a central, enterprise-wide repository, consisting of historical and current data
-
data mining
Frage 107
Frage
quantitative analysis
Antworten
-
The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
-
queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
-
translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter
-
is a data analysis technique that focuses on quantifying the patterns and correlations found in the data
Frage 108
Frage
quantitative analysis
Antworten
-
cannot be inherently processed or queried using SQL or traditional programming features and is usually an awkward fit with relational databases
-
refers to the quality or fidelity of data
-
this technique involves analyzing a large number of observations from a dataset
-
since the sample size is large, the results can be applied in a generalized manner to the entire dataset
Frage 109
Frage
quantitative analysis
Antworten
-
is defined as the usefulness of data for an enterprise
-
provide more value than any other type of analytics and correspondingly require the most advanced skillset, as well as specialized software and tools
-
single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
-
are absolute in nature and can therefore be used for numerical comparisons
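A minimal sketch of quantitative analysis in Python; statistics.correlation requires Python 3.10+, and the numbers are invented:

from statistics import correlation, mean  # correlation requires Python 3.10+

# Quantitative analysis yields numerical results that can be compared directly
ad_spend = [10, 20, 30, 40, 50]
sales    = [12, 25, 31, 44, 52]

print(mean(sales))                  # a simple descriptive statistic
print(correlation(ad_spend, sales)) # strength of the linear relationship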
Frage 110
Frage
qualitative analysis
Antworten
-
Data pertaining to multiple business entities from different operational systems is periodically extracted, validated, transformed and consolidated into a single database
-
can also be fed back into OLTPs
-
is a data analysis technique that focuses on describing various data qualities using words
-
involves analyzing a smaller sample in greater depth compared to quantitative data analysis
Frage 111
Frage
qualitative analysis
Antworten
-
accurate predictions
-
the information is generated at periodic intervals in realtime or near realtime
-
these analysis results cannot be generalized to an entire dataset due to the small sample size
-
they also cannot be measured numerically or used for numerical comparisons
Frage 112
Antworten
-
policies for data privacy and data anonymization
-
aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
-
also known as data discovery, is a specialized form of data analysis that targets large datasets
-
refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends
Frage 113
Antworten
-
is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
-
actionable intelligence
-
involves extracting hidden or unknown patterns in the data with the intention of identifying previously unknown patterns
-
forms the basis for predictive analytics and business intelligence (BI)
Frage 114
Frage
Analysis & Analytics
Antworten
-
based on the input data, the algorithm develops an understanding of which data belongs to which category
-
data carrying no value
-
act as quick reference points for measuring the overall performance of the business
-
These techniques may not provide accurate findings in a timely manner because of the data's volume, velocity and/or variety
Frage 115
Antworten
-
enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
-
are often carried out via ad-hoc reporting or dashboards
-
some realtime data analysis solutions that do exist are proprietary
-
can automate data analyses through the use of highly scalable computational technologies that apply automated statistical quantitative analysis, data mining and machine learning techniques
Frage 116
Frage 117
Antworten
-
involves analyzing a smaller sample in greater depth compared to quantitative data analysis
-
also known as data discovery, is a specialized form of data analysis that targets large datasets
-
predictive analytics
-
prescriptive analytics
Frage 118
Antworten
-
does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
-
policies for data cleansing and filtering
-
can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
-
Value and complexity increase as we move from descriptive to prescriptive analytics
Frage 119
Frage
descriptive analytics
Antworten
-
is generally inconsistent and non-relational
-
This involves identifying patterns in the training data and classifying new or unseen data based on known patterns
-
is carried out to answer questions about events that have already occurred
-
Around 80% of analytics are ________ in nature
Frage 120
Frage
descriptive analytics
Antworten
-
refers to the information about the source of the data that helps determine its authenticity and quality. It is also used for auditing purposes
-
This occurs either directly through online interaction or indirectly through the usage of connected devices, and has resulted in massive data streams
-
provides the least value and requires a relatively basic skillset
-
are often carried out via ad-hoc reporting or dashboards
Frage 121
Frage
descriptive analytics
Antworten
-
is directly related to the veracity characteristic
-
Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
-
The reports are generally static in nature and display historical data that is presented in the form of data grids or charts
-
Queries are executed on the OLTP systems or data obtained from various other information systems, such as CRMs and ERPs
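A sketch of the kind of backward-looking, aggregate query that sits behind such static descriptive reports, using the standard-library sqlite3 module with invented order data:

import sqlite3

# A descriptive question: "How much did each region order, and how often?"
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("EU", 100.0), ("EU", 40.0), ("US", 75.0)])

for region, total, n in con.execute(
        "SELECT region, SUM(amount), COUNT(*) FROM orders GROUP BY region"):
    print(region, total, n)  # static, historical figures for a report or dashboard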
Frage 122
Frage
diagnostic analytics
Antworten
-
aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
-
are considered to provide more value than descriptive analysis, requiring a more advanced skillset
-
data bearing value leading to meaningful information
-
The data is inserted into a target system
Frage 123
Frage
diagnostic analytics
Antworten
-
single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
-
a substantial budget may still be required to obtain external data
-
usually require collecting data from multiple sources and storing it in a structure that lends itself to performing drill-downs and roll-ups
-
analytics results are viewed via interactive visualization tools that enable users to identify trends and patterns
Frage 124
Frage
diagnostic analytics
Antworten
-
can join structured and unstructured data that is kept in memory for fast data access
-
impose distinct data storage and processing demands, as well as management and access processes
-
will be required to control how data flows in and out of big data solutions and how feedback loops can be established to enable the processed data to undergo repeated refinements
-
the executed queries are more complex compared to descriptive analytics, and are performed on multi-dimensional data held in OLAP systems
Frage 125
Frage
predictive analytics
Antworten
-
the adoption of a big data environment may necessitate that some or all of that environment be hosted within a cloud
-
is also dependent on how long data processing takes; value and time are inversely proportional to each other
-
are carried out to attempt to determine the outcome of an event that might occur in the future
-
try to predict the event outcome and predictions are made based on patterns, trends and exceptions found in historical and current data
Frage 126
Frage
predictive analytics
Antworten
-
as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
-
Graphically representing data can make it easier to understand reports, view trends and identify patterns
-
This can lead to the identification of risks and opportunities
-
involve the use of large datasets (comprised of both internal and external data), statistical techniques, quantitative analysis, machine learning and data mining techniques
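A deliberately tiny sketch of the idea of predicting a future outcome from patterns in historical data, using the standard-library statistics.linear_regression (Python 3.10+); real predictive analytics would use far larger datasets and richer statistical or machine learning models, and these numbers are invented:

from statistics import linear_regression  # Python 3.10+

# Invented historical observations: monthly ad spend vs. units sold
spend = [10, 20, 30, 40]
units = [55, 102, 148, 205]

# Fit a trend on historical and current data, then predict an unseen outcome
slope, intercept = linear_regression(spend, units)
print(round(slope * 50 + intercept))  # forecast for a planned spend of 50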
Frage 127
Frage
predictive analytics
Antworten
-
may employ machine learning algorithms, such as unsupervised learning to extract previously unknown attributes
-
is considered to provide more value and require a more advanced skillset than both descriptive and diagnostic analytics
-
tools generally abstract underlying statistical intricacies by providing user-friendly front-end interfaces
-
enables a detailed view of the data of interest by focusing in on a data subset from the summarized view
Frage 128
Frage
prescriptive analytics
Antworten
-
is the process of teaching computers to learn from existing data and apply the acquired knowledge to formulate predictions about unknown data
-
incorporate predictive and prescriptive data analytics and data transformation features
-
build upon the results of predictive analytics by prescribing actions that should be taken. The focus is on which prescribed option to follow, and why and when it should be followed, to gain an advantage or mitigate a risk
-
provide more value than any other type of analytics and correspondingly require the most advanced skillset, as well as specialized software and tools
Frage 129
Frage
prescriptive analytics
Antworten
-
rely on BI and data warehouses as core components of big data environments and ecosystems
-
risks associated with collecting accurate and relevant data, and with integrating the big data environment itself, need to be identified and quantified
-
various outcomes are calculated, and the best course of action for each outcome is suggested
-
The approach shifts from explanatory to advisory and can include the simulation of various scenarios
Frage 130
Frage
prescriptive analytics
Antworten
-
helps establish patterns and relationships among the data being analyzed
-
unstructured data
-
incorporate internal data (current and historical sales data, customer information, product data, business rules) and external data (social media data, weather data, demographic data)
-
involve the use of business rules and large amounts of internal and/or external data to simulate outcomes and prescribe the best course of action
Frage 131
Antworten
-
coupling a traditional data warehouse with these new technologies results in a hybrid data warehouse
-
various outcomes are calculated, and the best course of action for each outcome is suggested
-
is the process of teaching computers to learn from existing data and apply the acquired knowledge to formulate predictions about unknown data
-
This involves identifying patterns in the training data and classifying new or unseen data based on known patterns
Frage 132
Frage
machine learning types
Frage 133
Frage
supervised learning
Antworten
-
distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
-
these analysis results cannot be generalized to an entire dataset due to the small sample size
-
the algorithm is first fed sample data where the data categories are already known
-
based on the input data, the algorithm develops an understanding of which data belongs to which category
Frage 134
Frage
supervised learning
Antworten
-
refers to the quality or fidelity of data
-
usually require collecting data from multiple sources and storing it in a structure that lends itself to performing drill-downs and roll-ups
-
the information is generated at periodic intervals in realtime or near realtime
-
having developed an understanding, the algorithm can then apply the learned behavior to categorize unknown data
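A minimal supervised-learning sketch of the steps described above; scikit-learn is assumed here purely as an illustration (it is not named by the cards), and the training points and labels are invented:

from sklearn.neighbors import KNeighborsClassifier  # scikit-learn assumed installed

# Sample data whose categories (labels) are already known
X_train = [[1, 1], [1, 2], [8, 8], [9, 8]]
y_train = ["small", "small", "large", "large"]

# The algorithm develops an understanding of which data belongs to which category
model = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# ...and then applies the learned behaviour to categorize unknown data
print(model.predict([[2, 1], [8, 9]]))  # ['small' 'large']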
Frage 135
Frage
unsupervised learning
Antworten
-
identification of new markets
-
try to predict the event outcome and predictions are made based on patterns, trends and exceptions found in historical and current data
-
data categories are unknown and no sample data is fed
-
Instead, the algorithm attempts to categorize data by grouping data with similar attributes together
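A matching unsupervised-learning sketch, again assuming scikit-learn as an illustration: no labels are supplied, and the algorithm groups similar observations on its own. The data points are invented:

from sklearn.cluster import KMeans  # scikit-learn assumed installed

# No categories are known and no labelled sample data is fed to the algorithm
X = [[1, 1], [1, 2], [8, 8], [9, 8]]

# It groups data with similar attributes together on its own
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)  # the cluster assigned to each observation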
Frage 136
Antworten
-
is directly related to the veracity characteristic
-
Online Transaction Processing (OLTP)
-
unearths hidden patterns and relationships based on previously unknown attributes of data
-
may employ machine learning algorithms, such as unsupervised learning to extract previously unknown attributes
Frage 137
Antworten
-
This can lead to the identification of risks and opportunities
-
is not "intelligent" as such because it only provides answers to correctly formulated questions
-
makes predictions by categorizing data based on known patterns
-
can use the output from data mining (identified patterns) for further data classification through supervised learning
Frage 138
Antworten
-
provide a holistic view of key business areas
-
Due to the volumes of data that some big data solutions are required to process, performance can sometimes become a concern
-
may employ machine learning algorithms, such as unsupervised learning to extract previously unknown attributes
-
this is accomplished by categorizing data which leads to the identification of patterns
Frage 139
Antworten
-
is stored in a tabular form
-
aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
-
rely on BI and data warehouses as core components of big data environments and ecosystems
-
has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged
Frage 140
Antworten
-
queries and statistical formulae can then be applied as part of various data analysis tasks for viewing data in a user-friendly format, such as on a dashboard
-
more detailed records
-
utilizes descriptive and diagnostic analysis to provide information on historical and current events
-
is not "intelligent" as such because it only provides answers to correctly formulated questions
Frage 141
Antworten
-
can also be fed back into OLTPs
-
is mostly machine-generated and automatically appended to the data
-
they also cannot be measured numerically or used for numerical comparisons
-
correctly formulating questions requires an understanding of business problems and issues, and of the data itself
Frage 142
Antworten
-
Sensor Data (RFID, Smart meters, GPS sensors)
-
tools generally abstract underlying statistical intricacies by providing user-friendly front-end interfaces
-
ad-hoc reports
-
dashboards
Frage 143
Antworten
-
are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)
-
Online Analytical Processing (OLAP)
-
is a process that involves manually processing data to produce custom-made reports
-
the focus is usually on a specific area of the business, such as its marketing or supply chain management.
Frage 144
Antworten
-
Data acquired, such as via online customer registrations, usually contains less noise
-
policies for data privacy and data anonymization
-
makes the adoption of big data solutions accessible to businesses without large capital investments
-
the generated custom reports are detailed and often tabular in nature
Frage 145
Frage
OLAP and OLTP data sources
Antworten
-
Instead, the algorithm attempts to categorize data by grouping data with similar attributes together
-
each iteration can then help fine-tune processing steps, algorithms and data models to improve the accuracy of the result and deliver greater value to the business
-
Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
-
can be used by BI tools for both ad-hoc reporting and dashboards
Frage 146
Antworten
-
analytics results are viewed via interactive visualization tools that enable users to identify trends and patterns
-
in-house hardware resources are inadequate
-
provide a holistic view of key business areas
-
the information is generated at periodic intervals in realtime or near realtime
Frage 147
Antworten
-
are not turn-key solutions
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
-
performing analytics on datasets can reveal confidential information about organizations or individuals
-
the presentation of data is graphical in nature, such as column charts, pie charts and gauges
Frage 148
Antworten
-
The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
-
datasets that need to be processed reside in a cloud
-
provide feedback in near-realtime via open and public mediums
-
BI tools are used to display the information on dashboards
Frage 149
Frage
data warehouse and data marts
Antworten
-
is carried out to answer questions about events that have already occurred
-
either exists in textual or binary form
-
can have multiple data marts
-
contain consolidated and validated information about enterprise-wide business entities
Frage 150
Antworten
-
policies that regulate the kind of external data that can be acquired
-
does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
-
cannot function effectively without data marts because they contain the optimized and segregated data required for reporting purposes
-
without data marts, data needs to be extracted from the data warehouse via an ETL process on an ad-hoc basis whenever a query needs to be run
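A minimal Python sketch of such an ad-hoc extract-transform-load run follows; the source data, field names and the cleansing rule are invented for illustration.

# Minimal ETL sketch: extract rows from a source, transform (cleanse/derive),
# and load them into a reporting target. All names and values are invented.
import csv, io, sqlite3

source_csv = io.StringIO("customer,revenue\nacme, 1200 \nglobex,900\n")

# Extract
rows = list(csv.DictReader(source_csv))

# Transform: trim whitespace and convert revenue to a number
cleaned = [(r["customer"].strip(), float(r["revenue"].strip())) for r in rows]

# Load into a reporting target (an in-memory table stands in for a data mart)
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE revenue_report (customer TEXT, revenue REAL)")
target.executemany("INSERT INTO revenue_report VALUES (?, ?)", cleaned)

print(target.execute("SELECT * FROM revenue_report").fetchall())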
Frage 151
Antworten
-
can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
-
accumulates from being amassed within the enterprise (via applications) or from external sources that are then stored by the big data solution
-
Near-realtime data processing can be achieved by processing transactional data as it arrives and combining it with already summarized batch-processed data (see the sketch below)
-
uses data warehouses and data marts for reporting and data analysis, because they allow complex data analysis queries with multiple joins and aggregations to be issued
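The near-realtime approach referenced above can be sketched in a few lines of Python; the product names, batch totals and incoming events are assumptions for illustration.

# Minimal sketch of near-realtime processing: transactions are folded into
# a combined view as they arrive, on top of an already summarized batch result.
batch_totals = {"product_a": 1500, "product_b": 900}   # pre-computed batch layer

def handle_transaction(event, totals):
    """Fold a newly arrived transaction into the combined view."""
    totals[event["product"]] = totals.get(event["product"], 0) + event["qty"]
    return totals

incoming = [{"product": "product_a", "qty": 3}, {"product": "product_c", "qty": 7}]
for event in incoming:
    combined = handle_transaction(event, batch_totals)
    print(combined)   # near-realtime view: batch history plus fresh events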
Frage 152
Antworten
-
each feedback cycle may reveal the need for existing steps to be modified, or new steps, such as pre-processing for data cleansing, to be added
-
policies for archiving data sources and analysis results
-
builds upon BI by acting on the cleansed, consolidated enterprise-wide data in the data warehouse and combining it with semi-structured and unstructured data sources
-
comprises both predictive and prescriptive analysis to facilitate the development of an enterprise-wide understanding of the way a business works
Frage 153
Antworten
-
The broadening coverage of the internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
-
they also cannot be measured numerically or used for numerical comparisons
-
sound processes and sufficient skillsets for those who will be responsible for implementing, customizing, populating and using big data solutions are also necessary
-
analyses focus on multiple business processes simultaneously
Frage 154
Antworten
-
analyses generally focus on individual business processes
-
Depends on the type of data present
-
as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
-
refers to the information about the source of the data that helps determine its authenticity and quality. It is also used for auditing purposes
Frage 155
Antworten
-
it is important to accept that big data solutions are not necessary for all businesses
-
businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources
-
This helps reveal patterns and anomalies across a broader scope within the enterprise
-
It also leads to data discovery by identifying insights and information that may have been previously absent or unknown
Frage 156
Antworten
-
distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
-
generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
-
requires the analysis of unstructured, semi-structured and structured data residing in the enterprise data warehouse
-
requires a "next-generation" data warehouse that uses new features and technologies to store cleansed data originating from a variety of sources in a single uniform data format
Frage 157
Antworten
-
has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged
-
Volume, Velocity, Variety, Veracity & Value
-
coupling a traditional data warehouse with these new technologies results in a hybrid data warehouse
-
this type of data warehouse acts as a uniform and central repository of structured, semi-structured and unstructured data that can provide tools with all of the data they require
Frage 158
Antworten
-
Around 80% of analytics are ________ in nature
-
is directly related to the veracity characteristic
-
this eliminates the need for tools to have to connect to multiple data sources to retrieve or access data
-
A next-generation data warehouse establishes a standardized data access layer across a range of data sources
Frage 159
Antworten
-
conforms to a data model or schema
-
is based on a quantifiable indicator that is identified and agreed upon beforehand
-
is a technique whereby analytical results are graphically communicated using elements like charts, maps, data grids, infographics and alerts
-
Graphically representing data can make it easier to understand reports, view trends and identify patterns
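As a small illustration of graphically communicating analytical results, the following Python sketch (assuming matplotlib; the category names and values are invented) renders a simple bar chart instead of a tabular report.

# Minimal data visualization sketch: an analytical result shown as a chart.
import matplotlib.pyplot as plt

regions = ["North", "South", "East", "West"]
revenue = [200, 250, 180, 220]

plt.bar(regions, revenue)
plt.title("Revenue by region")        # a trend or pattern is easier to spot visually
plt.ylabel("Revenue (thousand EUR)")
plt.savefig("revenue_by_region.png")  # or plt.show() in an interactive session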
Frage 160
Frage
Traditional Data Visualization
Antworten
-
contain consolidated and validated information about enterprise-wide business entities
-
the nature of the business may make external data very valuable. The greater the volume and variety of data, the higher the chances of finding hidden insights from patterns
-
provided mostly static charts and graphs in reports and dashboards
-
query data from relational databases, OLAP systems, data warehouses and spreadsheets to present both descriptive and diagnostic analytics results
Frage 161
Frage
contemporary data visualization
Antworten
-
unearths hidden patterns and relationships based on previously unknown attributes of data
-
can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results
-
can be used by enterprise applications directly, or fed into a data warehouse to enrich existing data. This data is typically analyzed and subjected to analytics
-
are interactive and can provide both summarized and detailed views of data
Frage 162
Antworten
-
analyses focus on multiple business processes simultaneously
-
semi-structured data
-
they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
-
Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
Frage 163
Antworten
-
has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged
-
policies for archiving data sources and analysis results
-
generally use in-memory analytical technologies that reduce the latency normally attributed to traditional, disk-based tools
-
Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
Frage 164
Frage
Data Visualization Features
Antworten
-
does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
-
each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
-
Aggregation
-
Drill-Down
Frage 165
Frage
Data Visualization Features
Antworten
-
also known as data discovery, is a specialized form of data analysis that targets large datasets
-
this type of data warehouse acts as a uniform and central repository of structured, semi-structured and unstructured data that can provide tools with all of the data they require
-
Filtering
-
Roll-Up
Frage 166
Frage
Data Visualization Features
Frage 167
Antworten
-
in-house hardware resources are inadequate
-
distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
-
involves extracting hidden or unknown patterns in the data with the intention of identifying previously unknown patterns
-
provides a holistic and summarized view of data across multiple contexts
Frage 168
Antworten
-
Big data solutions access data and generate data, all of which become assets of the business
-
forms the basis for predictive analytics and business intelligence (BI)
-
since the sample size is large, the results can be applied in a generalized manner to the entire dataset
-
enables a detailed view of the data of interest by focusing in on a data subset from the summarized view
Frage 169
Antworten
-
Value and complexity increase as we move from descriptive to prescriptive analytics
-
provides a holistic and summarized view of data across multiple contexts
-
is a data analysis technique that focuses on describing various data qualities using words
-
helps focus on a particular set of data by filtering away the data that is not of immediate interest
Frage 170
Antworten
-
qualitative analysis
-
structured data
-
queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
-
groups data across multiple categories to show subtotals and totals
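The roll-up feature described above, together with the drill-down and filtering features from the preceding questions, can be illustrated with a minimal pandas sketch; the column names and values are invented for illustration.

# Minimal sketch of common visualization features on a small dataset:
# roll-up (subtotals and totals), drill-down (detail behind a summary)
# and filtering (keeping only the data of interest).
import pandas as pd

sales = pd.DataFrame({
    "region":  ["north", "north", "south", "south"],
    "product": ["a", "b", "a", "b"],
    "amount":  [100, 60, 80, 40],
})

# Roll-up: group across categories to show subtotals and a grand total
print(sales.groupby("region")["amount"].sum())
print("grand total:", sales["amount"].sum())

# Drill-down: focus on the detailed records behind one summarized value
print(sales[sales["region"] == "north"])

# Filtering: keep only the data of immediate interest
print(sales[sales["amount"] >= 80])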
Frage 171
Antworten
-
addressing concerns can require the annotation of data with source information and other metadata, when it is generated or as it arrives
-
scientific discoveries
-
also, the quality of the data targeted for processing by big data solutions needs to be assessed
-
enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
Frage 172
Frage
advanced visualization tools
Antworten
-
is stored in a tabular form
-
These techniques may not provide accurate findings in a timely manner because of the data's volume, velocity and/or variety
-
incorporate predictive and prescriptive data analytics and data transformation features
-
these tools eliminate the need for data pre-processing methods (such as ETL) and provide the ability to directly connect to structured, semi-structured and unstructured data sources
Frage 173
Frage
advanced visualization tools
Antworten
-
based on the input data, the algorithm develops an understanding of which data belongs to which category
-
can join structured and unstructured data that is kept in memory for fast data access
-
queries and statistical formulae can then be applied as part of various data analysis tasks for viewing data in a user-friendly format, such as on a dashboard
-
correctly formulating questions requires an understanding of business problems and issues, and of the data itself
Frage 174
Frage
business justification
Antworten
-
this eliminates the need for tools to have to connect to multiple data sources to retrieve or access data
-
It also leads to data discovery by identifying insights and information that may have been previously absent or unknown
-
as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
-
clear goals regarding the measurable business value of an enterprise's big data solution need to be set
Frage 175
Frage
business justification
Antworten
-
algorithm is first fed sample data where the data categories are already known
-
Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys
-
anticipated benefits need to be weighed against risks and investments
-
risks associated with collecting accurate and relevant data, and with integrating the big data environment itself, need to be identified and quantified
Frage 176
Frage
business justification
Antworten
-
refers to the quality or fidelity of data
-
a substantial budget may still be required to obtain external data
-
distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
-
it is important to accept that big data solutions are not necessary for all businesses
Frage 177
Frage
big data frameworks
Antworten
-
based on the input data, the algorithm develops an understanding of which data belongs to which category
-
provides a holistic and summarized view of data across multiple contexts
-
are interactive and can provide both summarized and detailed views of data
-
are not turn-key solutions
Frage 178
Frage
organizational prerequisites
Antworten
-
prescriptive analytics
-
enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
-
in order for data analysis and analytics to be successful and offer value, enterprises need to have data management and big data governance frameworks
-
sound processes and sufficient skillsets for those who will be responsible for implementing, customizing, populating and using big data solutions are also necessary
Frage 179
Frage
organizational prerequisites
Antworten
-
is mostly machine-generated and automatically appended to the data
-
Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
-
also, the quality of the data targeted for processing by big data solutions needs to be assessed
-
outdated, invalid or poorly identified data will result in low-quality input which, regardless of how good the big data solution is, will continue to produce low-quality output
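A minimal Python sketch of such a data-quality assessment follows (assuming pandas; the field names, validity rule and cut-off date are invented for illustration).

# Minimal data-quality check: count missing identifiers, invalid values and
# outdated records before they are fed into a big data solution.
import datetime
import pandas as pd

records = pd.DataFrame({
    "customer_id": ["c1", None, "c3"],
    "age":         [34, -5, 51],                       # -5 is clearly invalid
    "last_update": ["2024-05-01", "2015-01-01", "2024-06-10"],
})
records["last_update"] = pd.to_datetime(records["last_update"])

missing_ids  = records["customer_id"].isna().sum()
invalid_ages = (records["age"] < 0).sum()
outdated     = (records["last_update"] < datetime.datetime(2020, 1, 1)).sum()

print(f"missing ids: {missing_ids}, invalid ages: {invalid_ages}, outdated: {outdated}")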
Frage 180
Frage
organizational prerequisites
Antworten
-
refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends
-
makes predictions by categorizing data based on known patterns
-
the longevity of the big data environment also needs to be planned for
-
a roadmap needs to be defined to ensure that any necessary expansion or augmentation of the environment is planned out to stay in sync with the requirements of the enterprise
Frage 181
Antworten
-
can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
-
Hadoop
-
the acquisition of big data solutions themselves can be economical, due to open-source platform availability and opportunities to leverage commodity hardware
-
a substantial budget may still be required to obtain external data
Frage 182
Antworten
-
build upon the results of predictive analytics by prescribing actions that should be taken. The focus is on which prescribed options to follow, and why and when they should be followed, to gain an advantage or mitigate a risk
-
they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
-
the nature of the business may make external data very valuable. The greater the volume and variety of data, the higher the chances of finding hidden insights from patterns
-
external data sources include data markets and the government. Government-provided data, such as geo-spatial data, may be free
Frage 183
Antworten
-
predictive analytics
-
can process massive quantities of data that arrive at varying speeds, may be of many different varieties and have numerous incompatibilities
-
Value and complexity increase as we move from descriptive to prescriptive analytics
-
most commercially relevant data will need to be purchased. Such an investment may be on-going in order to obtain updated versions of the datasets
Frage 184
Antworten
-
store operational data that is fully normalized
-
Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
-
performing analytics on datasets can reveal confidential information about organizations or individuals
-
even analyzing separate datasets that contain seemingly benign data can reveal private information when the datasets are analyzed jointly
Frage 185
Antworten
-
predictive analytics
-
descriptive analytics
-
this can lead to intentional or inadvertent breaches of privacy
-
addressing these privacy concerns requires an understanding of the nature of the data being accumulated and relevant data privacy regulations, as well as special techniques for data tagging and anonymization
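As a hedged illustration of data tagging and anonymization, the following Python sketch replaces a direct identifier with a salted hash and coarsens a quasi-identifier; the salt, field names and records are assumptions for illustration.

# Minimal anonymization sketch: pseudonymize direct identifiers and coarsen
# quasi-identifiers before the data is made available for analysis.
import hashlib

SALT = "replace-with-a-secret-salt"   # assumption: salt is kept outside the dataset

def pseudonymize(value: str) -> str:
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

records = [{"name": "Alice Example", "zip": "10115", "purchase": 42.0}]

anonymized = [
    {
        "customer_ref": pseudonymize(r["name"]),  # stable pseudonym instead of the raw name
        "zip_prefix": r["zip"][:2],               # coarsened quasi-identifier
        "purchase": r["purchase"],
    }
    for r in records
]
print(anonymized)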
Frage 186
Antworten
-
big data security further involves establishing data access levels for different categories of users
-
The maturity of these fields of practice inspired and enabled much of the core functionality expected from contemporary Big Data solutions and tools
-
some of the components of big data solutions lack the robustness of traditional enterprise solution environments when it comes to access control and data security
-
securing big data involves ensuring that data networks provide access to repositories that are sufficiently secured, via custom authentication and authorization mechanisms
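A minimal Python sketch of data access levels for different categories of users follows; the role names, permissions and checks are invented for illustration and stand in for whatever authorization mechanism an actual solution would use.

# Minimal access-level sketch: a role-to-permission map is consulted before
# a repository operation is allowed.
ACCESS_LEVELS = {
    "analyst":      {"read:aggregated"},
    "data_steward": {"read:aggregated", "read:raw"},
    "admin":        {"read:aggregated", "read:raw", "write:raw"},
}

def authorize(role: str, action: str) -> bool:
    return action in ACCESS_LEVELS.get(role, set())

print(authorize("analyst", "read:raw"))        # False -- access denied
print(authorize("data_steward", "read:raw"))   # True  -- access granted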
Frage 187
Antworten
-
provide a holistic view of key business areas
-
without data marts, data needs to be extracted from the data warehouse via an ETL process on an ad-hoc basis whenever a query needs to be run
-
refers to the information about the source of the data that helps determine its authenticity and quality. It is also used for auditing purposes
-
maintaining it as large volumes of data are acquired, combined and put through multiple processing stages can be a complex task
Frage 188
Antworten
-
provide a holistic view of key business areas
-
store historical data that is aggregated and denormalized to support fast reporting capability
-
addressing concerns can require the annotation of data with source information and other metadata, when it is generated or as it arrives
-
data may also need to be annotated with the source dataset attributes and processing steps details as it passes through the data transformation steps
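Such source and processing-step annotation can be sketched minimally in Python; the record fields, source name and step names are invented for illustration.

# Minimal provenance sketch: each record carries metadata about its source and
# the processing steps applied to it, so that it can later be audited.
import datetime

def annotate(record, source, step):
    record.setdefault("_provenance", {"source": source, "steps": []})
    record["_provenance"]["steps"].append(
        {"step": step, "at": datetime.datetime.now(datetime.timezone.utc).isoformat()}
    )
    return record

row = {"sensor_id": "s-17", "reading": 21.4}
row = annotate(row, source="smart_meter_feed", step="ingested")
row = annotate(row, source="smart_meter_feed", step="unit_normalized")
print(row["_provenance"])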
Frage 189
Frage
Limited Realtime Support
Antworten
-
is stored in a tabular form
-
performing analytics on datasets can reveal confidential information about organizations or individuals
-
Dashboards and other applications that require streaming data and alerts often demand realtime or near-realtime data transmissions
-
Many contemporary open-source big data solutions and tools are batch-oriented, meaning that support for streaming data analysis may be either limited or non-existent
Frage 190
Frage
Limited Realtime Support
Antworten
-
algorithm is first fed sample data where the data categories are already known
-
they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
-
some realtime data analysis solutions that do exist are proprietary
-
Near-realtime data processing can be achieved by processing transactional data as it arrives and combining it with already summarized batch-processed data
Frage 191
Frage
Distinct performance challenges
Antworten
-
refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends
-
anticipated benefits need to be weighed against risks and investments
-
queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
-
Due to the volumes of data that some big data solutions are required to process, performance can sometimes become a concern
Frage 192
Frage
Distinct governance requirements
Antworten
-
having developed an understanding, the algorithm can then apply the learned behavior to categorize unknown data
-
are considered to provide more value than descriptive analysis, requiring a more advanced skillset
-
the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later (see the sketch below)
-
Big data solutions access data and generate data, all of which become assets of the business
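The cube structure referenced above can be approximated with a pandas pivot table; the dimension names, measure and values are assumptions for illustration only.

# Minimal cube-like sketch: relational rows are denormalized into a
# multidimensional summary that later analysis tasks can slice.
import pandas as pd

facts = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024],
    "region":  ["north", "south", "north", "south"],
    "revenue": [120, 200, 150, 230],
})

# Two dimensions (year x region) with an aggregated measure, plus totals
cube = pd.pivot_table(facts, values="revenue", index="year",
                      columns="region", aggfunc="sum", margins=True)
print(cube)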
Frage 193
Frage
governance framework
Antworten
-
can use the output from data mining (identified patterns) for further data classification through supervised learning
-
businesses are storing increasing amounts of data on customer interactions and from social media avenues in an attempt to harvest this data to increase sales, enable targeted marketing and create new products and services
-
analyses focus on multiple business processes simultaneously
-
is required to ensure that the data and the solution environment itself are regulated, standardized and evolved in a controlled manner
Frage 194
Frage
what a big data governance framework would encompass
Antworten
-
does not conform to a data model or data schema
-
big data solutions particularly rely on it when processing semi-structured and unstructured data
-
standardizing how data is tagged and the metadata used for tagging
-
policies that regulate the kind of external data that can be acquired
Frage 195
Frage
what a big data governance framework would encompass
Antworten
-
policies for data cleansing and filtering
-
can also be fed back into OLTPs
-
policies for data privacy and data anonymization
-
policies for archiving data sources and analysis results
Frage 196
Frage
Distinct methodology
Antworten
-
upfront capital investment is not available
-
simple insert, delete and update operations with sub-second response times
-
will be required to control how data flows in and out of big data solutions and how feedback loops can be established to enable the processed data to undergo repeated refinements
-
each feedback cycle may reveal the need for existing steps to be modified, or new steps, such as pre-processing for data cleansing, to be added
Frage 197
Frage
Distinct methodology
Antworten
-
the focus is usually on a specific area of the business, such as its marketing or supply chain management.
-
A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
-
Extract Transform Load (ETL)
-
each iteration can then help fine-tune processing steps, algorithms and data models to improve the accuracy of the result and deliver greater value to the business
Frage 198
Antworten
-
generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
-
A next-generation data warehouse establishes a standardized data access layer across a range of data sources
-
introduces remote environments that can host IT infrastructure for, among other things, large-scale storage and processing
-
the adoption of a big data environment may necessitate that some or all of that environment be hosted within a cloud
Frage 199
Antworten
-
Collections or groups of related data (e.g., tweets stored in a flat file, a collection of image files, an extract of rows stored in a table, historical weather observations stored as XML files)
-
as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
-
upfront capital investment is not available
-
the project is to be isolated from the rest of the business so that existing business processes are not impacted
Frage 200
Antworten
-
the limits of available computing and storage resources used by an in-house Big Data solution are being reached
-
is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
-
the big data initiative is a proof of concept
-
datasets that need to be processed reside in a cloud