Trivia Quiz: What Do You Know About Data Mining And Data Warehousing?

Approved & Edited by ProProfs Editorial Team
By Lol4198, Community Contributor
Questions: 52 | Attempts: 700


What do you know about data mining and data warehousing? When it comes to storing data, you may store it haphazardly, or you may use data warehousing, where the same type of data is kept together in the same place. Take the quiz below and see how much more you can learn about these two data-handling processes.


Questions and Answers
  • 1. 

    ETL stands for

    • A.

      A) Except, Transformation, Load

    • B.

      B) Extract, Transform, Load

    • C.

      C) Exact, Transmission, Load

    • D.

      D) None of the above

    Correct Answer
    B. B) Extract, Transform, Load
    Explanation
    ETL stands for Extract, Transform, Load. This refers to the process of extracting data from various sources, transforming it into a suitable format, and then loading it into a target database or data warehouse. The extraction phase involves gathering data from different sources, such as databases, files, or APIs. The transformation phase involves cleaning, filtering, and reformatting the data to meet the requirements of the target system. Finally, the loaded data is stored in a database or data warehouse for further analysis and reporting. Therefore, option b) Extract, Transform, Load is the correct abbreviation for ETL.
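The three phases can be sketched in a few lines of Python; this is a minimal in-memory pipeline, and the `sales` table and its columns are purely illustrative, not part of the quiz:

```python
import sqlite3

# Extract: hypothetical source rows, standing in for data pulled from files or APIs.
def extract():
    return [("  Alice ", "2024-01-03", "120.50"),
            ("Bob", "2024-01-04", "80.00")]

# Transform: clean whitespace and cast amounts to floats.
def transform(rows):
    return [(name.strip(), day, float(amount)) for name, day, amount in rows]

# Load: write the cleaned rows into a target table.
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, day TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT customer, amount FROM sales").fetchall())
# [('Alice', 120.5), ('Bob', 80.0)]
```

In a real warehouse the same three steps run against many sources and far larger volumes, but the extract-transform-load order is the same.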


  • 2. 

    Arrange the following in sequence: 1. Data 2. Knowledge 3. Information 4. Insight

    • A.

      A) 3,1,4,2

    • B.

      B) 1,3,2,4

    • C.

      C) 4,1,2,3

    • D.

      D) 1,2,3,4

    Correct Answer
    B. B) 1,3,2,4
    Explanation
    The correct arrangement of the sequence is 1,3,2,4. This is because data is raw facts and figures, which are then processed and organized to become information. Knowledge is gained by understanding and interpreting the information. Finally, insight is the deeper understanding or realization that comes from applying knowledge to a specific context. Therefore, the correct order is 1 (Data), 3 (Information), 2 (Knowledge), and 4 (Insight).


  • 3. 

    OLTP systems are

    • A.

      A) Operating Systems

    • B.

      B) Purely Decision Support systems

    • C.

      C) Analytical Systems

    • D.

      D) Security systems

    Correct Answer
    A. A) Operating Systems
    Explanation
    OLTP stands for Online Transaction Processing. OLTP systems manage and process real-time transactions in a business, handling day-to-day operations such as order processing, inventory management, and customer interactions. They are not purely decision support systems or analytical systems, since their primary function is to facilitate transactional activity rather than to analyze data or generate reports, and they are not security systems, although they may include security features to protect the integrity and confidentiality of data. Among the options given, a) is the intended answer, reading "Operating Systems" in the sense of operational, transaction-processing systems.


  • 4. 

    Operational Databases are 

    • A.

      Highly denormalized databases

    • B.

      Performance is always slow

    • C.

      Contains historical data

    • D.

      None of the above

    Correct Answer
    D. None of the above
    Explanation
    Operational databases are databases that are designed to support day-to-day operations of an organization. They are typically normalized databases, meaning that the data is organized efficiently to minimize redundancy and improve data integrity. Performance of operational databases can vary depending on various factors such as hardware, software, and database design. While operational databases may contain historical data, it is not a defining characteristic. Therefore, the correct answer is "None of the above".


  • 5. 

    The top-down approach is suggested by

    • A.

      A) Ralph Kimball

    • B.

      B) Bill Inmon

    • C.

      C) Barry Williams

    • D.

      D) Ted Codd

    Correct Answer
    B. B) Bill Inmon
    Explanation
    Bill Inmon is the correct answer because he is known for advocating the top-down approach in data warehouse design. Inmon believes that a data warehouse should be built using a normalized relational model, with a focus on integrating and consolidating data from various sources. This approach emphasizes the importance of designing a comprehensive data model and a detailed data architecture before implementing the physical database. In contrast, Ralph Kimball is known for advocating the bottom-up approach, which focuses on building data marts first and then integrating them into a data warehouse.


  • 6. 

    It is an automated process of building a data warehouse which involves taking data from disparate source systems, converting it into a consistent form that can be loaded into the warehouse, and performing quality checks while building the data warehouse. It typically accounts for 70-80% of the effort in a data warehousing initiative.

    • A.

      Data Staging

    • B.

      Data Loading

    • C.

      Data Transaction

    • D.

      None of the above

    Correct Answer
    A. Data Staging
    Explanation
    Data staging is the correct answer because it refers to the process of taking data from disparate source systems and converting them into a consistent form that can be loaded into the data warehouse. This process also involves performing quality checks to ensure the data is accurate and reliable. Data staging is a crucial step in building a data warehouse and typically requires a significant amount of effort, accounting for 70-80% of the overall data warehousing initiative.


  • 7. 

    Analytical Systems are

    • A.

      A) Contain Historic data

    • B.

      B) Used for Decision making

    • C.

      C) Used for Analysis purpose

    • D.

      D) All of the above

    Correct Answer
    D. D) All of the above
    Explanation
    Analytical systems are tools or platforms that contain historic data, which can be used for decision making and analysis purposes. These systems provide insights and help in making informed decisions based on the analysis of the available data. Therefore, option d) "All of the above" is the correct answer as it encompasses all the mentioned characteristics of analytical systems.


  • 8. 

    Which of the following does not describe a data warehouse?

    • A.

      A) Subject-oriented

    • B.

      B) Integrated

    • C.

      C) Time-variant

    • D.

      D) Updateable

    Correct Answer
    D. D) Updateable
    Explanation
    Data warehouse is a central repository of integrated and subject-oriented data that is used for reporting and data analysis. It is designed to support complex queries and provide historical data for analysis. However, data warehouses are not updateable in the sense that they are not meant to be constantly updated with real-time data. Instead, they are typically loaded with data periodically, such as daily or weekly, to provide a consistent and stable dataset for analysis. Therefore, option d) is the correct answer as it does not describe the characteristic of a data warehouse.


  • 9. 

    The data warehouse bus concept belongs to whose approach?

    • A.

      A) Ralph Kimball

    • B.

      B) Bill Inmon

    • C.

      C) Ted Codd

    • D.

      D) John Inmon

    Correct Answer
    A. A) Ralph Kimball
    Explanation
    The correct answer is a) Ralph Kimball. Ralph Kimball is associated with the concept of the Data Warehouse Bus Architecture. This concept involves organizing the data warehouse into a central "bus" structure, where data flows from different source systems to various dimensional data marts. This approach emphasizes the use of dimensional modeling and is known for its flexibility and simplicity in designing data warehouses.


  • 10. 

    Which of the following is/are functions of the warehouse manager of a data warehouse? 1. Transforming and managing data 2. Backing up and archiving the data warehouse 3. Directing and managing queries

    • A.

      Only 1

    • B.

      Only 2

    • C.

      Only 3

    • D.

      Both 2 and 3

    Correct Answer
    A. Only 1
    Explanation
    The warehouse manager of a data warehouse is responsible for transforming and managing data. This involves tasks such as data integration, data cleansing, data transformation, and data quality management. The manager ensures that the data in the warehouse is accurate, consistent, and up-to-date. They also oversee the storage and organization of data within the warehouse. Backing up and archiving data, as well as directing and managing queries, are not typically within the scope of the warehouse manager's responsibilities.


  • 11. 

    In an organization, the relationship between projects and employees is

    • A.

      A) Many-to-many

    • B.

      B) Many-to-one

    • C.

      C) One-to-one

    • D.

      D) One-to-many

    Correct Answer
    A. A) Many-to-many
    Explanation
    In an organization, the relationship between projects and employees is often a many-to-many relationship. This means that multiple employees can be assigned to multiple projects, and each project can have multiple employees working on it. This type of relationship allows for flexibility and collaboration within the organization, as employees can contribute to different projects and projects can benefit from the expertise of multiple employees.
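In a relational database this many-to-many relationship is usually resolved with a junction (associative) table. A minimal sketch using SQLite; the table and column names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The assignment table links one employee to one project per row, so any
# employee can appear on many projects and any project can have many employees.
conn.executescript("""
CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE project  (proj_id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE assignment (
    emp_id  INTEGER REFERENCES employee(emp_id),
    proj_id INTEGER REFERENCES project(proj_id),
    PRIMARY KEY (emp_id, proj_id)
);
INSERT INTO employee VALUES (1, 'Asha'), (2, 'Ben');
INSERT INTO project  VALUES (10, 'Migration'), (20, 'Reporting');
-- Asha works on both projects; the Reporting project has both employees.
INSERT INTO assignment VALUES (1, 10), (1, 20), (2, 20);
""")
for row in conn.execute("""
    SELECT e.name, p.title FROM assignment a
    JOIN employee e ON e.emp_id = a.emp_id
    JOIN project  p ON p.proj_id = a.proj_id
    ORDER BY e.name, p.title"""):
    print(row)
# ('Asha', 'Migration') / ('Asha', 'Reporting') / ('Ben', 'Reporting')
```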


  • 12. 

    In a star schema, the referencing tables surrounding the central fact table are

    • A.

      Dimensional tables

    • B.

      Fact tables

    • C.

      Relational tables

    • D.

      Temporary tables

    Correct Answer
    A. Dimensional tables
    Explanation
    In a star schema, the central tables are the fact tables which contain the measurements or metrics of the data being analyzed. These fact tables are surrounded by dimensional tables, which provide context and additional information about the data in the fact tables. Dimensional tables are used for categorizing and describing the data in the fact tables, allowing for easy analysis and reporting. Relational tables and temporary tables are not specific to star schemas and may or may not be used in conjunction with dimensional and fact tables.
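A minimal star schema can be sketched in SQLite; the sales fact and its two dimensions below are illustrative, not drawn from the quiz:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Central fact table holding the measure, with foreign keys pointing out to
# the surrounding dimension tables that describe the data.
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL  -- the measure being analyzed
);
INSERT INTO dim_date    VALUES (1, '2024-01-03'), (2, '2024-01-04');
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO fact_sales  VALUES (1, 1, 9.5), (1, 2, 4.0), (2, 1, 3.5);
""")
# A typical star-schema query: join the fact to a dimension and aggregate.
total = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name ORDER BY p.name""").fetchall()
print(total)  # [('Gadget', 4.0), ('Widget', 13.0)]
```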


  • 13. 

    Arrange the following steps in the correct sequence: 1. Look for essential transactions. 2. Check if the fact is a dimension. 3. Check if the dimension is the fact. 4. Determine key dimensions.

    • A.

      A) 1,3,2,4

    • B.

      B) 4,1,3,2

    • C.

      C) 1,4,2,3

    • D.

      D) 1,2,3,4

    Correct Answer
    C. C) 1,4,2,3
    Explanation
    The correct sequence is as follows:
    1. Look for essential transactions - identify the transactions that are crucial to the process.
    2. Determine key dimensions - once the essential transactions are identified, determine the key dimensions related to them.
    3. Check if the fact is a dimension - verify whether the identified fact could itself serve as a dimension.
    4. Check if the dimension is the fact - finally, verify whether any of the identified dimensions is actually the fact.
    This matches the order 1, 4, 2, 3.


  • 14. 

    Which of the following statements is/are true about fact data and dimensional data? 1. Fact data represents a physical transaction that occurred at a point in time and as such is unlikely to change on an ongoing basis during the life of the data warehouse. 2. In general, dimensional data in a star schema or snowflake schema is designed to minimize the cost of change and is typically very low volume data (i.e. under 5 GB). 3. Fact data will have only one foreign key whereas reference data will have one primary key.

    • A.

      A) Only 1

    • B.

      B) Only 2

    • C.

      C) Only 3

    • D.

      D) Both 1 and 2

    Correct Answer
    D. D) Both 1 and 2
    Explanation
    Both statements 1 and 2 are true. Statement 1 explains that fact data represents physical transactions that have occurred and is unlikely to change during the life of the data warehouse. This means that once the transaction is recorded, it remains unchanged. Statement 2 states that dimensional data in a star schema or snowflake schema is designed to minimize the cost of change and is typically low volume data. This means that the structure of the dimensional data is optimized for efficient querying and analysis, and it is usually not a large dataset. Therefore, the correct answer is d) Both 1 and 2.


  • 15. 

    Which of the following statements is/are false about OLAP tools 1. Do not learn 2. Create new knowledge 3. More powerful than data mining 4. Cannot search for new solutions

    • A.

      A) Only 1

    • B.

      B) Only 3

    • C.

      C) Both 1 and 2

    • D.

      D) Both 2 and 3

    Correct Answer
    B. B) Only 3
    Explanation
    OLAP tools are not more powerful than data mining. Data mining involves the process of discovering patterns and relationships in large datasets, often using complex algorithms and statistical techniques. It is a powerful tool for extracting valuable insights and knowledge from data. On the other hand, OLAP tools are primarily used for analyzing and querying multidimensional data, allowing users to perform advanced calculations and aggregations. While OLAP tools can provide valuable insights, they do not have the same level of complexity and advanced functionality as data mining tools. Therefore, statement 3 is false.


  • 16. 

    Arrange the following: 1. Staging Data 2. Persistent staging 3. History staging 4. ODS 5. Data warehouse 6. Datamart

    • A.

      A) 1,2,3,4,5,6

    • B.

      B) 1,2,4,3,6,5

    • C.

      C) 1,3,2,4,5,6

    • D.

      D) None of the above

    Correct Answer
    C. C) 1,3,2,4,5,6
    Explanation
    The correct answer is c) 1,3,2,4,5,6. This order represents the typical flow of data in a data warehousing environment. Staging data comes first, where raw data is loaded into a staging area. History staging is then performed to track changes and maintain a historical record of the data, and persistent staging retains the staged data across load runs. Next, the data is loaded into the Operational Data Store (ODS), which acts as a central repository for current, integrated data. After that, the data is transformed and loaded into the data warehouse, a centralized, structured database for reporting and analysis. Finally, data marts are created to provide subsets of data tailored to specific business needs.


  • 17. 

    What is Business Intelligence?

    • A.

      A) Process

    • B.

      B) Tool

    • C.

      C) Technique

    • D.

      D) Software

    Correct Answer(s)
    A. A) Process
    B. B) Tool
    C. C) Technique
    Explanation
    Business Intelligence refers to the process of collecting, analyzing, and interpreting data to make informed business decisions. It involves gathering data from various sources, transforming it into meaningful insights, and using those insights to drive strategic actions. While it can involve the use of tools and software, Business Intelligence is primarily a process that encompasses the entire lifecycle of data analysis, from data collection to reporting. It also involves the application of various techniques, such as data mining, data visualization, and predictive analytics, to extract valuable insights from the data.


  • 18. 

    Who invented the top-down approach?

    • A.

      A) Inmon

    • B.

      B) Ralph Kimball

    • C.

      C) Loney

    • D.

      D) None of the above

    Correct Answer
    A. A) Inmon
    Explanation
    Inmon is credited with inventing the top-down approach. This approach involves designing a data warehouse by starting with the overall business requirements and then breaking them down into smaller, more specific components. Inmon's approach emphasizes the importance of creating a centralized and integrated data model that can be used for various reporting and analysis purposes. This approach has been widely adopted in the field of data warehousing and has proven to be effective in managing and organizing large volumes of data.


  • 19. 

    Which of the following is a data modeling technique?

    • A.

      OLAP

    • B.

      OLTP

    • C.

      HOLAP

    • D.

      MOLAP

    Correct Answer
    A. OLAP
    Explanation
    OLAP stands for Online Analytical Processing, which is a data modeling technique used for analyzing and reporting data. It allows users to perform complex multidimensional analysis, such as slice-and-dice, drill-down, and roll-up operations, on large volumes of data. OLAP is specifically designed for decision support systems and provides a fast and efficient way to access and analyze data from multiple perspectives. It is commonly used in business intelligence applications to support strategic decision-making processes.


  • 20. 

    The process of examining the data available in an existing data source and collecting statistics and information about that data is called

    • A.

      Data Cleansing

    • B.

      Data Mining

    • C.

      Data Profiling

    • D.

      Data Presentation

    Correct Answer
    C. Data Profiling
    Explanation
    Data profiling refers to the process of examining the data available in an existing data source and collecting statistics and information about the data. It involves analyzing the structure, content, and quality of the data to understand its characteristics and identify any issues or anomalies. This process helps in gaining insights into the data, such as its completeness, accuracy, consistency, and uniqueness. Data profiling is an essential step in data management and data quality improvement, as it provides a foundation for data cleansing, data integration, and data analysis.
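A toy profiling pass over an illustrative dataset (the column names and values are invented) might collect exactly these kinds of statistics, such as row counts, null counts, and distinct values per column:

```python
# Sample rows standing in for an existing data source.
rows = [
    {"city": "Pune",   "amount": 120.5},
    {"city": "Pune",   "amount": None},
    {"city": "Mumbai", "amount": 80.0},
]

def profile(rows, column):
    """Collect simple completeness and uniqueness statistics for one column."""
    values = [r[column] for r in rows]
    return {
        "count": len(values),                                   # total rows
        "nulls": sum(v is None for v in values),                # missing values
        "distinct": len({v for v in values if v is not None}),  # unique non-nulls
    }

print(profile(rows, "city"))    # {'count': 3, 'nulls': 0, 'distinct': 2}
print(profile(rows, "amount"))  # {'count': 3, 'nulls': 1, 'distinct': 2}
```

Real profiling tools add value distributions, pattern checks, and cross-column dependencies, but the idea is the same: measure the data before cleansing or loading it.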


  • 21. 

    OLTP data is the combination of master data and 

    • A.

      Dimensional data

    • B.

      Aggregated data

    • C.

      Details data

    • D.

      None of the above

    Correct Answer
    A. Dimensional data
    Explanation
    OLTP data refers to online transaction processing data, which is the data generated from day-to-day operational activities of a business. It includes both master data, which is the core data about entities such as customers, products, and employees, and dimensional data, which is the data organized into dimensions or categories for analysis purposes. Aggregated data refers to summarized data, while details data refers to the raw, granular data. Therefore, the correct answer is dimensional data, as it is a part of OLTP data along with master data.


  • 22. 

    What does a factless fact table contain?

    • A.

      Measure only

    • B.

      Multipart key only

    • C.

      Both

    • D.

      None

    Correct Answer
    B. Multipart key only
    Explanation
    A factless fact table contains only a multipart key, meaning it consists of multiple foreign keys from different dimension tables. It does not contain any measures or numeric data. This type of fact table is used to represent events or occurrences that have no numerical value associated with them, such as a sales order being placed or a customer joining a loyalty program. The purpose of a factless fact table is to capture relationships between dimensions and provide a way to analyze the occurrences or events based on their attributes.


  • 23. 

    Full form of ODS

    • A.

      Operational data storage

    • B.

      Operational data system

    • C.

      Online data storage

    • D.

      None

    Correct Answer
    A. Operational data storage
    Explanation
    The correct answer is "Operational data storage." Operational data storage refers to a system or database that is used to store and manage real-time operational data within an organization. This type of storage is designed to support the day-to-day operations and decision-making processes of a business. It allows for the storage and retrieval of data related to transactions, customer interactions, inventory levels, and other operational activities.


  • 24. 

    The key that substitutes for the natural primary key in a data warehouse is the

    • A.

      Unique key

    • B.

      Natural key

    • C.

      Foreign key

    • D.

      Surrogate key

    Correct Answer
    D. Surrogate key
    Explanation
    A surrogate key is a key that is generated by the system and used as a substitution for a natural primary key in a data warehouse. It is typically a simple integer value that has no meaning or significance outside of the data warehouse. Surrogate keys are used to uniquely identify records and provide a stable identifier for data integration and manipulation purposes. They are often preferred over natural keys because they are more efficient for indexing, joining, and sorting operations in a data warehouse environment.
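A minimal sketch of how a system might hand out surrogate keys for incoming natural keys; the customer numbers below are hypothetical:

```python
class KeyGenerator:
    """Maps each natural key to a system-generated integer surrogate key."""

    def __init__(self):
        self.next_key = 1
        self.mapping = {}  # natural key -> surrogate key

    def surrogate_for(self, natural_key):
        # A previously seen natural key always gets the same surrogate,
        # so the surrogate is a stable identifier for the record.
        if natural_key not in self.mapping:
            self.mapping[natural_key] = self.next_key
            self.next_key += 1
        return self.mapping[natural_key]

gen = KeyGenerator()
print(gen.surrogate_for("CUST-0042"))  # 1
print(gen.surrogate_for("CUST-0099"))  # 2
print(gen.surrogate_for("CUST-0042"))  # 1  (same natural key, same surrogate)
```

In practice the warehouse database usually generates these values itself (e.g. with an auto-increment column or sequence), but the mapping idea is the same.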


  • 25. 

    What is dimensional modeling?

    • A.

      Equivalent of physical data design in DWH

    • B.

      Equivalent of logical data design in DWH

    • C.

      Equivalent of data base design in DWH

    • D.

      Equivalent of conceptual data design in DWH

    Correct Answer
    B. Equivalent of logical data design in DWH
    Explanation
    Dimensional modeling is the equivalent of logical data design in a data warehouse (DWH). It involves designing the structure and relationships of the data in a way that is optimized for reporting and analysis. This includes identifying and defining the dimensions (categories or attributes) by which the data will be analyzed, as well as the measures (quantitative data) that will be used. Dimensional modeling helps to organize and structure the data in a way that is intuitive and easy to understand for end users, enabling efficient querying and analysis.


  • 26. 

    A dimension used in multiple schemas is called a

    • A.

      Conformed dimension

    • B.

      Confirmed dimension

    • C.

      Shared dimension

    • D.

      None of the above

    Correct Answer
    A. Conformed dimension
    Explanation
    A dimension used in multiple schemas is called a conformed dimension. This means that the dimension is consistent and standardized across different schemas or data models. It allows for integration and comparison of data from different sources or systems. By using a conformed dimension, organizations can ensure that data is accurately and consistently represented, improving data quality and enabling meaningful analysis across different data sets.


  • 27. 

    Arrange the following: 1. Presentation Layer 2. ODS Layer 3. Replica Layer 4. DWH Layer

    • A.

      3,1,4,2

    • B.

      3,2,4,1

    • C.

      1,2,3,4

    • D.

      None of the above

    Correct Answer
    B. 3,2,4,1
    Explanation
    The correct answer is 3,2,4,1. In this architecture, data flows from the Replica Layer to the ODS (Operational Data Store) Layer, then to the DWH (Data Warehouse) Layer, and finally to the Presentation Layer. The Replica Layer holds copies of source data for replication and synchronization. The ODS Layer stores operational data for immediate access and processing. The DWH Layer provides long-term storage and analysis of historical data. The Presentation Layer displays the data to the end users.


  • 28. 

    A good ETL tool must be able to communicate with many different

    • A.

      Relational Databases

    • B.

      Sources

    • C.

      Targets

    • D.

      Flat files

    Correct Answer
    B. Sources
    Explanation
    A good ETL tool must be able to communicate with many different sources, including relational databases, targets, and flat files. This is important because data can be stored in various formats and locations, and the ETL tool needs to be able to extract data from these different sources in order to transform and load it into the desired destination. By being able to communicate with a wide range of sources, the ETL tool provides flexibility and versatility in handling diverse data sources, making it an effective tool for data integration and management.


  • 29. 

    Types of schemas in DWH

    • A.

      Star schema, Snowflake schema, 3NF

    • B.

      Snowflake schema, 2NF

    • C.

      Snowflake schema, Star schema

    • D.

      Star schema

    Correct Answer
    C. Snowflake schema, Star schema
    Explanation
    The correct answer is Snowflake schema, Star schema.

    A snowflake schema is a type of schema used in data warehousing where dimensions are normalized into multiple tables, creating a more complex structure. This allows for more efficient storage and reduces redundancy. On the other hand, a star schema is a simpler type of schema where dimensions are denormalized into a single table, resulting in a more straightforward structure. Both snowflake and star schemas are commonly used in data warehousing, but they have different levels of complexity and normalization.


  • 30. 

    Which of the following is a role-playing dimension?

    • A.

      Degenerate dimension

    • B.

      Junk dimension

    • C.

      Time dimension

    • D.

      Mystery dimension

    Correct Answer
    C. Time dimension
    Explanation
    A role-playing dimension is a single dimension that appears several times in the same fact table, each time playing a different role. The time (date) dimension is the classic example: the same dimension can act as the order date, ship date, and delivery date within one fact table, which makes it the role-playing dimension among the options given.


  • 31. 

    What type of dimension uses a “rolling window” operation?

    • A.

      Degenerate dimension

    • B.

      Junk dimension

    • C.

      Time dimension

    • D.

      Mystery dimension

    Correct Answer
    A. Degenerate dimension
    Explanation
    A degenerate dimension is a dimension that is derived from the fact table itself, meaning it does not have its own separate dimension table. It is typically used to store transactional or operational data that does not fit into any other dimension. Rolling window operation refers to performing calculations or aggregations on a sliding or moving time window. Since degenerate dimensions are directly related to the fact table and often involve time-based calculations, they are the most likely type of dimension to use rolling window operations.


  • 32. 

    Which one of following is the standard methodology followed for building data warehouse?

    • A.

      Extract, Transform, Load

    • B.

      Transform, Extract, Load

    • C.

      Load, Transform, Extract

    • D.

      None of the above

    Correct Answer
    A. Extract, Transform, Load
    Explanation
    The correct answer is "Extract, Transform, Load". This is the standard methodology followed for building a data warehouse. In this process, data is first extracted from various sources, then transformed to fit the data warehouse schema and business rules, and finally loaded into the data warehouse for analysis and reporting purposes. This methodology ensures that the data is cleansed, integrated, and organized in a consistent and meaningful way, enabling effective data analysis and decision-making.


  • 33. 

    Star schema stores data in the form of

    • A.

      1NF

    • B.

      2NF

    • C.

      3NF

    • D.

      4NF

    Correct Answer
    B. 2NF
    Explanation
    The star schema stores data in the second normal form (2NF). In 2NF, the data is organized in a way that eliminates redundant data by separating them into separate tables. The star schema is a type of dimensional modeling used in data warehousing, where a central fact table is surrounded by multiple dimension tables. This design allows for efficient querying and analysis of data.


  • 34. 

    Fact tables are normalized

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    Fact tables in a data warehouse are typically normalized to reduce redundancy and improve data integrity. Normalization involves breaking down the data into smaller tables and linking them through relationships. This helps in reducing data duplication and ensures that updates or changes to the data are consistent across the database. By normalizing fact tables, it becomes easier to query and analyze the data efficiently. Therefore, the statement "Fact tables are normalized" is true.


  • 35. 

    Static extract is used for ongoing warehouse maintenance

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    The statement is false because a static extract is not used for ongoing warehouse maintenance. A static extract is a one-time extraction of data from a source system to be used for analysis or reporting purposes. It is not meant for ongoing maintenance tasks in a warehouse.


  • 36. 

    For the “point in time” concept in a DWH, history staging is a must

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    The "point in time" concept in data warehousing refers to capturing and storing data as it existed at a specific moment in time. This allows for historical analysis and reporting. In order to implement this concept effectively, history staging is necessary. History staging involves storing historical versions of data, enabling the tracking of changes over time. Without history staging, it would be difficult to accurately capture and analyze data at specific points in time. Therefore, the statement "For 'point in time' concept in DWH, history staging is a must" is true.
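One common way to implement history staging is to keep each version of a record with validity dates, which is what makes point-in-time lookups possible. A simplified sketch with invented data (the customer and cities are hypothetical):

```python
import datetime

# Each change to a record is kept as a new row with valid_from/valid_to dates,
# so the warehouse can answer "what did this record look like on date X?".
history = [
    # (customer, city, valid_from, valid_to) -- valid_to None means "current"
    ("CUST-1", "Pune",   datetime.date(2023, 1, 1), datetime.date(2023, 6, 30)),
    ("CUST-1", "Mumbai", datetime.date(2023, 7, 1), None),
]

def as_of(history, customer, day):
    """Return the version of the record that was valid on the given day."""
    for cust, city, start, end in history:
        if cust == customer and start <= day and (end is None or day <= end):
            return city
    return None

print(as_of(history, "CUST-1", datetime.date(2023, 3, 15)))  # Pune
print(as_of(history, "CUST-1", datetime.date(2023, 8, 1)))   # Mumbai
```

Without such history rows, only the latest state survives and earlier points in time cannot be reconstructed.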


  • 37. 

    A conditional report is based on data that is gathered and then analyzed during report generation, and the results of that analysis appear in the report’s output

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    A conditional report is a type of report that is generated based on data that has been gathered and analyzed. The analysis of the data is done during the report generation process, and the results of the analysis are then reported in the output of the report. This means that the report is dependent on the data and the analysis that has been performed on it. Therefore, the statement "A conditional report is based on data gathered, then analyzed during report generation, which reports out the results of analysis in report's output" is true.

  • 38. 

    Can we use a single dimension in multiple schemas?

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    Yes, a single dimension can be used in multiple schemas. A dimension shared in this way is called a conformed dimension: it carries the same keys and attributes everywhere it appears, which keeps the data consistent, avoids redundancy, and allows facts from different schemas to be analyzed from the same perspective without duplicating the dimension data.

  • 39. 

    A data warehouse must be de-normalized

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    A data warehouse is designed to support complex analytical queries and reporting, rather than transactional processing. Denormalization involves combining multiple tables and duplicating data to improve query performance and simplify data retrieval. By denormalizing the data warehouse, it reduces the number of joins required in queries, which can significantly improve query performance. Therefore, it is true that a data warehouse must be denormalized to optimize its performance for analytical purposes.

  • 40. 

    An ODS is used for day-to-day decisions

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    ODS stands for Operational Data Store, a database designed to support day-to-day decision making. It stores real-time or near-real-time data drawn from various operational systems, making current and accurate data readily available for analysis and reporting. Therefore, the statement is true.

  • 41. 

    Dimensions are normalized

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    The statement is false. In a star schema, dimension tables are deliberately de-normalized: descriptive attributes are flattened into a single table per dimension to keep queries simple and fast. Dimensions are normalized only in a snowflake schema, where they are split into multiple related tables.

  • 42. 

    In a warehouse, data is stored with a bitmap index

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    In a data warehouse, bitmap indexes are commonly used, particularly on low-cardinality columns such as dimension attributes. A bitmap index keeps one bit vector per distinct value of a column; each bit indicates whether the corresponding row holds that value. Queries that combine several conditions can then be answered with fast bitwise operations, which suits the read-heavy query patterns of a warehouse. Therefore, the statement is true.
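    A toy version of the idea in Python (the data is invented; real warehouse engines implement this far more compactly): one bit vector per distinct column value, with multi-column predicates answered by a bitwise AND.

```python
# Sketch: a toy bitmap index. Each distinct value of a low-cardinality
# column gets one bitmap (a Python int used as a bit vector); bit i is
# set when row i holds that value.
from collections import defaultdict

def build_bitmaps(values):
    bitmaps = defaultdict(int)
    for row_id, v in enumerate(values):
        bitmaps[v] |= 1 << row_id
    return bitmaps

region = build_bitmaps(["east", "west", "east", "east"])
status = build_bitmaps(["open", "open", "closed", "open"])

# Rows where region = 'east' AND status = 'open': one bitwise AND.
hits = region["east"] & status["open"]
rows = [i for i in range(4) if hits >> i & 1]
print(rows)  # [0, 3]
```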

  • 43. 

    All databases must be in third normal form

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    The statement is true because the third normal form (3NF) is a level of database normalization that ensures data is organized efficiently and eliminates redundancy. In 3NF, every non-key attribute is dependent on the primary key, and there are no transitive dependencies between non-key attributes. By adhering to 3NF, databases can minimize data duplication and improve data integrity and consistency.
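    The transitive-dependency rule can be made concrete with a small Python sketch (the table and attribute names are invented): in the flat table, dept_location depends on dept rather than on the key emp_id, so 3NF splits the location into its own table.

```python
# Sketch: removing a transitive dependency to reach 3NF.
# emp_id -> dept -> dept_location, so location moves to its own table.
flat = [
    {"emp_id": 1, "dept": "IT", "dept_location": "Pune"},
    {"emp_id": 2, "dept": "IT", "dept_location": "Pune"},
    {"emp_id": 3, "dept": "HR", "dept_location": "Delhi"},
]

employees = [{"emp_id": r["emp_id"], "dept": r["dept"]} for r in flat]
departments = {r["dept"]: r["dept_location"] for r in flat}

# Each location is now stored once per department, not per employee.
print(departments)  # {'IT': 'Pune', 'HR': 'Delhi'}
```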

  • 44. 

    Derived data are detailed, current data that are intended to be single, authoritative data for all decision support applications

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    Derived data refers to data that has been selected, formatted, or aggregated from other sources for end-user decision support. The statement instead describes reconciled data: detailed, current data intended to serve as the single, authoritative source for all decision support applications. Therefore, the statement is false.

  • 45. 

    Every key used to join fact table with dimensional table should be a surrogate key

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    In a data warehouse, a fact table contains the measurements, metrics, or facts of a business process, while a dimensional table provides the context or dimensions for these facts. To establish a relationship between the fact and dimensional tables, a key is used. A surrogate key is a system-generated unique identifier that replaces the natural key in a table. Using surrogate keys ensures that the join between the fact and dimensional tables is efficient and avoids any potential issues with natural keys, such as changes or duplicates. Therefore, every key used to join the fact table with the dimensional table should be a surrogate key.
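    A minimal sketch of surrogate-key assignment during a dimension load (the class and key names are invented for illustration): each natural business key maps to a system-generated integer, so the fact table is insulated from changes or reuse of natural keys.

```python
# Sketch: minting surrogate keys for a dimension. Natural (business)
# keys map to stable, system-generated integers.
from itertools import count

class DimensionLoader:
    def __init__(self):
        self._next = count(1)
        self._by_natural = {}

    def surrogate_for(self, natural_key):
        """Return the existing surrogate key, or mint a new one."""
        if natural_key not in self._by_natural:
            self._by_natural[natural_key] = next(self._next)
        return self._by_natural[natural_key]

dim = DimensionLoader()
print(dim.surrogate_for("CUST-007"))  # 1
print(dim.surrogate_for("CUST-042"))  # 2
print(dim.surrogate_for("CUST-007"))  # 1  (stable across loads)
```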

  • 46. 

    Periodic data are data that are physically altered once added to the store

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    Periodic data are data that are never physically altered or deleted once they have been added to the store: each change is recorded as a new snapshot, so history is preserved. Data that are updated in place, with old values overwritten, are called transient data. Therefore, the statement is false.

  • 47. 

    Star schema is generally suited to online transaction processing and therefore is generally used in operational systems, operational data stores and EDW

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    Star schema is not suited to online transaction processing; it is designed for data warehousing and analytical processing. Operational systems and operational data stores are typically kept in normalized form to support fast, consistent updates. A star schema instead organizes data into a central fact table surrounded by de-normalized dimension tables, which makes querying and analysis efficient, so it is used in data marts and data warehouses rather than in operational systems.

  • 48. 

    Data in data warehouse are retrieved and loaded from operational systems

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    Data in a data warehouse is retrieved and loaded from operational systems. This means that the data warehouse collects and integrates data from various operational systems within an organization. The purpose of this is to provide a centralized and consistent source of data for reporting and analysis purposes. By retrieving and loading data from operational systems, the data warehouse ensures that the information stored is accurate, up-to-date, and suitable for decision-making processes.

  • 49. 

    In a snowflake schema we don’t have any flakes

    • A.

      True

    • B.

      False

    Correct Answer
    B. False
    Explanation
    The statement is false: the snowflake schema gets its name precisely from its "flakes", that is, dimension tables that are further normalized into additional related tables, giving the schema its branching, snowflake-like shape.
  • 50. 

    In a star schema, dimensional tables are usually not in BCNF form

    • A.

      True

    • B.

      False

    Correct Answer
    A. True
    Explanation
    In a star schema, the dimensional tables are designed to optimize query performance by denormalizing the data. This means that redundant data is intentionally introduced into the tables to avoid complex joins and improve query response time. BCNF (Boyce-Codd Normal Form) is a higher level of normalization that eliminates all functional dependencies within a table. However, in a star schema, dimensional tables often contain redundant data and are not in BCNF form to support efficient querying. Therefore, the statement that dimensional tables in star schema are usually not in BCNF form is true.

Quiz Review Timeline +

Our quizzes are rigorously reviewed, monitored and continuously updated by our expert board to maintain accuracy, relevance, and timeliness.

  • Current Version
  • Mar 21, 2023
    Quiz Edited by
    ProProfs Editorial Team
  • Oct 15, 2012
    Quiz Created by
    Lol4198