Data Modeling Frameworks

Last updated Aug 5, 2024

Besides Data Modeling Tools, there are also helpful frameworks that guide you in modeling your data by asking the right questions.

# ADAPT™ for OLAP

ADAPT argues that existing data modeling techniques like ER and dimensional modeling are not sufficient for OLAP database design. That’s why ADAPT is a modeling technique designed specifically for OLAP databases: it addresses the unique needs of OLAP data modeling. The basic building blocks of ADAPT are cubes and dimensions, which are the core objects of the OLAP multidimensional data model.

Although ADAPT was created for OLAP cubes back in the day, most of its techniques and concepts also apply to regular data modeling today. There are nine ADAPT database objects; the legend below shows their symbols along with simple examples of how to use them.

![Legend of ADAPT Framework | Source unknown](/blog/data-modeling-for-data-engineering-architecture-pattern-tools-future/images/adapt-legend.jpg)
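
To make the cube-and-dimension vocabulary concrete, here is a minimal Python sketch (not part of ADAPT itself; the `Sales` cube, its dimensions, and the hierarchy levels are made up for illustration) of how a cube is described by the dimensions that span it and their drill-down hierarchies:

```python
from dataclasses import dataclass, field

# Hypothetical, simplified representation of ADAPT's two core objects:
# a dimension (with an optional drill-down hierarchy) and a cube that
# is defined by the dimensions spanning it plus the measures it holds.

@dataclass
class Dimension:
    name: str
    hierarchy: list[str] = field(default_factory=list)  # levels from coarse to fine

@dataclass
class Cube:
    name: str
    dimensions: list[Dimension]
    measures: list[str]

# Illustrative example: a sales cube spanned by time, product, and geography.
sales_cube = Cube(
    name="Sales",
    dimensions=[
        Dimension("Time", hierarchy=["Year", "Quarter", "Month", "Day"]),
        Dimension("Product", hierarchy=["Category", "Subcategory", "SKU"]),
        Dimension("Geography", hierarchy=["Country", "Region", "City"]),
    ],
    measures=["units_sold", "revenue"],
)

for dim in sales_cube.dimensions:
    print(dim.name, "->", " > ".join(dim.hierarchy))
```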

# Why ADAPT over ER and dimensional modeling

ADAPT is considered superior to ER and dimensional modeling for several reasons:

  1. Incorporation of Both Data and Process: ADAPT incorporates both data and process in its approach, which is particularly useful for designing (OLAP) data marts.
  2. Logical Modeling: ADAPT emphasizes logical modeling. This prevents the designer from jumping to solutions before fully understanding the problem.
  3. Enhanced Communication: ADAPT enhances communication among project team members, providing a common basis for discussion. This improved communication leads to higher-quality software applications and data models.
  4. Comprehensive Representation: ADAPT allows for the representation of an (OLAP) application in its entirety without compromising the design due to the limitations of a modeling technique designed for another purpose.

In summary, ADAPT is said to be a more flexible, comprehensive, and communication-enhancing modeling technique for OLAP databases compared to ER and dimensional modeling.

# BEAM for Agile Data Warehousing

BEAM, or Business Event Analysis & Modeling, is a method for agile requirements gathering designed specifically for data warehouses, created by Lawrence Corr and Jim Stagnitto in the book Agile Data Warehouse Design. BEAM centers requirements analysis around business processes instead of solely focusing on reports.

It uses an inclusive, collaborative modeling notation to document business events and dimensions in a tabular format. This format is easily understood by business stakeholders and easily implemented by developers. The idea is to facilitate interaction among team members, enabling them to think dimensionally from the get-go and foster a sense of ownership among business stakeholders.

One of BEAM’s core principles is asking the right questions: Lawrence Corr emphasizes the importance of these questions, or “data stories.” For instance, a customer’s product purchase could trigger questions about the order date, purchase and delivery locations, the quantity bought, the purchase reason, and the buying channel. Carefully addressing these questions forms a comprehensive picture of the business process and provides the basis for technical specifications.
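
As a rough, hypothetical illustration (not taken from the book), the data story above could be written down in BEAM’s tabular, example-driven spirit like this, with each column answering one of the questions and each row being a concrete example event:

```python
# Hypothetical sketch of a BEAM-style event table for the story
# "customer buys product". Each column answers one of the questions
# raised above (when, where, how many, why, how); each row is a
# concrete example rather than an abstract definition.
purchase_events = [
    {
        "customer": "Alice",            # who
        "product": "Espresso machine",  # what
        "order_date": "2024-07-14",     # when
        "delivery_location": "Berlin",  # where
        "quantity": 1,                  # how many
        "reason": "replacement",        # why
        "channel": "web shop",          # how
    },
    {
        "customer": "Bob",
        "product": "Coffee beans 1kg",
        "order_date": "2024-07-15",
        "delivery_location": "Hamburg",
        "quantity": 3,
        "reason": "restock",
        "channel": "mobile app",
    },
]

# Columns that answer a question become candidate dimensions;
# numeric answers such as quantity become candidate facts/measures.
for event in purchase_events:
    print(event)
```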

# Common Data Model

Common Data Models are examples of standard data models, so you do not need to start from scratch. The concept behind these approaches is to transform data from the source databases into a standard format (data model) and a common representation (terminologies, vocabularies, coding schemes), and then perform systematic analyses using a shared library of routines.
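
As a minimal sketch of that idea (the source fields, the target schema, and the mapping below are all made up for illustration), adopting a common data model boils down to translating source-specific fields and codes into the shared schema and vocabulary:

```python
# Hypothetical sketch: map a source record into a (made-up) common data
# model by renaming fields and translating local codes into a shared
# vocabulary, so downstream analyses can run unchanged across sources.

COMMON_GENDER_CODES = {"m": "MALE", "f": "FEMALE"}  # shared vocabulary

def to_common_customer(source_row: dict) -> dict:
    """Translate one source-specific customer record into the common model."""
    return {
        "customer_id": source_row["cust_no"],
        "gender": COMMON_GENDER_CODES.get(source_row["sex"], "UNKNOWN"),
        "region": source_row["sales_area"].upper(),
    }

source_row = {"cust_no": 42, "sex": "f", "sales_area": "emea"}
print(to_common_customer(source_row))
# {'customer_id': 42, 'gender': 'FEMALE', 'region': 'EMEA'}
```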

Every model, for example, needs common dimensions such as customer, region, etc. Some references I found are below:

More on frameworks and data modeling can be found in Data Modeling - The Unsung Hero of Data Engineering: Architecture Pattern, Tools and the Future (Part 3) | ssp.sh.


Origin: Data Modeling
References:
Created 2024-08-05