Enhanced Mixed Load (BW-EML) (Obsolete)

The SAP BW Enhanced Mixed Load Benchmark (BW-EML Benchmark) addresses the demands of typical business warehouse customers. These demands are mainly shaped by three major requirements:

  • Near real-time reporting – Getting instant results from analytical applications on up-to-the-minute data is crucial for timely decision making. Typical examples of real-time data analysis include smart metering and trade promotion management.
  • Ad-hoc reporting capabilities – Over the last few years, data volumes in data warehouses have grown dramatically. One reason for this growth is the increased complexity and detail level of the data, which in turn requires far more sophisticated analysis methods. As a result, analytical applications must make it easy to navigate through huge amounts of data by providing extensive slicing and dicing functionality. This makes it inherently difficult to foresee frequent navigation patterns and pre-calculate intermediate results to speed up reporting performance. Ad-hoc query capabilities are required to satisfy these demands.
  • Reduction of TCO – Today's data warehouses typically comprise tens or hundreds of terabytes of data. It is therefore crucial to keep data redundancy at a low level while still maintaining layered data models. SAP NetWeaver BW 7.30 helps reduce the total cost of ownership by allowing reports to run directly on DataStore objects, the core building blocks of a layered warehousing architecture, often eliminating the need to maintain data redundantly in multi-dimensional InfoCube structures.

The latest addition to the family of SAP BW Application Benchmarks – the BW-EML Benchmark – has been developed with these three customer requirements in mind.

Like its predecessor, the BW-MXL Benchmark, the BW-EML Benchmark combines a multi-user reporting load with delta data that is loaded into the database concurrently with the running queries.

Benchmark Data Model

The data model consists of three InfoCubes and seven DataStore objects. Each of these objects holds the data of one particular year; the three InfoCubes hold the same data as the corresponding DataStore objects for the last three years. Both object types have the same set of fields. The InfoCube comes with a full set of 16 dimensions comprising a total of 63 characteristics, with cardinalities of up to one million distinct values, and one complex hierarchy. With its 30 different key figures, including key figures that require exception aggregation, the InfoCube data model closely follows typical customer data models. In the DataStore object data model, the high-cardinality characteristics have been defined as key fields, while the other characteristics have been modeled as data fields.
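The split described above can be pictured with a small sketch. This is purely illustrative (the object names and year values are invented, not part of the benchmark kit): seven DataStore objects each cover one year, and three InfoCubes mirror the data of the three most recent years with an identical field set.

```python
# Illustrative sketch of the BW-EML InfoProvider layout.
# All names and year values are hypothetical.

YEARS = list(range(2004, 2011))  # seven consecutive years, one object each

# One DataStore object per year.
datastore_objects = {year: f"DSO_{year}" for year in YEARS}

# InfoCubes exist only for the last three years and duplicate the
# corresponding DataStore objects' data with the same set of fields.
infocubes = {year: f"CUBE_{year}" for year in YEARS[-3:]}

print(len(datastore_objects))  # 7
print(len(infocubes))          # 3
```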

Data Volumes

The SAP BW-EML Benchmark can be executed with various data volumes. In its smallest configuration, the benchmark rules require an initial load totaling 500 million records (i.e. 50 million records per InfoCube / DataStore object) from ASCII flat files. Further configurations include initial load volumes of 1,000 million records, 2,000 million records, and more; even larger data volumes can be defined for distributed server landscapes. The total record length in the ASCII files is 873 bytes. In every configuration, the number of records loaded in addition to the initial load is one thousandth of the initial volume. A single benchmark run must last at least one hour, during which the delta data have to be loaded in small chunks every five minutes. Each InfoCube and DataStore object has to be loaded with the same number of records.
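The delta-load arithmetic for the smallest configuration can be worked through in a short sketch. The figures (500 million initial records, one-thousandth delta, five-minute chunks over a one-hour run, ten InfoProviders) come from the rules above; the variable names are illustrative.

```python
# Delta-load arithmetic for the smallest BW-EML configuration (sketch).

INITIAL_RECORDS = 500_000_000       # smallest configuration: 500 million records
PROVIDERS = 3 + 7                   # 3 InfoCubes + 7 DataStore objects
RUN_MINUTES = 60                    # a run lasts at least one hour
CHUNK_INTERVAL_MINUTES = 5          # delta data loaded every five minutes

delta_records = INITIAL_RECORDS // 1000          # one thousandth of the initial load
chunks = RUN_MINUTES // CHUNK_INTERVAL_MINUTES   # delta chunks in one hour
records_per_chunk = delta_records // chunks
records_per_provider = delta_records // PROVIDERS  # equal share per InfoCube/DSO

print(delta_records)         # 500000
print(chunks)                # 12
print(records_per_provider)  # 50000
```

So a one-hour run on the smallest configuration loads 500,000 delta records in 12 chunks, with each of the ten InfoProviders receiving 50,000 records overall.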

Query Model

Eight reports have been defined on two MultiProviders – one MultiProvider for the three InfoCubes, and another MultiProvider for the seven DataStore objects. Since the InfoCubes and DataStore objects have the same set of fields, the respective reports on both MultiProviders are identical, so that there are effectively two sets of four queries each.

Reports select data for one particular year, with the InfoCube or DataStore object containing that year's data picked at random. Within one report, further navigation steps are executed, each of them resulting in an individual query and a database access. Although the first three reports follow similar navigation patterns, the filter and drill-down operations have been randomized to address the demand for ad-hoc queries. Random values for filter parameters ensure that different partitions of the data are accessed, while a random choice of the characteristics used for drill-downs and other slice-and-dice operations ensures that a huge number of different characteristic combinations is covered in a multi-user reporting scenario. To guarantee a high degree of reproducibility of the reporting results, characteristics have been grouped by their respective cardinalities, and only characteristics of the same cardinality are considered for a randomized operation.
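The randomization scheme just described can be modeled with a short sketch. This is not SAP code: the characteristic names, cardinality groups, and year values are invented to show the idea that a random year selects the InfoProvider, and that drill-down characteristics are drawn only from within a group of equal cardinality.

```python
# Illustrative model of a randomized BW-EML navigation step (hypothetical names).
import random

# Characteristics grouped by cardinality; only same-cardinality
# characteristics are interchangeable in a randomized operation.
CHARACTERISTICS_BY_CARDINALITY = {
    100: ["region", "channel", "plant"],
    10_000: ["customer_group", "material_group"],
    1_000_000: ["customer", "material"],
}
YEARS = list(range(2004, 2011))  # seven years, one InfoProvider each

def random_navigation_step():
    """Pick a year (hence an InfoProvider), a drill-down characteristic,
    and a filter value, all at random."""
    year = random.choice(YEARS)
    cardinality = random.choice(list(CHARACTERISTICS_BY_CARDINALITY))
    characteristic = random.choice(CHARACTERISTICS_BY_CARDINALITY[cardinality])
    filter_value = random.randrange(cardinality)  # value within the value range
    return year, characteristic, filter_value
```

Grouping by cardinality keeps individual steps random while making the result sets statistically comparable across runs, which is what allows the reported results to remain reproducible.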

The key figure of this benchmark is the number of ad-hoc navigation steps per hour. Given the differences in the queries and data models, results of the BW-EML Benchmark cannot be compared to those of the BW-MXL Benchmark.

Benchmark Results

BW-EML Internet configuration, SAP NetWeaver 7.30
