Getting Better Performance from a Data Warehouse

A data warehouse has several characteristics, or dimensions, that determine its performance. Often only one dimension is responsible for poor data warehouse performance. For example, narrow network bandwidth may cause delays in moving data from transaction systems and in delivering information to end users, even though data transformation and end-user query execution are not performance issues. To end users, that poor network bandwidth is perceived as poor data warehouse performance.

The following figure shows the characteristics of a high-performance data warehouse: it supports a large number of users, most of whom issue relatively simple queries without any performance issues, and it requires little operations/maintenance time, which keeps it highly available. In such an environment, everyone is satisfied with data warehouse performance. In reality, this seldom happens.

The number of users is not a good indication of data warehouse performance. One hundred users may be issuing simple queries without any problems until one analyst starts a complex analytical job that joins a large number of tables to perform a complex sales trend analysis. Such an exercise can block the data warehouse server for hours. Data warehouse architects must design governing features to control such runaway resource-consuming processes during peak hours.
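As a rough illustration of such a governing feature, here is a minimal Python sketch of a query governor that rejects oversized requests during peak hours; the thresholds, the cost estimator, and the executor callbacks are all assumptions made for illustration, not part of any specific product.

```python
# Minimal sketch of a query governor: reject expensive queries at peak time.
# Thresholds, the cost estimator, and the executor are hypothetical placeholders.
from datetime import datetime

PEAK_HOURS = range(8, 18)       # 08:00-17:59 counts as peak time
PEAK_ROW_BUDGET = 1_000_000     # max estimated rows a query may scan at peak

class QueryRejected(Exception):
    pass

def governed_execute(sql, estimate_rows, execute, now=None):
    """Run execute(sql) only if its estimated cost fits the current budget."""
    now = now or datetime.now()
    estimated = estimate_rows(sql)
    if now.hour in PEAK_HOURS and estimated > PEAK_ROW_BUDGET:
        raise QueryRejected(
            f"Estimated {estimated} rows exceeds the peak-hour budget; "
            "schedule this query for the off-peak batch window."
        )
    return execute(sql)
```

A real governor would rely on the database's own workload management features rather than application code, but the principle of budgeting resource-heavy requests during peak hours is the same.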

The consequence of not having complete order operation information in the data warehouse is that the planning, finance, sales, and marketing organizations will not have a full view of corporate operations, such as product inventories and what to stock to fulfill consumers' demands.

The performance issue here is that extracting complete data sets from OLTP systems and loading that information into a data warehouse all in one step can consume significant OLTP and data warehouse resources (network bandwidth, CPU, and memory) and can lock up source and target data objects.

The solution is to keep the data extraction process out of the daily OLTP maintenance window and to break large extraction processes into multiple tasks. Each task is scheduled several times during regular OLTP business operations to extract new data and move it into an operational data store. The data warehouse is then refreshed once or twice a day by combining all incremental data sets from the operational data store, without touching the OLTP systems.
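The sketch below illustrates that idea in minimal Python: small incremental extracts are pulled from the OLTP source several times a day using a watermark, staged in an operational data store, and consolidated into the warehouse once daily. The function and variable names are assumptions made for illustration, not SAP-specific APIs.

```python
# Minimal sketch: incremental extraction into an operational data store (ODS),
# followed by a once- or twice-a-day warehouse refresh. Names are illustrative.
from datetime import datetime

ods_batches = []                    # staged incremental data sets
last_extracted_at = datetime.min    # watermark of the previous extraction run

def extract_increment(read_oltp_rows_since):
    """Run several times a day: pull only rows changed since the watermark."""
    global last_extracted_at
    now = datetime.now()
    rows = read_oltp_rows_since(last_extracted_at)   # small, cheap OLTP read
    ods_batches.append(rows)
    last_extracted_at = now

def refresh_warehouse(load_into_warehouse):
    """Run once or twice a day: combine staged batches without touching OLTP."""
    combined = [row for batch in ods_batches for row in batch]
    load_into_warehouse(combined)
    ods_batches.clear()
```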

How to Construct a Data Warehouse

The previous post of this blog deals with the basic components of data warehousing. Here we look at the technical issues that have to be taken care of while designing a data warehouse.

Global Information Delivery

The Internet and intranets are the primary vehicles for information delivery. The Web services used to deliver information to end users must be robust in terms of scalability, security, reliability, and cost. It is critical that such Web services integrate seamlessly and provide development/management environments similar to those used for the rest of the data objects in all data warehouse layers.

Global Catalog

The global catalog goes hand in hand with information delivery services. When catalogs are maintained for each individual data warehouse, it is hard to find what you need, and end users spend a lot of time searching for information. Support for one global catalog is a key component of a global information delivery system.

Metadata Management

Metadata is information about the data, such as data source type, data types, content description and usage, transformation/derivation rules, security, and audit control attributes. Access to metadata is not limited to the data warehouse administrator but must also be made available to end users; for example, users want to know how revenue figures are calculated. Metadata also defines the rules used to qualify data before storing it in the database. The end result is a data warehouse that contains complete and clean data.
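As a minimal sketch of what such a metadata record might look like, the example below mirrors the attributes listed above in a simple Python data class; the field names are assumptions for illustration, not a standard metadata schema.

```python
# Minimal sketch of a metadata record mirroring the attributes described above.
# Field names are illustrative, not a standard metadata schema.
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    object_name: str            # e.g. "revenue"
    source_type: str            # e.g. "OLTP billing system"
    data_type: str              # e.g. "DECIMAL(15,2)"
    description: str            # content description and usage
    derivation_rule: str        # how the figure is calculated
    security_profile: str       # who may see it
    audit_attributes: dict = field(default_factory=dict)

revenue = MetadataRecord(
    object_name="revenue",
    source_type="OLTP billing system",
    data_type="DECIMAL(15,2)",
    description="Recognized revenue per sales order item",
    derivation_rule="net price * quantity - discounts - returns",
    security_profile="finance_reporting",
)
```

Exposing records like this to end users answers the "how is this figure calculated?" question without a call to the warehouse administrator.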

Manageability

Today, data objects are distributed across the world and also reside on end-user workstations (laptops), which makes such environments difficult to manage. As part of its Customer Relationship Management (CRM) initiative, SAP is planning a Mobile Sales Automation (MSA) server that integrates and manages data between the SAP Business Information Warehouse and the data sets on a salesperson's laptop.

Adoption of New Technologies

Implementation of an enterprise data warehouse is usually a multi-year project. Technically, as long as you have built your data warehouse on an architecture that uses accepted industry-standard APIs, you should be able to incorporate emerging technologies without extensive reengineering.

If your existing data warehouse environment uses open APIs such as ODBC and ODBO, you can easily join information from relational and multidimensional data sources. ODBO, for example, can integrate multidimensional data from SAP BW and Microsoft's OLAP server. Make sure that the data warehouse can adopt new, emerging technologies with the least amount of work.
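As a hedged illustration of open-API access, the following Python sketch queries a relational warehouse over ODBC using the pyodbc package; the DSN name, credentials, and table/column names are hypothetical placeholders, not part of the original text.

```python
# Minimal sketch: querying a data warehouse over ODBC with pyodbc.
# The DSN "DWH", credentials, and table/column names are hypothetical.
import pyodbc

def fetch_revenue_by_region(year):
    # Connect through an ODBC data source name configured on the client.
    conn = pyodbc.connect("DSN=DWH;UID=report_user;PWD=secret")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT region, SUM(revenue) AS total_revenue "
            "FROM sales_fact WHERE fiscal_year = ? GROUP BY region",
            year,
        )
        return cursor.fetchall()
    finally:
        conn.close()
```

Multidimensional sources would be reached through ODBO/MDX instead, but the point stands: as long as the interface is an open standard, the consuming tools do not need to be reengineered when a new source is added.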

Security

A data warehouse environment must support a very robust security administration by using roles and profiles that are information object behavior-centric rather than pure database-centric. For example, a role such as cost center auditor defined in a data warehouse allows one to view cost center information for a specific business unit for a given quarter but not to print or download cost center information to the workstation.
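A minimal sketch of such an information-object-centric role check is shown below; the role name, actions, and scoping attributes are illustrative assumptions rather than an actual SAP authorization model.

```python
# Minimal sketch of role-based access: a role grants actions on an information
# object within a scope (business unit, quarter). Names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Role:
    name: str
    allowed_actions: frozenset      # e.g. {"view"} but not {"print", "download"}
    business_unit: str
    quarter: str

def is_allowed(role, action, business_unit, quarter):
    return (
        action in role.allowed_actions
        and role.business_unit == business_unit
        and role.quarter == quarter
    )

cost_center_auditor = Role(
    name="cost_center_auditor",
    allowed_actions=frozenset({"view"}),
    business_unit="BU-1000",
    quarter="2024-Q1",
)

assert is_allowed(cost_center_auditor, "view", "BU-1000", "2024-Q1")
assert not is_allowed(cost_center_auditor, "download", "BU-1000", "2024-Q1")
```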

Reliability and Availability

New data warehouse construction products provide methods to keep systems highly available during data refresh. Products like SAP BW refresh a copy of an existing data object with incoming new data while end users keep using the existing information object. Once the new information object is completely refreshed, all new requests are automatically pointed to it. Such technologies must be an integral component of enterprise data warehouses.
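A minimal sketch of that switch-over pattern follows: readers use a logical alias while a shadow copy is refreshed in the background, and the alias is flipped atomically once the refresh completes. This is a generic illustration, not SAP BW's actual implementation.

```python
# Minimal sketch of a refresh/switch-over pattern: readers use the active copy
# while a shadow copy is rebuilt, then the alias is flipped atomically.
import threading

class SwitchableObject:
    def __init__(self, initial_data):
        self._copies = {"a": initial_data, "b": None}
        self._active = "a"
        self._lock = threading.Lock()

    def read(self):
        # End users always read the currently active copy.
        return self._copies[self._active]

    def refresh(self, new_data):
        # Populate the inactive copy while readers keep using the active one.
        inactive = "b" if self._active == "a" else "a"
        self._copies[inactive] = new_data
        with self._lock:
            self._active = inactive   # switch only after the copy is complete

sales = SwitchableObject(["old rows"])
sales.refresh(["new rows"])
assert sales.read() == ["new rows"]
```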

Upgradability and Continual Improvements

If any component of a data warehouse (database management system, hardware, network, or software) needs upgrades, it must not lock out users, preventing them from doing their regular tasks. Moreover, any time a new functionality is added to the environment, it must not disrupt end-user activities. One can apply certain software patches or expand hardware components (storage) while users are using the data warehouse environment. During such upgrades, end users may notice some delays in retrieving information, but they are not locked out of the system.

Scalability and Data Distribution

Due to large data volume movement requirements, data warehouses consume enormous network resources (four to five times more than a typical OLTP transaction environment). In an OLTP environment, one can predict network bandwidth requirements because the data content associated with each transaction is somewhat fixed. In data warehousing, it is very hard to estimate the network bandwidth needed to meet end-user needs because the data volume may change for each request based on the data selection criteria. Large data sets are distributed to remote locations across the world to build local data marts, so the network must be scalable to accommodate large data movement requests.

To build dependent data marts, you need to extract large data sets from a data warehouse and copy them to a remote server. A data warehouse must support very robust and scalable services to meet data distribution and/or replication demands. These services also provide encryption and compression methods to optimize usage of network resources in a secured fashion.
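As a minimal sketch of that idea, the example below compresses a data set and encrypts it before shipping it to a remote data mart. It assumes the third-party cryptography package for symmetric encryption, and the key handling is deliberately simplified for illustration.

```python
# Minimal sketch: compress and encrypt a data set before distributing it to a
# remote data mart. Uses gzip plus the third-party `cryptography` package;
# key management is deliberately simplified for illustration.
import gzip
from cryptography.fernet import Fernet

def pack_for_distribution(rows, key):
    payload = "\n".join(rows).encode("utf-8")
    compressed = gzip.compress(payload)          # reduce network usage
    return Fernet(key).encrypt(compressed)       # protect data in transit

def unpack_at_data_mart(blob, key):
    compressed = Fernet(key).decrypt(blob)
    return gzip.decompress(compressed).decode("utf-8").split("\n")

key = Fernet.generate_key()
shipped = pack_for_distribution(["row1,100", "row2,250"], key)
assert unpack_at_data_mart(shipped, key) == ["row1,100", "row2,250"]
```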

This can be demonstrated as shown below.




Data Warehouse Components

Data Provider Layer

The Data Provider layer is the primary gateway to the data sources (OLTP applications or other). Major tasks performed at this layer provide an environment in which to construct subject-oriented data analysis models.

Metadata (data about the data) is pulled into the data warehouse environment from its data sources. Extraction, transformation, and transport services fetch data from the data sources, qualify it, perform value-added data manipulation, and push it out to data warehouse data objects (a minimal sketch follows the list below). Key services performed at this layer are the following:

  1. Data Transport

  2. Data Transformation

  3. Data Cleansing

  4. Data Extraction

  5. Subject Models
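A minimal end-to-end sketch of these extraction, cleansing, transformation, and transport steps is shown below; the source records, qualification rules, and target are simple placeholders rather than real warehouse APIs.

```python
# Minimal sketch of the Data Provider layer: extract, cleanse, transform,
# and transport records into a warehouse target. All names are placeholders.
def extract(source_rows):
    # Pull raw records from the data source (OLTP extract, flat file, ...).
    return list(source_rows)

def cleanse(rows):
    # Drop records that fail basic qualification rules.
    return [r for r in rows if r.get("order_id") and r.get("amount") is not None]

def transform(rows):
    # Apply value-added derivations, e.g. normalize amounts to two decimals.
    return [{**r, "amount": round(float(r["amount"]), 2)} for r in rows]

def transport(rows, target):
    # Push the prepared records out to the warehouse data object.
    target.extend(rows)
    return len(rows)

warehouse_target = []
raw = [{"order_id": 1, "amount": "10.504"}, {"order_id": None, "amount": "5"}]
loaded = transport(transform(cleanse(extract(raw))), warehouse_target)
assert loaded == 1 and warehouse_target[0]["amount"] == 10.5
```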

Service Provider Layer

The Service Provider layer is responsible for managing and distributing data objects across the Enterprise to support business intelligence activities in a controlled and secured fashion. At this layer, data is further transformed for specific data analysis tasks such as drill-down analysis and predefined reports integration with third-party subscribed data.

Moreover, data/text-mining services transform data into knowledge by applying analytical techniques or by combining structured and unstructured data, for example by authoring Web pages that merge product sales data, charts, graphs, and product images. This is a very complex layer within the data warehouse architecture. Key services performed at this layer are the following:

  1. Analytical Applications Integration

  2. Data Distribution

  3. Data Profiling

  4. Data Partitioning

  5. Information Authoring

  6. Data Consolidation

  7. Data Staging

  8. Data Storage
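As one small illustration of the data partitioning and staging services named above, the sketch below splits records into monthly partitions before they are staged; the partition key (posting date) is an assumption chosen for illustration.

```python
# Minimal sketch of data partitioning for staging: group records by a monthly
# partition key before they are stored. The key choice is illustrative.
from collections import defaultdict

def partition_by_month(rows):
    partitions = defaultdict(list)
    for row in rows:
        # row["posting_date"] is expected in "YYYY-MM-DD" form.
        partitions[row["posting_date"][:7]].append(row)
    return dict(partitions)

staged = partition_by_month([
    {"posting_date": "2024-01-15", "amount": 100},
    {"posting_date": "2024-02-03", "amount": 250},
    {"posting_date": "2024-01-20", "amount": 75},
])
assert sorted(staged) == ["2024-01", "2024-02"]
```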

Information Consumer Layer

The Information Consumer layer accesses information objects from a data warehouse. Information delivery services, provided by service providers, must be robust enough to handle large data volumes and multimedia objects. Keep in mind that not all end users have the same needs. Some users simply want to look at a list of numbers.

Some may want to have push-button models to see charts and graphs, and analysts may need access to large volumes of data to do extensive data analysis. Data access and delivery services must be robust enough to handle all such scenarios. Key services performed at this layer are the following:

  1. Information Presentation

  2. Search Engines

  3. End-User Data Synchronization

  4. Data Conversions

  5. Information Access APIs

  6. Information Consumer Profiling

  7. Global Catalogs

  8. Information Delivery

Data Warehouse Management Layer

The Data Warehouse Management layer provides services to manage all data objects in all layers. Additional services at this layer include component installation, monitoring, tuning, scheduling, networking, database operations, and component problem tracking/analysis that can be performed globally. Key services performed at this layer are the following:

  1. Governing Services

  2. Track Resource Utilization

  3. Audit and Controls

  4. Scheduling

  5. Client Profile

  6. Multi-Tiered Models

  7. Warehouse Operations

  8. Source/Target Management

  9. Data Dictionary

  10. Development Management

  11. Hardware/Software

  12. Security

  13. Metadata




Procurement to Payables Process

Procurement/purchasing activities

The procurement process commences with determining the requirement for materials/services. A purchase requisition should be raised for all goods/services that the organisation procures. Requisitions may be raised with reference to a contract or outline agreement which specifies a certain volume of a material that should be purchased from a particular vendor.

The system can also suggest an appropriate vendor for the material/service being procured via source determination (a vendor may be selected from a source list). The ability to approve/release purchase requisitions should be controlled via release procedures in the SAP system to ensure that only employees with the appropriate authority can authorise a purchasing transaction. Where appropriate, the system should enforce a tender evaluation process via ‘request for quotation’ functionality.

Purchase orders can then be created based on the requisitions and any related quotations. Purchase orders should not be created without reference to a purchase requisition. Any changes to the purchase requisition or purchase order should be subject to the appropriate approval procedures. Appropriate reports should be used to monitor long outstanding purchase orders. Master data relevant to the procurement cycle includes the vendor master and material master files.
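A minimal sketch of such a release (approval) check follows; the delegation limits and role names are invented for illustration and do not reflect an actual SAP release strategy configuration.

```python
# Minimal sketch of a requisition release check: an approver may only release
# requisitions up to their delegated limit. Limits and roles are illustrative.
APPROVAL_LIMITS = {
    "team_lead": 5_000,
    "department_head": 50_000,
    "cfo": float("inf"),
}

def can_release(requisition_value, approver_role):
    limit = APPROVAL_LIMITS.get(approver_role, 0)
    return requisition_value <= limit

assert can_release(3_200, "team_lead")
assert not can_release(12_000, "team_lead")
assert can_release(12_000, "department_head")
```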

Goods receipt

A goods receipt or the entry and acceptance of services should be performed for each purchase order. The goods receipt should be processed with reference to the corresponding purchase order. The entry and acceptance of services should be separated to ensure the appropriate authorization of services accepted for payment.

Invoice processing

Invoices are processed with reference to the appropriate purchase order and goods receipt transactions via the invoice verification transaction. Invoices that do not have a valid purchase order can be processed separately via the financial accounting accounts payable module. Any invoices that do not match the purchase order and goods receipt details within defined tolerances are automatically blocked for payment.
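A minimal sketch of that three-way match (purchase order against goods receipt against invoice, within defined tolerances) is shown below; the tolerance values and field names are assumptions for illustration.

```python
# Minimal sketch of a three-way match: block an invoice for payment if its
# quantity or price deviates from the PO and goods receipt beyond a tolerance.
# Tolerance values and field names are illustrative.
QTY_TOLERANCE = 0          # no quantity deviation allowed
PRICE_TOLERANCE = 0.02     # 2% price deviation allowed

def is_blocked(po, goods_receipt, invoice):
    qty_ok = invoice["qty"] <= goods_receipt["qty"] + QTY_TOLERANCE
    price_ok = abs(invoice["unit_price"] - po["unit_price"]) <= (
        po["unit_price"] * PRICE_TOLERANCE
    )
    return not (qty_ok and price_ok)

po = {"unit_price": 10.00}
gr = {"qty": 100}
assert not is_blocked(po, gr, {"qty": 100, "unit_price": 10.10})  # within tolerance
assert is_blocked(po, gr, {"qty": 110, "unit_price": 10.00})      # over-billed quantity
```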

Invoices that do not relate to a valid purchase order in the system (eg utility payments) should be processed via the financial accounting accounts payable module. These invoices are not subject to the electronic approval enforced at the requisition stage and can be subject to authorisation controls at the invoice stage via payment release procedures used in conjunction with the ‘park and post’ functionality and SAP Workflow, to ensure that all invoices beyond a certain amount are authorised in accordance with approved delegation levels.

Payment processing

Payment runs should be performed on a regular basis and all payment reports, including payment exceptions (ie. blocked invoices), should be reviewed to ensure all payments are reasonable. Payments and the claiming of prompt payment discounts are driven by the payment terms, which should be entered in the vendor master record and should not be changed in the purchase order or invoice. Manual cheques should not be used, as the payment program can be run as frequently as required to process payments. This ensures an adequate level of control over payments and reduces the risk of unauthorized payments.
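A minimal sketch of how payment terms drive due dates and prompt-payment discounts follows, using the common "2% within 10 days, otherwise net 30" style of terms as an assumed example.

```python
# Minimal sketch: derive the payable amount from payment terms such as
# "2% discount if paid within 10 days, otherwise net 30". Values illustrative.
from datetime import date, timedelta

def payment_amount(invoice_amount, invoice_date, pay_date,
                   discount_pct=2.0, discount_days=10, net_days=30):
    if pay_date > invoice_date + timedelta(days=net_days):
        raise ValueError("Payment is past due")
    if pay_date <= invoice_date + timedelta(days=discount_days):
        return round(invoice_amount * (1 - discount_pct / 100), 2)
    return invoice_amount

inv_date = date(2024, 3, 1)
assert payment_amount(1000.0, inv_date, date(2024, 3, 8)) == 980.0    # discount taken
assert payment_amount(1000.0, inv_date, date(2024, 3, 25)) == 1000.0  # full amount due
```

Because the payable amount is derived from the terms held on the vendor master, there is no need to override the terms on individual purchase orders or invoices.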


SAP XI: What Is It?

SAP XI is an integration technology and platform…
  1. …for SAP and non-SAP applications.
  2. …for A2A and B2B scenarios.
  3. …for asynchronous and synchronous communication.
  4. …for cross-component Business Process Management.
A goal of the Exchange Infrastructure (XI) is to provide a single point of integration for all systems, SAP and non-SAP, inside and outside the corporate boundary.

1. The XI supports B2B as well as A2A exchanges, supports synchronous and asynchronous message exchange, and includes a built-in engine for designing and executing Integration Processes (Business Processes).

2. An important feature of the XI is that it provides openness and transparency to the integration process.

SAP NetWeaver is the integration and application platform for mySAP solutions; XI represents the Process Integration layer of the NetWeaver stack, and is a crucial element of the Enterprise Services Architecture (ESA).

It unifies and aligns people, information, and business processes, and it:

1. Integrates across technologies and organizational boundaries
2. Is a safe choice with full .NET and J2EE interoperability

As the business foundation for SAP and partners, it:

1. Powers business-ready solutions that reduce custom integration

2. Increases business process flexibility through its Enterprise Services Architecture

Related Posts :

SAP XI File adapter
XI adapter flow deployment of variants
EDI converter for SAP and EDI standards


SAP XI File Adapter

The file adapter is used to read and write files at the OS level. The OS-level directory must be accessible by the service user of the Adapter engine, with the appropriate read/write permissions.

The file adapter can also act as an FTP client, meaning that it can perform GET and PUT in the sender and receiver case, respectively.

File Adapter Sender:

When a queue name is specified, it gets created in XI automatically.
File Processing Parameters:

Quality of Service
  1. Best Effort
  2. Exactly Once (EO)
  3. Exactly Once In Order (EOIO): specify a queue name in this case

Poll Interval

Processing Mode
  1. Archive (with timestamp possible)
  2. Set to Read Only
  3. Delete
  4. Test

Processing Sequence

Operating System Command
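The sketch below is a rough, generic illustration of that sender-side behavior (polling a directory at an interval, then archiving processed files with a timestamp); it is plain Python, not the XI file adapter's actual implementation, and the directory names are placeholders.

```python
# Minimal, generic sketch of sender-side file polling: pick up files from a
# source directory, hand them to a processing step, then archive them with a
# timestamp. This is not the XI file adapter itself; paths are placeholders.
import os
import shutil
import time
from datetime import datetime

SOURCE_DIR = "/data/outbound"       # hypothetical OS-level directory
ARCHIVE_DIR = "/data/archive"
POLL_INTERVAL_SECONDS = 60

def poll_once(process):
    for name in sorted(os.listdir(SOURCE_DIR)):      # processing sequence: by name
        path = os.path.join(SOURCE_DIR, name)
        with open(path, "rb") as handle:
            process(handle.read())                   # hand the payload downstream
        stamp = datetime.now().strftime("%Y%m%d%H%M%S")
        shutil.move(path, os.path.join(ARCHIVE_DIR, f"{name}.{stamp}"))  # archive mode

def run(process):
    while True:
        poll_once(process)
        time.sleep(POLL_INTERVAL_SECONDS)
```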

Related Posts :

XI adapter flow deployment of variants
EDI converter for SAP and EDI standards



SAP XI Adapter Flow: Deployment Variants

Three possibilities for deploying the adapter framework (deployment variants):
  1. Central adapter engine (included by default in the Integration Server installation)
  2. Local adapter engine (0 or more based on customer requirements)
  3. PCK (for B2B communication)
The advantage is that the three operate in a consistent way, which provides for easier configuration, maintenance and monitoring.

Related Post

SAP XI connectivity
SAP XI adapter message flow