Recently I worked on a requirement for a report that shows the volume of data loaded into the BW system from SAP ECC for every extractor, as seen in ODQMON.
This helps create a technical reconciliation report where one can see how much data is loaded for every datasource. It gives insight into the daily load volume, so any missing data can be detected easily.
ODQDATA – Operational Delta Queue: Data Store for Delta
ODQDATA (Operational Delta Queue: Data Store for Delta) is a standard table in SAP ERP (R/3) systems that stores the delta data of the Operational Delta Queue.
ODQDATA contains the data of the delta requests for each datasource that has already been loaded into the target system, the same information shown in transaction ODQMON.
Please see the screenshot below of what the table entries in ODQDATA look like.
Table ODQDATA
- The table ODQDATA contains the information about the delta data loaded into the respective subscriber for every datasource.
- The field QUEUENAME is the name of the datasource.
- The field LINES represents the number of records loaded for that delta run.
- TID is a standard field in table ODQDATA that stores the unique transaction ID. The transaction ID identifies a transaction from the perspective of the delta queue. Its first 14 characters are a timestamp in UTC, indicating when the record was created (see the short sketch after this list).
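As a small illustration of that format (the TID value below is invented, not taken from a real system), the timestamp and counter parts of a TID can be split off exactly as the extractor logic later does:
Code:
report ztid_format_demo.

* Hypothetical TID value, shown only to illustrate the format:
* 14-digit UTC timestamp, a separator, and a 9-digit counter.
data: lv_tid       type c length 24 value '20230510063045.000000123',
      lv_timestamp type c length 14,
      lv_counter   type c length 9.

lv_timestamp = lv_tid+0(14).   "load timestamp in UTC (YYYYMMDDHHMMSS)
lv_counter   = lv_tid+15(9).   "system-generated counter

write: / 'Timestamp (UTC):', lv_timestamp,
       / 'Counter        :', lv_counter.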
See the screenshot below comparing T-Code ODQMON with table ODQDATA.
T-Code ODQMON Vs Table ODQDATA
The screenshot above compares the ODQDATA table with transaction ODQMON. The LINES value in table ODQDATA corresponds to the number of rows of data shown in ODQMON.
We can match the Smallest TSN in transaction ODQMON with the TID field in table ODQDATA. Note that TID is stored in UTC, while the Smallest TSN in ODQMON is shown in local time (CET, or CEST during daylight saving), hence the two-hour difference in the screenshot.
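As a rough illustration of that offset (the timestamp is invented, and the two-hour gap assumes daylight saving time is in effect for the CET zone), the TID timestamp can be converted like this:
Code:
report zutc_cet_demo.

data: lv_ts_utc   type timestamp,
      lv_date_loc type d,
      lv_time_loc type t.

* TID timestamps are stored in UTC.
lv_ts_utc = '20230510063045'.

* CONVERT TIME STAMP interprets the value as UTC and returns the date
* and time in the given time zone; with daylight saving active this
* yields 08:30:45 local time for 06:30:45 UTC.
convert time stamp lv_ts_utc time zone 'CET'
        into date lv_date_loc time lv_time_loc.

write: / 'UTC  :', lv_ts_utc,
       / 'Local:', lv_date_loc, lv_time_loc.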
ODQ Tables
The ODQ uses the following three tables to store data:
- ODQDATA_C – contains compressed init request data.
- ODQDATA_F – contains compressed full request data.
- ODQDATA – contains compressed delta request data.
We use the table ODQDATA for our requirement, as it contains the delta request information. To fulfil the requirement, we developed a custom function module based datasource to get the required result.
Development:-
The datasource is delta-enabled on a timestamp (TSTAMPS), which is used to obtain the data from the TID field of the ODQDATA table.
Delta on TSTAMPS
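For orientation, the snippets that follow sit inside the extraction function module of this generic datasource. A minimal skeleton, loosely modelled on SAP's RSAX_BIW_GET_DATA_SIMPLE template (the function name Z_BW_GET_ODQDATA and the extract structure name ZODQ_S_DATA are invented here), might look roughly like this:
Code:
function z_bw_get_odqdata.
* Interface as in the template RSAX_BIW_GET_DATA_SIMPLE:
*   importing  i_requnr, i_dsource, i_maxsize, i_initflag
*   tables     i_t_select, i_t_fields,
*              e_t_data structure zodq_s_data   "extract structure (hypothetical)
*   exceptions no_more_data, error_passed_to_mess_handler

* Variables kept across the data-package calls, as in the template.
  statics: s_s_if              type srsc_s_if_simple,
           s_counter_datapakid like sy-tabix,
           s_cursor            type cursor.

  if i_initflag = sbiwa_c_flag_on.   "constant from type pool SBIWA, as in the template
*   Initialization call: remember request parameters and selections.
    s_s_if-requnr  = i_requnr.
    s_s_if-dsource = i_dsource.
    s_s_if-maxsize = i_maxsize.
    append lines of i_t_select to s_s_if-t_select.
    append lines of i_t_fields to s_s_if-t_fields.
  else.
    if s_counter_datapakid = 0.
*     First data package: build the TID selection from TSTAMPS and
*     OPEN CURSOR on ODQDATA (see the snippets below).
    endif.
*   Every call: FETCH the next package and fill E_T_DATA (see below).
  endif.

endfunction.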
The TID field in the table is a 23-character unique transaction ID generated at runtime; it consists of the timestamp of the data load (first 14 digits) and a system-generated number (last 9 digits).
In the function module, the TSTAMPS selection is converted into the same format as TID and passed as a selection on the TID field of table ODQDATA to extract the required records.
Please find the logic below:-
Code:
* Read the TSTAMPS selection passed in for the delta request.
loop at s_s_if-t_select into l_s_select where fieldnm = 'TSTAMPS'.
  move-corresponding l_s_select to l_r_tstamps.
  timestamp2 = l_r_tstamps-low.
  append l_r_tstamps.
endloop.

* Keep only the first 14 digits (YYYYMMDDHHMMSS) of the timestamp.
if timestamp2 is not initial.
  timestamp3 = timestamp2+0(14).
endif.

* Append the suffix so the value matches the TID format.
concatenate timestamp3 '.000000000' into timestamp4.

* Build the selection range on TID: everything newer than the last delta.
loop at s_s_if-t_select into l_s_select where fieldnm = 'TSTAMPS'.
  move-corresponding l_s_select to l_r_tid.
  l_r_tid-option = 'GT'.
  l_r_tid-low = timestamp4.
  append l_r_tid.
endloop.
As shown in the code above, TSTAMPS is converted into the TID format by concatenating the required suffix into l_r_tid, which is then passed as a selection against the table. For example, a delta timestamp of 20230510063045 would result in the selection TID > '20230510063045.000000000'.
Please find the function module logic below:-
Code:
* The OPEN CURSOR below runs only for the first data package
* (inside IF s_counter_datapakid = 0, not shown in this excerpt).
    open cursor with hold s_cursor for
      select (s_s_if-t_fields)
        from odqdata
        where queuename in l_r_queuename
          and modelname in l_r_modelname
          and tid       in l_r_tid.
  endif.  "First data package?

* Fetch records into the interface table
* named E_T_'name of extract structure'.
  fetch next cursor s_cursor
    appending corresponding fields
    of table it_odqdata
    package size s_s_if-maxsize.

  if sy-subrc <> 0.
    close cursor s_cursor.
    raise no_more_data.
  endif.

  if it_odqdata[] is not initial.
    unassign <fs_odqdata>.
    clear w_t_data.

*   Delete the records created by this extractor itself (queuename Z**_ODQDATA).
    delete it_odqdata where queuename = 'Z**_ODQDATA'.

*   Populate the final values in E_T_DATA.
    loop at it_odqdata assigning <fs_odqdata>.
*     First 14 characters of TID = load timestamp.
      timestamp = <fs_odqdata>-tid.
      <fs_odqdata>-tstamps = timestamp+0(14).

*     Convert the UTC timestamp from ODQDATA into the local time zone (CET).
      convert time stamp <fs_odqdata>-tstamps time zone sy-zonlo
        into date startdate time starttime.
      concatenate startdate starttime into timestamp1.

      move:
        timestamp1             to w_t_data-tstamps,
        <fs_odqdata>-tid       to w_t_data-tid,
        <fs_odqdata>-unitno    to w_t_data-unitno,
        <fs_odqdata>-modelname to w_t_data-modelname,
        <fs_odqdata>-queuename to w_t_data-queuename,
        <fs_odqdata>-lines     to w_t_data-lines,
        <fs_odqdata>-rawsize   to w_t_data-rawsize,
        <fs_odqdata>-compsize  to w_t_data-compsize.

      append w_t_data to e_t_data.
    endloop.
  endif.

  s_counter_datapakid = s_counter_datapakid + 1.
endif.
Explanation of Code:-
- The selection on the table is made using the range l_r_tid on the field TID.
- The entries in ODQDATA created by running this extractor itself are deleted.
- The timestamp in table ODQDATA is in UTC; it is converted into CET in the final result.
The rest of the code cleans the data and populates the result table.
Result of the Extractor
Extractor Result
The data will be exposed to a reporting tool, where the number of records (LINES) is compared with the actual data loaded for the specific extractor.
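As an illustration of how that comparison could be prepared (this is a hypothetical sketch, not part of the delivered solution; the field names mirror the extract structure above, while the report, type, and table names are invented), the extracted records could be summed per datasource and day before being matched against the target record counts:
Code:
report zodq_recon_sketch.

* Simplified copy of the fields delivered by the Z**_ODQDATA extractor.
types: begin of ty_odq,
         queuename type c length 30,   "datasource name
         tstamps   type c length 14,   "load timestamp (CET), YYYYMMDDHHMMSS
         lines     type i,             "number of records in the delta unit
       end of ty_odq,

       begin of ty_recon,
         queuename type c length 30,
         loaddate  type d,
         lines     type i,
       end of ty_recon.

data: lt_odq   type standard table of ty_odq,
      lt_recon type standard table of ty_recon,
      ls_recon type ty_recon.

field-symbols: <ls_odq> type ty_odq.

* lt_odq would be filled with the result of the Z**_ODQDATA extraction.
loop at lt_odq assigning <ls_odq>.
  ls_recon-queuename = <ls_odq>-queuename.
  ls_recon-loaddate  = <ls_odq>-tstamps+0(8).   "date part of the timestamp
  ls_recon-lines     = <ls_odq>-lines.
  collect ls_recon into lt_recon.               "sums LINES per datasource and day
endloop.

* lt_recon now holds the total LINES per datasource per day, ready to be
* compared with the record count loaded in the target system.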
Delta Type for ODQ Extractors
The Operational Delta Queue is mainly used with the following delta types for ODP extractors:
Delta Type ‘D’ – For ODP extractors with delta type ‘D’, the SAP application writes delta records directly into the ODQ (PUSH), and therefore also into the ODQDATA table. Each data record is either stored in the ODQ individually when the corresponding transaction is saved or updated in the application (for example, direct delta in the LO Cockpit), or written to the ODQ in groups of delta records after the transaction update by application-specific jobs (LO queued delta). In either case, the delta records are in the ODQ, and thus in the ODQDATA table of the SAP source system, before the delta update is requested by the target system (for example, BW/4HANA). When a delta update is requested for the DataSource, the ODQ is read and the data records that exist there are transferred to the target system (for example, BW).
Example: 2LIS_* Datasources
Delta Type ‘E’ – The ODP datasource determines the delta through the extractor on request (PULL). The extractor must be capable of providing the delta records for the DataSource on demand. The delta records that are determined are placed in the ODQ, and hence in the ODQDATA table, by the extractor and are transferred from there to the requesting target system (for example, BW).
Example: Finance datasources, master data attribute datasources, or custom datasources.
ODQ Delta Type
The figure above compares two SAP BI DataSources with different delta processes and shows how the ODQ is filled with delta records:
- Datasource 2LIS_02_ITM (Purchasing data (item level)) with delta type D (push):
- As this is delta type PUSH, the delta data records from the application are posted to the delta queue before they are extracted from ODQ as part of the delta update.
- At the same time, they are also updated in the ODQDATA table.
- The new datasource (Z**_ODQDATA) can pull this data directly, before it is extracted from the ODQ as part of the delta update of the respective extractor.
- After the delta run of 2LIS_02_ITM, if this new datasource (Z**_ODQDATA) runs, it will not bring any further records from this extractor, as they have already been fetched.
- Datasource 0FI_GL_4 (G/L: Line Items), master data attributes, and custom datasources with delta type E (PULL):
- As this is a ‘PULL’ delta type, the delta data records are determined by the DataSource extractor during the delta update, written to the ODQ, and passed on to SAP BW directly from there.
- At the same time, they are also updated in the ODQDATA table and can be extracted by the new datasource (Z**_ODQDATA).
Conclusion:-
- The reconciliation report built on the data from this new extractor (Z**_ODQDATA), when compared with the actual BW datasources, shows how much data is actually present for each extractor and what difference in data volume appears during processing.
- The data is exposed to a reporting tool, where the number of records (LINES) from Z**_ODQDATA is compared with the actual data loaded for the specific extractor.
- This helps identify whether any data is missed during processing in the target system when compared with the LINES field of this extractor.
- This kind of reconciliation report helps the technical team identify whether any regular data is missed during processing.
- (Note: this only works where the number of lines is not manipulated in the target system, for example when comparing with the first-layer ADSO or when loading into a non-SAP target directly from ECC.)
References:-
Operational Data Provisioning – Simplified | SAP Blogs