This article and the follow-on article (part 2) cover the in-place conversion of a standard BW data flow to a BW/4HANA-compliant data flow using SAP's Transfer Toolbox (transaction RSB4HTRF), as part of the Explore / Realization Phase of the migration to SAP BW/4HANA.
An SAP BW/4HANA migration typically follows this sequence:
Discover / Prepare Phase: check the system for BW/4HANA compliance (gather information about objects and code that need to be transferred or changed) and estimate the effort for the conversion project. I've already covered this phase in the following articles:
SAP BW/4HANA Migration – Discover / Prepare Phase – Part 1
SAP BW/4HANA Migration – Discover / Prepare Phase – Part 2
SAP BW/4HANA Migration – Discover / Prepare Phase – Part 3
Explore / Realization Phase: transfer legacy objects into HANA-optimized counterparts, perform the system conversion, and complete post-conversion tasks. This article and my follow-on article (part 2) cover the Transfer Toolbox (transaction RSB4HTRF) functionality of this phase.
BW/4HANA
SAP BW/4HANA is SAP's next-generation data warehouse product. Like SAP S/4HANA, it is optimized for the SAP HANA platform and inherits the high performance, simplicity, and agility of SAP HANA. SAP BW/4HANA delivers real-time, enterprise-wide analytics that minimize the movement of data and can connect all the data in an organization into a single, logical view, including new data types and sources. It also drastically reduces the number of data objects, which means less maintenance and storage. All future innovations will take place in SAP BW/4HANA.
In-Place System Conversion
This article is applicable to an in-place system conversion, i.e. migrating a current SAP BW system into an SAP BW/4HANA system. Using the Transfer Toolbox provided by SAP, the SID of the system can be kept (hence "in-place" conversion).
If you just want to check your SAP BW system (Discover / Prepare Phase), the SAP BW/4HANA Starter Add-on is not required (it can't be installed on release 7.4 or lower anyway). However, the Starter Add-on is required to change the operating mode or to use the Transfer Cockpit (Explore / Realization Phase).
Systems running SAP BW 7.50 powered by SAP HANA can be converted in-place, keeping their SID. In the realization phase of the conversion project, classic objects must be transferred into their HANA-optimized replacements using the Transfer Toolbox. This transfer can be performed scenario by scenario. Once all classic objects have been replaced, the system conversion to BW/4HANA can be triggered.
Please note, the transfer must take place in each system separately; it is not possible to execute the transfer in a production system by importing a transport request. It is, however, possible to ensure that the same scope is transferred throughout the landscape by selecting the corresponding button in the user interface and maintaining an RFC destination to the development system.
Transfer Cockpit Steps Explanation (RSB4HTRF)
The following steps must be executed in the Transfer Cockpit (RSB4HTRF) for an in-place conversion of a single data flow, from the MultiProvider down to the 7.x DataSource, in a BW 7.5 system (with the BW/4HANA Starter Add-on installed).
Once you execute transaction RSB4HTRF, you'll receive a pop-up – select "In-Place".
After selecting "In-Place", you'll receive another pop-up – select "New" for a new task, or select an existing task list if it has already been created and saved but not completed (please note, you can also see the existing task lists by executing transaction STC02).
Once you create or select an existing (open) task list, the following are the steps you need to complete in the Transfer Cockpit (RSB4HTRF).
Step 1 – Collect Scope for Transfer (Data Flow)
This task is step 1 of the transfer tool and defines the object collection (the "scope") of the transfer. To start an in-place transfer of objects into the object types supported by BW/4HANA, you must collect all objects which have a certain association to the object(s) that you want to transfer.
All collected objects must be active in the current system, free of modelling errors, and not captured on any transport that has not yet been deployed to production (i.e. in transit).
To define the scope, select the pencil icon in the Parameter column for this step.
Once you select this, you can use the standard BW Workbench search to find the objects.
Select the objects that you want to transfer. These objects are called the "start objects" of the scope.
It is possible to ensure that the same scope is transferred as was previously transferred in the development system by selecting the corresponding button in the user interface and maintaining an RFC destination to the development system.
Via the corresponding toggle button, you can choose whether the collection should be a "minimal scope" or not. For an in-place conversion, if minimal scope is not selected, all related objects of the start objects are collected regardless of the association type.
Based on my experience of converting a single data flow, I'd recommend the following:
Select the "minimal scope" toggle button
Select the following object types in the data flow:
Cube
DSO
InfoSource (if used)
7.x DataSource
The next step (step 2 of the Transfer Cockpit) will list all the objects that are collected based on the above selections.
Please note, if minimal scope is not selected, far too many objects can be picked up. For example, one BW system had a single process chain that cleared all the logistics queues for system downtime / upgrades. When the scope was collected for one logistics data flow without minimal scope, the collection picked up the downtime process chain (with all logistics InfoPackages on it) and, via this process chain, every logistics data flow active in the BW system.
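To make the difference concrete, here is a minimal, purely illustrative Python sketch of this collection behaviour. The object names, association types, and the "follow only data-flow associations" rule are invented for illustration; the real toolbox logic is considerably more involved.

```python
# Conceptual sketch (not SAP code): why a non-minimal scope can explode.
# Object associations are modelled as a graph; the collector walks it
# transitively. All names and association types here are hypothetical.
from collections import deque

# association graph: object -> list of (related_object, association_type)
ASSOCIATIONS = {
    "CUBE_SALES":   [("TRFN_SALES", "dataflow"), ("PC_DOWNTIME", "process_chain")],
    "TRFN_SALES":   [("DS_2LIS_11", "dataflow")],
    "PC_DOWNTIME":  [("IPAK_2LIS_11", "process_chain"), ("IPAK_2LIS_02", "process_chain")],
    "IPAK_2LIS_11": [("DS_2LIS_11", "dataflow")],
    "IPAK_2LIS_02": [("DS_2LIS_02", "dataflow")],  # a *different* logistics flow
    "DS_2LIS_11":   [],
    "DS_2LIS_02":   [],
}

def collect_scope(start_objects, minimal=True):
    """Breadth-first collection of related objects. With minimal=True only
    direct data-flow associations are followed; otherwise every association
    type is traversed."""
    seen, queue = set(start_objects), deque(start_objects)
    while queue:
        obj = queue.popleft()
        for related, assoc in ASSOCIATIONS.get(obj, []):
            if (not minimal or assoc == "dataflow") and related not in seen:
                seen.add(related)
                queue.append(related)
    return seen

print(sorted(collect_scope(["CUBE_SALES"], minimal=True)))
# -> only the sales flow objects
print(sorted(collect_scope(["CUBE_SALES"], minimal=False)))
# -> also PC_DOWNTIME and, through it, the unrelated 2LIS_02 flow
```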
Once you've defined your object list, with minimal scope selected (or not), select the save button and click "start/resume task list run" (F8) – the fourth icon on the task bar of the Transfer Cockpit's main menu.
Once executed, the system performs the following checks:
The collected objects must be transferable, i.e. there must be corresponding object handling available in the BW/4HANA metadata transfer. SAP is constantly extending the set of covered BW/4HANA object types, but there might be some which cannot be handled. These must be manually adjusted or deleted before the collection task can be re-executed.
The collected objects must be part of only one open task list run. You must not start the transfer tool for the same object(s) in parallel sessions. This restriction does not apply to the BW source system once it has been initially transferred.
Specifically, when testing the transfer tool, you might delete the transferred objects and recreate the original objects manually in order to re-execute the test. This situation is detected and rejected, because the control tables of the transfer tool would not allow the objects to be transferred again. In this situation, you have to reset the earlier transfer first by executing program RS_B4HANA_TRANSFER_RESET.
The execution of this step happens in parallel to reduce the runtime – I'd suggest you execute this step in the background, as it can take up to 25% of the available dialog processes.
If this task fails during the collection phase, you can try to re-execute it. An unspecific error message like "Task failed" means that an object could not be collected, and most likely a short dump occurred in the parallel task that executed that collection.
With most error and warning messages, a detail button is provided in the log (located in the bottom section of the Transfer Cockpit). If the message is about a specific object, the button will lead you to a dependency graphic, which shows you why this object was collected. This information is especially useful if the scope becomes surprisingly big, and it might help you to delete some objects or remove some unwanted dependencies before starting the collection again. In other cases, the detail button will lead you either to further logs or to the object or action affected by the error.
Note that you can maintain the scope and select or deselect objects for transfer in the task "Define Object Mapping and Store Object List" (step 2) only after the object collection has finished.
For my conversion task of a single data flow from 7.x DataSource to MultiProvider, the following are the start objects I entered, with minimal scope selected:
4 cubes
13 DSOs
12 InfoSources
1 7.x DataSource (0BBP_TD_SC_APPR_1)
Step 2 – Define Object Mapping and Store Object List
Once step 1 (Collect Scope for Transfer) has successfully executed, the task list run will automatically stop on step 2 – Define Object Mapping and Store Object List.
This task offers the "main" user interface of the transfer tool. You now review the objects collected based on the scope defined in step 1 and determine the objects you want to convert – select the pencil icon in the Parameter column for this step to view the collected objects.
There are two buttons, "Hide/show secondary objects" and "Hide/show InfoObjects", in the top section of this step.
In the "Transfer Object" column, select all objects of the current collection which you want to transfer into a BW/4HANA-compatible format. Since the objects have interdependencies, the tool ensures that all affected or necessary objects are always selected or deselected along with the one you click on. It is therefore often not possible to select or deselect a single object. For example, if you require an ADSO to replace the PSA and select this, the associated InfoPackage will be selected as well.
By clicking on an entry in the "Original Object Name" column, you can see why this object was collected: a graphic with the where-used analysis opens.
In the "Required Objects" column, the inverse association is displayed in a shortened format: all objects which were collected by the current object are listed, with long GUIDs and names shortened and parts of the name replaced by dots. You should still be able to identify the corresponding objects, but of course you cannot copy and paste the whole name. This information gives an impression of the impact on other objects if you deselect or select the current object.
The "New TLOGO" / "New Object Name" columns show what the resulting object of the transfer will be. For some objects, the new name must be entered manually (for example, a new logical source system). For others, a new name is proposed but can be changed. For most, the new name is determined by the system and is not changeable (for example, transformation objects). If you selected an object for transfer and the new name is not given, the task is displayed as "not maintained" in the task list run.
The "Special Options" column provides additional information about the object:
If the original TLOGO is an ODSO, you can click on the entry to decide whether the full change log or a minimal change log will be transferred. If there is no change log, this information is displayed as well.
If the original TLOGO is part of an SPO, this information is displayed here. It is not possible to select/deselect such an object without automatically selecting/deselecting all objects of the SPO.
If the original TLOGO is a DataSource, there will be three new objects assigned: the DataSource itself (RSDS), plus an ADSO and a corresponding transformation. You can transfer the PSA of the DataSource to an ADSO, and the system automatically determines whether this would be useful or not. You can select or deselect the transfer of the DataSource to an ADSO, but you will observe that this has an impact on the selection of the DataSource-to-DataSource conversion, as well as on the logical system and InfoPackage in the collection (and vice versa). This is because an InfoPackage transfer to a DTP only makes sense if the PSA is converted to an ADSO.
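The coupled selection behaviour described above can be pictured as simple constraint propagation. The following is a hypothetical Python sketch; the object names and the single coupling rule are invented, and this is not the toolbox's actual implementation:

```python
# Illustrative sketch (not SAP code) of the coupled selection behaviour:
# selecting the PSA -> ADSO transfer pulls the InfoPackage -> DTP transfer
# along with it, and vice versa. Item names are hypothetical.

# groups of transfer items that must always be (de)selected together
COUPLED = [
    {"PSA_0BBP -> ADSO", "IPAK_0BBP -> DTP"},
]

def toggle(selection: set, item: str, select: bool) -> set:
    """Select or deselect an item plus everything coupled with it."""
    group = {item}
    for pair in COUPLED:
        if item in pair:
            group |= pair
    return selection | group if select else selection - group

sel = set()
sel = toggle(sel, "PSA_0BBP -> ADSO", select=True)
print(sel)  # both the ADSO and the DTP transfer are now selected
sel = toggle(sel, "IPAK_0BBP -> DTP", select=False)
print(sel)  # deselecting either one removes both
```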
Once you've defined the objects to be converted, select the save button and click "start/resume task list run" (F8) – the fourth icon on the task bar in the Transfer Cockpit.
A check is then performed:
For the selected objects, it is checked again whether they can be transferred. If not, the task list run will fail.
Then the new objects are checked: whether they fulfil the naming conventions, whether they are suited for the purpose (e.g. whether the target logical system has the correct ODP context), and other object-specific checks.
With most error and warning messages, a detail button is provided in the log (located in the bottom section of the Transfer Cockpit).
For my selections from step 1, a total of 158 objects were collected.
The following is a summary of the objects collected (the per-type counts below add up to 158):
4 cubes
21 DTPs
5 query elements
2 InfoPackages
1 logical system
1 MultiProvider
13 DSOs
24 routines
3 DataSources
16 process chains
23 process chain variants
12 InfoSources
33 transformations
A few points on my conversion task:
I'm not retaining the PSA, so I did not select this option (i.e. the new ADSO, along with its transformation and the conversion of the existing InfoPackages to DTPs, was not selected).
As I'm not retaining the PSA, the two InfoPackages linked to the DataSource will be deleted.
The logical system is an SRM system and an ODP source system does not already exist, so the transfer toolbox will create a new logical ODP source system for SRM.
There are 3 items of DataSource object type (RSDS) even though only 1 DataSource is being converted (these are the DataSource itself, the new ADSO (PSA replacement option), and the transformation for the new ADSO).
Any transformations that have routines will be picked up.
The following is a summary of the process chain variant updates:
The DSO activation process chain variant will be replaced by the ADSO activation step (assuming all objects in this process chain variant are being converted to ADSO).
The DSO change log deletion process chain variant will be replaced by the ADSO change log deletion process chain variant (assuming all objects in this process chain variant are being converted to ADSO).
The cube compression process chain variant will be replaced by a new "clean up old requests in ADSO" process chain variant (assuming all objects in this process chain variant are being converted to ADSO).
The index creation and deletion process chain variant will be deleted (assuming all objects in this process chain variant are being converted to ADSO).
The deletion-of-contents process chain variant will be updated with the new object type (i.e. ADSO) for DSOs and cubes – cube -> ADSO and DSO -> ADSO.
Please note, if any objects of the DSO activation, DSO change log deletion, cube compression and/or index creation and deletion process chain variants are not being converted to ADSO, the process chain variant will not be deleted – only the reference to the DSO or cube will be removed from the process chain variant.
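If it helps to keep track of these rules, the variant handling above can be condensed into a simple lookup table. This Python snippet is purely a documentation aid paraphrasing the behaviour described above for the case where all referenced objects become ADSOs; it is not toolbox logic:

```python
# Conceptual summary (not SAP code) of the process chain variant handling
# when *all* referenced objects are converted to ADSO.
VARIANT_HANDLING = {
    "DSO activation":           "replaced by ADSO activation step",
    "DSO change log deletion":  "replaced by ADSO change log deletion variant",
    "Cube compression":         "replaced by 'clean up old requests in ADSO'",
    "Index creation/deletion":  "deleted",
    "Deletion of contents":     "kept, object references updated to ADSO",
}

for old, new in VARIANT_HANDLING.items():
    print(f"{old:26} -> {new}")
```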
Step 3 – Map InfoObjects for InfoProvider
This task provides the option of performing the required InfoObject mapping for the transfer of a MultiProvider to a CompositeProvider.
This mapping is necessary because the CompositeProvider requires a consistent mapping of navigation attributes. A navigation attribute therefore always requires a mapping which matches that of the attribute-carrying characteristic.
Navigation attributes from InfoObjects based on a HANA model are mapped in such a way that they can no longer be used as navigation attributes.
SAP recommends creating a new reference characteristic for navigation attributes and using this new reference characteristic in all MultiProvider transfers where the navigation attribute mapping is not consistent. This task provides the option of creating a new reference characteristic and generating a mapping proposal from old transfers.
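To illustrate what "consistent mapping" means here, the following Python sketch checks whether a navigation attribute's carrier characteristic is mapped to the same target in every part-provider mapping. The carrier/attribute naming convention (separated by "__") follows BW practice, but the data structures and function are assumptions invented for illustration:

```python
# Illustrative consistency check (not SAP code): a navigation attribute can
# only be mapped in the CompositeProvider if its attribute-carrying
# characteristic is mapped consistently across all part providers.

def nav_attr_mapping_consistent(part_mappings: dict, nav_attr: str) -> bool:
    """A navigation attribute like '0CUSTOMER__0COUNTRY' is consistent if
    its carrier ('0CUSTOMER') is mapped to the same target characteristic
    in every part-provider mapping."""
    carrier = nav_attr.split("__")[0]
    targets = {m.get(carrier) for m in part_mappings.values()}
    return len(targets) == 1 and None not in targets

mappings = {
    "PART_1": {"0CUSTOMER": "0CUSTOMER"},
    "PART_2": {"0CUSTOMER": "ZCUSTOMER"},  # inconsistent carrier mapping
}
print(nav_attr_mapping_consistent(mappings, "0CUSTOMER__0COUNTRY"))  # False
```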
SAP is planning to change how CompositeProviders work to bring them in line with MultiProvider functionality; an SAP Note with this fix is planned for October 2018.
For my conversion task, I had no issues with navigation attributes, so the transfer toolbox marked this step as green and moved to the next step.
Step 4 – Select Main Partition Criteria for Semantically Partitioned Object
If the data flow being converted is not an SPO (Semantically Partitioned Object) data flow, this step is skipped by the conversion tool.
If the data flow being converted is an SPO data flow, this task provides the option to select the main partition InfoObject, which will be used for the new DataStore (advanced) in the generated semantic group. As semantic groups support only one partition InfoObject, this selection is necessary. The other partition InfoObject filters are nevertheless added to the corresponding DataStore (advanced) as filter criteria.
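As a conceptual illustration of this decision, the sketch below splits a hypothetical SPO's partition InfoObjects into the single main partition criterion and the remaining filter criteria. The function and InfoObject names are assumptions, not toolbox code:

```python
# Illustrative sketch (not SAP code) of the step 4 decision: a semantic
# group supports a single partition InfoObject, so one is chosen as the
# main partition criterion and the remaining ones become filter criteria
# on the generated DataStore (advanced).

def split_partition_criteria(partition_iobjs, main):
    """Return (main partition InfoObject, others kept as filter criteria)."""
    if main not in partition_iobjs:
        raise ValueError(f"{main} is not a partition InfoObject of this SPO")
    filters = [p for p in partition_iobjs if p != main]
    return main, filters

main, filters = split_partition_criteria(["0COMP_CODE", "0FISCYEAR"], "0COMP_CODE")
print(main)     # 0COMP_CODE -> partition criterion of the semantic group
print(filters)  # ['0FISCYEAR'] -> added as filter criteria to the ADSO
```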
For my conversion task, the data flow is not an SPO data flow, so the transfer toolbox marked this step as green and moved to the next step.
Step 5 – Determine Usage of Involved Objects
This task checks the generated objects of InfoProviders and scans the involved routines, planning functions, etc. for problematic code which might cause errors during the transfer.
For my conversion task, no problematic code was picked up, so the transfer toolbox marked this step as green and moved to the next step.
Conclusion
This article (part 1), along with the follow-on article (part 2), provides a detailed explanation of the steps involved in the in-place conversion of a standard BW data flow to a BW/4HANA-compliant data flow using SAP's Transfer Toolbox (transaction RSB4HTRF).
Even if you're not considering migrating to BW/4HANA, it's important to start reviewing this conversion now for the following reasons:
All future innovations will take place in SAP BW/4HANA.
SAP may in the future stop supporting 'legacy' objects (for example, open hubs, InfoCubes, etc.).
The conversion process for a medium to large BW system is a substantial piece of work and has to be executed in each system. For example, if you have a BW landscape of one development system, two test systems and one production system, and 30k objects to be converted, that is 120k object transfers in total (30k × 4) – see the sketch after this list.
Utilise the benefits of BW/4HANA compliant objects (for example composite providers, ADSO etc.) on a standard BW system (for example BW 7.5 SP 11).
Transition your LSA (Layered Scalable Architecture) to LSA++.
ODP Extraction performance benefits (see article ODP—Performance-improvements-and-benefits)
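For the landscape point above, the arithmetic is simple but worth spelling out, using the example figures from the text:

```python
# Back-of-the-envelope arithmetic: the transfer runs in every system, so
# the object count multiplies by the number of systems in the landscape.
objects_per_system = 30_000
systems = 4          # 1 dev + 2 test + 1 prod
total = objects_per_system * systems
print(f"{total:,} object transfers across the landscape")  # 120,000
```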
Finally, it's important to understand that converting the objects does not mean you must transition the BW system to BW/4HANA. The conversion process can be executed on a standard BW system (for example, BW 7.5 SP 11) without committing to a move to BW/4HANA – but it has to be executed in each BW system!