What is TDMS? Test Data Migration Server (TDMS) is an excellent tool provided by SAP for creating consistent non-production clients with reduced data volumes. The ability to extract integrated subsets of business and/or configuration data for training, testing or prototyping purposes is extremely effective in terms of time, quality and cost.

TDMS provides various scenarios for data reduction, including Time-Based and Business Process Library (BPL) extraction (e.g. sales orders, purchase orders), as well as the ability to create golden configuration clients. In addition, the tool allows conversion routines to be applied to scramble sensitive data where necessary. Essentially, TDMS provides the following data management services:

  • Efficient data replication functionality to support multiple testing and prototyping environments (non-production systems).
  • Integrated data extraction processes that retain the consistency of the source data across a multi-system landscape.
  • Reliable, reusable data migration processes based on different extraction criteria.
  • The ability for users to work with true configuration and representative application data in non-production environments.

Planning a TDMS Project

Based on my experience running several TDMS projects, I would like to present a practical starting point for any company considering using the tool. As with most implementation projects, the right initial planning approach can save significant time and cost. Implementing TDMS is not a very complex process; however, it is also not a turn-key application. The correct advance planning and strategizing will enable one to maximize both the implementation turnaround and the subsequent use of the TDMS product. For the purpose of this document, the term ‘business users’ refers to functional business process owners and trainers. The planning stage should be broken into the components discussed below.


Project Timeline

When planning to implement and use TDMS, it is important to allocate sufficient time so that the application can be installed, executed, validated and even refreshed before any scheduled delivery deadline. To allow the Basis team to master the tool, it is useful for them to execute a new scenario a couple of times so that any issues can be resolved and so that they can get a good sense of the final expected runtime.

It is unrealistic to assume that final usable data will be available after only a single execution of the scenario. This is particularly relevant when complex conversion rules (data scrambling) are required, so allowance should be made for working through the learning curve. In addition, it is important that the results be validated and approved by business users before the reduced client is used for its intended purpose. This task should be built into the project timeline.


Business Involvement

A TDMS implementation project requires close technical and functional collaboration. As with any new application, the business users need to be educated and involved in the planning phase. One key component of ultimately using TDMS successfully (once it is installed) is making the business and/or trainers aware of the benefits that TDMS delivers, in terms that they can appreciate. For a non-production tool, this is sometimes a challenge: their focus is typically on business-related processes, and they often do not fully recognize the relevance of support applications. It is therefore important that all key stakeholders are involved from the start. Without an appreciation of how the tool can be used by the business users, one runs the risk of minimal ROI following the TDMS implementation.

Although the actual implementation of TDMS is a technical process, it is vital for the business users to start planning how they intend to use the tool as early as possible, as this has a direct impact on the TDMS landscape and the types of scenarios to be executed. Unnecessary landscape changes and rebuilds can be avoided if the installation is done according to a pre-approved implementation plan. By partnering with the business users, a viable system design can be established and one can ensure that the tool will be frequently used going forward.

In addition, while the TDMS scenarios will typically be executed by technical resources, input from the business users in defining the data reduction scenarios is vital. In order to provide the appropriate reduction information, they need to collaborate amongst themselves (often across functional boundaries). It is most efficient if these discussions take place while the software is being installed and tested. This allows the momentum to be carried through from the technical installation to the scenario execution without delays resulting from indecision regarding the required scenarios.

Support Documentation

The first step in preparing for a TDMS implementation project is for the Basis, Security and Technical teams to review the TDMS guides provided by SAP. These documents are available in the SAP Service Marketplace using the quick link ‘/tdms’. They are updated periodically, so it is wise to always check the site for a newer version:

  • Master Guide: SAP Test Data Migration Server 3.0
  • Operations Guide: SAP Test Data Migration Server 3.0
  • Security Guide: SAP Test Data Migration Server 3.0

The guides present a comprehensive overview of the tool, the different reduction scenarios available and the key issues with regards to designing the TDMS landscape. In addition, detailed step-by-step instructions are provided for some complex scenarios (e.g. changing technical settings for BPL scenarios).

An understanding of these documents will give the team a good introduction to the tool and the various steps involved in the implementation. While the guides are generally quite technical, they also contain information that should be shared with the business users to help them understand the TDMS tool. The guides refer to additional OSS Notes that should also be reviewed before starting the project; recommendations on how to search for the appropriate notes are given in the guides.

System Design

Before installing the TDMS software, an appropriate system landscape needs to be built or identified. The TDMS Guides provide detailed information on the requirements and constraints in this regard. Essentially, one needs a Sender client, a Receiver client and a TDMS server; the TDMS server can be combined with any existing system except the Receiver system. The TDMS Guides contain specific information on the system requirements needed to execute TDMS effectively, and they refer to OSS Notes that provide additional system information.

A key consideration is that the Repository of the Receiver system must be an exact copy of the Repository of the Sender system. When designing the landscape, it is important to plan how the Repository will be copied.


Software Installation

Once the landscape has been defined, the TDMS software must be installed on the Sender server, the TDMS server and the Receiver server. The latest Support Package and all relevant add-ons defined in the OSS Notes listed in the TDMS Guide should be installed as well. OSS Note 1003051 is a composite note referencing notes that should be applied (in addition to the latest Support Package) before starting the TDMS project. It is worthwhile being diligent in applying all recommended OSS Notes in advance, as execution time can be significantly reduced when they are in place. For example, the ‘Fill Header Tables’ activity can take a very long time, but there are numerous notes available that will reduce this time considerably.

As mentioned above, the Repository of the Receiver system must be updated to match that of the Sender system. There are several methods of accomplishing this – TDMS Shell scenario, system copy, or manual import of transports.

The user IDs in the TDMS server also need to be updated with the appropriate profiles (SAP_TDMS_DEVELOPER, SAP_TDMS_MASTER, etc.). In addition, a CPIC user with the SAP_TDMS_USER profile must be created on all the systems; this user will be assigned as the RFC user ID. It is important to note that this user’s password must be exactly 8 characters long and may contain only digits and upper-case alphabetic characters (no lower-case characters are allowed). These constraints often cause problems in systems with complex password rules.
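Because this password rule conflicts with many corporate password policies, it can be worth checking candidate passwords before maintaining the CPIC user. The sketch below is illustrative only (the function names are my own, not part of any SAP tooling) and simply encodes the constraint stated above: exactly 8 characters, digits and upper-case letters only.

```python
import re
import secrets
import string

# Constraint for the TDMS CPIC (RFC) user password:
# exactly 8 characters, digits and upper-case letters only.
ALLOWED_CHARS = string.ascii_uppercase + string.digits
PATTERN = re.compile(r"^[A-Z0-9]{8}$")

def is_valid_tdms_rfc_password(password: str) -> bool:
    """Return True if the password meets the stated CPIC-user rules."""
    return bool(PATTERN.match(password))

def generate_tdms_rfc_password() -> str:
    """Generate a random password that satisfies the constraint."""
    return "".join(secrets.choice(ALLOWED_CHARS) for _ in range(8))

print(is_valid_tdms_rfc_password("A1B2C3D4"))  # True
print(is_valid_tdms_rfc_password("a1b2c3d4"))  # False: lower case not allowed
print(is_valid_tdms_rfc_password("A1B2C3"))    # False: too short
```

Generating the password with such a helper avoids repeated rejections when the landscape's password policy tooling and the TDMS constraint disagree.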


Reduction Scenarios

While the technical installation is progressing, the business users should be planning the initial reduction scenarios (time-based, BPL, etc.). Choosing the correct ‘from date’ or selection set is key to ensuring that adequate representative data will be available in the Receiver client. The installation guides provide valuable information on determining the correct reduction set and on issues of data consistency.

In addition, consideration should be given to the need for conversion rules (scrambling of sensitive data). If conversion rules are required, the business users should also define the specific fields that need to be scrambled as well as the type of scrambling to be applied to each field (e.g. a single default value, a scrambling algorithm, etc.). By preparing all of this information in advance, significant time can be saved in the execution of the scenario.

Another aspect for the business users to consider is the relationship between ECC data and BI or CRM data (where applicable). Consistent reduction scenarios should be used for all three systems in order to minimize data consistency issues.
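The actual conversion rules are defined inside TDMS itself, but it can help the business users to see what the two scrambling styles mentioned above look like in practice. The sketch below is purely illustrative (field names and functions are hypothetical, not TDMS code): a single default value wipes the field entirely, while a deterministic algorithm preserves relationships between records across tables.

```python
import hashlib

def scramble_default(value: str, default: str = "XXXXXXXX") -> str:
    """Single-default-value style: every value becomes one fixed string."""
    return default

def scramble_deterministic(value: str, salt: str = "tdms-demo") -> str:
    """Algorithmic style: derive a repeatable pseudonym, so the same
    source value always maps to the same scrambled value."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "CUST" + digest[:8].upper()

# Hypothetical record with two sensitive fields.
record = {"customer_name": "ACME Corp", "tax_id": "12-3456789"}
scrambled = {
    "customer_name": scramble_deterministic(record["customer_name"]),
    "tax_id": scramble_default(record["tax_id"]),
}
print(scrambled)
```

The deterministic variant matters when a scrambled key (e.g. a customer number) appears in several tables: scrambling it independently per table would break referential consistency, whereas a repeatable mapping keeps the data joinable.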


Data Validation

While TDMS ensures data consistency from a technical perspective, it is vital that the business users are aware that they will need to validate the reduced data as well. There are certain situations where minor data inconsistencies may occur (reduction timing issues, documents such as contracts that extend well beyond the ‘from date’, etc.). Dedicated time should be scheduled for key users to review and compare the data between the Sender and Receiver systems once the scenario is complete.

In conclusion, if one is able to follow the recommendations above, the actual implementation and execution of TDMS will be accelerated significantly. The initial scenarios will avoid delays resulting from omissions in the landscape design and installation phase, and the results will be meaningful, based on educated input from the business users.

In a nutshell:

  • Allow enough time to work through the implementation and complete the learning curve before any deliverable deadline
  • Include the business users – they need to understand how the tool can help them and they must provide appropriate reduction scenario requests
  • Use the installation guides provided by SAP
  • Design the TDMS landscape to ensure that it is able to support the processing needs efficiently
  • Take the time to analyze the relevant OSS Notes to avoid performance issues when executing scenarios
  • Make sure that the business is able to clearly define the reduction scenarios and conversion rules that need to be applied
  • Manually validate the reduced data to ensure that it satisfies the planned use cases.

