SAP Standard for Data Volume Management

SAP Standard for E2E Solution Operations
Document Version: 1.0 – 2014-12-12
SAP Standard for Data Volume Management
SAP Solution Manager 7.1
CUSTOMER
Typographic Conventions

Example
Words or characters quoted from the screen. These include field names, screen titles, pushbutton labels, menu names, menu paths, and menu options. Textual cross-references to other documents.

Example
Emphasized words or expressions.

EXAMPLE
Technical names of system objects. These include report names, program names, transaction codes, table names, and key concepts of a programming language when they are surrounded by body text, for example, SELECT and INCLUDE.

Example
Output on the screen. This includes file and directory names and their paths, messages, names of variables and parameters, source text, and names of installation, upgrade and database tools.

Example
Exact user entry. These are words or characters that you enter in the system exactly as they appear in the documentation.

<Example>
Variable user entry. Angle brackets indicate that you replace these words and characters with appropriate entries to make entries in the system.

EXAMPLE
Keys on the keyboard, for example, F2 or ENTER.

© 2014 SAP SE or an SAP affiliate company. All rights reserved.
Document History

Version 1.0 (2014-12-12): First version created
Table of Contents

1 SAP Standards for E2E Solution Operations
1.1 Control Center Approach
2 Overview of the Standard for Data Volume Management
2.1 Data Volume Management and Application Lifecycle Management
2.2 Benefits
2.3 Challenges
2.4 Basic Architecture
2.4.1 Technical Aspects
2.4.2 Interfaces
2.4.3 Information Lifecycle Management
2.4.4 Data Volume Management Tools
3 Lifecycle of Data Volume Management
3.1 Plan
3.1.1 Data Volume Management Strategy
3.1.2 Organizational Preparation
3.1.3 Assessment
3.1.4 Scoping
3.2 Build
3.2.1 Blueprint
3.2.2 Realization
3.2.3 Functional and Integration Tests
3.2.4 Go-Live
3.3 Run
3.3.1 Data Volume Management Operations
3.3.2 Executing Jobs
3.3.3 Maintaining Documentation
3.3.4 Reorganizing Databases
3.3.5 Troubleshooting
4 Optimize
4.1 Continuous Improvements
5 Driving Continuous Improvement
5.1 Quality Assurance Tasks
5.2 Quality Targets and KPIs
6 Training
6.1 Expert Guided Implementation Sessions
6.2 Continuous Quality Check (CQC) SAP Service for Data Volume Management
7 More Information
7.1 Enterprise Support: Value Map for Data Volume Management
1 SAP Standards for E2E Solution Operations
IT organizations face new challenges every day as they attempt to remain effective and future-proof while also keeping the costs of day-to-day operations as low as possible. They are also being challenged more than ever to demonstrate their value to the business. Therefore, it is important to optimize the day-to-day tasks that have less obvious business value and to use KPI- and benchmark-based reporting to make IT processes more visible, demonstrating the real value that IT can provide.
In order to minimize the costs of IT, it is necessary to standardize and automate IT processes end-to-end (E2E)
without reducing the SLAs required by the business, such as stability, availability, performance, process and data
transparency, data consistency, IT process compliance, and so on.
Based on the experience gained by SAP Active Global Support (AGS) while serving more than 36,000 customers,
SAP has defined process standards and best practices to help customers set up and run E2E solution operations
for their SAP-centric solutions.
The Build phase of SAP best practices supports a Build SAP Like a Factory approach, consisting of the following
processes:
• Custom code management
• Change, test, and release management
• Incident, problem, and request management
• Solution documentation
• Remote supportability
During the Run phase of a solution, adapting your IT infrastructure to a Run SAP Like a Factory operation impacts
both application operations and business process operations. Therefore, operations processes, such as technical
monitoring, end-to-end root-cause analysis, technical administration, and data volume management need to be
optimized to achieve state-of-the-art application operations. In business process operations, the same applies to
business process and interface monitoring (including performance optimization), data consistency management,
and job scheduling management.
Quality management processes and tasks need to be established throughout the lifecycle to guarantee
continuous improvement of the end-to-end operations processes while simultaneously ensuring the flexibility
needed to react to changing requirements.
Figure 1: Organizational model for solution operations
This figure shows an organizational model for solution operations that aligns SAP best practice topics and E2E
standards with SAP's control center approach.
The Operations Control Center executes and controls the Run SAP Like a Factory processes, while the Innovation
Control Center ensures optimal custom code management and a smooth transition to production with integration
validation procedures. SAP connects to these control centers from the Mission Control Center to ensure that
professional support is available to the customer. The following Application Lifecycle Management (ALM)
functions are not provided directly in one of the control centers because they must be handled across different
areas:
• Change, test, and release management
• Incident, problem, and request management
• Solution documentation
• Remote supportability
The quality management methodologies are an essential part of SAP's Advanced Customer Center of Expertise
(CoE) concept and ensure that the KPI-driven processes are continuously improved across all processes and
teams. In addition, the quality manager roles ensure consistent and value-centric reporting to the business and
management. This unified reporting platform is known as the Single Source of Truth.
1.1
Control Center Approach
The Operations Control Center (OCC) is the physical manifestation of the Run SAP Like a Factory philosophy. The
OCC allows for automated, proactive operations, which simultaneously reduces operational costs while increasing
the quality of IT services, leading to improved business satisfaction. The OCC also drives continuous improvement
of business processes and IT support. To achieve these goals, it relies on a close interaction with both the
Innovation Control Center (ICC) and the SAP Mission Control Center (MCC).
Figure 2: Interaction Between ICC, OCC, and MCC
The OCC is a central IT support entity at the customer site, which monitors the productive SAP environment as
well as important non-SAP applications. During operation, the OCC requires a workforce of two full-time equivalents (FTEs) per shift to ensure that incidents are detected and resolved as quickly as possible. The OCC is equipped
with large screens that display the status of business processes, IT landscape components, as well as exceptions
and alerts. If problems occur, you use a video link to get live support from SAP and partners. The customer usually
sets up the room with assistance from SAP Active Global Support (AGS). The customer is responsible for
managing the OCC and the team of technical and functional IT operators who act on the alerts.
The OCC is most effective when closely integrated with other IT processes, such as IT Service Management
(ITSM) and Change Management. Central monitors and dashboards based on application and business process
operations display the current status of business and IT-related processes. This data can also be used to drive
continuous improvement.
An effective system monitoring and alerting infrastructure is fundamental to the success of an OCC.
Figure 3: OCC Concept
For Job Scheduling Management, the OCC supervises all background processing, covering both SAP-controlled and legacy background operations. It reacts to job monitoring alerts according to predefined error-resolution activities, and triggers follow-up activities for error handling if the relevant tasks are not completed within a certain timeframe.
2 Overview of the Standard for Data Volume Management
2.1 Data Volume Management and Application Lifecycle Management
In SAP Solution Manager, Application Lifecycle Management (ALM) processes are supported by control centers.
The Innovation Control Center provides transparency on possible improvements to business processes. The
Operations Control Centers are focused on optimizing the system and application management of your system
landscape.
Figure 4: Operations Control Center: General Concept
In SAP Solution Manager, the Data Volume Management work center acts as the operations control center for
Data Volume Management (DVM). The work center serves as the central point of access for all features related to
monitoring and analyzing the data volume in your landscape. These processes are called Data Discovery and Data
Profiling. DVM focuses on performing the following tasks:
• Notifying and managing alerts
• Performing individual trend analyses for allocation statistics, utilization, and time-based distribution
• Reporting of continuous improvements
• Generating best-practice documents (guided self-service)
• Calculating potential savings
• Gathering information about executed archiving activities in your system landscape
• Generating graphical representations of your system and application size and archiving potential
Figure 5: Data Volume Management Process
2.2 Benefits
The data volume in your landscape grows continuously due to the increasing need for easy access to larger and
larger amounts of data. But simply installing additional disks to storage area networks (SANs) and storage
subsystems can make the situation worse, for example, by reducing performance or increasing the cost and effort
of system management. To manage and reduce this growth, you need to define and implement an effective DVM
strategy.
The SAP Standard for Data Volume Management provides an overview of how to set up such a strategy, including
how to manage data growth by summarizing, deleting, or archiving data and even how to avoid creating
unnecessary data altogether.
Implementing this kind of strategy helps you to make improvements for both IT services and business
stakeholders in the following areas:
IT services
• Overall better system performance
• Shorter response times in dialog/batch mode for all employees
• Reduced downtime, for example, during conversions, migrations, and upgrades
• Fewer hardware requirements and faster point-in-time recovery
• Improved system availability
• Faster and easier upgrades to higher software releases
• Shorter runtime for backup and recovery
• Increased transparency
• Proactive monitoring of data distribution, growth rates, and saving potential
• Simple presentation of statistical and analytical issues
• Better supportability
• Reduced effort for database maintenance
• Fast and easy implementation of applicable reduction methods
• Green IT
  o Lower energy costs by reducing hardware resources
• Reduced risk
  o Fewer unplanned downtimes due to point-in-time recovery
  o Reduced impact of backups on dialog transactions

Business stakeholders
• Fewer resources required
  o Reduced hardware costs for disk, CPU, and memory as well as administration costs (for example, for SAP HANA-based solutions)
• Increased process efficiency due to better system response times
• Reduced storage costs due to optimal data lifecycle
• Increased transparency regarding business growth rate compared to technical landscape growth
  o Basis for decision making and planning of follow-up activities
• Increased data quality
• Legal compliance
  o Simplified process of meeting data retention requirements and setting up end-of-life scenarios

2.3 Challenges

Managing the data volume of your company presents a number of challenges. In particular, you need to consider the following challenges when deciding on your Data Volume Management strategy:
• Financial audits and tax requirements are country-specific, so each country poses its own challenges. It is extremely important to take this into account, as these requirements can heavily influence a DVM strategy.

  Example
  Financial auditing may require you to retain historical data for a certain number of years. Although archiving this data might be an effective way of reducing your data volume, archived data can differ from the original data and can only be accessed by a limited set of transactions. Therefore, you have to determine whether archived data will still meet the requirements for auditing.

• Any process of destroying data must be legally compliant.
• Size and complexity of system landscapes and exponential data volume growth can lead to an increased total cost of ownership (TCO) and reduced green IT. In large systems, it can be difficult to process archiving and deletion jobs without impacting other business-critical processes.
To ensure that these challenges do not become critical, SAP recommends creating a thorough DVM roadmap or a
project plan, which defines all necessary steps to successfully implement your DVM strategy.
2.4 Basic Architecture
2.4.1 Technical Aspects
Technical tasks in Data Volume Management require a well-designed DVM infrastructure. This mainly applies to
monitoring and archiving tasks.
Monitoring
DVM monitoring and reporting should be based on a centralized concept for information collection. This means
that you must ensure all systems are technically connected to SAP Solution Manager. DVM-related data is
collected from each system in the DVM landscape and stored centrally in SAP Solution Manager for evaluation
using the DVM work center.
Archiving
Data archiving involves moving information from the database to the archive files. These archive files can be
maintained on any file storage system but SAP recommends storing them on an external content server.
Content servers are standalone components that you use to store large quantities of electronic documents in any
format. Relevant SAP applications must support the use of the SAP Content Server. You can run a content server
with or without Information Lifecycle Management (ILM) enablement. Archiving for SAP Business Warehouse
(BW) systems is supported by Nearline Storage (NLS) systems.
To find SAP-approved partners for the different technologies, go to the SAP Application Development Partner Directory at http://global.sap.com/community/ebook/2013_09_adpd/enEN/search.html and search for the following keywords to find corresponding partners:
• BC-AL for storage partners certified for the ArchiveLink interface. These are generally considered the 'normal' content servers.
• BC-HCS for storage partners supporting the HTTP interface (HCS = HTTP Content Server)
• BC-ILM for ILM-enabled storage
• NLS for Nearline Storage partners for BW systems
As part of the Data Volume Management process, you need to perform other technical tasks, such as reorganizing
tables and indexes after extensive archiving or deleting, as well as troubleshooting any read or write jobs that
failed. Effective housekeeping is also an important part of the process, especially for SAP tables. For more
information, see SAP Notes 16083 and 706478.
The DVM work center is mainly used for monitoring and does not initiate archiving activities.
2.4.2 Interfaces
In the context of Data Volume Management, interfaces exist both on a technical level (between technical systems)
and on a business-process level (between applications). The technical level is important for DVM monitoring.
Therefore, each system in the scope must be connected to SAP Solution Manager and the data collection must be
active.
The business-process level is related to functional correctness. On this level, it is important that all interfaces
between applications are listed and you need to check whether and how they are impacted by DVM measures
(avoidance, summarization, deletion, archiving). It is important to identify whether you have any systems that try
to access archived data and whether you have any source systems that deliver data from time frames that were
already archived. For example, a BW system might try to upload data that has already been archived in the ERP
system or archiving in CRM might send BDocs to update the corresponding documents in ERP.
You need to make sure that the various responsibilities on both sides of interfaces are clearly defined, including
which teams are involved.
2.4.3 Information Lifecycle Management
Information Lifecycle Management (ILM) is a software package that you implement on top of SAP NetWeaver,
which includes features such as retention management and system decommissioning. Data archiving can be used
independently of the ILM add-on. You use data archiving to move unnecessary business data from your databases
to cheaper long-term storage in an easily accessible archive.
Data archiving is one of the prerequisites for retention management implementation. Therefore, data archiving is
always included in ILM retention management projects. This means that, although it can sometimes be difficult to
see the difference between the two processes, ILM is an expansion of data archiving.
SAP ILM supports your DVM processes by providing software, tools, and functions for the following processes:
• Data archiving and data management
• Retention management, which involves destroying data after a defined retention period
• System decommissioning, which ensures compliance with legal and regulatory requirements
Figure 6: Overview of ILM
As a bare minimum, your DVM strategy should cover data archiving and data management, but you can also
include retention management. When considering how long you intend to keep your data before archiving it
(usually 1-3 years), you should also consider the length of time you will retain the data in that archive before it is
destroyed. This length of time depends on legislative and industry requirements, but is usually between 7 and 15
years.
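These two timeframes can be illustrated with a small sketch (Python). The residence and retention values below are purely hypothetical examples; the real periods depend on your country-specific legal and industry requirements, and real ILM rules derive retention from events such as the end of the fiscal year rather than simply from the archiving date.

```python
from datetime import date

# Hypothetical example values; actual periods depend on legal
# and industry requirements, as described above.
RESIDENCE_YEARS = 2    # data stays in the database before archiving (typically 1-3 years)
RETENTION_YEARS = 10   # archived data is kept before destruction (typically 7-15 years)

def add_years(d: date, years: int) -> date:
    """Shift a date by whole years (February 29 falls back to February 28)."""
    try:
        return d.replace(year=d.year + years)
    except ValueError:
        return d.replace(year=d.year + years, day=28)

def lifecycle(created: date):
    """Return (earliest archiving date, earliest destruction date).

    Simplification: retention is counted from the archiving date.
    """
    archive_from = add_years(created, RESIDENCE_YEARS)
    destroy_from = add_years(archive_from, RETENTION_YEARS)
    return archive_from, destroy_from

archive_from, destroy_from = lifecycle(date(2014, 6, 1))
print(archive_from, destroy_from)  # 2016-06-01 2026-06-01
```

A document created in June 2014 would, under these example values, become archivable in 2016 and destroyable in 2026; defining such policies per application area is exactly what ILM retention management supports.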
SAP Retention Management provides tools and functions to support the following tasks:
Note
These tasks apply to both structured and unstructured data.
• Determining the location of stored data
• Determining the retention period of data
• Determining when data can be destroyed
• Defining policies based on external legal requirements or internal SLAs
SAP System Decommissioning provides tools to help decommission legacy systems and move data from both
SAP and non-SAP solutions into a central ILM retention warehouse.
If required, DVM can provide retention management, data destruction, and data archiving functions, but does not
provide any system decommissioning functionality.
2.4.4 Data Volume Management Tools
SAP Solution Manager provides some, but not all, of the tools required for Data Volume Management (DVM)
processes.
Data Volume Management Work Center
The most important platform for analyzing and monitoring data is the Data Volume Management work center in
SAP Solution Manager. The work center serves as the central point of access for different functionalities and
applications related to DVM. It provides the following tools to support a variety of processes:
Decision Maker
Helps you to determine the following key points:
• Which tables to process first
• Which objects best fit your defined strategy
• Which objects show high saving potential
• Which objects are not yet part of your archiving strategy
The Decision Maker analyzes your system based on leading application areas and weighted key figures, for example, archiving activities, data growth, size, and complexity. Only tables of the specified application areas are considered.

Reorganization and Compression Analysis
Helps you to determine the following key points:
• How much space you released during recent archiving activities
• Whether data compression is a good option in your environment
• Whether reorganizing your tables or indexes is necessary or beneficial
• How much space you could save by migrating to SAP HANA
The analysis uses key figures related to compressing and reorganizing indexes and tables to simulate potential savings.

Impact and References
Helps you to determine the following key points:
• Impact of planned archiving measures on business processes
• Which processes and steps require testing
• Which tables and indexes are connected to which business scenarios
• Affected transactions and reports

DVM Planning Dashboard
Enables you to perform the following tasks:
• Visualize archiving potential
• Determine whether you can configure your own residence times for individual systems and application areas
• Provide a graphical representation for greater detail regarding annual and monthly data

Forecast and Simulation Cockpit
Enables you to perform the following tasks:
• Simulate cost savings after implementing proposed DVM measures
• Project the impact of planned business changes
• Simulate an SAP HANA migration and determine the required space on SAP HANA

Service Documents
Enables you to investigate your system in more detail. This tool is a Guided Self-Service and can be used without assistance.
For more information, see the SAP Help Portal at http://help.sap.com/solutionmanager71  Application Help 
SAP Library  Application Operations  Work Centers in Application Operations  Data Volume Management
Work Center
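The Decision Maker's approach of ranking objects by weighted key figures can be sketched roughly as follows (Python). The key figures, weights, and sizes below are invented for illustration and do not reflect the actual Decision Maker algorithm; the table names are merely examples of commonly large SAP tables.

```python
# Illustrative sketch: rank candidate tables by a weighted sum of key
# figures, loosely following the Decision Maker idea described above.
# Weights and all numbers are invented for illustration only.
WEIGHTS = {"size_gb": 0.4, "monthly_growth_gb": 0.4, "simplicity": 0.2}

tables = [
    {"name": "EDI40",  "size_gb": 800, "monthly_growth_gb": 40, "complexity": 1},
    {"name": "BALDAT", "size_gb": 300, "monthly_growth_gb": 25, "complexity": 1},
    {"name": "COEP",   "size_gb": 600, "monthly_growth_gb": 10, "complexity": 3},
]

def score(t: dict) -> float:
    """Bigger and faster-growing tables rank higher; simpler objects
    (lower complexity) are preferred, so complexity is inverted."""
    return (WEIGHTS["size_gb"] * t["size_gb"]
            + WEIGHTS["monthly_growth_gb"] * t["monthly_growth_gb"]
            + WEIGHTS["simplicity"] / t["complexity"])

# Focus DVM activities on the top candidates (SAP suggests at most 5 objects).
ranked = sorted(tables, key=score, reverse=True)[:5]
print([t["name"] for t in ranked])  # ['EDI40', 'COEP', 'BALDAT']
```

In this made-up example the large, fast-growing EDI40 table is suggested first even though COEP is bigger than BALDAT, because growth and complexity also contribute to the score.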
You perform data archiving and ILM activities locally on each system with the appropriate NetWeaver tools, for example, Archive Administration (transaction SARA) or the Data Archiving Cockpit, which is available from SAP NetWeaver 7.40 Support Package 04 and consists of the ILM Archiving work center and the ILM Reporting work center. For more information, see the SAP Help Portal at https://go.sap.corp/kf1.
These tools are available with all NetWeaver installations regardless of whether you have an existing ILM add-on
license.
EarlyWatch Alert
From Solution Manager 7.10 SP12 onwards, you can make specific DVM content available in EarlyWatch Alert
(EWA). For more information, see SAP Note 2036442.
Recommendation
SAP recommends that you focus DVM activities on a maximum of 5 objects. You select these objects
based on their complexity. Simple objects, such as those found in cross-application areas, are suggested
first, along with the expected saving potential; for example, tables that include older data and could be
deleted or archived.
In SAP Solution Manager systems with a lower support package, DVM alerts occur if a database experiences
significant growth or reaches a predefined size. For more information on the relevant thresholds, see SAP Note
1770921.
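A simplified version of such a size/growth alert condition could look like the following sketch (Python). The thresholds here are placeholders, not the values actually used by SAP Solution Manager, which are described in SAP Note 1770921.

```python
# Simplified illustration of a DVM alert condition: flag a database that
# exceeds an absolute size or grows too quickly. Thresholds are
# placeholders, not the real values from SAP Note 1770921.
SIZE_THRESHOLD_GB = 1000.0
GROWTH_THRESHOLD_GB_PER_MONTH = 50.0

def dvm_alert(size_gb: float, prev_size_gb: float, months: int = 1) -> bool:
    """Return True if the database size or its growth rate warrants an alert."""
    growth_per_month = (size_gb - prev_size_gb) / months
    return (size_gb >= SIZE_THRESHOLD_GB
            or growth_per_month >= GROWTH_THRESHOLD_GB_PER_MONTH)

print(dvm_alert(1200.0, 1190.0))  # True  (absolute size exceeded)
print(dvm_alert(400.0, 340.0))    # True  (growth of 60 GB per month)
print(dvm_alert(400.0, 390.0))    # False
```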
Guided Self-Service for DVM
To help implement your data management and data archiving strategy, SAP offers a guided self-service for Data
Volume Management, which is a tool-based approach powered by SAP Solution Manager 7.1. The self-service
generates a best practice document that describes how to handle your largest data objects according to the
methodologies of data avoidance, summarization, archiving, and deletion.
NetWeaver Tools
In addition to the DVM-specific tools, you can use some SAP NetWeaver tools during a DVM project. The following
NetWeaver tools can be used for Data Volume Management:
• Table Analysis (transaction TAANA)
• Archive Administration (transaction SARA)
• Document Relationship Browser for an integrated display of archived documents
• DART (Data Retention Tool), which extracts business documents required for tax audits. The Data Retention Tool is based on IRS Revenue Procedure 98-25 in the US but was enhanced and adjusted to also meet German tax legislation (GDPdU).
3 Lifecycle of Data Volume Management
This chapter describes the general methodology that you need to apply in order to implement and operate a Data Volume Management strategy.
Figure 7: Lifecycle of Data Volume Management
The first step of implementing a DVM strategy is to assess your landscape. Even if you already have a strategy, it is important to review it. You then need to define the scope of your DVM strategy and estimate the benefits you hope to achieve.

In the following planning step, the business-process and legal requirements on the data are checked against the technical need for data reduction. You also prepare all the steps that you want to implement in a project. Besides the technical issues, aspects such as the timeline and project funding have to be considered as well.

The result of all these steps is the initial implementation of a DVM strategy. The implementation finally triggers an ongoing monitoring and improvement process.

During operations, the archiving and deletion jobs are scheduled regularly. The success of the implementation has to be monitored continuously in the DVM work center to detect new or additional potential for improvement, which may be caused by changed business processes. Such findings trigger a new entry into the planning step, ensuring that your DVM strategy is adapted to changed requirements.
3.1 Plan
When beginning a new DVM implementation or if DVM monitoring indicates that you need to consider reduction
measures, you start with the Plan (assessment and scoping) phase of the DVM standard implementation. To
begin this phase, you need to evaluate your current system.
You should run the initial implementation as a project. A project organization helps to integrate all required teams and roles. The following chapters explain the approach you should take during your implementation.
3.1.1 Data Volume Management Strategy
Ensuring optimal Data Volume Management is a critical task and should always be run as a centralized IT service. If you do not already have a holistic strategy for managing your data, SAP highly recommends that you consider implementing DVM processes or review your current strategy.
A comprehensive Data Volume Management strategy should describe the following points in detail:
• Responsibilities:
  o Who is responsible for DVM issues
  o How the responsible person deals with DVM issues
  o How often the responsible person addresses issues that occur
• Concepts for proper data monitoring, which form the basis for alerting and subsequent improvements
• Concepts for data management and data archiving
  A guided self-service in the DVM work center supports this step by helping you to define a data management and archiving strategy.
• Concepts for efficient data storage utilization, for example, database compression and reorganization
  The Reorganization and Compression application in the DVM work center allows you to simulate certain scenarios to optimize data storage.
• The key goals of your DVM implementation
• Budget and personnel requirements
  Your strategy should define key roles, such as the project lead (DVM Champion) and functional experts. For more information, see the Organizational Readiness chapter.
• Technical requirements, including setting up the DVM work center in SAP Solution Manager to efficiently analyze and monitor the growing systems, and setting up an external content server to store created archive files
• Strategic goals
3.1.2 Organizational Preparation
A DVM project is executed by a project team. Once you have defined responsibilities in your DVM strategy and you
have ensured sufficient resources, you can start to plan your project.
The project team executes all project tasks and is responsible for the success of your implementation. The project
requires a high degree of cooperation between teams and business departments, such as functional or business
experts, process owners of relevant applications, and IT, for example, infrastructure experts and database or system administrators. You might also need to define additional roles, such as external or internal auditors, IT consultants, and service providers.
A project team lead takes responsibility for the project, managing tasks such as budget planning, team
coordination, decision making, and reporting.
An important task assigned to the project team is to implement the strategy and establish a DVM team responsible for the Run and Optimize phases. This team must have at least one DVM lead, who is responsible for checking that all DVM projects are compliant with the DVM strategy. Ensuring that your strategy is followed properly requires close cooperation between business process operations and IT departments.
3.1.3 Assessment
You need to identify the systems in your landscape for which you want to implement Data Volume Management
and which specific DVM measures are appropriate. Consider which of the following aims apply to you:
• Minimize storage costs and capacity growth
• Identify and check which resources in your landscape are most used
• Monitor and minimize database size and growth
• Improve performance (dialog, batches) according to database size and growth
• Optimize and review your existing archiving concept
• Reduce the data volume and TCO
You can also make a decision by considering pain points. You should use the following tools to collect all available
technical information about DVM-related issues, such as system, landscape, technical, and application-related
information:
• DVM work center
• EarlyWatch Alert (EWA)
• Enterprise Support Report (ESR)
• Database Administration Cockpit (transaction DB02)
• Self-assessment
Based on this information, you can define thresholds to be monitored by EWA. For example, if the database size or growth exceeds a defined threshold, an alert is raised. From SAP Solution Manager 7.1 SP12 onwards, you can also create an EarlyWatch report that contains DVM information, which gives you an idea of the potential savings you could make. Database size and monthly data growth are the most commonly used criteria, but in specific situations other criteria may be used.
As soon as one or more thresholds are reached, you should begin scoping your DVM requirements for that
specific system or object. This assessment process determines whether the data growth and volume is best
managed by implementing a DVM strategy or whether you also need to review your business processes.
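The threshold check described above can be sketched in a few lines. This is a minimal illustration only: the threshold values and alert texts are invented assumptions, not values prescribed by the standard or by SAP Note 1770921 (actual thresholds are maintained in the EWA configuration).

```python
# Illustrative sketch of the DVM size/growth alerting described above.
# Threshold values and alert texts are assumptions for illustration.

def dvm_alerts(db_size_gb, monthly_growth_gb,
               size_threshold_gb=1000.0, growth_threshold_gb=20.0):
    """Return the list of DVM alerts raised for one system."""
    alerts = []
    if db_size_gb > size_threshold_gb:
        alerts.append("database size exceeds threshold")
    if monthly_growth_gb > growth_threshold_gb:
        alerts.append("monthly growth exceeds threshold")
    return alerts

# A non-empty result would be the trigger to begin scoping for that system.
print(dvm_alerts(1200, 25))
```

As soon as this kind of check raises one or more alerts, the scoping activity described in the next section begins.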
3.1.4 Scoping
While the assessment process looks at a system and a database as one entity, scoping describes a more detailed
look at database objects. You examine for which business areas and business objects you can reduce the data
20
CUSTOMER
© 2014 SAP SE or an SAP affiliate company. All rights reserved.
SAP Standard for Data Volume Management
Lifecycle of Data Volume Management
volume and data growth by applying DVM standard reduction methodologies, such as data avoidance, data summarization, data deletion, and data archiving. You evaluate the reduction possibilities according to a business process analysis; that is, you determine how your situation looks if no DVM reduction possibilities are available for a business object, or if the data for a business object cannot be deleted or archived because of its assigned residence time.
Note
Business process analyses are outside of the scope of the SAP Standard for Data Volume Management.
When scoping, you determine the possible effects of DVM activities on your system's largest tables and related archiving objects. You also evaluate the prospects of success, especially the achievable savings potential.
You use this step to decide whether it is worthwhile to continue with planning and implementing your DVM
solution. You may find that the predicted benefits are not large enough and that the expected savings do not
justify the effort required to run a DVM project.
You can find potential areas for improvement by using one of the following services:
• Guided Self-Service for DVM
• Continuous Quality Check (CQC) SAP Service for DVM
Both services provide an overview of database resources and expected sizes of the analyzed database tables.
If you do not use the guided self-service, collecting the following information can help provide a basis for your DVM implementation decision:
General system information:
• Date of production start. Systems older than two years present the best data archiving opportunities.
• Productive clients (multi-client system). You must perform all DVM activities on each client.

Table size and growth information:
• Monthly data growth (default: last 12 months)
• Categorization of tables
• Relevant business area and business object. This includes estimating the expected complexity of the implementation. Basis objects are usually easy to implement compared to, for example, accounting documents, which require your teams to consider the appropriate legal requirements.
• Possible ways of reducing this business object
• Annual distribution of business-object data. You can use this to estimate the technical requirements of archiving the data.

Historical archiving information:
• Archived objects with completed archiving runs
• Execution period (first run date and last run date). This includes identifying whether the runs occur regularly. Archiving objects whose last archiving run is older than two years are not considered active.
• Number of archived objects and the size of the archive files. This indicates whether you have already archived a significant portion of your data.
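The historical-archiving check from the list above (an archiving object whose last run is older than two years is not considered active) can be sketched as follows. The field names and sample values are illustrative assumptions, not an SAP data model.

```python
# Sketch of the "active archiving object" rule described above: an archiving
# object is considered active only if its last archiving run is at most two
# years old. Field names and example values are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class ArchivingHistory:
    archiving_object: str   # e.g. "MM_MATBEL" (example name)
    first_run: date
    last_run: date

def is_active(history: ArchivingHistory, today: date) -> bool:
    """An object archived within the last two years counts as active."""
    return (today - history.last_run).days <= 2 * 365

h = ArchivingHistory("MM_MATBEL", date(2010, 5, 1), date(2012, 6, 1))
print(is_active(h, date(2014, 12, 12)))  # last run older than two years -> False
```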
The outcome of the DVM scoping process is a clear description of what is required to implement or adjust your DVM strategy, and a rough idea of how successful this will be. If you do not identify a significant enough benefit for your company, for example, if the savings potential is minimal, and you decide that conducting an archiving implementation project is not reasonable, you could consider implementing regular data growth monitoring processes to identify an appropriate time to begin certain DVM processes. SAP recommends using the DVM work center in SAP Solution Manager as the central monitoring tool.
To ensure the success of your DVM implementation, you should estimate the time and effort required and define the scope of your service in the form of a proposal. You must ensure that all management-related (project) requirements have been met.
3.2 Build
It is important to keep your business process champion involved at every step of your implementation, especially
when including business objects that are required for internal or external reporting purposes (legal compliance) in
the scope of your data archiving strategy. Usually, the implementation is driven by the business process
operations team.
The following key factors are vital to ensuring the success of any DVM implementation:
• Support of the management team, for example, sufficient sponsoring
• Established IT and business unit teams
• A committed business unit that understands the need for DVM
• Proper scoping by performing a qualified data analysis to focus your goals; these might be quick wins during the first implementation phase
• Archive administrator know-how
• A test system for mass and performance tests
In this phase, you start to reduce the system volume by implementing the recommendations given in the self-service or SAP service report and following your project plan. In this implementation step, you can also cover the setup and configuration of the DVM work center in SAP Solution Manager.
You can also perform post-processing during this phase, such as database reorganization.
In general, the implementation follows the stages shown in the following figure:
Figure 8: Phases of a DVM Implementation Project
3.2.1 Blueprint
In this step, you match the technical requirements for Data Volume Management with business requirements for data. To successfully complete this step, you need to involve the business units. The following sections describe all of the processes and considerations necessary to properly blueprint your DVM implementation.
Workshops
The business process champion and the key users conduct workshops to discuss options with the application team. Each workshop usually focuses on a specific application or document type and aims to fully inform the team. These workshops need to be carefully planned and conducted by the business process operations team.
Data Analysis
Application management teams should perform an in-depth and detailed data analysis, for example, using the
Table Analysis function (transaction TAANA). This identifies which business objects will benefit from the Data
Volume Management process. It also helps to clearly present the expected effect of any measures you take and
allows you to clearly see the effects of defining a certain residence time. In addition, it may help when specific data
has to be excluded from data archiving, for example, for business reasons. It also helps you to identify the
following:
• The appropriate archiving object or deletion report. In some cases, more than one archiving object can be applied to the same table, and you need to find out which archiving object is the most appropriate.
• Possible critical errors that might prevent you from implementing Data Volume Management. For example, incomplete documents will not pass the checks for data archiving and will remain in the database.
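The kind of result a table analysis (as with transaction TAANA) delivers can be sketched as follows: record counts grouped by chosen analysis fields. The sample records and field names are illustrative assumptions, not real TAANA output.

```python
# Minimal sketch of what a table analysis yields: record counts grouped by
# selected fields, here fiscal year and document type. Sample data is invented.
from collections import Counter

records = [
    {"fiscal_year": 2010, "doc_type": "SA"},
    {"fiscal_year": 2010, "doc_type": "SA"},
    {"fiscal_year": 2013, "doc_type": "DR"},
    {"fiscal_year": 2014, "doc_type": "SA"},
]

distribution = Counter((r["fiscal_year"], r["doc_type"]) for r in records)
for (year, doc_type), count in sorted(distribution.items()):
    print(year, doc_type, count)
```

A distribution like this shows at a glance which years hold the bulk of the data and therefore which residence times would actually free up space.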
Test Archiving
Test archiving performed before the workshop allows you to demonstrate how to display archived data and helps to reassure key users that their data will not be lost. For this test, you can use the following sources of information:
• DVM CQC service
• DVM guided self-service
  This provides in-depth data analysis, explanations, and know-how on the individual data archiving objects.
• Tables and Archiving Objects (transaction DB15)
  This shows the linkage between an archiving object and its related tables.
You can use SAP Notes to find additional technical information.
For more information including the data management guide, reports, and a list of transactions for accessing
archived data, see the following SAP knowledge sources:
• http://wiki.scn.sap.com/wiki/display/TechOps/Data+Volume+Management
• http://scn.sap.com/community/information-lifecycle-management
Role of Business Process Champion
The business process champion must answer the following questions:
• Is the data required at all, or is it possible to avoid future postings?
• Is the data required at the current level of detail, or is it possible to summarize the data?
• Is the data required for a limited timeframe only, and can it be deleted afterwards?
• How long should the data remain in the database before it is archived?
• How long do the archive files have to be stored for retrieval for business and audit purposes?
• Where should the archive files be stored (on an external content server or directly in the file system)?
• Is the standard functionality available on archived data sufficient for business and audit purposes?
• Which minimum search criteria should be available for archived data?
• Is fast indexed access (for dialog access to archived data) required, or is a non-indexed sequential read of archived data in a batch process sufficient?
• Do you need to take action before data archiving to ensure legal compliance, for example, creating a Data Retention Tool (DART) extract for tax audits in the US and Germany?
• Which dependencies between archiving and deletion have to be considered? In some cases, the business process defines the sequence of the archiving objects; in other cases, the archiving objects follow a predefined sequence that is enforced by existing checks. For example, CO totals records can only be archived when all corresponding CO line items have been archived.
• What are the effects of DVM measures on the system landscape? Are there any other systems connected via an interface that will be negatively impacted? For example, it is important to consider the interface between a transactional system (for example, R/3 or CRM) and an attached BI system. Uploading data to BI that has already been archived in R/3 or CRM is possible, but difficult and time-consuming.
Storage and Project Costs
A successful implementation keeps residence times short but manageable, minimizing storage costs while limiting the impact on business processes.
In some cases, very short residence times mean that you have to define and develop customer-specific workarounds, which can increase the implementation costs, for example, if you have to implement Z-reports for accessing archived data.
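The trade-off between storage and project costs can be sketched with a toy model. All cost figures and the cost formulas are invented for illustration; they are not SAP pricing guidance.

```python
# Toy model of the storage-vs-project-cost trade-off: shorter residence times
# cut storage cost but raise project cost (e.g. custom Z-reports for accessing
# archived data). All figures and formulas are invented for illustration.

def total_cost(residence_months,
               storage_cost_per_month=10.0,   # cost of keeping data online
               workaround_base_cost=200.0):   # cost of custom workarounds
    storage = residence_months * storage_cost_per_month
    # modeled project cost grows as the residence time shrinks
    project = workaround_base_cost / max(residence_months, 1)
    return storage + project

# find the residence time with the lowest modeled total cost (1-36 months)
best = min(range(1, 37), key=total_cost)
print(best)
```

Even a crude model like this makes the point of the figure: the cheapest residence time is rarely the shortest one.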
Figure 9: Storage Costs vs. Project Costs
If data from different countries with different legal requirements is included in the project, planning and testing may take several months.
If you are archiving data, the application management team needs to provide information about features available
for displaying and reporting archived data so that the business process champion fully understands the effects
and consequences of data archiving.
The result of this process is a preliminary definition of the data volume management strategy. This preliminary
definition has to be confirmed by the business process champion after an intensive test phase with the
involvement of the key users.
3.2.2 Realization
In this step, you transfer the specified business requirements into the Customizing and variant definitions. In addition, you must define the technical setup of data archiving, for example, where to store the archive files or the retention time of an archived file before its destruction.
Example
If you are summarizing, you need to set up the corresponding Customizing.
If you are archiving, you need to customize residence times for an archiving object and define the selection variants for the archive write job.
There is no single way to avoid creating data or to begin and schedule deletion tasks. How to proceed and which
report or Customizing option to use depends strongly on the business object or technical table.
For data archiving, the central place for customizing and variant definition is the NetWeaver Archive
Administration (SAP transaction SARA).
A full and detailed understanding of Customizing parameters and the variant definition report is required.
After this step, the system is ready for functional and integration tests.
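The kind of Customizing captured in this step can be sketched as a simple data structure: per archiving object, a residence time and a write-job variant. The object name, field names, and values below are illustrative assumptions, not shipped SAP Customizing.

```python
# Sketch of per-object Customizing defined in the realization step. The
# archiving object name, field names, and values are illustrative assumptions.
customizing = {
    "FI_DOCUMNT": {                      # accounting documents (example)
        "residence_months": 24,
        "write_variant": {"company_code": "1000", "fiscal_year": 2011},
        "store_before_delete": True,     # store archive file, then delete
    },
}

def qualifies(doc_age_months, archiving_object):
    """A document may be archived once it exceeds the residence time."""
    return doc_age_months > customizing[archiving_object]["residence_months"]

print(qualifies(36, "FI_DOCUMNT"))  # True: older than the 24-month residence
```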
3.2.3 Functional and Integration Tests
After you have transferred the business requirements of the preliminary DVM strategy into Customizing, the effects and results have to be tested. Before starting the tests, make sure that the latest SAP Notes are implemented.
The following table describes the aspects that the tests should cover and provides some specific questions that
you need to answer:
Functional correctness and business acceptance:
• Is the result as expected?
• Is the data after avoidance still sufficient for reporting and retrieval needs?
• Does the deleted data match the data selected in the deletion report?
• Is the display method (which may differ from the display of online data) acceptable, and does it include all required details?
• Is the performance of accessing archived data sufficient?
• Are the search criteria offered on archived data sufficient?
• Is the display of linked data sufficient (for example, business objects in document flow, PDF documents, attachments, documents)?

Effects on interfaces and other systems in the system landscape:
• Are there any systems that try to access archived data? For example, a BW system may try to upload data that has already been archived in the ERP system.
• Archiving in CRM may send out BDocs to update the corresponding documents in ERP.
Performance Test
This mass test is important to estimate the runtime in the production system. Based on the result, the data volume processed in each job may have to be adjusted to ensure a reasonable runtime, which should not exceed 6-8 hours.
If possible, this mass test should be performed on a system of a size comparable to the production system.
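The sizing step that follows from the mass test can be sketched as back-of-the-envelope arithmetic: given the measured throughput, split the backlog so that no single job exceeds the runtime window. The throughput and backlog figures below are invented assumptions from a hypothetical test run.

```python
# Sketch of job sizing from the mass test: split the backlog so each job stays
# within the 6-8 hour window. Throughput and backlog figures are invented.
import math

def packages_needed(backlog_objects, objects_per_hour, max_hours=8):
    """Number of jobs needed so that each stays within the runtime window."""
    per_job = objects_per_hour * max_hours
    return math.ceil(backlog_objects / per_job)

# e.g. 5 million documents, 100,000 documents/hour measured in the mass test
print(packages_needed(5_000_000, 100_000))  # -> 7 jobs of at most 8 hours each
```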
A test case with all detailed steps is prepared by the application management team. The functional test is then
performed by the key users of the business unit based on the test case.
As a result, the business process champion signs off on and confirms the planned activities, and you set up the job variants and Customizing that can be used in the production environment.
3.2.4 Go-Live
You need to implement the activities and measures defined in the DVM strategy carefully. You need to closely
monitor the effects on the overall system performance, as archiving and deletion processes will put additional
load on the system.
Begin by running a small number of jobs at the same time. After gaining some experience, you can increase this number if necessary. Do not start a lot of different archiving objects or deletion jobs at the same time. Instead, archive and delete the backlog for two or three business objects before scheduling the next jobs. Removing the backlog (catchup phase) may take some time; depending on the overall system load and the resulting limitations in scheduling the jobs, this phase may take several months.
In contrast to the standard operations phase, the catchup phase requires manual job scheduling and monitoring. Usually, no automated or dynamic variant definitions can be used, and different variant definitions may be required for every job. You need to ensure intensive monitoring to discover how much additional load the DVM jobs can cause without any negative effect on standard operations.
As a result of the test phase, you should create a predefined list of jobs to be scheduled, as well as a troubleshooting guide explaining the most common errors and their fixes or workarounds. It is especially important to distinguish between errors that can be handled by the IT unit and errors that need to be fixed with the help of the business unit.
After the go-live and catchup phase, the DVM jobs can be moved to standard operation mode. During the catchup phase, the business process operations team aims to gain experience with scheduling the jobs, their runtimes, and their side effects on other processes. This means they are well prepared to define automated jobs that will run without manual interaction.
3.3 Run
Once you have planned and built your solution you can begin to run your Data Volume Management processes.
This chapter discusses the key considerations when operating your solution.
3.3.1 Data Volume Management Operations
SAP customers operate their DVM strategies in different ways; the approach largely depends on the number of jobs and the frequency defined for scheduling them. SAP recommends automating regular jobs. This enables you to include these jobs in the general monitoring concept and reduces manual effort. To achieve this, your teams need to rely on information about archiving runtimes based on experience gained during the go-live and the initial mass archiving phases. In addition, optimal technical Customizing is required to support this automated approach. This ensures that you create a set of well-defined jobs that run regularly in the defined job windows with as little manual interaction as possible.
Some customers schedule all their archiving and deletion jobs manually, for example, once a year after the fiscal
year closes. This approach is only feasible for systems with a medium level of data volume. Using this method can
lead to the following disadvantages:
• Manual scheduling increases the risk of human error, for example, teams forgetting to run jobs or new members of staff being unaware of requirements.
• Running DVM activities on data accumulated over a full year increases the impact of the jobs on system load. The SAP technical operations team and business process operations team must work closely together to manage this impact.
3.3.2 Executing Jobs
You use Data Volume Management to schedule all required housekeeping jobs. Some jobs, for example, deleting
outdated jobs or spool objects, must run periodically in a live SAP installation. For more information, see SAP
Note 16083.
In addition, the operations manuals for various SAP products give you an idea of application-specific
housekeeping jobs and regular tasks. For more information on housekeeping tasks, see the Data Management
Guide at http://wiki.scn.sap.com/wiki/display/TechOps/Data+Volume+Management Useful Links Data
Management Guide.
The following table provides some comments from SAP regarding the basic jobs:

Preprocessing: Some archiving objects require a preprocessing job, which needs to be scheduled before the write job. The preprocessing job scans the selected data and checks whether a document qualifies for data archiving. Specifically, it has to ensure that the business process is completed and no future updates to this specific object are expected or required before archiving. If the job is successful, the status of these objects is updated and the write job can begin. To ensure that the process is fully automated, you can use dynamic variants.

Write: SAP recommends using dynamic variants for write jobs because you can define them easily. You should also write an archiving session note for the archived data. You can use table TVARV to generate a full text.

Delete: For each archive file, a single delete job needs to be scheduled. SAP does not recommend scheduling the delete job automatically using the option available in the Customizing for Archive Administration (transaction SARA). Instead, you should use the RSARCHD scheduling program. Running delete jobs at the same time as backup jobs may increase the runtime of the backup job. The runtime of the delete jobs depends on the defined archive file size: reducing the archive file size reduces the runtime of a single delete job but increases the number of delete jobs required. This allows you to change the number of delete jobs scheduled to run at the same time, which may optimize throughput. The optimum number of simultaneous delete jobs depends on your system hardware and the workload. Running multiple delete jobs simultaneously during normal business operations may affect the overall system performance.

Store: SAP recommends that you set store jobs to run automatically. You can do this in the archiving object-specific Customizing in Archive Administration (transaction SARA). If you have not connected a content server via ArchiveLink, you need to back up archive files before the delete job is scheduled. You can address this by defining a two-step job that combines the start of RSARCHD with a copy job. The copy job can be a shell script that is defined with External Operating System Commands (transaction SM69).

Extractors and jobs for analyses (controlled by the DVM work center): These jobs provide information for monitoring and decision making in the DVM landscape. They regularly load data related to miscellaneous areas into the BW system, covering the database and its objects in general. You can add additional non-technical information. Specific objects are also investigated to assess their potential for data reduction.
Example
If you want to archive some of the data in your database, you have to use the write and delete jobs. If you
want to use an external server, you also need to use the store job. For some archiving objects, there are
mandatory preprocessing jobs which also have to be scheduled. You can use DVM to process all these
jobs either manually or automatically.
You need to run these jobs regularly.
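The file-size trade-off described for the delete job above can be put into numbers. This is a rough sketch: the throughput figure is an invented assumption, and real runtimes depend on hardware and workload.

```python
# Sketch of the delete-job trade-off: smaller archive files mean shorter
# individual delete jobs but more of them. The deletion throughput figure
# (GB per hour) is an invented assumption for illustration.
import math

def delete_job_plan(total_gb, file_size_gb, gb_per_hour=2.0):
    """Return (number of delete jobs, runtime in hours of a single job)."""
    jobs = math.ceil(total_gb / file_size_gb)
    runtime_per_job = file_size_gb / gb_per_hour
    return jobs, runtime_per_job

print(delete_job_plan(100, 10))   # fewer, longer delete jobs
print(delete_job_plan(100, 2))    # more, shorter delete jobs
```

Shorter jobs make it easier to run several delete jobs in parallel within the job window, which is the throughput lever the table describes.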
You can see the result of a data archiving job in the DVM work center. You can schedule archiving activities on SAP systems by using Archive Administration (transaction SARA) or the ILM Archiving work center. You should schedule archiving jobs in accordance with a central Job Management concept. In general, you should monitor any job related to preprocessing, deleting, archiving, and storing archive files on a content server. If possible, the spool outputs should be parsed automatically. This ensures, for example, that a delete job has worked properly by fully deleting data and has not simply ended because certain authorizations were missing. If a job has ended without successful completion, you must have defined steps for escalating the issue. These should be defined in your operations manual. For more information, see Troubleshooting.
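Automated spool parsing can be sketched as a simple text scan for success and error markers. The marker strings and log format below are illustrative assumptions; real spool layouts differ per archiving object and system language.

```python
# Sketch of automated spool parsing: scan a job's spool text for markers
# showing that the delete step really completed. Marker strings and the log
# format are illustrative assumptions, not real SAP spool output.

def delete_job_succeeded(spool_text: str) -> bool:
    lines = spool_text.lower().splitlines()
    finished = any("job finished" in line for line in lines)
    errors = any("no authorization" in line or "abnormal" in line
                 for line in lines)
    return finished and not errors

ok_spool = "Deleting archive file 000127\nJob finished"
bad_spool = "No authorization for table BKPF\nJob finished"
print(delete_job_succeeded(ok_spool), delete_job_succeeded(bad_spool))
```

A check like this distinguishes a job that merely ended from one that actually deleted its data, which is the escalation criterion described above.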
3.3.3 Maintaining Documentation
It is important to keep track of which data was archived in which archiving session. Especially for sequential file
scans, the user should be able to decide which files should be chosen for the sequential read.
The following table describes the different types of documentation:

Short text for an archiving session: The short text is displayed in the management overview in Archive Administration (transaction SARA) and in any other dialog where the user can choose an archive file for sequential read. The short text should always be maintained. In most cases, this can be done in the variant for the archiving object. In some cases, the selection screen for the archiving object does not offer an Archiving Note field; the short text then has to be maintained manually in the Management view of Archive Administration (transaction SARA) afterwards. The short text should include the most important selection criteria, such as period, fiscal year, and document type.

Long text for an archiving session: When you double-click a single archiving session in the Management view of Archive Administration (transaction SARA), a dialog appears in which you can maintain a long text that describes the content and selection of the archiving session. As this always requires manual interaction, this method of documentation is not very widespread among customers.

Selection variants used in an archiving session: The values of the selection variables of an archiving session are stored automatically and can be displayed in Archive Administration (transaction SARA) → Management → right-click on an archiving session → User Input.

Print lists of selection variants and spool output for an archiving session: When you select the Selection Cover Page option in the print parameters, the values of the selection screen are included in the top part of the spool file that is created for the write job. In addition, you can choose Archiving mode: Print and archive instead of the default Print only. This solution gives you the highest level of detail with the lowest manual interaction, as all necessary information is stored in the print list automatically. However, it requires a third-party solution to store the created print lists.
Recommendation
SAP recommends that you maintain the short text (that is, the archiving session note) for every archiving
session. If you use a third-party solution as a content server, use print lists to document the details of
your archiving sessions.
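As an illustration of the recommendation above, a session note that always carries the key selection criteria can be generated rather than typed by hand. The following Python sketch is purely hypothetical; the field names and the FI_DOCUMNT example are assumptions for illustration, not an SAP API:

```python
# Hypothetical sketch: compose a consistent short text (archiving session note)
# from the most important selection criteria. Field names are illustrative.

def build_session_note(archiving_object, criteria):
    """Return a short text such as 'FI_DOCUMNT | FY 2010 | Period 01-12 | Type RV'."""
    labels = {"fiscal_year": "FY", "period": "Period", "document_type": "Type"}
    parts = [archiving_object]
    # Fixed order keeps the notes uniform across sessions.
    for key in ("fiscal_year", "period", "document_type"):
        if key in criteria:
            parts.append(f"{labels[key]} {criteria[key]}")
    return " | ".join(parts)

note = build_session_note(
    "FI_DOCUMNT",
    {"fiscal_year": "2010", "period": "01-12", "document_type": "RV"},
)
print(note)  # FI_DOCUMNT | FY 2010 | Period 01-12 | Type RV
```

A uniform note format like this makes it easy to spot later which periods and document types a session covered.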
The DVM work center in SAP Solution Manager also provides several possibilities (across all connected systems)
to report on the following:
• Archiving Jobs (system, archiving object, related jobs)
• Archiving Statistics (information regarding write and deletion tasks; see Archive Administration (transaction SARA) and the Archive Information System (transaction SARI))
• Archive File Statistics (archive status overview as well as file size and object numbers)
3.3.4
Reorganizing Databases
The effect of deleting a significant amount of data from database tables depends on the implemented database
system. On some databases, you need to explicitly reorganize the database indexes to ensure performance gains.
Other databases reorganize their indexes automatically if necessary. On some databases, you have to reorganize
tables to free up previously allocated disk space. Other databases shrink their tables automatically to the required
size.
As the features regarding reorganization are continuously improved by the database vendors, they are not
discussed in detail in this SAP Standard for Data Volume Management. However, keep these technical aspects
in mind and check them to get the full benefit of the DVM activities performed on your databases.
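To make the space aspect concrete, the following hedged Python sketch estimates whether a table is worth reorganizing after archiving, by comparing allocated and used space. The threshold and figures are invented for illustration; real checks are database-specific (for example, via the DBA Cockpit):

```python
# Illustrative sketch only: after archiving, allocated space may exceed used
# space until the table is reorganized. The 30 % threshold is an assumption.

def reorg_candidate(allocated_mb, used_mb, threshold=0.3):
    """Flag a table whose reclaimable share of allocated space exceeds threshold."""
    reclaimable = (allocated_mb - used_mb) / allocated_mb
    return reclaimable > threshold, reclaimable

flag, share = reorg_candidate(allocated_mb=50_000, used_mb=30_000)
print(flag, f"{share:.0%}")  # True 40%
```

On databases that shrink tables automatically, such a check is unnecessary; on others it helps prioritize reorganization effort.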
3.3.5
Troubleshooting
This chapter provides recommendations for dealing with jobs that terminate before completing correctly.
Write Job Terminates
Symptom
A write job could not be completed and was terminated.
In the event of an error, the Archive Development Kit (ADK) declares the last created file to be invalid. The file
does not appear in transaction SARA and you cannot schedule a delete job for this last file. So there is no risk of
deleting data that is in fact corrupted in the archive file. In addition, before the delete jobs start, the archive file is
scanned and checked for correctness.
Cause
• There is not enough disk space to create the file.
• For Oracle, the most common cause is that the snapshot is too old (error ORA-1555).
Solution
• Reduce the amount of selected data to reduce the runtime of the write job.
• Run your archiving session when few updates or inserts are being made to the database. Avoid scheduling archiving delete jobs at the same time. Do not set delete jobs to start automatically. Instead, use the RSARCHD scheduling report after the write job has finished.
• For SD objects, select an alternative database access.
• Extend the rollback segments of the database.
If your write job terminates during an archiving session, proceed as follows to rerun it:
1. Schedule the delete jobs for all valid archive files that are displayed in Archive Administration (transaction SARA).
2. Check whether there is a remaining archive file for which you did not schedule a delete job. If there is, remove it.
3. Rerun the archiving session using the same selection criteria to archive the remaining data.
Caution
If you restart the write job before running all delete jobs for the existing archive files, you risk archiving the
same data twice in two separate archive write jobs.
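The restart rule in the caution above can be sketched as a simple guard: only rerun the write job once every valid archive file of the terminated session has been processed by a delete job. The data structures below are assumptions for illustration only, not an SAP ADK interface:

```python
# Hypothetical sketch of the restart guard. Files the ADK invalidated are
# ignored, because no delete job can (or should) run for them.

def safe_to_rerun_write_job(archive_files):
    """archive_files: list of dicts with 'name', 'valid', 'delete_job_done'."""
    pending = [f["name"] for f in archive_files
               if f["valid"] and not f["delete_job_done"]]
    if pending:
        raise RuntimeError(
            f"Delete jobs still pending for {pending}; "
            "rerunning now risks archiving the same data twice.")
    return True

files = [
    {"name": "000001-001FI_DOCUMNT", "valid": True,  "delete_job_done": True},
    {"name": "000001-002FI_DOCUMNT", "valid": False, "delete_job_done": False},  # invalidated
]
print(safe_to_rerun_write_job(files))  # True: all valid files processed
```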
Delete Job Terminates
Symptom
A delete job could not be completed and was terminated.
Cause
There are two main causes for this error. You should distinguish between the following termination errors:
• System errors
• Verification errors
Solution
If the delete job ends because of a system error, there is nothing you can do other than rerun the job until it
completes successfully.
If the delete job ends because of a verification error, the data in the corrupt file will not be deleted from the
database. To rerun the job, proceed as follows:
1. Archive the data from the corrupted file in a new archiving session and then delete it.
2. Manually remove the corrupted archive file from the management data of the Archive Development Kit (ADK) and the file system.
3. Rerun the job.
Note
If verification errors occur when archive files are being read or reloaded, the data in the corrupted files is
not read or reloaded. As the corrupted files must have passed the verification during the delete phase,
you can assume that the archive files were corrupted after that phase, for example, while they were being
recopied. It may be possible to repair the defective file using the SAP remote consulting service; however,
this service incurs additional costs.
4
Optimize
Your DVM solution should help to reduce your company's TCO. In particular, a successful DVM implementation
ensures that your database stays as lean as possible. In addition, it means that you can postpone large hardware
purchases, for example, for more disk space, and that you can perform system management activities, such as
backup and recovery, much more easily because only information required for business processes is affected.
SAP offers several tools which help you to measure the success of your implementation.
The following table describes these features and explains how you can use them to evaluate your DVM solution
and spot possible areas for improvement:
Feature
Description
Statistics and Trends
You can use the Statistics and Trends functionality in the DVM Work Center in SAP Solution Manager to find trends concerning the size and the growth rate of top DVM-related objects, such as SAP applications, systems, archiving objects, and tables.
Statistics and Trends allows you to use the following functionalities:
• Show data allocation (by application or business object) and the time-based distribution of data
• Show table utilization statistics
If you have successfully implemented your DVM Strategy, the size of target objects will remain constant or decrease.
DVM index and achievements tracking
You can measure the business growth of a landscape and compare it with the measured technical growth of that landscape. You can then see how quickly these two things are happening and determine the sustainability of your system based on your requirements. This enables you to choose the best course of action in terms of decision-making and planning. In addition, this comparison provides an overview of the success of previous DVM measures, such as deletion and archiving tasks.
The DVM Work Center supports you in this by providing the following features:
• Visualize business growth rate compared to technical landscape growth
• Display landscape-wide achievements of measures taken for data deletion and archiving
Business Object Footprint Analysis
Business Object Footprint Analysis in the DVM Work Center is a tool that analyzes the business objects, such as company codes, sales organizations, and plants, in the selected managed system. It provides an overview of how many records or what percentage of data is allocated by those business objects.
This gives you a business view of the distribution of data, which you can easily use to identify which business, organizational unit, or country allocates which amount of data.
Improvement projects
You use the Improvement Projects functionality in the DVM Work Center to visualize KPIs and Value Drivers, and to set targets and track improvement history. The tracking tool always shows the current status of a project and the results of the trend analysis.
Performance analysis tools
You can use performance analysis tools to evaluate technical performance indicators, for example, whether the average response times of key transactions are decreasing or remaining constant. Reduced response times are an indication of increased performance.
Note
A general statement about the success of a DVM Strategy can also be derived from reduced energy
consumption (costs) and reduced backup times.
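The core idea behind comparing technical and business growth, as used by the DVM index described above, can be sketched in a few lines. The series and figures below are invented for illustration:

```python
# Minimal sketch of the DVM-index idea: compare the growth rate of a technical
# size series (database GB) with a business driver series (documents created).
# All numbers are invented.

def growth_rate(series):
    """Overall relative growth of a time series, e.g. 0.2 == 20 %."""
    return (series[-1] - series[0]) / series[0]

db_size_gb = [1000, 1030, 1055, 1080]      # monthly database size snapshots
documents = [2.0e6, 2.1e6, 2.2e6, 2.3e6]   # business volume per month

tech = growth_rate(db_size_gb)     # 0.08
business = growth_rate(documents)  # 0.15
# If technical growth outpaced business growth, DVM measures (deletion,
# archiving) would not be keeping up with data creation.
print(f"technical {tech:.0%} vs business {business:.0%}")
```

In this invented example the database grows more slowly than the business, which is the situation a successful DVM strategy aims for.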
4.1
Continuous Improvements
The continuous improvement of the DVM Strategy is an ongoing task. You need to perform DVM activities at least
four times a year. Teamwork between your Business Process Operations team and the DVM Champion is
essential. If you identify a way of making improvements, involve the Business Process Champion so that they can
provide support in implementing them.
You need to check regularly whether the following has happened:
• Actions taken have had the expected effect
The expected effect needs to be documented in the planning phase of the DVM implementation. This will be your baseline.
• Archiving or deletion tasks have been completed fully
This includes checking that all of the data was processed and the residence time has been respected.
• The DVM Strategy really covers all relevant objects and makes provisions for new growing tables that need to be considered in addition to the already covered database objects
• Reorganizing the index or table is beneficial
The DVM Work Center supports this analysis for Oracle and DB6.
5
Driving Continuous Improvement
To measure the success of data volume management you can check the appropriateness of your support
processes by asking the following questions:
• How often do you access your data?
• How often do you check data volume strategy adherence?
• How often do you check retention time and legal requirements for data?
• How often do you execute clearing of data?
5.1
Quality Assurance Tasks
From a quality management perspective, the following tasks are most important:
• Define a clear data volume management strategy and ensure adherence to it
• Arrange regular workshops to identify areas for data avoidance, data reduction, summarization, and archiving
• Ensure formal ownership of data objects
• Ensure regular review of archiving processes
• Ensure adequate data volume management strategies are in place and data volume KPIs are measured
5.2
Quality Targets and KPIs
To ensure mature data volume management and to improve recognition of the value provided by IT departments,
the following quality targets are the most important:
• Improve transparency
• Improve stability and reliability of business processes and reduce business risks by taking pro-active measures
• Increase efficiency
• Readiness to meet business needs and legal requirements
To assess the quality of the data volume management process, you must have clearly-defined parameters and
measurable objectives. You need to collate and evaluate the key parameters in regular reports. You can use the
historical data that is created in this way to identify trends and derive the necessary measures.
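As a sketch of such a regular KPI report, the following hypothetical Python snippet derives a trend and a target status from a measured KPI history. The KPI name, values, and target are invented for illustration:

```python
# Hypothetical sketch: evaluate a KPI time series against a target to derive
# a trend and a status line for a regular DVM quality report.

def kpi_status(history, target, lower_is_better=True):
    """Return (trend, target_met) for a KPI time series."""
    improved = (history[-1] < history[0]) == lower_is_better
    trend = "improving" if improved else "degrading"
    met = history[-1] <= target if lower_is_better else history[-1] >= target
    return trend, met

backup_runtime_h = [6.5, 6.1, 5.8, 5.2]  # monthly backup runtimes in hours
print(kpi_status(backup_runtime_h, target=6.0))  # ('improving', True)
```

Collecting such per-KPI statuses each period yields exactly the historical data from which trends and necessary measures can be derived.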
Quality Target: Improve transparency by proactive monitoring of data distribution, growth rates, and saving potential
Challenges:
• Scheduling and defining thresholds for pro-active monitoring of data distribution and growth rate
• Following up on saving potential
• DVM concept establishment and review of adherence (for example, avoiding keeping data in the system to cover every eventuality)
KPIs:
• Review of EWA reports for data growth and completeness of actions taken
• Trend in data growth of non-covered DVM objects
• Frequency of data analysis and cleansing
• Frequency of trend analysis in terms of allocation statistics, utilization, and time-based distribution analyses

Quality Target: Improve stability and reliability of business processes and reduce business risks by taking pro-active measures
Challenges:
• Performing data analysis, deletion, and archiving tasks
• Reduce the number of business disruptions and occurrences of slow transaction performance due to high data load
KPIs:
• Time required for the maintenance window
• Trend for backup runtime
• Percentage of business disruptions due to data volume handling

Quality Target: Increase efficiency
Challenges:
• Improve productivity through better system performance
• Increase performance by better distributing data
• Implementing the DVM strategy and following up on the findings
• Reducing the effort and time taken for archiving activities
KPIs:
• Trend in TCO after the DVM strategy is in place
• Trend in database maintenance effort
• Trend in backup and recovery runtime

Quality Target: Readiness to meet business needs and legal requirements
Challenges:
• Mapping data retention times to legal requirements
• Changes made to ensure legal compliance need to be implemented quickly
• Setting up regular meetings with Business to ensure expectations are being met
• Legal requirements and business requirements need to be reviewed regularly to set up end-of-life scenarios while meeting retention times
KPIs:
• Frequency of Business/IT meetings
• Frequency of reviews regarding retention time
6
Training
SAP offers the following services for training relating to data volume management:
Figure 10: DVM Service Portfolio and Knowledge Transfer
For more information, see the SAP Support Portal at https://support.sap.com/support-programs-services/programs.html.
6.1
Expert Guided Implementation Sessions
For Enterprise Support Customers, SAP offers Expert Guided Implementation Sessions (EGI).
Expert Guided Implementation (EGI) sessions are a combination of remote training, live configuration, and on-demand expertise, which allow you to perform complex activities with the help of experienced SAP support
engineers. The instructor will demonstrate what to do step by step. Afterwards, you can perform the relevant
steps in your own version of SAP Solution Manager. If you have any questions, you can then contact an SAP
expert via phone or e-mail.
The following EGIs are available for Data Volume Management:
• Data Volume Management (with SAP Solution Manager): Methodology & Infrastructure
• Data Volume Management (DVM): Guided Self-Service
Figure 11: Overview of EGIs in different topic areas
For more information, see the SAP Solution Manager Training and Services page at
https://support.sap.com/support-programs-services → SAP Solution Manager Training and Services.
6.2
Continuous Quality Check (CQC) SAP Service for Data
Volume Management
This service can also be found in the SAP Enterprise Support service portfolio. It can be used as a starting point
for understanding and applying a DVM methodology. In general, the service aims to reduce your total cost of
ownership (TCO) by minimizing the database size and the monthly data growth of SAP systems and SAP system
landscape solutions. The service puts you in contact with an SAP Expert who analyzes your system by using real
data located on your system.
For more information, see https://support.sap.com/support-programs-services → Support Programs → SAP
Enterprise Support → SAP Enterprise Support Academy → Delivery Format → Continuous Quality Check &
Improvement Services (CQC & IS).
7
More Information
Documentation
Link
SAP Service Marketplace
http://service.sap.com/ilm
Reorganization and compression
http://wiki.scn.sap.com/wiki/download/attachments/247399467/Reorg%20%26%20Compression%20Analysis.docx?version=1&modificationDate=1406893642000&api=v2
Expert Guided Sessions
https://support.sap.com/support-programs-services/solution-manager/training-services.html
Monitoring and Operation with SAP Solution Manager
https://www.sap-press.com/
Archiving Your SAP Data
https://www.sap-press.com/
7.1
Enterprise Support: Value Map for Data Volume
Management
The Enterprise Support (ES) Value Map for DVM is a social collaboration platform offered by SAP Enterprise
Support. The Value Map provides information on each step involved in SAP Data Volume Management from
assessment to improvement. It provides details of the Enterprise Support services that can assist you and there is
a forum where you can ask questions and create discussions. SAP Focus Advisors who have experience with DVM
are available to assist you with your queries.
Other customers also participate in the value map, so they may be in a position to share their experiences and
areas of shared interest with you.
For more information about the Enterprise Support Value Map, see the SAP Enterprise Support Value Maps page
at https://support.sap.com/support-programs-services → SAP Solution Manager → Training and Services →
Value Maps, or contact your local SAP Enterprise Support Advisory Center.
www.sap.com/contactsap
© 2014 SAP SE or an SAP affiliate company. All rights reserved.
No part of this publication may be reproduced or transmitted in any
form or for any purpose without the express permission of SAP SE
or an SAP affiliate company.
The information contained herein may be changed without prior
notice. Some software products marketed by SAP SE and its
distributors contain proprietary software components of other
software vendors. National product specifications may vary.
These materials are provided by SAP SE or an SAP affiliate company
for informational purposes only, without representation or warranty
of any kind, and SAP or its affiliated companies shall not be liable for
errors or omissions with respect to the materials. The only
warranties for SAP or SAP affiliate company products and services
are those that are set forth in the express warranty statements
accompanying such products and services, if any. Nothing herein
should be construed as constituting an additional warranty.
SAP and other SAP products and services mentioned herein as well
as their respective logos are trademarks or registered trademarks of
SAP SE (or an SAP affiliate company) in Germany and other
countries. All other product and service names mentioned are the
trademarks of their respective companies. Please see
www.sap.com/corporate-en/legal/copyright/index.epx for
additional trademark information and notices.
Material Number: