Overview

I thought I would share this experience, with screenshots, of how simple it is to perform some minimalist HANA data footprint control using Data Lifecycle Management (DLM), a tool that ships with the SAP HANA Data Warehouse Foundation (DWF).

I do not have SAP IQ or Data Tiering, and my DLM policy is simply to delete/destroy the data once it reaches a certain age, so we will be performing some simple data destruction. It is not a bad way to get started with a DLM strategy, and there is more to it than what I will go into here.

The Installation Guide is very good and is what I mostly followed through the process. This blog is intended to support, confirm, and reassure your install alongside the Installation Guide: http://help.sap.com/hana/SAPHANADWFInstallationGuide_en.pdf.

There are also some good DLM videos over in the SAP Academy – https://www.youtube.com/user/saphanaacademy

Blog Structure

  • Downloads
  • Installation
  • Configuration
  • Operation

Downloads

The DLM comes as part of the DWF, which today is on version 1.0 SP05, for our HANA on SPS12.

Each DWF release is a full delivery, so you can install the latest without having to patch through intermediate revisions. Just make sure you choose the right DWF release for your HANA SPS.

HANA Data Warehousing Foundation

The DWF has four components; not all are necessary, depending upon your intentions:

  • DATA LIFECYCLE MANAGER 1
  • DATA DISTRIBUTION OPTIMIZER 1
  • HANA DATA MANAGEMENT 1
  • HANA DWF DOCU 1

For the sake of this blog, I will be downloading and installing 3 components, in this order, as the Installation Guide suggests.

  • 1) HANA DATA MANAGEMENT 1 – HCOHDM05_0-80000034.zip
  • 2) DATA LIFECYCLE MANAGER 1 – HCOHDMDLM05_0-80001006.zip
  • 3) HANA DWF DOCU 1 – HDCHDM05_0-80001017.zip

Comprised Software Component Versions

Compressed zip files

For completeness, the HCOHDMDDO file is for the Data Distribution Optimizer.

Installation

During the install, 3 schemas are created.

Three schemas are created
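
If you want to confirm them from SQL, a quick catalog query does the job. A sketch: the LIKE pattern assumes the DWF schema names share a common prefix, so adjust it to whatever your install log shows.

  -- List the schemas the Delivery Unit import created
  -- (prefix is an assumption; check your install log)
  SELECT SCHEMA_NAME, SCHEMA_OWNER
    FROM "SYS"."SCHEMAS"
   WHERE SCHEMA_NAME LIKE 'SAP_HDM%'
   ORDER BY SCHEMA_NAME;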

We performed the Delivery Unit install using our usual authorized user.

Configuration

I will not document the configuration process, as the Installation Guide is quite good; however, here are some screenshots you may find useful, under the respective headings in the Installation Guide.

4.2 Configure SAP HANA System Properties

xsengine.ini settings
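
I will not repeat the guide's full parameter list here, but as one example, scheduled DLM runs need the XS job scheduler switched on, which can also be done in SQL rather than the ini file editor. A sketch, to be checked against the guide's list in section 4.2:

  -- Enable the XS job scheduler in xsengine.ini
  -- (requires the INIFILE ADMIN system privilege)
  ALTER SYSTEM ALTER CONFIGURATION ('xsengine.ini', 'SYSTEM')
    SET ('scheduler', 'enabled') = 'true' WITH RECONFIGURE;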

4.3 Activate SQL Connection Configurations

XS Artifact Admin

http://<hostname>:8030/sap/hana/xs/admin/

Navigate to the specific area (do not perform a search, as it will return something different); here is where you need to be.

SQL Connection Details

and activate:

Connection with elevated privileges

4.4.2.2 Custom Privileges at Entity Level for Data Lifecycle Manager

Source Privileges

I created a new user, DLM_ADMIN, to perform the DLM activities.
DLM_ADMIN will perform various actions on tables within schemas, e.g. DELETE. In my case, all the custom tables containing the data I want to delete exist in a single schema, so I applied the relevant 3 source privileges to the schema rather than to each individual table.

3 source privileges to Schema
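
Scripted, the schema-level grant looks something like the following. The privilege list here is illustrative; take the exact three from the table in section 4.4.2.2 for your scenario.

  -- Grant source privileges once at schema level rather than per table
  -- (SELECT, INSERT, DELETE are placeholders; see the Installation Guide)
  GRANT SELECT, INSERT, DELETE ON SCHEMA "MONITIQ_TABLES" TO DLM_ADMIN;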

Target Privileges

This user will later have more authorizations (default sap.hdm.dlm.role.GNR.Administrator) as and when the target “Storage Destination” is created.

4.6.3 Generate Default Schema for Generated Objects and Roles Needed for Data Lifecycle Manager

At this stage, we assign our DLM_ADMIN user the appropriate privileges for ownership of the default generated schema to-be, SAP_HDM_DLM_GNR.

Give DLM_ADMIN the required prerequisites:

  • System privileges DATA ADMIN and ROLE ADMIN
  • Object privilege EXECUTE on "_SYS_REPO"."GRANT_ACTIVATED_ROLE"
  • Role sap.hdm.dlm.role::Administrator
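
If you prefer to script these rather than click through, the equivalent SQL is roughly:

  -- System privileges
  GRANT DATA ADMIN TO DLM_ADMIN;
  GRANT ROLE ADMIN TO DLM_ADMIN;
  -- Object privilege on the repository role-grant procedure
  GRANT EXECUTE ON "_SYS_REPO"."GRANT_ACTIVATED_ROLE" TO DLM_ADMIN;
  -- Repository role
  CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('sap.hdm.dlm.role::Administrator', 'DLM_ADMIN');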

Readiness

Performing the Call statement as DLM_ADMIN

Performing the call statement
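
Afterwards, a couple of catalog queries can confirm the generated schema exists and the roles landed. The schema name is assumed here; use whatever the call reported on your system.

  -- Did the generated schema appear?
  SELECT SCHEMA_NAME FROM "SYS"."SCHEMAS"
   WHERE SCHEMA_NAME = 'SAP_HDM_DLM_GNR';

  -- Which roles does DLM_ADMIN now hold?
  SELECT ROLE_NAME FROM "SYS"."GRANTED_ROLES"
   WHERE GRANTEE = 'DLM_ADMIN';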

DLM_ADMIN Roles

DLM_Admin roles

DLM_ADMIN Privileges

DLM_ADMIN Privileges

Now ready for use.

Operation

MANAGE STORAGE DESTINATIONS

As your DLM_ADMIN user, navigate to “MANAGE STORAGE DESTINATIONS”, which is where we set up the “Storage Destination” details that will later be selected during the creation of a Lifecycle Profile.

Manage Storage Destinations

You can see from the above that the “Storage Destination Type” selected is “Deletion Bin Destination”, with the default schema “SAP_HDM_DLM_GNR”.

“Save”, “Activate”, and “Test Connection”, and you should end up with something like below.

Save, activate, test connection

MANAGE LIFECYCLE PROFILES

“MANAGE LIFECYCLE PROFILES” is where the sources and targets for the DLM are configured. There can be more than one profile, depending upon your use.

Manage Lifecycle Profiles

Source Persistence is configured for a specific “SAP HANA Table”: in my case, a table called “hist-linux-cpu” in my “MONITIQ_TABLES” schema.

For the sake of this blog, I will be triggering the DLM using a “Scheduled” job.

Scheduled job

My table has a defined key; however, if there is not one, you would have to specify a key in the “Nominal Key” field (to be figured out).
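
If you are not sure whether your table has a defined key, the catalog can tell you. A sketch against my table names:

  -- List the primary key columns of the source table
  SELECT COLUMN_NAME, POSITION
    FROM "SYS"."CONSTRAINTS"
   WHERE SCHEMA_NAME = 'MONITIQ_TABLES'
     AND TABLE_NAME = 'hist-linux-cpu'
     AND IS_PRIMARY_KEY = 'TRUE';
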
Destination Attributes

The “Destination Attributes” are all self-explanatory, and you can now see the “Storage Destination” created earlier.

Destination attributes

Destination Persistence

“Destination Persistence” will appear after activating the profile. There is no interesting Data Flow for a deletion profile.

Destination persistence

Rule Editor

The “Rule Editor” only offered the “SQL Based Rule Editor”, which was sufficient for my use. A nice feature shows the number of affected records in real time, based upon your current rule.

SQL based rule editor
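
The rule itself is just a SQL predicate over the source table. An age-based rule looks something like the line below; the “SNAPSHOT_TIME” column is illustrative, so substitute your own timestamp column.

  -- Select records older than one year for destruction
  "SNAPSHOT_TIME" < ADD_DAYS(CURRENT_DATE, -365)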

A simple query in Studio can confirm the numbers.
Studio query
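
The query is nothing more than a count with the rule's predicate as the WHERE clause (same illustrative column as above):

  -- Cross-check the rule editor's affected-record figure
  SELECT COUNT(*)
    FROM "MONITIQ_TABLES"."hist-linux-cpu"
   WHERE "SNAPSHOT_TIME" < ADD_DAYS(CURRENT_DATE, -365);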

Now to “Save” and “Activate”
save and activate

Note the “SMO Destruction Bin” “Data Distribution” section never displays, and is greyed out.

Generated Object(s)

“Generated Object(s)” will appear after activating the profile.
Generated objects

I can navigate out to the schema and see the generated database procedure.
navigate out to Schema
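
You can also list it from the catalog (generated schema name assumed, as earlier):

  -- Show the procedure(s) DLM generated for the profile
  SELECT PROCEDURE_NAME
    FROM "SYS"."PROCEDURES"
   WHERE SCHEMA_NAME = 'SAP_HDM_DLM_GNR';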

Simulate

At this stage, we can run a couple of simulations.
Simulations

Data Relocation Preview

Data Relocation Count
Data relocation count

Import/Export

I can Export/Import this configuration as a text file, to/from other environments.
Import/Export

Run

Now we are ready to actually perform the DLM activity. Since I configured the DLM Profile’s trigger as “Scheduled” earlier, “Schedule” is the only run option I have. I set it to run a few minutes in the future, with no recurrence.
Run
Run 2
Schedule relocation run

After the Job run, take a look at the “Logs” (ignore the start time inconsistency, as I had to re-schedule).

Which brings me to the one thing I do not like about the scheduling: there is nowhere to see the intended schedule before it runs. For example, how can I confirm that I have actually scheduled it?
job run
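
One workaround I can think of is to query the XS classic scheduler tables directly. A sketch, assuming the _SYS_XS job tables are readable on your revision and your user has SELECT on them:

  -- Inspect pending XS job schedules and past runs
  SELECT * FROM "_SYS_XS"."JOB_SCHEDULES";
  SELECT * FROM "_SYS_XS"."JOB_LOG";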

Click “ID” to get more detailed information
UTC timestamp
UTC timestamp part 2

After a successful report from the logs, I went to investigate.

Before

Reminder of above

After

A recount certainly shows records have been removed
a recount shows the records have been removed

The Profile graphic has updated the Source number of records to suit, but does not update the “SMO Destruction Bin”. Whether it is meant to, who knows.
The profile graphic has updated the source number

Anyway, the records are gone… forever. There is no functionality within DLM to put the records back.
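
If that makes you nervous, a one-off binary export of the table before the first real run is a cheap safety net (path and options are illustrative):

  -- Keep a restorable copy outside DLM's reach
  EXPORT "MONITIQ_TABLES"."hist-linux-cpu" AS BINARY
    INTO '/hana/backup/dlm_safety' WITH REPLACE;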

Miscellaneous

In the top right-hand corner, there are two tags.
two tags

“Open XS Job Tool” shells me out to the HANA XS Admin Job Tool, where I can see more information about the Job.
Open XS Job Tool

The Job can be further edited to an extent

“Versions” simply gives me some basic information about the Profile versions.

Summary

So that about wraps up my quick and easy, minimalist experience with DLM using the Deletion Bin functionality. I hope there was something in there for you.