The secret to a long happy marriage with HANA is logic

The logic required of IT professionals means they are often unfairly accused of being unromantic. However, when it comes to a long and happy relationship with the HANA database, thinking about the long term helps make sure the magic lasts.

Working with a number of customers whose clustered BW on HANA systems have been running for over a year has given the Centiq team a privileged view of how data growth affects HANA. The BASIS and BW teams that look after these systems often misunderstand memory metrics when assessing capacity for growth. Here's the story we often hear.

Your HANA honeymoon

When you start your HANA project, if your sizing exercise worked out well and you gave yourself a margin for growth, you are likely to have started with plenty of capacity. You will have been told HANA needs 50% of memory for data and 50% for computational requirements. While you have sufficient capacity, the in-memory statistics in HANA Studio look healthy and all is well. You could load all of the data into memory if you wanted to. Users are wowed by the speed of queries. All hugs and kisses. What could possibly go wrong?
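As a back-of-the-envelope illustration of that 50/50 rule of thumb (the node size and footprint figures here are hypothetical examples, not SAP guidance):

```python
# Rule of thumb: roughly half of physical memory for data,
# half as working space for intermediate results and caches.
physical_memory_gb = 2048               # hypothetical 2 TB scale-up node
data_budget_gb = physical_memory_gb * 0.5

current_footprint_gb = 700              # illustrative starting footprint
headroom_gb = data_budget_gb - current_footprint_gb
print(f"Data budget: {data_budget_gb:.0f} GB, headroom: {headroom_gb:.0f} GB")
# -> Data budget: 1024 GB, headroom: 324 GB
```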

You row, but make up

Most marital rows are about money, and this is no exception. When HANA is short of its most important resource - memory - it displaces data. Disappointed by this result, you seek advice and identify a new compromise: mark some of the BW objects as "non-active" in BW, allowing HANA to decide which BW objects are displaced first when there is contention for memory. Wow, you have just increased capacity at zero cost. You're the romantic hero once more.
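Under the covers, BW's "early unload" flag maps to a table-level unload priority in HANA (7 for early unload, 5 by default). Here is a minimal sketch using SAP's hdbcli Python driver; the connection details and the BW table name are placeholders, and in practice you would set the flag in BW itself so it survives object re-activation:

```python
from hdbcli import dbapi  # SAP's HANA client for Python

cur = dbapi.connect(address="hana-host", port=30015,
                    user="CAPACITY_MONITOR", password="...").cursor()

# Priority 7 marks the table for early unload; higher-priority tables
# are displaced first when memory runs short.
cur.execute('ALTER TABLE "SAPBW1"."/BIC/AZSTAGE0100" UNLOAD PRIORITY 7')
```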

It's not you, HANA, it's me

Disaster averted, you now keep a constant eye on the in-memory statistics, and they always look close to the 50% data-to-memory ratio. Disappointingly, the system is unloading both active and non-active data due to low memory. The blame game starts.
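The monitoring views can tell you who to blame. A hedged sketch, assuming the standard M_CS_UNLOADS monitoring view and the same placeholder connection details as above:

```python
from hdbcli import dbapi

cur = dbapi.connect(address="hana-host", port=30015,
                    user="CAPACITY_MONITOR", password="...").cursor()

# Which tables has HANA displaced because memory ran low?
cur.execute("""
    SELECT table_name, COUNT(*) AS unload_count
    FROM m_cs_unloads
    WHERE reason = 'LOW MEMORY'
    GROUP BY table_name
    ORDER BY unload_count DESC
""")
for table_name, unload_count in cur.fetchall():
    print(f"{table_name}: unloaded {unload_count} times")
```

If tables you marked as active show up near the top of this list, memory pressure has gone beyond what the non-active data concept can absorb.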

Users are no longer happy because query times have become inconsistent, your backups seem to take forever, and when you tried to re-balance data across the cluster that took forever too. Something needs to change to re-establish harmony. But what? You decide you have been focusing on the wrong things. If only you had listened to your mother and kept an eye on the total data footprint...

After all this time, how much data do I really have?

When discussing capacity, it is safer to use the term data footprint to describe the total data size as it would be in memory if there were enough memory to hold all of the data. This helps distinguish between the current size of data in memory (loaded data) and the data size on disk, which is likely to be larger than the in-memory size. You get this value by adding all of the estimated table sizes together.
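HANA estimates exactly this number for each column-store table. A minimal sketch, assuming the standard M_CS_TABLES monitoring view and its ESTIMATED_MAX_MEMORY_SIZE_IN_TOTAL column (connection details are placeholders):

```python
from hdbcli import dbapi

cur = dbapi.connect(address="hana-host", port=30015,
                    user="CAPACITY_MONITOR", password="...").cursor()

# Currently loaded size vs. total data footprint, in GB.
cur.execute("""
    SELECT ROUND(SUM(memory_size_in_total) / 1024 / 1024 / 1024, 1),
           ROUND(SUM(estimated_max_memory_size_in_total) / 1024 / 1024 / 1024, 1)
    FROM m_cs_tables
""")
loaded_gb, footprint_gb = cur.fetchone()
print(f"Loaded now: {loaded_gb} GB; total data footprint: {footprint_gb} GB")
```

The gap between the two numbers is the data that would be displaced if everything were requested at once.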

How much memory will the children need?

To answer this question, you need to understand what your data performance objectives are. If you use the non-active data concept and mark BW objects for early unload, then you can add up the sizes of these objects to assess what percentage of the total data footprint is "active" and should be kept in memory. You may also acknowledge that some columns are never accessed, which may reduce your overall active data footprint estimate.
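A sketch of that arithmetic, assuming (as above) that early unload corresponds to an unload priority of 7 or higher in the TABLES system view; treat the threshold as an assumption to verify on your own revision:

```python
from hdbcli import dbapi

cur = dbapi.connect(address="hana-host", port=30015,
                    user="CAPACITY_MONITOR", password="...").cursor()

# Split the footprint by unload priority: >= 7 means "early unload",
# i.e. the non-active portion of the data footprint.
cur.execute("""
    SELECT t.unload_priority,
           SUM(m.estimated_max_memory_size_in_total) / 1024 / 1024 / 1024
    FROM m_cs_tables m
    JOIN tables t
      ON t.schema_name = m.schema_name AND t.table_name = m.table_name
    GROUP BY t.unload_priority
""")
by_priority = cur.fetchall()
active_gb = sum(gb for prio, gb in by_priority if prio < 7)
non_active_gb = sum(gb for prio, gb in by_priority if prio >= 7)
total_gb = active_gb + non_active_gb
print(f"Active: {active_gb:.0f} GB ({100 * active_gb / total_gb:.0f}% of footprint)")
```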

If you have captured your in-memory and total data footprint metrics over time, you should be able to draw a picture like the illustration below, with three levels of capacity (a sketch for capturing these metrics follows the diagram):

1 - Optimised for performance

  • Enough memory to hold all of the data, or at least all of the active columns, in memory, with little need for the non-active data concept
  • No unloads are witnessed
  • Capacity for growth

2 - Optimised for cost

  • Unused columns are not held in memory
  • Using the non-active data concept in BW allows some unloads of unimportant data or staging areas that have served their purpose as part of the ETL process
  • Extension nodes, dynamic tiering or NLS (near-line storage) contain the in-memory demands
  • 50% of physical memory is greater than the active data footprint

3 - Overloaded

  • The active data footprint exceeds 50% of physical memory
  • HANA unloads both active and non-active data under memory pressure
  • Query times become inconsistent; backups and cluster re-balancing drag

[Diagram: capacity planning, showing the three levels of capacity]
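As promised above, here is one way you might capture the two metrics over time and map them onto the three levels. A sketch under the same assumptions as the earlier snippets (hdbcli, M_CS_TABLES, and a hypothetical data budget of 50% of a 2 TB node):

```python
import csv
import datetime
from hdbcli import dbapi

DATA_BUDGET_GB = 1024  # hypothetical: 50% of a 2 TB node

def classify(active_gb, footprint_gb, budget_gb=DATA_BUDGET_GB):
    """Map the footprint metrics onto the three capacity levels."""
    if footprint_gb <= budget_gb:
        return "1 - optimised for performance"  # everything fits, room to grow
    if active_gb <= budget_gb:
        return "2 - optimised for cost"         # active data fits; the rest may unload
    return "3 - overloaded"                     # even active data gets displaced

def capture_sample(cur, path="hana_capacity.csv"):
    """Append one timestamped sample so the trend can be plotted later."""
    cur.execute("""
        SELECT SUM(memory_size_in_total) / 1024 / 1024 / 1024,
               SUM(estimated_max_memory_size_in_total) / 1024 / 1024 / 1024
        FROM m_cs_tables
    """)
    loaded_gb, footprint_gb = cur.fetchone()
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat(), loaded_gb, footprint_gb])
    return loaded_gb, footprint_gb

cur = dbapi.connect(address="hana-host", port=30015,
                    user="CAPACITY_MONITOR", password="...").cursor()
loaded_gb, footprint_gb = capture_sample(cur)
# active_gb comes from the unload-priority split shown earlier;
# 700 GB is purely illustrative.
print(classify(active_gb=700, footprint_gb=footprint_gb))
```

Run the capture on a schedule and the CSV gives you the trend line that tells you which level you are drifting towards, long before the users notice.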

The clichés about relationships, human and HANA, remain the same: you get out what you put in. The amount of care and attention that needs to be lavished on hot new technologies can at times seem daunting. Over the lifetime of a system there will be highs and lows, as breakthroughs follow breakdowns. This is why a careful decision at the start of the process on whether HANA is right for you saves heartbreak later on, and why staying loyal to HANA may turn out to be the best decision you ever made.

To read more about SAP HANA capacity planning, take a look at our infographic on the subject, or Get In Touch to speak to one of our consultants.