SAP MaxDB: Export/Import through the Loader Utility

MaxDB provides a powerful tool, the Loader utility (loadercli), to perform export/import, which is very useful in the case of a heterogeneous system copy.

There are some limitations to using loadercli directly via the command line, as it requires the same Loader version on source & target to perform the export/import.

If you are using different operating systems on source & target, then command-line loadercli is not a good option, as it is very difficult to keep the same versions on both sides.

Loadercli performs the same action smoothly when used through MaxDB Studio, where you can connect both the source & target DBs & use the same loadercli to perform the export/import. You can export the entire schema or individual tables from the source & import them into the target schema.

Loader requires an SQL login to the respective schema for export/import. To do an SQL login, right-click on the DB; the SAPUSER schema name then becomes visible. Expand the SAPUSER schema, right-click on the schema or on individual tables, & select Export. It will ask you for some options & export the entire contents to your local PC.

The same loadercli can be used for importing into the target schema: do an SQL login to the target schema, select the SAPUSER schema, right-click on it & choose Import. Give the correct location & it will show the exported contents; select the options as required & perform the import.
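For the pure command-line route, the export & import can also be driven by loadercli batch files. The sketch below is illustrative only – the DB names, user & password are placeholders, & the exact Loader command syntax should be verified against the SAP documentation before use:

```shell
# Hedged sketch of a loadercli batch export/import. SRCDB/TGTDB and the
# sapx01,secret credentials are placeholders -- adjust to your landscape.
work=$(mktemp -d)

# Export commands: dump the logged-in schema's catalog and data to files.
cat > "$work/export.cmd" <<'EOF'
EXPORT USER
CATALOG OUTSTREAM FILE 'catalog.def'
DATA    OUTSTREAM FILE 'data.dat' PAGES
EOF

# Import commands: replay the catalog and data into the target schema.
cat > "$work/import.cmd" <<'EOF'
IMPORT USER
CATALOG INSTREAM FILE 'catalog.def'
DATA    INSTREAM FILE 'data.dat' PAGES
EOF

# On the source:  loadercli -d SRCDB -u sapx01,secret -b export.cmd
# On the target:  loadercli -d TGTDB -u sapx01,secret -b import.cmd
echo "command files written to $work"
```

The -b option runs a batch command file; the loadercli invocations are shown as comments because they need a live MaxDB instance.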

SAP official documentation –
https://maxdb.sap.com/doc/7_7/71/33b900661747cdad48a08b01aa06b2/content.htm

I will share screenshots of the export/import in the next post for more clarity.

How to fix MaxDB SDBSETUP issue

1. Download the MaxDB database installation (MaxDB 7.9 – SP9 Build 05, Linux on x86_64 64-bit) from the Support Portal.

2. Unzip the downloaded MaxDB ZIP file.

3. Navigate to /sapdb/max/51052559_/DATA_UNITS/MAXDB_LINUX_X86_64

Run ./SDBSETUP

You will get the below error –

cannot load wxWidgets properly: Can’t load ‘/var/tmp/SDBLDLdvBNG/Wx.so’ for module Wx: libpangox-1.0.so.0: cannot open shared object file: No such file or directory at DynaLoader.pm line 230.
at SDB::Common::Require line 61

As per SAP Note 2300026, there are some graphical libraries which are not supported on SLES 11, SLES 12 & RHEL 7. The SDBSETUP tool cannot be run on these operating systems.

SAP recommends using the SDBINST / SDBUPD tools instead of SDBSETUP.

However, there is a workaround where you can substitute these libraries & run the graphical SDBSETUP.

I am referring here to SLES 12 SP4 as my operating system.

If you run the below command from the installation directory, it will throw an error for libwx_gtk2-2.6.so.0; all other libraries are available –

ldd -v Wx.so
linux-vdso.so.1 (0x00007fffabbb8000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f07571e0000)
libwx_gtk2-2.6.so.0 => not found
libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00007f0756dff000)
libm.so.6 => /lib64/libm.so.6 (0x00007f0756b02000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f07568ea000)
libc.so.6 => /lib64/libc.so.6 (0x00007f0756545000)
/lib64/ld-linux-x86-64.so.2 (0x00007f0757748000)

Warning – please make these changes at your own risk; SAP does not recommend this.

SDBSETUP will throw errors for the below 3 libraries (one by one) –

1. libpangox-1.0.so.0
2. libpng.so.3
3. libtiff.so.3

Navigate to the directory /usr/lib64 & copy the below libraries under the expected names –

cp libpangoxft-1.0.so.0 libpangox-1.0.so.0
cp libpng12.so.0 libpng.so.3
cp libtiff.so.5 libtiff.so.3
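The three copies above can be wrapped in a small guard that checks the substitute actually exists before copying. This is a sketch; the substitute names are the ones found on SLES 12 SP4 & may differ on your system (the temp directory below stands in for /usr/lib64):

```shell
# Sketch: map each library SDBSETUP complains about to a nearby available
# substitute and copy it under the expected name. Verify substitute names
# on your own distribution before running against the real /usr/lib64.
libdir=$(mktemp -d)   # stand-in for /usr/lib64 in this sketch
touch "$libdir/libpangoxft-1.0.so.0" "$libdir/libpng12.so.0" "$libdir/libtiff.so.5"

copy_substitute() {   # usage: copy_substitute <substitute> <expected-name>
  if [ -e "$libdir/$1" ]; then
    cp "$libdir/$1" "$libdir/$2"
    echo "created $2 from $1"
  else
    echo "substitute $1 not found -- install it first" >&2
  fi
}

copy_substitute libpangoxft-1.0.so.0 libpangox-1.0.so.0
copy_substitute libpng12.so.0        libpng.so.3
copy_substitute libtiff.so.5         libtiff.so.3
```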

Navigate back to the installation directory & run the command ./SDBSETUP again. You will now see –

cannot load wxWidgets properly: cannot open display

This error means SDBSETUP now needs an X display, so run the same command in a VNC server session.

(Screenshots: MaxDB SDBSETUP graphical installer)

You are good to install via SDBSETUP.

AZ-120: Planning and Administering Microsoft Azure for SAP Workloads

Today I successfully cleared the AZ-120 exam.

It really gave me a boost to go deeper into Azure technology & into how we can leverage it more for SAP systems.
There are a lot of things still to discover in Azure to get more benefit for an SAP landscape. I am currently working on SAP migrations only, where we mostly use SAP's own methods. Azure also provides native tools to perform system migration from on-premise to Azure, but frankly speaking, I have never tried them & am not familiar with them.

This exam helped me understand what Azure provides for SAP & how we can utilise it with cost optimisation.

There are 42 questions in the exam, including scenario-based questions. Some questions are purely about Azure & some are about SAP technology (I hope you know basic SAP terms).

The best material, which is obviously also the most relevant to the exam, is the official Microsoft documentation available at docs.microsoft.com.

The below links helped me get through the exam –

https://docs.microsoft.com/en-us/azure/virtual-machines/workloads/sap/get-started

https://blogs.sap.com/2019/12/15/sap-expert-role-guide-to-microsoft-azure-skills-and-certification/

https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RE44g2b

https://blogs.sap.com/2019/12/30/exam-study-resources-for-az-120-planning-and-administering-microsoft-azure-for-sap-workloads/

https://docs.microsoft.com/en-us/learn/modules/introduction-azure-sap-workloads/

I hope the above material helps you clear the exam.

Let me know if you have more questions.

Best of Luck!!!

SAP HANA 2.0 Cockpit Installation and Configuration

Recently, for testing purposes, I installed SAP HANA 2.0 Cockpit on my Google Cloud virtual machine by following the below steps –

SAP HANA 2.0 Cockpit Installation

SAP HANA Cockpit installs with its own HANA DB, which by default also requires the mount point /hana/shared.

However, I have also summarised the below steps, which are useful for a newbie installing it –

1. Download SAP HANA Cockpit 2.0 from the SAP Support Portal
2. SAP HANA Cockpit comes with its own SAP HANA DB & can be installed on a separate system
3. Extract the SAP HANA Cockpit SAR file with SAPCAR & install with the ./hdblcm command
4. Default port for SAP HANA Cockpit – 51029
5. Default port for SAP HANA XSA – 3<instance>30, i.e. 39630 for instance 96
6. Default instance number for SAP HANA Cockpit – 96
7. Default first-time login user in HANA Cockpit – COCKPIT_ADMIN with the master password (created during installation)
8. The SAP HANA Cockpit installation uses the default installation directory /hana/shared

Note – the default installation directory can be changed by adding an extra parameter when running ./hdblcm.

However, when I ran ./hdblcm, I got the below error because the SIGNATURE.SMF file was missing from the installation directory.

File ‘SIGNATURE.SMF’ cannot be found in the SAP HANA Database installation kit. To include the signature when extracting the installation kit from a .SAR archive, use the SAPCAR option: -manifest SIGNATURE.SMF You can ignore this error and continue with the execution by passing the ignore option ‘check_signature_file’ to hdblcm, started as root. See SAP note 2078425, section ‘How to prepare SAP HANA packages for installation or update’ for instructions how to prepare packages for update or implications of ignoring this error.

Then I followed the above SAP Note & ignored the missing file by passing the below command –

./hdblcm -sh --ignore=check_signature_file

The above is a workaround; I would suggest extracting the SIGNATURE.SMF file as per the SAP Note & then running the installation.
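The proper extraction is spelled out in the error text itself: SAPCAR's -manifest option includes the signature file when unpacking the SAR archive. The sketch below only builds the command string (the archive name is a placeholder for the actual Cockpit SAR file you downloaded):

```shell
# Hedged sketch: build the SAPCAR extraction command that keeps
# SIGNATURE.SMF, per the hdblcm error text. The SAR file name is a
# placeholder; run the echoed command in your download directory.
SAR=SAPHANACOCKPIT.SAR   # placeholder archive name
cmd="SAPCAR -xvf $SAR -manifest SIGNATURE.SMF"
echo "$cmd"
```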

Once I had successfully installed SAP HANA Cockpit, the installer gave me the below 2 URLs to access it.

Launch SAP HANA cockpit by opening https://hanabox1.us-east1-b.c.hana-273005:51027
Launch SAP HANA cockpit manager by opening https://hanabox1.us-east1-b.c.hana-273005.internal:51029

However, as I am running the Cockpit on a Google Cloud VM with a public IP, I wanted to access it from the outside world. I replaced the above hostname with the Google public IP, but the page still did not open, throwing the error “Can’t reach the page”.

I tried both ports – the SAP HANA Cockpit & XSA URLs – but both failed.

While digging deeper into the system, I identified that a change is required in the xscontroller.ini file at the below location –

/usr/sap/SID/SYS/global/hdb/custom/config

Make a copy of xscontroller.ini & change the below 2 parameters to your public IP –

default_domain = PUBLIC IP
api_url = https://PUBLIC IP:39630

Save the file & restart the HANA instance.
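The two parameter changes can be scripted. The sketch below works on a temp copy rather than the real file (which lives under /usr/sap/SID/SYS/global/hdb/custom/config), & the public IP is a placeholder from the documentation range:

```shell
# Sketch: patch default_domain and api_url in a copy of xscontroller.ini.
# PUBLIC_IP is a placeholder; substitute your VM's actual public IP.
PUBLIC_IP=203.0.113.10                      # documentation-range example IP
ini=$(mktemp)                               # stand-in for xscontroller.ini
printf 'default_domain = old.host\napi_url = https://old.host:39630\n' > "$ini"

cp "$ini" "$ini.bak"                        # keep a backup, as in the text
sed -i \
  -e "s|^default_domain = .*|default_domain = $PUBLIC_IP|" \
  -e "s|^api_url = .*|api_url = https://$PUBLIC_IP:39630|" \
  "$ini"
cat "$ini"
```

With a dynamic public IP this needs to be re-run after every VM restart, as noted below.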

Please note – I am using a dynamic public IP, so whenever I shut down & start the Google Cloud VM, the public IP changes & I need to update it again.

Once it starts successfully, wait for some time & check the XSA URL –

https://PUBLIC IP:39630

Voila… the page now opens, & you can log in to HANA Cockpit & manage your HANA DBs.

(Screenshot: SAP HANA Cockpit XSA page)

Let me know if you find this useful.

Tomcat Performance issue in SAP Business Objects 4.1 SP7

A new installation of SAP Business Objects 4.1 SP7 with the default Tomcat 7.0 comes with a max heap (-Xmx) of 2048M & MaxPermSize of 384M.

However, in some cases, when multiple users log in & reports are being executed, this memory is quite low & you can see performance issues on your BO server: the Adaptive Processing Server starts responding very slowly & sometimes hangs while creating OLAP connections.

Some errors are visible in the Tomcat logging directory –

/usr/sap//bin

However, in the BO system, setenv.sh is configured to use the bobjenv.sh file for the memory-related parameters.

We can increase the Tomcat memory in the below file, as per the SAP recommendation –

/usr/sap/SID/sap_bobj/tomcat/bin/bobjenv.sh

Create a backup of the original file & change its parameters as per the below notes.

Please note – before updating any Tomcat memory parameter, shut down the BO application & Tomcat, update the parameter & then restart Tomcat & the BO applications.

Verify the Tomcat memory settings on Linux –

ps -ef | grep tomcat (check for the new memory parameters)
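The parameter change itself can be sketched as below. The 4096m/512m target values are examples only – pick sizes per SAP Note 2405536 – & the temp file stands in for the real bobjenv.sh:

```shell
# Sketch: raise -Xmx and -XX:MaxPermSize in a copy of bobjenv.sh.
# The original 2048m/384m values are the BO 4.1 SP7 defaults noted above;
# the 4096m/512m targets are illustrative, not an SAP recommendation.
f=$(mktemp)                                 # stand-in for bobjenv.sh
echo 'JAVA_OPTS="$JAVA_OPTS -Xmx2048m -XX:MaxPermSize=384m"' > "$f"

cp "$f" "$f.bak"                            # back up the original first
sed -i -e 's/-Xmx2048m/-Xmx4096m/' \
       -e 's/-XX:MaxPermSize=384m/-XX:MaxPermSize=512m/' "$f"
cat "$f"
```

Remember to apply this only with Tomcat & the BO applications stopped, as noted above.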

Reference SAP Notes –
2405536 – Best Practice: The popular combinations of Tomcat Max Memory Pool size and MaxPermSize
2075671 – Best Practice: How to identify Tomcat crash, unresponsive or hanging issue

SAP ADS Rendering Issue

Recently we discovered that, after configuring the ADS service in SAP NetWeaver 7.40, we were facing a rendering error while running the report FP_CHECK_DESTINATION_SERVICE (via SA38).

We verified the configuration against multiple available SAP blogs but didn’t find a resolution for the rendering error.

As per the SAP Notes, rendering errors occur due to SAP Interactive Forms credentials, but we are not using those & have not configured any Interactive Forms, yet the errors still appeared.

Going through multiple SAP Notes, we discovered that this issue is related to missing SUSE OS libraries, which need to be installed; then restart the Java system, or alternatively restart only the Adobe Document Services.

Helpful SAP Notes –

2215355 – Complementary analysis for ADS termination
2029940 – IFbA: Required additional RPM package for ADS on Linux

SAP Business Objects 4.1 SP07 Installation Error

Hello All,

During an installation of SAP Business Objects 4.1 SP07 with a DB2 11.1 database, we encountered an error after entering the CMS DB & port & clicking Continue –

Database access error. Reason Loading shared object failed. First tried to load library db2 and failed because of error: [db2: cannot open shared object file: No such file or directory]. Second tried to load library libdb2.so and failed because of error: [/usr/lib64/libxml2.so.2: symbol gzopen64, version ZLIB_1.2.3.3 not defined in file libz.so.1 with link time reference]. (FWB 00090)

This error is misleading: an SAP Note search for BO 4.1 suggests performing some steps at the DB2 level, but in our case the error was related to library files missing from the installation folder.

We found the solution in SAP BO 4.2 SAP notes which are listed below –

https://launchpad.support.sap.com/#/notes/2694951
https://launchpad.support.sap.com/#/notes/2689156

As per SAP Note 2689156 (BI 4.2 SP05 installation failed with DB2 11.1 on Linux 7.3 & 7.4), the workaround is:

1. Download the 3 files libxml2.so, libxml2.so.2 & libxml2.so.2.9.5 from the note’s attachments.
2. Place these 3 lib files in the dunit folder of the downloaded SBOP BI PLATFORM SERVER setup files, e.g. BusinessObjectsServer/dunit/platform.services.cms.cpp.dbcheck-4.0-core-64 (so that both compatible versions of libxml2.so & its dependent libz.so files are in the DU directory).
3. Run the installer’s setup.sh; after the replacement it should proceed smoothly with the installation.
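The file placement step can be sketched as below. Temp directories stand in for the real extracted BusinessObjectsServer folder, & the three empty files stand in for the libraries attached to the note:

```shell
# Sketch of the note's workaround: drop the patched libxml2 files into
# the dbcheck dunit folder of the extracted installer. All paths here
# are stand-ins for the real BusinessObjectsServer directory tree.
src=$(mktemp -d)     # stand-in for the folder with the note's attachments
setup=$(mktemp -d)   # stand-in for the extracted BusinessObjectsServer dir
dunit="$setup/dunit/platform.services.cms.cpp.dbcheck-4.0-core-64"
mkdir -p "$dunit"

for f in libxml2.so libxml2.so.2 libxml2.so.2.9.5; do
  touch "$src/$f"    # in real life: the files downloaded from the note
  cp "$src/$f" "$dunit/"
done
ls "$dunit"
```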

The above solution worked like a charm in our case & we successfully installed BO 4.1 SP7.

I believe this post helps.

Microsoft Azure Administrator – AZ 103 Certified

I successfully cleared the AZ-103 certification in Jan 2020; it helps in understanding the functionality of Azure.
There are a lot of topics to cover for the AZ-103 exam. I went through a Udemy course as well.
However, Microsoft has announced that from Mar 2020, AZ-103 will be replaced by AZ-104.

Some topics which require particular attention:

What is Azure Policy
Azure Subscription
Azure Baseline Resource
Azure Storage
Storage Explorer
Redundant Storage
blob storage
azure backup
PowerShell – AzVM
virtual networks – subnet
route table
express route
Network Security Groups
conditional access
AD identity protection
access review
azure AD connect
create custom role (power shell & CLI)
Network performance monitor
network watch – https://azurecitadel.com/infra/vdc/lab4/

Some LABS –
https://www.microsoft.com/handsonlabs/selfpacedlabs/details/SP-AZ100059
https://www.microsoft.com/handsonlabs/selfpacedlabs/details/AZ00034
https://www.microsoft.com/handsonlabs/selfpacedlabs/details/SP-AZ100002

Open edX course – very useful
https://openedx.microsoft.com/courses/course-v1:Microsoft+AZ-103.1+2019_T2/course/

you can find the good information here –

New AZ-104 Exam Replaces AZ-103 – What We Know!

Google Cloud Associate Engineer Certification

Recently I prepared for the GCP Associate Cloud Engineer exam and passed. Here are some of my observations –

• Exam questions are quite tricky – not very tough, but you need basic knowledge of GCP. Some questions really require more attention, as the answers are very close to each other with only slight wording differences – take your time to read the question properly, then respond.

• As the Google FAQ says, some of the questions are unscored trial questions, but you don’t know which ones, so it’s better to focus on every question & make your attempt count.

• There are also scenario-based questions which require knowledge to understand & respond to.

• The topics mentioned in the exam details on the Google site were covered: GCE, IAM, Google Groups, copying roles, GKE / App Engine, Cloud Storage (Coldline for archiving), Deployment Manager, KMS, traffic splitting, VPC, gcloud commands, BigQuery & Bigtable, etc.

• I have work experience in SAP BASIS & am also working on cloud SAP projects; this was an added advantage for understanding the cloud concepts. However, I had to learn a lot & also get hands-on experience with Google Cloud.

• I learned the GCP concepts through the official Google documentation. It is a very comprehensive guide, but it obviously requires hands-on experience too, because you never know when the exam goes deeper & wider.

• The self-assessment test on the official Google site only gives you a fair idea of where you stand; it doesn’t mean you are ready to pass the exam. You have to go through each & every topic mentioned in the official exam guide.

• Once you complete the test, at the end you will get a provisional result indicating whether you passed or failed. Once you pass, within a day or so you will get an official email from Google with the certification link.

• As with other exams, there is no break in this one either; if you take a break, the time is deducted from your exam time.

• Resources for learning –

o Google official documentation
o Udemy / Cloudera
o Braincert (for Practice exams)

Learning Resources

https://cloud.google.com/certification/cloud-engineer
• GCE: https://cloud.google.com/compute/docs/faq
• App Engine: https://cloud.google.com/appengine/kb/
• Google IAM: https://cloud.google.com/iam/docs/faq
• gcloud: https://cloud.google.com/sdk/gcloud/
• gcloud command reference: https://cloud.google.com/sdk/gcloud/reference/
• Storage FAQs: https://cloud.google.com/storage/docs/faq
• Cloud SQL FAQs: https://cloud.google.com/sql/faq
• gsutil command reference: https://cloud.google.com/storage/docs/gsutil (refer to the gsutil Commands section)
• bq CLI reference: https://cloud.google.com/bigquery/docs/reference/bq-cli-reference
• Free Tier FAQs: https://cloud.google.com/free/docs/frequently-asked-questions
• Support: https://cloud.google.com/support/docs/
• Selecting storage options:https://cloud.google.com/images/storage-options/flowchart.svg
• GCP mapping to AWS & Azure:https://cloud.google.com/free/docs/map-aws-google-cloud-platform
https://cloud.google.com/free/docs/map-azure-google-cloud-platform
• Google Cost Calculator & Comparison: https://cloud.google.com/products/calculator/
https://cloud.google.com/pricing/#calculators
• 3rd Party Online Cost Comparison Websites:
https://calculator.unigma.com/
https://www.cloudorado.com/
• Practice Test:https://cloud.google.com/certification/practice-exam/cloud-engineer

Wish you the very best with your GCP certification. You can reach out to me for SAP BASIS or GCP short-term consulting.