Frequently Asked Questions (FAQ)
General
Are there performance impacts for data collection?
Our product can profile an individual piece of code in ABAP or Java, and we have measured the average overhead in ABAP to be 1-2% of system CPU. A chart is included in our product information sheet.
Can the recording of log information be filtered or restricted to specific events, log files, and so on, or is a complete data extraction always carried out?
Each dataset we extract can be:
Turned ON or OFF
Every individual field inside an extractor can be turned ON or OFF, or scrambled (obfuscated) for security reasons
Large data extractors support delta extraction natively
Which SAP versions are supported?
All SAP versions that are still supported by SAP: NW 7.00 - NW 7.50 (including 7.51-7.53), S/4HANA 1610, 1709, 1809, 1909, 2020, 2021, 2022.
Is a shift of collected data in separate Splunk indices possible?
Yes, you can send each individual dataset to a different Splunk index, but all indexes must be on the same Splunk cluster.
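For illustration, a Splunk HTTP Event Collector payload can carry an `index` field that routes the event to a specific index on the receiving cluster. The index name, sourcetype, and event fields below are hypothetical examples, not the actual PowerConnect schema:

```json
{
  "index": "sap_prod_basis",
  "sourcetype": "sap:pc:metric",
  "event": {
    "sid": "PRD",
    "metric": "ST03_WORKLOAD",
    "response_time_ms": 512
  }
}
```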
How much network traffic does a single production SAP system generate?
Assuming 10 GB/day (an average figure), we can calculate it: 10 GB/day = 426 MB/hour = 121 KB/s = roughly 970 Kbit/s, or just under 1 Mbit/s. Given that most network connections are now 1 Gbit/s, this is about 0.1% of the link.
Even at 100 GB/day the traffic would still be very low.
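The arithmetic above can be reproduced with a short script. The 10 GB/day figure is the assumed average cited in the answer, not a measurement of any specific system; minor differences from the figures above come from rounding direction.

```python
# Convert an assumed daily log volume into sustained network bandwidth.
# 10 GB/day is the average assumed above; substitute your own volume.

def daily_volume_to_bandwidth(gb_per_day: float) -> dict:
    mb_per_hour = gb_per_day * 1024 / 24       # MB per hour
    kb_per_sec = mb_per_hour * 1024 / 3600     # KB per second
    kbit_per_sec = kb_per_sec * 8              # Kbit per second
    share_of_1gbit = kbit_per_sec / 1_000_000  # fraction of a 1 Gbit/s link
    return {
        "MB_per_hour": round(mb_per_hour),
        "KB_per_sec": round(kb_per_sec),
        "Kbit_per_sec": round(kbit_per_sec),
        "percent_of_1Gbit_link": round(share_of_1gbit * 100, 2),
    }

print(daily_volume_to_bandwidth(10))
```

Even at ten times the volume, the result stays around 1% of a 1 Gbit/s link, which matches the answer's conclusion.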
Does PowerConnect support Hybris and SAP Cloud Platform?
Yes, we support Hybris on-premise and SAP Cloud, but we do not support Hybris Cloud at this time because SAP does not provide access to any third party other than Dynatrace.
Is the traffic from PowerConnect to Splunk encrypted?
Yes, it is encrypted.
We have over (<insert big number>) production SAP systems - can we automatically deploy/install PowerConnect via automation (ansible, chef, puppet...)?
These tools cannot interface directly with the SAP GUI, so fully automated installation on an ABAP system is not possible. Installation on the Java stack can occur via Telnet, which could be partially automated, but the security setup and configuration in the PowerConnect app itself cannot be performed through these tools. We have customers with 300+ SAP systems (SIDs) and over 1,500 servers, so scale is not a problem in our view.
How are dual stack systems licensed?
If a system is dual stack, it requires one PowerConnect license.
Does installing PowerConnect incur additional license charges in S/4 HANA from SAP?
Customers will not incur additional license charges from SAP due to the installation of PowerConnect, for the following reasons:
PowerConnect is an SAP certified add-on which is ABAP based and is installed via SAINT transaction in SAP
PowerConnect requires 1 (one) background SAP user to run PowerConnect batch jobs
PowerConnect collects all the data from Application APIs and doesn’t interact with the database directly
PowerConnect reads data programmatically on a schedule and sends it to Splunk for read-only / reporting purposes only; it does not write any data back to SAP. This scenario is covered as indirect static read in the SAP indirect usage guide: https://news.sap.com/wp-content/blogs.dir/1/files/Indirect_Access_Guide_for_SAP_Installed_Base.pdf
ABAP
If filtering is possible, how can this filter be provided to all SAP systems?
The filter needs to be set up in each system. However, it can be set up in one system and exported (as can all of the configuration) and then imported into a second system, so it only has to be set up once. PowerConnect ships with a default configuration that can be adjusted and exported for import into subsequent systems. Functionality is planned for SP4 or SP5 (due Q1 CY2021) to allow central configuration to be set up and pulled down to each system automatically.
Which authorizations are required for data collection?
There is a standard SAP role we provide that can be imported into your SAP system and contains all of the required permissions for the PowerConnect batch job. You can import it into a development system and examine it.
How is the data transferred and stored in Splunk?
Data is collected by ABAP code and sent directly from SAP to Splunk in JSON format. When it arrives in Splunk, it is stored in an index, which Splunk holds on disk on the indexer.
Are there any implications or issues with interruptions with users when installing PowerConnect in a production system?
No user interruptions. It is a regular SAINT add-on installation, performed in client 000. Moreover, PowerConnect is fully custom, so it does not require users to stop their activities.
Which SAP transaction codes can PowerConnect access?
PowerConnect has over 200 extractors available out-of-the-box for the SAP ABAP system. Additionally, PowerConnect contains an extensible framework which allows it to collect information from any table within the SAP environment.
How often does PowerConnect pull the various transaction codes?
By default, data is extracted at intervals between 30 and 86,400 seconds depending on the transaction, but this can be customized within the administrative console to meet the user's preference.
Can PowerConnect access custom applications and SAP tables?
Yes, it can. There is an extensible framework which allows PowerConnect to extract data from any table within the SAP system, including Z tables.
How long does PowerConnect keep the data?
PowerConnect stores data within the SAP system for as little as one day for troubleshooting purposes. Once the data is sent to Splunk it can be stored as long as the user would like.
What happens if the connectivity to the Splunk System becomes unavailable?
If there is an issue connecting to Splunk, the data is stored temporarily in SAP and is sent once connectivity to Splunk is re-established. This data is kept for 1 day by default, but can be kept longer if needed; once the data is sent to Splunk, it is deleted from temporary storage. The best way to mitigate this problem is to set up alerting that fires if data has not been sent to Splunk for a specific period of time.
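One way to set up such an alert on the Splunk side is a scheduled search that fires when no events have arrived for a given period. This is a sketch; the index name `sap_prod` and the 30-minute threshold are hypothetical examples to adapt to your environment:

```
| tstats latest(_time) as last_seen where index=sap_prod
| eval minutes_silent = round((now() - last_seen) / 60)
| where minutes_silent > 30
```

Saved as an alert that triggers on any result, this notifies you whenever the index has been silent for more than 30 minutes.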
How does PowerConnect retrieve OS statistics?
OS statistics are obtained from the ST06 transaction code. Customers can also use Splunk collectd to obtain additional information from the OS.
Can PowerConnect run a custom job on the SAP system and collect the output?
We can execute a custom batch job, capture both the SM37 job log and the ABAP output, and send them to Splunk. We can also execute any ABAP code the customer has, but it requires a wrapper around the code to allow it to interface with PowerConnect.
Is PowerConnect able to access SAP GUI data such as network latency or response time?
Response time is stored in the STAD transaction, which is collected by the PowerConnect application. For network data, there is a RUM monitor for network traffic with a geo heat map as a standard dashboard.
Can PowerConnect send certain transaction codes in real time and others in batch mode?
PowerConnect sends data to Splunk in near-real time. By default, transactions are collected at intervals between 30 and 86,400 seconds depending on the type of transaction. This can be customized to the user's desired data extraction frequency in the PowerConnect administrative console.
Are there capabilities to limit the number of work processes that PowerConnect will consume at any one time?
It is possible to limit the number of /BNWVS/DISTRIBUTE_PARALLEL jobs and the number of DIA processes spawned by /BNWVS/BC_DATA_EXTRACT.
Are there capabilities to limit the PowerConnect work processes to a specific app server?
It is possible to define a specific app server for batch jobs in Administrator->Setup Global config. DIA work processes can be targeted by setting the RFC_GROUP parameter in the Global config (the group should be created in advance).
What are limitations of PowerConnect in SAP Private Cloud?
Unlike the SAP HEC environment, there are currently no limitations for SAP Private Cloud. Regular S/4HANA packages can be installed in this environment.
Are the PowerConnect jobs expected to run for an extended period of time?
Long-running PowerConnect jobs are normal behavior. These jobs run for a defined interval and then restart themselves. The interval is configurable and defaults to 1 day (86,400 seconds). The jobs spend most of their time asleep, and while in that state they consume no resources. The maximum runtime value is defined in seconds, and in theory you could enter any valid number here; however, each restart carries a performance overhead (loading the necessary configs, buffers, etc.) and a delay before the job starts again, which can impact metric extraction and uploading. It is therefore a matter of balance: find the maximum value that will not trigger any alerts and at the same time will not slow down metric extraction/uploading. Our recommendation is to keep the default value of 1 day (86,400 seconds).
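As a rough illustration of the trade-off described above, assume (hypothetically) that each restart costs about 60 seconds of overhead for cleanup, config loading, and rescheduling. This is a sketch with an assumed overhead figure, not a measured PowerConnect value:

```python
# Hypothetical model: share of each job cycle consumed by restart
# overhead rather than metric extraction. The 60 s overhead per
# stop/start cycle is an illustrative assumption, not a measured value.

RESTART_OVERHEAD_SEC = 60  # assumed cost of one stop/start cycle

def overhead_share(max_runtime_sec: int) -> float:
    """Return the percentage of a job cycle lost to restart overhead."""
    cycle = max_runtime_sec + RESTART_OVERHEAD_SEC
    return round(100 * RESTART_OVERHEAD_SEC / cycle, 2)

for runtime in (86_400, 43_200, 3_600):  # 1 day, 12 hours, 60 minutes
    print(f"{runtime:>6} s runtime -> {overhead_share(runtime)} % overhead")
```

The shorter the maximum runtime, the larger the share of time spent stopping and starting instead of collecting metrics, which is why the 1-day default is recommended.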
Is it to be expected that PowerConnect will always consume two BTC work processes for the collector/uploader, plus another (briefly) every 5 minutes for the check job?
Yes, it is expected behavior (please see the table below). We need some jobs running continuously to collect/send metrics.
| Activity | WP type | Job name | Periodical | Schedule | Comment |
| --- | --- | --- | --- | --- | --- |
| Check | BTC | /BNWVS/BC_CHECK_JOB | X | Scheduled to run every 5 minutes | |
| Archive | BTC | /BNWVS/BC_DATA_ARCHIVE | X | Runs twice per day (according to the config) | Frequency can be defined in Administrator->Setup Global config: ARCHIVE_DURATION_RUN (seconds) |
| Extract dispatcher | BTC | /BNWVS/BC_DATA_EXTRACT | | | |
| Upload dispatcher | BTC | /BNWVS/BC_DATA_TRANSFER | | | |
| Send/Distribute | BTC | /BNWVS/DISTRIBUTE_PARALLEL | | UPLOAD_JOB_COUNT | Sends metrics to Splunk. Maximum job count can be set in Administrator->Setup uploader config: UPLOAD_COUNT (metrics), UPLOAD_JOB_COUNT (jobs), MAX_UPLOAD_ITERATION (keep jobs in the queue) |
| Extract metric | DIA | - | | MAX_PARALLEL_PROCESS | Extracts a specific metric. The maximum count can be set in MAX_PARALLEL_PROCESS |
18. Can the PowerConnect extract and load background job runtime be reduced from its default setting of one day (86,400 seconds)?
a. PowerConnect extract and load jobs are designed by default to restart every day to ensure that memory is cleared and does not lock up.
PowerConnect only consumes 1 or 2 work processes, which is around one percent of the total available work processes in a typical production system.
The background jobs spend most of their time asleep, so while they may be running most of the time, they are not doing anything. It is therefore fine to leave these background jobs running.
You can shorten the duration to, say, 12 hours. In that case there are usually 2 jobs running instead of 1, and there may not be any benefit from this action.
We do not test shortened durations, but they are supported. During shutdown the background jobs have to do some work to clean up and stop cleanly, so the more restarts (stop and start), the more work the jobs do for no real benefit.
If there are alerts for long-running jobs, we usually ask our customers to exclude these jobs; in fact, the PowerConnect dashboard for SM37 has an "Exclude PC Jobs" switch to remove them from monitoring/reporting easily.
Some customers have reduced the duration to 12 hours. We do not recommend going below this value. For example, when you set the duration to 60 minutes, PowerConnect spends more time stopping and starting than collecting metrics.
19. What triggers the email sent to the notification email address entered during the wizard?
PowerConnect sends a notification email to the address defined during the initial configuration wizard in the following cases:
a. License expiration notification - the email is sent each day during the last 14 days prior to the license expiration date.
b. License consumption notification - the email is sent after reaching the following levels of license consumption for the current day:
80%
90%
100%
105%
Please note that metric distribution will be stopped once license consumption reaches 105% and will be restored again the next day.