Data Description

The LOGS event is used to extract any text file from the SAP application server so that its contents can be viewed in Splunk.

Potential Use Cases

This event could be used for the following scenarios:

  • Correlate batch job failures with the work process trace files.

  • Identify environment-wide concerns when a work process cancels.

  • Alert on specific error messages that appear in the server logs across the environment (see the sketch after this list).
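
For the alerting use case, the sketch below shows one way such LOGS events could be queried from Splunk with the Splunk Enterprise SDK for Python (splunklib). The host, credentials, index name, and error pattern are illustrative assumptions, not part of the product; in practice the same search string can simply be saved as a Splunk alert.

```python
# Illustrative only: hypothetical host, credentials, index, and error pattern.
# Assumes the Splunk Enterprise SDK for Python (pip install splunk-sdk).
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com",  # hypothetical search head
    port=8089,
    username="admin",
    password="changeme",
)

# Look for error messages inside the extracted file contents (FILE_DATA).
query = (
    'search index=sap EVENT_TYPE=LOGS FILE_DATA="*ERROR*" '
    '| table CURRENT_TIMESTAMP INSTANCE_NAME FILE_NAME FILE_DATA'
)
stream = service.jobs.oneshot(query, output_mode="json", earliest_time="-15m")

for item in results.JSONResultsReader(stream):
    if isinstance(item, dict):  # skip diagnostic messages from the reader
        print(item.get("FILE_NAME"), item.get("FILE_DATA"))
```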

Metric Filters

The metric filter for the LOGS extractor can be found by logging into the managed system and executing the /n/bnwvs/main transaction. Then go to Administrator → Metric Filters → Logs file filter.

...

To enter a new filter value, select the add row option, and enter the values based on the options above. Save, and the data will be extracted.

...

Splunk Event

The event will look like this in Splunk:

...
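
As a purely hypothetical illustration (the screenshot above shows the actual format), the dictionary below lists the fields a LOGS event carries, using the field names documented in the Field Mapping section further down. Every value shown here is made up for the example.

```python
# Hypothetical LOGS event, for illustration only -- field names come from the
# Field Mapping table below; all values are invented.
sample_event = {
    "EVENT_TYPE": "LOGS",
    "EVENT_SUBTYPE": "",                    # populated by the extractor
    "CURRENT_TIMESTAMP": "20240501123045",  # assumes the full YYYYMMDDHHMMSS form
    "UTCDIFF": "020000",
    "UTCSIGN": "+",
    "INSTANCE_NAME": "ID8_DVEBMGS00",
    "FILE_PATH": "E:\\usr\\sap\\ID8\\DVEBMGS00\\work",
    "FILE_NAME": "dev_w0",
    "FILE_DATA": "example log line",
    "SEQ_NUM": 1,
}
print(sample_event["FILE_NAME"], sample_event["FILE_DATA"])
```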

SAP Navigation

Navigate to this data by logging in to the application server and accessing the directory specified in the data extraction format. Alternatively, the data can be accessed with the AL11 t-code: execute the transaction in the managed system. For this example, we will access a file in the following directory path: E:\usr\sap\ID8\DVEBMGS00\work. Once in the transaction, double-click the directory path that was specified in the Metric Filters configuration.

...

The information displayed below matches the information that is passed to Splunk. Once the log file is specified in Metric Filters, the extractor parses the timestamp from each line, so log entries are pushed with timestamp metadata. For the remaining text files, the event text is tagged with the timestamp of when the data was extracted (not the actual time the entry was added to the log).
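
To make that timestamp behaviour concrete, the hedged sketch below shows one way a per-line timestamp could be parsed, falling back to the extraction time when no timestamp is found. The regular expression, timestamp format, and example lines are illustrative assumptions, not the extractor's actual implementation.

```python
# Illustrative sketch only -- not the extractor's actual code.
# Assumes log lines that begin with "YYYY-MM-DD HH:MM:SS"; adjust the
# pattern and format to whatever the monitored log file really uses.
import re
from datetime import datetime

LINE_TS = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})")

def event_timestamp(line: str, extracted_at: datetime) -> datetime:
    """Return the timestamp parsed from the line, or the extraction time."""
    match = LINE_TS.match(line)
    if match:
        return datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
    return extracted_at  # plain text files fall back to the collection time

if __name__ == "__main__":
    now = datetime.now()
    print(event_timestamp("2024-05-01 12:30:45 work process restarted", now))
    print(event_timestamp("no timestamp on this line", now))
```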

...

Field Mapping

The field mapping between the data from SAP and values in Splunk can be seen in the table below:

Field             | Description                                                                                               | Unit of Measure
CURRENT_TIMESTAMP | The date and time stamp when the information was collected                                               | YYYYMMDDHHM
EVENT_SUBTYPE     |                                                                                                           | String
EVENT_TYPE        | LOGS                                                                                                      | String
FILE_DATA         | The data from the log file                                                                                | String
FILE_NAME         | The file name from which the log was extracted                                                            | String
FILE_PATH         | The file path from which the log was extracted                                                            | String
INSTANCE_NAME     | The instance name from which the log was extracted                                                        | String
SEQ_NUM           | Sequence number of the event in the batch (populated when the log entry is split into multiple events)   | Numeric
UTCDIFF           | The UTC offset, in HHMMSS, in which the data was collected                                                | HHMMSS
UTCSIGN           | The sign of the UTC offset. Positive (+) means add UTCDIFF, and negative (-) means subtract UTCDIFF, to obtain the time zone adjusted date and time the data was collected | + or -
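
As a worked example of the UTCDIFF and UTCSIGN arithmetic, the hedged sketch below applies the offset to CURRENT_TIMESTAMP. It assumes the full YYYYMMDDHHMMSS form of the timestamp (the unit shown in the table appears truncated) and follows the UTCSIGN description literally: add the offset when the sign is +, subtract it when the sign is -.

```python
# Illustrative sketch of the UTCDIFF / UTCSIGN arithmetic described above.
# Assumes CURRENT_TIMESTAMP uses the full YYYYMMDDHHMMSS form and that
# UTCDIFF is an HHMMSS offset to add (+) or subtract (-).
from datetime import datetime, timedelta

def adjust_timestamp(current_timestamp: str, utcdiff: str, utcsign: str) -> datetime:
    collected = datetime.strptime(current_timestamp, "%Y%m%d%H%M%S")
    offset = timedelta(
        hours=int(utcdiff[0:2]),
        minutes=int(utcdiff[2:4]),
        seconds=int(utcdiff[4:6]),
    )
    return collected + offset if utcsign == "+" else collected - offset

# Example: a collection time with a +02:00:00 offset.
print(adjust_timestamp("20240501123045", "020000", "+"))  # 2024-05-01 14:30:45
```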