Log analysis

Log analysis (or system and network log analysis) is an art and science seeking to make sense out of computer-generated records (also called log or audit trail records). The process of creating such records is called data logging.

Typical reasons why people perform log analysis are:

Compliance with security policies
Compliance with audit or regulation
System troubleshooting
Forensics (during investigations or in response to a subpoena)
Security incident response
Understanding online user behavior

Logs are emitted by network devices, operating systems, applications and all manner of intelligent or programmable devices. A stream of messages in time sequence often comprises a log. Logs may be directed to files and stored on disk, or directed as a network stream to a log collector.

Log messages must usually be interpreted with respect to the internal state of their source (e.g., the application) and announce security-relevant or operations-relevant events (e.g., a user login, or a system error).

Logs are often created by software developers to aid in debugging the operation of an application. The syntax and semantics of data within log messages are usually application- or vendor-specific. The authentication of a user to an application may be described as a login, a logon, a user connection or an authentication event. Hence, log analysis must interpret messages within the context of an application, vendor, system or configuration in order to make useful comparisons to messages from different log sources.

Log message format or content may not always be fully documented. A task of the log analyst is to induce the system to emit the full range of messages in order to understand the complete domain from which the messages must be interpreted.

A log analyst may map varying terminology from different log sources into a uniform, normalized terminology so that reports and statistics can be derived from a heterogeneous environment. For example, log messages from Windows, Unix, network firewalls and databases may be aggregated into a "normalized" report for the auditor.
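A minimal sketch of such terminology mapping, using hypothetical event names and message terms (not drawn from any particular product), could look like the following Python fragment:

# Hypothetical mapping of vendor-specific terms onto one normalized event name.
TERMINOLOGY_MAP = {
    "logon": "user_login",
    "login": "user_login",
    "user connection": "user_login",
    "authentication succeeded": "user_login",
}

def normalize_event(raw_term):
    # Return the normalized event name, or the original term if it is unknown.
    return TERMINOLOGY_MAP.get(raw_term.lower().strip(), raw_term)

print(normalize_event("Logon"))                     # Windows-style term -> user_login
print(normalize_event("authentication succeeded"))  # Unix-style term   -> user_login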

Hence, log analysis practices exist on the continuum from text retrieval to reverse engineering of software.

Functions and technologies

Pattern recognition is the function of selecting incoming messages and comparing them with a pattern book in order to filter them or handle them in different ways.
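As a sketch, assuming a simple regular-expression "pattern book" and invented message text, pattern recognition could be implemented along these lines in Python:

import re

# Hypothetical pattern book: each entry pairs a compiled regex with a label
# that determines how matching messages are handled downstream.
PATTERN_BOOK = [
    (re.compile(r"Failed password for (?P<user>\S+)"), "auth_failure"),
    (re.compile(r"Accepted password for (?P<user>\S+)"), "auth_success"),
]

def match_message(message):
    # Return (label, captured fields) for the first matching pattern,
    # or None so unmatched messages can be filtered out or routed elsewhere.
    for pattern, label in PATTERN_BOOK:
        m = pattern.search(message)
        if m:
            return label, m.groupdict()
    return None

print(match_message("sshd[433]: Failed password for alice from 10.0.0.5"))
# -> ('auth_failure', {'user': 'alice'})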

Normalization is the function of converting message parts to the same format (e.g., a common date format or normalized IP addresses).
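For instance, a small sketch using Python's standard datetime and ipaddress modules (the timestamp format shown is only an example) could normalize dates and addresses like this:

from datetime import datetime, timezone
import ipaddress

def normalize_timestamp(raw, source_format):
    # Parse a source-specific timestamp and re-emit it as ISO 8601 in UTC.
    return datetime.strptime(raw, source_format).replace(tzinfo=timezone.utc).isoformat()

def normalize_ip(raw):
    # Canonicalize an IP address to its standard textual form (e.g., compressed IPv6).
    return str(ipaddress.ip_address(raw))

print(normalize_timestamp("12/Jan/2024:10:02:01", "%d/%b/%Y:%H:%M:%S"))
# -> 2024-01-12T10:02:01+00:00
print(normalize_ip("2001:0db8:0000:0000:0000:0000:0000:0001"))
# -> 2001:db8::1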

Classification and tagging is the ordering of messages into different classes, or the tagging of them with different keywords for later use (e.g., filtering or display).
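A minimal keyword-based tagger, with made-up class names and keywords, might look like this:

# Hypothetical tagging rules: a message receives every tag whose keywords it contains.
TAG_RULES = {
    "security": ("failed password", "denied", "unauthorized"),
    "error": ("error", "exception", "traceback"),
    "network": ("connection reset", "timeout"),
}

def tag_message(message):
    # Return all tags that apply, for later filtering or display.
    text = message.lower()
    return [tag for tag, keywords in TAG_RULES.items()
            if any(keyword in text for keyword in keywords)]

print(tag_message("sshd: Failed password for root; connection reset by peer"))
# -> ['security', 'network']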

Correlation analysis is a technique for collecting messages from different systems and finding all the messages that belong to one single event (e.g., messages generated by malicious activity on different systems: network devices, firewalls, servers, etc.). It is usually connected with an alerting system.
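A toy sketch of correlation, assuming messages have already been normalized into records that share a source-IP field (the field names and records are illustrative), might group and alert as follows:

from collections import defaultdict

# Hypothetical normalized records collected from different systems.
records = [
    {"system": "firewall", "src_ip": "203.0.113.5", "msg": "port scan detected"},
    {"system": "server",   "src_ip": "203.0.113.5", "msg": "failed login for admin"},
    {"system": "ids",      "src_ip": "198.51.100.7", "msg": "signature match"},
]

def correlate(records, key):
    # Group messages from different systems by a shared attribute (here, the source IP).
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    return groups

for ip, events in correlate(records, "src_ip").items():
    if len(events) > 1:  # the same address appears in more than one system's logs
        systems = sorted({e["system"] for e in events})
        print("ALERT:", ip, "seen in", len(events), "events across", systems)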

Artificial ignorance is a process whereby log entries that are known not to be interesting are discarded; anything left after the known-uninteresting entries have been thrown away must, by elimination, be interesting. Artificial ignorance is a method to detect anomalies in a working system. In log analysis, this means recognizing and ignoring the regular, common log messages that result from the normal operation of the system, and are therefore not of much interest. However, new messages that have not appeared in the logs before can signal important events, and should therefore be investigated.[1][2]
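As a rough sketch (the ignore patterns and log lines below are invented for illustration), artificial ignorance amounts to filtering out everything that matches a list of known-routine messages and keeping the rest for review:

import re

# Hypothetical patterns covering messages known to result from normal operation.
IGNORE_PATTERNS = [
    re.compile(r"session opened for user \S+"),
    re.compile(r"CRON\[\d+\]"),
    re.compile(r"health check OK"),
]

def interesting(lines):
    # Yield only lines that match none of the known-routine patterns;
    # whatever remains has not been seen before and deserves a closer look.
    for line in lines:
        if not any(pattern.search(line) for pattern in IGNORE_PATTERNS):
            yield line

log = [
    "Jan 12 su: session opened for user backup",
    "Jan 12 CRON[911]: job started",
    "Jan 12 kernel: disk I/O error on sda",
]
print(list(interesting(log)))  # -> ['Jan 12 kernel: disk I/O error on sda']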

References