Standard edition | IBM Operations Analytics - Log Analysis, Version 1.3.1

Testing the IBM Operations Analytics - Log Analysis Hadoop tier

To ensure that the IBM® Operations Analytics - Log Analysis Hadoop tier is correctly set up and configured, you can run some basic tests.

Procedure

  1. Ingest log data to the IBM Operations Analytics - Log Analysis server.
    1. You can install sample data from the IBM Operations Analytics - Log Analysis UI, https://<LA_server>:9987/Unity. To verify that log data is ingested correctly, you can also ingest logs from the Insight Pack that will be used in the setup.
    2. Ingestion writes the log data as Avro files on the HDFS in the <LA_HADOOP_TIER>/data folder, in the form <UnityCollection_Timestamp>/<DATA_SOURCE_NAME>/<DAY_BASED_ON_TIMESTAMP_IN_LOG_RECORDS>/<counter>.avro.
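On the cluster itself, you can confirm the layout that step 1 describes with hdfs dfs -ls -R <LA_HADOOP_TIER>/data. The sketch below mocks that directory structure on the local filesystem (all folder and file names are placeholders, not values from a real deployment) so the expected shape is easy to check:

```shell
# Mock of the HDFS layout written by ingestion (placeholder names only).
# On the real cluster you would run instead:
#   hdfs dfs -ls -R <LA_HADOOP_TIER>/data
mock_root="$(mktemp -d)/data"
day_dir="$mock_root/UnityCollection_1450000000/MyDataSource/2015-12-13"
mkdir -p "$day_dir"
touch "$day_dir/1.avro" "$day_dir/2.avro"

# List every Avro file, mirroring what the HDFS listing should show
find "$mock_root" -name '*.avro' | sort
```

If the listing under <LA_HADOOP_TIER>/data is empty after ingestion, the Hadoop tier did not receive the data and the logs in step 3 are the next place to look.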
  2. Perform searches on the Hadoop tier through the IBM Operations Analytics - Log Analysis web UI, https://<LA_server>:9987/Unity.
    1. Prepend the search query in the UI with [_hq]. For example, [_hq]*
    2. The search on the Hadoop tier runs as a MapReduce job on the Hadoop cluster.
    3. Prepend the same search query in the UI with [_sq], for example, [_sq]*, and perform the query on Apache Solr.
    4. Compare the results of the two searches. The Hadoop tier and Apache Solr should return the same records for the same query.
  3. To identify errors, open the following files:
    • UnityApplication.log on the IBM Operations Analytics - Log Analysis server.
    • <HOME>/IBM/LogAnalysis/logs/hadooptier.log on the IBM Operations Analytics - Log Analysis server.
    • <LA_SERVICE_HOME>/logs on the IBM Operations Analytics - Log Analysis services server.
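A quick way to scan the files above is to grep them for error-level entries. The sketch below builds a small sample log (the content is invented for illustration) and shows the pattern; on a real server you would point the same grep at UnityApplication.log, hadooptier.log, and the files under <LA_SERVICE_HOME>/logs:

```shell
# Sample file standing in for UnityApplication.log / hadooptier.log
log="$(mktemp)"
cat > "$log" <<'EOF'
2015-12-13 10:01:02 INFO  ingestion started
2015-12-13 10:01:09 ERROR failed to write avro file: permission denied
EOF

# Surface error-level entries and exception markers, case-insensitively
grep -iE 'error|exception' "$log"
```

Running the same grep across all three log locations narrows down which component of the Hadoop tier is failing.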

