To ensure that the IBM® Operations Analytics - Log Analysis Hadoop tier
is correctly set up and configured, you can run some basic tests.
Procedure
- Ingest log data to the IBM Operations Analytics - Log Analysis server.
- You can install sample data on the IBM Operations Analytics - Log Analysis UI, https://<LA_server>:9987/Unity.
To ensure that log data is correctly ingested, you can also ingest
logs from the Insight Pack that is used in the setup.
- Ingestion writes the log data to Avro files
on HDFS in the <LA_HADOOP_TIER>/data folder,
in the form <UnityCollection_Timestamp>/<DATA_SOURCE_NAME>/<DAY_BASED_ON_TIMESTAMP_IN_LOG_RECORDS>/<counter>.avro.
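As a quick spot-check, you can confirm that ingestion produced Avro files in the expected layout. This is only a sketch: the `/opt` value for <LA_HADOOP_TIER> and the sample path are hypothetical placeholders, and the `hdfs` command assumes the Hadoop client is on your path.

```shell
# Hypothetical tier location; substitute your configured <LA_HADOOP_TIER>.
LA_HADOOP_TIER=/opt/la-hadoop-tier

# Real check (requires the Hadoop client): list the Avro files under data/.
#   hdfs dfs -ls -R "${LA_HADOOP_TIER}/data" | grep '\.avro$'

# Offline sketch: validate that a path follows the documented layout
# <UnityCollection_Timestamp>/<DATA_SOURCE_NAME>/<DAY_...>/<counter>.avro
sample='UnityCollection_1700000000/myDataSource/2024-01-15/1.avro'
pattern='^UnityCollection_[^/]+/[^/]+/[^/]+/[0-9]+\.avro$'
echo "$sample" | grep -Eq "$pattern" && echo "layout OK"
```

If the listing shows no `.avro` files, ingestion did not reach the Hadoop tier and the log files named later in this procedure are the place to look.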
- Perform searches on the Hadoop tier
through the IBM Operations Analytics - Log Analysis web
UI, https://<LA_server>:9987/Unity.
- Prepend the search query in the UI with [_hq].
For example, [_hq]*
- The search on the Hadoop tier
runs as a MapReduce job on the Hadoop cluster.
- Prepend the same search query in the UI with [_sq],
for example, [_sq]*, to run the query against Apache Solr.
- Compare the search results.
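The two prefixes route the identical query string to different back ends. A trivial sketch of the pairing (the queries are entered in the search UI, not on a command line):

```shell
# Same wildcard query, two routings; the prefix is part of the query text.
hq_query='[_hq]*'   # [_hq] -> runs as a MapReduce job on the Hadoop tier
sq_query='[_sq]*'   # [_sq] -> runs against Apache Solr
printf 'Hadoop tier: %s\nSolr:        %s\n' "$hq_query" "$sq_query"
```

If both tiers are healthy, the two result sets for the same query should agree; a discrepancy points at an ingestion or indexing problem on one tier.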
- To identify errors, open the following files:
- UnityApplication.log on the IBM Operations Analytics - Log Analysis server.
- <HOME>/IBM/LogAnalysis/logs/hadooptier.log on
the IBM Operations Analytics - Log Analysis server.
- <LA_SERVICE_HOME>/logs on
the IBM Operations Analytics - Log Analysis services
server.
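A quick way to work through those files is to grep them for error and exception lines. The sample log below is fabricated for illustration; on a real server, point grep at the files listed above, for example <HOME>/IBM/LogAnalysis/logs/hadooptier.log.

```shell
# Fabricated sample log standing in for hadooptier.log; replace the path
# with <HOME>/IBM/LogAnalysis/logs/hadooptier.log on a real server.
cat > /tmp/sample_hadooptier.log <<'EOF'
2024-01-15 10:00:01 INFO  Hadoop tier service started
2024-01-15 10:00:05 ERROR Failed to write Avro block: connection refused
EOF

# Show error and exception lines with their line numbers.
grep -inE 'error|exception' /tmp/sample_hadooptier.log
```

The same scan applied under <LA_SERVICE_HOME>/logs on the services server narrows down whether a failure happened during ingestion or during the MapReduce search.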