Importing your inventory data from OTbase into Elasticsearch is very easy due to the Portable Inventory Data format. Here's the process:
1. Export the inventory section that you want to import into Elasticsearch
Go to the device or software inventory and select the devices that you want to export. The JSON export will contain the metadata of every device that matches your selection.
Once you are happy with your selection, click "JSON Export".
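For orientation, here is a rough sketch of the exported structure as the converter script in step 2 expects it: a single JSON document with a top-level "devices" array, one object per device. The field names inside each device below are made up for illustration and are not the actual OT-BASE schema:

```python
from json import dumps, loads

# Hypothetical sample of the Portable Inventory Data envelope. Only the
# top-level "devices" array is taken from the converter script; the
# per-device fields are illustrative.
export = {
    "devices": [
        {"name": "PLC-01", "vendor": "Siemens", "ipAddresses": ["10.0.0.12"]},
        {"name": "HMI-03", "vendor": "Rockwell", "ipAddresses": ["10.0.0.45"]},
    ]
}

# The converter in step 2 parses this structure and iterates over "devices".
parsed = loads(dumps(export))
print(len(parsed["devices"]))  # → 2
```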
2. Convert Portable Inventory Data to Elasticsearch's bulk input format
Elasticsearch's _bulk API does not accept a plain JSON document or array; it expects newline-delimited JSON (NDJSON), which requires you to put an action line (such as { "index":{} }) before every single device in your JSON file, and to end every action line, and each subsequent document, with a newline. If you don't have a tool in place that already does this, you can use the following Python program:
# encoding: utf-8
from json import loads, dumps
from sys import argv

DESCRIPTION = """
Generate Elasticsearch bulk input files from OT-BASE export data.
"""
USAGE = 'es_convert.py <OT-BASE Devices.json> <Result.data>'

def _main():
    if len(argv) != 3:
        raise Exception("%s\nUsage: %s" % (DESCRIPTION, USAGE))
    print("Reading JSON file \"%s\"..." % argv[1])
    with open(argv[1], 'r') as f:
        jsonData = loads(f.read())
    result = ""
    print("Processing data...")
    for device in jsonData.get('devices', []):
        result += "{ \"index\":{} }\n" + dumps(device) + "\n"
    print("Writing Elasticsearch bulk input file \"%s\"..." % argv[2])
    with open(argv[2], 'w') as f:
        f.write(result)

if __name__ == '__main__':
    try:
        _main()
    except Exception as err:
        print(err)
Call the program with the name of the Portable Inventory Data file as the first parameter and the desired name of the output file as the second parameter. The conversion produces output that you can pass directly to Elasticsearch, stripping the Portable Inventory Data envelope in the process.
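To see what the conversion actually produces, here is the same transformation applied to a small in-memory sample (the device fields are hypothetical):

```python
from json import dumps

# Two hypothetical devices, as they would appear inside the "devices" array
# of the exported file.
devices = [{"name": "PLC-01"}, {"name": "HMI-03"}]

bulk = ""
for device in devices:
    # Each document is preceded by an action line and terminated by a newline.
    bulk += '{ "index":{} }\n' + dumps(device) + "\n"

print(bulk)
# { "index":{} }
# {"name": "PLC-01"}
# { "index":{} }
# {"name": "HMI-03"}
```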
3. Import the resulting file into Elasticsearch using the _bulk endpoint
The only thing left to do is to import the result file into Elasticsearch in a single bulk operation. Since the data is now properly formatted, we can simply send it to the _bulk endpoint. Don't forget to specify the index of your choice.
In the following example we are writing to the index "otdevices". Elasticsearch will create this index for us if it doesn't already exist. We have named our result file "ot-base devices.ndjson" (for newline-delimited JSON), and we are assuming that curl runs on the same host as our Elasticsearch server (hence localhost):
curl -H "Content-Type: application/x-ndjson" -XPOST localhost:9200/otdevices/_bulk --data-binary @"ot-base devices.ndjson"
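If you would rather perform the upload from Python than from curl, the same request can be sketched with the standard library. The index name, host, and Content-Type match the curl call above; the request is only constructed here, not sent:

```python
from urllib.request import Request

# body would normally be the contents of the result file from step 2, e.g.
# open("ot-base devices.ndjson", "rb").read(); a tiny inline sample is used here.
body = b'{ "index":{} }\n{"name": "PLC-01"}\n'

req = Request(
    "http://localhost:9200/otdevices/_bulk",
    data=body,
    headers={"Content-Type": "application/x-ndjson"},
    method="POST",
)
# from urllib.request import urlopen
# urlopen(req)  # uncomment once Elasticsearch is reachable on localhost:9200
```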
Afterwards, all the inventory data is accessible in Elasticsearch.
4. Check it out in Kibana
In Kibana, the visualization front-end, create the index pattern "otdevices", and start searching your OT asset inventory.