Layer7 API Management

  • 1.  Has anyone used Elasticsearch to store selective audit logs?

    Posted Aug 24, 2018 09:59 AM

    Some of our customers would like a way to browse and report on access logs for their APIs. We currently use custom logger package names and trap them with log sinks to specific files, which we then manually copy to the users who need them. Needless to say, this is tedious and not practical.

     

    We have recently started to play around with Kibana and Elasticsearch and thought, "What if we could ship specific audit details directly to Elasticsearch?"

     

    I thought we could add some sort of assertion(s) to our policies to POST data to an Elasticsearch cluster at execution time, without using any intermediary like Logstash or Filebeat (roughly along the lines of the sketch at the end of this post).

     

    Has anyone done this before?
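
    For illustration only, the kind of request I'm picturing the policy making per transaction is roughly the following (the host, index, and field names are placeholders, not our real setup):

    # Hypothetical example: index a single audit record into Elasticsearch.
    curl -s -X POST "http://elasticsearch.example.com:9200/api-audit/_doc" \
         -H "Content-Type: application/json" \
         -d '{"time":"2018-08-24T09:59:00Z","service":"/orders","status":200,"latencyMs":42}'

    On the gateway itself this would presumably be a routing assertion building the JSON body from context variables rather than curl.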



  • 2.  Re: Has anyone used Elasticsearch to store selective audit logs?

    Posted Aug 24, 2018 08:15 PM

    Hi Yanick,

     

    I'm going to move this to a discussion type rather than leave it as a question, because the question type expects a "right" answer, and there isn't really one here. Since you're sourcing ideas and suggestions from people, a discussion thread is a better fit.

     

    Thank you.



  • 3.  Re: Has anyone used Elasticsearch to store selective audit logs?

    Posted Jan 02, 2019 01:52 PM

    Be aware that the latency of POSTing to the Elasticsearch API within a service policy or internal audit-sink policy will be added to the overall service/API response time as seen by the client app.

     

    An alternative is to have the internal audit-sink policy append to a memory buffer, which is fast and consistent compared with an HTTP request. Then have a scheduled task extract the data from the memory buffer and POST it to Elasticsearch (or wherever). Adjust the buffer size and the scheduled-task frequency as needed to keep pace with your traffic load; a rough sketch of the flush half is shown below.
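
    As a sketch only (paths, host, and index name are made up, and a plain file stands in for the gateway's memory buffer): the audit-sink policy appends one JSON document per line to the buffer, and a scheduled job like the one below flushes whatever has accumulated using the Elasticsearch _bulk API.

    #!/bin/sh
    # Hypothetical flush job: ships whatever has been buffered since the last run.
    BUFFER=/tmp/audit-buffer.ndjson
    ES_BULK=http://elasticsearch.example.com:9200/api-audit/_bulk
    [ -s "$BUFFER" ] || exit 0        # nothing buffered, nothing to do
    mv "$BUFFER" "$BUFFER.flush"      # swap the buffer so new records keep accumulating
    # Prefix each document with a bulk "index" action line and POST the whole batch.
    # (Pre-7 Elasticsearch versions also expect a document type in the action line.)
    awk '{print "{\"index\":{}}"; print}' "$BUFFER.flush" \
        | curl -s -X POST "$ES_BULK" -H "Content-Type: application/x-ndjson" --data-binary @-
    rm -f "$BUFFER.flush"

    In the gateway itself the buffer would of course live in memory rather than in a file; the file here is just the simplest way to show the batch-and-flush pattern.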



  • 4.  Re: Has anyone used Elasticsearch to store selective audit logs?

    Broadcom Employee
    Posted Jan 03, 2019 06:23 PM

    Since you already have a log sink for the selective audits, you could run a shell script that monitors the file for changes and posts them to a remote server.

    The shell script could be similar to the one below:

    tail -f /opt/SecureSpan/Gateway/node/default/var/logs/ssg_0_0.log | while read -r line; do curl -s -H "Content-Type: text/plain" -X POST --data-binary "$line" http://markgw92:8080/echo >> mylog 2>&1; done

     

    Regards,

    Mark