DX Unified Infrastructure Management

  • 1.  CA Network Flow Analysis and Splunk

    Posted Feb 23, 2016 02:28 PM

    Hello guys,

    I'm trying to find some information about how to integrate (if possible) CA Network Flow Analysis with Splunk, but I couldn't find anything. Can you please tell me where I can get this documentation?

     

    Thanks.



  • 2.  Re: CA Network Flow Analysis and Splunk

    Posted Feb 23, 2016 02:39 PM

    Good luck. We’ve certainly tried. Ideally, you’d be able to use an API, but that’s not available for NFA. If you find anything, please post. Keep in mind, the volume could be huge…



  • 3.  Re: CA Network Flow Analysis and Splunk
    Best Answer

    Broadcom Employee
    Posted Feb 23, 2016 02:54 PM

    There is no officially supported integration between NFA and Splunk.

     

    As for APIs: customers have been looking for APIs in NFA for a while. There is an Idea here; please upvote: API for NFA



  • 4.  Re: CA Network Flow Analysis and Splunk

    Posted Mar 01, 2016 10:36 AM

    Thank you.

    I don't know if I should create a new topic or not; if I'm wrong, please tell me.

     

    Is there any way to pull data from the Harvester database?

    As I said, I need this data to work with Splunk, and this seems to be the only way.

     

    ** I was reading this article: http://www.ca.com/us/support/ca-support-online/product-content/knowledgebase-articles/tec1853184.aspx

     

    Thank you in advance.



  • 5.  Re: CA Network Flow Analysis and Splunk

    Broadcom Employee
    Posted Mar 01, 2016 12:53 PM

    Data is stored on the Harvester, but not in the 'harvester' database.

    One-minute data is stored in the \Netflow\datafiles\ReaperArchive\ directory, and 15-minute data is stored in the \Netflow\datafiles\ReaperArchive15\ directory. Raw Flow Forensics data is stored in the \Netflow\datafiles\HarvesterArchive\ directory. All of this data is accessed by the custom MySQL storage engine that runs on port 3307. There are three databases in this 3307 instance of MySQL: archive and archive15, which hold the 1-minute and 15-minute data respectively, and the nsas database, which holds the Flow Forensics data.

     

    The nsas database is probably the easiest to query, but it will only have data for the last 24 hours. There is only one table, ahtflows.

    To log in to the 3307 instance of MySQL, run: "mysql -P3307 nsas".

     

    Once in there you can run a query like below to see how the data looks in this table and which columns are available:

    select * from ahtflows limit 5;
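    Since the goal of this thread is getting the data into Splunk, one option is a small script that pulls rows from ahtflows and emits them as JSON events for Splunk to index. The sketch below is a rough illustration, not a supported integration: it assumes the pymysql package is installed, that the 3307 instance allows a local passwordless login (as the mysql command above suggests), and the column names are whatever `select * from ahtflows limit 5;` actually shows.

    ```python
    import json


    def to_splunk_event(row):
        """Render one ahtflows row (a dict of column -> value) as a JSON
        line suitable for Splunk ingestion. sort_keys keeps the field
        order stable across events; default=str covers non-JSON types
        such as datetimes."""
        return json.dumps(row, sort_keys=True, default=str)


    def dump_ahtflows(limit=100):
        """Pull rows from nsas.ahtflows on the 3307 instance and print
        them as JSON lines. pymysql, host, and user are assumptions;
        adjust them for your Harvester."""
        import pymysql  # assumed available: pip install pymysql
        conn = pymysql.connect(host="127.0.0.1", port=3307,
                               user="root", database="nsas",
                               cursorclass=pymysql.cursors.DictCursor)
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT * FROM ahtflows LIMIT %s", (limit,))
                for row in cur.fetchall():
                    print(to_splunk_event(row))
        finally:
            conn.close()
    ```

    The JSON-lines output could then be picked up by a Splunk file monitor or sent to a HTTP Event Collector endpoint.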

     

    It can be a bit tricky to query the other databases, as most queries need specific WHERE clauses, and a poorly formed query can cause the NQMysql service to crash or hang. The best way to see how the databases are queried is to set up MySQL query logging on the Harvester, run a report that you want to replicate via your own MySQL queries, then stop the query logging and review the output log. The log will show all queries run while logging was enabled, so it should capture the queries the NFA software used to pull the report you ran. You can use these as templates for other queries.

     

    To set up query logging, follow the steps below:

     

    1. Log in to the 3307 instance of MySQL by running:

    mysql -P3307

     

    2. Then run the following to set the name of the log file:

    SET GLOBAL general_log_file='query.log';

     

    3. Then get a report page ready to refresh.

     

    4. Then enable query logging by running:

    set global general_log = 'ON';

     

    5. Refresh the report page, so the data is pulled again.

     

    6. Stop the query logging by running:

    SET GLOBAL general_log = 'OFF';

     

    7. Review the log file in the \Netflow\data\ directory.
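    Once you have the log from step 7, the interesting statements can be filtered out programmatically rather than by eye. A minimal sketch of pulling SQL out of a MySQL general query log's "Query" entries; the exact log line layout varies by MySQL version, so treat this as a best-effort parse:

    ```python
    import re

    # A general-log "Query" entry looks roughly like:
    #   160301 12:53:01    42 Query    SELECT * FROM ahtflows LIMIT 5
    # Everything after the "Query" command keyword is the SQL text.
    _QUERY_RE = re.compile(r"\bQuery\s+(.+)$")


    def extract_queries(log_text, keyword=None):
        """Return the SQL statements from general-log Query entries.

        If keyword is given, keep only statements containing it
        (case-insensitive), e.g. keyword="archive15" to isolate the
        15-minute-data queries. Statements that span multiple log
        lines are truncated to their first line by this simple parse.
        """
        queries = []
        for line in log_text.splitlines():
            m = _QUERY_RE.search(line)
            if not m:
                continue
            sql = m.group(1).strip()
            if keyword is None or keyword.lower() in sql.lower():
                queries.append(sql)
        return queries
    ```

    Running a report with logging on and then diffing the extracted statements against an empty baseline is a quick way to isolate just the queries that report generated.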



  • 6.  Re: CA Network Flow Analysis and Splunk

    Posted Mar 01, 2016 03:50 PM

    Well…. This is certainly interesting.  Thanks!

     

    Can you say DB Connect?
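    For anyone picking up the DB Connect idea: a connection stanza pointing Splunk DB Connect at the Harvester's 3307 instance might look something like the sketch below. The stanza name and parameter names here are illustrative assumptions only; the actual configuration format depends on your DB Connect version, so check its documentation before using this.

    ```
    [nfa_harvester]
    connection_type = mysql
    host = <harvester-hostname>
    port = 3307
    database = nsas
    ```

    The same caution from earlier in the thread applies: poorly formed queries against the archive databases can crash or hang the NQMysql service, so any scheduled DB Connect inputs should stick to query patterns captured from the NFA software itself.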



  • 7.  Re: CA Network Flow Analysis and Splunk

    Posted Mar 02, 2016 09:13 AM

    I will try this.

     

    Thank you very much.