Clarity

  • 1.  PMO Accelerator Logic & Metrics in the PPM Datamart & Ad-Hoc Domains

    Posted Dec 30, 2015 11:22 AM

Hi folks. I'm curious about others' architecture plans for reproducing the results of the logic and metrics we've grown accustomed to relying upon in the views and functions of the CA PPM transactional database.

     

I’m working on migrating our reports to the new CA PPM datamart and readying the datamart with the data we rely upon for a future rollout of Jaspersoft’s Ad Hoc Reporting.

     

       We’ve ‘standardized’ on many of the logical constructs found in the PMO Accelerator. For example, let’s look at the Open Issues Indicator:

- Open Issues exist, at least one high Issue is late: 1 (red)
- Open Issues exist, at least one is late but none of the late ones are high: 2 (yellow)
- Open Issues exist, but none are late: 3 (green)
- No open Issues exist: 0 (white)
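For concreteness, the decision logic above can be sketched as a small function. This is only an illustration - the field names `priority` and `is_late` are hypothetical stand-ins, not the Accelerator's actual column names:

```python
# Sketch of the Open Issues Indicator decision logic described above.
# Each open issue is assumed to be a dict with hypothetical keys
# "priority" and "is_late" (not the PMO Accelerator's real columns).

def open_issues_indicator(open_issues):
    """Return 0 (white), 1 (red), 2 (yellow), or 3 (green)."""
    if not open_issues:
        return 0  # white: no open issues exist
    if any(i["is_late"] and i["priority"] == "high" for i in open_issues):
        return 1  # red: at least one high issue is late
    if any(i["is_late"] for i in open_issues):
        return 2  # yellow: late issues exist, but none of them are high
    return 3  # green: open issues exist, none are late
```

The ordering matters: red is checked before yellow so a late high issue always wins, which is the same precedence the Accelerator's view logic implies.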

     

The PMO Accelerator leverages database views and logic constructs in queries that we extract into a custom datamart today for portlet & report performance reasons. The logic above feeds an ‘Open Issues Indicator’ that everyone understands; it is available in our BO Universes and presents itself on many of our portlets and reports. Easy to see, easy to use, easy on the database (doing the logic in real time is expensive and kills portfolio-level portlets & reports).

     

We want to deprecate our datamart and BO tools and transition all of this to the new Datamart and Jaspersoft tools. I’d like to stay as ‘in the box’ as possible. I *wanted* to use the new Aggregated Calculated Attribute for an ‘Open Issues Indicator’ attribute that easily checkboxes its way into the datamart; instantly available in Object-based data providers, in Ad Hoc domains, and in the datamart for Jaspersoft Studio reports. Unfortunately (as best I can tell) the logic is beyond the ACA’s capability.

     

    This is my first challenge – there will be more.

     

Are others using PMO Accelerator or custom logic/metrics in your transactional database reports today? How do you plan to replicate this in Jaspersoft Ad Hoc and Studio reports? Surely we won’t want to reproduce the logic/metrics from scratch in every report. Will we (can we? should we?) be building custom views and functions in the Datamart? If not, what are the alternatives?



  • 2.  Re: PMO Accelerator Logic & Metrics in the PPM Datamart & Ad-Hoc Domains

    Posted Dec 30, 2015 12:03 PM

    There's probably a better way, but one option would be to add an attribute to the project object, then write in the KPI values via a process.

    Make this attribute available in the DWH.



  • 3.  Re: PMO Accelerator Logic & Metrics in the PPM Datamart & Ad-Hoc Domains

    Posted Jan 05, 2016 11:35 AM

    Thanks Andrew. I guess the answer is 'keep writing things to attributes'. I will - but I still want this to be easier.

     

    I'm slinging all kinds of data and KPIs like this back onto objects via unsupported read only attributes, unsupported stored procedures, and supported yet clunky processes with GEL scripts.

     

With inclusion of an attribute in the datamart as easy as a checkbox, this supports the case that getting your data 'on the object' is the best approach: it makes the data most readily available and best performing in CA PPM, in the CA PPM data model, and in the Datamart.

     

My goal is to 'simplify' and get as much of what we need persisted in an attribute on the appropriate object. When I heard about the new Aggregated Calculated Attribute and how it could be included in the datamart, I was hopeful. I hoped this was something new that could get data, do calculations, and persist state - something that would let me deprecate all my custom read-only attribute tools/frameworks.

     

    My first test was to attempt to create an ACA that reproduces the Open Issues Indicator functionality above. Either I'm not getting it or that's not what this does (I think the latter but I'll remain curious).

     

    Drawbacks of Calculated Attributes

    - Virtual - no state other than run time

    - Not in the data model

    - Limited interface for selecting data

    - Limited functions & calculation capability

     

    I guess what I need is...

An attribute that:

- Has greater capability in getting data from the system
- Has greater capability in functions/calculations
- Persists state
- Is in the data model
- Can be put on the REST API in Studio
- Checkboxes into the datamart in Studio
- Has update event(s?)
- Doesn't have negative performance impacts (run time or other)

     

    A 'Super Calculated Attribute'?

     

Now that CA PPM has a REST API that will continue to be built out and that we can add attributes to via Studio - is there something better to use here? A REST API/JSON calculated attribute?

     

Use Case: Our organization keeps a 'Go-Live' attribute on the project object that presents the date from a Go-Live milestone on the WBS. The architecture today is a read-only NULL date attribute and a stored procedure that runs every 5 minutes and copies the date from the milestone to the project object. Constrained by Last Updated Date, Is Milestone, and Is Key with an ID of 'M2' on the task object, this is bulletproof and real lightweight. Alas - highly custom & unsupported.

     

    I can get all the necessary data via the new REST API.


    {
      "_pageSize": 25,
      "_self": "https://<cappm_app>/ppm/rest/v1/projects/5317403/tasks?filter=%28code+%3D+%27M2%27%29+and+%28isKey+%3D+true%29+and+%28isMilestone+%3D+true%29&fields=name%2CisKey%2CisMilestone%2CstartDate%2ClastUpdatedDate",
      "_totalCount": 1,
      "_results": [
        {
          "_internalId": 5401801,
          "lastUpdatedDate": "2016-01-05T08:42:38",
          "_parent": "https://<cappm_app>/ppm/rest/v1/projects/5317403",
          "isMilestone": true,
          "name": "Go-Live",
          "isKey": true,
          "_self": "https://<cappm_app>/ppm/rest/v1/projects/5317403/tasks/5401801",
          "startDate": "2016-02-08T17:00:00"
        }
      ],
      "_recordsReturned": 1
    }
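For illustration, here's a minimal Python sketch of the two halves of that lookup: building the filtered tasks URL shown in the `_self` link above, and pulling `startDate` out of the response body. The endpoint path, filter syntax, and field names are taken from the sample payload; the HTTP call itself and error handling are left to whatever client the process uses:

```python
import json
from urllib.parse import urlencode

def milestone_query(base_url, project_id):
    """Build the tasks URL that filters for the 'M2' key milestone,
    mirroring the filter/fields shown in the sample _self link."""
    params = {
        "filter": "(code = 'M2') and (isKey = true) and (isMilestone = true)",
        "fields": "name,isKey,isMilestone,startDate,lastUpdatedDate",
    }
    return f"{base_url}/ppm/rest/v1/projects/{project_id}/tasks?{urlencode(params)}"

def go_live_date(response_body):
    """Extract startDate from the first (and only) matching task."""
    payload = json.loads(response_body)
    if payload["_totalCount"] == 0:
        return None  # no Go-Live milestone on this project's WBS
    return payload["_results"][0]["startDate"]
```

A process could run `go_live_date` on the response and write the result back to the project's read-only date attribute - the same job the stored procedure does today, but over the supported API.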
    

     

If we had a calculated attribute that got data via the REST API, had the ability to update itself from events, persisted its state in a database column, could expose itself on the REST API, and could checkbox into the datamart - wouldn't this be powerful?

With event capability, if done right, couldn't this solve the business problems that many use database triggers or other dynamic SQL data solutions for (the ability to add dynamic (SQL-based) read-only attributes to an object that work throughout Clarity)?

     

Instead of limiting us to the expressions the expression editor allows - could this simply be a framework that can use expressions from a standard library (Java, JavaScript, Groovy)? What about a JSON query library? If the framework were classpath based, new capabilities could be gained via the continued maturity of the main libraries, and custom libraries could be included as well.
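As a thought experiment only, the kind of pluggable evaluation described above might look like this: the attribute's value comes from an arbitrary expression run against fetched JSON, rather than from an expression-editor grammar. Everything here is hypothetical (Python standing in for whatever scripting library the framework exposed):

```python
import json

def evaluate_attribute(fetched_json, expression):
    """Hypothetical 'super calculated attribute' evaluator: parse the
    JSON fetched for the object, then apply a user-supplied expression
    from a standard scripting library instead of a fixed grammar."""
    data = json.loads(fetched_json)
    return expression(data)

# Example expression: count late high-priority issues in a payload.
# The keys "isLate" and "priority" are illustrative, not real fields.
late_high = lambda d: sum(
    1 for i in d["_results"] if i["isLate"] and i["priority"] == "high"
)
```

The point isn't this particular API - it's that any expression the library can express becomes a persistable attribute value, which is exactly what the expression editor can't offer today.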

     

I can open the Idea - I'm just not familiar enough with the new REST API capability to know whether this makes sense. I'd appreciate others' thoughts.