
Many times I get the following question: “What are the best practices and recommendations for CA PPM housekeeping?”

 

In this post, I will explain some good practices for the CA PPM application using OOTB (out-of-the-box) jobs and processes. Infrastructure housekeeping maintenance will not be covered.

 

 

OOTB Scheduled Jobs:

Schedule the following general jobs to run on a regular basis:

 

  • Purge Audit Trail: It's a good practice to set a maximum period to keep records in the table. You need to set it on each object where the audit trail is used. I’d recommend a maximum of 90 or 120 days, but it all depends on how many attributes are being audited and your business needs.

 

Based on my experience, once the table grows beyond 1 million records, it will cause performance issues.

Run the following query to check table size:

   SELECT COUNT(*) FROM CMN_AUDITS

       Ensure it does not grow beyond 1 million records.
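If the count is already high, a per-object breakdown helps you decide which objects to stop auditing or trim first. This is just a sketch; it assumes CMN_AUDITS exposes OBJECT_CODE and CREATED_DATE columns, which may differ between CA PPM releases:

   -- Audit volume per object (column names are an assumption; verify in your release)
   SELECT OBJECT_CODE, COUNT(*) AS AUDIT_ROWS, MIN(CREATED_DATE) AS OLDEST_ROW
   FROM CMN_AUDITS
   GROUP BY OBJECT_CODE
   ORDER BY AUDIT_ROWS DESC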

 

  • Purge Documents: WATCH OUT: do not run this job on a regular basis unless this is really what you want, because it permanently deletes documents. The CA PPM administrator should always back up documents (the “filestore” files, or a DB dump in case they are stored in the DB) before running the job.

 

The job allows you to filter based on:

  • Purge All Documents for the Following Objects
  • [Or] Purge Documents and Versions Not Accessed for [n] Days
  • [Or] Retain the [n] Most Recent Versions and Purge the Prior Versions

and to scope the purge to:

  • All Projects / Project OBS / Specific Project
  • All Resources / Resource OBS / Specific Resource
  • All Companies / Company OBS / Specific Company
  • Knowledge Store
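Before running the purge, it also helps to know how much DMS content you have. The counts below are only a sizing sketch; the CLB_DMS_FILES and CLB_DMS_VERSIONS table names are an assumption based on typical CA PPM schemas and should be verified against your release:

   -- Rough DMS sizing check (table names may vary by release)
   SELECT COUNT(*) FROM CLB_DMS_FILES
   SELECT COUNT(*) FROM CLB_DMS_VERSIONS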

 

  • Purge Notifications: It's a good practice to delete old notifications; based on my experience, users don’t do it themselves. Rely on the “From Created” / “To Created” parameters to purge notifications older than n days; otherwise the table will grow and may cause performance issues (similar to the audit trail).
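As with the audit trail, keep an eye on the table size. A minimal check, assuming notifications are stored in CMN_NOTIFICATIONS (verify the table name in your release):

   SELECT COUNT(*) FROM CMN_NOTIFICATIONS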

 

  • Delete Log and Analysis: This job is already scheduled by default to run once per day. Do not cancel it; just move the scheduled time to your non-working business hours.

 

  • Delete Process Instance: It's a good practice to delete “completed” and “aborted” processes older than n days on a regular basis, depending on how long you need to keep the details. Bear in mind that once you go beyond 200,000 completed processes, you may see slowness or performance issues.
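To see how close you are to that threshold, group the process instances by status. This uses the same BPM_RUN_PROCESSES table and STATUS_CODE column referenced later in this post:

   SELECT STATUS_CODE, COUNT(*) AS INSTANCES
   FROM BPM_RUN_PROCESSES
   GROUP BY STATUS_CODE
   ORDER BY INSTANCES DESC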

 

  • Oracle Table Analyze Job: (only if your database vendor is Oracle and CA PPM is on premise). It's a good practice to run it weekly during non-working business hours. In case of general performance issues, and based on CA Support or CA Services recommendations, you may run it daily.

 

This job refreshes the statistics that are used to determine the best execution path for a query.

Analyze statistics under certain circumstances, such as when the schema or the data volume has changed.
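Conceptually, the job refreshes the optimizer statistics for the CA PPM schema, which you could also do manually with Oracle's DBMS_STATS package. The snippet below is only an illustration of that idea, not the job's exact internals; the 'NIKU' schema name is an assumption and should match your installation:

   -- Illustrative only: refresh optimizer statistics for the CA PPM schema
   BEGIN
     DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'NIKU', cascade => TRUE);
   END;
   /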

 

Processes:

 

  • Review all failed processes on a daily basis: retry them and, if they still fail, troubleshoot the error messages. Do not leave them unattended in “error/failed” status.
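As a starting point for that review, you can query the engine tables directly. This is only a sketch; it assumes the BPM_ERRORS table present in typical CA PPM schemas, so verify the table and its columns in your release first:

   -- Recorded process errors (table name is an assumption; verify in your schema)
   SELECT * FROM BPM_ERRORS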

 

  • When aborting (cancelling) processes, ensure they do not get stuck in “aborting” status. If they do, run the following query:

UPDATE BPM_RUN_PROCESSES
SET STATUS_CODE = 'BPM_PIS_ABORTED'
WHERE STATUS_CODE = 'BPM_PIS_ABORTING'
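
Before restarting, you can verify that nothing is left in the stuck status:

   SELECT COUNT(*) FROM BPM_RUN_PROCESSES
   WHERE STATUS_CODE = 'BPM_PIS_ABORTING'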

      and then restart the BG service.

 

  • Delete “Completed” processes via the Delete Process Instance job.

 

  • Delete “Aborted” processes via the Delete Process Instance job.

 

  • Ensure there are no orphan records:

SELECT * FROM BPM_RUN_PROCESSES

WHERE PROCESS_VERSION_ID NOT IN (SELECT ID FROM BPM_DEF_PROCESS_VERSIONS)        

         If it returns results, proceed with the following:

DELETE FROM BPM_RUN_PROCESSES

WHERE PROCESS_VERSION_ID NOT IN (SELECT ID FROM BPM_DEF_PROCESS_VERSIONS)

         and restart the BG service.

 

  • Ensure there are no orphan Process Engine records: It’s a good practice to remove outdated and unused process engines. You can run the following queries to identify the inactive process engines and delete them:

      For Oracle:

SELECT * FROM BPM_RUN_PROCESS_ENGINES

WHERE END_DATE IS NOT NULL OR END_DATE <= SYSDATE

      If it returns results, proceed with the following:

 

DELETE FROM BPM_RUN_PROCESS_ENGINES

WHERE END_DATE IS NOT NULL OR END_DATE <= SYSDATE

        For MSSQL:

 

SELECT * FROM BPM_RUN_PROCESS_ENGINES
WHERE END_DATE IS NOT NULL OR END_DATE <= GETDATE()

      If it returns results, proceed with the following:

DELETE FROM BPM_RUN_PROCESS_ENGINES
WHERE END_DATE IS NOT NULL OR END_DATE <= GETDATE()

       Restart the APP and BG services.

 

 

That’s all. Thanks for reading this far. Did you like it? Please don’t be shy and share it.

Recently I had the opportunity to visit a customer with many partitions and performance issues. One of the questions that came up was: “What is the maximum number of partitions within the same CA PPM instance?”

 

  • CA PPM documentation does not provide a specific number…
  • CA Support advises, as a good practice, no more than 3…
  • On the other hand, the tool does not actually limit you to 3…

 

So, what is the right answer? From my point of view, there is no concrete number. It depends on what you have configured “inside” each partition.

 

Key elements to consider:

 

Infrastructure:

  • Ensure the right CA PPM sizing for your environment.
  • Monitor database performance. Engage your DBA team.
  • Monitor CA PPM Java memory usage (it should not go above 80%). You can easily check it via this URL:

http://<your_ca_ppm:port>/niku/nu#action:security.caches

or if SSL:

https://<your_ca_ppm:port>/niku/nu#action:security.caches

Refresh the web browser several times to observe the peaks (if any).

 

Application Studio Configuration:

One of the key elements to bear in mind is the Studio view configuration, since it drives most of your front-end memory usage and performance.

 

  • Do not abuse custom attributes per object: the best practice is no more than 100, and the technical limit of the tool is 500.
  • Do not abuse display conditions (subpages): the best practice is no more than 10 per view. The tool does not impose a specific technical limit.
  • Use the AVP (Attribute Value Protection) settings for portlet list views wisely.

  • Decide whether you really want to allow users to configure portlet lists. Some users will add every attribute they can see and use CA PPM as a data dump. The best practice is 20-25 columns (with no attachments, URLs, or images) or 10-15 columns when using expensive attributes.

  • Ensure "Rows per Page" is set to 20.

  • We recommend using “Do not show results until I filter” to speed up navigation from one portlet page to another.

  • Do not use the same data provider (especially an out-of-the-box one with many custom attributes) to build all your portlet variations: build NSQL queries as dedicated data providers instead.

 

 

But again, this is not rocket science, and it always requires an analysis of each customer environment and its business needs.

 

From my point of view, any page taking more than 3-4 seconds to display can be considered slow and a performance issue. But (there is always a but) on heavily configured environments or pages, 5-6 seconds may still be accepted as good performance from the user’s point of view.

 

 

That’s all. Thanks for reading this far. Did you like it? Please don’t be shy and share it.

 

This year the industry gathered at CA World in Las Vegas to eagerly hear how CA is bringing Agile Transformation to our customers, and we are proud to say once again that our PPM community took center stage. Our Service Management, PPM and Agile Central (formerly Rally) solutions were part of our Agile Management presence at CA World ‘16. Whether you attended the show to look at our Service Desk and Asset Manager solutions, the latest release of PPM (v15.1), or the new user experience for Agile Central, all solutions had one thing in common: delivering customers more agility within their IT and business investments. A lot of the attendees were there to learn from the many pre-conference educational sessions, breakout sessions and tech talks delivered by pre-sales, customers and product leaders. We hosted over 250 one-on-one customer sessions so we could hear firsthand about your goals and how we can work together to achieve them.

 

To make sure our larger customer base, user community and industry can leverage the great content presented at CA World, we have posted presentations on our SlideShare channel (please note more presentations, where applicable, will be uploaded throughout December): http://www.slideshare.net/CAinc/tagged/CA%20World%2016%20Agile%20Management

 

This includes presentations from PPM, Service Management and Agile Central channels, all aggregated within the Agile Management section of our CA World Slideshare.

 

Here is a list of the top 3 most attended PPM pre-con education sessions:

1)      Portfolio Management by Doug Page

2)      Value Transformation by James Chan

3)      Agile Integrations by James Chan and Brian Nathanson

 

Most attended PPM customer and CA led sessions:

1)      CA PPM vision & roadmap by Kurt Steinle

2)      CA PPM 15.1 panel with CA, Camso and IGT

3)      Checkbook & bank approach to managing project financials

 

Most popular PPM demo pods:

1)      Adaptive Project Manager – 15.1 release

2)      Reporting & Analytics

3)      Align Strategy & Portfolios

 

Hope you enjoy the content we are making available throughout the month. If you have questions, please leave us a comment below and we will get back to you with a response as quickly as possible.

 

Thanks!

 

Jeff

I’m very excited to bring you highlights of new PPM and Agile content, hot on the heels of a terrific CA World. This is my second post on the topic, and it will focus on the powerful relationship between CA PPM and CA Agile Central. Please also check out the many CA World presentations about PPM and Agile here:

 

 

From my perspective as a PPM and Agile Practitioner, the engaging presentation by Steve Demchuk and Kurt Steinle, who are both CA Technologies Product Management VPs, was very impactful. Steve and Kurt brought to life the journey that companies embark on when they integrate all of their initiatives into a single management platform. This is a new frontier, and I have heard some skepticism, but Steve and Kurt showed us how PPM and Agile work together to create cooperative management solutions that deliver incremental value, rather than competing with each other and reducing relevance. As a quick review, here is the live Periscope that I posted as part of part one of this blog:

 

Next-gen PPM and Agile management are here to stay. An article by Actuation Consulting, “2015 Study of Product Team Performance,” stated that 45% of organizations choose a bi-modal Waterfall project management/Agile approach to development; and what I find especially insightful is that this number has remained constant for the four years that the study has been conducted. With these thoughts in mind, here are some key takeaways from Steve and Kurt’s presentation, all of which focus on the concept of “better together integration” and echo my assertion in part one of this blog: that PPM and/or Agile practitioners must stay up to date on the latest opportunities to enable our organizations’ desired outcomes.

 

The goals of integrating PPM and Agile are to:

  • Create useful and practical connection points between work execution and financial accountability.
  • Enable each stakeholder to “work where they work.” 
  • Create natural touchpoints between funding decisions and the roadmap.
  • Provide an out of the box framework on which to build the future.

 

Here is some content from the presentation that I found especially impactful. The first slide discusses Integration - Mapping Agile Initiative to a PPM Project:

This leads to Integration - Mapping Work Performance and Value:

Next we discuss Understanding Feature Value Delivery. Each feature task in PPM is continually updated with metrics from the Feature in Agile Central:

We move to Team Sync – Feeding the Team from Agile:

CapEx / Opex Learnings are key to this approach:

Engineers can now stay in Agile Central for all work. They access the PPM timesheet from a menu option in Agile Central. The PMO gets an integrated time platform with all the controls needed to manage cost allocations across the business, and the team members can remain in the tool they work in. Focus – Keep Engineers in the Context of their Workspaces:

The next few sections show the system. Here we can see Time Keeping Task Template:

Full Featured Labor Accounting in Agile:

Agile Dashboards in PPM:

 

For readers interested in more detail, CA World content has been published here. I also encourage you to participate in the best-in-class CA Communities site, where you have access to your peers, events and support. If you’ve read my blog series, you know I view community involvement as indispensable to success.

 

You can also reach out to CA Services for individualized business outcome references and analysis. Feel free to post in the comments section of this blog or contact me directly via email and Twitter @PPMWarrior.