Clarity

Data Warehouse Load Failure

  • 1.  Data Warehouse Load Failure

    Posted Apr 24, 2015 05:35 AM

    Hi guys,

     

    Our first attempts to load our data warehouse have failed. The job log shows the following:

     

    Job Started 4/20/15 5:34 PM 

     

     

    Error 4/20/15 5:37 PM dwh_db_check_ - An error occurred executing this job entry :

    Couldn't execute SQL: DECLARE @V_SQL_TEXT nvarchar(max);

     

     

    BEGIN

      SET @V_SQL_TEXT = 'select BENEFITPLAN_KEY, BENEFITPLAN_NAME, BILLING_CURRENCY_CODE, CHARGE_CODE_ID_CAPTION, CHARGE_CODE_KEY, CLARITY_CREATED_DATE, CLARITY_PLAN_DETAIL_KEY, CLARITY_PLAN_KEY, CLARITY_UPDATED_DATE, COST_TYPE_KEY, DEPARTMENT_ID_CAPTION, DEPARTMENT_KEY, DESCRIPTION, FIN_PLAN_TYPE_KEY, FINISH_PERIOD_KEY, FISCAL_PERIOD_TYPE_KEY, HOME_CURRENCY_CODE, INPUT_TYPE_CODE_ID_CAPTION, INPUT_TYPE_CODE_KEY, INVESTMENT_KEY, IS_PLAN_OF_RECORD, LOCATION_ID_CAPTION, LOCATION_KEY, PLAN_DETAIL_KEY, PLAN_FINISH_DATE, PLAN_ID, PLAN_KEY, PLAN_NAME, PLAN_START_DATE, RESOURCE_CLASS_ID_CAPTION, RESOURCE_CLASS_KEY, RESOURCE_KEY, RESOURCE_ID_CAPTION, ROLE_KEY, ROLE_ID_CAPTION, START_PERIOD_KEY, TRANSACTION_CLASS_ID_CAPTION, TRANSACTION_CLASS_KEY, USER_VALUE1_KEY, USER_VALUE2_KEY, CONVERT(DATETIME ,''2015-04-20 12:36:51'') as dw_updated_date  from [PPMDBLINK].niku.niku.DWH_COSTPLAN_V where CLAR

     

     

     

    Error 4/20/15 5:37 PM User Defined Java Class - Unexpected error

     

     

    Error 4/20/15 5:37 PM User Defined Java Class - java.lang.RuntimeException: ERROR: THERE WERE ERRORS DURING DIMENSION JOB EXECUTION FOR THE TABLE - DWH_FIN_PLAN

      at Processor.processRow(Processor.java:64)

      at org.pentaho.di.trans.steps.userdefinedjavaclass.UserDefinedJavaClass.processRow(UserDefinedJavaClass.java:1181)

      at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)

      at java.lang.Thread.run(Thread.java:745)

     

     

     

    Error 4/20/15 5:37 PM dwh_etl_interface_dim - Errors detected!

     

     

    Error 4/20/15 5:37 PM dwh_etl_interface_dim - Errors detected!

     

     

    Error 4/20/15 5:37 PM ETL Job Failed. Please see log bg-dwh.log for details.

     

     

    Job Completed 4/20/15 5:37 PM NJS-0401: Execution of job failed.

     

    Has anyone else encountered anything similar?



  • 2.  Re: Data Warehouse Load Failure

    Posted Apr 24, 2015 08:30 AM

    Are there any details in the bg-dwh.log?



  • 3.  Re: Data Warehouse Load Failure

    Posted Apr 24, 2015 08:51 AM

    I see the following errors in the bg-dwh.log file:

     

    ERROR 2015-04-18 19:05:06,123 [Thread-1103] dwh.event dwh_db_check_ - An error occurred executing this job entry :

    Couldn't execute SQL: DECLARE @V_SQL_TEXT nvarchar(max);

     

     

    BEGIN

      SET @V_SQL_TEXT = 'select BENEFITPLAN_KEY, BENEFITPLAN_NAME, BILLING_CURRENCY_CODE, CHARGE_CODE_ID_CAPTION, CHARGE_CODE_KEY, CLARITY_CREATED_DATE, CLARITY_PLAN_DETAIL_KEY, CLARITY_PLAN_KEY, CLARITY_UPDATED_DATE, COST_TYPE_KEY, DEPARTMENT_ID_CAPTION, DEPARTMENT_KEY, DESCRIPTION, FIN_PLAN_TYPE_KEY, FINISH_PERIOD_KEY, FISCAL_PERIOD_TYPE_KEY, HOME_CURRENCY_CODE, INPUT_TYPE_CODE_ID_CAPTION, INPUT_TYPE_CODE_KEY, INVESTMENT_KEY, IS_PLAN_OF_RECORD, LOCATION_ID_CAPTION, LOCATION_KEY, PLAN_DETAIL_KEY, PLAN_FINISH_DATE, PLAN_ID, PLAN_KEY, PLAN_NAME, PLAN_START_DATE, RESOURCE_CLASS_ID_CAPTION, RESOURCE_CLASS_KEY, RESOURCE_KEY, RESOURCE_ID_CAPTION, ROLE_KEY, ROLE_ID_CAPTION, START_PERIOD_KEY, TRANSACTION_CLASS_ID_CAPTION, TRANSACTION_CLASS_KEY, USER_VALUE1_KEY, USER_VALUE2_KEY, CONVERT(DATETIME ,''2015-04-18 19:04:55'') as dw_updated_date  from [PPMDBLINK].niku.niku.DWH_COSTPLAN_V where CLARITY_UPDATED_DATE >= CONVERT(DATETIME,''1910/01/01 00:00:00'') AND CLARITY_UPDATED_DATE <= (SELECT DISTINCT DWH_DIM_START_DATE FROM DWH_CFG_SETTINGS) AND DWH_COSTPLAN_V.LANGUAGE_CODE =''en''';

      SET @V_SQL_TEXT = 'INSERT INTO DWH_FIN_PLAN(BENEFITPLAN_KEY, BENEFITPLAN_NAME, BILLING_CURRENCY_CODE, CHARGE_CODE, CHARGE_CODE_KEY, CLARITY_CREATED_DATE, CLARITY_PLAN_DETAIL_KEY, CLARITY_PLAN_KEY, CLARITY_UPDATED_DATE, COST_TYPE_KEY, DEPARTMENT, DEPARTMENT_KEY, DESCRIPTION, FIN_PLAN_TYPE_KEY, FINISH_PERIOD_KEY, FISCAL_PERIOD_TYPE_KEY, HOME_CURRENCY_CODE, INPUT_TYPE_CODE, INPUT_TYPE_CODE_KEY, INVESTMENT_KEY, IS_PLAN_OF_RECORD, LOCATION, LOCATION_KEY, PLAN_DETAIL_KEY, PLAN_FINISH_DATE, PLAN_ID, PLAN_KEY, PLAN_NAME, PLAN_START_DATE, RESOURCE_CLASS, RESOURCE_CLASS_KEY, RESOURCE_KEY, RESOURCE_NAME, ROLE_KEY, ROLE_NAME, START_PERIOD_KEY, TRANSACTION_CLASS, TRANSACTION_CLASS_KEY, USER_VALUE1_KEY, USER_VALUE2_KEY, dw_updated_date) ' + @V_SQL_TEXT;

     

      BEGIN TRANSACTION

      EXEC (@V_SQL_TEXT);

      COMMIT TRANSACTION;

      CHECKPOINT;

     

      UPDATE STATISTICS DWH_FIN_PLAN;

    END

     

     

    [CA Clarity][SQLServer JDBC Driver][SQLServer]Cannot insert the value NULL into column 'PLAN_FINISH_DATE', table 'ppm_dwh.ppm_dwh.DWH_FIN_PLAN'; column does not allow nulls. INSERT fails.

     

     

     

    dwh_etl_interface_dim - User Defined Java Class] dwh.event User Defined Java Class - DWH_FIN_PLAN - Job Log: 2015/04/18 19:05:06 - dwh_db_check_ - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : An error occurred executing this job entry :

    2015/04/18 19:05:06 - dwh_db_check_ - Couldn't execute SQL: DECLARE @V_SQL_TEXT nvarchar(max);

     

     

    ERROR 2015-04-18 19:05:06,152 [dwh_etl_interface_dim - User Defined Java Class] dwh.event User Defined Java Class - Unexpected error

    ERROR 2015-04-18 19:05:06,161 [dwh_etl_interface_dim - User Defined Java Class] dwh.event User Defined Java Class - java.lang.RuntimeException: ERROR: THERE WERE ERRORS DURING DIMENSION JOB EXECUTION FOR THE TABLE - DWH_FIN_PLAN

      at Processor.processRow(Processor.java:64)

      at org.pentaho.di.trans.steps.userdefinedjavaclass.UserDefinedJavaClass.processRow(UserDefinedJavaClass.java:1181)

      at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)

      at java.lang.Thread.run(Thread.java:745)



  • 4.  Re: Data Warehouse Load Failure

    Posted Apr 24, 2015 09:08 AM

    The line

    [CA Clarity][SQLServer JDBC Driver][SQLServer]Cannot insert the value NULL into column 'PLAN_FINISH_DATE', table 'ppm_dwh.ppm_dwh.DWH_FIN_PLAN'; column does not allow nulls. INSERT fails.

    points to a financial plan without a finish date.

     

    I would propose checking in CA PPM and/or the FIN_FINANCIALS table.
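
    For example, a query along these lines could help flag the offending plans (just a sketch; the view and column names are copied from the failing SELECT in the job log above, so adjust the schema prefix and language filter for your environment):

    -- Cost plan rows feeding DWH_FIN_PLAN that have no finish date.
    -- Assumes you run this directly against the PPM (niku) database.
    SELECT PLAN_ID, PLAN_NAME, PLAN_START_DATE, PLAN_FINISH_DATE, CLARITY_UPDATED_DATE
    FROM niku.DWH_COSTPLAN_V
    WHERE PLAN_FINISH_DATE IS NULL
      AND LANGUAGE_CODE = 'en';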



  • 5.  Re: Data Warehouse Load Failure

    Posted Apr 24, 2015 10:01 AM

    Thank you Georg, we will try that and see how it goes.



  • 6.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 05:27 AM

    Hi guys,

     

    Just another query on this one, regarding the DWH size estimate. I had a look at the post-check-results.html file and saw the following:

    Estimated DWH schema size : 40 GB

     

    Yesterday, when we ran the Load DWH job again, it ran for 207 minutes (longer than previous attempts) but still failed. Having noted the free space on the drive before and after, I can see that we used 40 GB. This is for only 1 entity.

     

    We have 7 entities that we would like to eventually load into the DWH. Does this mean our storage estimate will be closer to 300 GB (7 x 40 GB = 280 GB)?

     

    I have a case open to try to establish why our DWH is failing.



  • 7.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 06:23 AM

    Seeing this error:

     

    NJS-0401 - Execution of job failed.

     

    Also: "Could not login to Jaspersoft, domains could not get updated for customizations."

     

    My Jaspersoft config looks to be correct, though. In the NSA the status was definitely 'Available' prior to the last DWH Load attempt.



  • 8.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 06:47 AM

    Hi Cmcn1982,

     

    What do the DWH logs say? If you can share them, I can take a look.

     

    Regards

    Suman Pramanik



  • 9.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 06:53 AM

    Thank you Suman.

     

    I cannot see anything specific in the logs but maybe you can spot what I am missing?

     

    (Sent to you via email)

     

    Thanks!



  • 10.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 07:04 AM

    Hi Colin,

     

    I don't see an error in the logs, and it also looks like the job completed. Is this coming up again when you try to run it now?

     

     

    2015/04/30 12:36:39 - Write to log.0 - DWH_LOAD_END_DATE = 2015/04/30 12:36:39

    INFO  2015-04-30 14:32:34,573 [Thread-11990] dwh.event dwh_etl_master - Job execution finished

    INFO  2015-04-30 14:32:34,573 [Thread-11990] dwh.event dwh_etl_master - Job execution finished

    INFO  2015-04-30 19:28:23,204 [Thread-10661] dwh.event dwh_etl_master - Job execution finished



  • 11.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 08:14 AM

    I was under the impression that it had failed because on the application side we saw:

     

    Load_DWH_Failure.PNG



  • 12.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 08:23 AM

    Hi Colin,

     

    The logs which you shared don't have the 4 PM timing which we see here. The logs provided to me have timestamps only up to 12:30:

     

    2015/04/30 12:35:21 - Execute SQL script.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)

     

    Regards

    Suman Pramanik



  • 13.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 09:42 AM

    Hi Suman,

     

    The application side (4 PM) is GMT, but the logs from the server are EST. That is why the timestamps differ.

    So the 12:35 is actually 5:30 PM.

     

    A little bit confusing I know!



  • 14.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 09:49 AM

    Hi Colin,

     

    Try running it again and see, as I don't see anything in the log.

     

    Regards

    Suman Pramanik



  • 15.  Re: Data Warehouse Load Failure

    Posted May 01, 2015 09:51 AM

    Will do.

     

    Time Slicing is running at the moment. We should wait until it has finished processing, right?



  • 16.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 09:57 AM

    Yes, let's wait until it completes.

     

    Regards

    Suman Pramanik



  • 17.  Re: Data Warehouse Load Failure

    Broadcom Employee
    Posted May 01, 2015 09:53 AM

    Also, you can query the DWH_CFG_AUDIT table for the status of the job while it is running (a query sketch follows the column list below):

     

     

    Table_name – name of the table being loaded

    Dw_updated_date – data changes were retrieved up to this date

    Dw_load_start_date – date the load of this table started

    Dw_load_end_date – date the load of this table ended
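
    A minimal query sketch (the column names are the ones listed above; the ORDER BY is just a suggestion):

    -- Per-table progress of the current Load DWH run.
    SELECT TABLE_NAME, DW_UPDATED_DATE, DW_LOAD_START_DATE, DW_LOAD_END_DATE
    FROM DWH_CFG_AUDIT
    ORDER BY DW_LOAD_START_DATE DESC;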

     

    Regards

    Suman Pramanik



  • 18.  Re: Data Warehouse Load Failure

    Posted May 05, 2015 07:43 AM

    Hi Suman,

     

    The Time Slice job failed after running for a long time (4888 minutes!).

     

    The error "NJS-0401: Execution of job failed" appears.

     

    I searched the bg logs for more information but could not find any reference to the above error.

    Maybe I am missing something?



  • 19.  Re: Data Warehouse Load Failure

    Posted May 05, 2015 08:58 AM

    I think I found the error relating to the Time Slice failure:

     

    ERROR 2015-05-04 14:18:20,626 [Dispatch pool-5-thread-2 : bg@SERVER (tenant=clarity)] niku.njs (clarity:admin:199480846__8640E4D4-9645-43CE-AD59-FD93651DB333:Time Slicing) Error executing job: 5821997

    java.sql.SQLException: [CA Clarity][SQLServer JDBC Driver]Object has been closed.

      at com.ca.clarity.jdbc.sqlserverbase.ddcw.b(Unknown Source)

      at com.ca.clarity.jdbc.sqlserverbase.ddcw.a(Unknown Source)

      at com.ca.clarity.jdbc.sqlserverbase.ddcv.b(Unknown Source)

      at com.ca.clarity.jdbc.sqlserverbase.ddcv.a(Unknown Source)

      at com.ca.clarity.jdbc.sqlserverbase.BaseConnection.commit(Unknown Source)

      at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)

      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

      at java.lang.reflect.Method.invoke(Method.java:606)

      at org.logicalcobwebs.proxool.ProxyConnection.invoke(ProxyConnection.java:68)

      at org.logicalcobwebs.cglib.proxy.Proxy$ProxyImpl$$EnhancerByCGLIB$$290b6638.commit(<generated>)

      at com.niku.xql2.jdbc.JDBCDataSource.concludeTransaction(JDBCDataSource.java:404)

      at com.niku.xql2.jdbc.JDBCDataSource.close(JDBCDataSource.java:351)

      at com.niku.xql2.binding.XQLSessionImpl.close(XQLSessionImpl.java:260)

      at com.niku.njs.xbl.XBLService.process(XBLService.java:134)

      at com.niku.njs.xbl.XBLService.process(XBLService.java:75)

      at com.niku.njs.listeners.XBLJob.scheduledEventFired(XBLJob.java:44)

      at com.niku.njs.Dispatcher$BGTask.run(Dispatcher.java:367)

      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

      at java.lang.Thread.run(Thread.java:745)

     

    JDBC Driver object has been closed.

     

    Can anyone please shed any more light on this? Thanks.



  • 20.  Re: Data Warehouse Load Failure

    Posted May 05, 2015 10:11 AM

    Hello,

     

    From the error, what I understand is that the Time Slicing job got stuck partway through and the connection was closed due to inactivity. Can you please run the job again and keep monitoring the progress using the following query?

     

    SELECT 'Assignment' Slice_Object, Count(*), SLICE_STATUS
    FROM prassignment
    WHERE SLICE_STATUS in (1,2,3,4)
    GROUP BY SLICE_STATUS
    UNION SELECT 'Availability' Slice_Object, Count(*), SLICE_STATUS
    FROM prj_resources
    WHERE SLICE_STATUS in (1,2,3,4)
    GROUP BY SLICE_STATUS
    UNION SELECT 'Allocation' Slice_Object, Count(*), SLICE_STATUS
    FROM prteam
    WHERE SLICE_STATUS in (1,2,3,4)
    GROUP BY SLICE_STATUS
    UNION SELECT 'Allocation' Slice_Object, Count(*), HARD_SLICE_STATUS
    FROM prteam
    WHERE HARD_SLICE_STATUS in (1,2,3,4)
    GROUP BY HARD_SLICE_STATUS
    UNION SELECT 'Timeentries' Slice_Object, Count(*), SLICE_STATUS
    FROM prtimeentry
    WHERE SLICE_STATUS in (1,2,3,4)
    GROUP BY SLICE_STATUS
    UNION SELECT 'Costplan Units' Slice_Object, Count(*), odf_ss_units as slice_status
    FROM fin_cost_plan_details
    WHERE odf_ss_units in (1,2,3,4)
    GROUP BY odf_ss_units
    UNION SELECT 'Costplan Bill Revenue' Slice_Object, Count(*), odf_ss_bill_revenue as slice_status
    FROM fin_cost_plan_details
    WHERE odf_ss_bill_revenue in (1,2,3,4)
    GROUP BY odf_ss_bill_revenue
    UNION SELECT 'Costplan Bill Cost' Slice_Object, Count(*), odf_ss_bill_cost as slice_status
    FROM fin_cost_plan_details
    WHERE odf_ss_bill_cost in (1,2,3,4)
    GROUP BY odf_ss_bill_cost
    UNION SELECT 'Costplan Revenue' Slice_Object, Count(*), odf_ss_revenue as slice_status
    FROM fin_cost_plan_details
    WHERE odf_ss_revenue in (1,2,3,4)
    GROUP BY odf_ss_revenue
    UNION SELECT 'Costplan Cost' Slice_Object, Count(*), odf_ss_cost as slice_status
    FROM fin_cost_plan_details
    WHERE odf_ss_cost in (1,2,3,4)
    GROUP BY odf_ss_cost
    UNION SELECT 'Assignment Baselines' Slice_Object, Count(*), slice_status
    FROM PRJ_BASELINE_DETAILS
    WHERE SLICE_STATUS in (1,2,3,4) and object_type='ASSIGNMENT' and IS_CURRENT=1
    GROUP BY slice_status

     

    Run this query every 30 minutes, take a snapshot, and compare the results with the next run to see whether there is any progress.

    That way we will also know how many items are being processed.

     

    Regards

    Chandrani

     



  • 21.  Re: Data Warehouse Load Failure
    Best Answer

    Posted May 06, 2015 10:44 AM

    We did some further troubleshooting: we ran the Time Slicing job again and I kept a close eye on the bg-ca.log file.

    Soon we saw the error:

    "ERROR 2015-05-06 07:18:19,336 [Dispatch pool-5-thread-6 : bg@SERVER (tenant=clarity)] niku.blobcrack (clarity:admin:199481040__64A6AB84-EDA6-4B00-95D7-225EE324B3FD:Time Slicing) Bad prAllocSum value.  Object ID = 5266621 Object type = 4"

     

    After this we found the KB article - http://www.ca.com/us/support/ca-support-online/product-content/knowledgebase-articles/tec529770.aspx

     

     

    The error is caused by a corrupt value in the allocation record.

    To locate the allocation record, do the following:

    1. Use the object id in the error message to run the following query:

       select P.unique_name as project_code, R.UNIQUE_NAME as resource_code
       from srm_projects P, prteam T, srm_resources R
       where P.id = T.PRPROJECTID and R.ID = T.PRRESOURCEID
       and T.PRID = 'enter object id from the error message'

       e.g. this is the query for the above error:

       select P.unique_name as project_code, R.UNIQUE_NAME as resource_code
       from srm_projects P, prteam T, srm_resources R
       where P.id = T.PRPROJECTID and R.ID = T.PRRESOURCEID
       and T.PRID = 5266621

    2. Use the project_code to locate the project and the resource_code to locate the resource and allocation record.

    3. Check the allocation record for any corrupt values.

      In this case the allocation percentage field had a non-numeric value, i.e. '~' (a query sketch for inspecting the record follows the steps).

     

    4. Update the value to a valid one.

    5. Restart the background (bg) services.

    6. Run the Time Slicing job.
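
    Before updating the value (step 4), it can help to look at the raw team (allocation) record first. A minimal sketch using the object id from our error message; the exact column holding the corrupt value varies, so inspect the whole row:

    -- Inspect the team/allocation record referenced by the "Bad prAllocSum" error.
    SELECT * FROM prteam WHERE PRID = 5266621;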

     

    We are now in the process of running Time Slice again so we will see what happens this time.



  • 22.  Re: Data Warehouse Load Failure

    Posted May 11, 2015 07:35 AM

    Once we resolved the Time Slice issue it ran to completion.

    Then we were able to run the Load DWH job successfully, and we are now running the Load DWH Access Rights job.

     

    Thanks to all for their advice on this one!