IQ: PKIX path building failed unable to find valid certification path to requested target / LicenseClient: handleLicenseException - got exception Problem with connection to server


Doc ID    SOLN282566
Version:    1.0
Status:    Published
Published date:    22 Jan 2016
Author:    rlinwood
 

Details

ERROR LicenseClient: handleLicenseException -
got exception Problem with connection to server

ValidatorException: PKIX path building failed
unable to find valid certification path to requested target

TimerRenew.java
WARN TImerRenew::run - got exception CONNECTIONFAILURE

PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException:
unable to find valid certification path to requested target
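For context, "PKIX path building failed" means the Java runtime could not build a chain from the certificate the server presented to any root in its trust store. The same class of failure can be reproduced outside Java with openssl; everything below (names, paths, the throwaway CA) is illustrative and not taken from this system:

```shell
# Illustrative only: reproduce a chain-validation failure with openssl.
tmp=$(mktemp -d)

# Create a throwaway CA and a leaf certificate signed by it.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=Demo CA" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" -days 1 2>/dev/null
openssl req -new -newkey rsa:2048 -nodes -subj "/CN=iq.example.com" \
  -keyout "$tmp/leaf.key" -out "$tmp/leaf.csr" 2>/dev/null
openssl x509 -req -in "$tmp/leaf.csr" -CA "$tmp/ca.pem" -CAkey "$tmp/ca.key" \
  -CAcreateserial -days 1 -out "$tmp/leaf.pem" 2>/dev/null

# With the CA in the trust store the chain verifies...
openssl verify -CAfile "$tmp/ca.pem" "$tmp/leaf.pem"
# ...without it, verification fails -- the openssl analogue of
# "unable to find valid certification path to requested target".
openssl verify "$tmp/leaf.pem" 2>&1 | head -1
```

The JVM hits the same wall when the issuing CA (or an expired replacement for it) is missing from its trust store.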

CommonLicensing.Licensing.GRACEPERIODDAYSLEFT The license grace period will end in 10 days.
Original Problem: Trusted certificates expired
=====

IBM System x3550 M2 -[7946PEM]- (the hardware was unrelated to the problem)
5.1.3.0.77_8019_10919_SP3 (the IQ version was unrelated to the problem)

Problem Clarification

/var/log/Avaya/CCR/ADMIN*/ADMIN*.log
2015-12-27 15:37:50,793 [Timer-14] 296 TimerGracePeriod.java ERROR CommonLicensing.Licensing.GRACEPERIODDAYSLEFT The license grace period will end in 10 days.
2015-12-28 13:17:37,161 [Timer-16] 382 LicenseClient.java ERROR LicenseClient::handleLicenseException - got exception Problem with connection to server: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
2015-12-28 13:17:37,161 [Timer-16] 221 TimerRenew.java WARN TImerRenew::run - got exception CONNECTIONFAILURE

=====

TRIED:

Step 1 > vi /opt/coreservices/certmgmt/conf/distcert.conf
HTTPS_PORT=28444
WEB_SERVICE_URI=/axis/services/CertificateManagementService
TRUST_STORE=trustedcerts.jks
TRUST_STORE_PASSWORD=password
KEY_STORE=Avaya_Common.pfx
KEY_STORE_TYPE=PKCS12
KEY_STORE_PASSWORD=avayapass

Step 2 > change the above port number to
HTTPS_PORT=8443

In the end, this did not resolve anything, so
/opt/coreservices/certmgmt/conf/distcert.conf
was re-edited and the HTTPS_PORT parameter was reset back to 28444.

=====

ALSO TRIED (this did not offer any clues):
[root@dccuvoipiq1 w]# keytool -list -keystore /u02/u01backup/product/11.1.0/db_1/javavm/lib/security/cacerts
Enter keystore password: changeit

Keystore type: jks
Keystore provider: SUN

Your keystore contains 33 entries

verisignclass1g3ca, Mar 25, 2004, trustedCertEntry,
Certificate fingerprint (MD5): B1:47:BC:18:57:D1:18:A0:78:2D:EC:71:E8:2A:95:73
equifaxsecureebusinessca1, Jul 18, 2003, trustedCertEntry,
Certificate fingerprint (MD5): 64:9C:EF:2E:44:FC:C6:8F:52:07:D0:51:73:8F:CB:3D
verisignclass2g2ca, Mar 25, 2004, trustedCertEntry,
. . . . .
starfieldclass2ca, Jan 11, 2005, trustedCertEntry,
Certificate fingerprint (MD5): 32:4A:4B:BB:C8:63:69:9B:BE:74:9A:C6:D
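Since the original problem was expired trusted certificates, the validity window of each entry is what matters here; `keytool -list -v` (rather than the plain `-list` above) prints "Valid from ... until ..." for every entry. The same expiry test can be run on an individual PEM certificate with openssl; the certificate below is generated on the fly purely for illustration:

```shell
# Illustrative: test a certificate's expiry window with openssl.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$tmp/key.pem" -out "$tmp/cert.pem" -days 2 2>/dev/null

# Show the expiry date, then test it: -checkend N fails if the
# certificate expires within the next N seconds.
openssl x509 -in "$tmp/cert.pem" -noout -enddate
if openssl x509 -in "$tmp/cert.pem" -noout -checkend 0; then
  echo "certificate is still valid"
else
  echo "certificate has EXPIRED"
fi
```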

Cause

2016-01-05 09:49:19,784 [http-28443-Processor7] 382 LicenseClient.java ERROR LicenseClient::handleLicenseException - got exception Problem with connection to server: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
2016-01-05 09:49:19,785 [http-28443-Processor7] 579 LicenseFacade.java ERROR CommonLicensing.Licensing.LICENSESERVERUNAVAILABLE The license server is unavailable. The request will be granted for now.

=====

The problem was caused by an error condition during DNS lookup.
The DNS lookup must have returned incorrect information about the IQ FQDN/IP, causing a problem with certificate mapping for the local IQ host.
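One way to spot this kind of condition is to check that forward and reverse DNS agree for the host. A minimal sketch, using `getent` (which consults the same resolver path as applications); the FQDN in the comments is hypothetical, and `localhost` is used only so the example is self-contained:

```shell
# Sketch: forward-resolve a name to its first address.
dns_forward() { getent hosts "$1" | awk '{print $1; exit}'; }

# On the IQ host one would compare, e.g.:
#   dns_forward iq.example.com      (hypothetical FQDN)
# against the address actually configured on the host, and then check
# the reverse record with `getent hosts <that address>`.
dns_forward localhost
```

A mismatch between the name the certificate was issued for and what DNS returns for the host is enough to break certificate mapping.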

=====

Apparently:
- Sometime between the initial installation and the certificate renewal, the customer had changed their DNS environment
- The customer later commented that they had experienced similar problems with other products that relied on DNS (unfortunately, this was not mentioned earlier in the investigation)

 

 

Solution

After renaming /etc/resolv.conf and restarting the AdminTomcat container,
we were able to view the local self-signed certificates in OAM.

From there we could see and manage the certificate update,
though further complications followed.
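The workaround amounted to temporarily taking DNS out of the picture. The rename pattern is demonstrated below on a scratch copy rather than the live file; the real commands (run as root on the IQ host) are shown in comments, and the exact restart command may differ by release:

```shell
# Demonstrate the rename pattern on a scratch copy, not the live file.
tmp=$(mktemp -d)
printf 'nameserver 192.0.2.53\n' > "$tmp/resolv.conf"
mv "$tmp/resolv.conf" "$tmp/resolv.conf.bak"

# On the IQ host the equivalent steps were (illustrative; the restart
# command for the AdminTomcat container may differ by release):
#   mv /etc/resolv.conf /etc/resolv.conf.bak
#   service AdminTomcat restart
ls "$tmp"
```

Keeping the `.bak` copy means the original resolver configuration can be restored once DNS is fixed.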

=====

Once the DNS issues were cleared, the certificate chain the customer provided proved valid,
but there was no pending certificate request to match it,
which meant the private key needed to use it was also gone.

Unless the original CSR and private key files could somehow be retrieved,
a new certificate would need to be cut.

When a request is made in OAM, it creates a file in /opt/coreservices/avaya/certs/reqs .
The request file is named after the alias you give it, with a .req extension, and a matching private key file is created at the same time.
Only when the final signed certificate is imported are the private key and certificate packaged together into the same .pfx file.
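That lifecycle can be sketched with openssl; all filenames, subjects, and the password below are illustrative, and OAM performs the equivalent steps internally. In practice the CA, not the requester, signs the CSR; a self-signed step stands in for that here:

```shell
tmp=$(mktemp -d)

# 1. A certificate request creates a CSR (<alias>.req) and a private key.
openssl req -new -newkey rsa:2048 -nodes -subj "/CN=iq.example.com" \
  -keyout "$tmp/myalias.key" -out "$tmp/myalias.req" 2>/dev/null

# 2. The CA returns a signed certificate (self-signed here for illustration).
openssl x509 -req -in "$tmp/myalias.req" -signkey "$tmp/myalias.key" \
  -days 365 -out "$tmp/myalias.crt" 2>/dev/null

# 3. On import, key and certificate are packaged into one .pfx (PKCS#12).
openssl pkcs12 -export -inkey "$tmp/myalias.key" -in "$tmp/myalias.crt" \
  -name myalias -passout pass:changeit -out "$tmp/myalias.pfx"
```

This is why losing the key file from step 1 is fatal: the signed certificate from the CA cannot be paired into a .pfx without it, and a fresh request must be cut.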

Ultimately a new certificate series was required for final resolution.

Avaya -- Proprietary. Use pursuant to the terms of your signed agreement or Avaya policy