spitzbube

ESXi removes the registration of machines

Hello,

I've been using an old HP G7 server with ESXi 6.5 and a custom image for my IT projects for a while now. The box only runs sporadically, and since the last start I've noticed that the VMs lose their registration.

After startup the VMs are simply no longer registered and linger as corpses in the list of virtual machines. If you remove the registration and re-register the machine as an existing one, it works fine again, until the next restart, when they are orphaned once more.
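
For reference, the unregister/re-register dance can also be done from the ESXi shell with vim-cmd. A minimal sketch; the VM ID and the .vmx path below are only examples:

# list registered VMs with their IDs and .vmx paths
vim-cmd vmsvc/getallvms
# remove the orphaned VM's registration by ID (42 is an example)
vim-cmd vmsvc/unregister 42
# register it again from the existing .vmx file
vim-cmd solo/registervm /vmfs/volumes/datastore1/MyVM/MyVM.vmx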

Two of the three machines are nested Hyper-V hosts used to simulate two sites. At first I thought the RAID controller's batteries were dead, but that would be shown during boot (HP Smart Array). The disks themselves also look fine and report no problems.
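
To double-check the storage anyway, the ESXi shell can report device state directly. A sketch; naa.XXXX is a placeholder for a real device ID from the list:

# list attached storage devices and their status
esxcli storage core device list
# SMART values for a single device (replace naa.XXXX with your device ID)
esxcli storage core device smart get -d naa.XXXX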

I also noticed that it no longer saves settings at all. Virtual switches that were created are gone after a reboot. To analyze where the problem comes from (ESXi or the server hardware), I'm turning to you: do you perhaps have a tip or similar on how I can get the box running again?
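
One thing that can be checked directly: ESXi keeps its running configuration in RAM and periodically persists it to the boot device via /sbin/auto-backup.sh. A minimal check, assuming the standard boot device layout:

# force a configuration save and watch for errors
/sbin/auto-backup.sh
# the saved state should end up in the bootbank
ls -l /bootbank/state.tgz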

Best regards

Content-ID: 441364

Url: https://administrator.de/contentid/441364


certifiedit.net 18.04.2019 at 10:00:54
Hello,

what do the logs say, what scripts are running? Otherwise everything OK? (Licenses and co.)
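
For a first look, the usual places on the host itself; a sketch:

# hostd handles VM registration, vmkernel the storage layer
tail -n 200 /var/log/hostd.log
tail -n 200 /var/log/vmkernel.log
tail -n 200 /var/log/syslog.log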

Why do the people asking questions here regularly have to be prompted first to do some groundwork themselves?

Best regards
Penny.Cilin 18.04.2019 at 10:03:16
Quote from @certifiedit.net:

Hello,

what do the logs say, what scripts are running? Otherwise everything OK? (Licenses and co.)

Why do the people asking questions here regularly have to be prompted first to do some groundwork themselves?
Morning, you do know that, work-wise, today is already Friday, right?

Otherwise I see it exactly as you do.

Best regards,
Penny.
emeriks 18.04.2019 at 10:05:43
Hi,
that sounds as if the datastore in question isn't online in time after boot.
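
That can be checked right after boot, roughly like this (a sketch):

# every datastore should show up as mounted here
esxcli storage filesystem list
# rescan the adapters if one is missing
esxcli storage core adapter rescan --all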

E.
Penny.Cilin 18.04.2019 at 10:06:39
Quote from @emeriks:

Hi,
that sounds as if the datastore in question isn't online in time after boot.
For that, though, you look at the logs. ;-)

E.
Regards, Penny.
Spitzbube 18.04.2019 at 10:20:00
Okay, that's true. Of course I'll provide the logs shortly. I'm already in weekend mode, or just not quite with it today.
certifiedit.net 18.04.2019 at 10:21:53
Quote from @Spitzbube:

Okay, that's true. Of course I'll provide the logs shortly. I'm already in weekend mode, or just not quite with it today.

Then please boot yourself up and ask the question properly again.
Spitzbube 18.04.2019 at 10:51:29
Well, it has a pile of old entries where it can't reach a domain that no longer exists in that form. I'm in the middle of reconfiguring that, but that's beside the point for now. But from line 224 onward it's already missing certain files.
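
I pulled the relevant part roughly like this (a sketch, assuming the default syslog path):

# print matches with line numbers to find where the errors start
grep -n -iE 'error|failed|no such file' /var/log/syslog.log

Here is the excerpt: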

2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:12Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:12:12Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72297  
2019-04-17T14:12:12Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:12Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:12Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:12Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:12:12Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72309  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:12:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72313  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:36Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:12:36Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72320  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:12:40Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72330  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:12:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:12Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:12Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:12Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:13:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72342  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:13:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72347  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:13:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72351  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:13:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:02Z backup.sh.72360: Locking esx.conf
2019-04-17T14:14:02Z backup.sh.72360: Creating archive
2019-04-17T14:14:03Z backup.sh.72360: Unlocking esx.conf
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:36Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:14:36Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72516  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:14:57Z backup.sh.72561: Locking esx.conf
2019-04-17T14:14:57Z backup.sh.72561: Creating archive
2019-04-17T14:14:57Z backup.sh.72561: Unlocking esx.conf
2019-04-17T14:14:58Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:14:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:14:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:01Z crond[66699]: crond: USER root pid 72716 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-17T14:15:01Z syslog[72719]: starting hostd probing.
2019-04-17T14:15:17Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:15:17Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72715  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:15:25Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72742  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:15:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72752  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:15:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:15:31Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:15:31Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72754  
2019-04-17T14:15:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-17T14:15:38Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-17T14:16:31Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:16:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:16:31Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:16:50Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:16:50Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72765  
2019-04-17T14:16:50Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:16:50Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:16:50Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:16:50Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:16:50Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72768  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:06Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:17:06Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72778  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:07Z backup.sh.72785: Locking esx.conf
2019-04-17T14:17:07Z backup.sh.72785: Creating archive
2019-04-17T14:17:07Z backup.sh.72785: Unlocking esx.conf
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:07Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:17:07Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72937  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:07Z ImageConfigManager: 2019-04-17 14:17:07,975 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-17T14:17:07Z ImageConfigManager: 2019-04-17 14:17:07,975 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x9e55ecf9b0>}   
2019-04-17T14:17:07Z hostd-icm[72951]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-17T14:17:07Z ImageConfigManager: 2019-04-17 14:17:07,975 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-17T14:17:07Z ImageConfigManager: 2019-04-17 14:17:07,976 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-d77d-c4a7</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-118488364</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-17T14:17:07Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,090 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,217 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,218 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x8bdccb4978>}   
2019-04-17T14:17:08Z hostd-icm[72959]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,218 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,219 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-48da-c4b4</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-118488367</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-17T14:17:08Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-17T14:17:08Z ImageConfigManager: 2019-04-17 14:17:08,237 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:10Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-17T14:17:10Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 72971  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-17T14:17:10Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-17T14:17:15Z init: starting pid 72977, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh stop'  
2019-04-17T14:17:15Z addVob[72979]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:15Z addVob[72979]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-17T14:17:15Z addVob[72979]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-17T14:17:15Z init: starting pid 72980, tty '': '/bin/shutdown.sh'  
2019-04-17T14:17:15Z VMware[shutdown]: Stopping VMs
2019-04-17T14:17:15Z jumpstart[72996]: executing stop for daemon hp-ams.sh.
2019-04-17T14:17:15Z root: ams stop watchdog...
2019-04-17T14:17:15Z root: ams-wd: ams-watchdog stop.
2019-04-17T14:17:15Z root: Terminating ams-watchdog process with PID 73004 73005
2019-04-17T14:17:17Z root: ams stop service...
2019-04-17T14:17:20Z jumpstart[72996]: executing stop for daemon xorg.
2019-04-17T14:17:20Z jumpstart[72996]: Jumpstart failed to stop: xorg reason: Execution of command: /etc/init.d/xorg stop failed with status: 3
2019-04-17T14:17:20Z jumpstart[72996]: executing stop for daemon vmsyslogd.
2019-04-17T14:17:20Z jumpstart[72996]: Jumpstart failed to stop: vmsyslogd reason: Execution of command: /etc/init.d/vmsyslogd stop failed with status: 1
2019-04-17T14:17:20Z jumpstart[72996]: executing stop for daemon vmtoolsd.
2019-04-17T14:17:20Z jumpstart[72996]: Jumpstart failed to stop: vmtoolsd reason: Execution of command: /etc/init.d/vmtoolsd stop failed with status: 1
2019-04-17T14:17:20Z jumpstart[72996]: executing stop for daemon wsman.
2019-04-17T14:17:20Z openwsmand: Getting Exclusive access, please wait...
2019-04-17T14:17:20Z openwsmand: Exclusive access granted.
2019-04-17T14:17:20Z openwsmand: Stopping openwsmand
2019-04-17T14:17:20Z watchdog-openwsmand: Watchdog for openwsmand is now 68106
2019-04-17T14:17:20Z watchdog-openwsmand: Terminating watchdog process with PID 68106
2019-04-17T14:17:20Z watchdog-openwsmand: [68106] Signal received: exiting the watchdog
2019-04-17T14:17:21Z jumpstart[72996]: executing stop for daemon snmpd.
2019-04-17T14:17:21Z root: Stopping snmpd by administrative request
2019-04-17T14:17:21Z root: snmpd is not running.
2019-04-17T14:17:21Z jumpstart[72996]: executing stop for daemon sfcbd-watchdog.
2019-04-17T14:17:21Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-17T14:17:21Z sfcbd-init: Exclusive access granted.
2019-04-17T14:17:21Z sfcbd-init: Request to stop sfcbd-watchdog, pid 73068
2019-04-17T14:17:21Z sfcbd-init: Invoked kill 68069
2019-04-17T14:17:21Z sfcb-vmware_raw[68596]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-17T14:17:21Z sfcb-vmware_base[68588]: VICimProvider exiting on WFU cancelled.
2019-04-17T14:17:21Z sfcb-vmware_base[68588]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-17T14:17:24Z sfcbd-init: stop sfcbd process completed.
2019-04-17T14:17:24Z jumpstart[72996]: executing stop for daemon vit_loader.sh.
2019-04-17T14:17:24Z VITLOADER: [etc/init.d/vit_loader] Shutdown VITD successfully
2019-04-17T14:17:24Z jumpstart[72996]: executing stop for daemon hpe-smx.init.
2019-04-17T14:17:25Z jumpstart[72996]: executing stop for daemon hpe-nmi.init.
2019-04-17T14:17:25Z jumpstart[72996]: executing stop for daemon hpe-fc.sh.
2019-04-17T14:17:25Z jumpstart[72996]: executing stop for daemon lwsmd.
2019-04-17T14:17:25Z watchdog-lwsmd: Watchdog for lwsmd is now 67833
2019-04-17T14:17:25Z watchdog-lwsmd: Terminating watchdog process with PID 67833
2019-04-17T14:17:25Z watchdog-lwsmd: [67833] Signal received: exiting the watchdog
2019-04-17T14:17:25Z lwsmd: Shutting down running services
2019-04-17T14:17:25Z lwsmd: Stopping service: lsass
2019-04-17T14:17:25Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [lsass-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [lsass-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: [lsass] Machine Password Sync Thread stopping
2019-04-17T14:17:25Z lwsmd: [lsass] LSA Service exiting...
2019-04-17T14:17:25Z lwsmd: Stopping service: rdr
2019-04-17T14:17:25Z lwsmd: Stopping service: lwio
2019-04-17T14:17:25Z lwsmd: [lwio-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [lwio-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: [lwio] LWIO Service exiting...
2019-04-17T14:17:25Z lwsmd: Stopping service: netlogon
2019-04-17T14:17:25Z lwsmd: [netlogon-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [netlogon-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: [netlogon] LWNET Service exiting...
2019-04-17T14:17:25Z lwsmd: Stopping service: lwreg
2019-04-17T14:17:25Z lwsmd: [lwreg-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [lwreg-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: [lwreg] REG Service exiting...
2019-04-17T14:17:25Z lwsmd: [lwsm-ipc] Shutting down listener
2019-04-17T14:17:25Z lwsmd: [lwsm-ipc] Listener shut down
2019-04-17T14:17:25Z lwsmd: Logging stopped
2019-04-17T14:17:27Z jumpstart[72996]: executing stop for daemon vpxa.
2019-04-17T14:17:27Z watchdog-vpxa: Watchdog for vpxa is now 67792
2019-04-17T14:17:28Z watchdog-vpxa: Terminating watchdog process with PID 67792
2019-04-17T14:17:28Z watchdog-vpxa: [67792] Signal received: exiting the watchdog
2019-04-17T14:17:28Z jumpstart[72996]: executing stop for daemon vobd.
2019-04-17T14:17:28Z watchdog-vobd: Watchdog for vobd is now 65960
2019-04-17T14:17:28Z watchdog-vobd: Terminating watchdog process with PID 65960
2019-04-17T14:17:28Z watchdog-vobd: [65960] Signal received: exiting the watchdog
2019-04-17T14:17:28Z jumpstart[72996]: executing stop for daemon dcbd.
2019-04-17T14:17:28Z watchdog-dcbd: Watchdog for dcbd is now 67703
2019-04-17T14:17:28Z watchdog-dcbd: Terminating watchdog process with PID 67703
2019-04-17T14:17:28Z watchdog-dcbd: [67703] Signal received: exiting the watchdog
2019-04-17T14:17:28Z jumpstart[72996]: executing stop for daemon nscd.
2019-04-17T14:17:28Z watchdog-nscd: Watchdog for nscd is now 67721
2019-04-17T14:17:28Z watchdog-nscd: Terminating watchdog process with PID 67721
2019-04-17T14:17:28Z watchdog-nscd: [67721] Signal received: exiting the watchdog
2019-04-17T14:17:28Z jumpstart[72996]: executing stop for daemon cdp.
2019-04-17T14:17:28Z watchdog-cdp: Watchdog for cdp is now 67743
2019-04-17T14:17:28Z watchdog-cdp: Terminating watchdog process with PID 67743
2019-04-17T14:17:28Z watchdog-cdp: [67743] Signal received: exiting the watchdog
2019-04-17T14:17:28Z jumpstart[72996]: executing stop for daemon lacp.
2019-04-17T14:17:28Z watchdog-net-lacp: Watchdog for net-lacp is now 66333
2019-04-17T14:17:29Z watchdog-net-lacp: Terminating watchdog process with PID 66333
2019-04-17T14:17:29Z watchdog-net-lacp: [66333] Signal received: exiting the watchdog
2019-04-17T14:17:29Z jumpstart[72996]: executing stop for daemon smartd.
2019-04-17T14:17:29Z watchdog-smartd: Watchdog for smartd is now 67762
2019-04-17T14:17:29Z watchdog-smartd: Terminating watchdog process with PID 67762
2019-04-17T14:17:29Z watchdog-smartd: [67762] Signal received: exiting the watchdog
2019-04-17T14:17:29Z smartd: [warn] smartd received signal 15
2019-04-17T14:17:29Z smartd: [warn] smartd exit.
2019-04-17T14:17:29Z jumpstart[72996]: executing stop for daemon memscrubd.
2019-04-17T14:17:29Z jumpstart[72996]: Jumpstart failed to stop: memscrubd reason: Execution of command: /etc/init.d/memscrubd stop failed with status: 3
2019-04-17T14:17:29Z jumpstart[72996]: executing stop for daemon slpd.
2019-04-17T14:17:29Z root: slpd Stopping slpd
2019-04-17T14:17:29Z slpd[67695]: SLPD daemon shutting down
2019-04-17T14:17:29Z slpd[67695]: *** SLPD daemon shut down by administrative request
2019-04-17T14:17:29Z jumpstart[72996]: executing stop for daemon sensord.
2019-04-17T14:17:29Z watchdog-sensord: Watchdog for sensord is now 67094
2019-04-17T14:17:29Z watchdog-sensord: Terminating watchdog process with PID 67094
2019-04-17T14:17:29Z watchdog-sensord: [67094] Signal received: exiting the watchdog
2019-04-17T14:17:30Z jumpstart[72996]: executing stop for daemon storageRM.
2019-04-17T14:17:30Z watchdog-storageRM: Watchdog for storageRM is now 67114
2019-04-17T14:17:30Z watchdog-storageRM: Terminating watchdog process with PID 67114
2019-04-17T14:17:30Z watchdog-storageRM: [67114] Signal received: exiting the watchdog
2019-04-17T14:17:30Z jumpstart[72996]: executing stop for daemon hostd.
2019-04-17T14:17:30Z watchdog-hostd: Watchdog for hostd is now 67140
2019-04-17T14:17:30Z watchdog-hostd: Terminating watchdog process with PID 67140
2019-04-17T14:17:30Z watchdog-hostd: [67140] Signal received: exiting the watchdog
2019-04-17T14:17:30Z jumpstart[72996]: executing stop for daemon sdrsInjector.
2019-04-17T14:17:30Z watchdog-sdrsInjector: Watchdog for sdrsInjector is now 67159
2019-04-17T14:17:30Z watchdog-sdrsInjector: Terminating watchdog process with PID 67159
2019-04-17T14:17:30Z watchdog-sdrsInjector: [67159] Signal received: exiting the watchdog
2019-04-17T14:17:30Z jumpstart[72996]: executing stop for daemon nfcd.
2019-04-17T14:17:30Z jumpstart[72996]: executing stop for daemon vvold.
2019-04-17T14:17:31Z jumpstart[72996]: Jumpstart failed to stop: vvold reason: Execution of command: /etc/init.d/vvold stop failed with status: 3
2019-04-17T14:17:31Z jumpstart[72996]: executing stop for daemon rhttpproxy.
2019-04-17T14:17:31Z watchdog-rhttpproxy: Watchdog for rhttpproxy is now 67521
2019-04-17T14:17:31Z watchdog-rhttpproxy: Terminating watchdog process with PID 67521
2019-04-17T14:17:31Z watchdog-rhttpproxy: [67521] Signal received: exiting the watchdog
2019-04-17T14:17:31Z jumpstart[72996]: executing stop for daemon hostdCgiServer.
2019-04-17T14:17:31Z watchdog-hostdCgiServer: Watchdog for hostdCgiServer is now 67548
2019-04-17T14:17:31Z watchdog-hostdCgiServer: Terminating watchdog process with PID 67548
2019-04-17T14:17:31Z watchdog-hostdCgiServer: [67548] Signal received: exiting the watchdog
2019-04-17T14:17:31Z jumpstart[72996]: executing stop for daemon lbtd.
2019-04-17T14:17:31Z watchdog-net-lbt: Watchdog for net-lbt is now 67576
2019-04-17T14:17:31Z watchdog-net-lbt: Terminating watchdog process with PID 67576
2019-04-17T14:17:31Z watchdog-net-lbt: [67576] Signal received: exiting the watchdog
2019-04-17T14:17:31Z jumpstart[72996]: executing stop for daemon rabbitmqproxy.
2019-04-17T14:17:31Z jumpstart[72996]: executing stop for daemon vmfstraced.
2019-04-17T14:17:31Z watchdog-vmfstracegd: PID file /var/run/vmware/watchdog-vmfstracegd.PID does not exist
2019-04-17T14:17:31Z watchdog-vmfstracegd: Unable to terminate watchdog: No running watchdog process for vmfstracegd
2019-04-17T14:17:32Z vmfstracegd: Failed to clear vmfstracegd memory reservation
2019-04-17T14:17:32Z jumpstart[72996]: executing stop for daemon esxui.
2019-04-17T14:17:32Z jumpstart[72996]: executing stop for daemon iofilterd-vmwarevmcrypt.
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: Could not expand environment variable HOME.
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: Could not expand environment variable HOME.
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-17T14:17:32Z iofilterd-vmwarevmcrypt[73649]: Resource Pool clean up for iofilter vmwarevmcrypt is done
2019-04-17T14:17:32Z jumpstart[72996]: executing stop for daemon swapobjd.
2019-04-17T14:17:32Z watchdog-swapobjd: Watchdog for swapobjd is now 67000
2019-04-17T14:17:32Z watchdog-swapobjd: Terminating watchdog process with PID 67000
2019-04-17T14:17:32Z watchdog-swapobjd: [67000] Signal received: exiting the watchdog
2019-04-17T14:17:33Z jumpstart[72996]: executing stop for daemon usbarbitrator.
2019-04-17T14:17:33Z watchdog-usbarbitrator: Watchdog for usbarbitrator is now 67038
2019-04-17T14:17:33Z watchdog-usbarbitrator: Terminating watchdog process with PID 67038
2019-04-17T14:17:33Z watchdog-usbarbitrator: [67038] Signal received: exiting the watchdog
2019-04-17T14:17:33Z jumpstart[72996]: executing stop for daemon iofilterd-spm.
2019-04-17T14:17:33Z iofilterd-spm[73712]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z iofilterd-spm[73712]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z iofilterd-spm[73712]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:33Z iofilterd-spm[73712]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-17T14:17:33Z iofilterd-spm[73712]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-17T14:17:33Z iofilterd-spm[73712]: Resource Pool clean up for iofilter spm is done
2019-04-17T14:17:33Z jumpstart[72996]: executing stop for daemon ESXShell.
2019-04-17T14:17:33Z addVob[73719]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z addVob[73719]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z addVob[73719]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:33Z addVob[73719]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-17T14:17:33Z addVob[73719]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-17T14:17:33Z addVob[73719]: VobUserLib_Init failed with -1
2019-04-17T14:17:33Z doat: Stopped wait on component ESXShell.stop
2019-04-17T14:17:33Z doat: Stopped wait on component ESXShell.disable
2019-04-17T14:17:33Z jumpstart[72996]: executing stop for daemon DCUI.
2019-04-17T14:17:33Z root: DCUI Disabling DCUI logins
2019-04-17T14:17:33Z addVob[73740]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z addVob[73740]: Could not expand environment variable HOME.
2019-04-17T14:17:33Z addVob[73740]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:33Z addVob[73740]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-17T14:17:33Z addVob[73740]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-17T14:17:33Z addVob[73740]: VobUserLib_Init failed with -1
2019-04-17T14:17:33Z jumpstart[72996]: executing stop for daemon ntpd.
2019-04-17T14:17:33Z root: ntpd Stopping ntpd
2019-04-17T14:17:34Z watchdog-ntpd: Watchdog for ntpd is now 66913
2019-04-17T14:17:34Z watchdog-ntpd: Terminating watchdog process with PID 66913
2019-04-17T14:17:34Z watchdog-ntpd: [66913] Signal received: exiting the watchdog
2019-04-17T14:17:34Z ntpd[66923]: ntpd exiting on signal 1 (Hangup)
2019-04-17T14:17:34Z ntpd[66923]: 185.194.140.199 local addr 192.168.20.20 -> <null>
2019-04-17T14:17:34Z jumpstart[72996]: executing stop for daemon SSH.
2019-04-17T14:17:34Z addVob[73772]: Could not expand environment variable HOME.
2019-04-17T14:17:34Z addVob[73772]: Could not expand environment variable HOME.
2019-04-17T14:17:34Z addVob[73772]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-17T14:17:34Z addVob[73772]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-17T14:17:34Z addVob[73772]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-17T14:17:34Z addVob[73772]: VobUserLib_Init failed with -1
2019-04-17T14:17:34Z doat: Stopped wait on component RemoteShell.disable
2019-04-17T14:17:34Z doat: Stopped wait on component RemoteShell.stop
2019-04-17T14:17:35Z backup.sh.73828: Locking esx.conf
2019-04-17T14:17:35Z backup.sh.73828: Creating archive
2019-04-17T14:17:35Z backup.sh.73828: Unlocking esx.conf
2019-04-18T06:47:42Z watchdog-vobd: [65960] Begin '/usr/lib/vmware/vob/bin/vobd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:47:42Z watchdog-vobd: Executing '/usr/lib/vmware/vob/bin/vobd'  
2019-04-18T06:47:42Z jumpstart[65945]: Launching Executor
2019-04-18T06:47:42Z jumpstart[65945]: Setting up Executor - Reset Requested
2019-04-18T06:47:43Z jumpstart[65945]: ignoring plugin 'vsan-upgrade' because version '2.0.0'  has already been run.  
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: check-required-memory
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: restore-configuration
2019-04-18T06:47:43Z jumpstart[65991]: restoring configuration
2019-04-18T06:47:43Z jumpstart[65991]: extracting from file /local.tgz
2019-04-18T06:47:43Z jumpstart[65991]: file etc/likewise/db/registry.db has been changed before restoring the configuration - the changes will be lost
2019-04-18T06:47:43Z jumpstart[65991]: ConfigCheck: Running ipv6 option upgrade, redundantly
2019-04-18T06:47:43Z jumpstart[65991]: Util: tcpip4 IPv6 enabled
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: vmkeventd
2019-04-18T06:47:43Z watchdog-vmkeventd: [65993] Begin '/usr/lib/vmware/vmkeventd/bin/vmkeventd', min-uptime = 10, max-quick-failures = 5, max-total-failures = 9999999, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:47:43Z watchdog-vmkeventd: Executing '/usr/lib/vmware/vmkeventd/bin/vmkeventd'  
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: vmkcrypto
2019-04-18T06:47:43Z jumpstart[65970]: 65971:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:47:43Z jumpstart[65970]: 65971:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:47:43Z jumpstart[65970]: 65971:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: autodeploy-enabled
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: vsan-base
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: vsan-early
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: advanced-user-configuration-options
2019-04-18T06:47:43Z jumpstart[65945]: executing start plugin: restore-advanced-configuration
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: PSA-boot-config
2019-04-18T06:47:44Z jumpstart[65970]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:47:44Z jumpstart[65970]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:47:44Z jumpstart[65970]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:47:44Z jumpstart[65970]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:47:44Z jumpstart[65970]: lib/ssl: protocol list tls1.2
2019-04-18T06:47:44Z jumpstart[65970]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:47:44Z jumpstart[65970]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: vprobe
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: vmkapi-mgmt
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: dma-engine
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: procfs
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: mgmt-vmkapi-compatibility
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: iodm
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: vmkernel-vmkapi-compatibility
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: driver-status-check
2019-04-18T06:47:44Z jumpstart[66023]: driver_status_check: boot cmdline: /jumpstrt.gz vmbTrustedBoot=false tboot=0x101b000 installerDiskDumpSlotSize=2560 no-auto-partition bootUUID=e78269d0448c41fe200c24e8a54f93c1
2019-04-18T06:47:44Z jumpstart[66023]: driver_status_check: useropts:
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: hardware-config
2019-04-18T06:47:44Z jumpstart[66024]: Failed to symlink /etc/vmware/pci.ids: No such file or directory
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: vmklinux
2019-04-18T06:47:44Z jumpstart[65945]: executing start plugin: vmkdevmgr
2019-04-18T06:47:44Z jumpstart[66025]: Starting vmkdevmgr
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: register-vmw-mpp
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: register-vmw-satp
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: register-vmw-psp
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: etherswitch
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: aslr
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: random
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: storage-early-config-dev-settings
2019-04-18T06:47:50Z jumpstart[65945]: executing start plugin: networking-drivers
2019-04-18T06:47:50Z jumpstart[66145]: Loading network device drivers
2019-04-18T06:47:53Z jumpstart[66145]: LoadVmklinuxDriver: Loaded module bnx2
2019-04-18T06:47:54Z jumpstart[65945]: executing start plugin: register-vmw-vaai
2019-04-18T06:47:54Z jumpstart[65945]: executing start plugin: usb
2019-04-18T06:47:54Z jumpstart[65945]: executing start plugin: local-storage
2019-04-18T06:47:54Z jumpstart[65945]: executing start plugin: psa-mask-paths
2019-04-18T06:47:54Z jumpstart[65945]: executing start plugin: network-uplink-init
2019-04-18T06:47:54Z jumpstart[66226]: Trying to connect...
2019-04-18T06:47:54Z jumpstart[66226]: Connected.
2019-04-18T06:47:57Z jumpstart[66226]: Received processed
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: psa-nmp-pre-claim-config
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: psa-filter-pre-claim-config
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: restore-system-uuid
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: restore-storage-multipathing
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: network-support
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: psa-load-rules
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: vds-vmkapi-compatibility
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: psa-filter-post-claim-config
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: psa-nmp-post-claim-config
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: mlx4_en
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: dvfilters-vmkapi-compatibility
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: vds-config
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: storage-drivers
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: vxlan-base
2019-04-18T06:47:57Z jumpstart[65945]: executing start plugin: firewall
2019-04-18T06:47:58Z jumpstart[65945]: executing start plugin: dvfilter-config
2019-04-18T06:47:58Z jumpstart[65945]: executing start plugin: dvfilter-generic-fastpath
2019-04-18T06:47:58Z jumpstart[65945]: executing start plugin: lacp-daemon
2019-04-18T06:47:58Z watchdog-net-lacp: [66328] Begin '/usr/sbin/net-lacp', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:47:58Z watchdog-net-lacp: Executing '/usr/sbin/net-lacp'  
2019-04-18T06:47:58Z jumpstart[65945]: executing start plugin: storage-psa-init
2019-04-18T06:47:58Z jumpstart[66345]: Trying to connect...
2019-04-18T06:47:58Z jumpstart[66345]: Connected.
2019-04-18T06:47:58Z jumpstart[66345]: Received processed
2019-04-18T06:47:58Z jumpstart[65945]: executing start plugin: restore-networking
2019-04-18T06:47:59Z jumpstart[65970]: NetworkInfoImpl: Enabling 1 netstack instances during boot
2019-04-18T06:48:04Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Storing previous management interface:'vmk0'  
2019-04-18T06:48:04Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk0'  
2019-04-18T06:48:04Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk1'  
2019-04-18T06:48:04Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing config for'vmk0'  
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk0
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic: Ipv6 not Enabled
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk0
2019-04-18T06:48:04Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk0', portStr:'Management Network'  
2019-04-18T06:48:04Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing config for'vmk1'  
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk1
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk1
2019-04-18T06:48:04Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65970]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk1', portStr:'NFS-FreeNAS'  
2019-04-18T06:48:04Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:48:04Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:48:04Z jumpstart[65970]: 2019-04-18T06:48:04Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: random-seed
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: dvfilters
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: restore-pxe-marker
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: auto-configure-networking
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: storage-early-configuration
2019-04-18T06:48:04Z jumpstart[66412]: 66412:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:48:04Z jumpstart[66412]: 66412:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:48:04Z jumpstart[66412]: 66412:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:48:04Z jumpstart[66412]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:48:04Z jumpstart[66412]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:48:04Z jumpstart[66412]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:48:04Z jumpstart[66412]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:48:04Z jumpstart[66412]: lib/ssl: protocol list tls1.2
2019-04-18T06:48:04Z jumpstart[66412]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:48:04Z jumpstart[66412]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: bnx2fc
2019-04-18T06:48:04Z jumpstart[65945]: executing start plugin: software-iscsi
2019-04-18T06:48:04Z jumpstart[65970]: iScsi: No iBFT data present in the BIOS
2019-04-18T06:48:05Z iscsid: Notice: iSCSI Database already at latest schema. (Upgrade Skipped).
2019-04-18T06:48:05Z iscsid: iSCSI MASTER Database opened. (0x21e8008)
2019-04-18T06:48:05Z iscsid: LogLevel = 0
2019-04-18T06:48:05Z iscsid: LogSync  = 0
2019-04-18T06:48:05Z iscsid: memory (180) MB successfully reserved for 1024 sessions
2019-04-18T06:48:05Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e64) idx (0) size (460808)
2019-04-18T06:48:05Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e62) idx (1) size (460808)
2019-04-18T06:48:05Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e60) idx (2) size (460808)
2019-04-18T06:48:05Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e5e) idx (3) size (460808)
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:48:06Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:48:06Z jumpstart[65945]: executing start plugin: fcoe-config
2019-04-18T06:48:06Z jumpstart[65945]: executing start plugin: storage-path-claim
2019-04-18T06:48:10Z jumpstart[65970]: StorageInfo: Number of paths 3
2019-04-18T06:48:14Z jumpstart[65970]: StorageInfo: Number of devices 3
2019-04-18T06:48:14Z jumpstart[65970]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T06:50:20Z mark: storage-path-claim-completed
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: gss
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: mount-filesystems
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: restore-paths
2019-04-18T06:48:14Z jumpstart[65970]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: filesystem-drivers
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: rpc
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: dump-partition
2019-04-18T06:48:14Z jumpstart[65970]: execution of 'system coredump partition set --enable=true --smart' failed : Unable to smart activate a dump partition.  Error was: No suitable diagnostic partitions found..  
2019-04-18T06:48:14Z jumpstart[65970]: 2019-04-18T06:48:14Z jumpstart[65945]: Executor failed executing esxcli command system coredump partition set --enable=true --smart
2019-04-18T06:48:14Z jumpstart[65945]: Method invocation failed: dump-partition->start() failed: error while executing the cli
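# "No suitable diagnostic partitions found" by itself only disables the coredump
# target, but it already hints that ESXi cannot see the partitions of its own boot
# device. A quick check from the ESXi shell:
esxcli system coredump partition list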
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: vsan-devel
2019-04-18T06:48:14Z jumpstart[66433]: VsanDevel: DevelBootDelay: 0
2019-04-18T06:48:14Z jumpstart[66433]: VsanDevel: DevelWipeConfigOnBoot: 0
2019-04-18T06:48:14Z jumpstart[66433]: VsanDevel: DevelTagSSD: Starting
2019-04-18T06:48:14Z jumpstart[66433]: 66433:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:48:14Z jumpstart[66433]: 66433:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:48:14Z jumpstart[66433]: 66433:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:48:14Z jumpstart[66433]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:48:14Z jumpstart[66433]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:48:14Z jumpstart[66433]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:48:14Z jumpstart[66433]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:48:14Z jumpstart[66433]: lib/ssl: protocol list tls1.2
2019-04-18T06:48:14Z jumpstart[66433]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:48:14Z jumpstart[66433]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:48:14Z jumpstart[66433]: VsanDevel: DevelTagSSD: Done.
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: vmfs
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: ufs
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: vfat
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: nfsgssd
2019-04-18T06:48:14Z watchdog-nfsgssd: [66636] Begin '/usr/lib/vmware/nfs/bin/nfsgssd -f -a', min-uptime = 60, max-quick-failures = 128, max-total-failures = 65536, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:48:14Z watchdog-nfsgssd: Executing '/usr/lib/vmware/nfs/bin/nfsgssd -f -a'  
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: vsan
2019-04-18T06:48:14Z nfsgssd[66646]: Could not expand environment variable HOME.
2019-04-18T06:48:14Z nfsgssd[66646]: Could not expand environment variable HOME.
2019-04-18T06:48:14Z nfsgssd[66646]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:48:14Z nfsgssd[66646]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:48:14Z nfsgssd[66646]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:48:14Z nfsgssd[66646]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:48:14Z nfsgssd[66646]: lib/ssl: protocol list tls1.2
2019-04-18T06:48:14Z nfsgssd[66646]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:48:14Z nfsgssd[66646]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:48:14Z nfsgssd[66646]: Empty epoch file
2019-04-18T06:48:14Z nfsgssd[66646]: Starting with epoch 1
2019-04-18T06:48:14Z nfsgssd[66646]: Connected to SunRPCGSS version 1.0
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: krb5
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: etc-hosts
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: nfs
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: nfs41
2019-04-18T06:48:14Z jumpstart[65945]: executing start plugin: mount-disk-fs
2019-04-18T06:48:15Z jumpstart[65970]: VmFileSystem: Automounted volume 5a6f6646-d13e2d89-fd8d-b499babd4e5e
2019-04-18T06:48:15Z jumpstart[65970]: VmFileSystem: Automounted volume 5ab363c3-c36e8e9f-8cfc-b499babd4e5e
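# The two local VMFS datastores do automount, so the VM files themselves are
# reachable. What goes missing is the inventory (/etc/vmware/hostd/vmInventory.xml),
# and that file is only persisted through the state.tgz mechanism above.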
2019-04-18T06:48:15Z jumpstart[65945]: executing start plugin: auto-configure-pmem
2019-04-18T06:48:15Z jumpstart[65945]: executing start plugin: restore-nfs-volumes
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:48:15Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:48:15Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:48:15Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:48:15Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:48:15Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:48:15Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:48:15Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:48:15Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:48:15Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:49:17Z jumpstart[65970]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "NFS-FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T06:49:17Z jumpstart[65970]: 2019-04-18T06:49:17Z jumpstart[65970]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T06:49:17Z jumpstart[65970]: 2019-04-18T06:49:17Z jumpstart[65945]: Executor failed executing esxcli command boot storage restore --nfs-volumes
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:49:17Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:49:17Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:49:17Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:49:17Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:49:17Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:49:17Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:49:17Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:49:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:49:17Z jumpstart[65945]: Method invocation failed: restore-nfs-volumes->start() failed: error while executing the cli
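# Both NFS mounts ("NFS-FreeNAS" and "FreeNAS") could not be restored during boot.
# VMs registered on an NFS datastore that is absent at startup show up as orphaned,
# which would match datastores coming online too late. Once the host is up, check
# whether the mounts returned and whether the filer answers on the storage vmknic:
esxcli storage nfs list
vmkping <NFS-server-IP>    # placeholder, use the FreeNAS address on the vmk1 net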
2019-04-18T06:49:17Z jumpstart[65945]: executing start plugin: auto-configure-storage
2019-04-18T06:49:17Z jumpstart[65945]: executing start plugin: restore-bootbanks
2019-04-18T06:49:17Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:20Z jumpstart[65970]: VmkCtl: Boot device not available, waited 3 seconds
2019-04-18T06:49:20Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:23Z jumpstart[65970]: VmkCtl: Boot device not available, waited 6 seconds
2019-04-18T06:49:23Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:26Z jumpstart[65970]: VmkCtl: Boot device not available, waited 9 seconds
2019-04-18T06:49:26Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:29Z jumpstart[65970]: VmkCtl: Boot device not available, waited 12 seconds
2019-04-18T06:49:29Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:32Z jumpstart[65970]: VmkCtl: Boot device not available, waited 15 seconds
2019-04-18T06:49:32Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:35Z jumpstart[65970]: VmkCtl: Boot device not available, waited 18 seconds
2019-04-18T06:49:35Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:38Z jumpstart[65970]: VmkCtl: Boot device not available, waited 21 seconds
2019-04-18T06:49:38Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:41Z jumpstart[65970]: VmkCtl: Boot device not available, waited 24 seconds
2019-04-18T06:49:41Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:44Z jumpstart[65970]: VmkCtl: Boot device not available, waited 27 seconds
2019-04-18T06:49:44Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:47Z jumpstart[65970]: VmkCtl: Boot device not available, waited 30 seconds
2019-04-18T06:49:47Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:50Z jumpstart[65970]: VmkCtl: Boot device not available, waited 33 seconds
2019-04-18T06:49:50Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:53Z jumpstart[65970]: VmkCtl: Boot device not available, waited 36 seconds
2019-04-18T06:49:53Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:56Z jumpstart[65970]: VmkCtl: Boot device not available, waited 39 seconds
2019-04-18T06:49:56Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:49:59Z jumpstart[65970]: VmkCtl: Boot device not available, waited 42 seconds
2019-04-18T06:49:59Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:02Z jumpstart[65970]: VmkCtl: Boot device not available, waited 45 seconds
2019-04-18T06:50:02Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:05Z jumpstart[65970]: VmkCtl: Boot device not available, waited 48 seconds
2019-04-18T06:50:05Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:08Z jumpstart[65970]: VmkCtl: Boot device not available, waited 51 seconds
2019-04-18T06:50:08Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:11Z jumpstart[65970]: VmkCtl: Boot device not available, waited 54 seconds
2019-04-18T06:50:11Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:14Z jumpstart[65970]: VmkCtl: Boot device not available, waited 57 seconds
2019-04-18T06:50:14Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:17Z jumpstart[65970]: VmkCtl: Boot device not available, waited 60 seconds
2019-04-18T06:50:17Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:17Z jumpstart[65970]: VmkCtl: Did not find a valid boot device, symlinking /bootbank to /tmp
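# Key finding: after 60 seconds the VMkernel never found its boot device and
# symlinked /bootbank to /tmp, i.e. a ramdisk. With that, auto-backup has nowhere
# persistent to write state.tgz, so vSwitches and VM registrations are lost on
# every reboot. On a ProLiant G7 a worn-out USB stick / SD card (or an unreadable
# boot LUN on the Smart Array) is the usual suspect. To verify:
ls -l /bootbank /altbootbank     # should point at vfat volumes, not /tmp
esxcli storage filesystem list   # the two ~250 MB vfat bootbank volumes should appear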
2019-04-18T06:50:17Z jumpstart[65945]: executing start plugin: restore-host-cache
2019-04-18T06:50:17Z jumpstart[65970]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T06:50:17Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:50:17Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:50:17Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:17Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:50:17Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:50:17Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:17Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:50:17Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:50:17Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T06:50:17Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T06:50:17Z jumpstart[65970]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T06:50:17Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T06:50:17Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55097425920
2019-04-18T06:50:18Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5ab363c4-26d208a0-fab7-b499babd4e5e total blocks 2500207837184 used blocks 2400621428736
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vflash
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: dump-file
2019-04-18T06:50:18Z jumpstart[65970]: VmkCtl: Diagnostic File found; not auto creating
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
2019-04-18T06:50:18Z jumpstart[65970]: VmkCtl: Dump file determined to be large enough, size: 1588592640 (recommended minimum: 1588592640)
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vmci
2019-04-18T06:50:18Z jumpstart[65970]: execution of 'system module load --module vmci' failed : Unable to load module /usr/lib/vmware/vmkmod/vmci: Busy  
2019-04-18T06:50:18Z jumpstart[65945]: Executor failed executing esxcli command system module load --module vmci
2019-04-18T06:50:18Z jumpstart[65945]: Method invocation failed: vmci->start() failed: error while executing the cli
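# The vmci "Busy" error is most likely harmless: the module is already loaded and
# the plugin merely fails to load it a second time.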
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: configure-locker
2019-04-18T06:50:18Z jumpstart[66691]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /scratch
2019-04-18T06:50:18Z jumpstart[66691]: Using /locker/packages/6.5.0/ as /productLocker
2019-04-18T06:50:18Z jumpstart[66691]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /locker
2019-04-18T06:50:18Z jumpstart[66691]: Using policy dir /etc/vmware/secpolicy
2019-04-18T06:50:18Z jumpstart[66691]: Parsed all objects
2019-04-18T06:50:18Z jumpstart[66691]: Objects defined and obsolete objects removed
2019-04-18T06:50:18Z jumpstart[66691]: Parsed all domain names
2019-04-18T06:50:18Z jumpstart[66691]: Domain policies parsed and syntax validated
2019-04-18T06:50:18Z jumpstart[66691]: Constraints check for domain policies succeeded
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /usr/share/nvidia
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /productLocker
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /.vmware
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/vsansparse
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/cbt
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/svm
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/upit
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/vsan
2019-04-18T06:50:18Z jumpstart[66691]: Getting realpath failed: /dev/vvol
2019-04-18T06:50:18Z jumpstart[66691]: Domain policies set
2019-04-18T06:50:18Z jumpstart[66691]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T06:50:18Z jumpstart[66691]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T06:50:18Z jumpstart[66691]: Parsed all the tardisk policy files
2019-04-18T06:50:18Z jumpstart[66691]: Set all the tardisk labels and policy
2019-04-18T06:50:18Z jumpstart[66691]: Parsed all file label mappings
2019-04-18T06:50:18Z jumpstart[66691]: Set all file labels
2019-04-18T06:50:18Z jumpstart[66691]: System security policy has been set successfully
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: restore-system-swap
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: cbrc
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: tpm
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: apei
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: restore-security-policies
2019-04-18T06:50:18Z jumpstart[66697]: Using policy dir /etc/vmware/secpolicy
2019-04-18T06:50:18Z jumpstart[66697]: Parsed all objects
2019-04-18T06:50:18Z jumpstart[66697]: Objects defined and obsolete objects removed
2019-04-18T06:50:18Z jumpstart[66697]: Parsed all domain names
2019-04-18T06:50:18Z jumpstart[66697]: Domain policies parsed and syntax validated
2019-04-18T06:50:18Z jumpstart[66697]: Constraints check for domain policies succeeded
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /usr/share/nvidia
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /productLocker
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /.vmware
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/vsansparse
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/cbt
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/svm
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/upit
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/vsan
2019-04-18T06:50:18Z jumpstart[66697]: Getting realpath failed: /dev/vvol
2019-04-18T06:50:18Z jumpstart[66697]: Domain policies set
2019-04-18T06:50:18Z jumpstart[66697]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T06:50:18Z jumpstart[66697]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T06:50:18Z jumpstart[66697]: Parsed all the tardisk policy files
2019-04-18T06:50:18Z jumpstart[66697]: Set all the tardisk labels and policy
2019-04-18T06:50:18Z jumpstart[66697]: Parsed all file label mappings
2019-04-18T06:50:18Z jumpstart[66697]: Set all file labels
2019-04-18T06:50:18Z jumpstart[66697]: System security policy has been set successfully
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: oem-modules
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: crond
2019-04-18T06:50:18Z crond[66701]: crond: crond (busybox 1.22.1) started, log level 8
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: restore-resource-groups
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: procMisc
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: rdma-vmkapi-compatibility
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: ipmi
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: restore-keymap
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: nmp-vmkapi-compatibility
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: iscsi-vmkapi-compatibility
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: ftcpt
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: hbr
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: autodeploy-setpassword
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: inetd
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vrdma
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: tag-boot-bank
2019-04-18T06:50:18Z jumpstart[66788]: unable to open boot configuration: No such file or directory
2019-04-18T06:50:18Z jumpstart[65945]: Method invocation failed: tag-boot-bank->start() failed: exited with code 1
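# Same root cause as above: tag-boot-bank finds no boot.cfg because /bootbank
# points at the ramdisk instead of the real boot volume.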
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: system-image-cache
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: iofilters
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vit
2019-04-18T06:50:18Z jumpstart[65970]: Parser: Initializing VIT parser lib
2019-04-18T06:50:18Z jumpstart[65970]: VsanIscsiTargetImpl: The host is not in a Virtual SAN cluster.
2019-04-18T06:50:18Z jumpstart[65970]: Util: Retrieved vit status successfully
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vmotion
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: vfc
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: balloonVMCI
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: coredump-configuration
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T06:50:18Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:50:18Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:50:18Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:50:18Z jumpstart[65970]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T06:50:18Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:50:18Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:50:18Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:50:18Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T06:50:18Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T06:50:18Z jumpstart[65970]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T06:50:18Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T06:50:18Z jumpstart[65970]: GetTypedFileSystems: fstype ufs
2019-04-18T06:50:18Z jumpstart[65970]: GetTypedFileSystems: fstype vvol
2019-04-18T06:50:18Z jumpstart[65970]: GetTypedFileSystems: fstype vsan
2019-04-18T06:50:18Z jumpstart[65970]: GetTypedFileSystems: fstype PMEM
2019-04-18T06:50:18Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55097425920
2019-04-18T06:50:18Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:50:18Z jumpstart[65945]: executing start plugin: set-acceptance-level
2019-04-18T06:50:19Z jumpstart[65945]: executing start plugin: scratch-storage
2019-04-18T06:50:20Z jumpstart[65945]: executing start plugin: pingback
2019-04-18T06:50:20Z jumpstart[65945]: executing start plugin: vmswapcleanup
2019-04-18T06:50:20Z jumpstart[65970]: execution of '--plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup' failed : Host Local Swap Location has not been enabled  
2019-04-18T06:50:20Z jumpstart[65945]: Executor failed executing esxcli command --plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup
2019-04-18T06:50:20Z jumpstart[65945]: Method invocation failed: vmswapcleanup->start() failed: error while executing the cli
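# This one only states that no host-local swap location is configured and is
# presumably unrelated to the registration problem.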
2019-04-18T06:50:20Z jumpstart[65970]: Jumpstart executor signalled to stop
2019-04-18T06:50:20Z jumpstart[65945]: Executor has been Successfully Stopped
2019-04-18T06:50:20Z init: starting pid 66828, tty '': '/usr/lib/vmware/firstboot/bin/firstboot.py ++group=host/vim/vmvisor/boot -l'  
2019-04-18T06:50:20Z init: starting pid 66829, tty '': '/bin/services.sh start'  
2019-04-18T06:50:21Z jumpstart[66882]: executing start plugin: ESXShell
2019-04-18T06:50:21Z addVob[66888]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66888]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66888]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66888]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66888]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:50:21Z jumpstart[66882]: executing start plugin: DCUI
2019-04-18T06:50:21Z root: DCUI Enabling DCUI login: runlevel =
2019-04-18T06:50:21Z addVob[66903]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66903]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66903]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66903]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66903]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:50:21Z jumpstart[66882]: executing start plugin: ntpd
2019-04-18T06:50:21Z root: ntpd Starting ntpd
2019-04-18T06:50:21Z sntp[66908]: sntp 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:44 UTC 2016 (1)
2019-04-18T06:50:21Z sntp[66908]: 2019-04-18 06:50:21.588076 (+0000) -0.242596 +/- 0.188357 pool.ntp.org 78.46.53.2 s2 no-leap
2019-04-18T06:50:21Z watchdog-ntpd: [66915] Begin '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift', min-uptime = 60, max-quick-failures = 5, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:21Z watchdog-ntpd: Executing '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift'  
2019-04-18T06:50:21Z ntpd[66925]: ntpd 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:59 UTC 2016 (1): Starting
2019-04-18T06:50:21Z ntpd[66925]: Command line: /sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift
2019-04-18T06:50:21Z ntpd[66925]: proto: precision = 0.467 usec (-21)
2019-04-18T06:50:21Z ntpd[66925]: restrict default: KOD does nothing without LIMITED.
2019-04-18T06:50:21Z ntpd[66925]: Listen and drop on 0 v6wildcard [::]:123
2019-04-18T06:50:21Z ntpd[66925]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 2 lo0 127.0.0.1:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 3 vmk0 192.168.20.20:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 4 vmk1 192.168.55.60:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 5 lo0 [::1]:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 6 lo0 [fe80::1%1]:123
2019-04-18T06:50:21Z ntpd[66925]: Listen normally on 7 vmk1 [fe80::250:56ff:fe67:b2b0%3]:123
2019-04-18T06:50:21Z jumpstart[66882]: executing start plugin: SSH
2019-04-18T06:50:21Z addVob[66932]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66932]: Could not expand environment variable HOME.
2019-04-18T06:50:21Z addVob[66932]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66932]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:50:21Z addVob[66932]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:50:21Z jumpstart[66882]: executing start plugin: esxui
2019-04-18T06:50:22Z jumpstart[66882]: executing start plugin: iofilterd-vmwarevmcrypt
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: Could not expand environment variable HOME.
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: Could not expand environment variable HOME.
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:50:22Z iofilterd-vmwarevmcrypt[66961]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T06:50:22Z watchdog-iofiltervpd: [66974] Begin '/usr/lib/vmware/iofilter/bin/ioFilterVPServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:22Z watchdog-iofiltervpd: Executing '/usr/lib/vmware/iofilter/bin/ioFilterVPServer'  
2019-04-18T06:50:24Z jumpstart[66882]: executing start plugin: swapobjd
2019-04-18T06:50:24Z watchdog-swapobjd: [67002] Begin '/usr/lib/vmware/swapobj/bin/swapobjd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:24Z watchdog-swapobjd: Executing '/usr/lib/vmware/swapobj/bin/swapobjd'  
2019-04-18T06:50:25Z jumpstart[66882]: executing start plugin: usbarbitrator
2019-04-18T06:50:25Z usbarbitrator: evicting objects on USB from OC
2019-04-18T06:50:25Z usbarbitrator: unclaiming USB devices
2019-04-18T06:50:25Z usbarbitrator: rescanning to complete removal of USB devices
2019-04-18T06:50:25Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:50:25Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:50:25Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:50:25Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:50:25Z watchdog-usbarbitrator: [67038] Begin '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:25Z watchdog-usbarbitrator: Executing '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414'  
2019-04-18T06:50:25Z jumpstart[66882]: executing start plugin: iofilterd-spm
2019-04-18T06:50:25Z iofilterd-spm[67072]: Could not expand environment variable HOME.
2019-04-18T06:50:25Z iofilterd-spm[67072]: Could not expand environment variable HOME.
2019-04-18T06:50:25Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:25Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:50:25Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:50:25Z iofilterd-spm[67072]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T06:50:26Z usbarbitrator: Starting USB storage detach monitor
2019-04-18T06:50:26Z usbarbitrator: reservedHbas:
2019-04-18T06:50:26Z jumpstart[66882]: executing start plugin: sensord
2019-04-18T06:50:26Z watchdog-sensord: [67097] Begin '/usr/lib/vmware/bin/sensord -l', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:26Z watchdog-sensord: Executing '/usr/lib/vmware/bin/sensord -l'  
2019-04-18T06:50:26Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:50:26Z jumpstart[66882]: executing start plugin: storageRM
2019-04-18T06:50:26Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:50:26Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:50:26Z watchdog-storageRM: [67115] Begin '/sbin/storageRM', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:26Z watchdog-storageRM: Executing '/sbin/storageRM'  
2019-04-18T06:50:26Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:50:26Z usbarbitrator: Exiting USB storage detach monitor
2019-04-18T06:50:26Z jumpstart[66882]: executing start plugin: hostd
2019-04-18T06:50:26Z hostd-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/hostd/config.xml" to file "/etc/vmware/hostd/config.xml" [force=False]   
2019-04-18T06:50:26Z hostd-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T06:50:26Z hostd-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T06:50:26Z create-statsstore[67135]: Initiating hostd statsstore ramdisk size (re)evaluation.
2019-04-18T06:50:26Z create-statsstore[67135]: Maximum number of virtual machines supported for powering-on 384. Maximum number of virtual machines supported for register 1536. Maximum number of resource pools 1000.
2019-04-18T06:50:27Z create-statsstore[67135]: Estimating statsstore ramdisk of size 803MB will be needed.
2019-04-18T06:50:27Z create-statsstore[67135]: Creating statsstore ramdisk mount point /var/lib/vmware/hostd/stats.
2019-04-18T06:50:27Z create-statsstore[67135]: Creating new statsstore ramdisk with 803MB.
2019-04-18T06:50:27Z watchdog-hostd: [67142] Begin 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:27Z watchdog-hostd: Executing 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml'  
2019-04-18T06:50:27Z jumpstart[66882]: executing start plugin: sdrsInjector
2019-04-18T06:50:27Z watchdog-sdrsInjector: [67159] Begin '/sbin/sdrsInjector', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:27Z watchdog-sdrsInjector: Executing '/sbin/sdrsInjector'  
2019-04-18T06:50:27Z jumpstart[66882]: executing start plugin: nfcd
2019-04-18T06:50:27Z watchdog-nfcd: [67180] Begin '/usr/lib/vmware/bin/nfcd ++group=nfcd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 1) 1  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 2) 1  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 3) 1  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 4) 1  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z jumpstart[66882]: executing start plugin: vvold
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 5) 1  
2019-04-18T06:50:27Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T06:50:27Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 6) 1  
2019-04-18T06:50:27Z watchdog-nfcd: End '/usr/lib/vmware/bin/nfcd ++group=nfcd', failure limit reached  
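nfcd is the first start plugin here that dies instantly six times in a row until the watchdog gives up. Given the other "No such file or directory" messages during this boot, my first check would be whether the binary is even intact; a minimal sanity check from the ESXi shell (path taken from the log, nothing else assumed):

# verify the nfcd binary the watchdog keeps restarting
ls -l /usr/lib/vmware/bin/nfcd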
2019-04-18T06:50:28Z watchdog-vvold: [67301] Begin 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:28Z watchdog-vvold: Executing 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold'  
2019-04-18T06:50:28Z watchdog-vvold: Watchdog for vvold is now 67301
2019-04-18T06:50:28Z watchdog-vvold: Terminating watchdog process with PID 67301
2019-04-18T06:50:28Z watchdog-vvold: [67301] Signal received: exiting the watchdog
2019-04-18T06:50:29Z jumpstart[66882]: executing start plugin: rhttpproxy
2019-04-18T06:50:29Z rhttpproxy-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/rhttpproxy/config.xml" to file "/etc/vmware/rhttpproxy/config.xml" [force=False]   
2019-04-18T06:50:29Z rhttpproxy-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T06:50:29Z rhttpproxy-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T06:50:29Z watchdog-rhttpproxy: [67519] Begin 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:29Z watchdog-rhttpproxy: Executing 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml'  
2019-04-18T06:50:29Z jumpstart[66882]: executing start plugin: hostdCgiServer
2019-04-18T06:50:29Z watchdog-hostdCgiServer: [67544] Begin 'hostdCgiServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:29Z watchdog-hostdCgiServer: Executing 'hostdCgiServer'  
2019-04-18T06:50:29Z jumpstart[66882]: executing start plugin: lbtd
2019-04-18T06:50:29Z watchdog-net-lbt: [67570] Begin '/sbin/net-lbt ++min=0', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:29Z watchdog-net-lbt: Executing '/sbin/net-lbt ++min=0'  
2019-04-18T06:50:29Z PyVmomiServer: 2019-04-18 06:50:29,874 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:50:29Z jumpstart[66882]: executing start plugin: rabbitmqproxy
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: [67596] Begin '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', min-uptime = 60, max-quick-failures = 1, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 1) 0  
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 2) 0  
2019-04-18T06:50:30Z watchdog-rabbitmqproxy: End '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', failure limit reached  
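The rabbitmqproxy also gives up after two instant exits. I honestly don't know whether that service matters on a standalone host, but it is one more start plugin that dies immediately; at minimum one can check that the files the watchdog references are still there:

# check the rabbitmqproxy binary and config referenced by the watchdog
ls -l /usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml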
2019-04-18T06:50:30Z jumpstart[66882]: executing start plugin: vmfstraced
2019-04-18T06:50:30Z vmfstracegd: VMFS Global Tracing is not enabled.
2019-04-18T06:50:30Z PyVmomiServer: 2019-04-18 06:50:30,709 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:50:30Z jumpstart[66882]: executing start plugin: slpd
2019-04-18T06:50:30Z root: slpd Starting slpd
2019-04-18T06:50:30Z root: slpd Generating registration file /etc/slp.reg
2019-04-18T06:50:30Z slpd[67692]: test - LOG_INFO
2019-04-18T06:50:30Z slpd[67692]: test - LOG_WARNING
2019-04-18T06:50:30Z slpd[67692]: test - LOG_ERROR
2019-04-18T06:50:30Z slpd[67692]: *** SLPD daemon version 1.0.0 started
2019-04-18T06:50:30Z slpd[67692]: Command line = /sbin/slpd
2019-04-18T06:50:30Z slpd[67692]: Using configuration file = /etc/slp.conf
2019-04-18T06:50:30Z slpd[67692]: Using registration file = /etc/slp.reg
2019-04-18T06:50:30Z slpd[67692]: Agent Interfaces = 192.168.20.20,192.168.55.60,fe80::250:56ff:fe67:b2b0%vmk1
2019-04-18T06:50:30Z slpd[67692]: Agent URL = service:service-agent://esxi-server.testlab.test
2019-04-18T06:50:30Z slpd[67693]: *** BEGIN SERVICES
2019-04-18T06:50:30Z jumpstart[66882]: executing start plugin: dcbd
2019-04-18T06:50:31Z watchdog-dcbd: [67701] Begin '/usr/sbin/dcbd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:31Z watchdog-dcbd: Executing '/usr/sbin/dcbd'  
2019-04-18T06:50:31Z dcbd: [info]     add_dcbx_ieee: device = default_cfg_attribs stype = 2
2019-04-18T06:50:31Z dcbd: [info]     add_ets_ieee: device = default_cfg_attribs
2019-04-18T06:50:31Z dcbd: [info]     add_pfc_ieee: device = default_cfg_attribs
2019-04-18T06:50:31Z dcbd: [info]     add_app_ieee: device = default_cfg_attribs subtype = 0
2019-04-18T06:50:31Z dcbd: [info]     Main loop running.
2019-04-18T06:50:31Z jumpstart[66882]: executing start plugin: nscd
2019-04-18T06:50:31Z watchdog-nscd: [67719] Begin '/usr/lib/vmware/nscd/bin/nscd -d', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:31Z watchdog-nscd: Executing '/usr/lib/vmware/nscd/bin/nscd -d'  
2019-04-18T06:50:31Z jumpstart[66882]: executing start plugin: cdp
2019-04-18T06:50:31Z watchdog-cdp: [67743] Begin '/usr/sbin/net-cdp', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:31Z watchdog-cdp: Executing '/usr/sbin/net-cdp'  
2019-04-18T06:50:31Z jumpstart[66882]: executing start plugin: lacp
2019-04-18T06:50:31Z jumpstart[66882]: executing start plugin: smartd
2019-04-18T06:50:31Z watchdog-smartd: [67764] Begin '/usr/sbin/smartd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:31Z watchdog-smartd: Executing '/usr/sbin/smartd'  
2019-04-18T06:50:31Z smartd: [warn] smartd starts to run with interval 30 minutes
2019-04-18T06:50:32Z jumpstart[66882]: executing start plugin: memscrubd
2019-04-18T06:50:32Z jumpstart[66882]: executing start plugin: vpxa
2019-04-18T06:50:32Z watchdog-vpxa: [67794] Begin '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:32Z watchdog-vpxa: Executing '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa'  
2019-04-18T06:50:32Z jumpstart[66882]: executing start plugin: lwsmd
2019-04-18T06:50:32Z watchdog-lwsmd: [67835] Begin '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:32Z watchdog-lwsmd: Executing '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog'  
2019-04-18T06:50:32Z lwsmd: Logging started
2019-04-18T06:50:32Z lwsmd: Likewise Service Manager starting up
2019-04-18T06:50:32Z lwsmd: Starting service: lwreg
2019-04-18T06:50:32Z lwsmd: [lwreg-ipc] Listening on endpoint /etc/likewise/lib/.regsd
2019-04-18T06:50:32Z lwsmd: [lwreg-ipc] Listener started
2019-04-18T06:50:32Z lwsmd: [lwsm-ipc] Listening on endpoint /etc/likewise/lib/.lwsm
2019-04-18T06:50:32Z lwsmd: [lwsm-ipc] Listener started
2019-04-18T06:50:32Z lwsmd: Likewise Service Manager startup complete
2019-04-18T06:50:34Z lwsmd: Starting service: netlogon
2019-04-18T06:50:34Z lwsmd: [netlogon-ipc] Listening on endpoint /etc/likewise/lib/.netlogond
2019-04-18T06:50:34Z lwsmd: [netlogon-ipc] Listener started
2019-04-18T06:50:34Z lwsmd: Starting service: lwio
2019-04-18T06:50:34Z lwsmd: [lwio-ipc] Listening on endpoint /etc/likewise/lib/.lwiod
2019-04-18T06:50:34Z lwsmd: [lwio-ipc] Listener started
2019-04-18T06:50:34Z lwsmd: Starting service: rdr
2019-04-18T06:50:34Z lwsmd: Starting service: lsass
2019-04-18T06:50:34Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.ntlmd
2019-04-18T06:50:34Z lwsmd: [lsass-ipc] Listener started
2019-04-18T06:50:35Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so'  
2019-04-18T06:50:35Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so: cannot open shared object file: No such file or directory
2019-04-18T06:50:35Z lwsmd: [lsass] Failed to load provider 'lsa-vmdir-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T06:50:35Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so'  
2019-04-18T06:50:35Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so: cannot open shared object file: No such file or directory
2019-04-18T06:50:35Z lwsmd: [lsass] Failed to load provider 'lsa-local-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T06:50:35Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.lsassd
2019-04-18T06:50:35Z lwsmd: [lsass-ipc] Listener started
2019-04-18T06:50:35Z lwsmd: [lsass] The in-memory cache file does not exist yet
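The "Could not expand environment variable HOME" and "DictionaryLoad: Cannot open file" lines further up are, as far as I know, cosmetic and show up on healthy hosts too. The two lsass auth providers failing to load are a different story: the vmdir provider should only matter with vCenter, but both libraries answering "No such file or directory" is at least worth verifying rather than assuming:

# see which Likewise auth provider libraries are actually on disk
ls -l /usr/lib/vmware/likewise/lib/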
2019-04-18T06:50:35Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 0  
2019-04-18T06:50:35Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.TESTLAB.TEST' failed with errno 0, h_errno = 1  
2019-04-18T06:50:35Z lwsmd: [lsass] Domain 'Testlab.test' is now offline  
2019-04-18T06:50:35Z lwsmd: [lsass] Machine Password Sync Thread starting
2019-04-18T06:50:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T06:50:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T06:50:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T06:50:36Z jumpstart[66882]: executing start plugin: vit_loader.sh
2019-04-18T06:50:36Z VITLOADER: [etc/init.d/vit_loader] Start vit loader
2019-04-18T06:50:36Z jumpstart[66882]: executing start plugin: hpe-smx.init
2019-04-18T06:50:36Z root: /etc/init.d/hpe-smx.init: Collecting PCI info...
2019-04-18T06:50:36Z root: /etc/init.d/hpe-smx.init: getipmibtaddress returns 254. No IPMI driver reload
2019-04-18T06:50:36Z root: /etc/init.d/hpe-smx.init: Done.
2019-04-18T06:50:37Z jumpstart[66882]: executing start plugin: hpe-nmi.init
2019-04-18T06:50:37Z root: hpe-nmi.init: Supported Server detected.  Loading NMI kernel module...
2019-04-18T06:50:37Z root: hpe-nmi.init:  Done.
2019-04-18T06:50:37Z jumpstart[66882]: executing start plugin: hpe-fc.sh
2019-04-18T06:50:37Z root: hpe-fc init script: Generating hba config file...
2019-04-18T06:50:38Z jumpstart[66882]: executing start plugin: sfcbd-watchdog
2019-04-18T06:50:38Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T06:50:38Z sfcbd-init: Exclusive access granted.
2019-04-18T06:50:38Z sfcbd-init: Request to start sfcbd-watchdog, pid 68040
2019-04-18T06:50:38Z sfcbd-config[68050]: Configuration not changed, already enabled
2019-04-18T06:50:38Z sfcbd-config[68056]: new install or upgrade previously completed, no changes made at version 0.0.0
2019-04-18T06:50:38Z sfcbd-config[68056]: file /etc/sfcb/sfcb.cfg update completed.
2019-04-18T06:50:38Z sfcbd-init: snmp has not been enabled.
2019-04-18T06:50:38Z sfcbd-init: starting sfcbd
2019-04-18T06:50:38Z sfcbd-init: Waiting for sfcb to start up.
2019-04-18T06:50:38Z amnesiac[68079]: 3 of 4. Testing Log Levels - LOG_WARNING
2019-04-18T06:50:38Z amnesiac[68079]: 4 of 4. Testing Log Levels - LOG_ERR
2019-04-18T06:50:38Z sfcbd-init: Program started normally.
2019-04-18T06:50:39Z jumpstart[66882]: executing start plugin: wsman
2019-04-18T06:50:39Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T06:50:39Z openwsmand: Exclusive access granted.
2019-04-18T06:50:39Z openwsmand: Starting openwsmand
2019-04-18T06:50:39Z watchdog-openwsmand: [68116] Begin '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process', min-uptime = 60, max-quick-failures = 5, max-total-failures = 10, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:50:39Z watchdog-openwsmand: Executing '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process'  
2019-04-18T06:50:39Z : dlopen /usr/lib/libticket.so.0 failed, error: /usr/lib/libticket.so.0: cannot open shared object file: No such file or directory, exiting. 0 Success
2019-04-18T06:50:39Z : [wrn][68126:/build/mts/release/bora-4152810/cayman_openwsman/openwsman/src/src/server/wsmand.c:320:main] nsswitch.conf successfully stat'ed  
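openwsmand is next with a missing library, /usr/lib/libticket.so.0. One missing file can be cosmetic; several of them in one boot start to look like a damaged system image. Quick check:

# does the library openwsmand fails to dlopen exist at all?
ls -l /usr/lib/libticket.so.0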
2019-04-18T06:50:39Z jumpstart[66882]: executing start plugin: snmpd
2019-04-18T06:50:39Z root: Starting snmpd
2019-04-18T06:50:39Z root: snmpd has not been enabled.
2019-04-18T06:50:39Z jumpstart[66882]: Jumpstart failed to start: snmpd reason: Execution of command: /etc/init.d/snmpd start failed with status: 1
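The snmpd "failure" above is only jumpstart complaining because the service is disabled ("snmpd has not been enabled" one line earlier), so that one should be harmless; it can be confirmed with:

# show whether the SNMP agent is enabled at all
esxcli system snmp get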
2019-04-18T06:50:39Z jumpstart[66882]: executing start plugin: xorg
2019-04-18T06:50:39Z jumpstart[66882]: executing start plugin: vmtoolsd
2019-04-18T06:50:40Z jumpstart[66882]: executing start plugin: hp-ams.sh
2019-04-18T06:50:40Z amshelper: Wrapper constructing internal library
2019-04-18T06:50:40Z amshelper[68178]: ams ver 10.6.0-24: Running check for supported server...
2019-04-18T06:50:40Z amshelper[68178]: Wrapper Destructing internal library
2019-04-18T06:50:40Z root: [ams] Agentless Management Service is not supported on this server.
2019-04-18T06:50:40Z jumpstart[66882]: Jumpstart failed to start: hp-ams.sh reason: Execution of command: /etc/init.d/hp-ams.sh start failed with status: 1
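Same story with hp-ams: the HP Agentless Management Service reports itself as unsupported on this box, so this jumpstart failure is expected and not related to the registration problem. To see which HP management VIBs the custom image actually contains:

# list installed VIBs and filter for the HP management components
esxcli software vib list | grep -i -E 'ams|hpe?-'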
2019-04-18T06:50:40Z init: starting pid 68258, tty '': '/bin/apply-host-profiles'  
2019-04-18T06:50:40Z init: starting pid 68261, tty '': '/usr/lib/vmware/secureboot/bin/secureBoot.py ++group=host/vim/vmvisor/boot -a'  
2019-04-18T06:50:40Z backup.sh.68147: Locking esx.conf
2019-04-18T06:50:40Z backup.sh.68147: Creating archive
2019-04-18T06:50:40Z backup.sh.68147: Unlocking esx.conf
2019-04-18T06:50:41Z init: starting pid 68324, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh boot'  
2019-04-18T06:50:41Z addVob[68326]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:50:41Z addVob[68326]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:50:41Z addVob[68326]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:50:41Z init: starting pid 68330, tty '': '/bin/vmdumper -g 'Boot Successful''  
2019-04-18T06:50:41Z init: starting pid 68331, tty '': '/bin/sh ++min=0,group=host/vim/vimuser/terminal/shell /etc/rc.local'  
2019-04-18T06:50:42Z root: init Running kickstart.py
2019-04-18T06:50:42Z root: init Running local.sh
2019-04-18T06:50:42Z init: starting pid 68360, tty '': '/bin/esxcfg-init --set-boot-progress done'  
2019-04-18T06:50:42Z init: starting pid 68361, tty '': '/bin/vmware-autostart.sh start'  
2019-04-18T06:50:42Z VMware[startup]: Starting VMs
2019-04-18T06:50:42Z init: starting pid 68364, tty '/dev/tty1': '/bin/initterm.sh tty1 /bin/techsupport.sh'  
2019-04-18T06:50:42Z init: starting pid 68366, tty '/dev/tty2': '-/bin/initterm.sh tty2 /bin/dcuiweasel'  
2019-04-18T06:50:42Z DCUI: Starting DCUI
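Boot is through at this point ("Boot Successful", autostart, DCUI) and there is no line anywhere about VMs being registered. After the next reboot I can compare hostd's inventory directly from the shell; the .vmx path in the second command is just a placeholder for one of my machines:

# list everything hostd currently has registered
vim-cmd vmsvc/getallvms
# re-register an orphaned VM from its .vmx (example path)
vim-cmd solo/registervm /vmfs/volumes/<datastore>/<vmname>/<vmname>.vmx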
2019-04-18T06:50:52Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T06:50:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T06:50:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T06:51:11Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T06:51:11Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68404  
[... trimmed for readability: from 06:51:11 to 06:51:25 the same three netlogon lookups (TESTLAB.TEST/Testlab.test, flags 100/100/140), the KRB5 error -1765328228 ("Cannot contact any KDC for realm 'TESTLAB.TEST'") and the lsass error 40121 (LW_ERROR_DOMAIN_IS_OFFLINE) repeat continuously for client PIDs 68408, 68466, 68475, 68480, 68491 and 68500 ...]
2019-04-18T06:51:26Z ImageConfigManager: 2019-04-18 06:51:26,736 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:51:26Z ImageConfigManager: 2019-04-18 06:51:26,736 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x6d873e9978>}   
2019-04-18T06:51:26Z hostd-icm[68510]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:51:26Z ImageConfigManager: 2019-04-18 06:51:26,736 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:51:26Z ImageConfigManager: 2019-04-18 06:51:26,737 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-1519-4f52</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-175614190</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T06:51:26Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:51:26Z ImageConfigManager: 2019-04-18 06:51:26,851 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T06:51:27Z ImageConfigManager: 2019-04-18 06:51:27,046 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:51:27Z ImageConfigManager: 2019-04-18 06:51:27,047 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0xe5c2121978>}   
2019-04-18T06:51:27Z hostd-icm[68522]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:51:27Z ImageConfigManager: 2019-04-18 06:51:27,047 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:51:27Z ImageConfigManager: 2019-04-18 06:51:27,047 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-acb1-4f5e</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-175614191</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T06:51:27Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:51:27Z ImageConfigManager: 2019-04-18 06:51:27,064 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
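For me this is the most telling part of the whole log: "/bootbank/boot.cfg: No such file or directory" and "altbootbank is invalid". ESXi persists its configuration (state.tgz with esx.conf, the vSwitch definitions and hostd's vmInventory.xml) on exactly that boot volume, so a dead or vanished bootbank would explain both the forgotten vSwitches and the unregistered VMs after every reboot. On a G7 the hypervisor frequently lives on an SD card or USB stick, and those tend to die silently. Checks I would run from the shell (standard 6.5 commands, nothing exotic):

# are the boot banks still mounted and populated?
ls /bootbank /altbootbank
# list all filesystems the host sees (the boot banks are small vfat volumes)
esxcli storage filesystem list
# which device did the host actually boot from?
esxcli system boot device get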
2019-04-18T06:51:29Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T06:51:29Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T06:51:29Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T06:51:29Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T06:51:29Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68530  
[... trimmed for readability: the identical pattern continues from 06:51:29 to 06:51:54, netlogon lookup triplets plus KRB5 -1765328228 plus lsass LW_ERROR_DOMAIN_IS_OFFLINE, for client PIDs 68537 through 68580 ...]
2019-04-18T06:51:58Z backup.sh.68585: Locking esx.conf
2019-04-18T06:51:58Z backup.sh.68585: Creating archive
2019-04-18T06:51:58Z backup.sh.68585: Unlocking esx.conf
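backup.sh is the job that packs the running configuration into state.tgz on the bootbank. It logs no error here, but with /bootbank in its current state I doubt the archive ends up anywhere persistent. A manual test:

# force a config backup and check whether state.tgz actually lands on the bootbank
/sbin/auto-backup.sh
ls -l /bootbank/state.tgz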
2019-04-18T06:52:20Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T06:52:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T06:52:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
[... trimmed for readability: the same netlogon lookup triplet repeats unchanged twelve more times between 06:52:20 and 06:52:28 ...]
2019-04-18T06:52:31Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T06:52:31Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T06:52:31Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T06:52:31Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001c7ebd094ee9229fffb824
2019-04-18T06:52:31Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T06:52:31Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T06:52:32Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T06:52:32Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001cc9f2bd5ae7909acd22b5
2019-04-18T06:52:32Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T06:52:32Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
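Side note on the smartd lines above: the naa.600508b1... devices are the logical drives of the HP Smart Array, and those simply don't answer ATA SMART queries, so behind a RAID controller this is normal noise. If you want to rule out the physical disks anyway, ask the controller itself - a rough sketch, assuming the HPE tools from the custom image are installed; the exact path/tool name depends on the bundle version:

# SMART view as ESXi sees the logical volume (mostly N/A behind a Smart Array, as expected)
esxcli storage core device smart get -d naa.600508b1001c7ebd094ee9229fffb824
# Controller/array/drive health via the HPE CLI (path is an assumption; newer bundles ship ssacli instead)
/opt/hp/hpssacli/bin/hpssacli ctrl all show config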
2019-04-18T06:52:34Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T06:52:34Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T06:52:34Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
2019-04-18T06:52:34Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100
2019-04-18T06:52:34Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140
[... roughly ten more identical lookup rounds between 06:52:34 and 06:52:37, trimmed ...]
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,187 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,187 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x71ed9d09b0>}   
2019-04-18T06:52:38Z hostd-icm[68822]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,187 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,188 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-a9ee-4fd6</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-175614220</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T06:52:38Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,303 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,378 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,379 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0xd70519e978>}   
2019-04-18T06:52:38Z hostd-icm[68830]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,379 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,380 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-e427-4fe2</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-175614223</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T06:52:38Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:52:38Z ImageConfigManager: 2019-04-18 06:52:38,397 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
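This looks like the decisive part: "[Errno 2] No such file or directory: '/bootbank/boot.cfg'" means the bootbank that ESXi persists its state to is gone or no longer mounted. ESXi 6.5 runs its configuration (esx.conf, VM registrations, vSwitches) from RAM and only archives it periodically as state.tgz into /bootbank - if that volume has dropped out (a classic with worn SD cards/USB sticks, but a sick boot LUN on the Smart Array would do it too), every change is thrown away at reboot, which would explain the symptoms exactly. A minimal check from the ESXi shell (sketch):

# Do the symlinks point at real vFAT volumes, or at a ramdisk under /tmp?
ls -l /bootbank /altbootbank
# A healthy bootbank contains boot.cfg plus the config archive state.tgz
ls -l /bootbank/boot.cfg /bootbank/state.tgz
# The two small vFAT bootbank volumes should show up in this list
esxcli storage filesystem list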
2019-04-18T06:52:39Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T06:52:39Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68780  
2019-04-18T06:52:39Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
2019-04-18T06:52:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100
2019-04-18T06:52:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140
[... three more identical rounds at 06:52:39/06:52:40, trimmed ...]
2019-04-18T06:52:40Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T06:52:40Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68848  
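The KRB5/lsass errors and the endless "Looking for a DC" loop are "only" the leftover AD join: the host still thinks it belongs to Testlab.test, the DCs for it no longer exist, so lwsmd hammers DNS for the _ldap._tcp.dc._msdcs SRV records. Ugly noise, but presumably unrelated to the lost registrations. To silence it, the stale join can be removed - a sketch using the likewise tools that ship with ESXi (syntax from memory, so treat it as an assumption):

# Show the current AD join state
/usr/lib/vmware/likewise/bin/domainjoin-cli query
# Leave the defunct domain (normally wants domain credentials; with the DCs gone,
# disabling AD via Host Client -> Manage -> Security & users should also work)
/usr/lib/vmware/likewise/bin/domainjoin-cli leave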
2019-04-18T06:52:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
2019-04-18T06:52:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100
2019-04-18T06:52:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140
[... two more identical rounds at 06:52:40, trimmed ...]
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,976 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,976 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0xa2a38a9978>}   
2019-04-18T06:52:40Z hostd-icm[68860]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,976 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,977 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-4834-4ff4</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-175614224</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T06:52:40Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,989 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,989 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x47fb9ac978>}   
2019-04-18T06:52:40Z hostd-icm[68866]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,989 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T06:52:40Z ImageConfigManager: 2019-04-18 06:52:40,990 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-84d3-5000</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-175614227</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T06:52:40Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T06:52:41Z ImageConfigManager: 2019-04-18 06:52:41,008 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T06:52:41Z ImageConfigManager: 2019-04-18 06:52:41,092 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T06:52:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
2019-04-18T06:52:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100
2019-04-18T06:52:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140
2019-04-18T06:52:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')
2019-04-18T06:52:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68878
[... three more identical lookup rounds at 06:52:42, trimmed ...]
2019-04-18T06:52:44Z init: starting pid 68884, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh stop'  
2019-04-18T06:52:44Z addVob[68886]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:52:44Z addVob[68886]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:52:44Z addVob[68886]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:52:44Z init: starting pid 68887, tty '': '/bin/shutdown.sh'  
2019-04-18T06:52:44Z VMware[shutdown]: Stopping VMs
2019-04-18T06:52:45Z jumpstart[68904]: executing stop for daemon hp-ams.sh.
2019-04-18T06:52:45Z root: ams stop watchdog...
2019-04-18T06:52:45Z root: ams-wd: ams-watchdog stop.
2019-04-18T06:52:45Z root: Terminating ams-watchdog process with PID 68912 68913
2019-04-18T06:52:47Z root: ams stop service...
2019-04-18T06:52:49Z jumpstart[68904]: executing stop for daemon xorg.
2019-04-18T06:52:49Z jumpstart[68904]: Jumpstart failed to stop: xorg reason: Execution of command: /etc/init.d/xorg stop failed with status: 3
2019-04-18T06:52:49Z jumpstart[68904]: executing stop for daemon vmsyslogd.
2019-04-18T06:52:49Z jumpstart[68904]: Jumpstart failed to stop: vmsyslogd reason: Execution of command: /etc/init.d/vmsyslogd stop failed with status: 1
2019-04-18T06:52:49Z jumpstart[68904]: executing stop for daemon vmtoolsd.
2019-04-18T06:52:49Z jumpstart[68904]: Jumpstart failed to stop: vmtoolsd reason: Execution of command: /etc/init.d/vmtoolsd stop failed with status: 1
2019-04-18T06:52:49Z jumpstart[68904]: executing stop for daemon wsman.
2019-04-18T06:52:49Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T06:52:49Z openwsmand: Exclusive access granted.
2019-04-18T06:52:49Z openwsmand: Stopping openwsmand
2019-04-18T06:52:50Z watchdog-openwsmand: Watchdog for openwsmand is now 68116
2019-04-18T06:52:50Z watchdog-openwsmand: Terminating watchdog process with PID 68116
2019-04-18T06:52:50Z watchdog-openwsmand: [68116] Signal received: exiting the watchdog
2019-04-18T06:52:50Z jumpstart[68904]: executing stop for daemon snmpd.
2019-04-18T06:52:50Z root: Stopping snmpd by administrative request
2019-04-18T06:52:50Z root: snmpd is not running.
2019-04-18T06:52:50Z jumpstart[68904]: executing stop for daemon sfcbd-watchdog.
2019-04-18T06:52:50Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T06:52:50Z sfcbd-init: Exclusive access granted.
2019-04-18T06:52:50Z sfcbd-init: Request to stop sfcbd-watchdog, pid 68977
2019-04-18T06:52:50Z sfcbd-init: Invoked kill 68079
2019-04-18T06:52:50Z sfcb-vmware_raw[68747]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T06:52:50Z sfcb-vmware_base[68739]: VICimProvider exiting on WFU cancelled.
2019-04-18T06:52:50Z sfcb-vmware_base[68739]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T06:52:53Z sfcbd-init: stop sfcbd process completed.
2019-04-18T06:52:53Z jumpstart[68904]: executing stop for daemon vit_loader.sh.
2019-04-18T06:52:54Z VITLOADER: [etc/init.d/vit_loader] Shutdown VITD successfully
2019-04-18T06:52:54Z jumpstart[68904]: executing stop for daemon hpe-smx.init.
2019-04-18T06:52:54Z jumpstart[68904]: executing stop for daemon hpe-nmi.init.
2019-04-18T06:52:54Z jumpstart[68904]: executing stop for daemon hpe-fc.sh.
2019-04-18T06:52:54Z jumpstart[68904]: executing stop for daemon lwsmd.
2019-04-18T06:52:54Z watchdog-lwsmd: Watchdog for lwsmd is now 67835
2019-04-18T06:52:54Z watchdog-lwsmd: Terminating watchdog process with PID 67835
2019-04-18T06:52:54Z watchdog-lwsmd: [67835] Signal received: exiting the watchdog
2019-04-18T06:52:54Z lwsmd: Shutting down running services
2019-04-18T06:52:54Z lwsmd: Stopping service: lsass
2019-04-18T06:52:54Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T06:52:54Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T06:52:54Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T06:52:54Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T06:52:54Z lwsmd: [lsass] Machine Password Sync Thread stopping
2019-04-18T06:52:55Z lwsmd: [lsass] LSA Service exiting...
2019-04-18T06:52:55Z lwsmd: Stopping service: rdr
2019-04-18T06:52:55Z lwsmd: Stopping service: lwio
2019-04-18T06:52:55Z lwsmd: [lwio-ipc] Shutting down listener
2019-04-18T06:52:55Z lwsmd: [lwio-ipc] Listener shut down
2019-04-18T06:52:55Z lwsmd: [lwio] LWIO Service exiting...
2019-04-18T06:52:55Z lwsmd: Stopping service: netlogon
2019-04-18T06:52:55Z lwsmd: [netlogon-ipc] Shutting down listener
2019-04-18T06:52:55Z lwsmd: [netlogon-ipc] Listener shut down
2019-04-18T06:52:55Z lwsmd: [netlogon] LWNET Service exiting...
2019-04-18T06:52:55Z lwsmd: Stopping service: lwreg
2019-04-18T06:52:55Z lwsmd: [lwreg-ipc] Shutting down listener
2019-04-18T06:52:55Z lwsmd: [lwreg-ipc] Listener shut down
2019-04-18T06:52:55Z lwsmd: [lwreg] REG Service exiting...
2019-04-18T06:52:55Z lwsmd: [lwsm-ipc] Shutting down listener
2019-04-18T06:52:55Z lwsmd: [lwsm-ipc] Listener shut down
2019-04-18T06:52:55Z lwsmd: Logging stopped
2019-04-18T06:52:57Z jumpstart[68904]: executing stop for daemon vpxa.
2019-04-18T06:52:57Z watchdog-vpxa: Watchdog for vpxa is now 67794
2019-04-18T06:52:57Z watchdog-vpxa: Terminating watchdog process with PID 67794
2019-04-18T06:52:57Z watchdog-vpxa: [67794] Signal received: exiting the watchdog
2019-04-18T06:52:57Z jumpstart[68904]: executing stop for daemon vobd.
2019-04-18T06:52:57Z watchdog-vobd: Watchdog for vobd is now 65960
2019-04-18T06:52:57Z watchdog-vobd: Terminating watchdog process with PID 65960
2019-04-18T06:52:57Z watchdog-vobd: [65960] Signal received: exiting the watchdog
2019-04-18T06:52:57Z jumpstart[68904]: executing stop for daemon dcbd.
2019-04-18T06:52:57Z watchdog-dcbd: Watchdog for dcbd is now 67701
2019-04-18T06:52:57Z watchdog-dcbd: Terminating watchdog process with PID 67701
2019-04-18T06:52:57Z watchdog-dcbd: [67701] Signal received: exiting the watchdog
2019-04-18T06:52:57Z jumpstart[68904]: executing stop for daemon nscd.
2019-04-18T06:52:57Z watchdog-nscd: Watchdog for nscd is now 67719
2019-04-18T06:52:57Z watchdog-nscd: Terminating watchdog process with PID 67719
2019-04-18T06:52:57Z watchdog-nscd: [67719] Signal received: exiting the watchdog
2019-04-18T06:52:57Z jumpstart[68904]: executing stop for daemon cdp.
2019-04-18T06:52:58Z watchdog-cdp: Watchdog for cdp is now 67743
2019-04-18T06:52:58Z watchdog-cdp: Terminating watchdog process with PID 67743
2019-04-18T06:52:58Z watchdog-cdp: [67743] Signal received: exiting the watchdog
2019-04-18T06:52:58Z jumpstart[68904]: executing stop for daemon lacp.
2019-04-18T06:52:58Z watchdog-net-lacp: Watchdog for net-lacp is now 66328
2019-04-18T06:52:58Z watchdog-net-lacp: Terminating watchdog process with PID 66328
2019-04-18T06:52:58Z watchdog-net-lacp: [66328] Signal received: exiting the watchdog
2019-04-18T06:52:58Z jumpstart[68904]: executing stop for daemon smartd.
2019-04-18T06:52:58Z watchdog-smartd: Watchdog for smartd is now 67764
2019-04-18T06:52:58Z watchdog-smartd: Terminating watchdog process with PID 67764
2019-04-18T06:52:58Z watchdog-smartd: [67764] Signal received: exiting the watchdog
2019-04-18T06:52:58Z smartd: [warn] smartd received signal 15
2019-04-18T06:52:58Z smartd: [warn] smartd exit.
2019-04-18T06:52:58Z jumpstart[68904]: executing stop for daemon memscrubd.
2019-04-18T06:52:58Z jumpstart[68904]: Jumpstart failed to stop: memscrubd reason: Execution of command: /etc/init.d/memscrubd stop failed with status: 3
2019-04-18T06:52:58Z jumpstart[68904]: executing stop for daemon slpd.
2019-04-18T06:52:58Z root: slpd Stopping slpd
2019-04-18T06:52:58Z slpd[67693]: SLPD daemon shutting down
2019-04-18T06:52:58Z slpd[67693]: *** SLPD daemon shut down by administrative request
2019-04-18T06:52:59Z jumpstart[68904]: executing stop for daemon sensord.
2019-04-18T06:52:59Z watchdog-sensord: Watchdog for sensord is now 67097
2019-04-18T06:52:59Z watchdog-sensord: Terminating watchdog process with PID 67097
2019-04-18T06:52:59Z watchdog-sensord: [67097] Signal received: exiting the watchdog
2019-04-18T06:53:00Z jumpstart[68904]: executing stop for daemon storageRM.
2019-04-18T06:53:00Z watchdog-storageRM: Watchdog for storageRM is now 67115
2019-04-18T06:53:00Z watchdog-storageRM: Terminating watchdog process with PID 67115
2019-04-18T06:53:00Z watchdog-storageRM: [67115] Signal received: exiting the watchdog
2019-04-18T06:53:00Z jumpstart[68904]: executing stop for daemon hostd.
2019-04-18T06:53:00Z watchdog-hostd: Watchdog for hostd is now 67142
2019-04-18T06:53:00Z watchdog-hostd: Terminating watchdog process with PID 67142
2019-04-18T06:53:00Z watchdog-hostd: [67142] Signal received: exiting the watchdog
2019-04-18T06:53:00Z jumpstart[68904]: executing stop for daemon sdrsInjector.
2019-04-18T06:53:00Z watchdog-sdrsInjector: Watchdog for sdrsInjector is now 67159
2019-04-18T06:53:00Z watchdog-sdrsInjector: Terminating watchdog process with PID 67159
2019-04-18T06:53:00Z watchdog-sdrsInjector: [67159] Signal received: exiting the watchdog
2019-04-18T06:53:01Z jumpstart[68904]: executing stop for daemon nfcd.
2019-04-18T06:53:01Z jumpstart[68904]: executing stop for daemon vvold.
2019-04-18T06:53:01Z jumpstart[68904]: Jumpstart failed to stop: vvold reason: Execution of command: /etc/init.d/vvold stop failed with status: 3
2019-04-18T06:53:01Z jumpstart[68904]: executing stop for daemon rhttpproxy.
2019-04-18T06:53:01Z watchdog-rhttpproxy: Watchdog for rhttpproxy is now 67519
2019-04-18T06:53:01Z watchdog-rhttpproxy: Terminating watchdog process with PID 67519
2019-04-18T06:53:01Z watchdog-rhttpproxy: [67519] Signal received: exiting the watchdog
2019-04-18T06:53:01Z jumpstart[68904]: executing stop for daemon hostdCgiServer.
2019-04-18T06:53:01Z watchdog-hostdCgiServer: Watchdog for hostdCgiServer is now 67544
2019-04-18T06:53:01Z watchdog-hostdCgiServer: Terminating watchdog process with PID 67544
2019-04-18T06:53:01Z watchdog-hostdCgiServer: [67544] Signal received: exiting the watchdog
2019-04-18T06:53:01Z jumpstart[68904]: executing stop for daemon lbtd.
2019-04-18T06:53:01Z watchdog-net-lbt: Watchdog for net-lbt is now 67570
2019-04-18T06:53:01Z watchdog-net-lbt: Terminating watchdog process with PID 67570
2019-04-18T06:53:01Z watchdog-net-lbt: [67570] Signal received: exiting the watchdog
2019-04-18T06:53:02Z jumpstart[68904]: executing stop for daemon rabbitmqproxy.
2019-04-18T06:53:02Z jumpstart[68904]: executing stop for daemon vmfstraced.
2019-04-18T06:53:02Z watchdog-vmfstracegd: PID file /var/run/vmware/watchdog-vmfstracegd.PID does not exist
2019-04-18T06:53:02Z watchdog-vmfstracegd: Unable to terminate watchdog: No running watchdog process for vmfstracegd
2019-04-18T06:53:02Z vmfstracegd: Failed to clear vmfstracegd memory reservation
2019-04-18T06:53:02Z jumpstart[68904]: executing stop for daemon esxui.
2019-04-18T06:53:02Z jumpstart[68904]: executing stop for daemon iofilterd-vmwarevmcrypt.
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: Could not expand environment variable HOME.
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: Could not expand environment variable HOME.
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:53:02Z iofilterd-vmwarevmcrypt[69558]: Resource Pool clean up for iofilter vmwarevmcrypt is done
2019-04-18T06:53:03Z jumpstart[68904]: executing stop for daemon swapobjd.
2019-04-18T06:53:03Z watchdog-swapobjd: Watchdog for swapobjd is now 67002
2019-04-18T06:53:03Z watchdog-swapobjd: Terminating watchdog process with PID 67002
2019-04-18T06:53:03Z watchdog-swapobjd: [67002] Signal received: exiting the watchdog
2019-04-18T06:53:03Z jumpstart[68904]: executing stop for daemon usbarbitrator.
2019-04-18T06:53:03Z watchdog-usbarbitrator: Watchdog for usbarbitrator is now 67038
2019-04-18T06:53:03Z watchdog-usbarbitrator: Terminating watchdog process with PID 67038
2019-04-18T06:53:03Z watchdog-usbarbitrator: [67038] Signal received: exiting the watchdog
2019-04-18T06:53:03Z jumpstart[68904]: executing stop for daemon iofilterd-spm.
2019-04-18T06:53:03Z iofilterd-spm[69621]: Could not expand environment variable HOME.
2019-04-18T06:53:03Z iofilterd-spm[69621]: Could not expand environment variable HOME.
2019-04-18T06:53:03Z iofilterd-spm[69621]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:53:03Z iofilterd-spm[69621]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:53:03Z iofilterd-spm[69621]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:53:03Z iofilterd-spm[69621]: Resource Pool clean up for iofilter spm is done
2019-04-18T06:53:03Z jumpstart[68904]: executing stop for daemon ESXShell.
2019-04-18T06:53:03Z addVob[69628]: Could not expand environment variable HOME.
2019-04-18T06:53:03Z addVob[69628]: Could not expand environment variable HOME.
2019-04-18T06:53:03Z addVob[69628]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:53:03Z addVob[69628]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:53:03Z addVob[69628]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:53:03Z addVob[69628]: VobUserLib_Init failed with -1
2019-04-18T06:53:03Z doat: Stopped wait on component ESXShell.stop
2019-04-18T06:53:03Z doat: Stopped wait on component ESXShell.disable
2019-04-18T06:53:04Z jumpstart[68904]: executing stop for daemon DCUI.
2019-04-18T06:53:04Z root: DCUI Disabling DCUI logins
2019-04-18T06:53:04Z addVob[69649]: Could not expand environment variable HOME.
2019-04-18T06:53:04Z addVob[69649]: Could not expand environment variable HOME.
2019-04-18T06:53:04Z addVob[69649]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:53:04Z addVob[69649]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:53:04Z addVob[69649]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:53:04Z addVob[69649]: VobUserLib_Init failed with -1
2019-04-18T06:53:04Z jumpstart[68904]: executing stop for daemon ntpd.
2019-04-18T06:53:04Z root: ntpd Stopping ntpd
2019-04-18T06:53:04Z watchdog-ntpd: Watchdog for ntpd is now 66915
2019-04-18T06:53:04Z watchdog-ntpd: Terminating watchdog process with PID 66915
2019-04-18T06:53:04Z watchdog-ntpd: [66915] Signal received: exiting the watchdog
2019-04-18T06:53:04Z ntpd[66925]: ntpd exiting on signal 1 (Hangup)
2019-04-18T06:53:04Z ntpd[66925]: 78.46.53.2 local addr 192.168.20.20 -> <null>
2019-04-18T06:53:04Z jumpstart[68904]: executing stop for daemon SSH.
2019-04-18T06:53:04Z addVob[69681]: Could not expand environment variable HOME.
2019-04-18T06:53:04Z addVob[69681]: Could not expand environment variable HOME.
2019-04-18T06:53:04Z addVob[69681]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:53:04Z addVob[69681]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:53:04Z addVob[69681]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:53:04Z addVob[69681]: VobUserLib_Init failed with -1
2019-04-18T06:53:04Z doat: Stopped wait on component RemoteShell.disable
2019-04-18T06:53:04Z doat: Stopped wait on component RemoteShell.stop
2019-04-18T06:53:05Z backup.sh.69737: Locking esx.conf
2019-04-18T06:53:05Z backup.sh.69737: Creating archive
2019-04-18T06:53:05Z backup.sh.69737: Unlocking esx.conf
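For context: the backup.sh run at 06:53:05 is ESXi archiving esx.conf & co. one last time before going down. Normally that archive lands as state.tgz in /bootbank - with the bootbank gone (see the boot.cfg errors above) it has nowhere persistent to go, and everything since the last good backup is lost. The same mechanism can be triggered and verified by hand (sketch):

# Force a config backup right now (the same script the host runs periodically)
/sbin/auto-backup.sh
# If this is missing or sits on a ramdisk, nothing survives a reboot
ls -l /bootbank/state.tgz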
2019-04-18T06:58:29Z watchdog-vobd: [65960] Begin '/usr/lib/vmware/vob/bin/vobd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:58:29Z watchdog-vobd: Executing '/usr/lib/vmware/vob/bin/vobd'  
2019-04-18T06:58:29Z jumpstart[65945]: Launching Executor
2019-04-18T06:58:29Z jumpstart[65945]: Setting up Executor - Reset Requested
2019-04-18T06:58:30Z jumpstart[65945]: ignoring plugin 'vsan-upgrade' because version '2.0.0'  has already been run.  
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: check-required-memory
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: restore-configuration
2019-04-18T06:58:30Z jumpstart[65993]: restoring configuration
2019-04-18T06:58:30Z jumpstart[65993]: extracting from file /local.tgz
2019-04-18T06:58:30Z jumpstart[65993]: file etc/likewise/db/registry.db has been changed before restoring the configuration - the changes will be lost
2019-04-18T06:58:30Z jumpstart[65993]: ConfigCheck: Running ipv6 option upgrade, redundantly
2019-04-18T06:58:30Z jumpstart[65993]: Util: tcpip4 IPv6 enabled
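And here is the counterpart at boot: jumpstart restores the configuration from /local.tgz, i.e. from the last state archive it could find, and the registry.db warning ("the changes will be lost") says it plainly - whatever is not in that archive gets discarded at every boot. That is exactly what happens to the re-registered VMs and the vSwitches. What the host actually restored from can be inspected directly (sketch):

# Peek into the restored state archive; etc/vmware/esx.conf in there should contain the vSwitches etc.
tar tzf /local.tgz | head -n 20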
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: vmkeventd
2019-04-18T06:58:30Z watchdog-vmkeventd: [65995] Begin '/usr/lib/vmware/vmkeventd/bin/vmkeventd', min-uptime = 10, max-quick-failures = 5, max-total-failures = 9999999, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:58:30Z watchdog-vmkeventd: Executing '/usr/lib/vmware/vmkeventd/bin/vmkeventd'  
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: vmkcrypto
2019-04-18T06:58:30Z jumpstart[65973]: 65974:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:58:30Z jumpstart[65973]: 65974:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:58:30Z jumpstart[65973]: 65974:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: autodeploy-enabled
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: vsan-base
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: vsan-early
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: advanced-user-configuration-options
2019-04-18T06:58:30Z jumpstart[65945]: executing start plugin: restore-advanced-configuration
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: PSA-boot-config
2019-04-18T06:58:31Z jumpstart[65973]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:58:31Z jumpstart[65973]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:58:31Z jumpstart[65973]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:58:31Z jumpstart[65973]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:58:31Z jumpstart[65973]: lib/ssl: protocol list tls1.2
2019-04-18T06:58:31Z jumpstart[65973]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:58:31Z jumpstart[65973]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: vprobe
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: vmkapi-mgmt
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: dma-engine
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: procfs
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: mgmt-vmkapi-compatibility
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: iodm
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: vmkernel-vmkapi-compatibility
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: driver-status-check
2019-04-18T06:58:31Z jumpstart[66025]: driver_status_check: boot cmdline: /jumpstrt.gz vmbTrustedBoot=false tboot=0x101b000 installerDiskDumpSlotSize=2560 no-auto-partition bootUUID=e78269d0448c41fe200c24e8a54f93c1
2019-04-18T06:58:31Z jumpstart[66025]: driver_status_check: useropts:
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: hardware-config
2019-04-18T06:58:31Z jumpstart[66026]: Failed to symlink /etc/vmware/pci.ids: No such file or directory
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: vmklinux
2019-04-18T06:58:31Z jumpstart[65945]: executing start plugin: vmkdevmgr
2019-04-18T06:58:31Z jumpstart[66027]: Starting vmkdevmgr
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: register-vmw-mpp
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: register-vmw-satp
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: register-vmw-psp
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: etherswitch
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: aslr
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: random
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: storage-early-config-dev-settings
2019-04-18T06:58:37Z jumpstart[65945]: executing start plugin: networking-drivers
2019-04-18T06:58:37Z jumpstart[66147]: Loading network device drivers
2019-04-18T06:58:41Z jumpstart[66147]: LoadVmklinuxDriver: Loaded module bnx2
2019-04-18T06:58:41Z jumpstart[65945]: executing start plugin: register-vmw-vaai
2019-04-18T06:58:41Z jumpstart[65945]: executing start plugin: usb
2019-04-18T06:58:41Z jumpstart[65945]: executing start plugin: local-storage
2019-04-18T06:58:41Z jumpstart[65945]: executing start plugin: psa-mask-paths
2019-04-18T06:58:41Z jumpstart[65945]: executing start plugin: network-uplink-init
2019-04-18T06:58:41Z jumpstart[66228]: Trying to connect...
2019-04-18T06:58:41Z jumpstart[66228]: Connected.
2019-04-18T06:58:44Z jumpstart[66228]: Received processed
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: psa-nmp-pre-claim-config
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: psa-filter-pre-claim-config
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: restore-system-uuid
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: restore-storage-multipathing
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: network-support
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: psa-load-rules
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: vds-vmkapi-compatibility
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: psa-filter-post-claim-config
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: psa-nmp-post-claim-config
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: mlx4_en
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: dvfilters-vmkapi-compatibility
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: vds-config
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: storage-drivers
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: vxlan-base
2019-04-18T06:58:44Z jumpstart[65945]: executing start plugin: firewall
2019-04-18T06:58:45Z jumpstart[65945]: executing start plugin: dvfilter-config
2019-04-18T06:58:45Z jumpstart[65945]: executing start plugin: dvfilter-generic-fastpath
2019-04-18T06:58:45Z jumpstart[65945]: executing start plugin: lacp-daemon
2019-04-18T06:58:45Z watchdog-net-lacp: [66330] Begin '/usr/sbin/net-lacp', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:58:45Z watchdog-net-lacp: Executing '/usr/sbin/net-lacp'  
2019-04-18T06:58:45Z jumpstart[65945]: executing start plugin: storage-psa-init
2019-04-18T06:58:45Z jumpstart[66342]: Trying to connect...
2019-04-18T06:58:45Z jumpstart[66342]: Connected.
2019-04-18T06:58:45Z jumpstart[66342]: Received processed
2019-04-18T06:58:45Z jumpstart[65945]: executing start plugin: restore-networking
2019-04-18T06:58:46Z jumpstart[65973]: NetworkInfoImpl: Enabling 1 netstack instances during boot
2019-04-18T06:58:51Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Storing previous management interface:'vmk0'  
2019-04-18T06:58:51Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk0'  
2019-04-18T06:58:51Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk1'  
2019-04-18T06:58:51Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing config for'vmk0'  
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk0
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic: Ipv6 not Enabled
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk0
2019-04-18T06:58:51Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk0', portStr:'Management Network'  
2019-04-18T06:58:51Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing config for'vmk1'  
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk1
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk1
2019-04-18T06:58:51Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65973]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk1', portStr:'NFS-FreeNAS'  
2019-04-18T06:58:51Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T06:58:51Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T06:58:51Z jumpstart[65973]: 2019-04-18T06:58:51Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: random-seed
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: dvfilters
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: restore-pxe-marker
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: auto-configure-networking
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: storage-early-configuration
2019-04-18T06:58:51Z jumpstart[66414]: 66414:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:58:51Z jumpstart[66414]: 66414:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:58:51Z jumpstart[66414]: 66414:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:58:51Z jumpstart[66414]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:58:51Z jumpstart[66414]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:58:51Z jumpstart[66414]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:58:51Z jumpstart[66414]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:58:51Z jumpstart[66414]: lib/ssl: protocol list tls1.2
2019-04-18T06:58:51Z jumpstart[66414]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:58:51Z jumpstart[66414]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: bnx2fc
2019-04-18T06:58:51Z jumpstart[65945]: executing start plugin: software-iscsi
2019-04-18T06:58:51Z jumpstart[65973]: iScsi: No iBFT data present in the BIOS
2019-04-18T06:58:52Z iscsid: Notice: iSCSI Database already at latest schema. (Upgrade Skipped).
2019-04-18T06:58:52Z iscsid: iSCSI MASTER Database opened. (0x3c2a008)
2019-04-18T06:58:52Z iscsid: LogLevel = 0
2019-04-18T06:58:52Z iscsid: LogSync  = 0
2019-04-18T06:58:52Z iscsid: memory (180) MB successfully reserved for 1024 sessions
2019-04-18T06:58:52Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e64) idx (0) size (460808)
2019-04-18T06:58:52Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e62) idx (1) size (460808)
2019-04-18T06:58:52Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e60) idx (2) size (460808)
2019-04-18T06:58:52Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e5e) idx (3) size (460808)
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:58:53Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T06:58:53Z jumpstart[65945]: executing start plugin: fcoe-config
2019-04-18T06:58:53Z jumpstart[65945]: executing start plugin: storage-path-claim
2019-04-18T06:58:57Z jumpstart[65973]: StorageInfo: Number of paths 3
2019-04-18T06:59:01Z jumpstart[65973]: StorageInfo: Number of devices 3
2019-04-18T06:59:01Z jumpstart[65973]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T07:01:07Z mark: storage-path-claim-completed
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: gss
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: mount-filesystems
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: restore-paths
2019-04-18T06:59:01Z jumpstart[65973]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: filesystem-drivers
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: rpc
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: dump-partition
2019-04-18T06:59:01Z jumpstart[65973]: execution of 'system coredump partition set --enable=true --smart' failed : Unable to smart activate a dump partition.  Error was: No suitable diagnostic partitions found..  
2019-04-18T06:59:01Z jumpstart[65973]: 2019-04-18T06:59:01Z jumpstart[65945]: Executor failed executing esxcli command system coredump partition set --enable=true --smart
2019-04-18T06:59:01Z jumpstart[65945]: Method invocation failed: dump-partition->start() failed: error while executing the cli
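The failed "system coredump partition set" fits the same picture: the host cannot find a diagnostic partition anywhere, which is what you'd expect if the boot device carrying that partition has dropped out. Quick confirmation (sketch):

# On a healthy boot disk this lists at least one usable dump partition
esxcli system coredump partition list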
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: vsan-devel
2019-04-18T06:59:01Z jumpstart[66435]: VsanDevel: DevelBootDelay: 0
2019-04-18T06:59:01Z jumpstart[66435]: VsanDevel: DevelWipeConfigOnBoot: 0
2019-04-18T06:59:01Z jumpstart[66435]: VsanDevel: DevelTagSSD: Starting
2019-04-18T06:59:01Z jumpstart[66435]: 66435:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T06:59:01Z jumpstart[66435]: 66435:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T06:59:01Z jumpstart[66435]: 66435:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T06:59:01Z jumpstart[66435]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:59:01Z jumpstart[66435]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T06:59:01Z jumpstart[66435]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T06:59:01Z jumpstart[66435]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:59:01Z jumpstart[66435]: lib/ssl: protocol list tls1.2
2019-04-18T06:59:01Z jumpstart[66435]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:59:01Z jumpstart[66435]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:59:01Z jumpstart[66435]: VsanDevel: DevelTagSSD: Done.
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: vmfs
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: ufs
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: vfat
2019-04-18T06:59:01Z jumpstart[65945]: executing start plugin: nfsgssd
2019-04-18T06:59:02Z watchdog-nfsgssd: [66638] Begin '/usr/lib/vmware/nfs/bin/nfsgssd -f -a', min-uptime = 60, max-quick-failures = 128, max-total-failures = 65536, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T06:59:02Z watchdog-nfsgssd: Executing '/usr/lib/vmware/nfs/bin/nfsgssd -f -a'  
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: vsan
2019-04-18T06:59:02Z nfsgssd[66648]: Could not expand environment variable HOME.
2019-04-18T06:59:02Z nfsgssd[66648]: Could not expand environment variable HOME.
2019-04-18T06:59:02Z nfsgssd[66648]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T06:59:02Z nfsgssd[66648]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T06:59:02Z nfsgssd[66648]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T06:59:02Z nfsgssd[66648]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T06:59:02Z nfsgssd[66648]: lib/ssl: protocol list tls1.2
2019-04-18T06:59:02Z nfsgssd[66648]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T06:59:02Z nfsgssd[66648]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T06:59:02Z nfsgssd[66648]: Empty epoch file
2019-04-18T06:59:02Z nfsgssd[66648]: Starting with epoch 1
2019-04-18T06:59:02Z nfsgssd[66648]: Connected to SunRPCGSS version 1.0
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: krb5
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: etc-hosts
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: nfs
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: nfs41
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: mount-disk-fs
2019-04-18T06:59:02Z jumpstart[65973]: VmFileSystem: Automounted volume 5a6f6646-d13e2d89-fd8d-b499babd4e5e
2019-04-18T06:59:02Z jumpstart[65973]: VmFileSystem: Automounted volume 5ab363c3-c36e8e9f-8cfc-b499babd4e5e
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: auto-configure-pmem
2019-04-18T06:59:02Z jumpstart[65945]: executing start plugin: restore-nfs-volumes
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:59:02Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:59:02Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:59:02Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:59:02Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T06:59:02Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T06:59:02Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T06:59:02Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T06:59:02Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T06:59:02Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:00:04Z jumpstart[65973]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "NFS-FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T07:00:04Z jumpstart[65973]: 2019-04-18T07:00:04Z jumpstart[65973]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T07:00:04Z jumpstart[65973]: 2019-04-18T07:00:04Z jumpstart[65945]: Executor failed executing esxcli command boot storage restore --nfs-volumes
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:00:04Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:00:04Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:00:04Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:00:04Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:00:04Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:00:04Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:00:04Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:00:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:00:04Z jumpstart[65945]: Method invocation failed: restore-nfs-volumes->start() failed: error while executing the cli
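
So both NFS datastores ("NFS-FreeNAS" and "FreeNAS") could not be remounted at boot because the NFS server was unreachable at that moment. Every VM whose .vmx sits on one of those datastores will then hang around as inaccessible/orphaned in the inventory, exactly as described. A quick sketch to verify once the host is up (the address is a placeholder, the log does not name the NAS IP):

# show configured NFS mounts and whether they are currently accessible
esxcli storage nfs list
# test vmkernel connectivity to the NAS; replace with the real address
vmkping 192.168.55.1
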
2019-04-18T07:00:04Z jumpstart[65945]: executing start plugin: auto-configure-storage
2019-04-18T07:00:04Z jumpstart[65945]: executing start plugin: restore-bootbanks
2019-04-18T07:00:04Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:07Z jumpstart[65973]: VmkCtl: Boot device not available, waited 3 seconds
2019-04-18T07:00:07Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:10Z jumpstart[65973]: VmkCtl: Boot device not available, waited 6 seconds
2019-04-18T07:00:10Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:13Z jumpstart[65973]: VmkCtl: Boot device not available, waited 9 seconds
2019-04-18T07:00:13Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:16Z jumpstart[65973]: VmkCtl: Boot device not available, waited 12 seconds
2019-04-18T07:00:16Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:19Z jumpstart[65973]: VmkCtl: Boot device not available, waited 15 seconds
2019-04-18T07:00:19Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:22Z jumpstart[65973]: VmkCtl: Boot device not available, waited 18 seconds
2019-04-18T07:00:22Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:25Z jumpstart[65973]: VmkCtl: Boot device not available, waited 21 seconds
2019-04-18T07:00:25Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:28Z jumpstart[65973]: VmkCtl: Boot device not available, waited 24 seconds
2019-04-18T07:00:28Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:31Z jumpstart[65973]: VmkCtl: Boot device not available, waited 27 seconds
2019-04-18T07:00:31Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:34Z jumpstart[65973]: VmkCtl: Boot device not available, waited 30 seconds
2019-04-18T07:00:34Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:37Z jumpstart[65973]: VmkCtl: Boot device not available, waited 33 seconds
2019-04-18T07:00:37Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:40Z jumpstart[65973]: VmkCtl: Boot device not available, waited 36 seconds
2019-04-18T07:00:40Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:43Z jumpstart[65973]: VmkCtl: Boot device not available, waited 39 seconds
2019-04-18T07:00:43Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:46Z jumpstart[65973]: VmkCtl: Boot device not available, waited 42 seconds
2019-04-18T07:00:46Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:49Z jumpstart[65973]: VmkCtl: Boot device not available, waited 45 seconds
2019-04-18T07:00:49Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:52Z jumpstart[65973]: VmkCtl: Boot device not available, waited 48 seconds
2019-04-18T07:00:52Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:55Z jumpstart[65973]: VmkCtl: Boot device not available, waited 51 seconds
2019-04-18T07:00:55Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:00:58Z jumpstart[65973]: VmkCtl: Boot device not available, waited 54 seconds
2019-04-18T07:00:58Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:01:01Z jumpstart[65973]: VmkCtl: Boot device not available, waited 57 seconds
2019-04-18T07:01:01Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:01:04Z jumpstart[65973]: VmkCtl: Boot device not available, waited 60 seconds
2019-04-18T07:01:04Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:01:04Z jumpstart[65973]: VmkCtl: Did not find a valid boot device, symlinking /bootbank to /tmp
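
This is the smoking gun: after waiting a full 60 seconds the host gives up on its boot device and symlinks /bootbank into /tmp, i.e. into a ramdisk. From then on, nothing written to the configuration (the periodic state.tgz backup that normally lands in the bootbank) survives a reboot, which would explain both the vanishing vSwitches and the lost VM registrations. On a G7 this typically means the SD card or USB stick ESXi was installed on drops off the bus shortly after booting. A sketch to confirm from the shell (standard paths; the expected outcome is an assumption, not taken from this log):

# where do the bootbanks point? A target under /tmp confirms the problem
ls -l /bootbank /altbootbank
# which device the kernel actually booted from
esxcli system boot device get
# force a manual config backup; this fails loudly if the bootbank is gone
/sbin/auto-backup.sh
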
2019-04-18T07:01:04Z jumpstart[65945]: executing start plugin: restore-host-cache
2019-04-18T07:01:04Z jumpstart[65973]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T07:01:04Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:01:04Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:01:04Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:04Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:01:04Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:01:04Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:04Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:01:04Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:01:04Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T07:01:04Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T07:01:04Z jumpstart[65973]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T07:01:04Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T07:01:04Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55098474496
2019-04-18T07:01:05Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5ab363c4-26d208a0-fab7-b499babd4e5e total blocks 2500207837184 used blocks 2400621428736
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vflash
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: dump-file
2019-04-18T07:01:05Z jumpstart[65973]: VmkCtl: Diagnostic File found; not auto creating
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T07:01:05Z jumpstart[65973]: VmkCtl: Dump file determined to be large enough, size: 1588592640 (recommended minimum: 1588592640)
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vmci
2019-04-18T07:01:05Z jumpstart[65973]: execution of 'system module load --module vmci' failed : Unable to load module /usr/lib/vmware/vmkmod/vmci: Busy  
2019-04-18T07:01:05Z jumpstart[65945]: Executor failed executing esxcli command system module load --module vmci
2019-04-18T07:01:05Z jumpstart[65945]: Method invocation failed: vmci->start() failed: error while executing the cli
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: configure-locker
2019-04-18T07:01:05Z jumpstart[66693]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /scratch
2019-04-18T07:01:05Z jumpstart[66693]: Using /locker/packages/6.5.0/ as /productLocker
2019-04-18T07:01:05Z jumpstart[66693]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /locker
2019-04-18T07:01:05Z jumpstart[66693]: Using policy dir /etc/vmware/secpolicy
2019-04-18T07:01:05Z jumpstart[66693]: Parsed all objects
2019-04-18T07:01:05Z jumpstart[66693]: Objects defined and obsolete objects removed
2019-04-18T07:01:05Z jumpstart[66693]: Parsed all domain names
2019-04-18T07:01:05Z jumpstart[66693]: Domain policies parsed and syntax validated
2019-04-18T07:01:05Z jumpstart[66693]: Constraints check for domain policies succeeded
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /usr/share/nvidia
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /productLocker
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /.vmware
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/vsansparse
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/cbt
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/svm
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/upit
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/vsan
2019-04-18T07:01:05Z jumpstart[66693]: Getting realpath failed: /dev/vvol
2019-04-18T07:01:05Z jumpstart[66693]: Domain policies set
2019-04-18T07:01:05Z jumpstart[66693]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T07:01:05Z jumpstart[66693]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T07:01:05Z jumpstart[66693]: Parsed all the tardisk policy files
2019-04-18T07:01:05Z jumpstart[66693]: Set all the tardisk labels and policy
2019-04-18T07:01:05Z jumpstart[66693]: Parsed all file label mappings
2019-04-18T07:01:05Z jumpstart[66693]: Set all file labels
2019-04-18T07:01:05Z jumpstart[66693]: System security policy has been set successfully
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: restore-system-swap
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: cbrc
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: tpm
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: apei
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: restore-security-policies
2019-04-18T07:01:05Z jumpstart[66699]: Using policy dir /etc/vmware/secpolicy
2019-04-18T07:01:05Z jumpstart[66699]: Parsed all objects
2019-04-18T07:01:05Z jumpstart[66699]: Objects defined and obsolete objects removed
2019-04-18T07:01:05Z jumpstart[66699]: Parsed all domain names
2019-04-18T07:01:05Z jumpstart[66699]: Domain policies parsed and syntax validated
2019-04-18T07:01:05Z jumpstart[66699]: Constraints check for domain policies succeeded
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /usr/share/nvidia
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /productLocker
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /.vmware
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/vsansparse
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/cbt
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/svm
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/upit
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/vsan
2019-04-18T07:01:05Z jumpstart[66699]: Getting realpath failed: /dev/vvol
2019-04-18T07:01:05Z jumpstart[66699]: Domain policies set
2019-04-18T07:01:05Z jumpstart[66699]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T07:01:05Z jumpstart[66699]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T07:01:05Z jumpstart[66699]: Parsed all the tardisk policy files
2019-04-18T07:01:05Z jumpstart[66699]: Set all the tardisk labels and policy
2019-04-18T07:01:05Z jumpstart[66699]: Parsed all file label mappings
2019-04-18T07:01:05Z jumpstart[66699]: Set all file labels
2019-04-18T07:01:05Z jumpstart[66699]: System security policy has been set successfully
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: oem-modules
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: crond
2019-04-18T07:01:05Z crond[66703]: crond: crond (busybox 1.22.1) started, log level 8
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: restore-resource-groups
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: procMisc
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: rdma-vmkapi-compatibility
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: ipmi
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: restore-keymap
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: nmp-vmkapi-compatibility
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: iscsi-vmkapi-compatibility
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: ftcpt
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: hbr
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: autodeploy-setpassword
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: inetd
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vrdma
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: tag-boot-bank
2019-04-18T07:01:05Z jumpstart[66790]: unable to open boot configuration: No such file or directory
2019-04-18T07:01:05Z jumpstart[65945]: Method invocation failed: tag-boot-bank->start() failed: exited with code 1
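
Consistent with the bootbank problem above: tag-boot-bank cannot open the boot configuration (boot.cfg), which lives in the bootbank that is now just an empty directory in the ramdisk. Presumably this fails the same way by hand:

# boot.cfg should sit in the bootbank of the boot medium
cat /bootbank/boot.cfg
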
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: system-image-cache
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: iofilters
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vit
2019-04-18T07:01:05Z jumpstart[65973]: Parser: Initializing VIT parser lib
2019-04-18T07:01:05Z jumpstart[65973]: VsanIscsiTargetImpl: The host is not in a Virtual SAN cluster.
2019-04-18T07:01:05Z jumpstart[65973]: Util: Retrieved vit status successfully
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vmotion
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: vfc
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: balloonVMCI
2019-04-18T07:01:05Z jumpstart[65945]: executing start plugin: coredump-configuration
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:01:05Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:05Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:05Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:01:05Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:05Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:01:05Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:05Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:01:05Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:01:05Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:01:06Z jumpstart[65973]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T07:01:06Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:01:06Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:01:06Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:06Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:01:06Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:01:06Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:01:06Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:01:06Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T07:01:06Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T07:01:06Z jumpstart[65973]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T07:01:06Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T07:01:06Z jumpstart[65973]: GetTypedFileSystems: fstype ufs
2019-04-18T07:01:06Z jumpstart[65973]: GetTypedFileSystems: fstype vvol
2019-04-18T07:01:06Z jumpstart[65973]: GetTypedFileSystems: fstype vsan
2019-04-18T07:01:06Z jumpstart[65973]: GetTypedFileSystems: fstype PMEM
2019-04-18T07:01:06Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55098474496
2019-04-18T07:01:06Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:01:06Z jumpstart[65945]: executing start plugin: set-acceptance-level
2019-04-18T07:01:06Z jumpstart[65945]: executing start plugin: scratch-storage
2019-04-18T07:01:07Z jumpstart[65945]: executing start plugin: pingback
2019-04-18T07:01:07Z jumpstart[65945]: executing start plugin: vmswapcleanup
2019-04-18T07:01:07Z jumpstart[65973]: execution of '--plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup' failed : Host Local Swap Location has not been enabled  
2019-04-18T07:01:07Z jumpstart[65945]: Executor failed executing esxcli command --plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup
2019-04-18T07:01:07Z jumpstart[65945]: Method invocation failed: vmswapcleanup->start() failed: error while executing the cli
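
This one looks harmless by comparison: the cleanup plugin only reports that host-local swap was never enabled on this host. If in doubt, the setting can be read back (esxcli sched namespace, present in 6.5 as far as I know):

# show the host-local swap configuration
esxcli sched swap system get
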
2019-04-18T07:01:07Z jumpstart[65973]: Jumpstart executor signalled to stop
2019-04-18T07:01:07Z jumpstart[65945]: Executor has been Successfully Stopped
2019-04-18T07:01:07Z init: starting pid 66830, tty '': '/usr/lib/vmware/firstboot/bin/firstboot.py ++group=host/vim/vmvisor/boot -l'  
2019-04-18T07:01:07Z init: starting pid 66831, tty '': '/bin/services.sh start'  
2019-04-18T07:01:08Z jumpstart[66884]: executing start plugin: ESXShell
2019-04-18T07:01:08Z addVob[66890]: Could not expand environment variable HOME.
2019-04-18T07:01:08Z addVob[66890]: Could not expand environment variable HOME.
2019-04-18T07:01:08Z addVob[66890]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:08Z addVob[66890]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:01:08Z addVob[66890]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:01:08Z jumpstart[66884]: executing start plugin: DCUI
2019-04-18T07:01:08Z root: DCUI Enabling DCUI login: runlevel =
2019-04-18T07:01:08Z addVob[66905]: Could not expand environment variable HOME.
2019-04-18T07:01:08Z addVob[66905]: Could not expand environment variable HOME.
2019-04-18T07:01:08Z addVob[66905]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:08Z addVob[66905]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:01:08Z addVob[66905]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:01:08Z jumpstart[66884]: executing start plugin: ntpd
2019-04-18T07:01:08Z root: ntpd Starting ntpd
2019-04-18T07:01:08Z sntp[66910]: sntp 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:44 UTC 2016 (1)
2019-04-18T07:01:08Z sntp[66910]: 2019-04-18 07:01:08.894891 (+0000) +1.223882 +/- 0.840833 pool.ntp.org 136.243.177.133 s2 no-leap
2019-04-18T07:01:10Z watchdog-ntpd: [66917] Begin '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift', min-uptime = 60, max-quick-failures = 5, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:10Z watchdog-ntpd: Executing '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift'  
2019-04-18T07:01:10Z ntpd[66927]: ntpd 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:59 UTC 2016 (1): Starting
2019-04-18T07:01:10Z ntpd[66927]: Command line: /sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift
2019-04-18T07:01:10Z ntpd[66927]: proto: precision = 0.468 usec (-21)
2019-04-18T07:01:10Z ntpd[66927]: restrict default: KOD does nothing without LIMITED.
2019-04-18T07:01:10Z ntpd[66927]: Listen and drop on 0 v6wildcard [::]:123
2019-04-18T07:01:10Z ntpd[66927]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 2 lo0 127.0.0.1:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 3 vmk0 192.168.20.20:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 4 vmk1 192.168.55.60:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 5 lo0 [::1]:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 6 lo0 [fe80::1%1]:123
2019-04-18T07:01:10Z ntpd[66927]: Listen normally on 7 vmk1 [fe80::250:56ff:fe67:b2b0%3]:123
2019-04-18T07:01:11Z jumpstart[66884]: executing start plugin: SSH
2019-04-18T07:01:11Z addVob[66934]: Could not expand environment variable HOME.
2019-04-18T07:01:11Z addVob[66934]: Could not expand environment variable HOME.
2019-04-18T07:01:11Z addVob[66934]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:11Z addVob[66934]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:01:11Z addVob[66934]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:01:11Z jumpstart[66884]: executing start plugin: esxui
2019-04-18T07:01:11Z jumpstart[66884]: executing start plugin: iofilterd-vmwarevmcrypt
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: Could not expand environment variable HOME.
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: Could not expand environment variable HOME.
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:01:11Z iofilterd-vmwarevmcrypt[66963]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T07:01:12Z watchdog-iofiltervpd: [66976] Begin '/usr/lib/vmware/iofilter/bin/ioFilterVPServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:12Z watchdog-iofiltervpd: Executing '/usr/lib/vmware/iofilter/bin/ioFilterVPServer'  
2019-04-18T07:01:14Z jumpstart[66884]: executing start plugin: swapobjd
2019-04-18T07:01:14Z watchdog-swapobjd: [67006] Begin '/usr/lib/vmware/swapobj/bin/swapobjd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:14Z watchdog-swapobjd: Executing '/usr/lib/vmware/swapobj/bin/swapobjd'  
2019-04-18T07:01:14Z jumpstart[66884]: executing start plugin: usbarbitrator
2019-04-18T07:01:14Z usbarbitrator: evicting objects on USB from OC
2019-04-18T07:01:15Z usbarbitrator: unclaiming USB devices
2019-04-18T07:01:15Z usbarbitrator: rescanning to complete removal of USB devices
2019-04-18T07:01:15Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:01:15Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:01:15Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:01:15Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:01:15Z watchdog-usbarbitrator: [67042] Begin '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:15Z watchdog-usbarbitrator: Executing '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414'  
2019-04-18T07:01:15Z jumpstart[66884]: executing start plugin: iofilterd-spm
2019-04-18T07:01:15Z iofilterd-spm[67062]: Could not expand environment variable HOME.
2019-04-18T07:01:15Z iofilterd-spm[67062]: Could not expand environment variable HOME.
2019-04-18T07:01:15Z iofilterd-spm[67062]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:15Z iofilterd-spm[67062]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:01:15Z iofilterd-spm[67062]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:01:15Z iofilterd-spm[67062]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T07:01:15Z jumpstart[66884]: executing start plugin: sensord
2019-04-18T07:01:15Z usbarbitrator: Starting USB storage detach monitor
2019-04-18T07:01:15Z usbarbitrator: reservedHbas:
2019-04-18T07:01:15Z watchdog-sensord: [67098] Begin '/usr/lib/vmware/bin/sensord -l', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:15Z watchdog-sensord: Executing '/usr/lib/vmware/bin/sensord -l'  
2019-04-18T07:01:16Z jumpstart[66884]: executing start plugin: storageRM
2019-04-18T07:01:16Z watchdog-storageRM: [67116] Begin '/sbin/storageRM', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:16Z watchdog-storageRM: Executing '/sbin/storageRM'  
2019-04-18T07:01:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:01:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:01:16Z jumpstart[66884]: executing start plugin: hostd
2019-04-18T07:01:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:01:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:01:16Z usbarbitrator: Exiting USB storage detach monitor
2019-04-18T07:01:16Z hostd-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/hostd/config.xml" to file "/etc/vmware/hostd/config.xml" [force=False]   
2019-04-18T07:01:16Z hostd-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T07:01:16Z hostd-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T07:01:16Z create-statsstore[67137]: Initiating hostd statsstore ramdisk size (re)evaluation.
2019-04-18T07:01:16Z create-statsstore[67137]: Maximum number of virtual machines supported for powering-on 384. Maximum number of virtual machines supported for register 1536. Maximum number of resource pools 1000.
2019-04-18T07:01:16Z create-statsstore[67137]: Estimating statsstore ramdisk of size 803MB will be needed.
2019-04-18T07:01:16Z create-statsstore[67137]: Creating statsstore ramdisk mount point /var/lib/vmware/hostd/stats.
2019-04-18T07:01:16Z create-statsstore[67137]: Creating new statsstore ramdisk with 803MB.
2019-04-18T07:01:16Z watchdog-hostd: [67144] Begin 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:16Z watchdog-hostd: Executing 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml'  
2019-04-18T07:01:16Z jumpstart[66884]: executing start plugin: sdrsInjector
2019-04-18T07:01:16Z watchdog-sdrsInjector: [67163] Begin '/sbin/sdrsInjector', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:16Z watchdog-sdrsInjector: Executing '/sbin/sdrsInjector'  
2019-04-18T07:01:17Z jumpstart[66884]: executing start plugin: nfcd
2019-04-18T07:01:17Z watchdog-nfcd: [67186] Begin '/usr/lib/vmware/bin/nfcd ++group=nfcd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 1) 1  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 2) 1  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 3) 1  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 4) 1  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z jumpstart[66884]: executing start plugin: vvold
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 5) 1  
2019-04-18T07:01:17Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:01:17Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 6) 1  
2019-04-18T07:01:17Z watchdog-nfcd: End '/usr/lib/vmware/bin/nfcd ++group=nfcd', failure limit reached  
2019-04-18T07:01:17Z watchdog-vvold: [67314] Begin 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:17Z watchdog-vvold: Executing 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold'  
2019-04-18T07:01:17Z watchdog-vvold: Watchdog for vvold is now 67314
2019-04-18T07:01:17Z watchdog-vvold: Terminating watchdog process with PID 67314
2019-04-18T07:01:17Z watchdog-vvold: [67314] Signal received: exiting the watchdog
2019-04-18T07:01:18Z jumpstart[66884]: executing start plugin: rhttpproxy
2019-04-18T07:01:19Z rhttpproxy-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/rhttpproxy/config.xml" to file "/etc/vmware/rhttpproxy/config.xml" [force=False]   
2019-04-18T07:01:19Z rhttpproxy-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T07:01:19Z rhttpproxy-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T07:01:19Z watchdog-rhttpproxy: [67527] Begin 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:19Z watchdog-rhttpproxy: Executing 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml'  
2019-04-18T07:01:19Z jumpstart[66884]: executing start plugin: hostdCgiServer
2019-04-18T07:01:19Z watchdog-hostdCgiServer: [67554] Begin 'hostdCgiServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:19Z watchdog-hostdCgiServer: Executing 'hostdCgiServer'  
2019-04-18T07:01:19Z jumpstart[66884]: executing start plugin: lbtd
2019-04-18T07:01:19Z PyVmomiServer: 2019-04-18 07:01:19,396 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:01:19Z watchdog-net-lbt: [67581] Begin '/sbin/net-lbt ++min=0', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:19Z watchdog-net-lbt: Executing '/sbin/net-lbt ++min=0'  
2019-04-18T07:01:19Z jumpstart[66884]: executing start plugin: rabbitmqproxy
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: [67612] Begin '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', min-uptime = 60, max-quick-failures = 1, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 1) 0  
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 2) 0  
2019-04-18T07:01:19Z watchdog-rabbitmqproxy: End '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', failure limit reached  
2019-04-18T07:01:20Z jumpstart[66884]: executing start plugin: vmfstraced
2019-04-18T07:01:20Z vmfstracegd: VMFS Global Tracing is not enabled.
2019-04-18T07:01:20Z PyVmomiServer: 2019-04-18 07:01:20,229 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:01:20Z jumpstart[66884]: executing start plugin: slpd
2019-04-18T07:01:20Z root: slpd Starting slpd
2019-04-18T07:01:20Z root: slpd Generating registration file /etc/slp.reg
2019-04-18T07:01:20Z slpd[67700]: test - LOG_INFO
2019-04-18T07:01:20Z slpd[67700]: test - LOG_WARNING
2019-04-18T07:01:20Z slpd[67700]: test - LOG_ERROR
2019-04-18T07:01:20Z slpd[67700]: *** SLPD daemon version 1.0.0 started
2019-04-18T07:01:20Z slpd[67700]: Command line = /sbin/slpd
2019-04-18T07:01:20Z slpd[67700]: Using configuration file = /etc/slp.conf
2019-04-18T07:01:20Z slpd[67700]: Using registration file = /etc/slp.reg
2019-04-18T07:01:20Z slpd[67700]: Agent Interfaces = 192.168.20.20,192.168.55.60,fe80::250:56ff:fe67:b2b0%vmk1
2019-04-18T07:01:20Z slpd[67700]: Agent URL = service:service-agent://esxi-server.testlab.test
2019-04-18T07:01:20Z slpd[67701]: *** BEGIN SERVICES
2019-04-18T07:01:20Z jumpstart[66884]: executing start plugin: dcbd
2019-04-18T07:01:20Z watchdog-dcbd: [67709] Begin '/usr/sbin/dcbd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:20Z watchdog-dcbd: Executing '/usr/sbin/dcbd'  
2019-04-18T07:01:20Z dcbd: [info]     add_dcbx_ieee: device = default_cfg_attribs stype = 2
2019-04-18T07:01:20Z dcbd: [info]     add_ets_ieee: device = default_cfg_attribs
2019-04-18T07:01:20Z dcbd: [info]     add_pfc_ieee: device = default_cfg_attribs
2019-04-18T07:01:20Z dcbd: [info]     add_app_ieee: device = default_cfg_attribs subtype = 0
2019-04-18T07:01:20Z dcbd: [info]     Main loop running.
2019-04-18T07:01:20Z jumpstart[66884]: executing start plugin: nscd
2019-04-18T07:01:20Z watchdog-nscd: [67729] Begin '/usr/lib/vmware/nscd/bin/nscd -d', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:20Z watchdog-nscd: Executing '/usr/lib/vmware/nscd/bin/nscd -d'  
2019-04-18T07:01:20Z jumpstart[66884]: executing start plugin: cdp
2019-04-18T07:01:21Z watchdog-cdp: [67751] Begin '/usr/sbin/net-cdp', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:21Z watchdog-cdp: Executing '/usr/sbin/net-cdp'  
2019-04-18T07:01:21Z jumpstart[66884]: executing start plugin: lacp
2019-04-18T07:01:21Z jumpstart[66884]: executing start plugin: smartd
2019-04-18T07:01:21Z watchdog-smartd: [67770] Begin '/usr/sbin/smartd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:21Z watchdog-smartd: Executing '/usr/sbin/smartd'  
2019-04-18T07:01:21Z smartd: [warn] smartd starts to run with interval 30 minutes
2019-04-18T07:01:21Z jumpstart[66884]: executing start plugin: memscrubd
2019-04-18T07:01:21Z jumpstart[66884]: executing start plugin: vpxa
2019-04-18T07:01:21Z watchdog-vpxa: [67800] Begin '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:21Z watchdog-vpxa: Executing '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa'  
2019-04-18T07:01:21Z jumpstart[66884]: executing start plugin: lwsmd
2019-04-18T07:01:22Z watchdog-lwsmd: [67843] Begin '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:22Z watchdog-lwsmd: Executing '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog'  
2019-04-18T07:01:22Z lwsmd: Logging started
2019-04-18T07:01:22Z lwsmd: Likewise Service Manager starting up
2019-04-18T07:01:22Z lwsmd: Starting service: lwreg
2019-04-18T07:01:22Z lwsmd: [lwreg-ipc] Listening on endpoint /etc/likewise/lib/.regsd
2019-04-18T07:01:22Z lwsmd: [lwreg-ipc] Listener started
2019-04-18T07:01:22Z lwsmd: [lwsm-ipc] Listening on endpoint /etc/likewise/lib/.lwsm
2019-04-18T07:01:22Z lwsmd: [lwsm-ipc] Listener started
2019-04-18T07:01:22Z lwsmd: Likewise Service Manager startup complete
2019-04-18T07:01:23Z lwsmd: Starting service: netlogon
2019-04-18T07:01:23Z lwsmd: [netlogon-ipc] Listening on endpoint /etc/likewise/lib/.netlogond
2019-04-18T07:01:23Z lwsmd: [netlogon-ipc] Listener started
2019-04-18T07:01:23Z lwsmd: Starting service: lwio
2019-04-18T07:01:23Z lwsmd: [lwio-ipc] Listening on endpoint /etc/likewise/lib/.lwiod
2019-04-18T07:01:23Z lwsmd: [lwio-ipc] Listener started
2019-04-18T07:01:23Z lwsmd: Starting service: rdr
2019-04-18T07:01:23Z lwsmd: Starting service: lsass
2019-04-18T07:01:23Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.ntlmd
2019-04-18T07:01:23Z lwsmd: [lsass-ipc] Listener started
2019-04-18T07:01:23Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so'  
2019-04-18T07:01:23Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so: cannot open shared object file: No such file or directory
2019-04-18T07:01:23Z lwsmd: [lsass] Failed to load provider 'lsa-vmdir-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T07:01:23Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so'  
2019-04-18T07:01:23Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so: cannot open shared object file: No such file or directory
2019-04-18T07:01:23Z lwsmd: [lsass] Failed to load provider 'lsa-local-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
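Note: right at lsass startup two Likewise auth-provider libraries cannot be opened ("No such file or directory"). These particular messages also show up on some healthy hosts, so on their own they may be cosmetic - but they fit the pattern of missing files further down. A quick check from the shell, paths taken straight from the log:

  ls -l /usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so
  ls -l /usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so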
2019-04-18T07:01:23Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.lsassd
2019-04-18T07:01:23Z lwsmd: [lsass-ipc] Listener started
2019-04-18T07:01:23Z lwsmd: [lsass] The in-memory cache file does not exist yet
2019-04-18T07:01:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 0  
2019-04-18T07:01:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.TESTLAB.TEST' failed with errno 0, h_errno = 1  
2019-04-18T07:01:23Z lwsmd: [lsass] Domain 'Testlab.test' is now offline  
2019-04-18T07:01:23Z lwsmd: [lsass] Machine Password Sync Thread starting
2019-04-18T07:01:24Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:01:24Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:01:24Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:01:24Z jumpstart[66884]: executing start plugin: vit_loader.sh
2019-04-18T07:01:24Z VITLOADER: [etc/init.d/vit_loader] Start vit loader
2019-04-18T07:01:25Z jumpstart[66884]: executing start plugin: hpe-smx.init
2019-04-18T07:01:25Z root: /etc/init.d/hpe-smx.init: Collecting PCI info...
2019-04-18T07:01:25Z root: /etc/init.d/hpe-smx.init: getipmibtaddress returns 254. No IPMI driver reload
2019-04-18T07:01:25Z root: /etc/init.d/hpe-smx.init: Done.
2019-04-18T07:01:25Z jumpstart[66884]: executing start plugin: hpe-nmi.init
2019-04-18T07:01:25Z root: hpe-nmi.init: Supported Server detected.  Loading NMI kernel module...
2019-04-18T07:01:25Z root: hpe-nmi.init:  Done.
2019-04-18T07:01:26Z jumpstart[66884]: executing start plugin: hpe-fc.sh
2019-04-18T07:01:26Z root: hpe-fc init script: Generating hba config file...
2019-04-18T07:01:26Z jumpstart[66884]: executing start plugin: sfcbd-watchdog
2019-04-18T07:01:26Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T07:01:26Z sfcbd-init: Exclusive access granted.
2019-04-18T07:01:26Z sfcbd-init: Request to start sfcbd-watchdog, pid 68037
2019-04-18T07:01:27Z sfcbd-config[68047]: Configuration not changed, already enabled
2019-04-18T07:01:27Z sfcbd-config[68053]: new install or upgrade previously completed, no changes made at version 0.0.0
2019-04-18T07:01:27Z sfcbd-config[68053]: file /etc/sfcb/sfcb.cfg update completed.
2019-04-18T07:01:27Z sfcbd-init: snmp has not been enabled.
2019-04-18T07:01:27Z sfcbd-init: starting sfcbd
2019-04-18T07:01:27Z sfcbd-init: Waiting for sfcb to start up.
2019-04-18T07:01:27Z amnesiac[68076]: 3 of 4. Testing Log Levels - LOG_WARNING
2019-04-18T07:01:27Z amnesiac[68076]: 4 of 4. Testing Log Levels - LOG_ERR
2019-04-18T07:01:27Z sfcbd-init: Program started normally.
2019-04-18T07:01:27Z jumpstart[66884]: executing start plugin: wsman
2019-04-18T07:01:27Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T07:01:27Z openwsmand: Exclusive access granted.
2019-04-18T07:01:27Z openwsmand: Starting openwsmand
2019-04-18T07:01:27Z watchdog-openwsmand: [68113] Begin '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process', min-uptime = 60, max-quick-failures = 5, max-total-failures = 10, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:01:28Z watchdog-openwsmand: Executing '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process'  
2019-04-18T07:01:28Z : dlopen /usr/lib/libticket.so.0 failed, error: /usr/lib/libticket.so.0: cannot open shared object file: No such file or directory, exiting. 0 Success
2019-04-18T07:01:28Z : [wrn][68123:/build/mts/release/bora-4152810/cayman_openwsman/openwsman/src/src/server/wsmand.c:320:main] nsswitch.conf successfully stat'ed  
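Same pattern here: openwsmand fails to dlopen /usr/lib/libticket.so.0. Whether that library is supposed to exist on this custom image I cannot say for sure, but it is the second "cannot open shared object file" during one boot. Check:

  ls -l /usr/lib/libticket.so.0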
2019-04-18T07:01:28Z jumpstart[66884]: executing start plugin: snmpd
2019-04-18T07:01:28Z root: Starting snmpd
2019-04-18T07:01:28Z root: snmpd has not been enabled.
2019-04-18T07:01:28Z jumpstart[66884]: Jumpstart failed to start: snmpd reason: Execution of command: /etc/init.d/snmpd start failed with status: 1
2019-04-18T07:01:28Z jumpstart[66884]: executing start plugin: xorg
2019-04-18T07:01:28Z jumpstart[66884]: executing start plugin: vmtoolsd
2019-04-18T07:01:28Z jumpstart[66884]: executing start plugin: hp-ams.sh
2019-04-18T07:01:28Z amshelper: Wrapper constructing internal library
2019-04-18T07:01:28Z amshelper[68142]: ams ver 10.6.0-24: Running check for supported server...
2019-04-18T07:01:28Z amshelper[68142]: Wrapper Destructing internal library
2019-04-18T07:01:28Z root: [ams] Agentless Management Service is not supported on this server.
2019-04-18T07:01:29Z jumpstart[66884]: Jumpstart failed to start: hp-ams.sh reason: Execution of command: /etc/init.d/hp-ams.sh start failed with status: 1
2019-04-18T07:01:29Z init: starting pid 68148, tty '': '/bin/apply-host-profiles'  
2019-04-18T07:01:29Z init: starting pid 68149, tty '': '/usr/lib/vmware/secureboot/bin/secureBoot.py ++group=host/vim/vmvisor/boot -a'  
2019-04-18T07:01:29Z backup.sh.68144: Locking esx.conf
2019-04-18T07:01:29Z backup.sh.68144: Creating archive
2019-04-18T07:01:29Z backup.sh.68144: Unlocking esx.conf
2019-04-18T07:01:30Z init: starting pid 68319, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh boot'  
2019-04-18T07:01:30Z addVob[68321]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:01:30Z addVob[68321]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:01:30Z addVob[68321]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:01:30Z init: starting pid 68325, tty '': '/bin/vmdumper -g 'Boot Successful''  
2019-04-18T07:01:30Z init: starting pid 68326, tty '': '/bin/sh ++min=0,group=host/vim/vimuser/terminal/shell /etc/rc.local'  
2019-04-18T07:01:30Z root: init Running kickstart.py
2019-04-18T07:01:31Z root: init Running local.sh
2019-04-18T07:01:31Z init: starting pid 68355, tty '': '/bin/esxcfg-init --set-boot-progress done'  
2019-04-18T07:01:31Z init: starting pid 68356, tty '': '/bin/vmware-autostart.sh start'  
2019-04-18T07:01:31Z VMware[startup]: Starting VMs
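This is where autostart would power on the registered VMs - and since the inventory is empty after every reboot, nothing happens. Listing and re-registering also works from the shell; a minimal sketch (datastore and VM names are placeholders):

  vim-cmd vmsvc/getallvms
  vim-cmd solo/registervm /vmfs/volumes/<datastore>/<vmname>/<vmname>.vmx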
2019-04-18T07:01:31Z init: starting pid 68359, tty '/dev/tty1': '/bin/initterm.sh tty1 /bin/techsupport.sh'  
2019-04-18T07:01:31Z init: starting pid 68360, tty '/dev/tty2': '-/bin/initterm.sh tty2 /bin/dcuiweasel'  
2019-04-18T07:01:31Z DCUI: Starting DCUI
2019-04-18T07:01:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:01:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:01:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:02:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68398  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:02:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68402  
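The KDC/DC noise above and below is the leftover of the old Testlab.test domain I am currently reconfiguring: the host is still joined to a domain whose DNS records no longer exist, so netlogon retries forever. That is unrelated to the registration problem, but easy to silence. Assuming the Likewise tooling is present on this custom image, leaving the domain should work roughly like this:

  /usr/lib/vmware/likewise/bin/domainjoin-cli query
  /usr/lib/vmware/likewise/bin/domainjoin-cli leave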
2019-04-18T07:02:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:02:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T07:02:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:02:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68460  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:02:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68463  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:02:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68466  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:02:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:03:21Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:03:21Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001c7ebd094ee9229fffb824
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:03:21Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:03:21Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001cc9f2bd5ae7909acd22b5
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:03:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
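The smartd lines are expected with a Smart Array setup: the naa.600508b1... devices are logical volumes presented by the controller, so SMART passthrough simply does not apply - no disk fault visible here. The logical devices themselves can still be inspected, e.g.:

  esxcli storage core device list -d naa.600508b1001c7ebd094ee9229fffb824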
2019-04-18T07:03:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:03:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:03:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:04:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:04:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68517  
2019-04-18T07:04:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:04:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:04:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:04:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:04:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68519  
2019-04-18T07:05:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:05:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:05:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:05:01Z crond[66703]: crond: USER root pid 68530 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:05:01Z syslog[68534]: starting hostd probing.
2019-04-18T07:05:20Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:05:20Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68529  
2019-04-18T07:05:20Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:05:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:05:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:05:20Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:05:20Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68545  
2019-04-18T07:06:20Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:06:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:06:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:06:39Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:06:39Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68555  
2019-04-18T07:06:39Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:06:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:06:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:06:39Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:06:39Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68558  
2019-04-18T07:07:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:07:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T07:07:39Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:07:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:07:39Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:07:58Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:07:58Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68568  
2019-04-18T07:07:58Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:07:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:07:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:07:58Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:07:58Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68570  
2019-04-18T07:08:54Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:08:54Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:08:54Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:08:56Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:08:56Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:08:56Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,012 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,012 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x53550b3978>}   
2019-04-18T07:08:57Z hostd-icm[68613]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,012 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,013 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-1dd8-5da2</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-124254486</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T07:08:57Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,130 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,210 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,210 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x1445cc1978>}   
2019-04-18T07:08:57Z hostd-icm[68621]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,210 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,211 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-6956-5dae</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-124254489</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T07:08:57Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:08:57Z ImageConfigManager: 2019-04-18 07:08:57,228 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
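This is the most telling part of the whole log: the image installer cannot load /bootbank/boot.cfg because the file does not exist, and altbootbank is invalid as well. ESXi persists its entire configuration (the state.tgz that the backup job writes) into exactly this bootbank - if the bootbank is gone, every vSwitch, every VM registration and every other setting lives only in RAM and dies with the next reboot. That would explain the symptoms completely. To see whether the bootbank partitions are even mounted:

  ls -l /bootbank /altbootbank
  ls -l /bootbank/boot.cfg /bootbank/state.tgz
  esxcli storage filesystem list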
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:08:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:00Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:00Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:00Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:02Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:02Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:02Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:09:13Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68581  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:09:13Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68662  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:09:13Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68669  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:09:22Z backup.sh.68674: Locking esx.conf
2019-04-18T07:09:22Z backup.sh.68674: Creating archive
2019-04-18T07:09:22Z backup.sh.68674: Unlocking esx.conf
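The periodic config backup itself runs without complaint, but with an invalid bootbank the archive has no persistent target. Triggering it by hand and checking the timestamp shows whether anything actually lands on disk:

  /sbin/auto-backup.sh
  ls -l /bootbank/state.tgz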
2019-04-18T07:09:58Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:09:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:09:58Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:10:01Z crond[66703]: crond: USER root pid 68834 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:10:01Z syslog[68838]: starting hostd probing.
2019-04-18T07:10:17Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:10:17Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68833  
2019-04-18T07:10:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:10:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:10:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:10:17Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:10:17Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68849  
2019-04-18T07:11:17Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:11:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:11:17Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:11:36Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:11:36Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68861  
2019-04-18T07:11:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:11:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:11:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:11:36Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:11:36Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68863  
2019-04-18T07:12:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:12:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T07:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:12:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:12:55Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:12:55Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68873  
2019-04-18T07:12:55Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:12:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:12:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:12:55Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:12:55Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68876  
2019-04-18T07:13:55Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:13:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:13:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:14:14Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:14:14Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68887  
2019-04-18T07:14:14Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:14:14Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:14:14Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:14:14Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:14:14Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68889  
2019-04-18T07:15:01Z crond[66703]: crond: USER root pid 68892 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:15:01Z syslog[68896]: starting hostd probing.
2019-04-18T07:15:14Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:15:14Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:15:14Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:15:33Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:15:33Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68912  
2019-04-18T07:15:33Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:15:33Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:15:33Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:15:33Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:15:33Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68914  
2019-04-18T07:16:33Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:16:33Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:16:33Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:16:52Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:16:52Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68923  
2019-04-18T07:16:52Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:16:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:16:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:16:52Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:16:52Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68927  
2019-04-18T07:17:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:17:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T07:17:52Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:17:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:17:52Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:18:11Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:18:11Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68936  
2019-04-18T07:18:11Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:18:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:18:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:18:11Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:18:11Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68940  
2019-04-18T07:19:11Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:19:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:19:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:19:30Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:19:30Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68950  
2019-04-18T07:19:30Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:19:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:19:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T07:19:30Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:19:30Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68953  
2019-04-18T07:20:01Z crond[66703]: crond: USER root pid 68956 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:20:01Z syslog[68960]: starting hostd probing.
[... the netlogon/KRB5/lsass cycle above repeats roughly once a minute from 07:20:30 to 07:32:41; omitted for brevity. In addition, every five minutes netlogon runs a DNS probe: ...]
2019-04-18T07:22:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:22:23Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
[... plus the usual crond/hostd-probe pairs at 07:25:01 and 07:30:01 ...]
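[Editor's note: The five-minute pattern is netlogon's periodic DNS probe: without an SRV record for _ldap._tcp.dc._msdcs.Testlab.test no domain controller can be located. This is easy to reproduce from the shell; note that the busybox nslookup shipped with ESXi only honours query types on some builds, so treat this as a sketch:

nslookup -type=srv _ldap._tcp.dc._msdcs.Testlab.test   # would have to return a DC's SRV record for the join to work
cat /etc/resolv.conf                                    # which DNS server is the host actually asking?]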
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:33:21Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:33:21Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001c7ebd094ee9229fffb824
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:33:21Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:33:21Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001cc9f2bd5ae7909acd22b5
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:33:21Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
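[Editor's note: The smartd messages are expected on this box. Behind the HP Smart Array controller ESXi only sees logical volumes (naa.600508b1...), and those do not answer ATA SMART queries, so these lines say nothing about the health of the physical disks. A quick check, hedged because SMART support depends on controller and driver:

esxcli storage core device smart get -d naa.600508b1001c7ebd094ee9229fffb824   # will likely report 'not supported' for a Smart Array logical volume

Physical-disk health would have to come from the controller itself, e.g. via hpssacli/ssacli, if the HPE tools are part of the custom image.]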
[... the same lwsmd lookup cycle continues from 07:33:41 onwards; between 07:35:27 and 07:35:49 it fires in bursts of several lookups per second (hostd/UI activity), interleaved with the 07:35:01 crond/hostd-probe pair; omitted for brevity. The only new message in this stretch, logged twice: ...]
2019-04-18T07:35:36Z dcbd: [info]     Ignoring vmnic4 link state change, no port found
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,586 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,586 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x2cfb44d978>}   
2019-04-18T07:35:43Z hostd-icm[69255]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,586 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,587 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-1795-5f53</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-124254695</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T07:35:43Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,702 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,803 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,803 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x7dad19b978>}   
2019-04-18T07:35:43Z hostd-icm[69263]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,803 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,804 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-46af-5f5f</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-124254698</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T07:35:43Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:35:43Z ImageConfigManager: 2019-04-18 07:35:43,822 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
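[Editor's note: This block is the actual smoking gun for the reported symptoms. hostd cannot find /bootbank/boot.cfg, i.e. the boot volume is not (or no longer) mounted. ESXi keeps its running configuration in RAM and periodically writes it back to the bootbank as state.tgz; if the bootbank is unreachable, VM registrations, virtual switches and every other change silently evaporate on reboot, which is exactly what was described. On a G7 that typically points at a dying SD card/USB stick or an unreadable boot volume rather than at ESXi itself. A few checks from the shell (a sketch, paths as on a stock 6.5 install):

ls -l /bootbank /altbootbank   # healthy: symlinks into /vmfs/volumes/<UUID>; broken: pointing into /tmp or missing
ls -l /bootbank/boot.cfg       # the file the log says does not exist
/sbin/auto-backup.sh           # the script ESXi uses to persist the config (normally run periodically and at shutdown); run it manually and watch for errors]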
[... identical lwsmd lookup cycles continue from 07:35:45 to 07:40:14, again partly in bursts of several per second, together with the five-minute SRV lookup failure at 07:37:23 and the 07:40:01 crond/hostd-probe pair; omitted for brevity ...]
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,005 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,005 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x7387b37978>}   
2019-04-18T07:40:15Z hostd-icm[69385]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,005 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,006 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-e270-5fae</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-124254731</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T07:40:15Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,119 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,125 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,125 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0xa481e09978>}   
2019-04-18T07:40:15Z hostd-icm[69393]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,125 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,126 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-bdd6-5fba</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-124254734</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T07:40:15Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:40:15Z ImageConfigManager: 2019-04-18 07:40:15,144 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
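The two ImageConfigManager entries right above are the first real finding: hostd cannot load /bootbank/boot.cfg because the file is simply not there, i.e. the host no longer sees its boot volume. A quick, non-destructive check from an SSH session (a sketch for a stock 6.5 install; output will of course differ per host):

# Where does /bootbank point? On a healthy host this is a vfat volume
# under /vmfs/volumes/..., never /tmp.
ls -ld /bootbank /altbootbank
ls -l /bootbank/boot.cfg

# UUID of the filesystem the host believes it booted from
esxcli system boot device get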
2019-04-18T07:40:16Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
2019-04-18T07:40:16Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100
2019-04-18T07:40:16Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140
[... the same triple repeats until 07:40:19 ...]
2019-04-18T07:40:27Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:40:27Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 69344  
2019-04-18T07:40:27Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100
[... the usual netlogon retries repeat three more times at 07:40:27 ...]
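The netlogon flood itself is a secondary problem: the host is still joined to the meanwhile-deleted domain TESTLAB.TEST, Kerberos cannot reach any KDC (error -1765328228) and lsass marks the domain offline, which also drags out every boot. If the AD login is really obsolete, the Likewise CLI that ships with ESXi can drop the join (a sketch; only do this once you are sure the domain is gone for good):

# Show the current AD join state
/usr/lib/vmware/likewise/bin/domainjoin-cli query

# Leave the stale domain so lwsmd stops hunting for a DC on every boot
/usr/lib/vmware/likewise/bin/domainjoin-cli leave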
2019-04-18T07:40:27Z init: starting pid 69419, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh stop'  
2019-04-18T07:40:27Z addVob[69421]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:27Z addVob[69421]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:40:27Z addVob[69421]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:40:27Z init: starting pid 69422, tty '': '/bin/shutdown.sh'  
2019-04-18T07:40:27Z VMware[shutdown]: Stopping VMs
2019-04-18T07:40:28Z jumpstart[69439]: executing stop for daemon hp-ams.sh.
2019-04-18T07:40:28Z root: ams stop watchdog...
2019-04-18T07:40:28Z root: ams-wd: ams-watchdog stop.
2019-04-18T07:40:28Z root: Terminating ams-watchdog process with PID 69447 69448
2019-04-18T07:40:30Z root: ams stop service...
2019-04-18T07:40:32Z jumpstart[69439]: executing stop for daemon xorg.
2019-04-18T07:40:32Z jumpstart[69439]: Jumpstart failed to stop: xorg reason: Execution of command: /etc/init.d/xorg stop failed with status: 3
2019-04-18T07:40:32Z jumpstart[69439]: executing stop for daemon vmsyslogd.
2019-04-18T07:40:32Z jumpstart[69439]: Jumpstart failed to stop: vmsyslogd reason: Execution of command: /etc/init.d/vmsyslogd stop failed with status: 1
2019-04-18T07:40:32Z jumpstart[69439]: executing stop for daemon vmtoolsd.
2019-04-18T07:40:32Z jumpstart[69439]: Jumpstart failed to stop: vmtoolsd reason: Execution of command: /etc/init.d/vmtoolsd stop failed with status: 1
2019-04-18T07:40:32Z jumpstart[69439]: executing stop for daemon wsman.
2019-04-18T07:40:32Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T07:40:32Z openwsmand: Exclusive access granted.
2019-04-18T07:40:33Z openwsmand: Stopping openwsmand
2019-04-18T07:40:33Z watchdog-openwsmand: Watchdog for openwsmand is now 68113
2019-04-18T07:40:33Z watchdog-openwsmand: Terminating watchdog process with PID 68113
2019-04-18T07:40:33Z watchdog-openwsmand: [68113] Signal received: exiting the watchdog
2019-04-18T07:40:33Z jumpstart[69439]: executing stop for daemon snmpd.
2019-04-18T07:40:33Z root: Stopping snmpd by administrative request
2019-04-18T07:40:33Z root: snmpd is not running.
2019-04-18T07:40:33Z jumpstart[69439]: executing stop for daemon sfcbd-watchdog.
2019-04-18T07:40:33Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T07:40:33Z sfcbd-init: Exclusive access granted.
2019-04-18T07:40:33Z sfcbd-init: Request to stop sfcbd-watchdog, pid 69511
2019-04-18T07:40:33Z sfcbd-init: Invoked kill 68076
2019-04-18T07:40:33Z sfcb-vmware_raw[68481]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T07:40:33Z sfcb-vmware_base[68473]: VICimProvider exiting on WFU cancelled.
2019-04-18T07:40:33Z sfcb-vmware_base[68473]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T07:40:36Z sfcbd-init: stop sfcbd process completed.
2019-04-18T07:40:36Z jumpstart[69439]: executing stop for daemon vit_loader.sh.
2019-04-18T07:40:37Z VITLOADER: [etc/init.d/vit_loader] Shutdown VITD successfully
2019-04-18T07:40:37Z jumpstart[69439]: executing stop for daemon hpe-smx.init.
2019-04-18T07:40:37Z jumpstart[69439]: executing stop for daemon hpe-nmi.init.
2019-04-18T07:40:37Z jumpstart[69439]: executing stop for daemon hpe-fc.sh.
2019-04-18T07:40:37Z jumpstart[69439]: executing stop for daemon lwsmd.
2019-04-18T07:40:37Z watchdog-lwsmd: Watchdog for lwsmd is now 67843
2019-04-18T07:40:37Z watchdog-lwsmd: Terminating watchdog process with PID 67843
2019-04-18T07:40:37Z watchdog-lwsmd: [67843] Signal received: exiting the watchdog
2019-04-18T07:40:37Z lwsmd: Shutting down running services
2019-04-18T07:40:37Z lwsmd: Stopping service: lsass
2019-04-18T07:40:37Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T07:40:37Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T07:40:37Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T07:40:37Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T07:40:37Z lwsmd: [lsass] Machine Password Sync Thread stopping
2019-04-18T07:40:38Z lwsmd: [lsass] LSA Service exiting...
2019-04-18T07:40:38Z lwsmd: Stopping service: rdr
2019-04-18T07:40:38Z lwsmd: Stopping service: lwio
2019-04-18T07:40:38Z lwsmd: [lwio-ipc] Shutting down listener
2019-04-18T07:40:38Z lwsmd: [lwio-ipc] Listener shut down
2019-04-18T07:40:38Z lwsmd: [lwio] LWIO Service exiting...
2019-04-18T07:40:38Z lwsmd: Stopping service: netlogon
2019-04-18T07:40:38Z lwsmd: [netlogon-ipc] Shutting down listener
2019-04-18T07:40:38Z lwsmd: [netlogon-ipc] Listener shut down
2019-04-18T07:40:38Z lwsmd: [netlogon] LWNET Service exiting...
2019-04-18T07:40:38Z lwsmd: Stopping service: lwreg
2019-04-18T07:40:38Z lwsmd: [lwreg-ipc] Shutting down listener
2019-04-18T07:40:38Z lwsmd: [lwreg-ipc] Listener shut down
2019-04-18T07:40:38Z lwsmd: [lwreg] REG Service exiting...
2019-04-18T07:40:38Z lwsmd: [lwsm-ipc] Shutting down listener
2019-04-18T07:40:38Z lwsmd: [lwsm-ipc] Listener shut down
2019-04-18T07:40:38Z lwsmd: Logging stopped
2019-04-18T07:40:40Z jumpstart[69439]: executing stop for daemon vpxa.
2019-04-18T07:40:40Z watchdog-vpxa: Watchdog for vpxa is now 67800
2019-04-18T07:40:40Z watchdog-vpxa: Terminating watchdog process with PID 67800
2019-04-18T07:40:40Z watchdog-vpxa: [67800] Signal received: exiting the watchdog
2019-04-18T07:40:40Z jumpstart[69439]: executing stop for daemon vobd.
2019-04-18T07:40:40Z watchdog-vobd: Watchdog for vobd is now 65960
2019-04-18T07:40:40Z watchdog-vobd: Terminating watchdog process with PID 65960
2019-04-18T07:40:40Z watchdog-vobd: [65960] Signal received: exiting the watchdog
2019-04-18T07:40:40Z jumpstart[69439]: executing stop for daemon dcbd.
2019-04-18T07:40:40Z watchdog-dcbd: Watchdog for dcbd is now 67709
2019-04-18T07:40:40Z watchdog-dcbd: Terminating watchdog process with PID 67709
2019-04-18T07:40:40Z watchdog-dcbd: [67709] Signal received: exiting the watchdog
2019-04-18T07:40:40Z jumpstart[69439]: executing stop for daemon nscd.
2019-04-18T07:40:40Z watchdog-nscd: Watchdog for nscd is now 67729
2019-04-18T07:40:40Z watchdog-nscd: Terminating watchdog process with PID 67729
2019-04-18T07:40:40Z watchdog-nscd: [67729] Signal received: exiting the watchdog
2019-04-18T07:40:40Z jumpstart[69439]: executing stop for daemon cdp.
2019-04-18T07:40:41Z watchdog-cdp: Watchdog for cdp is now 67751
2019-04-18T07:40:41Z watchdog-cdp: Terminating watchdog process with PID 67751
2019-04-18T07:40:41Z watchdog-cdp: [67751] Signal received: exiting the watchdog
2019-04-18T07:40:41Z jumpstart[69439]: executing stop for daemon lacp.
2019-04-18T07:40:41Z watchdog-net-lacp: Watchdog for net-lacp is now 66330
2019-04-18T07:40:41Z watchdog-net-lacp: Terminating watchdog process with PID 66330
2019-04-18T07:40:41Z watchdog-net-lacp: [66330] Signal received: exiting the watchdog
2019-04-18T07:40:41Z jumpstart[69439]: executing stop for daemon smartd.
2019-04-18T07:40:41Z watchdog-smartd: Watchdog for smartd is now 67770
2019-04-18T07:40:41Z watchdog-smartd: Terminating watchdog process with PID 67770
2019-04-18T07:40:41Z watchdog-smartd: [67770] Signal received: exiting the watchdog
2019-04-18T07:40:41Z smartd: [warn] smartd received signal 15
2019-04-18T07:40:41Z smartd: [warn] smartd exit.
2019-04-18T07:40:41Z jumpstart[69439]: executing stop for daemon memscrubd.
2019-04-18T07:40:41Z jumpstart[69439]: Jumpstart failed to stop: memscrubd reason: Execution of command: /etc/init.d/memscrubd stop failed with status: 3
2019-04-18T07:40:41Z jumpstart[69439]: executing stop for daemon slpd.
2019-04-18T07:40:41Z root: slpd Stopping slpd
2019-04-18T07:40:41Z slpd[67701]: SLPD daemon shutting down
2019-04-18T07:40:41Z slpd[67701]: *** SLPD daemon shut down by administrative request
2019-04-18T07:40:42Z jumpstart[69439]: executing stop for daemon sensord.
2019-04-18T07:40:42Z watchdog-sensord: Watchdog for sensord is now 67098
2019-04-18T07:40:42Z watchdog-sensord: Terminating watchdog process with PID 67098
2019-04-18T07:40:42Z watchdog-sensord: [67098] Signal received: exiting the watchdog
2019-04-18T07:40:43Z jumpstart[69439]: executing stop for daemon storageRM.
2019-04-18T07:40:43Z watchdog-storageRM: Watchdog for storageRM is now 67116
2019-04-18T07:40:43Z watchdog-storageRM: Terminating watchdog process with PID 67116
2019-04-18T07:40:43Z watchdog-storageRM: [67116] Signal received: exiting the watchdog
2019-04-18T07:40:43Z jumpstart[69439]: executing stop for daemon hostd.
2019-04-18T07:40:43Z watchdog-hostd: Watchdog for hostd is now 67144
2019-04-18T07:40:43Z watchdog-hostd: Terminating watchdog process with PID 67144
2019-04-18T07:40:43Z watchdog-hostd: [67144] Signal received: exiting the watchdog
2019-04-18T07:40:43Z jumpstart[69439]: executing stop for daemon sdrsInjector.
2019-04-18T07:40:43Z watchdog-sdrsInjector: Watchdog for sdrsInjector is now 67163
2019-04-18T07:40:43Z watchdog-sdrsInjector: Terminating watchdog process with PID 67163
2019-04-18T07:40:43Z watchdog-sdrsInjector: [67163] Signal received: exiting the watchdog
2019-04-18T07:40:44Z jumpstart[69439]: executing stop for daemon nfcd.
2019-04-18T07:40:44Z jumpstart[69439]: executing stop for daemon vvold.
2019-04-18T07:40:44Z jumpstart[69439]: Jumpstart failed to stop: vvold reason: Execution of command: /etc/init.d/vvold stop failed with status: 3
2019-04-18T07:40:44Z jumpstart[69439]: executing stop for daemon rhttpproxy.
2019-04-18T07:40:44Z watchdog-rhttpproxy: Watchdog for rhttpproxy is now 67527
2019-04-18T07:40:44Z watchdog-rhttpproxy: Terminating watchdog process with PID 67527
2019-04-18T07:40:44Z watchdog-rhttpproxy: [67527] Signal received: exiting the watchdog
2019-04-18T07:40:44Z jumpstart[69439]: executing stop for daemon hostdCgiServer.
2019-04-18T07:40:44Z watchdog-hostdCgiServer: Watchdog for hostdCgiServer is now 67554
2019-04-18T07:40:44Z watchdog-hostdCgiServer: Terminating watchdog process with PID 67554
2019-04-18T07:40:44Z watchdog-hostdCgiServer: [67554] Signal received: exiting the watchdog
2019-04-18T07:40:44Z jumpstart[69439]: executing stop for daemon lbtd.
2019-04-18T07:40:44Z watchdog-net-lbt: Watchdog for net-lbt is now 67581
2019-04-18T07:40:44Z watchdog-net-lbt: Terminating watchdog process with PID 67581
2019-04-18T07:40:44Z watchdog-net-lbt: [67581] Signal received: exiting the watchdog
2019-04-18T07:40:45Z jumpstart[69439]: executing stop for daemon rabbitmqproxy.
2019-04-18T07:40:45Z jumpstart[69439]: executing stop for daemon vmfstraced.
2019-04-18T07:40:45Z watchdog-vmfstracegd: PID file /var/run/vmware/watchdog-vmfstracegd.PID does not exist
2019-04-18T07:40:45Z watchdog-vmfstracegd: Unable to terminate watchdog: No running watchdog process for vmfstracegd
2019-04-18T07:40:45Z vmfstracegd: Failed to clear vmfstracegd memory reservation
2019-04-18T07:40:45Z jumpstart[69439]: executing stop for daemon esxui.
2019-04-18T07:40:45Z jumpstart[69439]: executing stop for daemon iofilterd-vmwarevmcrypt.
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: Could not expand environment variable HOME.
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: Could not expand environment variable HOME.
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:40:45Z iofilterd-vmwarevmcrypt[70093]: Resource Pool clean up for iofilter vmwarevmcrypt is done
2019-04-18T07:40:46Z jumpstart[69439]: executing stop for daemon swapobjd.
2019-04-18T07:40:46Z watchdog-swapobjd: Watchdog for swapobjd is now 67006
2019-04-18T07:40:46Z watchdog-swapobjd: Terminating watchdog process with PID 67006
2019-04-18T07:40:46Z watchdog-swapobjd: [67006] Signal received: exiting the watchdog
2019-04-18T07:40:46Z jumpstart[69439]: executing stop for daemon usbarbitrator.
2019-04-18T07:40:46Z watchdog-usbarbitrator: Watchdog for usbarbitrator is now 67042
2019-04-18T07:40:46Z watchdog-usbarbitrator: Terminating watchdog process with PID 67042
2019-04-18T07:40:46Z watchdog-usbarbitrator: [67042] Signal received: exiting the watchdog
2019-04-18T07:40:46Z jumpstart[69439]: executing stop for daemon iofilterd-spm.
2019-04-18T07:40:46Z iofilterd-spm[70156]: Could not expand environment variable HOME.
2019-04-18T07:40:46Z iofilterd-spm[70156]: Could not expand environment variable HOME.
2019-04-18T07:40:46Z iofilterd-spm[70156]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:46Z iofilterd-spm[70156]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:40:46Z iofilterd-spm[70156]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:40:46Z iofilterd-spm[70156]: Resource Pool clean up for iofilter spm is done
2019-04-18T07:40:46Z jumpstart[69439]: executing stop for daemon ESXShell.
2019-04-18T07:40:46Z addVob[70163]: Could not expand environment variable HOME.
2019-04-18T07:40:46Z addVob[70163]: Could not expand environment variable HOME.
2019-04-18T07:40:46Z addVob[70163]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:46Z addVob[70163]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:40:46Z addVob[70163]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:40:46Z addVob[70163]: VobUserLib_Init failed with -1
2019-04-18T07:40:46Z doat: Stopped wait on component ESXShell.stop
2019-04-18T07:40:46Z doat: Stopped wait on component ESXShell.disable
2019-04-18T07:40:47Z jumpstart[69439]: executing stop for daemon DCUI.
2019-04-18T07:40:47Z root: DCUI Disabling DCUI logins
2019-04-18T07:40:47Z addVob[70184]: Could not expand environment variable HOME.
2019-04-18T07:40:47Z addVob[70184]: Could not expand environment variable HOME.
2019-04-18T07:40:47Z addVob[70184]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:47Z addVob[70184]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:40:47Z addVob[70184]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:40:47Z addVob[70184]: VobUserLib_Init failed with -1
2019-04-18T07:40:47Z jumpstart[69439]: executing stop for daemon ntpd.
2019-04-18T07:40:47Z root: ntpd Stopping ntpd
2019-04-18T07:40:47Z watchdog-ntpd: Watchdog for ntpd is now 66917
2019-04-18T07:40:47Z watchdog-ntpd: Terminating watchdog process with PID 66917
2019-04-18T07:40:47Z watchdog-ntpd: [66917] Signal received: exiting the watchdog
2019-04-18T07:40:47Z ntpd[66927]: ntpd exiting on signal 1 (Hangup)
2019-04-18T07:40:47Z ntpd[66927]: 136.243.177.133 local addr 192.168.20.20 -> <null>
2019-04-18T07:40:47Z jumpstart[69439]: executing stop for daemon SSH.
2019-04-18T07:40:47Z addVob[70216]: Could not expand environment variable HOME.
2019-04-18T07:40:47Z addVob[70216]: Could not expand environment variable HOME.
2019-04-18T07:40:47Z addVob[70216]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:40:47Z addVob[70216]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:40:47Z addVob[70216]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:40:47Z addVob[70216]: VobUserLib_Init failed with -1
2019-04-18T07:40:47Z doat: Stopped wait on component RemoteShell.disable
2019-04-18T07:40:47Z doat: Stopped wait on component RemoteShell.stop
2019-04-18T07:40:48Z backup.sh.70272: Locking esx.conf
2019-04-18T07:40:48Z backup.sh.70272: Creating archive
2019-04-18T07:40:48Z backup.sh.70272: Unlocking esx.conf
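These three backup.sh lines are the normal shutdown path: ESXi packs the live /etc state into an archive (local.tgz inside state.tgz) and writes it to the bootbank; that archive is the only place where vSwitches, NFS mounts and VM registrations survive a reboot. The same persistence can be triggered by hand to test it (both commands exist on a stock 6.5 install):

# Force an immediate config backup (the same script the shutdown path calls)
/sbin/auto-backup.sh

# Alternative route via hostd
vim-cmd hostsvc/firmware/sync_config

As long as the bootbank is not writable, both will appear to complete, but the result lands on a RAM disk and is lost at power-off.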
2019-04-18T07:46:18Z watchdog-vobd: [65960] Begin '/usr/lib/vmware/vob/bin/vobd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:46:18Z watchdog-vobd: Executing '/usr/lib/vmware/vob/bin/vobd'  
2019-04-18T07:46:18Z jumpstart[65945]: Launching Executor
2019-04-18T07:46:18Z jumpstart[65945]: Setting up Executor - Reset Requested
2019-04-18T07:46:18Z jumpstart[65945]: ignoring plugin 'vsan-upgrade' because version '2.0.0'  has already been run.  
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: check-required-memory
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: restore-configuration
2019-04-18T07:46:19Z jumpstart[65991]: restoring configuration
2019-04-18T07:46:19Z jumpstart[65991]: extracting from file /local.tgz
2019-04-18T07:46:19Z jumpstart[65991]: file etc/likewise/db/registry.db has been changed before restoring the configuration - the changes will be lost
2019-04-18T07:46:19Z jumpstart[65991]: ConfigCheck: Running ipv6 option upgrade, redundantly
2019-04-18T07:46:19Z jumpstart[65991]: Util: tcpip4 IPv6 enabled
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: vmkeventd
2019-04-18T07:46:19Z watchdog-vmkeventd: [65993] Begin '/usr/lib/vmware/vmkeventd/bin/vmkeventd', min-uptime = 10, max-quick-failures = 5, max-total-failures = 9999999, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:46:19Z watchdog-vmkeventd: Executing '/usr/lib/vmware/vmkeventd/bin/vmkeventd'  
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: vmkcrypto
2019-04-18T07:46:19Z jumpstart[65970]: 65971:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T07:46:19Z jumpstart[65970]: 65971:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T07:46:19Z jumpstart[65970]: 65971:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: autodeploy-enabled
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: vsan-base
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: vsan-early
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: advanced-user-configuration-options
2019-04-18T07:46:19Z jumpstart[65945]: executing start plugin: restore-advanced-configuration
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: PSA-boot-config
2019-04-18T07:46:20Z jumpstart[65970]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:46:20Z jumpstart[65970]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:46:20Z jumpstart[65970]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:46:20Z jumpstart[65970]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T07:46:20Z jumpstart[65970]: lib/ssl: protocol list tls1.2
2019-04-18T07:46:20Z jumpstart[65970]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T07:46:20Z jumpstart[65970]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: vprobe
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: vmkapi-mgmt
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: dma-engine
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: procfs
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: mgmt-vmkapi-compatibility
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: iodm
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: vmkernel-vmkapi-compatibility
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: driver-status-check
2019-04-18T07:46:20Z jumpstart[66023]: driver_status_check: boot cmdline: /jumpstrt.gz vmbTrustedBoot=false tboot=0x101b000 installerDiskDumpSlotSize=2560 no-auto-partition bootUUID=e78269d0448c41fe200c24e8a54f93c1
2019-04-18T07:46:20Z jumpstart[66023]: driver_status_check: useropts:
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: hardware-config
2019-04-18T07:46:20Z jumpstart[66024]: Failed to symlink /etc/vmware/pci.ids: No such file or directory
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: vmklinux
2019-04-18T07:46:20Z jumpstart[65945]: executing start plugin: vmkdevmgr
2019-04-18T07:46:20Z jumpstart[66025]: Starting vmkdevmgr
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: register-vmw-mpp
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: register-vmw-satp
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: register-vmw-psp
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: etherswitch
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: aslr
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: random
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: storage-early-config-dev-settings
2019-04-18T07:46:27Z jumpstart[65945]: executing start plugin: networking-drivers
2019-04-18T07:46:27Z jumpstart[66145]: Loading network device drivers
2019-04-18T07:46:30Z jumpstart[66145]: LoadVmklinuxDriver: Loaded module bnx2
2019-04-18T07:46:31Z jumpstart[65945]: executing start plugin: register-vmw-vaai
2019-04-18T07:46:31Z jumpstart[65945]: executing start plugin: usb
2019-04-18T07:46:31Z jumpstart[65945]: executing start plugin: local-storage
2019-04-18T07:46:31Z jumpstart[65945]: executing start plugin: psa-mask-paths
2019-04-18T07:46:31Z jumpstart[65945]: executing start plugin: network-uplink-init
2019-04-18T07:46:31Z jumpstart[66226]: Trying to connect...
2019-04-18T07:46:31Z jumpstart[66226]: Connected.
2019-04-18T07:46:33Z jumpstart[66226]: Received processed
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: psa-nmp-pre-claim-config
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: psa-filter-pre-claim-config
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: restore-system-uuid
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: restore-storage-multipathing
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: network-support
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: psa-load-rules
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: vds-vmkapi-compatibility
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: psa-filter-post-claim-config
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: psa-nmp-post-claim-config
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: mlx4_en
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: dvfilters-vmkapi-compatibility
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: vds-config
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: storage-drivers
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: vxlan-base
2019-04-18T07:46:33Z jumpstart[65945]: executing start plugin: firewall
2019-04-18T07:46:34Z jumpstart[65945]: executing start plugin: dvfilter-config
2019-04-18T07:46:34Z jumpstart[65945]: executing start plugin: dvfilter-generic-fastpath
2019-04-18T07:46:34Z jumpstart[65945]: executing start plugin: lacp-daemon
2019-04-18T07:46:34Z watchdog-net-lacp: [66323] Begin '/usr/sbin/net-lacp', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:46:34Z watchdog-net-lacp: Executing '/usr/sbin/net-lacp'  
2019-04-18T07:46:34Z jumpstart[65945]: executing start plugin: storage-psa-init
2019-04-18T07:46:34Z jumpstart[66333]: Trying to connect...
2019-04-18T07:46:34Z jumpstart[66333]: Connected.
2019-04-18T07:46:34Z jumpstart[66333]: Received processed
2019-04-18T07:46:34Z jumpstart[65945]: executing start plugin: restore-networking
2019-04-18T07:46:35Z jumpstart[65970]: NetworkInfoImpl: Enabling 1 netstack instances during boot
2019-04-18T07:46:40Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Storing previous management interface:'vmk0'  
2019-04-18T07:46:40Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk0'  
2019-04-18T07:46:40Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk1'  
2019-04-18T07:46:40Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing config for'vmk0'  
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk0
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic: Ipv6 not Enabled
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk0
2019-04-18T07:46:40Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk0', portStr:'Management Network'  
2019-04-18T07:46:40Z jumpstart[65970]: VmKernelNicInfo::LoadConfig: Processing config for'vmk1'  
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk1
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk1
2019-04-18T07:46:40Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65970]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk1', portStr:'NFS-FreeNAS'  
2019-04-18T07:46:40Z jumpstart[65970]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T07:46:40Z jumpstart[65970]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T07:46:40Z jumpstart[65970]: 2019-04-18T07:46:40Z jumpstart[65970]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T07:46:40Z jumpstart[65945]: executing start plugin: random-seed
2019-04-18T07:46:40Z jumpstart[65945]: executing start plugin: dvfilters
2019-04-18T07:46:40Z jumpstart[65945]: executing start plugin: restore-pxe-marker
2019-04-18T07:46:40Z jumpstart[65945]: executing start plugin: auto-configure-networking
2019-04-18T07:46:40Z jumpstart[65945]: executing start plugin: storage-early-configuration
2019-04-18T07:46:40Z jumpstart[66410]: 66410:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T07:46:40Z jumpstart[66410]: 66410:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T07:46:40Z jumpstart[66410]: 66410:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T07:46:40Z jumpstart[66410]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:46:40Z jumpstart[66410]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:46:40Z jumpstart[66410]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:46:40Z jumpstart[66410]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T07:46:40Z jumpstart[66410]: lib/ssl: protocol list tls1.2
2019-04-18T07:46:40Z jumpstart[66410]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T07:46:40Z jumpstart[66410]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T07:46:41Z jumpstart[65945]: executing start plugin: bnx2fc
2019-04-18T07:46:41Z jumpstart[65945]: executing start plugin: software-iscsi
2019-04-18T07:46:41Z jumpstart[65970]: iScsi: No iBFT data present in the BIOS
2019-04-18T07:46:41Z iscsid: Notice: iSCSI Database already at latest schema. (Upgrade Skipped).
2019-04-18T07:46:41Z iscsid: iSCSI MASTER Database opened. (0x507b008)
2019-04-18T07:46:41Z iscsid: LogLevel = 0
2019-04-18T07:46:41Z iscsid: LogSync  = 0
2019-04-18T07:46:41Z iscsid: memory (180) MB successfully reserved for 1024 sessions
2019-04-18T07:46:41Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e64) idx (0) size (460808)
2019-04-18T07:46:41Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e62) idx (1) size (460808)
2019-04-18T07:46:41Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e60) idx (2) size (460808)
2019-04-18T07:46:41Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e5e) idx (3) size (460808)
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:46:42Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:46:42Z jumpstart[65945]: executing start plugin: fcoe-config
2019-04-18T07:46:42Z jumpstart[65945]: executing start plugin: storage-path-claim
2019-04-18T07:46:46Z jumpstart[65970]: StorageInfo: Number of paths 3
2019-04-18T07:46:50Z jumpstart[65970]: StorageInfo: Number of devices 3
2019-04-18T07:46:50Z jumpstart[65970]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T07:48:56Z mark: storage-path-claim-completed
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: gss
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: mount-filesystems
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: restore-paths
2019-04-18T07:46:50Z jumpstart[65970]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: filesystem-drivers
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: rpc
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: dump-partition
2019-04-18T07:46:50Z jumpstart[65970]: execution of 'system coredump partition set --enable=true --smart' failed : Unable to smart activate a dump partition.  Error was: No suitable diagnostic partitions found..  
2019-04-18T07:46:50Z jumpstart[65970]: 2019-04-18T07:46:50Z jumpstart[65945]: Executor failed executing esxcli command system coredump partition set --enable=true --smart
2019-04-18T07:46:50Z jumpstart[65945]: Method invocation failed: dump-partition->start() failed: error while executing the cli
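The failed dump-partition step fits the same picture: there is no suitable diagnostic partition because the boot disk is not visible. A read-only cross-check:

# Should list at least one usable coredump partition on a healthy host
esxcli system coredump partition list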
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: vsan-devel
2019-04-18T07:46:50Z jumpstart[66431]: VsanDevel: DevelBootDelay: 0
2019-04-18T07:46:50Z jumpstart[66431]: VsanDevel: DevelWipeConfigOnBoot: 0
2019-04-18T07:46:50Z jumpstart[66431]: VsanDevel: DevelTagSSD: Starting
2019-04-18T07:46:50Z jumpstart[66431]: 66431:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T07:46:50Z jumpstart[66431]: 66431:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T07:46:50Z jumpstart[66431]: 66431:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T07:46:50Z jumpstart[66431]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:46:50Z jumpstart[66431]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:46:50Z jumpstart[66431]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:46:50Z jumpstart[66431]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T07:46:50Z jumpstart[66431]: lib/ssl: protocol list tls1.2
2019-04-18T07:46:50Z jumpstart[66431]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T07:46:50Z jumpstart[66431]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T07:46:50Z jumpstart[66431]: VsanDevel: DevelTagSSD: Done.
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: vmfs
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: ufs
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: vfat
2019-04-18T07:46:50Z jumpstart[65945]: executing start plugin: nfsgssd
2019-04-18T07:46:51Z watchdog-nfsgssd: [66634] Begin '/usr/lib/vmware/nfs/bin/nfsgssd -f -a', min-uptime = 60, max-quick-failures = 128, max-total-failures = 65536, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:46:51Z watchdog-nfsgssd: Executing '/usr/lib/vmware/nfs/bin/nfsgssd -f -a'  
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: vsan
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: krb5
2019-04-18T07:46:51Z nfsgssd[66643]: Could not expand environment variable HOME.
2019-04-18T07:46:51Z nfsgssd[66643]: Could not expand environment variable HOME.
2019-04-18T07:46:51Z nfsgssd[66643]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:46:51Z nfsgssd[66643]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:46:51Z nfsgssd[66643]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:46:51Z nfsgssd[66643]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T07:46:51Z nfsgssd[66643]: lib/ssl: protocol list tls1.2
2019-04-18T07:46:51Z nfsgssd[66643]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T07:46:51Z nfsgssd[66643]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T07:46:51Z nfsgssd[66643]: Empty epoch file
2019-04-18T07:46:51Z nfsgssd[66643]: Starting with epoch 1
2019-04-18T07:46:51Z nfsgssd[66643]: Connected to SunRPCGSS version 1.0
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: etc-hosts
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: nfs
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: nfs41
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: mount-disk-fs
2019-04-18T07:46:51Z jumpstart[65970]: VmFileSystem: Automounted volume 5a6f6646-d13e2d89-fd8d-b499babd4e5e
2019-04-18T07:46:51Z jumpstart[65970]: VmFileSystem: Automounted volume 5ab363c3-c36e8e9f-8cfc-b499babd4e5e
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: auto-configure-pmem
2019-04-18T07:46:51Z jumpstart[65945]: executing start plugin: restore-nfs-volumes
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:46:51Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:46:51Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:46:51Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:46:51Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:46:51Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:46:51Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:46:51Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:46:51Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:46:51Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:47:53Z jumpstart[65970]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "NFS-FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T07:47:53Z jumpstart[65970]: 2019-04-18T07:47:53Z jumpstart[65970]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T07:47:53Z jumpstart[65970]: 2019-04-18T07:47:53Z jumpstart[65945]: Executor failed executing esxcli command boot storage restore --nfs-volumes
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:47:53Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:47:53Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:47:53Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:47:53Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:47:53Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:47:53Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:47:53Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:47:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:47:53Z jumpstart[65945]: Method invocation failed: restore-nfs-volumes->start() failed: error while executing the cli
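Here both NFS datastores (NFS-FreeNAS and FreeNAS) fail to remount during boot, which would explain why VMs stored there turn up as orphaned in the inventory. Once the NAS is reachable again, mounts and registrations can also be redone from the CLI instead of clicking through the UI; a sketch, where hostname, share and .vmx path are placeholders:

# Current NFS mounts
esxcli storage nfs list

# Re-add a lost mount (host and share are placeholders)
esxcli storage nfs add -H freenas.example.local -s /mnt/tank/vmstore -v NFS-FreeNAS

# Re-register a VM by its .vmx path; prints the new VM id
vim-cmd solo/registervm /vmfs/volumes/NFS-FreeNAS/VM1/VM1.vmx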
2019-04-18T07:47:53Z jumpstart[65945]: executing start plugin: auto-configure-storage
2019-04-18T07:47:53Z jumpstart[65945]: executing start plugin: restore-bootbanks
2019-04-18T07:47:53Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T07:47:56Z jumpstart[65970]: VmkCtl: Boot device not available, waited 3 seconds
[... the same pair of lines repeats every three seconds while the host polls for its boot device ...]
2019-04-18T07:48:50Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T07:48:53Z jumpstart[65970]: VmkCtl: Boot device not available, waited 60 seconds
2019-04-18T07:48:53Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T07:48:53Z jumpstart[65970]: VmkCtl: Did not find a valid boot device, symlinking /bootbank to /tmp
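This is the smoking gun: ESXi waits a full 60 seconds for its boot device, never finds it, and then symlinks /bootbank to /tmp. /tmp is a ramdisk, so from that point on every configuration change (VM registrations, vSwitches, the periodic state.tgz backup) lands in volatile memory and is gone after the next reboot, which is exactly the symptom. On a G7 that typically means the SD card/USB stick (or whatever medium ESXi was installed on) has died or become unreadable. A quick sanity check from the shell, assuming SSH/ESXi Shell is enabled:

# on a healthy host /bootbank points at a VFAT volume, not /tmp
ls -ld /bootbank /altbootbank

# which device ESXi thinks it booted from
esxcli system boot device get

# the small VFAT boot volumes should be listed here; if not,
# the boot medium itself is gone
esxcli storage filesystem list | grep -i vfat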
2019-04-18T07:48:53Z jumpstart[65945]: executing start plugin: restore-host-cache
2019-04-18T07:48:53Z jumpstart[65970]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T07:48:53Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:48:53Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:48:53Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:53Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:48:53Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:48:53Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:53Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:48:53Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:48:53Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T07:48:53Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T07:48:53Z jumpstart[65970]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T07:48:53Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T07:48:53Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55099523072
2019-04-18T07:48:54Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5ab363c4-26d208a0-fab7-b499babd4e5e total blocks 2500207837184 used blocks 2400621428736
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
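On the plus side: both VMFS datastores (roughly 136 GB and 2.3 TB according to the SlowRefresh lines) mount cleanly and report plausible usage, so the Smart Array volumes and the VM files themselves look fine; only the boot medium is affected. A per-volume check, using the UUIDs straight from the log above:

# print filesystem attributes for each mounted VMFS volume
vmkfstools -P /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e
vmkfstools -P /vmfs/volumes/5ab363c4-26d208a0-fab7-b499babd4e5e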
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vflash
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: dump-file
2019-04-18T07:48:54Z jumpstart[65970]: VmkCtl: Diagnostic File found; not auto creating
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
2019-04-18T07:48:54Z jumpstart[65970]: VmkCtl: Dump file determined to be large enough, size: 1588592640 (recommended minimum: 1588592640)
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vmci
2019-04-18T07:48:54Z jumpstart[65970]: execution of 'system module load --module vmci' failed : Unable to load module /usr/lib/vmware/vmkmod/vmci: Busy  
2019-04-18T07:48:54Z jumpstart[65945]: Executor failed executing esxcli command system module load --module vmci
2019-04-18T07:48:54Z jumpstart[65945]: Method invocation failed: vmci->start() failed: error while executing the cli
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: configure-locker
2019-04-18T07:48:54Z jumpstart[66687]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /scratch
2019-04-18T07:48:54Z jumpstart[66687]: Using /locker/packages/6.5.0/ as /productLocker
2019-04-18T07:48:54Z jumpstart[66687]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /locker
2019-04-18T07:48:54Z jumpstart[66687]: Using policy dir /etc/vmware/secpolicy
2019-04-18T07:48:54Z jumpstart[66687]: Parsed all objects
2019-04-18T07:48:54Z jumpstart[66687]: Objects defined and obsolete objects removed
2019-04-18T07:48:54Z jumpstart[66687]: Parsed all domain names
2019-04-18T07:48:54Z jumpstart[66687]: Domain policies parsed and syntax validated
2019-04-18T07:48:54Z jumpstart[66687]: Constraints check for domain policies succeeded
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /usr/share/nvidia
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /productLocker
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /.vmware
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/vsansparse
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/cbt
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/svm
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/upit
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/vsan
2019-04-18T07:48:54Z jumpstart[66687]: Getting realpath failed: /dev/vvol
2019-04-18T07:48:54Z jumpstart[66687]: Domain policies set
2019-04-18T07:48:54Z jumpstart[66687]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T07:48:54Z jumpstart[66687]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T07:48:54Z jumpstart[66687]: Parsed all the tardisk policy files
2019-04-18T07:48:54Z jumpstart[66687]: Set all the tardisk labels and policy
2019-04-18T07:48:54Z jumpstart[66687]: Parsed all file label mappings
2019-04-18T07:48:54Z jumpstart[66687]: Set all file labels
2019-04-18T07:48:54Z jumpstart[66687]: System security policy has been set successfully
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: restore-system-swap
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: cbrc
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: tpm
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: apei
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: restore-security-policies
2019-04-18T07:48:54Z jumpstart[66693]: Using policy dir /etc/vmware/secpolicy
2019-04-18T07:48:54Z jumpstart[66693]: Parsed all objects
2019-04-18T07:48:54Z jumpstart[66693]: Objects defined and obsolete objects removed
2019-04-18T07:48:54Z jumpstart[66693]: Parsed all domain names
2019-04-18T07:48:54Z jumpstart[66693]: Domain policies parsed and syntax validated
2019-04-18T07:48:54Z jumpstart[66693]: Constraints check for domain policies succeeded
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /usr/share/nvidia
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /productLocker
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /.vmware
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/vsansparse
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/cbt
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/svm
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/upit
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/vsan
2019-04-18T07:48:54Z jumpstart[66693]: Getting realpath failed: /dev/vvol
2019-04-18T07:48:54Z jumpstart[66693]: Domain policies set
2019-04-18T07:48:54Z jumpstart[66693]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T07:48:54Z jumpstart[66693]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T07:48:54Z jumpstart[66693]: Parsed all the tardisk policy files
2019-04-18T07:48:54Z jumpstart[66693]: Set all the tardisk labels and policy
2019-04-18T07:48:54Z jumpstart[66693]: Parsed all file label mappings
2019-04-18T07:48:54Z jumpstart[66693]: Set all file labels
2019-04-18T07:48:54Z jumpstart[66693]: System security policy has been set successfully
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: oem-modules
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: crond
2019-04-18T07:48:54Z crond[66697]: crond: crond (busybox 1.22.1) started, log level 8
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: restore-resource-groups
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: procMisc
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: rdma-vmkapi-compatibility
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: ipmi
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: restore-keymap
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: nmp-vmkapi-compatibility
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: iscsi-vmkapi-compatibility
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: ftcpt
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: hbr
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: autodeploy-setpassword
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: inetd
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vrdma
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: tag-boot-bank
2019-04-18T07:48:54Z jumpstart[66784]: unable to open boot configuration: No such file or directory
2019-04-18T07:48:54Z jumpstart[65945]: Method invocation failed: tag-boot-bank->start() failed: exited with code 1
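Same root cause again: tag-boot-bank cannot open a boot configuration because the stand-in /bootbank (the /tmp symlink from above) is empty. On an intact installation this would show the kernel and module list:

cat /bootbank/boot.cfg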
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: system-image-cache
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: iofilters
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vit
2019-04-18T07:48:54Z jumpstart[65970]: Parser: Initializing VIT parser lib
2019-04-18T07:48:54Z jumpstart[65970]: VsanIscsiTargetImpl: The host is not in a Virtual SAN cluster.
2019-04-18T07:48:54Z jumpstart[65970]: Util: Retrieved vit status successfully
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vmotion
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: vfc
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: balloonVMCI
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: coredump-configuration
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR       0x3a =                0x5
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x485 =            0x401e7
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x486 =         0x80000021
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x487 =         0xffffffff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x488 =             0x2000
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x489 =            0x267ff
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48a =               0x2a
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR      0x491 =                  0
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR 0xc0010114 =                  0
2019-04-18T07:48:54Z jumpstart[65970]: Common: MSR       0xce =      0xc0004011503
[... the identical MSR dump (0x3a through 0xce) is printed twice more ...]
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:54Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:54Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:48:54Z jumpstart[65970]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
[... the same ObjLib plugin-load block, including the "Probably unmounted volume" line, repeats once more ...]
2019-04-18T07:48:54Z jumpstart[65970]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T07:48:54Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:54Z jumpstart[65970]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T07:48:54Z jumpstart[65970]: ObjLibPluginInit: Initialized plugin
2019-04-18T07:48:54Z jumpstart[65970]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: Objlib initialized.
2019-04-18T07:48:54Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T07:48:54Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T07:48:54Z jumpstart[65970]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T07:48:54Z jumpstart[65970]: GetTypedFileSystems: fstype vfat
2019-04-18T07:48:54Z jumpstart[65970]: GetTypedFileSystems: fstype ufs
2019-04-18T07:48:54Z jumpstart[65970]: GetTypedFileSystems: fstype vvol
2019-04-18T07:48:54Z jumpstart[65970]: GetTypedFileSystems: fstype vsan
2019-04-18T07:48:54Z jumpstart[65970]: GetTypedFileSystems: fstype PMEM
2019-04-18T07:48:54Z jumpstart[65970]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55099523072
2019-04-18T07:48:54Z jumpstart[65970]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T07:48:54Z jumpstart[65945]: executing start plugin: set-acceptance-level
2019-04-18T07:48:55Z jumpstart[65945]: executing start plugin: scratch-storage
2019-04-18T07:48:56Z jumpstart[65945]: executing start plugin: pingback
2019-04-18T07:48:56Z jumpstart[65945]: executing start plugin: vmswapcleanup
2019-04-18T07:48:56Z jumpstart[65970]: execution of '--plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup' failed : Host Local Swap Location has not been enabled  
2019-04-18T07:48:56Z jumpstart[65945]: Executor failed executing esxcli command --plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup
2019-04-18T07:48:56Z jumpstart[65945]: Method invocation failed: vmswapcleanup->start() failed: error while executing the cli
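This vmswapcleanup failure, by contrast, looks like harmless noise: the cleanup bails out simply because host-local swap was never enabled on this box. If in doubt, the current setting can be read with (ESXi 6.5 esxcli namespace, as far as I know):

esxcli sched swap system get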
2019-04-18T07:48:56Z jumpstart[65970]: Jumpstart executor signalled to stop
2019-04-18T07:48:56Z jumpstart[65945]: Executor has been Successfully Stopped
2019-04-18T07:48:56Z init: starting pid 66824, tty '': '/usr/lib/vmware/firstboot/bin/firstboot.py ++group=host/vim/vmvisor/boot -l'  
2019-04-18T07:48:56Z init: starting pid 66825, tty '': '/bin/services.sh start'  
2019-04-18T07:48:57Z jumpstart[66878]: executing start plugin: ESXShell
2019-04-18T07:48:57Z addVob[66884]: Could not expand environment variable HOME.
2019-04-18T07:48:57Z addVob[66884]: Could not expand environment variable HOME.
2019-04-18T07:48:57Z addVob[66884]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:48:57Z addVob[66884]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:48:57Z addVob[66884]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:48:57Z jumpstart[66878]: executing start plugin: DCUI
2019-04-18T07:48:57Z root: DCUI Enabling DCUI login: runlevel =
2019-04-18T07:48:57Z addVob[66899]: Could not expand environment variable HOME.
2019-04-18T07:48:57Z addVob[66899]: Could not expand environment variable HOME.
2019-04-18T07:48:57Z addVob[66899]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:48:57Z addVob[66899]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:48:57Z addVob[66899]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:48:57Z jumpstart[66878]: executing start plugin: ntpd
2019-04-18T07:48:57Z root: ntpd Starting ntpd
2019-04-18T07:48:57Z sntp[66904]: sntp 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:44 UTC 2016 (1)
2019-04-18T07:48:57Z sntp[66904]: 2019-04-18 07:48:57.762032 (+0000) +1.12300 +/- 0.748832 pool.ntp.org 134.34.3.18 s1 no-leap
2019-04-18T07:48:58Z watchdog-ntpd: [66911] Begin '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift', min-uptime = 60, max-quick-failures = 5, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:48:58Z watchdog-ntpd: Executing '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift'  
2019-04-18T07:48:58Z ntpd[66921]: ntpd 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:59 UTC 2016 (1): Starting
2019-04-18T07:48:58Z ntpd[66921]: Command line: /sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift
2019-04-18T07:48:58Z ntpd[66921]: proto: precision = 0.467 usec (-21)
2019-04-18T07:48:58Z ntpd[66921]: restrict default: KOD does nothing without LIMITED.
2019-04-18T07:48:58Z ntpd[66921]: Listen and drop on 0 v6wildcard [::]:123
2019-04-18T07:48:58Z ntpd[66921]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 2 lo0 127.0.0.1:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 3 vmk0 192.168.20.20:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 4 vmk1 192.168.55.60:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 5 lo0 [::1]:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 6 lo0 [fe80::1%1]:123
2019-04-18T07:48:58Z ntpd[66921]: Listen normally on 7 vmk1 [fe80::250:56ff:fe67:b2b0%3]:123
2019-04-18T07:49:00Z jumpstart[66878]: executing start plugin: SSH
2019-04-18T07:49:00Z addVob[66928]: Could not expand environment variable HOME.
2019-04-18T07:49:00Z addVob[66928]: Could not expand environment variable HOME.
2019-04-18T07:49:00Z addVob[66928]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:49:00Z addVob[66928]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:49:00Z addVob[66928]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:49:00Z jumpstart[66878]: executing start plugin: esxui
2019-04-18T07:49:00Z jumpstart[66878]: executing start plugin: iofilterd-vmwarevmcrypt
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: Could not expand environment variable HOME.
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: Could not expand environment variable HOME.
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:49:00Z iofilterd-vmwarevmcrypt[66957]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T07:49:01Z watchdog-iofiltervpd: [66970] Begin '/usr/lib/vmware/iofilter/bin/ioFilterVPServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:01Z watchdog-iofiltervpd: Executing '/usr/lib/vmware/iofilter/bin/ioFilterVPServer'  
2019-04-18T07:49:03Z jumpstart[66878]: executing start plugin: swapobjd
2019-04-18T07:49:03Z watchdog-swapobjd: [66998] Begin '/usr/lib/vmware/swapobj/bin/swapobjd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:03Z watchdog-swapobjd: Executing '/usr/lib/vmware/swapobj/bin/swapobjd'  
2019-04-18T07:49:03Z jumpstart[66878]: executing start plugin: usbarbitrator
2019-04-18T07:49:03Z usbarbitrator: evicting objects on USB from OC
2019-04-18T07:49:03Z usbarbitrator: unclaiming USB devices
2019-04-18T07:49:03Z usbarbitrator: rescanning to complete removal of USB devices
2019-04-18T07:49:04Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:49:04Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:49:04Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:49:04Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:49:04Z watchdog-usbarbitrator: [67036] Begin '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:04Z watchdog-usbarbitrator: Executing '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414'  
2019-04-18T07:49:04Z jumpstart[66878]: executing start plugin: iofilterd-spm
2019-04-18T07:49:04Z iofilterd-spm[67055]: Could not expand environment variable HOME.
2019-04-18T07:49:04Z iofilterd-spm[67055]: Could not expand environment variable HOME.
2019-04-18T07:49:04Z iofilterd-spm[67055]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:49:04Z iofilterd-spm[67055]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T07:49:04Z iofilterd-spm[67055]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T07:49:04Z iofilterd-spm[67055]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T07:49:04Z jumpstart[66878]: executing start plugin: sensord
2019-04-18T07:49:04Z usbarbitrator: Starting USB storage detach monitor
2019-04-18T07:49:04Z usbarbitrator: reservedHbas:
2019-04-18T07:49:04Z watchdog-sensord: [67094] Begin '/usr/lib/vmware/bin/sensord -l', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:04Z watchdog-sensord: Executing '/usr/lib/vmware/bin/sensord -l'  
2019-04-18T07:49:04Z jumpstart[66878]: executing start plugin: storageRM
2019-04-18T07:49:04Z watchdog-storageRM: [67114] Begin '/sbin/storageRM', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:04Z watchdog-storageRM: Executing '/sbin/storageRM'  
2019-04-18T07:49:05Z jumpstart[66878]: executing start plugin: hostd
2019-04-18T07:49:05Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T07:49:05Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T07:49:05Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T07:49:05Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T07:49:05Z hostd-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/hostd/config.xml" to file "/etc/vmware/hostd/config.xml" [force=False]   
2019-04-18T07:49:05Z hostd-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T07:49:05Z hostd-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T07:49:05Z usbarbitrator: Exiting USB storage detach monitor
2019-04-18T07:49:05Z create-statsstore[67135]: Initiating hostd statsstore ramdisk size (re)evaluation.
2019-04-18T07:49:05Z create-statsstore[67135]: Maximum number of virtual machines supported for powering-on 384. Maximum number of virtual machines supported for register 1536. Maximum number of resource pools 1000.
2019-04-18T07:49:05Z create-statsstore[67135]: Estimating statsstore ramdisk of size 803MB will be needed.
2019-04-18T07:49:05Z create-statsstore[67135]: Creating statsstore ramdisk mount point /var/lib/vmware/hostd/stats.
2019-04-18T07:49:05Z create-statsstore[67135]: Creating new statsstore ramdisk with 803MB.
2019-04-18T07:49:05Z watchdog-hostd: [67142] Begin 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:05Z watchdog-hostd: Executing 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml'  
2019-04-18T07:49:05Z jumpstart[66878]: executing start plugin: sdrsInjector
2019-04-18T07:49:05Z watchdog-sdrsInjector: [67161] Begin '/sbin/sdrsInjector', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:05Z watchdog-sdrsInjector: Executing '/sbin/sdrsInjector'  
2019-04-18T07:49:05Z jumpstart[66878]: executing start plugin: nfcd
2019-04-18T07:49:05Z watchdog-nfcd: [67182] Begin '/usr/lib/vmware/bin/nfcd ++group=nfcd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:05Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:05Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 1) 1  
2019-04-18T07:49:05Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:06Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 2) 1  
2019-04-18T07:49:06Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:06Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 3) 1  
2019-04-18T07:49:06Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:06Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 4) 1  
2019-04-18T07:49:06Z jumpstart[66878]: executing start plugin: vvold
2019-04-18T07:49:06Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:06Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 5) 1  
2019-04-18T07:49:06Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T07:49:06Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 6) 1  
2019-04-18T07:49:06Z watchdog-nfcd: End '/usr/lib/vmware/bin/nfcd ++group=nfcd', failure limit reached  
2019-04-18T07:49:06Z watchdog-vvold: [67302] Begin 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:06Z watchdog-vvold: Executing 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold'  
2019-04-18T07:49:06Z watchdog-vvold: Watchdog for vvold is now 67302
2019-04-18T07:49:06Z watchdog-vvold: Terminating watchdog process with PID 67302
2019-04-18T07:49:06Z watchdog-vvold: [67302] Signal received: exiting the watchdog
2019-04-18T07:49:07Z jumpstart[66878]: executing start plugin: rhttpproxy
2019-04-18T07:49:07Z rhttpproxy-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/rhttpproxy/config.xml" to file "/etc/vmware/rhttpproxy/config.xml" [force=False]   
2019-04-18T07:49:07Z rhttpproxy-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T07:49:07Z rhttpproxy-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T07:49:07Z watchdog-rhttpproxy: [67521] Begin 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:07Z watchdog-rhttpproxy: Executing 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml'  
2019-04-18T07:49:07Z jumpstart[66878]: executing start plugin: hostdCgiServer
2019-04-18T07:49:07Z watchdog-hostdCgiServer: [67546] Begin 'hostdCgiServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:07Z watchdog-hostdCgiServer: Executing 'hostdCgiServer'  
2019-04-18T07:49:08Z jumpstart[66878]: executing start plugin: lbtd
2019-04-18T07:49:08Z watchdog-net-lbt: [67572] Begin '/sbin/net-lbt ++min=0', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:08Z watchdog-net-lbt: Executing '/sbin/net-lbt ++min=0'  
2019-04-18T07:49:08Z PyVmomiServer: 2019-04-18 07:49:08,260 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:49:08Z jumpstart[66878]: executing start plugin: rabbitmqproxy
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: [67596] Begin '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', min-uptime = 60, max-quick-failures = 1, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 1) 0  
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 2) 0  
2019-04-18T07:49:08Z watchdog-rabbitmqproxy: End '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', failure limit reached  
2019-04-18T07:49:08Z jumpstart[66878]: executing start plugin: vmfstraced
2019-04-18T07:49:08Z vmfstracegd: VMFS Global Tracing is not enabled.
2019-04-18T07:49:09Z PyVmomiServer: 2019-04-18 07:49:09,141 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:49:09Z jumpstart[66878]: executing start plugin: slpd
2019-04-18T07:49:09Z root: slpd Starting slpd
2019-04-18T07:49:09Z root: slpd Generating registration file /etc/slp.reg
2019-04-18T07:49:09Z slpd[67684]: test - LOG_INFO
2019-04-18T07:49:09Z slpd[67684]: test - LOG_WARNING
2019-04-18T07:49:09Z slpd[67684]: test - LOG_ERROR
2019-04-18T07:49:09Z slpd[67684]: *** SLPD daemon version 1.0.0 started
2019-04-18T07:49:09Z slpd[67684]: Command line = /sbin/slpd
2019-04-18T07:49:09Z slpd[67684]: Using configuration file = /etc/slp.conf
2019-04-18T07:49:09Z slpd[67684]: Using registration file = /etc/slp.reg
2019-04-18T07:49:09Z slpd[67684]: Agent Interfaces = 192.168.20.20,192.168.55.60,fe80::250:56ff:fe67:b2b0%vmk1
2019-04-18T07:49:09Z slpd[67684]: Agent URL = service:service-agent://esxi-server.testlab.test
2019-04-18T07:49:09Z slpd[67685]: *** BEGIN SERVICES
2019-04-18T07:49:09Z jumpstart[66878]: executing start plugin: dcbd
2019-04-18T07:49:09Z watchdog-dcbd: [67693] Begin '/usr/sbin/dcbd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:09Z watchdog-dcbd: Executing '/usr/sbin/dcbd'  
2019-04-18T07:49:09Z dcbd: [info]     add_dcbx_ieee: device = default_cfg_attribs stype = 2
2019-04-18T07:49:09Z dcbd: [info]     add_ets_ieee: device = default_cfg_attribs
2019-04-18T07:49:09Z dcbd: [info]     add_pfc_ieee: device = default_cfg_attribs
2019-04-18T07:49:09Z dcbd: [info]     add_app_ieee: device = default_cfg_attribs subtype = 0
2019-04-18T07:49:09Z dcbd: [info]     Main loop running.
2019-04-18T07:49:09Z jumpstart[66878]: executing start plugin: nscd
2019-04-18T07:49:09Z watchdog-nscd: [67711] Begin '/usr/lib/vmware/nscd/bin/nscd -d', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:09Z watchdog-nscd: Executing '/usr/lib/vmware/nscd/bin/nscd -d'  
2019-04-18T07:49:09Z jumpstart[66878]: executing start plugin: cdp
2019-04-18T07:49:09Z watchdog-cdp: [67733] Begin '/usr/sbin/net-cdp', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:09Z watchdog-cdp: Executing '/usr/sbin/net-cdp'  
2019-04-18T07:49:09Z jumpstart[66878]: executing start plugin: lacp
2019-04-18T07:49:10Z jumpstart[66878]: executing start plugin: smartd
2019-04-18T07:49:10Z watchdog-smartd: [67754] Begin '/usr/sbin/smartd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:10Z watchdog-smartd: Executing '/usr/sbin/smartd'  
2019-04-18T07:49:10Z smartd: [warn] smartd starts to run with interval 30 minutes
2019-04-18T07:49:10Z jumpstart[66878]: executing start plugin: memscrubd
2019-04-18T07:49:10Z jumpstart[66878]: executing start plugin: vpxa
2019-04-18T07:49:10Z watchdog-vpxa: [67782] Begin '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:10Z watchdog-vpxa: Executing '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa'  
2019-04-18T07:49:10Z jumpstart[66878]: executing start plugin: lwsmd
2019-04-18T07:49:11Z watchdog-lwsmd: [67823] Begin '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:11Z watchdog-lwsmd: Executing '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog'  
2019-04-18T07:49:11Z lwsmd: Logging started
2019-04-18T07:49:11Z lwsmd: Likewise Service Manager starting up
2019-04-18T07:49:11Z lwsmd: Starting service: lwreg
2019-04-18T07:49:11Z lwsmd: [lwreg-ipc] Listening on endpoint /etc/likewise/lib/.regsd
2019-04-18T07:49:11Z lwsmd: [lwreg-ipc] Listener started
2019-04-18T07:49:11Z lwsmd: [lwsm-ipc] Listening on endpoint /etc/likewise/lib/.lwsm
2019-04-18T07:49:11Z lwsmd: [lwsm-ipc] Listener started
2019-04-18T07:49:11Z lwsmd: Likewise Service Manager startup complete
2019-04-18T07:49:12Z lwsmd: Starting service: netlogon
2019-04-18T07:49:12Z lwsmd: [netlogon-ipc] Listening on endpoint /etc/likewise/lib/.netlogond
2019-04-18T07:49:12Z lwsmd: [netlogon-ipc] Listener started
2019-04-18T07:49:12Z lwsmd: Starting service: lwio
2019-04-18T07:49:12Z lwsmd: [lwio-ipc] Listening on endpoint /etc/likewise/lib/.lwiod
2019-04-18T07:49:12Z lwsmd: [lwio-ipc] Listener started
2019-04-18T07:49:12Z lwsmd: Starting service: rdr
2019-04-18T07:49:12Z lwsmd: Starting service: lsass
2019-04-18T07:49:12Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.ntlmd
2019-04-18T07:49:12Z lwsmd: [lsass-ipc] Listener started
2019-04-18T07:49:12Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so'  
2019-04-18T07:49:12Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so: cannot open shared object file: No such file or directory
2019-04-18T07:49:12Z lwsmd: [lsass] Failed to load provider 'lsa-vmdir-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T07:49:12Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so'  
2019-04-18T07:49:12Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so: cannot open shared object file: No such file or directory
2019-04-18T07:49:12Z lwsmd: [lsass] Failed to load provider 'lsa-local-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T07:49:12Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.lsassd
2019-04-18T07:49:12Z lwsmd: [lsass-ipc] Listener started
2019-04-18T07:49:12Z lwsmd: [lsass] The in-memory cache file does not exist yet
2019-04-18T07:49:12Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 0  
2019-04-18T07:49:12Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.TESTLAB.TEST' failed with errno 0, h_errno = 1  
2019-04-18T07:49:12Z lwsmd: [lsass] Domain 'Testlab.test' is now offline  
2019-04-18T07:49:12Z lwsmd: [lsass] Machine Password Sync Thread starting
2019-04-18T07:49:13Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:49:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:49:13Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
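The lsass/netlogon block is the stale-domain leftover mentioned earlier: the host still tries to find a DC for Testlab.test, the DNS SRV lookup fails, and the domain is marked offline. Annoying, but unrelated to the lost registrations (the missing liblsass_auth_provider_*.so messages show up on plenty of standalone 6.5 hosts too). If the host is still joined to the dead test domain, leaving it should stop the spam; a sketch using the Likewise CLI that ESXi ships, if your build includes it:

/usr/lib/vmware/likewise/bin/domainjoin-cli query
/usr/lib/vmware/likewise/bin/domainjoin-cli leave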
2019-04-18T07:49:13Z jumpstart[66878]: executing start plugin: vit_loader.sh
2019-04-18T07:49:13Z VITLOADER: [etc/init.d/vit_loader] Start vit loader
2019-04-18T07:49:13Z jumpstart[66878]: executing start plugin: hpe-smx.init
2019-04-18T07:49:13Z root: /etc/init.d/hpe-smx.init: Collecting PCI info...
2019-04-18T07:49:14Z root: /etc/init.d/hpe-smx.init: getipmibtaddress returns 254. No IPMI driver reload
2019-04-18T07:49:14Z root: /etc/init.d/hpe-smx.init: Done.
2019-04-18T07:49:14Z jumpstart[66878]: executing start plugin: hpe-nmi.init
2019-04-18T07:49:14Z root: hpe-nmi.init: Supported Server detected.  Loading NMI kernel module...
2019-04-18T07:49:14Z root: hpe-nmi.init:  Done.
2019-04-18T07:49:14Z jumpstart[66878]: executing start plugin: hpe-fc.sh
2019-04-18T07:49:14Z root: hpe-fc init script: Generating hba config file...
2019-04-18T07:49:15Z jumpstart[66878]: executing start plugin: sfcbd-watchdog
2019-04-18T07:49:15Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T07:49:15Z sfcbd-init: Exclusive access granted.
2019-04-18T07:49:15Z sfcbd-init: Request to start sfcbd-watchdog, pid 68019
2019-04-18T07:49:15Z sfcbd-config[68029]: Configuration not changed, already enabled
2019-04-18T07:49:15Z sfcbd-config[68035]: new install or upgrade previously completed, no changes made at version 0.0.0
2019-04-18T07:49:15Z sfcbd-config[68035]: file /etc/sfcb/sfcb.cfg update completed.
2019-04-18T07:49:16Z sfcbd-init: snmp has not been enabled.
2019-04-18T07:49:16Z sfcbd-init: starting sfcbd
2019-04-18T07:49:16Z sfcbd-init: Waiting for sfcb to start up.
2019-04-18T07:49:16Z amnesiac[68058]: 3 of 4. Testing Log Levels - LOG_WARNING
2019-04-18T07:49:16Z amnesiac[68058]: 4 of 4. Testing Log Levels - LOG_ERR
2019-04-18T07:49:16Z sfcbd-init: Program started normally.
2019-04-18T07:49:16Z jumpstart[66878]: executing start plugin: wsman
2019-04-18T07:49:16Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T07:49:16Z openwsmand: Exclusive access granted.
2019-04-18T07:49:16Z openwsmand: Starting openwsmand
2019-04-18T07:49:16Z watchdog-openwsmand: [68095] Begin '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process', min-uptime = 60, max-quick-failures = 5, max-total-failures = 10, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T07:49:16Z watchdog-openwsmand: Executing '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process'  
2019-04-18T07:49:16Z : dlopen /usr/lib/libticket.so.0 failed, error: /usr/lib/libticket.so.0: cannot open shared object file: No such file or directory, exiting. 0 Success
2019-04-18T07:49:16Z : [wrn][68105:/build/mts/release/bora-4152810/cayman_openwsman/openwsman/src/src/server/wsmand.c:320:main] nsswitch.conf successfully stat'ed  
2019-04-18T07:49:16Z jumpstart[66878]: executing start plugin: snmpd
2019-04-18T07:49:16Z root: Starting snmpd
2019-04-18T07:49:16Z root: snmpd has not been enabled.
2019-04-18T07:49:17Z jumpstart[66878]: Jumpstart failed to start: snmpd reason: Execution of command: /etc/init.d/snmpd start failed with status: 1
2019-04-18T07:49:17Z jumpstart[66878]: executing start plugin: xorg
2019-04-18T07:49:17Z jumpstart[66878]: executing start plugin: vmtoolsd
2019-04-18T07:49:17Z jumpstart[66878]: executing start plugin: hp-ams.sh
2019-04-18T07:49:17Z amshelper: Wrapper constructing internal library
2019-04-18T07:49:17Z amshelper[68126]: ams ver 10.6.0-24: Running check for supported server...
2019-04-18T07:49:17Z amshelper[68126]: Wrapper Destructing internal library
2019-04-18T07:49:17Z root: [ams] Agentless Management Service is not supported on this server.
2019-04-18T07:49:17Z jumpstart[66878]: Jumpstart failed to start: hp-ams.sh reason: Execution of command: /etc/init.d/hp-ams.sh start failed with status: 1
2019-04-18T07:49:17Z init: starting pid 68128, tty '': '/bin/apply-host-profiles'  
2019-04-18T07:49:17Z init: starting pid 68129, tty '': '/usr/lib/vmware/secureboot/bin/secureBoot.py ++group=host/vim/vmvisor/boot -a'  
2019-04-18T07:49:18Z backup.sh.68148: Locking esx.conf
2019-04-18T07:49:19Z init: starting pid 68288, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh boot'  
2019-04-18T07:49:19Z backup.sh.68148: Creating archive
2019-04-18T07:49:19Z addVob[68291]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T07:49:19Z addVob[68291]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T07:49:19Z addVob[68291]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T07:49:19Z backup.sh.68148: Unlocking esx.conf
2019-04-18T07:49:19Z init: starting pid 68303, tty '': '/bin/vmdumper -g 'Boot Successful''  
2019-04-18T07:49:19Z init: starting pid 68305, tty '': '/bin/sh ++min=0,group=host/vim/vimuser/terminal/shell /etc/rc.local'  
2019-04-18T07:49:19Z root: init Running kickstart.py
2019-04-18T07:49:19Z root: init Running local.sh
2019-04-18T07:49:19Z init: starting pid 68339, tty '': '/bin/esxcfg-init --set-boot-progress done'  
2019-04-18T07:49:19Z init: starting pid 68340, tty '': '/bin/vmware-autostart.sh start'  
2019-04-18T07:49:19Z VMware[startup]: Starting VMs
2019-04-18T07:49:19Z init: starting pid 68343, tty '/dev/tty1': '/bin/initterm.sh tty1 /bin/techsupport.sh'  
2019-04-18T07:49:19Z init: starting pid 68344, tty '/dev/tty2': '-/bin/initterm.sh tty2 /bin/dcuiweasel'  
2019-04-18T07:49:20Z DCUI: Starting DCUI
2019-04-18T07:49:21Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T07:49:21Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T07:49:21Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
[... the same three DC-lookup lines repeat 13 more times at 07:49:21 and 4 more times at 07:49:23 ...]
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,144 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,145 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x8bc7ec09b0>}   
2019-04-18T07:49:24Z hostd-icm[68414]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,145 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,145 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-d5ea-442e</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-112543955</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T07:49:24Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,262 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,289 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,290 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x939555a9b0>}   
2019-04-18T07:49:24Z hostd-icm[68422]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,290 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,291 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-340b-4439</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-112543958</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T07:49:24Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T07:49:24Z ImageConfigManager: 2019-04-18 07:49:24,310 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
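
This is the first real red flag: hostd's BootBankInstaller cannot load /bootbank/boot.cfg at all. If the bootbank is unreadable, ESXi runs from a fallback image in memory and has nowhere to persist its configuration - which would explain both the vanishing vSwitches and the unregistered VMs. A quick sanity check from the ESXi shell (paths as on a stock 6.5 install, so treat this as a sketch):

# Does either boot bank still contain a boot.cfg, and is the config archive there?
ls -l /bootbank/boot.cfg /altbootbank/boot.cfg /bootbank/state.tgz

# Which physical device backs each boot bank?
vmkfstools -P /bootbank
vmkfstools -P /altbootbank

# List all mounted filesystems - the boot banks normally show up as vfat
esxcli storage filesystem list
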
[... the same netlogon triplet repeats 8 times at 07:49:31 ...]
2019-04-18T07:49:40Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T07:49:40Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68383  
[... netlogon DC lookups continue, and the KRB5 'Cannot contact any KDC' / LW_ERROR_DOMAIN_IS_OFFLINE pair repeats for client pids 68503, 68511, 68519 and 68528 between 07:49:40 and 07:49:46 ...]
[... the same ImageConfigManager / hostd-icm block repeats at 07:49:47 and 07:49:48, again failing with: No such file or directory: '/bootbank/boot.cfg' ...]
2019-04-18T07:50:01Z crond[66697]: crond: USER root pid 68557 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:50:01Z syslog[68560]: starting hostd probing.
2019-04-18T07:50:12Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:50:12Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
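
So the host is still hunting for a domain controller of the old lab domain, and the SRV lookup for '_ldap._tcp.dc._msdcs.Testlab.test' fails - that is where all the netlogon/KRB5 noise comes from. Unrelated to the registration problem, but worth cleaning up. Assuming the default Likewise path on ESXi 6.x (sketch, adjust as needed):

# Which DNS servers is the host actually using?
cat /etc/resolv.conf

# Leave the stale AD domain so lwsmd stops looking for Testlab.test
/usr/lib/vmware/likewise/bin/domainjoin-cli leave
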
[... the netlogon triplet repeats 9 times between 07:50:13 and 07:50:15 ...]
[... three more KRB5 / LW_ERROR_DOMAIN_IS_OFFLINE pairs (client pids 68573, 68588, 68591) plus netlogon lookups, all at 07:50:32 ...]
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:51:10Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:51:10Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001c7ebd094ee9229fffb824
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T07:51:10Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T07:51:10Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001cc9f2bd5ae7909acd22b5
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T07:51:10Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
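
The smartd lines are expected noise, by the way: the naa.600508b1... devices are Smart Array logical volumes, not raw ATA disks, so the SATA SMART plugin skips them. If you want to query them anyway, esxcli can do that per device (IDs copied from the log above):

esxcli storage core device list
esxcli storage core device smart get -d naa.600508b1001c7ebd094ee9229fffb824
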
[... the lookup/KRB5 error pattern settles into a roughly 80-second cycle: client pids 68642, 68644, 68656, 68659, 68669 and 68672 between 07:51:32 and 07:54:29 ...]
2019-04-18T07:55:01Z crond[66697]: crond: USER root pid 68674 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T07:55:01Z syslog[68678]: starting hostd probing.
[... same pattern, client pids 68694 and 68696 between 07:55:29 and 07:55:48 ...]
2019-04-18T07:56:11Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T07:56:11Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
[... the cycle continues: client pids 68706, 68710, 68719, 68721, 68731 and 68734 between 07:56:48 and 07:59:45 ...]
2019-04-18T08:00:01Z crond[66697]: crond: USER root pid 68737 cmd /usr/lib/vmware/vmksummary/log-heartbeat.py
2019-04-18T08:00:01Z crond[66697]: crond: USER root pid 68738 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T08:00:01Z syslog[68742]: starting hostd probing.
[... another burst of netlogon lookups and two KRB5 / LW_ERROR_DOMAIN_IS_OFFLINE pairs (client pids 68755, 68777) between 08:00:32 and 08:00:51 ...]
[... the ImageConfigManager block fires twice more at 08:00:52, still with: No such file or directory: '/bootbank/boot.cfg' ...]
[... one last round of netlogon lookups and a KRB5 / LW_ERROR_DOMAIN_IS_OFFLINE pair (client pid 68807) at 08:00:52, then the host is shut down: ...]
2019-04-18T08:00:54Z init: starting pid 68813, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh stop'  
2019-04-18T08:00:54Z addVob[68815]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:00:54Z addVob[68815]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T08:00:54Z addVob[68815]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T08:00:54Z init: starting pid 68816, tty '': '/bin/shutdown.sh'  
2019-04-18T08:00:54Z VMware[shutdown]: Stopping VMs
2019-04-18T08:00:55Z jumpstart[68832]: executing stop for daemon hp-ams.sh.
2019-04-18T08:00:55Z root: ams stop watchdog...
2019-04-18T08:00:55Z root: ams-wd: ams-watchdog stop.
2019-04-18T08:00:55Z root: Terminating ams-watchdog process with PID 68840 68841
2019-04-18T08:00:57Z root: ams stop service...
2019-04-18T08:00:59Z jumpstart[68832]: executing stop for daemon xorg.
2019-04-18T08:00:59Z jumpstart[68832]: Jumpstart failed to stop: xorg reason: Execution of command: /etc/init.d/xorg stop failed with status: 3
2019-04-18T08:00:59Z jumpstart[68832]: executing stop for daemon vmsyslogd.
2019-04-18T08:00:59Z jumpstart[68832]: Jumpstart failed to stop: vmsyslogd reason: Execution of command: /etc/init.d/vmsyslogd stop failed with status: 1
2019-04-18T08:00:59Z jumpstart[68832]: executing stop for daemon vmtoolsd.
2019-04-18T08:00:59Z jumpstart[68832]: Jumpstart failed to stop: vmtoolsd reason: Execution of command: /etc/init.d/vmtoolsd stop failed with status: 1
2019-04-18T08:00:59Z jumpstart[68832]: executing stop for daemon wsman.
2019-04-18T08:00:59Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T08:00:59Z openwsmand: Exclusive access granted.
2019-04-18T08:00:59Z openwsmand: Stopping openwsmand
2019-04-18T08:01:00Z watchdog-openwsmand: Watchdog for openwsmand is now 68095
2019-04-18T08:01:00Z watchdog-openwsmand: Terminating watchdog process with PID 68095
2019-04-18T08:01:00Z watchdog-openwsmand: [68095] Signal received: exiting the watchdog
2019-04-18T08:01:00Z jumpstart[68832]: executing stop for daemon snmpd.
2019-04-18T08:01:00Z root: Stopping snmpd by administrative request
2019-04-18T08:01:00Z root: snmpd is not running.
2019-04-18T08:01:00Z jumpstart[68832]: executing stop for daemon sfcbd-watchdog.
2019-04-18T08:01:00Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T08:01:00Z sfcbd-init: Exclusive access granted.
2019-04-18T08:01:00Z sfcbd-init: Request to stop sfcbd-watchdog, pid 68905
2019-04-18T08:01:00Z sfcbd-init: Invoked kill 68058
2019-04-18T08:01:00Z sfcb-vmware_raw[68606]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T08:01:00Z sfcb-vmware_base[68598]: VICimProvider exiting on WFU cancelled.
2019-04-18T08:01:00Z sfcb-vmware_base[68598]: stopProcMICleanup: Cleanup t=1 not implemented for provider type: 8
2019-04-18T08:01:01Z crond[66697]: crond: USER root pid 68934 cmd /sbin/auto-backup.sh
2019-04-18T08:01:02Z backup.sh.68965: Locking esx.conf
2019-04-18T08:01:02Z backup.sh.68965: Creating archive
2019-04-18T08:01:02Z backup.sh.68965: Unlocking esx.conf
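Here cron runs /sbin/auto-backup.sh, which is the mechanism that is supposed to persist the running configuration: it packs esx.conf and friends into state.tgz on the boot bank. Worth keeping in mind for later - if the boot bank is gone, this archive has nowhere durable to land. A quick manual check (assuming the default layout):

# trigger a config backup by hand
/sbin/auto-backup.sh
# on a healthy host the fresh archive shows up here
ls -l /bootbank/state.tgz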
2019-04-18T08:01:03Z sfcbd-init: stop sfcbd process completed.
2019-04-18T08:01:03Z jumpstart[68832]: executing stop for daemon vit_loader.sh.
2019-04-18T08:01:04Z VITLOADER: [etc/init.d/vit_loader] Shutdown VITD successfully
2019-04-18T08:01:04Z jumpstart[68832]: executing stop for daemon hpe-smx.init.
2019-04-18T08:01:04Z jumpstart[68832]: executing stop for daemon hpe-nmi.init.
2019-04-18T08:01:04Z jumpstart[68832]: executing stop for daemon hpe-fc.sh.
2019-04-18T08:01:04Z jumpstart[68832]: executing stop for daemon lwsmd.
2019-04-18T08:01:04Z watchdog-lwsmd: Watchdog for lwsmd is now 67823
2019-04-18T08:01:04Z watchdog-lwsmd: Terminating watchdog process with PID 67823
2019-04-18T08:01:04Z watchdog-lwsmd: [67823] Signal received: exiting the watchdog
2019-04-18T08:01:04Z lwsmd: Shutting down running services
2019-04-18T08:01:04Z lwsmd: Stopping service: lsass
2019-04-18T08:01:04Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T08:01:04Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T08:01:04Z lwsmd: [lsass-ipc] Shutting down listener
2019-04-18T08:01:04Z lwsmd: [lsass-ipc] Listener shut down
2019-04-18T08:01:04Z lwsmd: [lsass] Machine Password Sync Thread stopping
2019-04-18T08:01:05Z lwsmd: [lsass] LSA Service exiting...
2019-04-18T08:01:05Z lwsmd: Stopping service: rdr
2019-04-18T08:01:05Z lwsmd: Stopping service: lwio
2019-04-18T08:01:05Z lwsmd: [lwio-ipc] Shutting down listener
2019-04-18T08:01:05Z lwsmd: [lwio-ipc] Listener shut down
2019-04-18T08:01:05Z lwsmd: [lwio] LWIO Service exiting...
2019-04-18T08:01:05Z lwsmd: Stopping service: netlogon
2019-04-18T08:01:05Z lwsmd: [netlogon-ipc] Shutting down listener
2019-04-18T08:01:05Z lwsmd: [netlogon-ipc] Listener shut down
2019-04-18T08:01:05Z lwsmd: [netlogon] LWNET Service exiting...
2019-04-18T08:01:05Z lwsmd: Stopping service: lwreg
2019-04-18T08:01:05Z lwsmd: [lwreg-ipc] Shutting down listener
2019-04-18T08:01:05Z lwsmd: [lwreg-ipc] Listener shut down
2019-04-18T08:01:05Z lwsmd: [lwreg] REG Service exiting...
2019-04-18T08:01:05Z lwsmd: [lwsm-ipc] Shutting down listener
2019-04-18T08:01:05Z lwsmd: [lwsm-ipc] Listener shut down
2019-04-18T08:01:05Z lwsmd: Logging stopped
2019-04-18T08:01:07Z jumpstart[68832]: executing stop for daemon vpxa.
2019-04-18T08:01:07Z watchdog-vpxa: Watchdog for vpxa is now 67782
2019-04-18T08:01:07Z watchdog-vpxa: Terminating watchdog process with PID 67782
2019-04-18T08:01:07Z watchdog-vpxa: [67782] Signal received: exiting the watchdog
2019-04-18T08:01:07Z jumpstart[68832]: executing stop for daemon vobd.
2019-04-18T08:01:07Z watchdog-vobd: Watchdog for vobd is now 65960
2019-04-18T08:01:07Z watchdog-vobd: Terminating watchdog process with PID 65960
2019-04-18T08:01:07Z watchdog-vobd: [65960] Signal received: exiting the watchdog
2019-04-18T08:01:07Z jumpstart[68832]: executing stop for daemon dcbd.
2019-04-18T08:01:07Z watchdog-dcbd: Watchdog for dcbd is now 67693
2019-04-18T08:01:07Z watchdog-dcbd: Terminating watchdog process with PID 67693
2019-04-18T08:01:07Z watchdog-dcbd: [67693] Signal received: exiting the watchdog
2019-04-18T08:01:07Z jumpstart[68832]: executing stop for daemon nscd.
2019-04-18T08:01:07Z watchdog-nscd: Watchdog for nscd is now 67711
2019-04-18T08:01:07Z watchdog-nscd: Terminating watchdog process with PID 67711
2019-04-18T08:01:07Z watchdog-nscd: [67711] Signal received: exiting the watchdog
2019-04-18T08:01:07Z jumpstart[68832]: executing stop for daemon cdp.
2019-04-18T08:01:08Z watchdog-cdp: Watchdog for cdp is now 67733
2019-04-18T08:01:08Z watchdog-cdp: Terminating watchdog process with PID 67733
2019-04-18T08:01:08Z watchdog-cdp: [67733] Signal received: exiting the watchdog
2019-04-18T08:01:08Z jumpstart[68832]: executing stop for daemon lacp.
2019-04-18T08:01:08Z watchdog-net-lacp: Watchdog for net-lacp is now 66323
2019-04-18T08:01:08Z watchdog-net-lacp: Terminating watchdog process with PID 66323
2019-04-18T08:01:08Z watchdog-net-lacp: [66323] Signal received: exiting the watchdog
2019-04-18T08:01:08Z jumpstart[68832]: executing stop for daemon smartd.
2019-04-18T08:01:08Z watchdog-smartd: Watchdog for smartd is now 67754
2019-04-18T08:01:08Z watchdog-smartd: Terminating watchdog process with PID 67754
2019-04-18T08:01:08Z watchdog-smartd: [67754] Signal received: exiting the watchdog
2019-04-18T08:01:08Z smartd: [warn] smartd received signal 15
2019-04-18T08:01:08Z smartd: [warn] smartd exit.
2019-04-18T08:01:08Z jumpstart[68832]: executing stop for daemon memscrubd.
2019-04-18T08:01:08Z jumpstart[68832]: Jumpstart failed to stop: memscrubd reason: Execution of command: /etc/init.d/memscrubd stop failed with status: 3
2019-04-18T08:01:08Z jumpstart[68832]: executing stop for daemon slpd.
2019-04-18T08:01:08Z root: slpd Stopping slpd
2019-04-18T08:01:08Z slpd[67685]: SLPD daemon shutting down
2019-04-18T08:01:08Z slpd[67685]: *** SLPD daemon shut down by administrative request
2019-04-18T08:01:08Z jumpstart[68832]: executing stop for daemon sensord.
2019-04-18T08:01:09Z watchdog-sensord: Watchdog for sensord is now 67094
2019-04-18T08:01:09Z watchdog-sensord: Terminating watchdog process with PID 67094
2019-04-18T08:01:09Z watchdog-sensord: [67094] Signal received: exiting the watchdog
2019-04-18T08:01:09Z jumpstart[68832]: executing stop for daemon storageRM.
2019-04-18T08:01:09Z watchdog-storageRM: Watchdog for storageRM is now 67114
2019-04-18T08:01:09Z watchdog-storageRM: Terminating watchdog process with PID 67114
2019-04-18T08:01:09Z watchdog-storageRM: [67114] Signal received: exiting the watchdog
2019-04-18T08:01:09Z jumpstart[68832]: executing stop for daemon hostd.
2019-04-18T08:01:09Z watchdog-hostd: Watchdog for hostd is now 67142
2019-04-18T08:01:09Z watchdog-hostd: Terminating watchdog process with PID 67142
2019-04-18T08:01:09Z watchdog-hostd: [67142] Signal received: exiting the watchdog
2019-04-18T08:01:09Z jumpstart[68832]: executing stop for daemon sdrsInjector.
2019-04-18T08:01:09Z watchdog-sdrsInjector: Watchdog for sdrsInjector is now 67161
2019-04-18T08:01:09Z watchdog-sdrsInjector: Terminating watchdog process with PID 67161
2019-04-18T08:01:09Z watchdog-sdrsInjector: [67161] Signal received: exiting the watchdog
2019-04-18T08:01:09Z jumpstart[68832]: executing stop for daemon nfcd.
2019-04-18T08:01:10Z jumpstart[68832]: executing stop for daemon vvold.
2019-04-18T08:01:10Z jumpstart[68832]: Jumpstart failed to stop: vvold reason: Execution of command: /etc/init.d/vvold stop failed with status: 3
2019-04-18T08:01:10Z jumpstart[68832]: executing stop for daemon rhttpproxy.
2019-04-18T08:01:10Z watchdog-rhttpproxy: Watchdog for rhttpproxy is now 67521
2019-04-18T08:01:10Z watchdog-rhttpproxy: Terminating watchdog process with PID 67521
2019-04-18T08:01:10Z watchdog-rhttpproxy: [67521] Signal received: exiting the watchdog
2019-04-18T08:01:10Z jumpstart[68832]: executing stop for daemon hostdCgiServer.
2019-04-18T08:01:10Z watchdog-hostdCgiServer: Watchdog for hostdCgiServer is now 67546
2019-04-18T08:01:10Z watchdog-hostdCgiServer: Terminating watchdog process with PID 67546
2019-04-18T08:01:10Z watchdog-hostdCgiServer: [67546] Signal received: exiting the watchdog
2019-04-18T08:01:10Z jumpstart[68832]: executing stop for daemon lbtd.
2019-04-18T08:01:10Z watchdog-net-lbt: Watchdog for net-lbt is now 67572
2019-04-18T08:01:10Z watchdog-net-lbt: Terminating watchdog process with PID 67572
2019-04-18T08:01:10Z watchdog-net-lbt: [67572] Signal received: exiting the watchdog
2019-04-18T08:01:11Z jumpstart[68832]: executing stop for daemon rabbitmqproxy.
2019-04-18T08:01:11Z jumpstart[68832]: executing stop for daemon vmfstraced.
2019-04-18T08:01:11Z watchdog-vmfstracegd: PID file /var/run/vmware/watchdog-vmfstracegd.PID does not exist
2019-04-18T08:01:11Z watchdog-vmfstracegd: Unable to terminate watchdog: No running watchdog process for vmfstracegd
2019-04-18T08:01:11Z vmfstracegd: Failed to clear vmfstracegd memory reservation
2019-04-18T08:01:11Z jumpstart[68832]: executing stop for daemon esxui.
2019-04-18T08:01:11Z jumpstart[68832]: executing stop for daemon iofilterd-vmwarevmcrypt.
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: Could not expand environment variable HOME.
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: Could not expand environment variable HOME.
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:01:11Z iofilterd-vmwarevmcrypt[69661]: Resource Pool clean up for iofilter vmwarevmcrypt is done
2019-04-18T08:01:12Z jumpstart[68832]: executing stop for daemon swapobjd.
2019-04-18T08:01:12Z watchdog-swapobjd: Watchdog for swapobjd is now 66998
2019-04-18T08:01:12Z watchdog-swapobjd: Terminating watchdog process with PID 66998
2019-04-18T08:01:12Z watchdog-swapobjd: [66998] Signal received: exiting the watchdog
2019-04-18T08:01:12Z jumpstart[68832]: executing stop for daemon usbarbitrator.
2019-04-18T08:01:12Z watchdog-usbarbitrator: Watchdog for usbarbitrator is now 67036
2019-04-18T08:01:12Z watchdog-usbarbitrator: Terminating watchdog process with PID 67036
2019-04-18T08:01:12Z watchdog-usbarbitrator: [67036] Signal received: exiting the watchdog
2019-04-18T08:01:12Z jumpstart[68832]: executing stop for daemon iofilterd-spm.
2019-04-18T08:01:12Z iofilterd-spm[69724]: Could not expand environment variable HOME.
2019-04-18T08:01:12Z iofilterd-spm[69724]: Could not expand environment variable HOME.
2019-04-18T08:01:12Z iofilterd-spm[69724]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:01:12Z iofilterd-spm[69724]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:01:12Z iofilterd-spm[69724]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:01:12Z iofilterd-spm[69724]: Resource Pool clean up for iofilter spm is done
2019-04-18T08:01:12Z jumpstart[68832]: executing stop for daemon ESXShell.
2019-04-18T08:01:12Z addVob[69731]: Could not expand environment variable HOME.
2019-04-18T08:01:12Z addVob[69731]: Could not expand environment variable HOME.
2019-04-18T08:01:12Z addVob[69731]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:01:12Z addVob[69731]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:01:12Z addVob[69731]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:01:12Z addVob[69731]: VobUserLib_Init failed with -1
2019-04-18T08:01:12Z doat: Stopped wait on component ESXShell.stop
2019-04-18T08:01:12Z doat: Stopped wait on component ESXShell.disable
2019-04-18T08:01:13Z jumpstart[68832]: executing stop for daemon DCUI.
2019-04-18T08:01:13Z root: DCUI Disabling DCUI logins
2019-04-18T08:01:13Z addVob[69752]: Could not expand environment variable HOME.
2019-04-18T08:01:13Z addVob[69752]: Could not expand environment variable HOME.
2019-04-18T08:01:13Z addVob[69752]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:01:13Z addVob[69752]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:01:13Z addVob[69752]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:01:13Z addVob[69752]: VobUserLib_Init failed with -1
2019-04-18T08:01:13Z jumpstart[68832]: executing stop for daemon ntpd.
2019-04-18T08:01:13Z root: ntpd Stopping ntpd
2019-04-18T08:01:13Z watchdog-ntpd: Watchdog for ntpd is now 66911
2019-04-18T08:01:13Z watchdog-ntpd: Terminating watchdog process with PID 66911
2019-04-18T08:01:13Z watchdog-ntpd: [66911] Signal received: exiting the watchdog
2019-04-18T08:01:13Z ntpd[66921]: ntpd exiting on signal 1 (Hangup)
2019-04-18T08:01:13Z ntpd[66921]: 134.34.3.18 local addr 192.168.20.20 -> <null>
2019-04-18T08:01:13Z jumpstart[68832]: executing stop for daemon SSH.
2019-04-18T08:01:13Z addVob[69784]: Could not expand environment variable HOME.
2019-04-18T08:01:13Z addVob[69784]: Could not expand environment variable HOME.
2019-04-18T08:01:13Z addVob[69784]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:01:13Z addVob[69784]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:01:13Z addVob[69784]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:01:13Z addVob[69784]: VobUserLib_Init failed with -1
2019-04-18T08:01:13Z doat: Stopped wait on component RemoteShell.disable
2019-04-18T08:01:13Z doat: Stopped wait on component RemoteShell.stop
2019-04-18T08:01:14Z backup.sh.69840: Locking esx.conf
2019-04-18T08:01:14Z backup.sh.69840: Creating archive
2019-04-18T08:01:14Z backup.sh.69840: Unlocking esx.conf
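The shutdown completes cleanly at 08:01:14 with one final config backup. Everything below is from the next power-on, roughly 20 minutes later.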
2019-04-18T08:21:53Z watchdog-vobd: [65960] Begin '/usr/lib/vmware/vob/bin/vobd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:21:53Z watchdog-vobd: Executing '/usr/lib/vmware/vob/bin/vobd'  
2019-04-18T08:21:53Z jumpstart[65945]: Launching Executor
2019-04-18T08:21:53Z jumpstart[65945]: Setting up Executor - Reset Requested
2019-04-18T08:21:53Z jumpstart[65945]: ignoring plugin 'vsan-upgrade' because version '2.0.0'  has already been run.  
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: check-required-memory
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: restore-configuration
2019-04-18T08:21:54Z jumpstart[65993]: restoring configuration
2019-04-18T08:21:54Z jumpstart[65993]: extracting from file /local.tgz
2019-04-18T08:21:54Z jumpstart[65993]: file etc/likewise/db/registry.db has been changed before restoring the configuration - the changes will be lost
2019-04-18T08:21:54Z jumpstart[65993]: ConfigCheck: Running ipv6 option upgrade, redundantly
2019-04-18T08:21:54Z jumpstart[65993]: Util: tcpip4 IPv6 enabled
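This is the restore side of the mechanism above: at boot ESXi unpacks its saved configuration from local.tgz (which sits inside state.tgz on the boot bank; esx.conf with the vSwitch config and vmInventory.xml with the VM registrations live in there). Anything configured after the last successful state.tgz write gets rolled back at exactly this point - which would explain the vanishing vSwitches and registrations. To see what is actually persisted (standard layout assumed):

# list the persisted configuration archive on the boot bank
tar -tzf /bootbank/state.tgz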
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: vmkeventd
2019-04-18T08:21:54Z watchdog-vmkeventd: [65995] Begin '/usr/lib/vmware/vmkeventd/bin/vmkeventd', min-uptime = 10, max-quick-failures = 5, max-total-failures = 9999999, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:21:54Z watchdog-vmkeventd: Executing '/usr/lib/vmware/vmkeventd/bin/vmkeventd'  
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: vmkcrypto
2019-04-18T08:21:54Z jumpstart[65973]: 65974:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T08:21:54Z jumpstart[65973]: 65974:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T08:21:54Z jumpstart[65973]: 65974:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: autodeploy-enabled
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: vsan-base
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: vsan-early
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: advanced-user-configuration-options
2019-04-18T08:21:54Z jumpstart[65945]: executing start plugin: restore-advanced-configuration
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: PSA-boot-config
2019-04-18T08:21:55Z jumpstart[65973]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:21:55Z jumpstart[65973]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T08:21:55Z jumpstart[65973]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T08:21:55Z jumpstart[65973]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T08:21:55Z jumpstart[65973]: lib/ssl: protocol list tls1.2
2019-04-18T08:21:55Z jumpstart[65973]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T08:21:55Z jumpstart[65973]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: vprobe
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: vmkapi-mgmt
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: dma-engine
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: procfs
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: mgmt-vmkapi-compatibility
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: iodm
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: vmkernel-vmkapi-compatibility
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: driver-status-check
2019-04-18T08:21:55Z jumpstart[66025]: driver_status_check: boot cmdline: /jumpstrt.gz vmbTrustedBoot=false tboot=0x101b000 installerDiskDumpSlotSize=2560 no-auto-partition bootUUID=e78269d0448c41fe200c24e8a54f93c1
2019-04-18T08:21:55Z jumpstart[66025]: driver_status_check: useropts:
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: hardware-config
2019-04-18T08:21:55Z jumpstart[66026]: Failed to symlink /etc/vmware/pci.ids: No such file or directory
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: vmklinux
2019-04-18T08:21:55Z jumpstart[65945]: executing start plugin: vmkdevmgr
2019-04-18T08:21:55Z jumpstart[66027]: Starting vmkdevmgr
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: register-vmw-mpp
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: register-vmw-satp
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: register-vmw-psp
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: etherswitch
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: aslr
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: random
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: storage-early-config-dev-settings
2019-04-18T08:22:01Z jumpstart[65945]: executing start plugin: networking-drivers
2019-04-18T08:22:01Z jumpstart[66147]: Loading network device drivers
2019-04-18T08:22:04Z jumpstart[66147]: LoadVmklinuxDriver: Loaded module bnx2
2019-04-18T08:22:05Z jumpstart[65945]: executing start plugin: register-vmw-vaai
2019-04-18T08:22:05Z jumpstart[65945]: executing start plugin: usb
2019-04-18T08:22:05Z jumpstart[65945]: executing start plugin: local-storage
2019-04-18T08:22:05Z jumpstart[65945]: executing start plugin: psa-mask-paths
2019-04-18T08:22:05Z jumpstart[65945]: executing start plugin: network-uplink-init
2019-04-18T08:22:05Z jumpstart[66228]: Trying to connect...
2019-04-18T08:22:05Z jumpstart[66228]: Connected.
2019-04-18T08:22:08Z jumpstart[66228]: Received processed
2019-04-18T08:22:08Z jumpstart[65945]: executing start plugin: psa-nmp-pre-claim-config
2019-04-18T08:22:08Z jumpstart[65945]: executing start plugin: psa-filter-pre-claim-config
2019-04-18T08:22:08Z jumpstart[65945]: executing start plugin: restore-system-uuid
2019-04-18T08:22:08Z jumpstart[65945]: executing start plugin: restore-storage-multipathing
2019-04-18T08:22:08Z jumpstart[65945]: executing start plugin: network-support
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: psa-load-rules
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: vds-vmkapi-compatibility
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: psa-filter-post-claim-config
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: psa-nmp-post-claim-config
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: mlx4_en
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: dvfilters-vmkapi-compatibility
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: vds-config
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: storage-drivers
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: vxlan-base
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: firewall
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: dvfilter-config
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: dvfilter-generic-fastpath
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: lacp-daemon
2019-04-18T08:22:09Z watchdog-net-lacp: [66335] Begin '/usr/sbin/net-lacp', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:22:09Z watchdog-net-lacp: Executing '/usr/sbin/net-lacp'  
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: storage-psa-init
2019-04-18T08:22:09Z jumpstart[66345]: Trying to connect...
2019-04-18T08:22:09Z jumpstart[66345]: Connected.
2019-04-18T08:22:09Z jumpstart[66345]: Received processed
2019-04-18T08:22:09Z jumpstart[65945]: executing start plugin: restore-networking
2019-04-18T08:22:10Z jumpstart[65973]: NetworkInfoImpl: Enabling 1 netstack instances during boot
2019-04-18T08:22:15Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Storing previous management interface:'vmk0'  
2019-04-18T08:22:15Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk0'  
2019-04-18T08:22:15Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing migration for'vmk1'  
2019-04-18T08:22:15Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing config for'vmk0'  
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk0
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic: Ipv6 not Enabled
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk0
2019-04-18T08:22:15Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk0', portStr:'Management Network'  
2019-04-18T08:22:15Z jumpstart[65973]: VmKernelNicInfo::LoadConfig: Processing config for'vmk1'  
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv4 address set up successfully on vmknic vmk1
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::SetIpConfigInternal: IPv6 address set up successfully on vmknic vmk1
2019-04-18T08:22:15Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65973]: VmkNic::Enable: netstack:'defaultTcpipStack', interface:'vmk1', portStr:'NFS-FreeNAS'  
2019-04-18T08:22:15Z jumpstart[65973]: RoutingInfo: LoadConfig called on RoutingInfo
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: GetManagementInterface: Tagging vmk0 as Management
2019-04-18T08:22:15Z jumpstart[65973]: VmkNicImpl::IsDhclientProcess: No valid dhclient pid found
2019-04-18T08:22:15Z jumpstart[65973]: 2019-04-18T08:22:15Z jumpstart[65973]: SetTaggedManagementInterface: Writing vmk0 to the ManagementIface node
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: random-seed
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: dvfilters
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: restore-pxe-marker
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: auto-configure-networking
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: storage-early-configuration
2019-04-18T08:22:15Z jumpstart[66412]: 66412:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T08:22:15Z jumpstart[66412]: 66412:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T08:22:15Z jumpstart[66412]: 66412:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T08:22:15Z jumpstart[66412]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:22:15Z jumpstart[66412]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T08:22:15Z jumpstart[66412]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T08:22:15Z jumpstart[66412]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T08:22:15Z jumpstart[66412]: lib/ssl: protocol list tls1.2
2019-04-18T08:22:15Z jumpstart[66412]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T08:22:15Z jumpstart[66412]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: bnx2fc
2019-04-18T08:22:15Z jumpstart[65945]: executing start plugin: software-iscsi
2019-04-18T08:22:15Z jumpstart[65973]: iScsi: No iBFT data present in the BIOS
2019-04-18T08:22:15Z iscsid: Notice: iSCSI Database already at latest schema. (Upgrade Skipped).
2019-04-18T08:22:15Z iscsid: iSCSI MASTER Database opened. (0x5851008)
2019-04-18T08:22:15Z iscsid: LogLevel = 0
2019-04-18T08:22:15Z iscsid: LogSync  = 0
2019-04-18T08:22:15Z iscsid: memory (180) MB successfully reserved for 1024 sessions
2019-04-18T08:22:15Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e64) idx (0) size (460808)
2019-04-18T08:22:15Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e62) idx (1) size (460808)
2019-04-18T08:22:15Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e60) idx (2) size (460808)
2019-04-18T08:22:15Z iscsid: allocated transportCache for transport (bnx2i-b499babd4e5e) idx (3) size (460808)
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T08:22:16Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T08:22:16Z jumpstart[65945]: executing start plugin: fcoe-config
2019-04-18T08:22:16Z jumpstart[65945]: executing start plugin: storage-path-claim
2019-04-18T08:22:20Z jumpstart[65973]: StorageInfo: Number of paths 3
2019-04-18T08:22:25Z jumpstart[65973]: StorageInfo: Number of devices 3
2019-04-18T08:22:25Z jumpstart[65973]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T08:24:31Z mark: storage-path-claim-completed
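Side note: the out-of-order 'mark' line above is stamped when path claiming finishes, so claiming apparently took until 08:24:31 - the local storage already needs a strikingly long time to come up. To see which paths/devices are slow or missing:

esxcli storage core path list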
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: gss
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: mount-filesystems
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: restore-paths
2019-04-18T08:22:25Z jumpstart[65973]: StorageInfo: Unable to name LUN mpx.vmhba0:C0:T0:L0: Cannot set display name on this device.  Unable to guarantee name will not change across reboots or media change.
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: filesystem-drivers
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: rpc
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: dump-partition
2019-04-18T08:22:25Z jumpstart[65973]: execution of 'system coredump partition set --enable=true --smart' failed : Unable to smart activate a dump partition.  Error was: No suitable diagnostic partitions found..  
2019-04-18T08:22:25Z jumpstart[65973]: 2019-04-18T08:22:25Z jumpstart[65945]: Executor failed executing esxcli command system coredump partition set --enable=true --smart
2019-04-18T08:22:25Z jumpstart[65945]: Method invocation failed: dump-partition->start() failed: error while executing the cli
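"No suitable diagnostic partitions found" on a box that used to have one is another hint that the disk ESXi booted from is no longer visible. Quick check:

esxcli system coredump partition list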
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: vsan-devel
2019-04-18T08:22:25Z jumpstart[66433]: VsanDevel: DevelBootDelay: 0
2019-04-18T08:22:25Z jumpstart[66433]: VsanDevel: DevelWipeConfigOnBoot: 0
2019-04-18T08:22:25Z jumpstart[66433]: VsanDevel: DevelTagSSD: Starting
2019-04-18T08:22:25Z jumpstart[66433]: 66433:VVOLLIB : VVolLib_GetSoapContext:379: Using 30 secs for soap connect timeout.
2019-04-18T08:22:25Z jumpstart[66433]: 66433:VVOLLIB : VVolLib_GetSoapContext:380: Using 200 secs for soap receive timeout.
2019-04-18T08:22:25Z jumpstart[66433]: 66433:VVOLLIB : VVolLibTracingInit:89: Successfully initialized the VVolLib tracing module
2019-04-18T08:22:25Z jumpstart[66433]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:22:25Z jumpstart[66433]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T08:22:25Z jumpstart[66433]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T08:22:25Z jumpstart[66433]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T08:22:25Z jumpstart[66433]: lib/ssl: protocol list tls1.2
2019-04-18T08:22:25Z jumpstart[66433]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T08:22:25Z jumpstart[66433]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T08:22:25Z jumpstart[66433]: VsanDevel: DevelTagSSD: Done.
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: vmfs
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: ufs
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: vfat
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: nfsgssd
2019-04-18T08:22:25Z watchdog-nfsgssd: [66636] Begin '/usr/lib/vmware/nfs/bin/nfsgssd -f -a', min-uptime = 60, max-quick-failures = 128, max-total-failures = 65536, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:22:25Z watchdog-nfsgssd: Executing '/usr/lib/vmware/nfs/bin/nfsgssd -f -a'  
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: vsan
2019-04-18T08:22:25Z nfsgssd[66646]: Could not expand environment variable HOME.
2019-04-18T08:22:25Z nfsgssd[66646]: Could not expand environment variable HOME.
2019-04-18T08:22:25Z nfsgssd[66646]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:22:25Z nfsgssd[66646]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:22:25Z nfsgssd[66646]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:22:25Z nfsgssd[66646]: lib/ssl: OpenSSL using FIPS_drbg for RAND
2019-04-18T08:22:25Z nfsgssd[66646]: lib/ssl: protocol list tls1.2
2019-04-18T08:22:25Z nfsgssd[66646]: lib/ssl: protocol list tls1.2 (openssl flags 0x17000000)
2019-04-18T08:22:25Z nfsgssd[66646]: lib/ssl: cipher list !aNULL:kECDH+AESGCM:ECDH+AESGCM:RSA+AESGCM:kECDH+AES:ECDH+AES:RSA+AES
2019-04-18T08:22:25Z nfsgssd[66646]: Empty epoch file
2019-04-18T08:22:25Z nfsgssd[66646]: Starting with epoch 1
2019-04-18T08:22:25Z nfsgssd[66646]: Connected to SunRPCGSS version 1.0
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: krb5
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: etc-hosts
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: nfs
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: nfs41
2019-04-18T08:22:25Z jumpstart[65945]: executing start plugin: mount-disk-fs
2019-04-18T08:22:26Z jumpstart[65973]: VmFileSystem: Automounted volume 5a6f6646-d13e2d89-fd8d-b499babd4e5e
2019-04-18T08:22:26Z jumpstart[65973]: VmFileSystem: Automounted volume 5ab363c3-c36e8e9f-8cfc-b499babd4e5e
2019-04-18T08:22:26Z jumpstart[65945]: executing start plugin: auto-configure-pmem
2019-04-18T08:22:26Z jumpstart[65945]: executing start plugin: restore-nfs-volumes
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:22:26Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:22:26Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:22:26Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:22:26Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:22:26Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:22:26Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:22:26Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:22:26Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:22:26Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:23:28Z jumpstart[65973]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "NFS-FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T08:23:28Z jumpstart[65973]: 2019-04-18T08:23:28Z jumpstart[65973]: execution of 'boot storage restore --nfs-volumes' failed : failed to restore mount "FreeNAS": Unable to complete Sysinfo operation.  Please see the VMkernel log file for more details.: Sysinfo error: Unable to connect to NFS serverSee VMkernel log for details.  
2019-04-18T08:23:28Z jumpstart[65973]: 2019-04-18T08:23:28Z jumpstart[65945]: Executor failed executing esxcli command boot storage restore --nfs-volumes
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:23:28Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:23:28Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:23:28Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:23:28Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:23:28Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:23:28Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:23:28Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:23:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:23:28Z jumpstart[65945]: Method invocation failed: restore-nfs-volumes->start() failed: error while executing the cli
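So both NFS mounts ("NFS-FreeNAS" and "FreeNAS") fail to come back at boot because the NAS is not reachable at that point - any VM whose .vmx lives on those datastores will show up as inaccessible/orphaned until the mount returns. Note that the two local VMFS volumes did automount fine at 08:22:26. What I'd check once the filer is up (host/share/name below are placeholders, not my real values):

# what the host currently has mounted
esxcli storage nfs list
# re-add a mount by hand if it stays missing
esxcli storage nfs add --host=192.168.20.30 --share=/mnt/tank/vmstore --volume-name=NFS-FreeNAS
# details on why the mount failed
grep -i nfs /var/log/vmkernel.log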
2019-04-18T08:23:28Z jumpstart[65945]: executing start plugin: auto-configure-storage
2019-04-18T08:23:28Z jumpstart[65945]: executing start plugin: restore-bootbanks
2019-04-18T08:23:28Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T08:23:31Z jumpstart[65973]: VmkCtl: Boot device not available, waited 3 seconds
2019-04-18T08:23:31Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T08:23:34Z jumpstart[65973]: VmkCtl: Boot device not available, waited 6 seconds
[... the same two lines repeat every 3 seconds until "waited 57 seconds" ...]
2019-04-18T08:24:25Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T08:24:28Z jumpstart[65973]: VmkCtl: Boot device not available, waited 60 seconds
2019-04-18T08:24:28Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T08:24:28Z jumpstart[65973]: VmkCtl: Did not find a valid boot device, symlinking /bootbank to /tmp
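And here is the actual smoking gun: after waiting 60 seconds the kernel gives up on finding the device it booted from and symlinks /bootbank to /tmp - a ramdisk. From that moment on every configuration save (including the state.tgz written by auto-backup.sh above) only goes to RAM and evaporates at the next reboot. That matches the symptoms exactly: vSwitches gone, VM registrations gone. On older HP boxes like this, that usually means the medium ESXi was installed on (USB stick/SD card, or the array volume) is dying or drops off after boot. Quick checks:

# if these point to /tmp, nothing is being persisted
ls -ld /bootbank /altbootbank
# which volumes/devices the host still sees
ls /vmfs/volumes/
esxcli storage core device list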
2019-04-18T08:24:28Z jumpstart[65945]: executing start plugin: restore-host-cache
2019-04-18T08:24:28Z jumpstart[65973]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T08:24:28Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:24:28Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:24:28Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:28Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:24:28Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:24:28Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:28Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:24:28Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:24:28Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T08:24:28Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T08:24:28Z jumpstart[65973]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T08:24:28Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T08:24:28Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55096377344
2019-04-18T08:24:29Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5ab363c4-26d208a0-fab7-b499babd4e5e total blocks 2500207837184 used blocks 2400621428736
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vflash
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: dump-file
2019-04-18T08:24:29Z jumpstart[65973]: VmkCtl: Diagnostic File found; not auto creating
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T08:24:29Z jumpstart[65973]: VmkCtl: Dump file determined to be large enough, size: 1588592640 (recommended minimum: 1588592640)
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vmci
2019-04-18T08:24:29Z jumpstart[65973]: execution of 'system module load --module vmci' failed : Unable to load module /usr/lib/vmware/vmkmod/vmci: Busy  
2019-04-18T08:24:29Z jumpstart[65945]: Executor failed executing esxcli command system module load --module vmci
2019-04-18T08:24:29Z jumpstart[65945]: Method invocation failed: vmci->start() failed: error while executing the cli
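The vmci "Busy" error is most likely harmless - it usually just means the module is already loaded:

vmkload_mod -l | grep vmci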
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: configure-locker
2019-04-18T08:24:29Z jumpstart[66691]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /scratch
2019-04-18T08:24:29Z jumpstart[66691]: Using /locker/packages/6.5.0/ as /productLocker
2019-04-18T08:24:29Z jumpstart[66691]: using /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker as /locker
2019-04-18T08:24:29Z jumpstart[66691]: Using policy dir /etc/vmware/secpolicy
2019-04-18T08:24:29Z jumpstart[66691]: Parsed all objects
2019-04-18T08:24:29Z jumpstart[66691]: Objects defined and obsolete objects removed
2019-04-18T08:24:29Z jumpstart[66691]: Parsed all domain names
2019-04-18T08:24:29Z jumpstart[66691]: Domain policies parsed and syntax validated
2019-04-18T08:24:29Z jumpstart[66691]: Constraints check for domain policies succeeded
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /usr/share/nvidia
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /productLocker
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /.vmware
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/vsansparse
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/cbt
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/svm
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/upit
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/vsan
2019-04-18T08:24:29Z jumpstart[66691]: Getting realpath failed: /dev/vvol
2019-04-18T08:24:29Z jumpstart[66691]: Domain policies set
2019-04-18T08:24:29Z jumpstart[66691]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T08:24:29Z jumpstart[66691]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T08:24:29Z jumpstart[66691]: Parsed all the tardisk policy files
2019-04-18T08:24:29Z jumpstart[66691]: Set all the tardisk labels and policy
2019-04-18T08:24:29Z jumpstart[66691]: Parsed all file label mappings
2019-04-18T08:24:29Z jumpstart[66691]: Set all file labels
2019-04-18T08:24:29Z jumpstart[66691]: System security policy has been set successfully
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: restore-system-swap
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: cbrc
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: tpm
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: apei
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: restore-security-policies
2019-04-18T08:24:29Z jumpstart[66697]: Using policy dir /etc/vmware/secpolicy
2019-04-18T08:24:29Z jumpstart[66697]: Parsed all objects
2019-04-18T08:24:29Z jumpstart[66697]: Objects defined and obsolete objects removed
2019-04-18T08:24:29Z jumpstart[66697]: Parsed all domain names
2019-04-18T08:24:29Z jumpstart[66697]: Domain policies parsed and syntax validated
2019-04-18T08:24:29Z jumpstart[66697]: Constraints check for domain policies succeeded
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /usr/share/nvidia
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /productLocker
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /.vmware
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/vsansparse
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/cbt
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/svm
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/upit
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/vsan
2019-04-18T08:24:29Z jumpstart[66697]: Getting realpath failed: /dev/vvol
2019-04-18T08:24:29Z jumpstart[66697]: Domain policies set
2019-04-18T08:24:29Z jumpstart[66697]: Error: More than one exception specification for tardisk /tardisks/vsan.v00
2019-04-18T08:24:29Z jumpstart[66697]: Error: Ignoring /etc/vmware/secpolicy/tardisks/vsan
2019-04-18T08:24:29Z jumpstart[66697]: Parsed all the tardisk policy files
2019-04-18T08:24:29Z jumpstart[66697]: Set all the tardisk labels and policy
2019-04-18T08:24:29Z jumpstart[66697]: Parsed all file label mappings
2019-04-18T08:24:29Z jumpstart[66697]: Set all file labels
2019-04-18T08:24:29Z jumpstart[66697]: System security policy has been set successfully
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: oem-modules
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: crond
2019-04-18T08:24:29Z crond[66701]: crond: crond (busybox 1.22.1) started, log level 8
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: restore-resource-groups
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: procMisc
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: rdma-vmkapi-compatibility
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: ipmi
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: restore-keymap
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: nmp-vmkapi-compatibility
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: iscsi-vmkapi-compatibility
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: ftcpt
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: hbr
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: autodeploy-setpassword
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: inetd
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vrdma
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: tag-boot-bank
2019-04-18T08:24:29Z jumpstart[66788]: unable to open boot configuration: No such file or directory
2019-04-18T08:24:29Z jumpstart[65945]: Method invocation failed: tag-boot-bank->start() failed: exited with code 1
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: system-image-cache
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: iofilters
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vit
2019-04-18T08:24:29Z jumpstart[65973]: Parser: Initializing VIT parser lib
2019-04-18T08:24:29Z jumpstart[65973]: VsanIscsiTargetImpl: The host is not in a Virtual SAN cluster.
2019-04-18T08:24:29Z jumpstart[65973]: Util: Retrieved vit status successfully
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vmotion
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: vfc
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: balloonVMCI
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: coredump-configuration
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0x3a =                0x5
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x480 =   0xda04000000000f
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x481 =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x482 = 0xfff9fffe0401e172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x483 =   0x7fffff00036dff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x484 =     0xffff000011ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x485 =            0x401e7
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x486 =         0x80000021
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x487 =         0xffffffff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x488 =             0x2000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x489 =            0x267ff
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48a =               0x2a
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48b =      0x4ff00000000
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48c =      0xf0106134141
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48d =       0x7f00000016
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48e = 0xfff9fffe04006172
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x48f =   0x7fffff00036dfb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x490 =     0xffff000011fb
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR      0x491 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR 0xc0010114 =                  0
2019-04-18T08:24:29Z jumpstart[65973]: Common: MSR       0xce =      0xc0004011503
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:24:29Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:24:29Z jumpstart[65973]: VmFileSystemImpl: Probably unmounted volume. Console path not set
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:24:29Z jumpstart[65973]: GetVmfsFileSystems: Vmfs mounted volumes from fsswitch
2019-04-18T08:24:29Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5a6f6646-db921f99-e5cd-b499babd4e5e
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibPlugins_Load: Loading plugin directory '/usr/lib/vmware/plugin/objLib'  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/upitObjBE.so' from '/usr/lib/vmware/plugin/objLib/upitObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: UpitBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'upit' found and registered as type 5  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLibLoadPlugin: Loading plugin 'objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: PluginLdr_Load: Loaded plugin 'objLib/vsanObjBE.so' from '/usr/lib/vmware/plugin/objLib/vsanObjBE.so'  
2019-04-18T08:24:29Z jumpstart[65973]: ObjLibPluginInit: Initialized plugin
2019-04-18T08:24:29Z jumpstart[65973]: VsanBackend_GetAPI: Received request for API com.vmware.plugin.objlib.backend.api@1.0
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib_RegisterDynamicBE: Back-end 'vsan' found and registered as type 3  
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: Objlib initialized.
2019-04-18T08:24:29Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: uuid 5ab363c4-26d208a0-fab7-b499babd4e5e
2019-04-18T08:24:29Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 2 mounted VMFS volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VMFS-L volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetMountedVmfsFileSystemsInt: Found 0 mounted VFFS volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetVmfsFileSystems: Vmfs umounted volumes from LVM
2019-04-18T08:24:29Z jumpstart[65973]: GetUnmountedVmfsFileSystems: There are 0 unmounted (NoSnaphot) volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VMFS-L volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetUnmountedVmfsFileSystemsInt: Found 0 unmounted VFFS volumes
2019-04-18T08:24:29Z jumpstart[65973]: GetTypedFileSystems: fstype vfat
2019-04-18T08:24:29Z jumpstart[65973]: GetTypedFileSystems: fstype ufs
2019-04-18T08:24:29Z jumpstart[65973]: GetTypedFileSystems: fstype vvol
2019-04-18T08:24:29Z jumpstart[65973]: GetTypedFileSystems: fstype vsan
2019-04-18T08:24:29Z jumpstart[65973]: GetTypedFileSystems: fstype PMEM
2019-04-18T08:24:29Z jumpstart[65973]: SlowRefresh: path /vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e total blocks 146565758976 used blocks 55096377344
2019-04-18T08:24:29Z jumpstart[65973]: OBJLIB-LIB: ObjLib cleanup done.
2019-04-18T08:24:29Z jumpstart[65945]: executing start plugin: set-acceptance-level
2019-04-18T08:24:30Z jumpstart[65945]: executing start plugin: scratch-storage
2019-04-18T08:24:31Z jumpstart[65945]: executing start plugin: pingback
2019-04-18T08:24:31Z jumpstart[65945]: executing start plugin: vmswapcleanup
2019-04-18T08:24:31Z jumpstart[65973]: execution of '--plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup' failed : Host Local Swap Location has not been enabled  
2019-04-18T08:24:31Z jumpstart[65945]: Executor failed executing esxcli command --plugin-dir /usr/lib/vmware/esxcli/int/ systemInternal vmswapcleanup cleanup
2019-04-18T08:24:31Z jumpstart[65945]: Method invocation failed: vmswapcleanup->start() failed: error while executing the cli
2019-04-18T08:24:31Z jumpstart[65973]: Jumpstart executor signalled to stop
2019-04-18T08:24:31Z jumpstart[65945]: Executor has been Successfully Stopped
2019-04-18T08:24:31Z init: starting pid 66828, tty '': '/usr/lib/vmware/firstboot/bin/firstboot.py ++group=host/vim/vmvisor/boot -l'  
2019-04-18T08:24:31Z init: starting pid 66829, tty '': '/bin/services.sh start'  
2019-04-18T08:24:32Z jumpstart[66882]: executing start plugin: ESXShell
2019-04-18T08:24:32Z addVob[66888]: Could not expand environment variable HOME.
2019-04-18T08:24:32Z addVob[66888]: Could not expand environment variable HOME.
2019-04-18T08:24:32Z addVob[66888]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:32Z addVob[66888]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:24:32Z addVob[66888]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:24:32Z jumpstart[66882]: executing start plugin: DCUI
2019-04-18T08:24:32Z root: DCUI Enabling DCUI login: runlevel =
2019-04-18T08:24:32Z addVob[66903]: Could not expand environment variable HOME.
2019-04-18T08:24:32Z addVob[66903]: Could not expand environment variable HOME.
2019-04-18T08:24:32Z addVob[66903]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:32Z addVob[66903]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:24:32Z addVob[66903]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:24:32Z jumpstart[66882]: executing start plugin: ntpd
2019-04-18T08:24:32Z root: ntpd Starting ntpd
2019-04-18T08:24:32Z sntp[66908]: sntp 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:44 UTC 2016 (1)
2019-04-18T08:24:33Z sntp[66908]: 2019-04-18 08:24:33.475576 (+0000) +1.23440 +/- 0.858973 pool.ntp.org 78.46.107.140 s2 no-leap
2019-04-18T08:24:34Z watchdog-ntpd: [66915] Begin '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift', min-uptime = 60, max-quick-failures = 5, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:34Z watchdog-ntpd: Executing '/sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift'  
2019-04-18T08:24:34Z ntpd[66925]: ntpd 4.2.8p8+vmware@1.3677-o Sat May 28 14:02:59 UTC 2016 (1): Starting
2019-04-18T08:24:34Z ntpd[66925]: Command line: /sbin/ntpd -g -n -c /etc/ntp.conf -f /etc/ntp.drift
2019-04-18T08:24:34Z ntpd[66925]: proto: precision = 0.462 usec (-21)
2019-04-18T08:24:34Z ntpd[66925]: restrict default: KOD does nothing without LIMITED.
2019-04-18T08:24:34Z ntpd[66925]: Listen and drop on 0 v6wildcard [::]:123
2019-04-18T08:24:34Z ntpd[66925]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 2 lo0 127.0.0.1:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 3 vmk0 192.168.20.20:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 4 vmk1 192.168.55.60:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 5 lo0 [::1]:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 6 lo0 [fe80::1%1]:123
2019-04-18T08:24:34Z ntpd[66925]: Listen normally on 7 vmk1 [fe80::250:56ff:fe67:b2b0%3]:123
2019-04-18T08:24:34Z jumpstart[66882]: executing start plugin: SSH
2019-04-18T08:24:34Z addVob[66930]: Could not expand environment variable HOME.
2019-04-18T08:24:34Z addVob[66930]: Could not expand environment variable HOME.
2019-04-18T08:24:34Z addVob[66930]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:34Z addVob[66930]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:24:34Z addVob[66930]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:24:34Z jumpstart[66882]: executing start plugin: esxui
2019-04-18T08:24:35Z jumpstart[66882]: executing start plugin: iofilterd-vmwarevmcrypt
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: Could not expand environment variable HOME.
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: Could not expand environment variable HOME.
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:24:35Z iofilterd-vmwarevmcrypt[66959]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T08:24:35Z watchdog-iofiltervpd: [66972] Begin '/usr/lib/vmware/iofilter/bin/ioFilterVPServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:35Z watchdog-iofiltervpd: Executing '/usr/lib/vmware/iofilter/bin/ioFilterVPServer'  
2019-04-18T08:24:38Z jumpstart[66882]: executing start plugin: swapobjd
2019-04-18T08:24:38Z watchdog-swapobjd: [67002] Begin '/usr/lib/vmware/swapobj/bin/swapobjd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:38Z watchdog-swapobjd: Executing '/usr/lib/vmware/swapobj/bin/swapobjd'  
2019-04-18T08:24:38Z jumpstart[66882]: executing start plugin: usbarbitrator
2019-04-18T08:24:38Z usbarbitrator: evicting objects on USB from OC
2019-04-18T08:24:38Z usbarbitrator: unclaiming USB devices
2019-04-18T08:24:38Z usbarbitrator: rescanning to complete removal of USB devices
2019-04-18T08:24:38Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T08:24:38Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T08:24:38Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T08:24:38Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T08:24:39Z watchdog-usbarbitrator: [67040] Begin '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:39Z watchdog-usbarbitrator: Executing '/usr/lib/vmware/bin/vmware-usbarbitrator -t --max-clients=414'  
2019-04-18T08:24:39Z jumpstart[66882]: executing start plugin: iofilterd-spm
2019-04-18T08:24:39Z iofilterd-spm[67072]: Could not expand environment variable HOME.
2019-04-18T08:24:39Z iofilterd-spm[67072]: Could not expand environment variable HOME.
2019-04-18T08:24:39Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:39Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "~/.vmware/config": No such file or directory.  
2019-04-18T08:24:39Z iofilterd-spm[67072]: DictionaryLoad: Cannot open file "~/.vmware/preferences": No such file or directory.  
2019-04-18T08:24:39Z iofilterd-spm[67072]: Exiting daemon post RP init due to rp-init-only invocation
2019-04-18T08:24:39Z usbarbitrator: Starting USB storage detach monitor
2019-04-18T08:24:39Z usbarbitrator: reservedHbas:
2019-04-18T08:24:39Z jumpstart[66882]: executing start plugin: sensord
2019-04-18T08:24:39Z watchdog-sensord: [67097] Begin '/usr/lib/vmware/bin/sensord -l', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:39Z watchdog-sensord: Executing '/usr/lib/vmware/bin/sensord -l'  
2019-04-18T08:24:39Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e5e Pending=0 Failed=0
2019-04-18T08:24:39Z jumpstart[66882]: executing start plugin: storageRM
2019-04-18T08:24:39Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e60 Pending=0 Failed=0
2019-04-18T08:24:39Z watchdog-storageRM: [67115] Begin '/sbin/storageRM', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:39Z watchdog-storageRM: Executing '/sbin/storageRM'  
2019-04-18T08:24:39Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e62 Pending=0 Failed=0
2019-04-18T08:24:39Z iscsid: DISCOVERY: transport_name=bnx2i-b499babd4e64 Pending=0 Failed=0
2019-04-18T08:24:39Z usbarbitrator: Exiting USB storage detach monitor
2019-04-18T08:24:40Z jumpstart[66882]: executing start plugin: hostd
2019-04-18T08:24:40Z hostd-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/hostd/config.xml" to file "/etc/vmware/hostd/config.xml" [force=False]   
2019-04-18T08:24:40Z hostd-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T08:24:40Z hostd-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T08:24:40Z create-statsstore[67135]: Initiating hostd statsstore ramdisk size (re)evaluation.
2019-04-18T08:24:40Z create-statsstore[67135]: Maximum number of virtual machines supported for powering-on 384. Maximum number of virtual machines supported for register 1536. Maximum number of resource pools 1000.
2019-04-18T08:24:40Z create-statsstore[67135]: Estimating statsstore ramdisk of size 803MB will be needed.
2019-04-18T08:24:40Z create-statsstore[67135]: Creating statsstore ramdisk mount point /var/lib/vmware/hostd/stats.
2019-04-18T08:24:40Z create-statsstore[67135]: Creating new statsstore ramdisk with 803MB.
2019-04-18T08:24:40Z watchdog-hostd: [67142] Begin 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:40Z watchdog-hostd: Executing 'hostd ++min=0,swapscope=system /etc/vmware/hostd/config.xml'  
2019-04-18T08:24:40Z jumpstart[66882]: executing start plugin: sdrsInjector
2019-04-18T08:24:40Z watchdog-sdrsInjector: [67161] Begin '/sbin/sdrsInjector', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:40Z watchdog-sdrsInjector: Executing '/sbin/sdrsInjector'  
2019-04-18T08:24:40Z jumpstart[66882]: executing start plugin: nfcd
2019-04-18T08:24:40Z watchdog-nfcd: [67182] Begin '/usr/lib/vmware/bin/nfcd ++group=nfcd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:40Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:40Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 1) 1  
2019-04-18T08:24:40Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:40Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 2) 1  
2019-04-18T08:24:40Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:40Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 3) 1  
2019-04-18T08:24:40Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:41Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 4) 1  
2019-04-18T08:24:41Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:41Z jumpstart[66882]: executing start plugin: vvold
2019-04-18T08:24:41Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 5) 1  
2019-04-18T08:24:41Z watchdog-nfcd: Executing '/usr/lib/vmware/bin/nfcd ++group=nfcd'  
2019-04-18T08:24:41Z watchdog-nfcd: '/usr/lib/vmware/bin/nfcd ++group=nfcd' exited after 0 seconds (quick failure 6) 1  
2019-04-18T08:24:41Z watchdog-nfcd: End '/usr/lib/vmware/bin/nfcd ++group=nfcd', failure limit reached  
2019-04-18T08:24:41Z watchdog-vvold: [67303] Begin 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:41Z watchdog-vvold: Executing 'vvold -o -8090 -V vvol.version.version1 -f /etc/vmware/vvold/config.xml -L syslog:Vvold'  
2019-04-18T08:24:41Z watchdog-vvold: Watchdog for vvold is now 67303
2019-04-18T08:24:41Z watchdog-vvold: Terminating watchdog process with PID 67303
2019-04-18T08:24:41Z watchdog-vvold: [67303] Signal received: exiting the watchdog
2019-04-18T08:24:42Z jumpstart[66882]: executing start plugin: rhttpproxy
2019-04-18T08:24:42Z rhttpproxy-upgrade-config: INFO: Carrying some config entries from file "/etc/vmware/rhttpproxy/config.xml" to file "/etc/vmware/rhttpproxy/config.xml" [force=False]   
2019-04-18T08:24:42Z rhttpproxy-upgrade-config: DEBUG: From and to doc are on the same version 
2019-04-18T08:24:42Z rhttpproxy-upgrade-config: DEBUG: Skip migrating since the version of the new file is the same as the version of the existing file 
2019-04-18T08:24:42Z watchdog-rhttpproxy: [67519] Begin 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:42Z watchdog-rhttpproxy: Executing 'rhttpproxy ++min=0,swapscope=system -r /etc/vmware/rhttpproxy/config.xml'  
2019-04-18T08:24:42Z jumpstart[66882]: executing start plugin: hostdCgiServer
2019-04-18T08:24:42Z watchdog-hostdCgiServer: [67544] Begin 'hostdCgiServer', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:42Z watchdog-hostdCgiServer: Executing 'hostdCgiServer'  
2019-04-18T08:24:43Z jumpstart[66882]: executing start plugin: lbtd
2019-04-18T08:24:43Z watchdog-net-lbt: [67572] Begin '/sbin/net-lbt ++min=0', min-uptime = 1000, max-quick-failures = 100, max-total-failures = 100, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:43Z watchdog-net-lbt: Executing '/sbin/net-lbt ++min=0'  
2019-04-18T08:24:43Z PyVmomiServer: 2019-04-18 08:24:43,128 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:24:43Z jumpstart[66882]: executing start plugin: rabbitmqproxy
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: [67594] Begin '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', min-uptime = 60, max-quick-failures = 1, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 1) 0  
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: Executing '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml'  
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml' exited after 0 seconds (quick failure 2) 0  
2019-04-18T08:24:43Z watchdog-rabbitmqproxy: End '/usr/lib/vmware/rabbitmqproxy/bin/rabbitmqproxy /etc/vmware/rabbitmqproxy/config.xml', failure limit reached  
2019-04-18T08:24:43Z jumpstart[66882]: executing start plugin: vmfstraced
2019-04-18T08:24:43Z vmfstracegd: VMFS Global Tracing is not enabled.
2019-04-18T08:24:44Z PyVmomiServer: 2019-04-18 08:24:44,000 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:24:44Z jumpstart[66882]: executing start plugin: slpd
2019-04-18T08:24:44Z root: slpd Starting slpd
2019-04-18T08:24:44Z root: slpd Generating registration file /etc/slp.reg
2019-04-18T08:24:44Z slpd[67690]: test - LOG_INFO
2019-04-18T08:24:44Z slpd[67690]: test - LOG_WARNING
2019-04-18T08:24:44Z slpd[67690]: test - LOG_ERROR
2019-04-18T08:24:44Z slpd[67690]: *** SLPD daemon version 1.0.0 started
2019-04-18T08:24:44Z slpd[67690]: Command line = /sbin/slpd
2019-04-18T08:24:44Z slpd[67690]: Using configuration file = /etc/slp.conf
2019-04-18T08:24:44Z slpd[67690]: Using registration file = /etc/slp.reg
2019-04-18T08:24:44Z slpd[67690]: Agent Interfaces = 192.168.20.20,192.168.55.60,fe80::250:56ff:fe67:b2b0%vmk1
2019-04-18T08:24:44Z slpd[67690]: Agent URL = service:service-agent://esxi-server.testlab.test
2019-04-18T08:24:44Z slpd[67691]: *** BEGIN SERVICES
2019-04-18T08:24:44Z jumpstart[66882]: executing start plugin: dcbd
2019-04-18T08:24:44Z watchdog-dcbd: [67699] Begin '/usr/sbin/dcbd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:44Z watchdog-dcbd: Executing '/usr/sbin/dcbd'  
2019-04-18T08:24:44Z dcbd: [info]     add_dcbx_ieee: device = default_cfg_attribs stype = 2
2019-04-18T08:24:44Z dcbd: [info]     add_ets_ieee: device = default_cfg_attribs
2019-04-18T08:24:44Z dcbd: [info]     add_pfc_ieee: device = default_cfg_attribs
2019-04-18T08:24:44Z dcbd: [info]     add_app_ieee: device = default_cfg_attribs subtype = 0
2019-04-18T08:24:44Z dcbd: [info]     Main loop running.
2019-04-18T08:24:44Z jumpstart[66882]: executing start plugin: nscd
2019-04-18T08:24:44Z watchdog-nscd: [67717] Begin '/usr/lib/vmware/nscd/bin/nscd -d', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:44Z watchdog-nscd: Executing '/usr/lib/vmware/nscd/bin/nscd -d'  
2019-04-18T08:24:44Z jumpstart[66882]: executing start plugin: cdp
2019-04-18T08:24:44Z watchdog-cdp: [67741] Begin '/usr/sbin/net-cdp', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:44Z watchdog-cdp: Executing '/usr/sbin/net-cdp'  
2019-04-18T08:24:44Z jumpstart[66882]: executing start plugin: lacp
2019-04-18T08:24:45Z jumpstart[66882]: executing start plugin: smartd
2019-04-18T08:24:45Z watchdog-smartd: [67760] Begin '/usr/sbin/smartd', min-uptime = 60, max-quick-failures = 5, max-total-failures = 5, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:45Z watchdog-smartd: Executing '/usr/sbin/smartd'  
2019-04-18T08:24:45Z smartd: [warn] smartd starts to run with interval 30 minutes
2019-04-18T08:24:45Z jumpstart[66882]: executing start plugin: memscrubd
2019-04-18T08:24:45Z jumpstart[66882]: executing start plugin: vpxa
2019-04-18T08:24:45Z watchdog-vpxa: [67788] Begin '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:45Z watchdog-vpxa: Executing '/usr/lib/vmware/vpxa/bin/vpxa ++min=0,swapscope=system -D /etc/vmware/vpxa'  
2019-04-18T08:24:45Z jumpstart[66882]: executing start plugin: lwsmd
2019-04-18T08:24:45Z watchdog-lwsmd: [67829] Begin '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog', min-uptime = 60, max-quick-failures = 5, max-total-failures = 1000000, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:45Z watchdog-lwsmd: Executing '/usr/lib/vmware/likewise/sbin/lwsmd ++group=likewise --syslog'  
2019-04-18T08:24:45Z lwsmd: Logging started
2019-04-18T08:24:45Z lwsmd: Likewise Service Manager starting up
2019-04-18T08:24:45Z lwsmd: Starting service: lwreg
2019-04-18T08:24:45Z lwsmd: [lwreg-ipc] Listening on endpoint /etc/likewise/lib/.regsd
2019-04-18T08:24:45Z lwsmd: [lwreg-ipc] Listener started
2019-04-18T08:24:45Z lwsmd: [lwsm-ipc] Listening on endpoint /etc/likewise/lib/.lwsm
2019-04-18T08:24:45Z lwsmd: [lwsm-ipc] Listener started
2019-04-18T08:24:45Z lwsmd: Likewise Service Manager startup complete
2019-04-18T08:24:46Z lwsmd: Starting service: netlogon
2019-04-18T08:24:47Z lwsmd: [netlogon-ipc] Listening on endpoint /etc/likewise/lib/.netlogond
2019-04-18T08:24:47Z lwsmd: [netlogon-ipc] Listener started
2019-04-18T08:24:47Z lwsmd: Starting service: lwio
2019-04-18T08:24:47Z lwsmd: [lwio-ipc] Listening on endpoint /etc/likewise/lib/.lwiod
2019-04-18T08:24:47Z lwsmd: [lwio-ipc] Listener started
2019-04-18T08:24:47Z lwsmd: Starting service: rdr
2019-04-18T08:24:47Z lwsmd: Starting service: lsass
2019-04-18T08:24:47Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.ntlmd
2019-04-18T08:24:47Z lwsmd: [lsass-ipc] Listener started
2019-04-18T08:24:47Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so'  
2019-04-18T08:24:47Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so: cannot open shared object file: No such file or directory
2019-04-18T08:24:47Z lwsmd: [lsass] Failed to load provider 'lsa-vmdir-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_vmdir.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T08:24:47Z lwsmd: [lsass] Failed to open auth provider at path '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so'  
2019-04-18T08:24:47Z lwsmd: [lsass] /usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so: cannot open shared object file: No such file or directory
2019-04-18T08:24:47Z lwsmd: [lsass] Failed to load provider 'lsa-local-provider' from '/usr/lib/vmware/likewise/lib/liblsass_auth_provider_local.so' - error 40040 (LW_ERROR_INVALID_AUTH_PROVIDER)  
2019-04-18T08:24:47Z lwsmd: [lsass-ipc] Listening on endpoint /etc/likewise/lib/.lsassd
2019-04-18T08:24:47Z lwsmd: [lsass-ipc] Listener started
2019-04-18T08:24:47Z lwsmd: [lsass] The in-memory cache file does not exist yet
2019-04-18T08:24:47Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 0  
2019-04-18T08:24:47Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.TESTLAB.TEST' failed with errno 0, h_errno = 1  
2019-04-18T08:24:47Z lwsmd: [lsass] Domain 'Testlab.test' is now offline  
2019-04-18T08:24:47Z lwsmd: [lsass] Machine Password Sync Thread starting
2019-04-18T08:24:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:24:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:24:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:24:48Z jumpstart[66882]: executing start plugin: vit_loader.sh
2019-04-18T08:24:48Z VITLOADER: [etc/init.d/vit_loader] Start vit loader
2019-04-18T08:24:48Z jumpstart[66882]: executing start plugin: hpe-smx.init
2019-04-18T08:24:48Z root: /etc/init.d/hpe-smx.init: Collecting PCI info...
2019-04-18T08:24:49Z root: /etc/init.d/hpe-smx.init: getipmibtaddress returns 254. No IPMI driver reload
2019-04-18T08:24:49Z root: /etc/init.d/hpe-smx.init: Done.
2019-04-18T08:24:49Z jumpstart[66882]: executing start plugin: hpe-nmi.init
2019-04-18T08:24:49Z root: hpe-nmi.init: Supported Server detected.  Loading NMI kernel module...
2019-04-18T08:24:49Z root: hpe-nmi.init:  Done.
2019-04-18T08:24:49Z jumpstart[66882]: executing start plugin: hpe-fc.sh
2019-04-18T08:24:49Z root: hpe-fc init script: Generating hba config file...
2019-04-18T08:24:50Z jumpstart[66882]: executing start plugin: sfcbd-watchdog
2019-04-18T08:24:50Z sfcbd-init: Getting Exclusive access, please wait...
2019-04-18T08:24:50Z sfcbd-init: Exclusive access granted.
2019-04-18T08:24:50Z sfcbd-init: Request to start sfcbd-watchdog, pid 68020
2019-04-18T08:24:50Z sfcbd-config[68030]: Configuration not changed, already enabled
2019-04-18T08:24:50Z sfcbd-config[68036]: new install or upgrade previously completed, no changes made at version 0.0.0
2019-04-18T08:24:50Z sfcbd-config[68036]: file /etc/sfcb/sfcb.cfg update completed.
2019-04-18T08:24:50Z sfcbd-init: snmp has not been enabled.
2019-04-18T08:24:50Z sfcbd-init: starting sfcbd
2019-04-18T08:24:51Z sfcbd-init: Waiting for sfcb to start up.
2019-04-18T08:24:51Z amnesiac[68059]: 3 of 4. Testing Log Levels - LOG_WARNING
2019-04-18T08:24:51Z amnesiac[68059]: 4 of 4. Testing Log Levels - LOG_ERR
2019-04-18T08:24:51Z sfcbd-init: Program started normally.
2019-04-18T08:24:51Z jumpstart[66882]: executing start plugin: wsman
2019-04-18T08:24:51Z openwsmand: Getting Exclusive access, please wait...
2019-04-18T08:24:51Z openwsmand: Exclusive access granted.
2019-04-18T08:24:51Z openwsmand: Starting openwsmand
2019-04-18T08:24:51Z watchdog-openwsmand: [68096] Begin '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process', min-uptime = 60, max-quick-failures = 5, max-total-failures = 10, bg_pid_file = '', reboot-flag = '0'  
2019-04-18T08:24:51Z watchdog-openwsmand: Executing '/sbin/openwsmand ++min=0,securitydom=6 --syslog=3 --foreground-process'  
2019-04-18T08:24:51Z : dlopen /usr/lib/libticket.so.0 failed, error: /usr/lib/libticket.so.0: cannot open shared object file: No such file or directory, exiting. 0 Success
2019-04-18T08:24:51Z : [wrn][68106:/build/mts/release/bora-4152810/cayman_openwsman/openwsman/src/src/server/wsmand.c:320:main] nsswitch.conf successfully stat'ed  
2019-04-18T08:24:51Z jumpstart[66882]: executing start plugin: snmpd
2019-04-18T08:24:51Z root: Starting snmpd
2019-04-18T08:24:51Z root: snmpd has not been enabled.
2019-04-18T08:24:51Z jumpstart[66882]: Jumpstart failed to start: snmpd reason: Execution of command: /etc/init.d/snmpd start failed with status: 1
2019-04-18T08:24:51Z jumpstart[66882]: executing start plugin: xorg
2019-04-18T08:24:52Z jumpstart[66882]: executing start plugin: vmtoolsd
2019-04-18T08:24:52Z jumpstart[66882]: executing start plugin: hp-ams.sh
2019-04-18T08:24:52Z amshelper: Wrapper constructing internal library
2019-04-18T08:24:52Z amshelper[68127]: ams ver 10.6.0-24: Running check for supported server...
2019-04-18T08:24:52Z amshelper[68127]: Wrapper Destructing internal library
2019-04-18T08:24:52Z root: [ams] Agentless Management Service is not supported on this server.
2019-04-18T08:24:52Z jumpstart[66882]: Jumpstart failed to start: hp-ams.sh reason: Execution of command: /etc/init.d/hp-ams.sh start failed with status: 1
2019-04-18T08:24:52Z init: starting pid 68129, tty '': '/bin/apply-host-profiles'  
2019-04-18T08:24:52Z init: starting pid 68130, tty '': '/usr/lib/vmware/secureboot/bin/secureBoot.py ++group=host/vim/vmvisor/boot -a'  
2019-04-18T08:24:53Z init: starting pid 68163, tty '': '/usr/lib/vmware/vmksummary/log-bootstop.sh boot'  
2019-04-18T08:24:53Z addVob[68166]: DictionaryLoad: Cannot open file "/usr/lib/vmware/config": No such file or directory.  
2019-04-18T08:24:53Z addVob[68166]: DictionaryLoad: Cannot open file "//.vmware/config": No such file or directory.  
2019-04-18T08:24:53Z addVob[68166]: DictionaryLoad: Cannot open file "//.vmware/preferences": No such file or directory.  
2019-04-18T08:24:54Z init: starting pid 68207, tty '': '/bin/vmdumper -g 'Boot Successful''  
2019-04-18T08:24:54Z init: starting pid 68212, tty '': '/bin/sh ++min=0,group=host/vim/vimuser/terminal/shell /etc/rc.local'  
2019-04-18T08:24:54Z backup.sh.68152: Locking esx.conf
2019-04-18T08:24:54Z backup.sh.68152: Creating archive
2019-04-18T08:24:54Z backup.sh.68152: Unlocking esx.conf
2019-04-18T08:24:54Z root: init Running kickstart.py
2019-04-18T08:24:54Z root: init Running local.sh
2019-04-18T08:24:54Z init: starting pid 68340, tty '': '/bin/esxcfg-init --set-boot-progress done'  
2019-04-18T08:24:54Z init: starting pid 68341, tty '': '/bin/vmware-autostart.sh start'  
2019-04-18T08:24:54Z init: starting pid 68344, tty '/dev/tty1': '/bin/initterm.sh tty1 /bin/techsupport.sh'  
2019-04-18T08:24:54Z VMware[startup]: Starting VMs
2019-04-18T08:24:54Z init: starting pid 68345, tty '/dev/tty2': '-/bin/initterm.sh tty2 /bin/dcuiweasel'  
2019-04-18T08:24:55Z DCUI: Starting DCUI
2019-04-18T08:25:02Z crond[66701]: crond: USER root pid 68383 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T08:25:02Z syslog[68386]: starting hostd probing.
2019-04-18T08:25:06Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:06Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:25Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68397  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:25Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68401  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:25Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68461  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68469  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68471  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68481  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:40Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:40Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68490  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:40Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:40Z ImageConfigManager: 2019-04-18 08:25:40,844 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T08:25:40Z ImageConfigManager: 2019-04-18 08:25:40,844 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x782a4bb978>}   
2019-04-18T08:25:40Z hostd-icm[68497]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T08:25:40Z ImageConfigManager: 2019-04-18 08:25:40,844 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:25:40Z ImageConfigManager: 2019-04-18 08:25:40,845 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-9e10-ce1b</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-127269102</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T08:25:40Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T08:25:40Z ImageConfigManager: 2019-04-18 08:25:40,959 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T08:25:41Z ImageConfigManager: 2019-04-18 08:25:41,098 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T08:25:41Z ImageConfigManager: 2019-04-18 08:25:41,098 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0xef1fc309b0>}   
2019-04-18T08:25:41Z hostd-icm[68508]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T08:25:41Z ImageConfigManager: 2019-04-18 08:25:41,098 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:25:41Z ImageConfigManager: 2019-04-18 08:25:41,099 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-a074-ce27</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-127269103</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T08:25:41Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T08:25:41Z ImageConfigManager: 2019-04-18 08:25:41,115 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T08:25:47Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T08:25:47Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:48Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:48Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68522  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:48Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:48Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68524  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:48Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:25:48Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68527  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:25:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:26:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68578  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:44Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:26:44Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68591  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:44Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T08:26:45Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T08:26:45Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001c7ebd094ee9229fffb824
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T08:26:45Z smartd: libsmartsata: SG_IO ioctl ret:0 status:2 host_status:0 driver_status:0
2019-04-18T08:26:45Z smartd: libsmartsata: Not an ATA SMART device:naa.600508b1001cc9f2bd5ae7909acd22b5
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartmicron.so is already loaded
2019-04-18T08:26:45Z smartd: smartmgt: plugin /usr/lib/vmware/smart_plugins/libsmartnvme.so is already loaded
2019-04-18T08:26:45Z ImageConfigManager: 2019-04-18 08:26:45,742 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T08:26:45Z ImageConfigManager: 2019-04-18 08:26:45,742 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x4c0bc64978>}   
2019-04-18T08:26:45Z hostd-icm[68601]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T08:26:45Z ImageConfigManager: 2019-04-18 08:26:45,742 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:26:45Z ImageConfigManager: 2019-04-18 08:26:45,743 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-32b9-ce5d</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.installDate-127269120</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<installDate xmlns="urn:vim25"><_this type="Host  
2019-04-18T08:26:45Z ImageConfigManager: ImageConfigManager">ha-image-config-manager</_this></installDate>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T08:26:45Z ImageConfigManager: 2019-04-18 08:26:45,860 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"> <soapenv:Body><installDateResponse xmlns='urn:vim25'><returnval>2018-01-31T20:00:52Z</returnval></installDateResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:48Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:26:48Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68611  
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:26:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:26:48Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:26:48Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68613  
2019-04-18T08:26:52Z ImageConfigManager: 2019-04-18 08:26:52,253 [MainProcess INFO 'HostImage' MainThread] Installer <class 'vmware.esximage.Installer.BootBankInstaller.BootBankInstaller'> was not initiated - reason: altbootbank is invalid: Error in loading boot.cfg from bootbank /bootbank: Error parsing bootbank boot.cfg file /bootbank/boot.cfg: [Errno 2] No such file or directory: '/bootbank/boot.cfg'   
2019-04-18T08:26:52Z ImageConfigManager: 2019-04-18 08:26:52,254 [MainProcess INFO 'HostImage' MainThread] Installers initiated are {'live': <vmware.esximage.Installer.LiveImageInstaller.LiveImageInstaller object at 0x1739b0a978>}   
2019-04-18T08:26:52Z hostd-icm[68621]: Registered 'ImageConfigManagerImpl:ha-image-config-manager'  
2019-04-18T08:26:52Z ImageConfigManager: 2019-04-18 08:26:52,254 [MainProcess INFO 'root' MainThread] Starting CGI server on stdin/stdout   
2019-04-18T08:26:52Z ImageConfigManager: 2019-04-18 08:26:52,254 [MainProcess DEBUG 'root' MainThread] b'<?xml version="1.0" encoding="UTF-8"?>\n<soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"\n xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"\n xmlns:xsd="http://www.w3.org/2001/XMLSchema"\n xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">\n<soapenv:Header>\n<operationID xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">esxui-fb2b-ce6d</operationID><taskKey xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:vim25" versionId="6.5" xsi:type="xsd:string">haTask--vim.host.ImageConfigManager.queryHostImageProfile-127269125</taskKey>\n</soapenv:Header>\n<soapenv:Body>\n<HostImageConfigGetProfile xmlns="urn:  
2019-04-18T08:26:52Z ImageConfigManager: vim25"><_this type="HostImageConfigManager">ha-image-config-manager</_this></HostImageConfigGetProfile>\n</soapenv:Body>\n</soapenv:Envelope>'   
2019-04-18T08:26:52Z ImageConfigManager: 2019-04-18 08:26:52,272 [MainProcess DEBUG 'root' MainThread] <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body><HostImageConfigGetProfileResponse xmlns='urn:vim25'><returnval><name>(Updated) ESXICUST</name><vendor>Muffin's ESX Fix</vendor></returnval></HostImageConfigGetProfileResponse></soapenv:Body></soapenv:Envelope>   
2019-04-18T08:27:09Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:27:09Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:27:09Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:27:28Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:27:28Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68629  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:27:28Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:27:48Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:27:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:27:48Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:28:07Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:28:07Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68639  
2019-04-18T08:28:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:28:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:28:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:28:07Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:28:07Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68643  
2019-04-18T08:29:07Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:29:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:29:07Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:29:26Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:29:26Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68652  
2019-04-18T08:29:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:29:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:29:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:29:26Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:29:26Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68656  
2019-04-18T08:30:01Z crond[66701]: crond: USER root pid 68657 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T08:30:01Z syslog[68661]: starting hostd probing.
2019-04-18T08:30:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:30:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:30:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:30:45Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:30:45Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68676  
2019-04-18T08:30:45Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:30:45Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:30:45Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:30:45Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:30:45Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68679  
2019-04-18T08:30:47Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T08:30:47Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T08:31:45Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:31:45Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:31:45Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:32:04Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:32:04Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68691  
2019-04-18T08:32:04Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:32:04Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:32:04Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:32:04Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:32:04Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68693  
2019-04-18T08:33:04Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:33:04Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:33:04Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:33:23Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:33:23Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68703  
2019-04-18T08:33:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:33:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:33:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:33:23Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:33:23Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68706  
2019-04-18T08:34:23Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:34:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:34:23Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:34:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:34:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68716  
2019-04-18T08:34:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:34:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:34:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:34:42Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:34:42Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68719  
2019-04-18T08:35:01Z crond[66701]: crond: USER root pid 68720 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T08:35:01Z syslog[68723]: starting hostd probing.
2019-04-18T08:35:42Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:35:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:35:42Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:35:47Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 1  
2019-04-18T08:35:47Z lwsmd: [netlogon] DNS lookup for '_ldap._tcp.dc._msdcs.Testlab.test' failed with errno 0, h_errno = 1  
2019-04-18T08:36:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:36:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68741  
2019-04-18T08:36:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:36:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:36:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:36:01Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:36:01Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68743  
2019-04-18T08:37:01Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:37:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:37:01Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:37:20Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:37:20Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68753  
2019-04-18T08:37:20Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:37:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:37:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:37:20Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:37:20Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68757  
2019-04-18T08:38:20Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:20Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:26Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:30Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:38:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68766  
2019-04-18T08:38:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:38:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:38:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:38:38Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:38:38Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68783  
2019-04-18T08:39:36Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:36Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:38Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:39:55Z lwsmd: [LwKrb5GetTgtImpl ../lwadvapi/threaded/krbtgt.c:262] KRB5 Error code: -1765328228 (Message: Cannot contact any KDC for realm 'TESTLAB.TEST')  
2019-04-18T08:39:55Z lwsmd: [lsass] Failed to run provider specific request (request code = 14, provider = 'lsa-activedirectory-provider') -> error = 40121, symbol = LW_ERROR_DOMAIN_IS_OFFLINE, client pid = 68789  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'TESTLAB.TEST', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 100  
2019-04-18T08:39:55Z lwsmd: [netlogon] Looking for a DC in domain 'Testlab.test', site '<null>' with flags 140  
2019-04-18T08:40:01Z crond[66701]: crond: USER root pid 68801 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-04-18T08:40:01Z syslog[68804]: starting hostd probing.
2019-04-18T08:40:03Z sftp-server[68817]: session opened for local user root from [192.168.10.62]
2019-04-18T08:40:03Z sftp-server[68817]: opendir "/"  
2019-04-18T08:40:03Z sftp-server[68817]: closedir "/"  
2019-04-18T08:40:08Z sftp-server[68817]: opendir "/var"  
2019-04-18T08:40:08Z sftp-server[68817]: closedir "/var"  
2019-04-18T08:40:08Z sftp-server[68817]: opendir "/var"  
2019-04-18T08:40:08Z sftp-server[68817]: closedir "/var"  
2019-04-18T08:40:10Z sftp-server[68817]: opendir "/var/log"  
2019-04-18T08:40:10Z sftp-server[68817]: closedir "/var/log"  
2019-04-18T08:40:10Z sftp-server[68817]: opendir "/var/log"  
2019-04-18T08:40:10Z sftp-server[68817]: closedir "/var/log"  
2019-04-18T08:40:13Z sftp-server[68817]: opendir "/vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker/log/syslog.log"  
2019-04-18T08:40:13Z sftp-server[68817]: sent status No such file
2019-04-18T08:40:22Z sftp-server[68817]: opendir "/var/log"  
2019-04-18T08:40:22Z sftp-server[68817]: closedir "/var/log"  
2019-04-18T08:40:22Z sftp-server[68817]: opendir "/vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker/log/syslog.log"  
2019-04-18T08:40:22Z sftp-server[68817]: sent status No such file
2019-04-18T08:40:22Z sftp-server[68823]: session opened for local user root from [192.168.10.62]
2019-04-18T08:40:22Z sftp-server[68823]: opendir "/var/log"  
2019-04-18T08:40:22Z sftp-server[68823]: closedir "/var/log"  
2019-04-18T08:40:23Z sftp-server[68823]: open "/vmfs/volumes/5a6f6646-db921f99-e5cd-b499babd4e5e/.locker/log/syslog.log" flags READ mode 0666  
chgorges
chgorges 18.04.2019 at 13:32:57
Goto Top
Hi,

I think the first step toward fixing this is to run an ESXi version that is actually compatible with the Gen7. That means for your server, ESXi 6.0 is the end of the line: http://h17007.www1.hpe.com/us/en/enterprise/servers/supportmatrix/vmwar ...

Fix that first, then we can move on ;)
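
Regardless of the HCL question, you can at least confirm on the shell what is actually running (both are stock commands; the output format may differ slightly between versions):

vmware -vl                    # exact ESXi version and build number
esxcli software profile get   # name of the installed image profile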
Spitzbube
Spitzbube 18.04.2019 at 15:56:17
Goto Top
You can't be serious face-smile First, this isn't a business environment (where, mind you, XP still gets supported for money) but an old test server; it may be getting on in years, but it did its job without complaint until this error showed up, and it is performant enough for my use case. Second, it's already running an ESXi 6.5 custom image, so what would your suggestion change or improve? This can hardly be an OS issue when the same setup ran for two years without problems.
ukulele-7
ukulele-7 18.04.2019 at 16:49:24
Goto Top
Does this really happen after every cold start, or only when the box has been powered off for a longer stretch?

I also find it hard to believe that it would accept a setting and then fail to save it just because of an unsupported version. What storage does the ESXi run from, internal disks? You could install ESXi onto a USB stick and test whether the behavior still occurs.
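
Before reinstalling anything, I'd also check on the ESXi shell whether the bootbank is still intact; your log complains about a missing boot.cfg. A rough check, untested here and assuming the standard paths:

esxcli storage filesystem list   # mounted volumes incl. the FAT bootbank partitions
df -h                            # free space; a full or read-only boot medium would explain lost settings
ls -l /bootbank                  # should contain boot.cfg plus the module archives
cat /bootbank/boot.cfg           # the very file your log says is missing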
Spitzbube
Spitzbube 18.04.2019 at 16:59:36
Goto Top
It makes no difference when, whether right after a reboot or after a few days. The machines should still be listed either way; they don't just vanish on their own...

The ESXi runs from an SD card. It's possible the card has taken a hit, but then nothing would boot at all, whereas the logs only complain about individual config files.

I've been running Muffin's ESXi Fix so far and it has served me well.

The internal disks are HP SAS 10K. Could the error be caused by a pool's disk space being very tight, i.e. not enough storage being available or some limit being reached?
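
For the record, this is roughly how I re-register the orphaned machines on the shell instead of clicking through the UI each time (datastore and VM names are just placeholders):

vim-cmd vmsvc/getallvms                                         # VMs the host currently has registered
vim-cmd solo/registervm /vmfs/volumes/datastore1/VM01/VM01.vmx  # re-register an orphan from its .vmx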
ukulele-7
ukulele-7 19.04.2019 at 08:49:59
Goto Top
The internal disks shouldn't be the problem; more likely the SD card running out of space. But I haven't run into this myself, these are just ideas.
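
You could also test directly whether the config persists at all. As far as I know, ESXi writes its running configuration to state.tgz in the bootbank via a cron job; trigger it by hand and check whether the file gets a fresh timestamp or the script throws errors:

/sbin/auto-backup.sh       # forces the periodic config backup that normally runs via cron
ls -l /bootbank/state.tgz  # saved configuration; timestamp should be current after the run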
Spitzbube
Spitzbube 27.04.2019 at 11:02:35
Goto Top
Update: I backed up the machines and reinstalled ESXi, and treated the box to a new 32 GB SD card as well. The machines and so on survived the reinstall, and the VMs now stay registered after the first reboot. Only the networks and vSwitches have to be recreated, which is a small job. The pity is that I never understood the problem and still have no idea what caused it.
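
One lesson I'm taking away: pull a host config backup from time to time, then vSwitches and co. don't have to be rebuilt by hand after a reinstall. If I read the docs right, it's two shell commands, and the second one returns a download URL for the config bundle:

vim-cmd hostsvc/firmware/sync_config    # flush the current config state to the bootbank
vim-cmd hostsvc/firmware/backup_config  # returns a URL to download the configBundle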