Error: DeployManager - hadoop failed during BigInsights installation

[INFO] Launching installer back end
[INFO] Running as root, /root/Desktop/biginsights-enterpriselinux64_b20130531_1113/installer/bin/install.sh simple-fullinstall.xml
[INFO] Distribution Vendor : ibm
[INFO] Progress - Initializing install properties
[INFO] Progress - 0%
[INFO] HadoopConf - Hadoop Configuration class is not on classpath
[INFO] MgmtConfInitializer - Biginsights nodes [oc3537047140.ibm.com]
[INFO] MgmtConfInitializer - install mode : install
[INFO] MgmtConfInitializer - distro vendor : ibm
[INFO] MgmtConfInitializer - dfs.name.dir=/hadoop/hdfs/name
[INFO] MgmtConfInitializer - fs.checkpoint.dir=/hadoop/hdfs/namesecondary
[INFO] MgmtConfInitializer - default dfs.data.dir=/hadoop/hdfs/data
[INFO] MgmtConfInitializer - mapred.system.dir=/hadoop/mapred/system
[INFO] MgmtConfInitializer - mapred.local.dir=/hadoop/mapred/local
[INFO] MgmtConfInitializer - hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
[INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.hosts=*
[INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.groups=*
[INFO] MgmtConfInitializer - datanode is not set
[INFO] MgmtConfInitializer - oc3537047140.ibm.com is NameNode
[INFO] MgmtConfInitializer - MgmtConfInitializer: oc3537047140.ibm.com is Secondary NameNode
[INFO] MgmtConfInitializer - oc3537047140.ibm.com is JobTracker
[INFO] MgmtConfInitializer - oc3537047140.ibm.com is DataNode
[INFO] MgmtConfInitializer - biginsights.home=/opt/ibm/biginsights
[INFO] MgmtConfInitializer - biginsights.var=/var/ibm/biginsights
[INFO] MgmtConfInitializer - mgmt.ssh.config=by_privileged_user
[INFO] MgmtConfInitializer - mgmt.user=biadmin
[INFO] MgmtConfInitializer - mgmt.group=501
[INFO] MgmtConfInitializer - biginsights.virtualnodes=null
[INFO] HadoopConf - Hadoop conf saved to /root/Desktop/biginsights-enterpriselinux64_b20130531_1113/installer/hdm/hadoop-conf-staging
[INFO] Progress - Check cluster environment
[INFO] Progress - 2%
[INFO] HadoopConf - Hadoop Configuration class is not on classpath
[INFO] Prerequisite check - Proceeding mandatory check to verify if given credentials are correct to access cluster nodes : [oc3537047140.ibm.com]
[INFO] Prerequisite check - Node Access : OK
[INFO] Deployer - scan all datanodes ... caculate free space of all attached disks
[INFO] @oc3537047140.ibm.com - Check directory - /hadoop/hdfs/data
[INFO] @oc3537047140.ibm.com - Check disk - /dev/mapper/vg_oc7060277565-lv_root
[INFO] @oc3537047140.ibm.com - Free space of /dev/mapper/vg_oc7060277565-lv_root is 284G
[INFO] Deployer - (Avg dfs.data.dir capacity) * 0.1 = 30494267802
[INFO] Deployer - reset dfs.datanode.du.reserved=30494267802
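For reference, the reserved value above works out to 10% of the reported partition size, assuming the installer treats 284G as 284 GiB:

    284 GiB = 284 x 1024^3 bytes = 304,942,678,016 bytes
    304,942,678,016 x 0.1 = 30,494,267,801.6 which rounds to 30,494,267,802 (the dfs.datanode.du.reserved value set above)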
[INFO] HadoopConf - Hadoop conf saved to /root/Desktop/biginsights-enterpriselinux64_b20130531_1113/installer/hdm/hadoop-conf-staging
[INFO] Prerequisite check - Ignore checking NFS/VIP for HA
[INFO] @oc3537047140.ibm.com - oc3537047140.ibm.com->127.0.0.1 : valid
[INFO] Progress - 2%
[INFO] Prerequisite check - Hostname/ip check - succeed
[INFO] @oc3537047140.ibm.com - 8888 : port available
[INFO] @oc3537047140.ibm.com - 9999 : port available
[INFO] @oc3537047140.ibm.com - 50090 : port available
[INFO] @oc3537047140.ibm.com - 50070 : port available
[INFO] @oc3537047140.ibm.com - 50030 : port available
[INFO] @oc3537047140.ibm.com - 50075 : port available
[INFO] @oc3537047140.ibm.com - 60030 : port available
[INFO] @oc3537047140.ibm.com - 50010 : port available
[INFO] @oc3537047140.ibm.com - 60010 : port available
[INFO] @oc3537047140.ibm.com - 9000 : port available
[INFO] @oc3537047140.ibm.com - 9001 : port available
[INFO] @oc3537047140.ibm.com - 1528 : port available
[INFO] @oc3537047140.ibm.com - 8080 : port available
[INFO] @oc3537047140.ibm.com - 9093 : port available
[INFO] @oc3537047140.ibm.com - 10000 : port available
[INFO] @oc3537047140.ibm.com - 8280 : port available
[INFO] @oc3537047140.ibm.com - 50020 : port available
[INFO] @oc3537047140.ibm.com - 10101 : port available
[INFO] @oc3537047140.ibm.com - 10102 : port available
[INFO] @oc3537047140.ibm.com - 60020 : port available
[INFO] @oc3537047140.ibm.com - 7052 : port available
[INFO] @oc3537047140.ibm.com - 60000 : port available
[INFO] @oc3537047140.ibm.com - 9099 : port available
[INFO] @oc3537047140.ibm.com - 2181 : port available
[INFO] @oc3537047140.ibm.com - 2182 : port available
[INFO] @oc3537047140.ibm.com - 8004 : port available
[INFO] @oc3537047140.ibm.com - 8006 : port available
[INFO] @oc3537047140.ibm.com - 8007 : port available
[INFO] Prerequisite check - Ports check - succeed
[INFO] @oc3537047140.ibm.com - SELINUX - permissive : ok
[INFO] @oc3537047140.ibm.com - OS - Red Hat Enterprise Linux Workstation release 6.4 (Santiago) Kernel on an m : supported
[INFO] Prerequisite check - Server configuration check - succeed
[INFO] @oc3537047140.ibm.com - Check directory - /hadoop/hdfs/data
[INFO] @oc3537047140.ibm.com - Check disk - /dev/mapper/vg_oc7060277565-lv_root
[INFO] @oc3537047140.ibm.com - Free space of /dev/mapper/vg_oc7060277565-lv_root is 284G
[INFO] @oc3537047140.ibm.com - Check minimal disk space requirement for biginsights installation
[INFO] @oc3537047140.ibm.com - Check disk usage of BIGINSIGHTS_HOME : /opt/ibm/biginsights
[INFO] @oc3537047140.ibm.com - Free space of /dev/mapper/vg_oc7060277565-lv_root is 284G
[INFO] @oc3537047140.ibm.com - 41.90GB free space required, 284.00GB space available
[INFO] Progress - 3%
[INFO] Prerequisite check - Disk space check - succeed
[INFO] @oc3537047140.ibm.com - Check directory - /hadoop/hdfs/data
[INFO] @oc3537047140.ibm.com - Check disk - /dev/mapper/vg_oc7060277565-lv_root
[INFO] @oc3537047140.ibm.com - Free space of /dev/mapper/vg_oc7060277565-lv_root is 284G
[INFO] @oc3537047140.ibm.com - Check datanode disk space requirement
[INFO] @oc3537047140.ibm.com - 28.40GB free space required, 284.00GB space available
[INFO] Prerequisite check - Datanode disk space check - succeed
[INFO] @oc3537047140.ibm.com - Program - scp,zip,bash,tar,ssh,unzip : installed
[INFO] Prerequisite check - Required software/libraries Check - succeed
[INFO] @oc3537047140.ibm.com - Client: oc3537047140.ibm.com >> Server: oc3537047140.ibm.com:9001 : Good
[INFO] @oc3537047140.ibm.com - Client: oc3537047140.ibm.com >> Server: oc3537047140.ibm.com:50010 : Good
[INFO] @oc3537047140.ibm.com - Client: oc3537047140.ibm.com >> Server: oc3537047140.ibm.com:9000 : Good
[INFO] Prerequisite check - Internal connectivity Check - succeed
[INFO] Check the current user
RHEL
Added existing user biadmin to group 501
[INFO] Running as biadmin, /home/biadmin/__biginsights_install/installer/bin/install.sh
[INFO] Distribution Vendor : ibm
[INFO] Extract Java for biadmin...
[INFO] Check the current user
[INFO] Current user is biadmin and 501 exists
[INFO] User login shell : BASH
[INFO] Using... BIGINSIGHTS_HOME: /opt/ibm/biginsights
[INFO] Using... BIGINSIGHTS_VAR: /var/ibm/biginsights
[INFO] Using... SSH CONFIG MODE: by_privileged_user
[INFO] Using... Biginsights administrator: biadmin
[INFO] Progress - BigInsights installation response file type: install
[INFO] HadoopConf - Hadoop Configuration class is not on classpath
[INFO] Progress - Installing HDM
[INFO] Progress - 3%
[INFO] Progress - Preparing JDK package
[INFO] Progress - 4%
[INFO] JDK at /opt/ibm/biginsights/hdm/jdk
[INFO] Progress - Preparing Hadoop package
[INFO] Progress - 6%
[INFO] Hadoop at /opt/ibm/biginsights/hdm/IHC
[INFO] Progress - Configuring password-less SSH
[INFO] Progress - 8%
[INFO] HadoopMgmtCmdline - Running configAccountAndSsh /home/biadmin/__biginsights_install/installer/bin/../../artifacts
[INFO] Cluster - Setup biginsights admin user/group, setup passwordless SSH
[INFO] Cluster - Biadmin configured locally
Password:
RHEL
Added existing user biadmin to group 501
Generating public/private rsa key pair.
Your identification has been saved in /home/biadmin/.ssh/id_rsa.
Your public key has been saved in /home/biadmin/.ssh/id_rsa.pub.
The key fingerprint is:
cb:09:d1:b0:e4:b9:9a:d0:59:a5:0e:15:72:c1:56:4c biadmin@oc3537047140.ibm.com
The key's randomart image is:
+--[ RSA 2048]----+
| ..B*E |
| *oB. |
| ..B . |
|.=o|
|.o+S|
|.ooo|
|o+|
||
||
+-----------------+
SSH ID generated at /home/biadmin/.ssh/id_rsa
[INFO] @oc3537047140.ibm.com - root@oc3537047140.ibm.com's password:
RHEL
Added existing user biadmin to group 501
SSH pub key appended to /home/biadmin/.ssh/authorized_keys
Skip ID file generation as they exist
[INFO] Progress - 11%
[INFO] Cluster - Check biginsights admin passwordless SSH setup
[INFO] @oc3537047140.ibm.com - OK, password-less SSH has setup
[INFO] @oc3537047140.ibm.com - OK, group 501 exists
[INFO] Progress - 13%
[INFO] DupHostDefender - Add other known names to ~/.ssh/known_hosts.
[INFO] Progress - 14%
[INFO] Install as HADOOP_USER biadmin
[INFO] Progress - Checking directories permission
[INFO] Progress - 17%
[INFO] HadoopMgmtCmdline - Running configDirs
[INFO] @oc3537047140.ibm.com - root@oc3537047140.ibm.com's password:
[INFO] Progress - 18%
[INFO] HadoopMgmtCmdline - Running check32or64
[INFO] Progress - Deploying IBM Hadoop Cluster
[INFO] Progress - 18%
[INFO] HadoopMgmtCmdline - Running deployForceAll
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
[INFO] HadoopConf - Reset includes (dfs.hosts)
[INFO] HadoopConf - Reset includes (mapred.hosts)
[INFO] HadoopConf - Auto set mapred.fairscheduler.allocation.file=/opt/ibm/biginsights/hadoopconf/fair-scheduler.xml
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... hdm, force
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com, force
[INFO] @oc3537047140.ibm.com - Deploy ... ihc, force
[INFO] @oc3537047140.ibm.com - Deploy ... jdk, force
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
[INFO] Progress - 25%
[INFO] Progress - Clean up possible leftover process
[INFO] Progress - 25%
[INFO] HadoopMgmtCmdline - Running cleanupForInstall
[INFO] @oc3537047140.ibm.com - /opt/ibm/biginsights/.hdm-stub/bin/managed-cleanupForInstall.sh
root@oc3537047140.ibm.com's password:
[INFO] Progress - Updating nofile and nproc
[INFO] Progress - 26%
[INFO] HadoopMgmtCmdline - Running syncnofile 65536 65536
[INFO] @oc3537047140.ibm.com - /opt/ibm/biginsights/.hdm-stub/bin/managed-rootfilehandler.sh biadmin 65536 65536
root@oc3537047140.ibm.com's password:
Set hard nofile to 65536
Set soft nofile to 65536
Set hard nproc to 65536
Set soft nproc to 65536
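The step above raises the open-file (nofile) and process (nproc) limits for the biadmin account to 65536. A quick way to confirm the new limits actually apply to a fresh biadmin login, and to see where they were persisted, is sketched below (assuming a standard RHEL 6 setup; the exact file the installer edits may differ):

    # limits seen by a fresh biadmin login shell: expect 65536 for all four
    su - biadmin -c 'ulimit -Hn; ulimit -Sn; ulimit -Hu; ulimit -Su'
    # where the limits were persisted (limits.conf or a drop-in under limits.d)
    grep -i 'biadmin\|nofile\|nproc' /etc/security/limits.conf /etc/security/limits.d/*.conf 2>/dev/null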
[INFO] Progress - 27%
[INFO] Progress - Synchronizing system time
[INFO] Progress - 27%
[INFO] HadoopMgmtCmdline - Running synctime
[INFO] @oc3537047140.ibm.com - /opt/ibm/biginsights/.hdm-stub/bin/managed-rootsynctime.sh 08/07/2013 17:01:39 Europe/Rome
root@oc3537047140.ibm.com's password:
Wed Aug 7 00:00:00 CEST 2013
Wed Aug 7 17:01:39 CEST 2013
Time updated 08/07/2013 17:01:39 Europe/Rome
[INFO] Progress - 29%
[INFO] Progress - Installing BigInsights applications
[INFO] Progress - 35%
[INFO] DeployCmdline - [ IBM InfoSphere BigInsights Enterprise Edition Enterprise Version ]
[INFO] Progress - Install hdm
[INFO] Deployer - Copy HDM essentials to other nodes
[INFO] Deployer - Setting up runtime environment, export user environment variables
[INFO] Deployer - Deploying shared lib to all nodes
[INFO] Deployer - Create default mount directory on local file system
[INFO] Deployer - Installing jaql-db2 integration
[INFO] Deployer - [INFO] Install JAQL-DB2 integration on the management node
[INFO] Untar jaql_db2...
[INFO] deploy jaql_db2 plugin succeed
[INFO] Progress - 37%
[INFO] Progress - Install gpfs-connector
[INFO] Deployer - Inject GPFS Connector in IHC
[INFO] Deployer - /usr/lpp/mmfs/lib/libgpfs.so not found
[INFO] Deployer - Re-deploy IHC to deploy GPFS Connector
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... hdm
[INFO] @oc3537047140.ibm.com - Deploy ... ihc
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [hdm, ihc]
[INFO] Deployer - Setup gpfs-connector log directory.
[INFO] Deployer - @oc3537047140.ibm.com - installing protobuf rpm
[INFO] Progress - 39%
[INFO] Progress - Install guardiumproxy
[INFO] Progress - 41%
[INFO] Progress - Install zookeeper
[INFO] @oc3537047140.ibm.com - zookeeper configuration synced
[INFO] @oc3537047140.ibm.com - zookeeper installed
[INFO] @oc3537047140.ibm.com - zookeeper started, pid 26394
[INFO] Deployer - zookeeper service started
[INFO] Progress - 43%
[INFO] Progress - Install data-compression
[INFO] Deployer - Inject data compression jar and natives in IHC/lib
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Deployer - Re-deploy IHC to deploy data compressor
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... ihc
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [ihc, ihc-conf]
[INFO] @oc3537047140.ibm.com - data-compression installed
[INFO] Progress - 45%
[INFO] Progress - Install scheduler
[INFO] Deployer - Inject flex jar in IHC/lib
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Deployer - Re-deploy IHC to deploy flex scheduler
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... ihc
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [ihc, ihc-conf]
[INFO] @oc3537047140.ibm.com - scheduler installed
[INFO] Progress - 47%
[INFO] Progress - Install adaptivemapred
[INFO] Deployer - Inject adaptivemapred jar in IHC/lib
[INFO] Deployer - link zookeeper to IHC lib
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Deployer - Re-deploy IHC to deploy adaptivemapred
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... ihc
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [ihc, ihc-conf]
[INFO] @oc3537047140.ibm.com - adaptivemapred installed
[INFO] Progress - 49%
[INFO] Progress - Install sftp
[INFO] Deployer - Inject sftp jars into IHC/lib
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Deployer - Re-deploy IHC to deploy sftp
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... ihc
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [ihc, ihc-conf]
[INFO] @oc3537047140.ibm.com - sftp installed
[INFO] Progress - 51%
[INFO] Progress - Install text-analytics
[INFO] Deployer - Found LanguageWare dir : /opt/ibm/biginsights/textanalytics/lib/LanguageWare-7.2.0.2
[INFO] @oc3537047140.ibm.com - text-analytics installed
[INFO] Progress - 53%
[INFO] Progress - Install hadoop
[INFO] Deployer - deploy hadoop to cluster
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... hdm, force
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com, force
[INFO] @oc3537047140.ibm.com - Deploy ... ihc, force
[INFO] @oc3537047140.ibm.com - Deploy ... jdk, force
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
[INFO] @oc3537047140.ibm.com - /opt/ibm/biginsights/.hdm-stub/bin/managed-format.sh
13/08/07 17:09:24 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = oc3537047140.ibm.com/127.0.0.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 1.1.1
STARTUP_MSG: build = git://dasani.svl.ibm.com/ on branch (no branch) -r 70b5aad8822a30795c1acdb966c97316387e1fc0; compiled by jenkins on Thu May 30 17:51:51 PDT 2013
************************************************************/
Re-format filesystem in /hadoop/hdfs/name ? (Y or N) 13/08/07 17:09:24 INFO util.GSet: VM type = 64-bit
13/08/07 17:09:24 INFO util.GSet: 2% max memory = 20.0 MB
13/08/07 17:09:24 INFO util.GSet: capacity = 2^21 = 2097152 entries
13/08/07 17:09:24 INFO util.GSet: recommended=2097152, actual=2097152
13/08/07 17:09:24 INFO namenode.FSNamesystem: fsOwner=biadmin
13/08/07 17:09:24 INFO namenode.FSNamesystem: supergroup=supergroup
13/08/07 17:09:24 INFO namenode.FSNamesystem: isPermissionEnabled=true
13/08/07 17:09:24 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
13/08/07 17:09:24 WARN namenode.FSNamesystem: The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
13/08/07 17:09:24 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
13/08/07 17:09:24 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = -1
13/08/07 17:09:24 INFO namenode.NameNode: Caching file names occuring more than 10 times
13/08/07 17:09:24 INFO common.Storage: Image file of size 113 saved in 0 seconds.
13/08/07 17:09:25 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/hdfs/name/current/edits
13/08/07 17:09:25 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/hdfs/name/current/edits
13/08/07 17:09:26 INFO common.Storage: Storage directory /hadoop/hdfs/name has been successfully formatted.
13/08/07 17:09:26 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at oc3537047140.ibm.com/127.0.0.1
************************************************************/
[INFO] Deployer - Change the permission of hadoop.tmp.dir /var/ibm/biginsights/hadoop/tmp
[INFO] Deployer - Update hadoop-env.sh
[INFO] Deployer - ConfigureJmx:true
[INFO] Deployer - NameNode JMX Port:8004;JobTracker JMX Port:8006;DataNode JMX Port:8007
[INFO] Cluster - Number of slave nodes : 1.0
[INFO] Cluster - init mapred.submit.replication : 1
[INFO] Cluster - dfs.replication : 1
[INFO] Cluster - final mapred.submit.replication : 1
[INFO] Cluster - Update hadoop daemon heap size
[INFO] Deployer - Setup linux task controller
[INFO] LinuxTaskControllerSetup - Update core-site.xml
[INFO] LinuxTaskControllerSetup - Reset hadoop.tmp.dir=/var/ibm/biginsights/hadoop/tmp/${user.name}
[INFO] LinuxTaskControllerSetup - Update mapred-site.xml
[INFO] LinuxTaskControllerSetup - Reset mapred.task.tracker.taskcontroller=org.apache.hadoop.mapred.LinuxTaskController
[INFO] LinuxTaskControllerSetup - Reset mapreduce.tasktracker.group=501
[INFO] LinuxTaskControllerSetup - Update task-controller.cfg
[INFO] LinuxTaskControllerSetup - Reset hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
[INFO] LinuxTaskControllerSetup - Reset mapred.local.dir=/hadoop/mapred/local
[INFO] LinuxTaskControllerSetup - Reset mapreduce.tasktracker.group=501
[INFO] LinuxTaskControllerSetup - Reset min.user.id=100
[INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Deploy ... ihc-conf@oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [ihc-conf]
[INFO] @oc3537047140.ibm.com - Create task-controller configuration directory
[INFO] @oc3537047140.ibm.com - Update task-controller executable owner and permission
[INFO] @oc3537047140.ibm.com - Update ownership and permissions for task-controller.cfg
[INFO] @oc3537047140.ibm.com - Copy task-controller.cfg to hadoop security conf directory
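Based on the Reset lines above, the task-controller.cfg that gets copied to the Hadoop security conf directory should look roughly like the sketch below (reconstructed from the log, not the literal file; the task errors further down show the runtime copy being read from /var/bi-task-controller-conf/taskcontroller.cfg):

    hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    mapred.local.dir=/hadoop/mapred/local
    mapreduce.tasktracker.group=501
    min.user.id=100

Note that mapreduce.tasktracker.group is set to the numeric value 501 rather than a group name, which is worth keeping in mind when reading the LinuxTaskController failures later in the log.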
[INFO] @oc3537047140.ibm.com - namenode started, pid 16970
[INFO] @oc3537047140.ibm.com - secondarynamenode started, pid 17342
[INFO] @oc3537047140.ibm.com - datanode started, pid 17562
[INFO] Progress - 54%
[INFO] Deployer - Waiting for Namenode to exit safe mode...
[INFO] Deployer - HDFS cluster started successfully
[INFO] @oc3537047140.ibm.com - jobtracker started, pid 18012
[INFO] @oc3537047140.ibm.com - tasktracker started, pid 18304
[INFO] Progress - 55%
[INFO] Deployer - MapReduce cluster started successfully
[INFO] @oc3537047140.ibm.com - hadoop installed
[INFO] Progress - Install derby
[INFO] @oc3537047140.ibm.com - derby installed
[INFO] @oc3537047140.ibm.com - derby started, pid 19679
[INFO] Progress - 57%
[INFO] Progress - Install jaql
[INFO] @oc3537047140.ibm.com - jaql installed
[INFO] Progress - 59%
[INFO] Progress - Install hive
[INFO] @oc3537047140.ibm.com - hive library deployed
[INFO] @oc3537047140.ibm.com - hive installed
[INFO] Progress - 61%
[INFO] Progress - Install pig
[INFO] @oc3537047140.ibm.com - pig installed
[INFO] Progress - 63%
[INFO] Progress - Install lucene
[INFO] @oc3537047140.ibm.com - lucene installed
[INFO] Progress - 65%
[INFO] Progress - Install hbase
[INFO] Deployer - deploying library hbase
[INFO] Deployer - Found hase jar file : /opt/ibm/biginsights/hbase/hbase-0.94.3-security.jar
[INFO] Deployer - Found zookeeper jar file : /opt/ibm/biginsights/hbase/lib/zookeeper-3.4.5.jar
[INFO] Deployer - Symlink hbase.jar to overlay or BI jar files @oc3537047140.ibm.com
[INFO] Deployer - Create symlink for lib/zookeeper.jar file to reference overlay or BI jar files @oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - hbase installed
[INFO] Deployer - check zookeeper services, make sure zookeeper service is started before start hbase service
[INFO] @oc3537047140.ibm.com - hbase-master(active) started
[INFO] @oc3537047140.ibm.com - hbase-regionserver started
[INFO] Deployer - hbase service started
[INFO] Progress - 67%
[INFO] Progress - Install flume
[INFO] @oc3537047140.ibm.com - Deploy ... flume
[INFO] @oc3537047140.ibm.com - flume installed
[INFO] Progress - 68%
[INFO] Progress - Install ei
[INFO] @oc3537047140.ibm.com - Deploy ... ei
[INFO] @oc3537047140.ibm.com - ei installed
[INFO] Progress - 70%
[INFO] Progress - Install machine-learning
[INFO] @oc3537047140.ibm.com - Deploy ... machine-learning
[INFO] @oc3537047140.ibm.com - machine-learning installed
[INFO] Progress - 72%
[INFO] Progress - Install hcatalog
[INFO] @oc3537047140.ibm.com - Deploy ... hcatalog
[INFO] @oc3537047140.ibm.com - hcatalog installed
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/components/hcatalog/todeploy/hcatalog-conf.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... hcatalog-conf, force
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [hcatalog-conf]
[INFO] Progress - 74%
[INFO] Progress - Install sqoop
[INFO] @oc3537047140.ibm.com - Deploy ... sqoop
[INFO] @oc3537047140.ibm.com - sqoop installed
[INFO] Progress - 76%
[INFO] Progress - Install bigsql
[INFO] @oc3537047140.ibm.com - Deploy ... bigsql
[INFO] @oc3537047140.ibm.com - bigsql installed
[INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/components/bigsql/todeploy/bigsql-conf.tar.gz
[INFO] @oc3537047140.ibm.com - Deploy ... bigsql-conf, force
[INFO] @oc3537047140.ibm.com - Packages all up-to-date, after deploy [bigsql-conf]
[INFO] Progress - 78%
[INFO] Progress - Install bigindex
[INFO] @oc3537047140.ibm.com - bigindex installed
[INFO] Progress - 80%
[INFO] Progress - Install oozie
[INFO] Deployer - deploying library oozie
[INFO] @oc3537047140.ibm.com - oozie installed
[INFO] Deployer - Update biginsights-oozie.properties succeed.
[INFO] Progress - 82%
[INFO] Progress - Install orchestrator
[INFO] @oc3537047140.ibm.com - orchestrator installed
[INFO] Progress - 84%
[INFO] Progress - Install jaqlserver
[INFO] Progress - 86%
[INFO] Progress - Install console
[INFO] Deployer - repack enterpsie console war file before deploy ...
[INFO] Deployer - found : /tmp/biginsightsConsole.properties6262178694317033460.tmp, updating ...
[INFO] Deployer - reset appPath=/opt/ibm/biginsights/console/wlp/usr/servers/waslpserver/apps/BigInsightsWebApp.ear/EnterpriseConsole.war
[INFO] Deployer - reset com.ibm.biginsights.catalog.dbUrl=jdbc:derby://oc3537047140.ibm.com:1528//var/ibm/biginsights/derby/toolsCatalog;create=true
[INFO] Deployer - Copy /tmp/bouncer.jar6525045569696993899.tmp to /opt/ibm/biginsights/hdm/components/console/binary/bouncer.jar
[INFO] Deployer - node oc3537047140.ibm.com Directory /etc/pam.d exists
[INFO] Deployer - create server on waslp ...
[INFO] Deployer - Server waslp-server created.
[INFO] Deployer - pack hadoop conf jar ...
[INFO] Deployer - added manifest
[INFO] Deployer - adding: .deploy.cksum(in = 54) (out= 56)(deflated -3%)
[INFO] Deployer - adding: taskcontroller.cfg(in = 287) (out= 176)(deflated 38%)
[INFO] Deployer - adding: configuration.xsl(in = 535) (out= 270)(deflated 49%)
[INFO] Deployer - adding: hdfs-site.xml(in = 2491) (out= 726)(deflated 70%)
[INFO] Deployer - adding: fair-scheduler.xml(in = 327) (out= 206)(deflated 37%)
[INFO] Deployer - adding: core-site.xml(in = 1808) (out= 668)(deflated 63%)
[INFO] Deployer - adding: capacity-scheduler.xml(in = 7457) (out= 1863)(deflated 75%)
[INFO] Deployer - adding: mapred-queue-acls.xml(in = 229) (out= 153)(deflated 33%)
[INFO] Deployer - adding: mapred-site.xml(in = 5592) (out= 1541)(deflated 72%)
[INFO] Deployer - adding: hadoop-metrics2.properties(in = 1488) (out= 425)(deflated 71%)
[INFO] Deployer - adding: hadoop-policy.xml(in = 4644) (out= 781)(deflated 83%)
[INFO] Deployer - adding: ssl-client.xml.example(in = 1243) (out= 263)(deflated 78%)
[INFO] Deployer - adding: slaves(in = 21) (out= 23)(deflated -9%)
[INFO] Deployer - adding: hadoop-env.sh(in = 3648) (out= 1432)(deflated 60%)
[INFO] Deployer - adding: ibm-hadoop.properties(in = 2302) (out= 913)(deflated 60%)
[INFO] Deployer - adding: log4j.properties(in = 8179) (out= 1814)(deflated 77%)
[INFO] Deployer - adding: flex-scheduler.xml(in = 167) (out= 96)(deflated 42%)
[INFO] Deployer - adding: includes(in = 21) (out= 23)(deflated -9%)
[INFO] Deployer - adding: masters(in = 42) (out= 44)(deflated -4%)
[INFO] Deployer - adding: ssl-server.xml.example(in = 1195) (out= 259)(deflated 78%)
[INFO] Deployer - pack oozie conf jar ...
[INFO] Deployer - added manifest
[INFO] Deployer - adding: hadoop-config.xml(in = 1409) (out= 662)(deflated 53%)
[INFO] Deployer - adding: oozie-env.sh(in = 1679) (out= 823)(deflated 50%)
[INFO] Deployer - adding: action-conf/(in = 0) (out= 0)(stored 0%)
[INFO] Deployer - adding: action-conf/hive.xml(in = 1305) (out= 691)(deflated 47%)
[INFO] Deployer - adding: oozie-default.xml(in = 63999) (out= 9526)(deflated 85%)
[INFO] Deployer - adding: oozie-log4j.properties(in = 3550) (out= 976)(deflated 72%)
[INFO] Deployer - adding: hadoop-conf/(in = 0) (out= 0)(stored 0%)
[INFO] Deployer - adding: hadoop-conf/core-site.xml(in = 1409) (out= 662)(deflated 53%)
[INFO] Deployer - adding: oozie-site.xml(in = 13617) (out= 3388)(deflated 75%)
[INFO] Deployer - adding: adminusers.txt(in = 824) (out= 474)(deflated 42%)
[INFO] Deployer - pack hbase conf jar ...
[INFO] Deployer - added manifest
[INFO] Deployer - adding: hbase-env.sh(in = 5044) (out= 2005)(deflated 60%)
[INFO] Deployer - adding: hadoop-metrics.properties(in = 3015) (out= 857)(deflated 71%)
[INFO] Deployer - adding: hbase-policy.xml(in = 2250) (out= 821)(deflated 63%)
[INFO] Deployer - adding: log4j.properties(in = 3593) (out= 1087)(deflated 69%)
[INFO] Deployer - adding: regionservers(in = 21) (out= 23)(deflated -9%)
[INFO] Deployer - adding: hbase-site.xml(in = 1591) (out= 344)(deflated 78%)
[INFO] Deployer - update /opt/ibm/biginsights/console/wlp/usr/servers/waslpserver/bootstrap.properties
[INFO] Deployer - copy sampApp to console dir
[INFO] Deployer - install EnterpriseConsole.war
[INFO] Deployer - /opt/ibm/biginsights
[INFO] Deployer - install Bouncer.war
[INFO] Deployer - inject bouncer.jar
[INFO] Deployer - inflated: PAMRegistry-1.0.mf
[INFO] Deployer - install cre.war
[INFO] Deployer - install productserver.war
[INFO] Deployer - install bigsqlweb.war
[INFO] Deployer - /opt/ibm/biginsights/console/wlp
[INFO] Deployer - rm: cannot remove `/opt/ibm/biginsights/console/wlp/usr/servers/waslpserver/logs/console.log': No such file or directory
[INFO] Deployer - Server waslp-server started with process ID 8210.
[INFO] Deployer - [AUDIT ] CWWKF0011I: The server waslp-server is ready to run a smarter planet.
[INFO] Deployer - BigInsights Management Console started, pid 8210
[INFO] @oc3537047140.ibm.com - console installed
[INFO] Progress - 88%
[INFO] Progress - Install eclipsetooling
[INFO] Deployer - Stopping BigInsights Management Console...
[INFO] Deployer - Server waslp-server stopped.
[INFO] Deployer - Use built-in Hadoop OSGI bundle
[INFO] Deployer - Use built-in Hbase OSGI bundle
[INFO] Deployer - install ... updatesite.war
[INFO] Deployer - /opt/ibm/biginsights/console/wlp
[INFO] Deployer - Server waslp-server started with process ID 8964.
[INFO] Deployer - [AUDIT ] CWWKF0011I: The server waslp-server is ready to run a smarter planet.
[INFO] Deployer - BigInsights Management Console started, pid 8964
[INFO] Progress - 90%
[INFO] Progress - Install sheets
[INFO] Deployer - Set sheets frontend port to 8080
[INFO] Deployer - Initialize sheets properties, saved in /opt/ibm/biginsights/hdm/components/sheets/biginsights-bs.properties
[INFO] BigSheetsDeployer - bdm.sh using BIGINSIGHTS_HOME: /opt/ibm/biginsights
bdm.sh using BIGINSIGHTS_VAR: /var/ibm/biginsights
[INFO] @oc3537047140.ibm.com - set dataDir with a default value /biginsights/sheets
Archive: /opt/ibm/biginsights/bdm_tmp/bigsheets-wasce.zip
inflating: classpath.xml
inflating: frontend.war
inflating: Jobserver.war
inflating: geronimo-web-auth.xml
inflating: geronimo-web-noauth.xml
inflating: web-noauth.xml
inflating: web.xml
[INFO] @oc3537047140.ibm.com - after unzip wasce tar
[INFO] @oc3537047140.ibm.com - after WEB_INF
adding: WEB-INF/web.xml(in = 6522) (out= 1263)(deflated 80%)
cp -f frontend-deploy.war /opt/ibm/biginsights/console/wlp/usr/servers/waslpserver/apps/BigInsightsWebApp.ear/
cp -f Jobserver.war /opt/ibm/biginsights/console/wlp/usr/servers/waslpserver/apps/BigInsightsWebApp.ear/
[INFO] Deployer - Stopping BigInsights Management Console...
[INFO] Deployer - Server waslp-server stopped.
[INFO] Deployer - Found commons-httpclient jar file : /opt/ibm/biginsights/IHC/lib/commons-httpclient-3.0.1.jar
[INFO] Deployer - Start Sheets
[INFO] Deployer - starting pigserver, logging to /var/ibm/biginsights/sheets/logs/bigsheets-biadmin-pigserver.out
[INFO] Deployer - /opt/ibm/biginsights/console/wlp
[INFO] Deployer - Server waslp-server started with process ID 11267.
[INFO] Deployer - [AUDIT ] CWWKF0011I: The server waslp-server is ready to run a smarter planet.
[INFO] Deployer - BigInsights Management Console started, pid 11267
[INFO] @oc3537047140.ibm.com - sheets installed
[INFO] Progress - 92%
[INFO] Progress - Install HAManager
[INFO] DeployManager - HAManager is skipped from deployment...
[INFO] Progress - 94%
[INFO] Progress - Install import-export
[INFO] @oc3537047140.ibm.com - Deploy ... import-export
[INFO] @oc3537047140.ibm.com - import-export installed
[INFO] Progress - 96%
[INFO] Progress - Install httpfs
[INFO] Deployer - httpfs is embeded in IHC
[INFO] Deployer - start installConfig of httpfs
[INFO] Deployer - successfully generated all configuration files for httpfs
[INFO] Deployer - successfully save installed nodes info for httpfs
[INFO] Deployer - successfully creat httpfs verify scripts for httpfs
[INFO] Deployer - check jettyProps:/opt/ibm/biginsights/hdm/components/httpfs/conf/jetty.properties
[INFO] Deployer - /opt/ibm/biginsights/hdm/components/httpfs/conf/jetty.properties does not exist...
[INFO] @oc3537047140.ibm.com - httpfs-server configuration synced
[INFO] Deployer - successfully synchronized httpfs configuration files to all installing nodes
[INFO] @oc3537047140.ibm.com - httpfs-server installed
[INFO] Progress - 98%
[INFO] Progress - Install monitoring
[INFO] Deployer - Prepare Monitoring component binary file...
[INFO] Deployer - Install the monitoring component...
[INFO] Deployer - Deploy monitoring binary files...
[INFO] Deployer - Update permissions...
[INFO] Deployer - Install monitoring binary files...
[INFO] Deployer - post-install clean up...
[INFO] Deployer - Configure monitoring component...
[INFO] Deployer - Generate monitoring configuration files...
[INFO] Deployer - Update monitoring configurations
[INFO] Deployer - Update monitoring agent env...
[INFO] Deployer - Populate agents file...
[INFO] Deployer - update chukwa-agent-conf.xml
[INFO] Deployer - update Oozie shared library in the Hadoop file system
[INFO] Deployer - Waiting for Namenode to exit safe mode...
[INFO] @oc3537047140.ibm.com - monitoring configuration synced
[INFO] Deployer - Create HBase table, hbaseHome:/opt/ibm/biginsights/hbase; schema:/opt/ibm/biginsights/monitoring/etc/chukwa/hbase.schema; monitoringHome:/opt/ibm/biginsights/monitoring
[WARN] Deployer - Cannot tell which is the active master because: [Failed to get active master from api using reflection, because Timeout while trying to get active master from hbase api using reflection]
[INFO] @oc3537047140.ibm.com - hbase-master started, pid 27318
[INFO] @oc3537047140.ibm.com - hbase-regionserver started, pid 27515
[INFO] Deployer - hbase service started
[INFO] Deployer - Create HBase table via oc3537047140.ibm.com
[INFO] Deployer - Deploy monitoring system metrics...
[INFO] Deployer - stopping pigserver
[INFO] Deployer - Stopping BigInsights Management Console...
[INFO] Deployer - Server waslp-server stopped.
[INFO] Deployer - /opt/ibm/biginsights/console/wlp
[INFO] Deployer - starting pigserver, logging to /var/ibm/biginsights/sheets/logs/bigsheets-biadmin-pigserver.out
[INFO] Deployer - Server waslp-server started with process ID 15687.
[INFO] Deployer - [AUDIT ] CWWKF0011I: The server waslp-server is ready to run a smarter planet.
[INFO] Deployer - BigInsights Management Console started, pid 15687
[INFO] Deployer - Start monitoring...
[INFO] Deployer - Starting monitoring agent on oc3537047140.ibm.com
[INFO] @oc3537047140.ibm.com - monitoring started, pid 17065
[INFO] Deployer - monitoring service started
[INFO] Deployer - add adaptors for the monitoring agents...
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=NamenodeProcessor, AdaptorParams=localhost 8004 60 Hadoop:*}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : NamenodeProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 8004 60 Hadoop:* , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=JTProcessor, AdaptorParams=localhost 8006 60 Hadoop:*}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : JTProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 8006 60 Hadoop:* , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=DatanodeProcessor, AdaptorParams=localhost 8007 60 Hadoop:*}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : DatanodeProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 8007 60 Hadoop:* , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=HbaseMasterProcessor, AdaptorParams=localhost 10101 60 hadoop:service=Master,name=Master}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : HbaseMasterProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 10101 60 hadoop:service=Master,name=Master , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=HbaseMasterProcessor, AdaptorParams=localhost 10101 60 hadoop:service=Master,name=MasterStatistics}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : HbaseMasterProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 10101 60 hadoop:service=Master,name=MasterStatistics , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=HbaseRegionServerProcessor, AdaptorParams=localhost 10102 60 hadoop:service=RegionServer,name=RegionServerStatistics}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : HbaseRegionServerProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 10102 60 hadoop:service=RegionServer,name=RegionServerStatistics , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor, DataType=ZookeeperProcessor, AdaptorParams=localhost 2182 60 org.apache.ZooKeeperService:*}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : ZookeeperProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.JMXAdaptor , AdaptorParams : localhost 2182 60 org.apache.ZooKeeperService:* , Offset : 0 }
[INFO] Deployer - add adaptors for oc3537047140.ibm.com; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.RestAdaptor, DataType=OozieProcessor, AdaptorParams=localhost 8280 oozie/v1/admin/instrumentation 60}
[INFO] Deployer - URL:http://oc3537047140.ibm.com:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : OozieProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.RestAdaptor , AdaptorParams : localhost 8280 oozie/v1/admin/instrumentation 60 , Offset : 0 }
[INFO] Deployer - add adaptors for ; adaptor data:{Offset=0, AdaptorClass=org.apache.hadoop.chukwa.datacollection.adaptor.FlumeRestAdaptor, DataType=FlumeProcessor, AdaptorParams=localhost 35862 node/reports 60}
[INFO] Deployer - URL:http://:9099/rest/v1/adaptor add >> Adaptor content:{ DataType : FlumeProcessor , AdaptorClass : org.apache.hadoop.chukwa.datacollection.adaptor.FlumeRestAdaptor , AdaptorParams : localhost 35862 node/reports 60 , Offset : 0 }
[INFO] Deployer - Resolve host name used by agent for oc3537047140.ibm.com
[INFO] Deployer - Stop monitoring agents...
[INFO] Deployer - Stopping monitoring...
[INFO] Deployer - Stop monitoring agents on [oc3537047140.ibm.com]
[INFO] Deployer - Stop agent with pid 17065
[INFO] Deployer - monitoring service stopped
[INFO] Deployer - Disable monitoring...
[INFO] DeployManager - biginsights.properties updated, disable monitoring
[INFO] @oc3537047140.ibm.com - monitoring installed
[INFO] Progress - 100%
[WARN] DeployManager - HAManager deployer returned an empty deploy result which is treated as installation skipped
[INFO] DeployManager - Install; SUCCEEDED components: [hdm, gpfs-connector, guardiumproxy, zookeeper, data-compression, scheduler, adaptivemapred, sftp, text-analytics, hadoop, derby, jaql, hive, pig, lucene, hbase, flume, ei, machine-learning, hcatalog, sqoop, bigsql, bigindex, oozie, orchestrator, jaqlserver, console, eclipsetooling, sheets, HAManager, import-export, httpfs, monitoring]; Consumes : 1090324ms
[INFO] Progress - Start services
[INFO] Progress - 0%
[INFO] DeployCmdline - [ IBM InfoSphere BigInsights Enterprise Edition Enterprise Version ]
[INFO] Progress - Start zookeeper
[INFO] @oc3537047140.ibm.com - zookeeper already running, pid 26394
[INFO] Deployer - zookeeper service started
[INFO] Progress - 5%
[INFO] Progress - Start hadoop
[INFO] @oc3537047140.ibm.com - namenode already running, pid 16970
[INFO] @oc3537047140.ibm.com - secondarynamenode already running, pid 17342
[INFO] @oc3537047140.ibm.com - datanode already running, pid 17562
[INFO] Progress - 8%
[INFO] Deployer - Waiting for Namenode to exit safe mode...
[INFO] Deployer - HDFS cluster started successfully
[INFO] @oc3537047140.ibm.com - jobtracker already running, pid 18012
[INFO] @oc3537047140.ibm.com - tasktracker already running, pid 18304
[INFO] Progress - 10%
[INFO] Deployer - MapReduce cluster started successfully
[INFO] Progress - Start derby
[INFO] @oc3537047140.ibm.com - derby already running, pid 19679
[INFO] Progress - 15%
[INFO] Progress - Start hive
[INFO] @oc3537047140.ibm.com - derby already running, pid 19679
[INFO] Progress - 16%
[INFO] @oc3537047140.ibm.com - hive-web-interface started, pid 19903
[INFO] @oc3537047140.ibm.com - hive-server started, pid 20320
[INFO] Progress - 20%
[INFO] Progress - Start hbase
[INFO] Deployer - check zookeeper services, make sure zookeeper service is started before start hbase service
[INFO] @oc3537047140.ibm.com - hbase-master(active) already running, pid 27318
[INFO] @oc3537047140.ibm.com - hbase-regionserver already running, pid 27515
[INFO] Deployer - hbase service started
[INFO] Progress - 25%
[INFO] Progress - Start bigsql
[INFO] @oc3537047140.ibm.com - bigsql-server started, pid 21719
[INFO] Progress - 30%
[INFO] Progress - Start oozie
[INFO] @oc3537047140.ibm.com - oozie started, pid 21899
[INFO] Progress - 35%
[INFO] Progress - Start orchestrator
[INFO] @oc3537047140.ibm.com - orchestrator started, pid 22459
[INFO] Progress - 40%
[INFO] Progress - Start console
[INFO] Deployer - /opt/ibm/biginsights/console/wlp
[INFO] Deployer - BigInsights Management Console is already started, pid 15687
[INFO] Progress - 45%
[INFO] Progress - Start httpfs
[INFO] @oc3537047140.ibm.com - httpfs-server started, pid 23156
[INFO] Deployer - httpfs service started
[INFO] Progress - 50%
[INFO] DeployManager - Start; SUCCEEDED components: [zookeeper, hadoop, derby, hive, hbase, bigsql, oozie, orchestrator, console, httpfs]; Consumes : 62313ms
[INFO] Progress - Verifying BigInsights installation
[INFO] Progress - 50%
[INFO] DeployCmdline - [ IBM InfoSphere BigInsights Enterprise Edition Enterprise Version ]
[INFO] Progress - Validate hadoop
[INFO] Deployer - Running Hadoop terasort example
[INFO] Progress - 75%
[INFO] Progress - Validate hbase
[INFO] Deployer - Try to start hbase if hbase service is stopped...
[INFO] Deployer - Double check whether hbase is started successfully...
[INFO] @oc3537047140.ibm.com - hbase-master(active) started, pid 27318
[INFO] @oc3537047140.ibm.com - hbase-regionserver started, pid 27515
[INFO] Deployer - hbase service started
[INFO] Deployer - hbase service is healthy
[INFO] Progress - 100%
[ERROR] DeployManager - OPERATION ERROR -- Validate [hadoop, hbase]:
[ERROR] DeployManager - ------------------------------------------------------
[ERROR] DeployManager - hadoop failed
java.io.IOException: exit code: 2 -- /opt/ibm/biginsights/hdm/bin/hdm checkdeploy
[INFO] Progress - Checking Hadoop cluster started
[INFO] DeployCmdline - [ IBM InfoSphere BigInsights Enterprise Edition Enterprise Version ]
[INFO] Progress - Start hadoop
[INFO] @oc3537047140.ibm.com - namenode already running, pid 16970
[INFO] @oc3537047140.ibm.com - secondarynamenode already running, pid 17342
[INFO] @oc3537047140.ibm.com - datanode already running, pid 17562
[INFO] Progress - 50%
[INFO] Deployer - Waiting for Namenode to exit safe mode...
[INFO] Deployer - HDFS cluster started successfully
[INFO] @oc3537047140.ibm.com - jobtracker already running, pid 18012
[INFO] @oc3537047140.ibm.com - tasktracker already running, pid 18304
[INFO] Progress - 100%
[INFO] Deployer - MapReduce cluster started successfully
[INFO] DeployManager - Start; SUCCEEDED components: [hadoop]; Consumes : 4722ms
[INFO] Progress - Waiting for exit of safe mode
[INFO] HadoopMgmtCmdline - Running safemode wait
[INFO] Progress - Running terasort example
>> /opt/ibm/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
rmr: cannot remove /hdm-tera-input: No such file or directory.
rmr: cannot remove /hdm-tera-output: No such file or directory.
rmr: cannot remove /hdm-tera-report: No such file or directory.
>> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.1.1.jar teragen -Dmapred.map.tasks=1 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m 10000 /hdm-tera-input
Generating 10000 using 1 maps with step of 10000
13/08/07 17:22:19 INFO mapred.JobClient: Running job: job_201308071709_0001
13/08/07 17:22:20 INFO mapred.JobClient: map 0% reduce 0%
13/08/07 17:22:20 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000002_0, Status : FAILED
Error initializing attempt_201308071709_0001_m_000002_0:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:20 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_0&filter=stdout
13/08/07 17:22:20 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_0&filter=stderr
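The remaining task attempts below fail in exactly the same way, so teragen never writes any input and the terasort validation cannot pass. The failing step is the LinuxTaskController reading /var/bi-task-controller-conf/taskcontroller.cfg and then being unable to get group information for 501. A few standard checks that may help narrow this down (the paths and the group value are taken from the log above):

    getent group | grep '^501:'     # is there a group literally named 501?
    getent group 501                # which group owns GID 501 (numeric lookup)
    id biadmin                      # biadmin's primary and secondary groups
    cat /var/bi-task-controller-conf/taskcontroller.cfg   # what mapreduce.tasktracker.group is set to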
13/08/07 17:22:21 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000002_1, Status : FAILED
Error initializing attempt_201308071709_0001_m_000002_1:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_1&filter=stdout
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_1&filter=stderr
13/08/07 17:22:21 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000002_2, Status : FAILED
Error initializing attempt_201308071709_0001_m_000002_2:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_2&filter=stdout
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000002_2&filter=stderr
13/08/07 17:22:21 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000001_0, Status : FAILED
Error initializing attempt_201308071709_0001_m_000001_0:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_0&filter=stdout
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_0&filter=stderr
13/08/07 17:22:21 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000001_1, Status : FAILED
Error initializing attempt_201308071709_0001_m_000001_1:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_1&filter=stdout
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_1&filter=stderr
13/08/07 17:22:21 INFO mapred.JobClient: Task Id : attempt_201308071709_0001_m_000001_2, Status : FAILED
Error initializing attempt_201308071709_0001_m_000001_2:
java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
Can't get group information for 501 - Success.
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
at java.lang.Thread.run(Thread.java:738)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
... 8 more
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_2&filter=stdout
13/08/07 17:22:21 WARN mapred.JobClient: Error reading task outputhttp://oc3537047140.ibm.com:50060/tasklog?plaintext=true&attemptid=attempt_201308071709_0001_m_000001_2&filter=stderr
13/08/07 17:22:21 INFO mapred.JobClient: Job complete: job_201308071709_0001
13/08/07 17:22:21 INFO mapred.JobClient: Counters: 4
13/08/07 17:22:21 INFO mapred.JobClient: Job Counters
13/08/07 17:22:21 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=0
13/08/07 17:22:21 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/08/07 17:22:21 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/08/07 17:22:21 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
13/08/07 17:22:21 INFO mapred.JobClient: Job Failed: JobCleanup Task Failure, Task: task_201308071709_0001_m_000001
java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1327)
at org.apache.hadoop.examples.terasort.TeraGen.run(TeraGen.java:352)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.terasort.TeraGen.main(TeraGen.java:357)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
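Every task attempt above fails the same way: the LinuxTaskController reads /var/bi-task-controller-conf/taskcontroller.cfg and then aborts with exit code 24 and "Can't get group information for 501 - Success.", so no TeraGen task ever starts and the job dies in its (equally failing) cleanup task. The message indicates that group lookup for GID 501 fails on the TaskTracker node, i.e. no group with that numeric ID can be resolved there. A minimal diagnostic sketch, run as root on the affected node; the biadmin user name and config path are taken from this log, everything else is an assumption to adapt to your system:

# Inspect the task-controller configuration the log says is being read
cat /var/bi-task-controller-conf/taskcontroller.cfg
# Check whether any group resolves for GID 501 (getent prints nothing and exits non-zero if not)
getent group 501
# Check which groups the submitting user actually belongs to
id biadmin

If getent group 501 returns nothing, the numeric group the task controller was configured with has no matching entry in /etc/group (or the configured name service) on this node, which is exactly what the "Can't get group information for 501" output reports.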
>> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.1.1.jar terasort -Dmapred.reduce.tasks=4 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m -Dio.sort.record.percent=0.17 /hdm-tera-input /hdm-tera-output
13/08/07 17:22:23 INFO terasort.TeraSort: starting
13/08/07 17:22:23 INFO mapred.FileInputFormat: Total input paths to process : 0
java.lang.ArithmeticException: divide by zero
at org.apache.hadoop.examples.terasort.TeraInputFormat.writePartitionFile(TeraInputFormat.java:118)
at org.apache.hadoop.examples.terasort.TeraSort.run(TeraSort.java:243)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.terasort.TeraSort.main(TeraSort.java:257)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
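The ArithmeticException here is a downstream symptom rather than a new problem: because every TeraGen attempt failed, /hdm-tera-input contains no data files, TeraSort reports "Total input paths to process : 0", and the partition-sampling arithmetic in TeraInputFormat.writePartitionFile ends up dividing by the number of input splits, which is zero (TeraInputFormat.java:118 in the trace). A quick way to confirm the empty input with the same client this log uses (a sketch; -count prints directory, file, and byte counts followed by the path):

# List and count what TeraGen left behind (expected here: no data files)
/opt/ibm/biginsights/IHC/bin/hadoop dfs -ls /hdm-tera-input
/opt/ibm/biginsights/IHC/bin/hadoop dfs -count /hdm-tera-input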
>> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.1.1.jar teravalidate -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m /hdm-tera-output /hdm-tera-report
13/08/07 17:22:26 INFO mapred.JobClient: Cleaning up the staging area hdfs://oc3537047140.ibm.com:9000/user/biadmin/.staging/job_201308071709_0002
13/08/07 17:22:26 ERROR security.UserGroupInformation: PriviledgedActionException as:biadmin cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://oc3537047140.ibm.com:9000/hdm-tera-output
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://oc3537047140.ibm.com:9000/hdm-tera-output
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
at org.apache.hadoop.examples.terasort.TeraInputFormat.getSplits(TeraInputFormat.java:209)
at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
at org.apache.hadoop.examples.terasort.TeraValidate.run(TeraValidate.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.terasort.TeraValidate.main(TeraValidate.java:153)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> /opt/ibm/biginsights/IHC/bin/hadoop dfs -ls /hdm-tera-report
ls: Cannot access /hdm-tera-report: No such file or directory.
>> /opt/ibm/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
Deleted hdfs://oc3537047140.ibm.com:9000/hdm-tera-input
rmr: cannot remove /hdm-tera-output: No such file or directory.
rmr: cannot remove /hdm-tera-report: No such file or directory.
[INFO] =============== Summary of Hadoop Installation ===============
[INFO] TeraSort ..................................Failed
at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:101)
at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:41)
at com.ibm.xap.mgmt.hdm.HadoopDeployer.healthCheck(HadoopDeployer.java:816)
at com.ibm.xap.mgmt.DeployManager$HealthCheckThread.work(DeployManager.java:2655)
at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2385)
[INFO] DeployManager - hbase succeeded -- fake-node=0
[ERROR] DeployManager - Validate; SUCCEEDED components: [hbase]; FAILED components: [hadoop]; Consumes : 52881ms
Error exit.
[INFO] HadoopMgmtCmdline - Running removeBiTemp
[FATAL] Failed to verify BigInsights component(s): hadoop and/or hbase
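Taken together, the log points at a single root cause, the unresolvable GID 501 on the cluster node, with everything else cascading from it: TeraGen cannot initialize any task, TeraSort has no input, TeraValidate has no TeraSort output, and the health check marks the hadoop component as failed. A hedged remediation sketch, assuming GID 501 is meant to exist as the administrative group on every node; the group name biadmin_grp and the teragen row count are purely illustrative, and the sequence must be repeated on each node before re-running the installer's Hadoop validation step:

# Does any group already map to GID 501?
getent group 501
# If not, create one with that GID (name is an assumption) and make biadmin a member
groupadd -g 501 biadmin_grp
usermod -a -G biadmin_grp biadmin
# Verify the lookups the task controller performs will now succeed
getent group 501
id biadmin

# Once group resolution works on every node, the same TeraSort health check can be repeated by hand:
H=/opt/ibm/biginsights/IHC
$H/bin/hadoop jar $H/hadoop-examples-1.1.1.jar teragen 1000000 /hdm-tera-input
$H/bin/hadoop jar $H/hadoop-examples-1.1.1.jar terasort /hdm-tera-input /hdm-tera-output
$H/bin/hadoop jar $H/hadoop-examples-1.1.1.jar teravalidate /hdm-tera-output /hdm-tera-report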