Saturday 15 September 2012

java - Hadoop 2.5.0 failed to start datanode

I'm trying to deploy a standalone version of Hadoop 2.5.0, but the datanode fails to start. The log prints:

2014-10-20 13:42:13,288 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
    at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:165)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:586)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:773)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:292)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1895)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1782)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1829)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2005)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2029)

I googled for a while but couldn't find any useful help. Since the error seemed related to the native library, I tried compiling hadoop-2.5.0 on my own machine (x86-64, CentOS 6.5), but got the same error. I also tried the CDH version; still no good.
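
A quick way to check the native-library situation: Hadoop's own NativeCodeLoader reports whether libhadoop loaded in the current JVM, and java.library.path shows which directories the JVM searches, in order. A minimal sketch, assuming hadoop-common is on the classpath:

import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // Directories the JVM searches for native libraries, in order.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        // True only if Hadoop's libhadoop loaded successfully in this JVM.
        System.out.println("native hadoop loaded: "
                + NativeCodeLoader.isNativeCodeLoaded());
    }
}

On Hadoop 2.4 and later, running hadoop checknative -a from the shell prints a similar report.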

My hdfs-site.xml (note that dfs.client.read.shortcircuit is enabled below; per the stack trace, it is the ShortCircuitRegistry constructor that invokes the native SharedFileDescriptorFactory):

<property>
  <name>fs.checkpoint.dir</name>
  <value>/home/seg3/namesecondary</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/home/seg2/datanodedir</value>
</property>
<property>
  <name>dfs.datanode.hdfs-blocks-metadata.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
<property>
  <name>dfs.block.local-path-access.user</name>
  <value>root</value>
</property>
<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value>
</property>
<property>
  <name>dfs.domain.socket.path</name>
  <value>/var/run/hadoop-hdfs/dn._PORT</value>
</property>
<property>
  <name>dfs.client.file-block-storage-locations.timeout</name>
  <value>10000</value>
</property>

And core-site.xml (note that io.native.lib.available is set to false here, yet the stack trace above shows native code still being invoked):

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:8020</value>
</property>
<property>
  <name>fs.trash.interval</name>
  <value>10080</value>
</property>
<property>
  <name>fs.trash.checkpoint.interval</name>
  <value>10080</value>
</property>
<property>
  <name>io.native.lib.available</name>
  <value>false</value>
</property>

Any ideas? By the way, Hadoop 2.3.0 works on this machine.

After trying to deploy the same bundle on a bunch of servers, I found the problem. Somehow Hadoop 2.3.0's native libraries had gotten into the JDK's native library path, which in turn polluted the Java runtime: when the datanode tried to load the native library, it found the old one. After deleting those .so files, I got the datanode up and running. Cheers.
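
For anyone debugging the same symptom, a small sketch (illustrative, not from the original answer; the class name is hypothetical) that walks java.library.path and lists every libhadoop* file can make such a stale copy stand out:

import java.io.File;
import java.io.FilenameFilter;

public class FindStaleNativeLibs {
    public static void main(String[] args) {
        // Every directory the JVM will search for native libraries.
        String[] dirs = System.getProperty("java.library.path")
                .split(File.pathSeparator);
        for (String dir : dirs) {
            // Collect files whose names start with "libhadoop"
            // (libhadoop.so, libhadoop.so.1.0.0, and so on).
            File[] hits = new File(dir).listFiles(new FilenameFilter() {
                public boolean accept(File d, String name) {
                    return name.startsWith("libhadoop");
                }
            });
            if (hits == null) continue; // directory absent or unreadable
            for (File f : hits) {
                System.out.println(f.getAbsolutePath());
            }
        }
    }
}

Any copy that resolves to an older release rather than the 2.5.0 install is a candidate for removal, as described above.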

java hadoop jni hdfs
