
Hive fails to start: BigInteger-to-Long cast error

From: 你的温柔,我明白

Replies: 1

Created: 2017-03-28 10:12

Environment:

OS: Red Hat Linux 6.4, 64-bit
JDK: 1.7.0_79
Hadoop: 2.6.0
Spark: 1.6.0
Hive: 1.2.1
MySQL: 8.0
JDBC driver: mysql-connector-java-5.1.26-bin.jar

Current status:

Hadoop has been started and is running normally:

```
[hadoop@master hive]$ jps
79481 Jps
77444 NameNode
38573 Master
77593 SecondaryNameNode
53527 SparkSubmit
77756 ResourceManager
```

MySQL is running normally, and the hive database and hive user have already been created:

```
[root@master hive]# mysql -uhive -p
Enter password: 
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 94
Server version: 8.0.0-dmr MySQL Community Server (GPL)

Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> show tables;
+----------------+
| Tables_in_hive |
+----------------+
| test           |
+----------------+
1 row in set (0.00 sec)

mysql>
```
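On the Hive side, the metastore connection settings and the driver jar can be double-checked like this (a rough sketch; it assumes hive-site.xml sits in $HIVE_HOME/conf and uses the standard javax.jdo.option.* property names):

```
# Show the JDO/metastore connection properties Hive is configured with
# (assumes the standard hive-site.xml location under $HIVE_HOME/conf)
grep -A 1 'javax.jdo.option.Connection' $HIVE_HOME/conf/hive-site.xml

# Confirm the Connector/J jar listed above is actually on Hive's classpath
ls $HIVE_HOME/lib | grep -i mysql-connector
```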

The following error is reported at startup:

```
Exception in thread "main" javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true, username = hive. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: java.lang.ClassCastException: java.math.BigInteger cannot be cast to java.lang.Long
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1062)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:973)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:959)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:904)
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:1058)
        at com.mysql.jdbc.ConnectionImpl.initializePropsFromServer(ConnectionImpl.java:3577)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2518)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2288)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:818)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:31)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:395)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:346)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
        at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
        at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
        at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5756)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5751)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5984)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassCastException: java.math.BigInteger cannot be cast to java.lang.Long
        at com.mysql.jdbc.ConnectionImpl.buildCollationMapping(ConnectionImpl.java:1003)
        ... 64 more
------
```


Is the error above caused by a mismatch between the MySQL JDBC driver and the MySQL server version, or by a mismatch between Hadoop and Hive? How should it be handled? Any advice from the experts would be appreciated.
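One way to narrow down the driver-versus-server question is to compare the two versions directly (a rough sketch; the jar name is the one listed in the environment above, and it assumes the mysql client is on the PATH):

```
# Print the Connector/J version recorded in the jar's manifest
unzip -p mysql-connector-java-5.1.26-bin.jar META-INF/MANIFEST.MF | grep -i version

# Print the server version reported by MySQL itself
mysql -uhive -p -e 'SELECT VERSION();'
```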



1 Reply

你的温柔,我明白 1F 2017-03-29 10:14:46

Solved:

The cause appears to be a version mismatch between the jline jar in $HIVE_HOME/lib and the jline jar shipped with Hadoop. Back up the jline jar under $HADOOP_HOME/share/hadoop/lib, copy the jline jar from $HIVE_HOME/lib into $HADOOP_HOME/share/hadoop/lib, and then start Hive again.
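A sketch of those steps in shell (the exact jline jar file names depend on the Hive and Hadoop versions installed, so the globs below are illustrative):

```
# Back up Hadoop's jline jar(s), then copy over the one shipped with Hive
cd $HADOOP_HOME/share/hadoop/lib
mkdir -p jline-backup
mv jline-*.jar jline-backup/
cp $HIVE_HOME/lib/jline-*.jar .
```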

