Error when using MySQL as the Hive metastore: Specified key was too long; max key length is 767 bytes

While setting up MySQL as the metastore database for Hive, I finished all of the installation and configuration work and entered the Hive shell. Running show databases; worked as expected, but the very next command, show tables;, failed with an error.

The key error message is:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
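The message comes from MySQL itself rather than from Hive: with the InnoDB defaults in MySQL 5.6 and earlier (innodb_large_prefix disabled), an index key may be at most 767 bytes, and under the utf8 character set each character can occupy up to 3 bytes, so an index on a VARCHAR column of 256 characters or more already needs 768 bytes. The metastore tables that DataNucleus generates for Hive contain indexed VARCHAR columns of exactly this kind. As a rough illustration only (the table and column names below are made up, not part of the metastore schema), the same complaint can be reproduced directly in MySQL; depending on innodb_strict_mode it surfaces either as error 1071 or as a warning with a silently truncated index:

-- 256 characters * 3 bytes per utf8 character = 768 bytes > 767, so MySQL complains.
CREATE TABLE key_length_demo (
  name VARCHAR(256),
  KEY idx_name (name)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

-- The same definition is fine under latin1: 256 characters * 1 byte = 256 bytes.
CREATE TABLE key_length_demo_latin1 (
  name VARCHAR(256),
  KEY idx_name (name)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;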

The full console output is shown below:

hive> show databases;

OK

default

Time taken: 8.638 seconds

hive> show tables;

FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.hive.metastore.api.MetaException javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)

at com.mysql.jdbc.Util.getInstance(Util.java:386)

at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)

at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4098)

at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4030)

at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2490)

at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2651)

at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2671)

at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2621)

at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:842)

at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:681)

at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)

at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)

at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)

at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:681)

at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:402)

at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:458)

at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2689)

at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)

at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)

at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)

at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)

at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)

at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)

at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)

at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)

at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)

at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)

at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)

at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)

at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)

at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)

at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)

at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)

at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)

at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:243)

at org.apache.hadoop.hive.metastore.ObjectStore.getTables(ObjectStore.java:781)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)

at com.sun.proxy.$Proxy4.getTables(Unknown Source)

at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_tables(HiveMetaStore.java:2327)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)

at com.sun.proxy.$Proxy5.get_tables(Unknown Source)

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTables(HiveMetaStoreClient.java:817)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)

at com.sun.proxy.$Proxy6.getTables(Unknown Source)

at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:1009)

at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:983)

at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2215)

at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)

at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)

at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)

at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1336)

at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1122)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:935)

at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)

at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)

at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)

at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)

at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

NestedThrowables:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes)

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
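The trace shows where things actually go wrong: DataNucleus lazily creates the metastore tables while handling show tables (AbstractTable.create and executeDdlStatement in the stack above), and it is that generated DDL that MySQL rejects. Before applying the fix, you can check which metastore tables were already created and with which character set; the query below is a sketch that assumes the metastore database is named hive_test, as in the fix that follows:

-- List the tables DataNucleus has created so far, with their collation
-- (the collation name starts with the character set, e.g. utf8_general_ci).
SELECT table_name, table_collation
FROM information_schema.tables
WHERE table_schema = 'hive_test';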

Solution:

Change the character set of the Hive metastore database we created, for example:

alter database hive_test character set latin1;
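Note that ALTER DATABASE only changes the default character set applied to tables created from now on; any tables DataNucleus has already created keep their original character set. If the metastore is still empty, the simplest approach is to drop and recreate the database with latin1 before starting Hive again. The statements below are a sketch that assumes the metastore database name hive_test used above:

-- Recreate the metastore database with a single-byte character set so that
-- indexed VARCHAR columns stay well under the 767-byte key limit.
DROP DATABASE IF EXISTS hive_test;
CREATE DATABASE hive_test CHARACTER SET latin1;

-- Verify the database's default character set.
SHOW CREATE DATABASE hive_test;

After that, restart Hive and run show tables; again; the metastore tables should now be created without the key-length error.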

