[ERROR] Error: error while loading <root>, error in opening zip file; error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.

I was compiling an Apache open source project at home and hit the following error at compile time:

Error: error while loading <root>, error in opening zip file
[error] error: error while loading <root>, error in opening zip file
error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
    at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
    at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
    at scala.tools.nsc.Main$.doCompile(Main.scala:79)
    at scala.tools.nsc.Driver.process(Driver.scala:54)
    at scala.tools.nsc.Driver.main(Driver.scala:67)
    at scala.tools.nsc.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
    at org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)

I was building the open source CarbonData project with Maven 3.3.9 and JDK 1.8. I searched the Internet and read through all kinds of answers, but none of them solved the problem. One hint was that a corrupted jar package can cause the "error in opening zip file" message. So I deleted every jar from the local Maven repository and recompiled, and this time the build succeeded. (You can also write a bit of code to check which jar package is broken, as in the sketch below.)
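For reference, here is a minimal sketch of such a check; it is my own illustration, not part of the CarbonData build, and it assumes the local repository is at the default ~/.m2/repository (pass another path as the first argument otherwise). It simply tries to open every .jar as a zip archive; any jar that fails to open is the kind of corrupted file that triggers "error in opening zip file".

    import java.io.File
    import java.util.zip.ZipFile

    object FindBrokenJars {
      def main(args: Array[String]): Unit = {
        // Default local Maven repository; override with the first command-line argument.
        val repo = new File(args.headOption.getOrElse(
          System.getProperty("user.home") + "/.m2/repository"))

        // Recursively collect all files under the repository.
        def allFiles(dir: File): Seq[File] = {
          val children = Option(dir.listFiles()).map(_.toSeq).getOrElse(Seq.empty)
          children.filter(_.isFile) ++ children.filter(_.isDirectory).flatMap(allFiles)
        }

        allFiles(repo).filter(_.getName.endsWith(".jar")).foreach { jar =>
          try {
            new ZipFile(jar).close() // opening succeeds only if the zip structure is intact
          } catch {
            case _: Exception => println("Corrupted jar: " + jar.getAbsolutePath)
          }
        }
      }
    }

Deleting only the artifacts this reports (so that Maven re-downloads them) is usually faster than wiping the whole repository.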

But why were the jar packages corrupted in the first place? Looking back over the build, the jar downloads at the start of the original compile were very slow, and I interrupted the downloads and re-ran the build several times; that interruption is most likely what left the jars corrupted.

With the build fixed, I compiled and started running the example code, and it reported the following error.

Starting CarbonExample using Spark version 1.5.2
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
    at org.apache.spark.sql.CarbonContext.<init>(CarbonContext.scala:41)
    at org.apache.carbondata.examples.util.ExampleUtils$.createCarbonContext(ExampleUtils.scala:44)
    at org.apache.carbondata.examples.CarbonExample$.main(CarbonExample.scala:27)
    at org.apache.carbondata.examples.CarbonExample.main(CarbonExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... more

Again I searched around: some answers said to download the correct winutils.exe, some said to use winutils.exe to grant permissions on that directory, and some blamed a conflict with some installed software... I tried all of these and still got the same error. In the end, rebooting the machine made it go away. (The permission fix those answers describe is sketched below.)
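For completeness: the error shows /tmp/hive with permissions rw-rw-rw- (666), while Hive's SessionState insists the scratch directory be writable (at least 733), which is why it refuses to start. The usual advice is to widen the permissions with winutils.exe; the rough sketch below is my own illustration of doing the same check and fix from Scala via the Hadoop FileSystem API that is already on the classpath of a Spark/CarbonData build. On Windows it still relies on a working winutils.exe under the hood, so treat it as an illustration rather than a guaranteed fix (in my case only a reboot helped).

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.hadoop.fs.permission.FsPermission

    object FixHiveScratchDir {
      def main(args: Array[String]): Unit = {
        // Local file system unless fs.defaultFS points at a real HDFS cluster.
        val fs = FileSystem.get(new Configuration())
        val scratch = new Path("/tmp/hive") // Hive's default root scratch dir

        if (!fs.exists(scratch)) fs.mkdirs(scratch)
        println("before: " + fs.getFileStatus(scratch).getPermission)

        // Hive requires at least 733 (rwx-wx-wx); 777 matches the commonly suggested fix.
        fs.setPermission(scratch, new FsPermission(Integer.parseInt("777", 8).toShort))
        println("after:  " + fs.getFileStatus(scratch).getPermission)
      }
    }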

Tears ran ...
