Comparison between Java and C in integer summation
On a whim tonight, I decided to test the performance of C and Java on the same platform (Ubuntu 14.04) with the same computational load. The final result of the experiment surprised me, and quietly renewed my respect for James Gosling (the creator of the Java language). The experiment itself is very simple, and you can easily reproduce it yourself; I share it here in the spirit of learning and growing together. The test code is posted below:
C Language:
int main() {
    int number = 0, i = 0;
    for (i = 0; i <= 1000000000; i++)
        number++;
    return 0;
}
The C compiler is gcc (Ubuntu 4.8.2-19ubuntu1) 4.8.2. Timing the compiled program a.out gives: real 0m3.153s, user 0m3.152s, sys 0m0.000s.
Java language:
public class Tst {
    public static void main(String[] args) {
        int i;
        int number = 0;
        long startTime = System.currentTimeMillis(); // run start time
        for (i = 0; i <= 1000000000; i++)
            number++;
        long endTime = System.currentTimeMillis();   // run stop time
        float seconds = (endTime - startTime) / 1000f;
        System.out.println(Float.toString(seconds) + " seconds.");
    }
}
The Java compiler is javac 1.7.0_79; the Java virtual machine is java version "1.7.0_79", OpenJDK Runtime Environment (IcedTea 2.5.5) (7u79-2.5.5-0ubuntu0.14.04.2), OpenJDK Server VM (build 24.79-b02, mixed mode). The timing result is: real 0m0.093s, user 0m0.088s, sys 0m0.008s. (Note: if you measure inside the Java program itself, ignoring the JVM's startup overhead, the runtime is 0.006 seconds.)