The Julia programming language and the rise of machine learning


Julia is a programming language designed for numerical computing that aims to combine the development efficiency of Python with the execution efficiency of C. Julia can call C directly, and many open-source C and Fortran libraries are integrated into Julia's Base library. It also has notebook support (via IJulia/Jupyter).
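Calling C requires no wrapper or glue code. A minimal sketch (my own illustration, not from the original post) using the standard ccall form, which takes a (function, library) pair, the return type, a tuple of argument types, and the arguments:

# call cos() from the C math library directly, no binding layer needed
julia> ccall((:cos, "libm"), Cdouble, (Cdouble,), 0.0)
1.0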

Julia aims to replace R, MATLAB, Octave, and other numerical computing tools. Its syntax is similar to that of other scientific computing languages, and in many cases its performance can match that of compiled languages. Julia's design follows three principles: it should be fast, expressive, and dynamic. Julia's core is written in C; the rest is written in Julia itself.
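As a quick taste of that MATLAB/Octave-flavored syntax (an illustrative snippet, not from the original post):

A = rand(3, 3)      # 3x3 random matrix
b = A * ones(3)     # * is true matrix-vector multiplication
x = A \ b           # \ solves A*x = b, just as in MATLAB
norm(A * x - b)     # residual of the solve, ≈ 0.0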

At present, the language is not very popular in China: search for Julia on Baidu and the first page has nothing related to the Julia language; what comes up instead is a Japanese adult-video star, so ...

Currently the most popular programming language in machine learning is Python, as language-popularity surveys show.

The rise and fall of a programming language is directly tied to the community behind it. A strong community means more resources and more libraries, which in turn means more users. Julia's community seems focused on numerical computing, and its applications are currently limited to that niche; trying to build a web application with it (there is a library for that) would wear you out.

This post uses Julia to build a handwritten digit recognizer, so you can judge for yourself whether the syntax is pleasant to read.

Julia's main machine learning libraries:

  • ScikitLearn.jl: a Scikit-learn-style interface, like Python's
  • Mocha.jl
  • TextAnalysis.jl
  • MXNet.jl
  • TensorFlow.jl: a wrapper around TensorFlow

Install Julia

Julia source code: https://github.com/JuliaLang/julia
Downloads: http://julialang.org/downloads/

# Ubuntu
$ sudo apt install gfortran
$ sudo apt install julia

# macOS
$ brew install caskroom/cask/julia
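To confirm the install worked, start the REPL and print the build details (versioninfo() is part of Julia's Base library):

$ julia
julia> versioninfo()   # prints the Julia version, OS, and CPU information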

Documentation: http://docs.julialang.org/en/stable/manual/
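The manual is also available inside the REPL: typing ? at the julia> prompt switches to help mode, where any name can be looked up (cos here is just an arbitrary example):

help?> cos   # type ? first, then the name you want documentation for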
Handwritten digit recognition

Install Mocha.jl:

julia> Pkg.add("Mocha")
# or install the latest development version:
julia> Pkg.clone("https://github.com/pluskid/Mocha.jl.git")

Test the installation:

julia> Pkg.test("Mocha")

Prepare the handwritten digit dataset (MNIST): https://github.com/pluskid/Mocha.jl/tree/master/examples/mnist
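Assuming you are working from a clone of the Mocha.jl repository, the examples/mnist directory ships a get-mnist.sh script that downloads MNIST and converts it into the HDF5 files used below; data/train.txt and data/test.txt are plain-text lists naming those HDF5 files, which is what the source= parameters in the code refer to:

$ cd Mocha.jl/examples/mnist
$ ./get-mnist.sh   # downloads MNIST, writes data/train.hdf5 and data/test.hdf5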

Code:

# https://github.com/pluskid/Mocha.jl/blob/master/examples/mnist/mnist.jl
using Mocha
srand(12345678)

data_layer  = AsyncHDF5DataLayer(name="train-data", source="data/train.txt", batch_size=64, shuffle=true)
conv_layer  = ConvolutionLayer(name="conv1", n_filter=20, kernel=(5,5), bottoms=[:data], tops=[:conv])
pool_layer  = PoolingLayer(name="pool1", kernel=(2,2), stride=(2,2), bottoms=[:conv], tops=[:pool])
conv2_layer = ConvolutionLayer(name="conv2", n_filter=50, kernel=(5,5), bottoms=[:pool], tops=[:conv2])
pool2_layer = PoolingLayer(name="pool2", kernel=(2,2), stride=(2,2), bottoms=[:conv2], tops=[:pool2])
fc1_layer   = InnerProductLayer(name="ip1", output_dim=500, neuron=Neurons.ReLU(), bottoms=[:pool2], tops=[:ip1])
fc2_layer   = InnerProductLayer(name="ip2", output_dim=10, bottoms=[:ip1], tops=[:ip2])
loss_layer  = SoftmaxLossLayer(name="loss", bottoms=[:ip2, :label])

backend = DefaultBackend()
init(backend)

common_layers = [conv_layer, pool_layer, conv2_layer, pool2_layer, fc1_layer, fc2_layer]
net = Net("MNIST-train", backend, [data_layer, common_layers..., loss_layer])

exp_dir = "snapshots-$(Mocha.default_backend_type)"

method = SGD()
params = make_solver_parameters(method, max_iter=10000, regu_coef=0.0005,
                                mom_policy=MomPolicy.Fixed(0.9),
                                lr_policy=LRPolicy.Inv(0.01, 0.0001, 0.75),
                                load_from=exp_dir)
solver = Solver(method, params)

setup_coffee_lounge(solver, save_into="$exp_dir/statistics.jld", every_n_iter=1000)

# report training progress every 100 iterations
add_coffee_break(solver, TrainingSummary(), every_n_iter=100)

# save snapshots every 5000 iterations
add_coffee_break(solver, Snapshot(exp_dir), every_n_iter=5000)

# show performance on test data every 1000 iterations
data_layer_test = HDF5DataLayer(name="test-data", source="data/test.txt", batch_size=100)
acc_layer = AccuracyLayer(name="test-accuracy", bottoms=[:ip2, :label])
test_net = Net("MNIST-test", backend, [data_layer_test, common_layers..., acc_layer])
add_coffee_break(solver, ValidationPerformance(test_net), every_n_iter=1000)

solve(solver, net)

#Profile.init(int(1e8), 0.001)
#@profile solve(solver, net)
#open("profile.txt", "w") do out
#  Profile.print(out)
#end

destroy(net)
destroy(test_net)
shutdown(backend)
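Run the script from the example directory to train; the coffee breaks print a training summary every 100 iterations and the test-set accuracy every 1000:

$ julia mnist.jl

After the full 10000 iterations this LeNet-style network reaches roughly 99% test accuracy, according to the Mocha.jl tutorial.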
