Numerical Learning Library

Source: Internet
Author: User
Tags: benchmark

Link: https://code.google.com/p/nll/

NLL is a multi-platform, open source project written entirely in C++. Its goal is to provide generic and efficient algorithms for machine learning and, more specifically, computer vision. It is intended to be very easy to integrate: it is mainly composed of header files with no dependency on any library but the STL.

Architecture

NLL implements generic algorithms using template metaprogramming and a minimalist interface. Several layers are used:

  • core: the very basic structures and operations
  • algorithm_impl: generic algorithms with very limited dependencies and interface
  • algorithm: algorithms taking advantage of the NLL framework
  • imaging: algorithms related to imaging techniques such as volumes, slices, blending, LUT tables, multi-planar reconstruction or maximum intensity projection.

Details

This is an overview of some of the algorithms implemented in NLL:

  • classifiers (k-nearest neighbour, multi-layered neural networks, support vector machines, boosting, Gaussian mixture model, quadratic discriminant, radial basis function, naive Bayes)
  • feature selection (best-first, wrapper using genetic algorithm, RELIEF-F, Pearson)
  • feature transformation (PCA, kernel PCA, ICA)
  • optimizers (grid search, harmony search, genetic algorithms, Powell)
  • math library (matrix, vector, linear algebra, distributions)
  • image library (resampling, morphology, transformations, convolutions, region growing, labeling, SURF)
  • visualization of high-dimensional data (locally linear embedding, Sammon's mapping)
  • volume library (resampling, maximum intensity projection, multi-planar reconstruction)
  • clustering (k-means, LSDBC)
  • kd-trees, Gabor filters, Haar features
  • Markov chain, hidden Markov model
  • RANSAC estimator
  • ... and much more soon!
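To give a feel for the kind of algorithms listed above, here is a standalone sketch of k-means clustering (from the clustering category) in plain C++. It deliberately uses no NLL types or headers; all names here are illustrative and are not NLL's API.

```cpp
// Standalone k-means sketch: assign points to their nearest centroid,
// recompute centroids as cluster means, repeat until assignments settle.
#include <cstddef>
#include <vector>

struct Point { double x, y; };

static double dist2( const Point& a, const Point& b )
{
   const double dx = a.x - b.x, dy = a.y - b.y;
   return dx * dx + dy * dy;
}

std::vector<std::size_t> kmeans( const std::vector<Point>& points,
                                 std::vector<Point> centroids,   // initial guesses, copied
                                 std::size_t maxIterations = 100 )
{
   std::vector<std::size_t> assignment( points.size(), 0 );
   for ( std::size_t iter = 0; iter < maxIterations; ++iter )
   {
      // assignment step: each point goes to its nearest centroid
      bool changed = false;
      for ( std::size_t i = 0; i < points.size(); ++i )
      {
         std::size_t best = 0;
         for ( std::size_t k = 1; k < centroids.size(); ++k )
            if ( dist2( points[ i ], centroids[ k ] ) < dist2( points[ i ], centroids[ best ] ) )
               best = k;
         if ( best != assignment[ i ] ) { assignment[ i ] = best; changed = true; }
      }
      if ( !changed )
         break;   // converged: no point switched cluster

      // update step: move each centroid to the mean of its assigned points
      std::vector<Point> sums( centroids.size(), Point() );
      std::vector<std::size_t> counts( centroids.size(), 0 );
      for ( std::size_t i = 0; i < points.size(); ++i )
      {
         sums[ assignment[ i ] ].x += points[ i ].x;
         sums[ assignment[ i ] ].y += points[ i ].y;
         ++counts[ assignment[ i ] ];
      }
      for ( std::size_t k = 0; k < centroids.size(); ++k )
         if ( counts[ k ] )
         {
            centroids[ k ].x = sums[ k ].x / counts[ k ];
            centroids[ k ].y = sums[ k ].y / counts[ k ];
         }
   }
   return assignment;
}
```

NLL's own k-means works on generic types through its template interface; this sketch only shows the underlying algorithm on 2-D points.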

Example

Here is a typical use of the framework:

/**
 * In this test a neural network will be optimized using a harmony search algorithm.
 */
void test()
{
   typedef nll::benchmark::BenchmarkDatabases::Database::Sample::Input Input;
   typedef nll::algorithm::Classifier<Input> Classifier;

   // find the cancer1.dt benchmark
   const nll::benchmark::BenchmarkDatabases::Benchmark* benchmark =
      nll::benchmark::BenchmarkDatabases::instance().find( "cancer1.dt" );
   ensure( benchmark, "can't find benchmark" );
   Classifier::Database dat = benchmark->database;

   // use a multi-layered perceptron as a classifier
   typedef algorithm::ClassifierMlp<Input> ClassifierImpl;
   ClassifierImpl classifier;

   // optimize the parameters of the classifier on the original dataset;
   // we will use a harmony search algorithm. For each point, the classifier
   // is evaluated: a 10-fold cross validation is run on the learning database
   Classifier::OptimizerClientClassifier classifierOptimizer = classifier.createOptimizer( dat );

   // configure the optimizer options
   nll::algorithm::StopConditionIteration stop( 10 );
   nll::algorithm::MetricEuclidian<nll::algorithm::OptimizerHarmonySearchMemory::TMetric::value_type> metric;
   nll::algorithm::OptimizerHarmonySearchMemory parametersOptimizer( 5, 0.8, 0.1, 1, &stop, 0.01, &metric );

   // run the optimizer on the default constrained classifier parameters;
   // if the default values don't fit, other constraint parameters should be given
   std::vector<double> params = parametersOptimizer.optimize( classifierOptimizer, ClassifierImpl::buildParameters() );

   // learn the LEARNING and VALIDATION databases with the optimized parameters,
   // then test the classifier on the testing database
   classifier.learnTrainingDatabase( dat, nll::core::make_buffer1D( params ) );
   Classifier::Result rr = classifier.test( dat );
   TESTER_ASSERT( rr.testingError < 0.025 );
}
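The example above drives NLL's harmony search optimizer through the classifier interface. To clarify what that optimizer does internally, here is a standalone sketch of the harmony search metaheuristic in plain C++, with no NLL dependency. The parameter names loosely mirror the ones passed in the example (memory size, pitch-adjusting rates), but this is an illustrative reimplementation, not NLL's code.

```cpp
// Standalone harmony search sketch: keep a "memory" of candidate solutions,
// improvise new candidates by recombining remembered values (with occasional
// pitch adjustment or fresh random values), and replace the worst entry
// whenever the new candidate improves on it.
#include <algorithm>
#include <cstddef>
#include <cstdlib>
#include <vector>

typedef std::vector<double> Solution;
typedef double ( *Objective )( const Solution& );

static double uniform( double lo, double hi )
{
   return lo + ( hi - lo ) * ( std::rand() / static_cast<double>( RAND_MAX ) );
}

Solution harmonySearch( Objective f, std::size_t dim,
                        double lo, double hi,
                        std::size_t memorySize,   // harmony memory size
                        double hmcr,              // memory considering rate
                        double par,               // pitch adjusting rate
                        double bandwidth,         // pitch adjustment amplitude
                        std::size_t iterations )
{
   // initialize the harmony memory with random solutions
   std::vector<Solution> memory( memorySize );
   std::vector<double> score( memorySize );
   for ( std::size_t m = 0; m < memorySize; ++m )
   {
      memory[ m ].resize( dim );
      for ( std::size_t d = 0; d < dim; ++d )
         memory[ m ][ d ] = uniform( lo, hi );
      score[ m ] = f( memory[ m ] );
   }

   for ( std::size_t it = 0; it < iterations; ++it )
   {
      // improvise a new harmony, one variable at a time
      Solution candidate( dim );
      for ( std::size_t d = 0; d < dim; ++d )
      {
         if ( uniform( 0, 1 ) < hmcr )
         {
            // take the value from a random harmony in memory...
            candidate[ d ] = memory[ std::rand() % memorySize ][ d ];
            // ...and occasionally pitch-adjust it slightly
            if ( uniform( 0, 1 ) < par )
               candidate[ d ] += uniform( -bandwidth, bandwidth );
         }
         else
            candidate[ d ] = uniform( lo, hi );   // brand new random value
      }

      // replace the worst harmony if the candidate beats it (minimization)
      const std::size_t worst =
         std::max_element( score.begin(), score.end() ) - score.begin();
      const double s = f( candidate );
      if ( s < score[ worst ] )
      {
         memory[ worst ] = candidate;
         score[ worst ] = s;
      }
   }

   const std::size_t best =
      std::min_element( score.begin(), score.end() ) - score.begin();
   return memory[ best ];
}
```

In the example above, each objective evaluation is a full 10-fold cross validation of the classifier, which is why the iteration budget (the stop condition of 10) is kept small.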
Tutorials

Here is a list of tutorials and samples of code: tutorials
