RBF: optimizing the SVM Gaussian kernel function
It is important to understand the math first: the covariance matrix and the formula for the Gaussian kernel function.
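For reference, here are the two formulas in standard notation (this block is my addition; note that the code below computes the covariance without the 1/n factor, and uses a per-column bandwidth δ_j in place of 2σ²):

```latex
% Covariance between variables i and j over n observations
\operatorname{cov}(i, j) = \frac{1}{n} \sum_{k=1}^{n} (x_{ki} - \bar{x}_i)(x_{kj} - \bar{x}_j)

% Gaussian (RBF) kernel between samples x and y with bandwidth \sigma
K(x, y) = \exp\!\left(-\frac{\lVert x - y \rVert^2}{2\sigma^2}\right)
```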
My advice on the method: read the formulas alongside the core code below; the two together are easier to understand. For me personally, a dry formula never sticks as well as a piece of working code. I originally planned to use jblas, a Java matrix library, but decided to write the matrix code myself to deepen my understanding. The implementation language is Scala, Java's twin brother, which I think is not hard to follow. The matrix operations are done with two-dimensional arrays.
I suggest stepping through the code below in the Scala command-line REPL.
Core Code
```scala
// Returns the transpose of matrix a
def transposedMatrix(a: Array[Array[Double]]): Array[Array[Double]] = {
  val length = a.length
  val width = a(0).length
  val transposedM = Array.ofDim[Double](width, length)
  for (i <- 0 to length - 1; j <- 0 to width - 1)
    transposedM(j)(i) = a(i)(j)
  transposedM
}

// r(i, j) = (column i - column j) * (column i - column j) transposed,
// i.e. the squared Euclidean distance between columns i and j of m
def r1(i: Int, j: Int, m: Array[Array[Double]]): Double = {
  var sum: Double = 0
  for (x <- 0 to m.length - 1)
    sum += math.pow(m(x)(i) - m(x)(j), 2)
  sum
}

// Returns the mean of each column, as a vector
def rowAverage(a: Array[Array[Double]]): Array[Double] = {
  val length = a.length
  val width = a(0).length
  val b = new Array[Double](width)
  for (i <- 0 to width - 1; j <- 0 to length - 1)
    b(i) += a(j)(i)
  for (i <- 0 to width - 1)
    b(i) = b(i) / length
  b
}

// Returns the sum of each column, as a vector
def sumOfRow(a: Array[Array[Double]]): Array[Double] = {
  val length = a.length
  val width = a(0).length
  val b = new Array[Double](width)
  for (i <- 0 to width - 1; j <- 0 to length - 1)
    b(i) += a(j)(i)
  b
}

// Dot product of column i and column j of a
def sum(i: Int, j: Int, a: Array[Array[Double]]): Double = {
  var result: Double = 0
  for (x <- 0 to a.length - 1)
    result += a(x)(i) * a(x)(j)
  result
}

// Centers the matrix: subtracts the column mean b(i) from every entry of column i
// (defined before cov so the code can be pasted into the REPL line by line)
def dataSort(a: Array[Array[Double]], b: Array[Double]): Array[Array[Double]] = {
  for (i <- 0 to a(0).length - 1; j <- 0 to a.length - 1)
    a(j)(i) -= b(i)
  a
}

// Takes the feature matrix and returns the covariance matrix.
// Note that it works on the transposed data, so the result has one
// row and column per sample, not per feature.
def cov(a: Array[Array[Double]]): Array[Array[Double]] = {
  val m1 = transposedMatrix(a)
  val m2 = rowAverage(m1)
  val m3 = dataSort(m1, m2) // center the matrix
  val width = m3(0).length
  val b = Array.ofDim[Double](width, width)
  for (i <- 0 to width - 1; j <- 0 to width - 1)
    b(i)(j) = sum(i, j, m3)
  b
}

// a is the feature matrix (one sample per row); delta holds the column sums
// of the covariance matrix (one entry per sample) and acts as a per-column
// bandwidth. Returns the Gaussian kernel matrix over the samples.
// Caveat: nothing guarantees delta(j) > 0, so r is not necessarily a valid
// (symmetric, positive-definite) kernel matrix.
def gaussMatrix(a: Array[Array[Double]], delta: Array[Double]): Array[Array[Double]] = {
  val b = transposedMatrix(a)
  val length = b(0).length
  val r = Array.ofDim[Double](length, length)
  for (i <- 0 to length - 1; j <- 0 to length - 1)
    r(i)(j) = math.exp(-r1(i, j, b) / delta(j))
  r
}

val test = Array(Array(2.0, 8.0), Array(3.0, 6.0), Array(9.0, 2.0))
val test2 = cov(test)
val rowOfSum = sumOfRow(test2)
gaussMatrix(test, rowOfSum)
```
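Since the bandwidth scheme above is unusual, here is a quick self-contained sanity check of the same pipeline in more compact Scala. This is my own sketch of a reading of the code (the object name `KernelCheck` and all local names are hypothetical, not from the original post): it reproduces the intermediate column sums for the sample data and shows that one of them comes out negative, so the resulting matrix is not a proper RBF kernel.

```scala
object KernelCheck {
  def main(args: Array[String]): Unit = {
    val data = Array(Array(2.0, 8.0), Array(3.0, 6.0), Array(9.0, 2.0))
    val n = data.length

    // Transpose to features x samples, like transposedMatrix(test)
    val t = data(0).indices.map(j => data.map(_(j))).toArray
    // Per-sample mean over the features, like rowAverage
    val means = Array.tabulate(n)(j => t.map(_(j)).sum / t.length)
    // Center each column, like dataSort
    val c = t.map(row => Array.tabulate(n)(j => row(j) - means(j)))
    // Sample-by-sample matrix of column dot products, like cov
    val cv = Array.tabulate(n, n)((i, j) => c.map(r => r(i) * r(j)).sum)
    // Column sums, like sumOfRow
    val delta = Array.tabulate(n)(j => cv.map(_(j)).sum)

    // Squared Euclidean distance between samples i and j, like r1
    def sqDist(i: Int, j: Int): Double =
      t.map(r => (r(i) - r(j)) * (r(i) - r(j))).sum

    // Kernel matrix with per-column bandwidth delta(j), like gaussMatrix
    val k = Array.tabulate(n, n)((i, j) => math.exp(-sqDist(i, j) / delta(j)))

    // Each sample is at distance 0 from itself, so the diagonal is 1
    assert((0 until n).forall(i => math.abs(k(i)(i) - 1.0) < 1e-12))
    // For this data the column sums come out as (6, 3, -7); the negative
    // bandwidth makes column 2 of k exceed 1, so the result is not a
    // positive-definite kernel matrix
    assert(delta.zip(Array(6.0, 3.0, -7.0)).forall { case (a, b) => math.abs(a - b) < 1e-9 })
    assert(k(0)(2) > 1.0)

    println(delta.mkString("delta = (", ", ", ")"))
  }
}
```

Whether a negative per-column bandwidth is intended is unclear to me; a common alternative would be a single positive bandwidth shared by all columns.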
All readers are welcome to offer criticism and suggestions.
Thanks to the respondent on Baidu Knows linked below; many of the code snippets here were inspired by his reply.
http://zhidao.baidu.com/link?url=-u5LznclWQ0LbvEx3DB8sofohyP7nJCWws78TsWBNaDR15rDn-7ENoRealHRIM8W8ycioegl_Ngafzqj33pbz90acqq7elf8hgr7daqujjs