The softmax function is used to normalize the results of a classification. Unlike plain proportional normalization, it normalizes through an exponential transformation, so larger scores receive a disproportionately larger share of the output during normalization.
Softmax formula

softmax(x_i) = exp(x_i) / Σ_j exp(x_j)
Softmax Implementation Method 1
```python
import numpy as np

def softmax(x):
    """Compute softmax values for each set of scores in x."""
    x = np.array(x, dtype='float32')
    x = np.exp(x)
    if x.ndim == 1:
        sumcol = x.sum()
        for i in range(x.size):
            x[i] = x[i] / float(sumcol)
    elif x.ndim > 1:
        sumcol = x.sum(axis=0)  # column-wise sums
        for row in x:
            for i in range(row.size):
                row[i] = row[i] / float(sumcol[i])
    return x

# test result
scores = [3.0, 1.0, 0.2]
print(softmax(scores))
```
It produces the following result:
[0.8360188 0.11314284 0.05083836]
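One caveat worth noting (not covered in the original post): exponentiating large scores directly can overflow. A common refinement, sketched below with a hypothetical helper name `softmax_stable`, subtracts the maximum score before exponentiating; since softmax is shift-invariant, the result is unchanged.

```python
import numpy as np

def softmax_stable(x):
    """Numerically stable softmax for a 1-D array of scores."""
    x = np.asarray(x, dtype='float64')
    x = x - x.max()  # shift-invariance: softmax(x) == softmax(x - c)
    e = np.exp(x)
    return e / e.sum()

# naive np.exp() would overflow on scores like these
scores = [1000.0, 1001.0, 1002.0]
print(softmax_stable(scores))  # same as softmax([1, 2, 3])
```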
Softmax Implementation Method 2
```python
import numpy as np

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x), axis=0)

# test result
scores = [3.0, 1.0, 0.2]
print(softmax(scores))
```
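Because Method 2 sums along `axis=0`, it also handles 2-D input the same way as Method 1's `ndim > 1` branch: each column is treated as one set of scores and normalized independently. A quick sketch with made-up scores:

```python
import numpy as np

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x), axis=0)

# Each column is one set of scores; axis=0 normalizes down the columns.
scores_2d = np.array([[1.0, 2.0, 3.0],
                      [2.0, 1.0, 0.5]])
result = softmax(scores_2d)
print(result)
print(result.sum(axis=0))  # every column sums to 1
```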
That covers two ways to implement the softmax function in Python (the vectorized Method 2 is recommended). I hope it serves as a useful reference.