This short article introduces how to implement the Softmax regression function in Python. The softmax function normalizes classification scores into a probability distribution, but unlike simple proportional normalization it first applies an exponential transformation, so larger scores receive a disproportionately larger share of the normalized output.
Softmax formula
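The formula itself appears to have been lost (it was probably an image in the original). For a score vector x, the standard softmax is:

```latex
\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_{j} e^{x_j}}
```

Each output is positive and the outputs sum to 1, which is what makes softmax suitable for turning raw classification scores into probabilities.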
Softmax implementation method 1
import numpy as np

def softmax(x):
    """Compute softmax values for each set of scores in x."""
    x = np.array(x, dtype='float64')
    x = np.exp(x)
    if x.ndim == 1:
        sumcol = x.sum()
        for i in range(x.size):
            x[i] = x[i] / sumcol
    elif x.ndim > 1:
        # normalize each column independently
        sumcol = x.sum(axis=0)
        for row in x:
            for i in range(row.size):
                row[i] = row[i] / sumcol[i]
    return x

# Test
scores = [3.0, 1.0, 0.2]
print(softmax(scores))
The calculation result is as follows:
[ 0.8360188 0.11314284 0.05083836]
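Since the implementation above also handles 2-D input by normalizing along `axis=0`, here is a small sketch (using a hypothetical 2-D array of my own choosing, not from the original) showing that each column of the result sums to 1:

```python
import numpy as np

def softmax(x):
    """Softmax that normalizes each column of a 2-D array."""
    x = np.array(x, dtype='float64')
    e = np.exp(x)
    return e / e.sum(axis=0)

# Two samples per column; identical rows give 0.5 in every cell
scores_2d = np.array([[1.0, 2.0, 3.0],
                      [1.0, 2.0, 3.0]])
result = softmax(scores_2d)
print(result)
print(result.sum(axis=0))  # each column sums to 1
```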
Softmax implementation method 2
import numpy as np

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x), axis=0)

# Test
scores = [3.0, 1.0, 0.2]
print(softmax(scores))
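One caveat, not covered in the original: both versions overflow for large scores, because `np.exp(1000)` is infinite in float64. A common remedy (an addition of mine, not from the source) is to subtract the maximum score before exponentiating, which leaves the result unchanged mathematically:

```python
import numpy as np

def softmax_stable(x):
    """Numerically stable softmax: shift by the max before exp."""
    x = np.asarray(x, dtype='float64')
    z = x - np.max(x, axis=0)  # largest shifted score is 0, so exp never overflows
    e = np.exp(z)
    return e / e.sum(axis=0)

# These scores would overflow a naive np.exp(x)
scores = [1000.0, 1001.0, 1002.0]
print(softmax_stable(scores))
```

Subtracting a constant from every score multiplies numerator and denominator by the same factor, so the output probabilities are identical to the unshifted formula.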
That covers both implementations of the Softmax regression function in Python. I hope it serves as a useful reference.