I had been searching for how to calculate precision and recall in TensorFlow. Being used to just calling a library function, I assumed TensorFlow would already encapsulate this behind a ready-made method, but I never found one and had almost given up. Then I came across the code below at the following address; even though it cannot be run as-is, it is very instructive.
https://gist.github.com/Mistobaan/337222ac3acbfc00bdac
Cleaned up (names de-mangled, the missing `zeros_like_actuals` argument in the false-positive test restored from context, Python 3 prints, and a bug fixed: the gist computed FPR by dividing by TP + FN instead of FP + TN), the code is:

```python
import tensorflow as tf

def tf_confusion_metrics(model, actual_classes, session, feed_dict):
  predictions = tf.argmax(model, 1)
  actuals = tf.argmax(actual_classes, 1)

  ones_like_actuals = tf.ones_like(actuals)
  zeros_like_actuals = tf.zeros_like(actuals)
  ones_like_predictions = tf.ones_like(predictions)
  zeros_like_predictions = tf.zeros_like(predictions)

  # True positives: actual class 1, predicted class 1
  tp_op = tf.reduce_sum(
      tf.cast(
          tf.logical_and(
              tf.equal(actuals, ones_like_actuals),
              tf.equal(predictions, ones_like_predictions)),
          "float"))

  # True negatives: actual class 0, predicted class 0
  tn_op = tf.reduce_sum(
      tf.cast(
          tf.logical_and(
              tf.equal(actuals, zeros_like_actuals),
              tf.equal(predictions, zeros_like_predictions)),
          "float"))

  # False positives: actual class 0, predicted class 1
  fp_op = tf.reduce_sum(
      tf.cast(
          tf.logical_and(
              tf.equal(actuals, zeros_like_actuals),
              tf.equal(predictions, ones_like_predictions)),
          "float"))

  # False negatives: actual class 1, predicted class 0
  fn_op = tf.reduce_sum(
      tf.cast(
          tf.logical_and(
              tf.equal(actuals, ones_like_actuals),
              tf.equal(predictions, zeros_like_predictions)),
          "float"))

  tp, tn, fp, fn = session.run([tp_op, tn_op, fp_op, fn_op], feed_dict)

  tpr = float(tp) / (float(tp) + float(fn))
  # Note: the original gist divided by (tp + fn) here, which is wrong;
  # the false positive rate is FP / (FP + TN).
  fpr = float(fp) / (float(fp) + float(tn))

  accuracy = (float(tp) + float(tn)) / (float(tp) + float(fp) + float(fn) + float(tn))

  recall = tpr
  precision = float(tp) / (float(tp) + float(fp))

  f1_score = (2 * (precision * recall)) / (precision + recall)

  print('Precision = ', precision)
  print('Recall = ', recall)
  print('F1 Score = ', f1_score)
  print('Accuracy = ', accuracy)
```
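To sanity-check the formulas above without building a TensorFlow graph, the same confusion-matrix bookkeeping can be done in plain Python. The function name and the toy labels here are my own illustration, not part of the gist:

```python
def confusion_metrics(actuals, predictions):
    """Compute precision, recall, F1, and accuracy from binary label lists."""
    pairs = list(zip(actuals, predictions))
    tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # true positives
    tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # true negatives
    fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # false positives
    fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # false negatives

    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return precision, recall, f1, accuracy

# Toy example: 4 actual positives (3 caught), 4 actual negatives (1 misflagged)
actuals     = [1, 1, 1, 1, 0, 0, 0, 0]
predictions = [1, 1, 1, 0, 1, 0, 0, 0]
p, r, f1, acc = confusion_metrics(actuals, predictions)
# precision = 3/4, recall = 3/4, accuracy = 6/8
```

This mirrors what the TensorFlow ops compute, just with Python sums instead of `tf.reduce_sum` over boolean masks.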