Define the summary writer:
writer = tf.summary.FileWriter(logdir=self.han_config.log_path, graph=session.graph)
1. Use scalar summaries to record results such as accuracy and loss.
a. First, define them outside the training loop:
test_accuracy_summary = tf.summary.scalar('test_accuracy', self.han_model.accuracy)
test_loss_summary = tf.summary.scalar('test_loss', self.han_model.loss)
test_scalar = tf.summary.merge([test_accuracy_summary, test_loss_summary])
b. When calling session.run, also run test_scalar to get its serialized value, then add it to the writer (a full loop is sketched after this step):
writer.add_summary(summary=test_scalar_, global_step=steps)
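Putting steps a and b together, here is a minimal, self-contained sketch of the scalar-summary pattern (TensorFlow 1.x). The graph below (placeholders x/y, a dense layer, loss, accuracy, the ./logs directory and the random batch) is a toy stand-in invented for illustration, not the HAN model of this post; only the summary calls mirror the code above.

import numpy as np
import tensorflow as tf

# Hypothetical toy graph standing in for self.han_model
x = tf.placeholder(tf.float32, [None, 10], name='x')
y = tf.placeholder(tf.float32, [None, 1], name='y')
pred = tf.layers.dense(x, 1)
loss = tf.reduce_mean(tf.square(pred - y))
accuracy = tf.reduce_mean(tf.cast(tf.abs(pred - y) < 0.5, tf.float32))
train_op = tf.train.AdamOptimizer(0.001).minimize(loss)

# a. Define the scalar summaries and merge them outside the training loop
test_accuracy_summary = tf.summary.scalar('test_accuracy', accuracy)
test_loss_summary = tf.summary.scalar('test_loss', loss)
test_scalar = tf.summary.merge([test_accuracy_summary, test_loss_summary])

with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter(logdir='./logs', graph=session.graph)
    batch_x = np.random.rand(32, 10).astype(np.float32)  # dummy data
    batch_y = np.random.rand(32, 1).astype(np.float32)
    for steps in range(100):
        # b. Run the merged scalar op together with the training op ...
        _, test_scalar_ = session.run([train_op, test_scalar],
                                      feed_dict={x: batch_x, y: batch_y})
        # ... then write the serialized summary, tagged with the global step
        writer.add_summary(summary=test_scalar_, global_step=steps)
    writer.close()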
2. Use histogram summaries to record weights and biases.
a. First, define them outside the training loop:
W_w_attention_word_histogram = tf.summary.histogram('W_w_attention_word', self.han_model.W_w_attention_word)
W_b_attention_word_histogram = tf.summary.histogram('W_b_attention_word', self.han_model.W_b_attention_word)
context_vecotor_word_histogram = tf.summary.histogram('context_vecotor_word', self.han_model.context_vecotor_word)
W_w_attention_sentence_histogram = tf.summary.histogram('W_w_attention_sentence', self.han_model.W_w_attention_sentence)
W_b_attention_sentence_histogram = tf.summary.histogram('W_b_attention_sentence', self.han_model.W_b_attention_sentence)
context_vecotor_sentence_histogram = tf.summary.histogram('context_vecotor_sentence', self.han_model.context_vecotor_sentence)
train_variable_histogram = tf.summary.merge([W_w_attention_word_histogram, W_b_attention_word_histogram, context_vecotor_word_histogram, W_w_attention_sentence_histogram, W_b_attention_sentence_histogram, context_vecotor_sentence_histogram])
b. When calling session.run, also run train_variable_histogram to get its serialized value, then add it to the writer (see the sketch after this step):
writer.add_summary(summary=train_variable_histogram_, global_step=steps)
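The histogram step works the same way; below is a comparably minimal sketch using a hypothetical weight W and bias b in place of the HAN attention parameters above. After training, both the scalars and the histograms can be inspected by pointing TensorBoard at the same log directory (tensorboard --logdir=./logs).

import numpy as np
import tensorflow as tf

# Hypothetical weight and bias standing in for the attention variables
W = tf.get_variable('W_example', shape=[10, 1])
b = tf.get_variable('b_example', shape=[1])

# a. One histogram summary per variable, merged outside the training loop
W_histogram = tf.summary.histogram('W_example', W)
b_histogram = tf.summary.histogram('b_example', b)
train_variable_histogram = tf.summary.merge([W_histogram, b_histogram])

x = tf.placeholder(tf.float32, [None, 10])
y = tf.placeholder(tf.float32, [None, 1])
loss = tf.reduce_mean(tf.square(tf.matmul(x, W) + b - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter(logdir='./logs', graph=session.graph)
    batch_x = np.random.rand(32, 10).astype(np.float32)  # dummy data
    batch_y = np.random.rand(32, 1).astype(np.float32)
    for steps in range(100):
        # b. Run the merged histogram op alongside the training op ...
        _, train_variable_histogram_ = session.run(
            [train_op, train_variable_histogram],
            feed_dict={x: batch_x, y: batch_y})
        # ... and add the result at the current step
        writer.add_summary(summary=train_variable_histogram_, global_step=steps)
    writer.close()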
Original post: https://www.cnblogs.com/callyblog/p/9549993.html