Today we introduce a very practical tool for machine learning: TensorBoard. What makes it so useful is that it visualizes the otherwise black-box training process, giving users a much better understanding of what is going on during training. Below we walk through how to use TensorBoard, using the MNIST dataset as a hands-on example.
Code: MNIST classification with TensorBoard visualization
```python
import numpy as np
from keras.layers import Input, Dense, Dropout, Conv2D, MaxPool2D, Flatten
from keras.datasets import mnist
from keras.models import Model
from keras.utils import to_categorical
from keras.callbacks import TensorBoard

if __name__ == '__main__':
    # Data preparation
    # (x_train, y_train), (x_test, y_test) = mnist.load_data()  # or load via keras
    data = np.load('mnist.npz')
    x_train, y_train = data['x_train'], data['y_train']
    x_test, y_test = data['x_test'], data['y_test']
    x_train = np.expand_dims(x_train, axis=-1)  # add channel dim: (N, 28, 28, 1)
    x_test = np.expand_dims(x_test, axis=-1)
    y_train = to_categorical(y_train, num_classes=10)
    y_test = to_categorical(y_test, num_classes=10)

    # Hyperparameters
    batch_size = 128
    epochs = 10

    # Model definition
    inputs = Input([28, 28, 1])
    x = Conv2D(32, (5, 5), activation='relu')(inputs)
    x = Conv2D(64, (5, 5), activation='relu')(x)
    x = MaxPool2D(pool_size=(2, 2))(x)
    x = Flatten()(x)
    x = Dense(128, activation='relu')(x)
    x = Dropout(0.5)(x)
    x = Dense(10, activation='softmax')(x)
    model = Model(inputs, x)

    # Compile
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])

    # TensorBoard callback: log_dir is the directory the event files are written to
    # (write_grads only takes effect when histogram_freq > 0)
    tensorboard = TensorBoard(log_dir='./model', histogram_freq=1, write_grads=True)

    # Train -- pass the TensorBoard callback in via callbacks=[...]
    result = model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs,
                       shuffle=True, validation_split=0.2,
                       callbacks=[tensorboard])
```
Launching TensorBoard
In fact, the training information has already been saved to the directory configured for TensorBoard. Open PowerShell in the directory containing the *.py file and run

tensorboard --logdir=XXX   (where XXX is the `model` directory from above)

PowerShell will print a message saying that local port 6006 is now serving. Open a browser and go to

http://localhost:6006

and you are in the TensorBoard interface!
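A practical aside (not part of the original post): if you rerun training with the same log_dir, the curves from different runs get mixed together in the UI. A common trick is to give each run its own timestamped subdirectory, so TensorBoard can overlay runs side by side. A minimal sketch, using only the standard library:

```python
import os
from datetime import datetime

def make_run_dir(base='model'):
    # One subdirectory per training run, named by timestamp,
    # e.g. model/20240101-120000; pass this to TensorBoard(log_dir=...).
    run_dir = os.path.join(base, datetime.now().strftime('%Y%m%d-%H%M%S'))
    os.makedirs(run_dir, exist_ok=True)
    return run_dir

run_dir = make_run_dir()
print(run_dir)
```

Then `tensorboard --logdir=model` shows every run under `model/` as a separate, selectable curve.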
TensorBoard configuration
In the example above we can only see the training curves for loss and accuracy; no other information is recorded. In fact, TensorBoard's visualization capabilities are much more powerful, and it currently supports the following seven categories:
- SCALARS: curves of scalar values recorded during training, such as accuracy, loss, and weight/bias statistics
- IMAGES: images recorded during training
- AUDIO: audio recorded during training
- GRAPHS: the model's dataflow graph, along with memory and time consumed on each device
- DISTRIBUTIONS: distribution plots of data recorded during training
- HISTOGRAMS: histograms of data recorded during training
- EMBEDDINGS: projections of embedding vectors (e.g. word vectors)
More details on configuring each of these will be added later~
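To see where these categories come from, it helps to know that the Keras TensorBoard callback is built on top of the summary-writing API. Below is a hedged sketch, assuming TensorFlow 2's `tf.summary` module (not the older standalone-Keras API used above), that writes a few of the record types by hand:

```python
import numpy as np
import tensorflow as tf

# Write several of the record types listed above into an event file.
writer = tf.summary.create_file_writer('demo_logs')
with writer.as_default():
    for step in range(3):
        # SCALARS tab: one scalar value per step
        tf.summary.scalar('demo/loss', 1.0 / (step + 1), step=step)
        # HISTOGRAMS / DISTRIBUTIONS tabs: a tensor of values per step
        tf.summary.histogram('demo/weights', tf.random.normal([100]), step=step)
    # IMAGES tab: a batch of images shaped [batch, height, width, channels]
    img = np.random.rand(1, 28, 28, 1).astype('float32')
    tf.summary.image('demo/digit', img, step=0)
writer.flush()
```

Running `tensorboard --logdir=demo_logs` then shows the SCALARS, HISTOGRAMS, DISTRIBUTIONS, and IMAGES tabs populated with these records.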