RustyNail — coder 【blog】https://rustynail.me 【nostr】wss://ts.relays.world/ wss://relays.world/nostr

【Machine Learning for the Library Caller】 Linear Regression

import tensorflow as tf
import numpy as np

# 1000 samples of a single feature, drawn from a standard normal distribution
X = tf.random.normal(shape=[1000, 1])
print("X: ", X[:2])

# Ground-truth parameters: slope W = 3, intercept b = 2
W = tf.constant([[3.]])
b = tf.constant(2.)



# Targets follow y = 3x + 2 plus Gaussian observation noise
Y = tf.matmul(X, W) + b
noise = tf.random.normal(shape=Y.shape)
Y = Y + noise






X:  tf.Tensor(
[[-1.2712942]
 [-0.177366 ]], shape=(2, 1), dtype=float32)
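Before handing things over to Keras, a quick closed-form check recovers the slope and intercept directly. This is a small sketch using NumPy's least-squares solver and the X, Y tensors defined above; it is not part of the original pipeline, just a sanity check that the generated data really follows y ≈ 3x + 2.

import numpy as np

# Flatten the tensors into 1-D NumPy arrays
x = X.numpy().ravel()
y = Y.numpy().ravel()

# Ordinary least squares: solve [x, 1] @ [w, b]^T ≈ y
A = np.stack([x, np.ones_like(x)], axis=1)
(w_hat, b_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"closed-form estimate: w ≈ {w_hat:.3f}, b ≈ {b_hat:.3f}")  # should be near 3 and 2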



import matplotlib.pyplot as plt

plt.subplot(1, 1, 1)
plt.title("plot 1")
plt.scatter(X, Y)

plt.show()

[Figure: scatter plot of X against Y]
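For reference, the true generating line y = 3x + 2 can be overlaid on the scatter. A small optional sketch, assuming the X and Y tensors from above:

import numpy as np
import matplotlib.pyplot as plt

plt.scatter(X, Y, s=8, alpha=0.5, label="samples")
xs = np.linspace(-3, 3, 100)
plt.plot(xs, 3 * xs + 2, color="red", label="y = 3x + 2")  # ground-truth line
plt.legend()
plt.show()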

net = tf.keras.Sequential()
# Add a fully connected (Dense) layer with one output unit: y = w*x + b
net.add(tf.keras.layers.Dense(units=1, input_dim=1))
# Random-normal initializer for the second layer's kernel
initializer = tf.initializers.RandomNormal(stddev=0.1)
# Stack a second Dense layer, mainly to show how kernel_initializer is wired in
net.add(tf.keras.layers.Dense(1, kernel_initializer=initializer))
# Adam optimizer (despite the original variable name "sgd", this is not plain SGD)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

# Note: SparseCategoricalAccuracy is a classification metric; for this regression
# task it stays at 0 and only the MSE loss is informative.
net.compile(optimizer=optimizer,
            loss=tf.keras.losses.MeanSquaredError(),
            metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
# RemoteMonitor streams per-epoch metrics to a monitoring server (here assumed at localhost:9000)
remote = tf.keras.callbacks.RemoteMonitor(root='http://localhost:9000')
history = net.fit(X, Y, batch_size=100, epochs=100, validation_split=0.2, callbacks=[remote])
net.summary()
Epoch 1/100
8/8 [==============================] - 0s 20ms/step - loss: 13.1567 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 13.6374 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 2/100
8/8 [==============================] - 0s 12ms/step - loss: 12.7501 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 13.0824 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 3/100
8/8 [==============================] - 0s 8ms/step - loss: 12.2133 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 12.3870 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 4/100
8/8 [==============================] - 0s 7ms/step - loss: 11.5556 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 11.5262 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 5/100
8/8 [==============================] - 0s 11ms/step - loss: 10.7523 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 10.5153 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 6/100
8/8 [==============================] - 0s 8ms/step - loss: 9.8151 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 9.3757 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 7/100
8/8 [==============================] - 0s 10ms/step - loss: 8.8276 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 8.1307 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 8/100
8/8 [==============================] - 0s 10ms/step - loss: 7.7102 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 6.8838 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 9/100
8/8 [==============================] - 0s 10ms/step - loss: 6.5997 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 5.6855 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 10/100
8/8 [==============================] - 0s 10ms/step - loss: 5.5322 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 4.5915 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 11/100
8/8 [==============================] - 0s 10ms/step - loss: 4.5828 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 3.6276 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 12/100
8/8 [==============================] - 0s 7ms/step - loss: 3.7428 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 2.8263 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 13/100
8/8 [==============================] - 0s 11ms/step - loss: 3.0192 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 2.1999 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 14/100
8/8 [==============================] - 0s 9ms/step - loss: 2.4301 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 1.7255 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 15/100
8/8 [==============================] - 0s 10ms/step - loss: 1.9782 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 1.3785 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 16/100
8/8 [==============================] - 0s 8ms/step - loss: 1.6411 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 1.1450 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 17/100
8/8 [==============================] - 0s 11ms/step - loss: 1.3934 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 1.0115 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 18/100
8/8 [==============================] - 0s 10ms/step - loss: 1.2405 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9485 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 19/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1697 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9267 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 20/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1255 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9291 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 21/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1092 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9376 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 22/100
8/8 [==============================] - 0s 8ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9456 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 23/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1048 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9542 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 24/100
8/8 [==============================] - 0s 12ms/step - loss: 1.1051 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9556 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 25/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9571 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 26/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9557 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 27/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9546 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 28/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9535 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 29/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1045 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9515 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 30/100
8/8 [==============================] - 0s 6ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9525 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 31/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9520 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 32/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9521 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 33/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9527 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 34/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9535 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 35/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1037 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9527 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 36/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1037 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9521 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 37/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9487 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 38/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9510 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 39/100
8/8 [==============================] - 0s 12ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9529 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 40/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1037 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9517 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 41/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9509 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 42/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9514 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 43/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9521 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 44/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9511 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 45/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9518 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 46/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1050 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9536 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 47/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9501 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 48/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9501 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 49/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9530 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 50/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9529 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 51/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9527 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 52/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9519 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 53/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9517 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 54/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9521 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 55/100
8/8 [==============================] - 0s 8ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9526 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 56/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9537 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 57/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9518 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 58/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9511 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 59/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9502 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 60/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9504 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 61/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9548 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 62/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9555 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 63/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9548 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 64/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9529 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 65/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1045 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9493 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 66/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9478 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 67/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9507 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 68/100
8/8 [==============================] - 0s 6ms/step - loss: 1.1037 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9525 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 69/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9531 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 70/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9540 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 71/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9533 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 72/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9526 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 73/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9511 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 74/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9540 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 75/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9535 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 76/100
8/8 [==============================] - 0s 8ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9524 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 77/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9504 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 78/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9523 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 79/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1045 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9541 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 80/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9498 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 81/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9484 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 82/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1044 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9537 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 83/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1041 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9519 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 84/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9527 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 85/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1050 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9494 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 86/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1037 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9520 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 87/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9546 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 88/100
8/8 [==============================] - 0s 9ms/step - loss: 1.1050 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9545 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 89/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1053 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9565 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 90/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9512 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 91/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1038 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9493 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 92/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1045 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9522 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 93/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9507 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 94/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1046 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9538 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 95/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1043 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9511 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 96/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9497 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 97/100
8/8 [==============================] - 0s 11ms/step - loss: 1.1042 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9503 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 98/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1040 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9511 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 99/100
8/8 [==============================] - 0s 7ms/step - loss: 1.1039 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9512 - val_sparse_categorical_accuracy: 0.0000e+00
Epoch 100/100
8/8 [==============================] - 0s 10ms/step - loss: 1.1060 - sparse_categorical_accuracy: 0.0000e+00 - val_loss: 0.9508 - val_sparse_categorical_accuracy: 0.0000e+00
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 1)                 2         
                                                                 
 dense_1 (Dense)             (None, 1)                 2         
                                                                 
=================================================================
Total params: 4
Trainable params: 4
Non-trainable params: 0
_________________________________________________________________
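Since this is one-variable linear regression, the same fit can also be expressed with a single Dense layer and the MSE loss alone as the reported metric. A minimal alternative sketch (not the setup used above), reusing the X and Y tensors:

import tensorflow as tf

# One Dense(1) layer already models y = w*x + b
simple_net = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
simple_net.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")
simple_net.fit(X, Y, batch_size=100, epochs=100, validation_split=0.2, verbose=0)
print(simple_net.get_weights())  # kernel and bias should land near 3 and 2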
plt.plot(history.history["loss"], label="Training Loss")
plt.plot(history.history["val_loss"], label="Validation Loss")
plt.legend()
plt.show()
net.get_weights()
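Because the trained model stacks two 1×1 Dense layers, the learned slope and intercept are the composition of both layers. A short sketch, assuming the trained net from above, recovers them for comparison with the true W = 3 and b = 2:

# Each layer computes y = x*w + b, so the stack is
#   y = (x*w1 + b1)*w2 + b2 = x*(w1*w2) + (b1*w2 + b2)
w1, b1, w2, b2 = [p.squeeze() for p in net.get_weights()]
slope = w1 * w2
intercept = b1 * w2 + b2
print(f"effective slope ≈ {slope:.3f} (true 3), intercept ≈ {intercept:.3f} (true 2)")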