Get the value of some weights in a model trained by TensorFlow

I have trained a ConvNet model with TensorFlow, and I want to get the weights of a particular layer. For example, in Torch7 I would simply access model.modules[2].weights to get the weights of layer 2. How would I do the same thing in TensorFlow?


In TensorFlow, trained weights are represented by tf.Variable objects. If you created a tf.Variable yourself, say one called v, you can get its value as a NumPy array by calling sess.run(v) (where sess is a tf.Session).
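For example, here is a minimal TF 1.x sketch (the variable name and shape are made up for illustration):

import tensorflow as tf  # TF 1.x API

# Hypothetical variable; any tf.Variable in your model works the same way.
v = tf.Variable(tf.random_normal([3, 3]), name="example_weights")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    value = sess.run(v)   # NumPy array holding the current value of v
    print(value.shape)    # (3, 3)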

If you do not currently have a pointer to the tf.Variable, you can get a list of the trainable variables in the current graph by calling tf.trainable_variables(). This function returns a list of all trainable tf.Variable objects in the current graph, and you can select the one that you want by matching the v.name property. For example:

# Desired variable is called "tower_2/filter:0".
var = [v for v in tf.trainable_variables() if v.name == "tower_2/filter:0"][0]
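Once you have the tf.Variable, fetching its value works the same way (assuming sess is an active tf.Session):

weights = sess.run(var)  # NumPy array holding the weights of "tower_2/filter:0"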

If you step through the code below, you first get the list of trainable variables, then fetch their values and pair each value with its variable name, which is one possible way to work with that information (see the dictionary sketch after the snippet).

vars = tf.trainable_variables()
print(vars)                  # prints the list of trainable tf.Variable objects
vars_vals = sess.run(vars)   # fetch all values in a single call; sess is an active tf.Session
for var, val in zip(vars, vars_vals):
    print("var: {}, value: {}".format(var.name, val))  # ...or collect them in a list/dict

TensorFlow 2.0 compatible answer: if we build a model using the Keras Sequential API, we can get the weights of the model with the code below:

!pip install tensorflow==2.1

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, MaxPool2D, Dropout, Flatten, Dense

# Example hyperparameters, chosen to match the shapes printed below
height, width, channels = 28, 28, 1
conv1_fmaps, conv1_ksize, conv1_stride, conv1_pad = 32, 3, 1, "same"

model = Sequential()

model.add(Conv2D(filters=conv1_fmaps, kernel_size=conv1_ksize,
                 strides=conv1_stride, padding=conv1_pad,
                 activation=tf.nn.relu, input_shape=(height, width, channels),
                 data_format='channels_last'))
model.add(MaxPool2D(pool_size=(2, 2), strides=(2, 2), padding="valid"))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(units=32, activation='relu'))
model.add(Dense(units=10, activation='softmax'))

model.summary()

print(model.trainable_variables)

The last statement, print(model.trainable_variables), prints the weights of the model as shown below:

    [<tf.Variable 'conv2d/kernel:0' shape=(3, 3, 1, 32) dtype=float32>,
<tf.Variable 'conv2d/bias:0' shape=(32,) dtype=float32>, <tf.Variable
'dense/kernel:0' shape=(6272, 32) dtype=float32>, <tf.Variable 'dense/bias:0'
shape=(32,) dtype=float32>, <tf.Variable 'dense_1/kernel:0' shape=(32, 10)
dtype=float32>, <tf.Variable 'dense_1/bias:0' shape=(10,) dtype=float32>]

To get the weights as NumPy arrays, use model.get_weights(). This returns all parameters, trainable and non-trainable alike; examples of non-trainable parameters are the moving mean and moving variance of a BatchNormalization layer.

model.get_weights()

You can then index into the returned list; for example, the first two entries are the first layer's kernel and bias:

model.get_weights()[0]
model.get_weights()[1]
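Alternatively, you can go through a specific layer object. A minimal sketch (the index 0 assumes the Conv2D layer from the model above):

conv_kernel, conv_bias = model.layers[0].get_weights()  # kernel shape (3, 3, 1, 32), bias shape (32,)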

model.weights

will also do the trick; it returns the same list as

model.trainable_variables

as long as the model has no non-trainable weights (a BatchNormalization layer, for example, adds a moving mean and moving variance that appear only in model.weights).
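A minimal sketch of that difference, using a made-up Dense + BatchNormalization model:

import tensorflow as tf

m = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(8,)),
    tf.keras.layers.BatchNormalization(),
])
print(len(m.weights))               # 6: kernel, bias, gamma, beta, moving_mean, moving_variance
print(len(m.trainable_variables))   # 4: kernel, bias, gamma, beta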