I am creating a sparse neural network as described in the image below. Keras only provides a dense layer, and we can't choose how many neurons are connected to the previous layer. To implement this in Keras, I am trying the following approach:

1- Get the gradient matrix of each layer in each epoch

2- Multiply the gradient matrix by a mask matrix to make it sparse

3- Update the weights according to that masked gradient matrix

I could not find a gradient matrix in Keras. How can I get it and update it during training?

`tf.GradientTape()` only lets me see the gradients.
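For context, here is a minimal sketch of the three steps above in a custom training loop. The toy model, data, and the random `masks` are placeholders, not a finished implementation:

```python
import numpy as np
import tensorflow as tf

# Toy model and data (illustrative only)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.SGD(0.01)
loss_fn = tf.keras.losses.MeanSquaredError()

# One fixed binary mask per kernel (here: randomly drop half the connections);
# biases are left fully dense with an all-ones mask
masks = [tf.constant((np.random.rand(*w.shape) > 0.5).astype('float32'))
         if w.ndim == 2 else tf.ones_like(w)
         for w in model.get_weights()]

x = tf.random.normal((16, 8))
y = tf.random.normal((16, 1))

with tf.GradientTape() as tape:
    loss = loss_fn(y, model(x))
grads = tape.gradient(loss, model.trainable_variables)             # step 1: gradients
masked = [g * m for g, m in zip(grads, masks)]                     # step 2: apply mask
optimizer.apply_gradients(zip(masked, model.trainable_variables))  # step 3: update
```

Note that masking only the gradients keeps the initial values of the pruned weights; to make the network truly sparse you would also multiply the weights themselves by the mask once at the start.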

Thanks in advance. Please see the attached picture below.

### Solution

Instead of manipulating the gradient matrices manually, it's better to apply TensorFlow's pruning techniques (from the `tensorflow_model_optimization` toolkit). Pruning can be applied to each layer of the model by passing a parameter dictionary.

```
from tensorflow_model_optimization.sparsity import keras as sparsity

pruning_params = {
    'pruning_schedule': sparsity.PolynomialDecay(
        initial_sparsity=0.1,   # 10% of weights pruned at the start
        final_sparsity=0.75,    # 75% of weights pruned at the end
        begin_step=1000,
        end_step=5000,
        frequency=100)
}
```

For every layer chosen to be pruned, a binary mask variable of the same size and shape as the layer's weight tensor is added; it determines which of the weights participate in the forward pass. TensorFlow uses an automated gradual pruning algorithm in which the sparsity is increased from an initial value (usually 0) to a final value over a span of n pruning steps, starting at training step t0 and with pruning frequency ∆t.
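The polynomial schedule behind `PolynomialDecay` can be sketched in plain Python (the cubic exponent matches the schedule's default `power=3`; the parameter values mirror the earlier `pruning_params`):

```python
def polynomial_sparsity(step, initial_sparsity=0.1, final_sparsity=0.75,
                        begin_step=1000, end_step=5000, power=3):
    """Target sparsity at a given training step under a polynomial decay schedule."""
    if step < begin_step:
        return 0.0  # pruning has not started yet
    if step >= end_step:
        return final_sparsity  # sparsity stays at its final value
    progress = (step - begin_step) / (end_step - begin_step)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - progress) ** power
```

At `begin_step` this returns the initial sparsity, then rises steeply at first and flattens out as it approaches the final sparsity at `end_step`.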

You can add a pruned layer with the following code:

```
import tensorflow as tf
from tensorflow_model_optimization.sparsity import keras as sparsity

l = tf.keras.layers

sparsity.prune_low_magnitude(
    l.Conv2D(32, 5, padding='same', activation='relu'),
    input_shape=input_shape,  # shape of the model input, e.g. (28, 28, 1)
    **pruning_params)
```

A complete tutorial is available here.