Programming a PyTorch neural network with a branch in the flow of information

I am trying to program a custom layer in PyTorch. I would like this layer to be fully connected to the previous layer, but at the same time I want to feed it some information from the input layer; let's say I want it to be fully connected to the first layer as well. For example, the 4th layer would be fed the outputs of both the 3rd and the 1st layers.
This would make the information flow split at the first layer, with one branch being inserted back into the network later.

I have to define the forward method of this layer so that it takes two inputs:

class MyLayer(nn.Module):

    def __init__(self, size_in, size_out):
        super().__init__()

        self.size_in, self.size_out = size_in, size_out

        weights = torch.Tensor(size_out, size_in)

        (... ...)

    def forward(self, first_layer, previous_layer):

        (... ...)

        return output

How can I make this work if I put this layer after, let's say, a normal feed-forward layer which takes only the previous layer's output as input?
Can I use nn.Sequential with this layer?


Thanks!

> Solution:

Just concatenate the input with the output of the previous layer and feed the result to the next layer, like:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(100, 120)  # suppose your input size is 100
        self.fc2 = nn.Linear(120, 80)
        self.fc3 = nn.Linear(180, 10)   # 180 = 100 (input) + 80 (fc2 output)

    def forward(self, input_layer):
        x = F.relu(self.fc1(input_layer))
        x = F.relu(self.fc2(x))
        # Concatenate along the feature dimension (dim=1 for batched
        # input of shape [batch, features]), not dim=0 (the batch dim).
        x = torch.cat((input_layer, x), dim=1)
        x = self.fc3(x)  # this layer is fed the input info and the previous layer
        return x
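As for the nn.Sequential question: nn.Sequential passes a single tensor from one module to the next, so a two-input forward won't plug into it directly. One workaround is to hide the concatenation inside a single-input wrapper module; here is a minimal sketch (the `SkipBlock` name and layer sizes are my own, chosen to match the example above):

```python
import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    """Run an inner sub-network and concatenate its input back onto
    its output, exposing the single-input forward() that
    nn.Sequential expects."""
    def __init__(self, inner):
        super().__init__()
        self.inner = inner

    def forward(self, x):
        # Feature-dimension concat: [batch, in] + [batch, out] -> [batch, in+out]
        return torch.cat((x, self.inner(x)), dim=1)

net = nn.Sequential(
    SkipBlock(nn.Sequential(
        nn.Linear(100, 120), nn.ReLU(),
        nn.Linear(120, 80), nn.ReLU(),
    )),
    nn.Linear(180, 10),  # 180 = 100 (skip branch) + 80 (inner output)
)

out = net(torch.randn(4, 100))
# out.shape: torch.Size([4, 10])
```

This keeps the branch self-contained, so the whole model still composes with other nn.Sequential stages.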