Locally connected layer #201
Conversation
Hello, a tiny update: locally_connected_1d is now a conv1d layer. Since layers that take a 2-d array as input are not well supported in this repository, in the next few days I'll provide an implementation that makes this work. Then I'll add the steps that transform the convolution into a locally connected layer.
Great. Tomorrow I'm going to try your implementation.
(force-pushed from f4c4412 to 9ae50ee)
(force-pushed from ad592da to b5e7f74)
Hello, after the recent updates I think my reshape_generalized ceased to work. For this reason, I preferred to switch to a reshape2d implementation. Of course, once this works, the goal will be to create a single reshape file. The cnn mnist 1d file, with the locally_connected_1d layer, still does not work, but that is expected since I have not yet worked on its connection to layer and network. I will tomorrow. Sorry for the delay, but I ran into some difficulties. I'd be glad if someone wants to have a look and apply some changes; it shouldn't be too complicated.
(force-pushed from 3a8cecc to eb4079d)
Hello, sorry for not giving updates, but I have been busy. I'm almost finished. I only have to fix one bug in cnn_network and then this draft will be ready for review, I think as early as tomorrow.
Overall LGTM. Here are some suggestions.
real, intent(in) :: gradient(:,:)
! The `input` dummy argument is not used but nevertheless declared
! because the abstract type requires it.
self % gradient = pack(gradient, .true.)
What is the purpose of this? Would the following not be more efficient?
self % gradient = pack(gradient, .true.)
! Note: g_ needs the pointer attribute for the => null() initialization
! and the rank-remapping assignment, and gradient would need the
! target attribute (and be contiguous) to be a valid pointer target.
real, pointer :: g_(:) => null()
g_(1:size(gradient)) => gradient
self % gradient = g_
nullify(g_)
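For context on what the original line does: `pack` with an all-`.true.` mask keeps every element and returns a 1-d copy in array-element (column-major) order. A minimal standalone demo (names are illustrative, not from the PR):

```fortran
program pack_demo
  implicit none
  real :: g(2, 2), flat(4)
  integer :: i

  ! Fill the 2-d array in column-major order: g(1,1)=1, g(2,1)=2, ...
  g = reshape([(real(i), i = 1, 4)], [2, 2])

  ! pack with a .true. mask keeps every element, producing a 1-d copy
  ! in column-major storage order: 1 2 3 4
  flat = pack(g, .true.)

  print *, flat
end program pack_demo
```

So `pack(gradient, .true.)` is simply a flatten; the question above is whether the temporary it allocates can be avoided.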
pure module subroutine forward(self, input)
  class(reshape2d_layer), intent(in out) :: self
  real, intent(in) :: input(:)
  self % output = reshape(input, self % output_shape)
A pointer could be used here instead of reshape, to avoid a temporary copy of input.
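A minimal sketch of the pointer idea, as a standalone demo rather than the layer code itself: a Fortran 2008 rank-remapping pointer assignment views a contiguous 1-d array as 2-d without allocating a temporary. Note the caveats: `input` would need the `target` attribute, and, if I read the standard correctly, an `intent(in)` dummy cannot be the target of a pointer assignment inside a `pure` procedure, so `forward` could not remain `pure` (or the remapping would have to happen in the caller).

```fortran
program remap_demo
  implicit none
  real, target :: flat(6)
  real, pointer :: mat(:,:)
  integer :: i

  flat = [(real(i), i = 1, 6)]

  ! Rank-remapping pointer assignment (Fortran 2008): view the
  ! contiguous 1-d array as a 2-by-3 matrix; no data is copied.
  mat(1:2, 1:3) => flat

  ! mat(2, 3) shares storage with flat(6)
  print *, mat(2, 3)
end program remap_demo
```

Whether this wins anything in practice depends on whether the compiler already elides the `reshape` temporary for the contiguous case.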
@ricor07 Great job! I really like it!
@OneAdder thank you for your review, I made the adjustments you suggested.
I tried to resolve the conflicts with main, but I think I have to merge more carefully. Is that needed, since you only have to merge the branch?
If there were conflicts between this branch and main, then they had to be reconciled by merging main into here. There don't seem to be merge conflicts now.
But it does seem like the merge conflicts may not have been properly resolved, because now there are compilation errors. You can still resolve them by hand, or if you want you can revert to the previous commit and I can help merge main here.
(force-pushed from 78d26e4 to b69ba9a)
Ok, solved. But I still get reminded that this branch has conflicts. Do I have to update main?
No, |
Choose the layout you prefer. Thank you.
Co-authored-by: Jeremie Vandenplas <[email protected]>
Thank you for this PR, it's a great addition! I'll take a day or two more to review it in more detail since it's so big. I'd also like to look at whether it's feasible to hide conv1d/2d, maxpool1d/2d, and reshape2d/3d behind generic layer constructors.
Yes, I do. In a few days I'll start implementing other things like average pooling, and I will also group those under generic names. If you have time: I think I made a mistake naming locally_connected_layer; I don't think 1d needs an underscore before it. Could you fix this? Thanks.
Thank you!
Continuing milancurcic#2 (comment) on the right repository. Reshape_generalized works; locally_connected_1d still does not. This is, of course, still a draft. My final goal is to make the layer work and to generalize ALL the 1d-3d functions, which are horrible to look at.
Edit: sorry, I mistakenly closed the other PR.