Physics-Informed Machine Learning with Residual Networks (ResNet)

Residual Networks (ResNet) [Physics Informed Machine Learning]

Residual Networks, or ResNets, introduced by He et al. in 2015, are a deep neural network architecture that has been widely used in machine learning, particularly in tasks like image recognition and natural language processing. ResNets are known for their ability to train very deep neural networks effectively, overcoming the vanishing-gradient problem that can occur in traditional deep networks.

One key innovation of ResNets is the use of skip connections, also known as shortcut connections, that allow information to bypass certain layers in the network. Rather than fitting a desired mapping H(x) directly, each block fits the residual F(x) = H(x) − x and outputs x + F(x); if the identity mapping is close to optimal, the block only needs to push F(x) toward zero. This residual learning approach helps to overcome the degradation problem, where adding more layers to a plain network can actually lead to worse performance.
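To make the idea concrete, here is a minimal sketch of a residual block, assuming PyTorch; the module and variable names are illustrative, not taken from any particular implementation:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Computes y = x + F(x): the layers only learn the residual F."""
    def __init__(self, dim: int):
        super().__init__()
        # F(x): a small two-layer transformation of the input
        self.f = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The skip connection adds the input back, so gradients can
        # flow through the identity path even in very deep stacks.
        return x + self.f(x)

# A deep stack stays trainable because each block starts near the
# identity mapping (learning F(x) close to 0 is easy).
net = nn.Sequential(*[ResidualBlock(64) for _ in range(20)])
x = torch.randn(8, 64)
y = net(x)  # shape (8, 64)
```

The design choice is that gradients reach early layers through the additive identity path, which is exactly what mitigates vanishing gradients in deep stacks.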

ResNets have been applied to a wide range of machine learning tasks, including image classification, object detection, and speech recognition. In recent years, researchers have also begun exploring the use of ResNets in physics-informed machine learning, where physical principles and constraints are built into the training of neural networks, typically by penalizing the residual of a governing equation in the loss function. This approach can improve the accuracy and generalization of machine learning models in physics-related tasks, such as solving partial differential equations or predicting material properties.
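As a hedged sketch of how the two ideas combine, the toy example below trains a ResNet-style network with a physics-informed loss on the ODE u'(t) = −u(t), u(0) = 1 (exact solution e^(−t)); the architecture and problem are chosen purely for illustration and are assumptions, not from the post:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """Residual block: y = x + F(x), as in the sketch above."""
    def __init__(self, dim: int):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
    def forward(self, x):
        return x + self.f(x)

# Hypothetical PINN: maps time t to the field of interest u(t).
model = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    Block(32), Block(32),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    t = torch.rand(128, 1, requires_grad=True)  # collocation points in [0, 1]
    u = model(t)
    # du/dt via automatic differentiation (create_graph keeps it
    # differentiable so the loss can be backpropagated through it)
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    residual = du + u                    # enforce the ODE u' + u = 0
    ic = model(torch.zeros(1, 1)) - 1.0  # enforce u(0) = 1
    loss = (residual ** 2).mean() + (ic ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Minimizing the squared equation residual at random collocation points, plus a penalty on the initial condition, is the standard physics-informed recipe; the residual blocks simply make the approximator deeper without the training instability of a plain stack.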

Overall, Residual Networks (ResNets) have proven to be a powerful tool in the field of machine learning, enabling the training of very deep neural networks and achieving state-of-the-art performance in a variety of tasks. As the field of physics-informed machine learning continues to grow, ResNets are likely to play a key role in advancing our understanding of complex physical systems and phenomena.

17 Comments
@suhailea963
4 months ago

I am a video editor. If you need any help related to video editing, you can contact me. I will share my portfolio.

@HansPeter-gx9ew
4 months ago

tbh understanding his videos is very difficult; IMO he explains badly. Like, 14:14 is the first more complicated part, and I don't really get what it is about. I wouldn't understand ResNet from his explanation either if I had no prior knowledge of it. He just assumes that I am some expert in math and DL.

@Daniboy370
4 months ago

You have an impressive ability to simplify complex subjects

@culturemanoftheages
4 months ago

Excellent explanation! For those interested in LLMs, residual connections are also featured in the vanilla transformer block. The idea is similar to CNN ResNets, but instead of gradually adding pixel resolution, each block adds semantic "resolution" to the original embedded text input.

@maksymriabov1356
4 months ago

IMHO you should speak a little faster and make fewer jests; for scientists watching this, it wastes time and attention.

@davidmccabe1623
4 months ago

Does anyone know if transformers have superseded ResNets for image classification?

@physicsanimated1623
4 months ago

Hi Steve – this is Vivek Karmarkar! Thanks for the video – great content as usual, and it keeps me motivated to create my own PINN content as well. Looking forward to the next video in the series, and I would love to talk PINN content creation with you!
I have been thinking about presenting PINNs with ODEs as examples, and it's nice to contrast them with Neural ODEs – nomenclature aside, it looks like the power of NNs as universal approximators allows us to model either the flow field (Neural ODEs) or the physical field of interest (PINNs) for analysis, which is pretty cool!

@lorisdemuth374
4 months ago

Many thanks for the extremely good videos. Really well explained and easy to understand.
A video on "Augmented neural ODEs" would go well with "neural ODEs" 😊

@ultrasound1459
4 months ago

ResNet is literally the best thing that has happened in deep learning.

@mostafasayahkarajy508
4 months ago

Thank you very much for your videos. I am glad that, besides the classical sources that promote science (such as books and papers), your lectures can also be found on YouTube. In my opinion, Prof. Brunton is the best provider of YouTube lectures, and I don't want to miss any of them.

@ramimohammed3132
4 months ago

thank u sire!

@cieciurka1
4 months ago

SHAMEEEEEE🎉 Bound, border, infinity, noninfinity, natural, where is the end?! calc machine how it works, integer, costs money costs profits cons in mathematics, NOMIA! ECO? algorithm accuracy, fuzzy logic, integer 0-I. ONE BOOK NO INDIVIDUAL HERE 🎉WHEN YOU SMOOTHING GRADIENT YOU LOSING

@sainissunil
4 months ago

Thank you for making this. I watched your video on Neural ODEs before I watched this. It is much easier to understand the Neural ODE video now that I have watched this.

I would love to watch a video about the ResNet classifier idea you discuss here. If you have already done that, please add a link here.

Thanks, and this is awesome!

@saraiva407
4 months ago

Thank you SO MUCH, Prof. Steve!! I intend to study neural networks in my graduate courses thanks to your lectures!! 😀

@cieciurka1
4 months ago

STEVE MAKE TWO. SMALLER HIGHER LIKE ARRAY ONE DIRECTION OR SYMMETRY LIKE MIRROR. FEEDBACK AND THIS 150.000ageSCIENCE.

@goodlack9093
4 months ago

Thank you for this content!
Love your approach. Please never stop educating people. We all need teachers like you! :) PS: Enjoying reading your book.

@Ishaheennabi
4 months ago

Love from Kashmir, India ❤❤❤