Understanding Convolution: From Discrete to Continuous Applications
An exploration of convolutions across probability, image processing, and Fourier transforms, with a companion video covering the continuous case.
About this video
Discrete convolutions, from probability to image processing and FFTs.
Video on the continuous case: https://youtu.be/IaSGqQa5O-M
Help fund future projects: https://www.patreon.com/3blue1brown
Special thanks to these supporters: https://3b1b.co/lessons/convolutions#thanks
An equally valuable form of support is to simply share the videos.
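Not from the video itself, but as a quick illustration of the discrete convolution idea described above (adding two random variables, as at the 2:07 chapter), here is a minimal Python sketch; the fair-dice distributions are just example inputs of my own choosing.

```python
import numpy as np

# Probability distributions for two fair six-sided dice (example inputs).
die_a = np.full(6, 1 / 6)
die_b = np.full(6, 1 / 6)

# The distribution of the sum of the two dice is the discrete convolution
# of the individual distributions: P(sum = s) = sum_k P(A = k) * P(B = s - k).
sum_dist = np.convolve(die_a, die_b)

for total, p in enumerate(sum_dist, start=2):  # possible sums run from 2 to 12
    print(f"P(sum = {total:2d}) = {p:.4f}")
```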
Other videos I referenced
Live lecture on image convolutions for the MIT Julia lab
https://youtu.be/8rrHTtUzyZA
Lecture on Discrete Fourier Transforms
https://youtu.be/g8RkArhtCc4
Reducible video on FFTs
https://youtu.be/h7apO7q16V0
Veritasium video on FFTs
https://youtu.be/nmgFG7PUHfo
A small correction for the integer multiplication algorithm mentioned at the end: a “straightforward” application of the FFT results in a runtime of O(N · log(N) · log(log(N))). That log(log(N)) term is tiny, but it was only recently, in 2019, that Harvey and van der Hoeven found an algorithm that removes it.
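For context on that correction, here is a rough Python sketch of the “straightforward” FFT approach the video describes (not the Harvey–van der Hoeven algorithm): multiply two integers by FFT-convolving their digit arrays with numpy, then propagating carries. The function name and structure are my own, assuming numpy is available.

```python
import numpy as np

def multiply_via_fft(x: int, y: int) -> int:
    """Multiply two non-negative integers by FFT-convolving their digit arrays."""
    a = [int(d) for d in str(x)][::-1]  # least-significant digit first
    b = [int(d) for d in str(y)][::-1]
    n = len(a) + len(b) - 1
    size = 1 << (n - 1).bit_length()    # pad to a power of two for the FFT
    fa = np.fft.rfft(a, size)
    fb = np.fft.rfft(b, size)
    # Pointwise multiply in frequency space, invert, and round to integers.
    coeffs = np.rint(np.fft.irfft(fa * fb, size)[:n]).astype(int)
    # Carry propagation turns the convolved digit sums back into base-10 digits.
    result, carry = 0, 0
    for i, c in enumerate(coeffs):
        total = int(c) + carry
        result += (total % 10) * 10**i
        carry = total // 10
    return result + carry * 10**len(coeffs)

assert multiply_via_fft(12345, 6789) == 12345 * 6789
```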
Another small correction at 17:00: I describe O(N^2) as meaning "the number of operations needed scales with N^2". Technically, that is what Θ(N^2) would mean; O(N^2) only says the number of operations is at most a constant times N^2, so it also covers algorithms whose runtimes have no N^2 term at all but are bounded by one. The distinction doesn't matter in this case, since there is an explicit N^2 term.
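To make that N^2 operation count concrete, here is a minimal sketch (my own plain-Python illustration, not code from the video) of the direct discrete convolution: two nested loops over inputs of length N, which is where the explicit N^2 term comes from.

```python
def convolve_direct(a, b):
    """Direct discrete convolution: roughly len(a) * len(b) multiply-adds,
    i.e. ~N^2 operations when both inputs have length N."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

# Example: the same operation used for moving averages and polynomial coefficients.
print(convolve_direct([1, 2, 3], [4, 5, 6]))  # [4, 13, 28, 27, 18]
```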
Thanks to these viewers for their contributions to translations
Hebrew: Omer Tuchfeld
Italian: Emanuele Vezzoli
Vietnamese: lkhphuc
--------
These animations are largely made using a custom python library, manim. See the FAQ comments here:
https://www.3blue1brown.com/faq#manim
https://github.com/3b1b/manim
https://github.com/ManimCommunity/manim/
You can find code for specific videos and projects here:
https://github.com/3b1b/videos/
Music by Vincent Rubinetti.
https://www.vincentrubinetti.com/
Download the music on Bandcamp:
https://vincerubinetti.bandcamp.com/album/the-music-of-3blue1brown
Stream the music on Spotify:
https://open.spotify.com/album/1dVyjwS8FBqXhRunaG5W5u
Timestamps
0:00 - Where do convolutions show up?
2:07 - Add two random variables
6:28 - A simple example
7:25 - Moving averages
8:32 - Image processing
13:42 - Measuring runtime
14:40 - Polynomial multiplication
18:10 - Speeding up with FFTs
21:22 - Concluding thoughts
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe: http://3b1b.co/subscribe
Various social media stuffs:
Website: https://www.3blue1brown.com
Twitter: https://twitter.com/3blue1brown
Reddit: https://www.reddit.com/r/3blue1brown
Instagram: https://www.instagram.com/3blue1brown
Patreon: https://patreon.com/3blue1brown
Facebook: https://www.facebook.com/3blue1brown
Video Information
Views: 3.3M
Likes: 114.4K
Duration: 23:01
Published: Nov 18, 2022
Quality: HD
Captions: Available