Differentiation is an Infinite Matrix

This is the English translation of a Japanese video posted in March 2024.

Timecodes

00:00:00 Introduction to the problem

  • Discussion of the concept of differentiation.
  • Statement of the problem: an attempt to connect differentiation with matrices.

00:01:03 Basics of matrix multiplication

  • How matrix multiplication acts on a two-dimensional vector.
  • The idea of a matrix as a transformation of a two-dimensional vector.

00:02:00 Turning polynomials into vectors

  • Matrices can transform only vectors, not polynomials.
  • Identifying the coefficients of a polynomial with a vector.
  • Differentiating a polynomial and attempting to represent that operation with a matrix.

00:02:56 Building the differentiation matrix

  • Defining the matrix D that differentiates polynomials.
  • Checking that the matrix calculation matches the result of differentiation.

00:03:52 Extending the differentiation matrix

  • The current matrix's limitation: it can only differentiate polynomials of degree two or less.
  • Extending the matrix D to differentiate polynomials of any degree.

00:04:46 Applying the extended matrix

  • Infinite-dimensional vectors are needed to multiply by the extended matrix.
  • An example of differentiating a polynomial with the extended matrix.

00:05:40 Verifying the result

  • A component-by-component calculation of the resulting vector.
  • Confirmation that the matrix calculation agrees with the derivative.

00:06:38 Application to other functions

  • Differentiating trigonometric and exponential functions with the extended matrix.
  • Representing sine and cosine as infinite polynomials and differentiating them via the matrix.

00:07:37 Differentiation and matrices

  • Differentiation can be represented as a matrix, independently of the classical limit definition.
  • The differentiation matrix D does not depend on the original definition of differentiation.

00:08:36 Vectors and functions

  • The vector f represents the function f(x), and the matrix D represents differentiation.
  • There are different ways to associate functions with vectors, and the form of the differentiation matrix depends on the choice.
  • Coefficients of infinite polynomials, as in Taylor and Fourier series, are used to represent functions.

00:09:32 Squaring and the second derivative

  • Squaring the matrix D corresponds to taking the second derivative.
  • Once computed, D² can be reused for second derivatives without recalculation.

00:10:28 The inverse matrix and integration

  • The inverse of D does not exist, because differentiation eliminates the constant term.
  • Integration here means antidifferentiation, with the constant of integration ignored.

00:11:27 Why the inverse fails

  • The product D′D is not the identity matrix: its top-left entry is zero.
  • The first column of the differentiation matrix D is all zeros, so no matrix multiplied from the left can yield the identity matrix.

00:12:20 Conclusion

  • Representing differentiation as a matrix changes how one looks at differentiation.
  • Initial conditions in differential equations are what recover the lost constant term.

Video transcript

0:00
What, what is this? "This is what is called differentiation." Who is it? Just who are you? Oh my, I feel like I had a bad dream. Good morning, Zundamon. Good morning. Let's get started; today's problem is here. I just woke up, though. Well, um: represent d/dx as a matrix. Uh, what does that mean? This is quite a tricky problem; I have no idea what we are supposed to do. Differentiation and a matrix are completely different things, right? That's right, they're completely different things. But the fact that it came up as a problem like this means there might be a way to mathematically interpret and represent differentiation with a matrix. Let's give it a try. Understood.
0:47
Try to recall how matrix multiplication works. As an example, let's consider a 2×2 square matrix like this, then multiply this square matrix by a two-dimensional vector; a matrix with a single column can be identified with a vector, you know. That's how it works. This matrix multiplication goes: first, focus on this part, and the first component becomes like this; next, focus on this part, and the second component becomes like this. Now that you mention it, I think it was something like that. Note that the result is also a two-dimensional vector. See? In other words, this matrix can be thought of as transforming a two-dimensional vector. Indeed, that's correct.
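The on-screen formulas aren't captured in the transcript; written out, the standard 2×2 computation being described is (the letter names here are generic placeholders, not necessarily the video's actual entries):

```latex
\begin{pmatrix} a & b \\ c & d \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}
```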
1:32
Based on this idea, let's think about whether differentiation can be represented with a matrix. However, handling a general function right away is difficult, so let's start by considering a polynomial in x, for instance something like this. This is a quadratic polynomial. Hold on a second, I'm starting to lose track of what we're doing. But what we're trying to do now is create a matrix that differentiates this polynomial, right? Oh, so you do understand. Metan, unfortunately, matrices can only transform vectors; they cannot transform polynomials. That's a sharp observation. What? It's true that, as it stands, a polynomial can't be transformed using a matrix. That's why we need to identify polynomials with vectors. So we'll extract the coefficients of the polynomial and create a vector, and let's identify this vector with the original polynomial. That's quite an unusual thing to do. You'll understand what we're doing soon enough.
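The identification being described, for a generic quadratic (the coefficient names a₀, a₁, a₂ are an assumption, chosen to match the a₁ and 2a₂ that appear in the next step):

```latex
a_0 + a_1 x + a_2 x^2
\;\longleftrightarrow\;
\begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix}
```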
2:27
Now then, let's differentiate the polynomial. Let me do it. For this polynomial, when you differentiate it with respect to x, it becomes like this. That's exactly right. Now let's assume we can represent this operation with a matrix: there exists some matrix, and if you multiply this matrix by the vector of polynomial coefficients, the result should be a vector corresponding to the differentiated polynomial. Huh. In other words, the calculation in this part becomes a₁, so the first row should look like this. Next, this part becomes 2a₂, so the second row should be like this. Finally, since the result becomes zero, the elements of this row can all be zero. Looks good. With this we have represented the differentiation of polynomials with a matrix. Let's denote this matrix as D. Huh, is this the matrix that represents differentiation? I don't quite get it.
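As a concrete check, the 3×3 matrix just described can be built and tested with numpy (a minimal sketch; the example quadratic's values are assumed, since the video's on-screen polynomial isn't in the transcript):

```python
import numpy as np

# Differentiation matrix for polynomials a0 + a1*x + a2*x**2,
# identified with coefficient vectors (a0, a1, a2):
# the derivative a1 + 2*a2*x corresponds to (a1, 2*a2, 0).
D = np.array([
    [0, 1, 0],   # first component of D @ f is a1
    [0, 0, 2],   # second component is 2*a2
    [0, 0, 0],   # a quadratic's derivative has no x^2 term
])

f = np.array([5, 3, 4])   # stands for 5 + 3x + 4x^2 (example values assumed)
print(D @ f)              # [3 8 0], i.e. 3 + 8x, the derivative
```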
3:29
All right, let's try using this to differentiate. We'll consider a polynomial as a function of x. This can be expressed like this, and the corresponding vector looks like this. Now let's differentiate this, which means multiplying the differentiation matrix D by the vector f. I'll skip the calculation steps, but the result is here. This corresponds to the result of differentiating f(x). If we actually differentiate f(x), the result looks like this, so you can see that the matrix calculation matches the result. Well, it's true that multiplying by the matrix D gives the same result as differentiation, but you can only differentiate polynomials of degree 2 or less with this. That's not good.
4:17
You're right. Then, to handle any polynomial, let's extend the differentiation matrix D. When we do that, D takes this form. Ah, the blank parts are considered to be zero. I feel like this showed up in my dream; maybe it's just my imagination. By the way, how far does this matrix go? It continues infinitely. What? Does such a matrix even exist? That's a sharp question. From here on we'll be dealing with infinite dimensions and infinite sums. In reality we need to rigorously define their meaning and calculation rules, but for now we'll prioritize intuitive understanding, so please keep that in mind. I see. Now let's differentiate a polynomial using this matrix. This time we'll consider a polynomial like this. Extracting the coefficients just like before, the corresponding vector takes this form. Uh, that makes sense. So we also write the vector as infinite-dimensional? Yes, the vector also has to be infinite-dimensional; otherwise you can't multiply the matrix by the vector. I see.
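The infinite matrix can't be stored as such, but any finite truncation of it acts the same way on polynomials of low enough degree. A sketch (the truncation size is arbitrary, and the test polynomial 1 + x + 2x³ is an assumption, chosen to match the component-by-component check that follows):

```python
import numpy as np

def diff_matrix(n):
    """Top-left n-by-n block of the infinite differentiation matrix:
    entry (k, k+1) is k+1, so (a0, a1, a2, ...) maps to (a1, 2*a2, 3*a3, ...)."""
    D = np.zeros((n, n))
    for k in range(n - 1):
        D[k, k + 1] = k + 1
    return D

D = diff_matrix(6)
f = np.array([1, 1, 0, 2, 0, 0])   # 1 + x + 2x^3, padded with zeros
print(D @ f)                        # [1. 0. 6. 0. 0. 0.], i.e. 1 + 6x^2
```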
5:22
All right, Zundamon, try differentiating f using the matrix D. Okay. If I think about it the same way as before, when you multiply D by f, you should get the result of differentiating f. If we write D and f explicitly, it looks like this. Hmm, the first component is... We should focus here: in the first row the only nonzero value is one, and this one corresponds to that one, so the first component can be considered one. That's true. Next, let's calculate the second component. In the second row the only nonzero value is this two, and the two corresponds to a zero, so the second component becomes zero. It ended up being zero. Now what about the next one? Uh, for the third component, this three corresponds to the two here, which means the result is six. That's right. Since the four in the fourth row corresponds to a zero, the fourth component is zero. From here on the vector components remain zero, so the calculation results remain zero as well. That's how it is. All right, let's confirm that this calculation corresponds to differentiation. The derivative of f(x) is like this, and the result turns into this. The matrix calculation result matches the result of differentiation. That's true. With this matrix you can differentiate polynomials of any degree.
6:48
But somehow I feel a bit tricked. What do you mean? You can only differentiate polynomials with this, and there are many other functions, like trigonometric functions or exponential functions. It's true that not every differentiable function can be differentiated using this matrix, but if it's a trigonometric or an exponential function, you can differentiate it with this. What, really? For example, sine of x and cosine of x can be expressed like this; simply put, they can be represented as infinite polynomials. Wow, I see. And if you extract their coefficients, you can represent them as vectors. Huh, I get it. Let's take the vector corresponding to sine and differentiate it using this matrix. I'll skip the calculation steps, but the result will be a vector corresponding to cosine. Wow, that's amazing. Wait, but this kind of feels obvious, doesn't it? Why do you think so? Express sine of x as an infinite sum, and if you differentiate each term, it matches the form of cosine of x expressed as an infinite sum. So the earlier calculation was just expressing that in terms of a matrix. So you've noticed.
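A sketch of that check, using the Maclaurin coefficients of sine and cosine as the coefficient vectors (the truncation length is arbitrary):

```python
import numpy as np
from math import factorial

n = 10
# Maclaurin coefficients: sin x = x - x^3/3! + x^5/5! - ...
#                         cos x = 1 - x^2/2! + x^4/4! - ...
sin_v = np.array([0 if k % 2 == 0 else (-1) ** (k // 2) / factorial(k) for k in range(n)])
cos_v = np.array([(-1) ** (k // 2) / factorial(k) if k % 2 == 0 else 0 for k in range(n)])

D = np.zeros((n, n))
for k in range(n - 1):
    D[k, k + 1] = k + 1

# D applied to the sine vector reproduces the cosine vector
# (the last entry is dropped, since truncation loses it in general).
print(np.allclose((D @ sin_v)[:-1], cos_v[:-1]))   # True
```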
8:00
But there's something I'd like you to know here. What is it? It's that differentiation can be defined independently as a matrix. Try to recall: differentiation was originally defined using a limit, like this. Oh yeah, that's right. And the matrix we defined this time took this form. They look completely different.
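For reference, the two definitions being contrasted (the limit formula is the standard one; the matrix is the one built above):

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
\qquad
D =
\begin{pmatrix}
0 & 1 & 0 & 0 & \cdots \\
0 & 0 & 2 & 0 & \cdots \\
0 & 0 & 0 & 3 & \cdots \\
\vdots & & & & \ddots
\end{pmatrix}
```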
8:19
That's the important point. This matrix is defined to represent differentiation, but it doesn't depend on the definition of differentiation using a limit. In other words, you could define this matrix without relying on the definition of differentiation, and it would still work. Well, defining such a matrix out of nowhere would make you look eccentric. Uh, anyway, we can think of a vector f that represents a function f(x) and consider a matrix D that represents differentiation. But a matrix is simply a collection of numbers, so it can be defined independently. A structure similar to that of differentiation actually exists in the world of matrices, which seems unrelated; this is truly fascinating. Now that you mention it, it does feel that way. Ah, for those who've studied linear algebra, this correspondence might feel natural. As a side note, there are other ways to associate functions with vectors, and depending on that, the matrix representing differentiation might change. Wow, really? In this discussion, when thinking about vectors corresponding to functions, we've generally extracted coefficients of infinite polynomials, so-called power series. Well, when we actually calculate the coefficients, we often treat it as a Taylor series, but let's set that aside for now. In practice, Fourier series are often used instead of power series. We won't go into detail about Fourier series, but remember that there are various ways to map functions to vectors. Hmm, understood.
9:52
Now, as a result of what we've done so far, we've roughly established that this matrix D represents differentiation. Here, let's try doing something matrix-like. What do you mean? For example, how about squaring D? Of course, squaring D means multiplying D by itself as a matrix. Then it becomes like this. What could this be? Actually, this corresponds to taking the second derivative, or differentiating twice. What? This is pretty interesting: here D feels like both a matrix and differentiation. If you do this calculation once, you can reuse it every time you calculate a second derivative. If you memorize D², you don't have to calculate it every time. You could probably think about general nth derivatives in the same way. That's true.
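A sketch of that reuse, with the same truncated matrix as before (the truncation size and the test polynomial are assumptions):

```python
import numpy as np

n = 6
D = np.zeros((n, n))
for k in range(n - 1):
    D[k, k + 1] = k + 1

D2 = D @ D                          # computed once, reusable for any second derivative
f = np.array([1, 1, 0, 2, 0, 0])    # 1 + x + 2x^3 again
print(D2 @ f)                       # [ 0. 12.  0.  0.  0.  0.], i.e. 12x = (1 + x + 2x^3)''
```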
10:38
Let's look at another example. This time, let's consider the inverse matrix of D. By the way, an inverse matrix is a matrix that, when multiplied, results in the identity matrix I. What do you think this will be? Uh, if it's the inverse of differentiation, is it integration? That's almost correct. What do you mean by almost? Actually, the inverse matrix of D does not exist. You tricked me. If we try defining a matrix corresponding to integration as D′, D′ looks something like this. Here, integration refers to antidifferentiation, the process of finding a function that returns to the original when differentiated; note that the constant of integration is ignored. It has a form that is symmetric to the differentiation matrix. This is interesting in its own way. Now let's calculate DD′. We'll skip the calculation process, but the result takes this form. In other words, this is the infinite-dimensional identity matrix. Wow, so D′ is the inverse matrix of D after all. Not so fast. Next, let's calculate D′D. The result comes out like this. What? The top left is zero. That's right, which means this is not the identity matrix, so D′ is not the inverse matrix of D. How strange; why does this happen? Look at the first column of the differentiation matrix D: the first column is all zeros. This ensures that the top left of the result will always be zero, so no matter what matrix you multiply from the left, it cannot become the identity matrix. That is, D does not have an inverse matrix to begin with.
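A sketch of both products on a truncation. The form of D′ used here, with 1/(k+1) on the subdiagonal, is inferred from "symmetric to the differentiation matrix" and from antidifferentiating term by term; the truncation size is arbitrary. One caveat: on a finite block, the bottom-right entry of DD′ also comes out zero, purely because the truncation cuts the matrices off; on the infinite matrices, DD′ is the full identity.

```python
import numpy as np

n = 5
D  = np.zeros((n, n))   # differentiation
Dp = np.zeros((n, n))   # termwise antidifferentiation, constant of integration set to 0
for k in range(n - 1):
    D[k, k + 1]  = k + 1
    Dp[k + 1, k] = 1 / (k + 1)   # integral of x^k is x^(k+1)/(k+1)

# Both products are diagonal, so printing the diagonals shows everything.
print(np.diag(D @ Dp))   # [1. 1. 1. 1. 0.]  (trailing 0 is a truncation artifact)
print(np.diag(Dp @ D))   # [0. 1. 1. 1. 1.]  (the top-left 0 is genuine:
                         #  differentiation kills the constant term)
```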
12:18
So that's how it is. If we interpret this from a calculus perspective, differentiating eliminates the constant term, and integrating can't restore it. If you want to restore it, additional information is needed, like initial conditions in differential equations. I don't really get differential equations, but at least I understand that D′ is an almost-inverse matrix; it was just one step away, it seems. In a way this is very calculus-like and interesting. That's true. So that was today's discussion on representing differentiation with a matrix. What did you think? We talked about infinite dimensions and infinite sums in a somewhat intuitive way, but I feel like my perspective on differentiation has changed a bit. Well, I'm not entirely sure we can just connect these with an equals sign; maybe someone simply felt like writing it that way. Okay, let's wrap it up. Well then, take care, everyone. See you again.

[Music]
