.mpipks-transcript | 03. Non-equilibrium Field Theory

MountAye

Feb 4, 2021


3. Non-equilibrium Field Theory

"Collective Processes in Non-equilibrium Systems" (非平衡态系统中的集体过程) is a course taught by Steffen Rulands, research group leader at the Max Planck Institute for the Physics of Complex Systems in Dresden.

The course homepage is linked here; the slides are posted on that page, and the recordings are published on YouTube.

YouTube converted the lecturer's speech into text. I copied these transcripts, added basic sentence breaks, and matched each passage of text to the slide it seems to belong to.

Slide 8, the examples, was discussed at the end of the lecture rather than between its neighboring slides.

random


(00:00) so i have to set the sound so
(00:02) you hear them as well, so you're using
(00:04) headphones? okay right that works
(00:08) okay so then let's start, uh, so welcome
(00:11) uh to our third lecture in collective
(00:14) processes and this will be
(00:17) the final lecture that is more
(00:18) methodological
(00:21) and technical so today we'll talk about
(00:24) field theory representations of
(00:28) the kind of processes that we’ve been
(00:29) looking at last time
(00:32) let me just share the screen
(00:42) okay

slide 1


(00:49) here we go, so, you can see the
(00:52) screen? you can see that?
(00:54) great. uh so, if you remember
(00:56) the lecture that we had last time,
(00:59) uh i gave a little introduction to the
(01:02) mathematics that is
(01:04) behind the description of the stochastic
(01:07) processes
(01:09) and we discussed two kinds
(01:12) of stochastic processes, no, not
(01:15) two kinds of stochastic processes but
(01:17) two ways
(01:18) of describing stochastic processes. the
(01:21) first way,
(01:22) that was associated with the name
(01:24) langevin
(01:25) that we already featured in the very
(01:26) first lecture,
(01:28) relied on deriving
(01:32) a stochastic differential equation that
(01:34) describes
(01:36) the time evolution of a single
(01:38) realization
(01:39) of such a stochastic process and the
(01:41) alternative approach that einstein
(01:44) applied for the description of brownian
(01:46) motion
(01:47) was to derive an equation
(01:52) for the time evolution of the
(01:54) probability density
(01:56) itself yeah and that was the master
(01:58) equation this master equation is not a
(02:00) stochastic
(02:01) differential equation, it is a deterministic
(02:04) equation
(02:05) but it’s typically high dimensional and
(02:09) relies as you saw last time on some
(02:13) integrations over the possible states
(02:16) that you can jump into
(02:18) and then for those of you who remained
(02:21) for the example
(02:23) section last time you would have seen
(02:26) that these master equations although
(02:29) they look
(02:30) pretty complicated can be derived
(02:32) phenomenologically,
(02:34) phenomenologically actually,
(02:37) in most cases in a very simple way yeah
(02:41) so these are the two kinds of
(02:42) description and why are we not
(02:44) completely happy with that
(02:48) so, under
(02:52) general conditions, both
(02:54) of these different kinds of equations
(02:55) cannot be solved
(02:57) there exist approximative methods that
(03:00) we discussed last time, for example the
(03:01) fokker-planck equation,
(03:03) that was an approximation for the master
(03:06) equation,
(03:07) and these are approximative
(03:09) methods that work in certain special
(03:11) cases
(03:13) now the fokker-planck equation, as you
(03:15) remember, the
(03:16) derivation relied on making strong
(03:19) assumptions
(03:20) on how these jumps in state space
(03:24) look: that they're very nicely behaved,
(03:26) these jumps are very small,
(03:28) and that they effectively rely on a very
(03:32) large
(03:32) system size and
(03:36) in other cases there are no
(03:39) approximative
(03:40) methods at all so what field theory
(03:44) does for us is it gives us a flexible
(03:47) framework a general
(03:48) and flexible framework that allows us to
(03:51) describe a large class of stochastic
(03:54) systems
(03:55) and it is even
(03:59) maybe if you remember from statistical
(04:01) physics or quantum mechanics
(04:03) it’s uh so general that you can apply it
(04:06) to
(04:06) a very diverse set of systems that’s
(04:10) one thing is for example spatially
(04:12) extended systems
(04:13) yes you can see on the bottom uh right
(04:16) yeah so this is a system where
(04:20) the evolution of the spatial
(04:22) information
(04:23) itself is very important, now, because you
(04:26) see here there's a chemical
(04:28) system, you see that here structures form,
(04:31) it's an example of what's called spinodal
(04:33) decomposition
(04:35) now so here the spatial component is
(04:36) very important
(04:38) and this has to be taken into account
(04:41) when you want to understand of course
(04:43) collective processes
(04:45) another example are non-linearities in
(04:48) stochastic differential equations
(04:51) now these become, especially if they
(04:53) affect the noise, pretty hard
(04:55) very quickly
(04:56) and another example is, as i
(04:58) showed you last time,
(05:00) that typical
(05:02) approximative methods like the fokker-
(05:04) planck
(05:06) method or other kinds of expansions
(05:10) uh are not very suitable for rare events,
(05:14) for the tails of probability
(05:16) distributions
(05:17) so in field theory we have a general
(05:19) framework
(05:20) of understanding these rare events
(05:24) and so there has been a lot of work on
(05:26) how to make approximations to these
(05:28) field theories
(05:30) that allow us to understand the tails of
(05:32) probability distributions and very often
(05:34) these tails
(05:36) are pretty important, so they are,
(05:39) they're rare,
(05:40) but uh if you think about a nuclear
(05:44) reactor or something like this, yes, these
(05:46) rare events are rare
(05:47) but you want to know how often they occur
(05:49) and there are field theoretic methods
(05:51) that allow you
(05:52) to calculate the tails of probability
(05:54) distributions
(05:56) very nicely so
(06:00) this is what we’ll do and in this
(06:02) lecture

slide 2


(06:03) i want to show you how we can derive
(06:07) a field theory description um
(06:11) for the langevin equation, for
(06:13) stochastic differential equations
(06:15) we'll be looking at a
(06:20) very simple langevin equation that
(06:22) doesn't have any
(06:23) explicit time dependence, that doesn't
(06:25) have
(06:26) multiplicative noise, and we can
(06:30) nevertheless, in the framework of field
(06:31) theory, we can
(06:33) straightforwardly uh extend
(06:36) this methodology to more complicated
(06:38) systems so we restrict ourselves
(06:40) to this very simple langevin equation uh
(06:43) with gaussian white noise
(06:45) that is as last time uncorrelated
(06:50) so if you
(06:54) so the first step here is to discretize
(06:58) time yeah and uh so we take
(07:02) the langevin equation and we write it
(07:03) down in discrete time
(07:06) and uh as you saw uh last time
(07:09) uh, or was it in the first
(07:12) lecture already, or
(07:13) two lectures ago, oh wait, whatever yeah
(07:16) so, no, last time, last time was the
(07:18) stochastic processes
(07:19) yeah so as you saw last time, the way we
(07:22) discretize
(07:24) stochastic differential equations, and i
(07:27) showed you that for uh stochastic
(07:28) integrals, is very important
(07:32) yeah and in this case we also have to
(07:34) discretize
(07:35) our stochastic differential equation in
(07:37) a certain way
(07:39) which is called the itô discretization
(07:43) and but if we do that if we discretize
(07:47) our stochastic differential equation in
(07:49) this way
(07:50) the idea is that in principle we can
(07:53) write down
(07:55) averages over any
(07:58) observable you know that’s here on the
(08:01) left left hand side by some complicated
(08:04) thing that is on the right hand side
(08:07) so this what is on the right hand side
(08:09) and this equation looks pretty
(08:10) complicated but it’s not that
(08:12) complicated so let’s have a look
(08:14) at these different terms in the first
(08:18) term here
(08:19) on the right hand side, the red term:
(08:22) we just integrate over all possible
(08:25) realizations
(08:27) that a stochastic trajectory can take
(08:32) on the right hand side
(08:35) now if you look at the right hand side
(08:37) oh there’s a delta function missing here
(08:40) right at here delta here
(08:43) on the right hand side you have this
(08:45) delta function
(08:49) here you have this delta function and
(08:52) this delta function
(08:54) just makes sure that whatever we
(08:56) integrate here whatever trajectories we
(08:59) integrate over
(09:01) that they fulfill the discretized
(09:04) version
(09:05) of the langevin equation
(09:08) yeah so the delta function is just the
(09:11) left-hand side
(09:12) minus the right-hand side of this
(09:15) langevin equation
(09:16) and if the left-hand side is equal to
(09:18) the right-hand side
(09:20) then we take that trajectory into
(09:22) account
(09:23) you know and then sandwiched between
(09:26) that we have
(09:27) our observable o that is some functional
(09:31) of our trajectory x
(09:34) now as i say functional, so i don't know
(09:37) how much uh
(09:37) functional analysis all of you had, so
(09:40) most,
(09:41) so i would expect that most of you would
(09:44) have had that
(09:44) by the fourth term
(09:48) or so, but just to make sure: a
(09:51) functional
(09:52) is basically a function that maps,
(09:56) that maps
(09:59) this function x, yes, that maps
(10:03) a function to a real number yeah
(10:06) and this is our how we define these
(10:08) observables
(10:09) now you take a trajectory and you map it
(10:12) to them to a number
(10:16) okay so this looks very complicated, it
(10:18) doesn't help us
(10:19) at all yet, and uh
(10:22) the other step that we need to make
(10:24) on this slide is that we
(10:26) introduce some notation here
(10:29) and uh of course we don’t always want to
(10:32) write
(10:33) all of these integrals here on the left
(10:36) hand side
(10:37) we just say we write that in this
(10:39) functional form here
(10:41) and of course many of you will know that
(10:43) this is just
(10:44) a functional integral or a path integral
(10:47) yeah that’s how we define it here
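Note: a cleaned-up version of what this step amounts to, in LaTeX; the notation (drift f, noise strength a, time step Δt) is my guess from the slides:

```latex
% Itô discretization of the Langevin equation \dot{x} = f(x) + \xi:
%   x_{i+1} = x_i + f(x_i)\,\Delta t + \xi_i ,  with \xi_i Gaussian, variance a\,\Delta t.
% Averages of an observable O[x] over noise realizations can then be written as
\begin{equation*}
\langle O[x] \rangle
  = \int \prod_i \mathrm{d}x_i \;
    \Big\langle \prod_i \delta\big(x_{i+1} - x_i - f(x_i)\,\Delta t - \xi_i\big) \Big\rangle_{\xi}
    \, O[x] .
\end{equation*}
% The delta functions keep exactly those trajectories that solve the discretized
% Langevin equation; in the Itô convention the Jacobian of this substitution is 1.
```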

slide 3


(10:49) and now for uh notational convenience
(10:54) now we just go to a continuum
(10:56) notation now we forget that we
(10:58) discretize time in the previous step
(11:01) and we write the equation back in
(11:03) continuous form
(11:05) just for the sake of simplicity and we
(11:07) define
(11:08) some delta functional
(11:11) that is just equal to the product over
(11:14) all delta functions that we had on the
(11:16) previous slide
(11:17) now if you look here, the product of
(11:20) our delta functions
(11:21) just makes sure that you fulfill the
(11:23) langevin equation really at each time
(11:25) point
(11:27) and we just define like a super delta
(11:29) function or delta functional
(11:31) that makes sure that we really satisfy
(11:33) the langevin equation for each time point
(11:41) so now we make a little trick
(11:45) now we say we have this delta function
(11:47) or this product of delta functions
(11:50) and what we say is that in fourier space
(11:54) the delta function is represented by a
(11:58) plane
(11:58) wave yeah so we fourier transform the
(12:02) delta function,
(12:15) or delta functional now, and
(12:18) the fourier transform of the delta
(12:20) function
(12:21) just becomes: delta of ( x dot
(12:25) minus f of x minus
(12:29) xi ), because our langevin
(12:31) equation
(12:33) is inside the delta, equal to, in fourier space,
(12:38) ∫ Dx̃
(12:43) e to the minus i x̃ (
(12:48) x dot minus
(12:52) f of x minus xi )
(12:56) now we get this variable x tilde,
(13:00) so there's nothing
(13:01) happening here, it's just the definition of
(13:03) the fourier transform
(13:04) of a delta function, and because we have
(13:08) this
(13:08) product of many delta functions here
(13:12) we have the integral here, the path
(13:15) integral here will also get a path
(13:17) integral
(13:18) Dx̃, you know, that's just
(13:22) the definition of the fourier transform
(13:24) and now we plug this
(13:28) in again so
(13:32) we obtain from that
(13:35) that the expectation value of our
(13:39) observable yeah,
(13:43) taken the average; now this average is
(13:45) over the noise,
(13:47) over the different noise realizations,
(13:52) yeah, it is equal, now we plug that in,
(13:55) to an integral now over
(13:58) x and x tilde, so it's a path integral
(14:03) over x and x tilde, so there's an
(14:06) integral
(14:07) over all realizations of x and all
(14:10) realizations
(14:11) of x tilde; now, this x tilde that pops
(14:14) up, yeah, so we don't really know what it
(14:15) is,
(14:16) but it will hang around and i will show
(14:18) you later what it actually
(14:21) means. so we have this path integral,
(14:25) our observable O of x, and then,
(14:33) plugging in, e to the minus
(14:37) i x tilde,
(14:40) um sorry,
(14:44) we've got an integral, so we have here a
(14:46) little
(14:47) integral dt,
(14:52) and this integral
(14:56) we get because we had this
(15:00) product over here, now so here the
(15:03) product
(15:04) we had here gives us a sum
(15:08) in the exponential, so this was
(15:10) originally like a product of many
(15:12) exponentials,
(15:14) and uh so this is this, and then we have
(15:18) our
(15:18) x tilde ( x dot minus
(15:23) f of x minus xi )
(15:28) and then we close the average
(15:32) and then we just plug this in, yeah, we
(15:34) can move out
(15:36) everything that does not depend on xi,
(15:39) on the noise,
(15:41) out of the average, because the average is
(15:43) over the noise:
(15:46) ∫ Dx Dx̃
(15:51) O of x, right, now comes the stuff
(15:54) that does not depend
(15:56) on xi:
(16:00) e to the minus i ∫dt x̃ (
(16:04) x dot minus f of x ),
(16:08) and now we have some
(16:12) average over e to the minus i
(16:15) ∫dt x̃
(16:19) times xi
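Note: the equation spoken in this passage, written out (my reconstruction; the sign in front of the x̃ξ term is exactly what the question later in the lecture is about, since expanding the bracket of the delta functional flips it to +i):

```latex
% Fourier representation of the delta functional (one factor per time point):
\begin{equation*}
\delta\big(\dot{x} - f(x) - \xi\big)
  = \int \mathcal{D}\tilde{x}\;
    \exp\Big(-i \int \mathrm{d}t\; \tilde{x}\,\big(\dot{x} - f(x) - \xi\big)\Big) .
\end{equation*}
% Plugging this in and pulling everything that does not depend on the noise
% out of the noise average:
\begin{equation*}
\langle O[x] \rangle
  = \int \mathcal{D}x\, \mathcal{D}\tilde{x}\; O[x]\;
    e^{-i \int \mathrm{d}t\, \tilde{x}(\dot{x} - f(x))}\;
    \Big\langle e^{\, i \int \mathrm{d}t\, \tilde{x}\,\xi} \Big\rangle_{\xi} .
\end{equation*}
```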

slide 4


(16:23) you know so
(16:26) now the question is what is this here
(16:32) can we calculate this at the moment we
(16:34) cannot do anything with this equation
(16:36) with this expression can you can we
(16:38) calculate
(16:40) this last term that involves a noise
(16:44) average
(16:45) over e to the minus i ∫dt
(16:49) x̃ xi, now, and there's hope that we can
(16:52) calculate this because we know what
(16:54) xi is, now, we said that xi
(16:58) is a gaussian random variable
(17:01) yeah, it follows a normal
(17:03) distribution,
(17:04) it's uncorrelated, so we know a lot of
(17:07) things about this
(17:08) xi, and what we do right now,
(17:12) right, because xi is gaussian there's
(17:14) also hope that we can actually
(17:17) solve or integrate this integral, that is,
(17:20) this average here
(17:21) now i'll show you now how this works in
(17:24) detail
(17:25) yeah so we make use of the definition of
(17:28) xi
(17:30) so let's see what this second average
(17:32) looks like
(17:34) now that's O of x,
(17:38) uh sorry, it's not O of x, it is
(17:42) e to the minus i ∫dt
(17:47) x̃
(17:50) xi
(17:54) now and this is,
(17:58) by definition, by the definition of the
(18:00) average,
(18:03) equal to the integral over Dξ
(18:08) times the probability, over the
(18:11) probability distribution of
(18:13) xi, which is a gaussian or a normal
(18:15) distribution:
(18:18) 1 over √(2πa), a is the strength of our noise,
(18:23) e to the minus xi
(18:26) squared over 2a
(18:29) yeah so this is the probability
(18:31) distribution
(18:33) and we know that xi is normally
(18:36) distributed
(18:37) yeah and that's why we have this normal
(18:40) distribution here
(18:42) and now we multiply this by the thing
(18:45) we're averaging over
(18:48) yeah, e to the minus i
(18:51) ∫dt x̃ xi
(18:57) now so this, excuse me, yes, isn't it
(19:00) supposed to be exponential
(19:02) plus, xi, integral, delta t, whatever,
(19:08) integral over, let me see,
(19:12) you mean the second integral, this one
(19:13) here?
(19:16) yeah, okay so is this one a minus or
(19:18) a plus?
(19:20) let me just check
(19:23) let me check my notes
(19:27) okay
(19:31) so here we go
(19:35) so no there’s a minus
(19:39) i, i wouldn't be able to find it, i
(19:42) wouldn’t be able to find it
(19:45) on the previous slide you can go and
(19:47) just
(19:48) okay so let me see maybe there’s a here
(19:51) we have the minus
(19:54) here we have the minus u of the minus
(19:57) here we have the minus and another minus
(20:00) yeah you’re right
(20:01) let me see what’s wrong here uh
(20:04) okay oh yes okay so here
(20:12) i think this minus here is not right
(20:17) now that’s just the definition here wait
(20:20) minus, hmm, i don't know
(20:22) okay, minus times minus is plus
(20:28) minus times minus is plus, i think i think
(20:29) you’re right
(20:31) but then here should be a minus so where
(20:33) does the
(20:34) is there an i squared somewhere
(20:44) excuse me i think there should be a
(20:46) minus sign



(20:47) um in the first exponential as well in
(20:50) the last step on this particular slide
(20:52) there are two exponentials, yes
(20:54) yes yes yes yes
(20:56) that’s what i’m wondering about um let
(20:58) me see if it’s
(20:59) actually the final result
(21:02) um i x minus minus
(21:14) okay so so mike
(21:17) let me let me just see
(21:24) let me just see
(21:28) so let’s let’s put this minus here in
(21:30) brackets
(21:31) now so this one minus this should have
(21:34) the opposite
(21:35) i don't, wait, bear with me okay, this is the
(21:37) minus then
(21:38) this should be plus you say
(21:42) you know i think i think it i think it
(21:45) will come out correctly later
(21:46) now let’s see how it goes um let’s see
(21:50) how it goes
(21:51) now it makes sense
(21:56) now we have to have a gaussian integral
(22:00) and that will determine whether what
(22:04) we’re doing is right
(22:05) okay so let’s let’s go on so this here
(22:08) is the integral and now we just
(22:12) put everything together and say
(22:15) that we have 1 over
(22:20) √(2πa), e to the
(22:24) minus ∫dt xi
(22:28) squared over 2a,
(22:32) now that's the first one, so that would
(22:34) be a plus then,
(22:36) minus i ∫dt x̃
(22:40) xi
(22:43) now i just i just uh i just combined the
(22:46) exponentials
(22:47) now thanks for thanks for paying so much
(22:49) attention
(22:51) um so yeah and this is here
(22:54) a gaussian integral and if you don't
(22:57) remember
(22:58) the gaussian integrals, you can look
(23:00) at the bottom here,
(23:01) these gaussian integrals are pretty easy
(23:04) to solve
(23:05) probably most of you have heard of that
(23:08) and we can also solve
(23:10) this gaussian integral here and what we
(23:13) get
(23:13) is that this is just e to the minus a over 2
(23:18) ∫dt x̃ squared
(23:23) yeah because we know, yeah and here
(23:27) one thing you have to, uh, you need
(23:29) to remember
(23:30) is that this here has i's in it
(23:33) now if you look at this formula at the
(23:35) bottom you need to take into account
(23:36) that these are complex
(23:38) integrals so we can calculate
(23:42) this quantity here
(23:45) we can calculate this, this
(23:46) quantity here because we know
(23:49) what xi looks like and that we integrate
(23:52) over the distribution of
(23:53) xi, of the noise realization, and we
(23:56) get
(23:56) the term that we have here yeah
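Note: the Gaussian integral carried out here, written out. Since x̃² is even, the debated sign in front of i x̃ ξ drops out of the final result either way:

```latex
% One Gaussian integral per time slice, with P(\xi) \propto e^{-\xi^2/2a}:
\begin{equation*}
\Big\langle e^{\pm i \int \mathrm{d}t\, \tilde{x}\,\xi} \Big\rangle_{\xi}
  = \int \mathcal{D}\xi\;
    e^{-\frac{1}{2a}\int \mathrm{d}t\, \xi^2 \,\pm\, i \int \mathrm{d}t\, \tilde{x}\,\xi}
  = \exp\Big(-\frac{a}{2}\int \mathrm{d}t\; \tilde{x}^2\Big) .
\end{equation*}
% Completing the square: -\xi^2/(2a) \pm i\tilde{x}\xi
%   = -(\xi \mp i a\tilde{x})^2/(2a) - a\tilde{x}^2/2.
```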
(23:59) and now we already have arrived at the
(24:02) famous
(24:03) no at least famous for a very small
(24:05) number of people

slide 5


(24:06) and this is the so-called martin-siggia-
(24:09) rose-janssen-de dominicis
(24:12) uh functional integral yeah that’s what
(24:14) you see here
(24:16) in the red box now we just now put
(24:18) everything together
(24:19) here we have our noise here we have
(24:23) uh what we had before and here
(24:28) is, so to say, the deterministic
(24:30) part that came from the langevin equation
(24:33) yeah and uh so what this tells us now
(24:37) here
(24:38) and maybe it looks familiar to some of
(24:40) you yet quantum field theory
(24:42) this looks very familiar so what we do
(24:45) now is
(24:46) we want to calculate the average over
(24:49) some observable
(24:52) we integrate over all possible
(24:56) realizations and over all possible
(24:58) realizations of some weird quantity
(25:00) x tilde so we got a second field here
(25:05) and weight the contributions of
(25:08) different trajectories
(25:11) by this exponential factor here
(25:14) now and this exponential factor looks
(25:16) very much like what you know from
(25:19) other field theories like quantum field
(25:21) theory and this is why this is very
(25:23) often called
(25:24) an action
(25:28) now that's the martin-siggia-rose or
(25:31) msrjd functional
(25:34) integral that allows us to really write
(25:37) a field theory
(25:39) for stochastic processes
(25:42) now so here our x's are not fields yet
(25:45) now they’re trajectories they don’t have
(25:47) a space component now they don’t have
(25:49) spatial dimensions
(25:51) but as you will see later the structure
(25:53) of the real spatial
(25:55) field theories, the integrals, will look very
(25:58) similar
(26:00) now we can also rewrite this a little
(26:04) bit here and look at specific
(26:05) trajectories and then we can look for
(26:09) example at a special case
(26:11) where the observable is just
(26:14) the propagator here, and this
(26:16) propagator is the probability
(26:18) that we end up at some state x at a time
(26:22) t
(26:23) if we started at some x naught
(26:27) at a time t naught you know and we
(26:30) obtain this
(26:31) not by just requiring that this
(26:33) observable is a delta function
(26:36) where uh we say that the x at the specific
(26:39) time,
(26:40) at a time t, needs to be equal
(26:43) to the x that we give to the probability
(26:46) distribution here
(26:50) and then we plug that in and
(26:53) what we now need to say is that we only
(26:56) integrate
(26:57) over trajectories that actually start
(27:00) at x naught and end
(27:01) at x at the given times now that's that's
(27:04) what we have to take to account for in
(27:06) the boundaries and the bounds of the
(27:07) integrals
(27:08) yeah and then we get this form here
(27:13) now that looks very similar and that’s a
(27:16) different representation if you remember
(27:18) last time we had the
(27:19) chapman-kolmogorov, uh, the chapman-kolmogorov
(27:21) equation,
(27:22) it was an equation for the same quantity
(27:25) and the idea was a little bit similar
(27:27) in this chapman-kolmogorov
(27:30) equation we had the same spirit
(27:33) and we looked at different sums of
(27:35) different paths
(27:37) um a process could take to go from x
(27:40) naught to x
(27:41) and here we do the same thing in a more
(27:43) fancy way
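Note: putting the pieces together, this is the MSRJD functional integral and the propagator written out (my reconstruction of the slide, with f the drift and a the noise strength):

```latex
\begin{equation*}
\langle O[x] \rangle
  = \int \mathcal{D}x\, \mathcal{D}\tilde{x}\; O[x]\; e^{-S[x,\tilde{x}]},
\qquad
S[x,\tilde{x}]
  = \int \mathrm{d}t \Big[\, i\tilde{x}\big(\dot{x} - f(x)\big)
      + \frac{a}{2}\,\tilde{x}^2 \Big] .
\end{equation*}
% Choosing O[x] = \delta(x(t) - x) and restricting the paths to x(t_0) = x_0
% gives the propagator:
\begin{equation*}
P(x, t \mid x_0, t_0)
  = \int_{x(t_0) = x_0}^{x(t) = x} \mathcal{D}x\, \mathcal{D}\tilde{x}\;
    e^{-S[x,\tilde{x}]} .
\end{equation*}
```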
(27:49) just to reflect on this a little bit
(27:51) more
(27:52) so what happened now so we started
(27:55) here with
(27:59) a langevin equation, we discretized it,
(28:03) and if you remember quantum field
(28:06) theory, that's also the step that you do
(28:08) there,
(28:08) you discretize time into small intervals
(28:12) as the first step if you derive
(28:13) a quantum field theory, and then
(28:17) we wrote expectation values of some
(28:19) observables
(28:20) formally in a way that involved
(28:23) integrals of
(28:23) all possible trajectories and the
(28:25) resultant trajectories
(28:27) filtered for trajectories that solve
(28:30) the langevin equation
(28:32) now that was a, so to say,
(28:35) circular starting point and in the next
(28:39) uh step we then got this field
(28:42) x tilde, now, we still don't know
(28:44) what it is exactly about
(28:46) now we got that from the fourier
(28:47) transform of the delta function
(28:52) and now now we have two fields we
(28:54) integrate over two fields
(28:56) x and x tilde and
(28:59) in the end we managed to integrate out
(29:03) the noise so what we have here
(29:07) now is something that does not depend on
(29:09) the noise xi anymore,
(29:11) it's a deterministic
(29:14) equation, a deterministic integral, so
(29:17) somehow
(29:18) our noise was, was absorbed
(29:23) into a new field a new fluctuating field
(29:28) x tilde yeah and that’s how
(29:31) it very often goes when you
(29:34) make a field theory: what you gain is
(29:37) you get a nice integral, but you
(29:39) have to pay for
(29:40) it by having additional fields conjugate
(29:43) fields
(29:44) that you have to integrate over and the
(29:47) same is true here
(29:48) and i’ll show you later what these x
(29:50) tilde actually mean before i do that let

slide 6


(29:55) me just mention, so now we have a field
(29:56) theory, we have a path integral, and these
(29:59) path integrals are very useful
(30:01) because we can make use of a lot of
(30:03) tools
(30:04) from other fields, from quantum
(30:06) field theory
(30:08) renormalization perturbation theory and
(30:10) so we can
(30:11) make use of these tools very powerful
(30:14) frameworks developed in the last 70
(30:18) years or so
(30:20) we can make use of these frameworks and
(30:21) apply them to these
(30:23) field theories for stochastic processes
(30:27) and one of the things that we can do is
(30:29) we can
(30:30) define a so-called generating functional
(30:35) that’s maybe something that you already
(30:36) know from other field theories
(30:38) so what you do is you add some auxiliary
(30:42) fields
(30:43) external fields h and h
(30:46) tilde and these fields couple
(30:51) to x and x tilde
(30:55) respectively
(30:57) and this is then the generating
(31:00) functional
(31:01) and of course we know that these fields
(31:04) don’t really exist
(31:05) now we just added them and the reason
(31:08) why we added them
(31:10) is that if we take derivatives
(31:13) with respect to these fields h or this
(31:16) external
(31:17) forces or external fields h here
(31:21) and here what will happen is that each
(31:24) time
(31:25) because this is in an exponential,
(31:28) the x,
(31:31) let me just draw that, now the x
(31:36) or the x tilde
(31:40) will come down here
(31:44) yeah and if we do that if we take these
(31:46) derivatives
(31:48) you know so we can therefore get the
(31:52) expectation values of combinations of
(31:56) the x
(31:56) and x tildes just by differentiating
(31:59) this
(32:01) generating functional with respect to
(32:04) these
(32:05) weird virtual fields
(32:08) yeah and when we’ve done that we have to
(32:12) remove these fields again so we have to
(32:14) set them back to zero
(32:15) now for example if you want to have the
(32:17) correlation the autocorrelation function
(32:19) so how much
(32:21) is the process at a time t correlated to
(32:24) the state at a time t
(32:25) prime yeah then we
(32:28) differentiate and we
(32:32) take the derivative first with respect
(32:34) to
(32:35) with respect to h at a certain time t
(32:39) yeah and then we get one of these x's
(32:42) here and then we take the derivative
(32:44) with uh to h at
(32:48) a time a different time t prime and then
(32:50) we get the field again
(32:52) at a different time here
(32:55) yeah and these are of course functional
(32:57) derivatives now that was
(32:59) uh functional calculus if you haven’t
(33:01) done that
(33:02) it’s uh for these what you what you do
(33:04) is you
(33:05) look at the change of a functional
(33:08) now for example this here is a
(33:10) functional
(33:12) we look at the change of a functional
(33:15) with respect to small
(33:16) changes of its argument also you have a
(33:20) look at
(33:20) perturbations in h and h tilde
(33:24) around some value yeah and then you
(33:28) see how your function changes that’s
(33:30) called a functional derivative
(33:32) and if you do that you get very
(33:34) conveniently these pre-factors here
(33:39) right here in front of the action and if
(33:42) you look at
(33:43) the definition here this is just what
(33:46) gives us our observable
(33:48) as for example if our observable is x
(33:52) that we just take the derivative once
(33:55) we get an x here yeah
(33:58) and if we have the x here we have the
(34:01) first moment so the mean
(34:03) of x the average of x if we take the
(34:06) derivative twice
(34:07) at different times then we get a
(34:09) correlation here on the left-hand side
(34:13) so this is a very convenient tool and as
(34:15) i said
(34:16) if you want to have a correlation
(34:18) function for example we take the
(34:19) derivative
(34:20) twice and you must always remember to
(34:23) set
(34:23) these fields to zero again it’s actually
(34:26) the same approach as you do in
(34:28) quantum
(34:29) and classical field theory, the
(34:31) equilibrium field theories
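Note: the generating functional and the derivative trick described here, written out:

```latex
% Generating functional with auxiliary source fields h and \tilde{h}:
\begin{equation*}
Z[h, \tilde{h}]
  = \int \mathcal{D}x\, \mathcal{D}\tilde{x}\;
    \exp\Big(-S[x,\tilde{x}]
      + \int \mathrm{d}t\,\big(h\,x + \tilde{h}\,\tilde{x}\big)\Big) .
\end{equation*}
% Each functional derivative pulls down one factor of x or \tilde{x};
% afterwards the sources are set back to zero:
\begin{equation*}
\langle x(t) \rangle = \frac{\delta Z}{\delta h(t)}\bigg|_{h=\tilde{h}=0},
\qquad
\langle x(t)\,x(t') \rangle
  = \frac{\delta^2 Z}{\delta h(t)\,\delta h(t')}\bigg|_{h=\tilde{h}=0} .
\end{equation*}
```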
(34:37) okay so now what is this x tilde
(34:40) of t, that's just a remark, so i'm not
(34:42) doing the calculations like

slide 7


(34:44) what is this x tilde of t that we get
(34:48) got in this process
(34:52) there are different uh
(34:55) field theories for stochastic processes and
(34:57) for master equations
(34:58) and you always get some kind of
(35:01) auxiliary
(35:02) field some some conjugate field that you
(35:05) have to pay for
(35:07) and uh this x tilde from here from here
(35:10) you can
(35:12) get an intuition about that i’m not
(35:14) super rigorous but you can get
(35:16) your intuition about this if you write
(35:18) down the master equation,
(35:20) sorry, the langevin equation, and
(35:24) add some external source
(35:27) capital h of t
(35:32) you know and if you add this external
(35:34) force capital h
(35:36) of t now some temporally fluctuating
(35:39) force
(35:40) that doesn’t depend on x itself
(35:43) and you plug that in into this martin-
(35:46) siggia-rose
(35:47) functional integral, or martin-siggia-rose-
(35:50) janssen-de dominicis,
(35:54) then you see a formal analogy that if
(35:57) you
(35:57) that this you get a term that looks like
(36:00) h tilde
(36:04) if you define this to be minus i this
(36:07) external field
(36:09) yeah and now we can see what happens
(36:14) to x what is the effect of this external
(36:17) field
(36:19) h of t on x so there’s a little bit in
(36:22) an
(36:23) analogy already here now so here on the
(36:25) left hand side it has something to do
(36:27) with this uh external field from the
(36:31) generating functional that couples to x
(36:33) tilde
(36:34) now let’s see what this does to x to the
(36:37) actual stochastic process
(36:39) now to this end we calculate a response
(36:43) function
(36:43) and this response function is just the
(36:47) change
(36:48) in the average of f x with respect
(36:52) to changes in this external field
(36:55) that’s called the response so how does
(36:56) the system respond
(36:58) to changes in this external field
(37:01) yeah and so as you remember the average
(37:05) here we just get by
(37:08) integrating, uh, by taking this,
(37:12) this generating functional,
(37:15) and taking the derivative with respect
(37:17) to h
(37:18) once so that’s the first moment
(37:22) and because of this equality here
(37:27) now we see that, this here,
(37:30) this response function, we also get
(37:33) that
(37:33) if we take the derivative with respect
(37:36) to h
(37:37) and then h tilde at a certain time
(37:41) t let me see let me just say
(37:45) here that’s the tilde
(37:51) yeah and this here what this is
(37:59) as on the last slide is the correlation
(38:02) between
(38:02) x of t and x tilde of t prime
(38:06) now somehow the response of the system
(38:10) with respect to an external force
(38:13) or an infinitesimally small external
(38:16) force,
(38:17) is given by how the x tilde
(38:21) couples to x
(38:24) so it describes the susceptibility, or is
(38:27) related
(38:28) to an infinitesimal response
(38:32) of x of the field x with respect
(38:35) to a small perturbation and that’s why
(38:39) this field x tilde is also called the
(38:41) response field
(38:44) now there’s just a little bit of
(38:46) intuition and very often this these
(38:48) conjugate variables somehow in some way
(38:51) encode the noise
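Note: the response-field argument in formulas (my reconstruction; the factor of minus i depends on the sign conventions above):

```latex
% Add an external force H(t) to the Langevin equation,
%   \dot{x} = f(x) + H(t) + \xi ,
% which adds a term -i \int dt\, \tilde{x} H to the MSRJD action, i.e. H
% enters like a source \tilde{h} = -iH.  The response function is then
\begin{equation*}
R(t, t')
  = \frac{\delta \langle x(t) \rangle}{\delta H(t')}\bigg|_{H=0}
  = -i\,\big\langle x(t)\,\tilde{x}(t') \big\rangle ,
\end{equation*}
% a correlator of x with the conjugate field, hence "response field".
% Causality makes R(t, t') vanish for t < t', which is the Heaviside
% function discussed in the question at the end of the lecture.
```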
(38:56) now i have an example uh let me just
(38:59) check the time whether we do it now or
(39:01) at the end of the lecture
(39:05) let's do it at the end of the lecture,
(39:06) again, this example, for those of you
(39:09) who
(39:10) heard the field theory course last year, uh
(39:13) there were no
(39:14) examples there, so i'll leave that to
(39:16) the end of the lecture, so that
(39:17) people can drop out uh if they're tired

slide 9


(39:24) now i want to just give you a second
(39:26) remark
(39:28) so we are now in the framework of
(39:29) path integrals
(39:31) and in field theory now if you remember
(39:34) we can have different formulations of
(39:36) field theories, namely the lagrangian
(39:39) field theory and the hamiltonian field theory
(39:42) and we can translate these two into each
(39:45) other
(39:45) and we can do the same things here for
(39:49) the non-equilibrium for the stochastic
(39:51) process
(39:53) yeah and it's also just a remark, it's
(39:56) not
(39:57) so important for the rest of the lecture
(40:00) but you can formally define
(40:03) new variables q and
(40:07) p by these relations here
(40:10) and then this probability will take the
(40:14) form
(40:14) that i wrote down here now so this
(40:18) has formally the form of a hamiltonian
(40:22) action, or of a hamiltonian theory
(40:25) so we have
(40:26) p times q dot minus
(40:29) some hamiltonian and this hamiltonian
(40:32) is given by this term here this looks a
(40:35) little bit like a kinetic energy so it’s
(40:38) just
(40:38) just just saying that we can write by
(40:41) variable transformation we can write
(40:42) these
(40:43) field theories uh then quite analogously
(40:47) to field theories that we already know
(40:51) and we can even go one step further
(40:55) now we can take this here and integrate
(40:58) out the p's
(40:59) now because it's just
(41:01) quadratic in p
(41:03) so these are just essentially gaussian
(41:05) integrals that you can integrate over
(41:06) them
(41:07) and just to tell you the results that we
(41:09) then get
(41:10) a field theory that looks like a
(41:13) lagrangian field theory
(41:16) now we have a lagrangian that depends on
(41:17) q and the
(41:19) derivative of q with respect to time
(41:22) and if you then look at the analogy
(41:26) at this hamiltonian, at this lagrangian here
(41:28) then
(41:29) this looks like it describes some kind
(41:31) of particle
(41:32) with that has some mass one over a uh
(41:35) that sits
(41:36) like in a potential, that happens to be
(41:38) coupled to some
(41:39) q dot here, and uh we have now here
(41:43) the quadratic potential as well now
(41:46) so now these analogies they’re not very
(41:48) helpful yeah they don’t tell you
(41:49) anything
(41:50) uh these hamiltonians that you get here
(41:53) uh
(41:53) they're not comparable to hamiltonians
(41:55) that you get in quantum systems
(41:57) for example these hamiltonians they're
(42:01) not hermitian quantities
(42:05) for quantum physicists
(42:10) and just to say that they’re different
(42:11) ways of formulating these
(42:13) theories that you get by variable
(42:16) transformations
(42:17) uh this second equation here has a name
(42:19) that's the onsager-
(42:21) machlup functional you know and like
(42:24) you will,
(42:25) these, these kinds of
(42:27) functionals
(42:28) um, pop up in papers, if you read papers,
(42:31) that,
(42:32) they pop up in different ways, but
(42:34) in the end
(42:35) they are the same, uh, um,
(42:40) implementations, field theory implementations,
(42:43) of the same
(42:44) stochastic differential equation,
(42:46) just reformulations of the same thing

slide 10


(42:52) okay now
(42:55) i told you in the beginning of the
(42:57) lecture that um
(43:01) what we are actually in most cases
(43:03) interested in,
(43:04) now and what field theories are also
(43:07) good for
(43:08) are spatially extended systems
(43:13) now how do you write down a langevin
(43:16) equation for a spatially extended system
(43:19) now it gets of course a little bit more
(43:20) complicated but there’s actually a
(43:22) classification
(43:23) scheme for a general langevin equation that
(43:27) describes systems
(43:28) that have spatial degrees of freedom
(43:32) and this classification scheme you can
(43:34) see on the slide
(43:36) and so we’re looking here at some time
(43:40) evolution of some field phi
(43:43) at a position x at a time t
(43:47) and this time evolution is given by
(43:49) different components
(43:51) yeah, that's on the right hand side; on
(43:54) the very right we have the noise
(43:56) as always yeah and this noise
(44:01) now has the usual properties the average
(44:04) of the noise is zero
(44:06) and it also has correlations so how is
(44:09) a fluctuation a fluctuating force of
(44:12) that xi represents how is that
(44:15) correlated between different time points
(44:18) and how they correlated between
(44:20) different positions
(44:21) in space so while the assumption that
(44:24) different,
(44:25) that this noise is
(44:28) uncorrelated in time, so that
(44:30) you don't have memory, is something that
(44:32) is
(44:33) very reasonable and that we
(44:35) typically make
(44:37) it’s not so clear that the noise is also
(44:40) independent at each position in space
(44:44) you know so generally there will be some
(44:47) some
(44:48) length scale: you perturb the system, and
(44:49) you don't perturb a single atom
(44:51) yeah, but you perturb like a small region
(44:54) for example
(44:55) and then these fluctuating forces are
(44:58) correlated
(45:00) over small regions and these
(45:02) correlations in the spatial
(45:06) uh in these spatial systems and the
(45:09) fluctuating forces that act on these
(45:11) systems
(45:13) are given by so-called spatial
(45:15) correlations
(45:16) now that just says how are these
(45:19) fluctuating forces
(45:21) how are they correlated between two
(45:24) given positions
(45:25) in space yeah and
(45:28) now we have the rest here
(45:32) the black parts here
(45:35) yeah it’s just a fancy way of writing
(45:38) down
(45:39) the deterministic part of what’s
(45:42) actually happening
(45:44) yeah and it’s just a fancy way of
(45:47) writing down uh the actual
(45:50) uh for example chemical reactions
(45:53) that change the value of the field at a
(45:56) given position
(45:58) yeah and we can formally write down this
(46:01) uh this term by saying okay so we have
(46:04) some kind of potential
(46:06) and the dynamics will go, so to say, to some
(46:09) minimum
(46:10) of this potential, to some
(46:13) equilibrium state
(46:15) yeah and then uh we can formally write
(46:18) this
(46:18) this way here: that we have some
(46:21) functional, a free energy functional
(46:23) f, that depends on the fields, and this is
(46:26) given by some
(46:28) space integral uh where we have this
(46:31) term here that will flatten
(46:33) the field, that can become something like
(46:36) a diffusion term
(46:37) and on the right hand side we have
(46:40) something
(46:41) that is a potential that describes
(46:43) what’s actually which where we’re going
(46:44) to with the field
(46:46) now if you happen to have heard
(46:48) statistical physics
(46:50) yeah then uh these will look familiar
(46:53) to you,
(46:53) like phi-to-the-fourth theory,
(46:55) ginzburg-landau and so on
(46:58) now there’s this red stuff here yeah and
(47:01) this red stuff is just a way
(47:03) of classifying different kinds of
(47:06) systems
(47:07) and people distinguish between systems
(47:10) where the order parameter so our phi
(47:13) is not conserved for example in chemical
(47:16) systems where you can
(47:17) convert one chemical species to another
(47:20) chemical species
(47:21) yeah and then another chemical species
(47:23) and so on then the con
(47:25) concentration of one chemical species is
(47:28) not
(47:28) conserved in the entire system
(47:32) yeah and this is what you get if you set
(47:34) this exponent to zero
(47:36) that means you don’t have this laplacian
(47:38) the second derivative here
(47:42) the other case is when you have
(47:45) set this to one here set this n to one
(47:49) then this will here be part of
(47:52) something like a diffusion term yeah
(47:55) this will go into a diffusion term
(47:57) and what you uh will then get, what
(48:00) these
(48:00) kinds of systems
(48:02) describe are
(48:04) situations where the field phi is
(48:06) conserved
(48:07) now so if you remove stuff at one point of
(48:09) the system, if you remove stuff at one
(48:11) point of the system,
(48:12) it has to pop up at another point
(48:16) an example is for example uh if you have
(48:18) something like hydrodynamics also where
(48:20) you just
(48:20) move mass around and uh but it doesn't
(48:24) really disappear
(48:25) and you just move things around but
(48:27) things don’t disappear
(48:28) that’s an example for these kind of
(48:30) model b
(48:32) systems yeah and
(48:35) uh so if you plug these things in so
(48:37) typical examples, a very famous example,
(48:39) are so-called reaction diffusion
(48:43) systems
(48:44) now so and the typical reaction
(48:45) diffusion equation
(48:47) is seen here on the right hand side that
(48:50) describes a situation
(48:52) where uh the phi, if you look here at the
(48:55) phi
(48:56) now, then what happens, what this term
(48:59) here does
(49:00) is: if phi is just a little bit larger than
(49:03) 0
(49:04) yeah then this term will be positive
(49:07) yeah and this, this
(49:10) will give rise to an increase in
(49:12) the field
(49:14) now this term is a diffusion term uh
(49:17) that just
(49:17) transports information, so to say,
(49:19) between different positions
(49:21) in the system and then we have our noise
(49:24) and this noise typically has
(49:26) prefactors it’s multiplicative uh if you
(49:29) look for example at biological systems
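Note: the classification scheme described on this slide, written out (my reconstruction of the standard Hohenberg-Halperin form):

```latex
% Langevin equation for a spatially extended field \phi(x, t):
\begin{equation*}
\partial_t \phi
  = -\big(-\nabla^2\big)^{n}\, \frac{\delta F[\phi]}{\delta \phi} + \xi,
\qquad
F[\phi] = \int \mathrm{d}x \Big[\, \frac{\kappa}{2}\,(\nabla \phi)^2 + V(\phi) \,\Big] .
\end{equation*}
% n = 0: "model A", non-conserved order parameter (e.g. chemical reactions);
% n = 1: "model B", conserved order parameter (the extra Laplacian only moves
% \phi around, it cannot create or destroy it).
% The noise has zero mean and spatial correlations
%   <\xi(x,t)\,\xi(x',t')> = K(x - x')\,\delta(t - t').
```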

slide 11


(49:34) so, the take-home message for these
(49:36) spatial systems
(49:38) now we you can think about okay we just
(49:40) add an x variable well what i just said
(49:42) is just a complicated way of
(49:44) saying okay so we have some x variable
(49:47) and some diffusion yeah but everything
(49:50) else will look very similar as before
(49:53) now and indeed we can follow exactly the
(49:56) same steps now
(49:57) now we take the langevin equation, the
(49:59) general langevin equation here,
(50:02) you know, which is this one, and do exactly
(50:05) the same steps as before
(50:07) and now this here would be the first
(50:10) step
(50:13) now where we have these delta functions,
(50:16) now we don't only have a product
(50:18) over i,
(50:19) that is, over time, that is
(50:21) already in this delta function here,
(50:23) but also over x, yeah, and then we just
(50:26) plug in this
(50:27) generalized langevin equation and
(50:29) do the same step
(50:31) and then we get the field theory the
(50:33) martin-siggia-rose functional
(50:35) for the spatially extended system so
(50:38) this looks pretty complicated
(50:40) but it is actually the same as before
(50:43) you know so we have we integrate again
(50:46) over the field phi and the response
(50:50) field
(50:50) phi tilde and
(50:54) then we sum up different contributions
(50:59) to our variable to our observable like
(51:02) the second term
(51:03) and then as before we weight that
(51:06) with some exponential that essentially
(51:10) quantifies
(51:10) how far we are away from an actual
(51:14) realization of the langevin equation
(51:17) yeah and here we have
(51:19) exactly the same structure as before
(51:22) here you have the langevin equation,
(51:25) now you see, you see that you need to
(51:27) solve the deterministic part of the
(51:29) langevin equation
(51:31) and this was previously just
(51:34) x tilde squared; as i said, previously,
(51:40) this was, this term here
(51:43) was something like a
(51:46) over 2 x tilde squared
(51:50) well now that looks more complicated
(51:52) yeah but it has the same form as if you
(51:54) look at this
(51:56) you have here your x tilde or phi tilde
(52:00) here you have another one that just
(52:03) coupled by the correlations in the noise
(52:07) uh that is described by this noise
(52:10) kernel
(52:11) now but the structure is the same as
(52:14) before
(52:15) yeah and uh so so uh so this is the
(52:18) martin-siggia-rose
(52:20) functional for the noise, for the
(52:22) spatially extended system
(52:25) yeah and with this uh i’m done with the
(52:29) definitions
(52:29) and with the actual
(52:33) derivation of the field theory now
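Note: the spatial MSRJD action sketched here, written out; for noise uncorrelated in space, K(x − x') = a δ(x − x') and the last term reduces to the (a/2)∫x̃² term from before:

```latex
\begin{equation*}
S[\phi, \tilde{\phi}]
  = \int \mathrm{d}t\, \mathrm{d}x\;
      i\,\tilde{\phi}\,\big(\partial_t \phi - \mathcal{F}[\phi]\big)
  + \frac{1}{2} \int \mathrm{d}t\, \mathrm{d}x\, \mathrm{d}x'\;
      \tilde{\phi}(x,t)\, K(x - x')\, \tilde{\phi}(x',t) ,
\end{equation*}
% where \mathcal{F}[\phi] stands for the deterministic right-hand side of the
% generalized Langevin equation and K is the spatial noise kernel.
```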

slide 8


(52:36) let’s see if we can apply it and now i
(52:39) go back
(52:40) to the example and just see how we’re
(52:43) doing in time
(52:47) okay great i think almost exactly an
(52:49) hour
(52:50) so if you don’t already know all of this
(52:52) maybe from last year’s lecture
(52:53) then uh feel free, feel free
(52:56) to drop out
(52:56) and for the rest i’ll go through one
(52:58) example
(53:00) and i’ll double check so if i upload
(53:02) when i upload the uh
(53:04) the lecture notes i’ll try to make sure
(53:06) that actually these minus signs
(53:09) are correct yeah so so if there was a
(53:11) mistake i’ll correct it
(53:12) in the uploaded version that you find on
(53:15) the website
(53:17) excuse me yes so
(53:20) when you wrote the response function
(53:23) there was this, i guess, heaviside
(53:28) function multiplied with
(53:29) your autocorrelation so
(53:33) does that come up because uh you have
(53:36) a kind of the itô formalism
(53:39) into your system the point
(53:44) the response function if you have a
(53:46) perturbation here
(53:48) if you perturb at t prime and you look
(53:50) at the response at t
(53:52) now that’s how this response function is
(53:55) defined
(53:56) so you should have here, that's a t
(53:58) prime
(53:59) here as you perturb at t prime and you
(54:02) look
(54:03) at the time t yeah then
(54:06) oh, the other way around actually,
(54:08) so then this
(54:12) heaviside function just ensures
(54:15) causality
(54:16) that you cannot observe the response
(54:19) that hap that happens before you could
(54:20) do the perturbation before you apply the
(54:22) force
(54:25) now so that’s that’s that’s the reason
(54:27) why you get these theta functions
(54:30) okay okay thanks for the question
(54:35) let’s go then to the
(54:42) first example
(54:46) here we go yeah and
(54:49) uh so let’s let’s have a look at the
(54:52) fields here for a very
(54:54) simple example, and as always in
(54:56) stochastic processes,
(54:58) also simple examples quickly become
(55:04) complicated
(55:06) so let me find the notes
(55:11) here we go
(55:18) okay we start with a simple
(55:22) with probably one of the simplest langevin
(55:24) equations uh you can
(55:26) imagine and uh because it's so simple
(55:29) uh it has a name, because it has been
(55:32) extensively
(55:33) studied, and it's called the ornstein-
(55:35) uhlenbeck
(55:36) process and this process is just given
(55:40) by the time,
(55:41) by the time derivative of x, x
(55:45) dot, and this is equal to
(55:48) some restoring force, minus alpha x,
(55:51) plus
(55:52) uh some external fluctuating force
(55:56) xi, and as always we have the usual
(55:59) conditions, that our xi is
(56:00) very nicely behaved: it doesn't have a mean
(56:04) and it's uncorrelated in time
(56:08) so now writing down the martin-siggia-
(56:12) rose
(56:13) functional integral is easy because we just
(56:16) have to plug in this langevin
(56:18) equation
(56:19) so martin
(56:23) martin siggia rose janssen de dominicis
(56:28) some people say martin-siggia-rose, i did
(56:30) that once and i happened to be in munich
(56:32) and then uh they were very angry that i
(56:34) forgot janssen,
(56:36) janssen apparently had some
(56:37) connections to munich
(56:40) and uh now so that’s the full
(56:43) that that that contains i think
(56:45) everybody who contributed
(56:47) uh
(56:52) reads. now, so what is the action? now,
(56:56) i don't write the full integral, just
(56:57) write the action.
(56:59) so the action is
(57:03) ∫dt i x̃
(57:12) ( ∂t x plus alpha x ),
(57:19) now let's make sure that we fulfill the
(57:21) langevin equation,
(57:23) plus a over 2
(57:27) ∫dt
(57:31) x̃, now that would be, squared
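Note: the Ornstein-Uhlenbeck action written out:

```latex
% MSRJD action for \dot{x} = -\alpha x + \xi:
\begin{equation*}
S[x, \tilde{x}]
  = \int \mathrm{d}t\; i\,\tilde{x}\,\big(\partial_t x + \alpha x\big)
  + \frac{a}{2} \int \mathrm{d}t\; \tilde{x}^2 .
\end{equation*}
```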
(57:37) now we can write down the generating
(57:39) functional,
(57:41) generating
(57:47) functional, yeah and this is just as
(57:50) before:
(57:52) Z of h and h̃
(57:57) is equal to the integral
(58:00) over our two fields, a functional integral
(58:02) over the two fields,
(58:04) field and response field, uh,
(58:07) e to the minus S
(58:11) plus ∫dt
(58:15) h x, that's the external auxiliary field
(58:19) that couples to x,
(58:21) plus an external field that couples
(58:24) to x tilde
(58:28) yeah and uh just
(58:31) just to make sure everybody understands
(58:33) now so why do i say that this field
(58:35) couples to x
(58:36) and x tilde here, so why are
(58:38) these external fields
(58:40) written like this? this just follows if
(58:42) you write down
(58:43) um something like for example the
(58:45) ginzburg-landau theory also
(58:48) or you look at the ising model now then
(58:50) terms like this here
(58:55) will tilt the potential in one direction
(58:59) yeah and here this tilts the potential
(59:04) in the x direction and this tilts the
(59:06) potential in the x tilde
(59:07) direction, this tilts it in the x
(59:10) direction
(59:12) and this tilts it in the x tilde
(59:14) direction
(59:16) now so this is why we call these
(59:17) external fields
(59:19) and say that they couple to uh
(59:22) these fields x and x tilde, that's the analogy
(59:26) to uh the ginzburg-landau theory and
(59:29) other field theories
(59:32) okay so let's go on, this is the
(59:35) um functional, the generating functional
(59:40) and now we use that
(59:44) the integral over dx of e to the iqx
(59:48) is just delta of q, so we use the
(59:52) definition of the fourier transform of
(59:54) the delta function
(59:56) and by doing that we get that
(60:00) this generating functional
(60:04) is equal to ∫D
(60:07) x̃, e
(60:11) to the minus a over two
(60:16) integral dt x̃ squared
(60:22) plus ∫dt,
(60:26) our two fields, h x
(60:29) plus h̃ x̃,
(60:33) yeah and now we've got our delta
(60:37) functional:
(60:39) δ[ i ( ∂t minus alpha )
(60:44) x̃, plus h ]
(60:53) now when is this delta function here the
(60:56) delta function
(60:58) actually non-zero? now so the delta,
(61:02) the delta functional,
(61:08) is non zero
(61:11) if our x tilde
(61:15) solves this ordinary differential
(61:18) equation
(61:20) you know that we have here in the delta
(61:22) function now we can just write it down
(61:26) x̃(t) = i times the integral from t to infinity,
(61:30) dt′, e to the minus
(61:35) alpha ( t′ minus t ),
(61:39) h of t′, yeah
(61:45) now we substitute that so we saw that we
(61:48) already got rid of
(61:49) one field here by doing this reverse
(61:53) fourier transform, now we substitute, so we
(61:56) get
(61:57) rid of our second field now we
(61:59) substitute
(62:05) into that
(62:11) and what we get
(62:14) is: Z of h, h̃
(62:17) is equal to, and now comes a large
(62:20) exponential,
(62:21) let's see how to write that on the screen,
(62:26) minus integral dt
(62:30) integral dt′, the second one comes
(62:34) from this x̃,
(62:35) and then we have
(62:39) a over 2, e to the minus
(62:42) alpha | t minus t′ |
(62:48) times h of t,
(62:53) h of t′, plus
(62:57) i theta of
(63:00) ( t minus t′ )
(63:03) e to the minus alpha ( t
(63:07) minus t′ )
(63:10) times, i'm sorry, i said it,
(63:14) even for the simplest process it gets
(63:16) lengthy,
(63:17) h̃ of t′, h of t
(63:21) yeah but what’s happening here is
(63:24) nothing magical it’s just a calculation
(63:26) such
(63:26) integrals; well, we substituted that x
(63:29) tilde
(63:31) into our generating functional and then
(63:34) we collected the terms
(63:35) and made sure
(63:40) that the right time order
(63:43) is given and now we have this generating
(63:46) function it looks a little bit
(63:47) complicated
(63:48) but we can deal with that now so we can
(63:51) take
(63:52) functional derivatives with this now for
(63:54) example to get a correlation function
(64:02) and so we have x
(64:06) of t, x of t prime
(64:10) now we take the second
(64:14) derivative
(64:19) once at time t
(64:24) and once at time t prime
(64:30) and then we must not forget
(64:33) to set the fields equal to zero again
(64:40) now and if you look at this equation if
(64:43) we do that
(64:44) then, and do a little bit of calculations,
(64:48) you
(64:48) get that actually this correlation
(64:50) function is
(64:51) a over alpha, e to the minus alpha
(64:56) | t minus t prime |
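Note: for comparison, the standard result. With ⟨ξ(t)ξ(t′)⟩ = a δ(t−t′) the stationary autocorrelation of the Ornstein-Uhlenbeck process is

```latex
\begin{equation*}
\langle x(t)\, x(t') \rangle
  = \frac{\delta^2 Z}{\delta h(t)\,\delta h(t')}\bigg|_{h=\tilde{h}=0}
  = \frac{a}{2\alpha}\, e^{-\alpha |t - t'|} ;
\end{equation*}
```

the a/α quoted in the lecture corresponds to the other common convention ⟨ξ(t)ξ(t′)⟩ = 2a δ(t−t′).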
(65:02) so that’s just uh an example yeah so
(65:05) these actual calculations
(65:06) are a little bit messy, but that's
(65:09) just how you use these field theories:
(65:12) you have the field theory, you write
(65:14) down the generating functional,
(65:16) yeah, and then you look that you are able
(65:19) to
(65:19) take these functional derivatives with
(65:22) respect to these external fields,
(65:24) and the rest is then, say,
(65:28) mathematics yeah
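Note: a short numerical sanity check of this result (not part of the lecture): an Euler-Maruyama (Itô) simulation of the Ornstein-Uhlenbeck process, comparing the empirical autocorrelation against (a/2α)·exp(−ατ), assuming the convention ⟨ξ(t)ξ(t′)⟩ = a δ(t−t′):

```python
import numpy as np

# Euler-Maruyama (Ito) discretization of  dx/dt = -alpha * x + xi,
# with <xi(t) xi(t')> = a * delta(t - t'): noise increments ~ N(0, a*dt)
alpha, a = 1.0, 2.0
dt, n_steps = 1e-3, 2_000_000
rng = np.random.default_rng(0)

x = np.empty(n_steps)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(a * dt), size=n_steps - 1)
for i in range(n_steps - 1):
    x[i + 1] = x[i] - alpha * x[i] * dt + noise[i]

x = x[n_steps // 10:]           # discard the initial transient
x -= x.mean()
for tau in (0.0, 0.5, 1.0, 2.0):
    lag = int(tau / dt)
    c_emp = np.mean(x[: len(x) - lag] * x[lag:])
    c_th = a / (2 * alpha) * np.exp(-alpha * tau)
    print(f"tau = {tau:3.1f}:  empirical {c_emp:.4f}   theory {c_th:.4f}")
```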
(65:32) okay great so this was an example and
(65:35) from next week
(65:35) we’ll be a little bit more intuitive
(65:37) again now we’ve covered the technical
(65:39) stuff
(65:40) uh and we can look into some real uh
(65:43) physics,
(65:44) real physics problems okay see you all
(65:47) next week
(65:48) bye