Maximum entropy at equilibrium for closed system: Local maximum or global maximum?























For a closed system at equilibrium, the entropy is at a maximum. Is this a local maximum or a global maximum?



I am an undergraduate physics student, and the possibility of the entropy having local maxima was never discussed; it was always assumed to be a global maximum. Is this true in all cases?
















































































      thermodynamics statistical-mechanics entropy equilibrium














asked Dec 12 at 15:25 by TaeNyFan
edited Dec 12 at 18:45 by Qmechanic






















          3 Answers

















4 votes (accepted)










To quote H. B. Callen's Thermodynamics book, the second postulate in his formal development of thermodynamics is:




          Postulate II - There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.




As an example: assume $S$ is a function of $U,V,N$ and suppose your system can only exchange heat, so $V$ and $N$ are constants. Out of all the possible values the unconstrained parameter $U$ may take, the system at equilibrium will assume the value of $U$ for which $S$ is a maximum. So $S$ will be a global maximum with respect to $U$, but not necessarily with respect to $V$ and $N$ in a particular problem. However, if you also allow your system to expand and to exchange matter, by this postulate the values assumed by $U,V,N$ (which are now all unconstrained) will be such that $S$ is a global maximum.
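This maximization over an unconstrained parameter can be sketched numerically. The toy model below is not from the answer: it assumes a hypothetical ideal-gas-like entropy $S_i = C_i \ln U_i$ for two subsystems sharing a fixed total energy, and scans the energy split to find the one that maximizes the total entropy.

```python
import math

# Hypothetical toy model: two subsystems share a fixed total energy U_total.
# Each is assigned an ideal-gas-like entropy S_i(U_i) = C_i * ln(U_i)
# (constants and units dropped). The energy split U1 is the unconstrained
# internal parameter; by Callen's postulate, equilibrium picks the split
# that maximizes the total entropy.

def total_entropy(u1, u_total=100.0, c1=3.0, c2=1.0):
    u2 = u_total - u1
    return c1 * math.log(u1) + c2 * math.log(u2)

# Scan the unconstrained variable on a grid strictly inside (0, U_total).
grid = [k / 100 for k in range(10, 9991)]   # 0.10 ... 99.90
u1_star = max(grid, key=total_entropy)

# Analytic check: dS/dU1 = C1/U1 - C2/U2 = 0  =>  U1 = U_total * C1/(C1+C2).
print(u1_star)  # -> 75.0
```

Setting the derivative to zero is the usual equal-temperature condition $\partial S_1/\partial U_1 = \partial S_2/\partial U_2$; here it lands at $U_1 = 75$ for the assumed constants.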



Edit: I was forgetting about phase transitions. Near a phase transition there will be states for which the Gibbs potential $G$ is a local minimum (so $S$ is a local maximum). These states are, however, metastable, and the thermodynamic system will typically prefer the more stable states that correspond to the global minimum of $G$ (and consequently to the global maximum of $S$).






answered Dec 12 at 16:00 by ErickShock, edited Dec 12 at 16:19






























7 votes














For a closed system at equilibrium, the entropy is at a maximum. Is this a local maximum or a global maximum?




To allow a meaningful answer, the maximum needs to be qualified: maximum with respect to which variables? In thermodynamics the correct (and meaningful) statement is "maximum with respect to the variables (distinct from the thermodynamic state variables describing the isolated system) that represent the entropy's dependence on all possible internal constraints" (i.e. constraints within the isolated system).



From this principle, i.e. from this statement that condenses a long series of experiments, it is possible to derive many consequences, such as the equilibrium conditions or even the concavity of the entropy as a function of the variables describing the macroscopic state of the isolated system, which, I stress, are not the same variables that describe the constraints.



So, from the maximum principle, one can obtain the concavity of the entropy with respect to the state variables by carefully choosing the kind of constraint.



However, such concavity as a function of the state variables does not imply strict concavity, or even concavity, of the entropy with respect to every possible internal constraint. For example, one could devise a constraint forcing an atomic system to stay in only two ordered crystalline structures (perhaps not easy in a lab, but not complicated in a computer simulation). For such a constrained system one could have several local maxima, the highest being the true stable state and the remaining ones metastable states.
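A minimal numerical sketch of this scenario, assuming an invented entropy function of a single constraint variable with two unequal peaks (nothing here is derived from a real crystal model):

```python
# Hypothetical sketch: an entropy S(x) of a single internal-constraint
# variable x (say, the fraction of atoms in crystal structure A).
# The function below is invented to have two unequal local maxima;
# it is not derived from any physical model.
def constrained_entropy(x):
    return -((x - 0.25) ** 2) * ((x - 0.75) ** 2) + 0.02 * x

xs = [k / 1000 for k in range(1001)]          # grid on [0, 1]
ys = [constrained_entropy(x) for x in xs]

# Interior local maxima: grid points higher than both neighbours.
local_max = [xs[k] for k in range(1, 1000)
             if ys[k] > ys[k - 1] and ys[k] > ys[k + 1]]

stable = max(local_max, key=constrained_entropy)    # global maximum: stable state
metastable = [x for x in local_max if x != stable]  # lower maxima: metastable states

print(len(local_max))  # -> 2 (one stable, one metastable)
```

The grid search finds two maxima (near x ≈ 0.31 and x ≈ 0.78 for this invented function); only the higher one is the true equilibrium, exactly as described above.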



Probably the most interesting question is: if we remove all the internal constraints, how can we know whether there is a unique final equilibrium state? And maybe this was the intended original question. Well, to the best of my knowledge, there is no definite answer, and there is a good reason for that: it is possible to imagine systems that do not reach equilibrium at all (non-ergodic systems). Thus, I would regard the requirement of a unique maximum value of the entropy as an additional requirement on thermodynamic systems.






answered Dec 12 at 19:01 by GiorgioP




























5 votes













Good question! At a fundamental level, entropy depends on the probability distribution $p(x)$ of the microscopic states $x \in \Omega$ of a system, and is given by the following equation:



$$H(p) = -\sum_{x\in \Omega}p(x)\log p(x)$$



              Just specifying a few macroscopic variables of a system (e.g. $U$, $V$, and $N$) isn't enough to determine a unique probability distribution over the microscopic states, but the principle of maximum entropy says that the equilibrium distribution over microscopic states satisfying these macroscopic constraints is the one that has the greatest entropy.



Mathematically, $H(p)$ is a (strictly) concave function of the probability distribution, which means that averaging two distributions can only increase the entropy:



$$H(\lambda p_1 + (1-\lambda) p_2) \geq \lambda H(p_1) + (1-\lambda)H(p_2)$$



An incredibly useful property of strictly concave functions is that they have at most one local maximum, which is then guaranteed to be the global maximum. This is why people ignore the possibility of multiple local maxima of the entropy: the concavity of the entropy guarantees that you'll only ever have one.
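A quick numerical check of the concavity inequality above, using two arbitrary three-state distributions and a mixing weight that are made up for illustration:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, with the convention 0 log 0 = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two arbitrary distributions on the same 3-state space, mixed with an
# arbitrary weight lambda = 0.4 (all values invented for the check).
p1 = [0.7, 0.2, 0.1]
p2 = [0.1, 0.3, 0.6]
lam = 0.4
mix = [lam * a + (1 - lam) * b for a, b in zip(p1, p2)]

lhs = shannon_entropy(mix)                                          # ~ 1.08
rhs = lam * shannon_entropy(p1) + (1 - lam) * shannon_entropy(p2)   # ~ 0.86
assert lhs >= rhs  # concavity: the mixture has at least the averaged entropy
```

For these two distinct distributions the inequality is strict, as strict concavity requires; equality would hold only for $p_1 = p_2$.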



              This of course isn't the whole story, because in practice you can get things like metastable states, where a system gets stuck in a non-equilibrium state for a long time. But at least on paper, that's why we only ever talk about "the" maximum entropy state.



























• Does a metastable state correspond to a local maximum of the entropy?
  – Gec
  Dec 12 at 16:59








• [4] Uniqueness of the maximum follows from strict (without the parentheses) concavity. Unfortunately, the thermodynamic entropy is only concave, not strictly concave. It is strictly concave almost everywhere, but in the presence of a first-order phase transition the physical coexistence of phases implies non-strict concavity. Therefore it is not possible to claim uniqueness of the maximum in general.
  – GiorgioP
  Dec 12 at 18:05











              Your Answer





              StackExchange.ifUsing("editor", function () {
              return StackExchange.using("mathjaxEditing", function () {
              StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
              StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
              });
              });
              }, "mathjax-editing");

              StackExchange.ready(function() {
              var channelOptions = {
              tags: "".split(" "),
              id: "151"
              };
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function() {
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled) {
              StackExchange.using("snippets", function() {
              createEditor();
              });
              }
              else {
              createEditor();
              }
              });

              function createEditor() {
              StackExchange.prepareEditor({
              heartbeatType: 'answer',
              autoActivateHeartbeat: false,
              convertImagesToLinks: false,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: null,
              bindNavPrevention: true,
              postfix: "",
              imageUploader: {
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              },
              noCode: true, onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              });


              }
              });














              draft saved

              draft discarded


















              StackExchange.ready(
              function () {
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fphysics.stackexchange.com%2fquestions%2f446850%2fmaximum-entropy-at-equilibrium-for-closed-system-local-maximum-or-global-maximu%23new-answer', 'question_page');
              }
              );

              Post as a guest















              Required, but never shown

























              3 Answers
              3






              active

              oldest

              votes








              3 Answers
              3






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes








              up vote
              4
              down vote



              accepted










              To quote H.B. Callen Thermodynamics book, his second postulate about the formal development of Thermodynamics is:




              Postulate II - There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.




              As an example: assume $S$ is a function of $U,V,N$ and suppose your system can only exchange heat, so $V$ and $N$ are constants. Out of all the possible values the unconstrained parameter $U$ may take, the system at equilibrium will assume the value of $U$ such that $S$ is a maximum. So $S$ will be a global maximum in respect to $U$, but not necessarily in respect to $V$ and $N$ in a particular problem. However, if you also allow your system to expand and to exchange matter, by this postulate the values assumed by $U,V,N$ (which are now all unconstrained) will be such that $S$ is a global maximum.



              Edit. I was forgetting about phase transitions. When near a phase transition there will be states such that the Gibbs Potential $G$ is a local minimum (so $S$ is a local maximum). These states are, however, metastable and your thermodynamic system will typically prefer more stable states that correspond to a global minimum of $G$ (and consequently, to a global maximum of $S$).






              share|cite|improve this answer



























                up vote
                4
                down vote



                accepted










                To quote H.B. Callen Thermodynamics book, his second postulate about the formal development of Thermodynamics is:




                Postulate II - There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.




                As an example: assume $S$ is a function of $U,V,N$ and suppose your system can only exchange heat, so $V$ and $N$ are constants. Out of all the possible values the unconstrained parameter $U$ may take, the system at equilibrium will assume the value of $U$ such that $S$ is a maximum. So $S$ will be a global maximum in respect to $U$, but not necessarily in respect to $V$ and $N$ in a particular problem. However, if you also allow your system to expand and to exchange matter, by this postulate the values assumed by $U,V,N$ (which are now all unconstrained) will be such that $S$ is a global maximum.



                Edit. I was forgetting about phase transitions. When near a phase transition there will be states such that the Gibbs Potential $G$ is a local minimum (so $S$ is a local maximum). These states are, however, metastable and your thermodynamic system will typically prefer more stable states that correspond to a global minimum of $G$ (and consequently, to a global maximum of $S$).






                share|cite|improve this answer

























                  up vote
                  4
                  down vote



                  accepted







                  up vote
                  4
                  down vote



                  accepted






                  To quote H.B. Callen Thermodynamics book, his second postulate about the formal development of Thermodynamics is:




                  Postulate II - There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.




                  As an example: assume $S$ is a function of $U,V,N$ and suppose your system can only exchange heat, so $V$ and $N$ are constants. Out of all the possible values the unconstrained parameter $U$ may take, the system at equilibrium will assume the value of $U$ such that $S$ is a maximum. So $S$ will be a global maximum in respect to $U$, but not necessarily in respect to $V$ and $N$ in a particular problem. However, if you also allow your system to expand and to exchange matter, by this postulate the values assumed by $U,V,N$ (which are now all unconstrained) will be such that $S$ is a global maximum.



                  Edit. I was forgetting about phase transitions. When near a phase transition there will be states such that the Gibbs Potential $G$ is a local minimum (so $S$ is a local maximum). These states are, however, metastable and your thermodynamic system will typically prefer more stable states that correspond to a global minimum of $G$ (and consequently, to a global maximum of $S$).






                  share|cite|improve this answer














                  To quote H.B. Callen Thermodynamics book, his second postulate about the formal development of Thermodynamics is:




                  Postulate II - There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.




                  As an example: assume $S$ is a function of $U,V,N$ and suppose your system can only exchange heat, so $V$ and $N$ are constants. Out of all the possible values the unconstrained parameter $U$ may take, the system at equilibrium will assume the value of $U$ such that $S$ is a maximum. So $S$ will be a global maximum in respect to $U$, but not necessarily in respect to $V$ and $N$ in a particular problem. However, if you also allow your system to expand and to exchange matter, by this postulate the values assumed by $U,V,N$ (which are now all unconstrained) will be such that $S$ is a global maximum.



                  Edit. I was forgetting about phase transitions. When near a phase transition there will be states such that the Gibbs Potential $G$ is a local minimum (so $S$ is a local maximum). These states are, however, metastable and your thermodynamic system will typically prefer more stable states that correspond to a global minimum of $G$ (and consequently, to a global maximum of $S$).







                  share|cite|improve this answer














                  share|cite|improve this answer



                  share|cite|improve this answer








                  edited Dec 12 at 16:19

























                  answered Dec 12 at 16:00









                  ErickShock

                  1365




                  1365






















                      up vote
                      7
                      down vote














                      For a closed system at equilibrium the entropy is maximum. Is this a local maximum or is it a global maximum?




                      To allow a meaningful answer, it would be necessary to qualify the maximum. Maximum with respect to which variable? In thermodynamics the correct (and meaningful) statement is "maximum with respect to the variables (different from the thermodynamic state variables describing the isolated system) which represent the entropy dependence on all possible internal constraints" (i.e. constraints in the isolated system).



                      From this principle, i.e. from this sentence which condensate a long series of experiences, it is possible to obtain many consequences, like the equilibrium conditions or even the condition of concavity of the entropy as function of the variables which describe the macroscopic state of the isolated system, which, I stress, are not the same which describe the constraints.



                      So, from the maximum principle, one can get the concavity of entropy with respect to the state variables, by carefully choosing the kind of constraint.



                      However, such concavity as function of the state variables, does not imply strict concavity, or even concavity of entropy with respect to any possible internal constraint. For example, one could think of a constraint forcing an atomic system to stay only in two ordered crystalline structures (maybe not easy in a lab but not complicate in a computer simulation). For such constrained system one could have local maxima, with the highest being the true stable state and the remaining one, being a metastable system.



                      Probably the most interesting question could be: if we remove all the internal constraints, how can we know if there is a unique final equilibrium state?
                      And maybe this was the intended original question. Well, at the best of my knowledge, there is no definite answer. And there is a good reason for that. It is possible to imagine systems which do not reach equilibrium at all (non ergodic systems). Thus, I would consider the request of a unique maximum value of the entropy as an additional request for thermodynamic systems.






                      share|cite|improve this answer

























                        up vote
                        7
                        down vote














                        For a closed system at equilibrium the entropy is maximum. Is this a local maximum or is it a global maximum?




                        To allow a meaningful answer, it would be necessary to qualify the maximum. Maximum with respect to which variable? In thermodynamics the correct (and meaningful) statement is "maximum with respect to the variables (different from the thermodynamic state variables describing the isolated system) which represent the entropy dependence on all possible internal constraints" (i.e. constraints in the isolated system).



                        From this principle, i.e. from this sentence which condensate a long series of experiences, it is possible to obtain many consequences, like the equilibrium conditions or even the condition of concavity of the entropy as function of the variables which describe the macroscopic state of the isolated system, which, I stress, are not the same which describe the constraints.



                        So, from the maximum principle, one can get the concavity of entropy with respect to the state variables, by carefully choosing the kind of constraint.



                        However, such concavity as function of the state variables, does not imply strict concavity, or even concavity of entropy with respect to any possible internal constraint. For example, one could think of a constraint forcing an atomic system to stay only in two ordered crystalline structures (maybe not easy in a lab but not complicate in a computer simulation). For such constrained system one could have local maxima, with the highest being the true stable state and the remaining one, being a metastable system.



                        Probably the most interesting question could be: if we remove all the internal constraints, how can we know if there is a unique final equilibrium state?
                        And maybe this was the intended original question. Well, at the best of my knowledge, there is no definite answer. And there is a good reason for that. It is possible to imagine systems which do not reach equilibrium at all (non ergodic systems). Thus, I would consider the request of a unique maximum value of the entropy as an additional request for thermodynamic systems.






                        share|cite|improve this answer























                          up vote
                          7
                          down vote










                          up vote
                          7
                          down vote










                          For a closed system at equilibrium the entropy is maximum. Is this a local maximum or is it a global maximum?




                          To allow a meaningful answer, it would be necessary to qualify the maximum. Maximum with respect to which variable? In thermodynamics the correct (and meaningful) statement is "maximum with respect to the variables (different from the thermodynamic state variables describing the isolated system) which represent the entropy dependence on all possible internal constraints" (i.e. constraints in the isolated system).



                          From this principle, i.e. from this sentence which condensate a long series of experiences, it is possible to obtain many consequences, like the equilibrium conditions or even the condition of concavity of the entropy as function of the variables which describe the macroscopic state of the isolated system, which, I stress, are not the same which describe the constraints.



                          So, from the maximum principle, one can get the concavity of entropy with respect to the state variables, by carefully choosing the kind of constraint.



                          However, such concavity as function of the state variables, does not imply strict concavity, or even concavity of entropy with respect to any possible internal constraint. For example, one could think of a constraint forcing an atomic system to stay only in two ordered crystalline structures (maybe not easy in a lab but not complicate in a computer simulation). For such constrained system one could have local maxima, with the highest being the true stable state and the remaining one, being a metastable system.



                          Probably the most interesting question could be: if we remove all the internal constraints, how can we know if there is a unique final equilibrium state?
                          And maybe this was the intended original question. Well, at the best of my knowledge, there is no definite answer. And there is a good reason for that. It is possible to imagine systems which do not reach equilibrium at all (non ergodic systems). Thus, I would consider the request of a unique maximum value of the entropy as an additional request for thermodynamic systems.






                          share|cite|improve this answer













                          For a closed system at equilibrium the entropy is maximum. Is this a local maximum or is it a global maximum?




                          To allow a meaningful answer, it would be necessary to qualify the maximum. Maximum with respect to which variable? In thermodynamics the correct (and meaningful) statement is "maximum with respect to the variables (different from the thermodynamic state variables describing the isolated system) which represent the entropy dependence on all possible internal constraints" (i.e. constraints in the isolated system).



                          From this principle, i.e. from this sentence which condensate a long series of experiences, it is possible to obtain many consequences, like the equilibrium conditions or even the condition of concavity of the entropy as function of the variables which describe the macroscopic state of the isolated system, which, I stress, are not the same which describe the constraints.



                          So, from the maximum principle, one can get the concavity of entropy with respect to the state variables, by carefully choosing the kind of constraint.



                          However, such concavity as function of the state variables, does not imply strict concavity, or even concavity of entropy with respect to any possible internal constraint. For example, one could think of a constraint forcing an atomic system to stay only in two ordered crystalline structures (maybe not easy in a lab but not complicate in a computer simulation). For such constrained system one could have local maxima, with the highest being the true stable state and the remaining one, being a metastable system.



                          Probably the most interesting question could be: if we remove all the internal constraints, how can we know if there is a unique final equilibrium state?
                          And maybe this was the intended original question. Well, at the best of my knowledge, there is no definite answer. And there is a good reason for that. It is possible to imagine systems which do not reach equilibrium at all (non ergodic systems). Thus, I would consider the request of a unique maximum value of the entropy as an additional request for thermodynamic systems.







                          share|cite|improve this answer












                          share|cite|improve this answer



                          share|cite|improve this answer










                          answered Dec 12 at 19:01









                          GiorgioP

                          1,299212




                          1,299212






















                              up vote
                              5
                              down vote













                              Good question! At a fundamental level, entropy depends on the probability distribution $p(x)$ of the microscopic states $xin Omega$ of a system, which is given by the following equation:



                              $$H(p) = -sum_{xin Omega}p(x)log p(x)$$



                              Just specifying a few macroscopic variables of a system (e.g. $U$, $V$, and $N$) isn't enough to determine a unique probability distribution over the microscopic states, but the principle of maximum entropy says that the equilibrium distribution over microscopic states satisfying these macroscopic constraints is the one that has the greatest entropy.



                              Mathematically, $H(p)$ is a (strictly) concave function of the probability distributions, which means that it can only increase when we average over probability distributions:



                              $$H(lambda p_1 + (1-lambda) p_2) geq lambda H(p_1) + (1-lambda)H(p_2)$$



                              An incredibly useful property of (strictly) concave functions is that they can only have one local maximum point, which is then guaranteed to be the global maximum. This is the reason why people ignore the possibility of multiple local maxima of the entropy, because the concave nature of the entropy guarantees that you'll only ever have one (see these notes, for example).



                              This of course isn't the whole story, because in practice you can get things like metastable states, where a system gets stuck in a non-equilibrium state for a long time. But at least on paper, that's why we only ever talk about "the" maximum entropy state.






                              share|cite|improve this answer





















                              • Does metastable state correspond to entropy's local maximum?
                                – Gec
                                Dec 12 at 16:59








                              • 4




                                Uniqueness of the maximum follows from the strict (without parentheses) concavity. Unfortunately, thermodynamic entropy is just concave, not strict concave. It is strictly concave almost everywhere, but in the presence of a first order phase transition, the physical coexistence of phases implies a non strict concavity. Therefore it is not possible to claim uniqueness of the maximum, in general.
                                – GiorgioP
                                Dec 12 at 18:05















                              up vote
                              5
                              down vote













Good question! At a fundamental level, entropy is a function of the probability distribution $p(x)$ over the microscopic states $x \in \Omega$ of a system, given by:



$$H(p) = -\sum_{x\in \Omega}p(x)\log p(x)$$



                              Just specifying a few macroscopic variables of a system (e.g. $U$, $V$, and $N$) isn't enough to determine a unique probability distribution over the microscopic states, but the principle of maximum entropy says that the equilibrium distribution over microscopic states satisfying these macroscopic constraints is the one that has the greatest entropy.
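As a minimal numerical illustration of this principle (function and variable names are my own, not from any library): among all distributions over a fixed set of microstates, the uniform distribution has the greatest entropy, so it is the one the maximum-entropy principle selects when no further constraints are imposed.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_x p(x) * log p(x), in nats; terms with p(x) = 0 contribute 0."""
    return -sum(px * math.log(px) for px in p if px > 0)

# Among all distributions over 4 microstates, the uniform one has the
# greatest entropy, H = log(4) ~ 1.386 nats; any "peaked" distribution
# over the same state space has strictly less.
uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # log(4), about 1.386
print(shannon_entropy(peaked))   # about 0.940, strictly smaller
```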



Mathematically, $H(p)$ is a (strictly) concave function of the probability distribution, which means that mixing two distributions can only increase the entropy relative to the corresponding average of their entropies:



$$H(\lambda p_1 + (1-\lambda) p_2) \geq \lambda H(p_1) + (1-\lambda)H(p_2)$$
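This inequality is easy to check numerically (a quick sketch with names of my own choosing, not a library API):

```python
import math
import random

def shannon_entropy(p):
    # H(p) in nats; terms with p(x) = 0 contribute 0
    return -sum(px * math.log(px) for px in p if px > 0)

def mixture(p1, p2, lam):
    """Convex combination lam*p1 + (1-lam)*p2, itself a valid distribution."""
    return [lam * a + (1 - lam) * b for a, b in zip(p1, p2)]

def random_dist(n, rng):
    w = [rng.random() for _ in range(n)]
    total = sum(w)
    return [x / total for x in w]

rng = random.Random(0)
for _ in range(1000):
    p1, p2 = random_dist(5, rng), random_dist(5, rng)
    lam = rng.random()
    lhs = shannon_entropy(mixture(p1, p2, lam))
    rhs = lam * shannon_entropy(p1) + (1 - lam) * shannon_entropy(p2)
    assert lhs >= rhs - 1e-12  # concavity: H(mixture) >= average of the H's
print("concavity inequality held in all 1000 random trials")
```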



                              An incredibly useful property of (strictly) concave functions is that they can only have one local maximum point, which is then guaranteed to be the global maximum. This is the reason why people ignore the possibility of multiple local maxima of the entropy, because the concave nature of the entropy guarantees that you'll only ever have one (see these notes, for example).



                              This of course isn't the whole story, because in practice you can get things like metastable states, where a system gets stuck in a non-equilibrium state for a long time. But at least on paper, that's why we only ever talk about "the" maximum entropy state.



























• Does a metastable state correspond to a local maximum of the entropy?
                                – Gec
                                Dec 12 at 16:59








                              • 4




Uniqueness of the maximum follows from strict (without the parentheses) concavity. Unfortunately, thermodynamic entropy is just concave, not strictly concave. It is strictly concave almost everywhere, but in the presence of a first-order phase transition, the physical coexistence of phases implies non-strict concavity. Therefore it is not possible to claim uniqueness of the maximum in general.
                                – GiorgioP
                                Dec 12 at 18:05













                              answered Dec 12 at 16:45









                              jemisjoky

                              912















