
Laws of thermodynamics

The laws of thermodynamics describe the transport of heat and work in thermodynamic processes.

These laws have become some of the most important in all of physics and in other sciences associated with thermodynamics.

Classical thermodynamics, which is focused on systems in thermodynamic equilibrium, can be considered separately from non-equilibrium thermodynamics. This article focuses on classical, equilibrium thermodynamics. The four principles (referred to as "laws") are:

The zeroth law of thermodynamics, which underlies the definition of temperature.

The first law of thermodynamics, which mandates conservation of energy and states in particular that the flow of heat is a form of energy transfer. The second law of thermodynamics, which states that the entropy of an isolated macroscopic system never decreases, or (equivalently) that perpetual motion machines are impossible. The third law of thermodynamics, which concerns the entropy of a perfect crystal at absolute zero temperature, and implies that it is impossible to cool a system all the way to exactly absolute zero.

Zeroth law

If two thermodynamic systems are each in thermal equilibrium with a third, then they are in thermal equilibrium with each other. When two systems are put in contact with each other, there will be a net exchange of energy between them unless or until they are in thermal equilibrium, that is, the state of having equal temperature. Although this concept of thermodynamics is fundamental, the need to state it explicitly was not widely perceived until the first third of the 20th century, long after the first three principles were already widely in use; hence it was numbered zero, before the subsequent three. The zeroth law asserts that thermal equilibrium, viewed as a binary relation, is a transitive relation (and since any system is always in equilibrium with itself, and if a system is in equilibrium with another then the latter is in equilibrium with the former, it is furthermore an equivalence relation).

The zeroth law of thermodynamics is a statement about the thermal equilibrium among bodies, or thermodynamic systems, in contact. It results from the definition and properties of temperature.

Zeroth law as equivalence relation

A system is said to be in thermal equilibrium when its temperature does not change over time. Let A, B, and C be distinct thermodynamic systems or bodies. The zeroth law of thermodynamics can then be expressed as: "If A and C are each in thermal equilibrium with B, A is also in thermal equilibrium with C." The preceding sentence asserts that thermal equilibrium is a Euclidean relation between thermodynamic systems. If we also grant that all thermodynamic systems are (trivially) in thermal equilibrium with themselves, then thermal equilibrium is also a reflexive relation. Relations that are both reflexive and Euclidean are equivalence relations. One consequence of this reasoning is that thermal equilibrium is a transitive relation between the temperatures T of A, B, and C: if T(A) = T(B) and T(B) = T(C), then T(A) = T(C).
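As an illustration of this reasoning, grouping systems by temperature realizes thermal equilibrium as an equivalence relation whose classes are isotherms. A minimal sketch; the system names and temperatures are invented for the example:

```python
# Thermal equilibrium modeled as "equal temperature": an equivalence
# relation, so systems partition into classes sharing one temperature.
# The systems and temperatures below are hypothetical examples.
from collections import defaultdict

temps = {"A": 300.0, "B": 300.0, "C": 300.0, "D": 350.0}

def in_equilibrium(x, y):
    """Zeroth-law criterion: equal temperature."""
    return temps[x] == temps[y]

# Transitivity: A~B and B~C imply A~C.
assert in_equilibrium("A", "B") and in_equilibrium("B", "C")
assert in_equilibrium("A", "C")

# Equivalence classes = isotherms: group systems by temperature.
classes = defaultdict(list)
for name, t in temps.items():
    classes[t].append(name)
print(dict(classes))  # {300.0: ['A', 'B', 'C'], 350.0: ['D']}
```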

Thermal equilibrium between many systems

Many systems are said to be in equilibrium if the small exchanges (due to Brownian motion, for example) between them do not lead to a net change in the total energy summed over all systems. A simple example illustrates why the zeroth law is necessary to complete the equilibrium description. Consider N systems in adiabatic isolation from the rest of the universe (i.e. no heat exchange is possible outside of these N systems), all of which have a constant volume and composition, and which can only exchange heat with one another. The combined first and second laws relate the fluctuation in total energy, δU, to the temperature T_i of the i-th system and the entropy fluctuation δS_i in the i-th system as follows:

    δU = Σ_i T_i δS_i

The adiabatic isolation of the system from the remaining universe requires that the total sum of the entropy fluctuations vanishes:

    Σ_i δS_i = 0

That is, entropy can only be exchanged between the N systems. This constraint can be used to rearrange the expression for the total energy fluctuation and obtain:

    δU = Σ_i (T_i - T_j) δS_i

where T_j is the temperature of any system j we may choose to single out among the N systems. Finally, equilibrium requires the total fluctuation in energy to vanish, in which case:

    Σ_i (T_i - T_j) δS_i = 0

which can be thought of as the vanishing of the product of an antisymmetric matrix, T_i - T_j, and a vector of entropy fluctuations, δS_i. In order for a non-trivial solution to exist,

    det(T_i - T_j) = 0

That is, the determinant of the matrix formed by T_i - T_j must vanish for all choices of N. However, according to Jacobi's theorem, the determinant of an N x N antisymmetric matrix is always zero if N is odd, while for N even a vanishing determinant requires all of the entries to vanish, T_i - T_j = 0. Hence T_i = T_j at equilibrium. This non-intuitive result means that an odd number of systems are always in equilibrium regardless of their temperatures and entropy fluctuations, while equality of temperatures is only required between an even number of systems to achieve equilibrium in the presence of entropy fluctuations. The zeroth law resolves this odd-versus-even paradox, because it can readily be used to reduce an odd-numbered case to an even one: consider any three of the N systems and eliminate one by application of its principle, reducing the problem to even N, which then leads to the same equilibrium condition in every case, i.e., T_i = T_j. The same result applies to fluctuations in any extensive quantity, such as volume (yielding the equal-pressure condition), or fluctuations in mass (leading to equality of chemical potentials). Hence the zeroth law has implications for a great deal more than temperature alone. In general, we see that the zeroth law breaks a certain kind of asymmetry present in the first and second laws.
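The determinant condition above can be checked numerically. The following sketch (pure Python, with arbitrary example temperatures) builds the antisymmetric matrix T_i - T_j and confirms that its determinant vanishes identically for odd N, whereas for N = 2 it vanishes only when the two temperatures are equal:

```python
# Check of the determinant condition det(T_i - T_j) = 0 from the
# argument above. Temperatures are arbitrary example values.

def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for col in range(n):
        minor = [row[:col] + row[col + 1:] for row in m[1:]]
        total += (-1) ** col * m[0][col] * det(minor)
    return total

def antisym(temps):
    """Antisymmetric matrix with entries T_i - T_j."""
    return [[ti - tj for tj in temps] for ti in temps]

print(det(antisym([300, 350, 400])))  # odd N = 3: determinant is 0
print(det(antisym([300, 350])))       # N = 2: (300 - 350)**2 = 2500
print(det(antisym([300, 300])))       # N = 2, equal temperatures: 0
```

For N = 2 the determinant is (T_1 - T_2)^2, so it vanishes only at equal temperatures, while any odd-N antisymmetric matrix has determinant zero automatically, matching the use of Jacobi's theorem above.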

Temperature and the zeroth law

It is often claimed, for instance by Max Planck in his influential textbook on thermodynamics, that the zeroth law implies that we can define a "temperature function", or more informally, that we can "construct a thermometer". In the space of thermodynamic parameters, zones of constant temperature form a surface, which provides a natural ordering of nearby surfaces. It is then simple to construct a global temperature function that provides a continuous ordering of states. Note that the dimensionality of a surface of constant temperature is one less than the number of thermodynamic parameters (thus, for an ideal gas described with three thermodynamic parameters P, V and n, it is a two-dimensional surface). The temperature so defined may indeed not look like the Celsius temperature scale, but it is a temperature function nonetheless. For example, if two systems of ideal gas are in equilibrium, then P1V1/N1 = P2V2/N2, where Pi is the pressure in the i-th system, Vi is the volume, and Ni is the "amount" (in moles, or simply the number of atoms) of gas. The surface PV/N = const defines surfaces of equal temperature, and the obvious (but not only) way to label them is to define T so that PV/N = RT, where R is some constant. These systems can now be used as a thermometer to calibrate other systems.

First law

Energy can neither be created nor destroyed; it can only change forms. In any process in an isolated system, the total energy remains the same. For a thermodynamic cycle, the net heat supplied to the system equals the net work done by the system. The first law states that energy cannot be created or destroyed; rather, in a steady-state process the amount of energy lost must equal the amount of energy gained. This is the statement of conservation of energy for a thermodynamic system.
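Going back to the thermometer construction described above: labeling the isotherm PV/n = const by T = PV/(nR) can be sketched as follows. The gas states are invented example values, and R is taken as the standard molar gas constant:

```python
# Toy ideal-gas thermometer: label each isotherm PV/n = const by
# T = PV/(nR), as in the construction above. Example states only.
R = 8.314  # J/(mol K), molar gas constant

def temperature(P, V, n):
    """Assign the temperature label T satisfying PV/n = RT."""
    return P * V / (n * R)

# Two gas samples on the same isotherm (P1 V1 / n1 == P2 V2 / n2)
# must receive the same temperature label:
T1 = temperature(P=101325.0, V=0.0244, n=1.0)  # ~1 mol near 1 atm
T2 = temperature(P=50662.5, V=0.0488, n=1.0)   # half pressure, double volume
print(round(T1, 1), round(T2, 1))  # same label for both samples
```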
It refers to the two ways that a closed system transfers energy to and from its surroundings: the process of heating (or cooling) and the process of mechanical work. The rate of gain or loss in the stored energy of a system is determined by the rates of these two processes. In open systems, the flow of matter is another energy transfer mechanism, and extra terms must be included in the expression of the first law. The first law clarifies the nature of energy: it is a stored quantity which is independent of any particular process path, i.e., it is independent of the system history. If a system undergoes a thermodynamic cycle, whether it becomes warmer, cooler, larger, or smaller, it will have the same amount of energy each time it returns to a particular state. Mathematically speaking, energy is a state function and infinitesimal changes in the energy are exact differentials. All laws of thermodynamics but the first are statistical and simply describe the tendencies of macroscopic systems. For microscopic systems with few particles, the variations in the parameters become larger than the parameters themselves, and the assumptions of thermodynamics become meaningless.

Fundamental thermodynamic relation

The first law can be expressed as the fundamental thermodynamic relation:

Heat supplied to a system = increase in internal energy of the system + work done by the system

Increase in internal energy of a system = heat supplied to the system - work done by the system

In symbols:

    dU = T dS - p dV

Here, U is internal energy, T is temperature, S is entropy, p is pressure, and V is volume. This is a statement of conservation of energy: the net change in internal energy (dU) equals the heat energy that flows in (T dS) minus the energy that flows out via the system performing work (p dV).

The first law of thermodynamics, an expression of the principle of conservation of energy, states that energy can be transformed (changed from one form to another), but cannot be created or destroyed. The increase in the internal energy of a system is equal to the amount of energy added by heating the system minus the amount lost as a result of the work done by the system on its surroundings.

Description

The first law of thermodynamics says that energy is conserved in any process involving a thermodynamic system and its surroundings. Frequently it is convenient to focus on changes in the internal energy (U) and to regard them as due to a combination of heat (Q) added to the system and work done by the system (W). Taking dU as an incremental (differential) change in internal energy, one writes

    dU = δQ - δW

where δQ and δW are the incremental heat added to and work done by the system, respectively. The minus sign in front of δW indicates that a positive amount of work done by the system leads to energy being lost from the system. Note, also, that some books formulate the first law as:

    dU = δQ + δW'

where δW' = -δW is the work done on the system by the surroundings.
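A minimal numeric sketch of the two sign conventions; the heat and work values are arbitrary examples:

```python
# First law under the two common sign conventions discussed above.
# Q: heat added to the system; W_by: work done BY the system;
# W_on: work done ON the system (W_on = -W_by). Example values only.
Q = 100.0    # J of heat added
W_by = 40.0  # J of work done by the system
W_on = -W_by

dU_convention_1 = Q - W_by  # dU = Q - W   (W = work done by system)
dU_convention_2 = Q + W_on  # dU = Q + W'  (W' = work done on system)

print(dU_convention_1, dU_convention_2)  # both conventions give 60.0 J
```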


When a system expands in a quasistatic process, the work done on the system is -P dV, whereas the work done by the system while expanding is P dV. In either case, the first law written explicitly is

    dU = δQ - P dV

Work and heat are due to processes which add or subtract energy, while U is a particular form of energy associated with the system. Thus the term "heat energy" for δQ means "that amount of energy added as the result of heating" rather than referring to a particular form of energy. Likewise, "work energy" for δW means "that amount of energy lost as the result of work". Internal energy is a property of the system, whereas work done and heat supplied are not. A significant result of this distinction is that a given internal energy change (dU) can, in principle, be achieved by many combinations of heat and work. Informally, the law was first formulated by Germain Hess via Hess's law, and later by Julius Robert von Mayer.

The first explicit statement of the first law of thermodynamics was given by Rudolf Clausius in 1850: "There is a state function E, called energy, whose differential equals the work exchanged with the surroundings during an adiabatic process."

Mathematical formulation

The infinitesimal heat and work in the equations above are denoted by δ rather than d because, in mathematical terms, they are not exact differentials. In other words, they do not describe the state of any system. The integral of an inexact differential depends upon the particular "path" taken through the space of thermodynamic parameters, while the integral of an exact differential depends only upon the initial and final states. If the initial and final states are the same, then the integral of an inexact differential may or may not be zero, but the integral of an exact differential will always be zero. The path taken by a thermodynamic system through a chemical or physical change is known as a thermodynamic process.
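The path dependence of δW can be made concrete: take a fixed amount of ideal gas between two states of equal temperature along two different paths and compare the work integrals. A sketch with invented state values (ΔU is zero on both paths, since the internal energy of an ideal gas depends only on temperature):

```python
# Path dependence of work for an ideal gas between two states at the
# same temperature: the integral of P dV depends on the path taken,
# while the change in the state function U does not. Example values.
import math

n, R, T = 1.0, 8.314, 300.0  # mol, J/(mol K), K
V1, V2 = 0.01, 0.02          # m^3

# Path A: isothermal expansion, W = nRT ln(V2/V1).
W_isothermal = n * R * T * math.log(V2 / V1)

# Path B: drop the pressure at constant volume V1, then expand at the
# constant final pressure P2 = nRT/V2, so W = P2 * (V2 - V1).
P2 = n * R * T / V2
W_two_step = P2 * (V2 - V1)

print(W_isothermal, W_two_step)
# Different work along each path, yet U changes by zero on both,
# because the end-point temperatures are equal.
```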

An expression of the first law can be written in terms of exact differentials by realizing that the work that a system does is, in the case of a reversible process, equal to its pressure times the infinitesimal change in its volume. In other words, δW = P dV, where P is pressure and V is volume. Also, for a reversible process, the total amount of heat added to a system can be expressed as δQ = T dS, where T is temperature and S is entropy. Therefore, for a reversible process:

    dU = T dS - P dV

Since U, S and V are thermodynamic functions of state, the above relation holds also for non-reversible changes. The above equation is known as the fundamental thermodynamic relation. In the case where the number of particles in the system is not necessarily constant and may be of different types, the first law is written:

    dU = δQ - δW + Σ_i μ_i dN_i

where dN_i is the (small) number of type-i particles added to the system, and μ_i is the amount of energy added to the system when one type-i particle is added, where the energy of that particle is such that the volume and entropy of the system remain unchanged. μ_i is known as the chemical potential of the type-i particles in the system. The statement of the first law, using exact differentials, is now:

    dU = T dS - P dV + Σ_i μ_i dN_i

If the system has more external variables than just the volume that can change, the fundamental thermodynamic relation generalizes to:

    dU = T dS - Σ_i X_i dx_i + Σ_j μ_j dN_j

Here the X_i are the generalized forces corresponding to the external variables x_i. A useful idea from mechanics is that the energy gained by a particle is equal to the force applied to the particle multiplied by the displacement of the particle while that force is applied. Now consider the first law without the heating term: dU = -P dV. The pressure P can be viewed as a force (and in fact has units of force per unit area) while dV is the displacement (with units of distance times area). We may say, with respect to this work term, that a pressure difference forces a transfer of volume, and that the product of the two (work) is the amount of energy transferred as a result of the process. It is useful to view the T dS term in the same light: with respect to this heat term, a temperature difference forces a transfer of entropy, and the product of the two (heat) is the amount of energy transferred as a result of the process. Here, the temperature is known as a "generalized" force (rather than an actual mechanical force) and the entropy is a generalized displacement. Similarly, a difference in chemical potential between groups of particles in the system forces a transfer of particles, and the corresponding product is the amount of energy transferred as a result of the process. For example, consider a system consisting of two phases: liquid water and water vapor. There is a generalized "force" of evaporation which drives water molecules out of the liquid. There is a generalized "force" of condensation which drives vapor molecules out of the vapor. Only when these two "forces" (or chemical potentials) are equal will there be equilibrium, and the net transfer will be zero. The two thermodynamic parameters which form a generalized force-displacement pair are termed "conjugate variables". The two most familiar pairs are, of course, pressure-volume and temperature-entropy.

Types of thermodynamic processes

Paths through the space of thermodynamic variables are often specified by holding certain thermodynamic variables constant. It is useful to group these processes into pairs, in which each variable held constant is one member of a conjugate pair. The pressure-volume conjugate pair is concerned with the transfer of mechanical or dynamic energy as the result of work.

An isobaric process occurs at constant pressure. An example would be a movable piston in a cylinder, so that the pressure inside the cylinder is always at atmospheric pressure, although it is isolated from the atmosphere. In other words, the system is dynamically connected, by a movable boundary, to a constant-pressure reservoir: think of a balloon contracting as the gas inside cools. Since the pressure is constant, the work for an isobaric process is W = P ΔV.
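A quick numeric instance of the constant-pressure work formula, with assumed example values:

```python
# Isobaric work: at constant pressure, W = P * (V2 - V1).
# Example: gas in a piston at atmospheric pressure expands by 1 liter.
P = 101325.0           # Pa (atmospheric pressure)
V1, V2 = 0.010, 0.011  # m^3

W = P * (V2 - V1)  # work done by the gas on the piston
print(round(W, 3))  # about 101.325 J
```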

An isochoric (or isovolumetric) process is one in which the volume is held constant, meaning that the work done by the system is zero. It follows that, for the simple system of two dimensions, any heat energy transferred to the system externally will be absorbed as internal energy. An isochoric process is also known as an isometric process. An example would be to place a closed tin can containing only air into a fire. To a first approximation, the can will not expand, and the only change will be that the gas gains internal energy, as evidenced by its increase in temperature and pressure. Mathematically, δQ = dU. We may say that the system is dynamically insulated, by a rigid boundary, from the environment. The temperature-entropy conjugate pair is concerned with the transfer of thermal energy as the result of heating.

An isothermal process occurs at a constant temperature. An example would be a system immersed in a large constant-temperature bath. Any work energy performed by the system will be lost to the bath, but its temperature will remain constant. In other words, the system is thermally connected, by a thermally conductive boundary, to a constant-temperature reservoir.

An isentropic process occurs at a constant entropy. For a reversible process this is identical to an adiabatic process (see below). If a system has an entropy which has not yet reached its maximum equilibrium value, a process of cooling may be required to maintain that value of entropy.
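For an ideal gas, a reversible adiabatic (hence isentropic) path obeys T V^(γ-1) = const, a standard ideal-gas relation not derived above. A short sketch with assumed example values:

```python
# Isentropic (reversible adiabatic) compression of a monatomic ideal
# gas: T * V**(gamma - 1) is constant along the path, gamma = 5/3.
gamma = 5.0 / 3.0
T1, V1 = 300.0, 0.02  # initial state (example values: K, m^3)
V2 = 0.01             # compressed to half the volume

T2 = T1 * (V1 / V2) ** (gamma - 1)
print(round(T2, 1))  # temperature rises on adiabatic compression
```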

An adiabatic process is a process in which there is no energy added to or subtracted from the system by heating or cooling. For a reversible process, this is identical to an isentropic process. We may say that the system is thermally insulated from its environment and that its boundary is a thermal insulator. If a system has an entropy which has not yet reached its maximum equilibrium value, the entropy will increase even though the system is thermally insulated. The above have all implicitly assumed that the boundaries are also impermeable to particles. We may instead assume boundaries that are rigid and thermally insulating, but permeable to one or more types of particle. Similar considerations then hold for the (chemical potential)-(particle number) conjugate pairs.

Fundamental thermodynamic relation

In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in internal energy in terms of infinitesimal changes in entropy and volume for a closed system in thermal equilibrium in the following way:

    dE = T dS - P dV

Here, E is internal energy, T is absolute temperature, S is entropy, P is pressure, and V is volume. Like all physics equations, this equation can be used in any unit system; in a consistent unit system such as SI, the corresponding equation for the numerical values of the physical quantities relative to the unit system has the same form.

Derivation from the first and second laws of thermodynamics

The first law of thermodynamics states that:

    dE = δQ - δW

According to the second law of thermodynamics, we have for a reversible process:

    dS = δQ/T

Hence:

    δQ = T dS

By substituting this into the first law, we have:

    dE = T dS - δW

Letting δW be reversible pressure-volume work, δW = P dV, we have:

    dE = T dS - P dV

This equation has been derived in the case of reversible changes. However, since E, S, and V are thermodynamic functions of state, the above relation holds also for non-reversible changes. If the system has more external variables than just the volume that can change, and if the numbers of particles in the system can also change, the fundamental thermodynamic relation generalizes to:

    dE = T dS - Σ_i X_i dx_i + Σ_j μ_j dN_j

Here the X_i are the generalized forces corresponding to the external variables x_i. The μ_j are the chemical potentials corresponding to particles of type j.

Derivation from first principles

The above derivation uses the first and second laws of thermodynamics. The first law of thermodynamics is essentially a definition of heat, i.e. heat is the change in the internal energy of a system that is not caused by a change of the external parameters of the system. However, the second law of thermodynamics is not a defining relation for the entropy. The fundamental definition of the entropy of an isolated system containing an amount of energy E is:

    S = k log Ω(E)

where Ω(E) is the number of quantum states in a small interval between E and E + δE. Here δE is a macroscopically small energy interval that is kept fixed. Strictly speaking this means that the entropy depends on the choice of δE. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on δE. The entropy is thus a measure of the uncertainty about exactly which quantum state the system is in, given that we know its energy to be in some interval of size δE. Deriving the fundamental thermodynamic relation from first principles thus amounts to proving that the above definition of entropy implies that for reversible processes we have:

    T dS = δQ

The fundamental assumption of statistical mechanics is that all the Ω(E) states are equally likely. This allows us to extract all the thermodynamical quantities of interest. The temperature is defined as:

    1/(kT) ≡ β ≡ d log Ω(E)/dE

Suppose that the system has some external parameter, x, that can be changed. In general, the energy eigenstates of the system will depend on x. According to the adiabatic theorem of quantum mechanics, in the limit of an infinitely slow change of the system's Hamiltonian, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in. The generalized force, X, corresponding to the external variable x is defined such that X dx is the work performed by the system if x is increased by an amount dx. E.g., if x is the volume, then X is the pressure. The generalized force for a system known to be in energy eigenstate E_r is given by:

    X = -dE_r/dx

Since the system can be in any energy eigenstate within an interval of δE, we define the generalized force for the system as the expectation value of the above expression:

    X = -⟨dE_r/dx⟩

To evaluate the average, we partition the Ω(E) energy eigenstates by counting how many of them have a value for dE_r/dx within a range between Y and Y + δY. Calling this number Ω_Y(E), we have:

    Ω(E) = Σ_Y Ω_Y(E)

The average defining the generalized force can now be written:

    X = -(1/Ω(E)) Σ_Y Y Ω_Y(E)

We can relate this to the derivative of the entropy with respect to x at constant energy E as follows. Suppose we change x to x + dx. Then Ω(E) will change because the energy eigenstates depend on x, causing energy eigenstates to move into or out of the range between E and E + δE. Let's focus again on the energy eigenstates for which dE_r/dx lies within the range between Y and Y + δY. Since these energy eigenstates increase in energy by Y dx, all such energy eigenstates that are in the interval ranging from E - Y dx to E move from below E to above E. There are

    N_Y(E) = (Ω_Y(E)/δE) Y dx

such energy eigenstates. If Y dx ≤ δE, all these energy eigenstates will move into the range between E and E + δE and contribute to an increase in Ω. The number of energy eigenstates that move from below E + δE to above E + δE is, of course, N_Y(E + δE). The difference

    N_Y(E) - N_Y(E + δE)

is thus the net contribution to the increase in Ω. Note that if Y dx is larger than δE, there will be energy eigenstates that move from below E to above E + δE. They are counted in both N_Y(E) and N_Y(E + δE), therefore the above expression is also valid in that case. Expressing the above expression as a derivative with respect to E and summing over Y yields the expression:

    (∂Ω/∂x)_E = -Σ_Y Y (∂Ω_Y/∂E)_x = (∂(ΩX)/∂E)_x

The logarithmic derivative of Ω with respect to x is thus given by:

    (∂ log Ω/∂x)_E = βX + (∂X/∂E)_x

The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and thus vanishes in the thermodynamic limit. We have thus found that:

    (∂S/∂x)_E = X/T

Combining this with

    (∂S/∂E)_x = 1/T

gives:

    dS = (∂S/∂E)_x dE + (∂S/∂x)_E dx = dE/T + (X/T) dx

which we can write as:

    dE = T dS - X dx
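The final relation dE = T dS - X dx (with X = P and x = V) can be sanity-checked numerically using the entropy of a monatomic ideal gas, S(E, V) = N k (log V + (3/2) log E) + const, whose additive constant drops out of the derivatives. The particle number and state below are invented example values:

```python
# Numerical check of (dS/dE)_V = 1/T and (dS/dV)_E = P/T for a
# monatomic ideal gas with S(E, V) = N*k*(log(V) + 1.5*log(E)) + const.
# The additive constant does not affect the derivatives. Example values.
import math

k = 1.380649e-23  # J/K, Boltzmann constant
N = 1.0e22        # number of particles (example)

def S(E, V):
    return N * k * (math.log(V) + 1.5 * math.log(E))

E, V = 50.0, 0.01  # J, m^3 (example state)
h = 1e-6           # finite-difference step

dS_dE = (S(E + h, V) - S(E - h, V)) / (2 * h)
dS_dV = (S(E, V + h) - S(E, V - h)) / (2 * h)

T = 1.0 / dS_dE  # from (dS/dE)_V = 1/T
P = T * dS_dV    # from (dS/dV)_E = P/T

print(T)                       # temperature recovered from the entropy
print(abs(P * V - N * k * T))  # ideal-gas law holds to numerical precision
```

The checks recover E = (3/2) N k T and P V = N k T, confirming that the derivatives of this entropy reproduce the fundamental relation for this gas.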
