ISSN: 2167-0919
Journal of Telecommunications System & Management
  • Editorial   
  • J Telecommun Syst Manage, Vol 4(1)
  • DOI: 10.4172/2167-0919.1000e114

Entropy and Telecommunications Systems

Kalyan Raman* and Vijay Viswanathan
Northwestern University, Evanston, IL 60208, USA
*Corresponding Author: Kalyan Raman, Northwestern University, Evanston, IL 60208, USA, Tel: 847-467-0847, Email: [email protected]

Received Date: Mar 19, 2015 / Accepted Date: Mar 26, 2015 / Published Date: Apr 06, 2015

Background

The concept of entropy, which first arose early in the 19th century in determining the maximum efficiency attainable by heat engines, has turned out to have ramifications that extend far beyond its original domain of application. It is no exaggeration to claim that nearly every branch of science and engineering, and even the social sciences, has been touched by entropy. Entropy has also enriched the field of Statistics, where it has augmented traditional estimation methods such as least squares and maximum likelihood with a new methodology called the Maximum Entropy approach. While energy conservation is a fundamental law of Physics, it is not sufficient on its own to predict whether a physical process can occur spontaneously in nature: many processes are permitted because they do not violate conservation of energy, yet they cannot occur spontaneously. It is entropy that distinguishes between processes that can occur spontaneously in nature and those that cannot. Such processes can indeed occur, but not spontaneously; they require a flow of energy into the system to make them happen. The astrophysicist Sir Arthur Eddington considered entropy to be such a basic physical idea that he made the following frequently cited statement [1]: “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation – well, those experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics, I can offer you no hope; there is nothing for it but to collapse in deepest humiliation.”

Historical Perspective

The roots of entropy lie in thermodynamics, where it arises in the analysis of the efficiency of heat engines, work that originated early in the 19th century [2]. However, it was not until the 20th century that entropy began to appear as a fundamental tool of analysis in branch after branch of science and engineering. As early as 1803, Lazare Carnot initiated the analysis of the efficiency of fundamental machines such as pulleys and inclined planes. This work eventually led to the idea of transformation-energy, that is, energy lost to dissipation and friction; it is this transformation-energy that is now called entropy. The term entropy is attributed to the German physicist Rudolf Clausius, who introduced it in 1865 [3]. Entropy clarifies the distinction between reversible and irreversible processes, a distinction that has important ramifications not just in thermodynamics but also in many branches of engineering. As Quantum Mechanics was developed in the early years of the 20th century, von Neumann extended the concept of entropy to quantum systems using the density matrix, an entity roughly analogous to a probability distribution over phase space [4].

Implications of Entropy for Telecommunications

Entropy is a surprisingly deep and subtle concept, and for that reason there are perhaps more misconceptions and incorrect interpretations of entropy than of any other physical concept. It is intimately connected to the second law of thermodynamics, which can be stated in many forms, one of which is that heat always flows spontaneously from a hot object to a cold one [5]. An equivalent version of the second law, applicable to physical phenomena in which no heat flow takes place, is that the entropy of an isolated system never decreases [5]. Since the universe in its entirety is an isolated system, this implies that the entropy of the universe is increasing. Returning to the heat flow example, note that the flow happens only when there is a temperature gradient: as heat flows from the hot body to the cold one, the temperature of the former drops and that of the latter rises until the two are at the same temperature. At that point the two bodies are in thermal equilibrium and no further flow takes place. As another example, the energy content of a lake cannot be harnessed to produce useful work. But if the lake feeds a waterfall, useful work can then be extracted from the energy content of the water, say by having the water turn a turbine as it falls down the gradient between the top and the bottom of the waterfall. It is only in disequilibrium, a difference between energy levels, that useful work can be obtained [5]. Thus, we may regard entropy as the absence of disequilibrium: as disequilibrium decreases, entropy increases, reaching its maximum at equilibrium.

Lambert [6] has given one of the best and clearest discussions of entropy and its implications. He notes that the second law of thermodynamics says that “energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so.” Entropy quantitatively measures the extent to which energy has flowed from being localized to becoming more widely spread. Lambert observes that all spontaneous phenomena in the universe are examples of the second law because they involve energy dispersing. Cream mixing into coffee, the scent of perfume diffusing across a room, and the expansion of a gas are all phenomena in which the energy is initially confined to a small volume of phase space but the molecules eventually occupy a larger volume as the dispersal takes place.

Life presents a paradox because it requires an ordered arrangement of molecules, yet the second law predicts that things become more disordered over time. Nelson [7] explains this by noting that the second law applies to an isolated system, but the Earth is not isolated: it is part of a larger system that includes the sun and stars, and the entropy of a subsystem can therefore decrease provided that the entropy of the rest of the system increases enough to compensate, so that the entropy of the entire system still increases. Another perspective on entropy is that it describes the conversion of high-quality energy, capable of producing useful work, into low-quality energy from which no additional work can be obtained. Thus plants use high-quality solar energy to produce sugar, fat, and plant tissue through photosynthesis, and radiate some of it as waste heat [7]. Only a small part of the radiant energy is converted into useful work; the rest is radiated back by the Earth into space as low-quality energy, namely heat. The distinction between high-quality and low-quality energy can be made quantitatively precise as follows.
Scientists realized in the 19th century that thermal energy is the part of the total energy attributable to random molecular motion [7]. This random molecular motion is distinct from, say, the organized kinetic energy of a falling rock, and therein lies the distinction between high-quality and low-quality energy: the useful part of the total energy is the part that is not associated with the random motion of the molecules, that is, the part not associated with entropy. This idea leads to the Helmholtz free energy, defined by F = E − TS, where F is the free energy, E the total energy, T the temperature, and S the entropy at that temperature [7]. Spontaneous change at a fixed temperature will occur if the effect of the change is to reduce the system’s free energy; no spontaneous change will occur if the free energy is already at its minimum.
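As a minimal illustration of this criterion (a standard thermodynamic identity rather than a result specific to this editorial), the condition for spontaneous change at a fixed temperature can be written as

$$\Delta F = \Delta E - T\,\Delta S \le 0 \qquad (T \text{ constant}),$$

so a process with no change in total energy (ΔE = 0), such as the free expansion of a gas into a vacuum, can still proceed spontaneously because the accompanying increase in entropy (ΔS > 0) makes ΔF negative. This is the energy-dispersal picture of the second law stated quantitatively.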

Applications of Entropy in Other Sciences

In the psychology literature, entropy is widely used to measure uncertainty-related anxiety. The construct of psychological entropy is defined as the experience of conflicting perceptual and behavioral affordances [8]. This concept extends the evolutionary perspective, which suggests that living organisms survive by lowering their internal entropy while increasing the entropy of their external environment [9]. A related theory is that of self-organization, which suggests that organisms that cannot effectively dissipate entropy are destroyed [10]. Entropy has also been used as a measure of event discriminability in forced-choice tasks. For example, some studies have used Shannon’s entropy as a measure of discriminability [11], while others have used the principle of maximum entropy to explain psychological models of categorization [12]. Other studies have utilized entropy as a pattern variable to measure the extent to which individuals discriminate one stimulus from another in signal-detection experiments [13]. It is also important to note, however, that some studies have found the use of entropy in psychological research to be inappropriate, primarily because the stimuli of psychological experiments are often not interchangeable [14].

Neuroscience has also used the concept of entropy to analyze patterns of neural firing. For example, the entropy of a variable with a discrete probability distribution is calculated using the formula

$$H = -\sum_{i} p_i \log_2 p_i ,$$

where $p_i$ is the probability of the $i$-th possible firing rate.

If we assume that the neuron responds each time by firing at the same rate r, then the probabilities of all but one of the firing rates are zero, and the single remaining firing rate has probability one. Consequently, every term in the above equation is zero, because either $p_i = 0$ or $\log_2 1 = 0$. Now assume instead that the neuron can fire in only two possible ways, with rate $r_+$ or $r_-$. Since $p_{r_-} = 1 - p_{r_+}$, the entropy equation above can be rewritten as

$$H = -\,p_{r_+} \log_2 p_{r_+} - \bigl(1 - p_{r_+}\bigr) \log_2 \bigl(1 - p_{r_+}\bigr).$$

If we solve the first-order condition, we find that the entropy attains its maximum value of 1 bit when $p_{r_+} = p_{r_-} = 1/2$. More generally, it can be shown that, in the absence of data, the probability distribution with maximum entropy is the uniform distribution. A widely used concept for measuring information transfer in neuroscience is transfer entropy [15]. The information-theoretic functional used to define transfer entropy is conditional mutual information: the transition probabilities are conditioned so as to exclude influences that arise from shared information, such as a common history or common input signals. The resulting transfer entropy is thus able to separate driving and responding elements in a system. Recent developments in this area include approaches for detecting causal relationships in multivariate time series using multivariate transfer entropy [16].
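The following short Python sketch illustrates these points numerically; the particular probability values and the grid over the probability of firing at rate r+ are illustrative choices rather than values taken from this editorial.

```python
import numpy as np

def shannon_entropy(p):
    """Entropy in bits of a discrete distribution; terms with p = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(0.0 - np.sum(p[nz] * np.log2(p[nz])))

# A neuron that always fires at the same rate: one outcome has probability 1,
# so every term in the entropy sum is zero and the entropy is 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0]))                    # 0.0

# Binary firing (rates r+ and r-): entropy peaks at 1 bit when p(r+) = p(r-) = 0.5.
grid = np.linspace(0.0, 1.0, 101)
h = [shannon_entropy([p, 1.0 - p]) for p in grid]
print(grid[int(np.argmax(h))], max(h))                     # 0.5 1.0

# With no data or constraints, the uniform distribution maximizes entropy:
# for n equally likely outcomes the entropy is log2(n) bits.
n = 8
print(shannon_entropy(np.full(n, 1.0 / n)), np.log2(n))    # 3.0 3.0
```

The same entropy computation is the building block for quantities such as transfer entropy, which replace the single distribution with conditional distributions over the past and present states of two signals.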

Conclusions and Projections

Entropy is not only a powerful theoretical concept but also an eminently practical one, as evidenced by its application to so many diverse problems in multiple branches of the sciences and engineering. We are confident that, in the future, the entropy concept will continue to illuminate complex issues in many other disciplines in which it has not yet appeared.

References

Citation: Raman K, Viswanathan V (2015) Entropy and Telecommunications Systems. J Telecommun Syst Manage 4: e114. DOI: 10.4172/2167-0919.1000e114

Copyright: ©2015 Raman K, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
