Renaissance in-Depth
5.2.1 Clausius
Rudolf Clausius was a German physicist and mathematician who made seminal contributions to the field of thermodynamics. His work laid the groundwork for the formal and conceptual understanding of energy conservation, heat, and entropy.
First and Second Laws of Thermodynamics
Perhaps Clausius’s most significant achievement was his clear formulation of the first two laws of thermodynamics.
First Law (Conservation of Energy): Clausius was one of the key figures to articulate that energy, in a closed system, is conserved and cannot be created or destroyed. This is often phrased as “Energy can neither be created nor destroyed, only converted from one form to another.”
Second Law: He also played a central role in defining the second law, emphasizing that heat cannot spontaneously flow from a colder body to a hotter one.
Entropy
Clausius introduced the term “entropy” for a state function that describes the degree of disorder or randomness in a system. He postulated that the entropy of the universe tends to a maximum, laying the groundwork for future developments in the field. By providing a measure of the irreversible progression of natural processes, entropy implied a preferred direction of time, often called the “arrow of time”.
Clausius’s Inequality:
In mathematical terms, Clausius’s statement of the second law can be represented by the inequality:
\[ \oint \frac{δQ}{T} \leq 0 \]
Where \( δQ \) is the infinitesimal amount of heat received by a system and \( T \) is the absolute temperature of the body delivering \( δQ \). For a cycle where the system returns to its initial state, the integral of this quantity over the cycle is always less than or equal to zero, with equality holding only for a reversible cycle.
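To make the inequality concrete, here is a minimal numerical sketch of a two-reservoir cycle (the heat and temperature values are purely illustrative): the sum of \( δQ/T \) vanishes for a reversible Carnot cycle and is strictly negative once extra heat is dumped irreversibly.

```python
# Numerical check of the Clausius inequality for a two-reservoir cycle.
# Heat q_hot enters at t_hot; heat q_cold leaves at t_cold (signs taken
# from the system's point of view). All values are illustrative.

def clausius_integral(q_hot, t_hot, q_cold, t_cold):
    """Return the cyclic sum of delta-Q / T for a two-reservoir cycle."""
    return q_hot / t_hot - q_cold / t_cold

# Reversible (Carnot) cycle: q_cold/q_hot = t_cold/t_hot, so the sum is zero.
reversible = clausius_integral(q_hot=1000.0, t_hot=500.0,
                               q_cold=600.0, t_cold=300.0)

# Irreversible cycle: more heat is rejected to the cold reservoir than the
# Carnot ratio allows, so the sum is strictly negative.
irreversible = clausius_integral(q_hot=1000.0, t_hot=500.0,
                                 q_cold=700.0, t_cold=300.0)

print(reversible)    # 0.0
print(irreversible)  # negative
```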
Clausius-Clapeyron Equation:
Clausius, building on earlier work by Benoît Paul Émile Clapeyron, derived an equation that describes the phase transition between two phases of matter, such as the liquid-gas transition. The equation is given by:
\[ \frac{dP}{dT} = \frac{L}{T\Delta v} \]
Where:
– \( dP/dT \) is the rate of change of pressure with respect to temperature at the phase transition.
– \( L \) is the latent heat of the phase transition.
– \( Δv \) is the change in volume per unit mass during the phase transition.
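As a quick numerical application, the equation gives the slope of the vapor-pressure curve of water at its normal boiling point. The property values below are approximate textbook figures, used only for illustration.

```python
# Clausius-Clapeyron slope dP/dT for boiling water at 1 atm.
# Values are approximate textbook figures for illustration.

L = 2.26e6         # latent heat of vaporization of water, J/kg
T = 373.15         # boiling point at 1 atm, K
v_gas = 1.673      # specific volume of steam at 100 C, m^3/kg
v_liquid = 1.0e-3  # specific volume of liquid water, m^3/kg

dP_dT = L / (T * (v_gas - v_liquid))  # Pa per kelvin
print(f"dP/dT = {dP_dT:.0f} Pa/K")    # roughly 3.6 kPa per kelvin
```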
Beyond his mathematical contributions, Clausius’s work in thermodynamics had profound philosophical implications. His formalization of the laws brought clarity and precision to previously vague notions of heat, work, and energy, and his entropy concept brought a new understanding of time’s arrow and the inevitable increase of disorder in the universe.
5.2.2 Boltzmann
Ludwig Boltzmann was instrumental in the development of statistical mechanics and made profound contributions to the understanding of the second law of thermodynamics. Boltzmann’s approach to the second law using probability and statistics met with considerable resistance from some contemporaries. He faced criticisms from scientists who believed in a deterministic universe, as implied by classical Newtonian mechanics. His work, combining detailed kinetic models with statistical arguments, laid the foundation for the modern field of statistical mechanics, and his perspective on the probabilistic nature of the second law anticipated many elements of quantum mechanics.
Statistical Mechanics
Boltzmann’s principal achievement lies in his development of statistical mechanics. This is the branch of physics that relates the macroscopic properties of systems (like temperature and pressure) to the probabilistic behaviors of their microscopic constituents.
Boltzmann’s Entropy Formula:
One of his most famous contributions is the Boltzmann entropy formula:
\[ S = k \cdot \ln(W) \]
Where:
– \( S \) is the entropy of the system.
– \( k \) is Boltzmann’s constant, which is roughly \(1.38 \times 10^{-23} \, \text{J/K}\).
– \( W \) is the number of microstates corresponding to a given macrostate. Essentially, it represents the number of ways the particles of the system can be arranged to produce the macroscopic state the system is in.
This formula is engraved on his tombstone, signifying its importance.
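To see the formula at work, consider a toy model in which \( N \) distinguishable particles are split between the two halves of a box; the number of microstates for a macrostate with \( n \) particles on the left is the binomial coefficient \( W = \binom{N}{n} \). The sketch below (illustrative, not a specific physical system) shows that entropy is maximal for the evenly mixed macrostate and zero for the fully ordered one.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_left, n_total):
    """S = k ln W, where W = C(n_total, n_left) counts the microstates
    with n_left of n_total particles in the left half of a box."""
    W = math.comb(n_total, n_left)
    return k_B * math.log(W)

N = 100
s_even = entropy(50, N)     # evenly mixed macrostate: many microstates
s_ordered = entropy(0, N)   # fully ordered macrostate: W = 1, so S = 0

print(s_even > s_ordered)   # True
print(s_ordered)            # 0.0
```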
Boltzmann Equation:
Boltzmann also formulated the Boltzmann equation for the temporal development of distribution functions in phase space:
\[ \frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla_{\mathbf{x}} f + \frac{\mathbf{F}}{m} \cdot \nabla_{\mathbf{v}} f = Q(f,f) \]
Where \( f(\mathbf{x}, \mathbf{v}, t) \) is the distribution function of particles, \( \mathbf{v} \) is the particle velocity, \( \mathbf{F} \) is an external force, \( m \) is the particle mass, and \( Q(f,f) \) is the collision term. This equation describes the time evolution of a probability distribution function over phase space and is foundational in kinetic theory.
Second Law of Thermodynamics:
Boltzmann’s interpretation of the second law of thermodynamics, in terms of statistical mechanics, became a point of contention. He proposed that the second law is only statistically true; that is, while it’s highly probable for a system to evolve to a state of higher entropy, it’s not an absolute certainty.
H-Theorem:
The H-theorem, derived from the Boltzmann equation, provides a mathematical argument for the increase of entropy over time under certain conditions. The “H” function, defined in terms of the distribution function of the system, typically decreases over time, a trend corresponding to increasing entropy, since entropy is proportional to the negative of H.
In summary, Ludwig Boltzmann’s pioneering efforts in statistical mechanics provided a deep, probabilistic understanding of thermodynamics, challenging prevailing deterministic views of the time and paving the way for future developments in quantum mechanics and statistical physics.
5.2.3 James Clerk Maxwell
James Clerk Maxwell is one of the titans of physics, primarily remembered for his pioneering work in electromagnetism and thermodynamics. Here’s a comprehensive look at some of Maxwell’s significant contributions:
Maxwell’s Equations (Electromagnetism):
Maxwell’s most renowned contribution is the set of four differential equations, now known as Maxwell’s equations, that describe the behavior of electric and magnetic fields. These equations are:
Gauss’s Law for Electricity: \(\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}\)
– Relates electric charges to the electric field they produce.
Gauss’s Law for Magnetism: \(\nabla \cdot \mathbf{B} = 0\)
– Implies there are no magnetic monopoles in nature; magnetic fields are created by electric currents and changing electric fields.
Faraday’s Law of Induction: \(\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}\)
– Describes how a changing magnetic field induces an electric field.
Ampère’s Law with Maxwell’s Addition: \(\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}\)
– Links currents and the changing electric field to the magnetic field.
From these equations, Maxwell derived that electromagnetic waves would travel at the speed of light, thereby proposing that light itself is an electromagnetic wave.
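Maxwell’s derived wave speed can be checked in one line: the speed of electromagnetic waves is \( c = 1/\sqrt{\mu_0 \varepsilon_0} \). The sketch below plugs in the standard values of the vacuum constants.

```python
import math

# From Maxwell's equations, electromagnetic waves propagate at
# c = 1 / sqrt(mu_0 * epsilon_0). Check against the known value of c.
mu_0 = 4 * math.pi * 1e-7      # vacuum permeability, H/m (classical value)
epsilon_0 = 8.8541878128e-12   # vacuum permittivity, F/m

c = 1.0 / math.sqrt(mu_0 * epsilon_0)
print(f"c = {c:.0f} m/s")      # about 2.998e8 m/s
```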
Maxwell’s Demon:
In the realm of thermodynamics, Maxwell introduced a thought experiment known as “Maxwell’s demon.” It challenges the second law of thermodynamics by imagining a tiny demon who controls a door between two gas-filled chambers. By selectively allowing fast-moving molecules to pass one way and slow-moving molecules the other, the demon seems to reduce entropy without doing work, seemingly violating the second law. While the demon is fictional, it sparked crucial discussions about the relationship between information, entropy, and the fundamental laws of physics.
One of the primary debates spurred by Maxwell’s Demon is about the relationship between information and entropy. To make its selective decisions, the demon must observe and gather information about each molecule. Later understandings proposed that the act of acquiring, processing, and erasing this information has an associated thermodynamic cost. Rolf Landauer formalized this idea in the 1960s, with Landauer’s principle, which states that erasing one bit of information is associated with a minimum entropy increase, thus reconciling the demon’s actions with the second law.
Kinetic Theory of Gases:
Maxwell, along with Ludwig Boltzmann, developed the kinetic theory of gases. He derived the Maxwell-Boltzmann distribution, which describes the distribution of speeds for gas particles in a container at a particular temperature. The equation for the Maxwell-Boltzmann distribution is:
\[f(v) = \left( \frac{m}{2\pi k T} \right)^{3/2} \exp \left( -\frac{mv^2}{2kT} \right) 4\pi v^2\]
Where:
– \(f(v)\) is the probability density of molecular speeds, so that \(f(v)\,dv\) is the fraction of molecules with speeds between \(v\) and \(v + dv\).
– \(m\) is the molecular mass.
– \(k\) is Boltzmann’s constant.
– \(T\) is the absolute temperature.
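As a numerical sanity check on the distribution, the sketch below (for nitrogen at room temperature, with approximate molecular mass) verifies that the density integrates to 1 over all speeds and computes the most probable speed \( v_p = \sqrt{2kT/m} \), where \( f(v) \) peaks.

```python
import math

def maxwell_boltzmann(v, m, T, k=1.380649e-23):
    """Speed-distribution density f(v) for molecules of mass m at temperature T."""
    a = m / (2 * math.pi * k * T)
    return (a ** 1.5) * math.exp(-m * v * v / (2 * k * T)) * 4 * math.pi * v * v

m_N2 = 28 * 1.6605e-27  # approximate mass of an N2 molecule, kg
T = 300.0               # room temperature, K

# The density should integrate to 1 over all speeds (simple Riemann sum).
dv = 1.0
total = sum(maxwell_boltzmann(v, m_N2, T) * dv for v in range(0, 4000))
print(round(total, 3))  # close to 1.0

# Most probable speed: v_p = sqrt(2kT/m).
v_p = math.sqrt(2 * 1.380649e-23 * T / m_N2)
print(f"v_p = {v_p:.0f} m/s")  # a few hundred m/s
```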
Control Theory
Control theory, at its heart, is about understanding the dynamics of systems and finding ways to manipulate them to achieve desired outputs.
In the 1860s, Maxwell studied the newly invented centrifugal governor, a device used to regulate the speed of steam engines. In his analysis of the centrifugal governor, Maxwell formulated a criterion for stability. He realized that merely having feedback (like the mechanical feedback in a governor) wasn’t enough to ensure stability. Instead, the design and parameters of the system determined whether the feedback would stabilize or destabilize it. His insights on this topic were published in the paper “On Governors” in the Proceedings of the Royal Society in 1868. This paper is considered one of the foundational texts in control theory.
Color Perception
In the field of optics, Maxwell proposed the trichromatic theory of color vision, demonstrating that any color in the visible spectrum could be created by combining light of three different colors.
5.2.4 Helmholtz
Hermann von Helmholtz (1821-1894) was a German physician and physicist who made significant contributions in various scientific domains. Here’s a detailed exploration of some of his major works:
Conservation of Energy:
Helmholtz’s most celebrated contribution to physics is his formulation of the law of conservation of energy. In his 1847 treatise “On the Conservation of Force,” he posited that the total amount of energy in a closed system remains constant over time, though it can transform from one form to another. This principle unified various observations across thermodynamics, mechanics, and electromagnetism, asserting that energy, whether mechanical, chemical, thermal, or electrical, can neither be created nor destroyed.
Equation: \( \Delta E = 0 \) for a closed system, where \( \Delta E \) represents the change in total energy.
Free Energy
The concept of “free energy” refers to the energy in a system that is available to do useful work.
Helmholtz Free Energy (A or F): This is the energy associated with a system at constant temperature and volume. It’s defined as:
\[ A = U - TS \]
where \( U \) is the internal energy, \( T \) is the absolute temperature, and \( S \) is the entropy. When systems evolve naturally, they tend to minimize their Helmholtz free energy at constant temperature and volume.
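A tiny numerical sketch of this minimization principle, using made-up energy and entropy values rather than a specific physical system: of two candidate states at the same temperature, the one with lower \( A = U - TS \) is the thermodynamically favored one.

```python
def helmholtz_free_energy(U, T, S):
    """A = U - T*S: energy available to do work at constant T and V."""
    return U - T * S

# Illustrative numbers (J, K, J/K), not a specific physical system.
T = 300.0
state_1 = helmholtz_free_energy(U=5000.0, T=T, S=10.0)   # A = 2000 J
state_2 = helmholtz_free_energy(U=4800.0, T=T, S=10.5)   # A = 1650 J

# At constant T and V the system evolves toward the state of lower A,
# so here state_2 is the favored one.
print(state_1, state_2)  # 2000.0 1650.0
```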
Perception
Helmholtz believed that sensory perception involved unconscious inferences. When we perceive the world around us, we’re not just passively receiving information; our brains actively interpret and make predictions about this information based on prior experiences. This idea has been foundational in cognitive science, influencing our understanding of how the brain perceives and interprets sensory information. This idea suggests that our brains are constantly making predictions about the world and then updating those predictions based on sensory feedback, a concept deeply embedded in predictive coding theories of brain function.
5.2.5 Charles Darwin
Charles Darwin stands as one of the most influential figures in the history of science. His groundbreaking work on the theory of evolution by natural selection radically transformed our understanding of the natural world and humanity’s place within it. Here’s a deeper dive into his ideas, collaborators, influences, and the influence he exerted:
Evolution by Natural Selection
Darwin’s main contribution was the idea that species change over time due to the process of natural selection. In this process, individuals with advantageous traits are more likely to survive and reproduce. Over generations, these traits become more common within the population. Darwin proposed that all life on Earth descends from common ancestors. This means that closely related organisms share a more recent common ancestor than organisms that are less closely related.
Thomas Robert Malthus’ essay on population helped shape Darwin’s understanding of competition and the “struggle for existence” in nature. The idea that populations can grow faster than their resources led to Darwin’s understanding of natural selection. Darwin was also aware of Jean-Baptiste Lamarck’s idea of evolution, in which traits acquired during an individual’s lifetime are passed down to offspring. Although this mechanism differed from Darwin’s, Lamarck was among the first to propose a coherent mechanism for evolutionary change.
While working independently, Alfred Russel Wallace came to conclusions about natural selection similar to Darwin’s. Their simultaneous discoveries prompted the presentation of both their findings in 1858, a year before Darwin published “On the Origin of Species.” Wallace, in many ways, spurred Darwin to publish his findings, which he had been refining for years.
Darwin’s theory of evolution by natural selection is foundational to modern biology. It provides a unifying explanation for the diversity of life. In the early 20th century, the integration of Mendelian genetics with Darwinian natural selection resulted in the modern synthesis, which forms the foundation of evolutionary biology and our concept of heritable characteristics.
Social Darwinism
Although Darwin himself did not advocate for this, some individuals co-opted and misinterpreted his theories to support societal ideologies, suggesting that human races and classes were subject to the same competitive pressures as species in the wild. This misapplication has been criticized for justifying social inequalities and imperialism.
In sum, Charles Darwin’s contributions to science and thought are immeasurable. His careful observations, critical thinking, and revolutionary theories have had enduring impacts, shaping not just the scientific domain but also our broader cultural and philosophical landscapes.
5.2.6 Communism vs. Anarchism
Karl Marx, a philosopher, economist, and revolutionary, stands as one of the most influential thinkers of the 19th century. His work has had a profound impact on multiple disciplines, especially sociology, political science, and economics. Influenced by Hegel’s dialectic, in which ideas progress through thesis, antithesis, and synthesis, Marx turned this framework on its head: he saw the dialectical process as material, rooted in real, tangible historical developments.
Materialist Conception of History
Marx believed that the course of history is primarily determined by material conditions, particularly the mode of production. Societies evolve based on how they produce material goods and how these goods are distributed. The engine of this historical evolution is class struggle. At each stage of history, the dominant class (which controls the means of production) oppresses the subordinate class. This oppression and resulting conflict drive societal change.
Marx’s ideas were deeply influenced by the socio-economic landscape of the 19th century, particularly the First Industrial Revolution and prevailing theories of value. The Industrial Revolution brought about significant socio-economic changes, especially in the urban landscape. The shift from agrarian, craft-based economies to urban industrial production fundamentally changed the worker’s relationship with the product of their labor. In pre-industrial societies, artisans and craftsmen had a direct relationship with their creations. However, with the advent of factory-based production, workers became mere cogs in a vast machine, leading to Marx’s theory of alienation. That is, under industrial capitalism, workers are alienated from their work because they have no control over what they produce or how they produce it.
Economics
Although Marx analyzed capitalism critically, he was heavily influenced by the classical economists, especially Adam Smith and David Ricardo. These economists developed the labor theory of value, which posited that the value of a commodity was determined by the labor invested in its production. Building on the labor theory of value, Marx developed the concept of surplus value. He argued that capitalists paid workers less than the value of what they produced. This difference, which the capitalists kept as profit, was the “surplus value”. For Marx, this became a cornerstone of his critique of capitalism, evidencing the inherent exploitation of workers. Furthermore, under capitalism social relations are mediated through commodities, a dynamic he termed commodity fetishism. People relate to each other in terms of the goods they produce and exchange, obscuring the underlying social relations and exploitation.
Revolution and Communism
Marx posited that the economic base (mode of production) shapes the superstructure (societal institutions, culture, and ideologies). The dominant ideology in any society reflects the interests of the ruling class and works to perpetuate its dominance. He believed that the internal contradictions of capitalism would lead to its downfall. The proletariat, growing in numbers and becoming increasingly impoverished and alienated, would eventually overthrow the bourgeoisie. Post-revolution, a stateless, classless society, termed communism, would emerge. Production and distribution would be organized based on need, abolishing the prior exploitative class structures.
Anarchism is a political philosophy that opposes the existence of involuntary, coercive hierarchies, especially in the form of the state, and advocates for a society based on voluntary cooperation among free individuals. Here are explanations of the ideas of three main thinkers in the anarchist tradition:
Mikhail Bakunin
Collectivist Anarchism: Bakunin proposed a system in which workers would be organized into associations that manage the means of production and divide the product according to the labor contributed.
Anti-Authoritarianism: He emphasized a direct, revolutionary approach and was famously critical of Marx’s notion of the “dictatorship of the proletariat,” which he saw as a new form of tyranny.
State Critique: He believed that the state, regardless of its political form, would always oppress the individual. To Bakunin, liberation could only come from the abolition of the state.
Peter Kropotkin
Mutual Aid: Kropotkin saw cooperation as a primary force of evolution. In his book “Mutual Aid: A Factor of Evolution”, he argued that species survive not just due to competitive struggle, but more importantly, through cooperation.
Communist Anarchism: Kropotkin envisioned a society where goods are distributed based on needs, not on labor or merit. His idea was a society where the means of production are held in common and there’s free access to resources.
Decentralization: He believed that local communities should federate freely and operate based on communal decision-making, rather than being under a centralized authority.
Pierre-Joseph Proudhon
Property Critique: Proudhon is famously known for the statement, “Property is theft!” He believed that those who do not use or occupy property, but merely own it, steal from those who do the labor. However, he also differentiated between “private property” (large estates and sources of passive income) and “personal property” (one’s home, personal belongings).
Mutualism: This economic theory proposed that individuals and cooperative groups should trade their products in a market without profit, using “labor notes” reflecting hours of work as currency.
Federalism: Unlike some anarchists, Proudhon did not advocate for the complete abolition of all forms of government. Instead, he believed in a confederation of self-managing communities.
Each of these thinkers brought unique perspectives to the overarching philosophy of anarchism. While all rejected the state and coercive authority, they varied in their visions for how society should be organized in its absence.
5.2.7 Pareto
Vilfredo Pareto, an influential Italian economist and sociologist, made significant contributions that were underpinned by mathematical formulations.
Pareto Principle (80/20 Rule)
This principle emerged from Pareto’s observations on wealth distribution, where he noted a consistent imbalance between causes and effects, inputs and outputs. Although not an equation in the traditional sense, the principle represents a type of power law, with a mathematical underpinning represented generally as:
\[
y = ax^k
\]
Here, “y” and “x” are variables, and “a” and “k” are constants. In the context of the Pareto Principle, such a power law describes highly skewed distributions, capturing the observation that a small share of causes (roughly 20%) accounts for the majority of effects (roughly 80%).
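A Monte Carlo sketch of the 80/20 pattern: draw “wealth” values from a Pareto distribution with shape parameter \( \alpha \approx 1.16 \) (the value that yields roughly an 80/20 split) and measure the share held by the richest 20%. The parameters and sample size here are illustrative.

```python
import random

# Simulate Pareto-distributed wealth and compute the top-20% share.
random.seed(42)
alpha = 1.16  # shape parameter yielding roughly an 80/20 split
wealth = sorted((random.paretovariate(alpha) for _ in range(100_000)),
                reverse=True)

top_20_percent = wealth[: len(wealth) // 5]
share = sum(top_20_percent) / sum(wealth)
print(f"Top 20% hold {share:.0%} of total wealth")  # typically close to 80%
```

Because the distribution is heavy-tailed, the sample share fluctuates from run to run, but it clusters around the 80% figure Pareto observed.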
Pareto Efficiency/Optimality
Pareto efficiency, a concept from welfare economics, is defined by a condition rather than a single equation. A situation is Pareto optimal if no individual’s situation can be improved without making someone else’s situation worse. In mathematical terms, an allocation \(x\) among individuals with utility functions \(u_i\) is Pareto efficient if there is no feasible alternative allocation \(x'\) such that:
\[
u_i(x') \geq u_i(x) \ \text{for all } i, \quad u_j(x') > u_j(x) \ \text{for some } j
\]
That is, no reallocation could make at least one individual better off without making any other individual worse off.
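The comparison underlying Pareto efficiency can be coded directly. The sketch below, with made-up utility numbers for three individuals, checks whether one allocation Pareto-dominates another.

```python
# Minimal check for Pareto dominance between allocations, given each
# individual's utility under each allocation (illustrative numbers).

def pareto_dominates(u_a, u_b):
    """True if allocation A makes no one worse off than B and someone better off."""
    return (all(a >= b for a, b in zip(u_a, u_b))
            and any(a > b for a, b in zip(u_a, u_b)))

# Utilities of three individuals under three candidate allocations.
A = [3, 5, 4]
B = [3, 4, 4]   # dominated by A: person 2 is worse off, nobody is better off
C = [6, 2, 4]   # not comparable with A: person 1 gains, person 2 loses

print(pareto_dominates(A, B))  # True
print(pareto_dominates(A, C))  # False
print(pareto_dominates(C, A))  # False
```

An allocation is Pareto efficient within a feasible set when no other allocation in that set dominates it; A and C above are both undominated, showing that Pareto efficiency does not pick a unique “best” outcome.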
General Equilibrium Theory
Pareto expanded on Léon Walras’s work, contributing to the formulation of systems of equations that represent the equilibrium state of a whole economy. The fundamental equation representing market equilibrium for “n” markets can be conceptually represented as:
\[
\sum_{i=1}^{n} p_i (D_i - S_i) = 0
\]
Here, \(D_i\) and \(S_i\) are the demand and supply in market “i,” respectively, and \(p_i\) is its price. Each individual market clears when \(D_i = S_i\), and the relation above, known as Walras’s law, states that the price-weighted excess demands across all markets must sum to zero.
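A toy instance of market clearing: with linear demand \( D_i(p) = a_i - b_i p \) and supply \( S_i(p) = c_i p \), each market’s equilibrium price solves \( D_i = S_i \). The parameter values below are illustrative.

```python
# Clearing two independent markets with linear demand and supply.
# D(p) = a - b*p, S(p) = c*p; equilibrium where D = S, i.e. p = a/(b+c).

markets = [
    {"a": 100.0, "b": 2.0, "c": 3.0},   # market 1 (illustrative parameters)
    {"a": 60.0,  "b": 1.0, "c": 2.0},   # market 2
]

for i, m in enumerate(markets, start=1):
    p_star = m["a"] / (m["b"] + m["c"])   # equilibrium price
    q_star = m["c"] * p_star              # quantity traded at equilibrium
    excess = (m["a"] - m["b"] * p_star) - q_star  # excess demand, should be 0
    print(f"market {i}: p* = {p_star:.2f}, q* = {q_star:.2f}, "
          f"excess demand = {excess:.2f}")
```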
Pareto’s mathematical approaches in economics laid the groundwork for various modern economic theories and practices, integrating algebraic formulations with economic principles. However, it’s essential to understand that the actual implementation of these mathematical concepts in his work was often more qualitative and broad, aiming to provide a structured framework for analyzing economic systems.
5.2.8 Marginal Revolution
The Marginal Revolution was pivotal in transitioning from the classical economics perspective, which focused on the value derived from production costs, to a more modern perspective that emphasizes subjective valuation and marginal utility. These ideas came about as a means of resolving the diamond-water paradox.
The diamond-water paradox
The paradox was first articulated by the Scottish philosopher and economist Adam Smith in his seminal work, “An Inquiry into the Nature and Causes of the Wealth of Nations” (1776). In discussing the apparent contradiction between the high market value of diamonds and the low market value of water, despite the latter’s essential importance to life, Smith wrote:
“The things which have the greatest value in use have frequently little or no value in exchange; and, on the contrary, those which have the greatest value in exchange have frequently little or no value in use. Nothing is more useful than water: but it will purchase scarce anything; scarce anything can be had in exchange for it. A diamond, on the contrary, has scarce any value in use; but a very great quantity of other goods may frequently be had in exchange for it.”
Thus, while Smith identified and described the paradox, it was the later economists of the Marginal Revolution—particularly William Stanley Jevons, Carl Menger, and Léon Walras—who provided a solution to the paradox through the concept of marginal utility.
The resolution of this paradox is found in the concept of marginal utility, a cornerstone of the Marginal Revolution in economic thought.
Total Utility vs. Marginal Utility:
Total Utility is the total satisfaction or value derived from all units of a good consumed. Given its necessity, the total utility of water is incredibly high. Marginal Utility, by contrast, refers to the additional satisfaction or value derived from consuming one additional unit of a good.
Diminishing Marginal Utility:
If you have a large quantity of water, the utility or satisfaction of having one more unit (like an additional glass) is minimal since you already have enough to satisfy your most pressing needs. Conversely, if you have no diamonds and then get one, the increase in satisfaction (due to its rarity, beauty, or social status it confers) can be significant.
Market Price and Abundance:
The price of a good in the market doesn’t just reflect its total utility but its marginal utility. Since water is abundant in many places, the satisfaction derived from an extra unit is low, leading to a low price. Diamonds, being rare, have a much higher marginal utility and thus command a higher price.
To calculate the marginal utility of a good, you use the following formula:
\[
MU = \frac{\Delta TU}{\Delta Q}
\]
Where:
– \( \Delta TU \) is the difference in total utility between two levels of consumption.
– \( \Delta Q \) is the difference in the quantity of the good consumed between the two levels.
For a practical example, consider the utility gained from eating chocolate bars:
– If eating the first chocolate bar gives you a total utility of 10 units and eating the second one increases your total utility to 18 units, the change in total utility from eating the second bar is \( \Delta TU = 18 - 10 = 8 \) units.
– The change in quantity consumed (\( \Delta Q \)) is 1 chocolate bar.
– Using the formula, the marginal utility of the second chocolate bar is \( MU = \frac{8}{1} = 8 \) units.
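The same computation can be scripted over a whole total-utility schedule. The sketch below extends the chocolate-bar example with invented values for the third and fourth bars, making the diminishing pattern visible.

```python
# Computing marginal utility from a total-utility schedule.
# TU after 0, 1, 2, 3, 4 chocolate bars; the first two values match the
# text's example, the last two are invented for illustration.
total_utility = [0, 10, 18, 23, 25]

marginal_utility = [
    total_utility[q] - total_utility[q - 1]   # MU = dTU / dQ with dQ = 1
    for q in range(1, len(total_utility))
]
print(marginal_utility)  # [10, 8, 5, 2]: diminishing marginal utility
```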
5.2.9 Special Relativity
The development of special relativity was an intellectual journey that spanned several decades, with contributions from numerous physicists. However, the roles played by Poincaré, Lorentz, and Einstein are especially significant. Let’s trace the development of the theory through the lens of these contributors.
Background – Michelson-Morley Experiment
A foundational experiment in this context is the Michelson-Morley experiment in 1887. It sought to detect the “ether wind” as Earth moved through the luminiferous ether, the hypothetical medium through which light was thought to propagate. Unexpectedly, the experiment failed to detect any relative motion between Earth and the ether, suggesting that either the ether didn’t exist or that its effects were somehow being nullified.
Lorentz and the Ether
– Lorentz, influenced by the negative results of the Michelson-Morley experiment, proposed that moving bodies contract in the direction of motion — a phenomenon now known as “Lorentz contraction.”
– He developed the Lorentz transformations to describe how the space and time coordinates of an event would appear to observers in different inertial frames. The transformations are given by:
\[
t' = \gamma \left( t - \frac{vx}{c^2} \right)
\]
\[
x' = \gamma (x - vt)
\]
where \( t \) and \( x \) are time and space coordinates in one frame, \( t' \) and \( x' \) are coordinates in a relatively moving frame, \( v \) is the relative speed between the two frames, \( c \) is the speed of light, and \( \gamma \) is the Lorentz factor given by:
\[
\gamma = \frac{1}{\sqrt{1 - \frac{v^2}{c^2}}}
\]
Lorentz believed these transformations reconciled Maxwell’s equations (which describe electromagnetism) with the idea of an ether and the null result of the Michelson-Morley experiment.
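As a check on the transformation formulas, the sketch below applies them to an arbitrary event, viewed from a frame moving at 60% of light speed, and verifies that the spacetime interval \( s^2 = (ct)^2 - x^2 \) comes out the same in both frames (the event coordinates are illustrative).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_transform(t, x, v):
    """Coordinates (t', x') of an event in a frame moving at speed v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    t_prime = gamma * (t - v * x / C**2)
    x_prime = gamma * (x - v * t)
    return t_prime, x_prime

# An arbitrary event, viewed from a frame moving at 60% of light speed.
t, x = 2.0, 1.0e8
t_p, x_p = lorentz_transform(t, x, v=0.6 * C)

# The spacetime interval s^2 = (ct)^2 - x^2 is the same in both frames.
interval = (C * t) ** 2 - x ** 2
interval_prime = (C * t_p) ** 2 - x_p ** 2
print(math.isclose(interval, interval_prime, rel_tol=1e-9))  # True
```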
Poincaré’s Contributions
Henri Poincaré, a mathematician and physicist, recognized and discussed the implications of the Lorentz transformations. He emphasized the idea of “local time” and introduced the principle of relativity, stating that the laws of physics should be the same for all observers, regardless of their relative motion. Poincaré also noted the connection between mass and energy, hinting at the famous relation \( E=mc^2 \).
Einstein’s Special Relativity
In 1905, Albert Einstein published his paper on special relativity, “On the Electrodynamics of Moving Bodies.” Unlike Lorentz and Poincaré, Einstein didn’t base his theory on the existence of the ether. Instead, he started with two postulates: (i) the laws of physics are invariant (identical) in all inertial systems, and (ii) the speed of light in a vacuum is the same for all observers, regardless of their relative motion. Using just these postulates, Einstein derived the Lorentz transformations and several consequences of them. He also derived the relation between energy and mass, encapsulated in the famous equation:
\[
E = mc^2
\]
Einstein’s approach was different because it was based on these simple postulates rather than specific mechanical models or the existence of the ether. His theoretical framework fully incorporated time as a relative entity intertwined with space, leading to the concept of spacetime.
5.2.10 Bachelier
Louis Bachelier, a pioneering French mathematician, is best known for his early work in the theory of financial markets and the process of price formation in such markets. His most groundbreaking contribution was his doctoral thesis, titled “Théorie de la Spéculation” (Theory of Speculation), which he presented in 1900.
Random Walk Hypothesis
Bachelier is credited with introducing the idea that stock market prices follow a random walk. This means that the future price movement of a stock is independent of its past price movements. In mathematical terms, if \(P(t)\) is the stock price at time \(t\), the change in price over a small time interval \(\delta t\) can be represented as:
\[
\Delta P(t) = P(t + \delta t) - P(t)
\]
Bachelier assumed \(\Delta P(t)\) is a random variable with a normal distribution.
Brownian Motion
Bachelier was among the first to apply the concept of Brownian motion to stock price movements, predating Albert Einstein’s famous 1905 paper on the Brownian motion of particles. Brownian motion is a continuous-time stochastic process in which a particle (or, in this case, a stock price) moves in random directions over time. Mathematically, it can be represented by the Wiener process, denoted by \(W(t)\), where:
\[
dW(t) \sim N(0, dt)
\]
This denotes that the infinitesimal change \(dW(t)\) follows a normal distribution with mean 0 and variance \(dt\).
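The defining property \( dW(t) \sim N(0, dt) \) can be checked by simulation: summing many independent Gaussian increments of variance \( dt \) and confirming that the endpoint \( W(T) \) has mean near 0 and variance near \( T \). The step size and sample count below are illustrative.

```python
import random

# Simulating a discretized Wiener process: increments dW ~ N(0, dt).
# Across many sample paths, Var[W(T)] should approach T.
random.seed(0)
dt, n_steps, n_paths = 0.01, 100, 20_000   # so T = n_steps * dt = 1.0

finals = []
for _ in range(n_paths):
    w = 0.0
    for _ in range(n_steps):
        w += random.gauss(0.0, dt ** 0.5)  # each increment has variance dt
    finals.append(w)

mean = sum(finals) / n_paths
var = sum((f - mean) ** 2 for f in finals) / n_paths
print(f"mean = {mean:.3f}, variance = {var:.3f}")  # near 0 and 1.0
```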
Option Pricing
In his thesis, Bachelier also provided an early model for option pricing. While his model was not as refined or popular as the later Black-Scholes model, it laid essential groundwork for the field. He derived an equation for the value of a “call option” by analyzing the probable movement of stock prices.
While not immediately recognized in his time, Bachelier’s work gained significant attention and appreciation in the mid-20th century, particularly with the rise of the field of mathematical finance. His insights into the probabilistic nature of financial markets have become fundamental concepts in modern finance theory.
5.2.11 Riemann
Bernhard Riemann was a German mathematician known for his profound and wide-ranging contributions to mathematics.
Riemannian Geometry
This is perhaps what he’s best known for. Riemann proposed the idea of extending Euclidean geometry to spaces of any dimension, and the foundation of this idea lies in the Riemann curvature tensor. The key equation here is the metric tensor, which provides a way to measure distances in these generalized spaces:
\[ ds^2 = g_{ij} \, dx^i dx^j \]
where \( g_{ij} \) are the components of the metric tensor.
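As a concrete instance of this formula, consider the unit sphere in coordinates \( (\theta, \phi) \), whose metric has components \( g = \mathrm{diag}(1, \sin^2\theta) \). This choice of surface and coordinates is a standard illustrative example, not one discussed in the text; the sketch simply evaluates the double sum \( g_{ij}\, dx^i dx^j \):

```python
import math

# Line element on the unit 2-sphere: ds^2 = g_ij dx^i dx^j with
# coordinates (theta, phi) and metric g = diag(1, sin^2 theta).
# The surface and coordinates are an illustrative assumption.
def ds2(theta, d_theta, d_phi):
    g = [[1.0, 0.0],
         [0.0, math.sin(theta) ** 2]]
    dx = [d_theta, d_phi]
    return sum(g[i][j] * dx[i] * dx[j] for i in range(2) for j in range(2))

# A small step purely in phi at the equator (theta = pi/2):
# ds^2 = sin^2(pi/2) * d_phi^2 = d_phi^2.
print(ds2(math.pi / 2, 0.0, 0.01))
```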
Riemann Hypothesis
This is one of the most famous unsolved problems in mathematics and concerns the zeros of the Riemann zeta function:
\[ \zeta(s) = 1^{-s} + 2^{-s} + 3^{-s} + \cdots \]
The hypothesis asserts that all non-trivial zeros of the zeta function have real part equal to 1/2.
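The series above converges only for \( \mathrm{Re}(s) > 1 \) (the zeros in the hypothesis live on the analytic continuation), but in that region it can be summed directly. A quick sanity check against the classical value \( \zeta(2) = \pi^2/6 \):

```python
import math

def zeta_partial(s, terms=100_000):
    """Partial sum of the Dirichlet series; valid approximation for Re(s) > 1."""
    return sum(n ** -s for n in range(1, terms + 1))

print(zeta_partial(2))      # approaches pi^2 / 6
print(math.pi ** 2 / 6)
```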
Cauchy-Riemann Equations
Though more often credited to Cauchy, Riemann also worked on these equations, which characterize holomorphic functions (complex differentiable functions). The equations are:
\[ \frac{\partial u}{\partial x} = \frac{\partial v}{\partial y} \]
\[ \frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x} \]
where \( u(x,y) \) and \( v(x,y) \) are the real and imaginary parts of a complex function \( f(z) = u + iv \).
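The equations can be checked numerically for any holomorphic function. Here is a sketch for \( f(z) = z^2 \), whose real and imaginary parts are \( u = x^2 - y^2 \) and \( v = 2xy \); the test point and step size are arbitrary choices:

```python
# Numerically verify the Cauchy-Riemann equations for f(z) = z^2.
def u(x, y): return x * x - y * y   # real part of z^2
def v(x, y): return 2 * x * y       # imaginary part of z^2

h = 1e-6
x0, y0 = 1.3, -0.7  # arbitrary test point

# Central-difference approximations to the partial derivatives.
du_dx = (u(x0 + h, y0) - u(x0 - h, y0)) / (2 * h)
du_dy = (u(x0, y0 + h) - u(x0, y0 - h)) / (2 * h)
dv_dx = (v(x0 + h, y0) - v(x0 - h, y0)) / (2 * h)
dv_dy = (v(x0, y0 + h) - v(x0, y0 - h)) / (2 * h)

print(du_dx, dv_dy)    # should agree:  du/dx = dv/dy
print(du_dy, -dv_dx)   # should agree:  du/dy = -dv/dx
```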
Riemann Surfaces
A Riemann surface is a one-dimensional complex manifold. This essentially means that, locally (in the vicinity of any point), a Riemann surface looks like the complex plane, but globally, its structure can be much more complicated.
One motivation for introducing Riemann surfaces was to understand multi-valued functions. For instance, the square root is multi-valued: \( \sqrt{4} \) can be 2 or -2. To handle this, we can create a Riemann surface called a “double cover” of the complex plane, where each point carries both values of the square root.
Complex Plane: This is the simplest Riemann surface. Every point has a unique complex number associated with it.
Riemann Sphere: Imagine taking the complex plane and adding a point at infinity, turning it into a sphere. This surface provides a compact way of representing the entire complex plane.
Torus: A torus can be viewed as a Riemann surface, generated by identifying opposite edges of a rectangle in the complex plane.
As one encircles a branch point, the function value might switch from one branch to another. This phenomenon, where the function’s value changes as you go around a loop, is known as monodromy. Riemann surfaces play a crucial role in various areas:
– They allow the theory of holomorphic functions to be extended to multi-valued functions.
– Complex algebraic curves can be viewed as Riemann surfaces.
– The study of elliptic curves, which are a type of Riemann surface, has deep implications in number theory, most famously in Andrew Wiles’s proof of Fermat’s Last Theorem.
– String theory, a framework attempting to unify all forces of nature, is deeply tied to the mathematics of Riemann surfaces.
Riemann’s ideas, especially in geometry, were way ahead of his time and provided the mathematical underpinning for General Relativity, among other things. His work has continued to be foundational in multiple areas of mathematics.
5.2.1 Pascal and Fermat
The Problem of Points and the Development of Probability Theory
Two players, A and B, are playing a game where the first to win a certain number of rounds will win the entire pot. They are forced to stop the game before either has won, and the question is how to fairly divide the stakes.
The “problem of points” that Pascal tackled in his correspondence with Fermat did not involve the formulation of a single specific equation as we might expect today. Instead, they approached the problem with a logical and combinatorial method to determine the fairest way to divide stakes in an unfinished game of chance. Using this logical method, Pascal and Fermat provided a foundation for the modern concept of probability. It’s worth noting that this combinatorial approach, which focused on counting favorable outcomes, was revolutionary for its time and paved the way for the systematic study of probability.
To illustrate their method, consider a simplified version of the problem:
Suppose A needs 2 more wins to clinch the game and B needs 3 more wins. They want to split the pot based on their chances of winning from this point.
Pascal and Fermat’s solution
1. Enumerate all possible ways the game could end: This involves all the combinations of wins and losses that lead to one of the players winning. In the above example, this could be WW (A wins the next two), WLW (A wins two out of the next three with one loss in between), LWLW, and so on.
2. Count favorable outcomes for each player: In the above scenario, listing all the ways the remaining rounds could play out (A needing 2 wins, B needing 3), you will find more sequences in which A wins than in which B wins.
3. Divide the stakes proportionally based on these counts: If, for example, the counts are 3 combinations where A wins and 2 where B wins, then A should receive 3/5 of the pot, and B should receive 2/5.
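The three steps above can be carried out exhaustively. Fermat’s key device was to imagine all the remaining rounds being played out to a fixed length (here 2 + 3 − 1 = 4 rounds), so that every sequence is equally likely; for A needing 2 wins and B needing 3, this gives A 11/16 of the pot. A minimal sketch:

```python
from itertools import product
from fractions import Fraction

def fair_split(a_needs, b_needs):
    """Divide the pot by enumerating every equally likely continuation,
    with all remaining rounds imagined played out (Fermat's device)."""
    rounds = a_needs + b_needs - 1   # the game is decided within this many
    a_count = sum(1 for seq in product("AB", repeat=rounds)
                  if seq.count("A") >= a_needs)
    total = 2 ** rounds
    return Fraction(a_count, total), Fraction(total - a_count, total)

print(fair_split(2, 3))  # A needs 2 more wins, B needs 3
```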
Blaise Pascal
Pascal was born in Clermont-Ferrand, France. His exceptional mathematical abilities were evident from a young age. Homeschooled by his father, a mathematician, Pascal began making significant contributions to mathematics while still a teenager.
Mathematics
Pascal’s Triangle: One of Pascal’s early works was his construction of the eponymous triangle. It can be defined as follows:
– Every number is the sum of the two numbers directly above it.
– The outer edges of the triangle are always 1.
![Pascal’s Triangle](https://wikimedia.org/api/rest_v1/media/math/render/svg/25b25f443121f3a2a7c6c36a52e70f8c835c63d4)
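The two defining rules translate directly into a short construction; each new row is 1, then pairwise sums of the row above, then 1:

```python
def pascals_triangle(n_rows):
    """Build the first n_rows of Pascal's triangle from its defining rules:
    edges are 1, and each interior entry is the sum of the two above it."""
    rows = [[1]]
    for _ in range(n_rows - 1):
        prev = rows[-1]
        rows.append([1] + [prev[i] + prev[i + 1] for i in range(len(prev) - 1)] + [1])
    return rows

for row in pascals_triangle(5):
    print(row)
```

The entries of row \(n\) are the binomial coefficients, which is why the triangle reappears in Pascal and Fermat’s counting arguments about games of chance.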
Physics and Engineering
Pascal’s Law: In fluid mechanics, Pascal articulated that in a confined fluid at rest, any change in pressure applied at any point is transmitted undiminished throughout the fluid. A closely related hydrostatic relation gives the pressure difference between two heights in the fluid: ΔP = ρgΔh, where ΔP is the pressure difference, ρ is the fluid density, g is gravitational acceleration, and Δh is the height difference.
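A quick numerical sketch of the hydrostatic relation, using illustrative values for water (the fluid and depth are assumptions, not from the text):

```python
# Hydrostatic pressure difference: ΔP = ρ g Δh.
rho = 1000.0     # fluid density in kg/m^3 (water, assumed)
g = 9.81         # gravitational acceleration in m/s^2
delta_h = 10.0   # height difference in metres (assumed)

delta_p = rho * g * delta_h
print(f"Pressure change over {delta_h} m of water: {delta_p:.0f} Pa")
```

Note the unit of the result: the pascal (Pa), named for him, as mentioned below.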
The Pascaline: Pascal’s mechanical calculator was designed to perform addition and subtraction. The operation of carrying was simulated using gears and wheels.
Philosophy and Theology
Pascal is best known for his theological work, “Pensées.” In it, he reflects on the human condition, faith, reason, and the nature of belief. Pascal’s philosophy grapples with the paradox of an infinite God in a finite world. Central to his thought is “Pascal’s Wager,” a pragmatic argument for belief in God. Instead of offering proofs for God’s existence, the Wager presents the choice to believe as a rational bet: if God exists and one believes, the eternal reward is infinite; if one doesn’t believe and God exists, the loss is profound. Conversely, if God doesn’t exist, the gains or losses in either scenario are negligible. Thus, for Pascal, belief was the most rational gamble.
Blaise Pascal’s foundational work in mathematics and physics, notably in probability theory and fluid mechanics, continues to influence these fields today. His philosophical and theological musings in the “Pensées” have secured his place among the prominent thinkers in Christian apologetics. The unit of pressure in the International System of Units (SI), the pascal (Pa), commemorates his contributions to science.
Pierre de Fermat
Pierre de Fermat was a 17th-century French lawyer who, despite not being a professional mathematician, made significant contributions to various areas of mathematics. Here are some of his notable achievements, along with relevant specifics and equations:
Number Theory
Fermat’s Little Theorem: This theorem underlies many primality tests in number theory. It states:
\[ a^{p-1} \equiv 1 \pmod{p} \]
where \( p \) is a prime number and \( a \) is an integer not divisible by \( p \).
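The theorem gives a one-sided primality check: if \( a^{n-1} \not\equiv 1 \pmod{n} \) for some base \( a \), then \( n \) is certainly composite. A minimal sketch of this Fermat test (a modern application, not Fermat’s own procedure):

```python
def fermat_check(n, a=2):
    """One-base Fermat test: False means n is certainly composite;
    True means n passed for base a. Some composites (Fermat
    pseudoprimes, e.g. 341 for base 2) still pass."""
    if n < 2:
        return False
    if n == a:
        return True
    if n % a == 0:
        return False
    return pow(a, n - 1, n) == 1

print(fermat_check(97), fermat_check(91))  # prime passes, composite fails
```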
Fermat’s Last Theorem: This is perhaps the most famous result attributed to Fermat, mainly because of the 358 years it took to prove it. Fermat stated without proof:
\[ x^n + y^n \neq z^n \]
for any positive integers \( x, y, \) and \( z \) when \( n \) is an integer greater than 2. The theorem remained unproven until 1994, when it was finally proven by Andrew Wiles.
Analytic Geometry
– Fermat, along with René Descartes, is considered a co-founder of analytic geometry. This branch of mathematics uses algebra to study geometric properties and define geometric figures. He introduced the method of finding the greatest and the smallest ordinates of curved lines, which resembles the methods of calculus.
Calculus
Fermat is often credited with early developments that led to infinitesimal calculus. He used what would become differential calculus to derive equations of tangents to curves, and his method for locating maxima and minima of curves anticipated the later idea of setting a derivative to zero.
Optics
Fermat developed the principle, now called Fermat’s principle, that the path taken by a ray of light between two points is the one that can be traversed in the least time.
Pierre de Fermat’s contributions have had long-lasting impacts, particularly in number theory. The mathematical community spent centuries proving many of his theorems and conjectures, most famously Fermat’s Last Theorem. His work in analytical geometry, calculus, and optics has been foundational to the development of modern mathematics and science.