REDUCTION IN PHYSICS AND BIOLOGY

International Workshop in Philosophy of Physics and Philosophy of Biology

Instituto de Filosofía y Ciencias de la Complejidad (IFICC), Santiago, Chile
January 4‐6, 2016

IDEA AND MOTIVATION

This workshop will bring together philosophers of physics and philosophers of biology with the goal of analyzing, from an interdisciplinary perspective, issues concerning emergence and reduction. Can the behavior of complex biological or physical systems be explained in terms of the behavior of their parts? What explains the universality of power-law behavior? Does the fitness of a specific type reduce to the fitness of the individuals of that type? Do phase transitions represent a challenge for reductionism? Is mass an emergent property? Is the Nagelian model an adequate account of inter-theoretic reduction? What role do highly idealized models play in scientific explanations? What are the arguments for an ontological pluralist view? Can all sciences be reduced to physics? These are some of the questions that will be addressed in this meeting.

SPEAKERS

Stephan Hartmann holds the Chair of Philosophy of Science, is an Alexander von Humboldt Professor, and is Head of the Munich Center for Mathematical Philosophy at LMU Munich. Before that he taught at Tilburg University, the London School of Economics, and the University of Konstanz. He has held visiting appointments at the University of California, Irvine and Lund University and was a Visiting Fellow at the Center for Philosophy of Science at the University of Pittsburgh. He is President of the European Philosophy of Science Association (EPSA) and of the European Society for Analytic Philosophy (ESAP). For more information, visit his webpage: http://stephanhartmann.org
Aidan Lyon completed his PhD (2009) in philosophy at the Australian National University. He is currently Assistant Professor of Philosophy at the University of Maryland, College Park, and will be visiting the MCMP for six months each year (October–February) until 2017 on a Humboldt Fellowship. Aidan is also a Research Fellow at the Centre of Excellence for Biosecurity Risk Analysis (CEBRA) at the University of Melbourne. For more information visit his website: http://aidanlyon.com
Ximena González-Grandón is a postdoctoral researcher in philosophy of biology at IFICC. She earned her PhD in philosophy of cognitive science at the National Autonomous University of Mexico (UNAM) and the University of the Basque Country (UPV/EHU). She is currently Assistant Professor of Philosophy of Medicine and Bioethics at UNAM. Her main research interests are in philosophy of biology, philosophy of cognitive science, and philosophy of medicine. She is also a physician.
Olimpia Lombardi is an associate professor at the University of Buenos Aires and a principal researcher at the Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET). She earned her PhD in philosophy from the University of Buenos Aires. Her main research interests are in philosophy of physics, philosophy of chemistry, and philosophy of biology, with several publications in all of those areas. For more information visit her website: http://www.filoexactas.exactas.uba.ar/olimpia/olimpia.htm
Patricia Palacios is a doctoral fellow at the Munich Center for Mathematical Philosophy at LMU Munich and a member of IFICC. Her current research interests are in general philosophy of science and philosophy of physics, with a particular interest in the philosophy of statistical mechanics. For more information visit her website: http://www.mcmp.philosophie.uni-muenchen.de/people/doct_fellows/palacios_patricia/index.html
Samuel Fletcher is an assistant professor at the University of Minnesota and is also affiliated with the MCMP. He obtained his PhD in philosophy from the University of California, Irvine, where he held a National Science Foundation Graduate Research Fellowship (2010–2014). His current research interests are in the foundations of physics and statistics, as well as broader issues in the philosophy of science. For more information visit his website: http://samuelcfletcher.com
Federico Benitez is a postdoctoral researcher at the Physics Institute of the University of Bern in Switzerland, having previously spent several years at the Max Planck Institute for Solid State Research in Stuttgart, Germany. He earned his PhD in statistical mechanics (with an emphasis on renormalization group methods) at the University of Paris 6. He is from Montevideo, Uruguay.
Davide Vecchi is an FCT Research Fellow at the Centre for Philosophy of Sciences (CFCUL) of the Faculty of Sciences of the University of Lisbon. He obtained his Ph.D. from the Department of Philosophy, Logic and Scientific Method of the London School of Economics and Political Science (United Kingdom). His current research interests are in theoretical and philosophical issues in developmental and evolutionary biology. For more information visit http://cfcul.fc.ul.pt/equipa/dvecchi.php
Wilfredo Quezada is an associate professor at the Universidad de Santiago, where he is the current Director of the Philosophy Department. He obtained his Ph.D. from King’s College London. His current research interests are in philosophical logic and the metaphysics of causality. For more information visit https://www.causalidadusach.cl
Maurizio Esposito is an associate professor at the Universidad de Santiago. He obtained his Ph.D. at the University of Leeds. His areas of research are the history and philosophy of science, especially of the life sciences. For more information visit https://usach.academia.edu/MaurizioEsposito
Diego Romero Maltrana is an associate professor of physics at the Pontificia Universidad Católica de Valparaíso and a member of the Instituto de Filosofía y Ciencias de la Complejidad (IFICC). He earned his PhD in physics from the Pontificia Universidad Católica de Chile and also holds an MSc in Philosophy of Physics from the University of Oxford. His recent research in philosophy of physics includes the analysis of the equivalence of mass and energy proposed by Albert Einstein, the Higgs boson, and the relation between symmetries and conserved quantities. For more information visit http://ificc.cl/content/diego-romero-matrana
Pablo Razeto is the director of the Instituto de Filosofía y Ciencias de la Complejidad (IFICC). He obtained his Ph.D. in Ecology and Evolutionary Biology from the University of Chile. His main areas of research are theoretical biology, mathematical models in evolutionary biology, general philosophy of science, and philosophy of biology. For more information visit http://www.ificc.cl/content/pablo-razeto-barry
José Tomás Alvarado is an associate professor at the Institute of Philosophy of the Pontifical Catholic University of Chile in Santiago. He obtained his Ph.D. at the University of Navarra. His areas of research are philosophy of language and analytic metaphysics. For more information visit: http://filosofia.uc.cl/Academicos/alvarado-marambio-jose-tomas
Ramiro Frick is a doctoral fellow at Alberto Hurtado University and is also a member of IFICC. His main areas of research are general philosophy of science, philosophy of biology, and philosophy of cognitive science. For more information visit: http://www.ificc.cl/content/ramiro-frick-andrade
Brad Weslake is an associate professor in philosophy at New York University Shanghai. He earned his PhD in philosophy from the University of Sydney. His research interests are in philosophy of science, especially causation and explanation, and in related topics in philosophy of physics, philosophy of biology, and philosophy of mind. For more information visit his website: https://bweslake.org
Michael Strevens is an associate professor in philosophy at New York University. He obtained his PhD in philosophy from Rutgers University. His main research interests are in philosophy of science and philosophical applications of cognitive science, as well as in philosophy of physics, philosophy of biology, and formal epistemology. He has published widely in these areas and is the author of three books: Tychomancy, Depth, and Bigger than Chaos. For more information visit his website: https://www.strevens.org

ABSTRACTS

Stephan Hartmann (Munich Center for Mathematical Philosophy/LMU Munich)
Understanding Toy Models

Toy models are highly idealized and extremely simple models. Although they are omnipresent across virtually all scientific disciplines, toy models are a surprisingly under-appreciated subject in philosophy of science. The main philosophical puzzle regarding toy models is that the epistemic goal of toy modeling remains an unsettled question. One interesting and promising proposal for answering this question is the claim that the primary function of toy models is to provide individual scientists with understanding. The aim of this talk is to articulate and defend this claim precisely. The talk is based on joint work with Dominik Hangleiter and Alexander Reutlinger.

 

Maurizio Esposito (Universidad de Santiago de Chile)
Reductionism and action: composing and decomposing the life phenomena

One of the traditional critiques leveled against reductionist approaches in biology is that the analysis of basic components does not provide a satisfying description of living phenomena. This presentation does not have the ambition to challenge this critique. Rather, the talk aims to assess a supposition generally shared by advocates and critics of reductionist approaches: the idea that these approaches amount to adequate or inadequate representations of living processes. Instead, with the help of historical and contemporary examples, it will be argued that the conflict lies in a contradiction between two different epistemic values: that of objectively representing a phenomenon and that of producing and controlling new phenomena.

 

Ximena González-Grandón (IFICC-UNAM)
Emergent novelties: a naturalistic proposal for music learning

Psychological novelty has been a puzzle in cognitive science. According to a reductionist tradition (Koestler, 1964; Finke et al., 1992; Fauconnier and Turner, 2002), new ideas, beliefs, or knowledge (practical and propositional) are thought to originate from a logical relationship, whereby rational novelty can be logically deduced from already existing ideas, beliefs, and knowledge. Music cognition follows the same trend: novelty is taken to come from a re-combination of existing elements (Weinberg et al., 2008; Bäckman and Dahlstedt, 2008). In this workshop contribution, an epistemological viewpoint is proposed in order to integrate emergent and self-organized thought with studies on learning new musical abilities. In particular, it is argued that, from the perspective of a relational and dynamical epistemology, emergent and self-organized activity patterns are necessary conditions for novelty to arise and to become part of the human agent’s repertoire of skills. Elaborating on recent embodied music cognition literature, it is suggested that ontogenetic development plays a crucial role as a causal variable in the acquisition of practical and psychological novelty in musicians and non-musicians. Finally, an operational notion of emergent novelty will be introduced that avoids a reductionist explanation.

 

Michael Strevens (New York University)
Reduction, Dependence, and the Sciences of Complexity

One aim of reductionist science is to explain high-level regularities, and in particular, to explain why some complex systems display enough stability or simplicity to be amenable to the usual methods of empirical inquiry — model building, statistical analysis, causal explanation, and so on. This paper describes two different forms that a reductionist explanation of high-level simplicity in a complex system might take. One — suitable for some high-level economic phenomena — hinges on the independence (in certain respects) of the behavior of the system’s parts. The other — suitable for various physical and biological systems — hinges on the parts’ extreme degree of interdependence. A question naturally arises: what about systems in which the parts’ behavior exhibits intermediate levels of dependence? I suggest that in such cases, we may not find the simplicity and stability necessary for high-level science to succeed. This may explain the many difficulties encountered by scientific modelers in the social sciences.

 

Olimpia Lombardi (Universidad de Buenos Aires)
Ontological plurality, reduction, and two forms of emergence

I will begin by considering the problem of the relationship between physics and chemistry from the anti-reductionist perspective given by an ontological pluralism with Kantian roots. From this perspective I will analyze a neo-reductionist approach and its treatment of bridge laws, in particular those that do not express identity relations. This task will lead me to consider emergence and its essential features. On this basis, I will distinguish between intratheory emergence, in which the emergent and the base properties correspond to states defined in the context of the same theory, and intertheory emergence, which is a relationship between properties theoretically and empirically characterized in the context of different theories. I will accept the first form of emergence but will argue against the second one from an ontological pluralist perspective.

 

Patricia Palacios (Munich Center for Mathematical Philosophy/IFICC)
The role of approximation in a reductive model for phase transitions

Phase transitions, roughly understood as sudden changes in the phenomenological properties of a system, have recently motivated a debate about reduction and emergence in the physical sciences. In this debate there are two main positions: (i) phase transitions are paradigmatic cases of emergent or irreducible behavior (Lebowitz 1999, Batterman 2000, 2002, Bangu 2011); (ii) phase transitions represent a successful case of Nagelian reduction (Butterfield 2011, Menon and Callender 2011, Norton 2012). This leads one to conceive of the discussion in the following terms: phase transitions are either non-reductive phenomena or reductive phenomena satisfying the Nagelian model of reduction. In this paper I will suggest that this dichotomy is misleading. In fact, there are good reasons for considering phase transitions as a case of reduction that does not satisfy the Nagelian model of reduction, in either its strict or its more liberal versions.
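
A piece of standard background helps locate the technical issue here (a textbook point, not a claim specific to this talk): in statistical mechanics a phase transition is identified with a non-analyticity of the free energy, yet for any finite system the canonical partition function

Z_N(\beta) = \sum_i e^{-\beta E_i}

is a finite sum of analytic functions of the inverse temperature \beta, so the free energy F_N = -\beta^{-1} \ln Z_N is analytic and no genuine singularity can occur. Non-analyticities, and hence phase transitions in the strict sense, appear only in the thermodynamic limit N \to \infty. Whether this appeal to an infinite idealization blocks or licenses a reductive account is what the debate cited above turns on.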

 

Samuel Fletcher (University of Minnesota/Munich Center for Mathematical Philosophy)
Limits of Nagelian Reduction

This presentation concerns the question of whether limiting-type reductions (what Nickles (1973) calls reduction₂) can be accommodated in the Generalized Nagel-Schaffner (GNS) framework (Dizadji-Bahmani et al., 2010) for intertheoretic reduction, which has been argued to avoid most of the problems leveled at the frameworks originally considered by Nagel (1961) and Schaffner (1967). The talk consists of two parts. The first considers a problem for this accommodation arising from the essentially deductive nature of the relationship that the GNS framework posits between theories: exhibiting a limiting relationship between theories simply does not fit this mold. The second considers a possible response to this problem using a powerful theorem from the theory of uniform spaces. However, I conclude that this offers only a grossly attenuated sense in which limiting-type reductions can be described in the GNS framework.
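
To fix ideas, a stock example of a limiting-type reduction (offered only as an illustration, not necessarily the case study of the talk) is the recovery of classical momentum from relativistic momentum in the low-velocity limit:

p = \frac{mv}{\sqrt{1 - v^2/c^2}} = mv\left(1 + \frac{v^2}{2c^2} + \dots\right) \longrightarrow mv \quad \text{as } v/c \to 0.

The difficulty raised in the first part of the talk is that exhibiting such a limit is not, on the face of it, a deduction of one theory's laws from the other's, which is the kind of relationship the GNS framework requires.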

 

Davide Vecchi (CFCUL University of Lisbon/Universidad de Santiago de Chile/IFICC) and Wilfredo Quezada (Universidad de Santiago de Chile)
Of molecules, fields and cells

In this contribution we shall focus on reductive and non-reductive explanations in developmental biology. One instructive example of reductionism is computational embryology. Its basic idea is that embryological output is merely a function of the macromolecular input represented by the components of the embryo (i.e., nucleic acids and proteins). We shall then focus on criticisms of molecular reductionism positing field causation. Field theorists argue that the same molecular basis can produce different developmental outcomes in different developmental contexts; thus, the difference maker in causal terms is the “morphogenetic field”. We shall then propose that there might be a deep commonality between the two approaches: while in the first cells become redundant causal intermediaries between gene expression and morphogenesis, in the second cells become redundant causal intermediaries between field instruction and morphogenesis. Thus, the fundamental question becomes: do cells play an autonomous causal role in development?

 

Aidan Lyon (University of Maryland/Munich Center for Mathematical Philosophy)
Maximum Entropy Explanations in Biology

There are many robust and simple patterns in biology that arise out of the aggregation of a myriad of chaotic and complex processes. A phenotypic trait such as height is the outcome of a hugely complex array of interactions between genes and the environment. However, as Galton (1889) and many others have noticed, such traits often exhibit a very simple pattern: they are often normally distributed. Similarly simple patterns occur throughout nature: the log-normal, power law, and exponential distributions, to name just a few.

Recent work, particularly by Frank (2009), has shown that these distributions can, in some sense, be explained by their maximum entropy properties. Frank argues that by maximising the entropy of a probability distribution subject to a few informational constraints, one obtains a framework that neatly unifies and explains many of the robust and simple patterns that we observe in biology. However, it is not at all clear how entropy and its maximisation can explain anything in nature. Entropy, in this context, is usually understood in terms of the information, or lack of information, that some ideal epistemic agent has (e.g., Frank 2009; Jaynes 2003). But how can the information that some agent has explain why, for example, heights are normally distributed? Intuitively, it can't. In this paper, I survey a number of interpretations of the entropy of a probability distribution and examine how they can be embedded in a theory of explanation so that we can make sense of such maximum entropy explanations in biology.
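
To see the kind of result being invoked, here is a minimal sketch of the maximum entropy machinery (standard textbook material, not text from the abstract): maximizing the differential entropy

H[p] = -\int p(x)\,\ln p(x)\,dx

subject only to normalization and to a fixed mean \mu and variance \sigma^2 yields the normal distribution

p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),

while fixing only the mean of a non-negative quantity yields the exponential distribution p(x) = \mu^{-1} e^{-x/\mu}. Frank's framework treats the other common patterns analogously, as consequences of different informational constraints.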

References

Frank, S.A. (2009). The common patterns of nature. Journal of Evolutionary Biology, 22, 1563–1585.
Galton, F. (1889). Natural Inheritance. Macmillan, London/New York.
Jaynes, E.T. (2003). Probability Theory: The Logic of Science. Cambridge University Press, New York.

 

Diego Romero Maltrana (Pontificia Universidad Católica de Valparaíso/IFICC) and Pablo Razeto (IFICC)
Mass as an ontological emergent property

Einstein’s discovery of the equivalence of mass and energy has generated a long-term debate concerning the relation between these properties: whether they are indeed the same property, and how to interpret the alleged “conversion” between mass and energy. Recently, Lange (2001, 2002) claimed that energy is not real and that only rest mass is objectively real, because only the latter is Lorentz invariant, and thus that the apparent “conversion” between them is an illusion that arises when shifting the level of analysis while examining physical systems. However, the rest mass of a composite system is Lorentz invariant (therefore real, in Lange’s terms) and does not reduce to the sum of the components’ rest masses. Following Lange’s reasoning, a real property would be composed of both real and “unreal” properties. We argue that this is an unacceptable conclusion, that the relations between parts have to be considered real as well, and that the “new mass” should be considered ontologically emergent. We conclude that emergent mass arises in any complex (i.e., non-trivially composed) system and that this mass: 1) is real (i.e., Lorentz invariant), 2) is not composed of the components’ rest masses, 3) emerges from the relations among parts, and 4) has causal power. In our view, this would be the first clear scientific example of an ontologically emergent property.
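
A familiar textbook illustration of the non-additivity at stake (not drawn from the talk itself) makes the point vivid: the invariant (rest) mass M of a composite system satisfies

M^2 c^4 = \Big(\sum_i E_i\Big)^2 - \Big|\sum_i \vec{p}_i\Big|^2 c^2,

so two photons of energy E each, moving in opposite directions, have zero rest mass individually while the two-photon system has total momentum zero and rest mass M = 2E/c^2. The system's rest mass is Lorentz invariant, yet it is not the sum of the parts' rest masses and depends on the relations (the relative momenta) among the parts.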

 

José Tomás Alvarado (Pontificia Universidad Católica de Chile)
Reduction, emergence, grounding, and other mysteries

In the last twenty years a great deal of attention has been given to the concepts of ‘grounding’ and ‘dependence’. Our vague pre-theoretic intuitions have been refined and made more sophisticated (cf. Correia and Schnieder, 2012). Full strict grounding between entities is conceived as a primitive, irreflexive, asymmetric, transitive, and non-monotonic relation, to be distinguished from partial grounding and weak grounding. Rigid dependence between entities is conceived as a primitive, irreflexive, asymmetric, and transitive relation, to be distinguished from generic dependence and weak dependence. Neither grounding nor dependence can be analyzed in terms of quantified modal logic, but if x is (fully strictly) grounded on y, then it is necessary that if y exists, then x exists; and if x is (rigidly strictly) dependent on y, then it is necessary that if x exists, then y exists. It has been proposed that: [x is emergent on y =df x is dependent on y and it is not the case that x is grounded on y]. What happens with ‘reduction’ here? Sometimes ‘reduction’ is conceived simply as a synonym for ‘grounding’. Sometimes, nevertheless, it is conceived as something that is neither grounding nor dependence. If x grounds y, then x is numerically different from y. If x depends on y, then x is numerically different from y. Reduction differs both from grounding and from dependence, because reduction is identity. That is: [x is reduced to y =df (x = y and y is explanatorily prior to x)]. Reduction claims, then, seem to be more ontologically austere than grounding or dependence claims.

 

Ramiro Frick (Universidad Alberto Hurtado/IFICC)
Naturalism, normativity and the problem of biological functions

1. The concept of function, so common and pervasive in the biological sciences, has to be naturalized; otherwise it must be eliminated.
2. The naturalization task is to fill in the blank in “T has the function F in virtue of _______” without making ineliminable or irreducible use of non-natural, normative, or intentional terms.
3. A naturalistic theory of biological functions is acceptable only if it satisfies two essential desiderata: a) normativity (it must be possible for a trait to have a certain function and yet fail to realize it) and b) causal efficacy (a trait’s possession of functional status should make a difference to its causal powers).
4. However, the two desiderata are in tension with each other: they cannot be satisfied simultaneously, and this explains why all existing theories fail in one way or another.

 

Brad Weslake (New York University, Shanghai)
Fitness and Variance

I argue that a consequence of natural selection in populations with variance in reproductive success is that the fitness of a type does not reduce to the fitnesses of individuals of that type. One premise in my argument is that fitness is a theoretically unified concept, in the sense that it should predict the evolutionary success of individuals and types respectively. I thereby show that the demand for theoretical unity may cut against reduction rather than for it. In conclusion, I draw some lessons concerning natural selection and selectionist explanation.
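
A simple arithmetic illustration of why variance in reproductive success matters (a standard point from the literature, not necessarily the argument of the talk): suppose a type has fitness 2 in half of the generations and 0.5 in the other half. The arithmetic mean of the individual fitnesses is 1.25, yet across two generations the lineage grows by a factor of 2 × 0.5 = 1, so the long-run (geometric mean) growth rate is

(2 \times 0.5)^{1/2} = 1 < 1.25.

The evolutionary success of the type is therefore not fixed by the average fitness of its individuals alone; how fitness varies matters as well.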

 

Laura Franklin-Hall (New York University)
The possibility of biology

Biologists have offered up a number of relatively simple and predictively successful models of exceedingly complex living systems, though such models are by no means ubiquitous. It is the existence of such ‘high-level’ modeling that has led philosophers to judge biological systems explanatorily autonomous, and best explained and understood in ‘high-level’ terms. But even granting the claims of explanatory anti-reductionism, a puzzle remains: biological systems are ultimately just physical contrivances governed by physical laws. If this is the case, why is it possible to describe these systems in such a detail-spare way? This paper aims to answer this question in the context of organismal biology.

 

Federico Benitez (University of Bern)
Contextual emergence and the problem of non-fundamental theories

We discuss a relatively recent proposal by H. Primas for a framework to describe emergence in science. These ideas are based on the introduction of so-called “Contextual Topologies”, which regularize a coarse-grained limit of the behaviour of fundamental theories. The procedure finds its natural application in recovering chemistry or thermodynamics from the formalism of algebraic quantum mechanics, but the extrapolation to other emergent phenomena, and in particular how it is possible to find a description in which the notion of a Contextual Topology appears naturally, is not fully understood. We comment on the lack of a well-defined procedure for finding a contextual topology at non-fundamental levels, and on the consequent impossibility of treating towers of hierarchical theories by means of these concepts.
In passing, we briefly discuss a contribution by Atmanspacher and beim Graben that tries to extend these methods to the problem of the emergence of mind, which we think ends up being artificial and ultimately amounts to an excessive use of mathematical formalism for its own sake.

It is in this scenario that we think a distinction between “principle” and “constructive” theories, in the sense defined by Einstein among others, should be addressed. An argument is put forward concerning the role quantum mechanics plays as a principle theory, which lets us better evaluate the strengths of this topological approach.
