Workshop 1
Statistical Physics Methods in Social and Economic Systems
January 26th to 30th
Participants who intend to present a poster can submit their request to the Poster Session Chair (Prof. Cecilia Vernia, cecilia.vernia@unimore.it)
Invited speakers:
A. Barra
“Insights in Economic Complexity: the hidden role of migrants' small worlds”,
[abstract]
Through a series of sequential steps, driven by statistical-mechanics and graph-theory perspectives and supported by extensive data, we analyze product diversification in Spanish trade and link its behavior to the existence of an underlying social network of migrants interacting with natives.
We prove that a boost in diversification on the international trade market is (partially) achieved through the underlying interactions between locals and migrants: the latter, by providing key information on policies and needs in their native countries, allow firm holders to lower the transactional costs of exports and duties. As a consequence, international trading becomes accessible to a larger pool of firms and results in an increased number of transactions (the extensive margin), which, in turn, implies a larger diversification of internationally traded products.
Further, as a sideline, our theory allows us to easily estimate the critical number of migrants in the host country above which their presence starts to affect international trading, and it naturally implies that the social organization of Spanish decision makers exhibits small-world features.
D. Delli Gatti
“Macroeconomic Debates: The State of the Art and the Computational Way Ahead”,
[abstract]
In this talk I will briefly review the most recent controversies and
developments on the state of macroeconomics. There are a number of
different ways ahead in macro. I will focus on computational methods,
especially concerning multi-agent models.
J. Donier
“How people's decisions impact prices: Empirical evidence and theory of a square root”,
[abstract]
The nonlinear impact of agents' decisions on market prices is one of
the main puzzles that arise when it comes to unraveling the price
formation mechanism in (non-)financial markets. Supported by strong
evidence from the Bitcoin market, we present a physical model that
consistently accounts for most empirical facts known so far on price
impact, thus resolving the apparent square root law paradox and laying
new foundations for addressing the most common impact-related
problems.
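As a rough illustration of the square-root scaling at stake here, the following sketch fits the impact exponent on synthetic metaorder data; the volumes, volatility and prefactor below are illustrative assumptions, not values or methods from the talk.

```python
# Illustrative sketch: fitting the exponent of a power-law price impact
# I(Q) ~ Y * sigma * (Q / V)^delta  (the square-root law corresponds to delta ~ 0.5).
# Synthetic data only; not the speaker's Bitcoin analysis.
import numpy as np

rng = np.random.default_rng(0)

sigma, V, Y = 0.02, 1e6, 1.0          # daily volatility, daily volume, prefactor (assumed)
Q = rng.uniform(1e3, 1e5, size=5000)  # metaorder volumes
impact = Y * sigma * np.sqrt(Q / V) * np.exp(0.1 * rng.standard_normal(Q.size))

# Estimate the impact exponent delta from a log-log regression.
delta, log_prefactor = np.polyfit(np.log(Q / V), np.log(impact), 1)
print(f"fitted impact exponent: {delta:.3f} (square-root law predicts 0.5)")
```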
M. Fedele
“Interacting Models in Social Sciences and Health Screening Campaigns”,
[abstract]
Imitation and social pressure are usually observed in the aggregate
behavior of populations, and they are responsible for the appearance of
trends, herd effects, discontinuities and crashes. To account for these
phenomena, interaction networks must be included in the modeling of social systems and measured from data. We present a recent analysis of an extensive dataset on participation in cancer screening campaigns, where a model based on statistical mechanics and multi-species mean-field spin models allows for a quantitative estimate of average interaction effects through an inverse problem and leads to a forecast of effective social policies to enhance participation.
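As a schematic illustration of the inverse-problem step, in a single-population mean-field (Curie-Weiss) setting the self-consistency relation m = tanh(Jm + h) can be inverted by regression to recover an effective interaction strength from group-level participation rates. The sketch below uses noiseless synthetic data and a single species, a deliberate simplification of the multi-species model discussed in the talk.

```python
# Mean-field inverse problem on synthetic data: each group's participation m_c
# obeys  atanh(m_c) = J * m_c + h_c  with h_c = a + b * x_c (x_c a covariate),
# so J can be read off by linear regression.  Single-species toy, no noise.
import numpy as np

rng = np.random.default_rng(1)
n_groups = 200
J_true, a_true, b_true = 0.6, -0.2, 0.5   # assumed "true" parameters

x = rng.uniform(0.0, 1.0, n_groups)       # socio-economic covariate per group
h = a_true + b_true * x                   # group-level external field
m = np.zeros(n_groups)
for _ in range(200):                      # solve m = tanh(J*m + h) by fixed point
    m = np.tanh(J_true * m + h)

# Inverse problem: regress atanh(m) on (m, x, 1) to recover (J, b, a).
X = np.column_stack([m, x, np.ones(n_groups)])
J_hat, b_hat, a_hat = np.linalg.lstsq(X, np.arctanh(m), rcond=None)[0]
print(f"estimated interaction J = {J_hat:.3f} (true value {J_true})")
```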
M. Gallegati
“The Economics in Crisis”,
[abstract]
This lecture discusses the crisis in the economy and in macroeconomics. Theory appears inadequate in explaining the origins and nature of the crisis because of assumptions borrowed from classical physics, which in economics translate into the representative-agent assumption, equilibrium and the absence of interaction. DSGE macroeconomic models, however sophisticated, have continued to rest on the same foundations shown to be wanting in the 1970s and suffer from the stability and uniqueness problems emphasized by the SMD theorem, a powerful warning by numerous mathematicians and economists about their unsound foundations. We need to construct models that view the economy as a complex adaptive system, that may use some of the tools of statistical physics, and that do not necessarily rely on the standard equilibrium approach. Agent-based modelling (ABM) is among the most promising ways to do this.
S. Gualdi
“Tipping points and monetary policy in a stylized macroeconomic agent-based model”,
[abstract]
Traditional approaches in economics rely on the assumption
that economic agents are identical, non-interacting and rational.
Within this framework, economic instabilities would require large
exogenous shocks, when in fact small local shocks can trigger large
systemic effects when heterogeneities and interactions are taken into
account. The need to include these effects motivates the development of agent-based models (ABMs), which are extremely versatile and make it possible to incorporate more realistic behavioural rules. In this talk we introduce a simple ABM, explore the types of phenomena it can reproduce, and propose a methodology that characterizes a model through its phase diagram. We then generalize the model with the aim of investigating the role and efficacy of the monetary policy of a central bank. We show that the existence of different equilibrium states of the economy can cause the monetary policy itself to trigger instabilities and become counter-productive.
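The phase-diagram methodology can be illustrated on a deliberately crude toy model (not the ABM of the talk): sweep one behavioural parameter and record the stationary aggregate activity, looking for an abrupt transition between a "good" and a "bad" state of the economy. The update rule and parameter values below are assumptions made purely for illustration.

```python
# Toy "phase diagram" scan: the stationary fraction of active firms as a
# function of a cost-like parameter theta, under an assumed activation rule.
import numpy as np

rng = np.random.default_rng(2)
N, T = 1000, 400

def stationary_activity(theta, rho0=0.9):
    """Fraction of active firms after T steps; each firm is active with a
    probability that rises with current aggregate activity and falls with theta."""
    rho = rho0
    for _ in range(T):
        p_active = 1.0 / (1.0 + np.exp(-8.0 * (rho - theta)))
        rho = rng.binomial(N, p_active) / N
    return rho

for theta in np.linspace(0.2, 0.8, 7):
    print(f"theta = {theta:.2f} -> stationary activity ~ {stationary_activity(theta):.2f}")
```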
P. Jensen
“Are models drawn from physics relevant for social systems?”,
[abstract]
These last years have witnessed a significant rise in papers by physicists using relatively simple models to understand social systems. The basic idea is to use the physicists' expertise on the emergence of collective phenomena in condensed matter: just as the
properties of materials emerge from the interactions between atoms,
the characteristics of societies would emerge from interactions
between individuals, taken as ‘social atoms’. In this presentation, I
claim that these micro-macro models are unfit to unfold the complexity
of collective existence and that the priority should instead be the
development of new formal tools to exploit the richness of digital
data.
Specifically, I will argue that micro-macro models have serious
methodological and political problems. From a methodological
viewpoint, most simulations work only at the price of simplifying the
properties of micro-agents, the rules of interaction and the nature of
macro-structures so that they conveniently fit each other. A bit like
Descartes’ followers who explained the acidity of lemons by
postulating the existence of ‘lemon atoms’ with tiny pricking needles.
In the absence of empirical confirmation, social models tend to rely
exclusively on internal coherence rather than validation or relevance
for real social systems. From a political viewpoint, micro-macro
models assume by construction that agents at the local level are
incapable of understanding and controlling the phenomena at the global level,
as in the so-called ‘tragedy of the commons’. Only the modelers can
observe collective phenomena. Ironically, a supposedly “bottom-up”
approach leads to “top-down” social politics!
In a recent collaboration with sociologists, we have argued that
collective action does not originate at the micro level of individual
atoms and does not end up in a macro level of stable structures.
Instead, actions are distributed in intricate and heterogeneous networks that fold and deploy, creating differences but not discontinuities.
Therefore, the time seems ripe to develop the formal techniques
necessary to unfold the origami of collective existence and this
should be the aim of the renewed alliance between the social and
natural sciences. For the next few years, efforts should be shifted
from simulating to mapping, from simple explanations to complex
observations.
M. Marsili
“Lost in diversification”,
[abstract]
In an effort to understand the drivers of the decoupling of
finance from the real economy,
I will discuss simple models that provide a quantitative measure of
unpriced information losses that
occur in widespread financial practices.
J-P. Nadal
“Entanglement between Demand and Supply in Markets with Bandwagon Goods”,
[abstract]
Whenever customers' choices (e.g. whether or not to buy a given good) depend on others' choices (the 'bandwagon effect' in the economic literature), the demand may be multiply valued: for the same posted price, there is either a small number of buyers or a large one, in which case one says that the customers coordinate. This leads to a dilemma for the seller: should he sell at a high price, targeting a small number of buyers, or at a low price, targeting a large number of buyers? We show that the interaction between demand and supply is even more complex than expected, leading to what we call the curse of coordination: the profit-maximizing pricing strategy for the seller corresponds to posting a price which not only assumes that the customers will coordinate, but also lies very close to the critical value above which such a high demand no longer exists. This is obtained by the detailed mathematical analysis of a particular model formally related to the Random Field Ising Model and to a model introduced in the social sciences by T. C. Schelling in the 1970s.
Work with Mirta B. Gordon (LIG, Grenoble), Denis Phan (GEMASS, Paris) and Viktoriya Semeshenko (Instituto Interdisciplinario de Economía Política, Buenos-Aires)
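A minimal numerical sketch of such a multi-valued demand, assuming a logistic distribution of idiosyncratic willingness to pay and an assumed social-influence strength J (this is not the exact model analysed in the talk): the fraction of buyers eta solves eta = 1 - F(p - J*eta), which admits two stable branches in a window of prices.

```python
# Bandwagon demand sketch: fixed points of eta = 1 - F(p - J*eta) reached from
# a low and a high initial condition, for a range of posted prices p.
# Logistic F and parameter values are illustrative assumptions.
import numpy as np

J = 6.0        # social ("bandwagon") influence strength (assumed)
scale = 1.0    # width of the idiosyncratic willingness-to-pay distribution (assumed)

def F(x):
    """Logistic CDF of the idiosyncratic willingness to pay."""
    return 1.0 / (1.0 + np.exp(-x / scale))

def demand(price, eta0, n_iter=500):
    """Fixed point of eta = 1 - F(price - J*eta), started from eta0."""
    eta = eta0
    for _ in range(n_iter):
        eta = 1.0 - F(price - J * eta)
    return eta

for price in np.linspace(2.0, 4.0, 9):
    low, high = demand(price, 0.0), demand(price, 1.0)
    tag = "two branches" if abs(high - low) > 1e-3 else "single branch"
    print(f"price {price:.2f}: eta(low start) = {low:.3f}, eta(high start) = {high:.3f}  [{tag}]")
```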
L. Pareschi
“Mean field and Boltzmann control of socio-economic systems”,
[abstract]
In this talk we survey some recent results on the control of complex socio-economic systems composed of a large number of agents. We focus in particular on constrained opinion models and investigate model predictive control techniques in the mean-field and Boltzmann limits. Connections with continuous control based on Riccati equations are also presented. Finally, the presence of random inputs in the system is considered and the need to control the resulting instabilities is discussed. Several numerical results illustrate the different approaches.
References
[1] G. Albi, M. Herty, and L. Pareschi. Kinetic description of optimal control problems in consensus modeling. Comm. Math. Sci., to appear.
[2] G. Albi, L. Pareschi, and M. Zanella. Boltzmann type control of opinion consensus through leaders. Phil. Trans. A Math. Phys. Eng. Sci., 372(2028), 2014.
[3] M. Herty, L. Pareschi, and S. Steffensen. Mean-field control and Riccati equations. Networks and Heterogeneous Media, to appear.
[4] G. Albi, L. Pareschi, and M. Zanella. Uncertainty quantification in control problems for flocking models, preprint, 2014.
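A heavily simplified sketch in the spirit of the Boltzmann-type control of [1,2]: agents update opinions through random binary interactions with a compromise term plus a feedback term steering the pair toward a desired opinion. The interaction rule and parameter values below are illustrative assumptions, not the control schemes analysed in the talk.

```python
# Binary "collision" opinion dynamics with a simple feedback control toward a
# desired opinion w_d.  Toy rule with assumed parameters, for illustration only.
import numpy as np

rng = np.random.default_rng(3)
N, steps = 2000, 200_000
gamma, kappa, w_d = 0.25, 0.05, 0.8   # compromise rate, control gain, target opinion

w = rng.uniform(-1.0, 1.0, N)         # initial opinions in [-1, 1]
for _ in range(steps):
    i, j = rng.integers(N, size=2)
    if i == j:
        continue
    wi, wj = w[i], w[j]
    # compromise dynamics plus feedback control toward w_d, clipped to [-1, 1]
    w[i] = np.clip(wi + gamma * (wj - wi) + kappa * (w_d - wi), -1.0, 1.0)
    w[j] = np.clip(wj + gamma * (wi - wj) + kappa * (w_d - wj), -1.0, 1.0)

print(f"mean opinion after controlled interactions: {w.mean():.3f} (target {w_d})")
```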
M. Pisati
“The Unbearable Lightness of the Social Sciences: Current Practices and Possible Futures”,
[abstract]
The purpose of this talk is to discuss some issues related
to the analysis of social phenomena from the perspective of sociology.
First, I claim that, overall, the sociological analysis of social
phenomena still has an uncertain scientific status, due to its being a
melting pot of sometimes divergent epistemological and methodological
stances. Then, I focus on the quantitative analysis of social
phenomena and discuss its current practices, pointing to strengths and
weaknesses. Finally, I suggest that a new alliance between sociology and other scientific disciplines, such as biology and physics, might contribute to bringing about a truly scientific analysis of social phenomena.
M. Rasetti
“The Topological Field Theory of Data: a program towards a new strategy for data mining”,
[abstract]
The work described in this talk aims to challenge current thinking in IT about the 'Big Data' question, proposing (almost verbatim, with almost no formulas) a program whose goal is to construct an innovative methodology for data analytics. We suggest building a theoretical framework which, by directly probing the data space, could enable us to extract the manifold of hidden relations (patterns) that exist among data as correlations. The latter depend on, and at the same time generate, the semantics underlying the mining context itself. The approach, which exploits recent innovative ways of incorporating data in a topological setting, proposes a Field Theory of Data, transferring and generalizing to the space of data notions inspired by physical (topological) field theories. It also harnesses the theory of formal languages to define a potential semantics for understanding the emerging patterns.
S. Redner
“Statistics of Basketball Scoring and Lead Changes”,
[abstract]
Exploiting the recent availability of comprehensive data on all scoring events in
recent NBA basketball games, we investigate the statistics of scoring and lead
changes. Except for anomalies at the start and the end of the game,
basketball scoring is well described by a continuous-time anti-persistent
random walk, with essentially no temporal correlations between successive
scoring events. We also determine the criterion for when a lead of a
specified size is "safe" as a function of the time remaining in the game.
Finally, we show that the distribution of times when the last lead change
occurs and the distribution of times when the score difference is maximal are
both given by the celebrated arcsine law, a prediction that is in excellent
agreement with basketball game data.
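The arcsine-law prediction mentioned above can be checked quickly on synthetic data (an unbiased random walk standing in for the score difference, not actual NBA play-by-play): the time of the last tie, a proxy for the last lead change, should have cumulative distribution (2/π) arcsin(√t).

```python
# Numerical check of the arcsine law for the time of the last tie of an
# unbiased random walk (synthetic "games", not NBA data).
import numpy as np

rng = np.random.default_rng(4)
n_games, n_steps = 20_000, 500

last_tie = np.empty(n_games)
for g in range(n_games):
    score_diff = np.cumsum(rng.choice([-1, 1], size=n_steps))
    zeros = np.nonzero(score_diff == 0)[0]
    last_tie[g] = (zeros[-1] + 1) / n_steps if zeros.size else 0.0

# Compare the empirical CDF with the arcsine CDF at a few points.
for t in (0.1, 0.25, 0.5, 0.75, 0.9):
    emp = np.mean(last_tie <= t)
    theory = 2.0 / np.pi * np.arcsin(np.sqrt(t))
    print(f"t = {t:.2f}: empirical {emp:.3f}  vs  arcsine {theory:.3f}")
```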
R. Sandell
“Why Sociologists should (and increasingly want to) “make out” with the hard sciences.”,
[abstract]
The full impact of the internet era and the informatics revolution is probably still not fully apprehensible. The informatics revolution no doubt means far-reaching changes for just about any science. Increased computer capacity, the speed of calculation and the storage capacity imply that we can do things that were impossible just a few years ago. For the social sciences the changes are extremely important. The revolution has brought “big data” to the desktop of the researcher, which drastically changes the scope of social research. From being a science foremost engaged in philosophical reasoning about human and collective behaviour, or from being survey-driven, sociology and the other social sciences have recently been exposed to large or very large data sets recorded in continuous time. The complexity and wealth of this “new” data invite a different analytical approach from the traditional one. In this talk I’ll dissect the mind of the sociologist, show what sociologists are looking for and the tools, primitive and advanced, used to produce frontier sociological science. I’ll show the sociologist’s limitations in dealing with systemic data, and argue for a closer collaboration between social scientists and, in particular, physicists and mathematicians. Making out with the hard sciences may bring the social sciences closer to the reality that they are trying so hard to understand.
A. Sirbu
“A new dimension for democracy: egalitarianism in the rank aggregation problem”,
[abstract]
Winner selection by majority, in an election between two
candidates, is the only rule compatible with democratic principles.
Instead, when the candidates are three or more and the voters rank
candidates in order of preference, there are no univocal criteria for
the selection of the winning (consensus) ranking and the outcome is
known to depend sensitively on the adopted rule. Building upon eighteenth-century Condorcet theory, whose idea was to maximize total voter satisfaction, we propose here the addition of a new basic principle (dimension) to guide the selection: satisfaction should be distributed among voters as equally as possible. With this new criterion we identify an optimal set of rankings, ranging from the Condorcet solution to the one that is most egalitarian with respect to the voters. We show that highly egalitarian rankings have the important property of being more stable with respect to fluctuations, and that classical consensus rankings (Copeland, Tideman, Schulze) often turn out to be non-optimal. The new dimension we introduce provides, when used together with that of Condorcet, a clear classification of all possible rankings. By increasing awareness in the selection of a consensus ranking, our method may lead to social choices that are more egalitarian than those achieved by presently available voting systems.
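A toy sketch of this two-dimensional view: each candidate consensus ranking is scored both by total voter satisfaction (here, minus the summed Kendall-tau distance to every ballot, a Kemeny-style criterion) and by how unequally that satisfaction is spread across voters. The ballots and the distance choice are illustrative assumptions, not the talk's data or criterion.

```python
# For every possible consensus ranking of four candidates, compute the total
# Kendall-tau distance to the ballots (lower = higher total satisfaction) and
# the spread of that distance across voters (an "egalitarian" dimension).
from itertools import combinations, permutations
import statistics

ballots = [                      # each ballot: candidates in order of preference
    ("A", "B", "C", "D"),
    ("A", "C", "B", "D"),
    ("B", "A", "D", "C"),
    ("C", "B", "A", "D"),
    ("D", "C", "B", "A"),
]
candidates = ("A", "B", "C", "D")

def kendall_distance(r1, r2):
    """Number of candidate pairs ordered differently in the two rankings."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(
        (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0
        for a, b in combinations(candidates, 2)
    )

results = []
for ranking in permutations(candidates):
    dists = [kendall_distance(ranking, b) for b in ballots]
    results.append((sum(dists), statistics.pstdev(dists), ranking))

# Sort by total distance (Kemeny-style), then by inequality of satisfaction.
for total, spread, ranking in sorted(results)[:5]:
    print(ranking, f"total distance {total}, spread {spread:.2f}")
```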
M. Smerlak
“Thermodynamics of economic inequalities: precariousness, volatility and stratification”,
[abstract]
Growing economic inequalities are observed in several countries
throughout the world. Following Pareto, the power-law structure of
these inequalities has been the subject of much theoretical and
empirical work. But their nonequilibrium dynamics, e.g. after a policy
change, remains incompletely understood. I will introduce a
thermodynamical theory of inequalities based on the analogy between
economic stratification and statistical entropy. Within this framework
the combination of upward mobility with precariousness is identified
as a fundamental driver of inequality. I will formalize this statement
by a "second-law" inequality displaying upward mobility and
precariousness as thermodynamic conjugate variables. I will also
estimate the time scale for the "relaxation" of the wealth
distribution after a sudden change of the after-tax return on capital.
My method can be generalized to gain insight into the dynamics of
inequalities in any Markovian model of socioeconomic interactions.
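A minimal sketch of the relaxation question on an assumed Kesten-type toy process (not the model of the talk): after a sudden increase in the after-tax return on capital, an entropy-based stratification measure (here the Theil index) drifts towards a new stationary value over many periods.

```python
# Toy Markovian wealth process  w -> (1 + r) * w + s  with stochastic after-tax
# return r and constant labour income s.  All parameters are assumptions chosen
# so that both regimes have a stationary distribution.
import numpy as np

rng = np.random.default_rng(5)
N, T_pre, T_post = 50_000, 1000, 1000
s, r_vol = 1.0, 0.1                      # labour income and return volatility (assumed)

def theil(w):
    """Theil index: an entropy-like measure of wealth stratification."""
    x = w / w.mean()
    return float(np.mean(x * np.log(x)))

def step(w, r_mean):
    r = r_mean + r_vol * rng.standard_normal(w.size)
    return (1.0 + r) * w + s

w = np.full(N, 10.0)
for _ in range(T_pre):                   # relax to the initial stationary state
    w = step(w, r_mean=-0.02)
print(f"Theil index before the change: {theil(w):.3f}")

for t in range(T_post):                  # sudden rise of the after-tax return
    w = step(w, r_mean=-0.008)
    if (t + 1) % 250 == 0:
        print(f"{t + 1:4d} periods after the change: Theil = {theil(w):.3f}")
```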
G. Toscani
“Wealth and knowledge in multi-agent systems. A Kinetic approach”,
[abstract]
In recent years, the distribution of wealth in a multi-agent society
has been investigated by resorting to classical methods of kinetic
theory of rarefied gases. In analogy with the Boltzmann equation, the
change of wealth in these models is due to microscopic binary trades
among agents. Surprisingly, other important aspects linked to
different types of human wealth, like the role of personal knowledge
(information), have not been taken into consideration. In this
lecture, we introduce and discuss a nonlinear kinetic equation of
Boltzmann type which describes the influence of knowledge in the
evolution of wealth in a system of agents which interact through
binary trades. The trades include both a saving propensity and the risks of the market; here, the risk and saving parameters are assumed to depend on each agent's personal degree of knowledge. The
numerical simulations show that the presence of knowledge has the
potential to produce a class of wealthy agents and to account for a
larger proportion of wealth inequality.
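A minimal sketch of a binary-trade wealth model in which the saving propensity is tied to a personal knowledge level; the interaction rule (a conservative random-exchange trade) and the parameters are illustrative assumptions rather than the Boltzmann-type equation of the lecture.

```python
# Conservative binary trades with knowledge-dependent saving propensity:
# each agent keeps a saved fraction of its wealth and the rest is randomly
# split between the two trading partners.  Toy parameters for illustration.
import numpy as np

rng = np.random.default_rng(6)
N, n_trades = 2000, 1_000_000

knowledge = rng.uniform(0.0, 1.0, N)      # fixed personal knowledge levels
saving = 0.2 + 0.6 * knowledge            # more knowledge -> higher saving propensity (assumed)
wealth = np.ones(N)

for _ in range(n_trades):
    i, j = rng.integers(N, size=2)
    if i == j:
        continue
    # both agents put their non-saved wealth on the table; a random share is won
    pot = (1 - saving[i]) * wealth[i] + (1 - saving[j]) * wealth[j]
    share = rng.random()
    wealth[i] = saving[i] * wealth[i] + share * pot
    wealth[j] = saving[j] * wealth[j] + (1 - share) * pot

# Wealth of the most and least knowledgeable deciles (a rough inequality check).
top = wealth[knowledge > 0.9].mean()
bottom = wealth[knowledge < 0.1].mean()
print(f"mean wealth, top knowledge decile: {top:.2f}; bottom decile: {bottom:.2f}")
```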
C. Vernia
“Trust social network from collective data: interaction vs independence, connectedness vs fragmentation.”,
[abstract]
In this talk we deal with a classical problem in sociology: the phenomenon of isolation and individual alienation in large urban areas.
More specifically, we study the structure of a social network of trust
investigating its property of connectedness versus fragmentation.
To this purpose we analyse two extensive sets of census data on
immigration, concerning social and economic integration quantifiers
collected in Spain and in Italy. The study is based on a novel
approach that
uses data analysis methods and mathematical models
inspired by statistical physics. We show that integration quantifiers
may exhibit linear or non-linear growth with immigration density. We
explain these differences by means of a properly defined social
interaction component, and we illustrate how this leads to
quantitative estimates of integration across different socioeconomic
contexts.
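A schematic sketch of distinguishing linear from non-linear (here square-root) growth of an integration quantifier with immigrant density, fitted on synthetic data; the square-root law and the noise level are illustrative assumptions, not results from the census analysis discussed in the talk.

```python
# Compare a linear and a square-root fit of a synthetic integration quantifier
# as a function of the immigrant density gamma.
import numpy as np

rng = np.random.default_rng(7)
gamma = np.linspace(0.01, 0.3, 40)                       # immigrant densities
quantifier = 0.8 * np.sqrt(gamma) * (1 + 0.05 * rng.standard_normal(gamma.size))

# Fit both candidate laws and compare their residual sums of squares.
lin_coef = np.linalg.lstsq(gamma[:, None], quantifier, rcond=None)[0]
sqrt_coef = np.linalg.lstsq(np.sqrt(gamma)[:, None], quantifier, rcond=None)[0]
rss_lin = np.sum((quantifier - gamma * lin_coef[0]) ** 2)
rss_sqrt = np.sum((quantifier - np.sqrt(gamma) * sqrt_coef[0]) ** 2)
print(f"residual sum of squares: linear {rss_lin:.4f}, square-root {rss_sqrt:.4f}")
```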