21 results for Idealism and Epistemology

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

80.00%

Publisher:

Abstract:

Setting practical priorities for sexually transmitted infection (STI) control is a balance between idealism and pragmatism. Infections transmitted through unsafe sex (chlamydia, gonorrhoea, syphilis, HIV, hepatitis B and human papillomavirus (HPV) infections) rank in the top five causes of the global burden of disease.[1] Their distribution in populations is driven by a complex mixture of individual behaviours, social and community norms and societal and historical context. Ideally, we would be able to reduce exposure to unsafe sex to its theoretical minimum level of zero and thus eliminate a significant proportion of the current global burden of disease, particularly in resource-poor settings.[2] Ideally, we would have ‘magic bullets’ for diagnosing and preventing STIs in addition to specific antimicrobial agents for specific infections.[3] Arguably, we have ‘bullets’ that work at the individual level: highly accurate diagnostic tests and highly efficacious vaccines, antimicrobial agents and preventive interventions.[4] Introducing them into populations to achieve similarly high levels of effectiveness has been more challenging.[4] In practice, the ‘magic’ in the magic bullet can be seen as overcoming the barriers to sustainable implementation in partnerships, larger sexual networks and populations (figure 1).[4] We have chosen three (pragmatic) priorities for interventions that we believe could be implemented and scaled up to control STIs other than HIV/AIDS. We present these starting with the partnership and moving up to the population level.

Relevance:

30.00%

Publisher:

Abstract:

Monte Carlo simulations arrive at their results by introducing randomness, sometimes derived from a physical randomizing device. Nonetheless, we argue, they open no new epistemic channels beyond that already employed by traditional simulations: the inference by ordinary argumentation of conclusions from assumptions built into the simulations. We show that Monte Carlo simulations cannot produce knowledge other than by inference, and that they resemble other computer simulations in the manner in which they derive their conclusions. Simple examples of Monte Carlo simulations are analysed to identify the underlying inferences.
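
To make the paper's point concrete, here is a minimal sketch (my illustration, not from the paper) of a textbook Monte Carlo simulation: estimating π by uniform sampling. Its "result" is nothing beyond an inference from assumptions built into the code, namely the uniform sampler and the geometry of the quarter circle.

```python
import random

def estimate_pi(n_samples: int = 1_000_000, seed: int = 0) -> float:
    """Estimate pi by sampling points uniformly in the unit square.

    The conclusion (pi ~= 4 * hits / n_samples) is inferred from the
    assumptions built into the simulation: points are uniform on
    [0, 1)^2, and a point is a hit iff it lies in the quarter circle.
    """
    rng = random.Random(seed)  # pseudo-randomness, standing in for a physical randomizer
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples

print(estimate_pi())  # ~= 3.14 for large n_samples
```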

Relevance:

30.00%

Publisher:

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one suggested by Loewer and a localist one advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.
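
For readers unfamiliar with the micro-probabilities at issue, the standard Boltzmannian postulate (a background gloss, not a formula from the paper) assigns to a macro-state the uniform Liouville measure over its compatible micro-states:

```latex
% Background gloss (not the paper's own formula): for a macro-state M
% occupying the phase-space region \Gamma_M, the probability that the
% micro-state lies in a measurable set A is taken to be
\[
  P(A \mid M) \;=\; \frac{\mu_L(A \cap \Gamma_M)}{\mu_L(\Gamma_M)},
\]
% where \mu_L denotes the Liouville (phase-space volume) measure.
```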

Relevance:

30.00%

Publisher:

Abstract:

The talk starts out with a short introduction to the philosophy of probability. I highlight the need to interpret probabilities in the sciences and motivate objectivist accounts of probabilities. Very roughly, according to such accounts, ascriptions of probabilities have truth-conditions that are independent of personal interests and needs. But objectivist accounts are pointless if they do not provide an objectivist epistemology, i.e., if they do not determine well-defined methods to support or falsify claims about probabilities. In the rest of the talk I examine recent philosophical proposals for an objectivist methodology. Most of them take up ideas well-known from statistics. I nevertheless find some proposals incompatible with objectivist aspirations.
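
As an illustration of the kind of statistical method the talk alludes to (my example, not the speaker's), an exact binomial test confronts a probability ascription with frequency data: the claim P(heads) = 0.5 counts as falsified at a chosen significance level when the observed count would be too improbable under it.

```python
from math import comb

def binomial_two_sided_p(k: int, n: int, p0: float = 0.5) -> float:
    """Exact two-sided binomial test of the claim P(heads) = p0.

    Returns the probability, under p0, of all outcomes at most as likely
    as the observed count k -- one well-defined method for supporting or
    falsifying a claim about a probability with frequency data.
    """
    pmf = [comb(n, i) * p0**i * (1 - p0) ** (n - i) for i in range(n + 1)]
    observed = pmf[k]
    return min(1.0, sum(q for q in pmf if q <= observed + 1e-12))

# 62 heads in 100 tosses: is P(heads) = 0.5 tenable?
print(f"p-value = {binomial_two_sided_p(62, 100):.4f}")  # ~0.021: rejected at the 5% level
```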

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

N. Bostrom’s simulation argument and two additional assumptions imply that we are likely to live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and material models. Even though the computer hardware does provide a material model of the target, this does not suffice to underwrite the simulation argument because the ways in which parts of the computer hardware interact during simulations do not resemble the ways in which neurons interact in the brain. Further, there are computer simulations of all kinds of systems, and it would be unreasonable to infer that some computers display consciousness just because they simulate brains rather than, say, galaxies.

Relevance:

30.00%

Publisher:

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two Boltzmannian accounts of the Second Law, viz. a globalist and a localist one. In both cases, the probabilities fail to be chances because they have rivals that are roughly equally good. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance:

30.00%

Publisher:

Abstract:

At first sight, experimenting and modeling form two distinct modes of scientific inquiry. This spurs philosophical debates about how the distinction should be drawn (e.g. Morgan 2005, Winsberg 2009, Parker 2009). But much scientific practice casts serious doubt on the idea that the distinction makes much sense. There are two worries. First, the practices of modeling and experimenting are often intertwined in intricate ways, because much modeling involves experimenting and the interpretation of many experiments relies upon models. Second, there are borderline cases that seem to blur the distinction between experiment and model (if there is any). My talk tries to defend the philosophical project of distinguishing models from experiments and to advance the related philosophical debate. I begin by providing a minimalist framework for conceptualizing experimenting and modeling and their mutual relationships. The two methods are conceptualized as different types of activities, each characterized by a primary goal. This minimalist framework, which should be uncontroversial, suffices to accommodate the first worry. I address the second worry by suggesting several more flexible ways to conceptualize the distinction, and I make a concrete suggestion of how the distinction may be drawn. I use examples from the history of science to argue my case. The talk concentrates on models and experiments, but I will comment on simulations too.

Relevance:

30.00%

Publisher:

Abstract:

Current wisdom in cosmology has it that the Universe is about 13.8 billion years old. Statements about the age of the Universe are not just difficult to confirm, but also carry a lot of presuppositions. The aim of this talk is to make these presuppositions explicit, to discuss their significance and to trace the implications for an empirical investigation of the age of the Universe.