920 results for causal inference


Relevance: 60.00%

Abstract:

OBJECTIVE To examine the degree to which use of β blockers, statins, and diuretics in patients with impaired glucose tolerance and other cardiovascular risk factors is associated with new onset diabetes. DESIGN Reanalysis of data from the Nateglinide and Valsartan in Impaired Glucose Tolerance Outcomes Research (NAVIGATOR) trial. SETTING NAVIGATOR trial. PARTICIPANTS Patients who at baseline (enrolment) were treatment naïve to β blockers (n=5640), diuretics (n=6346), statins (n=6146), and calcium channel blockers (n=6294). Calcium channel blockers served as a metabolically neutral control. MAIN OUTCOME MEASURES Development of new onset diabetes, diagnosed by standard plasma glucose level in all participants and confirmed with glucose tolerance testing within 12 weeks after the increased glucose value was recorded. The relation between each treatment and new onset diabetes was evaluated using marginal structural models for causal inference, to account for time dependent confounding in treatment assignment. RESULTS During a median five years of follow-up, β blockers were started in 915 (16.2%) patients, diuretics in 1316 (20.7%), statins in 1353 (22.0%), and calcium channel blockers in 1171 (18.6%). After adjustment for baseline characteristics and time varying confounders, diuretics and statins were both associated with an increased risk of new onset diabetes (hazard ratio 1.23, 95% confidence interval 1.06 to 1.44, and 1.32, 1.14 to 1.48, respectively), whereas β blockers and calcium channel blockers were not (1.10, 0.92 to 1.31, and 0.95, 0.79 to 1.13, respectively). CONCLUSIONS Among people with impaired glucose tolerance and other cardiovascular risk factors and with serial glucose measurements, diuretics and statins were associated with an increased risk of new onset diabetes, whereas the effect of β blockers was non-significant.
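The methodological core of this abstract is the marginal structural model fitted with inverse-probability weights to handle time-dependent confounding. Below is a minimal, self-contained sketch of that weighting step with made-up data; all variable names (X0, L, treated_now, event) are hypothetical, and this is illustrative only, not the NAVIGATOR analysis.

```python
# Sketch: stabilized inverse-probability-of-treatment weights, then a
# weighted outcome model. Assumes a single decision interval for brevity.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "X0": rng.normal(size=n),   # baseline covariate
    "L": rng.normal(size=n),    # time-varying confounder
})
p_treat = 1 / (1 + np.exp(-(0.3 * df["X0"] + 0.8 * df["L"])))
df["treated_now"] = rng.binomial(1, p_treat)
p_event = 1 / (1 + np.exp(-(-2.0 + 0.5 * df["L"] + 0.2 * df["treated_now"])))
df["event"] = rng.binomial(1, p_event)

# Stabilized weights: the numerator conditions on baseline covariates only,
# the denominator additionally on the time-varying confounder L.
den = sm.Logit(df["treated_now"], sm.add_constant(df[["X0", "L"]])).fit(disp=0)
num = sm.Logit(df["treated_now"], sm.add_constant(df[["X0"]])).fit(disp=0)
p_den, p_num = den.predict(), num.predict()
df["w"] = np.where(df["treated_now"] == 1,
                   p_num / p_den, (1 - p_num) / (1 - p_den))

# Weighted outcome model: with the weights applied, the coefficient on
# treated_now estimates a marginal (MSM-style) effect; X0 is included
# because the weight numerator conditions on it.
msm = sm.GLM(df["event"], sm.add_constant(df[["X0", "treated_now"]]),
             family=sm.families.Binomial(), freq_weights=df["w"]).fit()
print(msm.summary())
```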

Relevance: 60.00%

Abstract:

The thousands of books and articles on Charles de Gaulle's policy toward European integration, whether written by historians, social scientists, or commentators, universally accord primary explanatory importance to the General's distinctive geopolitical ideology. In explaining his motivations, only secondary significance, if any at all, is attached to commercial considerations. This paper seeks to reverse this historiographical consensus by examining the four major decisions toward European integration during de Gaulle's presidency: the decisions to remain in the Common Market in 1958, to propose the Fouchet Plan in the early 1960s, to veto British accession to the EC, and to provoke the "empty chair" crisis in 1965-1966, resulting in the "Luxembourg Compromise." In each case, the overwhelming bulk of the primary evidence (speeches, memoirs, and government documents) suggests that de Gaulle's primary motivation was economic, not geopolitical or ideological. Like his predecessors and successors, de Gaulle sought to promote French industry and agriculture by establishing protected markets for their export products. This empirical finding has three broader implications. (1) For those interested in the European Union, it suggests that regional integration has been driven primarily by economic, not geopolitical, considerations, even in the "least likely" case. (2) For those interested in the role of ideas in foreign policy, it suggests that strong interest groups in a democracy limit the impact of a leader's geopolitical ideology, even where the executive has very broad institutional autonomy. De Gaulle was a democratic statesman first and an ideological visionary second. (3) For those who employ qualitative case-study methods, it suggests that even a broad, representative sample of secondary sources does not create a firm basis for causal inference. For political scientists, as for historians, there is in many cases no reliable alternative to primary-source research.

Relevance: 60.00%

Abstract:

In this dissertation, I explore the impact of several public policies on civic participation. Using a unique combination of school administrative and public-use voter files and methods for causal inference, I evaluate the impact of three new, as yet unexplored, policies: one informational, one institutional, and one skill-based. Chapter 2 examines the causal effect of No Child Left Behind's performance-based accountability school failure signals on turnout in school board elections and on individuals' use of exit. I find that failure signals mobilize citizens both at the ballot box and by encouraging them to vote with their feet. However, these increases in voice and exit come primarily from citizens who are already active, thus exacerbating inequalities in both forms of participation. Chapter 3 examines the causal effect of preregistration, an electoral reform that allows young citizens to enroll in the electoral system before turning 18 while also providing them with various in-school supports. Using data from the Current Population Survey and Florida Voter Files and multiple methods for causal inference, I (with my coauthor listed below) show that preregistration mobilizes, and does so for a diverse set of citizens. Finally, Chapter 4 examines the impact of psychosocial, or so-called non-cognitive, skills on voter turnout. Using information from the Fast Track intervention, I show that early-childhood investments in psychosocial skills have large, long-run spillovers on civic participation. These gains are widely distributed, being especially large for those least likely to participate. These chapters provide clear insights that reach across disciplinary boundaries and speak to current policy debates. By placing specific attention not only on whether these programs mobilize but also on whom they mobilize, I provide scholars and practitioners with new ways of thinking about how to address stubbornly low and unequal rates of citizen engagement.

Relevance: 30.00%

Abstract:

Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle. But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.

Relevance: 30.00%

Abstract:

While scientific realism generally assumes that successful scientific explanations yield information about reality, realists also have to admit that not all information acquired in this way is equally well warranted. Some versions of scientific realism do this by saying that explanatory posits with which we have established some kind of causal contact are better warranted than those that merely appear in theoretical hypotheses. I first explicate this distinction by considering some general criteria that permit us to distinguish causal warrant from theoretical warrant. I then apply these criteria to a specific case from particle physics, claiming that scientific realism has to incorporate the distinction between causal and theoretical warrant if it is to be an adequate stance in the philosophy of particle physics.

Relevance: 30.00%

Abstract:

When estimating treatment effects in HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, itself affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients but also a marker for treatment initiation, and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper we study a simple and intuitive approach to estimating similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from individuals starting treatment in a certain time interval. An overall effect estimate across all such trials is found using composite likelihood inference. The method offers an alternative to inverse probability of treatment weights, which are unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditional on covariate values at the start of each mimicked trial. This allows the study of questions that are not as easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV Cohort Study.
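To make the construction concrete, here is a minimal, self-contained sketch of the mimicked-trials idea: each trial enrols subjects still untreated at the start of an interval, the stacked trials are analysed with one Cox model stratified by trial, and robust (sandwich) errors stand in roughly for the composite-likelihood variance. The toy data and column names are invented; this is not the paper's exact estimator.

```python
# Sketch: build one "trial" per interval from observational data, stack them,
# and fit a single stratified Cox model with lifelines.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, n_periods = 500, 4

# Toy data: treatment start period (n_periods = never starts) and event times
# with administrative censoring at t = 12.
start_period = rng.integers(0, n_periods + 1, size=n)
event_time = rng.exponential(scale=8.0, size=n)
event = (event_time < 12).astype(int)
event_time = np.minimum(event_time, 12.0)

trials = []
for k in range(n_periods):
    at_risk = (start_period >= k) & (event_time > k)  # untreated, event-free at k
    trials.append(pd.DataFrame({
        "trial": k,
        "arm": (start_period[at_risk] == k).astype(int),  # starts treatment now
        "cd4_at_start": rng.normal(350 - 10 * k, 50, at_risk.sum()),
        "time": event_time[at_risk] - k,                  # time from trial start
        "event": event[at_risk],
    }))
stacked = pd.concat(trials, ignore_index=True)

# One Cox fit over all mimicked trials: each trial is a stratum, covariates
# at trial start enter as adjustment terms; robust errors because the same
# subject can appear in several trials.
cph = CoxPHFitter()
cph.fit(stacked, duration_col="time", event_col="event",
        strata=["trial"], robust=True)
cph.print_summary()
```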

Relevance: 30.00%

Abstract:

Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it requires only a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
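For readers unfamiliar with the terminology, the decomposition referred to above can be stated compactly in standard counterfactual notation; the identity below is generic telescoping algebra, not quoted from the paper. Here Y_{a,m} is the outcome under exposure a and mediator value m, and M_a is the mediator under exposure a.

```latex
% lambda denotes the hazard of the counterfactual failure time Y_{a, M_{a^*}}.
% On the hazard-ratio scale of the marginal structural Cox model, the total
% effect factors into natural direct and indirect parts:
\underbrace{\frac{\lambda_{Y_{1,M_1}}(t)}{\lambda_{Y_{0,M_0}}(t)}}_{\text{total effect}}
  = \underbrace{\frac{\lambda_{Y_{1,M_0}}(t)}{\lambda_{Y_{0,M_0}}(t)}}_{\text{natural direct}}
  \times
  \underbrace{\frac{\lambda_{Y_{1,M_1}}(t)}{\lambda_{Y_{1,M_0}}(t)}}_{\text{natural indirect}}
% In the additive hazards model the same telescoping gives a sum:
% lambda_{Y_{1,M_1}} - lambda_{Y_{0,M_0}}
%   = (lambda_{Y_{1,M_0}} - lambda_{Y_{0,M_0}})
%   + (lambda_{Y_{1,M_1}} - lambda_{Y_{1,M_0}}).
```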

Relevance: 30.00%

Abstract:

In this paper, we introduce the B2DI model, which extends the BDI model to perform Bayesian inference under uncertainty. For scalability and flexibility, Multiply Sectioned Bayesian Network (MSBN) technology has been selected and adapted for BDI agent reasoning. A belief update mechanism has been defined for agents whose belief models are connected by public shared beliefs, with the certainty of these beliefs updated based on the MSBN. The classical BDI agent architecture has been extended to manage uncertainty using Bayesian reasoning, and the resulting extended model, called B2DI, introduces a new control loop. The proposed B2DI model has been evaluated in a network fault diagnosis scenario, comparing it with two previously developed agent models on a real diagnosis testbed using JADEX. As a result, the proposed model exhibits significant improvements in the cost and time required to carry out a reliable diagnosis.
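As a toy illustration of the kind of belief update such an agent performs (deliberately much simpler than the MSBN machinery the paper adapts), here is a two-state fault-diagnosis example updated by exact Bayes' rule; all numbers and names are invented.

```python
# Sketch: an agent's belief P(fault) updated from a stream of alarm readings.
p_fault = 0.05                      # prior belief that the link is faulty
p_alarm_given_fault = 0.90          # sensor fires when the fault is present
p_alarm_given_ok = 0.10             # false-alarm rate

def update_belief(prior: float, alarm_seen: bool) -> float:
    """Posterior P(fault | observation) by Bayes' rule; successive calls
    assume observations are conditionally independent given the fault state."""
    like_fault = p_alarm_given_fault if alarm_seen else 1 - p_alarm_given_fault
    like_ok = p_alarm_given_ok if alarm_seen else 1 - p_alarm_given_ok
    evidence = like_fault * prior + like_ok * (1 - prior)
    return like_fault * prior / evidence

belief = p_fault
for alarm in [True, True, False]:   # stream of shared observations
    belief = update_belief(belief, alarm)
    print(f"alarm={alarm} -> P(fault) = {belief:.3f}")
```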

Relevance: 30.00%

Abstract:

This paper studies the causal nexus as a structural element of liability when that nexus is diffuse. To that end, it approaches loss of chance as a special theory of causation that applies when the causal link is not clear, which contradicts the prevailing thesis of traditional doctrine and case law, according to which loss of chance is an autonomous criterion of damage. The paper explains why loss of chance should be understood as a special theory of causation rather than an autonomous criterion of damage, emphasizing the element of certainty that characterizes damage. It then reviews the treatment that case law has given to loss of chance. In turn, it characterizes the legal nature of loss of chance, arguing that it is a logical inference made by the judge rather than a fact that alters the state of things, as damage does. Finally, it addresses proof of the loss-of-chance theory through a calculation of probabilities and identifies the steps required for adequate full reparation.

Relevance: 20.00%

Abstract:

New DNA-based predictive tests for physical characteristics and inference of ancestry are highly informative tools that are increasingly used in forensic genetic analysis. Two eye colour prediction models have been described for the analysis of unadmixed European populations: a Bayesian classifier (Snipper) and a multinomial logistic regression (MLR) system for the IrisPlex assay. Since multiple SNPs in combination contribute in varying degrees to eye colour predictability in Europeans, these predictive tests are likely to perform differently in admixed populations with European co-ancestry than in unadmixed Europeans. In this study we examined 99 individuals from two admixed South American populations, comparing eye colour with ancestry in order to reveal a direct correlation of light eye colour phenotypes with European co-ancestry in admixed individuals. Additionally, six eye colour prediction models, using varying numbers of SNPs and based on Snipper and MLR, were applied to the study populations. Furthermore, patterns of eye colour prediction were inferred for a set of publicly available admixed and globally distributed populations from the HGDP-CEPH panel and 1000 Genomes databases, with a special emphasis on admixed American populations similar to the study samples.
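As a schematic illustration of the MLR side of such predictors, the sketch below fits a multinomial logistic regression from a handful of SNP genotypes (coded as 0/1/2 minor-allele counts) to a three-class eye-colour label. The data and the generative rule are synthetic; this is not the published IrisPlex model or its coefficients.

```python
# Sketch: multinomial logistic regression from SNP genotypes to eye colour.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_snps, n = 6, 300                        # e.g. a six-SNP panel
X = rng.integers(0, 3, size=(n, n_snps)).astype(float)
# Invented rule: the first SNP dominates, loosely mimicking how HERC2
# rs12913832 dominates real eye-colour prediction.
score = 1.5 * X[:, 0] + 0.3 * X[:, 1:].sum(axis=1) + rng.normal(0, 1, n)
y = np.digitize(score, bins=[2.0, 4.0])   # 0=blue, 1=intermediate, 2=brown

clf = LogisticRegression(max_iter=1000).fit(X, y)  # multinomial by default
probs = clf.predict_proba(X[:3])          # per-class probabilities per person
print(np.round(probs, 2))
```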

Relevance: 20.00%

Abstract:

OBJECTIVE: To describe and compare longitudinal studies that allow inferences about the influence of daycare attendance on the nutritional status of preschool children. DATA SOURCES: Systematic review of scientific papers published between January 1990 and December 2008. Studies were retrieved from the following databases: Lilacs, SciELO and PubMed. A manual search of referenced articles was also performed. The search took place between March 2008 and June 2009, using the descriptors "daycare", "nutritional status", "anthropometry", "food consumption", "anemia" and "school feeding". DATA SYNTHESIS: The first stage of the study yielded 78 articles, but only seven could be included; the other 71 did not provide data relevant to the specific objective of this study. Among the articles in the literature, few allow inferences about the influence daycare may have on the nutritional status of preschoolers. However, longitudinal studies have shown a causal relation between regular daycare attendance and improved nutritional status. CONCLUSIONS: There is a positive relation between daycare attendance and improvement in nutritional status.

Relevance: 20.00%

Abstract:

This paper makes two points. First, we show that the line-of-sight solution to cosmic microwave anisotropies in Fourier space, even though formally defined for arbitrarily large wavelengths, leads to position-space solutions which depend only on the sources of anisotropies inside the past light cone of the observer. This foretold manifestation of causality in position (real) space happens order by order in a series expansion in powers of the visibility gamma = e^(-mu), where mu is the optical depth to Thomson scattering. We show that the contributions of order gamma^N to the cosmic microwave background (CMB) anisotropies are regulated by spacetime window functions which have support only inside the past light cone of the point of observation. Second, we show that the Fourier-Bessel expansion of the physical fields (including the temperature and polarization momenta) is an alternative to the usual Fourier basis as a framework to compute the anisotropies. The viability of the Fourier-Bessel series for treating the CMB is a consequence of the fact that the visibility function becomes exponentially small at redshifts z >> 10^3, effectively cutting off the past light cone and introducing a finite radius inside which initial conditions can affect physical observables measured at our position x⃗ = 0 and time t_0. Hence, for each multipole l there is a discrete tower of momenta k_il (not a continuum) which can affect physical observables, with the smallest momentum being k_1l ~ l. The Fourier-Bessel modes take into account precisely the information from the sources of anisotropies that propagates from the initial value surface to the point of observation, no more and no less. We also show that the physical observables (the temperature and polarization maps), and hence the angular power spectra, are unaffected by this choice of basis. This implies that the Fourier-Bessel expansion is the optimal scheme with which one can compute CMB anisotropies.
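For reference, the generic form of a Fourier-Bessel expansion inside a sphere of radius R around the observer is sketched below; these are the standard conventions for such an expansion, not notation copied from the paper.

```latex
% Standard Fourier-Bessel expansion of a field f inside a sphere of radius R:
f(\vec{x}) \;=\; \sum_{l=0}^{\infty} \sum_{m=-l}^{l} \sum_{i=1}^{\infty}
  f_{ilm}\, j_l\!\left(k_{il}\,|\vec{x}|\right) Y_{lm}(\hat{x}),
\qquad
k_{il} \;=\; \frac{u_{li}}{R},
% where u_{li} is the i-th zero of the spherical Bessel function j_l. Since
% the first zero u_{l1} grows roughly like l, each multipole l carries a
% discrete tower of momenta whose smallest member is of order l/R, matching
% the statement k_1l ~ l in the abstract.
```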

Relevance: 20.00%

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems in Systems Biology today. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly because the time series are short relative to the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is proposed here. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, with conditional entropy applied as the criterion function. To assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are drawn at random from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, on the other hand, vary in network size, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The best value of the Tsallis free parameter was on average in the range 2.5 <= q <= 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
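The criterion itself is easy to state in code. Below is a small, self-contained sketch of conditional Tsallis entropy used as a feature-selection criterion on binarized time series; the toy data, the q value, and the exhaustive search over predictor pairs are illustrative choices, not the DimReduction implementation.

```python
# Sketch: pick the predictor set minimizing conditional Tsallis entropy.
import itertools
import numpy as np

def tsallis_entropy(p: np.ndarray, q: float) -> float:
    """S_q(p) = (1 - sum_i p_i^q) / (q - 1); reduces to Shannon as q -> 1."""
    p = p[p > 0]
    if q == 1.0:
        return float(-(p * np.log2(p)).sum())
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

def conditional_tsallis(target_next: np.ndarray, predictors_now: np.ndarray,
                        q: float = 3.0) -> float:
    """H_q(target at t+1 | predictor states at t), weighted by state frequency."""
    keys = [tuple(row) for row in predictors_now]
    h = 0.0
    for key in set(keys):
        idx = [i for i, k in enumerate(keys) if k == key]
        _, counts = np.unique(target_next[idx], return_counts=True)
        h += (len(idx) / len(keys)) * tsallis_entropy(counts / counts.sum(), q)
    return h

# Toy binary expression series (genes x time); gene 0 is driven by genes 1, 2.
rng = np.random.default_rng(7)
expr = rng.integers(0, 2, size=(5, 40))
expr[0, 1:] = expr[1, :-1] ^ expr[2, :-1]

target = expr[0, 1:]
best = min(itertools.combinations(range(1, 5), 2),
           key=lambda pair: conditional_tsallis(target, expr[list(pair), :-1].T))
print("selected predictors for gene 0:", best)   # (1, 2): zero cond. entropy
```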