53 results for local-to-zero analysis
Abstract:
BACKGROUND Sepsis continues to be a major cause of death, disability, and health-care expenditure worldwide. Despite evidence suggesting that host genetics can influence sepsis outcomes, no specific loci have yet been convincingly replicated. The aim of this study was to identify genetic variants that influence sepsis survival. METHODS We did a genome-wide association study in three independent cohorts of white adult patients admitted to intensive care units with sepsis, severe sepsis, or septic shock (as defined by the International Consensus Criteria) due to pneumonia or intra-abdominal infection (cohorts 1-3, n=2534 patients). The primary outcome was 28-day survival. Results for the cohort of patients with sepsis due to pneumonia were combined in a meta-analysis of 1553 patients from all three cohorts, of whom 359 died within 28 days of admission to the intensive care unit. The most significantly associated single nucleotide polymorphisms (SNPs) were genotyped in a further 538 white patients with sepsis due to pneumonia (cohort 4), of whom 106 died. FINDINGS In the genome-wide meta-analysis of three independent pneumonia cohorts (cohorts 1-3), common variants in the FER gene were strongly associated with survival (p=9·7 × 10⁻⁸). Further genotyping of the top associated SNP (rs4957796) in the additional cohort (cohort 4) resulted in a combined p value of 5·6 × 10⁻⁸ (odds ratio 0·56, 95% CI 0·45-0·69). In a time-to-event analysis, each allele reduced 28-day mortality by 44% (hazard ratio for death 0·56, 95% CI 0·45-0·69; likelihood ratio test p=3·4 × 10⁻⁹, after adjustment for age and stratification by cohort). Mortality was 9·5% in patients carrying the CC genotype, 15·2% in those carrying the TC genotype, and 25·3% in those carrying the TT genotype. No significant genetic associations were identified when patients with sepsis due to pneumonia and intra-abdominal infection were combined. INTERPRETATION We have identified common variants in the FER gene that associate with a reduced risk of death from sepsis due to pneumonia. The FER gene and associated molecular pathways are potential novel targets for therapy or prevention and candidates for the development of biomarkers for risk stratification. FUNDING European Commission and the Wellcome Trust.
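For illustration of the time-to-event analysis reported above, a minimal Python sketch follows: a Cox proportional hazards model with a per-allele coding of the protective C allele (0/1/2 copies), so that exp(coef) corresponds to the per-allele hazard ratio. The data frame and the use of the lifelines package are assumptions for illustration, not the study's actual pipeline, and the numbers are invented.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Toy per-patient data (invented): follow-up time in days (capped at 28),
    # death indicator, copies of the protective C allele, and age.
    df = pd.DataFrame({
        "time":      [28, 14, 28, 7, 28, 21, 28, 10],
        "died":      [0, 1, 0, 1, 0, 1, 0, 1],
        "c_alleles": [2, 0, 1, 0, 2, 1, 1, 0],
        "age":       [61, 72, 55, 66, 49, 77, 70, 58],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="died")
    print(cph.summary[["coef", "exp(coef)"]])   # exp(coef): per-allele hazard ratio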
Abstract:
Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20–30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and are thus conceptually unable to predict fundamental features of iEEG time series, e.g., in the context of therapeutic brain stimulation. In this paper we present first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time series. More specifically, we learn distinct graphical models (so-called Chow–Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks based on thresholding of the absolute-value Pearson correlation coefficient (CC) matrix. Using various measures, the networks thus obtained are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as previously reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach to also model the temporal features of iEEG signals.
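The core construction described above can be sketched compactly: for approximately Gaussian signals, mutual information is monotone in the absolute correlation, so the Chow–Liu tree reduces to a maximum spanning tree over the absolute CC matrix, and a functional network follows by thresholding. A minimal Python sketch under these assumptions (channel count, sample count and threshold are illustrative, not taken from the paper):

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree

    rng = np.random.default_rng(0)
    n_ch = 16
    X = rng.standard_normal((n_ch, 5000))   # toy stand-in: 16 channels x 5000 samples
    cc = np.abs(np.corrcoef(X))             # absolute Pearson CC matrix

    # Chow-Liu tree (Gaussian case): maximum spanning tree over |CC|,
    # obtained by negating the weights for a minimum-spanning-tree routine.
    w = cc.copy()
    np.fill_diagonal(w, 0.0)                # ignore self-correlations
    cl_tree = minimum_spanning_tree(-w)
    tree_edges = np.transpose(cl_tree.nonzero())

    # Functional network: keep channel pairs whose |CC| exceeds a threshold.
    threshold = 0.1                         # illustrative choice
    adjacency = (cc > threshold) & ~np.eye(n_ch, dtype=bool)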
Abstract:
Children living near highways are exposed to higher concentrations of traffic-related carcinogenic pollutants. Several studies reported an increased risk of childhood cancer associated with traffic exposure, but the published evidence is inconclusive. We investigated whether cancer risk is associated with proximity of residence to highways in a nationwide cohort study including all children aged <16 years from the Swiss national censuses in 1990 and 2000. Cancer incidence was investigated in time-to-event analyses (1990-2008) using Cox proportional hazards models and in incidence density analyses (1985-2008) using Poisson regression. Adjustments were made for socio-economic factors, ionising background radiation and electromagnetic fields. In the time-to-event analysis based on 532 cases, the adjusted hazard ratio for leukaemia comparing children living <100 m from a highway with unexposed children (≥500 m) was 1.43 (95% CI 0.79, 2.61). Results were similar in the incidence density analysis including 1367 leukaemia cases (incidence rate ratio (IRR) 1.57; 95% CI 1.09, 2.25). Associations were similar for acute lymphoblastic leukaemia (IRR 1.64; 95% CI 1.10, 2.43) and stronger for leukaemia in children aged <5 years (IRR 1.92; 95% CI 1.22, 3.04). Little evidence of association was found for other tumours. Our study suggests that young children living close to highways are at increased risk of developing leukaemia.
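An incidence density analysis of the kind reported above is, in essence, a Poisson regression of case counts on exposure category with log person-years as an offset. A minimal sketch with invented counts (not the study's data), assuming the statsmodels package:

    import numpy as np
    import statsmodels.api as sm

    # Invented aggregated data: leukaemia cases and person-years per distance band
    # (<100 m, 100-500 m, >=500 m from a highway).
    cases        = np.array([30.0, 45.0, 1292.0])
    person_years = np.array([2.0e5, 4.5e5, 1.6e7])
    near = np.array([1.0, 0.0, 0.0])        # indicator: <100 m
    mid  = np.array([0.0, 1.0, 0.0])        # indicator: 100-500 m

    X = sm.add_constant(np.column_stack([near, mid]))
    fit = sm.GLM(cases, X, family=sm.families.Poisson(),
                 offset=np.log(person_years)).fit()
    print(np.exp(fit.params[1]))            # IRR for <100 m vs >=500 m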
Abstract:
We consider the problem of nonparametric estimation of a concave regression function F. We show that the supremum distance between the least squares estimator and F on a compact interval is typically of order (log(n)/n)^{2/5}. This entails rates of convergence for the estimator's derivative. Moreover, we discuss the impact of additional constraints on F, such as monotonicity and pointwise bounds. We then apply these results to the analysis of current status data, where the distribution function of the event times is assumed to be concave.
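The least squares estimator under a concavity constraint can be computed as a quadratic program: on an equally spaced grid, concavity is a linear constraint on second differences. A minimal sketch assuming the cvxpy package (the paper itself is theoretical; this merely illustrates the estimator being analysed):

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(1)
    n = 200
    x = np.linspace(0.0, 1.0, n)                              # equally spaced design
    y = -4.0 * (x - 0.5) ** 2 + 0.1 * rng.standard_normal(n)  # concave truth + noise

    f = cp.Variable(n)
    # Concavity on a uniform grid: all second differences nonpositive.
    constraints = [f[2:] - 2 * f[1:-1] + f[:-2] <= 0]
    cp.Problem(cp.Minimize(cp.sum_squares(y - f)), constraints).solve()
    f_hat = f.value                                           # concave least squares fit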
Abstract:
Governance of food systems is a poorly understood determinant of food security. Much scholarship on food systems governance is non-empirical, while existing research is often case study-based and theoretically and methodologically incommensurable, which frustrates the aggregation of evidence and generalisation. We undertook a systematic review of methods used in food systems governance research with a view to identifying a core set of indicators for future research. We gathered literature through a structured consultation and by sampling from recent reviews. Indicators were identified and classified according to the levels and sectors they investigate. We found a concentration of indicators in food production at local to national levels and sparse coverage of distribution and consumption. Unsurprisingly, many indicators of institutional structure were found, while agency-related indicators are moderately represented. We call for piloting and validation of these indicators and for methodological development to fill the gaps identified. These efforts are expected to support a more consolidated future evidence base and eventual meta-analysis.
Abstract:
Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem we analyze the current state of practice and propose an approach based on an extensible, declarative and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from its technical specification as prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects the optimal repair solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
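To make the idea concrete, the sketch below shows what a tool-agnostic, declarative layering rule and a trivial checker could look like in Python. The rule format, layer names and dependency list are purely hypothetical illustrations, not the DSL proposed by the authors:

    # Hypothetical declarative rule set: which layers may depend on which.
    RULES = {
        "ui":          {"may_depend_on": ["service"]},
        "service":     {"may_depend_on": ["persistence"]},
        "persistence": {"may_depend_on": []},
    }

    # Observed dependencies, e.g. extracted from imports by an analysis tool.
    DEPENDENCIES = [("ui", "service"), ("service", "persistence"), ("ui", "persistence")]

    def violations(rules, dependencies):
        """Return every observed dependency that the rules do not permit."""
        return [(src, dst) for src, dst in dependencies
                if dst not in rules.get(src, {}).get("may_depend_on", [])]

    print(violations(RULES, DEPENDENCIES))   # [('ui', 'persistence')]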
What does it mean to analyse the historical dimension of discourses? A discourse-historical approach
Abstract:
The aim of analogue model experiments in geology is to simulate structures in nature under specific imposed boundary conditions, using materials whose rheological properties are similar to those of rocks in nature. In the late 1980s, X-ray computed tomography (CT) was first applied to the analysis of such models. In early studies only a limited number of cross-sectional slices could be recorded because of the time involved in CT data acquisition, the long cooling periods for the X-ray source, and limited computational capacity. Technological improvements now allow an almost unlimited number of closely spaced serial cross-sections to be acquired and calculated. Computer visualization software allows a full 3D analysis of every recorded stage. Such analyses are especially valuable when trying to understand complex geological structures, commonly with lateral changes in 3D geometry. Periodic acquisition of volumetric data sets in the course of the experiment makes it possible to carry out a 4D analysis of the model, i.e. 3D analysis through time. Examples are shown of 4D analyses of analogue models that tested the influence of lateral rheological changes on the structures obtained in contractional and extensional settings.
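In practice, such a 4D analysis amounts to treating the experiment as a time-ordered stack of 3D CT volumes. A minimal Python sketch (with synthetic stand-in volumes; real data would come from the CT acquisitions):

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-ins for the 3D CT volumes (z, y, x) acquired at each stage;
    # in practice these would be loaded from the scanner's output files.
    stages = [rng.random((64, 64, 64)) for _ in range(8)]
    volume4d = np.stack(stages, axis=0)      # shape (t, z, y, x): 3D + time

    # Voxel-wise density change between consecutive stages highlights
    # where deformation localises as the experiment progresses.
    change = np.diff(volume4d, axis=0)
    print(volume4d.shape, np.abs(change).mean(axis=(1, 2, 3)))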