939 results for Statistical Language Model


Relevance:

30.00%

Publisher:

Abstract:

A semantic approach towards political conflict first emerged in the 1930s and provides the methodological foundations for the description of political conflicts, in particular the correlation between the language of description and reality. Any military or political confrontation presupposes axiological, conceptual and ideological confrontation. The form of adequate description can only be comprehended if the characteristic features of its language (structure) and thesaurus are revealed. Admitting the possibility of different descriptions implies the necessity of analysing this possible ambiguity, i.e. the characteristic features of the language which enable us to form various statements, including mutually exclusive ones. The insoluble task of finding a middle ground between the viewpoints of the conflicting parties should be replaced by soluble procedures of explaining and assessing the conflicting axiologies. For the description of conflict situations, when it is essential to represent various positions within a uniform system, an apparatus of model semantics seems to be the most appropriate one, both for generating alternatives and for bringing them together in a modal system of worlds in which procedures of transition from one world to another (i.e. the transworld compatibility between them) are also reflected. Reality is reconstructed not as a sort of middle ground between the mutually exclusive approaches, nor as their sum, but as a result of the overlapping of various worlds and the procedures of transition from one state of affairs to another. The description of a conflict is therefore seen as a system of worlds connected by modal relations, with this system of worlds emerging as the reality to be described. This approach makes it possible to describe the processes from the points of view of the participating parties and, at the same time, to reveal their basic attitudes. The main idea of this research is shown by the problems analysed: the description of conflict as methodology; language and behaviour (general problems of semiotic description); and the logico-semantic analysis of the notions of "problem and conflict", "genesis and chronology", and "the recurrent model of the (historical) explanation and interpretation of the conflict". Zolyan used data on the Karabagh conflict to demonstrate the dependence of the structure of semio-cultural codes on current political development and considered post-Soviet history as a semio-cultural problem. He sought to consider and reveal the logic of manipulations with history, and proposed the logic of preferences as a possible instrument for achieving compromise.
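
As a rough illustration of the apparatus described in this abstract, the following Python sketch encodes a "system of worlds" connected by transition relations, with a crude compatibility check between worlds. It is only a toy reading of the abstract's terminology; the names (World, ModalSystem, compatible) are invented for illustration and are not taken from Zolyan's work.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class World:
    name: str                        # e.g. one party's description of events
    facts: frozenset = frozenset()   # statements held true in this world

class ModalSystem:
    def __init__(self):
        self.worlds = {}
        self.transitions = {}        # world name -> names of reachable worlds

    def add_world(self, world):
        self.worlds[world.name] = world
        self.transitions.setdefault(world.name, set())

    def add_transition(self, src, dst):
        # a procedure of transition from one state of affairs to another
        self.transitions[src].add(dst)

    def compatible(self, a, b):
        # crude "transworld compatibility": no statement of a is negated in b
        return not any("not " + f in self.worlds[b].facts
                       for f in self.worlds[a].facts)

# Two mutually exclusive descriptions of the same conflict:
m = ModalSystem()
m.add_world(World("party_A", frozenset({"p"})))
m.add_world(World("party_B", frozenset({"not p"})))
m.add_transition("party_A", "party_B")
print(m.compatible("party_A", "party_B"))   # False: the axiologies conflict
```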

Relevance:

30.00%

Publisher:

Abstract:

Direct observations, satellite measurements and paleo records reveal strong variability in the Atlantic subpolar gyre (SPG) on various time scales. Here we show that variations of comparable amplitude can only be simulated in a coupled climate model in the proximity of a dynamical threshold. The threshold and the associated dynamic response are due to a positive feedback involving increased salt transport in the subpolar gyre and enhanced deep convection in its centre. A series of sensitivity experiments is performed with a coarse-resolution ocean general circulation model coupled to a statistical-dynamical atmosphere model which in itself does not produce atmospheric variability. To simulate the impact of atmospheric variability, the model system is perturbed with freshwater forcing of small but varying amplitude and multi-decadal to centennial periodicity, and with observed variations in wind stress. While both the freshwater and the wind-stress forcing have a small direct effect on the strength of the subpolar gyre, the magnitude of the gyre's response is strongly increased in the vicinity of the threshold. Our results indicate that baroclinic self-amplification in the North Atlantic Ocean can play an important role in presently observed SPG variability, and thereby in North Atlantic climate variability on multi-decadal scales.
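
A hedged sketch of the kind of perturbation experiment described: a small-amplitude, multi-decadal periodic freshwater anomaly used to mimic atmospheric variability. The amplitude and period below are placeholders, not the values used in the study.

```python
import numpy as np

years = np.arange(0, 1000)      # hypothetical 1000-year sensitivity experiment
amplitude_sv = 0.01             # small amplitude, in Sverdrups (placeholder)
period_yr = 60.0                # multi-decadal periodicity (placeholder)

# cyclic freshwater flux anomaly applied to the subpolar North Atlantic
fw_anomaly = amplitude_sv * np.sin(2.0 * np.pi * years / period_yr)

# near the dynamical threshold, the gyre response to this weak forcing is
# strongly amplified; far from it, the direct effect stays small
print(fw_anomaly[:5])
```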

Relevance:

30.00%

Publisher:

Abstract:

Object-oriented meta-languages such as MOF or EMOF are often used to specify domain-specific languages. However, these meta-languages lack the ability to describe behavior or operational semantics. Several approaches have used a subset of Java mixed with OCL as an executable meta-language. In this paper, we report our experience of using Smalltalk as an executable and integrated meta-language. We validated this approach by incrementally building Moose, a meta-described reengineering environment, over the last decade. The reflective capabilities of Smalltalk support a uniform way of letting the base developer focus on his tasks while at the same time allowing him to meta-describe his domain model. The advantage of this approach is that the developer uses the same tools and environment for both the base level and the meta level, as sketched below.
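
Since the paper's examples are in Smalltalk, the following is only a loose Python analogue of the idea of a meta-described domain model, where a class carries a queryable description of itself that generic tools can introspect. All names (Meta, ClassNode, describe) are invented for illustration and do not come from Moose.

```python
class Meta:
    """A minimal meta-description attached to a domain class."""
    def __init__(self):
        self.properties = {}

    def describe(self, name, type_, comment=""):
        self.properties[name] = {"type": type_, "comment": comment}

class ClassNode:
    # the base developer meta-describes the domain model alongside the code
    meta = Meta()
    meta.describe("name", str, "fully qualified class name")
    meta.describe("methods", list, "methods defined in the class")

    def __init__(self, name, methods=()):
        self.name = name
        self.methods = list(methods)

# A generic (meta-driven) tool can now introspect any such class:
for prop, desc in ClassNode.meta.properties.items():
    print(prop, desc["type"].__name__, "-", desc["comment"])
```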

Relevance:

30.00%

Publisher:

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters γ in probability models f(y; γ) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters γ = (θ, η) into a subset of interest θ and other "nuisance parameters" η necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
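
The partition γ = (θ, η) and the drive to make inference depend as little as possible on η can be illustrated with profile likelihood. The toy normal model below, with mean θ of interest and variance η as nuisance, is a minimal sketch of this idea and is not drawn from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=200)   # simulated data

def neg_log_lik(theta, eta, y):
    # full negative log-likelihood for the normal model f(y; theta, eta)
    return 0.5 * len(y) * np.log(2 * np.pi * eta) + np.sum((y - theta) ** 2) / (2 * eta)

def profile_neg_log_lik(theta, y):
    # for fixed theta, the MLE of the nuisance eta is available in closed form,
    # so eta is profiled out and inference focuses on theta alone
    eta_hat = np.mean((y - theta) ** 2)
    return neg_log_lik(theta, eta_hat, y)

res = minimize_scalar(lambda t: profile_neg_log_lik(t, y),
                      bounds=(-10, 10), method="bounded")
print("profile MLE of theta:", res.x)          # close to the sample mean
```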

Relevance:

30.00%

Publisher:

Abstract:

It is generally agreed that the mechanical environment of intervertebral disc cells plays an important role in maintaining a balanced matrix metabolism. The precise mechanism by which the signals are transduced into the cells is poorly understood. Osmotic changes in the extracellular matrix (ECM) are thought to be involved. Current in vitro studies on this topic are mostly short-term and show conflicting data on the reaction of disc cells subjected to osmotic changes, which is partly due to the heterogeneous and often substantially reduced culture systems used. The aim of the study was therefore to investigate the effects of cyclic osmotic loading for 4 weeks on metabolism and matrix gene expression in a whole-organ intervertebral disc culture system. Intervertebral disc/endplate units were isolated from New Zealand White rabbits and were either cultured in iso-osmotic media (335 mosmol/kg) or diurnally exposed for 8 hours to hyper-osmotic conditions (485 mosmol/kg). Cell viability, metabolic activity, matrix composition and the matrix gene expression profile (collagen types I/II and aggrecan) were monitored using a Live/Dead cell viability assay, a tetrazolium reduction test (WST-8), proteoglycan and DNA quantification assays, and quantitative PCR. The results show that diurnal osmotic stimulation did not have significant effects on proteoglycan content, cellularity or disc cell viability after 28 days in culture. However, hyperosmolarity caused increased cell death in the early culture phase and counteracted the up-regulation of type I collagen gene expression in nucleus and annulus cells. Moreover, the initially decreased cellular dehydrogenase activity recovered with osmotic stimulation after 4 weeks, and aggrecan gene down-regulation was delayed, although the latter effect was not significant according to our statistical criteria. In contrast, collagen type II did not respond to the osmotic changes and was down-regulated in both groups. In conclusion, diurnal hyper-osmotic stimulation of a whole-organ disc/endplate culture partially inhibits the matrix gene expression profile encountered in degenerative disc disease and counteracts cellular metabolic hypo-activity.

Relevance:

30.00%

Publisher:

Abstract:

Reconstruction of a patient-specific 3D bone surface from 2D calibrated fluoroscopic images and a point distribution model is discussed. We present a 2D/3D reconstruction scheme combining statistical extrapolation and regularized shape deformation with an iterative image-to-model correspondence establishing algorithm, and show its application to reconstructing the surface of the proximal femur. The image-to-model correspondence is established using a non-rigid 2D point matching process, which iteratively uses a symmetric injective nearest-neighbor mapping operator (sketched below) and a 2D thin-plate-spline-based deformation to find a fraction of best-matched 2D point pairs between features detected in the fluoroscopic images and those extracted from the 3D model. The obtained 2D point pairs are then used to set up a set of 3D point pairs, such that the 2D/3D reconstruction problem is turned into a 3D/3D one. We designed and conducted experiments on 11 cadaveric femurs to validate the present reconstruction scheme. An average mean reconstruction error of 1.2 mm was found when two fluoroscopic images were used for each bone; it decreased to 1.0 mm when three fluoroscopic images were used.
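
The symmetric injective nearest-neighbor step can be read as keeping only point pairs that choose each other as nearest neighbors. The toy numpy version below is a stand-in for the authors' implementation, with invented function and variable names.

```python
import numpy as np

def symmetric_nn_pairs(A, B):
    """A: (n, 2) image features, B: (m, 2) projected model points."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # (n, m) distances
    a2b = d.argmin(axis=1)          # nearest B-point for each A-point
    b2a = d.argmin(axis=0)          # nearest A-point for each B-point
    # keep mutual (hence injective) matches only
    return [(i, j) for i, j in enumerate(a2b) if b2a[j] == i]

A = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])
B = np.array([[0.1, 0.0], [0.9, 0.1]])
print(symmetric_nn_pairs(A, B))     # [(0, 0), (1, 1)]; the outlier is dropped
```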

Relevance:

30.00%

Publisher:

Abstract:

Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.
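
A minimal sketch of the rotation itself, on simulated data: pre-multiplying the residual vector by the inverse of the lower Cholesky factor of the estimated marginal variance whitens the residuals when the model fits, so their empirical CDF can be compared against a reference distribution.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(1)
n = 50
V_hat = 0.5 * np.eye(n) + 0.1 * np.ones((n, n))       # estimated marginal variance
resid = rng.multivariate_normal(np.zeros(n), V_hat)   # estimated marginal residuals

L = cholesky(V_hat, lower=True)                   # V_hat = L L'
rotated = solve_triangular(L, resid, lower=True)  # L^{-1} r: approx. iid if model fits

# empirical CDF of the rotated residuals, for a graphical display of model fit
x = np.sort(rotated)
ecdf = np.arange(1, n + 1) / n
print(x[:3], ecdf[:3])
```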

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE OF REVIEW: Predicting asthma episodes is notoriously difficult but has potentially significant consequences for the individual as well as for healthcare services. The purpose of this review is to describe recent insights into the prediction of acute asthma episodes in relation to classical clinical, functional or inflammatory variables, and to present a new concept for evaluating asthma as a dynamically regulated homeokinetic system. RECENT FINDINGS: Risk prediction for asthma episodes or relapse has been attempted using clinical scoring systems, considerations of environmental factors and lung function, and inflammatory and immunological markers in induced sputum or exhaled air; these approaches are summarized here. We have recently proposed that newer mathematical methods derived from statistical physics may be used to understand the complexity of asthma as a homeokinetic, dynamic system comprising a network of multiple components, and also to assess the risk of future asthma episodes based on fluctuation analysis of long time series of lung function. SUMMARY: Apart from the classical analysis of risk factors and functional parameters, this new approach may be used to assess asthma control and treatment effects in the individual as well as in future research trials.
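
One concrete fluctuation-analysis technique of the kind alluded to is detrended fluctuation analysis (DFA). The sketch below, run on a simulated peak-flow series, is only an illustration of the general approach; the review does not prescribe this exact algorithm.

```python
import numpy as np

def dfa_fluctuation(x, box_size):
    """Root-mean-square fluctuation of the integrated, locally detrended series."""
    y = np.cumsum(x - np.mean(x))                     # integrated profile
    n_boxes = len(y) // box_size
    f2 = []
    for k in range(n_boxes):
        seg = y[k * box_size:(k + 1) * box_size]
        t = np.arange(box_size)
        trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
        f2.append(np.mean((seg - trend) ** 2))
    return np.sqrt(np.mean(f2))

rng = np.random.default_rng(2)
pef = rng.normal(400, 30, size=2**10)                 # simulated peak-flow series
for n in (8, 16, 32, 64):
    # the slope of log F(n) vs log n gives the scaling exponent
    print(n, dfa_fluctuation(pef, n))
```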

Relevance:

30.00%

Publisher:

Abstract:

We establish a fundamental equivalence between singular value decomposition (SVD) and functional principal components analysis (FPCA) models. The constructive relationship allows us to deploy the numerical efficiency of SVD to fully estimate the components of FPCA, even for extremely high-dimensional functional objects, such as brain images. As an example, a functional mixed-effects model is fitted to high-resolution morphometric (RAVENS) images. The main directions of morphometric variation in brain volumes are identified and discussed.
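
A minimal sketch of the SVD-FPCA link on toy data (not RAVENS images): for a column-centered data matrix of subjects by grid points, the right singular vectors estimate the principal component functions and the scaled left singular vectors the subject scores.

```python
import numpy as np

rng = np.random.default_rng(3)
n_subj, n_grid = 100, 2000
grid = np.linspace(0, 1, n_grid)

# simulate functional data from two known components plus noise
scores = rng.normal(size=(n_subj, 2)) * np.array([3.0, 1.0])
phi = np.vstack([np.sin(2 * np.pi * grid), np.cos(2 * np.pi * grid)])
X = scores @ phi + 0.1 * rng.normal(size=(n_subj, n_grid))

Xc = X - X.mean(axis=0)                            # center across subjects
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # thin SVD: cheap even for huge n_grid
eigenfunctions = Vt[:2]                            # estimated FPCA component functions
fpc_scores = U[:, :2] * s[:2]                      # subject-level FPCA scores
print(s[:4] ** 2 / (n_subj - 1))                   # eigenvalue estimates
```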

Relevance:

30.00%

Publisher:

Abstract:

Objective. To examine the effects of primary care physicians (PCPs) and patients on the association between charges for primary care and specialty care in a point-of-service (POS) health plan. Data Source. Claims from 1996 for 3,308 adult male POS plan members, each of whom was assigned to one of the 50 family practitioner PCPs with the largest POS plan member loads. Study Design. A hierarchical multivariate two-part model was fitted using a Gibbs sampler to estimate PCPs' effects on patients' annual charges for two types of services, primary care and specialty care, the associations among PCPs' effects, and within-patient associations between charges for the two services. Adjusted Clinical Groups (ACGs) were used to adjust for case-mix. Principal Findings. PCPs with higher case-mix adjusted rates of specialist use were less likely to see their patients at least once during the year (estimated correlation: -.40; 95% CI: -.71, -.008) and provided fewer services to the patients they saw (estimated correlation: -.53; 95% CI: -.77, -.21). Ten of 11 PCPs whose case-mix adjusted effects on primary care charges were significantly less than or greater than zero (p < .05) had estimated, case-mix adjusted effects on specialty care charges of the opposite sign (though not significantly different from zero). After adjustment for ACG and PCP effects, the within-patient estimated odds ratio for any use of primary care given any use of specialty care was .57 (95% CI: .45, .73). Conclusions. PCPs and patients contributed independently to a trade-off between utilization of primary care and specialty care. The trade-off appeared to partially offset significant differences in the amount of care provided by PCPs. These findings were possible because we employed a hierarchical multivariate model rather than separate univariate models.
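
The two-part structure (one model for any use of a service, another for charges among users) can be sketched as follows. This toy version on simulated data ignores the paper's PCP-level hierarchy and Gibbs sampling, and all variable names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
casemix = rng.normal(size=n)                       # stand-in for an ACG case-mix score
any_use = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.8 * casemix))))
log_charge = 5.0 + 0.5 * casemix + rng.normal(scale=0.7, size=n)

X = sm.add_constant(casemix)
part1 = sm.Logit(any_use, X).fit(disp=0)           # part 1: any use of the service
pos = any_use == 1
part2 = sm.OLS(log_charge[pos], X[pos]).fit()      # part 2: charges among users

print(part1.params, part2.params)
```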

Relevance:

30.00%

Publisher:

Abstract:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) how can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity, and (2) how should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce a hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing using a special type of penalty matrix. We apply the model to estimating the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.
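
An unconstrained distributed lag regression can be sketched as below on simulated data. The paper's actual contribution, a hierarchical Bayesian prior that smooths the lag curve and pools information across counties, is omitted from this sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
T, L = 1460, 6                                   # ~4 years of daily data, lags 0-6
pm = rng.gamma(shape=2.0, scale=10.0, size=T)    # simulated PM exposure series
true_lag = np.array([0.4, 0.3, 0.2, 0.1, 0.0, 0.0, 0.0])

# lagged design matrix: column k holds exposure k days before each outcome day
X = np.column_stack([np.roll(pm, k) for k in range(L + 1)])[L:]
y = X @ true_lag + rng.normal(scale=5.0, size=T - L)

beta = np.linalg.lstsq(X, y, rcond=None)[0]      # estimated distributed lag curve
print(np.round(beta, 2))
print("cumulative effect:", beta.sum())          # total risk from short-term exposure
```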

Relevance:

30.00%

Publisher:

Abstract:

We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate the measurement error parameters, the true outcome (also known as the gold standard), and a relative weighting of the predictive scores. We describe the conditions necessary to estimate the gold standard and for these estimates to be calibrated, and we detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study evaluating a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures, and a relative weighting of the scores. Finally, we compare the scores against the gold-standard genotype and find that the Mendelian scores are, on average, the more refined and better calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
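
The core latent-gold-standard idea can be sketched with a toy EM algorithm: two error-prone binary assays of an unobserved genotype, whose accuracies and latent truth are estimated jointly. This omits the paper's weighting of the prediction scores and is not the authors' estimator; the symmetric-accuracy assumption is ours.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000
truth = rng.binomial(1, 0.3, size=n)                    # unobserved true genotype
# two assays that report the truth with probability 0.9 and 0.8 respectively
assay = np.column_stack([np.where(rng.random(n) < 0.9, truth, 1 - truth),
                         np.where(rng.random(n) < 0.8, truth, 1 - truth)])

prev, acc = 0.5, np.array([0.7, 0.7])                   # initial guesses
for _ in range(100):                                    # EM iterations
    # E-step: posterior probability that each subject is a true carrier
    like1 = prev * np.prod(np.where(assay == 1, acc, 1 - acc), axis=1)
    like0 = (1 - prev) * np.prod(np.where(assay == 0, acc, 1 - acc), axis=1)
    post = like1 / (like1 + like0)
    # M-step: update prevalence and each assay's (symmetric) accuracy
    prev = post.mean()
    acc = ((post[:, None] * (assay == 1))
           + ((1 - post)[:, None] * (assay == 0))).sum(axis=0) / n

print(prev, acc)                                        # ~0.3 and ~[0.9, 0.8]
```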