946 results for Predictive Models


Relevance: 20.00%

Abstract:

A large fraction of genome variation between individuals is comprised of submicroscopic copy number variation of genomic DNA segments. We assessed the relative contribution of structural changes and gene dosage alterations to phenotypic outcomes with mouse models of Smith-Magenis and Potocki-Lupski syndromes. We phenotyped mice with 1n (Deletion/+), 2n (+/+), 3n (Duplication/+), and balanced 2n compound heterozygous (Deletion/Duplication) copies of the same region. Parallel to the observations made in humans, such variation in gene copy number was sufficient to generate phenotypic consequences: in a number of cases diametrically opposing phenotypes were associated with gain versus loss of gene content. Surprisingly, some neurobehavioral traits were not rescued by restoration of the normal gene copy number. Transcriptome profiling showed that a highly significant proportion of transcriptional changes maps to the engineered interval in all five assessed tissues. A statistically significant overrepresentation of the genes mapping to the entire length of the engineered chromosome was also found in the top-ranked differentially expressed genes in the mice containing rearranged chromosomes, regardless of the nature of the rearrangement, an observation robust across different cell lineages of the central nervous system. Our data indicate that a structural change at a given position of the human genome may affect not only locus and adjacent gene expression but also "genome regulation." Furthermore, structural change can cause the same perturbation in particular pathways regardless of gene dosage. Thus, the presence of a genomic structural change, as well as gene dosage imbalance, contributes to the ultimate phenotype.

Relevance: 20.00%

Abstract:

Actualistic models of divergent and convergent margins are reviewed and applied to the history of the western Alps. The history and geometry of Tethyan rifting are analyzed: the northern European margin is considered an upper plate, whereas the southern Apulian margin is a lower plate; the Breche basin is regarded as the former break-away trough; the internal Brianconnais domain represents the northern rift shoulder, whilst the more external domains are regarded as the infill of a complex rim basin locally affected by important extension (Valaisan and Vocontian troughs). The Schistes lustres and ophiolites of the Tsate nappe are compared to an accretionary prism: the imbrication of this nappe's elements is regarded as a direct consequence of accretionary phenomena already active in the Early Cretaceous; the Gets/Simme complex could originate from a more internal part of the accretionary prism. Some eclogitic basements represent the former Apulian margin substratum (Sesia); others (Mont-Rose) are interpreted as the former edge of the European margin. The history of the closing Tethyan domain is analyzed, and the remaining problems concerning the kinematics, the presence or absence of a volcanic arc, and the eoalpine metamorphism are discussed.

Relevance: 20.00%

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful families of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from data, providing the optimal mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of 137Cs activity, given measurements taken in the Bryansk region following the Chernobyl accident.
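The multi-scale idea sketched in this abstract can be illustrated, under assumptions, by combining a short-scale and a large-scale RBF kernel into one kernel and fitting a standard SVR on it. The mixing weight `w` and the two length scales below are illustrative choices, not the paper's actual formulation:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

# Hedged sketch: a two-scale kernel as a convex mixture of a short-scale and
# a large-scale RBF kernel. The paper's multi-scale SVR is more elaborate;
# w, short and large are illustrative assumptions.
def multiscale_kernel(X, Y, w=0.5, short=1.0, large=25.0):
    # gamma = 1 / (2 * length_scale**2) for a Gaussian RBF
    k_short = rbf_kernel(X, Y, gamma=1.0 / (2 * short**2))
    k_large = rbf_kernel(X, Y, gamma=1.0 / (2 * large**2))
    return w * k_short + (1 - w) * k_large

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(200, 2))                      # spatial coordinates
y = np.sin(X[:, 0] / 10) + 0.2 * rng.standard_normal(200)   # smooth trend + noise

# scikit-learn's SVR accepts a callable kernel returning a Gram matrix
model = SVR(kernel=multiscale_kernel, C=10.0, epsilon=0.05)
model.fit(X, y)
pred = model.predict(X[:5])
```

A local anomaly would show up as structure captured mainly by the short-scale term, while the regional trend is carried by the large-scale term.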

Relevance: 20.00%

Abstract:

In automobile insurance, it is useful to achieve a priori ratemaking by resorting to generalized linear models, and here the Poisson regression model constitutes the most widely accepted basis. However, insurance companies distinguish between claims with or without bodily injuries, or claims with full or partial liability of the insured driver. This paper examines an a priori ratemaking procedure when including two different types of claim. When assuming independence between claim types, the premium can be obtained by summing the premiums for each type of guarantee and is dependent on the rating factors chosen. If the independence assumption is relaxed, then it is unclear as to how the tariff system might be affected. In order to answer this question, bivariate Poisson regression models, suitable for paired count data exhibiting correlation, are introduced. It is shown that the usual independence assumption is unrealistic here. These models are applied to an automobile insurance claims database containing 80,994 contracts belonging to a Spanish insurance company. Finally, the consequences for pure and loaded premiums when the independence assumption is relaxed by using a bivariate Poisson regression model are analysed.
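The correlation structure underlying the bivariate Poisson model can be sketched via the standard common-shock construction: N1 = Y1 + Y0 and N2 = Y2 + Y0, with Y0, Y1, Y2 independent Poisson variables, so that Cov(N1, N2) = lambda0. The rates below are illustrative, not estimates from the paper's Spanish portfolio:

```python
import numpy as np

# Hedged sketch of the bivariate Poisson construction via a common "shock".
# lambda0 induces positive correlation between the two claim types; all
# three rates here are illustrative assumptions.
rng = np.random.default_rng(42)
lam0, lam1, lam2 = 0.05, 0.10, 0.15   # common shock, type-specific rates

n = 200_000
y0 = rng.poisson(lam0, n)
n1 = rng.poisson(lam1, n) + y0        # e.g. claims with bodily injury
n2 = rng.poisson(lam2, n) + y0        # e.g. material-damage-only claims

# Theoretical moments: E[N1] = lam1 + lam0, Cov(N1, N2) = lam0, so
# Corr(N1, N2) = lam0 / sqrt((lam1 + lam0) * (lam2 + lam0)) ~= 0.29 here
corr = np.corrcoef(n1, n2)[0, 1]
```

Under independence (lambda0 = 0) the pure premium is just the sum of the two marginal expected costs; a positive lambda0 leaves the means per type shifted by lambda0 and changes the variance of the total claim count, which is what drives the loaded-premium consequences the paper analyses.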

Relevance: 20.00%

Abstract:

This study aimed to identify clinical factors for predicting hematologic toxicity after radioimmunotherapy with (90)Y-ibritumomab tiuxetan or (131)I-tositumomab in clinical practice. Hematologic data were available from 14 non-Hodgkin lymphoma patients treated with (90)Y-ibritumomab tiuxetan and 18 who received (131)I-tositumomab. The percentage of baseline at nadir and at 4 wk after nadir and the time to nadir were selected as the toxicity indicators for both platelets and neutrophils. Multiple linear regression analysis was performed to identify significant predictors (P < 0.05) of each indicator. For both platelets and neutrophils, pooled and separate analyses of (90)Y-ibritumomab tiuxetan and (131)I-tositumomab data yielded the time elapsed since the last chemotherapy as the only significant predictor of the percentage of baseline at nadir. The extent of bone marrow involvement was not a significant factor in this study, possibly because of the short time elapsed since the last chemotherapy for the 7 patients with bone marrow involvement. Because both treatments were designed to deliver a comparable bone marrow dose, this factor also was not significant. None of the 14 factors considered was predictive of the time to nadir. The R(2) value for the model predicting percentage of baseline at nadir was 0.60 for platelets and 0.40 for neutrophils. This model predicted the platelet and neutrophil toxicity grade to within ±1 for 28 and 30 of the 32 patients, respectively. For the 7 patients predicted to have grade I thrombocytopenia, 6 of whom actually had grade I-II, dosing might be increased to improve treatment efficacy. The elapsed time since the last chemotherapy can be used to predict hematologic toxicity and to customize the current dosing method in radioimmunotherapy.
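The single-predictor regression described here (percentage of baseline at nadir against time since last chemotherapy) can be sketched as follows. The synthetic data stand in for the study's 32 patients; the slope, intercept and noise level are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hedged sketch: ordinary least squares with one predictor, as retained by
# the study. The data below are synthetic, not the actual patient cohort.
rng = np.random.default_rng(1)
months_since_chemo = rng.uniform(3, 60, size=32).reshape(-1, 1)
# assumed positive relation: longer recovery time -> higher % of baseline
pct_baseline_at_nadir = (
    20 + 0.8 * months_since_chemo.ravel() + rng.normal(0, 8, 32)
)

model = LinearRegression().fit(months_since_chemo, pct_baseline_at_nadir)
r2 = model.score(months_since_chemo, pct_baseline_at_nadir)

# predicted % of baseline for a hypothetical patient 12 months after chemo
pred = model.predict([[12.0]])
```

The study's reported R(2) of 0.60 (platelets) and 0.40 (neutrophils) corresponds to `r2` here; the grade prediction is obtained by mapping the predicted nadir count onto the standard toxicity-grade thresholds.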

Relevance: 20.00%

Abstract:

Difficult tracheal intubation assessment is an important research topic in anesthesia, as failed intubations are an important cause of mortality in anesthetic practice. The modified Mallampati score is widely used, alone or in conjunction with other criteria, to predict the difficulty of intubation. This work presents an automatic method to assess the modified Mallampati score from an image of a patient with the mouth wide open. For this purpose we propose an active appearance model (AAM) based method and use linear support vector machines (SVM) to select a subset of relevant features obtained using the AAM. This feature selection step proves to be essential, as it drastically improves the performance of classification, which is obtained using an SVM with RBF kernel and majority voting. We test our method on images of 100 patients undergoing elective surgery, achieve 97.9% accuracy in the leave-one-out cross-validation test, and thus provide a key element of an automatic difficult intubation assessment system.
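The two-stage pipeline described (linear SVM for feature selection, RBF SVM for classification) can be sketched as below. AAM fitting itself is out of scope, so synthetic features stand in for the AAM shape/appearance parameters; the regularization settings are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC, SVC
from sklearn.feature_selection import SelectFromModel
from sklearn.pipeline import make_pipeline

# Hedged sketch: a sparse linear SVM selects a subset of features, then an
# RBF-kernel SVM classifies on the reduced set. 100 samples mimic the
# study's 100 patients; 50 synthetic features stand in for AAM parameters.
X, y = make_classification(n_samples=100, n_features=50, n_informative=8,
                           random_state=0)

# L1-penalized linear SVM zeroes out uninformative coefficients;
# SelectFromModel keeps only features with non-zero weight
selector = SelectFromModel(LinearSVC(C=0.5, penalty="l1", dual=False))
clf = make_pipeline(selector, SVC(kernel="rbf", gamma="scale"))
clf.fit(X, y)
acc = clf.score(X, y)   # training accuracy of the reduced-feature model
```

In the paper, leave-one-out cross-validation plus majority voting over predictions replaces the single training-set score computed here.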

Relevance: 20.00%

Abstract:

Study based on a research stay at the Institut National d'Histoire de l'Art - Bibliothèque Nationale de France between 1 and 31 July 2007. The work consisted of documentary research on the artistic relations between France and Catalonia in the Early Modern period. Two documentary collections were of interest: first, the print and engraving collections of the Bibliothèque Nationale, not so much through consultation of the originals (although occasionally that too) as through examination of its vast photographic archives, which allowed the author to gather a good number of images that will in future serve to relate French figurative culture (above all painting and sculpture, but also architectural treatises) to the Catalan culture of the period, whether to establish similarities or to note differences in the use of figurative models. Second, bibliographic material difficult to locate in Catalonia was surveyed: publications on engraving, on the one hand, and on artistic heritage, on the other. The latter aspect was approached from two angles: tracing records of the presence of Catalan artists in France and vice versa, and searching for data on the plundering of works of art carried out in Catalonia during the Napoleonic period.

Relevance: 20.00%

Abstract:

AIM: To confirm the accuracy of the sentinel node biopsy (SNB) procedure and its morbidity, and to investigate predictive factors for SN status and prognostic factors for disease-free survival (DFS) and disease-specific survival (DSS). MATERIALS AND METHODS: Between October 1997 and December 2004, 327 consecutive patients in one centre with clinically node-negative primary skin melanoma underwent an SNB by the triple technique, i.e. lymphoscintigraphy, blue dye and gamma probe. Multivariate logistic regression analyses were performed, together with Kaplan-Meier survival analysis. RESULTS: Twenty-three percent of the patients had at least one metastatic SN, which was significantly associated with Breslow thickness (p<0.001). The success rate of SNB was 99.1% and its morbidity was 7.6%. With a median follow-up of 33 months, the 5-year DFS/DSS were 43%/49% for patients with positive SN and 83.5%/87.4% for patients with negative SN, respectively. The false-negative rate of SNB was 8.6% and sensitivity 91.4%. On multivariate analysis, DFS was significantly worsened by Breslow thickness (RR=5.6, p<0.001), positive SN (RR=5.0, p<0.001) and male sex (RR=2.9, p=0.001). The presence of a metastatic SN (RR=8.4, p<0.001), male sex (RR=6.1, p<0.001), Breslow thickness (RR=3.2, p=0.013) and ulceration (RR=2.6, p=0.015) were significantly associated with a poorer DSS. CONCLUSION: SNB is a reliable procedure with high sensitivity (91.4%) and low morbidity. Breslow thickness was the only statistically significant parameter predictive of SN status. DFS was worsened, in decreasing order, by Breslow thickness, metastatic SN and male sex. Similarly, DSS was significantly worsened by a metastatic SN, male sex, Breslow thickness and ulceration. These data reinforce SN status as a powerful staging procedure.

Relevance: 20.00%

Abstract:

This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis); and, further, one that provides a statistical explanation of efficiency, as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector, after the crisis of 1997. Technical efficiency is modelled as being dependent on capital investment in three major areas (viz. land, machinery and office appliances) where land is intended to proxy the effects of unproductive, speculative capital investment; and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year long, post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour intensive to relatively capital intensive production in manufactures from 1998 to 2002.

Relevance: 20.00%

Abstract:

Block factor methods offer an attractive approach to forecasting with many predictors. They extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence arrives about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
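The weight-updating mechanism behind dynamic model averaging can be sketched in a few lines: model probabilities are flattened by a forgetting factor, then reweighted by each model's predictive likelihood for the newest observation. The two "models", their likelihoods and the forgetting factor alpha are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the dynamic model averaging weight update. The forgetting
# factor alpha < 1 discounts old evidence, letting weights adapt over time.
def dma_update(weights, pred_likelihoods, alpha=0.95):
    w = weights ** alpha
    w /= w.sum()                    # forgetting (flattening) step
    w = w * pred_likelihoods        # update with the new observation's fit
    return w / w.sum()

w = np.array([0.5, 0.5])
# suppose model 1 predicted the latest observation much better than model 2
w = dma_update(w, np.array([0.9, 0.1]))
```

Starting from equal weights, this step moves the weights to [0.9, 0.1]; dynamic model selection simply forecasts with the single highest-weight model instead of averaging.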