876 results for 2447: modelling and forecasting
Abstract:
AIM: We investigated the prognostic significance of intraductal carcinoma of the prostate (IDC-P) in biopsies and transurethral resections prior to external beam radiotherapy with or without androgen deprivation. METHODS: Cohort 1 consisted of 118 intermediate-risk prostate cancer patients treated by radiotherapy, with biochemical relapse as the primary end-point (median follow-up 6.5 years). Cohort 2 consisted of 132 high-risk patients enrolled in a phase III randomised trial (EORTC 22863) comparing radiotherapy alone to radiotherapy with long-term androgen deprivation (LTAD), with clinical progression-free survival as the primary end-point (median follow-up 9.1 years). Presence of IDC-P was identified after central review. Multivariable regression modelling and Kaplan-Meier analysis were performed with IDC-P as a dichotomous variable. RESULTS: IDC-P was a strong prognosticator for early (<36 months) biochemical relapse (HR 7.3; p = 0.007) in cohort 1 and for clinical disease-free survival in both arms of cohort 2 (radiotherapy arm: HR 3.5; p < 0.0001; radiotherapy plus LTAD arm: HR 2.8, p = 0.018). IDC-P retained significance after stratification for reviewed Gleason score in the radiotherapy arm (HR 2.3; p = 0.03). IDC-P was a strong prognosticator for metastatic failure rate (radiotherapy arm: HR 5.3; p < 0.0001; radiotherapy plus LTAD arm: HR 3.6; p = 0.05). CONCLUSIONS: IDC-P in diagnostic samples of patients with intermediate- or high-risk prostate cancer is an independent prognosticator of early biochemical relapse and metastatic failure rate after radiotherapy. We suggest that the presence of IDC-P in prostate biopsies should routinely be reported.
Abstract:
Schistosomiasis mansoni is not just a physical disease, but is related to social and behavioural factors as well. Snails of the genus Biomphalaria act as the intermediate host for Schistosoma mansoni, which infects humans through contact with water. The objective of this study is to classify the risk of schistosomiasis in the state of Minas Gerais (MG). We focus on socioeconomic and demographic features, basic sanitation features, the presence of accumulated water bodies, dense vegetation in the summer and winter seasons, and related terrain characteristics. We draw on the decision tree approach to infection risk modelling and mapping, and the robustness of the model was verified. The main variables selected by the procedure included the terrain's water accumulation capacity, temperature extremes and the Human Development Index. In addition, the model was used to generate two maps, one with a risk classification for the entire state of MG and another with the classification errors. The resulting map was 62.9% accurate.
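A decision-tree risk classifier of the kind described above can be sketched in a few lines. The variable names and thresholds below are hypothetical illustrations, not the fitted tree from the study; the actual tree was learned from the socioeconomic, sanitation and terrain data.

```python
# Illustrative decision-tree risk classifier for schistosomiasis.
# Thresholds and split variables are hypothetical; the study's actual
# tree was learned from data for Minas Gerais municipalities.

def classify_risk(water_accumulation, summer_temp_range, hdi):
    """Return a qualitative infection-risk class for a municipality."""
    if water_accumulation > 0.6:          # terrain retains standing water
        if hdi < 0.7:                     # low Human Development Index
            return "high"
        return "medium"
    if summer_temp_range > 15.0:          # strong temperature extremes
        return "medium"
    return "low"

# Hypothetical municipalities: (water accumulation, temp range, HDI)
municipalities = [(0.8, 12.0, 0.65), (0.3, 18.0, 0.75), (0.2, 10.0, 0.80)]
risks = [classify_risk(*m) for m in municipalities]
print(risks)  # ['high', 'medium', 'low']
```

In practice such a tree would be induced automatically (e.g. by recursive partitioning) and its error map would come from comparing predicted classes against held-out survey data.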
Abstract:
Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors (NNRTIs), are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT are discussed, using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction. Successful applications of these methodologies are also highlighted.
Abstract:
In this project, we have investigated new ways of modelling and analysing the human vasculature from medical images. The research was divided into two main areas: cerebral vasculature analysis and coronary artery modelling. Regarding cerebral vasculature analysis, we have studied cerebral aneurysms, the internal carotid artery and the Circle of Willis (CoW). Aneurysms are abnormal vessel enlargements that can rupture, causing severe cerebral damage or death. The understanding of this pathology, together with its virtual treatment and image-based diagnosis and prognosis, requires identification and detailed measurement of the aneurysms. In this context, we have proposed two automatic aneurysm isolation methods to separate the abnormal part of the vessel from the healthy part, in order to homogenise and speed up the processing pipeline usually employed to study this pathology [Cardenes2011TMI, arrabide2011MedPhys]. The results obtained from both methods have also been compared and validated in [Cardenes2012MBEC]. A second important task here was the analysis of the internal carotid artery [Bogunovic2011Media] and the automatic labelling of the CoW [Bogunovic2011MICCAI, Bogunovic2012TMI]. The second area of research covers the study of coronary arteries, especially coronary bifurcations, because that is where the formation of atherosclerotic plaque is most common and where intervention is most challenging. Therefore, we proposed a novel modelling method based on Computed Tomography Angiography (CTA) images, combined with Conventional Coronary Angiography (CCA), to obtain realistic vascular models of coronary bifurcations, presented in [Cardenes2011MICCAI] and fully validated, including phantom experiments, in [Cardene2013MedPhys]. The realistic models obtained from this method are being used to simulate stenting procedures and to investigate the hemodynamic variables in coronary bifurcations in the works submitted in [Morlachi2012, Chiastra2012].
Additionally, preliminary work has been carried out to reconstruct the coronary tree from rotational angiography, published in [Cardenes2012ISBI].
Abstract:
Background: To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphical generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases, we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks, including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
Abstract:
A new parameter is introduced: the lightning potential index (LPI), which is a measure of the potential for charge generation and separation that leads to lightning flashes in convective thunderstorms. The LPI is calculated within the charge separation region of clouds between 0 °C and −20 °C, where the non-inductive mechanism involving collisions of ice and graupel particles in the presence of supercooled water is most effective. As shown in several case studies using the Weather Research and Forecasting (WRF) model with explicit microphysics, the LPI is highly correlated with observed lightning. It is suggested that the LPI may be a useful parameter for predicting lightning, as well as a tool for improving weather forecasting of convective storms and heavy rainfall.
Abstract:
Interdisciplinary frameworks for studying natural hazards and their temporal trends have important potential for data generation in risk assessment, land use planning and, therefore, the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events over the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardised criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, the Llobregat and the Segre floods in Spain, from 1300 to 1980. The analysis of the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred during the 20th century in these rivers. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900 for the Segre and Ter rivers.
Abstract:
Within the scope of the European project Hydroptimet (INTERREG IIIB-MEDOCC programme), a limited-area model (LAM) intercomparison is performed for intense events that caused severe damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for giving a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift in the forecast error and to identify the error sources that affected each model's forecast. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work, including verification using a wider observational data set, is needed to support this statement.
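The contingency-table skill scores mentioned above follow standard definitions. A minimal sketch, with illustrative counts rather than the study's verification data:

```python
# Non-parametric skill scores from a 2x2 contingency table, as used for
# QPF verification: hits (a), false alarms (b), misses (c) and correct
# negatives (d). Score definitions are the standard ones; the counts in
# the example below are made up for illustration.

def skill_scores(a, b, c, d):
    n = a + b + c + d
    pod = a / (a + c)                      # probability of detection
    far = b / (a + b)                      # false alarm ratio
    csi = a / (a + b + c)                  # critical success index
    a_random = (a + b) * (a + c) / n       # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)  # equitable threat score
    return {"POD": pod, "FAR": far, "CSI": csi, "ETS": ets}

scores = skill_scores(a=30, b=10, c=20, d=40)
print({k: round(v, 3) for k, v in scores.items()})
# {'POD': 0.6, 'FAR': 0.25, 'CSI': 0.5, 'ETS': 0.25}
```

The misses (c) and false alarms (b) entries are exactly the table elements the paper uses to flag the Catalonia regions affected by each type of error.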
Abstract:
The present research reviews the analysis and modelling of Swiss franc interest rate curves (IRC) using unsupervised (SOM, Gaussian mixtures) and supervised (MLP) machine learning algorithms. IRC are considered as objects embedded in different feature spaces: maturities; maturity-date; and the parameters of the Nelson-Siegel model (NSM). Analysis of the NSM parameters and their temporal and clustering structures helps in understanding the relevance of the model and its potential use for forecasting. The mapping of IRC in a maturity-date feature space is presented and analysed for visualisation and forecasting purposes.
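The Nelson-Siegel parameterisation referred to above compresses a whole yield curve into a few interpretable parameters. A minimal sketch of the standard formula, with illustrative parameter values (not fitted to Swiss franc data):

```python
import math

# Nelson-Siegel yield curve: beta0 (long-run level), beta1 (slope),
# beta2 (curvature) and lam (decay scale). Parameter values below are
# illustrative, not estimates from the paper.

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield for maturity tau (in years) under the Nelson-Siegel model."""
    x = tau / lam
    loading = (1.0 - math.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))

# Example curve: long-run level 3%, inverted short end, mild mid hump.
y = nelson_siegel(tau=2.0, beta0=3.0, beta1=-2.0, beta2=1.0, lam=2.0)
print(round(y, 6))  # 2.0
```

Because each curve reduces to the vector (beta0, beta1, beta2, lam), clustering and forecasting can operate in this low-dimensional parameter space, which is exactly the feature-space view taken in the paper.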
Abstract:
The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data exploratory tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and thematic feature subsets, both in the context of feature selection and optimal parameters determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
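The core of MKL is replacing a single kernel with a weighted combination of per-feature kernels. A minimal sketch with fixed weights; in the paper the weights are learned jointly with the SVR, and the feature names below are illustrative:

```python
import math

# Multiple Kernel Learning combines per-feature kernels into a single
# kernel k(x, z) = sum_m d_m * k_m(x, z), with non-negative weights d_m
# summing to one. Here the weights are fixed for illustration; MKL
# learns them, and the learned d_m indicate feature relevance.

def rbf(u, v, gamma):
    return math.exp(-gamma * (u - v) ** 2)

def combined_kernel(x, z, gammas, weights):
    """Weighted sum of one RBF kernel per terrain feature."""
    return sum(w * rbf(xi, zi, g)
               for xi, zi, g, w in zip(x, z, gammas, weights))

# Two points described by (slope, curvature, directional derivative).
x = (0.2, -0.1, 0.5)
z = (0.25, 0.0, 0.4)
k = combined_kernel(x, z, gammas=(1.0, 2.0, 0.5), weights=(0.5, 0.3, 0.2))
print(round(k, 4))
```

A weight driven to zero effectively removes its feature, which is how the MKL weights double as a feature-selection and interpretability tool.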
Abstract:
The material presented in these notes covers the sessions Modelling of Electromechanical Systems, Passive Control Theory I and Passive Control Theory II of the II EURON/GEOPLEX Summer School on Modelling and Control of Complex Dynamical Systems. We start with a general description of what an electromechanical system is from a network modelling point of view. Next, a general formulation in terms of PHDS is introduced, and some of the previous electromechanical systems are rewritten in this formalism. Power converters, which are variable structure systems (VSS), can also be given a PHDS form. We conclude the modelling part of these lectures with a rather complex example showing the interconnection of subsystems from several domains, namely an arrangement to temporarily store the surplus energy in a section of a metropolitan transportation system based on dc motor vehicles, using either arrays of supercapacitors or an electrically powered flywheel. The second part of the lectures addresses control of PHD systems. We first present the idea of control as the power-preserving interconnection of a plant and a controller, and the obstacle that arises in this setting. Next we discuss how to circumvent this obstacle and present the basic ideas of Interconnection and Damping Assignment (IDA) passivity-based control of PHD systems.
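For reference, the input-state-output port-Hamiltonian form underlying these lectures is usually written as follows (the standard formulation, not a quotation from the notes):

```latex
% Input-state-output port-Hamiltonian system with dissipation:
%   J(x) = -J(x)^T  interconnection structure,
%   R(x) >= 0       dissipation,
%   H(x)            total stored energy,
%   (u, y)          power-conjugated port variables.
\dot{x} = \bigl[J(x) - R(x)\bigr]\,\frac{\partial H}{\partial x}(x) + g(x)\,u,
\qquad
y = g(x)^{\top}\,\frac{\partial H}{\partial x}(x)
```

The resulting power balance, $\dot{H} = -\frac{\partial H}{\partial x}^{\top} R \,\frac{\partial H}{\partial x} + y^{\top} u \le y^{\top} u$, is what makes such systems passive and is the starting point for the IDA passivity-based control ideas in the second part.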
Abstract:
The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on decision-oriented problems of environmental spatial data mining and modelling: valorisation and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residual sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modelling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterise the performance of the ML algorithms by analysing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study of the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be efficiently used in the decision-making process.
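The two-step hybrid idea above (data-driven trend plus stochastic simulation of residuals) can be sketched in miniature. Here a closed-form linear trend and a residual bootstrap stand in for the paper's MLP/SVR trend models and geostatistical sequential simulation; the data are made up for illustration:

```python
import random

# Two-step sketch of the hybrid MLRSS idea: a data-driven model fits
# the long-range trend, and stochastic simulation of its residuals
# restores local variability. A linear trend and a residual bootstrap
# are simplified stand-ins for the MLP/SVR and sequential simulation.

def fit_linear_trend(xs, ys):
    """Ordinary least squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 1.2, 1.9, 3.2, 3.9]            # trend plus local noise

a, b = fit_linear_trend(xs, ys)
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]

# One stochastic realisation: trend plus resampled residuals.
random.seed(0)
realisation = [a + b * x + random.choice(residuals) for x in xs]
print([round(v, 2) for v in realisation])
```

Generating many such realisations and counting threshold exceedances per location is the basic route from simulations to the probability and risk maps discussed in the paper.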
Abstract:
This report details the port interconnection of two subsystems: a power electronics subsystem (a back-to-back AC/AC converter (B2B), coupled to a phase of the power grid) and an electromechanical subsystem (a doubly-fed induction machine (DFIM), coupled mechanically to a flywheel and electrically to the power grid and to a local varying load). Both subsystems have been essentially described in previous reports (deliverables D 0.5 and D 4.3.1), although some previously unpublished details are presented here. The B2B is a variable structure system (VSS), due to the presence of control-actuated switches; however, from a modelling and simulation, as well as a control-design, point of view, it is sensible to consider modulated transformers (MTF in the bond-graph language) instead of the pairs of complementary switches. The port-Hamiltonian models of both subsystems are presented and coupled through a power-preserving interconnection, and the Hamiltonian description of the whole system is obtained; detailed bond graphs of all the subsystems and of the complete system are provided.
Abstract:
The subject of this research is forecasting the capacity needs of the Fenix information system developed by TietoEnator Oy. The goal of the work is to become familiar with the different subsystems of the Fenix system, to find a way to isolate and model each subsystem's contribution to the system load, and to determine preliminarily which parameters affect the load generated by these subsystems. Part of this work is to examine different alternatives for simulation and to assess their suitability for modelling complex systems. Based on the collected information, a simulation model describing the load on the system's data warehouse is created. Using information obtained from the model together with measurements from the production system, the model is refined to correspond ever more closely to the behaviour of the real system. From the model, for example, the simulated system load and queue behaviour are examined. In the production system, changes in the behaviour of the different load sources are measured, for example in relation to the number of users and the time of day. The results of this work are intended to serve as a basis for later follow-up research, in which the parameterisation of the subsystems will be refined further, the model's ability to describe the real system will be improved, and the scope of the model will be expanded.
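The kind of load and queue behaviour examined with such a simulation model can be illustrated with a minimal single-server queue. This is a generic sketch, not the thesis model; the arrival and service rates are hypothetical, not measured Fenix parameters:

```python
import random

# Minimal single-server queue simulation, illustrating the simulated
# system load and queue behaviour studied with the capacity model.
# Rates are hypothetical, not measured Fenix workload parameters.

def simulate_queue(arrival_rate, service_rate, n_jobs, seed=1):
    """Mean waiting time for n_jobs through one FIFO server."""
    rng = random.Random(seed)
    t_arrive, t_free, waits = 0.0, 0.0, []
    for _ in range(n_jobs):
        t_arrive += rng.expovariate(arrival_rate)   # next job arrives
        start = max(t_arrive, t_free)               # wait if server busy
        waits.append(start - t_arrive)
        t_free = start + rng.expovariate(service_rate)
    return sum(waits) / n_jobs

mean_wait = simulate_queue(arrival_rate=0.8, service_rate=1.0, n_jobs=5000)
print(round(mean_wait, 2))
```

Varying the arrival rate by time of day or user count, as the thesis measures in the production system, then shows how sharply waiting times grow as utilisation approaches one.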
Abstract:
The present dissertation is devoted to a systematic approach to the development of abatement of toxic and refractory organic pollutants by chemical decomposition methods in the aqueous and gaseous phases. The systematic approach outlines the basic scenario of chemical decomposition process applications, with a step-by-step approximation to the most effective result and a predictable outcome for full-scale application, confirmed by successful experience. The strategy includes the following steps: chemistry studies; reaction kinetic studies in interaction with mass transfer processes under different control parameters; contact equipment design and studies; mathematical description of the process for its modelling and simulation; integration of the processes into treatment technology and its optimisation; and treatment plant design. The main idea of the systematic approach to introducing an oxidation process consists of a search for the most effective combination of the chemical reaction and the treatment device in which the reaction is supposed to take place. Under this strategy, knowledge of the reaction pathways, products, stoichiometry and kinetics is fundamental and, unfortunately, often unavailable beforehand. Therefore, research into the chemistry of novel treatment methods nowadays comprises a substantial part of the efforts. Chemical decomposition methods in the aqueous phase include oxidation by ozonation, ozone-associated methods (O3/H2O2, O3/UV, O3/TiO2), the Fenton reagent (H2O2/Fe2+/3+) and photocatalytic oxidation (PCO). In the gaseous phase, PCO and catalytic hydrolysis over zero-valent iron are developed.
The experimental studies within the described methodology involve aqueous-phase oxidation of natural organic matter (NOM) in potable water, phenolic and aromatic amino compounds, ethylene glycol and its derivatives as de-icing agents, and the oxygenated motor fuel additive methyl tert-butyl ether (MTBE) in leachates and polluted groundwater. Gas-phase chemical decomposition includes PCO of volatile organic compounds and dechlorination of chlorinated methane derivatives. The results of the research summarised here are presented in fifteen attachments (publications and papers submitted for publication and under preparation).