976 results for Models validation
Abstract:
Summary: Lipophilicity plays an important role in determining and understanding the pharmacokinetic behaviour of drugs. It is usually expressed by the partition coefficient (log P) in the n-octanol/water system. The use of an additional solvent system (1,2-dichloroethane/water) is necessary to obtain complementary information, as log Poct values alone are not sufficient to explain all biological properties. The aim of this thesis is to develop tools to predict the lipophilicity of new drugs and to analyse the information yielded by those log P values. Part I presents the development of theoretical models used to predict lipophilicity. Chapter 2 shows the need to extend the existing solvatochromic analyses in order to predict correctly the lipophilicity of new and complex neutral compounds. In Chapter 3, solvatochromic analyses are used to develop a model for predicting the lipophilicity of ions. A global model was obtained that estimates the lipophilicity of neutral, anionic and cationic solutes. Part II presents a detailed study of two physicochemical filters. Chapter 4 shows that the Discovery RP Amide C16 stationary phase allows the lipophilicity of the neutral form of basic and acidic solutes to be estimated, except for highly lipophilic acidic solutes, which present additional interactions with this particular stationary phase. In Chapter 5, four different IAM stationary phases are investigated. For neutral solutes, linear retention data are obtained whatever IAM column is used. For ionized solutes, retention is governed by a balance of electrostatic and hydrophobic interactions; thus no discrimination is observed between different series of solutes bearing the same charge from one column to another. Part III presents two examples illustrating the information obtained through structure-property relationships (SPR). Graphically comparing lipophilicity values obtained in two different solvent systems reveals the presence of intramolecular effects such as internal H-bonds (Chapter 6). SPR is also used to study the partitioning of ionizable groups encountered in medicinal chemistry (Chapter 7).
Lay summary: To exert its therapeutic effect, a drug must reach its site of action in sufficient quantity. The effective amount of drug reaching the site of action depends on the number of interactions between the drug and numerous constituents of the organism, such as metabolic enzymes or biological membranes. The passage of a drug across these membranes, called permeation, is an important parameter to optimise in order to develop more potent drugs. Lipophilicity plays a key role in understanding the passive permeation of drugs. It is generally expressed by the partition coefficient (log P) in the immiscible n-octanol/water solvent system. Log Poct values alone have proved insufficient to explain permeation across all the different biological membranes of the human body; the use of an additional solvent system (1,2-dichloroethane/water) has provided the complementary information essential to a good understanding of the permeation process. A large number of experimental and theoretical tools are available to study lipophilicity, and this thesis focuses mainly on developing or improving some of these tools so that they can be applied to a wider range of compounds. Two of these tools, in brief: 1) Factorising lipophilicity into structural properties of the compounds (such as volume) makes it possible to develop theoretical models that can be used to predict the lipophilicity of new compounds or drug candidates; this approach is applied to the analysis of the lipophilicity of neutral as well as charged compounds. 2) Reversed-phase high-performance liquid chromatography (RP-HPLC) is a method commonly used for the experimental determination of log Poct values.
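A solvatochromic analysis of this kind amounts to a multiple linear regression of log P on solute descriptors. The sketch below only illustrates that idea; the descriptor set, values and coefficients are invented and are not those of the thesis.

```python
# Hedged sketch of an LSER-type (solvatochromic) model for log P: lipophilicity
# is regressed on solute descriptors such as molar volume and H-bond acidity/basicity.
# All numbers are illustrative placeholders, not fitted thesis data.
import numpy as np

# columns: V (volume), pi* (dipolarity/polarizability), alpha (HBD), beta (HBA)
X = np.array([[0.72, 0.55, 0.00, 0.45],
              [0.99, 0.60, 0.30, 0.50],
              [1.24, 0.70, 0.00, 0.30],
              [0.87, 0.52, 0.60, 0.62],
              [1.10, 0.65, 0.10, 0.40]])
log_p = np.array([1.8, 1.2, 3.1, 0.4, 2.3])    # "measured" log P_oct (invented)

A = np.column_stack([np.ones(len(log_p)), X])  # add intercept term c
coefs, *_ = np.linalg.lstsq(A, log_p, rcond=None)
pred = A @ coefs
print("fitted coefficients (c, v, s, a, b):", np.round(coefs, 2))
print("predicted log P:", np.round(pred, 2))
```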
Abstract:
A wide range of numerical models and tools have been developed over the last decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Three Gorges Reservoir region of China was produced using binary logistic regression analysis. The available information includes the digital elevation model of the region, the geological map and different GIS layers, including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. A validation analysis is used to investigate the quality of the mapping.
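As a rough illustration of this workflow, the sketch below fits a binary logistic regression to hypothetical per-cell predictors (DEM-derived slope, lithology class, land cover class) and a synthetic landslide inventory, then scores a held-out validation set; all data and feature choices are placeholders, not those of the study.

```python
# Minimal sketch: landslide susceptibility mapping with binary logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_cells = 5000
slope = rng.uniform(0, 45, n_cells)          # slope angle from the DEM (degrees)
lithology = rng.integers(0, 5, n_cells)      # lithology class from the geological map
land_cover = rng.integers(0, 4, n_cells)     # land cover class from satellite imagery
X = np.column_stack([slope, lithology, land_cover])

# synthetic landslide inventory: failure probability increases with slope
p_fail = 1.0 / (1.0 + np.exp(-(slope - 30.0) / 5.0))
y = (rng.random(n_cells) < p_fail).astype(int)   # 1 = mapped landslide cell

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

susceptibility = model.predict_proba(X_te)[:, 1]  # per-cell susceptibility score
print("validation AUC:", round(roc_auc_score(y_te, susceptibility), 3))
```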
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.
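The core of the protocol described above is external validation: a model is trained on one data set and then evaluated, without refitting, on samples that were never used for training. The sketch below shows that setup with synthetic stand-ins for expression data; the choice of classifier and the use of the Matthews correlation coefficient as the summary metric are assumptions, not the MAQC-II specification.

```python
# Hedged sketch of blind external validation for an endpoint classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.metrics import matthews_corrcoef

X, y = make_classification(n_samples=400, n_features=1000, n_informative=20,
                           random_state=0)
X_train, y_train = X[:250], y[:250]   # training set (endpoint labels known)
X_blind, y_blind = X[250:], y[250:]   # blinded validation set, never seen in training

clf = LinearSVC(C=0.1, max_iter=10000).fit(X_train, y_train)
pred = clf.predict(X_blind)
print("external-validation MCC:", round(matthews_corrcoef(y_blind, pred), 3))
```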
Abstract:
OBJECTIVE: To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument and, more specifically, to test all published validation models using a single data set and appropriate statistical tools. DESIGN: Validation study using data from a cross-sectional survey. PARTICIPANTS: A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). MAIN OUTCOME MEASURE: French version of the 20-item PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA, based on (i) a Pearson estimator of the variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. RESULTS: The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Item response medians varied between 1 and 4 (response range 1-5), and missing values ranged from 5.7% to 12.3%. Strong floor and ceiling effects were present. Even though the loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC scores were associated with the expected variables of the field. CONCLUSIONS: Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in addition to consideration of single-item results, might be used instead of the five dimensions usually described.
Abstract:
The objective of this paper is to compare the performance of two predictive radiological models, logistic regression (LR) and neural network (NN), with five different resampling methods. One hundred and sixty-seven patients with proven calvarial lesions as the only known disease were enrolled. Clinical and CT data were used for the LR and NN models. Both models were developed with cross-validation, leave-one-out and three different bootstrap algorithms. The final results of each model were compared using the error rate and the area under the receiver operating characteristic curve (Az). The neural network obtained a statistically higher Az than LR with cross-validation. The remaining resampling validation methods did not reveal statistically significant differences between the LR and NN rules. The neural network classifier performs better than the one based on logistic regression. This advantage is well detected by three-fold cross-validation, but remains unnoticed when leave-one-out or bootstrap algorithms are used.
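For readers who want to reproduce this kind of comparison, the sketch below evaluates a logistic regression and a small neural network with cross-validated and bootstrap out-of-bag estimates of the area under the ROC curve (Az); the data, network size and number of bootstrap replicates are arbitrary placeholders, not those of the study.

```python
# Illustrative comparison of LR and NN classifiers under two resampling schemes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

X, y = make_classification(n_samples=167, n_features=10, random_state=0)
models = {"LR": LogisticRegression(max_iter=1000),
          "NN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)}

for name, model in models.items():
    # three-fold cross-validated area under the ROC curve (Az)
    az_cv = cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()
    # simple bootstrap: fit on resampled cases, score on the out-of-bag cases
    az_boot = []
    for b in range(100):
        idx = resample(np.arange(len(y)), random_state=b)
        oob = np.setdiff1d(np.arange(len(y)), idx)
        fitted = model.fit(X[idx], y[idx])
        az_boot.append(roc_auc_score(y[oob], fitted.predict_proba(X[oob])[:, 1]))
    print(f"{name}: Az (3-fold CV) = {az_cv:.3f}, Az (bootstrap OOB) = {np.mean(az_boot):.3f}")
```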
Abstract:
Cells of epithelial origin, e.g. from breast and prostate cancers, effectively differentiate into complex multicellular structures when cultured in three dimensions (3D) instead of on conventional two-dimensional (2D) adherent surfaces. The spectrum of different organotypic morphologies is highly dependent on the culture environment, which can be either non-adherent or scaffold-based. When embedded in physiological extracellular matrices (ECMs), such as laminin-rich basement membrane extracts, normal epithelial cells differentiate into acinar spheroids reminiscent of glandular ductal structures. Transformed cancer cells, in contrast, typically fail to undergo acinar morphogenesis, forming poorly differentiated or invasive multicellular structures. 3D cancer spheroids are widely accepted to better recapitulate various tumorigenic processes and drug responses. So far, however, 3D models have been employed predominantly in academia, and the pharmaceutical industry has yet to adopt them for wide and routine use. This is mainly due to poor characterisation of cell models and the lack of standardised workflows, high-throughput cell culture platforms, and proper readout and quantification tools. In this thesis, a complete workflow was established, comprising well-characterised 3D cell culture models for prostate cancer, a standardised 3D cell culture routine based on a high-throughput-ready platform, automated image acquisition with concomitant morphometric image analysis, and data visualisation, in order to enable large-scale high-content screens. Our integrated suite of software and statistical analysis tools was optimised and validated using a comprehensive panel of prostate cancer cell lines and 3D models. The tools quantify multiple key cancer-relevant morphological features, ranging from cancer cell invasion through multicellular differentiation to growth, and detect dynamic changes in both morphology and function, such as cell death and apoptosis, in response to experimental perturbations including RNA interference and small molecule inhibitors. Our panel of cell lines included many non-transformed lines and most currently available classic prostate cancer cell lines, which were characterised for their morphogenetic properties in 3D laminin-rich ECM. The phenotypes and gene expression profiles were evaluated for their relevance to pre-clinical drug discovery, disease modelling and basic research. In addition, a spontaneous model of invasive transformation was discovered, displaying a high degree of epithelial plasticity. This plasticity is mediated by an abundant bioactive serum lipid, lysophosphatidic acid (LPA), and its receptor LPAR1. The invasive transformation was caused by abrupt cytoskeletal rearrangement through impaired G protein alpha 12/13 and RhoA/ROCK signalling, and mediated by upregulated adenylyl cyclase/cyclic AMP (cAMP)/protein kinase A and Rac/PAK pathways. The spontaneous invasion model tangibly exemplifies the biological relevance of organotypic cell culture models. Overall, this thesis work underlines the power of novel morphometric screening tools in drug discovery.
Abstract:
Coronary artery disease is an atherosclerotic disease which leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, referred to as “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were produced surgically, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, the hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. The coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used to test novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labelled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions, and coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.
Abstract:
The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
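To make the two published reference models concrete, the sketch below applies a simple linear regression and the variance ratio method to synthetic concurrent wind speed series; the series lengths, Weibull parameters and noise level are arbitrary and are not those used in the paper.

```python
# Minimal sketch of two classical MCP (measure-correlate-predict) approaches.
import numpy as np

rng = np.random.default_rng(0)
ref_long = rng.weibull(2.0, 8760) * 7.0                          # long-term reference series
site_conc = np.clip(0.8 * ref_long[:2000]
                    + rng.normal(0, 1.0, 2000), 0, None)         # concurrent on-site data
ref_conc = ref_long[:2000]

# 1) Simple linear regression: site = a * ref + b, fitted on the concurrent period
a, b = np.polyfit(ref_conc, site_conc, 1)
pred_lr = a * ref_long + b

# 2) Variance ratio method: matches mean and variance instead of minimising residuals
mu_s, mu_r = site_conc.mean(), ref_conc.mean()
sd_s, sd_r = site_conc.std(), ref_conc.std()
pred_vr = mu_s + (sd_s / sd_r) * (ref_long - mu_r)

print(f"LR long-term mean: {pred_lr.mean():.2f} m/s")
print(f"VR long-term mean: {pred_vr.mean():.2f} m/s")
```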
Abstract:
Evaluating chemistry-climate models (CCMs) with the presented framework will increase our confidence in predictions of stratospheric ozone change.
Abstract:
As the number of simulation experiments increases, the validation and verification of these models demand special attention from simulation practitioners. A review of the current scientific literature shows that the descriptions of operational validation presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of guiding professionals, researchers and students in simulation, this article compiles statistical techniques for the operational validation of discrete simulation models into a practical guide. Finally, the guide's applicability was evaluated on two study objects representing two manufacturing cells, one from the automobile industry and the other from a Brazilian technology company. For each application, the guide identified distinct steps, owing to the different characteristics of the distributions analysed. © 2011 Brazilian Operations Research Society.
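As one example of the objective statistical techniques such a guide compiles, the sketch below compares real-system and simulated output of a manufacturing cell using a Welch t-test and an approximate confidence interval for the difference of means; the throughput values are invented and the pooled degrees of freedom are a simplification.

```python
# Illustrative objective check for operational validation of a simulation model.
import numpy as np
from scipy import stats

real = np.array([41.2, 39.8, 40.5, 42.1, 40.9, 41.7, 39.5, 40.8])       # observed throughput
simulated = np.array([40.6, 41.3, 40.1, 41.9, 40.2, 41.0, 40.7, 41.5])  # model replications

diff = simulated.mean() - real.mean()
t, p = stats.ttest_ind(simulated, real, equal_var=False)     # Welch two-sample t-test
se = np.sqrt(simulated.var(ddof=1) / len(simulated) + real.var(ddof=1) / len(real))
df = len(simulated) + len(real) - 2                          # pooled df (simplification)
ci = stats.t.interval(0.95, df, loc=diff, scale=se)

print(f"mean difference = {diff:.2f}, p-value = {p:.3f}, "
      f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# If the CI contains zero (or lies within a practically acceptable band),
# the model can be considered operationally valid for this output measure.
```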
Abstract:
Numerical modeling of the interaction between waves and coastal structures is a challenge due to the many nonlinear phenomena involved, such as wave propagation, wave transformation with water depth, interaction between incident and reflected waves, run-up/run-down and wave overtopping. Numerical models based on a Lagrangian formulation, like SPH (Smoothed Particle Hydrodynamics), allow complex free surface flows to be simulated. The validation of these numerical models is essential, but comparing numerical results with experimental data is not an easy task. In the present paper, two SPH numerical models, SPHysics LNEC and SPH UNESP, are validated by comparing their numerical results for waves interacting with a vertical breakwater against data obtained in physical model tests carried out in one of LNEC's flumes. To achieve this validation, the experimental set-up was designed to be compatible with the characteristics of the numerical models: the flume dimensions are exactly the same in the numerical and physical models and the incident wave characteristics are identical, which allows the accuracy of the numerical models to be determined, particularly regarding two complex phenomena: wave breaking and impact loads on the breakwater. It is shown that partial renormalization, i.e. renormalization applied only to particles near the structure, seems to be a promising compromise and an original method that allows waves to be propagated without diffusion while accurately modeling the pressure field near the structure.
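The "partial renormalization" mentioned above applies a correction only to particles near the structure. The sketch below shows one common SPH correction of that kind, a Shepard (zeroth-order) density filter restricted to a masked region; whether this matches the exact scheme used in the paper is an assumption, and the particle data, kernel choice and smoothing length are invented.

```python
# Hedged sketch: Shepard density renormalization applied only near the structure.
import numpy as np

def cubic_spline_w(r, h):
    """2D cubic spline SPH kernel."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    w = np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def shepard_filter(pos, rho, mass, h, mask):
    """Renormalize density only for particles where mask is True."""
    rho_new = rho.copy()
    for i in np.flatnonzero(mask):
        r = np.linalg.norm(pos - pos[i], axis=1)
        w = cubic_spline_w(r, h)
        rho_new[i] = np.sum(mass * w) / np.sum((mass / rho) * w)
    return rho_new

# usage with invented particle data
pos = np.random.rand(500, 2)
rho = np.full(500, 1000.0)
mass = np.full(500, 1000.0 / 500)
near_structure = pos[:, 0] > 0.9          # e.g. particles close to the breakwater
rho_filtered = shepard_filter(pos, rho, mass, 2.0 * 0.05, near_structure)
```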
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This paper presents the efforts to calculate the geoid model for Brazil. The model is limited by 6 degrees N and 35 degrees S in latitude and by 30 degrees W and 75 degrees W in longitude. The terrestrial gravity data for the continent have been updated by means of the most recent surveys in Brazil and in the neighbouring countries. The complete Bouguer and Helmert gravity anomalies have been derived with the Canadian package SHGEO. The short-wavelength component was estimated via FFT. The geopotential model EGM2008 was used as a reference field, restricted to degree and order 150. The model was validated against 844 GPS observations on benchmarks of the spirit-levelling network. The height anomalies, plus a topography-dependent correction term, derived from EGM2008 (degree 2190 and order 2159), GO_CONS_GCF_2_DIR_R2 (degree and order 240), GOCO02S (degree and order 250), EIGEN 51C (degree and order 359) and EIGEN 6C (degree and order 1420), as well as the geoidal heights derived from MAPGEO2004 (the former official geoid model of Brazil), have also been compared with the GPS points on benchmarks.
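Validation against GPS observations on benchmarks amounts to comparing the geometric geoid height (ellipsoidal height minus orthometric height) with the modelled value at each point. The sketch below shows that comparison; the benchmark values are invented placeholders.

```python
# Illustrative check of a gravimetric geoid against GPS/levelling benchmarks:
# N_gps = h (ellipsoidal, from GPS) - H (orthometric, from spirit levelling).
import numpy as np

h_gps = np.array([812.431, 523.118, 95.774, 301.552])    # ellipsoidal heights [m]
H_lev = np.array([818.902, 529.660, 102.315, 308.170])   # orthometric heights [m]
N_model = np.array([-6.45, -6.52, -6.57, -6.60])         # modelled geoid heights [m]

N_gps = h_gps - H_lev
diff = N_gps - N_model
print(f"mean = {diff.mean():.3f} m, std = {diff.std(ddof=1):.3f} m, "
      f"rms = {np.sqrt(np.mean(diff**2)):.3f} m")
```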