941 results for Log-linear model
Abstract:
OBJECTIVE: (1) To quantify wear of two different denture tooth materials in vivo with two study designs, (2) to relate tooth variables to vertical loss. METHODS: Two different denture tooth materials had been used (experimental material=test; DCL=control). In study 1 (split-mouth, 6 test centers) 60 subjects received complete dentures, in study 2 (two-arm, 1 test center) 29 subjects. In study 1 the mandibular dentures were supported by implants in 33% of the subjects, in study 2 in only 3% of the subjects. Impressions of the dentures were taken and poured with improved stone at baseline and after 6, 12, 18 and 24 months. Each operator evaluated the wear subjectively. Wear analysis was carried out with a laser scanning device. Maximal vertical loss of the attrition zones was calculated for each tooth cusp and tooth. A mixed linear model was used to statistically analyse the logarithmically transformed wear data. RESULTS: Due to drop-outs and unmatchable casts, only 47 subjects of study 1 and 14 of study 2 completed the 2-year recall. Overall, 75% of all teeth present could be analysed. There was no statistically significant difference in overall wear between the test and control material in either study 1 or study 2. The relative increase in wear over time was similar in both study designs. However, a strong subject effect and center effect were observed. The fixed factors included in the model (time, tooth, center, etc.) accounted for 43% of the variability, whereas the random subject effect accounted for another 30% of the variability, leaving about 28% of unexplained variability. More wear was consistently recorded in the maxillary teeth compared to the mandibular teeth and in the first molar teeth compared to the premolar teeth and the second molars. Likewise, the supporting cusps showed more wear than the non-supporting cusps. The amount of wear did not depend on whether or not the lower dentures were supported by implants. 
The subjective wear assessment was correct in about 67% of cases, assuming that a wear difference of 100 μm is subjectively detectable. SIGNIFICANCE: The clinical wear of denture teeth is highly variable, with a strong patient effect. More wear can be expected in maxillary denture teeth compared to mandibular teeth, first molars compared to premolars, and supported cusps compared to non-supported cusps. Laboratory data on the wear of denture tooth materials may not be confirmed in well-structured clinical trials, probably owing to the large inter-individual variability.
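The analysis strategy above (log-transforming wear so that multiplicative subject effects become additive, then partitioning variance between fixed factors and a random subject effect) can be illustrated with a minimal sketch. All numbers below are invented, and the real analysis used a full mixed linear model rather than this crude decomposition:

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical wear data (micrometres): each subject multiplies a common
# time trend by an individual factor, so effects are additive on the log scale.
subjects = {s: math.exp(random.gauss(0.0, 0.5)) for s in range(10)}  # subject effect
months = [6, 12, 18, 24]

log_wear = []  # (subject, log-transformed wear)
for s, factor in subjects.items():
    for t in months:
        wear = 20.0 * (t / 6.0) * factor * math.exp(random.gauss(0.0, 0.2))
        log_wear.append((s, math.log(wear)))

# Crude variance decomposition on the log scale: between-subject vs total
subj_means = {s: statistics.mean(v for q, v in log_wear if q == s) for s in subjects}
between = statistics.pvariance(list(subj_means.values()))
total = statistics.pvariance([v for _, v in log_wear])
print(f"share of variance from subject effect: {between / total:.2f}")
```

On the log scale the subject factor contributes a sizeable share of the total variance, mirroring the strong patient effect reported in the study.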
Abstract:
1. Few examples of habitat-modelling studies of rare and endangered species exist in the literature, although from a conservation perspective predicting their distribution would prove particularly useful. Paucity of data and lack of valid absences are the probable reasons for this shortcoming. Analytic solutions to accommodate the lack of absences include the ecological niche factor analysis (ENFA) and the use of generalized linear models (GLM) with simulated pseudo-absences. 2. In this study we tested a new approach to generating pseudo-absences, based on a preliminary ENFA habitat suitability (HS) map, for the endangered species Eryngium alpinum. This method of generating pseudo-absences was compared with two others: (i) use of a GLM with pseudo-absences generated totally at random, and (ii) use of an ENFA only. 3. The influence of two different spatial resolutions (i.e. grain) was also assessed for tackling the dilemma of quality (grain) vs. quantity (number of occurrences). Each combination of the three above-mentioned methods with the two grains generated a distinct HS map. 4. Four evaluation measures were used for comparing these HS maps: total deviance explained, best kappa, Gini coefficient and minimal predicted area (MPA). The last of these is a new evaluation criterion proposed in this study. 5. Results showed that (i) GLM models using ENFA-weighted pseudo-absences provide better results, except for the MPA value, and that (ii) quality (spatial resolution and locational accuracy) of the data appears to be more important than quantity (number of occurrences). Furthermore, the proposed MPA value is suggested as a useful measure of model evaluation when used to complement classical statistical measures. 6. Synthesis and applications. We suggest that the use of ENFA-weighted pseudo-absences is a possible way to enhance the quality of GLM-based potential distribution maps and that data quality (i.e. spatial resolution) prevails over quantity (i.e. number of data). 
Increased accuracy of potential distribution maps could help to better delineate suitable areas for species protection and reintroduction.
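A minimal sketch of the ENFA-weighted pseudo-absence idea, assuming suitability scores are already available from a preliminary ENFA run (cell names and HS values below are invented): cells are sampled as pseudo-absences with probability proportional to one minus their habitat suitability, so poorly suited cells are drawn far more often than under purely random sampling.

```python
import random

random.seed(42)

# Hypothetical habitat-suitability (HS) scores in [0, 1] for grid cells
# (values invented for illustration).
hs = {"cell_a": 0.9, "cell_b": 0.6, "cell_c": 0.3, "cell_d": 0.05}

# ENFA-weighted pseudo-absences: draw absence cells with probability
# proportional to (1 - HS).
cells = list(hs)
weights = [1.0 - hs[c] for c in cells]
pseudo_absences = random.choices(cells, weights=weights, k=1000)

counts = {c: pseudo_absences.count(c) for c in cells}
# cell_d (HS = 0.05) should dominate; cell_a (HS = 0.9) should be rare
print(counts)
```

These weighted pseudo-absences would then be combined with the true presences to fit the GLM.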
Abstract:
Experimental research has identified many putative agents of amphibian decline, yet the population-level consequences of these agents remain unknown, owing to lack of information on compensatory density dependence in natural populations. Here, we investigate the relative importance of intrinsic (density-dependent) and extrinsic (climatic) factors impacting the dynamics of a tree frog (Hyla arborea) population over 22 years. A combination of log-linear density dependence and rainfall (with a 2-year time lag corresponding to development time) explain 75% of the variance in the rate of increase. Such fluctuations around a variable return point might be responsible for the seemingly erratic demography and disequilibrium dynamics of many amphibian populations.
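The fitted structure (log-linear density dependence plus rainfall with a 2-year lag) can be sketched as a Gompertz-type model for the rate of increase. The coefficients and rainfall series below are invented for illustration, not the values estimated for the Hyla arborea population:

```python
import math

# Rate of increase r_t = log(N_{t+1}/N_t) declines log-linearly with density
# and rises with rainfall two years earlier (the development-time lag).
a, b, c = 2.0, -0.4, 0.002  # intercept, density dependence, rainfall effect (invented)

def rate_of_increase(n, rain_lag2):
    return a + b * math.log(n) + c * rain_lag2

rain = [300, 500, 200, 400, 350, 450, 250, 380]  # invented annual rainfall (mm)
trajectory = [100.0]
for t in range(2, len(rain)):
    r = rate_of_increase(trajectory[-1], rain[t - 2])
    trajectory.append(trajectory[-1] * math.exp(r))

print([round(x) for x in trajectory])
```

Because the equilibrium itself shifts with the lagged rainfall term, the population fluctuates around a moving return point, which is the kind of seemingly erratic dynamics the abstract describes.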
Abstract:
OBJECTIVE: Palliative sedation is a last-resort medical act aimed at relieving intolerable suffering induced by intractable symptoms in patients at the end of life. This act is generally accepted as being medically indicated under certain circumstances. A controversy remains in the literature as to its ethical validity. There is a certain vagueness in the literature regarding the legitimacy of palliative sedation in cases of non-physical refractory symptoms, especially "existential suffering." This pilot study aims to measure the influence of two independent variables (short/long prognosis and physical/existential suffering) on physicians' attitudes toward palliative sedation (dependent variable). METHODS: We used a 2 × 2 experimental design as described by Blondeau et al. Four clinical vignettes were developed (vignette 1: short prognosis/existential suffering; vignette 2: long prognosis/existential suffering; vignette 3: short prognosis/physical suffering; vignette 4: long prognosis/physical suffering). Each vignette presented a terminally ill patient with a summary description of his physical and psychological condition, medication, and family situation. The respondents' attitude towards sedation was assessed with a six-point Likert scale. A total of 240 vignettes were sent to selected Swiss physicians. RESULTS: 74 vignettes were completed (36%). The mean attitude scores were 2.62 ± 2.06 (v1), 1.88 ± 1.54 (v2), 4.54 ± 1.67 (v3), and 4.75 ± 1.71 (v4). General linear model analyses indicated that only the type of suffering had a significant impact on the attitude towards sedation (F = 33.92, df = 1, p < 0.001). SIGNIFICANCE OF THE RESULTS: The attitude of French-speaking Swiss physicians toward palliative sedation is more favorable in cases of physical suffering than of existential suffering. These results are in line with those found in the study of Blondeau et al. 
with Canadian physicians and will be discussed in light of the arguments given by physicians to explain their decisions.
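The direction of the reported effect can be checked directly from the four cell means: in a 2 × 2 design each main effect is the difference between the two marginal means. Using the scores from the abstract:

```python
# Main effects in the 2x2 design, from the reported mean attitude scores
# (v1 short/existential, v2 long/existential, v3 short/physical, v4 long/physical).
v1, v2, v3, v4 = 2.62, 1.88, 4.54, 4.75

# Effect of type of suffering: mean(physical) - mean(existential)
suffering_effect = (v3 + v4) / 2 - (v1 + v2) / 2
# Effect of prognosis: mean(short) - mean(long)
prognosis_effect = (v1 + v3) / 2 - (v2 + v4) / 2

print(f"suffering: {suffering_effect:+.3f}, prognosis: {prognosis_effect:+.3f}")
```

The suffering-type contrast (about 2.4 scale points) dwarfs the prognosis contrast (about 0.27), consistent with only the former reaching significance in the general linear model.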
Abstract:
Conventional chemotherapy of ovarian cancer often fails because of the onset of drug resistance and/or side effects and the persistence of residual cancer cells. This highlights an urgent need for advanced targeted therapies for effective remediation of the disease using a cytotoxic agent with immunomodulatory effects, such as shikonin (SHK). In preliminary experiments, we found SHK to be profoundly toxic in ovarian epithelial cancer cells (OVCAR-5 and ID8 cells) as well as in normal ovarian IOSE-398 cells, endothelial MS1 cells, and lymphocytes. To limit its cytotoxic impact solely to tumor cells within the tumor microenvironment (TME), we aimed to engineer SHK as polymeric nanoparticles (NPs) with a targeting moiety toward the tumor microvasculature. To this end, using a single/double emulsion solvent evaporation/diffusion technique with sonication, we formulated biodegradable NPs of poly(lactic-co-glycolic acid) (PLGA) loaded with SHK. The surface of the NPs was further decorated with the solubilizing agent polyethylene glycol (PEG) and a tumor endothelial marker 1 (TEM1)/endosialin-targeting antibody (Ab) through carbodiimide/N-hydroxysuccinimide chemistry. Having characterized the physicochemical and morphological properties of the NPs, we studied their drug-release profiles using various kinetic models. The biological impact of the NPs was also evaluated in tumor-associated endothelial MS1 cells, primary lymphocytes, and epithelial ovarian cancer OVCAR-5 cells. Based on particle size analysis and electron microscopy, the engineered NPs showed a smooth spherical shape with a size range of 120 to 250 nm and zeta potential values of -30 to -40 mV. Drug entrapment efficiency was ~80%-90%, which was reduced to ~50%-60% upon surface decoration with PEG and Ab. The liberation of SHK from the NPs showed a sustained-release profile that was best fitted with the Wagner log-probability model. 
Fluorescence microscopy and flow cytometry analysis showed active interaction of Ab-armed NPs with TEM1-positive MS1 cells, but not with TEM1-negative MS1 cells. While exposure of the PEGylated NPs for 2 hours was not toxic to lymphocytes, long-term exposure of the Ab-armed and PEGylated NPs was significantly toxic to TEM1-positive MS1 cells and OVCAR-5 cells. Based on these findings, we propose SHK-loaded Ab-armed PEGylated PLGA NPs as a novel nanomedicine for targeted therapy of solid tumors.
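The abstract reports that release was best described by the Wagner log-probability model after comparing several kinetic models. As a simplified illustration of that model-comparison step, the sketch below fits two other commonly compared release models (Higuchi and first-order) to invented cumulative-release data and ranks them by squared error:

```python
import math

# Invented cumulative-release data (fraction released at time t in hours),
# loosely shaped like a sustained-release profile.
t = [1, 2, 4, 8, 12, 24, 48]
q = [0.10, 0.15, 0.22, 0.31, 0.38, 0.55, 0.74]

# Higuchi model Q = k * sqrt(t): zero-intercept least-squares slope
# k = sum(Q_i * sqrt(t_i)) / sum(t_i)
k = sum(qi * math.sqrt(ti) for qi, ti in zip(q, t)) / sum(t)
sse_higuchi = sum((k * math.sqrt(ti) - qi) ** 2 for qi, ti in zip(q, t))

# First-order model Q = 1 - exp(-a t): crude grid search for the best rate a
def sse_first_order(a):
    return sum((1 - math.exp(-a * ti) - qi) ** 2 for qi, ti in zip(q, t))

best_a = min((x / 1000 for x in range(1, 200)), key=sse_first_order)
sse_first = sse_first_order(best_a)

print(f"Higuchi k={k:.3f} (SSE {sse_higuchi:.4f}); "
      f"first-order a={best_a} (SSE {sse_first:.4f})")
```

With these (deliberately square-root-shaped) data the Higuchi model wins; the study performed the analogous comparison across its candidate models and found the Wagner log-probability model best.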
Abstract:
The objective of this work was to evaluate the effects of single-nucleotide polymorphisms (SNPs) in the genes IGF1 (AF_017143.1:g.198C>T), MSTN (AF_320998.1:g.433C>A), MYOD1 (NC_007313:g.1274A>G) and MYF5 (NC_007303:g.1911A>G) on carcass and meat traits in Nelore (Bos indicus) and Nelore x B. taurus cattle. A total of 300 animals were genotyped and phenotyped for rib eye area (REA), backfat thickness (BT), intramuscular fat (IF), shear force (SF) and myofibrillar fragmentation index (MFI). The allele-substitution effect of each SNP was estimated by regressing the evaluated phenotypes on the number of copies of a particular allele using a general linear model. The polymorphism at IGF1 was non-informative in Nelore animals. In crossbred animals, the IGF1 C allele was associated with greater REA; however, this association was not significant after Bonferroni correction for multiple testing. The A allele of the MSTN polymorphism was absent in Nelore cattle and was found in only two crossbred animals. The MYOD1 and MYF5 polymorphisms provided little information in Nelore animals, with a G allele frequency of 0.097 and an A allele frequency of 0.031, respectively. These markers showed no association with the analyzed traits in the total sample of evaluated animals.
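The allele-substitution estimate described above is simply the slope from regressing phenotype on allele count (0, 1 or 2 copies). A minimal sketch with invented genotypes and phenotypes (the real analysis used a general linear model with additional fixed effects):

```python
import statistics

# Hypothetical genotypes coded as the count of one allele (0, 1, 2) and an
# invented phenotype (e.g. rib eye area in cm^2).
copies = [0, 0, 1, 1, 1, 2, 2, 0, 1, 2]
phenotype = [61.0, 63.5, 64.0, 66.2, 65.1, 68.3, 67.9, 62.2, 65.5, 69.0]

# OLS slope = Sxy / Sxx: the estimated change in phenotype per allele copy
mx, my = statistics.mean(copies), statistics.mean(phenotype)
sxy = sum((x - mx) * (y - my) for x, y in zip(copies, phenotype))
sxx = sum((x - mx) ** 2 for x in copies)
slope = sxy / sxx
print(f"estimated effect per allele copy: {slope:.2f}")
```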
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also require management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, served as the example species. 
This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census-data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and dispersal obstacles therefore also had to be modelled. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of the results; a module also allows the mapping of dispersal barriers and corridors. The application domain of ENFA was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. 
The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package named HexaSpace was developed, which serves two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Similarly, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As this software was designed to proceed from raw data to a complex, realistic model, and as it benefits from an intuitive user interface, it lends itself to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
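The demographic core described above (a Leslie matrix with density dependence and culling) can be sketched in a few lines. All vital rates, the carrying capacity and the culling rate below are invented, and the real SIM-Ibex model is additionally sex-structured and stochastic:

```python
# Toy age-structured projection: a Leslie-type update with logistic
# density dependence on fecundity and a fixed annual culling rate.
fecundity = [0.0, 0.8, 1.2]   # offspring per female by age class (invented)
survival = [0.8, 0.8]         # survival from class i to i+1 (invented)
K = 500.0                     # carrying capacity (invented)
cull = 0.05                   # fraction removed each year (invented)

pop = [100.0, 60.0, 40.0]     # starting numbers per age class
for year in range(50):
    dd = max(0.0, 1.0 - sum(pop) / K)            # density-dependent scaling
    births = sum(f * n for f, n in zip(fecundity, pop)) * dd
    pop = [births] + [s * n for s, n in zip(survival, pop[:-1])]
    pop = [n * (1.0 - cull) for n in pop]        # culling applied uniformly

print(f"population after 50 years: {sum(pop):.0f}")
```

With these parameters the population settles at the density where growth, density dependence and culling balance, which is the kind of equilibrium a manager would probe by varying the culling rate.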
Abstract:
The fast development of new technologies such as digital medical imaging has led to the expansion of brain functional studies. One key methodological issue in such studies is comparing neuronal activation between individuals. In this context, the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalising subjects' brains to a standard brain. Widely used standard brains are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute standard brain (SPM99). However, these methods are not precise enough for the superposition of the more variable portions of the cerebral cortex (e.g. the neocortex and the perisylvian zone) or of brain regions that are highly asymmetric between the two cerebral hemispheres (e.g. the planum temporale). The aim of this thesis is to evaluate a new image-processing technique based on non-linear, model-based registration. In contrast to intensity-based registration, model-based registration fits one image to another using spatial rather than intensity information. We extract identifiable anatomical features (point landmarks) in both the deforming and the target images, and from their correspondence we determine the appropriate deformation in 3D. As landmarks, we use six control points situated bilaterally: one on Heschl's gyrus, one on the motor hand area, and one on the sylvian fissure. 
The evaluation of this model-based approach was performed on MRI and fMRI images of nine of the eighteen subjects who participated in the earlier study of Maeder et al. Results on anatomical (MRI) images show the movement of the deforming brain's control points to the locations of the reference brain's control points. The distance from the deforming brain to the reference brain is smaller after registration than before. Registration of the functional (fMRI) images does not show a significant variation: the small number of registration landmarks (six) is evidently not sufficient to produce significant modification of the fMRI statistical maps. This thesis opens the way to a new computational technique for cortex registration, whose main direction will be to improve the registration algorithm by using not a single point as a landmark but many points representing one particular sulcus.
Abstract:
PURPOSE: Incisional hernia (IH) is one of the most frequent postoperative complications. Of all patients undergoing IH repair, a large proportion have a hernia that can be defined as a large incisional hernia (LIH). The aim of this study was to identify the preferred technique for LIH repair. METHODS: A systematic review of the literature was performed, and studies describing patients with an IH with a diameter of 10 cm or a surface of 100 cm² or more were included. Recurrence hazards per year were calculated for all techniques using a generalized linear model. RESULTS: Fifty-five articles were included, containing 3,945 LIH repairs. Mesh-reinforced techniques displayed better recurrence rates and hazards than techniques without mesh reinforcement. Of all the mesh techniques, sublay repair, the sandwich technique with sublay mesh, and aponeuroplasty with intraperitoneal mesh displayed the best results (recurrence rates of <3.6%, recurrence hazard <0.5% per year). Wound complications were frequent and most often seen after complex LIH repair. CONCLUSIONS: The use of mesh during LIH repair displayed the best recurrence rates and hazards. If possible, mesh in the sublay position should be used in cases of LIH repair.
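The review expresses results as recurrence hazards per year, which it estimated with a generalized linear model. A simpler constant-hazard sketch shows what such a rate means in practice (the counts below are invented, not taken from the review):

```python
import math

# Constant-hazard (exponential) approximation: hazard ≈ events / person-years,
# and the implied recurrence proportion at t years is 1 - exp(-hazard * t).
def hazard_per_year(recurrences, patients, mean_followup_years):
    person_years = patients * mean_followup_years
    return recurrences / person_years

# Invented pooled arm: 8 recurrences among 400 patients followed 4 years on average
h = hazard_per_year(recurrences=8, patients=400, mean_followup_years=4.0)
p5 = 1 - math.exp(-h * 5)  # implied recurrence proportion at 5 years
print(f"hazard: {100 * h:.2f}%/year, implied 5-year recurrence: {100 * p5:.1f}%")
```

Expressing recurrence as a hazard per year, rather than a crude rate, makes arms with different follow-up durations comparable, which is why the review used it.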
Abstract:
This work investigates the effect of software architecture design properties on the design and implementation time of a mobile service application based on a client-server architecture. The study builds on a real-life project whose qualitative analysis revealed that couplings between architectural components significantly affect the project workload. The main goal of the work was to verify this observation quantitatively. To this end, a set of software architecture design metrics was defined to describe the architecture of the system's subsystems, and two models using these metrics, one linear and one nonlinear, were built to estimate workload (the sum of a component's design, implementation and testing times). The coefficients of these models were fitted with a nonlinear global optimisation method, the differential evolution algorithm, so that the model outputs matched the measured workload as closely as possible, both with all properties (attributes) included and with subsets of them (leaving one out in turn). When the couplings between architectural components were left out of the models, the difference between measured and estimated workloads (expressed as an error) grew in one case by 367%, meaning that the resulting model fitted the implementation times poorly on the given data. This was the largest error observed across all left-out attributes. Based on this result, it was concluded that the implementation times of the system depend strongly on the number of couplings, and that the number of couplings was therefore most likely the single most important factor affecting workload in the architectural design of the studied system.
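The fitting step (model coefficients tuned by differential evolution to match measured effort) can be sketched with a minimal DE/rand/1/bin loop. The effort data, the single-predictor model form and the DE settings below are invented for illustration; the actual study fitted richer linear and nonlinear models over several architecture metrics:

```python
import random

random.seed(0)

# Invented (couplings, effort) measurements, roughly effort ≈ 0.75 + 5.1 * couplings
data = [(2, 11.0), (4, 19.5), (6, 31.0), (8, 40.5), (10, 52.0)]

def sse(w):
    """Squared error of the linear model: effort = w[0] + w[1] * couplings."""
    return sum((w[0] + w[1] * c - e) ** 2 for c, e in data)

NP, F, CR, DIM = 20, 0.7, 0.9, 2  # population size, scale factor, crossover, dims
pop = [[random.uniform(-10.0, 10.0) for _ in range(DIM)] for _ in range(NP)]

for _ in range(200):  # generations
    for i in range(NP):
        r1, r2, r3 = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        # DE/rand/1 mutation with binomial crossover
        trial = [r1[d] + F * (r2[d] - r3[d]) if random.random() < CR else pop[i][d]
                 for d in range(DIM)]
        if sse(trial) <= sse(pop[i]):  # greedy selection
            pop[i] = trial

best = min(pop, key=sse)
print(f"fitted model: effort ≈ {best[0]:.2f} + {best[1]:.2f} * couplings")
```

Differential evolution needs only the objective value, not gradients, which is why it suits both the linear and the nonlinear effort models used in the study.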
Abstract:
OBJECTIVES: The aim of this study was to investigate pathological mechanisms underlying brain tissue alterations in mild cognitive impairment (MCI) using multi-contrast 3 T magnetic resonance imaging (MRI). METHODS: Forty-two MCI patients and 77 healthy controls (HC) underwent T1/T2* relaxometry as well as Magnetization Transfer (MT) MRI. Between-group comparisons of MRI metrics were performed using permutation-based tests. Using the MRI data, a generalized linear model (GLM) was computed to predict clinical performance, and a support-vector machine (SVM) classification was used to classify MCI and HC subjects. RESULTS: Multi-parametric MRI data showed microstructural brain alterations in MCI patients vs HC that might be interpreted as: (i) a broad loss of myelin/cellular proteins and tissue microstructure in the hippocampus (p ≤ 0.01) and global white matter (p < 0.05); and (ii) iron accumulation in the globus pallidus (p ≤ 0.05). MRI metrics accurately predicted memory and executive performances in patients (p ≤ 0.005). SVM classification reached an accuracy of 75% in separating MCI and HC, and performed best using both volumes and T1/T2*/MT metrics. CONCLUSION: Multi-contrast MRI appears to be a promising approach to infer pathophysiological mechanisms leading to brain tissue alterations in MCI. Likewise, parametric MRI data provide powerful correlates of cognitive deficits and improve automatic disease classification based on morphometric features.
Abstract:
INTRODUCTION: Local microstructural pathology in multiple sclerosis patients might influence their clinical performance. This study applied multicontrast MRI to quantify inflammation and neurodegeneration in MS lesions. We explored the impact of MRI-based lesion pathology on cognition and disability. METHODS: Thirty-six relapsing-remitting MS subjects and 18 healthy controls underwent neurological, cognitive, and behavioural examinations and 3 T MRI including (i) fluid attenuated inversion recovery, double inversion recovery, and magnetization-prepared gradient echo for lesion count; (ii) T1, T2, and T2* relaxometry and magnetisation transfer imaging for lesion tissue characterization. Lesions were classified according to the extent of inflammation/neurodegeneration. A generalized linear model assessed the contribution of lesion groups to clinical performances. RESULTS: Four lesion groups were identified and characterized by (1) absence of significant alterations, (2) prevalent inflammation, (3) concomitant inflammation and microdegeneration, and (4) prevalent tissue loss. Groups 1, 3 and 4 correlated with general disability (adjusted R² = 0.6; P = 0.0005), executive function (adjusted R² = 0.5; P = 0.004), verbal memory (adjusted R² = 0.4; P = 0.02), and attention (adjusted R² = 0.5; P = 0.002). CONCLUSION: Multicontrast MRI provides a new approach to infer in vivo histopathology of plaques. Our results support evidence that neurodegeneration is the major determinant of patients' disability and cognitive dysfunction.
Abstract:
OBJECTIVE: To quantify the relation between body mass index (BMI) and endometrial cancer risk, and to describe the shape of such a relation. DESIGN: Pooled analysis of three hospital-based case-control studies. SETTING: Italy and Switzerland. POPULATION: A total of 1449 women with endometrial cancer and 3811 controls. METHODS: Multivariate odds ratios (OR) and 95% confidence intervals (95% CI) were obtained from logistic regression models. The shape of the relation was determined using a class of flexible regression models. MAIN OUTCOME MEASURE: The relation of BMI with endometrial cancer. RESULTS: Compared with women with BMI 18.5 to <25 kg/m², the odds ratio was 5.73 (95% CI 4.28-7.68) for women with a BMI ≥35 kg/m². The odds ratios were 1.10 (95% CI 1.09-1.12) and 1.63 (95% CI 1.52-1.75), respectively, for increments of BMI of 1 and 5 units. The relation was stronger in never-users of oral contraceptives (OR 3.35, 95% CI 2.78-4.03, for BMI ≥30 versus <25 kg/m²) than in users (OR 1.22, 95% CI 0.56-2.67), and in women with diabetes (OR 8.10, 95% CI 4.10-16.01, for BMI ≥30 versus <25 kg/m²) than in those without diabetes (OR 2.95, 95% CI 2.44-3.56). The relation was best fitted by a cubic model, although after the exclusion of the 5% upper and lower tails, it was best fitted by a linear model. CONCLUSIONS: The results of this study confirm a role of elevated BMI in the aetiology of endometrial cancer and suggest that the risk in obese women increases in a cubic nonlinear fashion. The relation was stronger in never-users of oral contraceptives and in women with diabetes. TWEETABLE ABSTRACT: Risk of endometrial cancer increases with elevated body weight in a cubic nonlinear fashion.
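The two reported increment odds ratios are mutually consistent under the multiplicative (log-linear in the log-odds) assumption of the logistic model, which is easy to verify:

```python
# Under a multiplicative dose-response, the odds ratio for a 5-unit BMI
# increment is the 1-unit odds ratio raised to the 5th power.
or_per_unit = 1.10
or_per_5 = or_per_unit ** 5
print(f"implied OR for +5 BMI units: {or_per_5:.2f}")  # close to the reported 1.63
```

The small gap between 1.10⁵ ≈ 1.61 and the reported 1.63 reflects rounding of the per-unit estimate.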
Abstract:
The process of building mathematical models in quantitative structure-activity relationship (QSAR) studies is generally limited by the size of the dataset from which variables are selected. For huge datasets, the task of selecting a given number of variables that produces the best linear model can be enormous, if not unfeasible. In this case, some methods can be used to separate good parameter combinations from bad ones. In this paper three methodologies are analyzed: systematic search, genetic algorithms and chemometric methods. These methods are presented and discussed through practical examples.
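The scale argument is easy to make concrete: the number of candidate subsets a systematic search must evaluate is a binomial coefficient, which explodes with the size of the variable pool, whereas a genetic algorithm samples only a tiny fraction of this space. (The pool sizes below are illustrative, not from the paper.)

```python
from math import comb

# Number of distinct 5-variable linear models selectable from a pool of n
# candidate descriptors: C(n, 5) grows combinatorially with n.
for n in (20, 100, 1000):
    print(f"C({n}, 5) = {comb(n, 5):,}")
```

Exhaustive search is feasible for a few thousand combinations but hopeless at trillions, which is exactly where stochastic searches such as genetic algorithms become attractive.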
Abstract:
The objective of this manuscript is to describe a practical experiment that can be employed for teaching concepts related to design of experiments, using the Matlab or Octave computing environments, to beginners and to undergraduate and graduate students. The classical experiment for the determination of Fe(II) using o-phenanthroline was selected because it is easy to understand, and all the required materials are readily available in most analytical laboratories. The approach used in this tutorial is divided into two steps: first, the students are introduced to the concept of multivariate effects, how to calculate and interpret them, and the construction and evaluation of a linear model to describe the experimental domain by using a 2³ factorial design. Second, an extension of the factorial design by adding axial points is described, thereby providing a central composite design. The quadratic model is then introduced and used to build the response surface.
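The first step (computing main effects from a 2³ factorial design) can be sketched in a few lines of Python rather than Matlab/Octave. The coded ±1 factor levels are standard, but the response values below are invented stand-ins for the absorbance readings students would measure:

```python
from itertools import product

# A 2^3 factorial design: each main effect is the difference between the mean
# response at the high (+1) and low (-1) level of that factor.
runs = list(product([-1, 1], repeat=3))  # the 8 factor-level combinations
response = [0.20, 0.35, 0.25, 0.42, 0.22, 0.36, 0.28, 0.45]  # invented readings

def main_effect(factor):
    high = [y for x, y in zip(runs, response) if x[factor] == 1]
    low = [y for x, y in zip(runs, response) if x[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effects = [main_effect(i) for i in range(3)]
print([f"{e:+.4f}" for e in effects])
```

With these invented responses the third factor has the largest main effect, the kind of conclusion students would draw before extending the design with axial points to fit the quadratic model.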