904 results for Test data
Abstract:
Background: Pseudoxanthoma elasticum (PXE) is a genetic disorder caused by mutations in the gene encoding the transmembrane transporter protein adenosine triphosphate binding cassette (ABC)-C6, resulting in calcification of elastic fibers in the skin, eyes, and cardiovascular system. Objectives: To evaluate the diagnostic criteria for PXE on the basis of molecular data. Methods: In 10 families with a positive history of PXE, 142 subjects were investigated for clinical symptoms, histological findings, and genetic haplotype. Results: Of these, 25 subjects were haplotypic homozygous for PXE, and 23 of them had the typical clinical and histopathological manifestations. The other two showed such marked solar elastosis and macular degeneration that PXE could not be confirmed clinically. Sixty-seven subjects were haplotypic heterozygous carriers and 50 were haplotypic homozygous unaffected. Of these 117 subjects, 116 showed no cutaneous or ophthalmologic signs of PXE. In one of the 50 haplotypic homozygous unaffected subjects, marked solar elastosis and scarring of the retina mimicked PXE lesions. Four of the 67 haplotypic heterozygous carriers had biopsies of nonlesional skin; all were histopathologically normal. Conclusions: In our patients, PXE presented as an autosomal recessive genodermatosis. The correlation of haplotype and phenotype confirmed the current major diagnostic criteria. In patients with marked solar elastosis and/or severe macular degeneration, clinical diagnosis can be impossible and molecular testing is needed to confirm the presence of PXE. To the best of our knowledge, this study is the first to compare clinical findings with molecular data in PXE.
Abstract:
Clozapine (CLO), an atypical antipsychotic, depends mainly on cytochrome P450 1A2 (CYP1A2) for its metabolic clearance. Four patients treated with CLO, all smokers, were nonresponders with low plasma levels while receiving usual doses. Their plasma level-to-dose ratios of CLO (median, 0.34; range, 0.22 to 0.40 ng x day/mL x mg) were significantly lower than the ratios calculated from another study of 29 patients (median, 0.75; range, 0.22 to 2.83 ng x day/mL x mg; P < 0.01). These patients were confirmed as CYP1A2 ultrarapid metabolizers by the caffeine phenotyping test (median systemic caffeine plasma clearance, 3.85; range, 3.33 to 4.17 mL/min/kg) when compared with previous studies (0.3 to 3.33 mL/min/kg). Sequencing of the entire CYP1A2 gene from the genomic DNA of these patients suggests that the -164C > A mutation (CYP1A2*1F) in intron 1, which confers high inducibility of CYP1A2 in smokers, is the most likely explanation for their ultrarapid CYP1A2 activity. A marked (2 patients) or moderate (2 patients) improvement of the patients' clinical state occurred after CLO blood levels were raised above the therapeutic threshold, either by increasing CLO doses to very high values (i.e., up to 1400 mg/d) or by introducing fluvoxamine, a potent CYP1A2 inhibitor, at low dosage (50 to 100 mg/d). Given the high frequency of smokers among patients with schizophrenia and the high frequency of the -164C > A polymorphism, CYP1A2 genotyping could have important clinical implications for the treatment of patients with CLO.
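The level-to-dose comparison above rests on a simple quotient; a minimal sketch, with illustrative numbers rather than the study's patient data:

```python
# Concentration-to-dose (C/D) ratio as used in therapeutic drug monitoring.
# The plasma level and daily dose below are illustrative assumptions,
# not data from the four patients in the study.

def cd_ratio(plasma_level_ng_ml: float, daily_dose_mg: float) -> float:
    """Return the C/D ratio in ng x day/(mL x mg)."""
    return plasma_level_ng_ml / daily_dose_mg

# A smoker on 600 mg/d with a trough level of 210 ng/mL:
ratio = cd_ratio(210.0, 600.0)
print(round(ratio, 2))  # 0.35, below the 0.75 reference median cited above
```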
Abstract:
With the trend in molecular epidemiology towards both genome-wide association studies and complex modelling, the need for large sample sizes to detect small effects and to allow for the estimation of many parameters within a model continues to increase. Unfortunately, most methods of association analysis have been restricted to either a family-based or a case-control design, resulting in the lack of synthesis of data from multiple studies. Transmission disequilibrium-type methods for detecting linkage disequilibrium from family data were developed as an effective way of preventing the detection of association due to population stratification. Because these methods condition on parental genotype, however, they have precluded the joint analysis of family and case-control data, although methods for case-control data may not protect against population stratification and do not allow for familial correlations. We present here an extension of a family-based association analysis method for continuous traits that will simultaneously test for, and if necessary control for, population stratification. We further extend this method to analyse binary traits (and therefore family and case-control data together) and accurately to estimate genetic effects in the population, even when using an ascertained family sample. Finally, we present the power of this binary extension for both family-only and joint family and case-control data, and demonstrate the accuracy of the association parameter and variance components in an ascertained family sample.
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas: Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources.
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
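The matching step described above can be sketched in a few lines; this is a minimal illustration on synthetic data, assuming scikit-learn for the propensity model, and is not the study's actual SCS variables or code:

```python
# Sketch of pretest/posttest balancing via propensity score matching.
# All data are synthetic assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Covariates used in the study: age, gender, level of education.
n = 200
X = np.column_stack([
    rng.normal(45, 15, n),          # age
    rng.integers(0, 2, n),          # gender (0/1)
    rng.integers(1, 6, n),          # education level (ordinal)
])
wave = rng.integers(0, 2, n)        # 0 = pretest sample, 1 = posttest sample

# 1. Estimate each respondent's propensity of belonging to the posttest wave.
ps = LogisticRegression().fit(X, wave).predict_proba(X)[:, 1]

# 2. Match each posttest respondent to the nearest pretest respondent
#    on the propensity score (nearest-neighbour matching with replacement).
post_idx = np.where(wave == 1)[0]
pre_idx = np.where(wave == 0)[0]
matches = pre_idx[np.abs(ps[pre_idx][None, :] - ps[post_idx][:, None]).argmin(axis=1)]

print(len(matches) == len(post_idx))  # True: one pretest match per posttest case
```

Covariate balance in the matched samples would then be checked before testing the pretest-posttest difference in the outcome measures.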
Abstract:
Moisture sensitivity of Hot Mix Asphalt (HMA) mixtures, generally called stripping, is a major form of distress in asphalt concrete pavement. It is characterized by the loss of the adhesive bond between the asphalt binder and the aggregate (a failure of the bonding of the binder to the aggregate) or by a softening of the cohesive bonds within the asphalt binder (a failure within the binder itself), both of which are due to the action of traffic loading in the presence of moisture. Tests for evaluating HMA moisture sensitivity fall into two categories: visual inspection tests and mechanical tests. However, most of them were developed for pre-Superpave mix designs. This research was undertaken to develop a protocol for evaluating the moisture sensitivity potential of HMA mixtures using the Nottingham Asphalt Tester (NAT). The mechanisms of HMA moisture sensitivity were reviewed and test protocols using the NAT were developed. Different types of blends, grouped as moisture-sensitive and non-moisture-sensitive, were used to evaluate the potential of the proposed test. The test results were analyzed with three performance-based parameters: the retained flow number at critical permanent deformation failure (RFNP), the retained flow number at cohesion failure (RFNC), and the energy ratio (ER). Analysis based on the energy ratio of elastic strain (EREE) at the flow number of cohesion failure (FNC) showed higher potential for evaluating HMA moisture sensitivity than the other parameters. If the measurement error in the data-acquisition process were removed, analyses based on RFNP and RFNC would also have high potential. The vacuum pressure saturation used in AASHTO T 283 and in the proposed test risks damaging the specimen before load application.
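A "retained" parameter of this kind is a ratio of the moisture-conditioned response to the dry response; a hedged sketch with made-up flow numbers and an assumed 0.8 pass threshold (the protocol's actual criteria may differ):

```python
# Sketch of a retained-ratio criterion in the spirit of RFNP/RFNC and of
# AASHTO T 283's tensile strength ratio. The flow numbers and the 0.8
# threshold are illustrative assumptions, not values from the protocol.

def retained_ratio(conditioned: float, unconditioned: float) -> float:
    """Ratio of moisture-conditioned to dry response (1.0 = no damage)."""
    return conditioned / unconditioned

fn_dry = 1250.0   # flow number of the dry specimen (load cycles)
fn_wet = 800.0    # flow number after vacuum saturation and conditioning
rfn = retained_ratio(fn_wet, fn_dry)
print(rfn >= 0.8)  # False: this blend would be flagged as moisture sensitive
```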
Abstract:
PURPOSE: The current study tested the applicability of Jessor's problem behavior theory (PBT) in national probability samples from Georgia and Switzerland. Comparisons focused on (1) the applicability of the problem behavior syndrome (PBS) in both developmental contexts, and (2) the applicability of a set of theory-driven risk and protective factors in the prediction of problem behaviors. METHODS: School-based questionnaire data were collected from n = 18,239 adolescents in Georgia (n = 9499) and Switzerland (n = 8740) following the same protocol. Participants rated five measures of problem behaviors (alcohol and drug use, problems because of alcohol and drug use, and deviance), three risk factors (future uncertainty, depression, and stress), and three protective factors (family, peer, and school attachment). Final study samples included n = 9043 Georgian youth (mean age = 15.57; 58.8% female) and n = 8348 Swiss youth (mean age = 17.95; 48.5% female). Data analyses used structural equation modeling, path analyses, and post hoc z-tests for comparisons of regression coefficients. RESULTS: Findings indicated that the PBS replicated in both samples, and that the theory-driven risk and protective factors accounted for 13% and 10% of the variance in the PBS in the Georgian and Swiss samples, respectively, net of the effects of demographic variables. Follow-up z-tests provided evidence of differences in magnitude, but not direction, in five of six individual paths by country. CONCLUSION: PBT and the PBS find empirical support in these Eurasian and Western European samples; thus, Jessor's theory holds value and promise for understanding the etiology of adolescent problem behaviors outside the United States.
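A post hoc z-test for comparing regression coefficients across independent samples divides the coefficient difference by the pooled standard error; a minimal sketch with illustrative coefficients (not study values):

```python
# z-test for the difference of two regression coefficients estimated in
# independent samples (as used to compare Georgian and Swiss paths).
# The coefficients and standard errors below are illustrative assumptions.
import math

def coef_diff_z(b1: float, se1: float, b2: float, se2: float) -> float:
    """z = (b1 - b2) / sqrt(se1^2 + se2^2)."""
    return (b1 - b2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Hypothetical path coefficients for the same predictor in two countries:
z = coef_diff_z(0.45, 0.05, 0.25, 0.05)
print(abs(z) > 1.96)  # True: the magnitudes differ at the .05 level
```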
Abstract:
Field-based soil moisture measurements are cumbersome. Remote sensing techniques are therefore needed, because they allow field- and landscape-scale mapping of soil moisture, depth-averaged through the root zone of the existing vegetation. The objective of this study was to evaluate the accuracy of an empirical relationship for calculating soil moisture from remote sensing data of irrigated soils of the Apodi Plateau, in the Brazilian semiarid region. The empirical relationship had previously been tested for irrigated soils in Mexico, Egypt, and Pakistan, with promising results. In this study, the relationship was evaluated against experimental data collected from a cotton field. The experiment was carried out in a 5 ha area of irrigated cotton. The energy balance and evaporative fraction (Λ) were measured by the Bowen ratio method. Soil moisture (θ) data were collected using a PR2 Profile Probe (Delta-T Devices Ltd). The empirical relationship was tested using the experimentally collected Λ and θ values and was then applied using Λ values obtained from the Surface Energy Balance Algorithm for Land (SEBAL) and three TM - Landsat 5 images. There was a close correlation between measured and estimated θ values (p<0.05, R² = 0.84) and no significant differences according to the Student t-test (p<0.01). The statistical analyses showed that the empirical relationship can be applied to estimate the root-zone soil moisture of irrigated soils, i.e. when the evaporative fraction is greater than 0.45.
Abstract:
INTRODUCTION: Quantitative sensory testing (QST) is widely used in human research to investigate the integrity of sensory function in patients with pain of neuropathic origin or of other causes such as low back pain. The reliability of QST has been evaluated on both sides of the face, the hands and the feet, as well as on the trunk (Th3-L3). In order to apply these tests to other body parts such as the lower lumbar spine, it is important first to establish their reliability in healthy individuals. The aim of this study was to investigate the intra-rater reliability of thermal QST in healthy adults at two sites within the L5 dermatome of the lumbar spine and lower extremity. METHODS: Test-retest reliability of thermal QST was determined at the L5 level of the lumbar spine and in the same dermatome on the lower extremity in 30 healthy persons under 40 years of age. Results were analyzed using descriptive statistics and the intraclass correlation coefficient (ICC). Values were compared to normative data using a Z-transformation. RESULTS: Mean intraindividual differences were small for cold and warm detection thresholds but larger for pain thresholds. ICC values showed excellent reliability for the warm detection and heat pain thresholds, good-to-excellent reliability for the cold pain threshold, and fair-to-excellent reliability for the cold detection threshold. The ICC values had wide 95% confidence intervals. CONCLUSION: In healthy adults, thermal QST on the lumbar spine and lower extremity demonstrated fair-to-excellent test-retest reliability.
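Test-retest ICCs of this kind can be computed from a two-way ANOVA decomposition of the n-subjects-by-k-sessions rating matrix; a sketch of the ICC(2,1) absolute-agreement form on made-up threshold data (the study's exact ICC form is an assumption):

```python
# ICC(2,1): two-way random effects, absolute agreement, single measure.
# The thermal threshold values below are fictitious illustration data,
# not measurements from the study.
import numpy as np

def icc2_1(Y: np.ndarray) -> float:
    """ICC(2,1) for an (n subjects x k sessions) rating matrix."""
    n, k = Y.shape
    grand = Y.mean()
    msr = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    msc = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
    sse = ((Y - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Fictitious warm detection thresholds (deg C), two sessions per subject:
Y = np.array([[33.1, 33.4], [34.0, 33.8], [32.5, 32.9],
              [35.2, 34.9], [33.7, 33.6]])
print(0.0 < icc2_1(Y) <= 1.0)  # True
```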
Abstract:
Field capacity (FC) is a parameter widely used in applied soil science. However, its in situ method of determination may be difficult to apply, mainly because of the need for large supplies of water at the test sites. Ottoni Filho et al. (2014) proposed a standardized procedure for the field determination of FC and showed that such in situ FC can be estimated by a linear pedotransfer function (PTF) based on the volumetric soil water content at a matric potential of -6 kPa [θ(6)], for the same soils used in the present study. The objective of this study was to use soil moisture data measured below a double-ring infiltrometer 48 h after the end of the infiltration test in order to develop PTFs for standard in situ FC. We found that such ring FC data were on average 0.03 m³ m⁻³ greater than the standard FC values. The linear PTF developed for the ring FC data based only on θ(6) was nearly as accurate as the equivalent PTF reported by Ottoni Filho et al. (2014), which was developed for the standard FC data. The root mean squared residues of FC determined from both PTFs were about 0.02 m³ m⁻³. The proposed method has the advantage of estimating the soil in situ FC using the water applied in the infiltration test.
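The PTF discussed above is a one-predictor linear regression; a minimal sketch with synthetic data (the values and fitted coefficients are assumptions, not those of Ottoni Filho et al.):

```python
# Sketch of a linear pedotransfer function FC = a + b * theta(6) fitted by
# least squares, with the root mean squared residue as the accuracy measure.
# All data values are synthetic assumptions.
import numpy as np

theta6 = np.array([0.18, 0.24, 0.30, 0.35, 0.41])  # water content at -6 kPa
fc_obs = np.array([0.17, 0.22, 0.29, 0.33, 0.40])  # in situ field capacity

b, a = np.polyfit(theta6, fc_obs, 1)               # slope, intercept
fc_pred = a + b * theta6
rmse = np.sqrt(np.mean((fc_obs - fc_pred) ** 2))   # root mean squared residue

print(rmse < 0.02)  # True: comparable to the ~0.02 m3/m3 reported above
```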
Abstract:
OBJECTIVE: The purpose of the present study was to submit the same materials that were tested in the round robin wear test of 2002/2003 to the Alabama wear method. METHODS: Nine restorative materials, namely seven composites (belleGlass, Chromasit, Estenia, Heliomolar, SureFil, Targis, Tetric Ceram), an amalgam (Amalcap), and a ceramic (IPS Empress), were submitted to the Alabama wear method for localized and generalized wear. The test centre did not know which brands it was testing. Both volumetric and vertical loss were determined with an optical sensor. After completion of the wear test, the raw data were sent to IVOCLAR for further analysis. The statistical analysis included a logarithmic transformation of the data, the calculation of the relative ranks of each material within each test centre, measures of agreement between methods, the discrimination power and coefficient of variation of each method, and measures of the consistency and global performance of each material. RESULTS: The relative ranks of the materials varied tremendously between the test centres. When all materials were taken into account and the test methods compared with each other, only ACTA agreed reasonably well with two other methods, i.e. OHSU and ZURICH. On the other hand, MUNICH did not agree with the other methods at all. The ZURICH method showed the lowest discrimination power; ACTA, IVOCLAR, and ALABAMA localized showed the highest. Material-wise, the best global performance was achieved by the leucite-reinforced ceramic material Empress, which was clearly ahead of belleGlass, SureFil, and Estenia. In contrast, Heliomolar, Tetric Ceram, and especially Chromasit demonstrated a poor global performance. The best consistency was achieved by SureFil, Tetric Ceram, and Chromasit, whereas the consistency of Amalcap and Heliomolar was poor. When comparing the laboratory data with clinical data, a significant agreement was found for the IVOCLAR and ALABAMA generalized wear methods.
SIGNIFICANCE: As the different wear simulator settings measure different wear mechanisms, it seems reasonable to combine at least two different wear settings to assess the wear resistance of a new material.
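Two of the statistics named above, the relative ranks within a test centre and a method's coefficient of variation, reduce to short computations; a sketch with made-up wear values:

```python
# Relative ranks of materials within one test centre and the coefficient
# of variation of a wear method. The wear values (volumetric loss) are
# made up for illustration, not measured data from the round robin test.
import numpy as np

wear = np.array([0.12, 0.45, 0.08, 0.30])   # volumetric loss per material
ranks = wear.argsort().argsort() + 1        # 1 = lowest wear in this centre
cv = wear.std(ddof=1) / wear.mean()         # coefficient of variation

print(ranks.tolist())  # [2, 4, 1, 3]
```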
Abstract:
In response to the mandate on Load and Resistance Factor Design (LRFD) implementation by the Federal Highway Administration (FHWA) for all new bridge projects initiated after October 1, 2007, the Iowa Highway Research Board (IHRB) sponsored these research projects to develop regional LRFD recommendations. The LRFD development was performed using the Iowa Department of Transportation (DOT) Pile Load Test database (PILOT). To increase the data points for LRFD development, develop LRFD recommendations for dynamic methods, and validate the results of the LRFD calibration, 10 full-scale field tests on the most commonly used steel H-piles (e.g., HP 10 x 42) were conducted throughout Iowa. Detailed in situ soil investigations were carried out, push-in pressure cells were installed, and laboratory soil tests were performed. Pile responses during driving, at the end of driving (EOD), and at re-strikes were monitored using the Pile Driving Analyzer (PDA), followed by CAse Pile Wave Analysis Program (CAPWAP) analyses. The hammer blow counts were recorded for the Wave Equation Analysis Program (WEAP) and dynamic formulas. Static load tests (SLTs) were performed and the pile capacities were determined based on Davisson's criterion. The extensive experimental research studies generated important data for analytical and computational investigations. The SLT-measured load-displacements were compared with the simulated results obtained using a model of the TZPILE program and using the modified borehole shear test method. Two analytical pile setup quantification methods, in terms of soil properties, were developed and validated. A new calibration procedure was developed to incorporate pile setup into LRFD.
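Davisson's criterion defines capacity where the measured load-settlement curve crosses an offset elastic line; a hedged sketch of that limit line in SI units, with illustrative pile properties rather than the HP 10 x 42 test piles:

```python
# Sketch of Davisson's offset-limit line for a static load test: the
# failure load is where the measured settlement reaches the elastic
# compression PL/(AE) plus an offset of (4 mm + D/120). Pile properties
# below are illustrative assumptions.

def davisson_offset_mm(P_kN: float, L_m: float, A_m2: float,
                       E_kPa: float, D_mm: float) -> float:
    """Settlement (mm) of the Davisson limit line at load P."""
    elastic_mm = (P_kN * L_m) / (A_m2 * E_kPa) * 1000.0  # PL/(AE)
    return elastic_mm + 4.0 + D_mm / 120.0

# Steel pile: L = 15 m, A = 0.008 m2, E = 200 GPa, width D = 254 mm
s = davisson_offset_mm(800.0, 15.0, 0.008, 200e6, 254.0)
print(round(s, 1))  # 13.6 mm: 7.5 mm elastic + 4 mm + 2.1 mm offset
```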
Abstract:
For the last 2 decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of the supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the supermatrix approach that is widely used). In this study, seven of the most commonly used supertree methods are investigated by using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test if the performance levels were affected by the heuristic searches rather than the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods showed a poorer performance or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, that is, the most recent approach tested here, were promising with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set sizes and missing data. 
Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in the increase of computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering, and moving window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography, and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), by contrast, performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is best under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
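A cross-validation noise filter of the general kind mentioned above can be sketched as follows. This is a generic leave-one-out variant under my own assumptions: the function name, the k-nearest-neighbour predictor and the two-sigma rejection rule are all illustrative, not the CVMF as implemented in the work described above.

```python
import math
import statistics

def cv_filter(coords, values, k=3, threshold=2.0):
    """Leave-one-out filter: for each point, predict its value as the mean
    of its k nearest neighbours (the point itself excluded) and keep it
    only if the residual is within `threshold` standard deviations."""
    resid = []
    for i, (ci, vi) in enumerate(zip(coords, values)):
        dists = sorted(
            (math.dist(ci, cj), j)
            for j, cj in enumerate(coords) if j != i
        )
        nn = [values[j] for _, j in dists[:k]]
        resid.append(vi - sum(nn) / len(nn))
    sigma = statistics.pstdev(resid)
    return [abs(r) <= threshold * sigma for r in resid]

# Ten points on a line, one synthetic outlier at index 5.
coords = [(x, 0.0) for x in range(10)]
values = [1.0] * 10
values[5] = 50.0
print([i for i, keep in enumerate(cv_filter(coords, values, k=2)) if not keep])
# -> [5]
```

In practice such a filter would be tuned (choice of k, threshold, predictor) against the same cross-validation scores used to compare the interpolation methods.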
Abstract:
Several methods and algorithms have recently been proposed that allow for the systematic evaluation of simple neuron models from intracellular or extracellular recordings. Models built in this way generate good quantitative predictions of the future activity of neurons under temporally structured current injection. It is, however, difficult to compare the advantages of the various models and algorithms, since each model is designed for a different set of data. Here, we report on one of the first attempts to establish a benchmark test that permits a systematic comparison of methods and performances in predicting the activity of rat cortical pyramidal neurons. We present early submissions to the benchmark test and discuss implications for the design of future tests and simple neuron models.
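As an illustration of the class of models such benchmarks compare, a minimal leaky integrate-and-fire (LIF) neuron mapping an injected current trace to predicted spike times can be written in a few lines. The parameter values are generic textbook choices, not those of any benchmark submission.

```python
# Minimal leaky integrate-and-fire model: Euler integration of
#   tau * dV/dt = -(V - v_rest) + R * I(t)
# with a hard threshold and reset. Units: ms, mV, nA, MOhm.

def lif_spikes(current, dt=0.1, tau=10.0, R=10.0,
               v_rest=-70.0, v_thresh=-50.0, v_reset=-70.0):
    v, spikes = v_rest, []
    for step, I in enumerate(current):
        v += dt / tau * (-(v - v_rest) + R * I)
        if v >= v_thresh:
            spikes.append(step * dt)   # spike time in ms
            v = v_reset
    return spikes

# Constant 2.5 nA drive: steady state would be v_rest + R*I = -45 mV,
# above threshold, so the model fires periodically over 500 ms.
times = lif_spikes([2.5] * 5000)
print(len(times), times[0])
```

A benchmark then scores how well such predicted spike times match recorded ones under the same current injection, which is what makes models fitted by different algorithms directly comparable.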
Abstract:
Background: Detection rates for adenoma and early colorectal cancer (CRC) are unsatisfactory due to low compliance with invasive screening procedures such as colonoscopy. There is a large unmet screening need, calling for an accurate, non-invasive and cost-effective test to screen for early neoplastic and pre-neoplastic lesions. Our goal is to identify effective biomarker combinations to develop a screening test aimed at detecting precancerous lesions and early CRC stages, based on a multigene assay performed on peripheral blood mononuclear cells (PBMC). Methods: A pilot study was conducted on 92 subjects. Colonoscopy revealed 21 CRC, 30 adenomas larger than 1 cm and 41 healthy controls. A panel of 103 biomarkers was selected by two approaches: a candidate gene approach based on a literature review, and whole-transcriptome analysis of a subset of this cohort by Illumina TAG profiling. Blood samples were taken from each patient and PBMC purified. Total RNA was extracted and the 103 biomarkers were tested by multiplex RT-qPCR on the cohort. Different univariate and multivariate statistical methods were applied to the PCR data, and 60 biomarkers with a significant p-value (< 0.01) for most of the methods were selected. Results: The 60 biomarkers are involved in several different biological functions, such as cell adhesion, cell motility, cell signaling, cell proliferation, development and cancer. Two distinct molecular signatures derived from the biomarker combinations were established, based on penalized logistic regression, to separate patients without lesions from those with CRC or adenoma. These signatures were validated using a bootstrapping method, leading to a separation of patients without lesions from those with CRC (Se 67%, Sp 93%, AUC 0.87) and from those with adenomas larger than 1 cm (Se 63%, Sp 83%, AUC 0.77).
In addition, the organ and disease specificity of these signatures was confirmed using samples from patients with other cancer types and with inflammatory bowel disease. Conclusions: The two defined biomarker combinations effectively detect the presence of CRC and of adenomas larger than 1 cm with high sensitivity and specificity. A prospective, multicentric, pivotal study is underway to validate these results in a larger cohort.
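A toy version of such a penalized logistic signature with AUC evaluation might look as follows. The synthetic two-marker data, the ridge penalty, and the optimizer settings are all invented for illustration, and the bootstrap validation step is omitted; this is not the study's actual pipeline.

```python
import numpy as np

def fit_logistic(X, y, lam=1.0, lr=0.1, steps=2000):
    """L2-penalized (ridge) logistic regression by gradient descent;
    the intercept is not penalized."""
    X = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (p - y) / len(y) + lam * np.r_[0.0, w[1:]]
        w -= lr * grad
    return w

def auc(scores, y):
    """Probability that a random positive scores above a random negative."""
    pos, neg = scores[y == 1], scores[y == 0]
    wins = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return wins + 0.5 * ties

# Synthetic cohort: 40 controls around 0, 40 cases shifted by 1.5.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(1.5, 1.0, (40, 2))])
y = np.r_[np.zeros(40), np.ones(40)]
w = fit_logistic(X, y, lam=0.1)
scores = np.column_stack([np.ones(80), X]) @ w
print(round(auc(scores, y), 2))   # high AUC on this well-separated data
```

On real data one would re-fit the model inside each bootstrap resample and report the out-of-sample AUC, sensitivity and specificity, as the study does.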