967 results for Robust methods
Abstract:
The growing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science reflects the enthusiasm and interest this technology has generated. IRMS has demonstrated its potential to distinguish chemically identical compounds originating from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. Moreover, each laboratory has developed its own strategy regarding calibration, sequence design, use of standards and data treatment, without a clear consensus. Drawing on experience acquired through research in different forensic fields, we propose a methodological framework for the whole IRMS process. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six steps, which should be considered for a thoughtful and relevant application. Dissecting the process into fundamental steps, each further detailed, enables a better understanding of the essential, though not exhaustive, factors that must be considered in order to obtain results of sufficient quality and robustness for retrospective analyses or interlaboratory comparisons.
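As one concrete element of the data-treatment and calibration steps mentioned above, isotopic results are conventionally expressed in per-mil delta notation and normalized to an international scale against reference standards. The sketch below illustrates these two standard calculations; all numeric values are hypothetical, not data from the study.

```python
def delta_permil(r_sample, r_standard):
    """Isotope ratio expressed in per-mil delta notation
    relative to a reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

def two_point_normalize(d_measured, std1_meas, std1_true, std2_meas, std2_true):
    """Linear two-point normalization of measured delta values onto an
    international scale using two reference standards
    (all values here are hypothetical)."""
    slope = (std2_true - std1_true) / (std2_meas - std1_meas)
    return std1_true + slope * (d_measured - std1_meas)

# Normalize a measured value using two hypothetical in-house standards
d_norm = two_point_normalize(-25.3, std1_meas=-30.1, std1_true=-29.9,
                             std2_meas=5.2, std2_true=5.4)
```

Two-point normalization of this kind is one of the calibration choices on which laboratories differ, which is precisely why the abstract calls for a consensus framework.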
Abstract:
BACKGROUND: Few European studies have investigated how cardiovascular risk factors (CRF) in adults relate to those observed in younger generations. OBJECTIVE: To explore this issue in a Swiss region using two population health surveys of 3636 adolescents aged 9-19 years and 3299 adults aged 25-74 years. METHODS: Age patterns of continuous CRF were estimated by robust locally weighted regression, and those of high-risk groups were calculated using adult criteria with appropriate adjustment for children. RESULTS: Gender differences in height, weight, blood pressure, and HDL cholesterol observed in adults were found to emerge in adolescence. Overweight, affecting 10-12% of adolescents, increased steeply in young adults (threefold among males and twofold among females) in parallel with inactivity. Median age at smoking initiation decreased rapidly from 18-20 years among young adults to 15 years among adolescents. A statistically significant social gradient disfavoring the lower education level was observed for overweight in all age groups of women above 16 (odds ratios (ORs) 2.4 to 3.3, P < 0.01), for inactivity in adult males (ORs 1.6 to 2.0, P < 0.05), and for regular smoking in older adolescents (OR 1.9 for males, 2.7 for females, P < 0.005), but not for elevated blood pressure. CONCLUSION: Discontinuities in the cross-sectional age patterns of CRF indicated the emergence of a social gradient and the need for preventive action against the early adoption of persistent unhealthy behaviors, to which low-educated girls and women are particularly exposed.
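The robust locally weighted regression used to estimate these age patterns can be sketched in plain numpy: local linear fits with tricube neighbourhood weights, plus bisquare robustness reweighting of residuals. This is a didactic simplification of the LOWESS idea, not the survey's actual estimator.

```python
import numpy as np

def robust_lowess(x, y, frac=0.5, iters=2):
    """Locally weighted linear regression with bisquare robustness
    iterations: a minimal sketch of LOWESS, not the survey's estimator."""
    n = len(x)
    k = max(2, int(frac * n))            # points per local neighbourhood
    fitted = np.zeros(n)
    robustness = np.ones(n)              # bisquare robustness weights
    for _ in range(iters + 1):
        for i in range(n):
            d = np.abs(x - x[i])
            idx = np.argsort(d)[:k]      # k nearest neighbours of x[i]
            w = (1 - (d[idx] / (d[idx].max() + 1e-12)) ** 3) ** 3  # tricube
            w = np.clip(w, 0.0, None) * robustness[idx]
            sw = np.sqrt(w)
            # weighted least-squares line through the neighbourhood
            A = np.stack([np.ones(k), x[idx]], axis=1)
            beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
            fitted[i] = beta[0] + beta[1] * x[i]
        resid = y - fitted
        s = np.median(np.abs(resid)) + 1e-12
        robustness = np.clip(1 - (resid / (6 * s)) ** 2, 0.0, 1.0) ** 2
    return fitted
```

The robustness iterations are what make the estimated age patterns resistant to outlying survey measurements.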
Abstract:
Fluvial deposits are a challenge for modelling flow in subsurface reservoirs. The connectivity and continuity of permeable bodies have a major impact on fluid flow in porous media. Contemporary object-based and multipoint-statistics methods struggle to represent connected structures robustly. An alternative approach to modelling petrophysical properties is based on a machine learning algorithm: Support Vector Regression (SVR). Semi-supervised SVR can establish spatial connectivity by taking prior knowledge of natural similarities into account. As a learning algorithm, SVR is robust to noise and captures dependencies from all available data. Applied to a synthetic fluvial reservoir, semi-supervised SVR produced robust results that match the flow performance well.
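The core of SVR is its epsilon-insensitive loss, which ignores small residuals and penalizes large ones linearly, giving the noise robustness the abstract mentions. A linear version can be trained by subgradient descent in a few lines of numpy; this is a didactic sketch of that loss, not the semi-supervised kernel SVR used for the reservoir model.

```python
import numpy as np

def linear_svr(X, y, C=10.0, eps=0.1, lr=2e-3, epochs=5000):
    """Linear epsilon-insensitive SVR fitted by subgradient descent:
    a didactic sketch, not the paper's semi-supervised kernel method."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        r = y - (X @ w + b)                     # residuals
        # subgradient of the epsilon-insensitive loss: 0 inside the tube
        g = np.where(r > eps, -1.0, np.where(r < -eps, 1.0, 0.0))
        w -= lr * (w + C * (g @ X) / n)         # ridge term + loss term
        b -= lr * (C * g.mean())
    return w, b
```

Residuals smaller than `eps` contribute nothing to the gradient, so isolated noisy samples do not pull the fit around.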
Abstract:
Drosophila melanogaster is a model organism instrumental for numerous biological studies. The compound eye of this insect consists of some eight hundred individual ommatidia or facets, ca. 15 µm in cross-section. Each ommatidium contains eighteen cells, including four cone cells secreting the lens material (cornea). High-resolution imaging of the cornea of different insects has demonstrated that each lens is covered by nipple arrays--small outgrowths of ca. 200 nm in diameter. Here we utilize, for the first time, atomic force microscopy (AFM) to investigate nipple arrays of the Drosophila lens, achieving an unprecedented visualization of the architecture of these nanostructures. We find by Fourier analysis that the nipple arrays of Drosophila are disordered, and that their seemingly ordered appearance is a consequence of dense packing of the nipples. In contrast, Fourier analysis confirms the visibly ordered nature of the eye microstructures--the individual lenses. This is different in the frizzled mutants of Drosophila, where both Fourier analysis and optical imaging detect disorder in lens packing. AFM reveals intercalations of the lens material between individual lenses in frizzled mutants, providing an explanation for this disorder. In contrast, nanostructures of the mutant lens show the same organization as in wild-type flies. Thus, frizzled mutants display abnormal organization of the corneal micro-, but not nano-structures. At the same time, nipples of the mutant flies are shorter than those of the wild type. We also analyze the corneal surface of glossy-appearing eyes overexpressing Wingless--the lipoprotein ligand of Frizzled receptors--and find catastrophic aberrations in the nipple arrays, providing experimental evidence in favor of a major anti-reflective function of these insect eye nanostructures.
The combination of an easily tractable genetic model organism and robust AFM analysis represents a novel methodology for analyzing the development and architecture of these surface formations.
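The Fourier test for order versus disorder can be illustrated in one dimension: a regular lattice of spikes produces a sharp peak at the lattice frequency, while randomly packed spikes of the same density do not. The sketch below is a simplified 1-D stand-in for the 2-D Fourier analysis of the AFM images.

```python
import numpy as np

def lattice_peak(positions, n=256, period=16):
    """|FFT| of a spike train at the lattice frequency: large for a
    regular array, small for random packing. A 1-D stand-in for the
    2-D Fourier analysis of the AFM images."""
    signal = np.zeros(n)
    signal[np.asarray(positions) % n] = 1.0
    return np.abs(np.fft.fft(signal))[n // period]

rng = np.random.default_rng(0)
ordered = np.arange(0, 256, 16)                        # regular "lens" lattice
random_pack = rng.choice(256, size=16, replace=False)  # disordered "nipples"
```

For the regular lattice all spikes contribute in phase at the lattice frequency, so the peak equals the spike count; random packing scatters the phases and suppresses the peak, which is how Fourier analysis distinguishes true order from densely packed disorder.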
Abstract:
A key, yet often neglected, component of digital evolution and evolutionary models is the 'selection method', which assigns fitness (number of offspring) to individuals based on their performance scores (efficiency in performing tasks). Here, we use formal analysis and numerical experiments to study the evolution of cooperation under the five most common selection methods (proportionate, rank, truncation-proportionate, truncation-uniform and tournament). We consider related individuals engaging in a Prisoner's Dilemma game in which individuals can either cooperate or defect. A cooperator pays a cost, whereas its partner receives a benefit, both of which affect the performance scores. These performance scores are translated into fitness by one of the five selection methods. We show that cooperation is positively associated with the relatedness between individuals under all selection methods. By contrast, a change in the performance benefit of cooperation affects the population's average level of cooperation only under the proportionate methods. We also demonstrate that the truncation and tournament methods may introduce negative frequency dependence and lead to the evolution of polymorphic populations. Using the example of the evolution of cooperation, we show that the choice of selection method, though often marginalized, can considerably affect the evolutionary dynamics.
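The core distinction can be sketched in a few lines: proportionate selection derives reproduction probabilities from the raw performance scores, whereas rank-based (and tournament) selection depends only on their order, so adding the same benefit to every score leaves rank-based dynamics unchanged. A minimal illustration of the contrast, not the authors' simulation code:

```python
import numpy as np

def proportionate_probs(scores):
    """Reproduction probabilities proportional to raw performance
    scores (scores assumed positive)."""
    s = np.asarray(scores, dtype=float)
    return s / s.sum()

def rank_probs(scores):
    """Linear-ranking probabilities: depend only on the order of the
    scores, not on their magnitude."""
    ranks = np.argsort(np.argsort(scores)) + 1.0   # 1 = worst, n = best
    return ranks / ranks.sum()
```

Shifting every score by a constant changes the proportionate probabilities but not the rank-based ones, which is why the size of the cooperation benefit matters only under the proportionate methods.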
Abstract:
The spared nerve injury (SNI) model mimics human neuropathic pain related to peripheral nerve injury and is based upon an invasive but simple surgical procedure. Since its first description in 2000, it has undergone remarkable development. It produces robust, reliable and long-lasting neuropathic pain-like behaviour (allodynia and hyperalgesia) and offers the possibility of studying both injured and non-injured neuronal populations in the same spinal ganglion. In addition, variants of the SNI model have been developed in rats, mice and neonatal/young rodents, allowing several possible angles of analysis. The purpose of this chapter is therefore to provide detailed guidance regarding the SNI model and its variants, highlighting its surgical and behavioural testing specificities.
Abstract:
In the last five years, Deep Brain Stimulation (DBS) has become the most popular and effective surgical technique for the treatment of Parkinson's disease (PD). The Subthalamic Nucleus (STN) is the usual target when applying DBS. Unfortunately, the STN is in general not visible in common medical imaging modalities, so atlas-based segmentation is commonly used to locate it in the images. In this paper, we propose a scheme that allows us both to compare different registration algorithms and to evaluate their ability to locate the STN automatically. Using this scheme, we can weigh the error of the algorithms against expert variability, and we demonstrate that automatic STN location is possible and as accurate as the methods currently used.
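The evaluation criterion described, comparing automatic localization error against expert variability, reduces to Euclidean distances between candidate STN positions. A small sketch with entirely hypothetical coordinates (not data from the paper):

```python
import numpy as np

def localization_errors(auto_xyz, expert_xyz):
    """Per-subject Euclidean distance (mm) between automatically
    located STN centres and an expert reference."""
    return np.linalg.norm(np.asarray(auto_xyz) - np.asarray(expert_xyz), axis=1)

# Hypothetical stereotactic coordinates (mm) for three subjects
expert = np.array([[12.0, -2.0, -4.0], [11.5, -1.8, -4.2], [12.3, -2.1, -3.9]])
auto = np.array([[12.5, -2.0, -4.0], [11.5, -1.3, -4.2], [12.3, -2.1, -4.4]])
errors = localization_errors(auto, expert)
```

An automatic method whose errors fall within the spread of distances between independent experts can be considered as accurate as manual localization, which is the comparison the scheme formalizes.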
Abstract:
Gaseous N losses from soil are considerable, resulting mostly from ammonia volatilization linked to agricultural activities such as pasture fertilization. The use of simple and accessible methods to measure such losses is fundamental to evaluating the N cycle in agricultural systems. The purpose of this study was to evaluate methods for quantifying NH3 volatilization from soil surface-fertilized with urea, with minimal influence on the volatilization processes. The greenhouse experiment was arranged in a completely randomized design with 13 treatments and five replications, the treatments being: (1) polyurethane foam (density 20 kg m-3) with phosphoric acid solution absorber (foam absorber), installed 1, 5, 10 and 20 cm above the soil surface; (2) paper filter with sulfuric acid solution absorber (paper absorber, 1, 5, 10 and 20 cm above the soil surface); (3) sulfuric acid solution absorber (1, 5 and 10 cm above the soil surface); (4) semi-open static collector; (5) 15N balance (control). The foam absorber placed 1 cm above the soil surface estimated the real daily rate and accumulated loss of NH3-N and proved efficient in capturing NH3 volatilized from urea-treated soil. The estimates based on acid absorbers 1, 5 and 10 cm above the soil surface and paper absorbers 1 and 5 cm above the soil surface were only realistic for accumulated NH3-N losses. Foam absorbers can thus be recommended for quantifying accumulated and daily rates of NH3 volatilization losses similarly to an open static chamber, making calibration equations or correction factors unnecessary.
Abstract:
Nickel, although essential to plants in trace amounts, may be toxic to plants and animals at higher concentrations. It is mainly assimilated through food ingestion. However, information about the average levels of elements (including Ni) in edible vegetables from different regions is still scarce in Brazil. The objectives of this study were to: (a) evaluate and optimize a method for the preparation of vegetable tissue samples for Ni determination; (b) optimize the analytical procedures for determination by Flame Atomic Absorption Spectrometry (FAAS) and by Electrothermal Atomic Absorption Spectrometry (ETAAS) in vegetable samples; and (c) determine the Ni concentration in vegetables consumed in the cities of Lorena and Taubaté in the Vale do Paraíba, State of São Paulo, Brazil. For both ETAAS and FAAS determinations, the results were validated by analyte addition and recovery tests. The most viable method tested for quantification of this element was HClO4-HNO3 wet digestion. All samples except the carrot tissue collected in Lorena contained Ni levels above those permitted by the Brazilian Ministry of Health. The most disturbing results, requiring more detailed study, were the Ni concentrations measured in carrot samples from Taubaté, where levels were five times higher than permitted by Brazilian regulations.
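The analyte addition and recovery test used to validate the FAAS/ETAAS results is a simple percentage calculation: spike a sample with a known amount of analyte and check how much of the addition is found. The sketch below uses illustrative concentrations, not data from the study.

```python
def recovery_percent(spiked_result, unspiked_result, amount_added):
    """Recovery of an analyte addition (spike) test, in percent;
    values near 100% support the validity of the method."""
    return (spiked_result - unspiked_result) / amount_added * 100.0

# Illustrative concentrations in mg kg-1 (hypothetical, not study data)
rec = recovery_percent(1.48, 0.50, 1.00)
```

Recoveries well below or above 100% would flag matrix interference or losses during the HClO4-HNO3 digestion step.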
Abstract:
BACKGROUND: Expression of heterologous genes in mammalian cells or organisms for therapeutic or experimental purposes often requires tight control of transgene expression. Specifically, the following criteria should be met: no background gene activity in the off-state, high gene expression in the on-state, regulated expression over an extended period, and multiple switching between on- and off-states. METHODS: Here, we describe a genetic switch system for controlled transgene transcription using chimeric repressor and activator proteins functioning in a novel regulatory network. In the off-state, the target transgene is actively silenced by a chimeric protein consisting of multimerized eukaryotic transcriptional repression domains fused to the DNA-binding tetracycline repressor. In the on-state, the inducer drug doxycycline affects both the derepression of the target gene promoter and activation by the GAL4-VP16 transactivator, which in turn is under the control of an autoregulatory feedback loop. RESULTS: The hallmark of this new system is the efficient transgene silencing in the off-state, as demonstrated by the tightly controlled expression of the highly cytotoxic diphtheria toxin A gene. Addition of the inducer drug allows robust activation of transgene expression. In stably transfected cells, this control is still observed after months of repeated cycling between the repressed and activated states of the target genes. CONCLUSIONS: This system permits tight long-term regulation when stably introduced into cell lines. The underlying principles of this network system should have general applications in biotechnology and gene therapy.
Abstract:
For successful treatment of prosthetic joint infection, identification of the infecting microorganism is crucial. Cultures of synovial fluid and intraoperative periprosthetic tissue represent the standard method for diagnosing prosthetic joint infection. Rapid and accurate diagnostic tools that can detect a broad range of causative microorganisms and their antimicrobial resistance are increasingly needed. Newer diagnostic techniques, such as sonication of removed implants, microcalorimetry, molecular methods and mass spectrometry, have significantly increased sensitivity. In this article, we describe the conventional and newer diagnostic techniques, with their advantages and potential future applications.
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources.
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
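The matching step described above can be sketched as follows: fit a propensity model on a covariate such as respondent age, match each treated unit to its nearest control on the propensity score, and compare covariate balance before and after. This is a minimal numpy illustration with synthetic data, not the study's actual matching procedure.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic survey respondents: probability of "treatment" rises with age
n = 400
age = rng.normal(45.0, 12.0, n)
treated = rng.random(n) < 1 / (1 + np.exp(-(age - 50.0) / 5.0))

def propensity_scores(x, t, lr=0.1, epochs=500):
    """Minimal one-covariate logistic regression by gradient descent,
    standing in for a full propensity model."""
    xs = (x - x.mean()) / x.std()
    w, b = 0.0, 0.0
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-(w * xs + b)))
        w -= lr * np.mean((p - t) * xs)
        b -= lr * np.mean(p - t)
    return 1 / (1 + np.exp(-(w * xs + b)))

ps = propensity_scores(age, treated.astype(float))

# Match each treated respondent to the nearest control on the propensity score
controls = np.where(~treated)[0]
nearest = controls[np.abs(ps[controls][None, :] - ps[treated][:, None]).argmin(axis=1)]

imbalance_before = age[treated].mean() - age[~treated].mean()
imbalance_after = age[treated].mean() - age[nearest].mean()
```

Reduced covariate imbalance after matching is exactly the balance property the study exploits to strengthen the internal validity of its observational design.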
Abstract:
The influence of hole-hole (h-h) propagation, in addition to the conventional particle-particle (p-p) propagation, on the energy per particle and the momentum distribution is investigated for the v2 central interaction derived from Reid's soft-core potential. The results are compared to Brueckner-Hartree-Fock calculations with a continuous choice for the single-particle (SP) spectrum. Calculation of the energy from a self-consistently determined SP spectrum leads to a lower saturation density. This result is not corroborated by calculating the energy from the hole spectral function, which is, however, not self-consistent. A generalization of previous calculations of the momentum distribution, based on a Goldstone diagram expansion, is introduced that allows the inclusion of h-h contributions to all orders. From this result an alternative calculation of the kinetic energy is obtained. In addition, a direct calculation of the potential energy is presented, obtained from a solution of the ladder equation containing p-p and h-h propagation to all orders. These results can be considered as the contributions of selected Goldstone diagrams (including p-p and h-h terms on the same footing) to the kinetic and potential energy, in which the SP energy is given by the quasiparticle energy. The summation of Goldstone diagrams leads to a different momentum distribution than the one obtained from integrating the hole spectral function, which in general gives less depletion of the Fermi sea. Various arguments, based partly on the results obtained, are put forward that a self-consistent determination of the spectral functions including the p-p and h-h ladder contributions (using a realistic interaction) will shed light on the question of nuclear saturation at a nonrelativistic level that is consistent with the observed depletion of SP orbitals in finite nuclei.
Abstract:
Vibration-based damage identification (VBDI) techniques have been developed in part to address the problems associated with an aging civil infrastructure. To assess the potential of VBDI as it applies to highway bridges in Iowa, three applications of VBDI techniques were considered in this study: numerical simulation, laboratory structures, and field structures. VBDI techniques were found to be highly capable of locating and quantifying damage in numerical simulations. These same techniques were found to be accurate in locating various types of damage in a laboratory setting with actual structures. Although these techniques show potential for quantifying damage in the laboratory, their ability to quantify low-level damage there is not robust. When applying these techniques to an actual bridge, it was found that some traditional applications of VBDI methods can describe the global behavior of the structure but are most likely not suited to identifying the typical damage scenarios found in civil infrastructure. Measurement noise, boundary conditions, complications due to substructures and multiple material types, and transducer sensitivity make it very difficult for present VBDI techniques to identify, much less quantify, highly localized damage (such as small cracks and minor changes in thickness). However, while investigating VBDI techniques in the field, it was found that if the frequency-domain response of the structure can be generated from operating traffic load, the structural response can be animated and used to develop a holistic view of the bridge's response to various automobile loadings. By animating the response of a field bridge, concrete cracking (in the abutment and deck) was correlated with structural motion, and problem frequencies (i.e., those that cause significant torsion or tension-compression at beam ends) were identified.
Furthermore, a frequency-domain study of operational traffic was used to identify both common and extreme frequencies for a given structure and loading. Common traffic frequencies can be compared to problem frequencies so that cost-effective, preventative solutions (either structural or usage-based) can be developed for a wide range of IDOT bridges. Further work should (1) perfect the process of collecting high-quality operational frequency response data; (2) expand and simplify the process of correlating frequency response animations with damage; and (3) develop efficient, economical, preemptive solutions to common damage types.
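Collecting operational frequency-response data ultimately comes down to locating modal peaks in the spectrum of a measured response. A minimal sketch with a synthetic two-mode acceleration signal; the sampling rate and modal frequencies are assumed for illustration, not taken from the study.

```python
import numpy as np

fs = 200.0                                  # assumed sampling rate, Hz
t = np.arange(0, 20.0, 1.0 / fs)            # 20 s of response data
# Synthetic bridge response under traffic: two structural modes
accel = 1.0 * np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 7.5 * t)

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)

# The two largest spectral peaks recover the modal frequencies
modal = np.sort(freqs[np.argsort(spectrum)[-2:]])
```

Comparing such measured modal peaks against the dominant frequencies of common traffic loadings is the step that would let problem frequencies be screened for across a range of bridges.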