833 results for Probabilistic methodology
Abstract:
A statistical methodology for the objective comparison of LDI-MS mass spectra of blue gel pen inks was evaluated. Thirty-three blue gel pen inks previously studied by Raman spectroscopy were analyzed directly on paper in both positive and negative modes. The resulting mass spectra were first compared using the relative areas of selected peaks, with the Pearson correlation coefficient and the Euclidean distance as comparison metrics. The intra-variability among results from a single ink and the inter-variability between results from different inks were compared in order to choose a differentiation threshold that minimizes the false-negative rate (i.e., avoids falsely differentiating samples of the same ink). This yielded a discriminating power (DP) of up to 77% for analyses performed in negative mode. The whole mass spectra were then compared using the same methodology, yielding a better DP of 92% in negative mode using the Pearson correlation on standardized data. The positive mode generally yielded a lower DP than the negative mode because of a higher intra-variability relative to the inter-variability in the mass spectra of the ink samples.
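To make the comparison step concrete, the following is a minimal numpy sketch, assuming spectra are stored as aligned intensity vectors over a common m/z grid; the function names, the column-wise standardization, and the threshold rule are illustrative assumptions rather than the study's actual implementation.

```python
# Sketch only: objective comparison of two mass spectra by Pearson
# correlation and Euclidean distance, with a differentiation threshold.
import numpy as np

def pearson_similarity(a, b):
    """Pearson correlation between two aligned intensity vectors."""
    return float(np.corrcoef(a, b)[0, 1])

def euclidean_distance(a, b):
    """Euclidean distance between two aligned intensity vectors."""
    return float(np.linalg.norm(a - b))

def standardize_columns(S):
    """Column-wise (per-m/z) standardization of a (samples x m/z) matrix,
    one plausible reading of 'Pearson correlation on standardized data'."""
    return (S - S.mean(axis=0)) / S.std(axis=0)

def differentiated(a, b, threshold):
    """Declare two spectra differentiated when their correlation falls
    below a threshold; the threshold would be tuned on intra- vs
    inter-ink variability to minimise the false-negative rate."""
    return pearson_similarity(a, b) < threshold
```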
Abstract:
My research in live drawing and new technologies combines a live human figure in composition with an overlaid digital projection of a second human figure. The aim is to explore, amplify, and thoroughly analyse the search for distinctive identities and graphic languages of representation for live and projected models.
Abstract:
Two trends that currently exist in relation to the concept of Paleontology are analyzed, pointing out some of the aspects that influence it negatively. Various reflections are offered based on examples involving some of the principal points of paleontological method, such as the influence of spot sampling, the meaning of size-frequency distributions, and subjectivity in the identification of fossils. Topics that have a marked repercussion on diverse aspects of Paleontology are discussed.
Abstract:
This paper reports on the purpose, design, methodology and target audience of the E-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of the project. The initiative was motivated by the fact that reporting the results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, the interpretation of raw data and the communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses that focussed on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for evaluating forensic data in a way that is tailored to the needs of the criminal justice system. To help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of the independence and self-confidence needed to solve practical inference problems. E-learning was chosen as the general format because it helps form a trans-institutional online community of practitioners from varying forensic disciplines and fields of work - reporting officers, (chief) scientists and forensic coordinators, but also lawyers - who can all interact directly from their personal workplaces regardless of distances, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, and also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view of interpretation across forensic disciplines, laboratories and judicial systems.
Abstract:
Synthetic root exudates were formulated based on the organic acid composition of root exudates derived from the rhizosphere of aseptically grown corn plants, the pH of the rhizosphere, and the background chemical matrices of the soil solutions. The synthetic root exudates, which mimic the chemical conditions of the rhizosphere environment where soil-borne metals are dissolved and absorbed by plants, were used to extract metals from sewage sludge-treated soils in 16 successive extractions. The concentrations of Zn, Cd, Ni, Cr, and Cu in the sludge-treated soil were 71.74, 0.21, 15.90, 58.12, and 37.44 mg kg-1, respectively. The synthetic root exudates consisted of acetic, butyric, glutaric, lactic, maleic, propionic, pyruvic, succinic, tartaric, and valeric acids, and the organic acid mixtures had concentrations of 0.05 and 0.1 mol L-1 -COOH. The trace elements removed by successive extractions may be considered representative of the availability of these metals to plants in these soils. The chemical speciation of the metals in the liquid phase was calculated; the results showed that metals in sludge-treated soils were dissolved and formed soluble complexes with the different organic acid-based root exudates. The most reactive organic acid ligands were lactate, maleate, tartrate, and acetate. The inorganic ligands chloride and sulfate played insignificant roles in metal dissolution. Except for Cd, free ions did not represent an important chemical species of the metals in the soil rhizosphere. Because different metals formed soluble complexes with different ligands in the rhizosphere, no extractant based on a single reagent would be able to recover all of the potentially plant-available metals from soils; the root exudate-derived organic acid mixtures tested in this study may be better suited to recovering potentially plant-available metals from soils than the conventional extractants.
Abstract:
Vibration-based damage identification (VBDI) techniques have been developed in part to address the problems associated with an aging civil infrastructure. To assess the potential of VBDI as it applies to highway bridges in Iowa, three applications of VBDI techniques were considered in this study: numerical simulation, laboratory structures, and field structures. VBDI techniques were found to be highly capable of locating and quantifying damage in numerical simulations. These same techniques were found to be accurate in locating various types of damage in a laboratory setting with actual structures. Although there is the potential for these techniques to quantify damage in a laboratory setting, the ability of the methods to quantify low-level damage in the laboratory is not robust. When applying these techniques to an actual bridge, it was found that some traditional applications of VBDI methods are capable of describing the global behavior of the structure but are most likely not suited for the identification of typical damage scenarios found in civil infrastructure. Measurement noise, boundary conditions, complications due to substructures and multiple material types, and transducer sensitivity make it very difficult for present VBDI techniques to identify, much less quantify, highly localized damage (such as small cracks and minor changes in thickness). However, while investigating VBDI techniques in the field, it was found that if the frequency-domain response of the structure can be generated from operating traffic load, the structural response can be animated and used to develop a holistic view of the bridge’s response to various automobile loadings. By animating the response of a field bridge, concrete cracking (in the abutment and deck) was correlated with structural motion and problem frequencies (i.e., those that cause significant torsion or tension-compression at beam ends) were identified. Furthermore, a frequency-domain study of operational traffic was used to identify both common and extreme frequencies for a given structure and loading. Common traffic frequencies can be compared to problem frequencies so that cost-effective, preventative solutions (either structural or usage-based) can be developed for a wide range of IDOT bridges. Further work should (1) perfect the process of collecting high-quality operational frequency response data; (2) expand and simplify the process of correlating frequency response animations with damage; and (3) develop efficient, economical, preemptive solutions to common damage types.
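As a small illustration of the operational-traffic idea described above, the sketch below estimates a structure's frequency-domain response from an ambient acceleration record using Welch's method; the file name, sampling rate, and peak-picking rule are hypothetical, not taken from the study.

```python
# Illustrative sketch: dominant traffic-excited frequencies from an
# ambient vibration record (assumed single-channel accelerometer data).
import numpy as np
from scipy.signal import welch, find_peaks

fs = 256.0                            # assumed sampling rate [Hz]
accel = np.loadtxt("deck_accel.txt")  # hypothetical acceleration record

# Welch power spectral density of the operational response
freqs, psd = welch(accel, fs=fs, nperseg=4096)

# Pick dominant peaks as candidate "common" traffic frequencies, to be
# compared against problem frequencies identified from the animations
peaks, _ = find_peaks(psd, height=0.1 * psd.max())
print("Dominant traffic-excited frequencies [Hz]:", freqs[peaks])
```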
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography (HPTLC), despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
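The cited Part I and Part II papers define the actual algorithms; purely to illustrate the principle of assigning evidential value with a probabilistic model, a score-based likelihood ratio can be sketched as follows, with the calibration scores simulated rather than taken from real HPTLC data.

```python
# Hypothetical score-based likelihood-ratio sketch: the evidential value
# of a similarity score is the ratio of its density under 'same ink'
# to its density under 'different inks'.
import numpy as np
from scipy.stats import gaussian_kde

def score_lr(score, same_source_scores, diff_source_scores):
    """LR = f(score | same source) / f(score | different sources)."""
    f_same = gaussian_kde(same_source_scores)
    f_diff = gaussian_kde(diff_source_scores)
    return float(f_same(score)[0] / f_diff(score)[0])

# Simulated calibration scores from known comparisons (illustrative):
rng = np.random.default_rng(0)
same = rng.normal(0.95, 0.02, 200)  # intra-ink similarity scores
diff = rng.normal(0.70, 0.10, 200)  # inter-ink similarity scores
print(score_lr(0.93, same, diff))   # LR > 1 supports a common origin
```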
Abstract:
The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and proposed as an efficient tool for spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. Automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
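For orientation, a GRNN is essentially Nadaraya-Watson kernel regression; the sketch below shows an isotropic version with the kernel width sigma tuned by leave-one-out cross-validation, in the spirit of the automatic tuning described above (a generic sketch, not the authors' implementation).

```python
# Generic isotropic GRNN (Nadaraya-Watson kernel regression) sketch with
# leave-one-out cross-validation for the kernel width sigma.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """Kernel-weighted average of training values at the query points."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def loo_cv_error(X, y, sigma):
    """Leave-one-out mean squared error, used to auto-tune sigma."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)  # exclude each point from its own prediction
    pred = (w @ y) / w.sum(axis=1)
    return float(((pred - y) ** 2).mean())

# sigma would be chosen as the minimiser of the LOO error over a grid:
# best_sigma = min(sigma_grid, key=lambda s: loo_cv_error(X, y, s))
```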
Abstract:
The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in distributing the road use tax fund (RUTF) among the counties in the state. Since the need study was first conducted using the HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A study performed by Jim Cable (HR-363, 1993) found that one of the major factors affecting this volatility is the quality and accuracy of the pavement condition data collected. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project studying the feasibility of using automated pavement condition data, collected for the Iowa Pavement Management Program (IPMP) on paved county roads, in the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected in a two-year cycle compared to the 10-year cycle currently used by HWYNEEDS. The study proved that using the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. The researchers also recommended in TR-418 a full analysis and investigation of the HWYNEEDS methodology and parameters (for more information, please review the TR-418 project report). The study reported in this document builds on that previous work and covers the analysis and investigation of the HWYNEEDS computer program's methodology and parameters. The underlying hypothesis is that, along with the IPMP automated condition data, some changes need to be made to the HWYNEEDS parameters to accommodate the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to another. Another objective of this research is to investigate gravel road needs and study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial need study on paved roads. A new procedure is also developed to determine gravel road needs outside of the HWYNEEDS program. Recommendations are made for the new procedures and for changes to the current quadrennial need study. Future research areas are also identified.
Abstract:
Aim Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach to community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location The western Swiss Alps. Methods We propose a new approach that processes probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict, and assess the uncertainty in, predictions of community properties. We test the utility of this approach against a traditional threshold-based approach, using mountain butterfly communities spanning a large elevation gradient as a case study, and evaluate its ability to model the species richness and phylogenetic diversity of communities. Results S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, a syndrome of environmental filtering. The prediction accuracy of community properties varied along the environmental gradient: the variability in predictions of species richness was higher at low elevation, while for phylogenetic diversity it was lower. Our approach allowed mapping of the variability in species richness and phylogenetic diversity projections. Main conclusion Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers of the uncertainty in the modelling results.
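The contrast between the traditional threshold-based stacking and a probabilistic treatment of S-SDM outputs can be sketched as follows, assuming a (sites x species) matrix of occurrence probabilities and independence between species; the variance term is the kind of quantity that permits the uncertainty mapping reported in the Results.

```python
# Sketch: threshold-based vs probabilistic stacking of SDM outputs for
# species richness; `probs` is a (sites x species) probability matrix.
import numpy as np

def richness_threshold(probs, t=0.5):
    """Traditional approach: binarise each SDM, then count presences."""
    return (probs >= t).sum(axis=1)

def richness_probabilistic(probs):
    """Probabilistic approach: per-site expected richness and variance
    (mean and variance of a Poisson-binomial, assuming independence)."""
    mean = probs.sum(axis=1)
    var = (probs * (1.0 - probs)).sum(axis=1)  # mappable uncertainty
    return mean, var
```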
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
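For readers new to this debate, the core result that the paper confirms can be restated compactly (a standard formulation from the literature, not quoted from the paper): write Hp for 'the suspect is the source', Hd for 'some other person is the source', E for the observed correspondence with random-occurrence probability γ, and D for the exclusion of the other n-1 database members. Then

```latex
\[
\mathrm{LR} \;=\; \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} \;=\; \frac{1}{\gamma},
\qquad
\frac{\Pr(H_p \mid E, D)}{\Pr(H_d \mid E, D)}
  \;=\; \mathrm{LR} \times \frac{\Pr(H_p \mid D)}{\Pr(H_d \mid D)}.
\]
```

The exclusions D enter only through the prior odds, where removing alternative candidate sources can only strengthen the case against the suspect; nothing in the derivation calls for dividing the likelihood ratio by the database size.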
Abstract:
The Agenda 21 for the Geneva region is the result of a broad consultation process including all local actors. Article 12 stipulates that « the State facilitates possible synergies between economic activities in order to minimize their environmental impacts », thus opening the way for Industrial Ecology (IE) and Industrial Symbiosis (IS). An Advisory Board for Industrial Ecology and Industrial Symbiosis implementation was established in 2002, involving the relevant government agencies. Regulatory and technical conditions for IS are studied in the Swiss context. Results reveal that the Swiss law on waste does not hinder by-product exchanges. Methodological and technical factors, including geographic, qualitative, quantitative and economic aspects, are detailed. Competition with waste operators in a highly developed recycling system is also tackled. The IS project develops an empirical and systematic method for detecting and implementing by-product synergies between industrial actors disseminated throughout the Geneva region. A database management tool for the treatment of input-output analysis data and GIS tools for detecting potential industrial partners are continuously improved. Potential symbioses for 17 flows (including energy, water and material flows) are currently being studied for implementation.
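Purely as a hypothetical sketch of the kind of input-output matching such a detection method involves (the data structure, field names and the 20 km cut-off are invented for illustration):

```python
# Hypothetical sketch: pair one company's output (by-product) flows with
# another company's input flows by material type and proximity.
from dataclasses import dataclass

@dataclass
class Flow:
    company: str
    material: str            # e.g. "waste heat", "treated water"
    tonnes_per_year: float
    xy_km: tuple             # assumed planar coordinates in km

def candidate_synergies(outputs, inputs, max_km=20.0):
    """List (supplier, receiver, material, exchangeable quantity)."""
    matches = []
    for o in outputs:
        for i in inputs:
            dx, dy = o.xy_km[0] - i.xy_km[0], o.xy_km[1] - i.xy_km[1]
            if o.material == i.material and (dx * dx + dy * dy) ** 0.5 <= max_km:
                matches.append((o.company, i.company, o.material,
                                min(o.tonnes_per_year, i.tonnes_per_year)))
    return matches
```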
Abstract:
The objective of this work was to develop a low-cost, portable damage detection tool to assess and predict damage areas in highway bridges. The proposed tool was based on standard vibration-based damage identification (VBDI) techniques, extended with a new approach based on operational traffic load. The methodology was tested using numerical simulations, laboratory experiments, and field testing.