931 results for Nature of the science
Abstract:
Forensic science is increasingly relied upon by law enforcement to assist in solving crime and gaining convictions, and by the judicial system in the adjudication of specific criminal cases. However, the value of forensic science relative to the work involved and the outcome of cases has yet to be established in the Australian context. Previous research in this area has mainly focused on the science and technology, rather than examining how people can use forensic services/science to the best possible advantage to produce appropriate justice outcomes. This five-year project entails an investigation into the effectiveness of forensic science in police investigations and court trials. It aims to identify when, where and how forensic science can add value to criminal investigations, court trials and justice outcomes while ensuring the efficient use of available resources, initially in the Victorian and ACT criminal justice systems and ultimately across Australia and New Zealand. This paper provides an overview of the rationale and aims of the research project and discusses current work in progress.
Abstract:
INTRODUCTION: In patients with multiple sclerosis (MS), conventional magnetic resonance imaging (MRI) provides only limited insights into the nature of brain damage, with modest clinico-radiological correlation. In this study, we applied recent advances in MRI techniques to study brain microstructural alterations in early relapsing-remitting MS (RRMS) patients with minor deficits. Further, we investigated the potential use of advanced MRI to predict functional performance in these patients. METHODS: Brain relaxometry (T1, T2, T2*) and magnetization transfer MRI were performed at 3T in 36 RRMS patients and 18 healthy controls (HC). Multicontrast analysis was used to assess microstructural alterations in normal-appearing (NA) tissue and lesions. A generalized linear model was computed to predict clinical performance in patients using multicontrast MRI data, conventional MRI measures, and demographic and behavioral data as covariates. RESULTS: Quantitative T2 and T2* relaxometry were significantly increased in temporal normal-appearing white matter (NAWM) of patients compared to HC, indicating subtle microedema (P = 0.03 and 0.004). Furthermore, significant T1 and magnetization transfer ratio (MTR) variations in lesions (mean T1 z-score: 4.42; mean MTR z-score: -4.09) suggested substantial tissue loss. Combinations of multicontrast and conventional MRI data significantly predicted cognitive fatigue (P = 0.01, adjusted R² = 0.4), attention (P = 0.0005, adjusted R² = 0.6), and disability (P = 0.03, adjusted R² = 0.4). CONCLUSION: Advanced MRI techniques at 3T unraveled the nature of brain tissue damage in early MS and substantially improved clinico-radiological correlations in patients with minor deficits, as compared to conventional measures of disease.
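For readers unfamiliar with the modelling step described above, here is a minimal sketch of how such a prediction model can be set up, using ordinary least squares (the Gaussian special case of a generalized linear model) and reporting the adjusted R² the abstract cites. All covariate names, values, and coefficients below are synthetic illustrations, not data from the study.

```python
# Minimal sketch: predicting a clinical score from MRI-derived covariates
# with ordinary least squares (the Gaussian case of a GLM).
# All names, values, and coefficients are synthetic illustrations.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 36  # matches the study's patient count, for flavor only

X = np.column_stack([
    rng.normal(50, 5, n),   # hypothetical T2* in temporal NAWM (ms)
    rng.normal(-4, 1, n),   # hypothetical lesion MTR z-score
    rng.normal(35, 8, n),   # hypothetical age (years)
])
# Synthetic outcome: a "fatigue score" driven by the first two covariates
y = 0.3 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 2, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(f"model P = {model.f_pvalue:.4g}, adjusted R² = {model.rsquared_adj:.2f}")
```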
Abstract:
Statistics occupies a prominent role in science and in citizens' daily life. This article provides a state-of-the-art overview of the problems associated with statistics in science and in society, structured along the three paradigms defined by Bauer, Allum and Miller (2007). It explores in more detail medicine and the public understanding of science on the one hand, and risks and surveys on the other. Statistics has received a good deal of attention; however, it has very often been handled in terms of deficit, either of scientists or of citizens. Many tools have been proposed to improve statistical literacy and the image of and trust in statistics, but with little understanding of their roots, little coordination among stakeholders, and few assessments of impact. These deficiencies point to new and promising directions in which the PUS research agenda could be expanded.
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory, and with the view according to which this theory is normative.
Abstract:
The Iowa economy is undergoing great change. Among the sectors deemed important to Iowa's economic future is bioscience. The definition of what constitutes the bioscience sector varies, but it generally includes agricultural, medical, and plant-life sciences, and related industrial activity.
Abstract:
The present paper offers a historical reconstruction, within the framework of the publishing industry in the years between the two World Wars, of the role played by the publisher William W. Norton in the genesis of Walter B. Cannon's book The Wisdom of the Body, published in 1932 and issued in a new edition in 1938. Through this case study, we aim to contribute to the current criticism of the «dominant view», which assumes, in an uncritical manner, that scientific popularization follows an ineluctable, continuous and linear evolution.
Abstract:
The establishment of legislative rules governing explosives in the eighties reduced the illicit use of military and civilian explosives. However, bomb-makers rapidly took advantage of easily accessible substances intended for licit uses to produce their own explosives. This change in strategy has given rise to an increase in improvised explosive charges, assisted by the ease of implementation of recipes that are widely available through open sources. While the nature of explosive charges has evolved, the instrumental methods currently used in routine casework, although more sensitive than before, have limited discriminating power and mostly allow only the determination of the chemical nature of the substance. Isotope ratio mass spectrometry (IRMS) has been applied to a wide range of forensic materials. Conclusions drawn from the majority of these studies stress its high power of discrimination. Preliminary studies conducted so far on the isotopic analysis of intact explosives (pre-blast) have shown that samples with the same chemical composition but coming from different sources can be differentiated. The measurement of stable isotope ratios therefore appears to be a new and remarkable analytical tool for the discrimination of substances or the identification of a substance with a definite source. However, much research is still needed to assess the validity of the results before they can be used either operationally or in court. Through the isotopic study of black powders and ammonium nitrates, this research aims to evaluate the contribution of isotope ratio mass spectrometry to the investigation of explosives, from both a pre-blast and a post-blast perspective. More specifically, the goal of the research is to provide the additional elements necessary for a valid interpretation of the results when used in explosives investigation. This work includes a fundamental study of the variability of the isotopic profile of black powder and ammonium nitrate in both space and time. On the one hand, the inter-variability between manufacturers and, in particular, the intra-variability within a manufacturer have been studied. On the other hand, the stability of the isotopic profile over time has been evaluated through the aging of these substances under different environmental conditions. The second part of this project considers the applicability of this high-precision technology to traces and residues of explosives, taking into account the characteristics specific to the field, including sampling, probable isotopic fractionation during the explosion, and interference from the matrix of the site.
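As background, stable isotope measurements of the kind described above are conventionally reported in delta notation relative to an international standard. The sketch below shows that computation and a deliberately naive comparison of two isotopic profiles; the reference ratio is a commonly cited value, the sample figures are invented, and a real source-discrimination study would weigh measurement uncertainty and intra-source variability rather than a bare distance.

```python
# Delta notation used in IRMS: delta = (R_sample / R_standard - 1) * 1000 (per mil).
# Sample values below are invented for illustration.
import math

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Convert an absolute isotope ratio to delta notation (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

def profile_distance(p1: dict, p2: dict) -> float:
    """Naive Euclidean distance between two isotopic profiles."""
    return math.sqrt(sum((p1[k] - p2[k]) ** 2 for k in p1))

R_VPDB = 0.011180  # commonly cited 13C/12C ratio of the VPDB standard

print(f"{delta_per_mil(0.011056, R_VPDB):.1f} per mil")  # about -11.1

# Hypothetical delta values for two black powder samples
sample_a = {"d13C": -24.6, "d15N": 1.2}
sample_b = {"d13C": -27.1, "d15N": 0.4}
print(f"profile distance: {profile_distance(sample_a, sample_b):.2f}")
```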
Abstract:
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon-constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes ethanol production in the fermentation of Saccharomyces cerevisiae.
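To make the epsilon-constraint idea concrete, here is a minimal sketch on a toy two-objective problem rather than a GMA metabolic model: one objective is minimized while the other is bounded by a swept epsilon value, and each feasible solve yields one point of the Pareto set. The objectives below are my own illustration, not the paper's model or its heuristic.

```python
# Toy epsilon-constraint sweep: minimize f1 subject to f2 <= eps.
# The objectives are illustrative, not the paper's GMA model.
import numpy as np
from scipy.optimize import minimize

def f1(x):
    return x[0] ** 2          # first objective (minimized directly)

def f2(x):
    return (x[0] - 2.0) ** 2  # second, conflicting objective (bounded by eps)

pareto = []
for eps in np.linspace(0.0, 4.0, 9):  # sweep the bound on f2
    res = minimize(
        f1,
        x0=[1.0],
        # SciPy "ineq" constraints require fun(x) >= 0, i.e. f2(x) <= eps
        constraints=[{"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}],
    )
    if res.success:
        pareto.append((f1(res.x), f2(res.x)))

for p1, p2 in pareto:
    print(f"f1 = {p1:.3f}, f2 = {p2:.3f}")
```

Tightening the bound on f2 forces f1 upward, which is exactly the trade-off curve a Pareto filter would then prune.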
Abstract:
In this paper we provide a formal account of the underapplication of vowel reduction to schwa in Majorcan Catalan loanwords and learned words. On the basis of a comparison of these data with those concerning productive derivation and verbal inflection, which show analogous patterns, we also explore a correlation, not yet acknowledged, between processes that behave differently in the loanword phonology than in the native phonology of the language, processes that show lexical exceptions, and processes that underapply for morphological reasons. In light of the analysis of the very same data, and taking into account the aforementioned correlation, we show how there might exist a natural diachronic relation between two kinds of Optimality Theory constraints which are commonly used but, in principle, mutually exclusive: positional faithfulness and contextual markedness constraints. Overall, phonological productivity proves crucial in three respects: first, as a context of the grammar, given that «underapplication» is systematically found in what we call the productive phonology of the dialect (including loanwords, learned words, productive derivation and verbal inflection); second, as a trigger or blocker of processes, in that the productivity, or lack thereof, of a specific process or constraint in the language is what explains whether it is challenged in any of the depicted situations; and, third, as a guiding principle which can explain the transition from the historical to the synchronic phonology of a linguistic variety.
Abstract:
Deliberate fires appear to be borderless and timeless events, creating a serious security problem. There have been many attempts to develop approaches to tackle this problem, but unfortunately acting effectively against deliberate fires has proven a complex challenge. This article reviews the current situation relating to deliberate fires: what do we know, how serious is the situation, how is it being dealt with, and what challenges are faced when developing a systematic and global methodology to tackle the issues? The repetitive nature of some types of deliberate fires is also discussed. Finally, drawing on the reality of repetition within deliberate fires, and encouraged by successes obtained with previous repetitive crimes (such as property crimes or drug trafficking), we argue that using the intelligence process cycle as a framework for the follow-up and systematic analysis of fire events is a relevant approach. This is the first article in a series of three. It introduces the context and discusses the background issues in order to provide better underpinning knowledge to managers and policy makers planning to tackle this issue. The second part will present a methodology developed to detect and identify repetitive fire events from a set of data, and the third part will discuss the analysis of these data to produce intelligence.
Abstract:
This paper is concerned with the contribution of forensic science to the legal process by helping reduce uncertainty. Although it is now widely accepted that uncertainty should be handled by probability because it is a safeguard against incoherent proceedings, there remain diverging and conflicting views on how probability ought to be interpreted. This is exemplified by proposals in the scientific literature that call for procedures of probability computation referred to as "objective," suggesting that scientists ought to use them in their reporting to recipients of expert information. I find such proposals objectionable. They need to be viewed cautiously, essentially because the ensuing probabilistic statements can be perceived as making forensic science prescriptive. A motivating example from the context of forensic DNA analysis will be chosen to illustrate this. As a main point, it shall be argued that such constraining suggestions can be avoided by interpreting probability as a measure of personal belief, that is, subjective probability. Invoking references to foundational literature from mathematical statistics and the philosophy of science, the discussion will explore the consequences of this interdisciplinary viewpoint for the practice of forensic expert reporting. It will be emphasized that, as an operational interpretation of probability, the subjectivist perspective enables forensic science to add value to the legal process, in particular by avoiding inferential impasses to which other interpretations of probability may lead. Moreover, understanding probability from a subjective perspective can encourage participants in the legal process to take on more responsibility in matters regarding the coherent handling of uncertainty. This would assure more balanced interactions at the interface between science and the law. This, in turn, provides support for ongoing developments that can be called the "probabilization" of forensic science.
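As a concrete illustration of the coherent handling of uncertainty this abstract advocates, the sketch below applies the odds form of Bayes' theorem, the standard updating rule in evaluative forensic reporting. The prior odds and likelihood ratio are invented numbers; in the standard division of labour, the scientist reports only the likelihood ratio, while the prior odds remain the fact-finder's.

```python
# Odds form of Bayes' theorem: posterior odds = prior odds * likelihood ratio.
# All numbers are invented for illustration.

def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Coherently update prior odds with a likelihood ratio."""
    return prior_odds * likelihood_ratio

prior = 1 / 1000  # hypothetical prior odds that the suspect is the source
lr = 1e6          # hypothetical likelihood ratio for the DNA correspondence

post = posterior_odds(prior, lr)
print(f"posterior odds = {post:.0f}:1, "
      f"posterior probability = {post / (1 + post):.4f}")
```

Whatever numbers are used, the update rule itself is what coherence constrains; the subjective inputs remain open to debate, which is precisely the abstract's point about responsibility in the legal process.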