52 results for Numerical Analysis and Scientific Computing
at Université de Lausanne, Switzerland
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering prior to focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations indicates that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible use of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limited degree to which these developments can actually be applied in practice.
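As a minimal illustration of the point about relative versus absolute values (this example is not taken from the article; the symbols E, H_p, H_d and k are generic notation), the standard likelihood ratio formula shows why a purely relative judgement can suffice:
\[
\mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}.
\]
If an examiner can justify only the relative statement \(\Pr(E \mid H_p) = k \, \Pr(E \mid H_d)\) for some \(k > 0\), then
\[
\mathrm{LR} = \frac{k \, \Pr(E \mid H_d)}{\Pr(E \mid H_d)} = k,
\]
so the likelihood ratio is fixed without assigning an absolute value to either conditional probability.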
Abstract:
This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find an optimum value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches based on a fixed number of jobs or a fixed memory threshold.
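As an illustration only, the control idea can be sketched in a few lines of Python: the current memory load is fuzzified into overlapping categories, a small rule base maps load to a threshold adjustment, and the result is defuzzified into an updated memory threshold that gates whether another job may start. The membership functions, rule set and parameter values below are hypothetical and are not the implementation described in the study.

# Hypothetical sketch of a fuzzy-logic controller adapting a memory
# threshold to the overall load; values and rules are illustrative only.

def ramp_down(x, a, b):
    """Membership 1 below a, 0 above b, linear in between."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def triangular(x, a, b, c):
    """Triangular membership peaking at b on [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adapt_threshold(mem_load, threshold, step=0.05):
    """Return an updated memory threshold (fraction of total RAM).

    mem_load:  current overall memory utilisation in [0, 1].
    threshold: threshold above which no further jobs are started.
    """
    # Fuzzify the load into three overlapping sets.
    low = ramp_down(mem_load, 0.4, 0.7)
    ok = triangular(mem_load, 0.4, 0.7, 0.9)
    high = 1.0 - ramp_down(mem_load, 0.7, 0.9)

    # Rule base: low load -> raise the threshold (pack in more jobs),
    # ok load -> keep it, high load -> lower it.
    # Defuzzify with a weighted average of the rule outputs.
    total = low + ok + high
    if total == 0.0:
        return threshold
    delta = (low * step + ok * 0.0 + high * (-step)) / total
    return min(0.95, max(0.5, threshold + delta))

# Example scheduler decision: start a new job only while below the threshold.
threshold = 0.8
for mem_load in (0.30, 0.72, 0.93):
    threshold = adapt_threshold(mem_load, threshold)
    print(f"load={mem_load:.2f} threshold={threshold:.2f} "
          f"start_new_job={mem_load < threshold}")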
Abstract:
In 2008, a Swiss Academies of Arts and Sciences working group chaired by Professor Emilio Bossi issued a "Memorandum on scientific integrity and the handling of misconduct in the scientific context", together with a paper setting out principles and procedures concerning integrity in scientific research. In the Memorandum, unjustified claims of authorship in scientific publications are referred to as a form of scientific misconduct - a view widely shared in other countries. In the Principles and Procedures, the main criteria for legitimate authorship are specified, as well as the associated responsibilities. It is in fact not uncommon for disputes about authorship to arise with regard to publications in fields where research is generally conducted by teams rather than individuals. Such disputes may concern not only the question of who is or is not to be listed as an author but also, frequently, the precise sequence of names, if the list is to reflect the various authors' roles and contributions. Subjective assessments of the contributions made by the individual members of a research group may differ substantially. As scientific collaboration - often across national boundaries - is now increasingly common, ensuring appropriate recognition of all parties is a complex matter and, where disagreements arise, it may not be easy to reach a consensus. In addition, customs have changed over the past few decades; for example, the practice of granting "honorary" authorship to an eminent researcher - formerly not unusual - is no longer considered acceptable. It should be borne in mind that the publications list has become by far the most important indicator of a researcher's scientific performance; for this reason, appropriate authorship credit has become a decisive factor in the careers of young researchers, and it needs to be managed and protected accordingly. At the international and national level, certain practices have therefore developed concerning the listing of authors and the obligations of authorship. The Scientific Integrity Committee of the Swiss Academies of Arts and Sciences has collated the relevant principles and regulations and formulated recommendations for authorship in scientific publications. These should help to prevent authorship disputes and offer guidance in the event of conflicts.
Abstract:
The pigments and the plasters of the Roman frescoes discovered at the House of Diana (Cosa, Grosseto, Italy) were analysed using non-destructive and destructive mineralogical and chemical techniques. Both pigments and plasters were characterized by optical microscopy, scanning electron microscopy and electron microprobe analysis. The pigments were identified by Raman spectroscopy and submitted to stable isotope analysis. The results were integrated with the archaeological data in order to determine and reconstruct the provenance, trade patterns and use of the raw materials employed in creating the frescoes.
Abstract:
In this work we analyze how patchy distributions of CO2 and brine within sand reservoirs may lead to significant attenuation and velocity dispersion effects, which in turn may have a profound impact on surface seismic data. The ultimate goal of this paper is to contribute to the understanding of these processes within the framework of the seismic monitoring of CO2 sequestration, a key strategy to mitigate global warming. We first carry out a Monte Carlo analysis to study the statistical behavior of attenuation and velocity dispersion of compressional waves traveling through rocks with properties similar to those of the Utsira Sand, Sleipner field, containing quasi-fractal patchy distributions of CO2 and brine. These results show that the mean patch size and CO2 saturation play key roles in the observed wave-induced fluid flow effects. The latter can be remarkably important when CO2 concentrations are low and mean patch sizes are relatively large. To analyze these effects on the corresponding surface seismic data, we perform numerical simulations of wave propagation considering reservoir models and CO2 accumulation patterns similar to those at the CO2 injection site in the Sleipner field. These numerical experiments suggest that wave-induced fluid flow effects may produce changes in the reservoir's seismic response, significantly modifying the main seismic attributes usually employed in the characterization of these environments. Consequently, the determination of the nature of the fluid distributions, as well as the proper modeling of the seismic data, constitutes an important aspect that should not be ignored in the seismic monitoring of CO2 sequestration.
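To give a flavour of the kind of computation involved, the following is a schematic sketch, not the authors' code: it Monte Carlo samples the CO2 saturation and compares the low-frequency (Gassmann-Wood) and high-frequency (Gassmann-Hill) limits of the P-wave velocity, whose separation bounds the dispersion that wave-induced fluid flow can produce. The rock and fluid properties are rough, order-of-magnitude numbers rather than calibrated Utsira Sand values, and the bound-based approach stands in for the paper's own modelling.

# Illustrative sketch only: dispersion bounds for patchy CO2/brine saturation.
# Moduli in Pa, densities in kg/m^3; all values are rough illustrative numbers.
import math
import random

PHI, K_DRY, MU, K_MIN = 0.37, 2.6e9, 0.85e9, 37e9
RHO_MIN = 2650.0
K_BRINE, RHO_BRINE = 2.3e9, 1030.0
K_CO2, RHO_CO2 = 0.07e9, 700.0

def gassmann(k_fluid):
    """Saturated bulk modulus from Gassmann's equation."""
    b = 1.0 - K_DRY / K_MIN
    return K_DRY + b * b / (PHI / k_fluid + (1.0 - PHI) / K_MIN - K_DRY / K_MIN ** 2)

def dispersion_bound(s_co2):
    """Relative Vp difference between the no-flow (Hill) and
    uniform-fluid (Wood) limits at a given CO2 saturation."""
    s_br = 1.0 - s_co2
    rho = (1.0 - PHI) * RHO_MIN + PHI * (s_co2 * RHO_CO2 + s_br * RHO_BRINE)
    # Low-frequency limit: fluid pressures equilibrate, Wood (Reuss) fluid mix.
    k_wood = 1.0 / (s_co2 / K_CO2 + s_br / K_BRINE)
    vp_lo = math.sqrt((gassmann(k_wood) + 4.0 * MU / 3.0) / rho)
    # High-frequency limit: patches stay unrelaxed; Hill average of the
    # P-wave moduli of fully brine- and fully CO2-saturated rock.
    m_hill = 1.0 / (s_co2 / (gassmann(K_CO2) + 4.0 * MU / 3.0)
                    + s_br / (gassmann(K_BRINE) + 4.0 * MU / 3.0))
    return (math.sqrt(m_hill / rho) - vp_lo) / vp_lo

random.seed(0)
samples = [dispersion_bound(random.uniform(0.02, 0.4)) for _ in range(10_000)]
print(f"mean Vp dispersion bound: {100.0 * sum(samples) / len(samples):.1f}%")

The patch size does not enter these bounds; roughly speaking, it controls the frequency at which the velocity moves from the Wood to the Hill limit, which is one way to see why both the mean patch size and the CO2 saturation matter for attenuation at seismic frequencies.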
Abstract:
There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support effective collaboration between these partners are still poorly expressed, or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process; (b) the analysis of investigative problems along principal perspectives: entities and their relationships, time and space, and quantitative aspects; and (c) visualisation methods as a mode of expressing a problem along these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role in supporting decisions. Among them, link-charts are scrutinised for their ability to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyst for a collaborative approach integrating forensic data thus calls for better specifications.
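To make the notion of a link-chart concrete, here is a minimal, purely illustrative sketch of the underlying data structure: case entities as nodes and their relationships as labelled edges, which can then be laid out visually or queried. All entities, relationship labels and case details below are invented, and the networkx library is just one possible tool.

# Minimal illustrative link-chart: case entities as nodes, relations as
# labelled edges. All data below is invented for illustration.
import networkx as nx

g = nx.MultiDiGraph()

# Entities (nodes) with a type attribute.
g.add_node("Suspect A", kind="person")
g.add_node("Vehicle X", kind="vehicle")
g.add_node("Shoe mark #3", kind="trace")
g.add_node("Burglary, 12 May", kind="event")

# Relationships (edges) with a label and, where useful, a timestamp.
g.add_edge("Suspect A", "Vehicle X", label="registered owner")
g.add_edge("Vehicle X", "Burglary, 12 May", label="sighted near scene",
           time="2012-05-12T02:40")
g.add_edge("Shoe mark #3", "Burglary, 12 May", label="recovered at scene")
g.add_edge("Shoe mark #3", "Suspect A", label="compatible sole pattern")

# A simple query the chart supports: which chains of entities connect a
# recovered trace to a person of interest?
for path in nx.all_simple_paths(g.to_undirected(), "Shoe mark #3", "Suspect A"):
    print(" -> ".join(path))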
Abstract:
PURPOSE OF REVIEW: The mechanisms involved in the formation of red blood cell (RBC) microparticles in vivo as well as during erythrocyte storage are reviewed, and the potential role of microparticles in transfusion medicine is described. RECENT FINDINGS: Microparticle release is an integral part of the erythrocyte ageing process, preventing the early removal of RBCs. Proteomics analyses have outlined the key role of the band 3-ankyrin anchoring complex and the occurrence of selective RBC membrane remodelling mechanisms in microparticle formation. The presence of several RBC antigens expressed on microparticles has been demonstrated. The potential deleterious effects of RBC microparticles in transfused recipients, including hypercoagulability, microcirculation impairment and immunosuppression, are discussed. SUMMARY: The formation and role of RBC microparticles are far from completely understood. Combining various approaches to elucidate these mechanisms could improve blood product quality and transfusion safety. Implementing RBC microparticles as biomarkers in routine laboratory practice will require overcoming the technical barriers involved in their analysis.
Abstract:
Retroelements are important evolutionary forces but can be deleterious if left uncontrolled. Members of the human APOBEC3 family of cytidine deaminases can inhibit a wide range of endogenous, as well as exogenous, retroelements. These enzymes are structurally organized in one or two domains comprising a zinc-coordinating motif. APOBEC3G contains two such domains, only the C-terminal domain of which is endowed with editing activity, while its N-terminal counterpart binds RNA, promotes homo-oligomerization, and is necessary for packaging into human immunodeficiency virus type 1 (HIV-1) virions. Here, we performed a large-scale mutagenesis-based analysis of the APOBEC3G N terminus, testing mutants for (i) inhibition of vif-defective HIV-1 infection and Alu retrotransposition, (ii) RNA binding, and (iii) oligomerization. Furthermore, in the absence of structural information on this domain, we used homology modeling to examine the positions of functionally important residues and of residues found to be under positive selection by phylogenetic analyses of primate APOBEC3G genes. Our results reveal the importance of a predicted RNA binding dimerization interface both for packaging into HIV-1 virions and for inhibition of both HIV-1 infection and Alu transposition. We further found that the HIV-1-blocking activity of APOBEC3G N-terminal mutants defective for packaging can be almost entirely rescued if their virion incorporation is forced by fusion with Vpr, indicating that the corresponding region of APOBEC3G plays little role in other aspects of its action against this pathogen. Interestingly, residues forming the APOBEC3G dimer interface are highly conserved, contrasting with the rapid evolution of two neighboring surface-exposed amino acid patches, one targeted by the Vif protein of primate lentiviruses and the other of as yet undefined function.