36 results for Quick-XAS
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Noncompetitive bids have recently become a major concern in both public and private sector construction contract auctions. Consequently, several models have been developed to help identify bidders potentially involved in collusive practices. However, most of these models require complex calculations and extensive information that is difficult to obtain. The aim of this paper is to utilize recent developments for detecting abnormal bids in capped auctions (auctions with an upper bid limit set by the auctioneer) and extend them to the more conventional uncapped auctions (where no such limits are set). To accomplish this, a new method is developed for estimating the values of bid distribution supports by using the solution to what has become known as the German Tank problem. The model is then demonstrated and tested on a sample of real construction bid data, and shown to detect cover bids with high accuracy. This paper contributes to an improved understanding of abnormal bid behavior as an aid to detecting and monitoring potential collusive bid practices.
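As a rough illustration of the German Tank idea invoked above, the classic minimum-variance unbiased estimator of an upper support from a sample maximum can be sketched as follows; the function name and bid figures are invented for illustration, and the paper's actual estimator for bid distribution supports may differ in detail.

```python
def estimate_upper_support(observed_max, n_observations):
    """Classic German Tank estimate of the upper end of a uniform support:
    N_hat = m + m/k - 1, where m is the sample maximum and k the sample size.
    (For a continuous support, the analogous estimate is m * (1 + 1/k).)"""
    if n_observations < 1:
        raise ValueError("need at least one observation")
    return observed_max + observed_max / n_observations - 1

# Illustrative use with hypothetical bids from a single uncapped auction.
bids = [412_000, 455_000, 478_000, 490_000, 503_000]
print(estimate_upper_support(max(bids), len(bids)))  # ~603_599
```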
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
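The "open international standards" behind viewers of this kind are typically the OGC Web Map Service (WMS) interface; the sketch below shows how a third-party client might assemble a standard GetMap request. The endpoint URL, layer name, time slice and bounding box are placeholders, not values taken from the paper.

```python
from urllib.parse import urlencode

# Build a standard OGC WMS 1.3.0 GetMap request, the kind of open-standard
# call a third-party client could issue against a GODIVA2-style server.
# Endpoint, layer name, time and bounding box are hypothetical placeholders.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_water_temperature",   # hypothetical layer name
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "512",
    "HEIGHT": "256",
    "FORMAT": "image/png",
    "TIME": "2008-01-01T00:00:00Z",      # four-dimensional data: pick a time slice
    "ELEVATION": "0",                    # ...and a depth/height level
}
url = "https://example.org/wms?" + urlencode(params)
print(url)  # fetch with any HTTP client to obtain a rendered map image
```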
Abstract:
Predicting metal bioaccumulation and toxicity in soil organisms is complicated by site-specific biotic and abiotic parameters. In this study we exploited tissue fractionation and digestion techniques, combined with X-ray absorption spectroscopy (XAS), to investigate the whole-body and subcellular distributions, ligand affinities, and coordination chemistry of accumulated Pb and Zn in field populations of the epigeic earthworm Lumbricus rubellus inhabiting three contrasting metalliferous and two unpolluted soils. Our main findings were (i) earthworms were resident in soils with concentrations of Pb and Zn ranging from 1200 to 27 000 mg kg^-1 and 200 to 34 000 mg kg^-1, respectively; (ii) Pb and Zn primarily accumulated in the posterior alimentary canal in nonsoluble subcellular fractions of earthworms; (iii) site-specific differences in the tissue and subcellular partitioning profiles of populations were observed, with earthworms from a calcareous site partitioning proportionally more Pb to their anterior body segments and Zn to the chloragosome-rich subcellular fraction than their acidic-soil inhabiting counterparts; (iv) XAS indicated that the interpopulation differences in metal partitioning between organs were not accompanied by qualitative differences in ligand-binding speciation, because crystalline phosphate-containing pyromorphite was a predominant chemical species in the whole-worm tissues of all mine soil residents. Differences in metal (Pb, Zn) partitioning at both organ and cellular levels displayed by field populations with protracted histories of metal exposures may reflect their innate ecophysiological responses to essential edaphic variables, such as Ca^2+ status. These observations are highly significant in the challenging exercise of interpreting holistic biomarker data delivered by "omic" technologies.
Abstract:
The Human Development Index (HDI) introduced by the United Nations Development Programme (UNDP) in 1990 has helped facilitate widespread debate amongst development researchers, practitioners and policy makers. The HDI is an aggregate index, calculated on an annual basis by the UNDP and published in its Human Development Reports, comprising measures of three components deemed by them to be central to development: (i) income (the gross domestic product per capita), (ii) education (adult literacy rate) and (iii) health (life expectancy at birth). The results of calculating the HDI are typically presented as country/regional league tables, and provide a quick means for policy makers and others to judge performance. Perhaps partly because of the relative simplicity of the index, the HDI has managed to achieve a level of acceptance and use amongst politicians and policy makers that has yet to emerge with any indicator of sustainability. Indeed, despite its existence for 11 years, including nine years after the Rio Earth Summit, the HDI has not even been modified to take on board wider issues of sustainability. This paper will critically examine the potential for 'greening' the HDI so as to include environmental and resource-consumption dimensions. Copyright (C) 2003 John Wiley & Sons, Ltd and ERP Environment.
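For concreteness, a simplified sketch of the pre-2010 HDI calculation follows: each component is normalized between fixed goalposts and the three dimension indices are averaged arithmetically. The goalposts mirror the UNDP methodology of that era, the education index is reduced here to adult literacy alone, and the country figures are purely illustrative.

```python
import math

def dimension_index(value, lo, hi):
    """Normalize a raw value onto [0, 1] between fixed 'goalposts'."""
    return (value - lo) / (hi - lo)

def hdi_pre2010(life_expectancy, adult_literacy_pct, gdp_per_capita_ppp):
    """Simplified pre-2010 HDI: arithmetic mean of three dimension indices.
    The full education index also weighted gross enrolment; it is reduced
    to adult literacy alone here for illustration."""
    health = dimension_index(life_expectancy, 25, 85)
    education = dimension_index(adult_literacy_pct, 0, 100)
    income = dimension_index(math.log(gdp_per_capita_ppp),
                             math.log(100), math.log(40_000))
    return (health + education + income) / 3

# Illustrative (not real) country figures:
print(round(hdi_pre2010(72.0, 91.0, 9_500), 3))
```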
Abstract:
A range of archaeological samples have been examined using FT-IR spectroscopy. These include suspected coprolite samples from the Neolithic site of Catalhoyuk in Turkey, pottery samples from the Roman site of Silchester, UK and the Bronze Age site of Gatas, Spain, and unidentified black residues on pottery sherds from the Roman sites of Springhead and Cambourne, UK. For coprolite samples the aim of FT-IR analysis is identification. Identification of coprolites in the field is based on their distinct orange colour; however, such visual identifications can often be misleading due to their similarity with deposits such as ochre and clay. For pottery the aim is to screen those samples that might contain high levels of organic residues which would be suitable for GC-MS analysis. The experiments have shown coprolites to have distinctive spectra, containing strong peaks from calcite, phosphate and quartz; the presence of phosphorus may be confirmed by SEM-EDX analysis. Pottery containing organic residues of plant and animal origin has also been shown to generally display strong phosphate peaks. FT-IR has distinguished between organic resin and non-organic compositions for the black residues, with differences also being seen between organic samples that have the same physical appearance. Further analysis by GC-MS has confirmed the identification of the coprolites through the presence of coprostanol and bile acids, and shows that the majority of organic pottery residues are either fatty acids or mono- or di-acylglycerols from foodstuffs, or triterpenoid resin compounds exposed to high temperatures. One suspected resin sample was shown to contain no organic residues, and it is seen that resin samples with similar physical appearances have different chemical compositions. FT-IR is proposed as a quick and cheap method of screening archaeological samples before subjecting them to the more expensive and time-consuming method of GC-MS. This will eliminate inorganic samples such as clays and ochre from GC-MS analysis, and will screen those samples which are most likely to have a high concentration of preserved organic residues. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One area of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review will provide some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.
Abstract:
It is generally acknowledged that population-level assessments provide a better measure of response to toxicants than assessments of individual-level effects. Population-level assessments generally require the use of models to integrate potentially complex data about the effects of toxicants on life-history traits, and to provide a relevant measure of ecological impact. Building on excellent earlier reviews, we briefly outline here the modelling options in population-level risk assessment. Modelling is used to calculate population endpoints from available data, which is often about individual life histories, the ways that individuals interact with each other, the environment and other species, and the ways individuals are affected by pesticides. As population endpoints, we recommend the use of population abundance, population growth rate, and the chance of population persistence. We recommend two types of model: simple life-history models distinguishing two life-history stages, juveniles and adults; and spatially explicit individual-based landscape models. Life-history models are very quick to set up and run, and they provide a great deal of insight. At the other extreme, individual-based landscape models provide the greatest verisimilitude, albeit at the cost of greatly increased complexity. We conclude with a discussion of the implications of the severe problems of parameterising models.
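To make the two-stage life-history recommendation concrete, the sketch below builds a juvenile/adult projection matrix and takes its dominant eigenvalue as the population growth rate endpoint; the vital rates and the simple "reduced fecundity" toxicant effect are invented for illustration, not taken from the paper.

```python
import numpy as np

def growth_rate(juvenile_survival, maturation, adult_survival, fecundity):
    """Dominant eigenvalue of a two-stage (juvenile/adult) projection matrix,
    i.e. the asymptotic population growth rate lambda."""
    A = np.array([
        [juvenile_survival * (1 - maturation), fecundity],       # juveniles next year
        [juvenile_survival * maturation,       adult_survival],  # adults next year
    ])
    return max(abs(np.linalg.eigvals(A)))

# Illustrative vital rates; a toxicant effect is represented crudely as a
# 30% reduction in fecundity. Values are hypothetical, not from the paper.
baseline = growth_rate(0.4, 0.5, 0.8, 3.0)
exposed = growth_rate(0.4, 0.5, 0.8, 3.0 * 0.7)
print(f"lambda (control) = {baseline:.2f}, lambda (exposed) = {exposed:.2f}")
```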
Abstract:
Identification of Fusarium species has always been difficult due to confusing phenotypic classification systems. We have developed a fluorescence-based polymerase chain reaction assay that allows for rapid and reliable identification of five toxigenic and pathogenic Fusarium species: Fusarium avenaceum, F. culmorum, F. equiseti, F. oxysporum and F. sambucinum. The method is based on the PCR amplification of species-specific DNA fragments using fluorescent oligonucleotide primers, which were designed based on sequence divergence within the internal transcribed spacer region of nuclear ribosomal DNA. Besides providing an accurate, reliable, and quick diagnosis of these Fusaria, another advantage of this method is that it reduces the potential for exposure to carcinogenic chemicals, as it substitutes fluorescent dyes for ethidium bromide. Apart from its multidisciplinary importance and usefulness, it also obviates the need for gel electrophoresis. (C) 2002 Published by Elsevier Science B.V. on behalf of the Federation of European Microbiological Societies.
Abstract:
This paper discusses experimental and theoretical investigations and Computational Fluid Dynamics (CFD) modelling considerations to evaluate the performance of a square-section wind catcher system connected to the top of a test room for the purpose of natural ventilation. The magnitude and distribution of pressure coefficients (Cp) around a wind catcher and the air flow into the test room were analysed. The modelling results indicated that air was supplied into the test room through the wind catcher's quadrants with positive external pressure coefficients and extracted out of the test room through quadrants with negative pressure coefficients. The air flow achieved through the wind catcher depends on the speed and direction of the wind. The results obtained using the explicit and AIDA implicit calculation procedures and the CFX code correlate relatively well with the experimental results at lower wind speeds and with wind incidence at an angle of 0 degrees. Variations in the Cp and air flow results were observed particularly with a wind direction of 45 degrees. The explicit and implicit calculation procedures were found to be quick and easy to use in obtaining results, whereas the wind tunnel tests were more expensive in terms of effort, cost and time. CFD codes are developing rapidly and are widely available, especially with the decreasing prices of computer hardware. However, results obtained using CFD codes must be considered with care, particularly in the absence of empirical data.
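Explicit envelope-flow procedures of the kind referred to typically reduce to an orifice-equation calculation driven by wind-induced pressure differences; the sketch below uses that standard form, with the discharge coefficient, opening area, wind speed and pressure coefficients all chosen as illustrative assumptions rather than values from the study.

```python
import math

AIR_DENSITY = 1.2  # kg/m^3

def quadrant_flow(cp_quadrant, cp_room, wind_speed, opening_area, cd=0.61):
    """Volume flow (m^3/s) through one wind-catcher quadrant using the
    orifice equation Q = Cd * A * sqrt(2*|dP|/rho), with the wind-induced
    pressure difference dP = 0.5 * rho * (Cp_quadrant - Cp_room) * U^2.
    Positive = supply into the room, negative = extract."""
    dp = 0.5 * AIR_DENSITY * (cp_quadrant - cp_room) * wind_speed ** 2
    q = cd * opening_area * math.sqrt(2 * abs(dp) / AIR_DENSITY)
    return math.copysign(q, dp)

# Illustrative Cp values for the four quadrants at one wind angle (hypothetical).
for cp in (0.6, -0.3, -0.3, -0.4):
    print(f"Cp={cp:+.1f}: Q = {quadrant_flow(cp, 0.0, 3.0, 0.25):+.3f} m^3/s")
```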
Abstract:
Eye gaze is an important conversational resource that until now could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first tele-presence system that allows people in different physical locations not only to see what each other is doing but also to follow each other's eyes, even when walking about. Projected into each space are avatar representations of remote participants that reproduce not only body, head and hand movements, but also those of the eyes. Spatial and temporal alignment of remote spaces allows the focus of gaze, as well as activity and gesture, to be used as a resource for non-verbal communication. The temporal challenge met was to reproduce eye movements quickly enough and often enough to interpret their focus during a multi-way interaction, along with communicating other verbal and non-verbal language. The spatial challenge met was to maintain communicational eye gaze while allowing free movement of participants within a virtually shared common frame of reference. This paper reports on the technical and especially the temporal characteristics of the system.
Abstract:
In this paper we consider hybrid (fast stochastic approximation and deterministic refinement) algorithms for Matrix Inversion (MI) and Solving Systems of Linear Equations (SLAE). Monte Carlo methods are used for the stochastic approximation, since they are known to be very efficient in finding a quick rough approximation of an element or a row of the inverse matrix, or in finding a component of the solution vector. We show how the stochastic approximation of the MI can be combined with a deterministic refinement procedure to obtain the MI with the required precision, and how the SLAE can then be solved using the MI. We employ a splitting A = D - C of a given non-singular matrix A, where D is a diagonally dominant matrix and C is a diagonal matrix. In our algorithm for solving the SLAE and the MI, different choices of D can be considered in order to control the norm of the iteration matrix T = D^-1 C of the resulting SLAE and to minimize the number of Markov chains required to reach a given precision. Further, we run the algorithms on a mini-Grid and investigate their efficiency depending on the granularity. Corresponding experimental results are presented.
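A minimal sketch of the hybrid idea follows: a cheap rough approximation to A^-1 from a truncated Neumann series over a splitting A = D - C (standing in for the Monte Carlo estimate of that series), followed by deterministic refinement. The Jacobi-style splitting with diagonal D and the Newton-Schulz refinement used here are illustrative stand-ins, not necessarily the exact splitting or refinement procedure of the paper.

```python
import numpy as np

def rough_inverse(A, terms=4):
    """Truncated Neumann series for the splitting A = D - C with D = diag(A)
    and C = D - A, so A^-1 ~ (I + T + T^2 + ...) D^-1 where T = D^-1 C.
    This plays the role of the quick rough approximation."""
    n = A.shape[0]
    D_inv = np.diag(1.0 / np.diag(A))
    T = np.eye(n) - D_inv @ A          # equals D^-1 C
    S, P = np.eye(n), np.eye(n)
    for _ in range(terms):
        P = P @ T
        S = S + P
    return S @ D_inv

def refine(A, X, iterations=10):
    """Newton-Schulz refinement X <- X(2I - AX), quadratically convergent
    once X is a reasonable starting approximation to A^-1."""
    I = np.eye(A.shape[0])
    for _ in range(iterations):
        X = X @ (2 * I - A @ X)
    return X

rng = np.random.default_rng(0)
A = np.diag(rng.uniform(5.0, 10.0, 6)) + 0.2 * rng.standard_normal((6, 6))  # diagonally dominant test matrix
X = refine(A, rough_inverse(A))
print(np.linalg.norm(A @ X - np.eye(6)))   # residual of the refined inverse
```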