33 results for Precision and recall
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Mood state facilitates recall of affectively congruent memories (i.e., mood-congruent recall). Mood state may also promote motivation to alleviate a negative affective state, leading to retrieval of affectively incongruent memories (i.e., mood-incongruent recall). The present study demonstrates that the focus of self-knowledge influences the occurrence of both mood-congruent and mood-incongruent recall. Three experiments found that mood-congruent recall occurred when participants recalled their experiences from a self-aspect that was related to the elicitor of moods, whereas mood-incongruent recall occurred when they recalled their experiences from a self-aspect that was unrelated to the elicitor of moods. These results suggest that the nature of the self-aspect from which persons recall their experiences determines whether mood-congruent or mood-incongruent recall occurs.
Abstract:
As part of the European Commission (EC)'s revision of the Sewage Sludge Directive and the development of a Biowaste Directive, there was recognition of the difficulty of comparing data from Member States (MSs) because of differences in sampling and analytical procedures. The 'HORIZONTAL' initiative, funded by the EC and MSs, seeks to address these differences in approach and to produce standardised procedures in the form of CEN standards. This article is a preliminary investigation into aspects of the sampling of biosolids, composts and soils with a history of biosolid application. The article provides information on the measurement uncertainty associated with sampling from heaps, large bags and pipes, and from soils in the landscape, under a limited set of conditions, using spatial and temporal sampling approaches and sample numbers based on procedures widely used in the relevant industries when sampling similar materials. These preliminary results suggest that considerably more information is required before the appropriate sample design, optimum number of samples, number of samples comprising a composite, and temporal and spatial frequency of sampling can be recommended to achieve consistent results with a high level of precision and confidence. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Compared with younger adults, older adults have a relative preference to attend to and remember positive over negative information. This is known as the “positivity effect,” and researchers have typically invoked socioemotional selectivity theory to explain it. According to socioemotional selectivity theory, as people get older they begin to perceive their time left in life as more limited. These reduced time horizons prompt older adults to prioritize achieving emotional gratification and thus exhibit increased positivity in attention and recall. Although this is the most commonly cited explanation of the positivity effect, there is currently a lack of clear experimental evidence demonstrating a link between time horizons and positivity. The goal of the current research was to address this issue. In two separate experiments, we asked participants to complete a writing activity, which directed them to think of time as being either limited or expansive (Experiments 1 and 2) or did not orient them to think about time in a particular manner (Experiment 2). Participants were then shown a series of emotional pictures, which they subsequently tried to recall. Results from both studies showed that regardless of chronological age, thinking about a limited future enhanced the relative positivity of participants’ recall. Furthermore, the results of Experiment 2 showed that this effect was not driven by changes in mood. Thus, the fact that older adults’ recall is typically more positive than younger adults’ recall may index naturally shifting time horizons and goals with age.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
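As an illustration of the accumulation step described above, the short Python sketch below builds a rough variogram from nested-design variance components; the stage separations and component values are invented for illustration and are not the authors' survey data or their Fortran/REML code.

    import numpy as np

    # Separating distances for each sampling stage, in geometric progression (m);
    # illustrative values only, coarsest stage first.
    stage_lags = np.array([600.0, 200.0, 65.0, 20.0])
    # Variance components estimated for each stage by hierarchical ANOVA (illustrative).
    stage_components = np.array([0.35, 0.20, 0.12, 0.08])

    # Accumulate components from the shortest lag upwards: the semivariance at a
    # given lag is the sum of the components for that stage and all finer stages.
    order = np.argsort(stage_lags)              # finest lag first
    lags_sorted = stage_lags[order]
    gamma = np.cumsum(stage_components[order])  # rough variogram estimate

    for h, g in zip(lags_sorted, gamma):
        print(f"lag {h:6.1f} m: semivariance ~ {g:.3f}")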
Abstract:
The first IUPAC Manual of Symbols and Terminology for Physicochemical Quantities and Units (the Green Book), of which this is the direct successor, was published in 1969, with the object of 'securing clarity and precision, and wider agreement in the use of symbols, by chemists in different countries, among physicists, chemists and engineers, and by editors of scientific journals'. Subsequent revisions have taken account of many developments in the field, culminating in the major extension and revision represented by the 1988 edition under the simplified title Quantities, Units and Symbols in Physical Chemistry. This third edition (2007) is a further revision of the material, reflecting the experience of the contributors with the previous editions. The book has been systematically brought up to date and new sections have been added. It strives to improve the exchange of scientific information among readers in different disciplines and across different nations. In a rapidly expanding volume of scientific literature where each discipline has a tendency to retreat into its own jargon, this book attempts to provide a readable compilation of widely used terms and symbols from many sources, together with brief, understandable definitions. This is the definitive guide for scientists and organizations working across a multitude of disciplines requiring internationally approved nomenclature.
Abstract:
Liquid chromatography-mass spectrometry (LC-MS) datasets can be compared or combined following chromatographic alignment. Here we describe a simple solution to the specific problem of aligning one LC-MS dataset and one LC-MS/MS dataset, acquired on separate instruments from an enzymatic digest of a protein mixture, using feature extraction and a genetic algorithm. First, the LC-MS dataset is searched within a few ppm of the calculated theoretical masses of peptides confidently identified by LC-MS/MS. A piecewise linear function is then fitted to these matched peptides using a genetic algorithm with a fitness function that is insensitive to incorrect matches but sufficiently flexible to adapt to the discrete shifts common when comparing LC datasets. We demonstrate the utility of this method by aligning ion trap LC-MS/MS data with accurate LC-MS data from an FTICR mass spectrometer and show how hybrid datasets can improve peptide and protein identification by combining the speed of the ion trap with the mass accuracy of the FTICR, similar to using a hybrid ion trap-FTICR instrument. We also show that the high resolving power of FTICR can improve precision and linear dynamic range in quantitative proteomics. The alignment software, msalign, is freely available as open source.
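To make the alignment idea concrete, the hedged Python sketch below evaluates one candidate piecewise linear retention-time mapping with a match-counting fitness function that is insensitive to grossly incorrect matches; the breakpoints, tolerance and retention times are invented, and this is not the msalign implementation (a genetic algorithm would propose and evolve the node positions).

    import numpy as np

    def piecewise_map(rt, nodes_x, nodes_y):
        """Map LC-MS/MS retention times onto the LC-MS time axis (piecewise linear)."""
        return np.interp(rt, nodes_x, nodes_y)

    def fitness(nodes_x, nodes_y, rt_msms, rt_ms, tol=0.5):
        """Count matched peptide pairs whose mapped retention-time difference lies
        within tol minutes; counting, rather than summing squared errors, keeps the
        score insensitive to incorrect matches."""
        mapped = piecewise_map(rt_msms, nodes_x, nodes_y)
        return int(np.sum(np.abs(mapped - rt_ms) <= tol))

    # Illustrative retention times (minutes) of matched peptides on the two systems.
    rt_msms = np.array([10.2, 22.5, 35.1, 48.0, 61.3])
    rt_ms   = np.array([11.0, 23.9, 36.8, 50.1, 63.9])

    # Score one candidate mapping (node positions a genetic algorithm might propose).
    print(fitness(np.array([0.0, 30.0, 70.0]), np.array([0.5, 31.5, 73.0]), rt_msms, rt_ms))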
Abstract:
The early eighties saw the introduction of liposomes as skin drug delivery systems, initially promoted primarily for localised effects with minimal systemic delivery. Subsequently, a novel ultradeformable vesicular system (termed "Transfersomes" by the inventors) was reported for transdermal delivery with an efficiency similar to subcutaneous injection. Further research illustrated that the mechanisms of liposome action depended on the application regime and the vesicle composition and morphology. Ethical, health and supply problems with human skin have encouraged researchers to use skin models. Traditional models involved polymer membranes and animal tissue, but whilst of value for release studies, such models are not always good mimics for the complex human skin barrier, particularly with respect to the stratum corneum intercellular lipid domains. These lipids have a multiply bilayered organization, with a composition and organization somewhat similar to liposomes. Consequently, researchers have used vesicles as skin model membranes. Early work employed phospholipid liposomes and tested their interactions with skin penetration enhancers, typically using thermal analysis and spectroscopic analyses. Another approach probed how incorporation of compounds into liposomes led to the loss of entrapped markers, analogous to "fluidization" of stratum corneum lipids on treatment with a penetration enhancer. Subsequently, scientists employed liposomes formulated with skin lipids in these types of studies. Following a brief description of the nature of the skin barrier to transdermal drug delivery and the use of liposomes in drug delivery through skin, this article critically reviews the relevance of using different types of vesicles as a model for human skin in permeation enhancement studies, concentrating primarily on liposomes after briefly surveying older models. The validity of different types of liposome is considered, and traditional skin models are compared to vesicular model membranes for their precision and accuracy as skin membrane mimics. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Tracer gas techniques have been the most appropriate experimental method of determining airflows and ventilation rates in houses. However, current trends to reduce greenhouse gas effects have prompted the need for alternative techniques, such as passive sampling. In this research, passive sampling techniques have been used to demonstrate the potential to fulfil these requirements by using solutions of volatile organic compounds (VOCs) and solid phase microextraction (SPME) fibres. These passive sampling techniques have been calibrated against tracer gas decay techniques and measurements from a standard orifice plate. Two constant sources of volatile organic compounds were diffused into two sections of a humidity chamber and sampled using SPME fibres. Reproducible results were obtained from a total of four SPME fibres (two in each section). Emission rates and air movement from one section to the other were predicted using developed algorithms. Comparison of the SPME fibre technique with the tracer gas technique and measurements from an orifice plate showed similar results with good precision and accuracy. With these fibres, infiltration rates can be measured as time-weighted averages over periods lasting from 10 minutes up to several days, rather than as grab samples. Key words: passive samplers, solid phase microextraction fibre, tracer gas techniques, airflow, air infiltration, houses.
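For context, the reference technique mentioned above rests on a standard calculation: in the tracer gas decay method the air change rate follows from the exponential decay of tracer concentration, N = ln(C0/Ct)/t. The short Python sketch below uses illustrative concentrations, not measurements from this study.

    import math

    def air_change_rate(c0_ppm, ct_ppm, hours):
        """Air change rate (per hour) from tracer concentrations at time 0 and after `hours`."""
        return math.log(c0_ppm / ct_ppm) / hours

    # Illustrative decay from 50 ppm to 20 ppm over two hours: roughly 0.46 air changes per hour.
    print(air_change_rate(c0_ppm=50.0, ct_ppm=20.0, hours=2.0))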
Abstract:
CloudSat is a satellite experiment designed to measure the vertical structure of clouds from space. CloudSat is planned for launch in 2004 and, once launched, will orbit in formation as part of a constellation of satellites (the A-Train) that includes NASA's Aqua and Aura satellites, a NASA-CNES lidar satellite (CALIPSO), and a CNES satellite carrying a polarimeter (PARASOL). A unique feature that CloudSat brings to this constellation is the ability to fly a precise orbit enabling the fields of view of the CloudSat radar to be overlapped with the CALIPSO lidar footprint and the other measurements of the constellation. The precision and near simultaneity of this overlap create a unique multisatellite observing system for studying the atmospheric processes essential to the hydrological cycle. The vertical profiles of cloud properties provided by CloudSat on the global scale fill a critical gap in the investigation of feedback mechanisms linking clouds to climate. Measuring these profiles requires a combination of active and passive instruments, and this will be achieved by combining the radar data of CloudSat with data from other active and passive sensors of the constellation. This paper describes the underpinning science and gives a general overview of the mission, provides some idea of the expected products and their anticipated applications, and outlines the potential capability of the A-Train for cloud observations. Notably, the CloudSat mission is expected to stimulate new areas of research on clouds. The mission also provides an important opportunity to demonstrate active sensor technology for future scientific and tactical applications. The CloudSat mission is a partnership between NASA's JPL, the Canadian Space Agency, Colorado State University, the U.S. Air Force, and the U.S. Department of Energy.
New age estimates for the Palaeolithic assemblages and Pleistocene succession of Casablanca, Morocco
Abstract:
Marine and aeolian Quaternary sediments from Casablanca, Morocco were dated using the optically stimulated luminescence (OSL) signal of quartz grains. These sediments form part of an extensive succession spanning the Pleistocene, and contain a rich faunal and archaeological record, including an Acheulian lithic assemblage from before the Brunhes–Matuyama boundary, and a Homo erectus jaw from younger cave deposits. Sediment samples from the sites of Reddad Ben Ali, Oulad J’mel, Sidi Abderhamane and Thomas Quarries have been dated in order to assess the upper age limits of OSL dating. The revision of previously measured mammalian tooth enamel electron spin resonance (ESR) dates from the Grotte des Rhinocéros, Oulad Hamida Quarry 1, incorporating updated environmental dose rate measurements and attenuation calculations, also provides chronological constraint for the archaeological material preserved at Thomas Quarries. Several OSL age estimates extend back to around 500,000 years, with a single sample providing an OSL age close to 1 Ma in magnetically reversed sediments. These luminescence dates are some of the oldest yet determined, and their reliability is assessed using both internal criteria based on stratigraphic consistency, and external lithostratigraphic, morphostratigraphic and independent chronological constraints. For most samples, good internal agreement is observed using single-aliquot regenerative-dose OSL measurements, while multiple-aliquot additive-dose measurements generally have poorer resolution and consistency. Novel slow-component and component-resolved OSL approaches applied to four samples provide significantly enhanced dating precision and an examination of the degree of signal zeroing at deposition. A comparison of the OSL age estimates with the updated ESR dates and one U-series date demonstrates that this method has great potential for providing reliable age estimates for sediments of this antiquity. We consider the cause of some slight age inversion observed at Thomas Quarries, and provide recommendations for further luminescence dating within this succession.
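For orientation, trapped-charge ages such as the OSL and ESR estimates above follow the standard relation age = equivalent dose / environmental dose rate. The Python sketch below uses invented dose values chosen only to show how an age near 500 ka arises; they are not data from the Casablanca samples.

    def trapped_charge_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
        """Burial age in thousands of years (ka) from the standard age equation."""
        return equivalent_dose_gy / dose_rate_gy_per_ka

    # Illustrative values: an equivalent dose of 1200 Gy and a dose rate of 2.4 Gy/ka give 500 ka.
    print(trapped_charge_age_ka(equivalent_dose_gy=1200.0, dose_rate_gy_per_ka=2.4))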
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to preserve the transactional context of the data captured by the corresponding OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules used to produce them. This in turn necessitates a practice whereby separate sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of information systems modelling, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP system design depends critically on capturing facts with their associated context, encoding facts and contexts into data with business rules, storing and sourcing data together with those rules, decoding data with the rules back into facts with context, and recalling facts with their associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed model allows the implementation of multi-purpose databases and business-rule stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with executions of business rules, allowing both OLTP and OLAP systems to query data alongside the rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the transactional context of the data captured by the respective OLTP system.
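As a hypothetical sketch of the general idea, not the UBIRQ specification itself, the Python fragment below stores each transactional fact together with the business rule and context used to capture it, so that later analytical recall can decode the data with the same rules; all names and fields are invented.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class BusinessRule:
        rule_id: str
        version: int
        description: str

    @dataclass
    class Fact:
        fact_id: str
        payload: dict          # the captured transactional data
        context: dict          # e.g. customer, channel, locale
        rule: BusinessRule     # the rule used to encode the fact
        captured_at: datetime = field(default_factory=datetime.utcnow)

    rule = BusinessRule("discount-eligibility", 3, "Orders over 100 qualify for a 5% discount")
    fact = Fact("order-9001",
                {"order_total": 120.0, "discount": 6.0},
                {"channel": "web", "currency": "GBP"},
                rule)

    # An analytical (OLAP-side) query can now recall the fact together with the rule
    # that produced it, rather than re-deriving its meaning from the data alone.
    print(fact.rule.description, fact.payload["discount"])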
Abstract:
Demand for organic meat is partially driven by consumer perceptions that organic foods are more nutritious than non-organic foods. However, there have been no systematic reviews comparing specifically the nutrient content of organic and conventionally produced meat. In this study, we report results of a meta-analysis based on sixty-seven published studies comparing the composition of organic and non-organic meat products. For many nutritionally relevant compounds (e.g. minerals, antioxidants and most individual fatty acids (FA)), the evidence base was too weak for meaningful meta-analyses. However, significant differences in FA profiles were detected when data from all livestock species were pooled. Concentrations of SFA and MUFA were similar or slightly lower, respectively, in organic compared with conventional meat. Larger differences were detected for total PUFA and n-3 PUFA, which were an estimated 23 (95 % CI 11, 35) % and 47 (95 % CI 10, 84) % higher in organic meat, respectively. However, for these and many other composition parameters, for which meta-analyses found significant differences, heterogeneity was high, and this could be explained by differences between animal species/meat types. Evidence from controlled experimental studies indicates that the high grazing/forage-based diets prescribed under organic farming standards may be the main reason for differences in FA profiles. Further studies are required to enable meta-analyses for a wider range of parameters (e.g. antioxidant, vitamin and mineral concentrations) and to improve both precision and consistency of results for FA profiles for all species. Potential impacts of composition differences on human health are discussed.
Abstract:
New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0-26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0-10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific C-14 reservoir age information to provide a single global marine mixed-layer calibration from 10.5-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the C-14 age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).
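As a minimal illustration of how a site-specific reservoir correction enters marine calibration, the Python sketch below subtracts a local offset (Delta-R) from a conventional radiocarbon age before interpolating a calibration table; the curve points and offset are invented and are not Marine04 values.

    import numpy as np

    # Illustrative calibration table: conventional 14C age (yr BP) vs calendar age (cal yr BP).
    curve_c14 = np.array([9000.0, 9500.0, 10000.0, 10500.0])
    curve_cal = np.array([9600.0, 10300.0, 11000.0, 11900.0])

    def calibrate_marine(c14_age_bp, delta_r=0.0):
        """Apply the local reservoir offset, then interpolate the calendar age."""
        corrected = c14_age_bp - delta_r
        return float(np.interp(corrected, curve_c14, curve_cal))

    # Illustrative sample: 10100 14C yr BP with a local Delta-R of 150 yr.
    print(calibrate_marine(10100.0, delta_r=150.0))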
Abstract:
Recent developments in contracting practice in the UK have built upon recommendations contained in high-profile reports, such as those by Latham and Egan. However, the New Engineering Contract (NEC), endorsed by Latham, is based upon principles of contract drafting that seem open to question. Any contract operates in the context of its legislative environment and current working practices. This report identifies eight contentious hypotheses in the literature on construction contracts and tests their validity in a sample survey that attracted 190 responses. The survey shows, among other things, that while partnership is a positive and useful idea, authoritative contract management is considered more effective, and that “win-win” contracts, while desirable, are basically impractical. Further, precision and fairness in contracts are not easy to achieve simultaneously. While participants should know what is in their contracts, they should not routinely resort to legal action; and standard-form contracts should not seek to be universally applicable. Fundamental changes to drafting policy should be undertaken within the context of current legal contract doctrine and with sensitivity to the way that contracts are used in contemporary practice. Attitudes to construction contracting may seem to be changing on the surface, but detailed analysis of what lies behind apparent agreement on new ways of working reveals that attitudes are changing much more slowly than they appear to be.