915 results for Precision and recall
Abstract:
The major objective of the thesis is essentially to evolve and apply certain computational procedures to evaluate the structure and properties of some simple polyatomic molecules, making use of spectroscopic data available from the literature. It must be said that, though interest in such analyses has dwindled in recent times, there exists tremendous scope and utility in attempting such calculations, as the precision and reliability of experimental techniques in spectroscopy have increased vastly owing to the enormous sophistication of the instruments used for these measurements. In the present thesis an attempt is made to extract the maximum amount of information regarding the geometrical structure and interatomic forces of simple molecules from the experimental data on microwave and infrared spectra of these molecules.
Abstract:
In this text, we present two stereo-based head tracking techniques along with a fast 3D model acquisition system. The first tracking technique is a robust implementation of stereo-based head tracking designed for interactive environments with uncontrolled lighting. We integrate fast face detection and drift reduction algorithms with a gradient-based stereo rigid motion tracking technique. Our system can automatically segment and track a user's head under large rotation and illumination variations. Precision and usability of this approach are compared with previous tracking methods for cursor control and target selection in both desktop and interactive room environments. The second tracking technique is designed to improve the robustness of head pose tracking for fast movements. Our iterative hybrid tracker combines constraints from the ICP (Iterative Closest Point) algorithm and normal flow constraint. This new technique is more precise for small movements and noisy depth than ICP alone, and more robust for large movements than the normal flow constraint alone. We present experiments which test the accuracy of our approach on sequences of real and synthetic stereo images. The 3D model acquisition system we present quickly aligns intensity and depth images, and reconstructs a textured 3D mesh. 3D views are registered with shape alignment based on our iterative hybrid tracker. We reconstruct the 3D model using a new Cubic Ray Projection merging algorithm which takes advantage of a novel data structure: the linked voxel space. We present experiments to test the accuracy of our approach on 3D face modelling using real-time stereo images.
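As background for the hybrid tracker described above, the following is a minimal, generic sketch of a single ICP rigid-alignment step (nearest-neighbour matching followed by an SVD-based rigid fit); it is not the authors' implementation, omits the normal flow constraint, and all names and the toy data are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst):
    """One generic ICP iteration: match each source point to its nearest
    destination point, then fit the rigid transform (R, t) by SVD (Kabsch)."""
    # Nearest-neighbour correspondences (a real tracker adds outlier rejection
    # and, as in the abstract above, a normal flow constraint).
    d = dst[cKDTree(dst).query(src)[1]]

    # Centre both point sets and fit the rotation via SVD.
    src_c, d_c = src - src.mean(0), d - d.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ d_c)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = d.mean(0) - R @ src.mean(0)
    return R, t

# Toy usage: align a slightly rotated and shifted copy of a random cloud.
rng = np.random.default_rng(0)
dst = rng.normal(size=(500, 3))
a = 0.05  # radians
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
src = (dst - np.array([0.02, -0.01, 0.03])) @ Rz   # so that dst ~ R @ src + t
R_est, t_est = icp_step(src, dst)
```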
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
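A minimal sketch of the accumulation step described above (not the Fortran or REML code referred to in the paper); it assumes variance components have already been estimated by a hierarchical analysis of variance, and the lags and component values are hypothetical.

```python
import numpy as np

# Hypothetical variance components from a four-stage nested ANOVA,
# ordered from the finest stage (shortest lag) to the coarsest (longest lag).
lags_m     = np.array([ 10.0,  100.0, 1000.0, 10000.0])  # separating distances
components = np.array([  2.1,    3.4,    1.8,     0.7])  # estimated variance components

# Accumulating the components from the shortest lag upward gives a
# rough first-approximation variogram: one semivariance per stage lag.
rough_variogram = np.cumsum(components)

for lag, gamma in zip(lags_m, rough_variogram):
    print(f"lag {lag:8.0f} m   semivariance {gamma:5.2f}")
```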
Abstract:
The first IUPAC Manual of Symbols and Terminology for Physicochemical Quantities and Units (the Green Book), of which this is the direct successor, was published in 1969, with the object of 'securing clarity and precision, and wider agreement in the use of symbols, by chemists in different countries, among physicists, chemists and engineers, and by editors of scientific journals'. Subsequent revisions have taken account of many developments in the field, culminating in the major extension and revision represented by the 1988 edition under the simplified title Quantities, Units and Symbols in Physical Chemistry. This third edition (2007) is a further revision of the material, reflecting the experience of the contributors with the previous editions. The book has been systematically brought up to date and new sections have been added. It strives to improve the exchange of scientific information among readers in different disciplines and across different nations. In a rapidly expanding volume of scientific literature, where each discipline has a tendency to retreat into its own jargon, this book attempts to provide a readable compilation of widely used terms and symbols from many sources, together with brief, understandable definitions. It is the definitive guide for scientists and organizations working across a multitude of disciplines requiring internationally approved nomenclature.
Abstract:
Liquid chromatography-mass spectrometry (LC-MS) datasets can be compared or combined following chromatographic alignment. Here we describe a simple solution to the specific problem of aligning one LC-MS dataset and one LC-MS/MS dataset, acquired on separate instruments from an enzymatic digest of a protein mixture, using feature extraction and a genetic algorithm. First, the LC-MS dataset is searched within a few ppm of the calculated theoretical masses of peptides confidently identified by LC-MS/MS. A piecewise linear function is then fitted to these matched peptides using a genetic algorithm with a fitness function that is insensitive to incorrect matches but sufficiently flexible to adapt to the discrete shifts common when comparing LC datasets. We demonstrate the utility of this method by aligning ion trap LC-MS/MS data with accurate LC-MS data from an FTICR mass spectrometer and show how hybrid datasets can improve peptide and protein identification by combining the speed of the ion trap with the mass accuracy of the FTICR, similar to using a hybrid ion trap-FTICR instrument. We also show that the high resolving power of FTICR can improve precision and linear dynamic range in quantitative proteomics. The alignment software, msalign, is freely available as open source.
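A rough sketch of the alignment idea (not the msalign implementation): a piecewise linear retention-time mapping is fitted with a simple genetic algorithm whose fitness caps the contribution of each residual, so incorrect matches have little influence. The synthetic data, knot positions, tolerance and population settings below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical matched peptides: LC-MS/MS retention times (x) vs. LC-MS (y),
# related by a drifting linear trend, with ~20 % incorrect matches mixed in.
x = np.sort(rng.uniform(0, 100, 300))
y = 1.03 * x + 2.0 + rng.normal(0, 0.2, x.size)
bad = rng.random(x.size) < 0.2
y[bad] = rng.uniform(0, 110, bad.sum())

knots = np.linspace(0, 100, 6)          # breakpoints of the piecewise linear map

def warp(params, t):
    """Piecewise linear retention-time mapping defined by its values at the knots."""
    return np.interp(t, knots, params)

def fitness(params, cap=5.0):
    """Negative sum of capped absolute residuals: an incorrect match can
    contribute at most `cap`, so it barely influences the fit."""
    return -np.sum(np.minimum(np.abs(warp(params, x) - y), cap))

# Simple genetic algorithm: truncation selection plus Gaussian mutation,
# initialised near the identity mapping (raw retention times roughly agree).
pop = knots + rng.normal(0.0, 5.0, size=(60, knots.size))
for _ in range(200):
    keep = pop[np.argsort([fitness(p) for p in pop])[-15:]]
    kids = keep[rng.integers(0, 15, 45)] + rng.normal(0.0, 0.3, (45, knots.size))
    pop = np.vstack([keep, kids])

best = max(pop, key=fitness)
print("median |residual| of correct matches:",
      np.median(np.abs(warp(best, x)[~bad] - y[~bad])))
```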
Abstract:
The early eighties saw the introduction of liposomes as skin drug delivery systems, initially promoted primarily for localised effects with minimal systemic delivery. Subsequently, a novel ultradeformable vesicular system (termed "Transfersomes" by the inventors) was reported for transdermal delivery with an efficiency similar to subcutaneous injection. Further research illustrated that the mechanisms of liposome action depended on the application regime and the vesicle composition and morphology. Ethical, health and supply problems with human skin have encouraged researchers to use skin models. Traditional models involved polymer membranes and animal tissue, but whilst of value for release studies, such models are not always good mimics for the complex human skin barrier, particularly with respect to the stratum corneum intercellular lipid domains. These lipids have a multiply bilayered organization, with a composition and organization somewhat similar to liposomes. Consequently, researchers have used vesicles as skin model membranes. Early work first employed phospholipid liposomes and tested their interactions with skin penetration enhancers, typically using thermal analysis and spectroscopic analyses. Another approach probed how incorporation of compounds into liposomes led to the loss of entrapped markers, analogous to "fluidization" of stratum corneum lipids on treatment with a penetration enhancer. Subsequently, scientists employed liposomes formulated with skin lipids in these types of studies. Following a brief description of the nature of the skin barrier to transdermal drug delivery and the use of liposomes in drug delivery through skin, this article critically reviews the relevance of using different types of vesicles as a model for human skin in permeation enhancement studies, concentrating primarily on liposomes after briefly surveying older models. The validity of different types of liposome is considered and traditional skin models are compared to vesicular model membranes for their precision and accuracy as skin membrane mimics. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Tracer gas techniques have been the most appropriate experimental method of determining airflows and ventilation rates in houses. However, current trends to reduce greenhouse gas effects have prompted the need for alternative techniques, such as passive sampling. In this research, passive sampling techniques have been used to demonstrate the potential to fulfil these requirements by using solutions of volatile organic compounds (VOCs) and solid phase microextraction (SPME) fibres. These passive sampling techniques have been calibrated against tracer gas decay techniques and measurements from a standard orifice plate. Two constant sources of volatile organic compounds were diffused into two sections of a humidity chamber and sampled using SPME fibres. From a total of four SPME fibres (two in each section), reproducible results were obtained. Emission rates and air movement from one section to the other were predicted using the developed algorithms. Comparison of the SPME fibre technique with the tracer gas technique and measurements from an orifice plate showed similar results with good precision and accuracy. With these fibres, infiltration rates can be measured not as grab samples but as time-weighted averages over periods lasting from 10 minutes up to several days. Key words: passive samplers, solid phase microextraction fibre, tracer gas techniques, airflow, air infiltration, houses.
Abstract:
CloudSat is a satellite experiment designed to measure the vertical structure of clouds from space. The launch of CloudSat is planned for 2004, and once launched, CloudSat will orbit in formation as part of a constellation of satellites (the A-Train) that includes NASA's Aqua and Aura satellites, a NASA-CNES lidar satellite (CALIPSO), and a CNES satellite carrying a polarimeter (PARASOL). A unique feature that CloudSat brings to this constellation is the ability to fly a precise orbit enabling the fields of view of the CloudSat radar to be overlapped with the CALIPSO lidar footprint and the other measurements of the constellation. The precision and near simultaneity of this overlap create a unique multisatellite observing system for studying the atmospheric processes essential to the hydrological cycle. The vertical profiles of cloud properties provided by CloudSat on the global scale fill a critical gap in the investigation of feedback mechanisms linking clouds to climate. Measuring these profiles requires a combination of active and passive instruments, and this will be achieved by combining the radar data of CloudSat with data from other active and passive sensors of the constellation. This paper describes the underpinning science and gives a general overview of the mission, provides some idea of the expected products and their anticipated applications, and outlines the potential capability of the A-Train for cloud observations. Notably, the CloudSat mission is expected to stimulate new areas of research on clouds. The mission also provides an important opportunity to demonstrate active sensor technology for future scientific and tactical applications. The CloudSat mission is a partnership between NASA's JPL, the Canadian Space Agency, Colorado State University, the U.S. Air Force, and the U.S. Department of Energy.
New age estimates for the Palaeolithic assemblages and Pleistocene succession of Casablanca, Morocco
Abstract:
Marine and aeolian Quaternary sediments from Casablanca, Morocco were dated using the optically stimulated luminescence (OSL) signal of quartz grains. These sediments form part of an extensive succession spanning the Pleistocene, and contain a rich faunal and archaeological record, including an Acheulian lithic assemblage from before the Brunhes–Matuyama boundary, and a Homo erectus jaw from younger cave deposits. Sediment samples from the sites of Reddad Ben Ali, Oulad J'mel, Sidi Abderhamane and Thomas Quarries have been dated, in order to assess the upper limits of OSL. The revision of previously measured mammalian tooth enamel electron spin resonance (ESR) dates from the Grotte des Rhinocéros, Oulad Hamida Quarry 1, incorporating updated environmental dose rate measurements and attenuation calculations, also provides chronological constraint for the archaeological material preserved at Thomas Quarries. Several OSL age estimates extend back to around 500,000 years, with a single sample providing an OSL age close to 1 Ma in magnetically reversed sediments. These luminescence dates are some of the oldest determined, and their reliability is assessed using both internal criteria based on stratigraphic consistency, and external lithostratigraphic, morphostratigraphic and independent chronological constraints. For most samples, good internal agreement is observed using single aliquot regenerative-dose OSL measurements, while multiple aliquot additive-dose measurements generally have poorer resolution and consistency. Novel slow-component and component-resolved OSL approaches applied to four samples provide significantly enhanced dating precision, and an examination of the degree of signal zeroing at deposition. A comparison of the OSL age estimates with the updated ESR dates and one U-series date demonstrates that this method has great potential for providing reliable age estimates for sediments of this antiquity. We consider the cause of some slight age inversion observed at Thomas Quarries, and provide recommendations for further luminescence dating within this succession.
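For orientation only, both the OSL and ESR estimates rest on the standard trapped-charge age equation

$$t = \frac{D_e}{\dot{D}},$$

where $D_e$ is the equivalent dose recorded by the quartz or tooth-enamel signal and $\dot{D}$ is the environmental dose rate; the updated dose rate measurements mentioned above enter through the denominator.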
Abstract:
This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of transactions with the data captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules used to capture them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics; these sets of business rules are usually not the same as the business rules used to capture data in the original OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between information captured by OLTP systems and information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the design quality of OLTP and OLAP systems depends critically on the capture of facts with associated context, the encoding of facts and context into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with their associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity for the implementation and use of multi-purpose databases and business-rule stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, and thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
Abstract:
Demand for organic meat is partially driven by consumer perceptions that organic foods are more nutritious than non-organic foods. However, there have been no systematic reviews comparing specifically the nutrient content of organic and conventionally produced meat. In this study, we report results of a meta-analysis based on sixty-seven published studies comparing the composition of organic and non-organic meat products. For many nutritionally relevant compounds (e.g. minerals, antioxidants and most individual fatty acids (FA)), the evidence base was too weak for meaningful meta-analyses. However, significant differences in FA profiles were detected when data from all livestock species were pooled. Concentrations of SFA and MUFA were similar or slightly lower, respectively, in organic compared with conventional meat. Larger differences were detected for total PUFA and n-3 PUFA, which were an estimated 23 (95 % CI 11, 35) % and 47 (95 % CI 10, 84) % higher in organic meat, respectively. However, for these and many other composition parameters, for which meta-analyses found significant differences, heterogeneity was high, and this could be explained by differences between animal species/meat types. Evidence from controlled experimental studies indicates that the high grazing/forage-based diets prescribed under organic farming standards may be the main reason for differences in FA profiles. Further studies are required to enable meta-analyses for a wider range of parameters (e.g. antioxidant, vitamin and mineral concentrations) and to improve both precision and consistency of results for FA profiles for all species. Potential impacts of composition differences on human health are discussed.
Abstract:
The ground-state thermal neutron cross section and the resonance integral for the ¹⁶⁵Ho(n,γ)¹⁶⁶Ho reaction in the thermal and 1/E regions, respectively, of a thermal reactor neutron spectrum have been measured experimentally by the activation technique. The reaction product, ¹⁶⁶Ho in the ground state, is gaining considerable importance as a therapeutic radionuclide, and precisely measured data for the reaction are of significance both from the fundamental point of view and for applications. In this work, spectrographically pure holmium oxide (Ho₂O₃) powder samples were irradiated with and without cadmium covers at the IEA-R1 reactor (IPEN, São Paulo), Brazil. The deviation of the neutron spectrum shape from the 1/E law was measured by co-irradiating Co, Zn, Zr and Au activation detectors with thermal and epithermal neutrons, followed by regression and iterative procedures. The magnitude of the discrepancies that can occur in measurements made under the ideal 1/E-law assumption in the epithermal range was studied. The measured thermal neutron cross section at the Maxwellian-averaged thermal energy of 0.0253 eV is 59.0 +/- 2.1 b, and the resonance integral is 657 +/- 36 b. The results were measured with good precision and indicate a trend toward resolving the discrepant status of the literature data. The results are compared with the values in the main libraries, such as ENDF/B-VII, JEF-2.2 and JENDL-3.2, and with other measurements in the literature.
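For orientation only (the work above corrects for deviations of the real spectrum from the ideal 1/E law and uses a fuller formalism), the simplest textbook cadmium-ratio relation, assuming an ideal 1/E epithermal spectrum and a 1/v detector, is

$$R_{\mathrm{Cd}} = \frac{A_{\mathrm{bare}}}{A_{\mathrm{Cd}}} \approx 1 + \frac{\sigma_0\,\phi_{\mathrm{th}}}{I_0\,\phi_{\mathrm{epi}}} \quad\Longrightarrow\quad I_0 \approx \frac{\sigma_0\,\phi_{\mathrm{th}}}{(R_{\mathrm{Cd}}-1)\,\phi_{\mathrm{epi}}},$$

where $\sigma_0$ is the thermal cross section at 0.0253 eV, $I_0$ the resonance integral, $A_{\mathrm{bare}}$ and $A_{\mathrm{Cd}}$ the induced activities without and with the cadmium cover, and $\phi_{\mathrm{th}}/\phi_{\mathrm{epi}}$ the thermal-to-epithermal flux ratio.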
Abstract:
In this work, a simple method for the simultaneous determination of cocaine (COC) and five COC metabolites (benzoylecgonine, cocaethylene (CET), anhydroecgonine, anhydroecgonine methyl ester and ecgonine methyl ester) in human urine using CE coupled to MS via electrospray ionization (CE-ESI-MS) was developed and validated. Formic acid at 1 mol/L concentration was used as the electrolyte, whereas formic acid at 0.05 mol/L concentration in 1:1 methanol:water composed the coaxial sheath liquid at the ESI nozzle. The developed method presented good linearity in the dynamic range from 250 ng/mL to 5000 ng/mL (coefficient of determination greater than 0.98 for all compounds). LODs (signal-to-noise ratio of 3) were 100 ng/mL for COC and CET and 250 ng/mL for the other studied metabolites, whereas LOQs (signal-to-noise ratio of 10) were 250 ng/mL for COC and CET and 500 ng/mL for all other compounds. Intra-day precision and recovery tests estimated at three different concentration levels (500, 1500 and 5000 ng/mL) provided RSDs lower than 10% (except anhydroecgonine, 18% RSD) and recoveries of 83-109% for all analytes. The method was successfully applied to real cases. For the positive urine samples, the presence of COC and its metabolites was further confirmed by MS/MS experiments.
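A minimal sketch of how signal-to-noise-based limits of this kind are commonly estimated (a generic illustration, not the validation procedure of this work; the concentration and S/N values are hypothetical):

```python
# Generic S/N-based detection and quantification limits: if a standard of
# known concentration gives a measured signal-to-noise ratio, the limits are
# the concentrations extrapolated to S/N = 3 (LOD) and S/N = 10 (LOQ),
# assuming the response stays linear down to those levels.
conc_standard = 500.0          # ng/mL, hypothetical spiked standard
snr_standard = 15.0            # hypothetical measured S/N at that level

lod = 3.0 * conc_standard / snr_standard    # concentration at S/N = 3
loq = 10.0 * conc_standard / snr_standard   # concentration at S/N = 10
print(f"LOD ~ {lod:.0f} ng/mL, LOQ ~ {loq:.0f} ng/mL")
```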
Abstract:
This work describes the electroanalytical determination of pendimethalin herbicide levels in natural waters, river sediment and baby food samples, based on the electro-reduction of the herbicide on the hanging mercury drop electrode using square wave voltammetry (SWV). A number of experimental and voltammetric conditions were evaluated, and the best responses were achieved in Britton-Robinson buffer solutions at pH 8.0, using a frequency of 500 s⁻¹, a scan increment of 10 mV and a square wave amplitude of 50 mV. Under these conditions, pendimethalin is reduced in an irreversible process, with two reduction peaks at -0.60 V and -0.71 V vs. a Ag/AgCl reference system. Analytical curves were constructed and the detection limits were calculated to be 7.79 μg L⁻¹ and 4.88 μg L⁻¹ for peak 1 and peak 2, respectively. The precision and accuracy were determined as a function of experimental repeatability and reproducibility, which showed relative standard deviation values lower than 2% for both voltammetric peaks. The applicability of the proposed methodology was evaluated in natural water, river sediment and baby food samples. The calculated recovery efficiencies demonstrate that the proposed methodology is suitable for determining any contamination by pendimethalin in these samples. Additionally, adsorption isotherms were used to obtain information about the behavior of pendimethalin in river sediment samples. (C) 2010 Elsevier B.V. All rights reserved.