942 results for MATCHING-TO-SAMPLE


Relevance:

90.00%

Publisher:

Abstract:

Adult monkeys (Macaca mulatta) with lesions of the hippocampal formation, the perirhinal cortex, or areas TH/TF, as well as controls, were tested on tasks of object, spatial and contextual recognition memory. Using a visual paired-comparison (VPC) task, all experimental groups showed impaired object recognition relative to controls, although this impairment emerged at 10 sec with perirhinal lesions, 30 sec with areas TH/TF lesions and 60 sec with hippocampal lesions. In contrast, only perirhinal lesions impaired performance on delayed nonmatching-to-sample (DNMS), another task of object recognition memory. All groups were tested on DNMS with distraction (dDNMS) to examine whether the use of active cognitive strategies during the delay period could enable good performance on DNMS in spite of impaired recognition memory (revealed by the VPC task). Distractors affected the performance of animals with perirhinal lesions at the 10-sec delay (the only delay at which their DNMS performance was above chance). They did not affect the performance of animals with areas TH/TF lesions. Hippocampectomized animals were impaired at the 600-sec delay (the only delay at which prevention of active strategies would likely affect their behavior). While lesions of areas TH/TF impaired both spatial location memory and object-in-place memory, hippocampal lesions impaired only object-in-place memory. The pattern of results for perirhinal cortex lesions across the different task conditions indicated that this cortical area is not critical for spatial memory. Finally, all three lesions impaired contextual recognition memory processes. The pattern of impairment appeared to result from the formation of only a global representation of the object and background, and suggests that all three areas are recruited for associating information across sources.
These results support the view that (1) the perirhinal cortex maintains storage of information about an object and the context in which it is learned for a brief period of time, (2) areas TH/TF maintain information about spatial location and form associations between objects and their spatial relationship (a process that likely requires additional time) and (3) the hippocampal formation mediates associations between objects, their spatial relationship and the general context in which these associations are formed (an integrative function that requires additional time).

Relevance:

90.00%

Publisher:

Abstract:

Considering work instability as a contextual stressor, we designed a specific instrument to assess how it is perceived by a group of psychologists: the Perceived Uneasiness in Work Instability - Psychologists Inventory (in Spanish, IMPIL-PS). The data were collected from a sample of 44 subjects, male and female, residing in the City of Buenos Aires and Greater Buenos Aires. We present data on the sample characteristics and indicate the areas on which the stressor has the greatest impact. Recent research on work instability as a contextual stressor points to its influence on subjects' performance and behaviour.

Relevance:

90.00%

Publisher:

Abstract:

During Ocean Drilling Program Leg 199 a high-resolution (~1-2 cm/k.y.) biogenic sediment record from the late Paleocene to the early Miocene was recovered, containing an uninterrupted set of geomagnetic chrons as well as a detailed record of calcareous and siliceous biostratigraphic datum events. Shipboard lithologic proxy measurements and shore-based determinations of CaCO3 revealed regular cycles that can be attributed to climatic forcing. Drill sites with well-defined magneto- and biostratigraphic records that also show clear lithologic cycles are rare and valuable, creating the opportunity to develop a detailed stratigraphic intersite correlation and providing the basis to study paleoceanographic processes and mass accumulation rates at high resolution. Here we present extensive postcruise work that extends the shipboard composite depth stratigraphy by providing a high-resolution revised meters composite depth (rmcd) scale to compensate for depth distortion within individual cores. The depth-aligned data were then used to generate stacked records of lithologic proxy measurements. Making use of the increased signal-to-noise ratio in the stacked records, we then proceeded to generate a detailed site-to-site correlation between Sites 1218 and 1219 in order to decrease the depth uncertainty for magneto- and biostratigraphic datums. Stacked lithologic proxy records in combination with discrete measurements of CaCO3 were then exploited to calculate high-resolution carbonate concentration curves by regression of the multisensor track data against discrete measurements. By matching correlative features between the cores and wireline logging data, we also rescaled our core rmcd back to in situ depths. Our study identifies lithology-dependent core expansion due to unloading as the mechanism of varying stratigraphic thicknesses between cores.
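The regression step described above, calibrating a continuous lithologic proxy against discrete CaCO3 measurements and applying the fit at full resolution, can be sketched as follows. This is a hypothetical illustration, not the authors' code; the proxy values and carbonate percentages are invented.

```python
import numpy as np

# Proxy values at the depths of discrete CaCO3 samples (invented numbers)
proxy_discrete = np.array([1.55, 1.62, 1.70, 1.78, 1.85])
caco3_discrete = np.array([20.0, 35.0, 52.0, 68.0, 81.0])  # wt% CaCO3

# Least-squares calibration: CaCO3 ~ a * proxy + b
a, b = np.polyfit(proxy_discrete, caco3_discrete, 1)

# Apply the regression to the full-resolution stacked proxy record
proxy_full = np.linspace(1.50, 1.90, 9)
caco3_full = a * proxy_full + b
```

The stacked record's improved signal-to-noise ratio is what makes such a simple regression viable; with noisier single-core data a more robust fit would be needed.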

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this final-year project is the characterization and instrumentation of an ultrasonic sensor modeled by the project tutor, César Briso Rodríguez. Once the sensor has been modeled, simulating both its physical and its electrical characteristics, it is instrumented and put to use. The instrumentation covers the electronics needed to excite the piezoelectric element in emission mode, as well as to receive the electrical pulses generated by the sensor in response to the returning echoes and to condition them to signal levels suitable for acquisition in listening mode. After this conditioning, the signals are digitized, processed and displayed on a PC through a USB data acquisition card, which samples the conditioned response signals and sends them to the control and display software developed in this project. The user interface, the acquisition card control software, and the processing and display software were developed with Visual Basic 2008 and the graphics utilities of the OpenGL libraries.

Relevance:

90.00%

Publisher:

Abstract:

We describe a method that can be used to produce equimolar amounts of two or more specific proteins in a cell. In this approach, termed the ubiquitin/protein/reference (UPR) technique, a reference protein and a protein of interest are synthesized as a polyprotein separated by a ubiquitin moiety. This tripartite fusion is cleaved, cotranslationally or nearly so, by ubiquitin-specific processing proteases after the last residue of ubiquitin, producing equimolar amounts of the protein of interest and the reference protein bearing a C-terminal ubiquitin moiety. In applications such as pulse-chase analysis, the UPR technique can compensate for the scatter of immunoprecipitation yields, sample volumes, and other sources of sample-to-sample variation. In particular, this method allows a direct comparison of proteins' metabolic stabilities from the pulse data alone. We used the UPR technique to examine the N-end rule (a relation between the in vivo half-life of a protein and the identity of its N-terminal residue) in L cells, a mouse cell line. The increased accuracy afforded by the UPR technique underscores the insufficiency of the current "half-life" terminology, because the in vivo degradation of many proteins deviates from first-order kinetics. We consider this problem and discuss other applications of the UPR technique.
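The normalization that the UPR technique enables can be illustrated with a toy calculation: dividing each sample's test-protein signal by its co-produced reference signal cancels per-sample differences in immunoprecipitation yield and volume. The band intensities below are invented for illustration only.

```python
# Hypothetical pulse-chase band intensities: (time_min, test_signal, reference_signal).
# Because the test and reference proteins are produced equimolarly, their ratio
# tracks degradation of the test protein independently of sample recovery.
pulse_chase = [
    (0,   900.0, 1000.0),
    (30,  380.0,  950.0),
    (60,  150.0,  980.0),
]

normalized = [(t, test / ref) for t, test, ref in pulse_chase]
```

The declining ratio series is what permits stability comparisons from the pulse data alone, as the abstract notes.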

Relevance:

90.00%

Publisher:

Abstract:

A Monte Carlo simulation method for globular proteins, called extended-scaled-collective-variable (ESCV) Monte Carlo, is proposed. This method combines two Monte Carlo algorithms known as entropy-sampling and scaled-collective-variable algorithms. Entropy-sampling Monte Carlo is able to sample a large configurational space even in a disordered system that has a large number of potential barriers. In contrast, scaled-collective-variable Monte Carlo provides efficient sampling for a system whose dynamics is highly cooperative. Because a globular protein is a disordered system whose dynamics is characterized by collective motions, a combination of these two algorithms could provide an optimal Monte Carlo simulation for a globular protein. As a test case, we have carried out an ESCV Monte Carlo simulation for a cell adhesive Arg-Gly-Asp-containing peptide, Lys-Arg-Cys-Arg-Gly-Asp-Cys-Met-Asp, and determined the conformational distribution at 300 K. The peptide contains a disulfide bridge between the two cysteine residues. This bond mimics the strong geometrical constraints that result from a protein's globular nature and gives rise to highly cooperative dynamics. The computational results show that ESCV Monte Carlo was not trapped at any local minimum and that the canonical distribution was correctly determined.

Relevance:

90.00%

Publisher:

Abstract:

The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Thus deterministic forecasts are produced, which are of limited use when decisions must be made, since they carry no measure of confidence or spread. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting which is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known, and thus must be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed in such a way as to minimise the computational burden, while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, so only one model needs to be run, as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before the model can be used operationally. © 2004 Elsevier B.V. All rights reserved.
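As a rough sketch of the sampling-free idea (not the authors' model), a one-dimensional analogue with a purely advective evolution step and an analytic Gaussian update against a noisy radar field might look like this; the grid size, noise variances and advection speed are all invented.

```python
import numpy as np

n = 10
mean = np.zeros(n)        # prior mean of the latent precipitation field
var = np.full(n, 4.0)     # prior variance per cell
obs_var = 1.0             # radar observation noise variance
advect = 1                # cells advected per time step

def step(mean, var, radar):
    # Evolution: pure advection plus process noise on the variance
    mean = np.roll(mean, advect)
    var = np.roll(var, advect) + 0.5
    # Analytic (Kalman-style) update against the radar observation;
    # no ensemble or Monte Carlo sampling is required
    gain = var / (var + obs_var)
    mean = mean + gain * (radar - mean)
    var = (1.0 - gain) * var
    return mean, var

radar = np.zeros(n)
radar[3] = 5.0            # a single rain cell seen by radar
mean, var = step(mean, var, radar)
```

The closed-form gain computation is what stands in for the "only one model needs to be run" property: uncertainty is carried analytically rather than through an ensemble.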

Relevance:

90.00%

Publisher:

Abstract:

The 9/11 Act mandates the inspection of 100% of cargo shipments entering the U.S. by 2012 and 100% inspection of air cargo by March 2010. So far, only 5% of inbound shipping containers are inspected thoroughly while air cargo inspections have fared better at 50%. Government officials have admitted that these milestones cannot be met since the appropriate technology does not exist. This research presents a novel planar solid phase microextraction (PSPME) device with enhanced surface area and capacity for collection of the volatile chemical signatures in air that are emitted from illicit compounds for direct introduction into ion mobility spectrometers (IMS) for detection. These IMS detectors are widely used to detect particles of illicit substances and do not have to be adapted specifically to this technology. For static extractions, PDMS and sol-gel PDMS PSPME devices provide significant increases in sensitivity over conventional fiber SPME. Results show a 50–400 times increase in mass detected of piperonal and a 2–4 times increase for TNT. In a blind study of 6 cases suspected to contain varying amounts of MDMA, PSPME-IMS correctly detected 5 positive cases with no false positives or negatives. One of these cases had minimal amounts of MDMA resulting in a false negative response for fiber SPME-IMS. A La (dihed) phase chemistry has shown an increase in the extraction efficiency of TNT and 2,4-DNT and enhanced retention over time. An alternative PSPME device was also developed for the rapid (seconds) dynamic sampling and preconcentration of large volumes of air for direct thermal desorption into an IMS. This device affords high extraction efficiencies due to strong retention properties under ambient conditions resulting in ppt detection limits when 3.5 L of air are sampled over the course of 10 seconds. 
Dynamic PSPME was used to sample the headspace over the following: MDMA tablets (12–40 ng of piperonal detected), high explosives (Pentolite) (0.6 ng of TNT detected), and several smokeless powders (26–35 ng of 2,4-DNT and 11–74 ng of DPA detected). PSPME-IMS technology is flexible to end-user needs, low-cost, rapid, sensitive, easy to use, easy to implement, and effective.

Relevance:

90.00%

Publisher:

Abstract:

Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy-dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables full spectrum CT, in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and are very noisy due to photon starvation. In this work, we propose two methods based on machine learning to address the spectral distortion issue and to improve the material decomposition. The first approach is to model the distortions using an artificial neural network (ANN) and compensate for them in a statistical reconstruction. The second approach is to directly correct for the distortion in the projections. Both techniques can be carried out as a calibration process in which the neural network is trained on data from 3D printed phantoms to learn the distortion model or the correction model of the spectral distortion. This replaces the need for the synchrotron measurements required in the conventional technique to derive the distortion model parametrically, which can be costly and time consuming. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion modeling and correction for more accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
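The calibration idea, learning a distorted-to-true spectral mapping from phantom data, can be sketched with an affine least-squares model standing in for the neural network. The distortion kernel, spectra and offset below are synthetic assumptions, not measured detector data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bins = 16

# "True" calibration phantom spectra, and a synthetic distortion
# (neighbouring-bin crosstalk plus a constant offset)
true = rng.uniform(1.0, 10.0, size=(200, n_bins))
kernel = (np.eye(n_bins) * 0.8
          + np.eye(n_bins, k=1) * 0.1
          + np.eye(n_bins, k=-1) * 0.1)
distorted = true @ kernel + 0.2

# Fit the correction (distorted -> true) by least squares on calibration data;
# an ANN would replace this linear model in the actual method
X = np.hstack([distorted, np.ones((200, 1))])  # affine model
W, *_ = np.linalg.lstsq(X, true, rcond=None)

# Apply the learned correction to a new distorted measurement
new_true = rng.uniform(1.0, 10.0, size=n_bins)
new_distorted = new_true @ kernel + 0.2
corrected = np.hstack([new_distorted, 1.0]) @ W
```

A neural network is preferred in practice because real detector distortions (pulse pile-up, charge sharing) are nonlinear, which a linear fit cannot capture.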

Relevance:

90.00%

Publisher:

Abstract:

Deciphering the driving mechanisms of Earth system processes, including the climate dynamics expressed as paleoceanographic events, requires a complete, continuous, and high-resolution stratigraphy that is very accurately dated. In this study, we construct a robust astronomically calibrated age model for the middle Eocene to early Oligocene interval (31-43 Ma) in order to permit more detailed study of the exceptional climatic events that occurred during this time, including the Middle Eocene Climate Optimum and the Eocene/Oligocene transition. A goal of this effort is to accurately date the middle Eocene to early Oligocene composite section cored during the Pacific Equatorial Age Transect (PEAT, IODP Exp. 320/321). The stratigraphic framework for the new time scale is based on the identification of the stable long eccentricity cycle in published and new high-resolution records encompassing bulk and benthic stable isotope, calibrated XRF core scanning, and magnetostratigraphic data from ODP Sites 171B-1052, 189-1172, 199-1218, and 207-1260 as well as IODP Sites 320-U1333 and -U1334, spanning magnetic polarity Chrons C12n to C20n. Subsequently we applied orbital tuning of the records to the La2011 orbital solution. The resulting new time scale revises and refines the existing orbitally tuned age model and the Geomagnetic Polarity Time Scale from 31 to 43 Ma. Our newly defined absolute age for the Eocene/Oligocene boundary validates the astronomically tuned age of 33.89 Ma identified at the Massignano (Italy) global stratotype section and point. Our compilation of geochemical records of climate-controlled variability in sedimentation through the middle-to-late Eocene and early Oligocene demonstrates strong power in the eccentricity band that is readily tuned to the latest astronomical solution. Obliquity-driven cyclicity is only apparent during very long eccentricity cycle minima around 35.5 Ma, 38.3 Ma and 40.1 Ma.

Relevance:

90.00%

Publisher:

Abstract:

Pesticide application has been described by many researchers as a very inefficient process. In some cases, there are reports that only 0.02% of the applied product contributes to effective control of the problem. The main factor influencing pesticide application is the droplet size formed at the spray nozzles. Many parameters affect the dynamics of the droplets, such as wind, temperature, relative humidity, and others. Small droplets are biologically more active, but they are affected by evaporation and drift. On the other hand, large droplets do not promote a good distribution of the product on the target. Given the risk of contaminating non-target areas and the high costs involved in applications, knowledge of the droplet size is therefore of fundamental importance in application technology. When sophisticated technology for droplet analysis is unavailable, it is common to sample droplets with artificial targets such as water-sensitive paper. In field sampling, water-sensitive papers are placed on the trial plots where the product will be applied. When droplets impinge on the paper, its yellow surface is stained dark blue, making the droplets easy to recognize. The droplets collected on these papers span a range of sizes, so determining the droplet size distribution gives the mass distribution of the material and hence the efficiency of the product application. The stains produced by the droplets show a spread factor proportional to their respective initial sizes. One methodology for analyzing the droplets is to count and measure them under a microscope. The Porton N-G12 graticule, which shows equally spaced class intervals in a geometric progression of √2, is coupled to the lens of the microscope. The droplet size parameters most frequently used are the Volumetric Median Diameter (VMD) and the Numeric Median Diameter (NMD).
The VMD is the value that divides a representative droplet sample into two equal parts by volume, such that one part contains droplets smaller than the VMD and the other contains droplets larger than it. The same process is used to obtain the NMD, which divides the sample into two equal parts with respect to the number of droplets. The ratio between VMD and NMD allows the uniformity of the droplets to be evaluated. The cumulative probabilities of droplet volume and number are then plotted on log-scale paper (cumulative probability versus the median diameter of each size class), and each graph provides the corresponding median diameter at the x-axis point where the curve crosses 50% on the y-axis. This whole process is slow and subject to operator error. To reduce the difficulty involved in measuring droplets, a numerical model was therefore developed, implemented in a simple and accessible computational language, which yields approximate VMD and NMD values with good precision. The inputs to this model are the frequencies of the droplet sizes collected on the water-sensitive paper, observed through the Porton N-G12 graticule fitted to the microscope. From these data, the cumulative distributions of droplet volume and size are evaluated. The graphs obtained by plotting these distributions allow the VMD and NMD to be obtained by linear interpolation, since the curves are approximately linear near the middle of the distributions. These values are essential for evaluating droplet uniformity and for estimating the volume deposited on the observed paper from the density (droplets/cm²). This methodology for estimating droplet volume was developed by Project 11.0.94.224 of CNPMA/EMBRAPA.
Observed data from aerial herbicide-spraying samples, collected by the Project in Pelotas/RS county, were used to compare values obtained with the manual graphical method against those obtained with the model. The model reproduced the VMD and NMD values for each sampled collector with good precision, allowing the quantity of deposited product and, consequently, the quantity lost to drift to be estimated. The variability graphs of VMD and NMD showed that the number of droplets reaching the collectors had little dispersion, while the deposited volume showed a wide range of variation, probably because of the strong action of air turbulence on the droplet distribution, emphasizing the need for deeper study to verify these influences on drift.
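The interpolation procedure described above, building cumulative number and volume distributions from per-class droplet counts and reading the 50% point linearly, can be sketched as follows. The class midpoints and counts are invented for illustration.

```python
import numpy as np

# Class midpoint diameters (µm) and droplet counts per class (invented data)
diam = np.array([50.0, 71.0, 100.0, 141.0, 200.0])
count = np.array([120, 300, 260, 90, 30], float)

# Volume per class is proportional to d^3 times the count
volume = count * diam**3

# Cumulative number and volume distributions, in percent
cum_n = np.cumsum(count) / count.sum() * 100.0
cum_v = np.cumsum(volume) / volume.sum() * 100.0

# Linear interpolation at the 50% crossing, as the model does near the
# middle of the distributions where the curves are approximately linear
nmd = np.interp(50.0, cum_n, diam)   # Numeric Median Diameter
vmd = np.interp(50.0, cum_v, diam)   # Volumetric Median Diameter

uniformity = vmd / nmd               # ratio > 1 means volume skewed to large drops
```

Because volume grows with the cube of diameter, the VMD always sits at or above the NMD, and their ratio gives the uniformity measure the text describes.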

Relevance:

90.00%

Publisher:

Abstract:

This thesis describes the development of the Sample Fetch Rover (SFR), studied for Mars Sample Return (MSR), an international campaign carried out in cooperation between the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The focus of this document is the design of the electro-mechanical systems of the rover. After placing this work into the general context of robotic planetary exploration and summarising the state of the art for Mars rovers, the architecture of the Mars Sample Return Campaign is presented. A complete overview of the current SFR architecture is provided, touching upon all the main subsystems of the spacecraft. For each area, the design drivers and chosen solutions are discussed, along with whether they rely on heritage technology (in particular from the ExoMars Rover) or on new developments. This research focuses on two topics of particular interest, due to their relevance for the mission and the novelty of their design: locomotion and sample acquisition, which are discussed in depth. The early SFR locomotion concepts are summarised, covering the initial trade-offs and discarded designs for higher traverse performance. Once a consolidated architecture was reached, the locomotion subsystem was developed further, defining the details of the suspension, actuators, deployment mechanisms and wheels. This technology is presented here in detail, including some key analysis and test results that support the design and demonstrate how it responds to the mission requirements. Another major electro-mechanical system developed as part of this work is the one dedicated to sample tube acquisition. The concept of operations of this machinery was defined to be robust against the unknown conditions that characterise the mission. The design process led to a highly automated robotic system which is described here in its main components: vision system, robotic arm and tube storage.

Relevance:

90.00%

Publisher:

Abstract:

Ground deformation provides valuable insights into subsurface processes, with patterns reflecting the characteristics of the source at depth. At active volcanic sites, displacements can be observed during unrest phases; therefore, a correct interpretation is essential to assess the hazard potential. Inverse modeling is employed to obtain quantitative estimates of the parameters describing the source. However, despite the robustness of the available approaches, realistic imaging of these reservoirs is still challenging. While analytical models return quick but simplistic results, assuming an isotropic and elastic crust, more sophisticated numerical models, which account for the effects of topographic loads, crust inelasticity and structural discontinuities, require much higher computational effort, and the information about crust rheology they need may be challenging to infer. All these approaches rely on a-priori constraints on the source shape, which influence the reliability of the solution. In this thesis, we present a new approach aimed at overcoming the aforementioned limitations, modeling sources free of a-priori shape constraints with the advantages of FEM simulations, but at a lower computational cost. The source is represented as an assembly of elementary units, consisting of cubic elements of a regular FE mesh loaded with unitary stress tensors. The surface response due to each of the six stress tensor components is computed and linearly combined to obtain the total displacement field. In this way, the source can assume potentially any shape. Our tests prove the equivalence of the deformation field due to our assembly and that of a corresponding cavity with uniform boundary pressure. Our ability to simulate pressurized cavities in a continuum domain permits the surface responses to be pre-computed, avoiding remeshing. A Bayesian trans-dimensional inversion algorithm implementing this strategy is developed.
3D Voronoi cells are used to sample the model domain, selecting the elementary units contributing to the source solution and those remaining inactive as part of the crust.
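The superposition of pre-computed unit responses can be illustrated with a toy example; the random matrices below stand in for FEM-derived surface responses and are not the thesis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_surface = 50   # surface observation points
n_units = 8      # elementary cubic units in a trial source
n_comp = 6       # independent stress tensor components per unit

# Pre-computed unit responses: unit_response[u, c] is the surface field
# produced by a unit stress of component c applied in elementary unit u.
# In the real method these come from one-off FEM runs, so no remeshing
# is needed during inversion.
unit_response = rng.normal(size=(n_units, n_comp, n_surface))

# A candidate source: stress loading per unit and component
stress = rng.normal(size=(n_units, n_comp))

# Total surface displacement field: a linear combination of unit responses
total = np.einsum('uc,ucs->s', stress, unit_response)
```

Because the forward model reduces to this weighted sum, each trial source in the trans-dimensional inversion costs only a matrix contraction rather than a new FEM solve.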