995 results for work sampling
Performance Tuning Non-Uniform Sampling for Sensitivity Enhancement of Signal-Limited Biological NMR
Abstract:
Non-uniform sampling (NUS) has been established as a route to obtaining true sensitivity enhancements when recording indirect dimensions of decaying signals in the same total experimental time as traditional uniform incrementation of the indirect evolution period. Theory and experiments have shown that NUS can yield up to two-fold improvements in the intrinsic signal-to-noise ratio (SNR) of each dimension, while even conservative protocols can yield 20-40% improvements in the intrinsic SNR of NMR data. Applications of biological NMR that can benefit from these improvements are emerging, and in this work we develop some practical aspects of applying NUS nD-NMR to studies that approach the traditional detection limit of nD-NMR spectroscopy. Conditions for obtaining high NUS sensitivity enhancements are considered here in the context of enabling ¹H,¹⁵N-HSQC experiments on natural abundance protein samples and ¹H,¹³C-HMBC experiments on a challenging natural product. Through systematic studies we arrive at more precise guidelines for weighing sensitivity enhancements against reduced line shape constraints, and report an alternative sampling density based on a quarter-wave sinusoidal distribution that returns the highest fidelity we have seen to date in line shapes obtained by maximum entropy processing of non-uniformly sampled data.
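The quarter-wave sinusoidal sampling density is described only qualitatively above. As a rough illustration of how such a schedule could be drawn, the following Python sketch picks indirect-dimension increments with selection weights following one quarter period of a sinusoid; the function name, the exact functional form, and the retained-first-point convention are assumptions for illustration, not the authors' published schedule generator.

```python
import numpy as np

def quarter_sine_schedule(n_total, n_keep, seed=0):
    """Draw a non-uniform sampling (NUS) schedule for an indirect dimension.

    Selection weights follow a quarter-wave sinusoid: highest density at the
    start of the evolution period (strong signal), decaying toward the end.
    This is one illustrative reading of a 'quarter-wave sinusoidal density'.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_total)
    # One quarter period of a sinusoid: weight 1 at t=0 falling to 0 at t_max.
    weights = np.cos(0.5 * np.pi * t / (n_total - 1))
    probs = weights / weights.sum()
    picked = rng.choice(n_total, size=n_keep, replace=False, p=probs)
    picked.sort()
    # Conventionally the first increment is retained to anchor the FID.
    if picked[0] != 0:
        picked[0] = 0
    return picked

# Example: keep 64 of 256 increments (25% sampling density).
print(quarter_sine_schedule(256, 64))
```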
Abstract:
Proteins are linear chain molecules made of amino acids. They become functional only when they fold to their native states. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable study of the protein folding problem in silico. We have developed an enhanced solvation model based on the solution to the Poisson-Boltzmann equation in order to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion and cavitation. All the terms were implemented, analyzed and parametrized individually to obtain a high level of accuracy. In order to describe the thermodynamics of proteins, their conformational space needs to be sampled thoroughly. Simulations of proteins are hampered by slow relaxation due to their rugged free-energy landscape, with the barriers between minima being higher than the thermal energy at physiological temperatures. To overcome this problem a number of approaches have been proposed, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is its easily tunable high acceptance rate for replica exchanges. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its implementation, and present its application to the Trp-cage mini-protein in implicit solvent. We have been able to correctly predict the folding thermodynamics of this protein using our approach.
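The dissertation's microcanonical exchange rule is not spelled out above, so as a point of reference the sketch below implements the standard canonical replica-exchange acceptance test, min(1, exp[(β_i − β_j)(E_i − E_j)]), on a toy one-dimensional double-well potential; the potential, temperature ladder and step sizes are illustrative assumptions, and MREMD replaces this canonical criterion with a microcanonical one.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy(x):
    # Toy double-well potential standing in for a protein energy surface.
    return (x**2 - 1.0)**2

betas = np.array([4.0, 2.0, 1.0, 0.5])   # inverse temperatures, cold to hot
xs = np.zeros(len(betas))                 # one walker per replica

for sweep in range(5000):
    # Metropolis moves within each replica.
    for i, beta in enumerate(betas):
        prop = xs[i] + rng.normal(scale=0.3)
        if rng.random() < np.exp(-beta * (energy(prop) - energy(xs[i]))):
            xs[i] = prop
    # Attempt an exchange between a random neighboring pair of replicas:
    # accept with min(1, exp[(beta_j - beta_{j+1}) (E_j - E_{j+1})]).
    j = rng.integers(len(betas) - 1)
    delta = (betas[j] - betas[j + 1]) * (energy(xs[j]) - energy(xs[j + 1]))
    if rng.random() < np.exp(min(0.0, delta)):
        xs[j], xs[j + 1] = xs[j + 1], xs[j]
```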
Abstract:
We present a generalized framework for gradient-domain Metropolis rendering, and introduce three techniques to reduce sampling artifacts and variance. The first one is a heuristic weighting strategy that combines several sampling techniques to avoid outliers. The second one is an improved mapping to generate offset paths required for computing gradients. Here we leverage the properties of manifold walks in path space to cancel out singularities. Finally, the third technique introduces generalized screen space gradient kernels. This approach aligns the gradient kernels with image structures such as texture edges and geometric discontinuities to obtain sparser gradients than with the conventional gradient kernel. We implement our framework on top of an existing Metropolis sampler, and we demonstrate significant improvements in visual and numerical quality of our results compared to previous work.
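As background for readers new to the gradient-domain setting: the final image is typically recovered by solving a screen-space Poisson problem that blends the directly sampled (primal) image with the sampled gradients. The sketch below is a minimal one-dimensional least-squares version of that reconstruction; the weight alpha, the signal and the noise levels are illustrative assumptions, not the paper's weighting heuristic or gradient kernels.

```python
import numpy as np

def poisson_reconstruct(primal, grad, alpha=0.2):
    """L2 gradient-domain reconstruction in 1-D.

    Solves  min_I  alpha * ||I - primal||^2 + ||D I - grad||^2,
    where D is the forward-difference operator. alpha balances the
    (noisy) primal estimate against the (less noisy) gradient estimate.
    """
    n = len(primal)
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    A = alpha * np.eye(n) + D.T @ D      # normal-equation matrix
    b = alpha * primal + D.T @ grad
    return np.linalg.solve(A, b)

# Toy example: smooth signal, noisy primal samples, cleaner gradients.
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, np.pi, 64))
primal = truth + rng.normal(scale=0.3, size=64)
grad = np.diff(truth) + rng.normal(scale=0.03, size=63)
recon = poisson_reconstruct(primal, grad)
print(np.abs(recon - truth).mean(), "<", np.abs(primal - truth).mean())
```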
Abstract:
Indoor and ambient air organic pollutants have been gaining attention because they have been measured at levels with possible health effects. Studies have shown that most airborne polychlorinated biphenyls (PCBs), pesticides and many polycyclic aromatic hydrocarbons (PAHs) are present in the free vapor state. The purpose of this research was to extend recent investigative work with polyurethane foam (PUF) as a collection medium for semivolatile compounds. Open-porous flexible PUFs with different chemical makeup and physical properties were evaluated as to their collection affinities/efficiencies for various classes of compounds and the degree of sample recovery. Filtered air samples were pulled through plugs of PUF spiked with various semivolatiles under different simulated environmental conditions (temperature and humidity) and sampling parameters (flow rate and sample volume) in order to measure their effects on the sample breakthrough volume (V_B). PUF was also evaluated in the passive mode using organophosphorus pesticides. Another major goal was to improve the overall analytical methodology; PUF is inexpensive, easy to handle in the field and has excellent airflow characteristics (low pressure drop). It was confirmed that the PUF collection apparatus behaves as if it were a gas-solid chromatographic system, in that V_B was related to temperature and sample volume. Breakthrough volumes were essentially the same using both polyether and polyester type PUF. Also, little change was observed in the V_B values after coating PUF with common chromatographic liquid phases. Open cell (reticulated) foams gave better recoveries than closed cell foams. There was a slight increase in V_B with an increase in the number of cells/pores per inch. The high-density polyester PUF was found to be an excellent passive and active collection adsorbent. Good recoveries could be obtained using just solvent elution. A gas chromatograph equipped with a photoionization detector gave excellent sensitivities and selectivities for the various classes of compounds investigated.
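The gas-solid chromatography analogy suggests the familiar log-linear retention relation log10(V_B) ≈ A + B/T. The sketch below fits that relation and extrapolates to a field temperature; the relation's use here and all numbers are hypothetical illustrations, not data or equations from this work.

```python
import numpy as np

# Hypothetical (illustrative, not measured) breakthrough volumes in liters
# at several sampling temperatures in kelvin.
T = np.array([278.0, 288.0, 298.0, 308.0])
VB = np.array([520.0, 310.0, 190.0, 120.0])

# Gas-solid chromatography analogy: log10(V_B) = A + B / T.
B_coef, A_coef = np.polyfit(1.0 / T, np.log10(VB), 1)

def predict_vb(temp_K):
    # Extrapolate the fitted retention relation to a new temperature.
    return 10.0 ** (A_coef + B_coef / temp_K)

print(f"Predicted V_B at 293 K: {predict_vb(293.0):.0f} L")
```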
Abstract:
Purpose. The purpose of this study was to determine the perceptions of work engagement of Taiwanese nurses, with 3 specific aims: (1) understand Taiwanese nurses' perceptions of work engagement; (2) explore the factors influencing work engagement; and (3) examine how work engagement impacts nursing care for patients. Design. The study used an ethnographic approach with participant observation and semi-structured interviews with RNs. Setting. The study was conducted in the highest and lowest nurse-turnover medical-surgical units at a regional teaching hospital in southwestern Taiwan. Sample. Purposive sampling resulted in 28 formal interviews with RNs who provided direct patient care, had at least 3 months of experience in nursing, and were full-time employees. Methods. Descriptive data were collected through participant observation in each unit. Observations were made while attending meetings, continuing education sessions, and informal conversations with RNs. Field notes and audio-recorded semi-structured interviews were analyzed using qualitative thematic analytic techniques. Findings. Findings revealed perceptions of work engagement spanned four domains: patients ("wholehearted care"), work (positive attitude), self (fulfillment and happiness), and others (relationships with colleagues). Providing "wholehearted care" toward patients was the foundation of work engagement for nurses in Taiwan. Engaged nurses felt fulfilled, happy, and found "meaning" through the process of patient care. The study revealed five factors that influenced work engagement: personal, organizational, social, patient, and professional. The impact of work engagement on nurse and patient outcomes was confirmed. Conclusions. Taiwanese nurses connect work engagement with patients, the job, oneself, and colleagues. "Wholehearted patient care" is the core manifestation of work engagement among these nurses. In contrast, studies in western business have focused only on work attitudes. Losing interest and "heart" leads to work routines, which can lead to individual unhappiness. Findings from this study validate the multiple factors contributing to work engagement of nurses. Job demands and resources can only partially explain what hinders work engagement. Work disengagement and burnout share some commonality but should be measured differently. An understanding of RNs' perceptions of work engagement may provide direction for strategies that improve work engagement, leading to decreased RN turnover.
Abstract:
To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary, though not sufficient, conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled (if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed); (ii) the inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample: since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas, if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps and in agreement with theoretical expectations, visual (qualitative) evidence and quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ by related works, makes the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
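The point that "the inclusion probability determines the weight attached to each sampling unit" corresponds to inverse-inclusion-probability weighting in the accuracy estimation formulas. The sketch below shows that weighting in its Hájek (ratio) form for overall map accuracy; the sample values and probabilities are illustrative assumptions, not the protocol's own estimators.

```python
import numpy as np

def weighted_accuracy(correct, incl_prob):
    """Inverse-inclusion-probability estimate of overall map accuracy.

    correct   : 0/1 array, 1 if the sampled unit's map label matches reference
    incl_prob : inclusion probability of each sampled unit (> 0 by design)
    Each unit is weighted by 1/pi_i, so unequal-probability designs
    (e.g., oversampling rare classes) still yield design-consistent
    estimates; normalizing by the sum of weights gives the Hájek form.
    """
    w = 1.0 / np.asarray(incl_prob)
    return np.sum(w * correct) / np.sum(w)

# Illustrative sample: rare-class pixels sampled with higher probability.
correct = np.array([1, 1, 0, 1, 0, 1, 1, 1])
incl_prob = np.array([0.01, 0.01, 0.01, 0.01, 0.05, 0.05, 0.05, 0.05])
print(f"Estimated overall accuracy: {weighted_accuracy(correct, incl_prob):.3f}")
```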
Abstract:
The aim of this work is to solve a question raised for average sampling in shift-invariant spaces by using the well-known matrix pencil theory. In many common situations in sampling theory, the available data are samples of some convolution operator acting on the function itself: this leads to the problem of average sampling, also known as generalized sampling. In this paper we deal with the existence of a sampling formula involving these samples and having reconstruction functions with compact support. Thus, low computational complexity is involved and truncation errors are avoided. In practice, it is accomplished by means of an FIR filter bank. An answer is given in the light of the generalized sampling theory by using the oversampling technique: more samples than strictly necessary are used. The original problem reduces to finding a polynomial left inverse of a polynomial matrix intimately related to the sampling problem which, for a suitable choice of the sampling period, becomes a matrix pencil. This matrix pencil approach allows us to obtain a practical method for computing the compactly supported reconstruction functions for the important case where the oversampling rate is minimum. Moreover, the optimality of the obtained solution is established.
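The computational core described above is finding a polynomial left inverse H(z) of a tall polynomial matrix G(z) (more rows than columns, i.e. the oversampled case). A generic way to attempt this, shown in the sketch below, is to write H(z)G(z) = I as a block-Toeplitz linear system in the coefficients of H and solve it by least squares; the example matrix and degree bound are assumptions for illustration, and the paper's matrix-pencil construction and optimality result go beyond this brute-force setup.

```python
import numpy as np

def poly_left_inverse(G_coeffs, L):
    """Find H(z) = H_0 + ... + H_L z^L with H(z) G(z) = I, if one exists.

    G_coeffs : list of s x r coefficient matrices [G_0, G_1, ...], s > r.
    Returns [H_0, ..., H_L] or raises if no exact degree-L left inverse
    exists (checked via the residual of the least-squares solution).
    """
    s, r = G_coeffs[0].shape
    dG = len(G_coeffs) - 1
    K = L + dG + 1                        # product degrees 0 .. L+dG
    # Block-Toeplitz system: sum_m H_m G_{k-m} = I if k == 0 else 0.
    T = np.zeros((s * (L + 1), r * K))
    for m in range(L + 1):
        for k, Gk in enumerate(G_coeffs):
            T[m*s:(m+1)*s, (m+k)*r:(m+k+1)*r] = Gk
    E = np.zeros((r, r * K))
    E[:, :r] = np.eye(r)
    Xt, *_ = np.linalg.lstsq(T.T, E.T, rcond=None)   # solve X @ T = E
    X = Xt.T
    if not np.allclose(X @ T, E, atol=1e-8):
        raise ValueError(f"no degree-{L} polynomial left inverse")
    return [X[:, m*s:(m+1)*s] for m in range(L + 1)]

# Illustrative 2x1 polynomial matrix (oversampled: 2 rows, 1 column).
G0 = np.array([[1.0], [0.5]])
G1 = np.array([[0.25], [1.0]])
H = poly_left_inverse([G0, G1], L=1)
print([h.round(4).tolist() for h in H])
```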
Abstract:
Multi-camera 3D tracking systems with overlapping cameras represent a powerful means for scene analysis, as they potentially allow greater robustness than monocular systems and provide useful 3D information about object location and movement. However, their performance relies on accurately calibrated camera networks, which is not a realistic assumption in real surveillance environments. Here, we introduce a multi-camera system for tracking the 3D position of a varying number of objects while simultaneously refining the calibration of the network of overlapping cameras. To this end, we introduce a Bayesian framework that combines Particle Filtering for tracking with recursive Bayesian estimation methods by means of adapted transdimensional MCMC sampling. Additionally, the system has been designed to work on simple motion detection masks, making it suitable for camera networks with low transmission capabilities. Tests show that our approach performs successfully even when starting from clearly inaccurate camera calibrations, which would ruin conventional approaches.
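As background for the Particle Filtering component, the sketch below is a minimal bootstrap (SIR) particle filter that tracks a single 3D point from noisy pixel observations in two cameras with known projection matrices; the camera matrices, motion model and noise levels are illustrative assumptions, and the transdimensional MCMC moves and online calibration refinement of the actual system are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def project(P, X):
    """Pinhole projection of 3D points X (N,3) with a 3x4 matrix P."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    uvw = Xh @ P.T
    return uvw[:, :2] / uvw[:, 2:3]

# Two illustrative camera matrices (assumed calibrated here).
P1 = np.array([[800., 0., 320., 0.], [0., 800., 240., 0.], [0., 0., 1., 0.]])
P2 = np.array([[800., 0., 320., -800.], [0., 800., 240., 0.], [0., 0., 1., 0.]])

n_part, obs_sigma = 500, 2.0
particles = rng.normal([0.0, 0.0, 5.0], 0.5, size=(n_part, 3))

def pf_step(particles, z1, z2):
    # Predict: random-walk motion model.
    particles = particles + rng.normal(scale=0.05, size=particles.shape)
    # Weight: Gaussian pixel likelihood in both cameras.
    e1 = project(P1, particles) - z1
    e2 = project(P2, particles) - z2
    logw = -((e1**2).sum(1) + (e2**2).sum(1)) / (2 * obs_sigma**2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimate = (w[:, None] * particles).sum(0)
    # Resample (systematic resampling would reduce variance further).
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], estimate

truth = np.array([[0.1, 0.0, 5.0]])
z1 = project(P1, truth)[0] + rng.normal(scale=obs_sigma, size=2)
z2 = project(P2, truth)[0] + rng.normal(scale=obs_sigma, size=2)
particles, estimate = pf_step(particles, z1, z2)
print("estimated 3D position:", estimate)
```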
Abstract:
Dynamic thermal management techniques require a collection of on-chip thermal sensors that imply a significant area and power overhead. Finding the optimum number of temperature monitors and their location on the chip surface to optimize accuracy is an NP-hard problem. In this work we improve the modeling of the problem by including area, power and networking constraints along with the consideration of three inaccuracy terms: spatial errors, sampling rate errors and monitor-inherent errors. The problem is solved by the simulated annealing algorithm. We apply the algorithm to a test case employing three different types of monitors to highlight the importance of the different metrics. Finally we present a case study of the Alpha 21364 processor under two different constraint scenarios.
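To make the optimization loop concrete, the sketch below applies simulated annealing to a toy version of the placement problem: choose k monitor sites on a grid so as to minimize the worst-case distance from any hotspot to its nearest monitor. The grid, cost function and cooling schedule are illustrative assumptions; the paper's objective additionally models area, power and networking constraints and the three inaccuracy terms.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID, K = 16, 4                                   # 16x16 die, 4 monitors
hotspots = rng.integers(0, GRID, size=(10, 2))    # illustrative hotspot map

def cost(sites):
    # Worst-case Manhattan distance from any hotspot to its nearest monitor
    # (a stand-in for the spatial-error term of the real objective).
    d = np.abs(hotspots[:, None, :] - sites[None, :, :]).sum(-1)
    return d.min(axis=1).max()

sites = rng.integers(0, GRID, size=(K, 2))
best, best_cost, T = sites.copy(), cost(sites), 5.0

for step in range(20000):
    cand = sites.copy()
    cand[rng.integers(K)] = rng.integers(0, GRID, size=2)  # move one monitor
    dc = cost(cand) - cost(sites)
    # Metropolis acceptance: always take improvements, sometimes take
    # uphill moves with probability exp(-dc / T).
    if dc <= 0 or rng.random() < np.exp(-dc / T):
        sites = cand
        if cost(sites) < best_cost:
            best, best_cost = sites.copy(), cost(sites)
    T *= 0.9997                                   # geometric cooling

print("monitor sites:", best.tolist(), "worst-case distance:", best_cost)
```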
Abstract:
The Nakagami-m distribution is widely used for the simulation of fading channels in wireless communications. A novel, simple and extremely efficient acceptance-rejection algorithm is introduced for the generation of independent Nakagami-m random variables. The proposed method uses another Nakagami density with a half-integer value of the fading parameter, m_p = n/2 ≤ m, as the proposal function, from which samples can be drawn exactly and easily. This novel rejection technique is able to work with arbitrary values of m ≥ 1 and average path energy V, and provides a higher acceptance rate than all currently available methods. SUMMARY: An extremely efficient method for generating Nakagami random variables (used to model fading in mobile communication channels), based on rejection sampling.
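A minimal version of the half-integer-proposal idea can be written out directly: with matched average power (the V above, called omega in the code), the target-to-proposal density ratio simplifies so that a draw x from the Nakagami proposal with m_p = n/2 (a scaled chi variate, sampled exactly) is accepted with probability (u e^(1-u))^d, where u = x²/omega and d = m − m_p. The sketch below implements that textbook rejection step; the paper's exact construction and reported acceptance rates may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def nakagami_rejection(m, omega, size):
    """Nakagami-m samples via rejection from a half-integer Nakagami proposal.

    Proposal: m_p = floor(2m)/2 <= m. A Nakagami with half-integer m_p = n/2
    is a scaled chi variate, X = sqrt(omega/n * chi2_n), sampled exactly.
    With matched omega, the acceptance probability reduces to
    (u * exp(1 - u))**d with u = x**2/omega and d = m - m_p, which is
    at most 1 for all u > 0.
    """
    n = int(np.floor(2 * m))          # proposal degrees of freedom
    d = m - n / 2.0
    out = np.empty(0)
    while out.size < size:
        x = np.sqrt(omega / n * rng.chisquare(n, size))
        u = x**2 / omega
        accept = rng.random(size) < (u * np.exp(1.0 - u))**d
        out = np.concatenate([out, x[accept]])
    return out[:size]

samples = nakagami_rejection(m=2.3, omega=1.0, size=10000)
print("sample mean power:", (samples**2).mean())   # should be close to omega
```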