978 results for sampling techniques
Abstract:
The historical development of atomic spectrometry techniques based on chemical vapor generation in both batch and flow injection sampling formats is presented. Detection via atomic absorption spectrometry (AAS), microwave induced plasma optical emission spectrometry (MIP-OES), inductively coupled plasma optical emission spectrometry (ICP-OES), inductively coupled plasma mass spectrometry (ICP-MS) and furnace atomization nonthermal excitation spectrometry (FANES) is considered. Hydride generation is treated separately, in contrast to other methods of generating volatile derivatives. Hg-CVAAS (cold vapor atomic absorption spectrometry) is not considered here. The current state of the art, including extensions, advantages and limitations of this approach, is discussed.
Abstract:
Advancements in power electronic semiconductor switching devices have led to significantly faster switching times. In motor and generator applications, the fast switching times of pulse width modulated (PWM) inverters lead to overvoltages, caused by voltage reflections, at ever shorter cable lengths. These excessive overvoltages may lead to a failure of the electrical machine in a matter of months. In this thesis, the causes behind the overvoltage phenomenon, as well as its different mitigation techniques, are studied. The most suitable techniques for mitigating the overvoltage phenomenon in wind power generator applications are chosen based on both simulations and measurements performed on a prototype. An RC filter at the terminals of the electrical machine and an inverter output filter designed to lengthen the rise and fall times of the voltage pulses are presented as solutions to the overvoltage problem. The performance and losses of both filter types are analysed.
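For a rough sense of the scale involved: the reflected wave can fully develop, roughly doubling the terminal voltage, once the pulse rise time is shorter than the round-trip propagation time of the cable. A minimal sketch of this rule of thumb, assuming an illustrative wave velocity of about 150 m/µs for a typical motor cable; the values are not taken from the thesis:

```python
# Sketch: critical cable length at which a PWM pulse can fully reflect,
# roughly doubling the voltage at the machine terminals.
# Assumes v ~ 150 m/us wave propagation velocity (illustrative value).

def critical_length(rise_time_s: float, velocity_m_per_s: float = 150e6) -> float:
    """Cable length beyond which the reflection reaches its peak:
    the rise time equals the round-trip time 2*l/v, so l = v*t_rise/2."""
    return velocity_m_per_s * rise_time_s / 2

for t_rise in (1e-6, 200e-9, 50e-9):  # slower IGBT edges vs. fast modern devices
    print(f"rise time {t_rise * 1e9:6.0f} ns -> critical length {critical_length(t_rise):6.1f} m")
```

Faster edges shrink the critical length from tens of metres to a few metres, which is why ever shorter cables become problematic as devices get faster.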
Abstract:
As a result of the growing interest in studying employee well-being as a complex process that exhibits high within-individual variability and evolves over time, the present study considers the experience of flow in the workplace from a nonlinear dynamical systems approach. Our goal is to offer new ways to move the study of employee well-being beyond linear approaches. With nonlinear dynamical systems theory as the backdrop, we conducted a longitudinal study using the experience sampling method and qualitative semi-structured interviews for data collection; 6981 records were collected from a sample of 60 employees. The resulting time series were analyzed using techniques derived from nonlinear dynamical systems theory (i.e., recurrence analysis and surrogate data) together with multiple correspondence analysis. The results revealed the following: 1) flow in the workplace presents a high degree of within-individual variability, and this variability is chaotic in most cases (75%); 2) high levels of flow are associated with chaos; and 3) different dimensions of the flow experience (e.g., merging of action and awareness) as well as individual (e.g., age) and job characteristics (e.g., job tenure) are associated with the emergence of different dynamic patterns (chaotic, linear and random).
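For readers unfamiliar with recurrence analysis, the sketch below computes a basic recurrence matrix from a one-dimensional series. It is a minimal illustration of the technique named above, not the authors' actual pipeline, and the threshold choice is an assumption:

```python
import numpy as np

def recurrence_matrix(x: np.ndarray, threshold: float) -> np.ndarray:
    """Binary recurrence plot: R[i, j] = 1 when states i and j are closer
    than the threshold. Diagonal line structures in R (determinism measures)
    are what help distinguish chaotic from random dynamics."""
    dist = np.abs(x[:, None] - x[None, :])  # 1-D state space for simplicity
    return (dist <= threshold).astype(int)

rng = np.random.default_rng(0)
series = rng.normal(size=200)  # stand-in for one employee's flow scores
R = recurrence_matrix(series, threshold=0.1 * series.std())
print(f"recurrence rate: {R.mean():.3f}")
```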
Abstract:
Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience, applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution in which machines interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and medical rehabilitation of brain disorders. Much current research in neuroengineering focuses on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and how it can be manipulated through interactions with artificial devices, including brain-computer interfaces and neuroprosthetics.
Abstract:
Multispectral images are becoming more common in remote sensing, computer vision, and industrial applications. Because of the high accuracy of multispectral information, it can serve as an important quality factor in the inspection of industrial products. Recently, the development of multispectral imaging systems and the computational analysis of multispectral images have become the focus of growing interest. In this thesis, three areas of multispectral image analysis are considered. First, a method for analyzing multispectral textured images was developed. The method is based on a spectral co-occurrence matrix, which contains information about the joint distribution of spectral classes in the spectral domain. Next, a procedure for estimating the illumination spectrum of color images was developed. The proposed method can be used, for example, in color constancy, color correction, and content-based search in color image databases. Finally, color filters for optical pattern recognition were designed, and a prototype of a spectral vision system was constructed. The spectral vision system can be used to acquire a low-dimensional component image set for two-dimensional spectral image reconstruction. The data produced by the system are compact and therefore convenient to store and transmit.
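A minimal sketch of a spectral co-occurrence matrix in the spirit described: pixels are quantized into spectral classes (here crudely, by mean-intensity quantiles, an assumption standing in for proper spectral clustering), and pair counts are accumulated at a fixed spatial offset:

```python
import numpy as np

def spectral_cooccurrence(img: np.ndarray, n_classes: int, offset=(0, 1)) -> np.ndarray:
    """img: (H, W, B) multispectral image. Quantize each pixel spectrum into
    one of n_classes, then count co-occurrences of class pairs at the given
    pixel offset, giving the joint distribution of spectral classes."""
    h, w, _ = img.shape
    intensity = img.mean(axis=2)
    edges = np.quantile(intensity, np.linspace(0, 1, n_classes + 1)[1:-1])
    classes = np.digitize(intensity, edges)        # (H, W) class labels 0..n_classes-1
    dy, dx = offset
    a = classes[: h - dy, : w - dx].ravel()
    b = classes[dy:, dx:].ravel()
    C = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(C, (a, b), 1)                        # accumulate pair counts
    return C

img = np.random.rand(64, 64, 8)                    # toy 8-band image
print(spectral_cooccurrence(img, n_classes=4))
```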
Abstract:
In this paper, we present view-dependent information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line and have a purely geometrical basis. Several algorithms exploiting them are presented and compare well with an existing one based on depth differences.
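For reference, the standard definition of mutual information between discrete variables, on which a line-based measure would presumably build (the paper's actual flatland definition is its own contribution and is not reproduced here):

```latex
I(X;Y) \;=\; \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}}
  p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}
```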
Abstract:
In this paper, we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star discrepancy as a measure of sampling quality and introduce new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current algorithms.
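A sketch of how star discrepancy can be estimated for a 2-D point set by evaluating anchored boxes at the grid induced by the point coordinates; exact computation needs more care at box boundaries, so this is an approximation, not the paper's algorithm:

```python
import numpy as np

def star_discrepancy_estimate(pts: np.ndarray) -> float:
    """pts: (N, 2) points in [0, 1]^2. Estimate
    D* = sup over boxes [0,x) x [0,y) of |fraction of points inside - x*y|
    by checking boxes anchored at the points' own coordinates."""
    n = len(pts)
    xs = np.unique(np.append(pts[:, 0], 1.0))
    ys = np.unique(np.append(pts[:, 1], 1.0))
    worst = 0.0
    for x in xs:
        for y in ys:
            inside = np.sum((pts[:, 0] < x) & (pts[:, 1] < y)) / n
            worst = max(worst, abs(inside - x * y))
    return worst

rng = np.random.default_rng(1)
print(star_discrepancy_estimate(rng.random((128, 2))))  # random points as baseline
```

Lower values indicate a more even distribution, which is exactly the quality criterion the paper optimizes for.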
Abstract:
Water quality was monitored in the upper course of the Rio das Velhas, a major tributary of the São Francisco basin located in the state of Minas Gerais, along a 108 km stretch from its source to the limits of the Sabara district. Monitoring covered 39 parameters at 37 sites over a period of 2 years (2003-2004). Multivariate statistical techniques were applied to interpret the large water-quality data set and to establish an optimal long-term monitoring network. Cluster analysis separated the sampling sites into groups of similarity and identified highly correlated stations recommended for removal from the monitoring network. Principal component analysis identified four components that explain 80% of the total variance of the data. The principal parameters are attributed to mining activities and domestic sewage. Significant data reduction was achieved.
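A minimal sketch of the PCA step described above, using scikit-learn on a placeholder matrix of 37 sites by 39 parameters (synthetic data, not the actual Rio das Velhas measurements):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder matrix: 37 sites x 39 water-quality parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(37, 39))

pca = PCA()
pca.fit(StandardScaler().fit_transform(X))   # standardize: parameters have mixed units
cumvar = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cumvar, 0.80)) + 1   # components needed for ~80% of variance
print(f"{k} components explain {cumvar[k - 1]:.0%} of the total variance")
```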
Abstract:
Cognitive radio networks sense spectrum occupancy and manage themselves to operate in unused bands without disturbing licensed users. The detection capability of a radio system can be enhanced if the sensing process is performed jointly by a group of nodes so that the effects of wireless fading and shadowing can be minimized. However, taking a collaborative approach poses new security threats to the system, as nodes can report false sensing data to steer the group toward a wrong decision. This paper reviews secure cooperative spectrum sensing in cognitive radio networks. The main objective of these protocols is to provide an accurate resolution about the availability of spectrum channels while ensuring that contributions from incapable users as well as malicious ones are discarded. Issues, advantages and disadvantages of such protocols are investigated and summarized.
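As an illustration of the fusion idea, the sketch below discards reports far from the group consensus before deciding on channel occupancy. The median-based outlier rule and the thresholds are assumptions for illustration, not a protocol taken from the surveyed literature:

```python
import numpy as np

def robust_fusion(reports: np.ndarray, decision_threshold: float, k: float = 2.0) -> bool:
    """Discard reports more than k median-absolute-deviations from the median
    (a simple defense against false sensing data), then decide on channel
    occupancy from the mean of the surviving reports."""
    med = np.median(reports)
    mad = np.median(np.abs(reports - med)) + 1e-12
    trusted = reports[np.abs(reports - med) <= k * mad]
    return trusted.mean() > decision_threshold

honest = np.random.default_rng(0).normal(1.0, 0.1, size=10)  # channel occupied
malicious = np.array([0.0, 0.0])                             # false "vacant" reports
print(robust_fusion(np.concatenate([honest, malicious]), decision_threshold=0.5))
```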
Abstract:
The uncertainty of any analytical determination depends on both the analysis and the sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work, Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (scanning electron microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainty estimates obtained are reliable. Variographic experiments, introduced in Gy's sampling theory, are beneficially applied to analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainty is estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, as it is easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources are used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as principal component analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may in the end have beneficial effects on the economics of chemical analysis.
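A minimal sketch of the variographic experiment: the experimental variogram of a process series as a function of lag, which shows directly how the sampling interval drives the uncertainty. The mean-normalized (relative) form common in Gy's framework is assumed here, and the data are synthetic:

```python
import numpy as np

def variogram(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Relative experimental variogram
    V(j) = sum_i (x[i+j] - x[i])^2 / (2 * (N - j) * mean(x)^2), j = 1..max_lag.
    Small V at small lags means short sampling intervals give low uncertainty;
    periodic process behaviour appears as oscillation in V(j)."""
    m2 = x.mean() ** 2
    return np.array([np.mean((x[j:] - x[:-j]) ** 2) / (2 * m2)
                     for j in range(1, max_lag + 1)])

t = np.arange(500)
process = 10 + np.sin(2 * np.pi * t / 50) + np.random.default_rng(0).normal(0, 0.3, 500)
V = variogram(process, max_lag=100)
print(f"V(1) = {V[0]:.4f}, V(25) = {V[24]:.4f}")  # the periodic term inflates V near half its period
```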
Abstract:
The ongoing development of digital media has brought a new set of challenges with it. As images containing more than three wavelength bands, often called spectral images, become a more integral part of everyday life, problems in the quality of RGB reproductions from spectral images have turned into an important area of research. The notion of image quality is often thought to comprise two distinct areas, image quality itself and image fidelity, which deal with similar questions: image quality is the degree of excellence of the image, while image fidelity measures how well the image under study matches the original. In this thesis, both image fidelity and image quality are considered, with an emphasis on the influence of color and spectral image features on both; very few works have been dedicated to the quality and fidelity of spectral images. Several novel image fidelity measures were developed in this study, including kernel similarity measures and 3D-SSIM (structural similarity index). The kernel measures incorporate the polynomial, Gaussian radial basis function (RBF) and sigmoid kernels. The 3D-SSIM is an extension of the traditional gray-scale SSIM measure, developed to incorporate spectral data. The novel image quality model presented in this study is based on the assumption that the statistical parameters of the spectra of an image influence its overall appearance. The spectral image quality model comprises three quality attributes: colorfulness, vividness and naturalness. Quality prediction is done by modeling the preference function expressed in JNDs (just noticeable differences). Both the image fidelity measures and the image quality model proved effective in the respective experiments.
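A sketch of one kernel similarity measure of the kind mentioned, a Gaussian RBF kernel applied pixel-wise to a pair of spectral images; the bandwidth and the pixel-wise averaging are illustrative choices, not necessarily the thesis's exact formulation:

```python
import numpy as np

def rbf_similarity(img_a: np.ndarray, img_b: np.ndarray, sigma: float = 0.1) -> float:
    """Mean Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    over corresponding pixel spectra of two (H, W, B) spectral images.
    Returns 1.0 for identical images, approaching 0 for very different ones."""
    sq_dist = np.sum((img_a - img_b) ** 2, axis=2)
    return float(np.mean(np.exp(-sq_dist / (2 * sigma ** 2))))

rng = np.random.default_rng(0)
original = rng.random((32, 32, 16))
noisy = original + rng.normal(0, 0.01, original.shape)
print(f"similarity to itself:     {rbf_similarity(original, original):.3f}")
print(f"similarity to noisy copy: {rbf_similarity(original, noisy):.3f}")
```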
Abstract:
The water content of seafoods is very important since it affects their sensorial quality, microbiological stability, physical characteristics and shelf life. In this study, thermoanalytical techniques were employed to develop a simple and accurate method to determine water content (moisture) by thermogravimetry (TG), and water activity from the moisture content and the freezing point depression measured by differential scanning calorimetry (DSC). The precision of the results suggests that TG is a suitable technique for determining moisture content in biological samples. The average water content for fish samples of the species Lutjanus synagris and Ocyurus chrysurus was 76.4 ± 5.7% and 63.3 ± 3.9%, respectively, while that of the marine alga Ulva lactuca was 76.0 ± 4.4%. The method presented here was also successfully applied to determine water activity in two species of fish and six species of marine algae collected in the Atlantic coastal waters of Bahia, Brazil. The water activity determined in fish samples ranged from 0.946 to 0.960, consistent with values reported in the literature (0.9 to 1.0). The water activity values determined in marine algae samples lay within the interval 0.974 to 0.979.
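As a sketch of the thermodynamic relation underlying the DSC-based estimate: assuming an ideal solution, water activity follows from the measured freezing point via ln(a_w) = -(ΔH_fus/R)(1/T - 1/T0). The constants are standard; the study's exact calculation may differ:

```python
import math

def water_activity(freezing_point_c: float) -> float:
    """Estimate water activity from the sample's freezing point
    (ideal-solution assumption): ln(a_w) = -(dH_fus / R) * (1/T - 1/T0)."""
    dH_fus = 6008.0   # J/mol, enthalpy of fusion of water
    R = 8.314         # J/(mol K)
    T0 = 273.15       # K, freezing point of pure water
    T = freezing_point_c + 273.15
    return math.exp(-(dH_fus / R) * (1.0 / T - 1.0 / T0))

for fp in (-0.5, -2.0, -5.0):  # illustrative freezing point depressions
    print(f"freezing point {fp:5.1f} C -> a_w = {water_activity(fp):.3f}")
```

A depression of a few degrees already places a_w in the 0.95-0.99 range, consistent with the magnitudes reported above.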
Abstract:
Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often limited to theoretical descriptions of the methods. An engineer, however, requires evidence from experimental evaluations in order to make an appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyzed a set of representative state-of-the-art techniques for the problem we are dealing with, namely, the road passenger transportation problem. This is an optimization problem in which drivers must be assigned to transport services, fulfilling some constraints and minimizing a cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on our experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
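As a minimal sketch of the core optimization problem, a stripped-down driver-to-service assignment can be solved as a linear sum assignment over a cost matrix; real instances add constraints (shifts, rest periods) that this toy model omits:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy cost matrix: cost[d, s] = cost of assigning driver d to transport service s.
rng = np.random.default_rng(0)
cost = rng.integers(1, 20, size=(5, 5))

drivers, services = linear_sum_assignment(cost)  # Hungarian-style optimal assignment
for d, s in zip(drivers, services):
    print(f"driver {d} -> service {s} (cost {cost[d, s]})")
print(f"total cost: {cost[drivers, services].sum()}")
```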
Abstract:
Two spectrophotometric methods are described for the simultaneous determination of ezetimibe (EZE) and simvastatin (SIM) in pharmaceutical preparations. The data obtained were evaluated using two chemometric techniques, principal component regression (PCR) and partial least squares (PLS-1). In these techniques, the concentration data matrix was prepared from mixtures containing the drugs in methanol. The corresponding absorbance data matrix was obtained by measuring absorbances in the zero-order spectra over the range 240-300 nm at Δλ = 1 nm intervals (61 wavelengths); calibration (regression) was then performed on the absorbance and concentration data matrices to predict the unknown concentrations of EZE and SIM in their mixtures. The procedure did not require any separation step. The linear range was found to be 5-20 µg mL⁻¹ for both EZE and SIM in both methods. The accuracy and precision of the methods were assessed. The methods were successfully applied to a pharmaceutical preparation (tablets), and the results were compared with each other.
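A minimal sketch of a PLS-1 calibration of the kind described, regressing an absorbance matrix over the 61 wavelengths against known concentrations of a single analyte; the spectra here are synthetic, whereas the published method uses real calibration mixtures:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.arange(240, 301)                  # 240-300 nm in 1 nm steps = 61 points
conc = rng.uniform(5, 20, size=(15, 1))            # EZE concentrations, ug/mL (PLS-1: one analyte)
pure = np.exp(-((wavelengths - 265) / 12.0) ** 2)  # synthetic pure-component spectrum
A = conc @ pure[None, :] + rng.normal(0, 0.01, (15, 61))  # Beer-Lambert-like mixtures

pls = PLSRegression(n_components=2)
pls.fit(A, conc)                                   # calibration step
unknown = 12.5 * pure[None, :] + rng.normal(0, 0.01, (1, 61))
print(f"predicted concentration: {pls.predict(unknown)[0, 0]:.2f} ug/mL")
```

PCR would replace the PLS step with PCA on the absorbance matrix followed by regression on the scores; the calibration and prediction workflow is otherwise the same.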
Abstract:
Here we investigate the formation of superficial micro- and nanostructures in poly(ethylene-2,6-naphthalate) (PEN), with a view to their use in biomedical device applications, and compare its performance with that of a polymer commonly used for the fabrication of these devices, poly(methyl methacrylate) (PMMA). PEN is found to replicate both micro- and nanostructures in its surface, albeit requiring more forceful replication conditions than PMMA, with a slight increase in surface hydrophilicity. This ability to form micro/nanostructures, allied to biocompatibility and good optical transparency, suggests that PEN could be a useful material for the production of, or for incorporation into, transparent devices for biomedical applications. Such devices can be autoclaved, owing to the polymer's high-temperature stability, and will be useful in applications requiring harsh experimental conditions, owing to PEN's superior chemical resistance over PMMA.