69 results for Chromatography techniques
Abstract:
In this study, alkaline lignins glyoxalated with glyoxal, a non-volatile and non-toxic aldehyde that can be obtained from several natural resources, were prepared and characterized for their use in wood adhesives. The preparation method consisted of the reaction of lignin with glyoxal in an alkaline medium. The influence of reaction conditions, such as the molar ratio of sodium hydroxide to lignin and the reaction time, on the properties of the prepared adducts was studied. The analytical techniques used were FTIR and 1H-NMR spectroscopy, gel permeation chromatography (GPC), differential scanning calorimetry (DSC), and thermogravimetric analysis (TGA). Results from both FTIR and 1H-NMR spectroscopy showed that the amount of aliphatic hydroxyl groups introduced onto the lignin molecule increased with increasing reaction time, reached a maximum at 10 h, and then began to decrease. The molecular weights remained unchanged up to 10 h of reaction time and then started to increase, possibly due to repolymerization reactions. DSC analysis showed that the glass transition temperature (Tg) decreased with the introduction of glyoxal onto the lignin molecule, owing to the increase in free volume of the lignin molecules. TGA showed that the thermal stability of the glyoxalated lignin is not affected and remains suitable for wood adhesives. Compared to the original lignin, the modified lignin is reactive and a suitable raw material for adhesive formulations.
Abstract:
The practical performance of analytical redundancy for fault detection and diagnosis is often degraded by uncertainties prevailing not only in the system model but also in the measurements. In this paper, the fault detection problem is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs) when dealing with uncertain measurements and parameters. The work presented in this paper shows that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
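The core consistency test described in this abstract can be sketched in a few lines: an ARR is consistent with uncertain data if the residual interval contains zero. The toy model r = y - gain*u and all numerical values below are illustrative assumptions, not the paper's hydraulic system.

```python
# Minimal sketch of interval-based ARR consistency checking.
# The model r = y - gain*u and the numbers are invented for illustration.

def imul(a, b):
    """Interval multiplication: [a] * [b]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def isub(a, b):
    """Interval subtraction: [a] - [b]."""
    return (a[0] - b[1], a[1] - b[0])

def arr_consistent(y, u, gain):
    """Check the ARR  r = y - gain*u  for consistency.

    y, u and gain are intervals (lo, hi) covering measurement and
    parameter uncertainty. The ARR is consistent (no fault detected)
    iff the residual interval contains zero.
    """
    r = isub(y, imul(gain, u))
    return r[0] <= 0.0 <= r[1]

# Nominal gain 2.0 with 5% uncertainty, input u in [0.9, 1.1]:
# an output near 2 is consistent, an output near 3 is not.
print(arr_consistent((1.8, 2.2), (0.9, 1.1), (1.9, 2.1)))  # True
print(arr_consistent((3.0, 3.2), (0.9, 1.1), (1.9, 2.1)))  # False
```

Wrapping the residual in an interval this way makes the test robust: a fault is flagged only when no value inside the uncertainty bounds can explain the measurements.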
Abstract:
The speed of fault isolation is crucial for the design and reconfiguration of fault tolerant control (FTC). In this paper, the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques. The proposed method is based on constraint satisfaction techniques and on refining the uncertainty space of the interval parameters. In comparison with approaches based on adaptive observers, the major advantage of the presented method is that isolation is fast even when accounting for uncertainty in parameters, measurements and model errors, and it does not require a monotonicity assumption. The proposed approach is illustrated with a case study of a nonlinear dynamic system.
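The idea of refining the uncertainty space of an interval parameter can be sketched as bisect-and-prune: split the parameter box and discard the halves that are inconsistent with the measurements. The one-parameter model y = p*u below and all numbers are our own illustrative assumptions, not the paper's case study.

```python
# Toy sketch of interval-parameter refinement by bisection and pruning.
# The model y = p * u (with u > 0) and all values are invented.

def refine(p_lo, p_hi, u, y_lo, y_hi, depth=12):
    """Keep only sub-intervals of parameter p consistent with a
    measurement y in [y_lo, y_hi] under the model y = p * u."""
    lo, hi = p_lo * u, p_hi * u          # image of the box through the model
    if hi < y_lo or lo > y_hi:
        return []                         # inconsistent: prune this box
    if (lo >= y_lo and hi <= y_hi) or depth == 0:
        return [(p_lo, p_hi)]             # fully consistent (or resolution limit)
    mid = 0.5 * (p_lo + p_hi)
    return (refine(p_lo, mid, u, y_lo, y_hi, depth - 1)
            + refine(mid, p_hi, u, y_lo, y_hi, depth - 1))

# With u = 1 and y measured in [1, 2], refinement shrinks p from [0, 4]
# down to (a slight outer cover of) [1, 2].
boxes = refine(0.0, 4.0, u=1.0, y_lo=1.0, y_hi=2.0)
print(boxes[0][0], boxes[-1][1])
```

Each inconsistent fault hypothesis empties its parameter box quickly, which is what makes this style of isolation fast without monotonicity assumptions.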
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, several techniques for identifying and extracting candidate terms for the classes of a taxonomy. In addition, this work points out some inconsistencies that may occur in the preprocessing of a text corpus and proposes techniques for obtaining good candidate terms for the classes of a taxonomy.
Abstract:
Two concentration methods for the fast and routine determination of caffeine (using HPLC-UV detection) in surface water and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) with octadecyl silica sorbents. A common off-line SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. An on-line SPE method based on a mini-SPE column containing 100 mg of the same sorbent, directly connected to the HPLC system, allows the method detection limit to be decreased to 10 ng L−1 with a sample volume of 100 mL. The off-line SPE method is applied to the analysis of caffeine in wastewater samples, whereas the on-line method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
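The gain in detectability from the off-line SPE step comes from volume enrichment: 1 L of sample is eluted into 2 mL. A back-of-the-envelope sketch of that arithmetic follows; the 100% recovery figure is our simplifying assumption (the abstract reports quantitative recovery), while the volumes come from the abstract.

```python
# Enrichment-factor arithmetic for the SPE concentration step.
# Assumes 100% recovery for simplicity; volumes are from the abstract.

def enrichment_factor(sample_volume_ml, eluate_volume_ml, recovery=1.0):
    """Concentration gain from percolating sample_volume_ml of sample
    and eluting the analyte into eluate_volume_ml, at the given
    fractional recovery."""
    return recovery * sample_volume_ml / eluate_volume_ml

# Off-line method: 1 L sample eluted in 2 mL -> 500-fold enrichment,
# so a 0.1 ug/L sample reaches the HPLC at about 50 ug/L.
ef = enrichment_factor(1000.0, 2.0)
print(ef)        # 500.0
print(0.1 * ef)  # 50.0
```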
Abstract:
The current operational very short-term and short-term quantitative precipitation forecasts (QPF) at the Meteorological Service of Catalonia (SMC) are produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar-based forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. First, in order to take advantage of the rainfall location and intensity from radar observations, a phase-correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hours (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure integrates the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h) onwards, as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) in order to obtain a robust methodology that can be implemented operationally and dynamically.
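The blending logic described above reduces, for each lead time, to a normalized weighted mix of the radar-advection and corrected-model fields in the first two hours, then a switch to the pure NWP forecast. The sketch below illustrates only that logic; the weight values, quality indexes and tiny 2x2 fields are our assumptions, not SMC's operational configuration.

```python
import numpy as np

# Toy sketch of the lead-time-dependent QPF blending described above.
# Weights, field sizes and values are invented for illustration.

def blend_qpf(adv, mco, nwp, lead_time_h, w_adv, w_mco):
    """Return a blended precipitation field for one lead time.

    For t+1 h and t+2 h the radar-advection field (adv) and the
    phase-corrected model field (mco) are mixed with normalized
    quality weights; from t+3 h onwards only the NWP field is used,
    since radar-based skill drops off with lead time.
    """
    if lead_time_h >= 3:
        return nwp
    total = w_adv + w_mco
    return (w_adv * adv + w_mco * mco) / total

adv = np.array([[1.0, 2.0], [0.0, 4.0]])  # mm/h, radar advection
mco = np.array([[3.0, 2.0], [0.0, 0.0]])  # mm/h, corrected model
nwp = np.array([[2.0, 2.0], [2.0, 2.0]])  # mm/h, raw model
print(blend_qpf(adv, mco, nwp, 1, w_adv=0.75, w_mco=0.25))  # radar-weighted mix
print(blend_qpf(adv, mco, nwp, 3, w_adv=0.75, w_mco=0.25))  # pure NWP field
```

In the operational scheme the weights would be recomputed dynamically from forecast-quality indexes rather than fixed as here.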
Abstract:
The current state of regional and urban science has been much discussed, and a number of studies have speculated on possible future trends in the development of the discipline. However, there has been little empirical analysis of current publication patterns in regional and urban journals. This paper studies the kinds of topics, techniques and data used in articles published in nine top international journals during the 1990s, with the aim of identifying current trends in this research field.
Abstract:
A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field of optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information for performing the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to work with optical components, including spatial light modulators, and to write scripts to perform some of the calculations.
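A classic wavefront coding element is the cubic phase mask, phi(x, y) = alpha*(x^3 + y^3), displayed on a spatial light modulator to make the system's blur nearly invariant with defocus. Whether the activity uses exactly this mask is our assumption; the grid size and strength alpha below are arbitrary illustration values.

```python
import numpy as np

# Sketch of a cubic phase mask over the unit pupil, as it might be
# sent to a spatial light modulator (alpha and grid size are arbitrary).

def cubic_phase_mask(n, alpha):
    """Phase profile phi(x, y) = alpha * (x**3 + y**3) on an n x n grid
    spanning the normalized pupil coordinates [-1, 1] x [-1, 1]."""
    x = np.linspace(-1.0, 1.0, n)
    xx, yy = np.meshgrid(x, x)
    return alpha * (xx**3 + yy**3)

phi = cubic_phase_mask(256, alpha=20.0)
pupil = np.exp(1j * phi)     # generalized pupil function for Fourier-optics simulation
print(phi.shape, phi.max())  # grid shape and peak phase (alpha * 2 at the corner)
```

From the pupil function, students could compute the PSF via a Fourier transform and verify that it changes little with added defocus, which is the point of the technique.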
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from those having smooth surfaces to others with ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, a confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and rms roughnesses were obtained by integrating the areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Although small differences between measurement techniques remained in the calculated roughnesses, these could be explained mostly by surface topographical features, such as isolated particles, that affected the instruments in different ways.
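The band-limited rms evaluation described above can be sketched in one dimension: the rms roughness is the square root of the PSD area between fixed frequency limits, which is what makes instruments with different bandwidths comparable. The abstract uses 2-D PSDs; the 1-D power-law PSD and units below are made-up illustration values.

```python
import numpy as np

# 1-D illustration of band-limited rms roughness from a PSD.
# The power-law PSD and the band limits are invented examples.

def band_rms(freqs, psd, f_lo, f_hi):
    """rms roughness = sqrt of the PSD area between fixed band limits."""
    m = (freqs >= f_lo) & (freqs <= f_hi)
    f, p = freqs[m], psd[m]
    # trapezoidal integration of the PSD over the selected band
    area = np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(f))
    return np.sqrt(area)

freqs = np.linspace(0.01, 10.0, 1000)  # spatial frequency (assumed units: 1/um)
psd = 1.0 / freqs**2                   # toy fractal-like PSD
print(band_rms(freqs, psd, 0.1, 1.0))  # close to sqrt(1/0.1 - 1/1), i.e. about 3
```

Evaluating every instrument's PSD over the same fixed band, as here, is the comparison step the abstract describes.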
Abstract:
Background: Plant hormones play a pivotal role in several physiological processes during a plant's life cycle, from germination to senescence, and the determination of endogenous hormone concentrations is essential to elucidate the role of a particular hormone in any physiological process. The availability of a sensitive and rapid method to quantify multiple classes of hormones simultaneously will greatly facilitate the investigation of the signaling networks that control specific developmental pathways and physiological responses. Because hormones are present at very low concentrations in plant tissues (10⁻⁹ M to 10⁻⁶ M) and have different chemistries, the development of a high-throughput and comprehensive method for their determination is challenging. Results: The present work reports a rapid, specific and sensitive method using ultrahigh-performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UPLC/ESI-MS/MS) to quantitatively analyze, within six minutes, the major hormones found in plant tissues, including auxins, cytokinins, gibberellins, abscisic acid, 1-aminocyclopropane-1-carboxylic acid (the ethylene precursor), jasmonic acid and salicylic acid. Sample preparation, extraction procedures and UPLC-MS/MS conditions were optimized for the determination of all plant hormones and are summarized in a schematic extraction diagram for the analysis of small amounts of plant material without time-consuming additional steps such as purification, sample drying or re-suspension. Conclusions: This new method is applicable to the analysis of dynamic changes in endogenous hormone concentrations in studies of plant developmental processes or plant responses to biotic and abiotic stresses in complex tissues. An example is shown in which a hormone profile is obtained from leaves of the aromatic plant Rosmarinus officinalis exposed to salt stress.
Abstract:
Two general approaches are used in the investigation of metal complexation by electroanalytical tools. The first, called hard modelling, is based on the formulation of a joint physicochemical model for the electrode and complexation processes and on the analytical or numerical solution of that model. Subsequently, fitting the model parameters to the experimental data yields the desired information about the complexation process. The second approach, called soft modelling, is based on the identification of a complexation model from the numerical and statistical analysis of the data, without any prior assumption of a model. This approach, which has been used extensively with spectroscopic data, has seen very little use with electrochemical data. In this article we deal with the formulation of a model (hard modelling) for metal complexation in systems with mixtures of ligands, including macromolecular ligands, and with the application of
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effects using data with known characteristics. Monte Carlo methods were used to generate AB-design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level change and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can readily be complemented by both visual and statistical analyses. A flowchart to guide the selection of techniques according to the data characteristics identified by visual inspection is provided.
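Of the procedures named above, NAP has a particularly simple definition: compare every baseline (A) observation with every treatment (B) observation, count the pairs where the B value exceeds the A value, and count ties as half. A minimal sketch follows; the toy AB data are invented, and higher B values are assumed to represent improvement.

```python
from itertools import product

# Minimal sketch of the Nonoverlap of All Pairs (NAP) index for an
# AB design. The example data are invented; improvement = higher B.

def nap(phase_a, phase_b):
    """Fraction of (A, B) pairs in which the B-phase point exceeds the
    A-phase point, counting ties as half an overlap."""
    pairs = list(product(phase_a, phase_b))
    wins = sum(1.0 for a, b in pairs if b > a)
    ties = sum(0.5 for a, b in pairs if b == a)
    return (wins + ties) / len(pairs)

baseline = [2, 3, 3, 4]          # phase A observations
treatment = [5, 6, 4, 7]         # phase B observations
print(nap(baseline, treatment))  # 1.0 would mean complete nonoverlap
```

As the abstract notes, NAP needs a prior data-correction step to be trustworthy when the data carry a linear trend; the raw index above does not include that correction.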
Abstract:
Transmission electron microscopy is a proven technique in the field of cell biology and a very useful tool in biomedical research. Innovation and improvements in equipment, together with the introduction of new technology, have allowed us to improve our knowledge of biological tissues, to visualize structures better, and both to identify and to locate molecules. Of all the types of microscopy exploited to date, electron microscopy is the one with the most advantageous resolution limit, and therefore it is a very efficient technique for deciphering the cell architecture and relating it to function. This chapter aims to provide an overview of the most important techniques that we can apply to a biological sample, tissue or cells, to observe it with an electron microscope, from the most conventional to the latest generation. Processes and concepts are defined, and the advantages and disadvantages of each technique are assessed, along with the image and information that we can obtain by using each one of them.