941 results for "method of extraction"
Abstract:
Ice cores from outside the Greenland and Antarctic ice sheets are difficult to date because of seasonal melting and multiple sources (terrestrial, marine, biogenic and anthropogenic) of sulfates deposited onto the ice. Here we present a method of volcanic sulfate extraction that relies on fitting sulfate profiles to other ion species measured along the cores in moving windows in log space. We verify the method with a well-dated section of the Belukha ice core from central Eurasia. There are excellent matches to known volcanic eruptions in the preindustrial period, and clear extraction of volcanic peaks in the post-1940 period when a simple method based on calcium as a proxy for terrestrial sulfate fails due to anthropogenic sulfate deposition. We then attempt to use the same statistical scheme to locate volcanic sulfate horizons within three ice cores from Svalbard and a core from Mount Everest. Volcanic sulfate is <5% of the sulfate budget in every core, and differences in the eruption signals extracted reflect the large differences in environment between the western, northern and central regions of Svalbard. The Lomonosovfonna and Vestfonna cores span about the last 1000 years, with good extraction of volcanic signals, while Holtedahlfonna, which extends to about AD 1700, appears to lack a clear record. The Mount Everest core allows clean volcanic signal extraction and extends back to about AD 700, slightly older than a previous flow model has suggested. The method may thus be used to extract historical volcanic records from a more diverse geographical range than hitherto.
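For illustration only, the moving-window fitting could look like the sketch below; the window length, the choice of a linear model in log space, and the 2-sigma residual threshold are assumptions made for the example, not the authors' exact procedure.

```python
import numpy as np

def volcanic_sulfate_residuals(sulfate, other_ions, window=50, thresh=2.0):
    """Fit log(sulfate) against log-transformed ion species in moving windows
    and flag samples whose residual exceeds `thresh` standard deviations.
    Window length and threshold are illustrative choices, not the paper's."""
    log_s = np.log(sulfate)
    X_all = np.log(other_ions)              # shape (n_samples, n_ions)
    n = len(log_s)
    resid = np.full(n, np.nan)
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half)
        X = np.column_stack([np.ones(hi - lo), X_all[lo:hi]])
        beta, *_ = np.linalg.lstsq(X, log_s[lo:hi], rcond=None)
        resid[i] = log_s[i] - np.concatenate(([1.0], X_all[i])) @ beta
    flags = resid > thresh * np.nanstd(resid)   # candidate volcanic horizons
    return resid, flags
```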
Abstract:
This paper discusses the target localization problem in wireless visual sensor networks. Specifically, each node with a low-resolution camera extracts multiple feature points to represent the target at the sensor-node level. A statistical method is presented for merging the position information from different sensor nodes to select the most correlated feature-point pair at the base station. This method reduces the influence of the accuracy of target extraction on the accuracy of target localization in the universal coordinate system. Simulations show that, compared with related approaches, the proposed method achieves higher target localization accuracy and a better trade-off between camera-node usage and localization accuracy.
Abstract:
This paper proposes a method for the landscape characterisation and assessment of public works associated with fluvial landscapes, which is validated in the middle section of the Tajo River. In this method, a set of criteria is identified that unifies various characteristics of the landscape associated with the infrastructure. A specific weight is then assigned to each criterion so as to produce a semi-quantitative value ranging from a minimum of 0 to a maximum of 10. Taken together, these criteria enable us to describe and assess the value of the public works selected for study, in this case helping us to evaluate the sections of the River Tajo analysed in our study area. Accordingly, the value of all the infrastructure associated with a stretch of the river covering several hundred kilometres was determined; after dividing this stretch into sections, they were compared under equivalent conditions to provide a hierarchical ranking.
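A minimal sketch of the weighted, semi-quantitative scoring step described above, using purely hypothetical criteria and weights (the paper's actual criteria and weights are not reproduced here):

```python
# Hypothetical criteria scores (0-10) and weights; weights sum to 1 so the
# aggregate value also falls on the 0-10 scale described in the abstract.
criteria = {"heritage_value": 7.0, "visual_integration": 5.5,
            "functional_state": 8.0, "landscape_singularity": 6.0}
weights  = {"heritage_value": 0.30, "visual_integration": 0.25,
            "functional_state": 0.20, "landscape_singularity": 0.25}

value = sum(criteria[c] * weights[c] for c in criteria)
print(f"Semi-quantitative landscape value: {value:.2f} / 10")
```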
Abstract:
Dynamic soil-structure interaction has long been one of the most fascinating areas for the engineering profession. The building of large alternating machines and their effects on surrounding structures, as well as on their own functional behavior, provided the initial impetus; a large amount of experimental research was done, and the results of the Russian and German groups were especially worthwhile. Analytical results by Reissner and Shekhter were reexamined by Quinlan, Sung, et al., and finally Veletsos presented the first set of reliable results. Since then, the modeling of the homogeneous, elastic halfspace as an equivalent set of springs and dashpots has become an everyday tool in soil engineering practice, especially after the appearance of the fast Fourier transform algorithm, which makes it possible to treat the frequency-dependent characteristics of the equivalent elements in a unified fashion with the general method of analysis of the structure. Extensions to the viscoelastic case, as well as to embedded foundations and complicated geometries, have been presented by various authors. In general, they used the finite element method, with the well-known problems of geometric truncation and the subsequent use of absorbing boundaries. The properties of boundary integral equation methods are, in our opinion, especially well suited to this problem, and several previous results have confirmed our opinion. In what follows we present the general features related to steady-state elastodynamics and a series of results showing the excellent performance of the BIEM. Especially interesting are the outputs obtained through the use of the so-called singular elements, whose description is incorporated at the end of the paper. The reduction in computer time and the small number of elements needed to simulate realistically the global properties of the halfspace make this procedure one of the most interesting applications of the BIEM.
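For orientation, the spring-and-dashpot idealisation mentioned above is often written in the form of Lysmer's analogue for vertical vibration of a rigid circular footing on an elastic halfspace; the sketch below uses that textbook approximation with illustrative soil parameters, and is not the BIEM formulation developed in the paper.

```python
import math

def lysmer_vertical_analog(G, nu, rho, R):
    """Frequency-independent spring and dashpot for vertical vibration of a
    rigid circular footing on an elastic halfspace (Lysmer's analogue).
    G: shear modulus [Pa], nu: Poisson's ratio, rho: density [kg/m^3], R: radius [m]."""
    k = 4.0 * G * R / (1.0 - nu)                       # static vertical stiffness
    c = 3.4 * R**2 / (1.0 - nu) * math.sqrt(rho * G)   # radiation dashpot
    return k, c

# Illustrative medium-dense soil and a 2 m radius footing.
k, c = lysmer_vertical_analog(G=60e6, nu=0.3, rho=1800.0, R=2.0)
print(f"k = {k:.3e} N/m, c = {c:.3e} N*s/m")
```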
Abstract:
Paper presented at the XI Workshop of Physical Agents, Valencia, 9-10 September 2010.
Abstract:
This article considers the evaluation of the "viability" of innovation projects. Hidden Markov Models are used as the evaluation method. The problem of determining the model parameters that reproduce the test data with the highest accuracy is solved. The model is trained on statistical data on the implementation of innovative projects, with the Baum-Welch algorithm used as the training algorithm.
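A minimal sketch of the training step, assuming the hmmlearn package and Gaussian emissions (the abstract does not specify the emission model); hmmlearn's fit() runs Baum-Welch (EM) internally, and the project "indicator" data here are placeholders.

```python
import numpy as np
from hmmlearn import hmm  # pip install hmmlearn

# Hypothetical observation sequences: one indicator per project time step,
# concatenated, with `lengths` marking the individual project records.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 1))          # placeholder for real project indicators
lengths = [100, 120, 80]               # three projects of different durations

# Two hidden states, e.g. "viable" vs "non-viable"; fit() runs Baum-Welch (EM).
model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=200)
model.fit(X, lengths)

# Log-likelihood of one project's indicator sequence under the trained model.
score = model.score(X[:100])
print("log-likelihood:", score)
```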
Abstract:
Objective: Expectancies about the outcomes of alcohol consumption are widely accepted as important determinants of drinking. This construct is increasingly recognized as a significant element of psychological interventions for alcohol-related problems. Much effort has been invested in producing reliable and valid instruments to measure this construct for research and clinical purposes, but very few have had their factor structure subjected to adequate validation. Among them, the Drinking Expectancies Questionnaire (DEQ) was developed to address some theoretical and design issues with earlier expectancy scales. Exploratory factor analyses, in addition to validity and reliability analyses, were performed when the original questionnaire was developed. The object of this study was to undertake a confirmatory analysis of the factor structure of the DEQ. Method: Confirmatory factor analysis through LISREL 8 was performed using a randomly split sample of 679 drinkers. Results: Results suggested that a new 5-factor model, which differs slightly from the original 6-factor version, was a more robust measure of expectancies. A new method of scoring the DEQ consistent with this factor structure is presented. Conclusions: The present study shows more robust psychometric properties of the DEQ using the new factor structure.
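A sketch of what subscale scoring consistent with a 5-factor structure might look like; the item-to-factor key, factor labels and item count below are placeholders, not the published DEQ scoring key.

```python
import numpy as np

# Purely hypothetical item-to-factor key: the revised 5-factor allocation is
# not reproduced in the abstract, so item numbers and labels are placeholders.
factor_items = {f"factor_{k}": list(range(k, 38, 5)) for k in range(1, 6)}

def score_deq(responses, key=factor_items):
    """Subscale score = mean of the items loading on each factor.
    `responses` maps item number -> Likert rating."""
    return {f: float(np.mean([responses[i] for i in items]))
            for f, items in key.items()}

dummy = {i: (i % 5) + 1 for i in range(1, 38)}   # dummy ratings; item count assumed
print(score_deq(dummy))
```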
Abstract:
Stress relaxation is relevant to the design of both civil and mining excavations. While many authors refer to the adverse effect of stress relaxation on excavation stability, some present compelling empirical evidence indicating that stress relaxation does not have a significant effect. Establishing clear definitions was critical to understanding and quantifying the various types of stress relaxation referred to in the literature. This paper defines three types of stress relaxation: partial relaxation, full relaxation and tangential relaxation. Once clear definitions were established, it became apparent that the theoretical arguments and empirical evidence presented by various authors to support their respective cases are not contradictory; rather, the different conclusions can be attributed to different types of stress relaxation. In particular, when the minor principal stress is negative, the intermediate principal stress has been identified as significantly affecting jointed rock mass behaviour. The aim of the study was to review and evaluate existing methods of quantifying the effect of stress relaxation around underground excavations and, if necessary, propose a new set of recommendations. An empirical stope stability model, which has been termed the Extended Mathews stability chart, was considered to be the most appropriate method of quantifying the effects of stress relaxation. A new set of guidelines to account for the effect of stress relaxation on excavation stability in the Extended Mathews stability chart has been proposed from a back-analysis of 55 case histories of stress relaxation.
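For context, the Extended Mathews chart plots a stability number N = Q' x A x B x C against the shape factor (hydraulic radius) of the stope surface; the sketch below shows that calculation, with the stress factor A under relaxation left as an explicit placeholder rather than the guideline values proposed in the paper.

```python
def stability_number(q_prime, A, B, C):
    """Mathews stability number N = Q' * A * B * C.
    A: stress factor, B: joint orientation factor, C: gravity factor."""
    return q_prime * A * B * C

def hydraulic_radius(area, perimeter):
    """Shape factor S (hydraulic radius) of the stope surface."""
    return area / perimeter

# Hypothetical relaxed case: the stress factor A under relaxation is the
# quantity the proposed guidelines address; 0.7 is a placeholder, not the
# recommended value, and the geometry is illustrative.
N = stability_number(q_prime=12.0, A=0.7, B=0.3, C=6.0)
S = hydraulic_radius(area=30.0 * 20.0, perimeter=2 * (30.0 + 20.0))
print(f"N = {N:.1f}, S = {S:.1f} m")
```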
Abstract:
Pseudo-ternary phase diagrams of the polar lipids Quil A, cholesterol (Chol) and phosphatidylcholine (PC) in aqueous mixtures prepared by the lipid film hydration method (where a dried lipid film of phospholipid and cholesterol is hydrated by an aqueous solution of Quil A) were investigated in terms of the types of particulate structures formed therein. Negative staining transmission electron microscopy and polarized light microscopy were used to characterize the colloidal and coarse dispersed particles present in the systems. Pseudo-ternary phase diagrams were established for lipid mixtures hydrated in water and in Tris buffer (pH 7.4). The effect of equilibration time was also studied with respect to systems hydrated in water, where the samples were stored for 2 months at 4 °C. Depending on the mass ratio of Quil A, Chol and PC in the systems, various colloidal particles including ISCOM matrices, liposomes, ring-like micelles and worm-like micelles were observed. Other colloidal particles were also observed as minor structures in the presence of these predominant colloids, including helices, layered structures and lamellae (hexagonal pattern of ring-like micelles). In terms of the conditions which appeared to promote the formation of ISCOM matrices, the area of the phase diagrams associated with systems containing these structures increased in the order: hydrated in water/short equilibration period < hydrated in buffer/short equilibration period < hydrated in water/prolonged equilibration period. ISCOM matrices appeared to form over time from samples that initially contained a high concentration of ring-like micelles, suggesting that these colloidal structures may be precursors to ISCOM matrix formation. Helices were also frequently found as a minor colloidal structure in samples containing ISCOM matrices. Equilibration time and the presence of buffer salts also promoted the formation of liposomes in systems not containing Quil A. These parameters, however, did not appear to significantly affect the occurrence and predominance of other structures present in the pseudo-binary systems containing Quil A. Pseudo-ternary phase diagrams of PC, Chol and Quil A are important for identifying combinations which will produce different colloidal structures, particularly ISCOM matrices, by the method of lipid film hydration. Colloidal structures comprising these three components are readily prepared by hydration of dried lipid films and may have application in vaccine delivery, where the functionality of ISCOMs has clearly been demonstrated. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Mounting concerns regarding the environmental impact of herbicides have meant a growing requirement for accurate, timely information regarding herbicide residue contamination of, in particular, aquatic systems. Conventional methods of detection remain limited in terms of practicality due to the high costs of operation and the specialised information that analysis provides. A new phytotoxicity bioassay was trialled for the detection of herbicide residues in filter-purified (Milli-Q) as well as natural waters. The performance of the system, which combines solid-phase extraction (SPE) with the ToxY-PAM dual-channel yield analyser (Heinz Walz GmbH), was tested alongside the traditional method of liquid chromatography-mass spectrometry (LC-MS). The assay methodology was found to be highly sensitive (LOD 0.1 ng/L for diuron) with good reproducibility. The study showed that the assay protocol is time-effective and can be employed for the aquatic screening of herbicide residues in purified as well as natural waters.
Abstract:
We present a new method of modeling imaging of laser beams in the presence of diffraction. Our method is based on the concept of first orthogonally expanding the resultant diffraction field (that would have otherwise been obtained by the laborious application of the Huygens diffraction principle) and then representing it by an effective multimodal laser beam with different beam parameters. We show not only that the process of obtaining the new beam parameters is straightforward but also that it permits a different interpretation of the diffraction-caused focal shift in laser beams. All of the criteria that we have used to determine the minimum number of higher-order modes needed to accurately represent the diffraction field show that the mode-expansion method is numerically efficient. Finally, the characteristics of the mode-expansion method are such that it allows modeling of a vast array of diffraction problems, regardless of the characteristics of the incident laser beam, the diffracting element, or the observation plane. (C) 2005 Optical Society of America.
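The expansion step can be illustrated in one dimension: project the diffracted field onto normalised Hermite-Gaussian modes and truncate once a chosen fraction of the power is captured. The aperture, waist and tolerance below are illustrative; the paper's own truncation criteria are not reproduced.

```python
import numpy as np
from scipy.special import eval_hermite, factorial
from scipy.integrate import trapezoid

def hg_mode(n, x, w):
    """Normalised 1-D Hermite-Gaussian mode of order n with waist w."""
    norm = (2.0 / (np.pi * w**2))**0.25 / np.sqrt(2.0**n * factorial(n))
    return norm * eval_hermite(n, np.sqrt(2.0) * x / w) * np.exp(-x**2 / w**2)

def expand_field(field, x, w, power_tol=0.999, n_max=40):
    """Overlap coefficients c_n = <u_n | field>, truncated once the expansion
    carries `power_tol` of the field power."""
    total = trapezoid(np.abs(field)**2, x)
    coeffs, captured = [], 0.0
    for n in range(n_max + 1):
        c = trapezoid(np.conj(hg_mode(n, x, w)) * field, x)
        coeffs.append(c)
        captured += np.abs(c)**2
        if captured / total >= power_tol:
            break
    return np.array(coeffs)

# Example: a Gaussian beam clipped by a hard aperture of half-width a.
x = np.linspace(-5e-3, 5e-3, 4001)
w, a = 1e-3, 1.2e-3
field = np.exp(-x**2 / w**2) * (np.abs(x) <= a)
c = expand_field(field, x, w)
print(f"{len(c)} modes retained")
```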
Abstract:
Water-sampler equilibrium partitioning coefficients and aqueous boundary layer mass transfer coefficients for atrazine, diuron, hexazinone and fluometuron onto C18 and SDB-RPS Empore disk-based aquatic passive samplers have been determined experimentally under a laminar flow regime (Re = 5400). The method involved accelerating the time to equilibrium of the samplers by exposing them to three water concentrations, decreasing stepwise to 50% and then 25% of the original concentration. Assuming first-order Fickian kinetics across a rate-limiting aqueous boundary layer, both parameters are determined computationally by unconstrained nonlinear optimization. In addition, a method of estimating mass transfer coefficients (and therefore sampling rates) using the dimensionless Sherwood correlation developed for laminar flow over a flat plate is applied. For each of the herbicides, this correlation is validated to within 40% of the experimental data. The study demonstrates that for trace concentrations (sub-0.1 µg/L) and these flow conditions, a naked Empore disk performs well as an integrative sampler over short deployments (up to 7 days) for the range of polar herbicides investigated. The SDB-RPS disk allows a longer integrative period than the C18 disk due to its higher sorbent mass and/or its more polar sorbent chemistry. This work also suggests that for certain passive sampler designs, empirical estimation of sampling rates may be possible using correlations that have been available in the chemical engineering literature for some time.
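A sketch of the two calculations described above, with hypothetical data: a first-order approach-to-equilibrium fit for the partitioning and exchange coefficients, and the average Sherwood correlation for laminar flow over a flat plate, Sh = 0.664 Re^(1/2) Sc^(1/3), to estimate the boundary-layer mass transfer coefficient. The diffusivity, disk dimension and flow velocity below are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, K_sw, k_e, C_w=1.0):
    """First-order approach to equilibrium of the sampler phase:
    C_s(t) = K_sw * C_w * (1 - exp(-k_e * t))."""
    return K_sw * C_w * (1.0 - np.exp(-k_e * t))

# Hypothetical deployment times (days) and sampler concentrations.
t = np.array([0.5, 1, 2, 3, 5, 7, 10, 14])
C_s = np.array([12.0, 22.0, 38.0, 50.0, 64.0, 72.0, 78.0, 80.0])
(K_sw, k_e), _ = curve_fit(uptake, t, C_s, p0=[80.0, 0.3])
print(f"K_sw = {K_sw:.1f}, k_e = {k_e:.3f} 1/day")

# Sherwood correlation for laminar flow over a flat plate (average), giving
# the boundary-layer mass transfer coefficient k = Sh * D / L.
D, L, nu, U = 5e-10, 0.047, 1e-6, 0.115      # m^2/s, m, m^2/s, m/s (illustrative)
Re, Sc = U * L / nu, nu / D
Sh = 0.664 * Re**0.5 * Sc**(1.0 / 3.0)
k = Sh * D / L
print(f"Re = {Re:.0f}, Sh = {Sh:.0f}, k = {k:.2e} m/s")
```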
Abstract:
The country-product-dummy (CPD) method, originally proposed in Summers (1973), has recently been revisited in its weighted formulation to handle a variety of data related situations (Rao and Timmer, 2000, 2003; Heravi et al., 2001; Rao, 2001; Aten and Menezes, 2002; Heston and Aten, 2002; Deaton et al., 2004). The CPD method is also increasingly being used in the context of hedonic modelling instead of its original purpose of filling holes in Summers (1973). However, the CPD method is seen, among practitioners, as a black box due to its regression formulation. The main objective of the paper is to establish equivalence of purchasing power parities and international prices derived from the application of the weighted-CPD method with those arising out of the Rao-system for multilateral comparisons. A major implication of this result is that the weighted-CPD method would then be a natural method of aggregation at all levels of aggregation within the context of international comparisons.
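A minimal weighted-CPD sketch under the standard formulation ln p_cn = alpha_c + beta_n + e_cn, with one country as numeraire so that exp(alpha_c) plays the role of the PPP; the price tableau and expenditure-share weights below are hypothetical.

```python
import numpy as np

# Hypothetical price tableau: rows = countries, columns = products (NaN = missing).
countries = ["A", "B", "C"]
products  = ["rice", "bread", "fuel"]
prices = np.array([[1.0, 2.0, 0.8],
                   [110.0, 230.0, 95.0],      # e.g. a different currency
                   [0.9, np.nan, 0.7]])
weights = np.array([[0.5, 0.3, 0.2],
                    [0.4, 0.4, 0.2],
                    [0.6, 0.1, 0.3]])         # hypothetical expenditure shares

# Build the CPD design: ln p_cn = alpha_c + beta_n + e_cn, country A as base.
rows, cols = np.nonzero(~np.isnan(prices))
y = np.log(prices[rows, cols])
w = weights[rows, cols]
X = np.zeros((len(y), len(countries) - 1 + len(products)))
for k, (c, n) in enumerate(zip(rows, cols)):
    if c > 0:
        X[k, c - 1] = 1.0                     # country dummies (base omitted)
    X[k, len(countries) - 1 + n] = 1.0        # product dummies

# Weighted least squares via rescaling by sqrt(weights).
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
ppp = np.exp(np.concatenate(([0.0], coef[:len(countries) - 1])))
print(dict(zip(countries, np.round(ppp, 3))))  # PPPs relative to country A
```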
Abstract:
High-performance liquid chromatography coupled by an electrospray ion source to a tandem mass spectrometer (HPLC-ESI-MS/MS) is the current analytical method of choice for quantitation of analytes in biological matrices. With HPLC-ESI-MS/MS having the characteristics of high selectivity, sensitivity, and throughput, this technology is being increasingly used in the clinical laboratory. An important issue to be addressed in method development, validation, and routine use of HPLC-ESI-MS/MS is matrix effects. Matrix effects are the alteration of ionization efficiency by the presence of coeluting substances. These effects are unseen in the chromatogram but have a deleterious impact on method accuracy and sensitivity. The two common ways to assess matrix effects are the post-extraction addition method and the post-column infusion method. To remove or minimize matrix effects, modification of the sample extraction methodology and improved chromatographic separation must be performed. These two parameters are linked and form the basis of developing a successful and robust quantitative HPLC-ESI-MS/MS method. Due to the heterogeneous nature of the population being studied, the variability of a method must be assessed in samples taken from a variety of subjects. In this paper, the major aspects of matrix effects are discussed, and an approach for addressing matrix effects during method validation is proposed. (c) 2004 The Canadian Society of Clinical Chemists. All rights reserved.
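The post-extraction addition assessment reduces to simple peak-area ratios; a sketch with hypothetical areas follows (the percentage definitions are the commonly used Matuszewski-style matrix effect, recovery and process efficiency).

```python
def matrix_effect(area_neat, area_post_spiked, area_pre_spiked):
    """Peak-area ratios from the post-extraction addition experiment:
    ME = post-extraction spiked matrix / neat standard           (ionisation change)
    RE = pre-extraction spiked matrix / post-extraction spiked   (extraction recovery)
    PE = pre-extraction spiked matrix / neat standard            (process efficiency)
    All returned as percentages."""
    me = 100.0 * area_post_spiked / area_neat
    re = 100.0 * area_pre_spiked / area_post_spiked
    pe = 100.0 * area_pre_spiked / area_neat
    return me, re, pe

# Hypothetical peak areas for one analyte in one subject's matrix.
me, re, pe = matrix_effect(area_neat=1.00e6, area_post_spiked=7.2e5, area_pre_spiked=6.5e5)
print(f"ME = {me:.0f}% (ion suppression if < 100%), RE = {re:.0f}%, PE = {pe:.0f}%")
```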
Abstract:
Background/Aims: Positron emission tomography has been applied to study cortical activation during human swallowing, but it employs radio-isotopes, precluding repeated experiments, and has to be performed supine, making the task of swallowing difficult. Here we describe Synthetic Aperture Magnetometry (SAM) as a novel method of localising and imaging the brain's neuronal activity from magnetoencephalographic (MEG) signals, used to study the cortical processing of human volitional swallowing in a more physiological, seated position. Methods: In 3 healthy male volunteers (age 28–36), 151-channel whole-cortex MEG (Omega-151, CTF Systems Inc.) was recorded while subjects were seated, during the conditions of repeated volitional wet swallowing (5 ml boluses at 0.2 Hz) or rest. SAM analysis was then performed using varying spatial filters (5–60 Hz) before co-registration with individual MRI brain images. Activation areas were then identified using standard stereotactic-space neuro-anatomical maps. In one subject, repeat studies were performed to confirm the initial findings. Results: In all subjects, cortical activation maps for swallowing could be generated using SAM, the strongest activations being seen with 10–20 Hz filter settings. The main cortical activations associated with swallowing were in: sensorimotor cortex (BA 3,4), insular cortex and lateral premotor cortex (BA 6,8). Of relevance, each cortical region displayed consistent inter-hemispheric asymmetry, to one or other hemisphere, this being different for each region and each subject. Intra-subject comparisons of activation localisation and asymmetry showed impressive reproducibility. Conclusion: SAM analysis using MEG is an accurate, repeatable, and reproducible method for studying the brain processing of human swallowing in a more physiological manner and provides novel opportunities for future studies of the brain-gut axis in health and disease.
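SAM is a nonlinear, optimum-orientation beamformer; the linearly constrained minimum-variance weight calculation that underlies it can be sketched as follows, with simulated data standing in for the 151-channel MEG recordings and a hypothetical lead field.

```python
import numpy as np

def lcmv_weights(cov, leadfield, reg=0.05):
    """Minimum-variance beamformer weights for one source orientation:
    w = C^-1 L / (L^T C^-1 L). `cov` is the (regularised) sensor covariance,
    `leadfield` the forward field of the target location/orientation."""
    c = cov + reg * np.trace(cov) / cov.shape[0] * np.eye(cov.shape[0])
    ci_l = np.linalg.solve(c, leadfield)
    return ci_l / (leadfield @ ci_l)

# Simulated 151-channel data (standing in for band-passed MEG, e.g. 10-20 Hz).
rng = np.random.default_rng(1)
n_chan, n_samp = 151, 2000
data = rng.normal(size=(n_chan, n_samp))
cov = data @ data.T / n_samp

leadfield = rng.normal(size=n_chan)           # hypothetical forward solution
w = lcmv_weights(cov, leadfield)
source_ts = w @ data                          # virtual-electrode time course
print("source power:", float(source_ts.var()))
```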