992 results for instrumental methods


Relevance: 30.00%

Abstract:

Combining data from multiple analytical platforms is essential for comprehensive study of the molecular phenotype (metabotype) of a given biological sample. The metabolite profiles generated are intrinsically dependent on the analytical platforms, each requiring optimization of instrumental parameters, separation conditions, and sample extraction to deliver maximal biological information. An in-depth evaluation of extraction protocols for characterizing the metabolome of the hepatobiliary fluke Fasciola hepatica, using ultra-performance liquid chromatography (UPLC) and capillary electrophoresis (CE) coupled with mass spectrometry (MS), is presented. The spectrometric methods were characterized by performance, and figures of merit were established, including precision, mass accuracy, selectivity, sensitivity, and platform stability. Although a core group of molecules was common to all methods, each platform contributed a unique set, whereby 142 metabolites out of 14,724 features were identified. A mixture design revealed that a chloroform:methanol:water proportion of 15:59:26 was globally the best composition for metabolite extraction across UPLC-MS and CE-MS platforms accommodating different columns and ionization modes. Despite the general assumption that platform-adapted protocols are necessary for effective metabotype characterization, we show that an appropriately designed single extraction procedure is able to fit the requirements of all technologies. This may constitute a paradigm shift in developing efficient protocols for high-throughput metabolite profiling with more general analytical applicability.
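To illustrate the mixture-design step described above, the following Python sketch fits a Scheffé quadratic mixture model to hypothetical metabolite-coverage responses at simplex-lattice compositions of chloroform, methanol, and water, then searches the simplex for the predicted optimum. All response values are invented for illustration; this is not the authors' actual design or data.

```python
import numpy as np

# Hypothetical simplex-lattice design: fractions of
# (chloroform, methanol, water); each row sums to 1.
X = np.array([
    [1.00, 0.00, 0.00],
    [0.00, 1.00, 0.00],
    [0.00, 0.00, 1.00],
    [0.50, 0.50, 0.00],
    [0.50, 0.00, 0.50],
    [0.00, 0.50, 0.50],
    [0.33, 0.34, 0.33],
    [0.15, 0.59, 0.26],  # composition reported in the abstract
])
# Invented response: number of metabolite features recovered.
y = np.array([310., 420., 250., 400., 330., 390., 460., 480.])

def scheffe_quadratic(X):
    """Design matrix for a Scheffe quadratic mixture model: three
    linear blending terms plus three pairwise interaction terms
    (no intercept, because the fractions sum to one)."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

beta, *_ = np.linalg.lstsq(scheffe_quadratic(X), y, rcond=None)

# Grid search over the simplex for the predicted optimum.
best, best_pred = None, -np.inf
for a in np.arange(0.0, 1.001, 0.01):
    for b in np.arange(0.0, 1.001 - a, 0.01):
        c = max(0.0, 1.0 - a - b)
        pred = float(scheffe_quadratic(np.array([[a, b, c]])) @ beta)
        if pred > best_pred:
            best, best_pred = (a, b, c), pred

print("predicted best chloroform:methanol:water =",
      tuple(round(v, 2) for v in best), f"-> {best_pred:.0f} features")
```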

Relevance: 30.00%

Abstract:

This thesis is based on five papers addressing variance reduction in different ways. The papers have in common that they all present new numerical methods. Paper I investigates quantitative structure-retention relationships from an image-processing perspective, using an artificial neural network to preprocess three-dimensional structural descriptions of the studied steroid molecules. Paper II presents a new method for computing free energies. Free energy is the quantity that determines chemical equilibria and partition coefficients. The proposed method may be used for estimating, e.g., chromatographic retention without performing experiments. Two papers (III and IV) deal with correcting deviations from bilinearity by so-called peak alignment. Bilinearity is a theoretical assumption about the distribution of instrumental data that is often violated by measured data. Deviations from bilinearity lead to increased variance, both in the data and in inferences from the data, unless invariance to the deviations is built into the model, e.g., by the use of the method proposed in paper III and extended in paper IV. Paper V addresses a generic problem in classification, namely how to measure the goodness of different data representations so that the best classifier may be constructed. Variance reduction is one of the pillars on which analytical chemistry rests. This thesis considers two aspects of variance reduction: before and after experiments are performed. Before experimenting, theoretical predictions of experimental outcomes may be used to decide which experiments to perform and how to perform them (papers I and II). After experiments are performed, the variance of inferences from the measured data is affected by the method of data analysis (papers III-V).
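Papers III and IV propose a specific peak-alignment method that is not reproduced here; the following minimal Python sketch only conveys the underlying idea, aligning a drifted chromatographic peak to a reference by maximizing cross-correlation over integer shifts. The signals are synthetic.

```python
import numpy as np

def align_to_reference(signal, reference, max_shift=50):
    """Shift `signal` by an integer number of points to maximize its
    correlation with `reference`; a minimal stand-in for the kind of
    peak alignment that restores bilinearity across runs."""
    best_shift, best_corr = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        corr = np.dot(np.roll(signal, s), reference)
        if corr > best_corr:
            best_shift, best_corr = s, corr
    return np.roll(signal, best_shift), best_shift

# Synthetic example: a Gaussian peak drifting by 12 points between runs.
t = np.arange(500)
peak = lambda mu: np.exp(-0.5 * ((t - mu) / 8.0) ** 2)
reference = peak(250)
drifted = peak(262)

aligned, shift = align_to_reference(drifted, reference)
print("estimated shift:", shift)                       # -> -12
print("residual error :", np.abs(aligned - reference).max())
```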

Relevance: 30.00%

Abstract:

Microarrays have become established as instrumental tools for bacterial detection, identification, and genotyping, as well as for transcriptomic studies. For gene expression analyses using limited numbers of bacteria (derived from in vivo or ex vivo origin, for example), RNA amplification is often required prior to labeling and hybridization onto microarrays. Evaluation of the fidelity of the amplification methods is crucial for the robustness and reproducibility of microarray results. We report here the first use of random primers and the highly processive Phi29 phage polymerase to amplify material for transcription profiling analyses. We compared two commercial amplification methods (GenomiPhi and MessageAmp kits) with direct reverse transcription as the reference method, focusing on the robustness of mRNA quantification using either microarrays or quantitative RT-PCR. Both amplification methods, using either poly-A tailing followed by in vitro transcription or direct strand-displacement polymerase, showed appreciable linearity. The strand-displacement technique was particularly affordable compared with in vitro transcription-based (IVT) amplification methods and consisted of a single-tube reaction giving high amplification yields. Real-time measurements using low-, medium-, and highly expressed genes revealed that this simple method provided linear amplification, with results for relative messenger abundance equivalent to those obtained by conventional direct reverse transcription.
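As a hedged illustration of how amplification linearity can be checked, the sketch below regresses log-transformed abundances after amplification against those from direct reverse transcription; a slope and correlation near one indicate preserved relative abundances. All values are invented, and the analysis is simpler than the study's actual qRT-PCR and microarray comparisons.

```python
import numpy as np

# Invented qRT-PCR-style abundances (arbitrary units) for ten genes
# spanning low to high expression, measured after direct reverse
# transcription and after a hypothetical strand-displacement amplification.
direct_rt = np.array([1.2, 3.5, 8.0, 15., 40., 90., 210., 500., 1200., 3000.])
amplified = np.array([1.0, 3.1, 7.4, 14., 38., 85., 200., 470., 1100., 2800.])

# Linearity is assessed on log-transformed values: a slope near 1 and
# a correlation near 1 mean relative abundances are preserved.
x, y = np.log10(direct_rt), np.log10(amplified)
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]

print(f"slope={slope:.3f} intercept={intercept:.3f} r^2={r**2:.4f}")
```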

Relevance: 30.00%

Abstract:

High-resolution and highly precise age models for recent lake sediments (last 100–150 years) are essential for quantitative paleoclimate research. They are particularly important for sedimentological and geochemical proxies, where transfer functions cannot be established and calibration must be based upon the relation of sedimentary records to instrumental data. High-precision dating of the calibration period is most critical, as it directly determines the quality of the calibration statistics. Here, as an example, we compare radionuclide age models obtained for two high-elevation glacial lakes in the Central Chilean Andes (Laguna Negra: 33°38′S/70°08′W, 2,680 m a.s.l. and Laguna El Ocho: 34°02′S/70°19′W, 3,250 m a.s.l.). We show the different numerical models that produce accurate age-depth chronologies based on 210Pb profiles, and we explain how to obtain reduced age-error bars at the bottom part of the profiles, i.e., typically around the end of the 19th century. In order to constrain the age models, we propose a method with four steps: (i) sampling at irregularly spaced intervals for 226Ra, 210Pb and 137Cs, depending on the stratigraphy and microfacies; (ii) systematic comparison of numerical models for the calculation of 210Pb-based age models: constant flux constant sedimentation (CFCS), constant initial concentration (CIC), constant rate of supply (CRS) and sediment isotope tomography (SIT); (iii) numerical constraining of the CRS and SIT models with the 137Cs chronomarker of AD 1964; and (iv) step-wise cross-validation with independent diagnostic environmental stratigraphic markers of known age (e.g., volcanic ash layers, historical floods and earthquakes). In both examples, we also use airborne pollutants such as spheroidal carbonaceous particles (reflecting the history of fossil fuel emissions), excess atmospheric Cu deposition (reflecting the production history of a large local Cu mine), and turbidites related to historical earthquakes. Our results show that the SIT model constrained with the 137Cs AD 1964 peak performs best over the entire chronological profile (last 100–150 years) and yields the smallest standard deviations for the sediment ages. Such precision is critical for the calibration statistics and, ultimately, for the quality of the quantitative paleoclimate reconstruction. The systematic comparison of CRS and SIT models also helps to validate the robustness of the chronologies in different sections of the profile. Although surprisingly poorly known and under-explored in paleolimnological research, the SIT model has great potential for paleoclimatological reconstructions based on lake sediments.
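For orientation, the following Python sketch implements the standard constant rate of supply (CRS) age calculation, t(z) = (1/lambda) ln(A0/A(z)), from a profile of unsupported 210Pb activities. The profile values are invented, and a real analysis would also propagate counting uncertainties and apply the 137Cs and stratigraphic constraints discussed above.

```python
import numpy as np

LAMBDA_PB210 = np.log(2) / 22.3  # 210Pb decay constant (1/yr)

def crs_ages(unsupported_activity, dry_mass_per_area):
    """Constant rate of supply (CRS) ages for a 210Pb profile:
    t(z) = (1/lambda) * ln(A0 / A(z)), where A(z) is the cumulative
    unsupported-210Pb inventory below depth z and A0 the total.
    Inputs: per-slice activities (Bq/kg) and dry mass per unit area
    per slice (kg/m^2)."""
    inventory = unsupported_activity * dry_mass_per_area        # Bq/m^2 per slice
    total = inventory.sum()                                     # A0
    below = np.clip(total - np.cumsum(inventory), 1e-12, None)  # A(z)
    return np.log(total / below) / LAMBDA_PB210                 # age in years

# Invented profile: ten 1-cm slices with decaying unsupported 210Pb.
activity = np.array([180., 150., 120., 95., 70., 50., 35., 22., 12., 5.])
mass = np.full_like(activity, 2.0)  # kg/m^2 per slice

# Ages blow up near the base, where A(z) -> 0: exactly the part of the
# profile where the abstract's 137Cs and stratigraphic constraints help.
for depth_cm, age in enumerate(crs_ages(activity, mass), start=1):
    print(f"{depth_cm:2d} cm : {age:7.1f} yr")
```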

Relevance: 30.00%

Abstract:

Relatively little is known about past cold-season temperature variability in high-Alpine regions because of a lack of natural cold-season temperature proxies as well as under-representation of high-altitude sites in meteorological, early-instrumental and documentary data sources. Recent studies have shown that chrysophyte stomatocysts, or simply cysts (sub-fossil algal remains of Chrysophyceae and Synurophyceae), are among the very few natural proxies that can be used to reconstruct cold-season temperatures. This study presents a quantitative, high-resolution (5-year), cold-season (Oct–May) temperature reconstruction based on sub-fossil chrysophyte stomatocysts in the annually laminated (varved) sediments of high-Alpine Lake Silvaplana, SE Switzerland (1,789 m a.s.l.), since AD 1500. We first explore the method used to translate an ecologically meaningful variable based on a biological proxy into a simple climate variable. A transfer function was applied to reconstruct the ‘date of spring mixing’ from cyst assemblages. Next, statistical regression models were tested to convert the reconstructed ‘dates of spring mixing’ into cold-season surface air temperatures with associated errors. The strengths and weaknesses of this approach are thoroughly tested. One much-debated, basic assumption for reconstructions (‘stationarity’), which states that only the environmental variable of interest has influenced cyst assemblages and the influence of confounding variables is negligible over time, is addressed in detail. Our inferences show that past cold-season air-temperature fluctuations were substantial and larger than those of other temperature reconstructions for Europe and the Alpine region. Interestingly, in this study, recent cold-season temperatures only just exceed those of previous, multi-decadal warm phases since AD 1500. These findings highlight the importance of local studies to assess natural climate variability at high altitudes.
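The regression stage of this two-step approach can be sketched as follows: a linear model calibrated on paired 'date of spring mixing' and observed Oct–May temperatures is applied to down-core reconstructed mixing dates, with the calibration RMSE as a simple error estimate. All numbers are invented; the study's actual transfer function and error modelling are more elaborate.

```python
import numpy as np

# Calibration period: invented 'date of spring mixing' (day of year,
# inferred from cyst assemblages) paired with measured Oct-May air
# temperatures for the same years.
mixing_doy = np.array([115., 120., 108., 130., 125., 112., 118., 135., 105., 122.])
temp_obs   = np.array([-1.2, -1.8, -0.6, -2.9, -2.3, -0.9, -1.5, -3.4, -0.2, -2.0])

# Stage 2 of the approach described above: a regression converting
# reconstructed mixing dates into cold-season temperatures.
slope, intercept = np.polyfit(mixing_doy, temp_obs, 1)
pred = slope * mixing_doy + intercept
rmse = np.sqrt(np.mean((temp_obs - pred) ** 2))  # calibration error

# Apply to mixing dates reconstructed from down-core cyst samples.
downcore_doy = np.array([128., 119., 110.])
for doy, t in zip(downcore_doy, slope * downcore_doy + intercept):
    print(f"mixing day {doy:.0f} -> {t:+.1f} degC (+/- {rmse:.1f} RMSE)")
```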

Relevance: 30.00%

Abstract:

In recent years, declines of honey bee populations have received massive media attention worldwide, yet attempts to understand the causes have been hampered by a lack of standardisation of laboratory techniques. Published in response to this, the COLOSS BEEBOOK is a unique collaborative venture involving 234 bee scientists from 34 countries, who have produced the definitive guide to how to carry out research on honey bees. It is hoped that these volumes will become the standards to be adopted by bee scientists worldwide. Volume I includes approximately 1,100 separate protocols dealing with the study of the honey bee, Apis mellifera. These cover anatomy, behavioural studies, chemical ecology, breeding, genetics, instrumental insemination and queen rearing, pollination, molecular studies, statistics, toxicology and numerous other techniques.

Relevance: 30.00%

Abstract:

The studies carried out so far to determine the measurement quality of geodetic instruments have been aimed primarily at angle and distance measurements. In recent years, however, the use of GNSS (Global Navigation Satellite System) equipment has become widespread in geomatic applications without a methodology having been established for obtaining the calibration correction and its uncertainty for this equipment. The purpose of this thesis is to establish the requirements that a network must meet to be considered a standard network with metrological traceability, as well as the methodology for the verification and calibration of GNSS instruments on such standard networks. To this end, a technical calibration procedure for GNSS equipment has been designed and developed in which the contributions to the measurement uncertainty are defined. The procedure, which has been applied to different equipment on different networks, has allowed the expanded uncertainty of that equipment to be determined following the recommendations of the Guide to the Expression of Uncertainty in Measurement of the Joint Committee for Guides in Metrology. In addition, the three-dimensional coordinates of the stations that make up the networks considered in this research have been determined by satellite observation techniques, and simulations have been developed for different values of the experimental standard deviations of the fixed points used in the least-squares adjustment of the baseline vectors. The results show the importance of knowing the experimental standard deviations when calculating the uncertainties of the three-dimensional coordinates of the stations. Based on earlier studies and observations of high technical quality carried out on these networks, an exhaustive analysis has been performed to determine the conditions that a standard network must satisfy. In addition, technical calibration procedures have been designed for calculating the expanded measurement uncertainty of geodetic instruments that provide angles and distances obtained by electromagnetic methods, since these are the instruments that disseminate metrological traceability to the standard networks used for the verification and calibration of GNSS equipment. In this way, it has been possible to determine local calibration corrections for high-accuracy GNSS equipment on the standard networks. In this thesis, the uncertainty of the calibration correction has been obtained by two different methodologies: the first applies the law of propagation of uncertainty, while the second applies the Monte Carlo method for the propagation of distributions. The analysis of the results confirms the validity of both methodologies for determining the calibration uncertainty of GNSS equipment.
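The two uncertainty methodologies named in the abstract can be sketched for a toy calibration correction, taken here as the difference between a reference baseline length and a GNSS-measured one. The input values and uncertainties are invented; the point is that the GUM law of propagation and Monte Carlo propagation of distributions agree for this linear model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000  # Monte Carlo trials

# Hypothetical inputs for a baseline-length calibration correction
# c = L_ref - L_gnss (values and standard uncertainties invented):
L_ref_mean, u_ref = 1000.00000, 0.00040    # reference length, m
L_gnss_mean, u_gnss = 1000.00210, 0.00120  # GNSS-measured length, m

# (1) Law of propagation of uncertainty (GUM): for c = L_ref - L_gnss
# the sensitivity coefficients are +1 and -1, so u_c^2 = u_ref^2 + u_gnss^2.
u_gum = np.hypot(u_ref, u_gnss)

# (2) Monte Carlo propagation of distributions (GUM Supplement 1):
L_ref = rng.normal(L_ref_mean, u_ref, N)
L_gnss = rng.normal(L_gnss_mean, u_gnss, N)
c = L_ref - L_gnss
lo, hi = np.percentile(c, [2.5, 97.5])  # 95 % coverage interval

print(f"correction       : {c.mean() * 1000:+.2f} mm")
print(f"u (GUM)          : {u_gum * 1000:.2f} mm")
print(f"u (Monte Carlo)  : {c.std(ddof=1) * 1000:.2f} mm")
print(f"95% interval     : [{lo * 1000:+.2f}, {hi * 1000:+.2f}] mm")
```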

Relevance: 30.00%

Abstract:

Objective: In this study, the authors assessed the effects of a structured, moderate-intensity exercise program during the entire length of pregnancy on a woman's method of delivery. Methods: A randomized controlled trial was conducted with 290 healthy pregnant Caucasian (Spanish) women with a singleton gestation, who were randomly assigned to either an exercise (n=138) or a control (n=152) group. Pregnancy outcomes, including the type of delivery, were measured at the end of the pregnancy. Results: The percentages of cesarean and instrumental deliveries in the exercise group were lower than in the control group (15.9%, n=22, and 11.6%, n=16, vs. 23.0%, n=35, and 19.1%, n=29, respectively; p=0.03). The overall health status of the newborns as well as other pregnancy outcomes were unaffected. Conclusions: Based on these results, a supervised program of moderate-intensity exercise performed throughout pregnancy was associated with a reduction in the rate of cesarean sections and can be recommended for healthy pregnant women.
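For readers who want to check the reported comparison, the sketch below runs a chi-squared test on a 2x3 contingency table built from the abstract's counts, with the "other deliveries" column obtained by subtraction from the group sizes; this reproduces a p-value close to the reported 0.03, though the paper's exact test may have differed.

```python
from scipy.stats import chi2_contingency

# Delivery counts from the abstract; "other" obtained by subtraction
# from the group sizes (n=138 exercise, n=152 control).
#            cesarean  instrumental  other
exercise = [      22,           16,   100]
control  = [      35,           29,    88]

chi2, p, dof, expected = chi2_contingency([exercise, control])
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")  # p close to 0.03
```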

Relevance: 30.00%

Abstract:

The aim of this article is to compare the Suzuki and BAPNE methods on the basis of the bibliography published for both approaches. In the field of musical and instrumental education, and especially at the childhood stage, the correct use of the body and voice is of fundamental importance. The two methods differ from one another: the Suzuki method is principally musical and instrumental, whereas the BAPNE method is non-musical and aims to stimulate the pupil's attention, concentration, memory, and executive functions through music and body percussion. Comparing different approaches may provide teachers with useful insight for facing different issues related to their discipline.

Relevance: 30.00%

Abstract:

Objective: Laryngeal and tongue function was assessed in 28 patients to evaluate the presence, nature, and resolution of superior laryngeal, recurrent laryngeal, and hypoglossal nerve damage resulting from standard open primary carotid endarterectomy (CEA). Methods: The laryngeal and tongue function of 28 patients who underwent CEA was examined prospectively with various physiologic (Aerophone II, laryngograph, tongue transducer), acoustic (Multi-Dimensional Voice Program), and perceptual speech assessments. Measures were obtained from all participants preoperatively, and at 2 weeks and 3 months postoperatively. Results: The perceptual speech assessment indicated that the vocal quality of roughness was significantly more apparent at the 2-week postoperative assessment than preoperatively. However, by the 3-month postoperative assessment these values had returned to near preoperative levels, with no significant difference detected between preoperative and 3-month postoperative levels or between 2-week and 3-month postoperative levels. Both the instrumental assessments of laryngeal function and the acoustic assessment of vocal quality failed to identify any significant difference on any measure across the three assessment periods. Similarly, no significant impairment in tongue strength, endurance, or rate of repetitive tongue movements was detected on instrumental assessment of tongue function. Conclusions: No permanent changes to vocal or tongue function occurred in this group of participants after primary CEA. The lack of any significant long-term laryngeal or tongue dysfunction in this group suggests that the standard open CEA procedure is not associated with high rates of superior laryngeal, recurrent laryngeal, or hypoglossal nerve dysfunction, as previously believed.

Relevance: 30.00%

Abstract:

This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR justified in terms of the business case, stakeholders are of great importance to corporations. In the UK, the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed on the basis of an enlightened shareholder approach. Within this approach the importance of stakeholders other than shareholders is recognised as instrumental in delivering shareholder value. Given the importance of these other stakeholders, it is then important that corporate management measure and manage stakeholder performance. Two general approaches could be adopted to do this: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary measures. To consider these approaches further, the paper examines their possible use for two stakeholder groups, namely employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources, and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.

Relevance: 30.00%

Abstract:

The literature discusses several methods to control for self-selection effects but provides little guidance on which method to use in a setting with a limited number of variables. The authors theoretically compare and empirically assess the performance of different matching methods, instrumental-variable methods, and control-function methods in this type of setting by investigating the effect of online banking on product usage. Hybrid matching in combination with the Gaussian kernel algorithm outperforms the other methods with respect to predictive validity. The empirical finding of large self-selection effects indicates the importance of controlling for these effects when assessing the effectiveness of marketing activities.
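A minimal sketch of propensity-score matching with a Gaussian kernel, on synthetic data with built-in self-selection, is given below; it is illustrative only and not the authors' hybrid matching procedure. The bandwidth h and all data-generating values are arbitrary choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic customers: one covariate drives both online-banking
# adoption (treatment) and product usage (outcome) -> self-selection.
x = rng.normal(size=(n, 1))
treated = (rng.random(n) < 1 / (1 + np.exp(-1.5 * x[:, 0]))).astype(int)
usage = 2.0 + 0.8 * x[:, 0] + 0.5 * treated + rng.normal(0, 1, n)  # true effect 0.5

# Propensity scores from a logistic model of treatment on covariates.
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# Gaussian-kernel matching: each treated unit is compared with a
# kernel-weighted average of control outcomes, with weights decaying
# with propensity-score distance (bandwidth h is a tuning choice).
h = 0.05
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
effects = []
for i in t_idx:
    w = np.exp(-0.5 * ((ps[c_idx] - ps[i]) / h) ** 2)
    effects.append(usage[i] - np.average(usage[c_idx], weights=w))

print(f"naive difference  : {usage[t_idx].mean() - usage[c_idx].mean():.3f}")
print(f"kernel-matched ATT: {np.mean(effects):.3f}  (true effect 0.5)")
```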

Relevance: 30.00%

Abstract:

The ontological approach to structuring knowledge and to describing the data domain of knowledge is considered. An ontology-controlled tool complex for the research and development of sensor systems is described. Approaches to solving the most frequently encountered tasks in creating recognition procedures are also considered.

Relevance: 30.00%

Abstract:

The Locard exchange principle proposes that a person cannot enter or leave an area, or come into contact with an object, without an exchange of materials. In the case of scent evidence, the suspect leaves his scent at the crime scene itself or on objects found therein. Human scent evidence collected from a crime scene can be evaluated through the use of specially trained canines to determine an association between the evidence and a suspect. To date, there has been limited research into the volatile organic compounds (VOCs) that comprise human odor and their usefulness in distinguishing among individuals. For the purposes of this research, human scent is defined as the most abundant volatile organic compounds present in the headspace above collected odor samples. An instrumental method has been created for the analysis of the VOCs present in human scent and has been utilized to optimize the materials used for the collection and storage of human scent evidence. This research project identified the volatile organic compounds present in the headspace above scent samples collected from different individuals and from various regions of the body, with the primary focus on the armpit area and the palms of the hands. Human scent from the armpit area and palms of an individual sampled over time shows lower variation in the relative peak area ratios of the common compounds present than is seen across a population. A comparison of the compounds present in human odor for an individual over time and across a population demonstrates that it is possible to differentiate individuals instrumentally based on the volatile organic compounds in the headspace above collected odor samples.
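As a simple illustration of the relative peak-area-ratio comparison described above, the sketch below normalizes invented GC/MS peak areas for a few common compounds and shows that within-person distances are smaller than between-person distances. Compound names and values are hypothetical.

```python
import numpy as np

# Invented GC/MS peak areas for five VOCs common to all samples
# (columns), from duplicate samples of two individuals (rows).
compounds = ["nonanal", "decanal", "hexanoic acid", "undecane", "geranylacetone"]
person_a = np.array([[120., 80., 40., 25., 15.],
                     [110., 85., 38., 27., 14.]])
person_b = np.array([[ 60., 90., 70., 10., 30.],
                     [ 65., 95., 66., 12., 28.]])

def relative_ratios(samples):
    """Normalize each sample so its peak areas sum to 1, giving the
    relative peak-area pattern compared within and between people."""
    return samples / samples.sum(axis=1, keepdims=True)

a, b = relative_ratios(person_a), relative_ratios(person_b)

within_a = np.linalg.norm(a[0] - a[1])  # replicate-to-replicate distance
between = np.mean([np.linalg.norm(ai - bi) for ai in a for bi in b])

print(f"compounds       : {compounds}")
print(f"within person A : {within_a:.4f}")
print(f"between A and B : {between:.4f}")  # larger -> distinguishable
```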