8 results for Ultra-trace analysis

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

90.00%

Publisher:

Abstract:

Earth System Models (ESMs) have been developed successfully over the past few years and are currently being used for simulating present-day climate and for seasonal-to-interannual predictions of climate change. Supercomputer performance plays an important role in climate modeling, since one of the challenging issues for climate modelers is to couple Earth System components efficiently and accurately on present-day computer architectures. At the Barcelona Supercomputing Center (BSC), we work with the EC-Earth System Model. EC-Earth is an ESM that currently consists of an atmosphere model (IFS) and an ocean model (NEMO), which communicate with each other through the OASIS coupler. Additional modules (e.g., for chemistry and vegetation) are under development. The EC-Earth ESM has been ported successfully to different high-performance computing platforms (e.g., IBM P6 AIX, Cray XT-5, Intel-based Linux clusters, SGI Altix) at different sites in Europe (e.g., KNMI, ICHEC, ECMWF). The objective of the first phase of the project was to identify and document the issues related to the portability and performance of EC-Earth on the MareNostrum supercomputer, a system based on IBM PowerPC 970MP processors running a SUSE Linux distribution. EC-Earth was ported successfully to MareNostrum, and a compilation incompatibility was solved by a two-step compilation approach using the XLF version 10.1 and 12.1 compilers. In addition, the performance of EC-Earth was analyzed with respect to scalability, and trace analysis was carried out with the Paraver software. This analysis showed that running EC-Earth with a larger number of IFS CPUs (more than 128) is not feasible at the moment, since issues remain with the IFS-NEMO balance and MPI communications.
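The scalability assessment mentioned above boils down to comparing wall-clock times across CPU counts. The following is a minimal sketch (not the actual BSC/Paraver workflow; all timing values are invented placeholders) of how speedup and parallel efficiency can be derived from such timings:

```python
# Minimal sketch: speedup and parallel efficiency from wall-clock timings of
# coupled-model runs at several CPU counts. The numbers are hypothetical
# placeholders, not measured EC-Earth data.

timings = {  # CPUs assigned to IFS -> wall-clock seconds per simulated month (hypothetical)
    32: 5400.0,
    64: 3100.0,
    128: 2000.0,
    256: 1850.0,  # poor scaling beyond 128 CPUs would show up as low efficiency here
}

base_cpus = min(timings)
base_time = timings[base_cpus]

for cpus in sorted(timings):
    speedup = base_time / timings[cpus]
    efficiency = speedup / (cpus / base_cpus)
    print(f"{cpus:4d} CPUs  speedup {speedup:5.2f}  efficiency {efficiency:5.1%}")
```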

Relevance:

40.00%

Publisher:

Abstract:

The present work describes the development of a fast and robust analytical method for the determination of 53 antibiotic residues, covering various chemical groups and some of their metabolites, in environmental matrices that are considered important sources of antibiotic pollution, namely hospital and urban wastewaters, as well as in river waters. The method is based on automated off-line solid phase extraction (SPE) followed by ultra-high-performance liquid chromatography coupled to quadrupole linear ion trap tandem mass spectrometry (UHPLC–QqLIT). For unequivocal identification and confirmation, and in order to fulfill EU guidelines, two selected reaction monitoring (SRM) transitions per compound are monitored (the most intense one is used for quantification and the second one for confirmation). Quantification of the target antibiotics is performed by the internal standard approach, using one isotopically labeled compound for each chemical group in order to correct for matrix effects. The main advantages of the method are the automation and speed-up of sample preparation through reduced extraction volumes for all matrices, the fast separation of a wide spectrum of antibiotics by ultra-high-performance liquid chromatography, its sensitivity (limits of detection in the low ng/L range) and its selectivity (due to the use of tandem mass spectrometry). The inclusion of β-lactam antibiotics (penicillins and cephalosporins), which are difficult to analyze in multi-residue methods owing to their instability in water matrices, and of some antibiotic metabolites is a further important benefit of the method developed. As part of the validation procedure, the method was applied to the analysis of antibiotic residues in hospital wastewaters, urban influent and effluent wastewaters, and river water samples.
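Quantification by the internal standard approach, as described above, amounts to normalizing the analyte SRM peak area to that of the isotopically labeled standard and converting the ratio with a calibration response factor. The sketch below illustrates only this calculation; the function name and all numerical values are hypothetical, not taken from the published method:

```python
# Minimal sketch of internal-standard quantification: the analyte peak area is
# normalized to the area of an isotopically labeled standard, and the ratio is
# converted to a concentration via a calibration response factor.
# All numbers are hypothetical.

def quantify(area_analyte, area_is, conc_is_ng_l, response_factor):
    """Concentration (ng/L) from SRM peak areas using an isotopically labeled IS."""
    area_ratio = area_analyte / area_is
    return area_ratio * conc_is_ng_l / response_factor

# response_factor would come from calibration standards spiked with the same IS
conc_ng_l = quantify(area_analyte=8.2e5, area_is=4.1e5,
                     conc_is_ng_l=50.0, response_factor=1.12)
print(f"estimated concentration: {conc_ng_l:.1f} ng/L")
```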

Relevance:

40.00%

Publisher:

Abstract:

A capillary microtrap thermal desorption module is developed for near real-time analysis of volatile organic compounds (VOCs) at sub-ppbv levels in air samples. The device allows the direct injection of the thermally desorbed VOCs into a chromatographic column. It does not use a second cryotrap to focus the adsorbed compounds before they enter the separation column, thus reducing the formation of artifacts. The connection of the microtrap to a GC–MS allows the quantitative determination of VOCs in less than 40 min, with detection limits of between 5 and 10 pptv (at 25 °C and 760 mmHg), corresponding to 19–43 ng m−3, using sampling volumes of 775 cm3. The microtrap is applied to the analysis of environmental air contamination in different laboratories of our faculty. The results indicate that most volatile compounds diffuse easily through the air and may contaminate the surrounding areas even when the habitual safety precautions (e.g., working under fume hoods) are followed during the handling of solvents. The application of the microtrap to the analysis of VOCs in breath samples suggests that 2,5-dimethylfuran may be a strong indicator of a person's smoking status.
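The correspondence quoted above between pptv-level detection limits and mass concentrations of 19–43 ng m−3 follows from the ideal gas law at 25 °C and 760 mmHg. The worked example below shows the conversion; the choice of benzene and toluene as example VOCs is illustrative, not taken from the study:

```python
# Worked example: converting a mixing ratio in pptv to a mass concentration in
# ng/m3 at 25 degC and 760 mmHg using the ideal gas law. Molar masses are for
# benzene and toluene as illustrative VOCs.

R = 0.082057  # L atm / (mol K)
T = 298.15    # 25 degC in K
P = 1.0       # 760 mmHg = 1 atm
molar_volume = R * T / P  # ~24.45 L per mol of air at these conditions

def pptv_to_ng_m3(pptv, molar_mass_g_mol):
    mol_air_per_m3 = 1000.0 / molar_volume          # mol of air in 1 m3
    mol_voc_per_m3 = pptv * 1e-12 * mol_air_per_m3  # pptv is a mole fraction
    return mol_voc_per_m3 * molar_mass_g_mol * 1e9  # g -> ng

print(pptv_to_ng_m3(5, 78.11))    # ~16 ng/m3 for benzene at 5 pptv
print(pptv_to_ng_m3(10, 92.14))   # ~38 ng/m3 for toluene at 10 pptv
```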

Relevance:

30.00%

Publisher:

Abstract:

In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and for evaluating the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used and the cosine-theta coefficient served as similarity measure.

During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for the exploratory analysis of these data. It therefore makes sense to check whether the application of suitable data transformations, the reduction of the D-part simplex to two or three factors and the visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to open questions. In this paper we follow the lines of a paper by R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors are plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process.

Key words: compositional data analysis, biplot, deep-sea sediments
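For reference, the ilr transformation mentioned above maps a D-part composition to D−1 orthonormal log-ratio coordinates. The sketch below implements one common coordinate choice (pivot-type balances); the input composition is hypothetical, not a value from the sediment cores:

```python
import numpy as np

def ilr(x):
    """Isometric log-ratio transform of a D-part composition (one standard basis choice)."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                      # close the composition to 1
    d = len(x)
    z = np.empty(d - 1)
    for i in range(1, d):
        gm = np.exp(np.mean(np.log(x[:i])))      # geometric mean of the first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * np.log(gm / x[i])
    return z

# e.g. a 4-part geochemical composition (hypothetical weight fractions)
print(ilr([0.52, 0.31, 0.12, 0.05]))
```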

Relevance:

30.00%

Publisher:

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of the estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
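As a rough illustration of the two-stage idea only, not of the authors' actual models or estimation procedure, the sketch below simulates data from an independent-incidence variant: a Bernoulli draw decides which parts are essential zeros, and an additive logistic-normal draw distributes the unit total among the parts that are present. All parameters are hypothetical and the parameterization is deliberately simplified:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_composition(p_present, mu, cov):
    """Two-stage sketch: Bernoulli incidence, then a logistic-normal split of the
    unit total among the parts recorded as present (independent-incidence variant)."""
    present = rng.random(len(p_present)) < p_present  # stage 1: which parts are non-zero
    x = np.zeros(len(p_present))
    k = present.sum()
    if k == 0:
        return present.astype(int), x
    if k == 1:
        x[present] = 1.0
        return present.astype(int), x
    # stage 2: additive logistic-normal on the k non-zero parts (first k-1 log-ratios)
    y = rng.multivariate_normal(mu[:k - 1], cov[:k - 1, :k - 1])
    w = np.exp(np.append(y, 0.0))
    x[present] = w / w.sum()
    return present.astype(int), x

# hypothetical parameters for a 4-part budget composition
p = np.array([0.9, 0.8, 0.6, 0.4])
print(simulate_composition(p, np.zeros(3), 0.5 * np.eye(3)))
```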

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to develop and validate an analytical method to simultaneously determine European Union-regulated beta-lactams (penicillins and cephalosporins) and quinolones in cow milk. The procedure involves a new solid phase extraction (SPE) step to clean up and pre-concentrate the three series of antibiotics before analysis by liquid chromatography–tandem mass spectrometry (LC-MS/MS) and ultra-high-performance liquid chromatography–tandem mass spectrometry (UPLC-MS/MS). The LC-MS/MS and UPLC-MS/MS techniques were also compared. The method was validated according to Directive 2002/657/EC and subsequently applied to 56 samples of raw cow milk supplied by the Laboratori Interprofessional Lleter de Catalunya (ALLIC), the interprofessional milk control laboratory of Catalonia.

Relevance:

30.00%

Publisher:

Abstract:

Selection of amino acid substitutions associated with resistance to nucleos(t)ide-analog (NA) therapy in the hepatitis B virus (HBV) reverse transcriptase (RT), and their combination in a single viral genome, complicates treatment of chronic HBV infection and may affect the overlapping surface coding region. In this study, the variability of an overlapping polymerase-surface region critical for NA resistance is investigated before treatment and under antiviral therapy, with assessment of NA-resistance-associated amino acid changes occurring simultaneously in the same genome (linkage analysis) and of their influence on the surface coding region.

Relevance:

30.00%

Publisher:

Abstract:

Human activities have serious impacts on marine apex predators. Inadequate knowledge of the spatial and trophic ecology of these marine animals ultimately compromises the viability of their populations and impedes our ability to use them as environmental biomonitors. Intrinsic biogeochemical markers, such as stable isotopes, fatty acids, trace elements, and chemical pollutants, are increasingly being used to trace the spatial and trophic ecology of marine top predators. Notable advances include the emergence of the first oceanographic "isoscapes" (isotopic geographic gradients), the advent of compound-specific isotopic analyses, improvements in diet reconstruction through Bayesian statistics, and tissue analysis of tracked animals to ground-truth biogeochemical profiles. However, most researchers still focus on only a few tracers. Moreover, insufficient knowledge of the biogeochemical integration in tissues, fractionation and routing processes, and geographic and temporal variability in baseline levels continues to hamper the resolution and potential of these markers in studying the spatial and feeding ecology of top predators.
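Diet reconstruction with stable isotopes ultimately rests on linear mixing equations, which Bayesian tools (e.g., MixSIAR) generalize to many sources and tracers. The sketch below shows the simplest two-source, one-isotope case; the delta values and trophic enrichment factor are hypothetical:

```python
# Minimal sketch of a two-source, one-isotope linear mixing model of the kind that
# underlies isotope-based diet reconstruction. All values are hypothetical.

def source_proportion(delta_consumer, delta_a, delta_b, tef):
    """Fraction of the diet from source A, after correcting the consumer signature
    for trophic enrichment: delta_consumer - tef = p*delta_a + (1 - p)*delta_b."""
    corrected = delta_consumer - tef
    return (corrected - delta_b) / (delta_a - delta_b)

p_a = source_proportion(delta_consumer=-16.5, delta_a=-15.0, delta_b=-20.0, tef=1.0)
print(f"estimated proportion of source A: {p_a:.2f}")
```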