51 results for Standard reference
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Chemical analysis is a well-established procedure for the provenancing of archaeological ceramics. Various analytical techniques are routinely used, and large amounts of data have accumulated in data banks. However, in order to exchange results obtained by different laboratories, the respective analytical procedures need to be tested for inter-comparability. In this study, the schemes of analysis used in four laboratories that carry out archaeological pottery studies on a routine basis were compared. The techniques investigated were neutron activation analysis (NAA), X-ray fluorescence analysis (XRF), inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS). For this comparison, series of measurements on different geological standard reference materials (SRMs) were carried out and the results were statistically evaluated. An attempt was also made to establish calibration factors between pairs of analytical set-ups in order to smooth out the systematic differences among the results.
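As an illustration of what such a calibration factor can look like (a generic sketch, not necessarily the exact procedure adopted by the four laboratories): for a given element, measurements of the same SRM by set-ups A and B can be related by a proportional factor, or more generally by a linear fit over several SRMs,

\[
f_{A\to B} = \frac{\bar c_B}{\bar c_A}, \qquad c_B \approx a + b\,c_A ,
\]

so that concentrations reported by A are rescaled before being compared with, or merged into, B's data bank.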
Abstract:
This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. The framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, where the results of the eight participating teams were compared. We describe the available database, comprising multi-center, multi-vendor and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that semi-automatic methods can segment the vessel lumen and media with an accuracy comparable to manual annotation, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis also highlights the challenges in IVUS segmentation that remain to be solved.
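Two measures commonly used to score segmented contours against a reference standard (given here as an illustrative sketch, not necessarily the three measures defined in the paper) are the Jaccard overlap between the automatic region A and the manual region M, and the Hausdorff distance between the corresponding contours C_A and C_M:

\[
\mathrm{JM} = \frac{|A \cap M|}{|A \cup M|}, \qquad
\mathrm{HD} = \max\Big\{ \sup_{p \in C_A} \inf_{q \in C_M} d(p,q),\; \sup_{q \in C_M} \inf_{p \in C_A} d(p,q) \Big\}.
\]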
Abstract:
The aim of this paper is to analyse the effects of recent regulatory reforms that the Spanish Health Authorities have implemented in the pharmaceutical market: the introduction of a reference price system together with the promotion of generic drugs. The main objectives of these two reforms are to increase price competition and, ultimately, to reduce pharmaceutical costs. Before the introduction of reference prices, consumers paid a fixed copayment rate on the price of whatever drug they purchased. With the introduction of the new system, the situation changes as follows: if the consumer buys the more expensive branded drug, (s)he pays the sum of two elements, the copayment associated with the reference price plus the difference between the price of that drug and the reference price. However, if the consumer decides to buy the generic alternative, whose price is below the reference price, (s)he pays the same copayment as before. We show that the introduction of a reference price system together with the promotion of generic drugs increases price competition and lowers pharmaceutical costs only if the reference price is set within a certain interval. The duopolists' profits may also be reduced. These results are due to the opposing effects that reference prices have on branded and generic producers, respectively.
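In formula form (the notation is ours, but the rule follows the description above), with coinsurance rate lambda, branded price p_b, generic price p_g and reference price p_r, the consumer pays

\[
\text{payment} =
\begin{cases}
\lambda\,p_r + (p_b - p_r), & \text{branded drug, } p_b > p_r,\\
\lambda\,p_g, & \text{generic drug, } p_g \le p_r,
\end{cases}
\]

compared with \(\lambda p\) under the pre-reform regime, where p is the price of whichever drug was purchased.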
Abstract:
We show how to calibrate CES production and utility functions when indirect taxation affecting inputs and consumption is present. These calibrated functions can then be used in computable general equilibrium models. Taxation modifies the standard calibration procedures since any taxed good has two associated prices and a choice of reference value units has to be made. We also provide an example of computer code to solve the calibration of CES utilities under two alternate normalizations. To our knowledge, this paper fills a methodological gap in the CGE literature.
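As a minimal sketch of the issue (generic CES calibration, not necessarily the paper's exact normalizations): with benchmark quantities \(\bar x_i\), producer prices \(\bar p_i\), ad valorem tax rates \(t_i\) and elasticity parameter \(\rho\), the first-order conditions evaluated at consumer (tax-inclusive) prices give share parameters

\[
\delta_i = \frac{\bar p_i (1+t_i)\,\bar x_i^{\,1-\rho}}{\sum_j \bar p_j (1+t_j)\,\bar x_j^{\,1-\rho}},
\]

and the choice of reference value units (for instance, whether benchmark prices are normalized to one gross or net of tax) yields different but behaviourally equivalent parameterizations.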
Abstract:
Study carried out during a research stay at the Universität Karlsruhe between January and May 2007. Data structure libraries define interfaces and implement fundamental algorithms and data structures. One example is the Standard Template Library (STL), which is part of the C++ programming language. Within the framework of a thesis, work is under way to obtain more efficient and/or more versatile implementations of some STL components. To do so, algorithm engineering techniques are used; in particular, knowledge from the algorithms community is integrated and the existing technology is taken into account. The work during the stay was framed within the development of the Multi Core STL (MCSTL). The MCSTL is a parallel implementation of the STL for multi-core machines. Multi-core machines are currently the only kind of machine available on the market. Therefore, even if the parallelism obtained is not optimal, it is preferable to leaving processors idle, since the trend is for the number of processors per computer to keep increasing.
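As an illustration of the kind of drop-in parallelism the MCSTL aims at (a minimal sketch using the GCC libstdc++ parallel mode, which grew out of the MCSTL work; not code from the thesis itself):

    // Sort a large vector using the libstdc++ parallel mode.
    // Build with GCC: g++ -O2 -fopenmp example.cpp
    #include <parallel/algorithm>   // explicit parallel-mode entry points
    #include <random>
    #include <vector>

    int main() {
        std::vector<double> v(1 << 24);
        std::mt19937 gen(42);
        std::uniform_real_distribution<double> dist(0.0, 1.0);
        for (auto& x : v) x = dist(gen);

        // Explicit call into the parallel implementation; compiling with
        // -D_GLIBCXX_PARALLEL would dispatch a plain std::sort the same way.
        __gnu_parallel::sort(v.begin(), v.end());
        return 0;
    }

Compiled with -D_GLIBCXX_PARALLEL, existing STL code can thus benefit from multiple cores without being rewritten.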
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, with one and with two phase changes, which have exact solutions that enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare the approximations to numerical solutions. It is popular to use a quadratic profile as an approximation of the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give even greater improvement, and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is better for various problems, showing that the answer depends largely on the specified boundary conditions.
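As a brief sketch of the idea for a one-phase problem (standard textbook form, not the paper's specific variants): with the temperature measured relative to the melt temperature so that T(delta, t) = 0 at the moving front, integrating the heat equation \(T_t = \alpha T_{xx}\) over \(0 \le x \le \delta(t)\) gives the heat balance integral

\[
\frac{d}{dt}\int_0^{\delta(t)} T\,dx
= \alpha\left(\left.\frac{\partial T}{\partial x}\right|_{x=\delta} - \left.\frac{\partial T}{\partial x}\right|_{x=0}\right),
\]

and substituting an assumed profile, e.g. quadratic \(T \approx a(t)(1-x/\delta) + b(t)(1-x/\delta)^2\) or cubic with an extra \((1-x/\delta)^3\) term, with coefficients fixed by the boundary and Stefan conditions, reduces the problem to an ordinary differential equation for the front position \(\delta(t)\).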
Abstract:
In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use parameters specific to the experience of its own portfolio. In particular, the analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the key elements of the standard model for premium and reserve risk is the correlation matrix between lines of business. We present how this correlation matrix can be estimated from a quantitative perspective, and we explore the possibility of using a credibility model that merges the qualitative and quantitative perspectives to estimate it.
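For reference, in the standard formula the correlation matrix enters the aggregation of premium and reserve risk across lines of business roughly as follows (a sketch of the QIS5-style structure, in our notation): if \(V_r\) and \(\sigma_r\) denote the volume measure and standard deviation of line of business r, the overall standard deviation is

\[
\sigma = \frac{1}{V}\sqrt{\sum_{r,s} \mathrm{Corr}_{r,s}\,\sigma_r V_r\,\sigma_s V_s}, \qquad V = \sum_r V_r ,
\]

and the capital charge is an increasing function of \(\sigma V\), so a mis-specified correlation matrix feeds directly into the SCR.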
Abstract:
This paper presents an automatic vision-based system for UUV station keeping. The vehicle is equipped with a down-looking camera, which provides images of the sea floor. The station keeping system is based on a feature-based motion detection algorithm, which exploits standard correlation and explicit textural analysis to solve the correspondence problem. A visual map of the area surveyed by the vehicle is constructed to increase the flexibility of the system, allowing the vehicle to position itself when it has lost the reference image. The testing platform is the URIS underwater vehicle. Experimental results demonstrating the behavior of the system in a real environment are presented.
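As an illustration of the correlation building block such a correspondence search relies on (a generic sketch, not the URIS system's actual implementation):

    // Normalized cross-correlation between a feature patch from the reference
    // image and a same-sized candidate patch from the current frame.
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Patches are stored row-major; both must have identical dimensions.
    double ncc(const std::vector<double>& a, const std::vector<double>& b) {
        const std::size_t n = a.size();
        if (n == 0 || n != b.size()) return 0.0;

        double meanA = 0.0, meanB = 0.0;
        for (std::size_t i = 0; i < n; ++i) { meanA += a[i]; meanB += b[i]; }
        meanA /= n; meanB /= n;

        double num = 0.0, varA = 0.0, varB = 0.0;
        for (std::size_t i = 0; i < n; ++i) {
            const double da = a[i] - meanA, db = b[i] - meanB;
            num  += da * db;
            varA += da * da;
            varB += db * db;
        }
        const double denom = std::sqrt(varA * varB);
        return denom > 0.0 ? num / denom : 0.0;   // score in [-1, 1]
    }

The displacement whose candidate patch maximizes this score is taken as the motion estimate for that feature.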
Abstract:
A decentralized model reference controller is designed to reduce the magnitude of the transversal vibration of a flexible cable-stayed beam structure induced by a seismic excitation. The controller design is based on the sliding-mode principle, such that a priori knowledge
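As a generic illustration of the sliding-mode principle invoked here (a sketch in our own notation, not the controller actually designed in the paper): with tracking error e between the structural response and the output of the reference model, a sliding surface and switching law of the form

\[
s = \dot e + \lambda e, \qquad u = u_{\mathrm{eq}} - K\,\mathrm{sgn}(s), \qquad \lambda, K > 0,
\]

drive s to zero in finite time provided the gain K dominates the bound on the disturbance and model uncertainty, after which the error decays along s = 0.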
Abstract:
Roughly fifteen years ago, the Church of Jesus Christ of Latter-day Saints published a new proposed standard file format, called GEDCOM, designed to allow different genealogy programs to exchange data. Five years later, in May 2000, the GENTECH Data Modeling Project appeared, with the support of the Federation of Genealogical Societies (FGS) and other American genealogical societies. It attempted to define a genealogical logical data model to facilitate data exchange between different genealogy programs. Although genealogists deal with an enormous variety of data sources, one of the central concepts of this data model was that all genealogical data could be broken down into a series of short, formal genealogical statements, something more versatile than merely exporting and importing data records with predefined fields. This project was finally absorbed in 2004 by the National Genealogical Society (NGS). Despite being a genealogical reference for many applications, these models have serious drawbacks when adapting to different cultural and social environments. At present there is no formal proposal for a recognized standard to represent the family domain. Here we propose an alternative conceptual model, largely inherited from the aforementioned models. The design is intended to overcome their limitations; however, its major innovation lies in applying the ontological paradigm when modeling statements and entities.
Abstract:
We performed a comprehensive study to assess the fitness for purpose of four chromatographic conditions for the determination of six groups of marine lipophilic toxins (okadaic acid and dinophysistoxins, pectenotoxins, azaspiracids, yessotoxins, gymnodimine and spirolides) by LC-MS/MS, in order to select the most suitable conditions as stated by the European Union Reference Laboratory for Marine Biotoxins (EURLMB). In every case the elution gradient was optimized to achieve a total run-time cycle of 12 min. We performed a single-laboratory validation for the analysis of three matrices relevant to the seafood aquaculture industry (mussels, Pacific oysters and clams), and for sea urchins, for which no data about lipophilic toxins had been reported before. Moreover, we compared the method performance under alkaline conditions using two quantification strategies: external standard calibration (EXS) and matrix-matched standard calibration (MMS). Alkaline conditions were the only scenario that allowed detection windows with polarity switching in a 3200 QTrap mass spectrometer, so the analysis of all toxins can be accomplished in a single run, increasing sample throughput. The limits of quantification under alkaline conditions met the validation requirements established by the EURLMB for all toxins and matrices, while the remaining conditions failed in some cases. The accuracy of the method and the matrix effects were generally dependent on the mobile phases and the seafood species. MMS had a moderate positive impact on method accuracy for crude extracts, but it showed poor trueness for seafood species other than mussels when analyzing hydrolyzed extracts. Alkaline conditions with EXS and recovery correction for OA were selected as the most suitable conditions in the context of our laboratory. This comparative study can help other laboratories to choose the best conditions for the implementation of LC-MS/MS according to their own needs.
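The two quantification strategies compared here differ only in where the calibration curve is built (sketched in generic form): with a curve of slope m and intercept b fitted to standards prepared either in pure solvent (EXS) or in blank matrix extract (MMS), the toxin concentration in a sample giving instrument response y is estimated as

\[
\hat c = \frac{y - b}{m},
\]

so MMS absorbs matrix-induced signal suppression or enhancement into m and b, at the cost of requiring representative blank matrix for every species analysed.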
Abstract:
Background: The GENCODE consortium was formed to identify and map all protein-coding genes within the ENCODE regions. This was achieved by a combination of initial manual annotation by the HAVANA team, experimental validation by the GENCODE consortium and a refinement of the annotation based on these experimental results. Results: The GENCODE gene features are divided into eight different categories, of which only the first two (known and novel coding sequence) are confidently predicted to be protein-coding genes. 5' rapid amplification of cDNA ends (RACE) and RT-PCR were used to experimentally verify the initial annotation. Of the 420 coding loci tested, 229 RACE products have been sequenced. They supported 5' extensions of 30 loci and new splice variants in 50 loci. In addition, 46 loci without evidence for a coding sequence were validated, consisting of 31 novel and 15 putative transcripts. We assessed the comprehensiveness of the GENCODE annotation by attempting to validate all the predicted exon boundaries outside the GENCODE annotation. Out of 1,215 tested in a subset of the ENCODE regions, 14 novel exon pairs were validated, only two of them in intergenic regions. Conclusions: In total, 487 loci, of which 434 are coding, have been annotated as part of the GENCODE reference set available from the UCSC browser. Comparison of the GENCODE annotation with RefSeq and ENSEMBL shows that only 40% of GENCODE exons are contained within the two sets, which is a reflection of the high number of alternative splice forms with unique exons annotated. Over 50% of coding loci have been experimentally verified by 5' RACE for EGASP, and the GENCODE collaboration is continuing to refine its annotation of 1% of the human genome with the aid of experimental validation.
Abstract:
Background: Non-invasive monitoring of respiratory muscle function is an area of increasing research interest, resulting in the appearance of new monitoring devices, one of these being piezoelectric contact sensors. The present study was designed to test whether piezoelectric contact (non-invasive) sensors could be useful in respiratory monitoring, in particular in measuring the timing of diaphragmatic contraction. Methods: Experiments were performed in an animal model: three pentobarbital-anesthetized mongrel dogs. The motion of the thoracic cage (TM signal) was acquired by means of a piezoelectric contact sensor placed on the costal wall. This signal was compared with direct measurements of diaphragmatic muscle length (DL), made by sonomicrometry. Furthermore, to assess diaphragmatic function, other respiratory signals were acquired: respiratory airflow (FL) and transdiaphragmatic pressure (DP). Diaphragm contraction time was estimated from each of these four signals and, using the DL signal as reference, the estimates obtained from the other three signals were compared with it. Results: The contraction time estimated with the TM signal tends to give a reading 0.06 seconds lower than the measure made with the DL signal (-0.21 and 0.00 for the FL and DP signals, respectively), with a standard deviation of 0.05 seconds (0.08 and 0.06 for the FL and DP signals, respectively). Correlation coefficients indicated a close link between the contraction time estimated with the TM signal and that estimated with the DL signal (a Pearson correlation coefficient of 0.98, a reliability coefficient of 0.95, a slope of 1.01 and a Spearman's rank-order coefficient of 0.98). In general, the correlation coefficients and the mean and standard deviation of the differences were better in the inspiratory-load respiratory tests than in the spontaneous-ventilation tests. Conclusion: The technique presented in this work provides a non-invasive method to assess the timing of diaphragmatic contraction in canines, using a piezoelectric contact sensor placed on the costal wall.
Abstract:
Expressions relating spectral efficiency, power, and Doppler spectrum are derived for Rayleigh-faded wireless channels with Gaussian signal transmission. No side information on the state of the channel is assumed at the receiver. Rather, periodic reference signals are postulated in accordance with the functioning of most wireless systems. The analysis relies on a well-established lower bound, generally tight and asymptotically exact at low SNR. In contrast with most previous studies, which relied on block-fading channel models, a continuous-fading model is adopted. This embeds the Doppler spectrum directly in the derived expressions, imbuing them with practical significance. Closed-form relationships are obtained for the popular Clarke-Jakes spectrum, and informative expansions, valid for arbitrary spectra, are found for the low- and high-power regimes. While the paper focuses on scalar channels, the extension to multiantenna settings is also discussed.
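For concreteness, the Clarke-Jakes Doppler spectrum referred to above is, in its usual unit-power normalization with maximum Doppler frequency \(f_{\mathrm m}\),

\[
S(\nu) = \frac{1}{\pi f_{\mathrm m}\sqrt{1 - (\nu/f_{\mathrm m})^2}}, \qquad |\nu| < f_{\mathrm m},
\]

which is the spectral density that enters the closed-form spectral-efficiency relationships.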
Abstract:
Expressions relating spectral efficiency, power and Doppler spectrum are derived for low-power Rayleigh-faded wireless channels with proper complex signaling. No side information on the state of the channel is assumed at the receiver. Rather, periodic reference signals are postulated in accordance with the functioning of most wireless systems. In contrast with most previous studies, which relied on block-fading channel models, a continuous-fading model is adopted. This embeds the Doppler spectrum directly in the derived expressions, thereby imbuing them with practical significance.