988 results for Fantôme de calibration


Relevance: 10.00%

Abstract:

The use of self-calibrating techniques in parallel magnetic resonance imaging eliminates the need for coil sensitivity calibration scans and avoids potential mismatches between calibration scans and subsequent accelerated acquisitions (e.g., as a result of patient motion). Most examples of self-calibrating Cartesian parallel imaging techniques have required the use of modified k-space trajectories that are densely sampled at the center and more sparsely sampled in the periphery. However, spiral and radial trajectories offer inherent self-calibrating characteristics because of their densely sampled center. At no additional cost in acquisition time and with no modification in scanning protocols, in vivo coil sensitivity maps may be extracted from the densely sampled central region of k-space. This work demonstrates the feasibility of self-calibrated spiral and radial parallel imaging using a previously described iterative non-Cartesian sensitivity encoding algorithm.
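Extracting sensitivity maps from the densely sampled center of k-space amounts to low-pass filtering each coil image and normalizing. A minimal sketch, assuming Cartesian-gridded k-space for simplicity (not the actual non-Cartesian gridding or the iterative SENSE reconstruction; array sizes and the `center` parameter are illustrative):

```python
import numpy as np

def sensitivity_maps_from_center(kspace, center=16):
    """Estimate coil sensitivities from the densely sampled k-space center.

    kspace: (ncoils, ny, nx) centered Cartesian k-space, standing in for
    the gridded dense center of a spiral/radial acquisition.
    """
    ncoils, ny, nx = kspace.shape
    # Keep only a central block of k-space (low spatial frequencies).
    mask = np.zeros((ny, nx))
    y0, x0, h = ny // 2, nx // 2, center // 2
    mask[y0 - h:y0 + h, x0 - h:x0 + h] = 1.0
    # Low-resolution coil images via inverse FFT of the masked k-space.
    low = np.fft.ifft2(np.fft.ifftshift(kspace * mask, axes=(-2, -1)))
    # Normalize by the root-sum-of-squares image to get relative sensitivities.
    rss = np.sqrt((np.abs(low) ** 2).sum(axis=0))
    rss[rss == 0] = 1.0
    return low / rss

# Toy example: two coils with smooth Gaussian sensitivities over a uniform object.
ny = nx = 32
y, x = np.mgrid[0:ny, 0:nx]
coils = np.stack([np.exp(-((x - 8) ** 2 + (y - 8) ** 2) / 400.0),
                  np.exp(-((x - 24) ** 2 + (y - 24) ** 2) / 400.0)])
kspace = np.fft.fftshift(np.fft.fft2(coils * np.ones((ny, nx))), axes=(-2, -1))
maps = sensitivity_maps_from_center(kspace)
print(maps.shape)  # (2, 32, 32)
```

By construction the squared magnitudes of the estimated maps sum to one wherever the object has signal, which is the usual relative-sensitivity normalization.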

Relevance: 10.00%

Abstract:

There is increasing evidence that the clinical efficacy of tamoxifen, the first and most widely used targeted therapy for estrogen-sensitive breast cancer, depends on the formation of the active metabolites 4-hydroxy-tamoxifen and 4-hydroxy-N-desmethyl-tamoxifen (endoxifen). Large inter-individual variability in endoxifen plasma concentrations has been observed and related both to genetic and environmental (i.e. drug-induced) factors altering the activity of CYP450 metabolizing enzymes. In this context, we have developed an ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method requiring 100 μL of plasma for the quantification of tamoxifen and three of its major metabolites in breast cancer patients. Plasma is purified by a combination of protein precipitation, evaporation at room temperature under nitrogen, and reconstitution in methanol/20 mM ammonium formate 1:1 (v/v), adjusted to pH 2.9 with formic acid. Reverse-phase chromatographic separation of tamoxifen, N-desmethyl-tamoxifen, 4-hydroxy-tamoxifen and 4-hydroxy-N-desmethyl-tamoxifen is performed within 13 min using elution with a gradient of 10 mM ammonium formate and acetonitrile, both containing 0.1% formic acid. Analyte quantification, using matrix-matched calibration samples spiked with their respective deuterated internal standards, is performed by electrospray ionization-triple quadrupole mass spectrometry using selected reaction monitoring detection in the positive mode. The method was validated according to FDA recommendations, including assessment of relative matrix-effect variability, as well as short-term stability of tamoxifen and its metabolites in plasma and whole blood. The method is precise (inter-day CV: 2.5-7.8%), accurate (-1.4 to +5.8%) and sensitive (lower limits of quantification between 0.4 and 2.0 ng/mL).
Application of this method to patient samples has enabled the identification of two further metabolites, 4'-hydroxy-tamoxifen and 4'-hydroxy-N-desmethyl-tamoxifen, described for the first time in breast cancer patients. This UPLC-MS/MS assay is currently applied for monitoring plasma levels of tamoxifen and its metabolites in breast cancer patients within the framework of a clinical trial aiming to assess the impact of dose increase on tamoxifen and endoxifen exposure.
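Quantification against matrix-matched calibrators with a deuterated internal standard (IS) reduces, numerically, to regressing the analyte/IS peak-area ratio on concentration and back-calculating unknowns from the fitted line. A minimal sketch with hypothetical, perfectly linear peak areas (none of the numbers come from the assay):

```python
import numpy as np

# Hypothetical calibrator concentrations (ng/mL); the deuterated IS is
# spiked at the same level in every calibrator and sample.
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
area_analyte = 2000.0 * conc          # toy detector response, perfectly linear
area_is = np.full(conc.size, 1.0e5)   # IS peak area, constant across calibrators

ratio = area_analyte / area_is        # response ratio cancels common matrix effects
slope, intercept = np.polyfit(conc, ratio, 1)

# Back-calculate an unknown patient sample from its measured response ratio.
unknown_ratio = 0.5
unknown_conc = (unknown_ratio - intercept) / slope
print(round(unknown_conc, 2))  # → 25.0 (ng/mL)
```

Because the analyte and its deuterated IS co-elute and ionize almost identically, taking the ratio suppresses run-to-run and matrix-dependent response variation before the regression is ever fitted.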

Relevance: 10.00%

Abstract:

A previously developed high-performance liquid chromatography-mass spectrometry (HPLC-MS) procedure for the simultaneous determination of antidementia drugs, including donepezil, galantamine, memantine, rivastigmine and its metabolite NAP 226-90, was transferred to an ultra-performance liquid chromatography system coupled to a tandem mass spectrometer (UPLC-MS/MS). The drugs and their internal standards ([2H7]-donepezil, [13C,2H3]-galantamine, [13C2,2H6]-memantine, [2H6]-rivastigmine) were extracted from 250 μL of human plasma by protein precipitation with acetonitrile. Chromatographic separation was achieved on a reverse-phase column (BEH C18, 2.1 mm × 50 mm, 1.7 μm) with gradient elution of an ammonium acetate buffer at pH 9.3 and acetonitrile at a flow rate of 0.4 mL/min and an overall run time of 4.5 min. The analytes were detected on a tandem quadrupole mass spectrometer operated in positive electrospray ionization mode, and quantification was performed using multiple reaction monitoring. The method was validated according to the recommendations of international guidelines over a calibration range of 1-300 ng/mL for donepezil, galantamine and memantine, and 0.2-50 ng/mL for rivastigmine and NAP 226-90. The trueness (86-108%), repeatability (0.8-8.3%), intermediate precision (2.3-10.9%) and selectivity of the method were found to be satisfactory. Matrix-effect variability was below 15% for the analytes and below 5% after correction by the internal standards. A method comparison performed with patient samples showed similar results between the HPLC-MS and UPLC-MS/MS procedures. This validated UPLC-MS/MS method thus reduces the required amount of plasma, uses a simplified sample preparation, and achieves higher sensitivity and specificity with a much shorter run time.
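Matrix-effect variability of the kind quoted above is typically assessed by comparing analyte response in post-extraction-spiked plasma from several donors against neat solution, before and after IS correction. A sketch with hypothetical peak areas (the numbers, and the simplifying assumption that the IS is suppressed identically to the analyte, are invented for illustration):

```python
import numpy as np

neat_analyte = 1.00e5                      # peak area in neat (matrix-free) solution
neat_is = 2.00e5                           # IS peak area in neat solution
# Analyte areas after spiking post-extraction into plasma from six donors:
mat_analyte = np.array([9.1e4, 9.5e4, 8.8e4, 9.3e4, 9.7e4, 9.0e4])
mat_is = 2.0 * mat_analyte                 # toy: IS tracks the analyte's suppression exactly

me = 100.0 * mat_analyte / neat_analyte    # matrix effect per lot, % of neat response
cv_raw = 100.0 * me.std(ddof=1) / me.mean()

# IS-corrected matrix effect: ratio of (analyte/IS) in matrix vs. neat.
me_corr = 100.0 * (mat_analyte / mat_is) / (neat_analyte / neat_is)
cv_corr = 100.0 * me_corr.std(ddof=1) / me_corr.mean()
print(round(cv_raw, 1), round(cv_corr, 1))  # → 3.6 0.0
```

In this idealized case the IS correction removes the lot-to-lot variability entirely; in practice it only shrinks it, which is why both the raw and corrected figures are reported.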

Relevance: 10.00%

Abstract:

We investigate the determinants of regional development using a newly constructed database of 1569 sub-national regions from 110 countries covering 74 percent of the world's surface and 97 percent of its GDP. We combine the cross-regional analysis of geographic, institutional, cultural, and human capital determinants of regional development with an examination of productivity in several thousand establishments located in these regions. To organize the discussion, we present a new model of regional development that introduces into a standard migration framework elements of both the Lucas (1978) model of the allocation of talent between entrepreneurship and work, and the Lucas (1988) model of human capital externalities. The evidence points to the paramount importance of human capital in accounting for regional differences in development, but also suggests from model estimation and calibration that entrepreneurial inputs and possibly human capital externalities help understand the data.

Relevance: 10.00%

Abstract:

The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been adopted as part of the U.S. Environmental Protection Agency’s BASINS (Better Assessment Science Integrating Point & Nonpoint Sources) software package and is being used by many U.S. federal and state agencies, including the USDA within the Conservation Effects Assessment Project. At present, over 250 peer-reviewed, published articles have been identified that report SWAT applications, reviews of SWAT components, or other research that includes SWAT. Many of these peer-reviewed articles are summarized here according to relevant application categories such as streamflow calibration and related hydrologic analyses, climate change impacts on hydrology, pollutant load assessments, comparisons with other models, and sensitivity analyses and calibration techniques. Strengths and weaknesses of the model are presented, and recommended research needs for SWAT are provided.
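Streamflow calibration in the SWAT literature is most often scored with the Nash-Sutcliffe efficiency (NSE), which compares the model's squared errors to the variance of the observations. A minimal sketch (the discharge values are made up):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; values <= 0 mean the model is no better than
    predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [12.0, 18.0, 30.0, 22.0, 15.0]            # hypothetical observed flows (m³/s)
sim = [14.0, 16.0, 27.0, 24.0, 14.0]            # hypothetical simulated flows

print(nash_sutcliffe(obs, obs))                 # 1.0 for a perfect simulation
print(round(nash_sutcliffe(obs, sim), 3))       # → 0.887
```

During automated calibration the model's parameters are adjusted to drive this statistic toward 1, typically alongside a percent-bias check so the fit is not achieved by compensating errors.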

Relevance: 10.00%

Abstract:

The 1994 Northridge earthquake sent ripples to insurance companies everywhere. It was one in a series of natural disasters, such as Hurricane Andrew, which, together with the problems at Lloyd's of London, have insurance companies running for cover. This paper presents a calibration of the U.S. economy in a model with financial markets for insurance derivatives, which suggests that the U.S. economy can deal with the damage of natural catastrophes far better than one might think.

Relevance: 10.00%

Abstract:

In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed that the Friedman rule is always optimal (or always non-optimal) on theoretical grounds. Whether the Friedman rule is optimal depends on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1%/year, although that finding is sensitive to the calibration.

Relevance: 10.00%

Abstract:

Despite the advancement of phylogenetic methods to estimate speciation and extinction rates, their power can be limited under variable rates, in particular for clades with high extinction rates and a small number of extant species. Fossil data can provide a powerful alternative source of information for investigating diversification processes. Here, we present PyRate, a computer program to estimate speciation and extinction rates and their temporal dynamics from fossil occurrence data. The rates are inferred in a Bayesian framework and are comparable to those estimated from phylogenetic trees. We describe how PyRate can be used to explore different models of diversification. In addition to the diversification rates, it provides estimates of the parameters of the preservation process (fossilization and sampling) and the times of speciation and extinction of each species in the data set. Moreover, we develop a new birth-death model to correlate the variation of speciation/extinction rates with changes in a continuous trait. Finally, we demonstrate the use of Bayes factors for model selection and show how the posterior estimates of a PyRate analysis can be used to generate calibration densities for Bayesian molecular clock analyses. PyRate is an open-source command-line Python program available at http://sourceforge.net/projects/pyrate/.
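As a back-of-the-envelope illustration of what rate estimation from inferred speciation/extinction times looks like (not PyRate's hierarchical Bayesian model, which also infers the preservation process), under a constant-rate birth-death model the maximum-likelihood extinction rate is simply the number of extinction events divided by the summed lineage durations. All times below are hypothetical:

```python
# Hypothetical per-species times of speciation (ts) and extinction (te),
# in Myr before present; te = 0.0 marks an extant species.
ts = [10.0, 8.5, 6.0, 4.2, 3.1]
te = [4.0, 2.5, 0.0, 1.2, 0.0]

n_extinctions = sum(1 for t in te if t > 0.0)
total_lineage_time = sum(s - e for s, e in zip(ts, te))   # lineage-Myr at risk

mu = n_extinctions / total_lineage_time    # extinction rate (events / lineage-Myr)
lam = len(ts) / total_lineage_time         # speciation rate, counting every origination
print(round(mu, 3), round(lam, 3))         # → 0.124 0.207
```

PyRate replaces these point estimates with posterior distributions, and crucially treats the ts/te values themselves as latent variables informed by the fossil occurrences rather than as known inputs.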

Relevance: 10.00%

Abstract:

Why was England first? And why Europe? We present a probabilistic model that builds on big-push models by Murphy, Shleifer and Vishny (1989), combined with hierarchical preferences. The interaction of exogenous demographic factors (in particular the English low-pressure variant of the European marriage pattern) and redistributive institutions such as the old Poor Law combined to make an Industrial Revolution more likely. Essentially, industrialization is the result of having a critical mass of consumers that is rich enough to afford (potentially) mass-produced goods. Our model is then calibrated to match the main characteristics of the English economy in 1750 and the observed transition until 1850. This allows us to address explicitly one of the key features of the British Industrial Revolution unearthed by economic historians over the last three decades: the slowness of productivity and output change. In our calibration, we find that the probability of Britain industrializing is 5 times larger than France's. Contrary to the recent argument by Pomeranz, China in the 18th century had essentially no chance to industrialize at all. This difference is decomposed into a demographic and a policy component, with the former being far more important than the latter.

Relevance: 10.00%

Abstract:

This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature: whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
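The simulation-based logic can be caricatured in a few lines: simulate the calibrated model repeatedly, build the sampling distribution of the statistic of interest (here a saving-investment correlation), and ask where the data moment falls in that distribution. Everything below — the AR(1) toy "model", its parameters, and the data moment — is invented for illustration and is not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_corr(n=120, rho=0.9, sigma=0.01):
    """Saving-investment correlation from one simulation of a toy model in
    which both series load on a common AR(1) productivity shock."""
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = rho * z[t - 1] + rng.normal(0.0, sigma)
    saving = z + rng.normal(0.0, sigma, n)   # idiosyncratic noise on each series
    invest = z + rng.normal(0.0, sigma, n)
    return np.corrcoef(saving, invest)[0, 1]

sims = np.array([simulated_corr() for _ in range(500)])  # sampling distribution
data_corr = 0.60                                         # hypothetical data moment
pval = (sims <= data_corr).mean()                        # simulated tail probability
print(round(float(sims.mean()), 2), round(float(pval), 2))
```

A data moment deep in the tail of the simulated distribution is evidence against the calibrated model along that dimension, which is exactly the kind of formal fit assessment the paper discusses.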

Relevance: 10.00%

Abstract:

A procedure using a Chirobiotic V column is presented which allows separation of the enantiomers of citalopram and its two N-demethylated metabolites, and of the internal standard, alprenolol, in human plasma. Citalopram, demethylcitalopram and didemethylcitalopram, as well as the internal standard, were recovered from plasma by liquid-liquid extraction. The limits of quantification were found to be 5 ng/ml for each enantiomer of citalopram and demethylcitalopram, and 7.5 ng/ml for each enantiomer of didemethylcitalopram. Inter- and intra-day coefficients of variation ranged from 2.4% to 8.6% for S- and R-citalopram, from 2.9% to 7.4% for S- and R-demethylcitalopram, and from 5.6% to 12.4% for S- and R-didemethylcitalopram. No interference was observed from endogenous compounds following the extraction of plasma samples from 10 different patients treated with citalopram. This method allows accurate quantification of each enantiomer and is, therefore, well suited for pharmacokinetic and drug interaction investigations. It replaces a previously described highly sensitive and selective high-performance liquid chromatography procedure using an acetylated β-cyclobond column which, because of manufacturing problems, is no longer usable for the separation of the enantiomers of citalopram and its demethylated metabolites.

Relevance: 10.00%

Abstract:

A previous study showed that methane (CH4) can be identified by headspace GC-MS and quantified using a stable isotope as internal standard. The main drawback of the GC-MS methods described in the literature for CH4 measurement is the absence of a specific internal standard for quantification. It is also essential to develop a safer method that limits the handling of gaseous CH4 and precisely controls the injected amount of gas for spiking and calibration, compared with external calibration. To avoid handling a stable isotope-labeled gas, we chose to generate a labeled gas as internal standard directly in the vial, based on the formation of CH4 by the reaction of the Grignard reagent methylmagnesium chloride with deuterated water. This method allows precise measurement of CH4 concentrations in gaseous samples, as well as in solid or liquid samples after a thermodesorption step in a headspace vial. A full accuracy-profile validation of the method is presented.
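Once the labeled gas (CH3D, from CH3MgCl + D2O) is present in the vial, quantification follows the usual isotope-dilution arithmetic: the analyte amount is the area ratio times the known internal-standard amount. A single-point sketch with hypothetical values (the IS amount, peak areas, and the assumption of a quantitative Grignard reaction and unit response factor are all illustrative):

```python
# Single-point isotope-dilution quantification sketch.
n_is_umol = 2.0              # µmol of CH3D generated in the vial (assumes the
                             # known amount of CH3MgCl reacts quantitatively with D2O)
area_ch4 = 35400.0           # analyte peak area (hypothetical)
area_ch3d = 29500.0          # internal-standard peak area (hypothetical)
rf = 1.0                     # CH4/CH3D relative response factor, assumed 1 here;
                             # in practice it is determined experimentally

n_ch4_umol = (area_ch4 / area_ch3d) / rf * n_is_umol
print(round(n_ch4_umol, 3))  # → 2.4 (µmol CH4 in the vial)
```

Because the internal standard experiences the same headspace equilibration and injection as the analyte, the ratio cancels the variability in injected gas volume that plagues external calibration.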

Relevance: 10.00%

Abstract:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data.
Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
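The flavor of SA-based conditional simulation can be sketched in one dimension: perturb a porosity field cell by cell, always honoring the borehole "hard data", and accept or reject each perturbation by the Metropolis rule so that the field's spatial statistics approach a geophysically derived target. The grid size, conditioning values, target lag-1 correlation, and cooling schedule below are all invented for illustration and are far simpler than the thesis's approach:

```python
import numpy as np

rng = np.random.default_rng(1)

n, target_corr = 50, 0.8
cond = {5: 0.30, 25: 0.22, 40: 0.27}    # cell index -> measured log porosity

def objective(x):
    """Mismatch with a target lag-1 spatial correlation (a stand-in for the
    spatial structure that would come from the crosshole GPR data)."""
    c = np.corrcoef(x[:-1], x[1:])[0, 1]
    return (c - target_corr) ** 2

x = rng.uniform(0.15, 0.35, n)          # initial random porosity field
for i, v in cond.items():
    x[i] = v                            # honor the conditioning data exactly

T = 0.1
energy = objective(x)
for _ in range(20000):
    j = int(rng.integers(n))
    if j in cond:                       # never perturb conditioning cells
        continue
    old = x[j]
    x[j] = float(np.clip(old + rng.normal(0.0, 0.02), 0.15, 0.35))
    e = objective(x)
    if e < energy or rng.random() < np.exp((energy - e) / T):
        energy = e                      # accept the perturbation
    else:
        x[j] = old                      # reject: restore the old value
    T *= 0.999                          # geometric cooling schedule
print(round(float(energy), 4))
```

The occasional acceptance of uphill moves early on (while T is large) is what lets the realization escape local minima before the cooling schedule freezes it near the target statistics, and the untouched conditioning cells are what make the simulation "conditional".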

Relevance: 10.00%

Abstract:

Existing models of equilibrium unemployment with endogenous labor market participation are complex, generate procyclical unemployment rates and cannot match unemployment variability relative to GDP. We embed endogenous participation in a simple, tractable job market matching model, show analytically how variations in the participation rate are driven by the cross-sectional density of home productivity near the participation threshold, and how this density translates into an extensive-margin labor supply elasticity. A calibration of the model to macro data not only matches employment and participation variabilities but also generates strongly countercyclical unemployment rates. With some wage rigidity the model also matches unemployment variations well. Furthermore, the labor supply elasticity implied by our calibration is consistent with microeconometric evidence for the US.

Relevance: 10.00%

Abstract:

A new integrated assessment model is presented for the north-central stock of the Peruvian anchoveta that makes it possible to reconstruct and track the length structure of the stock from an age-based model. The model was calibrated using acoustic biomass estimates and size structures from scientific surveys and from fishery landings. Calibration used an evolutionary algorithm with different fitness functions for each calibrated variable (biomasses and catches). Monthly estimates of total biomass, spawning biomass, recruitment and fishing mortality obtained by the integrated assessment model are presented for the period 1964-2008. Three qualitatively distinct periods were found in the anchoveta dynamics, 1961-1971, 1971-1991 and 1991 to the present, distinguished both by mean annual biomasses and by the observed recruitment levels.
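The evolutionary calibration loop can be illustrated with a deliberately simplified stand-in for the stock model: a two-parameter surplus-production biomass recursion fitted to hypothetical acoustic biomass estimates (the actual model is age-based and fits length structures and catches with separate fitness functions; all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy biomass recursion B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - catch[t];
# we calibrate (r, K) against hypothetical acoustic biomass estimates (million t).
catch = np.array([1.0, 1.5, 2.0, 1.2, 0.8])
obs = np.array([8.0, 8.3, 8.1, 7.6, 7.9, 8.4])

def predict(r, K, b0=8.0):
    b = [b0]
    for c in catch:
        b.append(max(b[-1] + r * b[-1] * (1 - b[-1] / K) - c, 0.01))
    return np.array(b)

def fitness(p):
    """Sum of squared deviations from the acoustic estimates (lower is better)."""
    r, K = p
    return np.sum((predict(r, K) - obs) ** 2)

# (mu + lambda) evolution strategy: mutate every parent, pool parents and
# offspring, and keep the best 30 candidates each generation.
pop = np.column_stack([rng.uniform(0.1, 1.0, 30), rng.uniform(5.0, 20.0, 30)])
for _ in range(200):
    kids = pop + rng.normal(0.0, [0.02, 0.2], pop.shape)   # Gaussian mutation
    kids = np.clip(kids, [0.01, 1.0], [2.0, 50.0])         # keep parameters feasible
    both = np.vstack([pop, kids])
    both = both[np.argsort([fitness(p) for p in both])]
    pop = both[:30]

best_r, best_K = pop[0]
print(round(float(best_r), 3), round(float(best_K), 2), round(float(fitness(pop[0])), 4))
```

The real calibration evaluates several fitness functions (one per calibrated variable) rather than a single sum of squares, but the generate-mutate-select loop is the same.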