953 results for Breakdown Criterion
Abstract:
The tumour necrosis factor (TNF) family members B cell activating factor (BAFF) and APRIL (a proliferation-inducing ligand) are crucial survival factors for peripheral B cells. An excess of BAFF leads to the development of autoimmune disorders in animal models, and high levels of BAFF have been detected in the serum of patients with various autoimmune conditions. In this Review, we consider the possibility that in mice autoimmunity induced by BAFF is linked to T cell-independent B cell activation rather than to a severe breakdown of B cell tolerance. We also outline the mechanisms of BAFF signalling, the impact of ligand oligomerization on receptor activation and the progress of BAFF-depleting agents in the clinical setting.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be evaluated against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Computer-assisted TDM is attracting growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
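The a posteriori Bayesian adjustment step that the survey describes can be sketched numerically. The following is a minimal illustration, not any of the surveyed tools: it assumes a hypothetical one-compartment IV bolus model with a lognormal prior on clearance and lognormal residual error, and uses a grid search where production TDM software would use Newton-type optimizers. All parameter values are illustrative.

```python
import math

def map_clearance(doses, times, concs, v=50.0,
                  cl_prior=5.0, omega=0.3, sigma=0.2):
    """MAP estimate of clearance for a 1-compartment IV bolus model.

    C(t) = (D / V) * exp(-(CL / V) * t); lognormal prior on CL with
    typical value cl_prior and between-subject variability omega;
    lognormal residual error sigma. A grid search stands in for the
    Newton/EM optimizers that real TDM tools use.
    """
    def neg_log_post(cl):
        k = cl / v
        nlp = (math.log(cl / cl_prior) ** 2) / (2 * omega ** 2)  # prior
        for d, t, c in zip(doses, times, concs):
            pred = (d / v) * math.exp(-k * t)
            nlp += (math.log(c / pred) ** 2) / (2 * sigma ** 2)  # likelihood
        return nlp

    grid = [cl_prior * math.exp(x / 100.0) for x in range(-150, 151)]
    return min(grid, key=neg_log_post)
```

A measured level below the population prediction pulls the individual clearance estimate above its prior typical value, which is the mechanism behind a posteriori dose adaptation.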
Abstract:
The breakdown of the Bretton Woods system and the adoption of generalized floating exchange rates ushered in a new era of exchange rate volatility and uncertainty. This increased volatility led economists to search for economic models able to describe observed exchange rate behavior. In the present paper we propose more general STAR transition functions which encompass both threshold nonlinearity and asymmetric effects. Our framework allows for a gradual adjustment from one regime to another, and considers threshold effects by encompassing other existing models, such as TAR models. We apply our methodology to three different exchange rate data sets: the first for developing countries, using official nominal exchange rates; the second for emerging market economies, using black market exchange rates; and the third for OECD economies.
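The building block that the proposed functions generalize is the standard first-order logistic STAR transition, which can be written down in a few lines (the paper's more general forms, which add asymmetry, are not reproduced here):

```python
import math

def lstar_transition(s, gamma, c):
    """First-order logistic STAR transition G(s; gamma, c) in [0, 1].

    gamma controls smoothness: as gamma grows, G approaches a step
    function at the threshold c, recovering a TAR model as a limiting
    case; small gamma gives a gradual regime change.
    """
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))
```

At the threshold itself G equals 0.5, and the regime weights attached to the two linear sub-models are G and 1 - G.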
Abstract:
We test the real interest rate parity hypothesis using data for the G7 countries over the period 1970-2008. Our contribution is two-fold. First, we utilize the ARDL bounds approach of Pesaran et al. (2001) which allows us to overcome uncertainty about the order of integration of real interest rates. Second, we test for structural breaks in the underlying relationship using the multiple structural breaks test of Bai and Perron (1998, 2003). Our results indicate significant parameter instability and suggest that, despite the advances in economic and financial integration, real interest rate parity has not fully recovered from a breakdown in the 1980s.
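The structural break detection the abstract relies on can be illustrated in its simplest one-break, mean-shift form. This is a deliberate simplification: Bai and Perron's procedure handles multiple breaks, general regressors and formal inference, none of which is attempted here.

```python
def single_break(series, trim=2):
    """Locate one mean-shift break by minimizing the total SSR.

    For each admissible break point, fit a separate mean to each
    segment and pick the split with the smallest combined sum of
    squared residuals (a one-break simplification of the Bai-Perron
    least-squares principle).
    """
    def ssr(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg)

    best = min(range(trim, len(series) - trim),
               key=lambda b: ssr(series[:b]) + ssr(series[b:]))
    return best
```

Applied to a real interest rate differential, a detected shift of this kind is what the abstract interprets as parameter instability in the parity relationship.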
Abstract:
The breakdown of the Bretton Woods system and the adoption of generalized floating exchange rates ushered in a new era of exchange rate volatility and uncertainty. This increased volatility led economists to search for economic models able to describe observed exchange rate behavior. The present document is a technical appendix to Cerrato et al. (2009); it presents detailed simulations of the proposed methodology and additional empirical results.
Abstract:
We extend the linear reforms introduced by Pfähler (1984) to the case of dual taxes. We study the relative effect that linear dual tax cuts have on the inequality of the income distribution (a symmetrical analysis can be made for linear dual tax hikes). We also introduce measures of the degree of progressivity for dual taxes and show that they can be connected to the Lorenz dominance criterion. Additionally, we study the tax liability elasticity of each of the reforms proposed. Finally, by means of a microsimulation model and a considerably large data set of taxpayers drawn from the 2004 Spanish Income Tax Return population, (1) we compare different yield-equivalent tax cuts applied to the Spanish dual income tax, and (2) we investigate how much income redistribution the dual tax reform (Act 35/2006) introduced with respect to the previous tax.
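The tax liability elasticity the abstract studies has a closed form for the simplest linear schedule. The sketch below is a generic textbook illustration with made-up numbers, not the paper's dual-tax specification:

```python
def liability_elasticity(x, rate, allowance):
    """Liability progression of a linear tax T(x) = rate * (x - allowance).

    Elasticity = x * T'(x) / T(x) = x / (x - allowance); values above 1
    indicate a progressive schedule, and the elasticity falls toward 1
    as income grows, so a linear tax with an allowance is progressive
    but decreasingly so.
    """
    liability = rate * (x - allowance)
    return x * rate / liability  # = x / (x - allowance)
```

Comparing this elasticity across yield-equivalent reforms is one way to rank their progressivity effects, which is the kind of exercise the microsimulation in the paper performs at scale.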
Abstract:
Executive Summary Many commentators have criticised the strategy currently used to finance the Scottish Parliament – both the block grant system, and the small degree of fiscal autonomy devised in the Calman report and the UK government's 2009 White Paper. Nevertheless, fiscal autonomy has now been conceded in principle. This paper sets out to identify formally what level of autonomy would be best for the Scottish economy and the institutional changes needed to support that arrangement. Our conclusions are in line with the Steel Commission: that significantly more fiscal powers need to be transferred to Scotland. But what we can then do, which the Steel Commission could not, is to give a detailed blueprint for how this proposal might be implemented in practice. We face two problems. The existing block grant system can be, and has been, criticised from such a wide variety of points of view that it effectively has no credibility left. On the other hand, the Calman proposals (and the UK government proposals that followed) are unworkable because, to function, they require information that the policy makers cannot possibly have; and because, without borrowing for current activities, they contain no mechanism to reconcile contractual spending (most of the budget) with variable revenue flows – which is to invite an eventual breakdown. But in its attempt to fix these problems, the UK White Paper introduces three further difficulties: new grounds for quarrels between the UK and Scottish governments, a long-term deflation bias, and a loss of devolution.
Abstract:
Accurate chromosome segregation during mitosis is temporally and spatially coordinated by fidelity-monitoring checkpoint systems. Deficiencies in these checkpoint systems can lead to chromosome segregation errors and aneuploidy, and promote tumorigenesis. Here, we report that the TRAF-interacting protein (TRAIP), a ubiquitously expressed nucleolar E3 ubiquitin ligase important for cellular proliferation, is localized close to mitotic chromosomes. Its knockdown in HeLa cells by RNA interference (RNAi) decreased the time of early mitosis progression from nuclear envelope breakdown (NEB) to anaphase onset and increased the percentages of chromosome alignment defects in metaphase and lagging chromosomes in anaphase compared with those of control cells. The decrease in progression time was corrected by the expression of wild-type but not a ubiquitin-ligase-deficient form of TRAIP. TRAIP-depleted cells bypassed taxol-induced mitotic arrest and displayed significantly reduced kinetochore levels of MAD2 (also known as MAD2L1) but not of other spindle checkpoint proteins in the presence of nocodazole. These results imply that TRAIP regulates the spindle assembly checkpoint, MAD2 abundance at kinetochores and the accurate cellular distribution of chromosomes. The TRAIP ubiquitin ligase activity is functionally required for the spindle assembly checkpoint control.
Abstract:
In this paper, we develop numerical algorithms that use small requirements of storage and operations for the computation of invariant tori in Hamiltonian systems (exact symplectic maps and Hamiltonian vector fields). The algorithms are based on the parameterization method and follow closely the proof of the KAM theorem given in [LGJV05] and [FLS07]. They essentially consist of solving a functional equation satisfied by the invariant tori using a Newton method. Using some geometric identities, it is possible to perform a Newton step with little storage and few operations. In this paper we focus on the numerical issues of the algorithms (speed, storage and stability) and we refer to the mentioned papers for the rigorous results. We show how to compute efficiently both maximal invariant tori and whiskered tori, together with the associated invariant stable and unstable manifolds of whiskered tori. Moreover, we present fast algorithms for the iteration of the quasi-periodic cocycles and the computation of the invariant bundles, which is a preliminary step for the computation of invariant whiskered tori. Since quasi-periodic cocycles appear in other contexts, this section may be of independent interest. The numerical methods presented here allow us to compute in a unified way primary and secondary invariant KAM tori. Secondary tori are invariant tori which can be contracted to a periodic orbit. We present some preliminary results that ensure that the methods are indeed implementable and fast. We postpone to a future paper optimized implementations and results on the breakdown of invariant tori.
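The functional equation at the heart of the parameterization method is the invariance equation F(K(θ)) = K(θ + ω) for a candidate torus K and rotation ω. A minimal, assumption-laden illustration: evaluate the residual of this equation for the Chirikov standard map with the trivial candidate K(θ) = (θ, ω), which is exactly invariant only in the integrable case ε = 0. A Newton step of the actual algorithm would correct K to drive this residual toward zero.

```python
import math

def invariance_error(eps, omega, n=256):
    """Max residual of F(K(theta)) - K(theta + omega) on a theta-grid.

    F is the standard map y' = y + eps*sin(2*pi*x)/(2*pi), x' = x + y',
    and K(theta) = (theta, omega) is the trivial candidate torus, which
    is exactly invariant when eps = 0.
    """
    err = 0.0
    for i in range(n):
        th = i / n
        x, y = th, omega                      # K(theta)
        y2 = y + eps * math.sin(2 * math.pi * x) / (2 * math.pi)
        x2 = x + y2                           # F(K(theta))
        tx, ty = th + omega, omega            # K(theta + omega)
        err = max(err, abs(x2 - tx), abs(y2 - ty))
    return err
```

Representing K by its Fourier coefficients is what lets the real algorithms perform the Newton correction with the small storage and operation counts the abstract emphasizes.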
Abstract:
This paper is an investigation into the dynamics of asset markets with adverse selection a la Akerlof (1970). The particular question asked is: can market failure at some later date precipitate market failure at an earlier date? The answer is yes: there can be "contagious illiquidity" from the future back to the present. The mechanism works as follows. If the market is expected to break down in the future, then agents holding assets they know to be lemons (assets with low returns) will be forced to hold them for longer - they cannot quickly resell them. As a result, the effective difference in payoff between a lemon and a good asset is greater. But it is known from the static Akerlof model that the greater the payoff differential between lemons and non-lemons, the more likely is the market to break down. Hence market failure in the future is more likely to lead to market failure today. Conversely, if the market is not anticipated to break down in the future, assets can be readily sold and hence an agent discovering that his or her asset is a lemon can quickly jettison it. In effect, there is little difference in payoff between a lemon and a good asset. The logic of the static Akerlof model then runs the other way: the small payoff differential is unlikely to lead to market breakdown today. The conclusion of the paper is that the nature of today's market - liquid or illiquid - hinges critically on the nature of tomorrow's market, which in turn depends on the next day's, and so on. The tail wags the dog.
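The static Akerlof mechanism the argument leans on can be made concrete with a toy calculation (all numbers illustrative, not from the paper): trade survives only if the pooled price buyers are willing to pay reaches good-asset holders' reservation value, so widening the payoff gap between lemons and good assets tips the market into breakdown.

```python
def market_breaks_down(q, v_good, v_lemon, reservation_good):
    """Static Akerlof test: does the pooled price fail to clear?

    Buyers pay at most the average quality q*v_lemon + (1-q)*v_good.
    Good-asset holders sell only if that price reaches their
    reservation value, so a larger good/lemon payoff gap (lower
    v_lemon) makes breakdown more likely.
    """
    pooled_price = q * v_lemon + (1 - q) * v_good
    return pooled_price < reservation_good
```

In the dynamic story, an anticipated future breakdown lowers the effective payoff of a lemon (it must be held longer), which plays the role of lowering v_lemon here and so makes today's breakdown condition easier to satisfy.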
Abstract:
OBJECTIVE: To test the accuracy of a new pulse oximeter sensor based on transmittance and reflectance. This sensor makes transillumination of tissue unnecessary and allows measurements on the hand, forearm, foot, and lower limb. DESIGN: Prospective, open, nonrandomized criterion standard study. SETTING: Neonatal intensive care unit, tertiary care center. PATIENTS: Sequential sample of 54 critically ill neonates (gestational age 27 to 42 wks; postnatal age 1 to 28 days) with arterial catheters in place. MEASUREMENTS AND MAIN RESULTS: A total of 99 comparisons between pulse oximetry and arterial saturation were obtained. Comparison of femoral or umbilical arterial blood with transcutaneous measurements on the lower limb (n = 66) demonstrated an excellent correlation (r2 = 0.96). The mean difference was +1.44% +/- 3.51% (SD) (range -11% to +8%). Comparison of the transcutaneous values with the radial artery saturation from the corresponding upper limb (n = 33) revealed a correlation coefficient of 0.94 with a mean error of +0.66% +/- 3.34% (range -6% to +7%). The mean difference between noninvasive and invasive measurements was least with the test sensor on the hand, intermediate on the calf and arm, and greatest on the foot. The mean error and its standard deviation were slightly larger for arterial saturation values < 90% than for values >= 90%. CONCLUSION: Accurate pulse oximetry saturation can be acquired from the hand, forearm, foot, and calf of critically ill newborns using this new sensor.
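The agreement statistics reported (mean difference and its standard deviation between paired noninvasive and arterial readings) are straightforward to compute; a small sketch with hypothetical readings:

```python
import statistics

def agreement(pulse_ox, arterial):
    """Bias and spread between paired SpO2 and SaO2 readings.

    Returns the mean difference (pulse oximeter minus arterial) and
    its standard deviation, the form in which agreement studies of
    this kind report results (e.g. +1.44 +/- 3.51%).
    """
    diffs = [p - a for p, a in zip(pulse_ox, arterial)]
    return statistics.mean(diffs), statistics.stdev(diffs)
```

A positive mean difference indicates the test sensor reads high on average; the SD of the differences bounds the disagreement expected for an individual measurement.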
Abstract:
In this paper we study decision making in situations where the individual’s preferences are not assumed to be complete. First, we identify conditions that are necessary and sufficient for choice behavior in general domains to be consistent with maximization of a possibly incomplete preference relation. In this model of maximally dominant choice, the agent defers/avoids choosing at those and only those menus where a most preferred option does not exist. This allows for simple explanations of conflict-induced deferral and choice overload. It also suggests a criterion for distinguishing between indifference and incomparability based on observable data. A simple extension of this model also incorporates decision costs and provides a theoretical framework that is compatible with the experimental design that we propose to elicit possibly incomplete preferences in the lab. The design builds on the introduction of monetary costs that induce choice of a most preferred feasible option if one exists and deferral otherwise. Based on this design we found evidence suggesting that a quarter of the subjects in our study had incomplete preferences, and that these subjects made significantly more consistent choices than a group of subjects who were forced to choose. The latter effect, however, is mitigated once data on indifferences are accounted for.
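The maximally dominant choice rule admits a compact sketch. The encoding below is an illustrative assumption (a strict preference given as a set of ordered pairs), not the paper's formal framework: the agent picks an option that dominates every other option in the menu, and defers when no such option exists, for example because two alternatives are incomparable.

```python
def choose_or_defer(menu, better):
    """Maximally dominant choice under a possibly incomplete preference.

    better is a set of ordered pairs (a, b) meaning 'a strictly
    preferred to b'. Return an option that dominates every other
    option in the menu; return None (defer) when no most preferred
    option exists, e.g. because two options are incomparable.
    """
    for x in menu:
        if all((x, y) in better for y in menu if y != x):
            return x
    return None
```

Deferral at exactly the menus lacking a dominant option is what lets observed choice data separate incomparability (deferral) from indifference (choice of either option).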
Abstract:
We study the functional specialization whereby some countries contribute relatively more inventors vs. organizations in the production of inventions at a global scale. We propose a conceptual framework to explain this type of functional specialization, which posits the presence of feedbacks between two distinct sub-systems, each one providing inventors and organizations. We quantify the phenomenon by means of a new metric, the “inventor balance”, which we compute using patent data. We show that the observed imbalances, which are often conspicuous, are determined by several factors: the innovativeness of a country relative to its level of economic development, relative factor endowments, the degree of technological specialization and, last, cultural traits. We argue that the “inventor balance” is a useful indicator for policy makers, and its routine analysis could lead to better informed innovation policies.
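The abstract does not spell out how the "inventor balance" is computed; one plausible formalization (an assumption for illustration only, not the paper's definition) is a country's share of patent inventors relative to its share of patent applicant organizations:

```python
def inventor_balance(inventors, applicants, country):
    """Hypothetical inventor-balance index from patent counts.

    inventors/applicants map country -> counts of inventor and
    applicant (organization) addresses on patents. A value above 1
    means the country contributes relatively more inventors than
    organizations; below 1, the reverse. This formalization is an
    assumption, not the paper's exact metric.
    """
    inv_share = inventors[country] / sum(inventors.values())
    app_share = applicants[country] / sum(applicants.values())
    return inv_share / app_share
```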
Abstract:
A statistical methodology is developed by which realised outcomes can be used to identify, for calendar years between 1974 and 2012, when policy makers in ‘advanced’ economies have successfully pursued single objectives of different kinds, or multiple objectives. A simple criterion is then used to distinguish between multiple objectives pure and simple and multiple objectives subject to a price stability constraint. The overall and individual country results which this methodology produces seem broadly plausible. Unconditional and conditional analyses of the inflation and growth associated with different types of objectives reveal that multiple objectives subject to a price stability constraint are associated with roughly as good economic performance as the single objective of inflation. A proposal is then made as to how the remit of an inflation-targeting central bank could be adjusted to allow it to pursue other objectives in extremis without losing the credibility effects associated with inflation targeting.
Abstract:
The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model based on a quadratic elastic potential and on Hill's quadratic yield criterion is then developed and implemented in a computer program for applications in bone mechanics. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
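The projection step being generalized can be illustrated in its simplest setting. The sketch below is the textbook one-dimensional return mapping for elastic perfectly plastic response with the yield criterion |sigma| <= yield stress; the paper's contribution is extending this projection to nonlinear elasticity and Hill's quadratic anisotropic criterion, which is not attempted here.

```python
def return_map(strain, plastic_strain, youngs, yield_stress):
    """One return-mapping step for 1-D elastic perfectly plastic response.

    Compute the elastic trial stress; if it violates the yield
    criterion |sigma| <= yield_stress, project it back onto the yield
    surface and update the plastic strain accordingly.
    """
    trial = youngs * (strain - plastic_strain)       # elastic predictor
    f = abs(trial) - yield_stress                    # yield function value
    if f <= 0.0:
        return trial, plastic_strain                 # elastic step
    sign = 1.0 if trial > 0 else -1.0
    dgamma = f / youngs                              # plastic corrector
    return sign * yield_stress, plastic_strain + sign * dgamma
```

The "implicit projection" of the abstract is this predictor-corrector structure: the trial state is computed as if the step were elastic, then projected back to satisfy the yield constraint.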