979 results for "cost estimating tools"


Relevance: 20.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold-standard TDM approach but require computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software. The number of drugs handled varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex, and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be judged against the individual needs of hospitals or clinicians. Programs should be easy and fast to use in routine activities, even for inexperienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information-system interfacing, user-friendliness, data storage capability, and report generation.
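The a posteriori Bayesian adjustment these tools perform can be illustrated with a minimal sketch: a one-compartment IV bolus model and a grid-search MAP estimate of an individual patient's clearance from a single measured concentration. All parameter values below (population clearance `cl_pop`, volume `V`, variabilities `omega` and `sigma`, dose, and measurement) are hypothetical; real TDM software uses full population models and proper numerical optimizers rather than a grid.

```python
import math

# One-compartment IV bolus model: C(t) = (dose / V) * exp(-(CL / V) * t).
# All numeric values here are invented for illustration only.

def concentration(dose, V, CL, t):
    return (dose / V) * math.exp(-(CL / V) * t)

def map_clearance(dose, V, t_obs, c_obs,
                  cl_pop=5.0, omega=0.3, sigma=0.5):
    """MAP estimate of clearance by grid search.

    Prior:      ln(CL) ~ Normal(ln(cl_pop), omega^2)  (between-subject variability)
    Likelihood: c_obs  ~ Normal(C(t_obs), sigma^2)    (residual assay error)
    """
    best_cl, best_obj = None, float("inf")
    for i in range(1, 2000):
        cl = i * 0.01  # grid over 0.01 .. 19.99 L/h
        pred = concentration(dose, V, cl, t_obs)
        obj = ((c_obs - pred) / sigma) ** 2 \
            + ((math.log(cl) - math.log(cl_pop)) / omega) ** 2
        if obj < best_obj:
            best_cl, best_obj = cl, obj
    return best_cl

# A measured concentration above the population-model prediction pulls the
# individual clearance estimate below the population value of 5 L/h.
cl_hat = map_clearance(dose=500.0, V=30.0, t_obs=8.0, c_obs=6.0)
```

The individualized clearance can then be used to recompute the dose needed to keep predicted concentrations in the target range, which is the a posteriori adjustment step the abstract describes.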


Introduction: Coordination is a strategy chosen by the central nervous system to control movement and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination remains a challenge. Surgeons lack a way to model and appreciate the coordination of patients before and after surgery of the lower limbs. Patients alter their gait patterns and kinematic synergies when they walk faster or slower than normal speed, in order to maintain stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-system approach to quantitatively describe human gait coordination and apply it to patients before and after total knee arthroplasty. Methods: A new method of quantitative analysis of interjoint coordination during gait was designed, providing a general model to capture the whole dynamics and reveal the kinematic synergies at various walking speeds. The proposed model imposed a relationship among lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. Integrating different analysis tools, such as harmonic analysis, principal component analysis, and artificial neural networks, helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten patients were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at three different speeds and to complete an EQ-5D questionnaire, a WOMAC, and a Knee Society Score. Lower-limb rotations were measured by four miniature angular-rate sensors mounted on each shank and thigh.
The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months, and 1 year, were compared with those of two age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging from 0 to 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between pre-operative and post-operative coordination values were correlated with improvements in the subjective outcome scores. Although the study group was small, the results show a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective outcome of lower-limb surgery.
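The idea of condensing interjoint kinematics into a bounded 0-10 score can be illustrated with a deliberately simplified sketch: score the phase-locking of two joint-angle time series via their correlation. The study's actual pipeline (harmonic analysis, principal component analysis, and a neural network over hip and knee angles from body-fixed sensors) is far richer; the signals and the scoring rule below are invented for illustration.

```python
import math

# Toy 0-10 "coordination" score from the correlation of two joint-angle
# series. Perfectly phase-locked joints score 10, uncorrelated joints 0.
# This is an illustrative stand-in, not the paper's method.

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def coordination_score(hip, knee):
    return 10.0 * abs(correlation(hip, knee))

# Synthetic gait cycle: knee flexion lagging hip flexion by a fixed phase.
t = [i / 100 for i in range(100)]
hip = [30 * math.sin(2 * math.pi * x) for x in t]
knee = [60 * math.sin(2 * math.pi * x - 0.4) for x in t]
score = coordination_score(hip, knee)
```

Note that the score is amplitude-invariant (the knee's larger range of motion does not change it), which mirrors the intent of a coordination measure: it captures timing relationships between joints rather than their individual excursions.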


We propose a theoretical analysis of democratization processes in which an elite extends the franchise to the poor when threatened with a revolution. The poor could govern without changing the political system by maintaining a continuous revolutionary threat on the elite. Revolutionary threats, however, are costly to the poor, and democracy is a superior system in which political agreement is reached through costless voting. This provides a rationale for democratic transitions that has not been discussed in the literature.


We study the screening problem that arises in a framework where, initially, the agent is privately informed about both the expected production cost and the cost variability and, at a later stage, he learns privately the cost realization. The specific set of relevant incentive constraints, and so the characteristics of the optimal mechanism, depend finely upon the curvature of the principal's marginal surplus function as well as the relative importance of the two initial information problems. Pooling of production levels is optimally induced with respect to the cost variability when the principal's knowledge imperfection about the latter is sufficiently less important than that about the expected cost.


In this paper we show that the inclusion of unemployment-tenure interaction variates in Mincer wage equations is subject to serious pitfalls. These variates were designed to test whether or not the sensitivity of a worker’s wage to the business cycle varies with her tenure. We show that three canonical variates used in the literature - the minimum unemployment rate during a worker’s time at the firm (min u), the unemployment rate at the start of her tenure (Su), and the current unemployment rate interacted with a new-hire dummy (δu) - can all be significant and "correctly" signed even when each worker in the firm receives the same wage, regardless of tenure (equal treatment). In matched data the problem can be resolved by including firm-year interaction dummies in the panel. In unmatched data, where this is not possible, we propose a solution for min u and Su based on Solon, Barsky and Parker’s (1994) two-step method. This method is sub-optimal because it ignores a large amount of cross-tenure variation in average wages and is only valid when the scaled covariances of firm wages and firm employment are acyclical. Unfortunately, δu cannot be identified in unmatched data because a differential wage response to unemployment of new hires and incumbents will appear under both equal treatment and unequal treatment.
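The construction of the three variates can be sketched concretely. The panel below is invented (one worker's spell at one firm, hypothetical unemployment rates), and min u is built as the minimum rate observed since hire up to the current year, one common implementation of the variate.

```python
# Hypothetical worker-year panel illustrating the three unemployment-tenure
# variates: min u, Su, and the new-hire interaction δu.

unemployment = {2018: 4.0, 2019: 3.7, 2020: 8.1, 2021: 2.9}

# One worker's spell at a firm: hired in 2019, observed through 2021.
spell_years = [2019, 2020, 2021]

rows = []
for year in spell_years:
    tenure = year - spell_years[0]
    rows.append({
        "year": year,
        "u": unemployment[year],                # current unemployment rate
        "su": unemployment[spell_years[0]],     # Su: rate at start of tenure
        "min_u": min(unemployment[y]            # min u: minimum over the
                     for y in spell_years       # spell up to the current year
                     if y <= year),
        "new_hire_x_u": unemployment[year] * (tenure == 0),  # δu interaction
    })
```

Su stays frozen at the 2019 value for the whole spell, min u falls only when a new low is observed, and the δu interaction is nonzero only in the hiring year - which is exactly why, as the abstract argues, these variates can pick up firm-level composition effects rather than genuine tenure-dependent wage cyclicality.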


This paper uses forecasts from the European Central Bank's Survey of Professional Forecasters to investigate the relationship between inflation and inflation expectations in the euro area. We use theoretical structures based on the New Keynesian and Neoclassical Phillips curves to inform our empirical work. Given the relatively short data span of the Survey of Professional Forecasters and the need to control for many explanatory variables, we use dynamic model averaging in order to ensure a parsimonious econometric specification. We use both regression-based and VAR-based methods. We find no support for the backward-looking behavior embedded in the Neoclassical Phillips curve. Much more support is found for the forward-looking behavior of the New Keynesian Phillips curve, but most of this support is found after the beginning of the financial crisis.
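The core recursion of dynamic model averaging can be sketched briefly: model probabilities are flattened by a forgetting factor, then re-weighted by each model's one-step predictive likelihood, so weight migrates over time toward whichever specification forecasts well. The forecast errors, the forgetting factor, and the Gaussian likelihood below are illustrative, not estimates from the paper.

```python
import math

# Minimal sketch of the DMA model-probability update with forgetting
# factor alpha. Synthetic forecast errors for two competing models.

def dma_weights(forecast_errors, alpha=0.99, sigma=1.0):
    """forecast_errors: list over time of per-model forecast errors."""
    n_models = len(forecast_errors[0])
    probs = [1.0 / n_models] * n_models
    for errors in forecast_errors:
        # Forgetting step: raise probabilities to alpha and renormalize,
        # which gently pulls weights back toward equality.
        flat = [p ** alpha for p in probs]
        s = sum(flat)
        flat = [p / s for p in flat]
        # Update with each model's Gaussian predictive likelihood.
        lik = [math.exp(-0.5 * (e / sigma) ** 2) for e in errors]
        probs = [f * l for f, l in zip(flat, lik)]
        s = sum(probs)
        probs = [p / s for p in probs]
    return probs

# Model 0 forecasts consistently better (smaller errors) than model 1,
# so it accumulates most of the weight.
errors = [(0.1, 0.8), (0.2, 1.1), (0.1, 0.9), (0.3, 1.2)]
weights = dma_weights(errors)
```

With many candidate regressor sets, only the well-forecasting specifications retain non-negligible weight at each date, which is how DMA delivers the parsimony the abstract mentions despite a short data span.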


This project will develop a modelling framework to explain changes in income-related health inequalities and benchmark the performance of Scotland in tackling income-related health inequalities, both over time and relative to that of England and Wales.


‘Modern’ Phillips curve theories predict that inflation is an integrated, or near-integrated, process. However, inflation appears bounded above and below in developed economies and so cannot be ‘truly’ integrated; it is more likely stationary around a shifting mean. If agents believe inflation is integrated, as in the ‘modern’ theories, then they are making systematic errors about the statistical process of inflation. An alternative theory of the Phillips curve is developed that is consistent with the ‘true’ statistical process of inflation. It is demonstrated that United States inflation data are consistent with the alternative theory but not with the existing ‘modern’ theories.
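The statistical distinction the abstract rests on can be made concrete with simulated data: for an integrated (random walk) series, the variance of k-step changes grows roughly linearly in k, whereas for a stationary series it flattens out at twice the unconditional variance. The processes below are illustrative, not fitted to inflation data.

```python
import random
import statistics

# Random walk (integrated) versus mean-reverting AR(1) (stationary):
# compare how the variance of k-step changes scales with k.

random.seed(0)
T = 5000

walk, level = [], 0.0
for _ in range(T):
    level += random.gauss(0, 1)
    walk.append(level)

stationary, x = [], 0.0
for _ in range(T):
    x = 0.8 * x + random.gauss(0, 1)   # AR(1), pulled back toward its mean
    stationary.append(x)

def kstep_change_var(series, k):
    return statistics.variance(series[i + k] - series[i]
                               for i in range(len(series) - k))

# Ratio of 50-step to 1-step change variance: near 50 for the walk
# (variance keeps accumulating), but bounded for the AR(1).
ratio_walk = kstep_change_var(walk, 50) / kstep_change_var(walk, 1)
ratio_ar = kstep_change_var(stationary, 50) / kstep_change_var(stationary, 1)
```

An agent who models the AR(1) series as a random walk would systematically overstate long-horizon uncertainty and miss the reversion toward the mean, which is the kind of systematic forecasting error the abstract attributes to agents who treat bounded inflation as integrated.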


After a large-scale field trial performed in central Brazil aimed at controlling Chagas' disease vectors in an endemic area colonized by Triatoma infestans and T. sordida, a cost-effectiveness analysis was performed for each insecticide/formulation. It considered the operational costs and the prices of the insecticides and formulations, in relation to the activity and persistence of each one. The end point was taken to be less than 90% of domiciliary units (house + annexes) free of infestation. The results showed good cost-effectiveness for a slow-release emulsifiable suspension (SRES) based on PVA and containing malathion as the active ingredient, as well as for the pyrethroids tested in this assay: cyfluthrin, cypermethrin, deltamethrin, and permethrin.
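The arithmetic behind such a comparison is simple to sketch: total cost (operational plus insecticide) divided by the number of domiciliary units kept free of infestation. Every price, coverage figure, and success rate below is invented; the trial's actual costs and outcomes are not reported in this abstract.

```python
# Hypothetical cost-effectiveness comparison of two formulations:
# cost per domiciliary unit kept free of infestation.

def cost_per_unit_free(op_cost, insecticide_cost, units_treated, pct_free):
    total = op_cost + insecticide_cost
    units_free = units_treated * pct_free / 100.0
    return total / units_free

# Same operational cost, different insecticide prices and success rates.
sres_malathion = cost_per_unit_free(4000.0, 1500.0, 500, 92.0)
pyrethroid = cost_per_unit_free(4000.0, 2600.0, 500, 94.0)
```

On these invented numbers the cheaper formulation wins despite a slightly lower success rate; persistence matters too, since a longer-lasting formulation spreads its cost over a longer period of protection before the 90%-free end point is crossed.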


NORTH SEA STUDY OCCASIONAL PAPER No. 112


This study evaluates the effect of an individual's household income on their health at the later stages of working life. A structural equation model is utilised in order to derive a composite, continuous index of latent health status from qualitative health status indicators. The endogenous relationship between health status and household income is taken into account by using IV estimators. The findings reveal a significant effect of household income on health both before and after endogeneity is taken into account, and after controlling for a host of other factors known to influence health, including hereditary factors and the individual's locus of control. Importantly, it is also shown that the individual's childhood socioeconomic position has long-lasting effects on health, as it appears to play a significant role in determining health during the later stages of working life.
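The IV logic can be shown in stripped-down form: when income is correlated with the error in the health equation, OLS is biased, but an instrument correlated with income and uncorrelated with that error recovers the causal slope via cov(z, y) / cov(z, x). The data below are simulated with a known true slope of 0.5; this is a textbook illustration of the estimator, not the study's specification.

```python
import random

# Simulated endogeneity: x (income) shares the error e with y (health),
# so OLS is biased; the instrument z affects x but not e.

random.seed(1)
n = 20000
z = [random.gauss(0, 1) for _ in range(n)]
e = [random.gauss(0, 1) for _ in range(n)]                   # health-equation error
x = [zi + ei + random.gauss(0, 1) for zi, ei in zip(z, e)]   # endogenous income
y = [0.5 * xi + ei for xi, ei in zip(x, e)]                  # latent health index

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

beta_ols = cov(x, y) / cov(x, x)   # biased: converges to ~0.83, not 0.5
beta_iv = cov(z, y) / cov(z, x)    # converges to the true slope 0.5
```

In the study the setup is richer (the health index itself comes from a structural equation model over qualitative indicators), but the reason the income effect survives the endogeneity correction is exactly the contrast between these two estimators.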


Most of the literature estimating DSGE models for monetary policy analysis assumes that policy follows a simple rule. In this paper we allow policy to be described by various forms of optimal policy - commitment, discretion and quasi-commitment. We find that, even after allowing for Markov switching in shock variances, the inflation target and/or rule parameters, the data-preferred description of policy is that the US Fed operates under discretion, with a marked increase in conservatism after the 1970s. Parameter estimates are similar to those obtained under simple rules, except that the degree of habits is significantly lower and the prevalence of cost-push shocks greater. Moreover, we find that the greatest welfare gains from the ‘Great Moderation’ arose from the reduction in the variances of the shocks hitting the economy, rather than from increased inflation aversion. However, much of the high inflation of the 1970s could have been avoided had policy makers been able to commit, even without adopting stronger anti-inflation objectives. More recently, the Fed appears to have temporarily relaxed policy following the 1987 stock market crash, and to have lost, without regaining, its post-Volcker conservatism following the bursting of the dot-com bubble in 2000.


This paper studies the wasteful effect of bureaucracy on the economy by addressing the link between the rent-seeking behavior of government bureaucrats and the public sector wage bill, which is taken to represent the rent component. In particular, public officials are modeled as individuals competing for a larger share of those public funds. The rent-seeking extraction technology in the government administration is modeled as in Murphy et al. (1991) and incorporated in an otherwise standard Real-Business-Cycle (RBC) framework with a public sector. The model is calibrated to German data for the period 1970-2007. The main findings are: (i) due to the existence of a significant public sector wage premium and high public sector employment, a substantial amount of working time is spent rent-seeking, which in turn leads to significant losses in terms of output; (ii) the measures of rent-seeking cost obtained from the model for the major EU countries are highly correlated with indices of bureaucratic inefficiency; (iii) under the optimal fiscal policy regime, steady-state rent-seeking is smaller relative to the exogenous policy case, as the government chooses a higher public wage premium but sets much lower public employment, thus achieving a decrease in rent-seeking.