965 results for linear-threshold model


Relevance: 30.00%

Abstract:

Labour market regulations aimed at enhancing job security are dominant in several OECD countries. These regulations seek to reduce dismissals of workers and fluctuations in employment. The main theoretical contribution is to gauge the effects of such regulations on labour demand across establishment sizes. To achieve this, we investigate an optimising model of labour demand under uncertainty using real option theory. We also consider other forms of employment that increase the flexibility of the labour market. In particular, we model the contribution of temporary employment agencies (Zeitarbeit), which allow for quick personnel adjustments in client firms. The calibration results indicate that labour market rigidities may be crucial for understanding sluggishness in firms' labour demand and the emergence and growth of temporary work.
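
As a rough Python sketch of the mechanism this abstract describes (not the authors' calibrated model), the value-function iteration below shows how linear hiring and firing costs create an inaction band in labour demand under a two-state demand shock; all parameter values are invented:

    # Value-function iteration sketch: labour demand with linear adjustment
    # costs under a two-state demand shock. All parameters are hypothetical.
    import numpy as np

    alpha, w, beta = 0.65, 1.0, 0.95          # technology, wage, discount factor
    c_hire, c_fire = 0.1, 0.5                 # linear adjustment costs per worker
    z_states = np.array([0.9, 1.1])           # low/high demand
    P = np.array([[0.8, 0.2], [0.2, 0.8]])    # Markov transition matrix
    n_grid = np.linspace(0.1, 2.0, 200)

    V = np.zeros((len(n_grid), 2))
    for _ in range(1000):
        EV = V @ P.T                                    # E[V(n', z') | z]
        V_new = np.empty_like(V)
        policy = np.empty_like(V, dtype=int)
        for iz, z in enumerate(z_states):
            dn = n_grid[None, :] - n_grid[:, None]      # n' - n
            adj = c_hire * np.maximum(dn, 0) + c_fire * np.maximum(-dn, 0)
            profit = z * n_grid[None, :] ** alpha - w * n_grid[None, :]
            obj = profit - adj + beta * EV[None, :, iz]
            policy[:, iz] = obj.argmax(axis=1)
            V_new[:, iz] = obj.max(axis=1)
        if np.max(np.abs(V_new - V)) < 1e-8:
            break
        V = V_new

    # Firms inside the inaction band keep n unchanged: firing costs make labour
    # demand sluggish, which is exactly where temporary agency work enters.
    stay_put = n_grid[policy[:, 0] == np.arange(len(n_grid))]
    print(f"inaction region in the low state: [{stay_put.min():.2f}, {stay_put.max():.2f}]")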

Relevance: 30.00%

Abstract:

We extend the linear reforms introduced by Pfähler (1984) to the case of dual taxes. We study the relative effect that linear dual tax cuts have on the inequality of the income distribution (a symmetrical analysis can be made for linear dual tax hikes). We also introduce measures of the degree of progressivity of dual taxes and show that they can be connected to the Lorenz dominance criterion. Additionally, we study the tax liability elasticity of each of the proposed reforms. Finally, by means of a microsimulation model and a large data set of taxpayers drawn from the 2004 Spanish Income Tax Return population, (1) we compare different yield-equivalent tax cuts applied to the Spanish dual income tax and (2) we investigate how much income redistribution the dual tax reform (Act 35/2006) introduced with respect to the previous tax.
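
A minimal illustration of the kind of comparison the paper performs, on simulated incomes rather than the Spanish tax-return microdata (the schedule, rates and distributions below are invented): two yield-equivalent linear cuts of a stylized dual tax, compared by their post-tax Gini coefficients:

    # Hedged sketch (not the paper's microsimulation): two yield-equivalent
    # linear cuts of a stylized dual tax, compared by post-tax Gini.
    import numpy as np

    rng = np.random.default_rng(0)
    labour = rng.lognormal(10.0, 0.6, 50_000)          # labour-income base
    capital = rng.lognormal(8.0, 1.0, 50_000)          # capital-income base

    def tax(labour, capital):
        # Dual tax: progressive piecewise-linear schedule on labour income,
        # flat 18% on capital income (illustrative rates only).
        t = np.where(labour > 60_000, 12_000 + 0.45 * (labour - 60_000), 0.2 * labour)
        return t + 0.18 * capital

    def gini(x):
        x = np.sort(x)
        n = len(x)
        return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

    gross = labour + capital
    t0 = tax(labour, capital)
    cut = 0.10 * t0.sum()                               # revenue to give back

    t_prop = t0 * (1 - cut / t0.sum())                  # cut proportional to liability
    t_lump = t0 - cut / len(t0)                         # equal per-capita cut (can go negative)
    print("baseline Gini(net) :", round(gini(gross - t0), 4))
    print("proportional cut   :", round(gini(gross - t_prop), 4))
    print("per-capita cut     :", round(gini(gross - t_lump), 4))

Pfähler-style results concern exactly this ranking: the two reforms raise the same revenue but redistribute differently.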

Relevance: 30.00%

Abstract:

We introduce duration-dependent skill decay among the unemployed into the New-Keynesian model with hiring frictions developed by Blanchard and Galí (2008). If the central bank responds only to (current, lagged or expected future) inflation and quarterly skill decay is above a threshold level, determinacy requires a coefficient on inflation smaller than one. The threshold level is plausible with little steady-state hiring and firing (the "Continental European calibration") but implausibly high in the opposite case (the "American calibration"). Neither interest-rate smoothing nor responding to the output gap helps to restore determinacy if skill decay exceeds the threshold level. However, a modest response to unemployment guarantees determinacy. Moreover, under indeterminacy, both an adverse sunspot shock and an adverse technology shock increase unemployment extremely persistently.
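
The determinacy question can be sketched with a standard Blanchard-Kahn eigenvalue count. The Python code below applies it to the textbook three-equation New-Keynesian model, not the Blanchard-Galí hiring-frictions model with skill decay, so it reproduces only the familiar Taylor principle rather than the paper's reversal of it; parameter values are illustrative:

    # Blanchard-Kahn determinacy check on the textbook three-equation NK model.
    # Determinacy requires as many eigenvalues outside the unit circle as there
    # are non-predetermined variables (here 2: the output gap and inflation).
    import numpy as np

    def determinate(phi_pi, beta=0.99, sigma=1.0, kappa=0.1):
        # E_t [x_{t+1}, pi_{t+1}]' = A [x_t, pi_t]'  with  i_t = phi_pi * pi_t
        A = np.array([[1 + kappa / (sigma * beta), (phi_pi - 1 / beta) / sigma],
                      [-kappa / beta,              1 / beta]])
        return np.sum(np.abs(np.linalg.eigvals(A)) > 1) == 2

    for phi in (0.8, 1.0, 1.5):
        print(f"phi_pi = {phi}: {'determinate' if determinate(phi) else 'indeterminate'}")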

Relevance: 30.00%

Abstract:

This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow trend inflation to evolve according to a random walk, ours is a bounded model that constrains trend inflation to lie in an interval, whose bounds can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. Because the bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state-space models cannot be used, we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find that the model works well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
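
A minimal simulation sketch of such a data-generating process (this is not the authors' posterior simulation algorithm; the interval and all variances below are invented) might look as follows:

    # Simulate a bounded trend-inflation state-space model: a random-walk trend
    # kept inside [LB, UB], plus a transitory AR component whose persistence
    # itself drifts over time.
    import numpy as np

    rng = np.random.default_rng(1)
    T, LB, UB = 400, 0.0, 5.0                 # quarters; trend bounded in [0, 5] %

    tau = np.empty(T); rho = np.empty(T); c = np.empty(T)
    tau[0], rho[0], c[0] = 2.0, 0.7, 0.0
    for t in range(1, T):
        # random-walk step for the trend, redrawn until it respects the bounds
        step = rng.normal(0, 0.15)
        while not LB <= tau[t - 1] + step <= UB:
            step = rng.normal(0, 0.15)
        tau[t] = tau[t - 1] + step
        # time-varying persistence of the transitory component, kept in (0, 1)
        rho[t] = np.clip(rho[t - 1] + rng.normal(0, 0.02), 0.01, 0.99)
        c[t] = rho[t] * c[t - 1] + rng.normal(0, 0.3)

    pi = tau + c                               # observed inflation
    print("trend range:", tau.min().round(2), "to", tau.max().round(2))
    print("inflation range:", pi.min().round(2), "to", pi.max().round(2))

Estimating the model from observed inflation is the hard part the paper addresses: the truncation makes the state equation non-Gaussian, so the Kalman-filter machinery no longer applies directly.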

Relevance: 30.00%

Abstract:

In this work we introduce and analyze a linear size-structured population model with infinite states-at-birth. We model the dynamics of a population in which individuals have two distinct life stages: an "active" phase, during which individuals grow, reproduce and die, and a "resting" phase, during which individuals only grow. Transition between these two phases depends on individual size. First we show that the problem is governed by a positive quasicontractive semigroup on the biologically relevant state space. Then we investigate, in the framework of the spectral theory of linear operators, the asymptotic behavior of solutions of the model. We prove that, under biologically plausible assumptions, the associated semigroup has the property of asynchronous exponential growth.
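
Asynchronous exponential growth can be illustrated in finite dimensions with a nonnegative projection matrix standing in for the semigroup: whatever the initial population, the normalized stage structure converges to the dominant eigenvector while total size grows geometrically. The matrix entries below are invented:

    # Finite-dimensional stand-in for the semigroup: a primitive projection
    # matrix over three stage/size classes.
    import numpy as np

    A = np.array([[0.0, 0.5, 1.2],      # reproduction by the larger classes
                  [0.8, 0.1, 0.0],      # growth/survival into class 2
                  [0.0, 0.7, 0.3]])     # growth/survival into class 3

    rng = np.random.default_rng(2)
    for trial in range(3):
        n = rng.random(3)                # arbitrary initial population
        for _ in range(200):
            n = A @ n
            n /= n.sum()                 # renormalize: keep only the "shape"
        print(f"trial {trial}: stable structure = {n.round(4)}")

    # Every initial condition converges to the same normalized structure, while
    # total population grows like lambda^t, lambda the dominant eigenvalue:
    print("growth rate lambda =", np.max(np.abs(np.linalg.eigvals(A))).round(4))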

Relevance: 30.00%

Abstract:

Asynchronous exponential growth has been studied extensively in population dynamics. In this paper we determine the asymptotic behaviour of a non-linear age-dependent model that takes sexual reproduction interactions into account. The main feature of our model is that the non-linear process converges to a linear one as the solution becomes large, so that the population undergoes asynchronous growth. The steady-state and stability analyses are carried out in full and summarized in a bifurcation diagram in terms of the parameter R0. Furthermore, the effect of intraspecific competition is taken into account, leading to complex dynamics around the steady states.

Relevance: 30.00%

Abstract:

The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic perfectly-plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any non-linear elastic law and any plastic yield function. A curvilinear transverse isotropic model, based on a quadratic elastic potential and on Hill's quadratic yield criterion, is then developed and implemented in a computer program with bone mechanics applications in view. The paper concludes with a numerical study of a schematic bone-prosthesis system that illustrates the potential of the model.
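
For intuition, here is the isotropic special case of such a projection: the classical radial-return mapping onto a von Mises surface for a perfectly plastic material. The paper replaces this criterion with Hill's anisotropic one; the moduli and yield stress below are merely plausible bone-scale values:

    # Return mapping (elastic predictor / plastic corrector) for isotropic,
    # perfectly plastic von Mises plasticity under small strain.
    import numpy as np

    E, nu, sigma_y = 15e9, 0.3, 120e6          # Young's modulus, Poisson, yield
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))
    mu = E / (2 * (1 + nu))

    def return_map(strain):
        """Map a small-strain tensor (3x3) to an admissible stress tensor."""
        trial = lam * np.trace(strain) * np.eye(3) + 2 * mu * strain   # elastic predictor
        s = trial - np.trace(trial) / 3 * np.eye(3)                    # deviatoric part
        q = np.sqrt(1.5 * np.tensordot(s, s))                          # von Mises stress
        if q <= sigma_y:                                               # elastic step
            return trial
        return trial - (1 - sigma_y / q) * s                           # project onto f = 0

    eps = np.diag([12e-3, -4e-3, -4e-3])
    sigma = return_map(eps)
    s = sigma - np.trace(sigma) / 3 * np.eye(3)
    print("von Mises stress after return:", np.sqrt(1.5 * np.tensordot(s, s)) / 1e6, "MPa")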

Relevance: 30.00%

Abstract:

An online algorithm for determining respiratory mechanics in patients using non-invasive ventilation (NIV) in pressure-support mode was developed and embedded in a ventilator system. Based on multiple linear regression (MLR) of respiratory data, the algorithm was tested on a patient bench model under conditions with and without leak, simulating a variety of mechanics. Bland-Altman analysis indicates reliable measures of compliance across the clinical range of interest (±11-18% limits of agreement). Resistance measures showed large quantitative errors (30-50%); however, it was still possible to distinguish qualitatively between normal and obstructive resistances. This outcome provides clinically significant information for ventilator titration and patient management.
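
The MLR idea can be sketched as a least-squares fit of the single-compartment equation of motion, Paw = R·flow + V/C + P0, a standard approach in respiratory mechanics. This is not the embedded algorithm, and the signals below are synthetic:

    # Fit R, C and P0 from airway pressure, flow and volume by least squares.
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.linspace(0, 3, 300)                      # one 3-s breath at 100 Hz
    flow = 0.5 * np.sin(2 * np.pi * t / 3)          # L/s
    volume = np.cumsum(flow) * (t[1] - t[0])        # L, by numerical integration
    R_true, C_true, P0 = 8.0, 0.05, 5.0             # cmH2O/L/s, L/cmH2O, cmH2O
    paw = R_true * flow + volume / C_true + P0 + rng.normal(0, 0.2, t.size)

    # Regressors: [flow, volume, 1]  ->  coefficients [R, 1/C, P0]
    X = np.column_stack([flow, volume, np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, paw, rcond=None)
    print(f"R = {coef[0]:.1f} cmH2O/L/s, C = {1 / coef[1] * 1000:.0f} mL/cmH2O")

Leak, as in NIV, biases the integrated volume signal, which is one reason compliance and especially resistance estimates degrade on the bench.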

Relevance: 30.00%

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce efficiently both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, so out-of-field doses (also called peripheral doses), typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated against measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among linac geometries. For large open fields (> 10×10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise up to a factor of 10. We concluded that such a model can be used only for conventional treatments with large open fields. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which lie mostly outside the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risks between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
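
The sensitivity to the assumed dose-response shape can be illustrated by applying two generic risk models from the literature, linear no-threshold and linear-exponential (with cell-kill competition), to the same simulated organ dose distribution; coefficients and doses below are invented:

    # Why secondary-cancer risk estimates diverge: two dose-response shapes
    # applied to one out-of-field organ dose distribution.
    import numpy as np

    rng = np.random.default_rng(4)
    dose = rng.lognormal(-1.5, 1.0, 10_000)          # voxel doses (Gy), mostly < 1 Gy

    def risk_lnt(d, beta=1.0):
        return beta * d                              # linear no-threshold

    def risk_linexp(d, beta=1.0, alpha=0.3):
        return beta * d * np.exp(-alpha * d)         # linear-exponential

    print("LNT     mean organ risk (arb. units):", risk_lnt(dose).mean().round(3))
    print("lin-exp mean organ risk (arb. units):", risk_linexp(dose).mean().round(3))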

Relevance: 30.00%

Abstract:

This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, together with issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.
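
A much-simplified univariate sketch of the mixing mechanism (two AR(1) regimes instead of the paper's multivariate system; all parameters invented) is shown below. Note how the regime weight is an ex ante exceedance probability and therefore depends on the regime's own coefficient and innovation variance:

    # Two-regime, univariate caricature of the C-MSTAR mixing weights.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    phi1, sig1 = 0.9, 0.5          # "calm" regime
    phi2, sig2 = 0.2, 2.0          # "turbulent" regime
    r = 0.0                        # threshold on the latent regime-1 variable

    T = 500
    y = np.zeros(T)
    for t in range(1, T):
        # ex ante probability that the latent regime-1 AR(1) exceeds r at t,
        # given y_{t-1}: this plays the role of the mixing weight of regime 1
        w = 1 - norm.cdf((r - phi1 * y[t - 1]) / sig1)
        if rng.random() < w:
            y[t] = phi1 * y[t - 1] + sig1 * rng.normal()
        else:
            y[t] = phi2 * y[t - 1] + sig2 * rng.normal()

    print("std of y_t after positive y_{t-1}:", y[1:][y[:-1] > 0].std().round(2))
    print("std of y_t after negative y_{t-1}:", y[1:][y[:-1] <= 0].std().round(2))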

Relevance: 30.00%

Abstract:

A method for objectively determining imaging performance in a mammography quality assurance programme for digital systems was developed. The method is based on assessing the visibility of a 0.2-mm spherical microcalcification using a quasi-ideal observer model. It requires measuring the spatial resolution (modulation transfer function) and the noise power spectra of the systems. The contrast is measured using a 0.2-mm thick Al sheet and polymethylmethacrylate (PMMA) blocks. The minimal image quality was defined as that giving a target contrast-to-noise ratio (CNR) of 5.4. Several evaluations of this objective method have been carried out on computed radiography (CR) and digital radiography (DR) mammography systems. The measurement yields the threshold CNR necessary to reach the minimum standard image quality with regard to the visibility of a 0.2-mm microcalcification. This method may replace the CDMAM image evaluation and simplify the threshold contrast visibility test used in mammography quality assurance.
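
The CNR measurement itself reduces to a few lines. In the sketch below the arrays stand in for regions of interest extracted from a real Al/PMMA acquisition, with invented pixel statistics:

    # CNR = |mean(Al ROI) - mean(PMMA ROI)| / std(PMMA ROI)
    import numpy as np

    rng = np.random.default_rng(6)
    roi_al = rng.normal(520.0, 12.0, (100, 100))     # ROI behind the 0.2-mm Al sheet
    roi_pmma = rng.normal(450.0, 12.0, (100, 100))   # ROI over PMMA background only

    cnr = abs(roi_al.mean() - roi_pmma.mean()) / roi_pmma.std()
    print(f"CNR = {cnr:.1f}  (meets target 5.4: {cnr >= 5.4})")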

Relevance: 30.00%

Abstract:

In the economic literature, information deficiencies and computational complexities have traditionally been addressed by aggregating agents and institutions. In input-output modelling, researchers have been interested in the aggregation problem since the early 1950s. Extending the conventional input-output aggregation approach to social accounting matrix (SAM) models may help to identify the effects caused by the information problems and data deficiencies that usually appear in the SAM framework. This paper develops the theory of aggregation and applies it to the social accounting matrix model of multipliers. First, we define the concept of linear aggregation in a SAM database context. Second, we define the aggregated partitioned matrices of multipliers that are characteristic of the SAM approach. Third, we extend the analysis to related concepts such as aggregation bias and consistency in aggregation. Finally, we provide an illustrative example that shows the effects of aggregating a social accounting matrix model.
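
The core objects are easy to state in a toy example: multipliers M = (I − A)⁻¹ from a disaggregated coefficient matrix, an aggregated version built with a 0/1 grouping matrix and within-group weights, and the gap between "multipliers of the aggregate" and "aggregate of the multipliers" as the aggregation bias. The 4-account coefficients below are invented:

    # Linear aggregation in a multiplier model: aggregate accounts 1 and 2
    # with grouping matrix S and equal within-group weights.
    import numpy as np

    A = np.array([[0.10, 0.05, 0.20, 0.10],
                  [0.05, 0.15, 0.10, 0.05],
                  [0.30, 0.25, 0.05, 0.10],
                  [0.20, 0.20, 0.30, 0.05]])
    S = np.array([[1, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)

    M = np.linalg.inv(np.eye(4) - A)                 # disaggregated multipliers
    w = np.ones(4) / np.array([2, 2, 1, 1])          # within-group weights
    W = (S * w).T                                    # n x k weighting matrix
    A_agg = S @ A @ W                                # aggregated coefficients
    M_agg = np.linalg.inv(np.eye(3) - A_agg)         # multipliers of the aggregate

    print("multipliers of the aggregated model:\n", M_agg.round(3))
    print("aggregation of the multipliers:\n", (S @ M @ W).round(3))
    # The two matrices differ in general; the difference is the aggregation
    # bias, and "consistency in aggregation" means it vanishes.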

Relevance: 30.00%

Abstract:

Graph pebbling is a network model for studying whether a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the second pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π₂^P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than previous algorithms could handle, and much more quickly. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than the previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
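
For the basic solvability question (not the Weight Function Lemma itself), a brute-force search over pebbling moves is easy to write, though exponential and therefore limited to toy graphs; the 4-cycle below illustrates that its pebbling number is 4:

    # Brute-force pebbling solvability: can a pebble reach `target` from the
    # distribution `dist` via moves that remove two pebbles from a vertex and
    # deliver one across an edge?
    edges = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}   # the 4-cycle C4

    def solvable(dist, target):
        seen, stack = set(), [tuple(dist)]
        while stack:
            d = stack.pop()
            if d[target] >= 1:
                return True
            if d in seen:
                continue
            seen.add(d)
            for u, pu in enumerate(d):
                if pu >= 2:                      # a move costs two pebbles at u...
                    for v in edges[u]:           # ...and delivers one to a neighbour
                        nd = list(d)
                        nd[u] -= 2
                        nd[v] += 1
                        stack.append(tuple(nd))
        return False

    print(solvable([0, 0, 0, 3], 0))   # False: 3 pebbles opposite the target fail
    print(solvable([0, 0, 0, 4], 0))   # True: consistent with pi(C4) = 4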

Relevance: 30.00%

Abstract:

By definition, alcohol expectancies (after alcohol I expect X) and drinking motives (I drink to achieve X) are conceptually distinct constructs. Theorists have argued that motives mediate the association between expectancies and drinking outcomes. Yet, given the use of different instruments, do these constructs remain distinct when assessment items are matched? The present study tested to what extent motives mediate the link between expectancies and alcohol outcomes when identical items are used, first as expectancies and then as motives. A linear structural equation model was estimated on a nationally representative sample of 5,779 alcohol-using students in Switzerland (mean age = 15.2 years). The results showed that expectancies explained up to 38% of the variance in motives. Together with motives, they explained up to 48% of the variance in alcohol outcomes (volume, 5+ drinking, and problems). In 10 of 12 outcomes there was a significant mediated effect, often larger than the direct expectancy effect. For coping, the expectancy effect was close to zero, indicating the strongest form of mediation. In only one case (conformity and 5+ drinking) was there a direct expectancy effect but no mediation. To conclude, the study demonstrates that motives are distinct from expectancies even when identical items are used. Motives are more proximally related to different alcohol outcomes, often mediating the effects of expectancies. Consequently, the effectiveness of interventions, particularly those aimed at coping drinkers, should be improved by shifting the focus from expectancies to drinking motives.
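
The product-of-coefficients mediation logic can be sketched in a few lines on simulated data; this stands in for, and is far simpler than, the structural equation model fitted to the Swiss student sample (the path coefficients are invented):

    # Mediation: expectancy -> motive -> drinking, plus a direct path.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 5_000
    expectancy = rng.normal(size=n)
    motive = 0.6 * expectancy + rng.normal(size=n)          # a-path = 0.6
    drinking = 0.5 * motive + 0.1 * expectancy + rng.normal(size=n)

    def ols(X, y):
        X = np.column_stack([np.ones(len(y)), X])
        return np.linalg.lstsq(X, y, rcond=None)[0][1:]      # drop intercept

    a = ols(expectancy, motive)[0]                           # expectancy -> motive
    b, c_direct = ols(np.column_stack([motive, expectancy]), drinking)
    print(f"indirect (mediated) effect a*b = {a * b:.3f}")
    print(f"direct expectancy effect       = {c_direct:.3f}")

With these invented paths the indirect effect (about 0.3) dominates the direct one (about 0.1), mirroring the pattern the abstract reports for most outcomes.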

Relevance: 30.00%

Abstract:

Models predicting species' spatial distributions are increasingly applied to wildlife management issues, emphasising the need for reliable methods to evaluate the accuracy of their predictions. As many available datasets (e.g. museums, herbariums, atlases) do not provide reliable information about species absences, several presence-only analyses have been developed. However, methods to evaluate the accuracy of their predictions are few and have never been validated. The aim of this paper is to compare existing and new presence-only evaluators to the usual presence/absence measures. We use a reliable, diverse presence/absence dataset of 114 plant species to test how common presence/absence indices (Kappa, MaxKappa, AUC, adjusted D²) compare to presence-only measures (AVI, CVI, Boyce index) for evaluating generalised linear models (GLMs). Moreover, we propose a new, threshold-independent evaluator, which we call the "continuous Boyce index". All indices were implemented in the BIOMAPPER software. We show that the presence-only evaluators are fairly well correlated (ρ > 0.7) with the presence/absence ones. The Boyce indices are closer to AUC than to MaxKappa and are fairly insensitive to species prevalence. In addition, the Boyce indices provide predicted-to-expected ratio curves that offer further insight into model quality: robustness, habitat suitability resolution and deviation from randomness. This information helps in reclassifying predicted maps into meaningful habitat suitability classes. The continuous Boyce index is thus both a complement to the usual evaluation of presence/absence models and a reliable measure of presence-only based predictions.
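
A sketch of the continuous Boyce index computation on simulated data (not the 114-species dataset of the paper): predicted-to-expected ratios over overlapping habitat suitability windows, followed by a Spearman rank correlation:

    # Continuous Boyce index: P/E ratio per moving suitability window, then
    # the rank correlation between window position and P/E.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(8)
    suit_background = rng.random(20_000)                  # suitability over the study area
    # presences biased toward high suitability, as a good model should predict:
    suit_presence = rng.random(2_000) ** 0.3

    width, mids = 0.2, np.linspace(0.1, 0.9, 50)          # overlapping windows
    pe = []
    for m in mids:
        lo, hi = m - width / 2, m + width / 2
        p = np.mean((suit_presence >= lo) & (suit_presence < hi))     # predicted freq.
        e = np.mean((suit_background >= lo) & (suit_background < hi)) # expected freq.
        pe.append(p / e)

    boyce, _ = spearmanr(mids, pe)
    print(f"continuous Boyce index = {boyce:.3f}")        # near 1 for a good model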