985 results for Standard Model


Relevance: 30.00%

Abstract:

The use of adhesive joints has increased in recent decades due to their competitive features compared with traditional joining methods. This work aims to estimate the tensile critical strain energy release rate (GIC) of adhesive joints by the Double-Cantilever Beam (DCB) test. The J-integral is used, since it enables obtaining the tensile Cohesive Zone Model (CZM) law directly. An optical measuring method was developed to assess the crack-tip opening (δn) and the adherend rotation (θo). The resulting CZM laws were best approximated by a triangular shape for the brittle adhesive and a trapezoidal shape for the two ductile adhesives.
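
The data reduction behind this approach rests on the standard relation t_n(δn) = dJ/dδn between the measured J-integral and the crack-tip opening. A minimal Python sketch of that step, using placeholder measurements and a polynomial smoothing of our choosing (not the authors' procedure):

```python
import numpy as np

# Placeholder DCB measurements: crack-tip opening delta_n [mm]
# and the corresponding J-integral value J [N/mm].
delta_n = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10, 0.12])
J       = np.array([0.00, 0.05, 0.17, 0.32, 0.44, 0.50, 0.52])

# Smooth J(delta_n) with a low-order polynomial, then differentiate:
# the cohesive traction is t_n(delta_n) = dJ/d(delta_n), which traces
# out the CZM law point by point.
coeffs = np.polynomial.polynomial.polyfit(delta_n, J, deg=4)
t_n = np.polynomial.polynomial.polyval(
    delta_n, np.polynomial.polynomial.polyder(coeffs))

# GIC is the plateau of J once the cohesive zone is fully developed.
print("t_n [MPa]:", np.round(t_n, 2))
print("GIC ~", J[-1], "N/mm")
```

A triangular law then shows up as a roughly linear rise and fall of t_n, while a trapezoidal law shows an intermediate stress plateau.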

Relevance: 30.00%

Abstract:

The principal aim of this paper is to develop a simple and objective model which can classify and systematise telework, and provide a standard for Portugal. This model defines the specific concepts of telework and allows the activity in this area to be subdivided, making its application easier. The model was constructed to cover four perspectives: the economic and social perspective, and those of the employers, the teleworkers, and the suppliers of goods and services.

Relevance: 30.00%

Abstract:

Paracoccidioidomycosis (PCM) is caused by the dimorphic fungus Paracoccidioides brasiliensis (Pb) and is a prevalent systemic mycosis in Latin America. The aim of the present work was to evaluate the dose-response effect of the fungal yeast phase for the standardization of an experimental model of septic arthritis. The experiments were performed with groups of 14 rats that received doses of 10³, 10⁴ or 10⁵ P. brasiliensis (Pb18) cells. The fungi were injected in 50 µL of phosphate-buffered saline (PBS) directly into the knee joints of the animals. The following parameters were analyzed: the swelling of the knees infused with yeast cells, the radiological and anatomopathological alterations, and the antibody titer by ELISA. After 15 days of infection, signs of inflammation were evident. At 45 days, some damage and necrosis were observed in the articular cartilage. Systemic dissemination of the fungus was observed in 11% of the inoculated animals. It was concluded that the experimental model is able to mimic articular PCM in humans and that the dose of 10⁵ yeast cells can be used as the standard in this model.

Relevance: 30.00%

Abstract:

As manufacturers face an increasingly competitive environment, they seek out opportunities to reduce production costs without negatively affecting the yield or the quality of their finished products. The challenge of maintaining high product quality while simultaneously reducing production costs can often be met through investments in energy-efficient technologies and energy-efficiency practices. A strong energy management system offers both technological and best-practice efficiencies and provides a solid foundation for an organisation to reduce production costs and improve site efficiency. The I.S. EN 16001 energy management standard specifies the requirements for establishing, implementing, maintaining and improving an energy management system, and represents the latest best practice for energy management in Ireland; its objective is a systematic approach to continuously improving energy performance through using energy more efficiently. The author analysed how GlaxoSmithKline's (GSK) pharmaceutical manufacturing facility in Cork implemented the I.S. EN 16001 energy management system model, and how energy-saving opportunities were identified and introduced to improve efficiency performance. An extensive literature review was performed to determine the current status of the pharmaceutical industry in Ireland, the processes involved in pharmaceutical manufacturing, the energy users required for pharmaceutical manufacturing, and the efficiency measures that can be applied to these energy users to reduce energy consumption. The author then analysed how energy management standards are introduced to industry and critically examined, through case studies, the driving factors for energy management performance in Ireland. Following an investigation of how the I.S. EN 16001 standard is operated in GSK, a critical analysis of the performance achieved by the GSK energy management system was undertaken to determine whether implementing the standard accelerates the achievement of energy savings. Since its introduction, the I.S. EN 16001 model has enabled GSK to monitor, target and identify energy-efficiency opportunities throughout the site. The model has put in place an energy management system that is continuously reviewed for improvement and has to date reduced GSK's site operating costs by over 30% through technical improvements and by generating awareness for smarter energy consumption within the GSK Cork site. Investment in I.S. EN 16001 has proved to be a sound business strategy for GSK, especially in today's manufacturing environment.

Relevance: 30.00%

Abstract:

Variational steepest descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of the suitably time-interpolated implicit Euler scheme, defined in terms of the Euclidean Wasserstein distance, associated to this equation for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension, where the scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its capability to reproduce the blow-up of solutions for super-critical masses easily and without mesh refinement.
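
In one dimension the scheme reduces, as stated, to implicit Euler for the pseudo-inverse X(s) of the CDF. Below is a minimal Python sketch of that idea for the diffusion part alone (the logarithmic interaction term of the full model is omitted, and no safeguarding of monotonicity is attempted); the discretisation and parameters are illustrative, not the authors' implementation. For the heat equation, the pseudo-inverse satisfies X_t = -∂s(1/X_s), with the density given by 1/X_s.

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import norm

N, tau = 100, 1e-3                      # mass-grid points, time step
h = 1.0 / N                             # uniform grid in mass s in (0, 1)
s = (np.arange(N) + 0.5) * h
X = norm.ppf(s)                         # initial pseudo-inverse: Gaussian quantiles

def diffusion(Y):
    """Discretisation of -d/ds(1/Y_s): the heat equation written for the
    pseudo-inverse Y(s) of the CDF (the density equals 1/Y_s)."""
    inv_slope = h / np.diff(Y)                        # 1/Y_s at interior interfaces
    flux = np.concatenate(([0.0], inv_slope, [0.0]))  # density vanishes at s = 0, 1
    return -np.diff(flux) / h

for _ in range(50):                     # implicit Euler: solve for the new profile
    X = fsolve(lambda Y, Xold=X: Y - Xold - tau * diffusion(Y), X)
```

Because the grid lives in mass space, grid points concentrate automatically where the density does, which is why blow-up can be approached without mesh refinement.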

Relevance: 30.00%

Abstract:

Introduction. Selective embolization of the left gastric artery (LGA) reduces levels of ghrelin and achieves significant short-term weight loss. However, embolization of the LGA would prevent the performance of bariatric procedures, because the high-risk leakage area (the gastroesophageal junction [GEJ]) would be devascularized. Aim. To assess an alternative vascular approach to the modulation of ghrelin levels and to generate a blood-flow manipulation that increases the vascular supply to the GEJ. Materials and methods. A total of 6 pigs underwent laparoscopic clipping of the left gastroepiploic artery. Preoperative and postoperative CT angiographies were performed. Ghrelin levels were assessed perioperatively and then once per week for 3 weeks. Reactive oxygen species (ROS; expressed as ROS/mg of dry weight [DW]), mitochondrial respiratory rate, and capillary lactates were assessed on seromuscular biopsies before and 1 hour after clipping (T0 and T1) and after 3 weeks of survival (T2). A celiac trunk angiography was performed at 3 weeks. Results. Mean (± standard deviation) ghrelin levels were significantly reduced 1 hour after clipping (1902 ± 307.8 pg/mL vs 1084 ± 680.0 pg/mL; P = .04) and at 3 weeks (954.5 ± 473.2 pg/mL; P = .01). Mean ROS levels were significantly decreased at the cardia at T2 when compared with T0 (0.018 ± 0.006 mg/DW vs 0.02957 ± 0.0096 mg/DW; P = .01) and T1 (0.0376 ± 0.008 mg/DW; P = .007). Capillary lactates were significantly decreased after 3 weeks, and the mitochondrial respiratory rate remained constant over time at the cardia and pylorus, showing significant regional differences. Conclusions. Manipulation of the gastric flow targeting the gastroepiploic arcade induces ghrelin reduction. An endovascular approach is currently under evaluation.

Relevance: 30.00%

Abstract:

While consumption habits have been utilised as a means of generating a hump-shaped output response to monetary policy shocks in sticky-price New Keynesian economies, there is relatively little analysis of the impact of habits (particularly, external habits) on optimal policy. In this paper we consider the implications of external habits for optimal monetary policy, when those habits exist either at the level of the aggregate consumption basket ('superficial' habits) or at the level of individual goods ('deep' habits: see Ravn, Schmitt-Grohe, and Uribe (2006)). External habits generate an additional distortion in the economy, which implies that the flex-price equilibrium is no longer efficient and that policy faces interesting new trade-offs and potential stabilisation biases. Furthermore, the endogenous mark-up behaviour which emerges when habits are deep can also significantly affect the optimal policy response to shocks, as well as dramatically affecting the stabilising properties of standard simple rules.
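
As a rough illustration of the two habit concepts (notation ours, not necessarily the paper's):

```latex
% Superficial habits: utility depends on the aggregate basket C_t
% relative to lagged aggregate (external) consumption \bar{C}_{t-1}:
U_t = u\left(C_t - \theta \bar{C}_{t-1}\right).

% Deep habits (Ravn, Schmitt-Grohe and Uribe, 2006): habits form
% good by good, inside the CES consumption aggregator:
X_t = \left[ \int_0^1 \left( c_{jt} - \theta \bar{c}_{j,t-1} \right)^{1-\frac{1}{\eta}} dj \right]^{\frac{\eta}{\eta-1}},
\qquad U_t = u(X_t).
```

With deep habits, each firm's current demand depends on its own past sales, so demand elasticities, and hence optimal mark-ups, become endogenous; this is the channel referred to above.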

Relevance: 30.00%

Abstract:

This paper shows that introducing weak property rights into the standard real business cycle (RBC) model can help to explain economic fluctuations. This is motivated by the empirical observation that changes in institutions in emerging markets are related to the evolution of the main macroeconomic variables. In particular, in Mexico, the movements in productivity in the data are associated with changes in institutions, so that productivity shocks can to a large extent be explained as shocks to the quality of institutions. We find that the model with shocks to the degree of protection of property rights only, without technology shocks, can match the second moments in the Mexican data well. In particular, the fit is better than that of the standard neoclassical model with full protection of property rights regarding the auto-correlations and cross-correlations in the data, especially those related to labor. Viewing productivity shocks as shocks to institutions is also consistent with the stylized fact of falling productivity and non-decreasing labor hours in Mexico over 1980-1994, a feature that the neoclassical model cannot match.
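
A stylised way to see the identification idea (our notation, not necessarily the model's exact formulation): if agents appropriate only a fraction θ_t of output, measured productivity conflates technology and institutions.

```latex
y_t = \theta_t\, z_t\, k_t^{\alpha} n_t^{1-\alpha}
\quad\Longrightarrow\quad
\log \widehat{\mathrm{TFP}}_t = \log \theta_t + \log z_t ,
% so with technology z_t held constant, all measured movements in the
% Solow residual are shocks to the quality of institutions \theta_t.
```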

Relevance: 30.00%

Abstract:

We study how the use of judgement or “add-factors” in forecasting may disturb the set of equilibrium outcomes when agents learn using recursive methods. We isolate conditions under which new phenomena, which we call exuberance equilibria, can exist in a standard self-referential environment. Local indeterminacy is not a requirement for existence. We construct a simple asset pricing example and find that exuberance equilibria, when they exist, can be extremely volatile relative to fundamental equilibria.
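
A toy sketch of the mechanism, assuming nothing about the paper's specific model: agents forecast with a recursively estimated mean plus a random judgement add-factor, and the forecast feeds back into the price being forecast. All names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, gamma, T = 0.95, 0.05, 500      # discount factor, constant gain, horizon
sigma_d, sigma_j = 0.10, 0.30         # fundamental noise and judgement volatility

a = 0.0                               # recursively learned forecast intercept
p = np.zeros(T)
for t in range(T):
    xi = sigma_j * rng.standard_normal()   # judgement "add-factor" on the forecast
    forecast = a + xi                      # perceived-law forecast plus judgement
    # Self-referential actual law of motion: today's price depends on the forecast.
    p[t] = beta * forecast + sigma_d * rng.standard_normal()
    a += gamma * (p[t] - a)                # constant-gain recursive update

print(f"sample volatility of p: {p.std():.3f}")
```

Because the environment is self-referential, the judgement term moves the realised price and is thus partially validated, which is the flavour of feedback behind volatile exuberance equilibria.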

Relevance: 30.00%

Abstract:

Paper delivered at the Western Regional Science Association Annual Conference, Sedona, Arizona, February 2010.

Relevance: 30.00%

Abstract:

This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow trend inflation to evolve according to a random walk, ours is a bounded model which constrains trend inflation to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used, and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model works well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
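
A minimal simulation of the bounded trend process described above (illustrative bounds and volatility; in the paper the bounds can also be estimated, and the full model adds a measurement equation and time-varying persistence for the inflation gap):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)
a, b = 0.0, 5.0          # assumed bounds on trend inflation (annualised %)
sigma_tau = 0.15         # innovation standard deviation of the trend
T = 200

tau = np.empty(T)
tau[0] = 2.0
for t in range(1, T):
    # Random-walk step, truncated so that trend inflation stays in [a, b].
    lo = (a - tau[t - 1]) / sigma_tau
    hi = (b - tau[t - 1]) / sigma_tau
    tau[t] = tau[t - 1] + sigma_tau * truncnorm.rvs(lo, hi, random_state=rng)
```

The truncation is what breaks linear-Gaussian Kalman filtering and motivates the posterior simulation algorithm mentioned above.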

Relevance: 30.00%

Abstract:

Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods- and human-capital-producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model's posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
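
For reference, the generic VECM form for the measurement errors u_t (notation ours):

```latex
\Delta u_t = \alpha \beta' u_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \,\Delta u_{t-i} + \varepsilon_t ,
\qquad \varepsilon_t \sim \mathcal{N}(0, \Sigma),
```

where the cointegrating relations β'u_{t-1} absorb the common trends in the data that the theoretical growth model does not capture, and α contains the adjustment loadings.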

Relevance: 30.00%

Abstract:

This paper presents a general equilibrium model in which nominal government debt pays an inflation risk premium. The model predicts that the inflation risk premium will be higher in economies which are exposed to unanticipated inflation through nominal asset holdings. In particular, the inflation risk premium is higher when government debt is primarily nominal, steady-state inflation is low, and when cash and nominal debt account for a large fraction of consumers' retirement portfolios. These channels do not appear to have been highlighted in previous models or tested empirically. Numerical results suggest that the inflation risk premium is comparable in magnitude to that in standard representative agent models. These findings have implications for the management of government debt, since the inflation risk premium makes it more costly for governments to borrow using nominal rather than indexed debt. Simulations of an extended model with Epstein-Zin preferences suggest that increasing the share of indexed debt would enable governments to permanently lower taxes by an amount that is quantitatively non-trivial.
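
A standard one-period decomposition consistent with the mechanism described (notation ours, assuming a lognormal pricing kernel; not the paper's exact model):

```latex
% Prices of one-period nominal and indexed bonds, with pricing kernel
% M_{t+1} and gross inflation \Pi_{t+1}:
Q_t^{\$} = E_t\!\left[\frac{M_{t+1}}{\Pi_{t+1}}\right],
\qquad
Q_t^{r} = E_t\!\left[M_{t+1}\right].

% With m = \log M and \pi = \log \Pi jointly lognormal, the nominal yield is
i_t = r_t + E_t[\pi_{t+1}] - \tfrac{1}{2}\mathrm{Var}_t(\pi_{t+1})
      + \underbrace{\mathrm{Cov}_t\!\left(m_{t+1}, \pi_{t+1}\right)}_{\text{inflation risk premium}}.
```

The premium is positive when inflation is high precisely in high-marginal-utility states, which is when nominal debt pays off least in real terms; it is absent from indexed debt, which is why the premium raises the relative cost of nominal borrowing.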

Relevance: 30.00%

Abstract:

This paper extends the Nelson-Siegel linear factor model by developing a flexible macro-finance framework for modeling and forecasting the term structure of US interest rates. Our approach is robust to parameter uncertainty and structural change, as we consider instabilities in parameters and volatilities, and our model averaging method allows for investors' model uncertainty over time. Our time-varying parameter Nelson-Siegel Dynamic Model Averaging (NS-DMA) predicts yields better than standard benchmarks and successfully captures plausible time-varying term premia in real time. The proposed model has significant in-sample and out-of-sample predictability for excess bond returns, and the predictability is of economic value.
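
For reference, the Nelson-Siegel curve that the linear factor model extends (standard dynamic parameterisation; in the paper the parameters and volatilities are themselves time-varying):

```latex
y_t(\tau) = L_t
          + S_t \,\frac{1 - e^{-\lambda \tau}}{\lambda \tau}
          + C_t \left( \frac{1 - e^{-\lambda \tau}}{\lambda \tau} - e^{-\lambda \tau} \right),
```

where L_t, S_t and C_t act as level, slope and curvature factors for the yield of maturity τ, and λ governs where the curvature loading peaks.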

Relevance: 30.00%

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce efficiently both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated against measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10×10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risks between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
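
To illustrate why absolute risk estimates diverge across models, here is a hedged sketch of one family of approaches from the literature, the organ equivalent dose (OED) in the sense of Schneider and colleagues; the dose-volume histogram values and the cell-kill parameter are placeholders, not data from this work.

```python
import numpy as np

# Placeholder dose-volume histogram for an out-of-field organ:
# fractional volumes v receiving doses d (Gy).
d = np.array([0.05, 0.1, 0.3, 0.6, 1.0, 2.0])
v = np.array([0.40, 0.25, 0.15, 0.10, 0.07, 0.03])

def oed_linear(d, v):
    """Organ equivalent dose under a linear dose-response."""
    return np.sum(v * d) / np.sum(v)

def oed_linear_exponential(d, v, alpha=0.2):
    """Organ equivalent dose with an exponential cell-kill term (alpha in 1/Gy)."""
    return np.sum(v * d * np.exp(-alpha * d)) / np.sum(v)

# Same dose distribution, different response models -> different equivalent
# doses, and hence different absolute secondary-cancer risk estimates.
print(oed_linear(d, v), oed_linear_exponential(d, v))
```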