879 results for 3D printing, steel bars, calibration of design values, correlation
Abstract:
In spite of all innovations in stent design, commonly used metallic stents present several problems such as corrosion, infection and restenosis, leading to health complications or even the death of patients. In this context, the present paper reports a systematic investigation into the design and development of 100% fiber-based stents, which can eliminate or minimize the problems of existing metallic stents. For this purpose, braided stents were produced by varying different material, structural and process parameters such as monofilament type and diameter, braiding angle and mandrel diameter. The influence of these design parameters on mechanical behavior as well as on stent porosity was thoroughly investigated, and suitable parameters were selected for developing a stent with mechanical characteristics and porosity matching those of commercial stents. According to the experimental results, the best performance was achieved with a polyester stent designed with a 0.27 mm monofilament diameter, a braiding angle of 35° and a mandrel diameter of 6 mm, providing properties similar to those of commercial Nitinol stents.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy of √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb⁻¹. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
We show how to calibrate CES production and utility functions when indirect taxation affecting inputs and consumption is present. These calibrated functions can then be used in computable general equilibrium models. Taxation modifies the standard calibration procedures since any taxed good has two associated prices and a choice of reference value units has to be made. We also provide an example of computer code to solve the calibration of CES utilities under two alternate normalizations. To our knowledge, this paper fills a methodological gap in the CGE literature.
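As a rough illustration of the calibration step this abstract describes (the paper's own computer code is not reproduced here), the sketch below calibrates the share parameters of a CES utility function at a benchmark where consumers face tax-inclusive prices, and checks that the implied demands reproduce the benchmark. All function names, numbers and the unit-sum normalization of the shares are illustrative assumptions, standing in for one of the normalization choices the paper discusses.

```python
import numpy as np

# Hedged sketch, not the paper's code: calibrate share parameters of a CES
# utility U(x) = (sum_i alpha_i * x_i**rho)**(1/rho), with rho = 1 - 1/sigma,
# at a benchmark where consumers face tax-inclusive prices (1 + t_i) * p_i.
# Normalizing the alphas to sum to one is only one possible choice of
# reference value units.

def calibrate_ces_shares(x_bench, p_net, t, sigma):
    """Share parameters consistent with the benchmark allocation."""
    p_cons = (1.0 + t) * p_net                  # consumer (tax-inclusive) prices
    raw = p_cons * x_bench ** (1.0 / sigma)     # from the first-order conditions
    return raw / raw.sum()

def ces_demand(alpha, p_net, t, income, sigma):
    """Marshallian demands implied by the calibrated CES utility."""
    p_cons = (1.0 + t) * p_net
    denom = np.sum(alpha ** sigma * p_cons ** (1.0 - sigma))
    return (alpha / p_cons) ** sigma * income / denom

# Benchmark data (illustrative numbers, not taken from the paper)
x_bench = np.array([40.0, 25.0])   # observed benchmark quantities
p_net = np.array([1.0, 2.0])       # producer (net-of-tax) prices
t = np.array([0.10, 0.25])         # indirect tax rates
sigma = 1.5                        # assumed elasticity of substitution

alpha = calibrate_ces_shares(x_bench, p_net, t, sigma)
income = np.sum((1.0 + t) * p_net * x_bench)
# The calibrated function should reproduce the benchmark allocation exactly
assert np.allclose(ces_demand(alpha, p_net, t, income, sigma), x_bench)
```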
Abstract:
This text was presented at the 16th International Seminar on Olympic Studies for Postgraduate Students, organised by the International Olympic Academy in Ancient Olympia from 1 to 30 July 2008. It first reports fundamental Olympic concepts such as the Olympic values and the educational mandate of Pierre de Coubertin, the Olympic brand and symbols, sponsorship and the Olympic partner programme. A chapter then addresses the TOP sponsors' educational initiatives on Olympic values, describing in particular the Olympic sponsors' involvement in education and the TOP sponsors' educational activities. Finally, the author analyses the role of sponsorship in the promotion of Olympic Values Education, providing conclusions, comments on future perspectives and some recommendations.
Abstract:
BACKGROUND: Management of blood pressure (BP) in acute ischemic stroke is controversial. The present study aims to explore the association between baseline BP levels, subsequent BP change and outcome in the overall stroke population and in specific subgroups defined by the presence of arterial hypertensive disease and prior antihypertensive treatment. METHODS: All patients registered in the Acute STroke Registry and Analysis of Lausanne (ASTRAL) between 2003 and 2009 were analyzed. Unfavorable outcome was defined as a modified Rankin score greater than 2. A local polynomial surface algorithm was used to assess the effect of BP values on outcome in the overall population and in predefined subgroups. RESULTS: Up to a certain point, as initial BP increased, optimal outcome was seen with a progressively more substantial BP decrease over the next 24-48 h. Patients without hypertensive disease and with an initially low BP seemed to benefit from an increase in BP. In patients with hypertensive disease, initial BP and its subsequent changes seemed to have less influence on clinical outcome. Patients who had previously been treated with antihypertensives did not tolerate initially low BPs well. CONCLUSION: Optimal outcome in acute ischemic stroke may be determined not only by initial BP levels but also by the direction and magnitude of the associated BP change over the first 24-48 h.
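For readers unfamiliar with the statistical approach named in METHODS, the sketch below shows the general idea of smoothing a binary outcome over two BP covariates. It is a local-constant (degree-zero) kernel smoother, a simplified stand-in for the study's local polynomial surface algorithm; all variable names, bandwidths and the synthetic data are assumptions, not ASTRAL data.

```python
import numpy as np

# Illustrative only: estimate a smooth surface of P(unfavorable outcome)
# over (baseline BP, BP change) with a 2D Gaussian kernel smoother.

def smooth_outcome_surface(bp0, dbp, unfavorable, grid_bp0, grid_dbp, h=(15.0, 10.0)):
    """Kernel-smoothed probability of unfavorable outcome (mRS > 2)."""
    surface = np.empty((len(grid_bp0), len(grid_dbp)))
    for i, g0 in enumerate(grid_bp0):
        for j, gd in enumerate(grid_dbp):
            # Gaussian weights in each covariate direction (assumed bandwidths h)
            w = np.exp(-0.5 * (((bp0 - g0) / h[0]) ** 2 + ((dbp - gd) / h[1]) ** 2))
            surface[i, j] = np.sum(w * unfavorable) / np.sum(w)
    return surface

# Synthetic usage example (not study data)
rng = np.random.default_rng(0)
bp0 = rng.normal(160, 25, 500)                        # admission systolic BP (mmHg)
dbp = rng.normal(-10, 15, 500)                        # BP change over 24-48 h (mmHg)
unfavorable = (rng.random(500) < 0.4).astype(float)   # mRS > 2 indicator
grid = smooth_outcome_surface(bp0, dbp, unfavorable,
                              np.linspace(100, 220, 25), np.linspace(-60, 40, 25))
```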
Abstract:
An accurate sense of time contributes to functions ranging from the perception and anticipation of sensory events to the production of coordinated movements. However, accumulating evidence demonstrates that time perception is subject to strong illusory distortion. In two experiments, we investigated whether the subjective speed of temporal perception depends on our visual environment. By presenting human observers with speed-altered movies of a crowded street scene, we modulated their performance on the subsequent production of 20 s elapsed intervals. Our results indicate that the visual environment significantly contributes to calibrating our sense of time, independently of any modulation of arousal. This plasticity provides an assay for the integrity of our sense of time and for its rehabilitation in clinical pathologies.
Abstract:
Despite abundant research on work meaningfulness, the link between work meaningfulness and general ethical attitude at work has not been discussed so far. In this article, we propose a theoretical framework to explain how work meaningfulness contributes to enhanced ethical behavior. We argue that, by providing a way for individuals to relate work to their personal core values and identity, work meaningfulness leads to affective commitment: the involvement of one's cognitive, emotional, and physical resources. This, in turn, leads to engagement, facilitates the integration of one's personal values into daily work routines, and thereby reduces the risk of unethical behavior. By contrast, anomie, that is, the absence of meaning and consequently of personal involvement, will lead to rational rather than affective commitment, and consequently to disengagement and amorality. We conclude with implications for the management of ethical attitudes.
Abstract:
Catadioptric sensors are combinations of mirrors and lenses designed to obtain a wide field of view. In this paper we propose a new sensor that provides omnidirectional viewing ability together with depth information about its nearby surroundings. The sensor is based on a conventional camera coupled with a laser emitter and two hyperbolic mirrors. The mathematical formulation and precise specifications of the intrinsic and extrinsic parameters of the sensor are discussed. Our approach overcomes limitations of existing omnidirectional sensors and can ultimately lead to reduced production costs.
Abstract:
In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration, and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed within the open-source ITK framework and is freely available under a GPL license. The article presents the complete image-processing chain, from raw data acquisition to 3D statistical group analysis. Results of a group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.
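JULIDE itself is implemented in C++ on top of ITK; as an illustration of the volume-standardization (registration) step in the chain described above, here is a minimal sketch in Python using SimpleITK. File names, the rigid transform choice and all parameter values are assumptions, not JULIDE's actual settings.

```python
import SimpleITK as sitk

# Illustrative sketch of registering a reconstructed autoradiography volume
# to a reference volume, then resampling it into the reference space so that
# voxel-wise statistics can be computed across animals.

fixed = sitk.Cast(sitk.ReadImage("reference_volume.nii"), sitk.sitkFloat32)
moving = sitk.Cast(sitk.ReadImage("subject_autoradiograph_3d.nii"), sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(fixed, moving, sitk.Euler3DTransform(),
                                      sitk.CenteredTransformInitializerFilter.GEOMETRY))
reg.SetInterpolator(sitk.sitkLinear)

transform = reg.Execute(fixed, moving)
# Resample the subject volume into the reference space for group analysis
standardized = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(standardized, "subject_in_reference_space.nii")
```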
Abstract:
Review of the book: The Tinkerer's Accomplice: How Design Emerges From Life Itself, by J. Scott Turner. Harvard University Press, 2007. 304 pp.
Abstract:
This paper points out an empirical puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, both sticky wages and match-specific productivity shocks help the model reproduce the stylized facts: both make the firm's flow of surplus more procyclical, thus making hiring more procyclical too.
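The mechanism named in the last sentence can be seen in the textbook Mortensen-Pissarides job-creation condition (standard notation, not an equation quoted from the paper): with vacancy posting cost κ, job-filling probability q(θ_t) decreasing in labor-market tightness θ_t, discount factor β and separation rate s, and under the convention that a new match produces in the period it is formed, free entry implies

\[
\frac{\kappa}{q(\theta_t)} = E_t \sum_{j=0}^{\infty} \left[\beta (1-s)\right]^{j} \left(p_{t+j} - w_{t+j}\right),
\]

so anything that makes the flow surplus p - w respond more strongly to productivity (sticky wages, match-specific productivity shocks) forces q(θ_t) to move more over the cycle, i.e., vacancy posting and hiring become more procyclical.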
Abstract:
In this paper, we develop a general equilibrium model of crime and show that law enforcement plays different roles depending on the equilibrium characterization and the value of social norms. When the economy has a unique stable equilibrium in which a fraction of the population is productive and the remainder engages in predation, the government can choose an optimal law enforcement policy that maximizes a welfare function evaluated at the steady state. If the steady state is not unique, law enforcement is still relevant but in a completely different way, because the steady state that prevails depends on the initial proportions of productive and predatory individuals in the economy. The relative importance of these proportions can be changed through law enforcement policy.
Abstract:
Using a suitable Hull and White type formula, we develop a methodology to obtain a second-order approximation to the implied volatility for very short maturities. Using this approximation we accurately calibrate the full set of parameters of the Heston model. One of the reasons that makes our calibration for short maturities so accurate is that we also take into account the term structure for large maturities. We may say that calibration is not "memoryless", in the sense that the option's behavior far away from maturity does influence calibration when the option gets close to expiration. Our results provide a way to perform a quick calibration of a closed-form approximation to vanilla options that can then be used to price exotic derivatives. The methodology is simple, accurate, fast, and it requires a minimal computational cost.
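As an illustration of how such a closed-form short-maturity approximation is used in practice, the sketch below fits Heston parameters to a single short expiry by least squares. The paper's second-order expansion is not reproduced here; a well-known leading-order short-maturity limit stands in for it, so only v0, rho and the vol-of-vol are fitted, and the market quotes are made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch only: least-squares calibration scaffold with a stand-in smile
# approximation I(k) ~ sqrt(v0) + rho*nu/(4*sqrt(v0)) * k, k = log(K/F).
# At this leading order only sqrt(v0) and the product rho*nu are identified,
# which is precisely why higher-order terms and the long-maturity term
# structure (as in the paper) are needed to pin down all Heston parameters.

def implied_vol_approx(k, v0, rho, nu):
    """Stand-in short-maturity smile approximation (not the paper's formula)."""
    return np.sqrt(v0) + rho * nu / (4.0 * np.sqrt(v0)) * k

def calibrate(k_market, iv_market):
    """Least-squares fit of (v0, rho, nu) to short-maturity implied vols."""
    def residuals(params):
        v0, rho, nu = params
        return implied_vol_approx(k_market, v0, rho, nu) - iv_market
    result = least_squares(residuals, x0=[0.04, -0.5, 0.5],
                           bounds=([1e-4, -0.999, 1e-3], [1.0, 0.999, 5.0]))
    return result.x

# Illustrative quotes (log-moneyness, implied vol) for one short expiry
k_market = np.array([-0.10, -0.05, 0.0, 0.05, 0.10])
iv_market = np.array([0.235, 0.221, 0.208, 0.197, 0.188])
v0, rho, nu = calibrate(k_market, iv_market)
```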