992 results for algorithm Context
Abstract:
Executive Summary. The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.
Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation undertaken to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.
Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to realized portfolio returns that first-order stochastically dominate those resulting from optimization with respect to a single measure only, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measure lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all individual performance measures considered.
Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
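The dominance checks described above can be illustrated with a short sketch (not the authors' code): given two samples of realized returns, it runs the Kolmogorov-Smirnov test, compares empirical CDFs for first-order stochastic dominance, and compares expected-shortfall (absolute Lorenz) curves for second-order stochastic dominance; the simulated return samples and the quantile grid are illustrative assumptions.

    import numpy as np
    from scipy.stats import ks_2samp

    def empirical_cdf(sample, grid):
        # Fraction of observations at or below each grid point.
        return np.searchsorted(np.sort(sample), grid, side="right") / len(sample)

    def expected_shortfall_curve(sample, quantiles):
        # Absolute Lorenz curve: mean of the worst q-fraction of returns, per quantile q.
        s = np.sort(sample)
        n = len(s)
        return np.array([s[: max(1, int(np.ceil(q * n)))].mean() for q in quantiles])

    def dominance_report(r_agg, r_single, quantiles=np.linspace(0.01, 0.99, 99)):
        # Step 1: Kolmogorov-Smirnov test -- are the two return distributions different?
        ks_stat, p_value = ks_2samp(r_agg, r_single)
        grid = np.linspace(min(r_agg.min(), r_single.min()),
                           max(r_agg.max(), r_single.max()), 200)
        # Step 2: first-order stochastic dominance -- the aggregated strategy's CDF
        # lies (weakly) below the single-measure CDF everywhere.
        fosd = np.all(empirical_cdf(r_agg, grid) <= empirical_cdf(r_single, grid))
        # Step 3: second-order stochastic dominance -- its expected-shortfall
        # (absolute Lorenz) curve lies (weakly) above the single-measure curve.
        sosd = np.all(expected_shortfall_curve(r_agg, quantiles)
                      >= expected_shortfall_curve(r_single, quantiles))
        return {"ks_p_value": p_value, "fosd": fosd, "sosd": sosd}

    # Illustrative use with simulated returns standing in for realized portfolio returns.
    rng = np.random.default_rng(0)
    print(dominance_report(rng.normal(0.006, 0.04, 1000), rng.normal(0.004, 0.05, 1000)))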
Abstract:
We discuss intrinsic noise effects in stochastic multiplicative-noise partial differential equations, which are qualitatively independent of the noise interpretation (Itô vs Stratonovich), in particular in the context of noise-induced ordering phase transitions. We study a model which, contrary to all cases known so far, exhibits such ordering transitions when the noise is interpreted not only according to Stratonovich, but also to Itô. The main feature of this model is the absence of a linear instability at the transition point. The dynamical properties of the resulting noise-induced growth processes are studied and compared in the two interpretations and with a reference Ginzburg-Landau-type model. A detailed discussion of a different numerical algorithm valid for both interpretations is also presented.
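As a minimal illustration of the Itô/Stratonovich distinction for a single multiplicative-noise degree of freedom (not the spatially extended model or the specific algorithm of the paper), the sketch below compares an Euler-Maruyama (Itô) step with a Heun-type (Stratonovich) step for dx = f(x)dt + g(x)dW; the drift and noise amplitude are illustrative choices, with the cubic drift echoing the absence of a linear instability.

    import numpy as np

    def f(x):
        return -x**3           # illustrative drift with no linear term (no linear instability)

    def g(x):
        return 1.0 + x**2      # illustrative multiplicative noise amplitude

    def euler_maruyama_step(x, dt, dW):
        # Ito interpretation: the noise amplitude is evaluated at the pre-point.
        return x + f(x) * dt + g(x) * dW

    def stratonovich_heun_step(x, dt, dW):
        # Stratonovich interpretation: predictor-corrector (Heun) step, which evaluates
        # drift and noise amplitude as an average of pre- and predicted post-point values.
        x_pred = x + f(x) * dt + g(x) * dW
        return x + 0.5 * (f(x) + f(x_pred)) * dt + 0.5 * (g(x) + g(x_pred)) * dW

    rng = np.random.default_rng(1)
    dt, n_steps = 1e-3, 10_000
    x_ito = x_strat = 0.1
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))   # same noise realization fed to both schemes
        x_ito = euler_maruyama_step(x_ito, dt, dW)
        x_strat = stratonovich_heun_step(x_strat, dt, dW)
    print(x_ito, x_strat)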
Abstract:
We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children which relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up-to-date with the latest developments in the field. The reasonable application of our algorithm together with common sense should enable the pediatrician to decide whether pediatric (P)-CAM represents potential harm to the patient, and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale and give a concrete clinical example.
Abstract:
We present a numerical method for spectroscopic ellipsometry of thick transparent films. Assuming an analytical expression for the dispersion of the refractive index (which contains several unknown coefficients), the procedure is based on fitting the coefficients at a fixed thickness. The thickness is then varied within a range chosen according to its approximate value. The final result given by our method is as follows: the sample thickness is taken to be the one that gives the best fit, and the refractive index is defined by the coefficients obtained for this thickness.
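A sketch of the nested fitting loop described above, assuming for illustration a Cauchy dispersion law n(λ) = A + B/λ², a single film on a silicon-like substrate at 70° incidence, and synthetic "measured" data; none of these choices are taken from the paper itself.

    import numpy as np
    from scipy.optimize import least_squares

    def fresnel_rho(n_film, d_nm, lam_nm, n_sub=3.85 + 0.02j, theta0=np.deg2rad(70)):
        # rho = r_p / r_s for an air / film / substrate stack (standard Airy formula);
        # substrate index roughly silicon-like, purely illustrative.
        n0 = 1.0
        cos0 = np.cos(theta0)
        sin1 = n0 * np.sin(theta0) / n_film
        cos1 = np.sqrt(1 - sin1**2)
        sin2 = n0 * np.sin(theta0) / n_sub
        cos2 = np.sqrt(1 - sin2**2)
        r01p = (n_film * cos0 - n0 * cos1) / (n_film * cos0 + n0 * cos1)
        r01s = (n0 * cos0 - n_film * cos1) / (n0 * cos0 + n_film * cos1)
        r12p = (n_sub * cos1 - n_film * cos2) / (n_sub * cos1 + n_film * cos2)
        r12s = (n_film * cos1 - n_sub * cos2) / (n_film * cos1 + n_sub * cos2)
        beta = 2 * np.pi * d_nm * n_film * cos1 / lam_nm
        phase = np.exp(-2j * beta)
        rp = (r01p + r12p * phase) / (1 + r01p * r12p * phase)
        rs = (r01s + r12s * phase) / (1 + r01s * r12s * phase)
        return rp / rs  # tan(Psi) * exp(i * Delta)

    def cauchy_n(coeffs, lam_nm):
        A, B = coeffs
        return A + B / lam_nm**2   # illustrative Cauchy dispersion law

    def residuals(coeffs, d_nm, lam_nm, rho_measured):
        diff = fresnel_rho(cauchy_n(coeffs, lam_nm), d_nm, lam_nm) - rho_measured
        return np.concatenate([diff.real, diff.imag])

    def fit_thickness_and_index(lam_nm, rho_measured, d_grid_nm):
        best = None
        for d in d_grid_nm:                       # vary the thickness within a range
            fit = least_squares(residuals, x0=[1.5, 1e4],
                                args=(d, lam_nm, rho_measured))
            if best is None or fit.cost < best[2]:
                best = (d, fit.x, fit.cost)       # keep the thickness giving the best fit
        return best                               # (thickness, [A, B], residual)

    # Illustrative use with synthetic "measured" data for a 2000 nm film.
    lam = np.linspace(400, 800, 81)
    rho_meas = fresnel_rho(cauchy_n([1.46, 3.6e3], lam), 2000.0, lam)
    print(fit_thickness_and_index(lam, rho_meas, np.arange(1900, 2101, 10)))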
Abstract:
BACKGROUND: The emergency department has been identified as an area within the health care sector with the highest reports of violence. The best way to control violence is to prevent it before it becomes an issue. Ideally, to prevent violent episodes we should eliminate all triggers of frustration and violence. Our study aims to assess the impact of a multi-faceted quality improvement program aimed at preventing incivility and violence against healthcare professionals working at the ophthalmological emergency department of a teaching hospital. METHODS/DESIGN: This study is a single-center, prospective, controlled time-series study with an alternate-month design. The prevention program is based on the successive implementation of five complementary interventions: a) an organizational approach with a standardized triage algorithm and a patient waiting-number screen, b) an environmental approach with clear signage of the premises, c) an educational approach with informational videos for patients and accompanying persons in waiting rooms, d) a human approach with a mediator in waiting rooms, and e) a security approach with surveillance cameras linked to hospital security. The primary outcome is the rate of incivility or violence by patients, or those accompanying them, against healthcare staff. All patients admitted to the ophthalmological emergency department, and those accompanying them, will be enrolled. In all, 45,260 patients will be included over a 24-month period. The unit of analysis will be the patient admitted to the emergency department. Data analysis will be blinded to allocation, but due to the nature of the intervention, physicians and patients will not be blinded. DISCUSSION: The strengths of this study include the active solicitation of event reporting, its prospective design, and the fact that it enables assessment of each of the interventions that make up the program. The challenge lies in identifying effective interventions, adapting them to the context of care in an emergency department, and thoroughly assessing their efficacy with a high level of proof. The study has been registered as a cRCT at clinicaltrials.gov (identifier: NCT02015884).
Abstract:
Research on Public Service Motivation (PSM) has increased enormously in the last 20 years. Besides the analysis of the antecedents of PSM and its impact on organizations and individuals, many open questions about the nature of PSM itself remain. This article argues that the theoretical construct of PSM should be contextualized by integrating the political and administrative contexts of public servants when investigating their specific attitudes towards working in a public environment. It also challenges the efficacy of the classic four-dimensional structure of PSM when it is applied to a specific context. The findings of a confirmatory factor analysis on a dataset of 3754 employees of 279 Swiss municipalities support the appropriateness of contextualizing parts of the PSM construct. They also support the addition of an extra dimension, labelled Swiss democratic governance in line with previous research. In light of our results, further PSM research is needed to settle on a definitive measure of PSM, particularly with regard to the international diffusion of empirical research on PSM. Points for practitioners: This study shows that public service motivation is a relevant construct for practitioners and may be used to better assess whether or not public agents are motivated by values. Nevertheless, it also stresses that the measurement of PSM must be adapted to the institutional context. Public managers interested in better understanding the degree to which their employees are motivated by public values must be aware that the measurement of the PSM construct has to be contextualized. In other words, PSM is also a function of the institutional environment in which organizations operate.
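As a minimal sketch of the kind of confirmatory factor analysis described above, the following uses the semopy package (one possible tool) with synthetic item scores; the five latent factors follow the commonly cited four PSM dimensions plus the extra Swiss democratic governance dimension, and the item names, loadings, and data are entirely hypothetical.

    import numpy as np
    import pandas as pd
    import semopy

    # Synthetic item scores: 15 hypothetical items loading on 5 latent factors.
    rng = np.random.default_rng(4)
    n = 500
    latent = rng.normal(size=(n, 5))
    data = pd.DataFrame({f"q{f + 1}_{i + 1}": 0.7 * latent[:, f] + rng.normal(scale=0.5, size=n)
                         for f in range(5) for i in range(3)})

    # Five-factor measurement model in lavaan-style syntax.
    model_desc = """
    AttractionToPolicyMaking   =~ q1_1 + q1_2 + q1_3
    CommitmentToPublicInterest =~ q2_1 + q2_2 + q2_3
    Compassion                 =~ q3_1 + q3_2 + q3_3
    SelfSacrifice              =~ q4_1 + q4_2 + q4_3
    SwissDemocraticGovernance  =~ q5_1 + q5_2 + q5_3
    """

    model = semopy.Model(model_desc)
    model.fit(data)
    print(model.inspect())           # factor loadings and variances
    print(semopy.calc_stats(model))  # fit indices (CFI, RMSEA, ...)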
Abstract:
The observation that real complex networks have internal structure has important implications for dynamic processes occurring on such topologies. Here we investigate the impact of community structure on a model of information transfer able to deal with both search and congestion simultaneously. We show that networks with fuzzy community structure are more efficient in terms of packet delivery than those with pronounced community structure. We also propose an alternative packet routing algorithm that takes advantage of the knowledge of communities to improve information transfer, and show that, in the context of the model, an intermediate level of community structure is optimal. Finally, we show that in a hierarchical network setting, providing knowledge of communities at the level of highest modularity improves network capacity by the largest amount.
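One simple way to exploit community knowledge in routing (a hedged sketch, not necessarily the authors' algorithm) is a local search that prefers neighbours belonging to the destination's community; the example below builds a planted-partition network with networkx and detects communities by greedy modularity maximization, both of which are illustrative choices.

    import random
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def community_of(partition, node):
        # Index of the community containing `node`.
        return next(i for i, comm in enumerate(partition) if node in comm)

    def community_aware_route(G, partition, source, target, max_hops=1000):
        # Local search exploiting community membership: deliver directly if the target
        # is a neighbour, otherwise prefer neighbours in the target's community,
        # otherwise take a random step.
        target_comm = community_of(partition, target)
        path, current = [source], source
        for _ in range(max_hops):
            if current == target:
                return path
            nbrs = list(G.neighbors(current))
            if target in nbrs:
                current = target
            else:
                preferred = [n for n in nbrs if community_of(partition, n) == target_comm]
                current = random.choice(preferred or nbrs)
            path.append(current)
        return None  # not delivered within the hop budget (counts as congestion/loss)

    random.seed(5)
    G = nx.planted_partition_graph(4, 25, 0.3, 0.02, seed=5)   # 4 communities of 25 nodes
    partition = greedy_modularity_communities(G)
    route = community_aware_route(G, partition, 0, 99)
    print(len(route) - 1 if route else "undelivered")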
Abstract:
Context: Foreign body aspiration (FbA) is a serious problem in children. Accurate clinical and radiographic diagnosis is important because missed or delayed diagnosis can result in respiratory difficulties ranging from life-threatening airway obstruction to chronic wheezing or recurrent pneumonia. Bronchoscopy also carries risks, and accurate clinical and radiographic diagnosis can support the decision to perform bronchoscopy. Objective: To review the diagnostic accuracy of clinical presentation (CP) and the pulmonary radiograph (PR) for the diagnosis of FbA; there is no previous review. Methods: A Medline search was conducted for articles containing data regarding CP and PR signs of FbA. Likelihood ratios (LR) and pre- and post-test probabilities using Bayes' theorem were calculated for all signs of CP and PR. Inclusion criteria: articles containing prospective data regarding CP and PR in FbA. Exclusion criteria: retrospective studies and articles containing incomplete data for calculation of LR. Results: Five prospective studies were included, with a total of 585 patients. The prevalence of FbA is 63% in children suspected of FbA. If CP is normal, the probability of FbA is 25%, and if PR is normal, the probability is 14%. If CP is pathologic, the probability of FbA is 69-76% in the presence of cough (LR = 1.32), dyspnea (LR = 1.84) or localized crackles (LR = 1.5). The probability is 81-88% if cyanosis (LR = 4.8), decreased breath sounds (LR = 4.3), asymmetric auscultation (LR = 2.9) or localized wheezing (LR = 2.5) are present. When CP is abnormal and PR shows mediastinal shift (LR = 100), pneumomediastinum (LR = 100), a radio-opaque foreign body (LR = 100), lobar distention (LR = 4), atelectasis (LR = 2.5) or abnormal inspiratory/expiratory views (LR = 7), the probability of FbA is 96-100%. If CP is normal and PR is abnormal, the probability is 40-100%. If CP is abnormal and PR is normal, the probability is 55-75%. Conclusions: This review of prospective studies demonstrates the importance of CP and PR, and an algorithm can be proposed. When CP is abnormal, with or without a pathologic PR, the probability of FbA is high and bronchoscopy is indicated. When CP and PR are normal, the probability of FbA is low and bronchoscopy is not immediately necessary; observation should be proposed. This approach should be validated in a prospective study.
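The post-test probabilities quoted above follow from Bayes' theorem in odds form (post-test odds = pre-test odds × LR); a small sketch reproducing them, up to rounding, from the reported prevalence and likelihood ratios:

    def post_test_probability(pre_test_prob, likelihood_ratio):
        # Bayes' theorem in odds form: post-test odds = pre-test odds * LR.
        pre_odds = pre_test_prob / (1 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    prevalence = 0.63  # pre-test probability of FbA among suspected children (from the review)
    for sign, lr in [("cough", 1.32), ("dyspnea", 1.84), ("localized crackles", 1.5),
                     ("cyanosis", 4.8), ("decreased breath sounds", 4.3),
                     ("asymmetric auscultation", 2.9), ("localized wheezing", 2.5)]:
        print(f"{sign}: post-test probability = {post_test_probability(prevalence, lr):.0%}")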
Abstract:
The worldwide prevalence of smoking has been estimated at about 50% in men and 10% in women, with large variations among the different populations studied. Smoking has been shown to affect many organ systems, resulting in severe morbidity and increased mortality. In addition, smoking has been identified as a predictor of ten-year fracture risk in men and women, largely independent of an individual's bone mineral density. This finding eventually led to the incorporation of this risk factor into FRAX®, an algorithm that has been developed to calculate an individual's ten-year fracture risk. However, only little or conflicting data are available on a possible association between smoking dose, duration, time since cessation, or type of tobacco and fracture risk, limiting this risk factor's applicability in the context of FRAX®.
Abstract:
Research project (EDU2011-25960), Ministerio de Ciencia e Innovación.
Abstract:
Neutrophils are massively and rapidly recruited following infection. They migrate to the site of acute infection and also, transiently, to dLNs. In addition to their well-established role as microbial killers, accumulating evidence shows that neutrophils can play an immunoregulatory role. Neutrophils were recently shown to influence the activation of different leukocyte types, including NK cells, B cells, and DCs. DCs are professional APCs playing a key role in the launching and regulation of the immune response; thus, crosstalk between neutrophils and resident or newly recruited DCs may have a direct impact on the development of the antigen-specific immune response and thereby on the outcome of infection. Neutrophils may regulate DC recruitment and/or activation. We review here recent progress in the field, including work presented during the first international symposium on "Neutrophil in Immunity", held in Québec, Canada, in June 2012, and discuss how neutrophil regulatory action on DCs may differ depending on the type of invading microorganism and local host factors.
Abstract:
The primary goal of this project is to demonstrate the accuracy and utility of a freezing drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). The problems related to the occurrence of freezing precipitation range from simple traffic delays to major accidents involving fatalities. Freezing drizzle can also lead to economic impacts on communities through lost work hours, vehicular damage, and downed power lines. Transportation agencies have means to perform preventive and reactive treatments of roadways, but freezing drizzle can be difficult to forecast accurately or even to detect, since weather radar and surface observation networks observe these conditions poorly. The detection of freezing precipitation is problematic and requires special instrumentation and analysis. The Federal Aviation Administration's (FAA) development of aircraft anti-icing and deicing technologies has led to a freezing drizzle algorithm that utilizes air temperature data and a specialized sensor capable of detecting ice accretion. However, at present, roadway ESSs are not capable of reporting freezing drizzle. This study investigates the use of the methods developed for the FAA and the National Weather Service (NWS) within a roadway environment to detect the occurrence of freezing drizzle using a combination of icing detection equipment and available ESS sensors. The work performed in this study incorporated the algorithm initially developed, and further modified, in work with the FAA for aircraft icing. The freezing drizzle algorithm developed for the FAA was applied using data from standard roadway ESSs. The work performed in this study lays the foundation for addressing the central question of interest to winter maintenance professionals: whether it is possible to use roadside freezing precipitation detection (e.g., icing detection) sensors to determine the occurrence of pavement icing during freezing precipitation events and the rates at which it occurs.
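The FAA-derived algorithm itself is not spelled out in the abstract; as a rough illustration of the kind of rule applied to ESS data, the following sketch flags freezing drizzle when the icing sensor reports accretion while the air temperature is at or below freezing and the standard precipitation sensor reports little or nothing (all thresholds are illustrative assumptions, not values from the study).

    from dataclasses import dataclass

    @dataclass
    class EssObservation:
        air_temp_c: float           # ESS air temperature (deg C)
        ice_accretion_mm_hr: float  # rate from the dedicated icing sensor
        precip_rate_mm_hr: float    # rate from the standard ESS precipitation sensor

    def freezing_drizzle_flag(obs, accretion_min=0.1, precip_max=0.25):
        # Illustrative rule: ice is accreting, the air is at or below freezing, and the
        # conventional precipitation sensor sees little or nothing (drizzle-sized drops
        # are poorly detected by radar and standard gauges, per the study's motivation).
        return (obs.air_temp_c <= 0.0
                and obs.ice_accretion_mm_hr >= accretion_min
                and obs.precip_rate_mm_hr <= precip_max)

    print(freezing_drizzle_flag(EssObservation(-1.5, 0.3, 0.1)))  # True: likely freezing drizzle
    print(freezing_drizzle_flag(EssObservation(2.0, 0.0, 0.0)))   # False: above freezing, no accretion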
Abstract:
3D dose reconstruction is a verification of the delivered absorbed dose. Our aim was to describe and evaluate a 3D dose reconstruction method applied to phantoms in the context of narrow beams. A solid water phantom and a phantom containing a bone-equivalent material were irradiated on a 6 MV linac. The transmitted dose was measured using one array of a 2D ion-chamber detector. The dose reconstruction was obtained with an iterative algorithm. A phantom set-up error and interfraction organ motion were simulated to test the algorithm's sensitivity. In all configurations, convergence was obtained within three iterations. A local reconstructed dose agreement of at least 3%/3 mm with respect to the planned dose was obtained, except at a few points in the penumbra. The reconstructed primary fluences were consistent with the planned ones, which validates the whole reconstruction process. The results validate our method in a simple geometry and for narrow beams. The method is sensitive to a set-up error of a heterogeneous phantom and to interfraction motion of a heterogeneous organ.
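As an illustration of a generic iterative reconstruction of this kind (a hedged sketch, not the paper's specific algorithm), the following applies a multiplicative ratio update to recover a narrow-beam primary fluence from a blurred transmitted signal, using a toy Gaussian detector-response matrix.

    import numpy as np

    def forward(fluence, response):
        # Hypothetical forward model: transmitted signal = detector response matrix @ fluence.
        return response @ fluence

    def iterative_reconstruction(measured, response, n_iter=3):
        # Generic multiplicative (ratio) update, run for a few iterations
        # (the paper reports convergence within three iterations).
        fluence = np.ones(response.shape[1])
        for _ in range(n_iter):
            predicted = forward(fluence, response)
            ratio = measured / np.maximum(predicted, 1e-12)
            fluence *= response.T @ ratio / response.sum(axis=0)
        return fluence

    # Toy example: blurred "transmitted dose" profile of a narrow beam.
    n = 64
    response = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(n)] for i in range(n)])
    true_fluence = np.zeros(n)
    true_fluence[28:36] = 1.0                   # narrow-beam primary fluence
    measured = forward(true_fluence, response)
    print(np.round(iterative_reconstruction(measured, response), 2))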