953 results for portfolio management process
Abstract:
In the absence of work that would significantly change the perspective on the management of diabetes in the past year, this article proposes a reflection on integrating the knowledge that has evolved over the past decade into clinical practice. The major preventive impact of an approach targeting all cardiovascular risk factors in diabetic patients remains the main lesson of this decade. Therapeutic goals need to be tailored to the individual patient's situation based on an evaluation of the benefit-to-inconvenience ratio of the treatments. The process of choosing them has to include the quest for a shared vision with the patient, who is in charge of diabetes management in daily life.
Abstract:
By the end of the 1970s, contaminated sites had emerged as one of the most complex and urgent environmental issues affecting industrialized countries. The authors show that small and prosperous Switzerland is no exception to the pervasive problem of site contamination, the legacy of past waste-management practices having left some 38,000 contaminated sites throughout the country. This book outlines the problem, offering evidence that open and polycentric environmental decision-making that includes civil society actors is valuable. The authors propose an understanding of the environmental management of contaminated sites as a political process in which institutions frame interactions between strategic actors pursuing sometimes conflicting interests. In the opening chapter, the authors describe the influence of politics and the power relationships between actors involved in decision-making in contaminated sites management, which they term a "wicked problem." Chapter Two offers a theoretical framework for understanding institutions and the environmental management of contaminated sites. The next five chapters present a detailed case study on environmental management and contaminated sites in Switzerland, focused on the Bonfol Chemical Landfill. The study and analysis cover the establishment of the landfill under the first generation of environmental regulations, its closure and early remediation efforts, and the gamble over remediation objectives, methods, and funding in the first decade of the 21st century. The concluding chapter discusses whether the strength of environmental regulations and the type of interactions between public, private, and civil society actors can explain the environmental choices made in contaminated sites management. Drawing lessons from the research, the authors debate the value of institutional flexibility for dealing with environmental issues such as contaminated sites.
Abstract:
This paper shows how to introduce liquidity into the well-known mean-variance framework of portfolio selection. Either by estimating mean-variance liquidity-constrained frontiers or by directly estimating optimal portfolios for alternative levels of risk aversion and preference for liquidity, we obtain strong effects of liquidity on optimal portfolio selection. In particular, portfolio performance, measured by the Sharpe ratio relative to the tangency portfolio, varies significantly with liquidity. Moreover, although mean-variance performance becomes clearly worse, the levels of liquidity on optimal portfolios obtained when there is a positive preference for liquidity are much lower than on those optimal portfolios where investors show no sign of preference for liquidity.
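The liquidity-constrained mean-variance problem the abstract describes can be sketched numerically. Everything below is illustrative (hypothetical expected returns, covariances, and per-asset liquidity scores; the paper estimates these from market data), but it shows the trade-off: imposing a minimum portfolio-level liquidity tilts weights toward liquid assets at the cost of mean-variance efficiency.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs: expected returns, covariance matrix, and per-asset
# liquidity scores (e.g., turnover ratios). All values are invented.
mu = np.array([0.08, 0.12, 0.10])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.06]])
liq = np.array([0.9, 0.3, 0.6])   # higher = more liquid

def solve(gamma, min_liq):
    """Maximize mu'w - (gamma/2) w'Cov w  s.t.  sum(w)=1, liq'w >= min_liq, w >= 0."""
    n = len(mu)
    res = minimize(
        lambda w: -(mu @ w) + 0.5 * gamma * w @ cov @ w,
        x0=np.full(n, 1.0 / n),
        bounds=[(0.0, 1.0)] * n,
        constraints=[
            {"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: liq @ w - min_liq},
        ],
    )
    return res.x

w_free = solve(gamma=4.0, min_liq=0.0)   # unconstrained frontier point
w_liq = solve(gamma=4.0, min_liq=0.8)    # liquidity-constrained point
```

With these numbers the liquidity floor binds, so the constrained portfolio shifts weight toward the most liquid asset and sits below the unconstrained frontier, mirroring the effect the paper estimates.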
Abstract:
This paper explores the integration process that firms follow to implement Supply Chain Management (SCM) and the main barriers and benefits related to this strategy. This study has been inspired by the SCM literature, especially the logistics integration model by Stevens [1]. Due to the exploratory nature of this paper and the need to obtain an in-depth knowledge of SCM development in the Spanish grocery sector, we used the case study methodology. A multiple case study analysis based on interviews with leading manufacturers and retailers was conducted. The results of this analysis suggest that firms seem to follow the integration process proposed by Stevens, integrating internally first and then extending this integration to other supply chain members. The cases also show that Spanish manufacturers, in general, seem to have a higher level of SCM development than Spanish retailers. Regarding the benefits that SCM can bring, most of the companies identify the general objectives of cost and stock reductions and service improvements. However, with respect to the barriers found in its implementation, retailers and manufacturers do not coincide: manufacturers seem to see more barriers in aspects related to the other party, such as distrust and a lack of a culture of sharing information, while retailers cite as their main barriers the need for know-how, the company culture, and history and habits.
Abstract:
Doubts about the reliability of a company's qualitative financial disclosure increase market participant expectations from the auditor's report. The auditing process is supposed to serve as a monitoring device that reduces management incentives to manipulate reported earnings. Empirical research confirms that it could be an efficient device under some circumstances and recognizes that our estimates of the informativeness of audit reports are unavoidably biased (e.g., because of a client's anticipation of the auditing process). This empirical study supports the significant role of auditors in the financial market, in particular in the prevention of earnings management practice. We focus on earnings misstatements, which auditors correct with an adjustment, using a sample of past and current constituents of the benchmark market index in Spain, IBEX 35, and manually collected audit adjustments reported over the 1997-2004 period (42 companies, 336 annual reports, 75 earnings misstatements). Our findings confirm that companies more often overstate than understate their earnings. An investor may foresee earnings misreporting, as manipulators have a similar profile (e.g., more leveraged and with lower sales). However, he may receive valuable information from the audit adjustment on the size of earnings misstatement, which can be significantly large (i.e., material in almost all cases). We suggest that the magnitude of an audit adjustment depends, other things constant, on annual revenues and free cash levels. We also examine how the audit adjustment relates to the observed market price, trading volume and stock returns. Our findings are that earnings manipulators have a lower price and larger trading volume compared to their rivals. Their returns are positively associated with the magnitude of earnings misreporting, which is not consistent with the possible pricing of audit information.
Abstract:
The organisation of inpatient care provision has undergone significant reform in many southern European countries. Overall across Europe, public management is moving towards the introduction of more flexibility and autonomy. In this setting, the promotion of the further decentralisation of health care provision stands out as a salient policy option in all countries that have hitherto had a traditionally centralised structure. Yet the success of the incentives that decentralised structures create relies on the institutional design at the organisational level, especially in respect of achieving efficiency and promoting policy innovation without harming the essential principle of equal access for equal need that grounds National Health Systems (NHS). This paper explores some of the specific organisational developments of decentralisation structures drawing from the Spanish experience, particularly those in Catalonia. This experience provides some evidence of the extent to which decentralised organisational structures that expand levels of autonomy and flexibility lead to organisational innovation while promoting activity and efficiency. In addition to this purely managerial decentralisation process, Spain is of particular interest as a result of the specific regional NHS decentralisation that started in the early 1980s and was completed in 2002, when all seventeen autonomous communities that make up the country had responsibility for health care services. Already there is some evidence to suggest that this process of decentralisation has been accompanied by a degree of policy innovation and informal regional cooperation. Indeed, the Spanish experience is relevant because both institutional changes took place: managerial decentralisation leading to higher flexibility and autonomy, alongside increasing political decentralisation at the regional level.
The coincidence of both processes could potentially explain why some organisational and policy innovation resulting from policy experimentation at the regional level might be an additional feature to take into account when examining the benefits of decentralisation.
Abstract:
Organisations are becoming increasingly aware of the need for management information systems, due largely to the changing environment and a continuous process of globalisation. All of this means that managers need to adapt the structures of their organisations to the changes and, therefore, to plan, control and manage better. The Spanish public university cannot escape this changing (demographic, economic and social) and globalising (among other things, the convergence of European qualifications) environment, to which we must add its complex organisational structure, characterised by a high dispersion of decision-making authority across different collegiate and single-person bodies. It seems obvious that these changes must have repercussions on the direction, organisation and management structures of these public higher education institutions, and it seems natural that, given this environment, universities must adapt their present management systems to society's demand for quality and suitability in the services they provide.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data that they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality. A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least without consensus on their validity. How, then, to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process.
We take care to emphasize that we want to prove the said model to be the cause of performance, and to compare against an (incumbent) process rather than against an alternate model. In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an undisputable model of customer behavior, but the experimental design and analysis of results are less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
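As a toy illustration of the "within-model" simulation testing the abstract critiques (not the paper's live experiment), the sketch below draws booking demand from an assumed model and compares a simple protection-level control against a first-come-first-served incumbent. Because demand is generated by the very model the control assumes, the comparison can only show within-model gains; all fares, the capacity, and the demand ranges are invented.

```python
import random

random.seed(0)
CAPACITY, FARE_HI, FARE_LO = 100, 300.0, 100.0

def simulate(protect_hi, n_lo, n_hi):
    """Low-fare demand books first; reserve `protect_hi` seats for high fare."""
    sold_lo = min(n_lo, max(0, CAPACITY - protect_hi))
    sold_hi = min(n_hi, CAPACITY - sold_lo)
    return sold_lo * FARE_LO + sold_hi * FARE_HI

def avg_revenue(protect_hi, trials=2000):
    """Average revenue under the assumed (uniform) demand model."""
    total = 0.0
    for _ in range(trials):
        n_lo = random.randint(60, 140)   # assumed low-fare demand model
        n_hi = random.randint(20, 60)    # assumed high-fare demand model
        total += simulate(protect_hi, n_lo, n_hi)
    return total / trials

rev_fcfs = avg_revenue(protect_hi=0)    # incumbent: first come, first served
rev_ctrl = avg_revenue(protect_hi=40)   # protect seats for late high-fare demand
```

The protected policy wins here by construction; the abstract's point is that such a result says nothing about what happens when real customers do not follow the assumed model.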
Abstract:
This paper presents several applications to interest rate risk management based on a two-factor continuous-time model of the term structure of interest rates previously presented in Moreno (1996). This model assumes that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Several new measures of "generalized duration" are presented and applied in different situations in order to manage market risk and yield curve risk. By means of these measures, we are able to compute the hedging ratios that allow us to immunize a bond portfolio by means of options on bonds. Focusing on the hedging problem, it is shown that these new measures allow us to immunize a bond portfolio against changes (parallel and/or in the slope) in the yield curve. Finally, a proposed solution to the limitations of conventional duration by means of these new measures is presented and illustrated numerically.
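The paper's two-factor generalized duration measures are not reproduced here, but the underlying immunization logic can be sketched in its simplest one-factor form: match first-order price sensitivities so that a parallel yield shift leaves the hedged position approximately unchanged. All bond data below are invented for illustration, and continuous compounding at a flat yield is assumed.

```python
import numpy as np

def price(cashflows, times, y):
    """Present value of a bond's cashflows at flat yield y (continuous compounding)."""
    return sum(c * np.exp(-y * t) for c, t in zip(cashflows, times))

def duration(cashflows, times, y):
    """PV-weighted average time to cashflow (Fisher-Weil-style duration)."""
    p = price(cashflows, times, y)
    return sum(t * c * np.exp(-y * t) for c, t in zip(cashflows, times)) / p

# Illustrative bonds (annual coupons, face 100), flat yield 4%.
y = 0.04
b1_cf, b1_t = [5, 5, 105], [1, 2, 3]               # 3-year bond to be hedged
b2_cf, b2_t = [8, 8, 8, 8, 108], [1, 2, 3, 4, 5]   # 5-year hedging instrument

p1, d1 = price(b1_cf, b1_t, y), duration(b1_cf, b1_t, y)
p2, d2 = price(b2_cf, b2_t, y), duration(b2_cf, b2_t, y)

# Hedge ratio: short h units of bond 2 per unit of bond 1 so the combined
# first-order sensitivity to a parallel shift cancels: h = (P1*D1)/(P2*D2).
h = (p1 * d1) / (p2 * d2)

# Check: a small parallel shift barely moves the hedged position.
dy = 0.001
unhedged = price(b1_cf, b1_t, y + dy) - p1
hedged = unhedged - h * (price(b2_cf, b2_t, y + dy) - p2)
```

The residual in `hedged` is second order in `dy`; protecting against slope changes as well is what requires the paper's second factor and its generalized duration measures.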
Abstract:
OBJECTIVE: To provide an update to the original Surviving Sepsis Campaign clinical management guidelines, "Surviving Sepsis Campaign guidelines for management of severe sepsis and septic shock," published in 2004. DESIGN: Modified Delphi method with a consensus conference of 55 international experts, several subsequent meetings of subgroups and key individuals, teleconferences, and electronic-based discussion among subgroups and among the entire committee. This process was conducted independently of any industry funding. METHODS: We used the GRADE system to guide assessment of quality of evidence from high (A) to very low (D) and to determine the strength of recommendations. A strong recommendation indicates that an intervention's desirable effects clearly outweigh its undesirable effects (risk, burden, cost), or clearly do not. Weak recommendations indicate that the tradeoff between desirable and undesirable effects is less clear. The grade of strong or weak is considered of greater clinical importance than a difference in letter level of quality of evidence. In areas without complete agreement, a formal process of resolution was developed and applied. Recommendations are grouped into those directly targeting severe sepsis, recommendations targeting general care of the critically ill patient that are considered high priority in severe sepsis, and pediatric considerations. 
RESULTS: Key recommendations, listed by category, include: early goal-directed resuscitation of the septic patient during the first 6 hrs after recognition (1C); blood cultures prior to antibiotic therapy (1C); imaging studies performed promptly to confirm potential source of infection (1C); administration of broad-spectrum antibiotic therapy within 1 hr of diagnosis of septic shock (1B) and severe sepsis without septic shock (1D); reassessment of antibiotic therapy with microbiology and clinical data to narrow coverage, when appropriate (1C); a usual 7-10 days of antibiotic therapy guided by clinical response (1D); source control with attention to the balance of risks and benefits of the chosen method (1C); administration of either crystalloid or colloid fluid resuscitation (1B); fluid challenge to restore mean circulating filling pressure (1C); reduction in rate of fluid administration with rising filling pressures and no improvement in tissue perfusion (1D); vasopressor preference for norepinephrine or dopamine to maintain an initial target of mean arterial pressure > or = 65 mm Hg (1C); dobutamine inotropic therapy when cardiac output remains low despite fluid resuscitation and combined inotropic/vasopressor therapy (1C); stress-dose steroid therapy given only in septic shock after blood pressure is identified to be poorly responsive to fluid and vasopressor therapy (2C); recombinant activated protein C in patients with severe sepsis and clinical assessment of high risk for death (2B, except 2C for postoperative patients).
In the absence of tissue hypoperfusion, coronary artery disease, or acute hemorrhage, target a hemoglobin of 7-9 g/dL (1B); a low tidal volume (1B) and limitation of inspiratory plateau pressure strategy (1C) for acute lung injury (ALI)/acute respiratory distress syndrome (ARDS); application of at least a minimal amount of positive end-expiratory pressure in acute lung injury (1C); head of bed elevation in mechanically ventilated patients unless contraindicated (1B); avoiding routine use of pulmonary artery catheters in ALI/ARDS (1A); to decrease days of mechanical ventilation and ICU length of stay, a conservative fluid strategy for patients with established ALI/ARDS who are not in shock (1C); protocols for weaning and sedation/analgesia (1B); using either intermittent bolus sedation or continuous infusion sedation with daily interruptions or lightening (1B); avoidance of neuromuscular blockers, if at all possible (1B); institution of glycemic control (1B) targeting a blood glucose < 150 mg/dL after initial stabilization (2C); equivalency of continuous veno-venous hemofiltration or intermittent hemodialysis (2B); prophylaxis for deep vein thrombosis (1A); use of stress ulcer prophylaxis to prevent upper GI bleeding using H2 blockers (1A) or proton pump inhibitors (1B); and consideration of limitation of support where appropriate (1D). Recommendations specific to pediatric severe sepsis include: greater use of physical examination therapeutic end points (2C); dopamine as the first drug of choice for hypotension (2C); steroids only in children with suspected or proven adrenal insufficiency (2C); a recommendation against the use of recombinant activated protein C in children (1B). CONCLUSION: There was strong agreement among a large cohort of international experts regarding many level 1 recommendations for the best current care of patients with severe sepsis.
Evidence-based recommendations regarding the acute management of sepsis and septic shock are the first step toward improved outcomes for this important group of critically ill patients.
Abstract:
The parasellar region is the location of a wide variety of inflammatory and benign or malignant lesions. A pathological diagnostic strategy may be difficult to establish relying solely on imaging data. Percutaneous biopsy through the foramen ovale using the Hartel technique has been developed to aid the decision-making process. It is an accurate diagnostic tool that allows a pathological diagnosis to determine the best treatment strategy. However, in some cases, this procedure may fail or may be inappropriate, particularly for anterior parasellar lesions. Over the past decades, endoscopy has been widely developed and promoted for many indications. It represents an interesting alternative approach to parasellar lesions with low morbidity compared to the classic microscopic subtemporal extradural approach with or without orbito-zygomatic removal. In this chapter, we describe our experience with the endoscopic approach to parasellar lesions. We propose a complete overview of the surgical anatomy and describe the methods and results of the technique. We also suggest a model of a decision-making tree for the diagnosis and treatment of parasellar lesions.
Abstract:
BACKGROUND: Open lung biopsy (OLB) is helpful in the management of patients with acute respiratory distress syndrome (ARDS) of unknown etiology. We determined the impact of surgical lung biopsies performed at the bedside on the management of patients with ARDS. METHODS: We reviewed all consecutive cases of patients with ARDS who underwent a surgical OLB at the bedside in a medical intensive care unit between 1993 and 2005. RESULTS: Biopsies were performed in 19 patients mechanically ventilated for ARDS of unknown etiology despite an extensive diagnostic process and empirical therapeutic trials. Among them, 17 (89%) were immunocompromised and 10 patients had hematological malignancies. Surgical biopsies were obtained after a median (25th-75th percentile) of 5 (2-11) days of mechanical ventilation; the mean (+/-SD) Pao(2)/Fio(2) ratio was 119.3 (+/-34.2) mm Hg. Histologic diagnoses were obtained in all cases and were specific in 13 patients (68%), including 9 (47%) not previously suspected. Immediate complications (26%) were local (pneumothorax, minimal bleeding) without general or respiratory consequences. The biopsy resulted in major changes in management in 17 patients (89%). It contributed to a decision to limit care in 12 of the 17 patients who died. CONCLUSION: Our data confirm that surgical OLB may have an important impact on the management of patients with ARDS of unknown etiology after an extensive diagnostic process. The procedure can be performed at the bedside, is safe, and has a high diagnostic yield leading to major changes in management, including withdrawal of vital support, in the majority of patients.
Abstract:
Vertebral fracture (VF) is the most common osteoporotic fracture and is associated with high morbidity and mortality. Conservative treatment combining antalgic agents and rest is usually recommended for symptomatic VFs. The aim of this paper is to review the randomized controlled trials comparing the efficacy and safety of percutaneous vertebroplasty (VP) and percutaneous balloon kyphoplasty (KP) versus conservative treatment. VP and KP procedures are associated with acceptable overall safety. Although the case series investigating VP/KP have all shown an outstanding analgesic benefit, randomized controlled studies are rare and have yielded contradictory results. In several of these studies, a short-term analgesic benefit was observed, except in the prospective randomized sham-controlled studies. A long-term analgesic and functional benefit has rarely been noted. Several recent studies have shown that both VP and KP are associated with an increased risk of new VFs. These fractures are mostly VFs adjacent to the procedure, and they occur within a shorter time period than VFs in other locations. The main risk factors include the number of preexisting VFs, the number of VPs/KPs performed, age, decreased bone mineral density, and intradiscal cement leakage. It is therefore important to involve the patients to whom VP/KP is being proposed in the decision-making process. It is also essential to rapidly initiate a specific osteoporosis therapy when a VF occurs (ideally a bone anabolic treatment) so as to reduce the risk of fracture. Randomized controlled studies are necessary in order to better define the profile of patients who are likely to benefit most from VP/KP.
Abstract:
Despite numerous discussions, workshops, reviews and reports about the responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for completing a rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate that there are reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have the duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that allows the nano laboratory to be classified into three hazard classes, similar to a control banding approach (from Nano 3, the highest hazard, to Nano 1, the lowest hazard). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly access the precautionary hazard class of their activities and the corresponding adequate safety and health measures.
We have succeeded in convincing scientists engaged in nano-activities that adequate safety measures and management promote innovation and discovery by ensuring a safe environment, even in the case of very novel products. The proposed measures are not seen as constraints but as a support to their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. It is our opinion that it would be useful to other research and academic institutions as well.
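The control-banding idea the abstract describes, where a decision tree assigns a lab to a precautionary hazard class and each class maps to a list of mitigation measures, can be sketched as follows. The branching criteria, thresholds, and measure lists below are entirely hypothetical stand-ins; the actual decision tree and measures are those defined in the paper's methodology.

```python
def classify_lab(nanopowder_handled: bool, quantity_g: float,
                 dispersible_in_air: bool) -> str:
    """Toy decision tree: return 'Nano 3' (highest hazard) .. 'Nano 1' (lowest).
    Criteria and thresholds are invented for illustration."""
    if nanopowder_handled and dispersible_in_air and quantity_g >= 1.0:
        return "Nano 3"
    if nanopowder_handled or dispersible_in_air:
        return "Nano 2"
    return "Nano 1"

# Each hazard class maps to required mitigation measures
# (technical, organizational, personal) -- again, illustrative only.
MEASURES = {
    "Nano 3": ["fume hood or glovebox", "respiratory protection", "access control"],
    "Nano 2": ["fume hood", "gloves and lab coat"],
    "Nano 1": ["standard lab hygiene"],
}

level = classify_lab(nanopowder_handled=True, quantity_g=5.0,
                     dispersible_in_air=True)
required = MEASURES[level]
```

The point of the banding approach is exactly this lookup: a researcher answers a few questions about the activity and immediately obtains the precautionary class and its measure list, without needing exposure data that do not yet exist.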
Abstract:
This paper presents thermal modeling for power management of a new three-dimensional (3-D) thinned-die stacking process. Besides the high concentration of power-dissipating sources, which is the direct consequence of the very interesting increase in integration efficiency, this new ultra-compact packaging technology can suffer from the poor thermal conductivity (about 700 times smaller than that of silicon) of the benzocyclobutene (BCB) used as both adhesive and planarization layers in each level of the stack. Thermal simulation was conducted using a 3-D FEM tool to analyze the specific behaviors of such a stacked structure and to optimize the design rules. This study first describes the limitation of heat transfer through the vertical path, examining in particular the case of high-dissipating sources over a small area. First results of characterization in the transient regime, by means of a dedicated test device mounted in a single-level structure, are presented. For design optimization, the thermal draining capabilities of a copper grid or a full copper plate embedded in the intermediate layer of the stacked structure are evaluated as a function of the technological parameters and the physical properties. The results show a benefit for transverse heat extraction under the buffer devices that dissipate most of the power and are generally localized in the peripheral zone, and for temperature uniformization, by a heat-spreading mechanism, in localized regions where the attachment of the thin die is altered. Finally, all the conclusions of this analysis are used for quantitative projections of the thermal performance of a first demonstrator based on a three-level stacked structure for a space application.
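The vertical heat-path bottleneck the abstract describes can be illustrated with a first-order 1-D thermal resistance estimate. The layer thicknesses, source area, and conductivities below are generic textbook-style values, not the paper's data, but they show why a thin low-conductivity adhesive layer can dominate the path through a much thicker silicon die.

```python
# 1-D conduction: R = t / (k * A) for a uniform layer of thickness t,
# conductivity k, and cross-sectional area A.
K_SI = 150.0    # W/(m*K), typical bulk silicon
K_BCB = 0.2     # W/(m*K), typical BCB (~750x lower, matching the ~700x
                # ratio cited in the abstract)
AREA = (1e-3) ** 2   # 1 mm x 1 mm heat source footprint (illustrative)

def layer_resistance(thickness_m, conductivity):
    """Thermal resistance of one layer in K/W."""
    return thickness_m / (conductivity * AREA)

r_si = layer_resistance(50e-6, K_SI)    # 50 um thinned silicon die
r_bcb = layer_resistance(5e-6, K_BCB)   # 5 um BCB adhesive layer

# Despite being 10x thinner, the BCB layer's resistance dwarfs the die's.
ratio = r_bcb / r_si
```

This is exactly the situation that motivates the paper's embedded copper grid or plate: adding a high-conductivity transverse path around the BCB bottleneck.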