905 results for network revenue management
Abstract:
AIMS We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for the diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% of patients had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% had possible FH (95% CI 16.8-18.9%, n = 852) according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) of patients with possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients had possible FH. Excluding patients with secondary causes of dyslipidaemia, such as alcohol consumption, acute renal failure, or hyperglycaemia, did not change the prevalence estimates. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased their LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Long-term lipid treatment of patients with FH after ACS needs to be optimized.
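The two DLCN categories quoted above correspond to a simple threshold rule. A minimal sketch of that rule, assuming (only for illustration) that scores below 3 are reported as "unlikely/not classified", since the abstract defines just the two upper bands:

```python
def classify_dlcn(score: int) -> str:
    """Map a Dutch Lipid Clinic Network score to the categories used above:
    3-5 points = possible FH, >5 points = probable/definite FH.
    The handling of scores below 3 is an assumption for this sketch."""
    if score > 5:
        return "probable/definite FH"
    if 3 <= score <= 5:
        return "possible FH"
    return "unlikely/not classified"

print(classify_dlcn(6))  # probable/definite FH
```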
Abstract:
Ensuring sustainable use of natural resources is crucial for maintaining the basis of our livelihoods. With threats from climate change, disputes over water, biodiversity loss, competing claims on land, and migration increasing worldwide, the demand for sustainable land management (SLM) practices will only increase in the future. For years already, various national and international organizations (GOs, NGOs, donors, research institutes, etc.) have been working on alternative forms of land management. And numerous land users worldwide – especially small farmers – have been testing, adapting, and refining new and better ways of managing land. All too often, however, the resulting SLM knowledge has not been sufficiently evaluated, documented, and shared. Among other things, this has often prevented valuable SLM knowledge from being channelled into evidence-based decision-making processes. Indeed, proper knowledge management is crucial for SLM to reach its full potential. For more than 20 years, the international WOCAT network has documented and promoted SLM through its global platform. As a whole, the WOCAT methodology comprises tools for documenting, evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing, analysis, and use for decision support in the field, at the planning level, and in scaling up identified good practices. In early 2014, WOCAT's growth and ongoing improvement culminated in its being officially recognized by the UNCCD as the primary recommended database for SLM best practices. Over the years, the WOCAT network has confirmed that SLM helps to prevent desertification, increase biodiversity, enhance food security, and make people less vulnerable to the effects of climate variability and change. In addition, it plays an important role in mitigating climate change by improving soil organic matter and increasing vegetation cover. In-depth assessments of SLM practices from desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats. The impacts mentioned most were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Among other factors, favourable local-scale cost-benefit relationships of SLM practices play a crucial role in their adoption. An economic analysis from the WOCAT database showed that land users perceive a large majority of the technologies as having benefits that outweigh costs in the long term. The high investment costs associated with some practices may constitute a barrier to adoption; however, where appropriate, short-term support for land users can help to promote these practices. Increased global concern about climate change, disaster risks, and food security is redirecting attention to, and triggering more funds for, SLM. To provide the necessary evidence-based rationale for investing in SLM and to reinforce expert and land users' assessments of SLM impacts, more field research using inter- and transdisciplinary approaches is needed. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as is currently being pursued within the EU FP7 projects CASCADE and RECARE.
Abstract:
Atrial fibrillation (AF) is the most common sustained arrhythmia in the general population. As an age-related arrhythmia, AF is becoming a huge socio-economic burden for European healthcare systems. Despite significant progress in our understanding of the pathophysiology of AF, therapeutic strategies for AF have not changed substantially, and the major challenges in the management of AF are still unmet. This lack of progress may be related to the multifactorial pathogenesis of atrial remodelling and AF, which hampers the identification of causative pathophysiological alterations in individual patients. Moreover, new mechanisms continue to be identified, and the relative contribution of these mechanisms still has to be established. In November 2010, the European Union launched the large collaborative project EUTRAF (European Network of Translational Research in Atrial Fibrillation) to address these challenges. The main aims of EUTRAF are to study the main mechanisms of initiation and perpetuation of AF, to identify the molecular alterations underlying atrial remodelling, to develop markers that allow these processes to be monitored, and to suggest strategies to treat AF based on insights into newly defined disease mechanisms. This article reports on the objectives, the structure, and the initial results of this network.
Abstract:
BACKGROUND Non-steroidal anti-inflammatory drugs (NSAIDs) are the backbone of osteoarthritis pain management. We aimed to assess the effectiveness of different preparations and doses of NSAIDs on osteoarthritis pain in a network meta-analysis. METHODS For this network meta-analysis, we considered randomised trials comparing any of the following interventions: NSAIDs, paracetamol, or placebo, for the treatment of osteoarthritis pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the reference lists of relevant articles for trials published between Jan 1, 1980, and Feb 24, 2015, with at least 100 patients per group. The prespecified primary and secondary outcomes were pain and physical function, and were extracted in duplicate for up to seven timepoints after the start of treatment. We used an extension of multivariable Bayesian random effects models for mixed multiple treatment comparisons with a random effect at the level of trials. For the primary analysis, a random walk of first order was used to account for multiple follow-up outcome data within a trial. Preparations that used different total daily doses were considered separately in the analysis. To assess a potential dose-response relation, we used preparation-specific covariates, assuming linearity on log relative dose. FINDINGS We identified 8973 manuscripts from our search, of which 74 randomised trials with a total of 58 556 patients were included in this analysis. 23 nodes, concerning seven different NSAIDs or paracetamol at specific daily doses of administration, or placebo, were considered. All preparations, irrespective of dose, improved point estimates of pain symptoms when compared with placebo. For six interventions (diclofenac 150 mg/day, etoricoxib 30 mg/day, 60 mg/day, and 90 mg/day, and rofecoxib 25 mg/day and 50 mg/day), the probability that the difference from placebo is at or below a prespecified minimum clinically important effect for pain reduction (effect size [ES] -0·37) was at least 95%. Among maximally approved daily doses, diclofenac 150 mg/day (ES -0·57, 95% credibility interval [CrI] -0·69 to -0·46) and etoricoxib 60 mg/day (ES -0·58, -0·73 to -0·43) had the highest probability of being the best intervention, both with 100% probability of reaching the minimum clinically important difference. Treatment effects increased as drug dose increased, but corresponding tests for a linear dose effect were significant only for celecoxib (p=0·030), diclofenac (p=0·031), and naproxen (p=0·026). We found no evidence that treatment effects varied over the duration of treatment. Model fit was good, and between-trial heterogeneity and inconsistency were low in all analyses. All trials were deemed to have a low risk of bias for blinding of patients. Effect estimates did not change in sensitivity analyses with two additional statistical models and accounting for methodological quality criteria in meta-regression analysis. INTERPRETATION On the basis of the available data, we see no role for single-agent paracetamol for the treatment of patients with osteoarthritis, irrespective of dose. We provide sound evidence that diclofenac 150 mg/day is the most effective NSAID available at present in terms of improving both pain and function. Nevertheless, in view of the safety profile of these drugs, physicians need to consider our results together with all known safety information when selecting the preparation and dose for individual patients.
FUNDING Swiss National Science Foundation (grant number 405340-104762) and Arco Foundation, Switzerland.
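A network meta-analysis lets two active treatments be compared indirectly through a shared comparator such as placebo. The sketch below is a deliberately simplified illustration of that idea (a Bucher-style indirect comparison), not the authors' multivariable Bayesian model; the effect sizes and standard errors fed to it are hypothetical.

```python
import math

def indirect_comparison(d_a, se_a, d_b, se_b):
    """Bucher-style indirect comparison of treatments A and B through a
    common placebo arm: d_AB = d_A - d_B, with the variances added."""
    d_ab = d_a - d_b
    se_ab = math.sqrt(se_a ** 2 + se_b ** 2)
    return d_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

# Hypothetical standardized mean differences versus placebo and their standard errors
effect, ci = indirect_comparison(-0.57, 0.06, -0.40, 0.07)
print(f"A vs B (indirect): ES {effect:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```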
Abstract:
Early Employee Assistance Programs (EAPs) had their origin in humanitarian motives, and there was little concern for their cost/benefit ratios; however, as some programs began accumulating data and analyzing it over time, even with single variables such as absenteeism, it became apparent that the humanitarian reasons for a program could be reinforced by cost savings, particularly when the existence of the program was subject to justification. Today there is general agreement that cost/benefit analyses of EAPs are desirable, but the specific models for such analyses, particularly those making use of sophisticated but simple computer-based data management systems, are few. The purpose of this research and development project was to develop a method, a design, and a prototype for gathering, managing, and presenting information about EAPs. This scheme provides information retrieval and analyses relevant to such aspects of EAP operations as: (1) EAP personnel activities, (2) supervisory training effectiveness, (3) client population demographics, (4) assessment and referral effectiveness, (5) treatment network efficacy, and (6) economic worth of the EAP. This scheme has been implemented and has been operational at The University of Texas Employee Assistance Programs for more than three years. Application of the scheme in the various programs has defined certain variables which remained necessary in all programs. Depending on how aggressively program personnel pursue data acquisition, other program-specific variables are also defined.
Abstract:
The purpose of this dissertation was to develop a conceptual framework which can be used to account for policy decisions made by the House Ways and Means Committee (HW&MC) of the Texas House of Representatives. This analysis will examine the actions of the committee over a ten-year period with the goal of explaining and predicting the success or failure of certain efforts to raise revenue. The basic framework for modelling the revenue decision-making process includes three major components: the decision alternatives, the external factors, and two competing contingency theories. The decision alternatives encompass the particular options available to increase tax revenue. The options were classified as non-innovative or innovative. The non-innovative options included the sales, franchise, property, and severance taxes. The innovative options were principally the personal and corporate income taxes. The external factors included political and economic constraints that affected the actions of the HW&MC. Several key political constraints on committee decision-making were addressed, including public attitudes, interest groups, political party strength, and tradition and precedents. The economic constraints that affected revenue decisions included court mandates, federal mandates, and the fiscal condition of the nation and the state. The third component of the revenue decision-making framework included two alternative contingency theories. The first theory postulated that the committee structure, including individual member roles and the overall committee style, resulted in distinctive revenue decisions. This theory will be favored if evidence points to the committee acting autonomously, with less concern for the policies of the Speaker of the House. The second, the Speaker assignment theory, postulated that the assignment of committee members shaped or changed the course of committee decision-making. This theory will be favored if there is evidence that the committee was strictly a vehicle for the Speaker to institute his preferred tax policies. The ultimate goal of this analysis is to develop an explanation for legislative decision-making about tax policy. This explanation will be based on the linkages across various tax options, political and economic constraints, member roles and committee style, and the patterns of committee assignment.
Abstract:
In view of the drastic growth of the Canadian Inuit population, the rising cost of living, the lack of job and income alternatives, and the high unemployment rate in the Arctic, efforts are being made to make use of the muskox populations in order to provide additional sources of food and/or revenue. The present paper attempts to review the course of muskox utilization in the Canadian Arctic and to tentatively assess its present as well as its future economic importance. Starting with the pre-European status of muskoxen in Canada, the drastic reduction in numbers resulting from the combined efforts of hide traders, whalers, and expedition parties in the 19th and early 20th centuries, the impact of legal protection, and the recovery since 1917 are described. Establishing muskox farms with semi-domesticated herds failed in Canada in the 1970s. Since 1969, though, increasing numbers of animals have been allotted to many Inuit communities, and despite the fact that most of the animals were primarily used for subsistence purposes, some communities could reserve part of their quotas for trophy (sport) hunters. While controlled sustainable subsistence and trophy hunts may eventually be carried out over the whole muskox range, including recently colonized northern Quebec, commercial harvesting for meat, hides, and wool, introduced in 1981, will at least for some time be restricted to Banks and Victoria islands, which at present hold 78% of the Canadian muskox population and account for 94% of the overall quota.
Abstract:
This paper proposes a new mechanism linking innovation and networks in developing economies to detect explicit production and information linkages, and investigates the testable implications of these linkages using survey data gathered from manufacturing firms in East Asia. We found that firms with more information linkages tend to innovate more: they have a higher probability of introducing new goods, of introducing new goods to new markets using new technologies, and of finding new partners located in remote areas. We also found that firms that dispatched engineers to customers achieved more innovations than firms that did not. These findings support the hypothesis that production linkages and face-to-face communication encourage product and process innovation.
Abstract:
When Vietnam joined the WTO, it opened up to foreign direct investment and started to grow. Technologically, it was then greatly influenced by the enterprises that entered the country through direct investment. This report shows that the technology network for machine tools has been formed via direct investment and subcontracting.
Abstract:
This paper describes the architecture of a computer system conceived as an intelligent assistant for public transport management. The goal of the system is to help operators of a control center make strategic decisions about how to solve problems affecting a fleet of buses in an urban network. The system uses artificial intelligence techniques to simulate the decision processes. In particular, a complex knowledge model has been designed using advanced knowledge engineering methods; it integrates three main tasks: diagnosis, prediction, and planning. Finally, the paper describes two particular applications developed following this architecture for the cities of Torino (Italy) and Vitoria (Spain).
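As a rough illustration of the diagnosis-prediction-planning decomposition described above, the sketch below wires three toy functions into one pipeline; the rules, thresholds, and names are hypothetical and not taken from the system's actual knowledge model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FleetState:
    line: str
    headway_min: float         # observed headway between consecutive buses
    target_headway_min: float  # scheduled headway

def diagnose(state: FleetState) -> List[str]:
    """Flag simple symptoms from the current fleet state (toy rule)."""
    problems = []
    if state.headway_min > 1.5 * state.target_headway_min:
        problems.append("bus bunching / service gap")
    return problems

def predict(state: FleetState) -> float:
    """Crude projection of mean passenger wait time if nothing is done
    (half the headway, assuming random passenger arrivals)."""
    return state.headway_min / 2

def plan(problems: List[str]) -> List[str]:
    """Propose control actions for the diagnosed problems."""
    actions = {"bus bunching / service gap": "insert reserve bus or hold at timing point"}
    return [actions[p] for p in problems if p in actions]

state = FleetState(line="12", headway_min=18.0, target_headway_min=8.0)
issues = diagnose(state)
print(issues, predict(state), plan(issues))
```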
Abstract:
As defined in the ATM 2000+ Strategy (Eurocontrol 2001), the mission of the Air Traffic Management (ATM) System is: “For all the phases of a flight, the ATM system should facilitate a safe, efficient, and expedite traffic flow, through the provision of adaptable ATM services that can be dimensioned in relation to the requirements of all the users and areas of the European air space. The ATM services should comply with the demand, be compatible, operate under uniform principles, respect the environment and satisfy the national security requirements.” The objective of this paper is to present a methodology designed to evaluate the status of the ATM system in terms of the relationship between offered capacity and traffic demand, identifying areas of weakness and proposing solutions. The first part of the methodology relates to the characterization and evaluation of the current system, while the second part proposes an approach to analyze the possible development limit. As part of the work, general criteria are established to define the framework in which the analysis and diagnostic methodology is placed: the use of Air Traffic Control (ATC) sectors as the unit of analysis, the presence of network effects, the tactical focus, the relative character of the analysis, objectivity, and a high-level assessment that allows assumptions on the human and Communications, Navigation and Surveillance (CNS) elements, considered the typical high-density air traffic resources. The steps followed by the methodology start with the definition of indicators and metrics, such as the nominal criticality or the nominal efficiency of a sector; scenario characterization, where the necessary data are collected; network effects analysis, to study the relations among the constitutive elements of the ATC system; diagnosis by means of the “System Status Diagram”; analytical study of the ATC system development limit; and, finally, formulation of conclusions and proposals for improvement. This methodology was employed by Aena (Spanish Airports Manager and Air Navigation Service Provider) and INECO (Spanish Transport Engineering Company) in the analysis of the Spanish ATM System within the frame of the Spanish airspace capacity sustainability program, although it could be applied elsewhere.
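The abstract names sector-level indicators such as the nominal criticality and the nominal efficiency of a sector without defining them. Purely as an assumed illustration (the ratio below is not the methodology's actual formula, and the sector names and figures are made up), a sector-level demand-to-capacity check might look like this:

```python
from dataclasses import dataclass

@dataclass
class Sector:
    name: str
    declared_capacity: int  # flights per hour the sector is declared to handle
    peak_demand: int        # flights per hour observed in the busiest period

def load_ratio(sector: Sector) -> float:
    """Demand-to-capacity ratio; values above 1.0 flag an overloaded sector.
    This is an illustrative proxy, not the paper's nominal criticality."""
    return sector.peak_demand / sector.declared_capacity

sectors = [Sector("SECTOR_A", 46, 52), Sector("SECTOR_B", 40, 33)]
for s in sorted(sectors, key=load_ratio, reverse=True):
    print(f"{s.name}: load ratio {load_ratio(s):.2f}")
```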
Abstract:
Energy Efficiency is one of the goals of the Smart Building initiatives. This paper presents an Open Energy Management System which consists of an ontology-based multi-technology platform and a wireless transducer network using 6LoWPAN communication technology. The system allows the integration of several building automation protocols and eases the development of different kinds of services that make use of them. The system has been implemented and tested in the Energy Efficiency Research Facility at CeDInt-UPM.
Abstract:
Improving energy efficiency in buildings is one of the goals of the Smart City initiatives and a challenge for the European Union. This paper presents a 6LoWPAN wireless transducer network (BatNet) as part of an open energy management system. The network has been designed to operate in buildings, collecting environmental information (temperature, humidity, illumination, and presence) and electrical consumption data (voltage, current, and power factor) in real time. The system has been implemented and tested in the Energy Efficiency Research Facility at CeDInt-UPM.
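As an illustrative sketch of the kind of record such a node might report, the snippet below defines a reading structure; the field names, units, and the single-phase power approximation are assumptions for the sketch, not the actual BatNet message format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NodeReading:
    """Hypothetical reading from one wireless transducer node."""
    node_id: str
    timestamp: datetime
    temperature_c: float
    humidity_pct: float
    illuminance_lux: float
    presence: bool
    voltage_v: float
    current_a: float
    power_factor: float

    @property
    def active_power_w(self) -> float:
        """Apparent power scaled by power factor (single-phase approximation)."""
        return self.voltage_v * self.current_a * self.power_factor

reading = NodeReading("node-07", datetime.now(timezone.utc),
                      22.5, 41.0, 310.0, True, 230.0, 1.8, 0.92)
print(f"{reading.node_id}: {reading.active_power_w:.0f} W")
```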
Abstract:
Over the last ten years, Salamanca has been considered among the most polluted cities in México. This paper presents a Self-Organizing Map (SOM) neural network application to classify pollution data and automate the determination of air pollution levels for Sulphur Dioxide (SO2) in Salamanca. Meteorological parameters are well known to be important factors contributing to air quality estimation and prediction. In order to observe the behavior and clarify the influence of wind parameters on SO2 concentrations, a SOM neural network has been applied to a year of measurements. The main advantage of the SOM is that it allows data from different sensors to be integrated and provides readily interpretable results. In particular, it is a powerful mapping and classification tool that presents information in an accessible way and facilitates the task of establishing an order of priority between the distinguished groups of concentrations, depending on their need for further research or remediation actions in subsequent management steps. The results show a significant correlation between pollutant concentrations and some environmental variables.
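To make the SOM idea concrete, the sketch below trains a tiny self-organizing map on synthetic records standing in for SO2 and wind measurements; the data, grid size, and training schedule are made up for illustration and are not the study's model.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic records: columns stand in for SO2 concentration, wind speed,
# wind direction, all scaled to [0, 1]. Not real Salamanca data.
data = rng.random((500, 3))

grid_x, grid_y, dim = 6, 6, data.shape[1]
weights = rng.random((grid_x, grid_y, dim))
coords = np.stack(np.meshgrid(np.arange(grid_x), np.arange(grid_y), indexing="ij"), axis=-1)

def winner(x):
    """Index of the map unit whose weight vector is closest to x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

sigma0, lr0, n_iter = 2.0, 0.5, 2000
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    decay = np.exp(-t / n_iter)            # shrink neighbourhood and learning rate
    sigma, lr = sigma0 * decay, lr0 * decay
    w = winner(x)
    dist2 = np.sum((coords - np.array(w)) ** 2, axis=-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # Gaussian neighbourhood
    weights += lr * h * (x - weights)

# Assign each record to its best-matching unit (a cluster of similar conditions)
labels = np.array([winner(x) for x in data])
print("records per map unit:",
      np.unique(labels[:, 0] * grid_y + labels[:, 1], return_counts=True)[1])
```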
Abstract:
This paper presents an ontology-based multi-technology platform as part of an open energy management system which also comprises a wireless transducer network for control and monitoring. The platform allows the integration of several building automation protocols, eases the development and implementation of different kinds of services, and allows the data of a building to be shared. The system has been implemented and tested in the Energy Efficiency Research Facility at CeDInt-UPM.
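One common way to integrate heterogeneous building automation protocols is to hide each one behind a shared adapter interface. The sketch below illustrates that pattern in the spirit of the platform described above; the class names, the protocols chosen, and the stub values are assumptions, not the platform's actual API.

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class ProtocolAdapter(ABC):
    """Common interface each building automation protocol is wrapped behind."""
    @abstractmethod
    def read_points(self) -> Dict[str, float]:
        """Return current values keyed by a protocol-neutral point name."""

class KNXAdapter(ProtocolAdapter):
    def read_points(self) -> Dict[str, float]:
        return {"room1/temperature_c": 22.4}   # stub value for the sketch

class BACnetAdapter(ProtocolAdapter):
    def read_points(self) -> Dict[str, float]:
        return {"ahu1/supply_temp_c": 17.8}    # stub value for the sketch

class EnergyPlatform:
    """Aggregates readings from heterogeneous protocol adapters."""
    def __init__(self, adapters: List[ProtocolAdapter]):
        self.adapters = adapters

    def snapshot(self) -> Dict[str, float]:
        merged: Dict[str, float] = {}
        for adapter in self.adapters:
            merged.update(adapter.read_points())
        return merged

platform = EnergyPlatform([KNXAdapter(), BACnetAdapter()])
print(platform.snapshot())
```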