893 results for Policy-based network management
Abstract:
The effects of artemisinin-based combination therapies (ACTs) on transmission of Plasmodium falciparum were evaluated after a policy change instituting the use of ACTs in an endemic area. P. falciparum gametocyte carriage, sex ratios and inbreeding rates were examined in 2,585 children at presentation with acute falciparum malaria during a 10-year period from 2001-2010. Asexual parasite rates were also evaluated from 2003-2010 in 10,615 children before and after the policy change. Gametocyte carriage declined significantly from 12.4% in 2001 to 3.6% in 2010 (χ² for trend = 44.3, p < 0.0001), but sex ratios and inbreeding rates remained unchanged. Additionally, overall parasite rates remained unchanged before and after the policy change (47.2% vs. 45.4%), but these rates declined significantly from 2003-2010 (χ² for trend = 35.4, p < 0.0001). Chloroquine (CQ) and artemether-lumefantrine (AL) were used as prototype drugs before and after the policy change, respectively. AL significantly shortened the duration of male gametocyte carriage in individual patients after treatment began compared with CQ (log rank statistic = 7.92, p = 0.005). ACTs reduced the rate of gametocyte carriage in children with acute falciparum infections at presentation and shortened the duration of male gametocyte carriage after treatment. However, parasite population sex ratios, inbreeding rates and overall parasite rate were unaffected.
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single and fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is the number of contributors, and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
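To illustrate the probabilistic strategy, the sketch below applies Bayes' theorem to obtain a distribution over candidate values of N from the number of distinct alleles observed at a single locus. It is not the paper's Bayesian-network model: the allele frequencies, the candidate range for N, the uniform prior and the Monte Carlo likelihood estimate are all illustrative assumptions.

```python
# Minimal sketch: posterior over the number of contributors N at one locus,
# given the count of distinct alleles observed in the mixed stain.
import numpy as np

rng = np.random.default_rng(0)

def p_allele_count(n_contributors, freqs, observed_count, n_sim=20_000):
    """Monte Carlo estimate of P(exactly `observed_count` distinct alleles | N)."""
    n_alleles = len(freqs)
    draws = rng.choice(n_alleles, size=(n_sim, 2 * n_contributors), p=freqs)
    counts = np.array([len(set(row)) for row in draws])
    return np.mean(counts == observed_count)

freqs = np.array([0.40, 0.30, 0.20, 0.10])   # allele frequencies at one locus (assumed)
observed = 4                                 # distinct alleles seen in the stain at this locus
candidates = [2, 3, 4]                       # numbers of contributors considered
prior = np.ones(len(candidates)) / len(candidates)

likelihood = np.array([p_allele_count(n, freqs, observed) for n in candidates])
posterior = prior * likelihood
posterior /= posterior.sum()
for n, p in zip(candidates, posterior):
    print(f"P(N = {n} | {observed} alleles observed) ~ {p:.3f}")
```

In a full analysis the same updating would be carried out jointly over all typed loci, which is where the quality of the stain (number of available loci) enters the comparison.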
Abstract:
BACKGROUND In recent decades, social inequalities in diabetes care have been observed in multiple countries, including Spain. These inequalities have been at least partially attributed to differences in diabetes self-management behaviours. Communication problems during medical consultations occur more frequently among patients with a lower educational level. The purpose of this cluster randomized trial is to determine whether an intervention implemented in general practice, based on improving patient-provider communication, results in better diabetes self-management in patients with a lower educational level. A secondary objective is to assess whether telephone reinforcement enhances the effect of this intervention. We report the design and implementation of this ongoing study. METHODS/DESIGN The study is being conducted in a General Practice located in a deprived neighbourhood of Granada, Spain. Diabetic patients aged 18 years or older with a low educational level and inadequate glycaemic control (HbA1c > 7%) were recruited. General Practitioners (GPs) were randomised to three groups: intervention A, intervention B and control group. GPs allocated to intervention groups A and B received training in communication skills and are providing graphic feedback about glycosylated haemoglobin levels. Patients whose GPs were allocated to group B are additionally receiving telephone reinforcement, whereas patients from the control group are receiving usual care. The described interventions are being conducted during 7 consecutive medical visits, which are scheduled every three months. The main outcome measure will be HbA1c; blood pressure, lipid levels, body mass index and waist circumference will be considered as secondary outcome measures. Statistical analysis to evaluate the effectiveness of the interventions will include multilevel regression analysis with three hierarchical levels: medical visit level, patient level and GP level. DISCUSSION The results of this study will provide new knowledge about possible strategies to promote better diabetes self-management in a particularly vulnerable group. If effective, this low-cost intervention will have the potential to be easily incorporated into routine clinical practice, contributing to a decrease in health inequalities in diabetic patients. TRIAL REGISTRATION Clinical Trials U.S. National Institutes of Health, NCT01849731.
Abstract:
Anemia, usually due to iron deficiency, is highly prevalent among patients with colorectal cancer. Inflammatory cytokines lead to iron-restricted erythropoiesis, further decreasing iron availability and impairing iron utilization. Preoperative anemia predicts decreased survival. Allogeneic blood transfusion is widely used to correct anemia and is associated with poorer surgical outcomes, increased post-operative nosocomial infections, longer hospital stays, increased rates of cancer recurrence and perioperative venous thromboembolism. Infections are more likely to occur in those with a low preoperative serum ferritin level compared to those with normal levels. A multidisciplinary, multimodal, individualized strategy, collectively termed Patient Blood Management, minimizes or eliminates allogeneic blood transfusion. This includes a restrictive transfusion policy, thromboprophylaxis and anemia management to improve outcomes. Normalization of preoperative hemoglobin levels is a World Health Organization recommendation. Iron repletion should be routinely ordered when indicated. Oral iron is poorly tolerated, with low adherence based on published evidence. Intravenous iron is safe and effective but is frequently avoided due to misinformation and misinterpretation concerning the incidence and clinical nature of minor infusion reactions. Serious adverse events with intravenous iron are extremely rare. Newer formulations allow complete replacement dosing in 15-60 min, markedly facilitating care. Erythropoiesis-stimulating agents may improve response rates. Such a Patient Blood Management strategy, used to minimize or eliminate allogeneic blood transfusion, is indicated to improve outcomes.
Abstract:
Monitoring a distribution network implies working with a huge amount of data coming from the different elements that interact in the network. This paper presents a visualization tool that simplifies the task of searching the database for useful information applicable to fault management or preventive maintenance of the network.
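As a rough illustration of the kind of database search such a tool automates, the sketch below pulls measurements for one feeder from a hypothetical SQLite table and flags readings outside assumed operating limits; the database name, schema, feeder identifier and limits are all invented for the example and are not the tool described in the paper.

```python
# Hypothetical example: filter distribution-network measurements for values
# that warrant fault analysis or preventive maintenance review.
import sqlite3
import pandas as pd

conn = sqlite3.connect("distribution_network.db")   # assumed database file and schema
query = """
    SELECT timestamp, element_id, voltage_kv, current_a
    FROM measurements
    WHERE feeder = ?
    ORDER BY timestamp
"""
df = pd.read_sql_query(query, conn, params=("FEEDER_07",))

# flag readings outside assumed operating limits as candidates for fault analysis
limits = {"voltage_kv": (10.5, 11.5), "current_a": (0.0, 400.0)}
alarms = df[(df["voltage_kv"].lt(limits["voltage_kv"][0]))
            | (df["voltage_kv"].gt(limits["voltage_kv"][1]))
            | (df["current_a"].gt(limits["current_a"][1]))]
print(f"{len(alarms)} suspect readings out of {len(df)} for FEEDER_07")
```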
Abstract:
Only half of hypertensive patients have controlled blood pressure. Chronic kidney disease (CKD) is also associated with poor blood pressure control, with only 25-30% of CKD patients achieving adequate blood pressure. The Community Preventive Services Task Force has recently recommended team-based care to improve blood pressure control. Team-based care of hypertension involves facilitating coordination of care among physician, pharmacist and nurse, and requires sharing clinical data, laboratory results and medications, e.g., electronically or by fax. Based on recent studies, team-based care of hypertensive patients should be developed and evaluated in the Swiss healthcare system.
Abstract:
Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement to the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly through containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that makes it possible to learn about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative on reservation wage movements over the unemployment spell.
The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies. The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency at the top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
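As an illustration of the difference-in-differences strategy used in the first chapter, the sketch below estimates a treatment effect from simulated data with a simple OLS interaction term; the variable names, effect sizes and specification are illustrative assumptions, not the thesis's actual data or model.

```python
# Difference-in-differences on simulated data: the coefficient on treated:post
# recovers the (here, known) effect of a reform on post-unemployment earnings.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 4_000
treated = rng.integers(0, 2, n)              # group affected by the benefit-duration cut
post = rng.integers(0, 2, n)                 # observation after the reform
effect = 0.8                                 # true effect on, e.g., monthly earnings (assumed)

earnings = (20.0 + 2.0 * treated + 1.5 * post
            + effect * treated * post + rng.normal(0, 3, n))
df = pd.DataFrame({"earnings": earnings, "treated": treated, "post": post})

# earnings_it = a + b*treated_i + c*post_t + d*(treated_i x post_t) + e_it
# d is the difference-in-differences estimate of the reform's effect
model = smf.ols("earnings ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"], model.bse["treated:post"])
```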
Abstract:
It is well known that multiple-input multiple-output (MIMO) techniques can bring numerous benefits, such as higher spectral efficiency, to point-to-point wireless links. More recently, there has been interest in extending MIMO concepts to multiuser wireless systems. Our focus in this paper is on network MIMO, a family of techniques whereby each end user in a wireless access network is served through several access points within its range of influence. By tightly coordinating the transmission and reception of signals at multiple access points, network MIMO can transcend the limits on spectral efficiency imposed by cochannel interference. Taking prior information-theoretic analyses of network MIMO to the next level, we quantify the spectral efficiency gains obtainable under realistic propagation and operational conditions in a typical indoor deployment. Our study relies on detailed simulations and, for specificity, is conducted largely within the physical-layer framework of the IEEE 802.16e Mobile WiMAX system. Furthermore, to facilitate the coordination between access points, we assume that a high-capacity local area network, such as Gigabit Ethernet, connects all the access points. Our results confirm that network MIMO stands to provide a multiple-fold increase in spectral efficiency under these conditions.
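The sketch below gives a rough sense of the spectral efficiency comparison at issue, contrasting two interference-limited single-AP links with joint transmission from both access points over random 2x2 channels. It is not the paper's IEEE 802.16e simulator: the Rayleigh channel model, the SNR value and the use of the point-to-point MIMO capacity as a proxy for the coordinated sum rate are illustrative assumptions.

```python
# Compare mean sum spectral efficiency: uncoordinated APs (co-channel interference)
# vs. an optimistic network-MIMO proxy (joint 2x2 MIMO capacity).
import numpy as np

rng = np.random.default_rng(0)
n_drops = 10_000          # independent channel realizations
snr = 10.0                # per-link transmit SNR (linear), assumed identical for both APs

uncoord, coord = [], []
for _ in range(n_drops):
    # 2x2 complex Rayleigh channel: H[u, a] couples access point a to user u
    H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

    # Uncoordinated: AP a serves user a; the other AP's signal is co-channel interference
    rate = 0.0
    for u in range(2):
        sig = snr * abs(H[u, u]) ** 2
        interf = snr * abs(H[u, 1 - u]) ** 2
        rate += np.log2(1.0 + sig / (1.0 + interf))
    uncoord.append(rate)

    # Network MIMO: both APs jointly transmit; use the 2x2 point-to-point MIMO
    # capacity as an optimistic proxy for the coordinated sum rate
    det_val = np.linalg.det(np.eye(2) + snr * H @ H.conj().T).real
    coord.append(np.log2(det_val))

print(f"mean sum spectral efficiency (bit/s/Hz): "
      f"uncoordinated {np.mean(uncoord):.2f}, network MIMO {np.mean(coord):.2f}")
```

At interference-limited SNRs the uncoordinated rates saturate, while the coordinated proxy keeps growing with SNR, which is the qualitative gain the paper quantifies under realistic conditions.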
Abstract:
Winter maintenance, particularly snow removal and the stress of snow removal materials on public structures, is an enormous budgetary burden on municipalities and nongovernmental maintenance organizations in cold climates. Lately, geospatial technologies such as remote sensing, geographic information systems (GIS), and decision support tools are providing valuable tools for planning snow removal operations. A few researchers recently used geospatial technologies to develop winter maintenance tools. However, most of these winter maintenance tools, while having the potential to address some of these information needs, are not typically placed in the hands of planners and other interested stakeholders. Most tools are not constructed with a nontechnical user in mind and lack an easy-to-use, easily understood interface. A major goal of this project was to implement a web-based Winter Maintenance Decision Support System (WMDSS) that enhances the capacity of stakeholders (city/county planners, resource managers, transportation personnel, citizens, and policy makers) to evaluate different procedures for managing snow removal assets optimally. This was accomplished by integrating geospatial analytical techniques (GIS and remote sensing), the existing snow removal asset management system, and web-based spatial decision support systems. The web-based system was implemented using the ESRI ArcIMS ActiveX Connector and related web technologies, such as Active Server Pages, JavaScript, HTML, and XML. The expert knowledge on snow removal procedures was gathered and integrated into the system in the form of encoded business rules using Visual Rule Studio. The system developed not only manages the resources but also provides expert advice to assist complex decision making, such as routing, optimal resource allocation, and monitoring live weather information. This system was developed in collaboration with Black Hawk County, IA, the city of Columbia, MO, and the Iowa Department of Transportation. The product was also demonstrated to these agencies to improve the usability and applicability of the system.
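The snippet below sketches, under assumed thresholds and route attributes, how one such snow-removal business rule could be expressed in code; the actual WMDSS encodes its expert rules with Visual Rule Studio rather than Python, and the priority scheme here is purely illustrative.

```python
# Illustrative encoding of an expert plowing-priority rule for snow removal routes.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    road_class: str          # "arterial", "collector" or "residential"
    forecast_snow_cm: float  # predicted accumulation from the live weather feed
    traffic_volume: int      # average daily traffic

def plow_priority(route: Route) -> int:
    """Return a priority (1 = plow first), mimicking an encoded business rule."""
    if route.road_class == "arterial" or route.forecast_snow_cm >= 15:
        return 1
    if route.road_class == "collector" or route.traffic_volume > 5_000:
        return 2
    return 3

routes = [
    Route("Main St", "arterial", 8.0, 12_000),
    Route("Oak Ave", "residential", 20.0, 900),
    Route("5th St", "collector", 4.0, 3_500),
]
for r in sorted(routes, key=plow_priority):
    print(plow_priority(r), r.name)
```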
Abstract:
"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye) Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from social, natural or mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations when applied to transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, relying on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to that of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations.
We believe they are one step closer to the biological reality.
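A minimal sketch of the baseline Kauffman random Boolean network that the thesis extends is shown below: each gene synchronously updates from a random truth table over K regulators until the trajectory falls onto an attractor. The network size, in-degree and random update tables are illustrative assumptions, not the improved models described above.

```python
# Kauffman-style random Boolean network with synchronous updates.
import random

random.seed(1)
N, K = 12, 2  # number of genes and inputs (regulators) per gene

# each gene draws K regulators and a random Boolean truth table over their states
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Synchronously update every gene from its regulators' current values."""
    nxt = []
    for g in range(N):
        idx = 0
        for reg in inputs[g]:
            idx = (idx << 1) | state[reg]
        nxt.append(tables[g][idx])
    return tuple(nxt)

state = tuple(random.randint(0, 1) for _ in range(N))
seen = {}
for t in range(200):             # iterate until the trajectory revisits a state
    if state in seen:
        print(f"attractor of length {t - seen[state]} reached after {seen[state]} steps")
        break
    seen[state] = t
    state = step(state)
```

The thesis's modifications replace this fully synchronous scheme with a cascading update, use more realistic topologies, and make the update tables reflect explicit promoting and repressing interactions.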
Abstract:
OBJECTIVE: The European Panel on the Appropriateness of Crohn's disease Therapy (EPACT) has developed appropriateness criteria. We have applied these criteria retrospectively to the population-based inception cohort of Crohn's disease (CD) patients of the European Collaborative Study Group on Inflammatory Bowel Disease (EC-IBD). MATERIAL AND METHODS: A total of 426 diagnosed CD patients from 13 European centers were enrolled at the time of diagnosis (first flare, naive patients). We used the EPACT definitions to identify 247 patients with active luminal CD. We then assessed the appropriateness of the initial drug prescription according to the EPACT criteria. RESULTS: Among the cohort patients 163 suffered from mild-to-moderate CD and 84 from severe CD. Among the mild-to-moderate disease group, 96 patients (59%) received an appropriate treatment, whereas for 66 patients (40%) the treatment was uncertain and in one case (1%) inappropriate. Among the severe disease group, 86% were treated medically and 14% required surgery. 59 (70%) were appropriately treated, whereas for one patient (1%) the procedure was considered uncertain and for 24 patients (29%) inappropriate. CONCLUSION: Initial treatment was appropriate in the majority of cases for non-complicated luminal CD. Inappropriate or uncertain treatment was given in a significant minority of patients, with an increased potential risk of adverse events.
Abstract:
A Comparison of the Management Models of Protected Areas between China and the Southern African Region examines and evaluates the similarities and differences in the use of management models as a tool for managing protected areas in China and the Southern African region, focusing in particular on positive and negative features of the management approaches of the two regions. Secondary data were collected from various related literature such as policy documents, students' dissertations/theses, scientific articles and magazines. Based on this method, the study found that China's first nature reserve was the Dinghu Mountain Nature Reserve in Zhaoqing, Guangdong province, established in 1956. By the end of 2005, about 2,349 nature reserves of various kinds had been set up throughout the country, covering a total area of 149.95 million ha and accounting for 15 percent of the total land territory. The study further found that Southern Africa has approximately 4,390 protected areas out of a total land area of 11,487,920, while Eastern Africa has approximately 1,838,144 in protected areas, equivalent to 15.0% of the total land area. South Africa in this region declared its first national park in 1926, after Paul Kruger (a war hero) had alerted the authorities to the threat of extinction facing some animal species of the region.
Abstract:
Revenue management practices often include overbooking capacity to account for customers who make reservations but do not show up. In this paper, we consider the network revenue management problem with no-shows and overbooking, where the show-up probabilities are specific to each product. No-show rates differ significantly by product (for instance, each itinerary and fare combination for an airline) as sale restrictions and the demand characteristics vary by product. However, models that consider no-show rates for each individual product are difficult to handle, as the state space in dynamic programming formulations (or the variable space in approximations) increases significantly. In this paper, we propose a randomized linear program to jointly make the capacity control and overbooking decisions with product-specific no-shows. We establish that our formulation gives an upper bound on the optimal expected total profit and that our upper bound is tighter than a deterministic linear programming upper bound that appears in the existing literature. Furthermore, we show that our upper bound is asymptotically tight in a regime where the leg capacities and the expected demand are scaled linearly at the same rate. We also describe how the randomized linear program can be used to obtain a bid price control policy. Computational experiments indicate that our approach is quite fast, able to scale to industrial problems and can provide significant improvements over standard benchmarks.
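For context, the sketch below solves the deterministic LP benchmark for network capacity control with product-specific no-shows and reads leg bid prices from the constraint duals; the paper's randomized linear program refines this by resampling demand, a step omitted here. All fares, demands, show-up probabilities and the leg/product incidence are made-up numbers.

```python
# Deterministic LP benchmark: accept bookings y_j so that expected shows fit each
# leg's capacity (allowing overbooking of physical seats), maximizing expected revenue.
import numpy as np
from scipy.optimize import linprog

fares = np.array([400.0, 250.0, 300.0])      # revenue per booking, by product
show_up = np.array([0.90, 0.80, 0.95])       # product-specific show-up probabilities
demand = np.array([60.0, 80.0, 50.0])        # expected demand, by product
capacity = np.array([100.0, 120.0])          # physical seats on each flight leg
A = np.array([[1, 1, 0],                     # A[i, j] = 1 if product j uses leg i
              [0, 1, 1]])

# maximize sum_j fare_j * show_up_j * y_j          (expected revenue)
# s.t.     sum_j A[i, j] * show_up_j * y_j <= capacity_i   (expected shows fit each leg,
#          so accepted bookings may exceed physical capacity, i.e. overbooking)
#          0 <= y_j <= demand_j
res = linprog(c=-(fares * show_up),
              A_ub=A * show_up,              # broadcasting scales column j by show_up_j
              b_ub=capacity,
              bounds=list(zip(np.zeros(3), demand)),
              method="highs")

bookings = res.x
bid_prices = -res.ineqlin.marginals          # dual of each leg constraint
print("accepted bookings by product:", np.round(bookings, 1))
print("leg bid prices:", np.round(bid_prices, 1))
```

In a bid price control policy, a booking request for a product is accepted only if its fare exceeds the sum of the bid prices of the legs it uses, adjusted for its show-up probability.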