989 results for Network tariffs allocation


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with risk of bias items. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators of the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
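The core adjustment can be illustrated with a toy meta-regression: regress each study's effect size on its variance, weighting by precision. The numbers and the `wls` helper below are illustrative assumptions, not the paper's Bayesian model (which also pools across networks):

```python
import math

# Synthetic per-study log odds ratios and their variances (hypothetical
# numbers, chosen so imprecise studies show larger apparent effects).
log_or = [0.10, 0.25, 0.40, 0.70, 0.90]
var = [0.02, 0.05, 0.10, 0.30, 0.50]

def wls(x, y, w):
    """Weighted least-squares fit of y = a + b*x with weights w."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    return ybar - b * xbar, b

# Regress effect size on variance, weighting by precision (1/variance);
# exp(slope) then acts like a "ratio of odds ratios" per unit of variance:
# values above 1 mean imprecise studies report larger apparent effects.
a, b = wls(var, log_or, [1.0 / v for v in var])
ror = math.exp(b)
```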

Relevance:

30.00%

Publisher:

Abstract:

Intra-session network coding has been shown to offer significant gains in terms of achievable throughput and delay in settings where one source multicasts data to several clients. In this paper, we consider a more general scenario where multiple sources transmit data to sets of clients over a wireline overlay network. We propose a novel framework for efficient rate allocation in networks where intermediate network nodes have the opportunity to combine packets from different sources using randomized network coding. We formulate the problem as the minimization of the average decoding delay in the client population and solve it with a gradient-based stochastic algorithm. Our optimized inter-session network coding solution is evaluated in different network topologies and is compared with basic intra-session network coding solutions. Our results show the benefits of proper coding decisions and effective rate allocation for lowering the decoding delay when the network is used by concurrent multicast sessions.
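The flavor of the rate allocation can be sketched with a deterministic toy model: each session's decoding delay falls with its rate, and a projected gradient step keeps the rates on the shared capacity constraint. The delay model `B/r`, the numbers, and the projection are illustrative assumptions, far simpler than the paper's stochastic algorithm:

```python
# Toy stand-in for delay-driven rate allocation: session i must deliver
# B[i] units, so its decoding delay is B[i] / r[i]; rates share capacity C.
B = [8.0, 4.0, 2.0]
C = 10.0
r = [C / len(B)] * len(B)

step = 0.05
for _ in range(5000):
    grad = [-Bi / ri ** 2 for Bi, ri in zip(B, r)]   # d(B/r)/dr
    mean_g = sum(grad) / len(grad)
    # project the gradient onto the constraint sum(r) = C (zero-sum step)
    r = [max(ri - step * (g - mean_g), 1e-3) for ri, g in zip(r, grad)]

# At the optimum B_i / r_i**2 is equal across sessions, i.e. r_i ∝ sqrt(B_i):
# heavier sessions get more rate, but with diminishing returns.
```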

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND To summarize the available evidence on the effectiveness of psychological interventions for patients with post-traumatic stress disorder (PTSD). METHOD We searched bibliographic databases and reference lists of relevant systematic reviews and meta-analyses for randomized controlled trials that compared specific psychological interventions for adults with PTSD symptoms either head-to-head or against control interventions using non-specific intervention components, or against wait-list control. Two investigators independently extracted the data and assessed trial characteristics. RESULTS The analyses included 4190 patients in 66 trials. An initial network meta-analysis showed large effect sizes (ESs) for all specific psychological interventions (ESs between -1.10 and -1.37) and moderate effects of psychological interventions that were used to control for non-specific intervention effects (ESs -0.58 and -0.62). ES differences between various types of specific psychological interventions were absent to small (ES differences between 0.00 and 0.27). Considerable between-trial heterogeneity occurred (τ² = 0.30). Stratified analyses revealed that trials that adhered to DSM-III/IV criteria for PTSD were associated with larger ESs. However, considerable heterogeneity remained. Heterogeneity was reduced in trials with adequate concealment of allocation and in large-sized trials. We found evidence for small-study bias. CONCLUSIONS Our findings show that patients with a formal diagnosis of PTSD and those with subclinical PTSD symptoms benefit from different psychological interventions. We did not identify any intervention that was consistently superior to other specific psychological interventions. However, the robustness of evidence varies considerably between different psychological interventions for PTSD, with most robust evidence for cognitive behavioral and exposure therapies.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores, the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age, and Pressure (GAP) score, using data from the UK Trauma Audit and Research Network. First, we assessed discrimination, using the area under the receiver operating characteristic (ROC) curve, and calibration, comparing mortality rates with those originally published. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for the GAP score (95% confidence interval, 86.7-87.7) and 86.8% for the MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low-, medium-, and high-risk GAP categories, respectively; in the low- and medium-risk groups, these rates were almost double those previously published. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with the results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve.
CONCLUSION We found both scores to be valid triage tools for stratifying emergency department patients according to their risk of death. MGAP calibrated better, but GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
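For orientation, a sketch of the GAP score and its originally published risk bands, as we understand them from the original publication; the point values and cutoffs below are from memory and should be checked against the source before any use:

```python
def gap_score(gcs, age, sbp):
    """GAP score, as originally described (assumed values, verify against
    the source): GCS value (3-15), plus 3 points if age < 60, plus
    systolic-BP points: 6 if SBP > 120, 4 if 60 <= SBP <= 120, else 0."""
    score = gcs
    if age < 60:
        score += 3
    if sbp > 120:
        score += 6
    elif sbp >= 60:
        score += 4
    return score

def gap_risk(score):
    # Originally published bands; the study above proposes revised cutoffs.
    if score >= 19:
        return "low"
    if score >= 11:
        return "medium"
    return "high"
```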

Relevance:

30.00%

Publisher:

Abstract:

In this work, we propose a novel network-coding-enabled NDN architecture for the delivery of scalable video. Our scheme utilizes network coding in order to address a problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources necessitates coordination of the forwarding decisions. To optimize the performance of the proposed network-coding-based NDN protocol and render it appropriate for the transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects will maximize the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for the transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimal performance.
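A Bloom filter, as used above to summarize Interest/Data state compactly, can be sketched in a few lines; the class below is a generic illustration (the bit-array size, hash count, and hashing scheme are arbitrary choices), not the paper's implementation:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: space-efficient set membership with a small
    false-positive rate and no false negatives (illustrative sketch)."""
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = 0  # the whole bit array as one Python int

    def _positions(self, item):
        # Derive k bit positions by salting one cryptographic hash.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

bf = BloomFilter()
bf.add("interest:/video/layer0/seg3")   # hypothetical Interest name
```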

Relevance:

30.00%

Publisher:

Abstract:

Coccolithophores are unicellular phytoplankton that produce calcium carbonate coccoliths as an exoskeleton. Emiliania huxleyi, the most abundant coccolithophore in the world's ocean, plays a major role in the global carbon cycle by regulating the exchange of CO2 across the ocean-atmosphere interface through photosynthesis and calcium carbonate precipitation. As CO2 concentration is rising in the atmosphere, the ocean is acidifying and ammonium (NH4) concentration of future ocean water is expected to rise. The latter is attributed to increasing anthropogenic nitrogen (N) deposition, increasing rates of cyanobacterial N2 fixation due to warmer and more stratified oceans, and decreased rates of nitrification due to ocean acidification. Thus future global climate change will cause oceanic phytoplankton to experience changes in multiple environmental parameters including CO2, pH, temperature and nitrogen source. This study reports on the combined effect of elevated pCO2 and increased NH4 to nitrate (NO3) ratio (NH4/NO3) on E. huxleyi, maintained in continuous cultures for more than 200 generations under two pCO2 levels and two different N sources. Here we show that NH4 assimilation under N-replete conditions depresses calcification at both low and high pCO2, alters coccolith morphology, and increases primary production. We observed that N source and pCO2 synergistically drive growth rates, cell size and the ratio of inorganic to organic carbon. These responses to N source suggest that, compared to increasing CO2 alone, a greater disruption of the organic carbon pump could be expected in response to the combined effect of increased NH4/NO3 ratio and CO2 level in the future acidified ocean. Additional experiments conducted under lower nutrient conditions are needed prior to extrapolating our findings to the global oceans. 
Nonetheless, our results emphasize the need to assess combined effects of multiple environmental parameters on phytoplankton biology in order to develop accurate predictions of phytoplankton responses to ocean acidification.

Relevance:

30.00%

Publisher:

Abstract:

Anthropogenic CO2 emission will lead to an increase in seawater pCO2 of up to 80-100 Pa (800-1000 µatm) within this century and to an acidification of the oceans. Green sea urchins (Strongylocentrotus droebachiensis) occurring in the Kattegat already experience seasonal hypercapnic and hypoxic conditions today. Thus, anthropogenic CO2 emissions will add to existing values and lead to even higher pCO2 values >200 Pa (>2000 µatm). To estimate the green sea urchin's potential to acclimate to acidified seawater, we calculated an energy budget and determined the extracellular acid-base status of adult S. droebachiensis exposed to moderately (102 to 145 Pa, 1007 to 1431 µatm) and highly (284 to 385 Pa, 2800 to 3800 µatm) elevated seawater pCO2 for 10 and 45 days. A 45-day exposure to elevated pCO2 resulted in a shift in energy budgets, leading to reduced somatic and reproductive growth. Metabolic rates were not significantly affected, but ammonium excretion increased in response to elevated pCO2, leading to decreased O:N ratios. These findings suggest that protein metabolism is possibly enhanced under elevated pCO2 in order to support ion homeostasis by increasing net acid extrusion. The perivisceral coelomic fluid acid-base status revealed that S. droebachiensis is able to fully (intermediate pCO2) or partially (high pCO2) compensate extracellular pH (pHe) changes by accumulation of bicarbonate (maximum increase: 2.5 mM), albeit at a slower rate than typically observed in other taxa (10-day duration for full pHe compensation). At intermediate pCO2, sea urchins were able to maintain fully compensated pHe for 45 days.
Sea urchins from the higher pCO2 treatment could be divided into two groups following medium-term acclimation: one group of experimental animals (29%) contained remnants of food in their digestive system and maintained partially compensated pHe (+2.3 mM HCO3), while the other group (71%) exhibited an empty digestive system and a severe metabolic acidosis (-0.5 pH units, -2.4 mM HCO3). There was no difference in mortality between the three pCO2 treatments. The results of this study suggest that S. droebachiensis occurring in the Kattegat might be pre-adapted to hypercapnia due to natural variability in pCO2 in its habitat. We show for the first time that some echinoderm species can actively compensate extracellular pH. Seawater pCO2 values of >200 Pa, which will occur in the Kattegat within this century during seasonal hypoxic events, can possibly be endured only for a short period of a few weeks. Increases in anthropogenic CO2 emissions and leakage from potential sub-seabed CO2 storage (CCS) sites thus pose a threat to the ecologically and economically important species S. droebachiensis.

Relevance:

30.00%

Publisher:

Abstract:

The EU began railway reform in earnest around the turn of the century. Two ‘railway packages’, amounting to a series of directives, have meanwhile been adopted, and a third package has been proposed. A range of complementary initiatives has been undertaken or is underway. This BEEP Briefing inspects the main economic aspects of EU rail reform. After highlighting the dramatic loss of market share of rail since the 1960s, the case for reform is argued to rest on three arguments: the need for greater competitiveness of rail, promoting the (market-driven) diversion of road haulage to rail as a step towards sustainable mobility in Europe, and an end to the disproportionate claims on the public budgets of Member States. The core of the paper deals respectively with market failures in rail and in the internal market for rail services; the complex economic issues underlying vertical separation (unbundling) and pricing options; and the methods, potential and problems of introducing competition in rail freight and in passenger services. Market failures in the rail sector are several (natural monopoly, economies of density, safety and asymmetries of information), exacerbated by no fewer than seven technical and legal barriers precluding the practical operation of an internal rail market. The EU choice to opt for vertical unbundling (with benefits similar in nature to those in other network industries, e.g. preventing opaque cross-subsidisation and greater cost revelation) risks the emergence of considerable coordination costs. The adoption of marginal cost pricing is problematic on economic grounds (drawbacks include arbitrary cost allocation rules in the presence of large economies of scope and relatively large common costs; a non-optimal incentive system, holding back the growth of freight services; and possibly anti-competitive effects of two-part tariffs). Without further detailed harmonisation, it may also lead to many different systems in Member States, causing even greater distortions.
Insofar as freight could develop into a competitive market, a combination of Ramsey pricing (given the incentive for service providers to keep market share) and price ceilings based on stand-alone costs might be superior in terms of competition, market growth and regulatory oversight. The incipient cooperative approach to path coordination and allocation is welcome but likely to be seriously insufficient. The arguments for introducing competition, notably in freight, are many and valuable, e.g. optimal cross-border services, quality differentiation as well as general quality improvement, larger scale for cost recovery and a decrease in rent seeking. Nevertheless, it is not correct to argue for the introduction of competition in rail tout court. It depends on the size of the market and on removing a host of barriers; it requires careful PSO definition and costing; coordination failures also ought to be pre-empted. On the other hand, reform and competition cannot and should not be assessed in a static perspective. Conduct and cost structures will change with reform. Infrastructure and investment in technology are known to generate enormous potential for cost savings, especially when coupled with the EU interoperability programme. All this dynamism may well help to induce entry and further enlarge the (net) welfare gains from EU railway reform. The paper ends with a few pointers for the way forward in EU rail reform.
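The inverse-elasticity logic behind Ramsey pricing can be made concrete with a toy calculation; the cost and elasticity figures below are invented for illustration:

```python
def ramsey_prices(marginal_costs, elasticities, k=0.2):
    """Inverse-elasticity (Ramsey) rule: the relative markup over marginal
    cost, (p - mc) / p, equals k / elasticity, so price-insensitive services
    carry a larger share of the common costs. Solving for p gives
    p = mc / (1 - k / elasticity)."""
    return [mc / (1.0 - k / e) for mc, e in zip(marginal_costs, elasticities)]

# Two services with identical marginal cost: the inelastic one (e = 0.5,
# e.g. captive freight) is priced well above the elastic one (e = 2.0).
prices = ramsey_prices([10.0, 10.0], [2.0, 0.5])
```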

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the utilization of high data-rate channels through multithreaded sending and receiving is studied. As communication technology evolves, higher speeds are used in more and more applications. However, generating traffic at Gbps data rates also brings complications, especially when the UDP protocol is used and packet fragmentation must be avoided, for example in high-speed reliable transport protocols based on UDP. In such situations the Ethernet packet size has to correspond to the standard 1500-byte MTU [1], which is widely used in the Internet. A system may not have enough capacity to send messages at the necessary rate in single-threaded mode. A possible solution is to use more threads, which can be efficient on today's widespread multicore systems. The fact that real networks carry non-constant data flows brings another object of study: automatic adaptation to traffic that changes at runtime. The cases investigated in this paper include adjusting the number of threads to a given speed, and keeping the speed at a given rate when the CPU becomes heavily loaded by other processes while sending data.
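The multithreaded sending idea can be sketched as follows: several workers, each with its own UDP socket, push MTU-sized datagrams toward a local sink. This is a minimal illustration (fixed burst size, no rate control or runtime adaptation), not the system studied in the paper:

```python
import socket
import threading

MTU_PAYLOAD = 1472          # 1500-byte Ethernet MTU minus IP/UDP headers
PACKETS_PER_THREAD = 200    # small demo burst

def sender(dest, count, stats, idx):
    """Each worker owns its own socket, so no lock is shared on the hot path."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"x" * MTU_PAYLOAD
    for _ in range(count):
        sock.sendto(payload, dest)
        stats[idx] += 1     # per-thread counter: no contention across threads
    sock.close()

def run(n_threads=4):
    # Local sink socket; unread datagrams are simply dropped, which is
    # fine for a throughput demo.
    sink = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sink.bind(("127.0.0.1", 0))
    dest = sink.getsockname()
    stats = [0] * n_threads
    threads = [threading.Thread(target=sender,
                                args=(dest, PACKETS_PER_THREAD, stats, i))
               for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    sink.close()
    return sum(stats)
```

An adaptive variant would periodically compare the achieved rate against the target and start or stop workers accordingly.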

Relevance:

30.00%

Publisher:

Abstract:

Identifying water wastage in the form of leaks in a city's water distribution network is becoming essential as droughts present serious threats to a few major cities. In this paper, we propose the deployment of a sensor network for monitoring water flow in a water distribution network. We cover the issues involved in designing such a dedicated sensor network, considering the types of sensors required, their functionality, data collection, and the computation that serves as the leak detection mechanism. The main focus of this paper is appropriate network segmentation, which provides the basis for a hierarchical approach to detecting pipe failures. We show a method for allocating sensors to the network that facilitates effective pipe monitoring. In general, the identified computational problem is hard; the paper presents a heuristic method to build an effective hierarchy of network segments.
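The hierarchical segmentation idea can be illustrated on the simplest topology, a linear run of pipes: recursively bisecting the run yields sensor positions whose pairwise flow comparisons narrow a leak down level by level. The function below is a toy sketch, not the paper's heuristic for general networks:

```python
def place_sensors(pipes, max_segment=2):
    """Recursively bisect a linear run of pipes, recording the cut points
    where flow sensors go, until every segment is small enough. Comparing
    the flow on either side of each sensor then isolates a leaking segment
    hierarchically, in O(log n) comparisons."""
    sensors = []

    def split(lo, hi):
        if hi - lo <= max_segment:
            return
        mid = (lo + hi) // 2
        sensors.append(mid)          # sensor at the cut between segments
        split(lo, mid)
        split(mid, hi)

    split(0, len(pipes))
    return sorted(sensors)
```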

Relevance:

30.00%

Publisher:

Abstract:

Various flexible mechanisms related to quality of service (QoS) provisioning have been specified for uplink traffic at the medium access control (MAC) layer in the IEEE 802.16 standards. Among these mechanisms, the contention-based bandwidth request scheme can be used to indicate bandwidth demands to the base station for the non-real-time polling and best-effort services. These two services are used for most applications with unknown traffic characteristics. Because of the diverse QoS requirements of those applications, service differentiation (SD) is anticipated over the contention-based bandwidth request scheme. In this paper we investigate SD with the bandwidth request scheme by assigning different channel access parameters and bandwidth allocation priorities at different packet arrival probabilities. The effectiveness of the differentiation schemes is evaluated by simulation. We observe that the initial backoff window can be efficient for SD, and that when it is combined with the bandwidth allocation priority, SD performance improves further.
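The effect of the initial backoff window on service differentiation can be seen in a toy slotted-contention model: the class with the smaller window wins the channel far more often. The simulator below is an illustrative sketch, not the full 802.16 mechanism:

```python
import random

def contention_winner(w_a, w_b, rng):
    """One contention round: each station draws a backoff slot uniformly
    from its class window; the smaller slot transmits first (ties collide
    and the round is redrawn)."""
    while True:
        a, b = rng.randrange(w_a), rng.randrange(w_b)
        if a != b:
            return "A" if a < b else "B"

def win_rate(w_a=8, w_b=32, rounds=10_000, seed=1):
    """Fraction of rounds won by the high-priority class A (window w_a)."""
    rng = random.Random(seed)
    wins = sum(contention_winner(w_a, w_b, rng) == "A" for _ in range(rounds))
    return wins / rounds
```

With windows of 8 vs 32 slots, class A should win the large majority of contention rounds, which is the differentiation effect in miniature.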

Relevance:

30.00%

Publisher:

Abstract:

Class-based service differentiation is provided in DiffServ networks. However, this differentiation breaks down under dynamic traffic loads because of fixed weighted scheduling. An adaptive weighted scheduling scheme is proposed in this paper to achieve fair bandwidth allocation among different service classes. In this scheme, the number of active flows and the subscribed bandwidth are estimated from measurements of local queue metrics, and the scheduling weights of each service class are then adjusted for per-flow fairness of excess bandwidth allocation. The adaptive scheme can be combined with any weighted scheduling algorithm. Simulation results show that, compared with fixed weighted scheduling, it effectively improves the fairness of excess bandwidth allocation.
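The per-flow fairness goal can be sketched with a toy allocation: each class first keeps its subscribed rate, and the excess bandwidth is split in proportion to active flow counts, which equalizes the excess share per flow. This is a simplified model of the scheme, with invented numbers:

```python
def allocate(capacity, subscribed, active_flows):
    """Give each class its subscribed rate, then share the excess
    bandwidth in proportion to active flow counts, so the per-flow
    excess is equal across classes (illustrative model)."""
    excess = capacity - sum(subscribed)
    total_flows = sum(active_flows)
    return [s + excess * n / total_flows
            for s, n in zip(subscribed, active_flows)]

# Two classes subscribe 30 each on a 100-unit link; the 40 units of
# excess go 2:8 with the flow counts, i.e. 4 units per flow in each class.
rates = allocate(100.0, [30.0, 30.0], [2, 8])
```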

Relevance:

30.00%

Publisher:

Abstract:

In this paper a Variable Neighborhood Search (VNS) algorithm for solving the Capacitated Single Allocation Hub Location Problem (CSAHLP) is presented. CSAHLP consists of two subproblems: choosing a set of hubs from among all nodes in a network, and finding the optimal allocation of non-hub nodes to hubs once the set of hubs is known. The VNS algorithm was used for the first subproblem, while the CPLEX solver was used for the second. Computational results demonstrate that the proposed algorithm reaches optimal solutions on all 20 test instances for which optimal solutions are known, and does so in short computational time.
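The VNS skeleton (shake in neighborhood k, run a local search, then move or widen k) can be sketched on a toy stand-in for the hub-choice subproblem; the instance and parameters below are illustrative, and the CPLEX allocation step is replaced by a simple swap local search:

```python
import random

# Toy stand-in: pick P hub nodes on a line of N nodes, minimizing the
# total distance from every node to its nearest hub (no capacities).
N, P = 10, 2
DIST = [[abs(i - j) for j in range(N)] for i in range(N)]

def cost(hubs):
    return sum(min(DIST[i][h] for h in hubs) for i in range(N))

def local_search(hubs):
    """First-improvement swap of one hub for one non-hub node."""
    improved = True
    while improved:
        improved = False
        for h in sorted(hubs):
            for v in range(N):
                if v not in hubs and cost((hubs - {h}) | {v}) < cost(hubs):
                    hubs = (hubs - {h}) | {v}
                    improved = True
                    break
            if improved:
                break
    return hubs

def vns(k_max=3, iters=50, seed=0):
    rng = random.Random(seed)
    best = local_search(set(rng.sample(range(N), P)))
    for _ in range(iters):
        k = 1
        while k <= k_max:
            shaken = set(best)
            for _ in range(min(k, P)):       # shake: replace k random hubs
                h = rng.choice(sorted(shaken))
                v = rng.choice([x for x in range(N) if x not in shaken])
                shaken = (shaken - {h}) | {v}
            cand = local_search(shaken)
            if cost(cand) < cost(best):
                best, k = cand, 1            # improvement: restart at k = 1
            else:
                k += 1                       # no luck: widen the neighborhood
    return best, cost(best)
```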

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the potential of reconfiguring distribution networks into islanded microgrids to reduce the network infrastructure reinforcement requirement and to incorporate various dispersed energy resources. The major challenges are properly partitioning the network and making the resultant changes to its protection and automation systems. A reconfiguration method based on the allocation of distributed generation resources, using a heuristic algorithm, is proposed to fulfil this purpose. Cost/reliability data are required for the next-stage task of realising a case study on a particular network.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation discussed resource allocation mechanisms in several network topologies, including infrastructure wireless networks, non-infrastructure wireless networks, and wire-cum-wireless networks. Different networks may have different resource constraints. Based on actual technologies and implementation models, utility functions, game theory, and a modern control algorithm were introduced to balance power, bandwidth, and customer satisfaction in the system. In infrastructure wireless networks, utility functions were used in the Third Generation (3G) cellular network, and the network tried to maximize total utility. In this dissertation, revenue maximization was set as the objective. Compared with previous work on utility maximization, revenue maximization is more practical for cellular network operators to implement. Pricing strategies were studied, and algorithms were given to find the optimal price combination of power and rate that maximizes profit without degrading Quality of Service (QoS) performance. In non-infrastructure wireless networks, power capacity is limited by the small size of the nodes. In such a network, nodes need to transmit traffic not only for themselves but also for their neighbors, so power management becomes the most important issue for overall network performance. Our innovative routing algorithm based on utility functions sets up a flexible framework for different users with different concerns in the same network. This algorithm allows users to make trade-offs between multiple resource parameters, and its flexibility makes it a suitable solution for large-scale non-infrastructure networks. This dissertation also covers non-cooperation problems: by combining game theory and utility functions, equilibrium points can be found among rational users, which enhances cooperation in the network. Finally, a wire-cum-wireless network architecture was introduced.
This network architecture can support multiple services over multiple networks with smart resource allocation methods. Although a SONET-to-WiMAX case was used for the analysis, the mathematical procedure and resource allocation scheme could serve as universal solutions for infrastructure, non-infrastructure, and combined networks.
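One common concrete form of the utility-function approach is weighted proportional fairness: maximizing a sum of weighted logarithmic utilities over a shared capacity has a closed-form solution. The log utility and the numbers below are assumptions for illustration, not necessarily the dissertation's exact models:

```python
def proportional_fair(capacity, weights):
    """Maximize sum_i w_i * log(x_i) subject to sum_i x_i <= capacity.
    Lagrangian stationarity gives w_i / x_i = lambda for every user, so
    each user receives a share proportional to its weight."""
    total = sum(weights)
    return [capacity * w / total for w in weights]

# Three users share 10 units of bandwidth; the third values it 3x as much.
shares = proportional_fair(10.0, [1, 1, 3])
```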