80 results for Network Cost Allocation
Abstract:
Background: The Brazilian consensus recommends a short-term treatment course with clarithromycin, amoxicillin and a proton-pump inhibitor for the eradication of Helicobacter pylori (H. pylori). This treatment course has good efficacy, but cannot be afforded by a large part of the population. Azithromycin, amoxicillin and omeprazole are subsidized, for several purposes, by the Brazilian federal government. A short-term treatment course using these drugs is therefore a low-cost one, but its efficacy in eradicating the bacterium has yet to be demonstrated. The study's purpose was to verify the efficacy of H. pylori eradication in infected patients presenting with peptic ulcer disease, using the association of azithromycin, amoxicillin and omeprazole. Methods: Sixty patients with peptic ulcer diagnosed by upper digestive endoscopy and H. pylori infection documented by rapid urease test, histological analysis and urea breath test were treated for six days with a combination of azithromycin 500 mg and omeprazole 20 mg in a single daily dose, associated with amoxicillin 500 mg three times a day. Eradication was assessed 12 weeks after treatment by means of the same diagnostic tests. The eradication rates were calculated with 95% confidence intervals. Results: The eradication rate was 38% by intention-to-treat and 41% per protocol. Few adverse effects were observed and treatment compliance was high. Conclusion: Despite its low cost and high compliance, the low eradication rate does not allow the recommendation of triple therapy with azithromycin as an adequate treatment for H. pylori infection.
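As a sketch of the confidence-interval calculation mentioned in this abstract, the snippet below computes a normal-approximation 95% CI for an eradication proportion; the 23/60 count is illustrative only (it roughly reproduces the reported 38% intention-to-treat rate) and is not the study's raw data.

```python
from math import sqrt

def eradication_ci(successes, n, z=1.96):
    """Normal-approximation 95% confidence interval for an eradication proportion."""
    p = successes / n
    se = sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Illustrative counts only: 23/60 gives roughly the 38% intention-to-treat rate.
rate, low, high = eradication_ci(23, 60)
print(f"eradication rate {rate:.0%}, 95% CI {low:.0%} to {high:.0%}")
```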
Abstract:
Background: In a number of malaria-endemic regions, tourists and travellers face a declining risk of travel-associated malaria, in part due to successful malaria control. Many millions of visitors to these regions are recommended, via national and international policy, to use chemoprophylaxis, which has a well recognized morbidity profile. The aim was to evaluate whether current malaria chemoprophylaxis policy for travellers is cost-effective when adjusted for endemic transmission risk and duration of exposure; a framework based on partial cost-benefit analysis was used. Methods: Using a three-component model combining a probability component, a cost component and a malaria risk component, the study estimated the health costs avoided through use of chemoprophylaxis, the costs of disease prevention (including adverse events and pre-travel advice for visits to five popular high and low malaria-endemic regions), and the malaria transmission risk, using imported malaria cases and numbers of travellers to malarious countries. By calculating the minimal threshold malaria risk below which the economic costs of chemoprophylaxis are greater than the avoided health costs, we were able to identify the point at which chemoprophylaxis becomes economically rational. Results: The threshold incidence at which malaria chemoprophylaxis policy becomes cost-effective for UK travellers is an accumulated risk of 1.13%, assuming a given set of cost parameters. The period a traveller needs to remain exposed to achieve this accumulated risk varied from 30 to more than 365 days, depending on the region's intensity of malaria transmission. Conclusions: The cost-benefit analysis identified that chemoprophylaxis use was not a cost-effective policy for travellers to Thailand or the Amazon region of Brazil, but was cost-effective for travel to West Africa and for those staying longer than 45 days in India and Indonesia.
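The threshold logic described in this abstract can be illustrated with a small sketch; the cost figures below are hypothetical placeholders rather than the study's parameters, and linear accumulation of risk over time is an added assumption.

```python
def threshold_risk(prophylaxis_cost, avoided_health_cost):
    """Accumulated infection risk above which chemoprophylaxis pays for itself:
    risk * avoided_health_cost >= prophylaxis_cost."""
    return prophylaxis_cost / avoided_health_cost

def days_to_reach(threshold, daily_incidence):
    """Days of exposure needed to accumulate the threshold risk
    (simple linear accumulation assumed)."""
    return threshold / daily_incidence

# Hypothetical figures, not the study's cost parameters.
thr = threshold_risk(prophylaxis_cost=150.0, avoided_health_cost=13_300.0)
print(f"threshold accumulated risk: {thr:.2%}")            # about 1.13% with these inputs
print(f"days at 0.02%/day incidence: {days_to_reach(thr, 0.0002):.0f}")
```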
Abstract:
Background: Tuberculosis is one of the most prominent health problems in the world, causing 1.75 million deaths each year. Rapid clinical diagnosis is important in patients who have comorbidities such as Human Immunodeficiency Virus (HIV) infection. Direct microscopy has low sensitivity and culture takes 3 to 6 weeks [1-3]. Therefore, new tools for TB diagnosis are necessary, especially in health settings with a high prevalence of HIV/TB co-infection. Methods: In a public reference TB/HIV hospital in Brazil, we compared the cost-effectiveness of two strategies for the diagnosis of pulmonary TB (PTB): acid-fast bacilli smear microscopy by Ziehl-Neelsen staining (AFB smear) plus culture, and AFB smear plus a colorimetric test (PCR dot-blot). From May 2003 to May 2004, sputum was collected consecutively from PTB suspects attending the Parthenon Reference Hospital. Sputum samples were examined by AFB smear, culture, and PCR dot-blot. The gold standard was a positive culture combined with the clinical definition of PTB. The cost analysis included health service and patient costs. Results: AFB smear plus PCR dot-blot requires the lowest laboratory investment in equipment (US$ 20,000). The total screening cost of AFB smear plus culture is 3.8 times that of AFB smear plus PCR dot-blot (US$ 5,635,760 versus US$ 1,498,660). Costs per correctly diagnosed case were US$ 50,773 and US$ 13,749 for AFB smear plus culture and AFB smear plus PCR dot-blot, respectively. AFB smear plus PCR dot-blot was more cost-effective than AFB smear plus culture when the cost of treating all correctly diagnosed cases was considered. The cost of returning patients who were not treated because of a negative result to the health service was higher for AFB smear plus culture than for AFB smear plus PCR dot-blot (US$ 374,778,045 versus US$ 110,849,055). Conclusion: AFB smear combined with PCR dot-blot has the potential to be a cost-effective tool in the fight against PTB for patients attending the TB/HIV reference hospital.
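A minimal sketch of the cost-per-correctly-diagnosed-case comparison follows. The screening costs are those quoted in the abstract, while the case counts are placeholders back-calculated so the ratios land near the reported figures; they are not taken from the paper.

```python
def cost_per_correct_diagnosis(total_cost, correct_diagnoses):
    """Cost-effectiveness ratio used to compare the two diagnostic strategies."""
    return total_cost / correct_diagnoses

# Placeholder case counts (111 and 109) chosen only to reproduce the quoted ratios.
culture_cpd = cost_per_correct_diagnosis(5_635_760, 111)
dotblot_cpd = cost_per_correct_diagnosis(1_498_660, 109)
print(f"AFB smear + culture:      US$ {culture_cpd:,.0f} per correctly diagnosed case")
print(f"AFB smear + PCR dot-blot: US$ {dotblot_cpd:,.0f} per correctly diagnosed case")
print(f"screening cost ratio: {5_635_760 / 1_498_660:.1f}x")
```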
Abstract:
Background: In areas with limited infrastructure for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Method: The cost-effectiveness of the OptiMAL(R) RDT and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon ran from the onset of fever until the diagnostic result was provided to the patient, and the temporal reference was the year 2006. The results were expressed as costs per adequately diagnosed case in 2006 U.S. dollars. Sensitivity analysis was performed on key model parameters. Results: In the base case scenario, considering 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$ 549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion: Microscopy is more cost-effective than OptiMAL(R) in these remote areas if high accuracy of microscopy is maintained in the field. Decisions regarding the use of rapid tests for the diagnosis of malaria in these areas depend on the current accuracy of microscopy in the field.
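The incremental cost reported above is an incremental cost-effectiveness ratio (ICER); the sketch below shows the calculation with purely hypothetical inputs, only the structure of the comparison follows the abstract.

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio: extra cost per additional
    adequately diagnosed case of strategy A over strategy B."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Hypothetical inputs, not the study's data.
microscopy_cost, microscopy_cases = 120_000.0, 950
rdt_cost, rdt_cases = 103_500.0, 920
print(f"incremental cost per adequately diagnosed case: "
      f"US$ {icer(microscopy_cost, microscopy_cases, rdt_cost, rdt_cases):.1f}")
```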
Abstract:
In many real situations, randomness is considered to be uncertainty or even confusion that impedes human beings from making correct decisions. Here we study the combined role of randomness and determinism in particle dynamics for complex network community detection. In the proposed model, particles walk in the network and compete with each other in such a way that each of them tries to possess as many nodes as possible. Moreover, we introduce a rule to adjust the level of randomness of particle walking in the network, and we have found that a portion of randomness can largely improve the community detection rate. Computer simulations show that the model has good community detection performance and at the same time presents low computational complexity. (C) 2008 American Institute of Physics.
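A loose sketch of the particle-competition idea follows: walkers mix random moves with "deterministic" moves toward nodes they already dominate, and each node is finally labelled by the particle that visited it most. This is a toy illustration with made-up parameters, not the model or update rule of the paper.

```python
import random
from collections import defaultdict

def particle_competition(adjacency, n_particles, steps=10_000, p_random=0.4, seed=0):
    """Toy particle competition: each particle walks the graph, mixing random
    moves with moves toward nodes it already dominates; nodes end up owned by
    the particle that visited them most often."""
    rng = random.Random(seed)
    nodes = list(adjacency)
    visits = {v: defaultdict(int) for v in nodes}          # node -> particle -> count
    positions = [rng.choice(nodes) for _ in range(n_particles)]
    for _ in range(steps):
        for p, pos in enumerate(positions):
            nbrs = adjacency[pos]
            if rng.random() < p_random:                    # random (exploratory) move
                nxt = rng.choice(nbrs)
            else:                                          # deterministic (defensive) move
                nxt = max(nbrs, key=lambda u: visits[u][p])
            visits[nxt][p] += 1
            positions[p] = nxt
    return {v: max(visits[v], key=visits[v].get) if visits[v] else None for v in nodes}

# Two triangles joined by one edge; each node is labelled by its dominating particle.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(particle_competition(g, n_particles=2))
```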
Abstract:
This article focuses on the identification of the number of paths with different lengths between pairs of nodes in complex networks and how these paths can be used for characterization of topological properties of theoretical and real-world complex networks. This analysis revealed that the number of paths can provide a better discrimination of network models than traditional network measurements. In addition, the analysis of real-world networks suggests that the long-range connectivity tends to be limited in these networks and may be strongly related to network growth and organization.
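A short sketch of the per-length path profile between a node pair, assuming networkx is available (an implementation choice, not one stated in the abstract); simple paths are enumerated up to a cutoff length and counted by length.

```python
import networkx as nx
from collections import Counter

def path_length_profile(graph, source, target, max_len):
    """Count simple paths of each length (in edges) between two nodes, up to
    max_len; this per-length profile is the kind of signature discussed above."""
    lengths = (len(p) - 1 for p in nx.all_simple_paths(graph, source, target, cutoff=max_len))
    return Counter(lengths)

g = nx.barabasi_albert_graph(50, 2, seed=1)
print(path_length_profile(g, 0, 10, max_len=4))
```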
Abstract:
This work clarifies the relation between network circuit (topology) and behaviour (information transmission and synchronization) in active networks, e.g. neural networks. As an application, we show how one can find network topologies that are able to transmit a large amount of information, possess a large number of communication channels, and are robust under large variations of the network coupling configuration. This theoretical approach is general and does not depend on the particular dynamics of the elements forming the network, since the network topology can be determined by finding a Laplacian matrix (the matrix that describes the connections and the coupling strengths among the elements) whose eigenvalues satisfy some special conditions. To illustrate our ideas and theoretical approaches, we use neural networks of electrically connected chaotic Hindmarsh-Rose neurons.
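A minimal sketch of the quantity the approach is built on: the Laplacian matrix of a coupled network and its eigenvalue spectrum. The 4-node ring below is illustrative; no Hindmarsh-Rose dynamics is simulated.

```python
import numpy as np

def laplacian_spectrum(adjacency, coupling=1.0):
    """Laplacian L = coupling * (D - A) and its sorted eigenvalues, the
    quantities on which transmission/synchronization conditions are imposed."""
    A = np.asarray(adjacency, dtype=float)
    L = coupling * (np.diag(A.sum(axis=1)) - A)
    return np.sort(np.linalg.eigvalsh(L))

# Illustrative 4-node ring of coupled units.
ring = [[0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0]]
print(laplacian_spectrum(ring))   # eigenvalues 0, 2, 2, 4 for the unweighted ring
```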
Abstract:
We numerically study the dynamics of a discrete spring-block model introduced by Olami, Feder, and Christensen (OFC) to mimic earthquakes and investigate to what extent this simple model is able to reproduce the observed spatiotemporal clustering of seismicity. Following a recently proposed method to characterize such clustering by networks of recurrent events [J. Davidsen, P. Grassberger, and M. Paczuski, Geophys. Res. Lett. 33, L11304 (2006)], we find that for synthetic catalogs generated by the OFC model these networks have many nontrivial statistical properties. This includes characteristic degree distributions, very similar to what has been observed for real seismicity. There are, however, also significant differences between the OFC model and earthquake catalogs, indicating that this simple model is insufficient to account for certain aspects of the spatiotemporal clustering of seismicity.
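A compact sketch of the OFC update rule, here on a 10-site ring rather than the 2-D lattice typically used; the threshold, dissipation parameter and initial forces are illustrative choices, not those of the study.

```python
import random

def ofc_step(force, neighbors, f_th=1.0, alpha=0.2):
    """One synthetic earthquake of the Olami-Feder-Christensen model: load all
    sites uniformly up to threshold, then relax unstable sites, passing a
    fraction alpha of their force to each neighbor (alpha < 0.25 is non-conservative)."""
    drive = f_th - max(force)
    force = [f + drive for f in force]                 # uniform slow driving
    unstable = [i for i, f in enumerate(force) if f >= f_th]
    size = 0
    while unstable:
        i = unstable.pop()
        f = force[i]
        if f < f_th:
            continue
        force[i] = 0.0
        size += 1
        for j in neighbors[i]:
            force[j] += alpha * f
            if force[j] >= f_th:
                unstable.append(j)
    return force, size

# 10-site ring with random initial forces; sizes of successive synthetic events.
rng = random.Random(42)
forces = [rng.random() for _ in range(10)]
nbrs = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
for _ in range(5):
    forces, s = ofc_step(forces, nbrs)
    print("event size:", s)
```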
Abstract:
We present a scheme for quasiperfect transfer of polariton states from a sender to a spatially separated receiver, both composed of high-quality cavities filled with atomic samples. The sender and the receiver are connected by a nonideal transmission channel (the data bus) modelled by a network of lossy empty cavities. In particular, we analyze the influence of a large class of data-bus topologies on the fidelity and transfer time of the polariton state. Moreover, we also assume dispersive couplings between the polariton fields and the data-bus normal modes in order to achieve a tunneling-like state transfer. Such a tunneling-transfer mechanism, by which the excitation energy of the polariton effectively does not populate the data-bus cavities, is capable of appreciably attenuating the dissipative effects of the data-bus cavities. After deriving a Hamiltonian for the effective coupling between the sender and the receiver, we show that the decay rate of the fidelity is proportional to a cooperativity parameter that weighs the cost of the dissipation rate against the benefit of the effective coupling strength. An increase in the fidelity of the transfer process can be achieved at the expense of longer transfer times. The dependence of both the fidelity and the transfer time on the network topology is analyzed in detail for distinct regimes of parameters. It follows that the data-bus topology can be exploited to control the time of the state-transfer process.
Abstract:
Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs, a combination of local network features. One technique to identify single node-motifs has been presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different series of networks. Third, we provide an example of how the method can be used to analyse network time-series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes, like hubs before them, might be found to play critical roles in real-world networks.
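A simplified stand-in for the node-motif idea: build a per-node vector of local features and flag nodes whose standardized vectors are outliers. The feature set and the z-score criterion below are assumptions for illustration, not the parameters or exact measurements of the published method.

```python
import networkx as nx
import numpy as np

def node_feature_matrix(graph):
    """Per-node feature vectors (degree, clustering, average neighbour degree),
    a simplified stand-in for the local measurements used in node-motif analysis."""
    nodes = list(graph)
    deg = dict(graph.degree())
    clust = nx.clustering(graph)
    annd = nx.average_neighbor_degree(graph)
    return nodes, np.array([[deg[v], clust[v], annd[v]] for v in nodes])

def characteristic_nodes(graph, z_cut=2.0):
    """Nodes whose standardized feature vector lies unusually far from the mean."""
    nodes, X = node_feature_matrix(graph)
    Z = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    scores = np.linalg.norm(Z, axis=1)
    return [v for v, s in zip(nodes, scores) if s > z_cut]

g = nx.barabasi_albert_graph(200, 2, seed=3)
print("characteristic (outlier) nodes:", characteristic_nodes(g))
```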
Abstract:
We investigate the performance of a variant of Axelrod's model for dissemination of culture, the Adaptive Culture Heuristic (ACH), on solving an NP-complete optimization problem, namely, the classification of binary input patterns of size F by a Boolean binary perceptron. In this heuristic, N agents, characterized by binary strings of length F which represent possible solutions to the optimization problem, are fixed at the sites of a square lattice and interact with their nearest neighbors only. The interactions are such that the agents' strings (or cultures) become more similar to the low-cost strings of their neighbors, resulting in the dissemination of these strings across the lattice. Eventually the dynamics freezes into a homogeneous absorbing configuration in which all agents exhibit identical solutions to the optimization problem. We find through extensive simulations that the probability of finding the optimal solution is a function of the reduced variable F/N^(1/4), so that the number of agents must increase with the fourth power of the problem size, N proportional to F^4, to guarantee a fixed probability of success. In this case, we find that the relaxation time to reach an absorbing configuration scales with F^6, which can be interpreted as the overall computational cost of the ACH to find an optimal set of weights for a Boolean binary perceptron, given a fixed probability of success.
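A loose toy version of the ACH dynamics follows: agents on a small lattice copy single differing bits from cheaper neighbours. The update rule, lattice size, sweep count and teacher task below are illustrative simplifications, not the paper's exact protocol.

```python
import random

def perceptron_errors(weights, patterns, labels):
    """Number of patterns misclassified by a binary (+/-1) perceptron."""
    errors = 0
    for x, y in zip(patterns, labels):
        s = sum(w * xi for w, xi in zip(weights, x))
        if (1 if s >= 0 else -1) != y:
            errors += 1
    return errors

def ach(patterns, labels, side=4, sweeps=3000, seed=0):
    """Toy Adaptive Culture Heuristic: agents on a side x side lattice each hold
    a candidate weight string and repeatedly copy one differing bit from a
    nearest neighbour whose solution has lower cost."""
    rng = random.Random(seed)
    F = len(patterns[0])
    agents = [[rng.choice([-1, 1]) for _ in range(F)] for _ in range(side * side)]
    def neighbours(i):
        r, c = divmod(i, side)
        return [((r - 1) % side) * side + c, ((r + 1) % side) * side + c,
                r * side + (c - 1) % side, r * side + (c + 1) % side]
    for _ in range(sweeps):
        i = rng.randrange(side * side)
        j = rng.choice(neighbours(i))
        if perceptron_errors(agents[j], patterns, labels) < perceptron_errors(agents[i], patterns, labels):
            diff = [k for k in range(F) if agents[i][k] != agents[j][k]]
            if diff:
                k = rng.choice(diff)
                agents[i][k] = agents[j][k]        # copy one differing bit
    best = min(agents, key=lambda w: perceptron_errors(w, patterns, labels))
    return best, perceptron_errors(best, patterns, labels)

# Tiny illustrative task: patterns labelled by a hidden teacher perceptron.
rng = random.Random(1)
F = 8
teacher = [rng.choice([-1, 1]) for _ in range(F)]
patterns = [[rng.choice([-1, 1]) for _ in range(F)] for _ in range(20)]
labels = [1 if sum(t * x for t, x in zip(teacher, p)) >= 0 else -1 for p in patterns]
w, err = ach(patterns, labels)
print("misclassified patterns:", err)
```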
Abstract:
We describe the effect of influenza-like illness (ILI) during the outbreak of pandemic (H1N1) 2009 on health care worker (HCW) absenteeism and compare the effectiveness and cost of 2 sick leave policies for HCWs with suspected influenza. We assessed initial 2-day sick leaves plus reassessment until the HCW was asymptomatic (2-day + reassessment policy), and initial 7-day sick leaves (7-day policy). Sick leaves peaked in August 2009: 3% of the workforce received leave for ILI. Costs from May through October reached R$ 798,051.87 (approximately US$ 443,362). The 7-day policy led to a higher monthly rate of sick leave days per 100 HCWs than did the 2-day + reassessment policy (8.72 vs. 3.47 days/100 HCWs; p<0.0001) and resulted in higher costs (US$ 1,128 vs. US$ 609 per HCW on leave). ILI affected HCW absenteeism. The 7-day policy was more costly and not more effective in preventing transmission to patients than the 2-day + reassessment policy.
Abstract:
Chagas disease is still a major public health problem in Latin America. Its causative agent, Trypanosoma cruzi, can be typed into three major groups, T. cruzi I, T. cruzi II and hybrids. These groups each have specific genetic characteristics and epidemiological distributions. Several highly virulent strains are found in the hybrid group; their origin is still a matter of debate. The null hypothesis is that the hybrids are of polyphyletic origin, evolving independently from various hybridization events. The alternative hypothesis is that all extant hybrid strains originated from a single hybridization event. We sequenced both alleles of genes encoding EF-1 alpha, actin and SSU rDNA of 26 T. cruzi strains and DHFR-TS and TR of 12 strains. This information was used for network genealogy analysis and Bayesian phylogenies. We found T. cruzi I and T. cruzi II to be monophyletic and that all hybrids had different combinations of T. cruzi I and T. cruzi II haplotypes plus hybrid-specific haplotypes. Bootstrap values (networks) and posterior probabilities (Bayesian phylogenies) of clades supporting the monophyly of hybrids were far below the 95% confidence interval, indicating that the hybrid group is polyphyletic. We hypothesize that T. cruzi I and T. cruzi II are two different species and that the hybrids are extant representatives of independent events of genome hybridization, which sporadically have sufficient fitness to impact on the epidemiology of Chagas disease.
Abstract:
We proposed a connection admission control (CAC) scheme to monitor the traffic in a multi-rate WDM optical network. The CAC searches for the shortest path connecting the source and destination nodes, assigns wavelengths with enough bandwidth to serve the requests, supervises the traffic at the most heavily used nodes, and, if needed, activates a reserved wavelength to release bandwidth according to the traffic demand. We used a scale-free network topology, which includes highly connected nodes (hubs), to enhance the monitoring procedure. Numerical results obtained from computational simulations show improved network performance evaluated in terms of blocking probability.
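A toy sketch of the admission-control flow described above (shortest-path routing, first-fit wavelength with a bandwidth check, fallback to a reserved wavelength). The networkx-based topology, capacities and demand values are assumptions for illustration, not the paper's CAC.

```python
import networkx as nx

def admit_connection(graph, usage, capacity, src, dst, demand, reserve_wl):
    """Route on the shortest path and take the first working wavelength with
    spare bandwidth on every hop; if none fits, try the reserved wavelength.
    Returns (path, wavelength) on success, or None if the request is blocked."""
    path = nx.shortest_path(graph, src, dst)
    hops = [tuple(sorted(h)) for h in zip(path, path[1:])]
    working = [wl for wl in range(len(capacity)) if wl != reserve_wl]
    for wl in working + [reserve_wl]:
        if all(usage[h][wl] + demand <= capacity[wl] for h in hops):
            for h in hops:
                usage[h][wl] += demand
            return path, wl
    return None  # request blocked

# Scale-free (Barabasi-Albert) toy topology with 2 working + 1 reserved wavelength.
g = nx.barabasi_albert_graph(20, 2, seed=7)
capacity = [10, 10, 10]
usage = {tuple(sorted(e)): [0] * len(capacity) for e in g.edges()}
print(admit_connection(g, usage, capacity, src=0, dst=15, demand=4, reserve_wl=2))
```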
Abstract:
This paper presents an approach for allocating active transmission losses among the agents of the system. The approach uses the primal and dual variable information from the Optimal Power Flow in the loss allocation strategy. The allocation coefficients are determined via Lagrange multipliers. The paper emphasizes the need to consider the operational constraints and parameters of the system in the problem solution. An example for a 3-bus system is presented in detail, as well as a comparative test against the main allocation methods. Case studies on the IEEE 14-bus system are carried out to verify the influence of the system's constraints and parameters on the loss allocation.
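As a rough sketch of the allocation idea, the snippet below shares total losses in proportion to injection-weighted sensitivity coefficients; in the paper these coefficients come from the OPF Lagrange multipliers, whereas here they are plain hypothetical inputs on a made-up 3-bus example.

```python
def allocate_losses(total_losses, injections, sensitivities):
    """Share total active losses among agents in proportion to
    |injection * loss-sensitivity coefficient|. In the paper the coefficients
    are derived from OPF Lagrange multipliers; here they are given directly."""
    raw = {k: abs(injections[k] * sensitivities[k]) for k in injections}
    norm = sum(raw.values())
    return {k: total_losses * raw[k] / norm for k in raw}

# Hypothetical 3-bus example (all values illustrative, not from the paper).
injections    = {"bus1": 90.0, "bus2": 60.0, "bus3": -145.0}   # MW
sensitivities = {"bus1": 0.02, "bus2": 0.05, "bus3": 0.03}     # dP_loss/dP_i
print(allocate_losses(total_losses=5.0, injections=injections, sensitivities=sensitivities))
```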