920 results for Distributed Control Problems
Abstract:
BACKGROUND Patients requiring anticoagulation often suffer from comorbidities such as hypertension. INR monitoring visits give general practitioners (GPs) an opportunity to control blood pressure (BP). We aimed to evaluate the impact of vitamin K antagonist (VKA) monitoring by GPs on BP control in patients with hypertension. METHODS We cross-sectionally analyzed the database of the Swiss Family Medicine ICPC Research using Electronic Medical Records (FIRE) project, which covers 60 general practices in a primary care setting in Switzerland. This database includes 113,335 patients who visited their GP between 2009 and 2013. We identified patients with hypertension based on antihypertensive medication prescribed for ≥6 months, and compared BP control in patients on VKA therapy for ≥3 months with that in patients without such treatment, adjusting for age, sex, observation period, number of consultations and comorbidity. RESULTS We identified 4,412 patients with hypertension and BP recordings in the FIRE database. Among these, 569 (12.9%) were on phenprocoumon (a VKA) and 3,843 (87.1%) had no anticoagulation. Mean systolic and diastolic BP were significantly lower in the VKA group (130.6 ± 14.9 vs 139.8 ± 15.8 and 76.6 ± 7.9 vs 81.3 ± 9.3 mm Hg; p < 0.001 for both). The difference remained after adjusting for possible confounders: systolic and diastolic BP were lower in the VKA group by a mean of -8.4 mm Hg (95% CI -9.8 to -7.0) and -1.5 mm Hg (95% CI -2.3 to -0.7), respectively (p < 0.001 for both). CONCLUSIONS In a large sample of hypertensive patients in Switzerland, VKA treatment was independently associated with better systolic and diastolic BP control. The observed effect could be due to better compliance with antihypertensive medication in patients treated with VKA. GPs should therefore be aware of this possible benefit, especially in patients with lower expected compliance and with multimorbidity.
Abstract:
The present topical review deals with the motor control of facial expressions in humans. Facial expressions are a central part of human communication, and emotional facial expressions play a crucial role in human non-verbal behavior, allowing rapid transfer of information between individuals. Facial expressions can be either voluntarily or emotionally controlled. Recent studies in non-human primates and humans have revealed that the motor control of facial expressions has a distributed neural representation. At least five cortical regions on the medial and lateral aspects of each hemisphere are involved: the primary motor cortex, the ventral lateral premotor cortex, the supplementary motor area on the medial wall, and the rostral and caudal cingulate cortex. The results of studies in humans and non-human primates suggest that the innervation of the face is bilaterally controlled for the upper part and mainly contralaterally controlled for the lower part. Furthermore, the primary motor cortex, the ventral lateral premotor cortex, and the supplementary motor area are essential for the voluntary control of facial expressions, whereas the cingulate cortical areas are important for emotional expression, since they receive input from different structures of the limbic system.
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance may suffer, leading to violations of Service Level Agreements (SLAs) and inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs to specify quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from instantiating service VMs in the correct order with an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources needed to meet the performance requirements of all of their applications, and in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning virtual resources onto the available physical infrastructure so that their operational costs are minimized while the performance of tenants' applications is maximized.
Motivated by the complexities associated with the management and scaling of distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces, and show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We then provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it, and present a resource management system based on a genetic algorithm that allocates virtual resources while optimizing multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
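As a point of reference for the reactive VM-scaling baselines the thesis compares against, a minimal threshold-based scaling rule can be sketched as follows. This is an illustrative assumption, not the thesis's algorithm; the function name, SLA metric, and watermark values are invented for the sketch.

```python
def scale_decision(current_vms, avg_response_ms, sla_ms,
                   low_watermark=0.5, min_vms=1, max_vms=20):
    """Hypothetical reactive rule: add a VM when the SLA response-time
    guarantee is violated, remove one when load is well below it."""
    if avg_response_ms > sla_ms and current_vms < max_vms:
        return current_vms + 1          # scale out: SLA breach
    if avg_response_ms < low_watermark * sla_ms and current_vms > min_vms:
        return current_vms - 1          # scale in: over-provisioned
    return current_vms                  # within band: hold steady
```

SLA-driven approaches like the one in the thesis replace such fixed watermarks with guarantees extracted from semantically enriched SLAs and with benchmark-derived scaling rules.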
Abstract:
HIV-infected women are at increased risk of cervical intra-epithelial neoplasia (CIN) and invasive cervical cancer (ICC), but it has been difficult to disentangle the influences of heavy exposure to HPV infection, inadequate screening, and immunodeficiency. A case-control study including 364 CIN2/3 and 20 ICC cases matched to 1,147 controls was nested in the Swiss HIV Cohort Study (1985-2013). CIN2/3 risk was significantly associated with low CD4+ cell counts, whether measured as nadir (odds ratio (OR) per 100-cell/μL decrease=1.15, 95% CI: 1.08, 1.22), or at CIN2/3 diagnosis (1.10, 95% CI: 1.04, 1.16). An association was evident even for nadir CD4+ 200-349 versus ≥350 cells/μL (OR=1.57, 95% CI: 1.09, 2.25). After adjustment for nadir CD4+, a protective effect of >2-year cART use was seen against CIN2/3 (OR versus never cART use=0.64, 95% CI: 0.42, 0.98). Despite low study power, similar associations were seen for ICC, notably with nadir CD4+ (OR for 50 versus >350 cells/μL= 11.10, 95% CI: 1.24, 100). HPV16-L1 antibodies were significantly associated with CIN2/3, but HPV16-E6 antibodies were nearly exclusively detected in ICC. In conclusion, worsening immunodeficiency, even at only moderately decreased CD4+ cell counts (200-349 CD4+ cells/μL), is a significant risk factor for CIN2/3 and cervical cancer.
Abstract:
BACKGROUND The levels of chlamydia control activities, including primary prevention, effective case management with partner management, and surveillance, were assessed in 2012 across countries in the European Union and European Economic Area (EU/EEA) in a survey initiated by the European Centre for Disease Prevention and Control (ECDC), and the findings were compared with those from a similar survey in 2007. METHODS Experts in the 30 EU/EEA countries were invited to respond to an online questionnaire; 28 countries responded, of which 25 participated in both the 2007 and 2012 surveys. Analyses focused on 13 indicators of chlamydia prevention and control activities; countries were assigned to one of five categories of chlamydia control. RESULTS In 2012, more countries than in 2007 reported availability of national chlamydia case management guidelines (80% vs. 68%), opportunistic chlamydia testing (68% vs. 44%) and consistent use of nucleic acid amplification tests (64% vs. 36%). The number of countries reporting having a national sexually transmitted infection control strategy or a surveillance system for chlamydia did not change notably. In 2012, most countries (18/25, 72%) had implemented primary prevention activities and case management guidelines addressing partner management, compared with 44% (11/25) of countries in 2007. CONCLUSION Overall, chlamydia control activities in EU/EEA countries strengthened between 2007 and 2012. Several countries still need to develop essential chlamydia control activities, whereas others may strengthen implementation and monitoring of existing activities.
Abstract:
Little is known about the aetiology of childhood brain tumours. We investigated anthropometric factors (birth weight, length, maternal age), birth characteristics (e.g. vacuum extraction, preterm delivery, birth order) and exposures during pregnancy (e.g. maternal smoking, working, and dietary supplement intake) in relation to the risk of brain tumour diagnosis among 7- to 19-year-olds. The multinational case-control study in Denmark, Sweden, Norway and Switzerland (CEFALO) included interviews with 352 eligible cases (participation rate 83.2%) and 646 population-based controls (71.1%). Interview data were complemented with data from birth registries and validated by assessing agreement (Cohen's kappa). We used conditional logistic regression models matched on age, sex and geographical region (adjusted for maternal age and parental education) to explore associations between birth factors and childhood brain tumour risk. Agreement between interview and birth registry data ranged from moderate (kappa = 0.54; worked during pregnancy) to almost perfect (kappa = 0.98; birth weight). Neither anthropometric factors nor birth characteristics were associated with childhood brain tumour risk. Maternal vitamin intake during pregnancy was indicative of a protective effect (OR 0.75, 95% CI: 0.56-1.01). No association was seen for maternal smoking or working during pregnancy. We found little evidence that the considered birth factors were related to brain tumour risk among children and adolescents.
Abstract:
BACKGROUND CONTEXT Several randomized controlled trials (RCTs) have compared patient outcomes of anterior (cervical) interbody fusion (AIF) with those of total disc arthroplasty (TDA). Because RCTs have known limitations with regard to their external validity, the comparative effectiveness of the two therapies in daily practice remains unknown. PURPOSE This study aimed to compare patient-reported outcomes after TDA versus AIF based on data from an international spine registry. STUDY DESIGN AND SETTING A retrospective analysis of registry data was carried out. PATIENT SAMPLE Inclusion criteria were degenerative disc or disc herniation of the cervical spine treated by single-level TDA or AIF, no previous surgery, and a Core Outcome Measures Index (COMI) completed at baseline and at least 3 months' follow-up. Overall, 987 patients were identified. OUTCOME MEASURES Neck and arm pain relief and COMI score improvement were the outcome measures. METHODS Three separate analyses were performed to compare TDA and AIF surgical outcomes: (1) mimicking an RCT setting, with admission criteria typical of those in published RCTs, a 1:1 matched analysis was carried out in 739 patients; (2) an analysis was performed on 248 patients outside the classic RCT spectrum, that is, with one or more typical RCT exclusion criteria; (3) a subgroup analysis of all patients with additional follow-up longer than 2 years (n=149). RESULTS Matching resulted in 190 pairs with an average follow-up of 17 months that had no residual significant differences for any patient characteristics. Small but statistically significant differences in outcome were observed in favor of TDA, which are potentially clinically relevant. Subgroup analyses of atypical patients and of patients with longer-term follow-up showed no significant differences in outcome between the treatments. 
CONCLUSIONS The results of this observational study were in accordance with those of the published RCTs, suggesting substantial pain reduction after both AIF and TDA, with slightly greater benefit after arthroplasty. The analysis of atypical patients suggested that, in patients outside the spectrum of clinical trials, both surgical interventions provided benefits similar to those shown for the matched cohort. In the longer-term perspective as well, both therapies resulted in similar benefits to the patients.
Abstract:
Literature on hypertension treatment has demonstrated that a healthy lifestyle is one of the best strategies for hypertension control. To explore the mechanisms of behavioral change for hypertension control, a comprehensive study based on the Transtheoretical Model was carried out in Taiwan during the summer of 2000 with a sample of 350 hypertensive adults living in urban and rural areas of Taipei. The relationships among stages of change, processes of change and demographic factors were analyzed for six health behaviors: low-fat food consumption, alcohol use, smoking, physical activity, weight control, and routine blood pressure checkups. In addition, differences between urban and rural populations in changing their behavior for hypertension control were assessed. The results showed that rural populations had more difficulty than urban populations in avoiding smoking and engaging in physical activity, and that urban populations used the processes of change significantly more than rural populations. The study findings support a strong association between processes and stages of change: individuals who use more processes of change are more inclined to move from the precontemplation stage to the maintenance stage. Counterconditioning, the substitution of alternatives for problem behaviors, significantly helped people in this study to change their diet, engage in physical activity, and check blood pressure regularly; examples include eating more vegetables instead of meat, or treating physical activity as a time to relax rather than another task to accomplish. In addition, self-reevaluation was the most important process in helping people engage in physical activity, and social liberation was the most important process for changing diet behavior.
The findings of this study may be applied to improve health behaviors among rural populations with low income and low education; at the same time, obesity among urban populations should be prevented in order to control hypertension in Taiwan.
Abstract:
This small pilot study compared the effectiveness of two interventions to improve automaticity with basic addition facts, Taped Problems (TP) and Cover, Copy, Compare (CCC), in students aged 6-10. Automaticity was measured using Mathematics Curriculum-Based Measurement (M-CBM) at pretest, after 10 days, and after 20 days of intervention. Our hypothesis was that the TP group would gain higher levels of automaticity more quickly than the CCC and control groups. However, when gain scores were compared, no significant differences were found between groups. Limitations of the study include low treatment integrity and a short duration of intervention.
Abstract:
Malaria poses a significant public health problem worldwide. The World Health Organization indicates that approximately 40% of the world's population, and almost 85% of the population of the South-East Asian region, is at risk of contracting malaria. India, the most populous country in the region, contributes the highest number of malaria cases and deaths attributed to malaria, and Orissa is the Indian state with the highest number of malaria cases and deaths attributable to malaria. A secondary data analysis was carried out to evaluate the effectiveness of the World Bank-assisted Malaria Action Program in the state of Orissa under the health sector reforms of 1995-96. The analysis utilized the Government of India's National Anti Malaria Management Information System (NAMMIS) surveillance data and the National Family Health Survey (NFHS-I and NFHS-II) datasets to compare malaria mortality and morbidity in the state between 1992-93 and 1998-99. Results revealed no effect of the intervention, indicating a 2.18-fold increase in malaria mortality and a 1.53-fold increase in malaria morbidity between 1992-93 and 1998-99 in the state. The difference in age-adjusted malaria morbidity in the state between 1992-93 and 1998-99 proved to be highly significant (t = 4.29, df = 16, p < .0005), whereas the difference in the increase of age-adjusted malaria morbidity during this period between Orissa (with intervention) and Bihar (no intervention) proved to be non-significant (t = 0.0471, df = 16, p < .50). Factors such as underutilization of World Bank funds for the malaria control program, inadequate health care infrastructure, structural adjustment problems, poor management and financial management, parasite resistance to anti-malarial drugs, inadequate supply of drugs, and staff shortages may have contributed to the failure of the program in the state.
Abstract:
The literature on agency problems arising between controlling and minority owners claims that separation of cash-flow and control rights allows controllers to expropriate listed firms, and further that such separation emerges where dual-class shares or pyramiding corporate structures exist. Dual-class shares and pyramiding coexisted in listed companies in China until share reform was implemented in 2005. This paper presents a model of controllers' expropriation behavior, as well as empirical tests of expropriation via particular accounting items and of pyramiding-generated expropriation. Results show that expropriation is apparent for state-controlled listed companies. While the reforms have weakened the power to expropriate, separation remains and still generates expropriation. The size of expropriation is estimated at 7 to 8 per cent of total assets at the mean. If the "one share, one vote" principle were realized, asset inflation could be reduced by 13 per cent.
Abstract:
The use of modular or 'micro' maximum power point tracking (MPPT) converters at module level in series association, commercially known as 'power optimizers', allows the individual adaptation of each panel to the load, solving part of the problems related to partial shadows and differing tilt and/or orientation angles of the photovoltaic (PV) modules. This is particularly relevant in building-integrated PV systems. This paper presents analytical studies of the behaviour of cascaded MPPT converters, together with evaluation test results for a prototype developed under a Spanish national research project. On the one hand, this work develops new expressions that can be used to characterize the behaviour of individual MPPT converters applied to each module and connected in series in a typical grid-connected PV system. On the other hand, a novel characterization method for MPPT converters is developed, and experimental results are obtained for the prototype under individual partial shading, with the converters connected in a typical grid-connected PV array.
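For readers unfamiliar with module-level MPPT, the widely used perturb-and-observe heuristic that such converters commonly implement can be sketched in a few lines. This is generic background, not the paper's characterization method; the function name and step size are illustrative assumptions.

```python
def perturb_and_observe(v, p, prev_v, prev_p, step=0.1):
    """One perturb-and-observe iteration: keep moving the operating voltage
    in the direction that increased extracted power, reverse otherwise.
    Returns the next voltage setpoint for the converter."""
    dv, dp = v - prev_v, p - prev_p
    if dv == 0:                 # no perturbation yet: probe upward
        return v + step
    if dp / dv > 0:             # power still rising with voltage
        return v + step
    return v - step             # past the maximum power point: back off
```

Under partial shading the power-voltage curve develops multiple local maxima, which is one reason the paper studies per-module converters rather than a single tracker for the whole string.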
Abstract:
The problem of fairly distributing the capacity of a network among a set of sessions has been widely studied. In this problem, each session connects a source and a destination via a single path, and its goal is to maximize its assigned transmission rate (i.e., its throughput). Since the links of the network have limited bandwidths, some criterion has to be defined to fairly distribute their capacity among the sessions. A popular criterion is max-min fairness, which, in short, guarantees that each session i gets a rate λi such that no session s can increase λs without causing another session s' to end up with a rate λs' < λs. Many max-min fair algorithms have been proposed, both centralized and distributed. However, to our knowledge, all proposed distributed algorithms require control data to be continuously transmitted in order to recompute the max-min fair rates when needed (because none of them has mechanisms to detect convergence to the max-min fair rates). In this paper we propose B-Neck, a distributed max-min fair algorithm that is also quiescent: in the absence of changes (i.e., session arrivals or departures), once the max-min rates have been computed, B-Neck stops generating network traffic. Quiescence is a key design concept of B-Neck, because B-Neck routers are capable of detecting and notifying changes in the convergence conditions of the max-min fair rates. As far as we know, B-Neck is the first distributed max-min fair algorithm that does not require a continuous injection of control traffic to compute the rates. The correctness of B-Neck is formally proved, and extensive simulations show that B-Neck converges relatively fast and behaves well in the presence of sessions arriving and departing.
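B-Neck computes these rates in a distributed, quiescent way; for reference, the max-min fair allocation itself can be obtained centrally by the classic progressive-filling procedure, sketched below (data layout and names are illustrative, not B-Neck's mechanism):

```python
def max_min_fair_rates(capacities, sessions):
    """Progressive filling: repeatedly find the bottleneck link (smallest
    fair share among its still-unfrozen sessions), give that share to every
    session crossing it, and freeze those sessions.

    capacities: dict link -> capacity
    sessions:   dict session -> list of links on its path
    returns:    dict session -> max-min fair rate
    """
    remaining = dict(capacities)          # spare capacity per link
    active = dict(sessions)               # sessions not yet frozen
    rates = {}
    while active:
        # active sessions crossing each link, and the fair share each
        # still-used link could offer them
        users = {l: [s for s, p in active.items() if l in p] for l in remaining}
        share = {l: remaining[l] / len(u) for l, u in users.items() if u}
        bottleneck = min(share, key=share.get)
        rate = share[bottleneck]
        for s in users[bottleneck]:       # freeze sessions at the bottleneck
            rates[s] = rate
            for l in active[s]:
                remaining[l] -= rate
            del active[s]
    return rates
```

For example, with links of capacity 10 and 4, a session A using only the first link, and a session B using both, B is bottlenecked at rate 4 and A then absorbs the remaining capacity 6 of the shared link.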
Abstract:
Belief propagation (BP) is a technique for distributed inference in wireless networks and is often used even when the underlying graphical model contains cycles. In this paper, we propose a uniformly reweighted BP scheme that reduces the impact of cycles by weighting messages by a constant 'edge appearance probability' ρ ≤ 1. We apply this algorithm to distributed binary hypothesis testing problems (e.g., distributed detection) in wireless networks with Markov random field models. We demonstrate that, in the considered setting, the proposed method outperforms standard BP while maintaining similar complexity. We then show that the optimal ρ can be approximated as a simple function of the average node degree, and can hence be computed in a distributed fashion through a consensus algorithm.
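The consensus step mentioned here can be illustrated with plain linear average consensus: each node repeatedly nudges its estimate toward those of its neighbours, and once every node holds the average degree it can evaluate the degree-to-ρ mapping locally. The sketch below is an assumption for illustration only; in particular, `rho_from_avg_degree` is a made-up placeholder, not the paper's approximation.

```python
def consensus_average(adjacency, values, iters=200, step=0.1):
    """Linear average consensus, x <- (I - step*L) x, written per node.
    adjacency: node -> list of neighbours; values: node -> initial value."""
    x = {n: float(values[n]) for n in adjacency}
    for _ in range(iters):
        x = {n: x[n] + step * sum(x[m] - x[n] for m in adjacency[n])
             for n in adjacency}
    return x  # every node's entry converges to the network-wide average

def rho_from_avg_degree(avg_degree):
    """Placeholder mapping into (0, 1]; the paper derives its own
    approximation, which is not reproduced here."""
    return min(1.0, 2.0 / max(avg_degree, 2.0))
```

Seeding `values` with each node's own degree makes the consensus output the average node degree, from which each node can compute ρ without any central coordinator.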
Abstract:
Many of the emerging telecom services make use of Outer Edge Networks, in particular Home Area Networks. The configuration and maintenance of such services may not be under the full control of the telecom operator, which still needs to guarantee the service quality experienced by the consumer. Diagnosing service faults in these scenarios becomes especially difficult, since there may not be full visibility between different domains. This paper describes the fault diagnosis solution developed in the MAGNETO project, based on the application of Bayesian inference to deal with this uncertainty. It also takes advantage of a distributed framework to deploy diagnosis components in the different domains and network elements involved, spanning both the telecom operator and the Outer Edge networks. In addition, MAGNETO features self-learning capabilities to automatically improve diagnosis knowledge over time, and a partition mechanism that allows breaking the overall diagnosis knowledge down into smaller subsets. The MAGNETO solution has been prototyped, adapted to a particular outer edge scenario, and validated on a real testbed. Evaluation of the results shows the potential of our approach for fault management of outer edge networks.
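As a toy illustration of the kind of Bayesian update such a diagnosis component performs (the function name and probabilities here are invented, not MAGNETO's actual model), consider a single fault hypothesis observed through one alarm:

```python
def posterior_fault(prior_fault, p_alarm_given_fault, p_alarm_given_ok):
    """Bayes' rule for one binary fault hypothesis and one observed alarm:
    P(fault | alarm) = P(alarm | fault) * P(fault) / P(alarm)."""
    num = p_alarm_given_fault * prior_fault
    den = num + p_alarm_given_ok * (1.0 - prior_fault)
    return num / den
```

With a 1% prior fault rate, a 90% true-alarm rate and a 10% false-alarm rate, the posterior is only about 8.3%, which illustrates why explicit uncertainty handling matters when alarms are noisy and visibility across domains is partial.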