832 results for role based access control


Abstract:

A significant impediment to deployment of multicast services is the daunting technical complexity of developing, testing and validating congestion control protocols fit for wide-area deployment. Protocols such as pgmcc and TFMCC have recently made considerable progress on the single rate case, i.e. where one dynamic reception rate is maintained for all receivers in the session. However, these protocols have limited applicability, since scaling to session sizes beyond tens of participants necessitates the use of multiple rate protocols. Unfortunately, while existing multiple rate protocols exhibit better scalability, they are both less mature than single rate protocols and suffer from high complexity. We propose a new approach to multiple rate congestion control that leverages proven single rate congestion control methods by orchestrating an ensemble of independently controlled single rate sessions. We describe SMCC, a new multiple rate equation-based congestion control algorithm for layered multicast sessions that employs TFMCC as the primary underlying control mechanism for each layer. SMCC combines the benefits of TFMCC (smooth rate control, equation-based TCP friendliness) with the scalability and flexibility of multiple rates to provide a sound multiple rate multicast congestion control policy.
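
As an aside on the single-rate building block: TFMCC computes each receiver's fair rate from the standard TCP-friendly (TFRC) response function and, in a layered scheme, a receiver subscribes to as many cumulative layers as that rate allows. The sketch below is only illustrative — the function names, packet size, RTT, loss rate and layer rates are assumptions, not values from the paper.

```python
import math

def tcp_friendly_rate(s, rtt, p, t_rto=None):
    """Standard TFRC/TFMCC response function: packet size s (bytes),
    round-trip time rtt (s), loss event rate p; returns bytes/s."""
    if p <= 0:
        return float("inf")
    t_rto = t_rto if t_rto is not None else 4 * rtt
    denom = (rtt * math.sqrt(2 * p / 3)
             + t_rto * 3 * math.sqrt(3 * p / 8) * p * (1 + 32 * p ** 2))
    return s / denom

def cumulative_subscription(layer_rates, receiver_rate):
    """Join the largest prefix of cumulative layers whose summed rate stays
    below the receiver's TCP-friendly rate (illustrative layering rule)."""
    total, joined = 0.0, 0
    for r in layer_rates:
        if total + r > receiver_rate:
            break
        total += r
        joined += 1
    return joined, total

# Hypothetical receiver: 1500-byte packets, 100 ms RTT, 2% loss event rate.
rate = tcp_friendly_rate(s=1500, rtt=0.1, p=0.02)              # roughly 110 kB/s
print(cumulative_subscription([8e3, 16e3, 32e3, 64e3], rate))  # layer rates in bytes/s
```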

Abstract:

Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance becomes an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, a MAC protocol that can switch from energy-efficient operation in normal monitoring to reliable and fast delivery for emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet delivery from all sensor nodes. Topology design supports fault-tolerance by ensuring that there are alternative acceptable routes to data sinks when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes – those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal-cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays with minimal cost to make the network double-covered and non-critical, i.e. all sensor nodes must have length-bounded alternative paths to sinks when an arbitrary sensor node fails. We then evaluate the fault-tolerance of each topology in data-gathering simulations using ER-MAC.
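
For readers unfamiliar with the metaheuristic behind GRASP-ARP/ABP/MSP/MSRP, the following is a generic GRASP skeleton (greedy randomized construction followed by local search). The cost function, candidate set and neighbourhood are placeholders to be supplied by the caller; this is not the paper's actual formulation of relay or sink placement.

```python
import random

def grasp(candidates, cost, neighbors, iterations=50, alpha=0.3, seed=1):
    """Generic GRASP: build a solution greedily but with randomized choices
    from a restricted candidate list (RCL), then improve it by local search;
    keep the best solution over all iterations (illustrative skeleton)."""
    rng = random.Random(seed)
    best = None
    for _ in range(iterations):
        solution, remaining = [], list(candidates)
        while remaining:                       # greedy randomized construction
            remaining.sort(key=lambda c: cost(solution + [c]))
            rcl = remaining[: max(1, int(alpha * len(remaining)))]
            choice = rng.choice(rcl)
            if cost(solution + [choice]) >= cost(solution):
                break                          # no candidate improves the solution
            solution.append(choice)
            remaining.remove(choice)
        improved = True
        while improved:                        # local search on a user-supplied neighbourhood
            improved = False
            for s in neighbors(solution):
                if cost(s) < cost(solution):
                    solution, improved = s, True
                    break
        if best is None or cost(solution) < cost(best):
            best = solution
    return best
```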

Abstract:

Objectives: To evaluate the empirical evidence linking nursing resources to patient outcomes in intensive care settings as a framework for future research in this area. Background: Concerns about patient safety and the quality of care are driving research on the clinical and cost-effectiveness of health care interventions, including the deployment of human resources. This is particularly important in intensive care, where a large proportion of the health care budget is consumed and where nursing staff is the main item of expenditure. Recommendations about staffing levels have been made but may not be evidence based and may not always be achieved in practice. Methods: We searched systematically for studies of the impact of nursing resources (e.g. nurse-patient ratios, nurses' level of education, training and experience) on patient outcomes, including mortality and adverse events, in adult intensive care. Abstracts of articles were reviewed and retrieved if they investigated the relationship between nursing resources and patient outcomes. Characteristics of the studies were tabulated and the quality of the studies assessed. Results: Of the 15 studies included in this review, two reported a statistical relationship between nursing resources and both mortality and adverse events, one reported an association with mortality only, seven studies reported that they could not reject the null hypothesis of no relationship to mortality, and 10 studies (out of 10 that tested the hypothesis) reported a relationship to adverse events. The main explanatory mechanisms were the lack of time for nurses to perform preventative measures or for patient surveillance. The nurses' role in pain control was noted by one author. Studies were mainly observational and retrospective and varied in scope from 1 to 52 units. Recommendations for future research include developing the mechanisms linking nursing resources to patient outcomes, and designing large multi-centre prospective studies that link patients' exposure to nursing care on a shift-by-shift basis over time. (C) 2007 Elsevier Ltd. All rights reserved.

Abstract:

Reflux of gastric contents can lead to development of reflux esophagitis and Barrett's esophagus. Barrett's esophagus is a risk factor for esophageal adenocarcinoma. Damage to DNA may lead to carcinogenesis but is repaired through activation of pathways involving polymorphic enzymes, including human 8-oxoguanine glycosylase 1 (hOGG1), X-ray repair cross-complementing 1 (XRCC1), and xeroderma pigmentosum group D (XPD). Of the single nucleotide polymorphisms identified in these genes, hOGG1 Ser326Cys, XRCC1 Arg399Gln, and XPD Lys751Gln are particularly common in Caucasians and have been associated with lower DNA repair capacity. Small studies have reported associations with XPD Lys751Gln and esophageal adenocarcinoma. XRCC1 Arg399Gln has been linked to Barrett's esophagus and reflux esophagitis. In a population-based case-control study, we examined associations of the hOGG1 Ser326Cys, XRCC1 Arg399Gln, and XPD Lys751Gln polymorphisms with risk of esophageal adenocarcinoma, Barrett's esophagus, and reflux esophagitis. Genomic DNA was extracted from blood samples collected from cases of esophageal adenocarcinoma (n = 210), Barrett's esophagus (n = 212), reflux esophagitis (n = 230), and normal population controls frequency matched for age and sex (n = 248). Polymorphisms were genotyped using TaqMan allelic discrimination assays. Odds ratios and 95% confidence intervals were obtained from logistic regression models adjusted for potential confounding factors. There were no statistically significant associations between these polymorphisms and risk of esophageal adenocarcinoma, Barrett's esophagus, or reflux esophagitis.
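
For reference, the simplest (unadjusted) form of the reported effect measure for a 2×2 genotype-by-disease table is the crude odds ratio with a Woolf-type 95% confidence interval, shown below; the study itself used logistic regression adjusted for potential confounders, so this is only the textbook special case, and the cell labels a–d are generic, not data from the study.

```latex
\mathrm{OR} = \frac{a\,d}{b\,c}, \qquad
95\%\ \mathrm{CI} = \exp\!\left( \ln \mathrm{OR} \pm 1.96
\sqrt{\frac{1}{a} + \frac{1}{b} + \frac{1}{c} + \frac{1}{d}} \right)
```

Here a and c are variant carriers among cases and controls respectively, and b and d the corresponding non-carriers.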

Abstract:

Subspace monitoring has recently been proposed as a condition monitoring tool that requires considerably fewer variables to be analysed compared to dynamic principal component analysis (PCA). This paper analyses subspace monitoring in identifying and isolating fault conditions and shows that the existing work suffers from inherent limitations when complex fault scenarios arise. Based on the assumption that the fault signature is deterministic while the monitored variables are stochastic, the paper introduces a regression-based reconstruction technique to overcome these limitations. The utility of the proposed fault identification and isolation method is shown using a simulation example and the analysis of experimental data from an industrial reactive distillation unit.
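
One standard way to realize a regression-based reconstruction step in the PCA setting — shown here as a minimal sketch, not necessarily the exact formulation of the paper — is to estimate the magnitude of a candidate fault direction by least squares in the residual subspace and check how far the reconstruction reduces the squared prediction error (SPE):

```python
import numpy as np

def reconstruct_fault(x, P, xi):
    """x: sample vector (m,), P: PCA loading matrix of retained components
    (m x k), xi: unit-norm candidate fault direction (m,). Returns the
    least-squares fault magnitude and the SPE after reconstruction."""
    C_res = np.eye(len(x)) - P @ P.T           # projector onto the residual subspace
    f = (xi @ C_res @ x) / (xi @ C_res @ xi)   # regression of the fault magnitude
    r = C_res @ (x - f * xi)                   # residual after removing the candidate fault
    return f, float(r @ r)
```

A candidate direction whose reconstruction drives the SPE back below its control limit is a plausible isolated fault.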

Abstract:

In polymer extrusion, the delivery of a melt which is homogeneous in composition and temperature is paramount for achieving high-quality extruded products. However, advancements in process control are required to reduce temperature variations across the melt flow, which can result in poor product quality. The majority of thermal monitoring methods provide only low-accuracy point or bulk melt temperature measurements, which leads to poor controller performance. Furthermore, the most common conventional proportional-integral-derivative (PID) controllers seem to be incapable of performing well over the nonlinear operating region. This paper presents a model-based fuzzy control approach to reduce the die melt temperature variations across the melt flow while achieving the desired average die melt temperature. Simulation results confirm the efficacy of the proposed controller.
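
A minimal sketch of a Takagi-Sugeno style fuzzy gain-scheduling controller of the kind referred to here, with hypothetical operating regions (screw speed) and local PI gains — the paper's actual model, membership functions and rule base are not reproduced:

```python
def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

class FuzzyPI:
    """Blend local PI gains by the membership of the operating point
    (illustrative rules only; units and values are hypothetical)."""
    def __init__(self):
        # (membership params for screw speed in rpm, Kp, Ki)
        self.rules = [((0, 20, 50), 2.0, 0.10),
                      ((20, 50, 80), 1.2, 0.05),
                      ((50, 80, 120), 0.8, 0.02)]
        self.integral = 0.0

    def update(self, error, speed, dt):
        weights = [triangular(speed, *m) for m, _, _ in self.rules]
        total = sum(weights) or 1.0
        kp = sum(w * k for w, (_, k, _) in zip(weights, self.rules)) / total
        ki = sum(w * k for w, (_, _, k) in zip(weights, self.rules)) / total
        self.integral += error * dt
        return kp * error + ki * self.integral
```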

Abstract:

Oxidative stress appears to be important in the pathogenesis of Barrett's esophagus (BE) and esophageal adenocarcinoma (EAC). Single-nucleotide polymorphisms (SNPs) of antioxidant enzyme genes may play a part in determining individual susceptibility to these diseases. The Factors Influencing the Barrett's Adenocarcinoma Relationship (FINBAR) study is a population-based, case-control study of BE and EAC in Ireland. DNA from EAC (n = 207), BE (≥3 cm of BE at endoscopy with specialized intestinal metaplasia on biopsy, n = 189) and normal population controls (n = 223) was analyzed. Several SNPs spanning the genes for glutathione S-transferase P1 (GSTP1), manganese superoxide dismutase (MnSOD) and glutathione peroxidase 2 (GPX2) were genotyped using multiplex polymerase chain reaction and SNaPshot. The chi-square test was used to compare genotype and allele frequencies between case and control subjects. Linkage disequilibrium between SNPs was quantified using Lewontin's D' value, and haplotype frequency estimates were obtained using Haploview. Eleven SNPs were genotyped (six for GSTP1, three for MnSOD and two for GPX2); all were in Hardy-Weinberg equilibrium. None was significantly associated with EAC or BE even before Bonferroni correction. Odds ratios for EAC for individual SNPs ranged from 0.68 [95% confidence interval (CI) 0.43-1.08] to 1.25 (95% CI 0.73-2.16), and for BE from 0.84 (95% CI 0.52-1.30) to 1.30 (95% CI 0.85-1.97). SNPs in all three genes were in strong linkage disequilibrium (D' > 0.887) but haplotype analysis did not show any significant association with EAC or BE. SNPs involving the GSTP1, MnSOD and GPX2 genes were not associated with BE or EAC. Further studies aimed at identifying susceptibility genes should focus on different antioxidant genes or different pathways.

Abstract:

This paper presents a new methodology for solving the multi-vehicle formation control problem. It employs a unique extension-decomposition-aggregation scheme to transform the overall complex formation control problem into a group of subproblems that interact through boundary interactions or disturbances. It is then proved that the overall formation system is exponentially stable in the sense of Lyapunov if all the individual augmented subsystems (IASs) are stable. Linear matrix inequality (LMI)-based H∞ control methodology is employed to design the decentralized formation controllers, rejecting the impact of formation changes (treated as boundary disturbances) and guaranteeing the stability of all the IASs, consequently maintaining the stability of the overall formation system. Simulation studies are performed to verify the stability, performance, and effectiveness of the proposed strategy.
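
For background on the LMI machinery, H∞ synthesis typically rests on the bounded real lemma: the system ẋ = Ax + Bw, z = Cx + Dw has H∞ norm below γ if there exists P = Pᵀ ≻ 0 satisfying the matrix inequality below. The decentralized design in the paper imposes additional structure on conditions of this type; the plain lemma is shown only as orientation.

```latex
\begin{bmatrix}
A^{\top}P + PA & PB & C^{\top} \\
B^{\top}P & -\gamma I & D^{\top} \\
C & D & -\gamma I
\end{bmatrix} \prec 0, \qquad P = P^{\top} \succ 0
```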

Abstract:

This paper presents a framework for context-driven policy-based QoS control and end-to-end resource management in converged next generation networks. The Converged Networks QoS Framework (CNQF) is being developed within the IU-ATC project, and comprises distributed functional entities whose instances co-ordinate the converged network infrastructure to facilitate scalable and efficient end-to-end QoS management. The CNQF design leverages aspects of TISPAN, IETF and 3GPP policy-based management architectures whilst also introducing important innovative extensions to support context-aware QoS control in converged networks. The framework architecture is presented and its functionalities and operation in specific application scenarios are described.

Abstract:

Nematode neuropeptide systems comprise an exceptionally complex array of ~250 peptidic signaling molecules that operate within a structurally simple nervous system of ~300 neurons. A relatively complete picture of the neuropeptide complement is available for Caenorhabditis elegans, with 30 flp, 38 ins and 43 nlp genes having been documented; accumulating evidence indicates similar complexity in parasitic nematodes from clades I, III, IV and V. In contrast, the picture for parasitic platyhelminths is less clear, with the limited peptide sequence data available providing concrete evidence for only FMRFamide-like peptide (FLP) and neuropeptide F (NPF) signaling systems, each of which only comprises one or two peptides. With the completion of the Schmidtea mediterranea and Schistosoma mansoni genome projects and expressed sequence tag datasets for other flatworm parasites becoming available, the time is ripe for a detailed reanalysis of neuropeptide signaling in flatworms. Although the actual neuropeptides provide limited obvious value as targets for chemotherapeutic-based control strategies, they do highlight the signaling systems present in these helminths and provide tools for the discovery of more amenable targets such as neuropeptide receptors or neuropeptide processing enzymes. Also, they offer opportunities to evaluate the potential of their associated signaling pathways as targets through RNA interference (RNAi)-based target validation strategies. Currently, within both helminth phyla, the flp signaling systems appear to merit further investigation as they are intrinsically linked with motor function, a proven target for successful anti-parasitics; it is clear that some nematode NLPs also play a role in motor function and could have similar appeal. At this time, it is unclear if flatworm NPF and nematode INS peptides operate in pathways that have utility for parasite control. Clearly, RNAi-based validation could be a starting point for scoring potential target pathways within neuropeptide signaling for parasiticide discovery programs. Also, recent successes in the application of in planta-based RNAi control strategies for plant parasitic nematodes reveal a strategy whereby neuropeptide-encoding genes could become targets for parasite control. The possibility of developing these approaches for the control of animal and human parasites is intriguing, but will require significant advances in the delivery of RNAi-triggers.

Abstract:

Cascade control is one of the routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Currently, many control performance assessment methods for cascade control loops are developed under the assumption that all disturbances follow a Gaussian distribution. In practice, however, several disturbance sources act on the manipulated variable, or the upstream loop exhibits nonlinear behaviour. In this paper, a general and effective index for performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model estimated for the cascade control loop is based on minimum entropy, instead of minimum mean squared error, to handle non-Gaussian disturbances. Unlike the MVC index, the proposed control performance index is based on information theory and the minimum entropy criterion. The index is informative and agrees with expected control knowledge. To demonstrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are presented. A comparison with MVC-based cascade control is also included.
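
For context, the classical Gaussian benchmark that this work generalizes is the minimum variance (Harris) index, which compares the theoretically achievable minimum output variance with the actual output variance; the entropy-based index proposed in the paper plays the analogous role for non-Gaussian disturbances (its exact definition is not reproduced here).

```latex
\eta = \frac{\sigma^{2}_{\mathrm{mv}}}{\sigma^{2}_{y}}, \qquad 0 < \eta \le 1
```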

Abstract:

Power electronics plays an important role in the control and conversion of modern electric power systems. In particular, to integrate various renewable energies using DC transmission and to provide more flexible power control in AC systems, significant efforts have been made in the modulation and control of power electronics devices. Pulse width modulation (PWM) is a well-developed technology for the conversion between AC and DC power sources, especially for the purpose of harmonics reduction and energy optimization. As a fundamental decoupled control method, vector control with PI controllers has been widely used in power systems. However, significant power loss occurs during the operation of these devices, and the loss is often dissipated in the form of heat, leading to significant maintenance effort. Though much work has been done to improve power electronics design, little has so far focused on controller design that reduces the controller energy consumption (and hence power loss in the power electronics) while maintaining acceptable system performance. This paper aims to bridge this gap and investigates their correlation. It is shown that a more careful controller design can achieve a better balance between energy consumption in power electronics control and system performance, potentially leading to significant energy savings when integrating renewable power sources.
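
The trade-off between control effort and tracking performance can be illustrated with a toy experiment — a first-order plant under PI control, with plant and gain values that are purely illustrative and unrelated to any specific converter: raising the gains lowers the integrated tracking error but inflates the integrated squared control effort, a rough proxy for the losses incurred in the power stage.

```python
def simulate_pi(kp, ki, T=10.0, dt=0.001, tau=0.5, setpoint=1.0):
    """Simulate tau*dy/dt = -y + u under PI control; return
    (integral squared error, integral squared control effort)."""
    y, integ, j_err, j_u = 0.0, 0.0, 0.0, 0.0
    for _ in range(int(T / dt)):
        e = setpoint - y
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-y + u) / tau
        j_err += e * e * dt
        j_u += u * u * dt
    return j_err, j_u

for kp, ki in [(1.0, 0.5), (5.0, 2.0), (20.0, 10.0)]:
    print(kp, ki, simulate_pi(kp, ki))   # error cost falls, effort cost rises
```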

Abstract:

Congestion control in wireless networks is an important and open issue. Previous research has shown the poor performance of the Transmission Control Protocol (TCP) in such networks. The factors that contribute to the poor performance of TCP in wireless environments are its inability to properly detect and react to network events, its window-based flow control algorithm, which is not suited to the wireless channel, and congestion collapse due to mobility. New rate-based mechanisms have been proposed to mitigate TCP's performance problems in wired and wireless networks. However, these mechanisms also present poor performance, as they lack suitable bandwidth estimation techniques for multi-hop wireless networks. It is thus important to improve congestion control performance in wireless networks, incorporating components that are suitable for wireless environments. A congestion control scheme that provides efficient and fair sharing of the underlying network capacity and available bandwidth among multiple competing applications is crucial for wireless multi-hop networks. The thesis is divided into three parts. First, we present a performance evaluation study of several congestion control protocols against TCP in wireless mesh and ad-hoc networks. The obtained results show that rate-based congestion control protocols need an efficient and accurate underlying available bandwidth estimation technique. The second part of the thesis presents a new link capacity and available bandwidth estimation mechanism denoted rt-Winf (real time wireless inference). The estimation is performed in real time and without the need to intrusively inject packets into the network. Simulation results show that rt-Winf estimates the available bandwidth and capacity accurately and without introducing overhead traffic in the network. The third part of the thesis proposes new congestion control mechanisms to address the congestion control problems of wireless networks. These mechanisms use cross-layer information, obtained by rt-Winf, to accurately and efficiently estimate the available bandwidth and the path capacity over a wireless network path. Evaluation of these proposed mechanisms, through ns-2 simulations, shows that the cooperation between rt-Winf and the congestion control algorithms is able to significantly increase congestion control efficiency and network performance.
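
As a very rough illustration of passive available-bandwidth estimation (this is not rt-Winf's actual inference algorithm, which is considerably more elaborate), one can scale a nominal link capacity by the fraction of time the channel is sensed idle, using busy/idle samples gathered from the MAC layer:

```python
def available_bandwidth(capacity_bps, samples):
    """samples: list of (busy_time, interval) tuples from channel sensing.
    Returns capacity scaled by the measured idle fraction (illustration only)."""
    busy = sum(b for b, _ in samples)
    total = sum(t for _, t in samples)
    idle_fraction = max(0.0, 1.0 - busy / total) if total else 0.0
    return capacity_bps * idle_fraction

# Hypothetical 54 Mbit/s link observed busy 50-70% of the time:
print(available_bandwidth(54e6, [(0.6, 1.0), (0.7, 1.0), (0.5, 1.0)]))
```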

Abstract:

One research area in telecommunications attracting growing interest concerns future 4th-generation mobile communication systems and beyond. In recent years, the concept of community networks has been developed, in which users group together according to common interests. These concepts have been explored horizontally at different communication layers, from community communication networks (e.g. Seattle Wireless or Personal Telco) to peer-to-peer interest networks. However, these networks are usually viewed as overlay networks, or simply networks of free association. In practice, the notion of a self-organized network that is fully service/community oriented and integrally supported in terms of architecture does not exist. This work therefore presents an original realization in this area of community network creation, with an underlying service-oriented architecture that fully supports multiple community networks on the same device, with all the security, trust and service provisioning features required in such scenarios (a node may belong to more than one community network simultaneously). Given their importance for community network systems, particular attention was paid to resource management and access control, both realized in a decentralized way and relying on highly scalable mechanisms. To this end, a policy language that supports the creation of virtual communities is presented. This language is used not only to map the social structure of the community members, but also to manage the devices, resources and services owned by the members in a controlled and distributed manner.
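
A deliberately simplified, hypothetical illustration of a community-scoped access check in the spirit of the policy language described above — the thesis defines its own, much richer and fully distributed language, so the data structure, role names and actions below are assumptions for illustration only:

```python
# Each node may belong to several virtual communities; each community's policy
# maps member roles to the actions they may perform on shared resources.
communities = {
    "neighbourhood-net": {
        "roles": {"alice": "admin", "bob": "member"},
        "permissions": {"admin": {"share_uplink", "manage_members"},
                        "member": {"share_uplink"}},
    },
}

def is_allowed(community, user, action):
    policy = communities.get(community)
    if policy is None:
        return False
    role = policy["roles"].get(user)
    return action in policy["permissions"].get(role, set())

print(is_allowed("neighbourhood-net", "bob", "share_uplink"))    # True
print(is_allowed("neighbourhood-net", "bob", "manage_members"))  # False
```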

Abstract:

In recent years, a new paradigm for communication called cooperative communications has been proposed, for which initial information-theoretic studies have shown the potential for capacity improvements over traditional multi-hop wireless networks. Extensive research has been done to mitigate the impact of fading in wireless networks, mostly focused on Multiple-Input Multiple-Output (MIMO) systems. Recently, cooperative relaying techniques have been investigated to increase the performance of wireless systems by exploiting the diversity created by different single-antenna devices, aiming to reach the performance level of MIMO systems with low-cost devices. Cooperative communication is a promising method to achieve high spectrum efficiency and improve transmission capacity for wireless networks. Cooperative communications is the general idea of pooling the resources of distributed nodes to improve the overall performance of a wireless network. In cooperative networks, the nodes cooperate to help each other: a cooperative node offering help acts like a middleman or proxy and can convey messages from source to destination. Cooperative communication involves exploiting the broadcast nature of the wireless medium to form virtual antenna arrays out of independent single-antenna network nodes for transmission. This research aims at contributing to the field of cooperative wireless networks. The focus of this research is on relay-based Medium Access Control (MAC) protocols. Specifically, I provide a framework for cooperative relaying called RelaySpot, which comprises opportunistic relay selection, cooperative relay scheduling and relay switching. RelaySpot-based solutions are expected to minimize signaling exchange, avoid estimation of channel conditions, and improve the utilization of spatial diversity, minimizing outage and increasing reliability.
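
One widely used opportunistic relay-selection rule from the cooperative relaying literature — shown only to make the concept concrete, not claimed to be RelaySpot's actual criterion — picks the relay whose weaker hop (source→relay or relay→destination) is strongest:

```python
def select_relay(candidates):
    """candidates: dict relay_id -> (snr_source_relay_dB, snr_relay_dest_dB).
    Returns the relay maximizing the minimum of its two hop qualities."""
    return max(candidates, key=lambda r: min(candidates[r]))

print(select_relay({"r1": (12.0, 4.0), "r2": (8.0, 7.5), "r3": (6.0, 9.0)}))  # -> r2
```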