75 results for End-to-side neurorrhaphy


Relevance: 100.00%

Publisher:

Abstract:

Background: This is an update of a previous review (McGuinness 2006). Hypertension and cognitive impairment are prevalent in older people. Hypertension is a direct risk factor for vascular dementia (VaD), and recent studies have suggested that hypertension affects the prevalence of Alzheimer's disease (AD). Does treatment of hypertension therefore prevent cognitive decline?
Objectives: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease.
Search strategy: The Specialized Register of the Cochrane Dementia and Cognitive Improvement Group, The Cochrane Library, MEDLINE, EMBASE, PsycINFO, CINAHL, LILACS, as well as many trials databases and grey literature sources, were searched on 13 February 2008 using the terms: hypertens$ OR anti-hypertens$.
Selection criteria: Randomized, double-blind, placebo-controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months.
Data collection and analysis: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life.
Main results: Four trials including 15,936 hypertensive subjects were identified. Average age was 75.4 years. Mean blood pressure at entry across the studies was 171/86 mmHg. The combined result of the four trials reporting incidence of dementia indicated no significant difference between treatment and placebo (236/7767 versus 259/7660, Odds Ratio (OR) = 0.89, 95% CI 0.74, 1.07), and there was considerable heterogeneity between the trials. The combined results from the three trials reporting change in Mini-Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.42, 95% CI 0.30, 0.53). Both systolic and diastolic blood pressure levels were reduced significantly in the three trials assessing this outcome (WMD = -10.22, 95% CI -10.78, -9.66 for systolic blood pressure; WMD = -4.28, 95% CI -4.58, -3.98 for diastolic blood pressure). Three trials reported adverse effects requiring discontinuation of treatment, and the combined results indicated no significant difference (OR = 1.01, 95% CI 0.92, 1.11). When analysed separately, however, more patients on placebo in Syst Eur 1997 discontinued treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the four studies. Analysis of the included studies in this review was problematic, as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison of the study drug against a usual antihypertensive regimen.
Authors' conclusions: There is no convincing evidence from the trials identified that blood pressure lowering in late life prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were, however, significant problems with analysing the data, owing to the number of patients lost to follow-up and the number of placebo patients who received active treatment; this introduced bias. More robust results may be obtained by conducting a meta-analysis using individual patient data.
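The pooled dementia result above can be sanity-checked from the raw counts it reports. A minimal sketch, assuming a simple unadjusted (crude) odds ratio with a Wald-type confidence interval on the log scale; the review's own pooled estimate weights trials individually, so exact agreement is not expected:

```python
from math import log, exp, sqrt

# Dementia events / arm totals as reported in the review
a, n1 = 236, 7767      # treatment arm: events, total
c, n2 = 259, 7660      # placebo arm: events, total
b, d = n1 - a, n2 - c  # non-events in each arm

or_ = (a * d) / (b * c)               # crude (unadjusted) odds ratio
se = sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of log(OR)
lo = exp(log(or_) - 1.96 * se)
hi = exp(log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The crude figures come out close to the reported pooled OR = 0.89 (95% CI 0.74, 1.07), as expected when one trial dominates the weighting.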

Relevance: 100.00%

Publisher:

Abstract:

To optimize the performance of wireless networks, one needs to consider the impact of key factors such as interference from hidden nodes, the capture effect, the network density and network conditions (saturated versus non-saturated). In this research, our goal is to quantify the impact of these factors and to propose effective mechanisms and algorithms for throughput guarantees in multi-hop wireless networks. For this purpose, we have developed a model that takes into account all these key factors, based on which an admission control algorithm and an end-to-end available bandwidth estimation algorithm are proposed. Given the necessary network information and traffic demands as inputs, these algorithms are able to provide predictive control via an iterative approach. Evaluations using analytical comparison with simulations as well as existing research show that the proposed model and algorithms are accurate and effective.
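The abstract does not give the model's equations, but the admission-control pattern it describes can be sketched. This is a hypothetical toy (all names and thresholds invented here): predict each link's utilization with the candidate flow included, and admit only if no link on the candidate's path would saturate; the paper's interference- and capture-aware iteration is collapsed into a simple additive load estimate:

```python
# Hypothetical sketch of predictive admission control in a multi-hop network:
# admit a new flow only if the predicted utilization of every link it
# traverses stays below a saturation threshold.

def predicted_load(links, flows):
    """Sum the demand of all flows crossing each link (a crude stand-in
    for an interference/capture-aware capacity model)."""
    load = {l: 0.0 for l in links}
    for path, rate in flows:
        for l in path:
            load[l] += rate
    return load

def admit(links, capacity, active, candidate, threshold=0.9):
    """Admit `candidate` = (path, rate) iff no link on its path would
    exceed `threshold` of capacity with the candidate included."""
    path, rate = candidate
    load = predicted_load(links, active + [candidate])
    return all(load[l] / capacity[l] <= threshold for l in path)

links = ["AB", "BC"]
capacity = {"AB": 10.0, "BC": 10.0}
active = [(["AB"], 6.0), (["AB", "BC"], 2.0)]
print(admit(links, capacity, active, (["BC"], 5.0)))  # True: BC load 7/10
print(admit(links, capacity, active, (["AB"], 2.0)))  # False: AB load 10/10
```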

Relevance: 100.00%

Publisher:

Abstract:

Individuals who have been subtly reminded of death display heightened in-group favouritism, or “worldview defense.” Terror management theory argues (i) that death cues engender worldview defense via psychological mechanisms specifically evolved to suppress death anxiety, and (ii) that the core function of religiosity is to suppress death anxiety. Thus, terror management theory predicts that extremely religious individuals will not evince worldview defense. Here, two studies are presented in support of an alternative perspective. According to the unconscious vigilance hypothesis, subtly processed threats (which need not pertain to death) heighten sensitivity to affectively valenced stimuli (which need not pertain to cultural attitudes). From this perspective, religiosity mitigates the influence of mortality-salience only insofar as afterlife doctrines reduce the perceived threat posed by death. Tibetan Buddhism portrays death as a perilous gateway to rebirth rather than an end to suffering; faith in this doctrine should therefore not be expected to nullify mortality-salience effects. In Study 1, devout Tibetan Buddhists who were subtly reminded of death produced exaggerated aesthetic ratings unrelated to cultural worldviews. In Study 2, devout Tibetan Buddhists produced worldview defense following subliminal exposure to non-death cues of threat. The results demonstrate both the domain-generality of the process underlying worldview defense and the importance of religious doctrinal content in moderating mortality-salience effects.

Relevance: 100.00%

Publisher:

Abstract:

Fixed and wireless networks are increasingly converging towards common connectivity with IP-based core networks. Providing effective end-to-end resource and QoS management in such complex heterogeneous converged network scenarios requires unified, adaptive and scalable solutions to integrate and co-ordinate the diverse QoS mechanisms of different access technologies with IP-based QoS. Policy-Based Network Management (PBNM) is one approach that could be employed to address this challenge. Hence, a policy-based framework for end-to-end QoS management in converged networks, CNQF (Converged Networks QoS Management Framework), has been proposed within our project. In this paper, we discuss the CNQF architecture, a Java implementation of its prototype and experimental validation of key elements. We then present a fuzzy-based CNQF resource management approach and study the performance of our implementation with real traffic flows on an experimental testbed. The results demonstrate the efficacy of our resource-adaptive approach for practical PBNM systems.

Relevance: 100.00%

Publisher:

Abstract:

Policy-based network management (PBNM) paradigms provide an effective tool for end-to-end resource management in converged next generation networks by enabling unified, adaptive and scalable solutions that integrate and co-ordinate diverse resource management mechanisms associated with heterogeneous access technologies. In our project, a PBNM framework for end-to-end QoS management in converged networks is being developed. The framework consists of distributed functional entities managed within a policy-based infrastructure to provide QoS and resource management in converged networks. Within any QoS control framework, an effective admission control scheme is essential for maintaining the QoS of flows present in the network. Measurement-based admission control (MBAC) and parameter-based admission control (PBAC) are two commonly used approaches. This paper presents the implementation and analysis of various measurement-based admission control schemes developed within a Java-based prototype of our policy-based framework. The evaluation is made with real traffic flows on a Linux-based experimental testbed where the current prototype is deployed. Our results show that, unlike classic MBAC-only or PBAC-only schemes, a hybrid approach that combines both methods can simultaneously improve admission control and network utilization efficiency.
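The MBAC/PBAC distinction the abstract draws can be illustrated with a small sketch. This is a hypothetical toy, not the paper's scheme: PBAC admits against the sum of declared peak rates, MBAC admits against recent measured load plus a safety margin, and the hybrid requires both checks to pass:

```python
def pbac_ok(declared_rates, new_rate, capacity):
    """Parameter-based check: declared peak rates must fit within capacity."""
    return sum(declared_rates) + new_rate <= capacity

def mbac_ok(measured_samples, new_rate, capacity, margin=0.05):
    """Measurement-based check: mean measured load plus the new flow
    must leave a safety margin below capacity."""
    measured = sum(measured_samples) / len(measured_samples)
    return measured + new_rate <= capacity * (1 - margin)

def hybrid_admit(declared_rates, measured_samples, new_rate, capacity):
    """Hybrid: admit only if both the declared parameters and the
    measurements leave headroom."""
    return (pbac_ok(declared_rates, new_rate, capacity)
            and mbac_ok(measured_samples, new_rate, capacity))

capacity = 100.0
declared = [30.0, 30.0, 20.0]   # declared peak rates (sum 80)
measured = [55.0, 60.0, 50.0]   # bursty actual load, mean 55
print(hybrid_admit(declared, measured, 15.0, capacity))  # True
print(hybrid_admit(declared, measured, 25.0, capacity))  # False: PBAC fails
```

The hybrid rejects a flow whose declared peaks would oversubscribe the link even when measurements look quiet, and vice versa, which is the intuition behind the improved admission/utilization trade-off reported above.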

Relevance: 100.00%

Publisher:

Abstract:

This paper investigates a dynamic buffer management scheme for QoS control of multimedia services in beyond-3G wireless systems. The scheme is studied in the context of the state-of-the-art 3.5G system, i.e. the High Speed Downlink Packet Access (HSDPA), which enhances 3G UMTS to support high-speed packet switched services. Unlike earlier systems, UMTS-evolved systems from HSDPA onwards incorporate mechanisms such as packet scheduling and HARQ in the base station, necessitating data buffering at the air interface. This introduces a potential bottleneck to end-to-end communication. Hence, buffer management at the air interface is crucial for end-to-end QoS support of multimedia services with multiplexed parallel diverse flows, such as video and data in the same end-user session. The dynamic buffer management scheme for HSDPA multimedia sessions with aggregated real-time and non-real-time flows is investigated via extensive HSDPA simulations. The impact of the scheme on end-to-end traffic performance is evaluated with an example multimedia session comprising a real-time streaming flow concurrent with a TCP-based non-real-time flow. Results demonstrate that the scheme can guarantee the end-to-end QoS of the real-time streaming flow, whilst simultaneously protecting the non-real-time flow from starvation, resulting in improved end-to-end throughput performance.
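The abstract does not spell out the algorithm, so the following is only a hypothetical sketch of the general idea: one shared air-interface buffer whose real-time allowance shrinks as non-real-time packets back up, so the NRT flow cannot be starved. The class name and the adaptation rule are invented here:

```python
from collections import deque

class DynamicBuffer:
    """Hypothetical sketch: a buffer of `size` slots shared by a real-time
    (RT) and a non-real-time (NRT) flow. The RT flow may use up to
    `rt_share` of the buffer, and its allowance shrinks dynamically as
    NRT packets queue up, so the NRT flow is never starved."""

    def __init__(self, size=10, rt_share=0.8):
        self.size, self.rt_share = size, rt_share
        self.rt, self.nrt = deque(), deque()

    def rt_limit(self):
        # Reserve more slots for NRT as its backlog grows.
        reserve = max(1, len(self.nrt) // 2)
        return min(int(self.size * self.rt_share), self.size - reserve)

    def enqueue(self, pkt, real_time):
        if len(self.rt) + len(self.nrt) >= self.size:
            return False                  # buffer full: drop
        if real_time:
            if len(self.rt) >= self.rt_limit():
                return False              # RT exceeded its dynamic share
            self.rt.append(pkt)
        else:
            self.nrt.append(pkt)
        return True

buf = DynamicBuffer()
print(buf.rt_limit())                     # 8 slots available to RT initially
for i in range(6):
    buf.enqueue(i, real_time=False)
print(buf.rt_limit())                     # allowance shrank to 7 as NRT backed up
```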

Relevance: 100.00%

Publisher:

Abstract:

Policy-based management is considered an effective approach to address the challenges of resource management in large complex networks. Within the IU-ATC QoS Frameworks project, a policy-based network management framework, CNQF (Converged Networks QoS Framework) is being developed aimed at providing context-aware, end-to-end QoS control and resource management in converged next generation networks. CNQF is designed to provide homogeneous, transparent QoS control over heterogeneous access technologies by means of distributed functional entities that co-ordinate the resources of the transport network through policy-driven decisions. In this paper, we present a measurement-based evaluation of policy-driven QoS management based on CNQF architecture, with real traffic flows on an experimental testbed. A Java based implementation of the CNQF Resource Management Subsystem is deployed on the testbed and results of the experiments validate the framework operation for policy-based QoS management of real traffic flows.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents and investigates a dynamic buffer management scheme for QoS control of multimedia services in a 3.5G wireless system, i.e. the High Speed Downlink Packet Access (HSDPA). HSDPA was introduced to enhance UMTS for high-speed packet switched services. With HSDPA, packet scheduling and HARQ mechanisms in the base station require data buffering at the air interface, thus introducing a potential bottleneck to end-to-end communication. Hence, for multimedia services with multiplexed parallel diverse flows, such as video and data in the same end-user session, buffer management schemes in the base station are essential to support end-to-end QoS provision. In this paper, we propose a dynamic buffer management scheme for HSDPA multimedia sessions with aggregated real-time and non-real-time flows. The end-to-end performance impact of the scheme is evaluated via extensive HSDPA simulations with an example multimedia session comprising a real-time streaming flow concurrent with a TCP-based non-real-time flow. Results demonstrate that the scheme can guarantee the end-to-end QoS of the real-time streaming flow, whilst simultaneously protecting the non-real-time flow from starvation, resulting in improved end-to-end throughput performance.

Relevance: 100.00%

Publisher:

Abstract:

We examine the impact of transmit antenna selection with receive generalized selection combining (TAS/GSC) for cognitive decode-and-forward (DF) relaying in Nakagami-m fading channels. We select the single transmit antenna at the secondary transmitter which maximizes the receive signal-to-noise ratio (SNR), and combine the subset of receive antennas with the largest SNRs at the secondary receiver. To assess the performance, we first derive the probability density function and cumulative distribution function of the end-to-end SNR using the moment generating function. We then derive a new exact closed-form expression for the ergodic capacity. More importantly, by deriving an asymptotic expression for the high-SNR approximation of the ergodic capacity, we gain deep insights into the high-SNR slope and the power offset. Our results show that the high-SNR slope is 1/2 under the proportional interference power constraint. Under the fixed interference power constraint, the high-SNR slope is zero.
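The TAS/GSC selection rule itself is simple to state numerically. A sketch with toy channel gains (not the paper's Nakagami-m analysis): for each transmit antenna, GSC sums the Lc strongest receive-branch SNRs, and TAS then picks the transmit antenna that maximizes this combined SNR:

```python
def tas_gsc_snr(gains, lc):
    """gains[i][j]: instantaneous SNR from transmit antenna i to receive
    antenna j. GSC sums the lc strongest receive branches; TAS picks the
    transmit antenna whose combined SNR is largest."""
    best = -1.0
    for branch in gains:
        snr = sum(sorted(branch, reverse=True)[:lc])
        best = max(best, snr)
    return best

# Two transmit antennas, three receive antennas, combine the best 2 branches.
gains = [[1.0, 3.0, 0.5],
         [2.0, 2.5, 0.4]]
print(tas_gsc_snr(gains, lc=2))  # 4.5: antenna 1's two strongest branches (2.5 + 2.0)
```

With lc equal to the number of receive antennas this reduces to TAS with full maximal-ratio-style combining of SNRs; with lc = 1 it reduces to pure selection combining.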

Relevance: 100.00%

Publisher:

Abstract:

Physical transceivers have hardware impairments that create distortions which degrade the performance of communication systems. The vast majority of technical contributions in the area of relaying neglect hardware impairments and, thus, assume ideal hardware. Such approximations make sense in low-rate systems, but can lead to very misleading results when analyzing future high-rate systems. This paper quantifies the impact of hardware impairments on dual-hop relaying, for both amplify-and-forward and decode-and-forward protocols. The outage probability (OP) in these practical scenarios is a function of the effective end-to-end signal-to-noise-and-distortion ratio (SNDR). This paper derives new closed-form expressions for the exact and asymptotic OPs, accounting for hardware impairments at the source, relay, and destination. A similar analysis for the ergodic capacity is also pursued, resulting in new upper bounds. We assume that both hops are subject to independent but non-identically distributed Nakagami-m fading. This paper validates that the performance loss is small at low rates, but otherwise can be very substantial. In particular, it is proved that for high signal-to-noise ratio (SNR), the end-to-end SNDR converges to a deterministic constant, coined the SNDR ceiling, which is inversely proportional to the level of impairments. This stands in contrast to the ideal hardware case in which the end-to-end SNDR grows without bound in the high-SNR regime. Finally, we provide fundamental design guidelines for selecting hardware that satisfies the requirements of a practical relaying system.
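The ceiling effect is easy to see numerically under the common aggregate-impairment model (an assumption here: per-hop distortion power proportional to kappa^2 times the signal power): the per-hop SNDR saturates at 1/kappa^2 no matter how large the transmit SNR grows, instead of increasing without bound as it would with ideal hardware:

```python
def sndr(rho, gain=1.0, kappa=0.1):
    """Signal-to-noise-and-distortion ratio of one hop, assuming an
    aggregate impairment level kappa: distortion power = kappa^2 * signal."""
    return rho * gain / (kappa**2 * rho * gain + 1.0)

for rho in (10.0, 1e2, 1e4, 1e6):
    print(f"rho = {rho:>9.0f}  SNDR = {sndr(rho):8.2f}")
# The SNDR approaches the ceiling 1/kappa^2 = 100 rather than growing with rho.
```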

Relevance: 100.00%

Publisher:

Abstract:

This work investigates the end-to-end performance of randomized distributed space-time codes with complex Gaussian distribution, when employed in a wireless relay network. The relaying nodes are assumed to adopt a decode-and-forward strategy and transmissions are affected by small and large scale fading phenomena. Extremely tight, analytical approximations of the end-to-end symbol error probability and of the end-to-end outage probability are derived and successfully validated through Monte-Carlo simulation. For the high signal-to-noise ratio regime, a simple, closed-form expression for the symbol error probability is further provided.

Relevance: 100.00%

Publisher:

Abstract:

The exponential growth in user and application data entails new means for providing fault tolerance and protection against data loss. High Performance Computing (HPC) storage systems, which are at the forefront of handling the data deluge, typically employ hardware RAID at the backend. However, such solutions are costly, do not ensure end-to-end data integrity, and can become a bottleneck during data reconstruction. In this paper, we design an innovative solution to achieve a flexible, fault-tolerant, and high-performance RAID-6 solution for a parallel file system (PFS). Our system utilizes low-cost, strategically placed GPUs — both on the client and server sides — to accelerate parity computation. In contrast to hardware-based approaches, we provide full control over the size, length and location of a RAID array on a per-file basis, end-to-end data integrity checking, and parallelization of RAID array reconstruction. We have deployed our system in conjunction with the widely-used Lustre PFS, and show that our approach is feasible and imposes acceptable overhead.
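The parity arithmetic such a system offloads to GPUs is standard RAID-6: a P syndrome (plain XOR of data blocks) and a Q syndrome (Reed-Solomon over GF(2^8), conventionally with generator 2 and field polynomial 0x11D). A plain-Python sketch for one tiny stripe, with single-block recovery via P; the GPU kernels, striping policy and Lustre integration are of course not shown:

```python
def gf_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8), reducing modulo the RAID-6 field polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def pq_syndromes(blocks):
    """P = XOR of data blocks; Q = sum over i of g^i * D_i with g = 2.
    P recovers any single lost data block; P and Q together recover two."""
    p = [0] * len(blocks[0])
    q = [0] * len(blocks[0])
    for i, blk in enumerate(blocks):
        gi = 1
        for _ in range(i):              # gi = g^i with g = 2
            gi = gf_mul(gi, 2)
        for j, byte in enumerate(blk):
            p[j] ^= byte
            q[j] ^= gf_mul(gi, byte)
    return p, q

data = [[0x12, 0x34], [0x56, 0x78], [0x9A, 0xBC]]
p, q = pq_syndromes(data)
# Recover a single lost data block from P and the survivors:
lost = [p[j] ^ data[0][j] ^ data[1][j] for j in range(2)]
print(lost == data[2])  # True
```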

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we investigate the end-to-end performance of dual-hop proactive decode-and-forward relaying networks with Nth best relay selection in the presence of two practical deleterious effects: (i) hardware impairments and (ii) co-channel interference (CCI). In particular, we derive new exact and asymptotic closed-form expressions for the outage probability and average channel capacity of Nth best partial and opportunistic relay selection schemes over Rayleigh fading channels. Insightful discussions are provided. It is shown that, when the system cannot select the best relay for cooperation, the partial relay selection scheme outperforms the opportunistic method under the impact of the same CCI. In addition, without CCI but under the effect of hardware impairments, both selection strategies are shown to have the same asymptotic channel capacity. Monte Carlo simulations are presented to corroborate our analysis.
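The Nth-best selection rule can be illustrated with a quick Monte Carlo sketch (a toy, omitting the paper's hardware impairments and CCI): each relay's end-to-end SNR under decode-and-forward is the minimum of its two exponentially distributed hop SNRs (Rayleigh fading), the Nth largest is selected, and an outage is counted whenever it falls below the threshold:

```python
import random

def outage_prob(n_relays, nth, gamma_th, mean_snr, trials=20000, seed=1):
    """Empirical outage probability of Nth-best opportunistic DF selection:
    per relay, the end-to-end SNR is min of two exponential hop SNRs;
    the Nth largest end-to-end SNR is the selected relay's."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        e2e = [min(rng.expovariate(1 / mean_snr), rng.expovariate(1 / mean_snr))
               for _ in range(n_relays)]
        chosen = sorted(e2e, reverse=True)[nth - 1]
        if chosen < gamma_th:
            outages += 1
    return outages / trials

p1 = outage_prob(4, 1, gamma_th=1.0, mean_snr=10.0)   # best relay selected
p2 = outage_prob(4, 2, gamma_th=1.0, mean_snr=10.0)   # 2nd best selected
print(p1, p2)
```

As expected, falling back to the second-best relay costs roughly an order of magnitude in outage probability here, mirroring the diversity-order loss the closed-form analysis quantifies.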

Relevance: 100.00%

Publisher:

Abstract:

To cope with the rapid growth of multimedia applications that require dynamic levels of quality of service (QoS), cross-layer (CL) design, where multiple protocol layers are jointly combined, has been considered to provide diverse QoS provisions for mobile multimedia networks. However, a general mathematical framework to model such CL schemes in wireless networks with different types of multimedia classes has been lacking. In this paper, to overcome this shortcoming, we propose a novel CL design for integrated real-time/non-real-time traffic with strict preemptive priority, via a finite-state Markov chain. The main strategy of the CL scheme is to design a Markov model that explicitly includes adaptive modulation and coding at the physical layer, queuing at the data link layer, and the bursty nature of multimedia traffic classes at the application layer. Utilizing this Markov model, several important performance metrics in terms of packet loss rate, delay, and throughput are examined. In addition, our proposed framework is exploited in various multimedia applications, for example, end-to-end real-time video streaming and CL optimization, which require priority-based QoS adaptation for different applications. More importantly, the CL framework reveals important guidelines on how to optimize network performance.
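The finite-state Markov machinery behind such a model can be sketched in miniature. A hypothetical single-class version (the paper's AMC channel states and multiple traffic classes are collapsed here into fixed per-slot arrival and service probabilities): states are queue lengths, the stationary distribution is found by power iteration, and metrics such as the loss rate follow from it:

```python
def stationary(P, iters=500):
    """Stationary distribution of a finite Markov chain by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def queue_chain(buf, arrive, serve):
    """States 0..buf = queue length. Per slot: one arrival w.p. `arrive`,
    one departure w.p. `serve` (if non-empty). An arrival that finds the
    buffer full with no simultaneous departure is lost."""
    n = buf + 1
    P = [[0.0] * n for _ in range(n)]
    for q in range(n):
        up = arrive * (1 - serve) if q > 0 else arrive
        down = serve * (1 - arrive) if q > 0 else 0.0
        if q < buf:
            P[q][q + 1] += up
        else:
            P[q][q] += up                 # arrival lost at a full buffer
        if q > 0:
            P[q][q - 1] += down
        P[q][q] += 1 - up - down
    return P

pi = stationary(queue_chain(buf=5, arrive=0.3, serve=0.5))
loss_rate = pi[-1] * 0.3 * (1 - 0.5)      # full buffer, arrival, no departure
print(f"P(full) = {pi[-1]:.4f}, loss rate = {loss_rate:.4f}")
```

The full model would add a channel-state dimension (one service probability per AMC mode) and one queue per traffic class, but the solution method is the same.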

Relevance: 100.00%

Publisher:

Abstract:

Background: This study evaluated the effect of statins in primary biliary cirrhosis (PBC) on endothelial function, antioxidant status and vascular compliance.
Methods: PBC patients with hypercholesterolaemia were randomized to receive 20 mg simvastatin or placebo in a single-blind, randomized controlled trial. Body mass index, blood pressure, glucose, liver function, lipid profile, immunoglobulin levels, serological markers of endothelial function and antioxidant status were measured, as well as vascular compliance calculated from pulse wave analysis and velocity, at recruitment and again at 3, 6, 9 and 12 months.
Results: Twenty-one PBC patients (F = 20, mean age = 55) were randomized to simvastatin 20 mg (n = 11) or matched placebo (n = 10). At completion of the trial, serum cholesterol levels in the simvastatin group were significantly lower than in the placebo group (4.91 mmol/L vs. 6.15 mmol/L, P = 0.01). Low-density lipoprotein (LDL) levels after 12 months were also significantly lower in the simvastatin group (2.33 mmol/L vs. 3.53 mmol/L, P = 0.01). After 12 months of treatment, lipid hydroperoxides were lower (0.49 µmol/L vs. 0.59 µmol/L, P = 0.10) while vitamin C levels were higher (80.54 µmol/L vs. 77.40 µmol/L, P = 0.95) in the simvastatin group. Pulse wave velocity remained similar between treatment groups at 12 months (8.45 m/s vs. 8.80 m/s, P = 0.66). Only one patient discontinued medication owing to side effects. No deterioration in liver transaminases was noted in the simvastatin group.
Conclusions: Statin therapy in patients with PBC appears safe and effective in achieving overall reductions in total cholesterol and LDL levels. Our initial study suggests that simvastatin may also confer advantageous effects on endothelial function and antioxidant status.