869 results for NETWORK DESIGN PROBLEMS
Abstract:
In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used to support the development of complex machines by computer systems as effectively as possible. Motion planning algorithms play an important role here in guaranteeing that these digital prototypes can actually be assembled without collisions. Over the past decades, sampling-based methods have proven particularly successful for this task. They generate a large number of random placements for the object to be installed or removed and use a collision detection mechanism to check each placement for validity. Collision detection therefore plays an essential role in the design of efficient motion planning algorithms. One difficulty for this class of planners is posed by so-called "narrow passages", which occur wherever the freedom of movement of the objects being planned for is severely restricted. In such places it can be hard to find a sufficient number of collision-free samples, and more sophisticated techniques may then be necessary to achieve good algorithm performance.

This thesis is divided into two parts. In the first part, we study parallel collision detection algorithms. Since we target an application in sampling-based motion planners, we choose a problem setting in which we always test the same two objects for collision, but in a large number of different placements. We implement and compare several methods that use bounding volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All described methods were parallelized across multiple CPU cores. In addition, we compare several CUDA kernels for performing BVH-based collision tests on the GPU. Besides different ways of distributing the work across the parallel GPU threads, we investigate the effect of different memory access patterns on the performance of the resulting algorithms. We also present a series of approximate collision tests based on the described methods. When lower test accuracy is tolerable, these yield a further performance improvement.

In the second part of the thesis, we describe a parallel, sampling-based motion planner of our own design for handling highly complex problems with multiple narrow passages. The method operates in two phases. The basic idea is to deliberately tolerate small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in phase I is based on so-called Expansive Space Trees. In addition, we equipped the planner with a push-out operation that can resolve small collisions and thus increase efficiency in regions with restricted freedom of movement. Optionally, our implementation allows the use of approximate collision tests. This further lowers the accuracy of the first planning phase, but also brings a further performance gain. The motion paths resulting from phase I may then not be completely collision-free. To repair these paths, we designed a novel planning algorithm that, restricted locally to a small neighborhood around the existing path, plans a new, collision-free motion path.

We tested the described algorithm on a class of new, difficult metal puzzles, some of which feature several narrow passages. To our knowledge, no collection of comparably complex benchmarks is publicly available, nor did we find a description of comparably complex benchmarks in the motion planning literature.
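The core loop of a sampling-based planner as described above can be sketched as follows. The collision test here is a deliberately trivial stand-in (a point placement against a circular obstacle) for the BVH-based tests the thesis develops; all names are illustrative, not the thesis's actual code:

```python
import random

# Hypothetical stand-in for a BVH-based collision test: a "pose" is just an
# (x, y) position, which collides when it falls inside a circular obstacle.
def is_collision_free(pose, obstacle=(0.5, 0.5), radius=0.2):
    dx, dy = pose[0] - obstacle[0], pose[1] - obstacle[1]
    return dx * dx + dy * dy > radius * radius

def sample_valid_poses(n_samples, seed=42):
    """Generate random poses and keep only the collision-free ones, as a
    sampling-based planner does when building its roadmap or tree. In a
    narrow passage, most samples fail this test, which is why fast
    (possibly approximate) collision checking dominates planner performance."""
    rng = random.Random(seed)
    valid = []
    for _ in range(n_samples):
        pose = (rng.random(), rng.random())
        if is_collision_free(pose):
            valid.append(pose)
    return valid

poses = sample_valid_poses(1000)
```

In the setting studied in part one, the two tested objects stay fixed and only the placement varies, so the acceleration structures can be built once and reused across all samples.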
Abstract:
Cloud services are becoming ever more important in everyone's life. Cloud storage? Web mail? We don't need to work in big IT companies to be surrounded by cloud services. Another concept growing in importance, or at least one that should be considered ever more important, is privacy. The more we rely on services we know close to nothing about, the more we should worry about our privacy. In this work, I analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see whether it can be made completely anonymous: not only will its users be anonymous, but the peers composing it will not know each other's real identities. To make this possible, I use anonymizing networks such as Tor. I start by studying the state of the art of cloud computing, looking at some real examples, and then analyze the architecture of the prototype, highlighting the differences between its distributed nature and the somewhat centralized solutions offered by the well-known vendors. After that, I go as deep as possible into the working principles of anonymizing networks, because they are not something that can simply be 'applied' mindlessly; some de-anonymization techniques are very subtle, so things must be studied carefully. I then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype is run on many machines, orchestrated by a tester script that automatically starts, stops and performs all the required API calls. To obtain these machines, I use Amazon EC2 and its on-demand instances.
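As a minimal sketch of the kind of change involved, a peer's HTTP traffic can be routed through a local Tor daemon via its SOCKS interface. The port number below is Tor's conventional default, and the helper is an illustrative assumption, not the prototype's actual configuration code:

```python
# Tor's default SOCKS proxy port; the actual prototype configuration is an
# assumption for illustration.
TOR_SOCKS_PORT = 9050

def tor_proxy_settings(port=TOR_SOCKS_PORT):
    """Proxy mapping in the URL convention used by common Python HTTP
    clients. The 'socks5h' scheme makes DNS resolution happen inside Tor
    rather than locally, closing one subtle de-anonymization channel
    (DNS leaks) of the kind the thesis warns about."""
    url = f"socks5h://127.0.0.1:{port}"
    return {"http": url, "https": url}

settings = tor_proxy_settings()
```

Routing alone is not sufficient for anonymity, which is precisely why the working principles of the anonymizing network have to be studied before applying it.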
Abstract:
Background: Medication-related problems are common in the growing population of older adults, and inappropriate prescribing is a preventable risk factor. Explicit criteria such as the Beers criteria provide a valid instrument for describing the rate of inappropriate medication (IM) prescriptions among older adults. Objective: To reduce IM prescriptions based on explicit Beers criteria using a nurse-led intervention in a nursing-home (NH) setting. Study Design: The pre/post-design included IM assessment at study start (pre-intervention), a 4-month intervention period, IM assessment after the intervention period (post-intervention) and a further IM assessment at 1-year follow-up. Setting: 204-bed inpatient NH in Bern, Switzerland. Participants: NH residents aged ≥60 years. Intervention: The intervention included four key elements: (i) adaptation of the Beers criteria to the Swiss setting; (ii) IM identification; (iii) IM discontinuation; and (iv) staff training. Main Outcome Measure: IM prescription at study start, after the 4-month intervention period and at 1-year follow-up. Results: The mean±SD resident age was 80.3±8.8 years. Residents were prescribed a mean±SD 7.8±4.0 medications. The prescription rate of IMs decreased from 14.5% pre-intervention to 2.8% post-intervention (relative risk [RR] = 0.2; 95% CI 0.06, 0.5). The risk of IM prescription increased, although not statistically significantly, in the 1-year follow-up period compared with post-intervention (RR = 1.6; 95% CI 0.5, 6.1). Conclusions: This intervention to reduce IM prescriptions based on explicit Beers criteria was feasible, easy to implement in an NH setting, and resulted in a substantial decrease in IMs. These results underscore the importance of involving nursing staff in the medication prescription process in a long-term care setting.
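The headline effect measure above is a simple ratio, which can be reproduced from the two prescription rates given in the abstract (the confidence interval, by contrast, requires the underlying counts, which the abstract does not report):

```python
def relative_risk(rate_intervention, rate_baseline):
    """Relative risk as the ratio of two prescription rates (in %)."""
    return rate_intervention / rate_baseline

# The IM prescription rate fell from 14.5% pre-intervention to 2.8%
# post-intervention; the abstract reports this as RR = 0.2.
rr = relative_risk(2.8, 14.5)  # ≈ 0.19
```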
Abstract:
OBJECTIVE: To determine the effect of glucosamine, chondroitin, or the two in combination on joint pain and on radiological progression of disease in osteoarthritis of the hip or knee. DESIGN: Network meta-analysis. Direct comparisons within trials were combined with indirect evidence from other trials by using a Bayesian model that allowed the synthesis of multiple time points. MAIN OUTCOME MEASURE: Pain intensity. Secondary outcome was change in minimal width of joint space. The minimal clinically important difference between preparations and placebo was prespecified at -0.9 cm on a 10 cm visual analogue scale. DATA SOURCES: Electronic databases and conference proceedings from inception to June 2009, expert contact, relevant websites. ELIGIBILITY CRITERIA FOR SELECTING STUDIES: Large scale randomised controlled trials in more than 200 patients with osteoarthritis of the knee or hip that compared glucosamine, chondroitin, or their combination with placebo or head to head. RESULTS: 10 trials in 3803 patients were included. On a 10 cm visual analogue scale the overall difference in pain intensity compared with placebo was -0.4 cm (95% credible interval -0.7 to -0.1 cm) for glucosamine, -0.3 cm (-0.7 to 0.0 cm) for chondroitin, and -0.5 cm (-0.9 to 0.0 cm) for the combination. For none of the estimates did the 95% credible intervals cross the boundary of the minimal clinically important difference. Industry independent trials showed smaller effects than commercially funded trials (P=0.02 for interaction). The differences in changes in minimal width of joint space were all minute, with 95% credible intervals overlapping zero. CONCLUSIONS: Compared with placebo, glucosamine, chondroitin, and their combination do not reduce joint pain or have an impact on narrowing of joint space. Health authorities and health insurers should not cover the costs of these preparations, and new prescriptions to patients who have not received treatment should be discouraged.
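The decision rule applied to the results above can be made concrete. Reading "cross the boundary" as the credible interval extending beyond the prespecified MCID of -0.9 cm (an interpretation on my part), the reported intervals give:

```python
MCID = -0.9  # prespecified minimal clinically important difference, in cm

def crosses_mcid(lower, upper, mcid=MCID):
    """True if the 95% credible interval extends beyond the MCID, i.e. if a
    clinically important benefit is still compatible with the data.
    `upper` is kept for a complete interval signature, although only the
    lower (more negative) bound can reach the MCID here."""
    return lower < mcid

# Point estimates and 95% credible intervals from the abstract (cm on a
# 10 cm visual analogue scale, difference vs placebo):
estimates = {
    "glucosamine": (-0.4, -0.7, -0.1),
    "chondroitin": (-0.3, -0.7, 0.0),
    "combination": (-0.5, -0.9, 0.0),
}
result = {k: crosses_mcid(lo, hi) for k, (est, lo, hi) in estimates.items()}
```

No interval crosses -0.9 cm, matching the abstract's conclusion that a clinically important benefit can be excluded for all three preparations.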
Abstract:
A new hearing therapy based on direct acoustic cochlear stimulation was developed for the treatment of severe to profound mixed hearing loss. The device efficacy was validated in an initial clinical trial with four patients. This semi-implantable investigational device consists of an externally worn audio processor, a percutaneous connector, and an implantable microactuator. The actuator is placed in the mastoid bone, right behind the external auditory canal. It generates vibrations that are directly coupled to the inner ear fluids and that, therefore, bypass the external and the middle ear. The system is able to provide an equivalent sound pressure level of 125 dB over the frequency range between 125 and 8000 Hz. The hermetically sealed actuator is designed to provide maximal output power by keeping its dimensions small enough to enable implantation. A network model is used to simulate the dynamic characteristics of the actuator to adjust its transfer function to the characteristics of the middle ear. The geometry of the different actuator components is optimized using finite-element modeling.
Abstract:
The Default Mode Network (DMN) is a higher order functional neural network that displays activation during passive rest and deactivation during many types of cognitive tasks. Accordingly, the DMN is viewed to represent the neural correlate of internally-generated self-referential cognition. This hypothesis implies that the DMN requires the involvement of cognitive processes, like declarative memory. The present study thus examines the spatial and functional convergence of the DMN and the semantic memory system. Using an active block-design functional Magnetic Resonance Imaging (fMRI) paradigm and Independent Component Analysis (ICA), we trace the DMN and fMRI signal changes evoked by semantic, phonological and perceptual decision tasks upon visually-presented words. Our findings show less deactivation during semantic compared to the two non-semantic tasks for the entire DMN unit and within left-hemispheric DMN regions, i.e., the dorsal medial prefrontal cortex, the anterior cingulate cortex, the retrosplenial cortex, the angular gyrus, the middle temporal gyrus and the anterior temporal region, as well as the right cerebellum. These results demonstrate that well-known semantic regions are spatially and functionally involved in the DMN. The present study further supports the hypothesis of the DMN as an internal mentation system that involves declarative memory functions.
Abstract:
We conducted a qualitative, multicenter study using a focus group design to explore the lived experiences of persons with any kind of primary sleep disorder with regard to functioning and contextual factors, using six open-ended questions related to the components of the International Classification of Functioning, Disability and Health (ICF). We classified the results using the ICF as a frame of reference. We identified the meaningful concepts within the transcribed data and then linked them to ICF categories according to established linking rules. The six focus groups with 27 participants yielded a total of 6986 relevant concepts, which were linked to a total of 168 different second-level ICF categories. From the patient perspective, the ICF components (1) Body Functions, (2) Activities & Participation, and (3) Environmental Factors were equally represented, while (4) Body Structures appeared markedly less frequently. Of the total number of concepts, 1843 (26%) were assigned to the ICF component Personal Factors, which is not yet classified but could indicate important aspects of resource management and strategy development among those who have a sleep disorder. Treatment of patients with sleep disorders must therefore not be limited to anatomical and (patho-)physiological changes, but should take a more comprehensive view that includes patients' demands, strategies and resources in daily life and the contextual circumstances surrounding the individual.
Abstract:
Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. 
We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. An overall average error of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
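The user-error component highlighted above is defined as the distance from the planned target to the actual needle tip, which in 3D is a plain Euclidean distance. The coordinates below are illustrative only; the study's raw measurements are not given in the abstract:

```python
import math

def target_error(planned, actual):
    """User error as defined in the study: Euclidean distance (mm) from
    the planned target position to the actual needle tip position."""
    return math.dist(planned, actual)

# Illustrative coordinates (mm) in the navigation system's frame:
err = target_error((10.0, 20.0, 30.0), (13.0, 24.0, 30.0))  # 5.0 mm
```

Decomposing the overall error into such per-source distances is what allows the bone-versus-soft-tissue comparison reported above (higher user error for bone punctures, higher tissue deformation error for soft tissue).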
Abstract:
Objective: To analyse the available evidence on cardiovascular safety of non-steroidal anti-inflammatory drugs. Design: Network meta-analysis. Data sources: Bibliographic databases, conference proceedings, study registers, the Food and Drug Administration website, reference lists of relevant articles, and reports citing relevant articles through the Science Citation Index (last update July 2009). Manufacturers of celecoxib and lumiracoxib provided additional data. Study selection: All large scale randomised controlled trials comparing any non-steroidal anti-inflammatory drug with other non-steroidal anti-inflammatory drugs or placebo. Two investigators independently assessed eligibility. Data extraction: The primary outcome was myocardial infarction. Secondary outcomes included stroke, death from cardiovascular disease, and death from any cause. Two investigators independently extracted data. Data synthesis: 31 trials in 116 429 patients with more than 115 000 patient years of follow-up were included. Patients were allocated to naproxen, ibuprofen, diclofenac, celecoxib, etoricoxib, rofecoxib, lumiracoxib, or placebo. Compared with placebo, rofecoxib was associated with the highest risk of myocardial infarction (rate ratio 2.12, 95% credibility interval 1.26 to 3.56), followed by lumiracoxib (2.00, 0.71 to 6.21). Ibuprofen was associated with the highest risk of stroke (3.36, 1.00 to 11.6), followed by diclofenac (2.86, 1.09 to 8.36). Etoricoxib (4.07, 1.23 to 15.7) and diclofenac (3.98, 1.48 to 12.7) were associated with the highest risk of cardiovascular death. Conclusions: Although uncertainty remains, little evidence exists to suggest that any of the investigated drugs are safe in cardiovascular terms. Naproxen seemed least harmful. Cardiovascular risk needs to be taken into account when prescribing any non-steroidal anti-inflammatory drug.
Abstract:
For centuries the science of pharmacognosy has dominated rational drug development until it was gradually substituted by target-based drug discovery in the last fifty years. Pharmacognosy stems from the different systems of traditional herbal medicine and its "reverse pharmacology" approach has led to the discovery of numerous pharmacologically active molecules and drug leads for humankind. But do botanical drugs also provide effective mixtures? Nature has evolved distinct strategies to modulate biological processes, either by selectively targeting biological macromolecules or by creating molecular promiscuity or polypharmacology (one molecule binds to different targets). Widely claimed to be superior over monosubstances, mixtures of bioactive compounds in botanical drugs allegedly exert synergistic therapeutic effects. Despite evolutionary clues to molecular synergism in nature, sound experimental data are still widely lacking to support this assumption. In this short review, the emerging concept of network pharmacology is highlighted, and the importance of studying ligand-target networks for botanical drugs is emphasized. Furthermore, problems associated with studying mixtures of molecules with distinctly different pharmacodynamic properties are addressed. It is concluded that a better understanding of the polypharmacology and potential network pharmacology of botanical drugs is fundamental in the ongoing rationalization of phytotherapy.
Abstract:
Background Randomized controlled trials (RCTs) may be discontinued because of apparent harm, benefit, or futility. Other RCTs are discontinued early because of insufficient recruitment. Trial discontinuation has ethical implications, because participants consent on the premise of contributing to new medical knowledge, Research Ethics Committees (RECs) spend considerable effort reviewing study protocols, and limited resources for conducting research are wasted. Currently, little is known regarding the frequency and characteristics of discontinued RCTs. Methods/Design Our aims are, first, to determine the prevalence of RCT discontinuation for specific reasons; second, to determine whether the risk of RCT discontinuation for specific reasons differs between investigator- and industry-initiated RCTs; third, to identify risk factors for RCT discontinuation due to insufficient recruitment; fourth, to determine at what stage RCTs are discontinued; and fifth, to examine the publication history of discontinued RCTs. We are currently assembling a multicenter cohort of RCTs based on protocols approved between 2000 and 2002/3 by 6 RECs in Switzerland, Germany, and Canada. We are extracting data on RCT characteristics and planned recruitment for all included protocols. Completion and publication status is determined using information from correspondence between investigators and RECs, publications identified through literature searches, or by contacting the investigators. We will use multivariable regression models to identify risk factors for trial discontinuation due to insufficient recruitment. We aim to include over 1000 RCTs of which an anticipated 150 will have been discontinued due to insufficient recruitment. Discussion Our study will provide insights into the prevalence and characteristics of RCTs that were discontinued. 
Effective recruitment strategies and the anticipation of problems are key issues in the planning and evaluation of trials by investigators, Clinical Trial Units, RECs and funding agencies. Identification and modification of barriers to successful study completion at an early stage could help to reduce the risk of trial discontinuation, save limited resources, and enable RCTs to better meet their ethical requirements.
Abstract:
Independent component analysis (ICA) and seed-based approaches (SBA) applied to functional magnetic resonance imaging blood oxygenation level dependent (BOLD) data have become widely used tools for identifying functionally connected, large-scale brain networks. Differences between task conditions, as well as specific alterations of the networks in patients compared with healthy controls, have been reported. However, BOLD imaging cannot quantify absolute network metabolic activity, which is of particular interest in the case of pathological alterations. In contrast, arterial spin labeling (ASL) techniques allow quantification of absolute cerebral blood flow (CBF) at rest and in task-related conditions. In this study, we explored the ability to identify networks in ASL data using ICA and to quantify network activity in terms of absolute CBF values. Moreover, we compared the results to SBA and performed a test-retest analysis. Twelve healthy young subjects performed a finger-tapping block-design experiment. During the task, pseudo-continuous ASL was measured. After CBF quantification, the individual datasets were concatenated and subjected to the ICA algorithm. ICA proved capable of identifying the somato-motor and the default mode networks. Moreover, absolute network CBF within the separate networks during either condition could be quantified. We demonstrate that functional connectivity analysis using ICA and SBA is feasible and robust on ASL-CBF data. CBF functional connectivity is a novel approach that opens a new strategy for evaluating differences in network activity in terms of absolute network CBF, and thus allows quantification of inter-individual differences in the resting state and in task-related activations and deactivations.
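The ICA step described above can be illustrated on toy data. This is not the study's ASL pipeline: here two known independent sources (a block-like "task" signal and a slow drift) are mixed linearly and then unmixed with scikit-learn's FastICA, the standard ICA implementation in Python:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for concatenated per-voxel CBF time series: two
# independent sources mixed linearly, which is the generative model
# ICA assumes.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 400)
s1 = np.sign(np.sin(2 * np.pi * t))    # block-design-like task signal
s2 = np.sin(0.5 * np.pi * t)           # slow drift
S = np.c_[s1, s2] + 0.05 * rng.standard_normal((400, 2))
A = np.array([[1.0, 0.5], [0.4, 1.2]])  # mixing matrix ("spatial maps")
X = S @ A.T                             # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)            # recovered independent components
```

In the actual study the recovered components correspond to spatial networks (somato-motor, default mode), and the quantified CBF values within each component's map give the absolute network activity.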
Abstract:
Objectives: Recent anatomical-functional studies have transformed our understanding of cerebral motor control away from a hierarchical structure and toward parallel and interconnected specialized circuits. Subcortical electrical stimulation during awake surgery provides a unique opportunity to identify white matter tracts involved in motor control. For the first time, this study reports the findings on motor modulatory responses evoked by subcortical stimulation and investigates the cortico-subcortical connectivity of cerebral motor control. Experimental design: Twenty-one selected patients were operated while awake for frontal, insular, and parietal diffuse low-grade gliomas. Subcortical electrostimulation mapping was used to search for interference with voluntary movements. The corresponding stimulation sites were localized on brain schemas using the anterior and posterior commissures method. Principal observations: Subcortical negative motor responses were evoked in 20/21 patients, whereas acceleration of voluntary movements and positive motor responses were observed in three and five patients, respectively. The majority of the stimulation sites were detected rostral of the corticospinal tract near the vertical anterior-commissural line, and additional sites were seen in the frontal and parietal white matter. Conclusions: The diverse interferences with motor function resulting in inhibition and acceleration imply a modulatory influence of the detected fiber network. The subcortical stimulation sites were distributed veil-like, anterior to the primary motor fibers, suggesting descending pathways originating from premotor areas known for negative motor response characteristics. Further stimulation sites in the parietal white matter as well as in the anterior arm of the internal capsule indicate a large-scale fronto-parietal motor control network. Hum Brain Mapp, 2012. © 2012 Wiley Periodicals, Inc.
Abstract:
The deterioration of performance over time is characteristic of sustained attention tasks. This so-called "performance decrement" is measured by the increase of reaction time (RT) over time. Some behavioural and neurobiological mechanisms of this phenomenon are not yet fully understood. Behaviourally, we examined the increase of RT over time and the inter-individual differences in this performance decrement. On the neurophysiological level, we investigated the task-relevant brain areas where neural activity was modulated by RT, and searched for brain areas involved in good performance (i.e. participants with no or moderate performance decrement) as compared to poor performance (i.e. participants with a steep performance decrement). For this purpose, 20 healthy, young subjects performed a carefully designed task for simple sustained attention, namely a low-demanding version of the Rapid Visual Information Processing task. We employed a rapid event-related functional magnetic resonance imaging (fMRI) design. The behavioural results showed a significant increase of RT over time in the whole group, and also revealed that some participants were not as prone to the performance decrement as others; the difference between good and poor performers was statistically significant. Moreover, high BOLD responses were linked to longer RTs in a task-relevant bilateral fronto-cingulate-insular-parietal network. Among these regions, good performance was associated with significantly higher RT-BOLD correlations in the pre-supplementary motor area (pre-SMA). We conclude that the task-relevant bilateral fronto-cingulate-insular-parietal network is a cognitive control network responsible for goal-directed attention. The pre-SMA in particular might be associated with the performance decrement insofar as good performers could sustain activity in this brain region in order to monitor performance declines and adjust behavioural output.
Abstract:
PURPOSE: There is a need for valid and reliable short scales that can be used to assess social networks and social supports and to screen for social isolation in older persons. DESIGN AND METHODS: The present study is a cross-national and cross-cultural evaluation of the performance of an abbreviated version of the Lubben Social Network Scale (LSNS-6), which was used to screen for social isolation among community-dwelling older adult populations in three European countries. Based on the concept of lack of redundancy of social ties we defined clinical cut-points of the LSNS-6 for identifying persons deemed at risk for social isolation. RESULTS: Among all three samples, the LSNS-6 and two subscales (Family and Friends) demonstrated high levels of internal consistency, stable factor structures, and high correlations with criterion variables. The proposed clinical cut-points showed good convergent validity, and classified 20% of the respondents in Hamburg, 11% of those in Solothurn (Switzerland), and 15% of those in London as at risk for social isolation. IMPLICATIONS: We conclude that abbreviated scales such as the LSNS-6 should be considered for inclusion in practice protocols of gerontological practitioners. Screening older persons based on the LSNS-6 provides quantitative information on their family and friendship ties, and identifies persons at increased risk for social isolation who might benefit from in-depth assessment and targeted interventions.
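Scoring the abbreviated scale described above is a simple sum with a clinical threshold. The six-item, 0-5-per-item scoring is standard for the LSNS-6; the cut-point of 12 used below is the commonly cited threshold and is an assumption here, since the study derives its own cut-points from the lack-of-redundancy concept:

```python
def lsns6_score(item_responses):
    """Total LSNS-6 score: six items, each scored 0-5, summed to a 0-30
    range. Higher scores indicate larger family and friendship networks."""
    assert len(item_responses) == 6, "LSNS-6 has exactly six items"
    assert all(0 <= r <= 5 for r in item_responses), "items are scored 0-5"
    return sum(item_responses)

def at_risk_of_isolation(score, cut_point=12):
    """Commonly used clinical cut-point (assumed here): a total score
    below 12 flags a person as at risk for social isolation."""
    return score < cut_point

score = lsns6_score([2, 1, 3, 2, 1, 2])  # = 11, below the cut-point
```

In a practice protocol, a flagged score would trigger the in-depth assessment and targeted interventions the authors recommend.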