811 results for Path optimization
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding if the pebbling number is at most k is Π₂^P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than those in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
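To make the pebbling-move rule concrete, here is a minimal brute-force Python sketch. It is not the linear-optimization method of the Weight Function Lemma described in the abstract; it simply explores all reachable pebble distributions, so it is exponential in the worst case and only suitable for tiny graphs. The names `adj`, `supply`, and `can_reach` are illustrative.

```python
def can_reach(adj, supply, target):
    """Decide whether some sequence of pebbling moves places a pebble on
    `target`, by exhaustive search over reachable pebble distributions.
    `adj` is an adjacency list; `supply[v]` is the pebble count on vertex v.
    Exponential in the worst case -- only suitable for tiny graphs."""
    start = tuple(supply)
    seen = {start}
    stack = [start]
    while stack:
        dist = stack.pop()
        if dist[target] >= 1:
            return True
        for u, nbrs in enumerate(adj):
            if dist[u] >= 2:  # a move needs two pebbles on u
                for v in nbrs:
                    nxt = list(dist)
                    nxt[u] -= 2  # two pebbles leave u...
                    nxt[v] += 1  # ...one arrives at v; one is the toll
                    t = tuple(nxt)
                    if t not in seen:
                        seen.add(t)
                        stack.append(t)
    return False
```

On the path on three vertices, four pebbles at one end can reach the far end but three cannot, matching the pebbling number 4 of that path.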
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
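A generic illustration of the biased-randomization idea, assuming a nearest-neighbour TSP construction as the classical heuristic (the paper does not specify this problem; all names and the geometric bias are assumptions). At each step the candidates are ranked greedily and the next one is drawn with probability decaying geometrically with rank, so repeated runs yield many distinct good solutions:

```python
import math
import random

def randomized_greedy_tour(dist, beta=0.3, seed=None):
    """Nearest-neighbour tour construction with geometric biased
    randomization: candidates are sorted by distance from the current
    city, and the next city is drawn with probability proportional to
    beta * (1 - beta)**rank. beta=1 recovers the deterministic greedy
    heuristic; smaller beta spreads the choice over more candidates."""
    rng = random.Random(seed)
    tour, unvisited = [0], set(range(1, len(dist)))
    while unvisited:
        cand = sorted(unvisited, key=lambda j: dist[tour[-1]][j])
        if beta >= 1.0:
            k = 0  # purely greedy: always the nearest city
        else:
            # Geometric rank, folded into the candidate range by modulo.
            k = int(math.log(1.0 - rng.random()) / math.log(1.0 - beta)) % len(cand)
        nxt = cand[k]
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour
```

Running the constructor many times with different seeds produces the "large set of alternative good solutions" the abstract refers to, from which the best can be kept.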
Abstract:
The paper develops a stability theory for the optimal value and the optimal set mapping of optimization problems posed in a Banach space. The problems considered in this paper have an arbitrary number of inequality constraints involving lower semicontinuous (not necessarily convex) functions and one closed abstract constraint set. The considered perturbations lead to problems of the same type as the nominal one (with the same space of variables and the same number of constraints), where the abstract constraint set can also be perturbed. The spaces of functions involved in the problems (objective and constraints) are equipped with the metric of uniform convergence on bounded sets, while in the space of closed sets we coherently consider the Attouch-Wets topology. The paper examines, in a unified way, the lower and upper semicontinuity of the optimal value function, and the closedness, lower and upper semicontinuity (in the sense of Berge) of the optimal set mapping. This paper can be seen as a second part of the stability theory presented in [17], where we studied the stability of the feasible set mapping (completed here with the analysis of the Lipschitz-like property).
Abstract:
This paper presents the Juste-Neige system for predicting the snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow cover management in order to i) reduce the production cost of artificial snow and improve the profit margin of the companies managing the ski resorts; and ii) reduce water and energy consumption, and thus the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days. On these maps, the areas most exposed to snow erosion are highlighted. The software proceeds in three steps: i) interpolation of snow height measurements with a neural network; ii) local meteorological forecasts for every ski resort; iii) simulation of the impact caused by skiers using a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
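Step iii) of the pipeline can be caricatured in a few lines: each skier pass removes a small snow depth from the cells along its run. All names and numbers below are illustrative assumptions, not the calibrated agent model used by Juste-Neige:

```python
def simulate_skier_erosion(snow, runs, passes_per_run, erosion_per_pass=0.002):
    """Toy stand-in for the multi-agent erosion step: each skier pass
    removes `erosion_per_pass` metres of snow from every cell of its run.
    `snow` maps cell id -> snow height (m); `runs` is a list of cell
    sequences; `passes_per_run[i]` is the skier count on runs[i].
    The 0.002 m/pass figure is an arbitrary illustrative value."""
    snow = dict(snow)  # do not mutate the caller's map
    for run, n_skiers in zip(runs, passes_per_run):
        for cell in run:
            snow[cell] = max(0.0, snow[cell] - n_skiers * erosion_per_pass)
    return snow
```

Cells whose predicted height drops fastest under such a simulation are the ones a system like this would highlight as most exposed to erosion.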
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
RATIONALE AND OBJECTIVES: To determine optimum spatial resolution when imaging peripheral arteries with magnetic resonance angiography (MRA). MATERIALS AND METHODS: Eight vessel diameters ranging from 1.0 to 8.0 mm were simulated in a vascular phantom. A total of 40 three-dimensional FLASH MRA sequences were acquired with incremental variations of field of view, matrix size, and slice thickness. The accurately known eight diameters were combined pairwise to generate 22 "exact" degrees of stenosis ranging from 42% to 87%. Then, the diameters were measured in the MRA images by three independent observers and with quantitative angiography (QA) software and used to compute the degrees of stenosis corresponding to the 22 "exact" ones. The accuracy and reproducibility of vessel diameter measurements and stenosis calculations were assessed for vessel sizes ranging from 6 to 8 mm (iliac artery), 4 to 5 mm (femoro-popliteal arteries), and 1 to 3 mm (infrapopliteal arteries). The maximum pixel dimension and slice thickness needed to obtain a mean error in stenosis evaluation of less than 10% were determined by linear regression analysis. RESULTS: Mean errors on stenosis quantification were 8.8% ± 6.3% for 6- to 8-mm vessels, 15.5% ± 8.2% for 4- to 5-mm vessels, and 18.9% ± 7.5% for 1- to 3-mm vessels. Mean errors on stenosis calculation were 12.3% ± 8.2% for observers and 11.4% ± 15.1% for QA software (P = .0342). To evaluate stenosis with a mean error of less than 10%, the maximum pixel surface, the pixel size in the phase direction, and the slice thickness should be less than 1.56 mm², 1.34 mm, and 1.70 mm, respectively (voxel size 2.65 mm³) for 6- to 8-mm vessels; 1.31 mm², 1.10 mm, and 1.34 mm (voxel size 1.76 mm³) for 4- to 5-mm vessels; and 1.17 mm², 0.90 mm, and 0.9 mm (voxel size 1.05 mm³) for 1- to 3-mm vessels. CONCLUSION: Higher spatial resolution than currently used should be selected for imaging peripheral vessels.
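The pairwise stenosis construction follows the standard diameter-reduction formula, 100 × (1 − d_stenotic / d_reference); for instance, a 1.0 mm lumen within an 8.0 mm reference vessel gives 87.5%, roughly the upper end of the reported 42-87% range. A minimal sketch (function name is an assumption):

```python
def stenosis_percent(d_stenotic, d_reference):
    """Degree of stenosis, as percent diameter reduction, from two lumen
    diameters in the same units: 100 * (1 - d_stenotic / d_reference)."""
    if d_reference <= 0:
        raise ValueError("reference diameter must be positive")
    if d_stenotic > d_reference:
        raise ValueError("stenotic diameter must not exceed the reference")
    return 100.0 * (1.0 - d_stenotic / d_reference)
```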
Abstract:
Stably integrating and expressing a transgene in the cellular genome remain major challenges for gene-based therapies and for bioproduction purposes. While transposon vectors mediate efficient transgene integration, expression may be limited by epigenetic silencing, and persistent transposase expression may mediate multiple transposition cycles. Here, we evaluated delivery of the piggyBac transposase messenger RNA combined with genetically insulated transposons, to isolate the transgene from neighboring regulatory elements and stabilize its expression. A comparison of piggyBac transposase expression from messenger RNA and DNA vectors was carried out in terms of expression levels, transposition efficiency, transgene expression and genotoxic effects, in order to calibrate and secure the transposition-based delivery system. Messenger RNA restricted the persistence of the transposase to a narrow time window, thus decreasing side effects such as superfluous genomic DNA cleavage. Both the CTF/NF1 and the D4Z4 insulators were found to mediate more efficient expression from a few transposition events. We conclude that the use of engineered piggyBac transposase mRNA and insulated transposons offers promising ways of improving the quality of the integration process and sustaining the expression of transposon vectors.
Abstract:
The Government is committed to ending the unfair, unequal and inefficient two-tier health system and to introducing a single-tier system, supported by universal health insurance. The Government will achieve a single-tier system via a multi-payer model of universal health insurance (UHI), in line with the Programme for Government (PfG), involving competing private health insurers and a State-owned VHI. UHI will be gradually rolled out over several years, with full implementation by 2019 at the latest.
Abstract:
Tractography is a class of algorithms that aim to map in vivo the major neuronal pathways in the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool to noninvasively investigate, at the macroscopic scale, the architecture of the neuronal connections of the brain. Unfortunately, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is a quantitative modality by nature. Indeed, several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, by using multicompartment models. In this paper, we present a novel framework to re-establish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. Then, we seek the global weight of each tract, i.e., its effective contribution or volume, such that together they best fit the measured signal. We demonstrate that these weights can be recovered efficiently by solving a global convex optimization problem. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. Results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
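The global fit has the shape of a non-negative least-squares problem: column j of a matrix A holds the signal contribution pattern of candidate tract j, y is the measured signal, and the weights w ≥ 0 minimize ||Aw − y||². Below is a toy projected-gradient sketch in NumPy; the matrix, sizes, and solver are illustrative assumptions, not the paper's actual forward model or algorithm:

```python
import numpy as np

def fit_tract_weights(A, y, iters=5000):
    """Non-negative least squares by projected gradient descent:
    minimize ||A w - y||^2 subject to w >= 0. A toy stand-in for the
    global convex program described in the abstract."""
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    # Step size below 1/L, where L = sigma_max(A)^2 is the Lipschitz
    # constant of the gradient A^T (A w - y).
    lr = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        w = np.maximum(0.0, w - lr * (A.T @ (A @ w - y)))  # project onto w >= 0
    return w
```

Because the objective is convex and the feasible set is a cone, any such first-order scheme converges to a global optimum, which is what makes the recovered tract "volumes" well defined.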
Abstract:
Undernutrition is a widespread problem in intensive care units and is associated with worse clinical outcomes. A state of negative energy balance increases stress catabolism and is associated with increased morbidity and mortality in ICU patients. Undernutrition-related morbidity is correlated with increases in the length of hospital stay and health care costs. Enteral nutrition is the recommended feeding route in critically ill patients, but it is often insufficient to cover nutritional needs. The initiation of supplemental parenteral nutrition, when enteral nutrition is insufficient, could optimize nutritional therapy by preventing the onset of early energy deficiency, and thus could reduce morbidity, length of stay and costs, shorten the recovery period and, finally, improve quality of life. (C) 2009 Elsevier Masson SAS. All rights reserved.
Abstract:
The preclinical stage of Alzheimer's disease (AD), amnestic mild cognitive impairment (MCI), is manifested by phenotypes classified into exclusively memory (single-domain) MCI (sMCI) and multiple-domain MCI (mMCI). We suggest that typical MCI-to-AD progression occurs through the sMCI-to-mMCI sequence as a result of the extension of initial pathological processes. To support this hypothesis, we assessed myelin content with the Magnetization Transfer Ratio (MTR) in 21 sMCI and 21 mMCI patients and in 42 age-, sex-, and education-matched controls. A conjunction analysis revealed MTR reduction shared by the sMCI and mMCI groups in the medial temporal lobe and posterior structures, including white matter (WM: splenium, posterior corona radiata) and gray matter (GM: hippocampus; parahippocampal and lingual gyri). A disjunction analysis showed the spread of demyelination to prefrontal WM and insula GM in executive mMCI. Our findings suggest that demyelination starts in the structures affected by neurofibrillary pathology; its presence correlates with the clinical picture and indicates the mode of MCI-to-AD progression. In vivo staging of preclinical AD can thus be developed in terms of WM/GM demyelination.
Abstract:
An impressive array of cellular and molecular adaptive responses achieves homeostasis. The inflammatory reaction is an adaptive response triggered by an insult and culminating in the overt cardinal signs of inflammation, eventually leading to resolution and returning the organism to its original centered state. This article focuses on aspects of the lipoxin A4 signaling pathway during the resolution phase, to better understand the molecular mechanisms by which a neutrophil directs an inflammatory reaction to switch off and homeostasis to resume. Defining the resolution state of a neutrophil at the molecular level will aid in the treatment of diseases associated with exaggerated and uncontrolled inflammation.