929 results for Solution of mathematical problems


Relevance: 100.00%

Abstract:

Health safety during trips is based on prior counseling, vaccination and prevention of infections, pre-existing diseases or specific problems related to the destination. Our aim was to assess two aspects: the incidence of health problems related to travel and the traveler's awareness of health safety. To this end we phone-interviewed faculty members of a large public university, randomly selected from humanities, engineering and health schools. Out of 520 attempts, we were able to contact 67 (12.9%), and 46 (68.6%) agreed to participate in the study. There was a large male proportion (37/44, 84.1%), mostly mature adults in their forties and fifties (32/44, 72.7%), all of them with higher education, as would be expected of faculty members. Most described themselves as sedentary or as taking occasional exercise, with only 15.9% (7/44) taking regular exercise. Pre-existing diseases were reported by 15 travelers. Most trips lasted one week or less. The duration of travel was related to the destination: longer trips (12 h or more) were taken by 68.2% (30/44) of travelers, while the others took shorter (3 h) domestic trips. Most travel was by air (41/44) and only 31.8% (14/44) of the trips were motivated by leisure. Field research trips were not reported. Specific health counseling prior to travel was reported by only two travelers (4.5%). Twenty-seven (61.4%) reported updated immunization, but 11/30 reported unchecked immunizations, and 30% (9/30) reported traveling without any health insurance coverage. In the group as a whole, 6 (13.6%) travelers reported at least one health problem attributed to the trip, all of them males travelling abroad. Five presented respiratory infections, such as influenza and the common cold; there was also one neurological, one orthopedic, one social and one hypertension-related problem. There were no gender differences regarding age group, destination, type of transport, previous health counseling, leisure travel motivation or pre-existing diseases. Interestingly, the two cases of previous health counseling involved domestic travelers. Our data clearly show that, despite a significant number of travel-related health problems, these highly educated faculty members had a low awareness of those risks, and a significant number of trips were made without prior counseling or health insurance. A counseling program conducted by a tourism and health professional must be implemented for faculty members in order to increase awareness of travel-related health problems.

Relevance: 100.00%

Abstract:

This study evaluated the Knoop hardness, surface roughness and color alteration of artificial teeth for dentures after immersion in water, 1% sodium hypochlorite and an experimental 2% Ricinus communis (RC) solution. Thirty specimens of Vipi, Biolux and Trilux teeth were analyzed. Knoop hardness, surface roughness and color alteration were measured immediately after specimen preparation (T0) and after two immersion protocols, of 15 days (ΔT15) and 183 days (ΔT183). The data variations (ΔT) were subjected to ANOVA and Tukey's test (p<0.05). At ΔT15, Vipi presented an increase in hardness and Biolux presented the highest variation (p=0.01). RC caused the highest increase in hardness (p=0) and the lowest increase in roughness (p=0.005). Biolux presented the lowest color alteration (p=0). At ΔT183, Trilux underwent the highest hardness variation (p=0) and Biolux presented an increase in roughness (p=0). There were no significant differences in color alteration among the artificial teeth (p=0.06) or among the solutions (p=0.08) after 183 days of immersion. All solutions (distilled water, 1% sodium hypochlorite and 2% RC) and both immersion protocols caused alterations in the analyzed properties.
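The analysis above relies on one-way ANOVA followed by Tukey's test at p<0.05; a minimal sketch of that workflow with SciPy and statsmodels is shown below, run on invented hardness-variation values (the group names and numbers are placeholders, not the study's data).

```python
# Hypothetical example of the ANOVA + Tukey workflow described in the abstract.
# The numbers below are invented placeholders, not data from the study.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hardness variation (ΔT) for three hypothetical immersion solutions
water = np.array([1.2, 0.9, 1.4, 1.1, 1.0])
hypochlorite = np.array([2.1, 2.4, 1.9, 2.2, 2.0])
ricinus = np.array([3.0, 2.8, 3.3, 3.1, 2.9])

# One-way ANOVA: is there any difference among the group means?
f_stat, p_value = stats.f_oneway(water, hypochlorite, ricinus)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey's HSD: which pairs of groups differ (alpha = 0.05)?
values = np.concatenate([water, hypochlorite, ricinus])
groups = (["water"] * len(water) + ["hypochlorite"] * len(hypochlorite)
          + ["ricinus"] * len(ricinus))
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```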

Relevance: 100.00%

Abstract:

The aim of this study was to assess, using the DPPH assay, the antioxidant activity of several substances that could be proposed to immediately reverse the problems caused by bleaching procedures. The percentage of antioxidant activity (AA%) of a 10% ascorbic acid solution (AAcidS), 10% ascorbic acid gel (AAcidG), 10% sodium ascorbate solution (SodAsS), 10% sodium ascorbate gel (SodAsG), 10% sodium bicarbonate (Bicarb), Neutralize® (NE), Desensibilize® (DES), catalase C-40 at 10 mg/mL (CAT), a 10% alcohol solution of alpha-tocopherol (VitE), Listerine® (LIS), 0.12% chlorhexidine (CHX), Croton lechleri (CL), a 10% aqueous solution of Uncaria tomentosa (UT), artificial saliva (ArtS) and 0.05% sodium fluoride (NaF) was assessed in triplicate by the 2,2-diphenyl-1-picryl-hydrazyl-hydrate (DPPH) free radical assay. All substances exhibited antioxidant activity, except for CL. AAcidS, AAcidG and VitE exhibited the highest AA% (p<0.05). In contrast, CHX, NE, LIS and NaF showed the lowest AA% (p<0.05). In conclusion, AAcidS, AAcidG, SodAsS, SodAsG and VitE presented the highest antioxidant activity among the substances tested in this study. The DPPH assay provides an easy and rapid way to evaluate potential antioxidants.
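The abstract does not spell out how AA% is obtained from the DPPH readings; in the standard DPPH protocol it is commonly computed from the absorbance of the control and of the sample, as in the sketch below (the function name and numeric values are illustrative, not taken from the study).

```python
def dpph_antioxidant_activity(abs_control: float, abs_sample: float) -> float:
    """Percentage of antioxidant activity (AA%) from DPPH absorbance readings.

    Standard DPPH radical-scavenging formula; not necessarily the exact
    expression used in the study.
    """
    return (abs_control - abs_sample) / abs_control * 100.0


# Illustrative values only: control DPPH absorbance vs. absorbance after
# adding a test substance.
print(f"AA% = {dpph_antioxidant_activity(0.85, 0.12):.1f}")
```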

Relevance: 100.00%

Abstract:

Although hydrophobicity is usually a difficult parameter to determine in the field, it has been pointed out as a good option for monitoring the aging of polymeric outdoor insulators. For this purpose, digital image processing of photographs of wet insulators is currently the main technique. However, important challenges still remain to be overcome: images taken under non-controlled illumination conditions can interfere with the analysis, and there are no standard surfaces with different levels of hydrophobicity. In this paper, the photographic samples were digitally filtered to reduce the influence of illumination, and hydrophobic surface samples were prepared by wetting silicone surfaces with a water-alcohol solution. Furthermore, no previous studies were found that try to quantify and relate these properties in a mathematical function that could be used in the field by the electrical companies. Based on these considerations, high-quality images of numerous hydrophobic surfaces were obtained, and three different image processing methodologies (the fractal dimension and two Haralick texture descriptors, entropy and homogeneity), associated with several digital filters, were compared. The entropy Haralick descriptor combined with the White Top-Hat filter presented the best results for classifying hydrophobicity.
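The classification pipeline described above combines a White Top-Hat filter with Haralick texture descriptors (entropy and homogeneity) computed from a gray-level co-occurrence matrix. A minimal sketch of such a pipeline with scikit-image is given below; the file name, structuring-element size and GLCM distances/angles are illustrative assumptions, not parameters reported in the paper.

```python
# Minimal sketch of the texture pipeline described in the abstract:
# white top-hat filtering followed by GLCM-based Haralick descriptors.
# Requires scikit-image >= 0.19 (graycomatrix/graycoprops naming).
import numpy as np
from skimage import io, img_as_ubyte
from skimage.morphology import white_tophat, disk
from skimage.feature import graycomatrix, graycoprops

# "droplets.png" is a placeholder for a photo of a wetted insulator surface.
image = img_as_ubyte(io.imread("droplets.png", as_gray=True))

# White top-hat keeps small bright structures (e.g. droplet highlights)
# and suppresses slowly varying illumination.
filtered = white_tophat(image, footprint=disk(15))

# Gray-level co-occurrence matrix; distance/angle choices are illustrative.
glcm = graycomatrix(filtered, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)

homogeneity = graycoprops(glcm, "homogeneity")[0, 0]

# Haralick entropy is not provided by graycoprops, so compute it directly.
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

print(f"homogeneity = {homogeneity:.3f}, entropy = {entropy:.3f}")
```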

Relevance: 100.00%

Abstract:

This work assessed the in vitro bioremediation, in aqueous solution, of the herbicide Velpar K®, which is used against weeds in sugar cane crops in São Paulo state and contains hexazinone and diuron. The microbial inoculant known as Effective Microorganisms (EM-4), a pool of soil microorganisms containing lactic and photosynthetic bacteria, fungi, yeasts and actinomycetes, was used for the bioremediation. Cultivation on agar-agar inoculated with EM-4 showed microorganism growth at Velpar K® concentrations between 0.2% and 1.0% in the gel. High performance liquid chromatography (HPLC) analysis showed that EM-4 was effective for the bioremediation of the herbicide, with degradation reaching 80% for diuron and 70% for hexazinone after 21 days in a 2:1 Velpar K®/EM-4 solution. These results could be useful for planning the bioremediation of areas contaminated with Velpar K®.

Relevance: 100.00%

Abstract:

We prove a uniqueness result related to the Germain–Lagrange dynamic plate differential equation. We consider the problem
$$\begin{cases} \dfrac{\partial^2 u}{\partial t^2} + \Delta^2 u = g \otimes f, & \text{in } ]0,+\infty) \times \mathbb{R}^2, \\ u(0) = 0, \quad \dfrac{\partial u}{\partial t}(0) = 0, \end{cases}$$
where $u$ stands for the transverse displacement, $f$ is a distribution compactly supported in space, and $g \in L^1_{\mathrm{loc}}([0,+\infty))$ is a function of time such that $g(0) \neq 0$ and there is a $T_0 > 0$ such that $g \in C^1[0,T_0[$. We prove that the knowledge of $u$ over an arbitrary open set of the plate for any interval of time $]0,T[$, 0

Relevance: 100.00%

Abstract:

We consider a general class of mathematical models for stochastic gene expression in which the transcription rate is allowed to depend on a promoter state variable that can take an arbitrary (finite) number of values. We provide the solution of the master equations in the stationary limit, based on a factorization of the stochastic transition matrix that separates timescales and relative interaction strengths, and we express its entries in terms of parameters that have a natural physical and/or biological interpretation. The solution illustrates the capacity of multiple-state promoters to generate multimodal distributions of gene products, without the need for feedback. Furthermore, using the example of a three-state promoter operating at low, high, and intermediate expression levels, we show that using multiple-state operons will typically lead to a significant reduction of noise in the system. The underlying mechanism is that a three-state promoter can change its level of expression from low to high by passing through an intermediate state, with a much smaller increase of fluctuations than by means of a direct transition.
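The paper derives the stationary master-equation solution analytically; as a purely numerical illustration of the same object, the sketch below assembles the chemical master equation for a hypothetical three-state promoter with state-dependent transcription and first-order mRNA degradation, truncates the mRNA copy number, and extracts the stationary distribution (all rates are invented for the example).

```python
# Numerical stationary solution of the master equation for a hypothetical
# three-state promoter (states 0, 1, 2) with state-dependent transcription.
# Rates are illustrative, not taken from the paper.
import numpy as np
from scipy.linalg import null_space

N = 120                                   # mRNA copy-number truncation
beta = np.array([0.5, 8.0, 40.0])         # transcription rate per promoter state
gamma = 1.0                               # mRNA degradation rate
# Promoter switching rates K[i, j]: rate of going from state i to state j.
K = np.array([[0.0, 0.05, 0.0],
              [0.05, 0.0, 0.05],
              [0.0, 0.05, 0.0]])

S = len(beta)
dim = S * (N + 1)

def idx(s, n):
    """Flatten a (promoter state, mRNA count) pair into a single index."""
    return s * (N + 1) + n

Q = np.zeros((dim, dim))                  # generator matrix, Q[i, j] = rate i -> j
for s in range(S):
    for n in range(N + 1):
        i = idx(s, n)
        if n < N:                         # transcription: n -> n + 1
            Q[i, idx(s, n + 1)] += beta[s]
        if n > 0:                         # degradation: n -> n - 1
            Q[i, idx(s, n - 1)] += gamma * n
        for s2 in range(S):               # promoter switching: s -> s2
            if s2 != s:
                Q[i, idx(s2, n)] += K[s, s2]
np.fill_diagonal(Q, -Q.sum(axis=1))       # each row of a generator sums to zero

p = null_space(Q.T)[:, 0]                 # stationary distribution: p Q = 0
p = np.abs(p) / np.abs(p).sum()
marginal = p.reshape(S, N + 1).sum(axis=0)
print("mean mRNA copy number:", (np.arange(N + 1) * marginal).sum())
```

With slow promoter switching and well-separated transcription rates, the mRNA marginal produced by this construction is visibly multimodal, which is the qualitative behaviour the abstract attributes to multiple-state promoters.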

Relevance: 100.00%

Abstract:

Understanding the complex relationships between the quantities measured by volcanic monitoring networks and shallow magma processes is a crucial step towards the comprehension of volcanic processes and a more realistic evaluation of the associated hazard. This question is very relevant at Campi Flegrei, a quiescent volcanic caldera immediately north-west of Napoli (Italy). The system's activity shows high fumarole release and periodic slow ground movement (bradyseism) accompanied by high seismicity. This activity, together with the high population density and the presence of military and industrial buildings, makes Campi Flegrei one of the areas with the highest volcanic hazard in the world. In this context my thesis has focused on magma dynamics due to the refilling of shallow magma chambers, and on the geophysical signals, detectable by seismic, deformation and gravimetric monitoring networks, that are associated with these phenomenologies. Indeed, the refilling of magma chambers is a process frequently occurring just before a volcanic eruption; therefore, the ability to identify these dynamics by means of recorded signal analysis is important for evaluating the short-term volcanic hazard. The space-time evolution of the dynamics due to the injection of new magma into the magma chamber has been studied by performing numerical simulations with, and implementing additional features in, the code GALES (Longo et al., 2006), recently developed and still being upgraded at the Istituto Nazionale di Geofisica e Vulcanologia in Pisa (Italy). GALES is a finite element code based on a physico-mathematical two-dimensional, transient model able to treat fluids as multiphase homogeneous mixtures, from compressible to incompressible. The fundamental equations of mass, momentum and energy balance are discretized both in time and space using the Galerkin Least-Squares and discontinuity-capturing stabilization technique. The physical properties of the mixture are computed as a function of the local conditions of magma composition, pressure and temperature. The model features enable the study of a broad range of phenomenologies characterizing pre- and syn-eruptive magma dynamics in a wide domain, from the volcanic crater to the deep magma feeding zones. The study of the displacement field associated with the simulated fluid dynamics has been carried out with a numerical code developed by the geophysical group at University College Dublin (O'Brien and Bean, 2004b), with whom we started a very profitable collaboration. In this code, seismic wave propagation in heterogeneous media with a free surface (e.g. the Earth's surface) is simulated using a discrete elastic lattice in which particle interactions are controlled by Hooke's law. This method allows medium heterogeneities and complex topography to be taken into account. The initial and boundary conditions for the simulations have been defined within a coordinated project (INGV-DPC 2004-06 V3_2 "Research on active volcanoes, precursors, scenarios, hazard and risk - Campi Flegrei"), to which this thesis contributes, and in which many researchers experienced with Campi Flegrei in the volcanological, seismic, petrological, geochemical and other fields collaborate. Numerical simulations of magma and rock dynamics have been coupled as described in the thesis. The first part of the thesis consists of a parametric study aimed at understanding the effect of the presence of carbon dioxide in magma on the convection dynamics.
Indeed, the presence of this volatile was relevant in many Campi Flegrei eruptions, including some eruptions commonly considered as reference scenarios for a future activity of this volcano. A set of simulations has been performed considering an elliptical, compositionally uniform magma chamber refilled from below by a magma with a volatile content equal to or different from that of the resident magma. To do this, a multicomponent non-ideal magma saturation model (Papale et al., 2006), which considers the simultaneous presence of CO2 and H2O, has been implemented in GALES. Results show that the presence of CO2 in the incoming magma increases its buoyancy force, promoting convection and mixing. The simulated dynamics produce pressure transients with frequency and amplitude within the sensitivity range of modern geophysical monitoring networks such as the one installed at Campi Flegrei. In the second part, simulations more closely related to the Campi Flegrei volcanic system have been performed. The simulated system has been defined on the basis of conditions consistent with the bulk of knowledge of Campi Flegrei, and in particular of the Agnano-Monte Spina eruption (4100 B.P.), commonly considered as a reference for a future high-intensity eruption in this area. The magmatic system has been modelled as a long dyke refilling a small shallow magma chamber; magmas with trachytic and phonolitic compositions and variable volatile contents of H2O and CO2 have been considered. The simulations have been carried out changing the conditions of magma injection, the system configuration (magma chamber geometry, dyke size) and the composition and volatile content of the resident and refilling magmas, in order to study the influence of these factors on the simulated dynamics. Simulation results allow each step of the ascent of the gas-rich magma in the denser magma to be followed, highlighting the details of magma convection and mixing. In particular, the presence of more CO2 in the deep magma results in more efficient and faster dynamics. Through these simulations the variation of the gravimetric field has been determined. Afterwards, the space-time distribution of stress resulting from the numerical simulations has been used as boundary conditions for the simulations of the displacement field imposed by the magmatic dynamics on the rocks. The properties of the simulated domain (rock density, P and S wave velocities) have been based on data from the literature on active and passive tomographic experiments, obtained through a collaboration with A. Zollo at the Dept. of Physics of the Federico II University in Napoli. The elasto-dynamic simulations allow the variations of the space-time distribution of deformation and the seismic signal associated with the studied magmatic dynamics to be determined. In particular, results show that these dynamics induce deformations similar to those measured at Campi Flegrei and seismic signals with energies concentrated in the frequency bands typically observed in volcanic areas. The present work shows that an approach based on the solution of the equations describing the physics of processes within a magmatic fluid and the surrounding rock system is able to recognise and describe the relationships between geophysical signals detectable at the surface and deep magma dynamics. Therefore, the results suggest that the combined study of geophysical data and information from numerical simulations can allow, in the near future, a more efficient evaluation of the short-term volcanic hazard.
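The displacement modelling described above relies on a discrete elastic lattice whose particle interactions follow Hooke's law; as a heavily simplified illustration of that idea (not the O'Brien and Bean code), the sketch below propagates a pulse along a 1-D chain of masses connected by springs whose stiffness changes in space, mimicking a heterogeneous medium; all numerical values are arbitrary.

```python
# Toy 1-D elastic lattice: masses coupled by Hookean springs, explicit
# time stepping. A heavily simplified analogue of the discrete elastic
# lattice approach mentioned in the abstract; parameters are arbitrary.
import numpy as np

n = 400                       # number of lattice particles
dt = 0.01                     # time step
mass = 1.0
k = np.full(n - 1, 50.0)      # spring stiffness between neighbours
k[n // 2:] = 150.0            # stiffer half-space -> heterogeneous medium

u = np.zeros(n)               # displacements
v = np.zeros(n)               # velocities
u[:40] = np.exp(-((np.arange(40) - 20.0) ** 2) / 30.0)   # initial pulse

for step in range(2000):
    # Hooke's law: force on each particle from the springs to its neighbours.
    f = np.zeros(n)
    spring = k * (u[1:] - u[:-1])     # tension in each spring
    f[:-1] += spring                  # pulls the left particle forward
    f[1:] -= spring                   # pulls the right particle backward
    # Ends are left free (zero traction), loosely mimicking a free surface.
    v += dt * f / mass
    u += dt * v

print("max displacement after propagation:", u.max())
```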

Relevance: 100.00%

Abstract:

Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an increased dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, calling for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handle this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal processing algorithms are exploited to increase the grid resolution in the domain regions where relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further increase in dimensionality due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD). The impact of such phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account the matching performance of single devices as well as of basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. By combining statistical simulations with experimental data, the potentialities and shortcomings of the FinFET architecture are analyzed and useful design guidelines are provided, which boost the feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
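The WAM described above drives refinement with wavelet analysis of the solution; as a generic illustration of the principle (not the thesis algorithm), the sketch below computes Haar detail coefficients of a 1-D solution profile and flags the cells whose detail magnitude exceeds a threshold as refinement candidates; the profile, grid and threshold are invented.

```python
# Generic illustration of wavelet-driven refinement flagging: regions where
# the Haar detail coefficients of the sampled solution are large get refined.
# Not the thesis' WAM; the profile and threshold are arbitrary.
import numpy as np

x = np.linspace(0.0, 1.0, 256)
# Hypothetical device quantity with a sharp internal transition (e.g. a junction).
solution = np.tanh((x - 0.6) / 0.01)

# One level of the Haar transform: each pair of samples yields one detail coefficient.
pairs = solution.reshape(-1, 2)
details = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)

# Flag coarse cells whose detail magnitude exceeds a relative threshold.
threshold = 0.05 * np.abs(details).max()
refine = np.abs(details) > threshold

print(f"{refine.sum()} of {refine.size} coarse cells flagged for refinement")
print("flagged cell centres:", x.reshape(-1, 2).mean(axis=1)[refine])
```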

Relevance: 100.00%

Abstract:

Water distribution network optimization is a challenging problem due to the dimension and complexity of these systems. Since the second half of the twentieth century this field has been investigated by many authors. Recently, to overcome the discrete nature of the variables and the non-linearity of the equations, research has focused on the development of heuristic algorithms. These algorithms do not require continuity and linearity of the problem functions because they are linked to an external hydraulic simulator that solves the equations of mass continuity and energy conservation of the network. In this work, NSGA-II (Non-dominated Sorting Genetic Algorithm II) has been used. This is a heuristic multi-objective genetic algorithm based on the analogy of evolution in nature. Starting from an initial random set of solutions, called a population, it evolves them towards a front of solutions that minimize, separately and simultaneously, all the objectives. This can be very useful in practical problems where multiple and conflicting goals are common. Usually, one of the main drawbacks of these algorithms is the computation time: being a stochastic search, many solutions must be analyzed before good ones are found. The results of this thesis on the classical optimal design problem show that it is possible to improve the results by modifying the mathematical definition of the objective functions and the survival criterion, inserting good solutions created by a cellular automaton and using rules created by a classifier algorithm (C4.5). This part has been tested using the version of NSGA-II supplied by the Centre for Water Systems (University of Exeter, UK) in the MATLAB® environment. Even if orienting the search can constrain the algorithm, with the risk of not finding the optimal set of solutions, it can greatly improve the results. Subsequently, thanks to CINECA's help, a version of NSGA-II has been implemented in the C language and parallelized: the results on global parallelization show the speed-up, while the results on island parallelization show that communication among islands can improve the optimization. Finally, some tests on the optimization of pump scheduling have been carried out. In this case, good results are found for a small network, while the solutions of a large problem are affected by the lack of constraints on the number of pump switches. Possible future research concerns the insertion of further constraints and the guidance of the evolution. In the end, the optimization of water distribution systems is still far from a definitive solution, but improvements in this field can be very useful in reducing the cost of solutions to practical problems, where the high number of variables makes their management very difficult from a human point of view.
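The ranking of a population into non-dominated fronts is the core of the NSGA-II algorithm used above; the sketch below shows a plain (unoptimized) non-dominated sorting of a random bi-objective population with both objectives minimized; the population values are random and purely illustrative.

```python
# Plain non-dominated sorting for a minimization problem, the ranking step
# at the heart of NSGA-II. Illustrative only; a full NSGA-II also uses
# crowding distance, selection, crossover and mutation.
import numpy as np

rng = np.random.default_rng(0)
objectives = rng.random((20, 2))          # 20 solutions, 2 objectives (e.g. cost, deficit)

def dominates(a, b):
    """True if solution a is at least as good as b on every objective
    and strictly better on at least one (minimization)."""
    return np.all(a <= b) and np.any(a < b)

remaining = set(range(len(objectives)))
fronts = []
while remaining:
    front = [i for i in remaining
             if not any(dominates(objectives[j], objectives[i])
                        for j in remaining if j != i)]
    fronts.append(front)
    remaining -= set(front)

for rank, front in enumerate(fronts, start=1):
    print(f"front {rank}: solutions {sorted(front)}")
```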

Relevance: 100.00%

Abstract:

This thesis gathers the work carried out by the author in the last three years of research and concerns the study and implementation of algorithms to coordinate and control a swarm of mobile robots moving in unknown environments. In particular, the author's attention is focused on two different approaches to solving two different problems. The first algorithm considered in this work deals with the possibility of decomposing a main complex task into many simple subtasks by exploiting the decentralized implementation of the so-called Null Space Behavioral paradigm. This approach to the problem of merging different subtasks with assigned priorities is slightly modified in order to handle critical situations that can arise when the robots are moving through an unknown environment. In fact, issues can occur when one or more robots get stuck in local minima: a smart strategy to avoid deadlock situations is provided by the author, and the algorithm is validated by simulation analysis. The second problem deals with the use of concepts borrowed from graph theory to control a group of differential-wheel robots by exploiting the Laplacian solution of the consensus problem. Constraints on the swarm communication topology have been introduced through the use of a range and bearing platform developed at the Distributed Intelligent Systems and Algorithms Laboratory (DISAL), EPFL (Lausanne, CH), where part of the author's work was carried out. The control algorithm is validated by demonstration and simulation analysis and is later performed by a team of four robots engaged in a formation mission. To conclude, the capabilities of the algorithm based on the local solution of the consensus problem for differential-wheel robots are demonstrated in an application scenario where nine robots are engaged in a hunting task.
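The Laplacian-based consensus mentioned above is commonly written as a discrete-time update driven by the graph Laplacian of the communication topology; the sketch below runs that update for a hypothetical ring of five agents agreeing on a scalar value (the topology, gain and initial states are illustrative assumptions, not the thesis' experimental setup).

```python
# Discrete-time Laplacian consensus for a hypothetical 5-agent ring topology:
# x(k+1) = x(k) - eps * L x(k). Values and topology are illustrative only.
import numpy as np

n = 5
A = np.zeros((n, n))                 # adjacency of a ring: each agent has two neighbours
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0

D = np.diag(A.sum(axis=1))           # degree matrix
L = D - A                            # graph Laplacian

eps = 0.2                            # step size; convergence needs 0 < eps < 1/max_degree (here 0.5)
x0 = np.array([0.0, 3.0, -1.5, 6.0, 2.0])   # initial scalar states (e.g. headings)
x = x0.copy()

for k in range(60):
    x = x - eps * (L @ x)            # each agent moves towards its neighbours' average

print("final states :", np.round(x, 3))     # all entries approach the initial average
print("initial mean :", x0.mean())
```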

Relevance: 100.00%

Abstract:

The research activity carried out during the PhD course was focused on the development of mathematical models of some cognitive processes and their validation by means of data present in the literature, with a double aim: i) to achieve a better interpretation and explanation of the great amount of data obtained on these processes with different methodologies (electrophysiological recordings in animals; neuropsychological, psychophysical and neuroimaging studies in humans); ii) to exploit model predictions and results to guide future research and experiments. In particular, the research activity has been focused on two different projects: 1) the first concerns the development of networks of neural oscillators, in order to investigate the mechanisms of synchronization of the neural oscillatory activity during cognitive processes such as object recognition, memory, language and attention; 2) the second concerns the mathematical modelling of multisensory integration processes (e.g. visual-acoustic), which occur in several cortical and subcortical regions (in particular in a subcortical structure named the Superior Colliculus (SC)) and which are fundamental for orienting motor and attentive responses to external world stimuli. This activity has been realized in collaboration with the Center for Studies and Researches in Cognitive Neuroscience of the University of Bologna (in Cesena) and the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA).

PART 1. The representation of objects in a number of cognitive functions, like perception and recognition, involves distributed processes in different cortical areas. One of the main neurophysiological questions concerns how the correlation between these disparate areas is realized, in order to succeed in grouping together the characteristics of the same object (binding problem) and in keeping segregated the properties belonging to different objects simultaneously present (segmentation problem). Different theories have been proposed to address these questions (Barlow, 1972). One of the most influential is the so-called "assembly coding" theory, postulated by Singer (2003), according to which 1) an object is well described by a few fundamental properties, processed in different and distributed cortical areas; 2) the recognition of the object is realized by means of the simultaneous activation of the cortical areas representing its different features; 3) groups of properties belonging to different objects are kept separated in the time domain. In Chapter 1.1 and in Chapter 1.2 we present two neural network models for object recognition based on the "assembly coding" hypothesis. These models are networks of Wilson-Cowan oscillators which exploit: i) two high-level "Gestalt rules" (the similarity and previous-knowledge rules) to realize the functional link between elements of different cortical areas representing properties of the same object (binding problem); ii) the synchronization of the neural oscillatory activity in the γ-band (30-100 Hz) to segregate in time the representations of different objects simultaneously present (segmentation problem). These models are able to recognize and reconstruct multiple simultaneous external objects, even in difficult cases (some wrong or missing features, shared features, superimposed noise). In Chapter 1.3 the previous models are extended to realize a semantic memory, in which sensory-motor representations of objects are linked with words. To this aim, the previously developed network devoted to the representation of objects as a collection of sensory-motor features is reciprocally linked with a second network devoted to the representation of words (lexical network). Synapses linking the two networks are trained via a time-dependent Hebbian rule during a training period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from linguistic inputs), can correctly associate objects with words, and can segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).

PART 2. The ability of the brain to integrate information from different sensory channels is fundamental to perception of the external world (Stein et al., 1993). It is well documented that a number of extraprimary areas have neurons capable of such a task; one of the best known of these is the superior colliculus (SC). This midbrain structure receives auditory, visual and somatosensory inputs from different subcortical and cortical areas, and is involved in the control of orientation to external events (Wallace et al., 1993). SC neurons respond to each of these sensory inputs separately, but are also capable of integrating them (Stein et al., 1993) so that the response to the combined multisensory stimuli is greater than that to the individual component stimuli (enhancement). This enhancement is proportionately greater if the modality-specific paired stimuli are weaker (the principle of inverse effectiveness). Several studies have shown that the capability of SC neurons to engage in multisensory integration requires inputs from cortex, primarily the anterior ectosylvian sulcus (AES) but also the rostral lateral suprasylvian sulcus (rLS). If these cortical inputs are deactivated, the response of SC neurons to cross-modal stimulation is no different from that evoked by the most effective of its individual component stimuli (Jiang et al., 2001). This phenomenon can be better understood through mathematical models. The use of mathematical models and neural networks can place the mass of data that has been accumulated about this phenomenon and its underlying circuitry into a coherent theoretical structure. In Chapter 2.1 a simple neural network model of this structure is presented; this model is able to reproduce a large number of SC behaviours such as multisensory enhancement, multisensory and unisensory depression, and inverse effectiveness. In Chapter 2.2 this model is improved by incorporating more neurophysiological knowledge about the neural circuitry underlying SC multisensory integration, in order to suggest possible physiological mechanisms through which it is effected. This endeavour was realized in collaboration with Professor B.E. Stein and Doctor B. Rowland during the 6-month period spent at the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA), within the Marco Polo Project. The model includes four distinct unisensory areas that are devoted to a topological representation of external stimuli. Two of them represent subregions of the AES (i.e., FAES, an auditory area, and AEV, a visual area) and send descending inputs to the ipsilateral SC; the other two represent subcortical areas (one auditory and one visual) projecting ascending inputs to the same SC. Different competitive mechanisms, realized by means of populations of interneurons, are used in the model to reproduce the different behaviour of SC neurons under conditions of cortical activation and deactivation. The model, with a single set of parameters, is able to mimic the behaviour of SC multisensory neurons in response to very different stimulus conditions (multisensory enhancement, inverse effectiveness, within- and cross-modal suppression of spatially disparate stimuli), with cortex functional or deactivated, and with a particular type of membrane receptor (NMDA receptors) active or inhibited. All these results agree with the data reported in Jiang et al. (2001) and in Binns and Salt (1996). The model suggests that non-linearities in neural responses and in synaptic (excitatory and inhibitory) connections can explain the fundamental aspects of multisensory integration, and provides a biologically plausible hypothesis about the underlying circuitry.
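The object-recognition models above are built from Wilson-Cowan oscillators, i.e. coupled excitatory-inhibitory population pairs; a minimal sketch of one such pair integrated with forward Euler is given below, using generic textbook-style parameter values rather than those of the thesis models.

```python
# A single Wilson-Cowan oscillator: one excitatory (E) and one inhibitory (I)
# population coupled through sigmoid gains, integrated with forward Euler.
# Parameters are generic textbook-style choices, not the thesis values.
import numpy as np

def S(x, a, theta):
    """Sigmoid population response function."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

# Coupling: E->E, I->E, E->I, I->I; external drive P to the E population.
c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0
a_e, th_e, a_i, th_i = 1.3, 4.0, 2.0, 3.7
P, tau = 1.25, 1.0

dt, steps = 0.005, 20000
E, I = 0.1, 0.05
E_trace = np.empty(steps)

for k in range(steps):
    dE = (-E + (1.0 - E) * S(c1 * E - c2 * I + P, a_e, th_e)) / tau
    dI = (-I + (1.0 - I) * S(c3 * E - c4 * I, a_i, th_i)) / tau
    E += dt * dE
    I += dt * dI
    E_trace[k] = E

# A non-negligible late-time range indicates sustained (limit-cycle) activity;
# a near-zero range means the pair relaxed to a steady state instead.
late = E_trace[-4000:]
print(f"late-time E range: {late.min():.3f} .. {late.max():.3f}")
```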

Relevance: 100.00%

Abstract:

Combinatorial Optimization is a branch of optimization that deals with problems where the set of feasible solutions is discrete. Routing problems form a well-studied branch of Combinatorial Optimization concerned with deciding the best way of visiting the nodes (customers) in a network. Routing problems appear in many real-world applications, including transportation and telephone or electronic data networks. Over the years, many solution procedures have been introduced for different routing problems. Some of them are based on exact approaches that solve the problems to optimality, and others are based on heuristic or metaheuristic search to find optimal or near-optimal solutions. There is also a less studied family of methods that combines heuristic and exact approaches to tackle different problems, including those in the Combinatorial Optimization area. The aim of this dissertation is to develop solution procedures based on the combination of heuristic and Integer Linear Programming (ILP) techniques for some important problems in routing optimization. In this approach, given an initial feasible solution to be possibly improved, the method follows a destroy-and-repair paradigm, in which the given solution is randomly destroyed (i.e., customers are removed in a random way) and repaired by solving an ILP model, in an attempt to find a new improved solution.
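As a schematic illustration of the destroy-and-repair loop described above (not the dissertation's actual models), the sketch below repeatedly removes random customers from a TSP-style tour and re-inserts them; a cheapest-insertion heuristic stands in for the ILP repair step, and the instance is randomly generated.

```python
# Schematic destroy-and-repair loop on a random Euclidean TSP instance.
# The cheapest-insertion repair below is a stand-in for the ILP-based
# repair used in the dissertation; the instance itself is synthetic.
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((30, 2))                      # 30 random customers

def tour_length(tour):
    return sum(np.linalg.norm(pts[tour[i]] - pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def repair(tour, removed):
    """Greedy cheapest insertion: placeholder for solving an ILP model."""
    tour = list(tour)
    for c in removed:
        best_pos, best_cost = None, float("inf")
        for pos in range(len(tour)):
            a, b = tour[pos], tour[(pos + 1) % len(tour)]
            delta = (np.linalg.norm(pts[a] - pts[c])
                     + np.linalg.norm(pts[c] - pts[b])
                     - np.linalg.norm(pts[a] - pts[b]))
            if delta < best_cost:
                best_pos, best_cost = pos + 1, delta
        tour.insert(best_pos, c)
    return tour

tour = list(range(len(pts)))                   # initial (arbitrary) feasible tour
best = tour_length(tour)
for it in range(200):
    removed = list(rng.choice(tour, size=5, replace=False))   # destroy
    partial = [c for c in tour if c not in removed]
    candidate = repair(partial, removed)                      # repair
    cand_len = tour_length(candidate)
    if cand_len < best:                                       # keep only improvements
        tour, best = candidate, cand_len

print(f"tour length after destroy-and-repair: {best:.3f}")
```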

Relevance: 100.00%

Abstract:

Crew scheduling and crew rostering are similar and related problems that can be solved by similar procedures. So far, existing solution methods usually create a model for each of these problems (scheduling and rostering), and when they are solved together, in some cases an interaction between the models is considered in order to obtain a better solution. A single set covering model that solves both problems simultaneously is presented here, in which the total number of drivers needed is directly considered and optimized. This integration allows all of the depots to be optimized at the same time, whereas traditional approaches needed to work depot by depot, and it also makes it possible to see and manage the relationship between scheduling and rostering, which was known to some degree but usually not easy to quantify in the way this model permits. Recent research in the area of crew scheduling and rostering has stated that one of the current challenges is to determine a schedule in which crew fatigue, which depends mainly on the quality of the rosters created, is reduced. In this approach rosters are constructed in such a way that stable working hours are used in every week of work, and a change to a different shift is made only with free days in between, to ease the adaptation to the new working hours. Computational results for real-world-based instances are presented. The instances are geographically diverse in order to test the performance of the procedures and of the model in different scenarios.
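The integrated model above is a set covering formulation: every trip must be covered by at least one selected duty while the number of drivers is minimized. A tiny hedged sketch of that formulation with the PuLP modelling library is shown below, on a made-up instance of five tasks and five candidate duties.

```python
# Tiny set covering model in the spirit of the crew scheduling/rostering
# formulation: pick a minimum number of candidate duties so that every task
# is covered. The instance is invented for illustration. Requires PuLP.
import pulp

tasks = ["t1", "t2", "t3", "t4", "t5"]
# Candidate duties and the tasks each one covers (hypothetical data).
duties = {
    "d1": {"t1", "t2"},
    "d2": {"t2", "t3", "t4"},
    "d3": {"t4", "t5"},
    "d4": {"t1", "t5"},
    "d5": {"t3"},
}

model = pulp.LpProblem("set_covering", pulp.LpMinimize)
x = {d: pulp.LpVariable(f"x_{d}", cat="Binary") for d in duties}

# Objective: minimize the number of selected duties (a proxy for drivers).
model += pulp.lpSum(x.values())

# Covering constraints: each task must be covered by at least one duty.
for t in tasks:
    model += pulp.lpSum(x[d] for d, cov in duties.items() if t in cov) >= 1

model.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [d for d in duties if x[d].value() > 0.5]   # selected binary variables
print("selected duties:", chosen)
```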