16 results for Problem solving, control methods, and search – scheduling

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Among stone monumental artefacts, artistic fountains are, because of their particular exposure conditions, extremely favourable to the formation of biofilms, which give rise to biodegradation processes involving physical-chemical and visual alterations. The microbial diversity of five fountains (two in Spain and three in Italy) was investigated, and an ample similarity was observed between the biodiversity reported in the literature for monumental stones and that found in the studied fountains. Mechanical procedures and toxic chemical products are usually employed to remove such phototrophic patinas. Alternative methods based on natural antifouling substances have recently been tested in the marine sector, owing to their very low environmental impact, to prevent biological settlement on the partially immersed structures of ships. In the present work, groups of antibiofouling agents (ABAs) were selected from the literature for their ability to interfere, at the molecular level, with the microbial communication system known as quorum sensing, inhibiting the initial phase of biofilm formation. The efficacy of several natural ABAs of terrestrial origin (capsaicin, CS; cinnamaldehyde, CI) and marine origin (zosteric acid, ZA; poly-alkyl pyridinium salts, pAPS; Ceramium botryocarpum extract, CBE), incorporated into two commercial coatings commonly used in stone conservation (Silres BS OH 100, S; Wacker Silres BS 290, W), was evaluated. The formation of phototrophic biofilms in the presence or absence of these natural antifouling agents was investigated both under laboratory conditions (on Carrara marble and Sierra Elvira stone specimens) and on two monumental fountains (Tacca's Fountain 2, Florence, Italy, and the fountain of the Patio de la Lindaraja, Alhambra Palace, Granada, Spain). At the tested concentrations, the natural antibiofouling agents demonstrated a certain inhibitory effect. The silane-siloxane based silicone coating (W) proved more suitable for mixing with ABAs than the ethyl silicate coating (S), but was effective against biofilm formation only when incompletely cured. The laboratory results indicated a positive action in inhibiting patina formation, especially for poly-alkyl pyridinium salts, zosteric acid and cinnamaldehyde, while on-site tests revealed a good effect for zosteric acid.

Abstract:

Decomposition-based approaches are recalled from both the primal and the dual points of view. The possibility of building partially disaggregated reduced master problems is investigated: this extends the aggregated-versus-disaggregated dichotomy to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. Allowing only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required per iteration. This trade-off is explored on several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is then proposed, based on the well-known Frank-Wolfe algorithm. To provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is also addressed: an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios, and the goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented. The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
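The Frank-Wolfe iteration used above for route assignment can be pictured on a toy network. The two-link instance and the BPR-style cost function below are illustrative assumptions, not data from the thesis:

```python
def bpr_time(t0, cap, x):
    """Congested link travel time, standard BPR form (assumed here)."""
    return t0 * (1.0 + 0.15 * (x / cap) ** 4)

def frank_wolfe(t0, cap, demand, iters=500):
    """Assign `demand` to two parallel links by the Frank-Wolfe method."""
    # Feasible start: all-or-nothing assignment to the free-flow shortest link.
    start = min(range(2), key=lambda i: t0[i])
    x = [demand if i == start else 0.0 for i in range(2)]
    for k in range(1, iters + 1):
        times = [bpr_time(t0[i], cap[i], x[i]) for i in range(2)]
        best = min(range(2), key=lambda i: times[i])
        y = [demand if i == best else 0.0 for i in range(2)]  # linearised optimum
        step = 2.0 / (k + 2.0)                                # standard FW step size
        x = [(1.0 - step) * xi + step * yi for xi, yi in zip(x, y)]
    return x

flows = frank_wolfe(t0=[10.0, 12.0], cap=[40.0, 60.0], demand=80.0)
```

After enough iterations the two used links end up with nearly equal travel times, i.e. a user-equilibrium assignment; in the thesis the first feasible point comes from the decomposition-based multicommodity flow solver rather than from an all-or-nothing start.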

Abstract:

Information is nowadays a key resource: machine learning and data mining techniques have been developed to extract high-level information from great amounts of data. As most data comes in the form of unstructured text in natural languages, research on text mining is currently very active and deals with practical problems. Among these, text categorization concerns the automatic organization of large quantities of documents into predefined taxonomies of topic categories, possibly arranged in large hierarchies. In commonly proposed machine learning approaches, classifiers are automatically trained from pre-labeled documents: they can perform very accurate classification, but often require a sizeable training set and notable computational effort. Methods for cross-domain text categorization have been proposed that leverage a set of labeled documents from one domain to classify those of another. Most such methods use advanced statistical techniques, usually involving the tuning of parameters. A first contribution presented here is a method based on nearest centroid classification, where profiles of categories are generated from the known domain and then iteratively adapted to the unknown one. Despite being conceptually simple and having easily tuned parameters, this method achieves state-of-the-art accuracy on most benchmark datasets with fast running times. A second, deeper contribution involves the design of a domain-independent model to distinguish the degree and type of relatedness between arbitrary documents and topics, inferred from the different types of semantic relationships between their representative words, which are identified by specific search algorithms. The application of this model is tested on both flat and hierarchical text categorization, where it potentially allows the efficient addition of new categories during classification. Results show that classification accuracy still requires improvement, but models generated from one domain prove effectively reusable in a different one.
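The iteratively adapted nearest centroid idea described above can be sketched as follows. The toy corpora, the cosine similarity, and the assignment-based re-centroiding are illustrative assumptions; the thesis's actual profile representation and adaptation rule may differ:

```python
from collections import Counter
import math

def vec(doc):
    """Bag-of-words term-frequency vector of a whitespace-tokenised document."""
    return Counter(doc.split())

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def centroid(vectors):
    """Sum of term-frequency vectors, acting as a category profile."""
    c = Counter()
    for v in vectors:
        c.update(v)
    return c

def adapt(source, target_docs, rounds=3):
    """source: {label: [docs]} from the known domain; returns target labels."""
    profiles = {lab: centroid([vec(d) for d in docs])
                for lab, docs in source.items()}
    tvecs = [vec(d) for d in target_docs]
    pred = []
    for _ in range(rounds):
        pred = [max(profiles, key=lambda lab: cosine(v, profiles[lab]))
                for v in tvecs]
        # Rebuild each profile from the target documents assigned to it,
        # gradually shifting the centroids toward the unknown domain.
        for lab in list(profiles):
            assigned = [v for v, p in zip(tvecs, pred) if p == lab]
            if assigned:
                profiles[lab] = centroid(assigned)
    return pred

source = {"sport": ["goal match team", "match win team"],
          "tech": ["cpu chip code", "code compiler chip"]}
target = ["team goal final", "chip cpu cores"]
labels = adapt(source, target)
```

The appeal of the approach, as the abstract notes, is exactly this simplicity: the only knobs are the similarity measure and the number of adaptation rounds.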

Abstract:

Some fundamental biological processes, such as embryonic development, have been preserved during evolution and are common to species at different phylogenetic positions, yet remain largely unknown. An understanding of the cell morphodynamics leading to organized spatial distributions of cells, such as tissues and organs, can be achieved by reconstructing cell shapes and positions during the development of a live animal embryo. In this work we design a chain of image processing methods to automatically segment and track cell nuclei and membranes during the development of the zebrafish embryo, which has been widely validated as a model organism for understanding vertebrate development, gene function and healing/repair mechanisms. The embryo is first labeled through the ubiquitous expression of fluorescent proteins addressed to cell nuclei and membranes, and temporal sequences of volumetric images are acquired with laser scanning microscopy. Cell positions are detected by processing the nuclei images either through a generalized form of the Hough transform or by identifying nuclei as local maxima after a smoothing preprocessing step. Membrane and nuclei shapes are reconstructed using PDE-based variational techniques such as Subjective Surfaces and the Chan-Vese method. Cell tracking is performed by combining the previously detected information on cell shape and position with biological regularization constraints. Our results are manually validated and reconstruct the formation of the zebrafish brain at the 7-8 somite stage, with all cells tracked from the late sphere stage for at least 6 hours with less than 2% error. This reconstruction opens the way to a systematic investigation of cellular behaviors, of the clonal origin and clonal complexity of brain organs, and of the contribution of cell proliferation modes and cell movements to the formation of local patterns and morphogenetic fields.
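The second nuclei-detection route mentioned above (smoothing followed by local maxima) can be sketched in 2-D on a synthetic frame. The box filter here is an assumed cheap stand-in for the Gaussian-like smoothing step, and the thesis works on 3-D volumes over time rather than a 2-D toy image:

```python
import numpy as np

def smooth(img, passes=2):
    """Separable 3x3 box smoothing, a stand-in for Gaussian pre-filtering."""
    out = img.astype(float)
    h, w = img.shape
    for _ in range(passes):
        padded = np.pad(out, 1, mode="edge")
        out = sum(padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return out

def local_maxima(img, threshold):
    """Pixels strictly greater than their 8 neighbours and above threshold."""
    peaks = []
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            if (img[y, x] >= threshold and img[y, x] == patch.max()
                    and np.count_nonzero(patch == patch.max()) == 1):
                peaks.append((y, x))
    return peaks

# Two synthetic "nuclei" as bright spots on a dark background.
frame = np.zeros((12, 12))
frame[3, 3] = frame[8, 9] = 10.0
centres = local_maxima(smooth(frame), threshold=0.5)
```

Smoothing first is essential: on raw noisy microscopy data nearly every pixel is a local extremum, whereas after filtering each nucleus contributes a single dominant peak.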

Abstract:

The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimized employment of the Italian antennas, originally built mainly for VLBI activities and equipped with a control system (FS, Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and the data acquisition system. The production of a completely new control system, ESCS (Enhanced Single-dish Control System), for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimized for single-dish observations in continuum, spectrometry and polarimetry; ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of single-dish standard output files and the realization of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver currently available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work. Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels never before reached at Medicina with the previous observing techniques and hardware. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto in fact offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover wide areas of the sky quickly (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed both for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data reduction pipeline and assess the overall scientific capabilities of the system. The K-band observations, carried out in several sessions over the December 2008 - March 2010 period, were accompanied by a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived to check the new analogue backend separately from the multi-feed receiver and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey intended to complete PMN-GB6 and provide all-sky coverage at 5 GHz).

Abstract:

The procedure for event location in the OPERA ECC has been optimized for penetrating particles, while it is less efficient for electrons. For this reason a new procedure has been defined in order to recover events with an electromagnetic shower in the final state that are not located by the standard one. The new procedure includes the standard one, during which several electromagnetic shower hints are defined by means of the available data; if the event is not located, the presence of an electromagnetic shower hint triggers a dedicated procedure. The old and new location procedures have then been simulated in order to obtain the standard location efficiency and the possible gain of the new procedure for events with an electromagnetic shower. Finally, a data-Monte Carlo comparison has been performed on neutral-current (NC) events from the 2008 and 2009 runs in order to validate the Monte Carlo. The expected number of electron neutrino interactions for the 2008 and 2009 runs has then been evaluated and compared with the available data.

Abstract:

Perfluoroalkylated substances are a group of chemicals that have been widely employed over the last 60 years in several applications, spreading and accumulating in the environment due to their extreme resistance to degradation. As a consequence, they have also been found in various types of food as well as in drinking water, proving that they can easily reach humans through the diet. The available information concerning their adverse health effects has recently increased interest in these contaminants and highlighted the importance of investigating all potential sources of human exposure, among which diet has proved to be the most relevant. This need has been underlined by the European Union through Recommendation 2010/161/EU: in this document, Member States were called upon to monitor the presence of these substances in food and to produce accurate estimations of human exposure. The purpose of the research presented in this thesis, the result of a partnership between an Italian and a French laboratory, was to develop reliable tools for the analysis of these pollutants in food, to be used for generating data on potentially contaminated matrices. An efficient method based on liquid chromatography-mass spectrometry for the detection of 16 different perfluorinated compounds in milk has been validated in accordance with current European regulation guidelines (2002/657/EC) and is currently under evaluation for ISO 17025 accreditation. The proposed technique was applied to cow's milk, powdered milk and human breast milk samples from Italy and France to produce preliminary monitoring data on the presence of these contaminants. In accordance with the above-mentioned European Recommendation, this project also led to the development of a promising technique for the quantification of some precursors of these substances in fish. This method showed extremely satisfactory performance in terms of linearity and limits of detection, and will be useful for future surveys.

Abstract:

The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between the text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain addressed by the case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research builds on the community's previous efforts in legal knowledge representation and rule interchange for applications in the legal domain, applying the theory to a set of real legal documents and stressing OWL axiom definitions as much as possible, so as to obtain a semantically powerful representation of the legal document and a solid ground for an argumentation system based on a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning capabilities for legal knowledge, especially when combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.

Abstract:

The purpose of my PhD thesis has been to face the issue of retrieving a three dimensional attenuation model in volcanic areas. To this purpose, I first elaborated a robust strategy for the analysis of seismic data. This was done by performing several synthetic tests to assess the applicability of spectral ratio method to our purposes. The results of the tests allowed us to conclude that: 1) spectral ratio method gives reliable differential attenuation (dt*) measurements in smooth velocity models; 2) short signal time window has to be chosen to perform spectral analysis; 3) the frequency range over which to compute spectral ratios greatly affects dt* measurements. Furthermore, a refined approach for the application of spectral ratio method has been developed and tested. Through this procedure, the effects caused by heterogeneities of propagation medium on the seismic signals may be removed. The tested data analysis technique was applied to the real active seismic SERAPIS database. It provided a dataset of dt* measurements which was used to obtain a three dimensional attenuation model of the shallowest part of Campi Flegrei caldera. Then, a linearized, iterative, damped attenuation tomography technique has been tested and applied to the selected dataset. The tomography, with a resolution of 0.5 km in the horizontal directions and 0.25 km in the vertical direction, allowed to image important features in the off-shore part of Campi Flegrei caldera. High QP bodies are immersed in a high attenuation body (Qp=30). The latter is well correlated with low Vp and high Vp/Vs values and it is interpreted as a saturated marine and volcanic sediments layer. High Qp anomalies, instead, are interpreted as the effects either of cooled lava bodies or of a CO2 reservoir. A pseudo-circular high Qp anomaly was detected and interpreted as the buried rim of NYT caldera.
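The core of the spectral ratio measurement described above can be illustrated with synthetic spectra. The amplitude model A_i(f) = S(f) * exp(-pi * f * t*_i), the frequency band and the t* values below are assumptions for the sketch; the common source term S(f) cancels in the ratio, so the slope of the log spectral ratio against frequency gives -pi * dt*:

```python
import math

freqs = [1.0 + 0.5 * i for i in range(20)]               # assumed 1-10.5 Hz band
source = [1.0 / (1.0 + (f / 8.0) ** 2) for f in freqs]   # common source spectrum
t1, t2 = 0.05, 0.02                                      # path t* values (s)
a1 = [s * math.exp(-math.pi * f * t1) for s, f in zip(source, freqs)]
a2 = [s * math.exp(-math.pi * f * t2) for s, f in zip(source, freqs)]

# Least-squares slope of ln(a1/a2) against f; the source term cancels out.
y = [math.log(r1 / r2) for r1, r2 in zip(a1, a2)]
n = len(freqs)
fm = sum(freqs) / n
ym = sum(y) / n
slope = (sum((f - fm) * (v - ym) for f, v in zip(freqs, y))
         / sum((f - fm) ** 2 for f in freqs))
dt_star = -slope / math.pi   # recovers the differential attenuation t1 - t2
```

This also makes the thesis's third conclusion tangible: on real, noisy spectra the fitted slope (and hence dt*) depends on which frequency band enters the regression.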

Abstract:

Heavy Liquid Metal Cooled Reactors are among the concepts fostered by the GIF as potentially able to comply with stringent safety, economy, sustainability, proliferation resistance and physical protection requirements. The increasing interest in these innovative systems has highlighted the lack of tools specifically dedicated to their core design stage. The present PhD thesis summarizes a three-year effort to partially close this gap, by rationally defining the role of codes in core design, creating a development methodology for design-oriented codes (DOCs), and applying it to the most needed design areas: fuel assembly thermal-hydraulics and fuel pin thermo-mechanics. Regarding the former, following the established methodology, the sub-channel code ANTEO+ has been conceived. Initially restricted to the forced convection regime and subsequently extended to the mixed one, ANTEO+ has been demonstrated, via a thorough validation campaign, to be a reliable tool for design applications. Concerning the fuel pin thermo-mechanics, the aim of including safety-related considerations at the outset of the pin dimensioning process gave birth to the safety-informed DOC TEMIDE. The proposed DOC development methodology has also been applied to TEMIDE; given the complex interdependencies among the numerous phenomena involved in an irradiated fuel pin, a sensitivity analysis was performed over the anticipated application domain to optimize the code's final structure. The development methodology has also been tested in the verification and validation phases; the latter, owing to the scarcity of experiments truly representative of TEMIDE's application domain, has only been a preliminary attempt to test TEMIDE's ability to fulfil the DOC requirements upon which it was built. In general, the capability of the proposed DOC development methodology to deliver tools that help the core designer in the preliminary definition of the system configuration has been proven.

Abstract:

Changing or creating an organisation means creating a new process, and each process involves many risks that need to be identified and managed. The main risks considered here are procedural and legal: the former relate to errors that may occur during processes, the latter to the compliance of processes with regulations. Managing these risks implies proposing changes to the processes that lead to the desired result: an optimised process. To manage a company and optimise it in the best possible way, the organisational aspect, risk management and legal compliance should not only each be taken into account, but be analysed simultaneously, with the aim of finding the right balance that satisfies them all. This is the aim of this thesis: to provide methods and tools to balance these three characteristics, with ICT support used to enable this type of optimisation. This is not a thesis in computer science or in law, but an interdisciplinary one. Most of the work done so far is vertical, confined to a specific domain; the particularity and aim of this thesis is not to carry out an in-depth analysis of one particular aspect, but to combine several important aspects that are normally analysed separately yet impact and influence each other. Carrying out this kind of interdisciplinary analysis required the knowledge bases of both areas and the combination and collaboration of experts in the various fields. Although the methodology described is generic and can be applied to all sectors, the case study considered is a new type of healthcare service that allows patients with acute disease to be hospitalised at home, which provides the possibility of performing experiments using a real hospital database.

Abstract:

The challenges of current global food systems are often framed around feeding the world's growing population while meeting sustainable development goals for future generations. Globalization has brought about a fragmentation of food spaces, leading to a flexible and mutable supply chain. This poses a major challenge to food and nutrition security, also affecting rural-urban dynamics in territories. Furthermore, recent crises have highlighted the vulnerability of food systems and the ecosystem to shocks and disruptions, due to the intensive management of natural, human and economic capital. A sustainable and resilient transition of food systems is therefore required, through a multi-faceted approach that tackles the causes of unsustainability and promotes sustainable practices at all levels of the food system. In this respect, a territorial approach becomes a relevant entry point for analysing the food system's multifunctionality: it can support the evaluation of sustainability by quantifying impacts with quantitative methods and by understanding the territorial responsibility of different actors with qualitative ones. Against this background, the present research aims to: i) investigate the environmental, costing and social indicators suitable for a scoring system able to measure the integrated sustainability performance of food initiatives within the City/Region territorial context; ii) develop a territorial assessment framework to measure the sustainability impacts of agricultural systems; and iii) define an integrated methodology to match production and consumption at the territorial level, fostering a long-term vision of short food supply chains. From a methodological perspective, the research proposes a mixed quantitative and qualitative method. The outcomes provide an in-depth view of the environmental and socio-economic impacts of food systems at the territorial level, investigating possible indicators, frameworks and business strategies to foster their future sustainable development.

Abstract:

This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns assigning time and resources to a set of activities, to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources; instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-flow Graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDFGs, onto multi-core architectures; results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint, along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion; conversely, our technique is based on a non-linear model and tackles the problem as a whole, with the period value inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on practical-size problems.
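The periodic resource reasoning behind the global cumulative constraint mentioned above can be pictured with a small steady-state feasibility check. This is a simplified integer-time sketch with an invented instance, not the thesis's filtering algorithm:

```python
def cyclic_cumulative_ok(starts, durs, demands, lam, capacity):
    """Steady-state check of a periodic cumulative constraint: activity i
    occupies time points starts[i] .. starts[i]+durs[i]-1 modulo the period
    `lam`, so its resource usage wraps around the period boundary."""
    for t in range(lam):  # integer time points suffice for integer data
        load = sum(r for s, d, r in zip(starts, durs, demands)
                   if d >= lam or (t - s) % lam < d)
        if load > capacity:
            return False
    return True

# Two unit-demand activities of length 3 in a period of 5 must overlap
# somewhere, so they need capacity 2.
feasible = cyclic_cumulative_ok([0, 2], [3, 3], [1, 1], lam=5, capacity=2)
```

The wrap-around in the modulo test is exactly what makes the cyclic case harder than one-shot scheduling, and it is also why treating the period as a decision variable (rather than fixing it generate-and-test style) requires a non-linear model.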

Abstract:

This work presents hybrid Constraint Programming (CP) and metaheuristic methods for the solution of large-scale optimization problems; it aims at integrating concepts and mechanisms from metaheuristic methods into a CP-based tree search environment in order to exploit the advantages of both approaches. The modeling and solution of large-scale combinatorial optimization problems is a topic that has attracted the interest of many researchers in the Operations Research field; combinatorial optimization problems are widespread in everyday life, and the need to solve difficult problems is ever more urgent. Metaheuristic techniques have been developed over the last decades to effectively handle the approximate solution of combinatorial optimization problems; we examine metaheuristics in detail, focusing on the aspects common to different techniques. Each metaheuristic approach possesses its own peculiarities in designing and guiding the solution process; our work aims at recognizing components that can be extracted from metaheuristic methods and reused in different contexts. In particular, we focus on porting metaheuristic elements to constraint programming based environments, as constraint programming deals with the feasibility issues of optimization problems in a very effective manner. Moreover, CP offers a general paradigm that allows one to easily model any type of problem and solve it with a problem-independent framework, unlike local search and metaheuristic methods, which are highly problem-specific. In this work we describe the implementation of the Local Branching framework, originally developed for Mixed Integer Programming, in a CP-based environment; constraint programming specific features are used to ease the search process while maintaining the full generality of the approach. We also propose a search strategy called Sliced Neighborhood Search (SNS), which iteratively explores slices of large neighborhoods of an incumbent solution by performing CP-based tree search and encloses concepts from metaheuristic techniques. SNS can be used as a stand-alone search strategy, or it can be embedded in existing strategies as an intensification and diversification mechanism; in particular, we show its integration within CP-based local branching. We provide an extensive experimental evaluation of the proposed approaches on instances of the Asymmetric Traveling Salesman Problem and of the Asymmetric Traveling Salesman Problem with Time Windows. The proposed approaches achieve good results on practical-size problems, thus demonstrating the benefit of integrating metaheuristic concepts into CP-based frameworks.
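The local branching neighbourhood at the heart of the framework above can be sketched on a toy 0-1 knapsack. The instance is invented, and the brute-force enumeration stands in for what the thesis actually does with CP tree search and propagation; only the Hamming-ball restriction around the incumbent is the point being illustrated:

```python
from itertools import product

def hamming(a, b):
    """Number of positions at which two 0-1 vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def local_branch(values, weights, cap, incumbent, k):
    """Best feasible 0-1 solution within Hamming distance k of the incumbent."""
    best = list(incumbent)
    best_val = sum(v for v, x in zip(values, incumbent) if x)
    for cand in product((0, 1), repeat=len(values)):
        if hamming(cand, incumbent) > k:
            continue  # outside the local branching neighbourhood
        weight = sum(w for w, x in zip(weights, cand) if x)
        val = sum(v for v, x in zip(values, cand) if x)
        if weight <= cap and val > best_val:
            best, best_val = list(cand), val
    return best, best_val

# Improve an incumbent of value 6 within a radius-2 neighbourhood.
sol, val = local_branch([6, 5, 4, 3], [4, 3, 2, 2], cap=6,
                        incumbent=[1, 0, 0, 0], k=2)
```

In a CP setting the distance restriction is simply posted as a constraint, which is what lets intensification (small k) and diversification (larger k, or excluding the current ball) reuse the same complete search machinery.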