22 results for Integration of Programming Techniques

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (e.g., scheduling, project planning, transportation, telecommunications, economics and finance, timetabling) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research have dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community remains very active in trying to answer them. As a consequence, a huge number of papers are continuously published and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first occurs when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general purpose techniques. The second occurs when mixed integer programming is used to address a somewhat structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above mentioned situations. The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers.
Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers has drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress and simply presents a possible way of generating two-row cuts, arising from lattice-free triangles, from the simplex tableau, together with some preliminary computational results. The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs).
The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new, improved solution) in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the approach that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut methods in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (in particular, the usage of general purpose cutting planes) can be useful to improve on branch-and-cut methods proposed in the literature.
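The destroy-and-repair paradigm described above can be sketched in a few lines. The following is a minimal illustrative sketch, not the thesis's algorithm: the repair step here uses greedy cheapest insertion on a single-route instance, whereas the thesis heuristically solves an integer programming formulation with a general purpose MIP solver; all function names and parameters are our own assumptions.

```python
import random

def tour_cost(tour, dist):
    # Total cost of a closed tour starting and ending at the depot (node 0).
    path = [0] + tour + [0]
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

def destroy(tour, k, rng):
    # Destroy step: randomly remove k customers from the current tour.
    removed = rng.sample(tour, k)
    kept = [c for c in tour if c not in removed]
    return kept, removed

def repair(tour, removed, dist):
    # Repair step (greedy cheapest insertion): put each removed customer
    # back where it increases the tour cost the least.
    for c in removed:
        best_pos, best_delta = 0, float("inf")
        path = [0] + tour + [0]
        for i in range(len(path) - 1):
            a, b = path[i], path[i + 1]
            delta = dist[a][c] + dist[c][b] - dist[a][b]
            if delta < best_delta:
                best_pos, best_delta = i, delta
        tour.insert(best_pos, c)
    return tour

def destroy_and_repair(dist, customers, iters=200, k=2, seed=0):
    # Iterate destroy + repair, keeping only strictly improving solutions.
    rng = random.Random(seed)
    best = list(customers)
    for _ in range(iters):
        kept, removed = destroy(best, k, rng)
        cand = repair(kept, removed, dist)
        if tour_cost(cand, dist) < tour_cost(best, dist):
            best = cand
    return best, tour_cost(best, dist)
```

On a small symmetric instance this typically recovers a good single-route order; in the thesis, the repair corresponds to the MIP-based exploration of an exponential neighborhood rather than this greedy rule.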

Relevance: 100.00%

Abstract:

In recent years, IoT technology has radically transformed many crucial industrial and service sectors, such as healthcare. The multi-faceted heterogeneity of the devices and of the collected information provides important opportunities to develop innovative systems and services. However, the ubiquitous presence of data silos and the poor semantic interoperability in the IoT landscape constitute a significant obstacle to the pursuit of this goal. Moreover, achieving actionable knowledge from the collected data requires IoT information sources to be analysed using appropriate artificial intelligence techniques, such as automated reasoning. In this thesis work, Semantic Web technologies have been investigated as an approach to address both the data integration and the reasoning aspects of modern IoT systems. In particular, the contributions presented in this thesis are the following: (1) the IoT Fitness Ontology, an OWL ontology developed to overcome the issue of data silos and enable semantic interoperability in the IoT fitness domain; (2) a Linked Open Data web portal for collecting and sharing IoT health datasets with the research community; (3) a novel methodology for embedding knowledge in rule-defined IoT smart home scenarios; and (4) a knowledge-based IoT home automation system that supports a seamless integration of heterogeneous devices and data sources.

Relevance: 100.00%

Abstract:

Nowadays, in Ubiquitous computing scenarios, users increasingly expect to exploit online contents and services from any device at hand, no matter their physical location, with content and service access personalized and tailored to their own requirements. The coordinated provisioning of content tailored to user context and preferences, and the support for mobile multimodal and multichannel interactions, are of paramount importance in providing users with truly effective Ubiquitous support. However, so far the intrinsic heterogeneity of these scenarios and the lack of an integrated approach have led to proposals that are either too vertical or practically unusable, resulting in poor, non-versatile support platforms for Ubiquitous computing. This work investigates and promotes design principles to help cope with these ever-changing and inherently dynamic scenarios. By following the outlined principles, we have designed and implemented a middleware platform to support the provisioning of Ubiquitous mobile services and contents. To prove the viability of our approach, we have realized and stress-tested, on top of our platform, a number of different, extremely complex and heterogeneous content and service provisioning scenarios. The encouraging results obtained are pushing our research further, toward a dynamic platform able not only to support novel Ubiquitous applicative scenarios by tailoring extremely diverse services and contents to heterogeneous user needs, but also to reconfigure and adapt itself in order to provide truly optimized and tailored support for Ubiquitous service provisioning.

Relevance: 100.00%

Abstract:

The PhD project focused on the study and improvement of poultry welfare conditions. The work was divided into three main research activities. A) Field evaluation of the rearing conditions of meat chickens kept in intensive farms. Given the lack of published reports on the overall Italian rearing conditions of broiler chickens, a survey was carried out to assess the welfare conditions of broilers reared by the most important poultry companies in Italy, to verify whether they comply with the recommendations of the European proposal COM (2005) 221 final. Chicken farm conditions, carcass lesions and meat quality were investigated. 1. The densities currently used in Italy are in accordance with the European proposal COM 221 final (2005), which recommends keeping broilers at a density lower than 30-32 kg live weight/m2 and not exceeding 38-40 kg live weight/m2. 2. The mortality rates in summer and winter agree with the mortality score calculated following the formula reported in the EU proposal COM 221 final (2005). 3. The incidence of damaged carcasses was very low and did not seem related to the stocking density. 4. The FPD (foot pad dermatitis) scores were generally above the maximum limit advised by the EU proposal COM 221 final (2005), although the stocking densities were lower than 30-32 kg live weight per m2. 5. It can be stated that the control of environmental conditions, particularly litter quality, appears to be a key issue in controlling the onset of foot pad dermatitis. B) Manipulation of several farm parameters, such as litter material and depth, stocking density and light regimen, to improve chicken welfare conditions in the winter season. 1. Even though 2 different stocking densities were established in this study, the performances achieved by the chickens were almost identical among groups. 2. The FCR was significantly better under Standard conditions than for birds reared under Welfare conditions, with lower stocking density, more litter material and a light program of 16 hours light and 8 hours dark. 3. In our trial, in Standard groups we observed a higher content of moisture, nitrogen and ammonia released from the litter. It can therefore be assumed that the environmental characteristics were positively changed by the improved rearing conditions adopted for the Welfare groups. 4. In Welfare groups the exhausted litter of the pens was dryer and broilers showed a lower occurrence of FPD. 5. The prevalence of hock burn lesions, like FPD, is high under poor litter quality conditions. 6. The combined effect of a lower stocking density, a greater amount of litter material and a photoperiod similar to the natural one positively influenced the chickens' welfare status; indeed, the occurrence of FPD in the Welfare groups was the lowest, keeping the score under the European threshold of proposal COM 221 final (2005). C) The purpose of the third study was to assess the effect of high or low stocking density, different types of litter, and short or long lighting regimens on broiler welfare during the hot season, through the evaluation of productivity and the incidence of foot pad dermatitis. 1. Feed efficiency was better for Low Density than for High Density broilers. 2. The appearance of FPD was not influenced by stocking density. 3. Foot examination revealed that lesions occurred more frequently in birds maintained on chopped wheat straw than on wood shavings. 4. In conclusion, the adoption of a short light regimen similar to that occurring in nature during summer reduces feed intake without modifying the growth rate, thus improving feed efficiency. Foot pad lesions were affected neither by stocking density nor by light regimen, whereas wood shavings exerted a favourable effect in preserving foot pads in good condition. D) A study was carried out to investigate more widely the possible role of 25-hydroxycholecalciferol supplemented in the diet of a commercial laying hen strain (Lohmann Brown), in comparison with diets supplemented with vitamin D3 or with D3 + 25-hydroxycholecalciferol. Egg traits during a productive cycle, as well as the bone characteristics of the layers, were evaluated to determine whether the vitamin D3 source may enhance the welfare status of the birds. 1. The weight of the egg and of its components is often greater in hens fed a diet enriched with 25-hydroxycholecalciferol. 2. Since eggs of treated groups are heavier and a larger amount of shell is needed, a direct effect on shell strength is observed. 3. At 30 and at 50 wk of age, hens fed 25-hydroxycholecalciferol exhibited greater bone breaking force. 4. Radiographic density values obtained in the trial were always higher in hens fed 25-hydroxycholecalciferol under both treatments: supplemented for the whole laying cycle (25D3) or from 40 weeks of age onward (D3+25D3).

Relevance: 100.00%

Abstract:

Interactive theorem provers are tools designed for the certification of formal proofs developed by means of man-machine collaboration. Formal proofs obtained in this way cover a large variety of logical theories, ranging from the branches of mainstream mathematics to the field of software verification. The border between these two worlds is marked by results in theoretical computer science and proofs related to the metatheory of programming languages. This last field, an obvious application of interactive theorem proving, nonetheless poses a serious challenge to the users of such tools, due both to the particularly structured way in which these proofs are constructed and to difficulties related to the management of notions typical of programming languages, such as variable binding. This thesis is composed of two parts, discussing our experience in the development of the Matita interactive theorem prover and its use in the mechanization of the metatheory of programming languages. More specifically, Part I covers: - the results of our effort to provide a better framework for the development of tactics for Matita, making their implementation and debugging easier and resulting in much clearer code; - a discussion of the implementation of two tactics, providing infrastructure for the unification of constructor forms and the inversion of inductive predicates; we point out interactions between induction and inversion and provide an advancement over the state of the art. In the second part of the thesis, we focus on aspects related to the formalization of programming languages.
We describe two works of ours: - a discussion of basic issues we encountered in our formalizations of part 1A of the POPLmark challenge, where we apply the extended inversion principles we implemented for Matita; - a formalization of an algebraic logical framework, posing more complex challenges, including multiple binding and a form of hereditary substitution; for the encoding of binding, this work adopts an extension of Masahiko Sato's canonical locally named representation, which we designed during our visit to the Laboratory for Foundations of Computer Science at the University of Edinburgh, under the supervision of Randy Pollack.

Relevance: 100.00%

Abstract:

This research activity studied how uncertainties arise and interact in the multi-model approach, since handling them is one of the biggest challenges of ocean and weather forecasting. Moreover, we tried to reduce model error through the superensemble approach. To this aim, we created different datasets and, by means of proper algorithms, obtained the superensemble estimate. We studied the sensitivity of this algorithm as a function of its characteristic parameters. Clearly, a reasonable estimate of the error cannot be obtained while neglecting the grid size of the ocean model, because of the large number of sub-grid phenomena embedded in the spatial discretization that can only be roughly parametrized rather than explicitly evaluated. For this reason we also developed a high-resolution model, in order to calculate for the first time the impact of grid resolution on model error.
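As an illustration of the superensemble idea (a bias-corrected, regression-weighted combination of several model forecasts trained against observations), the following sketch fits weights on a training period by least squares. The function names, the anomaly-based parameterization and the data are our assumptions, not the thesis's actual implementation.

```python
import numpy as np

def superensemble_weights(forecasts, obs):
    # forecasts: (n_times, n_models) training-period forecasts, one column per model
    # obs:       (n_times,) observed values over the same training period
    # Fit obs ~ mean(obs) + sum_i w_i * (f_i - mean(f_i)): each model's
    # anomalies are regressed onto the observed anomalies by least squares.
    f_anom = forecasts - forecasts.mean(axis=0)
    o_anom = obs - obs.mean()
    w, *_ = np.linalg.lstsq(f_anom, o_anom, rcond=None)
    return w, obs.mean(), forecasts.mean(axis=0)

def superensemble_predict(new_forecasts, w, obs_mean, f_mean):
    # Combine new model forecasts into one bias-corrected estimate:
    # observed climatology plus the weighted sum of model anomalies.
    return obs_mean + (new_forecasts - f_mean) @ w
```

Removing each model's own mean before weighting corrects systematic model biases, so the combined estimate is anchored to the observed climatology rather than to any single model's.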

Relevance: 100.00%

Abstract:

The relationship between emotion and cognition is a topic of great interest in research. Recently, a view of these two processes as interactive and mutually influencing each other has become predominant. This dissertation investigates the reciprocal influences of emotion and cognition, at both the behavioral and the neural level, in two specific fields: attention and decision-making. Experimental evidence is reported on how emotional responses may affect perceptual and attentional processes. In addition, the impact of three factors (personality traits, motivational needs and social context) in modulating the influence that emotion exerts on perception and attention has been investigated. Moreover, the influence of cognition on emotional responses in decision-making has been demonstrated. The current experimental evidence shows that cognitive brain regions such as the dorsolateral prefrontal cortex are causally implicated in the regulation of emotional responses, with effects at both pre- and post-decisional stages. This dissertation has two main conclusions. First, emotion exerts a strong influence on perceptual and attentional processes, but this influence may also be modulated by other factors internal and external to the individual. Second, cognitive processes may modulate prepotent emotional responses, serving a regulative function critical to driving and shaping human behavior in line with current goals.

Relevance: 100.00%

Abstract:

MFA and LCA methodologies were applied to analyse the anthropogenic aluminium cycle in Italy, focusing on the historical evolution of stocks and flows of the metal, embodied GHG emissions, and recycling potentials, in order to provide Italy with key elements for prioritizing industrial policy toward low-carbon technologies and materials. Historical time series were collected from 1947 to 2009 and balanced with data from the production, manufacturing and waste management of aluminium-containing products, using a 'top-down' approach to quantify the contemporary in-use stock of the metal and helping to identify 'applications where aluminium is not yet being recycled to its full potential and to identify present and future recycling flows'. The MFA results were used as the basis for an LCA aimed at evaluating the evolution of the carbon footprint embodied in Italian aluminium, from primary and electrical energy, the smelting process and transportation. A discussion is also reported of how the main factors of the Kaya Identity influenced the Italian GHG emissions pattern over time, and of the levers available to mitigate it. The contemporary anthropogenic reservoir of aluminium was estimated at about 320 kg per capita, mainly embedded within the transportation and the building and construction sectors. The cumulative in-use stock represents approximately 11 years of supply at current usage rates (about 20 Mt versus 1.7 Mt/year), and would imply a potential of about 160 Mt of CO2eq emission savings. A discussion of criticalities related to aluminium waste recovery from the transportation and the containers and packaging sectors was also included in the study, providing an example of how MFA and LCA may support decision-making at the sectorial or regional level. The research constitutes the first attempt at an integrated approach combining MFA and LCA applied to the aluminium cycle in Italy.
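The Kaya Identity mentioned above decomposes emissions as CO2 = P · (GDP/P) · (E/GDP) · (CO2/E), i.e., population times affluence times energy intensity times carbon intensity, which makes the levers for mitigation explicit. A minimal sketch (units and numbers below are illustrative only, not data from the study):

```python
def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    # Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)
    #   population        P        [persons]
    #   gdp_per_capita    GDP/P    [currency per person]
    #   energy_intensity  E/GDP    [energy per currency unit]
    #   carbon_intensity  CO2/E    [emissions per energy unit]
    # The intermediate factors cancel, leaving total CO2 emissions.
    return population * gdp_per_capita * energy_intensity * carbon_intensity
```

Holding population and affluence fixed, the identity shows that emissions fall only if energy intensity or carbon intensity falls, which is exactly the kind of lever analysis the abstract refers to.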

Relevance: 100.00%

Abstract:

Psychological characterisation of the somatosensory system often focusses on minimal units of perception, such as detection, localisation, and magnitude estimation of single events. Research on how multiple simultaneous stimuli are aggregated to create integrated, synthetic experiences is rarer. This thesis aims to shed light on the mechanisms underlying the integration of multiple simultaneous stimuli, within and between different sub-modalities of the somatosensory system. First, we investigated the ability of healthy individuals to perceive the total intensity of composite somatosensory patterns. We found that the overall intensity of tactile, cold, or warm patterns was systematically overestimated when the multiple simultaneous stimuli had different intensities. Perception of somatosensory totals was biased towards the most salient element in the pattern. Furthermore, we demonstrated that peak-biased aggregation is a genuine perceptual phenomenon which does not rely on discrimination of the parts, but is instead based on the salience of each stimulus. Next, we studied a classical thermal illusion to assess participants' ability to localise thermal stimuli delivered to the fingers either in isolation or in uniform and non-uniform patterns. We found that, despite surprisingly high accuracy in reporting the location of a single stimulus, when participants were presented with non-uniform patterns their ability to identify the thermal state of a specific finger was completely abolished. Lastly, we investigated the perceptual and neural correlates of thermo-nociceptive interaction during the presentation of multiple thermal stimuli. We found that the inhibition of pain by warmth was independent of both the position and the number of thermal stimuli administered.
Our results suggest that nonlinear integration of multiple stimuli, within and between somatosensory sub-modalities, may be an efficient way for the somatosensory system to synthesise the complexity of reality, providing an extended and coherent perception of the world in spite of its severe bandwidth limitations.
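One simple way to express a peak-biased aggregation rule of the kind described above is as a mixture of veridical summation and a total dominated by the most salient element. This parameterization (including the mixing weight w) is purely our illustrative assumption, not the model tested in the thesis:

```python
def peak_biased_total(intensities, w=0.3):
    # Illustrative peak-biased aggregate: a mixture of the true sum and a
    # sum in which every element is replaced by the peak (most salient)
    # intensity. w = 0 gives veridical summation; w > 0 inflates the
    # estimate toward the peak whenever intensities differ.
    n = len(intensities)
    true_total = sum(intensities)
    peak_total = n * max(intensities)
    return (1 - w) * true_total + w * peak_total
```

Note the qualitative match with the finding reported above: for a uniform pattern the rule returns the true total (no bias), while for non-uniform intensities it systematically overestimates, pulled toward the most salient element.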

Relevance: 100.00%

Abstract:

Molecular radiotherapy (MRT) is a fast-developing and promising treatment for metastasised neuroendocrine tumours. The efficacy of MRT is based on the capability to selectively "deliver" radiation to tumour cells while minimizing the dose administered to normal tissues. The outcome of MRT depends on individual patient characteristics; for this reason, personalized treatment planning is important to improve therapy outcomes. Dosimetry plays a key role in this setting, as absorbed dose is the main physical quantity related to radiation effects on cells. Dosimetry in MRT consists of a complex series of procedures, ranging from imaging quantification to dose calculation. This doctoral thesis focused on several aspects of the clinical implementation of absorbed dose calculations in MRT. The accuracy of SPECT/CT quantification was assessed in order to determine the optimal reconstruction parameters. A model of partial volume effect (PVE) correction was developed in order to improve activity quantification in small volumes, such as lesions in clinical patients. Advanced dosimetric methods were compared with the aim of identifying the most accurate modality applicable in clinical routine. Also, for the first time on a large number of clinical cases, the overall uncertainty of tumour dose calculation was assessed. As part of the MRTDosimetry project, protocols for the calibration of SPECT/CT systems and the implementation of dosimetry were drawn up in order to provide standard guidelines to clinics offering MRT. Estimating the risk of radio-toxicity side effects and the chance of inducing damage to neoplastic cells is crucial for patient selection and treatment planning. In this thesis, NTCP and TCP models were derived from clinical data to help clinicians choose the pharmaceutical dosage, balancing therapy control against the limitation of damage to healthy tissues. Moreover, a model for tumour response prediction based on Machine Learning analysis was developed.
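TCP and NTCP curves of the kind derived in the thesis are commonly parameterized as sigmoid functions of absorbed dose. The following is a minimal sketch of a standard logistic TCP form; the parameter names and values are illustrative assumptions, not the thesis's fitted models:

```python
import math

def tcp_logistic(dose, d50, gamma50):
    # Logistic tumour-control-probability curve: TCP rises sigmoidally
    # with absorbed dose. d50 is the dose giving 50% control probability
    # and gamma50 the normalized slope of the curve at that point.
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose / d50)))
```

An NTCP curve for a healthy organ has the same sigmoid shape with its own d50 and slope; treatment planning then seeks a dosage where TCP is high while NTCP stays acceptably low.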

Relevance: 100.00%

Abstract:

Against a backdrop of rapidly increasing worldwide population and growing energy demand, the development of renewable energy technologies has become of primary importance in the effort to reduce greenhouse gas emissions. However, it is often technically and economically infeasible to transport discontinuous renewable electricity over long distances to the shore. Another shortcoming of non-programmable renewable power is the difficulty of integrating it into the onshore grid without affecting the dispatching process. On the other hand, the offshore oil & gas industry is striving to reduce the overall carbon footprint of onsite power generation and to limit the large expenses associated with carrying electricity from remote offshore facilities. Furthermore, the increased complexity of offshore hydrocarbon operations and their expansion towards challenging areas call for greater attention to safety and environmental protection issues arising from major accident hazards. Innovative hybrid energy systems, such as Power-to-Gas (P2G), Power-to-Liquid (P2L) and Gas-to-Power (G2P) options, implemented at offshore locations, would offer the opportunity to overcome challenges in both the renewable and the oil & gas sectors. This study aims at the development of systematic methodologies, based on proper sustainability and safety performance indicators, supporting the choice among P2G, P2L and G2P hybrid energy options for offshore green projects in early design phases. An in-depth analysis of the different offshore hybrid strategies was performed. Literature reviews were carried out on existing methods proposing metrics to assess the sustainability of hybrid energy systems, the inherent safety of process routes in the conceptual design stage, and the environmental protection of installations from oil and chemical accidental spills.
To fill the gaps, a suite of specific decision-making methodologies was developed, based on representative multi-criteria indicators addressing technical, economic, environmental and societal aspects of alternative options. A set of five case-studies was defined, covering different offshore scenarios of concern, to provide an assessment of the effectiveness and value of the developed tools.

Relevance: 100.00%

Abstract:

The steadily growing immigration phenomenon in today's Japan is producing a tangible and expanding presence of immigrant-origin youths residing in the country. International research in migration studies has underlined the importance of focusing on immigrant-origin youths to shed light on the way immigrants incorporate into countries of destination. Indeed, immigrants' offspring, the adults of tomorrow, embody the interlocutors between first-generation immigrants and the receiving societal context. The extent of the presence of immigrants' children in countries of destination is also a reliable yardstick for assessing the maturation of the migration process, transforming it from a temporary phenomenon into long-term settlement. Within this framework, the school is a privileged site for observing and analyzing immigrant-origin youths' integration. Alongside family and peers, school constitutes one of the main agents of socialization. Here, children learn norms and rules and acquire the tools necessary to eventually compete in the pursuit of an occupation, determining their future socioeconomic standing. This doctoral research aims to identify which theoretical model articulated in the area of migration studies best describes the adaptation process of immigrant-origin youths in Japan. In particular, it examines whether (and to what extent) any of the pre-existing frameworks can help explain the circumstances occurring in Japan, or whether further elaboration and adjustment are needed. Alternatively, it considers whether it is necessary to produce a new model based on the peculiarities of the Japanese social context. This study provides a theoretically oriented contribution to the (mainly descriptive but maturing) literature on immigrant-origin youths' integration in Japan.
Considering past growth trends of Japanese immigration and its expanding prospective projections (Korekawa 2018c), this study might be considered pioneering for future development of the phenomenon.

Relevance: 100.00%

Abstract:

Sandy coasts represent vital areas whose preservation and maintenance involve economic and tourist interests. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly realized by combining engineering solutions with management policies that evaluate their effects over time. Monitoring activities represent the fundamental instrument for obtaining a deep knowledge of the investigated phenomenon. Thanks to technological development, several possibilities are available both in terms of geomatic surveying techniques and processing tools, making it possible to reach high performance and accuracy. Nevertheless, when the littoral under study includes both emerged and submerged beaches, several issues have to be considered. The geomatic surveys and all the following steps therefore need to be calibrated according to the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study provides an evaluation of the available geomatic techniques, processing approaches, and derived products, aiming at optimising the entire workflow of coastal monitoring through an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an added value for the obtained products while ensuring proper data management. This perspective can represent a helpful instrument for properly planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool.
Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (Regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.