15 results for Hyper-heuristics
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This dissertation introduces and develops a new method of rational reconstruction called structural heuristics. Structural heuristics takes the assignment of structure to any given object of investigation as the starting point for its rational reconstruction. This means looking at any given object as a system of relations and of transformation laws for those relations. The operational content of this heuristics can be summarized as follows: when facing any given system, the best way to approach it is to look explicitly for a possible structure of it. The use of structural heuristics fosters structural awareness, which is considered a fundamental epistemic disposition as well as a fundamental condition for the rational reconstruction of systems of knowledge. In this dissertation, structural heuristics is applied to reconstructing the domain of economic knowledge. This is done by exploring four distinct areas of economic research: (i) economic axiomatics; (ii) realism in economics; (iii) production theory; (iv) economic psychology. The application of structural heuristics to these fields of economic inquiry shows its flexibility and potential as an epistemic tool for theoretical exploration and reconstruction.
Abstract:
The main object of this thesis is the analysis and quantization of spinning particle models which employ extended "one-dimensional supergravity" on the worldline, and their relation to the theory of higher spin fields (HS). In the first part of this work we describe the classical theory of massless spinning particles with an SO(N) extended supergravity multiplet on the worldline, in flat and, more generally, in maximally symmetric backgrounds. These (non)linear sigma models describe, upon quantization, the dynamics of particles with spin N/2. We then carefully analyze the quantization of spinning particles with SO(N) extended supergravity on the worldline, for every N and in every dimension D. The physical sector of the Hilbert space reveals an interesting geometrical structure: the generalized higher spin curvature (HSC). We show, in particular, that these models of spinning particles describe a subclass of HS fields whose equations of motion are conformally invariant at the free level; in D = 4 this subclass describes all massless representations of the Poincaré group. In the third part of this work we consider the one-loop quantization of SO(N) spinning particle models by studying the corresponding partition function on the circle. After gauge fixing of the supergravity multiplet, the partition function reduces to an integral over the corresponding moduli space, which has been computed using orthogonal polynomial techniques. Finally, we extend our canonical analysis, described previously for flat space, to maximally symmetric target spaces (i.e. (A)dS backgrounds). The quantization of these models produces (A)dS HSC as the physical states of the Hilbert space; we use an iterative procedure and Pochhammer functions to solve the differential Bianchi identity in maximally symmetric spaces.
Motivated by the correspondence between SO(N) spinning particle models and HS gauge theory, and by the notorious difficulty one finds in constructing an interacting theory for fields with spin greater than two, we use these one-dimensional supergravity models to study and extract information on HS. In the last part of this work we construct spinning particle models with sp(2) R-symmetry, coupled to Hyper-Kähler and Quaternionic-Kähler (QK) backgrounds.
Abstract:
This research argues for an analysis of textual and cultural forms in the American horror film (1968-1998), by defining the so-called postmodern characters. The term "postmodern" will not denote a period in the history of cinema, but a series of forms and strategies recognizable in many American films. From a bipolar re-mediation and cognitive point of view, the postmodern phenomenon has been considered as a formal and epistemological re-configuration of the cultural "modern" system. The first section of the work examines theoretical problems about the "postmodern phenomenon" by defining its cultural and formal constants in different areas (epistemology, economy, mass media): convergence, fragmentation, manipulation and immersion represent the former, while "excess" is the morphology of the change, realizing the "fluctuation" of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The "classic narrative" structure, a coherent and consequent chain of causal cues toward a conclusion, is scattered by the postmodern constant of "fragmentation". New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of "transcoding" and "remediation" between media, and the principle of "convergence" in the phenomenon, the essay aims to define these structures in postmodern film as "database forms" and "navigable space forms." The third section applies this classification to the American horror film (1968-1998).
The formal constant of "excess" in the horror genre works on the paradigm of "vision": if postmodern film shows a crisis of "truth" in vision, in horror movies the excess of vision becomes "hyper-vision", that is, the "multiplication" of death/blood/torture visions, and "intra-vision", which shows the impossibility of distinguishing the "real" vision from the virtual/imaginary one. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the "database-accumulation" forms, where the events result from a very simple "remote cause" serving as a pretext (as in Night of the Living Dead); and the "database-catalogue" forms, where the events follow one another displaying a "central" character or theme. In the latter case, the catalogue syntagms are either connected by "consecutive" elements, building stories linked by the actions of a single character (usually the killer), or connected as non-consecutive episodes around a general theme: examples of the first kind are built on the model of The Wizard of Gore; of the second, on films such as Mario Bava's I tre volti della paura. The "navigable space" forms are defined as: hyperlink a, where one universe fluctuates between reality and dream, as in Rosemary's Baby; hyperlink b, where two non-hierarchical universes converge, the first one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separated but become contiguous in the last sequence, as in Targets; the last form, navigable-loop, includes a textual line which suddenly stops and starts again, reflecting the pattern of a "loop" (as in Lost Highway). This essay analyses in detail the organization of "visual space" in postmodern horror film by tracing representative patterns. It concludes by examining the "convergence" of technologies and cognitive structures of cinema and new media.
Abstract:
Allergies are a complex of symptoms derived from altered IgE-mediated reactions of the immune system towards substances known as allergens. Allergic sensitization can be of food or respiratory origin and, in particular, apple and hazelnut allergens have been identified in pollens or fruits. Allergic cross-reactivity can occur in a patient reacting to similar allergens from different origins, justifying research in both systems: in Europe a greater number of people suffer from apple fruit allergy, but little evidence exists about pollen. Apple fruit allergies are due to four different classes of allergens (Mal d 1, 2, 3, 4), whose allergenicity is related both to genotype and to tissue specificity; therefore I have investigated their presence also in pollen at different times of germination to clarify the allergenic potential of apple pollen. I have observed that the same four classes of allergens found in fruit are also expressed, at different levels, in pollen; their presence supports the view that apple pollen can be as allergenic as the fruit, suggesting that apple allergy could also be caused indirectly by sensitization to pollen. Climate changes resulting from increases in temperature and air pollution influence pollen allergenicity, responsible for the dramatic rise in respiratory allergies (hay fever, bronchial asthma, conjunctivitis). Although the link between climate change and pollen allergenicity is proven, the underlying mechanism is little understood. Transglutaminases (TGases), a class of enzymes able to post-translationally modify proteins, are activated under stress and involved in some inflammatory responses, enhancing the activity of pro-inflammatory phospholipase A2, suggesting a role in allergies.
Recently, a calcium-dependent TGase activity has been identified in the pollen cell wall, raising the possibility that pollen TGase may have a role in the modification of the pollen allergens reported above, thus stabilizing them against proteases. This enzyme can also be involved in the transamidation of proteins present in the human mucosa interacting with the pollen surface or, finally, the enzyme itself can represent an allergen, as suggested by studies on celiac disease. I have hypothesized that this pollen enzyme can be affected by climate changes and be involved in exacerbating the allergy response. The data presented in this thesis represent a scientific basis for future studies devoted to verifying the hypothesis set out here. First, I have demonstrated by laser confocal microscopy the presence of an extracellular TGase on the surface of the grain, observed at both the apical and the proximal parts of the pollen tube (Iorio et al., 2008), which plays an essential role in apple pollen-tube growth, as suggested by the arrest of tube elongation by TGase inhibitors such as EGTA or R281. Its involvement in pollen-tube growth is further confirmed by the activity and gene-expression data, because TGase showed a peak between 15 min and 30 min of germination, when this process is well established, and an optimal pH around 6.5, which is close to that recorded for the germination medium. Moreover, the data show that pollen TGase may be a glycoprotein, as its glycosylation profile is linked both with the activation of the enzyme and with its localization at the pollen cell wall during germination: the data presented suggest that the active form of TGase involved in pollen-tube growth and pollen-stylar interaction is more exposed and more weakly bound to the cell wall.
Interestingly, TGase interacts with fibronectin (FN), a putative SAM or psECM component, possibly inducing intracellular signal transduction during the pollen-stylar interaction occurring in the germination process, since a protein immunorecognized by an anti-FN antibody is also present in pollen, in particular at the level of the pollen grain cell wall in a punctate pattern, but also along the shank of the pollen-tube wall, in a pattern that recalls the signal obtained with the anti-TGase antibody. FN represents a good substrate for the enzyme activity, better than DMC, usually used as the standard substrate for animal TGase. Thus, this pollen enzyme, necessary for germination, is exposed on the pollen surface and consequently can easily interact with mucosal proteins, as germinated pollen has been found in studies conducted on human mucus (Forlani, personal communication). I have obtained data showing that TGase activity increases remarkably when pollen is exposed to stressful conditions, such as climate changes and environmental pollution. I have used two different species of pollen, an aero-allergenic pollen (hazelnut, Corylus avellana), whose allergenicity is well documented, and an entomophilous pollen (apple, Malus domestica), which is not yet well characterized, to compare data on their mechanisms of response to stressors. The two pollens were exposed to climate changes (different temperatures, relative humidity (rH), acid rain at pH 5.6 and copper pollution (3.10 µg/l)) and showed an increase in pollen-surface TGase activity that is not accompanied by an induced expression of the TGase protein immunoreactive with AtPNG1p.
Probably, climate changes induce an alteration of or damage to the pollen cell wall that causes the pollen grains to release their content, including the TGase enzyme, into the medium, where it is free to carry out its function, as confirmed by the immunolocalization and in situ TGase activity assay data; morphological examination indicated pollen damage, significantly reduced viability and, under acid-rain conditions, an early germination of apple pollen, thus possibly enhancing TGase exposure on the pollen surface. Several pollen proteins were post-translationally modified, as was mammalian sPLA2, especially with Corylus pollen, resulting in its activation and potentially altering pollen allergenicity and inflammation. Pollen TGase activity mimicked the behaviour of gpl TGase and AtPNG1p in the stimulation of sPLA2, even if the regulatory mechanism seems different from that of gpl TGase, because pollen TGase favours intermolecular cross-linking between various molecules of sPLA2, giving rise to high-molecular-weight protein networks that are normally more stable. In general, the pollens exhibited a significant endogenous phospholipase activity, and differences have been observed according to the allergenic (Corylus) or not-well-characterized allergenic (Malus) attitude of the pollen. However, even if with different levels of activation, the pollen enzymes share the ability to activate sPLA2, thus suggesting an important regulatory role in the activation of a key enzyme of the inflammatory response, among which my interest was addressed to pollen allergy. In conclusion, from all the data presented, mainly the presence of allergens, the presence of an extracellular TGase, the increase in its activity following exposure to environmental pollution, and PLA2 activation, I conclude that Malus pollen can also behave as potentially allergenic.
The mechanisms described here that could affect the allergenicity of pollen may be the same as those occurring in fruit, paving the way for future studies on the identification of hyper- and hypo-allergenic cultivars, on preventing the effects of environmental stressors and, possibly, on the production of transgenic plants.
Abstract:
In this thesis we study three combinatorial optimization problems belonging to the classes of Network Design and Vehicle Routing problems, which are strongly linked in the context of the design and management of transportation networks: the Non-Bifurcated Capacitated Network Design Problem (NBP), the Period Vehicle Routing Problem (PVRP) and the Pickup and Delivery Problem with Time Windows (PDPTW). These problems are NP-hard and contain as special cases some well-known difficult problems such as the Traveling Salesman Problem and the Steiner Tree Problem. Moreover, they model the core structure of many practical problems arising in logistics and telecommunications. The NBP is the problem of designing the optimum network to satisfy a given set of traffic demands. Given a set of nodes, a set of potential links and a set of point-to-point demands called commodities, the objective is to select the links to install and dimension their capacities so that all the demands can be routed between their respective endpoints, and the sum of link fixed costs and commodity routing costs is minimized. The problem is called non-bifurcated because the solution network must allow each demand to follow a single path, i.e., the flow of each demand cannot be split. Although this is the case in many real applications, the NBP has received significantly less attention in the literature than other capacitated network design problems that allow bifurcation. We describe an exact algorithm for the NBP, based on solving with an integer programming solver a formulation of the problem strengthened by simple valid inequalities, and four new heuristic algorithms. One of these heuristics is an adaptive memory metaheuristic, based on partial enumeration, that could be applied to a wider class of structured combinatorial optimization problems. In the PVRP, a fleet of vehicles of identical capacity must be used to service a set of customers over a planning period of several days.
Each customer specifies a service frequency, a set of allowable day-combinations and a quantity of product that the customer must receive every time he is visited. For example, a customer may require to be visited twice during a 5-day period, imposing that these visits take place on Monday-Thursday, Monday-Friday or Tuesday-Friday. The problem consists in simultaneously assigning a day-combination to each customer and designing the vehicle routes for each day so that each customer is visited the required number of times, the number of routes on each day does not exceed the number of vehicles available, and the total cost of the routes over the period is minimized. We also consider a tactical variant of this problem, called the Tactical Planning Vehicle Routing Problem, where customers require to be visited on a specific day of the period but a penalty cost, called service cost, can be paid to postpone the visit to a day later than that required. To our knowledge, all the algorithms proposed in the literature for the PVRP are heuristics. In this thesis we present for the first time an exact algorithm for the PVRP, based on different relaxations of a set partitioning-like formulation. The effectiveness of the proposed algorithm is tested on a set of instances from the literature and on a new set of instances. Finally, the PDPTW consists in servicing a set of transportation requests using a fleet of identical vehicles of limited capacity located at a central depot. Each request specifies a pickup location and a delivery location and requires that a given quantity of load be transported from the pickup location to the delivery location. Moreover, each location can be visited only within an associated time window.
Each vehicle can perform at most one route, and the problem is to satisfy all the requests using the available vehicles so that each request is serviced by a single vehicle, the load on each vehicle does not exceed the capacity, and all locations are visited within their time windows. We formulate the PDPTW as a set partitioning-like problem with additional cuts, and we propose an exact algorithm based on different relaxations of the mathematical formulation as well as a branch-and-cut-and-price algorithm. The new algorithm is tested on two classes of problems from the literature and compared with a recent branch-and-cut-and-price algorithm from the literature.
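The set partitioning-like formulations mentioned above share one idea: choose, from a pool of candidate routes, a minimum-cost subset that covers each customer exactly once. A minimal brute-force sketch of that core model is given below; it is not the thesis's column-generation algorithm, and the toy routes and costs are invented for illustration.

```python
from itertools import combinations

# Toy set-partitioning instance: each candidate route covers a subset of
# customers at a given cost; select routes so that every customer is
# covered exactly once at minimum total cost. (Routes/costs are invented.)
customers = {1, 2, 3, 4}
routes = [
    ({1, 2}, 10.0),
    ({3, 4}, 12.0),
    ({1, 3}, 9.0),
    ({2, 4}, 11.0),
    ({1, 2, 3, 4}, 25.0),
]

def solve_set_partitioning(customers, routes):
    """Exhaustively enumerate route subsets (fine only for tiny pools)."""
    best_cost, best_sel = float("inf"), None
    for r in range(1, len(routes) + 1):
        for sel in combinations(range(len(routes)), r):
            covered = [c for i in sel for c in routes[i][0]]
            # Feasibility: every customer covered exactly once, no overlap.
            if sorted(covered) == sorted(customers):
                cost = sum(routes[i][1] for i in sel)
                if cost < best_cost:
                    best_cost, best_sel = cost, sel
    return best_cost, best_sel
```

On this instance the routes {1, 3} and {2, 4} partition the customers at cost 20.0, beating both the pairings {1, 2}/{3, 4} (22.0) and the single long route (25.0). Exact algorithms like those in the thesis avoid this enumeration by pricing out routes via relaxations.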
Abstract:
Bifidobacteria constitute up to 3% of the total microbiota and represent one of the most important health-promoting bacterial groups of the human intestinal microflora. The presence of Bifidobacterium in the human gastrointestinal tract has been directly related to several health-promoting activities; however, to date, no information about the specific mechanisms of interaction with the host is available. The first health-promoting activity studied in this work was the oxalate-degrading activity. Oxalic acid occurs extensively in nature and plays diverse roles, especially in pathological processes. Due to its highly oxidizing effects, hyper-absorption or abnormal synthesis of oxalate can cause serious acute disorders in mammals and be lethal in extreme cases. Intestinal oxalate-degrading bacteria could therefore be pivotal in maintaining oxalate homeostasis, reducing the risk of kidney stone development. In this study, the oxalate-degrading activity of 14 bifidobacterial strains was measured by a capillary electrophoresis technique. The oxc gene, encoding oxalyl-CoA decarboxylase, a key enzyme in oxalate catabolism, was isolated by probing a genomic library of B. animalis subsp. lactis BI07, which was one of the most active strains in the preliminary screening. The genetic and transcriptional organization of the oxc flanking regions was determined, unravelling the presence of two other independently transcribed open reading frames, potentially responsible for the ability of B. animalis subsp. lactis to degrade oxalate. Transcriptional analysis, using real-time quantitative reverse transcription PCR, revealed that these genes were highly induced in cells first adapted to subinhibitory concentrations of oxalate and then exposed to pH 4.5. Acidic conditions were also a prerequisite for a significant oxalate degradation rate, which dramatically increased in oxalate pre-adapted cells, as demonstrated in fermentation experiments with different pH-controlled batch cultures.
These findings provide new insights into the characterization of oxalate-degrading probiotic bacteria and may support the use of B. animalis subsp. lactis as a promising adjunct for the prophylaxis and management of oxalate-related kidney disease. In order to provide some insight into the molecular mechanisms involved in the interaction with the host, in the second part of this work we investigated whether Bifidobacterium was able to capture human plasminogen on the cell surface. The binding of human plasminogen to Bifidobacterium was dependent on lysine residues of surface protein receptors. Using a proteomic approach, we identified six putative plasminogen-binding proteins in the cell wall fraction of three strains of Bifidobacterium. The data suggest that plasminogen binding to Bifidobacterium is due to the concerted action of a number of proteins located on the bacterial cell surface, some of which are highly conserved cytoplasmic proteins that have other essential cellular functions. Our findings represent a step forward in understanding the mechanisms involved in the Bifidobacterium-host interaction. In this work we also studied a new approach based on MALDI-TOF MS to measure the interaction between entire bacterial cells and host molecular targets. MALDI-TOF (Matrix Assisted Laser Desorption Ionization-Time of Flight) mass spectrometry has been applied, for the first time, to the investigation of the interaction between whole Bifidobacterium cells and host target proteins. In particular, by means of this technique, a dose-dependent human plasminogen-binding activity has been shown for Bifidobacterium. The involvement of lysine binding sites on the bacterial cell surface has been proved. The obtained results were found to be consistent with those from well-established standard methodologies; thus the proposed MALDI-TOF approach has the potential to become a fast alternative method in the field of biorecognition studies involving bacterial cells and proteins of human origin.
Abstract:
Fog oases, locally named Lomas, are distributed in a fragmented way along the western coast of Chile and Peru (South America) between ~6°S and 30°S, following an altitudinal gradient determined by a fog layer. This fragmentation has been attributed to the hyper-aridity of the desert. However, periodic climatic events influence the 'normal seasonality' of this ecosystem through a higher-than-average water input that triggers plant responses (e.g. primary productivity and phenology). The impact of the climatic oscillation may vary according to the season (wet/dry). This thesis evaluates the potential effect of climate oscillations, such as the El Niño Southern Oscillation (ENSO), through the analysis of the vegetation of this ecosystem, following different approaches. Chapters two and three present the analysis of fog oases along the Peruvian and Chilean deserts. The objectives are: 1) to explain the floristic connection of fog oases by analysing their differences in taxa composition and the phylogenetic affinities among them; 2) to explore the climate variables related to ENSO which likely affect fog production, and the responses of Lomas vegetation (composition, productivity, distribution) to climate patterns during ENSO events. Chapters four and five describe a fog oasis in southern Peru during the 2008-2010 period. The objectives are: 3) to describe and create a new vegetation map of the Lomas vegetation using remote sensing analysis supported by field survey data; and 4) to identify the vegetation change during the dry season. The first part of our results shows that: 1) there are three significantly different groups of Lomas (Northern Peru, Southern Peru, and Chile), with a significant phylogenetic divergence among them. The species composition reveals a latitudinal gradient of plant assemblages. The species origins, growth-form typologies, and geographic position also reinforce the differences among groups.
2) Contradictory results have emerged from studies of low-cloud anomalies and fog collection during El Niño (EN). EN increases water availability in fog oases when fog should be less frequent due to the reduction in the amount of low clouds and stratocumulus. Because a minor role of fog during EN is expected, it is likely that measurements of fog-water collection during EN are capturing drizzle and fog at the same time. Although recent studies on fog oases have shown some relationship with ENSO, the reported responses of vegetation have been largely based on descriptive data, and the absence of long temporal records limits the establishment of a direct relationship with climatic oscillations. The second part of the results shows that: 3) five classes of distinct spectral values correspond to the main land covers of the Lomas using a Vegetation Index (VI). The study case is characterised by shrubs and trees with variable cover (dense, semi-dense and open). A secondary area is covered by small shrubs where the dominant tree species is not present. The cacti area and the old terraces with open vegetation were not identified with the VI. Agriculture is present in the area. Finally, 4) contrary to the dry seasons of 2008 and 2009, a higher VI was obtained during the dry season of 2010. The VI increased up to three times its average value, showing a clear spectral signal change, which coincided with the ENSO event of that period.
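The classification of land cover from VI thresholds described above can be illustrated with a minimal sketch. The abstract does not specify which index was used, so NDVI is assumed here, and the per-pixel reflectances and class thresholds are invented for illustration.

```python
# A minimal sketch of a vegetation-index classification, assuming NDVI
# (the thesis only says "Vegetation Index (VI)"); the NIR/red reflectances
# and class thresholds below are invented for illustration.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

def cover_class(vi):
    """Coarse land-cover class from a VI value (thresholds are assumptions)."""
    if vi >= 0.6:
        return "dense vegetation"
    if vi >= 0.3:
        return "semi-dense or open vegetation"
    return "bare soil or sparse cover"

# Toy 5-pixel scene: near-infrared and red reflectances per pixel.
nir = [0.45, 0.40, 0.30, 0.22, 0.20]
red = [0.05, 0.10, 0.15, 0.18, 0.19]
classes = [cover_class(ndvi(n, r)) for n, r in zip(nir, red)]
```

In a real workflow the same per-pixel computation would run over whole satellite bands, and a three-fold VI increase like the one reported for 2010 would shift pixels across these class boundaries.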
Abstract:
One of the most interesting challenges of the coming years will be the automation of Air Space Systems. This process will involve different aspects, such as Air Traffic Management, Aircraft and Airport Operations, and Guidance and Navigation Systems. The use of UAS (Uninhabited Aerial Systems) for civil missions will be one of the most important steps in this automation process. In civil air space, Air Traffic Controllers (ATC) manage the air traffic, ensuring that a minimum separation between the controlled aircraft is always provided. For this purpose ATCs use several operative avoidance techniques, such as holding patterns or rerouting. The use of UAS in this context will require the definition of strategies for a common management of piloted and unpiloted air traffic that allow the UAS to self-separate. As a first employment in civil air space we consider a UAS surveillance mission that consists in departing from a ground base, taking pictures over a set of mission targets and coming back to the same ground base. During the whole mission a set of piloted aircraft fly in the same airspace, and thus the UAS has to self-separate using the ATC avoidance techniques mentioned above. We consider two objectives: the first consists in minimizing the impact of the air traffic on the mission; the second consists in minimizing the impact of the mission on the air traffic. A particular version of the well-known Travelling Salesman Problem (TSP), called the Time-Dependent TSP (TDTSP), has been studied to deal with traffic problems in big urban areas. Its basic idea is that the cost of the route between two clients depends on the period of the day in which it is traversed. Our thesis argues that this idea can be applied to air traffic too, using a convenient time horizon compatible with aircraft operations.
The cost of a UAS sub-route will depend on the air traffic that the UAS will meet when starting that route at a specific moment, and consequently on the avoidance maneuver that it will use to avoid the resulting conflict. Conflict avoidance is a topic that has been extensively developed in past years using different approaches. In this thesis we propose a new approach based on the use of ATC operative techniques that makes it possible both to model the UAS problem in a TDTSP framework and to adopt an Air Traffic Management perspective. Starting from this kind of mission, the problem of UAS insertion in civil air space is formalized as the UAS Routing Problem (URP). For this purpose we introduce a new structure, called the Conflict Graph, that makes it possible to model the avoidance maneuvers and to define the arc cost as a function of the departing time. Two Integer Linear Programming formulations of the problem are proposed. The first is based on a TDTSP formulation that, unfortunately, is weaker than the TSP formulation. Thus a new formulation based on a TSP variation that uses specific penalties to model the holdings is proposed. Different algorithms are presented: exact algorithms, simple heuristics used as upper bounds on the number of time steps used, and metaheuristic algorithms such as a Genetic Algorithm and Simulated Annealing. Finally, an air traffic scenario has been simulated using real air traffic data in order to test our algorithms. Graphic tools have been used to represent the Milano Linate air space and its air traffic during different days. Such data have been provided by ENAV S.p.A. (Italian Agency for Air Navigation Services).
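The defining feature of the TDTSP used above, an arc cost that depends on the departure time, can be sketched in a few lines. This is not the thesis's Conflict Graph model: the two-period "peak/off-peak" cost tables below are invented stand-ins for the traffic-dependent avoidance costs, and the instance is solved by brute force.

```python
from itertools import permutations

# Toy Time-Dependent TSP: the travel cost of arc (i, j) depends on the
# departure time step, mimicking time-varying traffic. The two cost
# tables (peak / off-peak) are invented for illustration.
N = 4          # nodes; node 0 is the depot (ground base)
PERIOD = 10    # departures at time t < PERIOD use the "peak" table

cost_peak = [[0, 5, 9, 4], [5, 0, 3, 8], [9, 3, 0, 2], [4, 8, 2, 0]]
cost_off  = [[0, 2, 6, 3], [2, 0, 2, 5], [6, 2, 0, 1], [3, 5, 1, 0]]

def arc_cost(i, j, t):
    """Cost of traversing arc (i, j) when departing node i at time t."""
    table = cost_peak if t < PERIOD else cost_off
    return table[i][j]

def solve_tdtsp():
    """Brute-force the best depot-to-depot tour, accumulating time."""
    best = (float("inf"), None)
    for perm in permutations(range(1, N)):
        tour, t = (0,) + perm + (0,), 0
        for i, j in zip(tour, tour[1:]):
            t += arc_cost(i, j, t)   # departure time = arrival time at i
        best = min(best, (t, tour))
    return best
```

Note that, unlike the plain TSP, reversing a tour here generally changes its cost, because later arcs are evaluated against different traffic periods; this is exactly what makes the TDTSP formulation harder to strengthen.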
Abstract:
One of the ways in which the legal system has responded to different sets of problems is the blurring of the traditional boundaries of criminal law, both procedural and substantive. This study aims to explore under what conditions this trend leads to an improvement of society's welfare, by focusing on two distinguishing sanctions of criminal law: incarceration and social stigma. In analyzing how incarceration affects an individual's incentive to violate a legal standard, we considered the crucial role of the time constraint. This aspect has not been fully explored in the law and economics literature, especially with respect to the analysis of the relative merits of imposing either a fine or a prison term. We observed that when individuals are heterogeneous with respect to wealth and wage income, and when the level of activity can be considered a normal good, only the middle-wage and middle-income groups can be adequately deterred by a regime of fixed fines alone. The existing literature only considers the case of the very poor, deemed judgment proof. However, since imprisonment is a socially costly way to deprive individuals of their time, other alternatives may be sought, such as the imposition of discriminatory monetary fines, partial incapacitation and other alternative sanctions. According to traditional legal theory, the reason why criminal law is obeyed is not mainly the monetary sanctions but the stigma arising from the community's moral condemnation that accompanies conviction or mere suspicion. However, it is not sufficiently clear whether social stigma always accompanies a criminal conviction. We addressed this issue by identifying the circumstances in which a criminal conviction carries an additional social stigma.
Our results show that social stigma accompanies a conviction under the following conditions: first, when the law coincides with society's social norms; and second, when the prohibited act provides information on an unobservable attribute or trait of an individual that is crucial in establishing or maintaining social relationships beyond mere economic relationships. Thus, even if the social planner does not impose the social sanction directly, the impact of social stigma can still be influenced by the probability of conviction and the level of the monetary fine imposed, as well as by the varying degree of correlation between the legal standard violated and the social traits or attributes of the individual. In this respect, criminal law serves as an institution that facilitates cognitive efficiency in the process of imposing the social sanction, to the extent that the rest of society is boundedly rational and uses judgment heuristics. Paradoxically, using criminal law to invoke stigma for the violation of a legal standard may also serve to undermine its strength. To sum up, the results of our analysis reveal that the scope of criminal law is narrow both for the purposes of deterrence and for cognitive efficiency. While there are certain conditions under which the enforcement of criminal law may lead to an increase in social welfare, particularly with respect to incarceration and stigma, we have also identified the channels through which these sanctions can affect behavior. Since such mechanisms can be replicated in less costly ways, society should first seek to employ those alternatives, turning to criminal law only as a last resort.
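The judgment-proofness argument underlying the fixed-fine discussion above follows the standard law-and-economics deterrence condition: an individual violates the standard when the illicit gain exceeds the expected sanction, and a monetary fine can never collect more than the offender's wealth. A minimal sketch, with all numbers invented for illustration (this is not the thesis's full model, which also accounts for wage income and the time cost of imprisonment):

```python
# A minimal sketch of the deterrence condition discussed above: a violation
# occurs when the gain exceeds the expected sanction, and the collectible
# fine is capped by the offender's wealth (judgment-proofness).
# All numeric values are invented for illustration.
def deterred(gain, p, fine, wealth):
    """True if the expected collectible fine deters the violation."""
    expected_sanction = p * min(fine, wealth)
    return gain <= expected_sanction

p, fine, gain = 0.5, 100.0, 40.0

# A poor (judgment-proof) individual cannot be deterred by the fixed fine,
# while a wealthier one can.
poor_deterred = deterred(gain, p, fine, wealth=50.0)    # 0.5 * 50 = 25 < 40
rich_deterred = deterred(gain, p, fine, wealth=200.0)   # 0.5 * 100 = 50 >= 40
```

This captures only the lower end of the heterogeneity result; the thesis's finding that high-income groups are also under-deterred comes from modelling activity as a normal good, which is outside this sketch.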
Resumo:
In this study, new tomographic models of Colombia were calculated using the seismicity recorded by the Colombian seismic network during the period 2006-2009. In this period, the improvement of the seismic network yields more stable hypocentral results with respect to older data sets and makes it possible to compute new 3D Vp and Vp/Vs models. The final dataset consists of 10813 P- and 8614 S-arrival times associated with 1405 earthquakes. Tests with synthetic data and resolution analysis indicate that the velocity models are well constrained in central, western, and southwestern Colombia down to a depth of 160 km; resolution is poor in northern Colombia and close to Venezuela due to a lack of seismic stations and seismicity. The tomographic models and the relocated seismicity indicate the existence of E-SE subducting Nazca lithosphere beneath central and southern Colombia. North-south changes in the Wadati-Benioff zone, in the Vp and Vp/Vs patterns, and in volcanism show that the downgoing plate is segmented by E-W directed slab tears, suggesting the presence of three sectors. Earthquakes in the northernmost sector account for most of the Colombian seismicity and are concentrated in the 100-170 km depth interval, beneath the Eastern Cordillera. Here massive dehydration is inferred, resulting from a delay in the eclogitization of a thickened oceanic crust in a flat-subduction geometry. In this sector, a cluster of intermediate-depth seismicity (the Bucaramanga Nest) is present beneath the elbow of the Eastern Cordillera, interpreted as the result of a massive and highly localized dehydration phenomenon caused by a hyper-hydrous oceanic crust. The central and southern sectors, although different in Vp pattern, conversely show a continuous, steep, and more homogeneous Wadati-Benioff zone with overlying volcanic areas. Here a "normal-thickness" oceanic crust is inferred, allowing gradual and continuous metamorphic reactions to take place with depth and enabling fluid migration towards the mantle wedge.
Resumo:
To handle natural disasters, emergency areas are often designated across the territory, close to populated centres. Rescue services located in these areas respond with resources and materials for population relief. A method for the automatic positioning of these centres in the event of a flood or an earthquake is presented. The positioning procedure consists of two distinct parts, developed by the research group of Prof Michael G. H. Bell of Imperial College, London, and refined and applied to real cases at the University of Bologna under the coordination of Prof Ezio Todini. Certain requirements must be observed, such as the maximum number of rescue points and the number of people involved. Initially, the candidate points are chosen from those proposed by the local civil protection services. We then calculate all possible routes from each candidate rescue point to all other points, generally using the concept of the "hyperpath", namely a set of paths, each of which may be optimal. The attributes of the road network are of fundamental importance, both for calculating the ideal distance and for estimating possible delays due to the event, measured in travel-time units. In a second phase, the distances are used to decide the optimal rescue-point positions using heuristics. This second part works by "elimination": in the beginning, all points are considered rescue centres; at every iteration we delete one point and calculate the impact its removal creates. In each case, we delete the point that creates the least impact, until we reach the number of rescue centres we wish to keep.
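The elimination procedure described above amounts to a greedy drop heuristic. The following is a minimal sketch of that idea, not the actual implementation from the thesis: all names are hypothetical, and the hyperpath travel times are assumed to have already been reduced to a simple centre-to-point distance table.

```python
def eliminate_rescue_centres(candidates, demand_points, dist, k):
    """Greedy elimination: start with all candidate rescue centres open,
    then repeatedly close the centre whose removal increases the total
    population-weighted access cost the least, until only k remain.

    candidates:    list of candidate centre ids
    demand_points: dict {point_id: population}
    dist:          dict {(centre_id, point_id): travel time}
                   (e.g. expected hyperpath travel times)
    k:             number of rescue centres to keep
    """
    open_centres = set(candidates)

    def total_cost(centres):
        # Each demand point is served by its nearest open centre.
        return sum(pop * min(dist[(c, p)] for c in centres)
                   for p, pop in demand_points.items())

    while len(open_centres) > k:
        # Tentatively close each centre and measure the resulting impact.
        best_centre, best_cost = None, float("inf")
        for c in open_centres:
            cost = total_cost(open_centres - {c})
            if cost < best_cost:
                best_centre, best_cost = c, cost
        # Permanently close the centre whose removal hurts least.
        open_centres.remove(best_centre)

    return open_centres
```

With three candidates and two demand points, the procedure closes whichever centre contributes least to coverage first, stopping once the desired number of centres remains.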
Resumo:
In recent years, the energy question has taken on a central role in the global debate, in relation to four main factors: the non-reproducibility of natural resources, the exponential growth of consumption, economic interests, and the safeguarding of the environmental and climatic equilibrium of our planet. It is therefore necessary to change the model of energy production and consumption, above all in cities, where energy consumption is most concentrated. For these reasons, recourse to Renewable Energy Sources (RES) is now a necessary, appropriate, and urgent measure in urban planning as well. To improve the overall energy performance of the city as a system, transformation-governance policies must move beyond a "building-centric" operational logic and encompass, beyond the single building, aggregations of buildings and their relations/interactions in terms of material and energy inputs and outputs. A wholesale replacement of the existing building stock with new hyper-technological buildings is unfeasible. How, then, can planning regulations and practice be redefined so as to generate energy-efficient urban fabrics? This research proposes the integration of the nascent energy planning of the territory with the more consolidated planning regulations, in order to generate "energy saving" urban fabrics that add the energy-environmental performance of the context to that of the individual buildings, in an overall energy balance. After describing and comparing the main RES available today, the study suggests a methodology for a preliminary assessment of the mix of technologies and RES best suited to each site, configured as an "energy district".
The results of this process provide the basic elements for preparing the actions needed to integrate energy matters into urban plans, through the application of equalization principles in the definition of performance requirements at the settlement scale, which are indispensable for a correct transition to the design of urban "objects" and "systems".
Resumo:
The aim of this thesis was to investigate the respective contribution of prior information and sensorimotor constraints to action understanding, and to estimate their consequences on the evolution of human social learning. Even though a huge amount of literature is dedicated to the study of action understanding and its role in social learning, these issues are still largely debated. Here, I critically describe two main perspectives. The first perspective interprets faithful social learning as an outcome of a fine-grained representation of others’ actions and intentions that requires sophisticated socio-cognitive skills. In contrast, the second perspective highlights the role of simpler decision heuristics, the recruitment of which is determined by individual and ecological constraints. The present thesis aims to show, through four experimental works, that these two contributions are not mutually exclusive. A first study investigates the role of the inferior frontal cortex (IFC), the anterior intraparietal area (AIP) and the primary somatosensory cortex (S1) in the recognition of other people’s actions, using a transcranial magnetic stimulation adaptation paradigm (TMSA). The second work studies whether, and how, higher-order and lower-order prior information (acquired from the probabilistic sampling of past events vs. derived from an estimation of biomechanical constraints of observed actions) interacts during the prediction of other people’s intentions. Using a single-pulse TMS procedure, the third study investigates whether the interaction between these two classes of priors modulates the motor system activity. The fourth study tests the extent to which behavioral and ecological constraints influence the emergence of faithful social learning strategies at a population level. 
The collected data help elucidate how higher-order and lower-order prior expectations interact during action prediction, and clarify the neural mechanisms underlying this interaction. Finally, these works open promising perspectives for a better understanding of social learning, with possible extensions to animal models.
Resumo:
The Capacitated Location-Routing Problem (CLRP) is an NP-hard problem, since it generalizes two well-known NP-hard problems: the Capacitated Facility Location Problem (CFLP) and the Capacitated Vehicle Routing Problem (CVRP). The Multi-Depot Vehicle Routing Problem (MDVRP) is likewise NP-hard, since it generalizes the well-known Vehicle Routing Problem (VRP), which arises in the single-depot case. This thesis presents heuristic algorithms based on the well-known granular search idea introduced by Toth and Vigo (2003) to solve the CLRP and the MDVRP. Extensive computational experiments on benchmark instances for both problems have been performed to determine the effectiveness of the proposed algorithms. This work is organized as follows: Chapter 1 provides a detailed overview and a methodological review of the literature for the Capacitated Location-Routing Problem (CLRP) and the Multi-Depot Vehicle Routing Problem (MDVRP). Chapter 2 describes a two-phase hybrid heuristic algorithm to solve the CLRP. Chapter 3 presents a computational comparison of heuristic algorithms for the CLRP. Chapter 4 presents a hybrid granular tabu search approach for solving the MDVRP.
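The granular search idea of Toth and Vigo (2003) speeds up local search by restricting moves to a sparse subgraph that keeps only "short" arcs, those whose cost falls below a granularity threshold, plus arcs that must stay available, such as those incident to a depot. The following sketch illustrates only this filtering step; the function name, cost representation, and threshold rule are simplified illustrations, not the thesis's implementation.

```python
def granular_arcs(costs, depot_nodes, beta, reference_cost):
    """Build the sparse ('granular') arc set used to restrict local-search
    moves, in the spirit of granular search (Toth and Vigo, 2003).

    costs:          dict {(i, j): arc cost}
    depot_nodes:    set of depot ids; arcs touching a depot are always kept
    beta:           sparsification factor (commonly on the order of 1-2.5)
    reference_cost: a reference value, e.g. the average arc cost in a
                    good initial solution

    Returns the set of retained arcs: 'short' arcs with cost at most
    beta * reference_cost, plus all arcs incident to a depot.
    """
    threshold = beta * reference_cost
    return {(i, j) for (i, j), c in costs.items()
            if c <= threshold or i in depot_nodes or j in depot_nodes}
```

Neighborhood exploration then only considers moves that insert arcs from this restricted set, which drastically reduces the number of candidate moves per iteration while keeping most arcs that are likely to appear in good solutions.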