Abstract:
Which event study methods are best in non-U.S. multi-country samples? Nonparametric tests, especially the rank and generalized sign, are better specified and more powerful than common parametric tests, especially in multi-day windows. The generalized sign test is the best statistic but must be applied to buy-and-hold abnormal returns for correct specification. Market-adjusted and market-model methods with local market indexes, without conversion to a common currency, work well. The results are robust to limiting the samples to situations expected to be problematic for test specification or power. Applying the tests that perform best in simulation to merger announcements produces reasonable results.
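The generalized sign statistic singled out above is straightforward to compute. The following is a minimal sketch, assuming the standard form commonly attributed to Cowan (1992), in which the expected fraction of positive abnormal returns is estimated from the estimation period; the variable names are illustrative, not the paper's.

```python
import math

def generalized_sign_z(event_abnormal_returns, estimation_positive_fraction):
    """Generalized sign test: compares the number of positive (here,
    buy-and-hold) abnormal returns in the event window against the
    fraction of positive abnormal returns seen in the estimation period."""
    n = len(event_abnormal_returns)
    w = sum(1 for r in event_abnormal_returns if r > 0)  # positive count
    p = estimation_positive_fraction                     # expected fraction
    return (w - n * p) / math.sqrt(n * p * (1.0 - p))

# e.g. 60 of 100 firms with positive buy-and-hold abnormal returns,
# against an estimation-period positive fraction of 0.48 -> z ~ 2.4
z = generalized_sign_z([0.01] * 60 + [-0.01] * 40, 0.48)
```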
Abstract:
As distributed collaborative applications and architectures adopt policy-based management for tasks such as access control, network security and data privacy, the management and consolidation of large numbers of policies is becoming a crucial component of such policy-based systems. In large-scale distributed collaborative applications like web services, there is a need to analyze policy interactions and to integrate policies. In this thesis, we propose and implement EXAM-S, a comprehensive environment for policy analysis and management, which can be used to perform a variety of functions such as policy property analysis, policy similarity analysis and policy integration. As part of this environment, we have proposed and implemented new techniques for the analysis of policies that build on a thorough study of state-of-the-art techniques. Moreover, we propose an approach for solving the heterogeneity problems that usually arise when analyzing policies belonging to different domains. Our work focuses on the analysis of access control policies written in the dialect of XACML (Extensible Access Control Markup Language). We consider XACML policies because XACML is a rich language which can represent many policies of interest to real-world applications and is gaining widespread adoption in industry.
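The abstract does not detail how EXAM-S actually scores policy similarity. Purely as an illustrative stand-in (not the thesis's technique), the sketch below reduces two access-control policies to sets of rule tuples and compares them with Jaccard similarity; the rule fields are hypothetical.

```python
def rule_set_similarity(policy_a, policy_b):
    """Toy policy similarity: each policy is reduced to a set of
    (subject, resource, action, effect) rule tuples and the pair is
    scored with Jaccard similarity |A intersect B| / |A union B|."""
    a, b = set(policy_a), set(policy_b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

p1 = {("doctor", "record", "read", "Permit"), ("nurse", "record", "write", "Deny")}
p2 = {("doctor", "record", "read", "Permit"), ("admin", "record", "write", "Permit")}
print(rule_set_similarity(p1, p2))  # 1 shared rule out of 3 distinct -> 0.333...
```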
Abstract:
Many energy and environmental evaluations need appropriate meteorological data as input to analysis and forecasting software. In Italy, adequate meteorological data are often unavailable because, in many cases, they are incomplete, incorrect and also very expensive for long-term analyses (which need multi-year data sets). A possible solution to this problem is the use of a Typical Meteorological Year (TRY) generated for specific applications. Up to now, TRYs have been created, using statistical criteria, only for the analysis of solar energy systems and for predicting the thermal performance of buildings, and have also been applied to the study of photovoltaic plants (PV), though not specifically created for that type of application. The present research has defined a methodology for the creation of TRYs for different applications. In particular, TRYs for environmental analysis and wind-plant analysis have been created. This is the innovative aspect of this research, never explored before. In addition, the methodology for generating PV TRYs has been improved. The results are very good: the TRYs generated for these applications adequately characterize the climatic conditions of the place over a long period and can be used for energy and environmental studies.
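The abstract does not spell out the statistical criteria used to select representative months. TMY/TRY construction commonly ranks candidate months with the Finkelstein-Schafer (FS) statistic; a minimal sketch, assuming daily values of one weather variable as input, is given below (whether the thesis uses exactly this ranking is an assumption).

```python
import numpy as np

def fs_statistic(candidate, long_term):
    """Finkelstein-Schafer statistic: mean absolute difference between
    the empirical CDF of one candidate month and the long-term CDF,
    evaluated at the candidate month's daily values. Smaller is better."""
    long_term = np.sort(np.asarray(long_term, dtype=float))
    candidate = np.sort(np.asarray(candidate, dtype=float))
    # long-term empirical CDF evaluated at each candidate value
    lt_cdf = np.searchsorted(long_term, candidate, side="right") / len(long_term)
    # empirical CDF of the candidate month itself
    cand_cdf = np.arange(1, len(candidate) + 1) / len(candidate)
    return float(np.mean(np.abs(cand_cdf - lt_cdf)))
```

Candidate months with the smallest (possibly weighted) sum of FS values across variables are then stitched together into the typical year.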
Abstract:
Nowadays, computing is migrating from traditional high-performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be an aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology for answering the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical-layer infrastructure; however, optical networks today are still far from being directly configurable to offer network services, and they need to be enriched with more user-oriented functionalities. Current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to improve the usability and accessibility of the services provided by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. A service-oriented networking architecture based on advanced optical network technologies gives users and applications access to abstracted levels of information regarding the advanced network services on offer. This thesis addresses the problem of defining such a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, the work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies promoting the development of the future optical transport network are emerging: Optical Burst Switching and Optical Packet Switching. These technologies promise to provide all-optical burst or packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduler's forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both the memory and the forwarding logic is needed. This open issue is faced in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
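The hardware design of the proposed scheduler is not given in the abstract. To illustrate how burst scheduling can collapse into a single min/max computation, the sketch below implements the classic LAUC (Latest Available Unused Channel) discipline from the optical burst switching literature: among channels free at the burst's arrival, pick the one with the latest horizon, minimizing the idle gap. This is an illustration of the principle, not the thesis's implementation.

```python
def schedule_burst(horizons, arrival):
    """Horizon-based burst scheduling as a pure min/max computation.
    `horizons[ch]` is the time at which channel ch becomes free; a burst
    arriving at `arrival` is assigned to the feasible channel with the
    maximum horizon (LAUC), or dropped if no channel is free in time."""
    feasible = [(h, ch) for ch, h in enumerate(horizons) if h <= arrival]
    if not feasible:
        return None                 # burst dropped: no free channel
    _, best = max(feasible)         # max over feasible horizons
    return best

horizons = [3.0, 7.5, 5.2, 9.9]     # per-channel busy-until times
print(schedule_burst(horizons, arrival=6.0))  # -> channel 2 (horizon 5.2)
```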
Abstract:
This thesis presents a creative and practical approach to dealing with the problem of selection bias. Selection bias may be the most vexing problem in program evaluation, or in any line of research that attempts to assert causality. Some of the greatest minds in economics and statistics have scrutinized the problem of selection bias, with the resulting approaches, Rubin's Potential Outcome Approach (Rosenbaum and Rubin, 1983; Rubin, 1991, 2001, 2004) and Heckman's Selection Model (Heckman, 1979), being widely accepted and used as the best fixes. These solutions to the bias that arises in particular from self-selection are imperfect, and many researchers, when feasible, reserve their strongest causal inference for data from experimental rather than observational studies. The innovative aspect of this thesis is to propose a data transformation that allows measuring and testing, in an automatic and multivariate way, the presence of selection bias. The approach involves the construction of a multi-dimensional conditional space of the X matrix in which the bias associated with the treatment assignment has been eliminated. Specifically, we propose a partial dependence analysis of the X-space as a tool for investigating the dependence relationship between a set of observable pre-treatment categorical covariates X and a treatment indicator variable T, in order to obtain a measure of bias according to their dependence structure. The measure of selection bias is then expressed in terms of the inertia due to the dependence between X and T that has been eliminated. Given this measure of selection bias, we propose a multivariate test of imbalance to check whether the detected bias is significant, using the asymptotic distribution of the inertia due to T (Estadella et al., 2005) and preserving the multivariate nature of the data. Further, we propose a clustering procedure as a tool for finding groups of comparable units on which to estimate local causal effects, and the multivariate test of imbalance as a stopping rule in choosing the best cluster solution set. The method is nonparametric: it does not call for modeling the data based on some underlying theory or assumption about the selection process, but instead exploits the existing variability within the data, letting the data speak. The idea of proposing this multivariate approach to measuring selection bias and testing balance comes from the consideration that, in applied research, all aspects of multivariate balance not represented in the univariate variable-by-variable summaries are ignored. The first part contains an introduction to evaluation methods as part of public and private decision processes and a review of the evaluation methods literature. Attention is focused on Rubin's Potential Outcome Approach, matching methods, and, briefly, Heckman's Selection Model. The second part focuses on some limitations of conventional methods, with particular attention to the problem of how to test balance correctly. The third part contains the original contribution proposed, a simulation study that checks the performance of the method for a given dependence setting, and an application to a real data set. Finally, we discuss, conclude and explain our future perspectives.
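For a single categorical covariate, the inertia the abstract refers to is the chi-square statistic of the X-by-T contingency table divided by the sample size, as in the minimal sketch below; the thesis works on the full multivariate X-space, which this one-covariate illustration deliberately does not capture.

```python
import numpy as np

def total_inertia(x_cat, t):
    """Total inertia (chi-square / n) of the contingency table crossing
    one categorical covariate with the treatment indicator. Zero inertia
    means the covariate carries no information about treatment assignment."""
    x_cat, t = np.asarray(x_cat), np.asarray(t)
    cats, treats = np.unique(x_cat), np.unique(t)
    n = len(t)
    obs = np.array([[np.sum((x_cat == c) & (t == k)) for k in treats]
                    for c in cats])
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / n
    return float(np.sum((obs - expected) ** 2 / expected) / n)

# perfectly balanced assignment -> inertia 0.0
print(total_inertia(["a", "a", "b", "b"], [0, 1, 0, 1]))
```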
Abstract:
Selective oxidation is one of the simplest functionalization methods, and essentially all monomers used in manufacturing artificial fibers and plastics are obtained by catalytic oxidation processes. Formally, oxidation is considered an increase in the oxidation number of the carbon atoms, so reactions such as dehydrogenation, ammoxidation, cyclization or chlorination are all oxidation reactions. In this field, most processes for the synthesis of important chemicals use vanadium oxide-based catalysts. These catalytic systems are used either in the form of multicomponent mixed oxides and oxysalts, e.g. in the oxidation of n-butane (V/P/O) and of benzene (supported V/Mo/O) to maleic anhydride, or in the form of supported metal oxides, e.g. in the manufacture of phthalic anhydride by o-xylene oxidation, of sulphuric acid by oxidation of SO2, in the reduction of NOx with ammonia and in the ammoxidation of alkyl aromatics. In addition, supported vanadia catalysts have also been investigated for the oxidative dehydrogenation of alkanes to olefins, the oxidation of pentane to maleic anhydride and the selective oxidation of methanol to formaldehyde or methyl formate [1]. During my PhD I focused my work on two gas-phase selective oxidation reactions. The work was done at the Department of Industrial Chemistry and Materials (University of Bologna) in collaboration with Polynt SpA. Polynt is a leading company in the development, production and marketing of catalysts for gas-phase oxidation. In particular, I studied the catalytic systems for n-butane oxidation to maleic anhydride (fluid-bed technology) and for o-xylene oxidation to phthalic anhydride. Both reactions are catalyzed by vanadium-based systems, but the catalysts are completely different. Part A is dedicated to the study of the V/P/O catalyst for n-butane selective oxidation, while Part B presents the results of an investigation of TiO2-supported V2O5, the catalyst for o-xylene oxidation. In Part A, a general introduction covers the importance of maleic anhydride, its uses, the industrial processes and the catalytic system. The reaction is the only industrial direct oxidation of a paraffin to a chemical intermediate. Maleic anhydride is produced by n-butane oxidation using either fixed-bed or fluid-bed technology; in both cases the catalyst is vanadyl pyrophosphate (VPP). Notwithstanding the good performance, the yield does not exceed 60%, and the system is continuously studied to improve activity and selectivity. The main open problem is the understanding of the real active phase working under reaction conditions. Several articles deal with the role of different crystalline and/or amorphous vanadium/phosphorus (VPO) compounds. In all cases, bulk VPP is assumed to constitute the core of the active phase, while two different hypotheses have been formulated concerning the catalytic surface. In one case the development of surface amorphous layers that play a direct role in the reaction is described; in the second case specific planes of crystalline VPP are assumed to contribute to the reaction pattern, and the redox process occurs reversibly between VPP and VOPO4. Both hypotheses are supported by in-situ characterization techniques, but the experiments were performed with different catalysts and probably under slightly different working conditions. Due to the complexity of the system, these differences could be the cause of the contradictions present in the literature.
Supposing that a key role could be played by the P/V ratio, I prepared, characterized and tested two samples with different P/V ratios. The transformations occurring on the catalytic surfaces under different conditions of temperature and gas-phase composition were studied by means of in-situ Raman spectroscopy, in order to investigate the changes that VPP undergoes during reaction. The goal is to understand which kind of compound constituting the catalyst surface is the most active and selective for the butane oxidation reaction, and also which features the catalyst should possess to ensure the development of this surface (e.g. catalyst composition). On the basis of the results of this study, it could be possible to design a new catalyst that is more active and selective than the present ones. In fact, the second topic investigated is the possibility of reproducing the surface active layer of VPP on a support. In general, supporting the active phase is a way to improve the mechanical features of a catalyst and to overcome problems such as the possible development of local hot-spot temperatures, which could cause a decrease of selectivity at high conversion, as well as the high cost of the catalyst. In the literature it is possible to find various works dealing with the development of supported catalysts, but in general the intrinsic characteristics of VPP are worsened by the chemical interaction between the active phase and the support. Moreover, all these works deal with depositing VPP itself on a support; my work, on the contrary, is an attempt to build up a V/P/O active layer on the surface of a zirconia support by thermal treatment of a precursor obtained by impregnation with a V5+ salt and H3PO4. In-situ Raman analysis during the thermal treatment, as well as reactivity tests, are used to investigate the parameters that may influence the generation of the active phase. Part B is devoted to the study of o-xylene oxidation to phthalic anhydride; industrially, the reaction is carried out in the gas phase using as catalyst a supported system formed by V2O5 on TiO2. The V/Ti/O system is quite complex; different vanadium species can be present on the titania surface, as a function of the vanadium content and of the titania surface area: (i) V species chemically bound to the support via oxo bridges (isolated V in octahedral or tetrahedral coordination, depending on the hydration degree), (ii) a polymeric species spread over the titania, and (iii) bulk vanadium oxide, either amorphous or crystalline. The different species can have different catalytic properties, so changing the relative amounts of the V species can be a way to optimize the catalytic performance of the system. For this reason, samples containing increasing amounts of vanadium were prepared and tested in the oxidation of o-xylene, with the aim of finding a correlation between V/Ti/O catalytic activity and the amounts of the different vanadium species. The second part deals with the role of a gas-phase promoter. The catalytic surface can change under working conditions; the high temperatures and a different gas-phase composition could also affect the formation of the different V species. Furthermore, in industrial practice, vanadium oxide-based catalysts need the addition of gas-phase promoters to the feed stream which, although they have no direct role in the reaction stoichiometry, lead to considerable improvements in catalytic performance when present.
The starting point of my investigation is the possibility that steam, a component always present in the environment of oxidation reactions, could cause changes in the nature of the catalytic surface under reaction conditions. For this reason, the dynamic phenomena occurring at the surface of a 7 wt% V2O5-on-TiO2 catalyst in the presence of steam were investigated by means of Raman spectroscopy. Moreover, a correlation between the amounts of the different vanadium species and the catalytic performance has been sought. Finally, the role of dopants has been studied. The industrial V/Ti/O system contains several dopants; the nature and the relative amount of the promoters may vary depending on the catalyst supplier and on the technology employed for the process, either a single-bed or a multi-layer catalytic fixed bed. Promoters have a quite remarkable effect on both activity and selectivity to phthalic anhydride. Their role is crucial, and the proper control of the relative amount of each component is fundamental for the process performance. Furthermore, it cannot be excluded that the same promoter may play different roles depending on the reaction conditions (temperature, gas-phase composition, etc.). The reaction network of phthalic anhydride formation is very complex and includes several parallel and consecutive reactions; for this reason a proper understanding of the role of each dopant cannot be separated from the analysis of the reaction scheme. One of the most important promoters at the industrial level, always present in catalytic formulations, is Cs. It is known that Cs plays an important role in the selectivity to phthalic anhydride, but the reasons for this phenomenon are not really clear. Therefore, the effect of Cs on the reaction scheme has been investigated at two different temperatures, with the aim of identifying in which step of the reaction network this promoter plays its role.
Abstract:
Alzheimer's disease (AD) and cancer represent two of the main causes of death worldwide. They are complex multifactorial diseases, and several biochemical targets have been recognized to play a fundamental role in their development. Given their complex nature, a promising therapeutic strategy is the so-called "Multi-Target-Directed Ligand" (MTDL) approach. This strategy is based on the assumption that a single molecule can hit several targets responsible for the onset and/or progression of the pathology. In AD in particular, most currently prescribed drugs aim to increase the level of acetylcholine in the brain by inhibiting the enzyme acetylcholinesterase (AChE). However, clinical experience shows that AChE inhibition is a palliative treatment, and the simple modulation of a single target does not address AD aetiology. Research into newer and more potent anti-AD agents is thus focused on compounds whose properties go beyond AChE inhibition (such as inhibition of the enzyme β-secretase and inhibition of the aggregation of beta-amyloid). The MTDL strategy therefore seems a more appropriate approach for addressing the complexity of AD and may provide new drugs for tackling its multifactorial nature. This thesis describes the design of new MTDLs able to tackle the multifactorial nature of AD. The new MTDLs are less flexible analogues of caproctamine, one of the first MTDLs possessing biological properties useful for AD treatment. These new compounds are able to inhibit the enzymes AChE and β-secretase and to inhibit both AChE-induced and self-induced beta-amyloid aggregation. In particular, the most potent compound of the series inhibits AChE in the subnanomolar range, inhibits β-secretase at micromolar concentration, and inhibits both AChE-induced and self-induced beta-amyloid aggregation at micromolar concentration. Cancer, like AD, is a very complex pathology, and many different therapeutic approaches are currently in use for its treatment. Due to its multifactorial nature, however, the MTDL approach could in principle be applied to this pathology as well. A further aim of this thesis was the development of new molecules possessing different structural motifs able to simultaneously interact with some of the multitude of targets responsible for the pathology. The designed compounds displayed cytotoxic activity in different cancer cell lines. In particular, the most potent compounds of the series were further evaluated: they were able to bind DNA, proving 100-fold more potent than the reference compound Mitonafide. Furthermore, these compounds were able to trigger apoptosis through caspase activation and to inhibit PIN1 (preliminary result). This last protein is a very promising target because it is overexpressed in many human cancers, it functions as a critical catalyst for multiple oncogenic pathways, and in several cancer cell lines depletion of PIN1 causes arrest of mitosis followed by apoptosis induction. In conclusion, this study may represent a promising starting point for the development of new MTDLs, hopefully useful for both cancer and AD treatment.
Abstract:
Marine soft-bottom systems show high variability across multiple spatial and temporal scales. Natural and anthropogenic sources of disturbance act together in affecting benthic sedimentary characteristics and species distribution. The description of such spatial variability is required to understand the ecological processes behind it. However, in order to obtain better estimates of spatial patterns, methods that take into account the complexity of the sedimentary system are required. This PhD thesis aims to give a significant contribution both to improving the methodological approaches to the study of biological variability in soft-bottom habitats and to increasing knowledge of the effects that different processes (both natural and anthropogenic) can have on the benthic communities of a large area in the North Adriatic Sea. Beta diversity is a measure of the variability in species composition, and Whittaker's index has become the most widely used measure of beta diversity. However, application of the Whittaker index to soft-bottom assemblages of the Adriatic Sea highlighted its sensitivity to rare species (species recorded in a single sample). This over-weighting of rare species induces biased estimates of heterogeneity, making it difficult to compare assemblages containing a high proportion of rare species. In benthic communities, the unusually large number of rare species is frequently attributed to a combination of sampling errors and insufficient sampling effort. In order to reduce the influence of rare species on the measure of beta diversity, I have developed an alternative index based on simple probabilistic considerations. It turns out that this probability index is an ordinary Michaelis-Menten transformation of Whittaker's index but behaves more favourably when species heterogeneity increases (see the sketch below). The suggested index therefore seems appropriate when comparing patterns of complexity in marine benthic assemblages. Although the new index makes an important contribution to the study of biodiversity in sedimentary environments, it remains to be seen which processes, and at what scales, influence benthic patterns. The ability to predict the effects of ecological phenomena on benthic fauna depends strongly on the spatial and temporal scales of variation. Once defined, implicitly or explicitly, these scales influence the questions asked, the methodological approaches and the interpretation of results. Problems often arise when representative samples are not taken and results are over-generalized, as can happen when results from small-scale experiments are used for resource planning and management. Such issues, although globally recognized, are far from being resolved in the North Adriatic Sea. This area is potentially affected by both natural (e.g. river inflow, eutrophication) and anthropogenic (e.g. gas extraction, fish trawling) sources of disturbance. The few studies in this area that aimed at understanding which of these processes mainly affect the macrobenthos were conducted at small spatial scales, as they were designed to examine local changes in benthic communities or in particular species. However, in order to better describe all the putative processes occurring in the entire area, a high sampling effort at a large spatial scale is required. The sedimentary environment of the western part of the Adriatic Sea was extensively studied in this thesis.
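Whittaker's index itself is simple to compute. The sketch below gives it together with a generic Michaelis-Menten saturation; the abstract states that the new probability index is a Michaelis-Menten transformation of Whittaker's index but does not give its exact form, so the constant k below is an illustrative placeholder, not the thesis's.

```python
import numpy as np

def whittaker_beta(presence):
    """Whittaker's beta diversity: gamma (total species richness) divided
    by alpha-bar (mean species richness per sample). `presence` is a
    samples x species boolean matrix."""
    presence = np.asarray(presence, dtype=bool)
    gamma = np.any(presence, axis=0).sum()       # species seen anywhere
    alpha_bar = presence.sum(axis=1).mean()      # mean richness per sample
    return gamma / alpha_bar

def mm_transform(beta, k=1.0):
    """Generic Michaelis-Menten saturation of an index: beta / (k + beta).
    The exact form and constant of the thesis's probability index are not
    stated in the abstract; k is an illustrative placeholder."""
    return beta / (k + beta)
```

The saturating form illustrates why such a transform is less sensitive to rare species: additional singletons inflate gamma, and hence beta, but the transformed value grows ever more slowly as beta increases.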
I have described, in detail, spatial patterns both in terms of sedimentary characteristics and of macrobenthic organisms, and have suggested putative processes (natural or of human origin) that might affect the benthic environment of the entire area. In particular, I have examined the effect of offshore gas platforms on benthic diversity and tested their effect against a background of natural spatial variability. The results obtained suggest that natural processes in the North Adriatic such as river outflow and eutrophication show an inter-annual variability that might have important consequences for benthic assemblages, affecting for example their spatial pattern moving away from the coast and along a north-to-south gradient. Depth-related factors, such as food supply, light, temperature and salinity, play an important role in explaining large-scale benthic spatial variability (i.e., they affect both the abundance patterns and beta diversity). Nonetheless, more local effects, probably related to organic enrichment or pollution from the Po river input, have been observed. All these processes, together with a few human-induced sources of variability (e.g. fishing disturbance), have a greater effect on macrofauna distribution than any effect related to the presence of gas platforms. The main effect of gas platforms is restricted to small spatial scales and related to a change in habitat complexity due to the natural dislodgement, or removal during structure cleaning, of the mussels that colonize their legs. The accumulation of mussels on the sediment plausibly affects benthic infauna composition. All the components of the study presented in this thesis highlight the need to carefully consider the methodological aspects related to the study of sedimentary habitats. With particular regard to the North Adriatic Sea, a multi-scale analysis along natural and anthropogenic gradients was useful for detecting the influence of all the processes affecting the sedimentary environment. In the future, applying a similar approach may lead to an unambiguous assessment of the state of the benthic community in the North Adriatic Sea. Such an assessment may be useful in understanding whether any anthropogenic source of disturbance has a negative effect on the marine environment and, if so, in planning sustainable strategies for the proper management of the affected area.
Abstract:
Several MCAO systems are under study to improve the angular resolution of the current and future generations of large ground-based telescopes (diameters in the 8-40 m range). The subject of this PhD Thesis is embedded in this context. Two MCAO systems, in different realization phases, are addressed in this Thesis: NIRVANA, the 'double' MCAO system designed for one of the interferometric instruments of LBT, is in the integration and testing phase; MAORY, the future E-ELT MCAO module, is under preliminary study. These two systems tackle the sky coverage problem in two different ways. The layer-oriented approach of NIRVANA, coupled with multi-pyramid wavefront sensors, takes advantage of the optical co-addition of the signal coming from up to 12 NGS in an annular 2' to 6' technical FoV and up to 8 in the central 2' FoV. Summing the light coming from many natural sources makes it possible to increase the limiting magnitude of the single NGS and to improve the sky coverage considerably. One of the two Wavefront Sensors for the analysis of the mid-to-high-altitude atmosphere has been integrated and tested as a stand-alone unit in the laboratory at INAF-Osservatorio Astronomico di Bologna and afterwards delivered to the MPIA laboratories in Heidelberg, where it was integrated and aligned to the post-focal optical relay of one LINC-NIRVANA arm. A number of tests were performed in order to characterize and optimize the system functionalities and performance. A report on this work is presented in Chapter 2. In the MAORY case, to ensure correction uniformity and sky coverage, the LGS-based approach is the current baseline. However, since the Sodium layer is approximately 10 km thick, the artificial reference source looks elongated, especially when observed from the edge of a large aperture. On a 30-40 m class telescope, for instance, the maximum elongation varies between a few arcsec and 10 arcsec, depending on the actual telescope diameter, on the Sodium layer properties and on the laser launcher position. The centroiding error in a Shack-Hartmann WFS increases proportionally to the elongation (in a photon-noise-dominated regime), strongly limiting the performance. A straightforward solution to compensate for this effect is to increase the laser power, i.e. to increase the number of detected photons per subaperture. The scope of Chapter 3 is twofold: an analysis of the performance of three different algorithms (Weighted Center of Gravity, Correlation and Quad-cell) for the instantaneous measurement of the LGS image position in the presence of elongated spots, and the determination of the number of photons required to achieve a certain average wavefront error over the telescope aperture. An alternative optical solution to the spot elongation problem is proposed in Section 3.4. Starting from the considerations presented in Chapter 3, a first-order analysis of the LGS WFS for MAORY (number of subapertures, number of detected photons per subaperture, RON, focal-plane sampling, subaperture FoV) is the subject of Chapter 4. An LGS WFS laboratory prototype was designed to reproduce the relevant aspects of an LGS SH WFS for the E-ELT and to evaluate the performance of different centroid algorithms in the presence of elongated spots, as investigated numerically and analytically in Chapter 3. This prototype permits the simulation of realistic Sodium profiles. A full testing plan for the prototype is set out in Chapter 4.
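Of the three centroid algorithms compared in Chapter 3, the Weighted Center of Gravity is the easiest to sketch. Below is a minimal version, assuming a precomputed weighting map (e.g. a reference Gaussian matched to the spot), which is one common choice but not necessarily the one adopted in the thesis.

```python
import numpy as np

def weighted_center_of_gravity(img, weights):
    """Weighted Center of Gravity centroid of a Shack-Hartmann subaperture
    spot: pixel coordinates averaged with intensity times a weighting map,
    which suppresses the noisy wings of elongated LGS spots."""
    w = img * weights
    y, x = np.indices(img.shape)
    total = w.sum()
    return (x * w).sum() / total, (y * w).sum() / total
```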
Abstract:
The main task of this research is to investigate the situation of drugs in the city of Bologna. A first discussion concerns the method to adopt in studying an ethical question such as drug use. It is widely known, in fact, that the drugs problem involves many political and religious considerations which are misleading from a scientific point of view. After a methodological chapter intended to set out the purpose of this research, a logical definition of drugs is discussed. There, an Aristotelian definition of drugs is examined with semantic instruments from the philosophy of language, in order to pin down the meaning of terms. The following chapter discusses the personal stories of different people involved in drugs in the city, who represent the main characters of the drug subculture. Afterwards, the official statistics concerning drug enforcement are discussed and compared with a specific police action, which makes it possible to critique those data and to form some hypotheses about the drug quantities circulating in town. The next step is an investigation of drug addicts in town, using a validation technique based on database queries. The result is a statistical profile of users showing a prevailing presence of foreigners and non-resident Italians who habitually use drugs in this city. Demographic analysis of the identified people shows that drug addiction is widely diffused across all age ranges and mainly concerns males, with an increasing trend. The geographic distribution of users' residences and places of use is then examined, showing that drug abuse is spread among all classes of the population, while drug markets are located in a few points of town that form a kind of drug area with a concentration of dealers who are not organized together. Detailed queries of police report statistics are then used to study specific aspects of present-day drug abuse: the phenomenon of multi-use, the relation between drugs and crime, and the relation between drugs and mental disease, recording some evidence on each topic. Finally, a survey of city media over the last two years shows the interest in this topic and gives an idea of how public opinion is informed about drugs. The study refers to the city of Bologna only and concerns data recorded over the last ten years by the local metropolitan police corps.
Abstract:
The well-being of populations, sustainable resource management, poverty and environmental degradation are strongly interconnected concepts in a world in which 20% of the world's population consumes more than 75% of its natural resources. Since the 1992 Earth Summit in Rio de Janeiro, the strong link between environmental protection and poverty reduction has been affirmed, and the importance of a healthy ecosystem for leading a dignified life has been recognized, especially in the poor rural areas of Africa, Asia and Latin America. Nature, especially for rural populations, is in fact a precious everyday good, an essential means of subsistence and a primary source of income. Alongside this observation there is also the awareness that in recent decades natural ecosystems have been degrading at an impressive rate, unprecedented in the history of the human species: we consume resources faster than the Earth can regenerate them and "metabolize" our waste. Poverty is growing in the same way: there are currently 1.2 billion people living on less than one dollar a day, while about half of the world's population survives on less than two dollars a day (UN). The connection between poverty and the environment depends not only on the scarcity of resources, which makes living conditions harder, but also on how those natural resources are managed. Indeed, in many countries or places where resources are not scarce, the poorest people have no access to them for political, economic and social reasons. Moreover, if the ecological footprint is compared with a recognized measure of "human development", the United Nations Human Development Index (HDI) (cf. Chapter 2), the comparison clearly shows that what we generally accept as "high development" is very far from the universally accepted concept of sustainable development, since the so-called "developed" countries are those with the largest ecological footprint. If "development" puts pressure on ecosystems, on whose health human well-being directly depends, then the concept of "development" must be revisited, because its consequence is not the well-being of the planet and its populations but environmental degradation and growing social inequality. On one side, then, there is "Western society", which promotes technological advancement and industrialization for the sake of economic growth, squeezing an ever more tired and exhausted ecosystem in order to obtain benefits for only a narrow slice of the world's population, one that follows a consumerist lifestyle while degrading the environment and submerging it in waste; on the other side there are the families of rural farmers, the "moradores" of the favelas and of the peripheries of the great metropolises of the Global South, the landless, the immigrants of the shantytowns, the "waste pickers" of the outskirts of Bombay who survive by scavenging refuse, the refugees of wars fought for the control of resources, the environmentally displaced and the eco-refugees, who live below the poverty line without access to the primary resources for survival.
The sustainable management of the environment, the generation of income from the direct valorization of the ecosystem, and access to natural resources are among the most effective tools for improving people's living conditions, tools that can also ensure the distribution of wealth and build a fairer society, since ecosystem goods and services act as commons for communities. The correct management of the environment and of resources is therefore of extreme importance in the fight against poverty, and here the role and responsibility of environmental technicians is crucial. The research presented here, starting from an analysis of the problem of natural resource management and of its close link with poverty, and revisiting the traditional concept of "development" in the light of new schools of thought, aims to suggest solutions and technologies for the sustainable management of natural resources whose objective is the well-being of the poorest populations and of ecosystems, and also proposes an evaluation method for choosing the alternatives, solutions or technologies best suited to the context of intervention. After an analysis of the "state of the Planet" (Chapter 1) and of its resources, both at the global and at the regional level, the second Chapter examines the concepts of poverty, of Developing Country (DC) and of "sustainable development", together with the new schools of thought, from Degrowth theory to the concept of Human Development. From an awareness of real human needs, from the analysis of the state of the environment and of poverty and its different faces in the various countries, and from the acknowledged failure of the economics of growth (today more visible than ever), one can understand that the solution for defeating poverty and environmental degradation and for achieving human development is not consumerism, production, or even technology transfer and industrialization, but "small is beautiful" (F. Schumacher, 1982): simple lifestyles, the protection of ecosystems and, at the technological level, "appropriate technologies". It is precisely to Appropriate Technologies that the following Chapters (Chapter 4 and Chapter 5) are devoted. These are simple, low-cost, low-environmental-impact technologies, easily managed by communities, which give the poorest populations access to natural resources. Thanks to their characteristics, they are the technologies that best allow the protection of the natural commons, and therefore of resources and the environment, fostering and encouraging the participation of local communities and valorizing traditional knowledge through the involvement of all actors, their low cost and their environmental sustainability, thus contributing to the affirmation of human rights and the safeguarding of the environment. The Appropriate Technologies examined are those for water supply and purification, including: - fog collection, - simple methods for drilling wells, - treadle pumps and hand pumps for water supply, - rainwater harvesting, - spring catchment, - simple methods for point-of-use water purification (ceramic filter, sand filter, cloth filter, solar disinfection and distillation).
The fifth Chapter presents the Appropriate Technologies for waste management in DCs, describing: - solutions for waste collection in DCs, - solutions for waste disposal in DCs, - simple technologies for recycling solid waste. The sixth Chapter deals with International Cooperation, Decentralized Cooperation and Human Development projects. In the context of Cooperation, development projects are those whose objectives are the fight against poverty and the improvement of the living conditions of the beneficiary communities in the DCs involved in the project. Within cooperation and human development projects, environmental interventions play an important role since, as already noted, poverty and the well-being of populations depend on the health of the ecosystems in which they live: promoting environmental protection and guaranteeing access to drinking water, the correct management of waste and wastewater, and a clean energy supply are necessary for allowing every individual, especially those living in "developing" conditions, to lead a healthy and productive life. In technical and environmental human development interventions it is therefore important to choose decentralized solutions based on Appropriate Technologies, so as to help valorize the environment and protect community health. Chapters 7 and 8 examine methods for evaluating human development interventions. Another fundamental aspect of the technician's role is indeed the use of a correct evaluation method for choosing among the possible projects, one that takes all aspects into account, i.e. the social, environmental and economic impacts, and that fits disadvantaged contexts such as those considered in this work; that is, a method allowing an evaluation specific to human development projects and permitting the identification of the technological and environmental project/intervention most appropriate to each specific context. After an analysis of the various evaluation tools, it was decided to develop a model for the evaluation of environmental interventions in Decentralized Cooperation projects based on Multi-Criteria Analysis and on the Analytic Hierarchy Process. The object of this research has therefore been the development of a methodology that, with the mathematical and methodological support of Multi-Criteria Analysis, makes it possible to evaluate the appropriateness and sustainability of environmental Human Development interventions carried out within International Cooperation and Decentralized Cooperation projects through the use of Appropriate Technologies. Chapter 9 presents the methodology, the calculation model and the criteria on which the evaluation is based. The following chapters (Chapter 10 and Chapter 11) are devoted to testing the methodology on different case studies: - the "Environmental project on waste management at the Saharawi refugee camps", Algeria, - the "Programa 1 milhão de Cisternas, P1MC" and - the "Programa Uma Terra e Duas Águas, P1+2", in the Brazilian semi-arid region.
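The evaluation model combines Multi-Criteria Analysis with the Analytic Hierarchy Process. As a minimal sketch of the AHP building block (the criteria and judgments below are illustrative, not those of the thesis), criteria weights are obtained as the principal eigenvector of a reciprocal pairwise-comparison matrix, with Saaty's consistency index as a sanity check.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Analytic Hierarchy Process: derive criteria weights as the principal
    eigenvector of a reciprocal pairwise-comparison matrix, and report
    Saaty's consistency index CI = (lambda_max - n) / (n - 1)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # normalized priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    return w, ci

# 3 hypothetical criteria: environmental impact, cost, community manageability
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
weights, ci = ahp_priorities(A)
```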
Abstract:
This paper studies relational goods as immaterial assets creating real effects in society. The work starts by answering this question: what kind of effects do relational goods produce? After a careful examination of the literature, we suppose relational goods to be second-order social relations. In this hypothesis, they emerge from two distinct social relations: interpersonal and reflexive relations. We describe empirical evidence of these emergent assets in social life, and we test the effects they produce with a model. The work focuses on four targets. First of all, we describe the emergence of relational goods through a mathematical model. Then we identify social settings where relational goods show evident effects, and we outline our scientific hypothesis. The following step consists in the formulation of empirical tests. Finally, we present the results. Our aim is to capture the constitutive structure of relational goods in a testable model, consistent with the empirical evidence shown in the research. In the study we use multivariate analysis techniques to look at relational goods in a new way, combining qualitative and quantitative strategies. Relational goods are analysed both as dependent and as independent variables in order to consider the causative factors acting in a black-box model. Moreover, we analyse the effects of relational goods within social spheres, especially in the third sector and in the capitalistic economy. Finally, we derive effective indexes of relational goods in order to compare them with some performance indexes.
Abstract:
It is well known that the deposition of gaseous pollutants and aerosols plays a major role in causing the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to environmental damage to cultural heritage, in the case of cement mortars, commonly used in 20th-century architecture, the deterioration due to the impact of air multi-pollutants, especially the formation of black crusts, is still not well explored, making this issue a challenging area of research. This work centers on cement mortar-environment interactions, focusing on the diagnosis of the damage to the modern built heritage due to air multi-pollutants. For this purpose, three sites exposed in different urban areas in Europe were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland), Chiesa dell'Autostrada del Sole, Florence (Italy), and Casa Galleria Vichi, Florence (Italy). The sampling sessions were performed taking into account the height from ground level and the protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (i.e. principal component analysis), and an enrichment factor for cement mortars was calculated for the first time (see the sketch below). The results obtained from the experimental activity performed on the damage layers indicate that gypsum, formed by the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at Centennial Hall and Casa Galleria Vichi. By contrast, gypsum has not been identified in the samples collected at Chiesa dell'Autostrada del Sole. This is connected to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results obtained demonstrate the correlation between the location of a building and the composition of its damage layer: Centennial Hall is mainly exposed to the impact of pollutants emitted from the nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
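The abstract does not state which reference element or background composition was used, but the enrichment factor itself has a standard definition, sketched below; the choice of sulphur, calcium and the concentration values in the example are illustrative assumptions.

```python
def enrichment_factor(c_x_sample, c_ref_sample, c_x_background, c_ref_background):
    """Standard enrichment factor of element x against a reference element:
    EF = (C_x / C_ref)_sample / (C_x / C_ref)_background.
    EF close to 1 suggests a matrix (mortar) origin; EF >> 1 suggests an
    external (e.g. atmospheric/anthropogenic) source."""
    return (c_x_sample / c_ref_sample) / (c_x_background / c_ref_background)

# e.g. sulphur in a black crust, normalized to calcium (illustrative numbers)
print(enrichment_factor(c_x_sample=4.2, c_ref_sample=30.0,
                        c_x_background=0.3, c_ref_background=35.0))  # ~16.3
```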
Abstract:
Wheel-rail contact analysis plays a fundamental role in the multibody modeling of railway vehicles. A good contact model must provide an accurate description of the global contact phenomena (contact forces and torques, number and position of the contact points) and of the local contact phenomena (position and shape of the contact patch, stresses and displacements). The model must also ensure high numerical efficiency (in order to be implemented directly online within multibody models) and good compatibility with commercial multibody software (Simpack Rail, Adams Rail). The wheel-rail contact problem has been discussed by several authors and many models can be found in the literature. The contact models can be subdivided into two categories: global models and local (or differential) models. Currently, as regards the global models, the main approaches to the problem are the so-called rigid contact formulation and the semi-elastic contact description. The rigid approach considers the wheel and the rail as rigid bodies. The contact is imposed by means of constraint equations, and the contact points are detected during the dynamic simulation by solving the nonlinear algebraic-differential equations associated with the constrained multibody system. Indentation between the bodies is not permitted, and the normal contact forces are calculated through the Lagrange multipliers. Finally, Hertz's and Kalker's theories make it possible to evaluate the shape of the contact patch and the tangential forces, respectively. The semi-elastic approach also considers the wheel and the rail as rigid bodies. However, in this case no kinematic constraints are imposed and indentation between the bodies is permitted. The contact points are detected by means of approximate procedures (based on look-up tables and simplifying hypotheses on the problem geometry). The normal contact forces are calculated as a function of the indentation (a minimal sketch of such a force law is given below) while, as in the rigid approach, Hertz's and Kalker's theories make it possible to evaluate the shape of the contact patch and the tangential forces. Both of the described multibody approaches are computationally very efficient, but their generality and accuracy often turn out to be insufficient because the physical hypotheses behind these theories are too restrictive and, in many circumstances, unverified. In order to obtain a complete description of the contact phenomena, local (or differential) contact models are needed. In other words, wheel and rail have to be considered elastic bodies governed by Navier's equations, and the contact has to be described by suitable analytical contact conditions. The contact between elastic bodies has been widely studied in the literature, both in the general case and in the rolling case. Many procedures based on variational inequalities, FEM techniques and convex optimization have been developed. This kind of approach assures high generality and accuracy but still entails very large computational costs and memory consumption. Due to this computational load and memory consumption, and referring to the current state of the art, the integration between multibody and differential modeling is almost absent in the literature, especially in the railway field.
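As a minimal sketch of the semi-elastic normal force law referred to above, assuming the Hertzian point-contact form with a lumped stiffness constant (an illustrative simplification of the full elliptic-contact formulas, not the thesis's implementation):

```python
def hertz_normal_force(delta, k_h=1.0e11):
    """Hertzian normal contact force as a function of the indentation
    delta (m): F = k_h * delta**1.5 for delta > 0, zero when the bodies
    separate. k_h lumps the elastic constants and local curvatures of
    wheel and rail; the value here is only an order-of-magnitude placeholder."""
    return k_h * delta ** 1.5 if delta > 0.0 else 0.0

print(hertz_normal_force(1.0e-4))  # 0.1 mm indentation -> 1e5 N (~100 kN)
```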
However, this integration is very important because only differential modeling allows an accurate analysis of the contact problem (in terms of contact forces and torques, position and shape of the contact patch, stresses and displacements), while multibody modeling is the standard in the study of railway dynamics. In this thesis, some innovative wheel-rail contact models developed during the Ph.D. activity are described. Concerning the global models, two new models belonging to the semi-elastic approach are presented; the models satisfy the following specifications: 1) the models have to be 3D and consider all six relative degrees of freedom between wheel and rail; 2) the models have to consider generic railway tracks and generic wheel and rail profiles; 3) the models have to assure a general and accurate handling of multiple contacts without simplifying hypotheses on the problem geometry; in particular, the models have to evaluate the number and position of the contact points and, for each point, the contact forces and torques; 4) the models have to be implementable directly online within multibody models without look-up tables; 5) the models have to assure computation times comparable with those of commercial multibody software (Simpack Rail, Adams Rail) and compatible with RT and HIL applications; 6) the models have to be compatible with commercial multibody software (Simpack Rail, Adams Rail). The most innovative aspect of the new global contact models regards the detection of the contact points. In particular, both models aim to reduce the dimension of the algebraic problem by means of suitable analytical techniques. This kind of reduction yields the high numerical efficiency that makes possible the online implementation of the new procedure and the achievement of performance comparable with that of commercial multibody software. At the same time, the analytical approach assures high accuracy and generality. Concerning the local (or differential) contact models, one new model satisfying the following specifications is presented: 1) the model has to be 3D and consider all six relative degrees of freedom between wheel and rail; 2) the model has to consider generic railway tracks and generic wheel and rail profiles; 3) the model has to assure a general and accurate handling of multiple contacts without simplifying hypotheses on the problem geometry; in particular, the model has to be able to calculate both the global contact variables (contact forces and torques) and the local contact variables (position and shape of the contact patch, stresses and displacements); 4) the model has to be implementable directly online within multibody models; 5) the model has to assure high numerical efficiency and reduced memory consumption in order to achieve a good integration between multibody and differential modeling (the basis of the local contact models); 6) the model has to be compatible with commercial multibody software (Simpack Rail, Adams Rail). In this case, the most innovative aspects of the new local contact model regard the contact modeling (by means of suitable analytical conditions) and the implementation of the numerical algorithms needed to solve the discrete problem arising from the discretization of the original continuum problem.
Moreover, during the development of the local model, achieving a good compromise between accuracy and efficiency turned out to be very important for obtaining a good integration between multibody and differential modeling. At this point, the contact models have been inserted within a 3D multibody model of a railway vehicle to obtain a complete model of the wagon. The railway vehicle chosen as benchmark is the Manchester Wagon, whose physical and geometrical characteristics are easily available in the literature. The model of the whole railway vehicle (multibody model and contact model) has been implemented in the Matlab/Simulink environment. The multibody model has been implemented in SimMechanics, a Matlab toolbox specifically designed for multibody dynamics, while, as regards the contact models, C S-functions have been used; this particular Matlab architecture makes it possible to efficiently connect the Matlab/Simulink and C/C++ environments. The 3D multibody model of the same vehicle (this time equipped with a standard contact model based on the semi-elastic approach) has then also been implemented in Simpack Rail, a commercial multibody software for railway vehicles that is widely tested and validated. Finally, numerical simulations of the vehicle dynamics have been carried out on many different railway tracks with the aim of evaluating the performance of the whole model. The comparison between the results obtained by the Matlab/Simulink model and those obtained by the Simpack Rail model has allowed an accurate and reliable validation of the new contact models. In conclusion to this brief introduction to my Ph.D. thesis, we would like to thank Trenitalia and the Regione Toscana for the support provided during all the Ph.D. activity. Moreover, we would also like to thank INTEC GmbH, the company that develops the Simpack Rail software, with which we are currently working to develop innovative toolboxes specifically designed for wheel-rail contact analysis.