42 results for Development tools
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The aim of my project is to identify operational strategies for increasing the commercial exploitation of fish in the Apulia region, and to acquire a thorough knowledge of several important aspects of this system in order to propose concrete, suitable and appropriate development tools. The plan is to analyze the impact that the socio-economic context has on the blue fish fishing and marketing systems in the various maritime regions. The fishery sector is characterized by a settled downward trend, due both to Community policies driving a reduction of the fishing effort and to the depletion of fishing resources. At the same time, Italy has seen rising costs (especially fuel) and falling market prices caused by growing imports. Although a large part of the Italian fishing fleet belongs to the Apulia region, these dynamics are worse here, also because of market inefficiency and a lack of integration and cooperation among fishermen. In this first part of my work I investigated two areas that are relevant for the regional fishery. As a first step, I evaluated the amount of fish handled by each kind of dealer operating in each of the two areas, following Porter's value chain analysis theory. I then applied the value system approach to evaluate the value chains of the firm's suppliers, the fishery firm itself, and the firm's distribution channels. The distribution of value turned out to differ between the two areas but to be very unfavourable to fishermen in both. The second step of my study was the evaluation of the social capital value in both areas, defining the composition of the fishery networks and the number of their mutual relationships. The results highlight a relation between a higher social capital value and a distribution of the value system more profitable for fishermen.
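The value-chain accounting described above can be sketched in a few lines: each actor's margin is its selling price minus its purchase price, expressed as a share of the final consumer price. Actor names and prices below are purely illustrative, not data from the thesis.

```python
# Illustrative sketch of value distribution along a simple fish
# supply chain. All prices (EUR/kg) are hypothetical.

def value_shares(chain):
    """chain: list of (actor, selling_price) ordered from fisherman to
    final retailer. Returns each actor's margin as a fraction of the
    final consumer price."""
    final_price = chain[-1][1]
    shares = {}
    previous_price = 0.0
    for actor, price in chain:
        shares[actor] = (price - previous_price) / final_price
        previous_price = price
    return shares

chain = [("fisherman", 2.0), ("wholesaler", 3.5), ("retailer", 7.0)]
shares = value_shares(chain)
# the fisherman captures 2.0/7.0 of the final consumer price
```

Under these made-up prices the fisherman captures less than a third of the final value, the kind of imbalance the abstract describes.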
Abstract:
The subject of this thesis is multicolour bioluminescence analysis and how it can provide new tools for drug discovery and development. The mechanism of colour tuning in bioluminescent reactions is not yet fully understood, but it is the object of intense research and several hypotheses have been put forward. In the past decade, key residues in the active site of the enzyme, or on the surface surrounding the active site, have been identified as responsible for the different colours of emission. Since the bioluminescence reaction is strictly dependent on the interaction between the enzyme and its substrate D-luciferin, modification of the substrate can also lead to a different emission spectrum. In recent years, firefly luciferase and other luciferases have undergone mutagenesis in order to obtain mutants with different emission characteristics. Thanks to these new discoveries in the bioluminescence field, multicolour luciferases can nowadays be employed in bioanalysis for assay development and imaging purposes. The use of multicolour bioluminescent enzymes has expanded the range of potential applications in vitro and in vivo: multiple analyses and more information can be obtained from the same analytical session, saving cost and time. This thesis focuses on several applications of multicolour bioluminescence for high-throughput screening and in vivo imaging. Multicolour luciferases can be employed as new tools for drug discovery and development, and examples are provided in the different chapters. New red codon-optimized luciferases have been demonstrated to be improved tools for bioluminescence imaging in small animals, and the possibility of combining red and green luciferases for BLI has been achieved, even if some aspects of the methodology remain challenging and need further improvement. In vivo bioluminescence imaging has progressed rapidly since its first application no more than 15 years ago, and it is becoming an indispensable tool in pharmacological research. At the same time, the development of more sensitive microscopes and low-light imagers, allowing a better visualization and quantification of multicolour signals, would boost research and discovery in the life sciences in general and in drug discovery and development in particular.
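Dual-colour measurements of the kind described above are typically separated by linear spectral unmixing: the intensity recorded through each band-pass filter is a known mixture of the two reporters, and a small linear system recovers the individual contributions. The sketch below solves a 2x2 unmixing problem with a hypothetical spillover matrix (in practice the matrix is calibrated from single-reporter controls).

```python
# Hedged sketch: separating the contributions of a green and a red
# luciferase from intensities measured in two filter channels.
# The spillover matrix is invented for illustration.

def unmix(i_green_ch, i_red_ch, m):
    """Solve m @ [green, red] = [i_green_ch, i_red_ch] for a 2x2
    spillover matrix m, via Cramer's rule."""
    (a, b), (c, d) = m
    det = a * d - b * c
    green = (i_green_ch * d - b * i_red_ch) / det
    red = (a * i_red_ch - c * i_green_ch) / det
    return green, red

# rows: measurement channels; columns: (green reporter, red reporter)
spillover = [[0.9, 0.2],
             [0.1, 0.8]]
green, red = unmix(470.0, 130.0, spillover)
# recovers green = 500.0, red = 100.0 arbitrary units
```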
Abstract:
Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents in Europe, such as the ammonium nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios were not considered in the site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are not yet properly identified, may remain unidentified until they take place for the first time. The consideration of atypical scenarios, deviating from normal expectations of unwanted events or from worst-case reference scenarios, is thus extremely challenging. A specific method named Dynamic Procedure for Atypical Scenarios Identification (DyPASI) was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed, by systematizing information from early signals of risk related to past events, near misses and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened the knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool for obtaining a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying accident causes of atypical events, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, suggesting that its synergy with DyPASI would be an adequate strategy for improving hazard identification methodologies towards the capture of atypical accident scenarios.
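As a loose illustration of the gap-screening idea behind this kind of method, risk notions harvested from past events and near misses can be compared against the scenarios a bow-tie already covers, flagging the unmatched ones as candidate atypical scenarios. This is only a conceptual sketch, and all scenario names are invented.

```python
# Conceptual sketch: flag risk notions from early warnings that are
# not yet covered by the existing bow-tie. Names are hypothetical.

def atypical_candidates(bow_tie_scenarios, risk_notions):
    covered = {s.lower() for s in bow_tie_scenarios}
    return sorted(n for n in risk_notions if n.lower() not in covered)

bow_tie = ["pool fire", "jet fire", "VCE"]
notions = ["pool fire", "rapid phase transition", "rollover"]
gaps = atypical_candidates(bow_tie, notions)
# gaps -> ["rapid phase transition", "rollover"]
```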
Abstract:
In recent years, Reverse Engineering systems have attracted considerable interest for a wide range of applications. Many research activities therefore focus on the accuracy and precision of the acquired data and on improvements to the post-processing phase. In this context, this PhD thesis deals with the definition of two novel methods, one for data post-processing and one for data fusion between physical and geometrical information. In particular, a technique has been defined for characterizing the error in the 3D point coordinates acquired by an optical triangulation laser scanner, with the aim of identifying adequate correction arrays to apply under different acquisition parameters and operative conditions. The systematic error in the acquired data is thus compensated, in order to increase accuracy. Moreover, the definition of a 3D thermogram is examined: the geometrical information of an object and its thermal properties, coming from a thermographic inspection, are combined in order to obtain a temperature value for each recognizable point. Data acquired by the optical triangulation laser scanner are also used to normalize the temperature values and make the thermal data independent of the thermal camera's point of view.
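The correction-array idea can be sketched as a per-region systematic offset, estimated beforehand on reference artefacts, subtracted from each acquired point. The binning rule and offset values below are hypothetical, chosen only to show the compensation step.

```python
# Minimal sketch: compensate a systematic z error in scanner points
# using a pre-computed correction array. Offsets and binning are
# invented for illustration.

def apply_correction(points, z_offset_by_bin, bin_size=10.0):
    corrected = []
    for x, y, z in points:
        key = int(x // bin_size)            # pick the correction bin from x
        dz = z_offset_by_bin.get(key, 0.0)  # systematic error for that bin
        corrected.append((x, y, z - dz))
    return corrected

offsets = {0: 0.05, 1: -0.02}               # mm, from a calibration step
cloud = [(3.0, 1.0, 10.05), (12.0, 2.0, 9.98)]
clean = apply_correction(cloud, offsets)
# both points are brought back to the nominal z = 10.0 mm
```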
Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory, and the CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the Angular Power Spectrum (APS) extraction performances of the optimal QML method, implemented in the code called BolPol, and of the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyse the TT parity and Low Variance anomalies in the Planck maps, which show a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, called Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared to that of the devices used nowadays.
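At its simplest, the APS estimation underlying both QML and pseudo-Cl pipelines reduces to averaging |a_lm|^2 over m for each multipole l. The sketch below applies that definition to a handful of made-up harmonic coefficients; it is the textbook estimator, not the BolPol or Cromaster implementation.

```python
# Hedged sketch of the basic angular power spectrum estimator:
# C_l = sum_m |a_lm|^2 / (2l + 1). Coefficients are invented.

def angular_power_spectrum(alm):
    """alm: dict mapping (l, m) -> complex coefficient, stored for
    m >= 0 with a_{l,-m} = (-1)^m conj(a_{l,m}) implied.
    Returns {l: C_l}."""
    totals = {}
    for (l, m), a in alm.items():
        # m > 0 terms count twice because of the implied -m modes
        weight = 1 if m == 0 else 2
        totals[l] = totals.get(l, 0.0) + weight * abs(a) ** 2
    return {l: v / (2 * l + 1) for l, v in totals.items()}

alm = {(2, 0): 1.0 + 0j, (2, 1): 0.5 + 0.5j, (2, 2): 0.0 + 1.0j}
cl = angular_power_spectrum(alm)
# cl[2] = (1 + 2*0.5 + 2*1) / 5 = 0.8
```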
Abstract:
This thesis focuses on Smart Grid applications in medium-voltage distribution networks. For the development of new applications, it is useful to have simulation tools able to model the dynamic behaviour of both the power system and the communication network. Such a co-simulation environment allows assessing the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and determining the range of control schemes that different communication technologies can support. For this reason, a co-simulation platform is presented that has been built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyse the coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in the distribution network. The thesis focuses on a control structure based on the use of phasor measurement units (PMUs). In order to limit the required reinforcements of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study focuses on leaderless multi-agent system (MAS) schemes that do not assign special coordinating roles to specific agents. Leaderless MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent, and to be less affected by the limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks. Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on the medium-voltage distribution network caused by the concurrent connection of electric vehicles.
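The lock-step data exchange at the heart of such a co-simulation can be sketched with trivial stand-ins for the two simulators: at each synchronization instant the grid model produces a measurement, and the network model returns the (delayed) control command. This mirrors only the coupling pattern, not the EMTP or OPNET interfaces.

```python
# Conceptual sketch of a co-simulation loop: power-system and
# communication-network models advance together and exchange data at
# fixed synchronization points. Both models are toy stand-ins.

def cosimulate(steps, dt, grid_step, network_step):
    measurement, command = 0.0, 0.0
    log = []
    for k in range(steps):
        t = k * dt
        measurement = grid_step(t, command)      # electrical state update
        command = network_step(t, measurement)   # control message (one-step delay)
        log.append((t, measurement, command))
    return log

# stand-in dynamics: bus voltage sags, controller pushes it toward 1.0 p.u.
grid = lambda t, cmd: 0.95 + cmd
net = lambda t, meas: 0.5 * (1.0 - meas)
trace = cosimulate(5, 0.1, grid, net)
```

The one-step lag between measurement and command is exactly the kind of communication-induced delay whose effect the real platform is built to quantify.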
Abstract:
Heavy Liquid Metal Cooled Reactors are among the concepts fostered by the GIF as potentially able to comply with stringent safety, economics, sustainability, proliferation resistance and physical protection requirements. The increasing interest in these innovative systems has highlighted the lack of tools specifically dedicated to their core design stage. The present PhD thesis summarizes a three-year effort to partially close this gap, by rationally defining the role of codes in core design, by creating a development methodology for core design-oriented codes (DOCs) and by applying it to the design areas where it is most needed: fuel assembly thermal-hydraulics and fuel pin thermo-mechanics. Regarding the former, following the established methodology, the sub-channel code ANTEO+ has been conceived. Initially restricted to the forced convection regime and subsequently extended to the mixed convection one, ANTEO+ has been demonstrated, via a thorough validation campaign, to be a reliable tool for design applications. Concerning the fuel pin thermo-mechanics, the intention to include safety-related considerations at the outset of the pin dimensioning process gave birth to the safety-informed DOC TEMIDE. The proposed DOC development methodology has also been applied to TEMIDE; given the complex interdependence patterns among the numerous phenomena involved in an irradiated fuel pin, a sensitivity analysis has been performed in the anticipated application domain in order to optimize the final structure of the code. The development methodology has also been tested in the verification and validation phases; the latter, owing to the low availability of experiments truly representative of TEMIDE's application domain, was only a preliminary attempt to test TEMIDE's capability to fulfil the DOC requirements upon which it was built. In general, the capability of the proposed development methodology for DOCs to deliver tools that help the core designer in preliminarily setting the system configuration has been proven.
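The kind of single-channel energy balance a sub-channel code solves can be illustrated with a minimal axial marching sketch: the coolant temperature rise in each axial cell is the linear power times the cell length, divided by mass flow rate and specific heat. The numbers below are generic placeholders, not LBE design values from the thesis.

```python
# Minimal sketch of the axial coolant energy balance in a sub-channel:
# dT per cell = q' * dz / (m_dot * c_p). Inputs are illustrative only.

def axial_temperature(t_in, linear_power, m_dot, cp, dz, n_nodes):
    """March the coolant temperature along n_nodes axial cells
    with uniform linear power (W/m)."""
    temps = [t_in]
    for _ in range(n_nodes):
        dT = linear_power * dz / (m_dot * cp)   # energy balance per cell
        temps.append(temps[-1] + dT)
    return temps

profile = axial_temperature(t_in=400.0, linear_power=20_000.0,
                            m_dot=1.0, cp=145.0, dz=0.1, n_nodes=10)
# outlet temperature = 400 + 20000 * 1.0 / (1.0 * 145) degrees
```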
Design and Development of a Research Framework for Prototyping Control Tower Augmented Reality Tools
Abstract:
The purpose of the air traffic management system is to ensure the safe and efficient flow of air traffic. Therefore, while augmenting efficiency, throughput and capacity in airport operations, attention has rightly been placed on doing so in a safe manner. In the control tower, many advances in operational safety have come in the form of visualization tools for tower controllers. However, there is a paradox in developing such systems to increase controllers' situational awareness: by creating additional computer displays, the controller's vision is pulled away from the outside view, and the time spent looking down at the monitors increases. This reduces situational awareness by forcing controllers to mentally and physically switch between the head-down equipment and the outside view. This research is based on the idea that augmented reality may be able to address this issue. The augmented reality concept has become increasingly popular over the past decade and is being used proficiently in many fields, such as entertainment, cultural heritage, aviation, and military & defense. This know-how could be transferred to air traffic control with relatively low effort and substantial benefits for controllers' situation awareness. Research on this topic is consistent with the SESAR objectives of increasing air traffic controllers' situation awareness and enabling up to 10% additional flights at congested airports, while still increasing safety and efficiency. During the Ph.D., a research framework for prototyping augmented reality tools was set up. This framework consists of methodological tools for designing the augmented reality overlays, as well as of hardware and software equipment to test them. Several overlays have been designed and implemented in a simulated tower environment, a virtual reconstruction of the Bologna airport control tower. The positive impact of these tools was preliminarily assessed by means of the proposed methodology.
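The geometric core of any tower AR overlay is projecting an airport-referenced 3D position into the controller's view so that a label lands on the right aircraft. A minimal pinhole-model sketch, with entirely hypothetical camera pose and intrinsics:

```python
# Hedged sketch: project a 3D point into screen coordinates with a
# pinhole camera model. Camera parameters are invented placeholders.

def project(point, cam_pos, focal, width, height):
    """point and cam_pos are (x right, y up, z forward) in the camera
    frame; returns pixel coordinates (u, v), or None if behind viewer."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        return None                       # behind the camera
    u = width / 2 + focal * x / z
    v = height / 2 - focal * y / z        # image v grows downwards
    return u, v

# aircraft 500 m ahead, 40 m to the left, 30 m below the tower camera
label = project((-40.0, -30.0, 500.0), (0.0, 0.0, 0.0),
                focal=1000.0, width=1920, height=1080)
# label -> (880.0, 600.0)
```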
Abstract:
Antigen design is generally driven by the need to obtain enhanced stability, efficiency and safety in vaccines. Unfortunately, antigen modification rarely proceeds in parallel with the development of analytical tools for its characterization. The set-up of analytical tools is required throughout the vaccine manufacturing pipeline, for vaccine production modifications, improvements or regulatory requirements. Despite the relevance of bioconjugate vaccines, robust and consistent analytical tools to evaluate the extent of carrier glycosylation are missing. Bioconjugation is a glycoengineering technology aimed at producing N-glycoproteins in vivo in E. coli cells, based on the PglB-dependent system of C. jejuni, and applied for the production of several glycoconjugate vaccines. This applicability is due to the ability of glycocompetent E. coli to produce site-selectively glycosylated proteins that can be used, after a few purification steps, as vaccines able to elicit both humoral and cell-mediated immune responses. Here, S. aureus Hla bioconjugated with CP5 was used to perform a rational, analytically driven design of the glycosylation sites, for the quantification of the glycosylation extent by mass spectrometry. The aim of the study was to develop an MS-based approach to quantify the glycosylation extent for in-process monitoring of bioconjugate production and for final product characterization. The three designed consensus sequences differ by a single amino-acid residue and fulfil the prerequisites for an engineered bioconjugate more appropriate from an analytical perspective. We aimed to achieve an optimal MS detectability of the peptides carrying the consensus sequences, while complying with the well-characterized requirements for N-glycosylation by PglB. The Hla carrier isoforms bearing these consensus sequences allowed a recovery of about 20 ng/μg of periplasmic protein, glycosylated at 40%. The SRM-MS method developed here was successfully applied to evaluate the differential site occupancy when the carrier protein presents two glycosites. The glycosylation extent in each glycosite was determined, and the differences between the isoforms were influenced both by the overall source of the protein produced and by the position of the glycosite insertion. The analytically driven design of the bioconjugated antigen and the development of an accurate, precise and robust analytical method allowed a fine characterization of the vaccine.
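The site-occupancy figure behind this kind of SRM monitoring is, in essence, a ratio: the glycopeptide signal over the summed glycosylated plus non-glycosylated signals for that site. A sketch with hypothetical peak areas (not values from the thesis):

```python
# Sketch of glycosylation-site occupancy from SRM peak areas.
# Areas are invented placeholders.

def site_occupancy(glyco_area, aglyco_area):
    """Fraction of the site carrying the glycan."""
    return glyco_area / (glyco_area + aglyco_area)

sites = {"site_1": (4.0e5, 6.0e5),   # (glycopeptide, aglycopeptide) areas
         "site_2": (2.4e5, 3.6e5)}
occupancy = {s: site_occupancy(g, a) for s, (g, a) in sites.items()}
# both hypothetical sites come out 40% occupied
```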
Abstract:
This thesis focuses on the Helicon Plasma Thruster (HPT) as a candidate for generating thrust for small satellites and CubeSats. Two main topics are addressed: the development of a Global Model (GM) and of a 3D self-consistent numerical tool. The GM is suitable for the preliminary analysis of HPTs operating with noble gases such as argon, neon, krypton and xenon, and with alternative propellants such as air and iodine. A lumping methodology is developed to reduce the computational cost of modelling the excited species in the plasma chemistry. The 3D self-consistent numerical tool can treat discharges with a generic 3D geometry and model the actual plasma-antenna coupling. It consists of two main modules, an EM module and a FLUID module, which run iteratively until a steady-state solution is reached. A third module is available for solving the plume with a simplified semi-analytical approach, with a PIC code, or directly by integration of the fluid equations. Results obtained from both numerical tools are benchmarked against experimental measurements of HPTs or Helicon reactors, with very good qualitative agreement with the experimental trends for the GM, and excellent agreement between the predicted physical trends and the measured data for the 3D numerical strategy.
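A global model of this kind ultimately integrates volume-averaged species balance equations. The sketch below integrates a single electron-density balance (volume ionization minus wall losses) with a forward-Euler step; the rate coefficients are invented placeholders, not the thesis chemistry set.

```python
# Conceptual sketch of one global-model balance equation:
# dn_e/dt = k_ion * n_gas * n_e - loss_rate * n_e, forward Euler.
# All rates and densities are illustrative placeholders.

def evolve_density(n_e, n_gas, k_ion, loss_rate, dt, steps):
    for _ in range(steps):
        dn = k_ion * n_gas * n_e - loss_rate * n_e   # source - sink
        n_e = n_e + dt * dn
    return n_e

n_final = evolve_density(n_e=1e15, n_gas=1e20, k_ion=1e-17,
                         loss_rate=900.0, dt=1e-6, steps=1000)
# ionization slightly outpaces losses, so the density grows
```

A real GM couples many such equations (one per species, plus a power balance); this shows only the integration pattern.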
Abstract:
Thanks to the development and combination of molecular markers for the genetic traceability of sunflower varieties and of a gas chromatographic method for the determination of the fatty acid (FA) composition of sunflower oil, it was possible to implement an experimental method for verifying both the traceability and the variety of the organic sunflower marketed by Agricola Grains S.p.A. The experimental activity focused on two objectives: the implementation of molecular markers for the routine control of raw material deliveries for oil extraction, and the improvement and validation of a gas chromatographic method for the determination of the FA composition of sunflower oil. With regard to variety verification and traceability, the marker systems evaluated were: 12 SSR markers arranged in two multiplex sets, and SCAR markers for the verification of cytoplasmic male sterility (Pet1) and fertility. In addition, two further objectives were pursued in order to enable routine application in the industrial field: the development of a suitable protocol for DNA extraction from single seeds, and the implementation of a semi-automatic capillary electrophoresis system for the analysis of the marker fragments. A new GC/FID analytical method for the determination of fatty acid methyl esters (FAME) in sunflower achenes was developed and validated to improve the quality and efficiency of the analytical workflow in the control of raw and refined materials entering the Agricola Grains S.p.A. production chain. The analytical performances validated for the newly implemented method are: linearity of response, limit of quantification, specificity, precision, intra-laboratory precision, robustness and bias. These parameters were used to compare the newly developed method with the reference methods, Commission Regulation No. 2568/91 and Commission Implementing Regulation No. 2015/1833. Using the combination of the analytical methods mentioned above, the documentary traceability of the product can be confirmed experimentally, providing relevant information for subsequent marketing.
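FA composition from a GC/FID run is commonly reported by area normalization: each fatty acid's peak area divided by the total identified area. A sketch with illustrative peak areas (not chromatograms from the thesis):

```python
# Sketch of area-normalized FAME composition from GC/FID peak areas.
# Peak areas are invented for illustration.

def fame_composition(peak_areas):
    total = sum(peak_areas.values())
    return {fa: 100.0 * area / total for fa, area in peak_areas.items()}

areas = {"C16:0": 620.0, "C18:0": 350.0, "C18:1": 2900.0, "C18:2": 6130.0}
percent = fame_composition(areas)
# a linoleic-type profile: C18:2 dominates at 61.3 %
```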
Abstract:
Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, besides assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work, a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools are specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, for chemical and industrial processes in which substances dangerous for humans and the environment are used or stored. The tools are mainly intended for application in the "conceptual" and "basic design" stages, when the project is still open to changes (thanks to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design.
The proposed decision support system is based on a set of leading key performance indicators (KPIs), which allow the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs are based on impact models (including complex ones), but are easy and swift in practical application, and their full evaluation is possible even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in the different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based both on the calculation of the expected consequences of potential accidents and on the evaluation of the hazards related to the equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, ad hoc tools were developed to account for the hazard of domino escalation and for safety economics. The effectiveness and value of the tools were demonstrated through their application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, of the plant and of the layout) and different types of processes and plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of the isomers of nitrobenzaldehyde) provided the input data necessary to demonstrate the method for the inherent safety assessment of materials.
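The KPI aggregation step of such a decision support system can be sketched as a weighted combination of normalized indicators, with the weights expressing the sustainability policy. Indicator values and weights below are hypothetical, used only to show how two design options would be ranked.

```python
# Sketch of policy-weighted KPI aggregation for comparing design
# options. Scores (0..1, higher is better) and weights are invented.

def aggregate(kpis, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9   # weights must sum to 1
    return sum(weights[k] * v for k, v in kpis.items())

weights = {"economic": 0.4, "societal": 0.3, "environmental": 0.3}
option_a = {"economic": 0.8, "societal": 0.6, "environmental": 0.5}
option_b = {"economic": 0.6, "societal": 0.7, "environmental": 0.9}
scores = {"A": aggregate(option_a, weights),
          "B": aggregate(option_b, weights)}
# A -> 0.65, B -> 0.72: option B is preferred under this policy
```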
Abstract:
ALICE, an experiment at CERN's LHC, specializes in the analysis of lead-ion collisions. ALICE will study the properties of the quark-gluon plasma, a state of matter in which quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE sub-detectors, is part of the ITS, which is composed of 6 cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point, and it also aligns the tracks of the particles detected by the more external detectors. The two middle layers of the ITS contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors; in total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data coming from the front-end electronics through optical link connections, stores them in a large data FIFO and then sends them to the DAQ system. Each CARLOSrx consists of two boards: CARLOSrx data, which reads the data coming from the SDD detectors and configures the front-end electronics (FEE), and CARLOSrx clock, which sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.
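The readout pattern described above, many input links feeding one shared FIFO that is drained toward the DAQ, can be mimicked with a toy software model. The FIFO depth, word format and drain rate below are invented; this mirrors the data flow only, not the CARLOSrx firmware.

```python
# Toy model of a multi-link readout: words from 12 input links are
# packed into a shared FIFO, then drained toward the DAQ. All sizes
# and payloads are hypothetical.

from collections import deque

def readout_cycle(link_words, fifo, fifo_depth, drain):
    """Push one word per link into the FIFO, then drain up to `drain`
    words toward the DAQ. Returns (sent words, dropped count)."""
    dropped = 0
    for word in link_words:
        if len(fifo) < fifo_depth:
            fifo.append(word)
        else:
            dropped += 1          # a real system would assert back-pressure
    sent = [fifo.popleft() for _ in range(min(drain, len(fifo)))]
    return sent, dropped

fifo = deque()
words = [f"sdd{w}" for w in range(12)]   # one word from each of 12 links
sent, dropped = readout_cycle(words, fifo, fifo_depth=16, drain=8)
# 8 words go to the DAQ, 4 remain buffered, none dropped
```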
Abstract:
With their accession to the European Union, twelve new countries - Romania among them - (re)entered the community of international donors. In the history of development aid this can be seen as a unique event: it is the first time in history that such a large number of countries have become international donors, at such short notice and in such a particular context, one in which some scholars were announcing the 'death' of development. But in spite of what might be claimed regarding the 'end' of the development era, development discourse seems to be rather vigorous and in good health: it is able to exert an undeniable force of attraction over the twelve countries that, in a matter of years, have already convinced themselves of its validity and adhered to its main tenets. This thesis collects evidence for improving our understanding of this process, which sees the co-optation of twelve new countries into the dominant theory and practice of development cooperation. The evidence collected seems to show that one of the tools employed by the promoters of this co-optation process is that of constructing the 'new' Member States as 'new', inexpert donors that need to learn from the 'old' ones. By taking a case-study approach, this thesis gathers data suggesting that conceiving of the 'twelve' as 'new' donors is both historically inaccurate and value-laden. On the one hand, Romania's case study illustrates how, in the (socialist) past, at least one of the group of twelve was particularly conversant with the discourse of international development. On the other hand, the process of co-optation, while being presented as a knowledge-producing process, can also be seen as an ignorance-producing procedure: Romania, along with its fellow new Member States, takes the opportunity of 'building its capacity' and 'raising its awareness' of development cooperation along the line drawn by the European Union, but at the same time it seems to un-learn and 'lower' its awareness of the development experience of its (socialist) past. This is one possible reading of this thesis. At a different level, the thesis can also be seen as an attempt to give an account of almost five decades of international development discourse in one specific country - Romania - across three different socio-political contexts: the socialist years (up to 1989), the 'transition years' (from 1989 to the pre-accession years) and the years of membership of the European Union. In this second reading, the thesis seeks to illustrate how - contrary to widespread beliefs - before 1989 Romania's international development discourse was particularly vivid: in the most varied national and international settings President Ceausescu unfolded an extensive discursive activity on issues pertaining to international development; generous media coverage of the affairs of the developing countries and their fight for development was the rule rather than the exception; the political leadership wanted Romanians not only to be familiarized with (or 'aware of', to use current terminology) matters of underdevelopment, but also to show solidarity with these countries, as well as pride in the relations of 'mutual help' that were being built with them; finally, international development was an object of academic attention, and Romanian scholars were able not only to reflect on major developments, but could also formulate critical positions towards the practices of development aid. Very little remains of all this during the transition years, while in the present those who are engaged in matters pertaining to international development do so with a view to building Romania as an EU-compliant donor.
Abstract:
The dolphin (Tursiops truncatus) is a mammal adapted to life in a totally aquatic environment. Despite the popularity and even iconic status of the dolphin, our knowledge of its physiology, of its unique adaptations and of the effects of environmental stressors on it is limited. One approach to improving this limited understanding is the implementation of established cellular and molecular methods to provide sensitive and insightful information for dolphin biology. We initiated our studies with the analysis of wild dolphin peripheral blood leukocytes, which have the potential to be informative of the animal's global immune status. Transcriptomic profiles from almost 200 individual samples were analyzed using a newly developed species-specific microarray to assess its value as a prognostic and diagnostic tool. Functional genomics analyses were informative of stress-induced gene expression profiles and also of geographical-location-specific transcriptomic signatures, determined by the interaction of genetic, disease and environmental factors. We have developed quantitative metrics to unambiguously characterize the phenotypic properties of dolphin cells in culture; these metrics provide identifiable characteristics and baseline data that will enable the identification of changes in the cells due to time in culture. We have also developed a novel protocol to isolate primary cultures from cryopreserved tissue of stranded marine mammals, establishing a tissue (and cell) biorepository, a new approach that can provide a solution to the limited availability of samples. The work presented represents the development and application of tools for the study of the biology, health and physiology of the dolphin, and establishes their relevance for future studies of the impact of environmental infection and stress on the dolphin.
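A basic step in comparing such transcriptomic profiles between groups (for example, two geographical locations) is the log2 fold change of a gene's mean expression. The sketch below uses invented expression values, not data from the dolphin microarray study.

```python
# Sketch of a log2 fold-change computation between two sample groups.
# Expression values are invented placeholders.

import math

def log2_fold_change(group_a, group_b):
    mean_a = sum(group_a) / len(group_a)
    mean_b = sum(group_b) / len(group_b)
    return math.log2(mean_a / mean_b)

location_1 = [120.0, 135.0, 105.0]   # one gene's expression, 3 animals
location_2 = [60.0, 55.0, 65.0]
lfc = log2_fold_change(location_1, location_2)
# mean 120 vs mean 60 -> log2 fold change of 1.0 (twofold higher)
```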