961 results for Industrial safety -- Evaluation
Abstract:
This exploratory-descriptive quantitative study aimed to evaluate the protocol for identifying newborns admitted to the Neonatal Intensive and Semi-intensive Therapy Unit of a private hospital. The case series comprised 540 observation opportunities, selected by simple random probability sampling. Data were collected between May and August 2010 using a structured form and analyzed with descriptive statistics. The protocol's overall performance had a conformity index of 82.2%. The protocol had three stages: identification components, the condition of the identification wristbands, and the number of identification wristbands. The highest conformity (93%) was found at the second stage and the lowest (89.3%) at the third, a statistically significant difference (p = 0.046). In the group of 'special' neonates, 88.5% conformity was achieved. These results will make it possible to restructure the newborn identification protocol and to establish care and managerial goals to improve the quality of care and patient safety.
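As a minimal illustration of the statistics involved, the sketch below (Python, assuming SciPy is available and treating the 540 observation opportunities as the sample size for each stage, which the abstract does not state) reproduces the kind of two-proportion comparison that yields a p-value for the stage-2 versus stage-3 conformity difference:

    from scipy.stats import norm

    def two_proportion_p(p1, n1, p2, n2):
        # Pooled two-proportion z-test, a common way to compare
        # conformity percentages between two protocol stages.
        p = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5
        z = (p1 - p2) / se
        return 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value

    n = 540                                      # observation opportunities (assumed per stage)
    print(two_proportion_p(0.930, n, 0.893, n))  # stage 2 vs stage 3 conformity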
Abstract:
In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known for being NP-complete, and therefore most contributions on the topic focus on developing heuristics that can obtain good solutions in short CPU times. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea, and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
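The abstract does not detail the algorithm, but a classical TOC-style constructive heuristic for the product-mix problem can be sketched as follows (Python; the product data, names and greedy ranking rule are illustrative assumptions, not the authors' method):

    # Greedy TOC-style heuristic: find the bottleneck resource, then rank
    # products by contribution margin per unit of bottleneck time.
    def product_mix(products, capacity):
        # products: dicts with 'name', 'margin', 'demand', 'usage' (resource -> time)
        # capacity: dict resource -> available time
        load = {r: sum(p['usage'][r] * p['demand'] for p in products) for r in capacity}
        bottleneck = max(capacity, key=lambda r: load[r] / capacity[r])
        ranked = sorted(products,
                        key=lambda p: p['margin'] / max(p['usage'][bottleneck], 1e-9),
                        reverse=True)
        remaining, mix = dict(capacity), {}
        for p in ranked:
            qty = min([p['demand']] +
                      [remaining[r] // p['usage'][r] for r in capacity if p['usage'][r] > 0])
            mix[p['name']] = qty
            for r in capacity:
                remaining[r] -= qty * p['usage'][r]
        return mix

    products = [{'name': 'P', 'margin': 45.0, 'demand': 100, 'usage': {'A': 15.0, 'B': 10.0}},
                {'name': 'Q', 'margin': 60.0, 'demand': 50, 'usage': {'A': 10.0, 'B': 30.0}}]
    print(product_mix(products, {'A': 2400.0, 'B': 2400.0}))  # {'P': 100, 'Q': 46.0}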
Abstract:
Food safety is a global concern. Meat represents the most important protein source for humans. Thus, contamination of meat products by nonessential elements is a ready source of human exposure. In addition, knowledge of the concentration of essential elements is also relevant with respect to human nutrition. The aim of the present study was to determine the concentration of 17 elements in pork, beef, and chicken produced in Brazil. Meat samples were analyzed by inductively coupled plasma mass spectrometry. The estimated daily intake for nonessential elements including arsenic (As), cadmium (Cd), lead (Pb), mercury (Hg), and antimony (Sb) through meat consumption is below the toxicological reference values. However, high levels were detected for the nonessential element cesium (Cs), mainly in beef samples, an observation that deserves future studies to identify the source of contamination and potential adverse consequences.
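The estimated daily intake (EDI) comparison follows a standard formula; a minimal sketch with hypothetical numbers (Python; the concentration, consumption figure and reference value are illustrative, not the study's data):

    def edi(conc_ug_per_kg, intake_g_per_day, body_weight_kg=70.0):
        # EDI (ug per kg body weight per day) =
        #   concentration x daily consumption / body weight
        return conc_ug_per_kg * (intake_g_per_day / 1000.0) / body_weight_kg

    cd_beef = 5.0        # ug/kg, hypothetical cadmium level in beef
    consumption = 100.0  # g/day, hypothetical meat consumption
    reference = 0.8      # ug/kg bw/day, hypothetical tolerable intake
    value = edi(cd_beef, consumption)
    print(value, value < reference)  # ~0.007 -> below the reference value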
Abstract:
Five microbial lipase preparations from several sources were immobilized by hydrophobic adsorption on small or large poly-hydroxybutyrate (PHB) beads, and the effect of the support particle size on biocatalyst activity was assessed in the hydrolysis of olive oil, the esterification of butyric acid with butanol, and the transesterification of babassu oil (Orbignya sp.) with ethanol. The catalytic activity of the immobilized lipases in both olive oil hydrolysis and biodiesel synthesis was influenced by the PHB particle size and the lipase source. In the esterification reaction no such influence was observed. Geobacillus thermocatenulatus lipase (BTL2) was considered inadequate to catalyze biodiesel synthesis, but displayed high esterification activity. Butyl butyrate synthesis catalyzed by BTL2 immobilized on small PHB beads gave the highest yield (≈90 mmol L⁻¹). In biodiesel synthesis, the catalytic activity of the immobilized lipases was significantly increased in comparison with the free lipases. Full conversion of babassu oil into ethyl esters was achieved at 72 h in the presence of Pseudozyma antarctica type B lipase (CALB) or Thermomyces lanuginosus lipase (Lipex® 100L) immobilized on either small or large PHB beads, and of Pseudomonas fluorescens lipase (PFL) immobilized on large PHB beads. The latter preparation presented the highest productivity (40.9 mg of ethyl esters mg⁻¹ immobilized protein h⁻¹).
Abstract:
The assessment of the thermal process impact in terms of food safety and quality is of great importance for process evaluation and design. This can be accomplished from the analysis of the residence time and temperature distributions coupled with the kinetics of thermal change, or from the use of a proper time-temperature integrator (TTI) as an indicator of safety and quality. The objective of this work was to develop and test enzymic TTIs with rapid detection for the evaluation of continuous HTST pasteurization processes (70–85 °C, 10–60 s) of low-viscosity liquid foods, such as milk and juices. The enzymes peroxidase, lactoperoxidase and alkaline phosphatase in phosphate buffer were tested, and activity was determined with commercial reflectometric strips. Discontinuous thermal treatments at various time-temperature combinations were performed in order to fit a first-order kinetic model of a two-component system. The measured time-temperature history was considered instead of assuming isothermal conditions. Experiments with slow heating and cooling were used to validate the fitted model. Only the alkaline phosphatase TTI showed potential for use in the evaluation of pasteurization processes. The choice was based on the obtained z-values of the thermostable and thermolabile fractions, on the cost, and on the validation tests.
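A minimal sketch of the two-component first-order TTI model, integrated over a measured time-temperature history rather than assuming isothermal conditions (Python; the kinetic parameters and heating profile are illustrative placeholders, not the fitted values of the study):

    import math

    def rate(T, k_ref, z, T_ref=75.0):
        # First-order inactivation rate with z-value temperature dependence.
        return k_ref * 10 ** ((T - T_ref) / z)

    def residual_activity(history, fractions):
        # history: list of (time_s, temp_C); fractions: list of (A0, k_ref, z)
        total = 0.0
        for A0, k_ref, z in fractions:
            integral = 0.0
            for (t0, T0), (t1, T1) in zip(history, history[1:]):
                # trapezoidal integration of k(T(t)) over the measured history
                integral += 0.5 * (rate(T0, k_ref, z) + rate(T1, k_ref, z)) * (t1 - t0)
            total += A0 * math.exp(-integral)
        return total

    history = [(0, 20), (5, 72), (20, 72), (25, 20)]    # heat, hold, cool (s, degC)
    fractions = [(0.7, 0.5, 6.0), (0.3, 0.01, 25.0)]    # thermolabile + thermostable
    print(residual_activity(history, fractions))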
Abstract:
The capability of lactic acid bacteria (LAB) to produce antilisterial bacteriocins can be explored by the food industry as a tool to increase the safety of foods. Furthermore, the probiotic activity of bacteriocinogenic LAB brings extra advantages to these strains, as they can confer health benefits on the consumer. Beneficial effects depend on the ability of the probiotic strains to maintain viability in the food during shelf-life, to survive the natural defenses of the host, and to multiply in the gastrointestinal tract (GIT). This study evaluated the probiotic potential of a bacteriocinogenic Lactobacillus plantarum strain (Lb. plantarum ST16Pa) isolated from papaya fruit and studied the effect of encapsulation in alginate on survival under conditions simulating the human GIT. Good growth of Lb. plantarum ST16Pa was recorded in MRS broth with initial pH values between 5.0 and 9.0, together with good capability to survive at pH 4.0, 11.0 and 13.0. Lb. plantarum ST16Pa grew well in the presence of oxbile at concentrations ranging from 0.2 to 3.0%. The level of auto-aggregation was 37%, and various degrees of co-aggregation were observed with different strains of Lb. plantarum, Enterococcus spp., Lb. sakei and Listeria, which are important features for probiotic activity. Growth was negatively affected by several medications used in human therapy, mainly anti-inflammatory drugs and antibiotics. Adhesion to Caco-2 cells was within the range reported for other probiotic strains, and PCR analysis indicated that the strain harbors the adhesion genes mapA, mub and EF-Tu. Encapsulation in 2, 3 and 4% alginate protected the cells from exposure to 1 or 2% oxbile added to MRS broth. Studies in a model simulating transit through the GIT indicated that encapsulated cells were protected from the acidic conditions in the stomach but were less resistant under conditions simulating the duodenum, jejunum, ileum and first section of the colon. To our knowledge, this is the first report of a bacteriocinogenic LAB isolated from papaya that has application in food biopreservation and may benefit consumer health due to its potential probiotic characteristics.
Abstract:
Background: Previous experiments have shown that a decoction of Bauhinia forficata leaves reduces the changes in carbohydrate and protein metabolism that occur in rats with streptozotocin-induced diabetes. In the present investigation, the serum activities of enzymes known to be reliable toxicity markers were monitored in normal and streptozotocin-diabetic rats to discover whether the use of B. forficata decoction has toxic effects on liver, muscle or pancreas tissue or on renal microcirculation. Methods: An experimental group of normal and streptozotocin-diabetic rats received an aqueous decoction of fresh B. forficata leaves (150 g/L) by mouth for 33 days, while a control group of normal and diabetic rats received water for the same length of time. The serum activities of the toxicity markers lactate dehydrogenase, creatine kinase, amylase and angiotensin-converting enzyme, as well as bilirubin, were assayed before administration of the B. forficata decoction and on days 19 and 33 of treatment. Results: The toxicity markers in normal and diabetic rats were altered neither by the diabetes itself nor by treatment with the decoction. Whether or not they received B. forficata decoction, the normal rats showed a significant increase in serum amylase activity during the experimental period, while the diabetic rats, both treated and untreated with the decoction, tended to have lower serum amylase activities than the normal rats. Conclusions: Administration of an aqueous decoction of B. forficata is a potential treatment for diabetes and does not produce toxic effects measurable with the enzyme markers used in our study.
Abstract:
Photodynamic therapy (PDT) is based on the synergism of a photosensitive drug (a photosensitizer) and visible light to destroy target cells (e.g., malignant, premalignant, or bacterial cells). The aim of this study was to investigate the response of normal rat tongue mucosa to PDT following the topical application of hematoporphyrin derivative (Photogem®), Photodithazine®, methylene blue (MB), and poly(lactic-co-glycolic acid) (PLGA) nanoparticles loaded with MB. One hundred and thirty-three rats were randomly divided into groups: the PDT groups were treated with the photosensitizers for 10 min followed by exposure to red light, while the control groups received light exposure alone, photosensitizer alone, or neither photosensitizer nor light. Fluorescence signals were obtained from tongue tissue immediately after the topical application of the photosensitizers and 24 h following PDT. Histological changes were evaluated at baseline and at 1, 3, 7, and 15 days post-PDT. Fluorescence was detected immediately after the application of the photosensitizers, but not 24 h following PDT. Histology revealed intact mucosa in all experimental groups at all evaluation time points. The results suggest that there is a therapeutic window within which PDT with Photogem®, Photodithazine®, MB, and MB-loaded PLGA nanoparticles could safely target oral pathogenic bacteria without damaging normal oral tissue.
Abstract:
The main problem with cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the high amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 × 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. Besides, it has offered a basis for a new scatter correction approach by which it has been possible to achieve images with the same spatial resolution as state-of-the-art well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
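As an illustration of the principle (not the patented Empa algorithm), a projection-domain scatter correction can be sketched as follows (Python with NumPy; the intensities, the SPR value reused from the text, and the flat-field level are illustrative):

    import numpy as np

    def correct_projection(measured, scatter_estimate, i0, eps=1e-6):
        # Subtract the (smooth) scatter estimate, then log-normalize against
        # the flat-field intensity i0 to recover attenuation line integrals.
        primary = np.clip(measured - scatter_estimate, eps, None)
        return -np.log(primary / i0)

    measured = np.full((4, 4), 1000.0)      # toy projection, detector intensities
    spr = 2.3                               # scatter-to-primary ratio (from the text)
    scatter = measured * spr / (1.0 + spr)  # scatter share of the measured signal
    print(correct_projection(measured, scatter, i0=2000.0))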
Abstract:
Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, besides assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements in alternative design options, to allow trade-offs among contradictory aspects and to prevent the “risk shift”. In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous for humans and the environment are used or stored. The tools were mainly devoted to application in the “conceptual” and “basic design” stages, when the project is still open to changes (due to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities, all through the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on the development of a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (including complex ones), but are easy and swift in practical application. Their full evaluation is possible even starting from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in the different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing “out of control” conditions. In the assessment of layout plans, ad hoc tools were developed to account for the hazard of domino escalations and for safety economics.
The effectiveness and value of the tools were demonstrated by applying them to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes and plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of the isomers of nitrobenzaldehyde) provided the input data needed to demonstrate the method for the inherent safety assessment of materials.
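A minimal sketch of how such leading KPIs might be normalized against site-specific reference burdens and aggregated under policy weights (Python; the KPI names, reference values and weights are illustrative assumptions, not the thesis's actual criteria):

    def sustainability_profile(kpis, references, weights):
        # kpis/references: dict name -> value (lower = better in this sketch);
        # weights: dict name -> policy weight.
        normalized = {k: kpis[k] / references[k] for k in kpis}
        aggregate = sum(weights[k] * normalized[k] for k in kpis) / sum(weights.values())
        return normalized, aggregate

    kpis = {'economic': 1.2e6, 'societal_risk': 3e-5, 'environmental': 450.0}
    refs = {'economic': 1.0e6, 'societal_risk': 1e-4, 'environmental': 500.0}
    wgts = {'economic': 1.0, 'societal_risk': 2.0, 'environmental': 1.5}
    print(sustainability_profile(kpis, refs, wgts))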
Abstract:
The aim of the first part of this thesis was to evaluate the effect of trans fatty acid- (TFA), contaminant-, polycyclic aromatic hydrocarbon (PAH)- and oxidation product-enriched diets on the content of TFA and conjugated linoleic acid (CLA) isomers in the meat and liver of both poultry and rabbit. The enriched feeds were prepared with preselected fatty co- and by-products that contained low and high levels of TFA (low, palm fatty acid distillate; high, hydrogenated palm fatty acid distillate), environmental contaminants (dioxins and PCBs; two different fish oils), PAH (olive oil acid oils and pomace olive oil from chemical refining, for low and high levels) and oxidation products (a sunflower-olive oil blend before and after frying), so as to obtain single feeds with three enrichment degrees (high, medium and low) of the compound of interest. This experimental set-up is part of a large, collaborative European project (http://www.ub.edu/feedfat/), in which other chemical and health parameters are assessed. Lipids were extracted, methylated with diazomethane, then transmethylated with 2N KOH/methanol and analyzed by GC and silver-ion TLC-GC. TFA and CLA were determined in the fats, the feeds, and the meat and liver of both poultry and rabbit. In general, the levels of TFA and CLA in meat and liver varied mainly according to those originally found in the feeding fats. It must be pointed out, though, that TFA and CLA accumulation differed between the two animal species, as well as between the two types of tissue. The TFA composition of meat and liver changes according to the composition of the oils added to the feeds, with some differences between species. Chicken meat with skin shows a higher TFA content (2.6–5.4 fold) than rabbit meat, except for the “PAH” trial. Chicken liver shows a higher TFA content (1.2–2.1 fold) than rabbit liver, except for the “TRANS” and “PAH” trials. In both chicken and rabbit meat, the TFA content was highest for the “TRANS” trial, followed by the “DIOXIN” trial. Slight differences were found in the “OXIDATION” and “PAH” trends in both types of meat. In both chicken and rabbit liver, the TFA content was highest for the “TRANS” trial, followed by the “PAH”, “DIOXIN” and “OXIDATION” trials. This trend, however, was not identical to that of the feeds, where the TFA content varied as follows: “TRANS” > “DIOXIN” > “PAH” > “OXIDATION”. In chicken and rabbit meat samples, C18:1 TFA were the most abundant, followed by C18:2 TFA and C18:3 TFA, except for the “DIOXIN” trial, where C18:3 TFA > C18:2 TFA. In chicken and rabbit liver samples from the “TRANS” and “OXIDATION” trials, C18:1 TFA were the most abundant, followed by C18:2 TFA and C18:3 TFA, whereas C18:3 TFA > C18:2 TFA in the “DIOXIN” trial. Slight differences were found in the “PAH” trend in livers from both species. The second part of the thesis dealt with the study of lipid oxidation in washed turkey muscle to which different antioxidants were added. The evaluation of the oxidative stability of muscle foods showed that oxidation could be measured by headspace solid-phase microextraction (SPME) of hexanal and propanal. To make this method effective, an antioxidant system was added to the stored muscle to stop the oxidative processes. An increase in the ionic strength of the sample was also implemented to increase the concentration of aldehydes in the headspace. This method was found to be more sensitive than the commonly used thiobarbituric acid reactive substances (TBARs) method.
However, after antioxidants were added and oxidation was stopped, the concentration of aldehydes decreased. It was found that the decrease in aldehyde concentration was due to the binding of the aldehydes to muscle proteins, thus decreasing the volatility and making them less detectable.
Abstract:
This PhD thesis discusses the rationale for the design and use of synthetic oligosaccharides in the development of glycoconjugate vaccines, and the role of physicochemical methods in the characterization of these vaccines. The study concerns two infectious diseases that represent a serious problem for national healthcare programs: human immunodeficiency virus (HIV) and Group A Streptococcus (GAS) infections. Both pathogens possess distinctive carbohydrate structures that have been described as suitable targets for vaccine design. The Group A Streptococcus cell membrane polysaccharide (GAS-PS) is an attractive vaccine antigen candidate based on its conserved, constant expression pattern and its ability to confer immunoprotection in a relevant mouse model. Analysis of the immunogenic response within at-risk populations suggests an inverse correlation between high anti-GAS-PS antibody titres and GAS infection cases. Recent studies show that a chemically synthesized core polysaccharide-based antigen may represent an antigenic structural determinant of the large polysaccharide. Based on GAS-PS structural analysis, the study evaluates the potential of a synthetic design approach to GAS vaccine development and compares the efficiency of synthetic antigens with that of the long, isolated GAS polysaccharide. Synthetic GAS-PS structural analogues were specifically designed and generated to explore the impact of antigen length and terminal residue composition. For the HIV-1 glycoantigens, the dense glycan shield on the surface of the envelope protein gp120 was chosen as a target. This shield masks conserved protein epitopes and facilitates virus spread via binding to glycan receptors on susceptible host cells. The broadly neutralizing monoclonal antibody 2G12 binds a cluster of high-mannose oligosaccharides on the gp120 subunit of the HIV-1 Env protein. This oligomannose epitope has been a target of synthetic vaccine development. The clustered nature of the 2G12 epitope suggested that multivalent antigen presentation would be important for developing a carbohydrate-based vaccine candidate. I describe the development of neoglycoconjugates displaying clustered HIV-1-related oligomannose carbohydrates and their immunogenic properties.
Abstract:
In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services they generate, productivity, efficiency, and low costs in design, realization and maintenance. This growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy products in boxes such as food or cigarettes, and so on. A further indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent in automating the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies to different productive needs and operational scenarios; obtaining a high-quality final product through verification of the correctness of the processing; directing the operator attending the machine to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The kinds of facilities that designers can find directly on the market, in terms of software component libraries, do in fact provide adequate support for implementing either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focussing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have traditionally been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very “unstructured”, way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. This difference is probably due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, alongside reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in designing an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the software engineering paradigms applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
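A hypothetical sketch of the kind of reusable logic-control component advocated here: a “Generalized Actuator” exposing a technology-independent functional interface with explicit discrete states and a diagnostic hook (Python; all names and states are illustrative, not the thesis's actual definitions):

    from enum import Enum, auto

    class ActuatorState(Enum):
        IDLE = auto()
        RUNNING = auto()
        FAULT = auto()

    class GeneralizedActuator:
        # Stable functional interface; the technology-specific drive
        # (motor, valve, ...) is hidden behind the 'drive' back end.
        def __init__(self, drive):
            self.drive = drive
            self.state = ActuatorState.IDLE

        def start(self, setpoint):
            if self.state is ActuatorState.FAULT:
                raise RuntimeError("reset required before restart")
            self.drive.apply(setpoint)      # delegate to the technological layer
            self.state = ActuatorState.RUNNING

        def on_diagnosis(self, fault_detected):
            # Hook for the online diagnoser of a fault-tolerant architecture.
            if fault_detected:
                self.state = ActuatorState.FAULT

    class StubDrive:                        # minimal stand-in for a real drive
        def apply(self, setpoint):
            pass

    axis = GeneralizedActuator(StubDrive())
    axis.start(1500)                        # e.g. a speed setpoint in rpm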
Abstract:
The present PhD thesis summarizes the three-year study on the neutronic investigation of a new-concept nuclear reactor, aiming at the optimization and sustainable management of nuclear fuel in a possible European scenario. A new-generation nuclear reactor for the nuclear renaissance is indeed desired by today's industrialized world, both to address the energy question arising from continuously growing energy demand and the corresponding reduction in oil availability, and to address the environmental question by providing a sustainable energy source free from long-lived radioisotopes, and therefore from geological repositories. Among the Generation IV candidate typologies, the Lead Fast Reactor concept has been pursued, being the one top-rated in sustainability. The European Lead-cooled SYstem (ELSY) has been investigated first. The neutronic analysis of the ELSY core has been performed via deterministic analysis by means of the ERANOS code, in order to obtain a stable configuration for the overall design of the reactor. Further analyses have been carried out by means of the Monte Carlo general-purpose transport code MCNP, in order to check the former analysis and to define an exact model of the system. An innovative system of absorbers has been conceptualized and designed both for reactivity compensation and regulation of the core over the cycle swing, and for safety, in order to guarantee the cold shutdown of the system in case of accident. Aiming at the sustainability of nuclear energy, the steady-state nuclear equilibrium has been investigated and generalized into the definition of the “extended” equilibrium state. On this basis, the Adiabatic Reactor Theory has been developed, together with a New Paradigm for Nuclear Power: in order to design a reactor that exchanges nothing valuable with the environment (hence the term “adiabatic”), in the sense of both plutonium and minor actinides, the logical design scheme of nuclear cores must indeed be reverted, starting from the definition of the equilibrium composition of the fuel and subordinating the whole core design to it. The New Paradigm has then been applied to the core design of an Adiabatic Lead Fast Reactor (ALFR) complying with the ELSY overall system layout. A complete core characterization has been done in order to assess criticality and power flattening; a preliminary evaluation of the main safety parameters has also been done to verify the viability of the system. Burn-up calculations have then been performed in order to investigate the operating cycle of the Adiabatic Lead Fast Reactor; the fuel performances have therefore been extracted and inserted into a more general analysis for a European scenario. The present nuclear reactor fleet has been modeled and its evolution simulated by means of the COSI code, in order to investigate the material fluxes to be managed in the European region. Different plausible scenarios have been identified to forecast the evolution of European nuclear energy production, including one involving the introduction of Adiabatic Lead Fast Reactors, and compared in order to better analyze the advantages introduced by the adoption of new-concept reactors.
Finally, since both ELSY and the ALFR represent new-concept systems based upon innovative solutions, the neutronic design of a demonstrator reactor has been carried out: such a system is intended to prove the viability of the technology to be implemented in the First-of-a-Kind industrial power plant, with the aim of validating the general strategy to the largest possible extent. It was therefore chosen to base the DEMO design upon a compromise between the demonstration of developed technology and the testing of emerging technology, in order to serve the purpose of reducing uncertainties about construction and licensing, both by validating the main ELSY/ALFR features and performances and by qualifying numerical codes and tools.
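The “adiabatic” design logic, which starts from the equilibrium fuel composition, amounts to a fixed-point search; a toy sketch follows (Python with NumPy; the one-group depletion matrix and feed vector are illustrative stand-ins for the actual ERANOS/MCNP depletion chain):

    import numpy as np

    def equilibrium_composition(burnup_step, reload, n0, tol=1e-8, max_iter=2000):
        # Iterate burnup + reprocessing/refueling until the begin-of-cycle
        # composition vector reaches a fixed point (the equilibrium state).
        n = n0
        for _ in range(max_iter):
            n_next = reload(burnup_step(n))
            if np.linalg.norm(n_next - n) < tol:
                return n_next
            n = n_next
        raise RuntimeError("no equilibrium within max_iter")

    A = np.array([[0.95, 0.00],            # toy one-group depletion matrix
                  [0.03, 0.97]])
    step = lambda n: A @ n
    refuel = lambda n: n + np.array([1.0, 0.0])   # fertile feed only
    print(equilibrium_composition(step, refuel, np.array([10.0, 1.0])))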
Abstract:
A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered all the industrial needs relevant to ESA and the main European space stakeholders, and we consolidated a set of high-level technical requirements for their fulfillment. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, the assumptions and the constraints of the computational model; and (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved through support for design views and by careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; and (iii) the inclusion of space-specific concerns.
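A hypothetical sketch of a component-model entity in the spirit described above, keeping functional ports separate from declared extra-functional properties so that tooling can check them against the computational model (Python; all names and attributes are illustrative, and the actual ESA metamodel is far richer):

    from dataclasses import dataclass, field

    @dataclass
    class Port:
        name: str
        signature: str            # operation signature offered or needed

    @dataclass
    class Component:
        name: str
        provided: list = field(default_factory=list)    # services offered
        required: list = field(default_factory=list)    # services needed
        properties: dict = field(default_factory=dict)  # extra-functional annotations

    telemetry = Component(
        name="TelemetryManager",
        provided=[Port("send", "send(frame: bytes) -> None")],
        required=[Port("clock", "now() -> int")],
        # Declared, statically checkable extra-functional properties:
        properties={"activation": "sporadic", "MIAT_ms": 50, "WCET_ms": 2},
    )
    print(telemetry)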