Abstract:
Foods that provide medical and health benefits or play a role in disease risk reduction are termed functional foods. The functionality of functional foods derives from bioactive compounds, extranutritional constituents present in small quantities in food. Bioactive components include a range of chemical compounds with varying structures, such as carotenoids, flavonoids, plant sterols, omega-3 fatty acids (n-3), allyl and diallyl sulfides, indoles (benzopyrroles), and phenolic acids. Increasing consumer interest in natural bioactive compounds has brought about a rise in demand for these compounds and, in parallel, a growing number of scientific studies devoted to them. The principal aim of this PhD research project was the study of different bioactive and toxic compounds in several natural matrices. To achieve this goal, chromatographic, spectroscopic and sensory analyses were performed. This manuscript reports the main results obtained in six activities, briefly summarized as follows: • SECTION I: the influence of conventional packaging on lipid oxidation of pasta was evaluated in egg spaghetti. • SECTION II: the effect of storage at different temperatures on virgin olive oil was monitored by peroxide value, fatty acid composition, the OSI test and sensory analysis. • SECTION III: the glucosinolate and phenolic content of 37 rocket salad accessions was evaluated, comparing the species Eruca sativa and Diplotaxis tenuifolia. Sensory analysis and the influence of the phenolic and glucosinolate composition on the sensory attributes of rocket salads were also studied. • SECTION IV: ten buckwheat honeys were characterised on the basis of their pollen, physicochemical, phenolic and volatile composition. • SECTION V: the polyphenolic fraction (anthocyanins and other polar compounds), the antioxidant capacity and the anti-hyperlipidemic action of the aqueous extract of Hibiscus sabdariffa were investigated.
• SECTION VI: the optimization of a normal phase high pressure liquid chromatography–fluorescence detection method for the quantitation of flavanols and procyanidins in cocoa powder and chocolate samples was performed.
Abstract:
The general aim of this work is to contribute to the energy performance assessment of ventilated façades through the simultaneous use of experimental data and numerical simulations. A significant amount of experimental work was carried out on different types of naturally ventilated façades. The measurements were taken on a test building whose external walls are rainscreen ventilated façades, with ventilation grills located at the top and the bottom of the tower. The modelling of the test building with a dynamic thermal simulation program (ESP-r) is presented and the main results are discussed. To identify the best summer thermal performance of rainscreen ventilated skin façades, different setups of rainscreen walls were studied; in particular, the influence of ventilation grills, air cavity thickness, skin colour, skin material and façade orientation was investigated. It is shown that some rainscreen ventilated façade typologies can lower the cooling energy demand by a few percentage points.
Abstract:
Sequence-specific biomolecular assays have proven extremely useful, particularly in the context of the Human Genome Project, for detecting single nucleotide polymorphisms (SNPs) and identifying genes. Owing to the large number of base pairs to be analyzed, sensitive and efficient screening methods are required that can process DNA samples in a suitable manner. Most detection schemes rely on the interaction of an immobilized probe and its corresponding target with the surface. Analyzing the kinetic behavior of oligonucleotides at the sensor surface is therefore of utmost importance for improving existing detection schemes. Surface plasmon field-enhanced fluorescence spectroscopy (SPFS) was developed recently. It is a kinetic analysis and detection method that records two signals of the interfacial phenomenon simultaneously: the change in reflectivity and the fluorescence signal. Using SPFS, kinetic measurements can be performed at the sensor surface for the hybridization between DNA and peptide nucleic acid (PNA), a synthetic nucleic acid that mimics DNA and forms a more stable duplex. By means of single, global and titration experiments with both a fully complementary and a mismatched sequence, the rate constants for the binding of oligomeric DNA targets and PCR targets to PNA can be determined on the basis of the Langmuir model. In addition, the influence of ionic strength and temperature on PNA/DNA hybridization was characterized in a kinetic analysis.
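The Langmuir model invoked above has a closed-form solution for the surface coverage during association, which is what makes rate-constant extraction from SPFS traces tractable. The following is a minimal sketch; the rate constants and target concentration are illustrative placeholders, not values from the thesis:

```python
import numpy as np

def langmuir_coverage(t, c, k_on, k_off, theta0=0.0):
    """Fractional surface coverage theta(t) under the Langmuir model.

    dtheta/dt = k_on*c*(1 - theta) - k_off*theta has the closed-form
    solution below; theta_eq is the equilibrium coverage."""
    k_obs = k_on * c + k_off           # observed rate constant [1/s]
    theta_eq = k_on * c / k_obs        # equilibrium coverage
    return theta_eq + (theta0 - theta_eq) * np.exp(-k_obs * t)

# Hypothetical rate constants for a fully matched target:
k_on, k_off = 1e5, 1e-3        # [1/(M*s)], [1/s] -- assumed values
c = 1e-7                       # target concentration [M]
t = np.linspace(0, 3000, 301)  # association phase [s]
theta = langmuir_coverage(t, c, k_on, k_off)
K_A = k_on / k_off             # affinity constant [1/M]
```

In practice k_obs is fitted at several concentrations; its slope versus c gives k_on and its intercept k_off, which is how matched and mismatched sequences are discriminated kinetically.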
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100,000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment; particular care must be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready for further analysis. A series of semi-automated quality control criteria is included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
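The core measurement in such a pipeline is circular-aperture photometry with an annulus sky estimate. The sketch below is a simplified, numpy-only illustration of that step (not the thesis pipeline itself; positions, radii and the synthetic frame are invented):

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Sum counts in a circular aperture of radius r_ap centred on
    (x0, y0) and subtract the median sky estimated in the annulus
    r_in <= r <= r_out (a simplified sketch)."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    ap = r <= r_ap
    ann = (r >= r_in) & (r <= r_out)
    sky = np.median(image[ann])                 # per-pixel sky level
    flux = image[ap].sum() - sky * ap.sum()     # sky-subtracted flux
    return flux, sky

# Synthetic frame: flat sky of 100 counts plus one point source.
img = np.full((64, 64), 100.0)
img[32, 32] += 5000.0
flux, sky = aperture_photometry(img, 32, 32, r_ap=5, r_in=8, r_out=12)
```

A quality-control criterion of the kind mentioned above could then be as simple as flagging measurements whose sky level or flux uncertainty exceeds a campaign-wide threshold.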
Abstract:
Animal neocentromeres are ectopic centromeres that form in non-centromeric locations and lack some of the features, such as satellite DNA sequences, that normally characterize canonical centromeres. Despite this, they are stable, functional centromeres inherited through generations. The mere existence of neocentromeres provides convincing evidence that centromere specification is determined by epigenetic rather than sequence-specific mechanisms. For all these reasons, we used them as simplified models to investigate the molecular mechanisms that underlie the formation and maintenance of functional centromeres. We collected human cell lines carrying neocentromeres in different positions. To investigate the regions involved at the DNA sequence level, we applied a recent technology that integrates chromatin immunoprecipitation and DNA microarrays (ChIP-on-chip), using rabbit polyclonal antibodies directed against the human centromeric proteins CENP-A and CENP-C. These DNA-binding proteins are required for kinetochore function and are exclusively targeted to functional centromeres; immunoprecipitating the DNA bound by these proteins therefore allows the isolation of centromeric sequences, including those of neocentromeres. Neocentromeres can arise even in protein-coding gene regions. We further analyzed whether the increased number of scaffold attachment sites, and the correspondingly tighter chromatin, of the regions involved in neocentromerization remained permissive to transcription of the genes encoded within them. Centromere repositioning is a phenomenon in which a neocentromere, arisen without altering the gene order and followed by the inactivation of the canonical centromere, becomes fixed in a population. It is a process of chromosome rearrangement fundamental in evolution, at the basis of speciation.
The repeat-free region where a neocentromere initially forms progressively acquires extended arrays of satellite tandem repeats that may contribute to its functional stability. In this context we focused on the repositioned centromere of horse chromosome 11 (ECA11). ChIP-on-chip analysis was used to define the region involved, and SNPs mapping within the region involved in neocentromerization were studied. We were able to describe the structural polymorphism of the chromosome 11 centromeric domain in the Equus caballus population; this polymorphism was observed even between homologous chromosomes of the same cells, the first such observation ever described. Genomic plasticity has played a fundamental role in evolution, and centromeres are not static, packaged regions of genomes. The key question that fascinates biologists is how this centromere plasticity can be reconciled with the stability and maintenance of centromeric function. Starting from the epigenetic point of view that underlies centromere formation, we decided to analyze the RNA content of centromeric chromatin. RNA, like the chemical modifications that involve both histones and DNA, is a good candidate for guiding centromere formation and maintenance, and many observations suggest that transcription of centromeric DNA or of other non-coding RNAs could affect centromere formation. To date there has been no thorough investigation addressing the identity of chromatin-associated RNAs (CARs) on a global scale. This prompted us to develop techniques to identify CARs genome-wide using high-throughput genomic platforms. The future goal of this study will be to focus on what happens specifically inside centromeric chromatin.
Abstract:
Heavy pig breeding in Italy is mainly oriented toward the production of high-quality processed products. Of particular importance is dry-cured ham production, which is strictly regulated and requires specific carcass characteristics correlated with green leg characteristics. Furthermore, as pigs are slaughtered at about 160 kg live weight, the Italian pig breeding sector faces severe problems of production efficiency related to all the biological aspects linked to growth, feed conversion, fat deposition and so on. It is well known that production and carcass traits are in part genetically determined. Therefore, as a first step toward understanding the genetic basis of traits that could have a direct or indirect impact on dry-cured ham production, a candidate gene approach can be used to identify DNA markers associated with parameters of economic importance. In this thesis, we investigated three candidate genes for carcass and production traits (TRIB3, PCSK1, MUC4) in pig breeds used for dry-cured ham production, using different experimental approaches to find molecular markers associated with these parameters.
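A marker-trait association of this kind is often first screened with a simple allele-count comparison between phenotype groups. The sketch below illustrates the idea with a 2x2 chi-square test on invented allele counts (not data from the thesis; group labels and counts are hypothetical):

```python
import numpy as np

def allele_chi2(counts_group1, counts_group2):
    """Chi-square statistic for a 2x2 table of allele counts in two
    phenotype groups (e.g. high vs low fat deposition).
    counts_* are tuples (n_allele_A, n_allele_B)."""
    obs = np.array([counts_group1, counts_group2], dtype=float)
    row = obs.sum(axis=1, keepdims=True)
    col = obs.sum(axis=0, keepdims=True)
    exp = row * col / obs.sum()          # expected counts under H0
    return ((obs - exp) ** 2 / exp).sum()

# Hypothetical allele counts at a candidate-gene SNP:
stat = allele_chi2((60, 40), (35, 65))
associated = stat > 3.84   # chi2 critical value, df=1, alpha=0.05
```

Real analyses would add multiple-testing correction and model genotype (not just allele) effects, but the contingency-table logic is the same.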
Abstract:
This PhD thesis focused on the development and application of a chemical methodology (Py-GC-MS) and of data-processing methods based on multivariate data analysis (chemometrics). The chromatographic and mass spectrometric data obtained with this technique are particularly suitable for interpretation by chemometric methods such as PCA (Principal Component Analysis) for data exploration and SIMCA (Soft Independent Modelling of Class Analogy) for classification. As a first approach, some issues related to the field of cultural heritage were addressed, with particular attention to the differentiation of binders used in painting. A marker of egg tempera, esterified phosphoric acid, a pyrolysis product of lecithin, was determined using HMDS (hexamethyldisilazane) rather than TMAH (tetramethylammonium hydroxide) as the derivatizing reagent. The validity of analytical pyrolysis as a tool to characterize and classify different types of bacteria was also verified. The FAME chromatographic profiles represent an important tool for bacterial identification. Because of the complexity of the chromatograms, it was possible to characterize the bacteria only at the genus level, while differentiation at the species level was achieved by means of chemometric analysis. To perform this study, the normalized peak areas of the fatty acids were taken into account, and chemometric methods were applied to the experimental datasets. The results demonstrate the effectiveness of analytical pyrolysis and chemometric analysis for the rapid characterization of bacterial species. Applications to samples of bacterial (Pseudomonas mendocina), fungal (Pleurotus ostreatus) and mixed biofilms were also performed. A comparison of the chromatographic profiles established the possibility to: • Differentiate the bacterial and fungal biofilms according to their FAME profiles.
• Characterize the fungal biofilm by means of the typical pattern of pyrolytic fragments derived from the saccharides present in the cell wall. • Identify the markers of the bacterial and fungal biofilms in the same mixed-biofilm sample.
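The data-exploration step named above (PCA on normalized peak areas) can be sketched in a few lines via the SVD of the mean-centred data matrix. The toy dataset below is invented for illustration: two "species" whose FAME profiles differ mainly in two peaks:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the first principal components.
    Rows are samples (pyrograms); columns are normalized peak areas."""
    Xc = X - X.mean(axis=0)                  # mean-centre each peak
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)          # variance fraction per PC
    return Xc @ Vt[:n_components].T, explained[:n_components]

# Toy data: two species differing mainly in two fatty acid peaks.
rng = np.random.default_rng(0)
species_a = rng.normal([0.5, 0.3, 0.2], 0.01, size=(5, 3))
species_b = rng.normal([0.2, 0.3, 0.5], 0.01, size=(5, 3))
scores, var = pca_scores(np.vstack([species_a, species_b]))
```

On such data the first component captures nearly all the variance and cleanly separates the two groups; SIMCA then builds one PCA model per class and classifies new samples by their distance to each model.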
Abstract:
In this study structural and finite-strain data are used to explore the tectonic evolution and exhumation history of the Chilean accretionary wedge. The wedge is part of a Late Paleozoic subduction complex that developed during subduction of the Pacific plate underneath South America and is commonly subdivided into a structurally lower Western Series and an upper Eastern Series. This study shows the progressive development of structures and finite strain from the least deformed rocks in the eastern part of the Eastern Series to the higher-grade schists of the Western Series at the Pacific coast. Furthermore, finite-strain data are reported to quantify the contribution of vertical ductile shortening to exhumation; vertical ductile shortening is, together with erosion and normal faulting, a process that can aid the exhumation of high-pressure rocks. In the east, structures are characterized by upright chevron folds of the sedimentary layering associated with a penetrative axial-plane foliation, S1. As the F1 folds became slightly overturned to the west, S1 was folded about recumbent open F2 folds and an S2 axial-plane foliation developed. Near the contact between the Western and Eastern Series, S2 represents a prominent subhorizontal transposition foliation, which becomes progressively flat-lying towards the structurally deepest units in the west. Finite-strain data obtained by Rf/Phi and PDS analysis in metagreywacke and by X-ray texture goniometry in phyllosilicate-rich rocks show a smooth and gradual increase in strain magnitude from east to west. There is no evidence for normal faulting or significant structural breaks across the contact between the Eastern and Western Series. The progressive structural and strain evolution between the two series can be interpreted to reflect a continuous change in the mode of accretion in the subduction wedge.
Before ~320-290 Ma the rocks of the Eastern Series were frontally accreted to the Andean margin; frontal accretion caused horizontal shortening, and upright folds and axial-plane foliations developed. At ~320-290 Ma the mode of accretion changed and the rocks of the Western Series were underplated below the Andean margin. This basal accretion caused a major change in the flow field within the wedge and gave rise to vertical shortening and the development of the penetrative subhorizontal transposition foliation. To estimate how much vertical ductile shortening contributed to the exhumation of both units, finite strain was measured. The tensor average of absolute finite strain yields Sx=1.24, Sy=0.82 and Sz=0.57, implying an average vertical shortening of ca. 43%, which was compensated by volume loss. The finite-strain data from the PDS measurements allow the calculation of an average volume loss of 41%. A mass balance suggests that most of the dissolved material remains in the wedge and is precipitated in quartz veins. The average of relative finite strain is Sx=1.65, Sy=0.89 and Sz=0.59, indicating greater vertical shortening in the structurally deeper units. A simple model which integrates velocity gradients along a vertical flow path within a steady-state wedge is used to estimate the contribution of deformation to ductile thinning of the overburden during exhumation. The results show that vertical ductile shortening contributed 15-20% to exhumation; as no large-scale normal faults have been mapped, the remaining 80-85% of exhumation must be due to erosion.
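The arithmetic linking the quoted principal stretches to the shortening and volume-loss figures is simply the product of the stretches (final/initial volume ratio):

```python
# Averaged principal stretches of absolute finite strain, as reported:
Sx, Sy, Sz = 1.24, 0.82, 0.57

volume_ratio = Sx * Sy * Sz            # final/initial volume, ~0.58
volume_loss = 1.0 - volume_ratio       # ~42%, consistent (within
                                       # rounding) with the 41% from PDS
vertical_shortening = 1.0 - Sz         # 0.43, the ca. 43% quoted
```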
Abstract:
Atmospheric aerosol particles directly impact air quality and participate in controlling the climate system. Organic aerosol (OA) generally accounts for a large fraction (10–90%) of the global submicron (PM1) particulate mass. Chemometric methods for source identification are used in many disciplines, but methods relying on the analysis of NMR datasets are rarely used in the atmospheric sciences. This thesis provides an original application of NMR-based chemometric methods to atmospheric OA source apportionment. The method was tested on chemical composition databases obtained from samples collected in different environments in Europe, hence exploring the impact of a great diversity of natural and anthropogenic sources. We focused on sources of water-soluble OA (WSOA), for which NMR analysis provides substantial advantages over alternative methods. Different factor analysis techniques were applied independently to NMR datasets from nine field campaigns of the EUCAARI project and allowed the identification of recurrent source contributions to WSOA in the European background troposphere: 1) marine SOA; 2) aliphatic amines from ground sources (agricultural activities, etc.); 3) biomass burning POA; 4) biogenic SOA from terpene oxidation; 5) “aged” SOA, including humic-like substances (HULIS); 6) other factors, possibly including contributions from primary biological aerosol particles and products of cooking activities. Biomass burning POA accounted for more than 50% of WSOC in the winter months. Aged SOA associated with HULIS was predominant (> 75%) during spring and summer, suggesting that secondary sources and transboundary transport become more important in those seasons.
Complex aerosol measurements carried out in collaboration with several foreign research groups provided the opportunity to compare the source apportionment results obtained by NMR analysis with those provided by the more widespread Aerodyne aerosol mass spectrometer (AMS) techniques, whose OA categorization schemes are becoming a standard for atmospheric chemists. The results emerging from this thesis partly confirm the AMS classification and partly challenge it.
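Factor-analysis source apportionment of the kind applied here decomposes a non-negative data matrix (samples x spectral bins) into source contributions and source profiles. The sketch below is a generic non-negative matrix factorization with Lee-Seung multiplicative updates, on an invented toy mixture; it illustrates the decomposition idea, not the specific algorithms used in the thesis:

```python
import numpy as np

def nmf(X, k, n_iter=1000, seed=0):
    """Factorize a non-negative matrix X (samples x bins) into
    contributions G (samples x k) and profiles F (k x bins) using
    Lee-Seung multiplicative updates (Frobenius-norm objective)."""
    rng = np.random.default_rng(seed)
    G = rng.random((X.shape[0], k))
    F = rng.random((k, X.shape[1]))
    eps = 1e-12                          # avoids division by zero
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Toy mixture of two known source profiles (hypothetical numbers):
profiles = np.array([[1.0, 0.0, 0.5], [0.0, 1.0, 0.5]])
contrib = np.array([[2.0, 1.0], [1.0, 3.0], [0.5, 0.5]])
X = contrib @ profiles
G, F = nmf(X, k=2)
residual = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The non-negativity constraint is what makes the recovered factors physically interpretable as source spectra and time series, for NMR and AMS data alike.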
Abstract:
This work describes structures created with polymers on surfaces. The applications range from PMMA and PNIPAM polymer brushes, through the restructuring of polystyrene by solvents, to 3D structures composed of PAH/PSS polyelectrolyte multilayers. In the first part, poly(methyl methacrylate) (PMMA) brushes were grown in the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate ([Bmim][PF6]) by controlled radical polymerization (ATRP). Kinetic studies showed linear and dense brush growth with a growth rate of 4600 g/mol per nm; the average grafting density was 0.36 µmol/m². As an application, microdroplets consisting of the ionic liquid, dimethylformamide and the ATRP catalyst were used to deposit polymer brushes on silicon in a defined geometry, producing coatings up to 13 nm thick. The concept is limited by evaporation of the monomer methyl methacrylate (MMA): from a 1 µl droplet of ionic liquid and MMA (1:1), the MMA evaporates within 100 s, so the monomer was added sequentially. The second part focuses on structuring surfaces with a new method: inkjet printing. A piezoelectrically driven drop-on-demand printing system was used to structure polystyrene with 0.4 nl droplets of toluene. The microcraters formed in this way can serve as microlenses whose focal length is tuned via the number of droplets used for structuring; focal lengths between 4.5 mm and 0.21 mm were determined theoretically and experimentally. The second structuring process uses the polyelectrolytes poly(vinylamine hydrochloride) (PAH) and poly(styrene sulfonate) (PSS) to build 3D structures such as lines, checkerboards, rings and stacks in a layer-by-layer fashion. The thickness of one double layer (DL) lies between 0.6 and 1.1 nm when NaCl is used as the electrolyte at a concentration of 0.5 mol/l, and the structures are on average 230 µm wide. The process was extended to coat nanomechanical cantilever sensors (NCS): on an array of eight cantilevers, two cantilevers each were coated rapidly and reproducibly with five PAH/PSS double layers and two with ten. The mass change of the individual cantilevers was 0.55 ng for five double layers and 1.08 ng for ten. The resulting sensor was exposed to an environment of defined humidity; the cantilevers bend as the coating swells, since water diffuses into the polymer. A maximum deflection of 442 nm at 80% relative humidity was found for the cantilevers coated with ten double layers, corresponding to a water uptake of 35%. In addition, the deflection data showed that the elasticity of the polyelectrolyte multilayers increases when the polymer is swollen. In the next part, the thermal behavior in water of nanomechanical cantilever sensors coated with poly(N-isopropylacrylamide) (PNIPAM) brushes and plasma-polymerized N,N-diethylacrylamide was investigated. The cantilever deflection showed two regimes: below the lower critical solution temperature (LCST) it is dominated by dehydration of the polymer layer, while above the LCST the sensor responds mainly to relaxation processes within the collapsed polymer layer. The minimum in the differential deflection was found to coincide with the LCSTs of the selected polymers, 32°C and 44°C. In the last part of the work, µ-reflectivity and µ-GISAXS experiments were introduced as new methods to investigate microstructured samples such as NCS or PEM lines with X-ray scattering. The thickness of each individual NCS coated with PMMA brushes lies between 32.9 and 35.2 nm, as determined by µ-reflectivity; imaging ellipsometry as a complementary method confirmed this result within a maximum deviation of 7%. As a second example, a printed PAH/PSS polyelectrolyte multilayer was investigated. The preparation procedure was modified so that gold nanoparticles were incorporated into the layer structure. Evaluation of a µ-GISAXS experiment identified the incorporation of the particles, and a fit with a Unified Fit model showed that the particles are not agglomerated and are surrounded by a polymer matrix.
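For dry polymer brushes, thickness, molar mass and grafting density are linked by sigma = h·rho/Mn (in molar units). As a rough cross-check of the figures quoted for the PMMA brushes, the sketch below evaluates this relation using an assumed bulk PMMA density of 1.18 g/cm³; the result lands on the same order as the reported 0.36 µmol/m², with the gap attributable to the assumed density and the measurement methods:

```python
# Dry-brush relation: sigma [mol/m^2] = h * rho / M_n
rho = 1.18e6            # assumed PMMA density [g/m^3] (1.18 g/cm^3)
h = 13e-9               # dry brush thickness [m] (13 nm, as above)
M_n = 4600 * 13         # growth rate 4600 g/mol per nm -> M_n [g/mol]

sigma_mol = h * rho / M_n        # grafting density [mol/m^2]
sigma_umol = sigma_mol * 1e6     # [umol/m^2]; ~0.26 vs 0.36 reported
```

Note that with a strictly linear growth rate, sigma drops out of the thickness dependence entirely (h cancels), so the estimate reflects only the growth-rate and density assumptions.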
Abstract:
The candidate tackled an important issue in contemporary management: the role of CSR and sustainability. The research proposal focused on longitudinal and inductive research directed at tracing the evolution of CSR and contributing to new institutional theory, in particular the institutional work framework and the relation between institutions and discourse analysis. The documentary analysis covers the whole evolution of CSR, focusing also on a number of important networks and associations. Some of the methodologies employed in the thesis were adopted as a consequence of data analysis, in a truly inductive research process. The thesis is composed of two sections. The first mainly describes the research process and the results of the analyses; the candidate employed several research methods: a longitudinal content analysis of documents, vocabulary research with statistical techniques such as cluster analysis and factor analysis, and a rhetorical analysis of justifications. The second section relates the results of the analyses to theoretical frameworks and contributions. The candidate engaged with several frameworks: actor-network theory, institutional work and boundary work, and institutional logics. The chapters focus on different issues: a historical reconstruction of CSR; a reflection on the symbolic adoption of recurrent labels; two case studies of Italian networks, comparing institutional and boundary work; a theoretical model of institutional change based on contradiction and institutional complexity; and the application of the model to CSR and sustainability, proposing sustainability as a possible institutional logic.
Abstract:
The last decade has witnessed very fast development in microfabrication technologies. The increasing industrial applications of microfluidic systems call for more intensive and systematic knowledge of this newly emerging field. Especially for gaseous flow and heat transfer at the microscale, the applicability of conventional theories developed at the macroscale is not yet completely validated, mainly because of the scarce experimental data available in the literature for gas flows. The objective of this thesis is to investigate these unclear elements by analyzing forced convection for gaseous flows through microtubes and micro heat exchangers. Experimental tests have been performed with microtubes having various inner diameters, namely 750 µm, 510 µm and 170 µm, over a wide range of Reynolds numbers covering the laminar region, the transitional zone and the onset region of the turbulent regime. The results show that conventional theory is able to predict the flow friction factor when flow compressibility does not appear and the effect of temperature-dependent fluid properties is insignificant. A double-layered microchannel heat exchanger has been designed in order to study experimentally the efficiency of a gas-to-gas micro heat exchanger. This microdevice contains 133 parallel microchannels machined into polished PEEK plates for both the hot side and the cold side; the microchannels are 200 µm high, 200 µm wide and 39.8 mm long. The microdevice was designed so that different partition-foil materials and thicknesses could be tested. Experimental tests have been carried out with five different partition foils, various mass flow rates and different flow configurations. The experimental results indicate that the thermal performance of the countercurrent and cross-flow micro heat exchanger can be strongly influenced by axial conduction in the partition foil separating the hot and cold gas flows.
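The "conventional theory" benchmark for the friction factor in fully developed laminar tube flow is f = 64/Re, with Re conveniently computed from the mass flow rate. The following sketch evaluates it for one of the tube diameters above; the gas properties and flow rate are illustrative assumptions, not the thesis measurements:

```python
import numpy as np

def friction_factor_laminar(m_dot, D, mu):
    """Darcy friction factor f = 64/Re for fully developed laminar
    flow in a circular tube, with the Reynolds number computed from
    the mass flow rate: Re = 4*m_dot / (pi * D * mu)."""
    Re = 4.0 * m_dot / (np.pi * D * mu)
    return 64.0 / Re, Re

# Illustrative case: nitrogen through the 510 um microtube.
mu = 1.8e-5                  # assumed dynamic viscosity [Pa*s]
m_dot = 2e-6                 # assumed mass flow rate [kg/s]
f, Re = friction_factor_laminar(m_dot, D=510e-6, mu=mu)
```

Deviations of measured f·Re from 64 are precisely what signal compressibility or property-variation effects in such experiments.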
Abstract:
In the framework of micro-CHP (Combined Heat and Power) energy systems and the Distributed Generation (DG) concept, an Integrated Energy System (IES) was conceived and built that is able to meet the energy and thermal requirements of specific users, using different types of fuel to feed several micro-CHP energy sources, with the integration of electric generators based on renewable energy sources (RES), electrical and thermal storage systems and a control system. A 5 kWel Polymer Electrolyte Membrane Fuel Cell (PEMFC) was studied. Using experimental data obtained from various measurement campaigns, the electrical and CHP performance of the PEMFC system was determined. The effect of the water management of the anodic exhaust at variable FC loads was analyzed, and the purge process programming logic was optimized, also leading to the determination of the optimal flooding times as the AC power delivered by the cell varies. Furthermore, the degradation mechanisms of the PEMFC system, in particular those due to flooding of the anodic side, were assessed using an algorithm that treats the FC as a black box and is able to determine the amount of unreacted H2 and, therefore, the causes that produce it. Using experimental data covering a two-year time span, the ageing suffered by the FC system was assessed and analyzed.
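A black-box hydrogen balance of the kind described rests on Faraday's law: the H2 consumed electrochemically follows directly from the stack current, so any excess between fed and consumed hydrogen is unreacted (lost, e.g., through anodic purges). The sketch below shows this balance with invented operating values (stack size, current and feed rate are assumptions, not the thesis data):

```python
F = 96485.0    # Faraday constant [C/mol]

def unreacted_h2_fraction(m_dot_h2, current, n_cells):
    """Fraction of the hydrogen fed to a PEMFC stack that is not
    consumed electrochemically. Faraday's law: each H2 molecule
    supplies 2 electrons, so consumption = I*n_cells/(2F) [mol/s].
    m_dot_h2 is the molar feed rate [mol/s], current in A."""
    consumed = current * n_cells / (2.0 * F)
    return (m_dot_h2 - consumed) / m_dot_h2

# Illustrative operating point for a ~5 kWel stack (assumed values):
frac = unreacted_h2_fraction(m_dot_h2=0.0425, current=120.0, n_cells=65)
```

Trending this fraction against load and purge timing is what lets flooding episodes be distinguished from normal purge losses.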
Abstract:
This doctoral thesis unfolds into a collection of three distinct papers that share an interest in institutional theory and technology transfer. Taking into account that organizations are increasingly exposed to a multiplicity of demands and pressures, we aim to analyze what renders this situation of institutional complexity more or less difficult for organizations to manage, and what makes organizations more or less successful in responding to it. The three studies offer novel contributions both theoretically and empirically. The first paper, “The dimensions of organizational fields for understanding institutional complexity: A theoretical framework”, is a theoretical contribution that seeks to better understand the relationship between institutional complexity and fields by providing a framework. The second article, “Beyond institutional complexity: The case of different organizational successes in confronting multiple institutional logics”, is an empirical study which explores the strategies that allow organizations facing multiple logics to respond to them more successfully. The third work, “How external support may mitigate the barriers to university-industry collaboration”, is oriented towards practitioners and presents a case study about technology transfer in Italy.
Abstract:
Supernovae are among the most energetic events occurring in the universe and are so far the only verified extrasolar source of neutrinos. As the explosion mechanism is still not well understood, recording a burst of neutrinos from such a stellar explosion would be an important benchmark for particle physics as well as for core collapse models. The neutrino telescope IceCube is located at the geographic South Pole and monitors the Antarctic glacier for Cherenkov photons. Even though it was conceived for the detection of high-energy neutrinos, it is capable of identifying a burst of low-energy neutrinos from a supernova in the Milky Way by exploiting the low photomultiplier noise in the Antarctic ice and extracting a collective rate increase. A signal Monte Carlo specifically developed for water Cherenkov telescopes is presented. With its help, we investigate how well IceCube can distinguish between core collapse models and oscillation scenarios. In the second part, nine years of data taken with the IceCube precursor AMANDA are analyzed. Intensive data cleaning methods are presented along with a background simulation. From the result, an upper limit on the expected occurrence of supernovae within the Milky Way is determined.
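The "collective rate increase" idea can be sketched compactly: each optical module alone sees a deviation far below its noise, but an inverse-variance-weighted average over thousands of modules turns a small common offset into a significant signal. The following is a simplified illustration with invented numbers (module count, noise rate and burst amplitude are assumptions, and the real analyses are considerably more elaborate):

```python
import numpy as np

def collective_significance(rates, baseline_mean, baseline_var):
    """Significance of a collective rate deviation across many
    optical modules: inverse-variance-weighted mean deviation
    divided by its standard error (a simplified sketch)."""
    w = 1.0 / baseline_var
    dmu = np.sum(w * (rates - baseline_mean)) / np.sum(w)
    sigma_dmu = 1.0 / np.sqrt(np.sum(w))
    return dmu / sigma_dmu

rng = np.random.default_rng(1)
n_modules = 4800
mean, var = 540.0, 540.0        # assumed Poisson-like noise rate [Hz]
quiet = rng.normal(mean, np.sqrt(var), n_modules)
burst = quiet + 2.0             # collective +2 Hz increase per module
z = collective_significance(burst, mean, np.full(n_modules, var))
```

A per-module +2 Hz bump is a 0.1-sigma effect for one PMT, yet summed over thousands of modules it yields a multi-sigma detection, which is the principle behind IceCube's galactic supernova trigger.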