13 results for "the big 5"

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

The present work provides an ex-post assessment of the UK 5-a-day information campaign, in which the positive effects of information on consumption levels are disentangled from the potentially conflicting price dynamics. A model-based estimate of the counterfactual (no-intervention) scenario is computed using data from the Expenditure and Food Survey between 2002 and 2006. For this purpose, fruit and vegetable demand is modelled with a Quadratic Almost Ideal Demand System (QUAIDS) specification with demographic effects, controlling for potential endogeneity of prices and total food expenditure.
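For reference, the budget-share equation of the standard QUAIDS model (Banks, Blundell and Lewbel, 1997) is sketched below; the thesis augments this baseline with demographic effects and endogeneity controls, which are not shown here.

```latex
% Budget share w_i of good i in the QUAIDS specification
w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j
      + \beta_i \ln\!\left(\frac{x}{a(\mathbf{p})}\right)
      + \frac{\lambda_i}{b(\mathbf{p})}
        \left[\ln\!\left(\frac{x}{a(\mathbf{p})}\right)\right]^2
```

Here x is total expenditure, p the price vector, ln a(p) the translog price index and b(p) = ∏_j p_j^{β_j} the Cobb-Douglas price aggregator.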

Relevance: 100.00%

Abstract:

This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices and the explainability of AI systems. It consists of two main parts. In the first part, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR’s data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU’s High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. Then, we list the main big data challenges that European researchers and institutions have identified and provide a literature review of the technical and organizational measures to address them. A quantitative analysis is conducted on the identified big data challenges and the corresponding measures, which leads to practical recommendations for better data processing and AI practices in the EU. In the second part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals of explainable AI systems. We identify the reasons for the explainability-accuracy trade-off and how it can be addressed. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to remedy the explainability problem. In this part, the GDPR’s right-to-explanation framework and safeguards are analyzed in depth, together with their contribution to the realization of Trustworthy AI. Then, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, for developing GDPR-compliant and Trustworthy XAI systems.
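As a concrete illustration of the kind of post-hoc, model-agnostic explanation technique surveyed in the second part, the sketch below computes permutation feature importances with scikit-learn. The model, dataset and parameters are placeholders chosen for the example, not tools or results from the thesis.

```python
# Minimal post-hoc explainability sketch: permutation importance measures how much
# a model's score drops when a single input feature is randomly shuffled.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)   # placeholder dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")   # the five most influential input features
```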

Relevance: 100.00%

Abstract:

This thesis reports on the two main areas of our research: introductory programming as the traditional way of accessing informatics, and the cultural teaching of informatics through unconventional pathways. The research on introductory programming aims to overcome challenges in traditional programming education, thus increasing participation in informatics. Improving access to informatics enables individuals to pursue more and better professional opportunities and to contribute to informatics advancements. We aimed to balance active, student-centered activities and to provide optimal support to novices at their level. Inspired by Productive Failure and exploring the concept of the notional machine, our work focused on developing Necessity Learning Design, a design to help novices tackle new programming concepts. Using this design, we implemented a learning sequence to introduce arrays and evaluated it in a real high-school context. The subsequent chapters discuss our experience teaching CS1 in a remote-only scenario during the COVID-19 pandemic and our collaborative effort with primary school teachers to develop a learning module for teaching iteration using a visual programming environment. The research on teaching informatics principles through unconventional pathways, such as cryptography, aims to introduce informatics to a broader audience, particularly younger individuals who are less technically and professionally oriented. It emphasizes the importance of understanding the cultural and scientific aspects of informatics, in order to focus on its societal value and on its principles for active citizenship. After reflecting on computational thinking, and inspired by the big ideas of science and informatics, we describe our hands-on approach to teaching cryptography in high school, which leverages its key scientific elements to emphasize its social aspects. Additionally, we present an activity for teaching public-key cryptography using graphs, to explore fundamental concepts and methods in informatics and mathematics and their interdisciplinarity. In broadening the understanding of informatics, these research initiatives also aim to foster motivation and to prime students for more professional learning of informatics.
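A minimal classroom-style sketch of the public-key idea is shown below as a toy Diffie-Hellman exchange over small numbers. It is not the graph-based activity developed in the thesis; it only illustrates the underlying principle that such activities build on (operations that are easy to perform but hard to invert).

```python
# Toy key exchange for teaching purposes only: the numbers are tiny and insecure,
# chosen so that every step can be checked by hand.
import random

p, g = 23, 5                    # public prime and generator (classroom-sized)

a = random.randint(2, p - 2)    # Alice's secret exponent
b = random.randint(2, p - 2)    # Bob's secret exponent

A = pow(g, a, p)                # Alice publishes g^a mod p
B = pow(g, b, p)                # Bob publishes g^b mod p

# Both sides derive the same shared secret without ever exchanging a or b.
assert pow(B, a, p) == pow(A, b, p)
print("shared secret:", pow(B, a, p))
```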

Relevance: 90.00%

Abstract:

The subject of this doctoral dissertation is the definition of a new methodology for the morphological and morphometric study of fossilized human teeth, and it thereby aims to contribute to the reconstruction of human evolutionary history with a method intended to extend to the different species of fossil hominids. Standardized investigative methodologies are lacking, both for the orientation of the teeth under study and for the analyses that can be carried out once they are oriented. The opportunity to standardize a primary analysis methodology is furnished by the study of certain early Neanderthal and pre-Neanderthal molars recovered in two caves in southern Italy [Grotta Taddeo (Taddeo Cave) and Grotta del Poggio (Poggio Cave), near Marina di Camerata, Campania]. To these are added other molars of Neanderthals and of Upper Paleolithic modern humans, specifically scanned in the paleoanthropology laboratory of the University of Arkansas (Fayetteville, Arkansas, USA), in order to enlarge the paleoanthropological sample and thereby make the final results of the analyses more significant. The new analysis methodology is as follows. 1. Standardization of an orientation system for first molars (superior and inferior), starting from a scan of a sample of 30 molars belonging to modern humans (15 inferior M1 and 15 superior M1), the definition of landmarks, the comparison of various systems and the choice of an orientation system for each of the two dental typologies. 2. The definition of an analysis procedure that considers only the first 4 millimeters of the dental crown starting from the collar: five sections parallel to the orientation plane are taken, spaced 1 millimeter apart. The intention is to define a method that allows fossil species to be differentiated even in the presence of worn teeth. 3. Results and conclusions. The new approach to the study of teeth provides a considerable quantity of information that can be evaluated better by enlarging the fossil sample. It has proved to be a valid tool for evolutionary classification, allowing us to differentiate the Neanderthal sample from that of modern humans. In particular, the molars of Grotta Taddeo, whose species of origin it had not previously been possible to determine with certainty, are classified through the present research as Neanderthal.
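A hypothetical sketch of the sectioning step is given below: an oriented crown point cloud (z = 0 at the cervical plane) is sliced at heights 0-4 mm and the outline area of each section is measured. The function names, the tolerance and the synthetic data are illustrative assumptions; the thesis works on real oriented 3D scans with dedicated tools.

```python
import numpy as np
from scipy.spatial import ConvexHull

def section_areas(points_mm, heights=(0, 1, 2, 3, 4), tol=0.05):
    """points_mm: (N, 3) array of an oriented tooth crown, z = 0 at the cervix (mm)."""
    areas = []
    for h in heights:
        # take a thin slab of points around the cutting plane z = h
        band = points_mm[np.abs(points_mm[:, 2] - h) < tol]
        if len(band) < 3:
            areas.append(np.nan)
            continue
        hull = ConvexHull(band[:, :2])   # 2D hull of the section outline
        areas.append(hull.volume)        # for 2D input, .volume is the enclosed area
    return areas

# Example with synthetic points standing in for a scanned, oriented molar crown:
rng = np.random.default_rng(0)
cloud = rng.uniform([-5, -5, 0], [5, 5, 4], size=(20000, 3))
print(section_areas(cloud))
```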

Relevance: 90.00%

Abstract:

ALICE, an experiment at CERN based on the LHC, is specialized in analyzing lead-ion collisions. ALICE will study the properties of quark-gluon plasma, a state of matter in which quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of 6 cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point, and it also aligns the tracks of the particles detected by the more external detectors. The two middle ITS layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors. In total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data coming from the front-end electronics through optical link connections, stores them in a large data FIFO and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: CARLOSrx data, which reads the data coming from the SDD detectors and configures the FEE, and CARLOSrx clock, which sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which deal with the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.

Relevance: 90.00%

Abstract:

The hydrogen production of the green microalga Chlamydomonas reinhardtii was evaluated by means of a detailed physiological and biotechnological study. First, a wide screening of hydrogen productivity was performed on 22 strains of C. reinhardtii, most of which are mutated at the level of the D1 protein. The screening revealed for the first time that mutations of the D1 protein may result in increased hydrogen production. Indeed, productions ranged between 0 and more than 500 mL of hydrogen per liter of culture (Torzillo, Scoma et al., 2007a), with the highest producer (L159I-N230Y) being up to 5 times more productive than strain cc124, which is widely adopted in the literature (Torzillo, Scoma et al., 2007b). The improved productivities of the D1 protein mutants were generally the result of high photosynthetic capabilities counteracted by high respiration rates. Optimization of the culture conditions was addressed according to the results of the physiological study of selected strains. In a first step, the photobioreactor (PBR) was provided with a multiple-impeller stirring system designed, developed and tested by us, using strain cc124. It was found that the impeller system was able to induce regular and turbulent mixing, which led to improved photosynthetic yields by means of light/dark cycles. Moreover, the improved mixing regime sustained higher respiration rates compared to those obtained with the commonly used stir-bar mixing system. In light of the results of the initial screening phase, both these factors are relevant to hydrogen production. Indeed, very high energy conversion efficiencies (light to hydrogen) were obtained with the impeller device, proving that our PBR was a good tool to both improve and study photosynthetic processes (Giannelli, Scoma et al., 2009). In the second part of the optimization, an accurate analysis of all the positive features of the high-performance strain L159I-N230Y pointed out that, with respect to the WT, it has: (1) a larger chlorophyll optical cross-section; (2) a higher electron transfer rate by PSII; (3) a higher respiration rate; (4) a higher efficiency of utilization of the hydrogenase; (5) a higher starch synthesis capability; (6) a higher per-cell D1 protein amount; (7) a higher zeaxanthin synthesis capability (Torzillo, Scoma et al., 2009). This information was combined with that obtained with the impeller mixing device to find the best culture conditions for optimizing productivity with strain L159I-N230Y. The main aim was to sustain as long as possible the direct PSII contribution, which leads to hydrogen production without net CO2 release. Finally, an outstanding maximum rate of 11.1 ± 1.0 mL/L/h was reached and maintained for 21.8 ± 7.7 hours, when the effective photochemical efficiency of PSII (ΔF/F'm) underwent a final drop to zero. If expressed in terms of chlorophyll (24.0 ± 2.2 µmol/mg chl/h), these production rates are 4 times higher than what has been reported in the literature to date (Scoma et al., 2010a, submitted). DCMU addition experiments confirmed the key role played by PSII in sustaining such rates. On the other hand, experiments carried out in similar conditions with the control strain cc124 showed an improved final productivity, but no constant direct PSII contribution. These results showed that, aside from fermentation processes, if proper conditions are supplied to selected strains, hydrogen production can be substantially enhanced by means of biophotolysis. A last study on the physiology of the process was carried out with the mutant IL. Although able to express and very efficiently utilize the hydrogenase enzyme, this strain was unable to produce hydrogen when sulfur deprived. However, in a specific set of experiments this goal was finally reached, pointing out that, besides (1) a state 1-2 transition of the photosynthetic apparatus, (2) starch storage and (3) the establishment of anaerobiosis, a timely transition to hydrogen production is also needed under sulfur deprivation to induce the process before energy reserves are driven towards other processes necessary for the survival of the cell. This information turned out to be crucial when moving outdoors for hydrogen production in a horizontal tubular 50-liter PBR under sunlight radiation. First attempts with laboratory-grown cultures showed that no hydrogen production under sulfur starvation can be induced if the culture is not first adapted outdoors. Once this adaptation was carried out, hydrogen production under direct sunlight radiation with C. reinhardtii was achieved for the first time in the literature (Scoma et al., 2010b, submitted). Experiments were also made to optimize productivity in outdoor conditions with respect to light dilution within the culture layers. Finally, a brief study of the anaerobic metabolism of C. reinhardtii during hydrogen oxidation was carried out. This study complements the understanding of the complex interplay of pathways that operate concomitantly in this microalga.

Relevance: 90.00%

Abstract:

The worldwide demand for clean and low-fuel-consumption transport promotes the development of safe, high-energy and high-power electrochemical storage and conversion systems. Lithium-ion batteries (LIBs) are considered today the best technology for this application, as demonstrated by the recent interest of the automotive industry in hybrid (HEV) and electric vehicles (EV) based on LIBs. This thesis work, starting from the synthesis and characterization of electrode materials and the use of non-conventional electrolytes, demonstrates that LIBs with novel and safe electrolytes and electrode materials meet the targets of specific energy and power established by the U.S. Department of Energy (DOE) for automotive application in HEVs and EVs. Chapter 2 reports the origin of all chemicals used, the description of the instruments used for synthesis and chemical-physical characterizations, the electrode preparation, the battery configurations and the electrochemical characterization procedure for electrodes and batteries. Since the electrolyte is the main critical point of a battery, in particular in large-format modules, chapter 3 focuses on the characterization of innovative and safe electrolytes based on ionic liquids (characterized by high boiling/decomposition points, thermal and electrochemical stability and appreciable conductivity) and on mixtures of ionic liquid with a conventional electrolyte. Chapter 4 discusses the microwave-accelerated sol-gel synthesis of carbon-coated lithium iron phosphate (LiFePO4-C), an excellent cathode material for LIBs thanks to its intrinsic safety and tolerance to abusive conditions, which showed excellent electrochemical performance in terms of specific capacity and stability. Chapter 5 presents the chemical-physical and electrochemical characterizations of graphite and titanium-based anode materials in different electrolytes. We also characterized a new anodic material, an amorphous SnCo alloy, synthesized with a nanowire morphology, which was shown to strongly enhance the electrochemical stability of the material during galvanostatic full charge/discharge cycling. Finally, chapter 6 reports different types of batteries, assembled using the LiFePO4-C cathode material with different anode materials and electrolytes, characterized by deep galvanostatic charge/discharge cycles at different C-rates and by the test procedures of the DOE protocol for evaluating pulse-power capability and available energy. First, we tested a battery with the innovative cathode material LiFePO4-C, a conventional graphite anode and a carbonate-based electrolyte (EC:DMC LiPF6 1M), which easily surpassed the target for power-assist HEV application. Given that the big concern with conventional lithium-ion batteries is the flammability of the highly volatile organic carbonate-based electrolytes, we made safe batteries with electrolytes based on ionic liquids (IL). In order to use a graphite anode in an IL electrolyte, we added to the IL 10% w/w of vinylene carbonate (VC), which produces a stable SEI (solid electrolyte interphase) and prevents the graphite exfoliation phenomenon. Then we assembled batteries with LiFePO4-C cathode, graphite anode and PYR14TFSI 0.4m LiTFSI with 10% w/w of VC, which overcame the DOE targets for HEV application and were stable for over 275 cycles. We also assembled and characterized "high safety" batteries with electrolytes based on the pure IL, PYR14TFSI with 0.4m LiTFSI as lithium salt, and on a mixture of this IL and a standard electrolyte (PYR14TFSI 50% w/w and EC:DMC LiPF6 50% w/w), using titanium-based anodes (TiO2 and Li4Ti5O12), which are commonly considered safer than graphite in abusive conditions. The batteries bearing the pure ionic liquid did not satisfy the targets for HEV application, but the batteries with Li4Ti5O12 anode and the 50-50 mixture electrolyte were able to surpass the targets. We also assembled and characterized a lithium battery (with lithium metal anode) with a polymeric electrolyte based on poly(ethylene oxide) (PEO20-LiCF3SO3 + 10% ZrO2), which satisfied the targets for EV application and showed very impressive cycling stability. In conclusion, we developed three lithium-ion batteries of different chemistries that proved suitable for application in power-assist hybrid vehicles: graphite/EC:DMC LiPF6/LiFePO4-C, graphite/PYR14TFSI 0.4m LiTFSI with 10% VC/LiFePO4-C and Li4Ti5O12/PYR14TFSI 50%-EC:DMC LiPF6 50%/LiFePO4-C. We also demonstrated that an all-solid-state polymer lithium battery such as Li/PEO20-LiCF3SO3 + 10% ZrO2/LiFePO4-C is suitable for application in electric vehicles. Furthermore, we developed a promising anodic material alternative to graphite, based on an amorphous SnCo alloy.
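As a generic sanity check of why LiFePO4 is chosen as the cathode throughout this work, the snippet below computes its well-known theoretical specific capacity from the Faraday constant and molar mass; this is textbook electrochemistry, not a result from the thesis.

```python
# Theoretical specific capacity of LiFePO4 (one-electron Fe2+/Fe3+ reaction).
F = 96485.0   # Faraday constant, C/mol
M = 157.76    # molar mass of LiFePO4, g/mol
n = 1         # electrons exchanged per formula unit

# Q [mAh/g] = n * F / (3.6 * M); the 3.6 converts C/g into mAh/g.
Q = n * F / (3.6 * M)
print(f"Theoretical capacity of LiFePO4: {Q:.0f} mAh/g")   # ~170 mAh/g
```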

Relevance: 90.00%

Abstract:

In this thesis, the analysis to reconstruct the transverse momentum (pt) spectra of pions, kaons and protons identified with the TOF detector of the ALICE experiment in pp minimum-bias collisions at √s = 7 TeV is reported. After a detailed description of all the parameters that influence the TOF PID performance (time resolution, calibration, alignment, matching efficiency, time-zero of the event), the method used to identify the particles, the unfolding procedure, is discussed. With this method, thanks also to the excellent TOF performance, the pion and kaon spectra can be reconstructed starting from 0.5 GeV/c, while the protons can be measured starting from 0.8 GeV/c. To check the robustness of these results, a comparison with the spectra obtained with a 3σ-cut PID procedure is reported, showing an agreement within 5%. The estimation of the systematic uncertainties is described. The reported spectra provide very useful information to tune the Monte Carlo generators that, as is shown, are not able to describe π, K and p production over the full momentum range. The same limitation of the theoretical models in describing the data is observed when comparing the K/π and p/π ratios obtained with the TOF analysis with the Monte Carlo predictions. Finally, the comparison between the TOF results and the spectra obtained with analyses that use other ALICE PID detectors and techniques, to extend the identified spectra to a wider pt range, is reported, showing an agreement within 6%.
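The n-sigma cut used above as a cross-check can be sketched as follows: for each mass hypothesis, the expected time of flight is computed from the track momentum and compared with the measured time in units of the detector resolution. The track length and resolution below are round-number assumptions for illustration, not ALICE values, and the thesis's main result relies on the unfolding procedure rather than on this cut.

```python
import math

MASSES = {"pi": 0.13957, "K": 0.49368, "p": 0.93827}   # GeV/c^2
C_CM_PER_PS = 0.0299792458                              # speed of light, cm/ps

def expected_tof(p_gev, mass_gev, track_length_cm=370.0):
    """Expected time of flight (ps) for a given momentum and mass hypothesis."""
    beta = p_gev / math.hypot(p_gev, mass_gev)   # beta = p / sqrt(p^2 + m^2)
    return track_length_cm / (beta * C_CM_PER_PS)

def n_sigma(t_measured_ps, p_gev, species, sigma_tof_ps=100.0):
    """Deviation of the measured time from the hypothesis, in units of resolution."""
    return (t_measured_ps - expected_tof(p_gev, MASSES[species])) / sigma_tof_ps

# A 1.2 GeV/c track measured at ~12450 ps is compatible (|n_sigma| < 3) only with
# the pion hypothesis under these assumptions:
for s in MASSES:
    print(s, round(n_sigma(12450.0, 1.2, s), 1))
```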

Relevance: 90.00%

Abstract:

Eco-labels and certification are among the many environmental policy tools that have come under scrutiny in recent years. This is because the damages of environmental degradation are becoming more apparent over time, and hence there is pressure to come up with tools that help solve even small parts of the problem. Eco-labels have been around for over 30 years; however, the market, the environment and eco-labels themselves have changed drastically during this period. Moreover, in the last 5 years there has been a sudden increase in eco-labels, making them more visible in the market and to the average consumer. All this has made evident that little is known about the effectiveness of eco-labels as environmental policy tools. Hence, there is a call to find answers regarding the actual effects of eco-labels on the market and on the environment. While this work cannot address whether eco-labels have an environmental impact, it addresses the effects of eco-labels on markets. Moreover, this work aims to identify the role of law in eco-labelling and to find a legal solution that would improve the performance of eco-labelling and certification.

Relevance: 90.00%

Abstract:

The diameters of traditional dish concentrators can reach several tens of meters, and the construction of monolithic mirrors is difficult at these scales: cheap flat reflecting facets mounted on a common frame generally reproduce a paraboloidal surface. When a standard imaging mirror is coupled with a dense PV array, problems arise because the focused solar image is intrinsically circular. Moreover, the corresponding irradiance distribution is bell-shaped, in contrast with the requirement that all the cells be under the same illumination: mismatch losses occur when interconnected cells experience different conditions, in particular in series connections. In this PhD thesis, we aim to solve these issues with a multidisciplinary approach, exploiting optical concepts and applications developed specifically for astronomical use, where the improvement of image quality is a very important issue. The strategy we propose is to boost the spot uniformity by acting solely on the primary reflector, avoiding the segmentation of the big mirror into numerous smaller elements that would need to be accurately mounted and aligned. In the proposed method, the shape of the mirror is described analytically by Zernike polynomials and its optimization is obtained numerically, yielding a non-imaging optic able to produce a quasi-square spot that is spatially uniform and has a prescribed concentration level. The freeform primary optics leads to a substantial gain in efficiency without secondary optics, and only simple electrical schemes are required for the receiver. The concept has been investigated theoretically by modeling an example of a CPV dense-array application, including the development of non-optical aspects such as the design of the detector and of the supporting mechanics. For the proposed method and the specific CPV system described, a patent application has been filed in Italy with the number TO2014A000016. The patent has been developed thanks to the collaboration between the University of Bologna and INAF (National Institute for Astrophysics).
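A minimal sketch of the core idea, describing a surface departure as a sum of Zernike terms evaluated over the normalized aperture, is given below. The handful of terms and the coefficients are arbitrary placeholders, not the optimized freeform shape obtained in the thesis.

```python
import numpy as np

# A few low-order (unnormalized) Zernike polynomials in polar coordinates, rho in [0, 1].
ZERNIKE = {
    "defocus":   lambda r, t: 2 * r**2 - 1,
    "astig_0":   lambda r, t: r**2 * np.cos(2 * t),
    "coma_x":    lambda r, t: (3 * r**3 - 2 * r) * np.cos(t),
    "spherical": lambda r, t: 6 * r**4 - 6 * r**2 + 1,
}

def freeform_sag(rho, theta, coeffs):
    """Surface departure z(rho, theta) = sum_j c_j * Z_j(rho, theta)."""
    z = np.zeros_like(rho)
    for name, c in coeffs.items():
        z += c * ZERNIKE[name](rho, theta)
    return z

# Evaluate a toy freeform on a polar grid of the unit-normalized aperture.
rho, theta = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 2 * np.pi, 200))
sag = freeform_sag(rho, theta, {"defocus": 0.5, "astig_0": 0.2, "spherical": -0.1})
print(sag.shape, sag.min(), sag.max())
```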

Relevance: 90.00%

Abstract:

From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Three computational approaches are presented here, able to analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
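In the spirit of the second approach, the sketch below counts codons in coding sequences and weights the counts by an expression level, giving an expression-weighted codon total. The toy sequences, expression values and function names are made up for illustration; they are not the thesis's pipeline or data.

```python
from collections import Counter

def codon_counts(cds):
    """Count codons in an in-frame coding sequence (reads complete triplets only)."""
    cds = cds.upper()
    return Counter(cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3))

def codonome(transcripts):
    """transcripts: dict gene -> (cds, expression). Returns expression-weighted codon totals."""
    total = Counter()
    for cds, expr in transcripts.values():
        for codon, n in codon_counts(cds).items():
            total[codon] += n * expr
    return total

toy = {
    "geneA": ("ATGGCTGCAAAATAA", 10.0),   # hypothetical CDS and expression level
    "geneB": ("ATGGCCGCCAAGTAA", 2.5),
}
print(codonome(toy).most_common(5))
```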

Relevance: 90.00%

Abstract:

The present dissertation aims at analyzing the construction of American adolescent culture through teen-targeted television series and the shift in perception that occurs as a consequence of the translation process. In light of the recent changes in television production and consumption modes, largely caused by new technologies, this project explores the evolution of Italian audiences, focusing on fansubbing (freely distributed amateur subtitles made by fans for fan consumption) and social viewing (the re-aggregation of television consumption based on social networks and dedicated platforms, rather than on physical presence). These phenomena are symptoms of a sort of ‘viewership 2.0’ and of a new type of active viewing, which calls for a revision of traditional AVT strategies. Using a framework that combines television studies, new media studies, and fandom studies with an approach to AVT based on Descriptive Translation Studies (Toury 1995), this dissertation analyzes the non-Anglophone audience’s growing need for participation in the global dialogue and in an appropriation process based on US scheduling and informed by the new paradigm of convergence culture, transmedia storytelling, and affective economics (Jenkins 2006 and 2007), as well as the constraints intrinsic to multimodal translation and the different types of linguistic and cultural adaptation performed through dubbing (which tends to be more domesticating; Venuti 1995) and fansubbing (typically more foreignizing). The study analyzes a selection of episodes from six of the most popular teen television series aired between 1990 and 2013, divided into three ages based on the different modes of television consumption: top-down, pre-Internet consumption (Beverly Hills, 90210, 1990–2000); the emergence of audience participation (Buffy the Vampire Slayer, 1997–2003; Dawson’s Creek, 1998–2003); and the age of convergence and Viewership 2.0 (Gossip Girl, 2007–2012; Glee, 2009–present; The Big Bang Theory, 2007–present).

Relevance: 90.00%

Abstract:

The study of supermassive black hole (SMBH) accretion during the phase of activity (when they become active galactic nuclei, AGN), and of its relation to host-galaxy growth, requires large datasets of AGN, including a significant fraction of obscured sources. X-ray data are strategic for AGN selection, because at X-ray energies the contamination from non-active galaxies is far less significant than in optical/infrared surveys, and the selection of obscured AGN, including a fraction of heavily obscured AGN, is much more effective. In this thesis, I present the results of the Chandra COSMOS Legacy survey, a 4.6 Ms X-ray survey covering the equatorial COSMOS area. The COSMOS Legacy depth (flux limit f = 2x10^-16 erg cm^-2 s^-1 in the 0.5-2 keV band) is significantly better than that of other X-ray surveys over a similar area, and it paves the way for surveys with future facilities, like Athena and X-ray Surveyor. The final Chandra COSMOS Legacy catalog contains 4016 point-like sources, 97% of which have a redshift. 65% of the sources are optically obscured and potentially caught in the phase of main BH growth. We used the sample of 174 Chandra COSMOS Legacy sources at z>3 to place constraints on the BH formation scenario. We found a significant disagreement between our space density and the predictions of a physical model of AGN activation through major mergers. This suggests that, in our luminosity range, BH triggering through secular accretion is likely preferred over a major-merger triggering scenario. Thanks to its large statistics, the Chandra COSMOS Legacy dataset, combined with the other multiwavelength COSMOS catalogs, will be used to answer questions on a large number of astrophysical topics, with particular focus on SMBH accretion in different luminosity and redshift regimes.
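A hedged sketch of how a space density of the kind compared with model predictions above can be derived from a source count is given below, using astropy for the comoving volume. The count, redshift bin, survey area and cosmology are placeholders, not the thesis's measurements.

```python
from astropy.cosmology import Planck15
import astropy.units as u

def space_density(n_sources, z_lo, z_hi, area_deg2):
    """Comoving density (Mpc^-3) of n_sources found in [z_lo, z_hi] over area_deg2."""
    sky_fraction = (area_deg2 * u.deg**2 / (41253.0 * u.deg**2)).decompose()
    # all-sky comoving volume of the redshift shell, scaled to the survey footprint
    shell = Planck15.comoving_volume(z_hi) - Planck15.comoving_volume(z_lo)
    return (n_sources / (shell * sky_fraction)).to(u.Mpc**-3)

# e.g. a placeholder bin: 50 sources at 3 < z < 3.5 over ~2 deg^2
print(space_density(50, 3.0, 3.5, 2.0))
```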