14 results for PERSUASIVE TECHNOLOGIES
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This PhD thesis aimed to validate and then apply innovative analytical methodologies for the determination of compounds with a harmful impact on human health, such as biogenic amines and ochratoxin A, in wines. The influence of production technology (pH, amino acid precursors and the use of different malolactic starters) on the biogenic amine content of wines was therefore evaluated. An HPLC method for the simultaneous determination of amino acids and amines, with pre-column derivatization with 9-fluorenylmethoxycarbonyl chloride (FMOC-Cl) and UV detection, was developed. Initially, the influence of pH, derivatization time and gradient profile was studied. To improve the separation of amino acids and amines and reduce the analysis time, the influence of different flow rates and the use of different columns in the chromatographic method were studied. A C18 Luna column was used first, followed by two monolithic Chromolith columns in series. The method proved suitable for an easy, precise and accurate determination of a relatively large number of amino acids and amines in wines. It was then applied to different wines produced in the Emilia-Romagna region. The investigation permitted discrimination between red and white wines. The amino acid content is related to the winemaking process, and the biogenic amine content of these wines does not represent a toxicological problem for human health. The study of the influence of technologies and wine composition demonstrated that the pH of the wines and the amino acid content are the most important factors. In particular, wines with pH > 3.5 show higher concentrations of biogenic amines than wines with lower pH. The enrichment of wines with nutrients also influences the content of some biogenic amines, which is higher in wines supplemented with amino acid precursors. In this study, amino acids and biogenic amines were not statistically affected by the strain of lactic acid bacteria inoculated as a starter for malolactic fermentation. An evaluation of different clean-up procedures (SPE-MycoSep, IACs and LLE) and determination methods (HPLC and ELISA) for ochratoxin A was carried out. The results showed that the SPE clean-up procedures are equally reliable, while the LLE procedure shows the lowest recovery. The ELISA method gave lower determinations and lower reproducibility than the HPLC method.
Abstract:
Chemistry can contribute in many different ways to solving the challenges we face in modifying our inefficient, fossil-fuel-based energy system. The present work was motivated by the search for efficient photoactive materials to be employed in the context of the energy problem: materials to be used in energy-efficient devices and in the production of renewable electricity and fuels. We present a new class of copper complexes that could find application in lighting technologies, serving as luminescent materials in LEC, OLED and WOLED devices. These technologies may provide substantial energy savings in the lighting sector. Moreover, copper complexes have recently been used as light-harvesting compounds in dye-sensitized photoelectrochemical solar cells, which offer a viable alternative to silicon-based photovoltaic technologies. We also present a few supramolecular systems containing fullerene, e.g. dendrimers, dyads and triads. The most complex of these arrays, which contain porphyrin moieties, are presented in the final chapter. They undergo photoinduced energy- and electron-transfer processes, including the formation of long-lived charge-separated states, i.e. the fundamental processes needed to power artificial photosynthetic systems.
Abstract:
Food technology today means reducing agricultural food waste, improving food security, enhancing the sensory properties of food, and enlarging food markets and food economies. Food technologists must be highly skilled technicians with a good scientific knowledge of food hygiene, food chemistry, industrial technologies and food engineering, experience in sensory evaluation, and analytical chemistry. Their role is to apply the modern vision of science to the field of human nutrition, advancing knowledge in food science. The present PhD project started with the aim of studying and improving the quality of frozen fruits. The freezing process is very powerful in preserving the characteristics of the initial raw material, but pre-treatments before freezing are necessary to improve quality, in particular the texture and enzymatic activity of frozen foods. Osmotic Dehydration (OD) and Vacuum Impregnation (VI) are useful techniques to modify the composition of fruits and vegetables and prepare them for the freezing process. These techniques permit the introduction of cryo-protective agents into the food matrix without significant changes to the original structure, but they cause a slight leaching of important intrinsic compounds. Phenolic and polyphenolic compounds, for example, decrease slightly in apples and nectarines treated with hypertonic solutions, but the concentration effect of the water removal driven by the osmotic gradient results in a final phenolic content similar to that of the raw material. In many experiments, a very important change in fruit composition concerned the aroma profile. This occurred in strawberries osmo-dehydrated under vacuum or at atmospheric pressure. The increase in some volatiles, probably due to fermentative metabolism induced by the osmotic stress of the hypertonic treatment, modified the sensory profile of the frozen fruits, which in some cases resulted in better consumer acceptability, with consumers preferring treated frozen fruits to untreated ones. Among the different processes used, a very interesting result was obtained with an osmotic pre-treatment carried out at refrigerated temperature for a long time. The final quality of the frozen strawberries was very high and a peculiar increase in the phenolic profile was detected. This interesting phenomenon was probably due to the induction of phenolic biosynthesis (for example, as a reaction to osmotic stress) or to the hydrolysis of polymeric phenolic compounds. Alongside this investigation into the cryo-stabilization and dehydrofreezing of fruits, deeper investigations of VI techniques were carried out, such as studies of texture changes in vacuum-impregnated prickly pear and of the use of VI and ultrasound (US) for the aroma enrichment of fruit pieces. Moreover, to develop sensory evaluation tools and analytical determinations (of volatiles and phenolic compounds), several studies were carried out and published in these fields, dealing specifically with off-flavour development during storage of boiled potatoes and with the determination of phenolic compounds by capillary zone electrophoresis (CZE) and high-performance liquid chromatography (HPLC).
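To make the concentration argument above concrete, the following minimal Python sketch computes water loss and solid gain from a simple osmotic dehydration mass balance and shows how water removal can offset a mild leaching of phenolics. All masses, water fractions and phenolic contents are hypothetical, and the calculation is a generic illustration rather than data or code from the thesis.

```python
# Mass-balance sketch for osmotic dehydration: water loss (WL) and solid gain (SG)
# per unit of initial mass, and the resulting concentration effect on a phenolic
# fraction. Sample masses and compositions below are hypothetical.
def osmotic_balance(m0, xw0, mt, xwt):
    """Return water loss and solid gain expressed per unit of initial sample mass."""
    wl = (m0 * xw0 - mt * xwt) / m0              # water lost to the osmotic solution
    sg = (mt * (1 - xwt) - m0 * (1 - xw0)) / m0  # solids gained from the solution
    return wl, sg

# Hypothetical apple sample: 100 g at 86% water, 80 g at 78% water after treatment
m0, xw0, mt, xwt = 100.0, 0.86, 80.0, 0.78
wl, sg = osmotic_balance(m0, xw0, mt, xwt)
print(f"water loss = {wl:.2f} g/g, solid gain = {sg:.2f} g/g")

# Concentration effect: even if 10% of the phenolics leach out, the smaller final
# mass can leave the concentration close to (or above) the raw-material value.
phenolics0 = 120.0               # mg per 100 g of raw material (hypothetical)
phenolics_t = phenolics0 * 0.90  # mg retained after a 10% leaching loss
conc0 = phenolics0 / m0          # mg/g before treatment
conc_t = phenolics_t / mt        # mg/g after treatment
print(f"phenolics: {conc0:.2f} mg/g before vs {conc_t:.2f} mg/g after")
```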
Abstract:
The question "Is artificial nutrition and hydration (ANH) therapy or not?" is one of the key points of end-of-life issues in Italy, since it was (and still is today) a strategic and crucial point of the Italian bioethics discussion about the last phases of human life: determining whether ANH is therapy implies the possibility of its being included in the list of treatments that could be refused within a living will document. But who is entitled to decide and judge whether ANH is a therapy or not? Scientists? The legislator? Judges? Patients? At first sight this issue seems just a matter of science, but more than a scientific definition is at stake. According to several scholars, we are in the era of post-academic science, in which science broadens discussion, production, negotiation and decision-making to social groups beyond the scientific communities. In this process, called co-production, on the one hand scientific knowledge derives from the interaction between scientists and society at large; on the other hand, science is functional to the co-production of social order. The continuous negotiation over which science is to be used in social decisions simply mirrors the negotiation among different ways of structuring and interpreting society. Thus, in the interaction between Science and Law, deciding what kind of science is suitable for a specific kind of law presupposes a well-defined idea of society behind this choice. I have analysed both the legislative path (still in progress) of the living will act in Italy and Eluana Englaro's judicial case (which somehow collapsed into the living will act negotiation), using official documents (hearings, texts of official conferences, committee comments and ruling texts) and interviewing key actors in the two processes from the science communication point of view (who speaks in the name of science? Who defines what a therapy is? And how do they do it?), finding support in the theoretical framework of Science and Technology Studies (S&TS).
Abstract:
The consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields have been investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques have been developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique to improve the freezing process of potatoes, bringing forward the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenol and enzymatic profile of melon juice was performed, and the statistical treatment of the data was carried out through a response surface method. Next, flavour enrichment of apple sticks was carried out by applying different techniques, such as atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenol compounds in vegetable matrices, such as chestnut bark extracts and olive mill wastewater. The management of waste disposal in the mill sector has been approached with the aim of reducing the amount of waste and, at the same time, recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes has been carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes has been compiled, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, has been drafted.
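As an illustration of the response surface method mentioned above, the following Python sketch fits a second-order polynomial model to a small two-factor design by least squares. The coded factor names (electric field strength and pulse number), the design points and the response values are hypothetical assumptions, not data or code from the thesis.

```python
# Minimal response-surface sketch: fit a quadratic model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to a small two-factor, central-composite-style design.
import numpy as np

# Hypothetical coded factors: x1 = electric field strength, x2 = number of pulses
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41])
# Hypothetical response, e.g. residual enzyme activity (%)
y = np.array([92, 78, 70, 55, 74, 75, 73, 88, 60, 90, 62])

# Design matrix for the second-order model, solved by least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = coeffs
print("fitted coefficients:", np.round(coeffs, 2))

# Predicted response at a new (coded) operating point
x1n, x2n = 0.5, 0.5
y_hat = b0 + b1*x1n + b2*x2n + b11*x1n**2 + b22*x2n**2 + b12*x1n*x2n
print(f"predicted residual activity at (0.5, 0.5): {y_hat:.1f} %")
```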
Abstract:
This doctoral dissertation aims to establish fiber-optic technologies that overcome the limiting issues of data communication in indoor environments. Specific applications are broadband mobile distribution in different in-building scenarios and high-speed digital transmission over short-range wired optical systems. Two key enabling technologies are considered: Radio over Fiber (RoF) techniques over standard silica fibers for distributed antenna systems (DAS), and plastic optical fibers (POFs) for short-range communications. Hence, the objectives and achievements of this thesis relate to the application of RoF and POF technologies in different in-building scenarios. On one hand, a theoretical and experimental analysis combined with demonstration activities has been performed on cost-effective RoF systems. Extensive modeling of the impact of modal noise on both the linear and non-linear characteristics of RoF links over silica multimode fiber has been performed to derive link design rules for an optimum choice of transmitter, receiver and launching technique. Successful transmission of Long Term Evolution (LTE) mobile signals over the resulting optimized RoF system on silica multimode fiber, employing a Fabry-Perot laser diode, the central launch technique and a photodiode with a built-in ball lens, was demonstrated up to 525 m with performance well within standard requirements. On the other hand, digital signal processing techniques to overcome the bandwidth limitation of POF have been investigated. An uncoded net bit-rate of 5.15 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter, a silicon photodiode, and DMT modulation with a bit and power loading algorithm. With the insertion of 3x2^N quadrature amplitude modulation constellation formats, an uncoded net bit-rate of 5.4 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter and a silicon avalanche photodiode. Moreover, simultaneous transmission of a 2 Gbit/s baseband DMT signal and a 200 Mbit/s ultra-wideband radio signal has been validated over a 50 m long POF link.
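The bit and power loading step mentioned above can be illustrated with a greedy (Hughes-Hartogs-style) allocation over DMT subcarriers. The Python sketch below uses hypothetical per-subcarrier gain-to-noise ratios, an assumed SNR gap and a normalized power budget; it shows the general technique, not the specific algorithm implemented in the thesis.

```python
# Greedy bit-and-power loading sketch for a DMT link: repeatedly add one bit
# to the subcarrier that needs the least incremental power, until the power
# budget is exhausted. All numerical values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_sub = 16                        # number of DMT subcarriers (hypothetical)
gnr = rng.uniform(1, 300, n_sub)  # gain-to-noise ratio per subcarrier (linear)
gamma = 10 ** (9.8 / 10)          # approximate SNR gap for uncoded QAM
p_budget = 1.0                    # normalized total power budget
b_max = 10                        # cap on bits per subcarrier

bits = np.zeros(n_sub, dtype=int)
power = np.zeros(n_sub)

def power_for(b, g):
    """Power required to carry b bits on a subcarrier with gain-to-noise ratio g."""
    return gamma * (2.0 ** b - 1.0) / g

while True:
    # Incremental power needed to add one more bit to each subcarrier
    delta = np.array([
        power_for(bits[i] + 1, gnr[i]) - power[i] if bits[i] < b_max else np.inf
        for i in range(n_sub)
    ])
    k = int(np.argmin(delta))
    if not np.isfinite(delta[k]) or power.sum() + delta[k] > p_budget:
        break
    bits[k] += 1
    power[k] += delta[k]

print("bits per subcarrier:", bits)
print("total bits per DMT symbol:", bits.sum())
print(f"power used: {power.sum():.3f} of {p_budget:.3f}")
```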
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show under several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
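For readers unfamiliar with the method at the centre of this thesis, the following Python sketch performs a one-way analysis of variance in the form Fisher introduced: it partitions the total sum of squares into between- and within-treatment components and computes the variance ratio F. The plot yields are hypothetical, not historical data from Rothamsted.

```python
# One-way analysis of variance: partition the total sum of squares into
# between- and within-treatment components and form the variance ratio F.
# The yields below are hypothetical illustration data.
import numpy as np

treatments = {
    "variety_A": np.array([4.1, 4.4, 3.9, 4.6]),
    "variety_B": np.array([5.0, 5.3, 4.8, 5.1]),
    "variety_C": np.array([4.5, 4.2, 4.7, 4.4]),
}

all_obs = np.concatenate(list(treatments.values()))
grand_mean = all_obs.mean()
k = len(treatments)          # number of treatments
n_total = all_obs.size       # total number of plots

ss_between = sum(len(y) * (y.mean() - grand_mean) ** 2 for y in treatments.values())
ss_within = sum(((y - y.mean()) ** 2).sum() for y in treatments.values())

df_between, df_within = k - 1, n_total - k
ms_between = ss_between / df_between   # mean square between treatments
ms_within = ss_within / df_within      # mean square within treatments (error)
F = ms_between / ms_within

print(f"SS_between={ss_between:.3f}  SS_within={ss_within:.3f}")
print(f"F({df_between}, {df_within}) = {F:.2f}")
```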
Abstract:
The meaning of a place has commonly been associated with rootedness, a sense of belonging to that setting. Nowadays, on the contrary, people are more concerned with the possibilities of free movement and networks of communication. Thus the meaning, as well as the materiality, of architecture has been dramatically altered by these forces. It is therefore significant to explore and redefine the sense and direction of architecture in the age of flow. In this dissertation we initially review the gradually changing concept of "place-non-place" and its underlying technological basis. Then we portray the transformation of the meaning of architecture under the influence of media, information technology and advanced forms of mobility at the dawn of the 21st century. Against this backdrop, there is a need to sort and analyze architectural practices in response to the triad of place, non-place and the space of flows, which we aim to accomplish conclusively. We also trace the concept of flow in the process of formation and transformation of old cities. As an illuminating case study, we look at the Persian Bazaar from a socio-architectural point of view. In other words, based on Robert Putnam's theory of social capital, we link the social context of the Bazaar with the architectural configuration of cities. This is why we believe that "cities as flow" are not necessarily a new paradigm.
Abstract:
Mainstream hardware is becoming parallel, heterogeneous and distributed, on every desk, in every home and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then we focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. We then shift the perspective from the development of intelligent software systems toward general-purpose software development. Using the expertise gained during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and, at the same time, provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
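As a minimal illustration of the actor paradigm discussed above (not of simpAL itself), the Python sketch below implements an actor as an object with a private mailbox and a dedicated thread that processes one message at a time, so that no state is shared between concurrent activities. The class and message names are hypothetical.

```python
# Minimal actor sketch: each actor owns a mailbox and processes messages one at
# a time on its own thread, so its state is never accessed concurrently.
import queue
import threading

class Counter:
    """An actor encapsulating a counter; interaction happens only via messages."""

    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        self.mailbox.put(message)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "inc":
                self.count += 1
            elif isinstance(msg, tuple) and msg[0] == "get":
                # Reply by putting the current value on the provided reply queue
                msg[1].put(self.count)
            elif msg == "stop":
                break

counter = Counter()
for _ in range(3):
    counter.send("inc")

reply = queue.Queue()
counter.send(("get", reply))
print("count =", reply.get())  # -> 3
counter.send("stop")
```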
Abstract:
In the last few years, the vision of our connected and intelligent information society has evolved to embrace novel technological and research trends. The diffusion of ubiquitous mobile connectivity and advanced handheld portable devices has amplified the importance of the Internet as the communication backbone for the use of services and data. The diffusion of mobile and pervasive computing devices, featuring advanced sensing technologies and processing capabilities, has triggered the adoption of innovative interaction paradigms: touch-responsive surfaces, tangible interfaces and gesture or voice recognition are finally entering our homes and workplaces. We are experiencing the proliferation of smart objects and sensor networks, embedded in our daily lives and interconnected through the Internet. This ubiquitous network of always-available interconnected devices is enabling new applications and services, ranging from enhancements to home and office environments to remote healthcare assistance and the birth of smart environments. This work will present some evolutions in the hardware and software development of embedded systems and sensor networks. Different hardware solutions will be introduced, ranging from smart objects for interaction to advanced inertial sensor nodes for motion tracking, with a focus on system-level design. They will be accompanied by the study of innovative data processing algorithms developed and optimized to run on board the embedded devices. Gesture recognition, orientation estimation and data reconstruction techniques for sensor networks will be introduced and implemented, with the goal of maximizing the trade-off between performance and energy efficiency. Experimental results will provide an evaluation of the accuracy of the presented methods and validate the efficiency of the proposed embedded systems.
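As an example of the kind of lightweight orientation estimation algorithm suited to on-board execution on an inertial sensor node, the Python sketch below implements a complementary filter that fuses gyroscope and accelerometer readings into a pitch estimate. The sensor samples, the filter gain alpha and the function name are illustrative assumptions; this is not the estimator developed in the thesis.

```python
# Complementary-filter sketch for pitch estimation from a gyroscope and an
# accelerometer: integrate the angular rate for short-term accuracy and blend
# in the gravity-based absolute reference to bound the drift.
import math

def complementary_filter(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Fuse gyro rate (rad/s) and accelerometer (ax, az in g) into a pitch angle."""
    pitch = 0.0
    estimates = []
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        pitch_gyro = pitch + rate * dt    # integrate angular rate
        pitch_acc = math.atan2(ax, az)    # absolute reference from gravity
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
        estimates.append(pitch)
    return estimates

# Synthetic data: constant rotation of 0.1 rad/s with a noiseless accelerometer
gyro = [0.1] * 100
accel = [(math.sin(0.1 * 0.01 * (i + 1)), math.cos(0.1 * 0.01 * (i + 1)))
         for i in range(100)]
pitch = complementary_filter(gyro, accel)
print(f"estimated pitch after 1 s: {pitch[-1]:.3f} rad")
```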
Abstract:
Waste management represents an important issue in our society, and Waste-to-Energy incineration plants have played a significant role in recent decades, with growing importance in Europe. One of the main issues posed by waste combustion is the generation of air contaminants. Particular concern surrounds acid gases, mainly hydrogen chloride and sulfur oxides, due to their potential impact on the environment and on human health. Therefore, in the present study the main available technological options for flue gas treatment were analyzed, focusing on dry treatment systems, which are increasingly applied in Municipal Solid Waste (MSW) incinerators. An operational model was proposed to describe and optimize the acid gas removal process. It was applied to an existing MSW incineration plant, where acid gases are neutralized in a two-stage dry treatment system. This process is based on the injection of powdered calcium hydroxide and sodium bicarbonate in reactors followed by fabric filters. HCl and SO2 conversions were expressed as a function of reactant flow rates, with model parameters calculated from literature and plant data. Implementation in process simulation software allowed the identification of optimal operating conditions, taking into account the reactant feed rates, the amount of solid products and the recycling of the sorbent. Alternative configurations of the reference plant were also assessed. The applicability of the operational model was extended by also developing a fundamental approach to the issue. A predictive model was developed, describing the mass transfer and kinetic phenomena governing acid gas neutralization with solid sorbents. The rate-controlling steps were identified through the reproduction of literature data, allowing the description of acid gas removal in the case study analyzed. A laboratory device was also designed and started up to assess the required model parameters.
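To illustrate how acid gas conversion can be related to the reactant feed rate, the Python sketch below combines the Ca(OH)2/HCl stoichiometry with an assumed exponential conversion curve. The flue gas flow, inlet HCl concentration, rate constant and the functional form itself are assumptions made for illustration; in the thesis the model parameters are calibrated on literature and plant data.

```python
# Sketch of an operational relation between sorbent feed rate and acid gas
# conversion. The form X = 1 - exp(-k * SR) and all constants are assumptions.
import math

M_HCL, M_CAOH2 = 36.46, 74.09  # molar masses, g/mol
FLUE_GAS = 100_000.0           # flue gas flow rate, Nm3/h (hypothetical)
HCL_IN = 1000.0                # inlet HCl concentration, mg/Nm3 (hypothetical)

def hcl_conversion(stoich_ratio, k=1.5):
    """Assumed conversion curve: diminishing returns as sorbent feed increases."""
    return 1.0 - math.exp(-k * stoich_ratio)

# HCl load and the stoichiometric Ca(OH)2 demand (Ca(OH)2 + 2 HCl -> CaCl2 + 2 H2O)
hcl_load = FLUE_GAS * HCL_IN / 1e6             # kg/h of HCl
caoh2_stoich = hcl_load / M_HCL / 2 * M_CAOH2  # kg/h of Ca(OH)2 at SR = 1

for sr in (1.0, 1.5, 2.0, 3.0):
    x = hcl_conversion(sr)
    outlet = HCL_IN * (1 - x)
    print(f"SR={sr:.1f}: Ca(OH)2 feed={sr * caoh2_stoich:6.1f} kg/h, "
          f"HCl removal={100 * x:5.1f} %, outlet={outlet:6.1f} mg/Nm3")
```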
Abstract:
With the aim of providing people with sustainable options, engineers are ethically required to hold the safety, health and welfare of the public paramount and to satisfy society's need for sustainable development. The global crisis and the related sustainability challenges call for a fundamental change in culture, structures and practices. Sustainability Transitions (ST) have been recognized as a promising framework for radical system innovation towards sustainability. In order to enhance the effectiveness of transformative processes, both the adoption of a transdisciplinary approach and the experimentation of practices are crucial. The evolution of approaches towards ST provides a series of inspiring cases which make it possible to identify advances in making sustainability transitions happen. In this framework, the thesis has emphasized the role of Transition Engineering (TE). TE adopts a transdisciplinary approach for engineering to face the sustainability challenges and address the risks of un-sustainability. With this purpose, a definition of Transition Technologies is provided, as valid instruments to contribute to ST. In the empirical section, several transition initiatives have been analysed, especially at the urban level. As a consequence, the model of the living lab of sustainability has emerged as crucial. Living labs are environments in which innovative technologies and services are co-created with users' active participation. In this framework, the university can play a key role as a learning organization. The core of the thesis concerns the experimental application of the transition approach within the School of Engineering and Architecture of the University of Bologna at the Terracini Campus. The final vision is to realize a living lab of sustainability. In particular, a Transition Team has been established and several transition experiments have been conducted. The final result is not only the improvement of the sustainability and resilience of the Terracini Campus, but also the demonstration that the university can generate solutions and strategies that tackle the complex, dynamic factors fuelling the global crisis.
Abstract:
This study concerns teachers' use of digital technologies in student assessment, and how the learning that is developed through the use of technology in mathematics can be evaluated. Nowadays math teachers use digital technologies in their teaching, but not in student assessment. The activities carried out with technology are seen as 'extra-curricular' (by both teachers and students), so students do not learn what they can do in mathematics with digital technologies. I was interested in knowing the reasons teachers do not use digital technology to assess students' competencies, and what they would need in order to design innovative and appropriate tasks to assess students' learning through digital technology. This dissertation is built on two main components: teachers and task design. I analyze teachers' practices involving digital technologies using Ruthven's Structuring Features of Classroom Practice, and how these practices relate to the types of assessment they use. I study the kinds of assessment tasks teachers design with a DGE (Dynamic Geometry Environment), using Laborde's categorization of DGE tasks. I consider the competencies teachers aim to assess with these tasks, and how their goals relate to the learning outcomes of the curriculum. This study also develops new directions in how to design suitable tasks for the mathematical assessment of students in a DGE, and it is driven by the desire to know what kinds of questions teachers might be more interested in using. I investigate the kinds of technology-based assessment tasks teachers value, and the type of feedback they give to students. Finally, I point out that the curriculum should include a range of mathematical and technological competencies that involve the use of digital technologies in mathematics, and I evaluate the possibility of taking advantage of technology feedback to allow students to continue learning while they are taking a test.