9 results for 200404 Laboratory Phonetics and Speech Science
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The "sustainability" concept relates to prolonging human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained for the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe, as well as in the U.S., are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures than hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, thereby improving conditions for workers and supporting sustainable development. The crumb-rubber modifier (CRM), produced from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only from an environmental standpoint but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first phase of an eco-compatible design, but it cannot be the only step.
The eco-compatible approach should also be extended to the design method and material characterization, because only through these phases is it possible to exploit the maximum potential of the materials used. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was chosen. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain, related to surface cracking and rutting respectively. It works in increments of time and, using the output from one increment recursively as input to the next, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness. The performance of this pavement was compared to that of the same pavement structure with different kinds of asphalt concrete as the surface layer. Three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed in comparison to a conventional asphalt concrete.
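The incremental-recursive idea, in which the state output by one time increment feeds recursively into the next, can be sketched as a simple loop. The following is an illustrative sketch only: the function name and the linear/power-law damage expressions are hypothetical placeholders, not the actual CalME formulations.

```python
# Illustrative sketch of an incremental-recursive (I-R) pavement simulation.
# All damage models and coefficients here are placeholders, not CalME's.

def simulate_pavement(initial_moduli, n_increments, traffic_per_increment):
    """Run n_increments time steps; each step's output state is the
    recursive input to the next step."""
    state = {
        "moduli": dict(initial_moduli),  # layer stiffness (MPa)
        "fatigue_damage": 0.0,           # 0 = intact, 1 = fully cracked
        "rut_depth_mm": 0.0,
    }
    history = []
    for step in range(n_increments):
        # Fatigue damage accumulates with traffic (placeholder linear model).
        state["fatigue_damage"] = min(
            1.0, state["fatigue_damage"] + 1e-6 * traffic_per_increment)
        # Layer moduli degrade as fatigue damage grows.
        state["moduli"] = {
            layer: m * (1.0 - 0.5 * state["fatigue_damage"])
            for layer, m in state["moduli"].items()}
        # Permanent deformation accumulates (placeholder power law).
        state["rut_depth_mm"] += 0.01 * (traffic_per_increment / 1000) ** 0.5
        history.append({k: state[k] for k in state})
    return state, history
```

A call such as `simulate_pavement({"surface": 3000.0, "base": 500.0}, 100, 50000)` would return the final layer moduli, fatigue damage and accumulated rut depth after 100 increments, mirroring how CalME propagates layer moduli, cracking, rutting and roughness through time.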
The first two chapters summarize the steps needed to establish the sustainable pavement design procedure. In Chapter I the problem of eco-compatible asphalt pavement design is introduced: low-environmental-impact materials such as warm mix asphalt and rubberized asphalt concrete are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced through an explanation of its different design approaches, with specific attention to the I-R procedure. In Chapter IV, the experimental program is presented together with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the simulations of asphalt pavement structures with different surface layers are reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rut depth in each bound layer were analyzed.
Abstract:
Herbicides are becoming emergent contaminants in Italian surface, coastal and ground waters due to their intensive use in agriculture. In marine environments herbicides have adverse effects on non-target organisms, such as primary producers, resulting in oxygen depletion and decreased primary productivity. Alterations of species composition in algal communities can also occur, owing to the different sensitivities of the species. In the present thesis the effects on different algal species of herbicides widely used in the Northern Adriatic Sea were studied. The main goal of this work was to study the influence of temperature on algal growth in the presence of the triazine herbicide terbuthylazine (TBA), and the cellular responses adopted to counteract the toxic effects of the pollutant (Chapters 1 and 2). The development of simulation models to be applied in environmental management is needed to organize and track information in a way that would not otherwise be possible and to simulate an ecological perspective. The data collected from laboratory experiments were used to simulate algal responses to TBA exposure under increasing temperature conditions (Chapter 3). Part of the thesis was conducted abroad. The work presented in Chapter 4 focused on the effect of high light on growth, toxicity and mixotrophy of the ichthyotoxic species Prymnesium parvum. In addition, a mesocosm experiment was conducted in order to study the synergistic effect of the pollutant emamectin benzoate with other anthropogenic stressors, such as oil pollution and induced phytoplankton blooms (Chapter 5).
Abstract:
The term artificial intelligence has acquired a lot of baggage since its introduction, and in its current incarnation is synonymous with Deep Learning (DL). The sudden availability of data and computing resources has opened the gates to myriad applications. Not all are created equal, though, and problems may arise especially in fields not closely related to the tasks that concern the tech companies that spearheaded DL. The perspective of practitioners seems to be changing, however. Human-Centric AI has emerged in the last few years as a new way of thinking about DL and AI applications from the ground up, with special attention to their relationship with humans. The goal is to design a system that can gracefully integrate into already established workflows, as in many real-world scenarios AI may not be good enough to completely replace humans; often such replacement may even be unneeded or undesirable. Another important perspective comes from Andrew Ng, a DL pioneer, who recently started shifting the focus of development from "better models" towards better, and smaller, data. He called this approach Data-Centric AI. Without downplaying the importance of pushing the state of the art in DL, we must recognize that if the goal is creating a tool for humans to use, more raw performance may not translate into more utility for the final user. A Human-Centric approach is compatible with a Data-Centric one, and we find that the two overlap nicely when human expertise is used as the driving force behind data quality. This thesis documents a series of case studies where these approaches were employed, to different extents, to guide the design and implementation of intelligent systems. We found that human expertise proved crucial in improving datasets and models. The last chapter includes a slight deviation, with studies on the pandemic, still preserving the human- and data-centric perspective.
Abstract:
Today’s pet food industry is growing rapidly, with pet owners demanding high-quality diets for their pets. The primary role of diet is to provide enough nutrients to meet metabolic requirements, while giving the consumer a feeling of well-being. Diet nutrient composition and digestibility are of crucial importance for the health and well-being of animals. A recent strategy to improve the quality of food is the use of "nutraceuticals" or "functional foods". At the moment, probiotics and prebiotics are among the most studied and frequently used functional food compounds in pet foods. The present thesis reports results from three different studies. The first study aimed to develop a simple laboratory method to predict pet food digestibility. The developed method was based on the two-step multi-enzymatic incubation assay described by Vervaeke et al. (1989), with some modifications in order to better represent the digestive physiology of dogs. A trial was then conducted to compare in vivo digestibility of pet foods with in vitro digestibility using the newly developed method. Correlation coefficients showed a close correlation between digestibility data for total dry matter and crude protein obtained with the in vivo and in vitro methods (0.9976 and 0.9957, respectively). Ether extract presented a lower correlation coefficient, although close to 1 (0.9098). Based on the present results, the new method could be considered an alternative system for evaluating dog food digestibility, reducing the need for experimental animals in digestibility trials. The second part of the study aimed to isolate from dog faeces a Lactobacillus strain capable of exerting a probiotic effect on the dog intestinal microflora. A L. animalis strain was isolated from the faeces of 17 adult healthy dogs. The isolated strain was first studied in vitro, when it was added to a canine faecal inoculum (at a final concentration of 6 Log CFU/mL) that was incubated in anaerobic serum bottles and syringes simulating the large intestine of dogs. Samples of fermentation fluid were collected at 0, 4, 8, and 24 hours for analysis (ammonia, SCFA, pH, lactobacilli, enterococci, coliforms, clostridia). Subsequently, the L. animalis strain was fed to nine dogs having lactobacilli counts lower than 4.5 Log CFU per g of faeces. The study indicated that the L. animalis strain was able to survive gastrointestinal passage and transitorily colonize the dog intestine. Both in vitro and in vivo results showed that the L. animalis strain positively influenced the composition and metabolism of the intestinal microflora of dogs. The third trial investigated in vitro the effects of several non-digestible oligosaccharides (NDO) on dog intestinal microflora composition and metabolism. Substrates were fermented using a canine faecal inoculum incubated in anaerobic serum bottles and syringes. Substrates were added at a final concentration of 1 g/L (inulin, FOS, pectin, lactitol, gluconic acid) or 4 g/L (chicory). Samples of fermentation fluid were collected at 0, 6, and 24 hours for analysis (ammonia, SCFA, pH, lactobacilli, enterococci, coliforms). Gas production was measured throughout the 24 h of the study. Among the tested NDO, lactitol showed the best prebiotic properties: it reduced coliforms and increased lactobacilli counts, enhanced microbial fermentation and promoted the production of SCFA while decreasing BCFA. All the substrates investigated showed one or more positive effects on dog faecal microflora metabolism or composition.
Further studies (in particular in vivo studies with dogs) will be needed to confirm the prebiotic properties of lactitol and evaluate its optimal level of inclusion in the diet.
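Correlation coefficients such as those reported above (0.9976 for dry matter, 0.9957 for crude protein, 0.9098 for ether extract) compare paired in vivo and in vitro digestibility values; the abstract does not name the statistic, so the Pearson product-moment coefficient is assumed here as the most common choice. A minimal sketch, using made-up illustrative digestibility values rather than the thesis data:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired digestibility values (%), for illustration only.
in_vivo = [82.1, 78.4, 85.0, 74.9, 80.2]
in_vitro = [81.5, 77.8, 84.1, 75.6, 79.9]
r = pearson(in_vivo, in_vitro)  # close to 1 when the methods agree
```

A coefficient near 1, as in the dry matter and crude protein results, indicates that the in vitro method ranks and scales the diets almost exactly as the in vivo trial does.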
Abstract:
FIR spectroscopy is an alternative way of collecting spectra of many inorganic pigments and corrosion products found on art objects, which are not normally observed in the MIR region. Most FIR spectra are traditionally collected in transmission mode, but as a real novelty it is now also possible to record FIR spectra in ATR (Attenuated Total Reflectance) mode. In FIR transmission we employ polyethylene (PE) for the preparation of pellets, embedding the sample in PE. Unfortunately, the preparation requires heating the PE in order to produce a transparent pellet. This will affect compounds with low melting points, especially those with structurally incorporated water. Another option in FIR transmission is the use of thin films. We test the use of polyethylene thin film (PETF), both commercial and laboratory-made. ATR collection is possible in both the MIR and FIR regions on solid, powdery or liquid samples. Changing from the MIR to the FIR region is easy, as it simply requires a change of detector and beamsplitter (which can be performed within a few minutes). No preparation of the sample is necessary, which is a huge advantage over the PE transmission method. The most obvious difference when comparing transmission with ATR is the distortion of band shape (which appears asymmetrical in the lower wavenumber region) and differences in intensity. However, the biggest difference can be the shift of strongly absorbing bands towards lower wavenumbers in ATR mode. These sometimes large band shifts necessitate the collection of standard library spectra in both FIR transmission and ATR modes if these two collection methods are to be employed for the analysis of unknown samples. Standard samples of 150 pigments and corrosion compounds are thus collected in both FIR transmission and ATR mode in order to build up a digital library of spectra for comparison with unknown samples.
XRD, XRF and Raman spectroscopy assist us in confirming the purity or impurity of our standard samples. Twenty-four didactic test tablets, with known pigment and binder painted on the surface of a limestone tablet, are used for testing the established library and different ways of collecting in ATR and transmission mode. In ATR, micro samples are scratched from the surface and examined in both the MIR and FIR regions. Additionally, direct surface contact of the didactic tablets with the ATR crystal is tested, together with water-enhanced surface contact. In FIR transmission we compare the powder from our test tablet on the laboratory PETF and embedded in PE. We also compare the PE pellets collected using a 4x beam condenser, focusing the IR beam area from 8 mm to 2 mm. A few samples collected from a mural painting in a Nepalese temple, corrosion products collected from archaeological Chinese bronze objects, and samples from mural paintings in an Italian abbey are examined by ATR or transmission spectroscopy.
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental in the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
The aim of the present work is to contribute to a better understanding of the relation between organization theory and management practice. It is organized as a collection of two papers: a theoretical and conceptual contribution, and an ethnographic study. The first paper is concerned with systematizing different literatures, inside and outside the field of organization studies, that deal with the theory-practice relation. After identifying a series of positions in the theory-practice debate and unfolding some of their implicit assumptions and limitations, a new position called entwinement is developed in order to overcome the status quo through reconciliation and integration. Accordingly, the paper proposes to reconceptualize theory and practice as a circular iterative process of action and cognition, science and common sense, enacted in the real world both by organization scholars and practitioners according to purposes at hand. The second paper is an ethnographic study of an encounter between two groups of expert academics and practitioners, occasioned by a one-year executive business master's programme at an international business school. The research articulates a process view of the knowledge exchange between management academics and practitioners in particular, and between individuals belonging to different communities of practice in general, and emphasizes its dynamic, relational and transformative mechanisms. Findings show that when they are given the chance to interact, academics and practitioners set up local provisional relations that enable them to act as change intermediaries vis-à-vis each other’s worlds, without tying themselves irremediably to each other or to the scenarios they conjointly projected during the master's experience. Finally, the study shows that provisional relations were accompanied by a recursive shift in knowledge modes.
While interacting, academics passed from theory to practical theorizing, practitioners passed from an involved practical mode to a reflexive and quasi-theoretical one, and then, as exchanges proceeded, the other way around.
Abstract:
In this thesis we discuss in what ways computational logic (CL) and data science (DS) can jointly contribute to the management of knowledge within the scope of modern and future artificial intelligence (AI), and how technically sound software technologies can be realised along the path. An agent-oriented mindset permeates the whole discussion, stressing the pivotal role of autonomous agents in exploiting both means to reach higher degrees of intelligence. Accordingly, the goals of this thesis are manifold. First, we elicit the analogies and differences between CL and DS, looking for possible synergies and complementarities along four major knowledge-related dimensions, namely representation, acquisition (a.k.a. learning), inference (a.k.a. reasoning), and explanation. In this regard, we propose a conceptual framework through which bridges between these disciplines can be described and designed. We then survey the current state of the art of AI technologies with respect to their capability to support bridging CL and DS in practice. After detecting gaps and opportunities, we propose the notion of the logic ecosystem as a new conceptual, architectural, and technological solution supporting the incremental integration of symbolic and sub-symbolic AI. Finally, we discuss how our notion of logic ecosystem can be reified into actual software technology and extended in many DS-related directions.
Abstract:
In the agri-food sector, measurement and monitoring activities contribute to high-quality end products. In particular, considering food of plant origin, several product quality attributes can be monitored. Among non-destructive measurement techniques, a large variety of optical techniques are available, including hyperspectral imaging (HSI) in the visible/near-infrared (Vis/NIR) range, which, thanks to its capacity to integrate image analysis and spectroscopy, has proved particularly useful in agronomy and food science. Many published studies on HSI systems were carried out under controlled laboratory conditions; in contrast, few studies describe the application of HSI technology directly in the field, in particular for high-resolution proximal measurements carried out on the ground. Against this background, the activities of the present PhD project were aimed at exploring and deepening knowledge of the application of optical techniques for the estimation of quality attributes of agri-food plant products. First, research activities on laboratory trials carried out on apricots and kiwis for the estimation of soluble solids content (SSC) and flesh firmness (FF) through HSI are reported; subsequently, FF was estimated on kiwis using a NIR-sensitive device; finally, the procyanidin content of red wine was estimated through a device based on the pulsed spectral sensitive photometry technique. In the second part, trials were carried out directly in the field to assess the degree of ripeness of red wine grapes by estimating SSC through HSI, and finally a method for the automatic selection of regions of interest in hyperspectral images of the vineyard was developed.
The activities described above have revealed the potential of optical techniques for sorting-line applications; moreover, the application of the HSI technique directly in the field has proved particularly interesting, suggesting further investigations to solve the variety of problems arising from the many environmental variables that may affect the results of the analyses.
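To illustrate the kind of automatic region-of-interest selection mentioned above, the sketch below thresholds a per-pixel normalized-difference index over a toy hyperspectral cube. The band choice, the index and the threshold are assumptions for illustration only; the thesis's actual selection method is not specified in the abstract.

```python
def select_roi(cube, band_a, band_b, threshold):
    """Return a boolean mask marking pixels whose normalized difference
    index (band_a - band_b) / (band_a + band_b) exceeds `threshold`.

    `cube` is a list of rows; each row is a list of pixels; each pixel
    is a list of reflectance values, one per spectral band.
    """
    mask = []
    for row in cube:
        mask_row = []
        for pixel in row:
            a, b = pixel[band_a], pixel[band_b]
            index = (a - b) / (a + b) if (a + b) != 0 else 0.0
            mask_row.append(index > threshold)
        mask.append(mask_row)
    return mask

# Toy 2x2 cube with three bands per pixel, values invented for illustration.
cube = [[[0.8, 0.2, 0.1], [0.2, 0.8, 0.1]],
        [[0.5, 0.5, 0.1], [0.9, 0.1, 0.1]]]
roi = select_roi(cube, 0, 1, 0.3)  # [[True, False], [False, True]]
```

In a vineyard image, a mask like this could separate canopy or fruit pixels from soil and background before the spectra inside the region are used for SSC estimation.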