14 results for Load test on SPT sampler
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The characterization of contaminated sediments is a complex problem. This work develops a characterization methodology that accounts both for the features of the contamination, through analyses aimed at determining the total contaminant content, and for the mobility of the pollutants themselves. An adequate characterization strategy can be applied to evaluate remediation treatments; to this end, the soil washing treatment was assessed by investigating the characteristics of the dredged sediments and of the process outputs (sands and fine fraction), and by comparing the characteristics of the output sand with those of sands commonly used for various applications. Compatibility was investigated from the chemical, granulometric and morphological points of view. To investigate mobility, leaching tests defined at both the international and the Italian (UNI) level were applied, and the technologies needed to run leaching tests effectively were developed, automating the management of the pH-stat test UNI CEN 14997. This was necessary because of the difficulty of running the test manually, since its timing can hardly be met by a human operator. Redox conditions affect pollutant mobility; in particular, air ageing of anoxic sediments causes appreciable changes in the oxidation state of some components, increasing their mobility. This is therefore an aspect to be considered when defining adequate storage and disposal conditions, and an experimental campaign was carried out for this purpose.
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our Ph.D. to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web access to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit that knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.
The thesis is composed of two parts where we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four-colour theorem; • our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue’s Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the contents of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping like CIC. In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, usually kept hidden from the user), one of which is specifically designed to be user-driven.
Abstract:
The aim of the thesis is to formulate a suitable Item Response Theory (IRT) based model to measure HRQoL (as a latent variable) using a mixed-response questionnaire and relaxing the hypothesis of a normally distributed latent variable. The new model is a combination of two models already presented in the literature, namely a latent trait model for mixed responses and an IRT model for a skew-normal latent variable. It is developed in a Bayesian framework; a Markov chain Monte Carlo procedure is used to generate samples of the posterior distribution of the parameters of interest. The proposed model is tested on a questionnaire composed of five discrete items and one continuous item to measure HRQoL in children, the EQ-5D-Y questionnaire. A large sample of children recruited in schools was used. In comparison with a model for only discrete responses and a model for mixed responses and a normal latent variable, the new model has better performance in terms of deviance information criterion (DIC), chain convergence times and precision of the estimates.
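As background for the estimation strategy, the snippet below is a minimal sketch of a random-walk Metropolis sampler for the person parameters of a plain two-parameter logistic IRT model, with simulated binary data and a standard normal latent trait. It is illustrative only: it holds the item parameters fixed and omits the mixed responses and skew-normal extension that the thesis actually develops.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data standing in for five discrete questionnaire items
# (hypothetical values, not the thesis data).
n_persons, n_items = 200, 5
theta_true = rng.normal(size=n_persons)        # latent trait (e.g. HRQoL)
a = rng.uniform(0.8, 1.5, size=n_items)        # item discriminations
b = rng.normal(size=n_items)                   # item difficulties
p = 1.0 / (1.0 + np.exp(-a * (theta_true[:, None] - b)))
y = (rng.random((n_persons, n_items)) < p).astype(int)

def log_post(theta):
    """Per-person log posterior: 2PL likelihood plus N(0,1) prior,
    with item parameters held fixed for simplicity."""
    logits = a * (theta[:, None] - b)
    ll = (y * logits - np.log1p(np.exp(logits))).sum(axis=1)
    return ll - 0.5 * theta**2

# Random-walk Metropolis: the posterior factorizes over persons, so each
# person's ability can be accepted or rejected independently.
theta = np.zeros(n_persons)
lp = log_post(theta)
draws = []
for step in range(3000):
    prop = theta + 0.5 * rng.normal(size=n_persons)
    lp_prop = log_post(prop)
    accept = np.log(rng.random(n_persons)) < lp_prop - lp
    theta = np.where(accept, prop, theta)
    lp = np.where(accept, lp_prop, lp)
    if step >= 1000:                           # discard burn-in
        draws.append(theta.copy())

theta_hat = np.stack(draws).mean(axis=0)       # posterior means
```

With only five binary items the posterior means recover the simulated abilities noisily but with a clearly positive correlation, which is the kind of check (alongside DIC and convergence diagnostics) used to compare competing model specifications.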
Abstract:
Polymeric adhesives have been used for many applications, such as suture and embolization, in place of classic surgical methods, or for dental uses. In this work both subjects were investigated and the results are presented in two parts. In the first, new dentinal adhesives with different polymerizable groups (methacrylic or vinyl-ether) were synthesized. Low sensitivity to hydrolysis and properties equal to or better than those of existing commercial products were considered essential. Moreover, these monomers needed to polymerize by radical photopolymerization, and functional groups of different characteristics were tested. All these products were characterized by the microtensile bond strength test to determine the bonding strength between the adhesive and the tooth. Concerning embolization, cyanoacrylates are nowadays the most used adhesives in surgery. Thus, they must satisfy several requirements. For instance, polymerization time and adhesive strength need to be low, to avoid diffusion of the products in the body and adhesion to the catheter. In order to overcome these problems we developed new cyanoacrylates, which polymerize practically instantly upon contact with blood but do not adhere strongly to the catheter, thanks to the presence of fluorine atoms linked to the ester chain. The synthesis of these products was carried out in several steps, including the depolymerization of the corresponding oligomers at high temperature under acid conditions. Two types of adhesion strength were determined. The bonding strength between human veins and a microcatheter was determined in vitro, using organic materials as the most realistic model. Another test, on two layers of skin, was conducted to verify the possible use of these new cyanoacrylates as a glue for sutures. In conclusion, we were able to demonstrate that some of the prepared monomers possess adhesive strength and polymerization times lower than those of the commercial product Glubran2.
Abstract:
In this work we conduct an experimental analysis of different behavioral models of economic choice. In particular, we analyze the role of overconfidence in shaping the beliefs of economic agents about the future path of their consumption or investment. We discuss the relevance of this bias in expectation formation from both a static and a dynamic point of view, and we analyze the effect of possible interventions aimed at achieving certain policy goals. The methodology we follow is both theoretical and empirical. In particular, we make extensive use of controlled economic field experiments to test the predictions of the theoretical models we propose. In the second part of the thesis we discuss the role of cognition and personality in shaping economic preferences and choices. In this way we build a bridge between established psychological research and novel findings in economics. Finally, we conduct a field study on the role of incentives in education. We design different incentive schemes and test their effectiveness in improving academic performance on randomized groups of students.
Abstract:
The evaluation of the structural performance of existing concrete buildings, built according to standards and with materials quite different from those available today, requires procedures and methods able to compensate for the lack of data about mechanical material properties and reinforcement detailing. To this end, detailed inspections and tests on materials are required, including tests on drilled cores; on the other hand, it is accepted that non-destructive testing (NDT) cannot be used as the only means of obtaining structural information, but can be used in conjunction with destructive testing (DT) through a representative correlation between DT and NDT. The aim of this study is to verify the accuracy of some correlation formulas available in the literature between the measured parameters, i.e. rebound index, ultrasonic pulse velocity and compressive strength (the SonReb method). To this end, a large number of DT and NDT tests were performed on several school buildings located in Cesena (Italy). The above relationships were assessed on site by correlating NDT results with the strength of cores drilled in adjacent locations. Concrete compressive strength assessed by means of NDT methods and evaluated with correlation formulas has the advantage of being much simpler to implement and use in future applications than other methods, even if its accuracy is strictly limited to the analysis of concretes having the same characteristics as those used for calibration. This limitation warranted the search for a different evaluation method for the non-destructive parameters obtained on site. To this aim, a methodology for neural identification of compressive strength is presented. Artificial Neural Networks (ANNs) suitable for the specific analysis were chosen, taking into account the developments presented in the literature in this field. The networks were trained and tested in order to obtain a more reliable strength identification methodology.
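As a sketch of the kind of DT-NDT correlation the SonReb method relies on, the snippet below fits the classic power law fc = a · R^b · V^c by log-linear least squares. The calibration data are invented for illustration and are not taken from the Cesena campaign.

```python
import numpy as np

# Hypothetical calibration data: rebound index R, ultrasonic pulse
# velocity V (m/s) and core compressive strength fc (MPa).
R  = np.array([28, 32, 35, 38, 41, 44, 47, 50], dtype=float)
V  = np.array([3400, 3600, 3750, 3900, 4050, 4150, 4300, 4400], dtype=float)
fc = np.array([14.0, 18.5, 22.0, 26.0, 30.5, 34.0, 39.0, 44.0])

# The SonReb power law fc = a * R^b * V^c becomes linear in log space:
# ln fc = ln a + b ln R + c ln V  ->  ordinary least squares.
X = np.column_stack([np.ones_like(R), np.log(R), np.log(V)])
coef, *_ = np.linalg.lstsq(X, np.log(fc), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]

fc_pred = a * R**b * V**c          # back-transformed predictions
print(a, b, c)
```

The abstract's caveat shows up directly in this formulation: the fitted (a, b, c) are tied to the calibration concrete, which is what motivates the ANN alternative for heterogeneous building stocks.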
Abstract:
Pathogenic fungi are responsible for vine diseases affecting grapevine yield and the organoleptic quality of the final wine products. The use of biocontrol agents can represent a sustainable alternative to synthetic fungicides, whose intense use can have negative effects on the ecosystem and cause an increase in pathogen populations resistant to synthetic agents. The principal aim of my PhD thesis was the isolation and characterization of new yeast strains and of Bacillus subtilis SV108 as biocontrol agents, and the comprehension of the mechanism of their antimicrobial action. Accordingly, twenty wild yeasts and one selected bacterium, chosen among 62 samples isolated from different Italian and Malaysian regions and molecularly identified, were evaluated in a preliminary screening test on agar. Results showed that the strongest inhibition of mycelial growth was exerted by Starmerella bacillaris FE08.05, Metschnikowia pulcherrima GP8 and Hanseniaspora uvarum GM19. In addition, Bacillus subtilis SV108 showed the ability to inhibit the mycelial growth of selected fungi by producing antimicrobial compounds in Malt Extract Broth medium, which were recovered by sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) and identified by electrospray ionization (ESI) tandem mass spectrometry (Triple TOF 5600). Moreover, in order to analyze the volatile fraction of compounds, a quantitative analysis of the VOC profiles was performed by GC/MS/SPME. The analysis highlighted the presence of isoamyl and phenylethyl alcohols and an overall higher presence of short-chain fatty acids and volatile ethyl esters. All the data collected suggest that the tested yeasts, found among the epiphytic microbiota associated with grape berries, can be effective for the biological control of pathogenic moulds. On the other hand, the proteomic study conducted on B. subtilis SV108 revealed two cyclic antifungal peptides which can explain its antimicrobial effect as a biocontrol agent against fungal pathogens in grapevine.
Abstract:
The presented study aimed to evaluate the productive and physiological behavior of a 2D multileader apple training system in the Italian environment, investigating both the possibility of increasing yield and the resolution of precision crop load management. Another objective was to find viable thinning thresholds guaranteeing high yields and matching fruit market requirements. The thesis consists of three studies carried out in a Pink Lady® - Rosy Glow apple orchard trained as a planar multileader system (double guyot). Data on fruiting leader (upright) dimensions, crop load, fruit quality, flowering and physiology (leaf gas exchange and fruit growth rate) were collected and analysed. The results showed that uprights are mutually dependent and support each other during fruit development. However, the fruit load of an individual upright and the distribution of fruit load among the uprights of a tree (approximately the plant crop load) seem to define both the independence of an upright from the others and the effect of single-upright crop load on final fruit quality. Correlations between fruit load and harvest fruit size were found, and from these, viable thinning thresholds based on different vegetative parameters were obtained. Moreover, it emerged that a random distribution of fruit load among uprights widens those thinning thresholds while keeping fruit quality unaltered. For this reason, uprights proved to be a partially physiologically dependent plant unit: when considered and managed as independent, no major problems for final fruit quality and production occurred. This partly confirms the possibility of shifting crop load management to the single upright.
The findings of the presented studies, together with the benefits of multileader planar training systems, suggest a high potential of 2D multileader training systems to increase the sustainability and profitability of apple production in Italian orchards, while easing the advent of automation in fruit production.
Abstract:
Cable-driven parallel robots offer significant advantages in terms of workspace dimensions and payload capability. They are attractive for many industrial tasks to be performed on a large scale, such as handling and manufacturing, without a substantial increase in costs and mechanical complexity with respect to a small-scale application. However, since cables can only sustain tensile stresses, cable tensions must be kept within positive limits during the end-effector motion. This problem can be managed by overconstraining the end-effector and controlling cable tensions. Tension control is typically achieved by mounting a load sensor on all cables, and using specific control algorithms to avoid cable slackness or breakage while the end-effector is controlled in a desired position. These algorithms require multiple cascade control loops and they can be complex and computationally demanding. To simplify the control of overconstrained cable-driven parallel robots, this Thesis proposes suitable mechanical design and hybrid control strategies. It is shown how a convenient design of the cable guidance system allows kinematic modeling to be simplified, without introducing geometric approximations. This guidance system employs swiveling pulleys equipped with position and tension sensors and provides a parallelogram arrangement of cables. Furthermore, a hybrid force/position control in the robot joint space is adopted. According to this strategy, a particular set of cables is chosen to be tension-controlled, whereas the other cables are length-controlled. The force-controlled cables are selected based on the computation of a novel index called force-distribution sensitivity to cable-tension errors. This index aims to evaluate the maximum expected cable-tension error in the length-controlled cables if a unit tension error is committed in the force-controlled cables. 
In practice, the computation of the force-distribution sensitivity allows one to determine which cables are best force-controlled, so as to ensure the lowest error in the overall force distribution when a hybrid force/position joint-space strategy is used.
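The selection rule can be illustrated on a toy planar robot. The index computed below is a hypothetical stand-in for the thesis' force-distribution sensitivity, not its exact formulation: from the partitioned structure-matrix equation it bounds the tension error induced in the length-controlled cables by a unit tension error on the force-controlled ones.

```python
import numpy as np
from itertools import combinations

# Toy planar end-effector: 2 DOF driven by 4 cables (2 redundant).
# Columns of A are unit cable directions; tensions t satisfy A @ t = -w
# for an external wrench w. Geometry is invented for illustration.
A = np.array([[ 0.8, -0.8,  0.6, -0.6],
              [ 0.6,  0.6,  0.8,  0.8]])
m, n = A.shape

def sensitivity(force_idx):
    """Worst-case tension error induced in the length-controlled cables
    by a unit-magnitude error on the force-controlled cables."""
    length_idx = [j for j in range(n) if j not in force_idx]
    A_l = A[:, length_idx]          # square (m x m) when n - len(force_idx) == m
    A_f = A[:, force_idx]
    # From A_l t_l = -w - A_f t_f: an error dt_f maps to -A_l^{-1} A_f dt_f.
    M = np.linalg.solve(A_l, A_f)
    return float(np.linalg.norm(M, ord=np.inf))

# Choose the force-controlled pair with the lowest sensitivity.
best = min(combinations(range(n), n - m), key=sensitivity)
print(best, round(sensitivity(best), 3))
```

The enumeration over cable subsets mirrors the design question in the abstract: among all admissible partitions into force-controlled and length-controlled cables, pick the one that propagates tension errors least.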
Abstract:
Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as genomics, proteomics and molecular dynamics, are particularly computationally intensive, requiring huge amounts of computational resources to run algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as of physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three-year Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides a reliable abstraction over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multistage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations.
Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on their content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier project GridBlast, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager offers novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage cost of keeping such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications pervaded by side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work resulted in various publications in journals and conference proceedings, as reported in the Appendix. I also presented my work orally at numerous international conferences related to Grid computing and bioinformatics.
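The content-based detection of identical sandbox files can be sketched as follows. The digest scheme and function names are assumptions for illustration, not Vnas' actual implementation: two files map to the same stored copy whenever their bytes are equal, regardless of name, owner or submission time.

```python
import hashlib
from pathlib import Path

def content_key(path: Path, chunk: int = 1 << 16) -> str:
    """Digest of a file's bytes, read incrementally so large
    sandbox files need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def deduplicate(paths):
    """Map each file to a single stored representative, keyed by
    content digest; the first file seen with a given content wins."""
    store, mapping = {}, {}
    for p in paths:
        key = content_key(p)
        store.setdefault(key, p)
        mapping[p] = store[key]
    return mapping
```

Keying storage by digest rather than filename is what lets the deduplication work across submissions and across users, at the cost of one hashing pass per uploaded file.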
Abstract:
Introduction: The term "clinimetrics" was introduced by Feinstein in 1982, who first noticed that, despite all the improvements in assessment methods, a number of clinical phenomena were still left unconsidered during the evaluation process. Even today clinical phenomena such as stress, relevant to disease progression and course, are not completely evaluated. Only recently, following the clinimetric approach, Fava and colleagues have introduced specific criteria for evaluating allostatic overload in clinical settings. Methods: Participants were 240 blood donors recruited from May 2007 to December 2009 in 4 different blood centers (AVIS) in Italy. Blood samples from each participant were collected for laboratory tests on the same day the self-rating instruments were administered (Psychosocial Index, Symptom Questionnaire, Psychological Well-Being scales, Temperament and Character Inventory, Self-Report Altruism scale). The study explores sample characteristics and correlates of stress in the total sample (part I), new selection criteria applied to existing instruments to identify individuals reporting allostatic load (part II), and differences in biological correlates between subjects with and without allostatic load (AL). Results: Significant differences according to gender and past illnesses were found in different dimensions of well-being and distress. Further, more than 60% of distress was explained by 4 main factors: anxiety, somatic symptoms, environmental mastery and persistence. According to the new criteria, 98 donors reported AL. Individuals with allostatic load reported engaging in fewer altruistic behaviours, and they also differed from controls in personality traits and character. In the last part, results showed significant differences among donors according to allostatic load on diverse biological parameters (RBC, MCV, immune assay). Conclusion: This study has obvious limitations due to its preliminary nature. Further research is needed to confirm that these new criteria may identify high-risk individuals reporting not only stressful situations but also vulnerabilities.
Abstract:
The first study verified the reliability of the Polimedicus software and the effects induced by aerobic training at the FatMax intensity. Sixteen overweight subjects, aged about 40-55 years, were enrolled and underwent an incremental test up to an RER of 0.95, from which point the load was increased by 1 km/h every minute until exhaustion. It was then verified whether the values extrapolated by the program matched those observed during a 1-hour constant-load test. After 8 weeks of training, the subjects performed another incremental test. The data showed that Polimedicus is not very reliable, especially for HR. In the second study a new program, Inca, was developed and its results were compared with the data obtained in the first study with Polimedicus. The final results showed that Inca is more reliable. In the third study, we verified the accuracy of the FatMax calculation with Inca and the FATmaxwork test. Twenty-five overweight subjects, aged 40-55 years, were enrolled and underwent the FATmaxwork test. It was then verified whether the values extrapolated by Inca matched those observed during a one-hour constant-load test. The analysis showed that FatMax was calculated accurately during the workload. Conclusion: determining this parameter proved somewhat difficult, owing to both inter-individual and intra-individual variability. In the future, Inca will need to be improved to obtain even more valid training protocols.
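FatMax estimation from indirect calorimetry can be sketched with Frayn's (1983) stoichiometric equations. The treadmill data below are invented for illustration, and the snippet is not the Inca implementation: it simply locates the exercise intensity at which the estimated fat oxidation rate peaks.

```python
# VO2 and VCO2 in L/min at each treadmill speed (km/h); illustrative values.
speeds = [5, 6, 7, 8, 9, 10]
vo2  = [1.20, 1.45, 1.70, 1.95, 2.20, 2.45]
vco2 = [0.95, 1.18, 1.42, 1.72, 2.05, 2.40]

def fat_oxidation(vo2, vco2):
    """Frayn (1983) non-protein fat oxidation rate, g/min."""
    return 1.67 * vo2 - 1.67 * vco2

rates = [fat_oxidation(o, c) for o, c in zip(vo2, vco2)]
fatmax_speed = speeds[rates.index(max(rates))]   # intensity of peak fat use
rer = [c / o for o, c in zip(vo2, vco2)]         # rises toward 1 with load
print(fatmax_speed, round(max(rates), 3))
```

Note how fat oxidation collapses as RER approaches 1, which is why the incremental protocol in the abstract switches regime once an RER of 0.95 is reached.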
Abstract:
The main objective of this thesis is to obtain a better understanding of the methods used to assess the stability of a slope. We illustrate the principal variants of the Limit Equilibrium (LE) method found in the literature, focusing our attention on the Minimum Lithostatic Deviation (MLD) method developed by Prof. Tinti and his collaborators (e.g. Tinti and Manucci, 2006, 2008). We had two main goals. The first was to test the MLD method on some real cases: we selected the case of the Vajont landslide, with the objective of reconstructing the conditions that caused the destabilization of Mount Toc, and two sites on the Norwegian margin where failures have not occurred recently, with the aim of evaluating their present stability state and assessing under which conditions they might be mobilized. The second goal was to study the stability charts by Taylor and by Michalowski, and to use the MLD method to investigate the correctness and adequacy of this engineering tool.
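For readers unfamiliar with limit-equilibrium analysis, the classic infinite-slope solution below illustrates the factor-of-safety idea that LE methods generalise. It is a textbook formula shown as a generic LE illustration, not the MLD method, and the parameter values are invented.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, u=0.0):
    """Factor of safety of an infinite slope with a planar slip surface.

    c        effective cohesion (kPa)
    phi_deg  effective friction angle (degrees)
    gamma    soil unit weight (kN/m^3)
    z        depth of the slip plane (m)
    beta_deg slope angle (degrees)
    u        pore pressure on the slip plane (kPa)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma * z * math.cos(beta) ** 2 - u        # effective normal stress
    tau = gamma * z * math.sin(beta) * math.cos(beta)    # driving shear stress
    return (c + sigma_n * math.tan(phi)) / tau

# Dry slope: FS > 1 (stable); pore pressure erodes the frictional term.
print(round(infinite_slope_fs(c=5, phi_deg=30, gamma=19, z=4, beta_deg=25), 2))  # → 1.41
```

Methods of slices, stability charts and the MLD method all refine this same ratio of resisting to driving actions for slip surfaces of general shape.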