7 results for PIAAC (Programme for the International Assessment of Adult Competencies)

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

Nuclear Magnetic Resonance (NMR) is a branch of spectroscopy based on the fact that many atomic nuclei may be oriented by a strong magnetic field and will absorb radiofrequency radiation at characteristic frequencies. The parameters that can be measured on the resulting spectral lines (line positions, intensities, line widths, multiplicities and transients in time-dependent experiments) can be interpreted in terms of molecular structure, conformation, molecular motion and other rate processes. In this way, high resolution (HR) NMR allows qualitative and quantitative analysis of samples in solution, in order to determine the structure of molecules in solution, among other applications. In the past, high-field NMR spectroscopy was mainly concerned with the elucidation of chemical structure in solution, but today it is emerging as a powerful exploratory tool for probing biochemical and physical processes. It represents a versatile tool for the analysis of foods. In the literature, many NMR studies have been reported on different types of food, such as wine, olive oil, coffee, fruit juices, milk, meat, eggs, starch granules and flour, using different NMR techniques. Traditionally, univariate analytical methods have been used to explore spectroscopic data. These methods measure or select a single descriptive variable from the whole spectrum, and in the end only this variable is analyzed. Applied to HR-NMR data, this univariate approach leads to several problems, due especially to the complexity of an NMR spectrum. In fact, the latter is composed of different signals belonging to different molecules, while the same molecule can give rise to several, generally strongly correlated, signals. Univariate methods, in this case, take into account only one or a few variables, causing a loss of information.
Thus, when dealing with complex samples like foodstuffs, univariate analysis of spectral data is not powerful enough. Spectra need to be considered in their entirety, and the whole data matrix must be taken into consideration when analysing them: chemometric methods are designed to treat such multivariate data. Multivariate data analysis is used for a number of distinct purposes, and the aims can be divided into three main groups:
• data description (explorative data structure modelling of any generic n-dimensional data matrix, e.g. PCA);
• regression and prediction (PLS);
• classification and prediction of class belonging for new samples (LDA, PLS-DA and ECVA).
The aim of this PhD thesis was to verify the possibility of identifying and classifying plants or foodstuffs into different classes, based on the concerted variation in metabolite levels detected by NMR spectra, using multivariate data analysis as a tool to interpret the NMR information. It is important to underline that the results obtained are useful for pointing out the metabolic consequences of a specific modification of foodstuffs, avoiding the use of a targeted analysis for the different metabolites. The data analysis is performed by applying chemometric multivariate techniques to the dataset of NMR spectra acquired. The research work presented in this thesis is the result of a three-year PhD study. This thesis reports the main results obtained from two main activities:
A1) Evaluation of a data pre-processing system in order to minimize unwanted sources of variation, due to different instrumental set-ups, manual spectra processing and sample preparation artefacts;
A2) Application of multivariate chemometric models in data analysis.
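As a rough illustration of the explorative use of PCA mentioned above, the sketch below applies PCA (via SVD of the mean-centred matrix) to a synthetic data matrix standing in for processed NMR spectra. The sample sizes, spectral regions and class structure are invented for illustration; they are not data from this thesis.

```python
import numpy as np

# Hypothetical NMR-like data matrix: rows = samples, columns = spectral
# variables. Two "classes" differ in a correlated block of signals,
# mimicking how one metabolite contributes several correlated peaks.
rng = np.random.default_rng(0)
n_samples, n_vars = 12, 200

spectra = rng.normal(0.0, 0.05, size=(n_samples, n_vars))
spectra[:6, 40:60] += 1.0    # class A: elevated signals in one region
spectra[6:, 120:140] += 1.0  # class B: elevated signals in another

# PCA via SVD of the mean-centred matrix: every variable is kept,
# unlike a univariate approach that inspects one peak at a time.
X = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                    # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)   # fraction of variance per component

# The first component separates the two simulated classes.
pc1 = scores[:, 0]
```

In a real application the scores plot would be inspected for grouping of samples, and the loadings (rows of `Vt`) would indicate which spectral regions drive the separation.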

Relevance:

100.00%

Publisher:

Abstract:

Primary angioplasty has been shown to be more effective than fibrinolysis in terms of mortality and adverse outcomes. More recent data, however, suggest that timely reperfusion with fibrinolysis is comparable to primary angioplasty. The current study gathered data from the International Survey of Acute Coronary Syndromes in Transitional Countries registry. Among 7406 ST-elevation myocardial infarction patients presenting within 12 hours from symptom onset, 6315 underwent primary percutaneous coronary intervention and 1091 were treated with fibrinolysis. The primary outcome was 30-day mortality, while the secondary outcome was a composite of the 30-day incidence of death, severe left ventricular dysfunction, stroke or reinfarction. Patients who underwent primary angioplasty tended to have a greater cardiovascular risk profile and were slightly older. On the other hand, patients treated with fibrinolysis received fewer antiplatelet medications yet were more often prescribed beta blockers in the acute phase. Among those who received fibrinolysis, 43% underwent coronary angiography while 32.3% were treated with a subsequent angioplasty. Total ischemic time was lower in patients undergoing fibrinolysis (185 minutes) than in those treated with primary angioplasty (258 minutes). Rates of the primary and secondary combined endpoints were higher in patients receiving fibrinolysis than in those receiving primary angioplasty (7.8% vs. 4.1%, p<0.0001, OR 1.97, 95% CI 1.38-2.81; and 14.8% vs. 10.1%, p<0.0001, OR 1.43, 95% CI 1.12-1.81). When considering only patients receiving reperfusion within 3 hours, regardless of reperfusion strategy, differences in mortality (6.3% vs. 4%, p=0.094, for fibrinolysis and primary angioplasty, respectively; OR 0.87, 95% CI 0.35-2.16) and in the combined secondary endpoint (12.9% vs. 10.8%, p=0.33; OR 0.98, 95% CI 0.58-1.64) were no longer observed, and female sex was no longer a significant predictor of adverse outcomes.
When performed within 3 hours of symptom onset, fibrinolysis is safe and feasible compared to primary angioplasty in terms of mortality and adverse outcomes.
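As a numerical aside, odds ratios like those quoted above come from 2x2 tables of event counts. The sketch below shows the standard crude calculation with counts chosen only to match the reported 7.8% vs. 4.1% event rates; the published estimates may be adjusted for covariates, so the confidence interval here is not expected to match the published one.

```python
import math

# Illustrative 2x2 table (counts invented to match the quoted rates):
# 85/1091 ≈ 7.8% events in one arm, 259/6315 ≈ 4.1% in the other.
events_a, total_a = 85, 1091
events_b, total_b = 259, 6315

a, b = events_a, total_a - events_a  # events / non-events, arm A
c, d = events_b, total_b - events_b  # events / non-events, arm B

# Crude odds ratio and Wald 95% CI on the log-odds scale.
odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

With these counts the crude OR is close to the 1.97 reported for the primary endpoint, which is consistent with the quoted percentages.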

Relevance:

100.00%

Publisher:

Abstract:

This Ph.D. project began with an overview of the most common and emerging types of fraud, and of possible countermeasures, in the olive oil sector. Furthermore, possible weaknesses in the current conformity check system for olive oil were highlighted. Among these, although organoleptic assessment is a fundamental tool for establishing the quality grade of virgin olive oils (VOOs), the scientific community has pointed out some of its drawbacks. In particular, the application of instrumental screening methods to support the panel test could reduce the workload of sensory panels and the cost of this analysis (e.g. for industries, distributors, and public and private control laboratories), permitting an increase in the number and efficiency of controls. On this basis, a research line called "Quantitative Panel Test" is one of the main expected outcomes of the OLEUM project, and it is also partially discussed in this doctoral dissertation. In this framework, analytical activities were carried out within this PhD project aimed at developing and validating analytical protocols for the study of the profiles of volatile compounds (VOCs) in the VOO headspace. Specifically, two chromatographic approaches to determine VOCs, one targeted and one semi-targeted, were investigated in this doctoral thesis. The results obtained will allow the possible establishment of concentration limits and ranges for selected volatile markers, related to fruitiness and defects, with the aim of supporting the panel test in the commercial categorization of VOOs. In parallel, a rapid instrumental screening method based on the analysis of VOCs has been investigated to assist the panel test through a fast pre-classification of VOO samples based on a known level of probability, thus increasing the efficiency of quality control.
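The idea of a fast pre-classification "based on a known level of probability" can be sketched as follows. The marker names (hexanal, ethyl acetate), weights and confidence threshold below are entirely hypothetical illustrations and do not come from the OLEUM protocols: a screening model assigns each sample a defect probability, and only uncertain samples are referred to the sensory panel.

```python
import math

def defect_probability(hexanal, ethyl_acetate):
    # Made-up logistic score on two volatile markers (arbitrary units);
    # a real model would be fitted on panel-labelled reference samples.
    z = 0.8 * hexanal + 1.2 * ethyl_acetate - 5.0
    return 1.0 / (1.0 + math.exp(-z))

def pre_classify(sample, confidence=0.90):
    # Classify only when the model is confident; otherwise defer to the
    # panel, so panel time is spent on the genuinely ambiguous samples.
    p = defect_probability(*sample)
    if p >= confidence:
        return "defective"
    if p <= 1.0 - confidence:
        return "non-defective"
    return "refer to panel"

samples = [(1.0, 0.5), (6.0, 4.0), (3.5, 1.8)]
labels = [pre_classify(s) for s in samples]
```

The confidence threshold is what makes the pre-classification's error level "known": raising it sends more samples to the panel but lowers the screening error rate.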

Relevance:

100.00%

Publisher:

Abstract:

In the food industry, quality assurance requires low-cost methods for the rapid assessment of the parameters that affect product stability. Foodstuffs are complex in their structure, mainly composed of gaseous, liquid and solid phases that often coexist in the same product. Special attention is given to water, whether as a natural component of most food products or as an ingredient added during a production process. In particular, water is structurally bound within the matrix and not completely available. Water can thus be present in foodstuffs in many different states: as water of crystallization, bound to protein or starch molecules, entrapped in biopolymer networks, or adsorbed on the solid surfaces of porous food particles. Traditional techniques for the assessment of food quality give reliable information but are destructive, time consuming and unsuitable for on-line application. The techniques proposed here address this time constraint and are able to characterize the main compositional parameters. The dielectric interaction response is mainly related to water and can provide information not only on the total water content but also on the degree of mobility of this ubiquitous molecule in different complex food matrices. This thesis aims to address this need: dielectric and electrical tools can be used to describe complex food matrices and predict food characteristics. The thesis is structured in three main parts. The first recalls the theoretical tools needed to assess the food parameters involved in defining quality, and the techniques able to address the problems that emerged. The second part explains the research conducted, illustrating the experimental plans in detail. The last section is devoted to rapid methods easily implementable in an industrial process.
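To make the idea concrete, a rapid dielectric measurement can be related to a compositional parameter by a simple calibration model. The sketch below fits a linear relation between relative permittivity and moisture content and uses it to estimate the moisture of a new sample; all numbers are invented for illustration and are not measurements from this thesis.

```python
import numpy as np

# Invented calibration set: dielectric readings (relative permittivity
# ε') paired with moisture values from a destructive reference method.
permittivity = np.array([5.2, 8.1, 11.3, 14.0, 17.2])   # measured ε'
moisture_pct = np.array([8.0, 12.5, 17.0, 21.0, 26.0])  # reference %

# Least-squares fit of: moisture ≈ slope * ε' + intercept
A = np.vstack([permittivity, np.ones_like(permittivity)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, moisture_pct, rcond=None)

def predict_moisture(eps):
    # Rapid, non-destructive estimate from a dielectric reading.
    return slope * eps + intercept

estimate = predict_moisture(10.0)
```

Once calibrated against the reference method, such a model allows the fast dielectric measurement to replace the destructive analysis for routine, on-line checks.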

Relevance:

100.00%

Publisher:

Abstract:

Human movement analysis (HMA) aims to measure the abilities of a subject to stand or to walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics, aiming to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers, baropodometric insoles, etc. This thesis focuses on the force platform (FP) and in particular on the quality assessment of FP data. The principal objective of our work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows.
Chapter 1: description of the physical principles underlying the functioning of a FP, and of how these principles are used to create force transducers, such as strain gauges and piezoelectric transducers; then, description of the two categories of FPs, three- and six-component, the signal acquisition (hardware structure), and the signal calibration; finally, a brief description of the use of FPs in HMA, for balance or gait analysis.
Chapter 2: description of inverse dynamics, the most common method used in the field of HMA. This method uses the signals measured by a FP to estimate kinetic quantities, such as joint forces and moments. These variables cannot be measured directly without very invasive techniques; consequently, they can only be estimated using indirect approaches such as inverse dynamics. Finally, a brief description of the sources of error present in gait analysis.
Chapter 3: state of the art in FP calibration.
The selected literature is divided into sections, each describing: systems for the periodic control of FP accuracy; systems for error reduction in FP signals; and systems and procedures for the construction of a FP. In particular, a calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail. This system was the "starting point" for the new system presented in this thesis.
Chapter 4: description of the new system, divided into its parts: 1) the algorithm; 2) the device; and 3) the calibration procedure, for correctly performing the calibration process. The algorithm characteristics were optimized by a simulation approach, and the results are presented here. In addition, the different versions of the device are described.
Chapter 5: experimental validation of the new system, achieved by testing it on 4 commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the center of pressure of an applied force. The new system can estimate local and global calibration matrices; using these, the non-linearity of the FPs was quantified and locally compensated. Further, a non-linear calibration is proposed, which compensates for the non-linear effect in FP functioning due to the bending of the upper plate. The experimental results are presented.
Chapter 6: influence of the FP calibration on the estimation of kinetic quantities with the inverse dynamics approach.
Chapter 7: the conclusions of this thesis: the need for a calibration of FPs and the consequent enhancement in kinetic data quality.
Appendix: calibration of the LC used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities. The optimal set-up is verified by experimental results.
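To illustrate the global calibration idea (estimating a matrix that maps the raw platform outputs to the applied six-component load) and the center-of-pressure accuracy check, here is a simplified numerical sketch. The platform model, noise level, load values and the COP sign convention (COP_x = -My/Fz, COP_y = Mx/Fz) are assumptions for this example, not the thesis's actual procedure.

```python
import numpy as np

# Simulate a platform with mild channel cross-talk: the "true" 6x6
# calibration matrix C maps raw outputs V to loads F, i.e. F = C @ V.
rng = np.random.default_rng(1)
C_true = np.eye(6) + 0.05 * rng.normal(size=(6, 6))

# Apply 40 known six-component loads and record noisy raw outputs.
F_applied = rng.uniform(-100.0, 100.0, size=(6, 40))
V_raw = np.linalg.solve(C_true, F_applied)
V_raw += 0.01 * rng.normal(size=V_raw.shape)  # measurement noise

# Least-squares estimate of C from the (V, F) pairs:
# solve V^T C^T ≈ F^T, then transpose back.
C_est, *_ = np.linalg.lstsq(V_raw.T, F_applied.T, rcond=None)
C_est = C_est.T

# COP accuracy check: build the wrench of a 700 N vertical load at a
# known point (x, y), pass it through the simulated platform, and
# recover the COP from the calibrated reading f = [Fx,Fy,Fz,Mx,My,Mz].
Fz, x, y = -700.0, 0.10, 0.05
wrench = np.array([0.0, 0.0, Fz, y * Fz, -x * Fz, 0.0])
f = C_est @ np.linalg.solve(C_true, wrench)
cop = np.array([-f[4] / f[2], f[3] / f[2]])  # should recover (x, y)
```

An in situ calibration device works on this principle: loads of known magnitude and position are applied across the plate surface, and the matrix minimizing the residual between applied and measured loads is computed.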

Relevance:

100.00%

Publisher:

Abstract:

So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, like a terrorist attack or a natural disaster, or of a "domino effect". During domino events, the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters, such as intense flooding, hurricanes, earthquakes and lightning, have also been found capable of enhancing the risk of an industrial area, triggering loss of containment of hazardous materials and major accidents. The scientific community usually refers to these accidents as "NaTechs": natural events triggering industrial accidents. In this document, a state of the art of the available approaches to the modelling, assessment, prevention and management of domino and NaTech events is described. On the other hand, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study aim to assist the progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhancing the safety and sustainability of the chemical and process industry.

Relevance:

100.00%

Publisher:

Abstract:

As a large and long-lived species with high economic value, restricted spawning areas and short spawning periods, the Atlantic bluefin tuna (BFT; Thunnus thynnus) is particularly susceptible to over-exploitation. Although BFT have been targeted by fisheries in the Mediterranean Sea for thousands of years, it is only in recent decades that the exploitation rate has reached far beyond sustainable levels. An understanding of the population structure, spatial dynamics, exploitation rates and the environmental variables that affect BFT is crucial for the conservation of the species. The aims of this PhD project were to 1) assess the accuracy of larval identification methods, 2) determine the genetic structure of modern BFT populations, 3) assess the self-recruitment rate in the Gulf of Mexico and Mediterranean spawning areas, 4) estimate the immigration rate of BFT to feeding aggregations from the various spawning areas, and 5) develop tools capable of investigating the temporal stability of population structuring in the Mediterranean Sea. Several weaknesses in modern morphology-based taxonomy, including the demographic decline of expert taxonomists, flawed identification keys, the reluctance of the taxonomic community to embrace advances in digital communications, and a general scarcity of modern user-friendly materials, are reviewed. Barcoding of scombrid larvae revealed important differences in the accuracy of the taxonomic identifications carried out by different ichthyoplanktologists following morphology-based methods. Using a genotyping-by-sequencing approach, a panel of 95 SNPs was developed and used to characterize the population structuring of BFT and the composition of adult feeding aggregations. Using novel molecular techniques, DNA was extracted from bluefin tuna vertebrae excavated from Late Iron Age and ancient Roman settlements, Byzantine-era Constantinople, and a 20th-century collection.
A second panel of 96 SNPs was developed to genotype the historical and modern samples in order to elucidate changes in population structuring and in allele frequencies at loci associated with selective traits.
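As a minimal sketch of how SNP genotypes can reveal population structuring, the example below codes genotypes as 0/1/2 (copies of the alternate allele), computes allele frequencies per population, and derives a basic per-locus FST as (HT - HS)/HT (Wright's definition, with no sample-size correction). The genotypes are invented toy data, not real bluefin tuna genotypes.

```python
import numpy as np

# Toy genotype matrices: rows = individuals, columns = SNP loci,
# values = 0/1/2 copies of the alternate allele.
pop_a = np.array([[0, 2, 1], [0, 2, 0], [1, 2, 1], [0, 1, 0]])
pop_b = np.array([[2, 0, 1], [2, 0, 0], [1, 0, 1], [2, 1, 0]])

def allele_freq(genotypes):
    # Frequency of the alternate allele at each locus
    # (each diploid individual carries two allele copies).
    return genotypes.sum(axis=0) / (2 * genotypes.shape[0])

p_a, p_b = allele_freq(pop_a), allele_freq(pop_b)

# Per-locus FST for two populations: expected heterozygosity of the
# pooled population (HT) vs. the mean within-population value (HS).
p_bar = (p_a + p_b) / 2
h_t = 2 * p_bar * (1 - p_bar)
h_s = (2 * p_a * (1 - p_a) + 2 * p_b * (1 - p_b)) / 2
fst = np.where(h_t > 0, (h_t - h_s) / h_t, 0.0)
```

Loci with strongly divergent frequencies between populations yield high FST and are the informative markers a panel like the ones described above would retain; loci with identical frequencies yield FST near zero.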