882 results for VaR Estimation Methods, Statistical Methods, Risk Management, Investments
Abstract:
Consumer demand for natural, minimally processed, fresh-like and functional food has led to increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields were investigated using laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound proved to be a valuable technique for improving the freezing of potatoes, bringing forward the onset of nucleation, especially when applied during the supercooling phase. The effects of pulsed electric fields on the phenol and enzymatic profile of melon juice were studied, and the statistical treatment of the data was carried out using a response surface method. Next, flavour enrichment of apple sticks was carried out using different techniques, namely atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenolic compounds in vegetable matrices such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the milling sector was approached with the aim of reducing the amount of waste while recovering valuable by-products for use in different industrial sectors. Finally, a sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes was prepared, and a sensory glossary, including general and specific definitions related to organic products and used in the European project Ecropolis, was drafted.
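Where the abstract mentions a response surface method for the pulsed electric field data, the sketch below shows the general shape of such a fit: a full quadratic model estimated by least squares over two coded factors. The factor names, design points and response values are invented for illustration and are not the thesis data.

```python
# Minimal response-surface sketch (illustrative data, not the thesis dataset).
import numpy as np

# Hypothetical coded factors: field strength (x1) and treatment time (x2)
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, 1.41])
y = np.array([82, 75, 70, 60, 68, 67, 69, 80, 62, 71])  # e.g. residual enzyme activity, %

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], coef.round(3))))
```

The fitted coefficients describe the response surface, from which optimal treatment conditions can be read off.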
Abstract:
This Ph.D. thesis focuses on the investigation of chemical and sensory analytical parameters linked to the quality and purity of different categories of oils obtained from olives: extra virgin olive oils, both those sold in large retail chains (supermarkets and discount stores) and those collected directly at Italian mills, as well as lower-quality oils (refined, lampante and "repaso"). Alongside traditional, well-established analytical procedures such as gas chromatography and high-performance liquid chromatography, I set up innovative, fast and environmentally friendly methods. For example, I developed analytical approaches based on Fourier transform mid-infrared spectroscopy (FT-MIR) and time domain reflectometry (TDR), coupled with robust chemometric elaboration of the results. I also investigated freshness and quality markers that are not included among the official parameters (in Italian and European regulations): this full chemical and sensory analytical plan yielded interesting information about the quality of the EVOOs, mainly on the Italian market, where the quality of EVOOs proved to vary widely in terms of sensory attributes, price classes and chemical parameters. Thanks to collaboration with other Italian and foreign research groups, I carried out several applied studies, focusing in particular on the shelf life of oils obtained from olives and on the effects of thermal stress on product quality. I also studied innovative technological treatments, such as clarification using inert gases as an alternative to traditional filtration. Moreover, during a three-and-a-half-month research stay at the University of Applied Sciences in Zurich, I carried out a study on the application of statistical methods to the elaboration of sensory results obtained from the official Swiss Panel and from consumer tests.
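To give a rough idea of the chemometric step mentioned above (spectra coupled with multivariate classification), here is a minimal PLS-DA-style sketch on synthetic stand-in spectra; the thesis' preprocessing, data and model choices are not reproduced.

```python
# Sketch of PLS discrimination of oil classes from FT-MIR-like spectra.
# The spectra are synthetic placeholders, not real measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n, wavenumbers = 40, 500
X = rng.normal(size=(n, wavenumbers))        # stand-in for absorbance spectra
y = np.repeat([0, 1], n // 2)                # 0 = extra virgin, 1 = lower quality
X[y == 1, 100:120] += 0.8                    # fake class-related absorption band

pls = PLSRegression(n_components=3).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)   # PLS-DA style thresholding
print("training accuracy:", (pred == y).mean())
```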
Abstract:
In this work, new tools for atmospheric pollutant sampling and analysis were applied in order to deepen source apportionment studies. The project was developed mainly through the study of atmospheric emission sources in a suburban area influenced by a municipal solid waste incinerator (MSWI), a medium-sized coastal tourist town and a motorway. Two main research lines were followed. Along the first line, the potential of PM samplers coupled with a wind-select sensor was assessed. The results showed that such samplers may be a valid support in source apportionment studies, although meteorological and territorial conditions can strongly affect the results. Moreover, new markers were investigated, with particular focus on biomass burning processes. OC proved to be a good indicator of biomass combustion, as did all of the determined organic compounds. Among the metals, lead and aluminium were well correlated with biomass combustion. Surprisingly, PM was not enriched in potassium during the bonfire event. The second research line consisted of the application of Positive Matrix Factorization (PMF), a new statistical tool for data analysis. This technique was applied to datasets with different time resolutions. PMF applied to atmospheric deposition fluxes identified six main sources affecting the area; the incinerator's relative contribution appeared to be negligible. PMF analysis was then applied to PM2.5 collected with samplers coupled with a wind-select sensor. The larger number of determined environmental indicators made it possible to obtain more detailed results on the sources affecting the area, and vehicular traffic emerged as the source of greatest concern for the study area; here too, the incinerator's relative contribution appeared to be negligible. Finally, the application of PMF analysis to hourly aerosol data demonstrated that the higher the temporal resolution of the data, the closer the source profiles were to the real ones.
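Positive Matrix Factorization decomposes a samples-by-species concentration matrix into non-negative source contributions and source profiles. The sketch below uses plain non-negative matrix factorization on simulated data as a stand-in; real PMF (e.g. EPA PMF) additionally weights residuals by measurement uncertainty, which is omitted here.

```python
# NMF as a simplified stand-in for PMF on a simulated samples x species matrix.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
samples, species, k = 120, 15, 6               # k = 6 sources, as in the deposition study
G_true = rng.gamma(2.0, 1.0, (samples, k))     # true source contributions
F_true = rng.gamma(2.0, 1.0, (k, species))     # true chemical profiles
X = G_true @ F_true + rng.normal(0, 0.05, (samples, species)).clip(min=0)

model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)    # estimated contribution of each source per sample
F = model.components_         # estimated chemical profile of each source
print(G.shape, F.shape)
```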
Abstract:
Foodborne diseases affect human health and economies worldwide in terms of health care costs and productivity losses. Prevention is necessary, and methods to detect, isolate and quantify foodborne pathogens play a fundamental role, evolving continuously to keep pace with microorganisms and with changes in food production. Official methods are mainly based on the growth of microorganisms in different media and their isolation on selective agars, followed by confirmation of presumptive colonies through biochemical and serological tests; complete identification requires from 7 to 10 days. Over the last decades, new molecular techniques based on antibodies and nucleic acids have allowed more accurate typing and faster detection and quantification. The present thesis aims to apply molecular techniques to improve the performance of official methods for two pathogens: Shiga-like toxin-producing Escherichia coli (STEC) and Listeria monocytogenes. In 2011, a new STEC strain belonging to serogroup O104 caused a large outbreak, so a method to detect and isolate STEC O104 is needed. The first objective of this work is the detection, isolation and identification of STEC O104 in artificially contaminated sprouts. Multiplex PCR assays and anti-O104 antibodies, incorporated into reagents for immunomagnetic separation and latex agglutination, were employed. Contamination levels below 1 CFU/g were detected. The multiplex PCR assays permitted rapid screening of enriched food samples and identification of isolated colonies, while immunomagnetic separation and latex agglutination provided high sensitivity and rapid identification of the O104 antigen, respectively. The development of a rapid method to detect and quantify Listeria monocytogenes, a high-risk pathogen, is the second objective. Detection of 1 CFU/ml and quantification of 10–1,000 CFU/ml in raw milk were achieved in about 3 h by a sample pretreatment step followed by quantitative PCR. The growth of L. monocytogenes in raw milk was also evaluated.
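For the quantification step, qPCR typically relies on a standard curve relating the quantification cycle (Cq) to log10 concentration. A minimal sketch with invented calibration values (not the thesis data):

```python
# qPCR standard-curve quantification sketch; all numbers are illustrative.
import numpy as np

log_cfu = np.array([5, 4, 3, 2, 1])              # log10 CFU/ml of the standards
cq = np.array([18.1, 21.5, 24.9, 28.2, 31.6])    # measured quantification cycles

slope, intercept = np.polyfit(log_cfu, cq, 1)    # slope ~ -3.32 at 100% efficiency
efficiency = 10 ** (-1 / slope) - 1

def quantify(sample_cq):
    """Estimate log10 CFU/ml of an unknown sample from its Cq."""
    return (sample_cq - intercept) / slope

print(f"efficiency ~ {efficiency:.0%}; Cq 26.0 -> {quantify(26.0):.2f} log CFU/ml")
```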
Abstract:
Schroeder's backward integration method is the most widely used method for extracting the decay curve of an acoustic impulse response and for calculating the reverberation time from that curve. The limits of this method and its possible improvements are widely discussed in the literature. In this work, a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox, and its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, both in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, namely the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30; as the evaluation range decreases, however, increasing differences emerge, particularly in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new method for energy decay curve extraction is its independence from the estimation of background noise. Finally, a statistical analysis was performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres, in order to establish whether a subset of measurements could be considered representative for a complete characterization of these opera houses.
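For reference, here is a minimal Python sketch of Schroeder's backward integration and the line-fit decay-time estimates discussed above (the thesis toolbox is in Matlab and is not reproduced; the synthetic impulse response and fit ranges are illustrative):

```python
# Schroeder backward integration and decay-time estimation, minimal version.
import numpy as np

def schroeder_edc_db(ir):
    """Energy decay curve in dB: backward integral of the squared impulse response."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]      # accumulate energy from tail to start
    return 10 * np.log10(energy / energy[0])

def decay_time(edc_db, fs, lo_db, hi_db):
    """Decay time from a line fit between two EDC levels, extrapolated to -60 dB
    (lo=-5, hi=-35 gives T30; lo=0, hi=-10 gives EDT)."""
    idx = np.where((edc_db <= lo_db) & (edc_db >= hi_db))[0]
    slope, _ = np.polyfit(idx / fs, edc_db[idx], 1)   # dB per second
    return -60.0 / slope

fs = 48000
t = np.arange(fs * 2) / fs
ir = np.exp(-3 * t) * np.random.default_rng(0).normal(size=t.size)  # synthetic IR
edc = schroeder_edc_db(ir)
print("T30 ~", round(decay_time(edc, fs, -5, -35), 2), "s,",
      "EDT ~", round(decay_time(edc, fs, 0, -10), 2), "s")
```

The backward accumulation from the tail is exactly the property the abstract contrasts with the "locally" defined curve of the new method.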
Abstract:
The thesis is concerned with local trigonometric regression methods. The aim was to develop a method for the extraction of cyclical components from time series. The main results of the thesis are the following. First, a generalization of the filter proposed by Christiano and Fitzgerald is provided for the smoothing of ARIMA(p,d,q) processes. Second, a local trigonometric filter is built and its statistical properties are derived. Third, the convergence properties of trigonometric estimators are discussed, together with the problem of choosing the order of the model. A large-scale simulation experiment was designed to assess the performance of the proposed models and methods. The results show that local trigonometric regression may be a useful tool for periodic time series analysis.
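The band-pass filter of Christiano and Fitzgerald that the thesis generalizes is available in statsmodels; below is a minimal sketch of extracting a cyclical component from a simulated trend-plus-cycle series (the thesis' own local trigonometric filter is not shown):

```python
# Christiano-Fitzgerald band-pass filtering of a simulated series.
import numpy as np
from statsmodels.tsa.filters.cf_filter import cffilter

rng = np.random.default_rng(0)
n = 200
t = np.arange(n)
# Linear trend + 20-period cycle + noise
x = 0.05 * t + np.sin(2 * np.pi * t / 20) + rng.normal(0, 0.3, n)

cycle, trend = cffilter(x, low=6, high=32, drift=True)  # keep periods of 6-32 samples
print(cycle[:5])
```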
Abstract:
Despite the scientific achievements of recent decades in astrophysics and cosmology, the majority of the Universe's energy content is still unknown. A potential solution to the "missing mass problem" is the existence of dark matter in the form of WIMPs. Because of the very small cross section for WIMP-nucleon interactions, the number of expected events is very limited (about 1 event/tonne/year), requiring detectors with a large target mass and a low background level. The aim of the XENON1T experiment, the first tonne-scale LXe-based detector, is to be sensitive to WIMP-nucleon cross sections as low as 10^-47 cm^2. To investigate whether such a detector can reach this goal, Monte Carlo simulations are mandatory for estimating the background. To this end, the GEANT4 toolkit was used to implement the detector geometry and to simulate the decays from the various electromagnetic and nuclear background sources. The analysis of the simulations showed a background level entirely acceptable for the purposes of the experiment: about 1 background event in a 2 tonne-year exposure. Using the Maximum Gap method, the XENON1T sensitivity was then evaluated, and the minimum of the WIMP-nucleon cross section was found at 1.87 x 10^-47 cm^2, at 90% CL, for a WIMP mass of 45 GeV/c^2. The results were independently cross-checked using the Likelihood Ratio method, which confirmed them with agreement within less than a factor of two, an entirely acceptable result considering the intrinsic differences between the two statistical methods. The PhD thesis thus demonstrates that the XENON1T detector will be able to reach its design sensitivity, lowering the limits on the WIMP-nucleon cross section by about 2 orders of magnitude with respect to current experiments.
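To give the flavour of how a background expectation translates into a cross-section limit, here is a deliberately simplified counting-experiment sketch. The thesis itself uses the Maximum Gap and Likelihood Ratio methods; the reference cross section and expected signal yield below are invented for illustration only.

```python
# Simplified counting-experiment upper limit (not the Maximum Gap method).
from scipy.stats import chi2

def poisson_upper_limit(n_obs, cl=0.90):
    """Classical Poisson upper limit on the mean via the chi-square relation."""
    return 0.5 * chi2.ppf(cl, 2 * (n_obs + 1))

background = 1.0                 # ~1 background event in a 2 tonne-year exposure
mu_sig = max(poisson_upper_limit(0) - background, 0.0)  # naive background subtraction

# Hypothetical: 130 signal events expected at a reference sigma of 1e-45 cm^2
ref_xsec, signal_at_ref = 1e-45, 130.0
print(f"sigma_UL ~ {ref_xsec * mu_sig / signal_at_ref:.2e} cm^2")  # order 1e-47
```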
Abstract:
Information is nowadays a key resource, and machine learning and data mining techniques have been developed to extract high-level information from large amounts of data. As most data comes in the form of unstructured text in natural languages, research on text mining is currently very active and addresses practical problems. Among these, text categorization deals with the automatic organization of large quantities of documents into predefined taxonomies of topic categories, possibly arranged in large hierarchies. In commonly proposed machine learning approaches, classifiers are trained automatically from pre-labeled documents: they can perform very accurate classification, but they often require a large training set and considerable computational effort. Methods for cross-domain text categorization have been proposed, which leverage a set of labeled documents from one domain to classify those of another. Most such methods use advanced statistical techniques and usually involve parameter tuning. A first contribution presented here is a method based on nearest centroid classification, in which profiles of categories are generated from the known domain and then iteratively adapted to the unknown one. Despite being conceptually simple and having easily tuned parameters, this method achieves state-of-the-art accuracy on most benchmark datasets with fast running times. A second, deeper contribution involves the design of a domain-independent model to distinguish the degree and type of relatedness between arbitrary documents and topics, inferred from the different types of semantic relationships between their representative words, identified by specific search algorithms. The application of this model is tested on both flat and hierarchical text categorization, where it potentially allows the efficient addition of new categories during classification. The results show that classification accuracy still requires improvement, but models generated from one domain prove to be effectively reusable in a different one.
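A minimal sketch of the iteratively adapted nearest-centroid idea described above: category profiles built from labeled source-domain documents are gradually shifted toward the unlabeled target domain. The toy corpus, similarity measure and update rule are simplifications, not the thesis algorithm.

```python
# Toy cross-domain nearest-centroid adaptation.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import normalize

source_docs = ["stocks fell sharply", "the team won the match",
               "bond yields rose", "the striker scored twice"]
source_labels = np.array([0, 1, 0, 1])               # 0 = finance, 1 = sport
target_docs = ["stocks rallied after earnings", "the striker scored a late goal"]

vec = TfidfVectorizer().fit(source_docs + target_docs)
Xs, Xt = vec.transform(source_docs), vec.transform(target_docs)

# Initial category profiles (centroids) from the known domain
centroids = normalize(np.vstack([np.asarray(Xs[source_labels == c].mean(axis=0)).ravel()
                                 for c in (0, 1)]))

for _ in range(5):                                   # iterative adaptation
    sim = np.asarray(Xt @ centroids.T)               # cosine-like similarity
    labels = sim.argmax(axis=1)
    new_c = np.vstack([np.asarray(Xt[labels == c].mean(axis=0)).ravel()
                       for c in (0, 1)])
    centroids = normalize(0.5 * centroids + 0.5 * new_c)  # gradual profile shift
print(labels)
```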
Abstract:
Revision hip arthroplasty is a surgical procedure consisting in the reconstruction of the hip joint through the replacement of a damaged hip prosthesis. Several factors may give rise to the failure of the artificial device: aseptic loosening, infection and dislocation are the principal causes of failure worldwide. The main effect is the appearance of bone defects in the region closest to the prosthesis, which weaken the bone structure needed for the biological fixation of the new artificial hip. For this reason, bone reconstruction is necessary before the surgical revision operation. This work arises from the need to test the effects of bone reconstruction for particular bone defects in the acetabulum after hip prosthesis revision. In order to perform biomechanical in vitro tests on hip prostheses implanted in a human pelvis or hemipelvis, a practical definition of a reference frame for this kind of bone specimen is required. The aim of the current study is to create a repeatable protocol for aligning hemipelvic samples in the testing machine, relying on a reference system based on anatomical landmarks of the human pelvis. Chapter 1 gives a general overview of the human pelvic bone: anatomy, bone structure, loads and the principal devices for hip joint replacement. Chapter 2 identifies the most common causes of revision hip arthroplasty by analysing data from the most reliable orthopaedic registries in the world. Chapter 3 presents an overview of the most used classifications of acetabular bone defects and fractures and of the most common techniques for acetabular and bone reconstruction. After a critical review of the scientific literature on reference frames for the human pelvis, chapter 4 proposes the definition of a new reference frame. Based on this reference frame, the alignment protocol for the human hemipelvis is presented, together with the statistical analysis confirming the good repeatability of the method.
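As an illustration of a landmark-based reference frame of the kind proposed, the sketch below builds a right-handed orthonormal frame from three pelvic landmarks (here the two anterior superior iliac spines and the pubic symphysis, a common convention); the thesis' own landmark set and axis definitions may differ.

```python
# Orthonormal reference frame from three anatomical landmarks (illustrative).
import numpy as np

def reference_frame(asis_r, asis_l, pubic_sym):
    """Return the frame origin and a 3x3 rotation whose rows are the axes."""
    asis_r, asis_l, pubic_sym = map(np.asarray, (asis_r, asis_l, pubic_sym))
    origin = (asis_r + asis_l) / 2
    x = asis_r - asis_l                              # medio-lateral axis
    x = x / np.linalg.norm(x)
    tmp = origin - pubic_sym                         # roughly vertical direction
    z = np.cross(x, tmp)                             # antero-posterior axis
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)                               # completes the right-handed frame
    return origin, np.vstack([x, y, z])

origin, R = reference_frame([130, 0, 20], [-130, 5, 22], [0, -80, -60])
print(origin, R @ R.T)   # R @ R.T ~ identity confirms orthonormality
```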
Abstract:
Agricultural workers are exposed to various risks, including chemical agents, noise and many other factors. One of the most characteristic and least studied risk factors is the microclimatic conditions encountered in the different phases of work (in the field, in the greenhouse, etc.). A typical condition is thermal stress due to high temperatures during harvesting in open fields or in greenhouses. In Italy, harvesting is carried out for many hours a day, mainly in the summer, with temperatures often above 30 °C. According to ISO 7243, these conditions can be considered dangerous for workers' health. The aim of this study is to assess the risks of exposure to microclimatic conditions (heat) for fruit and vegetable harvesters in central Italy by applying methods established by international standards. In order to estimate the risk to workers, the air temperature, radiant temperature and air speed were measured using instruments conforming to ISO 7726. Thermodynamic parameters and two further subjective parameters, clothing and the metabolic heat production rate related to the worker's physical activity, were used to calculate the predicted heat strain (PHS) for the exposed workers in conformity with ISO 7933. Environmental and subjective parameters were also measured for greenhouse workers, according to ISO 7243, in order to calculate the wet-bulb globe temperature (WBGT). The results show a slight risk for workers during manual harvesting in the field; on the other hand, the data collected in the greenhouses show that the risk to workers must not be underestimated. The study shows that, for manual harvesting in climates similar to that of central Italy, it is essential to provide plenty of drinking water and acclimatization for the workers in order to reduce health risks. Moreover, it emphasizes that the possible health risks for greenhouse workers increase from April through July.
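The WBGT index of ISO 7243 combines the natural wet-bulb, globe and air temperatures; below is a direct transcription of the standard's two formulas, with invented example readings:

```python
# WBGT index per ISO 7243; t_nw = natural wet-bulb, t_g = globe,
# t_a = air temperature, all in degrees C.
def wbgt(t_nw, t_g, t_a=None):
    """Outdoor (solar load) form if t_a is given, otherwise indoor form."""
    if t_a is None:
        return 0.7 * t_nw + 0.3 * t_g               # indoors / no solar load
    return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a       # outdoors with solar load

print(wbgt(24.0, 40.0, 32.0))   # hypothetical field reading  -> 28.0 C
print(wbgt(26.0, 38.0))         # hypothetical greenhouse reading -> 29.6 C
```

The computed WBGT is then compared with the standard's reference values, which depend on the worker's metabolic rate and acclimatization.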
Abstract:
Background: Men who have sex with men (MSM) remain the group most at risk of acquiring HIV infection in Britain. HIV prevalence appears to vary widely between MSM from different ethnic minority groups in this country, for reasons that are not fully understood. The aim of the MESH project was to examine in detail the sexual health of ethnic minority MSM living in Britain. Methods/Design: The main objectives of the MESH project were to explore, among ethnic minority MSM living in Britain: (i) sexual risk behaviour and HIV prevalence; (ii) their experience of stigma and discrimination; (iii) disclosure of sexuality; (iv) use of, and satisfaction with, sexual health services; and (v) the extent to which sexual health services (for treatment and prevention) are aware of the needs of ethnic minority MSM. The research was conducted between 2006 and 2008 in four national samples: (i) ethnic minority MSM living in Britain; (ii) a comparison group of white British MSM living in Britain; (iii) NHS sexual health clinic staff in 15 British towns and cities with significant ethnic minority communities; and (iv) sexual health promotion/HIV prevention service providers. We also recruited men from two "key migrant" groups living in Britain: MSM born in Central or Eastern Europe and MSM born in Central or South America. Internet-based quantitative and qualitative research methods were used. Ethnic minority MSM were recruited through advertisements on websites, in community venues, via informal networks and in sexual health clinics. White and "key migrant" MSM were recruited mostly through Gaydar, one of the most popular dating sites used by gay men in Britain. MSM who agreed to take part completed a questionnaire online. Ethnic minority MSM who completed the online questionnaire were asked whether they would be willing to take part in an online qualitative interview conducted by email. Service providers were identified through the British Association for Sexual Health and HIV (BASHH) and the Terrence Higgins Trust (THT) CHAPS partnerships; staff who agreed to take part were asked to complete a questionnaire online. The online survey was completed by 1241 ethnic minority MSM, 416 men born in South and Central America or Central and Eastern Europe, and 13,717 white British MSM; 67 ethnic minority MSM took part in the online qualitative interview. In addition, 364 people working in sexual health clinics and 124 health promotion workers from around Britain completed an online questionnaire. Discussion: The findings from this study will improve our understanding of the sexual health and needs of ethnic minority MSM in Britain.
Abstract:
Background— The age, creatinine, and ejection fraction (ACEF) score (age/left ventricular ejection fraction, +1 if creatinine >2.0 mg/dL) has been established as an effective predictor of clinical outcomes in patients undergoing elective coronary artery bypass surgery; however, its utility in "all-comer" patients undergoing percutaneous coronary intervention was as yet unexplored. Methods and Results— The ACEF score was calculated for 1208 of the 1707 patients enrolled in the LEADERS trial. Post hoc analysis was performed by stratifying clinical outcomes at the 1-year follow-up according to ACEF score tertiles: ACEFlow ≤1.0225, 1.0225< ACEFmid ≤1.277, and ACEFhigh >1.277. At 1-year follow-up, major adverse cardiac event–free survival was significantly lower in the highest tertile of the ACEF score (ACEFlow=92.1%, ACEFmid=89.5%, and ACEFhigh=86.1%; P=0.0218). Cardiac death was less frequent in ACEFlow patients than in ACEFmid and ACEFhigh patients (0.7% vs 2.2% vs 4.5%; hazard ratio=2.22, P=0.002). Rates of myocardial infarction were significantly higher in patients with a high ACEF score (6.7% for ACEFhigh vs 5.2% for ACEFmid and 2.5% for ACEFlow; hazard ratio=1.6, P=0.006). Clinically driven target-vessel revascularization also tended to be higher in the ACEFhigh group, but the difference among the 3 groups did not reach statistical significance. The rate of composite definite, possible, and probable stent thrombosis was also higher in the ACEFhigh group (ACEFlow=1.2%, ACEFmid=3.5%, and ACEFhigh=6.2%; hazard ratio=2.04, P<0.001). Conclusions— The ACEF score may be a simple way to stratify the risk of events in patients treated with percutaneous coronary intervention with respect to mortality and risk of myocardial infarction.
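The score itself is easy to compute from the definition given above; here is a direct transcription (the variable names and example patient are mine, and the tertile cut-offs are those reported in this analysis):

```python
# ACEF score: age / LVEF(%), plus 1 if serum creatinine exceeds 2.0 mg/dL.
def acef(age_years, lvef_percent, creatinine_mg_dl):
    score = age_years / lvef_percent
    if creatinine_mg_dl > 2.0:
        score += 1.0
    return score

def acef_tertile(score):
    """Tertile boundaries as used in this LEADERS analysis."""
    if score <= 1.0225:
        return "ACEFlow"
    return "ACEFmid" if score <= 1.277 else "ACEFhigh"

s = acef(72, 55, 1.4)                  # hypothetical patient
print(round(s, 3), acef_tertile(s))    # 1.309 ACEFhigh
```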
Abstract:
BACKGROUND: Detecting a benefit from closure of patent foramen ovale in patients with cryptogenic stroke is hampered by low rates of stroke recurrence and uncertainty about the causal role of patent foramen ovale in the index event. A method to predict patent foramen ovale-attributable recurrence risk is needed. However, individual databases generally have too few stroke recurrences to support risk modeling. Prior studies of this population have been limited by low statistical power for examining factors related to recurrence. AIMS: The aim of this study was to develop a database to support modeling of patent foramen ovale-attributable recurrence risk by combining extant data sets. METHODS: We identified investigators with extant databases including subjects with cryptogenic stroke investigated for patent foramen ovale, determined the availability and characteristics of data in each database, collaboratively specified the variables to be included in the Risk of Paradoxical Embolism database, harmonized the variables across databases, and collected new primary data when necessary and feasible. RESULTS: The Risk of Paradoxical Embolism database has individual clinical, radiologic, and echocardiographic data from 12 component databases, including subjects with cryptogenic stroke both with (n = 1925) and without (n = 1749) patent foramen ovale. In the patent foramen ovale subjects, a total of 381 outcomes (stroke, transient ischemic attack, death) occurred (median follow-up 2.2 years). While there were substantial variations in data collection between studies, there was sufficient overlap to define a common set of variables suitable for risk modeling. CONCLUSION: While individual studies are inadequate for modeling patent foramen ovale-attributable recurrence risk, collaboration between investigators has yielded a database with sufficient power to identify those patients at highest risk for a patent foramen ovale-related stroke recurrence who may have the greatest potential benefit from patent foramen ovale closure.
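As an indication of the kind of recurrence-risk modeling such a pooled database supports, here is a minimal Cox proportional-hazards sketch on invented subject-level data; the actual RoPE covariates and model are not reproduced, and the lifelines package is assumed to be available.

```python
# Toy Cox proportional-hazards model for recurrence risk (invented data).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followup": [2.1, 0.8, 3.0, 1.5, 2.7, 0.4, 2.2, 1.9],
    "recurrence":     [0,   1,   0,   1,   0,   1,   0,   1],
    "age":            [61,  48,  55,  70,  39,  66,  52,  73],
    "pfo":            [1,   1,   0,   1,   1,   0,   0,   1],
})
# Small penalizer stabilizes the fit on this tiny illustrative sample
cph = CoxPHFitter(penalizer=0.1).fit(df, duration_col="years_followup",
                                     event_col="recurrence")
cph.print_summary()
```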
Abstract:
The aim of this in vitro study was to assess the agreement among four techniques used as gold standards for the validation of methods for occlusal caries detection. Sixty-five human permanent molars were selected, and one site on each occlusal surface was chosen as the test site. The teeth were cut and prepared according to each technique: stereomicroscopy without staining (1), dye enhancement with rhodamine B (2) and with fuchsine/acetic light green (3), and semi-quantitative microradiography (4). Digital photographs of each prepared tooth were assessed by three examiners for caries extension. Weighted kappa, as well as Friedman's test with multiple comparisons, was used to compare the techniques and to verify statistically significant differences. Results: kappa values varied from 0.62 to 0.78, the latter being found for both dye enhancement methods. Friedman's test showed a statistically significant difference (P < 0.001), and multiple comparisons identified differences among all techniques except between the two dye enhancement methods (rhodamine B and fuchsine/acetic light green). Cross-tabulation showed that stereomicroscopy overscored the lesions. In conclusion, both dye enhancement methods showed good agreement, whereas stereomicroscopy overscored the lesions; the outcome of caries diagnostic tests may therefore be influenced by the validation method applied. Dye enhancement methods appear reliable as gold standard methods.
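The weighted kappa used above penalizes disagreements according to their distance on the ordinal scale; below is a minimal sketch with invented caries-extension scores for two of the techniques (not the study data):

```python
# Linearly weighted kappa between two techniques' ordinal scores.
from sklearn.metrics import cohen_kappa_score

stereo   = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]   # hypothetical extension scores
dye_rhod = [0, 1, 2, 1, 3, 1, 0, 2, 2, 1]

print(round(cohen_kappa_score(stereo, dye_rhod, weights="linear"), 2))
```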