925 results for Estimation Of Distribution Algorithm
Abstract:
Terrestrial radioactivity is, for most individuals, the major contributor to the total dose and is mostly due to the 238U, 232Th and 40K radionuclides. Indoor radioactivity in particular is principally due to 222Rn, a radioactive noble gas in the 238U decay chain and the second leading cause of lung cancer after cigarette smoking. The Vulsini Volcanic District is a well-known Quaternary volcanic area located between northern Latium and southern Tuscany (Central Italy). It is characterized by a high natural radiation background resulting from the high concentrations of 238U, 232Th and 40K in the volcanic products. In this context, subduction-related metasomatic enrichment of incompatible elements in the mantle source, coupled with magma differentiation within the upper crust, has given rise to U-, Th- and K-enriched melts. Almost every ancient village and town in this part of Italy has been built with volcanic rocks from the Vulsini Volcanic District. The radiological risk of living in this area has been estimated by considering separately: a. the risk associated with buildings made of volcanic products and built on volcanic rock substrates; b. the risk associated with soil characteristics. The former has been evaluated both with direct indoor 222Rn measurements and with simulations of "standard rooms" built with the tuffs and lavas from the Vulsini Volcanic District investigated in this work. The latter has been assessed through in situ measurements of 222Rn activity in soil gases. A radon risk map for the village of Bolsena has been developed from the soil radon measurements integrated with geological information. Data on airborne radioactivity in ambient aerosol at two elevated stations in Emilia Romagna (Northern Italy) under the influence of the Fukushima plume have also been collected; effective doses have been calculated, and an extensive comparison between the doses associated with artificial and natural sources in different areas is described and discussed.
Abstract:
The research activity underlying the present thesis was mainly centered on the design, development and validation of methodologies for the estimation of stationary and time-varying connectivity between different regions of the human brain during specific complex cognitive tasks. This activity involved two main aspects: i) the development of a stable, consistent and reproducible procedure for functional connectivity estimation with a high impact on the neuroscience field, and ii) its application to real data from healthy volunteers eliciting specific cognitive processes (attention and memory). In particular, the methodological issues addressed in the present thesis consisted in finding an approach, applicable in the neuroscience field, able to: i) include all the cerebral sources in the connectivity estimation process; ii) accurately describe the temporal evolution of connectivity networks; iii) assess the significance of connectivity patterns; iv) consistently describe relevant properties of brain networks. The advances provided in this thesis allowed quantifiable descriptors of cognitive processes to be identified in a high-resolution EEG experiment involving subjects performing complex cognitive tasks.
Abstract:
The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing, in cooperation with another algorithm called Umbrella Sampling. Umbrella Sampling adds a biasing term to the potential energy of the system in order to force it to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest. Subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been carried out with CUDA, a language for programming the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can substantially speed up the WHAM execution compared to previous serial CPU implementations; the serial WHAM CPU code, in fact, shows critical execution times for very large numbers of iterations. The algorithm has been written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfactory, showing a performance increase when the model was executed on graphics cards of higher compute capability. Nonetheless, the GPU used to test the algorithm is rather old and not designed for scientific computing. A further performance increase is likely if the algorithm were executed on high-performance GPU clusters. The thesis is organized as follows: I first describe the mathematical formulation of Umbrella Sampling and of the WHAM algorithm, with their applications to the study of ionic channels and to Molecular Docking (Chapter 1); then I present the CUDA architectures used to implement the model (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
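The WHAM self-consistency equations that such a thesis parallelizes can be sketched in a few lines of NumPy. This is a hedged illustration of the standard algorithm, not the thesis's C++/CUDA code; the histogram layout and variable names are assumptions:

```python
import numpy as np

def wham(hist, bias, n_samples, beta, n_iter=2000, tol=1e-10):
    """Self-consistent WHAM iteration for umbrella sampling.

    hist:      (W, B) biased histogram counts from W umbrella windows
    bias:      (W, B) bias potential U_j evaluated at the B bin centers
    n_samples: (W,)   total number of samples per window
    beta:      1 / kT
    Returns the unbiased bin probabilities P(x) and window free energies f_j.
    """
    W, B = hist.shape
    f = np.zeros(W)                       # per-window free-energy shifts
    num = hist.sum(axis=0)                # numerator: total counts per bin
    for _ in range(n_iter):
        # denominator: sum_j N_j * exp(beta * (f_j - U_j(x)))
        denom = (n_samples[:, None] * np.exp(beta * (f[:, None] - bias))).sum(axis=0)
        p = np.where(denom > 0, num / denom, 0.0)
        # update f_j = -(1/beta) * ln sum_x P(x) * exp(-beta * U_j(x))
        f_new = -np.log((p[None, :] * np.exp(-beta * bias)).sum(axis=1)) / beta
        f_new -= f_new[0]                 # fix the gauge: f_0 = 0
        converged = np.max(np.abs(f_new - f)) < tol
        f = f_new
        if converged:
            break
    p /= p.sum()                          # normalize the unbiased distribution
    return p, f
```

The independent window sums inside the loop are exactly the part that maps naturally onto GPU threads.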
Abstract:
The present study has been carried out with the following objectives: i) to investigate the attributes of the source parameters of local and regional earthquakes; ii) to estimate, as accurately as possible, M0, fc, Δσ and their standard errors, in order to infer their relationship with source size; iii) to quantify high-frequency earthquake ground motion and to study the source scaling. This work is based on observational data of micro-, small and moderate earthquakes from three selected seismic sequences, namely Parkfield (CA, USA), Maule (Chile) and Ferrara (Italy). For the Parkfield seismic sequence, a data set of 757 repeating micro-earthquakes (0 ≤ MW ≤ 2) in 42 clusters, collected by the borehole High Resolution Seismic Network (HRSN), has been analyzed and interpreted. We used the coda methodology to compute spectral ratios and obtain accurate values of fc, Δσ and M0 for three target clusters (San Francisco, Los Angeles, and Hawaii) of our data. We also performed a general regression on peak ground velocities to obtain reliable seismic spectra of all earthquakes. For the Maule seismic sequence, a data set of 172 aftershocks of the 2010 MW 8.8 earthquake (3.7 ≤ MW ≤ 6.2), recorded by more than 100 temporary broadband stations, has been analyzed and interpreted to quantify high-frequency earthquake ground motion in this subduction zone. We fully calibrated the excitation and attenuation of the ground motion in Central Chile. For the Ferrara sequence, we calculated moment tensor solutions for 20 events, from MW 5.63 (the largest main event, which occurred on May 20, 2012) down to MW 3.2, using a 1-D velocity model for the crust beneath the Pianura Padana built from all the geophysical and geological information available for the area. This PADANIA model allowed a numerical study of the characteristics of the ground motion in the thick sediments of the flood plain.
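The standard relations linking the quantities M0, fc and Δσ discussed above are the Brune circular-source model and the Hanks-Kanamori moment magnitude. The sketch below shows these textbook formulas, not the study's own calibration; the shear-wave speed is an assumed default:

```python
import math

def brune_stress_drop(m0, fc, beta=3500.0):
    """Stress drop (Pa) from seismic moment m0 (N*m) and corner frequency fc (Hz),
    for a circular Brune source with shear-wave speed beta (m/s, assumed)."""
    r = 2.34 * beta / (2.0 * math.pi * fc)   # Brune source radius (m)
    return 7.0 * m0 / (16.0 * r ** 3)        # Eshelby circular-crack relation

def moment_magnitude(m0):
    """Moment magnitude Mw from m0 in N*m (Hanks & Kanamori scale)."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)
```

Because Δσ scales with fc³ at fixed M0, small errors in the corner frequency propagate strongly into the stress-drop estimate, which is why careful spectral-ratio techniques such as the coda method matter.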
Abstract:
Schroeder's backward integration method is the most widely used method to extract the decay curve of an acoustic impulse response and to calculate the reverberation time from this curve. The limits and possible improvements of this method are widely discussed in the literature. In this work a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox, and its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, both in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30. As the evaluation range decreases, increasing differences emerge; in particular, the main differences appear in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new extraction method is its independence from the background noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres, with the aim of establishing whether a subset of measurements can be considered representative for a complete characterization of these opera houses.
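For reference, the classical Schroeder curve against which the new method is compared integrates the squared impulse response backwards in time. A minimal NumPy sketch (the function names and the -5/-35 dB fitting range used for T30 are conventional choices, not the toolbox's actual interface):

```python
import numpy as np

def schroeder_decay(ir):
    """Energy decay curve in dB via Schroeder backward integration:
    EDC(t) = integral of ir(tau)^2 from t to the end, normalized to 0 dB."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]   # backward cumulative energy
    return 10.0 * np.log10(energy / energy[0])

def decay_time(edc_db, fs, start_db=-5.0, stop_db=-35.0):
    """T30-style reverberation time: fit a line to the EDC between start_db
    and stop_db and extrapolate the fitted slope to a 60 dB decay."""
    t = np.arange(len(edc_db)) / fs
    mask = (edc_db <= start_db) & (edc_db >= stop_db)
    slope, intercept = np.polyfit(t[mask], edc_db[mask], 1)  # dB per second
    return -60.0 / slope
```

The backward cumulative sum is exactly the tail-to-beginning energy accumulation the abstract mentions; it makes the curve smooth but ties every point to the (noise-contaminated) tail, which is what the proposed "local" method avoids.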
Abstract:
Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation and financial markets. The empirical failure of New Keynesian models is partially due to the Rational Expectations (RE) paradigm, which entails a tight structure on the dynamics of the system: under this hypothesis, agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the "Quasi-Rational" Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the lag length of the reduced form is the same as in the 'best' statistical model.
Abstract:
Agricultural workers are exposed to various risks, including chemical agents, noise, and many other factors. One of the most characteristic and least known risk factors is the microclimatic conditions in the different phases of work (in the field, in the greenhouse, etc.). A typical condition is thermal stress due to high temperatures during harvesting operations in open fields or in greenhouses. In Italy, harvesting is carried out for many hours during the day, mainly in the summer, with temperatures often higher than 30 °C. According to ISO 7243, these conditions can be considered dangerous for workers' health. The aim of this study is to assess the risks of exposure to microclimatic conditions (heat) for fruit and vegetable harvesters in central Italy by applying methods established by international standards. In order to estimate the risk for workers, the air temperature, radiative temperature, and air speed were measured using instruments conforming to ISO 7726. These thermodynamic parameters, together with two subjective parameters, clothing and the metabolic heat production rate related to the worker's physical activity, were used to calculate the predicted heat strain (PHS) for the exposed workers in conformity with ISO 7933. Environmental and subjective parameters were also measured for greenhouse workers, according to ISO 7243, in order to calculate the wet-bulb globe temperature (WBGT). The results show a slight risk for workers during manual harvesting in the field; the data collected in the greenhouses, on the other hand, show that the risk for workers must not be underestimated. The study shows that, for manual harvesting work in climates similar to that of central Italy, it is essential to provide plenty of drinking water and acclimatization for the workers in order to reduce health risks. Moreover, it emphasizes that the possible health risks for greenhouse workers increase from April through July.
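The WBGT index referenced above combines the measured temperatures in the fixed weightings given by ISO 7243; a minimal sketch of the two standard forms (variable names are mine):

```python
def wbgt(t_nw, t_g, t_a=None):
    """Wet-bulb globe temperature per ISO 7243 (all temperatures in deg C).

    t_nw: natural wet-bulb temperature
    t_g:  globe temperature
    t_a:  dry-bulb air temperature; pass it for the outdoor (solar load) form,
          omit it for the indoor / no-solar-load form.
    """
    if t_a is None:
        return 0.7 * t_nw + 0.3 * t_g             # indoor form
    return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a     # outdoor form with solar load
```

The heavy 0.7 weighting on the natural wet-bulb term is why humid greenhouse conditions can be riskier than hotter but drier open fields.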
Abstract:
PURPOSE: The purpose of this retrospective study was to examine the reliability of virtually estimating abdominal blood volume by segmentation of postmortem computed tomography (PMCT) data. MATERIALS AND METHODS: Twenty-one cases with free abdominal blood were investigated by PMCT and autopsy. The volume of the blood was estimated using a manual segmentation technique (Amira, Visage Imaging, Germany) and the results were compared to autopsy data. Six of the 21 cases had undergone additional postmortem computed tomographic angiography (PMCTA). RESULTS: The virtually estimated abdominal blood volumes did not differ significantly from those measured at autopsy. Additional PMCTA did not bias the data significantly. CONCLUSION: Virtual estimation of abdominal blood volume is a reliable technique, and a useful tool to provide additional information in cases where no autopsy is performed or where a postmortem angiography is performed.
Abstract:
Standard methods for the estimation of the postmortem interval (PMI, time since death), based on the cooling of the corpse, are limited to about 48 h after death. As an alternative, noninvasive postmortem observation of alterations of brain metabolites by means of (1)H MRS has been suggested for estimating the PMI at room temperature, so far without including the effect of other ambient temperatures. In order to study the temperature effect, localized (1)H MRS was used to follow brain decomposition in a sheep brain model at four different temperatures between 4 and 26°C, with repeated measurements up to 2100 h postmortem. The simultaneous determination of 25 different biochemical compounds at each measurement allowed the time courses of the concentration changes to be followed. A sudden and almost simultaneous change of the concentrations of seven compounds was observed after a time span that decreased exponentially from 700 h at 4°C to 30 h at 26°C ambient temperature. As this most probably represents the onset of highly variable bacterial decomposition, and thus defines the upper limit for a reliable PMI estimation, data were analyzed only up to this start of bacterial decomposition. Thirteen compounds showed unequivocal, reproducible concentration changes during this period, and eight of them increased linearly with a slope unambiguously related to the ambient temperature. Therefore, a single analytical function with PMI and temperature as variables can describe the time courses of the metabolite concentrations. Using the inverse of this function, metabolite concentrations determined from a single MR spectrum can be used, together with the known ambient temperature, to calculate the PMI of a corpse. It is concluded that the effect of ambient temperature can be reliably included in PMI determination by (1)H MRS.
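The inversion described in the last sentences can be illustrated with a toy linear model: if a metabolite concentration grows as c(t) = c0 + m(T)·t, then a single measured concentration plus the known ambient temperature yields the PMI. The functional form of the temperature dependence used here (a Q10-style factor) and all parameter values are assumptions for illustration only, not the study's fitted function:

```python
def estimate_pmi(conc, c0, slope_ref, q10, temp, ref_temp=20.0):
    """Invert a linear metabolite model c(t) = c0 + m(T) * t for the PMI t (hours).

    The temperature dependence of the slope m(T) is modeled with a hypothetical
    Q10-style factor relative to ref_temp; the real analytical function would be
    fitted to the measured decomposition time courses.
    """
    m = slope_ref * q10 ** ((temp - ref_temp) / 10.0)  # slope at ambient temp
    return (conc - c0) / m
```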
Abstract:
In 1996, a cadaver in adipocere condition was discovered in a bay of the Brienzer See in Switzerland. The torso was named "Brienzi", following the example of the "Iceman" Ötzi. Several outer parts of the body were encrusted, and the incrustation was blue in color. Further investigation showed that the bluish covering of parts of the adipocere torso was a mineral known as vivianite. Vivianite (Fe3(PO4)2·8H2O) is an iron phosphate mineral with needle lengths between 100 and 150 μm. It is normally found in association with organic archaeological and geological materials (some hundreds to millions of years old). Hitherto, it has been described in only three cases of human remains. We were able to reconstruct the following facts about "Brienzi": the man drowned in Lake Brienz or in one of its tributaries during the 1700s; the body was subsequently covered with sediment and thus buried under water; an earthquake then produced an underwater landslide which eventually exposed the corpse.
Abstract:
The Estimation of Physiologic Ability and Surgical Stress (E-PASS) score was designed to predict postoperative morbidity and mortality in general surgery. Our study aims to evaluate its use and accuracy in estimating postoperative outcome after elective pancreatic surgery.