920 results for Estimation Of Distribution Algorithms
Abstract:
Schroeder's backward integration method is the most widely used method for extracting the decay curve of an acoustic impulse response and for calculating the reverberation time from this curve. The limits and possible improvements of this method are widely discussed in the literature. In this work a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox. Its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30. As the evaluation range decreases, increasing differences emerge; in particular, the main differences are in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new method for energy decay curve extraction is its independence of the background noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres. The aim of this evaluation is to determine whether a subset of measurements can be considered representative for a complete characterization of these opera houses.
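For reference, a minimal sketch of Schroeder backward integration and of the line fit used to derive EDT and T30 from the resulting decay curve is given below. This is a generic Python illustration, not the Matlab toolbox or the new method described above; the impulse response, sampling rate, and synthetic T60 are assumptions.

```python
import numpy as np

def schroeder_decay_db(h):
    """Schroeder backward integration: accumulate energy from the tail
    of the impulse response h and express the decay curve in dB."""
    energy = np.cumsum(h[::-1] ** 2)[::-1]           # reverse cumulative energy
    energy /= energy[0]                              # normalise to 0 dB at t = 0
    return 10.0 * np.log10(energy + np.finfo(float).eps)

def decay_time(decay_db, fs, start_db, end_db):
    """Fit a line between two levels of the decay curve and extrapolate
    to -60 dB (0/-10 dB for EDT, -5/-35 dB for T30)."""
    idx = np.where((decay_db <= start_db) & (decay_db >= end_db))[0]
    slope, _ = np.polyfit(idx / fs, decay_db[idx], 1)
    return -60.0 / slope                             # seconds to decay by 60 dB

# Example with a synthetic exponentially decaying noise burst (T60 = 1.2 s assumed)
fs = 16000
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(1)
h = rng.standard_normal(t.size) * 10 ** (-3 * t / 1.2)   # -60 dB over 1.2 s
L = schroeder_decay_db(h)
print("EDT ≈", decay_time(L, fs, 0.0, -10.0), "s")
print("T30 ≈", decay_time(L, fs, -5.0, -35.0), "s")
```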
Abstract:
Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation and financial markets. The empirical failure of New Keynesian models is partially due to the Rational Expectations (RE) paradigm, which entails a tight structure on the dynamics of the system. Under this hypothesis, the agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the "Quasi-Rational" Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the length of the reduced form is the same as in the 'best' statistical model.
Abstract:
We consider systems of finitely many particles, where the particles move independently of one another according to one-dimensional diffusions $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$. The particles die at position-dependent rates and leave behind a random number of offspring, which are distributed in space according to a transition kernel. In addition, new particles immigrate at a constant rate. A process with these properties is called a branching process with immigration. When we observe such a process at discrete points in time, it is not immediately obvious which discretely observed points belong to which path. We therefore develop an algorithm to reconstruct the underlying paths. Using this algorithm, we construct a nonparametric estimator for the squared diffusion coefficient $\sigma^2(\cdot)$, whose construction essentially relies on filling a classical regression scheme. We prove consistency and a central limit theorem.
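As a rough illustration of the kind of regression scheme mentioned above (not the path-reconstruction algorithm of the thesis), the squared diffusion coefficient of a single discretely observed path can be estimated by kernel regression of rescaled squared increments on the observed positions. The Ornstein-Uhlenbeck-type example, bandwidth, and step size below are assumptions.

```python
import numpy as np

def sigma_squared_hat(x_obs, dt, grid, bandwidth):
    """Nadaraya-Watson estimate of sigma^2(x) from one discretely observed
    diffusion path: regress (X_{i+1} - X_i)^2 / dt on X_i."""
    x = x_obs[:-1]
    y = np.diff(x_obs) ** 2 / dt                    # rescaled squared increments
    est = np.empty_like(grid)
    for j, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / bandwidth) ** 2)   # Gaussian kernel weights
        est[j] = np.sum(w * y) / np.sum(w)
    return est

# Example: Euler scheme for dX = -X dt + 0.5 dW (constant sigma = 0.5 assumed)
rng = np.random.default_rng(0)
dt, n = 0.01, 20000
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()
grid = np.linspace(-1.0, 1.0, 5)
print(sigma_squared_hat(x, dt, grid, bandwidth=0.2))   # values should be near 0.25
```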
Abstract:
The problem of localizing a scatterer, which represents a tumor, in a homogeneous circular domain, which represents a breast, is addressed. A breast imaging method based on microwaves is considered. Microwave imaging involves several techniques for detecting, localizing and characterizing tumors in breast tissues. All such methods involve an electromagnetic inverse scattering problem. For scatterer detection, an algorithm based on a linear solution procedure, inspired by the MUltiple SIgnal Classification (MUSIC) algorithm and the Time Reversal (TR) method, is implemented. The algorithm returns a reconstructed image of the investigation domain, called a pseudospectrum, in which the scatterer position is detected. A preliminary performance analysis of the algorithm with varying working frequency is performed: the resolution and the signal-to-noise ratio of the pseudospectra are improved if a multi-frequency approach is considered. The Geometrical Mean MUSIC algorithm (GM-MUSIC) is proposed as a multi-frequency method. The performance of GM-MUSIC is tested in different realistic computer simulations. The analysis shows that the algorithm detects the scatterer as long as the electrical parameters of the breast are known. This is an evident limitation, since, in a real-life situation, the anatomy of the breast is unknown. An improvement of GM-MUSIC is proposed: the Eye-GMMUSIC algorithm, which needs no a priori information on the electrical parameters of the breast. It is an optimization algorithm based on pattern search: it searches for the breast parameters that minimize the Signal-to-Clutter Mean Ratio (SCMR) in the signal. Finally, the GM-MUSIC and Eye-GMMUSIC algorithms are tested on a microwave breast cancer detection system consisting of a dipole antenna, a Vector Network Analyzer and a novel breast phantom built at the University of Bologna. The reconstruction of the experimental data confirms the ability of GM-MUSIC to localize a scatterer in a homogeneous medium.
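For orientation, a minimal single-frequency MUSIC pseudospectrum sketch on a simulated multistatic matrix is shown below. The Born-approximation data model, the antenna geometry, and the steering vector are illustrative assumptions; this is not the GM-MUSIC or Eye-GMMUSIC formulation described above.

```python
import numpy as np

def music_pseudospectrum(K, antenna_pos, grid_pts, k0, n_sources=1):
    """Single-frequency MUSIC: project the steering vector of each test point
    onto the noise subspace of the multistatic data matrix K; the resulting
    pseudospectrum peaks at the scatterer position."""
    U, _, _ = np.linalg.svd(K)
    noise = U[:, n_sources:]                         # noise-subspace basis
    pseudo = np.empty(len(grid_pts))
    for i, r in enumerate(grid_pts):
        d = np.linalg.norm(antenna_pos - r, axis=1)
        g = np.exp(1j * k0 * d) / np.sqrt(d)         # free-space-like steering vector
        g /= np.linalg.norm(g)
        pseudo[i] = 1.0 / (np.linalg.norm(noise.conj().T @ g) ** 2 + 1e-12)
    return pseudo

# Toy usage: 8 antennas on a 10 cm circle, one point scatterer (positions assumed)
k0 = 2 * np.pi * 2e9 / 3e8                           # wavenumber at 2 GHz
ang = np.linspace(0, 2 * np.pi, 8, endpoint=False)
ant = 0.1 * np.column_stack([np.cos(ang), np.sin(ang)])
scat = np.array([0.02, 0.01])
d = np.linalg.norm(ant - scat, axis=1)
g_true = np.exp(1j * k0 * d) / np.sqrt(d)
K = np.outer(g_true, g_true)                         # Born-approximation multistatic matrix
grid = [np.array([x, y]) for x in np.linspace(-0.05, 0.05, 21)
                         for y in np.linspace(-0.05, 0.05, 21)]
P = music_pseudospectrum(K, ant, grid, k0)
print(grid[int(np.argmax(P))])                       # ~ the scatterer position
```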
Abstract:
Agricultural workers are exposed to various risks, including chemical agents, noise, and many other factors. One of the most characteristic and least known risk factors is the microclimatic conditions in the different phases of work (in the field, in the greenhouse, etc.). A typical condition is thermal stress due to high temperatures during harvesting operations in open fields or in greenhouses. In Italy, harvesting is carried out for many hours during the day, mainly in the summer, with temperatures often higher than 30 °C. According to ISO 7243, these conditions can be considered dangerous for workers' health. The aim of this study is to assess the risks of exposure to microclimatic conditions (heat) for fruit and vegetable harvesters in central Italy by applying methods established by international standards. In order to estimate the risk for workers, the air temperature, radiative temperature, and air speed were measured using instruments in conformity with ISO 7726. Thermodynamic parameters and two further subjective parameters, clothing and the metabolic heat production rate related to the worker's physical activity, were used to calculate the predicted heat strain (PHS) for the exposed workers in conformity with ISO 7933. Environmental and subjective parameters were also measured for greenhouse workers, according to ISO 7243, in order to calculate the wet-bulb globe temperature (WBGT). The results show a slight risk for workers during manual harvesting in the field. On the other hand, the data collected in the greenhouses show that the risk for workers must not be underestimated. The results of the study show that, for manual harvesting work in climates similar to that of central Italy, it is essential to provide plenty of drinking water and acclimatization for the workers in order to reduce health risks. Moreover, the study emphasizes that the possible health risks for greenhouse workers increase from April through July.
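For reference, the WBGT index mentioned above combines the natural wet-bulb, globe, and air temperatures with fixed weights. The sketch below uses the standard ISO 7243 weighting; the example readings are illustrative and are not data from this study.

```python
def wbgt(t_nw, t_g, t_a=None, solar_load=False):
    """Wet-bulb globe temperature with the ISO 7243 weighting.
    t_nw: natural wet-bulb temperature (°C), t_g: globe temperature (°C),
    t_a: air temperature (°C), needed only outdoors with solar load."""
    if solar_load:
        return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a    # outdoors, sunlit
    return 0.7 * t_nw + 0.3 * t_g                    # indoors / no solar load

# Example: hypothetical greenhouse readings (not measured values from the study)
print(wbgt(t_nw=27.0, t_g=35.0))                     # -> 29.4 °C
```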
Abstract:
PURPOSE: The purpose of this retrospective study was to examine the reliability of virtually estimating abdominal blood volume using segmentation of postmortem computed tomography (PMCT) data. MATERIALS AND METHODS: Twenty-one cases with free abdominal blood were investigated by PMCT and autopsy. The volume of the blood was estimated using a manual segmentation technique (Amira, Visage Imaging, Germany) and the results were compared to autopsy data. Six of the 21 cases had undergone additional postmortem computed tomographic angiography (PMCTA). RESULTS: The virtually estimated abdominal blood volumes did not differ significantly from those measured at autopsy. Additional PMCTA did not significantly bias the data. CONCLUSION: Virtual estimation of abdominal blood volume is a reliable technique. Virtual blood volume estimation is a useful tool for delivering additional information in cases where autopsy is not performed or where postmortem angiography is performed.
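Segmentation-based volume estimation of this kind amounts to counting labelled voxels and multiplying by the voxel volume. The sketch below is generic (not the Amira workflow used in the study); the mask and voxel spacing are placeholders.

```python
import numpy as np

def segmented_volume_ml(mask, voxel_spacing_mm):
    """Volume of a binary segmentation mask in millilitres:
    number of labelled voxels times the volume of one voxel."""
    voxel_mm3 = float(np.prod(voxel_spacing_mm))     # mm^3 per voxel
    return mask.sum() * voxel_mm3 / 1000.0           # 1 mL = 1000 mm^3

# Illustrative example: a synthetic mask with assumed 1.0 x 0.7 x 0.7 mm voxels
mask = np.zeros((300, 512, 512), dtype=bool)
mask[100:150, 200:300, 200:300] = True
print(segmented_volume_ml(mask, (1.0, 0.7, 0.7)), "mL")   # -> 245.0 mL
```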
Abstract:
Standard methods for the estimation of the postmortem interval (PMI, time since death), based on the cooling of the corpse, are limited to about 48 h after death. As an alternative, noninvasive postmortem observation of alterations of brain metabolites by means of ¹H MRS has been suggested for an estimation of the PMI at room temperature, so far without including the effect of other ambient temperatures. In order to study the temperature effect, localized ¹H MRS was used to follow brain decomposition in a sheep brain model at four different temperatures between 4 and 26°C, with repeated measurements up to 2100 h postmortem. The simultaneous determination of 25 different biochemical compounds at each measurement allowed the time courses of concentration changes to be followed. A sudden and almost simultaneous change of the concentrations of seven compounds was observed after a time span that decreased exponentially from 700 h at 4°C to 30 h at 26°C ambient temperature. As this most probably represents the onset of highly variable bacterial decomposition, and thus defines the upper limit for a reliable PMI estimation, data were analyzed only up to this start of bacterial decomposition. Thirteen compounds showed unequivocal, reproducible concentration changes during this period, while eight showed a linear increase with a slope that was unambiguously related to ambient temperature. Therefore, a single analytical function with PMI and temperature as variables can describe the time courses of metabolite concentrations. Using the inverse of this function, metabolite concentrations determined from a single MR spectrum can be used, together with the known ambient temperature, to calculate the PMI of a corpse. It is concluded that the effect of ambient temperature can be reliably included in the PMI determination by ¹H MRS.
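Using only the two onset times quoted above (700 h at 4°C and 30 h at 26°C) and assuming an exponential model t_onset(T) = a·exp(-b·T), which is an illustrative choice rather than the published fit, the temperature-dependent upper limit for a reliable PMI estimate can be sketched as follows.

```python
import numpy as np

# Two onset times reported in the abstract: 700 h at 4 °C and 30 h at 26 °C.
# Assumed model (illustrative, not the paper's fit): t_onset(T) = a * exp(-b * T).
T1, t1 = 4.0, 700.0
T2, t2 = 26.0, 30.0
b = np.log(t1 / t2) / (T2 - T1)           # ≈ 0.143 per °C
a = t1 * np.exp(b * T1)                   # ≈ 1240 h

def onset_hours(temp_c):
    """Estimated upper limit (h) for reliable PMI estimation at temp_c (°C)."""
    return a * np.exp(-b * temp_c)

print(round(onset_hours(15.0)))           # roughly 145 h at 15 °C under this model
```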
Abstract:
In 1996, a cadaver in adipocere condition was discovered in a bay of the Brienzer See in Switzerland. The torso was named "Brienzi", following the "Iceman" Ötzi. Several outer parts of the body were encrusted; the incrustation was blue in color. Further investigations showed that the bluish covering on parts of the adipocere torso was a mineral known as vivianite. Vivianite (Fe₃(PO₄)₂·8H₂O) is an iron phosphate mineral with needle lengths between 100 and 150 μm. It is normally found in association with organic archaeological and geological materials (some hundreds to millions of years old). Hitherto, it has been described in only three cases of human remains. We were able to reconstruct the following facts about "Brienzi": the man drowned in Lake Brienz or in one of its tributaries during the 1700s. The body was subsequently covered with sediment and thus buried under water. An earthquake produced an underwater landslide which eventually exposed the corpse.
Abstract:
The Estimation of Physiologic Ability and Surgical Stress score was designed to predict postoperative morbidity and mortality in general surgery. Our study aims to evaluate its use and accuracy in estimating postoperative outcome after elective pancreatic surgery.
Abstract:
Full axon counting of optic nerve cross-sections represents the most accurate method to quantify axonal damage, but such analysis is very labour-intensive. Recently, a new method has been developed, termed targeted sampling, which combines the salient features of a grading scheme with axon counting. Preliminary findings revealed that the method compared favourably with random sampling. The aim of the current study was to advance our understanding of the effect of sampling patterns on axon counts by comparing estimated axon counts from targeted sampling with those obtained from fixed-pattern sampling in a large collection of optic nerves with different severities of axonal injury.
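Any sampling-based estimate of this kind ultimately scales the mean axon density of the counted fields to the whole nerve cross-section. The sketch below is generic, with placeholder numbers; it is not the targeted-sampling or fixed-pattern scheme itself.

```python
def estimate_total_axons(field_counts, field_area_um2, nerve_area_um2):
    """Extrapolate a full axon count from sampled fields:
    mean axon density of the sampled fields times the total nerve area."""
    density = sum(field_counts) / (len(field_counts) * field_area_um2)
    return density * nerve_area_um2

# Placeholder numbers for illustration only
counts = [112, 98, 130, 121, 104]         # axons counted in five sampled fields
print(round(estimate_total_axons(counts, field_area_um2=2500.0,
                                 nerve_area_um2=8.0e5)))
```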