897 results for estimation and filtering


Relevance: 80.00%

Abstract:

Understanding the transmission dynamics of infectious diseases is important to allow for improvements in control measures. To investigate the spatiotemporal pattern of a dengue epidemic that occurred in a medium-sized city in the Northeast Region of Brazil in 2009, we conducted an ecological study of the notified dengue cases, georeferenced according to epidemiological week (EW) and home address. Kernel density estimation was applied, and space-time interaction was analysed using the Knox method. The evolution of the epidemic was analysed using an animated projection technique. The dengue incidence was 6,918.7/100,000 inhabitants; the peak of the epidemic occurred from 8 February to 1 March, EWs 6-9 (828.7/100,000 inhabitants). Cases occurred throughout the city, and space-time interaction was identified. Three epicenters were responsible for spreading the disease in an expansion and relocation diffusion pattern. If health services could detect these epicenters in real time and apply control measures nimbly, the magnitude of dengue epidemics could possibly be reduced.
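The Knox method mentioned above counts case pairs that are close in both space and time and compares that count against a permutation null obtained by shuffling the case times. A minimal sketch (function names and thresholds are illustrative, not from the study):

```python
import numpy as np

def knox_statistic(xy, t, d_max, t_max):
    """Count case pairs that are close in both space and time."""
    n = len(t)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.hypot(*(xy[i] - xy[j]))
            if d <= d_max and abs(t[i] - t[j]) <= t_max:
                count += 1
    return count

def knox_test(xy, t, d_max, t_max, n_perm=999, seed=0):
    """Monte-Carlo p-value: permuting times breaks any space-time link."""
    rng = np.random.default_rng(seed)
    obs = knox_statistic(xy, t, d_max, t_max)
    perm = [knox_statistic(xy, rng.permutation(t), d_max, t_max)
            for _ in range(n_perm)]
    p = (1 + sum(s >= obs for s in perm)) / (n_perm + 1)
    return obs, p
```

A small observed count with a low p-value indicates that cases close in space also tend to be close in time, i.e. space-time interaction.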

Relevance: 80.00%

Abstract:

A problem in the archaeometric classification of Catalan Renaissance pottery is the fact that the clay supply of the pottery workshops was centrally organized by guilds, so that usually all potters of a single production centre produced chemically similar ceramics. However, when the glazes of the ware are analysed, a large number of inclusions is usually found in the glaze, which reveal technological differences between individual workshops. These inclusions were used by the potters to opacify the transparent glaze and to achieve a white background for further decoration. In order to distinguish the different technological preparation procedures of the individual workshops, the chemical composition of these inclusions, as well as their size in the two-dimensional cut, is recorded at a scanning electron microscope. Based on the latter, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion. Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different ways of kernel density estimation and Monte Carlo tests of the methodology. Finally, it is tested to what extent the obtained frequency distributions can be used to classify the pottery.
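The forward direction of Wicksell's problem is easy to simulate: a plane cutting a sphere of radius R at a uniform offset z produces a profile circle of radius sqrt(R² − z²), so apparent diameters are systematically smaller than true ones. A minimal sketch (it ignores the size-biased sampling of which spheres a real section plane hits, and the kernel density estimate uses a fixed Gaussian bandwidth; all names are illustrative):

```python
import numpy as np

def apparent_diameters(true_diams, rng):
    """Simulate the 2D profile diameter of spheres cut by a random plane.
    Offset z ~ U(0, R); the profile circle has radius sqrt(R^2 - z^2)."""
    R = np.asarray(true_diams) / 2.0
    z = rng.uniform(0.0, R)
    return 2.0 * np.sqrt(R**2 - z**2)

def kde(samples, grid, bandwidth):
    """Plain Gaussian kernel density estimate of the sampled diameters."""
    u = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(samples) * bandwidth * np.sqrt(2 * np.pi))
```

Comparing the density of simulated apparent diameters with the measured one is the Monte-Carlo side of the methodology; the inverse (unfolding) transform recovers the true 3D size distribution.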

Relevance: 80.00%

Abstract:

Objective. To measure support for seasonal influenza vaccination requirements among US healthcare personnel (HCP) and its association with attitudes toward influenza and influenza vaccination and with self-reported coverage by existing vaccination requirements. Design. Between June 1 and June 30, 2010, we surveyed a sample of US HCP ([Formula: see text]) recruited using an existing probability-based online research panel of participants representing the US general population as a sampling frame. Setting. General community. Participants. Eligible HCP who (1) reported having worked as medical doctors, health technologists, healthcare support staff, or other health practitioners or who (2) reported having worked in hospitals, ambulatory care facilities, long-term care facilities, or other health-related settings. Methods. We analyzed support for seasonal influenza vaccination requirements for HCP using proportion estimation and multivariable probit models. Results. A total of 57.4% (95% confidence interval, 53.3%-61.5%) of US HCP agreed that HCP should be required to be vaccinated for seasonal influenza. Support for mandatory vaccination was statistically significantly higher among HCP who were subject to employer-based influenza vaccination requirements, who considered influenza to be a serious disease, and who agreed that influenza vaccine was safe and effective. Conclusions. A majority of HCP support influenza vaccination requirements. Moreover, providing HCP with information about the safety of influenza vaccination and communicating that immunization of HCP is a patient safety issue may be important for generating staff support for influenza vaccination requirements.
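The proportion-with-confidence-interval form of the results above can be reproduced with a normal-approximation (Wald) interval. The sample size below is hypothetical, since the actual n is elided in the abstract:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% CI for a proportion (normal approximation)."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Hypothetical: 574 of 1000 respondents agree
p, lo, hi = proportion_ci(574, 1000)
```

The study's interval would additionally account for the survey's sampling weights, which this plain formula does not.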

Relevance: 80.00%

Abstract:

In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs, which shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of variational robust estimation and the fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
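The piecewise affine representation models the elevation inside each segmented region as a plane z ≈ a·x + b·y + c. A minimal sketch of the per-region least-squares fit (not the paper's full method, which also handles segmentation and outliers):

```python
import numpy as np

def fit_affine(x, y, z):
    """Least-squares affine elevation model z ~ a*x + b*y + c for one region."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs  # (a, b, c)
```

Roofs and other planar facets are fitted exactly by such a model, which is why it suits man-made landscapes, while vegetation is not.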

Relevance: 80.00%

Abstract:

Analytical results harmonisation is investigated in this study as an alternative to the restrictive approach of analytical methods harmonisation, which is currently recommended to make the exchange of information possible and thereby to support the fight against illicit drug trafficking. The main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in analytical parameters between them. For this purpose, a methodology was developed that makes it possible to estimate, and even optimise, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results coming from different analytical methods can be objectively assessed, and the practical utility of database sharing by these methods can be evaluated, depending on profiling purposes (evidential vs. operational tool). This methodology can be regarded as a relevant approach for feeding a database by different analytical methods, and it puts in doubt the necessity of analysing all illicit drug seizures in a single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
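Assessing whether two analytical methods produce comparable chemical profiles requires a quantitative similarity metric. The abstract does not say which metric the study uses; cosine similarity between normalised peak-area profiles is one common choice in drug profiling, sketched here:

```python
import numpy as np

def profile_similarity(p, q):
    """Cosine similarity between two peak-area profiles (1.0 = identical shape)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(p @ q / (np.linalg.norm(p) * np.linalg.norm(q)))
```

Comparing such scores for profiles of the same seizure measured by GC-MS and by fast GC-FID indicates whether both methods can feed one shared database.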

Relevance: 80.00%

Abstract:

We analyze the effects of a domestic standard that reduces an externality associated with the consumption of the good targeted by the standard, using a model in which foreign and domestic producers compete in the domestic good market. Producers can reduce expected damage associated with the externality by incurring a cost that varies by source of origin. Despite potential protectionism, the standard is useful in correcting the consumption externality in the domestic country. Protectionism occurs when the welfare-maximizing domestic standard is higher than the international standard maximizing welfare inclusive of foreign profits. The standard is actually anti-protectionist when foreign producers are much more efficient at addressing the externality than are domestic producers. Possible exclusion of domestic or foreign producers arises with large standards, which may alter the classification of a standard as protectionist or non-protectionist. The paper provides important implications for the estimation and use of tariff equivalents of nontariff barriers.

Relevance: 80.00%

Abstract:

We consider the application of normal-theory methods to the estimation and testing of a general type of multivariate regression model with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables possibly deviate from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
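The core difficulty with errors-in-variables is easy to show by simulation: measurement error in a regressor attenuates the naive least-squares slope toward zero, which is what a proper errors-in-variables parameterization must correct for. A toy illustration (not the paper's model):

```python
import numpy as np

def ols_slope(x, y):
    """Simple-regression slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / (xc @ xc))

rng = np.random.default_rng(1)
n = 50_000
xi = rng.normal(0.0, 1.0, n)             # latent true regressor, variance 1
y = 2.0 * xi + rng.normal(0.0, 0.5, n)   # true slope is 2
x_obs = xi + rng.normal(0.0, 1.0, n)     # observed with measurement error, variance 1
# Naive OLS on x_obs shrinks toward 2 * var(xi) / (var(xi) + var(error)) = 1
```

Regressing on the latent xi recovers the true slope; regressing on the error-contaminated x_obs roughly halves it.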

Relevance: 80.00%

Abstract:

We investigate identifiability issues in DSGE models and their consequences for parameter estimation and model evaluation when the objective function measures the distance between estimated and model impulse responses. We show that observational equivalence and partial and weak identification problems are widespread; they lead to biased estimates and unreliable t-statistics and may induce investigators to select false models. We examine whether different objective functions affect identification and study how small samples interact with parameter and shock identification. We provide diagnostics and tests to detect identification failures and apply them to a state-of-the-art model.
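Observational equivalence, one of the identification failures discussed above, arises when the objective function depends on the parameters only through some combination of them. A toy impulse-response distance (not the paper's DSGE model) in which only the product theta1*theta2 is identified:

```python
import numpy as np

def irf_distance(theta1, theta2, data_irf):
    """Squared distance between a toy model IRF and an 'estimated' IRF.
    The model response depends only on theta1*theta2, so every pair with
    the same product is observationally equivalent."""
    h = np.arange(len(data_irf))
    model_irf = (theta1 * theta2) ** h  # AR(1)-style decay
    return float(((model_irf - data_irf) ** 2).sum())

data_irf = 0.5 ** np.arange(8)  # hypothetical estimated response
```

The objective is flat along the ridge theta1*theta2 = 0.5, so a minimizer cannot pin down the individual parameters, only their product.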

Relevance: 80.00%

Abstract:

The use of 50 mm (2") mesh in artisanal purse-seine nets of the Tumbes Region was evaluated in a multispecies fishery. We worked with a control net with a mesh size of 38 mm (1.5") and an experimental net of 50 mm (2.0"), with hanging ratios of 0.65 and 0.77, respectively. A difference was found between the sinking-depth curves of the central body of the nets (tc = 46.670, t* = 1.98, p = 0); the experimental net reached a greater sinking depth. There was also a significant difference between the sinking-speed curves of the central body of the nets (tc = 7.790, t* = 1.98, p = 0.000), due to the greater ballast and mesh filtering of the experimental net. The horizontal opening coefficient (μ1) of the meshes in the upper strip, during the maximum sinking of the net and pursing, averaged 0.71, 0.74 and 0.73 in the meshes of the bunt, the central part and the last body of the net, respectively (values close to the ideal hanging coefficient for the escape of certain fusiform fishes). The average μ1 obtained in the region of the central meshes in the bunt, centre and last body of the net was 0.85, 0.85 and 0.84, respectively, which indicated a horizontal mesh opening above the hanging-coefficient value that would not allow fish to escape. It was concluded that the condition of the meshes of the experimental purse-seine net (50 mm mesh size) is not optimal over a large part of the net structure, which would not allow size selectivity.
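The curve comparisons above rest on two-sample t statistics. A Welch t statistic (unequal variances), of the kind used for such comparisons, can be sketched as follows (sample data hypothetical):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

If the computed |t| exceeds the critical value t* (1.98 in the study), the difference between the two curves is declared significant.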

Relevance: 80.00%

Abstract:

When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at a reasonable length. We have decided to concentrate on image and noise models and on the algorithms used to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. Then the prior models used to restore astronomical images are examined. We describe the algorithms used to find the restoration for the most common combinations of degradation and image models. Then we comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.
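One classic algorithm in the family such surveys cover, maximum-likelihood restoration under Poisson noise, is Richardson-Lucy deconvolution. A 1-D sketch (the article discusses many more degradation and image models; this is only an illustration):

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Richardson-Lucy deconvolution: the ML iteration for Poisson noise.
    Assumes a 1-D signal and a short, normalised point-spread function."""
    est = np.full_like(blurred, blurred.mean(), dtype=float)
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        conv = np.convolve(est, psf, mode='same')
        ratio = blurred / np.maximum(conv, 1e-12)
        est = est * np.convolve(ratio, psf_flip, mode='same')
    return est
```

Because each iteration rescales the estimate by data/model ratios convolved with the flipped PSF, the iteration preserves total flux and keeps the estimate non-negative, both desirable for photon-counting data.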

Relevance: 80.00%

Abstract:

Diesel oil is a compound derived from petroleum, consisting primarily of hydrocarbons. Poor conditions in the transportation and storage of this product can contribute significantly to accidental spills, causing serious ecological problems in soil and water and affecting the diversity of the microbial environment. Cloning and sequencing of the 16S rRNA gene is one of the molecular techniques that allow estimation and comparison of microbial diversity in different environmental samples. The aim of this work was to estimate the diversity of microorganisms of the Bacteria domain in a consortium specialized in diesel oil degradation through partial sequencing of the 16S rRNA gene. After extraction of the metagenomic DNA, the material was amplified by PCR using specific oligonucleotide primers for the 16S rRNA gene. The PCR products were cloned into the pGEM-T Easy vector (Promega), and Escherichia coli was used as the host cell for the recombinant DNAs. Partial clone sequences were obtained using universal oligonucleotide primers from the vector. The genetic library obtained generated 431 clones. All the sequenced clones showed similarity to the phylum Proteobacteria, with Gammaproteobacteria the most represented group (49.8% of the clones), followed by Alphaproteobacteria (44.8%) and Betaproteobacteria (5.4%). The genus Pseudomonas was the most abundant in the metagenomic library, followed by Parvibaculum and Sphingobium, respectively. After partial sequencing of the 16S rRNA, the diversity of the bacterial consortium was estimated using the DOTUR software. When these sequences were compared to the database of the National Center for Biotechnology Information (NCBI), a strong correlation was found between the data generated by the software and the data deposited in NCBI.
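DOTUR estimates diversity by clustering sequences into operational taxonomic units and computing indices such as the Shannon index. The index itself, evaluated here on hypothetical clone counts (the per-genus counts are not given in the abstract), is simply:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over per-taxon clone counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical split of a 431-clone library across three genera
h = shannon_index([215, 150, 66])
```

H' is 0 when all clones fall in one taxon and reaches ln(k) when k taxa are equally represented.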

Relevance: 80.00%

Abstract:

A major issue in the application of waveform inversion methods to crosshole georadar data is the accurate estimation of the source wavelet. Here, we explore the viability and robustness of incorporating this step into a time-domain waveform inversion procedure through an iterative deconvolution approach. Our results indicate that, at least in non-dispersive electrical environments, such an approach provides remarkably accurate and robust estimates of the source wavelet even in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity. Our results also indicate that the proposed source wavelet estimation approach is relatively insensitive to ambient noise and to the phase characteristics of the starting wavelet. Finally, there appears to be little-to-no trade-off between the wavelet estimation and the tomographic imaging procedures.
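The deconvolution step at the heart of such an iterative wavelet estimate can be sketched in the frequency domain: divide the spectrum of the observed trace by the modelled impulse response, with a small water level stabilising frequencies where the response is weak. This is only a sketch of the generic operation, not the paper's time-domain procedure:

```python
import numpy as np

def estimate_wavelet(observed, impulse_response, eps=1e-6):
    """One deconvolution step: W = D * conj(G) / (|G|^2 + eps).
    eps is a water level that stabilises frequencies where |G| is small."""
    D = np.fft.rfft(observed)
    G = np.fft.rfft(impulse_response)
    W = D * np.conj(G) / (np.abs(G) ** 2 + eps)
    return np.fft.irfft(W, n=len(observed))
```

In the iterative scheme, the impulse response comes from forward modelling through the current tomographic image, and the wavelet estimate and image are refined alternately.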

Relevance: 80.00%

Abstract:

This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
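The biased empirical HSIC estimator described above is indeed short to compute: with kernel matrices K and L and centring matrix H, it is (1/n²)·tr(KHLH). A minimal sketch with Gaussian kernels on 1-D variables (the bandwidth is fixed here for illustration):

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    """Gaussian kernel matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: (1/n^2) * trace(K H L H)."""
    n = len(x)
    K, L = rbf_kernel(x, sigma), rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return float(np.trace(K @ H @ L @ H)) / n ** 2
```

HSIC is near zero for independent variables and grows with (possibly nonlinear) dependence, which is what makes it useful where linear correlation fails.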

Relevance: 80.00%

Abstract:

Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, which makes the use of classical geostatistics cumbersome.

The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of the predictions.

The resulting maps of average wind speed find applications within renewable resources assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
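The regionalisation idea, learning a nonlinear map from geographic and terrain features to a meteorological variable, can be sketched with a kernel method. The sketch below uses kernel ridge regression as a simple stand-in for the thesis's SVM/SVR machinery, on hypothetical features (x, y, elevation) and a synthetic target:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two feature sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit_predict(X, y, Xq, sigma=1.0, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y, predict at Xq."""
    K = rbf(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return rbf(Xq, X, sigma) @ alpha
```

The regularisation parameter lam plays the role of the complexity control mentioned above: larger values enforce smoother maps and reduce the impact of noisy station measurements.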

Relevance: 80.00%

Abstract:

Events can contribute to the tourism development of a city and benefit its inhabitants and businesses. However, in order to design events correctly, it is necessary to understand which characteristics determine their economic impact. This research aims to contribute to such an understanding by estimating and comparing the economic impact of three events. To estimate the economic impact we adopted a basic three-factor model: (1) number of visitors times (2) average expenditure per tourist times (3) a multiplier. First, we estimated the number of private and professional visitors using various counting systems, personal surveys and information supplied by the organizer of the events. Second, we obtained the amounts and components of visitor expenditure by means of a survey; we also counted the organizational expenditure of the events from their respective budgets. Third, we used input-output table multipliers to analyse the impact of direct expenditure on production, value added and employment, and its distribution across economic sectors. In addition, we calculated and compared profitability ratios of the three events and gave recommendations for increasing their economic impact.
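The three-factor model described above multiplies out directly. With purely hypothetical figures (20,000 visitors, an average spend of 85 per tourist, a multiplier of 1.6), the computation is:

```python
def economic_impact(visitors, avg_spend_per_tourist, multiplier):
    """Basic three-factor model: visitors x average spend x multiplier."""
    return visitors * avg_spend_per_tourist * multiplier

impact = economic_impact(20_000, 85.0, 1.6)  # all inputs hypothetical
```

In the study, each factor is estimated separately (counts and surveys for the first two, input-output tables for the third), so the uncertainty of the product compounds the uncertainty of all three.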