897 results for estimation and filtering


Relevance: 90.00%

Abstract:

This thesis deals mainly with the preparation and study of magnetic composites based on spinel ferrites prepared both chemically and mechanically. Rubber ferrite composites (RFCs) are chosen for their mouldability and flexibility and for the ease with which their dielectric and magnetic properties can be manipulated to make useful devices. Natural rubber (NR) is chosen as the matrix because of its local availability and potential for value addition; moreover, NR represents a typical unsaturated nonpolar matrix. The work comprises two parts. Part 1 concentrates on the preparation and characterization of nanocomposites based on γ-Fe2O3. Part 2 deals with the preparation and characterization of RFCs containing nickel zinc ferrite. In the present study, magnetic nanocomposites have been prepared by the ion-exchange method and the preparation conditions have been optimized. The in situ incorporation of the magnetic component is carried out chemically; this method is selected as the simplest route to the nanocomposite. The nanocomposite samples were studied using VSM, Mössbauer spectroscopy, iron content estimation, and ESR spectroscopy. For the preparation of the RFCs, the filler material, nickel zinc ferrite with the general formula Ni1-xZnxFe2O4, where x varies from 0 to 1 in steps of 0.2, was prepared by conventional ceramic techniques. The Ni1-xZnxFe2O4 system is chosen for its excellent high-frequency characteristics. After characterization, the ferrites are incorporated into the natural rubber matrix by a mechanical method, according to a specific recipe, for various loadings of magnetic filler and for all compositions. The cure characteristics, magnetic properties, and dielectric properties of these composites are evaluated.
The ac electrical conductivity of both the ceramic nickel zinc ferrites and the rubber ferrite composites is also calculated using a simple relation, and the results are correlated.
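The abstract does not state the "simple relation"; a common one for deriving ac conductivity from measured dielectric data, used here purely as an illustrative assumption, is σ_ac = 2πf·ε0·εr·tanδ. A minimal sketch:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ac_conductivity(freq_hz, eps_rel, tan_delta):
    """AC conductivity from dielectric data: sigma_ac = 2*pi*f*eps0*eps_r*tan(delta).
    Assumed relation for illustration; the thesis does not specify its formula."""
    return 2 * math.pi * freq_hz * EPS0 * eps_rel * tan_delta

# Illustrative values only (not measurements from the thesis):
sigma = ac_conductivity(1e6, 12.0, 0.05)  # S/m at 1 MHz
```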

Relevance: 90.00%

Abstract:

The main objective of the present work is to acquire information on the growth responses of P. monodon larvae (from PZ1 up to PL1) to various monospecific and mixed diets, and to evaluate the nutritional quality of selected species of microalgae, viz. Chaetoceros calcitrans, Dunaliella salina, Isochrysis galbana and Nannochloropsis salina, fed to the larvae at three cell concentrations: 10×10⁴ cells/ml, 25×10⁴ cells/ml and 50×10⁴ cells/ml. The P. monodon larvae were transported to the laboratory at the nauplius stage and stocked at a density of 150 larvae per litre in 5-litre FRP tanks containing 3 litres of sea water. The larval stages were fed with increasing densities of algae to evaluate the relationship between food density, ingestion rate, and larval development and growth. The water quality parameters, percentage survival, growth estimation and algal cell counts were recorded. Each experiment was carried out in triplicate with a control group of larvae fed Chaetoceros calcitrans. Standard procedures were used for the estimations.

Relevance: 90.00%

Abstract:

A revolution in earthmoving, a $100 billion industry, can be achieved with three components: the GPS location system, sensors and computers in bulldozers, and SITE CONTROLLER, a central computer system that maintains design data and directs operations. The first two components are widely available; I built SITE CONTROLLER to complete the triangle and describe it here. SITE CONTROLLER assists civil engineers in the design, estimation, and construction of earthworks, including hazardous waste site remediation. The core of SITE CONTROLLER is a site modelling system that represents existing and prospective terrain shapes, roads, hydrology, etc. Around this core are analysis, simulation, and vehicle control tools. Integrating these modules into one program enables civil engineers and contractors to use a single interface and database throughout the life of a project.

Relevance: 90.00%

Abstract:

In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, as well as hypotheses of subcompositional stability. We identify the two problems as the counterparts, in the jargon of standard multivariate analysis, of paired comparison or split-plot experiments and of separate-sample comparative experiments. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
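The simplicial operations the paper builds on, perturbation and subcomposition, have simple closed forms. A minimal sketch of this standard compositional-data machinery (not the paper's testing procedures); the closure operation rescales a vector of positive parts to unit sum:

```python
def closure(x):
    """Rescale positive parts to sum to one (projection onto the simplex)."""
    s = sum(x)
    return [xi / s for xi in x]

def perturb(x, p):
    """Simplicial perturbation: componentwise product followed by closure."""
    return closure([xi * pi for xi, pi in zip(x, p)])

def subcomposition(x, idx):
    """Subcomposition: select the named parts, then re-close."""
    return closure([x[i] for i in idx])
```

Perturbing by a constant vector is the identity, and a subcomposition is again a composition (sums to one), which is why subcompositions replace marginal subvectors in this setting.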

Relevance: 90.00%

Abstract:

This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and jump telegraph processes. The study in this first part includes the computation of the distribution of each process, the means and variances, and the moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump telegraph processes. This part describes how to compute risk-neutral measures, establishes the no-arbitrage condition for this class of models and, finally, computes the prices of European call and put options.
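A telegraph process alternates between two velocities at the epochs of a Poisson process. A minimal simulation sketch of the integrated process (illustrative only; the rates and velocities in the example are hypothetical, and jumps are omitted):

```python
import random

def simulate_telegraph(T, lam, c0=1.0, c1=-1.0, seed=0):
    """Simulate an integrated telegraph process on [0, T]: the velocity flips
    between c0 and c1 at exponential(lam) epochs. Returns the terminal position."""
    rng = random.Random(seed)
    t, x, state = 0.0, 0.0, 0
    speeds = (c0, c1)
    while t < T:
        tau = rng.expovariate(lam)   # waiting time until the next velocity flip
        dt = min(tau, T - t)         # do not integrate past the horizon
        x += speeds[state] * dt
        t += dt
        state = 1 - state
    return x
```

Because the speed is bounded by max(|c0|, |c1|), the position at time T is confined to a finite interval, one of the properties that distinguishes telegraph models from diffusions.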

Relevance: 90.00%

Abstract:

This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
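ΔCoVaR is built from CoVaR, the value-at-risk of the system conditional on an institution's state. A rough empirical sketch using plain conditional sample quantiles, rather than the quantile-regression estimation and the testing machinery the paper develops (the function names and the 40-60% "median state" band are assumptions made for illustration):

```python
def quantile(xs, q):
    """Crude empirical q-quantile (order statistic at floor(q*n))."""
    xs = sorted(xs)
    idx = max(0, min(len(xs) - 1, int(q * len(xs))))
    return xs[idx]

def delta_covar(system, inst, q=0.05):
    """Empirical DeltaCoVaR sketch: system VaR conditional on the institution
    being in distress (returns below its q-quantile) minus system VaR
    conditional on the institution being near its median state."""
    var_inst = quantile(inst, q)
    med_lo, med_hi = quantile(inst, 0.4), quantile(inst, 0.6)
    distress = [s for s, i in zip(system, inst) if i <= var_inst]
    normal = [s for s, i in zip(system, inst) if med_lo <= i <= med_hi]
    return quantile(distress, q) - quantile(normal, q)
```

When system and institution returns move together, distress in the institution drags the conditional system quantile down, so ΔCoVaR is negative and its magnitude ranks systemic importance.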

Relevance: 90.00%

Abstract:

In this paper we use the most representative models in the literature on the term structure of interest rates. In particular, we explore affine one-factor models and polynomial-type approximations such as Nelson and Siegel's. Our empirical application considers monthly data from the USA and Colombia for estimation and forecasting. We find that affine models do not provide adequate performance either in-sample or out-of-sample. By contrast, parsimonious models such as Nelson and Siegel's perform adequately in-sample; out-of-sample, however, they are not able to systematically improve upon the random walk baseline forecast.
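The Nelson and Siegel curve has a closed form with level, slope and curvature loadings. A minimal sketch (the parameter values in the example are illustrative, not estimates from the paper):

```python
import math

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau:
    y(tau) = b0 + b1 * (1 - exp(-tau/lam)) / (tau/lam)
                + b2 * ((1 - exp(-tau/lam)) / (tau/lam) - exp(-tau/lam)).
    beta0: level (long end), beta1: slope, beta2: curvature, lam: decay scale."""
    x = tau / lam
    loading = (1.0 - math.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))
```

The parsimony shows in the limits: as tau grows the yield tends to beta0, and at the short end it tends to beta0 + beta1, so three betas pin down the whole curve shape.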

Relevance: 90.00%

Abstract:

We examine the empirical validity of Schelling's models of racial residential segregation applied to the case of Chicago. Most of the empirical literature has focused exclusively on the single-neighborhood model, also known as the tipping point model, and neglected a multi-neighborhood or unified approach. The multi-neighborhood approach introduces spatial interaction across neighborhoods; in particular, we look at spatial interaction across neighborhoods sharing a border. An initial exploration of the data indicates that spatial contiguity might be relevant to properly analyse the so-called tipping phenomenon, in which predominantly non-Hispanic white neighborhoods become predominantly minority neighborhoods within a decade. We introduce an econometric model that combines a threshold-effects approach to estimating the tipping point with a spatial autoregressive model. The estimation results from the model dispute the existence of a tipping point, that is, a discontinuous change in the rate of growth of the non-Hispanic white population due to a small increase in the minority share of the neighborhood. In addition, we find that the racial distance between the neighborhood of interest and its surrounding neighborhoods has an important effect on the dynamics of racial segregation in Chicago.
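A threshold-effects estimate of a candidate tipping point can be sketched as a grid search over split points, picking the minority share at which splitting the growth rates into two regimes best fits the data. This is a toy version; the paper's econometric model additionally includes spatial autoregressive terms and formal tests:

```python
def fit_threshold(minority_share, growth):
    """Grid-search threshold sketch: for each candidate split of the minority
    share, compute the squared error around the two segment means and return
    the split that minimizes the total."""
    def sse(ys):
        if not ys:
            return 0.0
        m = sum(ys) / len(ys)
        return sum((y - m) ** 2 for y in ys)
    best_c, best = None, float("inf")
    for c in sorted(set(minority_share))[1:-1]:  # interior candidates only
        lo = [g for s, g in zip(minority_share, growth) if s <= c]
        hi = [g for s, g in zip(minority_share, growth) if s > c]
        total = sse(lo) + sse(hi)
        if total < best:
            best_c, best = c, total
    return best_c
```

A genuine tipping point shows up as a split that sharply reduces the fitted error; if no candidate does markedly better than its neighbors, the discontinuity is in doubt, which is the kind of evidence the paper reports.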

Relevance: 90.00%

Abstract:

We study the role of natural resource windfalls in explaining the efficiency of public expenditures. Using a rich dataset of expenditures and public good provision for 1,836 municipalities in Peru for the period 2001-2010, we estimate a non-monotonic relationship between the efficiency of public good provision and the level of natural resource transfers. Local governments that were extremely favored by the boom in mineral prices were more efficient in using fiscal windfalls, whereas those that benefited from modest transfers were less efficient. These results can be explained by the increase in political competition associated with the boom. However, the fact that increases in efficiency were related to reductions in public good provision casts doubt on the beneficial effects of political competition in promoting efficiency.

Relevance: 90.00%

Abstract:

The structure of turbulent flow over large roughness consisting of regular arrays of cubical obstacles is investigated numerically under constant pressure gradient conditions. Results are analysed in terms of first- and second-order statistics, by visualization of instantaneous flow fields and by conditional averaging. The accuracy of the simulations is established by detailed comparisons of first- and second-order statistics with wind-tunnel measurements. Coherent structures in the log region are investigated. Structure angles are computed from two-point correlations, and quadrant analysis is performed to determine the relative importance of Q2 and Q4 events (ejections and sweeps) as a function of height above the roughness. Flow visualization shows the existence of low-momentum regions (LMRs) as well as vortical structures throughout the log layer. Filtering techniques are used to reveal instantaneous examples of the association of the vortices with the LMRs, and linear stochastic estimation and conditional averaging are employed to deduce their statistical properties. The conditional averaging results reveal the presence of LMRs and regions of Q2 and Q4 events that appear to be associated with hairpin-like vortices, but a quantitative correspondence between the sizes of the vortices and those of the LMRs is difficult to establish; a simple estimate of the ratio of the vortex width to the LMR width gives a value that is several times larger than the corresponding ratio over smooth walls. The shape and inclination of the vortices and their spatial organization are compared to recent findings over smooth walls. Characteristic length scales are shown to scale linearly with height in the log region. 
Whilst there are striking qualitative similarities with smooth walls, there are also important differences in detail regarding: (i) structure angles and sizes and their dependence on distance from the rough surface; (ii) the flow structure close to the roughness; (iii) the roles of inflows into and outflows from cavities within the roughness; (iv) larger vortices on the rough wall compared to the smooth wall; (v) the effect of the different generation mechanism at the wall in setting the scales of structures.
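Quadrant analysis as described above classifies velocity-fluctuation pairs (u', w') by sign: Q2 events (u' < 0, w' > 0) are ejections and Q4 events (u' > 0, w' < 0) are sweeps. A minimal sketch, omitting the hole-size threshold that quadrant analyses often add:

```python
def quadrant_fractions(u, w):
    """Classify fluctuation pairs (u - mean(u), w - mean(w)) into the four
    quadrants and return the fraction of samples in each. Q2 = ejections,
    Q4 = sweeps."""
    ub = sum(u) / len(u)
    wb = sum(w) / len(w)
    counts = {1: 0, 2: 0, 3: 0, 4: 0}
    for ui, wi in zip(u, w):
        du, dw = ui - ub, wi - wb
        if du >= 0 and dw >= 0:
            counts[1] += 1
        elif du < 0 and dw >= 0:
            counts[2] += 1
        elif du < 0 and dw < 0:
            counts[3] += 1
        else:
            counts[4] += 1
    n = len(u)
    return {q: c / n for q, c in counts.items()}
```

Comparing the Q2 and Q4 fractions (or, in a fuller treatment, their contributions to the Reynolds shear stress) at each height gives the relative importance of ejections and sweeps as a function of distance above the roughness.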

Relevance: 90.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage, sampling with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and the analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
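Accumulating the estimated variance components from the shortest lag upward yields the rough variogram described above. A schematic sketch, assuming one estimated component per sampling stage (the REML and ANOVA estimation of the components themselves is omitted):

```python
def rough_variogram(components, lags):
    """Build a rough variogram from hierarchical variance components: the
    semivariance at a given lag is the running sum of the components for all
    stages with lags up to and including it."""
    order = sorted(range(len(lags)), key=lambda i: lags[i])
    semivariances = {}
    running = 0.0
    for i in order:
        running += components[i]
        semivariances[lags[i]] = running
    return semivariances
```

With separating distances in geometric progression, a handful of stages spans several orders of magnitude of lag, which is why the nested design delivers a usable variogram for modest sampling effort.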

Relevance: 90.00%

Abstract:

The technical comments by Getz and Lloyd-Smith, Ross, and Doncaster focus on specific aspects of our analysis and estimation and do not demonstrate any results opposing our key conclusion: that, contrary to what was previously believed, the relation between a population's growth rate (pgr) and its density is generally concave.

Relevance: 90.00%

Abstract:

Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the hemodynamic response function (HRF) and to locate activated regions in the human brain when specific tasks are performed. This paper develops new methodologies that are important improvements not only to parametric but also to nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
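The basic setup behind HRF estimation convolves the stimulus train with an HRF and fits the response amplitude by least squares. A minimal sketch using a simplified gamma-shaped HRF (an illustrative stand-in, not the paper's parametric or nonparametric estimators and not the SPM canonical double-gamma):

```python
import math

def hrf(t):
    """Simplified gamma-shaped hemodynamic response, t^5 * exp(-t) / 5!,
    peaking around 5 s. Illustrative form only."""
    return (t ** 5) * math.exp(-t) / 120.0 if t > 0 else 0.0

def convolve(stimulus, dt=1.0):
    """Predicted BOLD regressor: discrete convolution of a 0/1 stimulus
    train with the HRF, sampled every dt seconds."""
    n = len(stimulus)
    return [sum(stimulus[j] * hrf((i - j) * dt) for j in range(i + 1))
            for i in range(n)]

def fit_amplitude(y, x):
    """Least-squares response amplitude of regressor x for data y
    (single regressor, no intercept, for brevity)."""
    num = sum(a * b for a, b in zip(y, x))
    den = sum(a * a for a in x)
    return num / den
```

In practice the least-squares step is generalized to account for the error covariance structure, which is exactly where the paper's fast covariance-matrix estimation scheme enters.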

Relevance: 90.00%

Abstract:

A sparse kernel density estimator is derived based on the zero-norm constraint, in which the zero-norm of the kernel weights is incorporated to enhance model sparsity. The classical Parzen window estimate is adopted as the desired response for density estimation, and an approximate function of the zero-norm is used to achieve mathematical tractability and algorithmic efficiency. Under the mild condition of a positive definite design matrix, the kernel weights of the proposed density estimator based on the zero-norm approximation can be obtained using the multiplicative nonnegative quadratic programming algorithm. Using the D-optimality-based selection algorithm as a preprocessing step to select a small significant subset of the design matrix, the proposed zero-norm based approach offers an effective means of constructing very sparse kernel density estimates with excellent generalisation performance.
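The Parzen window estimate used as the desired response, and the weighted sparse form that the zero-norm approach reduces it to, can be sketched as follows (Gaussian kernels; the zero-norm optimization and the quadratic programming step are omitted):

```python
import math

def parzen_density(x, data, h):
    """Classical Parzen window estimate: equal weight 1/n on a Gaussian
    kernel of width h at every data point."""
    norm = 1.0 / (len(data) * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data)

def sparse_density(x, centres, weights, h):
    """Sparse kernel density: nonnegative weights, summing to one, on a
    reduced set of kernel centres -- the structure the zero-norm approach
    (with its nonnegative quadratic programming step) enforces."""
    k = 1.0 / (h * math.sqrt(2.0 * math.pi))
    return k * sum(w * math.exp(-0.5 * ((x - c) / h) ** 2)
                   for w, c in zip(weights, centres))
```

The sparse estimator matches the Parzen estimate as its fitting target while driving most weights to zero, so evaluation cost scales with the few retained centres rather than the full sample.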