954 results for Robust estimation


Relevance: 30.00%

Abstract:

We propose a robust adaptive time synchronization and frequency offset estimation method for coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems that applies electrical dispersion pre-compensation (pre-EDC) to the pilot symbol. This technique effectively eliminates the timing error caused by fiber chromatic dispersion, significantly increasing the accuracy of the frequency offset estimation process and improving overall system performance. In addition, a simple pilot symbol design is proposed for full-range frequency offset estimation. This pilot symbol can also carry useful data, reducing the overhead due to time synchronization by a factor of 2.
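As a rough illustration of the pre-EDC idea, the sketch below applies a chromatic-dispersion pre-compensation phase to an OFDM pilot symbol in the frequency domain. The transfer-function sign convention, parameter values, and function names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def pre_edc_pilot(pilot_freq, fs, dispersion_ps_nm_km, length_km, wavelength_nm=1550.0):
    """Apply electrical chromatic-dispersion pre-compensation to an OFDM pilot
    given in the frequency domain (one complex value per subcarrier).

    Simplified sketch: the all-pass fiber CD response with quadratic phase
    pi * lambda^2 * D * L * f^2 / c is inverted and applied to the pilot
    spectrum; the sign convention is an assumption."""
    c = 299_792_458.0                          # speed of light, m/s
    D = dispersion_ps_nm_km * 1e-6             # ps/(nm*km) -> s/m^2
    L = length_km * 1e3                        # km -> m
    lam = wavelength_nm * 1e-9                 # nm -> m

    f = np.fft.fftfreq(len(pilot_freq), d=1.0 / fs)   # baseband subcarrier frequencies, Hz
    phase = np.pi * lam**2 * D * L * f**2 / c         # accumulated CD phase per subcarrier
    return pilot_freq * np.exp(-1j * phase)           # pre-compensate (inverse of the fiber)

# Example: pre-compensate a 256-subcarrier QPSK pilot for 1000 km of standard fiber.
rng = np.random.default_rng(0)
pilot = np.exp(1j * np.pi / 2 * rng.integers(0, 4, 256))
tx_pilot = np.fft.ifft(pre_edc_pilot(pilot, fs=10e9,
                                     dispersion_ps_nm_km=17.0, length_km=1000.0))
```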

Relevance: 30.00%

Abstract:

2000 Mathematics Subject Classification: 62J05, 62J10, 62F35, 62H12, 62P30.

Relevance: 30.00%

Abstract:

2010 Mathematics Subject Classification: 60J80.

Relevance: 30.00%

Abstract:

2010 Mathematics Subject Classification: 62F10, 62F12.

Relevance: 30.00%

Abstract:

Technology changes rapidly over the years, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology pushes old Information and Communication Technology (ICT) products out of use. E-waste is defined here as the quantity of ICT products no longer in use; it is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend, (v) Level, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, and for each country the model yielding the smallest in-sample error indices (Mean Absolute Error and Mean Squared Error) was selected. Because new technology does not diffuse across all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The forecast horizon is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, driven by decreasing computer lifespans and increasing sales.
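To make the model-selection step concrete, the sketch below fits a subset of the candidate curves (Logistic, Gompertz, and a linear Trend) to a toy cumulative-sales series and keeps the one with the smallest in-sample error; the function names, starting values, and data are illustrative assumptions rather than the paper's full model pool.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, m, k, t0):
    return m / (1.0 + np.exp(-k * (t - t0)))

def gompertz(t, m, b, c):
    return m * np.exp(-b * np.exp(-c * t))

def trend(t, a, b):
    return a + b * t

def fit_and_score(models, t, sales):
    """Fit each candidate model and return its in-sample MAE and MSE."""
    scores = {}
    for name, (f, p0) in models.items():
        params, _ = curve_fit(f, t, sales, p0=p0, maxfev=20000)
        err = sales - f(t, *params)
        scores[name] = {"MAE": np.mean(np.abs(err)), "MSE": np.mean(err ** 2)}
    return scores

# Toy cumulative computer-sales series (millions of units) for one country.
t = np.arange(20, dtype=float)
sales = 120 / (1 + np.exp(-0.4 * (t - 10))) + np.random.default_rng(1).normal(0, 1.5, t.size)

models = {
    "Logistic": (logistic, [150, 0.3, 10]),
    "Gompertz": (gompertz, [150, 5.0, 0.3]),
    "Trend":    (trend,    [0.0, 5.0]),
}
scores = fit_and_score(models, t, sales)
best = min(scores, key=lambda m: scores[m]["MSE"])   # model with minimum in-sample MSE
print(scores, "->", best)
```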

Relevance: 30.00%

Abstract:

Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interaction and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions but also from the sheer size of the information. The focus of this thesis is to provide statistical models that scale to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.

Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another massive-data problem, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the interpolation of methylation levels. Chapters 4 and 5 both concern robust inference for such models. Chapter 4 proposes a new robustness criterion for parameter estimation and shows that several inference approaches satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.

Relevance: 30.00%

Abstract:

Quantile regression (QR) was first introduced by Roger Koenker and Gilbert Bassett in 1978. It is robust to outliers, which can strongly affect the least squares estimator in linear regression. Instead of modeling the mean of the response, QR provides an alternative way to model the relationship between quantiles of the response and covariates. QR can therefore be widely used to solve problems in econometrics, environmental sciences, and health sciences. Sample size is an important factor in the planning stage of experimental designs and observational studies. In ordinary linear regression, sample size may be determined based on either precision analysis or power analysis with closed-form formulas. Methods that calculate sample size for QR based on precision analysis also exist, such as Jennen-Steinmetz and Wellek (2005). A method to estimate sample size for QR based on power analysis was proposed by Shao and Wang (2009). In this paper, a new method is proposed to calculate sample size based on power analysis under hypothesis tests of covariate effects. Even though no error distribution assumption is necessary for QR analysis itself, researchers have to make assumptions about the error distribution and covariate structure at the planning stage of a study to obtain a reasonable estimate of sample size. In this project, both parametric and nonparametric methods are provided for estimating the error distribution. Since the proposed method is implemented in R, the user can choose either a parametric distribution or nonparametric kernel density estimation for the error distribution. The user also needs to specify the covariate structure and effect size to carry out the sample size and power calculation. The performance of the proposed method is further evaluated using numerical simulation. The results suggest that the sample sizes obtained from our method provide empirical powers close to the nominal power level, for example, 80%.
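One simulation-based way to read the power-analysis idea is sketched below: for a candidate sample size, data are generated under an assumed error distribution and covariate structure, a quantile regression is fitted, and the rejection rate of the covariate-effect test gives the empirical power. This is a generic Monte Carlo sketch in Python (the paper's method is implemented in R); the data-generating model and names are assumptions, not the authors' formulas.

```python
import numpy as np
import statsmodels.api as sm

def empirical_power(n, beta1, tau=0.5, alpha=0.05, n_sim=500, error_draw=None, seed=0):
    """Monte Carlo power for testing H0: beta1 = 0 in the quantile regression
    y = beta0 + beta1 * x + e at quantile tau.

    error_draw(rng, n) samples the assumed error distribution (parametric or a
    kernel-density resampler); it defaults to a standard normal."""
    rng = np.random.default_rng(seed)
    if error_draw is None:
        error_draw = lambda rng, n: rng.standard_normal(n)
    rejections = 0
    for _ in range(n_sim):
        x = rng.standard_normal(n)                           # assumed covariate structure
        y = 1.0 + beta1 * x + error_draw(rng, n)
        res = sm.QuantReg(y, sm.add_constant(x)).fit(q=tau)
        if res.pvalues[1] < alpha:                           # reject H0: no covariate effect
            rejections += 1
    return rejections / n_sim

# Smallest n on a coarse grid whose empirical power reaches the 80% target.
for n in range(50, 501, 50):
    power = empirical_power(n, beta1=0.3)
    if power >= 0.80:
        print(f"n = {n}: empirical power = {power:.2f}")
        break
```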

Relevance: 30.00%

Abstract:

In this paper we present a convolutional neural network (CNN)-based model for human head pose estimation in low-resolution multi-modal RGB-D data. We pose the problem as one of classification of human gazing direction. We further fine-tune a regressor based on the learned deep classifier. Next we combine the two models (classification and regression) to estimate approximate regression confidence. We present state-of-the-art results on datasets that span the range from high-resolution human-robot interaction data (close-up faces plus depth information) to challenging low-resolution outdoor surveillance data. We build upon our robust head-pose estimation and further introduce a new visual attention model to recover interaction with the environment. Using this probabilistic model, we show that many higher-level scene-understanding tasks, such as human-human/scene interaction detection, can be achieved. Our solution runs in real time on commercial hardware.
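A minimal sketch of the classification-plus-regression idea follows: a shared backbone feeds a coarse gazing-direction classifier and an angle regressor, and the classifier's softmax peak is reused as an approximate confidence for the regressed pose. The architecture, number of bins, and input size are illustrative assumptions, not the paper's network.

```python
import torch
import torch.nn as nn

NUM_BINS = 8  # discretised gazing directions (assumed; the paper's binning is not given here)

class HeadPoseNet(nn.Module):
    """Shared CNN backbone with a classification head (gaze-direction bins) and a
    regression head (yaw, pitch, roll); the softmax peak of the classifier serves
    as an approximate confidence for the regression output."""
    def __init__(self, in_channels=4):                   # RGB-D: 3 colour channels + 1 depth
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, NUM_BINS)
        self.regressor = nn.Linear(32, 3)

    def forward(self, x):
        feat = self.backbone(x)
        logits = self.classifier(feat)
        angles = self.regressor(feat)
        confidence = torch.softmax(logits, dim=1).max(dim=1).values
        return angles, logits, confidence

# One forward pass on a batch of low-resolution RGB-D crops (e.g. 32x32).
model = HeadPoseNet()
angles, logits, confidence = model(torch.randn(4, 4, 32, 32))
print(angles.shape, confidence.shape)                    # torch.Size([4, 3]) torch.Size([4])
```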

Relevance: 30.00%

Abstract:

Static state estimators currently in use in power systems are prone to masking by multiple bad data. This is mainly because the power system regression model contains many leverage points, which typically occur in a cluster pattern. As reported recently in the statistical literature, only high-breakdown-point estimators are robust enough to cope with gross errors corrupting such a model. This paper deals with one such estimator, the least median of squares estimator, developed by Rousseeuw in 1984. The robustness of this method is assessed while applying it to power systems. Resampling methods are developed, and simulation results for IEEE test systems are discussed. © 1991 IEEE.
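For concreteness, the sketch below implements the least median of squares estimator by random resampling of elemental subsets, in the spirit of Rousseeuw (1984); it is a generic regression version with simulated bad data, not the paper's power-system-specific resampling scheme.

```python
import numpy as np

def least_median_of_squares(X, y, n_trials=2000, seed=0):
    """LMS regression by resampling: repeatedly fit an exact-fit subset of p
    observations and keep the coefficients minimising the median squared residual."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_crit = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=p, replace=False)       # elemental subset
        try:
            beta = np.linalg.solve(X[idx], y[idx])       # exact fit through p points
        except np.linalg.LinAlgError:
            continue                                     # singular subset, skip it
        crit = np.median((y - X @ beta) ** 2)            # median of squared residuals
        if crit < best_crit:
            best_beta, best_crit = beta, crit
    return best_beta, best_crit

# Toy example: a linear model in which 30% of the measurements are gross errors.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=100)
y[:30] += rng.normal(20, 5, 30)                          # bad data
beta_lms, _ = least_median_of_squares(X, y)
print(beta_lms)                                          # close to [2, -1] despite the bad data
```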

Relevance: 30.00%

Abstract:

Over 2 million Anterior Cruciate Ligament (ACL) injuries occur annually worldwide, resulting in considerable economic and health burdens (e.g., suffering, surgery, loss of function, risk for re-injury, and osteoarthritis). Current screening methods are effective, but they generally rely on expensive and time-consuming biomechanical movement analysis and are thus impractical solutions. In this dissertation, I report on a series of studies that begins to investigate one potentially efficient alternative to biomechanical screening, namely skilled observational risk assessment (e.g., having experts estimate risk based on observations of athletes' movements). Specifically, in Study 1 I discovered that ACL injury risk can be accurately and reliably estimated with nearly instantaneous visual inspection when observed by skilled and knowledgeable professionals. Modern psychometric optimization techniques were then used to develop a robust and efficient 5-item test of ACL injury risk prediction skill, i.e., the ACL Injury-Risk-Estimation Quiz or ACL-IQ. Study 2 cross-validated the results from Study 1 in a larger representative sample of both skilled (Exercise Science/Sports Medicine) and un-skilled (General Population) groups. In accord with research on human expertise, quantitative structural and process modeling of risk estimation indicated that superior performance was largely mediated by specific strategies and skills (e.g., ignoring irrelevant information), independent of domain-general cognitive abilities (e.g., mental rotation, general decision skill). These cognitive models suggest that ACL-IQ is a trainable skill, providing a foundation for future research and applications in training, decision support, and ultimately clinical screening investigations. Overall, I present the first evidence that observational ACL injury risk prediction is possible, including a robust technology for fast, accurate and reliable measurement, i.e., the ACL-IQ. Discussion focuses on applications and outreach, including a web platform that was developed to house the test, provide a repository for further data collection, and increase public and professional awareness and outreach (www.ACL-IQ.org). Future directions and general applications of the skilled movement analysis approach are also discussed.

Relevance: 20.00%

Abstract:

Super elastic nitinol (NiTi) wires were exploited as highly robust supports for three distinct crosslinked polymeric ionic liquid (PIL)-based coatings in solid-phase microextraction (SPME). The oxidation of NiTi wires in a boiling (30% w/w) H2O2 solution and subsequent derivatization in vinyltrimethoxysilane (VTMS) allowed for vinyl moieties to be appended to the surface of the support. UV-initiated on-fiber copolymerization of the vinyl-substituted NiTi support with monocationic ionic liquid (IL) monomers and dicationic IL crosslinkers produced a crosslinked PIL-based network that was covalently attached to the NiTi wire. This alteration alleviated receding of the coating from the support, which was observed for an analogous crosslinked PIL applied on unmodified NiTi wires. A series of demanding extraction conditions, including extreme pH, pre-exposure to pure organic solvents, and high temperatures, were applied to investigate the versatility and robustness of the fibers. Acceptable precision of the model analytes was obtained for all fibers under these conditions. Method validation by examining the relative recovery of a homologous group of phthalate esters (PAEs) was performed in drip-brewed coffee (maintained at 60 °C) by direct immersion SPME. Acceptable recoveries were obtained for most PAEs in the part-per-billion level, even in this exceedingly harsh and complex matrix.

Relevance: 20.00%

Abstract:

The purpose of this study was to correlate the pre-operative imaging, vascularity of the proximal pole, and histology of the proximal pole bone of established scaphoid fracture non-union. This was a prospective non-controlled experimental study. Patients were evaluated pre-operatively for necrosis of the proximal scaphoid fragment by radiography, computed tomography (CT) and magnetic resonance imaging (MRI). Vascular status of the proximal scaphoid was determined intra-operatively by the presence or absence of punctate bone bleeding. Samples were harvested from the proximal scaphoid fragment and sent for pathological examination. We determined the association between the imaging and intra-operative examination and the histological findings. We evaluated 19 male patients diagnosed with scaphoid nonunion. CT evaluation showed no correlation with scaphoid proximal fragment necrosis. MRI showed marked low signal intensity on T1-weighted images that confirmed the histological diagnosis of necrosis in the proximal scaphoid fragment in all patients. Intra-operative assessment showed that 90% of bones had an absence of intra-operative punctate bone bleeding, which was confirmed as necrosis by microscopic examination. In scaphoid nonunion, MRI images with marked low signal intensity on T1-weighted images and the absence of intra-operative punctate bone bleeding are strong indicators of osteonecrosis of the proximal fragment.

Relevance: 20.00%

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance: 20.00%

Abstract:

The aim of this study was to identify factors associated with edentulism and its spatial risk among the elderly. A cross-sectional study was carried out on a sample of 372 individuals aged 60 years and over in the municipality of Botucatu, São Paulo, Brazil, in 2005. Crude and adjusted prevalence ratios were estimated by Poisson regression with robust variance estimation and hierarchical modeling procedures. The spatial analysis was performed using kernel density estimates. The prevalence of edentulism was 63.17%. The sociodemographic factors associated with edentulism were low schooling, a larger number of persons per room, not owning a car, and older age, along with the presence of comorbidities, the lack of a regular dentist, and having had the last dental visit three or more years earlier. The spatial analysis showed higher risk in the peripheral areas. The results provide a better understanding of tooth loss among the elderly and support the planning of public health actions.
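As an illustration of the statistical approach described above (Poisson regression with a robust variance estimate to obtain prevalence ratios), the sketch below uses simulated data and hypothetical variable names; it is not the study's dataset or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated stand-in for the survey data (372 subjects, illustrative covariates).
rng = np.random.default_rng(0)
n = 372
df = pd.DataFrame({
    "low_schooling": rng.integers(0, 2, n),
    "persons_per_room": rng.uniform(0.5, 3.0, n),
    "age": rng.integers(60, 95, n),
})
p = 1 / (1 + np.exp(-(-1.2 + 0.9 * df.low_schooling + 0.3 * df.persons_per_room)))
df["edentulous"] = rng.binomial(1, p)

# Poisson regression on a binary outcome with a robust (sandwich) variance,
# a standard way of estimating prevalence ratios directly.
model = smf.glm("edentulous ~ low_schooling + persons_per_room + age",
                data=df, family=sm.families.Poisson())
result = model.fit(cov_type="HC0")
print(np.exp(result.params))                 # prevalence ratios
print(result.conf_int().apply(np.exp))       # 95% confidence limits on the PR scale
```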

Relevance: 20.00%

Abstract:

We present a computer program developed for estimating penetrance rates in autosomal dominant diseases from the family kinship and phenotype information contained in pedigrees. The program also determines the exact 95% credibility interval for the penetrance estimate. Both an executable version (PenCalc for Windows) and a web version (PenCalcWeb) of the software are available. The web version enables further calculations, such as heterozygosity probabilities and assessment of offspring risks for all individuals in the pedigrees. Both programs can be accessed and downloaded freely at http://www.ib.usp.br/~otto/software.htm.
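As a simplified illustration of reporting a penetrance estimate together with an exact credibility interval, the sketch below uses a Beta posterior for the proportion of affected individuals among known carriers; this toy calculation is not the pedigree-based likelihood implemented in PenCalc.

```python
from scipy.stats import beta

def penetrance_credible_interval(n_affected, n_carriers, level=0.95, a0=1.0, b0=1.0):
    """With a Beta(a0, b0) prior and a binomial count of affected individuals among
    known carriers, the posterior is Beta(a0 + affected, b0 + unaffected) and the
    credibility interval comes from its exact quantiles."""
    a = a0 + n_affected
    b = b0 + (n_carriers - n_affected)
    lo, hi = (1 - level) / 2, 1 - (1 - level) / 2
    point = a / (a + b)                                  # posterior mean penetrance
    return point, (beta.ppf(lo, a, b), beta.ppf(hi, a, b))

# Example: 14 affected individuals among 20 obligate carriers in the pedigrees.
print(penetrance_credible_interval(14, 20))
```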