915 results for Automated estimator


Relevance: 10.00%

Abstract:

Nothing can guarantee that the process of selecting an automated system will succeed, but observing a set of common-sense principles can help ensure a successful outcome. The process must focus on the long term and keep in mind the institutional context in which the system will be established. Systems are increasingly user-oriented, so involving users in the selection process is more and more important. The components of the selection process can be planned and combined in many different ways. The procedure used by the Purdue University Libraries illustrates one way of carrying out such a process in practice.

Relevance: 10.00%

Abstract:

One of the largest resources for biological sequence data is the vast number of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors, so such errors must be taken into account when EST sequences are analyzed computationally. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
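
As a rough illustration of the general idea (not the authors' model), the sketch below runs Viterbi decoding on a toy two-state HMM that labels nucleotides as coding or non-coding; the transition and emission probabilities are made-up placeholders.

```python
# Minimal sketch (not the authors' model): a two-state HMM over nucleotides
# with Viterbi decoding, illustrating how coding vs. non-coding regions can
# be labelled. Transition/emission values are made-up placeholders.
import math

STATES = ["coding", "noncoding"]
TRANS = {"coding": {"coding": 0.95, "noncoding": 0.05},
         "noncoding": {"coding": 0.05, "noncoding": 0.95}}
EMIT = {"coding":    {"A": 0.20, "C": 0.30, "G": 0.30, "T": 0.20},
        "noncoding": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}
START = {"coding": 0.5, "noncoding": 0.5}

def viterbi(seq):
    """Return the most likely state path for a nucleotide sequence."""
    V = [{s: math.log(START[s]) + math.log(EMIT[s][seq[0]]) for s in STATES}]
    back = []
    for x in seq[1:]:
        scores, ptr = {}, {}
        for s in STATES:
            best_prev = max(STATES, key=lambda p: V[-1][p] + math.log(TRANS[p][s]))
            scores[s] = V[-1][best_prev] + math.log(TRANS[best_prev][s]) + math.log(EMIT[s][x])
            ptr[s] = best_prev
        V.append(scores)
        back.append(ptr)
    # Trace back the highest-scoring path
    path = [max(STATES, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi("ATGGCGGCGTAA"))
```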

Relevance: 10.00%

Abstract:

Business requirements that demand substantial growth create greater complexity in data centers. It is the administrators who must manage the growing volume of data, applications and users, as well as the rapid proliferation of servers and different operating systems. This project aims to reduce the complexity of managing data centers by combining automated lifecycle management with all the contingency measures needed to preserve their integrity.

Relevance: 10.00%

Abstract:

PURPOSE: To determine whether the relative afferent pupillary defect (RAPD) remains constant over time in normal subjects. METHODS: Seventeen normal subjects were tested with infrared pupillography and automated perimetry in four sessions over 3 years. The changes in RAPD and visual field asymmetry between testing sessions were compared. RESULTS: The range of RAPD was 0.0 to 0.3 log unit, and the difference in the mean deviation between the eyes on automated static perimetry was 0 to 3 dB. Eight subjects repeatedly had an RAPD in the same eye. There was no correlation between the RAPD and the visual field asymmetry at the same visit. Changes in the magnitude of the RAPD between any two sessions were typically small (median, 0.08 log unit; 25th percentile, 0.04 log unit; 75th percentile, 0.15 log unit). CONCLUSIONS: Some normal subjects may show a persistent but small RAPD in the absence of detectable pathology. Therefore, an isolated RAPD in the range of 0.3 log unit that is not associated with any other significant historical or clinical finding should probably be considered benign.

Relevance: 10.00%

Abstract:

This paper analyses intergenerational earnings mobility in Spain, correcting for different selection biases. We address the co-residence selection problem by combining information from two samples and using the two-sample two-stage least squares estimator. We find a small decrease in elasticity when we move to younger cohorts. Furthermore, we find a higher correlation in the case of daughters than in the case of sons; however, when we account for employment selection in the case of daughters by adopting a Heckman-type correction, the difference between sons and daughters disappears. By decomposing the sources of earnings elasticity across generations, we find that the correlation between the child's and the father's occupation is the most important component. Finally, quantile regression estimates show that the influence of the father's earnings is greater in the lower tail of the offspring's earnings distribution, especially in the case of daughters' earnings.
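
For context, the baseline regression behind intergenerational elasticity estimates of this kind typically takes the following standard form; the exact specification and instrument set used in the paper may differ.

```latex
% Baseline intergenerational elasticity regression (standard textbook form,
% not necessarily the paper's exact specification):
\log y_i^{\text{child}} = \alpha + \beta \,\log y_i^{\text{father}} + \varepsilon_i
% With fathers' earnings unobserved in the child sample, the two-sample
% two-stage least squares (TS2SLS) approach replaces them by a first-stage
% prediction from observable characteristics z_i (e.g. education, occupation),
% with \gamma estimated on an auxiliary sample of fathers:
\widehat{\log y_i^{\text{father}}} = z_i'\hat{\gamma}
```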

Relevance: 10.00%

Abstract:

This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has known asymptotic size and is consistent. The test is implemented using the bootstrap, with the distribution under the null hypothesis approximated by a dependent-sample bootstrap technique that captures short-run dependence after fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments. The procedure is illustrated by applications to exchange rate volatility and dividend growth series.
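
As a minimal sketch of a log-periodogram (GPH-type) estimator applied to skip-sampled data — without the paper's aliasing correction factor, which is specific to that work — something like the following could be used:

```python
# Minimal sketch: plain GPH log-periodogram estimation of the long-memory
# parameter d, with an optional skip-sampled variant. The paper's aliasing
# correction factor is NOT implemented here.
import numpy as np

def gph_estimate(x, m=None):
    """Regress the log periodogram on -log(4 sin^2(lambda_j / 2)) at the
    first m Fourier frequencies; the OLS slope estimates d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.floor(n ** 0.5))          # a common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = (np.abs(dft) ** 2) / (2 * np.pi * n)  # periodogram ordinates
    X = -np.log(4 * np.sin(freqs / 2) ** 2)
    X_c, Y_c = X - X.mean(), np.log(I) - np.log(I).mean()
    return float((X_c @ Y_c) / (X_c @ X_c))

def gph_skip_sampled(x, skip=2, m=None):
    """Apply the same estimator to every `skip`-th observation, as in the
    skip-sampling idea described in the abstract (no bias correction)."""
    return gph_estimate(np.asarray(x)[::skip], m=m)

# Example: white noise should give d-hat near 0
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(2000)))
print(gph_skip_sampled(rng.standard_normal(2000), skip=2))
```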

Relevance: 10.00%

Abstract:

Satellite remote sensing imagery is used for forestry, conservation and environmental applications, but insufficient spatial resolution and, in particular, the unavailability of images at the precise timing required for a given application often prevent reaching a fully operational stage. Airborne remote sensing has the advantage of custom-tuned sensors, resolution and timing, but its price prevents its use as a routine technique in these fields. Some unmanned aerial vehicles might provide a "third way": low-cost techniques for acquiring remotely sensed information under close control of the end-user, albeit at the expense of lower-quality instrumentation and reduced stability. This report evaluates a light remote sensing system based on a remotely controlled mini-UAV (ATMOS-3) equipped with a color infrared camera (VEGCAM-1), designed and operated by CATUAV. We conducted a testing mission over a Mediterranean landscape dominated by an evergreen woodland of Aleppo pine (Pinus halepensis) and holm oak (Quercus ilex) in the Montseny National Park (Catalonia, NE Spain). We took advantage of state-of-the-art ortho-rectified digital aerial imagery (acquired by the Institut Cartogràfic de Catalunya over the area during the previous year) and used it as a quality reference. In particular, we paid attention to: 1) operationality of flight and image acquisition according to a previously defined plan; 2) radiometric and geometric quality of the images; and 3) operational use of the images in the context of applications. We conclude that the system has reached an operational stage regarding flight activities, although with meteorological limits set by wind speed and turbulence. Suitable landing areas can also sometimes be a limitation, but the system is able to land on small and relatively rough terrain such as patches of grassland or short matorral, and we have operated the UAV as far as 7 km from the control unit. Radiometric quality is sufficient for interactive analysis but probably insufficient for automated processing; a forthcoming camera is expected to greatly improve radiometric quality and consistency. Conventional GPS positioning through time synchronization provides coarse orientation of the images, with no roll information.

Relevance: 10.00%

Abstract:

Properties of GMM estimators for panel data, which have become very popular in the empirical economic growth literature, are not well known when the number of individuals is small. This paper analyses through Monte Carlo simulations the properties of various GMM and other estimators when the number of individuals is the one typically available in country growth studies. It is found that, provided that some persistency is present in the series, the system GMM estimator has a lower bias and higher efficiency than all the other estimators analysed, including the standard first-differences GMM estimator.
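
For reference, the moment conditions that distinguish the two estimators in the standard dynamic panel setting (the paper's growth specification may include additional regressors) are:

```latex
% Dynamic panel model with individual fixed effects:
y_{it} = \alpha\, y_{i,t-1} + \eta_i + \varepsilon_{it}
% First-differences (Arellano-Bond) GMM instruments the differenced equation
% with lagged levels:
\mathbb{E}\!\left[\, y_{i,t-s}\,\Delta\varepsilon_{it} \,\right] = 0, \qquad s \ge 2
% System (Blundell-Bond) GMM adds moment conditions for the levels equation,
% using lagged differences as instruments:
\mathbb{E}\!\left[\, \Delta y_{i,t-1}\,(\eta_i + \varepsilon_{it}) \,\right] = 0
```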

Relevance: 10.00%

Abstract:

I construct "homogeneous" series of GVA at current and constant prices, employment and population for Spain and its regions covering the period 1955-2007. The series are obtained by linking the Regional Accounts of the National Statistical Institute with the series constructed by Julio Alcaide and his team for the BBVA Foundation. The "switching point" at which this last source stops being used as a reference for constructing the linked series is determined using a procedure that allows me to estimate which of the two competing series would produce an estimator with the lowest MSE when used as the dependent variable in a regression on an arbitrary independent variable. To the extent that it is possible, the difference between the two series found at the point of linkage is distributed between the initial levels of the older series and its subsequent growth, using external estimates of the relevant variables at the beginning of the sample period.

Relevance: 10.00%

Abstract:

Consider a model with parameter phi, and an auxiliary model with parameter theta. Let phi be randomly sampled from a given density over the known parameter space. Monte Carlo methods can be used to draw simulated data and compute the corresponding estimate of theta, say theta_tilde. A large set of tuples (phi, theta_tilde) can be generated in this manner. Nonparametric methods may be used to fit the function E(phi|theta_tilde=a) using these tuples. It is proposed to estimate phi using the fitted E(phi|theta_tilde=theta_hat), where theta_hat is the auxiliary estimate computed from the real sample data. Under certain assumptions, this is a consistent and asymptotically normally distributed estimator. Monte Carlo results for dynamic panel data and vector autoregressions show that this estimator can have very attractive small-sample properties. Confidence intervals can be constructed using the quantiles of phi over the draws for which theta_tilde is close to theta_hat. Such confidence intervals are found to have very accurate coverage.
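
A minimal sketch of this procedure for a toy problem (estimating the mean of a normal distribution, with the sample median as the auxiliary statistic) is given below; all names, densities and bandwidth choices are illustrative rather than those of the proposed estimator.

```python
# Minimal sketch of the procedure in the abstract, for a toy problem:
# phi is the mean of a normal distribution, the auxiliary statistic
# theta_tilde is the sample median. All tuning choices are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_sim = 200, 20000

def simulate_theta(phi, n):
    """Draw a sample from the model at parameter phi and return the
    auxiliary estimate theta_tilde (here: the sample median)."""
    return np.median(rng.normal(phi, 1.0, size=n))

# 1) Sample phi from a density over the parameter space and simulate theta_tilde
phi_draws = rng.uniform(-5.0, 5.0, size=n_sim)
theta_tilde = np.array([simulate_theta(p, n_obs) for p in phi_draws])

# 2) Auxiliary estimate from the (here: synthetic) real data, true phi = 1.3
theta_hat = np.median(rng.normal(1.3, 1.0, size=n_obs))

# 3) Nonparametric fit of E(phi | theta_tilde), evaluated at theta_hat
#    (Nadaraya-Watson with a Gaussian kernel; rule-of-thumb bandwidth)
h = 1.06 * theta_tilde.std() * n_sim ** (-1 / 5)
w = np.exp(-0.5 * ((theta_tilde - theta_hat) / h) ** 2)
phi_est = np.sum(w * phi_draws) / np.sum(w)

# 4) Confidence interval from quantiles of phi among draws near theta_hat
near = np.abs(theta_tilde - theta_hat) < h
ci = np.quantile(phi_draws[near], [0.025, 0.975])
print(phi_est, ci)
```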

Relevance: 10.00%

Abstract:

Real-world objects are often endowed with features that violate Gestalt principles. In our experiment, we examined the neural correlates of binding under conflict conditions in terms of the binding-by-synchronization hypothesis. We presented an ambiguous stimulus ("diamond illusion") to 12 observers. The display consisted of four oblique gratings drifting within circular apertures. Its interpretation fluctuates between bound ("diamond") and unbound (component gratings) percepts. To model a situation in which Gestalt-driven analysis contradicts the perceptually explicit bound interpretation, we modified the original diamond (OD) stimulus by speeding up one grating. Using OD and modified diamond (MD) stimuli, we managed to dissociate the neural correlates of Gestalt-related (OD vs. MD) and perception-related (bound vs. unbound) factors. Their interaction was expected to reveal the neural networks synchronized specifically in the conflict situation. The synchronization topography of EEG was analyzed with the multivariate S-estimator technique. We found that good Gestalt (OD vs. MD) was associated with a higher posterior synchronization in the beta-gamma band. The effect of perception manifested itself as reciprocal modulations over the posterior and anterior regions (theta/beta-gamma bands). Specifically, higher posterior and lower anterior synchronization supported the bound percept, and the opposite was true for the unbound percept. The interaction showed that binding under challenging perceptual conditions is sustained by enhanced parietal synchronization. We argue that this distributed pattern of synchronization relates to the processes of multistage integration ranging from early grouping operations in the visual areas to maintaining representations in the frontal networks of sensory memory.
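
One common formulation of the multivariate S-estimator computes synchronization as one minus the normalized entropy of the eigenvalue spectrum of the inter-channel correlation matrix; the sketch below follows that formulation and may differ in detail (e.g. embedding or windowing) from the variant used in the study.

```python
# Sketch of one common formulation of the multivariate S-estimator:
# synchronization = 1 - normalized entropy of the eigenvalue spectrum of
# the inter-channel correlation matrix. Details may differ from the study.
import numpy as np

def s_estimator(data):
    """data: array of shape (n_channels, n_samples). Returns S in [0, 1],
    where 0 ~ independent channels and 1 ~ fully synchronized channels."""
    m = data.shape[0]
    corr = np.corrcoef(data)                   # m x m correlation matrix
    eigvals = np.linalg.eigvalsh(corr)
    lam = eigvals / eigvals.sum()              # normalized eigenvalues
    lam = lam[lam > 1e-12]                     # convention: 0 * log(0) = 0
    entropy = -np.sum(lam * np.log(lam))
    return 1.0 - entropy / np.log(m)

rng = np.random.default_rng(0)
independent = rng.standard_normal((8, 1000))                     # low sync
common = rng.standard_normal(1000)
synchronized = common + 0.05 * rng.standard_normal((8, 1000))    # high sync
print(s_estimator(independent), s_estimator(synchronized))
```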

Relevance: 10.00%

Abstract:

Background: Previous studies reported an increase of mean platelet volume (MPV) in patients with acute ischemic stroke. However, its correlation with stroke severity has not been investigated. Moreover, studies on the association of MPV with functional outcome yielded inconsistent results. Methods: We included all consecutive ischemic stroke patients admitted to CHUV (Centre Hospitalier Universitaire Vaudois) Neurology Service within 24 h after stroke onset who had MPV measured on admission. The association of MPV with stroke severity (NIHSS score at admission and at 24 h) and outcome (Rankin Scale score at 3 and 12 months) was analyzed in univariate analysis. The χ² test was performed to compare the frequency of minor strokes (NIHSS score ≤4) and good functional outcome (Rankin Scale score ≤2) across MPV quartiles. The ANOVA test was used to compare MPV between stroke subtypes according to the TOAST classification. Student's two-tailed unpaired t test was performed to compare MPV between lacunar and nonlacunar strokes. MPV was generated at admission by the Sysmex XE-2100 automated cell counter (Sysmex Corporation, Kobe, Japan) from EDTA blood samples. Results: There was no significant difference in the frequency of minor strokes (p = 0.46) and good functional outcome (p = 0.06) across MPV quartiles. MPV was not associated with stroke severity or outcome in univariate analysis. There was no significant difference in MPV between stroke subtypes according to the TOAST classification (p = 0.173) or between lacunar and nonlacunar strokes (10.50 ± 0.91 vs. 10.40 ± 0.81 fl, p = 0.322). Conclusions: MPV, assessed within 24 h after ischemic stroke onset, is not associated with stroke severity or functional outcome.

Relevance: 10.00%

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
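
A stylized sketch of the idea — kernel-smoothed conditional moments from a long simulation, matched to the sample by method of moments — is shown below for a simple AR(1) example; the model, kernel and bandwidth are placeholders, not the estimator's actual implementation.

```python
# Stylized sketch: conditional moments at a trial parameter are computed by
# kernel smoothing over a long simulated path and matched to the sample.
# The AR(1) example, kernel and bandwidth are placeholders only.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

def simulate_ar1(rho, n, rng):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y

def kernel_cond_mean(x_sim, y_sim, x_eval, h):
    """Nadaraya-Watson estimate of E[y | x] at the points x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / h) ** 2)
    return (w @ y_sim) / w.sum(axis=1)

# "Observed" data from the true model (rho = 0.6)
data = simulate_ar1(0.6, 500, rng)
x_obs, y_obs = data[:-1], data[1:]

def mm_objective(rho):
    # Long simulation at the trial parameter (common random numbers via seed)
    sim = simulate_ar1(rho, 5000, np.random.default_rng(0))
    x_sim, y_sim = sim[:-1], sim[1:]
    h = 1.06 * x_sim.std() * len(x_sim) ** (-1 / 5)
    # Moment conditions based on E[y_t - E(y_t | y_{t-1}) | y_{t-1}] = 0
    resid = y_obs - kernel_cond_mean(x_sim, y_sim, x_obs, h)
    g = np.array([resid.mean(), (resid * x_obs).mean()])
    return g @ g

print(minimize_scalar(mm_objective, bounds=(-0.95, 0.95), method="bounded").x)
```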

Relevance: 10.00%

Abstract:

This is a guide that explains how to use software that implements the simulated nonparametric moments (SNM) estimator proposed by Creel and Kristensen (2009). The guide shows how results of that paper may easily be replicated, and explains how to install and use the software for estimation of simulable econometric models.

Relevance: 10.00%

Abstract:

The recent advances in sequencing technologies have given all microbiology laboratories access to whole genome sequencing. Provided that tools for the automated analysis of sequence data and databases for the associated meta-data are developed, whole genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. Being a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.