988 results for "Sequential error ratio"


Relevance: 30.00%

Abstract:

This paper studies the signalling effect of the consumption-wealth ratio (cay) on German stock returns via vector error correction models (VECMs). The effect of cay on U.S. stock returns was recently confirmed by Lettau and Ludvigson using a two-stage method. In this paper, the performance of the VECMs and of the two-stage method is compared on both German and U.S. data. It is found that the VECMs are more suitable than the two-stage method for studying the effect of cay on stock returns. Using the Conditional-Subset VECM, cay significantly signals real stock returns and excess returns in both data sets. The estimated coefficient on cay for stock returns turns out to be twice as large in U.S. data as in German data. When the two-stage method is used, cay has no significant effect on German stock returns. In addition, cay is found to signal German wealth growth and U.S. income growth significantly.
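The error-correction mechanics behind this comparison can be made concrete. Below is a minimal sketch using the statsmodels VECM class on a simulated cointegrated pair standing in for consumption and wealth (whose cointegration residual plays the role of cay); it illustrates the model family only, not the paper's Conditional-Subset specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
n = 300
common = rng.normal(0, 1, n).cumsum()          # shared stochastic trend
consumption = common + rng.normal(0, 0.3, n)   # cointegrated with wealth;
wealth = common + rng.normal(0, 0.3, n)        # their residual mimics cay

df = pd.DataFrame({"c": consumption, "w": wealth})
res = VECM(df, k_ar_diff=2, coint_rank=1, deterministic="co").fit()
print("adjustment coefficients (alpha):\n", res.alpha)  # who error-corrects?
print("cointegrating vector (beta):\n", res.beta)
```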

Relevance: 30.00%

Abstract:

Satellite-based Synthetic Aperture Radar (SAR) has proved useful for obtaining information on flood extent, which, when intersected with a Digital Elevation Model (DEM) of the floodplain, provides water level observations that can be assimilated into a hydrodynamic model to decrease forecast uncertainty. With an increasing number of operational satellites with SAR capability, information on the relationship between satellite first visit and revisit times and forecast performance is required to optimise the operational scheduling of satellite imagery. Using an Ensemble Transform Kalman Filter (ETKF) and a synthetic analysis with the 2D hydrodynamic model LISFLOOD-FP based on a real flood event affecting an urban area (summer 2007, Tewkesbury, Southwest UK), we evaluate the sensitivity of forecast performance to these visit parameters. We emulate a generic hydrologic-hydrodynamic modelling cascade by imposing a bias and spatiotemporal correlations on the inflow error ensemble entering the hydrodynamic domain. First, in agreement with previous research, estimating and correcting for this bias clearly improves keeping the forecast on track. Second, imagery obtained early in the flood is shown to have a large influence on forecast statistics, and the revisit interval is most influential for early observations. The results are promising for the future of remote sensing-based water level observations for real-time flood forecasting in complex scenarios.
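The assimilation step can be sketched generically. Below is a minimal Ensemble Transform Kalman Filter analysis update in the standard square-root form, with state size, ensemble size, and observation operator all invented for illustration; the paper's system couples such an update to LISFLOOD-FP and SAR-derived water levels.

```python
import numpy as np

def etkf_update(X, y, H, R):
    """ETKF analysis step. X: (n, m) state ensemble, y: (p,) observations,
    H: (p, n) observation operator, R: (p, p) observation error covariance."""
    n, m = X.shape
    xbar = X.mean(axis=1, keepdims=True)
    Xp = X - xbar                                  # state perturbations
    Yp = H @ Xp                                    # observation-space perturbations
    C = Yp.T @ np.linalg.inv(R)
    Pa = np.linalg.inv((m - 1) * np.eye(m) + C @ Yp)
    wbar = Pa @ C @ (y - (H @ xbar).ravel())       # mean update weights
    evals, evecs = np.linalg.eigh((m - 1) * Pa)    # symmetric square root
    W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    return xbar + Xp @ (wbar[:, None] + W)

rng = np.random.default_rng(1)
n, m, p = 50, 20, 5
truth = np.sin(np.linspace(0, 3, n))               # synthetic water levels
X = truth[:, None] + rng.normal(0, 0.5, (n, m))    # prior ensemble
H = np.eye(n)[::10]                                # observe every 10th node
y = H @ truth + rng.normal(0, 0.3, p)
Xa = etkf_update(X, y, H, 0.09 * np.eye(p))
print("prior RMSE:", np.sqrt(((X.mean(1) - truth) ** 2).mean()))
print("posterior RMSE:", np.sqrt(((Xa.mean(1) - truth) ** 2).mean()))
```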

Relevance: 30.00%

Abstract:

A procedure is described in which patients are randomized between two experimental treatments and a control. At a series of interim analyses, each experimental treatment is compared with control. One of the experimental treatments might then be found sufficiently superior to the control for it to be declared the best treatment, and the trial stopped. Alternatively, experimental treatments might be eliminated from further consideration at any stage. It is shown how the procedure can be conducted while controlling overall error probabilities. Data concerning evaluation of different doses of riluzole in the treatment of motor neurone disease are used for illustration.
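How control of the overall error probability can be verified is easy to show by simulation. The sketch below runs a three-arm trial (two experimental arms against a control) with interim looks and an efficacy stopping boundary; the Pocock-style Bonferroni split is a crude stand-in for the paper's procedure, and all constants are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
looks, n_per_look, sims = 4, 25, 20000
z_bound = norm.ppf(1 - (0.05 / 2) / looks)   # split alpha over 2 arms, 4 looks

rejections = 0
for _ in range(sims):
    data = rng.normal(0, 1, (3, looks * n_per_look))   # global null: no effect
    hit = False
    for k in range(1, looks + 1):
        n = k * n_per_look                             # patients per arm so far
        ctrl = data[0, :n].mean()
        se = np.sqrt(2 / n)
        for arm in (1, 2):                             # compare each arm to control
            if (data[arm, :n].mean() - ctrl) / se > z_bound:
                hit = True                             # declare best, stop trial
        if hit:
            break
    rejections += hit
print("empirical familywise type I error:", rejections / sims)  # below 0.05
```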

Relevance: 30.00%

Abstract:

We estimate the conditions for detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, the number of observations, and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ~4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in the masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger amplitudes, in accordance with results found in real resonant extrasolar systems.
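The synthetic-data methodology can be sketched under a circular-orbit (purely sinusoidal) approximation; the actual study fits full Keplerian orbits, which is where the eccentricity bias appears. The epoch count, noise level, and ~4 amplitude ratio follow the abstract, while the periods, phases, and amplitudes are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def rv(t, K1, P1, ph1, K2, P2, ph2):
    # two planets on circular orbits near the 2/1 commensurability
    return (K1 * np.sin(2 * np.pi * t / P1 + ph1)
            + K2 * np.sin(2 * np.pi * t / P2 + ph2))

t = np.sort(rng.uniform(0, 1000, 100))        # ~100 epochs over ~3 yr
truth = (50.0, 300.0, 0.3, 12.0, 600.0, 1.0)  # K1/K2 ~ 4: weak outer signal
sigma = 3.0                                   # a few m/s, as in the abstract

fits = []
for _ in range(500):                          # synthetic data sets, refit each
    y = rv(t, *truth) + rng.normal(0, sigma, t.size)
    p, _ = curve_fit(rv, t, y, p0=truth, maxfev=20000)
    fits.append(p)
fits = np.array(fits)
print("K2 bias:", fits[:, 3].mean() - truth[3], " K2 scatter:", fits[:, 3].std())
```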

Relevance: 30.00%

Abstract:

In this paper we deal with the issue of performing accurate testing inference on a scalar parameter of interest in structural errors-in-variables models. The error terms are allowed to follow a multivariate distribution in the class of elliptical distributions, which has the multivariate normal distribution as a special case. We derive a modified signed likelihood ratio statistic that follows a standard normal distribution with a high degree of accuracy. Our Monte Carlo results show that the modified test is much less size-distorted than its unmodified counterpart. An application is presented.
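For intuition, here is a minimal one-parameter illustration of the signed likelihood ratio statistic r and Barndorff-Nielsen's modified version r* (an exponential-rate model with no nuisance parameters); the paper's elliptical errors-in-variables setting is far more involved, but the size improvement has the same flavour.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n, theta0, sims = 10, 1.0, 50000
rej_r = rej_rstar = 0
for _ in range(sims):
    x = rng.exponential(1 / theta0, n)             # Exp(rate = theta0) sample
    that = 1 / x.mean()                            # MLE of the rate
    ll = lambda th: n * np.log(th) - th * x.sum()  # log-likelihood
    r = np.sign(that - theta0) * np.sqrt(2 * (ll(that) - ll(theta0)))
    q = (that - theta0) * np.sqrt(n) / that        # Wald statistic, observed info
    rstar = r + np.log(q / r) / r                  # Barndorff-Nielsen's r*
    rej_r += abs(r) > norm.ppf(0.975)
    rej_rstar += abs(rstar) > norm.ppf(0.975)
# r over-rejects for small n; r* should sit much closer to the nominal 0.05
print("r:", rej_r / sims, " r*:", rej_rstar / sims)
```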

Relevance: 30.00%

Abstract:

The main object of this paper is the Bayes estimation of the regression coefficients in an elliptically distributed simple regression model with measurement errors. The posterior distribution of the line parameters is obtained in closed form under the following assumptions: the ratio of the error variances is known, an informative prior distribution is placed on the error variance, and non-informative prior distributions are placed on the regression coefficients and the incidental parameters. We prove that the posterior distribution of the regression coefficients has at most two real modes. Situations with a single mode are more likely than those with two modes, especially in large samples. The precision of the modal estimators is studied by deriving the Hessian matrix which, although complicated, can be computed numerically. The posterior mean is estimated using the Gibbs sampling algorithm and approximations by normal distributions. The results are applied to a real data set and connections with results in the literature are reported.
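As a sketch of the posterior-mean computation, the following runs a Gibbs sampler for an ordinary conjugate regression (no measurement error), a deliberate simplification of the paper's model; the priors and data are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
x = rng.normal(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)     # true line: intercept 1, slope 2
X = np.column_stack([np.ones(n), x])

a0, b0 = 2.0, 1.0                             # inverse-gamma prior on sigma^2
B0inv = np.eye(2) / 100.0                     # weak N(0, 100 I) prior on coefficients

beta, sig2, draws = np.zeros(2), 1.0, []
for it in range(6000):
    V = np.linalg.inv(X.T @ X / sig2 + B0inv)      # beta | sig2 ~ N(m, V)
    m = V @ (X.T @ y / sig2)
    beta = rng.multivariate_normal(m, V)
    rss = ((y - X @ beta) ** 2).sum()              # sig2 | beta ~ IG(a0+n/2, b0+rss/2)
    sig2 = 1 / rng.gamma(a0 + n / 2, 1 / (b0 + rss / 2))
    if it >= 1000:                                 # discard burn-in
        draws.append(beta)
print("posterior means (intercept, slope):", np.mean(draws, axis=0))
```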

Relevance: 30.00%

Abstract:

This paper describes the development of a sequential injection chromatography (SIC) procedure for the separation and quantification of the herbicides simazine, atrazine, and propazine, exploiting the low backpressure of a 2.5 cm long monolithic C18 column. Separation of the three compounds was achieved in less than 90 s with resolution > 1.5, using a mobile phase of ACN/1.25 mmol/L acetate buffer (pH 4.5) at a volumetric ratio of 35:65 and a flow rate of 40 µL/s. Detection was performed at 223 nm using a flow cell with a 40 mm optical path length. The LOD was 10 µg/L for all three triazines, and the quantification limits were 30 µg/L for simazine and propazine and 40 µg/L for atrazine. The sampling frequency is 27 samples per hour, consuming 1.1 mL of ACN per analysis. The proposed methodology was applied to spiked water samples, and no statistically significant differences were observed in comparison with a conventional HPLC-UV method. The major metabolites of atrazine and other herbicides did not interfere in the analysis, eluting from the column either with the unretained peak or at retention times well resolved from the studied compounds.
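The reported detection and quantification limits follow the usual 3σ and 10σ conventions; a small worked sketch with hypothetical calibration numbers:

```python
import numpy as np

conc = np.array([0, 50, 100, 200, 400.0])     # µg/L, hypothetical standards
area = np.array([1.2, 252, 498, 1003, 1998])  # peak areas, hypothetical
slope, intercept = np.polyfit(conc, area, 1)

s_blank = 1.7                                 # assumed blank-signal std deviation
lod = 3 * s_blank / slope                     # S/N = 3 detection limit
loq = 10 * s_blank / slope                    # S/N = 10 quantification limit
print(f"LOD = {lod:.1f} µg/L, LOQ = {loq:.1f} µg/L")
```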

Relevance: 30.00%

Abstract:

The concept of sequential injection chromatography (SIC) was exploited to automate the fluorimetric determination of amino acids after pre-column derivatization with o-phthaldialdehyde (OPA) in the presence of 2-mercaptoethanol (2MCE), using a reversed-phase monolithic C18 stationary phase. The method is inexpensive and based on five isocratic elution steps. The first step employs a methanol:tetrahydrofuran:10 mmol L⁻¹ phosphate buffer (pH 7.2) mixture at a volumetric ratio of 8:1:91; the other steps use methanol:10 mmol L⁻¹ phosphate buffer (pH 7.2) at volumetric ratios of 20:80, 35:65, 50:50 and 65:35. At a flow rate of 10 µL s⁻¹, a 25 mm long column was able to separate aspartic acid (Asp), glutamic acid (Glu), asparagine (Asn), serine (Ser), glutamine (Gln), glycine (Gly), threonine (Thr), citrulline (Ctr), arginine (Arg), alanine (Ala), tyrosine (Tyr), phenylalanine (Phe), ornithine (Orn) and lysine (Lys) with resolution > 1.2, as well as methionine (Met) and valine (Val) with a resolution of 0.6. Under these conditions isoleucine (Ile) and leucine (Leu) co-eluted. The entire cycle of amino acid derivatization, chromatographic separation and column conditioning lasted 25 min. At a flow rate of 40 µL s⁻¹ this time was reduced to 10 min, at the cost of worse resolution for the pairs Ctr/Arg and Orn/Lys. Detection limits varied from 0.092 µmol L⁻¹ for Tyr to 0.51 µmol L⁻¹ for Orn. The method was successfully applied to the determination of intracellular free amino acids in the green alga Tetraselmis gracilis over a seven-day cultivation period. Samples spiked with known amounts of amino acids gave recoveries between 94 and 112%.
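The resolution figures quoted above follow the standard definition Rs = 2(t2 − t1)/(w1 + w2); a tiny worked example with hypothetical retention times and baseline peak widths (all in seconds):

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution from retention times and baseline widths."""
    return 2 * (t2 - t1) / (w1 + w2)

# hypothetical adjacent peaks; Rs > 1.2 indicates near-baseline separation
print(resolution(t1=310, w1=14, t2=330, w2=16))   # ~1.33
```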

Relevance: 30.00%

Abstract:

This paper draws on empirical evidence to demonstrate that a heuristic framework signals collapse with significantly higher accuracy than the traditional static approach. Using a sample of 494 US publicly listed companies, comprising 247 collapsed firms matched with 247 financially healthy ones, the heuristic framework is decisively superior the closer one gets to the event of collapse, culminating in 12.5% higher overall accuracy than the static approach during the year of collapse. An even more dramatic improvement occurs in the reduction of Type I error, where the heuristic framework delivers a 66.7% improvement over its static counterpart.
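For clarity, here is a short sketch of how such accuracy and Type I error figures are computed from a confusion matrix (in this literature, a Type I error is classifying a collapsed firm as healthy); the counts below are hypothetical, not the paper's results.

```python
# 494 firms: 247 collapsed matched with 247 healthy (hypothetical outcomes)
collapsed_caught, collapsed_missed = 210, 37   # missed collapses = Type I errors
healthy_cleared, healthy_flagged = 200, 47     # false alarms = Type II errors

accuracy = (collapsed_caught + healthy_cleared) / 494
type1 = collapsed_missed / 247
type2 = healthy_flagged / 247
print(f"accuracy {accuracy:.1%}, Type I {type1:.1%}, Type II {type2:.1%}")
```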

Relevance: 30.00%

Abstract:

This thesis uses computer simulation to study the effect of the normal distribution assumption on the power of several many-sample location and scale test procedures. It also proposes a nearly robust parametric test for non-normal situations, the numerical likelihood ratio test (NLRT), which is found to outperform all of the other tests considered. Some real-life data sets are used as examples.
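The NLRT itself is not reproduced here, but the thesis's simulation design is easy to sketch: estimate power by Monte Carlo for many-sample location tests under normal and heavy-tailed errors, using classical tests as stand-ins.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
sims, k, n = 2000, 4, 20                     # replications, groups, group size

def power(shift, sampler, test, alpha=0.05):
    hits = 0
    for _ in range(sims):
        # shift only the last group; the rest sit at the null location
        groups = [sampler(n) + (shift if j == k - 1 else 0.0) for j in range(k)]
        hits += test(*groups).pvalue < alpha
    return hits / sims

for name, sampler in [("normal", rng.standard_normal),
                      ("t(3)  ", lambda size: rng.standard_t(3, size))]:
    print(name, "ANOVA:", power(1.0, sampler, stats.f_oneway),
          " Kruskal-Wallis:", power(1.0, sampler, stats.kruskal))
```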

Relevance: 30.00%

Abstract:

There is currently no universally recommended and accepted method of data processing within the science of indirect calorimetry for either mixing chamber or breath-by-breath systems of expired gas analysis. Exercise physiologists were first surveyed to determine the methods used to process oxygen consumption (V̇O2) data and current attitudes to data processing within the science of indirect calorimetry. Breath-by-breath datasets obtained from indirect calorimetry during incremental exercise were then used to demonstrate the consequences of commonly used time-based, breath-based and digital filter post-acquisition data processing strategies. Assessment of the variability in breath-by-breath data was determined using multiple regression based on the independent variables ventilation (VE) and the expired gas fractions for oxygen and carbon dioxide, FEO2 and FECO2, respectively. Based on the results of the explanation of variance of the breath-by-breath V̇O2 data, methods of processing to remove variability were proposed for time-averaged, breath-averaged and digital filter applications. Among exercise physiologists, the strategy used to remove the variability in sequential V̇O2 measurements varied widely, and consisted of time averages (30 sec [38%], 60 sec [18%], 20 sec [11%], 15 sec [8%]), a moving average of five to 11 breaths (10%), and the middle five of seven breaths (7%). Most respondents indicated that they used multiple criteria to establish maximum V̇O2 (V̇O2max), including: the attainment of age-predicted maximum heart rate (HRmax) [53%], a respiratory exchange ratio (RER) >1.10 (49%) or RER >1.15 (27%), and a rating of perceived exertion (RPE) of >17, 18 or 19 (20%). The reasons stated for these strategies included the physiologists' own beliefs (32%), what they were taught (26%), what they read in research articles (22%), tradition (13%) and the influence of their colleagues (7%). The combination of VE, FEO2 and FECO2 removed 96-98% of the V̇O2 breath-by-breath variability in incremental and steady-state exercise V̇O2 data sets, respectively. Reduction of the residual error in V̇O2 datasets to 10% of the raw variability results from applying a 30-second time average, a 15-breath running average, or a digital filter with a 0.04 Hz cut-off. Thus, we recommend that once these data processing strategies are used, the peak or maximal value becomes the highest processed datapoint. Exercise physiologists need to agree on, and continually refine through empirical research, a consistent process for analysing data from indirect calorimetry.
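The three recommended processing strategies translate directly into code. A minimal sketch on synthetic breath-by-breath data, assuming scipy for the digital filter and a 1 Hz resampling grid (which the abstract does not specify):

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(7)
t = np.cumsum(rng.uniform(1.5, 3.5, 600))            # irregular breath times (s)
vo2 = 0.5 + 0.003 * t + rng.normal(0, 0.25, t.size)  # ramp-test VO2 + noise (L/min)

# (1) 30-second time averages
edges = np.arange(0, t[-1] + 30, 30)
idx = np.digitize(t, edges)
time_avg = np.array([vo2[idx == i].mean() for i in np.unique(idx)])

# (2) 15-breath running average
breath_avg = np.convolve(vo2, np.ones(15) / 15, mode="valid")

# (3) digital low-pass filter, 0.04 Hz cut-off, after resampling to 1 Hz
tu = np.arange(0, t[-1], 1.0)
b, a = butter(2, 0.04, btype="low", fs=1.0)
filtered = filtfilt(b, a, np.interp(tu, t, vo2))

# the peak/maximal value is then the highest processed data point
print(time_avg.max(), breath_avg.max(), filtered.max())
```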

Relevance: 30.00%

Abstract:

The prevalence of visual impairment due to uncorrected refractive error has not been previously studied in Canada. A population-based study was conducted in Brantford, Ontario. The target population included all people 40 years of age and older. Study participants were selected using a randomized sampling strategy based on postal codes. Presenting distance and near visual acuities were measured with habitual spectacle correction, if any, in place. Best corrected visual acuities were determined for all participants who had a presenting distance visual acuity of less than 20/25. The population-weighted prevalence of distance visual impairment (visual acuity <20/40 in the better eye) was 2.7% (n = 768, 95% confidence interval (CI) 1.8–4.0%), with 71.8% correctable by refraction. The population-weighted prevalence of near visual impairment (visual acuity <20/40 with both eyes) was 2.2% (95% CI 1.4–3.6), with 69.1% correctable by refraction. Multivariable-adjusted analysis showed that the odds of having distance visual impairment were independently associated with increased age (odds ratio, OR, 3.56, 95% CI 1.22–10.35; ≥65 years compared to those 39–64 years) and time since last eye examination (OR 4.93, 95% CI 1.19–20.32; ≥5 years compared to ≤2 years). The same factors appear to be associated with increased prevalence of near visual impairment, but the associations were not statistically significant. The majority of visual impairment found in Brantford was due to uncorrected refractive error. The factors that increased the prevalence of visual impairment were the same for distance and near visual acuity measurements.
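For reference, odds ratios with Woolf (log-based) confidence intervals of the kind reported here are computed from a 2×2 table as follows; the counts are hypothetical, not the study's data.

```python
import numpy as np

def odds_ratio(a, b, c, d, z=1.96):
    """a/b: impaired/unimpaired in the exposed group; c/d: in the reference group."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se)
    return or_, lo, hi

# hypothetical counts: distance impairment by age group (>=65 vs younger)
print(odds_ratio(a=12, b=188, c=9, d=559))
```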

Relevance: 30.00%

Abstract:

Capability indices for both univariate and multivariate processes are extensively employed in quality control to assess the quality status of production batches before their release for operational use. A capability index is traditionally a measure of the ratio of the allowable process spread to the actual spread. In this paper, we adopt bootstrap and sequential sampling procedures to determine the optimal sample size for estimating a multivariate capability index introduced by Pearn et al. [12]. Bootstrap techniques have the distinct advantage of placing very minimal requirements on the distributions of the underlying quality characteristics, making them relevant under a wide variety of situations. Finally, we provide several numerical examples in which the sequential sampling procedures are evaluated and compared.
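A hedged sketch of the bootstrap-plus-sequential-sampling idea, using the univariate Cpk as a stand-in for the multivariate index of Pearn et al.; the specification limits, target precision, and batch size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

def cpk(x, lsl=-3.0, usl=3.0):                 # univariate capability index
    mu, s = x.mean(), x.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3 * s)

def boot_ci(x, B=2000, alpha=0.05):            # percentile bootstrap CI for Cpk
    reps = [cpk(rng.choice(x, x.size, replace=True)) for _ in range(B)]
    return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

x = rng.normal(0, 1, 30)                       # initial sample from the process
while True:                                    # sequential sampling loop
    lo, hi = boot_ci(x)
    if hi - lo < 0.4:                          # stop once the CI is tight enough
        break
    x = np.append(x, rng.normal(0, 1, 10))     # otherwise collect 10 more items
print(f"n = {x.size}, Cpk = {cpk(x):.2f}, CI = ({lo:.2f}, {hi:.2f})")
```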

Relevance: 30.00%

Abstract:

This thesis proposes the specification and performance analysis of a real-time communication mechanism for the IEEE 802.11/11e standard, called Group Sequential Communication (GSC). The GSC outperforms the HCCA mechanism for small data packets by adopting decentralized medium access control with a publish/subscribe communication scheme. The main objective is to reduce the HCCA overhead of the Polling, ACK and QoS Null frames exchanged between the Hybrid Coordinator and the polled stations. The GSC eliminates the polling scheme of the HCCA scheduling algorithm by using a virtual token passing procedure among members of the real-time group, to whom high-priority, sequential access to the communication medium is granted. To improve the reliability of the proposed mechanism on a noisy channel, an error recovery scheme called the second chance algorithm is presented. This scheme is based on a block acknowledgment strategy that allows retransmission of missed real-time messages. The GSC mechanism thus sustains real-time traffic across many IEEE 802.11/11e devices with optimized bandwidth usage and minimal delay variation for data packets in the wireless network. For validation, the GSC and HCCA mechanisms were implemented in network simulation software developed in C/C++ and their performance results were compared. The experiments show the efficiency of the GSC mechanism, especially in industrial communication scenarios.
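A toy sketch of one virtual token passing cycle with the second chance retransmission idea; the loss probability and group size are invented, and the real mechanism operates on IEEE 802.11e frames and block acknowledgments rather than abstract messages.

```python
import random

random.seed(9)
MEMBERS = ["s1", "s2", "s3", "s4"]   # real-time publish/subscribe group
LOSS = 0.1                           # channel error probability (assumed)

def token_cycle(members):
    """Each member transmits in fixed token order; the end-of-cycle block
    acknowledgment names missed senders, who get one retransmission slot."""
    missed = [m for m in members if random.random() < LOSS]
    recovered = [m for m in missed if random.random() >= LOSS]  # second chance
    delivered = len(members) - len(missed) + len(recovered)
    return delivered, missed, recovered

delivered, missed, recovered = token_cycle(MEMBERS)
print(f"{delivered}/{len(MEMBERS)} delivered; retried {missed}, recovered {recovered}")
```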

Relevance: 30.00%

Abstract:

A flow injection analysis (FIA) procedure for the speciation of Cr(III) and Cr(VI) using the 1,5-diphenylcarbazide (DPC) method is presented. Since Cr(III) does not interfere in the Cr(VI)-DPC reaction, Cr(VI) and total chromium [after the on-line oxidation of Cr(III) by Ce(IV)] are determined sequentially, and Cr(III) is obtained by difference. Under the experimental conditions described, the calibration graphs are linear up to 2 µg ml⁻¹ of Cr(VI) and 4 µg ml⁻¹ of Cr(III). The detection limits were 18 ng ml⁻¹ for Cr(VI) and 55 ng ml⁻¹ for Cr(III), at a signal-to-noise ratio of 3. The common interfering elements in the Cr(VI)-DPC reaction were investigated under dynamic FIA conditions. The FIA method was also compared with the conventional spectrophotometric procedure.
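The speciation-by-difference arithmetic is simple enough to sketch; the calibration slopes, intercepts, and absorbance readings below are hypothetical.

```python
def conc(signal, slope, intercept):
    """Concentration from a linear calibration: signal = slope * c + intercept."""
    return (signal - intercept) / slope

cr6 = conc(0.310, slope=0.155, intercept=0.002)       # direct Cr(VI)-DPC channel
cr_total = conc(0.520, slope=0.150, intercept=0.003)  # after on-line Ce(IV) oxidation
cr3 = cr_total - cr6                                  # Cr(III) by difference
print(f"Cr(VI) = {cr6:.2f} µg/ml, Cr(III) = {cr3:.2f} µg/ml")
```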