50 results for 1-RM test


Relevance: 30.00%

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved and incurs no additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model.

The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble splits into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed.

The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely used Robert-Asselin filter: it successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the filtered quantity. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of medium-term forecasts is increased by using the RAW filter.
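The RAW filter update itself is compact. The following is an illustrative sketch (function and parameter names are our own; the value alpha ≈ 0.53 follows Williams' published suggestion) of one filter application within a leapfrog time step; alpha = 1 recovers the classical Robert-Asselin filter:

```python
def raw_filter_step(x_prev_filt, x_curr, x_next, nu=0.2, alpha=0.53):
    """One Robert-Asselin-Williams (RAW) filter update (illustrative sketch).

    d is the usual Robert-Asselin displacement. The RAW filter applies a
    fraction alpha of d to the current step and (alpha - 1) to the new step,
    so the two corrections nearly cancel in the mean, which is why the mean
    value of the filtered quantity is not distorted.
    """
    d = 0.5 * nu * (x_prev_filt - 2.0 * x_curr + x_next)
    return x_curr + alpha * d, x_next + (alpha - 1.0) * d
```

With alpha = 1 the second return value is untouched and the update reduces to the standard Robert-Asselin filter.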

Relevance: 30.00%

Abstract:

In order to assist in comparing the computational techniques used in different models, the authors propose a standardized set of one-dimensional numerical experiments that could be completed for each model. The results of these experiments, obtained with a simplified form of the computational representation used in the models for advection, diffusion, the pressure gradient term, the Coriolis term, and the filter, should be reported in the peer-reviewed literature. Specific recommendations are described in this paper.

Relevance: 30.00%

Abstract:

Developing models to predict the effects of social and economic change on agricultural landscapes is an important challenge. Model development often involves deciding which aspects of the system require detailed description and which are reasonably insensitive to the assumptions made. However, important components of the system are often left out because parameter estimates are unavailable. In particular, the relative influence of different objectives, such as risk and environmental management, on farmer decision making has proven difficult to quantify. We describe a model that can make predictions of land use on the basis of profit alone or with the inclusion of explicit additional objectives. Importantly, our model is specifically designed to use parameter estimates for additional objectives obtained via farmer interviews. By statistically comparing the outputs of this model with a large farm-level land-use data set, we show that cropping patterns in the United Kingdom contain a significant contribution from farmers' preferences for objectives other than profit. In particular, we found that risk aversion had an effect on the accuracy of model predictions, whereas preference for a particular number of crops grown was less important. While nonprofit objectives have frequently been identified as factors in farmers' decision making, our results take this analysis further by demonstrating the relationship between these preferences and actual cropping patterns.

Relevance: 30.00%

Abstract:

In financial research, the sign of a trade (i.e., the identity of the trade aggressor) is not always available in transaction datasets, and it can be estimated using a simple set of rules called the tick test. In this paper we investigate the accuracy of the tick test from an analytical perspective by providing a closed-form expression for the performance of the prediction algorithm. By analyzing the derived equation, we provide formal arguments for the use of the tick test, proving that it is bound to perform better than chance (50/50) and that its set of rules yields an unbiased estimator of the trade signs. On the empirical side, we compare the values from the analytical formula against the empirical performance of the tick test for fifteen heavily traded stocks in the Brazilian equity market. The results show that the formula is quite realistic in assessing the accuracy of the prediction algorithm in a real-data setting.
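The rule set behind the tick test is small enough to state in a few lines. As an illustrative sketch (function name and conventions are our own), the standard rules classify an uptick as buyer-initiated, a downtick as seller-initiated, and reuse the last non-zero change's sign on a zero tick:

```python
def tick_test(prices):
    """Classify each trade as buyer-initiated (+1) or seller-initiated (-1)
    using the tick test (illustrative sketch).

    An uptick is a buy, a downtick is a sell; on a zero tick the sign of
    the last non-zero price change is reused. The first trade, and any
    leading zero ticks, cannot be classified and are marked 0.
    """
    signs = []
    last = 0       # sign of the last non-zero price change seen so far
    prev = None
    for p in prices:
        if prev is None or p == prev:
            signs.append(last)   # zero tick (or first trade): reuse last sign
        elif p > prev:
            last = 1
            signs.append(1)      # uptick: buyer-initiated
        else:
            last = -1
            signs.append(-1)     # downtick: seller-initiated
        prev = p
    return signs
```

For example, the price path 10.0, 10.1, 10.1, 10.0, 10.0 is classified as 0, +1, +1, -1, -1: the two zero ticks inherit the sign of the preceding change.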

Relevance: 30.00%

Abstract:

We apply experimental methods to study the role of risk aversion in players' behavior in repeated prisoner's dilemma games. Faced with quantitatively equal discount factors, the most risk-averse players choose Nash strategies more often when future profits are subject to uncertainty than when they are discounted in a deterministic way. Overall, we find that risk aversion is negatively related to the frequency of collusive outcomes.

Relevance: 30.00%

Abstract:

It has been reported that the ability to solve syllogisms is highly g-loaded. In the present study, using a self-administered shortened version of a syllogism-solving test, the BAROCO Short, we examined whether robust findings generated by previous research regarding IQ scores were also applicable to BAROCO Short scores. Five syllogism-solving problems were included in a questionnaire as part of a postal survey conducted by the Keio Twin Research Center. Data were collected from 487 pairs of twins (1021 individuals) who were Japanese junior high or high school students (ages 13–18) and from 536 mothers and 431 fathers. Four findings related to IQ were replicated: 1) The mean level increased gradually during adolescence, stayed unchanged from the 30s to the early 50s, and subsequently declined after the late 50s. 2) The scores for both children and parents were predicted by the socioeconomic status of the family. 3) The genetic effect increased, although the shared environmental effect decreased during progression from adolescence to adulthood. 4) Children's scores were genetically correlated with school achievement. These findings further substantiate the close association between syllogistic reasoning ability and g.

Relevance: 30.00%

Abstract:

This paper considers the effect of using a GARCH filter on the properties of the BDS test statistic, as well as a number of other issues relating to the application of the test. It is found that, for certain values of the user-adjustable parameters, the finite-sample distribution of the test is far removed from asymptotic normality. In particular, when data generated from a completely different model class are filtered through a GARCH model, the frequency of rejection of iid falls, often substantially. The implication of this result is that it may be inappropriate to use non-rejection of iid of the standardised residuals of a GARCH model as evidence that the GARCH model 'fits' the data.

Relevance: 30.00%

Abstract:

In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgments and memory performance), researchers often rely on by-participant analysis, in which metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal-detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type 1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to address the issue effectively, and our simulation studies examining Type 1 error rates indeed showed the superior performance of mixed-effects model analysis compared with the conventional by-participant analysis. We also present real-data applications to illustrate further strengths of mixed-effects model analysis. Our findings imply that caution is needed when using by-participant analysis, and we recommend mixed-effects model analysis instead.
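The per-participant resolution measure mentioned above, the Goodman-Kruskal gamma coefficient, is computed from concordant and discordant judgment-outcome pairs. A minimal sketch of the first stage of the by-participant workflow (the function name is ours; the abstract's point is that feeding such per-participant values into a t-test inflates Type 1 error when items vary randomly):

```python
from itertools import combinations

def goodman_kruskal_gamma(judgments, outcomes):
    """Goodman-Kruskal gamma between one participant's metacognitive
    judgments and memory outcomes (illustrative sketch).

    gamma = (C - D) / (C + D), where C and D count concordant and
    discordant item pairs; tied pairs are ignored. Returns NaN when
    every pair is tied.
    """
    c = d = 0
    for (j1, o1), (j2, o2) in combinations(zip(judgments, outcomes), 2):
        s = (j1 - j2) * (o1 - o2)
        if s > 0:
            c += 1      # concordant pair: judgment and outcome agree in order
        elif s < 0:
            d += 1      # discordant pair: orders disagree
    if c + d == 0:
        return float("nan")
    return (c - d) / (c + d)
```

In the by-participant approach this value is computed once per participant and the resulting sample of gammas is tested at the group level; the mixed-effects alternative models items and participants jointly instead.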

Relevance: 30.00%

Abstract:

We analyse the widely used international/Zürich sunspot number record, R, with a view to quantifying a suspected calibration discontinuity around 1945 (which has been termed the "Waldmeier discontinuity" [Svalgaard, 2011]). We compare R against the composite sunspot group data from the Royal Greenwich Observatory (RGO) network and the Solar Optical Observing Network (SOON), using both the number of sunspot groups, N_G, and the total area of the sunspots, A_G. In addition, we compare R with the recently developed interdiurnal variability geomagnetic indices IDV and IDV(1d). In all four cases, linearity of the relationship with R is not assumed, and care is taken to ensure that the relationship of each with R is the same before and after the putative calibration change. It is shown that the probability that a correction is not needed is of order 10⁻⁸ and that R is indeed too low before 1945. The optimum correction to R for values before 1945 is found to be 11.6%, 11.7%, 10.3% and 7.9% using A_G, N_G, IDV, and IDV(1d), respectively. The optimum value obtained by combining the sunspot group data is 11.6%, with an uncertainty range of 8.1-14.8% at the 2σ level. The geomagnetic indices provide an independent, if less stringent, test; they give values that fall within the 2σ uncertainty band, with optimum values slightly lower than those from the sunspot group data. The probability of the needed correction being as large as 20%, as advocated by Svalgaard [2011], is shown to be 1.6 × 10⁻⁵.

Relevance: 30.00%

Abstract:

In recent years, computational fluid dynamics (CFD) has been widely used as a method of simulating airflow and addressing indoor-environment problems. The complexity of airflows within the indoor environment makes experimental investigation difficult to undertake and also imposes significant challenges on turbulence modelling for flow prediction. This research examines through CFD visualization how air is distributed within a room. Measurements of air temperature and air velocity were performed at a number of points in an environmental test chamber with a human occupant. To complement the experimental results, CFD simulations were carried out, and the results enabled detailed analysis and visualization of the spatial distribution of airflow patterns and prediction of the effects of different parameters. The results demonstrate the complexity of modelling human exhalation within a ventilated enclosure and shed some light on how to achieve more realistic predictions of the airflow within an occupied enclosure.

Relevance: 30.00%

Abstract:

The permeability parameter (C) for the movement of cephalosporin C across the outer membrane of Pseudomonas aeruginosa was measured using the widely accepted method of Zimmermann & Rosselet. In one experiment, the value of C varied continuously from 4.2 to 10.8 cm³ min⁻¹ (mg dry wt cells)⁻¹ over a range of concentrations of the test substrate, cephalosporin C, from 50 to 5 μM. Dependence of C on the concentration of the test substrate was still observed when the effect of a possible electric potential difference across the outer membrane was corrected for. In quantitative studies of β-lactam permeation, the dependence of C on the β-lactam concentration should be taken into account.

Relevance: 30.00%

Abstract:

Existing urban meteorological networks have an important role to play as test beds for inexpensive and more sustainable measurement techniques that are now becoming possible in our increasingly smart cities. The Birmingham Urban Climate Laboratory (BUCL) is a near-real-time, high-resolution urban meteorological network (UMN) of automatic weather stations and inexpensive, nonstandard air temperature sensors. The network has recently been implemented with an initial focus on monitoring urban heat, infrastructure, and health applications. A number of UMNs exist worldwide; however, BUCL is novel in its density, the low-cost nature of its sensors, and its use of proprietary Wi-Fi networks. This paper provides an overview of the logistical aspects of implementing a UMN test bed at such a density, including selecting appropriate urban sites; testing and calibrating low-cost, nonstandard equipment; implementing strict quality-assurance/quality-control mechanisms (including metadata); and utilizing preexisting Wi-Fi networks to transmit data. Also included are visualizations of data collected by the network, including data from the July 2013 U.K. heatwave, as well as a discussion of potential applications. The paper is an open invitation to use the facility as a test bed for evaluating models and/or other nonstandard observation techniques, such as those generated via crowdsourcing.

Relevance: 30.00%

Abstract:

Background: The differential susceptibility hypothesis suggests that certain genetic variants moderate the effects of both negative and positive environments on mental health and may therefore be important predictors of response to psychological treatments. Nevertheless, the identification of such variants has so far been limited to preselected candidate genes. In this study we extended the differential susceptibility hypothesis from a candidate-gene to a genome-wide approach to test whether a polygenic score of environmental sensitivity predicted response to cognitive behavioural therapy (CBT) in children with anxiety disorders. Methods: We identified variants associated with environmental sensitivity using a novel method in which within-pair variability in emotional problems in 1,026 monozygotic (MZ) twin pairs was examined as a function of the pairs' genotype. We created a polygenic score of environmental sensitivity based on the whole-genome findings and tested the score as a moderator of parenting on emotional problems in 1,406 children and of response to individual, group and brief parent-led CBT in 973 children with anxiety disorders. Results: The polygenic score significantly moderated the effects of parenting on emotional problems and the effects of treatment. Individuals with a high score responded significantly better to individual CBT than to group CBT or brief parent-led CBT (remission rates: 70.9%, 55.5% and 41.6%, respectively). Conclusions: Pending successful replication, our results should be considered exploratory. Nevertheless, if replicated, they suggest that individuals with the greatest environmental sensitivity may be more likely to develop emotional problems in adverse environments but may also benefit more from the most intensive types of treatment.
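A polygenic score of the kind described above is, at its core, an additive weighted sum over genotyped variants. The following minimal sketch assumes per-variant weights have already been estimated (the function name, and any particular weights, are hypothetical; real pipelines also handle allele alignment, clumping, and missing genotypes):

```python
def polygenic_score(allele_counts, weights):
    """Additive polygenic score (illustrative sketch).

    allele_counts: per-variant effect-allele counts for one individual
                   (0, 1, or 2 copies).
    weights:       per-variant effect sizes from a discovery analysis
                   (here, the MZ within-pair variability GWAS).
    Returns the weighted sum of allele counts across variants.
    """
    return sum(g * w for g, w in zip(allele_counts, weights))
```

Individuals are then ranked (or thresholded) on this score, and the score is entered as a moderator in the treatment-response model.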