937 results for Statistical methodologies


Relevance: 20.00%

Abstract:

Traditionally, the cusp has been described in terms of a time-stationary feature of the magnetosphere which allows access of magnetosheath-like plasma to low altitudes. Statistical surveys of data from low-altitude spacecraft have shown the average characteristics and position of the cusp. Recently, however, it has been suggested that the ionospheric footprint of flux transfer events (FTEs) may be identified as variations of the “cusp” on timescales of a few minutes. In this model, the cusp can vary in form between a steady-state feature in one limit and a series of discrete ionospheric FTE signatures in the other limit. If this time-dependent cusp scenario is correct, then the signatures of the transient reconnection events must be able, on average, to reproduce the statistical cusp occurrence previously determined from the satellite observations. In this paper, we predict the precipitation signatures which are associated with transient magnetopause reconnection, following recent observations of the dependence of dayside ionospheric convection on the orientation of the IMF. We then employ a simple model of the longitudinal motion of FTE signatures to show how such events can easily reproduce the local time distribution of cusp occurrence probabilities, as observed by low-altitude satellites. This is true even in the limit where the cusp is a series of discrete events. Furthermore, we investigate the existence of double cusp patches predicted by the simple model and show how these events may be identified in the data.

Relevance: 20.00%

Abstract:

A number of case studies of large, transient, field-aligned ion flows in the topside ionosphere at high latitudes have been reported, showing that these events occur during periods of frictional heating and/or intense particle precipitation. This study examines the frequency of occurrence of such events for the altitude range 200–500 km, based on 3 years of incoherent scatter data. Correlations of the upgoing ion flux at 400 km with ion and electron temperatures at lower altitudes are presented, together with a discussion of possible mechanisms for the production of such large flows. The influence of low-altitude electron precipitation on the production of these events is also considered.

Relevance: 20.00%

Abstract:

Learning to talk about motion in a second language is very difficult because it involves restructuring deeply entrenched patterns from the first language (Slobin 1996). In this paper we argue that statistical learning (Saffran et al. 1997) can explain why L2 learners are only partially successful in restructuring their second language grammars. We explore to what extent L2 learners make use of two mechanisms of statistical learning, entrenchment and pre-emption (Boyd and Goldberg 2011), to acquire target-like expressions of motion and retreat from overgeneralisation in this domain. Paying attention to the frequency of existing patterns in the input can help learners to adjust the frequency with which they use path and manner verbs in French, but it is insufficient to acquire the boundary-crossing constraint (Slobin and Hoiting 1994) and to learn what not to say. We also look at the role of language proficiency and exposure to French in explaining the findings.

Relevance: 20.00%

Abstract:

A new frontier in weather forecasting is emerging as operational forecast models are now run at convection-permitting resolutions by many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal sizes of simulated convective storms and of the updrafts within them are much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with the standard mixing length performs best across all diagnostics, although a greater mixing length improves the representation of deep convective storms.

Relevance: 20.00%

Abstract:

The chapter examines how far medieval economic crises can be identified by analysing the residuals from a simultaneous equation model of the medieval English economy. High inflation, falls in gross domestic product and large intermittent changes in wage rates are all considered as potential indicators of crisis. Potential causal factors include bad harvests, wars and political instability. The chapter suggests that crises arose when a combination of different problems overwhelmed the capacity of government to address them. It may therefore be a mistake to look for a single cause of any crisis. The coincidence of separate problems is a more plausible explanation of many crises.

Relevance: 20.00%

Abstract:

Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients to ensure that the trial remains current, new treatments are evaluated as quickly as possible, and the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, for trials to be most useful in informing policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts when planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of statistical recommendations within general adaptive design guidance documents. Examples of confirmatory ongoing trials designed within the frequentist framework that have added an arm in practice are reported, and the details of each amendment are reviewed. An assessment is made as to how well the relevant statistical considerations were addressed in practice, and of the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, but that such guidance would be advantageous in helping this efficient design amendment to be used more frequently and appropriately in practice.
Eight confirmatory trials were identified to have added a treatment arm, suggesting that trials can benefit from this amendment and that it can be practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
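One recurring statistical consideration when adding an arm is multiplicity: each new active-versus-control comparison can inflate the familywise error rate unless the significance level is adjusted. The Python sketch below is illustrative only, using a standard normal-approximation sample-size formula and a simple Bonferroni split of alpha (one conservative option among several; the review itself does not prescribe a particular adjustment):

```python
from statistics import NormalDist  # Python 3.8+ standard library

def n_per_arm(delta, sigma, alpha, power):
    """Normal-approximation sample size per arm for a two-arm comparison
    of means: detect a difference `delta` with standard deviation `sigma`."""
    z = NormalDist().inv_cdf
    return 2 * ((z(1 - alpha / 2) + z(power)) * sigma / delta) ** 2

# Original two-arm design: a single comparison tested at the full alpha.
n_original = n_per_arm(delta=5.0, sigma=12.0, alpha=0.05, power=0.9)

# After a second active arm joins: a Bonferroni split across the two
# active-vs-control comparisons raises the required size of each arm.
n_adjusted = n_per_arm(delta=5.0, sigma=12.0, alpha=0.05 / 2, power=0.9)
```

The adjusted design needs noticeably more patients per arm, which is one reason the timing and funding of such amendments matter in practice.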

Relevance: 20.00%

Abstract:

We present the first multi-event study of the spatial and temporal structuring of the aurora to provide statistical evidence of the near-Earth plasma instability which causes the substorm onset arc. Using data from ground-based auroral imagers, we study repeatable signatures of along-arc auroral beads, which are thought to represent the ionospheric projection of magnetospheric instability in the near-Earth plasma sheet. We show that the growth and spatial scales of these wave-like fluctuations are similar across multiple events, indicating that each sudden auroral brightening has a common explanation. We find statistically that growth rates for auroral beads peak at low wavenumber, with the most unstable spatial scales mapping to an azimuthal wavelength λ ≈ 1700–2500 km in the equatorial magnetosphere at around 9–12 RE. We compare growth rates and spatial scales with a range of theoretical predictions of magnetotail instabilities, including the cross-field current instability and the shear-flow ballooning instability. We conclude that, although the cross-field current instability can generate growth rates of similar magnitude, the range of unstable wavenumbers indicates that the shear-flow ballooning instability is the most likely explanation for our observations.

Relevance: 20.00%

Abstract:

The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative economical solutions that meet a set of design requirements. However, available evolutionary methods are numerous, and methodologies to compare the performance of these methods beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically determines the performance of a given algorithm while also considering the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a PseudoGenetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and finally, E was analyzed for each network and algorithm. The efficiency measure indicated that PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient algorithm for less complex problems. However, the main contribution of this work is that the proposed efficiency rate provides a neutral strategy for comparing optimization algorithms and may be useful in the future to select the most appropriate algorithm for different types of optimization problems.
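The abstract does not reproduce the formula for E, so the sketch below uses a hypothetical weighting that merely captures its two stated ingredients, the quality of the obtained solution relative to the best known cost and the computational effort spent; the function name, arguments and the 50% effort penalty are assumptions for illustration, not the paper's definition:

```python
def efficiency_rate(found_cost, best_known_cost, evaluations, budget):
    """Hypothetical efficiency rate: solution quality (ratio of the best
    known cost to the cost actually found) discounted by the fraction of
    the evaluation budget consumed."""
    quality = best_known_cost / found_cost   # 1.0 when the best known cost is matched
    effort = evaluations / budget            # fraction of the budget used, in [0, 1]
    return quality * (1.0 - 0.5 * effort)    # assumed penalty: half credit at full budget

# An algorithm that matches the best known cost using half the budget
# scores higher than one that lands 5% above it using the full budget.
e_fast = efficiency_rate(100.0, 100.0, 50_000, 100_000)
e_slow = efficiency_rate(105.0, 100.0, 100_000, 100_000)
```

Any metric of this shape lets near-optimal-but-cheap runs outrank exactly-optimal-but-expensive ones, which is the neutral comparison strategy the abstract describes.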

Relevance: 20.00%

Abstract:

This special issue is a testament to the recent burgeoning interest by theoretical linguists, language acquisitionists and teaching practitioners in the neuroscience of language. It offers a highly valuable, state-of-the-art overview of the neurophysiological methods that are currently being applied to questions in the field of second language (L2) acquisition, teaching and processing. Research in the area of neurolinguistics has developed dramatically in the past twenty years, providing a wealth of exciting findings, many of which are discussed in the papers in this volume. The goal of this commentary is twofold. The first is to critically assess the current state of neurolinguistic data from the point of view of language acquisition and processing—informed by the papers that comprise this special issue and the literature as a whole—pondering how the neuroscience of language/processing might inform us with respect to linguistic and language acquisition theories. The second goal is to offer some links from implications of exploring the first goal towards informing language teachers and the creation of linguistically and neurolinguistically-informed evidence-based pedagogies for non-native language teaching.

Relevance: 20.00%

Abstract:

A dynamical wind-wave climate simulation covering the North Atlantic Ocean and spanning the whole 21st century under the A1B scenario has been compared with a set of statistical projections using atmospheric variables or large-scale climate indices as predictors. As a first step, the performance of all statistical models was evaluated for the present-day climate; namely, they were compared with a dynamical wind-wave hindcast in terms of winter Significant Wave Height (SWH) trends and variance, as well as with altimetry data. For the projections, statistical models that use wind speed as predictor are able to capture a larger fraction of the winter SWH inter-annual variability (68% on average) and of the long-term changes projected by the dynamical simulation. Conversely, regression models using climate indices, sea level pressure and/or pressure gradients as predictors account for a smaller fraction of the SWH variance (from 2.8% to 33%) and do not reproduce the dynamically projected long-term trends over the North Atlantic. Investigating the wind-sea and swell components separately, we have found that the combination of two regression models, one for wind-sea waves and another for the swell component, can significantly improve the wave field projections obtained from single regression models over the North Atlantic.
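As an illustration of the wind-speed regression approach described above, the sketch below fits winter-mean significant wave height to wind speed by ordinary least squares; the data are synthetic stand-ins (the slope, intercept and noise level are invented), whereas the real predictors and targets would come from the hindcast and altimetry records:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic winter means: 10-m wind speed (m/s) and significant wave
# height (m); the linear relation and noise level are illustrative only.
wind = rng.uniform(8.0, 16.0, size=30)
swh = 0.35 * wind - 1.0 + rng.normal(0.0, 0.3, size=30)

# Fit SWH = a * wind + b, the single-predictor regression discussed above.
A = np.column_stack([wind, np.ones_like(wind)])
(a, b), *_ = np.linalg.lstsq(A, swh, rcond=None)

# Fraction of the inter-annual SWH variance explained by the regression,
# the quantity the statistical models are compared on in the study.
pred = a * wind + b
r2 = 1.0 - np.sum((swh - pred) ** 2) / np.sum((swh - swh.mean()) ** 2)
```

The same explained-variance diagnostic extends naturally to two stacked regressions, one for wind-sea and one for swell, as in the combined model the abstract finds most skilful.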

Relevance: 20.00%

Abstract:

The main goal of this work was to evaluate the thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 °C. The distribution coefficients of oil at equilibrium were used to calculate the enthalpy, entropy and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, particularly at higher temperatures. The influences of solvent water content and temperature were also analysed using response surface methodology (RSM), which showed that the extraction yield was strongly affected by both independent variables. A joint analysis of the thermodynamic results and the RSM indicates the optimal solvent hydration level and temperature for the extraction process.
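The thermodynamic quantities mentioned above follow from standard equilibrium relations: ΔG = −RT ln K at each temperature, ΔH from the van 't Hoff relation, and ΔS from ΔG = ΔH − TΔS. A minimal sketch, assuming hypothetical distribution coefficients at the two temperature extremes (the real values come from the equilibrium measurements in the study):

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Hypothetical distribution coefficients of oil between phases at the two
# temperature extremes of the study; illustrative values only.
T1, K1 = 323.15, 0.8   # 50 °C
T2, K2 = 373.15, 1.6   # 100 °C

# Gibbs free energy change at each temperature: dG = -R * T * ln(K).
dG1 = -R * T1 * math.log(K1)
dG2 = -R * T2 * math.log(K2)

# Van 't Hoff: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1), assuming dH constant
# over the temperature range.
dH = -R * math.log(K2 / K1) / (1.0 / T2 - 1.0 / T1)

# Entropy change from dG = dH - T*dS, evaluated at T2.
dS = (dH - dG2) / T2
```

With these illustrative inputs, ΔG is negative at the higher temperature and ΔH is positive, a pattern consistent with the abstract's finding that the extraction is spontaneous particularly at higher temperatures.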

Relevance: 20.00%

Abstract:

In this paper, we compare the performance of two statistical approaches for the analysis of data from social research. In the first approach, we use normal models with joint regression modelling for the mean and for the variance heterogeneity. In the second approach, we use hierarchical models. In the first case, individual and social variables are included as explanatory variables in the regression models for the mean and for the variance, while in the second case, the variance at level 1 of the hierarchical model depends on the individuals (their ages), and at level 2 the variance is assumed to change with socioeconomic stratum. Applying these methodologies, we analyse a Colombian height data set to find differences that can be explained by socioeconomic conditions. We also present some theoretical and empirical results concerning the two models. From this comparative study, we conclude that it is better to jointly model the mean and the variance heterogeneity in all cases. We also observe that convergence of the Gibbs sampling chain used in the Markov chain Monte Carlo method for the joint modelling of the mean and variance heterogeneity is achieved quickly.
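A minimal sketch of the joint mean-variance idea follows. The paper fits its models by Gibbs sampling; here, purely for illustration, a simple iterative least-squares approximation is used on synthetic data, with invented covariates standing in for age (mean model) and socioeconomic stratum (variance model):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic height data: the mean depends on age, while the residual
# variance depends on socioeconomic stratum (all values invented).
age = rng.uniform(5.0, 18.0, n)
stratum = rng.integers(0, 2, n).astype(float)
sd = np.exp(0.5 + 0.5 * stratum)                  # heteroscedastic noise
height = 100.0 + 4.0 * age + rng.normal(0.0, sd)

X = np.column_stack([np.ones(n), age])            # mean-model design
Z = np.column_stack([np.ones(n), stratum])        # variance-model design

# Joint fit by alternating: (1) regress log squared residuals on Z for
# the variance coefficients, (2) weighted least squares for the mean.
beta = np.linalg.lstsq(X, height, rcond=None)[0]
for _ in range(20):
    resid = height - X @ beta
    gamma = np.linalg.lstsq(Z, np.log(resid**2 + 1e-12), rcond=None)[0]
    w = np.exp(-(Z @ gamma))                      # inverse-variance weights
    beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * height))
```

A positive estimate of `gamma[1]` recovers the built-in variance difference between strata, which is the kind of variance heterogeneity the joint model is designed to capture and a plain constant-variance regression would miss.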