19 results for SERIES MODELS

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

The problem of model selection for a univariate long memory time series is investigated once a semiparametric estimator for the long memory parameter has been used. Standard information criteria are not consistent in this case. A Modified Information Criterion (MIC) that overcomes these difficulties is introduced, and proofs establishing its asymptotic validity are provided. The results are general and cover a wide range of short memory processes. Simulation evidence compares the new and existing methodologies, and empirical applications to monthly inflation and daily realized volatility are presented.
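
The abstract does not specify which semiparametric estimator is used; as a minimal, hypothetical sketch of the kind of first-stage estimate involved, the log-periodogram (GPH) regression below estimates the long memory parameter d from the first m Fourier frequencies. It is not the MIC procedure itself.

```python
import numpy as np

def gph_estimate(x, power=0.5):
    """Estimate the long memory parameter d by GPH log-periodogram regression.

    Uses the first m = n**power Fourier frequencies (a common bandwidth choice).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** power)
    # Periodogram at the first m Fourier frequencies (excluding frequency zero)
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(dft) ** 2) / (2.0 * np.pi * n)
    # Regress log periodogram on log(4 sin^2(lambda/2)); the slope estimates -d
    regressor = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, intercept = np.polyfit(regressor, np.log(periodogram), 1)
    return -slope

# Example: an ARFIMA(0, d, 0)-like series built by fractionally integrating white noise
rng = np.random.default_rng(0)
d_true, n = 0.3, 2048
k = np.arange(1, n)
weights = np.concatenate(([1.0], np.cumprod((k - 1 + d_true) / k)))  # (1 - B)^(-d) weights
x = np.convolve(rng.standard_normal(n), weights)[:n]
print("estimated d:", round(gph_estimate(x), 3))
```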

Relevance:

60.00%

Publisher:

Abstract:

A field experiment was conducted on a real continuous steel Gerber-truss bridge to which artificial damage was applied. This article summarizes the results of the experiment for bridge damage detection utilizing traffic-induced vibrations. It investigates the sensitivity to bridge damage of a number of quantities, including the identified modal parameters and their statistical patterns, Nair's damage indicator and its statistical pattern, and different sets of measurement points. The modal parameters are identified by autoregressive time-series models. The decision on bridge health condition is made, and the sensitivity of the variables is evaluated, with the aid of the Mahalanobis–Taguchi system, a multivariate pattern recognition tool. Several observations are made. For the modal parameters, although bridge damage detection can be achieved by applying the Mahalanobis–Taguchi system to certain modal parameters of certain sets of measurement points, difficulties arose from the subjective selection of meaningful bridge modes and from the low sensitivity of the statistical pattern of the modal parameters to damage. For Nair's damage indicator, bridge damage detection could be achieved by applying the Mahalanobis–Taguchi system to Nair's damage indicators of most sets of measurement points. As a damage indicator, Nair's damage indicator was superior to the modal parameters, with three main advantages: it does not require any subjective decision in its calculation, so potential human errors can be prevented and the detection task can be automated; its statistical pattern has high sensitivity to damage; and it is flexible regarding the choice of sets of measurement points.
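
The abstract does not give the Mahalanobis–Taguchi computation in detail; the sketch below shows only its core ingredient, the Mahalanobis distance of new feature vectors (e.g. identified AR coefficients or damage indicators per measurement set) from a healthy-state reference. All data and the threshold are hypothetical.

```python
import numpy as np

def mahalanobis_scores(reference, test):
    """Mahalanobis distance of each test feature vector from the healthy reference set.

    reference : (n_ref, p) feature vectors from the undamaged (baseline) state
    test      : (n_test, p) feature vectors from the state under inspection
    """
    mu = reference.mean(axis=0)
    cov = np.cov(reference, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse guards against a singular covariance
    diff = test - mu
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return np.sqrt(d2)

# Illustrative use: flag observations whose distance exceeds a baseline-derived threshold
rng = np.random.default_rng(1)
baseline = rng.normal(size=(200, 6))              # healthy-state features
damaged = rng.normal(loc=0.8, size=(50, 6))       # shifted features mimicking damage
threshold = np.percentile(mahalanobis_scores(baseline, baseline), 99)
print("fraction flagged:", (mahalanobis_scores(baseline, damaged) > threshold).mean())
```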

Relevance:

40.00%

Publisher:

Abstract:

The speeds of sound u, densities ρ and refractive indices nD of homologous series of mono-, di-, and tri-alkylamines were measured in the temperature range from 298.15 to 328.15 K. The isentropic and isothermal compressibilities κS and κT, molar refraction Rm, Eykman's constant Cm, Rao's molar sound function R, thermal expansion coefficient α, thermal pressure coefficient γ, and reduction parameters P*, V*, and T* in the frameworks of the ERAS model for associated amines and the Flory model for tertiary amines have been calculated from the measured experimental data. The applicability of the Rao theory and of the ERAS and Flory models has been examined and discussed for the alkylamines.
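
For readers unfamiliar with how the compressibilities are obtained from these measurements, the isentropic compressibility follows from the Newton-Laplace relation κS = 1/(ρu²); the snippet below applies it to illustrative, not measured, values.

```python
# Isentropic compressibility from speed of sound and density (Newton-Laplace relation):
# kappa_S = 1 / (rho * u**2). The values below are illustrative, not measured data.
def isentropic_compressibility(u_m_per_s, rho_kg_per_m3):
    """Return kappa_S in 1/Pa given the speed of sound (m/s) and density (kg/m^3)."""
    return 1.0 / (rho_kg_per_m3 * u_m_per_s ** 2)

u = 1200.0      # hypothetical speed of sound in an alkylamine, m/s
rho = 740.0     # hypothetical density, kg/m^3
print(f"kappa_S = {isentropic_compressibility(u, rho):.3e} Pa^-1")
```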

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we report the synthesis and biological activity of a series of dihydroisocoumarin analogues in which fatty acids, alcohols, or amines of varying hydrocarbon chain length and degree of unsaturation were conjugated to the dihydroisocoumarins kigelin and mellein at the C-7 and C-8 positions of the core dihydroisocoumarin structure. These compounds were evaluated for their antiproliferative activity against human breast cancer (MCF-7 and MDA-MB-468) and melanoma cells (SK-MEL-28 and Malme-3M) using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl-2H-tetrazolium bromide (MTT) assay. Two compounds conjugated with gamma-linolenyl alcohol (18:3 n-6) demonstrated potent antiproliferative activity in vitro, with one of these, 4-hydroxy-3-oxo-1,3-dihydro-isobenzofuran-5-carboxylic acid octadeca-6,9,12-trienyl ester, demonstrating significant antitumor activity in vivo in a number of human tumor xenograft models.

Relevance:

30.00%

Publisher:

Abstract:

Background: Evidence suggests that in prokaryotes sequence-dependent transcriptional pauses affect the dynamics of transcription and translation, as well as of small genetic circuits. So far, a few pause-prone sequences have been identified from in vitro measurements of transcription elongation kinetics.

Results: Using a stochastic model of gene expression at the nucleotide and codon levels with realistic parameter values, we investigate three different but related questions and present statistical methods for their analysis. First, we show that information from in vivo RNA and protein temporal numbers is sufficient to discriminate between models with and without a pause site in their coding sequence. Second, we demonstrate that it is possible to separate a large variety of models from each other, with pauses of various durations and locations in the template, by means of hierarchical clustering and a random forest classifier. Third, we introduce an approximate likelihood function that allows the location of a pause site to be estimated.

Conclusions: This method can aid in detecting unknown pause-prone sequences from temporal measurements of RNA and protein numbers at a genome-wide scale and thus elucidate possible roles that these sequences play in the dynamics of genetic networks and phenotype.
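
As a rough, hypothetical illustration of the classification step described in the Results (not the authors' nucleotide- and codon-level model), the sketch below simulates toy RNA count trajectories with and without a pause, summarizes them with temporal statistics, and trains a random forest classifier to discriminate the two cases.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def simulate_counts(pause, n_steps=200):
    """Toy birth-death simulation of RNA counts; a pause lowers the effective production rate.

    This stands in for the authors' detailed stochastic model, which is not reproduced here.
    """
    rate = 0.8 if pause else 1.0
    rna = np.zeros(n_steps)
    for t in range(1, n_steps):
        produced = rng.poisson(rate)
        degraded = rng.binomial(int(rna[t - 1]), 0.05)
        rna[t] = max(rna[t - 1] + produced - degraded, 0)
    return rna

def features(series):
    # Temporal summary statistics used as classifier inputs
    return [series.mean(), series.var(), np.diff(series).std(), series[-50:].mean()]

X = np.array([features(simulate_counts(pause)) for pause in [0, 1] * 100])
y = np.array([0, 1] * 100)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```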

Relevance:

30.00%

Publisher:

Abstract:

Emotion research has long been dominated by the “standard method” of displaying posed or acted static images of facial expressions of emotion. While this method has been useful, it is unable to investigate the dynamic nature of emotion expression. Although continuous self-report traces have enabled the measurement of dynamic expressions of emotion, a consensus has not been reached on the correct statistical techniques that permit inferences to be made with such measures. We propose Generalized Additive Models and Generalized Additive Mixed Models as techniques that can account for the dynamic nature of such continuous measures. These models allow us to hold constant shared components of responses that are due to perceived emotion across time, while enabling inference concerning linear differences between groups. The mixed model (GAMM) approach is preferred, as it can account for autocorrelation in time-series data and allows emotion-decoding participants to be modelled as random effects. To increase confidence in linear differences, we assess methods that address interactions between categorical variables and dynamic changes over time. In addition, we comment on the use of Generalized Additive Models to assess the effect size of shared perceived emotion and discuss sample sizes. Finally, we address additional uses: the inference of feature detection, continuous variable interactions, and the measurement of ambiguity.
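
As a minimal sketch of the modelling strategy described (a smooth of time for the shared perceived-emotion trajectory plus a categorical group term), the example below fits a GAM assuming statsmodels' GLMGam/BSplines interface; the data, variable names and smoothing parameters are hypothetical, and a full GAMM with decoders as random effects would typically be fitted with a dedicated mixed-model GAM implementation such as R's mgcv.

```python
import numpy as np
import pandas as pd
from statsmodels.gam.api import GLMGam, BSplines

# Hypothetical data: continuous emotion-intensity ratings traced over time by two groups
rng = np.random.default_rng(3)
t = np.tile(np.linspace(0, 10, 200), 2)
group = np.repeat(['A', 'B'], 200)
rating = np.sin(t) + (group == 'B') * 0.5 + rng.normal(scale=0.3, size=400)
data = pd.DataFrame({'time': t, 'group': group, 'rating': rating})

# The smooth of time captures the shared perceived-emotion trajectory;
# the categorical group term tests for a linear difference between groups.
bs = BSplines(data[['time']], df=[10], degree=[3])
gam = GLMGam.from_formula('rating ~ group', data=data, smoother=bs, alpha=[1.0])
res = gam.fit()
print(res.summary())
```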

Relevance:

30.00%

Publisher:

Abstract:

The German site of Geißenklösterle is crucial to debates concerning the European Middle to Upper Palaeolithic transition and the origins of the Aurignacian in Europe. Previous dates from the site are central to an important hypothesis, the Kulturpumpe model, which posits that the Swabian Jura was an area where crucial behavioural developments took place and then spread to other parts of Europe. The previous chronology, critical to the model, is based mainly on radiocarbon dating, but remains poorly constrained due to the dating resolution and the variability of dates. The cause of these problems is disputed, but two principal explanations have been proposed: a) larger than expected variations in the production of atmospheric radiocarbon, and b) taphonomic influences at the site that mixed the dated bones into different parts of the site. We reinvestigate the chronology using a new series of radiocarbon determinations obtained from the Mousterian, Aurignacian and Gravettian levels. The results strongly imply that the previous dates were affected by insufficient decontamination of the bone collagen prior to dating. Using an ultrafiltration protocol, the chronometric picture becomes much clearer. Comparison of the results against other recently dated sites in other parts of Europe suggests that the Early Aurignacian levels are earlier than those at sites in the south of France and Italy, but not as early as recently dated sites that suggest a pre-Aurignacian dispersal of modern humans to Italy by ~45,000 cal BP. They are consistent with the importance of the Danube Corridor as a key route for the movement of people and ideas. The new dates fail to refute the Kulturpumpe model and suggest that the Swabian Jura is a region that contributed significantly to the evolution of symbolic behaviour, as indicated by early evidence for figurative art, music and mythical imagery. © 2012 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

A good understanding of the different theoretical models is essential when working in the field of mental health. Not only does it help with understanding experiences of mental health difficulties and finding meaning, but it also provides a framework for expanding our knowledge of the field.

As part of the Foundations of Mental Health Practice series, this book provides a critical overview of the theoretical perspectives relevant to mental health practice. At the core of this book is the idea that no single theory is comprehensive on its own and each theory has its limitations. Divided into two parts, Part I explores traditional models of mental health and covers the key areas of bio-medical perspectives, psychological perspectives and social perspectives, whilst Part II looks at contemporary ideas that challenge and push these traditional views. The contributions, strengths and limitations of each model are explored and, as a result, the book encourages a more holistic, open approach to understanding and responding to mental health issues.

Together, these different approaches offer students and practitioners a powerful set of perspectives from which to approach their study and careers. Each model is covered in a clear and structured way with supporting exercises and case studies. It is an essential text for anyone studying or practising in the field of mental health, including social workers, nurses and psychologists.

Relevance:

30.00%

Publisher:

Abstract:

Objective
To investigate the effect of fast food consumption on mean population body mass index (BMI) and explore the possible influence of market deregulation on fast food consumption and BMI.

Methods
The within-country association between fast food consumption and BMI in 25 high-income member countries of the Organisation for Economic Co-operation and Development between 1999 and 2008 was explored through multivariate panel regression models, after adjustment for per capita gross domestic product, urbanization, trade openness, lifestyle indicators and other covariates. The possible mediating effect of annual per capita intake of soft drinks, animal fats and total calories on the association between fast food consumption and BMI was also analysed. Two-stage least squares regression models were estimated, using economic freedom as an instrumental variable, to study the causal effect of fast food consumption on BMI.

Findings
After adjustment for covariates, each 1-unit increase in annual fast food transactions per capita was associated with an increase of 0.033 kg/m2 in age-standardized BMI (95% confidence interval, CI: 0.013–0.052). Only the intake of soft drinks – not animal fat or total calories – mediated the observed association (β: 0.030; 95% CI: 0.010–0.050). Economic freedom was an independent predictor of fast food consumption (β: 0.27; 95% CI: 0.16–0.37). When economic freedom was used as an instrumental variable, the association between fast food and BMI weakened but remained significant (β: 0.023; 95% CI: 0.001–0.045).

Conclusion
Fast food consumption is an independent predictor of mean BMI in high-income countries. Market deregulation policies may contribute to the obesity epidemic by facilitating the spread of fast food.
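
As a hypothetical sketch of the instrumental-variable step described in the Methods, the code below runs a naive two-stage least squares by hand on synthetic data; variable names are illustrative, and a real analysis would use an IV estimator with proper standard errors.

```python
import numpy as np

def two_stage_least_squares(y, endog, instrument, exog):
    """Naive 2SLS: instrument the endogenous regressor, then run the second-stage OLS.

    y          : outcome (e.g. mean BMI), shape (n,)
    endog      : endogenous regressor (e.g. fast food transactions per capita), shape (n,)
    instrument : instrumental variable (e.g. economic freedom index), shape (n,)
    exog       : exogenous controls (e.g. GDP per capita, urbanization), shape (n, k)

    Note: second-stage OLS standard errors are not valid for inference; a dedicated
    IV implementation should be used for real analyses.
    """
    n = len(y)
    ones = np.ones((n, 1))
    # Stage 1: project the endogenous regressor onto the instrument and controls
    Z = np.hstack([ones, instrument.reshape(-1, 1), exog])
    endog_hat = Z @ np.linalg.lstsq(Z, endog, rcond=None)[0]
    # Stage 2: regress the outcome on the fitted values and controls
    X = np.hstack([ones, endog_hat.reshape(-1, 1), exog])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return beta[1]   # coefficient on the instrumented regressor

# Illustrative synthetic example with a confounder that biases naive OLS
rng = np.random.default_rng(4)
n = 250
econ_freedom = rng.normal(size=n)
controls = rng.normal(size=(n, 2))
confounder = rng.normal(size=n)
fast_food = 0.5 * econ_freedom + confounder + rng.normal(size=n)
bmi = 0.03 * fast_food + 0.5 * confounder + controls @ np.array([0.1, -0.2]) + rng.normal(size=n)
print("2SLS estimate:", round(two_stage_least_squares(bmi, fast_food, econ_freedom, controls), 3))
```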

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies and government bodies in the developed world. Significant and consistent improvements in mortality rates and, hence, life expectancy have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models that forecast trends in mortality data in order to anticipate future life expectancy and, hence, quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality rates by time, age and cohort, and forecast these trends into the future using standard statistical methods. The modeling approaches used have failed to capture the effects of any structural change in the trend and have thus potentially produced incorrect forecasts of future mortality rates. In this paper, we look at a range of leading stochastic models of mortality and test for structural breaks in the trend time series.
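
The abstract does not name the structural break tests used; as one hypothetical illustration, the sketch below applies a Chow-type F test for a single known break date to a synthetic mortality period index (such as a Lee-Carter kappa_t).

```python
import numpy as np
from scipy import stats

def chow_test(t, y, break_index):
    """Chow F test for a structural break in a linear trend y = a + b*t at a known break point."""
    def rss(tt, yy):
        X = np.column_stack([np.ones_like(tt), tt])
        resid = yy - X @ np.linalg.lstsq(X, yy, rcond=None)[0]
        return resid @ resid

    k = 2                                   # parameters per segment (intercept, slope)
    rss_pooled = rss(t, y)
    rss_split = rss(t[:break_index], y[:break_index]) + rss(t[break_index:], y[break_index:])
    n = len(y)
    f_stat = ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
    p_value = stats.f.sf(f_stat, k, n - 2 * k)
    return f_stat, p_value

# Hypothetical period index (e.g. a Lee-Carter kappa_t) whose downward trend steepens mid-sample
rng = np.random.default_rng(5)
years = np.arange(1960, 2011, dtype=float)
kappa = np.where(years < 1985, -0.5 * (years - 1960), -12.5 - 1.2 * (years - 1985))
kappa = kappa + rng.normal(scale=1.0, size=len(years))
print(chow_test(years, kappa, break_index=25))
```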

Relevance:

30.00%

Publisher:

Abstract:

In this paper we compare a number of the classical models used to characterize fading in body area networks (BANs) with the recently proposed shadowed κ–μ fading model. In particular, we focus on BAN channels which are considered to be susceptible to shadowing by the human body. The measurements considered in this study were conducted at 2.45 GHz for hypothetical BAN channels operating in both anechoic and highly reverberant environments while the person was moving. Compared to the Rice, Nakagami and lognormal fading models, it was found that the recently proposed shadowed κ–μ fading model provided an enhanced fit to the measured data.
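
Since the shadowed κ–μ model has no implementation in common scientific Python libraries, the sketch below fits only the classical reference distributions mentioned (Rice, Nakagami and lognormal) to a synthetic envelope series and ranks them by log-likelihood; it does not reproduce the 2.45 GHz measurements or the shadowed κ–μ fit.

```python
import numpy as np
from scipy import stats

# Synthetic received-signal envelope standing in for the measured BAN data
rng = np.random.default_rng(6)
envelope = stats.rice.rvs(b=1.5, scale=0.4, size=2000, random_state=rng)

# Fit the classical fading models mentioned in the abstract and rank them by log-likelihood
candidates = {
    'Rice': stats.rice,
    'Nakagami': stats.nakagami,
    'Lognormal': stats.lognorm,
}
for name, dist in candidates.items():
    params = dist.fit(envelope, floc=0)               # fix location at zero for envelope data
    loglik = np.sum(dist.logpdf(envelope, *params))
    print(f"{name:10s} log-likelihood = {loglik:.1f}")
```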

Relevance:

30.00%

Publisher:

Abstract:

Many modeling problems require estimating a scalar output from one or more time series. Such problems are usually tackled by extracting a fixed number of features from the time series (such as their statistical moments), with a consequent loss of information that leads to suboptimal predictive models. Moreover, feature extraction techniques usually make assumptions that are not met in real-world settings (e.g. uniformly sampled time series of constant length), and fail to deliver a thorough methodology for dealing with noisy data. In this paper a methodology based on functional learning is proposed to overcome these problems; the proposed Supervised Aggregative Feature Extraction (SAFE) approach allows one to derive continuous, smooth estimates of time series data (yielding aggregate local information), while simultaneously estimating a continuous shape function that yields optimal predictions. The SAFE paradigm enjoys several properties, such as a closed-form solution, the incorporation of first- and second-order derivative information into the regressor matrix, the interpretability of the generated functional predictor and the possibility of exploiting the Reproducing Kernel Hilbert Space setting to yield nonlinear predictive models. Simulation studies are provided to highlight the strengths of the new methodology with respect to standard unsupervised feature selection approaches. © 2012 IEEE.
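
The SAFE formulation itself is not reproduced here; as a simplified stand-in for the functional-learning idea, the sketch below represents each irregularly sampled, variable-length series by smooth polynomial coefficients and feeds them to a ridge regression that predicts the scalar output, with all data synthetic.

```python
import numpy as np
from sklearn.linear_model import Ridge

def functional_features(times, values, degree=4):
    """Represent one (possibly irregularly sampled, variable-length) series by the
    coefficients of a smooth polynomial fitted over a normalized time axis.

    This is a simplified stand-in for the continuous, smooth estimates used by
    functional-learning approaches such as SAFE; it is not the SAFE method itself.
    """
    t = np.asarray(times, dtype=float)
    t = (t - t[0]) / (t[-1] - t[0])          # normalize time to [0, 1]
    return np.polyfit(t, values, degree)

# Hypothetical dataset: each sample is a noisy curve whose shape determines a scalar target
rng = np.random.default_rng(7)
X, y = [], []
for _ in range(300):
    n_obs = rng.integers(30, 80)             # variable length
    t = np.sort(rng.uniform(0, 1, n_obs))    # irregular sampling
    a = rng.uniform(-1, 1)
    curve = a * np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=n_obs)
    X.append(functional_features(t, curve))
    y.append(a)                               # scalar output to predict
X, y = np.array(X), np.array(y)

model = Ridge(alpha=1.0).fit(X[:200], y[:200])
print("held-out R^2:", round(model.score(X[200:], y[200:]), 3))
```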

Relevance:

30.00%

Publisher:

Abstract:

Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with a variety of SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM), two contrasting SD methods, in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells at four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
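
As a hypothetical illustration of the validation statistics compared in the paper (distribution statistics and wet/dry day and spell frequencies), the snippet below computes them for synthetic observed and downscaled daily precipitation series; the wet-day threshold and data are illustrative only.

```python
import numpy as np

def validation_stats(precip, wet_threshold=1.0):
    """Summary statistics of a daily precipitation series of the kind compared in the paper."""
    precip = np.asarray(precip, dtype=float)
    wet = precip >= wet_threshold
    # Lengths of consecutive dry spells: gaps between successive wet days, minus one
    dry_runs = np.diff(np.flatnonzero(np.concatenate(([True], wet, [True])))) - 1
    dry_spells = dry_runs[dry_runs > 0]
    return {
        'mean': precip.mean(),
        'p95': np.percentile(precip, 95),
        'max': precip.max(),
        'wet_day_freq': wet.mean(),
        'mean_dry_spell': dry_spells.mean() if dry_spells.size else 0.0,
    }

# Hypothetical observed vs. downscaled daily series for a ten-year validation period
rng = np.random.default_rng(8)
observed = rng.gamma(shape=0.4, scale=6.0, size=3650) * (rng.random(3650) < 0.45)
downscaled = rng.gamma(shape=0.4, scale=7.0, size=3650) * (rng.random(3650) < 0.42)
for name, series in [('observed', observed), ('downscaled', downscaled)]:
    print(name, validation_stats(series))
```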