950 results for Longitudinal Data Analysis and Time Series


Abstract:

Linear mixed effects models are frequently used to analyse longitudinal data, due to their flexibility in modelling the covariance structure between and within observations. Further, it is easy to deal with unbalanced data, either with respect to the number of observations per subject or per time period, and with varying time intervals between observations. In most applications of mixed models to biological sciences, a normal distribution is assumed both for the random effects and for the residuals. This, however, makes inferences vulnerable to the presence of outliers. Here, linear mixed models employing thick-tailed distributions for robust inferences in longitudinal data analysis are described. Specific distributions discussed include the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted, and the Gibbs sampler and the Metropolis-Hastings algorithms are used to carry out the posterior analyses. An example with data on orthodontic distance growth in children is discussed to illustrate the methodology. Analyses based on either the Student-t distribution or on the usual Gaussian assumption are contrasted. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process for modelling distributions of the random effects and of residuals in linear mixed models, and the MCMC implementation allows the computations to be performed in a flexible manner.
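
To make the scale-mixture device concrete: a Student-t variate can be written as a normal whose precision is mixed with a Gamma variate, which is what makes a Gibbs sampler straightforward. Below is a minimal sketch of such a sampler for a random-intercept model with t-distributed residuals; the simulated data, the fixed variance parameters and the degrees of freedom are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated random-intercept data: y_ij = 2 + 0.5*t_j + b_i + e_ij,
# with Student-t residuals (illustrative values, not the paper's data)
n_subj, n_obs = 20, 4
t = np.tile(np.arange(n_obs, dtype=float), n_subj)
subj = np.repeat(np.arange(n_subj), n_obs)
y = 2.0 + 0.5 * t + rng.normal(0, 1, n_subj)[subj] + rng.standard_t(3, t.size)

X = np.column_stack([np.ones_like(t), t])
nu, sigma2, tau2 = 4.0, 1.0, 1.0         # held fixed for brevity; a full
beta, b = np.zeros(2), np.zeros(n_subj)  # sampler would update these too
draws = []

for it in range(2000):
    # 1. Mixing weights: a t residual is a normal with Gamma-mixed precision,
    #    w_ij | rest ~ Gamma((nu+1)/2, rate=(nu + e_ij^2/sigma2)/2)
    e = y - X @ beta - b[subj]
    w = rng.gamma((nu + 1) / 2, 2.0 / (nu + e**2 / sigma2))
    W = w / sigma2
    # 2. Fixed effects: weighted least-squares posterior under a flat prior
    V = np.linalg.inv(X.T @ (W[:, None] * X))
    m = V @ (X.T @ (W * (y - b[subj])))
    beta = rng.multivariate_normal(m, V)
    # 3. Random intercepts: one conjugate normal draw per subject
    r = y - X @ beta
    for i in range(n_subj):
        idx = subj == i
        prec = W[idx].sum() + 1 / tau2
        b[i] = rng.normal((W[idx] * r[idx]).sum() / prec, np.sqrt(1 / prec))
    if it >= 500:
        draws.append(beta)

print("posterior mean of beta:", np.mean(draws, axis=0))
```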

Abstract:

Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and for communicating this understanding to stakeholders and the broader community. To help address these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity illustrating dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software runs on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes, and visualisation of simulated piezometric surfaces. Both alluvial-system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and for community information and consultation. The "virtual" groundwater systems in these 3D GVS models can be interactively interrogated through standard functions, including production of 2D cross-sections, data selection from the 3D scene, and back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community and stakeholder understanding and knowledge of groundwater systems, and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and for interpretation of groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera 'fly-throughs' and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems.
To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted online are included in the references.

Abstract:

Selection criteria and misspecification tests for the intra-cluster correlation structure (ICS) in longitudinal data analysis are considered. In particular, the asymptotic distribution of the correlation information criterion (CIC) is derived, and a new method for selecting a working ICS is proposed by standardizing the selection criterion as a p-value. The CIC test is found to be powerful in detecting misspecification of the working ICS, while for working ICS selection the standardized CIC test is also shown to have satisfactory performance. Simulation studies and applications to two real longitudinal datasets illustrate how these criteria and tests can be used.
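
For reference, the CIC is commonly defined (following Hin and Wang, 2009) as the trace penalty term of the QIC; the notation below is the standard one and is assumed rather than quoted from this paper:

$$\mathrm{CIC}(R) = \operatorname{tr}\!\left(\hat{\Omega}_I\,\hat{V}_r\right),$$

where $\hat{\Omega}_I$ is the model-based information matrix of $\hat{\beta}$ evaluated under working independence, and $\hat{V}_r$ is the robust (sandwich) covariance estimator obtained under the candidate working structure $R$; smaller CIC values favour the candidate ICS.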

Abstract:

A modeling paradigm is proposed for covariate, variance and working correlation structure selection in longitudinal data analysis. Appropriate selection of covariates is a prerequisite for correct variance modeling, and appropriate selection of both covariates and the variance function is in turn vital to correlation structure selection. This leads to a stepwise model selection procedure that deploys a combination of different model selection criteria. Although these criteria share a common theoretical root, based on approximating the Kullback-Leibler distance, they are designed to address different aspects of model selection and have different merits and limitations. For example, the extended quasi-likelihood information criterion (EQIC) with a covariance penalty performs well for covariate selection even when the working variance function is misspecified, but EQIC carries little information about correlation structures. The proposed model selection strategies are outlined and a Monte Carlo assessment of their finite-sample properties is reported. Two longitudinal studies are used for illustration.
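
The exact combination of criteria is not given in the abstract; the skeleton below only illustrates the stepwise logic (covariates, then variance function, then correlation structure). The scoring functions and candidate sets are placeholders, to be replaced by real EQIC- and correlation-criterion computations on fitted models.

```python
from itertools import combinations

# Stand-in scoring functions: replace with real EQIC / correlation-criterion
# computations fitted to your data; these dummies only make the loop runnable.
def eqic(covariates, variance_fn):
    return len(covariates) + {"constant": 0.5, "power": 0.2}[variance_fn]

def correlation_criterion(corr_structure):
    return {"independence": 3.1, "exchangeable": 2.4, "ar1": 2.0}[corr_structure]

candidates = ["age", "sex", "treatment", "baseline"]
variance_fns = ["constant", "power"]
corr_structures = ["independence", "exchangeable", "ar1"]

# Step 1: covariate selection under a rich working variance function.
best_cov = min((s for k in range(1, len(candidates) + 1)
                for s in combinations(candidates, k)),
               key=lambda s: eqic(s, "power"))
# Step 2: variance-function selection given the chosen covariates.
best_var = min(variance_fns, key=lambda v: eqic(best_cov, v))
# Step 3: correlation-structure selection given covariates and variance.
best_cor = min(corr_structures, key=correlation_criterion)

print(best_cov, best_var, best_cor)
```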

Abstract:

Robust methods are useful for making reliable statistical inferences when there are small deviations from the model assumptions. The widely used method of generalized estimating equations can be "robustified" by replacing the standardized residuals with M-residuals. Because the M-residuals are no longer guaranteed to have zero expectation, the parameter estimators from the robust approach are asymptotically biased when the error distributions are not symmetric. We propose a distribution-free method for correcting this bias. Our extensive numerical studies show that the proposed method can reduce the bias substantially. Examples are given for illustration.
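
The abstract does not spell out the correction, but one distribution-free way to remove the bias of M-residuals under asymmetric errors is to centre them by their empirical mean. The sketch below illustrates that idea with Huber's psi-function on simulated skewed errors; all choices here are assumptions for illustration, not the paper's method.

```python
import numpy as np

def huber_psi(r, c=1.345):
    """Huber's psi: identity in the middle, clipped in the tails."""
    return np.clip(r, -c, c)

rng = np.random.default_rng(1)
resid = rng.exponential(1.0, 500) - 1.0   # asymmetric, mean-zero errors

psi = huber_psi(resid)
print("uncentred mean of psi-residuals:", psi.mean())   # != 0: source of bias

# Distribution-free correction: centre psi by its empirical mean, so the
# robust estimating function based on psi(r_i) - psi_bar has mean ~ 0.
psi_centred = psi - psi.mean()
print("centred mean:", psi_centred.mean())              # ~ 0 by construction
```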

Abstract:

The Asymmetric Power ARCH representation of volatility was introduced by Ding et al. (1993) to account for asymmetric responses in volatility in the analysis of continuous-valued financial time series such as the log-return series of foreign exchange rates, stock indices or share prices. As reported by Brannas and Quoreshi (2010), asymmetric responses in volatility are also observed in time series of counts, such as the number of intra-day transactions in stocks. In this work, an asymmetric power autoregressive conditional Poisson model is introduced for the analysis of time series of counts exhibiting asymmetric overdispersion. Basic probabilistic and statistical properties are summarized and parameter estimation is discussed. A simulation study is presented to illustrate the proposed model. Finally, an empirical application to data on the daily number of stock transactions demonstrates its practical applicability in data analysis.
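
For orientation, the A-PARCH recursion of Ding et al. (1993) for a series $\varepsilon_t = \sigma_t z_t$ is

$$\sigma_t^{\delta} = \omega + \sum_{i=1}^{p} \alpha_i \left(|\varepsilon_{t-i}| - \gamma_i \varepsilon_{t-i}\right)^{\delta} + \sum_{j=1}^{q} \beta_j \sigma_{t-j}^{\delta},$$

where $\gamma_i \in (-1, 1)$ produces the asymmetric response. The count analogue described in the abstract presumably drives a conditional Poisson intensity $\lambda_t$ by past counts in a similar asymmetric-power form, though the paper's exact specification is not reproduced here.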

Abstract:

The objective of this dissertation is to study the structure and behaviour of the Atmospheric Boundary Layer (ABL) in stable conditions. This type of boundary layer is not yet completely understood, although it is very important for many practical applications, from forecast modelling to atmospheric dispersion of pollutants. We analyzed data from the SABLES98 experiment (Stable Atmospheric Boundary Layer Experiment in Spain, 1998) and compared the behaviour of these data against Monin-Obukhov similarity functions for wind speed and potential temperature. Analyzing the vertical profiles of various variables, in particular the thermal and momentum fluxes, we identified two main contrasting structures describing two different states of the stable boundary layer (SBL): a traditional and an upside-down boundary layer. We determined the main features of these two states in terms of vertical profiles of potential temperature and wind speed, turbulent kinetic energy and fluxes, studying the time series and vertical structure of the atmosphere for two separate nights in the dataset, taken as case studies. We also developed an original classification of the SBL, intended to separate the influence of mesoscale phenomena from turbulent behaviour, using the wind speed and the gradient Richardson number as parameters. We then compared these two formulations using the SABLES98 dataset, verifying their validity for different variables (wind speed and potential temperature, and their differences, at different heights) and for different stability parameters (ζ = z/L or the gradient Richardson number). Despite the completely different physical origins of these two classifications, we found some common behaviour, in particular under weak-stability conditions.
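
For the two stability parameters used in the classification, here is a minimal sketch of how they are computed from tower profiles and surface fluxes; the toy profile values are illustrative, not SABLES98 data.

```python
import numpy as np

G, KAPPA = 9.81, 0.4  # gravity (m s^-2), von Karman constant

def gradient_richardson(theta, u, v, z):
    """Ri_g = (g/theta) d(theta)/dz / [(du/dz)^2 + (dv/dz)^2], by finite differences."""
    dth = np.gradient(theta, z)
    du, dv = np.gradient(u, z), np.gradient(v, z)
    shear2 = du**2 + dv**2
    return (G / theta) * dth / np.maximum(shear2, 1e-10)

def stability_parameter(z, u_star, wtheta, theta0):
    """zeta = z/L, with Obukhov length L = -u*^3 theta0 / (kappa g <w'theta'>)."""
    L = -u_star**3 * theta0 / (KAPPA * G * wtheta)
    return z / L

# Toy stable-night profile: temperature and wind speed increasing with height
z = np.linspace(5, 100, 20)
theta = 285 + 0.03 * z
u = 2 + 0.04 * z
print(gradient_richardson(theta, u, np.zeros_like(u), z)[:3])
print(stability_parameter(10.0, u_star=0.2, wtheta=-0.01, theta0=285.0))
```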

Abstract:

While many time-series studies of ozone and daily mortality have identified positive associations, others yielded null or inconclusive results. We performed a meta-analysis of 144 effect estimates from 39 time-series studies, and estimated pooled effects by lags, age groups, cause-specific mortality, and concentration metrics. We compared the results to estimates from the National Morbidity, Mortality, and Air Pollution Study (NMMAPS), a time-series study of 95 large U.S. cities from 1987 to 2000. Both the meta-analysis and the NMMAPS results provided strong evidence of a short-term association between ozone and mortality, with larger effects for cardiovascular and respiratory mortality, the elderly, and current-day ozone exposure as compared with other single-day lags. In both analyses, results were not sensitive to adjustment for particulate matter or to model specification. In the meta-analysis we found that a 10 ppb increase in daily ozone is associated with a 0.83% increase in total mortality (95% confidence interval: 0.53%, 1.12%), whereas the corresponding NMMAPS estimate is 0.25% (95% CI: 0.12%, 0.39%). Meta-analysis results were consistently larger than those from NMMAPS, indicating publication bias. Additional publication bias is evident in the choice of lags in time-series studies and in the larger heterogeneity of posterior city-specific estimates in the meta-analysis as compared with NMMAPS.
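
The abstract does not state which pooling estimator was used; as an illustration of how study-level estimates are pooled in such a meta-analysis, here is a DerSimonian-Laird random-effects sketch with hypothetical inputs.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study estimates (DerSimonian-Laird).

    One common default pooling estimator, shown purely for illustration.
    """
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                         # fixed-effect weights
    fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fe) ** 2)         # Cochran's Q
    k = len(effects)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (variances + tau2)             # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical study-level % mortality increases per 10 ppb ozone
est = [0.9, 0.6, 1.2, 0.4, 0.8]
var = [0.04, 0.02, 0.09, 0.03, 0.05]
pooled, se, tau2 = dersimonian_laird(est, var)
print(f"pooled = {pooled:.2f}%, 95% CI = ({pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f})")
```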

Abstract:

Most statistical analysis, in theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful, as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme for capturing the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application to small-sample, normally distributed repeated-measures growth curve data is presented.
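
To make the dynamic Bayesian updating scheme concrete, here is the forward recursion for the simplest dynamic linear model, the local level y_t = theta_t + v_t with theta_t = theta_{t-1} + w_t; the model and numbers are an illustrative assumption, not the dissertation's hierarchical application.

```python
import numpy as np

def local_level_filter(y, v_var, w_var, m0=0.0, c0=1e6):
    """Forward Bayesian updating for y_t = theta_t + v_t, theta_t = theta_{t-1} + w_t."""
    m, c = m0, c0
    means, variances = [], []
    for obs in y:
        a, r = m, c + w_var          # prior for theta_t given y_{1:t-1}
        k = r / (r + v_var)          # adaptive gain
        m = a + k * (obs - a)        # posterior mean after seeing y_t
        c = (1 - k) * r              # posterior variance
        means.append(m)
        variances.append(c)
    return np.array(means), np.array(variances)

rng = np.random.default_rng(2)
theta = np.cumsum(rng.normal(0, 0.3, 50))       # slowly drifting state
y = theta + rng.normal(0, 1.0, 50)
m, c = local_level_filter(y, v_var=1.0, w_var=0.09)
print("final posterior:", m[-1], "+/-", np.sqrt(c[-1]))
```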

Abstract:

Background: Heat-related impacts may have greater public health implications as climate change continues, so it is important to characterise appropriately the relationship between heatwaves and health outcomes. However, it is unclear whether a case-crossover design can be used effectively to assess event- or episode-related health effects. This study examined the association between exposure to heatwaves and mortality and emergency hospital admissions (EHAs) from non-external causes in Brisbane, Australia, using both case-crossover and time series analysis approaches.

Methods: A Poisson generalised additive model (GAM) and time-stratified case-crossover analyses were used to assess the short-term impact of heatwaves on mortality and EHAs, adjusting for air pollution, day of the week, and season.

Results: Heatwaves exhibited a significant impact on both mortality and EHAs. In the time-stratified case-crossover analysis, the odds ratios of mortality and EHAs during heatwaves at lag 1 were 1.62 (95% confidence interval (CI): 1.36–1.94) and 1.22 (95% CI: 1.14–1.30), respectively. Time series GAM models gave similar results: at lag 1, relative risks of mortality and EHAs ranged from 1.72 (95% CI: 1.40–2.11) to 1.81 (95% CI: 1.56–2.10) and from 1.14 (95% CI: 1.06–1.23) to 1.28 (95% CI: 1.21–1.36), respectively. The risk estimates gradually attenuated after a lag of one day in both the case-crossover and the time series analyses.

Conclusions: The risk estimates from the case-crossover and time series models were consistent and comparable. This finding may have implications for future research on the assessment of event- or episode-related (e.g., heatwave) health effects.
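
As an illustration of the time-stratified case-crossover design (not the paper's code): each case day is compared with referent days from the same stratum, commonly defined as the same calendar month and day of week, so that exposure on case versus referent days can feed a conditional logistic regression.

```python
from datetime import date, timedelta

def time_stratified_referents(case_day: date):
    """Referent days: same calendar month and weekday as the case day."""
    refs, d = [], date(case_day.year, case_day.month, 1)
    while d.month == case_day.month:
        if d.weekday() == case_day.weekday() and d != case_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs

# Example: a death on a (hypothetical) heatwave day is matched with the
# other same-weekday days of that month.
print(time_stratified_referents(date(2004, 2, 10)))
```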

Abstract:

The method of generalized estimating equations (GEEs) provides consistent estimates of the regression parameters in a marginal regression model for longitudinal data, even when the working correlation model is misspecified (Liang and Zeger, 1986). However, the efficiency of a GEE estimate can be seriously affected by the choice of the working correlation model. This study addresses this problem by proposing a hybrid method that combines multiple GEEs based on different working correlation models, using the empirical likelihood method (Qin and Lawless, 1994). Analyses show that this hybrid method is more efficient than a GEE using a misspecified working correlation model. Furthermore, if one of the working correlation structures correctly models the within-subject correlations, then this hybrid method provides the most efficient parameter estimates. In simulations, the hybrid method's finite-sample performance is superior to a GEE under any of the commonly used working correlation models and is almost fully efficient in all scenarios studied. The hybrid method is illustrated using data from a longitudinal study of the respiratory infection rates in 275 Indonesian children.
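
A sketch of the combination step in the notation of Qin and Lawless (1994), assumed rather than quoted from this paper: the subject-level GEE score contributions $g_i^{(k)}(\beta)$ under the $K$ candidate working structures are stacked into $g_i(\beta) = \big(g_i^{(1)}(\beta)^{\top}, \dots, g_i^{(K)}(\beta)^{\top}\big)^{\top}$, and $\beta$ is estimated by maximum empirical likelihood:

$$\max_{\beta,\,p_1,\dots,p_n}\ \sum_{i=1}^{n}\log p_i \quad \text{subject to} \quad p_i \ge 0,\quad \sum_{i=1}^{n} p_i = 1,\quad \sum_{i=1}^{n} p_i\, g_i(\beta) = 0.$$

The overidentifying information from the extra estimating functions is what delivers the efficiency gain when at least one working structure is correct.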

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become common practice in dairy husbandry, and in 2006 about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper spends monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily.

Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method, and the results are subjective.

A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated.

In order to develop an expert system to automatically detect lameness cases, a model was needed, and a probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts: 5,074 measurements from 37 cows were used to train the model, and its ability to detect lameness was evaluated on a validation dataset of 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and in a real-time lameness monitoring system.
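
For reference, a probabilistic neural network is essentially a Parzen-window classifier: each class score is an average of Gaussian kernels centred on that class's training points, and a new observation is assigned to the class with the highest score. A minimal sketch with toy two-feature data (feature names, values and the smoothing parameter are illustrative, not the thesis's variables):

```python
import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Probabilistic neural network: argmax over classwise Parzen densities."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        # Squared distances from each new point to the class's training points
        d2 = ((X_new[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        scores.append(np.exp(-d2 / (2 * sigma**2)).mean(axis=1))
    return classes[np.argmax(scores, axis=0)]

rng = np.random.default_rng(3)
# Toy stand-ins for per-milking features, e.g. leg-load asymmetry, kick count
sound = rng.normal([0.05, 1.0], 0.05, (200, 2))
lame = rng.normal([0.25, 3.0], 0.08, (200, 2))
X = np.vstack([sound, lame])
y = np.array([0] * 200 + [1] * 200)
print(pnn_predict(X, y, np.array([[0.06, 1.1], [0.30, 2.8]])))
```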

Abstract:

African societies are dependent on rainfall for agricultural and other water-dependent activities, yet rainfall is extremely variable in both space and time, and recurring water shocks, such as drought, can have considerable social and economic impacts. To help improve our knowledge of the rainfall climate, we have constructed a 30-year (1983–2012), temporally consistent rainfall dataset for Africa known as TARCAT (TAMSAT African Rainfall Climatology And Time-series), using archived Meteosat thermal infra-red (TIR) imagery calibrated against rain gauge records collated from numerous African agencies. TARCAT has been produced at 10-day (dekad) scale at a spatial resolution of 0.0375°. An intercomparison of TARCAT from 1983 to 2010 with six long-term precipitation datasets indicates that TARCAT replicates the spatial and seasonal rainfall patterns and the interannual variability well, with correlation coefficients for the interannual variability of Africa-wide mean monthly rainfall of 0.85 against the Climate Research Unit (CRU) and 0.70 against the Global Precipitation Climatology Centre (GPCC) gridded-gauge analyses. The design of the algorithm for drought monitoring leads to TARCAT underestimating the Africa-wide mean annual rainfall on average by −0.37 mm day⁻¹ (21%) compared to the other datasets. As the TARCAT rainfall estimates are historically calibrated across large, climatically homogeneous regions, the data can provide users with robust estimates of climate-related risk, even in regions where gauge records are inconsistent in time.

Abstract:

Each plasma physics laboratory has its own proprietary control and data acquisition system, usually different from one laboratory to another; each laboratory has its own way of controlling the experiment and retrieving data from the database. Fusion research relies to a great extent on international collaboration, and these private systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has therefore been upgraded to support a joint research programme using remote participation technologies. The choice of MDSplus (Model Driven System plus) is supported by the fact that it is widely used, so scientists from different institutions can use the same system in different experiments on different tokamaks without needing to know how each machine handles its data acquisition and analysis. Another important point is that MDSplus has a library system that allows communication between different languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL and OCTAVE. In the case of the tokamak TCABR, interfaces (the subject of this paper) were developed between the system already in use and MDSplus, rather than adopting MDSplus at all stages from control and data acquisition through to data analysis. This was done to preserve a complex system already in operation, whose full migration would otherwise have taken a long time. The implementation also allows new components to be added that use MDSplus fully at all stages.
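
As a flavour of the remote participation that MDSplus enables, here is a minimal read over the thin-client Python interface; the server address, tree name, shot number and node path below are placeholders, not TCABR's actual identifiers.

```python
# Requires the MDSplus Python package (ships with the MDSplus distribution).
from MDSplus import Connection

SERVER = "mdsplus.example.org"        # placeholder, not TCABR's real server
conn = Connection(SERVER)             # thin client: talks to a remote mdsip server
conn.openTree("tcabr_tree", 12345)    # placeholder tree name and shot number

# Fetch a stored signal and its timebase via node path / TDI expression
ip = conn.get("\\PLASMA_CURRENT")           # placeholder node name
t = conn.get("dim_of(\\PLASMA_CURRENT)")    # its time axis

print(len(ip.data()), "samples from shot 12345")
```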