968 results for Longitudinal Data Analysis and Time Series
Abstract:
Objective: To document the course of psychological symptomatology, mental health treatment, and unmet psychological needs using caregiver reports in the first 18 months following pediatric brain injury (BI). Method: Participants included 28 children (aged 1-18 years) who were hospitalized at a children's hospital's rehabilitation unit. Caregiver reports of children's psychological symptoms, receipt of mental health treatment, and unmet psychological needs were assessed at 1, 6, 12, and 18 months post-BI. Results: Caregivers reported a general increase in psychological symptoms and receipt of mental health treatment over the 18 months following BI; however, there was a substantial gap between the high rate of reported symptoms and the low rate of reported treatment. Across all four follow-up time points there were substantial unmet psychological needs (at least 60% of the sample). Conclusions: Findings suggest that there are substantial unmet psychological needs among children during the first 18 months after BI. Barriers to mental health treatment for this population need to be addressed.
Abstract:
Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Today, however, data mining rests on low-level specifications of the employed techniques, typically bound to a specific analysis platform; it therefore lacks a modelling architecture that lets analysts treat it as a true software-engineering process. With this situation in mind, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations that automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Analysts can thus concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks tied to the underlying platform's technical details; these time-consuming tasks are entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario in which a time series analysis is required.
Abstract:
Information on crop phenology is essential for evaluating crop productivity. In previous work, we determined phenological stages from remote sensing data using a dynamic system framework and an extended Kalman filter (EKF) approach. In this paper, we demonstrate that the particle filter is a more reliable method than the EKF for inferring any phenological stage, and we discuss the improvements achieved with this approach. In addition, the methodology enables the estimation of key cultivation dates, providing a practical product for many applications. The dates of some important stages, such as the sowing date and the day the crop reaches the panicle initiation stage, were chosen to show the potential of the technique.
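A minimal sketch of the particle filter idea described above, assuming a scalar phenological-stage index that advances roughly linearly between satellite acquisitions; the random-walk growth model, the noise levels, and all names here are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=1000,
                              process_std=0.05, obs_std=0.1):
    """Bootstrap particle filter for a scalar hidden state (a
    phenological-stage index) observed through noisy remote sensing."""
    particles = rng.normal(0.0, 1.0, n_particles)   # prior over the stage
    estimates = []
    for y in observations:
        # Propagate: illustrative random-walk growth between acquisitions
        particles = particles + 1.0 + rng.normal(0.0, process_std, n_particles)
        # Weight each particle by the Gaussian likelihood of the observation
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2) + 1e-300
        weights /= weights.sum()
        # Posterior-mean estimate of the stage at this acquisition
        estimates.append(float(np.sum(weights * particles)))
        # Resample to avoid weight degeneracy
        particles = particles[rng.choice(n_particles, n_particles, p=weights)]
    return np.array(estimates)
```

A key cultivation date such as panicle initiation can then be read off as the first acquisition at which the estimated stage crosses the corresponding threshold.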
Abstract:
Population ecology of marine organisms frequently takes a descriptive approach in which sizes and densities are plotted over time. This approach is of limited use for designing management strategies or modelling different scenarios. Population projection matrix models are among the most widely used tools in ecology. Unfortunately, for the majority of pelagic marine organisms it is difficult to mark individuals and follow them over time to determine their vital rates and build a population projection matrix model. Nevertheless, it is possible to obtain time-series data of size structure and the density of each size class, and from these to determine the matrix parameters. This approach, known as the "demographic inverse problem", is based on quadratic programming methods but has rarely been applied to aquatic organisms. We used unpublished field data on a population of the cubomedusa Carybdea marsupialis to construct a population projection matrix model and compare two management strategies for lowering the population to its pre-2008 values, when there was no significant interaction with bathers: direct removal of medusae, and reduction of their prey. Our results showed that removing jellyfish of all size classes was more effective than removing only juveniles or adults. When reducing prey, the population declined most efficiently when prey depletion affected the prey of all medusa sizes. Our model fit the field data well and may serve to design an efficient management strategy or to build hypothetical scenarios such as removal of individuals or prey reduction. The method is applicable to other marine or terrestrial species for which density and population structure over time are available.
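The "demographic inverse problem" the abstract mentions can be illustrated with a small sketch: recover a non-negative projection matrix A from a time series of size-class density vectors by constrained least squares (non-negative least squares being a special case of quadratic programming). The shapes and function names are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import nnls

def fit_projection_matrix(N):
    """Recover a projection matrix A from stage-density time series,
    so that N[:, t+1] ~= A @ N[:, t] with non-negative entries.

    N : array of shape (n_stages, n_times), density per size class.
    """
    n_stages = N.shape[0]
    X = N[:, :-1].T                    # densities at time t (predictors)
    A = np.zeros((n_stages, n_stages))
    for i in range(n_stages):
        # Row i solves: minimise ||X @ a_i - N[i, 1:]||^2  s.t.  a_i >= 0
        A[i], _ = nnls(X, N[i, 1:])
    return A
```

With A in hand, management scenarios such as removing a fraction of each size class, or reducing prey (which lowers growth and fecundity entries), can be simulated by modifying the corresponding matrix entries and projecting the population forward.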
Abstract:
Subsidence is a hazard of natural or anthropogenic origin that causes significant economic losses. The area of Murcia city (SE Spain) has been affected by subsidence due to groundwater overexploitation since 1992. The main historical piezometric level declines occurred in the periods 1982–1984, 1992–1995 and 2004–2008 and showed a close correlation with the temporal evolution of ground displacements. Since 2008, pressure recovery in the aquifer has led to an uplift of the ground surface detected by the extensometers. In the present work, an elastic hydro-mechanical finite element code has been used to compute subsidence time series for 24 geotechnical boreholes, prescribing the measured groundwater table evolution. The results have been compared with the displacements estimated through an advanced DInSAR technique and measured by the extensometers. These spatio-temporal comparisons show that, in spite of the limited geomechanical data available, the model satisfactorily reproduces the subsidence phenomenon affecting Murcia City. The model will allow prediction of future induced deformations and of the consequences of any piezometric level variation in the study area.
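As a drastically simplified stand-in for the elastic hydro-mechanical finite element code, a one-dimensional elastic compaction sketch shows the basic mechanism: a piezometric head drop raises effective stress by γ_w·Δh, and each compressible layer strains by Δσ′/E. All layer properties and names here are hypothetical:

```python
import numpy as np

GAMMA_W = 9.81  # unit weight of water, kN/m^3

def elastic_subsidence(head_series, thickness, modulus):
    """1D elastic compaction for one borehole column.

    head_series : piezometric head over time (m)
    thickness   : compressible layer thicknesses (m)
    modulus     : constrained (oedometric) moduli of the layers (kPa)
    Returns ground displacement (m); negative values are subsidence,
    and head recovery reverses them (purely elastic behaviour).
    """
    head_drop = head_series[0] - np.asarray(head_series)   # m of decline
    d_sigma = GAMMA_W * head_drop                          # kPa of added effective stress
    # Compaction at each time step, summed over layers
    return -np.outer(d_sigma, np.asarray(thickness) / np.asarray(modulus)).sum(axis=1)
```

The model in the paper couples groundwater flow and deformation in a finite element mesh; this sketch only reproduces the elastic, fully recoverable response that makes the post-2008 uplift possible.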
Abstract:
Ecological succession provides a widely accepted description of seasonal changes in phytoplankton and mesozooplankton assemblages in the natural environment, but concurrent changes in smaller (i.e. microbial) and larger (i.e. macroplankton) organisms are not included in the model because plankton ranging from bacteria to jellies are seldom sampled and analyzed simultaneously. Here we studied, for the first time in the aquatic literature, succession in a whole-plankton assemblage spanning five orders of magnitude in size, from microbes to macroplankton predators (not including fish or fish larvae, for which no consistent data were available). Samples were collected weekly in the northwestern Mediterranean Sea (Bay of Villefranche) for 10 months, and simultaneously collected samples were analyzed by flow cytometry, inverse microscopy, FlowCam, and ZooScan. The whole-plankton assemblage underwent sharp reorganizations corresponding to bottom-up events of vertical mixing in the water column, and its development was controlled top-down by large gelatinous filter feeders and predators. Based on the results of this novel whole-plankton assemblage approach, we propose a new comprehensive conceptual model of the annual plankton succession (i.e. a whole-plankton model) characterized both by stepwise stacking of four broad trophic communities from early spring through summer, which is a new concept, and by progressive replacement of ecological plankton categories within the different trophic communities, as traditionally recognised.
Abstract:
Includes indexes.
Abstract:
"Originally published as Monograph no.6 of the Cowles Commission for Research in Economics."
Multivariate analyses of variance and covariance for simulation studies involving normal time series
Abstract:
Photocopy.
Abstract:
"First published during the war as a classified report to Section D2, National Defense Research Committee."
Abstract:
Mode of access: Internet.
Abstract:
Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECMs assume non-zero entries in all their coefficient matrices. However, applications of VECMs to financial market data have revealed that zero entries are often a necessary part of efficient modelling, and in such cases the use of full-order models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, over-parameterised full-order VECMs may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that may contain zero entries. Two case studies demonstrate the usefulness of the algorithm: a test of purchasing power parity and a three-variable system involving the stock market.
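The ZNZ pattern-selection algorithm itself is not available in standard libraries, but the full-order VECM baseline it improves on can be fitted with statsmodels; the three synthetic series below are placeholders for data such as the purchasing-power-parity case study:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Hypothetical cointegrated system standing in for real market data
rng = np.random.default_rng(1)
common = rng.normal(size=300).cumsum()
data = pd.DataFrame({
    "p_domestic": common + rng.normal(scale=0.5, size=300),
    "p_foreign":  common + rng.normal(scale=0.5, size=300),
    "fx_rate":    rng.normal(size=300).cumsum(),
})

# Choose the cointegrating rank with Johansen's trace test
rank = select_coint_rank(data, det_order=0, k_ar_diff=1).rank

# Full-order VECM: alpha (loadings) and beta (cointegrating vectors)
# are estimated as dense matrices, with no zero pattern imposed
res = VECM(data, k_ar_diff=1, coint_rank=max(rank, 1), deterministic="co").fit()
print("loading matrix alpha:\n", res.alpha)
print("cointegrating vectors beta:\n", res.beta)

# Granger (non-)causality test within the fitted system
print(res.test_granger_causality(caused="fx_rate", signif=0.05).summary())
```

In the ZNZ approach, near-zero entries of alpha and beta would instead be constrained to exact zeros before inference, which is what sharpens the tests for indirect and Granger non-causality.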