976 results for Temporal models


Relevance: 30.00%

Abstract:

Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by post-session questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
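
The per-patient modelling step rests on ordinary vector auto-regression. A minimal sketch of such a lag-1 VAR fit with statsmodels follows; the variable names, synthetic values, and lag order are illustrative assumptions, not the authors' TSPA specification.

```python
# Minimal sketch of the per-patient VAR step underlying TSPA (illustrative only;
# variable names, synthetic ratings, and the lag order are assumptions).
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n_sessions = 25
alliance = np.zeros(n_sessions)
self_eff = np.zeros(n_sessions)
for t in range(1, n_sessions):
    # toy coupled dynamics: self-efficacy follows the previous session's alliance
    alliance[t] = 0.6 * alliance[t - 1] + rng.normal(scale=0.3)
    self_eff[t] = 0.5 * self_eff[t - 1] + 0.4 * alliance[t - 1] + rng.normal(scale=0.3)

sessions = pd.DataFrame({"alliance": 4 + alliance, "self_efficacy": 3 + self_eff})

fit = VAR(sessions).fit(maxlags=1)   # lag 1: each session predicted from the previous one
print(fit.params)                    # cross-lagged coefficients = candidate session-to-session effects
print(fit.test_causality("self_efficacy", ["alliance"]).summary())  # Granger-type test
```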

Relevance: 30.00%

Abstract:

In a network of competing species, a competitive intransitivity occurs when the ranking of competitive abilities does not follow a linear hierarchy (A > B > C but C > A). A variety of mathematical models suggests that intransitive networks can prevent or slow down competitive exclusion and maintain biodiversity by enhancing species coexistence. However, it has been difficult to assess empirically the relative importance of intransitive competition because a large number of pairwise species competition experiments are needed to construct a competition matrix that is used to parameterize existing models. Here we introduce a statistical framework for evaluating the contribution of intransitivity to community structure using species abundance matrices that are commonly generated from replicated sampling of species assemblages. We provide metrics and analytical methods for using abundance matrices to estimate species competition and patch transition matrices by using reverse-engineering and a colonization-competition model. These matrices provide complementary metrics to estimate the degree of intransitivity in the competition network of the sampled communities. Benchmark tests reveal that the proposed methods could successfully detect intransitive competition networks, even in the absence of direct measures of pairwise competitive strength. To illustrate the approach, we analyzed patterns of abundance and biomass of five species of necrophagous Diptera and eight species of their hymenopteran parasitoids that co-occur in beech forests in Germany. We found evidence for a strong competitive hierarchy within communities of flies and parasitoids. However, for parasitoids, there was a tendency towards increasing intransitivity in higher weight classes, which represented larger resource patches. These tests provide novel methods for empirically estimating the degree of intransitivity in competitive networks from observational datasets. They can be applied to experimental measures of pairwise species interactions, as well as to spatio-temporal samples of assemblages in homogenous environments or environmental gradients.
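
One intuitive way to express the degree of intransitivity in a competition network is to count the cyclic (intransitive) triads in a pairwise dominance matrix. The sketch below shows this generic count on an invented tournament; it is not the reverse-engineering or colonization-competition machinery introduced in the paper.

```python
# Count intransitive triads (A beats B, B beats C, C beats A) in a pairwise
# competition matrix. Illustrative only; the paper's metrics are derived from
# abundance matrices, not from a directly observed tournament like this one.
import numpy as np
from itertools import combinations

# dominance[i, j] = 1 if species i competitively excludes species j
dominance = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 0, 0],   # species 3 beats species 0 -> creates intransitive loops
])

n = dominance.shape[0]
intransitive = 0
for i, j, k in combinations(range(n), 3):
    wins = dominance[i, j] + dominance[j, k] + dominance[k, i]
    # in a fully resolved triad, a 3-cycle occurs when each species wins exactly once
    if wins == 3 or wins == 0:
        intransitive += 1

total_triads = n * (n - 1) * (n - 2) // 6
print(f"{intransitive} of {total_triads} triads are intransitive")
```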

Relevance: 30.00%

Abstract:

Simulating the spatio-temporal dynamics of inundation is key to understanding the role of wetlands under past and future climate change. Earlier modelling studies have mostly relied on fixed prescribed peatland maps and inundation time series of limited temporal coverage. Here, we describe and assess the Dynamical Peatland Model Based on TOPMODEL (DYPTOP), which predicts the extent of inundation based on a computationally efficient TOPMODEL implementation. This approach rests on an empirical, grid-cell-specific relationship between the mean soil water balance and the flooded area. DYPTOP combines the simulated inundation extent and its temporal persistence with criteria for the ecosystem water balance and the modelled peatland-specific soil carbon balance to predict the global distribution of peatlands. We apply DYPTOP in combination with the LPX-Bern DGVM and benchmark the global-scale distribution, extent, and seasonality of inundation against satellite data. DYPTOP successfully predicts the spatial distribution and extent of wetlands and major boreal and tropical peatland complexes and reveals the governing limitations to peatland occurrence across the globe. Peatlands covering large boreal lowlands are reproduced only when accounting for a positive feedback induced by the enhanced mean soil water holding capacity in peatland-dominated regions. DYPTOP is designed to minimize input data requirements, optimize computational efficiency, and allow for modular adoption in Earth system models.
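
The empirical, grid-cell-specific relationship between mean soil water balance and flooded area can be pictured as a saturating curve fitted per grid cell. The sketch below fits a logistic form with SciPy; the functional form, data, and parameter names are assumptions made for illustration, not DYPTOP's actual parameterization.

```python
# Illustrative sketch: fit a grid-cell-specific sigmoid relating the monthly
# water-table position to the flooded fraction of the cell. The logistic form,
# data, and parameter names are assumptions, not DYPTOP's actual equations.
import numpy as np
from scipy.optimize import curve_fit

def flooded_fraction(wtp, psi, k, wtp0):
    """Maximum inundatable fraction psi, approached as water-table position wtp rises."""
    return psi / (1.0 + np.exp(-k * (wtp - wtp0)))

# monthly mean water-table position (m, negative = below surface) vs. observed flooded fraction
wtp_obs     = np.array([-0.60, -0.45, -0.30, -0.20, -0.10, -0.05, 0.00, 0.05])
flooded_obs = np.array([ 0.01,  0.02,  0.05,  0.10,  0.22,  0.30, 0.38, 0.42])

params, _ = curve_fit(flooded_fraction, wtp_obs, flooded_obs, p0=[0.5, 10.0, -0.1])
psi, k, wtp0 = params
print(f"max flooded fraction={psi:.2f}, steepness={k:.1f}, inflection at {wtp0:.2f} m")
```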

Relevance: 30.00%

Abstract:

OBJECTIVE There is increasing evidence that epileptic activity involves widespread brain networks rather than single sources and that these networks contribute to interictal brain dysfunction. We investigated the fast-varying behavior of epileptic networks during interictal spikes in right and left temporal lobe epilepsy (RTLE and LTLE) at a whole-brain scale using directed connectivity. METHODS In 16 patients, 8 with LTLE and 8 with RTLE, we estimated the electrical source activity in 82 cortical regions of interest (ROIs) using high-density electroencephalography (EEG), individual head models, and a distributed linear inverse solution. A multivariate, time-varying, and frequency-resolved Granger-causal modeling (weighted Partial Directed Coherence) was applied to the source signal of all ROIs. A nonparametric statistical test assessed differences between spike and baseline epochs. Connectivity results were compared between RTLE and LTLE and related to neuropsychological impairments. RESULTS Ipsilateral anterior temporal structures were identified as key drivers for both groups, concordant with the epileptogenic zone estimated invasively. We observed an increase in outflow from the key driver already before the spike. There were also important temporal and extratemporal ipsilateral drivers in both conditions, and contralateral drivers only in RTLE. A different network pattern between LTLE and RTLE was found: in RTLE there was a much more prominent ipsilateral-to-contralateral pattern than in LTLE. Half of the RTLE patients but none of the LTLE patients had neuropsychological deficits consistent with contralateral temporal lobe dysfunction, suggesting a relationship between connectivity changes and cognitive deficits. SIGNIFICANCE The different patterns of time-varying connectivity in LTLE and RTLE suggest that they are not symmetrical entities, in line with our neuropsychological results. The highest outflow region was concordant with invasive validation of the epileptogenic zone. This enhanced characterization of dynamic connectivity patterns could better explain cognitive deficits and help the management of epilepsy surgery candidates.
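
The directed-connectivity measure used here is a weighted, time-varying variant of partial directed coherence (PDC). A minimal, static, unweighted PDC computed from fitted MVAR coefficients is sketched below on synthetic signals; it only illustrates the basic idea, not the study's source-reconstruction and statistics pipeline.

```python
# Minimal sketch of partial directed coherence from MVAR coefficients
# (Baccala & Sameshima 2001). The study used a weighted, time-varying variant
# on source-reconstructed ROI signals; this static, unweighted version only
# illustrates the directed-connectivity idea on synthetic data.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n_samples, n_rois = 2000, 3
x = rng.standard_normal((n_samples, n_rois))
x[1:, 1] += 0.6 * x[:-1, 0]           # ROI 0 drives ROI 1 with a one-sample lag

fit = VAR(x).fit(maxlags=5)
A = fit.coefs                          # shape (lags, n_rois, n_rois); A[r][i, j]: j -> i at lag r+1

freqs = np.linspace(0.0, 0.5, 129)     # normalized frequency
pdc = np.zeros((len(freqs), n_rois, n_rois))
for fi, f in enumerate(freqs):
    Abar = np.eye(n_rois, dtype=complex)
    for r in range(A.shape[0]):
        Abar -= A[r] * np.exp(-2j * np.pi * f * (r + 1))
    denom = np.sqrt((np.abs(Abar) ** 2).sum(axis=0))   # column-wise normalization
    pdc[fi] = np.abs(Abar) / denom

print("mean PDC 0 -> 1:", pdc[:, 1, 0].mean())         # should clearly exceed the reverse
print("mean PDC 1 -> 0:", pdc[:, 0, 1].mean())
```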

Relevance: 30.00%

Abstract:

Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we illustrate results from 488 rockfalls detected in 1260 trees. We show that tree impact data can be used not only (i) to reconstruct the real frequency of rockfalls for individual cells, but also (ii) to calibrate the rockfall model Rockyfor3D and (iii) to transform simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
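
Step (iii), the transformation of simulated trajectories into real frequencies, can be pictured as a per-cell scaling of simulated passage counts by the tree-ring-derived rockfall frequency. The toy sketch below uses invented numbers and a deliberately simplified one-factor calibration; it is not the Rockyfor3D workflow.

```python
# Toy sketch of step (iii): scale simulated passage counts into real frequencies
# using the per-cell rockfall frequency reconstructed from tree impact scars.
# All numbers and the single-factor calibration are illustrative assumptions.

# cell id -> (impacts reconstructed in trees, length of tree-ring record in years)
tree_record = {"A": (12, 60), "B": (30, 75), "C": (5, 40)}

# cell id -> number of simulated trajectories passing through the cell
simulated_passages = {"A": 480, "B": 1500, "C": 150}

# observed frequency per cell (rockfalls per year) from the dendrogeomorphic record
observed_freq = {c: hits / years for c, (hits, years) in tree_record.items()}

# one global calibration factor: observed events per simulated trajectory
calib = sum(observed_freq.values()) / sum(simulated_passages.values())

for cell, n_sim in simulated_passages.items():
    real_freq = calib * n_sim             # calibrated rockfalls per year in this cell
    print(f"cell {cell}: ~{real_freq:.2f} rockfalls/yr "
          f"(tree-ring estimate {observed_freq[cell]:.2f}/yr)")
```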

Relevance: 30.00%

Abstract:

BACKGROUND The success of an intervention to prevent the complications of an infection is influenced by the natural history of the infection. Assumptions about the temporal relationship between infection and the development of sequelae can affect the predicted effect size of an intervention and the sample size calculation. This study investigates how a mathematical model can be used to inform sample size calculations for a randomised controlled trial (RCT), using the example of Chlamydia trachomatis infection and pelvic inflammatory disease (PID). METHODS We used a compartmental model to imitate the structure of a published RCT. We considered three different processes for the timing of PID development in relation to the initial C. trachomatis infection: immediate, constant throughout, or at the end of the infectious period. For each process we assumed that, of all women infected, the same fraction would develop PID in the absence of an intervention. We examined two sets of assumptions used to calculate the sample size in a published RCT that investigated the effect of chlamydia screening on PID incidence. We also investigated the influence of the natural history parameters of chlamydia on the required sample size. RESULTS The assumed event rates and effect sizes used for the sample size calculation implicitly determined the temporal relationship between chlamydia infection and PID in the model. Even small changes in the assumed PID incidence and relative risk (RR) led to considerable differences in the hypothesised mechanism of PID development. The RR and the sample size needed per group also depend on the natural history parameters of chlamydia. CONCLUSIONS Mathematical modelling helps to understand the temporal relationship between an infection and its sequelae and can show how uncertainties about natural history parameters affect sample size calculations when planning an RCT.
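
For context, the trial calculation that such a model interrogates is the standard sample size formula for comparing two proportions. The sketch below implements that textbook formula; the PID incidence and relative risk used are placeholders, not the figures from the published RCT discussed in the paper.

```python
# Standard sample size per group for comparing two proportions (two-sided test),
# the kind of calculation the modelling study interrogates. The PID incidence
# and relative risk below are placeholders, not the published trial's values.
from scipy.stats import norm

def n_per_group(p_control, relative_risk, alpha=0.05, power=0.80):
    p_intervention = p_control * relative_risk
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    variance = p_control * (1 - p_control) + p_intervention * (1 - p_intervention)
    return (z_a + z_b) ** 2 * variance / (p_control - p_intervention) ** 2

# e.g. assume 3% PID incidence without screening and a hypothesised RR of 0.5
print(round(n_per_group(p_control=0.03, relative_risk=0.5)))   # participants per arm
```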

Relevance: 30.00%

Abstract:

Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Owing to the catastrophic events of recent years, the public has become more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards have led to an altered perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings) and vulnerability) between 1950 and 2000 are quantified. An additional focus is placed on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. The focus of the study was on the multi-temporal risk assessment; consequently, the models used could be replaced with other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of the buildings is understood as a degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings that depends on avalanche pressure was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk. This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
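
The risk quantification combines scenario frequency, building values, and a pressure-dependent vulnerability. The toy calculation below illustrates that combination for two points in time; all numbers and the vulnerability curve are invented, and the study's actual impact pressures came from SAMOS/ELBA+ simulations.

```python
# Toy multi-temporal risk calculation: risk = scenario frequency x value x vulnerability.
# The vulnerability curve, avalanche frequency and building values are invented;
# the study used simulated pressures and an empirical vulnerability function.

def vulnerability(pressure_kpa):
    """Illustrative degree of loss (0-1) as a function of avalanche impact pressure."""
    return min(1.0, max(0.0, (pressure_kpa - 3.0) / 30.0))

scenario_frequency = 1 / 100          # assumed return period of 100 years

# buildings: (value in EUR, simulated impact pressure in kPa) for two points in time
buildings_1950 = [(120_000, 12.0), (90_000, 6.0)]
buildings_2000 = [(450_000, 12.0), (300_000, 6.0), (380_000, 20.0)]

for year, buildings in (("1950", buildings_1950), ("2000", buildings_2000)):
    risk = scenario_frequency * sum(value * vulnerability(p) for value, p in buildings)
    print(f"{year}: expected annual loss ~ EUR {risk:,.0f}")
```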

Relevance: 30.00%

Abstract:

This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting consistency in both products. Systematic biases between both data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show stronger response to changes in the external forcings than recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded for the winters during the first decades of the 18th century in the reconstructions is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that reconstructions are too simplistic, especially for precipitation, which is associated with the linear statistical techniques used to generate the reconstructions. The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result which resembles the canonical co-variability recorded in the observations for the 20th century. However, the same analysis for reconstructions exhibits anomalously low correlations, which points towards a lack of dynamical consistency between independent reconstructions.
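
The comparison of leading modes refers to standard EOF/principal-component analysis of the gridded anomaly fields. The generic sketch below shows that decomposition on a synthetic field; it stands in for, but is not, the reconstructed or simulated SAT grids.

```python
# Generic EOF (principal component) analysis of a gridded seasonal anomaly field,
# the kind of decomposition behind the "leading modes" comparison. The synthetic
# field stands in for the reconstructed/simulated SAT grids, which are not shown here.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_gridpoints = 490, 200                       # e.g. winters 1500-1990 on a coarse grid
pattern = np.sin(np.linspace(0, np.pi, n_gridpoints))  # one dominant spatial mode
amplitude = rng.standard_normal(n_years)
field = np.outer(amplitude, pattern) + 0.3 * rng.standard_normal((n_years, n_gridpoints))

anomalies = field - field.mean(axis=0)                 # remove the time mean at each grid point
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
explained = s**2 / (s**2).sum()

leading_eof = vt[0]                                    # spatial pattern of the first mode
leading_pc = u[:, 0] * s[0]                            # its time series
print(f"leading mode explains {explained[0]:.0%} of the variance")
```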

Relevance: 30.00%

Abstract:

Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20–30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time series, e.g., in the context of therapeutic brain stimulation. In this paper we present some first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time series. More specifically, we learn distinct graphical models (so-called Chow–Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks based on thresholding the absolute values of the Pearson correlation coefficient (CC) matrix. Using various measures, the networks obtained in this way are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for modeling also the temporal features of iEEG signals.
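
A Chow–Liu tree over channels is the maximum-weight spanning tree of pairwise mutual information; under a Gaussian approximation the weights follow directly from the correlation matrix. The sketch below shows this generic construction on synthetic data, together with the classically thresholded |CC| network it is compared against; it does not reproduce the Bayesian-inference step of the paper.

```python
# Sketch of the Chow-Liu step: build the maximum mutual-information spanning tree
# over channels. Gaussian approximation MI = -0.5*ln(1 - rho^2); synthetic data
# stands in for the time-sliced iEEG signals, and thresholding the |rho| matrix
# gives the "classical" functional network for comparison.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(2)
n_samples, n_channels = 5000, 5
x = rng.standard_normal((n_samples, n_channels))
x[:, 1] += 0.8 * x[:, 0]              # introduce dependencies so the tree is non-trivial
x[:, 2] += 0.5 * x[:, 1]

rho = np.corrcoef(x, rowvar=False)
mi = -0.5 * np.log(1.0 - np.clip(rho**2, 0.0, 0.999999))
np.fill_diagonal(mi, 0.0)

# maximum spanning tree = minimum spanning tree on negated weights
tree = minimum_spanning_tree(-mi).toarray()
edges = [(i, j) for i in range(n_channels) for j in range(n_channels) if tree[i, j] != 0]
print("Chow-Liu tree edges:", edges)

# classical functional network: threshold the absolute correlation matrix
functional_net = (np.abs(rho) > 0.4) & ~np.eye(n_channels, dtype=bool)
print("thresholded |CC| edges:", np.argwhere(np.triu(functional_net)).tolist())
```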

Relevance: 30.00%

Abstract:

The objective of the present review was to summarize the evidence available on the temporal sequence of hard and soft tissue healing around titanium dental implants in animal models and in humans. A search was undertaken to find animal and human studies reporting on the temporal dynamics of hard and soft tissue integration of titanium dental implants. Moreover, the influence of implant surface roughness and chemistry on the molecular mechanisms associated with osseointegration was also investigated. The findings indicated that the integration of titanium dental implants into hard and soft tissue represents the result of a complex cascade of biological events initiated by the surgical intervention. Implant placement into alveolar bone induces a cascade of healing events starting with clot formation and continuing with the maturation of bone in contact with the implant surface. From a genetic point of view, osseointegration is associated with a decrease in inflammation and an increase in osteogenesis-, angiogenesis- and neurogenesis-associated gene expression during the early stages of wound healing. The attachment and maturation of the soft tissue complex (i.e. epithelium and connective tissue) to implants becomes established 6-8 weeks following surgery. Based on the findings of the present review it can be concluded that improved understanding of the mechanisms associated with osseointegration will provide leads and targets for strategies aimed at enhancing the clinical performance of titanium dental implants.

Relevance: 30.00%

Abstract:

BACKGROUND Recent reports using administrative claims data suggest the incidence of community- and hospital-onset sepsis is increasing. Whether this reflects changing epidemiology, more effective diagnostic methods, or changes in physician documentation and medical coding practices is unclear. METHODS We performed a temporal-trend study from 2008 to 2012 using administrative claims data and patient-level clinical data of adult patients admitted to Barnes-Jewish Hospital in St. Louis, Missouri. Temporal-trend and annual percent change were estimated using regression models with autoregressive integrated moving average errors. RESULTS We analyzed 62,261 inpatient admissions during the 5-year study period. 'Any SIRS' (i.e., SIRS on a single calendar day during the hospitalization) and 'multi-day SIRS' (i.e., SIRS on 3 or more calendar days), which both use patient-level data, and medical coding for sepsis (i.e., ICD-9-CM discharge diagnosis codes 995.91, 995.92, or 785.52) were present in 35.3 %, 17.3 %, and 3.3 % of admissions, respectively. The incidence of admissions coded for sepsis increased 9.7 % (95 % CI: 6.1, 13.4) per year, while the patient data-defined events of 'any SIRS' decreased by 1.8 % (95 % CI: -3.2, -0.5) and 'multi-day SIRS' did not change significantly over the study period. Clinically-defined sepsis (defined as SIRS plus bacteremia) and severe sepsis (defined as SIRS plus hypotension and bacteremia) decreased at statistically significant rates of 5.7 % (95 % CI: -9.0, -2.4) and 8.6 % (95 % CI: -4.4, -12.6) annually. All-cause mortality, SIRS mortality, and SIRS and clinically-defined sepsis case fatality did not change significantly during the study period. Sepsis mortality, based on ICD-9-CM codes, however, increased by 8.8 % (95 % CI: 1.9, 16.2) annually. CONCLUSIONS The incidence of sepsis, defined by ICD-9-CM codes, and sepsis mortality increased steadily without a concomitant increase in SIRS or clinically-defined sepsis. Our results highlight the need to develop strategies to integrate clinical patient-level data with administrative data to draw more accurate conclusions about the epidemiology of sepsis.
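
The temporal-trend estimates (annual percent change) come from regression models with autoregressive errors. A minimal sketch of such a log-linear trend with AR(1) errors in statsmodels follows; the monthly counts are fabricated placeholders, not the Barnes-Jewish Hospital data.

```python
# Sketch of estimating an annual percent change from a regression with
# autoregressive (ARIMA-type) errors, as in the temporal-trend analysis.
# The monthly counts are fabricated placeholders, not hospital data.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
months = np.arange(60.0)                                 # Jan 2008 - Dec 2012
counts = np.exp(3.5 + 0.008 * months + 0.05 * rng.standard_normal(60))

# log-linear trend with AR(1) errors: log(count_t) = b0 + b1*t + AR(1) noise
exog = np.column_stack([np.ones_like(months), months])   # intercept + linear time trend
fit = SARIMAX(np.log(counts), exog=exog, order=(1, 0, 0), trend="n").fit(disp=False)

monthly_slope = fit.params[1]                            # coefficient on the month index
annual_percent_change = 100 * (np.exp(12 * monthly_slope) - 1)
print(f"estimated annual percent change: {annual_percent_change:.1f}%")
```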

Relevance: 30.00%

Abstract:

High-resolution, ground-based and independent observations, including co-located wind radiometer, lidar stations, and infrasound instruments, are used to evaluate the accuracy of general circulation models and data-constrained assimilation systems in the middle atmosphere at northern hemisphere midlatitudes. Systematic comparisons between observations, the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses including the recent Integrated Forecast System cycles 38r1 and 38r2, NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalyses, and the free-running Max Planck Institute Earth System Model – Low Resolution (MPI-ESM-LR) climate model are carried out in both temporal and spectral domains. We find that ECMWF and MERRA are broadly consistent with lidar and wind radiometer measurements up to ~40 km. For both temperature and horizontal wind components, deviations increase with altitude as the assimilated observations become sparser. Between 40 and 60 km altitude, the standard deviation of the mean difference exceeds 5 K for temperature and 20 m/s for the zonal wind. The largest deviations are observed in winter, when the variability from large-scale planetary waves dominates. Between lidar data and MPI-ESM-LR, there is an overall agreement in spectral amplitude down to periods of 15–20 days. At shorter time scales, variability in the model is lower by ~10 dB. Infrasound observations indicate generally good agreement with ECMWF wind and temperature products. As such, this study demonstrates the potential of the infrastructure of the Atmospheric Dynamics Research Infrastructure in Europe project, which integrates various measurements and provides a quantitative understanding of stratosphere–troposphere dynamical coupling for numerical weather prediction applications.
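
The agreement metric quoted above (the standard deviation of the mean difference as a function of altitude) is straightforward to compute once observed and collocated model profiles share an altitude grid. The sketch below uses random placeholder profiles purely to show the bookkeeping, not real lidar or analysis data.

```python
# Sketch of the bias / standard-deviation-of-difference statistics used to compare
# lidar temperature profiles with collocated analysis profiles. The profiles here
# are random placeholders on an assumed common altitude grid.
import numpy as np

rng = np.random.default_rng(4)
n_nights, n_levels = 120, 31
altitude_km = np.linspace(30, 60, n_levels)

lidar_T = 240 + 10 * rng.standard_normal((n_nights, n_levels))
model_T = lidar_T + 2.0 + (altitude_km / 60.0) * 5 * rng.standard_normal((n_nights, n_levels))

diff = model_T - lidar_T
mean_bias = diff.mean(axis=0)                 # model-minus-lidar bias per altitude
std_diff = diff.std(axis=0, ddof=1)           # spread of the differences per altitude

for z, b, s in zip(altitude_km[::10], mean_bias[::10], std_diff[::10]):
    print(f"{z:.0f} km: bias {b:+.1f} K, std of difference {s:.1f} K")
```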

Relevance: 30.00%

Abstract:

Rho-family GTPases are molecular switches that transmit extracellular cues to intracellular signaling pathways. Their activity is likely to be highly regulated in space and in time, but most of what is known about Rho-family GTPase signaling has been derived from techniques that do not resolve these dimensions. New imaging technologies now allow the visualization of Rho GTPase signaling with high spatio-temporal resolution. This has led to insights that significantly extend classic models and call for a novel conceptual framework. These approaches clearly show three things. First, Rho GTPase signaling dynamics occur on micrometer length scales and subminute timescales. Second, multiple subcellular pools of one given Rho GTPase can operate simultaneously in time and space to regulate a wide variety of morphogenetic events (e.g. leading-edge membrane protrusion, tail retraction, membrane ruffling). These different Rho GTPase subcellular pools might be described as 'spatio-temporal signaling modules' and might involve the specific interaction of one GTPase with different guanine nucleotide exchange factors (GEFs), GTPase-activating proteins (GAPs) and effectors. Third, complex spatio-temporal signaling programs that involve precise crosstalk between multiple Rho GTPase signaling modules regulate specific morphogenetic events. The next challenge is to decipher the molecular circuitry underlying this complex spatio-temporal modularity to produce integrated models of Rho GTPase signaling.

Relevance: 30.00%

Abstract:

In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlations and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this view is not supported by the data. Secondly, since financial data do not follow a normal distribution because of their heavy tails, modeling the data using a Generalized Linear Model (GLM) incorporating copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period 1 January 2004 to 8 August 2006. The empirical examination of the regime-switching tendencies provided quantitative support to the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit risk determinants across credit quality through the use of a copula function, which disaggregates the behavior embedded in the marginal gamma distributions so as to isolate the level of dependence captured in the copula function. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix, the centerpiece of linear regression models.
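
The copula construction referred to, in which dependence is carried by the copula while the marginals stay gamma, can be illustrated with a Gaussian copula in SciPy. The marginal parameters and correlation below are invented, and this is a sampling illustration rather than the paper's estimation procedure.

```python
# Sketch of joining gamma marginals with a copula so that dependence is carried
# by the copula rather than the marginals. A Gaussian copula is used purely for
# illustration; marginal parameters and the correlation are invented values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])                 # dependence between a spread and a liquidity proxy

# 1) sample from the copula: correlated normals mapped to uniforms
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=10_000)
u = stats.norm.cdf(z)

# 2) push the uniforms through the gamma marginals
spread    = stats.gamma.ppf(u[:, 0], a=2.0, scale=40.0)   # e.g. CDS spread in bp
liquidity = stats.gamma.ppf(u[:, 1], a=1.5, scale=0.8)    # e.g. bid-ask based proxy

print("rank correlation induced by the copula:",
      round(stats.spearmanr(spread, liquidity)[0], 2))
```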

Relevance: 30.00%

Abstract:

The dataset contains the revised age models and foraminiferal records obtained for the Last Interglacial period in six marine sediment cores:

- the Southern Ocean core MD02-2488 (age model, sea surface temperatures, benthic d18O and d13C for the period 136-108 ka),
- the North Atlantic core MD95-2042 (age model, planktic d18O, benthic d18O and d13C for the period 135-110 ka),
- the North Atlantic core ODP 980 (age model, planktic d18O, sea surface temperatures, seawater d18O, benthic d18O and d13C, ice-rafted detritus for the period 135-110 ka),
- the North Atlantic core CH69-K09 (age model, planktic d18O, sea surface temperatures, seawater d18O, benthic d18O and d13C, ice-rafted detritus for the period 135-110 ka),
- the Norwegian Sea core MD95-2010 (age model, percentage of Neogloboquadrina pachyderma sinistral, sea surface temperatures, benthic d18O, ice-rafted detritus for the period 134-110 ka),
- the Labrador Sea core EW9302-JPC2 (age model, percentage of Neogloboquadrina pachyderma sinistral, sea surface temperatures, benthic d18O for the period 134-110 ka).