940 results for Measure of time


Relevance:

100.00%

Publisher:

Abstract:

The goal of this work is to learn a parsimonious and informative representation for high-dimensional time series. Conceptually, this comprises two distinct yet tightly coupled tasks: learning a low-dimensional manifold and modeling the dynamical process. These two tasks have a complementary relationship as the temporal constraints provide valuable neighborhood information for dimensionality reduction and conversely, the low-dimensional space allows dynamics to be learnt efficiently. Solving these two tasks simultaneously allows important information to be exchanged mutually. If nonlinear models are required to capture the rich complexity of time series, then the learning problem becomes harder as the nonlinearities in both tasks are coupled. The proposed solution approximates the nonlinear manifold and dynamics using piecewise linear models. The interactions among the linear models are captured in a graphical model. By exploiting the model structure, efficient inference and learning algorithms are obtained without oversimplifying the model of the underlying dynamical process. Evaluation of the proposed framework with competing approaches is conducted in three sets of experiments: dimensionality reduction and reconstruction using synthetic time series, video synthesis using a dynamic texture database, and human motion synthesis, classification and tracking on a benchmark data set. In all experiments, the proposed approach provides superior performance.
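As a rough illustration of the modelling idea described above (a discrete regime variable selecting one of several local linear models for both the low-dimensional dynamics and the mapping to the observations), here is a minimal Python sketch of a switching linear dynamical system generating a high-dimensional series from a low-dimensional latent state. The number of regimes, dimensions and noise levels are arbitrary choices, and the sketch does not reproduce the paper's graphical model or its inference and learning algorithms.

import numpy as np

rng = np.random.default_rng(0)

# Toy generative sketch: a switching (piecewise) linear dynamical system.
# A discrete regime s_t picks one of K local linear models; the low-dimensional
# latent state z_t evolves linearly within that regime and is mapped linearly to
# the observed high-dimensional series y_t.
K, d_latent, d_obs, T = 3, 2, 50, 500

A = [rng.normal(0, 0.4, (d_latent, d_latent)) + 0.6 * np.eye(d_latent) for _ in range(K)]
C = [rng.normal(0, 1.0, (d_obs, d_latent)) for _ in range(K)]
P = np.array([[0.95, 0.025, 0.025],          # "sticky" regime transition matrix
              [0.025, 0.95, 0.025],
              [0.025, 0.025, 0.95]])

s, z = 0, np.zeros(d_latent)
Y = np.empty((T, d_obs))
for t in range(T):
    s = rng.choice(K, p=P[s])                         # regime transition
    z = A[s] @ z + rng.normal(0, 0.1, d_latent)       # piecewise linear dynamics
    Y[t] = C[s] @ z + rng.normal(0, 0.05, d_obs)      # piecewise linear manifold map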

Relevance:

100.00%

Publisher:

Abstract:

The goal of this work is to learn a parsimonious and informative representation for high-dimensional time series. Conceptually, this comprises two distinct yet tightly coupled tasks: learning a low-dimensional manifold and modeling the dynamical process. These two tasks have a complementary relationship as the temporal constraints provide valuable neighborhood information for dimensionality reduction and conversely, the low-dimensional space allows dynamics to be learnt efficiently. Solving these two tasks simultaneously allows important information to be exchanged mutually. If nonlinear models are required to capture the rich complexity of time series, then the learning problem becomes harder as the nonlinearities in both tasks are coupled. The proposed solution approximates the nonlinear manifold and dynamics using piecewise linear models. The interactions among the linear models are captured in a graphical model. The model structure setup and parameter learning are done using a variational Bayesian approach, which enables automatic Bayesian model structure selection, hence solving the problem of over-fitting. By exploiting the model structure, efficient inference and learning algorithms are obtained without oversimplifying the model of the underlying dynamical process. Evaluation of the proposed framework with competing approaches is conducted in three sets of experiments: dimensionality reduction and reconstruction using synthetic time series, video synthesis using a dynamic texture database, and human motion synthesis, classification and tracking on a benchmark data set. In all experiments, the proposed approach provides superior performance.

Relevance:

100.00%

Publisher:

Abstract:

Background: Elective repeat caesarean delivery (ERCD) rates have been increasing worldwide, thus prompting obstetric discourse on the risks and benefits for the mother and infant. Yet, these increasing rates also have major economic implications for the health care system. Given the dearth of information on the cost-effectiveness related to mode of delivery, the aim of this paper was to perform an economic evaluation on the costs and short-term maternal health consequences associated with a trial of labour after one previous caesarean delivery compared with ERCD for low risk women in Ireland. Methods: Using a decision analytic model, a cost-effectiveness analysis (CEA) was performed where the measure of health gain was quality-adjusted life years (QALYs) over a six-week time horizon. A review of international literature was conducted to derive representative estimates of adverse maternal health outcomes following a trial of labour after caesarean (TOLAC) and ERCD. Delivery/procedure costs were derived from primary data collection and combined both "bottom-up" and "top-down" costing estimations. Results: Maternal morbidities emerged in twice as many cases in the TOLAC group as in the ERCD group. However, a TOLAC was found to be the more cost-effective method of delivery because it was substantially less expensive than ERCD (€1,835.06 versus €4,039.87 per woman, respectively), and QALYs were modestly higher (0.84 versus 0.70). Our findings were supported by probabilistic sensitivity analysis. Conclusions: Clinicians need to be well informed of the benefits and risks of TOLAC among low risk women. Ideally, clinician-patient discourse would address differences in length of hospital stay and postpartum recovery time. While it is premature to advocate a policy of TOLAC across maternity units, the results of the study prompt further analysis and repeat iterations, encouraging future studies to synthesize previous research and new and relevant evidence under a single comprehensive decision model.
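Taking only the point estimates quoted above, the dominance check behind the cost-effectiveness conclusion can be written out in a few lines of Python; this is illustrative arithmetic, not the paper's decision-analytic model or its probabilistic sensitivity analysis:

cost = {"TOLAC": 1835.06, "ERCD": 4039.87}   # euro per woman
qaly = {"TOLAC": 0.84, "ERCD": 0.70}         # QALYs over the six-week horizon

d_cost = cost["TOLAC"] - cost["ERCD"]   # negative: TOLAC is cheaper
d_qaly = qaly["TOLAC"] - qaly["ERCD"]   # positive: TOLAC yields more QALYs

if d_cost <= 0 and d_qaly >= 0:
    print("TOLAC dominates ERCD (cheaper and more effective); no ICER is needed.")
else:
    print(f"ICER = {d_cost / d_qaly:.2f} euro per QALY gained")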

Relevance:

100.00%

Publisher:

Abstract:

© 2015, Institute of Mathematical Statistics. All rights reserved. In order to use persistence diagrams as a true statistical tool, it would be very useful to have a good notion of mean and variance for a set of diagrams. In [23], Mileyko and his collaborators made the first study of the properties of the Fréchet mean in (D_p, W_p), the space of persistence diagrams equipped with the p-th Wasserstein metric. In particular, they showed that the Fréchet mean of a finite set of diagrams always exists, but is not necessarily unique. The means of a continuously-varying set of diagrams do not themselves (necessarily) vary continuously, which presents obvious problems when trying to extend the Fréchet mean definition to the realm of time-varying persistence diagrams, better known as vineyards. We fix this problem by altering the original definition of Fréchet mean so that it now becomes a probability measure on the set of persistence diagrams; in a nutshell, the mean of a set of diagrams will be a weighted sum of atomic measures, where each atom is itself a persistence diagram determined using a perturbation of the input diagrams. This definition gives for each N a map (D_p)^N → ℙ(D_p). We show that this map is Hölder continuous on finite diagrams and thus can be used to build a useful statistic on vineyards.
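For orientation, the Fréchet mean discussed here is the minimiser of the usual Fréchet function on the metric space (D_p, W_p); the exact exponent and normalisation conventions in [23] may differ slightly, but a standard formulation for N diagrams X_1, ..., X_N is

F(Y) = \frac{1}{N} \sum_{i=1}^{N} W_p(X_i, Y)^2, \qquad \operatorname{mean}(X_1, \dots, X_N) = \operatorname*{arg\,min}_{Y \in D_p} F(Y).

The modification described above replaces this (possibly non-unique, discontinuously varying) minimiser with a probability measure on D_p, which is what yields the map (D_p)^N → ℙ(D_p).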

Relevance:

100.00%

Publisher:

Abstract:

Successful interaction with the world depends on accurate perception of the timing of external events. Neurons at early stages of the primate visual system represent time-varying stimuli with high precision. However, it is unknown whether this temporal fidelity is maintained in the prefrontal cortex, where changes in neuronal activity generally correlate with changes in perception. One reason to suspect that it is not maintained is that humans experience surprisingly large fluctuations in the perception of time. To investigate the neuronal correlates of time perception, we recorded from neurons in the prefrontal cortex and midbrain of monkeys performing a temporal-discrimination task. Visual time intervals were presented at a timescale relevant to natural behavior (<500 ms). At this brief timescale, neuronal adaptation--time-dependent changes in the size of successive responses--occurs. We found that visual activity fluctuated with timing judgments in the prefrontal cortex but not in comparable midbrain areas. Surprisingly, only response strength, not timing, predicted task performance. Intervals perceived as longer were associated with larger visual responses and shorter intervals with smaller responses, matching the dynamics of adaptation. These results suggest that the magnitude of prefrontal activity may be read out to provide temporal information that contributes to judging the passage of time.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U. S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
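For context on what the reported software computes, the sketch below shows the 4-variable MDRD study equation, one common basis for automated eGFR reporting in that era; the abstract does not state which equation the VHA laboratories used, and the 186 coefficient (non-IDMS-calibrated creatinine) versus 175, or the later CKD-EPI equation, are all plausible alternatives.

def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """4-variable MDRD study equation, in mL/min/1.73 m^2 (illustrative assumption;
    the specific equation used by VHA laboratories is not given in the abstract)."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: serum creatinine 1.4 mg/dL, 65-year-old male
print(round(egfr_mdrd(1.4, 65, female=False, black=False), 1))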

Relevance:

100.00%

Publisher:

Abstract:

The rotating-frame nuclear magnetic relaxation rate of spins diffusing on a disordered lattice has been calculated by Monte Carlo methods. The disorder includes not only variation in the distances between neighbouring spin sites but also variation in the hopping rate associated with each site. The presence of the disorder, particularly the hopping rate disorder, causes changes in the time-dependent spin correlation functions which translate into asymmetry in the characteristic peak in the temperature dependence of the dipolar relaxation rate. The results may be used to deduce the average hopping rate from the relaxation but the effect is not sufficiently marked to enable the distribution of the hopping rates to be evaluated. The distribution, which is a measure of the degree of disorder, is the more interesting feature and it has been possible to show from the calculation that measurements of the relaxation rate as a function of the strength of the radiofrequency spin-locking magnetic field can lead to an evaluation of its width. Some experimental data on an amorphous metal-hydrogen alloy are reported which demonstrate the feasibility of this novel approach to rotating-frame relaxation in disordered materials.
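A minimal Python sketch of the kind of simulation described above: a continuous-time random walk on a one-dimensional lattice with site-dependent hopping rates, from which a spin correlation function can be estimated. The lattice size, rate distribution and the stand-in "dipolar" function are arbitrary choices, not the authors' model.

import numpy as np

rng = np.random.default_rng(0)

n_sites = 200
rates = rng.lognormal(mean=0.0, sigma=1.0, size=n_sites)     # hopping-rate disorder

def simulate_walk(n_samples, dt):
    """Single-spin continuous-time random walk; site index sampled every dt."""
    site, t, next_sample, samples = int(rng.integers(n_sites)), 0.0, 0.0, []
    while len(samples) < n_samples:
        wait = rng.exponential(1.0 / rates[site])             # dwell time set by this site's rate
        while next_sample < t + wait and len(samples) < n_samples:
            samples.append(site)
            next_sample += dt
        t += wait
        site = (site + rng.choice([-1, 1])) % n_sites         # hop to a neighbouring site
    return np.array(samples)

# Stand-in spin correlation function G(tau) = <f(0) f(tau)> along the trajectory.
traj = simulate_walk(n_samples=20000, dt=0.05)
f = np.cos(2 * np.pi * traj / n_sites)
G = np.array([np.mean(f[: len(f) - k] * f[k:]) for k in range(200)])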

Relevance:

100.00%

Publisher:

Abstract:

Linear poly(amidoamine)s (PAAs) have been designed to exhibit minimal non-specific toxicity, display pH-dependent membrane lysis and deliver genes and toxins in vitro. The aim of this study was to measure PAA cellular uptake using ISA1-OG (and as a reference ISA23-OG) in B16F10 cells in vitro and, by subcellular fractionation, quantitate intracellular trafficking of (125)I-labelled ISA1-tyr in liver cells after intravenous (i.v.) administration to rats. The effect of time after administration (0.5-3 h) and ISA1 dose (0.04-100 mg/kg) on trafficking, and vesicle permeabilisation (N-acetyl-β-D-glucosaminidase (NAG) release from an isolated vesicular fraction) were also studied. ISA1-OG displayed approximately 60-fold greater B16F10 cell uptake than ISA23-OG. Passage of ISA1 along the liver cell endocytic pathway caused a transient decrease in vesicle buoyant density (also visible by TEM). Increasing ISA1 dose from 10 mg/kg to 100 mg/kg increased both radioactivity and NAG levels in the cytosolic fraction (5-10 fold) at 1 h. Moreover, internalised ISA1 provoked NAG release from an isolated vesicular fraction in a dose-dependent manner. These results provide direct evidence, for the first time, of PAA permeabilisation of endocytic vesicular membranes in vivo, and they have important implications for potential efficacy/toxicity of such polymeric vectors.

Relevance:

100.00%

Publisher:

Abstract:

This paper provides mutual information performance analysis of multiple-symbol differential MPSK (M-ary phase shift keying) over time-correlated, time-varying flat-fading communication channels. A state-space approach is used to model the time correlation of the time-varying channel phase. This approach captures the dynamics of time-correlated, time-varying channels and enables exploitation of the forward-backward algorithm for mutual information performance analysis. It is shown that the differential decoding implicitly uses a sequence of innovations of the time-correlated channel process, and this sequence is essentially uncorrelated. This enables the use of multiple-symbol differential detection, as a form of block-by-block maximum likelihood sequence detection, for capacity-achieving mutual information performance. It is shown that multiple-symbol differential ML detection of BPSK and QPSK practically achieves the channel information capacity with observation times only on the order of a few symbol intervals.
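A toy Python sketch of the setting described above, assuming (as one simple instance of the state-space idea) a first-order Gauss-Markov model for the time-correlated channel phase and differentially encoded BPSK; it is not the paper's model or detector, and additive noise is omitted, but it shows why differential detection only sees the small, nearly uncorrelated phase innovations.

import numpy as np

rng = np.random.default_rng(1)

# Time-correlated channel phase: first-order Gauss-Markov (AR(1)) process.
n, a, sigma_w = 10_000, 0.99, 0.05
theta = np.zeros(n)
for k in range(1, n):
    theta[k] = a * theta[k - 1] + rng.normal(0.0, sigma_w)

# Differentially encoded BPSK: information rides on the phase difference.
bits = rng.integers(0, 2, size=n)
tx_phase = np.cumsum(np.pi * bits)              # differential encoding
rx = np.exp(1j * (tx_phase + theta))            # phase rotation only, no additive noise

# Conventional two-symbol differential detection: compare consecutive symbols.
det = np.real(rx[1:] * np.conj(rx[:-1])) < 0    # decision: phase difference near pi?
print("symbol error rate:", np.mean(det != bits[1:].astype(bool)))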

Relevance:

100.00%

Publisher:

Abstract:

Climatic variability on the European Continental Shelf is dominated by events over the North Atlantic Ocean, and in particular by the North Atlantic Oscillation (NAO). The NAO is essentially a winter phenomenon, and its effects will be felt most strongly by populations for which winter conditions are critical. One example is the copepod Calanus finmarchicus, whose northern North Sea populations overwinter at depth in the North Atlantic. Its annual abundance in this region is strongly dependent on water transports at the end of the winter, and hence on the NAO index. Variations in the NAO give rise to changes in the circulation of the North Atlantic Ocean, with additional perturbations arising from El Niño-Southern Oscillation (ENSO) events in the Pacific, and these changes can be delayed by several years because of the adjustment time of the ocean circulation. One measure of the circulation is the latitude of the north wall of the Gulf Stream (GSNW index). Interannual variations in the plankton of the Shelf Seas show strong correlations with the fluctuations of the GSNW index, which are the result of Atlantic-wide atmospheric processes. These associations imply that the interannual variations are climatically induced rather than due to natural fluctuations of the marine ecosystem, and that the zooplankton populations have not been significantly affected by anthropogenic processes such as nutrient enrichment or fishing pressure. While the GSNW index represents a response to atmospheric changes over two or more years, the zooplankton populations correlated with it have generation times of a few weeks. The simplest explanation for the associations between the zooplankton and the GSNW index is that the plankton are responding to weather patterns propagating downstream from the Gulf Stream system. It seems that these meteorological processes operate in the spring. Although it has been suggested that there was a regime shift in the North Sea in the late 1980s, examination of the time-series by the cumulative sum (CUSUM) technique shows that any changes in the zooplankton of the central and northern North Sea are consistent with the background climatic variability. The abundance of total copepods increased during this period but this change does not represent a dramatic change in ecosystem processes. It is possible some change may have occurred at the end of the time-series in the years 1997 and 1998.
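The CUSUM technique mentioned above is simple to state: accumulate deviations of the series from its long-term mean, so that a sustained change in level appears as a persistent change of slope. A minimal Python sketch on a synthetic anomaly series (not the actual zooplankton time-series) is:

import numpy as np

rng = np.random.default_rng(2)

# Synthetic annual abundance anomalies standing in for the observed time-series.
years = np.arange(1958, 1999)
anomalies = rng.normal(0.0, 1.0, size=years.size)

# Cumulative sum of deviations from the long-term mean.
cusum = np.cumsum(anomalies - anomalies.mean())

for yr, c in zip(years[::10], cusum[::10]):
    print(yr, round(float(c), 2))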