892 results for Time-motion Analysis
Abstract:
A time series is a sequence of observations made over time. Examples in public health include daily ozone concentrations, weekly admissions to an emergency department or annual expenditures on health care in the United States. Time series models are used to describe the dependence of the response at each time on predictor variables including covariates and possibly previous values in the series. Time series methods are necessary to account for the correlation among repeated responses over time. This paper gives an overview of time series ideas and methods used in public health research.
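The correlation among repeated responses that these methods account for is typically measured with the sample autocorrelation function. A minimal sketch in Python (illustrative only, not from the paper):

```python
# Illustrative sketch: the sample autocorrelation at lag k, the basic
# quantity time-series models use to describe dependence among
# repeated responses over time.

def autocorr(series, lag):
    """Sample autocorrelation of a series at the given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag))
    return cov / var

# A smoothly trending series (e.g. rising annual expenditures) is
# strongly autocorrelated at lag 1; ignoring this correlation would
# understate the uncertainty of any fitted trend.
trending = [0.5 * t for t in range(100)]
```

Large autocorrelations at low lags are exactly the situation in which methods assuming independent observations give misleading standard errors.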
Abstract:
The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
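The distribution-free expectation described above has a closed form: for n independent draws from any continuous distribution, the expected number of record highs is the n-th harmonic number H_n = 1 + 1/2 + … + 1/n. A small simulation sketch (not the authors' code) illustrating this:

```python
import random

# Sketch of the distribution-free property: in an i.i.d. (strictly
# stationary, trend-free) series of n values, the expected number of
# record-breaking highs equals the harmonic number H_n, regardless of
# the underlying distribution.

def count_records(series):
    """Number of record-breaking highs, counting past to present."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

def expected_records(n):
    """Harmonic number H_n = expected record count for n i.i.d. values."""
    return sum(1.0 / k for k in range(1, n + 1))

random.seed(0)
n, trials = 100, 20000
mean_records = sum(count_records([random.gauss(0, 1) for _ in range(n)])
                   for _ in range(trials)) / trials
# H_100 is about 5.19; the simulated mean should be close to it.
```

A trend in the mean or variance breaks this symmetry, which is what makes the record-count indices above usable as trend detectors.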
Abstract:
Campylobacter, a major zoonotic pathogen, displays seasonality in poultry and in humans. To identify temporal patterns in the prevalence of thermophilic Campylobacter spp. in a voluntary monitoring programme in broiler flocks in Germany and in the reported human incidence, time series methods were used. The data were collected between May 2004 and June 2007. Using seasonal decomposition and autocorrelation and cross-correlation functions, an annual seasonality could be demonstrated. However, the peak month differs between sample submission, prevalence in broilers and human incidence. Strikingly, the peak in human campylobacteriosis cases preceded the peak in broiler prevalence in Lower Saxony rather than occurring after it. Significant cross-correlations were identified between monthly temperature and prevalence in broilers, as well as between human incidence and monthly temperature, rainfall and wind force. The results highlight the need to quantify the transmission of Campylobacter from broilers to humans and to include climatic factors in order to gain further insight into the epidemiology of this zoonotic disease.
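The cross-correlation functions used here pair one series with a lagged copy of another, so a climatic driver that leads the response by k months shows up as a peak in the CCF at lag k. A minimal sketch with hypothetical monthly data (not the study's):

```python
import math

def cross_corr(x, y, lag):
    """Pearson correlation of x[t] with y[t + lag] (lag >= 0)."""
    pairs = [(x[t], y[t + lag]) for t in range(len(x) - lag)]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a, _ in pairs))
    sy = math.sqrt(sum((b - my) ** 2 for _, b in pairs))
    return sum((a - mx) * (b - my) for a, b in pairs) / (sx * sy)

# Hypothetical monthly series: `response` repeats `driver` two months
# later, so the CCF peaks at lag 2 -- the signature of a lead-lag
# relationship such as temperature preceding prevalence.
driver = [math.sin(2 * math.pi * t / 12) for t in range(48)]
response = [math.sin(2 * math.pi * (t - 2) / 12) for t in range(48)]
```

With strongly seasonal series, both series should normally be deseasonalized first, otherwise the shared annual cycle alone produces large cross-correlations at many lags.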
Abstract:
Utilizing advanced information technology, Intensive Care Unit (ICU) remote monitoring allows highly trained specialists to oversee a large number of patients at multiple sites on a continuous basis. In the current research, we conducted a time-motion study of registered nurses’ work in an ICU remote monitoring facility. Data were collected on seven nurses through 40 hours of observation. The results showed that nurses’ essential tasks were centered on three themes: monitoring patients, maintaining patients’ health records, and managing technology use. In monitoring patients, nurses spent 52% of the time assimilating information embedded in a clinical information system and 15% on monitoring live vitals. System-generated alerts frequently interrupted nurses in their task performance and redirected them to manage suddenly appearing events. These findings provide insight into nurses’ workflow in a new, technology-driven critical care setting and have important implications for system design, work engineering, and personnel selection and training.
DIGITAL BOUNDARY DETECTION, VOLUMETRIC AND WALL MOTION ANALYSIS OF LEFT VENTRICULAR CINE ANGIOGRAMS.
Abstract:
SUMMARY Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
Abstract:
BACKGROUND Acute myeloid leukaemia mainly affects elderly people, with a median age at diagnosis of around 70 years. Although about 50-60% of patients enter first complete remission upon intensive induction chemotherapy, relapse rates remain high and overall outcomes are disappointing. Therefore, effective post-remission therapy is urgently needed. Although elderly patients often receive no post-remission therapy at all, options include chemotherapy or allogeneic haemopoietic stem cell transplantation (HSCT) following reduced-intensity conditioning. We aimed to assess the comparative value of allogeneic HSCT with other approaches, including no post-remission therapy, in patients with acute myeloid leukaemia aged 60 years and older. METHODS For this time-dependent analysis, we used the results from four successive prospective HOVON-SAKK acute myeloid leukaemia trials. Between May 3, 2001, and Feb 5, 2010, a total of 1155 patients aged 60 years and older were entered into these trials, of whom 640 obtained a first complete remission after induction chemotherapy and were included in the analysis. Post-remission therapy consisted of allogeneic HSCT following reduced-intensity conditioning (n=97), gemtuzumab ozogamicin (n=110), chemotherapy (n=44), autologous HSCT (n=23), or no further treatment (n=366). Reduced-intensity conditioning regimens consisted of fludarabine combined with 2 Gy of total body irradiation (n=71), fludarabine with busulfan (n=10), or other regimens (n=16). A time-dependent analysis was done, in which allogeneic HSCT was compared with other types of post-remission therapy. The primary endpoint of the study was 5-year overall survival for all treatment groups, analysed by a time-dependent analysis.
FINDINGS 5-year overall survival was 35% (95% CI 25-44) for patients who received an allogeneic HSCT, 21% (17-26) for those who received no additional post-remission therapy, and 26% (19-33) for patients who received either additional chemotherapy or autologous HSCT. Overall survival at 5 years was strongly affected by the European LeukemiaNET acute myeloid leukaemia risk score, with patients in the favourable risk group (n=65) having better 5-year overall survival (56% [95% CI 43-67]) than those with intermediate-risk (n=131; 23% [19-27]) or adverse-risk (n=444; 13% [8-20]) acute myeloid leukaemia. Multivariable analysis with allogeneic HSCT as a time-dependent variable showed that allogeneic HSCT was associated with better 5-year overall survival (HR 0.71 [95% CI 0.53-0.95], p=0.017) compared with non-allogeneic HSCT post-remission therapies or no post-remission therapy, especially in patients with intermediate-risk (0.82 [0.58-1.15]) or adverse-risk (0.39 [0.21-0.73]) acute myeloid leukaemia. INTERPRETATION Collectively, the results from these four trials suggest that allogeneic HSCT might be the preferred treatment approach in patients 60 years of age and older with intermediate-risk and adverse-risk acute myeloid leukaemia in first complete remission, but the comparative value should ideally be shown in a prospective randomised study. FUNDING None.
Abstract:
PURPOSE To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. MATERIALS AND METHODS This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using the digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using the conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. RESULTS All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, defined as the sum of clinical plus laboratory work steps, was significantly different between groups. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, which decreased significantly to 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). CONCLUSION Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine.
This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
Abstract:
Reelection and self-interest are recurring themes in the study of our congressional leaders. To date, many studies have examined the relationships between elections, party affiliation, and voting behavior in Congress. However, because a wealth of data has been collected on both elections and congressional voting, drawing a connection between the two is a very reasonable prospect. This project analyzes whether voting shifts in congressional elections have an effect on congressional voting. Will a congressman become ideologically more polarized when his electoral margins increase? Essentially, this paper assumes that all congressmen are ideologically polarized, and that it is elections which serve to reel congressmen back toward the ideological middle. The election and ideological data for this study, which span the 56th to the 107th Congress, reveal statistically significant relationships between these two variables: congressmen do pay attention to election returns when voting in Congress. When broken down by party, Democrats exhibit this phenomenon more strongly, which suggests that Democrats may be more likely to intrinsically follow the popular model of representation. Meanwhile, the insignificant results for Republicans may indicate that Republicans follow a trustee model of representation.
Abstract:
OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting. DESIGN. A historical analysis of existing clinical data utilizing an interrupted time series design. SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX from January 2005 through December 2010. INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month intervention implementation period culminating with the implementation of active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in incidence of healthcare-associated MRSA during the 29-month pre-intervention period compared to the 43-month post-intervention period. RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), contributing to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P<0.001). Both the intervention and the percentage of S. aureus isolates susceptible to oxacillin were statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI = 0.31-0.80 and IRR = 0.004, 95% CI = 0.00003-0.40, respectively). CONCLUSIONS. Aggressively targeting patients at high risk for MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
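The headline comparison in an interrupted time-series analysis like this reduces to event counts per unit of exposure time in each segment. A hedged sketch with made-up monthly counts (not the study's data):

```python
# Made-up monthly (cases, patient-days) pairs, for illustration only.
pre_intervention  = [(9, 8000), (10, 8200), (8, 7900), (11, 8100)]
post_intervention = [(3, 8100), (2, 8000), (4, 8300), (3, 8200)]

def rate_per_1000(months):
    """Events per 1,000 patient-days, pooled across months."""
    cases = sum(c for c, _ in months)
    days = sum(d for _, d in months)
    return 1000.0 * cases / days

pre_rate = rate_per_1000(pre_intervention)    # about 1.18 per 1,000 pd
post_rate = rate_per_1000(post_intervention)  # about 0.37 per 1,000 pd
irr = post_rate / pre_rate                    # incidence rate ratio < 1
```

A full segmented Poisson regression additionally adjusts for the pre-existing secular trend and for autocorrelation rather than simply pooling the segments, but the incidence rate ratio above is the quantity it estimates.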
Abstract:
The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one value per variable paradigm and is widely employed in a host of clinical models and tools. These are often represented by a number present in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The other two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable, and constitute the measured observations that are typically available to end users when they review time series data. These are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood that a representation of the time series data elements is produced that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU", provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled: "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. 
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
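A trend-analysis feature of the kind described, a single number summarizing whether a vital sign is deteriorating over a window, can be as simple as a least-squares slope. The sketch below is illustrative only; the manuscripts' exact operations are not specified here:

```python
def window_slope(window):
    """Least-squares slope of equally spaced observations: one simple
    way to turn a raw time-series window into a 'trend' latent feature."""
    n = len(window)
    t_mean = (n - 1) / 2.0
    y_mean = sum(window) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(window))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# A falling systolic blood pressure over a short window yields a
# negative slope, letting a classifier distinguish "low but stable"
# from "deteriorating" where a single snapshot value could not.
sbp_window = [112, 110, 107, 103, 98, 92, 85]  # hypothetical readings
```

The duration and resolution of the window, and which such operations to apply, are exactly the design-phase decisions the first manuscript discusses.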
Abstract:
The spatial and temporal dynamics of seagrasses have been well studied at the leaf to patch scales; however, the link to landscape and population dynamics over large spatial extents remains unresolved in seagrass ecology. Traditional remote sensing approaches have lacked the temporal resolution and consistency to address this issue appropriately. This study uses two high-temporal-resolution time-series of thematic seagrass cover maps to examine the spatial and temporal dynamics of seagrass at both inter- and intra-annual time scales, one of the first studies globally to do so at this scale. Previous work by the authors developed an object-based approach to map seagrass cover level distribution from a long-term archive of Landsat TM and ETM+ images on the Eastern Banks (~200 km²), Moreton Bay, Australia. In this work a range of trend and time-series analysis methods are demonstrated for a time-series of 23 annual maps from 1988 to 2010 and a time-series of 16 monthly maps during 2008-2010. Significant new insight was gained regarding the inter- and intra-annual dynamics of seagrass persistence over time, seagrass cover level variability, seagrass cover level trajectory, and change in the area of seagrass and cover levels over time. Overall we found no significant decline in total seagrass area on the Eastern Banks, but a significant decline in seagrass cover level condition. A case study of two smaller communities within the Eastern Banks that experienced a decline in both overall seagrass area and condition is examined in detail, highlighting possible differences in environmental and process drivers. We demonstrate how trend and time-series analysis enabled seagrass distribution to be assessed appropriately in the context of its spatial and temporal history, providing the ability not only to quantify change but also to describe the type of change.
We also demonstrate the potential use of time-series analysis products to investigate seagrass growth and decline, as well as the processes that drive them. This study demonstrates clear benefits over traditional seagrass mapping and monitoring approaches, and provides a proof of concept for the use of trend and time-series analysis of remotely sensed seagrass products to benefit current endeavours in seagrass ecology.