924 results for time domain analysis
Abstract:
The original cefepime product was withdrawn from the Swiss market in January 2007 and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa toward carbapenems, ceftazidime, and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.
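The interrupted time-series design described above can be illustrated with a minimal segmented-regression sketch: an intercept, a baseline trend, a level change at the interruption month, and a slope change afterwards, fit by ordinary least squares. This is a pure-Python toy on synthetic data, not the study's generalized regression model over 15 hospitals; all variable names and numbers here are assumptions.

```python
# Toy segmented (interrupted time-series) regression in pure Python.
# Synthetic monthly series with a level jump at month t0 (hypothetical data).

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small square system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def segmented_fit(y, t0):
    # Design: intercept, baseline trend, level change at t0, slope change after t0.
    X = [[1.0, t, float(t >= t0), max(0, t - t0)] for t in range(len(y))]
    p = len(X[0])
    XtX = [[sum(X[i][a] * X[i][c] for i in range(len(y))) for c in range(p)]
           for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(y))) for a in range(p)]
    return solve(XtX, Xty)

# Synthetic use: baseline 50 + 0.5*t, level jump of +20 from month 24 onward.
y = [50 + 0.5 * t + (20 if t >= 24 else 0) for t in range(48)]
beta = segmented_fit(y, 24)  # [intercept, trend, level change, slope change]
```

On this noiseless series the fit recovers the level change (beta[2] near 20) exactly; real consumption data would of course add noise and, as in the study, seasonal and hospital-level terms.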
Abstract:
The purpose of this study was to compare inter-observer agreement of Stratus™ OCT versus Spectralis™ OCT image grading in patients with neovascular age-related macular degeneration (AMD). Thirty eyes with neovascular AMD were examined with Stratus™ OCT and Spectralis™ OCT. Four different scan protocols were used for imaging. Three observers graded the images for the presence of various pathologies. Inter-observer agreement between OCT models was assessed by calculating intra-class correlation coefficients (ICC). In Stratus™ OCT, the highest inter-observer agreement was found for subretinal fluid (ICC: 0.79), and in Spectralis™ OCT for intraretinal cysts (IRC) (ICC: 0.93). Spectralis™ OCT showed superior inter-observer agreement for IRC and epiretinal membranes (ERM) (ICC(Stratus™): 0.61 for IRC, 0.56 for ERM; ICC(Spectralis™): 0.93 for IRC, 0.84 for ERM). The increased image resolution of Spectralis™ OCT improved inter-observer agreement for grading intraretinal cysts and epiretinal membranes, but not for other retinal changes.
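As a rough illustration of the agreement statistic reported above, the sketch below computes a one-way random-effects ICC, ICC(1,1), in pure Python. The study's exact ICC variant is not specified in the abstract, and the rating matrices here are hypothetical.

```python
# Toy ICC(1,1): one-way random-effects intra-class correlation.
# ratings: one row per graded image, one column per observer (hypothetical data).

def icc1(ratings):
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # Between-image and within-image mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

perfect = [[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]]  # three observers agree exactly
noisy = [[1, 1, 0], [0, 0, 0], [1, 1, 1], [0, 1, 0]]    # some disagreement
```

Perfect agreement gives ICC = 1.0; disagreement pulls the coefficient down, which is the scale on which the 0.56 to 0.93 values above are read.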
Abstract:
Objective: We compare the prognostic strength of the lymph node ratio (LNR), positive lymph nodes (+LNs) and collected lymph nodes (LNcoll) using a time-dependent analysis in colorectal cancer patients stratified by mismatch repair (MMR) status. Method: 580 stage III-IV patients were included. Multivariable Cox regression analysis and time-dependent receiver operating characteristic (tROC) curve analysis were performed. The area under the curve (AUC) over time was compared for the three features. Results were validated on a second cohort of 105 stage III-IV patients. Results: The AUC for the LNR was 0.71 and outperformed +LNs and LNcoll by 10–15% in both MMR-proficient and MMR-deficient cancers. LNR and +LNs were both significant (p<0.0001) in multivariable analysis, but the effect was considerably stronger for the LNR [LNR: HR=5.18 (95% CI: 3.5–7.6); +LNs: HR=1.06 (95% CI: 1.04–1.08)]. Similar results were obtained for patients with >12 LNcoll. An optimal cut-off score of LNR=0.231 was validated on the second cohort (p<0.001). Conclusion: The LNR outperforms +LNs and LNcoll even in patients with >12 LNcoll. Its clinical value is not confounded by MMR status. A cut-off score of 0.231 may best stratify patients into prognostic subgroups and could be a basis for future prospective analysis of the LNR.
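The ratio and cutoff described above are simple to state in code: LNR is positive nodes divided by collected nodes, and the reported optimal cutoff of 0.231 splits patients into prognostic groups. The sketch below is illustrative only; the patient tuples are invented, and the group labels are assumptions rather than the study's terminology.

```python
# Illustrative lymph node ratio (LNR) and cutoff stratification (hypothetical data).

def lymph_node_ratio(positive, collected):
    if collected <= 0:
        raise ValueError("collected lymph node count must be positive")
    return positive / collected

def risk_group(positive, collected, cutoff=0.231):
    # 0.231 is the cut-off score validated in the abstract above.
    return "high" if lymph_node_ratio(positive, collected) > cutoff else "low"

patients = [(2, 15), (6, 18), (1, 22)]  # (+LNs, LNcoll) pairs, hypothetical
groups = [risk_group(p, c) for p, c in patients]
```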
Abstract:
A time series is a sequence of observations made over time. Examples in public health include daily ozone concentrations, weekly admissions to an emergency department or annual expenditures on health care in the United States. Time series models are used to describe the dependence of the response at each time on predictor variables including covariates and possibly previous values in the series. Time series methods are necessary to account for the correlation among repeated responses over time. This paper gives an overview of time series ideas and methods used in public health research.
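The "correlation among repeated responses over time" that the overview above emphasizes is usually summarized by the sample autocorrelation at a given lag. A minimal pure-Python sketch, on synthetic series (not any data from the paper):

```python
# Lag-k sample autocorrelation of a series (illustrative, pure Python).

def autocorr(y, k):
    n = len(y)
    m = sum(y) / n
    denom = sum((v - m) ** 2 for v in y)
    num = sum((y[t] - m) * (y[t + k] - m) for t in range(n - k))
    return num / denom

# A trending series is strongly positively autocorrelated at lag 1;
# a strictly alternating series is strongly negatively autocorrelated.
trend = list(range(100))
alternating = [(-1) ** t for t in range(100)]
```

Ignoring this dependence (e.g. treating daily ozone readings as independent) understates standard errors, which is why dedicated time-series methods are needed.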
Abstract:
We present an overview of different methods for decomposing a multichannel spontaneous electroencephalogram (EEG) into sets of temporal patterns and topographic distributions. All of the methods presented here consider the scalp electric field as the basic analysis entity in space. In time, the resolution of the methods is between milliseconds (time-domain analysis), subseconds (time- and frequency-domain analysis) and seconds (frequency-domain analysis). For any of these methods, we show that large parts of the data can be explained by a small number of topographic distributions. Physically, this implies that the brain regions that generated one of those topographies must have been active with a common phase. If several brain regions are producing EEG signals at the same time and frequency, they have a strong tendency to do this in a synchronized mode. This view is illustrated by several examples (including combined EEG and functional magnetic resonance imaging (fMRI)) and a selective review of the literature. The findings are discussed in terms of short-lasting binding between different brain regions through synchronized oscillations, which could constitute a mechanism to form transient, functional neurocognitive networks.
Abstract:
Prediction of radiated fields from transmission lines has not previously been studied from a panoptical power system perspective. The application of BPL technologies to overhead transmission lines would benefit greatly from an ability to simulate real power system environments, not limited to the transmission lines themselves. Presently, circuit-based transmission line models used by EMTP-type programs utilize Carson’s formula for a waveguide parallel to an interface. This formula is not valid for calculations at high frequencies, considering effects of earth return currents. This thesis explains the challenges of developing such improved models, explores an approach to combining circuit-based and electromagnetics modeling to predict radiated fields from transmission lines, exposes inadequacies of simulation tools, and suggests methods of extending the validity of transmission line models into very high frequency ranges. Electromagnetics programs are commonly used to study radiated fields from transmission lines. However, an approach is proposed here which is also able to incorporate the components of a power system through the combined use of EMTP-type models. Carson’s formulas address the series impedance of electrical conductors above and parallel to the earth. These equations have been analyzed to show their inherent assumptions and what the implications are. Additionally, the lack of validity into higher frequencies has been demonstrated, showing the need to replace Carson’s formulas for these types of studies. This body of work leads to several conclusions about the relatively new study of BPL. Foremost, there is a gap in modeling capabilities which has been bridged through integration of circuit-based and electromagnetics modeling, allowing more realistic prediction of BPL performance and radiated fields. The proposed approach is limited in its scope of validity due to the formulas used by EMTP-type software.
To extend the range of validity, a new set of equations must be identified and implemented in the approach. Several potential methods of implementation have been explored. Though an appropriate set of equations has not yet been identified, further research in this area will benefit from a clear depiction of the next important steps and how they can be accomplished.
Abstract:
The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
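The distribution-free result the abstract opens with is the classical record statistic: for a stationary (exchangeable) series of n values, the expected number of record highs is the harmonic number H_n = 1 + 1/2 + ... + 1/n, whatever the underlying distribution. The simulation below checks this on uniform draws; it is a sketch of the general result, not the paper's temperature analysis.

```python
# Expected record count in an i.i.d. series equals the harmonic number H_n,
# regardless of distribution. Verified here by simulation on uniform draws.

import random

def count_records(series):
    best = float("-inf")
    records = 0
    for v in series:
        if v > best:  # strict record high
            best = v
            records += 1
    return records

def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

random.seed(0)
n, trials = 100, 20000
mean_records = sum(count_records([random.random() for _ in range(n)])
                   for _ in range(trials)) / trials
# mean_records should be close to harmonic(100), about 5.19
```

Trends in mean or variance break this baseline in characteristic ways, which is what the indices described above exploit to detect weak climate trends.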
Abstract:
A publication entitled “A default mode of brain function” initiated a new way of looking at functional imaging data. In this PET study the authors discussed the often-observed consistent decrease of brain activation in a variety of tasks as compared with the baseline. They suggested that this deactivation is due to a task-induced suspension of a default mode of brain function that is active during rest, i.e. that there exists intrinsic well-organized brain activity during rest in several distinct brain regions. This suggestion led to a large number of imaging studies on the resting state of the brain and to the conclusion that the study of this intrinsic activity is crucial for understanding how the brain works. The fact that the brain is active during rest has been well known from a variety of EEG recordings for a very long time. Different states of the brain in the sleep–wake continuum are characterized by typical patterns of spontaneous oscillations in different frequency ranges and in different brain regions. Best studied are the evolving states during the different sleep stages, but characteristic EEG oscillation patterns have also been well described during awake periods (see Chapter 1 for details). A highly recommended comprehensive review on the brain's default state defined by oscillatory electrical brain activities is provided in the recent book by György Buzsaki, showing how these states can be measured by electrophysiological procedures at the global brain level as well as at the local cellular level.
Abstract:
The lipoprotein LppQ is the most prominent antigen of Mycoplasma mycoides subsp. mycoides small colony type (SC) during infection of cattle. This pathogen causes contagious bovine pleuropneumonia (CBPP), a devastating disease of considerable socio-economic importance in many countries worldwide. The dominant antigenicity and high specificity of lipoprotein LppQ for M. mycoides subsp. mycoides SC have been exploited for serological diagnosis and for epidemiological investigations of CBPP. Scanning electron microscopy and immunogold labelling were used to provide ultrastructural evidence that LppQ is localized to the cell membrane at the outer surface of M. mycoides subsp. mycoides SC. The selectivity and specificity of this method were demonstrated by discriminating extracellular domains of LppQ (i.e., in the zone of contact with host cells) from integral membrane domains. Thus, our findings support the suggestion that the accessible N-terminal domain of LppQ is surface exposed and that this surface localization may be implicated in the pathogenesis of CBPP.
Abstract:
Campylobacter, a major zoonotic pathogen, displays seasonality in poultry and in humans. In order to identify temporal patterns in the prevalence of thermophilic Campylobacter spp. in a voluntary monitoring programme in broiler flocks in Germany and in the reported human incidence, time series methods were used. The data were collected between May 2004 and June 2007. By the use of seasonal decomposition, autocorrelation and cross-correlation functions, it could be shown that an annual seasonality is present. However, the peak month differs between sample submission, prevalence in broilers and human incidence. Strikingly, the peak in human campylobacterioses preceded the peak in broiler prevalence in Lower Saxony rather than occurring after it. Significant cross-correlations between monthly temperature and prevalence in broilers, as well as between human incidence, monthly temperature, rainfall and wind force, were identified. The results highlight the necessity to quantify the transmission of Campylobacter from broilers to humans and to include climatic factors in order to gain further insight into the epidemiology of this zoonotic disease.
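The lead/lag finding above rests on the cross-correlation function: the lag at which two monthly series are most correlated indicates which one leads. A hedged pure-Python sketch on synthetic seasonal series (the variable names "human" and "broiler" here are assumptions, not the study's data):

```python
# Cross-correlation at a given lag, and the lag of maximum correlation
# (illustrative; synthetic sinusoidal "seasonal" series).

import math

def crosscorr(x, y, lag):
    # Pearson correlation of x[t] with y[t + lag] over the overlapping range.
    if lag >= 0:
        xs, ys = x[:len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[:len(y) + lag]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sx = math.sqrt(sum((v - mx) ** 2 for v in xs))
    sy = math.sqrt(sum((v - my) ** 2 for v in ys))
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (sx * sy)

# Synthetic annual cycles where the "human" series peaks two months earlier.
months = range(48)
human = [math.sin(2 * math.pi * (t + 2) / 12) for t in months]
broiler = [math.sin(2 * math.pi * t / 12) for t in months]
best = max(range(-6, 7), key=lambda k: crosscorr(human, broiler, k))
# best == 2: human leads broiler by two months in this toy example
```

A maximum at a positive lag of the second series relative to the first is the kind of evidence behind the striking observation that human cases peaked before broiler prevalence.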