5 results for Reversals: Process, Time Scale, Magnetostratigraphy
in DigitalCommons@The Texas Medical Center
Abstract:
During the healthcare reform debate in the United States in 2009-2010, many health policy experts expressed concern that expanding coverage would increase waiting times for patients to obtain care, and that those delays would in turn compromise the quality of healthcare in the United States. Using data from The Commonwealth Fund 2010 International Health Policy Survey in Eleven Countries, this study explored the relationship between wait times and quality of care, employing a wait-time scale and several quality-of-care indicators present in the dataset. The impact of wait times on quality was assessed, with the expectation that increased wait time would reduce quality of care. However, the study found that wait times correlated with better health outcomes for some measures and had no association with others. Since this is a pilot study and statistical significance was not achieved for any of the correlations, further research is needed to confirm and deepen the findings. If future studies do confirm this finding, however, an emphasis on reducing wait times at the expense of other health-system-level performance variables may be inappropriate.
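As an illustration of the kind of correlation analysis described above, the following is a minimal sketch, assuming a simple Pearson correlation between a wait-time scale score and a single quality-of-care indicator across the eleven surveyed countries; the variable names and all numeric values are hypothetical, not data from the survey.

```python
import numpy as np
from scipy import stats

# Hypothetical country-level values: a wait-time scale score and one
# quality-of-care indicator for eleven countries (not survey data)
wait_scale = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 1.6, 3.8, 2.2, 2.7, 3.0])
quality    = np.array([78.0, 74.0, 81.0, 76.0, 79.0, 80.0, 83.0, 72.0, 77.0, 75.0, 78.0])

r, p = stats.pearsonr(wait_scale, quality)
print(f"r = {r:.2f}, p = {p:.3f}")  # with n = 11, a modest r rarely reaches p < 0.05
```

With only eleven observations, even moderately sized correlations are unlikely to reach conventional significance thresholds, which is consistent with the pilot-study caveat noted in the abstract.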
Abstract:
One of the fundamental questions in neuroscience is to understand how the encoding of sensory inputs is distributed across neuronal networks in cerebral cortex to influence sensory processing and behavioral performance. Because neuronal networks are organized into cortical layers, sensory information may be processed differently in distinct layers. The goal of my thesis research is to understand how laminar circuits encode information in their population activity, how the properties of the population code adapt to changes in visual input, and how population coding influences behavioral performance. To this end, we performed a series of novel experiments to investigate how sensory information in the primary visual cortex (V1) emerges across laminar cortical circuits. First, the amount of information encoded by cortical circuits is known to depend critically on whether nearby neurons exhibit correlated variability. We examined correlated variability in V1 circuits from a laminar-specific perspective and observed that cells in the input layer, which have only local projections, encode incoming stimuli optimally by exhibiting low correlated variability. In contrast, output layers, which send projections to other cortical and subcortical areas, encode information suboptimally by exhibiting large correlations. These results argue that neuronal populations in different cortical layers play different roles in network computations. Second, a fundamental feature of cortical neurons is their ability to adapt to changes in incoming stimuli, and understanding how adaptation emerges across cortical layers to influence information processing is vital for understanding efficient sensory coding. We examined the effects of adaptation, operating on the time scale of a visual fixation, on network synchronization across laminar circuits. Specific to the superficial layers, we observed an increase in gamma-band (30-80 Hz) synchronization after adaptation that was correlated with an improvement in neuronal orientation-discrimination performance. Thus, synchronization enhances sensory coding to optimize network processing across laminar circuits. Finally, we tested the hypothesis that individual neurons and local populations synchronize their activity in real time to communicate information about incoming stimuli, and that the degree of synchronization influences behavioral performance. These analyses assessed, for the first time, the relationship between changes in laminar cortical networks involved in stimulus processing and behavioral performance.
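The correlated-variability analyses described above rest on pairwise spike-count ("noise") correlations computed across repeated presentations of the same stimulus. The following is a minimal sketch of that computation; the function name and the simulated data are hypothetical and illustrate only the measure, not the laminar analysis itself.

```python
import numpy as np

def mean_noise_correlation(spike_counts):
    """Mean pairwise spike-count (noise) correlation.

    spike_counts: (n_trials, n_neurons) array of responses to repeated
    presentations of the same stimulus; Pearson correlation across trials
    captures shared trial-to-trial variability between neuron pairs.
    """
    corr = np.corrcoef(spike_counts, rowvar=False)        # n_neurons x n_neurons
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]    # drop self-correlations
    return off_diag.mean()

# Hypothetical data: 200 trials of 10 neurons sharing a weak common fluctuation
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 1))
counts = rng.poisson(lam=5.0 + 0.5 * shared, size=(200, 10))
print(round(mean_noise_correlation(counts), 3))
```

Low values of this statistic correspond to the near-independent variability the abstract attributes to input-layer populations, while larger values correspond to the stronger shared variability reported for output layers.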
Abstract:
A bench-scale treatability study was conducted on a high-strength wastewater from a chemical plant to develop an alternative to the existing waste stabilization pond treatment system. The objectives of this study were to determine the treatability of the wastewater by the activated sludge process, to determine appropriate operating conditions if it proved treatable, and to evaluate the degradability of bis(2-chloroethyl)ether (Chlorex) and benzene in the activated sludge system. Four 4-L Plexiglas, complete-mix, continuous-flow activated sludge reactors were operated in parallel under different operating conditions over a 6-month period. The operating conditions examined were hydraulic retention time (HRT), sludge retention time (SRT), nutrient supplementation, and Chlorex/benzene spikes. In general, the activated sludge system treating high-strength wastewater was stable under large variations in organic loading and operating conditions. At an HRT of 2 days, more than 90% removal efficiency with good sludge settleability was achieved when the organic loading was less than 0.4 g BOD₅/g MLVSS/d or 0.8 g COD/g MLVSS/d. An SRT of at least 20 days was required to maintain steady operation. Phosphorus addition enhanced the performance of the system, especially during stressed operation. On average, removals of benzene and Chlorex were 73-86% and 37-65%, respectively. In addition, the low-strength wastewater was treatable by the activated sludge process, showing more than 90% BOD removal at an HRT of 0.5 days, although the sludge generally had poor settling characteristics. The aerated lagoon process treating high-strength wastewater also provided significant organic reduction but did not produce an acceptable effluent concentration.
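The loading thresholds reported above (e.g., 0.4 g BOD₅/g MLVSS/d) are food-to-microorganism (F/M) ratios. Below is a minimal sketch of how such a loading and the corresponding HRT are computed from flow, influent BOD₅, reactor volume, and MLVSS; the numeric values are hypothetical and chosen only to mirror the 4-L, 2-day-HRT scale of the study.

```python
def organic_loading(flow_m3_per_d, influent_bod_mg_per_l,
                    reactor_volume_m3, mlvss_mg_per_l):
    """Food-to-microorganism loading, g BOD5 applied per g MLVSS per day.

    F/M = (Q * S0) / (V * X); the concentration and volume units cancel
    as long as BOD5 and MLVSS are expressed in the same units.
    """
    return (flow_m3_per_d * influent_bod_mg_per_l) / (reactor_volume_m3 * mlvss_mg_per_l)

def hydraulic_retention_time(reactor_volume_m3, flow_m3_per_d):
    """HRT in days: reactor volume divided by influent flow."""
    return reactor_volume_m3 / flow_m3_per_d

# Hypothetical bench-scale numbers (4-L reactor operated at a 2-day HRT)
V = 0.004            # m3 (4 L)
Q = V / 2.0          # m3/d, giving HRT = 2 d
print(hydraulic_retention_time(V, Q))          # 2.0 d
print(organic_loading(Q, 2000.0, V, 3000.0))   # ~0.33 g BOD5/g MLVSS/d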
Abstract:
Analysis of recurrent events has been widely discussed in the medical, health services, insurance, and engineering areas in recent years. This research proposes using a nonhomogeneous Yule process with a proportional intensity assumption to model the hazard function for recurrent-event data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One advantage of using a nonhomogeneous Yule process for recurrent events is that it assumes the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to estimate the parameters of the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. An example concerning recurrent myocardial infarction events, comparing two distinct populations (Mexican-Americans and Non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
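A minimal sketch of the intensity structure described above is given below, assuming (as the abstract states) that the recurrence rate scales with the number of events accumulated up to time t, so that the conditional intensity is N(t⁻) · λ_0(t) · exp(x′β). The simulation uses standard Lewis-Shedler thinning; the baseline intensity, covariates, and coefficients are hypothetical and not taken from the dissertation.

```python
import numpy as np

def simulate_yule_prop_intensity(x, beta, lam0, lam0_max, t_end, n0=1, seed=0):
    """Simulate event times on [0, t_end] for an intensity of the form
        lambda(t) = N(t-) * lam0(t) * exp(x' beta),
    i.e. a nonhomogeneous Yule (linear birth) process with a proportional
    covariate effect. Uses Lewis-Shedler thinning; lam0_max must bound lam0
    on [0, t_end]. All parameter values below are hypothetical.
    """
    rng = np.random.default_rng(seed)
    rel_risk = np.exp(np.dot(x, beta))
    t, n, events = 0.0, n0, []
    while t < t_end:
        bound = n * lam0_max * rel_risk          # valid until the next accepted event
        t += rng.exponential(1.0 / bound)        # candidate event time
        if t >= t_end:
            break
        if rng.uniform() < (n * lam0(t) * rel_risk) / bound:
            events.append(t)                     # accept: event count grows
            n += 1
    return events

# Hypothetical baseline intensity and covariates
lam0 = lambda t: 0.2 + 0.1 * np.sin(t)          # bounded above by 0.3
x = np.array([1.0, 0.0])                        # e.g. a group indicator and one other term
beta = np.array([0.5, 0.0])
print(simulate_yule_prop_intensity(x, beta, lam0, 0.3, t_end=10.0))
```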
Abstract:
A general model for the illness-death stochastic process with covariates has been developed for the analysis of survival data. The model incorporates important baseline and time-dependent covariates to properly adjust the transition probabilities and survival probabilities. The follow-up period is subdivided into small intervals, and a constant hazard is assumed within each interval. An approximation formula is derived to estimate the transition parameters when the exact transition time is unknown. The method is illustrated using data from the Beta-Blocker Heart Attack Trial (BHAT), a study on the prevention of recurrent myocardial infarction and subsequent mortality. This approach simultaneously accommodates both fatal and nonfatal events in the model, so the effectiveness of treatment can be compared between the placebo and propranolol groups with respect to both types of event.
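The piecewise-constant hazard assumption described above makes cumulative hazards, and hence survival or transition probabilities, simple sums over intervals. The sketch below computes S(t) = exp(-Λ(t)) for a single transition under that assumption; the interval boundaries and hazard values are hypothetical.

```python
import numpy as np

def piecewise_constant_survival(t, cut_points, hazards):
    """Survival probability S(t) = exp(-cumulative hazard) when the hazard
    is constant within each interval.

    cut_points: interval boundaries [t0 = 0, t1, ..., tk]
    hazards:    hazard value on each of the k intervals
    """
    cut_points = np.asarray(cut_points, dtype=float)
    hazards = np.asarray(hazards, dtype=float)
    # time spent in each interval up to t
    exposure = np.clip(np.minimum(t, cut_points[1:]) - cut_points[:-1], 0.0, None)
    return float(np.exp(-np.sum(hazards * exposure)))

# Hypothetical yearly intervals and hazards for one transition
cuts = [0.0, 1.0, 2.0, 3.0]
haz  = [0.10, 0.08, 0.05]
print(piecewise_constant_survival(1.5, cuts, haz))   # exp(-(0.10*1 + 0.08*0.5))
```

In a full illness-death model the same interval structure would be applied to each transition (healthy to ill, healthy to dead, ill to dead), with the covariate-adjusted hazards supplying the interval-specific values.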