7 results for Standard Time
in DigitalCommons@The Texas Medical Center
Abstract:
BACKGROUND: Few reports of an accurate, cost-effective means for measuring HPV oncogene transcripts have been published. Several papers have reported the use of relative quantitation or more expensive TaqMan methods. Here, we report a method of absolute quantitative real-time PCR utilizing SYBR Green fluorescence for the measurement of HPV E7 expression in cervical cytobrush specimens. RESULTS: The construction of a standard curve based on serial dilution of an E7-containing plasmid was key to accurately comparing measurements between cervical samples. The assay was highly reproducible, with an overall coefficient of variation of 10.4%. CONCLUSION: The use of highly reproducible and accurate SYBR-based real-time polymerase chain reaction (PCR) assays instead of TaqMan-type assays allows low-cost, high-throughput analysis of viral mRNA expression. The development of such assays will help refine current screening programs for HPV-related carcinomas.
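As a hedged illustration of the absolute-quantification step described in this abstract (not the authors' actual analysis code): a standard curve is typically fit as a linear regression of Ct against log10 plasmid copy number, and unknown samples are interpolated from that fit. The dilution series, Ct values, and efficiency formula below are generic qPCR conventions, not values from the study.

```python
import numpy as np

# Hypothetical standard curve: 10-fold serial dilutions of an E7-containing
# plasmid (copies per reaction) and the Ct measured for each dilution.
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3, 1e2])
ct = np.array([14.1, 17.5, 20.9, 24.3, 27.8, 31.2])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# Amplification efficiency from the slope (100% corresponds to a slope of about -3.32).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(sample_ct):
    """Interpolate the absolute copy number for a sample Ct from the standard curve."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"sample at Ct 22.0 ~ {copies_from_ct(22.0):.2e} E7 copies per reaction")
```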
Abstract:
PURPOSE: To review our clinical experience and determine whether there are appropriate signs and symptoms that should prompt POLG sequencing prior to valproic acid (VPA) dosing in patients with seizures. METHODS: Four patients who developed VPA-induced hepatotoxicity were examined for POLG sequence variations. A subsequent chart review was used to describe the clinical course before and after VPA dosing. RESULTS: Four patients of different ethnicities, aged 3-18 years, developed VPA-induced hepatotoxicity. All were given VPA for intractable partial seizures. Three of the patients had developed epilepsia partialis continua. The time from VPA exposure to liver failure was between 2 and 3 months. Liver failure was reversible in one patient. Molecular studies revealed homozygous p.R597W or p.A467T mutations in two patients. The other two patients showed compound heterozygous mutations, p.A467T/p.Q68X and p.L83P/p.G888S. Clinical findings and POLG mutations were diagnostic of Alpers-Huttenlocher syndrome. CONCLUSION: Our cases underscore several important findings: POLG mutations have been observed in every ethnic group studied to date; early predominance of epileptiform discharges over the occipital region is common in POLG-induced epilepsy; EEG and MRI findings vary between patients and stages of the disease; and VPA dosing at any stage of Alpers-Huttenlocher syndrome can precipitate liver failure. Our data support an emerging proposal that POLG gene testing should be considered in any child or adolescent who presents with or develops intractable seizures, with or without status epilepticus or epilepsia partialis continua, particularly when there is a history of psychomotor regression.
Abstract:
We investigated whether CLSI M27-A2 Candida species breakpoints for fluconazole MIC are valid when read at 24 h. Analysis of a data set showed good correlation between 48- and 24-h MICs, as well as similar outcomes and pharmacodynamic efficacy parameters, except for isolates in the susceptible dose-dependent category, such as Candida glabrata.
Abstract:
Despite the availability of hepatitis B vaccine for over two decades, drug users and other high-risk adult populations have experienced low vaccine coverage. Poor compliance has limited efforts to reduce transmission of hepatitis B infection in this population. Evidence suggests that the immunological response in drug users is impaired compared to the general population, in terms of both lower seroprotection rates and lower antibody levels.

The current study investigated the effectiveness of the multi-dose hepatitis B vaccine and compared the effect of the standard and accelerated vaccine schedules in a not-in-treatment, drug-using adult population in the city of Houston, USA.

A population of drug users from two communities in Houston, susceptible to hepatitis B, was sampled through outreach workers and referral methods. Subjects were randomized either to the standard hepatitis B vaccine schedule (0, 1, and 6 months) or to an accelerated schedule (0, 1, and 2 months). Antibody levels were measured through laboratory analyses at various time points. Participants were followed for two years, and seroconversion rates were calculated to determine immune response.

A four percent difference in the overall compliance rate was observed between the standard (73%) and accelerated (77%) schedules. Logistic regression analyses showed that drug users living on the streets were twice as likely not to complete all three vaccine doses (p=0.028), and current speedball use was also associated with non-completion (p=0.002). In the multivariate analysis, completion of all three vaccinations was also correlated with older age. Drug users on the accelerated schedule were 26% more likely to achieve completion, although this factor was only marginally significant (p=0.085).

Cumulative adequate protective response was attained by 65% of the HBV-susceptible subgroup by 12 months and was identical for the standard and accelerated schedules. Excess protective response (≥100 mIU/mL) occurred more frequently at the later time point for the standard schedule (36% at 12 months compared to 14% at six months), while the greater proportion of excess protective response for the accelerated schedule occurred earlier (34% at six months compared to 18% at 12 months). Seroconversion at the adequate protective response level of 10 mIU/mL was reached more quickly by the accelerated schedule group (62% vs. 49%), and with a higher mean titer (104.8 vs. 64.3 mIU/mL), when measured at six months. Multivariate analyses indicated a 63% increased risk of non-response with older age and confirmed an accelerating decline in immune response to vaccination after age 40 (p=0.001). Injecting more than daily was also strongly associated with the risk of non-response (p=0.016).

The substantial increase in the seroprotection rate at six months may be worth the trade-off against the faster decline in antibody titer and is recommended for enhancing compliance and seroconversion. Use of the accelerated schedule, with the primary objective of increasing compliance and seroconversion rates during the six months after the first dose, may confer early protective immunity and reduce the HBV vulnerability of drug users who continue, or have recently initiated, high-risk drug use and sexual behaviors.
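A minimal sketch of the kind of logistic regression reported above (completion of all three doses modeled on housing status, speedball use, age, and schedule). The data frame below is entirely hypothetical placeholder data, the coefficient magnitudes are invented, and statsmodels is an assumed choice of fitting library; only the variable directions are paraphrased from the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical placeholder data; variable names only mirror those described in the
# abstract (living on the streets, speedball use, age, accelerated schedule).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "homeless": rng.integers(0, 2, n),
    "speedball": rng.integers(0, 2, n),
    "age": rng.integers(18, 60, n),
    "accelerated": rng.integers(0, 2, n),
})

# Simulate completion with lower odds for homelessness and speedball use, and higher
# odds for older age and the accelerated schedule (directions from the abstract,
# magnitudes invented for illustration).
logit = 0.5 - 0.7 * df.homeless - 0.9 * df.speedball + 0.03 * (df.age - 35) + 0.25 * df.accelerated
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("completed ~ homeless + speedball + age + accelerated", data=df).fit()
print(np.exp(model.params))  # odds ratios for each predictor
```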
Abstract:
In the broader context of school reform, which seeks education strategies that might deliver substantial impact, this article examines four questions related to the policy and practice of expanding learning time: (a) why do educators find the standard American school calendar insufficient to meet students’ educational needs, especially those of disadvantaged students? (b) how do educators implement a longer day and/or year, addressing concerns about both educational quality and costs? (c) what does research report about outcomes of expanding time in schools? and (d) what are the future prospects for increasing the number of expanded-time schools? The paper examines these questions by considering research, policy, and practice at the national level and, throughout, by drawing upon additional evidence from Massachusetts, one of the leading states in the expanded-time movement. In considering the latter two questions, the article explores the knowns and unknowns related to expanded learning time and offers suggestions for further research.
Abstract:
The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU," lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one-value-per-variable paradigm and is widely employed in a host of clinical models and tools; it is often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable and constitute the measured observations that are typically available to end users when they review time series data; they are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes.
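To make the "time series analysis result as a latent candidate feature" idea concrete, here is a minimal sketch, not the dissertation's actual implementation: a fixed-duration, fixed-resolution window of a vital sign is reduced to a least-squares slope, and that slope, rather than the raw samples, is offered to the model as a candidate feature. The 30-minute window, one-minute resolution, and the choice of slope as the summary statistic are illustrative assumptions.

```python
import numpy as np

def trend_feature(values, minutes_per_sample=1.0):
    """Reduce a fixed-resolution time series window to a single latent feature:
    the least-squares slope (change per minute) over the window."""
    t = np.arange(len(values)) * minutes_per_sample
    slope, _intercept = np.polyfit(t, values, 1)
    return slope

# Hypothetical 30-minute window of heart-rate observations (one value per minute).
heart_rate_window = np.array([112, 110, 111, 108, 107, 105, 104, 104, 101,  99,
                               98,  97,  95,  94,  92,  91,  90,  88,  87,  85,
                               84,  83,  81,  80,  78,  77,  75,  74,  72,  70])

hr_trend = trend_feature(heart_rate_window)  # beats per minute, per minute
print(f"heart-rate trend over window: {hr_trend:.2f} bpm/min")
# The latent feature hr_trend (not the 30 raw values) joins the candidate feature set.
```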
The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU," provides a detailed description, start to finish, of the methods required to prepare the data and to build and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model. Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each step, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances.

The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit," presents the results obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% when the trend analysis was included. In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy compared to the baseline multivariate model, but diminished classification accuracy compared to adding just the trend analysis features (i.e., without the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve performance beyond that achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
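Referring back to the preprocessing issues listed for the second manuscript above (defining a reference time, then imputing and reducing irregularly sampled data to the fixed structure specified at design time), here is a minimal sketch of what such a step could look like. pandas is an assumed tool; the one-minute resolution, 30-minute duration, forward-fill imputation, and all timestamps and values are illustrative choices, not the methods the dissertation actually selected.

```python
import pandas as pd

# Hypothetical irregularly timed systolic blood pressure observations for one patient.
obs = pd.Series(
    [118, 115, 109, 104],
    index=pd.to_datetime(["2023-01-01 10:00", "2023-01-01 10:11",
                          "2023-01-01 10:19", "2023-01-01 10:28"]),
)

# Reference time: the moment the prediction is made (e.g., shortly before the event
# for cases, or an arbitrary anchor for controls). All windows are defined relative to it.
reference_time = pd.Timestamp("2023-01-01 10:30")

# Impute and reduce to the predefined structure: a 30-minute window at 1-minute
# resolution, forward-filling gaps so every slot of the fixed-length vector is populated.
grid = pd.date_range(end=reference_time, periods=30, freq="1min")
window = obs.reindex(obs.index.union(grid)).ffill().reindex(grid)

print(window.tail())
```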
Abstract:
The Phase I clinical trial is considered the "first in human" study in medical research, examining the toxicity of a new agent. It determines the maximum tolerable dose (MTD) of a new agent, i.e., the highest dose at which toxicity is still acceptable. Several Phase I clinical trial designs have been proposed in the past 30 years. The well-known standard method, the so-called 3+3 design, is widely accepted by clinicians because it is the easiest to implement and does not require statistical calculation. The continual reassessment method (CRM), a design that uses a Bayesian approach, has risen in popularity in the last two decades. Several variants of the CRM design have also been suggested in the statistical literature. Rolling six is a newer method, introduced in pediatric oncology in 2008, that claims to shorten trial duration compared with the 3+3 design. The goal of the present research was to simulate clinical trials and compare these Phase I clinical trial designs. The patient population was created by the discrete event simulation (DES) method. Patient characteristics were generated from several distributions with parameters derived from a review of historical Phase I clinical trial data. Patients were then selected and enrolled in clinical trials, each of which used the 3+3 design, the rolling six, or the CRM design. Five scenarios of dose-toxicity relationship were used to compare the performance of the Phase I clinical trial designs. One thousand trials were simulated per Phase I clinical trial design per dose-toxicity scenario. The results showed that the rolling six design was not superior to the 3+3 design in terms of trial duration: the time to trial completion was comparable between the rolling six and the 3+3 design. However, both shortened the duration compared with the two CRM designs. Both CRMs were superior to the 3+3 design and the rolling six in accuracy of MTD estimation. The 3+3 design and the rolling six tended to assign more patients to undesirably low dose levels. Toxicities were slightly greater in the CRMs.
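As a hedged illustration of how one of the compared designs can be simulated, here is a generic, simplified sketch of the standard 3+3 algorithm (no de-escalation or expansion at the lower dose), not the dissertation's discrete event simulation, which also modeled patient characteristics and enrollment timing. The dose-toxicity probabilities are an invented scenario.

```python
import random

def simulate_3plus3(true_tox_probs, seed=None):
    """Simulate one 3+3 dose-escalation trial.

    true_tox_probs: probability of dose-limiting toxicity (DLT) at each dose level.
    Returns the index of the declared MTD, or None if even the lowest dose is too toxic.
    """
    rng = random.Random(seed)
    level = 0
    while level < len(true_tox_probs):
        dlt = sum(rng.random() < true_tox_probs[level] for _ in range(3))
        if dlt == 0:
            level += 1                      # 0/3 DLTs: escalate
            continue
        if dlt == 1:                        # 1/3 DLTs: expand cohort to six
            dlt += sum(rng.random() < true_tox_probs[level] for _ in range(3))
            if dlt <= 1:
                level += 1                  # <=1/6 DLTs: escalate
                continue
        # >=2 DLTs at this level: MTD is the next lower dose.
        return level - 1 if level > 0 else None
    return len(true_tox_probs) - 1          # never exceeded: highest dose tested

# Invented dose-toxicity scenario; repeating many trials estimates how often each
# level is selected as the MTD, as in the design comparison described above.
scenario = [0.05, 0.10, 0.20, 0.35, 0.55]
picks = [simulate_3plus3(scenario, seed=i) for i in range(1000)]
print({lvl: picks.count(lvl) for lvl in set(picks)})
```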