914 results for predictive coding


Relevance: 20.00%

Abstract:

Symptoms of primary ciliary dyskinesia (PCD) are nonspecific and guidance on whom to refer for testing is limited. Diagnostic tests for PCD are highly specialised, requiring expensive equipment and experienced PCD scientists. This study aims to develop a practical clinical diagnostic tool to identify patients requiring testing. Patients consecutively referred for testing were studied. Information readily obtained from patient history was correlated with diagnostic outcome. Using logistic regression, the predictive performance of the best model was tested by receiver operating characteristic curve analyses. The model was simplified into a practical tool (PICADAR) and externally validated in a second diagnostic centre. Of 641 referrals with a definitive diagnostic outcome, 75 (12%) were positive. PICADAR applies to patients with persistent wet cough and has seven predictive parameters: full-term gestation, neonatal chest symptoms, neonatal intensive care admittance, chronic rhinitis, ear symptoms, situs inversus and congenital cardiac defect. Sensitivity and specificity of the tool were 0.90 and 0.75 for a cut-off score of 5 points. Area under the curve for the internally and externally validated tool was 0.91 and 0.87, respectively. PICADAR represents a simple diagnostic clinical prediction rule with good accuracy and validity, ready for testing in respiratory centres referring to PCD centres.
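The evaluation described above can be illustrated with a short sketch: score patients with a PICADAR-like seven-item point system and read off sensitivity and specificity at the cut-off of 5. The per-item weights and the simulated data below are placeholders, not the published PICADAR values.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 641
    # Seven binary history items (simulated; the real items are listed above).
    X = rng.integers(0, 2, size=(n, 7))
    weights = np.array([2, 2, 2, 1, 1, 4, 2])  # placeholder point values
    scores = X @ weights
    # Simulated diagnostic outcome, loosely tied to the score for illustration.
    y = (scores + rng.normal(0, 2, n) > 8).astype(int)

    cutoff = 5
    pred = scores >= cutoff
    sensitivity = pred[y == 1].mean()
    specificity = (~pred)[y == 0].mean()
    print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
          f"AUC={roc_auc_score(y, scores):.2f}")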

Relevance: 20.00%

Abstract:

With substance abuse treatment expanding in prisons and jails, understanding how behavior change interacts with a restricted setting becomes more essential. The Transtheoretical Model (TTM) has been used to understand intentional behavior change in unrestricted settings; however, evidence indicates restrictive settings can affect the measurement and structure of the TTM constructs. The present study examined data from problem drinkers at baseline and end-of-treatment from three studies: (1) Project CARE (n = 187) recruited inmates from a large county jail; (2) Project Check-In (n = 116) recruited inmates from a state prison; (3) Project MATCH, a large multi-site alcohol study, had two recruitment arms, aftercare (n = 724 pre-treatment and 650 post-treatment) and outpatient (n = 912 pre-treatment and 844 post-treatment). The analyses used cross-sectional data and structural equation modeling (SEM) to test for non-invariance of measures of the TTM constructs (readiness, confidence, temptation, and processes of change) across restricted and unrestricted settings. Two restricted groups (jail and aftercare) and one unrestricted group (outpatient) entering treatment, and one restricted (prison) and two unrestricted groups (aftercare and outpatient) at end-of-treatment, were contrasted. In addition, TTM end-of-treatment profiles were tested as predictors of 12-month drinking outcomes (profile analysis). Although SEM did not indicate structural differences in the overall TTM construct model across setting types, there were factor structure differences on the confidence and temptation constructs at pre-treatment and in the factor structure of the behavioral processes at end-of-treatment. For pre-treatment temptation and confidence, differences were found in the social situations factor loadings and in the variance of the confidence and temptation latent factors. For the end-of-treatment behavioral processes, differences across the restricted and unrestricted settings were identified in the counter-conditioning and stimulus control factor loadings. The TTM end-of-treatment profiles were not predictive of drinking outcomes in the prison sample. Both pre- and post-treatment differences in structure across setting types involved constructs operationalized with behaviors that are limited for those in restricted settings. These studies suggest the TTM is a viable model for explicating addictive behavior change in restricted settings but call for modification of subscale items that refer to specific behaviors and caution in interpreting mean differences across setting types for problem drinkers.
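As a rough illustration of the invariance testing described above, the sketch below fits the same measurement model separately in restricted and unrestricted groups with semopy, a Python SEM library, and prints the freely estimated loadings for comparison. The item names, data file, and model specification are hypothetical, and a full invariance test would additionally compare constrained and unconstrained multigroup fits.

    import pandas as pd
    import semopy

    # Hypothetical two-factor measurement model for two TTM constructs.
    desc = """
    confidence =~ conf_social + conf_negaffect + conf_craving
    temptation =~ temp_social + temp_negaffect + temp_craving
    """

    df = pd.read_csv("ttm_items.csv")  # hypothetical item-level dataset

    # Configural step: estimate the model freely within each setting type,
    # then inspect whether loadings (e.g., social situations) diverge.
    for setting, group in df.groupby("setting"):
        model = semopy.Model(desc)
        model.fit(group)
        params = model.inspect()  # parameter estimates as a DataFrame
        print(setting)
        print(params[params["op"] == "~"])  # measurement loadings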

Relevance: 20.00%

Abstract:

The human cytochrome P450 3A (CYP3A) subfamily is responsible for most of the metabolism of therapeutic drugs; however, an adequate in vivo model has yet to be discovered. This study begins with an investigation of a controversial topic surrounding the human CYP3As: estrogen regulation. A novel approach to this topic was used by defining expression in the estrogen-responsive endometrium. This study shows that estrogen down-regulates CYP3A4 expression in the endometrium. On the other hand, analogous studies showed an increase in CYP3A expression with increasing age in liver tissue. Following the discussion of estrogen regulation, an investigation of the cross-species relationships among all of the CYP3As was completed. The study compares fish, avian, rodent, canine, ovine, bovine, and primate isoforms. Using traditional phylogenetic analyses and employing a novel approach based on exon and intron lengths, the results show that only another primate could be the best animal model for analysis of the regulation of the expression of the human CYP3As. This analysis also demonstrated that the chimpanzee seems to be the best available human model. Moreover, the study showed the presence and similarities of one additional isoform in the chimpanzee genome that is absent in humans. Based on these results, initial characterization of the chimpanzee CYP3A subfamily was begun. While the human genome contains four isoforms (CYP3A4, CYP3A5, CYP3A7, and CYP3A43), the chimpanzee genome has five: the four previously mentioned and CYP3A67. Both species express CYP3A4, CYP3A5, and CYP3A43, but humans express CYP3A7 while chimpanzees express CYP3A67. In humans, CYP3A4 is expressed at higher levels than the other isoforms, but some chimpanzee individuals express CYP3A67 at higher levels than CYP3A4. Such a difference is expected to alter the total CYP3A metabolism significantly. On the other hand, any study considering individual isoforms would still constitute a valid method of study for the human CYP3A4, CYP3A5, and CYP3A43 isoforms.
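The exon- and intron-length approach mentioned above lends itself to a small illustration: describe each isoform by a vector of exon lengths and cluster the vectors hierarchically. The isoform names are real, but the lengths below are invented placeholders rather than measured values.

    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, linkage

    isoforms = ["human_CYP3A4", "chimp_CYP3A4", "human_CYP3A5",
                "chimp_CYP3A67", "rat_Cyp3a1"]
    # Rows: isoforms; columns: lengths (bp) of the first five exons (placeholders).
    exon_lengths = np.array([
        [281, 110, 150, 126, 88],
        [281, 110, 150, 126, 90],
        [300, 110, 149, 126, 88],
        [282, 112, 150, 125, 89],
        [240, 105, 160, 130, 95],
    ])

    Z = linkage(exon_lengths, method="average", metric="euclidean")
    print(Z)  # merge order and distances; the primate rows should pair first here
    dendrogram(Z, labels=isoforms, no_plot=True)  # set no_plot=False to draw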

Relevance: 20.00%

Abstract:

Many patients with anxiety and depression initially seek treatment from their primary care physicians. Changes in insurance coverage and current mental parity laws make reimbursement for services a problem. This has led to a coding dilemma for physicians seeking payment for their services. This study seeks to determine, first, how frequently primary care physicians use alternative coding and, second, whether physicians would change their coding practices if reimbursement were assured through changes in mental parity laws. A mail survey was sent to 260 randomly selected primary care physicians (family practice, internal medicine, and general practice) who were members of the Harris County Medical Society. The survey evaluated the physicians' demographics, the number of patients with psychiatric disorders seen by primary care physicians, the frequency with which physicians used alternative coding, and the rate at which physicians would use a psychiatric illness diagnosis as the primary diagnostic code if mental parity laws changed. The overall response rate was 23%. Only 47 of the 59 physicians who responded qualified for the study; of those, 45% used a psychiatric disorder to diagnose patients with a primary psychiatric disorder, 47% used a somatic/symptom disorder, and 8% used a medical diagnosis. Of the physicians who would not use a psychiatric diagnosis as a primary ICD-9 code, 88% were afraid of not being reimbursed and 12% were worried about stigma or jeopardizing insurability. If payment were assured using a psychiatric diagnostic code, 81% of physicians would use a psychiatric diagnosis as the primary diagnostic code. However, 19% would use an alternative diagnostic code for fear of stigmatizing patients and/or jeopardizing their insurability. Although the sample size of the study design was adequate, our survey did not have an ideal response rate, and no significant correlation was observed. However, it is evident that reimbursement for mental illness continues to be a problem for primary care physicians. Reform of mental parity laws is necessary to ensure that patients receive mental health services and that primary care physicians are reimbursed. Despite the possibility of improved mental parity legislation, some physicians are still hesitant to assign a mental illness diagnosis, due to the associated stigma, which still plays a role in today's society.

Relevance: 20.00%

Abstract:

More than a century ago, Ramon y Cajal pioneered the description of neural circuits. Currently, new techniques are being developed to streamline the characterization of entire neural circuits. Even if this 'connectome' approach is successful, it will represent only a static description of neural circuits. Thus, a fundamental question in neuroscience is to understand how information is dynamically represented by neural populations. In this thesis, I studied two main aspects of dynamical population codes.

First, I studied how exposure, or adaptation, for a fraction of a second to oriented gratings dynamically changes the population response of primary visual cortex neurons. The effects of adaptation to oriented gratings have been extensively explored in psychophysical and electrophysiological experiments. However, whether rapid adaptation might induce a change in the primary visual cortex's functional connectivity to dynamically impact population coding accuracy is currently unknown. To address this issue, we performed multi-electrode recordings in primary visual cortex, where adaptation has previously been shown to induce changes in the selectivity and response amplitude of individual neurons. We found that adaptation improves population coding accuracy. The improvement was more prominent for iso- and orthogonal-orientation adaptation, consistent with previously reported psychophysical experiments. We propose that selective decorrelation is a metabolically inexpensive mechanism that the visual system employs to dynamically adapt neural responses to the statistics of the input stimuli and improve coding efficiency.

Second, I investigated how ongoing activity modulates orientation coding in single neurons, neural populations, and behavior. Cortical networks are never silent, even in the absence of external stimulation. Ongoing activity can account for up to 80% of the metabolic energy consumed by the brain. Thus, a fundamental question is to understand the functional role of ongoing activity and its impact on neural computations. I studied how orientation coding by individual neurons and cell populations in primary visual cortex depends on the spontaneous activity before stimulus presentation. We hypothesized that since the ongoing activity of nearby neurons is strongly correlated, it would influence the ability of the entire population of orientation-selective cells to process orientation depending on the prestimulus spontaneous state. Our findings demonstrate that ongoing activity dynamically filters incoming stimuli to shape the accuracy of orientation coding by individual neurons and cell populations, and this interaction affects behavioral performance. In summary, this thesis is a contribution to the study of how dynamic internal states such as rapid adaptation and ongoing activity modulate population code accuracy.
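A toy simulation can make the decorrelation argument concrete: shared ("correlated") noise that fluctuates the whole population's gain limits a linear readout's orientation discrimination, and removing it improves accuracy. This is an illustrative model only, not the recordings or analysis used in the thesis.

    import numpy as np

    rng = np.random.default_rng(1)
    n_neurons, n_trials = 50, 4000
    prefs = np.linspace(0, np.pi, n_neurons, endpoint=False)

    def population_responses(theta, gain_sd):
        """Trials x neurons; shared gain noise induces correlated variability."""
        tuning = np.exp(np.cos(2 * (theta - prefs)))      # von Mises-like tuning
        gain = 1 + gain_sd * rng.normal(size=(n_trials, 1))
        return gain * tuning + 1.5 * rng.normal(size=(n_trials, n_neurons))

    def discrimination_accuracy(gain_sd, t1=0.0, t2=np.pi / 8):
        r1 = population_responses(t1, gain_sd)
        r2 = population_responses(t2, gain_sd)
        w = r1.mean(0) - r2.mean(0)                 # mean-difference linear readout
        crit = (r1.mean(0) + r2.mean(0)) @ w / 2
        return ((r1 @ w > crit).mean() + (r2 @ w < crit).mean()) / 2

    print("accuracy with correlated noise:", discrimination_accuracy(0.8))
    print("accuracy after decorrelation  :", discrimination_accuracy(0.0))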

Relevance: 20.00%

Abstract:

The use of feminine products such as vaginal douches, tampons, and sanitary napkins is common among women. Despite the results of some studies that suggest an association between douching and bacterial vaginosis, douching remains understudied. The possibility of an association between tampon use and infection has not been significantly investigated since the toxic shock syndrome outbreak in the 1980s. The first objective of our study was to evaluate demographic, reproductive health, and sexual behavior variables to establish an epidemiologic profile of menstruating women who reported douching and women who reported using sanitary napkins only. The second objective was to evaluate whether douching and tampon use were associated with an increased risk of bacterial vaginosis or trichomonas. Using logistic regression, we analyzed these factors among the 3,174 women in the 2001-2004 NHANES cross-sectional data who met our inclusion criteria. The women with the highest reported frequency of douching were those aged 36-49, with a high school education or GED, of black race, not taking oral contraceptives, reporting vaginal symptoms in the last month or two or more sexual partners in the last year, or testing positive for bacterial vaginosis or trichomonas. The profile for those with the highest frequency of exclusive sanitary napkin use included women with less than a high school education, married women, women classified as black or "other" in race, and women who were not on oral contraceptives. While we established a significant increase in the odds of douching among women who tested positive for bacterial vaginosis or trichomonas, we did not find any significant difference in the odds of exclusive napkin use and testing negative for bacterial vaginosis or trichomonas.
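A sketch of the kind of logistic regression described above, estimating odds ratios for douching by infection status, is shown below using statsmodels. The file and column names are hypothetical, and a faithful NHANES analysis would also account for the complex survey design.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("nhanes_subset.csv")  # hypothetical analysis-ready extract

    # Outcome: douching (1/0); exposures: BV/trichomonas positivity plus
    # demographic covariates, all hypothetical column names.
    model = smf.logit(
        "douching ~ bv_positive + trich_positive + C(age_group) + C(education)",
        data=df,
    ).fit()

    odds_ratios = pd.DataFrame({
        "OR": np.exp(model.params),
        "CI_low": np.exp(model.conf_int()[0]),
        "CI_high": np.exp(model.conf_int()[1]),
    })
    print(odds_ratios)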

Relevance: 20.00%

Abstract:

One of the fundamental questions in neuroscience is to understand how encoding of sensory inputs is distributed across neuronal networks in cerebral cortex to influence sensory processing and behavioral performance. The fact that the structure of neuronal networks is organized according to cortical layers raises the possibility that sensory information could be processed differently in distinct layers. The goal of my thesis research is to understand how laminar circuits encode information in their population activity, how the properties of the population code adapt to changes in visual input, and how population coding influences behavioral performance. To this end, we performed a series of novel experiments to investigate how sensory information in the primary visual cortex (V1) emerges across laminar cortical circuits. First, it is commonly known that the amount of information encoded by cortical circuits depends critically on whether or not nearby neurons exhibit correlations. We examined correlated variability in V1 circuits from a laminar-specific perspective and observed that cells in the input layer, which have only local projections, encode incoming stimuli optimally by exhibiting low correlated variability. In contrast, output layers, which send projections to other cortical and subcortical areas, encode information suboptimally by exhibiting large correlations. These results argue that neuronal populations in different cortical layers play different roles in network computations. Second, a fundamental feature of cortical neurons is their ability to adapt to changes in incoming stimuli. Understanding how adaptation emerges across cortical layers to influence information processing is vital for understanding efficient sensory coding. We examined the effects of adaptation, on the time scale of a visual fixation, on network synchronization across laminar circuits. Specific to the superficial layers, we observed an increase in gamma-band (30-80 Hz) synchronization after adaptation that was correlated with an improvement in neuronal orientation discrimination performance. Thus, synchronization enhances sensory coding to optimize network processing across laminar circuits. Finally, we tested the hypothesis that individual neurons and local populations synchronize their activity in real time to communicate information about incoming stimuli, and that the degree of synchronization influences behavioral performance. These analyses assessed for the first time the relationship between changes in laminar cortical networks involved in stimulus processing and behavioral performance.
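The laminar comparison above rests on measuring correlated variability, typically as the mean pairwise spike-count ("noise") correlation within a layer. The sketch below computes that quantity for two simulated populations, one dominated by private noise (input-layer-like) and one with strong shared noise (output-layer-like); the data are synthetic, not recordings.

    import numpy as np

    def mean_noise_correlation(counts):
        """counts: trials x neurons spike counts for repeats of one stimulus."""
        z = (counts - counts.mean(0)) / counts.std(0)   # z-score per neuron
        corr = (z.T @ z) / len(counts)                  # pairwise correlation matrix
        upper = corr[np.triu_indices_from(corr, k=1)]   # unique pairs only
        return upper.mean()

    rng = np.random.default_rng(2)
    # Input-layer-like population: mostly private noise (low correlations).
    input_layer = rng.poisson(5, size=(400, 30)) + rng.poisson(1, size=(400, 1))
    # Output-layer-like population: strong shared noise (high correlations).
    output_layer = rng.poisson(3, size=(400, 30)) + rng.poisson(3, size=(400, 1))

    print("input layer r_sc :", mean_noise_correlation(input_layer))
    print("output layer r_sc:", mean_noise_correlation(output_layer))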

Relevance: 20.00%

Abstract:

Multiple Endocrine Neoplasia type 1 (MEN1) is a hereditary cancer syndrome characterized by tumors of the endocrine system. Tumors most commonly develop in the parathyroid glands, pituitary gland, and the gastro-entero-pancreatic tract. MEN1 is a highly penetrant condition and age of onset is variable. Most patients are diagnosed in early adulthood; however, rare cases of MEN1 present in early childhood. Expert consensus opinion is that predictive genetic testing should be offered at age 5 years; however, there are no evidence-based studies that clearly establish that predictive genetic testing at this age is beneficial, since most symptoms do not present until later in life. This study was designed to explore attitudes about the most appropriate age for predictive genetic testing among individuals at risk of having a child with MEN1. Participants who had an MEN1 mutation were invited to complete a survey and were asked to invite their spouses to participate as well. The survey included several validated measures designed to assess participants' attitudes about predictive testing in minors. Fifty-eight affected participants and twenty-two spouses/partners completed the survey. Most participants felt that MEN1 genetic testing was appropriate in healthy minors. Participants' younger age and greater knowledge of MEN1 genetics and inheritance predicted support for genetic testing at a younger age. Additionally, participants who saw more positive than negative general outcomes from genetic testing were more likely to favor genetic testing at younger ages. Overall, participants felt genetic testing should be offered at a younger age than for most adult-onset conditions, and most felt the appropriate time for testing was when a child could understand and participate in the testing process. Psychological concerns seemed to be the primary focus of participants who favored later ages for genetic testing, while medical benefits were more commonly cited in favor of younger ages. This exploratory study has implications for counseling patients whose children are at risk of developing MEN1 and illustrates issues that are important to patients and their spouses when considering testing in children.

Relevance: 20.00%

Abstract:

Tumor Suppressor Candidate 2 (TUSC2) is a novel tumor suppressor gene located in the human chromosome 3p21.3 region. TUSC2 mRNA transcripts could be detected on Northern blots in both normal lung and some lung cancer cell lines, but no endogenous TUSC2 protein could be detected in a majority of lung cancer cell lines. The mechanisms regulating TUSC2 protein expression and its inactivation in primary lung cancer cells are largely unknown. We investigated the role of the 5'- and 3'-untranslated regions (UTRs) of the TUSC2 gene in the regulation of TUSC2 protein expression. We found that two small upstream open reading frames (uORFs) in the 5'UTR of TUSC2 could markedly inhibit the translational initiation of TUSC2 protein by interfering with the "scanning" of the ribosome initiation complexes. Site-specific stem-loop array reverse transcription-polymerase chain reaction (SLA-RT-PCR) verified several microRNAs (miRNAs) targeted at the 3'UTR that directed TUSC2 cleavage and degradation. In addition, we used the established let-7-targeted high mobility group A2 (Hmga2) mRNA as a model system to study the mechanism of regulation of target mRNA by miRNAs in mammalian cells under physiological conditions. There had been no direct evidence linking mRNA downregulation to mRNA cleavage mediated by miRNAs. Here we showed that endonucleolytic cleavages on mRNAs were initiated by mammalian miRNA in seed-pairing style. Let-7-directed cleavage activities among the eight predicted potential target sites varied in efficiency, influenced by the positional and structural contexts in the UTR. The 5' cleaved RNA fragments were mostly oligouridylated at their 3' termini and accumulated for delayed 5'-3' degradation. RNA fragment oligouridylation played important roles in marking RNA fragments for delayed bulk degradation and in converting the RNA degradation mode from 3'-5' to 5'-3', with cooperative efforts from both endonucleolytic and non-catalytic miRNA-induced silencing complexes (miRISCs). Our findings point to a mammalian miRNA-mediated mechanism of mRNA regulation in which miRNA decreases target mRNA through cleavage and uridine addition.
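The uORF mechanism described above is easy to illustrate: scan a 5'UTR for an ATG followed by an in-frame stop codon. The sequence below is a made-up example, not the actual TUSC2 5'UTR.

    STOPS = {"TAA", "TAG", "TGA"}

    def find_uorfs(utr5):
        """Return (start, end, peptide_aa) for each in-frame ATG...stop in a 5'UTR."""
        uorfs = []
        for start in range(len(utr5) - 2):
            if utr5[start:start + 3] != "ATG":
                continue
            for pos in range(start + 3, len(utr5) - 2, 3):
                if utr5[pos:pos + 3] in STOPS:
                    # peptide length counts the initiator Met, excludes the stop
                    uorfs.append((start, pos + 3, (pos - start) // 3))
                    break
        return uorfs

    utr = "GCGATGGCCTAAACGATGCATTGACC"  # hypothetical 5'UTR with two uORFs
    for start, end, aa in find_uorfs(utr):
        print(f"uORF at {start}-{end}: {aa} aa")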

Relevance: 20.00%

Abstract:

Interruption is a known human factor that contributes to errors and catastrophic events in healthcare as well as other high-risk industries. The landmark Institute of Medicine (IOM) report, To Err is Human, brought attention to the significance of preventable errors in medicine and suggested that interruptions could be a contributing factor. Previous studies of interruptions in healthcare did not offer a conceptual model by which to study interruptions. Given the serious consequences of interruptions investigated in other high-risk industries, there is a need to develop a model to describe, understand, explain, and predict interruptions and their consequences in healthcare. Therefore, the purpose of this study was to develop a model grounded in the literature and to use the model to describe and explain interruptions in healthcare. Specifically, this model would be used to describe and explain interruptions occurring in a Level One Trauma Center. A trauma center was chosen because this environment is characterized as intense, unpredictable, and interrupt-driven. The first step in developing the model began with a review of the literature, which revealed that the concept of interruption did not have a consistent definition in either the healthcare or non-healthcare literature. Walker and Avant's method of concept analysis was used to clarify and define the concept. The analysis led to the identification of five defining attributes: (1) a human experience, (2) an intrusion of a secondary, unplanned, and unexpected task, (3) discontinuity, (4) externally or internally initiated, and (5) situated within a context. However, before an interruption can commence, five conditions known as antecedents must occur: (1) an intent to interrupt is formed by the initiator, (2) a physical signal must pass a threshold test of detection by the recipient, (3) the sensory system of the recipient is stimulated to respond to the initiator, (4) an interruption task is presented to the recipient, and (5) the interruption task is either accepted or rejected by the recipient. An interruption was determined to be quantifiable by (1) the frequency of occurrence of an interruption, (2) the number of times the primary task has been suspended to perform an interrupting task, (3) the length of time the primary task has been suspended, and (4) the frequency of returning or not returning to the primary task. As a result of the concept analysis, a definition of an interruption was derived from the literature. An interruption is defined as a break in the performance of a human activity, initiated internal or external to the recipient and occurring within the context of a setting or location. This break results in the suspension of the initial task by initiating the performance of an unplanned task, with the assumption that the initial task will be resumed. The definition is inclusive of all the defining attributes of an interruption. This is a standard definition that can be used by the healthcare industry. From the definition, a visual model of an interruption was developed. The model was used to describe and explain the interruptions recorded in an instrumental case study of physicians and registered nurses (RNs) working in a Level One Trauma Center. Five physicians were observed for a total of 29 hours, 31 minutes. Eight registered nurses were observed for a total of 40 hours, 9 minutes.
Observations were made on either the 0700-1500 or the 1500-2300 shift using the shadowing technique, and were recorded in field-note format. The field notes were analyzed by a hybrid method of categorizing activities and interruptions, developed by using both a deductive a priori classification framework and the inductive process of line-by-line coding and constant comparison as described in grounded theory. The following categories were identified as relevant to this study:

Intended Recipient – the person to be interrupted
Unintended Recipient – not the intended recipient of an interruption; e.g., receiving a phone call that was incorrectly dialed
Indirect Recipient – the incidental recipient of an interruption; e.g., talking with another, thereby suspending the original activity
Recipient Blocked – the intended recipient does not accept the interruption
Recipient Delayed – the intended recipient postpones an interruption
Self-interruption – a person, independent of another person, suspends one activity to perform another; e.g., while walking, stops abruptly and talks to another person
Distraction – briefly disengaging from a task
Organizational Design – the physical layout of the workspace that causes a disruption in workflow
Artifacts Not Available – supplies and equipment that are not available in the workspace, causing a disruption in workflow
Initiator – a person who initiates an interruption

Interruption by Organizational Design and Artifacts Not Available were identified as two new categories of interruption; these categories had not previously been cited in the literature. Analysis of the observations indicated that physicians performed slightly fewer activities per hour than RNs. This variance may be attributed to differing roles and responsibilities. Physicians had more activities interrupted than RNs; however, RNs experienced more interruptions per hour. Other people were the most common medium through which an interruption was delivered. Additional mediums used to deliver an interruption included the telephone, pager, and one's self. Both physicians and RNs were observed to resume an original interrupted activity more often than not. In most interruptions, both physicians and RNs performed only one or two interrupting activities before returning to the original interrupted activity. In conclusion, the model was found to explain all interruptions observed during the study. However, the model will require an even more comprehensive study in order to establish its predictive value.
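Since the analysis quantifies interruptions by frequency, task suspensions, suspension time, and resumption, a small record type makes the coding scheme concrete. The field names below are my own labels for the dimensions defined in the text, not the study's instrument.

    from dataclasses import dataclass

    @dataclass
    class ObservedInterruption:
        initiator: str                 # person, telephone, pager, or self
        recipient_type: str            # intended / unintended / indirect
        accepted: bool                 # False if blocked or delayed
        suspension_seconds: float      # how long the primary task was suspended
        interrupting_tasks: int        # interrupting activities before returning
        resumed_primary_task: bool     # returned to the original activity?

    def summarize(events, hours_observed):
        """Interruptions per hour and resumption rate, as quantified above."""
        per_hour = len(events) / hours_observed
        resumed = sum(e.resumed_primary_task for e in events) / len(events)
        return per_hour, resumed

    events = [
        ObservedInterruption("person", "intended", True, 45.0, 1, True),
        ObservedInterruption("pager", "intended", True, 120.0, 2, False),
    ]
    print(summarize(events, hours_observed=8.0))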

Relevance: 20.00%

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one value per variable paradigm and is widely employed in a host of clinical models and tools. These are often represented by a number present in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The other two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable, and constitute the measured observations that are typically available to end users when they review time series data. These are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood that a representation of the time series data elements is produced that is able to distinguish between two or more classes of outcomes.
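As a minimal sketch of this idea, the snippet below reduces a fixed-duration, fixed-resolution window of raw time series observations to one latent candidate feature using an explicit mathematical operation (a least-squares slope). The window, resolution, and variable are illustrative choices, not the paper's specification.

    import numpy as np

    def trend_feature(values, minutes_per_sample):
        """Least-squares slope (units per minute) over a fixed window."""
        t = np.arange(len(values)) * minutes_per_sample
        slope, _intercept = np.polyfit(t, values, deg=1)
        return slope

    # Hypothetical 60-minute window of heart-rate samples at 5-minute resolution.
    heart_rate = np.array([118, 120, 119, 121, 117, 115, 112, 108, 104, 99, 93, 86])
    print("HR trend (bpm/min):", trend_feature(heart_rate, minutes_per_sample=5))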
The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU" provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time series based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model. Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
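The baseline-versus-augmented comparison can be sketched with simulated data: the same classifier is trained on point-in-time features alone and then with trend features added, and both are scored by ROC AUC. The data, model, and effect sizes below are placeholders, not the study's.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    n = 2000
    baseline = rng.normal(size=(n, 5))   # point-in-time vitals/labs (simulated)
    trend = rng.normal(size=(n, 3))      # slopes derived from time series windows
    y = (trend[:, 0] + 0.3 * baseline[:, 0] + rng.normal(size=n) > 1).astype(int)

    for name, X in [("baseline only", baseline),
                    ("baseline + trend", np.hstack([baseline, trend]))]:
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
        auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
        print(f"{name}: AUC={auc:.2f}")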

Relevance: 20.00%

Abstract:

Li-Fraumeni Syndrome (LFS) is a rare autosomal dominant hereditary cancer syndrome caused by mutations in the TP53 gene that predisposes individuals to a wide variety of cancers, including breast cancer, soft tissue sarcomas, osteosarcomas, brain tumors, and adrenocortical carcinomas. Individuals found to carry germline mutations in TP53 have a 90% lifetime cancer risk, with a 20% chance of developing cancer under the age of 20. Despite the significant risk of childhood cancer, predictive testing for unaffected minors at risk for LFS historically has not been recommended, largely due to the lack of available and effective screening for the types of cancers involved. A recently developed screening protocol suggests an advantage to identifying and screening children at risk for LFS, and we therefore hypothesized that this, alongside the availability of new screening modalities, may substantiate a shift in recommendations for predictive genetic testing in minors at risk for LFS. We aimed to describe the current screening recommendations that genetic counselors provide to this population, as well as to explore factors that may have influenced genetic counselors' attitudes and practice in regard to this issue. An online survey was emailed to members of the National Society of Genetic Counselors (NSGC) and the Canadian Association of Genetic Counsellors (CAGC). Of an estimated 1000 eligible participants, 172 completed surveys that were analyzed. Genetic counselors in this study were more likely to support predictive genetic testing for this population as the minor aged (p

Relevance: 20.00%

Abstract:

Existing data collected from 1st-year students enrolled in a major health science community college in the south central United States during the Fall 2010, Spring 2011, Fall 2011, and Spring 2012 semesters, as part of the "Online Navigational Assessment Vehicle, Intervention Guidance, and Targeting of Risks (NAVIGATOR) for Undergraduate Minority Student Success" study (CPHS approval number HSC-GEN-07-0158), were used for this thesis. The Personal Background and Preparation Survey (PBPS) and a two-question risk self-assessment subscale were administered to students during their 1st-year orientation. The PBPS total risk score, risk self-assessment total and overall scores, and Underrepresented Minority Student (URMS) status were recorded. The purpose of this study is to evaluate and report the predictive validity of the indicators identified above for Adverse Academic Status Events (AASE) and Nonadvancement Adverse Academic Status Events (NAASE), as well as the effectiveness of interventions targeted using the PBPS among a diverse population of health science community college students. The predictive validity of the PBPS for AASE has previously been demonstrated among health science professions and graduate students (Johnson, Johnson, Kim, & McKee, 2009a; Johnson, Johnson, McKee, & Kim, 2009b). Data were analyzed using binary logistic regression and correlation in the SPSS 19 statistical package. Independent variables included baseline- versus intervention-year treatments, PBPS, risk self-assessment, and URMS status. The dependent variables were binary AASE and NAASE status. The PBPS was the first reliable diagnostic and prescriptive instrument to establish documented predictive validity for student Adverse Academic Status Events among students attending health science professional schools. These results extend the documented validity of the PBPS in predicting AASE to a health science community college student population. Results further demonstrated that interventions introduced using the PBPS were followed by approximately a one-third reduction in the odds of Nonadvancement Adverse Academic Status Events, controlling for URMS status and risk self-assessment scores. These results indicate interventions introduced using the PBPS may have the potential to reduce AASE and attrition among URMS and non-URMS students attending health science community colleges on a broader scale, positively impacting the costs, shortages, and diversity of health science professionals.

Relevance: 20.00%

Abstract:

Objective: Describe and understand regional differences and associated multilevel factors (patient, provider, and regional) in the inappropriate utilization of advanced imaging tests in the privately insured population of Texas. Methods: We analyzed the Blue Cross Blue Shield of Texas claims dataset to study advanced imaging utilization during 2008-2010 in the PPO/PPO+ plans. We used three of the CMS "Hospital Outpatient Quality Reporting" imaging efficiency measures: ordering MRI for low back pain without prior conservative management (OP-8), and utilization of combined with- and without-contrast abdominal CT (OP-10) and thorax CT (OP-11). Means and variation by hospital referral region (HRR) in Texas were measured, and a multilevel logistic regression for being a provider with high values on any of the three OP measures was used in the analysis. We also analyzed OP-8 at the individual level, using a multilevel logistic regression to identify predictive factors for having an inappropriate MRI for low back pain. Results: Mean OP-8 for Texas providers was 37.89%, OP-10 was 29.94%, and OP-11 was 9.24%. Variation was higher for the CT measures, and certain HRRs were consistently above the mean. Hospital providers had higher odds of high OP-8 values (OP-8: OR, 1.34; CI, 1.12-1.60) but smaller odds of high OP-10 and OP-11 values (OP-10: OR, 0.15; CI, 0.12-0.18; OP-11: OR, 0.43; CI, 0.34-0.53). Providers with the highest volume of imaging studies performed were less likely to have high OP-8 measures (OP-8: OR, 0.58; CI, 0.48-0.70) but more likely to perform combined thoracic CT scans (OP-11: OR, 1.62; CI, 1.34-1.95). Males had higher odds of an inappropriate MRI (OR, 1.21; CI, 1.16-1.26). The pattern of care in the six months prior to the MRI event was significantly associated with having an inappropriate MRI. Conclusion: We identified significant variation in advanced imaging utilization across Texas. Type of facility was associated with measure performance, but the associations differed according to the type of study. Finally, certain individual characteristics such as gender, age, and pattern of care were found to be predictors of inappropriate MRIs.
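As an illustration of how a measure like OP-8 can be constructed from claims, the sketch below computes, per provider, the share of lumbar MRIs with no prior conservative management. The file and column names are hypothetical, and the real CMS specification has additional inclusion rules.

    import pandas as pd

    # Hypothetical extract: one row per lumbar MRI event, with a provider_id and
    # a boolean flag for any physical therapy, chiropractic, or anti-inflammatory
    # claim in the look-back window before the MRI.
    claims = pd.read_csv("lumbar_mri_claims.csv")

    claims["inappropriate"] = ~claims["prior_conservative_mgmt"]
    op8 = (claims.groupby("provider_id")["inappropriate"]
                 .mean()
                 .rename("OP-8"))
    print(op8.describe())       # distribution of the measure across providers
    print((op8 > 0.5).mean())   # share of providers above an example cutoff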