17 results for Screening tools

at Duke University

Relevance:

70.00%

Abstract:

Background: Depression-screening tools exist and are widely used in Western settings, but few studies have explored whether existing tools are valid and effective for use in sub-Saharan Africa. Our study aimed to develop and validate a perinatal depression-screening tool in rural Kenya.

Methods: We conducted free-listing and card-sorting exercises with a purposive sample of 12 women and 38 CHVs living in a rural community to explore the manifestations of perinatal depression in that setting. We used the information obtained to produce a locally relevant depression-screening tool that combined existing Western psychiatric concepts with locally derived items. We then administered the novel depression-screening tool and two existing screening tools (the Edinburgh Postnatal Depression Scale and the Patient Health Questionnaire-9) to 193 women and compared the results of each screening tool with those of a gold-standard structured clinical interview to determine validity.

Results: The free-listing and card-sorting exercises produced a set of 60 screening items, from which we identified the 10 items that most accurately classified cases and non-cases. The resulting 10-item scale had a sensitivity of 100.0% and a specificity of 81.2%, compared with 90.0% and 31.5% for the EPDS and 90.0% and 49.7% for the PHQ-9. Overall, we found a depression prevalence of 5.2%.

Conclusions: The new scale performs very well in terms of diagnostic validity, achieving the highest scores in this domain relative to the EPDS, EPDS-R, and PHQ-9. The adapted scale also performs very well with regard to convergent validity, showing a clear distinction between mean scores across the different categories. It performs well with regard to discriminant validity, internal-consistency reliability, and test-retest reliability, not securing top scores in those domains but still yielding satisfactory results.
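
Since the comparison between each screening tool and the gold-standard clinical interview reduces to counting true and false positives and negatives, a minimal sketch of that calculation is shown below. This is not the authors' code; the function name, cutoff, and data are illustrative assumptions.

```python
# Hedged sketch: sensitivity and specificity of a screening tool
# against a gold-standard diagnosis. Data and cutoff are invented.

def sensitivity_specificity(scores, gold_positive, cutoff):
    """scores: screening totals; gold_positive: booleans from the
    structured clinical interview; score >= cutoff is a positive screen."""
    tp = fp = tn = fn = 0
    for score, is_case in zip(scores, gold_positive):
        screened_positive = score >= cutoff
        if screened_positive and is_case:
            tp += 1
        elif screened_positive and not is_case:
            fp += 1
        elif not screened_positive and is_case:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Example with made-up data: 10 women, 2 true cases.
scores = [12, 3, 1, 9, 2, 0, 4, 11, 2, 1]
gold = [True, False, False, False, False, False, False, True, False, False]
print(sensitivity_specificity(scores, gold, cutoff=9))
```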

Relevance:

30.00%

Abstract:

While substance use problems are considered common in medical settings, they are not systematically assessed and diagnosed for treatment management. Research data suggest that the majority of individuals with a substance use disorder either do not use treatment or delay treatment-seeking for over a decade. The separation of substance abuse services from mainstream medical care and a lack of preventive services for substance abuse in primary care can contribute to under-detection of substance use problems. When fully enacted in 2014, the Patient Protection and Affordable Care Act of 2010 will address these barriers by supporting preventive services for substance abuse (screening, counseling) and the integration of substance abuse care with primary care. One key factor that can help achieve this goal is to incorporate standardized screeners or common data elements for substance use and related disorders into electronic health record (EHR) systems in health care settings. Incentives for care providers to adopt an EHR system for meaningful use are part of the Health Information Technology for Economic and Clinical Health Act of 2009. This commentary focuses on recent evidence about routine screening and intervention for alcohol/drug use and related disorders in primary care. Federal efforts to develop common data elements for use as screeners for substance use and related disorders are described. A pressing need for empirical data on screening, brief intervention, and referral to treatment (SBIRT) for drug-related disorders to inform SBIRT and related EHR efforts is highlighted.

Relevance:

20.00%

Abstract:

The dual-specificity protein tyrosine phosphatases (PTPs) play integral roles in the regulation of cell signaling. There is a need for new tools to study these phosphatases, and the identification of inhibitors potentially affords not only new means for their study, but also possible therapeutics for the treatment of diseases caused by their dysregulation. However, the identification of selective inhibitors of the protein phosphatases has proven somewhat difficult. PTP localized to mitochondrion 1 (PTPMT1) is a recently discovered dual-specificity phosphatase that has been implicated in the regulation of insulin secretion. Screening of a commercially available small-molecule library yielded alexidine dihydrochloride, a dibiguanide compound, as an effective and selective inhibitor of PTPMT1, with an in vitro half-maximal inhibitory concentration (IC50) of 1.08 microM. A related dibiguanide analog, chlorhexidine dihydrochloride, also significantly inhibited PTPMT1, albeit with lower potency, while a monobiguanide analog showed only very weak inhibition. Treatment of isolated rat pancreatic islets with alexidine dihydrochloride resulted in a dose-dependent increase in insulin secretion, whereas treatment of a pancreatic beta-cell line with the drug affected the phosphorylation of mitochondrial proteins in a manner similar to genetic inhibition of PTPMT1. Furthermore, knockdown of PTPMT1 in rat islets rendered them insensitive to alexidine dihydrochloride treatment, providing evidence for mechanism-based activity of the inhibitor. Taken together, these studies establish alexidine dihydrochloride as an effective inhibitor of PTPMT1, both in vitro and in cells, and support the notion that PTPMT1 could serve as a pharmacological target in the treatment of type II diabetes.
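
For reference, the IC50 reported above is typically estimated by fitting a four-parameter logistic (Hill) curve to enzyme-activity measurements across inhibitor concentrations. The sketch below shows such a fit with invented data, not the study's measurements.

```python
# Hedged sketch: fitting a four-parameter logistic (Hill) curve to
# dose-response data to estimate an IC50. Data values are invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Fractional activity as a function of inhibitor concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # microM
activity = np.array([0.98, 0.95, 0.88, 0.70, 0.52, 0.25, 0.08])

params, _ = curve_fit(four_pl, conc, activity,
                      p0=[0.0, 1.0, 1.0, 1.0], maxfev=10000)
bottom, top, ic50, hill = params
print(f"Estimated IC50 ~ {ic50:.2f} microM (Hill slope {hill:.2f})")
```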

Relevance:

20.00%

Abstract:

Timing-related defects are major contributors to test escapes and in-field reliability problems for very deep submicrometer integrated circuits. Small delay variations induced by crosstalk, process variations, power-supply noise, and resistive opens and shorts can potentially cause timing failures in a design, thereby leading to quality and reliability concerns. We present a test-grading technique that uses the method of output deviations for screening small-delay defects (SDDs). A new gate-delay defect probability measure is defined to model delay variations for nanometer technologies. The proposed technique intelligently selects the best set of patterns for SDD detection from an n-detect pattern set generated using timing-unaware automatic test-pattern generation (ATPG). It offers significantly lower computational complexity and excites a larger number of long paths than a current-generation commercial timing-aware ATPG tool. Our results also show that, for the same pattern count, the selected patterns provide more effective coverage ramp-up than timing-aware ATPG and a recent pattern-selection method for random SDDs potentially caused by resistive shorts, resistive opens, and process variations. © 2010 IEEE.
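
The selection step can be viewed as a greedy set-cover over candidate fault sites ranked by output deviation. The sketch below illustrates only that idea; the pattern IDs, site sets, and budget are hypothetical, and the paper's actual deviation metric is computed from gate-delay defect probabilities rather than supplied as fixed sets.

```python
# Hedged sketch of deviation-based pattern selection from an n-detect
# set. `patterns` maps a pattern id to the set of candidate SDD sites
# it excites with a high output-deviation score.

def select_patterns(patterns, budget):
    """Greedy selection: repeatedly keep the pattern covering the most
    not-yet-covered high-deviation sites until `budget` is reached."""
    covered = set()
    selected = []
    remaining = dict(patterns)
    while remaining and len(selected) < budget:
        best_id, best_sites = max(
            remaining.items(), key=lambda kv: len(kv[1] - covered))
        if not best_sites - covered:
            break  # no remaining pattern adds new coverage
        selected.append(best_id)
        covered |= best_sites
        del remaining[best_id]
    return selected, covered

patterns = {
    "p1": {"s1", "s2", "s3"},
    "p2": {"s3", "s4"},
    "p3": {"s5"},
    "p4": {"s1", "s4", "s5", "s6"},
}
print(select_patterns(patterns, budget=2))
```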

Relevance:

20.00%

Abstract:

In recent years, the storage and use of residual newborn screening (NBS) samples have gained attention. To inform ongoing policy discussions, this article provides an update of previous work on new policies, educational materials, and parental options regarding the storage and use of residual NBS samples. A review of state NBS Web sites was conducted in January 2010 for information related to the storage and use of residual NBS samples. In addition, a review was conducted of current statutes and of bills introduced between 2005 and 2009 regarding the storage and/or use of residual NBS samples. Fourteen states currently provide information about the storage and/or use of residual NBS samples. Nine states give parents the option to request destruction of the residual NBS sample after the required storage period or the option to exclude the sample from research use. In the coming years, it is anticipated that more states will consider policies to address parental concerns about the storage and use of residual NBS samples. Development of new policies regarding the storage and use of residual NBS samples will require careful consideration of the impact on NBS programs, parent and provider educational materials, and respect for parents, among other issues.

Relevance:

20.00%

Abstract:

BACKGROUND: There is considerable interest in the development of methods to efficiently identify all coding variants present in large sample sets of humans. Three approaches are possible: whole-genome sequencing, whole-exome sequencing using exon capture methods, and RNA-Seq. While whole-genome sequencing is the most complete, it remains sufficiently expensive that cost-effective alternatives are important. RESULTS: Here we provide a systematic exploration of how well RNA-Seq can identify human coding variants by comparing variants identified through high-coverage whole-genome sequencing to those identified by high-coverage RNA-Seq in the same individual. This comparison allowed us to directly evaluate the sensitivity and specificity of RNA-Seq in identifying coding variants, and to evaluate how key parameters such as the degree of coverage and the expression levels of genes interact to influence performance. We find that although only 40% of exonic variants identified by whole-genome sequencing were captured using RNA-Seq, this number rose to 81% when we concentrated on genes known to be well expressed in the source tissue. We also find that a high false-positive rate can be problematic when working with RNA-Seq data, especially at higher levels of coverage. CONCLUSIONS: We conclude that as long as a tissue relevant to the trait under study is available and suitable quality-control screens are implemented, RNA-Seq is a fast and inexpensive alternative approach for finding coding variants in genes with sufficiently high expression levels.
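
The sensitivity and false-positive comparisons reduce to set operations on the variant calls from the two platforms. A minimal sketch under that assumption is shown below, with invented (chromosome, position, alternate allele) keys rather than real VCF records.

```python
# Hedged sketch: compare RNA-Seq variant calls against whole-genome
# sequencing calls from the same individual. Variants are represented
# as (chrom, pos, alt) tuples; the data here are invented.

wgs_exonic = {("chr1", 101, "A"), ("chr1", 250, "T"),
              ("chr2", 77, "G"), ("chr3", 12, "C"), ("chr3", 90, "A")}
rnaseq = {("chr1", 101, "A"), ("chr2", 77, "G"),
          ("chr3", 90, "A"), ("chr4", 5, "T")}   # chr4 call is RNA-Seq only

shared = wgs_exonic & rnaseq
sensitivity = len(shared) / len(wgs_exonic)       # fraction of WGS variants recovered
fdr = len(rnaseq - wgs_exonic) / len(rnaseq)      # RNA-Seq calls absent from WGS

print(f"sensitivity = {sensitivity:.2f}, apparent FDR = {fdr:.2f}")
```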

Relevance:

20.00%

Abstract:

BACKGROUND: HIV-1 clade C (HIV-C) predominates worldwide, and anti-HIV-C vaccines are urgently needed. Neutralizing antibody (nAb) responses are considered important but have proved difficult to elicit. Although some current immunogens elicit antibodies that neutralize highly neutralization-sensitive (tier 1) HIV strains, most circulating HIVs exhibiting a less sensitive (tier 2) phenotype are not neutralized. Thus, both tier 1 and 2 viruses are needed for vaccine discovery in nonhuman primate models. METHODOLOGY/PRINCIPAL FINDINGS: We constructed a tier 1 simian-human immunodeficiency virus, SHIV-1157ipEL, by inserting an "early," recently transmitted HIV-C env into the SHIV-1157ipd3N4 backbone [1] encoding a "late" form of the same env, which had evolved in a SHIV-infected rhesus monkey (RM) with AIDS. SHIV-1157ipEL was rapidly passaged to yield SHIV-1157ipEL-p, which remained exclusively R5-tropic and had a tier 1 phenotype, in contrast to "late" SHIV-1157ipd3N4 (tier 2). After 5 weekly low-dose intrarectal exposures, SHIV-1157ipEL-p systemically infected 16 out of 17 RM with high peak viral RNA loads and depleted gut CD4+ T cells. SHIV-1157ipEL-p and SHIV-1157ipd3N4 env genes diverge mostly in V1/V2. Molecular modeling revealed a possible mechanism for the increased neutralization resistance of SHIV-1157ipd3N4 Env: V2 loops hindering access to the CD4 binding site, shown experimentally with nAb b12. Similar mutations have been linked to decreased neutralization sensitivity in HIV-C strains isolated from humans over time, indicating parallel HIV-C Env evolution in humans and RM. CONCLUSIONS/SIGNIFICANCE: SHIV-1157ipEL-p, the first tier 1 R5 clade C SHIV, and SHIV-1157ipd3N4, its tier 2 counterpart, represent biologically relevant tools for anti-HIV-C vaccine development in primates.

Relevance:

20.00%

Abstract:

Quantitative optical spectroscopy has the potential to provide an effective, low-cost, and portable solution for cervical pre-cancer screening in resource-limited communities. However, clinical studies to validate the use of this technology in resource-limited settings require low power consumption and good quality control that is minimally influenced by the operator or by variable environmental conditions in the field. The goal of this study was to evaluate the effects of two sources of potential error, calibration and pressure, on the extraction of absorption and scattering properties of normal cervical tissues in a resource-limited setting in Leogane, Haiti. Our results show that self-calibrated measurements improved scattering measurements through real-time correction of system drift, in addition to minimizing the time required for post-calibration. Variations in pressure (tested without the potential confounding effects of calibration error) caused local changes in vasculature and scatterer density that significantly affected the tissue absorption and scattering properties. Future spectroscopic systems intended for clinical use, particularly where operator training is not viable and environmental conditions are unpredictable, should incorporate a real-time self-calibration channel and collect diffuse reflectance spectra at a consistent pressure to maximize data integrity.
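
The self-calibration approach amounts to ratioing each tissue spectrum against a concurrently acquired calibration-channel spectrum so that shared system drift cancels. The sketch below illustrates that correction with invented counts and a hypothetical reflectance standard; it is not the instrument's actual processing chain.

```python
# Hedged sketch: real-time self-calibration of a diffuse reflectance
# spectrum. Dividing by a concurrently measured calibration channel
# (viewing a known reflectance standard) removes shared system drift.
import numpy as np

def self_calibrate(tissue_counts, calib_counts, standard_reflectance):
    """tissue_counts, calib_counts: raw spectra acquired at the same
    time; standard_reflectance: known reflectance of the calibration
    standard at each wavelength."""
    tissue_counts = np.asarray(tissue_counts, dtype=float)
    calib_counts = np.asarray(calib_counts, dtype=float)
    return tissue_counts / calib_counts * np.asarray(standard_reflectance)

# Invented three-wavelength example: a 10% lamp drift affects both
# channels equally and cancels in the ratio.
drift = 1.10
tissue = drift * np.array([1200.0, 950.0, 400.0])
calib = drift * np.array([5000.0, 5200.0, 4800.0])
print(self_calibrate(tissue, calib, standard_reflectance=[0.99, 0.99, 0.99]))
```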

Relevance:

20.00%

Abstract:

Most powerful analytical tools used in the social sciences are well suited for studying static situations. Static and mechanistic analysis, however, is not adequate for understanding the changing world in which we live. In order to adequately address the most pressing social and environmental challenges looming ahead, we need to develop analytical tools for analyzing dynamic situations, particularly institutional change. In this paper, we develop an analytical tool to study institutional change, more specifically the evolution of rules and norms. We believe that for such an analytical tool to be useful for developing a general theory of institutional change, it needs to enable the analyst to concisely record the processes of change in multiple specific settings, so that lessons from such settings can eventually be integrated into a more general predictive theory of change. Copyright © The JOIE Foundation 2010.

Relevance:

20.00%

Abstract:

Duke Medicine utilized interprofessional case conferences (ICCs) from 2008 to 2012 with the objective of modeling and facilitating the development of teamwork skills among diverse health profession students, including physical therapy, physician assistant, medical, and nursing students. The purpose of this publication was to describe the operational process used to develop and implement the ICCs and to measure their success in order to shape future work. The ICCs were offered to develop skills and attitudes essential for participation in healthcare teams. Students were facilitated by faculty of different professions to conduct a comprehensive historical assessment of a standardized patient (SP), determine pertinent physical and lab assessments to undertake, and develop and share a comprehensive management plan. Cases included patient problems that were authentic and relevant to each professional student in attendance. The main barriers to implementation are outlined, and the focus on the process of working together is highlighted. Evaluation showed high satisfaction rates among participants, and the outcomes from these experiences are presented. The limitations of these results are discussed and recommendations for future assessment are emphasized. The ICCs demonstrated that students will come together voluntarily to learn in teams, even at a research-focused institution, and that they report benefit from the collaborative exercise.

Relevance:

20.00%

Abstract:

OBJECTIVES: To compare the predictive performance and potential clinical usefulness of risk calculators of the European Randomized Study of Screening for Prostate Cancer (ERSPC RC) with and without information on prostate volume. METHODS: We studied 6 cohorts (5 European and 1 US) with a total of 15,300 men, all biopsied and with pre-biopsy TRUS measurements of prostate volume. Volume was categorized into 3 categories (25, 40, and 60 cc) to reflect use of digital rectal examination (DRE) for volume assessment. Risks of prostate cancer were calculated according to an ERSPC DRE-based RC (including PSA, DRE, prior biopsy, and prostate volume) and a PSA + DRE model (including PSA, DRE, and prior biopsy). Missing data on prostate volume were completed by single imputation. Risk predictions were evaluated with respect to calibration (graphically), discrimination (area under the ROC curve, AUC), and clinical usefulness (net benefit, graphically assessed in decision curves). RESULTS: The AUCs of the ERSPC DRE-based RC ranged from 0.61 to 0.77 and were substantially larger than the AUCs of a model based on only PSA + DRE (ranging from 0.56 to 0.72) in each of the 6 cohorts. The ERSPC DRE-based RC provided net benefit over performing a prostate biopsy on the basis of PSA and DRE outcome in five of the six cohorts. CONCLUSIONS: Identifying men at increased risk of having biopsy-detectable prostate cancer should consider multiple factors, including an estimate of prostate volume.
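
Net benefit in a decision curve weighs true positives against false positives at a chosen threshold probability p_t, as NB = TP/n - (FP/n) * p_t / (1 - p_t). The sketch below evaluates that formula with invented counts, not the study's data.

```python
# Hedged sketch: net benefit of biopsying men whose calculated risk
# exceeds a threshold probability, as used in decision-curve analysis.
# Counts are invented for illustration.

def net_benefit(tp, fp, n, threshold):
    """Net benefit = TP/n - FP/n * (pt / (1 - pt))."""
    return tp / n - (fp / n) * (threshold / (1.0 - threshold))

n = 1000           # men considered for biopsy
tp, fp = 180, 320  # biopsy-detectable cancers found vs. unnecessary biopsies
for pt in (0.10, 0.20, 0.30):
    print(f"threshold {pt:.0%}: net benefit = {net_benefit(tp, fp, n, pt):.3f}")
```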

Relevance:

20.00%

Abstract:

The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large-population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms that measure responses to general ASD risk-assessment tasks and activities outlined by the AOSI that assess visual attention, by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, demonstrating that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments.
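
As a rough illustration of this kind of low-cost facial tracking, the sketch below detects a face in each frame and flags large horizontal shifts of its center as head turns. It is a hypothetical stand-in, not the authors' pipeline; the video path, detector choice, and threshold are assumptions.

```python
# Hedged sketch: tracking the horizontal position of a detected face
# across video frames as a crude proxy for head turns during an
# AOSI-style visual attention task. Requires opencv-python.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_centers(video_path):
    """Yield the horizontal face-center coordinate for each frame."""
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.1, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            yield x + w / 2.0
    cap.release()

def count_head_turns(centers, pixel_threshold=40):
    """Count frame-to-frame center shifts larger than the threshold."""
    centers = list(centers)
    return sum(abs(b - a) > pixel_threshold
               for a, b in zip(centers, centers[1:]))

# Example usage (hypothetical file):
# print(count_head_turns(face_centers("session01.mp4")))
```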

Relevance:

20.00%

Abstract:

The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral markers can be observed late in the first year of life. Many of these studies involved extensive frame-by-frame video observation and analysis of a child's natural behavior. Although non-intrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are impractical for clinical and large-population research purposes. Diagnostic measures for ASD are available for infants but are only accurate when used by specialists experienced in early diagnosis. This work is a first milestone in a long-term multidisciplinary project that aims at helping clinicians and general practitioners accomplish this early detection/measurement task automatically. We focus on providing computer vision tools to measure and identify ASD behavioral markers based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure three critical AOSI activities that assess visual attention. We augment these AOSI activities with an additional test that analyzes asymmetrical patterns in unsupported gait. The first set of algorithms assesses head motion by tracking facial features, while the gait analysis relies on joint foreground segmentation and 2D body pose estimation in video. We show results that provide insightful knowledge to augment the clinician's behavioral observations obtained from real in-clinic assessments.
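
For the gait component, one common way to quantify left/right asymmetry is a symmetry index over step durations extracted (elsewhere) from the pose estimates. The sketch below uses that standard index with invented step times; it is not the authors' method.

```python
# Hedged sketch: a simple left/right symmetry index over step durations.
# The step times below are invented; a value near 0 indicates a
# symmetric gait, larger magnitudes indicate asymmetry.

def symmetry_index(left_steps, right_steps):
    """Symmetry index = 2*(L - R)/(L + R), using mean step durations."""
    mean_l = sum(left_steps) / len(left_steps)
    mean_r = sum(right_steps) / len(right_steps)
    return 2.0 * (mean_l - mean_r) / (mean_l + mean_r)

left_step_durations = [0.52, 0.55, 0.50, 0.53]    # seconds
right_step_durations = [0.61, 0.64, 0.60, 0.62]

print(f"symmetry index = "
      f"{symmetry_index(left_step_durations, right_step_durations):+.2f}")
```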

Relevance:

20.00%

Abstract:

To maintain a strict balance between demand and supply in the US power systems, Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. This model determines, for each time period and power plant, the startup and shutdown times, the amount of power produced, and the provisioning of spinning and non-spinning power generation reserves. Such a deterministic optimization model takes as input the characteristics of all generating units, such as installed capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the level of forecast errors in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using fixed reserve targets as an input, stochastic market clearing models take different scenarios of wind power into consideration and determine the reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic models in market clearing. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and the difficulty of setting model parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison of the performance of stochastic versus deterministic market clearing models.
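
As a toy illustration of the deterministic side of the comparison, the sketch below performs a merit-order dispatch against forecast demand net of wind while checking a fixed reserve requirement. All unit data are invented; a real ISO market clearing model is a mixed-integer program with ramping, minimum up/down times, and network constraints.

```python
# Hedged sketch of the deterministic case: merit-order dispatch that
# meets demand net of forecast wind while keeping enough spare online
# capacity to satisfy a fixed reserve requirement. All data invented.

def dispatch(units, demand, wind_forecast, reserve_req):
    """units: list of (name, capacity_MW, marginal_cost_$_per_MWh);
    dispatches the residual demand cheapest-first."""
    residual = max(demand - wind_forecast, 0.0)
    schedule, online_capacity, cost = {}, 0.0, 0.0
    for name, cap, mc in sorted(units, key=lambda u: u[2]):
        if residual <= 0 and online_capacity - sum(schedule.values()) >= reserve_req:
            break  # demand met and reserve requirement satisfied
        output = min(cap, residual)
        schedule[name] = output
        online_capacity += cap
        cost += output * mc
        residual -= output
    reserve = online_capacity - sum(schedule.values())
    return schedule, cost, reserve >= reserve_req

units = [("coal", 400, 25.0), ("ccgt", 300, 40.0), ("peaker", 150, 90.0)]
print(dispatch(units, demand=620.0, wind_forecast=80.0, reserve_req=100.0))
```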

Relevance:

20.00%

Abstract:

The hepatitis delta virus (HDV) ribozyme is a self-cleaving RNA enzyme essential for processing viral transcripts during rolling circle viral replication. The first crystal structure of the cleaved ribozyme was solved in 1998, followed by structures of uncleaved, mutant-inhibited and ion-complexed forms. Recently, methods have been developed that make the task of modeling RNA structure and dynamics significantly easier and more reliable. We have used ERRASER and PHENIX to rebuild and re-refine the cleaved and cis-acting C75U-inhibited structures of the HDV ribozyme. The results correct local conformations and identify alternates for RNA residues, many in functionally important regions, leading to improved R values and model validation statistics for both structures. We compare the rebuilt structures to a higher resolution, trans-acting deoxy-inhibited structure of the ribozyme, and conclude that although both inhibited structures are consistent with the currently accepted hammerhead-like mechanism of cleavage, they do not add direct structural evidence to the biochemical and modeling data. However, the rebuilt structures (PDBs: 4PR6, 4PRF) provide a more robust starting point for research on the dynamics and catalytic mechanism of the HDV ribozyme and demonstrate the power of new techniques to make significant improvements in RNA structures that impact biologically relevant conclusions.