792 results for Expected-utility

Relevance: 20.00%

Abstract:

Research has identified a number of putative risk factors that place adolescents at incrementally higher risk for involvement in alcohol and other drug (AOD) use and sexual risk behaviors (SRBs). Such factors include personality characteristics such as sensation-seeking, cognitive factors such as positive expectancies and inhibition conflict, as well as peer norm processes. The current study was guided by a conceptual perspective supporting the notion that an integrative framework including multi-level factors has significant explanatory value for understanding processes associated with the co-occurrence of AOD use and sexual risk behavior outcomes. This study simultaneously evaluated the mediating role of AOD-sex related expectancies and inhibition conflict on antecedents of AOD use and SRBs, including sexual sensation-seeking and peer norms for condom use. The sample was drawn from the Enhancing My Personal Options While Evaluating Risk (EMPOWER; Jonathan Tubman, PI) data set (N = 396; aged 12-18 years). Measures used in the study included the Sexual Sensation-Seeking Scale, Inhibition Conflict for Condom Use, and the Risky Sex Scale. All relevant measures had well-documented psychometric properties. A global assessment of alcohol, drug use, and sexual risk behaviors was used. Results demonstrated that AOD-sex related expectancies mediated the influence of sexual sensation-seeking on the co-occurrence of alcohol and other drug use and sexual risk behaviors. The evaluation of the integrative model also revealed that sexual sensation-seeking was positively associated with peer norms for condom use. Also, peer norms predicted inhibition conflict among this sample of multi-problem youth. This dissertation research identified mechanisms of risk and protection associated with the co-occurrence of AOD use and SRBs among a multi-problem sample of adolescents receiving treatment for alcohol or drug use and related problems.
This study is informative for adolescent-serving programs that address the individual and contextual characteristics that enhance treatment efficacy and effectiveness among adolescents receiving services for substance use and related problems.

Relevance: 20.00%

Abstract:

This study reports one of the first controlled studies to examine the impact of a school-based positive youth development program (Lerner, Fisher, & Weinberg, 2000) on promoting qualitative change in life course experiences as a positive intervention outcome. The study built on a recently proposed relational developmental methodological metanarrative (Overton, 1998) and advances in the use of qualitative research methods (Denzin & Lincoln, 2000). The study investigated the use of the Life Course Interview (Clausen, 1998) and an integrated qualitative and quantitative data analytic strategy (IQDAS) to provide empirical documentation of the impact of the Changing Lives Program on qualitative change in positive identity in a multicultural population of troubled youth in an alternative public high school. The psychosocial life course intervention approach used in this study draws its developmental framework from both psychosocial developmental theory (Erikson, 1968) and life course theory (Elder, 1998), and its intervention strategies from the transformative pedagogy of Freire (1983/1970). Using the 22 participants in the Intervention Condition and the 10 participants in the Control Condition, RMANOVAs found significantly more positive qualitative change in personal identity for program participants relative to the non-intervention control condition. In addition, the 2×2×2×3 mixed-design RMANOVA, in which Time (pre, post) was the repeated factor and Condition (Intervention versus Control), Gender, and Ethnicity were the between-group factors, also found significant interactions for Time by Gender and Time by Ethnicity. Moreover, the directionality of the basic pattern of change was positive for participants of both genders and all three ethnic groups.
The pattern of the moderation effects also indicated a marked tendency for participants in the intervention group to characterize their sense of self as more secure and less negative at the end of their first semester in the intervention, a tendency that was stable across both genders and all three ethnicities. This basic differential pattern, an increase in the intervention condition of a positive characterization of sense of self relative both to pretest and to the directionality of movement of the non-intervention controls, was stable across both genders and all three ethnic groups.

Relevance: 20.00%

Abstract:

OBJECTIVES: To report on the responsiveness testing and clinical utility of the 12-item Geriatric Self-Efficacy Index for Urinary Incontinence (GSE-UI). DESIGN: Prospective cohort study. SETTING: Six urinary incontinence (UI) outpatient clinics in Quebec, Canada. PARTICIPANTS: Community-dwelling incontinent adults aged 65 and older. MEASUREMENTS: The abridged 12-item GSE-UI, measuring older adults' level of confidence for preventing urine loss, was administered to all new consecutive incontinent patients 1 week before their initial clinic visit, at baseline, and 3 months posttreatment. At follow-up, a positive rating of improvement in UI was ascertained from patients and their physicians using the Patient's and Clinician's Global Impression of Improvement scales, respectively. Responsiveness of the GSE-UI was calculated using Guyatt's change index. Its clinical utility was determined using receiver operating characteristic curves. RESULTS: Eighty-nine of 228 eligible patients (39.0%) participated (mean age 72.6±5.8, range 65–90). At 3-month follow-up, 22.5% of patients were very much better, and 41.6% were a little or much better. Guyatt's change index was 2.6 for patients who changed by a clinically meaningful amount and 1.5 for patients having experienced any level of improvement. An improvement of 14 points on the 12-item GSE-UI had a sensitivity of 75.1% and a specificity of 78.2% for detecting clinically meaningful changes in UI status. Mean GSE-UI scores varied according to improvement status (P<.001) and correlated with changes in quality-of-life scores (r=0.7, P<.001) and reductions in UI episodes (r=0.4, P=.004). CONCLUSION: The GSE-UI is responsive and clinically useful.
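The cutoff analysis reported above can be sketched in a few lines: sensitivity is the fraction of truly improved patients whose score change reaches the threshold, and specificity is the fraction of non-improved patients below it. The function and the ten-patient cohort below are synthetic illustrations, not the study's data or code.

```python
# Sensitivity/specificity of a change-score cutoff (e.g., the 14-point
# GSE-UI threshold above). Data below are synthetic, for illustration only.

def sens_spec(changes, improved, cutoff):
    # changes: observed score changes; improved: True if the external
    # criterion rated the patient as clinically meaningfully improved
    tp = sum(1 for c, y in zip(changes, improved) if y and c >= cutoff)
    fn = sum(1 for c, y in zip(changes, improved) if y and c < cutoff)
    tn = sum(1 for c, y in zip(changes, improved) if not y and c < cutoff)
    fp = sum(1 for c, y in zip(changes, improved) if not y and c >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort of 10 patients
changes = [20, 16, 5, 2, 18, 3, 15, 10, 17, 1]
improved = [True, True, False, False, True, False, False, True, True, False]
sens, spec = sens_spec(changes, improved, cutoff=14)  # 0.8, 0.8
```

A receiver operating characteristic analysis simply repeats this computation across all candidate cutoffs and plots sensitivity against 1 - specificity.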

Relevance: 20.00%

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models ranged from simple metrics of image quality, such as contrast-to-noise ratio (CNR), to more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that the non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
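As a rough sketch of the distinction drawn here: for a known signal profile s in noise with covariance K, a non-prewhitening (NPW) matched-filter detectability index can be written in closed form as d'^2 = (s^T s)^2 / (s^T K s), whereas CNR ignores the signal's spatial structure and the noise correlations entirely. The code below is an illustrative implementation with a synthetic 1-D signal and white noise, not the dissertation's observer code.

```python
import numpy as np

def cnr(signal_mean, background_mean, noise_std):
    # Contrast-to-noise ratio: ignores signal shape and noise correlations
    return abs(signal_mean - background_mean) / noise_std

def d_prime_npw(signal, noise_cov):
    # NPW matched-filter detectability: d'^2 = (s^T s)^2 / (s^T K s)
    s = np.asarray(signal, dtype=float).ravel()
    return np.sqrt((s @ s) ** 2 / (s @ noise_cov @ s))

# Synthetic 1-D signal profile in white noise of standard deviation sigma
s = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
sigma = 0.5
K = sigma ** 2 * np.eye(s.size)   # white (uncorrelated) noise covariance
d = d_prime_npw(s, K)             # reduces to ||s|| / sigma for white noise
c = cnr(2.0, 0.0, sigma)          # peak contrast over noise std
```

For correlated noise, as produced by iterative reconstruction, K is no longer diagonal and d' diverges from any CNR-style figure of merit, which is one reason CNR correlates poorly with human performance across reconstruction algorithms.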

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
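The image-subtraction technique mentioned above exploits the fact that subtracting two repeated scans of the same phantom cancels the deterministic background (uniform or textured alike) and leaves pure noise with twice the single-image variance, so the per-image noise is the standard deviation of the difference divided by √2. A minimal synthetic sketch (not the study's code):

```python
import numpy as np

rng = np.random.default_rng(0)
phantom = rng.uniform(0, 100, size=(64, 64))  # fixed background, any texture
sigma = 5.0                                   # true per-image noise std

scan1 = phantom + rng.normal(0, sigma, phantom.shape)
scan2 = phantom + rng.normal(0, sigma, phantom.shape)

# The difference removes the background; its variance is 2 * sigma^2
noise_estimate = np.std(scan1 - scan2) / np.sqrt(2)
```

Because the background cancels exactly, the same estimator works whether the phantom is uniform or textured, which is what allows the noise comparison across phantom types described above.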

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
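For reference, the NPS is conventionally estimated from an ensemble of noise-only ROIs via the squared magnitude of their Fourier transforms; the dissertation's contribution extends this to irregularly shaped ROIs, which the conventional square-ROI estimator sketched below (on synthetic white noise) does not handle.

```python
import numpy as np

def nps_2d(rois, pixel_size=1.0):
    # Conventional 2-D NPS estimator for equal-sized square noise ROIs:
    # NPS(fx, fy) = (dx * dy / (Nx * Ny)) * <|DFT(ROI - mean)|^2>
    ny, nx = rois[0].shape
    spectra = [np.abs(np.fft.fft2(roi - roi.mean())) ** 2 for roi in rois]
    return (pixel_size ** 2 / (nx * ny)) * np.mean(spectra, axis=0)

rng = np.random.default_rng(1)
rois = [rng.normal(0.0, 2.0, (32, 32)) for _ in range(200)]  # synthetic noise
nps = nps_2d(rois)

# Sanity check: integrating the NPS recovers the pixel variance (Parseval)
variance_check = nps.sum() / (32 * 32)
```

For white noise the spectrum is flat; the locally varying, texture-dependent NPS reported for SAFIRE is exactly what this kind of ensemble estimate reveals when computed region by region.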

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
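As an illustration of the kind of analytical lesion model described above (the actual models also parameterize shape and texture), a circular lesion with a given size, contrast, and sigmoidal edge profile can be voxelized and added to an image patch to form a "hybrid" image. The function, profile, and parameter values below are hypothetical, chosen only to show the idea.

```python
import numpy as np

def lesion_2d(shape, center, radius, contrast_hu, edge_width):
    # Radially symmetric lesion: ~contrast_hu inside r < radius, ~0 outside,
    # with a sigmoidal edge transition of characteristic width edge_width
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    r = np.hypot(yy - center[0], xx - center[1])
    return contrast_hu / (1.0 + np.exp((r - radius) / edge_width))

background = np.full((128, 128), 60.0)   # hypothetical liver ROI, in HU
lesion = lesion_2d((128, 128), center=(64, 64), radius=10,
                   contrast_hu=-15.0, edge_width=1.5)
hybrid = background + lesion             # virtual lesion insertion
```

Because the lesion is generated from an analytical equation, its true size, contrast, edge profile, and location are known exactly, which is what makes hybrid images useful as ground truth for detectability and estimability studies.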

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard of care dose.
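In a 2AFC experiment like the one above, the observer picks which of two images contains the lesion, and under the standard Gaussian signal-detection model the proportion of correct responses Pc maps to a detectability index via d' = √2 · Φ⁻¹(Pc). A minimal sketch; the 0.76 below is an arbitrary illustrative value, not a result from the study.

```python
import math
from statistics import NormalDist

def d_prime_2afc(pc):
    # d' from 2AFC proportion correct: d' = sqrt(2) * Phi^{-1}(Pc),
    # where Phi^{-1} is the inverse standard normal CDF
    return math.sqrt(2.0) * NormalDist().inv_cdf(pc)

# Pc = 0.5 (chance) maps to d' = 0; higher Pc maps to larger d'
d = d_prime_2afc(0.76)
```

This mapping is what allows human 2AFC percent-correct results to be compared directly against model-observer detectability indices when estimating dose reduction potential.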

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance: 20.00%

Abstract:

A class of multi-process models is developed for collections of time-indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
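A Gaussian approximation of this kind matches the first two moments of the Pólya-Gamma distribution, which are available in closed form (Polson et al., 2013). A sketch, assuming ω ~ PG(b, c); whether the dissertation uses exactly this moment-matching construction is an assumption here.

```python
import math

def pg_mean(b, c):
    # E[omega] for omega ~ PG(b, c): b/(2c) * tanh(c/2)
    return b / (2.0 * c) * math.tanh(c / 2.0)

def pg_var(b, c):
    # Var[omega] for omega ~ PG(b, c): b/(4c^3) * (sinh(c) - c) / cosh(c/2)^2
    return b / (4.0 * c ** 3) * (math.sinh(c) - c) / math.cosh(c / 2.0) ** 2

# For large counts b (high-count binomial data), PG(b, c) is approximately
# Normal(pg_mean(b, c), pg_var(b, c)), avoiding expensive exact PG sampling
m, v = pg_mean(200.0, 1.0), pg_var(200.0, 1.0)
```

Both moments scale linearly in b, and for integer b a PG(b, c) variable is a b-fold convolution of PG(1, c) variables, so the central limit theorem makes the normal approximation increasingly accurate as counts grow.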

Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.

The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.

The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age specific latent natural ability class and a performance enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of the player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.

All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.

Relevance: 20.00%

Abstract:

An important aspect of managing chronic liver disease is assessing for evidence of fibrosis. Historically, this has been accomplished using liver biopsy, which is an invasive procedure associated with risk for complications and significant sampling and observer error, limiting the accuracy for determination of fibrosis stage. Hence, several serum biomarkers and imaging methods for noninvasive assessment of liver fibrosis have been developed. In this article, we review the current literature on an important noninvasive imaging modality that measures tissue elasticity (FibroScan®). This ultrasound-based technique is now increasingly available in many countries and has been shown to be a reliable and safe noninvasive means of assessing disease severity in chronic liver disease of varying etiology.

Relevance: 20.00%

Abstract:

Over 50% of the world's population live within 3 km of rivers and lakes, highlighting the ongoing importance of freshwater resources to human health and societal well-being. Whilst covering c. 3.5% of the Earth's non-glaciated land mass, trends in the environmental quality of the world's standing waters (natural lakes and reservoirs) are poorly understood, at least in comparison with rivers, and so evaluation of their current condition and sensitivity to change are global priorities. Here it is argued that a geospatial approach harnessing existing global datasets, along with new-generation remote sensing products, offers the basis to characterise trajectories of change in lake properties, e.g., water quality, physical structure, hydrological regime and ecological behaviour. This approach furthermore provides the evidence base to understand the relative importance of climatic forcing and/or changing catchment processes, e.g., land cover and soil moisture data, which coupled with climate data provide the basis to model regional water balance and runoff estimates over time. Using examples derived primarily from the Danube Basin but also other parts of the world, we demonstrate the power of the approach and its utility to assess the sensitivity of lake systems to environmental change, and hence better manage these key resources in the future.

Relevance: 20.00%

Abstract:

Human cadavers have long been used to teach human anatomy and are increasingly used in other disciplines. Different embalming techniques have been reported in the literature; however, there is no clear consensus on the opinion of anatomists on the utility of embalmed cadavers for the teaching of anatomy. To this end, we aimed to survey British and Irish anatomy teachers to report their opinions on different preservation methods for the teaching of anatomy. In this project, eight human cadavers were embalmed using formalin, Genelyn, Thiel and Imperial College London Soft Preserving (ICL-SP) techniques to compare different characteristics of these four techniques. The results of this thesis show that anatomy teachers consider hard-fixed cadavers not to be the most accurate teaching model in comparison to the human body, although they still serve as a useful teaching method (Chapter 2). In addition, our findings confirm that the joints of cadavers embalmed using the ICL-SP solution faithfully mimic the joints of an unembalmed cadaver compared to the other techniques (Chapter 3). Embalming a human body prevents deterioration in the quality of images, and our findings highlight that the influence of the embalming solutions varied with the radiological modality used (Chapter 4). The method developed as part of this thesis enables anatomists and forensic scientists to quantify the decomposition rate of an embalmed human cadaver (Chapter 5). Formalin embalming solution showed the strongest antimicrobial abilities, followed by Thiel, Genelyn and finally ICL-SP (Chapter 6). The overarching viewpoint of this set of studies shows that it is inaccurate to state that one embalming technique is ultimately the best. The value of each technique differs based on the requirements of the particular education or research area. Hence we highlight how different embalming techniques may be better suited to certain fields of study.

Relevance: 20.00%

Abstract:

The purpose of this review was to examine the utility and accuracy of commercially available motion sensors to measure step count and time spent upright in frail older hospitalized patients. A database search (CINAHL and PubMed, 2004–2014) and a further hand search of papers' references yielded 24 validation studies meeting the inclusion criteria. Fifteen motion sensors (eight pedometers, six accelerometers, and one sensor system) have been tested in older adults. Only three have been tested in hospital patients, two of which detected postures and postural changes accurately, but none estimated step count accurately. Only one motion sensor remained accurate at speeds typical of frail older hospitalized patients, but it has yet to be tested in this cohort. Time spent upright can be accurately measured in the hospital, but further validation studies are required to determine which, if any, motion sensor can accurately measure step count.


Relevance: 20.00%

Abstract:

Non-cognitive skills have caught the attention of current education policy writers in Canada. Within the last 10 years, almost every province has produced a document noting the importance of supporting non-cognitive skills in K-12 students in the classroom. Although often called different names (such as learning skills, cross-curricular competencies, and 21st Century Skills) and occasionally viewed through different lenses (such as emotional intelligence skills, character skills, and work habits), what unifies non-cognitive skills within the policy documents is the claim that students who are strong in these skills are more successful in academic achievement and in post-secondary endeavors. Though the interest from policy-makers and educators is clear, there are still many questions about non-cognitive skills that have yet to be answered. These include: Which skills are the most important for teachers to support in the classroom? What are these skills' exact contributions to student success? How can teachers best support these skills? Are there currently reliable and valid measures of these skills? These are very important questions worth answering if Canadian teachers are expected to support non-cognitive skills in their classrooms with an already burdened workload. As well, answering them can begin to untangle the plethora of research that exists within the non-cognitive realm. Without a critical look at the current literature, it is impossible to ensure that these policies are effective in Canadian classrooms, and to see an alignment between research and policy. Upon analysis of Canadian curricula, five non-cognitive skills were found to be the most prevalent among the provinces: Self-Regulation, Collaboration, Initiative, Responsibility and Creativity.
The available research literature was then examined to determine the utility of teaching these skills in the classroom (can students improve on these skills, do these skills impact other aspects of students' lives, and are there methods to validly and reliably assess these skills?). It was found that Self-Regulation and Initiative had the strongest basis for being implemented in the classroom. On the other hand, Creativity still requires considerably more justification in terms of its impact on students' lives and the ability to assess it in the classroom.

Relevance: 20.00%

Abstract:

Antimalarial chloroquine (CQ) prevents haematin detoxication when CQ-base concentrates in the acidic digestive vacuole through protonation of its p-aminopyridine (pAP) basic aromatic nitrogen and sidechain diethyl-N. CQ export through the variant vacuolar membrane export channel, PFCRT, causes CQ-resistance in Plasmodium falciparum, but 3-methyl CQ (sontochin, SC), des-ethyl amodiaquine (DAQ) and bis 4-aminoquinoline piperaquine (PQ) are still active. This is determined by changes in drug accumulation ratios in parasite lipid (LAR) and in vacuolar water (VAR). Higher LAR may facilitate drug binding to and blocking PFCRT and also aid haematin in lipid to bind drug. LAR for CQ is only 8.3; VAR is 143,482. More hydrophobic SC has LAR 143; VAR remains 68,523. Similarly, DAQ with a phenol substituent has LAR of 40.8, with VAR 89,366. In PQ, basicity of each pAP is reduced by the distal piperazine N, allowing a very high LAR of 973,492 while retaining a VAR of 104,378. In another bis quinoline, dichlorquinazine (DCQ), also active but clinically unsatisfactory, each pAP retains basicity, being insulated by a 2-carbon chain from a proximal nitrogen of the single linking piperazine. While its LAR of 15,488 is still high, the lowest estimate of its VAR approaches 4.9 million. DCQ may therefore be expected to be very highly lysosomotropic and potentially hepatotoxic. In 11 pAP antimalarials, a quadratic relationship between logLAR and logResistance Index (RI) was confirmed, while log(LAR/VAR) vs logRI for 12 was linear. Both might be used to predict the utility of structural modifications.
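VAR figures of the kind quoted above follow from classical pH-partition (Henderson-Hasselbalch) theory: only the neutral base crosses membranes freely, so at equilibrium the total drug ratio between the acidic vacuole and the external medium equals the ratio of the ionization factors at the two pH values. The sketch below applies this to a polyprotic base; the pKa and pH values are illustrative assumptions, not the values used in the paper.

```python
def ionization_factor(pkas, ph):
    # 1 + 10^(pKa1 - pH) + 10^(pKa1 + pKa2 - 2*pH) + ... for a polyprotic base
    total, cum = 1.0, 0.0
    for pka in pkas:
        cum += pka - ph
        total += 10.0 ** cum
    return total

def accumulation_ratio(pkas, ph_inside=5.0, ph_outside=7.4):
    # Equilibrium (total inside)/(total outside) when only the neutral
    # base species permeates the membrane
    return ionization_factor(pkas, ph_inside) / ionization_factor(pkas, ph_outside)

# Diprotic base with CQ-like pKa values (illustrative: 10.2 and 8.1),
# acidic vacuole at pH 5.0 vs external medium at pH 7.4
var_estimate = accumulation_ratio([10.2, 8.1])  # on the order of 10^4 - 10^5
```

The steep dependence on the number and strength of basic centers is why structural changes that add or insulate a protonatable nitrogen, as in PQ and DCQ above, shift VAR by orders of magnitude.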

Relevance: 20.00%

Abstract:

The discovery of an ever-expanding plethora of coding and non-coding RNAs with nodal and causal roles in the regulation of lung physiology and disease is reinvigorating interest in the clinical utility of the oligonucleotide therapeutic class. This is strongly supported by recent advances in nucleic acids chemistry, synthetic oligonucleotide delivery and viral gene therapy that have succeeded in bringing to market at least three nucleic acid-based drugs. As a consequence, multiple new candidates, such as RNA interference modulators, antisense oligonucleotides, and splice-switching compounds, are now progressing through clinical evaluation. Here, manipulation of RNA for the treatment of lung disease is explored, with emphasis on robust pharmacological evidence aligned to the five pillars of drug development: exposure to the appropriate tissue, binding to the desired molecular target, evidence of the expected mode of action, activity in the relevant patient population and a commercially viable value proposition.