951 results for CARA utility function
Abstract:
The study proposes to test the ‘IS-Impact’ index as Analytic Theory (AT), with the aims of: (a) methodically evaluating the ‘relevance’ qualities of IS-Impact, namely Utility and Intuitiveness; (b) documenting, in so doing, an exemplar of ‘a rigorous approach to relevance’; (c) treating the overarching study as a higher-order case study with AT as the unit of analysis, assessing the adequacy of the six AT qualities both for IS-Impact and for similar taxonomies; (d) looking beyond IS-Impact to other forms of Design Science to consider the generality of the AT qualities; and (e) further validating IS-Impact in new system and organisational contexts, taking account of contemporary understandings of construct theorisation, operationalisation and validation.
Abstract:
Background: Not all cancer patients receive state-of-the-art care, and providing regular feedback to clinicians might reduce this problem. The purpose of this study was to assess the utility of various data sources in providing feedback on the quality of cancer care.
Methods: Published clinical practice guidelines were used to obtain a list of processes-of-care of interest to clinicians. These were assigned to one of four data categories according to their availability and the marginal cost of using them for feedback.
Results: Only 8 (3%) of 243 processes-of-care could be measured using population-based registry or administrative inpatient data (lowest cost). A further 119 (49%) could be measured using a core clinical registry, which contains information on important prognostic factors (e.g., clinical stage, physiological reserve, hormone-receptor status). Another 88 (36%) required an expanded clinical registry or medical record review, mainly because they concerned long-term management of disease progression (recurrences and metastases), and 28 (11.5%) required patient interview or audio-taping of consultations because they involved information sharing between clinician and patient.
Conclusion: The advantages of population-based cancer registries and administrative inpatient data are wide coverage and low cost. The disadvantage is that they currently contain information on only a few processes-of-care. In most jurisdictions, clinical cancer registries, which can be used to report on many more processes-of-care, do not cover smaller hospitals. If we are to provide feedback about all patients, not just those in larger academic hospitals with the most developed data systems, then we need to develop sustainable population-based data systems that capture information on prognostic factors at the time of initial diagnosis and information on management of disease progression.
Abstract:
Opioid drugs, such as morphine, are among the most effective analgesics available. However, their utility for the treatment of chronic pain is limited by side effects including tolerance and dependence. Morphine acts primarily through the mu-opioid receptor (MOP-R), which is also a target of endogenous opioids. However, unlike endogenous ligands, morphine fails to promote substantial receptor endocytosis both in vitro and in vivo. Receptor endocytosis serves at least two important functions in signal transduction. First, desensitization and endocytosis act as an "off" switch by uncoupling receptors from G protein. Second, endocytosis functions as an "on" switch, resensitizing receptors by recycling them to the plasma membrane. Thus, both the "off" and "on" functions of the MOP-R are altered in response to morphine compared to endogenous ligands. To examine whether the low degree of endocytosis induced by morphine contributes to tolerance and dependence, we generated a knockin mouse that expresses a mutant MOP-R that undergoes morphine-induced endocytosis. Morphine remains an excellent antinociceptive agent in these mice. Importantly, these mice display substantially reduced antinociceptive tolerance and physical dependence. These data suggest that opioid drugs with a pharmacological profile similar to morphine, but with the ability to promote endocytosis, could provide analgesia while having a reduced liability for promoting tolerance and dependence.
Abstract:
We report that 10% of melanoma tumors and cell lines harbor mutations in the fibroblast growth factor receptor 2 (FGFR2) gene. These novel mutations include three truncating mutations and 20 missense mutations occurring at residues that are evolutionarily conserved in FGFR2 as well as among all four FGFRs. The mutation spectrum is characteristic of those induced by UV radiation. Mapping of these mutations onto the known crystal structures of FGFR2, followed by in vitro and in vivo studies, shows that these mutations result in receptor loss of function through several distinct mechanisms, including loss of ligand binding affinity, impaired receptor dimerization, destabilization of the extracellular domains, and reduced kinase activity. To our knowledge, this is the first demonstration of loss-of-function mutations in a class IV receptor tyrosine kinase in cancer. Taken together with our recent discovery of activating FGFR2 mutations in endometrial cancer, we suggest that FGFR2 may join the list of genes that play context-dependent opposing roles in cancer.
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, selecting the optimal forecasting model is challenging. The aim of this thesis is to thoroughly investigate how effective many commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts does vary.
QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
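As a minimal sketch of the statistical and economic loss functions named in this abstract, the following shows the standard multivariate QLIKE loss (log|H| + tr(H⁻¹Σ)), matrix MSE (squared Frobenius norm), and a portfolio-variance loss based on minimum-variance weights. This is a generic illustration using NumPy, not the thesis's exact formulation; the matrices `sigma`, `h_good`, and `h_bad` are made-up toy values.

```python
import numpy as np

def qlike_loss(sigma_proxy, h_forecast):
    """Multivariate QLIKE loss: log|H| + tr(H^{-1} Sigma).
    Lower is better; known to be robust to noise in the proxy."""
    _, logdet = np.linalg.slogdet(h_forecast)
    return logdet + np.trace(np.linalg.solve(h_forecast, sigma_proxy))

def mse_loss(sigma_proxy, h_forecast):
    """Matrix MSE loss: squared Frobenius norm of (Sigma - H)."""
    return np.sum((sigma_proxy - h_forecast) ** 2)

def portfolio_variance_loss(h_forecast, returns):
    """Economic loss: squared realised return of the minimum-variance
    portfolio constructed from the forecast covariance H."""
    ones = np.ones(h_forecast.shape[0])
    w = np.linalg.solve(h_forecast, ones)
    w /= w.sum()              # min-variance weights implied by H
    return (w @ returns) ** 2

# Toy comparison: rank two candidate forecasts against a proxy
sigma = np.array([[1.0, 0.3], [0.3, 1.5]])    # volatility proxy
h_good = np.array([[1.1, 0.25], [0.25, 1.4]])  # close to the proxy
h_bad = np.array([[2.0, -0.5], [-0.5, 0.5]])   # far from the proxy
assert qlike_loss(sigma, h_good) < qlike_loss(sigma, h_bad)
assert mse_loss(sigma, h_good) < mse_loss(sigma, h_bad)
```

In practice the proxy Σ would be a realised covariance matrix and the losses would be averaged over an out-of-sample period before applying a ranking procedure such as the MCS.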
Abstract:
Emotional responses can incite and entice consumers to select a particular product from a row of similar items and thus have a considerable impact on purchase decisions. Consequently, more and more companies are challenging designers to address the emotional impact of their work and to design for emotion and consumer-product relationships. Furthermore, the creation of emotional attachment to one’s possessions is one way of approaching a sustainable consumer-product relationship. The aim of this research is to gain a deeper understanding of the instantaneous emotional attachment that consumers form with products and its subsequent implications for product development. The foci of the study are visceral design, consumer hedonics and product rhetoric. Studied together, they constitute a new area of investigation: visceral hedonic rhetoric. In this context, the term “visceral hedonic rhetoric” is defined as the properties of a product that persuasively elicit the pursuit of pleasure at an instinctual level of cognition. This study explores visceral hedonic rhetoric evident in the design of interactive products and resides within the context of emotional design research. It employs an empirical approach to understand how consumers respond hedonically on a visceral level to rhetoric in products. Specifically, it examines visceral hedonic responses given by thirty participants to the stimuli of six mobile telephones, six Mp3 players and six USB memory flash drives. The study findings demonstrate a hierarchy of visceral hedonic rhetoric evident in interactive products. This hierarchy of visceral hedonic attributes includes: colour, size, shape, intrigue, material, perceived usability, portability, perceived function, novelty, analogy, brand, quality, texture and gender. However, it is the interrelationships between these visceral hedonic attributes that are the most significant findings of this research.
Certain associations were revealed between product attribute combinations and consumer perception. The most predominant of these were: gender bias associated with colour selection; the creation of intrigue through a vibrant attention-grabbing colour; perceived ease of use and function; product confidence as a result of brand familiarity and perceived usability; analogous association through familiarity with similar objects and shapes; and the association of longevity with quality, novelty or recent technology. A significant outcome of the research is the distillation of visceral hedonic rhetoric design principles, and a tool to assist designers in harnessing the full potential of visceral hedonic rhetoric. This study contributes to the identification of the emerging research field of visceral hedonic rhetoric. Application of this study’s findings has the potential to provide a hedonic consumer-product relationship that is more meaningful, less disposable and more sustainable. This theory of visceral hedonic rhetoric is not only a significant contribution to design knowledge but is also generally transferable to other research domains, as later suggested in future research avenues.
Abstract:
Diabetes is an increasingly prevalent disease worldwide. Early management of its complications can prevent morbidity and mortality in this population. Peripheral neuropathy, a significant complication of diabetes, is the major cause of foot ulceration and amputation in diabetes. Delay in attending to complications of the disease contributes to significant medical expenses for diabetic patients and the community. Early structural changes to the neural components of the retina have been demonstrated to occur prior to the clinically visible retinal vasculature complication of diabetic retinopathy. Additionally, visual function loss has been shown to exist before the ophthalmoscopic manifestations of vasculature damage. The purpose of this thesis was to evaluate the relationship between diabetic peripheral neuropathy and both retinal structure and visual function. The key question was whether diabetic peripheral neuropathy is the potential underlying factor responsible for retinal anatomical change and visual function loss in people with diabetes. This study was conducted on a cohort with type 2 diabetes. Retinal nerve fibre layer (RNFL) thickness was assessed by means of Optical Coherence Tomography (OCT). Visual function was assessed using two different methods: Standard Automated Perimetry (SAP) and flicker perimetry, both performed within the central 30 degrees of fixation. The level of diabetic peripheral neuropathy (DPN) was assessed using two techniques: Quantitative Sensory Testing (QST) and the Neuropathy Disability Score (NDS). These techniques are known to be capable of detecting DPN at very early stages. NDS has also been shown to be a gold standard for detecting 'risk of foot ulceration'. Findings reported in this thesis showed that RNFL thickness, particularly in the inferior quadrant, has a significant association with severity of DPN when the condition was assessed using NDS.
More specifically, it was observed that inferior RNFL thickness has the ability to differentiate individuals who are at higher risk of foot ulceration from those who are at lower risk, indicating that RNFL thickness can predict late-stage DPN. Investigating the association between RNFL and QST did not show any meaningful interaction, which indicates that RNFL thickness for this cohort was not as predictive of neuropathy status as NDS. In both of these studies, control participants did not have different results from the type 2 cohort without DPN, suggesting that RNFL thickness is not a marker for diagnosing DPN at early stages. The latter finding also indicated that diabetes per se is unlikely to affect RNFL thickness. Visual function as measured by SAP and flicker perimetry was found to be associated with severity of peripheral neuropathy as measured by NDS. These findings were also capable of differentiating individuals at higher risk of foot ulceration; however, visual function also proved not to be a marker for early diagnosis of DPN. It was found that neither SAP nor flicker sensitivity has a meaningful association with DPN when neuropathy status was measured using QST. Importantly, diabetic retinopathy did not explain any of the findings in these experiments. The work described here is valuable as no other research to date has investigated the association between diabetic peripheral neuropathy and either retinal structure or visual function.
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage with frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and Artificial Neural Networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence impossible. Therefore, a data reduction technique, Principal Component Analysis (PCA), is introduced in the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of this method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Then, using these features, damage indices of different damage cases of the structure are identified after reconstructing the available FRF data using PCA. The obtained damage indices corresponding to different damage locations and severities are introduced as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using a finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
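The data-reduction step described in this abstract, compressing high-dimensional FRF data with PCA before feeding it to a neural network, can be sketched generically as below. This is an illustrative PCA-via-SVD reduction in NumPy under assumed toy dimensions (10 measurement cases, 500 frequency points); it is not the paper's specific damage-index formulation.

```python
import numpy as np

def pca_reduce(frf_matrix, n_components):
    """Reduce a set of FRFs (rows = measurement cases, columns =
    frequency points) to n_components principal-component scores,
    and also return the PCA reconstruction of the data."""
    mean = frf_matrix.mean(axis=0)
    centred = frf_matrix - mean
    # SVD of the centred data; rows of vt are principal directions
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    components = vt[:n_components]
    scores = centred @ components.T           # compact ANN inputs
    reconstructed = scores @ components + mean
    return scores, reconstructed

# Toy FRF data: 10 damage cases, 500 frequency points (made up)
rng = np.random.default_rng(0)
frfs = rng.normal(size=(10, 500))
scores, recon = pca_reduce(frfs, n_components=5)
assert scores.shape == (10, 5)      # 500 features reduced to 5
assert recon.shape == frfs.shape
```

In a scheme like the one described, the compact scores (or damage indices computed from the reconstructed FRFs) would then serve as the input layer of the ANN, keeping the network small enough to train and converge.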
Abstract:
This article examines, from both within and outside the context of compulsory third party motor vehicle insurance, the different academic and judicial perspectives regarding the relevance of insurance to the imposition of negligence liability via the formulation of legal principle. In particular, the utility of insurance in setting the standard of care owed by a learner driver to an instructor in Imbree v McNeilly is analysed, and the implications of this High Court decision, in light of current jurisprudential argument and for other principles of negligence liability, namely claimant vulnerability, are considered. It concludes that ultimately one’s stance as to the relevance, or otherwise, of insurance to the development of the common law of negligence will be predominantly influenced by normative views of tort’s function as an instrument of corrective or distributive justice.