842 results for Quality Model
Abstract:
Background: Pharmaceuticals are big business, reporting strong market growth year after year. The ‘gatekeepers’ of this market are prescribers of medicines, who are the major target of pharmaceutical companies’ direct and indirect influence. Methods: This paper draws on previous research into pharmaceutical company influences on prescribing to develop a qualitative model demonstrating the synergism between commercial influences on prescribing. The generic model was then applied to a realistic but hypothetical scenario to ascertain its applicability. Results and Discussion: A generic influence model was developed and was readily adapted to reflect a realistic practice scenario. Conclusion: Prescriber awareness of the linkages between various seemingly separate marketing techniques could potentially improve medicines usage in an evidence-based practice paradigm.
Abstract:
Objective. Leconotide (CVID, AM336, CNSB004) is an omega conopeptide similar to ziconotide, which blocks voltage-sensitive calcium channels. However, unlike ziconotide, which must be administered intrathecally, leconotide can be given intravenously because it is less toxic. This study investigated the antihyperalgesic potency of leconotide given intravenously, alone and in combination with morphine administered intraperitoneally, in a rat model of bone cancer pain. Design. Syngeneic rat prostate cancer cells AT3B-1 were injected into one tibia of male Wistar rats. The tumor expanded within the bone, causing hyperalgesia to heat applied to the ipsilateral hind paw. Measurements were made of the maximum dose (MD) of morphine and leconotide, given alone and in combination, that caused no effect in an open-field activity monitor, on the rotarod, or on blood pressure and heart rate. Paw withdrawal thresholds from noxious heat were measured. Dose-response curves for morphine (0.312–5.0 mg/kg intraperitoneal) and leconotide (0.002–200 µg/kg intravenous) given alone were plotted, and responses were compared with those caused by morphine and leconotide in combination. Results. Leconotide caused minimal antihyperalgesic effects when administered alone. Morphine given alone intraperitoneally caused dose-related antihyperalgesic effects (ED50 = 2.40 ± 1.24 mg/kg), which were increased by coadministration of leconotide 20 µg/kg (morphine ED50 = 0.16 ± 1.30 mg/kg); 0.2 µg/kg (morphine ED50 = 0.39 ± 1.27 mg/kg); and 0.02 µg/kg (morphine ED50 = 1.24 ± 1.30 mg/kg). Conclusions. Leconotide caused a significant increase in reversal by morphine of the bone cancer-induced hyperalgesia without increasing the side effect profile of either drug. Clinical Implication. Translation into clinical practice of the method of analgesia described here will improve the quantity and quality of analgesia in patients with bone metastases.
The use of an ordinary parenteral route for administration of the calcium channel blocker (leconotide) at low dose opens up the technique to large numbers of patients who could not have an intrathecal catheter for drug administration. Furthermore, the potentiating synergistic effect with morphine on hyperalgesia without increased side effects will lead to greater analgesia with improved quality of life.
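The ED50 shifts reported in the abstract can be read as morphine-sparing potency ratios: how many times less morphine achieves the same antihyperalgesic effect when leconotide is co-administered. A minimal sketch, using only the ED50 values quoted above (the ratio interpretation is an illustration, not the authors' analysis):

```python
# Morphine-sparing ratios computed from the ED50 values quoted in the abstract.
# The "fold-sparing" framing is illustrative; the study's own statistics are
# not reproduced here.

ed50_morphine_alone = 2.40           # mg/kg, intraperitoneal, from the abstract

ed50_with_leconotide = {             # leconotide dose (ug/kg) -> morphine ED50 (mg/kg)
    20.0: 0.16,
    0.2: 0.39,
    0.02: 1.24,
}

for dose, ed50 in sorted(ed50_with_leconotide.items()):
    ratio = ed50_morphine_alone / ed50
    print(f"leconotide {dose:>5} ug/kg: {ratio:.1f}-fold morphine sparing")
```

At the highest leconotide dose the quoted values imply a 15-fold reduction in the morphine ED50, which is the synergism the conclusions describe.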
Abstract:
Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises, for example, when the repository covers multiple variants of the same processes, or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
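The DBSCAN-based technique mentioned above groups process fragments whose pairwise distances fall below a threshold. A minimal sketch of density-based clustering over a precomputed fragment-distance matrix (the distance values are made up; the article's actual distance metric and parameters are not reproduced here):

```python
# Minimal DBSCAN over a symmetric distance matrix between process fragments.
# Fragments whose mutual distance is <= eps and that have at least min_pts
# neighbours form cluster cores; -1 marks noise (unclustered fragments).

def dbscan(dist, eps, min_pts):
    n = len(dist)
    labels = [None] * n
    cluster = -1
    for p in range(n):
        if labels[p] is not None:
            continue
        neighbours = [q for q in range(n) if dist[p][q] <= eps]
        if len(neighbours) < min_pts:
            labels[p] = -1            # noise; may be claimed as a border point later
            continue
        cluster += 1
        labels[p] = cluster
        seeds = [q for q in neighbours if q != p]
        while seeds:
            q = seeds.pop()
            if labels[q] == -1:
                labels[q] = cluster   # border point: joins the cluster, no expansion
            if labels[q] is not None:
                continue
            labels[q] = cluster
            q_neigh = [r for r in range(n) if dist[q][r] <= eps]
            if len(q_neigh) >= min_pts:
                seeds.extend(r for r in q_neigh if labels[r] is None)
    return labels

# Toy graph-edit-style distances between five fragments: 0-2 are near-duplicates
# (one copy-paste family), 3-4 are another, and the families are far apart.
D = [
    [0.0, 0.1, 0.2, 0.9, 0.8],
    [0.1, 0.0, 0.1, 0.9, 0.9],
    [0.2, 0.1, 0.0, 0.8, 0.9],
    [0.9, 0.9, 0.8, 0.0, 0.1],
    [0.8, 0.9, 0.9, 0.1, 0.0],
]

print(dbscan(D, eps=0.3, min_pts=2))  # -> [0, 0, 0, 1, 1]: two clone clusters
```

Each resulting cluster is a candidate set of approximate clones that could then be scored with a standardizability measure before refactoring into one shared subprocess.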
Abstract:
In a tag-based recommender system, the multi-dimensional
A tag-based personalized item recommendation system using tensor modeling and topic model approaches
Abstract:
This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analyzing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, called tag assignment
Abstract:
Fear-related illnesses such as post-traumatic stress disorder (PTSD) impose a tremendous burden on individual quality of life, families, and the national economy. In the military population, 17-20% of service members returning from deployment are diagnosed with PTSD. While treatments for PTSD have improved and are helpful for some, many people continue to suffer despite therapy. The aim of this research is to examine fear memory behaviourally and at the cellular level in the amygdala by using a unique inter-cross strain of high and low fear phenotype mice. Mice from an extended outcross of C57BL/6J x DBA/2J (F8) are selected for high or low Pavlovian fear memory to context and cue. On presentation of either the original learning context or the cue (tone), mice display high or low levels of freezing as a behavioural measure of fear. In order to identify key aspects of the cellular basis of this difference in fear memory behaviour, we are measuring protein levels and neuron numbers in a known pathway involved in the consolidation of long-term fear memory (pMAPK). Ongoing studies aim to determine whether high fear behaviour is associated with differential signalling in the lateral amygdala compared to low fear behaviour. Additionally, by blocking this pathway in the lateral amygdala (LA), we aim to reduce fear behaviour following Pavlovian fear conditioning. This research will help to unravel the cellular mechanisms underlying high fear behaviour and advance the field toward targeted treatment and improved outcomes, ultimately improving human quality of life.
Abstract:
Enhancing the quality of food products and reducing the volume of waste during mechanical operations in the food industry requires a comprehensive knowledge of material response under loading. While research has focused on the mechanical response of food materials, the volume of waste after harvesting and during processing stages is still considerably high in both developing and developed countries. This research aims to develop and evaluate a constitutive model of the mechanical response of tough-skinned vegetables under postharvest and processing operations. The model focuses on both tensile and compressive properties of pumpkin flesh and peel tissues, whose behaviours vary depending on factors such as rheological response and cellular structure. Both elastic and plastic responses of the tissue were considered in the modelling process, and finite elasticity combined with pseudo-elasticity theory was applied to generate the model. The outcomes were then validated using published results of experimental work on pumpkin flesh and peel under uniaxial tension and compression. The constitutive coefficients for peel under tensile testing were α = 25.66 and β = −18.48 MPa, and for flesh α = −5.29 and β = 5.27 MPa. Under compression, the constitutive coefficients were α = 4.74 and β = −1.71 MPa for peel and α = 0.76 and β = −1.86 MPa for flesh samples. The constitutive curves predicted the values of force precisely and close to the experimental values. The curves were fit both for the whole stress versus strain curve and for the section of the curve up to the bio-yield point. The modelling outputs presented good agreement with the empirical values, and the constitutive curves exhibited a very similar pattern to the experimental curves. The presented constitutive model can next be applied to other agricultural materials under loading in future work.
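A two-parameter (α, β) constitutive curve of the kind described above can be evaluated against measured stress-strain points to check the fit. The exponential form and all numbers below are illustrative assumptions: the abstract reports (α, β) pairs for pumpkin peel and flesh but does not restate the closed-form expression, so this sketch should not be read as the authors' model.

```python
import math

# Hypothetical two-parameter pseudo-elastic law for illustration only:
#   sigma = alpha * (exp(beta * strain) - 1)
# The article's actual closed form is not reproduced here.

def stress(strain, alpha, beta):
    return alpha * (math.exp(beta * strain) - 1.0)

def rmse(model, measured):
    """Root-mean-square error between predicted and measured stresses."""
    return math.sqrt(sum((m - d) ** 2 for m, d in zip(model, measured)) / len(measured))

strains = [0.00, 0.02, 0.04, 0.06, 0.08]
measured = [0.00, 0.21, 0.44, 0.70, 0.98]   # made-up tensile data, MPa

# Illustrative coefficients (not the reported pumpkin values).
predicted = [stress(e, alpha=5.0, beta=2.0) for e in strains]
print(f"RMSE = {rmse(predicted, measured):.3f} MPa")
```

Fitting α and β by minimising this RMSE over the whole curve, or only over the section up to the bio-yield point, mirrors the two fitting ranges the abstract describes.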
Abstract:
Purpose: The purpose of this paper is to review, critique and develop a research agenda for the Elaboration Likelihood Model (ELM). The model was introduced by Petty and Cacioppo over three decades ago and has been modified, revised and extended. Given modern communication contexts, it is appropriate to question the model’s validity and relevance. Design/methodology/approach: The authors develop a conceptual approach, based on a comprehensive and extensive review and critique of the ELM and its development since its inception. Findings: This paper focuses on major issues concerning the ELM. These include the model's assumptions and its descriptive nature; continuum questions, multi-channel processing and mediating variables; the need to replicate the ELM; and recommendations for its future development. Research limitations/implications: This paper offers a series of questions as research implications. These include whether the ELM could or should be replicated; whether it should be extended; a greater conceptualization of argument quality; an explanation of movement along the continuum and between central and peripheral routes to persuasion; and the use of new methodologies and technologies to better understand consumer thinking and behaviour. All of these relate to the current need to explore the relevance of the ELM in a more modern context. Practical implications: It is time to question the validity and relevance of the ELM. The diversity of on- and offline media options and the variants of consumer choice raise significant issues. Originality/value: While the ELM continues to be widely cited and taught as one of the major cornerstones of persuasion, questions are raised concerning its relevance and validity in 21st century communication contexts.
Abstract:
Emotion and cognition are known to interact during human decision processes. In this study we focus on a specific kind of cognition, namely metacognition. Our experiment induces a negative emotion, worry, during a perceptual task. In a numerosity task, subjects have to make a two-alternative forced choice and then reveal their confidence in this decision. We measure metacognition in terms of discrimination and calibration abilities. Our results show that metacognition, but not choice, is affected by the level of worry anticipated before the decision. Under worry, individuals tend to have better metacognition in terms of the two measures. Furthermore, the formation of confidence is better explained when the level of worry is taken into account in the model. This study shows the importance of an emotional component in the formation and quality of subjective probabilities.
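The two metacognition measures named above, calibration and discrimination, can be sketched with standard scoring rules. The data and the exact scoring rules below are illustrative assumptions, not the study's actual measures: calibration is approximated here by the Brier score of confidence reports against correctness, and discrimination by the gap in mean confidence between correct and incorrect trials.

```python
# Illustrative metacognition scores from per-trial confidence reports.
# Calibration here = Brier score (lower is better calibrated);
# discrimination = mean confidence on correct minus incorrect trials
# (higher means confidence better separates hits from errors).

def brier(confidence, correct):
    """Mean squared gap between stated confidence and the 0/1 outcome."""
    return sum((c - float(k)) ** 2 for c, k in zip(confidence, correct)) / len(correct)

def discrimination(confidence, correct):
    """Mean confidence on correct trials minus mean confidence on errors."""
    hits = [c for c, k in zip(confidence, correct) if k]
    misses = [c for c, k in zip(confidence, correct) if not k]
    return sum(hits) / len(hits) - sum(misses) / len(misses)

# Made-up trials: confidence in [0, 1] and whether the forced choice was correct.
conf = [0.9, 0.6, 0.8, 0.5, 0.7]
correct = [True, False, True, False, True]

print(round(brier(conf, correct), 3))          # -> 0.15
print(round(discrimination(conf, correct), 3)) # -> 0.25
```

Comparing these scores between high-worry and low-worry conditions is the shape of the analysis the abstract reports, with metacognition but not choice accuracy shifting under worry.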
Abstract:
The nature of construction projects and their delivery exposes participants to accidents and dangers. Safety climate serves as a frame of reference for employees to make sense of safety measures in the workplace and adapt their behaviors. Though safety climate research abounds, fewer efforts have been made to investigate the formation of a safety climate. Exploring the formation of psychological safety climate, an operationalization of safety climate at the individual level, is an appropriate starting point. Taking the view that projects are social processes, this paper develops a conceptual framework for the formation of psychological safety climate and provides a preliminary validation. The model suggests that management can create the desired psychological safety climate through efforts from structural, perceptual, interactive, and cultural perspectives. Future empirical research can build on the model to provide a more comprehensive and coherent picture of the determinants of safety climate.
Abstract:
There has been an increasing focus on the development of test methods to evaluate the durability performance of concrete. This paper contributes to this focus by presenting a study that evaluates the effect of water-accessible porosity and oven-dry unit weight on the resistance of both normal and lightweight concrete to chloride-ion penetration. Based on the experimental results and regression analyses, empirical models are established to correlate the total charge passed and the chloride migration coefficient with basic properties of concrete such as water-accessible porosity, oven-dry unit weight, and compressive strength. These equations can be broadly applied to both normal and lightweight aggregate concretes. The model was also validated by an independent set of experimental results from two different concrete mixtures. The model provides a very good estimate of the concrete’s durability performance with respect to resistance to chloride-ion penetration.
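The empirical correlations described above are regression fits of a durability indicator against measured concrete properties. A minimal sketch of such a fit via ordinary least squares (the data, variable choice, and coefficients below are made up for illustration; the article's actual equations are not reproduced):

```python
# Ordinary least-squares fit of a durability indicator against one concrete
# property, via the closed-form normal equations for y = a + b*x.
# All numbers are illustrative, not the study's data.

def fit_line(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical measurements: water-accessible porosity (%) vs chloride
# migration coefficient (x 1e-12 m^2/s).
porosity = [8.0, 10.0, 12.0, 14.0, 16.0]
migration = [4.1, 6.0, 8.2, 9.9, 12.1]

a, b = fit_line(porosity, migration)
print(f"migration coefficient ~ {a:.2f} + {b:.3f} * porosity")
```

A multi-variable version of the same idea (porosity, unit weight, and compressive strength as predictors) matches the form of the empirical models the abstract describes, and validating against an independent mixture set guards against overfitting the calibration data.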
Abstract:
This research established innovative methods and a predictive model to evaluate water quality using the trace element and heavy metal concentrations of drinking water from the greater Brisbane area. Significantly, the combined use of Inductively Coupled Plasma-Mass Spectrometry and chemometrics can be used worldwide to provide comprehensive, rapid and affordable analyses of elements in drinking water that can have a considerable impact on human health.
Abstract:
This paper presents a layered framework for integrating different Socio-Technical Systems (STS) models and perspectives into a whole-of-systems model. Holistic modelling plays a critical role in the engineering of STS due to the interplay between social and technical elements within these systems and the resulting emergent behaviour. The framework decomposes STS models into components, where each component is either a static object, a dynamic object or a behavioural object. Based on existing literature, a classification of the different elements that make up STS, whether social, technical or natural environment elements, is developed; each object can in turn be classified according to the STS elements it represents. Using the proposed framework, it is possible to systematically decompose models to the extent that points of interface can be identified, and the contextual factors required to transform a component of one model to interface with another are obtained. Using an airport inbound passenger facilitation process as a case study socio-technical system, three different models are analysed: a Business Process Modelling Notation (BPMN) model, a Hybrid Queue-based Bayesian Network (HQBN) model and an Agent-Based Model (ABM). It is found that the framework enables the modeller to identify non-trivial interface points, such as between the spatial interactions of an ABM and the causal reasoning of a HQBN, and between the process activity representation of a BPMN model and simulated behavioural performance in a HQBN. Such a framework is a necessary enabler for integrating different modelling approaches in understanding and managing STS.
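The decomposition the framework describes, each model component typed as a static, dynamic or behavioural object and classified by the STS element it represents, can be sketched as a small data structure. Class and field names below are illustrative, not the authors' notation:

```python
# Illustrative sketch of the framework's component decomposition: each model
# component gets an object kind (static / dynamic / behavioural) and an STS
# element classification (social / technical / environment). Components from
# different models that share a classification are candidate interface points.

from dataclasses import dataclass

KINDS = {"static", "dynamic", "behavioural"}
ELEMENTS = {"social", "technical", "environment"}

@dataclass(frozen=True)
class Component:
    name: str
    kind: str      # static / dynamic / behavioural object
    element: str   # social / technical / natural environment

    def __post_init__(self):
        if self.kind not in KINDS or self.element not in ELEMENTS:
            raise ValueError("unknown classification")

# Hypothetical components of an airport passenger-facilitation model:
components = [
    Component("passenger", "dynamic", "social"),
    Component("border_gate", "static", "technical"),
    Component("queue_join_rule", "behavioural", "social"),
]

# Candidate interface points: all components representing social elements.
social = [c.name for c in components if c.element == "social"]
print(social)  # -> ['passenger', 'queue_join_rule']
```

In the paper's terms, once two models' components are decomposed this way, matching classifications indicate where, for example, an ABM's agents and a HQBN's nodes can be mapped onto one another.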
Co-optimisation of indoor environmental quality and energy consumption within urban office buildings
Abstract:
This study aimed to develop a multi-component model that can be used to maximise indoor environmental quality inside mechanically ventilated office buildings while minimising energy usage. The integrated model, which was developed and validated from fieldwork data, was employed to assess the potential improvement of indoor air quality and energy saving under different ventilation conditions in typical air-conditioned office buildings in the subtropical city of Brisbane, Australia. When operating the ventilation system under predicted optimal conditions of indoor environmental quality and energy conservation and using outdoor air filtration, average indoor particle number (PN) concentration decreased by as much as 77%, while indoor CO2 concentration and energy consumption were not significantly different compared to normal summertime operating conditions. Benefits of operating the system with this algorithm were most pronounced during Brisbane’s mild winter. In terms of indoor air quality, average indoor PN and CO2 concentrations decreased by 48% and 24%, respectively, while potential energy savings due to free cooling went as high as 108% of the normal wintertime operating conditions. The application of such a model to the operation of ventilation systems can help to significantly improve indoor air quality and energy conservation in air-conditioned office buildings.