905 results for optimising compiler


Relevance:

10.00%

Publisher:

Abstract:

Chronic kidney disease (CKD) is associated with increased cardiovascular risk in comparison with the general population. This can be observed even in the early stages of CKD, and rises in proportion to the degree of renal impairment. Not only is cardiovascular disease (CVD) more prevalent in CKD, but its nature differs too, with an excess of morbidity and mortality associated with congestive cardiac failure, arrhythmia and sudden death, as well as the accelerated atherosclerosis which is also observed. Conventional cardiovascular risk factors such as hypertension, dyslipidaemia, obesity, glycaemia and smoking are highly prevalent amongst patients with CKD, although for many of these the interaction between risk factor and disease differs from that seen in normal renal function. Nevertheless, the extent of CVD cannot be fully explained by these conventional risk factors, and non-conventional factors specific to CKD are now recognised to contribute to the burden of CVD.

Oxidative stress is a state characterised by excessive production of reactive oxygen species (ROS) and other radical species, a reduction in the capacity of antioxidant systems, and disturbance of normal redox homeostasis, with depletion of protective vascular signalling molecules such as nitric oxide (NO). This results in oxidative damage to macromolecules such as lipids, proteins and DNA, which can alter their functionality. Moreover, many enzymes are sensitive to redox regulation, such that oxidative modification of cysteine thiol groups activates signalling cascades with adverse cardiovascular effects such as vascular and endothelial dysfunction. Endothelial dysfunction and oxidative stress accompany many conventional cardiovascular risk factors, and can be observed even before the development of overt clinical vascular pathology, suggesting that these phenomena represent the earliest stages of CVD. In the presence of CKD, there is increased ROS production due to upregulated NADPH oxidase (NOX), an increase in circulating asymmetric dimethylarginine (ADMA), uncoupling of endothelial nitric oxide synthase (eNOS), and other mechanisms. There is also depletion of exogenous antioxidants such as ascorbic acid and tocopherol, and reduced activity of endogenous antioxidant systems regulated by the master gene regulator Nrf-2. In previous studies, circulating markers of oxidative stress have been shown to be increased in CKD, together with a reduction in endothelial function, in a stepwise fashion relating to the severity of renal impairment. Not only is CVD linked to oxidative stress, but the progression of CKD itself is also partly dependent on redox-sensitive mechanisms. For example, administration of the ROS scavenger tempol attenuates renal injury and reduces renal fibrosis seen on biopsy in a mouse model of CKD, whilst conversely, supplementation with the NOS inhibitor L-NAME causes proteinuria and renal impairment. Previous human studies examining the effect of antioxidant administration on vascular and renal function have, however, been conflicting.

The work contained in this thesis therefore examines the effect of antioxidant administration on vascular and endothelial function in CKD. Firstly, 30 patients with CKD stages 3–5 and 20 matched hypertensive controls were recruited. Participants with CKD had lower ascorbic acid, higher TAP and ADMA, together with higher augmentation index and pulse wave velocity. There was no difference in baseline flow-mediated dilatation (FMD) between groups. Intravenous ascorbic acid increased TAP and O2-, reduced central blood pressure and augmentation index in both groups, and lowered ADMA in the CKD group only. No effect on FMD was observed.

The effects of ascorbic acid on kidney function were then investigated; however, this was hindered by the inherent drawbacks of existing non-invasive methods of measuring kidney function. Arterial spin labelling (ASL) MRI is an emerging imaging technique which allows measurement of renal perfusion without administration of an exogenous contrast agent. The technique relies upon application of an inversion pulse to blood within the vasculature proximal to the kidneys, which magnetically labels protons, allowing measurement as they transit to the kidney. At the outset of this project, local experience with ASL MRI was limited, and a prolonged pre-clinical phase of testing ensued with the aim of optimising the imaging strategy. A study was then designed to investigate the repeatability of ASL MRI in a group of 12 healthy volunteers with normal renal function. The measured T1 longitudinal relaxation times and ASL MRI perfusion values were in keeping with those found in the literature: T1 was 1376 ms in the cortex and 1491 ms in the whole-kidney region of interest (ROI), whilst perfusion was 321 mL/min/100 g in the cortex and 228 mL/min/100 g in the whole-kidney ROI. Bland-Altman analysis demonstrated good reproducibility, with a within-subject coefficient of variation (CVws) of 9.2% for cortical perfusion and 7.1% for whole-kidney perfusion.

Subsequently, in a study of 17 patients with CKD and 24 healthy volunteers, the effects of ascorbic acid on renal perfusion were investigated. Although no change in renal perfusion was found following ascorbic acid, ASL MRI demonstrated significant differences between those with normal renal function and participants with CKD stages 3–5, with increased cortical and whole-kidney T1, and reduced cortical and whole-kidney perfusion. Interestingly, absolute perfusion showed a weak but significant correlation with progression of kidney disease over the preceding year. Ascorbic acid was therefore shown to have a significant effect on vascular biology both in CKD and in those with normal renal function, and to reduce ADMA only in patients with CKD. ASL MRI has shown promise as a non-invasive investigation of renal function and as a biomarker to identify individuals at high risk of progressive renal impairment.
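
For concreteness, the repeatability statistics quoted above can be computed as follows; this is a minimal sketch using invented scan/rescan values and assuming two scans per volunteer, not the study data.

```python
import numpy as np

# Invented scan/rescan cortical perfusion values (mL/min/100 g) for five
# volunteers -- illustrative only, not the study data.
scan1 = np.array([310.0, 335.0, 298.0, 342.0, 320.0])
scan2 = np.array([295.0, 350.0, 310.0, 330.0, 331.0])

# With two measurements per subject, the within-subject variance is
# mean(d^2) / 2, where d is the paired difference.
d = scan1 - scan2
within_sd = np.sqrt(np.mean(d**2) / 2)

# CVws: within-subject SD expressed relative to the overall mean.
cv_ws = 100 * within_sd / np.mean(np.concatenate([scan1, scan2]))
print(f"CVws = {cv_ws:.1f}%")

# Bland-Altman quantities: bias and 95% limits of agreement.
bias, half_width = d.mean(), 1.96 * d.std(ddof=1)
print(f"bias {bias:.1f}, limits of agreement {bias - half_width:.1f} "
      f"to {bias + half_width:.1f}")
```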

Relevance:

10.00%

Publisher:

Abstract:

A smart solar photovoltaic grid system is an innovative convergence of information and communications technology (ICT) with power systems control engineering via the internet [1]. This thesis designs and demonstrates a smart solar photovoltaic grid system that is self-healing and environmentally and consumer friendly, and that can accommodate other renewable sources of energy generation seamlessly, creating a healthy, competitive energy industry and optimising the efficiency of energy assets. The thesis also presents the modelling of an efficient, dynamic smart solar photovoltaic power grid system, exploring maximum power point tracking efficiency and optimisation of the smart solar photovoltaic array through modelling and simulation, to improve the quality of design of the solar photovoltaic module. Although quite promising results have been published in the literature over the past decade, most have not addressed the research questions posed in this thesis. The Levenberg-Marquardt and sparse-based algorithms proved to be very effective tools for improving the quality of design of solar photovoltaic modules, minimising the relative errors in this work. Guided by theoretical and analytical reviews of the literature, this research chose the MATLAB/Simulink toolbox for the modelling and simulation experiments performed on the static smart solar grid system. The auto-correlation coefficient results obtained from the modelling experiments give an accuracy of 99% with negligible mean square error (MSE), root mean square error (RMSE) and standard deviation. The thesis further explores the design and implementation of a robust real-time online solar photovoltaic monitoring system, establishing a comparative study of two solar photovoltaic tracking systems which provide remote access to the harvested energy data. This research designed and implemented a novel approach to online remote-access solar photovoltaic monitoring, providing up-to-date information on the energy produced by the solar photovoltaic module at the site location. To address the challenge of online solar photovoltaic monitoring, a Darfon online data logger was systematically integrated into the design for a comparative study of the two solar photovoltaic tracking systems examined in this thesis. The site for the comparative study is the National Kaohsiung University of Applied Sciences, Taiwan, R.O.C. The overall comparative energy output efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic monitoring system at the research site is about 72%, based on the total energy produced, the estimated money saved and the amount of CO2 reduction achieved. Similarly, comparing the total energy produced by the two systems, the overall daily generated energy for the month of July shows the effectiveness of the azimuthal-altitude tracking system over the 45° stationary system: the azimuthal-altitude dual-axis tracking system was about 68.43% more efficient than the 45° stationary solar photovoltaic system. Lastly, the overall comparative hourly energy efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic energy system was found to be 74.2%.
Results from this research are promising and satisfy the research objectives and questions posed in the thesis. The new algorithms introduced in this research, and the statistical measures applied to the modelling and simulation of the smart static solar photovoltaic grid system, outperformed previous work in the reviewed literature. Based on the new design of the online data logging system for solar photovoltaic monitoring, it is possible for the first time to obtain on-site information on the energy produced remotely and online, supporting fault identification and rectification and keeping maintenance and recovery times as short as possible. The results presented in this research on the Internet of Things (IoT) for smart solar grid systems offer real-life experience both to the existing body of knowledge and to the future solar photovoltaic energy industry, irrespective of the study site location of the comparative solar photovoltaic tracking systems. While the thesis has contributed to the smart solar photovoltaic grid system, it has also highlighted areas for further research, particularly the need to further investigate improving the choice and quality of design of solar photovoltaic modules. Finally, it makes recommendations for further research on minimising the absolute or relative errors in the quality and design of the smart static solar photovoltaic module.
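
As a hedged illustration of the model-fitting step mentioned above, the sketch below fits a deliberately simplified single-diode photovoltaic model to synthetic I-V data with Levenberg-Marquardt (via SciPy); the thesis's actual module model, data and error analysis are not reproduced here, and all parameter values are invented.

```python
import numpy as np
from scipy.optimize import least_squares

# Simplified single-diode model (series and shunt resistance neglected):
#   I = Iph - I0 * (exp(V / (n * Vt)) - 1)
# Vt is the thermal voltage scaled by an assumed 36 cells in series.
VT = 0.0259 * 36  # volts, ~25 degC

def model(params, v):
    iph, i0, n = params
    return iph - i0 * (np.exp(v / (n * VT)) - 1.0)

def residuals(params, v, i_meas):
    return model(params, v) - i_meas

# Synthetic "measured" I-V points standing in for datasheet or field data.
v_meas = np.linspace(0.0, 20.5, 15)
rng = np.random.default_rng(0)
i_meas = model([8.2, 2.5e-7, 1.3], v_meas) + rng.normal(0.0, 0.02, v_meas.size)

# Levenberg-Marquardt fit of (Iph, I0, n).
fit = least_squares(residuals, x0=[8.0, 1e-7, 1.5],
                    args=(v_meas, i_meas), method="lm")
iph, i0, n = fit.x
print(f"Iph = {iph:.3f} A, I0 = {i0:.2e} A, n = {n:.3f}")

# Maximum power point from the fitted curve.
v = np.linspace(0.0, 21.0, 500)
p = v * model(fit.x, v)
print(f"MPP ~ {p.max():.1f} W at {v[p.argmax()]:.2f} V")
```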

Relevance:

10.00%

Publisher:

Abstract:

A poster of this paper will be presented at the 25th International Conference on Parallel Architecture and Compilation Technology (PACT ’16), September 11-15, 2016, Haifa, Israel.

Relevance:

10.00%

Publisher:

Abstract:

Background Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods which take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome (Rankin Scale and Barthel Index) were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes obtained from each method were compared using Friedman two-way ANOVA. Results Fifty-five comparisons (54 173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P<0.00001). The ordering of the methods showed that the ordinal method of Whitehead and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal data method on average reduced sample size by 28% (inter-quartile range 14–53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. The comparison of medians method of Payne gave the largest sample sizes. Conclusions Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, smaller by approximately 28% for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions which both improve functional outcome and increase the risk of early death, such as thrombolysis.
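
A minimal sketch of the ordinal calculation referred to above, using Whitehead's (1993) proportional-odds formula in its commonly cited form; the control-arm distribution and odds ratio below are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def whitehead_total_n(control_probs, odds_ratio, alpha=0.05, power=0.90):
    """Total sample size (1:1 allocation) for an ordinal outcome using
    Whitehead's proportional-odds formula, as commonly stated:
        N = 6 * (z_{1-a/2} + z_power)^2
            / ((ln OR)^2 * (1 - sum(pbar_i^3)))
    where pbar_i are category probabilities averaged over both arms."""
    p_ctrl = np.asarray(control_probs, dtype=float)
    # Treatment-arm probabilities under a proportional-odds shift.
    cum = np.cumsum(p_ctrl)
    cum_trt = odds_ratio * cum / (1 + (odds_ratio - 1) * cum)
    p_trt = np.diff(np.concatenate([[0.0], cum_trt]))
    pbar = (p_ctrl + p_trt) / 2
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 6 * z**2 / (np.log(odds_ratio) ** 2 * (1 - np.sum(pbar**3)))

# Hypothetical 7-category modified Rankin Scale distribution in the
# control arm, and a treatment assumed to shift the odds by OR = 1.4.
mrs_control = [0.10, 0.15, 0.15, 0.20, 0.20, 0.10, 0.10]
print(f"total N ~ {whitehead_total_n(mrs_control, 1.4):.0f}")
```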

Relevance:

10.00%

Publisher:

Abstract:

Optimisation of real-world Variable Data Printing (VDP) documents is a difficult problem because the interdependencies between layout functions may drastically reduce the number of invariant blocks that can be factored out for pre-rasterisation. This paper examines how speculative evaluation at an early stage in a document-preparation pipeline provides a generic and effective method of optimising VDP documents that contain such interdependencies. Speculative evaluation will be at its most effective in speeding up print runs if sets of layout invariances can either be discovered automatically or designed into the document at an early stage. In either case the expertise of the layout designer needs to be supplemented by expertise in exploiting potential invariances and in predicting the effects of speculative evaluation on the caches used at various stages in the print production pipeline.
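
To make the caching idea concrete, here is a small hypothetical sketch (not the paper's system) of speculatively pre-rasterising the variants a layout block may take, so that per-record processing becomes a cache hit.

```python
import hashlib

def rasterise(block_text: str) -> bytes:
    # Stand-in for an expensive RIP call; here we just hash the content.
    return hashlib.sha256(block_text.encode()).digest()

class SpeculativeCache:
    """Pre-rasterise the variants a layout block is predicted to take,
    so that per-record processing becomes a cache lookup."""

    def __init__(self):
        self._cache = {}

    def speculate(self, variants):
        # Run ahead of the print job, e.g. over all known salutations.
        for text in variants:
            self._cache[text] = rasterise(text)

    def get(self, text):
        # A mis-speculation falls back to rasterising on demand.
        if text not in self._cache:
            self._cache[text] = rasterise(text)
        return self._cache[text]

cache = SpeculativeCache()
cache.speculate(["Dear Mr Smith,", "Dear Ms Smith,", "Dear Dr Smith,"])
block = cache.get("Dear Ms Smith,")  # hit: no rasterisation cost per record
```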

Relevance:

10.00%

Publisher:

Abstract:

This paper reports on continuing research into the modelling of an order-picking process within a crossdocking distribution centre using simulation optimisation. The aim of this project is to optimise a discrete event simulation model and to understand the factors that affect its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure and the number of replications required to evaluate the optimisation objective function through simulation influence the effectiveness of the optimisation technique. We experimented with Common Random Numbers in order to improve the precision of our simulation output performance measure, and intended to use the number of replications utilised for this purpose as the initial number of replications for the optimisation of our crossdocking distribution centre simulation model. Our results demonstrate that we can improve the precision of the selected simulation output performance measure using Common Random Numbers at various levels of replication. Furthermore, after optimising the crossdocking distribution centre simulation model, we achieve optimal performance using fewer simulation runs for the model which uses Common Random Numbers compared with the model which does not.
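
The variance-reduction effect of Common Random Numbers is easy to demonstrate with a toy replication loop; the "simulation" below is a hypothetical stand-in for the order-picking model, not the authors' code. Sharing seeds across the two configurations makes their outputs positively correlated, shrinking the standard deviation of the estimated difference.

```python
import random
import statistics

def simulate_picking(service_rate: float, seed: int) -> float:
    """Toy stand-in for one replication: total picking time for 100 orders
    with exponentially distributed service times."""
    rng = random.Random(seed)
    return sum(rng.expovariate(service_rate) for _ in range(100))

def compare(n_reps: int, crn: bool):
    diffs = []
    for rep in range(n_reps):
        seed_a = rep
        seed_b = rep if crn else 10_000 + rep  # CRN: configs share seeds
        diffs.append(simulate_picking(1.0, seed_a)
                     - simulate_picking(1.1, seed_b))
    return statistics.mean(diffs), statistics.stdev(diffs)

for crn in (False, True):
    mean_d, sd_d = compare(50, crn)
    print(f"CRN={crn}: mean diff = {mean_d:.1f}, sd of diff = {sd_d:.1f}")
```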

Relevance:

10.00%

Publisher:

Abstract:

In traditional terminological resources, the description of terms is limited to certain information, such as the term itself (mainly nominal), its definition and its equivalent in a foreign language. This description rarely provides other information that can be very useful to users, especially those consulting the resource to deepen their knowledge of a specialised field, to master professional writing, or to find contexts in which the term is used. Information useful in this respect includes the description of the actantial structure of terms, contexts drawn from authentic sources, and the inclusion of other parts of speech such as verbs. Verbs and deverbal nouns, that is, predicative terminological units (PTUs), are often ignored by classical terminology, yet they are of great importance for expressing an action, a process or an event. Describing these units, however, requires a terminological description model that accounts for their particularities. A number of terminologists (Condamines 1993, Mathieu-Colas 2002, Gross and Mathieu-Colas 2001, and L'Homme 2012, 2015) have proposed description models based on different theoretical frameworks. Our research proposes a methodology for the terminological description of PTUs in Arabic, specifically Modern Standard Arabic (MSA), according to Fillmore's Frame Semantics theory (1976, 1977, 1982, 1985) and its application, the FrameNet project (Ruppenhofer et al. 2010). The specialised domain of interest is computing. We rely on a corpus collected from the web and draw on an existing terminological resource, the DiCoInfo (L'Homme 2008), to compile our own resource. Our objectives are as follows. First, we lay the foundations of an MSA version of this resource. This version has its own particularities: 1) it targets very specific units, namely verbal and deverbal PTUs; 2) the methodology developed for compiling the original DiCoInfo has to be adapted to take a Semitic language into account. Next, we create a framed version of the resource, in which PTUs are grouped into semantic frames following the FrameNet model. To this resource we add English and French PTUs, since this part of the work has a multilingual scope. The methodology consists of automatically extracting verbal and nominal terminological units (VTUs and NTUs), such as Ham~ala (حمل) (to download) and taHmiyl (تحميل) (downloading). To do so, we adapted an existing automatic term extractor, TermoStat (Drouin 2004). Then, using terminological validation criteria (L'Homme 2004), we validated the terminological status of a subset of the candidates. After validation, we created terminological records, using an XML editor, for each VTU and NTU retained. These records include elements such as the actantial structure of the PTU and up to twenty annotated contexts. The final step consists of creating semantic frames from the MSA PTUs. We also associate English and French PTUs with the frames created.
This association led to the creation of a terminological resource called "DiCoInfo: A Framed Version". In this resource, PTUs that share the same semantic properties and actantial structures are grouped into semantic frames. For example, the semantic frame Product_development groups PTUs such as Taw~ara (طور) (to develop), to develop and développer. Following these steps, we obtained a total of 106 MSA PTUs compiled in the MSA version of the DiCoInfo and 57 semantic frames associated with these units in the framed version. Our research shows that MSA can be described with the methodology we have developed.
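
As an illustration of the record-creation step, here is a minimal sketch of a terminological record for the verbal unit Ham~ala (حمل); the element names and the frame label are hypothetical placeholders, not the actual DiCoInfo schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical record layout; the actual DiCoInfo schema differs.
record = ET.Element("term_record", lang="ar", domain="computing")
ET.SubElement(record, "lemma").text = "حمل"   # Ham~ala 'to download'
ET.SubElement(record, "pos").text = "verb"
ET.SubElement(record, "frame").text = "File_transfer"  # placeholder name

actants = ET.SubElement(record, "actantial_structure")
ET.SubElement(actants, "actant", role="Agent").text = "user"
ET.SubElement(actants, "actant", role="Patient").text = "file"

ctx = ET.SubElement(record, "context", source="web corpus")
ctx.text = "..."  # one of up to twenty annotated contexts

print(ET.tostring(record, encoding="unicode"))
```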

Relevance:

10.00%

Publisher:

Abstract:

The past several years have seen the surprising and rapid rise of Bitcoin and other “cryptocurrencies.” These are decentralized peer-to-peer networks that allow users to transmit money, to compose financial instruments, and to enforce contracts between mutually distrusting peers, and that show great promise as a foundation for financial infrastructure that is more robust, efficient and equitable than ours today. However, it is difficult to reason about the security of cryptocurrencies. Bitcoin is a complex system, comprising many intricate and subtly-interacting protocol layers. At each layer it features design innovations that (prior to our work) had not undergone any rigorous analysis. Compounding the challenge, Bitcoin is but one of hundreds of competing cryptocurrencies in an ecosystem that is constantly evolving. The goal of this thesis is to formally reason about the security of cryptocurrencies, reining in their complexity and providing well-defined and justified statements of their guarantees. We provide a formal specification and construction for each layer of an abstract cryptocurrency protocol, and prove that our constructions satisfy their specifications. The contributions of this thesis are centered around two new abstractions: “scratch-off puzzles” and the “blockchain functionality” model. Scratch-off puzzles are a generalization of the Bitcoin “mining” algorithm, its most iconic and novel design feature. We show how to provide secure upgrades to a cryptocurrency by instantiating the protocol with alternative puzzle schemes. We construct secure puzzles that address important and well-known challenges facing Bitcoin today, including wasted energy and dangerous coalitions. The blockchain functionality is a general-purpose model of a cryptocurrency rooted in the “Universal Composability” cryptography theory. We use this model to express a wide range of applications, including transparent “smart contracts” (like those featured in Bitcoin and Ethereum), and also privacy-preserving applications like sealed-bid auctions. We also construct a new protocol compiler, called Hawk, which translates user-provided specifications into privacy-preserving protocols based on zero-knowledge proofs.
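
For readers unfamiliar with the algorithm being generalized: Bitcoin mining repeatedly tests nonces against a hash target, each attempt succeeding independently with tiny probability, like a scratch-off ticket. Below is a deliberately simplified sketch; real Bitcoin double-hashes a structured 80-byte header against a compact-encoded target.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 1 << 32):
    """Hash-based scratch-off puzzle in the style of Bitcoin mining:
    find a nonce so that H(header || nonce) has difficulty_bits
    leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None

nonce, digest = mine(b"example block header bytes", difficulty_bits=18)
print(f"nonce={nonce}, hash={digest}")
# Each attempt is an independent Bernoulli trial -- the "scratch-off"
# property: progress cannot be hoarded, only expected work matters.
```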

Relevance:

10.00%

Publisher:

Abstract:

Background and Purpose—Most large acute stroke trials have been neutral. Functional outcome is usually analyzed using a yes or no answer, eg, death or dependency versus independence. We assessed which statistical approaches are most efficient in analyzing outcomes from stroke trials. Methods—Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and “3 questions”. Data were analyzed using a variety of approaches which compare 2 treatment groups. The results of each statistical test for each trial were then compared. Results—Data from 55 datasets were obtained (47 trials, 54 173 patients). The test results differed substantially, such that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (χ2, ANOVA; P<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions—When analyzing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches include ordinal logistic regression, t test, and robust ranks test.
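
As an illustration of one of the efficient approaches listed above, bootstrapping the difference in mean rank can be sketched as follows; the outcome distributions here are invented, not trial data.

```python
import numpy as np
from scipy.stats import rankdata, mannwhitneyu

rng = np.random.default_rng(42)

# Hypothetical mRS scores (0 = no symptoms ... 6 = dead) for two arms.
treat = rng.choice(7, size=200, p=[.12, .16, .16, .19, .18, .09, .10])
ctrl = rng.choice(7, size=200, p=[.08, .13, .15, .20, .21, .11, .12])

def mean_rank_diff(a, b):
    ranks = rankdata(np.concatenate([a, b]))  # mid-ranks handle ties
    return ranks[:len(a)].mean() - ranks[len(a):].mean()

obs = mean_rank_diff(treat, ctrl)

# Bootstrap the difference in mean rank.
boot = np.array([
    mean_rank_diff(rng.choice(treat, treat.size), rng.choice(ctrl, ctrl.size))
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean rank difference {obs:.1f} (95% CI {lo:.1f} to {hi:.1f})")

# Rank-based alternative to dichotomising the scale:
print(mannwhitneyu(treat, ctrl, alternative="two-sided"))
```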

Relevance:

10.00%

Publisher:

Abstract:

In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They save time and help avoid errors during part programming, and they permit code re-use. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
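
The flavour of compiling descriptive variables and flow control down to elementary G-code can be illustrated with a toy translator; this sketch is hypothetical and far simpler than the actual EGCL grammar and compiler.

```python
# Toy translator in the spirit of EGCL: named parameters and a while loop
# expanded at "compile time" into elementary ISO G-code moves.
def square_passes(side_mm: float, step_mm: float, feed: float) -> list[str]:
    gcode = ["G21 ; millimetres",
             "G90 ; absolute coordinates",
             "G00 X0 Y0 ; rapid to origin"]
    size = side_mm
    while size > 0:  # flow control resolved before the machine sees it
        gcode += [f"G01 X{size:.3f} Y0 F{feed:.0f}",
                  f"G01 X{size:.3f} Y{size:.3f}",
                  f"G01 X0 Y{size:.3f}",
                  "G01 X0 Y0"]
        size -= step_mm  # descriptive names instead of magic numbers
    return gcode

print("\n".join(square_passes(side_mm=40, step_mm=10, feed=300)))
```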

Relevance:

10.00%

Publisher:

Abstract:

In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work from the authors proposed lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, this work gathers together a number of proposals designed to improve those mechanisms with high impact on performance. Firstly, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Secondly, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions, and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones are more important for a given application. This work includes a discussion of the analysis to be done in order to choose the best configuration.
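
Conflict detection over read/write sets, as discussed above, reduces to set intersection; below is a minimal software sketch of the idea (the actual proposal is a hardware mechanism tracking GPU local memory addresses, not Python objects).

```python
# Value-agnostic conflict detection over read/write sets.
class Txn:
    def __init__(self, name: str):
        self.name = name
        self.read_set: set[int] = set()
        self.write_set: set[int] = set()

    def read(self, addr: int): self.read_set.add(addr)
    def write(self, addr: int): self.write_set.add(addr)

def conflicts(a: Txn, b: Txn) -> bool:
    # Two transactions conflict if either one writes an address
    # the other has read or written (W-W, W-R, R-W).
    return bool(a.write_set & (b.read_set | b.write_set)
                or b.write_set & a.read_set)

t1, t2 = Txn("t1"), Txn("t2")
t1.read(0x10); t1.write(0x20)
t2.read(0x20)               # reads t1's written address -> conflict
print(conflicts(t1, t2))    # True: one transaction must abort or serialize

# Read-only transactions never conflict with each other, which is why
# detecting them (as noted above) lets the runtime skip serialization.
```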

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: To transform data from a research setting into a format that could be used to support strategies encouraging healthy lifestyle choices and service planning within local government. METHODS: Details of the health status and lifestyle behaviours of the population of Geelong, Victoria, were generated independently by the Geelong Osteoporosis Study (GOS), a prospective population-based cohort study. Recent GOS follow-up phases provided evidence about patterns of unhealthy diet, physical inactivity, smoking and harmful alcohol use. These factors are well-recognised modifiable risk factors for chronic disease; the dataset was complemented with prevalence estimates for musculoskeletal disease, obesity, diabetes, cardiovascular disease, asthma and cancer. RESULTS: Data were provided to Healthy Together Geelong in aggregate form according to age, sex and suburb. A population statistics company used the data to project health outcomes by suburb for use by the local council. This data exchange served as a conduit between epidemiological research and policy development. CONCLUSION AND IMPLICATIONS: Regional policy makers were informed by local evidence, rather than national or state health surveys, thereby optimising potential intervention strategies.
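
A minimal sketch of the kind of aggregation described above, with record-level rows rolled up by age band, sex and suburb before release; the rows and variables are invented, not GOS data.

```python
import pandas as pd

# Illustrative record-level rows standing in for cohort participant data.
df = pd.DataFrame({
    "suburb":   ["Belmont", "Belmont", "Corio", "Corio", "Corio"],
    "sex":      ["F", "M", "F", "F", "M"],
    "age_band": ["50-59", "60-69", "50-59", "60-69", "60-69"],
    "smoker":   [0, 1, 0, 1, 1],
    "diabetes": [0, 0, 1, 0, 1],
})

# Aggregate to the level released to local government: counts and
# prevalence by age band, sex and suburb, never individual records.
agg = (df.groupby(["suburb", "sex", "age_band"])
         .agg(n=("smoker", "size"),
              smoking_prev=("smoker", "mean"),
              diabetes_prev=("diabetes", "mean"))
         .reset_index())
print(agg)
```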

Relevance:

10.00%

Publisher:

Abstract:

Al and Mg machining chip blends were compacted by equal-channel angular pressing with back pressure. By varying the weight fraction of the constituent materials, the temperature and the processing route, as well as employing subsequent heat treatment, the microstructure and the mechanical properties of the compact were varied. The width of the interdiffusion zone and the formation of intermetallic phases near the interfaces between the two metals were studied by energy-dispersive X-ray spectroscopy and nanoindentation. It was shown that substantial improvements in mechanical properties, such as increases in strength, strain-hardening capability and ductility, can be obtained. This is achieved by changing the processing parameters of equal-channel angular pressing and the annealing temperature, as well as by optimising the weight fraction of the constituent metals. © 2013 Springer Science+Business Media New York.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES: To (1) quantify levels of subjective health literacy in people with long-term health conditions (diabetes, cardiovascular disease, chronic obstructive pulmonary disease, musculoskeletal disorders, cancer and mental disorders) and compare these to levels in the general population and (2) examine the association between health literacy, socioeconomic characteristics and comorbidity in each long-term condition group.

DESIGN: Population-based survey in the Central Denmark Region (n=29,473).

MAIN OUTCOME MEASURES: Health literacy was measured using two scales from the Health Literacy Questionnaire (HLQ): (1) Ability to understand health information and (2) Ability to actively engage with healthcare providers.

RESULTS: People with long-term conditions reported more difficulties than the general population in understanding health information and actively engaging with healthcare providers. Wide variation was found between disease groups, with people with cancer having fewer difficulties and people with mental health disorders having more difficulties in actively engaging with healthcare providers than other long-term condition groups. Having more than one long-term condition was associated with more difficulty in engaging with healthcare providers and understanding health information. People with low levels of education had lower health literacy than people with high levels of education.

CONCLUSIONS: Compared with the general population, people with long-term conditions report more difficulties in understanding health information and engaging with healthcare providers. These two dimensions are critical to the provision of patient-centred healthcare and for optimising health outcomes. More effort should be made to respond to the health literacy needs among individuals with long-term conditions, multiple comorbidities and low education levels, to improve health outcomes and to reduce social inequality in health.

Relevance:

10.00%

Publisher:

Abstract:

The overarching aim of this thesis was to develop an intervention to support patient-centred prescribing in the context of multimorbidity in primary care. Methods A range of research methods were used to address different components of the Medical Research Council, UK (MRC) guidance on the development and evaluation of complex interventions in health care. The existing evidence on GPs’ perceptions of the management of multimorbidity was systematically reviewed. In qualitative interviews, chart-stimulated recall was used to explore the challenges experienced by GPs when prescribing for multimorbid patients. In a cross-sectional study, the psychosocial issues that complicate the management of multimorbidity were examined. To develop the complex intervention, the Behaviour Change Wheel (BCW) was used to integrate behavioural theory with the findings of these three studies. A feasibility study of the new intervention was then conducted with GPs. Results The systematic review revealed four domains of clinical practice where GPs experienced difficulties in multimorbidity. The qualitative interview study showed that GPs responded to these difficulties by ‘satisficing’. In multimorbid patients perceived as stable, GPs preferred to ‘maintain the status quo’ rather than actively change medications. In the cross-sectional study, the significant association between multimorbidity and negative psychosocial factors was shown. These findings informed the development of the ‘Multimorbidity Collaborative Medication Review and Decision-making’ (MY COMRADE) intervention. The intervention involves peer support: two GPs review the medications prescribed to a complex multimorbid patient together. In the feasibility study, GPs reported that the intervention was appropriate for the context of general practice; was widely applicable to their patients with multimorbidity; and recommendations for optimising medications arose from all collaborative reviews. Conclusion Applying theory to empirical data has led to an intervention that is implementable in clinical practice, and has the potential to positively change GPs’ behaviour in the management of medications for patients with multimorbidity.