354 results for optimising compilers


Relevance: 10.00%

Publisher:

Abstract:

This paper reports on continuing research into the modelling of an order picking process within a Crossdocking distribution centre using Simulation Optimisation. The aim of this project is to optimise a discrete event simulation model and to understand the factors that affect finding its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure and the number of replications required to evaluate the optimisation objective function through simulation influence the ability of the optimisation technique. We experimented with Common Random Numbers in order to improve the precision of our simulation output performance measure, and intended to use the number of replications utilised for this purpose as the initial number of replications for the optimisation of our Crossdocking distribution centre simulation model. Our results demonstrate that we can improve the precision of the selected simulation output performance measure using Common Random Numbers at various levels of replication. Furthermore, after optimising the Crossdocking distribution centre simulation model, we are able to achieve optimal performance using fewer simulation runs for the model that uses Common Random Numbers than for the model that does not.
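
As a rough illustration of the variance-reduction idea behind Common Random Numbers, the minimal Python sketch below compares two configurations of a toy order-picking simulation using shared versus independent random streams; the model, parameters and picker counts are illustrative assumptions, not the Crossdocking model studied in the paper.

```python
import random
import statistics

def picking_time(num_pickers, seed):
    """Toy order-picking simulation: time to clear 200 orders (illustrative only)."""
    rng = random.Random(seed)
    backlog, time = 200, 0.0
    while backlog > 0:
        time += rng.expovariate(num_pickers / 4.0)  # service rate scales with pickers
        backlog -= 1
    return time

def diff_variance(paired_seeds):
    """Variance of the estimated performance difference between 5 and 6 pickers."""
    diffs = [picking_time(5, s1) - picking_time(6, s2) for s1, s2 in paired_seeds]
    return statistics.variance(diffs)

seeds = list(range(30))
crn = diff_variance([(s, s) for s in seeds])                 # Common Random Numbers: shared streams
independent = diff_variance([(s, s + 1000) for s in seeds])  # independent streams
print(f"Var(diff) with CRN: {crn:.1f}   without CRN: {independent:.1f}")
```

Because both configurations see the same random demands under CRN, the noise largely cancels when the two outputs are differenced, which is why fewer replications (and hence fewer optimisation runs) are needed to resolve the same difference.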

Relevance: 10.00%

Publisher:

Abstract:

Background and Purpose—Most large acute stroke trials have been neutral. Functional outcome is usually analyzed using a yes or no answer, eg, death or dependency versus independence. We assessed which statistical approaches are most efficient in analyzing outcomes from stroke trials. Methods—Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and “3 questions”. Data were analyzed using a variety of approaches which compare 2 treatment groups. The results for each statistical test for each trial were then compared. Results—Data from 55 datasets were obtained (47 trials, 54 173 patients). The test results differed substantially so that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (χ²; ANOVA, P<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions—When analyzing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches included ordinal logistic regression, t test, and robust ranks test.
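
The contrast drawn above can be sketched in a few lines of Python on synthetic modified Rankin Scale data: a chi-square test on the dichotomised outcome is compared with a t test and a rank-based test on the full ordered scale. The data and the Mann-Whitney test (standing in for a robust ranks test) are illustrative assumptions, not the trial datasets analysed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic mRS scores (0 = no symptoms ... 6 = dead) for two arms;
# the treatment arm is shifted slightly towards better outcomes (illustrative effect size).
control = rng.choice(np.arange(7), size=300, p=[0.08, 0.12, 0.15, 0.20, 0.20, 0.15, 0.10])
treated = rng.choice(np.arange(7), size=300, p=[0.12, 0.15, 0.17, 0.20, 0.17, 0.11, 0.08])

# Dichotomised analysis: independence (mRS 0-2) versus dependence/death (mRS 3-6)
table = [[np.sum(control <= 2), np.sum(control > 2)],
         [np.sum(treated <= 2), np.sum(treated > 2)]]
chi2, p_dich, _, _ = stats.chi2_contingency(table)

# Analyses that keep the full ordered scale
_, p_t = stats.ttest_ind(treated, control)
_, p_rank = stats.mannwhitneyu(treated, control, alternative="two-sided")

print(f"dichotomised chi-square p={p_dich:.3f}   t test p={p_t:.3f}   rank test p={p_rank:.3f}")
```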

Relevance: 10.00%

Publisher:

Abstract:

In the context of computer numerical control (CNC) and computer aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are especially important: they save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and improving portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
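
The Python sketch below is a hypothetical illustration of the kind of expansion such a compiler performs: a parametric loop with named variables is flattened into elementary, ISO-compliant G-code blocks. The function, its parameters and the toy drilling pattern are assumptions for illustration and do not reflect EGCL's actual grammar or code generator.

```python
def compile_hole_row(num_holes, spacing_mm, depth_mm, safe_z=5.0, feed=100):
    """Expand a parametric drilling loop into elementary ISO G-code (illustrative only)."""
    gcode = ["G21 ; millimetre units", "G90 ; absolute positioning"]
    n = 0
    while n < num_holes:                      # flow control exists only at compile time
        x = n * spacing_mm                    # symbolic variable resolved to a literal
        gcode.append(f"G0 X{x:.3f} Y0.000 Z{safe_z:.3f}")
        gcode.append(f"G1 Z{-depth_mm:.3f} F{feed}")
        gcode.append(f"G0 Z{safe_z:.3f}")
        n += 1
    gcode.append("M30 ; end of program")
    return "\n".join(gcode)

print(compile_hole_row(num_holes=3, spacing_mm=12.5, depth_mm=4.0))
```

Because the emitted blocks use only elementary commands (G0, G1, G21, G90, M30), the output stays portable across CNC controllers, which is the portability argument made above.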

Relevance: 10.00%

Publisher:

Abstract:

In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work by the authors proposed lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, in this work we gather together a number of proposals designed to improve those mechanisms with high impact on performance. First, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Second, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones are more important for a given application. This work includes a discussion of the analysis to be done in order to choose the best configuration.
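
As an abstract picture of conflict detection over read/write sets, the Python sketch below models transactions as sets of local-memory addresses and serialises only those that conflict, while read-only transactions commit in parallel. This is a conceptual model under assumed semantics, not the authors' hardware mechanism.

```python
class Txn:
    """Abstract transaction record: read/write sets over local-memory addresses."""
    def __init__(self, name, reads, writes):
        self.name, self.reads, self.writes = name, set(reads), set(writes)

def conflicts(a, b):
    """Two transactions conflict if one writes an address the other reads or writes."""
    return bool(a.writes & (b.reads | b.writes)) or bool(b.writes & (a.reads | a.writes))

def schedule(txns):
    """Let non-conflicting transactions commit together; flag the rest for serialisation."""
    parallel, serial = [], []
    for t in txns:
        if any(conflicts(t, c) for c in parallel):
            serial.append(t)      # would be retried/serialised by the SIMT scheduler
        else:
            parallel.append(t)
    return parallel, serial

t0 = Txn("t0", reads={0x10}, writes={0x20})
t1 = Txn("t1", reads={0x20}, writes={0x30})   # reads what t0 writes -> conflict
t2 = Txn("t2", reads={0x40}, writes=set())    # read-only -> never forces serialisation
par, ser = schedule([t0, t1, t2])
print([t.name for t in par], [t.name for t in ser])
```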

Relevance: 10.00%

Publisher:

Abstract:

The overarching aim of this thesis was to develop an intervention to support patient-centred prescribing in the context of multimorbidity in primary care. Methods: A range of research methods were used to address different components of the Medical Research Council, UK (MRC) guidance on the development and evaluation of complex interventions in health care. The existing evidence on GPs’ perceptions of the management of multimorbidity was systematically reviewed. In qualitative interviews, chart-stimulated recall was used to explore the challenges experienced by GPs when prescribing for multimorbid patients. In a cross-sectional study, the psychosocial issues that complicate the management of multimorbidity were examined. To develop the complex intervention, the Behaviour Change Wheel (BCW) was used to integrate behavioural theory with the findings of these three studies. A feasibility study of the new intervention was then conducted with GPs. Results: The systematic review revealed four domains of clinical practice where GPs experienced difficulties in multimorbidity. The qualitative interview study showed that GPs responded to these difficulties by ‘satisficing’. In multimorbid patients perceived as stable, GPs preferred to ‘maintain the status quo’ rather than actively change medications. The cross-sectional study showed a significant association between multimorbidity and negative psychosocial factors. These findings informed the development of the ‘Multimorbidity Collaborative Medication Review and Decision-making’ (MY COMRADE) intervention. The intervention involves peer support: two GPs together review the medications prescribed to a complex multimorbid patient. In the feasibility study, GPs reported that the intervention was appropriate for the context of general practice and widely applicable to their patients with multimorbidity, and recommendations for optimising medications arose from all collaborative reviews. Conclusion: Applying theory to empirical data has led to an intervention that is implementable in clinical practice and has the potential to positively change GPs’ behaviour in the management of medications for patients with multimorbidity.

Relevance: 10.00%

Publisher:

Abstract:

Hydrogen is considered an appealing alternative to fossil fuels in the pursuit of sustainable, secure and prosperous growth in the UK and abroad. However, there remains a persistent bottleneck in the effective storage of hydrogen for mobile applications, which is needed to facilitate wide implementation of hydrogen fuel cells in the fossil-fuel-dependent transportation industry. To address this issue, new means of solid state chemical hydrogen storage are proposed in this thesis, involving the coupling of LiH with three different organic amines: melamine, urea and dicyandiamide. In principle, thermodynamically favourable hydrogen release from these systems proceeds via the deprotonation of the protic N-H moieties by the hydridic metal hydride. At the same time, hydrogen kinetics are expected to be enhanced relative to heavier hydrides by incorporating lithium ions in the proposed binary hydrogen storage systems. Whilst the concept has been successfully demonstrated by the results obtained in this work, it was observed that optimising the ball milling conditions is central to promoting hydrogen desorption in the proposed systems. The theoretical amount of 6.97 wt% of hydrogen by dry mass was released when heating a ball milled mixture of LiH and melamine (6:1 stoichiometry) to 320 °C. It was observed that ball milling introduces a disruption in the intermolecular hydrogen bonding network that exists in pristine melamine. This effect extends to a molecular-level electron redistribution, observed as shifts in the IR bands. It was postulated that stable phases containing the triazine skeleton form during the first stages of dehydrogenation. Dehydrogenation of this system yields a solid product, Li2NCN, which has been rehydrogenated back to melamine via hydrolysis under weakly acidic conditions. On the other hand, the LiH and urea system (4:1 stoichiometry) desorbed approximately 5.8 wt% of hydrogen, from the theoretical capacity of 8.78 wt% (dry mass), by 270 °C, accompanied by undesirable ammonia release and trace amounts of water. The thermal dehydrogenation proceeds via the formation of Li(HN(CO)NH2) at 104.5 °C, which then decomposes to LiOCN and unidentified phases containing C-N moieties by 230 °C. The final products are Li2NCN and Li2O (270 °C), with LiCN and Li2CO3 also detected under certain conditions. It was observed that ball milling can effectively suppress ammonia formation. Furthermore, results obtained from energetic ball milling experiments indicate that the barrier to full dehydrogenation between LiH and urea is principally kinetic. Finally, the dehydrogenation reaction in the LiH and dicyandiamide system (4:1 stoichiometry) occurs through two distinct pathways depending on the ball milling conditions. When ball milled at 450 RPM for 1 h, dehydrogenation proceeds alongside dicyandiamide condensation by 400 °C, whilst at a slower milling speed of 400 RPM for 6 h, decomposition occurs via a rapid gas desorption (H2 and NH3) at 85 °C accompanied by sample foaming. The reactant dicyandiamide can be regenerated from the product Li2NCN via hydrolysis.
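
The quoted theoretical capacities can be cross-checked with a short Python calculation, assuming overall reactions of 6 LiH + C3H6N6 → 3 Li2NCN + 6 H2 and 4 LiH + CO(NH2)2 → Li2NCN + Li2O + 4 H2; these balanced equations are an assumption, but they are consistent with the stoichiometries, products and capacities reported above.

```python
# Molar masses in g/mol
M = {"H": 1.008, "Li": 6.941, "C": 12.011, "N": 14.007, "O": 15.999}

def molar_mass(formula):
    """formula given as a dict of element -> atom count."""
    return sum(M[el] * n for el, n in formula.items())

LiH      = molar_mass({"Li": 1, "H": 1})
melamine = molar_mass({"C": 3, "H": 6, "N": 6})          # C3H6N6
urea     = molar_mass({"C": 1, "H": 4, "N": 2, "O": 1})  # CO(NH2)2
H2       = molar_mass({"H": 2})

# 6 LiH + C3H6N6 -> 3 Li2NCN + 6 H2  (assumed overall reaction)
wt_melamine = 6 * H2 / (6 * LiH + melamine) * 100
# 4 LiH + CO(NH2)2 -> Li2NCN + Li2O + 4 H2  (assumed overall reaction)
wt_urea = 4 * H2 / (4 * LiH + urea) * 100

print(f"LiH/melamine (6:1): {wt_melamine:.2f} wt% H2")   # ~6.97 wt%
print(f"LiH/urea     (4:1): {wt_urea:.2f} wt% H2")       # ~8.78 wt%
```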

Relevance: 10.00%

Publisher:

Abstract:

Sandy coasts represent vital areas whose preservation and maintenance also involve economic and tourist interests. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly realised by combining engineering solutions with management policies that evaluate their effects over time. Monitoring activities are the fundamental instrument for obtaining a deep knowledge of the investigated phenomenon. Thanks to technological development, several possibilities are available, both in terms of geomatic surveying techniques and processing tools, allowing high performance and accuracy to be reached. Nevertheless, when the littoral includes both the emerged and the submerged beach, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated for the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study provides an evaluation of the available geomatic techniques, processing approaches and derived products, aiming at optimising the entire coastal monitoring workflow by adopting an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an added value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for properly planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool. Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (Regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.

Relevance: 10.00%

Publisher:

Abstract:

Cleaning is one of the most important and delicate procedures in the restoration process. When developing new cleaning systems, it is fundamental to consider their selectivity towards the layer to be removed, non-invasiveness towards the layer to be preserved, sustainability and non-toxicity. Besides assessing efficacy, it is important to understand the cleaning mechanism through analytical protocols that strike a balance between cost, practicality and reliable interpretation of results. In this thesis, the development of cleaning systems based on the coupling of electrospun fabrics (ES) with greener organic solvents is proposed. Electrospinning is a versatile technique that allows the production of micro/nanostructured non-woven mats, which have already been used as absorbents in various scientific fields but, to date, not in restoration. The systems produced proved to be effective for the removal of dammar varnish from paintings, where the ES mats act not only as solvent-binding agents but also as adsorbents of the partially solubilised varnish through capillary rise, thus enabling a one-step procedure. They have also been successfully applied to the removal of spray varnish from marble substrates and wall paintings. Owing to the complexity of these materials, the procedure had to be adapted case by case, and mechanical action was still necessary. Depending on the spinning solution, three types of ES mats were produced: polyamide 6,6, pullulan, and pullulan with melanin nanoparticles. The latter, under irradiation, allows for a localised temperature increase, accelerating and facilitating the removal of less soluble layers (e.g. reticulated alkyd-based paints). All the systems produced, and the mock-ups used, were extensively characterised using multi-analytical protocols. Finally, a monitoring protocol and image-treatment procedure based on photoluminescence macro-imaging is proposed. This set-up allowed the removal mechanism of dammar varnish to be studied and its residues to be semi-quantified. These initial results form the basis for optimising the acquisition set-up and data processing.

Relevance: 10.00%

Publisher:

Abstract:

Numerous types of acute respiratory failure are routinely treated using non-invasive ventilatory support (NIV). Its efficacy is well documented: NIV lowers intubation and death rates in various respiratory disorders. It can be delivered by means of face masks or head helmets. Currently, the scientific community’s interest in NIV helmets is mostly focused on optimising the mixing of CO2 and clean air and on improving patient comfort. To this end, fluid dynamic analysis plays a particularly important role, and a two-pronged approach is frequently employed. While numerical simulations provide information about the entire flow field and allow different geometries to be explored, they require large temporal and computational resources. Experiments, on the other hand, help to validate simulations and provide results with a much smaller time investment, and thus remain at the core of research in fluid dynamics. The aim of this thesis work was to develop a flow bench and to utilise it for the analysis of NIV helmets. A flow test bench and an instrumented mannequin were successfully designed, produced and put into use. Experiments were performed to characterise the helmet interface in terms of pressure drop and flow rate drop over different inlet flow rates and outlet pressure set points. Velocity measurements by means of Particle Image Velocimetry (PIV) were also performed. The pressure drop and flow rate characteristics from the experiments were compared with CFD data, and sufficient agreement was observed between the numerical and experimental results. The PIV studies permitted qualitative and quantitative comparisons with numerical simulation data and offered a clear picture of the internal flow behaviour, aiding the identification of coherent flow features.
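
At the heart of PIV processing is the cross-correlation of interrogation windows between two frames: the displacement of the particle pattern is recovered from the location of the correlation peak. The compact Python sketch below shows this step on a synthetic image with a known shift; the window size and particle pattern are illustrative assumptions, not the set-up used in this work.

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Return the (dy, dx) shift that best aligns win_b with win_a,
    via the peak of their FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed displacements (correlation is periodic)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
frame = rng.random((64, 64))                           # synthetic particle image
shifted = np.roll(frame, shift=(3, -2), axis=(0, 1))   # imposed, known displacement
print(piv_displacement(frame, shifted))                # expected: (3, -2)
```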