583 results for optimising compiler
Abstract:
Optimisation of real-world Variable Data Printing (VDP) documents is a difficult problem because the interdependencies between layout functions may drastically reduce the number of invariant blocks that can be factored out for pre-rasterisation. This paper examines how speculative evaluation at an early stage in a document-preparation pipeline provides a generic and effective method of optimising VDP documents that contain such interdependencies. Speculative evaluation will be at its most effective in speeding up print runs if sets of layout invariances can either be discovered automatically or designed into the document at an early stage. In either case, the expertise of the layout designer needs to be supplemented by expertise in exploiting potential invariances and in predicting the effects of speculative evaluation on the caches used at various stages of the print-production pipeline.
Abstract:
This paper reports on continuing research into the modelling of an order-picking process within a Crossdocking distribution centre using Simulation Optimisation. The aim of this project is to optimise a discrete event simulation model and to understand the factors that affect finding its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure and the number of replications required for evaluating the optimisation objective function through simulation influence the effectiveness of the optimisation technique. We experimented with Common Random Numbers in order to improve the precision of our simulation output performance measure, and intended to use the number of replications required for this purpose as the initial number of replications for optimising our Crossdocking distribution centre simulation model. Our results demonstrate that Common Random Numbers improve the precision of the selected simulation output performance measure at various numbers of replications. Furthermore, after optimising the Crossdocking distribution centre simulation model, we achieve optimal performance with fewer simulation runs when the model uses Common Random Numbers than when it does not.
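The variance-reduction effect of Common Random Numbers can be sketched in pure Python (a hypothetical exponential service-time model, not the authors' Crossdocking simulation): feeding both configurations the same uniform stream makes their outputs positively correlated, which shrinks the variance of the estimated difference between them:

```python
import math
import random
import statistics

def total_service_time(uniforms, rate):
    # Inverse-transform sampling: exponential service times at the given rate.
    return sum(-math.log(1.0 - u) / rate for u in uniforms)

def diff_estimates(n_reps, common):
    # Per-replication estimate of config A (rate 1.0) minus config B (rate 1.2).
    diffs = []
    for rep in range(n_reps):
        rng_a = random.Random(rep)
        rng_b = random.Random(rep) if common else random.Random(10_000 + rep)
        us_a = [rng_a.random() for _ in range(100)]
        us_b = [rng_b.random() for _ in range(100)]
        diffs.append(total_service_time(us_a, 1.0) - total_service_time(us_b, 1.2))
    return diffs

crn = diff_estimates(200, common=True)
ind = diff_estimates(200, common=False)
# With CRN both configurations face the same demand stream, so the
# difference estimator has far smaller variance.
print(statistics.stdev(crn) < statistics.stdev(ind))  # True
```

The smaller variance is what lets an optimiser rank configurations reliably with fewer replications, mirroring the paper's finding.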
Abstract:
The description of terms in traditional terminological resources is limited to certain information, such as the term (mainly nominal), its definition and its equivalent in a foreign language. This description rarely provides other information that can be very useful to users, especially if they consult the resources in order to deepen their knowledge of a specialised domain, master professional writing, or find contexts in which the term in question is realised. Information that can be useful in this respect includes the description of the actantial structure of terms, contexts drawn from authentic sources, and the inclusion of other parts of speech, such as verbs. Verbs and deverbal nouns, or predicative terminological units (UTP), often ignored by classical terminology, are of great importance when it comes to expressing an action, a process or an event. However, describing these units requires a terminological description model that accounts for their particularities. A number of terminologists (Condamines 1993, Mathieu-Colas 2002, Gross and Mathieu-Colas 2001, and L'Homme 2012, 2015) have indeed proposed description models based on different theoretical frameworks. Our research consists in proposing a methodology for the terminological description of UTP in Arabic, specifically Modern Standard Arabic (ASM), according to the Frame Semantics theory of Fillmore (1976, 1977, 1982, 1985) and its application, the FrameNet project (Ruppenhofer et al. 2010). The specialised domain of interest is computing. In our research, we rely on a corpus collected from the web and draw on an existing terminological resource, the DiCoInfo (L'Homme 2008), to compile our own resource. Our objectives can be summarised as follows.
First, we wish to lay the foundations of a Modern Standard Arabic (ASM) version of this resource. This version has its own particularities: 1) we target quite specific units, namely verbal and deverbal predicative terminological units (UTP); 2) the methodology developed for compiling the original DiCoInfo must be adapted to take a Semitic language into account. Next, we wish to create a framed version of this resource, in which the UTP are grouped into semantic frames, drawing on the FrameNet model. To this resource we add English and French UTP, since this part of the work has a multilingual scope. The methodology consists in automatically extracting verbal and nominal terminological units (UTV and UTN), such as Ham~ala (حمل) (to download) and taHmiyl (تحميل) (downloading). To do so, we adapted an existing automatic term extractor, TermoStat (Drouin 2004). Then, using terminological validation criteria (L'Homme 2004), we validate the terminological status of a subset of the candidates. After validation, we create terminological records, using an XML editor, for each retained UTV and UTN. These records include elements such as the actantial structure of the UTP and up to twenty annotated contexts. The final step consists in creating semantic frames from the ASM UTP. We also associate English and French UTP with the frames thus created. This association led to the creation of a terminological resource called "DiCoInfo: A Framed Version". In this resource, UTP that share the same semantic properties and actantial structures are grouped into semantic frames. For example, the semantic frame Product_development groups UTP such as Taw~ara (طور) (to develop), to develop and développer.
Following these steps, we obtained a total of 106 ASM UTP compiled in the ASM version of the DiCoInfo and 57 semantic frames associated with these units in the framed version of the DiCoInfo. Our research shows that ASM can be described with the methodology we have developed.
Abstract:
The past several years have seen the surprising and rapid rise of Bitcoin and other “cryptocurrencies.” These are decentralized peer-to-peer networks that allow users to transmit money, to compose financial instruments, and to enforce contracts between mutually distrusting peers, and that show great promise as a foundation for financial infrastructure that is more robust, efficient and equitable than ours today. However, it is difficult to reason about the security of cryptocurrencies. Bitcoin is a complex system, comprising many intricate and subtly-interacting protocol layers. At each layer it features design innovations that (prior to our work) have not undergone any rigorous analysis. Compounding the challenge, Bitcoin is but one of hundreds of competing cryptocurrencies in an ecosystem that is constantly evolving. The goal of this thesis is to formally reason about the security of cryptocurrencies, reining in their complexity, and providing well-defined and justified statements of their guarantees. We provide a formal specification and construction for each layer of an abstract cryptocurrency protocol, and prove that our constructions satisfy their specifications. The contributions of this thesis are centered around two new abstractions: “scratch-off puzzles,” and the “blockchain functionality” model. Scratch-off puzzles are a generalization of the Bitcoin “mining” algorithm, its most iconic and novel design feature. We show how to provide secure upgrades to a cryptocurrency by instantiating the protocol with alternative puzzle schemes. We construct secure puzzles that address important and well-known challenges facing Bitcoin today, including wasted energy and dangerous coalitions. The blockchain functionality is a general-purpose model of a cryptocurrency rooted in the “Universal Composability” cryptography theory.
We use this model to express a wide range of applications, including transparent “smart contracts” (like those featured in Bitcoin and Ethereum), and also privacy-preserving applications like sealed-bid auctions. We also construct a new protocol compiler, called Hawk, which translates user-provided specifications into privacy-preserving protocols based on zero-knowledge proofs.
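A hedged sketch of the core idea behind scratch-off puzzles, using the plain Bitcoin-style hash puzzle that they generalise (toy parameters, not one of the thesis's constructions): each nonce is an independent "scratch", and a solution is any attempt whose hash falls below a difficulty target:

```python
import hashlib

def scratch(puzzle_id: bytes, nonce: int) -> int:
    # One independent "scratch": a hash attempt whose value is
    # effectively uniform over [0, 2^256).
    h = hashlib.sha256(puzzle_id + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big")

def solve(puzzle_id: bytes, difficulty_bits: int, max_tries: int = 1_000_000):
    # A solution is any nonce whose hash falls below the target; expected
    # work is about 2**difficulty_bits scratches.
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_tries):
        if scratch(puzzle_id, nonce) < target:
            return nonce
    return None

nonce = solve(b"block-header", difficulty_bits=12)
print(nonce is not None)  # True: a winning nonce is found with overwhelming probability
```

The "scratch-off" framing captures that attempts are independent and progress-free, which is what lets the thesis swap in alternative puzzle schemes.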
Abstract:
Background and Purpose—Most large acute stroke trials have been neutral. Functional outcome is usually analyzed using a yes-or-no answer, eg, death or dependency versus independence. We assessed which statistical approaches are most efficient in analyzing outcomes from stroke trials. Methods—Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included the modified Rankin Scale, Barthel Index, and “3 questions”. Data were analyzed using a variety of approaches which compare 2 treatment groups. The results of each statistical test for each trial were then compared. Results—Data from 55 datasets were obtained (47 trials, 54 173 patients). The test results differed substantially, such that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (χ2; ANOVA, P<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions—When analyzing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches include ordinal logistic regression, t test, and robust ranks test.
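The information loss from dichotomisation can be seen in a small invented example (hypothetical modified Rankin Scale scores, not trial data): a treatment that shifts patients within the "dependent" range is invisible to a yes/no endpoint but visible to the mid-ranks that underpin rank-based tests:

```python
# Hypothetical mRS (0-6) outcomes for two trial arms; lower is better.
control   = [1, 2, 3, 4, 5, 5, 5, 6]
treatment = [1, 2, 3, 4, 4, 4, 5, 6]  # shifts two patients from mRS 5 to 4

def proportion_independent(scores):
    # Dichotomized endpoint: mRS 0-2 counts as "independent".
    return sum(s <= 2 for s in scores) / len(scores)

def mean_rank(group, other):
    # Mid-ranks of `group` within the pooled sample (basis of rank tests).
    pooled = sorted(group + other)
    midrank = {}
    for v in set(pooled):
        positions = [i + 1 for i, x in enumerate(pooled) if x == v]
        midrank[v] = sum(positions) / len(positions)
    return sum(midrank[s] for s in group) / len(group)

# Dichotomization sees identical outcomes in both arms...
print(proportion_independent(control), proportion_independent(treatment))  # 0.25 0.25
# ...while ranks detect the shift toward better outcomes under treatment.
print(mean_rank(treatment, control), mean_rank(control, treatment))  # 8.0 9.0
```

A rank-based test built on these mid-ranks uses the whole ordered scale, which is why the paper finds such approaches more efficient than collapsing to 2 groups.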
Abstract:
In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance: they save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit reuse of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
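A toy illustration of the compilation idea (the real EGCL grammar and compiler are far richer; the parameter names and expansion here are invented): descriptive parameters and a compile-time loop expand into elementary ISO G-code moves:

```python
# Hypothetical miniature of EGCL-style compilation: named parameters and a
# bounded loop are expanded into elementary, machine-portable G-code lines.
def compile_egcl(depth_step, n_passes, feed):
    gcode = ["G90"]                       # absolute positioning
    z = 0.0
    for _ in range(n_passes):             # while-loop unrolled at compile time
        z -= depth_step
        gcode.append(f"G01 Z{z:.3f} F{feed}")
    gcode.append("G00 Z5.000")            # rapid retract
    return gcode

for line in compile_egcl(depth_step=0.5, n_passes=3, feed=120):
    print(line)
# G90 / G01 Z-0.500 F120 / G01 Z-1.000 F120 / G01 Z-1.500 F120 / G00 Z5.000
```

Because the output is plain elementary G-code, the same source can drive any ISO-compliant controller, which is the portability argument the article makes.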
Abstract:
In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work from the authors proposed lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, in this work we gather together a number of proposals designed to improve those mechanisms with high impact on performance. Firstly, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Secondly, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones are most important for a given application. This work includes a discussion of the analysis to be done in order to choose the best configuration.
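The read/write-set check at the heart of conflict detection can be sketched as follows (a generic software illustration, not the authors' hardware mechanism); note in particular that read-only transactions never conflict with one another, one of the application characteristics the optimization exploits:

```python
# Sketch of conflict detection between two transactions' read/write sets,
# used to decide whether they may commit in parallel or must serialize.
def conflicts(t1, t2):
    # A conflict exists iff one transaction writes a location the other
    # reads or writes.
    return bool(
        t1["write"] & (t2["read"] | t2["write"]) or
        t2["write"] & (t1["read"] | t1["write"])
    )

tx_a = {"read": {"x", "y"}, "write": {"x"}}
tx_b = {"read": {"y"},      "write": {"z"}}
tx_c = {"read": {"x"},      "write": {"y"}}

print(conflicts(tx_a, tx_b))  # False: disjoint writes, only a shared read
print(conflicts(tx_a, tx_c))  # True: tx_c writes y, which tx_a reads
```

In hardware the sets are tracked per thread in the local memory, but the commit decision follows this same intersection logic.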
Abstract:
The overarching aim of this thesis was to develop an intervention to support patient-centred prescribing in the context of multimorbidity in primary care. Methods: A range of research methods were used to address different components of the Medical Research Council, UK (MRC) guidance on the development and evaluation of complex interventions in health care. The existing evidence on GPs' perceptions of the management of multimorbidity was systematically reviewed. In qualitative interviews, chart-stimulated recall was used to explore the challenges experienced by GPs when prescribing for multimorbid patients. In a cross-sectional study, the psychosocial issues that complicate the management of multimorbidity were examined. To develop the complex intervention, the Behaviour Change Wheel (BCW) was used to integrate behavioural theory with the findings of these three studies. A feasibility study of the new intervention was then conducted with GPs. Results: The systematic review revealed four domains of clinical practice where GPs experienced difficulties in multimorbidity. The qualitative interview study showed that GPs responded to these difficulties by 'satisficing'. In multimorbid patients perceived as stable, GPs preferred to 'maintain the status quo' rather than actively change medications. The cross-sectional study showed a significant association between multimorbidity and negative psychosocial factors. These findings informed the development of the 'Multimorbidity Collaborative Medication Review and Decision-making' (MY COMRADE) intervention. The intervention involves peer support: two GPs review the medications prescribed to a complex multimorbid patient together. In the feasibility study, GPs reported that the intervention was appropriate for the context of general practice and widely applicable to their patients with multimorbidity, and recommendations for optimising medications arose from all collaborative reviews.
Conclusion: Applying theory to empirical data has led to an intervention that is implementable in clinical practice, and has the potential to positively change GPs' behaviour in the management of medications for patients with multimorbidity.
Abstract:
Hydrogen is considered an appealing alternative to fossil fuels in the pursuit of sustainable, secure and prosperous growth in the UK and abroad. However, there remains a persistent bottleneck in the effective storage of hydrogen for mobile applications, which hinders the wide implementation of hydrogen fuel cells in the fossil-fuel-dependent transportation industry. To address this issue, new means of solid-state chemical hydrogen storage are proposed in this thesis, involving the coupling of LiH with three different organic amines: melamine, urea and dicyandiamide. In principle, thermodynamically favourable hydrogen release from these systems proceeds via the deprotonation of the protic N-H moieties by the hydridic metal hydride. At the same time, hydrogen kinetics are expected to be enhanced relative to heavier hydrides by incorporating lithium ions in the proposed binary hydrogen storage systems. Whilst the concept has been successfully demonstrated by the results obtained in this work, it was observed that optimising the ball-milling conditions is central to promoting hydrogen desorption in the proposed systems. The theoretical amount of 6.97 wt% (by dry mass) of hydrogen was released when heating a ball-milled mixture of LiH and melamine (6:1 stoichiometry) to 320 °C. It was observed that ball milling disrupts the intermolecular hydrogen-bonding network that exists in pristine melamine. This effect extends to a molecular-level electron redistribution, observed as shifting IR bands. It was postulated that stable phases containing the triazine skeleton form during the first stages of dehydrogenation. Dehydrogenation of this system yields a solid product, Li2NCN, which has been rehydrogenated back to melamine via hydrolysis under weakly acidic conditions.
The LiH and urea system (4:1 stoichiometry), on the other hand, desorbed approximately 5.8 wt% of hydrogen, out of a theoretical capacity of 8.78 wt% (dry mass), by 270 °C, accompanied by undesirable ammonia and trace amounts of water. The thermal dehydrogenation proceeds via the formation of Li(HN(CO)NH2) at 104.5 °C, which then decomposes to LiOCN and unidentified phases containing C-N moieties by 230 °C. The final products are Li2NCN and Li2O (270 °C), with LiCN and Li2CO3 also detected under certain conditions. It was observed that ball milling can effectively suppress ammonia formation. Furthermore, results obtained from energetic ball-milling experiments indicate that the barrier to full dehydrogenation between LiH and urea is principally kinetic. Finally, the dehydrogenation reaction between LiH and dicyandiamide (4:1 stoichiometry) occurs through two distinct pathways depending on the ball-milling conditions. When ball milled at 450 RPM for 1 h, dehydrogenation proceeds alongside dicyandiamide condensation by 400 °C, whilst at a slower milling speed of 400 RPM for 6 h, decomposition occurs via rapid gas desorption (H2 and NH3) at 85 °C accompanied by sample foaming. The reactant dicyandiamide can be regenerated from the product Li2NCN by hydrolysis.
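The quoted theoretical capacities can be checked with standard atomic masses, assuming one H2 molecule is released per LiH/N-H pair (six pairs for melamine, four for urea):

```python
# Worked check of the theoretical hydrogen capacities quoted above.
M = {"H": 1.008, "Li": 6.941, "C": 12.011, "N": 14.007, "O": 15.999}

LiH      = M["Li"] + M["H"]
melamine = 3 * M["C"] + 6 * M["H"] + 6 * M["N"]       # C3H6N6: six N-H protons
urea     = M["C"] + M["O"] + 2 * M["N"] + 4 * M["H"]  # CO(NH2)2: four N-H protons

def wt_percent_h2(n_lih, host_mass, n_h2):
    # n_h2 molecules of H2 per formula unit of the n_lih : 1 mixture.
    return 100 * n_h2 * 2 * M["H"] / (n_lih * LiH + host_mass)

print(round(wt_percent_h2(6, melamine, 6), 2))  # 6.96 (thesis quotes 6.97 wt%)
print(round(wt_percent_h2(4, urea, 4), 2))      # 8.78, matching the quoted value
```

The small discrepancy on the melamine figure is within rounding of the atomic masses used.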
Abstract:
Sandy coasts represent vital areas whose preservation and maintenance involve economic and tourist interests. Moreover, these dynamic environments undergo erosion at different levels depending on their specific characteristics. For this reason, defence interventions are commonly realised by combining engineering solutions and management policies to evaluate their effects over time. Monitoring activities are the fundamental instrument for obtaining deep knowledge of the investigated phenomenon. Thanks to technological development, several options are available, both in terms of geomatic surveying techniques and processing tools, making it possible to reach high performance and accuracy. Nevertheless, when the littoral definition includes both emerged and submerged beaches, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated according to the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study evaluates the available geomatic techniques, processing approaches, and derived products, aiming to optimise the entire workflow of coastal monitoring by adopting an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an additional value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for properly planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool.
Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (the regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.
Abstract:
Cleaning is one of the most important and delicate procedures in the restoration process. When developing new cleaning systems, it is fundamental to consider their selectivity towards the layer to be removed, their non-invasiveness towards the layer to be preserved, and their sustainability and non-toxicity. Besides assessing efficacy, it is important to understand the cleaning mechanism through analytical protocols that strike a balance between cost, practicality, and reliable interpretation of results. In this thesis, the development of cleaning systems based on the coupling of electrospun fabrics (ES) and greener organic solvents is proposed. Electrospinning is a versatile technique that allows the production of micro/nanostructured non-woven mats, which have already been used as absorbents in various scientific fields but, to date, not in restoration. The systems produced proved effective for the removal of dammar varnish from paintings, where the ES mats act not only as solvent-binding agents but also as adsorbents of the partially solubilised varnish through capillary rise, thus enabling a one-step procedure. They have also been successfully applied to the removal of spray varnish from marble substrates and wall paintings. Due to the complexity of the materials, the procedure had to be adapted case by case, and mechanical action was still necessary. According to the spinning solution, three types of ES mats were produced: polyamide 6,6, pullulan, and pullulan with melanin nanoparticles. The latter, under irradiation, allows a localised temperature increase, accelerating and facilitating the removal of less soluble layers (e.g. reticulated alkyd-based paints). All the systems produced, and the mock-ups used, were extensively characterised using multi-analytical protocols. Finally, a monitoring protocol and image treatment based on photoluminescence macro-imaging are proposed.
This set-up allowed the removal mechanism of dammar varnish to be studied and its residues to be semi-quantified. These initial results form the basis for optimising the acquisition set-up and data processing.
Abstract:
The analysis of compiled code is an increasingly demanded and necessary activity, critical to the security and stability of the computing infrastructure used throughout the world. The types of binary files to be analysed are numerous and constantly evolving, ranging from desktop or mobile applications to router or baseband firmware. The goal of this thesis is to design and implement Dragonlifter, a compiled-code-to-C converter that is extensible and able to support a large number of architectures, operating systems and file formats. This makes it possible to run programs compiled for other architectures, trace their execution, and modify them to mitigate vulnerabilities or change their behaviour.
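As a toy illustration of lifting (a hypothetical three-opcode ISA, far simpler than the real architectures Dragonlifter supports), each decoded instruction maps to an equivalent C statement:

```python
# Toy lifter: decoded instructions of an invented ISA -> C statements.
def lift(instr):
    op = instr[0]
    if op == "mov":
        return f"{instr[1]} = {instr[2]};"
    if op == "add":
        return f"{instr[1]} = {instr[2]} + {instr[3]};"
    if op == "jnz":
        return f"if ({instr[1]} != 0) goto {instr[2]};"
    raise ValueError(f"unsupported opcode: {op}")

program = [("mov", "r0", "10"), ("add", "r1", "r0", "r0"), ("jnz", "r1", "L1")]
for instr in program:
    print(lift(instr))
# r0 = 10; / r1 = r0 + r0; / if (r1 != 0) goto L1;
```

Emitting C rather than another binary is what makes the lifted program recompilable for a different architecture and easy to instrument or patch.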
Abstract:
Numerous types of acute respiratory failure are routinely treated using non-invasive ventilatory support (NIV). Its efficacy is well documented: NIV lowers intubation and death rates in various respiratory disorders. It can be delivered by means of face masks or head helmets. Currently, the scientific community's interest in NIV helmets is mostly focused on optimising the mixing of CO2 and clean air and on improving patient comfort. To this end, fluid dynamic analysis plays a particularly important role, and a two-pronged approach is frequently employed. While numerical simulations provide information about the entire flow field and different geometries, they require huge temporal and computational resources. Experiments, on the other hand, help to validate simulations and provide results with a much smaller time investment, and thus remain at the core of research in fluid dynamics. The aim of this thesis work was to develop a flow bench and to utilise it for the analysis of NIV helmets. A flow test bench and an instrumented mannequin were successfully designed, produced and put into use. Experiments were performed to characterise the helmet interface in terms of pressure drop and flow-rate drop over different inlet flow rates and outlet pressure set points. Velocity measurements by means of Particle Image Velocimetry (PIV) were performed. Pressure drop and flow-rate characteristics from the experiments were compared with CFD data, and sufficient agreement was observed between the numerical and experimental results. The PIV studies permitted qualitative and quantitative comparisons with numerical simulation data and offered a clear picture of the internal flow behaviour, aiding the identification of coherent flow features.