71 results for optimising compiler


Relevance:

10.00%

Publisher:

Abstract:

Zeolites exchanged with the transition metal cations Co2+, Mn2+, Zn2+ and Cu2+ are capable of storing and delivering a large quantity of nitric oxide, in the range 1.2-2.7 mmol g(-1). Metal ion exchange affects the pore volume of zeolite FAU more significantly than that of LTA. Storage of NO mainly involves coordination of NO to the metal cation sites. The stored nitric oxide can be released by exposing the zeolites to a moist atmosphere. Release takes more than 2 hours for the NO concentration in the outlet gas to fall below ~5 ppb, and the release rate can be controlled by tailoring the zeolite framework and optimising the release conditions.

Relevance:

10.00%

Publisher:

Abstract:

Data flow techniques have been around since the early '70s, when they were used in compilers for sequential languages. Shortly after their introduction they were also considered as a possible model for parallel computing, although their impact there was limited. Recently, however, data flow has been identified as a candidate for the efficient implementation of various programming models on multi-core architectures. In most cases the burden of determining the data flow "macro" instructions is left to the programmer, while the compiler/run-time system manages only the efficient scheduling of these instructions. We discuss a structured parallel programming approach supporting automatic compilation of programs to macro data flow, and we show experimental results demonstrating the feasibility of the approach and the efficiency of the resulting "object" code on different classes of state-of-the-art multi-core architectures. The experiments use different base mechanisms to implement the macro data flow run-time support, from plain pthreads with condition variables to more modern and effective lock- and fence-free parallel frameworks. Experimental results comparing the efficiency of the proposed approach with that achieved using other, more classical, parallel frameworks are also presented. © 2012 IEEE.
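To make the "macro data flow" idea concrete, the sketch below shows a token-counting macro instruction that fires once all of its inputs have arrived, built on condition variables as in the plain-pthreads baseline the abstract mentions. The class and its firing rule are invented for illustration; they are not the authors' actual run-time support.

```cpp
// Minimal sketch of a macro data flow instruction with a token-counting
// firing rule; illustrative only, not the paper's implementation.
#include <condition_variable>
#include <functional>
#include <iostream>
#include <mutex>
#include <thread>

// A macro instruction fires once all of its input tokens have arrived.
struct MacroInstruction {
    std::mutex m;
    std::condition_variable cv;
    int pending;                    // input tokens still missing
    std::function<void()> body;     // the "macro" computation

    MacroInstruction(int inputs, std::function<void()> f)
        : pending(inputs), body(std::move(f)) {}

    // Called by a producer when one input token becomes available.
    void deliverToken() {
        std::lock_guard<std::mutex> lk(m);
        if (--pending == 0) cv.notify_one();
    }

    // A worker thread blocks until the instruction is fireable, then runs it.
    void waitAndFire() {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [this] { return pending == 0; });
        body();
    }
};

int main() {
    MacroInstruction add(2, [] { std::cout << "macro instruction fired\n"; });
    std::thread worker([&] { add.waitAndFire(); });
    add.deliverToken();   // token from producer 1
    add.deliverToken();   // token from producer 2
    worker.join();
}
```

A lock- and fence-free run time, as also evaluated in the paper, would replace the mutex/condition-variable pair with an atomic token counter.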

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND
Social disadvantage can have a significant impact on early child development, health and wellbeing, and what happens during this critical period is important for all aspects of development. Caregiving competence and the quality of the home environment play an important role in supporting development in young children, and parents are therefore well placed to optimise child development and mitigate the negative effects of social disadvantage. Home-based child development programmes aim to improve children's developmental outcomes by educating, training and supporting parents in their own home to provide a more nurturing and stimulating environment for their child.

OBJECTIVES
To determine the effects of home-based programmes aimed specifically at improving developmental outcomes for preschool children from socially disadvantaged families.

SEARCH STRATEGY
We searched the following databases between 7 October and 12 October 2010: Cochrane Central Register of Controlled Trials (CENTRAL) (2010, Issue 4), MEDLINE (1950 to week 4, September 2010), EMBASE (1980 to Week 39, 2010), CINAHL (1937 to current), PsycINFO (1887 to current), ERIC (1966 to current), ASSIA (1987 to current), Sociological Abstracts (1952 to current), Social Science Citation Index (1970 to current). We also searched reference lists of articles.

SELECTION CRITERIA
Randomised controlled trials comparing home-based preschool child development interventions with a 'standard care' control. Participants were parents with children up to the age of school entry who were socially disadvantaged in respect of poverty, lone parenthood or ethnic minority status.

DATA COLLECTION AND ANALYSIS
Two authors independently selected studies, assessed the trials' risk of bias and extracted data.

RESULTS
We included seven studies involving 723 participants. We assessed four of the seven studies as being at high risk of bias and three as at unclear risk of bias; the quality of the evidence was difficult to assess because insufficient detail was often reported to allow conclusions about the methodological rigour of the studies. Four trials involving 285 participants measured cognitive development, and we synthesised these data in a meta-analysis. Compared with the control group, there was no statistically significant effect of the intervention on cognitive development (standardised mean difference (SMD) 0.30; 95% confidence interval -0.18 to 0.78). Only three studies reported socioemotional outcomes, and there were insufficient data to combine in a meta-analysis. No study reported adverse effects.
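For reference, the standardised mean difference divides the between-group difference in means by a pooled standard deviation; the generic pooled form is shown below. The abstract does not state whether a Cohen's d or a bias-corrected Hedges' g variant was used, so this is the textbook definition rather than the review's exact estimator.

```latex
\mathrm{SMD} = \frac{\bar{x}_T - \bar{x}_C}{s_p},
\qquad
s_p = \sqrt{\frac{(n_T - 1)\,s_T^2 + (n_C - 1)\,s_C^2}{n_T + n_C - 2}}
```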

AUTHORS’ CONCLUSIONS
This review does not provide evidence of the effectiveness of home-based interventions that are specifically targeted at improving developmental outcomes for preschool children from socially disadvantaged families. Future studies should endeavour to better document and report their methodological processes.

Relevance:

10.00%

Publisher:

Abstract:

The inherent difficulty of thread-based shared-memory programming has recently motivated research into high-level, task-parallel programming models. Recent advances in task-parallel models add implicit synchronisation, where the system automatically detects and satisfies data dependencies among spawned tasks. However, dynamic dependence analysis incurs significant runtime overheads, because the runtime must track task resources and use this information to schedule tasks while avoiding conflicts and races.
We present SCOOP, a compiler that effectively integrates static and dynamic analysis in code generation. SCOOP combines context-sensitive points-to, control-flow, escape, and effect analyses to remove redundant dependence checks at runtime. Our static analysis can work in combination with existing dynamic analyses and task-parallel runtimes that use annotations to specify tasks and their memory footprints. We use our static dependence analysis to detect non-conflicting tasks and an existing dynamic analysis to handle the remaining dependencies. We evaluate the resulting hybrid dependence analysis on a set of task-parallel programs.
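As a concrete illustration of the hybrid scheme, the sketch below shows footprint-annotated tasks with a dynamic overlap check that a SCOOP-style compiler could elide for tasks it statically proves conflict-free. The Footprint and Runtime types and the spawnUnchecked() entry point are hypothetical names invented here; they are not SCOOP's actual interface.

```cpp
// Hypothetical sketch of footprint-annotated tasks with a dynamic
// dependence check that static analysis can remove; illustrative only.
#include <cstddef>
#include <functional>
#include <vector>

// A task's declared memory footprint: [base, base+len) plus an access mode.
struct Footprint {
    const void* base;
    std::size_t len;
    bool        write;
};

// Two footprints conflict if their ranges intersect and at least one writes.
static bool overlaps(const Footprint& a, const Footprint& b) {
    const char* pa = static_cast<const char*>(a.base);
    const char* pb = static_cast<const char*>(b.base);
    bool intersect = pa < pb + b.len && pb < pa + a.len;
    return intersect && (a.write || b.write);
}

struct Runtime {
    std::vector<Footprint> inFlight;   // footprints of not-yet-retired tasks

    // Dynamic path: check a new task against every in-flight footprint.
    bool mustWait(const std::vector<Footprint>& fps) const {
        for (const auto& f : fps)
            for (const auto& g : inFlight)
                if (overlaps(f, g)) return true;
        return false;
    }

    // When static analysis proves a task conflict-free, the compiler can
    // emit spawnUnchecked() instead, eliding mustWait() entirely.
    void spawnUnchecked(std::function<void()> task) const { task(); }
};

int main() {
    int a[4], b[4];
    Runtime rt;
    rt.inFlight.push_back({a, sizeof a, true});       // task 1 writes a[]
    bool wait = rt.mustWait({{b, sizeof b, false}});  // task 2 reads b[]
    // Disjoint arrays: a points-to analysis could prove this and skip it.
    rt.spawnUnchecked([] { /* run task 2 immediately */ });
    return wait ? 1 : 0;
}
```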

Relevance:

10.00%

Publisher:

Abstract:

Enhancing sampling and analyzing simulations are central issues in molecular simulation. Recently, we introduced PLUMED, an open-source plug-in that provides some of the most popular molecular dynamics (MD) codes with implementations of a variety of enhanced sampling algorithms and collective variables (CVs). The rapid changes in this field, in particular new directions in enhanced sampling and dimensionality reduction together with new hardware, require a code that is more flexible and more efficient. We therefore present here PLUMED 2, a complete rewrite of the code in an object-oriented programming language (C++). This new version introduces greater flexibility and greater modularity, which both extend its core capabilities and make it far easier to add new methods and CVs. It also has a simpler interface with the MD engines and provides a single software library containing both tools and core facilities. Ultimately, the new code better serves the ever-growing community of users and contributors in coping with the new challenges arising in the field.
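To illustrate what the object-oriented rewrite buys in practice, here is a toy sketch of a pluggable collective-variable hierarchy in which adding a new CV means deriving a single class that returns a scalar and its gradient. This interface is hypothetical and heavily simplified; it is not PLUMED 2's actual class structure.

```cpp
// Toy CV hierarchy showing object-oriented pluggability; not PLUMED's API.
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

using Vec3 = std::array<double, 3>;

// Base class: every CV maps atomic positions to a scalar plus its gradient.
class CollectiveVariable {
public:
    virtual ~CollectiveVariable() = default;
    virtual double compute(const std::vector<Vec3>& pos,
                           std::vector<Vec3>& grad) const = 0;
};

// A new CV is one derived class; the core engine and MD-code interface
// need no changes.
class Distance : public CollectiveVariable {
    std::size_t i_, j_;
public:
    Distance(std::size_t i, std::size_t j) : i_(i), j_(j) {}
    double compute(const std::vector<Vec3>& pos,
                   std::vector<Vec3>& grad) const override {
        Vec3 d{pos[i_][0] - pos[j_][0],
               pos[i_][1] - pos[j_][1],
               pos[i_][2] - pos[j_][2]};
        double r = std::sqrt(d[0]*d[0] + d[1]*d[1] + d[2]*d[2]);
        for (int k = 0; k < 3; ++k) {   // dr/dx_i = d/r, dr/dx_j = -d/r
            grad[i_][k] =  d[k] / r;
            grad[j_][k] = -d[k] / r;
        }
        return r;
    }
};

int main() {
    std::vector<Vec3> pos = {{{0.0, 0.0, 0.0}}, {{3.0, 4.0, 0.0}}};
    std::vector<Vec3> grad(pos.size(), Vec3{{0.0, 0.0, 0.0}});
    Distance d(0, 1);
    return d.compute(pos, grad) == 5.0 ? 0 : 1;   // |(3,4,0)| = 5
}
```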

Program summary

Program title: PLUMED 2

Catalogue identifier: AEEE_v2_0

Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEE_v2_0.html

Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland

Licensing provisions: Yes

No. of lines in distributed program, including test data, etc.: 700646

No. of bytes in distributed program, including test data, etc.: 6618136

Distribution format: tar.gz

Programming language: ANSI-C++.

Computer: Any computer capable of running an executable produced by a C++ compiler.

Operating system: Linux operating system, Unix OSs.

Has the code been vectorized or parallelized?: Yes, parallelized using MPI.

RAM: Depends on the number of atoms, the method chosen and the collective variables used.

Classification: 3, 7.7, 23.

Catalogue identifier of previous version: AEEE_v1_0.

Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1961.

External routines: GNU libmatheval, LAPACK, BLAS, MPI.

© 2013 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

In this research, two different methods were investigated for optimising the preparation of hydrogenated acrylonitrile butadiene rubber/clay nanocomposites. A commercially available organoclay (Cloisite 20A) was used to prepare the rubber nanocomposites. A detailed analysis was made of the morphological structure and of the mechanical behaviour at room temperature and at elevated temperature, and the influence of the organoclay on permeability was also studied. Structural analysis indicates very good dispersion at a low loading of 5 parts per hundred rubber (phr) of nanoclay. Significant improvements in mechanical properties were observed on addition of the organoclay at both room and elevated temperatures, and even at this low nanoclay loading there was a remarkable reduction in permeability. © Institute of Materials, Minerals and Mining 2011.

Relevance:

10.00%

Publisher:

Abstract:

Diabetic retinopathy (DR) is the leading cause of visual loss among people of working age in the developed world, and its prevalence is predicted to double by 2025. The management of diabetic retinopathy has traditionally relied on screening, on laser treatment delivered by ophthalmologists, and on optimising blood glucose and blood pressure. Recent evidence suggests that the role of systemic factors is more complex than originally thought, and that drugs such as ACE inhibitors, fibrates and glitazones may all influence the course of diabetic macular oedema. Antagonism of vascular endothelial growth factor offers a new therapeutic avenue that may transform the management of diabetic macular oedema. Several other therapeutic options are under investigation and development, including aminoguanidine, sorbinil, ruboxistaurin and autologous stem cell transfusion. © Royal College of Physicians, 2013.

Relevance:

10.00%

Publisher:

Abstract:

Accurate conceptual models of groundwater systems are essential for correct interpretation of monitoring data in catchment studies. In surface-water-dominated hard-rock regions, modern groundwater and surface water monitoring programmes often include very high resolution chemical, meteorological and hydrological observations, but lack an equivalent emphasis on the subsurface environment, whose properties exert a strong control on flow pathways and interactions with surface waters. The reasons for this disparity are the complexity of the system and the difficulty of accurately characterising the subsurface, except locally at outcrops or in boreholes. This is particularly the case in maritime north-western Europe, where a legacy of glacial activity, combined with large areas underlain by heterogeneous igneous and metamorphic bedrock, makes the structure and weathering of the bedrock difficult to map or model. Traditional approaches that seek to extrapolate information from the borehole to the field scale are of limited use in these environments because of the high degree of spatial heterogeneity. Here we apply an integrative, multi-scale approach, optimising and combining standard geophysical techniques to generate a three-dimensional geological conceptual model of the subsurface of a catchment in NE Ireland. Available airborne LiDAR, electromagnetic and magnetic data sets were analysed for the region. At the field scale, surface geophysical methods, including electrical resistivity tomography, seismic refraction, ground-penetrating radar and magnetic surveys, were used and combined with field mapping of outcrops and borehole testing. The study demonstrates how the combined interpretation of multiple methods at a range of scales produces robust three-dimensional conceptual models and a stronger basis for interpreting groundwater and surface water monitoring data.

Relevance:

10.00%

Publisher:

Abstract:

Polymer extrusion, in which a polymer is melted and conveyed to a mould or die, forms the basis of most polymer processing techniques. Extruders frequently run at non-optimised conditions and can account for 15-20% of overall process energy losses; at a time of increasing emphasis on energy efficiency, such losses are a major concern for the industry. Product quality is also a concern for processors: it depends on the homogeneity and stability of the melt flow, which in turn depend on melt temperature and screw speed. Gear pumps can be used to improve the stability of the production line, but their cost is usually high; likewise, energy meters can be installed, but they too add to the capital cost of the machine. Advanced control incorporating soft-sensing capabilities offers this industry opportunities to improve both quality and energy efficiency. Because of strong correlations between the critical variables, such as melt temperature and melt pressure, traditional decentralized PID (proportional-integral-derivative) control cannot handle such processes when stricter product specifications are imposed or the material is changed from one batch to another. In this paper, new real-time energy monitoring methods are introduced that require neither power meters nor data-driven models. The effects of process settings on energy efficiency and melt quality are then studied using the developed monitoring methods; the process variables include barrel heating temperature, water cooling temperature, and screw speed. Finally, a fuzzy logic controller is developed for a single screw extruder to achieve high melt quality, and its performance shows it to be a satisfactory alternative to the expensive gear pump. The energy efficiency of the extruder can be further improved by optimising the temperature settings. Experimental results from open-loop control and fuzzy control on a Killion 25 mm single screw extruder are presented to confirm the efficacy of the proposed approach.
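For intuition about what such a controller involves, below is a minimal Mamdani-style sketch that fuzzifies the melt-temperature error into three triangular sets, applies three rules, and defuzzifies with a weighted centroid. The sets, rules and output scaling are invented for illustration; they are not the controller tuned in the paper.

```cpp
// Minimal Mamdani-style fuzzy correction of heater power from the melt
// temperature error; membership functions and rules are illustrative.
#include <algorithm>
#include <cmath>
#include <iostream>

// Triangular membership: 1 at the peak c, falling to 0 at c - w and c + w.
static double tri(double x, double c, double w) {
    return std::max(0.0, 1.0 - std::abs(x - c) / w);
}

// Input: melt temperature error (setpoint - measured), in deg C.
// Output: heater power correction, in % of full scale.
static double fuzzyHeaterCorrection(double error) {
    // Fuzzify the error into three overlapping sets.
    double neg  = tri(error, -10.0, 10.0);   // "too hot"
    double zero = tri(error,   0.0,  5.0);   // "on target"
    double pos  = tri(error, +10.0, 10.0);   // "too cold"

    // Rules: too hot -> cut power (-20%), on target -> hold (0%),
    // too cold -> add power (+20%). Centroid of weighted singletons.
    double num = neg * (-20.0) + zero * 0.0 + pos * (+20.0);
    double den = neg + zero + pos;
    return den > 0.0 ? num / den : 0.0;
}

int main() {
    for (double e : {-12.0, -3.0, 0.0, 4.0, 11.0})
        std::cout << "error " << e << " C -> power correction "
                  << fuzzyHeaterCorrection(e) << " %\n";
}
```

A production controller would use more sets and rules (and typically a second input such as the rate of change of the error), but the fuzzify-infer-defuzzify pipeline is the same.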

Relevance:

10.00%

Publisher:

Abstract:

This work explores the possibility of optimising 3D Organised Mesoporous Silica (OMS) coated with iron and aluminium oxides for the optimal removal of As(III) and As(V) from synthetic contaminated water. The materials developed were fully characterised and tested for arsenic removal in batch experiments. The effect of the total Al-to-Fe oxide coating on the selective removal of As(III) and As(V) was studied, and an 8% metal coating was shown to be the optimal configuration for the coated OMS materials. The effects of initial arsenic concentration and pH, and the kinetics and diffusion mechanisms, were studied, modelled and discussed. The advantage of an organised material over an unstructured sorbent proved very limited in terms of kinetics and diffusion under the experimental conditions, and physisorption was shown to be the main adsorption process involved in As removal by the coated OMS. A maximum adsorption capacity of 55 mg As(V) g-1 was observed at pH 5 for the material coated with 8% Al oxides, while 35 mg As(V) g-1 was removed at pH 4 for the equivalent material coated with Fe oxides.

Relevance:

10.00%

Publisher:

Abstract:

The new Global Initiative for Chronic Obstructive Lung Disease (GOLD) 2011 document recommends a combined assessment of chronic obstructive pulmonary disease (COPD) based on current symptoms and future risk.

A large database of primary-care COPD patients across the UK was used to determine COPD distribution and characteristics according to the new GOLD classification. 80 general practices provided patients with a Read code diagnosis of COPD. Electronic and hand searches of patient medical records were undertaken, optimising data capture.

Data for 9219 COPD patients were collected. For the 6283 patients with both forced expiratory volume in 1 s (FEV1) and modified Medical Research Council scores (mean±SD age 69.2±10.6 years, body mass index 27.3±6.2 kg·m-2), GOLD 2011 group distributions were: A (low risk and fewer symptoms) 36.1%, B (low risk and more symptoms) 19.1%, C (high risk and fewer symptoms) 19.6% and D (high risk and more symptoms) 25.3%. This is in contrast with GOLD 2007 stage classification: I (mild) 17.1%, II (moderate) 52.2%, III (severe) 25.5% and IV (very severe) 5.2%. 20% of patients with FEV1 ≥50% predicted had more than two exacerbations in the previous 12 months. 70% of patients with FEV1 <50% predicted had fewer than two exacerbations in the previous 12 months.

This database, representative of UK primary-care COPD patients, identified greater proportions of patients in the mildest and most severe categories upon comparing 2011 versus 2007 GOLD classifications. Discordance between airflow limitation severity and exacerbation risk was observed.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we propose a design paradigm for energy-efficient and variation-aware operation of next-generation multicore heterogeneous platforms. The main idea behind the proposed approach lies in the observation that not all operations are equally important in shaping the output quality of various applications and of the overall system. Based on this observation, we suggest that all levels of the software design stack, including the programming model, compiler, operating system (OS) and run-time system, should identify the critical tasks and ensure their correct operation by assigning them to dynamically adjusted reliable cores/units. Specifically, based on error rates and operating conditions identified by a sense-and-adapt (SeA) unit, the OS selects and sets the right mode of operation for the overall system. The run-time system identifies critical and less-critical tasks based on special directives and schedules them to the appropriate units, which are dynamically adjusted for highly accurate or approximate operation by tuning their voltage/frequency. Units that execute less significant operations can, if required, run at voltages below those needed for fully correct operation and thereby consume less power, since such tasks, unlike the critical ones, need not always be exact. Such a scheme can lead to energy-efficient and reliable operation, while reducing the design cost and overheads of conventional circuit/micro-architecture-level techniques.
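A toy sketch of the core scheduling idea follows: tasks carry a significance tag, and the run-time system routes critical tasks to cores held at nominal voltage/frequency while less significant ones go to undervolted cores. The Task and Scheduler types below are invented for illustration; they are not the authors' run-time system.

```cpp
// Toy significance-driven task placement; illustrative only.
#include <functional>
#include <iostream>
#include <queue>

enum class Significance { Critical, Approximate };

struct Task {
    Significance sig;
    std::function<void()> body;
};

struct Scheduler {
    std::queue<Task> reliableQ;     // cores kept at nominal voltage/frequency
    std::queue<Task> approximateQ;  // cores undervolted to save power

    void submit(Task t) {
        // Critical tasks must execute exactly: route them to reliable cores.
        if (t.sig == Significance::Critical) reliableQ.push(std::move(t));
        else                                 approximateQ.push(std::move(t));
    }
};

int main() {
    Scheduler s;
    s.submit({Significance::Critical,    [] { std::cout << "exact step\n"; }});
    s.submit({Significance::Approximate, [] { std::cout << "best-effort step\n"; }});
    std::cout << s.reliableQ.size() << " critical task(s), "
              << s.approximateQ.size() << " approximate task(s) queued\n";
}
```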

Relevance:

10.00%

Publisher:

Abstract:

Low-power processors and accelerators that were originally designed for the embedded systems market are emerging as building blocks for servers. Power capping has been actively explored as a technique to reduce the energy footprint of high-performance processors. The opportunities and limitations of power capping on the new low-power processor and accelerator ecosystem are less well understood. This paper presents an efficient power capping and management infrastructure for heterogeneous SoCs based on hybrid ARM/FPGA designs. The infrastructure coordinates dynamic voltage and frequency scaling with task allocation on a customised Linux system for the Xilinx Zynq SoC. We present a compiler-assisted power model to guide voltage and frequency scaling, in conjunction with workload allocation between the ARM cores and the FPGA, under given power caps. The model achieves less than 5% estimation bias relative to mean power consumption. In an FFT case study, the proposed power capping schemes achieve on average 97.5% of the performance of the optimal execution and match the optimal execution in 87.5% of the cases, while always meeting power constraints.
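As a sketch of how such a model can guide capping, the snippet below picks the fastest DVFS operating point whose model-predicted power fits under the cap. The table values and the simple selection rule are assumptions made for illustration; they are not the paper's fitted compiler-assisted model.

```cpp
// Cap-guided DVFS selection using model-predicted power; illustrative only.
#include <iostream>
#include <vector>

struct OperatingPoint { double freqMHz; double predictedWatts; };

// Choose the fastest frequency whose predicted power stays under the cap;
// falls back to the slowest point if nothing fits.
static OperatingPoint pickUnderCap(const std::vector<OperatingPoint>& pts,
                                   double capWatts) {
    OperatingPoint best = pts.front();   // pts assumed sorted by frequency
    for (const auto& p : pts)
        if (p.predictedWatts <= capWatts) best = p;
    return best;
}

int main() {
    // Hypothetical DVFS table with model-predicted power per point.
    std::vector<OperatingPoint> table = {
        {300.0, 1.2}, {600.0, 1.9}, {800.0, 2.6}, {1000.0, 3.5}
    };
    OperatingPoint op = pickUnderCap(table, 3.0 /* watt cap */);
    std::cout << "run at " << op.freqMHz << " MHz ("
              << op.predictedWatts << " W predicted)\n";
}
```

In the paper's setting the same decision also covers workload allocation between the ARM cores and the FPGA; this sketch shows only the frequency dimension.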

Relevance:

10.00%

Publisher:

Abstract:

The overall aim of the project was to study the influence of process variables on the distribution of a model active pharmaceutical ingredient (API) during fluidised melt granulation of pharmaceutical granules, with a view to optimising product characteristics. Granules were produced from a common pharmaceutical excipient, lactose monohydrate, using polyethylene glycol (PEG 1500) as a meltable binder, with methylene blue as the model API. Empirical models relating the process variables to granule properties such as mean granule size, product homogeneity and granule strength were developed using a design-of-experiments approach. Fluidising air velocity and fluidising air temperature were shown to strongly influence the product properties, and optimisation studies showed that strong granules with a homogeneous distribution of the active ingredient can be produced at high fluidising air velocity and high fluidising air temperature.
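For context, empirical models of this kind are typically second-order response surfaces fitted to the designed experiments; the generic form is shown below, where y is a granule property (e.g. mean size) and the x_i are the coded process variables. The abstract does not state the model order, so this form is an assumption.

```latex
y = \beta_0 + \sum_i \beta_i x_i + \sum_i \beta_{ii} x_i^2
    + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```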

Relevance:

10.00%

Publisher:

Abstract:

Objective:

The aim of this study was to identify sources of anatomical misrepresentation due to the location of camera mounting, tumour motion velocity and image processing artefacts in order to optimise the 4DCT scan protocol and improve geometrical-temporal accuracy.

Methods:

A phantom with an imaging insert was driven with sinusoidal superior-inferior motion of varying amplitude and period during 4DCT scanning. The length of a high-density cube within the insert was measured using treatment planning software to determine the accuracy of its spatial representation. Scan parameters were varied, including the tube rotation period and the cine time between reconstructed images. A CT image quality phantom was used to measure various image quality signatures under the scan parameters tested.

Results:

No significant difference in spatial accuracy was found between 4DCT scans carried out using the wall-mounted and couch-mounted cameras for sinusoidal target motion. Greater spatial accuracy was found for 4DCT scans carried out using a tube rotation period of 0.5 s rather than 1.0 s. The reduction in image quality when using the faster rotation was not enough to require an increase in patient dose.

Conclusions:

4DCT accuracy may be increased by optimising scan parameters, including choosing faster tube rotation speeds. Peak misidentification in the recorded breathing trace leads to spatial artefacts; this risk can be reduced by using a couch-mounted infrared camera.

Advances in knowledge:

This study explicitly shows that 4DCT scan accuracy is improved by scanning with a faster CT tube rotation speed.