905 results for "optimising compiler"


Relevance: 10.00%

Abstract:

In parallel adaptive finite element simulations the workload on the individual processors may change frequently. To (re)distribute the load evenly over the processors, a load-balancing heuristic is needed. Common strategies try to minimise subdomain dependencies by optimising the cutsize of the partitioning. For certain solvers, however, cutsize plays only a minor role: their convergence depends strongly on the subdomain shapes, and degenerate subdomain shapes cause them to need significantly more iterations to converge. In this work a new parallel load-balancing strategy is introduced which directly addresses the problem of generating and conserving reasonably good subdomain shapes in a dynamically changing finite element simulation. Geometric data is used to formulate several cost functions that rate elements in terms of their suitability to be migrated. The well-known diffusive method, which calculates the necessary load flow, is enhanced by weighting the subdomain edges with the help of these cost functions. The proposed methods have been tested and results are presented.
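The diffusive method mentioned above can be sketched in a few lines: each processor repeatedly exchanges a fraction of its load difference with its neighbours, and weighting the edges (here the optional `weights` argument) biases which links carry flow. Everything below, including the function names and the damping scheme, is an illustrative assumption rather than the paper's exact formulation.

```python
def diffuse(loads, edges, weights=None, alpha=0.5, steps=50):
    """First-order diffusive load balancing on a processor graph.

    loads  -- one load value per processor
    edges  -- list of (i, j) processor pairs that can exchange load
    Each step moves alpha * weight * (loads[i] - loads[j]) / deg across
    every edge, so the loads converge toward the global average while
    the total load is conserved (flows are antisymmetric).
    """
    n = len(loads)
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    loads = list(loads)
    for _ in range(steps):
        flow = [0.0] * n
        for k, (i, j) in enumerate(edges):
            w = 1.0 if weights is None else weights[k]
            f = alpha * w * (loads[i] - loads[j]) / max(deg[i], deg[j])
            flow[i] -= f
            flow[j] += f
        loads = [l + f for l, f in zip(loads, flow)]
    return loads

# Four processors in a line, all load initially on processor 0;
# the loads diffuse toward 2.0 each.
print(diffuse([8.0, 0.0, 0.0, 0.0], [(0, 1), (1, 2), (2, 3)]))
```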

Relevance: 10.00%

Abstract:

A method is outlined for optimising graph partitions which arise in mapping unstructured-mesh calculations to parallel computers. The method employs a relative-gain iterative technique both to balance the workload evenly and to minimise the number and volume of interprocessor communications. A parallel graph-reduction technique, which can be used to give a global perspective to the optimisation, is also briefly described. The algorithms work efficiently in parallel as well as sequentially, and when combined with a fast direct partitioning technique (such as the Greedy algorithm) to give an initial partition, the resulting two-stage process proves to be both a powerful and flexible solution to the static graph-partitioning problem. Experiments indicate that the resulting parallel code can provide high-quality partitions, independent of the initial partition, within a few seconds. The algorithms can also be used for dynamic load balancing, reusing existing partitions; in this case the procedures are much faster than static techniques, provide partitions of similar or higher quality and involve the migration of only a fraction of the data.
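The gain computation at the heart of such relative-gain optimisers is simple to state: the gain of moving a vertex to the other part is the number of cut edges removed minus the number created. The function below is an illustrative Kernighan-Lin-style sketch, not the paper's actual algorithm or interface.

```python
def gain(v, part, adj):
    """Reduction in cut size if vertex v moves to the other part:
    (edges at v crossing the cut) - (edges at v inside v's own part)."""
    ext = sum(1 for u in adj[v] if part[u] != part[v])
    internal = sum(1 for u in adj[v] if part[u] == part[v])
    return ext - internal

# Square graph 0-1-2-3-0 split "diagonally" (0,2 vs 1,3): both edges at
# vertex 1 cross the cut, so moving it to the other part saves 2 cut edges.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
part = {0: 0, 1: 1, 2: 0, 3: 1}
print(gain(1, part, adj))  # → 2
```

A full optimiser would repeatedly move the highest-gain vertex subject to a balance constraint, updating the gains of its neighbours after each move.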

Relevance: 10.00%

Abstract:

BACKGROUND: The identification of patients' health needs is pivotal in optimising the quality of health care, increasing patient satisfaction and directing resource allocation. Health needs are complex and not as easily evaluated as health-related quality of life (HRQL), which is becoming increasingly accepted as a means of providing a more global, patient-orientated assessment of the outcome of health care interventions than the simple medical model. The potential of HRQL as a surrogate measure of healthcare needs has not been evaluated. OBJECTIVES AND METHOD: A generic (Short Form-12; SF-12) and a disease-specific questionnaire (Seattle Angina Questionnaire; SAQ) were tested for their potential to predict health needs in patients with acute coronary disease. A wide range of healthcare needs were determined using a questionnaire specifically developed for this purpose. RESULTS: With the exception of information needs, healthcare needs were highly correlated with health-related quality of life. Patients with limited enjoyment of personal interests, a weak financial situation, greater dependency on others to access health services, and dissatisfaction with accommodation reported poorer HRQL (SF-12: p < 0.001; SAQ: p < 0.01). Difficulties with mobility, aids to daily living and activities requiring assistance from someone else were strongly associated with both generic and disease-specific questionnaires (SF-12: r = 0.46-0.55, p < 0.01; SAQ: r = 0.53-0.65, p < 0.001). Variables relating to quality of care and health services were more highly correlated with SAQ components (r = 0.33-0.59) than with the SF-12 (r = 0.07-0.33). Overall, the disease-specific Seattle Angina Questionnaire was superior to the generic Short Form-12 in detecting healthcare needs in patients with coronary disease. Receiver operating characteristic curves supported the sensitivity of HRQL tools in detecting health needs.
CONCLUSION: Healthcare needs are complex, and developing suitable questionnaires to measure them is difficult and time-consuming. Without a satisfactory means of measuring these needs, the extent to which disease impacts on health will continue to be underestimated. Further investigation in larger populations is warranted, but HRQL tools appear to be a reasonable proxy for healthcare needs, as they identify the majority of needs in patients with coronary disease, an observation not previously reported in this patient group.

Relevance: 10.00%

Abstract:

This article describes the design and implementation of a computer-aided tool called Relational Algebra Translator (RAT), used in database courses for the teaching of relational algebra. A problem arose when introducing the relational algebra topic in the course EIF 211 Design and Implementation of Databases, which belongs to the Engineering in Information Systems programme of the National University of Costa Rica: students attending this course lacked a deep mathematical background, which led to a learning problem in a subject that is important for understanding how database searches and queries work. RAT was created to enhance this teaching-learning process. The article introduces the architectural and design principles required for its implementation, such as the language symbol table, the grammatical rules and the basic algorithms that RAT uses to translate from relational algebra to the SQL language. The tool has been used for one period and has proved effective in the teaching-learning process. This urged the investigators to publish it on the web site www.slinfo.una.ac.cr so that it can be used in other university courses.
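As a flavour of the kind of translation RAT performs, the sketch below maps the two most basic relational algebra operators, projection (π) and selection (σ), to SQL subqueries. The mini-API here is invented for this example and is not RAT's actual input language or architecture.

```python
def project(columns, relation_sql):
    """pi_columns(R)  ->  SELECT columns FROM (R)"""
    return f"SELECT {', '.join(columns)} FROM ({relation_sql}) AS t"

def select(condition, relation_sql):
    """sigma_condition(R)  ->  SELECT * FROM (R) WHERE condition"""
    return f"SELECT * FROM ({relation_sql}) AS t WHERE {condition}"

# pi_name,grade( sigma_grade>80( Students ) )
sql = project(["name", "grade"], select("grade > 80", "SELECT * FROM Students"))
print(sql)
```

A real translator, as the abstract describes, additionally needs a symbol table and a grammar so that it can parse the algebra expression and check that the referenced relations and attributes exist.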

Relevance: 10.00%

Abstract:

A syntax-directed package for converting Revised Algol 68 programs into Algol 68-R form (where possible) is being developed at Nottingham. The package makes use of J.M. Foster's Syntax Improving Device (SID) [1]. The experience gained has underlined the value of a syntactic approach to problems of this sort: a far wider range of constructs can be translated than would ever be possible using ad hoc methods. In many respects the difficulties encountered are those of conventional compiler writing, but some intriguing new problems arise when, as in this case, the source language and target language differ relatively little in philosophy and appearance.

Relevance: 10.00%

Abstract:

The wide adoption of the Internet Protocol (IP) as the de facto protocol for most communication networks has established a need for developing IP-capable data link layer protocol solutions for Machine-to-Machine (M2M) and Internet of Things (IoT) networks. However, the wireless networks used for M2M and IoT applications usually lack the resources commonly associated with modern wireless communication networks. The existing IP-capable data link layer solutions for wireless IoT networks provide the necessary overhead-minimising and frame-optimising features, but are often built to be compatible only with IPv6 and specific radio platforms. The objective of this thesis is to design an IPv4-compatible data link layer for Netcontrol Oy's narrowband half-duplex packet data radio system. Based on extensive literature research, system modelling and solution concept testing, this thesis proposes the use of the tunslip protocol as the basis for the system's data link layer protocol development. In addition to the functionality of tunslip, this thesis discusses the additional network, routing, compression, security and collision avoidance changes required in the radio platform for it to be IP-compatible while still maintaining its point-to-multipoint and multi-hop network characteristics. The data link layer design consists of the radio application, a dynamic Maximum Transmission Unit (MTU) optimisation daemon and the tunslip interface. The proposed design uses tunslip to create an IP-capable data link protocol interface. The radio application receives data from tunslip, compresses the packets and uses the IP addressing information for radio network addressing and routing before forwarding the message to the radio network. The dynamic MTU size optimisation daemon adjusts the maximum MTU size of the tunslip interface according to a link quality assessment calculated from radio network diagnostic data received from the radio application.
To determine the usability of tunslip as the basis for the data link layer protocol, testing of the tunslip interface is conducted with both IEEE 802.15.4 radios and packet data radios. The test cases measure radio network usability for User Datagram Protocol (UDP) based applications without applying any header or content compression. The test results for the packet data radios reveal that the typical success rate for packet reception over a single-hop link is above 99%, with a round-trip delay of 0.315 s for 63-byte packets.
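On its serial side, tunslip frames IP packets using SLIP (RFC 1055): a frame is terminated by an END byte, and END/ESC bytes occurring inside the packet are escaped. As background to the interface discussed above, here is a minimal, self-contained sketch of the SLIP encoder/decoder pair; it is independent of the thesis's actual radio application.

```python
# SLIP special bytes from RFC 1055.
END, ESC, ESC_END, ESC_ESC = 0xC0, 0xDB, 0xDC, 0xDD

def slip_encode(packet: bytes) -> bytes:
    out = bytearray([END])              # leading END flushes any line noise
    for b in packet:
        if b == END:
            out += bytes([ESC, ESC_END])
        elif b == ESC:
            out += bytes([ESC, ESC_ESC])
        else:
            out.append(b)
    out.append(END)                     # frame terminator
    return bytes(out)

def slip_decode(frame: bytes) -> bytes:
    out, esc = bytearray(), False
    for b in frame:
        if esc:                         # previous byte was ESC
            out.append(END if b == ESC_END else ESC)
            esc = False
        elif b == ESC:
            esc = True
        elif b != END:                  # END bytes delimit, carry no data
            out.append(b)
    return bytes(out)

pkt = bytes([0x45, 0xC0, 0xDB, 0x01])   # contains both special bytes
assert slip_decode(slip_encode(pkt)) == pkt
```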

Relevance: 10.00%

Abstract:

Exceptions are an important feature of modern programming languages, but their compilation has traditionally been viewed as an advanced topic. In this article we show that the basic method of compiling exceptions using stack unwinding can be explained and verified both simply and precisely, using elementary functional programming techniques. In particular, we develop a compiler for a small language with exceptions, together with a proof of its correctness.
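The stack-unwinding scheme described above can be sketched concretely. The paper develops its compiler and correctness proof in Haskell; the Python sketch below re-creates the basic idea, with instruction names (PUSH/ADD/THROW/MARK/UNMARK) following the usual presentation and all other details invented for illustration.

```python
# Source language: ('val', n) | ('add', l, r) | ('throw',) | ('catch', body, handler)
# comp threads the continuation code c through the compiler, so the handler
# code saved by MARK already contains the code to run after the catch.

def comp(e, c):
    tag = e[0]
    if tag == 'val':
        return [('PUSH', e[1])] + c
    if tag == 'add':
        return comp(e[1], comp(e[2], [('ADD',)] + c))
    if tag == 'throw':
        return [('THROW',)]             # the rest of the code is abandoned
    if tag == 'catch':
        return [('MARK', comp(e[2], c))] + comp(e[1], [('UNMARK',)] + c)

def run(code, stack=None):
    stack = [] if stack is None else stack
    while code:
        op, code = code[0], code[1:]
        if op[0] == 'PUSH':
            stack.append(('VAL', op[1]))
        elif op[0] == 'ADD':
            b = stack.pop()[1]
            a = stack.pop()[1]
            stack.append(('VAL', a + b))
        elif op[0] == 'MARK':           # push a handler mark onto the stack
            stack.append(('HAN', op[1]))
        elif op[0] == 'UNMARK':         # normal exit: discard the unused mark
            v = stack.pop()
            stack.pop()
            stack.append(v)
        elif op[0] == 'THROW':          # unwind to the nearest handler mark
            while stack and stack[-1][0] != 'HAN':
                stack.pop()
            if not stack:
                return None             # uncaught exception
            code = stack.pop()[1]       # continue with the saved handler code
    return stack[-1][1] if stack else None

# catch (1 + throw) handle 42  evaluates to 42
print(run(comp(('catch', ('add', ('val', 1), ('throw',)), ('val', 42)), [])))  # → 42
```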

Relevance: 10.00%

Abstract:

In previous work we showed how to verify a compiler for a small language with exceptions. In this article we show how to calculate, as opposed to verify, an abstract machine for this language. The key step is the use of Reynolds' defunctionalization, an old program transformation technique that has recently been rejuvenated by the work of Danvy et al.
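Defunctionalization replaces the functions a higher-order program passes around with first-order data plus an `apply` function. A minimal illustration (not the paper's actual derivation, which calculates an abstract machine): a CPS sum over a list, and the same function with its continuation lambdas replaced by tagged tuples.

```python
# Higher-order version: the continuation k is a real function.
def sum_cps(xs, k=lambda v: v):
    if not xs:
        return k(0)
    return sum_cps(xs[1:], lambda v: k(xs[0] + v))

# Defunctionalized version: each lambda becomes a data constructor.
# ('ID',) stands for the identity continuation;
# ('ADD', x, k) for "add x to the result, then continue with k".
def sum_defun(xs, k=('ID',)):
    if not xs:
        return apply_k(k, 0)
    return sum_defun(xs[1:], ('ADD', xs[0], k))

def apply_k(k, v):
    """Interpret a defunctionalized continuation applied to value v."""
    if k[0] == 'ID':
        return v
    _, x, rest = k
    return apply_k(rest, x + v)

print(sum_cps([1, 2, 3]), sum_defun([1, 2, 3]))  # → 6 6
```

The defunctionalized continuation is just a list-shaped data structure, which is exactly why the transformation turns a compositional interpreter into a first-order abstract machine.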

Relevance: 10.00%

Abstract:

This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence, higher-level sub-populations search a larger search space with a lower resolution, whilst lower-level sub-populations search a smaller search space with a higher resolution. The effects of different partner selection schemes amongst the agents on solution quality are examined for two multiple-choice optimisation problems. It is shown that partnering strategies that exploit problem-specific knowledge are superior and can counter inappropriate (sub-)fitness measurements.
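The partnering idea can be sketched simply: an agent holding part of a solution only receives a (sub-)fitness once paired with a partner supplying the rest. The names, the toy objective and the two strategies below are illustrative assumptions, not the paper's actual experimental setup.

```python
import random

def fitness(full_solution):
    # Toy objective: maximise the number of 1-bits in the combined solution.
    return sum(full_solution)

def evaluate(ind, partner_pool, strategy='random'):
    """Sub-fitness of `ind` (half a solution) under a partnering strategy."""
    if strategy == 'random':
        partner = random.choice(partner_pool)       # blind sampling
    elif strategy == 'best':
        partner = max(partner_pool, key=sum)        # problem-specific greed
    else:
        raise ValueError(strategy)
    return fitness(ind + partner)

random.seed(1)
pool = [[0, 1, 0], [1, 1, 1], [0, 0, 1]]
print(evaluate([1, 0, 1], pool, 'best'))  # → 5
```

With 'best' partnering the individual is always judged against the strongest complement, whereas 'random' partnering produces noisier but more diverse sub-fitness estimates; the trade-off between these is exactly what the abstracts in this series investigate.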

Relevance: 10.00%

Abstract:

This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence, higher-level sub-populations search a larger search space with a lower resolution, whilst lower-level sub-populations search a smaller search space with a higher resolution. The effects of different partner selection schemes for (sub-)fitness evaluation purposes are examined for two multiple-choice optimisation problems. It is shown that random partnering strategies perform best by providing better sampling and more diversity.

Relevance: 10.00%

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A.; SOUZA NETO, Plácido A. JCML: A specification language for the runtime verification of Java Card programs. Science of Computer Programming. [S.l.]: [s.n.], 2010.

Relevance: 10.00%

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.]: [s.n.], 2009.

Relevance: 10.00%

Abstract:

This paper examines the use of a hierarchical coevolutionary genetic algorithm under different partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence higher-level sub-populations potentially search a larger search space with a lower resolution whilst lower-level sub-populations search a smaller search space with a higher resolution. The effects of different partner selection schemes amongst the sub-populations on solution quality are examined for two constrained optimisation problems. We examine a number of recombination partnering strategies in the construction of higher-level individuals and a number of related schemes for evaluating sub-solutions. It is shown that partnering strategies that exploit problem-specific knowledge are superior and can counter inappropriate (sub-) fitness measurements.

Relevance: 10.00%

Abstract:

When designing a new passenger ship or naval vessel, or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models are now recognised by IMO through the publication of the Interim Guidelines for Evacuation Analysis of New and Existing Passenger Ships including Ro-Ro. This approach offers the promise to quickly and efficiently bring evacuation considerations into the design phase, while the ship is "on the drawing board", as well as reviewing and optimising the evacuation provision of the existing fleet. Other applications of this technology include the optimisation of operating procedures for civil and naval vessels, such as determining the optimal location of a feature such as a casino, organising major passenger movement events such as boarding/disembarkation or restaurant/theatre changes, and determining lean manning requirements or the location and number of damage control parties. This paper describes the development of the maritimeEXODUS evacuation model, which is fully compliant with IMO requirements, and briefly presents an example application to a large passenger ferry.

Relevance: 10.00%

Abstract:

This fishery assessment report describes the commercial stout whiting fishery operating along Australia's east coast between Sandy Cape and the Queensland-New South Wales border. The fishery is identified by a T4 symbol. This study follows the methods applied by O'Neill and Leigh (2016a) and extends the results of that study by using the latest data available up to the end of March 2016. The fishery statistics reported herein are for fishing years 1991 to 2016. This study analysed stout whiting catch rates from both Queensland and New South Wales (NSW) for all vessels, areas and fishing gears. The 2016 catch rate index from Queensland and NSW waters was 0.86, meaning that the 2016 catch rate was 86% of the mean standardised catch rate. Results showed a stable trend in catch rates from 2012 to 2016, as in the previous study (O'Neill & Leigh, 2016a), with the 2015 and 2014 catch rates at 85% of the mean. The fish-length frequency and age-length-otolith data were analysed using two models, which showed:
• Where patterns of fish age-abundance were estimated from the fish-length frequency and age-length data, the estimated measures of fish survival decreased slightly to 38% for 2014, compared with survival estimates of 40% in 2013. The 2014 and 2015 estimated age structure was dominated by 1+ and 2+ year-old fish, with a slightly higher frequency of age 2-3 fish in 2015.
• Where only the age-length data were used, estimates showed that the survival index increased from 2011 to 2014, rising from 35% in 2013 to 64% in 2014 and indicating stronger survival of fish as they recruited and aged.
Together, the stout whiting catch rate and survival indicators showed that recent fishery harvests were sustainable. Since 1997, management of the T4 Stout Whiting Fishery has centred on annual assessments of the total allowable commercial catch (TACC). The TACC is assessed before the start of each fishing year using statistical assessment methodologies, namely evaluation of trends in fish catch rates and catch-at-age frequencies measured against management reference points. The TACC has been under-caught in many years. For setting the 2017 T4 stout whiting TACC, the calculations covered a range of settings to account for the variance in the data and to provide options for quota change. The overall (averaged) results suggested:
• The procedure in which the quota was adjusted based on the previous TACC setting in 2016 gave a recommended TACC for 2017 of between 1100 and 1130 t.
• The procedure that focussed directly on optimising the average harvest to match target reference points gave a recommended TACC for 2017 of between 860 and 890 t.
Use of these estimates to set the TACC will depend on management and industry aims for the fishery.