978 results for Optimizing Compilation
Abstract:
Effectively using heterogeneous, distributed information has attracted much research in recent years. Current web services technologies have been used successfully in some distributed prototype systems that are not data intensive; however, most of them cannot work well in data-intensive environments. This paper provides an infrastructure layer for effectively providing spatial information services in a data-intensive environment by using web services over the Internet. We extensively investigate and analyze the overhead of web services in a data-intensive environment and propose new optimization techniques that can greatly increase the system's efficiency. Our experiments show that these techniques are well suited to data-intensive environments. Finally, we present the requirements these techniques place on web services information over the Internet.
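The abstract above names overhead reduction only in general terms. As a minimal sketch of one such technique, the toy model below shows how batching many small service calls into fewer large ones can amortize per-request framing overhead; all names and cost constants are invented for illustration.

```python
# Toy cost model (all constants invented): amortizing per-call web-service
# overhead by grouping data items into batches.

PER_CALL_OVERHEAD = 0.050   # seconds of HTTP/SOAP framing per request (assumed)
PER_ITEM_COST = 0.002       # seconds to serialize one spatial record (assumed)

def naive_cost(n_items: int) -> float:
    """Total time when each data item is fetched with its own service call."""
    return n_items * (PER_CALL_OVERHEAD + PER_ITEM_COST)

def batched_cost(n_items: int, batch_size: int) -> float:
    """Total time when items are grouped so framing is paid once per batch."""
    n_batches = -(-n_items // batch_size)   # ceiling division
    return n_batches * PER_CALL_OVERHEAD + n_items * PER_ITEM_COST

if __name__ == "__main__":
    n = 10_000
    print(f"one call per item: {naive_cost(n):.1f} s")
    print(f"batches of 100:    {batched_cost(n, 100):.1f} s")
```

Under these assumed constants, batching 10,000 items into groups of 100 cuts the modelled cost roughly twenty-fold, which is the qualitative effect the abstract's techniques target.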
Abstract:
High-level language program compilation strategies can be proven correct by modelling the process as a series of refinement steps from source code to a machine-level description. We show how this can be done for programs containing recursively defined procedures in the well-established predicate transformer semantics for refinement. To do so, the formalism is extended with an abstraction of the way stack frames are created at run time for procedure parameters and variables.
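As an informal illustration of the idea (not the paper's predicate transformer formalism), the sketch below contrasts a source-level recursive procedure with a refined version in which the run-time stack of frames, one per call holding its parameter, is explicit.

```python
# Illustrative only (not the paper's formalism): a recursive source procedure
# and a refined version that makes the run-time stack of frames explicit.

def fact_source(n: int) -> int:
    """Source-level recursively defined procedure."""
    return 1 if n == 0 else n * fact_source(n - 1)

def fact_refined(n: int) -> int:
    """Machine-level view: each recursive call pushes a frame holding its
    parameter; returns are played back by popping frames."""
    call_stack = []
    while n > 0:                      # descend: one frame per call
        call_stack.append({"n": n})
        n -= 1
    result = 1                        # base case: fact(0) = 1
    while call_stack:                 # unwind: each pop plays a return
        result *= call_stack.pop()["n"]
    return result

assert all(fact_refined(k) == fact_source(k) for k in range(10))
```

Proving that the two versions agree for all inputs is, informally, what the refinement steps establish once stack-frame creation is abstracted into the formalism.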
Abstract:
Presence-absence surveys are a commonly used method for monitoring broad-scale changes in wildlife distributions. However, the lack of power of these surveys for detecting population trends is problematic for their application in wildlife management. Options for improving power include increasing the sampling effort or arbitrarily relaxing the type I error rate. We present an alternative, whereby targeted sampling of particular habitats in the landscape using information from a habitat model increases power. The advantage of this approach is that it does not require a trade-off with either cost or the Pr(type I error) to achieve greater power. We use a demographic model of koala (Phascolarctos cinereus) population dynamics and simulations of the monitoring process to estimate the power to detect a trend in occupancy for a range of strategies, thereby demonstrating that targeting particular habitat qualities can improve power substantially. If the objective is to detect a decline in occupancy, the optimal strategy is to sample high-quality habitats. Alternatively, if the objective is to detect an increase in occupancy, the optimal strategy is to sample intermediate-quality habitats. The strategies with the highest power remained the same under a range of parameter assumptions, although observation error had a strong influence on the optimal strategy. Our approach specifically applies to monitoring for detecting long-term trends in occupancy or abundance. This is a common and important monitoring objective for wildlife managers, and we provide guidelines for more effectively achieving it.
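A simplified Monte Carlo sketch of the kind of power simulation described, using only the standard library; the occupancy values, decline rate, and one-sided two-proportion z-test are illustrative assumptions, not the koala demographic model of the study.

```python
# Simplified power simulation (parameters invented): survey n_sites
# presence-absence sites in the first and last year of a monitoring
# program, then test for a drop in occupancy.
import random
from math import sqrt

def simulate_power(p0=0.6, decline=0.05, n_sites=200, n_years=10,
                   alpha_z=1.645, n_sims=500, seed=1):
    """Fraction of simulations in which a one-sided two-proportion z-test
    (normal approximation) detects the decline in occupancy."""
    random.seed(seed)
    p_first = p0
    p_last = max(p0 - decline * (n_years - 1), 0.0)
    detections = 0
    for _ in range(n_sims):
        x1 = sum(random.random() < p_first for _ in range(n_sites))
        x2 = sum(random.random() < p_last for _ in range(n_sites))
        pbar = (x1 + x2) / (2 * n_sites)
        se = sqrt(max(2 * pbar * (1 - pbar) / n_sites, 1e-12))
        z = (x1 - x2) / n_sites / se
        if z > alpha_z:               # one-sided test at alpha = 0.05
            detections += 1
    return detections / n_sims

if __name__ == "__main__":
    print("power with 5%/yr decline:", simulate_power(decline=0.05))
    print("false-positive rate:     ", simulate_power(decline=0.0))
```

Targeted sampling of particular habitat qualities would enter such a simulation through the occupancy probabilities of the sites chosen; the study's point is that this choice, not just `n_sites`, drives the achievable power.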
Abstract:
Purpose. This article explores the experiences of 26 assistive technology (AT) users with a range of physical impairments as they optimized their use of technology in the workplace. Method. A qualitative research design was employed using in-depth, open-ended interviews and observations of AT users in the workplace. Results. Participants identified many factors that limited their use of technology, such as discomfort and pain, limited knowledge of the technology's features, and the complexity of the technology. The amount of time required for training, the limited work time available for mastery, the cost of training, and the limitations of the training provided resulted in an over-reliance on trial and error and informal support networks, and a sense of isolation. AT users enhanced their use of technology by addressing the ergonomics of the workstation and customizing the technology to address individual needs and strategies. Other key strategies included tailored training and learning support, as well as opportunities to practice using the technology and explore its features away from work demands. Conclusions. This research identified structures important for effective AT use in the workplace that need to be put in place to ensure that AT users are able to master and optimize their use of technology.
Abstract:
In this paper we extend the conventional framework of program refinement down to the assembler level. We describe an extension to the Refinement Calculus that supports the refinement of programs in the Guarded Command Language to programs in .NET assembler. This is illustrated by a small example.
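As a hypothetical illustration of the final refinement step, the helper below maps a one-line Guarded Command Language assignment onto a CIL-style instruction list. The function name and shape are invented; the paper's actual development is a formal refinement in the Refinement Calculus, not a translator.

```python
# Hypothetical helper (name and API invented): the GCL assignment
# "x := x + 1" refined to a .NET CIL-style stack-machine instruction list.

def refine_increment(var: str) -> list:
    """Emit CIL-style code for `var := var + 1`."""
    return [
        f"ldloc {var}",   # push the current value of the local variable
        "ldc.i4.1",       # push the 32-bit integer constant 1
        "add",            # pop both operands, push their sum
        f"stloc {var}",   # pop the sum back into the local variable
    ]

print(refine_increment("x"))
```

Each refinement step would be justified by showing that the instruction sequence satisfies the same specification as the guarded command it replaces.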
Abstract:
Purpose – The purpose of this paper is to investigate the optimization for a placement machine in printed circuit board (PCB) assembly when family setup strategy is adopted. Design/methodology/approach – A complete mathematical model is developed for the integrated problem to optimize feeder arrangement and component placement sequences so as to minimize the makespan for a set of PCB batches. Owing to the complexity of the problem, a specific genetic algorithm (GA) is proposed. Findings – The established model is able to find the minimal makespan for a set of PCB batches through determining the feeder arrangement and placement sequences. However, exact solutions to the problem are not practical due to the complexity. Experimental tests show that the proposed GA can solve the problem both effectively and efficiently. Research limitations/implications – When a placement machine is set up for production of a set of PCB batches, the feeder arrangement of the machine together with the component placement sequencing for each PCB type should be solved simultaneously so as to minimize the overall makespan. Practical implications – The paper investigates the optimization for PCB assembly with family setup strategy, which is adopted by many PCB manufacturers for reducing both setup costs and human errors. Originality/value – The paper investigates the feeder arrangement and placement sequencing problems when family setup strategy is adopted, which has not been studied in the literature.
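A minimal sketch of the kind of genetic algorithm described, reduced to the placement-sequencing half of the problem (feeder arrangement omitted); the board coordinates, operators, and parameters are illustrative assumptions, not the paper's GA.

```python
# Toy GA for component-placement sequencing (feeder arrangement omitted):
# minimize rectilinear head travel over a sequence of board positions.
import random

def tour_length(seq, coords):
    """Total rectilinear head travel for a placement sequence."""
    return sum(abs(coords[a][0] - coords[b][0]) + abs(coords[a][1] - coords[b][1])
               for a, b in zip(seq, seq[1:]))

def order_crossover(p1, p2):
    """OX crossover: copy a slice from p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = iter(g for g in p2 if g not in child)
    return [g if g is not None else next(fill) for g in child]

def ga_sequence(coords, pop=40, gens=200, mut=0.2, seed=0):
    """Evolve a low-travel placement order for the given component coords."""
    random.seed(seed)
    n = len(coords)
    population = [random.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda s: tour_length(s, coords))
        survivors = population[:pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            c = order_crossover(*random.sample(survivors, 2))
            if random.random() < mut:              # swap mutation
                i, j = random.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        population = survivors + children
    return min(population, key=lambda s: tour_length(s, coords))

if __name__ == "__main__":
    coords = [(i, 0) for i in range(8)]            # 8 collinear pads (toy board)
    best = ga_sequence(coords)
    print(best, tour_length(best, coords))
```

The paper's integrated model additionally couples this sequencing decision to feeder positions across a whole family of PCB batches, which is what makes exact solution impractical.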
Abstract:
PURPOSE: To determine whether letter sequences and/or lens-presentation order should be randomized when measuring defocus curves and to assess the most appropriate criterion for calculating the subjective amplitude of accommodation (AoA) from defocus curves. SETTING: Eye Clinic, School of Life & Health Sciences, Aston University, Birmingham, United Kingdom. METHODS: Defocus curves (from +3.00 diopters [D] to -3.00 D in 0.50 D steps) for 6 possible combinations of randomized or nonrandomized letter sequences and/or lens-presentation order were measured in a random order in 20 presbyopic subjects. Subjective AoA was calculated from the defocus curves by curve fitting using various published criteria, and each was correlated to subjective push-up AoA. Objective AoA was measured for comparison of blur tolerance and pupil size. RESULTS: Randomization of lens-presentation order and/or letter sequences, or the lack thereof, did not affect the measured defocus curves (P>.05, analysis of variance). The range of defocus that maintained the highest achievable visual acuity (allowing for variability of repeated measurement) correlated better with (r = 0.84) and agreed better with (±0.50 D) subjective push-up AoA than any other relative or absolute acuity criterion used in previous studies. CONCLUSIONS: Nonrandomized letters and lens presentation on their own did not affect subjective AoA measured by defocus curves, although their combination should be avoided. Quantification of subjective AoA from defocus curves should be standardized to the range of defocus that maintains the best achievable visual acuity.
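A small sketch of the recommended criterion: take the range of defocus over which acuity stays within measurement variability of the best level achieved. The tolerance value and the sample curve below are invented for illustration.

```python
# Sketch of the recommended criterion (tolerance and data invented): the
# subjective AoA is the width of the defocus range whose acuity stays within
# repeated-measurement variability of the best level achieved.

def subjective_aoa(defocus, logmar, tolerance=0.04):
    """Width (in D) of the defocus range whose logMAR acuity is within
    `tolerance` of the best (lowest) value on the curve."""
    best = min(logmar)
    kept = [d for d, v in zip(defocus, logmar) if v <= best + tolerance]
    return max(kept) - min(kept)

# Hypothetical defocus curve sampled from -3.00 D to +3.00 D in 0.50 D steps.
defocus = [d * 0.5 for d in range(-6, 7)]
logmar = [0.5, 0.4, 0.3, 0.2, 0.1, 0.02, 0.0, 0.02, 0.1, 0.2, 0.3, 0.4, 0.5]
print(subjective_aoa(defocus, logmar), "D")
```

Unlike criteria based on an absolute acuity cut-off, this one adapts to each subject's best achievable acuity, which is why it can agree more closely with push-up AoA.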
Abstract:
This paper presents a Decision Support System framework based on Constraint Logic Programming and offers suggestions for using RFID technology to improve several of the critical procedures involved. It suggests that a widely distributed and semi-structured network of waste-producing and waste-collecting/processing enterprises can improve their planning both by adopting the proposed Decision Support System and by implementing RFID technology to update and validate information in a continuous manner. © 2010 IEEE.
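As a hypothetical sketch of the kind of constraint problem such a system solves, the backtracking search below assigns waste loads (quantities that RFID reads would keep current, here hard-coded) to processing plants without exceeding capacity; a CLP system would state the same constraints declaratively rather than search by hand.

```python
# Hypothetical model (all names and numbers invented): backtracking assignment
# of waste loads to plants under capacity constraints; load quantities are the
# kind of data continuous RFID reads would keep up to date.

def assign(loads, capacity, i=0, plan=None):
    """Return a load -> plant mapping respecting capacities, or None."""
    if plan is None:
        plan = {}
    if i == len(loads):
        return dict(plan)
    name, qty = loads[i]
    for plant in capacity:
        if qty <= capacity[plant]:
            capacity[plant] -= qty          # tentatively place the load
            plan[name] = plant
            result = assign(loads, capacity, i + 1, plan)
            if result is not None:
                return result
            capacity[plant] += qty          # undo and try the next plant
            del plan[name]
    return None

loads = [("L1", 40), ("L2", 35), ("L3", 30)]        # tonnes per tagged load
capacity = {"PlantA": 75, "PlantB": 40}             # remaining plant capacity
print(assign(loads, dict(capacity)))
```

Stale quantities are exactly what breaks such a plan in practice, which is why the paper pairs the solver with RFID-based validation of the input data.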