925 results for Subroutines in Procedural Programming Languages


Relevance:

50.00%

Publisher:

Abstract:

Abstract machines provide a certain separation between platform-dependent and platform-independent concerns in compilation. Many of the differences between architectures are encapsulated in the specific abstract machine implementation, and the bytecode is left largely architecture independent. Taking advantage of this fact, we present a framework for estimating upper and lower bounds on the execution times of logic programs running on a bytecode-based abstract machine. Our approach includes a one-time, program-independent profiling stage which calculates constants or functions bounding the execution time of each abstract machine instruction. Then, a compile-time cost estimation phase, using the instruction timing information, infers expressions giving platform-dependent upper and lower bounds on actual execution time as functions of input data sizes for each program. Working at the abstract machine level makes it possible to take into account low-level issues in new architectures and platforms by just re-executing the calibration stage instead of having to tailor the analysis for each architecture and platform. Applications of such predicted execution times include debugging/verification of time properties, certification of time properties in mobile code, granularity control in parallel/distributed computing, and resource-oriented specialization.
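A minimal sketch of the two-phase idea this abstract describes: a one-time calibration gives per-instruction time bounds, and a compile-time analysis supplies instruction execution counts. The instruction names and timing constants below are hypothetical, not taken from the paper.

```python
# Illustrative sketch: combine calibrated per-instruction timing bounds
# with instruction execution counts to bound a program's running time.
# Program-independent calibration results for the target platform, in ns
# (hypothetical instruction names and numbers).
CALIBRATED_COST_NS = {
    "get_constant": (4.0, 6.5),
    "put_value":    (3.0, 5.0),
    "call":         (12.0, 20.0),
    "proceed":      (5.0, 9.0),
}

def execution_time_bounds(instruction_counts):
    """Return (lower, upper) bounds in ns on total execution time, given
    how many times each instruction executes for some input size."""
    lo = sum(n * CALIBRATED_COST_NS[ins][0] for ins, n in instruction_counts.items())
    hi = sum(n * CALIBRATED_COST_NS[ins][1] for ins, n in instruction_counts.items())
    return lo, hi

# Counts that a cost analysis might infer for an input of size n = 100:
print(execution_time_bounds({"get_constant": 300, "call": 101, "proceed": 101}))
```

Re-running only the calibration step on a new platform updates CALIBRATED_COST_NS while the count analysis stays unchanged, which is the portability argument made above.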

Relevance:

50.00%

Publisher:

Abstract:

Incorporating the possibility of attaching attributes to variables in a logic programming system has been shown to allow the addition of general constraint solving capabilities to it. This approach is very attractive in that by adding a few primitives any logic programming system can be turned into a generic constraint logic programming system in which constraint solving can be user defined, and at source level - an extreme example of the "glass box" approach. In this paper we propose a different and novel use for the concept of attributed variables: developing a generic parallel/concurrent (constraint) logic programming system, using the same "glass box" flavor. We argue that a system which implements attributed variables and a few additional primitives can be easily customized at source level to implement many of the languages and execution models of parallelism and concurrency currently proposed, in both shared memory and distributed systems. We illustrate this through examples and report on an implementation of our ideas.
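A rough Python analogy (not Prolog source, and not the paper's implementation) of the attributed-variable "glass box": a variable carries user-defined attributes, and a user-supplied hook runs whenever something tries to bind it. That hook is the point where constraint checking, or the communication and synchronization proposed here, can be layered in at source level.

```python
# Rough analogy of an attributed variable: attributes attach user data to
# an unbound variable, and user-defined hooks run before any binding.

class AttributedVar:
    def __init__(self):
        self.value = None        # unbound until unified with a value
        self.attributes = {}     # attribute-module name -> attribute term
        self.hooks = []          # callbacks consulted before binding

    def put_attr(self, module, attr, hook):
        self.attributes[module] = attr
        self.hooks.append(hook)

    def unify(self, value):
        for hook in self.hooks:  # each hook may veto the binding or
            hook(self, value)    # trigger effects such as waking a goal
        self.value = value

# A toy "constraint" attribute: only even numbers may bind to X.
def even_only(var, value):
    if value % 2 != 0:
        raise ValueError("constraint violated: %r is not even" % value)

X = AttributedVar()
X.put_attr("evenness", "even", even_only)
X.unify(4)        # succeeds; X.value == 4
# X.unify(3)      # would raise ValueError: constraint violated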

Relevance:

50.00%

Publisher:

Abstract:

Incorporating the possibility of attaching attributes to variables in a logic programming system has been shown to allow the addition of general constraint solving capabilities to it. This approach is very attractive in that by adding a few primitives any logic programming system can be turned into a generic constraint logic programming system in which constraint solving can be user defined, and at source level - an extreme example of the "glass box" approach. In this paper we propose a different and novel use for the concept of attributed variables: developing a generic parallel/concurrent (constraint) logic programming system, using the same "glass box" flavor. We argue that a system which implements attributed variables and a few additional primitives can be easily customized at source level to implement many of the languages and execution models of parallelism and concurrency currently proposed, in both shared memory and distributed systems. We illustrate this through examples.

Relevance:

50.00%

Publisher:

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system, and later deployment, every piece of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviating such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with symbolic expressions as input arguments rather than concrete values. This thesis builds on a constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, which is then executed symbolically using the standard evaluation mechanisms of constraint logic programming (CLP), extended with special operations for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs.

The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed that alleviates inter-procedural path explosion by analyzing each component (method) of the program under test separately, storing the results as method summaries, and incrementally reusing them to obtain whole-program results. A similar compositional strategy, based on program specialization (partial evaluation), is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource-consumption information to drive symbolic execution towards the parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of parts that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the search, according to structural selection criteria, through the parts that interest the programmer/tester most. (4) A new heap-constraint solver is proposed that efficiently handles constraints on dynamically allocated memory (the heap) and aliasing of references during symbolic execution, greatly outperforming the standard technique known as lazy initialization. (5) All of the above techniques have been implemented in the PET system (the compositional approach also in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
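The enabling technique throughout is symbolic execution: run the program over symbolic inputs, fork at every branch, and collect one path constraint per explored path; solving each constraint yields a concrete test input for that path. A tiny self-contained illustration (in Python rather than the thesis's CLP setting, and not the PET/SPF implementation; the program and constraints are made up):

```python
# Tiny illustration of symbolic execution over the toy program
#   if x > 0: (if x > 10: return "A" else: return "B") else: return "C"
# Each explored path accumulates a path condition on the symbolic input x.

def symbolic_execute(branch_decisions):
    """Follow one path, fixing the outcome of each branch in order, and
    record the constraints on x that make that path feasible."""
    pc = []                       # path condition: constraints on x
    take = iter(branch_decisions)
    if next(take):                # branch: x > 0 ?
        pc.append("x > 0")
        if next(take):            # branch: x > 10 ?
            pc.append("x > 10")
            return pc, "A"
        pc.append("x <= 10")
        return pc, "B"
    pc.append("x <= 0")
    return pc, "C"

# Explore every feasible combination of branch decisions. The number of
# paths grows exponentially with the number of branches, which is the
# scalability problem the dissertation attacks.
for decisions in ([True, True], [True, False], [False]):
    pc, result = symbolic_execute(decisions)
    print(" and ".join(pc), "->", result)
```

Handing each printed path condition to a constraint solver produces one concrete test input per path, e.g. x = 11 for "A", x = 1 for "B", x = 0 for "C".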

Relevance:

50.00%

Publisher:

Abstract:

The main purpose of this work is to describe the case of an online Java Programming course for engineering students to learn computer programming and to practice other non-technical abilities: online training, self-assessment, teamwork and use of foreign languages. It is important that students develop confidence and competence in these skills, which will be required later in their professional tasks and/or in other engineering courses (life-long learning). Furthermore, this paper presents the pedagogical methodology, the results drawn from this experience and an objective performance comparison with another conventional (face-to-face) Java course.

Relevance:

50.00%

Publisher:

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.

Relevance:

40.00%

Publisher:

Abstract:

Background: Recent reviews have concluded that low-level laser therapy (LLLT) is ineffective in lateral elbow tendinopathy (LET), but they did so without assessing the validity of treatment procedures and doses or the influence of prior steroid injections. Methods: Systematic review with meta-analysis, with primary outcome measures of pain relief and/or global improvement and subgroup analyses of methodological quality, wavelengths and treatment procedures. Results: 18 randomised placebo-controlled trials (RCTs) were identified, with 13 RCTs (730 patients) meeting the criteria for meta-analysis. 12 RCTs satisfied half or more of the methodological criteria. Publication bias was detected by Egger's graphical test, which showed a negative direction of bias. Ten of the trials included patients with poor prognosis caused by failed steroid injections or other treatment failures, or long symptom duration or severe baseline pain. The weighted mean difference (WMD) for pain relief was 10.2 mm [95% CI: 3.0 to 17.5] and the RR for global improvement was 1.36 [1.16 to 1.60]. Trials which targeted acupuncture points reported negative results, as did trials with wavelengths of 820, 830 and 1064 nm. In a subgroup of five trials with 904 nm lasers and one trial with a 632 nm wavelength in which the lateral elbow tendon insertions were directly irradiated, the WMD for pain relief was 17.2 mm [95% CI: 8.5 to 25.9] and 14.0 mm [95% CI: 7.4 to 20.6] respectively, while the RR for global pain improvement was only reported for 904 nm, at 1.53 [95% CI: 1.28 to 1.83]. LLLT doses in this subgroup ranged between 0.5 and 7.2 joules. Secondary outcome measures of pain-free grip strength, pain pressure threshold, sick leave and follow-up data from 3 to 8 weeks after the end of treatment showed consistently significant results in favour of the same LLLT subgroup (p < 0.02). No serious side-effects were reported. Conclusion: LLLT administered with optimal doses at the 904 nm and possibly 632 nm wavelengths directly to the lateral elbow tendon insertions seems to offer short-term pain relief and less disability in LET, both alone and in conjunction with an exercise regimen. This finding contradicts the conclusions of previous reviews, which failed to assess treatment procedures, wavelengths and optimal doses.
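For readers unfamiliar with the pooled statistics quoted here: a weighted mean difference and its 95% CI are typically obtained by inverse-variance pooling of per-trial mean differences. A generic fixed-effect sketch, with invented trial figures rather than the review's data:

```python
# Generic fixed-effect inverse-variance pooling, the usual way a weighted
# mean difference (WMD) and its 95% CI are computed in a meta-analysis.
# The trial figures below are invented, not data from this review.
import math

def pooled_wmd(trials):
    """trials: list of (mean_difference_mm, standard_error_mm) per RCT."""
    weights = [1.0 / se ** 2 for _, se in trials]   # precision weights
    wmd = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return wmd, (wmd - 1.96 * se_pooled, wmd + 1.96 * se_pooled)

# Three hypothetical trials reporting pain relief on a 100 mm VAS:
wmd, ci = pooled_wmd([(15.0, 6.0), (20.0, 5.0), (12.0, 7.0)])
print(f"WMD = {wmd:.1f} mm, 95% CI [{ci[0]:.1f}, {ci[1]:.1f}]")
```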

Relevance:

40.00%

Publisher:

Abstract:

This paper addresses the non-preemptive single-machine scheduling problem of minimizing total tardiness. We are interested in the online version of this problem, in which orders arrive at the system at random times and jobs have to be scheduled without knowledge of what jobs will come afterwards. The processing times and due dates become known when the order is placed; orders are released only at the beginning of periodic intervals. A customized approximate dynamic programming method is introduced for this problem. The authors also present numerical experiments that assess the reliability of the new approach and show that it performs better than a myopic policy.
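As a baseline of the kind mentioned, a myopic policy can be sketched in a few lines: whenever the machine becomes free, dispatch the released job with the earliest due date. This concrete rule is an assumption for illustration; the abstract does not specify which myopic policy the paper uses.

```python
# Sketch of a myopic dispatching baseline for online single-machine
# scheduling: run the released job with the earliest due date, ignoring
# jobs that have not yet arrived (no lookahead, unlike the ADP method).
import heapq

def myopic_total_tardiness(jobs):
    """jobs: list of (release_time, processing_time, due_date)."""
    jobs = sorted(jobs)                   # by release time
    ready, t, tardiness, i = [], 0, 0, 0
    while i < len(jobs) or ready:
        if not ready and t < jobs[i][0]:
            t = jobs[i][0]                # idle until the next release
        while i < len(jobs) and jobs[i][0] <= t:
            r, p, d = jobs[i]
            heapq.heappush(ready, (d, p)) # earliest-due-date order
            i += 1
        d, p = heapq.heappop(ready)
        t += p                            # non-preemptive execution
        tardiness += max(0, t - d)
    return tardiness

print(myopic_total_tardiness([(0, 4, 6), (1, 2, 4), (5, 3, 7)]))
```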

Relevance:

40.00%

Publisher:

Abstract:

These notes follow on from the material that you studied in CSSE1000 Introduction to Computer Systems. There you studied details of logic gates, binary numbers and instruction set architectures using the Atmel AVR microcontroller family as an example. In your present course (METR2800 Team Project I), you need to get on to designing and building an application which will include such a microcontroller. These notes focus on programming an AVR microcontroller in C and provide a number of example programs to illustrate the use of some of the AVR peripheral devices.

Relevance:

40.00%

Publisher:

Abstract:

Background. Age-related motor slowing may reflect either motor programming deficits, poorer movement execution, or mere strategic preferences for online guidance of movement. We controlled such preferences, limiting the extent to which movements could be programmed. Methods. Twenty-four young and 24 older adults performed a line-drawing task that allowed movements to be prepared in advance in one case (i.e., a cue initially available indicating target location) and not in another (i.e., no cue initially available as to target location). Participants connected large or small targets illuminated by light-emitting diodes upon a graphics tablet that sampled pen-tip position at 200 Hz. Results. Older adults had disproportionate difficulty initiating movement when prevented from programming in advance. Older adults produced slower, less efficient movements, particularly when prevented from programming under greater precision requirements. Conclusions. The slower movements of older adults do not simply reflect a preference for online control, as older adults have less efficient movements when forced to reprogram their movements. Age-related motor slowing kinematically resembles that seen in patients with cerebellar dysfunction.

Relevance:

40.00%

Publisher:

Abstract:

This paper presents the unique collection of additional features of Qu-Prolog, a variant of the AI programming language Prolog, and illustrates how they can be used for implementing DAI applications. By this we mean applications comprising communicating information servers, expert systems, or agents, with sophisticated reasoning capabilities and internal concurrency. Such an application exploits the key features of Qu-Prolog: support for the programming of sound non-clausal inference systems, multi-threading, and high-level inter-thread message communication between Qu-Prolog query threads anywhere on the internet. The inter-thread communication uses email-style symbolic names for threads, allowing easy construction of distributed applications using public names for threads. How threads react to received messages is specified by a disjunction of reaction rules which the thread periodically executes. A communications API allows smooth integration of components written in C, which, to Qu-Prolog, look like remote query threads.
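A Python analogy (not Qu-Prolog syntax, and only an approximation of the model as this abstract describes it) of a named thread whose behaviour is a disjunction of reaction rules tried in order against each incoming message:

```python
# Python analogy of the reaction-rule model sketched in the abstract:
# a named thread owns a message queue; for each message it tries its
# reaction rules in order and fires the first whose pattern matches.
import queue
import threading
import time

class ReactiveThread(threading.Thread):
    def __init__(self, name, rules):
        super().__init__(name=name, daemon=True)
        self.inbox = queue.Queue()   # messages addressed to this thread's name
        self.rules = rules           # disjunction: list of (pattern, action)

    def run(self):
        while True:
            msg = self.inbox.get()   # block until a message arrives
            for matches, action in self.rules:
                if matches(msg):     # first matching rule reacts
                    action(msg)
                    break

broker = ReactiveThread("broker", [
    (lambda m: m[0] == "ping", lambda m: print("pong to", m[1])),
    (lambda m: True,           lambda m: print("unhandled:", m)),
])
broker.start()
broker.inbox.put(("ping", "client@remote.host"))
time.sleep(0.1)                      # give the daemon thread time to react
```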

Relevance:

40.00%

Publisher:

Abstract:

A chance-constrained programming model is developed to assist Queensland barley growers in making varietal and agronomic decisions in the face of changing product demands and volatile production conditions. Although unsuitable for, or overlooked in, many risk programming applications, the chance-constrained programming approach aptly captures the single-stage decision problem faced by barley growers of whether to plant lower-yielding but potentially higher-priced malting varieties, given a particular expectation of meeting malting grade standards. Different expectations greatly affect the optimal mix of malting and feed barley activities. The analysis highlights the suitability of chance-constrained programming to this specific class of farm decision problem.
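The core of the approach is a constraint that must hold with at least a given probability rather than with certainty. A toy sketch of the grower's decision, in which all prices, yields and the protein distribution are invented for illustration and are not the study's data:

```python
# Toy chance constraint: plant a malting variety only if
#   P(grain protein makes malting grade) >= alpha.
# For a normally distributed protein level this probability is a simple
# CDF evaluation, giving a deterministic equivalent of the constraint.
from statistics import NormalDist

p_grade = NormalDist(mu=11.0, sigma=1.5).cdf(12.0)  # P(protein <= 12%)
alpha = 0.7                                         # required confidence

price_malt, price_feed = 260.0, 210.0               # $/tonne (assumed)
yield_malt, yield_feed = 2.4, 3.0                   # tonnes/ha (assumed)

if p_grade >= alpha:
    # Admissible: loads that miss the grade are sold at the feed price.
    rev_malt = (p_grade * price_malt + (1 - p_grade) * price_feed) * yield_malt
else:
    rev_malt = float("-inf")                        # chance constraint fails
rev_feed = price_feed * yield_feed

print("plant malting" if rev_malt > rev_feed else "plant feed",
      f"(P(grade) = {p_grade:.2f})")
```

Varying alpha, the grower's required expectation of making grade, flips the decision between malting and feed activities, which is the sensitivity the abstract reports.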

Relevance:

40.00%

Publisher:

Abstract:

A modelling framework is developed to determine the joint economic and environmental net benefits of alternative land allocation strategies. Estimates of community preferences for the preservation of natural land, derived from a choice modelling study, are used as input to a model of agricultural production in an optimisation framework. The trade-offs between agricultural production and environmental protection are analysed using the sugar industry of the Herbert River district of north Queensland as an example. Spatially differentiated resource attributes and the opportunity costs of natural land determine the optimal trade-offs between production and conservation for a range of sugar prices.
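A stripped-down sketch of the joint-net-benefit idea, with every number hypothetical: value each split of the district's land as agricultural profit plus the community's willingness to pay for preserved natural land, with the WTP figure standing in for a choice-modelling estimate.

```python
# Illustrative joint-net-benefit calculation (all numbers hypothetical).

TOTAL_HA = 10_000
CANE_MARGIN = 900.0   # $/ha/yr gross margin from sugar cane (assumed)
WTP_PER_HA = 35.0     # $/ha/yr community WTP for preservation (assumed,
                      # standing in for a choice-modelling estimate)

def net_benefit(cane_ha):
    preserved_ha = TOTAL_HA - cane_ha
    # Diminishing returns: the most productive land is developed first.
    profit = CANE_MARGIN * cane_ha * (1 - 0.5 * cane_ha / TOTAL_HA)
    return profit + WTP_PER_HA * preserved_ha

best = max(range(0, TOTAL_HA + 1, 100), key=net_benefit)
print(f"{best} ha of cane, {TOTAL_HA - best} ha preserved")
```

Re-running the search over a range of cane margins (sugar prices) traces out the production-conservation trade-off the abstract refers to.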

Relevance:

40.00%

Publisher:

Abstract:

Although planning is important for the functioning of patients with dementia of the Alzheimer type (DAT), little is known about response programming in DAT. This study used a cueing paradigm coupled with quantitative kinematic analysis to document the preparation and execution of movements made by a group of 12 DAT patients and their age- and sex-matched controls. Participants connected a series of targets placed upon a WACOM SD420 graphics tablet, in response to the pattern of illumination of a set of light-emitting diodes (LEDs). In one condition, participants could programme the upcoming movement, whilst in another they were forced to reprogramme this movement on-line (i.e., they were not provided with advance information about the location of the upcoming target). DAT patients were found to have programming deficits, taking longer to initiate movements, particularly in the absence of cues. While problems spontaneously programming a movement might cause a greater reliance upon on-line guidance, when both groups were required to guide the movement on-line, DAT patients continued to show slower and less efficient movements, implying declining sensori-motor function; these differences were not simply due to strategy or medication status. (C) 1997 Elsevier Science Ltd.

Relevance:

40.00%

Publisher:

Abstract:

Background: Prasugrel is a novel thienopyridine that reduces new or recurrent myocardial infarctions (MIs) compared with clopidogrel in patients with acute coronary syndrome undergoing percutaneous coronary intervention. This effect must be balanced against an increased bleeding risk. We aimed to characterize the effect of prasugrel with respect to the type, size, and timing of MI using the universal classification of MI. Methods and Results: We studied 13 608 patients with acute coronary syndrome undergoing percutaneous coronary intervention randomized to prasugrel or clopidogrel and treated for 6 to 15 months in the Trial to Assess Improvement in Therapeutic Outcomes by Optimizing Platelet Inhibition With Prasugrel-Thrombolysis in Myocardial Infarction (TRITON-TIMI 38). Each MI underwent supplemental classification as spontaneous, secondary, or sudden cardiac death (types 1, 2, and 3) or procedure related (types 4 and 5), and we examined events occurring early and after 30 days. Prasugrel significantly reduced the overall risk of MI (7.4% versus 9.7%; hazard ratio [HR], 0.76; 95% confidence interval [CI], 0.67 to 0.85; P < 0.0001). This benefit was present for procedure-related MIs (4.9% versus 6.4%; HR, 0.76; 95% CI, 0.66 to 0.88; P = 0.0002) and nonprocedural (type 1, 2, or 3) MIs (2.8% versus 3.7%; HR, 0.72; 95% CI, 0.59 to 0.88; P = 0.0013), and consistently across MI size, including MIs with a biomarker peak >= 5 times the reference limit (HR, 0.74; 95% CI, 0.64 to 0.86; P = 0.0001). In landmark analyses starting at 30 days, patients treated with prasugrel had a lower risk of any MI (2.9% versus 3.7%; HR, 0.77; P = 0.014), including nonprocedural MI (2.3% versus 3.1%; HR, 0.74; 95% CI, 0.60 to 0.92; P = 0.0069). Conclusion: Treatment with prasugrel compared with clopidogrel for up to 15 months in patients with acute coronary syndrome undergoing percutaneous coronary intervention significantly reduces the risk of MIs that are procedure related and spontaneous, and those that are small and large, including new MIs occurring during maintenance therapy. (Circulation. 2009; 119: 2758-2764.)