56 results for Run-Time Code Generation, Programming Languages, Object-Oriented Programming
Abstract:
This book consists of eight chapters with writings on improvisation and an introduction by one of the most intriguing improvisers of our generation – Evan Parker.
The initial discussions for this book were carried out during the “Two Thousand + TEN” symposium, held on 6 November 2010 at the Sonic Arts Research Centre in Belfast. The symposium’s theme was ‘improvisation’. Georgina Born and David Borgo gave keynote addresses, alongside many other excellent speakers, several of whom are included in this book.
My contribution is a novel way of editing the texts: I have woven all the chapters into an overall piece of writing, interlaced with my own contribution of around 40 pages.
Abstract:
Call control features (e.g., call-divert, voice-mail) are primitive options to which users can subscribe off-line to personalise their service. The configuration of a feature subscription involves choosing and sequencing features from a catalogue and is subject to constraints that prevent undesirable feature interactions at run-time. When the subscription requested by a user is inconsistent, one problem is to find an optimal relaxation, which is a generalisation of the feedback vertex set problem on directed graphs, and thus it is an NP-hard task. We present several constraint programming formulations of the problem. We also present formulations using partial weighted maximum Boolean satisfiability and mixed integer linear programming. We study all these formulations by experimentally comparing them on a variety of randomly generated instances of the feature subscription problem.
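The connection to the feedback vertex set problem can be made concrete with a tiny brute-force sketch in Python (an illustration only, not any of the paper's CP, MaxSAT, or MILP formulations; the feature names and precedence constraints are hypothetical): finding an optimal relaxation amounts to keeping the largest subset of features whose induced precedence graph is acyclic.

```python
from itertools import combinations

def is_acyclic(nodes, edges):
    # Kahn's algorithm: acyclic iff every node can be topologically ordered.
    indeg = {n: 0 for n in nodes}
    for u, v in edges:
        indeg[v] += 1
    queue = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return seen == len(nodes)

def optimal_relaxation(features, precedences):
    # Largest feature subset whose induced precedence graph is acyclic;
    # the dropped features form a (vertex) "feedback" set.
    for k in range(len(features), 0, -1):
        for subset in combinations(features, k):
            kept = set(subset)
            edges = [(u, v) for u, v in precedences
                     if u in kept and v in kept]
            if is_acyclic(kept, edges):
                return sorted(kept)
    return []

# Hypothetical catalogue: CFU and CW constrain each other cyclically,
# so any consistent subscription must drop one of them.
features = ["CFU", "CW", "VM"]
precedences = [("CFU", "CW"), ("CW", "CFU"), ("CFU", "VM")]
relaxed = optimal_relaxation(features, precedences)
```

The exponential loop is only for illustration; it is exactly the NP-hardness noted in the abstract that motivates the constraint-based formulations studied in the paper.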
Abstract:
The inherent difficulty of thread-based shared-memory programming has recently motivated research in high-level, task-parallel programming models. Recent advances in task-parallel models add implicit synchronization, where the system automatically detects and satisfies data dependencies among spawned tasks. However, dynamic dependence analysis incurs significant runtime overheads, because the runtime must track task resources and use this information to schedule tasks while avoiding conflicts and races.
We present SCOOP, a compiler that effectively integrates static and dynamic analysis in code generation. SCOOP combines context-sensitive points-to, control-flow, escape, and effect analyses to remove redundant dependence checks at runtime. Our static analysis can work in combination with existing dynamic analyses and task-parallel runtimes that use annotations to specify tasks and their memory footprints. We use our static dependence analysis to detect non-conflicting tasks and an existing dynamic analysis to handle the remaining dependencies. We evaluate the resulting hybrid dependence analysis on a set of task-parallel programs.
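The division of labour between the static and dynamic analyses can be illustrated with a toy Python model (a sketch of the general idea only, not SCOOP's actual algorithms; task footprints and the `static_independent` annotation are hypothetical): pairs proven independent at compile time skip the runtime conflict check entirely.

```python
def conflicts(t1, t2):
    # Two tasks conflict if either writes something the other reads or writes.
    return bool(t1["writes"] & (t2["reads"] | t2["writes"]) or
                t2["writes"] & (t1["reads"] | t1["writes"]))

def schedule(tasks):
    """Return (task, dependencies) pairs plus the number of runtime checks.
    Pairs proven independent statically never reach the runtime check."""
    runtime_checks = 0
    order = []
    for i, t in enumerate(tasks):
        deps = []
        for j in range(i):
            if j in t.get("static_independent", set()):
                continue  # dependence check removed by static analysis
            runtime_checks += 1
            if conflicts(tasks[j], t):
                deps.append(j)
        order.append((i, deps))
    return order, runtime_checks

# Footprints as sets of abstract memory locations; the third task has been
# statically proven disjoint from the first two.
tasks = [
    {"reads": {"a"}, "writes": {"b"}},
    {"reads": {"b"}, "writes": {"c"}},
    {"reads": {"x"}, "writes": {"y"}, "static_independent": {0, 1}},
]
order, checks = schedule(tasks)
```

Here only one of three possible pairwise checks happens at runtime, which mirrors the overhead reduction the abstract describes.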
Abstract:
Here we describe the development of MALTS, a generalized software tool that simulates Lorentz Transmission Electron Microscopy (LTEM) contrast of magnetic nanostructures. Complex magnetic nanostructures typically have multiple stable domain structures. MALTS works in conjunction with the open-access micromagnetic software Object Oriented Micromagnetic Framework (OOMMF) or MuMax. Magnetically stable trial magnetization states of the object of interest are input into MALTS, and simulated LTEM images are output. MALTS computes the magnetic and electric phases accrued by the transmitted electrons via the Aharonov-Bohm expressions. Transfer and envelope functions are used to simulate the progression of the electron wave through the microscope lenses. The final contrast image due to these effects is determined by Fourier optics. Similar approaches have been used previously for simulations of specific cases of LTEM contrast. The novelty here is the integration with micromagnetic codes via a simple user interface, enabling the computation of the contrast from any structure. The output from MALTS is in good agreement with both experimental data and published LTEM simulations. A widely available generalized code for the analysis of Lorentz contrast is a much-needed step towards the use of LTEM as a standardized laboratory technique.
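The Aharonov-Bohm phase computation mentioned above can be sketched at its most basic in Python (an illustration only: MALTS additionally applies transfer and envelope functions and Fourier optics, none of which are modelled here, and the grid and field values are hypothetical). The magnetic phase difference between two electron paths enclosing flux Φ is Δφ = 2πΦ/(h/e).

```python
import numpy as np

H = 6.626e-34      # Planck constant (J s)
E = 1.602e-19      # elementary charge (C)
PHI0 = H / E       # Aharonov-Bohm flux quantum h/e (Wb)

def ab_phase(Bz, dx, dy):
    """Phase difference (rad) between two electron paths enclosing the
    out-of-plane field Bz sampled on a regular dx-by-dy grid."""
    flux = np.sum(Bz) * dx * dy          # Phi = integral of Bz over area
    return 2.0 * np.pi * flux / PHI0     # delta-phi = 2*pi*Phi / (h/e)

# Hypothetical uniform 1 mT field over a 100 nm x 100 nm region,
# sampled as a 10x10 grid of 10 nm cells.
Bz = np.full((10, 10), 1e-3)
phase = ab_phase(Bz, 10e-9, 10e-9)
```

For nanostructures of this scale the accrued phase is a small fraction of a radian, which is why LTEM contrast simulation needs the careful optical treatment the abstract describes.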
Abstract:
In this paper we continue our investigation into the development of computational-science software based on the identification and formal specification of Abstract Data Types (ADTs) and their implementation in Fortran 90. In particular, we consider the consequences of using pointers when implementing a formally specified ADT in Fortran 90. Our aim is to highlight the resulting conflict between the goal of information hiding, which is central to the ADT methodology, and the space efficiency of the implementation. We show that the issue of storage recovery cannot be avoided by the ADT user, and present a range of implementations of a simple ADT to illustrate various approaches towards satisfactory storage management. Finally, we propose a set of guidelines for implementing ADTs using pointers in Fortran 90. These guidelines offer a way to provide disposal operations gracefully in Fortran 90. Such an approach is desirable since Fortran 90 does not provide the automatic garbage collection offered by many object-oriented languages, including Eiffel, Java, Smalltalk, and Simula.
Abstract:
Throughout the world the share of wind power in the generation mix is increasing. In the All-Island Grid of the Republic of Ireland and Northern Ireland, there is now over 1.5 GW of installed wind power. As the penetration of these variable, non-dispatchable generators increases, power systems are becoming more sensitive to weather events on the supply side as well as on the demand side. In the temperate climate of Ireland, sensitivity of supply to weather is mainly due to wind variability, while demand sensitivity is driven by space heating or cooling loads. The interplay of these two weather-driven effects is of particular concern if demand spikes driven by low temperatures coincide with periods of low winds. In December 2009 and January 2010 Ireland experienced a prolonged spell of unusually cold conditions. During much of this time, wind generation output was low due to low wind speeds. The impacts of this event are presented as a case study of the effects of weather extremes on power systems with high penetrations of variable renewable generation.
Abstract:
In this research, an agent-based model (ABM) was developed to generate human movement routes between homes and water resources in a rural setting, given commonly available geospatial datasets on population distribution, land cover and landscape resources. ABMs are an object-oriented computational approach to modelling a system, focusing on the interactions of autonomous agents, and aiming to assess the impact of these agents and their interactions on the system as a whole. An A* pathfinding algorithm was implemented to produce walking routes, given data on the terrain in the area. A* is an extension of Dijkstra's algorithm with an enhanced time performance through the use of heuristics. In this example, it was possible to impute daily activity movement patterns to the water resource for all villages in a 75 km long study transect across the Luangwa Valley, Zambia, and the simulated human movements were statistically similar to empirical observations on travel times to the water resource (Chi-squared, 95% confidence interval). This indicates that it is possible to produce realistic data regarding human movements without costly measurement as is commonly achieved, for example, through GPS, or retrospective or real-time diaries. The approach is transferable between different geographical locations, and the product can be useful in providing an insight into human movement patterns, and therefore has use in many human exposure-related applications, specifically epidemiological research in rural areas, where spatial heterogeneity in the disease landscape, and space-time proximity of individuals, can play a crucial role in disease spread.
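The A* routing step described above can be sketched in Python (a generic illustration on a toy grid, not the paper's terrain model; the cost values are hypothetical). A* extends Dijkstra's algorithm by ordering the frontier on g + h, where h is an admissible heuristic such as Euclidean distance.

```python
import heapq, math

def a_star(grid, start, goal):
    """A* shortest path on a 2D cost grid; None cells are impassable.
    Euclidean heuristic is admissible for 4-connected moves of cost >= 1."""
    def h(p):
        return math.hypot(p[0] - goal[0], p[1] - goal[1])
    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0.0, start, [start])]   # (f, g, node, path)
    best = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] is not None:
                ng = g + grid[r][c]   # terrain cost of entering the cell
                if ng < best.get((r, c), float("inf")):
                    best[(r, c)] = ng
                    heapq.heappush(
                        frontier,
                        (ng + h((r, c)), ng, (r, c), path + [(r, c)]))
    return None   # goal unreachable

# Toy terrain: 1 = walkable, None = impassable (e.g. a river with no crossing).
terrain = [[1, 1, 1],
           [None, None, 1],
           [1, 1, 1]]
route = a_star(terrain, (0, 0), (2, 0))
```

The heuristic never overestimates the true remaining cost here, so the first time the goal is popped the path is optimal, which is what gives A* its time advantage over plain Dijkstra.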
Abstract:
Multi-agent systems have become increasingly mature, but their appearance does not make the traditional OO approach obsolete. On the contrary, OO methodologies can benefit from the principles and tools designed for agent systems. The Agent-Rule-Class (ARC) framework is proposed as an approach that builds agents upon traditional OO system components and makes use of business rules to dictate agent behaviour with the aid of OO components. By modelling agent knowledge in business rules, the proposed paradigm provides a straightforward means to develop agent-oriented systems based on the existing object-oriented systems and offers features that are otherwise difficult to achieve in the original OO systems. The main outcome of using ARC is the achievement of adaptivity. The framework is supported by a tool that ensures agents implement up-to-date requirements from business people, reflecting desired current behaviour, without the need for frequent system rebuilds. ARC is illustrated with a rail track example.
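The core idea of driving agent behaviour with externally editable business rules over OO components can be sketched in Python (an illustrative sketch of the general pattern, not the ARC tool's API; the class names and rule are hypothetical, loosely following the abstract's rail track example):

```python
class Track:
    """Plain OO component: a rail track section."""
    def __init__(self, section, temperature):
        self.section = section
        self.temperature = temperature

class RuleAgent:
    """Agent whose behaviour is dictated by (condition, action) business
    rules over OO components, editable without rebuilding the system."""
    def __init__(self):
        self.rules = []

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def act(self, component):
        # Fire the action of every rule whose condition holds.
        return [action(component) for condition, action in self.rules
                if condition(component)]

agent = RuleAgent()
# A business rule supplied at run time; updating it later needs no rebuild.
agent.add_rule(lambda t: t.temperature > 50,
               lambda t: f"impose speed restriction on {t.section}")
actions = agent.act(Track("S12", 57))
```

Because the rules live outside the compiled class hierarchy, updated requirements take effect as soon as the rule set is replaced, which is the adaptivity the abstract highlights.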
Abstract:
The census and similar sources of data have been published for two centuries so the information that they contain should provide an unparalleled insight into the changing population of Britain over this time period. To date, however, the seemingly trivial problem of changes in boundaries has seriously hampered the use of these sources as they make it impossible to create long run time series of spatially detailed data. The paper reviews methodologies that attempt to resolve this problem by using geographical information systems and areal interpolation to allow the reallocation of data from one set of administrative units onto another. This makes it possible to examine change over time for a standard geography and thus it becomes possible to unlock the spatial detail and the temporal depth that are held in the census and in related sources.
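The simplest form of areal interpolation, area-weighted reallocation, can be sketched in Python (an illustration of the general technique only, not any specific method the paper reviews; zone names and areas are hypothetical). It assumes the variable is uniformly distributed within each source zone.

```python
def areal_interpolation(source_counts, overlap_areas, source_areas):
    """Reallocate counts from source zones to target zones in proportion
    to the area of each source/target overlap."""
    target = {}
    for (src, tgt), area in overlap_areas.items():
        share = source_counts[src] * area / source_areas[src]
        target[tgt] = target.get(tgt, 0.0) + share
    return target

# A historical parish A (pop 1000, area 10 km^2) whose territory is split
# 7:3 between two modern districts X and Y.
source_counts = {"A": 1000}
source_areas = {"A": 10.0}
overlap_areas = {("A", "X"): 7.0, ("A", "Y"): 3.0}
pops = areal_interpolation(source_counts, overlap_areas, source_areas)
```

The uniform-distribution assumption is the method's main weakness, which is why the literature also considers dasymetric refinements using ancillary data such as land cover.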
Abstract:
In Run-Time Reconfiguration (RTR) systems, the amount of reconfiguration is considerable compared to the circuit changes implemented. This is because reconfiguration is not considered part of the design flow. This paper presents a method for reconfigurable circuit design that models the underlying FPGA reconfigurable circuitry and takes it into consideration in the system design. This is demonstrated for an image processing example on the Xilinx Virtex FPGA.
Abstract:
A method is proposed to accelerate the evaluation of the Green's function of an infinite double periodic array of thin wire antennas. The method is based on the expansion of the Green's function into series corresponding to the propagating and evanescent waves and the use of Poisson and Kummer transformations enhanced with the analytic summation of the slowly convergent asymptotic terms. Unlike existing techniques, the procedure reported here provides uniform convergence regardless of the geometrical parameters of the problem or plane wave excitation wavelength. In addition, it is numerically stable and does not require numerical integration or internal tuning parameters, since all necessary series are directly calculated in terms of analytical functions. This means that for nonlinear problem scenarios the algorithm can be deployed without run-time intervention or recursive adjustment within a harmonic balance engine. Numerical examples are provided to illustrate the efficiency and accuracy of the developed approach as compared with the Ewald method, for which this class of problems requires run-time splitting-parameter adaptation.
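The Kummer transformation at the heart of this acceleration strategy can be demonstrated on a simple model series in Python (an illustration of the technique only, far removed from the antenna Green's function itself): subtract the slowly convergent asymptotic part, sum it in closed form, and numerically sum only the rapidly convergent remainder.

```python
import math

def slow_sum(a, terms):
    # Direct summation of sum over n of 1/(n^2 + a^2): tail ~ 1/n (slow).
    return sum(1.0 / (n * n + a * a) for n in range(1, terms + 1))

def kummer_sum(a, terms):
    # Kummer's transformation: subtract the asymptotic series sum of 1/n^2,
    # whose closed form is pi^2/6; the remainder decays like 1/n^4.
    correction = sum(1.0 / (n * n + a * a) - 1.0 / (n * n)
                     for n in range(1, terms + 1))
    return math.pi ** 2 / 6 + correction

# Closed form for comparison: sum = (pi*a*coth(pi*a) - 1) / (2*a^2).
a = 0.5
exact = (math.pi * a / math.tanh(math.pi * a) - 1.0) / (2.0 * a * a)
```

With 100 terms the direct sum is still wrong in the third digit, while the Kummer-accelerated sum is accurate to better than 1e-6, illustrating why such transformations remove the need for run-time tuning.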
Abstract:
A rapid liquid chromatographic-tandem mass spectrometric (LC-MS/MS) multi-residue method for the simultaneous quantitation and identification of sixteen synthetic growth promoters and bisphenol A in bovine milk has been developed and validated. Sample preparation was straightforward, efficient and economically advantageous. Milk was extracted with acetonitrile followed by phase separation with NaCl. After centrifugation, the extract was purified by dispersive solid-phase extraction with C18 sorbent material. The compounds were analysed by reversed-phase LC-MS/MS using both positive and negative ionization and operated in multiple reaction monitoring (MRM) mode, acquiring two diagnostic product ions from each of the chosen precursor ions for unambiguous confirmation. Total chromatographic run time was less than 10 min for each sample. The method was validated at a level of 1 μg L-1. A wide variety of deuterated internal standards were used to improve method performance. The accuracy and precision of the method were satisfactory for all analytes. The confirmative quantitative liquid chromatographic tandem mass spectrometric (LC-MS/MS) method was validated according to Commission Decision 2002/657/EC. The decision limit (CCα) and the detection capability (CCβ) were found to be below the chosen validation level of 1 μg L-1 for all compounds. © 2010 Elsevier B.V. All rights reserved.
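One commonly used 2002/657/EC-style calculation of CCα and CCβ can be sketched in Python (an assumption-laden illustration, not the paper's validation protocol: this is the form typically applied to substances with an established limit, CCα = limit + 1.64·s and CCβ = CCα + 1.64·s, and the replicate values below are invented):

```python
import statistics

def decision_limits(limit, replicate_results):
    """CCalpha and CCbeta from the standard deviation s of fortified
    replicates measured at the validation level (one common convention)."""
    s = statistics.stdev(replicate_results)
    cc_alpha = limit + 1.64 * s   # decision limit
    cc_beta = cc_alpha + 1.64 * s # detection capability
    return cc_alpha, cc_beta

# Hypothetical replicate measurements (ug/L) fortified at 1 ug/L.
cc_alpha, cc_beta = decision_limits(
    1.0, [0.98, 1.03, 0.97, 1.01, 1.02, 0.99])
```

The 1.64 factors correspond to 5% error probabilities under a normality assumption; other conventions (e.g. the blank-based 2.33σ route for banned substances) exist, so the exact procedure should always be taken from the Decision itself.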
Abstract:
The potential use of negative electrospray ionisation mass spectrometry (ESI-MS) in the characterisation of the three polyacetylenes common in carrots (Daucus carota) has been assessed. The MS scans have demonstrated that the polyacetylenes undergo a modest degree of in-source decomposition in the negative ionisation mode, while the positive ionisation mode has shown predominantly sodiated ions and no [M+H]+ ions. Tandem mass spectrometric (MS/MS) studies have shown that the polyacetylenes follow two distinct fragmentation pathways: one that involves cleavage of the C3-C4 bond and the other with cleavage of the C7-C8 bond. The cleavage of the C7-C8 bond generated product ions m/z 105.0 for falcarinol, m/z 105.0/107.0 for falcarindiol, and m/z 147.0/149.1 for falcarindiol-3-acetate. In addition to these product ions, the transitions m/z 243.2 → 187.1 (falcarinol), m/z 259.2 → 203.1 (falcarindiol), and m/z 301.2 → 255.2/203.1 (falcarindiol-3-acetate), mostly from the C3-C4 bond cleavage, can form the basis of multiple reaction monitoring (MRM)-quantitative methods, which are poorly represented in the literature. The 'MS3' experimental data confirmed a less pronounced homolytic cleavage site at the C11-C12 bond in the falcarinol-type polyacetylenes. The optimised liquid chromatography (LC)/MS conditions have achieved a baseline chromatographic separation of the three polyacetylenes investigated within 40 min total run-time. Copyright © 2011 John Wiley & Sons, Ltd.
Abstract:
Motivation: We study a stochastic method for approximating the set of local minima in partial RNA folding landscapes associated with a bounded-distance neighbourhood of folding conformations. The conformations are limited to RNA secondary structures without pseudoknots. The method aims at exploring partial energy landscapes pL induced by folding simulations and their underlying neighbourhood relations. It combines an approximation of the number of local optima devised by Garnier and Kallel (2002) with a run-time estimation for identifying sets of local optima established by Reeves and Eremeev (2004).
Results: The method is tested on nine sequences of length between 50 nt and 400 nt, which allows us to compare the results with data generated by RNAsubopt and subsequent barrier tree calculations. On the nine sequences, the method captures on average 92% of local minima with settings designed for a target of 95%. The run-time of the heuristic can be estimated by O(n²Dν ln ν), where n is the sequence length, ν is the number of local minima in the partial landscape pL under consideration, and D is the maximum number of steepest descent steps in attraction basins associated with pL.
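The random-restart idea underlying such local-optima surveys can be sketched in Python (a toy one-dimensional landscape only, not the paper's RNA energy model, neighbourhood relation, or the Garnier-Kallel estimator itself): repeatedly pick a random start, run steepest descent to its attracting local minimum, and collect the distinct minima found.

```python
import random

def steepest_descent(energy, x, lo, hi):
    # Move to the lowest-energy neighbour until no neighbour improves.
    while True:
        nbrs = [n for n in (x - 1, x + 1) if lo <= n <= hi]
        best = min(nbrs, key=energy)
        if energy(best) >= energy(x):
            return x
        x = best

def sample_local_minima(energy, lo, hi, samples, seed=0):
    # Random-restart exploration of the set of local minima; the fraction
    # of restarts yielding new minima indicates how complete the survey is.
    rng = random.Random(seed)
    minima = set()
    for _ in range(samples):
        start = rng.randint(lo, hi)
        minima.add(steepest_descent(energy, start, lo, hi))
    return minima

# Toy landscape on [0, 20] with local minima at every multiple of 5.
energy = lambda x: (x % 5) * (5 - x % 5)
minima = sample_local_minima(energy, 0, 20, 200)
```

The per-restart cost is bounded by the basin depth D, and the number of restarts needed to see most of the ν minima grows like ν ln ν by a coupon-collector argument, which matches the shape of the run-time estimate quoted above.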
Abstract:
For the first time, a simple and validated reversed-phase liquid chromatography (RP-LC) method with fluorescence detection has been developed for the simultaneous analysis of glutamate (Glu), γ-aminobutyric acid (GABA), glycine (Gly) and taurine (Tau) in Wistar and tremor rat brain synaptosomes. The samples were separated on a C18 analytical column with gradient elution of methanol and 0.1 mol L-1 potassium acetate at a flow rate of 1 mL min-1. Total run time was approximately 25 min. All calibration curves exhibited good linearity (r² > 0.999) within test ranges. The reproducibility was estimated by intra- and inter-day assays and RSD values were less than 2.48%. The recoveries were between 96.32 and 105.21%. The method was successfully applied to the quantification of amino acids in Wistar and tremor rat brain synaptosomes. Through this developed protocol, the levels of Glu in hippocampal and prefrontal cortical synaptosomes of tremor rats were both significantly elevated compared with those of adult Wistar rats, whereas significantly decreased concentrations of GABA and Gly were observed in the hippocampal region of tremor rats, with no evident difference in the prefrontal cortex between experimental and control groups. In addition, our studies also showed a marked elevation of Tau in tremor rat hippocampal synaptosomes, although there was no pronounced difference in the prefrontal cortical region of Wistar and tremor rats.