974 results for Tool path computing


Relevance:

30.00%

Publisher:

Abstract:

Data-flow analysis is an integral part of any aggressive optimizing compiler. We propose a framework for improving the precision of data-flow analysis in the presence of complex control flow. We initially perform data-flow analysis to determine those control-flow merges that cause the loss of data-flow precision. The control-flow graph of the program is then restructured such that performing data-flow analysis on the restructured graph gives more precise results. The proposed framework is both simple, relying on the familiar notion of product automata, and general, since it is applicable to any forward data-flow analysis. Apart from proving that our restructuring process is correct, we also show that restructuring is effective, in that it necessarily leads to more optimization opportunities. Furthermore, the framework handles the trade-off between the increase in data-flow precision and the code-size increase inherent in the restructuring. We show that determining an optimal restructuring is NP-hard, and propose and evaluate a greedy strategy. The framework has been implemented in the Scale research compiler and instantiated for the specific problem of constant propagation. On the SPECINT 2000 benchmark suite we observe an average speedup of 4% in running time over the Wegman-Zadeck conditional constant propagation algorithm and 2% over a purely path-profile-guided approach.
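The following minimal Python sketch (not the paper's algorithm; the fact maps and join operator are deliberately simplified) illustrates the merge-point precision loss that the restructuring is designed to avoid:

```python
# Minimal sketch (not the paper's algorithm): a control-flow merge loses
# constant-propagation facts when the incoming branches disagree. Joining the
# branch facts forces conflicting constants to "unknown"; restructuring
# (duplicating the code below the merge per branch) keeps them separate.

TOP = "unknown"

def join(facts_a, facts_b):
    """Meet of two constant-propagation fact maps at a merge point."""
    merged = {}
    for var in set(facts_a) | set(facts_b):
        a, b = facts_a.get(var, TOP), facts_b.get(var, TOP)
        merged[var] = a if a == b else TOP
    return merged

then_facts = {"x": 1, "flag": True}   # facts on the then-branch
else_facts = {"x": 2, "flag": True}   # facts on the else-branch

# 'x' degrades to "unknown", 'flag' survives; after restructuring, each copy
# of the merged code would still see a constant 'x' (1 or 2) to fold.
print(join(then_facts, else_facts))
```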

Relevance:

30.00%

Publisher:

Abstract:

Network data packet capture and replay capabilities are basic requirements for forensic analysis of faults and security-related anomalies, as well as for testing and development. Cyber-physical networks, in which data packets are used to monitor and control physical devices, must operate within strict timing constraints in order to match the hardware devices' characteristics. Standard network monitoring tools are unsuitable for such systems because they cannot guarantee to capture all data packets, may introduce their own traffic into the network, and cannot reliably reproduce the original timing of data packets. Here we present a high-speed network forensics tool specifically designed for capturing and replaying data traffic in Supervisory Control and Data Acquisition systems. Unlike general-purpose "packet capture" tools, it does not affect the observed network's data traffic and guarantees that the original packet ordering is preserved. Most importantly, it allows replay of network traffic that precisely matches its original timing. The tool was implemented by developing a novel user interface and back-end software for a special-purpose network interface card. Experimental results show a clear improvement in data capture and replay capabilities over standard network monitoring methods and general-purpose forensics solutions.
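The following minimal Python sketch (illustrative only, not the tool described here; the trace format and send callback are hypothetical) shows the timing-faithful replay property in its simplest form, pacing each packet by its original capture offset:

```python
# Illustrative sketch only (not the paper's tool): replay captured packets
# while preserving their original inter-arrival timing, which is the property
# general-purpose capture tools typically cannot guarantee.
import time

def replay(packets, send):
    """packets: list of (capture_timestamp_seconds, payload_bytes),
    ordered as captured. send: callable that transmits one payload."""
    if not packets:
        return
    start_wall = time.perf_counter()
    start_capture = packets[0][0]
    for captured_at, payload in packets:
        # Wait until the same offset from the start as in the original capture.
        target = start_wall + (captured_at - start_capture)
        while time.perf_counter() < target:
            pass  # busy-wait: sleep() granularity is too coarse for tight timing
        send(payload)

# Example: "transmit" by printing the payload size.
trace = [(0.000, b"\x01\x02"), (0.004, b"\x03"), (0.010, b"\x04\x05\x06")]
replay(trace, lambda p: print(len(p), "bytes"))
```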

Relevance:

30.00%

Publisher:

Abstract:

The concept of cloud computing services (CCS) is appealing to small and medium enterprises (SMEs). However, while various authorities are pushing SMEs strongly to adopt CCS, knowledge of the key considerations for adopting them is very limited. We use the technology-organization-environment (TOE) framework to suggest that a strategic and incremental intent, understanding the organizational structure and culture, understanding the external factors, and consideration of the human resource capacity can contribute to sustainable business value from CCS. Using survey data, we find evidence of a positive association between these considerations and the CCS-related business objectives. We also find evidence of a positive association between the CCS-related business objectives and CCS-related financial objectives. The results suggest that the proposed considerations can ensure sustainable business value from CCS. This study provides guidance to SMEs on a path to adopting CCS with a long-term commitment and achieving sustainable business value from these services.

Relevance:

30.00%

Publisher:

Abstract:

Given an undirected unweighted graph G = (V, E) and an integer k ≥ 1, we consider the problem of computing the edge connectivities of all those (s, t) vertex pairs whose edge connectivity is at most k. We present an algorithm with expected running time Õ(m + nk^3) for this problem, where |V| = n and |E| = m. Our output is a weighted tree T whose nodes are the sets V_1, V_2, ..., V_l of a partition of V, with the property that the edge connectivity in G between any two vertices s ∈ V_i and t ∈ V_j, for i ≠ j, is equal to the weight of the lightest edge on the path between V_i and V_j in T. Also, two vertices s and t belong to the same V_i for any i if and only if they have an edge connectivity greater than k. Currently, the best algorithm for this problem needs to compute all-pairs min-cuts in an O(nk)-edge graph; this takes Õ(m + n^{5/2} k · min{k^{1/2}, n^{1/6}}) time. Our algorithm is much faster for small values of k; in fact, it is faster whenever k is o(n^{5/6}). Our algorithm yields the useful corollary that in Õ(m + nc^3) time, where c is the size of the global min-cut, we can compute the edge connectivities of all those pairs of vertices whose edge connectivity is at most αc for some constant α. We also present an Õ(m + n) Monte Carlo algorithm for the approximate version of this problem. This algorithm is applicable to weighted graphs as well. Our algorithm, with some modifications, also solves another problem called the minimum T-cut problem. Given T ⊆ V of even cardinality, we present an Õ(m + nk^3) algorithm to compute a minimum cut that splits T into two odd-cardinality components, where k is the size of this cut.
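As a hedged illustration of how the output tree T is used (a minimal sketch, not taken from the paper; the tree, vertex classes, and query routine are hypothetical), the edge connectivity of two vertices in different classes is the minimum edge weight on the unique tree path between their classes:

```python
# Minimal sketch (illustrative, not from the paper): answering connectivity
# queries from the kind of tree T the abstract describes. Each vertex maps to
# a partition class (a tree node); the edge connectivity of s and t is the
# minimum edge weight on the tree path between their classes, or "> k" if
# they share a class.
import collections

def query(tree_adj, vertex_class, s, t, k):
    """tree_adj: {class: [(neighbor_class, weight), ...]} for the tree T."""
    cs, ct = vertex_class[s], vertex_class[t]
    if cs == ct:
        return f"> {k}"
    # BFS over the tree, tracking the lightest edge seen on the unique path.
    queue = collections.deque([(cs, float("inf"))])
    seen = {cs}
    while queue:
        node, lightest = queue.popleft()
        if node == ct:
            return lightest
        for nxt, w in tree_adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, min(lightest, w)))
    raise ValueError("classes not connected in T")

# Toy instance: three classes A-B-C with tree edge weights 2 and 1.
tree = {"A": [("B", 2)], "B": [("A", 2), ("C", 1)], "C": [("B", 1)]}
cls = {"s": "A", "t": "C", "u": "C"}
print(query(tree, cls, "s", "t", k=3))  # 1
print(query(tree, cls, "t", "u", k=3))  # > 3
```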

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a framework for optimum steering input determination of all-wheel steer vehicles (AWSV) on rough terrains. The framework computes the steering input which minimizes the tracking error for a given trajectory. Unlike previous methodologies for computing the steering inputs of car-like vehicles, the proposed methodology depends explicitly on the vehicle dynamics and can be extended to vehicles having an arbitrary number of steering inputs. A fully generic framework has been used to derive the vehicle dynamics, and a non-linear programming based constrained optimization approach has been used to compute the steering input, taking into account the instantaneous vehicle dynamics and the no-slip and contact constraints of the vehicle. All-wheel steer vehicles have a special parallel-steering ability in which the instantaneous centre of rotation (ICR) is at infinity. The proposed framework automatically enables the vehicle to choose between parallel steer and normal operation depending on the error with respect to the desired trajectory. The efficacy of the proposed framework is demonstrated through extensive uneven-terrain simulations for trajectories with continuous or discontinuous velocity profiles.
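The following minimal Python sketch (not the paper's formulation; it uses a simple kinematic bicycle approximation with hypothetical parameter values, and omits the full dynamics, slip, and contact constraints) illustrates the general idea of computing steering inputs by constrained minimization of a tracking error:

```python
# Illustrative sketch only (not the paper's model): choose front/rear steering
# angles by minimizing the squared error between the achieved and the desired
# heading rate, with simple bounds on the steering angles.
import numpy as np
from scipy.optimize import minimize

desired_heading_rate = 0.2   # rad/s, from the reference trajectory
speed, wheelbase = 2.0, 1.6  # m/s, m (hypothetical values)

def heading_rate(angles):
    front, rear = angles
    # Bicycle-model approximation of yaw rate for front/rear steered axles.
    return speed * (np.tan(front) - np.tan(rear)) / wheelbase

def tracking_error(angles):
    return (heading_rate(angles) - desired_heading_rate) ** 2

# "Parallel steer" corresponds to front == rear (ICR at infinity); here both
# angles are left free and only bounded.
res = minimize(tracking_error, x0=[0.0, 0.0], method="SLSQP",
               bounds=[(-0.5, 0.5), (-0.5, 0.5)])
print(res.x, heading_rate(res.x))
```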

Relevance:

30.00%

Publisher:

Abstract:

Hurricane Isabel made landfall as a Category 2 hurricane on 18 September 2003 on the North Carolina Outer Banks between Cape Lookout and Cape Hatteras, then coursed northwestward through Pamlico Sound and west of Chesapeake Bay, where it was downgraded to a tropical storm. Wind damage on the west and southwest shores of Pamlico Sound and the western shore of Chesapeake Bay was moderate, but major damage resulted from the storm tide. The NOAA National Ocean Service, National Centers for Coastal Ocean Sciences, Center for Coastal Fisheries and Habitat Research at Beaufort, North Carolina, and the Center for Coastal Environmental Health and Biomedical Research Branch at Oxford, Maryland, have hurricane preparedness plans in place. These plans call for tropical storms and hurricanes to be tracked carefully through NOAA National Weather Service (NWS) watches, warnings, and advisories. When a hurricane watch changes to a hurricane warning for the Beaufort or Oxford areas, documented hurricane preparation plans are activated. Isabel caused some wind damage at both Beaufort and Oxford. Storm tide caused damage at Oxford, where area-wide flooding isolated the laboratory for many hours. Storm tide also caused damage at Beaufort. Because of their geographic locations on or near the open ocean (Beaufort) or on or near large estuaries (Beaufort and Oxford), storm tide poses a major threat to these NOAA facilities and the safety of federal employees. Damage from storm surge and wind-blown water depends on the track and intensity of a storm. One tool used to predict storm surge is the Sea, Lake, and Overland Surges from Hurricanes (SLOSH) model of the NWS, which provides valuable surge forecasts that aid in hurricane preparation.

Relevance:

30.00%

Publisher:

Abstract:

We study the problem of preprocessing a large graph so that point-to-point shortest-path queries can be answered very fast. Computing shortest paths is a well-studied problem, but exact algorithms do not scale to the huge graphs encountered on the web, in social networks, and in other applications. In this paper we focus on approximate methods for distance estimation, in particular landmark-based distance indexing. This approach involves selecting a subset of nodes as landmarks and computing (offline) the distances from each node in the graph to those landmarks. At runtime, when the distance between a pair of nodes is needed, we can estimate it quickly by combining the precomputed distances of the two nodes to the landmarks. We prove that selecting the optimal set of landmarks is an NP-hard problem, so heuristic solutions need to be employed. Given a budget of memory for the index, which translates directly into a budget of landmarks, different landmark selection strategies can yield dramatically different results in terms of accuracy. We therefore develop and experimentally compare a number of simple methods that scale well to large graphs. The simplest methods choose central nodes of the graph, while the more elaborate ones select central nodes that are also far away from one another. The efficiency of the suggested techniques is tested experimentally on five different real-world graphs with millions of edges; for a given accuracy, they require up to 250 times less space than the existing approach in the literature, which selects landmarks at random. Finally, we study applications of our method in two problems arising naturally in large-scale networks, namely social search and community detection.
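The following minimal Python sketch (a standard triangle-inequality landmark estimate, not the paper's full system) illustrates how precomputed landmark distances yield a fast upper-bound estimate of d(s, t):

```python
# Minimal sketch: precompute BFS distances from a few landmark nodes, then
# upper-bound d(s, t) by the minimum over landmarks L of d(s, L) + d(L, t).
import collections

def bfs_distances(adj, source):
    dist = {source: 0}
    queue = collections.deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def build_index(adj, landmarks):
    return {L: bfs_distances(adj, L) for L in landmarks}

def estimate(index, s, t):
    return min(d[s] + d[t] for d in index.values() if s in d and t in d)

# Toy graph (path 0-1-2-3-4) indexed with landmarks 0 and 2.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
idx = build_index(adj, landmarks=[0, 2])
print(estimate(idx, 1, 3))  # 2: exact here because landmark 2 lies on a shortest 1-3 path
```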

Relevance:

30.00%

Publisher:

Abstract:

Many people suffer from conditions that lead to deterioration of motor control and make access to the computer using traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware to track and translate a user's movement to pointer movement. These approaches, for example wearable devices, may be perceived as intrusive. Camera-based assistive systems that use visual tracking of features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer-vision-based strategy in which features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated to pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement, which allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human-computer interaction (HCI). This framework provides effective use of the vision-based interface system for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its usage within the application framework as a practical tool for desktop users with disabilities.
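The following minimal Python sketch (illustrative only, not the paper's interface; thresholds and gains are hypothetical) shows one way small tracked-feature displacements can be mapped to pointer velocity so that limited physical movement can still reach every part of the screen:

```python
# Illustrative sketch: a relative "joystick" mapping from small tracked-feature
# displacements to pointer velocity. A sustained small offset keeps the pointer
# moving, so the whole screen stays reachable with little physical movement.

DEAD_ZONE = 2.0   # pixels of feature movement ignored as jitter
GAIN = 15.0       # pointer speed per pixel of feature offset
SCREEN_W, SCREEN_H = 1920, 1080

def update_pointer(pointer, feature_offset, dt):
    """pointer: (x, y); feature_offset: tracked feature's displacement from its
    rest position, in camera pixels; dt: frame time in seconds."""
    px, py = pointer
    for axis, offset in enumerate(feature_offset):
        if abs(offset) > DEAD_ZONE:
            step = GAIN * (offset - DEAD_ZONE * (1 if offset > 0 else -1)) * dt
            if axis == 0:
                px = min(max(px + step, 0), SCREEN_W - 1)
            else:
                py = min(max(py + step, 0), SCREEN_H - 1)
    return (px, py)

# A small sustained head tilt keeps moving the pointer until it reaches an edge.
p = (960, 540)
for _ in range(30):                  # 30 frames at ~30 fps
    p = update_pointer(p, (6.0, 0.0), dt=1 / 30)
print(p)  # pointer has drifted right, clamped to the screen
```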

Relevance:

30.00%

Publisher:

Abstract:

Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising from practical situations. In many of those situations, we are not provided with an immutable set of constraints. Instead, a user will modify his requirements, in an interactive fashion, until he is satisfied with a solution. Examples of such applications include, amongst others, model-based diagnosis, expert systems, and product configurators. The system the user interacts with must be able to assist him by showing the consequences of his requirements. Explanations are the ideal tool for providing this assistance. However, existing notions of explanations fail to provide sufficient information. We define new forms of explanations that aim to be more informative. Although explanation generation is a very hard task, in the applications we consider we must provide a satisfactory level of interactivity and therefore cannot afford long computation times. We introduce the concept of representative sets of relaxations, a compact set of relaxations that shows the user at least one way to satisfy each of his requirements and at least one way to relax them, and present an algorithm that efficiently computes such sets. We introduce the concept of most soluble relaxations, which maximise the number of products they allow. We present algorithms to compute such relaxations in times compatible with interactivity, achieving this by making use of different types of compiled representations interchangeably. We propose to generalise the concept of prime implicates to constraint problems through the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. We define ordered automata to compactly represent large sets of domain consequences, in a way orthogonal to existing compilation techniques that represent large sets of solutions.
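As a hedged illustration of what a relaxation is (a brute-force toy, not the paper's algorithms; the catalogue and requirement names are hypothetical), the following Python sketch enumerates the maximal satisfiable subsets of an over-constrained set of user requirements:

```python
# Toy sketch: a "relaxation" here is a subset of the user's requirements that
# is still satisfiable. For a tiny product-configuration example, enumerate the
# maximal satisfiable subsets by brute force.
from itertools import combinations, product

# Hypothetical catalogue: each product is a (colour, size, price) tuple.
products = list(product(["red", "blue"], ["S", "L"], [10, 20]))

requirements = {
    "red":   lambda p: p[0] == "red",
    "large": lambda p: p[1] == "L",
    "cheap": lambda p: p[2] <= 10,
    "blue":  lambda p: p[0] == "blue",   # conflicts with "red"
}

def satisfiable(names):
    return any(all(requirements[n](p) for n in names) for p in products)

maximal = []
for r in range(len(requirements), 0, -1):
    for subset in combinations(requirements, r):
        if satisfiable(subset) and not any(set(subset) < set(m) for m in maximal):
            maximal.append(subset)

print(maximal)  # ('red', 'large', 'cheap') and ('large', 'cheap', 'blue')
```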

Relevance:

30.00%

Publisher:

Abstract:

This paper provides a system description and preliminary results for an ongoing clinical study currently being carried out at the Mid-Western Regional Hospital, Nenagh, Ireland. The goal of the trial is to determine whether wireless inertial measurement technology can be employed to identify elderly patients at risk of death or imminent clinical deterioration. The system measures cumulative movement and provides a score intended to give clinical staff a robust early warning of clinical deterioration. In addition, the study examines some of the logistical barriers to the adoption of wearable wireless technology in front-line medical care.

Relevance:

30.00%

Publisher:

Abstract:

Software-based control of life-critical embedded systems has become increasingly complex, and to a large extent now determines the safety of the people who depend on them. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code generation, and testing process for complex software-controlled embedded systems.

Relevance:

30.00%

Publisher:

Abstract:

We provide a select overview of tools supporting traditional Jewish learning. We then discuss our own HyperJoseph/HyperIsaac project in instructional hypermedia, whose application is to teaching, teacher training, and self-instruction in given Bible passages. The treatment of two narratives has been developed thus far. The tool enables analysis of the text in several respects: linguistic, narratological, and so on. Moreover, the Scriptures' focality throughout cultural history makes this domain of application particularly challenging, in that the tool is required to encompass the accretion of receptions in the cultural repertoire, i.e., several layers of textual traditions, either hermeneutic (interpretive) or appropriations, related to the given core passage. This includes "secondary" texts (texts that respond to, or derive from, the core passage) from realms as disparate as Roman-age and later homiletics, Medieval and later commentaries or supercommentaries, literary appropriations, and references to the arts and modern scholarship. In particular, the Midrash (homiletic expansions) is adept at narrative gap-filling, so the narratives mushroom at the interstices where the primary text is silent. The genealogy of the project is rooted in Weiss' index of the novelist Agnon's writings, which was eventually upgraded into a hypertextual tool including Agnon's full text and ancillary materials. As those early tools were intended primarily for reference and research support in literary studies, the Agnon hypertext system was initially emulated in the conception of HyperJoseph, which is applied to the Joseph story from Genesis. The transition from a reference tool to an instructional tool then required a thorough reconception from an educational perspective, which led to HyperIsaac, on the sacrifice of Isaac, and to a redesign and upgrade of HyperJoseph patterned after HyperIsaac.

Relevance:

30.00%

Publisher:

Abstract:

Daedalus is a computer tool developed by an Italian magistrate, Carmelo Asaro, and integrated into his own daily routine as an investigating magistrate conducting inquiries, and then as a prosecutor if and when an investigated case goes to court. The tool has recently been adopted by magistrates in judiciary offices throughout Italy, and has moreover spawned other related projects. First, this paper describes a sample session with Daedalus. Next, an overview of an array of judicial tools leads to positioning Daedalus within that spectrum.

Relevance:

30.00%

Publisher:

Abstract:

A new approach to evaluating all multiple complex roots of an analytical function f(z) confined to a specified rectangular domain of the complex plane has been developed and implemented in Fortran code. Generally, f(z), despite being a holomorphic function, does not have a closed analytical form, thereby preventing explicit evaluation of its derivatives. The latter constraint poses a major challenge to the implementation of a robust numerical algorithm. This work is at the instrumental level and provides an enabling tool for solving a broad class of eigenvalue problems and polynomial approximations.
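A standard derivative-free way to attack this problem, shown here only as an illustrative sketch and not necessarily the paper's method, is to count the zeros inside the rectangle from the winding number of f(z) along its boundary (the argument principle), which needs values of f only:

```python
# Illustrative sketch: count zeros (with multiplicity) of an analytic f inside
# a rectangle without using f', by accumulating the wrapped phase change of
# f(z) around the boundary. Assumes no zeros lie on the boundary and the
# sampling is fine enough that the argument never jumps by more than pi.
import cmath, math

def count_zeros(f, x0, x1, y0, y1, samples_per_side=400):
    corners = [complex(x0, y0), complex(x1, y0),
               complex(x1, y1), complex(x0, y1), complex(x0, y0)]
    total_turn = 0.0
    prev = None
    for a, b in zip(corners, corners[1:]):
        for i in range(samples_per_side):
            z = a + (b - a) * i / samples_per_side
            w = f(z)
            if prev is not None:
                # Wrapped phase increment between consecutive boundary samples.
                total_turn += cmath.phase(w / prev)
            prev = w
    # Close the loop back to the starting sample.
    total_turn += cmath.phase(f(corners[0]) / prev)
    return round(total_turn / (2 * math.pi))

# Example: f has a double zero at 1+1j and a simple zero at 3+2j.
f = lambda z: (z - (1 + 1j)) ** 2 * (z - (3 + 2j))
print(count_zeros(f, 0, 4, 0, 3))  # 3, counting multiplicity
```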

Relevance:

30.00%

Publisher:

Abstract:

A methodology is presented for estimating the cost implications of design decisions by integrating cost as a design parameter at an early design stage. The model is developed on a hierarchical basis; the manufacturing cost of aircraft fuselage panels is analysed in this paper. The manufacturing cost modelling is original and relies on a genetic-causal method in which the drivers of each cost element are identified relative to the process capability. The cost model is then extended to life-cycle costing by computing the Direct Operating Cost as a function of acquisition cost and fuel burn, and coupled with a semi-empirical numerical analysis using Engineering Sciences Data Unit reference data to model the structural integrity of the fuselage shell with regard to material failure and various modes of buckling. The main finding of the paper is that the traditional minimum-weight condition is a dated and sub-optimal approach to airframe structural design.
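As a hedged illustration of the Direct Operating Cost coupling (a toy formula with hypothetical coefficients, not the paper's model), the following sketch shows how acquisition cost and fuel burn combine into a per-flight-hour figure, and why the minimum-weight design is not automatically the cheapest one:

```python
# Toy sketch (hypothetical coefficients, not the paper's model): a Direct
# Operating Cost per flight hour built from acquisition cost, spread over the
# service life, plus the cost of fuel burned.
def direct_operating_cost(acquisition_cost, fuel_burn_kg_per_hr,
                          fuel_price_per_kg=0.8,
                          service_life_flight_hours=60_000):
    depreciation_per_hr = acquisition_cost / service_life_flight_hours
    fuel_cost_per_hr = fuel_burn_kg_per_hr * fuel_price_per_kg
    return depreciation_per_hr + fuel_cost_per_hr

# A minimum-weight design that is expensive to build versus a slightly heavier
# but cheaper one; with these illustrative numbers the heavier design has the
# lower DOC, the kind of trade-off that makes minimum weight sub-optimal.
light_design = direct_operating_cost(52_000_000, fuel_burn_kg_per_hr=2_400)
heavy_design = direct_operating_cost(48_000_000, fuel_burn_kg_per_hr=2_450)
print(round(light_design, 2), round(heavy_design, 2))  # 2786.67 2760.0
```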