962 results for COMPUTATION
Abstract:
This work describes recent extensions to the GPFlow scientific workflow system under development at MQUTeR (www.mquter.qut.edu.au), which facilitate interactive experimentation, automatic lifting of computations from single-case to collection-oriented form, and automatic correlation and synthesis of collections. A GPFlow workflow presents as an acyclic data flow graph, yet provides powerful iteration and collection formation capabilities.
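The abstract does not define how GPFlow lifts a single-case computation to a collection; purely as an illustration of the general idea (and not GPFlow's actual API), a lifted workflow step can be thought of as a function mapped over a collection of inputs:

```python
from typing import Callable, Iterable, List, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def lift(step: Callable[[A], B]) -> Callable[[Iterable[A]], List[B]]:
    """Turn a workflow step written for a single input into a
    collection-oriented step that applies it to every element."""
    def lifted(items: Iterable[A]) -> List[B]:
        return [step(x) for x in items]
    return lifted

# A step written for one value now runs over a whole collection.
double = lift(lambda x: 2 * x)
print(double([1, 2, 3]))   # [2, 4, 6]
```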
Abstract:
The ten-year anniversary of TOPLAP presents a unique opportunity for reflection and introspection. In this essay we ask the question: what is the meaning of live coding? Our goal is not to answer this question in absolute terms, but rather to attempt to unpack some of live coding's many meanings. Our hope is that by exploring some of the formal, embodied, and cultural meanings surrounding live-coding practice, we may help to stimulate a conversation that will resonate within the live-coding community for the next ten years.
Abstract:
Evolutionary algorithms are playing an increasingly important role as search methods in cognitive science domains. In this study, methodological issues in the use of evolutionary algorithms were investigated via simulations in which procedures were systematically varied to modify the selection pressures on populations of evolving agents. Traditional roulette wheel, tournament, and variations of these selection algorithms were compared on the “needle-in-a-haystack” problem developed by Hinton and Nowlan in their 1987 study of the Baldwin effect. The task is an important one for cognitive science, as it demonstrates the power of learning as a local search technique in smoothing a fitness landscape that lacks gradient information. One aspect that has continued to foster interest in the problem is the observation of residual learning ability in simulated populations even after long periods of time. Effective evolutionary algorithms balance their search effort between broad exploration of the search space and in-depth exploitation of promising solutions already found. Issues discussed include the differential effects of rank and proportional selection, the tradeoff between migration of populations towards good solutions and maintenance of diversity, and the development of measures that illustrate how each selection algorithm affects the search process over generations. We show that both roulette wheel and tournament algorithms can be modified to appropriately balance search between exploration and exploitation, and effectively eliminate residual learning in this problem.
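For readers unfamiliar with the two families of selection operators compared above, a minimal sketch of roulette wheel (fitness-proportional) and tournament selection follows; the population encoding and parameters are illustrative, not those used in the study.

```python
import numpy as np

def roulette_wheel_select(fitness, rng, n_parents):
    """Fitness-proportional selection: an individual is picked with
    probability equal to its fitness divided by the population total."""
    probs = fitness / fitness.sum()
    return rng.choice(len(fitness), size=n_parents, p=probs)

def tournament_select(fitness, rng, n_parents, k=2):
    """Tournament selection: draw k individuals at random and keep the
    fittest; repeat until n_parents have been chosen."""
    winners = []
    for _ in range(n_parents):
        contenders = rng.integers(0, len(fitness), size=k)
        winners.append(contenders[np.argmax(fitness[contenders])])
    return np.array(winners)

rng = np.random.default_rng(0)
fitness = rng.random(100)                       # toy fitness values
parents_rw = roulette_wheel_select(fitness, rng, 50)
parents_t = tournament_select(fitness, rng, 50, k=3)
```

Rank-based variants replace the raw fitness values with their ranks before computing selection probabilities, which weakens the pull towards the current best and helps maintain diversity.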
Abstract:
This paper details the initial design and planning of a Field Programmable Gate Array (FPGA) implemented control system that will enable a path planner to interact with a MAVLink based flight computer. The design is aimed at small Unmanned Aerial Vehicles (UAVs) under autonomous operation, which are typically subject to constraints arising from limited on-board processing capability, power and size. An FPGA implementation for the design is chosen for its potential to address such limitations through low-power, high-speed in-hardware computation. The MAVLink protocol offers a low-bandwidth interface for the FPGA-implemented path planner to communicate with an on-board flight computer. A control system plan is presented that is capable of accepting a string of GPS waypoints generated on-board by a previously developed in-hardware Genetic Algorithm (GA) path planner and feeding them to the open source PX4 autopilot, while simultaneously responding with flight status information.
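The abstract only names the interface; as a hypothetical illustration of the data path it describes, the sketch below packs GA-generated GPS waypoints into records shaped like MAVLink MISSION_ITEM_INT messages as consumed by PX4 (the frame and command codes follow the MAVLink common message set; the helper itself is an assumption, not the authors' design).

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MissionItemInt:
    """Fields mirror the MAVLink MISSION_ITEM_INT waypoint message."""
    seq: int        # position of the waypoint in the mission
    frame: int      # 6 = MAV_FRAME_GLOBAL_RELATIVE_ALT_INT
    command: int    # 16 = MAV_CMD_NAV_WAYPOINT
    x: int          # latitude  * 1e7 (scaled integer degrees)
    y: int          # longitude * 1e7
    z: float        # altitude in metres

def waypoints_to_mission(wps: List[Tuple[float, float, float]]) -> List[MissionItemInt]:
    """Convert (lat, lon, alt) waypoints from the GA path planner into
    mission items ready to be serialised over the MAVLink link."""
    return [MissionItemInt(seq=i, frame=6, command=16,
                           x=int(lat * 1e7), y=int(lon * 1e7), z=alt)
            for i, (lat, lon, alt) in enumerate(wps)]

mission = waypoints_to_mission([(-27.4698, 153.0251, 50.0),
                                (-27.4710, 153.0300, 60.0)])
```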
Abstract:
In the Bayesian framework a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
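Regression adjustment ABC itself is not described in the abstract; a minimal sketch of the technique on a toy Gaussian model (the prior, acceptance fraction and summary statistic are illustrative) is:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=50):
    """Toy model: n Gaussian observations with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    return x.mean()

# Observed data and its summary statistic.
s_obs = summary(simulate(theta=0.7))

# 1. Draw parameters from the prior and simulate data from the model.
theta = rng.normal(0.0, 2.0, size=20000)
s_sim = np.array([summary(simulate(t)) for t in theta])

# 2. Keep the draws whose summaries fall closest to the observed one.
keep = np.argsort(np.abs(s_sim - s_obs))[:500]
theta_acc, s_acc = theta[keep], s_sim[keep]

# 3. Regression adjustment: fit theta ~ s on the accepted draws and
#    shift them to the observed summary statistic.
beta = np.polyfit(s_acc, theta_acc, deg=1)[0]
theta_adj = theta_acc + beta * (s_obs - s_acc)
```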
Abstract:
Ambiguity validation, an important step in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used for positioning computation. Most existing investigations on ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous probability theory basis, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate for a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method and it makes the FF-approach applicable for real-time GNSS positioning applications.
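The two acceptance rules are not spelled out in the abstract; as a hedged sketch (conventions for the ratio test vary across the literature), with r_best <= r_second the squared-norm residuals of the best and second-best integer candidates from the ILS search, the tests can be written as:

```python
def ratio_test(r_best, r_second, mu):
    """Accept the fixed solution when the best candidate's squared residual
    is sufficiently smaller than the runner-up's: r_best / r_second <= mu."""
    return r_best / r_second <= mu

def difference_test(r_best, r_second, delta):
    """Accept when the residual gap exceeds a threshold: r_second - r_best >= delta."""
    return r_second - r_best >= delta

# Toy residuals from an ILS search (illustrative values only).
accept_ratio = ratio_test(1.8, 6.0, mu=0.5)         # True: 0.3 <= 0.5
accept_diff = difference_test(1.8, 6.0, delta=3.0)  # True: 4.2 >= 3.0
```

In the FF- approach the critical value (mu or delta) is not fixed empirically but chosen so that the probability of accepting a wrong fix stays at the specified failure rate; the paper's threshold function supplies that delta directly from the ILS success rate.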
Abstract:
This chapter presents a stability analysis, based on bifurcation theory, of the distribution static compensator (DSTATCOM) operating in both current control mode and voltage control mode. The bifurcation analysis allows the operating zones of nonlinear power systems to be delimited, and hence the computation of these boundaries is of interest for practical design and planning purposes. Suitable mathematical representations of the DSTATCOM are proposed to carry out the bifurcation analyses efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the Point of Common Coupling (PCC). In addition, the stability regions in the control gain space are computed, and the impact of the DC and AC capacitors on stability is analyzed in detail. It is shown through bifurcation analysis that the loss of stability in the DSTATCOM is in general due to the emergence of oscillatory dynamics. The observations are verified through detailed simulation studies.
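As a generic illustration of how such a bifurcation analysis detects an oscillatory loss of stability (not the DSTATCOM model itself), the sketch below sweeps a parameter of a toy two-state system and flags where the eigenvalues of the linearisation cross the imaginary axis:

```python
import numpy as np

def jacobian(mu):
    """Toy 2-state linearisation whose complex eigenvalue pair (mu +/- i)
    crosses the imaginary axis at mu = 0, a Hopf-type loss of stability."""
    return np.array([[mu, -1.0],
                     [1.0, mu]])

for mu in np.linspace(-0.5, 0.5, 11):
    eigs = np.linalg.eigvals(jacobian(mu))
    stable = np.all(eigs.real < 0)
    print(f"mu = {mu:+.2f}  max Re(lambda) = {eigs.real.max():+.3f}  "
          f"{'stable' if stable else 'unstable'}")
```

Tracing the parameter value at which this crossing occurs, for each operating condition, is what produces the stability boundaries in the Thevenin equivalent plane and the control gain space.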
Abstract:
Grateful Fateful Sunshine Rain is a permanent public artwork commissioned by Aria Property Group through a competitive process for the Austin apartment building in South Brisbane. Artist Statement: Residents of Brisbane have a complex relationship with weather. In the capital of the Sunshine State, weather is an integral part of the city's cultural identity. Weather deeply affects the mood of the city – from the excitement of scantily clad partygoers on balmy December evenings and late February's lethargy, to the deepening anxiety that emerges after 100 days of rain (or more commonly, 100 days without rain). With a brief nod to the city's – now decommissioned – iconic MCL weather beacon, Grateful Fateful Sunshine Rain taps into this aspect of Brisbane's psyche with poetic, illuminated visualisations of real-time weather forecasts issued by the Bureau of Meteorology. Each evening, the artwork downloads tomorrow's forecast from the Bureau of Meteorology website. Data including the current local temperature, humidity, wind speed and direction, and precipitation (rain, hail, etc.) are used to generate a lighting display that conveys how tomorrow will feel. The artwork's background colour indicates the expected temperature – from cold blues through mild pastel pinks and blues to bright hot oranges and reds. White fluffy clouds roll across the artwork if cloud is predicted. The density of these clouds indicates the level of cover, whilst their movement indicates expected wind speed and direction. If rain is predicted, sparkles of white light appear on top of whichever background colour is chosen for the next day's temperature. Sparkles appear constantly before wet, drizzly days, and intermittently if scattered showers are predicted. Intermittent but more intense sparkles appear before rain storms or thunderstorms. Research Contribution: The work has made contributions to the field in the way it rethinks approaches to the conceptualization, design and realization of illuminated urban media. This has led to new theorizations of urban media, which consider how light and illumination can be used to convey meaningful data. The research has produced new methods for controlling illumination systems using tools and techniques typically employed in the computational arts. It has also developed methods and processes for the design and production of illuminated urban media architectures that are connected to real-time data sources and which do not follow the assumed logics of screen-based media and displays.
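The forecast-to-light mapping is described only qualitatively above; a hypothetical sketch of such a mapping (the thresholds, colours and field names are assumptions, not the artwork's actual control code) might look like:

```python
def temperature_to_rgb(temp_c):
    """Map the forecast temperature to a background colour: cold blues
    through mild pastels to hot oranges and reds."""
    if temp_c < 10:
        return (40, 90, 200)      # cold blue
    if temp_c < 20:
        return (230, 190, 210)    # mild pastel pink
    if temp_c < 30:
        return (250, 150, 60)     # warm orange
    return (220, 50, 30)          # hot red

def sparkle_mode(rain_chance, storm=False):
    """Sparkles of white light signal rain: constant before wet days,
    intermittent for scattered showers, intense bursts before storms."""
    if storm:
        return "intermittent-intense"
    if rain_chance > 0.7:
        return "constant"
    if rain_chance > 0.3:
        return "intermittent"
    return "none"

forecast = {"temp_c": 31.0, "rain_chance": 0.8, "storm": True}  # hypothetical feed
colour = temperature_to_rgb(forecast["temp_c"])
sparkles = sparkle_mode(forecast["rain_chance"], forecast["storm"])
```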
Abstract:
There has been a growing interest in alignment-free methods for phylogenetic analysis using complete genome data. Among them, the CVTree method, the feature frequency profiles method and the dynamical language approach were used to investigate the whole-proteome phylogeny of large dsDNA viruses. Using the data set of large dsDNA viruses from Gao and Qi (BMC Evol. Biol. 2007), the phylogenetic results based on the CVTree method and the dynamical language approach were compared in Yu et al. (BMC Evol. Biol. 2010). In this paper, we first apply the dynamical language approach to the data set of large dsDNA viruses from Wu et al. (Proc. Natl. Acad. Sci. USA 2009) and compare our phylogenetic results with those based on the feature frequency profiles method. Then we construct the whole-proteome phylogeny of a larger data set combining the above two data sets. According to the report of the International Committee on Taxonomy of Viruses (ICTV), the trees from our analyses are in good agreement with the latest classification of large dsDNA viruses.
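All three alignment-free methods build on k-mer (k-tuple) frequencies of whole proteomes; as a minimal sketch of that common starting point (the specific normalisations used by CVTree and the dynamical language approach are not reproduced here), one can form a frequency profile per sequence and compare profiles with a correlation-based distance:

```python
from itertools import product
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def kmer_profile(seq, k=3):
    """Raw k-mer frequency vector over the 20-letter amino-acid alphabet."""
    index = {"".join(p): i for i, p in enumerate(product(AMINO, repeat=k))}
    counts = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:
            counts[j] += 1
    return counts / max(counts.sum(), 1)

def correlation_distance(p, q):
    """Distance in [0, 1] derived from the Pearson correlation of two profiles."""
    r = np.corrcoef(p, q)[0, 1]
    return (1 - r) / 2

d = correlation_distance(kmer_profile("MKTAYIAKQR" * 20),
                         kmer_profile("MKTAYLAKQR" * 20))
```

The pairwise distance matrix produced this way is then fed to a standard tree-building method (e.g. neighbour joining) to obtain the whole-proteome phylogeny.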
Abstract:
"Quanta" was an interactive audio visual installation. Quanta was selected for inclusion in Virtual Terrain 2, an exhibition of the International Digital Art Project in 2008
Abstract:
For the past few years, research on the secure outsourcing of cryptographic computations has drawn significant attention from academics in the security and cryptology disciplines as well as from information security practitioners. One main reason for this interest is its application to resource-constrained devices such as RFID tags. While there has been significant progress in this domain since Hohenberger and Lysyanskaya provided formal security notions for secure computation delegation, some interesting challenges remain whose solution would support a wider deployment of cryptographic protocols that enable secure outsourcing of cryptographic computations. This position paper sets out these challenging problems, with RFID technology as the use case, together with our ideas, where applicable, that can provide a direction towards solving them.
Abstract:
We analyse the security of iterated hash functions that compute an input-dependent checksum which is processed as part of the hash computation. We show that a large class of such schemes, including those using non-linear or even one-way checksum functions, is not secure against the second preimage attack of Kelsey and Schneier, the herding attack of Kelsey and Kohno and the multicollision attack of Joux. Our attacks also apply to a large class of cascaded hash functions. Our second preimage attacks on the cascaded hash functions improve the results of Joux presented at Crypto'04. We also apply our attacks to the MD2 and GOST hash functions. Our second preimage attacks on the MD2 and GOST hash functions improve the previous best known short-cut second preimage attacks on these hash functions by factors of at least 2^26 and 2^54, respectively. Our herding and multicollision attacks on the hash functions based on generic checksum functions (e.g., one-way) are a special case of the attacks on the cascaded iterated hash functions previously analysed by Dunkelman and Preneel and are not better than their attacks. On hash functions with easily invertible checksums, our multicollision and herding attacks (if the hash value is short, as in MD2) are more efficient than those of Dunkelman and Preneel.
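The class of schemes analysed is, schematically, an iterated hash that additionally accumulates an input-dependent checksum of the message blocks and folds it into a final compression step; the toy sketch below illustrates that construction only (the compression function and XOR checksum are placeholders, not the attacked designs):

```python
import hashlib

def compress(state, block):
    """Toy compression function standing in for the scheme's real one."""
    return hashlib.sha256(state + block).digest()

def hash_with_checksum(message, block_size=32):
    """Iterated (Merkle-Damgard style) hash that also accumulates an
    input-dependent checksum and processes it as a final block."""
    blocks = [message[i:i + block_size].ljust(block_size, b"\x00")
              for i in range(0, len(message), block_size)] or [b"\x00" * block_size]
    state = b"\x00" * 32
    checksum = bytes(block_size)
    for b in blocks:
        state = compress(state, b)
        checksum = bytes(x ^ y for x, y in zip(checksum, b))
    return compress(state, checksum)   # checksum folded into the final step

digest = hash_with_checksum(b"example message")
```

The attacks in the paper show that adding such a checksum, even a non-linear or one-way one, does not restore resistance to the generic second preimage, herding and multicollision attacks on the underlying iterated structure.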
Abstract:
Vision-based underwater navigation and obstacle avoidance demand robust computer vision algorithms, particularly for operation in turbid water with reduced visibility. This paper describes a novel method for simultaneous underwater image quality assessment, visibility enhancement and disparity computation to increase stereo range resolution under dynamic, natural lighting and turbid conditions. The technique estimates the visibility properties from a sparse 3D map of the original degraded image using a physical underwater light attenuation model. Firstly, an iterated distance-adaptive image contrast enhancement enables a dense disparity computation and visibility estimation. Secondly, using a light attenuation model for ocean water, a color-corrected stereo underwater image is obtained along with a visibility distance estimate. Experimental results in shallow, naturally lit, high-turbidity coastal environments show that the proposed technique improves range estimation over the original images, as well as image quality and color for habitat classification. Furthermore, the recursiveness and robustness of the technique allow implementation onboard an Autonomous Underwater Vehicle for improving navigation and obstacle avoidance performance.
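The attenuation model is only named in the abstract; as a generic illustration, the widely used single-scattering formation model I = J·e^(-c·d) + B·(1 - e^(-c·d)) can be inverted per pixel once depth, attenuation and veiling light are estimated (all values below are toy assumptions, not the paper's calibration):

```python
import numpy as np

def restore(I, depth, c, B):
    """Invert I = J*exp(-c*d) + B*(1 - exp(-c*d)) to recover the scene
    radiance J, given per-pixel depth d, attenuation coefficient c and
    background (veiling) light B per colour channel."""
    t = np.exp(-c * depth[..., None])                     # per-pixel transmission
    return np.clip((I - B * (1.0 - t)) / np.maximum(t, 1e-3), 0.0, 1.0)

# Toy example: a 4x4 RGB image, a flat depth map in metres, and a bluish veil.
I = np.random.default_rng(2).random((4, 4, 3))
depth = np.full((4, 4), 5.0)
J = restore(I, depth, c=0.2, B=np.array([0.1, 0.3, 0.4]))
```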