959 results for GRAPHICS
Abstract:
The combined effect of pressure and temperature on the rate of gelatinisation of starch present in Thai glutinous rice was investigated. Pressure was found to initiate gelatinisation when its value exceeded 200 MPa at ambient temperature. On the other hand, complete gelatinisation was observed at 500 and 600 MPa at 70 degrees C, when the rice was soaked in water under these conditions for 120 min. A first-order kinetic model describing the rate of gelatinisation was developed to estimate the values of the rate constants as a function of pressure and temperature in the ranges 0.1-600 MPa and 20-70 degrees C. The model, based on the well-known Arrhenius and Eyring equations, assumed the form [GRAPHICS]. The constants k0, Ea and ΔV were found to be 31.19 s^-1, 37.89 kJ mol^-1 and -9.98 cm^3 mol^-1, respectively. It was further noted that the extent of gelatinisation occurring at any time, temperature and pressure could be exclusively correlated with the grain moisture content. (c) 2006 Elsevier Ltd. All rights reserved.
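The [GRAPHICS] placeholder presumably stood for the combined Arrhenius-Eyring rate expression. The exact form cannot be recovered from the extract, but a plausible reconstruction consistent with the reported constants k0, Ea and ΔV (with R the gas constant and P_ref an assumed reference pressure, e.g. ambient 0.1 MPa) is:

```latex
k(T, P) = k_0 \exp\left[-\frac{E_a + \Delta V \,(P - P_{\mathrm{ref}})}{R\,T}\right]
```

A negative ΔV makes the rate increase with pressure, which agrees with the observation that pressure initiates gelatinisation.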
Abstract:
Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet existing calibration methods are time consuming and depend on human judgements, making them error prone. The methods are also limited to optical see-through HMDs. Building on our existing HMD calibration method [1], we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in various positions. The locations of image features on the calibration object are then re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the display’s intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner in both see-through and non-see-through modes and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors and involves no error-prone human measurements.
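The intrinsic and extrinsic parameters recovered by such a procedure describe a pinhole camera model. As a minimal sketch (not the authors' method; all numeric values are illustrative), the following projects a 3D point through assumed intrinsics (focal length, principal point) and extrinsics (rotation, translation):

```python
import numpy as np

def project(points_3d, f, cx, cy, R, t):
    """Project Nx3 world points through a pinhole camera.

    f: focal length in pixels; (cx, cy): principal point;
    R, t: extrinsic rotation and translation (world -> camera).
    """
    K = np.array([[f, 0.0, cx],
                  [0.0, f, cy],
                  [0.0, 0.0, 1.0]])
    cam = R @ points_3d.T + t[:, None]   # world -> camera coordinates
    uv = K @ cam                         # apply intrinsics
    return (uv[:2] / uv[2]).T            # perspective divide -> pixel coords

# Illustrative values: identity orientation, point 2 m in front of the camera
uv = project(np.array([[0.5, 0.25, 2.0]]),
             f=800.0, cx=640.0, cy=480.0, R=np.eye(3), t=np.zeros(3))
```

Calibration inverts this mapping: given many known 3D points and their observed pixel positions, it solves for f, cx, cy, R and t.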
Abstract:
Ever since man invented writing he has used text to store and distribute his thoughts. With the advent of computers and the Internet, the delivery of these messages has become almost instant. Textual conversations can now be had regardless of location or distance. Advances in computational power for 3D graphics are enabling Virtual Environments (VEs) within which users can become increasingly immersed. By opening these environments to other users, initially by sharing these text conversation channels, we aim to extend the immersive experience into an online virtual community. This paper examines work that brings textual communications into the VE, enabling interaction between the real and virtual worlds.
Abstract:
In this paper, we study the oscillating property of positive solutions and the global asymptotic stability of the unique equilibrium of the two rational difference equations [GRAPHICS] and [GRAPHICS] where a is a nonnegative constant. (c) 2005 Elsevier Inc. All rights reserved.
Abstract:
In this paper, we study the behavior of the positive solutions of the system of two difference equations [GRAPHICS] where p >= 1, r >= 1, s >= 1, A >= 0, and x(1-r), x(2-r), ..., x(0), y(1-max{p,s}), ..., y(0) are positive real numbers. (c) 2005 Elsevier Inc. All rights reserved.
Abstract:
Many well-established statistical methods in genetics were developed in a climate of severe constraints on computational power. Recent advances in simulation methodology now bring modern, flexible statistical methods within the reach of scientists having access to a desktop workstation. We illustrate the potential advantages now available by considering the problem of assessing departures from Hardy-Weinberg (HW) equilibrium. Several hypothesis tests of HW have been established, as well as a variety of point estimation methods for the parameter which measures departures from HW under the inbreeding model. We propose a computational, Bayesian method for assessing departures from HW, which has a number of important advantages over existing approaches. The method incorporates the effects of uncertainty about the nuisance parameters (the allele frequencies), as well as the boundary constraints on f (which are functions of the nuisance parameters). Results are naturally presented visually, exploiting the graphics capabilities of modern computer environments to allow straightforward interpretation. Perhaps most importantly, the method is founded on a flexible, likelihood-based modelling framework, which can incorporate the inbreeding model if appropriate, but also allows the assumptions of the model to be investigated and, if necessary, relaxed. Under appropriate conditions, information can be shared across loci and, possibly, across populations, leading to more precise estimation. The advantages of the method are illustrated by application both to simulated data and to data analysed by alternative methods in the recent literature.
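Under the inbreeding model, the genotype probabilities at a biallelic locus are P(AA) = p^2 + fpq, P(Aa) = 2pq(1-f), P(aa) = q^2 + fpq. A minimal likelihood-based sketch (a grid-search point estimate with made-up counts, not the paper's Bayesian method, and assuming f >= 0 for simplicity) is:

```python
import numpy as np

def log_lik(f, p, n_AA, n_Aa, n_aa):
    """Multinomial log-likelihood of genotype counts under the inbreeding model."""
    q = 1.0 - p
    probs = np.array([p * p + f * p * q,      # P(AA)
                      2 * p * q * (1 - f),    # P(Aa)
                      q * q + f * p * q])     # P(aa)
    return float(np.array([n_AA, n_Aa, n_aa]) @ np.log(probs))

# Illustrative counts (not from the paper)
n_AA, n_Aa, n_aa = 60, 30, 10
n = n_AA + n_Aa + n_aa
p = (2 * n_AA + n_Aa) / (2 * n)              # allele frequency estimate

grid = np.linspace(0.0, 0.99, 199)           # simple grid over admissible f
f_hat = grid[np.argmax([log_lik(f, p, n_AA, n_Aa, n_aa) for f in grid])]
```

The full Bayesian treatment would instead place a prior on (p, f), respect the boundary constraints on f, and summarise the posterior graphically.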
Abstract:
A unique parameterization of the perspective projections in all whole-numbered dimensions is reported. The algorithm for generating a perspective transformation from parameters and for recovering parameters from a transformation is a modification of the Givens orthogonalization algorithm. The algorithm for recovering a perspective transformation from a perspective projection is a modification of Roberts' classical algorithm. Both algorithms have been implemented in Pop-11 with call-out to the NAG Fortran libraries. Preliminary Monte Carlo tests show that the transformation algorithm is highly accurate, but that the projection algorithm cannot recover magnitude and shear parameters accurately. However, there is reason to believe that the projection algorithm might improve significantly with the use of many corresponding points, or with multiple perspective views of an object. Previous parameterizations of the perspective transformations in the computer graphics and computer vision literature are discussed.
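Plain Givens orthogonalization, which the reported algorithm modifies, can be sketched as follows; this is the textbook QR factorisation by plane rotations, not the paper's modified version:

```python
import numpy as np

def givens_qr(A):
    """QR factorisation by Givens rotations: zero sub-diagonal entries one at a time."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for j in range(n):
        for i in range(m - 1, j, -1):        # zero column j from the bottom up
            a, b = R[i - 1, j], R[i, j]
            r = np.hypot(a, b)
            if r == 0.0:
                continue
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])  # 2x2 rotation acting on rows i-1, i
            R[[i - 1, i], :] = G @ R[[i - 1, i], :]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T  # accumulate Q = G1^T ... Gk^T
    return Q, R

A = np.array([[4.0, 1.0], [2.0, 3.0], [0.0, 5.0]])
Q, R = givens_qr(A)   # A == Q @ R, Q orthogonal, R upper triangular
```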
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture-recapture studies. In practice, however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution which leads to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
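For a homogeneous Poisson distribution with mean lambda, the ratio statistic described above, r(x) = (x+1) p(x+1) / p(x), equals lambda for every x. A small sketch verifying this constancy (in a real capture-recapture analysis the probabilities would be replaced by observed frequency counts):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson(lam) random variable."""
    return exp(-lam) * lam**x / factorial(x)

def ratio(x, lam):
    """Ratio-plot statistic: (x+1) * p(x+1) / p(x); constant (= lam) under Poisson."""
    return (x + 1) * poisson_pmf(x + 1, lam) / poisson_pmf(x, lam)

vals = [ratio(x, lam=2.5) for x in range(6)]   # all equal to 2.5
```

Plotting these ratios against x therefore gives a flat line under homogeneity; a rising line signals the kind of structured heterogeneity the paper attributes to Gamma mixtures.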
Abstract:
The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
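Classical histogram specification via cumulative-histogram matching, which underlies the regrading described above, can be sketched as follows. This is a simplified monotone mapping for illustration, not the paper's optimal or smoothed regradings:

```python
import numpy as np

def match_histogram(data, reference):
    """Monotone regrading: map each grey level of `data` to the reference
    level whose cumulative histogram is nearest from above (CDF matching)."""
    levels, counts = np.unique(data, return_counts=True)
    src_cdf = np.cumsum(counts) / data.size
    ref_levels, ref_counts = np.unique(reference, return_counts=True)
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # for each source level, find the first reference level whose CDF >= ours
    idx = np.clip(np.searchsorted(ref_cdf, src_cdf, side="left"),
                  0, len(ref_levels) - 1)
    lut = dict(zip(levels, ref_levels[idx]))  # single-valued, monotone lookup
    return np.vectorize(lut.get)(data)

# Tiny illustrative example: 4 equally frequent levels regraded to 3 levels
data = np.array([0, 0, 1, 1, 2, 2, 3, 3])
reference = np.array([10, 10, 10, 10, 20, 20, 30, 30])
out = match_histogram(data, reference)
```

Because the lookup table is monotone and single-valued, the result is exactly the class of transformations over which the paper's regrading is shown to be optimal.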
Abstract:
The fundamental principles of the teaching methodology followed for dyslexic learners evolve around the need for a multisensory approach, which would advocate repetition of learning tasks in an enjoyable way. The introduction of multimedia technologies in the field of education has supported the merging of new tools (digital camera, scanner) and techniques (sounds, graphics, animation) in a meaningful whole. Dyslexic learners are now given the opportunity to express their ideas using these alternative media and participate actively in the educational process. This paper discusses the preliminary findings of a single case study of two English monolingual dyslexic children working together to create an open-ended multimedia project on a laptop computer. The project aimed to examine whether the multimedia environment could enhance the dyslexic learners’ skills in composition. Analysis of the data has indicated that the technological facilities gave the children the opportunity to enhance the style and content of their work for a variety of audiences and to develop responsibilities connected to authorship.
Abstract:
This article is an analysis and reflection on the role of lists and diagrams in Start where you are, a multimedia improvisational piece performed as part of square zero independent dance festival: the second edition/la deuxième édition. This interdisciplinary festival was organised by collective (gulp) dance projects and took place in Ottawa, Canada, in August 2005. Start where you are was the result of a collaboration between the authors: two dance artists (Andrew and MacKinnon, the principals of (gulp)) and a visual communication designer (Gillieson). A sound artist and a lighting technician also participated in the work. This is a post-performance retrospective meant to analyse more closely the experience that meshed the evidentiary weight of words and graphics with the ephemerality and subjectivity of movement-based live performance. It contextualises some of the work of collective (gulp) within a larger tradition of improvisation in modern dance. It also looks at how choice-making processes are central to improvisation, how they relate to Start, and how linguistic material can intersect with and support improvisational performance. Lastly, it examines some characteristics of lists and diagrams, unique forms of visual language that are potentially rich sources of material for improvisation.
Abstract:
Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
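The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows (assuming each emitted spike is delivered along every outgoing synapse):

```python
neurons = 40_000
synapses_per_neuron = 1_000
mean_firing_rate_hz = 10

# spikes emitted per second, times fan-out, gives synaptic deliveries per second
deliveries_per_second = neurons * mean_firing_rate_hz * synapses_per_neuron
print(deliveries_per_second)  # 400000000, i.e. the quoted 400 million spikes/s
```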
Abstract:
The physical pendulum treated with a Hamiltonian formulation is a natural topic for study in a course in advanced classical mechanics. For the past three years, we have been offering a series of problem sets studying this system numerically in our third-year undergraduate courses in mechanics. The problem sets investigate the physics of the pendulum in ways not easily accessible without computer technology and explore various algorithms for solving mechanics problems. Our computational physics is based on Mathematica with some C communicating with Mathematica, although nothing in this paper is dependent on that choice. We have nonetheless found this system, and particularly its graphics, to be a good one for use with undergraduates.
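Although the course material described is Mathematica-based, the kind of numerical experiment involved can be sketched in a few lines of Python. Here is a symplectic (leapfrog) integrator for the dimensionless pendulum Hamiltonian H = p^2/2 - cos q, one of the standard algorithms such problem sets explore, with an energy-conservation check; initial conditions and step size are illustrative:

```python
import math

def energy(q, p):
    """Dimensionless pendulum Hamiltonian H = p^2/2 - cos q."""
    return 0.5 * p * p - math.cos(q)

def leapfrog(q, p, dt, steps):
    """Symplectic kick-drift-kick integration of dq/dt = p, dp/dt = -sin q."""
    for _ in range(steps):
        p -= 0.5 * dt * math.sin(q)   # half kick
        q += dt * p                   # full drift
        p -= 0.5 * dt * math.sin(q)   # half kick
    return q, p

q0, p0 = 1.0, 0.0                     # released from rest at 1 rad
q1, p1 = leapfrog(q0, p0, dt=0.01, steps=10_000)
drift = abs(energy(q1, p1) - energy(q0, p0))   # stays small: symplectic scheme
```

The bounded energy drift, in contrast to a naive Euler step, is exactly the kind of behaviour such Hamiltonian problem sets are designed to exhibit graphically.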