987 results for evolved transforms
Abstract:
Thermogravimetry combined with evolved gas mass spectrometry has been used to ascertain the stability of the ‘cave’ mineral brushite. X-ray diffraction shows that brushite from the Jenolan Caves is very pure. Thermogravimetric analysis coupled with ion current mass spectrometry shows a mass loss at 111°C due to loss of water of hydration. A further decomposition step occurs at 190°C with the conversion of the hydrogen phosphate to a mixture of calcium orthophosphate and calcium pyrophosphate. TG-DTG shows the mineral is not stable above 111°C. A mechanism for the formation of brushite on calcite surfaces is proposed, and this mechanism has relevance to the formation of brushite in urinary tracts.
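As a worked illustration of the two mass-loss steps described above, the reactions below assume the standard brushite stoichiometry (CaHPO4·2H2O); note that the study reports the second step yielding a mixture of calcium orthophosphate and calcium pyrophosphate, so the condensation written here is only the idealised limit.

```latex
% dehydration of brushite near 111 C: loss of the two waters of hydration
\mathrm{CaHPO_4 \cdot 2H_2O \;\xrightarrow{\;\sim 111^{\circ}\mathrm{C}\;}\; CaHPO_4 + 2\,H_2O\uparrow}
% condensation of the hydrogen phosphate near 190 C towards calcium pyrophosphate
\mathrm{2\,CaHPO_4 \;\xrightarrow{\;\sim 190^{\circ}\mathrm{C}\;}\; Ca_2P_2O_7 + H_2O\uparrow}
```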
Abstract:
This book contributes to the literature on early childhood education services in Singapore. It evolved from a research study which was carried out to understand the beliefs and practices of three Singaporean teachers through their critical reflection on their professional work. This study was based on research which indicates that any efforts to improve the quality of early childhood services should involve the teachers themselves. Teachers who are capable of critical reflection on their work with children and families will be more effective practitioners.
Abstract:
We treat two related moving boundary problems. The first is the ill-posed Stefan problem for melting a superheated solid in one Cartesian coordinate. Mathematically, this is the same problem as that for freezing a supercooled liquid, with applications to crystal growth. By applying a front-fixing technique with finite differences, we reproduce existing numerical results in the literature, concentrating on solutions that break down in finite time. This sort of finite-time blow-up is characterised by the speed of the moving boundary becoming unbounded in the blow-up limit. The second problem, which is an extension of the first, is proposed to simulate aspects of a particular two-phase Stefan problem with surface tension. We study this novel moving boundary problem numerically, and provide results that support the hypothesis that it exhibits a similar type of finite-time blow-up as the more complicated two-phase problem. The results are unusual in the sense that it appears the addition of surface tension transforms a well-posed problem into an ill-posed one.
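A minimal numerical sketch of the front-fixing technique mentioned above, applied to a one-phase Stefan problem via the Landau transformation ξ = x/s(t); the boundary data, Stefan number and grid parameters below are illustrative placeholders rather than the paper's setup, and the run merely flags when the front speed grows without bound.

```python
import numpy as np

# One-phase Stefan problem: u_t = u_xx on 0 < x < s(t), with
# u(0, t) = 1, u(s(t), t) = 0 and Stefan condition ds/dt = -beta * u_x(s(t), t).
# The front-fixing (Landau) transform xi = x / s(t) maps the moving domain to [0, 1]:
#   u_t = u_xixi / s^2 + xi * (ds/dt) / s * u_xi
N = 101
xi = np.linspace(0.0, 1.0, N)
dxi = xi[1] - xi[0]
beta = -1.0          # hypothetical Stefan number; the sign controls well- vs ill-posedness
s = 0.5              # initial front position (placeholder)
u = 1.0 - xi         # initial profile satisfying both boundary conditions
t = 0.0

while s > 1e-6:
    ux_front = (u[-1] - u[-2]) / dxi            # one-sided derivative at the front (xi = 1)
    sdot = -beta * ux_front / s                 # front speed from the Stefan condition
    if abs(sdot) > 1e4:                         # front speed becoming unbounded
        print(f"blow-up signature: t = {t:.6f}, s = {s:.5f}, ds/dt = {sdot:.3e}")
        break
    # explicit step, with dt restricted by the diffusion and advection limits
    dt = min(0.25 * (s * dxi) ** 2, 0.5 * s * dxi / (abs(sdot) + 1e-12))
    uxx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dxi ** 2
    ux = (u[2:] - u[:-2]) / (2.0 * dxi)
    u[1:-1] += dt * (uxx / s ** 2 + xi[1:-1] * sdot / s * ux)
    u[0], u[-1] = 1.0, 0.0                      # re-impose boundary conditions
    s += dt * sdot
    t += dt
```

The scaling dt ∝ (s·Δξ)² is the usual explicit stability restriction on the fixed grid; as s shrinks, the admissible step shrinks with it, which is one practical symptom of the approaching blow-up.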
Abstract:
The profession of industrial design is changing and, with it, so must industrial design education. The newly developed final-year industrial design unit at the Queensland University of Technology (QUT) was created to initiate such a change. A designer's role in industry is no longer limited to the invention process surrounding human-centred design but has now evolved into design-led innovation. This paper reflects upon the teaching methods employed over a two-year period and the improvements made to the unit over that time. The student project outcome is to produce a design solution that integrates an underlying novel technology into a new product and/or service, with business strategies and manufacturing details being fully integrated into the design process. It is this integrated approach to industrial design teaching that will foster more grounded and resourceful future designers.
Abstract:
Picturebooks invite performance every time they are read. What happens to them when they’re adapted for live performance? This ongoing practice-led research project (2008-) regenerates and transforms the picturebook The Empty City (Hachette/Livre 2007) by David Megarrity and Jonathon Oxlade into a live experience. In this rebuilding, the interanimation of text and illustration on the picturebook page suddenly opens up into a new and complex structure incorporating the composition of music, animation, live action, projected image and performing objects. The presenter is both the creator of the source text and the writer/composer of the adaptation, providing a unique vantage point that draws on sources from within and without the creative process, up to and including audience reception. From the foundations up, this paper's focus is on deep, muddy sites of development in the adaptation process, unearthed treasures, and how perceptions of fear and safety push, sway and stress the building of a new performance work for children in content, form and process.
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest-growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has also had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
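As a small, self-contained illustration of one topic the book lists (spectral analysis of a multicomponent signal via the discrete Fourier transform); the sampling rate, tone frequencies and noise level below are arbitrary choices, not examples taken from the book.

```python
import numpy as np

fs = 1000.0                             # sampling rate in Hz (arbitrary)
t = np.arange(0, 1.0, 1.0 / fs)         # one second of samples
# multicomponent signal: two sinusoidal components plus additive noise
x = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
x = x + 0.2 * np.random.randn(t.size)

X = np.fft.rfft(x)                      # DFT of the real-valued signal
f = np.fft.rfftfreq(t.size, d=1.0 / fs) # corresponding frequency bins
magnitude = np.abs(X) / t.size
peaks = sorted(f[np.argsort(magnitude)[-2:]])
print("estimated component frequencies (Hz):", peaks)
```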
Abstract:
Specialisation in nursing enables a nurse to focus, in much greater depth, on the requisite knowledge and skills for providing patients with the best possible care. Nephrology nursing is one such area where specialisation has evolved. The characteristic focus of practice emerged as an important feature during a study into the process of expertise acquisition in nephrology nursing practice. Using grounded theory methodology, this study involved 6 non-expert and 11 expert nurses and took place in one renal unit in New South Wales. Nephrology nursing practice was observed for 103 hours, and this was immediately followed by semi-structured interviews. The characteristic of focus was conceptualised as the nurses' centre of attention or concentration while they were undertaking nursing activities. Focus ranged from inexperienced non-expert nurses concentrating predominantly on the immediate task at hand, through experienced non-expert nurses who focussed on the medium term, to expert nurses who viewed actions (and their possible consequences) more broadly and in the longer term. Of significance to nursing is how nephrology nurses alter their focus of practice as they acquire and exercise their developing expertise in this specialty.
Abstract:
Due to increasing recognition by industry that partnerships with universities can lead to more effective knowledge and skills acquisition and deployment, corporate learning programmes are currently experiencing a resurgence of interest. A rethinking of corporations’ approaches to what has traditionally been classed as ‘training’ has resulted in a new focus on learning and the adoption of philosophies that underlie the academic paradigm. This paper reports on two studies of collaboration between major international engineering corporations and an Australian university, the aim of which was to up-skill the workforce in response to changing markets. The paper highlights the differences between the models of learning adopted in such collaboration and those in more conventional, university-based environments. The learning programmes combine the ADDIE (analysis, design, development, implementation and evaluation) development model with workplace learning models. Adaptations that have added value for industry partners, and recommendations as to how these can be evolved to cope with change, are discussed. The learning is contextualised by industry-based subject-matter experts working in close collaboration with university experts and learning designers to develop programmes that reflect current and future needs in the organisation. Results derived from user feedback indicate that the learning programmes are effectively aligned with the needs of the industry partners whilst simultaneously upholding academic ideals. In other words, it is possible to combine academic and more traditional approaches to develop corporate learning programmes that satisfy requirements in the workplace. Emerging from the study, a new conceptual framework for the development of corporate learning is presented.
Abstract:
During the last decade, globalisation and liberalisation of financial markets, changing societal expectations and corporate governance scandals have increased attention to the fiduciary duties of non-executive directors. In this context, recent corporate governance reform initiatives have emphasised the control task and independence of non-executive directors. However, little attention has been paid to their impact on the external and internal service tasks of non-executive directors. Therefore, this paper investigates how the service tasks of non-executive directors have evolved in the Netherlands. Data on corporate governance at the top 100 listed companies in the Netherlands between 1997 and 2005 show that the emphasis on non-executive directors' external service task has shifted to their internal service task, i.e. from non-executive directors acting as boundary spanners to non-executive directors providing advice and counselling to executive directors. This shift in board responsibilities affects non-executive directors' ability to generate network benefits through board relationships and has implications for non-executive directors' functional requirements.
Abstract:
In this paper we investigate the heuristic construction of bijective s-boxes that satisfy a wide range of cryptographic criteria, including algebraic complexity, high nonlinearity and low autocorrelation, and that have none of the known weaknesses, including linear structures, fixed points or linear redundancy. We demonstrate that power mappings can be evolved (by iterated mutation operators alone) to generate bijective s-boxes with the best known tradeoffs among the considered criteria. The s-boxes found are suitable for use directly in modern encryption algorithms.
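A toy sketch of a mutation-only search of the kind described above: it starts from a random 8-bit bijection (a stand-in for the finite-field power mappings the paper starts from), applies swap mutations that preserve bijectivity, and keeps changes that do not decrease nonlinearity, computed with a fast Walsh-Hadamard transform. It optimises a single criterion, whereas the paper trades off several criteria simultaneously.

```python
import random

N_BITS = 8
SIZE = 1 << N_BITS
PARITY = [bin(v).count("1") & 1 for v in range(SIZE)]   # parity lookup table

def nonlinearity(sbox):
    """Nonlinearity of an 8x8 s-box via the fast Walsh-Hadamard transform."""
    worst = 0
    for mask in range(1, SIZE):                     # every nonzero output mask
        # component Boolean function of the s-box, as a +/-1 sequence
        f = [1 - 2 * PARITY[mask & y] for y in sbox]
        h = 1
        while h < SIZE:                             # in-place fast WHT
            for i in range(0, SIZE, 2 * h):
                for j in range(i, i + h):
                    a, b = f[j], f[j + h]
                    f[j], f[j + h] = a + b, a - b
            h *= 2
        worst = max(worst, max(abs(v) for v in f))
    return SIZE // 2 - worst // 2

def mutate(sbox):
    """Swap two entries of the lookup table, which preserves bijectivity."""
    s = list(sbox)
    i, j = random.sample(range(SIZE), 2)
    s[i], s[j] = s[j], s[i]
    return s

sbox = list(range(SIZE))
random.shuffle(sbox)                                # random bijection as a stand-in start point
best = nonlinearity(sbox)
for _ in range(50):                                 # a short mutation-only hill climb
    candidate = mutate(sbox)
    score = nonlinearity(candidate)
    if score >= best:
        sbox, best = candidate, score
print("nonlinearity reached:", best)
```

Swapping two entries is the simplest bijectivity-preserving mutation; richer operators and multi-criteria cost functions would be needed to approach the tradeoffs reported in the paper.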
Abstract:
A new algorithm for extracting features from images for object recognition is described. The algorithm uses higher-order spectra to provide desirable invariance properties, to provide noise immunity, and to incorporate nonlinearity into the feature extraction procedure, thereby allowing the use of simple classifiers. An image can be reduced to a set of 1D functions via the Radon transform, or alternatively, the Fourier transform of each 1D projection can be obtained from a radial slice of the 2D Fourier transform of the image according to the Fourier slice theorem. A triple product of Fourier coefficients, referred to as the deterministic bispectrum, is computed for each 1D function and is integrated along radial lines in bifrequency space. Phases of the integrated bispectra are shown to be translation- and scale-invariant. Rotation invariance is achieved by a regrouping of these invariants at a constant radius followed by a second stage of invariant extraction. Rotation invariance is thus converted to translation invariance in the second step. Results using synthetic and actual images show that isolated, compact clusters are formed in feature space. These clusters are linearly separable, indicating that the nonlinearity required in the mapping from the input space to the classification space is incorporated well into the feature extraction stage. The use of higher-order spectra results in good noise immunity, as verified with synthetic and real images. Classification of images using the algorithm based on higher-order spectra compares favorably to classification using the method of moment invariants.
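A minimal sketch of the bispectral invariant for a single 1D projection as described above: samples of the deterministic bispectrum B(f1, f2) = X(f1) X(f2) X*(f1 + f2) are accumulated along radial lines in bifrequency space and only the phase of each integral is kept. The Radon/Fourier-slice projection step, the radial-line spacing and the later rotation-invariance stages are omitted or chosen arbitrarily here.

```python
import numpy as np

def bispectral_phase_features(g, n_lines=8):
    """Phases of the deterministic bispectrum of a 1D projection g,
    accumulated along radial lines f2 = a * f1 in bifrequency space."""
    X = np.fft.fft(g)
    n = len(g)
    feats = []
    for a in np.linspace(0.05, 0.95, n_lines):          # slopes of the radial lines (placeholders)
        acc = 0.0 + 0.0j
        for f1 in range(1, n // 2):
            f2 = int(round(a * f1))
            if f1 + f2 >= n // 2:
                break
            acc += X[f1] * X[f2] * np.conj(X[f1 + f2])   # deterministic bispectrum sample
        feats.append(np.angle(acc))                      # phase of the integrated bispectrum
    return np.array(feats)

# toy check: the phases are (approximately) unchanged under circular translation
g = np.zeros(256)
g[100:140] = np.hanning(40)
print(bispectral_phase_features(g))
print(bispectral_phase_features(np.roll(g, 17)))
```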
Abstract:
A general procedure to determine the principal domain (i.e., the nonredundant region of computation) of any higher-order spectrum is presented, using the bispectrum as an example. The procedure is then applied to derive the principal domain of the trispectrum of a real-valued, stationary time series. These results are easily extended to compute the principal domains of other higher-order spectra.
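For orientation, the symmetry relations that such a derivation exploits, written for the bispectrum of a real-valued, stationary time series sampled at rate f_s; the triangle quoted at the end is the region commonly cited as the bispectrum's principal domain, while the trispectrum case treated in the paper is more involved.

```latex
B(f_1, f_2) = X(f_1)\, X(f_2)\, X^{*}(f_1 + f_2)
B(f_1, f_2) = B(f_2, f_1) = B^{*}(-f_1, -f_2) = B(f_1, -f_1 - f_2)
% these symmetries fold the full bifrequency plane onto the triangular region
\{(f_1, f_2) \;:\; 0 \le f_2 \le f_1, \;\; f_1 + f_2 \le f_s / 2\}
```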
Abstract:
Features derived from the trispectra of DFT magnitude slices are used for multi-font digit recognition. These features are insensitive to translation, rotation, or scaling of the input. They are also robust to noise. Classification accuracy tests were conducted on a common database of 256 × 256 pixel bilevel images of digits in 9 fonts. Randomly rotated and translated noisy versions were used for training and testing. The results indicate that the trispectral features are better than moment invariants and affine moment invariants. They achieve a classification accuracy of 95%, compared to about 81% for Hu's (1962) moment invariants and 39% for the Flusser and Suk (1994) affine moment invariants on the same data in the presence of 1% impulse noise, using a 1-NN classifier. For comparison, a multilayer perceptron with no normalization for rotations and translations yields 34% accuracy on 16 × 16 pixel low-pass filtered and decimated versions of the same data.
Abstract:
In this paper we propose a new method for face recognition using fractal codes. Fractal codes represent local contractive affine transformations which, when iteratively applied to range-domain pairs in an arbitrary initial image, result in a fixed point close to a given image. The transformation parameters, such as brightness offset, contrast factor, orientation and the address of the corresponding domain for each range, are used directly as features in our method. Features of an unknown face image are compared with those pre-computed for images in a database. There is no need to iterate, use fractal neighbor distances or fractal dimensions for comparison in the proposed method. This method is robust to scale change, frame size change and rotations, as well as to some noise, facial expressions and blur distortion in the image.
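A minimal sketch of fractal-code feature extraction along the lines described above: each range block is matched, in a least-squares sense, against spatially contracted domain blocks, and the contrast factor, brightness offset and matching domain address are retained as features. Block sizes, the domain search grid and the random grayscale test patch are placeholder choices, and the orientation (isometry) parameter mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def contract(block):
    """Average 2x2 neighbourhoods: maps a 2R x 2R domain block to R x R."""
    return 0.25 * (block[0::2, 0::2] + block[0::2, 1::2] +
                   block[1::2, 0::2] + block[1::2, 1::2])

def fractal_features(img, r=4, step=8):
    """For each non-overlapping r x r range block, find the contracted 2r x 2r
    domain block (searched on a coarse grid) and the grey-level map v -> s*v + o
    minimising the squared error; return (contrast s, brightness o, domain index)."""
    h, w = img.shape
    domains = []
    for y in range(0, h - 2 * r + 1, step):
        for x in range(0, w - 2 * r + 1, step):
            domains.append(contract(img[y:y + 2 * r, x:x + 2 * r]))
    feats = []
    for y in range(0, h - r + 1, r):
        for x in range(0, w - r + 1, r):
            rng = img[y:y + r, x:x + r]
            best = None
            for idx, dom in enumerate(domains):
                dv = dom - dom.mean()
                var = (dv * dv).sum()
                s = ((rng - rng.mean()) * dv).sum() / var if var > 0 else 0.0
                o = rng.mean() - s * dom.mean()
                err = ((s * dom + o - rng) ** 2).sum()
                if best is None or err < best[0]:
                    best = (err, s, o, idx)
            feats.append(best[1:])               # (contrast, brightness, domain address)
    return np.array(feats)

# toy usage on a random patch; real use would compare such feature vectors
# between a probe face image and a pre-computed database of faces
img = np.random.rand(32, 32)
print(fractal_features(img).shape)
```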
Abstract:
Highlights
► Provides a review of the history and development of locative media.
► Outlines different human-computer interaction techniques applied in locative media.
► Discusses how locative media applications have changed interaction affordances in and of physical spaces.
► Discusses practices of people in urban settings that evolved through these new affordances.
► Provides an overview of methods to investigate and elaborate design principles for future locative media.