164 results for Wave functions


Relevance: 20.00%

Abstract:

Previous work by Professor John Frazer on Evolutionary Architecture provides a basis for the development of a system evolving architectural envelopes in a generic and abstract manner. Recent research by the authors has focused on the implementation of a virtual environment for the automatic generation and exploration of complex forms and architectural envelopes, based on solid modelling techniques and the integration of evolutionary algorithms with enhanced computational and mathematical models. Abstract data types are introduced for the genotypes in a genetic algorithm in order to develop complex models using generative and evolutionary computing techniques. Multi-objective optimisation techniques are employed to define the fitness function used in the evaluation process.
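To make the abstract genotype data type and the multi-objective fitness evaluation concrete, here is a minimal sketch in Python; the Genotype class, the placeholder objectives (enclosed volume, material cost, daylight access) and the fixed weights are illustrative assumptions, not the authors' implementation, where fitness would be evaluated through the solid-modelling environment.

```python
# Hypothetical sketch: an abstract genotype data type driving a simple genetic
# algorithm with a weighted multi-objective fitness function.
import random
from dataclasses import dataclass

@dataclass
class Genotype:
    """Abstract genotype: a fixed-length vector of envelope parameters in [0, 1]."""
    genes: list

    @classmethod
    def random(cls, length=8):
        return cls([random.random() for _ in range(length)])

    def mutate(self, rate=0.1):
        return Genotype([g if random.random() > rate else random.random()
                         for g in self.genes])

def crossover(a, b):
    """One-point crossover of two genotypes."""
    cut = random.randrange(1, len(a.genes))
    return Genotype(a.genes[:cut] + b.genes[cut:])

def fitness(g, weights=(0.5, 0.3, 0.2)):
    """Weighted multi-objective fitness built from placeholder objectives."""
    volume = sum(g.genes)                          # stand-in for enclosed volume
    material = sum(abs(x - 0.5) for x in g.genes)  # stand-in for material cost
    daylight = max(g.genes)                        # stand-in for daylight access
    return weights[0] * volume - weights[1] * material + weights[2] * daylight

population = [Genotype.random() for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # truncation selection
    offspring = [crossover(random.choice(parents), random.choice(parents)).mutate()
                 for _ in range(20)]
    population = parents + offspring
print("best fitness:", round(fitness(max(population, key=fitness)), 3))
```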

Relevance: 20.00%

Abstract:

Dr. Young-Ki Paik directs the Yonsei Proteome Research Center in Seoul, Korea, and was elected President of the Human Proteome Organization (HUPO) in 2009. In the December 2009 issue of Current Pharmacogenomics and Personalized Medicine (CPPM), Dr. Paik explains the new field of pharmacoproteomics and the approaching wave of “proteomics diagnostics” in relation to personalized medicine, HUPO’s role in advancing proteomics technology applications, the HUPO Proteomics Standards Initiative, and the future impact of proteomics on medicine, science, and society. Additionally, he comments (1) that there is a need to launch a Gene-Centric Human Proteome Project (GCHPP) through which all representative proteins encoded by the genes can be identified and quantified in a specific cell and tissue, and (2) that the innovation frameworks within the diagnostics industry hitherto borrowed from the genetics age may require reevaluation in the case of proteomics, in order to facilitate the uptake of pharmacoproteomics innovations. He stresses the importance of biological/clinical plausibility driving the evolution of biotechnologies such as proteomics, instead of an isolated, singular focus on the technology per se. Dr. Paik earned his Ph.D. in biochemistry from the University of Missouri-Columbia and carried out postdoctoral work at the Gladstone Foundation Laboratories of Cardiovascular Disease, University of California at San Francisco. In 2005, his research team at Yonsei University first identified and characterized the chemical structure of the C. elegans dauer pheromone (daumone), which controls the aging process of this nematode. He is interviewed by a multidisciplinary team specializing in knowledge translation, technology regulation, health systems governance, and innovation analysis.

Relevance: 20.00%

Abstract:

A method for the determination of lactose in food samples by Osteryoung square wave voltammetry (OSWV) was developed. It is based on the nucleophilic addition reaction between lactose and aqueous ammonia: the carbonyl group of lactose is converted into an imido group, which increases its electrochemical reduction activity and hence the sensitivity. The optimal conditions for the nucleophilic addition reaction were investigated, and it was found that in NH4Cl–NH3 buffer of pH 10.1 the peak current was linear with lactose concentration over the range 0.6–8.4 mg L−1, with a detection limit of 0.44 mg L−1. The proposed method was applied to the determination of lactose in food samples and satisfactory results were obtained.

Relevance: 20.00%

Abstract:

Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations. Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics as the sparse structure can be exploited, typically through the use of the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations. In this thesis, novel applications of Krylov subspace approximations to matrix functions for both of these problems are investigated. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a 'square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables. Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. In this thesis we compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for approximating matrix functions of this form. A number of novel results are presented in this thesis. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation and give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b and investigate stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and for approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
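As an illustration of the computational kernel shared by both applications, the following is a minimal sketch (not the thesis code) of the basic Lanczos approximation to f(A)b with f(t) = t^(-1/2), used to draw an approximate GMRF sample x = A^(-1/2)z; the example precision matrix (a 2D lattice Laplacian plus a small ridge), the fixed subspace dimension and the absence of reorthogonalisation or a stopping criterion are simplifying assumptions. The restarted, shift-and-invert, extended Krylov and rational variants studied in the thesis modify this basic iteration to control the subspace size.

```python
# Hypothetical sketch: basic Lanczos approximation of f(A) b with f(t) = t**(-1/2),
# applied to drawing an approximate GMRF sample x = A^(-1/2) z.
import numpy as np
import scipy.sparse as sp
from scipy.linalg import eigh_tridiagonal

def lanczos_inv_sqrt(A, b, m=80):
    """m-step Lanczos approximation of A^(-1/2) b (no reorthogonalisation)."""
    n = b.shape[0]
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    # f(A) b is approximated by ||b|| * V_m * f(T_m) * e_1, with T_m tridiagonal.
    theta, S = eigh_tridiagonal(alpha, beta)      # spectral decomposition of T_m
    f_Tm_e1 = S @ (theta ** -0.5 * S[0, :])       # f(T_m) e_1 via the spectral form
    return np.linalg.norm(b) * (V @ f_Tm_e1)

# Example precision matrix: 2D lattice Laplacian plus a small ridge (illustrative only).
n = 50
L1 = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = (sp.kron(sp.eye(n), L1) + sp.kron(L1, sp.eye(n)) + 0.1 * sp.eye(n * n)).tocsr()
z = np.random.default_rng(0).standard_normal(n * n)
x = lanczos_inv_sqrt(A, z, m=80)                  # approximate draw x = A^(-1/2) z
```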

Relevance: 20.00%

Abstract:

While there is substantial research on attitudinal and behavioral loyalty, the deconstruction of attitudinal loyalty into its two key components – emotional and cognitive loyalty – has been largely ignored. Despite the existence of managerial strategies aimed at increasing each of these two components, there is little academic research to support these managerial efforts. This paper seeks to advance the understanding of emotional and cognitive brand loyalty by examining the psychological functions that these dimensions of brand loyalty perform for the consumer. We employ Katz’s (1960) four functions of attitudes (utilitarian, knowledge, value-expression, ego-defence) to investigate this question. Surveys using a convenience sample were completed by 268 consumers in two metropolitan cities, covering a variety of goods, services and durable products. The relationships between the functions and the dimensions of loyalty were examined using MANOVA. The results show that both the utilitarian and knowledge functions of loyalty are significantly positively related to cognitive loyalty, while the ego-defensive function of loyalty is significantly positively related to emotional loyalty. The results for the value-expressive function of loyalty were non-significant.

Relevance: 20.00%

Abstract:

Vitamin D is unique among the vitamins in that humans can synthesize it via the action of UV radiation upon the skin. This, combined with its ability to act on specific target tissues via vitamin D receptors (VDRs), makes its classification as a steroid hormone more appropriate. While vitamin D deficiency is a recognized problem in some northern-latitude countries, recent studies have shown that even in sunny countries such as Australia, vitamin D deficiency may be more prevalent than first thought. Vitamin D is best known for its role in bone health; however, the discovery of VDRs in a wide variety of tissue types has also opened up roles for vitamin D far beyond traditional bone health. These include possible associations with autoimmune diseases such as multiple sclerosis and inflammatory bowel diseases, cancer, cardiovascular diseases and muscle strength. Firstly, this paper presents an overview of the two sources of vitamin D: exposure to ultraviolet-B radiation and food sources of vitamin D, with particular focus on both Australian and international studies of dietary vitamin D intake and national fortification strategies. Secondly, the paper reviews recent epidemiological and experimental evidence linking vitamin D to health and disease for the major conditions associated with suboptimal vitamin D, while identifying significant gaps in the research and possible future directions for research.

Relevance: 20.00%

Abstract:

High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this technique produces a very large amount of digital data, for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions traditionally used for modeling corneal surfaces may not necessarily represent given corneal surface data correctly in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the RMS surface error and the point spread function cross-correlation. The parameters of the approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
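A hedged sketch of the fitting machinery is given below: a rational surface model is fitted to height data by nonlinear least squares with the Levenberg-Marquardt algorithm via scipy.optimize.least_squares. For brevity a plain radial polynomial basis stands in for the Zernike basis used in the study, and the height profile is a synthetic placeholder rather than real videokeratoscopic data.

```python
# Hypothetical sketch: Levenberg-Marquardt fit of a rational surface model
# S(rho) = P(rho) / (1 + Q(rho)), with polynomial numerator and denominator.
import numpy as np
from scipy.optimize import least_squares

def rational_surface(params, rho, n_num, n_den):
    p = params[:n_num]                 # numerator coefficients
    q = params[n_num:n_num + n_den]    # denominator coefficients (constant term fixed at 1)
    num = sum(p[k] * rho ** k for k in range(n_num))
    den = 1.0 + sum(q[k] * rho ** (k + 1) for k in range(n_den))
    return num / den

def residuals(params, rho, z, n_num, n_den):
    return rational_surface(params, rho, n_num, n_den) - z

# Synthetic "corneal height" profile along the radial coordinate (illustrative only).
rho = np.linspace(0.0, 1.0, 400)
z = 7.8 - 0.9 * rho ** 2 / (1.0 + 0.2 * rho ** 2)

n_num, n_den = 4, 2
x0 = np.zeros(n_num + n_den)           # start from a flat surface
fit = least_squares(residuals, x0, args=(rho, z, n_num, n_den), method="lm")
rms = np.sqrt(np.mean(fit.fun ** 2))   # RMS surface error of the fitted model
print(f"converged: {fit.success}, RMS surface error: {rms:.2e}")
```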

Relevance: 20.00%

Abstract:

This report presents an analysis of the data from the first wave of the Longitudinal Study of Australian Children (LSAC) to explore the wellbeing of 5,107 children in the infant cohort of the study and the 4,983 children, aged 4 to 5 years, in the child cohort. Wave 1 of LSAC includes measures of multiple aspects of children’s early development. These developmental measures are summarised in the LSAC Outcome Index, a composite measure which includes an overall index as well as three separate domain scores, tapping physical development, social and emotional functioning, and learning and cognitive development.

Relevance: 20.00%

Abstract:

In 1967 Brisbane Repertory Theatre made a decision that was to change the city's cultural landscape in a significant and lasting way. Faced with crippling theatre rental costs, Brisbane Rep found a realistic solution by converting one of its properties - an old Queenslander - into a unique theatre space. The theatre-in-the-box that emerged, aptly called La Boite, opened on 23 June 1967 with a production of John Osborne's Look Back in Anger. This experimental space excited the imagination of a new, younger audience not previously interested in Brisbane Rep's essentially conservative fare. It attracted a new group of directors and actors keen to be part of a changing repertoire that embraced more radical, non-mainstream productions, some of which were Australian plays. The decade after 1967 was a period of change and development unprecedented in La Boite's history. Since then the company has sustained and grown its commitment to Australian plays and the commissioning of new works. To what extent was this most significant moment in La Boite's transformational journey influenced by southern 'new waves' of change? With the benefit of hindsight, it is now time for a reconsideration of Brisbane's distinctive contribution to the New Wave.

Relevance: 20.00%

Abstract:

Purpose: To ascertain the effectiveness of object-centered three-dimensional representations for the modeling of corneal surfaces. Methods: Three-dimensional (3D) surface decompositions into series of basis functions including (i) spherical harmonics, (ii) hemispherical harmonics, and (iii) 3D Zernike polynomials were considered and compared to the traditional viewer-centered representation by two-dimensional (2D) Zernike polynomial expansion for a range of retrospective videokeratoscopic height data from three clinical groups. The data were collected using the Medmont E300 videokeratoscope. The groups included 10 normal corneas with corneal astigmatism less than −0.75 D, 10 astigmatic corneas with corneal astigmatism between −1.07 D and −3.34 D (Mean = −1.83 D, SD = ±0.75 D), and 10 keratoconic corneas. Only data from the right eyes of the subjects were considered. Results: All object-centered decompositions led to significantly better fits to corneal surfaces (in terms of the RMS error values) than the corresponding 2D Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters (2, 4, 6, and 8 mm), and model orders (4th to 10th radial orders). The best results (smallest RMS fit error) were obtained with the spherical harmonics decomposition, which led to about a 22% reduction in the RMS fit error compared to the traditional 2D Zernike polynomials. Hemispherical harmonics and the 3D Zernike polynomials reduced the RMS fit error by about 15% and 12%, respectively. Larger reductions in RMS fit error were achieved for smaller corneal diameters and lower-order fits. Conclusions: Object-centered 3D decompositions provide viable alternatives to the traditional viewer-centered 2D Zernike polynomial expansion of a corneal surface. They achieve better fits to videokeratoscopic height data and could be particularly suited to the analysis of multiple corneal measurements, where there can be slight variations in the position of the cornea from one map acquisition to the next.
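To illustrate the object-centered approach, the sketch below fits height samples over a spherical cap with a truncated real spherical-harmonic basis by linear least squares and reports the RMS fit error. The degree limit, the real-harmonic construction from scipy.special.sph_harm and the synthetic cap-like surface are illustrative assumptions, not the study's processing pipeline.

```python
# Hypothetical sketch: least-squares fit of surface heights with real spherical harmonics.
import numpy as np
from scipy.special import sph_harm

def real_sph_harm(m, l, theta, phi):
    """Real-valued spherical harmonic of degree l and order m."""
    Y = sph_harm(abs(m), l, theta, phi)   # scipy convention: theta = azimuth, phi = polar angle
    if m < 0:
        return np.sqrt(2.0) * Y.imag
    if m > 0:
        return np.sqrt(2.0) * Y.real
    return Y.real

def fit_spherical_harmonics(theta, phi, heights, max_degree=6):
    """Return least-squares coefficients and RMS error of a degree-limited fit."""
    basis = [real_sph_harm(m, l, theta, phi)
             for l in range(max_degree + 1) for m in range(-l, l + 1)]
    design = np.column_stack(basis)
    coeffs, *_ = np.linalg.lstsq(design, heights, rcond=None)
    rms = np.sqrt(np.mean((design @ coeffs - heights) ** 2))
    return coeffs, rms

# Synthetic cap-like surface sampled over a small polar angle (illustrative only).
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 2000)   # azimuthal angle
phi = rng.uniform(0.0, 0.35, 2000)            # polar angle (small cap)
heights = 7.8 * np.cos(phi) + 0.05 * np.cos(2.0 * theta) * np.sin(phi) ** 2
coeffs, rms = fit_spherical_harmonics(theta, phi, heights, max_degree=6)
print(f"{coeffs.size} coefficients fitted, RMS error: {rms:.2e}")
```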

Relevance: 20.00%

Abstract:

Adiabatic compression testing of components in gaseous oxygen is a test method that is utilized worldwide and is commonly required to qualify a component for ignition tolerance under its intended service. This testing is required by many industry standards organizations and government agencies. This paper traces the background of adiabatic compression testing in the oxygen community and discusses the thermodynamic and fluid dynamic processes that occur during rapid pressure surges. This paper is the first of several papers by the authors on the subject of adiabatic compression testing and is presented as a non-comprehensive background and introduction.
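For orientation only, the idealised thermodynamics behind such pressure surges can be illustrated with the textbook relation for reversible adiabatic compression of an ideal gas, T2 = T1 (P2/P1)^((γ−1)/γ); the pressures, the initial temperature and the use of a constant γ in the snippet below are illustrative simplifications, not the paper's analysis of real, non-ideal surges.

```python
# Hypothetical illustration: idealised adiabatic compression temperature of oxygen.
GAMMA_O2 = 1.4  # approximate ratio of specific heats for diatomic oxygen

def adiabatic_final_temperature(T1_K, P1, P2, gamma=GAMMA_O2):
    """Final temperature for reversible adiabatic compression of an ideal gas."""
    return T1_K * (P2 / P1) ** ((gamma - 1.0) / gamma)

# Example: compressing oxygen from 0.1 MPa at 20 C to 20 MPa.
T2 = adiabatic_final_temperature(T1_K=293.15, P1=0.1e6, P2=20.0e6)
print(f"idealised final temperature: {T2:.0f} K ({T2 - 273.15:.0f} C)")
```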

Relevance: 20.00%

Abstract:

In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work includes a literature review of current models followed by a series of five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis concludes with a discussion of what has been achieved during the PhD candidature, the potential applications for this research and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to oscillating drag forces (as would occur in a standing sound wave) and, finally, on lattice surfaces in the presence of high heat gradients. We describe in this thesis a number of new models for multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle together may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations, and these processes have been analysed using a number of mathematical methods. The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we present compelling evidence in this thesis supporting the currently proposed mechanism and show that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size. This frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple-time-scale approach to calculate the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields.
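As a small, assumed illustration of the time scale involved, the snippet below evaluates the Stokes momentum relaxation time of a spherical particle, τ = 2ρ_p a²/(9μ), the quantity the critical manipulation frequency is reported above to be inversely proportional to; the particle radius, density and fluid viscosity are example values, not results from the thesis.

```python
# Hypothetical illustration: Stokes relaxation time of a small particle in air.
import math

def stokes_time(radius_m, particle_density, fluid_viscosity):
    """Momentum relaxation time tau = m / (6*pi*mu*a) = 2*rho_p*a**2 / (9*mu)."""
    return 2.0 * particle_density * radius_m ** 2 / (9.0 * fluid_viscosity)

# Example: a 1-micron-diameter carbonaceous particle (~1800 kg/m^3) in air at room temperature.
tau = stokes_time(radius_m=0.5e-6, particle_density=1800.0, fluid_viscosity=1.8e-5)
print(f"Stokes time: {tau:.2e} s, frequency scale 1/(2*pi*tau): {1.0 / (2.0 * math.pi * tau):.2e} Hz")
```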
Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments which demonstrate the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface. Typically, theoretical simulations of the effect can be rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for the simulation of particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical method for the calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is much better than was previously achievable.
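The two-step structure of that routine can be sketched as follows: an effective, position-dependent diffusion coefficient is evaluated (here an assumed Arrhenius-like form over an imposed temperature profile, standing in for the Fokker-Planck-derived constant), and the resulting diffusion equation is advanced with an explicit finite-volume scheme in one dimension. The grid, temperature profile and time step below are illustrative choices, not the thesis implementation.

```python
# Hypothetical sketch: finite-volume solution of a diffusion equation with a
# position-dependent effective diffusion coefficient.
import numpy as np

def effective_diffusion(T, D0=1.0, Ea_over_k=300.0):
    """Assumed Arrhenius-style effective diffusion coefficient."""
    return D0 * np.exp(-Ea_over_k / T)

# Cell-centred 1D grid with an imposed temperature gradient (illustrative only).
n_cells, L = 200, 1.0
dx = L / n_cells
x = (np.arange(n_cells) + 0.5) * dx
T = 300.0 + 200.0 * x / L                 # hotter towards the right
D = effective_diffusion(T)

# Initial probability density concentrated in the middle of the domain.
p = np.exp(-((x - 0.5 * L) / 0.05) ** 2)
p /= p.sum() * dx

dt = 0.4 * dx ** 2 / D.max()              # explicit stability limit
D_face = 0.5 * (D[:-1] + D[1:])           # diffusion coefficient at interior cell faces
for _ in range(5000):
    flux = -D_face * np.diff(p) / dx      # Fickian flux across interior faces
    flux = np.concatenate(([0.0], flux, [0.0]))   # zero-flux (reflecting) boundaries
    p -= dt * np.diff(flux) / dx          # finite-volume update of each cell
print("probability mass conserved:", bool(np.isclose(p.sum() * dx, 1.0)))
```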