196 results for Magic squares


Relevance: 10.00%

Abstract:

Interactions of small molecules with biopolymers, e.g. bovine serum albumin (BSA), are important, and significant information is recorded in the UV–vis and fluorescence spectra of their reaction mixtures. Extracting this information by conventional means is difficult, principally because the spectra of the three analytes in the mixture overlap significantly. The interaction of berberine chloride (BC) with the BSA protein provides an interesting example of such a complex system. UV–vis and fluorescence spectra of BC and BSA mixtures were investigated in pH 7.4 Tris–HCl buffer at 37 °C. Two sample series were measured by each technique: (1) [BSA] was kept constant and [BC] was varied, and (2) [BC] was kept constant and [BSA] was varied. This produced four spectral data matrices, which were combined into one expanded spectral matrix and processed by the multivariate curve resolution–alternating least squares (MCR–ALS) method. The analysis produced: (1) the pure BC, BSA and BC–BSA complex spectra, extracted from the heavily overlapping measured responses; (2) the concentration profiles of BC, BSA and the BC–BSA complex, which are difficult to obtain by conventional means; and (3) estimates of the number of binding sites of BC.
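
A minimal sketch of an MCR-ALS decomposition of the kind described above: an augmented data matrix is factored into concentration profiles and pure spectra by alternating non-negative least-squares updates. The spectra, concentrations and initial estimates below are synthetic placeholders, not the authors' data or code.

```python
# Hedged sketch of multivariate curve resolution-alternating least squares (MCR-ALS).
import numpy as np

def mcr_als(D, S0, n_iter=50):
    """Resolve D (mixtures x wavelengths) into C (mixtures x species) and
    S (species x wavelengths) with non-negativity on both factors."""
    S = S0.copy()
    for _ in range(n_iter):
        # Least-squares update of concentrations for fixed spectra, then clip negatives
        C = np.clip(D @ np.linalg.pinv(S), 0.0, None)
        # Least-squares update of spectra for fixed concentrations, then clip negatives
        S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)
    return C, S

# Synthetic example: 3 species (cf. BC, BSA, BC-BSA complex), 200 wavelengths, 40 mixtures
rng = np.random.default_rng(0)
wavelengths = np.linspace(0, 1, 200)
S_true = np.stack([np.exp(-((wavelengths - c) / 0.08) ** 2) for c in (0.3, 0.5, 0.7)])
C_true = rng.random((40, 3))
D = C_true @ S_true + 0.01 * rng.standard_normal((40, 200))

C_hat, S_hat = mcr_als(D, S0=S_true + 0.1 * rng.random(S_true.shape))
print(C_hat.shape, S_hat.shape)  # (40, 3) (3, 200)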

Relevance: 10.00%

Abstract:

A novel voltammetric method for the simultaneous determination of the glucocorticoid residues prednisone, prednisolone, and dexamethasone was developed. All three compounds were reduced at a mercury electrode in a Britton-Robinson buffer (pH 3.78), and well-defined voltammetric waves were observed. However, the voltammograms of the three compounds overlapped severely and showed nonlinear character, making it difficult to analyze the compounds individually in their mixtures. In this work, two chemometrics methods, principal component regression (PCR) and partial least squares (PLS), were applied to resolve the overlapped voltammograms, and calibration models were established for the simultaneous determination of these compounds. Under the optimum experimental conditions, the limits of detection (LOD) were 5.6, 8.3, and 16.8 µg L−1 for prednisone, prednisolone, and dexamethasone, respectively. The proposed method was also applied to the determination of these glucocorticoid residues in rabbit plasma and human urine samples with satisfactory results.
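
A hedged sketch of a PLS calibration for heavily overlapped voltammograms, in the spirit of the approach above. The Gaussian-shaped waves, peak potentials, concentration ranges and noise level are illustrative assumptions, not the published experimental responses.

```python
# Hedged sketch: partial least squares (PLS) calibration on simulated overlapped voltammograms.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
potentials = np.linspace(-1.4, -0.8, 120)          # reduction potentials (V), illustrative
peaks = (-1.00, -1.05, -1.12)                      # overlapping waves for the 3 analytes

def voltammogram(conc):
    # Each analyte contributes a broad Gaussian-shaped wave; the responses overlap heavily.
    return sum(c * np.exp(-((potentials - p) / 0.05) ** 2) for c, p in zip(conc, peaks))

Y = rng.uniform(5, 100, size=(60, 3))              # concentrations (ug/L), illustrative
X = np.array([voltammogram(c) for c in Y]) + 0.02 * rng.standard_normal((60, len(potentials)))

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
print("R^2 on held-out mixtures:", round(pls.score(X_te, Y_te), 3))
```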

Relevance: 10.00%

Abstract:

A simple and sensitive spectrophotometric method for the simultaneous determination of the acesulfame-K, sodium cyclamate and saccharin sodium sweeteners in foodstuff samples has been developed. The method relies on the different kinetic rates of the analytes in their oxidative reaction with KMnO4, which produces the green manganate product in alkaline solution. As the kinetic rates of acesulfame-K, sodium cyclamate and saccharin sodium were similar and their kinetic data overlapped severely, chemometrics methods, such as partial least squares (PLS), principal component regression (PCR) and classical least squares (CLS), were applied to resolve the kinetic data. The results showed that the PLS prediction model performed somewhat better than the PCR and CLS models. The proposed method was then applied to the determination of the three sweeteners in foodstuff samples, and the results compared well with those obtained by the reference HPLC method.
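
A hedged sketch of the classical least squares (CLS) step named above, applied to simulated overlapped kinetic curves: pure-component profiles are calibrated from standards and then inverted for an unknown mixture. The rate constants, concentration ranges and noise level are assumptions for illustration only.

```python
# Hedged sketch of classical least squares (CLS) resolution of overlapped kinetic data.
import numpy as np

t = np.linspace(0, 60, 120)                        # reaction time (s), illustrative
k = np.array([0.10, 0.07, 0.05])                   # pseudo-first-order rates for 3 sweeteners (assumed)

# Pure-component kinetic profiles (absorbance growth of manganate per unit concentration)
K = 1.0 - np.exp(-np.outer(k, t))                  # shape (3, 120)

rng = np.random.default_rng(2)
C_cal = rng.uniform(1, 20, size=(15, 3))           # calibration concentrations (mg/L)
D_cal = C_cal @ K + 0.01 * rng.standard_normal((15, len(t)))

# CLS calibration: estimate the pure profiles from standards, then invert for unknowns
K_hat = np.linalg.lstsq(C_cal, D_cal, rcond=None)[0]
C_unknown_true = np.array([[5.0, 12.0, 8.0]])
D_unknown = C_unknown_true @ K + 0.01 * rng.standard_normal((1, len(t)))
C_pred = np.linalg.lstsq(K_hat.T, D_unknown.T, rcond=None)[0].T
print(np.round(C_pred, 2))                          # should be close to [5, 12, 8]
```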

Relevance: 10.00%

Abstract:

A fast and accurate procedure has been developed for the simultaneous determination of maltol and ethyl maltol, based on their reaction with iron(III) in the presence of o-phenanthroline in sulfuric acid medium. This reaction was the basis for an indirect kinetic spectrophotometric method, which followed the development of the pink ferroin product (λmax = 524 nm). The kinetic data were collected in the 370–900 nm range over 0–30 s. Under the optimized conditions, both maltol and ethyl maltol followed Beer’s law in the concentration range 4.0–76.0 mg L−1. The LOD values of 1.6 mg L−1 for maltol and 1.4 mg L−1 for ethyl maltol agreed well with those obtained by the alternative method, high-performance liquid chromatography with ultraviolet detection (HPLC-UV). Three chemometrics methods, principal component regression (PCR), partial least squares (PLS) and principal component analysis–radial basis function–artificial neural networks (PC–RBF–ANN), were used to resolve the measured data, in which the kinetic differences between the two analytes, as reflected by the development of the pink ferroin product, were small. All three performed satisfactorily for the synthetic verification samples and for the prediction of the analytes in several food products. The figures of merit for the analytes based on the multivariate models agreed well with those from the alternative HPLC-UV method on the same samples.
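
A hedged sketch of a principal component regression (PCR) calibration for two analytes with very similar kinetics, as in the maltol/ethyl maltol determination above. The kinetic curves, rate constants and concentration design are simulated stand-ins, not the published data.

```python
# Hedged sketch of PCR: PCA compression of kinetic curves followed by linear regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
t = np.linspace(0, 30, 60)                          # 0-30 s kinetic window
profiles = np.stack([1 - np.exp(-0.20 * t),         # maltol-like response (illustrative rate)
                     1 - np.exp(-0.17 * t)])        # ethyl-maltol-like response (slightly slower)

Y = rng.uniform(4.0, 76.0, size=(50, 2))            # concentrations within the stated Beer's-law range
X = Y @ profiles + 0.01 * rng.standard_normal((50, len(t)))

pcr = make_pipeline(PCA(n_components=4), LinearRegression()).fit(X, Y)
print("calibration R^2:", round(pcr.score(X, Y), 3))
```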

Relevance: 10.00%

Abstract:

A spectrophotometric method for the simultaneous determination of the important pharmaceutical pefloxacin and its structurally similar metabolite, norfloxacin, is described for the first time. The analysis is based on monitoring a kinetic spectrophotometric reaction of the two analytes with potassium permanganate as the oxidant. The measurement of the reaction followed the absorbance decrease of potassium permanganate at 526 nm and the accompanying increase of the product, potassium manganate, at 608 nm. Multivariate calibration was essential to overcome the severe spectral overlap and the similarity in reaction kinetics. Calibration curves for the individual analytes showed linear relationships over the concentration ranges 1.0–11.5 mg L−1 at 526 and 608 nm for pefloxacin, and 0.15–1.8 mg L−1 at 526 and 608 nm for norfloxacin. Various multivariate calibration models were applied at the two analytical wavelengths for the simultaneous prediction of the two analytes, including classical least squares (CLS), principal component regression (PCR), partial least squares (PLS), radial basis function-artificial neural network (RBF-ANN) and principal component-radial basis function-artificial neural network (PC-RBF-ANN). The PLS and PC-RBF-ANN calibrations with the data collected at 526 nm were the preferred methods (%RPET ≈ 5%), with LODs for pefloxacin and norfloxacin of 0.36 and 0.06 mg L−1, respectively. The proposed method was then applied successfully to the simultaneous determination of pefloxacin and norfloxacin in pharmaceutical and human plasma samples. The results compared well with those from the alternative analysis by HPLC.
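
A hedged sketch of one reading of the PC-RBF-ANN calibration named above: PCA compresses the measured responses and Gaussian radial basis units map the scores to concentrations. The number of centres, the kernel width heuristic, the ridge term and the simulated spectra are illustrative choices, not those of the published model.

```python
# Hedged sketch of a principal component-radial basis function network (PC-RBF-ANN) calibration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def fit_rbf(scores, Y, n_centres=10, ridge=1e-6):
    centres = KMeans(n_clusters=n_centres, n_init=10, random_state=0).fit(scores).cluster_centers_
    width = scores.std()                             # heuristic Gaussian width from the score spread
    d2 = ((scores[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * width ** 2))             # hidden-layer activations
    W = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centres), Phi.T @ Y)
    return centres, W, width

def predict_rbf(scores, centres, W, width):
    d2 = ((scores[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2)) @ W

# Simulated overlapping responses for two pefloxacin/norfloxacin-like analytes
rng = np.random.default_rng(4)
wl = np.linspace(0, 1, 150)
pure = np.stack([np.exp(-((wl - 0.45) / 0.12) ** 2), np.exp(-((wl - 0.55) / 0.12) ** 2)])
Y = rng.uniform(0.2, 10.0, size=(60, 2))
X = Y @ pure + 0.01 * rng.standard_normal((60, len(wl)))

pca = PCA(n_components=3).fit(X)
centres, W, width = fit_rbf(pca.transform(X), Y)
Y_hat = predict_rbf(pca.transform(X), centres, W, width)
rpe = 100 * np.sqrt(((Y_hat - Y) ** 2).sum() / (Y ** 2).sum())   # relative prediction error (%)
print(round(rpe, 2))
```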

Relevance: 10.00%

Abstract:

The interaction of quercetin, a bioflavonoid, with bovine serum albumin (BSA) was investigated under pseudo-physiological conditions by UV–vis spectrometry, spectrofluorimetry and cyclic voltammetry (CV). These studies indicated a cooperative interaction between the quercetin–BSA complex and warfarin, which produced a ternary complex, quercetin–BSA–warfarin. It was found that both quercetin and warfarin were located in site I. However, the spectra of the three components overlapped, and the chemometrics method multivariate curve resolution-alternating least squares (MCR-ALS) was applied to resolve them. The resolved spectra of quercetin–BSA and warfarin agreed well with their measured spectra and, importantly, the spectrum of the quercetin–BSA–warfarin complex was extracted. These results allowed the behaviour of the overlapping spectra to be rationalized. At lower concentrations ([warfarin] < 1 × 10⁻⁵ mol L⁻¹), most of the site marker reacted with quercetin–BSA, but free warfarin was present at higher concentrations. Interestingly, the ratio between quercetin–BSA and warfarin was found to be 1:2, suggesting a quercetin–BSA–(warfarin)₂ complex, and the estimated equilibrium constant was 1.4 × 10¹¹ M⁻². The results suggest that at low concentrations, warfarin binds at the high-affinity sites (HAS), while the low-affinity binding sites (LAS) are occupied at higher concentrations.
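
A hedged worked example of the 1:2 binding model implied by the reported stoichiometry and equilibrium constant (quercetin–BSA + 2 warfarin ⇌ quercetin–BSA–(warfarin)₂, K in M⁻²). The total concentrations below are assumed for illustration; only the value of K is taken from the abstract.

```python
# Hedged sketch: solve the 1:2 binding equilibrium for the fraction of complex formed.
from scipy.optimize import brentq

K = 1.4e11           # reported equilibrium constant, M^-2
QB0 = 2.0e-6         # total quercetin-BSA, M (assumed)
W0 = 8.0e-6          # total warfarin, M (assumed)

def residual(x):
    # x = equilibrium concentration of the ternary complex; K*(QB0-x)*(W0-2x)^2 = x at equilibrium
    return K * (QB0 - x) * (W0 - 2 * x) ** 2 - x

x = brentq(residual, 0.0, min(QB0, W0 / 2) * (1 - 1e-12))
print(f"bound fraction of quercetin-BSA: {x / QB0:.2f}")
```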

Relevance: 10.00%

Abstract:

Silylated layered double hydroxides (LDHs) were synthesized through a surfactant-free method involving in situ condensation of a silane with the surface hydroxyl groups of the LDH during its reconstruction in carbonate solution. X-ray diffraction (XRD) patterns showed that the silylation reaction occurred on the external surfaces of the LDH layers. The successful silylation was evidenced by ²⁹Si cross-polarization magic-angle spinning nuclear magnetic resonance (²⁹Si CP/MAS NMR) spectroscopy, attenuated total reflection Fourier transform infrared (ATR FTIR) spectroscopy, and infrared emission spectroscopy (IES). Ribbon-shaped crystallites with a “rodlike” aggregation were observed in transmission electron microscopy (TEM) images. The aggregation was explained by the T2 and T3 types of linkage between adjacent silane molecules, as indicated by the ²⁹Si NMR spectrum. In addition, the silylated products showed high thermal stability: the Si-related bands in the IES spectra were maintained even when the temperature was increased to 1000 °C.

Relevance: 10.00%

Abstract:

We developed orthogonal least-squares techniques for fitting crystalline lens shapes, and used the bootstrap method to determine the uncertainties associated with the estimated vertex radii of curvature and asphericities of five different models. Three existing models were investigated: one that uses two separate conics for the anterior and posterior surfaces, and two whole-lens models based on a modulated hyperbolic cosine function and on a generalized conic function. Two new models were proposed: one that uses two interdependent conics, and a polynomial-based whole-lens model. The models were used to describe the in vitro shape for a data set of twenty human lenses aged 7–82 years. The two-conic-surfaces model (7 mm zone diameter) and the interdependent-surfaces model had significantly lower merit functions than the other three models for this data set, indicating that they most likely describe human lens shape over a wide age range better than the other models (although the two-conic-surfaces model cannot describe the lens equatorial region). Considerable differences were found between some models regarding estimates of radii of curvature and surface asphericities. The hyperbolic cosine model and the new polynomial-based whole-lens model had the best precision in determining the radii of curvature and surface asphericities among the five models considered. Most models found a significant increase in anterior, but not posterior, radius of curvature with age. Most models found a wide scatter of asphericities, which were usually positive and not significantly related to age. As the interdependent-surfaces model had a lower merit function than the three whole-lens models, there is further scope to develop an accurate model of the complete shape of human lenses of all ages. The results highlight the continued difficulty in selecting an appropriate model for the crystalline lens shape.
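
A hedged sketch of the core idea above: fitting a conic surface (vertex radius R, asphericity Q) to profile points and bootstrapping the parameter uncertainties. For brevity it uses ordinary rather than orthogonal least squares, and synthetic data rather than the in vitro lens profiles; the true parameter values and zone size are illustrative.

```python
# Hedged sketch: conic fit of a lens-like profile plus a nonparametric bootstrap of (R, Q).
import numpy as np
from scipy.optimize import curve_fit

def conic_sag(y, R, Q):
    # Standard conicoid sag equation for vertex radius R and asphericity Q
    return y**2 / (R * (1 + np.sqrt(1 - (1 + Q) * y**2 / R**2)))

rng = np.random.default_rng(5)
y = np.linspace(-3.0, 3.0, 80)                     # semi-meridian within a ~7 mm zone (mm)
z = conic_sag(y, R=10.0, Q=-3.0) + 0.005 * rng.standard_normal(y.size)

popt, _ = curve_fit(conic_sag, y, z, p0=(9.0, -1.0))

# Bootstrap: refit on resampled points and report the spread of the estimates
boot = []
for _ in range(300):
    idx = rng.integers(0, y.size, y.size)
    p, _ = curve_fit(conic_sag, y[idx], z[idx], p0=popt)
    boot.append(p)
boot = np.array(boot)
print("R = %.2f +/- %.2f mm, Q = %.2f +/- %.2f" %
      (popt[0], boot[:, 0].std(), popt[1], boot[:, 1].std()))
```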

Relevance: 10.00%

Abstract:

The central aim of the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and the comparison of the model behaviour with experimental observations. A series of five papers is presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of a droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface.

An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed by numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment.

The motion of a droplet traversing this virtual leaf surface is affected by various forces, including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film.

In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from, or comes to a standstill on, the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
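
A hedged sketch of the radial basis function ingredient of the hybrid surface fit described above; the Clough-Tocher coupling, the triangulation and the real scanned leaf data are omitted. It interpolates synthetic scattered height samples of a smooth "leaf-like" surface and evaluates the fit on a grid, e.g. as input to a droplet-motion model.

```python
# Hedged sketch: RBF interpolation of scattered surface heights (not the thesis' hybrid CT-RBF code).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)
xy = rng.uniform(-1.0, 1.0, size=(400, 2))                     # scattered (x, y) sample sites
z = 0.3 * np.cos(2 * xy[:, 0]) * np.exp(-xy[:, 1] ** 2)        # stand-in for scanned heights

surface = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-6)

# Evaluate the fitted surface on a regular grid
gx, gy = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
heights = surface(grid).reshape(gx.shape)
print(heights.shape)   # (50, 50)
```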

Relevance: 10.00%

Abstract:

Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory, and heavy tails are pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory results in the dynamics of the solution. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the AMEX stock prices, which were established in Part I to possess short memory. By selecting the kernel of the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics is described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and electricity prices of Part I with the aim of confirming the long-range dependence established by MF-DFA.

The third part of the thesis applies the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify the discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and the usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
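
A hedged sketch of the MF-DFA memory-detection tool discussed above: the series is integrated into a profile, detrended segment by segment, and the q-th-order fluctuation function is examined for power-law scaling. The series here is simulated white noise, not the thesis' stock, exchange-rate or electricity-price data, and the scale range is an arbitrary choice.

```python
# Hedged sketch of multifractal detrended fluctuation analysis (MF-DFA) for one value of q.
import numpy as np

def mfdfa(x, scales, q=2, order=1):
    """Return the fluctuation function F_q(s) for each scale s."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # Detrend each segment with a polynomial of the given order
        resid = np.array([seg - np.polyval(np.polyfit(t, seg, order), t) for seg in segs])
        f2 = (resid ** 2).mean(axis=1)                      # variance per segment
        F.append((f2 ** (q / 2)).mean() ** (1 / q))         # q-th-order fluctuation
    return np.array(F)

rng = np.random.default_rng(7)
x = rng.standard_normal(4096)                               # stand-in for a return series
scales = np.unique(np.logspace(1.2, 3.0, 15).astype(int))
Fq = mfdfa(x, scales, q=2)

# Generalised Hurst exponent h(q) from the log-log slope; h(2) ~ 0.5 indicates no memory
h2 = np.polyfit(np.log(scales), np.log(Fq), 1)[0]
print(round(h2, 2))
```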

Relevance: 10.00%

Abstract:

High-speed videokeratoscopy is an emerging technique that enables study of corneal surface and tear-film dynamics. Unlike its static predecessor, this technique produces a very large amount of digital data, for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to fit corneal surface data parsimoniously, with a minimum number of coefficients. Since the Zernike polynomial functions traditionally used for modeling corneal surfaces may not adequately represent a given corneal surface in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error and the point spread function cross-correlation. The parameters of the approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
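
A hedged sketch of the rational-function idea above: a ratio of two Zernike expansions fitted to surface height data with a Levenberg-Marquardt least-squares solver. Only three radially symmetric Zernike terms are used per expansion, and the "measured" surface is synthetic, so this illustrates the concept rather than the published model or its chosen basis.

```python
# Hedged sketch: Zernike-based rational surface fitted by Levenberg-Marquardt least squares.
import numpy as np
from scipy.optimize import least_squares

def zernike_basis(rho):
    # Z_0^0, Z_2^0 (defocus) and Z_4^0 (spherical aberration): radially symmetric terms only
    return np.stack([np.ones_like(rho), 2 * rho**2 - 1, 6 * rho**4 - 6 * rho**2 + 1], axis=1)

def rational_surface(params, rho):
    a, b = params[:3], params[3:]
    Z = zernike_basis(rho)
    return (Z @ a) / (1.0 + Z @ b)

rng = np.random.default_rng(8)
rho = rng.uniform(0, 1, 500)                                  # normalised radial coordinate
true = rational_surface(np.array([1.0, 0.4, 0.05, 0.0, 0.2, 0.01]), rho)
heights = true + 1e-3 * rng.standard_normal(rho.size)         # stand-in for videokeratoscopic data

def residuals(p):
    return rational_surface(p, rho) - heights

fit = least_squares(residuals, x0=0.1 * np.ones(6), method="lm")
rms = np.sqrt(np.mean(fit.fun ** 2))
print("rms surface error:", round(rms, 5))
```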

Relevance: 10.00%

Abstract:

This paper turns Snow-White's magic mirror onto recent economics Nobel Prize winners, top economists and happiness researchers, and through the eyes of the 'man in the street' seeks to determine who the happiest academic is. The study not only provides a clear answer to this question but also unveils who is the ladies' man and who is the sweetheart of the aged. It also explores the extent to which information matters and whether individuals' self-reported happiness affects their perceptions about the happiness of these superstars in economics.

Relevance: 10.00%

Abstract:

This paper first presents an extended ambiguity resolution model that deals with an ill-posed problem and with constraints among the estimated parameters. In the extended model, a regularization criterion is used instead of traditional least squares in order to estimate the float ambiguities better. The existing models can be derived from this general model. Second, the paper examines the existing ambiguity searching methods from four aspects: exclusion of nuisance integer candidates based on the available integer constraints; integer rounding; integer bootstrapping; and integer least squares estimation. Finally, the paper systematically addresses the similarities and differences between the generalized TCAR and decorrelation methods from both theoretical and practical perspectives.
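
A hedged sketch of two of the steps discussed above: a Tikhonov-regularised float solution for an ill-conditioned design, followed by integer bootstrapping of the ambiguities. The design matrix, noise level and covariance are synthetic, and decorrelation (as used in LAMBDA/TCAR-style implementations) is omitted for brevity; this is not the paper's extended model.

```python
# Hedged sketch: regularised float estimation plus integer bootstrapping.
import numpy as np

def regularised_float_solution(A, y, lam=1e-2):
    # (A^T A + lam*I)^-1 A^T y : stabilises the float estimate when A^T A is near-singular
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

def integer_bootstrap(a_float, Q):
    """Round each ambiguity in turn, conditioning the remaining ones on the fixed value."""
    a = a_float.astype(float).copy()
    Qc = Q.astype(float).copy()
    z = np.zeros(a.size)
    for i in range(a.size):
        z[i] = np.round(a[i])
        if i + 1 < a.size:
            corr = Qc[i + 1:, i] / Qc[i, i]
            a[i + 1:] -= corr * (a[i] - z[i])                     # conditional float update
            Qc[i + 1:, i + 1:] -= np.outer(corr, Qc[i, i + 1:])   # conditional covariance update
    return z

rng = np.random.default_rng(9)
A = rng.standard_normal((8, 4))
a_true = np.array([3.0, -2.0, 7.0, 1.0])                      # "true" integer ambiguities
y = A @ a_true + 0.05 * rng.standard_normal(8)

a_hat = regularised_float_solution(A, y)
Q_hat = 0.05**2 * np.linalg.inv(A.T @ A + 1e-2 * np.eye(4))   # crude covariance of the float solution
print(integer_bootstrap(a_hat, Q_hat))                        # typically recovers [ 3. -2.  7.  1.]
```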

Relevance: 10.00%

Abstract:

Introduction: The core business of public health is to protect and promote health in the population. Public health planning is the means to maximise these aspirations. Health professionals develop plans to address contemporary health priorities as the evidence about changing patterns of mortality and morbidity is presented. Officials are also alert to international trends in patterns of disease that have the potential to affect the health of Australians. Integrated planning and preparation is currently underway, involving all emergency health services, hospitals and population health units, to ensure Australia's quick and efficient response to any major infectious disease outbreak, such as avian influenza (bird flu).

Public health planning for the Sydney Olympic and Paralympic Games in 2000 took almost three years. ‘Its major components included increased surveillance of communicable disease; presentations to sentinel emergency departments; medical encounters at Olympic venues; cruise ship surveillance; environmental and food safety inspections; bioterrorism surveillance and global epidemic intelligence’ (Jorm et al 2003, 102). In other words, the public health plan was developed to ensure food safety, hospital capacity, safe crowd control, protection against infectious diseases, and an integrated emergency and disaster plan. We have national and state plans for vaccinating children against infectious diseases in childhood; plans to promote dental health for children in schools; and screening programs for cervical, breast and prostate cancer. An effective public health response to a change in the distribution of morbidity and mortality requires planning.

All levels of government plan for the public’s health. Local governments (councils) ensure healthy local environments to protect the public’s health. They plan parks for recreation, construct traffic-calming devices near schools to prevent childhood accidents, build shade structures and walking paths, and even embed draughts/chess squares in tables for people to sit and play. Environmental Health officers ensure food safety in restaurants and measure water quality. These public health measures attempt to promote the quality of life of residents. Australian and state governments produce plans that protect and promote health through various policy and program initiatives and innovations.

To be effective, program plans need to be evaluated. However, building an integrated evaluation plan into a program plan is often forgotten, as planning and evaluation are seen as two distinct entities. Consequently, it is virtually impossible to measure, with any confidence, the extent to which a program has achieved its goals and objectives. This chapter introduces you to the concepts of public health program planning and evaluation. Case studies and reflection questions are presented to illustrate key points. As various authors use different terminology to describe the same concepts/actions of planning and evaluation, the glossary at the back of this book will help you to clarify the terms used in this chapter.