585 results for function estimation
Abstract:
The aim of this study was to characterise and quantify the fungal fragment propagules derived and released from several fungal species (Penicillium, Aspergillus niger and Cladosporium cladosporioides) using different generation methods and different air velocities over the colonies. Real-time fungal spore fragmentation was investigated using an Ultraviolet Aerodynamic Particle Sizer (UVAPS) and a Scanning Mobility Particle Sizer (SMPS). The study showed that there were significant differences (p < 0.01) in the fragmentation percentage between different air velocities for the three generation methods, namely the direct, the fan and the fungal spore source strength tester (FSSST) methods. The percentage of fragmentation also proved to be dependent on fungal species. The study found that there was no fragmentation for any of the fungal species at an air velocity ≤ 0.4 m/s for any method of generation. Fluorescent signals, as well as mathematical determination, also showed that the fungal fragments were derived from spores. Correlation analysis showed that the number of released fragments measured by the UVAPS under controlled conditions can be predicted on the basis of the number of spores for Penicillium and Aspergillus niger, but not for Cladosporium cladosporioides. The fluorescence percentage of fragment samples was found to be significantly different from that of non-fragment samples (p < 0.0001), and the fragment sample fluorescence was always less than that of the non-fragment samples. Size distribution and concentration of fungal fragment particles were investigated qualitatively and quantitatively, by both UVAPS and SMPS, and it was found that the UVAPS was more sensitive than the SMPS for measuring small sample concentrations, and that the results obtained from the UVAPS and SMPS were not identical for the same samples.
Abstract:
PURPOSE: To introduce techniques for deriving a map that relates visual field locations to optic nerve head (ONH) sectors and to use the techniques to derive a map relating Medmont perimetric data to data from the Heidelberg Retinal Tomograph. METHODS: Spearman correlation coefficients were calculated relating each visual field location (Medmont M700) to rim area and volume measures for 10 degrees ONH sectors (HRT III software) for 57 participants: 34 with glaucoma, 18 with suspected glaucoma, and 5 with ocular hypertension. Correlations were constrained to be anatomically plausible with a computational model of the axon growth of retinal ganglion cells (Algorithm GROW). GROW generated a map relating field locations to sectors of the ONH. The sector with the maximum statistically significant (P < 0.05) correlation coefficient within 40 degrees of the angle predicted by GROW for each location was computed. Before correlation, both functional and structural data were normalized by either normative data or the fellow eye in each participant. RESULTS: The model of axon growth produced a 24-2 map that is qualitatively similar to existing maps derived from empirical data. When GROW was used in conjunction with normative data, 31% of field locations exhibited a statistically significant relationship. This significance increased to 67% (z-test, z = 4.84; P < 0.001) when both field and rim area data were normalized with the fellow eye. CONCLUSIONS: A computational model of axon growth and normalizing data by the fellow eye can assist in constructing an anatomically plausible map connecting visual field data and sectoral ONH data.
Abstract:
Market-based environmental regulation is becoming increasingly common within international and national frameworks. Environmental offset and trading regimes are part of the market-based instrument revolution. This paper proposes that environmental market mechanisms could be used to introduce an ethic of landholder responsibility. In order for market-based regimes to attract sufficient levels of stakeholder engagement, participants within such schemes require an incentive to participate and, furthermore, need to feel a sense of security about investing in such processes. A sense of security is often associated with property-based interests. This paper explores the property-related issues connected with environmental offset and trading scheme initiatives. Relevant property-related considerations include land tenure, the choice between public and private management of land, the characteristics and powers associated with property interests, theories defining property, and the recognition of legal proprietary interests. The Biodiversity Banking Scheme in New South Wales is then examined as a case study, followed by a critique of the role of environmental markets.
Abstract:
This paper proposes a new prognosis model, based on health state estimation of machines, for accurate assessment of remnant life. To evaluate the health stages of machines, a Support Vector Machine (SVM) classifier was employed to obtain the probability of each health state. Two case studies involving bearing failures were used to validate the proposed model: simulated bearing failure data and experimental data from an accelerated bearing test rig were used to train and test the model. The results are encouraging and show that the proposed prognostic model has the potential to be used as an estimation tool for machine remnant life prediction.
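One way to read the abstract is that the classifier's per-state probabilities are combined into a single remnant-life figure. A minimal sketch of that combination step, assuming a probability-weighted expectation over discrete health states (the state names and per-state remaining lives below are illustrative assumptions, not the paper's data):

```python
# Hedged sketch: turning health-state probabilities (e.g. from an SVM
# classifier) into a remnant-life estimate via a weighted expectation.
# State names and remaining-life values are invented for illustration.

def remnant_life(state_probs, state_lives):
    """Probability-weighted expectation of remaining life (hours)."""
    assert abs(sum(state_probs.values()) - 1.0) < 1e-9
    return sum(p * state_lives[s] for s, p in state_probs.items())

# Example: hypothetical classifier output for one bearing-vibration snapshot.
probs = {"healthy": 0.10, "degraded": 0.70, "failure-imminent": 0.20}
lives = {"healthy": 500.0, "degraded": 120.0, "failure-imminent": 10.0}

print(remnant_life(probs, lives))  # 0.1*500 + 0.7*120 + 0.2*10 = 136.0
```

The expectation collapses as soon as one state dominates, which is why a well-calibrated classifier matters more here than raw classification accuracy.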
Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to incrementally build a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is its panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot's pose and a sensor model for updating the pose. In the motion model, the error in the estimates of object positions accumulates, due mainly to wheel slippage. Accurately quantifying the uncertainty of object positions is therefore a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of a landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimate do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are:
1. A bearing-only SLAM method that does not require odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors.
2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated in a moving frame attached to the robot into the global frame attached to the static landmarks in the environment.
3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to a PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can easily be adopted in a probabilistic framework, such as a particle filtering system.
The main advantages of the proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys, when the terrain is essentially 2D.
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three different cases are considered: i) case-Det, with only detector data; ii) case-DetSig, with detector data and signal controller data; and iii) case-DetSigSFR, with detector data, signal controller data and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without them. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases in the detection interval and its performance is uncertain if the detection interval is an integral multiple of the signal cycle.
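The core idea behind cumulative-plot travel-time estimation can be sketched briefly: under a first-in-first-out assumption, the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves. The detector timestamps below are invented for illustration, not taken from the paper:

```python
# Minimal sketch of travel-time estimation from cumulative plots.
# Under FIFO, the n-th vehicle counted upstream is the n-th counted
# downstream, so its travel time is the horizontal gap between the curves.

def crossing_time(times, n):
    """Time at which the cumulative count first reaches n vehicles."""
    return times[n - 1]

upstream = [0, 2, 4, 6, 8, 10]         # detection times at upstream detector (s)
downstream = [30, 33, 35, 38, 40, 43]  # detection times at downstream detector (s)

travel_times = [crossing_time(downstream, n) - crossing_time(upstream, n)
                for n in range(1, len(upstream) + 1)]
print(travel_times)  # [30, 31, 31, 32, 32, 33]
```

Aggregating detections into intervals (as the paper's detection-interval analysis does) coarsens these curves, which is why interval length matters when signal timings are unavailable.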
Abstract:
We propose an efficient, low-complexity scheme for estimating and compensating clipping noise in OFDMA systems. Conventional clipping-noise estimation schemes, which need all demodulated data symbols, may become infeasible in OFDMA systems, where a specific user may know only his own modulation scheme. The proposed scheme first uses the equalized output to identify a limited number of candidate clips, and then exploits the information on known subcarriers to reconstruct the clipped signal. Simulation results show that the proposed scheme can significantly improve system performance.
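For readers unfamiliar with the distortion being estimated, a small sketch of amplitude clipping may help. This is not the paper's estimation scheme, only an illustration of how clipping produces a sparse noise term concentrated at the clipped samples; the threshold and sample values are arbitrary:

```python
# Illustrative sketch of amplitude clipping of a complex baseband signal.
# Clipping limits the magnitude while preserving phase; the "clipping noise"
# is the difference between the clipped and original signals, and it is
# nonzero only at the (usually few) clipped samples.

def clip(x, a_max):
    """Clip a complex sample to magnitude a_max, preserving phase."""
    return x if abs(x) <= a_max else x / abs(x) * a_max

samples = [0.5 + 0.5j, 2.0 + 0.0j, -0.3 + 1.9j, 0.1 - 0.2j]
clipped = [clip(x, 1.0) for x in samples]
noise = [c - x for c, x in zip(clipped, samples)]
clipped_indices = [i for i, n in enumerate(noise) if abs(n) > 1e-12]
print(clipped_indices)  # [1, 2]
```

The sparsity of `noise` is what makes it plausible to identify a limited number of candidate clips from the equalized output, as the abstract describes.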
Abstract:
Traffic congestion is an increasing problem with high costs in financial, social and personal terms. These costs include psychological and physiological stress, aggression and fatigue caused by lengthy delays, and an increased likelihood of road crashes. Reliable and accurate traffic information is essential for the development of traffic control and management strategies. Traffic information is mostly gathered from in-road vehicle detectors such as induction loops. The Traffic Message Channel (TMC) service is a popular service that wirelessly sends traffic information to drivers. Traffic probes have been used in many cities to increase traffic information accuracy. A simulation to estimate the number of probe vehicles required to increase the accuracy of traffic information in Brisbane is proposed. A meso-level traffic simulator has been developed to facilitate the identification of the optimal number of probe vehicles required to achieve an acceptable level of traffic reporting accuracy. Our approach to determining the optimal number of probe vehicles required to meet quality-of-service requirements is to simulate runs with varying numbers of traffic probes. The simulated traffic represents Brisbane's typical morning traffic. The road maps used in the simulation are Brisbane's TMC maps, complete with speed limits and traffic lights. Experimental results show that the optimal number of probe vehicles required to provide a useful supplement to TMC (induction loop) data lies between 0.5% and 2.5% of vehicles on the road. With fewer than 0.25% probes, little additional information is provided, while above 5%, adding more probes has only a negligible effect on accuracy. Our findings are consistent with ongoing research on traffic probes, and show the effectiveness of using probe vehicles to supplement induction loops for accurate and timely traffic information.
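The penetration-rate question can be made concrete with a toy Monte Carlo experiment: treat probes as a random sample of all vehicles and see how the error of the probe-based mean-speed estimate shrinks as the probe fraction grows. Everything below (speed distribution, fractions, trial count) is an invented illustration, not the paper's simulator:

```python
# Toy sketch of the probe-penetration trade-off: how well does a random
# fraction of "probe" vehicles estimate the true mean link speed?
import random

random.seed(42)
speeds = [random.gauss(40.0, 8.0) for _ in range(10_000)]  # all vehicles (km/h)
true_mean = sum(speeds) / len(speeds)

def probe_error(fraction, trials=200):
    """Mean absolute error of the probe-based mean-speed estimate."""
    n = max(1, int(fraction * len(speeds)))
    err = 0.0
    for _ in range(trials):
        sample = random.sample(speeds, n)
        err += abs(sum(sample) / n - true_mean)
    return err / trials

for f in (0.0025, 0.005, 0.025, 0.05):
    print(f"{f:.2%} probes -> mean abs error {probe_error(f):.2f} km/h")
```

The error falls roughly with the square root of the sample size, which is consistent with the abstract's finding of diminishing returns above a few percent penetration.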
Abstract:
There has been little systematic examination of the potential for growth, as well as distress, among adult survivors of childhood sexual abuse. The present study explored posttraumatic growth and its relationship with negative posttrauma outcomes within the specific population of survivors of childhood sexual abuse (N = 40). Results showed that 95% of the participants experienced clinically significant posttraumatic stress disorder symptomatology related to their childhood sexual abuse. In conjunction with these high levels of negative symptoms, the population evidenced posttraumatic growth levels that were comparable to those of other trauma samples. This research has clinical relevance in terms of adding to the knowledge base on sexual abuse and the usefulness of this knowledge in therapeutic interventions and relationships.
Molecular architecture of the human sinus node: insights into the function of the cardiac pacemaker.
Abstract:
BACKGROUND: Although we know much about the molecular makeup of the sinus node (SN) in small mammals, little is known about it in humans. The aims of the present study were to investigate the expression of ion channels in the human SN and to use the data to predict electrical activity. METHODS AND RESULTS: Quantitative polymerase chain reaction, in situ hybridization, and immunofluorescence were used to analyze 6 human tissue samples. Messenger RNA (mRNA) for 120 ion channels (and some related proteins) was measured in the SN, a novel paranodal area, and the right atrium (RA). The results showed, for example, that in the SN compared with the RA, there was a lower expression of Na(v)1.5, K(v)4.3, K(v)1.5, ERG, K(ir)2.1, K(ir)6.2, RyR2, SERCA2a, Cx40, and Cx43 mRNAs but a higher expression of Ca(v)1.3, Ca(v)3.1, HCN1, and HCN4 mRNAs. The expression pattern of many ion channels in the paranodal area was intermediate between that of the SN and RA; however, compared with the SN and RA, the paranodal area showed greater expression of K(v)4.2, K(ir)6.1, TASK1, SK2, and MiRP2. Expression of ion channel proteins was in agreement with expression of the corresponding mRNAs. The levels of mRNA in the SN, as a percentage of those in the RA, were used to estimate conductances of key ionic currents as a percentage of those in a mathematical model of human atrial action potential. The resulting SN model successfully produced pacemaking. CONCLUSIONS: Ion channels show a complex and heterogeneous pattern of expression in the SN, paranodal area, and RA in humans, and the expression pattern is appropriate to explain pacemaking.
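The abstract's modelling step can be paraphrased simply: sinus-node conductances are estimated by scaling the conductances of an atrial action-potential model by the measured SN/RA mRNA ratio for each current. A minimal sketch of that scaling, with purely illustrative numbers (neither the paper's mRNA ratios nor a specific atrial model's parameters):

```python
# Hedged sketch of the conductance-scaling idea: g_SN = (mRNA_SN / mRNA_RA) * g_RA
# for each ionic current. All numeric values below are invented for illustration.

ra_conductance = {"I_Na": 7.8, "I_to": 0.1652, "I_Kr": 0.029}  # atrial model, nS/pF
mrna_sn_vs_ra = {"I_Na": 0.20, "I_to": 0.60, "I_Kr": 0.80}     # SN mRNA as fraction of RA

sn_conductance = {cur: g * mrna_sn_vs_ra[cur] for cur, g in ra_conductance.items()}
for cur, g in sn_conductance.items():
    print(f"{cur}: {g:.4f} nS/pF")
```

Lowering the fast sodium conductance while leaving pacemaker currents relatively stronger is, qualitatively, how such a rescaled atrial model can begin to produce pacemaking.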
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), and long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
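The R/S statistic used throughout Parts I and IV can be sketched in a few lines: for a single window, R/S is the range of the cumulative mean-adjusted series divided by its standard deviation; the Hurst exponent is then the slope of log(R/S) against log(window size) over many window sizes. A minimal single-window sketch, with an arbitrary toy series:

```python
# Minimal pure-Python sketch of the rescaled-range (R/S) statistic.
# For one window: R = range of the cumulative mean-adjusted series,
# S = standard deviation of the window; memory detection fits log(R/S)
# against log(window size) across many windows.
import math

def rescaled_range(x):
    n = len(x)
    mean = sum(x) / n
    cum, cums, sq = 0.0, [], 0.0
    for v in x:
        cum += v - mean        # cumulative mean-adjusted series
        cums.append(cum)
        sq += (v - mean) ** 2
    s = math.sqrt(sq / n)      # window standard deviation
    return (max(cums) - min(cums)) / s

series = [1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5, 5.0]
print(round(rescaled_range(series), 3))
```

A slope near 0.5 indicates short memory; slopes well above 0.5 indicate the long-range dependence found here for the exchange-rate and electricity-price series.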