924 results for Classical correlation
Abstract:
Texture analysis and textural cues have been applied to image classification, segmentation and pattern recognition. Dominant texture descriptors include directionality, coarseness, line-likeness, etc. In this dissertation a class of textures known as particulate textures is defined, which are predominantly coarse or blob-like. The set of features that characterise particulate textures is different from those that characterise classical textures. These features are micro-texture, macro-texture, size, shape and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to the more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry and road surface texture analysis. A new framework for computation of particulate shape is proposed. A granulometric approach for particle size estimation based on edge detection is developed, which can be adapted to the gray level of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, thus making it possible to apply monocular imaging techniques to road surface texture analysis. Results from the application of the developed algorithm to road surface macrotexture are compared with results based on Fourier spectra, the autocorrelation function and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions such as illumination and camera angle on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination (R²) exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
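As an illustration of the final correlation step only, the sketch below uses a hypothetical edge-based coarseness proxy (not the thesis algorithm) and synthetic data standing in for road images and laser SMTD readings, to show how an R² value between an image-derived measure and profilometer output can be computed.

```python
import numpy as np

def edge_based_coarseness(image, threshold=0.2):
    """Hypothetical coarseness proxy: fraction of pixels whose gradient
    magnitude exceeds a threshold (not the thesis algorithm itself)."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-12
    return float((mag > threshold).mean())

def r_squared(x, y):
    """Coefficient of determination of a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)

# Illustrative use with synthetic data standing in for road images and SMTD:
rng = np.random.default_rng(0)
images = [rng.random((64, 64)) for _ in range(10)]
coarseness = np.array([edge_based_coarseness(im) for im in images])
smtd = coarseness * 2.0 + rng.normal(0, 0.01, 10)   # placeholder laser readings
print(r_squared(coarseness, smtd))
```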
Abstract:
Objective: Simvastatin has been shown to enhance osseointegration of pure titanium implants in osteoporotic rats. This study aimed to evaluate the relationship between the serum level of bone formation markers and the osseointegration of pure titanium implants in osteoporotic rats treated with simvastatin. Materials and methods: Fifty-four female Sprague Dawley rats, aged 3 months, were randomly divided into three groups: a sham-operated group (SHAM; n=18), an ovariectomized group (OVX; n=18), and an ovariectomized group treated with simvastatin (OVX+SIM; n=18). Fifty-six days after ovariectomy, screw-shaped titanium implants were inserted into the tibiae. Simvastatin was administered orally at 5 mg/kg per day after placement of the implant in the OVX+SIM group. The animals were sacrificed at either 28 or 84 days after implantation and undecalcified tissue sections were processed for histological analysis. Total alkaline phosphatase (ALP), bone-specific alkaline phosphatase (BALP) and bone Gla protein (BGP) were measured in all animal sera collected at the time of euthanasia and correlated with the histological assessment of osseointegration. Results: The level of ALP in the OVX group was higher than in the SHAM group at day 28, with no differences between the three groups at day 84. The level of BALP in the OVX+SIM group was significantly higher than in both the OVX and SHAM groups at days 28 and 84. Compared with day 28, the BALP level of all three groups showed a significant decrease at day 84. There were no significant differences in BGP levels between the three groups at day 28, but at day 84 the OVX+SIM group showed significantly higher levels than both the OVX and SHAM groups. There was a significant increase in BGP levels between days 28 and 84 in the OVX+SIM group. The serum bone marker levels correlated with the histological assessment, which showed reduced osseointegration in the OVX group compared to the SHAM group that was subsequently reversed in the OVX+SIM group.
Abstract:
Travel time is an important network performance measure and it quantifies congestion in a manner easily understood by all transport users. In urban networks, travel time estimation is challenging for a number of reasons, such as fluctuations in traffic flow due to traffic signals and significant flow to/from mid-link sinks/sources. The classical analytical procedure utilises cumulative plots at upstream and downstream locations to estimate travel time between the two locations. In this paper, we discuss the issues and challenges with the classical analytical procedure, such as its vulnerability to non-conservation of flow between the two locations. The complexity with respect to exit-movement-specific travel time is also discussed. Recently, we developed a methodology utilising the classical procedure to estimate average travel time and its statistics on urban links (Bhaskar, Chung et al. 2010), in which detector, signal and probe vehicle data are fused. In this paper we extend the methodology to route travel time estimation and test its performance using simulation. The originality lies in defining cumulative plots for each exit turning movement utilising a historical database that is self-updated after each estimation. The performance is also compared with a method based solely on probe data (Probe-only). The performance of the proposed methodology was found to be insensitive to different route flows, with an average accuracy of more than 94% given one probe per estimation interval, an improvement of more than 5% in accuracy over the Probe-only method.
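To make the classical analytical procedure concrete, the minimal sketch below (assuming conservation of flow and first-in-first-out behaviour between the two detectors, the very assumptions the paper identifies as fragile) estimates average travel time as the mean horizontal gap between the upstream and downstream cumulative counts.

```python
import numpy as np

def average_travel_time(upstream_times, downstream_times):
    """Classical cumulative-plot sketch: under conservation of flow and FIFO,
    the n-th downstream crossing is matched to the n-th upstream crossing, and
    the mean of the horizontal gaps between the two cumulative curves gives the
    average travel time."""
    up = np.sort(np.asarray(upstream_times, dtype=float))
    down = np.sort(np.asarray(downstream_times, dtype=float))
    n = min(len(up), len(down))   # ignore vehicles not yet observed downstream
    return float(np.mean(down[:n] - up[:n]))

# Example: five vehicles crossing the upstream and downstream detectors (seconds).
print(average_travel_time([0, 10, 22, 30, 45], [35, 44, 58, 66, 80]))
```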
Abstract:
This paper outlines a study to determine the correlation between LA10(18hour) and other road traffic noise indicators. It is based on a database comprising 404 measurement locations and 947 individual days of valid noise measurements taken across numerous circumstances between November 2001 and November 2007. The paper first discusses the need for, and constraints on, the indicators and the problem of matching a suitable indicator to the various dynamic characteristics of road traffic noise. It then presents a statistical analysis of the road traffic noise monitoring data, correlating various indicators with the LA10(18hour) statistical indicator, and provides a comprehensive table of linear correlations. There is an extended analysis of relationships across the night-time period. The paper concludes with a discussion of the findings.
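A minimal sketch of the kind of linear correlation tabulated in the paper is shown below; the indicator names and noise levels are purely illustrative placeholders, not data from the study.

```python
import numpy as np

# Hypothetical daily noise levels in dB(A); the real study used 947 measured days.
la10_18h = np.array([68.2, 70.1, 65.4, 72.3, 69.0, 66.8])
other_indicators = {
    "LAeq(24hour)": np.array([66.0, 68.2, 63.1, 70.5, 67.1, 64.9]),
    "LA90(18hour)": np.array([55.3, 57.0, 52.8, 59.2, 56.1, 53.9]),
}

for name, values in other_indicators.items():
    r = np.corrcoef(la10_18h, values)[0, 1]              # Pearson correlation
    slope, intercept = np.polyfit(values, la10_18h, 1)   # linear model LA10 = a*x + b
    print(f"{name}: r = {r:.3f}, slope = {slope:.2f}, intercept = {intercept:.2f}")
```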
Abstract:
Seat pressure is known to be a major factor in vehicle seat comfort. In passenger vehicles, there is a lack of research into the seat comfort of rear-seat occupants. As accurate seat pressure measurement requires significant effort, simulation of seat pressure is evolving as a preferred method. However, analytic methods are based on complex finite element modeling and are therefore time consuming and involve high investment. Based on accurate anthropometric measurements of 64 male subjects and outboard rear seat pressure measurements in three different passenger vehicles, this study investigates whether a set of parameters derived from seat pressure mapping is sensitive enough to differentiate between different seats and whether the parameters correlate with anthropometry in linear models. In addition to the pressure map analysis, H-points were measured with a coordinate measurement system based on palpated body landmarks, and the range of H-point locations in the three seats is provided. It was found that, for the cushion, cushion contact area and cushion front area/force could be modeled by subject anthropometry, while only seatback contact area could be modeled from anthropometry for all three vehicles. Major differences were found between the vehicles for the other parameters.
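The sketch below illustrates the type of linear model referred to above: an ordinary least squares fit of a seat-pressure parameter on a few anthropometric predictors. The predictors, units and values are hypothetical and merely stand in for the study's measured variables.

```python
import numpy as np

# Hypothetical anthropometric predictors (stature in cm, body mass in kg, hip breadth in cm)
# and a cushion contact area response in cm^2; the study used 64 measured subjects.
X = np.array([
    [172, 70, 36],
    [180, 85, 38],
    [168, 62, 34],
    [190, 95, 40],
    [175, 78, 37],
], dtype=float)
contact_area = np.array([1150, 1320, 1060, 1450, 1240], dtype=float)

# Ordinary least squares with an intercept column, as in a simple linear model.
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, contact_area, rcond=None)
predicted = A @ coeffs
print("coefficients:", np.round(coeffs, 2))
print("predicted contact areas:", np.round(predicted, 1))
```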
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information in anticipation of future algorithmic and computational discoveries that could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
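For context, the sketch below shows only the idealised sifting step of BB84 (no channel noise, no eavesdropping, no error correction or privacy amplification); the basis information exchanged in this step is exactly the classical communication that must be authenticated, which is what the security model above addresses.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 32

# Idealised BB84 sifting: Alice picks random bits and bases, Bob picks random
# measurement bases. The basis announcements are classical messages and would
# need authentication in a real deployment.
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

# When Bob measures in Alice's basis he recovers her bit; otherwise his outcome
# is random and that position is discarded during sifting.
bob_bits = np.where(bob_bases == alice_bases, alice_bits, rng.integers(0, 2, n))
keep = alice_bases == bob_bases
sifted_key = alice_bits[keep]
assert np.array_equal(sifted_key, bob_bits[keep])
print("sifted key length:", len(sifted_key), "of", n)
```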
Abstract:
Ross River virus (RRV) has caused reported outbreaks of epidemic polyarthritis, a chronic debilitating disease associated with significant long-term morbidity, in Australia and the Pacific region since the 1920s. To address this public health concern, a formalin- and UV-inactivated whole-virus vaccine grown in animal-protein-free cell culture was developed and tested in preclinical studies to evaluate immunogenicity and efficacy in animal models. After active immunization, the vaccine dose-dependently induced antibodies and protected adult mice from viremia and interferon α/β receptor knock-out (IFN-α/βR(-/-)) mice from death and disease. In passive transfer studies, administration of human vaccinee sera followed by RRV challenge protected adult mice from viremia and young mice from development of arthritic signs similar to human RRV-induced disease. Based on the good correlation between antibody titers in human sera and protection of animals, a correlate of protection was defined. This is of particular importance for the evaluation of the vaccine because of the comparatively low annual incidence of RRV disease, which renders a classical efficacy trial impractical. Antibody-dependent enhancement of infection did not occur in mice, even at low to undetectable concentrations of vaccine-induced antibodies. Also, RRV vaccine-induced antibodies were partially cross-protective against infection with a related alphavirus, Chikungunya virus, and did not enhance infection. Based on these findings, the inactivated RRV vaccine is expected to be efficacious and to protect humans from RRV disease.
Abstract:
This study presents a multi-layer genetic algorithm (GA) approach using correlation-based methods to facilitate damage determination for through-truss bridge structures. To begin, the structure's damage-suspicious elements are divided into several groups. In the first GA layer, the damage is initially optimised for all groups using a correlation-based objective function. In the second layer, the groups are combined into larger groups and the optimisation restarts from the normalised point of the first-layer result. The identification process then repeats until reaching the final layer, where one group includes all structural elements and only minor optimisations are required to fine-tune the final result. Several damage scenarios on a complicated through-truss bridge example are used to demonstrate the proposed approach's effectiveness. Structural modal strain energy is employed as the variable vector in the correlation function for damage determination. Simulations and comparison with traditional single-layer optimisation show that the proposed approach is efficient and feasible for complicated truss bridge structures when measurement noise is taken into account.
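A minimal sketch of the correlation-based objective is given below: a candidate damage pattern is scored by the Pearson correlation between its theoretical modal strain energy vector and the measured one. The multi-layer grouping and GA machinery of the paper are not reproduced here, and the numbers are illustrative.

```python
import numpy as np

def correlation_objective(theoretical_mse, measured_mse):
    """Correlation-based objective sketch: Pearson correlation between the
    theoretical modal strain energy vector produced by a candidate damage
    pattern and the measured vector. A GA would maximise this value."""
    t = np.asarray(theoretical_mse, dtype=float)
    m = np.asarray(measured_mse, dtype=float)
    return float(np.corrcoef(t, m)[0, 1])

# Two candidate damage patterns scored against the same "measured" vector.
measured = np.array([0.12, 0.30, 0.05, 0.22, 0.08])
print(correlation_objective([0.11, 0.28, 0.06, 0.20, 0.09], measured))  # good candidate
print(correlation_objective([0.30, 0.05, 0.25, 0.02, 0.18], measured))  # poor candidate
```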
Abstract:
Forecasts of volatility and correlation are important inputs into many practical financial problems. Broadly speaking, there are two ways of generating forecasts of these variables. Firstly, time-series models apply a statistical weighting scheme to historical measurements of the variable of interest. The alternative methodology extracts forecasts from the market-traded value of option contracts. An efficient options market should be able to produce superior forecasts, as it utilises a larger information set comprising not only historical information but also the market equilibrium expectations of options market participants. While much research has been conducted into the relative merits of these approaches, this thesis extends the literature along several lines through three empirical studies. Firstly, it is demonstrated that there are statistically significant benefits to adjusting implied volatility for the volatility risk premium for the purposes of univariate volatility forecasting. Secondly, high-frequency option-implied measures are shown to lead to superior forecasts of the stochastic component of intraday volatility, and these in turn lead to superior forecasts of total intraday volatility. Finally, realised and option-implied measures of equicorrelation are shown to dominate measures based on daily returns.
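The first finding can be illustrated with a deliberately simple sketch (not the thesis model): correct the latest implied volatility by the average volatility risk premium, estimated as the mean gap between implied and subsequently realised volatility over a trailing window. All data below are synthetic placeholders.

```python
import numpy as np

def premium_adjusted_forecast(implied_vol, realised_vol, window=60):
    """Illustrative forecast: subtract from the latest implied volatility the
    average volatility risk premium observed over a trailing window, i.e. the
    mean gap between implied and subsequently realised volatility."""
    implied = np.asarray(implied_vol, dtype=float)
    realised = np.asarray(realised_vol, dtype=float)
    premium = np.mean(implied[-window:] - realised[-window:])
    return float(implied[-1] - premium)   # forecast of next-period realised volatility

# Synthetic annualised volatilities (%) standing in for market data.
rng = np.random.default_rng(1)
realised = 15 + rng.normal(0, 1, 250)
implied = realised + 2 + rng.normal(0, 0.5, 250)   # implied sits above realised on average
print(premium_adjusted_forecast(implied, realised))
```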
Abstract:
As part of vital infrastructure and transportation networks, bridge structures must function safely at all times. Bridges are designed to have a long life span, yet at any point in time some bridges are aged. The ageing of bridge structures, given the rapidly growing demand for heavy and fast inter-city passage and the continuous increase in freight transportation, requires diligence from bridge owners to ensure that the infrastructure remains healthy at reasonable cost. In recent decades, a new technique, structural health monitoring (SHM), has emerged to meet this challenge. In this new engineering discipline, structural modal identification and damage detection form a vital component. As witnessed by an increasing number of publications, changes in vibration characteristics are widely and deeply investigated to assess structural damage. Although a number of publications have addressed the feasibility of various methods through experimental verification, few have focused on steel truss bridges. Finding a feasible vibration-based damage indicator for steel truss bridges, and solving the difficulties in practical modal identification to support damage detection, motivated this research project. This research set out to derive an innovative method to assess structural damage in steel truss bridges. First, it proposed a new damage indicator that relies on optimising the correlation between theoretical and measured modal strain energy. The optimisation is powered by a newly proposed multi-layer genetic algorithm. In addition, a selection criterion for damage-sensitive modes was studied to achieve more efficient and accurate damage detection results. Second, in order to support the proposed damage indicator, the research studied the application of two state-of-the-art modal identification techniques while considering several practical difficulties: limited instrumentation, the influence of environmental noise, the difficulties in finite element model updating, and the data selection problem in output-only modal identification methods. Numerical (planar truss model) and experimental (laboratory through-truss bridge) verifications proved the effectiveness and feasibility of the proposed damage detection scheme. The modal strain energy-based indicator was found to be sensitive to damage in steel truss bridges with incomplete measurement, showing the indicator's potential in practical applications to steel truss bridges. Lastly, the achievements and limitations of this study, and lessons learnt from the modal analysis, are summarised.
Abstract:
The authors present a qualitative and quantitative comparison of various similarity measures that form the kernel of common area-based stereo-matching systems. They compare classical difference and correlation measures, as well as non-parametric measures based on the rank and census transforms, for a number of outdoor images. For robotic applications, important considerations include robustness to image defects such as intensity variation and noise, the number of false matches, and computational complexity. In the absence of ground truth data, the authors compare the matching techniques based on the percentage of matches that pass the left-right consistency test. They also evaluate the discriminatory power of several match validity measures reported in the literature for eliminating false matches and for estimating match confidence. For guidance applications, it is essential to have an estimate of confidence in the three-dimensional points generated by stereo vision. Finally, a new validity measure, the rank constraint, is introduced that is capable of resolving ambiguous matches for rank transform-based matching.
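As an illustration of the non-parametric measures compared in the paper, the sketch below implements a census transform, a per-pixel Hamming-distance match along one scanline, and a basic left-right consistency test. It is a bare-bones version, with no window aggregation or sub-pixel refinement, and the synthetic rows stand in for real outdoor imagery.

```python
import numpy as np

def census_transform(img, window=3):
    """Census transform: encode, for each pixel, which neighbours in a
    window x window patch are darker than the centre pixel."""
    r = window // 2
    codes = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            codes = (codes << np.uint64(1)) | (shifted < img).astype(np.uint64)
    return codes

def match_row(ref_codes, tgt_codes, max_disp=16, direction=1):
    """Best disparity per pixel along one row by minimising the Hamming distance
    between census codes. direction=+1 matches left->right (search at x - d),
    direction=-1 matches right->left (search at x + d)."""
    n = len(ref_codes)
    disp = np.zeros(n, dtype=int)
    for x in range(n):
        limit = x if direction > 0 else n - 1 - x
        costs = [bin(int(ref_codes[x]) ^ int(tgt_codes[x - direction * d])).count("1")
                 for d in range(min(max_disp, limit) + 1)]
        disp[x] = int(np.argmin(costs))
    return disp

def left_right_consistent(d_left, d_right, tol=1):
    """Left-right consistency test: a left-image match at x with disparity d is
    accepted only if the right-image match at x - d points back within tol."""
    x = np.arange(len(d_left))
    back = d_right[np.clip(x - d_left, 0, len(d_right) - 1)]
    return np.abs(d_left - back) <= tol

# Tiny synthetic example: the right row is the left row shifted by 2 pixels.
rng = np.random.default_rng(5)
left = rng.random((9, 64))
right = np.roll(left, -2, axis=1)
cl, cr = census_transform(left), census_transform(right)
dl = match_row(cl[4], cr[4], direction=1)
dr = match_row(cr[4], cl[4], direction=-1)
print("accepted matches:", int(left_right_consistent(dl, dr).sum()), "of", len(dl))
```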
Abstract:
Stronger investor interest in commodities may create closer integration with conventional asset markets. We estimate sudden and gradual changes in correlation between stock, bond and commodity futures returns driven by observable financial variables and time, using double smooth transition conditional correlation (DSTCC-GARCH) models. Most correlations begin the 1990s near zero, but closer integration emerges around the early 2000s and reaches peaks during the recent crisis. Diversification benefits to investors across equity, bond and commodity markets were significantly reduced. Increases in the VIX and in financial traders' short open interest raise futures return volatility for many commodities. A higher VIX also increases the correlation of commodity returns with equity returns for about half the pairs, indicating closer integration.
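The sketch below is far simpler than the DSTCC-GARCH models used in the paper: a plain rolling-window correlation on synthetic return series, shown only to illustrate the notion of time-varying correlation between an equity and a commodity series.

```python
import numpy as np

def rolling_correlation(x, y, window=60):
    """Simple rolling-window correlation, shown only to illustrate time-varying
    correlation; the paper itself uses DSTCC-GARCH models, not this estimator."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    out = np.full(len(x), np.nan)
    for t in range(window, len(x) + 1):
        out[t - 1] = np.corrcoef(x[t - window:t], y[t - window:t])[0, 1]
    return out

# Synthetic daily returns with a weak common component.
rng = np.random.default_rng(2)
equity = rng.normal(0, 1, 500)
commodity = 0.1 * equity + rng.normal(0, 1, 500)
print(np.nanmean(rolling_correlation(equity, commodity)))
```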
Abstract:
Two different morphologies of nanotextured molybdenum oxide were deposited by thermal evaporation. By measuring their field emission (FE) properties, an enhancement factor was extracted. Subsequently, these films were coated with a thin layer of Pt to form Schottky contacts. The current-voltage (I-V) characteristics showed low-magnitude reverse breakdown voltages, which we attributed to localized electric field enhancement. An enhancement factor was obtained from the I-V curves. We show that the enhancement factor extracted from the I-V curves is in good agreement with the enhancement factor extracted from the FE measurements.
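The abstract does not state the fitting procedure, so the sketch below is an assumption: it extracts a field-enhancement factor from the slope of a Fowler-Nordheim plot, ln(I/V²) versus 1/V, using a hypothetical work function, electrode gap and synthetic I-V data rather than the authors' measurements.

```python
import numpy as np

# Constant of the simplified Fowler-Nordheim model, B in V * eV^(-3/2) * m^(-1).
B = 6.83e9

def enhancement_factor(voltages, currents, work_function_eV, gap_m):
    """Assumed procedure: fit ln(I/V^2) against 1/V and recover the
    field-enhancement factor beta from slope = -B * phi^(3/2) * d / beta."""
    v = np.asarray(voltages, dtype=float)
    i = np.asarray(currents, dtype=float)
    slope, _ = np.polyfit(1.0 / v, np.log(i / v**2), 1)
    return -B * work_function_eV**1.5 * gap_m / slope

# Synthetic I-V data generated with beta = 1000, a 100 micron gap and phi = 5.3 eV.
beta_true, gap, phi = 1000.0, 100e-6, 5.3
v = np.linspace(500, 1500, 30)
i = 1e-14 * v**2 * np.exp(-B * phi**1.5 * gap / (beta_true * v))
print(enhancement_factor(v, i, phi, gap))   # recovers roughly 1000
```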
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources as, typically, only a few gene sequences can be stored simultaneously in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations. Therefore, minimizing the number of these operations will yield much faster run-times. This paper develops an approach for the faster and scalable computing of large correlation matrices through full use of the available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
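A minimal sketch of the block-wise idea is given below (not the paper's algorithm): rows of the data matrix are loaded in blocks sized to fit in memory, each pair of blocks is processed with the loaded data, and symmetry is exploited, so the number of reads grows with the number of block pairs rather than with the number of row pairs. The `load_block` callback is a hypothetical stand-in for reads from secondary storage.

```python
import numpy as np

def blocked_correlation(load_block, n_items, block_size):
    """Block-wise correlation matrix: load_block(start, stop) returns rows
    [start, stop) as a 2-D array; each pair of blocks is loaded once and the
    upper-triangle result is mirrored into the lower triangle."""
    corr = np.empty((n_items, n_items))
    starts = list(range(0, n_items, block_size))
    for i0 in starts:
        bi = load_block(i0, min(i0 + block_size, n_items))
        for j0 in starts:
            if j0 < i0:
                continue  # symmetry: the (j, i) block is filled below
            bj = bi if j0 == i0 else load_block(j0, min(j0 + block_size, n_items))
            block = np.corrcoef(bi, bj)[:len(bi), len(bi):]
            corr[i0:i0 + len(bi), j0:j0 + len(bj)] = block
            corr[j0:j0 + len(bj), i0:i0 + len(bi)] = block.T
    return corr

# Toy example: numeric vectors standing in for encoded gene sequences, with
# load_block simulating reads from disk.
data = np.random.default_rng(3).random((10, 100))
corr = blocked_correlation(lambda a, b: data[a:b], n_items=10, block_size=4)
print(np.allclose(corr, np.corrcoef(data)))   # True: matches the direct computation
```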