906 results for statistical techniques


Relevance: 20.00%

Abstract:

Purpose: The aim of this study was to examine the expression and prognostic relevance of thrombospondin-1 (TSP-1) in tumor biopsies taken from a consecutive series of liver resections performed at the University Hospitals of Leicester and the Royal Liverpool Hospital.
Experimental Design: Patients who had undergone a liver resection for colorectal liver metastases at our institutions between 1993 and 1999 inclusive were eligible. Inclusion criteria were curative intent, sufficient tumor biopsy, and patient follow-up data. One hundred eighty-two patients were considered in this study. Standard immunohistochemical techniques were used to study the expression of TSP-1 in 5-μm tumor sections from paraffin-embedded tissue blocks. TSP-1 expression was correlated with survival using the Kaplan-Meier method and log-rank test for univariate analysis and the Cox proportional hazards model for multivariate analysis.
Results: One hundred eighty-two patients (122 male, 60 female) aged between 25 and 81 years (mean, 61 years) were included. TSP-1 was expressed around blood vessels (n = 45, 25%) or in the stroma (n = 59, 33%); no expression was detected in the remaining tumors. TSP-1 expression correlated significantly with poor survival on univariate (P = 0.01 for perivascular expression and P = 0.03 for stromal expression) and multivariate analysis (P = 0.01 for perivascular expression).
Conclusion: TSP-1 is a negative prognostic factor for survival in resected colorectal liver metastases. © 2005 American Association for Cancer Research.
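
The first stage of the survival analysis described here (Kaplan-Meier estimation, before the log-rank test and Cox regression) can be sketched in a few lines; the follow-up times and event flags below are invented for illustration, not the study's data.

```python
# Minimal Kaplan-Meier estimator sketch (pure Python).
# times: follow-up (e.g. months); events: 1 = death observed, 0 = censored.
# All values here are hypothetical, not the patient data from the study.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time,
    with S(t) the product-limit survival estimate."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival, s = [], 1.0
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n = at_risk
        # group all subjects tied at the same time point
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / n  # multiply by conditional survival at t
            survival.append((t, s))
    return survival

# Example: 6 subjects, deaths at 10 and 30, censoring elsewhere.
curve = kaplan_meier([10, 20, 30, 40, 50, 60], [1, 0, 1, 0, 0, 0])
```

The curve drops only at observed event times; censored subjects reduce the risk set without lowering the estimate, which is why the method suits follow-up data of this kind.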

Relevance: 20.00%

Abstract:

Statistical methodology was applied to a survey of time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia. -from Authors

Relevance: 20.00%

Abstract:

Diagnostics of rolling element bearings involves a combination of different signal enhancement and analysis techniques. The most common procedure comprises a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been derived to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional pre-whitening step is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new pre-whitening technique based on cepstral analysis has been proposed: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it is a good candidate for the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique is tested on data measured on a full-scale industrial bearing test rig capable of reproducing harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques is made as a final step to verify the potential of the cepstrum pre-whitening.
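
In its common formulation, cepstrum pre-whitening reduces to dividing the signal's spectrum by its own magnitude (equivalently, zeroing the whole real cepstrum), which flattens all deterministic spectral lines at once. A minimal sketch, using a naive O(n²) DFT purely for self-containment (a real implementation would use an FFT):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2), for illustration only)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    """Inverse DFT; returns the real part (input spectra here are conjugate-symmetric)."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def cepstrum_prewhiten(x, eps=1e-12):
    """Divide the spectrum by its magnitude: the cepstrum pre-whitening
    operation, leaving a unit-magnitude (whitened) spectrum."""
    spec = dft(x)
    return idft([s / (abs(s) + eps) for s in spec])

# Toy signal: after pre-whitening, its magnitude spectrum is flat.
whitened = cepstrum_prewhiten([3.0, 1.0, -2.0, 0.5, 4.0, -1.0, 0.0, 2.0])
```

Envelope analysis would then be applied to the whitened signal, where the statistically based thresholds on the squared envelope spectrum remain valid.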

Relevance: 20.00%

Abstract:

Whole-image descriptors such as GIST have been used successfully for persistent place recognition when combined with temporal or sequential filtering techniques. However, whole-image descriptor localization systems often apply a heuristic rather than a probabilistic approach to place recognition, requiring substantial environment-specific tuning prior to deployment. In this paper we present a novel online solution that uses statistical approaches to calculate place recognition likelihoods for whole-image descriptors, without requiring either environmental tuning or pre-training. Using a real-world benchmark dataset, we show that this method creates distributions appropriate to a specific environment in an online manner. Our method performs comparably to FAB-MAP in raw place recognition performance, and integrates into a state-of-the-art probabilistic mapping system to provide superior performance to whole-image methods that are not based on true probability distributions. The method provides a principled means for combining the powerful change-invariant properties of whole-image descriptors with probabilistic back-end mapping systems, without the need for prior training or system tuning.
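
As a toy illustration of scoring a candidate match online from whole-image descriptor distances: the running-Gaussian model of the distance distribution and the logistic mapping below are assumptions for illustration, not the paper's actual likelihood formulation.

```python
import math

class OnlineMatchLikelihood:
    """Fit a running Gaussian to observed descriptor distances (Welford's
    online algorithm) and score how unusually small a candidate distance is.
    Illustrative sketch only, not the paper's method."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def update(self, d):
        """Fold one observed distance into the running statistics."""
        self.n += 1
        delta = d - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (d - self.mean)

    def likelihood(self, d):
        """Map the distance's z-score through a logistic: distances far
        below the running mean (likely matches) score near 1."""
        if self.n < 2:
            return 0.5
        std = math.sqrt(self.m2 / (self.n - 1)) or 1e-12
        z = (self.mean - d) / std
        return 1.0 / (1.0 + math.exp(-z))

model = OnlineMatchLikelihood()
for d in [1.0, 1.1, 0.9, 1.05, 0.95]:  # typical non-matching distances
    model.update(d)
```

Because the distribution is learned from the incoming distances themselves, no environment-specific threshold has to be tuned in advance, which is the property the abstract emphasises.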

Relevance: 20.00%

Abstract:

This paper presents a comparative study of the response of a buried tunnel to surface blast using the arbitrary Lagrangian-Eulerian (ALE) and smoothed particle hydrodynamics (SPH) techniques. Since explosive tests with real physical models are extremely risky and expensive, the results of a centrifuge test were used to validate the numerical techniques. The numerical study shows that the ALE predictions were faster and closer to the experimental results than those from the SPH simulations, which overpredicted the strains. The findings of this research demonstrate the superiority of the ALE modelling techniques for the present study. They also provide a comprehensive understanding of the preferred ALE modelling techniques, which can be used to investigate the surface blast response of underground tunnels.

Relevance: 20.00%

Abstract:

This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-length utterance i-vectors vary with speaker, session variations, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating the session variation, but we have identified the variability introduced by phonetic content due to utterance variation as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short-utterance i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations, and is shown to improve on the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling improves on the baseline PLDA approach. We also show that, for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
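
Cosine similarity scoring (CSS) between two channel-compensated i-vectors is a single normalised inner product; a minimal sketch, with made-up three-dimensional vectors standing in for the (typically several-hundred-dimensional) i-vectors:

```python
import math

def cosine_similarity_score(w_target, w_test):
    """Cosine similarity scoring (CSS) between two i-vectors,
    represented here as plain lists of floats."""
    dot = sum(a * b for a, b in zip(w_target, w_test))
    norm = (math.sqrt(sum(a * a for a in w_target))
            * math.sqrt(sum(b * b for b in w_test)))
    return dot / norm

# Hypothetical enrolment and test i-vectors (illustrative values only).
score = cosine_similarity_score([0.2, -0.5, 0.1], [0.25, -0.45, 0.05])
```

In a full system the vectors would first be projected through LDA/SN-LDA (and, per this paper, SUVN) before scoring, and the score compared against a decision threshold.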

Relevance: 20.00%

Abstract:

The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort to enable the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Although it possesses such advantages, this method is rather strict with its input requirement, as it assumes the training data to be multivariate normal, which is not always available, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based upon the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setups for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via applications to benchmark structure data from the field.
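
The Mahalanobis squared distance novelty index itself can be sketched for a two-dimensional damage-sensitive feature, using the closed-form 2×2 covariance inverse; this shows only the basic index, not the article's controlled data generation scheme, and the training points are invented.

```python
def mahalanobis_sq_2d(x, data):
    """Mahalanobis squared distance of a 2-D point x from training data:
    D^2 = (x - mu)^T Sigma^{-1} (x - mu), with the sample mean mu and
    sample covariance Sigma inverted in closed form for the 2x2 case."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = x[0] - mx, x[1] - my
    # Sigma^{-1} = [[syy, -sxy], [-sxy, sxx]] / det
    return (dx * dx * syy - 2.0 * dx * dy * sxy + dy * dy * sxx) / det

# Hypothetical healthy-state feature observations.
train = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0), (1.0, -1.0), (0.0, 2.0), (2.0, 2.0)]
d_near = mahalanobis_sq_2d((1.0, 0.5), train)   # close to the training cloud
d_far = mahalanobis_sq_2d((10.0, 10.0), train)  # candidate novelty/damage
```

A point is flagged as novel (potentially damaged) when its distance exceeds a threshold, conventionally taken from the chi-squared distribution under the multivariate-normality assumption the article scrutinises.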

Relevance: 20.00%

Abstract:

Pile foundations transfer loads from superstructures to stronger subsoil. Their strength and stability can hence affect structural safety. This paper treats the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. Results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.

Relevance: 20.00%

Abstract:

Spreading cell fronts play an essential role in many physiological processes. Classically, models of this process are based on the Fisher-Kolmogorov equation; however, such continuum representations are not always suitable, as they do not explicitly represent behaviour at the level of individual cells. Additionally, many models examine only the large-time asymptotic behaviour, where a travelling wave front with a constant speed has been established. Many experiments, such as a scratch assay, never display this asymptotic behaviour, and in these cases the transient behaviour must be taken into account. We examine the transient and asymptotic behaviour of moving cell fronts using techniques that go beyond the continuum approximation, via a volume-excluding birth-migration process on a regular one-dimensional lattice. We approximate the averaged discrete results using three methods: (i) mean-field, (ii) pair-wise, and (iii) one-hole approximations. We discuss the performance of these methods, in comparison to the averaged discrete results, across a range of parameter space, examining both the transient and asymptotic behaviours. The one-hole approximation, based on techniques from statistical physics, is not capable of predicting transient behaviour but provides excellent agreement with the asymptotic behaviour of the averaged discrete results, provided that cells are proliferating fast enough relative to their rate of migration. The mean-field and pair-wise approximations give indistinguishable asymptotic results, which agree with the averaged discrete results when cells are migrating much more rapidly than they are proliferating. The pair-wise approximation performs better in the transient region than the mean-field, despite having the same asymptotic behaviour. Our results show that each approximation only works in specific situations, so we must be careful to use a suitable approximation for a given system; otherwise inaccurate predictions could be made.
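
For a spatially uniform density, the mean-field approximation of a volume-excluding birth process reduces to the logistic equation dC/dt = λC(1 − C): proliferation succeeds only into empty sites. A minimal Euler-integration sketch (the rate and step size below are illustrative, not taken from the paper):

```python
def mean_field_density(c0, lam, dt, steps):
    """Euler integration of the mean-field (logistic) approximation
    dC/dt = lam * C * (1 - C) for a volume-excluding birth process;
    a uniform-density sketch, not the paper's full spatial model."""
    c = c0
    history = [c]
    for _ in range(steps):
        c += dt * lam * c * (1.0 - c)  # growth saturates as sites fill
        history.append(c)
    return history

# Start from a sparse occupancy and integrate to near-saturation.
traj = mean_field_density(c0=0.05, lam=1.0, dt=0.01, steps=2000)
```

The density rises monotonically and saturates at the lattice carrying capacity C = 1; the transient portion of this curve is exactly the regime where, per the abstract, the choice among mean-field, pair-wise and one-hole approximations matters.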

Relevance: 20.00%

Abstract:

The Department of Culture and the Arts undertook the first mapping of Perth’s creative industries in 2007 in partnership with the City of Perth and the Departments of Industry and Resources and the Premier and Cabinet. The 2013 Creative Industries Statistical Analysis for Western Australia report has updated the mapping with the 2011 Census employment data to provide invaluable information for the State’s creative industries, their peak associations and potential investors. The report maps sector employment numbers and growth between the 2006 and 2011 Census in the areas of music, visual and performing arts, film, TV and radio, advertising and marketing, software and digital content, publishing, and architecture and design, which includes designer fashion.

Relevance: 20.00%

Abstract:

Visual localization in outdoor environments is often hampered by the natural variation in appearance caused by such things as weather phenomena, diurnal fluctuations in lighting, and seasonal changes. Such changes are global across an environment and, in the case of global light changes and seasonal variation, the change in appearance occurs in a regular, cyclic manner. Visual localization could be greatly improved if it were possible to predict the appearance of a particular location at a particular time, based on the appearance of the location in the past and knowledge of the nature of appearance change over time. In this paper, we investigate whether global appearance changes in an environment can be learned sufficiently to improve visual localization performance. We use time of day as a test case, and generate transformations between morning and afternoon using sample images from a training set. We demonstrate the learned transformation can be generalized from training data and show the resulting visual localization on a test set is improved relative to raw image comparison. The improvement in localization remains when the area is revisited several weeks later.
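
A drastically simplified stand-in for the learned appearance transformation is a per-pixel linear (gain/offset) map fitted by least squares over morning/afternoon training pairs; the two-pixel "images" below are invented, and the paper's actual transformation learning is not reproduced here.

```python
def learn_linear_transform(morning, afternoon):
    """Fit per-pixel (gain, offset) minimising ||a*m + b - f||^2 across
    training image pairs; a toy stand-in for a learned appearance change."""
    n_pix = len(morning[0])
    params = []
    for i in range(n_pix):
        xs = [img[i] for img in morning]
        ys = [img[i] for img in afternoon]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        a = sxy / sxx if sxx else 1.0  # identity gain if pixel never varies
        params.append((a, my - a * mx))
    return params

def apply_transform(params, image):
    """Predict the afternoon appearance of a new morning image."""
    return [a * v + b for (a, b), v in zip(params, image)]

# Hypothetical training pairs: each 'image' is two pixel intensities.
morning = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.8]]
afternoon = [[0.3, 0.15], [0.7, 0.25], [1.1, 0.45]]
params = learn_linear_transform(morning, afternoon)
predicted = apply_transform(params, [0.2, 0.6])
```

Localization would then compare the transformed query against the database rather than the raw image, which is the mechanism by which the learned appearance change improves matching.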

Relevance: 20.00%

Abstract:

Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the most appropriate system when a review or upgrade of the network infrastructure is required.
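
The evaluation protocol described (fit a regression on 80% of areas, then check held-out predictions against a per-dwelling error band) can be sketched with a single synthetic predictor; the household-size covariate, coefficients and noise level below are assumptions for illustration, not the paper's multi-covariate model or data.

```python
import random

def fit_simple_ols(xs, ys):
    """Least-squares fit y ~ a*x + b via the normal equations
    (one illustrative predictor; the paper uses many census covariates)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Synthetic CCD-level data: daily kWh rising with household size, plus noise.
rng = random.Random(0)
size = [rng.uniform(1.5, 4.0) for _ in range(1000)]
kwh = [4.0 * s + 3.0 + rng.gauss(0.0, 1.0) for s in size]

# 80/20 train/test split, mirroring the paper's evaluation protocol.
train_x, test_x = size[:800], size[800:]
train_y, test_y = kwh[:800], kwh[800:]
a, b = fit_simple_ols(train_x, train_y)

# Fraction of held-out areas predicted within 5 kWh/dwelling/day.
within_5 = sum(abs(a * x + b - y) <= 5.0
               for x, y in zip(test_x, test_y)) / len(test_x)
```

Holding out the test areas before fitting is what makes the reported "within 5 kWh per dwelling per day" figure an independent check rather than an in-sample fit statistic.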

Relevance: 20.00%

Abstract:

The safe working lifetime of a structure in a corrosive or other harsh environment is frequently not limited by the material itself but rather by the integrity of the coating material. Advanced surface coatings are usually crosslinked organic polymers, such as epoxies and polyurethanes, which must not shrink, crack or degrade when exposed to environmental extremes. While standard test methods for the environmental durability of coatings have been devised, the tests are structured more towards determining the end of life than towards anticipating degradation. We have been developing prognostic tools to anticipate coating failure using a fundamental understanding of coating degradation behaviour which, depending on the polymer structure, is mediated through hydrolytic or oxidative processes. Fourier transform infrared spectroscopy (FTIR) is a widely used laboratory technique for the analysis of polymer degradation, and with the development of portable FTIR spectrometers, new opportunities have arisen to measure polymer degradation non-destructively in the field. For IR reflectance sampling, both diffuse (scattered) and specular (direct) reflections can occur. The complexity of these spectra has provided interesting opportunities to study surface chemical and physical changes during paint curing, service abrasion and weathering, but has often required the use of advanced statistical analysis methods, such as chemometrics, to discern these changes. Results from our studies using this and related techniques, and the technical challenges that have arisen, will be presented.

Relevance: 20.00%

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP), conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, owing to the consideration of our knowledge about the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
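
A conventional GASP-style emulator of the kind the paper contrasts against (a zero-mean Gaussian process conditioned on design inputs/outputs) can be sketched as plain kernel interpolation; the design points, kernel length-scale and nugget below are invented, and the paper's state-space/Kalman construction is not reproduced.

```python
import math

def solve(mat, rhs):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(mat)
    aug = [row[:] + [b] for row, b in zip(mat, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, n):
            f = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= f * aug[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][c] * x[c] for c in range(r + 1, n))) / aug[r][r]
    return x

def gp_emulator(design_x, design_y, length=1.0, nugget=1e-10):
    """Zero-mean GASP-style interpolator with a squared-exponential kernel;
    a toy emulator, not the paper's state-space construction."""
    k = lambda a, b: math.exp(-((a - b) ** 2) / (2.0 * length ** 2))
    gram = [[k(xi, xj) + (nugget if i == j else 0.0)
             for j, xj in enumerate(design_x)] for i, xi in enumerate(design_x)]
    alpha = solve(gram, design_y)  # K^{-1} y
    return lambda x: sum(ai * k(x, xi) for ai, xi in zip(alpha, design_x))

# Four hypothetical (input, simulator output) design points.
emu = gp_emulator([0.0, 1.0, 2.0, 3.0], [0.0, 0.8, 0.9, 0.1])
```

The dense Gram matrix here is exactly where closely spaced time points cause the numerical difficulties the abstract mentions: near-duplicate inputs make the matrix nearly singular, which the proposed state-space prior avoids.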

Relevance: 20.00%

Abstract:

An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are 'doubly stochastic', i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions using plug-in point estimates and do not distinguish between Cox processes and density estimation, although they do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render interpretable, complex spatio-temporal forecasts. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a discretized grid of large scale and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
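
The 'doubly stochastic' structure of a log-Gaussian Cox process can be sketched on a one-dimensional discretized grid: a correlated Gaussian field is drawn and exponentiated to give the intensity, and counts are then drawn from cell-wise Poisson distributions. The AR(1) field and all parameter values below are illustrative stand-ins for a full Gaussian process, not the paper's model.

```python
import math
import random

def poisson(rng, lam):
    """Knuth's multiplication method for Poisson sampling
    (adequate for the moderate intensities used here)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_lgcp_grid(n_cells, mu, sigma, length, seed=0):
    """Simulate a log-Gaussian Cox process on a 1-D grid: layer 1 draws a
    correlated Gaussian field (AR(1) stand-in for a GP); layer 2 draws
    Poisson counts from the exponentiated field."""
    rng = random.Random(seed)
    rho = math.exp(-1.0 / length)  # cell-to-cell correlation
    field = [rng.gauss(0.0, 1.0)]
    for _ in range(n_cells - 1):
        field.append(rho * field[-1] + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0))
    intensity = [math.exp(mu + sigma * z) for z in field]  # always positive
    counts = [poisson(rng, lam) for lam in intensity]
    return intensity, counts

intensity, counts = simulate_lgcp_grid(n_cells=50, mu=1.0, sigma=0.5, length=5.0)
```

Inference inverts this generative story: given observed counts, the posterior over the latent field (and hence the predictive intensity surface to be visualized) is what must be computed on the large discretized grid.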