25 results for multivariate analysis of covariance

in Queensland University of Technology - ePrints Archive


Relevance:

20.00%

Publisher:

Abstract:

"This is an important book that ought to launch a debate about how we research our understanding of the world; it is an innovative intervention in a vital public issue, and it is an elegant and scholarly hard look at what is actually happening." -- Jean Seaton, Professor of Media History, University of Westminster, UK, and Official Historian of the BBC -- Summary: This book investigates how comparative studies of international TV news (here: on the presentation of violence) can best be conceptualized so as to allow cross-national, comparative conclusions on an empirically validated basis. It shows that such a conceptualization is necessary in order to overcome existing restrictions on the comparability of international analyses of violence presentation. The examples investigated are the most watched news bulletins in Great Britain (the 10 O'Clock News on the BBC), Germany (Tagesschau on ARD) and Russia (Vremja on Channel 1). The book highlights a substantial cross-national violence news flow as well as a cross-national visual violence flow (key visuals) as distinct transnational components. In addition, event-related textual analysis reveals how the historical rootedness of nations and their symbols of power are still manifested in televisual mediations of violence. In conclusion, this study argues for a conscientious use of comparative data and analysis in both journalism research and practice, in order to understand what such data may convey in the different arenas of today’s newsmaking.

Relevance:

20.00%

Publisher:

Abstract:

This paper analyses the parameters that influence shaft voltage generation in induction generators. It focuses on the different parasitic capacitive couplings, examined through mathematical equations, finite element simulations and experiments. The effects of different design parameters on these capacitances and on the resultant shaft voltage have been studied. Several parameters can change the capacitive couplings, such as the stator slot tooth, the gap between the slot tooth and the winding, and the height of the slot tooth, as well as the air gap between the rotor and the stator. This analysis can be used at an early stage of generator design to reduce shaft voltage and avoid the additional costs of mitigating the resultant bearing currents.
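As a rough numerical illustration of how such parasitic capacitances determine the shaft voltage, the sketch below evaluates the common capacitive voltage-divider (bearing voltage ratio) model for an induction machine. The function name and all capacitance and voltage values are assumptions introduced here for illustration, not parameters or results from the paper.

```python
# Rough capacitive voltage-divider (bearing voltage ratio) model of shaft
# voltage in an induction machine.  Illustrative only; all values are
# placeholders, not results from the paper.

def shaft_voltage_ratio(c_wr, c_rf, c_b_de, c_b_nde):
    """Ratio of shaft voltage to common-mode voltage.

    c_wr    : winding-to-rotor capacitance (F)
    c_rf    : rotor-to-frame capacitance (F)
    c_b_de  : drive-end bearing capacitance (F)
    c_b_nde : non-drive-end bearing capacitance (F)
    """
    return c_wr / (c_wr + c_rf + c_b_de + c_b_nde)

if __name__ == "__main__":
    pF = 1e-12                         # picofarads to farads
    ratio = shaft_voltage_ratio(100 * pF, 2000 * pF, 200 * pF, 200 * pF)
    v_common_mode = 300.0              # V, assumed common-mode voltage amplitude
    print(f"BVR = {ratio:.3f}, shaft voltage ~ {ratio * v_common_mode:.1f} V")
```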

Relevance:

20.00%

Publisher:

Abstract:

Monitoring Internet traffic is critical to acquiring a good understanding of threats to computer and network security and to designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimizing existing ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, provide early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include:
• identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the squared prediction error statistic;
• real-time detection of new attacks using recursive principal component analysis;
• a proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
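To make the PCA residual-space idea concrete, here is a minimal sketch, assuming synthetic traffic feature vectors, of flagging new observations whose squared prediction error (Q-statistic) exceeds an empirical control limit. It illustrates the general technique only, not the implementation developed in this research.

```python
import numpy as np

# Minimal sketch of PCA residual-space (squared prediction error) anomaly
# detection, in the spirit of the approach above; the data are synthetic.

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 8))          # baseline traffic feature vectors
X_new = rng.normal(size=(50, 8))
X_new[:5] += 6.0                             # inject a few anomalous flows

# Centre the data and fit the PCA model via SVD of the training set.
mu = X_train.mean(axis=0)
U, s, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
k = 3                                        # number of retained components
P = Vt[:k].T                                 # loading matrix (8 x k)

def spe(X):
    """Squared prediction error: squared norm of the residual-space part."""
    R = (X - mu) - (X - mu) @ P @ P.T
    return np.sum(R**2, axis=1)

threshold = np.percentile(spe(X_train), 99)  # simple empirical control limit
flags = spe(X_new) > threshold
print(f"{flags.sum()} of {len(X_new)} new observations flagged as anomalous")
```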

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described, while the performance of the model or algorithm as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate both to the quality of model fit to data and to the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described, and the model is compared to the Normal case on the basis of goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as examples of hybrid algorithms. The statistical literature shows that statistical efficiency is often the only criterion used to judge an algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual components contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by combining these components into a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance-sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampling scheme, and explores a fundamental problem that occurs when importance sampling is applied to a high-dimensional problem. The precision of the computed estimate in this simplified setting is measured by the asymptotic variance of the estimate, under conditions on the importance function. The exponential growth of the asymptotic variance with the dimension is demonstrated, and it is shown that the optimal covariance matrix for the importance function can be estimated in a special case.
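As a toy illustration of the high-dimensional instability of importance sampling mentioned above, the sketch below (a simplified, self-contained example, not the experiments from the thesis) measures how the effective sample size of an importance sampler collapses as the dimension grows, using a Gaussian importance function with inflated variance against a standard Gaussian target.

```python
import numpy as np

# Toy demonstration that importance-sampling weights degenerate as the
# dimension grows (simplified setting; not the thesis' experiments).

rng = np.random.default_rng(1)
n = 10_000
sigma_q = 1.5                                # importance function N(0, 1.5^2 I)
for d in (1, 5, 20, 50):
    x = rng.normal(scale=sigma_q, size=(n, d))
    # log weight = log target density N(0, I) - log proposal density
    log_w = (-0.5 * np.sum(x**2, axis=1)
             + 0.5 * np.sum((x / sigma_q) ** 2, axis=1)
             + d * np.log(sigma_q))
    w = np.exp(log_w - log_w.max())          # rescale for numerical stability
    ess = w.sum() ** 2 / np.sum(w**2)        # effective sample size
    print(f"d = {d:3d}: effective sample size ~ {ess:.0f} of {n}")
```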

Relevance:

20.00%

Publisher:

Abstract:

Prostate cancer metastasis is reliant on the reciprocal interactions between cancer cells and the bone niche/micro-environment. The production of suitable matrices to study metastasis, carcinogenesis and in particular the prostate cancer/bone micro-environment interaction has been limited to specific protein matrices or matrices secreted by immortalised cell lines that may have undergone transformation processes altering signaling pathways and modifying gene or receptor expression. We hypothesize that matrices produced by primary human osteoblasts are a suitable means to develop an in vitro model system for bone metastasis research mimicking in vivo conditions. We have used a decellularized matrix secreted from primary human osteoblasts as a model for prostate cancer function in the bone micro-environment. We show that this collagen I-rich matrix is of fibrillar appearance, highly mineralized, and contains proteins, such as osteocalcin, osteonectin and osteopontin, and growth factors characteristic of bone extracellular matrix (ECM). LNCaP and PC3 cells grown on this matrix adhere strongly, proliferate, and express markers consistent with a loss of epithelial phenotype. Moreover, growth of these cells on the matrix is accompanied by the induction of genes associated with attachment, migration, increased invasive potential, Ca2+ signaling and osteolysis. In summary, we show that growth of prostate cancer cells on matrices produced by primary human osteoblasts mimics key features of prostate cancer bone metastases and thus is a suitable model system to study the tumor/bone micro-environment interaction in this disease.

Relevance:

20.00%

Publisher:

Abstract:

Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered, three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how integration of water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of the groundwaters. Principal Component Analysis on the hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and of the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables a more efficient communication of the results of scientific studies to the wider community.
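A minimal sketch of this type of multivariate workflow, PCA followed by hierarchical cluster analysis, is given below. The feature matrix, variable choices and cluster count are placeholder assumptions introduced for illustration; this is not the Wairau Plain dataset or the study's actual code.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Sketch of a PCA + hierarchical cluster analysis workflow on synthetic
# hydrochemistry data (placeholders; not the Wairau Plain dataset).

rng = np.random.default_rng(42)
# Columns could be, e.g., Na, Ca, Mg, Cl, HCO3, SO4, NO3 concentrations.
X = rng.normal(size=(60, 7))

# Standardise, then obtain principal component scores via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                        # scores on the first two PCs
explained = (s**2 / np.sum(s**2))[:2]
print("Variance explained by PC1, PC2:", np.round(explained, 2))

# Hierarchical cluster analysis (Ward linkage) on the PC scores.
clusters = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
print("Cluster sizes:", np.bincount(clusters)[1:])
```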

Relevance:

20.00%

Publisher:

Abstract:

Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to largely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). The decorrelation transformation can substantially amplify the biases in the phase measurements; the purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is also sufficiently large for least-squares estimation of the RTK solutions. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included in a more complete evaluation of the AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio test in the ambiguity validation process. In this paper, Galileo constellation data are tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can AR validation provide decisions about the correctness of AR that are close to the real world, with both low AR risk and low false-alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix, at a given high success rate, from the multiple constellations, instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
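The AR success rate referred to above is commonly approximated by the integer-bootstrapping lower bound computed from the conditional variances of the (decorrelated) ambiguity variance-covariance matrix. The sketch below evaluates that standard bound for an assumed, made-up vc-matrix; it is a generic illustration, not the software or data used in the paper.

```python
import numpy as np
from scipy.stats import norm

# Integer-bootstrapping lower bound on the ambiguity-resolution success rate,
# computed from the conditional variances of an assumed (decorrelated)
# ambiguity variance-covariance matrix.  Illustrative values only.

def bootstrap_success_rate(Q):
    """P_s = prod_i ( 2 * Phi(1 / (2 * sigma_{i|I})) - 1 )."""
    G = np.linalg.cholesky(Q)               # Q = G G^T, G lower triangular
    cond_std = np.diag(G)                    # conditional standard deviations
    return np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0)

if __name__ == "__main__":
    # Assumed ambiguity vc-matrix in cycles^2; placeholder values.
    Q = np.diag([0.01, 0.02, 0.03, 0.05]) + 0.002
    print(f"Bootstrapped success rate: {bootstrap_success_rate(Q):.4f}")
```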

Relevance:

20.00%

Publisher:

Abstract:

Pressure feeder chutes are pieces of equipment used in sugar cane crushing to increase the amount of cane that can be put through a mill. The continuous pressure feeder was developed with the objective of providing a constant feed of bagasse under pressure to the mouth of the crushing mills. The pressure feeder chute is used in a sugarcane milling unit to transfer bagasse from one set of crushing rolls to a second set. There have been many pressure feeder chute failures in the past. The pressure feeder chute is quite vulnerable: if the bagasse throughput is blocked at the mill rollers, the pressure build-up in the chute can be enormous, which can ultimately result in failure. The result is substantial damage to the rollers, mill and chute construction, and downtimes of up to 48 hours can be experienced. Part of the problem is that bagasse behaviour in the pressure feeder chute is not well understood. If this behaviour were better understood, the chute geometry could be designed to minimise the risk of failure, and there are possible avenues for changing pressure feeder chute design and operations with a view to producing more reliable chutes in the future. There have been previous attempts to conduct experimental work to determine the causes of pressure feeder chute failures, and certain guidelines are available; however, failures continue and pressure feeder chute behaviour remains poorly understood. This thesis describes the work carried out between 14 April 2009 and 10 October 2012, which focuses on the design of an experimental apparatus to measure forces and visually observe bagasse behaviour, in an attempt to understand bagasse behaviour in pressure feeder chutes and minimise the risk of failure.

Relevance:

20.00%

Publisher:

Abstract:

Objective: To explore the characteristics of the regional distribution of cancer deaths in Shandong Province using principal components analysis. Methods: Principal components analysis with the covariance matrix was carried out, using SAS software, on the age-adjusted mortality rates and percentages of 20 types of cancer in 22 counties (cities) for 2004-2006. Results: Over 90% of the total information could be reflected by the top three principal components, and the first principal component alone represented more than half of the overall regional variance. The first component mainly reflected the area differences in esophageal cancer. The second component mainly reflected the area differences in lung cancer, stomach cancer and liver cancer. The first principal component scores showed a clear trend, with western areas possessing higher values and eastern areas lower values. Based on the top two components, the 22 counties (cities) could be divided into several geographical clusters. Conclusion: The overall difference in the regional distribution of cancers in Shandong is dominated by several major cancers, including esophageal cancer, lung cancer, stomach cancer and liver cancer; among them, esophageal cancer makes the largest contribution. If the range of counties (cities) analyzed could be further widened, the characteristics of the regional distribution of cancer mortality would be better examined.
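For illustration, the sketch below reproduces the shape of such an analysis: a covariance-matrix PCA on a regions-by-cancer-types rate matrix, reporting the variance explained by the leading components and the regional scores on the first component. The data are synthetic placeholders, not the Shandong mortality data.

```python
import numpy as np

# Sketch of a covariance-matrix PCA on cancer mortality rates across regions,
# mirroring the type of analysis described above; the rate matrix is synthetic.

rng = np.random.default_rng(7)
rates = rng.gamma(shape=2.0, scale=5.0, size=(22, 20))   # 22 counties x 20 cancers

C = np.cov(rates, rowvar=False)              # covariance matrix (unstandardised)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]            # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("Variance explained by first component:", round(float(explained[0]), 2))
print("Cumulative variance of top 3 components:", round(float(explained[:3].sum()), 2))

# County scores on the first component, e.g. for mapping east-west gradients.
scores_pc1 = (rates - rates.mean(axis=0)) @ eigvecs[:, 0]
print("Range of first-component scores:",
      round(float(scores_pc1.min()), 2), "to", round(float(scores_pc1.max()), 2))
```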

Relevance:

20.00%

Publisher:

Abstract:

This work considers the problem of building high-fidelity 3D representations of the environment from sensor data acquired by mobile robots. Multi-sensor data fusion allows for more complete and accurate representations, and for more reliable perception, especially when different sensing modalities are used. In this paper, we propose a thorough experimental analysis of the performance of 3D surface reconstruction from laser and mm-wave radar data using Gaussian Process Implicit Surfaces (GPIS), in a realistic field robotics scenario. We first analyse the performance of GPIS using raw laser data alone and raw radar data alone, with different choices of covariance matrices and different resolutions of the input data. We then evaluate and compare the performance of two different GPIS fusion approaches. The first, state-of-the-art approach directly fuses raw data from laser and radar. The alternative approach proposed in this paper first computes an initial estimate of the surface from each single source of data, and then fuses these two estimates. We show that this method outperforms the state of the art, especially in situations where the sensors react differently to the targets they perceive.
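As a simplified, one-dimensional illustration of Gaussian-process fusion of two sensors with different noise levels (the general idea behind GPIS-based fusion, not the paper's actual GPIS implementation or data), the sketch below fits a single GP to laser-like and radar-like observations with per-sensor noise variances and queries the fused posterior.

```python
import numpy as np

# Minimal 1-D Gaussian-process regression sketch: laser-like and radar-like
# observations of an unknown profile are fused in a single GP with
# per-sensor noise variances.  Data, kernel hyper-parameters and noise
# levels are assumed placeholders, not values from the paper.

def sq_exp_kernel(a, b, length=0.5, signal_var=1.0):
    d = a[:, None] - b[None, :]
    return signal_var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(3)
truth = lambda x: np.sin(x)                       # unknown surface profile
x_laser = np.linspace(0.0, 5.0, 15)
x_radar = np.linspace(0.2, 5.0, 8)
y_laser = truth(x_laser) + rng.normal(scale=0.05, size=x_laser.size)
y_radar = truth(x_radar) + rng.normal(scale=0.30, size=x_radar.size)

# Stack both sensors; heteroscedastic noise enters on the diagonal.
x = np.concatenate([x_laser, x_radar])
y = np.concatenate([y_laser, y_radar])
noise = np.concatenate([np.full(x_laser.size, 0.05**2),
                        np.full(x_radar.size, 0.30**2)])

K = sq_exp_kernel(x, x) + np.diag(noise)
x_test = np.linspace(0.0, 5.0, 50)
K_s = sq_exp_kernel(x_test, x)
mean = K_s @ np.linalg.solve(K, y)                # fused posterior mean
var = np.diag(sq_exp_kernel(x_test, x_test) - K_s @ np.linalg.solve(K, K_s.T))
print("Fused estimate at x=2.5:",
      round(float(np.interp(2.5, x_test, mean)), 3), "+/-",
      round(float(np.sqrt(np.interp(2.5, x_test, var))), 3))
```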