5 results for Dimensionality
in WestminsterResearch - UK
Abstract:
Complex network theory is a framework increasingly used in the study of air transport networks, thanks to its ability to describe the structures created by networks of flights and their influence on dynamical processes such as delay propagation. While many works consider only a fraction of the network, for example the part created by major airports or airlines, it is not clear if and how such a sampling process biases the observed structures and processes. In this contribution, we tackle this problem by studying how some observed topological metrics depend on the way the network is reconstructed, i.e. on the rules used to sample nodes and connections. Both structural and simple dynamical properties are considered, for eight major air networks and different source datasets. Results indicate that using a subset of airports strongly distorts our perception of the network, even when only small ones are discarded; at the same time, considering a subset of airlines yields a better and more stable representation. This allows us to provide some general guidelines on the way airports and connections should be sampled.
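As a rough illustration of the node-sampling comparison described in this abstract, the sketch below builds a toy network with networkx, keeps varying fractions of "airports" ranked by a traffic proxy, and recomputes a few standard topological metrics. The graph, thresholds and metric choices are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch of a sampling comparison on an air-transport-like network.
# Nodes stand in for airports; the "traffic" attribute is a synthetic proxy.
import networkx as nx


def sample_by_airport_size(g: nx.Graph, keep_fraction: float) -> nx.Graph:
    """Keep only the top `keep_fraction` of airports ranked by traffic."""
    ranked = sorted(g.nodes, key=lambda n: g.nodes[n].get("traffic", 0), reverse=True)
    kept = ranked[: max(2, int(keep_fraction * len(ranked)))]
    return g.subgraph(kept).copy()


def topological_summary(g: nx.Graph) -> dict:
    """A few standard metrics whose values may shift under sampling."""
    return {
        "nodes": g.number_of_nodes(),
        "edges": g.number_of_edges(),
        "density": nx.density(g),
        "clustering": nx.average_clustering(g),
        "assortativity": nx.degree_assortativity_coefficient(g),
    }


if __name__ == "__main__":
    # Toy scale-free graph standing in for a real air-transport dataset.
    full = nx.barabasi_albert_graph(200, 3, seed=1)
    for n in full.nodes:
        full.nodes[n]["traffic"] = full.degree[n]  # degree as a proxy for airport size
    for frac in (1.0, 0.5, 0.2):
        print(frac, topological_summary(sample_by_airport_size(full, frac)))
```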
Abstract:
Freshness and safety of muscle foods are generally considered the most important parameters for the food industry. To address the rapid determination of meat spoilage, Fourier transform infrared (FTIR) spectroscopy, combined with advanced learning-based methods, was investigated in this work. FTIR spectra were obtained from the surface of beef samples during aerobic storage at various temperatures, while microbiological analysis identified the population of total viable counts. A fuzzy principal component algorithm was also developed to reduce the dimensionality of the spectral data. The results confirmed the superiority of the adopted scheme over the partial least squares technique currently used in food microbiology.
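The dimensionality-reduction step described above can be illustrated schematically: the sketch below compresses synthetic, spectra-like data with standard PCA (used here as a stand-in for the paper's fuzzy principal component algorithm) before regression, and compares it with a partial least squares baseline. All data and parameter choices are illustrative assumptions, not the study's.

```python
# Sketch of dimensionality reduction on FTIR-like spectra before regression,
# compared against a PLS baseline. Spectra and targets are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 120, 800            # wide, highly collinear spectra
spectra = rng.normal(size=(n_samples, n_wavenumbers)).cumsum(axis=1)
counts = spectra[:, ::100].mean(axis=1) + rng.normal(scale=0.5, size=n_samples)

# Route 1: compress the spectra to a handful of principal components first.
scores = PCA(n_components=5).fit_transform(spectra)
pca_cv = cross_val_score(LinearRegression(), scores, counts, cv=5, scoring="r2")

# Route 2: partial least squares, the baseline mentioned in the abstract.
pls_cv = cross_val_score(PLSRegression(n_components=5), spectra, counts, cv=5, scoring="r2")

print("PCA + regression R^2:", pca_cv.mean())
print("PLS regression   R^2:", pls_cv.mean())
```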
Abstract:
Much debate in schizotypal research has centred on the factor structure of the Schizotypal Personality Questionnaire (SPQ), with research variously showing higher-order dimensionality consisting of two to seven dimensions. In addition, cross-cultural support for the stability of those factors remains limited. Here, we examined the factor structure of the SPQ among British and Trinidadian adults. Participants from a White British sub-sample (n = 351) resident in the UK and from an African Caribbean sub-sample (n = 284) resident in Trinidad completed the SPQ. The higher-order factor structure of the SPQ was analysed through confirmatory factor analysis, followed by multiple-group analysis for the model of best fit. Between-group differences for sex and ethnicity were investigated using multivariate analysis of variance in relation to the higher-order domains. The model of best fit was the four-factor structure, which demonstrated measurement invariance across groups. Additionally, the data showed an adequate fit for two alternative models: (a) a three-factor structure and (b) a modified four-factor structure. The British sub-sample had significantly higher scores across all domains than the Trinidadian group, and men scored significantly higher on the disorganised domain than women. The four-factor structure received confirmatory support and, importantly, support for use with populations varying in ethnicity and culture.
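The between-group step mentioned in the abstract (multivariate analysis of variance over the higher-order SPQ domains) can be sketched with statsmodels as below. The data are synthetic and the four domain labels are conventional placeholders, not the study's variables or results.

```python
# Sketch of a MANOVA across higher-order domains with group and sex factors.
# All values are randomly generated for illustration only.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(42)
n = 300
df = pd.DataFrame({
    "cognitive_perceptual": rng.normal(size=n),
    "interpersonal": rng.normal(size=n),
    "disorganised": rng.normal(size=n),
    "paranoid": rng.normal(size=n),
    "group": rng.choice(["british", "trinidadian"], size=n),
    "sex": rng.choice(["male", "female"], size=n),
})

# Multivariate test of group and sex effects across the four domains.
mv = MANOVA.from_formula(
    "cognitive_perceptual + interpersonal + disorganised + paranoid ~ group + sex",
    data=df,
)
print(mv.mv_test())
```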
Abstract:
In this study, we utilise a novel approach to segment the ventricular system in a series of high-resolution T1-weighted MR images. We present a fast method for reconstructing the brain ventricles. The method is based on processing brain sections and placing a fixed number of landmarks onto those sections to reconstruct the 3D surface of the ventricles. Automated landmark extraction is accomplished using a self-organising network, the growing neural gas (GNG), which is able to topographically map the low dimensionality of the network to the high dimensionality of the contour manifold without requiring a priori knowledge of the input space structure. Moreover, our GNG landmark method is tolerant to noise and eliminates outliers. Our method accelerates the classical surface reconstruction and filtering processes. The proposed method offers higher accuracy than methods of similar efficiency, such as Voxel Grid.
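For readers unfamiliar with the growing neural gas, the sketch below is a compact, generic GNG in the spirit of Fritzke's algorithm, fitted to a toy 2D contour. It is not the authors' implementation, and the parameter values are conventional defaults rather than those used in the paper.

```python
# Compact growing-neural-gas (GNG) sketch: a small graph of landmark nodes is
# grown to follow a cloud of 2D contour points.
import numpy as np


def grow_neural_gas(points, max_nodes=30, n_iter=5000, eps_b=0.05, eps_n=0.006,
                    age_max=50, lam=100, alpha=0.5, d=0.995, seed=0):
    rng = np.random.default_rng(seed)
    nodes = [points[rng.integers(len(points))].copy() for _ in range(2)]
    errors = [0.0, 0.0]
    edges = {}  # (i, j) with i < j -> edge age

    for step in range(1, n_iter + 1):
        x = points[rng.integers(len(points))]
        dists = [np.sum((x - w) ** 2) for w in nodes]
        s1, s2 = np.argsort(dists)[:2]          # winner and runner-up

        errors[s1] += dists[s1]
        nodes[s1] += eps_b * (x - nodes[s1])    # move winner towards the input

        # Age edges incident to the winner and pull its neighbours slightly.
        for (i, j) in list(edges):
            if s1 in (i, j):
                edges[(i, j)] += 1
                other = j if i == s1 else i
                nodes[other] += eps_n * (x - nodes[other])

        edges[(min(s1, s2), max(s1, s2))] = 0   # refresh / create winner-runner edge
        edges = {e: a for e, a in edges.items() if a <= age_max}

        # Periodically insert a node between the worst node and its worst neighbour.
        if step % lam == 0 and len(nodes) < max_nodes:
            q = int(np.argmax(errors))
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: errors[n])
                nodes.append((nodes[q] + nodes[f]) / 2.0)
                errors[q] *= alpha
                errors[f] *= alpha
                errors.append(errors[q])
                r = len(nodes) - 1
                edges.pop((min(q, f), max(q, f)), None)
                edges[(min(q, r), max(q, r))] = 0
                edges[(min(f, r), max(f, r))] = 0

        errors = [e * d for e in errors]        # global error decay

    return np.array(nodes), list(edges)


if __name__ == "__main__":
    # Toy "contour": noisy points on a circle standing in for a ventricle section.
    t = np.random.default_rng(1).uniform(0, 2 * np.pi, 2000)
    contour = np.c_[np.cos(t), np.sin(t)] + 0.02 * np.random.default_rng(2).normal(size=(2000, 2))
    landmarks, graph_edges = grow_neural_gas(contour)
    print(len(landmarks), "landmarks,", len(graph_edges), "edges")
```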
Abstract:
This article proposes a three-step procedure to estimate portfolio return distributions under the multivariate Gram-Charlier (MGC) distribution. The method combines quasi-maximum likelihood (QML) estimation for the conditional means and variances with method of moments (MM) estimation for the remaining density parameters, including the correlation coefficients. The procedure yields consistent estimates even under density misspecification and avoids the so-called ‘curse of dimensionality’ of multivariate modelling. Furthermore, the MGC distribution represents a flexible and general approximation to the true distribution of portfolio returns and accounts for all its empirical regularities. As an illustration, the procedure is applied to a portfolio composed of three European indices. The MM estimation of the MGC (MGC-MM) is compared with traditional maximum likelihood estimation of both the MGC and multivariate Student’s t (benchmark) densities. A simulation of Value-at-Risk (VaR) performance for an equally weighted portfolio at the 1% and 5% levels indicates that the MGC-MM method provides reasonable approximations to the true empirical VaR. Therefore, the procedure seems to be a useful tool for risk managers and practitioners.
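A heavily simplified, univariate analogue of the moment-based step can convey the idea: estimate skewness and excess kurtosis of portfolio returns by the method of moments, plug them into a Gram-Charlier expansion of the return density, and invert its CDF to obtain the VaR quantile. The sketch below does not reproduce the multivariate MGC machinery, the QML dynamics, or the article's data; the returns and weights are synthetic and illustrative.

```python
# Univariate Gram-Charlier VaR sketch for an equally weighted toy portfolio.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=(2000, 3)) * 0.01   # three toy index series
portfolio = returns @ np.array([1 / 3, 1 / 3, 1 / 3])   # equally weighted

mu, sigma = portfolio.mean(), portfolio.std(ddof=1)
z = (portfolio - mu) / sigma
s = np.mean(z ** 3)                  # sample skewness (method of moments)
k = np.mean(z ** 4) - 3.0            # sample excess kurtosis


def gram_charlier_cdf(x, skew, exkurt):
    """CDF of the Gram-Charlier expansion around the standard normal."""
    he2 = x ** 2 - 1.0
    he3 = x ** 3 - 3.0 * x
    return norm.cdf(x) - norm.pdf(x) * (skew / 6.0 * he2 + exkurt / 24.0 * he3)


def var(alpha):
    """Value-at-Risk at level alpha (e.g. 0.01 or 0.05), reported as a loss."""
    q = brentq(lambda x: gram_charlier_cdf(x, s, k) - alpha, -10.0, 10.0)
    return -(mu + sigma * q)


for alpha in (0.01, 0.05):
    print(f"VaR({alpha:.0%}) ~ {var(alpha):.4%}  (empirical: {-np.quantile(portfolio, alpha):.4%})")
```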