621 results for Dimensionality
Abstract:
In the identification of complex dynamic systems using fuzzy neural networks, one of the main issues is the curse of dimensionality, which makes it difficult to retain a large number of system inputs or to consider a large number of fuzzy sets. Moreover, due to correlations among the candidate terms, not all possible network inputs or regression vectors are necessary; adding them simply increases the model complexity and deteriorates the network's generalisation performance. In this paper, the problem is solved by first proposing a fast algorithm for the selection of network terms, and then introducing a refinement procedure to tackle the correlation issue. Simulation results show the efficacy of the method.
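The abstract does not spell out the fast term-selection algorithm, but the general idea of greedy forward selection against a least-squares residual can be sketched as follows. This is a minimal illustration, not the paper's method; the function name, the stopping tolerance and the error-reduction score are all assumptions made here.

```python
import numpy as np

def forward_select_terms(Phi, y, max_terms, tol=1e-4):
    """Greedily add the candidate term (column of Phi) that most reduces
    the least-squares residual; stop when the relative gain is negligible."""
    selected = []
    residual = y.astype(float).copy()
    total = float(y @ y)
    for _ in range(max_terms):
        best_j, best_gain = None, 0.0
        for j in range(Phi.shape[1]):
            if j in selected:
                continue
            p = Phi[:, j].astype(float)
            denom = float(p @ p)
            if denom == 0.0:
                continue
            gain = float(p @ residual) ** 2 / denom  # squared projection on residual
            if gain > best_gain:
                best_j, best_gain = j, gain
        if best_j is None or best_gain / total < tol:
            break  # remaining candidates explain too little variance
        selected.append(best_j)
        # Refit on the selected subset and update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, selected], y, rcond=None)
        residual = y - Phi[:, selected] @ coef
    return selected
```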
Abstract:
Motivation: Recently, many univariate and several multivariate approaches have been suggested for testing differential expression of gene sets between different phenotypes. However, despite a wealth of literature studying their performance on simulated and real biological data, there is still a need to quantify their relative performance when they test different null hypotheses.
Results: In this article, we compare the performance of univariate and multivariate tests on both simulated and biological data. In the simulation study we demonstrate that high correlations equally affect the power of univariate and multivariate tests. In addition, for most of them the power is similarly affected by the dimensionality of the gene set and by the percentage of genes in the set whose expression changes between the two phenotypes. The application of different test statistics to biological data reveals that three statistics (sum of squared t-tests, Hotelling's T2, N-statistic), testing different null hypotheses, find some common but also some complementary differentially expressed gene sets under specific settings. This demonstrates that, owing to their complementary null hypotheses, each test highlights different aspects of the data; for the analysis of biological data it is therefore beneficial to use all three tests simultaneously instead of focusing exclusively on just one.
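Of the three statistics compared, Hotelling's T2 has a standard closed form; a minimal sketch of the two-sample version for a gene set follows. This is an illustration, not the paper's implementation; `hotelling_t2` is an illustrative name, and the F-approximation assumes n1 + n2 - 2 > p.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    """Two-sample Hotelling's T^2 for a p-gene set.

    X: (n1, p) expression matrix for phenotype 1, Y: (n2, p) for phenotype 2.
    Returns T^2 and an F-based p-value (requires n1 + n2 - 2 > p).
    """
    n1, p = X.shape
    n2 = Y.shape[0]
    diff = X.mean(axis=0) - Y.mean(axis=0)
    # Pooled within-group covariance of the gene set
    S = ((n1 - 1) * np.cov(X, rowvar=False)
         + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2 / (n1 + n2)) * diff @ np.linalg.solve(S, diff)
    # Exact null distribution under multivariate normality
    f_stat = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    p_value = stats.f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value
```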
Abstract:
This paper introduces a new technique for palmprint recognition based on Fisher Linear Discriminant Analysis (FLDA) and a Gabor filter bank. This method involves convolving a palmprint image with a bank of Gabor filters at different scales and rotations for robust palmprint feature extraction. Once these features are extracted, FLDA is applied for dimensionality reduction and class separability. Since the palmprint features are derived from the principal lines, wrinkles and texture along the palm area, the appropriate palm region must be selected carefully for the feature extraction process in order to enhance recognition accuracy. To address this problem, an improved region of interest (ROI) extraction algorithm is introduced. This algorithm allows for an efficient extraction of the whole palm area by ignoring all the undesirable parts, such as the fingers and background. Experiments have shown that the proposed method yields attractive performance, as evidenced by an Equal Error Rate (EER) of 0.03%.
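A rough sketch of the Gabor-bank-plus-FLDA pipeline, assuming scikit-image and scikit-learn. Pooling the filter responses into mean/std statistics is an assumption made here to keep the feature vector small, not necessarily the paper's feature encoding:

```python
import numpy as np
from skimage.filters import gabor
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gabor_features(img, frequencies=(0.1, 0.2, 0.3), n_orientations=4):
    """Filter a (pre-cropped ROI) palmprint image with a Gabor bank and pool
    the magnitude responses into a compact feature vector."""
    feats = []
    for freq in frequencies:
        for k in range(n_orientations):
            real, imag = gabor(img, frequency=freq, theta=k * np.pi / n_orientations)
            mag = np.hypot(real, imag)  # magnitude response of one filter
            feats += [mag.mean(), mag.std()]  # pooled statistics (an assumption here)
    return np.asarray(feats)

# FLDA for dimensionality reduction / class separation, e.g.:
#   X = np.stack([gabor_features(roi) for roi in rois]);  y = subject_labels
#   lda = LinearDiscriminantAnalysis().fit(X, y);  X_proj = lda.transform(X)
```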
Abstract:
The motivation for this paper is to present procedures for automatically creating idealised finite element models from the 3D CAD solid geometry of a component. The procedures produce an accurate and efficient analysis model with little effort on the part of the user. The technique is applicable to thin-walled components with local complex features and automatically creates analysis models where 3D elements representing the complex regions in the component are embedded in an efficient shell mesh representing the mid-faces of the thin sheet regions. As the resulting models contain elements of more than one dimension, they are referred to as mixed-dimensional models. Although these models are computationally more expensive than some of the idealisation techniques currently employed in industry, they do allow the structural behaviour of the model to be analysed more accurately, which is essential if appropriate design decisions are to be made. Also, using these procedures, analysis models can be created automatically, whereas the current idealisation techniques are mostly manual, have long preparation times, and are based on engineering judgement. In the paper, the idealisation approach is first applied to 2D models that are used to approximate axisymmetric components for analysis. For these models, 2D elements representing the complex regions are embedded in a 1D mesh representing the midline of the cross section of the thin sheet regions. Also discussed is the coupling, which is necessary to link the elements of different dimensionality together. Analysis results from a 3D mixed-dimensional model created using the techniques in this paper are compared to those from a stiffened shell model and a 3D solid model to demonstrate the improved accuracy of the new approach. At the end of the paper, a quantitative analysis of the reduction in computational cost due to shell meshing of thin sheet regions demonstrates that the reduction in degrees of freedom is proportional to the square of the aspect ratio of the region, and, for long slender solids, the reduction can be proportional to the aspect ratio of the region if appropriate meshing algorithms are used.
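The final degrees-of-freedom claim can be made concrete with a back-of-envelope count. Assume the solid element size is limited by the sheet thickness $t$ while the shell element size $h$ scales with the in-plane dimension $L$ (both assumptions, not stated in the abstract):

$$\frac{N_{\text{solid}}}{N_{\text{shell}}} \sim \frac{n_t\,(L/t)^2}{(L/h)^2} = n_t\left(\frac{h}{t}\right)^2 \propto \left(\frac{L}{t}\right)^2,$$

where $n_t$ is the number of through-thickness element layers; the reduction thus grows with the square of the aspect ratio $L/t$, as stated.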
Abstract:
A key problem in community ecology is to understand how individual-level traits give rise to population-level trophic interactions. Here, we propose a synthetic framework based on ecological considerations to address this question systematically. We derive a general functional form for the dependence of trophic interaction coefficients on trophically relevant quantitative traits of consumers and resources. The derived expression encompasses, and thus allows a unified comparison of, several functional forms previously proposed in the literature. Furthermore, we show how a community's potentially low-dimensional effective trophic niche space is related to its higher-dimensional phenotypic trait space. In this manner, we give ecological meaning to the notion of the "dimensionality of trophic niche space." Our framework implies a method for directly measuring this dimensionality. We suggest a procedure for estimating the relevant parameters from empirical data and for verifying that such data match the assumptions underlying our derivation. © Springer Science+Business Media B.V. 2009.
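The abstract does not give the derived expression, but a common concrete instance of trait-based trophic interaction coefficients, offered here only as an illustration, maps consumer foraging traits $\mathbf{f}_i$ and resource vulnerability traits $\mathbf{v}_j$ into a $D$-dimensional niche space through linear maps $F$ and $V$ and applies a Gaussian kernel:

$$a_{ij} \propto \exp\!\left(-\tfrac{1}{2}\,\lVert F\mathbf{f}_i - V\mathbf{v}_j\rVert^{2}\right).$$

On this reading, the effective dimensionality of trophic niche space is the number of independent trait combinations that actually enter $a_{ij}$, which can be far smaller than the dimension of the raw phenotypic trait space.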
Abstract:
A question central to modelling and, ultimately, managing food webs concerns the dimensionality of trophic niche space, that is, the number of independent traits relevant for determining consumer-resource links. Food-web topologies can often be interpreted by assuming resource traits to be specified by points along a line and each consumer's diet to be given by resources contained in an interval on this line. This phenomenon, called intervality, has been known for 30 years and is widely acknowledged to indicate that trophic niche space is close to one-dimensional. We show that the degrees of intervality observed in nature can be reproduced in arbitrary-dimensional trophic niche spaces, provided that the processes of evolutionary diversification and adaptation are taken into account. Contrary to expectations, intervality is least pronounced at intermediate dimensions and steadily improves towards lower- and higher-dimensional trophic niche spaces.
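Intervality is commonly quantified by counting "gaps" in consumers' diets once resources are placed on a line; a minimal sketch of one such diet-gap count follows. This is an illustrative measure, not necessarily the statistic used in the paper, and finding the gap-minimising resource ordering is a separate, harder problem:

```python
import numpy as np

def diet_gaps(diet, order):
    """Count gaps in consumers' diets for a given resource ordering.

    diet: (n_consumers, n_resources) boolean matrix; order: a permutation
    of resource indices. A perfectly interval food web has zero gaps.
    """
    d = diet[:, order]
    gaps = 0
    for row in d:
        idx = np.flatnonzero(row)
        if idx.size > 1:
            # Zeros lying inside the span of each consumer's diet
            gaps += (idx[-1] - idx[0] + 1) - idx.size
    return gaps
```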
Abstract:
We present high-accuracy calculations of ionization rates of helium at UV (195 nm) wavelengths. The data are obtained from full-dimensionality integrations of the helium-laser time-dependent Schrödinger equation. Comparison is made with our previously obtained data at 390 nm and 780 nm. We show that scaling laws introduced by Parker et al. extend unmodified from the near-infrared limit into the UV limit. Static-field ionization rates of helium are also obtained, again from time-dependent full-dimensionality integrations of the helium Schrödinger equation. We compare the static-field ionization results with those of Scrinzi et al. and Themelis et al., who also treat the full-dimensional helium atom, but with time-independent methods. Good agreement is obtained.
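One standard way to extract an ionization rate from a time-dependent integration, sketched here only as an illustration (the authors' actual procedure is not described in the abstract), is to fit the exponential decay of the remaining bound-state norm, N(t) ≈ exp(-Γt):

```python
import numpy as np

def ionization_rate(times, norms):
    """Estimate an ionization rate Gamma from the decay of the bound-state
    norm N(t) ~ exp(-Gamma * t), via a linear fit to log N(t)."""
    slope, _ = np.polyfit(times, np.log(norms), 1)
    return -slope
```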
Abstract:
We report full-dimensionality quantum and classical calculations of double ionization (DI) of laser-driven helium at 390 nm. Good agreement is observed. We identify the relative importance of the two main non-sequential DI pathways: the direct, with an almost simultaneous ejection of both electrons, and the delayed. We find that the delayed pathway prevails at low intensities independently of total electron energy, but at high intensities the direct pathway predominates up to a certain upper limit in total energy, which increases with intensity. An explanation for this increase with intensity is provided.
Abstract:
A time-dependent method for calculating the collective excitation frequencies and densities of a trapped, inhomogeneous Bose-Einstein condensate with circulation is presented. The results are compared with time-independent solutions of the Bogoliubov-de Gennes equations. The method is based on time-dependent linear-response theory combined with spectral analysis of moments of the excitation modes of interest. The technique is straightforward to apply, extremely efficient in our implementation with parallel fast Fourier transform methods, and produces highly accurate results. For high dimensionality or low symmetry the time-dependent approach is a more practical computational scheme and produces accurate and reliable data. The method is suitable for general trap geometries, condensate flows and condensates permeated with defects and vortex structures.
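The spectral-analysis-of-moments step can be sketched generically: record a moment such as ⟨x(t)⟩ after a weak perturbation of the condensate and read mode frequencies off its Fourier spectrum. A minimal illustration, with the window choice and peak picking as assumptions made here:

```python
import numpy as np

def excitation_frequencies(t, moment, n_peaks=3):
    """Locate collective-mode frequencies from the time series of a moment
    (e.g. <x(t)>) recorded after a weak perturbation."""
    dt = t[1] - t[0]
    signal = moment - moment.mean()          # remove the static offset
    window = np.hanning(len(signal))         # suppress spectral leakage
    spectrum = np.abs(np.fft.rfft(signal * window))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    peaks = np.argsort(spectrum)[-n_peaks:]  # strongest spectral lines
    return np.sort(freqs[peaks])
```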
Abstract:
Schizophrenia is a common psychotic mental disorder that is believed to result from the effects of multiple genetic and environmental factors. In this study, we explored gene-gene interactions and main effects in both case-control (657 cases and 411 controls) and family-based (273 families, 1350 subjects) datasets of English or Irish ancestry. Fifty-three markers in 8 genes were genotyped in the family sample and 44 markers in 7 genes were genotyped in the case-control sample. The Multifactor Dimensionality Reduction Pedigree Disequilibrium Test (MDR-PDT) was used to examine epistasis in the family dataset, and a 3-locus model was identified (permuted p=0.003). The 3-locus model involved the IL3 (rs2069803), RGS4 (rs2661319), and DTNBP1 (rs21319539) genes. We used MDR to analyze the case-control dataset containing the same markers typed in the RGS4, IL3 and DTNBP1 genes and found evidence of a joint effect between IL3 (rs31400) and DTNBP1 (rs760761) (cross-validation consistency 4/5, balanced prediction accuracy=56.84%, p=0.019). While this is not a direct replication, the results obtained from both the family and case-control samples collectively suggest that IL3 and DTNBP1 are likely to interact and jointly contribute to increased risk of schizophrenia. We also observed a significant main effect in DTNBP1, which survived correction for multiple comparisons, and numerous nominally significant effects in several genes. (C) 2008 Elsevier B.V. All rights reserved.
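The balanced prediction accuracy reported for the MDR result is conventionally the mean of sensitivity and specificity, which guards against case/control imbalance when scoring a multilocus model:

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Mean of sensitivity and specificity, as conventionally used to
    score MDR models on imbalanced case-control data."""
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))
```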
Abstract:
A new microfluidic-based approach to measuring liquid thermal conductivity is developed to address the requirement in many practical applications for measurements using small (microlitre) sample sizes and integration into a compact device. The approach also offers the possibility of high-throughput testing. A resistance heater and temperature sensor are incorporated into a glass microfluidic chip to allow transmission and detection of a planar thermal wave crossing a thin layer of the sample. The device is designed so that heat transfer is locally one-dimensional during a short initial time period. This allows the detected temperature transient to be separated into two distinct components: a short-time, purely one-dimensional part, and a remaining long-time part containing the effects of three-dimensionality and of the finite size of the surrounding thermal reservoirs. Identification of the one-dimensional component yields a steady temperature difference from which sample thermal conductivity can be determined. Calibration is required to give a correct representation of changing heater resistance, system layer thicknesses and solid material thermal conductivities with temperature. In this preliminary study, methanol/water mixtures are measured at atmospheric pressure over the temperature range 30-50 °C. The results show that the device achieves a measurement accuracy within 2.5% over the range of thermal conductivity and temperature of the tests. A relation between measurement uncertainty and the geometric and thermal properties of the system is derived, and this is used to identify ways that the error could be further reduced.
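Schematically, under the locally one-dimensional assumption the abstract describes, a planar sample layer of thickness $d$ carrying a steady heat flux $q''$ sustains a temperature difference

$$\Delta T = \frac{q''\,d}{k} \quad\Longrightarrow\quad k = \frac{q''\,d}{\Delta T},$$

which is why isolating the one-dimensional, steady component of the transient suffices to determine the sample conductivity $k$; the calibration mentioned above absorbs the remaining geometric and material unknowns.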
Abstract:
This article explores the ethics and aesthetics of representing travel and intercultural encounter in textual and photographic forms. Taking as its starting point two textual accounts of journeys in the course of which photographic narratives were also produced, the article explores the possibilities and limitations of textuality and visuality and thus considers the implications of, or new opportunities afforded by, reading – and ultimately publishing – these narratives as iconotexts. Focusing on Pierre Loti's L'Inde (sans les Anglais) (1901) and Ella Maillart's Oasis interdites (1937), the article also offers an alternative perspective on writers whose work is commonly associated with an imperialist or exoticist discourse, with cliché and one-dimensionality. As such, it aims to replace the monolithic, orientalist vision often attributed to these writers with ambiguity, ethical hesitation and a plurality of perspectives. Using these examples as a springboard, the article seeks to argue that verbal/visual mobility in narratives representing mobility contributes to resisting static, monolithic perceptions of other cultures. Using the work of British graffiti artist Banksy as a foil for exploring photography as cultural commodification and art as commodity, the article also seeks to engage with current debates in Humanities research on ekphrasis and iconotextuality and on the problematics of representing other cultures within an ethical and/or humanist frame.
Abstract:
Fuzzy-neural-network-based inference systems are well-known universal approximators which can produce linguistically interpretable results. Unfortunately, their dimensionality can be extremely high due to an excessive number of inputs and rules, which raises the need for overall structure optimization. In the literature, various input selection methods are available, but they are applied separately from rule selection, often without considering the fuzzy structure. This paper proposes an integrated framework to optimize the number of inputs and the number of rules simultaneously. First, a method is developed to select the most significant rules, along with a refinement stage to remove unnecessary correlations. An improved information criterion is then proposed to find an appropriate number of inputs and rules to include in the model, leading to a balanced tradeoff between interpretability and accuracy. Simulation results confirm the efficacy of the proposed method.
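The improved information criterion is not given in the abstract; as a generic illustration of the accuracy-complexity tradeoff it targets, an AIC-style criterion over the number of inputs and rules might look like the sketch below. The function name, the crude complexity count and the penalty weight are all assumptions, not the paper's criterion:

```python
import numpy as np

def structure_criterion(mse, n_samples, n_inputs, n_rules, penalty=2.0):
    """A generic AIC-style score trading fit quality against fuzzy-model
    complexity; smaller is better."""
    k = n_inputs + n_rules  # crude complexity count; an improved criterion
                            # would weight inputs and rules differently
    return n_samples * np.log(mse) + penalty * k
```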
Abstract:
Colour-based particle filters have been used extensively in the literature, giving rise to multiple applications. However, tracking coloured objects through time has an important drawback, since the way in which the camera perceives the colour of the object can change. Simple updates are often used to address this problem, which implies a risk of distorting the model and losing the target. In this paper, a joint image characteristic-space tracking is proposed, which updates the model simultaneously with the object location. In order to avoid the curse of dimensionality, a Rao-Blackwellised particle filter has been used. Using this technique, the hypotheses are evaluated depending on the difference between the model and the current target appearance during the updating stage. Convincing results have been obtained in sequences under both sudden and gradual illumination condition changes. Crown Copyright (C) 2010 Published by Elsevier B.V. All rights reserved.
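For context, the "simple update" the abstract warns about, and a model-target difference of the kind used to weight hypotheses, can be sketched with colour histograms. This is a minimal illustration; the Bhattacharyya similarity and the blending factor alpha are conventional choices, not the paper's joint scheme:

```python
import numpy as np

def bhattacharyya(p, q):
    """Similarity between a colour-histogram model p and a candidate region q."""
    return np.sum(np.sqrt(p * q))

def naive_update(model, observed, alpha=0.1):
    # The risky simple update: blending towards the current appearance can
    # slowly distort the model and lose the target, which motivates the
    # paper's joint model/location tracking.
    m = (1 - alpha) * model + alpha * observed
    return m / m.sum()
```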
Abstract:
In human motion analysis, the joint estimation of appearance, body pose and location parameters is not always tractable due to its huge computational cost. In this paper, we propose a Rao-Blackwellized Particle Filter for addressing the problem of human pose estimation and tracking. The advantage of the proposed approach is that Rao-Blackwellization allows the state variables to be split into two sets, one of them calculated analytically from the posterior probability of the remaining ones. This procedure reduces the dimensionality of the Particle Filter, thus requiring fewer particles to achieve a similar tracking performance. In this manner, location and size over the image are obtained stochastically using colour and motion cues, whereas body pose is solved analytically by applying learned human Point Distribution Models.
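A minimal sketch of one Rao-Blackwellised filtering step under the split described above: the location/size part of the state is sampled with particles, while the pose part is obtained analytically per particle. Here `observe_loc` (a colour/motion likelihood) and `solve_pose` (e.g. a Point Distribution Model projection) are user-supplied placeholders, not the paper's components:

```python
import numpy as np

def rbpf_step(particles, weights, observe_loc, solve_pose, noise=1.0):
    """One Rao-Blackwellised particle-filter step: sample the location/size
    state, weight by appearance likelihood, and solve the pose analytically
    per particle, so particles never have to cover the pose dimensions."""
    n = len(particles)
    # Stochastic part: propagate location/size particles with a random walk
    particles = [p + np.random.normal(0.0, noise, size=p.shape) for p in particles]
    # Weight each hypothesis by its colour/motion likelihood
    weights = np.array([w * observe_loc(p) for w, p in zip(weights, particles)])
    weights /= weights.sum()
    # Analytic (Rao-Blackwellised) part: pose computed, not sampled
    poses = [solve_pose(p) for p in particles]
    # Resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * n:
        idx = np.random.choice(n, size=n, p=weights)
        particles = [particles[i] for i in idx]
        poses = [poses[i] for i in idx]
        weights = np.full(n, 1.0 / n)
    return particles, poses, weights
```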