963 results for INVARIANT SUBSPACES
Abstract:
Technical efficiency is estimated and examined for a cross-section of Australian dairy farms using various frontier methodologies: Bayesian and classical stochastic frontiers, and Data Envelopment Analysis. The results indicate technical inefficiency is present in the sample data. Also identified are statistical differences between the point estimates of technical efficiency generated by the various methodologies. However, the ranking of farm-level technical efficiency is statistically invariant to the estimation technique employed. Finally, when confidence/credible intervals of technical efficiency are compared, significant overlap is found for many of the farms' intervals across all frontier methods employed. The results indicate that the choice of estimation methodology may matter, but the explanatory power of all frontier methods is significantly weaker when interval estimates of technical efficiency are examined.
Abstract:
Background: Variation in carrying capacity and population return rates is generally ignored in traditional studies of population dynamics. Variation is hard to study in the field because of difficulties controlling the environment in order to obtain statistical replicates, and because of the scale and expense of experimenting on populations. There may also be ethical issues. To circumvent these problems we used detailed simulations of the simultaneous behaviours of interacting animals in an accurate facsimile of a real Danish landscape. The models incorporate as much as possible of the behaviour and ecology of skylarks Alauda arvensis, voles Microtus agrestis, a ground beetle Bembidion lampros and a linyphiid spider Erigone atra. This allows us to quantify and evaluate the influence of spatial and temporal heterogeneity on the population dynamics of the four species. Results: Both spatial and temporal heterogeneity affected the relationship between population growth rate and population density in all four species. Spatial heterogeneity accounted for 23–30% of the variance in population growth rate after accounting for the effects of density, reflecting large differences in local carrying capacity associated with the landscape features important to individual species. Temporal heterogeneity accounted for 3–13% of the variance in voles, skylarks and spiders, but 43% in beetles. The associated temporal variation in carrying capacity would be problematic in traditional analyses of density dependence. Return rates were less than one in all species and essentially invariant in skylarks, spiders and beetles. Return rates varied over the landscape in voles, being slower where there were larger fluctuations in local population sizes. Conclusion: Our analyses estimated the traditional parameters of carrying capacities and return rates, but these are now seen as varying continuously over the landscape depending on habitat quality and the mechanisms of density dependence.
The importance of our results lies in our demonstration that the effects of spatial and temporal heterogeneity must be accounted for if we are to have accurate predictive models for use in management and conservation. This is an area which until now has lacked an adequate theoretical framework and methodology.
Abstract:
Background: Transcriptomic techniques are now being applied in ecotoxicology and toxicology to measure the impact of stressors and to develop understanding of mechanisms of toxicity. Microarray technology in particular offers the potential to measure thousands of gene responses simultaneously. However, it is important that microarray responses be validated, at least initially, using real-time quantitative polymerase chain reaction (QPCR). The accurate measurement of target gene expression requires normalisation to an invariant internal control, e.g., total RNA or reference genes. Reference genes are preferable, as they control for variation inherent in the cDNA synthesis and PCR. However, reference gene expression can vary between tissues and experimental conditions, which makes it crucial to validate them prior to application. Results: We evaluated 10 candidate reference genes for QPCR in Daphnia magna following a 24 h exposure to the non-steroidal anti-inflammatory drug (NSAID) ibuprofen (IB) at 0, 20, 40 and 80 mg IB l⁻¹. Six of the 10 candidates appeared suitable for use as reference genes. As a robust approach, we used a combination normalisation factor (NF), calculated using the geNorm application, based on the geometric mean of three selected reference genes: glyceraldehyde-3-phosphate dehydrogenase, ubiquitin conjugating enzyme and actin. The effects of normalisation are illustrated using the target gene leukotriene B4 12-hydroxydehydrogenase (Ltb4dh), which was upregulated following 24 h exposure to 63-81 mg IB l⁻¹. Conclusions: As anticipated, use of the NF clarified the response of Ltb4dh in daphnids exposed to sublethal levels of ibuprofen. Our findings emphasise the importance in toxicogenomics of finding and applying invariant internal QPCR controls relevant to the study conditions.
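The combination NF described above is simply the geometric mean of the selected reference-gene quantities, with the target gene divided by it. A minimal Python sketch of that calculation; the function names and relative-quantity values are ours, purely hypothetical, and not taken from geNorm:

```python
import math

def normalisation_factor(ref_quantities):
    """Combination NF: geometric mean of the reference-gene quantities
    measured for one sample (the geNorm-style NF described above)."""
    return math.prod(ref_quantities) ** (1.0 / len(ref_quantities))

def normalise(target_quantity, ref_quantities):
    """Normalised target-gene expression: the target quantity divided by the NF."""
    return target_quantity / normalisation_factor(ref_quantities)

# hypothetical relative quantities for one sample: GAPDH, UBC and actin
refs = [1.20, 0.85, 1.10]
ltb4dh_normalised = normalise(2.4, refs)
```

Using the geometric rather than arithmetic mean damps the influence of any one reference gene drifting under treatment, which is the rationale behind multi-gene NFs.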
Abstract:
We investigated patterns of bryophyte species richness and community structure, and their relation to roof variables, on thatched roofs of the Holnicote Estate, South Somerset. Thirty-two bryophyte species were recorded from 28 sampled roofs, including the globally rare and endangered thatch moss, Leptodontium gemmascens. Multiple regression analyses revealed that thatch age has a highly significant positive effect on the number of species present, accounting for nearly half the observed variation in species richness after removal of outliers. Aspect has a slight and marginally significant effect on species diversity (accounting for an additional 6% of variation), with north-facing samples having slightly more species. Age also has a significant impact on total bryophyte cover after removal of outlying observations. TWINSPAN analysis of bryophyte cover data suggests the existence of at least five discrete communities. Simple Discriminant Analyses indicate that these communities occupy different ecological subspaces as defined by the measured roof variables, with pitch, aspect and thatch age emerging as especially significant attributes. Contingency Analysis indicates that some communities are disfavoured by water reed as compared to wheat straw. The findings are significant for understanding the structure of bryophyte communities, for evaluating the effect of bryophyte cover on thatch performance, and for conservation of thatch communities, especially those harbouring rare species.
Abstract:
This paper highlights the key role played by solubility in influencing gelation and demonstrates that many facets of the gelation process depend on this vital parameter. In particular, we relate the thermal stability (T-gel) and minimum gelation concentration (MGC) values of small-molecule gelation to the solubility and cooperative self-assembly of gelator building blocks. By employing a van't Hoff analysis of solubility data, determined from simple NMR measurements, we are able to generate T-calc values that reflect the calculated temperature for complete solubilization of the networked gelator. The concentration dependence of T-calc allows the previously difficult-to-rationalize "plateau-region" thermal stability values to be elucidated in terms of gelator molecular design. This is demonstrated for a family of four gelators with lysine units attached to each end of an aliphatic diamine, with different peripheral groups (Z or Boc) in different locations on the periphery of the molecule. By tuning the peripheral protecting groups of the gelators, the solubility of the system is modified, which in turn controls the saturation point of the system and hence the concentration at which network formation takes place. We report that the critical concentration (C-crit) of gelator incorporated into the solid-phase sample-spanning network within the gel is invariant to gelator structural design. However, because some systems have higher solubilities, they are less effective gelators and require higher total concentrations to achieve gelation, hence shedding light on the role of the MGC parameter in gelation. Furthermore, gelator structural design also modulates the level of cooperative self-assembly through solubility effects, as determined by applying a cooperative binding model to NMR data.
Finally, the effect of gelator chemical design on the spatial organization of the networked gelator was probed by small-angle neutron and X-ray scattering (SANS/SAXS) on the native gel, and a tentative self-assembly model was proposed.
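The van't Hoff analysis behind T-calc can be sketched numerically: fit ln(solubility) against 1/T to extract a dissolution enthalpy and entropy, then invert the relation to find the temperature at which a given total gelator concentration is fully soluble. A Python sketch under that standard van't Hoff assumption; all numerical values below are hypothetical, not data from the paper:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def vant_hoff_fit(temps_K, solubilities):
    """Least-squares fit of ln(s) = -dH/(R*T) + dS/R on (1/T, ln s).
    Returns (dH, dS): dissolution enthalpy (J/mol) and entropy (J/(mol K))."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(s) for s in solubilities]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return -slope * R, intercept * R

def t_calc(dH, dS, total_conc):
    """Temperature at which 'total_conc' of gelator is fully solubilized:
    solve ln(c) = -dH/(R*T) + dS/R for T."""
    return dH / (dS - R * math.log(total_conc))

# hypothetical NMR-derived solubilities (mole fraction) at four temperatures
temps = [290.0, 300.0, 310.0, 320.0]
sols = [1.0e-3, 2.2e-3, 4.5e-3, 8.8e-3]
dH, dS = vant_hoff_fit(temps, sols)
```

Because t_calc grows only logarithmically with concentration, the curve flattens at high concentration, which is consistent with the "plateau-region" behaviour discussed above.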
Abstract:
A full dimensional, ab initio-based semiglobal potential energy surface for C2H3+ is reported. The ab initio electronic energies for this molecule are calculated using the spin-restricted, coupled cluster method restricted to single and double excitations with triples corrections [RCCSD(T)]. The RCCSD(T) method is used with the correlation-consistent polarized valence triple-zeta basis augmented with diffuse functions (aug-cc-pVTZ). The ab initio potential energy surface is represented by a many-body (cluster) expansion, each term of which uses functions that are fully invariant under permutations of like nuclei. The fitted potential energy surface is validated by comparing normal mode frequencies at the global minimum and secondary minimum with previous and new direct ab initio frequencies. The potential surface is used in vibrational analysis using the "single-reference" and "reaction-path" versions of the code MULTIMODE. (c) 2006 American Institute of Physics.
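The requirement that the fitting functions be invariant under permutations of like nuclei can be illustrated with a toy symmetrization: average a monomial in pairwise Morse-like variables over all relabellings of the identical atoms. This is only a schematic Python illustration of the idea, not the actual invariant-polynomial basis used for C2H3+:

```python
import itertools
import math

def morse_variables(coords, alpha=1.0):
    """Pairwise Morse-like variables y_ij = exp(-alpha * r_ij)."""
    n = len(coords)
    return {(i, j): math.exp(-alpha * math.dist(coords[i], coords[j]))
            for i in range(n) for j in range(i + 1, n)}

def symmetrized(coords, exponents, like, alpha=1.0):
    """Average the monomial prod_ij y_ij**b_ij over all permutations of the
    atom indices in 'like', giving a basis function invariant under exchange
    of those identical nuclei. 'exponents' maps pairs (i, j) to powers b_ij."""
    y = morse_variables(coords, alpha)
    perms = list(itertools.permutations(like))
    total = 0.0
    for p in perms:
        relabel = dict(zip(like, p))
        term = 1.0
        for (i, j), b in exponents.items():
            ii, jj = relabel.get(i, i), relabel.get(j, j)
            term *= y[(min(ii, jj), max(ii, jj))] ** b
        total += term
    return total / len(perms)
```

Swapping the coordinates of any two atoms listed in `like` leaves the value unchanged, which is the invariance property the fitted surface must respect.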
Abstract:
Earlier studies showed that the disparity with respect to other visible points could not explain stereoacuity performance, nor could various spatial derivatives of disparity [Glennerster, A., McKee, S. P., & Birch, M. D. (2002). Evidence of surface-based processing of binocular disparity. Current Biology, 12:825-828; Petrov, Y., & Glennerster, A. (2004). The role of the local reference in stereoscopic detection of depth relief. Vision Research, 44:367-376]. Two possible cues remain: (i) local changes in disparity gradient or (ii) disparity with respect to an interpolated line drawn through the reference points. Here, we aimed to distinguish between these two cues. Subjects judged, in a two-AFC paradigm, whether a target dot was in front of a plane defined by three reference dots or, in other experiments, in front of a line defined by two reference dots. We tested different slants of the reference line or plane and different locations of the target relative to the reference points. For slanted reference lines or planes, stereoacuity changed little as the target position was varied. For judgments relative to a frontoparallel reference line, stereoacuity did vary with target position, but less than would be predicted by disparity gradient change. This provides evidence that disparity with respect to the reference plane is an important cue. We discuss the potential advantages of this measure in generating a representation of surface relief that is invariant to viewpoint transformations. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
This paper proposes a novel method of authentication of users in secure buildings. The main objective is to investigate whether user actions in the built environment can produce consistent behavioural signatures upon which a building intrusion detection system could be based. In the process three behavioural expressions were discovered: time-invariant, co-dependent and idiosyncratic.
Abstract:
This paper considers left-invariant control systems defined on the orthonormal frame bundles of simply connected manifolds of constant sectional curvature, namely the space forms Euclidean space E-3, the sphere S-3 and the hyperboloid H-3, with the corresponding frame bundles equal to the Euclidean group of motions SE(3), the rotation group SO(4) and the Lorentz group SO(1, 3). Orthonormal frame bundles of space forms coincide with their isometry groups and therefore the focus shifts to left-invariant control systems defined on Lie groups. In this paper a method for integrating these systems is given where the controls are time-independent. In the Euclidean case the elements of the Lie algebra se(3) are often referred to as twists. For constant twist motions, the corresponding curves g(t) ∈ SE(3) are known as screw motions, given in closed form by the well-known Rodrigues formula. However, this formula is only applicable to the Euclidean case. This paper gives a method for computing the non-Euclidean screw motions in closed form. This involves decoupling the system into two lower-dimensional systems using the double cover properties of Lie groups; the lower-dimensional systems are then solved explicitly in closed form.
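For the Euclidean case mentioned above, the closed-form screw motion generated by a constant twist (w, v) in se(3) can be sketched directly from the Rodrigues formula. A Python/NumPy sketch assuming a non-zero angular part; the variable names are ours:

```python
import numpy as np

def hat(w):
    """Skew-symmetric (so(3)) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def screw_motion(w, v, t):
    """Closed-form exponential of the constant twist (w, v) in se(3),
    returning the homogeneous transform g(t) in SE(3).
    Assumes |w| > 0 (a genuine screw, not a pure translation)."""
    theta = np.linalg.norm(w) * t
    axis = w / np.linalg.norm(w)
    K = hat(axis)
    # Rodrigues formula for the rotation part
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K
    # companion integral giving the translation part
    V = (np.eye(3) * theta + (1.0 - np.cos(theta)) * K
         + (theta - np.sin(theta)) * K @ K) / np.linalg.norm(w)
    g = np.eye(4)
    g[:3, :3] = R
    g[:3, 3] = V @ v
    return g
```

For a quarter turn about the z-axis with forward velocity along x, the formula traces the expected helical arc, rotating the frame while advancing it along the screw axis.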
Abstract:
This paper considers left-invariant control systems defined on the Lie groups SU(2) and SO(3). Such systems have a number of applications in both classical and quantum control problems. The purpose of this paper is two-fold. Firstly, the optimal control problem for a system varying on these Lie groups, with cost that is quadratic in control, is lifted to their Hamiltonian vector fields through the Maximum Principle of optimal control and explicitly solved. Secondly, the control systems are integrated down to the level of the group to give the solutions for the optimal paths corresponding to the optimal controls. In addition, it is shown here that integrating these equations on the Lie algebra su(2) gives simpler solutions than when they are integrated on the Lie algebra so(3).
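The double-cover relationship that makes working on su(2) convenient can be illustrated directly: exponentiate a constant control on su(2) in closed form, then project the resulting SU(2) curve down to SO(3) via the 2-to-1 covering map. A Python sketch of this standard construction, not the paper's own derivation:

```python
import numpy as np

# Pauli matrices, whose span times -i/2 gives a basis of su(2)
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

def su2_exp(w):
    """Closed-form exponential of the su(2) element -(i/2)(w1 s1 + w2 s2 + w3 s3),
    i.e. the SU(2) motion generated by a constant control w."""
    theta = np.linalg.norm(w)
    if theta == 0.0:
        return np.eye(2, dtype=complex)
    n = w / theta
    S = n[0] * s1 + n[1] * s2 + n[2] * s3
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * S

def su2_to_so3(U):
    """2-to-1 covering map SU(2) -> SO(3): R[i,j] = (1/2) tr(s_i U s_j U^dagger)."""
    s = [s1, s2, s3]
    return np.array([[0.5 * np.trace(s[i] @ U @ s[j] @ U.conj().T).real
                      for j in range(3)] for i in range(3)])
```

Note that U and -U project to the same rotation, which is exactly the double-cover property exploited when integrating on su(2) instead of so(3).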
Abstract:
This paper tackles the path planning problem for oriented vehicles travelling in the non-Euclidean 3-dimensional space, the spherical space S-3. For this problem, the orientation of the vehicle is naturally represented by the orthonormal frame bundle, the rotation group SO(4). Orthonormal frame bundles of space forms coincide with their isometry groups and therefore the focus shifts to control systems defined on Lie groups. The oriented vehicles, in this case, are constrained to travel at constant speed in a forward direction, with their angular velocities directly controlled. In this paper we identify controls that induce steady motions of these oriented vehicles and yield closed-form parametric expressions for these motions. The paths these vehicles trace are defined explicitly in terms of the controls and are therefore invariant with respect to the coordinate system used to describe the motion.
Abstract:
This paper considers the motion planning problem for oriented vehicles travelling at unit speed in a 3-D space. A Lie group formulation arises naturally and the vehicles are modeled as kinematic control systems with drift defined on the orthonormal frame bundles of particular Riemannian manifolds, specifically the 3-D space forms Euclidean space E-3, the sphere S-3, and the hyperboloid H-3. The corresponding frame bundles are equal to the Euclidean group of motions SE(3), the rotation group SO(4), and the Lorentz group SO(1, 3). The maximum principle of optimal control shifts the emphasis for these systems to the associated Hamiltonian formalism. For an integrable case, the extremal curves are explicitly expressed in terms of elliptic functions. In this paper, a study of the singularities of the extremal curves is given; these singularities correspond to critical points of the elliptic functions. The extremal curves are characterized as the intersections of invariant surfaces and are illustrated graphically at the singular points. It is then shown that, at these singular points, the projections of the extremals onto the base space, called elastica, are curves of constant curvature and torsion, which in turn implies that the oriented vehicles trace helices.
Abstract:
In this paper we deal with the performance analysis of Monte Carlo algorithms for large linear algebra problems. We consider the applicability and efficiency of Markov chain Monte Carlo for large problems, i.e., problems involving matrices with a number of non-zero elements ranging between one million and one billion. We concentrate on the analysis of the Almost Optimal Monte Carlo (MAO) algorithm for evaluating bilinear forms of matrix powers, since these form the so-called Krylov subspaces. Results are presented comparing the performance of the Robust and Non-robust Monte Carlo algorithms. The algorithms are tested on large dense matrices as well as on large unstructured sparse matrices.
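The bilinear forms in question are quantities of the type v^T A^k h, which Markov chain Monte Carlo methods estimate by averaging weighted random walks rather than forming A^k. The following Python sketch illustrates the general importance-sampling idea behind such estimators; it is a schematic unbiased estimator with walk lengths and weights of our choosing, not the MAO algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_bilinear_form(A, v, h, k, n_chains=20000):
    """Monte Carlo estimate of v^T A^k h via length-k Markov chains.
    Walks start from a state drawn proportionally to |v| and step with
    probabilities proportional to |A| along each row (rows assumed non-zero);
    weights correct for the sampling so the estimator is unbiased."""
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()
    row_abs = np.abs(A)
    P = row_abs / row_abs.sum(axis=1)[:, None]  # transition probabilities
    total = 0.0
    for _ in range(n_chains):
        i = rng.choice(n, p=p0)
        weight = v[i] / p0[i]
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            weight *= A[i, j] / P[i, j]
            i = j
        total += weight * h[i]
    return total / n_chains
```

Each chain touches only k matrix rows, which is why such estimators remain feasible when A has up to a billion non-zero entries and direct powers of A do not.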
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level.
This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, whereby it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
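The orthogonal decomposition at the heart of such algorithms can be illustrated with plain (unextended) Gram-Schmidt orthogonal least squares: factor the regression matrix into orthogonal columns and a unit upper-triangular matrix, project the output onto each orthogonal direction, and back-substitute for the parameters. A Python sketch of this classical baseline, not the paper's extended algorithm:

```python
import numpy as np

def gram_schmidt_ols(Phi, y):
    """Orthogonal least squares via classical Gram-Schmidt.
    Decomposes Phi = W A, with W having mutually orthogonal columns and A
    unit upper triangular, then solves A theta = g by back-substitution.
    Returns (theta, W, g); the g_k are the projections of y onto each
    orthogonal direction, whose 'energy' g_k^2 w_k^T w_k can be inspected."""
    n, m = Phi.shape
    W = Phi.astype(float).copy()
    A = np.eye(m)
    for k in range(m):
        for j in range(k):
            A[j, k] = (W[:, j] @ Phi[:, k]) / (W[:, j] @ W[:, j])
            W[:, k] -= A[j, k] * W[:, j]
    g = np.array([(W[:, k] @ y) / (W[:, k] @ W[:, k]) for k in range(m)])
    theta = np.linalg.solve(A, g)  # A is unit upper triangular
    return theta, W, g
```

Because the columns of W are orthogonal, each regressor's contribution to the output variance decouples, which is what makes subspace-by-subspace (or rule-by-rule) selection and interpretation possible.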
Abstract:
A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximized model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. A locally regularized orthogonal least squares algorithm, combined with D-optimality for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.