Abstract:
Submarine cliffs are typically crowded with sessile organisms, most of which are ultimately exported downwards. Here we report a 24-month study of benthic fauna dropping from such cliffs at sites of differing cliff angle and flow rates at Lough Hyne Marine Nature Reserve, Co. Cork, Ireland. The magnitude of 'fall out' material collected in capture nets was highly seasonal and composed of sessile and mobile elements. Sponges, ascidians, cnidarians, polychaetes, bryozoans and barnacles dominated the sessile forms. The remainder (mobile fauna) were scavengers and predators such as asteroid echinoderms, gastropod molluscs and malacostracan crustaceans. These were probably migrants targeting fallen sessile organisms. 'Fall out' material (including mobile forms) increased between May and August in both years. This increase in 'fall out' material was correlated with wrasse abundance at the cliffs (with a one month lag period). The activities of the wrasse on the cliffs (feeding, nest building and territory defence) were considered responsible for the majority of 'fall out' material, with natural mortality and the activity of other large mobile organisms (e.g. crustaceans) also being implicated. Current flow rate and cliff profile were important in determining the amount of 'fall out' material collected. In low current situations export of fallen material was vertical, while both horizontal and vertical export was associated with moderate to high current environments. Higher 'fall out' was associated with overhanging than with vertical cliff surfaces. The 'fall out' of marine organisms in low current situations is likely to provide an important source of nutrition in close proximity to the cliff, in an otherwise impoverished soft sediment habitat. However, in high current areas material will be exported some distance from the source, with final settlement again occurring in soft sediment habitats (as current speed decreases).
Abstract:
A 'mapping task' was used to explore the networks available to head teachers, school coordinators and local authority staff. Beginning from an ego-centred perspective on networks, we illustrate a number of key analytic categories, including brokerage, formality, and strength and weakness of links with reference to a single UK primary school. We describe how teachers differentiate between the strength of network links and their value, which is characteristically related to their potential impact on classroom practice.
Abstract:
The coding of body part location may depend upon both visual and proprioceptive information, and allows targets to be localized with respect to the body. The present study investigates the interaction between visual and proprioceptive localization systems under conditions of multisensory conflict induced by optokinetic stimulation (OKS). Healthy subjects were asked to estimate the apparent motion speed of a visual target (LED) that could be located either in the extrapersonal space (visual encoding only, V), or at the same distance, but stuck on the subject's right index finger-tip (visual and proprioceptive encoding, V-P). Additionally, the multisensory condition was performed with the index finger kept in position both passively (V-P passive) and actively (V-P active). Results showed that the visual stimulus was always perceived to move, irrespective of its out- or on-the-body location. Moreover, this apparent motion speed varied consistently with the speed of the moving OKS background in all conditions. Surprisingly, no differences were found between V-P active and V-P passive conditions in the speed of apparent motion. The persistence of the visual illusion during the active posture maintenance reveals a novel condition in which vision totally dominates over proprioceptive information, suggesting that the hand-held visual stimulus was perceived as a purely visual, external object despite its contact with the hand.
Abstract:
This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic and the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify a critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with accuracy comparable to that of the full-sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
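As an illustrative aside (not the authors' exact construction), the following Python sketch contrasts the full-sample Parzen window estimate with a greedy, equal-weight sparse kernel density whose centres are added one at a time to improve a leave-one-out log-likelihood. The Gaussian kernel, fixed bandwidth h, and the simple greedy loop are assumptions made for the sketch only.

```python
# Minimal sketch: Parzen window baseline vs. a sparse kernel density built by
# forward selection of centres under a leave-one-out (LOO) log-likelihood.
import numpy as np

def gauss(x, c, h):
    # Isotropic Gaussian kernel of bandwidth h centred at c.
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

def parzen(x, samples, h):
    # Full-sample Parzen window estimate: one kernel per training point.
    return np.mean([gauss(x, s, h) for s in samples], axis=0)

def loo_loglik(samples, centre_idx, h):
    # LOO log-likelihood of an equal-weight mixture over the chosen centres;
    # a sample that is itself a centre is scored without its own kernel.
    ll = 0.0
    for i, xi in enumerate(samples):
        idx = [j for j in centre_idx if j != i] or centre_idx
        ll += np.log(np.mean([gauss(xi, samples[j], h) for j in idx]) + 1e-300)
    return ll

def greedy_sparse_kde(samples, h, max_kernels=6):
    # Forward selection: repeatedly add the candidate centre that most
    # improves the LOO score; stop when no candidate improves it.
    chosen, best = [], -np.inf
    while len(chosen) < max_kernels:
        cand = [(loo_loglik(samples, chosen + [j], h), j)
                for j in range(len(samples)) if j not in chosen]
        score, j = max(cand)
        if score <= best:
            break
        chosen.append(j)
        best = score
    return [samples[j] for j in chosen]

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 0.5, 40), rng.normal(2, 0.5, 40)])
centres = greedy_sparse_kde(data, h=0.5)
print(len(centres), "kernels instead of", len(data))
```

The stopping rule here is the natural one implied by the abstract: construction terminates when the generalization criterion itself stops improving, so no separate termination parameter is needed.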
Abstract:
We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. In order to optimise the classifier's generalisation capability, an orthogonal forward selection procedure is used to select kernels one by one by minimising the leave-one-out (LOO) misclassification rate directly. It is shown that the computation of the LOO misclassification rate is very efficient owing to orthogonalisation. Examples are used to demonstrate that the proposed algorithm is a viable alternative for constructing sparse two-class kernel classifiers in terms of performance and computational efficiency.
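To make the criterion concrete, the sketch below computes the LOO misclassification rate for a least-squares kernel classifier the brute-force way, by refitting with each point held out. The abstract's contribution is that orthogonalisation makes this quantity cheap to evaluate during kernel selection; the direct loop here only defines what is being minimised. The RBF kernel, ridge term and randomly chosen centre subset are assumptions of the sketch.

```python
# Brute-force LOO misclassification rate of a least-squares RBF classifier.
import numpy as np

def rbf_design(X, centres, width=1.0):
    # Design matrix of Gaussian (RBF) kernels evaluated at the centres.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_misclassification(X, y, centres, width=1.0, ridge=1e-6):
    # y holds class labels coded as +1 / -1; each point is held out in turn.
    errors, n = 0, len(y)
    for i in range(n):
        keep = np.arange(n) != i
        P = rbf_design(X[keep], centres, width)
        w = np.linalg.solve(P.T @ P + ridge * np.eye(P.shape[1]), P.T @ y[keep])
        pred = np.sign(rbf_design(X[i:i + 1], centres, width) @ w)
        errors += pred[0] != y[i]
    return errors / n

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
centres = X[rng.choice(80, size=10, replace=False)]  # a small kernel subset
print("LOO error:", loo_misclassification(X, y, centres))
```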
Abstract:
We propose a simple and computationally efficient construction algorithm for two-class linear-in-the-parameters classifiers. In order to optimize model generalization, an orthogonal forward selection (OFS) procedure is used for minimizing the leave-one-out (LOO) misclassification rate directly. An analytic formula and a set of forward recursive updating formulae for the LOO misclassification rate are developed and applied in the proposed algorithm. Numerical examples are used to demonstrate that the proposed algorithm is an excellent alternative approach for constructing sparse two-class classifiers in terms of performance and computational efficiency.
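For a linear-in-the-parameters model trained by plain least squares, there is a standard hat-matrix identity that yields every LOO prediction from a single fit, which is the kind of analytic shortcut the abstract refers to (the paper's own recursive formulae embed it in an orthogonal forward selection, not reproduced here). A minimal sketch under those assumptions:

```python
# Analytic LOO for least squares: y_loo_i = (y_hat_i - h_ii * y_i) / (1 - h_ii),
# where h_ii is the i-th diagonal entry of the hat matrix.  No refitting needed.
import numpy as np

def analytic_loo_misclassification(P, y):
    # P: n x m design matrix (e.g. kernel columns), y: labels in {+1, -1}.
    pinv = np.linalg.pinv(P)
    y_hat = P @ (pinv @ y)               # in-sample predictions
    h = np.einsum('ij,ji->i', P, pinv)   # diagonal of the hat matrix
    y_loo = (y_hat - h * y) / (1.0 - h)  # LOO predictions from one fit
    return np.mean(np.sign(y_loo) != y)

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] - 0.5 * X[:, 1] > 0, 1.0, -1.0)
P = np.column_stack([np.ones(60), X])    # a simple linear-in-the-parameters model
print("analytic LOO error:", analytic_loo_misclassification(P, y))
```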
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors by choosing amongst different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, either the mean square of the LOO errors or the LOO misclassification rate respectively, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, as derived in this contribution, without actually splitting the estimation data set, so as to reduce computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several aspects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification are used to demonstrate that the proposed algorithms are viable post-processing methods to prune a model to gain extra sparsity and improved generalization.
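A minimal sketch of the regression variant, under the assumption of plain least squares: backward elimination driven by the analytic LOO mean-squared error (the PRESS statistic, with LOO residuals e_i / (1 - h_ii)). The regressor whose deletion most reduces the LOO error is pruned, and pruning stops when no deletion improves it. Unlike the paper's recursive updating formulae, each candidate model here is simply refitted, so this illustrates the criterion and stopping rule rather than the efficient algorithm.

```python
# Backward elimination guided by the analytic LOO (PRESS) mean-squared error.
import numpy as np

def loo_mse(P, y):
    # Analytic LOO residuals for least squares: e_loo_i = e_i / (1 - h_ii).
    pinv = np.linalg.pinv(P)
    resid = y - P @ (pinv @ y)
    h = np.einsum('ij,ji->i', P, pinv)   # hat-matrix diagonal
    return np.mean((resid / (1.0 - h)) ** 2)

def backward_eliminate(P, y):
    cols = list(range(P.shape[1]))
    best = loo_mse(P, y)
    while len(cols) > 1:
        scores = [(loo_mse(P[:, [c for c in cols if c != k]], y), k) for k in cols]
        score, k = min(scores)
        if score >= best:
            break                         # further pruning would hurt LOO error
        cols.remove(k)
        best = score
    return cols, best

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 8))
y = 2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=100)  # only two useful terms
cols, err = backward_eliminate(X, y)
print("kept columns:", cols, "LOO MSE:", round(err, 4))
```

Because the stopping rule is the LOO criterion itself, the procedure needs no held-out validation set and no extra stopping parameter, matching properties (i) and (ii) claimed in the abstract.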
Abstract:
Over the last decade, there has been an increasing body of work that explores whether sensory and motor information is a necessary part of semantic representation and processing. This is the embodiment hypothesis. This paper presents a theoretical review of this work that is intended to be useful for researchers in the neurosciences and neuropsychology. Beginning with a historical perspective, relevant theories are placed on a continuum from strongly embodied to completely unembodied representations. Predictions are derived and neuroscientific and neuropsychological evidence that could support different theories is reviewed; finally, criticisms of embodiment are discussed. We conclude that strongly embodied and completely disembodied theories are not supported, and that the remaining theories agree that semantic representation involves some form of Convergence Zones (Damasio, 1989) and the activation of modal content. For the future, research must carefully define the boundaries of semantic processing and tackle the representation of abstract entities.
Abstract:
This essay explores how The Truman Show, Peter Weir’s film about a television show, deserves more sustained analysis than it has received since its release in 1998. I will argue that The Truman Show problematizes the binary oppositions of cinema/television, disruption/stability, reality/simulation and outside/inside that structure it. The Truman Show proposes that binary oppositions such as outside/inside exist in a mutually implicating relationship. This deconstructionist strategy not only questions the film’s critical position, but also enables a reflection on the very status of film analysis itself.