15 results for ONE-COMPONENT

in Aston University Research Archive


Relevance: 60.00%

Abstract:

The effects of attentional modulation on activity within the human visual cortex were investigated using magnetoencephalography. Chromatic sinusoidal stimuli were used to evoke activity from the occipital cortex, with attention directed either toward or away from the stimulus using a bar-orientation judgment task. For five observers, global magnetic field power was plotted as a function of time from stimulus onset. The major peak of each function occurred at about 120 ms latency and was well modeled by a current dipole near the calcarine sulcus. Independent component analysis (ICA) on the non-averaged data for each observer also revealed one component of calcarine origin, the location of which matched that of the dipolar source determined from the averaged data. For two observers, ICA revealed a second component near the parieto-occipital sulcus. Although no effects of attention were evident using standard averaging procedures, time-varying spectral analyses of single trials revealed that the main effect of attention was to alter the level of oscillatory activity. Most notably, a sustained increase in alpha-band (7-12 Hz) activity of both calcarine and parieto-occipital origin was evident. In addition, calcarine activity in the range of 13-21 Hz was enhanced, while calcarine activity in the range of 5-6 Hz was reduced. Our results are consistent with the hypothesis that attentional modulation affects neural processing within the calcarine and parieto-occipital cortex by altering the amplitude of alpha-band activity and other natural brain rhythms. © 2003 Elsevier Inc. All rights reserved.
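The time-varying spectral measure described above can be sketched in a few lines: band-limited power is the sum of squared Fourier coefficients within a frequency range. This is a minimal numpy illustration on a synthetic one-second "trial" with an embedded 10 Hz (alpha-band) oscillation, not MEG data; the sampling rate and noise level are assumed.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power in the [f_lo, f_hi] Hz band from the FFT of one signal segment."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].sum()

# Synthetic single trial: 1 s at an assumed 600 Hz sampling rate,
# with a 10 Hz (alpha-band) component buried in noise
fs = 600
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
trial = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(fs)

alpha = band_power(trial, fs, 7, 12)    # alpha band, as in the abstract
beta = band_power(trial, fs, 13, 21)    # 13-21 Hz band, as in the abstract
print(alpha > beta)  # the 10 Hz component dominates the alpha band
```

A time-varying version, as used in the study, would apply the same computation over a sliding window.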

Relevance: 60.00%

Abstract:

Stimuli from one family of complex motions are defined by their spiral pitch, where cardinal axes represent signed expansion and rotation. Intermediate spirals are represented by intermediate pitches. It is well established that vision contains mechanisms that sum over space and direction to detect these stimuli (Morrone et al., Nature 376 (1995) 507), and one possibility is that four cardinal mechanisms encode the entire family. We extended earlier work (Meese & Harris, Vision Research 41 (2001) 1901) using subthreshold summation of random dot kinematograms and a two-interval forced-choice technique to investigate this possibility. In our main experiments, the spiral pitch of one component was fixed and that of another was varied in steps of 15° relative to the first. Regardless of whether the fixed component was aligned with cardinal axes or an intermediate spiral, summation to coherence threshold between the two components declined as a function of their difference in spiral pitch. Similar experiments showed that none of the following were critical design features or stimulus parameters for our results: superposition of signal dots, limited-lifetime dots, the presence of speed gradients, stimulus size or the number of dots. A simplex algorithm was used to fit models containing mechanisms spaced at a pitch of either 90° (cardinal model) or 45° (cardinal+ model) and combined using a fourth-root summation rule. For both models, direction half-bandwidth was equated for all mechanisms and was the only free parameter. Only the cardinal+ model could account for the full set of results. We conclude that the detection of complex motion in human vision requires both cardinal and spiral mechanisms with a half-bandwidth of approximately 46°. © 2002 Elsevier Science Ltd. All rights reserved.
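The fourth-root summation rule can be sketched numerically. The Gaussian tuning function, the linear within-mechanism summation, and the parameterisation below are illustrative assumptions (only the 46° half-bandwidth and the 90°/45° mechanism spacings come from the abstract); the sketch reproduces the qualitative finding that summation declines as the pitch difference between two components grows.

```python
import numpy as np

def circ_diff(a, b):
    """Smallest angular difference between two spiral pitches (degrees)."""
    d = np.abs(a - b) % 360.0
    return np.minimum(d, 360.0 - d)

def compound_threshold(p1, p2, centres, half_bw):
    """Coherence threshold for a two-component compound, assuming linear
    summation within each mechanism and fourth-root pooling across them."""
    tune = lambda d: np.exp(-np.log(2) * (d / half_bw) ** 2)  # half-height at half_bw
    resp = tune(circ_diff(p1, centres)) + tune(circ_diff(p2, centres))
    return 1.0 / np.sum(resp ** 4) ** 0.25

centres_cardinal = np.arange(0, 360, 90)   # expansion, contraction, two rotations
centres_plus = np.arange(0, 360, 45)       # cardinal + intermediate spiral mechanisms
for dp in (0, 45, 90):
    t = compound_threshold(0, dp, centres_plus, half_bw=46)
    print(f"pitch difference {dp:3d} deg -> compound threshold {t:.3f}")
```

The threshold rises (summation declines) with pitch difference, matching the pattern the models were fitted to.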

Relevance: 60.00%

Abstract:

A total pressure apparatus has been developed to measure vapour-liquid equilibrium data on binary mixtures at atmospheric and sub-atmospheric pressures. The method rapidly yields isothermal data. Only measurements of total pressure are made, as a direct function of the synthetic liquid-phase composition; the vapour-phase composition is deduced through the Gibbs-Duhem relationship. The need to analyse either phase is eliminated, and with it the errors introduced by sampling and analysis. One essential requirement is that the pure components be degassed completely, since any deficiency in degassing would introduce errors into the measured pressures. Equally, the central apparatus must be absolutely leak-tight, as any leakage of air into or out of the apparatus would produce erroneous pressure readings. The apparatus was commissioned by measuring the saturated vapour pressures of degassed water and ethanol as a function of temperature. The pressure-temperature data measured for degassed water were compared directly with literature data, with good agreement. Pressure-temperature data were similarly measured for ethanol, methanol and cyclohexane and, where possible, compared directly with literature data. Good agreement between the pure-component data of this work and those available in the literature demonstrates, first, that a satisfactory degassing procedure has been achieved and, second, that the pressure-temperature measurements are consistent for any one component. Since this is true for a number of components, the temperature and pressure measurements are self-consistent and of sufficient accuracy, the separate means of measuring pressure and temperature showing compatible precision. The liquid mixtures studied were ethanol-water, methanol-water and ethanol-cyclohexane.
The total pressure was measured as the composition inside the equilibrium cell was varied at a set temperature, giving P-T-x data sets for each mixture over a range of temperatures. A standard fitting package from the literature was used to reduce the raw data, yielding the y-values that complete the x-y-P-T data sets. A consistency test could not be applied to the P-T-x data sets, as no y-values were obtained during the experimental measurements. In general, satisfactory agreement was found between the data of this work and those available in the literature. For some runs discrepancies were observed, and further work is recommended to eliminate the problems identified.
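As a toy illustration of how a total pressure curve relates pure-component vapour pressures to liquid composition, the sketch below combines the Antoine equation with Raoult's law for an ideal solution. The Antoine constants are commonly quoted values (assumed here), and a real data reduction for a non-ideal mixture such as ethanol-water would fit activity coefficients constrained by the Gibbs-Duhem relationship rather than assume ideality.

```python
import numpy as np

def antoine_mmHg(A, B, C, T_celsius):
    """Pure-component saturated vapour pressure from the Antoine equation."""
    return 10 ** (A - B / (C + T_celsius))

# Commonly quoted Antoine constants (P in mmHg, T in deg C) -- assumed values
WATER = (8.07131, 1730.63, 233.426)
ETHANOL = (8.20417, 1642.89, 230.300)

T = 25.0
p_w = antoine_mmHg(*WATER, T)
p_e = antoine_mmHg(*ETHANOL, T)

# Ideal-solution (Raoult's law) total pressure and deduced vapour composition;
# note ethanol-water is non-ideal, so this is only a qualitative sketch
for x_e in (0.0, 0.25, 0.5, 0.75, 1.0):
    P = x_e * p_e + (1 - x_e) * p_w
    y_e = x_e * p_e / P if P > 0 else 0.0
    print(f"x_ethanol={x_e:.2f}  P={P:6.1f} mmHg  y_ethanol={y_e:.3f}")
```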

Relevance: 60.00%

Abstract:

TEST is a novel taxonomy of knowledge representations based on three distinct, hierarchically organized representational features: Tropism, Embodiment, and Situatedness. Tropic representational features reflect constraints of the physical world on the agent's ability to form, reactivate, and enrich embodied (i.e., resulting from the agent's bodily constraints) conceptual representations embedded in situated contexts. The proposed hierarchy entails that representations can, in principle, have tropic features without necessarily having situated and/or embodied features. On the other hand, representations that are situated and/or embodied are likely to be simultaneously tropic. Hence, although we propose tropism as the most general term, embodiment and situatedness stand more nearly on a par with each other, such that the dominance of one component over the other depends on the distinction between offline storage and online generation, as well as on representation-specific properties. © 2013 Cognitive Science Society, Inc.

Relevance: 40.00%

Abstract:

A rapid one-pot synthesis of 3-alkyl-5-[(Z)-arylmethylidene]-1,3-thiazolidine-2,4-diones is described that occurs in the recyclable ionic liquid [bmim]PF6 (1-butyl-3-methylimidazolium hexafluorophosphate). Significant rate enhancement and good selectivity have been observed.

Relevance: 30.00%

Abstract:

Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Previous attempts to formulate mixture models for PCA have therefore to some extent been ad hoc. In this paper, PCA is formulated within a maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context of clustering, density modelling and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
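A minimal numpy sketch of the idea, under simplifying assumptions (two mixture components, deterministic seeding, and an M-step that fits responsibility-weighted covariances before projecting them onto the PPCA form) rather than the paper's exact EM algorithm:

```python
import numpy as np

def ppca_from_cov(S, q):
    """Closed-form ML probabilistic-PCA parameters from a covariance matrix."""
    evals, evecs = np.linalg.eigh(S)
    evals, evecs = evals[::-1], evecs[:, ::-1]        # sort descending
    sigma2 = evals[q:].mean()                         # discarded variance -> noise
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

def mppca(X, q=1, iters=50):
    """EM-style sketch for a two-component mixture of probabilistic PCA models."""
    n, d = X.shape
    far = int(np.argmax(((X - X[0]) ** 2).sum(axis=1)))
    mu = np.stack([X[0], X[far]]).astype(float)       # two well-separated seeds
    pi = np.array([0.5, 0.5])
    covs = [np.eye(d), np.eye(d)]
    for _ in range(iters):
        logp = np.empty((n, 2))
        for i in range(2):                            # E-step: responsibilities
            W, s2 = ppca_from_cov(covs[i], q)
            C = W @ W.T + s2 * np.eye(d)              # PPCA covariance model
            diff = X - mu[i]
            sol = np.linalg.solve(C, diff.T).T
            _, logdet = np.linalg.slogdet(C)
            logp[:, i] = np.log(pi[i]) - 0.5 * (logdet + (diff * sol).sum(axis=1))
        R = np.exp(logp - logp.max(axis=1, keepdims=True))
        R /= R.sum(axis=1, keepdims=True)
        Nk = R.sum(axis=0)                            # M-step: weighted updates
        pi = Nk / n
        for i in range(2):
            mu[i] = R[:, i] @ X / Nk[i]
            diff = X - mu[i]
            covs[i] = (R[:, i] * diff.T) @ diff / Nk[i]
    return mu, [ppca_from_cov(S, q) for S in covs], R

# Two elongated clusters whose local principal axes point in different directions
rng = np.random.default_rng(1)
a = np.column_stack([3 * rng.standard_normal(150), 0.3 * rng.standard_normal(150)])
b = np.column_stack([0.3 * rng.standard_normal(150), 3 * rng.standard_normal(150)]) + 10.0
X = np.vstack([a, b])
mu, local_models, R = mppca(X)
print("mixing proportions:", R.mean(axis=0).round(2))
```

Each local model's loading column recovers the principal axis of its own cluster, and the responsibilities R provide the probabilistic assignment that ad hoc combinations of PCA models lack.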

Relevance: 30.00%

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
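The EM iteration for probabilistic PCA can be sketched directly from the latent variable formulation. This is a minimal numpy sketch with synthetic data and assumed dimensions, following the standard E-step (posterior moments of the latent variables) and M-step (loading-matrix and noise-variance updates):

```python
import numpy as np

def ppca_em(X, q=1, iters=100, seed=0):
    """EM for probabilistic PCA (latent-variable formulation)."""
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, q))
    s2 = 1.0
    for _ in range(iters):
        # E-step: posterior moments of the latent variables z_n
        M = W.T @ W + s2 * np.eye(q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                    # (n, q) posterior means
        SumEzz = n * s2 * Minv + Ez.T @ Ez    # sum of posterior second moments
        # M-step: update loading matrix and isotropic noise variance
        W = (Xc.T @ Ez) @ np.linalg.inv(SumEzz)
        s2 = ((Xc ** 2).sum()
              - 2 * np.sum((Xc @ W) * Ez)
              + np.trace(SumEzz @ W.T @ W)) / (n * d)
    return W, s2, mu

# Synthetic data: one dominant direction plus isotropic noise (assumed setup)
rng = np.random.default_rng(42)
w_true = np.array([2.0, 1.0, 2.0])            # the true principal axis (norm 3)
X = np.outer(rng.standard_normal(500), w_true) + 0.1 * rng.standard_normal((500, 3))

W, s2, mu = ppca_em(X, q=1)
direction = W[:, 0] / np.linalg.norm(W[:, 0])
print("alignment with true axis:", abs(direction @ (w_true / 3.0)).round(3))
print("estimated noise variance:", s2.round(4))
```

At convergence the columns of W span the principal subspace and s2 equals the average variance of the discarded directions.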

Relevance: 30.00%

Abstract:

The collect-and-place machine is one of the most widely used placement machines for assembling electronic components on printed circuit boards (PCBs). Nevertheless, very little research has addressed optimising the performance of this machine. This motivates us to study the component scheduling problem for this type of machine, with the objective of minimising the total assembly time. The component scheduling problem is an integration of the component sequencing problem, that is, the sequencing of component placements, and the feeder arrangement problem, that is, the assignment of component types to feeders. To solve the component scheduling problem efficiently, a hybrid genetic algorithm is developed in this paper. A numerical example is used to compare the performance of the algorithm with different component grouping approaches and different population sizes.
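A genetic algorithm for this kind of sequencing problem can be sketched with a permutation encoding. Everything below (board coordinates, operators, parameter settings, the Manhattan travel metric) is an illustrative assumption, not the hybrid algorithm developed in the paper:

```python
import random

def tour_length(seq, pts):
    """Total head travel for visiting placement points in the given order."""
    return sum(abs(pts[a][0] - pts[b][0]) + abs(pts[a][1] - pts[b][1])
               for a, b in zip(seq, seq[1:]))

def order_crossover(p1, p2, rng):
    """OX: copy a slice from parent 1, fill the rest in parent 2's order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    hole = set(p1[i:j + 1])
    filler = [g for g in p2 if g not in hole]
    return filler[:i] + p1[i:j + 1] + filler[i:]

def ga_sequence(pts, pop_size=60, generations=200, seed=0):
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            # binary tournament selection for each parent
            p1 = min(rng.sample(pop, 2), key=lambda s: tour_length(s, pts))
            p2 = min(rng.sample(pop, 2), key=lambda s: tour_length(s, pts))
            child = order_crossover(p1, p2, rng)
            if rng.random() < 0.3:                    # swap mutation
                a, b = rng.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            nxt.append(child)
        nxt[0] = min(pop, key=lambda s: tour_length(s, pts))  # elitism
        pop = nxt
    return min(pop, key=lambda s: tour_length(s, pts))

# Hypothetical placement coordinates on a board (mm): a 5 x 3 grid of points
pts = [(x, y) for x in range(0, 50, 10) for y in range(0, 30, 10)]
best = ga_sequence(pts)
print("best travel found:", tour_length(best, pts))
```

A hybrid version, as in the paper, would add problem-specific heuristics (e.g. component grouping) on top of this generic loop.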

Relevance: 30.00%

Abstract:

In printed circuit board (PCB) assembly, the efficiency of the component placement process depends on two interrelated issues: the sequence of component placements, that is, the component sequencing problem, and the assignment of component types to feeders of the placement machine, that is, the feeder arrangement problem. In cases where components of the same type are assigned to more than one feeder, the component retrieval problem should also be considered. Because these problems are inseparably related, a hybrid genetic algorithm is adopted in this paper to solve all three simultaneously for a type of PCB placement machine called the sequential pick-and-place (PAP) machine. The objective is to minimise the total distance travelled by the placement head in assembling all components on a PCB. In addition, the algorithm is compared with methods proposed by other researchers to examine its effectiveness and efficiency.
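The objective function, total distance travelled by the placement head, can be sketched for a single head that shuttles between feeder slots and board locations. The coordinates, data layout, and Euclidean metric below are hypothetical:

```python
import math

def head_travel(sequence, placements, feeder_of, feeder_pos):
    """Total Euclidean travel of a single pick-and-place head that alternates
    between a component's feeder slot and its board location."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    total = 0.0
    for k, comp in enumerate(sequence):
        ctype, xy = placements[comp]
        total += dist(feeder_pos[feeder_of[ctype]], xy)      # feeder -> board
        if k + 1 < len(sequence):                            # board -> next feeder
            next_type = placements[sequence[k + 1]][0]
            total += dist(xy, feeder_pos[feeder_of[next_type]])
    return total

# Hypothetical two-type board: feeders at y = 0, placements above each feeder
placements = [("A", (0, 10)), ("A", (0, 12)), ("B", (20, 10)), ("B", (20, 12))]
feeder_of = {"A": 0, "B": 1}                  # component type -> feeder slot
feeder_pos = {0: (0, 0), 1: (20, 0)}          # feeder slot -> coordinates

grouped = head_travel([0, 1, 2, 3], placements, feeder_of, feeder_pos)
interleaved = head_travel([0, 2, 1, 3], placements, feeder_of, feeder_pos)
print(grouped < interleaved)  # grouping same-type placements shortens travel
```

An optimiser would search jointly over `sequence` and `feeder_of`, which is exactly the coupling that motivates solving the problems simultaneously.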

Relevance: 30.00%

Abstract:

The aim of our paper is to examine whether Exchange Traded Funds (ETFs) diversify away the private information of informed traders. We apply the spread decomposition models of Glosten and Harris (1988) and Madhavan, Richardson and Roomans (1997) to a sample of ETFs and their control securities. Our results indicate that ETFs have significantly lower adverse selection costs than their control securities, suggesting that private information is diversified away for these securities. Our results therefore offer one explanation for the rapid growth of the ETF market.
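A simplified Glosten-Harris-style decomposition can be sketched on simulated trades: the price change is regressed on trade-direction and size terms, separating a transitory (order-processing) component from a permanent (adverse-selection) component. The model form is simplified and all coefficient values below are assumed, not estimates from real ETF data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
Q = rng.choice([-1, 1], n)             # trade direction: +1 buy, -1 sell
V = rng.lognormal(0, 0.5, n)           # trade size
c0, c1 = 0.02, 0.005                   # transitory (order-processing) costs
z0, z1 = 0.03, 0.010                   # adverse-selection components

# Simulated prices: the efficient price moves by the adverse-selection part,
# and the observed price adds the transitory part on top
m = np.cumsum((z0 + z1 * V) * Q + 0.01 * rng.standard_normal(n))
P = m + (c0 + c1 * V) * Q

# Regress dP on [dQ, d(QV), Q, QV] to recover the four components by OLS
dP = np.diff(P)
QV = Q * V
X = np.column_stack([np.diff(Q), np.diff(QV), Q[1:], QV[1:]])
coef, *_ = np.linalg.lstsq(X, dP, rcond=None)
print("estimated [c0, c1, z0, z1]:", coef.round(3))
```

In the paper's setting, a lower estimated adverse-selection component (z terms) for ETFs than for control securities is the evidence that private information is diversified away.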

Relevance: 30.00%

Abstract:

Objective: To independently evaluate the impact of the second phase of the Health Foundation's Safer Patients Initiative (SPI2) on a range of patient safety measures. Design: A controlled before and after design. Five substudies: survey of staff attitudes; review of case notes from high risk (respiratory) patients in medical wards; review of case notes from surgical patients; indirect evaluation of hand hygiene by measuring hospital use of handwashing materials; measurement of outcomes (adverse events, mortality among high risk patients admitted to medical wards, patients' satisfaction, mortality in intensive care, rates of hospital acquired infection). Setting: NHS hospitals in England. Participants: Nine hospitals participating in SPI2 and nine matched control hospitals. Intervention: The SPI2 intervention was similar to the SPI1, with somewhat modified goals, a slightly longer intervention period, and a smaller budget per hospital. Results: One of the scores (organisational climate) showed a significant (P=0.009) difference in rate of change over time, which favoured the control hospitals, though the difference was only 0.07 points on a five point scale. Results of the explicit case note reviews of high risk medical patients showed that certain practices improved over time in both control and SPI2 hospitals (and none deteriorated), but there were no significant differences between control and SPI2 hospitals. Monitoring of vital signs improved across control and SPI2 sites. This temporal effect was significant for monitoring the respiratory rate at both the six hour (adjusted odds ratio 2.1, 99% confidence interval 1.0 to 4.3; P=0.010) and 12 hour (2.4, 1.1 to 5.0; P=0.002) periods after admission. There was no significant effect of SPI for any of the measures of vital signs. Use of a recommended system for scoring the severity of pneumonia improved from 1.9% (1/52) to 21.4% (12/56) of control and from 2.0% (1/50) to 41.7% (25/60) of SPI2 patients.
This temporal change was significant (7.3, 1.4 to 37.7; P=0.002), but the difference in difference was not significant (2.1, 0.4 to 11.1; P=0.236). There were no notable or significant changes in the pattern of prescribing errors, either over time or between control and SPI2 hospitals. Two items of medical history taking (exercise tolerance and occupation) showed significant improvement over time, across both control and SPI2 hospitals, but no additional SPI2 effect. The holistic review showed no significant changes in error rates either over time or between control and SPI2 hospitals. The explicit case note review of perioperative care showed that adherence rates for two of the four perioperative standards targeted by SPI2 were already good at baseline, exceeding 94% for antibiotic prophylaxis and 98% for deep vein thrombosis prophylaxis. Intraoperative monitoring of temperature improved over time in both groups, but this was not significant (1.8, 0.4 to 7.6; P=0.279), and there were no additional effects of SPI2. A dramatic rise in consumption of soap and alcohol hand rub was similar in control and SPI2 hospitals (P=0.760 and P=0.889, respectively), as was the corresponding decrease in rates of Clostridium difficile and meticillin resistant Staphylococcus aureus infection (P=0.652 and P=0.693, respectively). Mortality rates of medical patients included in the case note reviews in control hospitals increased from 17.3% (42/243) to 21.4% (24/112), while in SPI2 hospitals they fell from 10.3% (24/233) to 6.1% (7/114) (P=0.043). Fewer than 8% of deaths were classed as avoidable; changes in proportions could not explain the divergence of overall death rates between control and SPI2 hospitals. There was no significant difference in the rate of change in mortality in intensive care. Patients' satisfaction improved in both control and SPI2 hospitals on all dimensions, but again there were no significant changes between the two groups of hospitals. 
Conclusions: Many aspects of care are already good or improving across the NHS in England, suggesting considerable improvements in quality across the board. These improvements are probably due to contemporaneous policy activities relating to patient safety, including those with features similar to the SPI, and the emergence of professional consensus on some clinical processes. This phenomenon might have attenuated the incremental effect of the SPI, making it difficult to detect. Alternatively, the full impact of the SPI might be observable only in the longer term. The conclusion of this study could have been different if concurrent controls had not been used.
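The pneumonia-scoring counts reported above are enough to reproduce the shape of the difference-in-difference comparison. The sketch below computes raw (unadjusted) odds ratios, so it will not match the paper's adjusted estimates (7.3 temporal; 2.1 difference in difference) exactly:

```python
def odds(successes, total):
    """Odds of an event from a count of successes and a denominator."""
    return successes / (total - successes)

# Pneumonia severity scoring counts as reported in the abstract
control_before, control_after = odds(1, 52), odds(12, 56)
spi2_before, spi2_after = odds(1, 50), odds(25, 60)

temporal_or_control = control_after / control_before   # change over time, controls
temporal_or_spi2 = spi2_after / spi2_before            # change over time, SPI2
did_ratio = temporal_or_spi2 / temporal_or_control     # difference in difference
print(round(temporal_or_spi2, 1), round(did_ratio, 1))
```

Both raw ratios exceed 1, mirroring the reported pattern: a strong improvement over time in both arms and a smaller, non-significant additional SPI2 effect.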

Relevance: 30.00%

Abstract:

While the literature has suggested that breach may be composed of multiple facets, no previous study has investigated this possibility empirically. This study examined the factor structure of typical component forms in order to develop a multiple component form measure of breach. Two studies were conducted. In study 1 (N = 420), multi-item measures based on causal indicators representing promissory obligations were developed for the five potential component forms (delay, magnitude, type/form, inequity and reciprocal imbalance). Exploratory factor analysis showed that the five components loaded onto one higher-order factor, namely psychological contract breach, suggesting that breach is composed of different aspects rather than distinct types. Confirmatory factor analysis provided further evidence for the proposed model. In addition, the model achieved high construct reliability and showed good construct, convergent, discriminant and predictive validity. Study 2 data (N = 189), used to validate the study 1 results, compared the multiple-component measure with an established multiple-item measure of breach (rather than a single item as in study 1) and also tested for discriminant validity against an established multiple-item measure of violation. The findings replicated those of study 1 and have important implications for considering alternative, more comprehensive and elaborate ways of assessing breach.
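The single higher-order factor finding can be illustrated with a quick eigenvalue check: if five component scores share one underlying factor, the first eigenvalue of their correlation matrix dominates and the remainder fall below 1. The sample size and loadings below are simulated assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400                                 # simulated respondents (cf. study 1 N = 420)
latent = rng.standard_normal(n)         # single underlying breach factor
loadings = np.array([0.80, 0.75, 0.70, 0.65, 0.70])   # assumed loadings

# Five component-form scores = loading * factor + unique noise (unit variance)
scores = (latent[:, None] * loadings
          + rng.standard_normal((n, 5)) * np.sqrt(1 - loadings ** 2))

corr = np.corrcoef(scores, rowvar=False)
evals = np.sort(np.linalg.eigvalsh(corr))[::-1]
print("proportion of variance on first factor:", round(evals[0] / 5, 2))
```

One dominant eigenvalue with all others below 1 is the classic exploratory-factor-analysis signature of a single higher-order factor.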

Relevance: 30.00%

Abstract:

It is well known that many medicines are a mixture of two enantiomers, or mirror-image molecules. Two enantiomers occur when a molecule has a single chiral centre; the two mirror images, called S or L (left-handed) and R or D (right-handed), are usually found in equal amounts in the parent (racemic) mixture. While for many compounds used in clinical practice the active moiety is found in one of the two enantiomers, with the other seen as an unnecessary and redundant component of the racemic mixture, the difference between enantiomers can mean a difference between therapeutic and adverse effects, as well as in beneficial pharmacological effect and potency. © 2010 The Author(s).