9 results for Four-color problem

in CentAUR: Central Archive University of Reading - UK


Relevance:

40.00%

Publisher:

Abstract:

Background: Several studies have shown that a conventional visual brain-computer interface (BCI) based on overt attention cannot be used effectively when eye movement control is not possible. To address this problem, a novel visual BCI based on covert attention and feature attention, called the gaze-independent BCI, has been proposed. Color and shape differences between stimuli and background have generally been used in gaze-independent BCIs. Recently, a new paradigm based on facial expression changes was presented and achieved high performance. However, some facial expressions were so similar that users could not tell them apart, especially when they were presented at the same position in a rapid serial visual presentation (RSVP) paradigm, and the performance of the BCI was consequently reduced. New Method: In this paper, we combined facial expressions and colors to optimize the stimulus presentation in the gaze-independent BCI. This optimized paradigm was called the colored dummy face pattern. It is suggested that different colors and facial expressions could help users locate the target and evoke larger event-related potentials (ERPs). To evaluate the performance of this new paradigm, two other paradigms were presented, called the gray dummy face pattern and the colored ball pattern. Comparison with Existing Method(s): The key question in assessing the value of the colored dummy face stimuli for BCI systems was whether they could achieve higher performance than gray face or colored ball stimuli. Ten healthy participants (seven male; aged 21–26 years; mean 24.5 ± 1.25) took part in our experiment. Online and offline results of the different paradigms were obtained and comparatively analyzed. Results: The results showed that the colored dummy face pattern evoked higher P300 and N400 ERP amplitudes than the gray dummy face pattern and the colored ball pattern.
Online results showed that the colored dummy face pattern had a significant advantage in classification accuracy (p < 0.05) and information transfer rate (p < 0.05) over the other two patterns. Conclusions: The stimuli used in the colored dummy face paradigm combined color and facial expressions. Compared with the colored ball and gray dummy face stimuli, this gave a significant advantage in the evoked P300 and N400 amplitudes and resulted in high classification accuracies and information transfer rates.
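The information transfer rate compared in the online results is conventionally computed with the Wolpaw formula; a minimal sketch (the class count, accuracy, and trial duration below are illustrative values, not those of the study):

```python
import math

def wolpaw_itr(n_classes, accuracy, trial_seconds):
    """Bits per minute under the standard Wolpaw ITR formula."""
    p = accuracy
    bits = math.log2(n_classes)  # information per selection at chance-free accuracy
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_classes - 1))
    return bits * (60.0 / trial_seconds)

# Illustrative: a 36-class speller at 90% accuracy, one selection every 4 s
print(round(wolpaw_itr(36, 0.9, 4.0), 1))
```

Raising either the accuracy or the number of distinguishable stimuli (as the colored dummy face pattern aims to do) increases the ITR, which is why both p-values in the abstract matter jointly.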

Relevance:

30.00%

Publisher:

Abstract:

The formulation of four-dimensional variational data assimilation allows the incorporation of constraints into the cost function that need only be weakly satisfied. In this paper we investigate the value of imposing conservation properties as weak constraints. Using the example of the two-body problem of celestial mechanics, we compare weak constraints based on conservation laws with a constraint on the background state. We show how the imposition of conservation-based weak constraints changes the nature of the gradient equation. Assimilation experiments demonstrate how this can add extra information to the assimilation process, even when the underlying numerical model is conserving.
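In weak-constraint form, the 4D-Var cost function augments the usual background and observation terms with a penalty on a conservation error; a generic sketch, where the symbols B, R_i, H_i, the weight lambda and the conservation diagnostic c are standard notation rather than taken from this paper:

```latex
J(x_0) = \tfrac{1}{2}(x_0 - x_b)^{\mathrm{T}} B^{-1} (x_0 - x_b)
       + \tfrac{1}{2}\sum_{i=0}^{N} \bigl(H_i(x_i) - y_i\bigr)^{\mathrm{T}} R_i^{-1} \bigl(H_i(x_i) - y_i\bigr)
       + \tfrac{\lambda}{2}\, c(x_0,\dots,x_N)^2
```

Here c might measure, for example, the departure from energy or angular-momentum conservation over the assimilation window. Differentiating the penalty contributes a term lambda c grad(c) to the gradient, which is the sense in which a conservation-based weak constraint changes the nature of the gradient equation.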

Relevance:

30.00%

Publisher:

Abstract:

The well-studied link between psychotic traits and creativity is a subject of much debate. The present study investigated the extent to which schizotypal personality traits - as measured by the O-LIFE (Oxford-Liverpool Inventory of Feelings and Experiences) - equip healthy individuals to engage as groups in everyday tasks. From a sample of 69 students, eight groups of four participants - composed of high-, medium-, or low-schizotypy individuals - were assembled to work as teams on a creative problem-solving task. Predictably, high scorers on the O-LIFE formulated a greater number of strategies to solve the task, indicative of creative divergent thinking. However, for task success (measured by the time taken to complete the problem) an inverted-U-shaped pattern emerged, whereby high- and low-schizotypy groups were consistently faster than medium-schizotypy groups. Intriguing data emerged concerning leadership within the groups, and other tangential findings relating to anxiety, competition and motivation were explored. These findings challenge the traditional cliché that psychotic personality traits are linearly related to creative performance, and suggest that the nature of the problem determines which thinking styles are optimally equipped to solve it. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Numerical weather prediction can be regarded as an initial value problem, whereby the governing atmospheric equations are integrated forward from fully determined initial values of the meteorological parameters. However, in spite of considerable improvements in observing systems in recent years, the initial values are known only incompletely and inaccurately, and one of the major tasks of any forecasting centre is to determine the best possible initial state from the available observations.

Relevance:

30.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis, in meteorology means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Because numerical weather prediction models must solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, mostly, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the more commonly applied 12 hours, we may without difficulty treat all observations as synoptic.
No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
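The scattered-observations-to-grid interpolation described above can be illustrated with a deliberately simple scheme; a minimal inverse-distance-weighting sketch (real objective-analysis schemes such as Cressman successive correction or optimal interpolation weight observations by their error statistics, which this toy version ignores):

```python
import numpy as np

def idw_analysis(obs_xy, obs_vals, grid_x, grid_y, power=2.0):
    """Interpolate scattered observations to a regular grid by
    inverse-distance weighting (a crude stand-in for objective analysis)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    field = np.zeros_like(gx, dtype=float)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(obs_xy[:, 0] - gx[i, j], obs_xy[:, 1] - gy[i, j])
            if np.any(d < 1e-9):  # grid point coincides with an observation
                field[i, j] = obs_vals[np.argmin(d)]
            else:
                w = 1.0 / d**power
                field[i, j] = np.sum(w * obs_vals) / np.sum(w)
    return field
```

Four-dimensional assimilation extends this idea by also weighting observations by their distance in time from the analysis, typically through a short model forecast rather than direct interpolation.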

Relevance:

30.00%

Publisher:

Abstract:

The problem of spurious excitation of gravity waves in the context of four-dimensional data assimilation is investigated using a simple model of balanced dynamics. The model admits a chaotic vortical mode coupled to a comparatively fast gravity wave mode, and can be initialized such that the model evolves on a so-called slow manifold, where the fast motion is suppressed. Identical twin assimilation experiments are performed, comparing the extended and ensemble Kalman filters (EKF and EnKF, respectively). The EKF uses a tangent linear model (TLM) to estimate the evolution of forecast error statistics in time, whereas the EnKF uses the statistics of an ensemble of nonlinear model integrations. Specifically, the case is examined where the true state is balanced, but observation errors project onto all degrees of freedom, including the fast modes. It is shown that the EKF and EnKF will assimilate observations in a balanced way only if certain assumptions hold, and that, outside of ideal cases (i.e., with very frequent observations), dynamical balance can easily be lost in the assimilation. For the EKF, the repeated adjustment of the covariances by the assimilation of observations can easily unbalance the TLM, and destroy the assumptions on which balanced assimilation rests. It is shown that an important factor is the choice of initial forecast error covariance matrix. A balance-constrained EKF is described and compared to the standard EKF, and shown to offer significant improvement for observation frequencies where balance in the standard EKF is lost. The EnKF is advantageous in that balance in the error covariances relies only on a balanced forecast ensemble, and that the analysis step is an ensemble-mean operation. Numerical experiments show that the EnKF may be preferable to the EKF in terms of balance, though its validity is limited by ensemble size. 
It is also found that overobserving can lead to a more unbalanced forecast ensemble and thus to an unbalanced analysis.
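The EnKF analysis step referred to above (an ensemble-mean operation with perturbed observations) can be sketched as follows; this is a generic stochastic EnKF update with a linear observation operator, not the balance-constrained scheme of the paper:

```python
import numpy as np

def enkf_update(ensemble, H, y, R, rng):
    """Stochastic EnKF analysis step with perturbed observations.
    ensemble: (n_state, n_members); H: (n_obs, n_state) linear observation
    operator; y: (n_obs,) observation; R: (n_obs, n_obs) obs-error covariance."""
    n_state, n_mem = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    X = (ensemble - x_mean) / np.sqrt(n_mem - 1)       # normalized anomalies
    Pf = X @ X.T                                       # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)     # Kalman gain
    # each member assimilates its own perturbed copy of the observation
    y_pert = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_mem).T
    return ensemble + K @ (y_pert - H @ ensemble)
```

Because the gain is built from the forecast ensemble itself, balance in the analysis depends on the ensemble members being balanced, which is the property the abstract highlights as the EnKF's advantage over the TLM-based EKF.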

Relevance:

30.00%

Publisher:

Abstract:

1. Insects using olfactory stimuli to forage for prey/hosts are proposed to encounter a ‘reliability–detectability problem’, where the usability of a stimulus depends on its reliability as an indicator of herbivore presence and on its detectability.
2. We investigated this theory using the responses of female seven-spot ladybirds Coccinella septempunctata (Coleoptera: Coccinellidae) to plant headspace chemicals collected from the peach-potato aphid Myzus persicae and from four commercially available Brassica cultivars - Brassica rapa L. cultivar ‘turnip purple top’, Brassica juncea L. cultivar ‘red giant mustard’, Brassica napus L. cultivar ‘Apex’ and Brassica napus L. cultivar ‘Courage’ - and Arabidopsis thaliana. For each cultivar/species, responses to plants that were undamaged, previously infested by M. persicae, or currently infested with M. persicae were investigated using dual-choice Petri dish bioassays and circular arenas.
3. There was no evidence that ladybirds responded to headspace chemicals from aphids alone. Ladybirds significantly preferred headspace chemicals from undamaged B. napus cv. Apex to those from plants infested with aphids. For the other four species/cultivars, there was a consistent trend for the predators to be recorded more often in the half of the Petri dish containing plant headspace chemicals from previously damaged and infested plants than in the half containing those from undamaged plants. Furthermore, the mean distance ladybirds walked to reach aphid-infested A. thaliana was significantly shorter than that to reach undamaged plants. These results suggest that aphid-induced plant chemicals could act as an arrestment, or possibly an attractant, stimulus for C. septempunctata. However, it is also possible that C. septempunctata was responding to aphid products, such as honeydew, transferred to the previously damaged and infested plants.
4. The results provide evidence to support the ‘reliability–detectability’ theory and suggest that the effectiveness of C. septempunctata as a natural enemy of aphids may be strongly affected by which species and cultivar of Brassica are being grown.

Relevance:

30.00%

Publisher:

Abstract:

We extend extreme learning machine (ELM) classifiers to complex reproducing kernel Hilbert spaces (RKHS) in which the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM) and suitable for complex-valued multiple-input–multiple-output processing, is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger-calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which simultaneously seeks the smallest training error and the smallest norm of the output weights. The proposed formulation also addresses aspects of quaternary classification within a Clifford-algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyperplanes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyperplanes through orthogonal projections; the six hyperplanes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers over their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because they can perform classification tasks fast, the proposed formulations are of interest for real-time applications.
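Since the CELM is shown to be equivalent to solving real-valued ELM tasks, the underlying building block can be sketched as follows; this is a minimal real-valued ELM (random hidden layer, ridge-regularized least-squares readout) on a toy XOR problem, with all sizes and names illustrative rather than taken from the paper:

```python
import numpy as np

def elm_train(X, Y, n_hidden, rng, reg=1e-3):
    """Minimal real-valued ELM: fixed random hidden layer plus a
    ridge-regression readout (cf. the smallest-error/smallest-norm KKT dual)."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    # output weights via regularized least squares
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit XOR, which no single hyperplane separates
rng = np.random.default_rng(0)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
Y = np.array([[0.0], [1.0], [1.0], [0.0]])
W, b, beta = elm_train(X, Y, n_hidden=50, rng=rng, reg=1e-6)
```

The speed advantage over SVMs cited in the abstract comes from this closed-form readout: training is a single linear solve rather than an iterative quadratic program.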

Relevance:

30.00%

Publisher:

Abstract:

Countless cities are rapidly developing across the globe, pressing the need for clear urban planning and design recommendations geared towards sustainability. This article examines the intersections of Jane Jacobs' four conditions for diversity with low-carbon, low-energy-use urban systems in four cities around the world: Lyon (France), Chicago (United States), Kolkata (India), and Singapore (Singapore). After reviewing Jacobs' four conditions for diversity, we introduce the four cities and describe their historical development context. We then present a framework to study the cities along three dimensions: population and density, infrastructure development/use, and climate and landscape. These cities differ in many respects, and their analysis is instructive for many other cities around the globe. Jacobs' conditions are present in all of them, manifested in different ways and to varying degrees. Overall, we find that the adoption of Jacobs' conditions aligns well with concepts of low-carbon urban systems, with their focus on walkability, transit-oriented design, and more efficient land use (i.e., smaller unit sizes). Transportation-sector emissions seem to show a stronger influence from the presence of Jacobs' conditions, while the link was less pronounced in the building sector. Kolkata, a low-income, developing-world city, seems to possess many of Jacobs' conditions while exhibiting low per capita emissions - maintaining both during its economic expansion will take careful consideration. Greenhouse gas mitigation, however, is inherently an in situ problem, and the first task must therefore be to gain local knowledge of an area before developing strategies to lower its carbon footprint.