893 results for non-conscious cognitive processing (NCCP) time.


Relevance: 100.00%

Abstract:

A presentation on information modelling and artificial intelligence, semantic structure, cognitive processing, and quantum theory.

Relevance: 100.00%

Abstract:

Intuitive interaction is based on past experience and is fast and often non-conscious. We have conducted ten studies into this issue over the past ten years, involving more than 400 participants. Data collection methods have included questionnaires, interviews, observations, concurrent and retrospective protocols, and cognitive measures. Coding schemes have been developed to suit each study and involve robust, literature-based heuristics. Other researchers have investigated this issue, and their methods are also examined. The paper traces the development of the methods and compares the various approaches used over the years.

Relevance: 100.00%

Abstract:

Gray's (2000) revised Reinforcement Sensitivity Theory (r-RST) was used to investigate personality effects on information processing biases towards gain-framed and loss-framed anti-speeding messages, and on the persuasiveness of these messages. The r-RST postulates that behaviour is regulated by two major motivational systems: a reward system and a punishment system. It was hypothesised that both message processing and persuasiveness would depend upon an individual's sensitivity to reward or punishment. Student drivers (N = 133) were randomly assigned to view one of four anti-speeding messages or no message (control group). Individual processing differences were then measured using a lexical decision task, prior to participants completing a personality and persuasion questionnaire. Results indicated that participants who were more sensitive to reward showed a marginally significant (p = .050) tendency to report higher intentions to comply with the social gain-framed message, and to demonstrate a cognitive processing bias towards this message, compared with participants of lower reward sensitivity.

Relevance: 100.00%

Abstract:

This paper presents the results from a study of information behaviours, with a specific focus on information organisation-related behaviours, conducted as part of a larger daily diary study with 34 participants. The findings indicate that the organisation of information in everyday life is a problematic area for several reasons. The self-evident one is the inter-subjectivity between the person who may have organised the information and the person looking for that same information (Berlin et al., 1993). Increasingly, though, we are not just looking for information within collections designed by someone else, but within our own personal collections, which frequently include books, electronic files, photos, records, documents, desktops, web bookmarks, and portable devices. The passage of time between when we categorised or classified the information and when we look for it again poses problems of intra-subjectivity: the difference between our own past and present perceptions of the same information. Information searching, and hence the retrieval of information from one's own collection in everyday life, involves a spatial and temporal coordination with one's own past selves, a sort of cognitive and affective time travel, just as organising information is a form of anticipatory coordination with one's future information needs. This has implications for finding information and for personal information management.

Relevance: 100.00%

Abstract:

Traffic incidents are recognised as one of the key sources of non-recurrent congestion, which often reduces travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on freeway TTR. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion from its negative impact on speeds. The locations and times of incidents are used to identify incident-induced events among the non-recurrent congestion events. Buffer time is employed to measure TTR, and extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time travellers need in order to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify factors that affect EBTI, using a selected freeway segment in the Southeast Queensland (Australia) network. Both fixed- and random-parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. The factors that influence EBTI depend on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
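As a hedged sketch of the reliability measures defined above (assuming the 95th percentile for buffer time, the mean as the recurrent travel time, and entirely hypothetical minute-scale data):

```python
import numpy as np

def buffer_time(travel_times, pct=95):
    """Buffer time: extra time over the mean travel time needed to
    arrive on time with `pct`% certainty."""
    tt = np.asarray(travel_times, dtype=float)
    return np.percentile(tt, pct) - tt.mean()

def ebti(recurrent_tt, incident_tt, pct=95):
    """Extra Buffer Time Index: extra buffer time caused by incidents,
    normalised by the recurrent (mean) travel time; 0 is the best case."""
    extra = buffer_time(incident_tt, pct) - buffer_time(recurrent_tt, pct)
    return extra / np.mean(recurrent_tt)

# hypothetical minute-scale travel times on one freeway segment
recurrent = [10, 11, 10, 12, 11, 10, 11]
incident = [12, 15, 13, 20, 14, 13, 18]
index = ebti(recurrent, incident)   # > 0: incidents add buffer time
```

Under these assumptions, an EBTI of zero means incidents add no extra buffer time over recurrent conditions.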

Relevance: 100.00%

Abstract:

Non-stationary signal modelling is a well-addressed problem in the literature. Many methods have been proposed to model non-stationary signals, such as time-varying linear prediction and AM-FM modelling, the latter being the more popular. Techniques for estimating the AM-FM components of a narrow-band signal, such as the Hilbert transform, DESA1, DESA2, the auditory processing approach, and the zero-crossing (ZC) approach, are prevalent, but their robustness to noise is not clearly addressed in the literature. This is critical for most practical applications, such as communications. We explore the robustness of different AM-FM estimators in the presence of white Gaussian noise. We also propose three new methods for instantaneous frequency (IF) estimation based on non-uniform samples of the signal and multi-resolution analysis. Experimental results show that ZC-based methods give better results than popular methods such as DESA in both clean and noisy conditions.
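As an illustration of one of the estimators named above, the following is a minimal numpy-only sketch of Hilbert-transform AM-FM demodulation on a clean synthetic tone (the signal parameters are arbitrary; this is not the paper's experimental setup):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal x + j*H{x} via the FFT (H = Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
# clean narrow-band AM-FM tone: 100 Hz carrier, 2 Hz amplitude modulation
x = (1 + 0.3 * np.cos(2 * np.pi * 2 * t)) * np.cos(2 * np.pi * 100 * t)

z = analytic_signal(x)
inst_amp = np.abs(z)                            # AM component
phase = np.unwrap(np.angle(z))                  # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # FM component, in Hz
```

For this tone the recovered instantaneous frequency sits at the 100 Hz carrier and the instantaneous amplitude traces the 0.7-1.3 modulation envelope; in noise, as the abstract notes, such estimators degrade and must be compared carefully.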

Relevance: 100.00%

Abstract:

Space-time block codes (STBCs) obtained from non-square complex orthogonal designs are more bandwidth-efficient than those from square real/complex orthogonal designs for colocated coherent MIMO systems, and have other applications in (i) non-coherent MIMO systems with non-differential detection, (ii) space-time-frequency codes for MIMO-OFDM systems, and (iii) distributed space-time coding for relay channels. Liang (IEEE Trans. Inform. Theory, 2003) constructed maximal-rate non-square designs for any number of antennas, with rate (a+1)/(2a) when the number of transmit antennas is 2a-1 or 2a. However, these designs have large delays, and for a large number of antennas the rate approaches 1/2. Tarokh et al. (IEEE Trans. Inform. Theory, 1999) constructed rate-1/2 non-square complex orthogonal designs (CODs) from the rate-1 real orthogonal designs for any number of antennas; the decoding delay of these codes is lower than that of Liang's codes when the number of transmit antennas exceeds 5. In this paper, we construct a class of rate-1/2 codes for an arbitrary number of antennas whose decoding delay is reduced by 50% compared with the rate-1/2 codes of Tarokh et al. It is also shown that although scaling the variables helps to lower the delay, it cannot be used to increase the rate.
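A quick numeric check of the rates quoted above: Liang's maximal rate (a+1)/(2a) for 2a-1 or 2a transmit antennas, which decays towards the 1/2 achieved by the Tarokh et al. constructions as the antenna count grows (the function name is ours):

```python
from fractions import Fraction

def liang_max_rate(n_tx):
    """Maximal rate (a+1)/(2a) of Liang's non-square complex
    orthogonal designs, where n_tx = 2a-1 or 2a transmit antennas."""
    a = (n_tx + 1) // 2
    return Fraction(a + 1, 2 * a)

# the rate decays towards 1/2 as the antenna count grows:
# n=2 -> 1, n=4 -> 3/4, n=8 -> 5/8, n=16 -> 9/16, ...
rates = {n: liang_max_rate(n) for n in (2, 4, 8, 16, 64)}
```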

Relevance: 100.00%

Abstract:

Many Finnish IT companies have gone through numerous organizational changes over the past decades. This book draws attention to how stability may be central to software product development experts and IT workers more generally, who continuously have to cope with such change in their workplaces. It does so by analyzing and theorizing change and stability as intertwined and co-existent, thus shedding light on how it is possible that, for example, even if ‘the walls fall down the blokes just code’ and maintain a sense of stability in their daily work. Rather than reproducing the picture of software product development as exciting cutting-edge activity and organizational change as a dramatic episode, the study takes the reader beyond the myths surrounding these phenomena to the mundane practices, routines and forms of organizing in product development during organizational change. An analysis of these ordinary practices offers insights into how software product development experts actively engage in constructing stability during organizational change through a variety of practices, including solidarity, homosociality, close relations to products, instrumental or functional views on products, preoccupations with certain tasks, and humble obedience. Consequently, the study shows that it may be more appropriate to talk about varieties of stability, characterized by a multitude of practices of stabilizing, rather than states of stagnation. Looking at different practices of stability in depth shows the creation of software as an arena for micro-politics, power relations, and increasing pressures for order and formalization.
The thesis gives particular attention to power relations and processes of positioning following organizational change: how social actors come to understand themselves in the context of ongoing organizational change, how they comply with and/or contest dominant meanings, how they identify and dis-identify with formalization, and how power relations are often reproduced despite dis-identification. Related to processes of positioning, the reader is also given a glimpse into what being at work in a male-dominated and relatively homogeneous work environment looks like. It shows how the strong presence of men or “blokes” of a particular age and education seems to become invisible in workplace talk that appears ‘non-conscious’ of gender.

Relevance: 100.00%

Abstract:

The present study draws on fifty essays written by students of the Sintuperj pre-university (vestibular) course, who have completed secondary school and are now preparing for the university entrance examination. The proposed approach is based on the analysis of referenciation, a form of cohesive text organisation, and specifically of anaphoric encapsulation. To examine the discursive function of this argumentative phenomenon in the essays, the analysis was divided into two perspectives. On the one hand, the types of referenciation chains were investigated, classified either as specific, labelling and contributing to referential progression in the text, or as non-specific, simply resuming and summarising the preceding content to avoid repetition. On the other hand, the analysis was extended to the elements that make up these chains, examining how reference to the referent was maintained through thematic continuity. It was also possible to observe how this maintenance of information in the text enables textual cohesion and enriches the argumentation the student builds to defend a thesis. The main contribution of this work to mother-tongue teaching therefore lies in approaching the text from the perspective of cognitive processing, revealing how elements are constructed in the essays from the students' cultural background and diverse knowledge, which support referential progression.

Relevance: 100.00%

Abstract:

We have studied the response of a sol-gel-based TiO2 high-k dielectric field-effect transistor structure to microwave radiation. Under fixed bias conditions the transistor shows frequency-dependent current fluctuations when exposed to continuous-wave microwave radiation. Some of these fluctuations take the form of high-Q resonances. The time-dependent characteristics of these responses were studied by modulating the microwaves with a pulse signal. The measurements show a shift in the centre frequency of the high-Q resonances when the pulse time is varied. The measured lifetime of these resonances is long enough to be useful for non-classical information processing.

Relevance: 100.00%

Abstract:

Both decision making and sensorimotor control require real-time processing of noisy information streams. Historically these processes were thought to operate sequentially: cognitive processing leads to a decision, and the outcome is passed to the motor system to be converted into action. Recently, it has been suggested that the decision process may provide a continuous flow of information to the motor system, allowing it to prepare in a graded fashion for the probable outcome. Such continuous flow is supported by electrophysiology in nonhuman primates. Here we provide direct evidence for the continuous flow of an evolving decision variable to the motor system in humans. Subjects viewed a dynamic random dot display and were asked to indicate their decision about direction by moving a handle to one of two targets. We probed the state of the motor system by perturbing the arm at random times during decision formation. Reflex gains were modulated by the strength and duration of motion, reflecting the accumulated evidence in support of the evolving decision. The magnitude and variance of these gains tracked a decision variable that explained the subject's decision accuracy. The findings support a continuous process linking the evolving computations associated with decision making and sensorimotor control.
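An illustrative toy model (not the authors') of the continuous-flow idea: a drift-diffusion-style accumulator whose evolving decision variable is read out continuously as a graded "reflex gain" proxy. All parameters and the gain mapping are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(coherence, n_steps=200, dt=0.005, noise=1.0):
    """Accumulate noisy momentary motion evidence into a decision
    variable (DV); a 'reflex gain' proxy reads the DV continuously,
    so motor preparation grows in graded fashion with the evidence."""
    momentary = coherence * dt + noise * np.sqrt(dt) * rng.standard_normal(n_steps)
    dv = np.cumsum(momentary)            # evolving decision variable
    gain = 1.0 + np.tanh(np.abs(dv))     # graded, continuous motor readout
    return dv, gain

dv_strong, gain_strong = simulate_trial(coherence=4.0)   # strong motion
dv_weak, gain_weak = simulate_trial(coherence=0.5)       # weak motion
```

In this sketch, stronger or longer motion accumulates a larger decision variable, and the motor readout is modulated accordingly, mirroring the reported dependence of reflex gains on motion strength and duration.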

Relevance: 100.00%

Abstract:

Reinforced concrete buildings in low-to-moderate seismic zones are often designed only for gravity loads, in accordance with non-seismic detailing provisions. Deficient detailing of columns and beam-column joints can lead to unpredictable brittle failures even under moderate earthquakes, so a reliable estimate of structural response is required for the seismic evaluation of these structures. For this purpose, analytical models for interior and exterior slab-beam-column subassemblages and for a 1/3-scale model frame were implemented in the nonlinear finite element platform OpenSees. The analytical results are compared with experimental data available in the literature using nonlinear pushover analyses for the subassemblages and nonlinear time history analysis for the model frame. Furthermore, a seismic fragility assessment of reinforced concrete buildings is performed on a set of non-ductile frames using nonlinear time history analyses. The fragility curves, developed for various damage states of the maximum interstory drift ratio, are characterized in terms of peak ground acceleration and spectral acceleration using a suite of ground motions representative of the seismic hazard in the region.
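Fragility curves of the kind described above are commonly parameterised as lognormal functions of the intensity measure. A minimal sketch with invented damage-state medians and dispersion (not the paper's fitted values):

```python
import math

def fragility(pga, median, beta):
    """Lognormal fragility: P(damage >= state | PGA), with median
    capacity `median` (in g) and logarithmic dispersion `beta`."""
    z = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# invented damage-state medians (g) and dispersion for a non-ductile
# RC frame -- illustrative numbers only
medians = {"slight": 0.15, "moderate": 0.30, "extensive": 0.55}
beta = 0.6
probs = {state: fragility(0.30, m, beta) for state, m in medians.items()}
```

At a PGA equal to a state's median capacity the exceedance probability is exactly 0.5, and lighter damage states are always at least as likely to be exceeded as heavier ones.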

Relevance: 100.00%

Abstract:

This paper describes the development of a real-time stereovision system to track multiple infrared markers attached to a surgical instrument. A multi-stage pipeline in a field-programmable gate array (FPGA) recognizes the targets in both the left and right image planes and gives each target a unique label. The pipeline comprises a smoothing filter, an adaptive threshold module, a connected component labeling operation, and a centroid extraction process. A parallel distortion correction method is proposed and implemented on a dual-core DSP. A suitable kinematic model is established for the moving targets, and a novel set of parallel and interactive computation mechanisms is proposed to position and track the targets, carried out by a cross-computation method on the dual-core DSP. The proposed system can track the 3-D coordinates, velocity, and acceleration of four infrared markers with a delay of 9.18 ms, and can track up to 110 infrared markers without frame dropping at a frame rate of 60 frames/s. Its accuracy reaches 0.37 mm RMS along the x- and y-directions and 0.45 mm RMS along the depth direction (for depths from 0.8 to 0.45 m). This performance meets the requirements of applications such as surgical navigation, which demand high real-time performance and accuracy.
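A software sketch of two of the pipeline stages named above, thresholding plus 4-connected component labeling with centroid extraction; the FPGA implementation is streaming and pipelined, so this sequential version is only illustrative:

```python
import numpy as np

def marker_centroids(img, thresh):
    """Threshold the image, label 4-connected bright blobs, and return
    one (row, col) centroid per blob, in raster-scan discovery order."""
    h, w = img.shape
    visited = np.zeros((h, w), dtype=bool)
    centroids = []
    for i in range(h):
        for j in range(w):
            if img[i, j] > thresh and not visited[i, j]:
                stack, pixels = [(i, j)], []
                visited[i, j] = True
                while stack:                     # flood-fill one blob
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny, nx] > thresh
                                and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(pixels), sum(xs) / len(pixels)))
    return centroids

# synthetic frame with two 'infrared markers'
frame = np.zeros((10, 10))
frame[2:4, 2:4] = 255.0
frame[7, 7] = 255.0
cents = marker_centroids(frame, thresh=128)
```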

Relevance: 100.00%

Abstract:

A seismic signal is a typical non-stationary signal whose frequency content changes continuously with time and is determined by the bandwidth of the seismic source and the absorption characteristics of the subsurface media. A central goal of seismic signal processing and interpretation is to detect abrupt changes of local frequency with time, since such changes indicate changes in the physical attributes of the subsurface. To obtain instantaneous attributes in the time-frequency domain, the key task is to find an effective, non-negative, and fast time-frequency distribution, transform the seismic signal into that domain to obtain its instantaneous power spectral density, and then derive the instantaneous attributes through weighted summation and averaging. Time-frequency analysis, a powerful tool for time-varying non-stationary signals, has become an active research area in modern signal processing and an important method for seismic attribute analysis. It provides a joint distribution over the time and frequency domains and clearly depicts how the signal's frequency content varies with time. Spectral decomposition pushes the resolution of seismic data towards its theoretical limit and, by scanning and imaging three-dimensional seismic data across all frequencies, improves the ability of seismic signals to resolve geological anomalies. Matching pursuit is an important method for realising adaptive signal decomposition. Its main idea is that any signal can be expressed as a linear combination of time-frequency atoms.
By decomposing the signal over an overcomplete dictionary, the time-frequency atoms that best represent the signal are selected flexibly and adaptively according to its characteristics. The method has excellent sparse decomposition properties and is widely used in signal de-noising, coding, and pattern recognition; it is also well suited to seismic signal decomposition and attribute analysis. This thesis takes matching pursuit as its key research object. After systematically introducing the principle and implementation of matching pursuit, it investigates in depth the pivotal problems of atom type selection, discretisation of the atom dictionary, and search algorithms for the best-matching atom, and applies matching pursuit to seismic processing by extracting instantaneous attributes through time-frequency analysis and spectral decomposition. Based on theoretical study and model tests of adaptive decomposition with matching pursuit, the thesis proposes a fast frequency-dominated search algorithm for the optimal time-frequency atom, tailored to seismic signal decomposition. Building on fast search-and-match algorithms for the optimal Gabor atom, it further proposes global optimisation using Simulated Annealing, Genetic Algorithms, and a combined Simulated Annealing-Genetic Algorithm, providing another route to a fast matching pursuit.
In addition, exploiting the characteristics of seismic signals, the thesis proposes a fast atom search that designates the maximum-energy points of the complex seismic signal and searches for the optimal atom in the neighbourhood of these points according to instantaneous frequency and phase, which improves the computational efficiency of matching pursuit for seismic data. The proposed methods are implemented in software, compared with published algorithms, and validated on real signals. Remaining problems and future research directions include seeking still more efficient fast matching pursuit algorithms, widening their range of application, and studying the practical use of matching pursuit.
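The greedy atom-selection step at the heart of matching pursuit can be sketched as follows, using a small discretised Gabor dictionary (all sizes and parameters here are invented for illustration, not the thesis's dictionary):

```python
import numpy as np

def gabor_atom(n, center, freq, width):
    """Unit-norm Gabor atom: Gaussian envelope times a cosine."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

def matching_pursuit(signal, dictionary, n_iter):
    """Greedy MP: repeatedly pick the atom with the largest inner
    product with the residual and subtract its projection."""
    residual = signal.astype(float).copy()
    picks = []
    for _ in range(n_iter):
        corrs = dictionary @ residual
        k = int(np.argmax(np.abs(corrs)))
        picks.append((k, float(corrs[k])))
        residual = residual - corrs[k] * dictionary[k]
    return picks, residual

n = 256
# small discretised dictionary: 8 centres x 3 frequencies, one width
atoms = np.array([gabor_atom(n, c, f, 12.0)
                  for c in range(16, n, 32)
                  for f in (0.05, 0.1, 0.2)])
sig = 2.0 * atoms[5] + 0.5 * atoms[12]      # known two-atom signal
picks, residual = matching_pursuit(sig, atoms, 2)
```

The fast search algorithms the thesis proposes replace the brute-force `argmax` over the whole dictionary with frequency-dominated or globally optimised searches.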

Relevance: 100.00%

Abstract:

In modern signal processing, the objects of analysis are usually non-linear, non-Gaussian, and non-stationary signals, especially non-stationary ones. The conventional tools for analysing non-stationary signals are the short-time Fourier transform, the Wigner-Ville distribution, the wavelet transform, and so on. All three are based on the Fourier transform and therefore inherit its shortcomings, in particular its limited ability to localise. The Hilbert-Huang Transform (HHT) is a newer technique for non-stationary signal processing, proposed by N. E. Huang in 1998. It consists of Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA). EMD decomposes any non-stationary signal into a series of data sequences of different scales, each called an Intrinsic Mode Function (IMF); the energy distribution of the original signal is then obtained by summing the Hilbert spectra of all IMFs. In essence, the algorithm renders non-stationary signals stationary, separating fluctuations and trends of different scales step by step, and finally describes the frequency content in terms of instantaneous frequency and energy rather than the global frequency and energy of Fourier spectral analysis. This avoids the many spurious harmonics that the Fourier transform needs to describe non-linear and non-stationary signals. This thesis covers the following. First, it reviews the history and development of HHT, its characteristics and main open issues, briefly introduces the basic principles and algorithms of the transform, and confirms its validity by simulation. Second, it discusses several shortcomings of HHT.
Using FFT interpolation, we solve the problems of IMF instability and instantaneous-frequency fluctuation caused by an insufficient sampling rate. For the end effects caused by the limitations of the envelope algorithm of HHT, we use a wave-characteristic matching method, with good results. Third, the thesis studies the application of HHT to electromagnetic signal processing in depth. Through the analysis of field data examples, we discuss its use in electromagnetic signal processing and noise suppression. Empirical mode decomposition and its multi-scale filtering characteristics can effectively reveal the noise distribution of an electromagnetic signal, suppress interference, and improve interpretability. We also find that selecting electromagnetic signal segments using the Hilbert time-frequency energy spectrum helps to improve signal quality and the quality of the data.
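The FFT interpolation mentioned above can be sketched as spectral zero-padding (assuming a real, band-limited signal with no energy at the Nyquist bin; the function name is ours):

```python
import numpy as np

def fft_interpolate(x, factor):
    """Upsample a real, band-limited signal by an integer factor via
    zero-padding of its one-sided spectrum (assumes no energy at the
    Nyquist bin of the original sampling)."""
    n = len(x)
    X = np.fft.rfft(x)
    m = n * factor
    Xp = np.zeros(m // 2 + 1, dtype=complex)
    Xp[: len(X)] = X
    return np.fft.irfft(Xp, m) * factor     # rescale for the longer length

# a coarsely sampled tone: 3 cycles over 32 samples
x = np.cos(2 * np.pi * 3 * np.arange(32) / 32)
y = fft_interpolate(x, 4)                   # 128 samples, same waveform
```

For a tone whose frequency is low relative to the sampling rate this reconstruction is exact, which is why denser sampling of this kind can stabilise the IMF envelopes and the instantaneous-frequency estimate.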