Abstract:
The challenge the community college faces in helping meet the needs of the living open system of society is examined in this study. It is postulated that internalization student outcomes are required by society to reduce entropy and remain self-renewing. Such behavior is characterized as having an intrinsically motivated energy source and displays the seeking and conquering of challenge, the development of reflective knowledge and skill, full use of all capabilities, internal control, growth orientation, high self-esteem, relativistic thinking and competence. A major purpose of this study is the development of a conceptual systems model that suggests how transactions among students, faculty and administration might occur to best meet the needs of internalization outcomes in students and intrinsic motivation in faculty. It is a speculative model based on a synthesis of a wide variety of variables. Empirical evidence, theoretical considerations, and speculative ideas are gathered from researchers and theoreticians who are working on separate answers to questions of intrinsic motivation, internal control and environments that encourage their development. The model considers the effect administrators have on faculty and the corresponding effect faculty may have on students. The major concentration is on the administrator-teacher interface. For administrators the model may serve as a guide in planning effective transactions and establishing system goals. The teacher is offered a means to coordinate actions toward a specific overall objective, and the administrator, teacher and researcher are invited to use the model to experiment, innovate, verify the assumptions on which the model is based, and raise additional hypotheses. Goals and history of the community colleges in Ontario are examined against current problems, previous progress and open system thinking. The nature of the person as a five-part system is explored with emphasis on intrinsic motivation. The nature, operation, conceptualization, and value of this internal energy source is reviewed in detail. The current state of society, education and management theory is considered, and the value of intrinsically motivating teaching tasks together with "system four" leadership style is featured. Evidence is reviewed that suggests intrinsically motivated faculty are needed, and that "system four" leadership style is the kind of interaction-influence system needed to nurture intrinsic motivation in faculty.
Abstract:
A method was developed to evaluate crop disease predictive models for their economic and environmental benefits. Benefits were quantified as the value of a prediction, measured by costs saved and fungicide dose saved. The value of prediction was defined as the net gain made by using predictions, measured as the difference between a scenario where predictions are available and used and a scenario without prediction. Comparable 'with' and 'without' scenarios were created with the use of risk levels. These risk levels were derived from a probability distribution fitted to observed disease severities. These distributions were used to calculate the probability that a certain disease-induced economic loss was incurred. The method was exemplified by using it to evaluate a model developed for Mycosphaerella graminicola risk prediction. Based on the value of prediction, the tested model may have economic and environmental benefits to growers if used to guide treatment decisions on resistant cultivars. It is shown that the value of prediction, measured by fungicide dose saved and costs saved, is constant with the risk level. The method could also be used to evaluate similar crop disease predictive models.
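As a rough illustration of how such a value-of-prediction calculation can be assembled, the sketch below fits a distribution to (synthetic) disease severities, derives a risk level, and compares routine spraying with prediction-guided spraying. All distributions, thresholds and costs are invented for illustration and are not taken from the study.

```python
# Illustrative value-of-prediction sketch; figures are invented, not from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical observed disease severities (% leaf area affected) over many site-years.
observed_severity = rng.lognormal(mean=1.0, sigma=0.8, size=200)

# Fit a probability distribution to the observed severities (here a log-normal).
shape, loc, scale = stats.lognorm.fit(observed_severity, floc=0)
severity_dist = stats.lognorm(shape, loc=loc, scale=scale)

# Risk level: probability that severity exceeds an economically damaging threshold.
damage_threshold = 5.0      # % severity above which yield loss exceeds the spray cost
risk = severity_dist.sf(damage_threshold)

# Illustrative per-hectare economics.
spray_cost = 40.0           # cost of a full-dose fungicide application
loss_if_untreated = 120.0   # yield loss if a damaging epidemic occurs and no spray is made

# 'Without prediction': a routine full-dose spray is always applied.
cost_routine = spray_cost
# Never spraying, for comparison: the expected loss from damaging epidemics.
cost_never = risk * loss_if_untreated
# 'With prediction': spray only when the model flags a damaging epidemic
# (assumed perfectly reliable in this sketch).
cost_predicted = risk * spray_cost

value_of_prediction = cost_routine - cost_predicted   # costs saved vs. routine spraying
dose_saved = 1.0 - risk                               # fraction of full doses saved

print(f"risk of a damaging epidemic: {risk:.2f}")
print(f"cost per ha: routine {cost_routine:.0f}, never spray {cost_never:.0f}, "
      f"with prediction {cost_predicted:.0f}")
print(f"value of prediction: {value_of_prediction:.0f} per ha; dose saved: {dose_saved:.2f}")
```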
Abstract:
Physiological evidence using Infrared Video Microscopy during the uncaging of glutamate has proven the existence of excitable calcium ion channels in spine heads, highlighting the need for reliable models of spines. In this study we compare the three main methods of simulating excitable spines: Baer & Rinzel's Continuum (B&R) model, Coombes' Spike-Diffuse-Spike (SDS) model and paired cable and ion channel equations (Cable model). Tests are done to determine how well the models approximate each other in terms of speed and heights of travelling waves. Significant quantitative differences are found between the models: travelling waves in the SDS model in particular are found to travel at much lower speeds and sometimes much higher voltages than in the Cable or B&R models. Meanwhile qualitative differences are found between the B&R and SDS models over realistic parameter ranges. The cause of these differences is investigated and potential solutions proposed.
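For orientation, the following is a minimal sketch of a Baer & Rinzel-style continuum model: a passive cable coupled everywhere to an excitable spine-head layer, here given FitzHugh-Nagumo kinetics rather than the channel models used in such studies. All parameters and the simple explicit scheme are illustrative; the sketch only shows how a travelling-wave speed can be read off this kind of simulation.

```python
# Minimal sketch of a B&R-style continuum model: passive cable v(x,t) coupled to an
# excitable spine-head layer u(x,t) (FitzHugh-Nagumo kinetics).  The spine layer has
# no lateral coupling of its own; excitation spreads only through the cable.
import numpy as np

nx, dx, dt, t_end = 400, 0.1, 0.0025, 80.0
D = 1.0                          # cable diffusion coefficient
tau = 8.0                        # cable membrane time constant
rho_g = 1.0                      # spine density times spine-stem conductance (lumped)
a, eps, gamma = 0.05, 0.01, 0.5  # FitzHugh-Nagumo spine-head parameters

v = np.zeros(nx)                 # cable potential
u = np.zeros(nx)                 # spine-head potential
w = np.zeros(nx)                 # spine-head recovery variable
u[:40] = 1.0                     # initiate a wave at the left end

def laplacian(y):
    lap = np.zeros_like(y)
    lap[1:-1] = (y[2:] - 2.0 * y[1:-1] + y[:-2]) / dx ** 2
    lap[0] = (y[1] - y[0]) / dx ** 2        # sealed (zero-flux) ends
    lap[-1] = (y[-2] - y[-1]) / dx ** 2
    return lap

front_time = np.full(nx, np.nan)            # first threshold crossing per location
for step in range(int(t_end / dt)):
    dv = D * laplacian(v) - v / tau + rho_g * (u - v)
    du = u * (u - a) * (1.0 - u) - w + rho_g * (v - u)
    dw = eps * (u - gamma * w)
    v += dt * dv
    u += dt * du
    w += dt * dw
    crossed = np.isnan(front_time) & (u > 0.5)
    front_time[crossed] = step * dt

# Estimate the travelling-wave speed from the front arrival times along the cable.
x = np.arange(nx) * dx
ok = ~np.isnan(front_time) & (x > 8) & (x < 32)
if ok.sum() > 10:
    speed = np.polyfit(front_time[ok], x[ok], 1)[0]
    print(f"estimated travelling-wave speed: {speed:.2f} space units per time unit")
else:
    print("wave failed to propagate with these illustrative parameters")
```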
Abstract:
Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven approach removing the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including making a quantitative assessment of its ability to preserve phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL) with existing methods and show it to offer improved time-frequency localisation of synchrony.
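A minimal sketch of the ingredients of such an analysis is given below, using the open-source PyEMD package (PyPI: EMD-signal) as a stand-in for the authors' EMD implementation and a plain phase-locking value instead of the paper's surrogate-based statistical framework; the signals are synthetic.

```python
# Sketch of EMD-based phase-locking analysis on two synthetic EEG-like signals.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 250.0                                   # sampling rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(1)

# Two signals sharing a 10 Hz component (phase-locked with a fixed lag),
# plus independent slow drifts and noise.
common_phase = 2 * np.pi * 10 * t
x = np.sin(common_phase) + 0.5 * np.sin(2 * np.pi * 1.3 * t) + 0.3 * rng.standard_normal(t.size)
y = np.sin(common_phase + 0.8) + 0.5 * np.sin(2 * np.pi * 0.7 * t) + 0.3 * rng.standard_normal(t.size)

# Decompose each signal into intrinsic mode functions (IMFs).
imfs_x = EMD().emd(x)
imfs_y = EMD().emd(y)

def inst_phase(imf):
    """Instantaneous phase of an IMF via the Hilbert transform."""
    return np.angle(hilbert(imf))

def plv(phase_a, phase_b):
    """Phase-locking value: 1 = perfect locking, ~0 = no locking."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

# Look for phase locking between every pair of modes; in EMDPL this would be
# assessed against surrogate data rather than a fixed cut-off.
for i, imf_x in enumerate(imfs_x):
    for j, imf_y in enumerate(imfs_y):
        value = plv(inst_phase(imf_x), inst_phase(imf_y))
        if value > 0.8:
            print(f"IMF {i} of x and IMF {j} of y: PLV = {value:.2f}")
```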
Abstract:
The difference between the rate of change of cerebral blood volume (CBV) and cerebral blood flow (CBF) following stimulation is thought to be due to circumferential stress relaxation in veins (Mandeville, J.B., Marota, J.J.A., Ayata, C., Zaharchuk, G., Moskowitz, M.A., Rosen, B.R., Weisskoff, R.M., 1999. Evidence of a cerebrovascular postarteriole windkessel with delayed compliance. J. Cereb. Blood Flow Metab. 19, 679–689). In this paper we explore the visco-elastic properties of blood vessels and present a dynamic model relating changes in CBF to changes in CBV. We refer to this model as the visco-elastic windkessel (VW) model. A novel feature of this model is that the parameter characterising the pressure–volume relationship of blood vessels is treated as a state variable dependent on the rate of change of CBV, producing hysteresis in the pressure–volume space during vessel dilation and contraction. The VW model is nonlinear and time-invariant, and is able to predict the observed differences between the time series of CBV and of CBF measurements following changes in neural activity. Like the windkessel model derived by Mandeville et al. (1999), the VW model is primarily a model of haemodynamic changes in the venous compartment. The VW model is demonstrated to have the following characteristics typical of visco-elastic materials: (1) hysteresis, (2) creep, and (3) stress relaxation; hence it provides a unified model of the visco-elastic properties of the vasculature. The model will not only contribute to the interpretation of Blood Oxygen Level Dependent (BOLD) signals from functional Magnetic Resonance Imaging (fMRI) experiments, but also find applications in the study and modelling of the brain vasculature and the haemodynamics of circulatory and cardiovascular systems.
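For reference, the sketch below integrates the widely used baseline windkessel relation tau*dv/dt = f_in - v^(1/alpha) between normalised CBF and venous CBV, which is the kind of model the VW model extends; the VW model's state-dependent compliance is not reproduced here, and all parameter values are illustrative.

```python
# Baseline (non-visco-elastic) windkessel relating normalised inflow f_in(t) to
# normalised venous volume v(t):  dv/dt = (f_in - v**(1/alpha)) / tau,
# with a Grubb-type passive outflow v**(1/alpha).  Parameters are illustrative.
import numpy as np

dt, t_end = 0.01, 60.0
t = np.arange(0.0, t_end, dt)
tau, alpha = 2.0, 0.38          # venous transit time (s) and Grubb-type exponent

# Normalised inflow: a 20 s, 50% increase in CBF starting at t = 10 s.
f_in = np.ones_like(t)
f_in[(t >= 10) & (t < 30)] = 1.5

v = np.ones_like(t)             # normalised CBV, baseline = 1
for k in range(1, t.size):
    outflow = v[k - 1] ** (1.0 / alpha)
    v[k] = v[k - 1] + dt * (f_in[k - 1] - outflow) / tau

# CBV rises and falls more slowly than CBF; this mismatch between the two time
# courses is what windkessel-type models (including the VW model) aim to capture.
print(f"peak CBF change: {f_in.max() - 1:.2f}, peak CBV change: {v.max() - 1:.2f}")
```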
Abstract:
Diabatic processes can alter Rossby wave structure; consequently errors arising from model processes propagate downstream. However, the chaotic spread of forecasts from initial condition uncertainty renders it difficult to trace back from root mean square forecast errors to model errors. Here diagnostics unaffected by phase errors are used, enabling investigation of systematic errors in Rossby waves in winter-season forecasts from three operational centers. Tropopause sharpness adjacent to ridges decreases with forecast lead time. It depends strongly on model resolution, even though models are examined on a common grid. Rossby wave amplitude reduces with lead time up to about five days, consistent with under-representation of diabatic modification and transport of air from the lower troposphere into upper-tropospheric ridges, and with too weak humidity gradients across the tropopause. However, amplitude also decreases when resolution is decreased. Further work is necessary to isolate the contribution from errors in the representation of diabatic processes.
Abstract:
The local speeds of object contours vary systematically with the cosine of the angle between the normal component of the local velocity and the global object motion direction. An array of Gabor elements whose speed changes with local spatial orientation in accordance with this pattern can appear to move as a single surface. The apparent direction of motion of plaids and Gabor arrays has variously been proposed to result from feature tracking, vector addition and vector averaging, in addition to the geometrically correct global velocity indicated by the intersection of constraints (IOC) solution. Here a new combination rule, the harmonic vector average (HVA), is introduced, as well as a new algorithm for computing the IOC solution. The vector sum can be discounted as an integration strategy because it increases with the number of elements. The vector average over local vectors that vary in direction always underestimates the true global speed. The HVA, however, provides the correct global speed and direction for a sample of local velocities that is unbiased with respect to the global motion direction, as is the case for a simple closed contour. The HVA over biased samples provides an aggregate velocity estimate that can still be combined through an IOC computation to give an accurate estimate of the global velocity, which is not true of the vector average. Psychophysical results for type II Gabor arrays show that perceived direction and speed fall close to the IOC solution for arrays with a wide range of orientations, but the IOC prediction fails as the mean orientation shifts away from the global motion direction and the orientation range narrows. In this case perceived velocity generally defaults to the HVA.
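The sketch below compares the combination rules on synthetic local velocities from a rigidly translating pattern, writing the HVA as the 'inverse of the mean of the inverses' (v -> v/|v|^2) and the IOC as a least-squares solution of the local constraints; this is an illustrative reading of the definitions rather than the authors' algorithm.

```python
# Vector average vs. harmonic vector average (HVA) vs. IOC for the normal
# velocities of a rigidly translating pattern; values are synthetic.
import numpy as np

V_true = np.array([2.0, 0.0])        # global velocity: 2 deg/s to the right

# Local element orientations: normal directions spread symmetrically about the
# motion direction (an 'unbiased' sample in the sense used above).
angles = np.linspace(-np.pi / 3, np.pi / 3, 25)
normals = np.column_stack([np.cos(angles), np.sin(angles)])

# Each element signals only the normal component of the true velocity.
speeds = normals @ V_true            # |V| * cos(theta_i)
local_v = normals * speeds[:, None]  # 2-D local (normal) velocity vectors

def invert(v):
    """Vector inversion v -> v / |v|^2, applied row-wise."""
    return v / np.sum(v ** 2, axis=-1, keepdims=True)

vector_average = local_v.mean(axis=0)

# Harmonic vector average: inverse of the mean of the inverted local velocities.
hva = invert(invert(local_v).mean(axis=0, keepdims=True))[0]

# IOC: each element contributes one linear constraint n_i . V = s_i; solve them
# in the least-squares sense (exact here, since the constraints are consistent).
ioc, *_ = np.linalg.lstsq(normals, speeds, rcond=None)

print("true velocity:          ", V_true)
print("vector average:         ", vector_average.round(3))  # underestimates the speed
print("harmonic vector average:", hva.round(3))             # recovers the true velocity
print("IOC solution:           ", ioc.round(3))             # recovers the true velocity
```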
Abstract:
Building Information Modeling (BIM) is the process of structuring, capturing, creating, and managing a digital representation of the physical and/or functional characteristics of a built space [1]. Current BIM has a limited ability to represent dynamic semantics and social information, often failing to consider building activity, behavior and context, thus limiting integration with intelligent, built-environment management systems. Research such as the development of Semantic Exchange Modules and the linking of IFC with semantic web structures demonstrates the need for building models to better support complex semantic functionality. To implement model semantics effectively, however, it is critical that model designers consider semantic information constructs. This paper discusses semantic models with respect to determining the most suitable information structure. We demonstrate how semantic rigidity can lead to significant long-term problems that can contribute to model failure. A sufficiently detailed feasibility study is advised to maximize the value obtained from the semantic model. In addition, we propose a set of questions to be used during a model's feasibility study, together with guidelines to help assess the most suitable method for managing semantics in a built environment.
Abstract:
Stimulation protocols for medical devices should be rationally designed. For episodic migraine with aura, we outline model-based design strategies toward preventive and acute therapies using stereotactic cortical neuromodulation. To this end, we regard a localized spreading depression (SD) wave segment as a central element in migraine pathophysiology. To describe the nucleation and propagation features of the SD wave segment, we define the new concepts of cortical hot spots and labyrinths, respectively. In particular, we first focus exclusively on curvature-induced dynamical properties by studying a generic reaction-diffusion model of SD on the folded cortical surface. This surface is described with an increasing level of detail, including finally personalized simulations using the patient's magnetic resonance imaging (MRI) readings. At this stage, the only relevant factor that can modulate nucleation and propagation paths is the Gaussian curvature, which has the advantage of being rather readily accessible by MRI. We conclude by discussing further anatomical factors, such as areal, laminar, and cellular heterogeneity, that in addition to and in relation to Gaussian curvature determine the generalized concept of cortical hot spots and labyrinths as target structures for neuromodulation. Our numerical simulations suggest that these target structures are like fingerprints: they are individual features of each migraine sufferer. The goal in the future will be to provide individualized neural tissue simulations. These simulations should predict the clinical data and can therefore also serve as a test bed for exploring stereotactic cortical neuromodulation.
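As a toy illustration of the modelling ingredient, the sketch below runs a generic reaction-diffusion wave (FitzHugh-Nagumo kinetics) from a localised stimulus on a flat 2-D patch; the paper's essential element, the Gaussian curvature of the folded cortical surface, is deliberately omitted, and all parameters are illustrative.

```python
# Generic reaction-diffusion sketch of a spreading-depression-like wave started
# from a localised stimulus ('hot spot') on a flat 2-D patch of tissue.
import numpy as np

n, dx, dt, steps = 128, 1.0, 0.05, 1500
D = 1.0                       # diffusion coefficient of the activator
a, b, eps = 0.1, 0.5, 0.02    # FitzHugh-Nagumo parameters

u = np.zeros((n, n))          # activator (SD-like depolarisation)
w = np.zeros((n, n))          # slow recovery variable
u[60:68, 60:68] = 1.0         # localised stimulus

def laplacian(f):
    """5-point Laplacian with zero-flux (Neumann-like) boundaries."""
    fp = np.pad(f, 1, mode="edge")
    return (fp[:-2, 1:-1] + fp[2:, 1:-1] + fp[1:-1, :-2] + fp[1:-1, 2:] - 4.0 * f) / dx ** 2

for _ in range(steps):
    du = D * laplacian(u) + u * (u - a) * (1.0 - u) - w
    dw = eps * (u - b * w)
    u += dt * du
    w += dt * dw

# By the end of the run the excitation front has spread well beyond the stimulus site.
print(f"fraction of the patch currently depolarised (u > 0.5): {(u > 0.5).mean():.2f}")
```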
Abstract:
A basic data requirement of a river flood inundation model is a Digital Terrain Model (DTM) of the reach being studied. The scale at which modeling is required determines the accuracy required of the DTM. For modeling floods in urban areas, a high-resolution DTM such as that produced by airborne LiDAR (Light Detection And Ranging) is most useful, and large parts of many developed countries have now been mapped using LiDAR. In more remote areas, it is possible to model flooding on a larger scale using a lower-resolution DTM, and in the near future the DTM of choice is likely to be that derived from the TanDEM-X Digital Elevation Model (DEM). A variable-resolution global DTM obtained by combining existing high- and low-resolution data sets would be useful for modeling flood water dynamics globally, at high resolution wherever possible and at lower resolution over larger rivers in remote areas. A further important data resource used in flood modeling is the flood extent, commonly derived from Synthetic Aperture Radar (SAR) images. Flood extents become more useful if they are intersected with the DTM, so that water level observations (WLOs) at the flood boundary can be estimated at various points along the river reach. To illustrate the utility of such a global DTM, two examples of recent research involving WLOs at opposite ends of the spatial scale are discussed. The first requires high-resolution spatial data and involves the assimilation of WLOs from a real sequence of high-resolution SAR images into a flood model to update the model state with observations over time, and to estimate river discharge and model parameters, including river bathymetry and friction. The results indicate the feasibility of such an Earth Observation-based flood forecasting system. The second example is at a larger scale, and uses SAR-derived WLOs to improve the lower-resolution TanDEM-X DEM in the area covered by the flood extents. The resulting reduction in random height error is significant.
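A minimal sketch of the flood-extent/DTM intersection step is given below: waterline pixels of a (synthetic) flood mask are located and the DTM heights there are read off as water-level observations. The terrain, water surface and resolution are invented for illustration.

```python
# Deriving water-level observations (WLOs) by intersecting a flood extent with a DTM:
# the terrain height at the flooded/dry boundary approximates the local water level.
import numpy as np

# Synthetic 1 m-resolution DTM: a valley sloping gently downstream (to the right).
ny, nx = 100, 200
x = np.arange(nx)
y = np.arange(ny)
dtm = 0.02 * (nx - x)[None, :] + 0.05 * np.abs(y[:, None] - ny / 2)   # heights in m

# Synthetic flood extent (e.g. classified from a SAR image): water fills the
# valley up to a surface about 1 m above the valley floor, sloping downstream.
water_surface = 1.0 + 0.02 * (nx - x)[None, :]
flood_mask = dtm < water_surface

# Waterline pixels: flooded cells with at least one dry 4-neighbour.
padded = np.pad(flood_mask, 1, mode="edge")
dry_neighbour = (~padded[:-2, 1:-1] | ~padded[2:, 1:-1] |
                 ~padded[1:-1, :-2] | ~padded[1:-1, 2:])
waterline = flood_mask & dry_neighbour

# WLOs: DTM heights sampled along the waterline, summarised here per cross-section.
wlo_rows, wlo_cols = np.nonzero(waterline)
for col in range(0, nx, 50):
    at_col = wlo_rows[wlo_cols == col]
    if at_col.size:
        print(f"x = {col:3d} m: estimated water level = {dtm[at_col, col].mean():.2f} m")
```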
Abstract:
Background: Some studies have shown that a conventional visual brain-computer interface (BCI) based on overt attention cannot be used effectively when eye movement control is not possible. To solve this problem, a novel visual BCI system based on covert attention and feature attention has been proposed, called the gaze-independent BCI. Color and shape differences between stimuli and backgrounds have generally been used in examples of gaze-independent BCIs. Recently, a new paradigm based on facial expression changes has been presented and has obtained high performance. However, some facial expressions were so similar that users could not tell them apart, especially when they were presented at the same position in a rapid serial visual presentation (RSVP) paradigm. Consequently, the performance of the BCI is reduced. New Method: In this paper, we combined facial expressions and colors to optimize the stimulus presentation in the gaze-independent BCI. This optimized paradigm was called the colored dummy face pattern. It is suggested that different colors and facial expressions could help users to locate the target and evoke larger event-related potentials (ERPs). In order to evaluate the performance of this new paradigm, two other paradigms were presented, called the gray dummy face pattern and the colored ball pattern. Comparison with Existing Method(s): The key point determining the value of the colored dummy face stimuli in BCI systems was whether they could achieve higher performance than gray face or colored ball stimuli. Ten healthy participants (seven male, aged 21–26 years, mean 24.5 ± 1.25) took part in our experiment. Online and offline results of four different paradigms were obtained and comparatively analyzed. Results: The results showed that the colored dummy face pattern evoked higher P300 and N400 ERP amplitudes compared with the gray dummy face pattern and the colored ball pattern. Online results showed that the colored dummy face pattern had a significant advantage in terms of classification accuracy (p < 0.05) and information transfer rate (p < 0.05) compared to the other two patterns. Conclusions: The stimuli used in the colored dummy face paradigm combined color and facial expressions. Compared with the colored ball and gray dummy face stimuli, this paradigm had a significant advantage in terms of the evoked P300 and N400 amplitudes and resulted in higher classification accuracies and information transfer rates.
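The sketch below illustrates, on synthetic single-channel epochs, the kind of analysis such a comparison rests on: averaging a P300-window amplitude per condition and classifying single trials with linear discriminant analysis. It is not the authors' processing pipeline, and the signal and noise levels are invented.

```python
# P300-window amplitude and single-trial LDA classification on synthetic epochs.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                  # sampling rate (Hz)
t = np.arange(-0.2, 0.8, 1 / fs)          # epoch: -200 ms to 800 ms
rng = np.random.default_rng(4)

def make_epochs(n_trials, p300_amp):
    """Synthetic single-channel epochs: noise plus a P300-like bump for targets."""
    noise = 5.0 * rng.standard_normal((n_trials, t.size))
    p300 = p300_amp * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return noise + p300

targets = make_epochs(60, p300_amp=8.0)       # e.g. colored dummy-face targets
nontargets = make_epochs(240, p300_amp=0.0)

# Mean amplitude in a 300-500 ms window: a simple per-condition P300 measure.
win = (t >= 0.3) & (t <= 0.5)
print(f"mean P300 window amplitude (targets):     {targets[:, win].mean():.2f} uV")
print(f"mean P300 window amplitude (non-targets): {nontargets[:, win].mean():.2f} uV")

# Single-trial target vs. non-target classification, as used for online BCI output.
X = np.vstack([targets, nontargets])
y = np.r_[np.ones(len(targets)), np.zeros(len(nontargets))]
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.2f}")
```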
Abstract:
In today's society it is increasingly important for companies to retain their existing customers as competition becomes ever tougher. As a result, companies try to take measures to nurture their customer relationships. This problem is also highly relevant in the IT industry, where agile work in IT projects is common. Our partner company has seen a growing need to measure service quality in a recurring way within IT projects, in order to measure relevant variables that extend beyond the requirements specification. To measure success with this way of working, they want to be able to measure a customer satisfaction index (Nöjd Kund Index, NKI) so that IT projects can be compared internally within the company. Since previous research has shown a lack of models that include both the measurement of service quality and NKI, the relevant literature was studied; it emerged that the SERVQUAL model is well established for measuring service quality and that the American Customer Satisfaction Index (ACSI) model is well established for measuring customer satisfaction. This formed the basis for the problem statement and aim of the study. The aim is to create a further developed model for measuring NKI in order to compare IT projects internally, and for recurring measurement of service quality within IT projects. The model was developed using the Design and Creation research strategy, with interviews conducted to capture requirements for the extended model. The result was a further developed model, based on the models mentioned above, with a recurring approach to measuring service quality within IT projects and measurement of NKI to compare IT projects internally within the company. The resulting model was then verified through additional interviews with respondents who have solid experience from the customer side of IT projects. From these interviews it could be concluded that the model can be considered applicable in practice for IT projects.
Abstract:
This paper discusses distribution and the historical phases of capitalism. It assumes that technical progress and growth are taking place and, given that, asks about the functional distribution of income between labor and capital, taking as reference the classical theory of distribution and Marx's falling tendency of the rate of profit. Based on the historical experience, it first inverts the model, making the rate of profit the constant variable in the long run and the wage rate the residuum; second, it distinguishes three types of technical progress (capital-saving, neutral and capital-using) and applies them to the history of capitalism, with the UK and France as reference. Given these three types of technical progress, it distinguishes four phases of capitalist growth, of which only the second is consistent with Marx's prediction. The last phase, after World War II, should in principle be capital-saving, consistent with growth of wages above productivity. Instead, since the 1970s wages have been kept stagnant in rich countries because of, first, the fact that the Information and Communication Technology Revolution proved to be highly capital-using, opening room for a new wave of substitution of capital for labor; second, the new competition coming from developing countries; third, the emergence of the technobureaucratic or professional class; and, fourth, the new power of the neoliberal class coalition associating rentier capitalists and financiers.
Abstract:
This dissertation proposes alternative models to allow the interconnection of the data communication networks of COSERN (Companhia Energética do Rio Grande do Norte). These networks comprise the corporate data network, based on the TCP/IP architecture, and the automation system linking remote electric energy distribution substations to the main Operation Centre, based on digital radio links and using the IEC 60870-5-101 protocol. The envisaged interconnection aims to provide automation data originating from substations with a contingency route to the Operation Centre during failure or maintenance of the digital radio links. Among the presented models, the one chosen for development consists of a computational prototype based on a standard personal computer, running the Linux operating system and an application, developed in the C language, which functions as a gateway between the TCP/IP stack and the IEC 60870-5-101 suite. The analysis, implementation, and functionality and performance tests of this model are described. During the test phase, the delay introduced by the TCP/IP network when transporting automation data was measured, in order to guarantee that it was consistent with the time constraints of the automation network. In addition, extra modules are suggested for the prototype to handle other issues, such as security and prioritization of the automation system data whenever they traverse the TCP/IP network. Finally, a study has been done aiming to integrate the two networks in a more complete way, using the IP platform as a convergence solution for the communication subsystem of a unified network, as the most recent market tendencies for supervisory and other automation systems indicate.
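A schematic sketch of the gateway role is given below. The actual prototype was written in C and handled the IEC 60870-5-101 protocol explicitly; this Python sketch (using the pyserial package and placeholder addresses) only shows bidirectional forwarding between a serial-side link and a TCP/IP connection, with no protocol parsing.

```python
# Schematic gateway sketch: frames from the substation side (stubbed as a serial
# port) are relayed over TCP/IP to the Operation Centre, and vice versa.
import socket
import serial   # pyserial; stands in for the radio/serial link to the substation

SERIAL_PORT = "/dev/ttyS0"          # placeholder device name
CENTRE_ADDR = ("192.0.2.10", 2404)  # placeholder Operation Centre address and port

def run_gateway():
    link = serial.Serial(SERIAL_PORT, baudrate=9600, timeout=0.1)
    with socket.create_connection(CENTRE_ADDR) as tcp:
        tcp.settimeout(0.1)
        while True:
            # Substation -> Operation Centre
            frame = link.read(256)
            if frame:
                tcp.sendall(frame)
            # Operation Centre -> substation
            try:
                data = tcp.recv(256)
                if data:
                    link.write(data)
            except socket.timeout:
                pass

if __name__ == "__main__":
    run_gateway()
```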
Abstract:
Following the new tendency toward interdisciplinarity in modern science, a new field called neuroengineering has come to light in recent decades. Since 2000, scientific journals and conferences around the world have been created on this theme. The present work spans three subareas related to neuroengineering, electrical engineering and biomedical engineering: neural stimulation, theoretical and computational neuroscience, and neuronal signal processing. The research can be divided into three parts: (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity using a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marchenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. The application of the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method for the automatic classification of heart beats was developed, which does not rely on a database for training and is not specialized to specific pathologies. The method is based on wavelet decomposition and normality measures of random variables. Overall, the results presented in these three fields of knowledge represent contributions to neural and biomedical engineering.
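As an illustration of the assembly-detection idea in part (ii), the sketch below applies the Marchenko-Pastur bound to the correlation matrix of synthetic binned spike trains; the thresholding and the projection used to track assembly activity follow the general logic described above but are not the thesis' exact procedure.

```python
# Assembly detection via the Marchenko-Pastur bound on synthetic binned spike trains:
# correlation-matrix eigenvalues above lambda_max = (1 + sqrt(n_neurons/n_bins))**2
# are taken as signatures of cell assemblies.
import numpy as np

rng = np.random.default_rng(5)
n_neurons, n_bins = 50, 5000

# Independent background activity ...
spikes = rng.poisson(0.05, size=(n_neurons, n_bins)).astype(float)
# ... plus one assembly: neurons 0-9 co-activate in the same randomly chosen bins.
assembly_bins = rng.random(n_bins) < 0.03
spikes[:10, assembly_bins] += rng.poisson(1.0, size=(10, assembly_bins.sum()))

# Z-score each neuron and form the correlation matrix of the binned activity.
z = (spikes - spikes.mean(axis=1, keepdims=True)) / spikes.std(axis=1, keepdims=True)
corr = (z @ z.T) / n_bins

# Marchenko-Pastur upper bound for the eigenvalues of a purely random correlation matrix.
q = n_neurons / n_bins
lambda_max = (1 + np.sqrt(q)) ** 2

eigvals, eigvecs = np.linalg.eigh(corr)
n_assemblies = int(np.sum(eigvals > lambda_max))
print(f"eigenvalues above the Marchenko-Pastur bound ({lambda_max:.2f}): {n_assemblies}")

# The eigenvector of the largest significant eigenvalue weights the member neurons;
# projecting the z-scored activity onto it tracks the assembly's activation over time.
pattern = eigvecs[:, -1]
activation = (pattern @ z) ** 2
print("highest-weight neurons:", np.sort(np.argsort(np.abs(pattern))[-10:]))
print(f"correlation of activation with the true assembly bins: "
      f"{np.corrcoef(activation, assembly_bins.astype(float))[0, 1]:.2f}")
```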