3 results for Vector spaces -- Problems, exercises, etc.
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Technological developments in microprocessors and the ICT landscape have ushered in a new era in which computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform a particular task and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience. Similarly, cars have on-board computers that use information from many different sensors to control components such as fuel injectors and spark plugs so that they perform their tasks efficiently. These individual devices make life easier by taking decisions and removing that burden from their users. All these objects and devices obtain some piece of information about the physical environment. However, each device is an island, with no proper connectivity or information sharing between them. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications, but it is difficult because of the heterogeneity and limited interoperability of the devices. The Smart Space vision is to overcome these issues of heterogeneity and interoperability so that the devices can understand each other and utilize each other's services through information sharing. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one such example of Smart Spaces: through the intelligent interconnection of resources and their collective behavior, they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health care system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years, and they have become an essential part of everyday life. Mobile phones offer a wide range of services to their users, including text and multimedia messaging, Internet access, audio, video, and email applications, and most recently TV services. Interactive TV provides a variety of applications for the viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent, by enabling heterogeneous systems to collaborate with each other through information sharing. There are many challenges in designing the frameworks and application development tools for rapid and easy development of such applications. The research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for the development of these applications. We demonstrated the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware ubiquitous iTV applications.
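To make the rule-based, context-aware idea concrete, the following is a minimal illustrative sketch, not the thesis's actual tooling: condition-action rules are evaluated against context shared between devices in a Smart Space. The names (Rule, run_rules) and the phone/TV example scenario are hypothetical.

```python
# Hypothetical sketch of a rule-based, context-aware application:
# shared context facts trigger actions when their conditions hold.

from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict[str, Any]], bool]   # evaluated against the shared context
    action: Callable[[Dict[str, Any]], None]      # fired when the condition holds

def run_rules(context: Dict[str, Any], rules: List[Rule]) -> None:
    """Evaluate every rule once against the current shared context."""
    for rule in rules:
        if rule.condition(context):
            rule.action(context)

# Example: pause the iTV stream when a phone in the same Smart Space starts ringing.
rules = [
    Rule(
        name="pause-tv-on-incoming-call",
        condition=lambda ctx: ctx.get("phone.state") == "ringing"
                              and ctx.get("tv.state") == "playing",
        action=lambda ctx: ctx.update({"tv.state": "paused"}),
    ),
]

context = {"phone.state": "ringing", "tv.state": "playing"}
run_rules(context, rules)
print(context["tv.state"])  # -> "paused"
```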
Abstract:
The question of the trainability of executive functions and the impact of such training on related cognitive skills has stirred considerable research interest. Despite a number of studies investigating this, the question has not yet been settled. The general aim of this thesis was to investigate two very different types of training of executive functions: laboratory-based computerized training (Studies I-III) and real-world training through bilingualism (Studies IV-V). Bilingualism as a kind of training of executive functions is based on the idea that managing two languages requires executive resources, and previous studies have suggested a bilingual advantage in executive functions. Three executive functions were studied in the present thesis: updating of working memory (WM) contents, inhibition of irrelevant information, and shifting between tasks and mental sets. Studies I-III investigated the effects of computer-based training of WM updating (Study I), inhibition (Study II), and set shifting (Study III) in healthy young adults. All studies showed increased performance on the trained task. More importantly, improvement on an untrained task tapping the trained executive function (near transfer) was seen in Studies I and II. None of the three studies showed improvement on untrained tasks tapping some other cognitive function (far transfer) as a result of training. Study I also used PET to investigate the effects of WM updating training on a neurotransmitter closely linked to WM, namely dopamine. The PET results revealed increased striatal dopamine release during WM updating performance as a result of training. Study IV investigated the ability to inhibit task-irrelevant stimuli in bilinguals and monolinguals by using a dichotic listening task. The results showed that the bilinguals exceeded the monolinguals in inhibiting task-irrelevant information. Study V introduced a new, complementary research approach to study the bilingual executive advantage and its underlying mechanisms. To circumvent the methodological problems related to a natural-groups design, this approach focuses only on bilinguals and examines whether individual differences in bilingual behavior correlate with executive task performance. Using measures that tap the three above-mentioned executive functions, the results suggested that more frequent language switching was associated with better set shifting skills, and earlier acquisition of the second language was related to better inhibition skills. In conclusion, the present behavioral results showed that computer-based training of executive functions can improve performance on the trained task and on closely related tasks, but does not yield a more general improvement of cognitive skills. Moreover, the functional neuroimaging results revealed that WM training modulates striatal dopaminergic function, pointing to training-induced neural plasticity in this important neurotransmitter system. With regard to bilingualism, the results provide further support for the idea that bilingualism can enhance executive functions. In addition, the new complementary research approach proposed here provides some clues as to which aspects of everyday bilingual behavior may be related to the advantage in executive functions in bilingual individuals.
Abstract:
One challenge in data assimilation (DA) methods is how the error covariance of the model state is computed. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use concepts from control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied, which avoid the memory storage and huge matrix inversions required by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble and variational methods. It avoids the filter inbreeding problems that emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements become available. One advantage of VEnKF over VKF is that it needs neither tangent-linear nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate the 30 171-dimensional model state vector, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating a wrapper code for a model and a DA scheme can be challenging, especially when the two have been designed independently or are poorly documented. In this thesis we present a non-intrusive approach to coupling the model and a DA scheme: an external program is used to exchange information between the model and the DA procedure via files. The advantage of this method is that the changes needed in the model code are minimal, only a few lines that handle input and output. Apart from the simplicity of the coupling, the approach can be employed even if the two are written in different programming languages, because the communication does not happen through code. The non-intrusive approach accommodates parallel computing simply by telling the control program to wait until all the processes have ended before the DA procedure is invoked. It is worth mentioning the overhead introduced by the approach, as both the model and the DA procedure have to be initialized at every assimilation cycle. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in Lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images for seven days between May 16 and July 6, 2009, were available. The effect of the organic matter was computationally eliminated to obtain the TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The VEnKF results were compared with the measurements recorded at an automatic station located in the north-western part of the lake. However, due to the sparsity of the TSM data in both time and space, the two could not be well matched.
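As a rough illustration of the non-intrusive, file-based coupling described above, the following Python sketch shows a control program that launches the model processes, waits until all of them have ended, runs a DA analysis step, and writes the updated states back to files. The file names, the external model command, and the enkf_analysis placeholder are assumptions for illustration, not the thesis code.

```python
# Hypothetical sketch of non-intrusive, file-based coupling between a forecast
# model and a DA procedure: the control program exchanges state through files,
# so the model code itself only needs minimal I/O changes.
# The "./model" command, file names and enkf_analysis() are assumptions.

import subprocess
import numpy as np

def enkf_analysis(ensemble: np.ndarray, obs: np.ndarray) -> np.ndarray:
    """Placeholder for the DA analysis step (e.g. a VEnKF-style update)."""
    return ensemble  # a real implementation would update the ensemble with obs

n_cycles, n_members = 3, 10
for cycle in range(n_cycles):
    # 1. Launch one model run per ensemble member; wait until all have ended.
    procs = [subprocess.Popen(["./model", f"state_{m}.txt"]) for m in range(n_members)]
    for p in procs:
        p.wait()

    # 2. Read the forecast states that the model runs wrote to files.
    ensemble = np.stack([np.loadtxt(f"state_{m}.txt") for m in range(n_members)])

    # 3. Read the observations available at this cycle and run the analysis.
    obs = np.loadtxt(f"obs_{cycle}.txt")
    ensemble = enkf_analysis(ensemble, obs)

    # 4. Write the updated states back, to initialize the next model runs.
    for m in range(n_members):
        np.savetxt(f"state_{m}.txt", ensemble[m])
```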
The use of multiple automatic stations with real-time data is important for alleviating the temporal sparsity problem. Combined with DA, this would help, for instance, in better understanding environmental hazard variables. We found that using a very large ensemble does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance. The successful implementation of the non-intrusive VEnKF, together with this performance limit on ensemble size, points towards the emerging area of Reduced Order Modeling (ROM). To save computational resources, ROM avoids running the full-blown model. When ROM is combined with the non-intrusive DA approach, it might result in a cheaper algorithm that relaxes the computational challenges existing in the field of modelling and DA.