923 results for intelligent tutoring systems, student modelling, individualised feedback, rapid application development, Microsoft Access


Relevance:

100.00%

Publisher:

Abstract:

The paper describes the strategies for Congestion and Incident Management (CIM) on the basis of Automatic Congestion and Incident Detection (ACID) that COSMOS will develop, implement in SCOOT, UTOPIA and MOTION, and validate and demonstrate in London, Piraeus and Torino. Four levels of operation were defined for CIM: strategies, tactics, tools and realisation. The strategies for CIM form the top level of this hierarchy. They have to reflect the strategic requirements of the system operators. The tactics are the means that can be employed by the strategies to achieve particular goals in particular situations. The tools that are used by the tactics relate to the elements of the signal plan and the ways in which they can be modified. Strategies, tactics and tools are generally common to all three systems, while the realisation of individual strategies and tactical decisions, through the use of particular common sets of tools, will generally be system specific. For the covering abstract, see IRRD 490001.
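
The four-level hierarchy described in this abstract lends itself to a simple data model. The sketch below is purely illustrative (it is not part of COSMOS, SCOOT, UTOPIA or MOTION, and all class and field names are invented for the example); it only shows how strategies, tactics, tools and system-specific realisations could be related in code.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Tool:
    """A modifiable element of the signal plan (e.g. stage length, offset)."""
    name: str

@dataclass
class Tactic:
    """A means of achieving a goal in a particular situation, using tools."""
    name: str
    tools: List[Tool] = field(default_factory=list)

@dataclass
class Strategy:
    """Top-level operator requirement, achieved through tactics."""
    name: str
    tactics: List[Tactic] = field(default_factory=list)

@dataclass
class Realisation:
    """System-specific implementation of a common strategy."""
    system: str
    strategy: Strategy

# Example: a congestion-management strategy shared by all systems,
# realised separately for each urban traffic control system.
gating = Strategy("congestion gating",
                  [Tactic("hold upstream traffic",
                          [Tool("stage length"), Tool("offset")])])
realisations = [Realisation(s, gating) for s in ("SCOOT", "UTOPIA", "MOTION")]
```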

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, a new idea challenging the classical self-non-self viewpoint has become popular amongst immunologists. It is called the Danger Theory. In this conceptual paper, we look at this theory from the perspective of Artificial Immune System practitioners. An overview of the Danger Theory is presented with particular emphasis on analogies in the Artificial Immune Systems world. A number of potential application areas are then used to provide a framing for a critical assessment of the concept, and its relevance for Artificial Immune Systems.

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION In recent years computer systems have become increasingly complex and consequently the challenge of protecting these systems has become increasingly difficult. Various techniques have been implemented to counteract the misuse of computer systems in the form of firewalls, antivirus software and intrusion detection systems. The complexity of networks and the dynamic nature of computer systems leave current methods with significant room for improvement. Computer scientists have recently drawn inspiration from mechanisms found in biological systems and, in the context of computer security, have focused on the human immune system (HIS). The human immune system provides an example of a robust, distributed system that provides a high level of protection from constant attacks. By examining the precise mechanisms of the human immune system, it is hoped that this paradigm will improve the performance of real intrusion detection systems. This paper presents an introduction to recent developments in the field of immunology. It discusses the incorporation of a novel immunological paradigm, Danger Theory, and how this concept is inspiring artificial immune systems (AIS). Applications within the context of computer security are outlined, drawing direct reference to the underlying principles of Danger Theory. Finally, the current state of intrusion detection systems is discussed and improvements are suggested.

Relevance:

100.00%

Publisher:

Abstract:

Ultrasonic tomography is a powerful tool for identifying defects within an object or structure. This method can be applied on structures where X-ray tomography is impractical due to size, low contrast, or safety concerns. By taking many ultrasonic pulse velocity (UPV) readings through the object, an image of the internal velocity variations can be constructed. Air-coupled UPV can allow for more automated and rapid collection of data for tomography of concrete. This research aims to integrate recent developments in air-coupled ultrasonic measurements with advanced tomography technology and apply them to concrete structures. First, non-contact and semi-contact sensor systems are developed for making rapid and accurate UPV measurements through PVC and concrete test samples. A customized tomographic reconstruction program is developed to provide full control over the imaging process, including full and reduced spectrum tomographs with percent error and ray density calculations. Finite element models are also used to determine optimal measurement configurations and analysis procedures for efficient data collection and processing. Non-contact UPV is then implemented to image various inclusions within 6 inch (152 mm) PVC and concrete cylinders. Although there is some difficulty in identifying high-velocity inclusions, reconstruction error values were in the range of 1.1-1.7% for PVC and 3.6% for concrete. Based upon the success of those tests, further data are collected using non-contact, semi-contact, and full contact measurements to image 12 inch (305 mm) square concrete cross-sections with 1 inch (25 mm) reinforcing bars and 2 inch (51 mm) square embedded damage regions. Due to higher noise levels in the collected signals, tomographs of these larger specimens show reconstruction error values in the range of 10-18%. Finally, issues related to the application of these techniques to full-scale concrete structures are discussed.
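
The travel-time inversion underlying UPV tomography can be illustrated with a minimal straight-ray example: each measured transit time is treated as the line integral of slowness (1/velocity) along the ray, so accumulating per-pixel ray-path lengths yields a linear system that can be solved in a least-squares sense. The sketch below illustrates that idea only and is not the customized reconstruction program described in the abstract; the grid size, crosshole-style ray geometry and plain least-squares solver are assumptions.

```python
import numpy as np

def ray_lengths(p0, p1, n, cell=1.0, samples=200):
    """Approximate the length of the straight ray p0->p1 inside each cell
    of an n x n grid by dense sampling along the ray."""
    pts = np.linspace(p0, p1, samples)
    seg = np.linalg.norm(np.asarray(p1) - np.asarray(p0)) / (samples - 1)
    row = np.zeros(n * n)
    for x, y in pts:
        i, j = min(int(y // cell), n - 1), min(int(x // cell), n - 1)
        row[i * n + j] += seg
    return row

n = 8                                          # 8 x 8 pixel slowness image
true_slowness = np.full((n, n), 1 / 4000.0)    # background: 4000 m/s (assumed)
true_slowness[3:5, 3:5] = 1 / 2500.0           # low-velocity inclusion (e.g. a void)

# Synthetic source/receiver pairs on opposite faces (crosshole-style coverage).
rays, times = [], []
for sy in np.linspace(0.5, n - 0.5, 10):
    for ry in np.linspace(0.5, n - 0.5, 10):
        row = ray_lengths((0.0, sy), (float(n), ry), n)
        rays.append(row)
        times.append(row @ true_slowness.ravel())   # simulated UPV transit time

A, t = np.array(rays), np.array(times)
slowness, *_ = np.linalg.lstsq(A, t, rcond=None)    # least-squares inversion
velocity = (1.0 / np.clip(slowness, 1e-6, None)).reshape(n, n)  # reconstructed map
```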

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION In recent years computer systems have become increasingly complex and consequently the challenge of protecting these systems has become increasingly difficult. Various techniques have been implemented to counteract the misuse of computer systems in the form of firewalls, antivirus software and intrusion detection systems. The complexity of networks and the dynamic nature of computer systems leave current methods with significant room for improvement. Computer scientists have recently drawn inspiration from mechanisms found in biological systems and, in the context of computer security, have focused on the human immune system (HIS). The human immune system provides an example of a robust, distributed system that provides a high level of protection from constant attacks. By examining the precise mechanisms of the human immune system, it is hoped that this paradigm will improve the performance of real intrusion detection systems. This paper presents an introduction to recent developments in the field of immunology. It discusses the incorporation of a novel immunological paradigm, Danger Theory, and how this concept is inspiring artificial immune systems (AIS). Applications within the context of computer security are outlined, drawing direct reference to the underlying principles of Danger Theory. Finally, the current state of intrusion detection systems is discussed and improvements are suggested.

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, a new idea challenging the classical self-non-self viewpoint has become popular amongst immunologists. It is called the Danger Theory. In this conceptual paper, we look at this theory from the perspective of Artificial Immune System practitioners. An overview of the Danger Theory is presented with particular emphasis on analogies in the Artificial Immune Systems world. A number of potential application areas are then used to provide a framing for a critical assessment of the concept, and its relevance for Artificial Immune Systems.

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, a new idea challenging the classical self-non-self viewpoint has become popular amongst immunologists. It is called the Danger Theory. In this conceptual paper, we look at this theory from the perspective of Artificial Immune System practitioners. An overview of the Danger Theory is presented with particular emphasis on analogies in the Artificial Immune Systems world. A number of potential application areas are then used to provide a framing for a critical assessment of the concept, and its relevance for Artificial Immune Systems. Notes: Uwe Aickelin, Department of Computing, University of Bradford, Bradford, BD7 1DP

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, we present a quantitative approach using probabilistic verification techniques for the analysis of reliability, availability, maintainability, and safety (RAMS) properties of satellite systems. The subject of our research is satellites used in mission-critical industrial applications. Our verification results make a strong case for using probabilistic model checking to support RAMS analysis of satellite systems. This study is intended to build a foundation that helps reliability engineers with a basic background in model checking to apply probabilistic model checking to small satellite systems. We make two major contributions. The first is the application of RAMS analysis to satellite systems. In the past, RAMS analysis has been extensively applied in electrical and electronics engineering. It allows system designers and reliability engineers to predict the likelihood of failures from historical or current operational data. There is high potential for the application of RAMS analysis in space science and engineering; however, there is a lack of standardisation and suitable procedures for the correct study of RAMS characteristics of satellite systems. This thesis considers the promising application of RAMS analysis to satellite design, use, and maintenance, focusing on its system segments. Data collection and verification procedures are discussed, and a number of considerations are also presented on how to predict the probability of failure. Our second contribution is leveraging the power of probabilistic model checking to analyse satellite systems. We present techniques for analysing satellite systems that differ from the more common quantitative approaches based on traditional simulation and testing; these techniques have not been applied in this context before. We present the use of probabilistic techniques via a suite of detailed examples, together with their analysis. Our presentation is incremental in terms of the complexity of the application domains and system models, with a detailed PRISM model of each scenario. We also provide results from practical work, together with a discussion of future improvements.
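
Probabilistic model checking of the kind described here ultimately computes reachability probabilities over a stochastic model. As a rough illustration (not one of the thesis's PRISM models; the states and per-step transition probabilities are invented for the example), the sketch below builds a three-state discrete-time Markov chain for a satellite subsystem and computes the probability of failure within a bounded number of steps, the sort of property PRISM would express as P=? [ F<=N "failed" ].

```python
import numpy as np

# States: 0 = operational, 1 = degraded, 2 = failed (absorbing).
# Per-step transition probabilities are illustrative assumptions only.
P = np.array([
    [0.990, 0.009, 0.001],   # operational -> {operational, degraded, failed}
    [0.000, 0.950, 0.050],   # degraded    -> {degraded, failed}
    [0.000, 0.000, 1.000],   # failed stays failed
])

def prob_failed_within(steps, start=0):
    """Probability that the chain, started in `start`, has reached the
    absorbing failed state within `steps` transitions (bounded reachability)."""
    dist = np.eye(3)[start]
    for _ in range(steps):
        dist = dist @ P
    return dist[2]

# Reliability-style curve: P(failure by t) for a few mission times of interest.
for t in (100, 1000, 10000):
    print(f"P(failure within {t} steps) = {prob_failed_within(t):.4f}")
```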

Relevance:

100.00%

Publisher:

Abstract:

Automation technologies are widely acclaimed to have the potential to significantly reduce energy consumption and energy-related costs in buildings. However, despite the abundance of commercially available technologies, automation in domestic environments keeps meeting with commercial failure. The main reason for this is the development process used to build automation applications, which tends to focus on technical aspects rather than on the needs and limitations of the users. One instance of this problem is the complex, poorly designed home automation front-ends that deter customers from investing in a home automation product. On the other hand, developing a usable and interactive interface is a complicated task for developers because of the multidisciplinary challenges that need to be identified and solved. In this context, the current research work investigates the design problems associated with developing a home automation interface, as well as the existing design solutions applied to these problems. A Qualitative Data Analysis approach was used to collect data from research papers, and an open coding process was used to cluster the findings. From the analysis of the collected data, requirements for designing the interface were derived. A home energy management functionality for a Web-based home automation front-end was developed as a proof of concept, and a user evaluation was used to assess the usability of the interface. The results of the evaluation showed that this holistic approach to interface design improved usability, which increases the chances of commercial success.

Relevance:

100.00%

Publisher:

Abstract:

The development of robots has proven to be a very complex interdisciplinary research field. The predominant procedure over recent decades has been based on the assumption that each robot is a fully personalized project, with hardware and software technologies embedded directly in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to local groups' expertise. Large advances might be achieved, for example, if physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization of all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
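
The central idea of the abstract, dissociating a robot from its parts and the parts from their implementing technologies, can be sketched with ordinary interface abstraction. The example below is not the TORP specification; the interface, part names and behaviour are invented purely to illustrate the decoupling.

```python
from abc import ABC, abstractmethod

class Actuator(ABC):
    """Technology-agnostic contract for a robot part: the robot depends only
    on this interface, never on the concrete hardware behind it."""
    @abstractmethod
    def move_to(self, angle_deg: float) -> None: ...

class DynamixelServo(Actuator):
    """One possible technology realising the Actuator interface."""
    def move_to(self, angle_deg: float) -> None:
        print(f"servo bus command: goal position {angle_deg} deg")

class HydraulicJoint(Actuator):
    """A different technology, interchangeable without touching robot code."""
    def move_to(self, angle_deg: float) -> None:
        print(f"hydraulic valve set for {angle_deg} deg")

class Arm:
    """A robot assembly built against interfaces, so parts can be reused or
    swapped between robots and between technologies."""
    def __init__(self, shoulder: Actuator, elbow: Actuator):
        self.shoulder, self.elbow = shoulder, elbow
    def reach(self, a: float, b: float) -> None:
        self.shoulder.move_to(a)
        self.elbow.move_to(b)

# Same Arm code driving two different hardware technologies.
Arm(DynamixelServo(), HydraulicJoint()).reach(30.0, 45.0)
```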

Relevance:

100.00%

Publisher:

Abstract:

A method for estimating the dimensions of non-delimited free parking areas by using a static surveillance camera is proposed. The proposed method is specially designed to tackle the main challenges of urban scenarios (multiple moving objects, outdoor illumination conditions and occlusions between vehicles) with no training. The core of this work is the temporal analysis of the video frames to detect the occupancy variation of the parking areas. Two techniques are combined: background subtraction using a mixture of Gaussians to detect and track vehicles and the creation of a transience map to detect the parking and leaving of vehicles. The authors demonstrate that the proposed method yields satisfactory estimates on three real scenarios while being a low computational cost solution that can be applied in any kind of parking area covered by a single camera.
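
A minimal sketch of the first of the two combined techniques, background subtraction with a mixture of Gaussians, together with a simplified per-pixel transience counter, is given below. It uses OpenCV's stock MOG2 subtractor rather than the authors' implementation, and the video file name, foreground handling and parking threshold are assumptions made for the example.

```python
import cv2
import numpy as np

# Mixture-of-Gaussians background subtraction to flag moving vehicles.
cap = cv2.VideoCapture("parking_lot.mp4")          # hypothetical input video
mog = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                          detectShadows=True)

transience = None   # per-pixel count of consecutive frames flagged as foreground

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = mog.apply(frame)                 # 255 = foreground, 127 = shadow, 0 = background
    fg = (fg == 255).astype(np.uint8)     # drop the shadow label

    if transience is None:
        transience = np.zeros(fg.shape, dtype=np.int32)
    # Pixels that stay "foreground" for a long time correspond to objects that
    # moved in and then stopped, i.e. vehicles that have just parked.
    transience = np.where(fg == 1, transience + 1, 0)

# Assumed threshold (~10 s at 30 fps) marking pixels occupied by a parked vehicle.
parked_mask = transience > 300 if transience is not None else None
cap.release()
```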

Relevance:

100.00%

Publisher:

Abstract:

Over recent decades, remote sensing has emerged as an effective tool for improving agricultural productivity. In particular, many works have dealt with the problem of identifying characteristics or phenomena of crops and orchards on different scales using remotely sensed images. Since natural processes are scale dependent and most of them are hierarchically structured, the determination of optimal study scales is mandatory in understanding these processes and their interactions. The concept of multi-scale/multi-resolution inherent to OBIA methodologies allows the scale problem to be dealt with, but for that, multi-scale and hierarchical segmentation algorithms are required. The question that remains unsolved is to determine the suitable segmentation scale that allows different objects and phenomena to be characterized in a single image. In this work, an adaptation of the Simple Linear Iterative Clustering (SLIC) algorithm to perform a multi-scale hierarchical segmentation of satellite images is proposed. The selection of the optimal multi-scale segmentation for different regions of the image is carried out by evaluating the intra-variability and inter-heterogeneity of the regions obtained on each scale with respect to the parent regions defined by the coarsest scale. To achieve this goal, an objective function that combines weighted variance and the global Moran index has been used. Two different kinds of experiment have been carried out, generating the number of regions on each scale through linear and dyadic approaches. This methodology has allowed, on the one hand, the detection of objects on different scales and, on the other hand, their representation all in a single image. Altogether, the procedure provides the user with a better comprehension of the land cover, the objects on it and the phenomena occurring.
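
A rough sketch of the multi-scale evaluation idea is given below, using scikit-image's stock SLIC rather than the adapted algorithm proposed in the work. It segments an image at a dyadic sequence of scales and reports, per scale, an area-weighted intra-segment variance and a global Moran's I computed on the segment-mean image; the weighting that combines the two terms into the paper's objective function, and the per-region selection against parent regions, are omitted. The test image and parameter values are assumptions.

```python
import numpy as np
from skimage import data
from skimage.segmentation import slic

def morans_i(x):
    """Global Moran's I of a 2-D array with 4-neighbour spatial weights."""
    z = x - x.mean()
    num, W = 0.0, 0
    for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
        zs = np.roll(z, shift, axis=axis)
        valid = np.ones_like(z, dtype=bool)
        if axis == 0:                      # mask rows/cols wrapped by np.roll
            valid[0 if shift == 1 else -1, :] = False
        else:
            valid[:, 0 if shift == 1 else -1] = False
        num += (z * zs)[valid].sum()
        W += valid.sum()
    return (z.size / W) * num / (z ** 2).sum()

image = data.astronaut()                   # stand-in for a satellite scene
gray = image.mean(axis=2)

# Dyadic sequence of segment counts: each scale doubles the number of regions.
for n_segments in (64, 128, 256, 512):
    labels = slic(image, n_segments=n_segments, compactness=10, start_label=1)
    # Per-pixel image of segment means, used for both evaluation terms.
    means = np.zeros_like(gray)
    for lab in np.unique(labels):
        means[labels == lab] = gray[labels == lab].mean()
    wvar = np.mean((gray - means) ** 2)    # area-weighted intra-segment variance
    moran = morans_i(means)                # spatial autocorrelation of segment means
    print(f"{n_segments:4d} segments: weighted variance={wvar:8.2f}, Moran's I={moran:.3f}")
```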

Relevance:

100.00%

Publisher:

Abstract:

Traffic subarea division is vital for traffic system management and traffic network analysis in intelligent transportation systems (ITSs). Since existing methods may not be suitable for big traffic data processing, this paper presents a MapReduce-based Parallel Three-Phase K-Means (Par3PKM) algorithm for solving the traffic subarea division problem on a widely adopted Hadoop distributed computing platform. Specifically, we first modify the distance metric and initialization strategy of K-Means and then employ a MapReduce paradigm to redesign the optimized K-Means algorithm for parallel clustering of large-scale taxi trajectories. Moreover, we propose a boundary identifying method to connect the borders of the clustering results for each cluster. Finally, we divide the traffic subareas of Beijing using the proposed approach, based on real-world trajectory data sets generated by 12,000 taxis over a period of one month. Experimental evaluation results indicate that, compared with K-Means, Par2PK-Means, and ParCLARA, Par3PKM achieves higher efficiency, greater accuracy, and better scalability, and can effectively divide traffic subareas from big taxi trajectory data.
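
The map/reduce decomposition of K-Means that Par3PKM builds on can be illustrated with a small stand-in: mappers assign points to the nearest centroid and emit per-centroid partial sums and counts, and a reducer merges them to update the centroids. The sketch below is not the Par3PKM implementation (it omits the modified distance metric, initialization strategy and boundary identification, and uses Python multiprocessing in place of Hadoop); the data, cluster count and iteration count are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def map_assign(args):
    """Map step: assign each point in a chunk to its nearest centroid and emit
    per-centroid partial sums and counts."""
    points, centroids = args
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    k, dim = centroids.shape
    sums, counts = np.zeros((k, dim)), np.zeros(k, dtype=int)
    for lab in range(k):
        mask = labels == lab
        sums[lab] = points[mask].sum(axis=0)
        counts[lab] = mask.sum()
    return sums, counts

def reduce_update(partials, centroids):
    """Reduce step: merge partial sums/counts from all mappers and recompute centroids."""
    sums = sum(p[0] for p in partials)
    counts = sum(p[1] for p in partials)
    new = centroids.copy()
    nonempty = counts > 0
    new[nonempty] = sums[nonempty] / counts[nonempty, None]
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    points = rng.normal(size=(100_000, 2))   # stand-in for GPS trajectory points
    k, n_workers = 5, 4
    centroids = points[rng.choice(len(points), k, replace=False)]
    chunks = np.array_split(points, n_workers)
    with Pool(n_workers) as pool:
        for _ in range(20):                   # fixed number of Lloyd iterations
            partials = pool.map(map_assign, [(c, centroids) for c in chunks])
            centroids = reduce_update(partials, centroids)
    print(centroids)
```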

Relevance:

100.00%

Publisher:

Abstract:

Accurate and timely traffic flow prediction is crucial to proactive traffic management and control in data-driven intelligent transportation systems (D2ITS), and has attracted great research interest in the last few years. In this paper, we propose a Spatial-Temporal Weighted K-Nearest Neighbor model, named STW-KNN, in a general MapReduce framework of distributed modeling on a Hadoop platform, to enhance the accuracy and efficiency of short-term traffic flow forecasting. More specifically, STW-KNN considers the spatial-temporal correlation and weight of traffic flow with trend adjustment features, to optimize the search mechanisms comprising the state vector, proximity measure, prediction function, and K selection. Furthermore, STW-KNN is implemented on a widely adopted Hadoop distributed computing platform with the MapReduce parallel processing paradigm, for parallel prediction of traffic flow in real time. Finally, with extensive experiments on real-world big taxi trajectory data, STW-KNN is compared with state-of-the-art prediction models including conventional K-Nearest Neighbor (KNN), Artificial Neural Networks (ANNs), Naïve Bayes (NB), Random Forest (RF), and C4.5. The results demonstrate that the proposed model is superior to existing models in accuracy, decreasing the mean absolute percentage error (MAPE) by more than 11.9% in the time domain alone and even achieving an 89.71% accuracy improvement with MAPEs between 4% and 6.5% in both the space and time domains, while also significantly improving the efficiency and scalability of short-term traffic flow forecasting over existing approaches.
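
A heavily simplified illustration of the weighted-KNN forecasting idea is given below; it is not the STW-KNN search mechanism (the spatial dimension, trend adjustment and MapReduce parallelisation are omitted), and the synthetic flow series, state-vector length and exponential distance weighting are assumptions.

```python
import numpy as np

def knn_forecast(history, state, k=5, lags=4):
    """Predict the next traffic-flow value from a state vector of the last
    `lags` observations, by matching it against historical state vectors and
    averaging their successors with distance-decayed weights."""
    # Build historical (state vector, next value) pairs.
    X = np.array([history[i:i + lags] for i in range(len(history) - lags)])
    y = history[lags:]
    d = np.linalg.norm(X - state, axis=1)        # proximity measure
    nn = np.argsort(d)[:k]                       # K nearest neighbours
    w = np.exp(-d[nn])                           # closer neighbours weigh more
    return float(np.dot(w, y[nn]) / w.sum())     # weighted prediction function

# Synthetic flow series with a daily cycle (96 intervals of 15 min per day).
t = np.arange(96 * 30)
flow = 300 + 200 * np.sin(2 * np.pi * t / 96) \
       + np.random.default_rng(1).normal(0, 20, t.size)

state = flow[-4:]                                # most recent 4 intervals
print("next-interval forecast:", knn_forecast(flow[:-4], state, k=10))
```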

Relevance:

100.00%

Publisher:

Abstract:

Problem Statement: This research aims to understand the contribution of traditional toys as catalysts for motivation and student commitment in the development of Technological Education projects and activities. Research Questions: To what extent do work units related to traditional toys promote student motivation and commitment in the subject of Technological Education? Purpose of Study: Technological Education requires students to gain knowledge and know-how, so motivation and commitment are crucial for the development of classroom projects and activities. It is in this context that traditional toys are assumed to be catalysts for motivation and student interest. Research Methods: Exploratory qualitative research was carried out, based on semi-structured interviews with teachers and students in a 2nd cycle of Basic Education setting, encompassing five state schools in the Viseu municipality, Portugal. Nine teachers and forty-five Technological Education pupils, aged between 10 and 12 and attending the 5th and 6th years of schooling, participated. Findings: Content analysis of the answers revealed that implementing work units involving the construction of traditional toys is conducive to student motivation and commitment. Starting from an initial idea, pupils experience all the stages of toy building, from conception to completion, contributing to greater student satisfaction in the teaching-learning process. Conclusions: Traditional toys constitute an added value in the subject of Technological Education, promoting student motivation and commitment in the development of projects and activities. Students acquire knowledge and skills that will enable them to analyze and thus resolve specific situations and prepare them for an increasingly technological world.