942 results for real-time implementation


Relevance: 100.00%

Abstract:

National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less demanding requirements do not call for exceptional computing power and can be met by a modern desktop system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction, etc.) augmented with above-ground information from satellite images to produce `nowcasts'. The emphasis in this thesis is on the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809 based sub-system. Above-ground information is received from the METEOSAT 4 geostationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produce a time series of measurements at a specific location, representing the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an `area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique so that a cloud feature over the `area of interest' can be classified automatically for nowcasting using the multi-dimensional signals.
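
The ARIMA-based ground-data modelling can be pictured with a short sketch. The following is a minimal illustration only, assuming the statsmodels Python library and an arbitrary ARIMA(1,1,1) order; neither is taken from the thesis, whose implementation targeted a PC with a 6809-based logging sub-system.

    # Minimal illustration of ARIMA-based nowcasting from a site time series.
    # Assumptions: hourly temperature readings in a plain Python list; the
    # ARIMA(1,1,1) order is chosen arbitrarily for the sketch.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    temperature = [12.1, 12.4, 12.9, 13.5, 14.0, 14.2, 14.1, 13.8, 13.2, 12.7,
                   12.3, 12.0, 11.8, 11.9, 12.2, 12.8, 13.4, 13.9, 14.3, 14.4]

    model = ARIMA(np.asarray(temperature), order=(1, 1, 1))
    fitted = model.fit()

    # Nowcast: predict the next three observations from the past-to-present record.
    print(fitted.forecast(steps=3))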

Relevance: 100.00%

Abstract:

Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine tools or machining centres under conditions of limited manpower or unmanned operation. This research work investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control; two main research objectives are fulfilled in particular. The first objective is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters; this includes the definition of the problems associated with the low strength of these tools and the study of the mechanisms of catastrophic failure which manifest themselves well before, and along with, the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and, finally, issuing of alarms and diagnostic messages.
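
The shape of such a monitoring and adaptive-control loop can be sketched as follows. This is purely illustrative: the sensor-reading and feed-rate functions are hypothetical placeholders, and the threshold and back-off rule are not taken from the thesis.

    # Hedged sketch of a thrust/torque monitoring loop with threshold-based
    # feed-rate adaptation. read_thrust_torque() and set_feed_rate() stand in
    # for the machine-tool interface; all limits are illustrative.
    import random, time

    THRUST_LIMIT_N = 300.0        # illustrative safety threshold
    NOMINAL_FEED_MM_MIN = 40.0

    def read_thrust_torque():
        # Placeholder: simulate a noisy thrust (N) and torque (Nm) measurement.
        return 250.0 + random.uniform(-60, 80), 0.8 + random.uniform(-0.2, 0.3)

    def set_feed_rate(feed_mm_min):
        print(f"feed rate set to {feed_mm_min:.1f} mm/min")

    feed = NOMINAL_FEED_MM_MIN
    for _ in range(10):                    # stand-in for the real-time loop
        thrust, torque = read_thrust_torque()
        if thrust > THRUST_LIMIT_N:        # back off before catastrophic failure
            feed = max(feed * 0.5, 5.0)
        else:                              # recover gradually towards nominal feed
            feed = min(feed * 1.1, NOMINAL_FEED_MM_MIN)
        set_feed_rate(feed)
        time.sleep(0.01)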

Relevance: 100.00%

Abstract:

This thesis introduces and develops a novel real-time predictive maintenance system to estimate the machine system parameters using the motion current signature. Recently, motion current signature analysis has been addressed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof-of-concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to simulate the large amount of training data required by the neural network approach. Statistical validation and verification of the model are performed to ascertain confidence in the simulated motion current signature. The validation experiment concludes that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure due to the lack of knowledge regarding the performance of higher-order and nonlinear factors, such as backlash and compliance. The failure of the simulation model to determine the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature. This motivated surrogate data testing for nonlinearity in the motion current signature. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. Outcomes of the experiment show that nonlinear noise reduction combined with the linear reverse algorithm offers precise machine system parameter estimation using the motion current signature for the implementation of the real-time predictive maintenance system. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters.
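
Surrogate data testing of this kind can be illustrated generically. The sketch below is not the thesis' exact procedure: it compares a simple nonlinear statistic on a toy signal against phase-randomised surrogates that preserve the linear spectrum.

    # Generic surrogate-data test for nonlinearity: phase-randomised surrogates
    # keep the linear spectrum, so a nonlinear statistic that falls outside their
    # range suggests nonlinearity in the original signal. All values are toy data.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(2048)
    signal = np.sin(0.02 * t) + 0.3 * np.sin(0.02 * t) ** 2 + 0.1 * rng.standard_normal(t.size)

    def time_asymmetry(x, lag=1):
        d = x[lag:] - x[:-lag]
        return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5   # simple nonlinear statistic

    def phase_randomised_surrogate(x):
        spectrum = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, spectrum.size)
        phases[0] = 0.0                                    # keep the mean component real
        return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size)

    stat = time_asymmetry(signal)
    surrogate_stats = [time_asymmetry(phase_randomised_surrogate(signal)) for _ in range(99)]
    print("original:", stat, " surrogate 95% range:",
          np.percentile(surrogate_stats, [2.5, 97.5]))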

Relevance: 100.00%

Abstract:

We investigate the problem of obtaining a dense reconstruction in real time from a live video stream. In recent years, multi-view stereo (MVS) has received considerable attention and a number of methods have been proposed. However, most methods operate under the assumption of a relatively sparse set of still images as input and unlimited computation time. Video-based MVS has received less attention despite the fact that video sequences offer significant benefits in terms of the usability of MVS systems. In this paper we propose a novel video-based MVS algorithm that is suitable for real-time, interactive 3D modeling with a hand-held camera. The key idea is a per-pixel, probabilistic depth estimation scheme that updates posterior depth distributions with every new frame. The current implementation is capable of updating 15 million distributions per second. We evaluate the proposed method against the state-of-the-art real-time MVS method and show improvement in terms of accuracy. © 2011 Elsevier B.V. All rights reserved.
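
The per-pixel update idea can be sketched generically. The code below is not the paper's parametric model; it simply maintains a discretised depth posterior per pixel and multiplies in a likelihood from each new frame. The candidate depth range and toy likelihoods are invented for illustration.

    # Per-pixel depth as a discretised posterior over candidate depths, updated
    # with a photometric-matching likelihood for every new frame and renormalised.
    import numpy as np

    depth_candidates = np.linspace(0.5, 5.0, 64)          # metres, illustrative range

    def init_posterior(height, width):
        return np.full((height, width, depth_candidates.size),
                       1.0 / depth_candidates.size)

    def update_posterior(posterior, likelihood):
        # likelihood: (H, W, D) matching scores for the newest frame
        posterior *= likelihood
        posterior /= posterior.sum(axis=-1, keepdims=True)
        return posterior

    H, W = 4, 4                                           # tiny toy image
    posterior = init_posterior(H, W)
    rng = np.random.default_rng(1)
    for _ in range(10):                                   # simulate ten video frames
        fake_likelihood = rng.uniform(0.1, 1.0, size=(H, W, depth_candidates.size))
        posterior = update_posterior(posterior, fake_likelihood)

    depth_map = depth_candidates[np.argmax(posterior, axis=-1)]
    print(depth_map)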

Relevance: 100.00%

Abstract:

In-Motes Bins is an agent-based, real-time In-Motes application developed for sensing light and temperature variations in an environment. In-Motes is a mobile agent middleware that facilitates the rapid deployment of adaptive applications in Wireless Sensor Networks (WSNs). In-Motes Bins is based on the injection of mobile agents into the WSN that can migrate or clone following specific rules and that perform application-specific tasks. Using In-Motes we were able to create and rapidly deploy our application on a WSN consisting of 10 MICA2 motes. Our application was tested in a wine store for a period of four months. In this paper we present the In-Motes Bins application and provide a detailed evaluation of its implementation. © 2007 IEEE.
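
The migrate-or-clone rule can be pictured with a deliberately simplified, hypothetical sketch; the class and method names below are invented for illustration and do not reflect the In-Motes middleware API.

    # Hypothetical, simplified picture of the mobile-agent idea: an agent hops
    # between motes sampling light/temperature, and clones itself when a reading
    # crosses a threshold so that neighbouring motes are also monitored.
    class SensingAgent:
        def __init__(self, temperature_threshold=18.0):
            self.threshold = temperature_threshold

        def run(self, mote):
            light, temp = mote.sample()          # mote.sample() is a placeholder
            if temp > self.threshold:
                return "clone_to_neighbours"     # rule: spread monitoring on alarm
            return "migrate_to_next_mote"        # rule: keep patrolling otherwise

    class FakeMote:
        def sample(self):
            return 420, 19.5                     # illustrative light / °C values

    print(SensingAgent().run(FakeMote()))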

Relevance: 100.00%

Abstract:

The purpose of this research is to develop an optimal kernel which would be used in a real-time engineering and communications system. Since the application is a real-time system, relevant real-time issues are studied in conjunction with kernel-related issues. The emphasis of the research is the development of a kernel which would not only adhere to the criteria of a real-time environment, namely determinism and performance, but also provide the flexibility and portability associated with non-real-time environments. The essence of the research is to study how the features found in non-real-time systems could be applied to the real-time system in order to generate an optimal kernel which would provide flexibility and architecture independence while maintaining the performance needed by most engineering applications. Traditionally, the development of real-time kernels has been done in assembly language. By utilizing the powerful constructs of the C language, a real-time kernel was developed which addressed the goals of flexibility and portability while still meeting the real-time criteria. The implementation of the kernel is carried out on 68010/20/30/40 microprocessor-based systems.
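
As a language-agnostic illustration only (the kernel described here is written in C; the Python sketch below merely mirrors the idea of a deterministic, priority-driven ready queue, with hypothetical task names):

    # Deterministic priority scheduling in miniature: the highest-priority ready
    # task is always dispatched first, with insertion order breaking ties.
    import heapq

    class ReadyQueue:
        def __init__(self):
            self._heap = []
            self._seq = 0                         # tie-breaker keeps dispatch deterministic

        def make_ready(self, priority, task_name):
            heapq.heappush(self._heap, (priority, self._seq, task_name))
            self._seq += 1

        def pending(self):
            return bool(self._heap)

        def dispatch(self):
            _, _, task_name = heapq.heappop(self._heap)
            return task_name

    rq = ReadyQueue()
    rq.make_ready(2, "log_telemetry")
    rq.make_ready(0, "handle_interrupt")          # 0 = highest priority
    rq.make_ready(1, "update_display")
    while rq.pending():
        print("dispatching", rq.dispatch())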

Relevance: 100.00%

Abstract:

The use of serious games in education and their pedagogical benefits are being widely recognized. However, effective integration of serious games in education depends on addressing two big challenges: the successful incorporation of motivation and engagement that can lead to learning, and the highly specialised skills associated with customised development to meet the required pedagogical objectives. This paper presents the Westminster Serious Games Platform (wmin-SGP), an authoring tool that allows educators and domain experts without games design and development skills to create bespoke roleplay simulations in three-dimensional scenes featuring fully embodied virtual humans, capable of verbal and non-verbal interaction with users, fit for specific educational objectives. The paper presents the wmin-SGP system architecture and evaluates its effectiveness in fulfilling its purpose via the implementation of two roleplay simulations, one for Politics and one for Law. In addition, it presents the results of two types of evaluation that address how successfully the wmin-SGP combines usability principles and game core drives, based on the Octalysis gamification framework, that lead to motivating game experiences. The evaluation results show that the wmin-SGP: provides an intuitive environment and tools that support users without advanced technical skills in creating, in real time, bespoke roleplay simulations with advanced graphical interfaces; satisfies most of the usability principles; and provides balanced simulations based on the Octalysis framework core drives. The paper concludes with a discussion of future extensions of this real-time authoring tool and directions for further development of the Octalysis framework to address learning.

Relevance: 100.00%

Abstract:

This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for different activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open-source tools to increase impact. A preliminary prototype implementation is described.
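
The paper does not give the configuration format, so the following is a purely hypothetical sketch of what such a specification might capture (input sources, observable run-time information, and animation bindings); every field name is invented.

    # Hypothetical configuration specification driving a code generator; names are
    # invented for illustration and are not taken from the tool described above.
    component_config = {
        "component": "speed_controller",
        "inputs": [                      # how stimulus is provided to the component
            {"port": "target_speed", "source": "test_script"},
            {"port": "wheel_ticks", "source": "recorded_trace.csv"},
        ],
        "observations": [                # run-time information exposed for monitoring
            {"variable": "pwm_duty_cycle", "sample_period_ms": 10},
            {"variable": "controller_state", "on_change": True},
        ],
        "animation": [                   # how observations drive animation tools
            {"observation": "pwm_duty_cycle", "widget": "gauge"},
            {"observation": "controller_state", "widget": "state_machine_view"},
        ],
    }

    # A generator would consume this to emit instrumented glue code, e.g.:
    for obs in component_config["observations"]:
        print(f"// generated probe for {obs['variable']}")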

Relevance: 100.00%

Abstract:

FPGAs and GPUs are often used when real-time performance in video processing is required. An accelerated processor is chosen based on task-specific priorities (power consumption, processing time and detection accuracy), and this decision is normally made once, at design time. All three characteristics are important, particularly in battery-powered systems. Here we propose a method for moving the selection of processing platform from a single design-time choice to a continuous run-time one. We implement Histogram of Oriented Gradients (HOG) detectors for cars and people and Mixture of Gaussians (MoG) motion detectors running across FPGA, GPU and CPU in a heterogeneous system. We use this to detect illegally parked vehicles in urban scenes. Power, time and accuracy information for each detector is characterised. An anomaly measure is assigned to each detected object based on its trajectory and location, compared to learned contextual movement patterns. This drives processor and implementation selection, so that scenes with high behavioural anomalies are processed with faster but more power-hungry implementations, while routine or static time periods are processed with power-optimised, less accurate, slower versions. Real-time performance is evaluated on video datasets including i-LIDS. Compared to power-optimised static selection, automatic dynamic implementation mapping is 10% more accurate but draws 12 W of extra power in our testbed desktop system.
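
A run-time selection policy of this kind can be sketched as follows. The characterisation table and thresholds are invented for illustration and are not the figures reported in the paper.

    # Illustrative run-time implementation selection driven by an anomaly measure.
    IMPLEMENTATIONS = [
        # (name, relative accuracy, frame time in ms, power in W) -- invented values
        ("cpu_hog_low_power", 0.80, 120.0, 15.0),
        ("fpga_hog",          0.88,  40.0, 25.0),
        ("gpu_hog",           0.93,  25.0, 55.0),
    ]

    def select_implementation(anomaly_score, power_budget_w):
        # Prefer the most accurate implementation the power budget allows; relax
        # the budget when the scene looks anomalous so detections are not missed.
        budget = power_budget_w * (2.0 if anomaly_score > 0.7 else 1.0)
        feasible = [impl for impl in IMPLEMENTATIONS if impl[3] <= budget]
        return max(feasible or IMPLEMENTATIONS[:1], key=lambda impl: impl[1])[0]

    print(select_implementation(anomaly_score=0.1, power_budget_w=30))  # quiet scene -> fpga_hog
    print(select_implementation(anomaly_score=0.9, power_budget_w=30))  # anomalous  -> gpu_hog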

Relevance: 100.00%

Abstract:

In recent years, depth cameras have been widely utilized in camera tracking for augmented and mixed reality. Many of the studies focus on methods that generate the reference model simultaneously with the tracking and allow operation in unprepared environments. However, methods that rely on predefined CAD models have their advantages: measurement errors are not accumulated into the model, they are tolerant of inaccurate initialization, and the tracking is always performed directly in the reference model's coordinate system. In this paper, we present a method for tracking a depth camera with existing CAD models and the Iterative Closest Point (ICP) algorithm. In our approach, we render the CAD model using the latest pose estimate and construct a point cloud from the corresponding depth map. We construct another point cloud from the currently captured depth frame, and find the incremental change in the camera pose by aligning the point clouds. We utilize a GPGPU-based implementation of the ICP which efficiently uses all the depth data in the process. The method runs in real time, is robust to outliers, and does not require any preprocessing of the CAD models. We evaluated the approach using the Kinect depth sensor, and compared the results to a 2D edge-based method, to a depth-based SLAM method, and to the ground truth. The results show that the approach is more stable than the edge-based method and suffers less from drift than the depth-based SLAM method.
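
The frame-to-model alignment can be illustrated with a minimal point-to-point ICP in NumPy/SciPy. This is not the paper's GPGPU implementation: the toy clouds and iteration count are invented, and a production version would at least add outlier rejection.

    # Minimal point-to-point ICP: align a captured cloud to a cloud rendered from
    # the CAD model by iterating nearest-neighbour matching and a closed-form fit.
    import numpy as np
    from scipy.spatial import cKDTree

    def best_fit_transform(src, dst):
        # Closed-form rigid transform (Kabsch/SVD) mapping src onto dst.
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # avoid a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    def icp(captured, rendered, iterations=20):
        tree = cKDTree(rendered)
        R_total, t_total = np.eye(3), np.zeros(3)
        current = captured.copy()
        for _ in range(iterations):
            _, idx = tree.query(current)       # nearest rendered (CAD-model) points
            R, t = best_fit_transform(current, rendered[idx])
            current = current @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total                # incremental pose update

    # Toy usage: the estimate maps the captured cloud back onto the model,
    # i.e. the inverse of the transform applied below.
    rng = np.random.default_rng(2)
    model_cloud = rng.uniform(-1, 1, (500, 3))
    angle = np.deg2rad(5)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
    captured_cloud = model_cloud @ R_true.T + np.array([0.05, -0.02, 0.01])
    R_est, t_est = icp(captured_cloud, model_cloud)
    print(np.round(R_est, 3), np.round(t_est, 3))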

Relevance: 100.00%

Abstract:

This analysis estimates several economic benefits derived from national implementation of the National Oceanic and Atmospheric Administration’s Physical Oceanographic Real-Time System (PORTS®) at the 175 largest ports in the United States. Significant benefits were observed owing to: (1) lower commercial marine accident rates and resultant reductions in morbidity, mortality and property damage; (2) reduced pollution remediation costs; and (3) increased productivity associated with the operation of more fully loaded commercial vessels. Evidence also suggested additional benefits from heightened commercial and recreational fish catch and diminished recreational boating accidents. Annual gross benefits from the 58 current PORTS® locations exceeded $217 million, with an additional $83 million possible if installed at the remaining 117 largest ports in the United States. Over the ten-year economic life of PORTS® instruments, the present value of installation at all 175 ports could approach $2.5 billion.
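
The headline present value can be reproduced approximately with a standard annuity calculation; the discount rate in this sketch is an assumption chosen for illustration, not a rate stated in the abstract.

    # Rough reconstruction of the present-value figure as a standard annuity; the
    # 3.5% discount rate is an assumption of this sketch.
    annual_benefit = 217e6 + 83e6        # current sites plus remaining 117 ports, $/yr
    discount_rate = 0.035                # assumed
    years = 10                           # stated economic life of PORTS instruments

    present_value = sum(annual_benefit / (1 + discount_rate) ** y
                        for y in range(1, years + 1))
    print(f"present value over {years} years: ${present_value / 1e9:.2f} billion")
    # prints roughly $2.49 billion, consistent with the ~$2.5 billion in the abstract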

Relevance: 100.00%

Abstract:

Image processing offers unparalleled potential for traffic monitoring and control. For many years engineers have attempted to perfect the art of automatic data abstraction from sequences of video images. This paper outlines a research project undertaken at Napier University by the authors in the field of image processing for automatic traffic analysis. A software-based system implementing TRIP algorithms to count cars and measure vehicle speed has been developed by members of the Transport Engineering Research Unit (TERU) at the University. The TRIP algorithm has been ported to and evaluated on an IBM PC platform with a view to hardware implementation of the pre-processing routines required for vehicle detection. Results show that a software-based traffic counting system is realisable for single-window processing. Because of the high volume of data that must be processed for full frames or multiple lanes, real-time operation is limited, and specific hardware therefore has to be designed. The paper outlines a hardware design for the implementation of inter-frame and background differencing, background updating and shadow removal techniques. Preliminary results showing the processing time and counting accuracy for the routines implemented in software are presented, and a real-time hardware pre-processing architecture is described.
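
The differencing and background-updating steps can be sketched generically; the window size, threshold and learning rate below are illustrative and not values from the paper.

    # Inter-frame and background differencing with a running background update,
    # applied to a single detection window; all parameters are illustrative.
    import numpy as np

    def process_window(frames, threshold=25, alpha=0.05):
        background = frames[0].astype(np.float32)
        previous = frames[0].astype(np.float32)
        detections = []
        for frame in frames[1:]:
            frame = frame.astype(np.float32)
            inter_frame = np.abs(frame - previous) > threshold      # moving pixels
            background_diff = np.abs(frame - background) > threshold
            mask = inter_frame & background_diff                    # candidate vehicle pixels
            detections.append(mask)
            background = (1 - alpha) * background + alpha * frame   # background update
            previous = frame
        return detections

    # Toy usage on random 8-bit "frames" standing in for a single detection window.
    rng = np.random.default_rng(3)
    frames = rng.integers(0, 255, size=(5, 32, 32), dtype=np.uint8)
    masks = process_window(list(frames))
    print([int(m.sum()) for m in masks])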

Relevance: 100.00%

Abstract:

For various reasons, many Algol 68 compilers do not directly implement the parallel processing operations defined in the Revised Algol 68 Report. It is still possible, however, to perform parallel processing, multitasking and simulation, provided that the implementation permits the creation of a master routine for the coordination and initiation of processes under its control. The package described here is intended for real-time applications and runs in conjunction with the Algol 68R system; it extends and develops the original Algol 68RT package, which was designed for use with multiplexers at the Royal Radar Establishment, Malvern. The facilities provided, in addition to the synchronising operations, include an interface to an ICL Communications Processor, enabling the abstract processes to be realised as the interaction of several teletypes or visual display units with a real-time program providing a useful service.

Relevance: 100.00%

Abstract:

Wireless sensor networks (WSNs) are the key enablers of the internet of things (IoT) paradigm. Traditionally, sensor network research has treated such networks as being unlike the internet, motivated by power and device constraints. The IETF 6LoWPAN draft standard changes this, defining how IPv6 packets can be efficiently transmitted over IEEE 802.15.4 radio links. Thanks to 6LoWPAN technology, low-power, low-cost microcontrollers can be connected to the internet, forming what is known as the wireless embedded internet. Another IETF recommendation, CoAP, allows these devices to communicate interactively over the internet. The integration of such tiny, ubiquitous electronic devices with the internet enables interesting real-time applications. This thesis evaluates the performance of a stack consisting of CoAP and 6LoWPAN over the IEEE 802.15.4 radio link using the Contiki OS and the Cooja simulator, along with the CoAP framework Californium (Cf). Ultimately, the implementation of this stack on real hardware is carried out using a Raspberry Pi as a border router, with Tmote Sky sensors acting as SLIP radios and CoAP servers relaying temperature and humidity data. The reliability of the stack was also demonstrated during a scalability analysis conducted on the physical deployment. Interoperability is ensured by connecting the WSN to the global internet using different hardware platforms supported by Contiki, without the use of the specialized gateways commonly found in non-IP-based networks. This work therefore developed and demonstrated a heterogeneous, IP-based wireless sensor network stack and conducted a performance analysis of the stack, both in simulation and on real hardware.
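
The request/response pattern such a deployment supports can be illustrated with a minimal CoAP GET. This sketch uses the Python aiocoap library rather than the Californium framework actually used in the work, and the node address and resource path are hypothetical placeholders.

    # Illustrative CoAP GET against a 6LoWPAN node reachable through a border
    # router; the IPv6 address and /sensors/temperature path are placeholders.
    import asyncio
    from aiocoap import Context, Message, GET

    async def read_temperature():
        protocol = await Context.create_client_context()
        request = Message(code=GET, uri="coap://[2001:db8::1]/sensors/temperature")
        response = await protocol.request(request).response
        print(response.code, response.payload.decode())

    asyncio.run(read_temperature())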

Relevance: 100.00%

Abstract:

Vesiculoviruses (VSV) are zoonotic viruses that cause vesicular stomatitis disease in cattle, horses and pigs, as well as sporadic human cases of acute febrile illness. Therefore, diagnosis of VSV infections by reliable laboratory techniques is important to allow proper case management and the implementation of strategies for the containment of virus spread. We present here a sensitive and reproducible real-time reverse transcriptase polymerase chain reaction (RT-PCR) assay for the detection and quantification of VSV. The assay was evaluated with arthropods and serum samples obtained from horses, cattle and patients with acute febrile disease. The real-time RT-PCR amplified the Piry, Carajas, Alagoas and Indiana vesiculoviruses at a melting temperature of 81.02 ± 0.8°C, and the sensitivity of the assay was estimated at 10 RNA copies/mL for the Piry vesiculovirus. The viral genome was detected in samples from horses and cattle, but not in human sera or arthropods. Thus, this assay allows a preliminary differential diagnosis of VSV infections.