10 results for area-based matching
in Greenwich Academic Literature Archive - UK
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction lies in its high compression ratio and simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines become one way forward. In this study we partition the matching search, which accounts for the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm achieves a high speedup in these distributed environments.
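The partitioning described above can be sketched with a standard worker pool. This is a minimal illustration, not the paper's DCOM/.NET Remoting implementation: the names (`best_match`, `parallel_match`) and the sum-of-squared-differences criterion are assumptions for the sake of the example.

```python
from concurrent.futures import ThreadPoolExecutor

def best_match(range_block, domain_blocks):
    # Hypothetical matching kernel: pick the index of the domain block
    # with the smallest sum of squared differences to the range block.
    def sse(d):
        return sum((a - b) ** 2 for a, b in zip(range_block, d))
    return min(range(len(domain_blocks)), key=lambda i: sse(domain_blocks[i]))

def parallel_match(range_blocks, domain_blocks, workers=4):
    # Partition the matching search into one independent task per range
    # block, mirroring the abstract's distribution over loosely coupled PCs.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: best_match(r, domain_blocks), range_blocks))
```

In the paper's setting each task would be shipped to a remote PC rather than a local thread, but the decomposition into independent per-range-block searches is the same.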
Abstract:
This paper introduces a mechanism for representing and recognizing case history patterns with rich internal temporal aspects. A case history is characterized as a collection of elemental cases, as in conventional case-based reasoning systems, together with the corresponding temporal constraints, which can be relative and/or absolute. A graphical representation for case histories is proposed in the form of a directed, partially weighted and labeled simple graph. In terms of this graphical representation, an eigen-decomposition graph matching algorithm is proposed for recognizing case history patterns.
Abstract:
In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, namely the node-similarity graph matching framework (NSGM framework), is proposed, from which many existing graph matching algorithms can be subsumed, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs and authorities method of Kleinberg, and the Kronecker product successive projection methods of Wyk. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. It is pointed out that, in general, any algorithm subsumed by the NSGM framework fails to work well for graphs with non-trivial auto-isomorphism structure.
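As an illustration of the node-similarity family the framework subsumes, here is a minimal pure-Python sketch of one such fixed-point iteration (in the style of Blondel-type similarity scores, not the exact NSGM formulation; the function names and the normalisation choice are assumptions):

```python
def matmul(X, Y):
    # Naive dense matrix product, sufficient for small example graphs.
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

def node_similarity(A, B, iters=10):
    # Node-similarity iteration: S <- B S A^T + B^T S A, renormalised
    # each step; S[i][j] scores how similar node i of B is to node j of A.
    nA, nB = len(A), len(B)
    S = [[1.0] * nA for _ in range(nB)]
    for _ in range(iters):
        t1 = matmul(matmul(B, S), transpose(A))
        t2 = matmul(matmul(transpose(B), S), A)
        S = [[x + y for x, y in zip(r1, r2)] for r1, r2 in zip(t1, t2)]
        norm = sum(v * v for row in S for v in row) ** 0.5
        S = [[v / norm for v in row] for row in S]
    return S
```

For two copies of the directed edge 0→1, the iteration correctly scores source against source and sink against sink, with zero cross-similarity.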
Abstract:
This paper presents a framework for Historical Case-Based Reasoning (HCBR) which allows the expression of both relative and absolute temporal knowledge, representing case histories in the real world. The formalism is founded on a general temporal theory that accommodates both points and intervals as primitive time elements. A case history is formally defined as a collection of (time-independent) elemental cases, together with its corresponding temporal reference. Case history matching is two-fold, i.e., two similarity values need to be computed: the non-temporal similarity degree and the temporal similarity degree. On the one hand, based on elemental case matching, the non-temporal similarity degree between case histories is defined by computing the unions and intersections of the elemental cases involved. On the other hand, by means of the graphical representation of temporal references, the temporal similarity degree in case history matching is transformed into conventional graph similarity measurement.
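A minimal sketch of the union/intersection computation for the non-temporal similarity degree, assuming elemental cases can be treated as set members (a Jaccard-style reading of the abstract; the function name is hypothetical):

```python
def nontemporal_similarity(history_a, history_b):
    # One plausible reading of the union/intersection measure:
    # |intersection| / |union| over the sets of elemental cases.
    a, b = set(history_a), set(history_b)
    return len(a & b) / len(a | b) if a | b else 1.0
```

The temporal similarity degree would be computed separately, via graph similarity over the temporal references.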
Abstract:
Parallel processing techniques have been used in the past to provide high-performance computing resources for activities such as fire-field modelling. This has traditionally been achieved using specialized hardware and software, the expense of which would be difficult to justify for many fire engineering practices. In this article we demonstrate how typical office-based PCs attached to a local area network have the potential to offer the benefits of parallel processing with minimal costs associated with the purchase of additional hardware or software. It was found that good speedups could be achieved on homogeneous networks of PCs: for example, a problem composed of ~100,000 cells would run 9.3 times faster on a network of twelve 800 MHz PCs than on a single 800 MHz PC. It was also found that a network of eight 3.2 GHz Pentium 4 PCs would run 7.04 times faster than a single 3.2 GHz Pentium computer. A dynamic load balancing scheme was also devised to allow the effective use of the software on heterogeneous PC networks. This scheme also ensured that the impact of the parallel processing task on other computer users on the network was minimized.
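The reported figures correspond to the usual speedup and efficiency definitions: 9.3× on twelve PCs is a parallel efficiency of 9.3/12 ≈ 0.775, and 7.04× on eight PCs is 0.88. A trivial sketch (helper names are illustrative):

```python
def speedup(t_serial, t_parallel):
    # Speedup = serial runtime / parallel runtime.
    return t_serial / t_parallel

def efficiency(speedup_value, n_procs):
    # Parallel efficiency: fraction of ideal linear speedup achieved.
    return speedup_value / n_procs
```

Efficiencies below 1.0, as here, reflect communication and load-imbalance overheads on the network.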
Abstract:
A hotly debated issue in the area of aviation safety is the number of cabin crew members required to evacuate an aircraft in the event of an emergency. Most countries regulate the minimum number required for the safe operation of an aircraft, but these rulings are based on little if any scientific evidence. Another issue of concern is the failure rate of exits and slides. This paper examines these issues using the latest version of the Aircraft Accident Statistics and Knowledge database (AASK V4.0), which contains information from 105 survivable crashes and more than 2,000 survivors, including accounts from 155 cabin crew members.
Abstract:
This paper examines different ways of measuring similarity between software design models for Case-Based Reasoning (CBR) to facilitate reuse of software design and code. The paper considers structural and behavioural aspects of similarity between software design models. Similarity metrics for comparing static class structures are defined and discussed. A graph representation of UML class diagrams and corresponding similarity measures for UML class diagrams are defined. A full-search graph matching algorithm for measuring structural similarity between diagrams, based on the identification of the Maximum Common Sub-graph (MCS), is presented. Finally, a simple evaluation of the approach is presented and discussed.
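A brute-force sketch of MCS-based structural similarity for very small class graphs, assuming graphs are given as node lists and directed edge lists (the exhaustive search over node mappings and the normalisation by the larger edge set are illustrative assumptions, not the paper's algorithm):

```python
from itertools import permutations

def mcs_edge_count(edges_a, nodes_a, edges_b, nodes_b):
    # Brute-force common-subgraph search: try every injective mapping
    # from the smaller node set into the larger one and count the edges
    # the mapping preserves. Exponential; only viable for tiny graphs.
    if len(nodes_a) > len(nodes_b):
        nodes_a, nodes_b = nodes_b, nodes_a
        edges_a, edges_b = edges_b, edges_a
    eb = set(edges_b)
    best = 0
    for perm in permutations(nodes_b, len(nodes_a)):
        m = dict(zip(nodes_a, perm))
        best = max(best, sum((m[u], m[v]) in eb for u, v in edges_a))
    return best

def structural_similarity(edges_a, nodes_a, edges_b, nodes_b):
    # Normalise the common edge count by the larger graph's edge count,
    # a common MCS-based similarity metric.
    denom = max(len(edges_a), len(edges_b))
    if denom == 0:
        return 1.0
    return mcs_edge_count(edges_a, nodes_a, edges_b, nodes_b) / denom
```

A production implementation would use a proper MCS algorithm (e.g. clique detection on the modular product graph) rather than exhaustive enumeration.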
Abstract:
The stencil printing process is an important process in the assembly of Surface Mount Technology (SMT) devices. There is wide agreement in the industry that the paste printing process accounts for the majority of assembly defects. Experience with this process has shown that typically over 60% of all soldering defects are due to problems associated with the flow properties of solder pastes. Therefore, rheological measurements can be used as a tool to study the deformation or flow experienced by the pastes during the stencil printing process. This paper presents results on the thixotropic behaviour of three pastes: lead-based solder paste, lead-free solder paste and isotropic conductive adhesive (ICA). These materials are widely used as interconnect media in the electronics industry. Solder pastes are metal alloys suspended in a flux medium, while ICAs consist of silver flakes dispersed in an epoxy resin. The thixotropic behaviour was investigated through two rheological tests: (i) a hysteresis loop test and (ii) a steady shear rate test. In the hysteresis loop test, the shear rate was increased from 0.001 to 100 s⁻¹ and then decreased from 100 to 0.001 s⁻¹. In the steady shear rate test, the materials were subjected to constant shear rates of 0.100, 100 and 0.001 s⁻¹ for a period of 240 seconds. All the pastes showed a high degree of shear-thinning behaviour with time. This may be due to the agglomeration of particles in the flux or epoxy resin, which inhibits paste flow at low shear rates; high shear breaks the agglomerates into smaller pieces, facilitating paste flow, so viscosity is reduced at high shear rates. The solder pastes exhibited a higher degree of structural breakdown than the ICAs. The area between the up curve and down curve of the hysteresis loop is an indication of the thixotropic behaviour of the pastes.
Among the three pastes, the lead-free solder paste showed the largest area between the down curve and up curve, indicating the largest structural breakdown, followed by the lead-based solder paste and the ICA. In the steady shear rate test, the viscosity of the ICA showed the best recovery, with the steepest curve back towards its original viscosity after the removal of shear, indicating that the dispersion quality of the ICA is good because high shear has little effect on its microstructure. In contrast, the lead-based paste showed the poorest recovery, meaning that this paste undergoes greater structural breakdown and its dispersion quality is poor, because its microstructure is easily disrupted by high shear. The structural breakdown during the application of shear, and the recovery after its removal, are important characteristics in the paste printing process: if the paste's viscosity drops low enough it may aid aperture filling, while quick recovery may prevent slumping.
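The loop-area measure of thixotropy described above can be computed by integrating the up and down curves, e.g. with the trapezoidal rule (a sketch; the function names and the stress-versus-shear-rate data layout are assumptions about how the measurements are stored):

```python
def trapz(xs, ys):
    # Trapezoidal integration of y over x.
    return sum((ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i]) / 2
               for i in range(len(xs) - 1))

def thixotropy_index(shear_rates, stress_up, stress_down):
    # Area enclosed between the up and down curves of the hysteresis
    # loop; a larger area indicates greater structural breakdown.
    return trapz(shear_rates, stress_up) - trapz(shear_rates, stress_down)
```

By this index, the abstract's ordering would be lead-free solder paste > lead-based solder paste > ICA.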
Abstract:
In this paper, we explore the application of cooperative communications in ultra-wideband (UWB) wireless body area networks (BANs), where a group of on-body devices may collaborate to communicate with other groups of on-body equipment. Firstly, time-domain UWB channel measurements are presented to characterize the body-centric multipath channel and to facilitate the diversity analysis in a cooperative BAN (CoBAN). We focus on the system deployment scenario in which the human subject is in a sitting posture. Important channel parameters such as the path loss, power variation, power delay profile (PDP), and effective received power (ERP) cross-correlation are investigated and statistically analyzed. Given these model preliminaries, a detailed analysis of the diversity level in a CoBAN is provided. Specifically, an intuitive measure is proposed to quantify the diversity gain in a single-hop cooperative network, defined as the number of independent multipaths that can be averaged over to detect symbols. As this measure gives the largest number of redundant copies of the transmitted information available through the body-centric channel, it can be used as a benchmark to assess the performance bounds of various diversity-based cooperative schemes in future body sensor systems.
Abstract:
In terms of a general time theory which addresses time elements as typed point-based intervals, a formal characterization of time series and state sequences is introduced. Based on this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance to inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for large time-series databases and real-life applications such as Content-Based Video Retrieval (CBVR).
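The maximum-cardinality step of the bipartite matching that the subsequence problem is transformed into can be sketched with Kuhn's augmenting-path algorithm (a generic illustration, not the paper's hybrid similarity model; `adj[i]` is assumed to list the candidate states whose similarity to left-hand state `i` exceeds some threshold):

```python
def max_bipartite_matching(adj, n_right):
    # Kuhn's augmenting-path algorithm for maximum-cardinality bipartite
    # matching. adj[i] lists the right-hand states that left-hand state i
    # may be matched to; returns the size of a maximum matching.
    match_r = [-1] * n_right  # match_r[j] = left node matched to j, or -1

    def try_augment(i, seen):
        for j in adj[i]:
            if j not in seen:
                seen.add(j)
                # Either j is free, or its current partner can be rerouted.
                if match_r[j] == -1 or try_augment(match_r[j], seen):
                    match_r[j] = i
                    return True
        return False

    return sum(try_augment(i, set()) for i in range(len(adj)))
```

The matching size (optionally weighted by the per-pair similarities) then feeds the overall similarity score between the two sequences.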