71 results for Mobile and ubiquitous computing
Abstract:
Since its introduction in 1993, the Message Passing Interface (MPI) has become a de facto standard for writing High Performance Computing (HPC) applications on clusters and Massively Parallel Processors (MPPs). The recent emergence of multi-core processor systems presents a new challenge for established parallel programming paradigms, including those based on MPI. This paper presents a new Java messaging system called MPJ Express. Using this system, we exploit multiple levels of parallelism (messaging and threading) to improve application performance on multi-core processors. We refer to our approach as nested parallelism. This MPI-like Java library can support nested parallelism by using Java or Java OpenMP (JOMP) threads within an MPJ Express process. The practicality of this approach is assessed by porting Gadget-2, a massively parallel structure-formation code from cosmology, to Java. We introduce nested parallelism in the Java version of the simulation code and report good speed-ups. To the best of our knowledge, this is the first time this kind of hybrid parallelism has been demonstrated in a high-performance Java application. (C) 2009 Elsevier Inc. All rights reserved.
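As an illustration of the nested-parallelism pattern described above, the following minimal sketch (not the Gadget-2 port itself) combines MPJ Express message passing between processes with plain Java threads inside each process. The mpi.* calls follow the mpiJava 1.2-style API that MPJ Express implements; the array size and thread count are arbitrary choices for the example.

```java
import mpi.*;

// Minimal sketch of nested parallelism: messaging between MPJ Express
// processes (level 1) and Java threading within each process (level 2).
public class NestedSum {
    static final int THREADS = 4;   // threads per multi-core node (arbitrary)

    public static void main(String[] args) throws Exception {
        MPI.Init(args);
        int rank = MPI.COMM_WORLD.Rank();
        int size = MPI.COMM_WORLD.Size();

        // Each process owns one slice of the global data.
        double[] local = new double[1_000_000];
        java.util.Arrays.fill(local, rank + 1.0);

        // Level 2: partition the local slice among plain Java threads.
        double[] partial = new double[THREADS];
        Thread[] pool = new Thread[THREADS];
        int chunk = local.length / THREADS;
        for (int t = 0; t < THREADS; t++) {
            final int id = t;
            pool[t] = new Thread(() -> {
                int lo = id * chunk;
                int hi = (id == THREADS - 1) ? local.length : lo + chunk;
                double s = 0.0;
                for (int i = lo; i < hi; i++) s += local[i];
                partial[id] = s;
            });
            pool[t].start();
        }
        for (Thread th : pool) th.join();

        double localSum = 0.0;
        for (double s : partial) localSum += s;

        // Level 1: combine the per-process sums across the cluster.
        double[] send = { localSum };
        double[] recv = new double[1];
        MPI.COMM_WORLD.Allreduce(send, 0, recv, 0, 1, MPI.DOUBLE, MPI.SUM);

        if (rank == 0)
            System.out.println("global sum = " + recv[0] + " from " + size + " processes");
        MPI.Finalize();
    }
}
```

The same structure applies when JOMP threads replace the hand-rolled thread pool: the messaging level is unchanged, and only the intra-process loop parallelism is expressed differently.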
Abstract:
This article describes work undertaken by the VERA project to investigate how archaeologists work with information technology (IT) on excavation sites. We used a diary study to research the usual patterns of behaviour of archaeologists digging the Silchester Roman town site during the summer of 2007. Although recording had previously been undertaken using pen and paper, during the 2007 season a part of the dig was dedicated to trials of IT, and archaeologists used digital pens and paper and Nokia N800 handheld PDAs to record their work. The goal of the trial was to see whether it was possible to record data from the dig whilst still on site, rather than waiting until after the excavation to enter it into the Integrated Archaeological Database (IADB), and to determine whether the archaeologists found the new technology helpful. The digital pens were a success; the N800s, however, were not, given the extreme conditions on site. Our findings confirmed that it was important that technology should fit in well with the work being undertaken, rather than being used for its own sake, and should respect established workflows. We also found that the quality of the data being entered was a recurrent concern, as was the reliability of the infrastructure and equipment.
Abstract:
Fully connected cubic networks (FCCNs) are a class of newly proposed hierarchical interconnection networks for multicomputer systems, which enjoy the strengths of constant node degree and good expandability. Shortest-path routing in FCCNs is an open problem. In this paper, we present an oblivious routing algorithm for an n-level FCCN with N = 8^n nodes, and prove that this algorithm creates a shortest path from the source to the destination. At the cost of an O(N)-parallel-step off-line preprocessing phase and a list of size N stored at each node, the proposed algorithm is carried out at each related node in O(n) time. In some cases the proposed algorithm is superior to the one proposed by Chang and Wang in terms of the length of the routing path. This justifies the utility of our routing strategy. (C) 2006 Elsevier Inc. All rights reserved.
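The abstract specifies only the cost profile of the scheme (an off-line preprocessing phase plus a size-N list per node), so the sketch below shows the generic shape of such table-driven routing rather than the authors' actual FCCN construction; the table-filling step here is a placeholder, since the real algorithm derives shortest-path neighbours from the 8^n hierarchy.

```java
// Generic shape of table-driven oblivious routing: preprocessing leaves each
// node with a size-N next-hop list, after which forwarding a packet costs one
// lookup at every node along the path. Illustration only; the FCCN-specific
// shortest-path table construction is given in the paper.
public class TableDrivenRouter {
    private final int self;
    private final int[] nextHop;   // the size-N list stored at this node

    TableDrivenRouter(int self, int n) {
        int N = 1 << (3 * n);      // N = 8^n nodes in an n-level FCCN
        this.self = self;
        this.nextHop = new int[N];
        for (int d = 0; d < N; d++)
            nextHop[d] = d;        // placeholder: real preprocessing stores the
                                   // neighbour on a shortest path toward d
    }

    int forward(int destination) {
        return (destination == self) ? self : nextHop[destination];
    }

    public static void main(String[] args) {
        TableDrivenRouter node5 = new TableDrivenRouter(5, 2); // 2 levels, N = 64
        System.out.println("next hop from node 5 toward node 20: " + node5.forward(20));
    }
}
```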
Abstract:
The NERC UK SOLAS-funded Reactive Halogens in the Marine Boundary Layer (RHaMBLe) programme comprised three field experiments. This manuscript presents an overview of the measurements made within the two simultaneous remote experiments conducted in the tropical North Atlantic in May and June 2007. Measurements were made from two mobile platforms and one ground-based platform. The heavily instrumented cruise D319 on the RRS Discovery from Lisbon, Portugal to São Vicente, Cape Verde and back to Falmouth, UK was used to characterise the spatial distribution of boundary layer components likely to play a role in reactive halogen chemistry. Measurements onboard the ARSF Dornier aircraft were used to allow the observations to be interpreted in the context of their vertical distribution and to confirm the interpretation of atmospheric structure in the vicinity of the Cape Verde islands. Long-term ground-based measurements at the Cape Verde Atmospheric Observatory (CVAO) on São Vicente were supplemented by long-term measurements of reactive halogen species and characterisation of additional trace gas and aerosol species during the intensive experimental period. This paper presents a summary of the measurements made within the RHaMBLe remote experiments and discusses them in their meteorological and chemical context as determined from these three platforms and from additional meteorological analyses. Air always arrived at the CVAO from the north-east, with a range of air mass origins (European, Atlantic and North American continental). Trace gases were present at stable and fairly low concentrations, with the exception of a slight increase in some anthropogenic components in air of North American origin, though NOx mixing ratios during this period remained below 20 pptv (note the non-IUPAC adoption in this manuscript of pptv and ppbv, equivalent to pmol mol⁻¹ and nmol mol⁻¹, to reflect common practice). Consistency with these air mass classifications is observed in the time series of soluble gas and aerosol composition measurements, with additional identification of periods of slightly elevated dust concentrations consistent with the trajectories passing over the African continent. The CVAO is shown to be broadly representative of the wider North Atlantic marine boundary layer; measurements of NO, O3 and black carbon from the ship are consistent with a clean Northern Hemisphere marine background. Aerosol composition measurements do not indicate elevated organic material associated with clean marine air. Closer to the African coast, black carbon and NO levels start to increase, indicating greater anthropogenic influence. Lower ozone in this region is possibly associated with the increased levels of measured halocarbons linked to the nutrient-rich waters of the Mauritanian upwelling. Bromide and chloride deficits in coarse-mode aerosol at both the CVAO and on D319, together with the continuous abundance of inorganic gaseous halogen species at the CVAO, indicate significant reactive cycling of halogens. Aircraft measurements of O3 and CO show that surface measurements are representative of the entire boundary layer in the vicinity, in both diurnal variability and absolute levels. Above the inversion layer, similar diurnal behaviour in O3 and CO is observed at lower mixing ratios in air that had originated from south of Cape Verde, possibly from within the ITCZ. ECMWF calculations for two days indicate very different boundary layer depths, and aircraft flights over the ship replicate this, giving confidence in the calculated boundary layer depths.
Abstract:
A vision system for recognizing rigid and articulated three-dimensional objects in two-dimensional images is described. Geometrical models are extracted from a commercial computer-aided design (CAD) package. The models are then augmented with appearance and functional information, which improves the system's hypothesis generation, hypothesis verification, and pose refinement. Significant advantages over existing CAD-based vision systems, which utilize only information available in the CAD system, are realized. Examples show the system recognizing, locating, and tracking a variety of objects in a robot work-cell and in natural scenes.
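The three-stage control flow named in the abstract (hypothesis generation, hypothesis verification, pose refinement) can be sketched as the loop below. Every type and method name is illustrative rather than taken from the paper; in the actual system the model behind these calls is extracted from a commercial CAD package and augmented with appearance and functional information.

```java
import java.util.List;

class Image { /* pixel data omitted */ }
class PoseHypothesis { /* candidate object pose omitted */ }

// Illustrative interface for the three stages the abstract names.
interface ObjectModel {
    List<PoseHypothesis> generateHypotheses(Image scene); // hypothesis generation
    boolean verify(PoseHypothesis h, Image scene);        // hypothesis verification
    PoseHypothesis refine(PoseHypothesis h, Image scene); // pose refinement
}

class Recognizer {
    // Refine and return the first candidate pose that survives verification.
    PoseHypothesis recognize(ObjectModel model, Image scene) {
        for (PoseHypothesis h : model.generateHypotheses(scene)) {
            if (model.verify(h, scene)) {
                return model.refine(h, scene);
            }
        }
        return null; // object not present (or no hypothesis verified)
    }
}
```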
Abstract:
The archaeology of Britain during the early Middle Pleistocene (MIS 19–12) is represented by a number of key sites across eastern and southern England. These sites include Pakefield, Happisburgh 1, High Lodge, Warren Hill, Waverley Wood, Boxgrove, Kent's Cavern, and Westbury-sub-Mendip, alongside a ‘background scatter’ lithic record associated with the principal river systems (Bytham, pre-diversion Thames, and Solent) and raised beaches (Westbourne–Arundel). Hominin behaviour can be characterised in terms of: preferences for temperate or cool temperate climates and open/woodland mosaic habitats (indicated by mammalian fauna, mollusca, insects, and sediments); a biface-dominated material culture characterised by technological diversity, although with accompanying evidence for distinctive core and flake (Pakefield) and flake tool (High Lodge) assemblages; probable direct hunting-based subsistence strategies (with a focus upon large mammal fauna); and generally locally-focused spatial and landscape behaviours (principally indicated by raw material sources data), although with some evidence of dynamic, mobile and structured technological systems. The British data continues to support a ‘modified short chronology’ to the north of the Alps and the Pyrenees, with highly sporadic evidence for a hominin presence prior to 500–600 ka, although the ages of key assemblages are subject to ongoing debates regarding the chronology of the Bytham river terraces and the early Middle Pleistocene glaciations of East Anglia.
Abstract:
This paper looks at how implant and electrode technology can be employed to create biological brains for robots, to enable human enhancement, and to diminish the effects of certain neural illnesses. In all cases the end result is to increase the range of abilities of the recipients. An indication is given of a number of areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking a biological brain directly with computer technology. The emphasis is placed on practical scientific studies that have been and are being undertaken and reported on. The area of focus is the use of electrode technology, where either a connection is made directly with the cerebral cortex and/or nervous system, or where implants into the human body are involved. The paper also considers robots that have biological brains in which human neurons can be employed as the sole thinking machine for a real-world robot body.
Abstract:
Adaptive methods which "equidistribute" a given positive weight function are now used fairly widely for selecting discrete meshes. The disadvantage of such schemes is that the resulting mesh may not be smoothly varying. In this paper a technique is developed for equidistributing a function subject to constraints on the ratios of adjacent steps in the mesh. Given a weight function $f \ge 0$ on an interval $[a,b]$ and constants $c$ and $K$, the method produces a mesh with points $x_0 = a$, $x_{j+1} = x_j + h_j$, $j = 0, 1, \cdots, n-1$, and $x_n = b$ such that
\[ \int_{x_j}^{x_{j+1}} f \le c \quad \text{and} \quad \frac{1}{K} \le \frac{h_{j+1}}{h_j} \le K \quad \text{for } j = 0, 1, \cdots, n-1. \]
A theoretical analysis of the procedure is presented, and numerical algorithms for implementing the method are given. Examples show that the procedure is effective in practice. Other types of constraints on equidistributing meshes are also discussed. The principal application of the procedure is to the solution of boundary value problems, where the weight function is generally some error indicator, and accuracy and convergence properties may depend on the smoothness of the mesh. Other practical applications include the regrading of statistical data.
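A one-pass sketch of the constrained equidistribution idea follows: it marches across $[a,b]$, taking each step as large as possible subject to the integral bound and the growth bound $h_{j+1} \le K h_j$. The paper's algorithm also enforces the lower ratio bound $h_{j+1} \ge h_j / K$ exactly, which this greedy version does not, so treat it as an illustration rather than the authors' procedure; the weight function, tolerances and initial step in main are arbitrary.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleUnaryOperator;

// Greedy sketch of equidistribution under a step-ratio growth constraint.
public class ConstrainedEquidistribute {

    // Crude midpoint-rule estimate of the integral of f over [lo, hi].
    static double integrate(DoubleUnaryOperator f, double lo, double hi) {
        int m = 64;
        double h = (hi - lo) / m, s = 0.0;
        for (int i = 0; i < m; i++) s += f.applyAsDouble(lo + (i + 0.5) * h) * h;
        return s;
    }

    static List<Double> mesh(DoubleUnaryOperator f, double a, double b,
                             double c, double K, double h0) {
        List<Double> x = new ArrayList<>();
        x.add(a);
        double h = h0;
        while (x.get(x.size() - 1) < b) {
            double xj = x.get(x.size() - 1);
            double trial = Math.min(h * K, b - xj);  // upper ratio bound h' <= K h
            // Shrink the step until the equidistribution bound holds.
            while (integrate(f, xj, xj + trial) > c && trial > 1e-12)
                trial *= 0.5;
            x.add(xj + trial);
            h = trial;
        }
        return x;
    }

    public static void main(String[] args) {
        // Weight concentrated near x = 0 forces small steps there; the steps
        // then grow toward x = 1 at a rate capped by K = 1.5.
        List<Double> pts = mesh(t -> 1.0 / (0.01 + t * t), 0.0, 1.0, 0.05, 1.5, 1e-3);
        System.out.println((pts.size() - 1) + " steps: " + pts);
    }
}
```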
Abstract:
We consider the linear equality-constrained least squares problem (LSE) of minimizing $\|c - Gx\|_2$, subject to the constraint $Ex = p$. A preconditioned conjugate gradient method is applied to the Kuhn–Tucker equations associated with the LSE problem. We show that our method is well suited for structural optimization problems in reliability analysis and optimal design. Numerical tests are performed on an Alliant FX/8 multiprocessor and a Cray X-MP using some practical structural analysis data.
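For reference, the Kuhn–Tucker equations for this problem take the standard saddle-point form below (writing $\lambda$ for the vector of Lagrange multipliers); the paper's preconditioned conjugate gradient iteration is applied to this coupled system.

```latex
% Stationarity of the Lagrangian  L(x, \lambda) = \tfrac12 \|c - Gx\|_2^2
%                                               + \lambda^T (Ex - p)
% gives  G^T G x + E^T \lambda = G^T c,  which with the constraint  Ex = p
% yields the Kuhn--Tucker (saddle-point) system:
\begin{pmatrix} G^T G & E^T \\ E & 0 \end{pmatrix}
\begin{pmatrix} x \\ \lambda \end{pmatrix}
=
\begin{pmatrix} G^T c \\ p \end{pmatrix}
```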
Abstract:
Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
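In the discrete grey-level case, a regrading that equidistributes the histogram is the classical histogram-equalization map built from the cumulative histogram; the smoothed, parameter-dependent regradings discussed above generalize this map. A minimal sketch for 8-bit data:

```java
// Histogram equalization: remap grey levels via the cumulative histogram so
// that the histogram of the output is (approximately) constant.
public class HistogramRegrade {
    static int[] equalize(int[] pixels) {        // grey levels in 0..255
        int levels = 256;
        int[] hist = new int[levels];
        for (int p : pixels) hist[p]++;

        // Cumulative histogram: cdf[g] = number of pixels with level <= g.
        int[] cdf = new int[levels];
        int running = 0;
        for (int g = 0; g < levels; g++) { running += hist[g]; cdf[g] = running; }

        // The regrading: each level g maps to (levels - 1) * cdf[g] / n.
        int n = pixels.length;
        int[] out = new int[n];
        for (int i = 0; i < n; i++)
            out[i] = (int) Math.round((levels - 1) * (double) cdf[pixels[i]] / n);
        return out;
    }

    public static void main(String[] args) {
        int[] dark = { 10, 10, 12, 12, 12, 14, 200 };  // low-contrast sample
        for (int p : equalize(dark)) System.out.print(p + " ");
        // prints: 73 73 182 182 182 219 255 (contrast is spread out)
    }
}
```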
Abstract:
The Twitter network has been labelled the most widely used microblogging application today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium for the expression of sentiment and opinion. It has also been a notable medium for information dissemination, including breaking news on diverse issues, since it was launched in 2007. Many organisations, individuals and even government bodies follow activities on the network in order to obtain knowledge on how their audience reacts to tweets that affect them. We can use postings on Twitter (known as tweets) to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is by including a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics, and consequently the dynamics of tweets. We have named our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and 'dead' rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
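To make the rule-dynamics idea concrete, the sketch below mines simple single-antecedent hashtag rules (by support only, with no confidence pruning) in two consecutive tweet windows and diffs the rule sets to flag new and 'dead' rules. TRCM's finer categories (emerging and unexpected rules, identified by comparing the left- and right-hand sides of rules across windows) are omitted here.

```java
import java.util.*;

// Diff association rules mined from two tweet windows: rules present only in
// the newer window are "new"; rules that disappear are "dead". A rule X -> Y
// is encoded as the string "X->Y" for simplicity.
public class RuleDiff {
    static Set<String> mineRules(List<Set<String>> tweets, double minSupport) {
        int n = tweets.size();
        Map<String, Integer> pairCount = new HashMap<>();
        for (Set<String> tags : tweets)
            for (String a : tags)
                for (String b : tags)
                    if (!a.equals(b))
                        pairCount.merge(a + "->" + b, 1, Integer::sum);
        Set<String> rules = new HashSet<>();
        for (Map.Entry<String, Integer> e : pairCount.entrySet())
            if ((double) e.getValue() / n >= minSupport) rules.add(e.getKey());
        return rules;
    }

    public static void main(String[] args) {
        List<Set<String>> week1 = List.of(
            Set.of("#oscars", "#movies"), Set.of("#oscars", "#movies"));
        List<Set<String>> week2 = List.of(
            Set.of("#oscars", "#fashion"), Set.of("#oscars", "#fashion"));

        Set<String> before = mineRules(week1, 0.5), after = mineRules(week2, 0.5);
        Set<String> fresh = new HashSet<>(after);  fresh.removeAll(before);
        Set<String> dead  = new HashSet<>(before); dead.removeAll(after);
        System.out.println("new rules:  " + fresh);
        System.out.println("dead rules: " + dead);
    }
}
```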