933 results for heterogeneous computation


Relevance:

20.00%

Publisher:

Abstract:

Security of RFID authentication protocols has received considerable interest recently. However, an important aspect of such protocols that has not received as much attention is the efficiency of their communication. In this paper we investigate the efficiency benefits of pre-computation for time-constrained applications in small to medium RFID networks. We also outline a protocol utilizing this mechanism in order to demonstrate the benefits and drawbacks of using this approach. The proposed protocol shows promising results, as it is able to offer the security of untraceable protocols whilst only requiring time comparable to that of more efficient but traceable protocols.
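
As a rough illustration of the pre-computation mechanism discussed above, the sketch below shows a back-end tabulating expected tag responses so that authentication becomes a constant-time table lookup rather than a search over every tag's key. The hash-and-counter scheme, key sizes and window are illustrative assumptions rather than the protocol proposed in the paper, and freshness and desynchronisation handling are omitted.

```python
import hashlib
import secrets

def h(key: bytes, counter: int) -> bytes:
    """Keyed hash response; without the key, outputs are unlinkable."""
    return hashlib.sha256(key + counter.to_bytes(4, "big")).digest()

# Pre-computation phase (offline): tabulate expected responses for a window of
# counter values per tag, so the time-constrained online phase is one lookup.
tag_keys = {f"tag-{i:03d}": secrets.token_bytes(16) for i in range(100)}
WINDOW = 50
lookup = {h(key, c): (tag_id, c)
          for tag_id, key in tag_keys.items()
          for c in range(WINDOW)}

def tag_respond(key: bytes, counter: int) -> bytes:
    return h(key, counter)

def reader_authenticate(response: bytes):
    """O(1) lookup instead of hashing against every tag key at query time."""
    return lookup.get(response)

print(reader_authenticate(tag_respond(tag_keys["tag-007"], 3)))  # ('tag-007', 3)
```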

Relevance:

20.00%

Publisher:

Abstract:

An increased interest in utilising groups of Unmanned Aerial Vehicles (UAVs) with heterogeneous capabilities and autonomy presents the challenge of managing them effectively during missions and operations. This has been the focus of research in recent years, moving from a traditional UAV management paradigm of n-to-1 (n operators for one UAV, with n being at least two operators) toward 1-to-n (one operator, multiple UAVs). This paper expands on the authors’ previous work on a UAV functional capability framework by incorporating the concept of Functional Level of Autonomy (F-LOA) with two configurations: the lower F-LOA configuration provides sufficient information for the operator to generate solutions and make decisions to address perturbation events, while the higher F-LOA configuration presents information reflecting the F-LOA of the UAV, allowing the operator to interpret solutions and decisions generated autonomously and decide whether to veto them.

Relevance:

20.00%

Publisher:

Abstract:

For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step, but rather require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required for an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur.
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step in moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
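
To make the exponential Euler method concrete, the sketch below performs one EEM step, y_{n+1} = y_n + h φ(hJ) f(y_n), using a dense evaluation of φ; the Krylov subspace approximation of φ(A)b and the variable-stepsize error control developed in the thesis are not shown, and the stiff test problem is an illustrative assumption.

```python
import numpy as np
from scipy.linalg import expm

def phi(Z):
    """phi(Z) = (e^Z - I) Z^{-1}, computed via the block identity
    expm([[Z, I], [0, 0]]) = [[e^Z, phi(Z)], [0, I]]."""
    n = Z.shape[0]
    M = np.zeros((2 * n, 2 * n))
    M[:n, :n] = Z
    M[:n, n:] = np.eye(n)
    return expm(M)[:n, n:]

def eem_step(f, jac, y, h):
    """One exponential Euler step: y_new = y + h * phi(h J(y)) f(y)."""
    return y + h * phi(h * jac(y)) @ f(y)

# Illustrative stiff linear test problem y' = A y (EEM reproduces e^{hA} exactly here)
A = np.array([[-100.0, 1.0], [0.0, -2.0]])
f = lambda y: A @ y
jac = lambda y: A
y = np.array([1.0, 1.0])
for _ in range(10):
    y = eem_step(f, jac, y, h=0.1)
print(y, expm(1.0 * A) @ np.array([1.0, 1.0]))  # the two should agree closely
```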

Relevance:

20.00%

Publisher:

Abstract:

Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
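
As a minimal illustration of Bayesian computation with empirical likelihood and its associated effective sample size, the sketch below targets the mean of an i.i.d. sample; the simple empirical likelihood solver, the prior and the data are illustrative assumptions, not the models analysed in the paper.

```python
import numpy as np
from scipy.optimize import brentq

def log_el_mean(mu, x):
    """Log empirical likelihood of sample x under the constraint E[X] = mu."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:     # mu outside the convex hull of the data
        return -np.inf
    lo, hi = -1.0 / z.max(), -1.0 / z.min()
    eps = 1e-8 * (hi - lo)
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    try:
        lam = brentq(g, lo + eps, hi - eps)
    except ValueError:                    # root squeezed against a boundary
        return -np.inf
    return -np.sum(np.log1p(lam * z)) - len(x) * np.log(len(x))

# BCel-style weighting of prior draws (illustrative prior and data)
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=100)         # observed sample
theta = rng.normal(loc=0.0, scale=10.0, size=5000)   # draws from a vague prior
logw = np.array([log_el_mean(t, x) for t in theta])  # EL importance weights
w = np.exp(logw - logw.max())
w /= w.sum()
ess = 1.0 / np.sum(w ** 2)                           # effective sample size
print(f"posterior mean ~ {np.sum(w * theta):.3f}, ESS ~ {ess:.1f}")
```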

Relevance:

20.00%

Publisher:

Abstract:

This research was a step forward in developing a data integration framework for Electronic Health Records. The outcome of the research is a conceptual and logical Data Warehousing model for integrating Cardiac Surgery electronic data records. This thesis investigated the main obstacles to healthcare data integration and proposed a data warehousing model suitable for integrating fragmented data in a Cardiac Surgery Unit.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well-suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which use a combination of computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function. This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally-intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions with equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
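
The matrix-free step at the heart of the inexact Newton-Krylov approach described above can be sketched in a few lines: the action of the Jacobian on a vector is approximated from residual evaluations alone. The finite-difference formula is standard; the residual function and perturbation heuristic here are illustrative assumptions, not the library's code.

```python
import numpy as np

def jac_vec(residual, u, v):
    """Approximate J(u) @ v from residual evaluations only:
    J(u) v ~ (F(u + eps v) - F(u)) / eps."""
    eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
    return (residual(u + eps * v) - residual(u)) / eps

# Illustrative nonlinear residual (stand-in for the discretised flow equations)
F = lambda u: np.array([u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0])
u = np.array([1.0, 2.0])
v = np.array([0.5, -1.0])
print(jac_vec(F, u, v))                                  # matrix-free product
print(np.array([[2 * u[0], 1.0], [1.0, 2 * u[1]]]) @ v)  # exact Jacobian product
```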

Relevance:

20.00%

Publisher:

Abstract:

We propose a cluster ensemble method to map the corpus documents into the semantic space embedded in Wikipedia and group them using multiple types of feature space. A heterogeneous cluster ensemble is constructed with multiple types of relations, i.e. document-term, document-concept and document-category. A final clustering solution is obtained by exploiting associations between document pairs and the hubness of the documents. Empirical analysis with various real data sets reveals that the proposed method outperforms state-of-the-art text clustering approaches.
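
A minimal sketch of combining several base clusterings through pairwise (co-association) evidence, in the spirit of the ensemble step described above; the Wikipedia-derived document-term, document-concept and document-category features are replaced here by synthetic data, and all parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=60, centers=3, random_state=0)   # stand-in documents

# One base clustering per feature space (here: three runs on the same features)
labelings = [KMeans(n_clusters=3, n_init=10, random_state=s).fit_predict(X)
             for s in range(3)]

# Co-association matrix: fraction of base clusterings placing a pair together
n = X.shape[0]
coassoc = np.zeros((n, n))
for lab in labelings:
    coassoc += (lab[:, None] == lab[None, :])
coassoc /= len(labelings)

# Consensus clustering on the co-association distances (scikit-learn >= 1.2)
final = AgglomerativeClustering(n_clusters=3, metric="precomputed", linkage="average")
print(final.fit_predict(1.0 - coassoc))
```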

Relevance:

20.00%

Publisher:

Abstract:

In particle-strengthened metallic alloys, fatigue damage incubates at inclusion particles near the surface or at changes of geometry. Micromechanical simulations of inclusions are performed so that the fatigue damage incubation mechanisms can be categorized. As the micro-plasticity gradient field differs around different inclusions, a novel concept for nonlocal evaluation of micro-plasticity intensity is introduced. The effects of void aspect ratio and spatial distribution are quantified for fatigue incubation life in the high-cycle fatigue regime. Finally, these effects are integrated based on the statistics of inclusions to predict the fatigue life of structural components.

Relevance:

20.00%

Publisher:

Abstract:

In this work the electrochemical formation of porous Cu/Ag materials is reported via the simple and quick method of hydrogen bubble templating. The bulk and surface composition ratio between Ag and Cu was varied in a systematic manner and was readily controlled by the concentration of precursor metal salts in the electrolyte. The incorporation of Ag within the Cu scaffold only affected the formation of well-defined pores at high Ag loading, whereas the internal pore wall structure gradually transformed from dendritic to cube-like and finally needle-like structures, which was due to the concomitant formation of Cu2O within the structure. The materials were characterised by scanning electron microscopy (SEM), X-ray diffraction (XRD), and X-ray photoelectron spectroscopy (XPS). Their surface properties were further investigated by surface enhanced Raman spectroscopy (SERS) and electrochemically probed by recording the hydrogen evolution reaction (HER), which is highly sensitive to the nature of the surface. The effect of surface composition was then investigated for its influence on two catalytic reactions, namely the reduction of ferricyanide ions with thiosulphate ions and the reduction of 4-nitrophenol with NaBH4 in aqueous solution, where it was found that the presence of Ag had a beneficial effect in both cases but more so in the case of nitrophenol reduction. It is believed that this material may have many more potential applications in the areas of catalysis, electrocatalysis and photocatalysis.

Relevance:

20.00%

Publisher:

Abstract:

Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that will allow users to leverage the available infrastructure and create custom tailored solutions. It explores how this notion can be utilized in the context of energy monitoring to improve conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. This system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences. Therefore, it allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not warrant sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
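
As a rough sketch of what a unified interface over heterogeneous data streams might look like, the snippet below defines a common reading type, a stream abstraction and a composable building block; all class and method names are illustrative assumptions, not the system's actual API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Iterator
import random

@dataclass
class Reading:
    stream_id: str
    timestamp: datetime
    value: float
    unit: str

class DataStream(ABC):
    """Common interface so heterogeneous sources can be composed uniformly."""
    @abstractmethod
    def readings(self) -> Iterator[Reading]: ...

class PowerMeterStream(DataStream):
    def __init__(self, stream_id: str):
        self.stream_id = stream_id
    def readings(self) -> Iterator[Reading]:
        while True:  # simulated sensor values, for illustration only
            yield Reading(self.stream_id, datetime.now(timezone.utc),
                          random.uniform(50, 500), "W")

def moving_average(stream: DataStream, window: int) -> Iterator[float]:
    """An example building block that can be composed on top of any DataStream."""
    buf: list[float] = []
    for r in stream.readings():
        buf.append(r.value)
        del buf[:-window]            # keep only the last `window` values
        yield sum(buf) / len(buf)

# Compose a display element from a raw stream
avg = moving_average(PowerMeterStream("office-meter"), window=5)
print([round(next(avg), 1) for _ in range(5)])
```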

Relevance:

20.00%

Publisher:

Abstract:

Addressing possibilities for authentic combinations of diverse media within an installation setting, this research tested hybrid blends of the physical, digital and temporal to explore liminal space and image. The practice-led research reflected on the creation of artworks from three perspectives – material, immaterial and hybrid – and in doing so, developed a new methodological structure that extends conventional forms of triangulation. This study explored how physical and digital elements each sought hierarchical presence, yet simultaneously coexisted, thereby extending the visual and conceptual potential of the work. Outcomes demonstrated how utilising and recording transitional processes of hybrid imagery achieved a convergence of diverse, experiential forms. "Hybrid authority" – an authentic convergence of disparate elements – was articulated in the creation and public sharing of processual works and the creation of an innovative framework for hybrid art practice.

Relevance:

20.00%

Publisher:

Abstract:

This is a discussion of the journal article: "Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation". The article and discussion have appeared in the Journal of the Royal Statistical Society: Series B (Statistical Methodology).

Relevance:

20.00%

Publisher:

Abstract:

We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms using indirect inference. We embed this approach within a sequential Monte Carlo algorithm that is completely adaptive. This methodological development was motivated by an application involving data on macroparasite population evolution modelled with a trivariate Markov process. The main objective of the analysis is to compare inferences on the Markov process when considering two different indirect models. The two indirect models are based on a Beta-Binomial model and a three-component mixture of Binomials, with the former providing a better fit to the observed data.
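
A minimal sketch of the indirect-inference idea for summary statistics: fit a tractable auxiliary model to the data and use its fitted parameters as the summaries compared inside ABC. The method-of-moments Beta-Binomial fit and the synthetic data below are illustrative assumptions, not the macroparasite models analysed in the paper.

```python
import numpy as np

def beta_binomial_mom(counts, n_trials):
    """Method-of-moments fit of a Beta-Binomial(n, a, b) auxiliary model;
    the fitted (a, b) act as indirect-inference summary statistics."""
    x = np.asarray(counts, dtype=float)
    m1, m2 = x.mean(), (x ** 2).mean()
    n = n_trials
    denom = n * (m2 / m1 - m1 - 1.0) + m1
    a = (n * m1 - m2) / denom
    b = (n - m1) * (n - m2 / m1) / denom
    return np.array([a, b])

def abc_distance(sim_counts, obs_stats, n_trials):
    """ABC discrepancy: distance between auxiliary fits on simulated and observed data."""
    return np.linalg.norm(beta_binomial_mom(sim_counts, n_trials) - obs_stats)

# Illustrative use with synthetic over-dispersed count data
rng = np.random.default_rng(1)
n_trials = 20
obs = rng.binomial(n_trials, rng.beta(2.0, 5.0, size=200))
obs_stats = beta_binomial_mom(obs, n_trials)
sim = rng.binomial(n_trials, rng.beta(2.5, 5.0, size=200))
print(obs_stats, abc_distance(sim, obs_stats, n_trials))
```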

Relevance:

20.00%

Publisher:

Abstract:

We study the natural problem of secure n-party computation (in the computationally unbounded attack model) of circuits over an arbitrary finite non-Abelian group (G,⋅), which we call G-circuits. Besides its intrinsic interest, this problem is also motivated by a completeness result of Barrington, stating that such protocols can be applied for general secure computation of arbitrary functions. For flexibility, we are interested in protocols which only require black-box access to the group G (i.e. the only computations performed by players in the protocol are a group operation, a group inverse, or sampling a uniformly random group element). Our investigations focus on the passive adversarial model, where up to t of the n participating parties are corrupted.
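
The black-box group access described above can be sketched directly: the only operations a player may perform are composition, inversion and uniform sampling. The symmetric group S5 serves here as an illustrative non-Abelian example, and the simple n-out-of-n sharing shown is a generic construction for demonstration, not the protocols developed in the paper.

```python
import random
from itertools import permutations

class BlackBoxGroup:
    """Black-box access: composition, inversion, and uniform sampling only."""
    def __init__(self, elements, op, identity):
        self._elements = list(elements)
        self._op = op
        self._identity = identity
    def op(self, a, b):
        return self._op(a, b)
    def inverse(self, a):
        # brute-force inverse for illustration; a real oracle would provide this
        return next(x for x in self._elements if self._op(a, x) == self._identity)
    def random_element(self):
        return random.choice(self._elements)

# S5 (non-Abelian): permutations of {0,...,4} composed as functions
compose = lambda a, b: tuple(a[b[i]] for i in range(len(a)))
S5 = BlackBoxGroup(permutations(range(5)), compose, tuple(range(5)))

def share(G, g, n):
    """n-out-of-n sharing: random elements whose ordered product equals g."""
    shares = [G.random_element() for _ in range(n - 1)]
    prod = shares[0]
    for s in shares[1:]:
        prod = G.op(prod, s)
    shares.append(G.op(G.inverse(prod), g))   # last share fixes the product to g
    return shares

def reconstruct(G, shares):
    prod = shares[0]
    for s in shares[1:]:
        prod = G.op(prod, s)
    return prod

g = S5.random_element()
assert reconstruct(S5, share(S5, g, 4)) == g
```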

Relevance:

20.00%

Publisher:

Abstract:

Classical results in unconditionally secure multi-party computation (MPC) protocols with a passive adversary indicate that every n-variate function can be computed by n participants, such that no set of size t < n/2 participants learns any additional information other than what they could derive from their private inputs and the output of the protocol. We study unconditionally secure MPC protocols in the presence of a passive adversary in the trusted setup (‘semi-ideal’) model, in which the participants are supplied with some auxiliary information (which is random and independent of the participant inputs) ahead of the protocol execution (such information can be purchased as a “commodity” well before a run of the protocol). We present a new MPC protocol in the trusted setup model, which allows the adversary to corrupt an arbitrary number t < n of participants. Our protocol makes use of a novel subprotocol for converting an additive secret sharing over a field to a multiplicative secret sharing, and can be used to securely evaluate any n-variate polynomial G over a field F, with inputs restricted to non-zero elements of F. The communication complexity of our protocol is O(ℓ · n²) field elements, where ℓ is the number of non-linear monomials in G. Previous protocols in the trusted setup model require communication proportional to the number of multiplications in an arithmetic circuit for G; thus, our protocol may offer savings over previous protocols for functions with a small number of monomials but a large number of multiplications.
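
To ground the two sharing forms mentioned above, the sketch below shows additive and multiplicative n-out-of-n secret sharings over a prime field, with the non-zero input restriction appearing in the multiplicative case. The prime modulus is an arbitrary illustrative choice, and the paper's interactive conversion subprotocol and trusted setup are not reproduced.

```python
import secrets

P = 2 ** 61 - 1   # illustrative prime modulus (an assumption, not from the paper)

def additive_share(x, n):
    """x = s_1 + ... + s_n (mod P); any n-1 shares reveal nothing about x."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def multiplicative_share(x, n):
    """x = s_1 * ... * s_n (mod P); note the non-zero input restriction."""
    assert x % P != 0
    shares = [secrets.randbelow(P - 1) + 1 for _ in range(n - 1)]   # non-zero shares
    inv = 1
    for s in shares:
        inv = inv * pow(s, -1, P) % P
    shares.append(x * inv % P)
    return shares

def reconstruct_additive(shares):
    return sum(shares) % P

def reconstruct_multiplicative(shares):
    out = 1
    for s in shares:
        out = out * s % P
    return out

x = 123456789
assert reconstruct_additive(additive_share(x, 5)) == x
assert reconstruct_multiplicative(multiplicative_share(x, 5)) == x
```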