874 results for and Overlay Networks
Abstract:
MPJ Express is our implementation of MPI-like bindings for Java. In this paper we discuss our intermediate buffering layer, which makes use of the so-called direct byte buffers introduced in the Java New I/O package. The purpose of this layer is to support the implementation of derived datatypes. MPJ Express is the first Java messaging library that implements this feature using pure Java. In addition, this buffering layer allows efficient implementation of communication devices based on proprietary networks such as Myrinet. In this paper we evaluate the performance of our buffering layer and demonstrate the usefulness of direct byte buffers. We also evaluate the performance of MPJ Express against other messaging systems using Myrinet and show that our buffering layer makes it possible to avoid the overheads suffered by other Java systems, such as mpiJava, which relies on the Java Native Interface.
Abstract:
The role of users is an often-overlooked aspect of studies of innovation and diffusion. Using an actor-network theory (ANT) approach, four case studies examine the processes of implementing a piece of CAD (computer aided design) software, BSLink, in different organisations and describe the tailoring done by users to embed the software into working practices. This not only results in different practices of use at different locations, but also transforms BSLink itself into a proliferation of BSLinks-in-use. A focus group for BSLink users further reveals the gaps between different users' expectations and ways of using the software, and between different BSLinks-in-use. It also demonstrates the contradictory demands this places on its further development. The ANT-informed approach used treats both innovation and diffusion as processes of translation within networks. It also emphasises the political nature of innovation and implementation, and the efforts of various actors to delegate manoeuvres for increased influence onto technological artefacts.
Abstract:
This paper presents a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined up’ thinking and action. With socio-economic phenomena such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs, somebody had to be blamed for it: planners, developers, politicians. However, from a ‘synthetic’ point of view, the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new, more complex structures and networks. Redevelopment can then be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies and others, unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process. Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. A better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment in redevelopment are considered significant.
Abstract:
The themes of awareness and influence within the innovation diffusion process are addressed. The innovation diffusion process is typically represented as stages, yet awareness and influence are somewhat under-represented in the literature. Awareness and influence are situated within the contextual setting of individual actors but also within broader institutional forces. Understanding how actors become aware of an innovation, and how their opinion is then influenced, is important for creating a more innovation-active UK construction sector. Social network analysis is proposed as one technique for mapping how awareness and influence occur and what they look like as a network. Empirical data are gathered using two modes of enquiry: a pilot study of chartered professionals and a case study of an organization as it attempted to diffuse an innovation. The analysis demonstrates significant variations across actors’ awareness and influence networks. It is argued that social network analysis can complement other research methods in order to present a richer picture of how actors become aware of innovations and where they draw their influences regarding adopting innovations. In summarizing the findings, a framework for understanding awareness and influence associated with innovation within the UK construction sector is presented. Finally, with the UK construction sector continually being encouraged to be innovative, understanding and managing an actor’s awareness and influence network will be beneficial. The overarching conclusion thus describes the need not only to build research capacity in this area but also to push the boundaries of the research methods employed.
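As a minimal illustration of what mapping an awareness or influence network with social network analysis can look like, the sketch below builds a hypothetical network of construction-sector actors and computes two standard centrality measures. The actors and ties are invented for the example and are not drawn from the study's data.

```python
# Hypothetical "awareness network" of construction-sector actors, with two
# standard centrality measures often used in social network analysis. The
# actors and ties are invented for illustration, not the study's data.
import networkx as nx

ties = [
    ("architect", "engineer"), ("engineer", "contractor"),
    ("contractor", "supplier"), ("architect", "client"),
    ("client", "contractor"), ("engineer", "supplier"),
]
G = nx.Graph(ties)

# Who is most connected, and who sits on the paths along which awareness travels?
print("degree centrality     :", nx.degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))
```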
Abstract:
Deep Brain Stimulation has been used to study and to treat Parkinson’s Disease (PD) tremor symptoms since the 1980s. In the research reported here, we carried out a comparative analysis to classify tremor onset based on intraoperative microelectrode recordings of a PD patient’s brain Local Field Potential (LFP) signals. In particular, we compared the performance of a Support Vector Machine (SVM) with two well-known artificial neural network classifiers, namely a Multilayer Perceptron (MLP) and a Radial Basis Function Network (RBN). The results show that in this study, using specifically PD data, the SVM provided the best overall classification rate, achieving a recognition accuracy of 81%.
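As a rough illustration of the kind of comparison described in this abstract, the following sketch contrasts an RBF-kernel SVM with an MLP on placeholder data; the feature matrix X and tremor labels y are synthetic stand-ins for LFP-derived features, and the radial basis function network from the study is omitted since scikit-learn has no direct equivalent. This is not the authors' code or data.

```python
# Sketch of the classifier comparison described above (not the authors' code).
# X and y are synthetic stand-ins for LFP-derived features and tremor-onset labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                               # placeholder feature matrix
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)   # placeholder tremor labels

models = {
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0, gamma="scale"),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)            # scale features, then classify
    scores = cross_val_score(pipe, X, y, cv=5)               # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```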
Abstract:
Undirected graphical models are widely used in statistics, physics and machine vision. However, Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of it has focused on the important practical case where the data consist of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches, particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) for avoiding the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior, due to the use of MCMC to simulate from the latent graphical model in lieu of being able to do this exactly in general. The supplementary appendix also describes the nature of the resulting approximation.
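To make the exchange-algorithm step concrete, here is a minimal sketch for a single interaction parameter of a small 2D Ising model, with auxiliary data drawn by a few Gibbs sweeps rather than a perfect sampler (hence, as the abstract notes, the chain targets an approximation). It omits the noisy-data and particle MCMC machinery of the paper, assumes a flat prior on the parameter, and all sizes and tuning constants are illustrative.

```python
# Sketch of the exchange algorithm (Murray et al., 2006) for one interaction
# parameter theta of a small 2D Ising model. Auxiliary data come from a few
# Gibbs sweeps rather than a perfect sampler, so the chain targets an
# approximation to the posterior, as discussed in the abstract. A flat prior
# on theta is assumed; all sizes and tuning constants are illustrative.
import numpy as np

rng = np.random.default_rng(1)
L = 12  # lattice side length

def suff_stat(x):
    """Sum of nearest-neighbour spin products (periodic boundaries)."""
    return np.sum(x * np.roll(x, 1, axis=0)) + np.sum(x * np.roll(x, 1, axis=1))

def gibbs_sweeps(theta, n_sweeps=10):
    """Approximate draw from the Ising model with interaction theta."""
    x = rng.choice([-1, 1], size=(L, L))
    for _ in range(n_sweeps):
        for i in range(L):
            for j in range(L):
                nb = x[(i - 1) % L, j] + x[(i + 1) % L, j] + x[i, (j - 1) % L] + x[i, (j + 1) % L]
                p_up = 1.0 / (1.0 + np.exp(-2.0 * theta * nb))
                x[i, j] = 1 if rng.random() < p_up else -1
    return x

# "Observed" lattice simulated at a known theta, for illustration only.
x_obs = gibbs_sweeps(0.3, n_sweeps=50)
s_obs = suff_stat(x_obs)

theta, samples = 0.1, []
for _ in range(300):
    theta_prop = theta + 0.05 * rng.normal()      # random-walk proposal
    s_aux = suff_stat(gibbs_sweeps(theta_prop))   # auxiliary-variable draw at theta_prop
    # Exchange acceptance ratio: the intractable normalising constants cancel
    # (flat prior, so no prior ratio term appears).
    log_alpha = (theta_prop - theta) * s_obs + (theta - theta_prop) * s_aux
    if np.log(rng.random()) < log_alpha:
        theta = theta_prop
    samples.append(theta)

print("posterior mean of theta (after burn-in):", np.mean(samples[100:]))
```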
Abstract:
The role of platelets in hemostasis and thrombosis is dependent on a complex balance of activatory and inhibitory signaling pathways. Inhibitory signals released from the healthy vasculature suppress platelet activation in the absence of platelet receptor agonists. Activatory signals present at a site of injury initiate platelet activation and thrombus formation; subsequently, endogenous negative signaling regulators dampen activatory signals to control thrombus growth. Understanding the complex interplay between activatory and inhibitory signaling networks is an emerging challenge in the study of platelet biology and necessitates a systematic approach to utilize experimental data effectively. In this review, we will explore the key points of platelet regulation and signaling that maintain platelets in a resting state, mediate activation to elicit thrombus formation or provide negative feedback. Platelet signaling will be described in terms of key signaling molecules that are common to the pathways activated by platelet agonists and can be described as regulatory nodes for both positive and negative regulators.
Abstract:
An international conference is a secular ritual which serves to create, recreate and shape worldwide translocal cultural sharings. Social anthropological theories and methods are used to show that, besides being an information flow junction, the international conference is a network crossroad and a way of socialising new members into an international research community. It is also capable of creating prestige and honour for the individual researcher, for the arranging research team, university and city. Rituals do not merely reflect the social relations or cosmology of a society, but are events that in themselves do important things through ritual forms and symbolic statements.
Abstract:
This document is a doctoral thesis completed at the Brazilian School of Public and Business Administration of the Getulio Vargas Foundation (EBAPE/FGV) and developed as three articles. The research that resulted in the articles falls within the scope of the project entitled “Windows of opportunities and knowledge networks: implications for catch-up in developing countries”, funded by the Support Programme for Research and Academic Production of Faculty (ProPesquisa) of the Brazilian School of Public and Business Administration (EBAPE) of the Getulio Vargas Foundation.
Abstract:
The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent τ ∼ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While a typical multielectrode array records hundreds of neurons, tens of thousands of neurons can be found in the same area of neuronal tissue. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
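A minimal sketch of the undersampling effect described here: a critical branching-process model on a random network, with avalanche sizes measured once from all units and once from a small recorded subset. Network size, connectivity, the 5% sampling fraction and the event cap are illustrative choices, not the paper's.

```python
# Sketch of the undersampling effect: a critical branching-process avalanche
# model on a random network, with sizes measured from all units and from a
# small "recorded" subset. All parameters are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(2)
N = 2000                                     # number of units
K = 10                                       # out-degree per unit
p = 1.0 / K                                  # transmission probability -> branching ratio 1 (critical)
targets = rng.integers(0, N, size=(N, K))    # random postsynaptic targets
recorded = rng.random(N) < 0.05              # "electrodes" cover 5% of the units

def run_avalanche(max_events=10000):
    active = [int(rng.integers(N))]          # seed one unit
    size_full = size_sub = 0
    while active and size_full < max_events:
        nxt = []
        for i in active:
            size_full += 1
            size_sub += int(recorded[i])
            fired = targets[i][rng.random(K) < p]   # propagate to each target with prob p
            nxt.extend(fired.tolist())
        active = nxt
    return size_full, size_sub

sizes = np.array([run_avalanche() for _ in range(2000)])
full = sizes[:, 0]
sub = sizes[sizes[:, 1] > 0, 1]              # avalanches invisible to the electrodes are lost
print(f"fully sampled : mean size {full.mean():.1f}, max {full.max()}")
print(f"undersampled  : mean size {sub.mean():.1f}, max {sub.max()}")
```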
Abstract:
This paper presents a non-model-based technique to detect, locate, and characterize structural damage by combining the impedance-based structural health monitoring technique with an artificial neural network. The impedance-based structural health monitoring technique, which utilizes the electromechanical coupling property of piezoelectric materials, has shown engineering feasibility in a variety of practical field applications. Relying on high-frequency structural excitations (typically > 30 kHz), this technique is very sensitive to minor structural changes in the near field of the piezoelectric sensors. In order to quantitatively assess the state of structures, two sets of artificial neural networks, which utilize measured electrical impedance signals for input patterns, were developed. By employing high frequency ranges and by incorporating neural network features, this technique is able to detect damage in its early stage and to estimate the nature of the damage without prior knowledge of the model of the structure. The paper concludes with an experimental example, an investigation on a massive quarter-scale model of a steel bridge section, in order to verify the performance of the proposed methodology.
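As a rough sketch of the neural-network component described above, the following trains a small classifier on synthetic "impedance signatures"; the data, network architecture and damage classes are placeholders, not the paper's two-network setup.

```python
# Sketch of the neural-network step: classify synthetic "impedance signatures"
# into healthy vs. two hypothetical damage classes. Data, architecture and
# classes are placeholders, not the paper's two-network setup.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
n_freqs = 64                                             # impedance samples over a high-frequency band
baseline = np.sin(np.linspace(0, 6 * np.pi, n_freqs))   # healthy-state signature (synthetic)

X, y = [], []
for damage_class in range(3):                            # 0 = healthy, 1 and 2 = hypothetical damage states
    for _ in range(100):
        shift = 0.15 * damage_class * np.roll(baseline, 4 * damage_class)  # damage shifts the peaks
        X.append(baseline + shift + 0.05 * rng.normal(size=n_freqs))       # add measurement noise
        y.append(damage_class)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("held-out accuracy:", net.score(X_te, y_te))
```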
Abstract:
A performance comparison between a recently proposed technique known as fast orthogonal frequency-division multiplexing (FOFDM) and conventional orthogonal frequency-division multiplexing (OFDM) is undertaken over unamplified, intensity-modulated and directly detected, directly modulated laser-based optical signals. Key transceiver parameters, such as the maximum achievable transmission capacity and the digital-to-analog/analog-to-digital converter (DAC/ADC) effects, are explored thoroughly. It is shown that, similarly to conventional OFDM, the least complex and bandwidth-efficient FOFDM can support up to ∼20 Gb/s over 500 m worst-case multimode fiber (MMF) links having 3 dB effective bandwidths of ∼200 MHz·km. For compensation of the DAC/ADC roll-off, a power-loading (PL) algorithm is adopted, leading to an FOFDM system improvement of ∼4 dB. FOFDM and conventional OFDM give similar optimum DAC/ADC parameters over 500 m worst-case MMF, while over 50 km of single-mode fiber a maximum deviation of only ∼1 dB in clipping ratio is observed, due to the imperfect chromatic dispersion compensation caused by one-tap equalizers.
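For readers unfamiliar with the two modulation schemes being compared, the sketch below contrasts IFFT-based OFDM modulation with a DCT-based view of FOFDM, in which real-valued constellations remain orthogonal at half the OFDM subcarrier spacing. This is a common textbook formulation and an assumption here, not necessarily the exact transceiver implementation evaluated in the paper; parameter values are illustrative.

```python
# Sketch contrasting IFFT-based OFDM with a DCT-based view of FOFDM, where
# real-valued constellations stay orthogonal at half the OFDM subcarrier
# spacing. Illustrative only; not the transceiver evaluated in the paper.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(4)
n_sub = 64

# Conventional OFDM: complex QPSK symbols modulated with an IFFT.
qpsk = (rng.choice([-1, 1], n_sub) + 1j * rng.choice([-1, 1], n_sub)) / np.sqrt(2)
ofdm_time = np.fft.ifft(qpsk)

# FOFDM: real PAM symbols modulated with an inverse DCT.
pam = rng.choice([-3.0, -1.0, 1.0, 3.0], n_sub)
fofdm_time = idct(pam, type=2, norm="ortho")

# The matching forward transforms recover the symbols at the receiver
# (before any fiber/DAC/ADC impairments, which are the subject of the paper).
print(np.allclose(np.fft.fft(ofdm_time), qpsk))                   # True
print(np.allclose(dct(fofdm_time, type=2, norm="ortho"), pam))    # True
```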