989 results for Sink nodes


Relevance:

20.00%

Publisher:

Abstract:

Terrestrial and oceanic biomass carbon sinks help reduce anthropogenic CO2 emissions and mitigate the long-term effect of increasing atmospheric CO2. Woody plants have large carbon pools because of their long residence time; however, N availability can negatively affect tree responses to elevated CO2. Seasonal cycling of internal N in trees contributes to fitness, especially in N-limited environments. It involves resorption of N from senescing leaves of deciduous trees and storage as vegetative storage proteins (VSP) in perennial organs. Populus is a model organism for tree biology that efficiently recycles N. Bark storage proteins (BSP) are the most abundant VSP and serve as seasonal N reserves. Here I show how poplar growth is influenced by N availability and by shoot competition for stored N reserves. I also provide data indicating that auxin mediates BSP catabolism during renewed shoot growth. Understanding the components of N accumulation, remobilization and utilization can provide insights leading to increased N use efficiency (NUE) in perennial plants.

Relevance:

20.00%

Publisher:

Abstract:

Special issue: Translational Nanomedicine

Relevance:

20.00%

Publisher:

Abstract:

A comprehensive environmental monitoring program was conducted in the Ojo Guareña cave system (Spain), one of the longest cave systems in Europe, to assess the magnitude of the spatiotemporal changes in carbon dioxide gas (CO2) in the cave–soil–atmosphere profile. The key climate-driven processes involved in gas exchange, primarily gas diffusion and cave ventilation due to advective forces, were characterized. The spatial distributions of both processes were described through measurements of CO2 and its carbon isotopic signal (δ13C[CO2]) from exterior, soil and cave air samples analyzed by cavity ring-down spectroscopy (CRDS). The trigger mechanisms of air advection (temperature or air density differences, or barometric imbalances) were tracked by continuous logging systems. Radon monitoring was also used to characterize the changing airflow, which results in a predictable seasonal or daily pattern of CO2 concentrations and their carbon isotopic signal. Large daily oscillations of CO2 levels, ranging from 680 to 1900 ppm day−1 on average, were registered while the exterior air temperature oscillated daily around the cave air temperature. These daily variations in CO2 concentration became unobservable once the outside air temperature remained continuously below the cave temperature and a prevailing advective renewal of cave air was established, such that the daily-averaged concentrations of CO2 reached minimum values close to the atmospheric background. The daily pulses of CO2 and other tracer gases such as radon (222Rn) were smoothed at the inner cave locations, where the fluctuations of both gases were primarily correlated with medium-term changes in air pressure. A pooled analysis of these data provided evidence that atmospheric air inhaled into dynamically ventilated caves can later return to the lower troposphere as CO2-rich cave air.

Relevance:

20.00%

Publisher:

Abstract:

Identifying influential nodes is of theoretical significance in many domains. Although many methods have been proposed to solve this problem, they are typically evaluated under single-source attack in scale-free networks. Meanwhile, some studies have speculated that combinations of these methods may achieve better results. To evaluate this speculation, and to design a universal strategy suitable for different types of networks under multi-source attacks, this paper proposes an attribute fusion method with two independent strategies that reveal the correlation between existing ranking methods and indicators: one based on feature union (FU) and the other based on feature ranking (FR). Two propagation models, drawn from recommendation systems and network immunization, are used to simulate the efficiency of the proposed method. Experimental results show that the method can enlarge information spreading in recommendation systems and restrain virus propagation in network immunization across different types of networks under multi-source attacks.
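
The abstract does not spell out the FU and FR formulations, so the following is only a minimal sketch of the general attribute-fusion idea: rank the nodes under several indicators and aggregate the ranks. Degree and betweenness centrality stand in for the indicators, networkx is assumed as the graph library, and the rank-averaging rule is an illustrative choice rather than the paper's method.

# Minimal sketch of attribute fusion: combine two node-ranking indicators
# by averaging their per-node ranks. Indicators and the aggregation rule
# are illustrative stand-ins, not the FU/FR strategies from the paper.
import networkx as nx

def fused_ranking(G):
    degree = nx.degree_centrality(G)            # indicator 1
    betweenness = nx.betweenness_centrality(G)  # indicator 2

    def ranks(scores):
        # Rank nodes from most to least central (0 = most central).
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {node: r for r, node in enumerate(ordered)}

    r_deg, r_btw = ranks(degree), ranks(betweenness)
    # Feature-ranking style fusion: average the per-indicator ranks.
    fused = {n: (r_deg[n] + r_btw[n]) / 2 for n in G.nodes}
    return sorted(fused, key=fused.get)          # most influential first

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(100, 2, seed=1)
    print(fused_ranking(G)[:10])                 # top-10 candidate spreaders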

Relevance:

20.00%

Publisher:

Abstract:

We consider a clustered wireless sensor network (WSN) under epidemic malware propagation and address the problem of evaluating its reliability, so as to ensure efficient, continuous, and dependable transmission of sensed data from sensor nodes to the sink. To reconcile the intentional nature of malware with the randomness of a continuous-time Markov chain (CTMC), we introduce a strategic game that predicts malware infection, allowing a successful infection to be modelled as a CTMC state transition. Next, we devise a novel measure to compute the Mean Time to Failure (MTTF) of a sensor node, which represents the reliability of a node continuously performing tasks such as sensing, transmitting, and fusing data. Since clustered WSNs can be regarded as parallel-serial-parallel systems, the reliability of a clustered WSN can then be evaluated via classical reliability theory. Numerical results show the influence of parameters such as the true positive rate and the false positive rate on a sensor node's MTTF. Furthermore, we validate the method of reliability evaluation for a clustered WSN with respect to the number of sensor nodes in a cluster, the number of clusters in a route, and the number of routes in the WSN.
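
The game-theoretic derivation of a node's MTTF is not given in the abstract, but the parallel-serial-parallel composition mentioned there can be sketched directly. Purely for illustration, this assumes every node fails exponentially with the same MTTF, redundant nodes within a cluster (parallel), clusters along a route in series, and multiple routes to the sink in parallel.

# Hedged sketch of the parallel-serial-parallel reliability composition.
# Assumes identical, exponentially failing nodes; the per-node MTTF would
# come from the paper's game/CTMC model, here it is just an input number.
import math

def node_reliability(t, mttf):
    return math.exp(-t / mttf)                       # exponential lifetime

def cluster_reliability(t, mttf, nodes_per_cluster):
    r = node_reliability(t, mttf)
    return 1 - (1 - r) ** nodes_per_cluster          # parallel nodes

def route_reliability(t, mttf, nodes_per_cluster, clusters_per_route):
    r_cluster = cluster_reliability(t, mttf, nodes_per_cluster)
    return r_cluster ** clusters_per_route           # clusters in series

def wsn_reliability(t, mttf, nodes_per_cluster, clusters_per_route, routes):
    r_route = route_reliability(t, mttf, nodes_per_cluster, clusters_per_route)
    return 1 - (1 - r_route) ** routes               # parallel routes

if __name__ == "__main__":
    # e.g. a per-node MTTF of 1000 h, evaluated at t = 500 h
    print(wsn_reliability(500, 1000, nodes_per_cluster=5,
                          clusters_per_route=4, routes=3))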

Relevance:

10.00%

Publisher:

Abstract:

Bayesian Belief Networks (BBNs) are emerging as valuable tools for investigating complex ecological problems. In a BBN, the important variables in a problem are identified and causal relationships are represented graphically. Underpinning this is the probabilistic framework in which variables can take on a finite range of mutually exclusive states. Associated with each variable is a conditional probability table (CPT), showing the probability of the variable attaining each of its possible states conditioned on all possible combinations of its parents' states. Whilst the variables (nodes) are connected, the CPT attached to each node can be quantified independently. This allows each variable to be populated with the best data available, including expert opinion, simulation results or observed data. It also allows the information to be easily updated as better data become available.

This paper reports on the process of developing a BBN to better understand the initial rapid growth phase (initiation) of a marine cyanobacterium, Lyngbya majuscula, in Moreton Bay, Queensland. Anecdotal evidence suggests that Lyngbya blooms in this region have increased in severity and extent over the past decade. Lyngbya has been associated with acute dermatitis and a range of other health problems in humans. Blooms have been linked to ecosystem degradation and have also damaged commercial and recreational fisheries. However, the causes of blooms are as yet poorly understood.
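
As a minimal illustration of a CPT (the variables and probabilities below are hypothetical, not taken from the Lyngbya model), a node's states can be tabulated against every combination of its parents' states, with each row summing to one because the states are mutually exclusive:

# Hypothetical CPT for a node "Bloom" with parents "Temperature" and
# "Nutrients"; the numbers are made up purely to show the structure.
CPT_BLOOM = {
    # (temperature, nutrients): P(Bloom = state | parents)
    ("warm", "high"): {"initiates": 0.70, "absent": 0.30},
    ("warm", "low"):  {"initiates": 0.20, "absent": 0.80},
    ("cool", "high"): {"initiates": 0.10, "absent": 0.90},
    ("cool", "low"):  {"initiates": 0.02, "absent": 0.98},
}

def p_bloom(temperature, nutrients, state="initiates"):
    """Look up P(Bloom = state | Temperature, Nutrients)."""
    return CPT_BLOOM[(temperature, nutrients)][state]

if __name__ == "__main__":
    print(p_bloom("warm", "high"))   # 0.7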

Relevance:

10.00%

Publisher:

Abstract:

Access All was a performance produced following a three-month mentorship in web-based performance that I was commissioned to conduct for the performance company Igneous. This live, triple-site performance event for three performers in three remote venues was specifically designed for presentation at Access Grid Nodes: conference rooms located around the globe, equipped with high-end, open-source computer teleconferencing technology that allowed multiple nodes to cross-connect with each other. Whilst each room was set up somewhat differently, they all deployed the same basic infrastructure of multiple projectors, cameras, and sound, as well as a reconfigurable floor space. At that time these relatively formal setups imposed a clear series of limitations in terms of software capabilities and basic infrastructure, and so there was much interest in understanding how far their capabilities might be pushed.

Numerous performance experiments were undertaken between three Access Grid nodes at QUT Brisbane, VISLAB Sydney and the Manchester Supercomputing Centre, England, culminating in the public performance staged simultaneously between the sites, with local audiences at each venue and others online. Access All was devised in collaboration with the interdisciplinary performance company Bonemap, Kelli Dipple (Interarts curator, Tate Modern, London) and Mike Stubbs, British curator and Director of FACT (Liverpool).

This period of research and development was instigated and shaped by a public lecture I had earlier delivered in Sydney for the ‘Global Access Grid Network, Super Computing Global Conference’, entitled 'Performance Practice across Electronic Networks'. The findings of this work went on to inform numerous networked and performative works produced from 2002 onwards.

Relevance:

10.00%

Publisher:

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services is available, finding an appropriate Web service that meets the user's requirement is a challenge, which warrants an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery so as to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, as well as their input and output parameters, can lead to accurate Web service discovery, and appropriate linking of individually matched services should fully satisfy the user's requirements.

This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery, through a novel three-phase Web service discovery methodology. The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of the Web Service Description Language (WSDL) documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging applied to a large collection of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms that otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking whether multiple Web services can be linked is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In this link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the minimum-cost traversal path. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user.

To evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery; the proposed method outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also confirm that the fusion engine boosts the accuracy of Web service discovery by systematically combining the inputs from the semantic analysis (phase I) and the link analysis (phase II). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
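
The link-analysis idea can be sketched as follows: model each Web service as a node of a weighted, directed graph (an edge meaning one service's outputs can feed another's inputs) and compute all-pairs shortest paths to obtain the cheapest composition between any two services. The service names, edge costs, and the use of networkx are illustrative assumptions, not details from the thesis.

# Sketch of phase II (link analysis): services as graph nodes, with an
# all-pairs shortest-path computation giving the minimum-cost composition.
# Names, weights, and the networkx dependency are hypothetical.
import networkx as nx

G = nx.DiGraph()
# An edge A -> B means service B can be chained after service A;
# the weight is the assumed cost of that link.
G.add_weighted_edges_from([
    ("GeocodeAddress",    "FindNearbyHotels",  1.0),
    ("FindNearbyHotels",  "CheckAvailability", 2.0),
    ("GeocodeAddress",    "CheckAvailability", 5.0),
    ("CheckAvailability", "BookRoom",          1.5),
])

# All-pairs shortest paths (Dijkstra run from every node).
cost = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
path = dict(nx.all_pairs_dijkstra_path(G, weight="weight"))

print(cost["GeocodeAddress"]["BookRoom"])   # 4.5 (cheapest total cost)
print(path["GeocodeAddress"]["BookRoom"])   # the composition chain itself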

Relevance:

10.00%

Publisher:

Abstract:

For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while avoiding their disadvantages. In this paper, a bridging transition algorithm is developed for coupling the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domains where high accuracy is required, and the finite element method is employed in the other sub-domains to improve computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, which are independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. The coupled method has promising potential for practical applications, because it takes advantage of both the meshfree method and the FEM while overcoming their shortcomings.
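
The modified variational formulation itself is not reproduced in the abstract, but a generic Lagrange-multiplier coupling of this kind can be sketched in its standard form (not necessarily the paper's exact expression): the total potential energy of the two sub-domains is augmented with a constraint term that enforces displacement compatibility across the transition interface Gamma,

\Pi = \Pi_{\mathrm{MM}}\left(u^{\mathrm{MM}}\right)
    + \Pi_{\mathrm{FEM}}\left(u^{\mathrm{FEM}}\right)
    + \int_{\Gamma} \lambda \cdot \left(u^{\mathrm{MM}} - u^{\mathrm{FEM}}\right)\,\mathrm{d}\Gamma ,
\qquad \delta\Pi = 0 ,

where taking variations with respect to u^MM, u^FEM and the multiplier field lambda yields the coupled system, and stationarity with respect to lambda recovers u^MM = u^FEM on the interface.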

Relevance:

10.00%

Publisher:

Abstract:

Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems have recently been added to the already wide collection of wireless sensor network applications. The PCS/SCADA environment is somewhat more amenable to the use of heavy cryptographic mechanisms such as public key cryptography than other sensor application environments. The sensor nodes in this environment, however, are still open to devastating attacks such as node capture, which makes designing secure key management challenging. In this paper, a key management scheme is proposed to defeat node capture attacks by offering both forward and backward secrecy. Our scheme overcomes the pitfalls of Nilsson et al.'s scheme while being no more expensive.
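
The proposed scheme itself is not described in the abstract, so the following is only a generic illustration of the two secrecy properties: evolving the key through a one-way hash protects previously used keys from a captured node, while mixing in fresh material that is delivered only to legitimate (non-revoked) nodes keeps future keys out of reach.

# Generic illustration, not the paper's scheme: one-way key evolution plus
# fresh-randomness rekeying, showing how forward and backward secrecy can
# be provided against a node-capture adversary.
import hashlib
import os

def evolve(key: bytes) -> bytes:
    """Hash-chain step: a node that captures the current key cannot invert
    SHA-256 to recover the keys used in earlier epochs."""
    return hashlib.sha256(key).digest()

def rekey(key: bytes) -> tuple:
    """Rekeying with fresh randomness that is distributed only to
    non-revoked nodes, so an evicted or captured node cannot derive the
    keys of later epochs."""
    fresh = os.urandom(32)
    return hashlib.sha256(key + fresh).digest(), fresh

if __name__ == "__main__":
    k0 = os.urandom(32)          # initial network key
    k1 = evolve(k0)              # k0 remains safe even if k1 leaks
    k2, nonce = rekey(k1)        # k2 is unreachable without `nonce`
    print(k1.hex()[:16], k2.hex()[:16])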

Relevance:

10.00%

Publisher:

Abstract:

President’s Message

Hello fellow AITPM members,

Well, I can’t believe it’s already October! My office is already organising its end-of-year function and looking to plan for 2010. Our whole School is moving to a different building next year, with the lovely L block eventually making way for a new shiny one. Those of you who have entered the Brisbane CBD from the south side, across the Captain Cook Bridge, would know L block as the big nine-storey brick and concrete Lego-block ode to 1970s functional architecture which greets you on the right-hand side.

Onto traffic matters: an issue that has been tossing around in my mind of late is that of speed. I know I am growing older and may prematurely be becoming a “grumpy old man”, but everyone around me locally seems to be accelerating off from the stop line much faster than I was taught to for economical driving, both here and in the United States (yes, they made my wife and me resit our written and practical driving tests when we lived there). People here in Australia also seem to be driving right on top of the posted speed limit, on whichever part of the road hierarchy, whether urban or rural. I was also taught on both sides of the planet that the posted speed limit is a maximum legal speed, not the recommended driving speed. This message did seem to sink in with the American drivers around me when we lived in Oregon, where people did appear to drive more cautiously. Further, posted speed limits in Oregon were, and I presume still are, set more conservatively than Australian limits by about 5 mph or 10 km/h for any given part of the road hierarchy.

Another excellent speed limit treatment used in Oregon was in school zones, where reduced speed limits applied “when children are present” rather than during prescribed hours on school days. This would be especially useful here in Australia, where a lot of extra-curricular activities take place around schools outside of the prescribed speed limit hours. Before- and after-hours school care is on the increase (with parents dropping off and collecting children near dawn and dusk in the winter), and many child-centred land uses are located adjacent to schools, such as Scouts/Guides halls, swimming pools and parks.

Consequently, I believe there needs to be some consideration towards more public campaigning about economical driving and the real purpose of the speed limit, or perhaps even a rethink of the speed limit concept, if people really are driving on top of it and it’s not just me becoming grumpier (our industrial psychology friends at the research centres may be able to assist us here).

The Queensland organising committee is now in full swing organising the 2010 AITPM National Conference, What’s New?, so please keep a lookout for related content.

Best regards to all,
Jon Bunker

PS: A cartoonist’s view of traffic engineers, which I thought you might enjoy: http://xkcd.com/277/

Relevance:

10.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems and of multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services:

• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).

Based on these four challenges, there are two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using a Grid Computing facility, in particular for large data-processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new NRTK framework based on Grid Computing, whilst some aspects of the system's performance are yet to be improved in future work.
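
As a highly simplified illustration of the scheduling idea (the real framework relies on Grid middleware and Ntrip clients, and the station names and processing stub here are hypothetical), per-station processing jobs can be fanned out across a pool of compute nodes:

# Simplified sketch only: dispatch per-station RTK processing jobs across
# a pool of workers standing in for the five-node Grid environment.
# Station names and process_rtcm are placeholders, not the real system.
from concurrent.futures import ThreadPoolExecutor

STATIONS = ["CORS_A", "CORS_B", "CORS_C", "CORS_D", "CORS_E"]

def process_rtcm(station: str) -> str:
    # Placeholder for fetching the station's RTCM stream (e.g. via an
    # Ntrip client) and running the simplified RTK computation on the
    # Grid node the job was scheduled to.
    return f"{station}: baseline solution computed"

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=5) as pool:
        for result in pool.map(process_rtcm, STATIONS):
            print(result)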

Relevance:

10.00%

Publisher:

Abstract:

In Orissa state, India, the DakNet system supports asynchronous Internet communication between an urban hub and rural nodes. DakNet is noteworthy in many respects, not least in how the system leverages existing transport infrastructure. Wi-Fi transceivers mounted on local buses send and receive user data from roadside kiosks, for later transfer to/from the Internet via wireless protocols. This store-and-forward system allows DakNet to offer asynchronous communication capacity to rural users at low cost. The original ambition of the DakNet system was to provide email and SMS facilities to rural communities. Our 2008 study of the communicative ecology surrounding the DakNet system revealed that this ambition has now evolved – in response to market demand – to the extent that e-shopping (rather than email) has become the primary driver behind the DakNet offer.

Relevance:

10.00%

Publisher:

Abstract:

Mapping the physical world, the arrangement of continents and oceans, cities and villages, mountains and deserts, while not without its own contentious aspects, can at least draw upon centuries of previous work in cartography and discovery. To map virtual spaces is another challenge altogether. Are cartographic conventions applicable to depictions of the blogosphere, or of the internet in general? Is a more mathematical approach required even to start making sense of the shape of the blogosphere, to understand the network created by and between blogs? In my research comparing information flows in Australian and French political blogs, visualising the data obtained is important, as it can demonstrate the spread of ideas and topics across blogs. However, how best to depict the flows, the links, and the spaces between them is still unclear. Are network theory and systems of hubs and nodes more relevant than mass communication theories to the research at hand, influencing the nature of any map produced? Is it even a good idea to try to apply boundaries like ‘Australian’ and ‘French’ to parts of a map that does not reflect international borders or the Mercator projection? While drawing upon some of my work in progress, this paper will also evaluate previous maps of the blogosphere and approaches to depicting networks of blogs. As such, the paper will provide a greater awareness of the tools available and of the strengths and limitations of mapping methodologies, helping to shape the direction of my research in a field still very much under development.