134 results for network model


Relevance: 30.00%

Abstract:

Major imperfections in crosslinked polymers include loose or dangling chain ends that lower the crosslink density, thereby reducing elastic recovery and increasing solvent swelling. These imperfections are hard to detect, quantify and control when the network is initiated by free radical reactions. As an alternative approach, the sol-gel synthesis of a model poly(ethylene glycol) (PEG-2000) network is described using controlled amounts of bis- and mono-triethoxysilyl propyl urethane PEG precursors to give silsesquioxane (SSQ, R-SiO1.5) structures as crosslink junctions with a controlled number of dangling chains. The effect of the number of dangling chains on the structure and connectivity of the dried SSQ networks has been determined by step-crystallization differential scanning calorimetry. The role that micelle formation plays in controlling the sol-gel PEG network connectivity has been studied by dynamic light scattering of the bis- and mono-triethoxysilyl precursors, and the networks have been characterized by 29Si solid-state NMR, sol fraction and swelling measurements. These show that the dangling chains increase the mesh size and water uptake. Compared to other end-linked PEG hydrogels, the SSQ-crosslinked networks show a low sol fraction and high connectivity, which reduces solvent swelling, degree of crystallinity and the crystal transition temperature. The increased degree of freedom in segment movement on the addition of dangling chains in the SSQ-crosslinked network facilitates the packing process in crystallization of the dry network and, in the hydrogel, helps to accommodate more water molecules before reaching equilibrium.

Relevance: 30.00%

Abstract:

Enterprise social networks are organizationally bounded online platforms for users to interact with one another and maintain interpersonal relationships. The allure of these technologies is often seen in intra-organizational communication, collaboration and innovation. How these technologies actually support organizational innovation efforts remains unclear. A specific challenge is whether digital content on these platforms converts to actual innovation development efforts. In this study we set out to examine innovation-centric content flows on enterprise social networking platforms, and advance a conceptual model that seeks to explain which innovations conveyed in the digital content will traverse from the digital platform into regular processes. We describe the important constructs of our model and offer strategies for operationalizing these constructs. We conclude with an outlook on our ongoing empirical study, which will explore and validate the key propositions of our model, and we sketch some potential implications for industry and academia.

Relevance: 30.00%

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. This technique is data-intensive, as explicit data at a fine level of detail is used, and it is computer-intensive, as many interactions between agents, which can learn and have a goal, are required. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from the usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same – this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on which simulation is to be run. For example, data can be used to describe the environment to which the agents respond – e.g. weather for solar panels – or to describe the assets and their relation to one another – e.g. the network assets. Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, either sequentially or in parallel for speed. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport; its extension with electric vehicles is part of future work.
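The asset/agent separation described above can be sketched in a few lines. This is a hypothetical Python illustration of the principle only (MODAM itself is built on Java OSGi and Eclipse plugins, and all class and parameter names here are invented): a battery asset holds only physical characteristics, an agent holds only behaviour, and a factory composes them, mirroring the battery example in the abstract.

```python
class BatteryAsset:
    """Physical characteristics only -- reusable across simulations."""
    def __init__(self, capacity_kwh, depth_of_discharge=0.8):
        self.capacity_kwh = capacity_kwh
        self.depth_of_discharge = depth_of_discharge
        self.stored_kwh = 0.0


class PeakShavingAgent:
    """Behaviour only: discharge the battery when demand exceeds a threshold."""
    def __init__(self, asset, threshold_kw):
        self.asset = asset
        self.threshold_kw = threshold_kw

    def step(self, demand_kw):
        if demand_kw > self.threshold_kw and self.asset.stored_kwh > 0:
            release = min(demand_kw - self.threshold_kw, self.asset.stored_kwh)
            self.asset.stored_kwh -= release
            return demand_kw - release   # net demand seen by the network
        return demand_kw


def agent_factory(asset_specs, behaviour, **kwargs):
    """Factory-style composition: identical assets, interchangeable behaviours."""
    return [behaviour(BatteryAsset(**spec), **kwargs) for spec in asset_specs]


agents = agent_factory([{"capacity_kwh": 10.0}], PeakShavingAgent, threshold_kw=5.0)
agents[0].asset.stored_kwh = 4.0
shaved = agents[0].step(demand_kw=8.0)   # 3 kWh released -> 5 kW net demand
```

Swapping `PeakShavingAgent` for, say, a self-consumption behaviour would reuse the same asset description unchanged, which is the composability benefit the abstract claims.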

Relevance: 30.00%

Abstract:

Biodiesel, produced from renewable feedstocks, represents a more sustainable source of energy and will therefore play a significant role in providing the energy requirements for transportation in the near future. Chemically, all biodiesels are fatty acid methyl esters (FAME), produced from raw vegetable oil and animal fat. However, clear differences in chemical structure are apparent from one feedstock to the next in terms of chain length, degree of unsaturation, number of double bonds and double bond configuration, all of which determine the fuel properties of biodiesel. In this study, prediction models were developed to estimate the kinematic viscosity of biodiesel using an Artificial Neural Network (ANN) modelling technique. While developing the model, 27 parameters based on the chemical composition commonly found in biodiesel were used as the input variables, and kinematic viscosity of biodiesel was used as the output variable. The data needed to develop and simulate the network were collected from more than 120 published peer-reviewed papers. The Neural Networks Toolbox of MatLab R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture and learning algorithm were optimised by trial and error to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the coefficient of determination (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found high predictive accuracy of the ANN in predicting fuel properties of biodiesel and has demonstrated the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties. Therefore the model developed in this study can be a useful tool to accurately predict biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
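The workflow above (composition features in, viscosity out, trained by iterative optimisation) can be sketched with a tiny feed-forward network. This is not the paper's 27-input MATLAB model; it is a minimal NumPy illustration on synthetic data, with an invented target function standing in for the composition–viscosity relationship.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 3))            # stand-in composition features
y = (0.5 * X[:, 0] + 2.0 * X[:, 1] ** 2 - X[:, 2]).reshape(-1, 1)  # made-up "viscosity"

# One hidden layer of 8 tanh units, trained by plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
mse0 = float(np.mean((pred0 - y) ** 2))    # error before training

lr = 0.1
for _ in range(5000):
    H, pred = forward(X)
    err = (pred - y) / len(X)              # gradient of MSE up to a constant
    gW2 = H.T @ err; gb2 = err.sum(0, keepdims=True)
    dH = (err @ W2.T) * (1 - H ** 2)       # backprop through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
r2 = 1 - mse / float(np.var(y))            # coefficient of determination
```

The paper's trial-and-error architecture search corresponds to varying the hidden-layer size and learning rate here and comparing the resulting R2 values.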

Relevance: 30.00%

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing the scalar ambiguity, a rounding method with a controllable error probability is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter. It can either be reduced to special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network.
The results show that the new widelane AR scheme obtains a 99.4% success rate with a 0.6% failure rate, while the new narrowlane rounding method obtains a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with a rigorously controllable probability of incorrectly fixed ambiguities.
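The idea of rounding with a controllable error probability can be sketched as follows. Assuming the float ambiguity error is Gaussian with known standard deviation (the paper's exact procedure is not reproduced here; this is the standard textbook success probability for integer rounding), the probability that rounding yields the correct integer is P(|e| < 0.5) = erf(0.5 / (σ√2)), and the ambiguity is fixed only when the implied failure rate is below a chosen tolerance.

```python
import math

def rounding_success_prob(sigma):
    """P(|e| < 0.5) for e ~ N(0, sigma^2): probability that rounding the
    float ambiguity yields the correct integer."""
    return math.erf(0.5 / (sigma * math.sqrt(2)))

def try_fix(float_amb, sigma, max_failure_rate=0.01):
    """Fix by rounding only if the failure probability stays controllable."""
    if 1 - rounding_success_prob(sigma) <= max_failure_rate:
        return round(float_amb)
    return None                          # keep the ambiguity float

fixed = try_fix(5.1, sigma=0.12)         # precise float solution -> fix to 5
kept_float = try_fix(5.1, sigma=0.4)     # noisy float solution -> do not fix
```

This also shows why better float solutions (contribution 1) feed directly into AR reliability: shrinking σ drives the failure probability toward zero.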

Relevance: 30.00%

Abstract:

Network coding is a method for achieving channel capacity in networks. The key idea is to allow network routers to linearly mix packets as they traverse the network so that recipients receive linear combinations of packets. Network coded systems are vulnerable to pollution attacks where a single malicious node floods the network with bad packets and prevents the receiver from decoding correctly. Cryptographic defenses to these problems are based on homomorphic signatures and MACs. These proposals, however, cannot handle mixing of packets from multiple sources, which is needed to achieve the full benefits of network coding. In this paper we address integrity of multi-source mixing. We propose a security model for this setting and provide a generic construction.
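The core mechanism can be illustrated in the simplest case, coding over GF(2): a router forwards the XOR of two packets instead of the packets themselves, and a receiver holding either original can decode the other from the single coded transmission. (The packet contents below are made up; real schemes use larger fields and random coefficients, and the pollution problem arises because a receiver cannot tell a valid combination from a corrupted one without the cryptographic checks the paper discusses.)

```python
def xor(a, b):
    """Linear combination over GF(2): bytewise XOR of equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"pkt-one!"
p2 = b"pkt-two!"

# An intermediate router forwards one coded packet in place of two.
coded = xor(p1, p2)

# A receiver that already holds p1 (e.g. from another link) recovers p2.
decoded = xor(coded, p1)
```

A polluted `coded` packet would XOR garbage into `decoded` undetected, which is exactly what homomorphic signatures/MACs are meant to prevent.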

Relevance: 30.00%

Abstract:

Validation is an important issue in the development and application of Bayesian Belief Network (BBN) models, especially when the outcome of the model cannot be directly observed. Despite this, few frameworks for validating BBNs have been proposed and fewer have been applied to substantive real-world problems. In this paper we adopt the approach by Pitchforth and Mengersen (2013), which includes nine validation tests that each focus on the structure, discretisation, parameterisation and behaviour of the BBNs included in the case study. We describe the process and result of implementing a validation framework on a model of a real airport terminal system with particular reference to its effectiveness in producing a valid model that can be used and understood by operational decision makers. In applying the proposed validation framework we demonstrate the overall validity of the Inbound Passenger Facilitation Model as well as the effectiveness of the validity framework itself.
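Behaviour-oriented validation tests of the kind referenced above typically check that a BBN's posteriors move in the expected direction when evidence is entered. A toy two-node network (hypothetical numbers, not the actual Inbound Passenger Facilitation Model) makes the check concrete:

```python
# Toy BBN: Congestion -> LongQueue, queried by exact enumeration.
p_congestion = {True: 0.3, False: 0.7}
p_longqueue_given = {True: 0.9, False: 0.2}   # P(LongQueue=T | Congestion)

# Marginal: P(LongQueue = True), summing out the parent node.
p_longqueue = sum(p_congestion[c] * p_longqueue_given[c] for c in (True, False))

# Bayes' rule: P(Congestion = True | LongQueue = True). A behaviour test
# asserts this posterior exceeds the prior, i.e. evidence of a long queue
# should raise the model's belief in congestion.
posterior = p_congestion[True] * p_longqueue_given[True] / p_longqueue
```

Here p_longqueue = 0.3·0.9 + 0.7·0.2 = 0.41 and the posterior is 0.27/0.41 ≈ 0.66, up from the 0.3 prior, so this toy model would pass that behaviour test.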

Relevance: 30.00%

Abstract:

Background: An arteriovenous loop (AVL) enclosed in a polycarbonate chamber in vivo produces a fibrin exudate, which acts as a provisional matrix for the development of a tissue-engineered microcirculatory network. Objectives: By administering enoxaparin sodium, an inhibitor of fibrin polymerization, we assessed the significance of fibrin scaffold formation for AVL construct size (including the AVL, fibrin scaffold, and new tissue growth into the fibrin), growth, and vascularization, compared to controls. Methods: In Sprague Dawley rats, an AVL was created on the femoral vessels and inserted into a polycarbonate chamber in the groin in 3 control groups (Series I) and 3 experimental groups (Series II). Two hours before surgery and 6 hours post-surgery, saline (Series I) or enoxaparin sodium (0.6 mg/kg, Series II) was administered intraperitoneally. Thereafter, the rats were injected daily with saline (Series I) or enoxaparin sodium (1.5 mg/kg, Series II) until construct retrieval at 3, 10, or 21 days. The retrieved constructs underwent weight and volume measurements, and morphologic/morphometric analysis of new tissue components. Results: Enoxaparin sodium treatment resulted in the development of smaller AVL constructs at 3, 10, and 21 days. Construct weight and volume were significantly reduced at 10 days (control weight 0.337 ± 0.016 g [mean ± SEM] vs treated 0.228 ± 0.048 g [P < .001]; control volume 0.317 ± 0.015 mL vs treated 0.184 ± 0.039 mL [P < .01]) and 21 days (control weight 0.306 ± 0.053 g vs treated 0.198 ± 0.043 g [P < .01]; control volume 0.285 ± 0.047 mL vs treated 0.148 ± 0.041 mL [P < .01]). Angiogenesis was delayed in the enoxaparin sodium-treated constructs, with the absolute vascular volume significantly decreased at 10 days (control vascular volume 0.029 ± 0.03 mL vs treated 0.012 ± 0.002 mL [P < .05]).
Conclusion: In this in vivo tissue engineering model, endogenous, extra-vascularly deposited fibrin volume determines construct size and vascular growth in the first 3 weeks and is, therefore, critical to full construct development.

Relevance: 30.00%

Abstract:

This article develops methods for spatially predicting daily change of dissolved oxygen (Dochange) at both sampled locations (134 freshwater sites in 2002 and 2003) and other locations of interest throughout a river network in South East Queensland, Australia. In order to deal with the relative sparseness of the monitoring locations in comparison to the number of locations where one might want to make predictions, we make a classification of the river and stream locations. We then implement optimal spatial prediction (ordinary and constrained kriging) from geostatistics. Because of their directed-tree structure, rivers and streams offer special challenges. A complete approach to spatial prediction on a river network is given, with special attention paid to environmental exceedances. The methodology is used to produce a map of Dochange predictions for 2003. Dochange is one of the variables measured as part of the Ecosystem Health Monitoring Program conducted within the Moreton Bay Waterways and Catchments Partnership.
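Ordinary kriging, one of the two predictors named above, can be sketched in 1-D. This illustration uses plain Euclidean distances, an exponential covariance, and made-up dissolved-oxygen values; on a real river network the distances and covariances must respect the directed-tree structure, which is exactly the special challenge the article addresses.

```python
import numpy as np

s = np.array([0.0, 1.0, 2.5, 4.0])          # monitoring locations (made up)
z = np.array([6.2, 5.8, 7.1, 6.5])          # observed DO change (made up)

def cov(h, sill=1.0, range_=2.0):
    """Exponential covariance model (an assumed choice, not the paper's)."""
    return sill * np.exp(-np.abs(h) / range_)

def ordinary_krige(s0):
    """BLUP with weights summing to 1, via the Lagrange-augmented system."""
    n = len(s)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(s[:, None] - s[None, :])
    A[n, n] = 0.0                            # Lagrange multiplier block
    b = np.ones(n + 1)
    b[:n] = cov(s - s0)
    sol = np.linalg.solve(A, b)
    return float(sol[:n] @ z)                # first n entries are the weights

pred = ordinary_krige(1.6)                   # prediction between two sites
```

With no nugget effect the predictor interpolates exactly at the monitoring sites, which is a useful sanity check before mapping predictions across unsampled stream locations.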

Relevance: 30.00%

Abstract:

In this chapter we make the transition towards the design of business models and the related critical issues. We develop a model that helps us understand the causal relationships that determine the viability and feasibility of business models, i.e. long-term profitability and market adoption. We argue that designing viable business models requires balancing the requirements and interests of the actors involved, within and between the various business model domains. Requirements in the service domain guide the design choices in the technology domain, which in turn affect network formation and the financial arrangements. It is important to understand the Critical Design Issues (CDIs) involved in business models and their interdependencies. In this chapter, we present the Critical Design Issues involved in designing mobile service business models, and demonstrate how they are linked to the Critical Success Factors (CSFs) with regard to business model viability. This results in a causal model for understanding business model viability, as well as providing grounding for the business model design approach outlined in Chapter 5.

Relevance: 30.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs for demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first argues that Data Synchronization Error (DSE) is the most inherent factor amongst the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), both of which have been widely applied in the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as benchmark data after being corrupted with a certain level of noise to account for the higher presence of this factor in SHM-oriented WSNs. From this source, a large number of simulations have been run to generate multiple DSE-corrupted datasets to facilitate statistical analyses. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques, and the use of channel projection for the time-domain OMA technique to cope with DSE, are recommended.
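Why DSE corrupts merged-setup modal results can be seen in the frequency domain: a synchronization error Δt between two channels appears in their cross-spectrum as a frequency-dependent phase shift of ωΔt, distorting the phase information OMA relies on. The toy check below (synthetic sinusoids, not the paper's bridge data) verifies that relationship numerically.

```python
import numpy as np

fs, n, f = 100.0, 1000, 5.0                  # sample rate (Hz), samples, signal freq
d = 2                                        # delay in samples: DSE of 0.02 s
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f * t)                # reference channel
y = np.sin(2 * np.pi * f * (t - d / fs))     # nominally identical, delayed channel

X, Y = np.fft.fft(x), np.fft.fft(y)
k = int(f * n / fs)                          # FFT bin of the 5 Hz component
phase_err = np.angle(X[k] * np.conj(Y[k]))   # measured inter-channel phase
expected = 2 * np.pi * f * d / fs            # omega * dt
```

Because the error grows linearly with frequency, higher modes suffer more, which is consistent with phase-sensitive techniques needing more precautions than magnitude-dominated ones like FDD peak picking.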

Relevance: 30.00%

Abstract:

Distributed Network Protocol Version 3 (DNP3) is the de-facto communication protocol for power grids. Standards-based interoperability among devices has made the protocol useful to other infrastructures such as water, sewage, oil and gas. DNP3 is designed to facilitate interaction between master stations and outstations. In this paper, we apply a formal modelling methodology called Coloured Petri Nets (CPN) to create an executable model of the DNP3 protocol. The model facilitates analysis of the protocol to ensure that it behaves as expected. We also illustrate how to verify and validate the behaviour of the protocol, using the CPN model and the corresponding state space tool to determine whether there are insecure states. With this approach, we were able to identify a Denial of Service (DoS) attack against the DNP3 protocol.
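The state-space analysis referred to above amounts to enumerating all reachable states of the model and searching them for insecure ones. The sketch below is a deliberately tiny, hypothetical master/outstation transition system (not the actual DNP3 CPN model): breadth-first exploration shows that an attacker able to inject an unauthenticated restart request makes a stuck "restarting" (DoS) state reachable.

```python
from collections import deque

def transitions(state):
    """Successor states of the toy protocol; state = (mode, attacker_present)."""
    mode, attacker = state
    succ = []
    if mode == "idle":
        succ.append(("polling", attacker))
    if mode == "polling":
        succ.append(("idle", attacker))              # normal poll cycle
        if attacker:
            succ.append(("restarting", attacker))    # injected restart request
    if mode == "restarting":
        succ.append(("restarting", attacker))        # stuck: denial of service
    return succ

def reachable(initial):
    """Breadth-first state-space exploration, as a CPN state space tool does."""
    seen, queue = {initial}, deque([initial])
    while queue:
        for nxt in transitions(queue.popleft()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

states_benign = reachable(("idle", False))
states_attacked = reachable(("idle", True))
```

Real CPN state spaces are far larger and use coloured tokens and place/transition structure, but the verification question, "is an insecure state reachable?", is the same membership query.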

Relevance: 30.00%

Abstract:

Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error=1.4 ppb), while the best monthly model explained 76% (absolute RMS error=1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models were capable of capturing within-urban variability in NO2, and offer the ability to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006–2011. We are making our model predictions freely available for research.
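The modelling step can be sketched with ordinary least squares on synthetic data. The paper fits a generalised estimating equation to handle repeated monthly measurements at the same monitors; this simplified sketch ignores that correlation structure, and both predictors and coefficients below are invented, standing in for the satellite NO2 column and a land-use variable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 150
sat_no2 = rng.uniform(1, 6, n)               # tropospheric column (synthetic)
road_density = rng.uniform(0, 10, n)         # land-use predictor (synthetic)
obs_no2 = 1.0 + 1.2 * sat_no2 + 0.3 * road_density + rng.normal(0, 0.5, n)

# LUR-style fit: ground-level NO2 regressed on the predictor surface.
X = np.column_stack([np.ones(n), sat_no2, road_density])
beta, *_ = np.linalg.lstsq(X, obs_no2, rcond=None)
pred = X @ beta

r2 = 1 - np.sum((obs_no2 - pred) ** 2) / np.sum((obs_no2 - obs_no2.mean()) ** 2)
rmse = float(np.sqrt(np.mean((obs_no2 - pred) ** 2)))
```

Once fitted, applying the model is just evaluating `X @ beta` with the predictor values of each census mesh block, which is how national prediction surfaces like the one described are produced.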

Relevance: 30.00%

Abstract:

Supervisory Control and Data Acquisition (SCADA) systems are widely used to control critical infrastructure automatically. Capturing and analyzing packet-level traffic flowing through such a network is an essential requirement for problems such as legacy network mapping and fault detection. Within the framework of captured network traffic, we present a simple modeling technique that supports the mapping of the SCADA network topology via traffic monitoring. By characterizing atomic network components in terms of their input-output topology and the relationship between their data traffic logs, we show that these modeling primitives have good compositional behaviour, which allows complex networks to be modeled. Finally, the predictions generated by our model are found to be in good agreement with experimentally obtained traffic.
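The basic idea of topology mapping from traffic logs can be sketched as follows. The log format and node names are hypothetical (one observed `(src, dst)` pair per packet), and the heuristic shown, identifying the master as the node that initiates traffic to the most peers, is an illustrative simplification rather than the paper's actual compositional model.

```python
from collections import defaultdict

# Hypothetical captured traffic: one (src, dst) pair per observed packet.
packets = [
    ("master", "rtu1"), ("rtu1", "master"),
    ("master", "rtu2"), ("rtu2", "master"),
    ("master", "rtu3"), ("rtu3", "master"),
]

# Build the directed link map implied by the traffic.
links = defaultdict(set)
for src, dst in packets:
    links[src].add(dst)

# In a polling protocol the master initiates traffic to many outstations,
# while each outstation replies to one master.
master = max(links, key=lambda node: len(links[node]))
outstations = sorted(links[master])
```

Composing such per-component input-output descriptions is what lets larger network topologies be inferred piece by piece from passive captures.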

Relevance: 30.00%

Abstract:

In this paper, we present a dynamic model to identify influential users of micro-blogging services. Micro-blogging services, such as Twitter, allow their users (twitterers) to publish tweets and to follow other users in order to receive their tweets. Previous work on user influence on Twitter concentrates on the following-link structure and the content users publish, and seldom emphasizes the importance of interactions among users. We argue that, by focusing on user actions on the micro-blogging platform, user influence can be measured more accurately. Since micro-blogging is a powerful social media and communication platform, identifying influential users according to user interactions has more practical meaning; e.g., advertisers may care how many actions – purchases, in this scenario – the influential users can initiate, rather than how many advertisements they spread. Building on the idea of the PageRank algorithm, we propose a model over an action-based network that captures the influence users exert as they interact with the micro-blogging platform. Taking the evolution of micro-blogging over time into consideration, we extend our action-based user influence model into a dynamic one that can distinguish influential users in different time periods. Simulation results demonstrate that our models support and give reasonable explanations for the scenarios we considered.
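A PageRank-style computation over an action graph can be sketched as follows. The toy data and the exact weighting (edges weighted by observed interaction counts, e.g. retweets or replies) are illustrative assumptions; the paper's precise formulation, and its dynamic extension over time windows, are not reproduced here.

```python
# actor -> {target: number of observed actions directed at target}
actions = {
    "alice": {"carol": 3},
    "bob":   {"carol": 2, "alice": 1},
    "carol": {"alice": 1},
}
users = sorted(set(actions) | {t for ts in actions.values() for t in ts})
d = 0.85                                     # standard damping factor
rank = {u: 1.0 / len(users) for u in users}

for _ in range(100):                         # power iteration to convergence
    nxt = {u: (1 - d) / len(users) for u in users}
    for src, targets in actions.items():
        total = sum(targets.values())
        for dst, w in targets.items():
            nxt[dst] += d * rank[src] * w / total   # split rank by action count
    rank = nxt

most_influential = max(rank, key=rank.get)
```

The dynamic variant described in the abstract would recompute `rank` per time window on the actions observed in that window, so influence can rise and fall as interaction patterns change.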