896 results for Android, Peer to Peer, Wifi, Mesh Network


Relevance: 100.00%

Abstract:

The loss of habitat and biodiversity worldwide has led to considerable resources being spent for conservation purposes on actions such as the acquisition and management of land, the rehabilitation of degraded habitats, and the purchase of easements from private landowners. Prioritising these actions is challenging due to the complexity of the problem and because there can be multiple actors undertaking conservation actions, often with divergent or partially overlapping objectives. We use a modelling framework to explore this issue with a study involving two agents sequentially purchasing land for conservation. We apply our model to simulated data, using distributions taken from real data to simulate the cost of patches and the rarity and co-occurrence of species. In our model each agent attempted to implement a conservation network that met its target for the minimum cost, using the conservation planning software Marxan. We examine three scenarios where the conservation targets of the agents differ. The first scenario (called NGO-NGO) models the situation where two NGOs are targeting different sets of threatened species. The second and third scenarios (called NGO-Gov and Gov-NGO, respectively) represent a case where a government agency attempts to implement a complementary conservation network representing all species, while an NGO is focused on achieving additional protection for the most endangered species. For each of these scenarios we examined three types of interactions between agents: i) acting in isolation, where the agents attempt to achieve their targets solely through their own actions; ii) sharing information, where each agent is aware of the species representation achieved within the other agent's conservation network; and iii) pooling resources, where agents combine their resources and undertake conservation actions as a single entity. The latter two interactions represent different types of collaboration, and in each scenario we determine the cost savings from sharing information or pooling resources. In each case we examined the utility of these interactions from the viewpoint of the combined conservation network resulting from both agents' actions, as well as from each agent's individual perspective. The costs for each agent to achieve their objectives varied depending on the order in which the agents acted, the type of interaction between agents, and the specific goals of each agent. There were significant cost savings from increased collaboration via sharing information in the NGO-NGO scenario, where the agents' representation goals were mutually exclusive (in terms of species targeted). In the NGO-Gov and Gov-NGO scenarios, collaboration generated much smaller savings. If the two agents collaborate by pooling resources, there are multiple ways the total cost could be shared between both agents. For each scenario we investigate the costs and benefits for all possible cost-sharing proportions. We find that there is a range of cost-sharing proportions where both agents can benefit in the NGO-NGO scenario, while the NGO-Gov and Gov-NGO scenarios again showed little benefit. Although the model presented here has a range of simplifying assumptions, it demonstrates that the value of collaboration can vary significantly in different situations. In most cases, collaborating would have associated costs, and these costs need to be weighed against the potential benefits from collaboration.
The model demonstrates a method for determining the range of collaboration costs that would result in collaboration providing an efficient use of scarce conservation resources.
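
A minimal sketch of the cost-sharing argument above, under invented standalone and pooled costs: if pooling lowers the combined cost, there is an interval of sharing proportions in which each agent pays less than it would acting alone.

```python
# Hypothetical illustration of the cost-sharing reasoning: find the range of
# proportions of the pooled cost assigned to agent A under which both agents
# pay less than their standalone costs. All figures are invented.

def beneficial_sharing_range(cost_a_alone, cost_b_alone, cost_pooled):
    """Return (low, high) bounds on agent A's share of the pooled cost, or None."""
    high = cost_a_alone / cost_pooled          # A benefits below this share
    low = 1.0 - cost_b_alone / cost_pooled     # B benefits above this share
    low, high = max(low, 0.0), min(high, 1.0)
    return (low, high) if low < high else None

# Pooling saves money overall, so a window of mutually beneficial shares exists.
print(beneficial_sharing_range(cost_a_alone=120.0, cost_b_alone=100.0, cost_pooled=180.0))
```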

Relevance: 100.00%

Abstract:

The VPAC(1) receptor belongs to family B of G protein-coupled receptors (GPCR-B) and is activated upon binding of the vasoactive intestinal peptide (VIP). Despite the recent determination of the structure of the N terminus of several members of this receptor family, little is known about the structure of the transmembrane (TM) region and about the molecular mechanisms leading to activation. In the present study, we designed a new structural model of the TM domain and combined it with experimental mutagenesis experiments to investigate the interaction network that governs ligand binding and receptor activation. Our results suggest that this network involves the cluster of residues Arg(188) in TM2, Gln(380) in TM7, and Asn(229) in TM3. This cluster is expected to be altered upon VIP binding, because Arg(188) has been shown previously to interact with Asp(3) of VIP. Several point mutations at positions 188, 229, and 380 were experimentally characterized and were shown to severely affect VIP binding and/or VIP-mediated cAMP production. Double mutants built from reciprocal residue exchanges exhibit strong cooperative or anticooperative effects, thereby indicating the spatial proximity of residues Arg(188), Gln(380), and Asn(229). Because these residues are highly conserved in the GPCR-B family, they can moreover be expected to have a general role in mediating function.

Relevance: 100.00%

Abstract:

Designers of self-adaptive systems often formulate adaptive design decisions while making unrealistic or myopic assumptions about the system's requirements and environment. The decisions taken during this formulation are crucial for satisfying requirements. In environments characterized by uncertainty and dynamism, deviation from these assumptions is the norm and may trigger 'surprises'. Our method allows designers to make explicit links between the possible emergence of surprises, risks and design trade-offs. The method can be used to explore the design decisions for self-adaptive systems and to choose among decisions that better fulfil (or rather partially fulfil) non-functional requirements and address their trade-offs. The analysis can also provide designers with valuable input for refining the adaptation decisions to balance, for example, resilience (i.e. satisfiability of non-functional requirements and their trade-offs) and stability (i.e. minimizing the frequency of adaptation). The objective is to provide designers of self-adaptive systems with a basis for multi-dimensional what-if analysis to revise and improve their understanding of the environment and its effect on non-functional requirements, and thereafter decision-making. We have applied the method to a wireless sensor network for flood prediction. The application shows that the method gives rise to questions that were not explicitly asked before at design time and assists designers in the process of risk-aware, what-if and trade-off analysis.

Relevance: 100.00%

Abstract:

Genomics, proteomics and metabolomics are three areas that are routinely applied throughout the drug-development process as well as after a product enters the market. This review discusses all three 'omics, reporting on the key applications, techniques, recent advances and expectations of each. Genomics, mainly through the use of novel and next-generation sequencing techniques, has advanced areas of drug discovery and development through the comparative assessment of normal and diseased-state tissues, transcription and/or expression profiling, side-effect profiling, pharmacogenomics and the identification of biomarkers. Proteomics, through techniques including isotope-coded affinity tags, stable isotope labeling with amino acids in cell culture, isobaric tags for relative and absolute quantification, multidimensional protein identification technology, activity-based probes, protein/peptide arrays, phage display and two-hybrid systems, is utilized in multiple areas throughout the drug-development pipeline, including target and lead identification, compound optimization, the clinical trials process and post-market analysis. Metabolomics, although the most recent and least developed of the three 'omics considered in this review, provides a significant contribution to drug development through systems biology approaches. Already implemented to some degree in the drug-discovery industry and used in applications spanning target identification through to toxicological analysis, metabolic network understanding is essential in generating future discoveries.

Relevance: 100.00%

Abstract:

We analyze, through numerical simulations, the performance of a new modulation format, the serial dark soliton (SDS), for wide-area 100-Gb/s applications. We compare the performance of the SDS with conventional dark soliton, amplitude-modulation phase-shift keying (also known as duobinary), nonreturn-to-zero, and return-to-zero modulation formats when subjected to typical wide-area-network impairments. We show that the SDS has strong tolerance to chromatic dispersion and polarization-mode dispersion, while maintaining a compact spectrum suitable for the strong filtering requirements of ultradense wavelength-division-multiplexing applications. The SDS can be generated using commercially available components for 40-Gb/s applications and is cost efficient when compared with other 100-Gb/s electrical-time-division-multiplexing systems.

Relevance: 100.00%

Abstract:

Purpose - To generate a reflectance model of the fundus that allows an accurate non-invasive quantification of blood and pigments. Methods - A Monte Carlo simulation was used to produce a mathematical model of light interaction with the fundus at different wavelengths. The model predictions were compared with fundus images from normal volunteers in several spectral bands (peaks at 507, 525, 552, 585, 596 and 611 nm). The model was then used to calculate the concentration and distribution of the known absorbing components of the fundus. Results - The shape of the statistical distribution of the image data generally corresponded to that of the model data; the model, however, appears to overestimate the reflectance of the fundus in the longer wavelength region. As the absorption by xanthophyll has no significant effect on light transport above 534 nm, its distribution in the fundus was quantified: the wavelengths where both shape and distribution of image and model data matched (<553 nm) were used to train a neural network, which was then applied to every point in the image data. The xanthophyll distribution thus found was in agreement with published literature data in normal subjects. Conclusion - We have developed a method for optimising multi-spectral imaging of the fundus and a computer image analysis capable of estimating information about the structure and properties of the fundus. The technique successfully calculates the distribution of xanthophyll in the fundus of healthy volunteers. Further improvement of the model is required to allow the deduction of other parameters from images; investigations in known pathology models are also necessary to establish whether this method is of clinical use in detecting early chorioretinopathies, hence providing a useful screening and diagnostic tool.
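
As a rough, purely illustrative companion to the Monte Carlo modelling described above, the sketch below simulates a one-dimensional random walk of photons through a single absorbing and scattering layer and estimates diffuse reflectance; the optical coefficients are invented, and the geometry is far simpler than a multi-layer, multi-wavelength fundus model.

```python
# Minimal 1-D Monte Carlo sketch: photons take exponentially distributed steps,
# are absorbed with probability mu_a / (mu_a + mu_s) at each interaction, and
# otherwise scatter up or down. The fraction escaping the top surface is an
# estimate of diffuse reflectance. Coefficients are hypothetical.
import math, random

def diffuse_reflectance(mu_a, mu_s, thickness, n_photons=20000, seed=0):
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    reflected = 0
    for _ in range(n_photons):
        depth, direction = 0.0, 1.0              # photon enters travelling downward
        while True:
            depth += direction * (-math.log(1.0 - rng.random()) / mu_t)
            if depth < 0.0:                      # escaped back through the surface
                reflected += 1
                break
            if depth > thickness:                # transmitted through the layer
                break
            if rng.random() < mu_a / mu_t:       # absorbed at this interaction site
                break
            direction = rng.choice((-1.0, 1.0))  # isotropic scatter (1-D walk)
    return reflected / n_photons

# Higher absorption (as for blood near its absorption peaks) lowers the reflectance.
print(diffuse_reflectance(mu_a=2.0, mu_s=10.0, thickness=0.5))
print(diffuse_reflectance(mu_a=0.2, mu_s=10.0, thickness=0.5))
```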

Relevance: 100.00%

Abstract:

Since wind at the earth's surface has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safe and economic use of wind energy. In this paper, we investigated a combination of numeric and probabilistic models: a Gaussian process (GP) combined with a numerical weather prediction (NWP) model was applied to wind-power forecasting up to one day ahead. First, the wind-speed data from the NWP model were corrected by a GP; then, because there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, wind power forecasts were produced by modeling the relationship between the corrected wind speed and the power output using a censored GP. To validate the proposed approach, three real-world datasets were used for model training and testing. The empirical results were compared with several classical wind forecast models; based on the mean absolute error (MAE), the proposed model provides around 9% to 14% improvement in forecasting accuracy compared to an artificial neural network (ANN) model, and nearly 17% improvement on a third dataset from a newly built wind farm for which only a limited amount of training data is available.
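
A simplified sketch of the two-stage idea described above, using scikit-learn and synthetic data: one GP corrects the NWP wind speed, and a second GP maps the corrected speed to power. The paper uses a censored GP for the power stage; here predictions are merely clipped at an assumed rated power, which is only a crude stand-in.

```python
# Two-stage GP wind-power forecast sketch on synthetic data (not the study's datasets).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
nwp_speed = rng.uniform(0, 25, size=(200, 1))                            # forecast speed (m/s)
true_speed = 0.9 * nwp_speed + 1.0 + rng.normal(0, 0.8, nwp_speed.shape) # observed speed
rated_power = 2.0                                                        # MW, hypothetical turbine
power = np.clip(rated_power * (true_speed / 12.0) ** 3, 0, rated_power).ravel()

kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.5)
speed_gp = GaussianProcessRegressor(kernel=kernel).fit(nwp_speed, true_speed.ravel())
power_gp = GaussianProcessRegressor(kernel=kernel).fit(true_speed, power)

corrected = speed_gp.predict(nwp_speed).reshape(-1, 1)                   # stage 1: corrected speed
forecast = np.clip(power_gp.predict(corrected), 0, rated_power)          # stage 2: clipped power
print("MAE (MW):", np.mean(np.abs(forecast - power)))
```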

Relevance: 100.00%

Abstract:

We develop a simplified implementation of the Hoshen-Kopelman cluster counting algorithm adapted for honeycomb networks. In our implementation of the algorithm we assume that all nodes in the network are occupied and that links between nodes can be intact or broken. The algorithm counts how many clusters there are in the network and determines which nodes belong to each cluster. The network information is stored in two data sets: the first is related to the connectivity of the nodes and the second to the state of the links. The algorithm finds all clusters in only one scan across the network, and thereafter cluster relabeling operates on a vector whose size is much smaller than the size of the network. By counting the number of clusters of each size, the algorithm determines the cluster size probability distribution, from which the mean cluster size parameter can be estimated. Although our implementation of the Hoshen-Kopelman algorithm works only for networks with a honeycomb (hexagonal) structure, it can easily be changed to apply to networks with arbitrary connectivity between the nodes (triangular, square, etc.). The proposed adaptation of the Hoshen-Kopelman cluster counting algorithm is applied to studying the thermal degradation of a graphene-like honeycomb membrane by means of Molecular Dynamics simulation with a Langevin thermostat. ACM Computing Classification System (1998): F.2.2, I.5.3.
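
A compact union-find sketch of the cluster counting idea described above: all nodes are occupied, links are intact or broken, and clusters are the connected components formed by intact links. This version works on an arbitrary link list rather than reproducing the honeycomb-specific single-scan bookkeeping of the paper.

```python
# Union-find cluster counting: merge the clusters joined by each intact link,
# then tally cluster sizes to estimate the size distribution.
from collections import Counter

def cluster_sizes(n_nodes, intact_links):
    parent = list(range(n_nodes))

    def find(i):                          # find root label with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for a, b in intact_links:             # merge the two clusters joined by each link
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    sizes = Counter(find(i) for i in range(n_nodes))
    return sorted(sizes.values(), reverse=True)

# 6 nodes and two intact links -> clusters of size 3, 1, 1, 1.
sizes = cluster_sizes(6, [(0, 1), (1, 2)])
print(sizes, "mean size:", sum(sizes) / len(sizes))
```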

Relevance: 100.00%

Abstract:

Healthy brain functioning depends on efficient communication of information between brain regions, forming complex networks. By quantifying synchronisation between brain regions, a functionally connected brain network can be articulated. In neurodevelopmental disorders, where diagnosis is based on measures of behaviour and tasks, a measure of the underlying biological mechanisms holds promise as a potential clinical tool. Graph theory provides a tool for investigating the neural correlates of neuropsychiatric disorders, where there is disruption of efficient communication within and between brain networks. This research aimed to use recent conceptualisations of graph theory, along with measures of behaviour and cognitive functioning, to increase understanding of the neurobiological risk factors of atypical development. Using magnetoencephalography to investigate frequency-specific temporal dynamics at rest, the research aimed to identify potential biological markers derived from sensor-level whole-brain functional connectivity. Whilst graph theory has proved valuable for insight into network efficiency, its application is hampered by two limitations. First, its measures have hardly been validated in MEG studies, and second, graph measures have been shown to depend on methodological assumptions that restrict direct network comparisons. The first experimental study (Chapter 3) addressed the first limitation by examining the reproducibility of graph-based functional connectivity and network parameters in healthy adult volunteers. Subsequent chapters addressed the second limitation through the adapted minimum spanning tree (a network analysis approach that allows for unbiased group comparisons), along with graph network tools that had been shown in Chapter 3 to be highly reproducible. Network topologies were modelled in healthy development (Chapter 4) and atypical neurodevelopment (Chapters 5 and 6). The results provided support for the proposition that measures of network organisation, derived from sensor-space MEG data, offer insights that help to unravel the biological basis of typical brain maturation and neurodevelopmental conditions, with the possibility of future clinical utility.
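
A minimal sketch of the minimum-spanning-tree step mentioned above, on a synthetic connectivity matrix: stronger coupling is treated as a shorter distance, the MST is extracted with SciPy, and a simple topology metric (leaf fraction) is computed. Real MEG pipelines would derive the matrix from a synchronisation measure; the one here is random.

```python
# MST extraction from a synthetic sensor-by-sensor connectivity matrix.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
n = 12
conn = rng.uniform(0.05, 1.0, (n, n))
conn = (conn + conn.T) / 2.0                     # symmetric connectivity
np.fill_diagonal(conn, 1.0)                      # avoid divide-by-zero on the diagonal

dist = 1.0 / conn                                # strong coupling -> short edge
np.fill_diagonal(dist, 0.0)                      # no self-edges
mst = minimum_spanning_tree(dist).toarray()      # tree edge weights (upper triangle)

adjacency = (mst > 0) | (mst.T > 0)              # symmetrise the tree
degree = adjacency.sum(axis=1)
leaf_fraction = np.mean(degree == 1)             # fraction of nodes with one link
print("edges:", int(adjacency.sum() // 2), "leaf fraction:", leaf_fraction)
```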

Relevance: 100.00%

Abstract:

Starting from the database of Operophtera brumata L. collected between 1973 and 2000 by the Light Trap Network in Hungary, we introduce a simple theta-logistic population dynamical model based on endogenous and exogenous factors only. We create a set of indicators from which we can choose the elements that improve the fitting results most effectively. Then we extend the basic model with additive climatic factors. The parameter optimization is based on minimizing the root mean square error, and the best model is chosen according to the Akaike Information Criterion. Finally, we run the calibrated extended model with daily outputs of the regional climate model RegCM3.1, taking 1961-1990 as the reference period and 2021-2050 and 2071-2100 as future prediction periods. The results for the three time intervals are fitted with Beta distributions and compared statistically. The expected changes are discussed.
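
A small sketch of the fitting recipe described above, on synthetic data: a theta-logistic one-step-ahead prediction, parameters chosen by minimising root mean square error, and an AIC value computed from the resulting error variance. The actual study fits light-trap counts of Operophtera brumata and then adds climatic covariates.

```python
# Theta-logistic fit by RMSE minimisation, with AIC for model comparison.
import numpy as np
from scipy.optimize import minimize

def predict_next(n_t, r, k, theta):
    return n_t * np.exp(r * (1.0 - (n_t / k) ** theta))

def rmse(params, series):
    r, k, theta = params
    if k <= 0 or theta <= 0:
        return 1e9                                # keep the search in a valid region
    pred = predict_next(series[:-1], r, k, theta)
    return np.sqrt(np.mean((series[1:] - pred) ** 2))

rng = np.random.default_rng(2)
series = [50.0]
for _ in range(40):                               # simulate a noisy theta-logistic series
    series.append(predict_next(series[-1], 0.6, 200.0, 1.3) * rng.lognormal(0, 0.15))
series = np.array(series)

fit = minimize(rmse, x0=[0.5, 150.0, 1.0], args=(series,), method="Nelder-Mead")
n_obs, n_par = len(series) - 1, 3
aic = n_obs * np.log(fit.fun ** 2) + 2 * n_par    # AIC from the error variance
print("r, K, theta =", fit.x, " AIC =", round(aic, 2))
```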

Relevance: 100.00%

Abstract:

Our method is demonstrated by displaying a time series consisting of 100 years of daily precipitation amounts, which posed a particular challenge, as precipitation data show large deviations. By now, mankind has changed its environment to such an extent that it has a significant effect on other species as well. The Lepidoptera data series of the National Plant Protection and Forestry Light Trap Network can be used to demonstrate this. This network has national coverage, a large number of collected Lepidoptera, and long data series spanning several years. To obtain information from these data, an easy-to-manage database needs to be set up. Furthermore, it is important to represent our data and our results in an easily analysable and expressive way. In this article the setting up of the database is introduced, together with a three-dimensional visualization method that depicts long-term and seasonal changes together.
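
An illustrative sketch of the kind of three-dimensional display described above: a long daily series is folded into a year-by-day-of-year grid so that seasonal structure and long-term change are visible together. The data are synthetic, not the precipitation or light-trap records used in the article.

```python
# Fold a synthetic 100-year daily series into (year, day-of-year) and plot a 3-D surface.
import numpy as np
import matplotlib.pyplot as plt

years, days = 100, 365
rng = np.random.default_rng(3)
day = np.arange(days)
seasonal = np.maximum(0, np.sin((day - 60) * 2 * np.pi / 365))   # summer peak
trend = np.linspace(1.0, 1.5, years)[:, None]                    # slow long-term increase
values = trend * seasonal[None, :] * rng.gamma(2.0, 2.0, (years, days))

x, y = np.meshgrid(day, np.arange(years))
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x, y, values, cmap="viridis", linewidth=0)
ax.set_xlabel("day of year")
ax.set_ylabel("year")
ax.set_zlabel("daily value")
plt.show()
```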

Relevance: 100.00%

Abstract:

Hospitalization can be a very stressful experience, especially for children. With the use of technology, Intranet communication can help provide the interaction these individuals lack in order to achieve a positive adjustment to the hospital setting. The purpose of this exploratory pilot project is to examine the use of networking chronically ill, hospitalized children with other hospitalized chronically ill children through Intranet communication. A target population of chronically ill hospitalized children, in at least Piaget's concrete operational stage, was asked to use the Intranet system to network with other chronically ill hospitalized children during their hospital stay, for one month or until discharge. The length of time of usage was recorded on a log sheet, and questionnaires were filled out at the end of the study. Statistical analysis was utilized to determine frequency of network usage, duration, demographics, and the impact on hospitalization. Results indicated that Intranet communication between chronically ill hospitalized children was used by participants aged 7-15 and had a positive impact on their hospitalization.

Relevance: 100.00%

Abstract:

Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major traffic incidents. However, it is not always beneficial to divert traffic when an incident occurs. Route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research attempts to apply Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent of delay reduction from route diversion to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate simulated data that were used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. A combination of different levels of incident duration, capacity lost, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion was simulated to represent different incident scenarios. The resulting percent of delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent of delay reduction as a function of all of the simulated input and output variables. The results show that both the calibrated ANN and SVR models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with a relatively high accuracy in terms of mean square error (MSE) and regression correlation. It was also found that the performance of the ANN model was superior to that of the SVR model. Likewise, when the models were applied to a new location, only the ANN model could produce comparatively good delay reduction predictions under high network congestion level.
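
A hedged sketch of the modelling step described above: an ANN (multi-layer perceptron) and an SVR are fitted to scenario features and compared by mean square error. The features and the delay-reduction target are synthetic; the dissertation generates them with DYNASMART-P simulations.

```python
# Fit ANN and SVR regressors to synthetic incident-scenario features and compare test MSE.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)
n = 500
X = np.column_stack([
    rng.uniform(15, 120, n),      # incident duration (min)
    rng.uniform(0.1, 1.0, n),     # fraction of capacity lost
    rng.uniform(0.0, 0.6, n),     # fraction of drivers diverted
    rng.uniform(0.2, 1.0, n),     # network congestion level
])
# Invented response surface: diversion helps more for long, severe incidents.
y = 100 * X[:, 2] * X[:, 0] / 120 * X[:, 1] * (1 - 0.5 * X[:, 3]) + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "ANN": make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(32, 16),
                                                        max_iter=2000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "test MSE:", round(mean_squared_error(y_te, model.predict(X_te)), 2))
```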

Relevance: 100.00%

Abstract:

In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. IEEE 802.11 series Wireless Local Area Network (WLAN) standards attract a lot of attention due to their low cost and high data rate. Wireless ad hoc networks which use IEEE 802.11 standards are one of the hot spots of recent network research. Designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for wireless ad hoc networks.

Existing wireless applications typically use omni-directional antennas. When using an omni-directional antenna, the gain of the antenna is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of the IEEE 802.11 standards, only one of the one-hop neighbors can send data at a time. Nodes other than the sender and the receiver must be either in the idle or the listening state, otherwise collisions could occur. The downside of the omni-directionality of antennas is that the spatial reuse ratio is low and the capacity of the network is considerably limited.

Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna has the following benefits. It can improve transport capacity by decreasing the interference of a directional main lobe. It can increase coverage range due to a higher SINR (signal-to-interference-plus-noise ratio), i.e., with the same power consumption, better connectivity can be achieved. And the usage of power can be reduced, i.e., for the same coverage, a transmitter can reduce its power consumption.

To utilize the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and a pipelined data transmission can be achieved by using directional antennas. The throughput can be improved significantly by introducing the relay-enabled MAC protocol.

Besides these strong points, directional antennas also have some explicit drawbacks, such as the hidden terminal and deafness problems and the requirement of maintaining location information for each node. Therefore, an omni-directional antenna should be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas, i.e., given a network topology and a traffic pattern, we need to find a trade-off between using omni-directional and using directional antennas to obtain better network performance for this configuration.

Directly and mathematically establishing the relationship between the network performance and the antenna configurations is extremely difficult, if not intractable. Therefore, in this research, we propose several clustering-based methods to obtain approximate solutions for the heterogeneous antenna configuration problem, which can improve network performance significantly.

Our proposed methods consist of two steps. The first step (i.e., clustering links) is to cluster the links into different groups based on the matrix-based system model. After being clustered, the links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (i.e., labeling links) is to decide the type of antenna for each group. For heterogeneous antennas, some groups of links will use directional antennas and others will adopt omni-directional antennas. Experiments are conducted to compare the proposed methods with existing methods. Experimental results demonstrate that our clustering-based methods can improve network performance significantly.
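
A much-simplified sketch of the two-step procedure described above: links are clustered by the similarity of their neighbourhoods (step 1), and each cluster is labelled with one antenna type (step 2). The topology, the neighbourhood features and the labelling rule are all invented for illustration; the dissertation uses a matrix-based system model and evaluates configurations by simulated network performance.

```python
# Cluster links by neighbourhood similarity, then label each group with an antenna type.
import numpy as np
from sklearn.cluster import KMeans

nodes = 8
rng = np.random.default_rng(5)
links = [(i, j) for i in range(nodes) for j in range(i + 1, nodes) if rng.random() < 0.4]

# Adjacency used to describe each link by the nodes neighbouring its two endpoints.
adj = np.zeros((nodes, nodes), dtype=float)
for i, j in links:
    adj[i, j] = adj[j, i] = 1.0

features = np.array([np.clip(adj[i] + adj[j], 0, 1) for i, j in links])  # neighbourhood vector

k = 3
groups = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)  # step 1

# Step 2 (toy rule): crowded groups go directional to cut interference, sparse ones stay omni.
crowding = [features[groups == g].sum(axis=1).mean() for g in range(k)]
labels = ["directional" if c > np.median(crowding) else "omni" for c in crowding]
for (i, j), g in zip(links, groups):
    print(f"link {i}-{j}: group {g} -> {labels[g]} antenna")
```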