990 results for Transport-protocol selection
Abstract:
To exploit the popularity of TCP, still the dominant protocol of choice for transporting data reliably across the heterogeneous Internet, this thesis explores end-to-end performance issues and behaviours of TCP senders when transferring data to wireless end-users. The focus throughout is on end-users located within IEEE 802.11 WLANs at the edges of the Internet, a largely untapped area of work. To serve researchers wanting to study the performance of TCP accurately over heterogeneous conditions, this thesis proposes a flexible wired-to-wireless experimental testbed that better reflects real-world conditions. To examine the interplay between TCP in the wired domain and the IEEE 802.11 WLAN protocols, this thesis proposes a more accurate methodology for gauging the transmission and error characteristics of real-world 802.11 WLANs, and correlates the findings with the behaviour of fixed TCP senders. Because Linux is a popular operating system for many of the Internet's data servers, this thesis also studies and evaluates various sender-side TCP congestion control implementations within the recent Linux v2.6. A selection of these implementations is tested systematically under real-world wired-to-wireless conditions in order to screen and present viable candidates for further development and use in the modern-day heterogeneous Internet. Overall, this thesis comprises a set of systematic evaluations of TCP senders over 802.11 WLANs, incorporating measurements in the form of simulations, emulations, and a realistic experimental testbed. The goal of the work is to investigate all relevant aspects comprehensively in order to establish rules that can help decide under which circumstances the deployment of TCP is optimal, i.e. a set of paradigms for advancing the state of the art in data transport across the Internet.
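As context for the congestion-control evaluation above: since the v2.6 series, Linux has exposed its sender-side congestion control implementations as pluggable modules selectable per socket. The sketch below is illustrative only, not the thesis's testbed code; it assumes a Linux host and Python 3.6+, where the `TCP_CONGESTION` socket option is available.

```python
# Minimal sketch (Linux, Python 3.6+): selecting a sender-side TCP
# congestion control implementation per socket, via the pluggable
# framework introduced in Linux v2.6. The algorithm name must appear
# in the kernel's available/allowed list.
import socket

def connect_with_cc(host: str, port: int, algorithm: str = "cubic") -> socket.socket:
    """Open a TCP connection using the named congestion control module."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # TCP_CONGESTION is Linux-specific; raises OSError if the module is
    # not loaded, or not permitted for unprivileged users.
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_CONGESTION, algorithm.encode())
    s.connect((host, port))
    return s

if __name__ == "__main__":
    with open("/proc/sys/net/ipv4/tcp_available_congestion_control") as f:
        print("Available:", f.read().strip())
```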
Abstract:
Mouse embryonic stem (ES) cells have the potential to differentiate into insulin-producing cells, but efficient protocols for in vitro differentiation have not been established. Here we have developed a new optimized four-stage differentiation protocol and compared this with an established reference protocol. The new protocol minimized differentiation towards neuronal progeny, resulting in a population of insulin-producing cells with β-cell characteristics but lacking neuronal features. The yield of glucagon and somatostatin cells was negligible. Crucial for this improved yield was the removal of a nestin selection step as well as removal of culture supplements that promote differentiation towards the neuronal lineage. Supplementation of the differentiation medium with insulin and fetal calf serum was beneficial for differentiation towards monohormonal insulin-positive cells. After implantation into diabetic mice these insulin-producing cells produced a time-dependent improvement of the diabetic metabolic state, in contrast to cells differentiated according to the reference protocol. Using a spinner culture instead of an adherent culture of ES cells prevented the differentiation towards insulin-producing cells. Thus, prevention of cell attachment in a spinner culture represents a means to keep ES cells in an undifferentiated state and to inhibit differentiation. In conclusion, this study describes a new optimized four-stage protocol for differentiating ES cells to insulin-producing cells with minimal neuronal cell formation. Copyright © 2008 Cognizant Comm. Corp.
Abstract:
The multiple-input multiple-output (MIMO) technique can be used to improve the performance of ad hoc networks. Various medium access control (MAC) protocols with multiple contention slots have been proposed to exploit spatial multiplexing for increasing the transport throughput of MIMO ad hoc networks. However, the existence of multiple request-to-send/clear-to-send (RTS/CTS) contention slots represents a severe overhead that limits the improvement in transport throughput achieved by spatial multiplexing. In addition, when the number of contention slots is fixed, the efficiency of RTS/CTS contention is affected by the transmitting power of network nodes. In this study, a joint optimisation scheme over both transmitting power and the number of contention slots for maximising the transport throughput is presented. This includes the establishment of an analytical model of a simplified MAC protocol with multiple contention slots, the derivation of transport throughput as a function of both transmitting power and the number of contention slots, and an optimisation process based on the transport throughput formula derived. The analytical results obtained, verified by simulation, show that much higher transport throughput can be achieved using the proposed joint optimisation scheme, compared with the non-optimised cases and the results previously reported.
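The abstract derives transport throughput as a function of transmitting power and the number of contention slots and then optimises both jointly. The paper's closed-form expression is not reproduced here, so the sketch below substitutes a hypothetical placeholder throughput model that only mimics the stated trade-offs; the joint search over the two variables is the technique being illustrated.

```python
# Hedged sketch of the joint optimisation step: maximise transport
# throughput T(P, m) over transmit power P and contention-slot count m.
# `throughput` below is a hypothetical placeholder, NOT the paper's
# derived formula: more slots buy multiplexing gain but add RTS/CTS
# overhead; more power extends reach but hurts spatial reuse.
import numpy as np

def throughput(p_tx: float, m_slots: int) -> float:
    gain = np.log2(1.0 + p_tx) * min(m_slots, 4)   # multiplexing gain
    overhead = 1.0 - 0.05 * m_slots                # contention overhead
    reuse_loss = 1.0 / (1.0 + 0.3 * p_tx)          # spatial reuse loss
    return max(0.0, gain * overhead * reuse_loss)

def joint_optimise(powers, slot_counts):
    """Exhaustive search over the (power, slots) grid."""
    best = max(((p, m) for p in powers for m in slot_counts),
               key=lambda pm: throughput(*pm))
    return best, throughput(*best)

(p_opt, m_opt), t_opt = joint_optimise(np.linspace(0.1, 10, 100), range(1, 9))
print(f"optimal power={p_opt:.2f}, slots={m_opt}, throughput={t_opt:.3f}")
```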
Abstract:
In recent years, the Internet has grown exponentially and become more complex, and this increased complexity potentially introduces more network-level instability. For any end-to-end Internet connection, however, maintaining throughput and reliability at a certain level is very important, because both directly affect the connection's normal operation. A challenging research task, therefore, is to improve a network connection's performance by optimizing its throughput and reliability. This dissertation proposed an efficient and reliable transport layer protocol called concurrent TCP (cTCP), an extension of the current TCP protocol, to optimize end-to-end connection throughput and enhance end-to-end fault tolerance. The proposed cTCP protocol can aggregate the bandwidth of multiple paths by supporting concurrent data transfer (CDT) on a single connection, where concurrent data transfer is defined as the concurrent transfer of data from local hosts to foreign hosts via two or more end-to-end paths. An RTT-based CDT mechanism, which uses a path's round-trip time (RTT) to optimize CDT performance, was developed for the proposed cTCP protocol. This mechanism primarily comprises an RTT-based load distribution and path management scheme, used to optimize connection throughput and reliability, together with an RTT-based congestion control and retransmission policy. According to experimental results, this RTT-based CDT mechanism achieved good CDT performance under different network conditions. Finally, a CWND-based CDT mechanism, which uses a path's congestion window (CWND) to optimize CDT performance, was introduced. This mechanism primarily comprises: a CWND-based load allocation scheme, which assigns data to paths based on their CWND to achieve aggregate bandwidth; CWND-based path management, used to optimize connection fault tolerance; and a congestion control and retransmission management policy, which handles each path separately in a manner similar to regular TCP. According to the corresponding experimental results, this mechanism achieved near-optimal CDT performance under different network conditions.
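The abstract does not give the exact load-distribution formulas, so the following sketch assumes two standard heuristics consistent with its description: weighting each path inversely to its RTT, and proportionally to its congestion window. Both are assumptions for illustration, not cTCP's published scheme.

```python
# Sketch of multipath load distribution in the spirit of cTCP's
# RTT-based and CWND-based schemes. Weighting inversely by RTT and
# proportionally to CWND are assumed heuristics, not the exact formulas.
from dataclasses import dataclass

@dataclass
class Path:
    name: str
    rtt_ms: float   # smoothed round-trip time
    cwnd: int       # congestion window, in segments

def rtt_based_shares(paths):
    """Distribute load inversely proportional to each path's RTT."""
    inv = {p.name: 1.0 / p.rtt_ms for p in paths}
    total = sum(inv.values())
    return {name: w / total for name, w in inv.items()}

def cwnd_based_shares(paths):
    """Distribute load proportional to each path's congestion window."""
    total = sum(p.cwnd for p in paths)
    return {p.name: p.cwnd / total for p in paths}

paths = [Path("eth0", rtt_ms=20.0, cwnd=40), Path("wlan0", rtt_ms=80.0, cwnd=10)]
print(rtt_based_shares(paths))   # the faster path gets the larger share
print(cwnd_based_shares(paths))
```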
Abstract:
Awareness of mercury (Hg) contamination in aquatic environments around the world has increased over the past decade, mostly due to its ability to concentrate in the biota. Because the presence and distribution of Hg in aquatic systems depend on many factors (e.g., pe, pH, salinity, temperature, organic and inorganic ligands, sorbents, etc.), it is crucial to understand its fate and transport in the presence of complexing constituents and natural sorbents under these different factors. An improved understanding of the subject will support the selection of monitoring, remediation, and restoration technologies. The coupling of equilibrium chemical reactions with transport processes in the model PHREEQC offers an advantage in simulating and predicting the fate and transport of aqueous chemical species of interest. Thus, a great variety of reactive transport problems can be addressed in aquatic systems with boundary conditions of specific interest. Nevertheless, PHREEQC lacks a comprehensive thermodynamic database for Hg. Therefore, in order to use PHREEQC to address the fate and transport of Hg in aquatic environments, it is necessary to expand its thermodynamic database, confirm it, and then evaluate it in applications where potential exists for its calibration and continued validation. The objectives of this study were twofold: 1) to develop, expand, and confirm the Hg database of the hydrogeochemical model PHREEQC, to enhance its capability to simulate the fate of Hg species in the presence of complexing constituents and natural sorbents under different conditions of pH, redox, salinity, and temperature; and 2) to apply and evaluate the new database in flow and transport scenarios at two field test beds, the Oak Ridge Reservation, Oak Ridge, TN, and Everglades National Park, FL, where Hg is present and of much concern. Overall, this research enhanced the capability of the PHREEQC model to simulate the coupling of Hg reactions with transport conditions. It also demonstrated the model's usefulness when applied to field situations.
Abstract:
The GloboLakes project, a global observatory of lake responses to environmental change, aims to exploit current satellite missions and long remote-sensing archives to study multiple lake ecosystems synoptically, assess their current condition, reconstruct past trends in system trajectories, and assess lake sensitivity to multiple drivers of change. Here we describe the protocol for selecting lakes for the global observatory based upon remote-sensing techniques, starting from an initial pool of the 3721 largest lakes and reservoirs in the world, as listed in the Global Lakes and Wetlands Database. An 18-year-long archive of satellite data was used to create spatial and temporal filters for the identification of waterbodies that are appropriate for remote-sensing methods. Further criteria were applied and tested to ensure the candidate sites span a wide range of ecological settings and characteristics; a total of 960 lakes, lagoons, and reservoirs were selected. The methodology proposed here is applicable to new-generation satellites, such as the European Space Agency Sentinel series.
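A minimal sketch of the two-stage filtering idea described above, assuming hypothetical field names and thresholds (the actual GloboLakes spatial and temporal criteria are not reproduced in the abstract):

```python
# Illustrative two-stage filter: a spatial test (is the waterbody large
# enough to resolve in satellite imagery?) and a temporal test (does the
# 18-year archive hold enough valid observations?). Thresholds and field
# names are hypothetical, not the GloboLakes criteria.
def select_candidates(lakes, min_area_km2=10.0, min_valid_obs=100):
    selected = []
    for lake in lakes:
        if lake["area_km2"] < min_area_km2:      # spatial filter
            continue
        if lake["valid_obs"] < min_valid_obs:    # temporal filter
            continue
        selected.append(lake["name"])
    return selected

lakes = [
    {"name": "Lake A", "area_km2": 450.0, "valid_obs": 310},
    {"name": "Lake B", "area_km2": 2.5,   "valid_obs": 500},  # too small
    {"name": "Lake C", "area_km2": 80.0,  "valid_obs": 12},   # too few scenes
]
print(select_candidates(lakes))  # -> ['Lake A']
```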
Abstract:
This study considers a dual-hop cognitive inter-vehicular relay-assisted communication system in which all communication links are non-line-of-sight and their fading is modelled by the double Rayleigh distribution. Road-side relays (or access points) implementing the decode-and-forward relaying protocol are employed, and one of them is selected according to a predetermined policy to enable communication between vehicles. The performance of the considered cognitive cooperative system is investigated for Kth-best partial and full relay selection (RS), as well as for two distinct fading scenarios. In the first scenario, all channels are double Rayleigh distributed. In the second scenario, only the secondary source-to-relay and relay-to-destination channels are subject to double Rayleigh fading, whereas the channels between the secondary transmitters and the primary user are modelled by the Rayleigh distribution. Exact and approximate expressions for the outage probability are presented for all considered RS policies and fading scenarios. In addition to the analytical results, complementary performance evaluation results have been obtained by means of Monte Carlo simulations. The perfect match between these two sets of results verifies the accuracy of the proposed mathematical analysis.
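For illustration, a Monte Carlo estimate of outage probability under double Rayleigh fading with Kth-best partial relay selection can be sketched as follows. The channel amplitude is modelled as the product of two independent Rayleigh variates; the cognitive interference-power constraint analysed in the paper is omitted here for brevity, and all parameters are illustrative.

```python
# Monte Carlo sketch: outage probability of decode-and-forward relaying
# with Kth-best PARTIAL relay selection over double Rayleigh fading
# (amplitude = product of two independent Rayleigh variates). The
# paper's cognitive interference constraint is omitted for brevity.
import numpy as np

rng = np.random.default_rng(7)

def double_rayleigh(shape, scale=1.0):
    """Amplitude distributed as the product of two Rayleigh variates."""
    return rng.rayleigh(scale, shape) * rng.rayleigh(scale, shape)

def outage_prob(n_relays=5, k=1, snr_db=15.0, rate=1.0, trials=200_000):
    snr = 10 ** (snr_db / 10)
    threshold = 2 ** rate - 1                         # SNR outage threshold
    h1 = double_rayleigh((trials, n_relays))          # source -> relays
    h2 = double_rayleigh((trials, n_relays))          # relays -> destination
    g1, g2 = snr * h1**2, snr * h2**2
    # Partial selection: rank relays on first-hop SNR only, pick Kth best.
    order = np.argsort(g1, axis=1)                    # ascending
    pick = order[:, -k]
    rows = np.arange(trials)
    e2e = np.minimum(g1[rows, pick], g2[rows, pick])  # DF bottleneck SNR
    return np.mean(e2e < threshold)

for k in (1, 2, 3):
    print(f"K={k}: outage ~ {outage_prob(k=k):.4f}")
```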
Abstract:
The application of membrane separation processes (PSM) to the treatment of radioactive waste requires the selection of a suitable membrane, as the membrane will be directly exposed to the radioactive liquid waste and to ionizing radiation. Nanofiltration membranes are the most suitable for treatment of radioactive waste, since they have high rejection of multivalent ions. Membranes are usually made of polymers, and depending on the composition of the waste and the type and dose of radiation absorbed, there may be changes in the structure of the membrane, resulting in loss of its transport properties. We tested two commercial nanofiltration membranes, NF and SW from Dow/Filmtec. The liquid waste used was obtained from the process of converting uranium hexafluoride gas to solid uranium dioxide, and is known as "carbonated water". The membranes were characterized in terms of their transport properties (hydraulic permeability, permeate flux, and salt rejection) before and after immersion in the waste for 24 hours. The surface of the membranes was also evaluated by SEM and FTIR. It was observed that in both membranes the porosity of the selective layer was altered, but not the membrane surface charge, which is responsible for the selectivity of the membrane. The NF and SW membranes showed uranium ion rejections of 64% and 55%, respectively.
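For reference, rejection figures such as the 64% and 55% quoted above are conventionally reported as the observed rejection; the following is the standard definition, not a formula quoted from the paper:

```latex
% Standard observed-rejection definition: C_p is the solute
% concentration in the permeate and C_f the concentration in the feed.
% R = 0.64 means the permeate carries 36% of the feed's uranium
% concentration.
\[
  R = 1 - \frac{C_p}{C_f}
\]
```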
Abstract:
Software engineering best practices allow significant improvement of software development. However, implementing best practices requires skilled professionals, financial investment, and technical support to facilitate implementation and achieve the corresponding improvement. In this paper we propose a protocol for designing techniques to implement software engineering best practices. The protocol includes the identification and selection of the processes to improve, the study of standards and models, and the identification of the best practices associated with each process and of possible implementation techniques. In addition, technique design activities are defined in order to create or adapt techniques for implementing best practices in software development.
Abstract:
The Internet has grown in size at rapid rates since BGP records began, and continues to do so. This has raised concerns about the scalability of the current BGP routing system, as the routing state at each router in a shortest-path routing protocol grows at a supra-linear rate as the network grows. The concerns are that the memory capacity of routers will not be able to keep up with demand, and that the growth of the Internet will become ever more cramped as more and more of the world seeks the benefits of being connected. Compact routing schemes, where the routing state grows only sub-linearly relative to the growth of the network, could solve this problem and ensure that router memory is not a bottleneck to Internet growth. These schemes trade away shortest-path routing for scalable memory state, by allowing some paths a certain amount of bounded "stretch". The most promising such scheme is Cowen Routing, which can provide scalable, compact routing state for Internet routing while still providing shortest-path routes to nearly all other nodes, with only slightly stretched paths to a very small subset of the network. Currently, there is no fully distributed form of Cowen Routing that would be practical for the Internet. This dissertation describes a fully distributed and compact protocol for Cowen Routing, using the k-core graph decomposition. Previous compact routing work showed that the k-core graph decomposition is useful for Cowen Routing on the Internet, but no distributed form existed. This dissertation gives a distributed k-core algorithm optimised to be efficient on dynamic graphs, along with proofs of its correctness. The performance and efficiency of this distributed k-core algorithm are evaluated on large Internet AS graphs, with excellent results. The dissertation then goes on to describe a fully distributed and compact Cowen Routing protocol, comprising a landmark selection process that uses the k-core algorithm, with mechanisms to ensure compact state at all times, including at bootstrap; a local cluster routing process, with mechanisms for policy application and control of cluster sizes, again ensuring that state remains compact at all times; and a landmark routing process with a prioritisation mechanism for announcements that ensures compact state at all times.
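The dissertation's distributed, dynamic k-core algorithm is not reproduced in the abstract; the minimal centralised sketch below only illustrates what the k-core decomposition computes, namely each node's core number, via the classic iterative-peeling formulation.

```python
# Minimal centralised sketch of k-core decomposition (iterative peeling).
# The dissertation contributes a *distributed* algorithm for dynamic
# graphs; this sequential version only shows what a core number is: the
# largest k such that the node survives in the subgraph where every
# remaining node has degree >= k.
def core_numbers(adj):
    """adj: dict mapping node -> set of neighbours (undirected graph)."""
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        # Peel every node whose residual degree has fallen to k or below.
        peel = [v for v in remaining if degree[v] <= k]
        if not peel:
            k += 1
            continue
        for v in peel:
            core[v] = k
            remaining.discard(v)
            for u in adj[v]:
                if u in remaining:
                    degree[u] -= 1
    return core

# Toy AS-like graph: a triangle (2-core) with a two-node pendant chain.
g = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
print(core_numbers(g))  # {5: 1, 4: 1, 1: 2, 2: 2, 3: 2}
```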
Abstract:
Background: Post-discharge mortality is a frequent but poorly recognized contributor to child mortality in resource-limited countries. The identification of children at high risk of post-discharge mortality is a critically important first step in addressing this problem. Objectives: The objective of this project was to determine the variables most likely to be associated with post-discharge mortality, for inclusion in a prediction modelling study. Methods: A two-round modified Delphi process was completed for the review of a priori selected variables and the selection of new variables. Variables were evaluated for relevance according to (1) prediction, (2) availability, (3) cost, and (4) time required for measurement. Participants included experts in a variety of relevant fields. Results: During the first round of the modified Delphi process, 23 experts evaluated 17 variables. Forty further variables were suggested and were reviewed during the second round by 12 experts; during the second round, 16 additional variables were evaluated. Thirty unique variables were compiled for use in the prediction modelling study. Conclusion: A systematic approach was utilized to generate an optimal list of candidate predictor variables for incorporation into a study on prediction of pediatric post-discharge mortality in a resource-poor setting.
Abstract:
The goal of Vehicle Routing Problems (VRP) and their variations is to transport a set of orders with the minimum number of vehicles at the lowest cost. Most approaches are designed to solve specific problem variations independently, whereas in real-world applications different constraints must be handled concurrently. This research extends solutions obtained for the traveling salesman problem with time windows to a much wider class of route planning problems in logistics. The work describes a novel approach that supports a heterogeneous fleet of vehicles, dynamically reduces the number of vehicles, respects individual capacity restrictions, satisfies pickup and delivery constraints, and takes Hamiltonian paths (rather than cycles). The proposed approach uses Monte-Carlo Tree Search, and in particular Nested Rollout Policy Adaptation. For the evaluation of the work, real data from industry was obtained and tested, and the results are reported.
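As a pointer to the search scheme named above, the skeleton below follows the structure of Nested Rollout Policy Adaptation (Rosin, 2011) on a toy move-sequence problem. The VRP-specific state, legal moves, and scoring used in the research are not given in the abstract, so the stubs here are hypothetical.

```python
# Skeleton of Nested Rollout Policy Adaptation (NRPA): nested search
# levels, Gibbs-sampled rollouts, and policy adaptation toward the best
# sequence found. Problem-specific parts are stubbed with a toy example.
import math, random

def rollout(policy, legal_moves, score):
    """Play one episode, sampling moves with weights exp(policy[m])."""
    seq, state = [], []
    while True:
        moves = legal_moves(state)
        if not moves:
            return score(state), seq
        weights = [math.exp(policy.get(m, 0.0)) for m in moves]
        m = random.choices(moves, weights)[0]
        seq.append(m)
        state = state + [m]

def adapt(policy, seq, legal_moves, alpha=1.0):
    """Shift probability mass toward the moves of the best sequence."""
    pol, state = dict(policy), []
    for chosen in seq:
        moves = legal_moves(state)
        z = sum(math.exp(policy.get(m, 0.0)) for m in moves)
        for m in moves:
            pol[m] = pol.get(m, 0.0) - alpha * math.exp(policy.get(m, 0.0)) / z
        pol[chosen] = pol.get(chosen, 0.0) + alpha
        state = state + [chosen]
    return pol

def nrpa(level, policy, legal_moves, score, iterations=50):
    if level == 0:
        return rollout(policy, legal_moves, score)
    best_score, best_seq = float("-inf"), None
    for _ in range(iterations):
        s, seq = nrpa(level - 1, policy, legal_moves, score, iterations)
        if s > best_score:
            best_score, best_seq = s, seq
        policy = adapt(policy, best_seq, legal_moves)
    return best_score, best_seq

# Toy usage: order four "stops" to maximise a made-up score.
STOPS = ["a", "b", "c", "d"]
def legal_moves(state): return [s for s in STOPS if s not in state]
def score(state): return -sum(abs(ord(x) - ord(y)) for x, y in zip(state, state[1:]))
print(nrpa(2, {}, legal_moves, score))
```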
Abstract:
Part 18: Optimization in Collaborative Networks