972 results for 291700 Communications Technologies


Relevance: 20.00%

Publisher:

Abstract:

The use of reproductive and genetic technologies can increase the efficiency of selective breeding programs for aquaculture species. Four technologies are considered, namely: marker-assisted selection, DNA fingerprinting, in-vitro fertilization, and cryopreservation. Marker-assisted selection can result in greater genetic gain than conventional selection methods, particularly for traits that are difficult or expensive to measure, but its application is currently limited by the lack of high-density linkage maps and by the high cost of genotyping. DNA fingerprinting is most useful for genetic tagging and parentage verification. Both in-vitro fertilization and cryopreservation techniques can increase the accuracy of selection while controlling the accumulation of inbreeding in long-term selection programs. Currently, the cost associated with the utilization of reproductive and genetic techniques is possibly the most important factor limiting their use in genetic improvement programs for aquatic species.

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop "Technologies and Methodologies for the Detection of Harmful Algae and their Toxins" convened in St. Petersburg, Florida, October 22-24, 2008 and was co-sponsored by ACT (http://act-us.info); the Cooperative Institute for Coastal and Estuarine Environmental Technology (CICEET, http://ciceet.unh.edu); and the Florida Fish and Wildlife Conservation Commission (FWC, http://www.myfwc.com). Participants from various sectors, including researchers, coastal decision makers, and technology vendors, collaborated to exchange information and build consensus. They focused on the status of currently available detection technologies and methodologies for harmful algae (HA) and their toxins, provided direction for developing operational use of existing technology, and addressed requirements for future technology developments in this area. Harmful algal blooms (HABs) in marine and freshwater systems are increasingly common worldwide and are known to cause extensive ecological, economic, and human health problems. In US waters, HABs are encountered in a growing number of locations and are also increasing in duration and severity. This expansion in HABs has led to elevated incidences of poisonous seafood, toxin-contaminated drinking water, mortality of fish and other animals dependent upon aquatic resources (including protected species), public health and economic impacts in coastal and lakeside communities, losses to aquaculture enterprises, and long-term aquatic ecosystem changes. This meeting represented the fourth ACT-sponsored workshop that has addressed technology developments for improved monitoring of water-borne pathogens and HA species in some form. A primary motivation was to assess the need and community support for an ACT-led Performance Demonstration of Harmful Algae Detection Technologies and Methodologies in order to facilitate their integration into regional ocean observing systems operations.
The workshop focused on the identification of region-specific monitoring needs and available technologies and methodologies for detection/quantification of harmful algal species and their toxins along the US marine and freshwater coasts. To address this critical environmental issue, several technologies and methodologies have been, or are being, developed to detect and quantify various harmful algae and their associated toxins in coastal marine and freshwater environments. There are many challenges to nationwide adoption of HAB detection as part of a core monitoring infrastructure: the geographic uniqueness of primary algal species of concern around the country, the variety of HAB impacts, and the need for a clear vision of the operational requirements for monitoring the various species. Nonetheless, the consensus of the workshop participants was that ACT should support the development of HA detection technology performance demonstrations, but that these would need to be tuned regionally to algal species and toxins of concern in order to promote the adoption of state-of-the-art technologies into HAB monitoring networks. [PDF contains 36 pages]

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop on "Wave Sensor Technologies" in St. Petersburg, Florida on March 7-9, 2007, hosted by the University of South Florida (USF) College of Marine Science, an ACT partner institution. The primary objectives of this workshop were to: 1) define the present state of wave measurement technologies, 2) identify the major impediments to their advancement, and 3) make strategic recommendations for future development and on the necessary steps to integrate wave measurement sensors into operational coastal ocean observing systems. The participants were from various sectors, including research scientists, technology developers and industry providers, and technology users, such as operational coastal managers and coastal decision makers. Waves are consistently ranked as a critical variable for numerous coastal issues, from maritime transportation to beach erosion to habitat restoration. For the purposes of this workshop, the participants focused on measuring "wind waves" (i.e., waves on the water surface, generated by the wind, restored by gravity, and with periods between approximately 3 and 30 seconds), although it was recognized that a wide range of both forced and free waves exist on and in the oceans. Also, whereas the workshop put emphasis on the nearshore coastal component of wave measurements, the participants also stressed the importance of open-ocean surface wave measurements. Wave sensor technologies that are presently available for both environments include bottom-mounted pressure gauges, surface-following buoys, wave staffs, acoustic Doppler current profilers, and shore-based remote sensing radar instruments. One of the recurring themes of workshop discussions was the dichotomous nature of wave data users. The two separate groups, open ocean wave data users and the nearshore/coastal wave data users, have different requirements.
Generally, the user requirements increase both in spatial/temporal resolution and precision as one moves closer to shore. Most ocean-going mariners are adequately satisfied with measurements of wave period, height, and general direction. However, most coastal and nearshore users require at least the first five Fourier parameters ("First 5"): wave energy and the first four directional Fourier coefficients. Furthermore, wave research scientists would like sensors capable of providing measurements beyond the first four Fourier coefficients. It was debated whether or not high-precision wave observations in one location can take the place of a less precise measurement at a different location. This could be accomplished by advancing wave models and using wave models to extend data to nearby areas. However, the consensus was that models are no substitute for in situ wave data. [PDF contains 26 pages]

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop entitled "Biological Platforms as Sensor Technologies and their Use as Indicators for the Marine Environment" was held in Seward, Alaska, September 19-21, 2007. The workshop was co-hosted by the University of Alaska Fairbanks (UAF) and the Alaska SeaLife Center (ASLC). The workshop was attended by 25 participants representing a wide range of research scientists, managers, and manufacturers who develop and deploy sensory equipment using aquatic vertebrates as the mode of transport. Eight recommendations were made by participants at the conclusion of the workshop and are presented here without prioritization: 1. Encourage research toward development of energy scavenging devices of suitable sizes for use in remote sensing packages attached to marine animals. 2. Encourage funding sources for development of new sensor technologies and animal-borne tags. 3. Develop animal-borne environmental sensor platforms that offer more combined systems and improved data recovery methodologies, and expand the geographic scope of complementary fixed sensor arrays. 4. Engage the oceanographic community by: a. Offering a mini workshop at an AGU ocean sciences conference for people interested in developing an ocean carbon program that utilizes animal-borne sensor technology. b. Outreach to chemical oceanographers. 5. Mine and merge technologies from other disciplines that may be applied to marine sensors (e.g., the biomedical field). 6. Encourage the NOAA Permitting Office to: a. Make a more predictable, reliable, and consistent permitting system for using animal platforms. b. Establish an evaluation process. c. Adhere to established standards. 7. Promote the expanded use of calibrated hydrophones as part of existing animal platforms. 8.
Encourage the Integrated Ocean Observing System (IOOS) to promote the use of tagged animals as effective samplers of the marine environment and as ocean sensor technology platforms. [PDF contains 20 pages]

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop on Evaluating Approaches and Technologies for Monitoring Organic Contaminants in the Aquatic Environment in Ann Arbor, MI on July 21-23, 2006. The primary objectives of this workshop were to: 1) identify the priority management information needs relative to organic contaminant loading; 2) explore the most appropriate approaches to estimating mass loading; and 3) evaluate the current status of the sensor technology. To meet these objectives, a mixture of leading research scientists, resource managers, and industry representatives was brought together for a focused two-day workshop. The workshop featured four plenary talks followed by breakout sessions in which groups of participants were charged to respond to a series of focused discussion questions. At present, there are major concerns about the inadequacies in approaches and technologies for quantifying mass emissions and detection of organic contaminants for protecting municipal water supplies and receiving waters. Managers use estimates of land-based contaminant loadings to rivers, lakes, and oceans to assess relative risk among various contaminant sources, determine compliance with regulatory standards, and define progress in source reduction. However, accurately quantifying contaminant loading remains a major challenge. Loading occurs over a range of hydrologic conditions, requiring measurement technologies that can accommodate a broad range of ambient conditions. In addition, in situ chemical sensors that provide a means for acquiring continuous concentration measurements are still under development, particularly for organic contaminants that typically occur at low concentrations. Better approaches and strategies for estimating contaminant loading, including evaluations of both sampling design and sensor technologies, need to be identified.
The following general recommendations were made in an effort to advance future organic contaminant monitoring: 1. Improve the understanding of material balance in aquatic systems and the relationship between potential surrogate measures (e.g., DOC, chlorophyll, particle size distribution) and target constituents. 2. Develop continuous real-time sensors to be used by managers as screening measures and triggers for more intensive monitoring. 3. Pursue surrogate measures and indicators of organic pollutant contamination, such as CDOM, turbidity, or non-equilibrium partitioning. 4. Develop continuous field-deployable sensors for PCBs, PAHs, pyrethroids, and emerging contaminants of concern and develop strategies that couple sampling approaches with tools that incorporate sensor synergy (i.e., measure appropriate surrogates along with the dissolved organics to allow full mass emission estimation). [PDF contains 20 pages]

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop entitled "Technologies for Measuring Currents in Coastal Environments" was held in Portland, Maine, October 26-28, 2005, with sponsorship by the Gulf of Maine Ocean Observing System (GoMOOS), an ACT partner organization. The primary goals of the event were to summarize recent trends in nearshore research and management applications for current meter technologies, identify how current meters can assist coastal managers to fulfill their regulatory and management objectives, and recommend actions to overcome barriers to use of the technologies. The workshop was attended by 25 participants representing state and federal environmental management agencies, manufacturers of current meter technologies, and researchers from academic institutions and private industry. Common themes that were discussed during the workshop included: 1) advantages and limitations of existing current measuring equipment, 2) reliability and ease of use with each instrument type, 3) data decoding and interpretation procedures, and 4) mechanisms to facilitate better training and guidance to a broad user group. Seven key recommendations, which were ranked in order of importance during the last day of the workshop, are listed below. 1. Forums should be developed to facilitate the exchange of information among users and industry: a) On-line forums that not only provide information on specific instruments and technologies, but also provide an avenue for the exchange of user experiences with various instruments (i.e., problems encountered, cautions, tips, advantages, etc.). (see References for manufacturer websites with links to application and technical forums at end of report) b) Regional training/meetings for operational managers to exchange ideas on methods for measuring currents and evaluating data. c) Organize mini-meetings or tutorial sessions within larger conference venues. 2.
A committee of major stakeholders should be convened to develop common standards (similar to the Institute of Electrical and Electronics Engineers (IEEE) committee) that enable users to switch sensors without losing software or display capabilities. (pdf contains 28 pages)

Abstract:

The Alliance for Coastal Technologies (ACT) Partner University of Michigan convened a workshop on the Applications of Drifting Buoy Technologies for Coastal Watershed and Ecosystem Modeling in Ann Arbor, Michigan on June 5-7, 2005. The objectives of the workshop were to: (1) educate potential users (managers and scientists) about the current capabilities and uses of drifting buoy technologies; (2) provide an opportunity for users (managers and scientists) to experience first hand the deployment and retrieval of various drifting buoys, as well as experience the capabilities of the buoys' technologies; (3) engage manufacturers with scientists and managers in discussions on drifting buoys' capabilities and their requirements to promote further applications of these systems; (4) promote a dialogue about realistic advantages and limitations of current drifting buoy technologies; and (5) develop a set of key recommendations for advancing both the capabilities and uses of drifting buoy technologies for coastal watershed and ecosystem modeling. To achieve these goals, representatives from research, academia, industry, and resource management were invited to participate in this workshop. Attendees obtained "hands on" experience as they participated in the deployment and retrieval of various drifting buoy systems on Big Portage Lake, a 644-acre lake northwest of Ann Arbor. Working groups then convened for discussions on current commercial usages and environmental monitoring approaches, including: user requirements for drifting buoys, the current status of drifting buoy systems and enabling technologies, and the challenges and strategies for bringing new drifting buoys "on-line". The following general recommendations were made: 1) organize a testing program of drifting buoys to market their capabilities to resource managers and users; 2) develop a fact sheet to highlight the utility of drifting buoys; and 3) facilitate technology transfer for advancements in drifter buoys that may be occurring through military funding and development in order to enhance their technical capability for environmental applications. (pdf contains 18 pages)

Abstract:

Technology generation and dissemination are important components of rural transformation programmes. The Nigerian fisheries sub-sector is still hampered by low productivity (especially in aquaculture) and low output (capture fisheries and post-harvest technologies). Research institutions and universities have made efforts to develop improved technologies to address these problems, yet the level of adoption of the technologies remains low. This is due to a combination of factors, among which are faulty agricultural policies, weak institutional frameworks, and an unfavourable socio-economic environment. Niger State plays an important role in production in Nigeria and hosts the only research institute with a mandate in inland fisheries. It is therefore important to know the effectiveness of the various extension approaches used in disseminating the technologies developed and their impact on adopters. Forty fishers were randomly selected in Shiroro L.G.A. of Niger State and interviewed. The study probed into their socio-economic characteristics, traditional practices, extent of awareness and adoption of fisheries technologies, and the effectiveness and impact of the various approaches used by the extension organizations to disseminate the technologies. The results show that the economically active age group of the fishers was in the range of 20-50 years (87.5%). Males (95%) dominate the fisher population. 47.5% of the respondents have an average household size of 6-10, and 57.5% were below primary school in educational attainment. Only 57.5% belonged to cooperative societies, while 90.0% of the fishers have no access to credit other than personal finance. The majority of fish-farmers (60%) operate at homestead level with ponds smaller than 50 square metres, stocked under polyculture at subsistence level, while 67.5% of processors use mud ovens to cure freshly caught fish by smoking.
Disseminated aquaculture technologies had low levels of awareness (5-20%) and adoption (2.5-22.5%). For capture fisheries and post-harvest technologies, awareness levels of 47.5-72.5% and adoption levels of 27.5-50.0% were recorded. Method demonstration (87.5%), result demonstration (75.0%), and field days (47.5%) are the major approaches used by the ADP. Respondents were of the opinion that method demonstration (65%), result demonstration (57.5%), and field days (30.0%) are effective. 62.5% of respondents had enhanced income due to the impact of extension activities.

Abstract:

Smart and mobile environments require seamless connections. However, due to the frequent process of "discovery" and disconnection of mobile devices while data interchange is happening, wireless connections are often interrupted. To minimize this drawback, a protocol that enables easy and fast synchronization is crucial. Bearing this in mind, Bluetooth technology appears to be a suitable solution to maintain such connections due to the discovery and pairing capabilities it provides. Nonetheless, the time and energy spent when several devices are being discovered and used at the same time still need to be managed properly. It is essential that this process of discovery takes as little time and energy as possible. In addition, it is believed that the performance of the communications is not constant as transmission speeds and throughput increase, but this has not been proven formally. Therefore, the purpose of this project is twofold: firstly, to design and build a framework-system capable of performing controlled Bluetooth device discovery, pairing, and communications; secondly, to analyze and test the scalability and performance of the classic Bluetooth standard under different scenarios and with various sensors and devices using the framework developed. To achieve the first goal, a generic Bluetooth platform will be used to control the test conditions and to form a ubiquitous wireless system connected to an Android smartphone. For the latter goal, various stress tests will be carried out to measure the battery consumption rate as well as the quality of the communications between the devices involved.
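The central scaling concern here, discovery time growing as more devices contend to answer an inquiry, can be sketched with a toy Monte Carlo model. Everything in it is an assumption for illustration: the 1.28 s step loosely mirrors typical classic Bluetooth inquiry-scan intervals, and the per-interval response probability `p_hit` is an assumed parameter, not a measured value.

```python
import random

def expected_discovery_time(n_devices, p_hit=0.4, slot_s=1.28,
                            max_s=30.72, trials=2000, seed=1):
    """Monte Carlo estimate of the mean time (seconds) until all devices
    are discovered, assuming each undiscovered device answers an inquiry
    independently with probability p_hit per slot_s interval.
    Parameters are illustrative, not measured Bluetooth values."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        remaining, t = n_devices, 0.0
        while remaining > 0 and t < max_s:
            t += slot_s
            remaining -= sum(1 for _ in range(remaining)
                             if rng.random() < p_hit)
        total += t
    return total / trials

# Mean discovery time grows with the number of contending devices.
for n in (1, 4, 8):
    print(n, round(expected_discovery_time(n), 2))
```

Even this crude model shows why a controlled framework is needed to measure how discovery latency (and hence energy) scales with device count.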

Abstract:

Among the branches of astronomy, radio astronomy is unique in that it spans the largest portion of the electromagnetic spectrum, e.g., from about 10 MHz to 300 GHz. On the other hand, due to scientific priorities as well as technological limitations, radio astronomy receivers have traditionally covered only about an octave bandwidth. This approach of "one specialized receiver for one primary science goal" is, however, not only becoming too expensive for next-generation radio telescopes comprising thousands of small antennas, but also is inadequate to answer some of the scientific questions of today which require simultaneous coverage of very large bandwidths.

This thesis presents significant improvements on the state of the art of two key receiver components in pursuit of decade-bandwidth radio astronomy: 1) reflector feed antennas; 2) low-noise amplifiers on compound-semiconductor technologies. The first part of this thesis introduces the quadruple-ridged flared horn, a flexible, dual linear-polarization reflector feed antenna that achieves 5:1-7:1 frequency bandwidths while maintaining near-constant beamwidth. The horn is unique in that it is the only wideband feed antenna suitable for radio astronomy that: 1) can be designed to have nominal 10 dB beamwidth between 30 and 150 degrees; 2) requires one single-ended 50 Ohm low-noise amplifier per polarization. Design, analysis, and measurements of several quad-ridged horns are presented to demonstrate its feasibility and flexibility.

The second part of the thesis focuses on modeling and measurements of discrete high-electron mobility transistors (HEMTs) and their applications in wideband, extremely low-noise amplifiers. The transistors and microwave monolithic integrated circuit low-noise amplifiers described herein have been fabricated on two state-of-the-art HEMT processes: 1) 35 nm indium phosphide; 2) 70 nm gallium arsenide. DC and microwave performance of transistors from both processes at room and cryogenic temperatures are included, as well as first-reported measurements of detailed noise characterization of the sub-micron HEMTs at both temperatures. Design and measurements of two low-noise amplifiers covering 1-20 and 8-50 GHz fabricated on both processes are also provided, which show that the 1-20 GHz amplifier improves the state of the art in cryogenic noise and bandwidth, while the 8-50 GHz amplifier achieves noise performance only slightly worse than the best published results but does so with nearly a decade bandwidth.
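The "octave versus decade" framing, and the bandwidths quoted for the two amplifiers, reduce to simple ratio arithmetic; a quick sketch (an illustration, not from the thesis):

```python
def bw_ratio(f_lo, f_hi):
    """Bandwidth as the ratio of upper to lower band edge (GHz in, ratio out)."""
    return f_hi / f_lo

def fractional_bw(f_lo, f_hi):
    """Fractional bandwidth: absolute bandwidth over the centre frequency."""
    return 2.0 * (f_hi - f_lo) / (f_hi + f_lo)

# An octave receiver spans a 2:1 ratio; "decade bandwidth" means 10:1.
print(bw_ratio(1.0, 2.0))    # octave: 2:1
print(bw_ratio(1.0, 20.0))   # 1-20 GHz LNA: 20:1, beyond a decade
print(bw_ratio(8.0, 50.0))   # 8-50 GHz LNA: 6.25:1, "nearly a decade"
print(round(fractional_bw(1.0, 20.0), 3))
```

The same arithmetic explains why traditional octave receivers need on the order of four units to cover what one decade-bandwidth receiver spans.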

Abstract:

Signal processing techniques play important roles in the design of digital communication systems. These include information manipulation, transmitter signal processing, channel estimation, channel equalization and receiver signal processing. By interacting with communication theory and system implementing technologies, signal processing specialists develop efficient schemes for various communication problems by wisely exploiting various mathematical tools such as analysis, probability theory, matrix theory, optimization theory, and many others. In recent years, researchers realized that multiple-input multiple-output (MIMO) channel models are applicable to a wide range of different physical communications channels. Using the elegant matrix-vector notations, many MIMO transceiver (including the precoder and equalizer) design problems can be solved by matrix and optimization theory. Furthermore, the researchers showed that the majorization theory and matrix decompositions, such as singular value decomposition (SVD), geometric mean decomposition (GMD) and generalized triangular decomposition (GTD), provide unified frameworks for solving many of the point-to-point MIMO transceiver design problems.

In this thesis, we consider the transceiver design problems for linear time invariant (LTI) flat MIMO channels, linear time-varying narrowband MIMO channels, flat MIMO broadcast channels, and doubly selective scalar channels. Additionally, the channel estimation problem is also considered. The main contributions of this dissertation are the development of new matrix decompositions, and the uses of the matrix decompositions and majorization theory toward the practical transmit-receive scheme designs for transceiver optimization problems. Elegant solutions are obtained, novel transceiver structures are developed, ingenious algorithms are proposed, and performance analyses are derived.

The first part of the thesis focuses on transceiver design with LTI flat MIMO channels. We propose a novel matrix decomposition which decomposes a complex matrix as a product of several sets of semi-unitary matrices and upper triangular matrices in an iterative manner. The complexity of the new decomposition, generalized geometric mean decomposition (GGMD), is always less than or equal to that of geometric mean decomposition (GMD). The optimal GGMD parameters which yield the minimal complexity are derived. Based on the channel state information (CSI) at both the transmitter (CSIT) and receiver (CSIR), GGMD is used to design a butterfly structured decision feedback equalizer (DFE) MIMO transceiver which achieves the minimum average mean square error (MSE) under the total transmit power constraint. A novel iterative receiving detection algorithm for the specific receiver is also proposed. For the application to cyclic prefix (CP) systems in which the SVD of the equivalent channel matrix can be easily computed, the proposed GGMD transceiver has K/log_2(K) times complexity advantage over the GMD transceiver, where K is the number of data symbols per data block and is a power of 2. The performance analysis shows that the GGMD DFE transceiver can convert a MIMO channel into a set of parallel subchannels with the same bias and signal to interference plus noise ratios (SINRs). Hence, the average bit error rate (BER) is automatically minimized without the need for bit allocation. Moreover, the proposed transceiver can achieve the channel capacity simply by applying independent scalar Gaussian codes of the same rate at subchannels.
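The idea of converting a MIMO channel into parallel subchannels can be illustrated with the plain SVD transceiver, a simpler relative of the GMD/GGMD designs above, not the thesis's own construction. A minimal NumPy sketch, assuming CSI at both ends:

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nr = 4, 4
# Random flat complex MIMO channel (Rayleigh-like entries).
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

# SVD transceiver: precode with V at the transmitter, equalize with U^H
# at the receiver; H = U diag(s) V^H.
U, s, Vh = np.linalg.svd(H)
H_eff = U.conj().T @ H @ Vh.conj().T

# The effective channel is diagonal: independent parallel subchannels
# whose gains are the singular values of H.
print(np.round(np.abs(np.diag(H_eff)), 3))
assert np.allclose(H_eff, np.diag(s), atol=1e-10)
```

The SVD yields subchannels with unequal gains, so bit allocation is needed; the GMD family instead produces a triangular effective channel with equal diagonal entries, which a DFE exploits to get equal SINRs without bit allocation, as described above.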

In the second part of the thesis, we focus on MIMO transceiver design for slowly time-varying MIMO channels with zero-forcing or MMSE criterion. Even though the GGMD/GMD DFE transceivers work for slowly time-varying MIMO channels by exploiting the instantaneous CSI at both ends, their performance is by no means optimal since the temporal diversity of the time-varying channels is not exploited. Based on the GTD, we develop space-time GTD (ST-GTD) for the decomposition of linear time-varying flat MIMO channels. Under the assumption that CSIT, CSIR and channel prediction are available, by using the proposed ST-GTD, we develop space-time geometric mean decomposition (ST-GMD) DFE transceivers under the zero-forcing or MMSE criterion. Under perfect channel prediction, the new system minimizes both the average MSE at the detector in each space-time (ST) block (which consists of several coherence blocks), and the average per ST-block BER in the moderate high SNR region. Moreover, the ST-GMD DFE transceiver designed under an MMSE criterion maximizes Gaussian mutual information over the equivalent channel seen by each ST-block. In general, the newly proposed transceivers perform better than the GGMD-based systems since the super-imposed temporal precoder is able to exploit the temporal diversity of time-varying channels. For practical applications, a novel ST-GTD based system which does not require channel prediction but shares the same asymptotic BER performance with the ST-GMD DFE transceiver is also proposed.

The third part of the thesis considers two quality of service (QoS) transceiver design problems for flat MIMO broadcast channels. The first one is the power minimization problem (min-power) with a total bitrate constraint and per-stream BER constraints. The second problem is the rate maximization problem (max-rate) with a total transmit power constraint and per-stream BER constraints. Exploiting a particular class of joint triangularization (JT), we are able to jointly optimize the bit allocation and the broadcast DFE transceiver for the min-power and max-rate problems. The resulting optimal designs are called the minimum power JT broadcast DFE transceiver (MPJT) and maximum rate JT broadcast DFE transceiver (MRJT), respectively. In addition to the optimal designs, two suboptimal designs based on QR decomposition are proposed. They are realizable for arbitrary number of users.

Finally, we investigate the design of a discrete Fourier transform (DFT) modulated filterbank transceiver (DFT-FBT) with LTV scalar channels. For both cases with known LTV channels and unknown wide sense stationary uncorrelated scattering (WSSUS) statistical channels, we show how to optimize the transmitting and receiving prototypes of a DFT-FBT such that the SINR at the receiver is maximized. Also, a novel pilot-aided subspace channel estimation algorithm is proposed for the orthogonal frequency division multiplexing (OFDM) systems with quasi-stationary multi-path Rayleigh fading channels. Using the concept of a difference co-array, the new technique can construct M^2 co-pilots from M physical pilot tones with alternating pilot placement. Subspace methods, such as MUSIC and ESPRIT, can be used to estimate the multipath delays and the number of identifiable paths is up to O(M^2), theoretically. With the delay information, a MMSE estimator for frequency response is derived. It is shown through simulations that the proposed method outperforms the conventional subspace channel estimator when the number of multipaths is greater than or equal to the number of physical pilots minus one.
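The difference co-array idea above, M physical pilots yielding up to M^2 virtual lags, can be sketched directly; the pilot indices below are hypothetical, chosen only for illustration:

```python
import numpy as np

def difference_coarray(pilot_positions):
    """All pairwise differences n_i - n_j of the pilot tone indices.
    M physical pilots yield up to M**2 lags, which is what lets
    subspace methods resolve on the order of M**2 multipath delays."""
    p = np.asarray(pilot_positions)
    return np.unique(p[:, None] - p[None, :])

# Hypothetical pilot placement whose differences cover every lag in -6..6.
pilots = [0, 1, 4, 6]
lags = difference_coarray(pilots)
print(lags)
print(len(pilots) ** 2, len(lags))
```

Here 4 physical pilots generate 13 distinct lags (a contiguous co-array), illustrating why a subspace estimator over the co-array can identify far more paths than pilots.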

Abstract:

The work presented in this thesis revolves around erasure correction coding, as applied to distributed data storage and real-time streaming communications.

First, we examine the problem of allocating a given storage budget over a set of nodes for maximum reliability. The objective is to find an allocation of the budget that maximizes the probability of successful recovery by a data collector accessing a random subset of the nodes. This optimization problem is challenging in general because of its combinatorial nature, despite its simple formulation. We study several variations of the problem, assuming different allocation models and access models, and determine the optimal allocation and the optimal symmetric allocation (in which all nonempty nodes store the same amount of data) for a variety of cases. Although the optimal allocation can have nonintuitive structure and can be difficult to find in general, our results suggest that, as a simple heuristic, reliable storage can be achieved by spreading the budget maximally over all nodes when the budget is large, and spreading it minimally over a few nodes when it is small. Coding would therefore be beneficial in the former case, while uncoded replication would suffice in the latter case.
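The heuristic above, spread the budget maximally when it is large and minimally when it is small, can be illustrated with a toy hypergeometric calculation. The model is assumed for illustration: the collector reads a uniformly random r-subset of the n nodes, storage is MDS-coded so recovery succeeds once a full object's worth of data is collected, and the parameters are hypothetical.

```python
from math import comb, ceil

def recovery_prob(n, r, budget, m):
    """P(successful recovery) for a symmetric allocation spreading
    `budget` (in units of the object size) evenly over m of n nodes,
    when a collector reads a uniformly random r-subset of nodes.
    Success requires accessing at least ceil(m/budget) nonempty nodes."""
    need = ceil(m / budget)
    total = comb(n, r)
    return sum(comb(m, k) * comb(n - m, r - k)
               for k in range(need, min(m, r) + 1)
               if 0 <= r - k <= n - m) / total

n, r = 10, 5
# Large budget: spreading maximally over all nodes is reliable.
print(recovery_prob(n, r, budget=3, m=10))    # 1.0
# Small budget: spreading thinly fails; concentrating on one node
# (uncoded replication) does much better.
print(recovery_prob(n, r, budget=1.5, m=10))  # 0.0
print(recovery_prob(n, r, budget=1.5, m=1))   # 0.5
```

With budget 3 spread over all 10 nodes, any 5 accessed nodes suffice; with budget 1.5 spread over all 10, no 5-subset holds a full object, while parking it on a single node recovers whenever that node is read.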

Second, we study how distributed storage allocations affect the recovery delay in a mobile setting. Specifically, two recovery delay optimization problems are considered for a network of mobile storage nodes: the maximization of the probability of successful recovery by a given deadline, and the minimization of the expected recovery delay. We show that the first problem is closely related to the earlier allocation problem, and solve the second problem completely for the case of symmetric allocations. It turns out that the optimal allocations for the two problems can be quite different. In a simulation study, we evaluated the performance of a simple data dissemination and storage protocol for mobile delay-tolerant networks, and observed that the choice of allocation can have a significant impact on the recovery delay under a variety of scenarios.

Third, we consider a real-time streaming system where messages created at regular time intervals at a source are encoded for transmission to a receiver over a packet erasure link; the receiver must subsequently decode each message within a given delay from its creation time. For erasure models containing a limited number of erasures per coding window, per sliding window, and containing erasure bursts whose maximum length is sufficiently short or long, we show that a time-invariant intrasession code asymptotically achieves the maximum message size among all codes that allow decoding under all admissible erasure patterns. For the bursty erasure model, we also show that diagonally interleaved codes derived from specific systematic block codes are asymptotically optimal over all codes in certain cases. We also study an i.i.d. erasure model in which each transmitted packet is erased independently with the same probability; the objective is to maximize the decoding probability for a given message size. We derive an upper bound on the decoding probability for any time-invariant code, and show that the gap between this bound and the performance of a family of time-invariant intrasession codes is small when the message size and packet erasure probability are small. In a simulation study, these codes performed well against a family of random time-invariant convolutional codes under a number of scenarios.

Finally, we consider the joint problems of routing and caching for named data networking. We propose a backpressure-based policy that employs virtual interest packets to make routing and caching decisions. In a packet-level simulation, the proposed policy outperformed a basic protocol that combines shortest-path routing with least-recently-used (LRU) cache replacement.
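The least-recently-used replacement used as the baseline above is standard; a minimal sketch (an illustration, not the simulator used in the thesis):

```python
from collections import OrderedDict

class LRUCache:
    """Fixed-capacity cache that evicts the least-recently-used item."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)         # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the LRU entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" becomes most recently used
cache.put("c", 3)       # capacity exceeded: evicts "b"
print(cache.get("b"))   # None
print(cache.get("a"))   # 1
```

Because every hit refreshes recency, LRU keeps popular content cached, which is exactly the behavior a routing-aware backpressure policy has to beat.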