933 results for Technical intermediaries
Abstract:
This study evaluates the performance of a wide range of aquaculture systems in Bangladesh. It is by far the largest of its kind attempted to date. The purpose of this study was to identify and analyze the most important production systems, rather than to provide a nationally representative overview of the entire aquaculture sector of Bangladesh. As such, the study yields a large amount of new information on production technologies that have never been thoroughly researched before. The study reveals an extremely diverse array of specialized, dynamic and rapidly evolving production technologies, adapted to a variety of market niches and local environmental conditions. This is a testament to the innovativeness of farmers and other value chain actors, who have been the principal drivers of this development in Bangladesh. Data were collected from six geographical hubs between November 2011 and June 2012. The survey questionnaire, which had both open- and closed-format questions, captured technological performance in terms of detailed input and output information, fish management practices, credit and marketing, and social and environmental issues. The study generated insights that enable a better understanding of aquaculture development in Bangladesh.
Abstract:
Over the past 50 years, economic and technological developments have dramatically increased the human contribution to ambient noise in the ocean. The dominant frequencies of most human-made noise in the ocean are in the low-frequency range (defined as sound energy below 1,000 Hz), and low-frequency sound (LFS) may travel great distances in the ocean due to the unique propagation characteristics of the deep ocean (Munk et al. 1989). For example, in the Northern Hemisphere oceans, low-frequency ambient noise levels increased by as much as 10 dB during the period from 1950 to 1975 (Urick 1986; review by NRC 1994). Shipping is the overwhelmingly dominant source of low-frequency human-made noise in the ocean, but other sources of human-made LFS include sounds from oil and gas industrial development and production activities (seismic exploration, construction work, drilling, production platforms) and scientific research (e.g., acoustic tomography and thermography, underwater communication). The SURTASS LFA system is an additional source of human-produced LFS in the ocean, contributing sound energy in the 100-500 Hz band. When considering a document that addresses the potential effects of a low-frequency sound source on the marine environment, it is important to focus upon those species that are the most likely to be affected. Important criteria are: 1) the physics of sound as it relates to biological organisms; 2) the nature of the exposure (i.e., duration, frequency, and intensity); and 3) the geographic region in which the sound source will be operated (which, when considered with the distribution of the organisms, will determine which species will be exposed). The goal in this section of the LFA/EIS is to examine the status, distribution, abundance, reproduction, foraging behavior, vocal behavior, and known impacts of human activity for those species that may be impacted by LFA operations.
To focus our efforts, we have examined species that may be physically affected and are found in the region where the LFA source will be operated. The large-scale geographic location of species in relation to the sound source can be determined from the distribution of each species. However, the physical ability of an organism to be impacted depends upon the nature of the sound source (i.e., explosive, impulsive, or non-impulsive) and the acoustic properties of the medium (i.e., seawater) and the organism. Non-impulsive sound consists of the movement of particles in a medium. Motion is imparted by a vibrating object (the diaphragm of a speaker, vocal cords, etc.). Due to the proximity of the particles in the medium, this motion is transmitted from particle to particle in waves away from the sound source. Because the particle motion is along the same axis as the propagating wave, the waves are longitudinal. Particles move away from and then back towards the vibrating source, creating areas of compression (high pressure) and areas of rarefaction (low pressure). As the motion is transferred from one particle to the next, the sound propagates away from the sound source. Wavelength is the distance from one pressure peak to the next. Frequency is the number of waves passing per unit time (Hz). Sound velocity (not to be confused with particle velocity) is the speed at which sound travels through the medium. Acoustic impedance is loosely equivalent to the resistance of a medium to the passage of sound waves (technically, it is the ratio of acoustic pressure to particle velocity). A high impedance means that acoustic particle velocity is small for a given pressure (and a low impedance the opposite). When a sound wave strikes a boundary between media of different impedances, reflection, refraction, and a transfer of energy can occur. The intensity of the reflection is a function of the intensity of the sound wave and the impedances of the two media.
Two key factors in determining the potential for damage due to a sound source are the intensity of the sound wave and the impedance difference between the two media (impedance mismatch). The bodies of the vast majority of organisms in the ocean (particularly phytoplankton and zooplankton) have acoustic impedance values similar to that of seawater. As a result, the potential for sound damage is low; such organisms are effectively transparent to the sound: it passes through them without transferring damage-causing energy. Due to the considerations above, we have undertaken a detailed analysis of species which met the following criteria: 1) Is the species capable of being physically affected by LFS? Are acoustic impedance mismatches large enough to enable LFS to have a physical effect or to allow the species to sense LFS? 2) Does the proposed SURTASS LFA geographical sphere of acoustic influence overlap the distribution of the species? Species that did not meet the above criteria were excluded from consideration. For example, phytoplankton and zooplankton species lack acoustic impedance mismatches large enough at low frequencies for them to be physically affected by SURTASS LFA. Vertebrates are the organisms that fit these criteria, and we have accordingly focused our analysis of the affected environment on these vertebrate groups in the world’s oceans: fishes, reptiles, seabirds, pinnipeds, cetaceans, mustelids, sirenians (Table 1).
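The impedance-mismatch argument above can be made concrete with the standard intensity reflection coefficient from acoustics, R = ((Z2 - Z1)/(Z2 + Z1))^2. The sketch below uses typical textbook density and sound-speed values chosen only for illustration; they are not figures from the EIS itself:

```python
# Intensity reflection coefficient at a boundary between two media.
# Characteristic impedance Z = density * sound speed (a standard
# acoustics relation); the numeric values below are illustrative.

def impedance(density_kg_m3, sound_speed_m_s):
    """Characteristic acoustic impedance Z = rho * c (rayl)."""
    return density_kg_m3 * sound_speed_m_s

def intensity_reflection(z1, z2):
    """Fraction of incident sound intensity reflected at the boundary."""
    return ((z2 - z1) / (z2 + z1)) ** 2

seawater = impedance(1025, 1500)     # roughly 1.5e6 rayl
soft_tissue = impedance(1050, 1540)  # close to seawater
air = impedance(1.2, 343)            # e.g., a gas-filled swim bladder

# Nearly matched media reflect almost nothing: sound passes through
# the organism without transferring damage-causing energy.
print(f"seawater/tissue: {intensity_reflection(seawater, soft_tissue):.4f}")
# A large mismatch reflects (and can transfer) most of the energy.
print(f"seawater/air:    {intensity_reflection(seawater, air):.4f}")
```

With these illustrative numbers, the seawater/tissue boundary reflects well under 1% of the incident intensity, while the seawater/air boundary reflects nearly all of it, which is why gas-filled structures are the usual locus of concern.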
Abstract:
This research proposes a method for extracting technology intelligence (TI) systematically from a large set of document data. To do this, the internal and external sources in the form of documents, which might be valuable for TI, are first identified. Then the existing techniques and software systems applicable to document analysis are examined. Finally, based on the reviews, a document-mining framework designed for TI is suggested and guidelines for software selection are proposed. The research output is expected to support intelligence operatives in finding suitable techniques and software systems for getting value from document-mining and thus facilitate effective knowledge management. Copyright © 2012 Inderscience Enterprises Ltd.
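As a rough illustration of the kind of technique such a framework might select, the sketch below scores terms with TF-IDF, one common document-mining building block. The tiny corpus and the choice of TF-IDF are assumptions for illustration only, not the paper's prescribed method:

```python
# Minimal TF-IDF term scoring over a toy document set, to surface
# candidate technology-intelligence keywords. Pure stdlib; the corpus
# below is invented for illustration.
import math
from collections import Counter

docs = [
    "patent lithium battery electrodes coating",
    "lithium battery recycling logistics",
    "solid state battery research roadmap",
]

tokenized = [d.split() for d in docs]
df = Counter(t for doc in tokenized for t in set(doc))  # document frequency
n = len(docs)

def tfidf(doc_tokens):
    """Term frequency weighted by inverse document frequency."""
    tf = Counter(doc_tokens)
    return {t: (c / len(doc_tokens)) * math.log(n / df[t]) for t, c in tf.items()}

# Terms that appear everywhere (here, "battery") score zero; terms
# distinctive to one document rank highest.
scores = tfidf(tokenized[0])
top = sorted(scores, key=scores.get, reverse=True)[:3]
print(top)
```

The design point this illustrates is the one the paper makes at a larger scale: value comes from weighting schemes that separate distinctive signal from corpus-wide background, whatever software system is ultimately chosen.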
Abstract:
Understanding how and why changes propagate during engineering design is critical because most products and systems emerge from predecessors and not through clean sheet design. This paper applies change propagation analysis methods and extends prior reasoning through examination of a large data set from industry including 41,500 change requests, spanning 8 years during the design of a complex sensor system. Different methods are used to analyze the data and the results are compared to each other and evaluated in the context of previous findings. In particular the networks of connected parent, child and sibling changes are resolved over time and mapped to 46 subsystem areas. A normalized change propagation index (CPI) is then developed, showing the relative strength of each area on the absorber-multiplier spectrum between -1 and +1. Multipliers send out more changes than they receive and are good candidates for more focused change management. Another interesting finding is the quantitative confirmation of the "ripple" change pattern. Unlike the earlier prediction, however, it was found that the peak of cyclical change activity occurred late in the program driven by systems integration and functional testing. Patterns emerged from the data and offer clear implications for technical change management approaches in system design. Copyright © 2007 by ASME.
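The paper's exact CPI formula is not reproduced in the abstract; one natural normalization that yields the stated -1 to +1 absorber-multiplier scale (an assumption here, not the published definition) is the difference between changes sent and received over their sum:

```python
# Sketch of a normalized change propagation index (CPI). The formula
# below is an illustrative assumption that matches the stated -1..+1
# range: positive for multipliers, negative for absorbers.

def cpi(changes_out, changes_in):
    """(sent - received) / (sent + received), or 0.0 if no changes."""
    total = changes_out + changes_in
    return 0.0 if total == 0 else (changes_out - changes_in) / total

# Invented subsystem-area counts: (change requests sent, received).
areas = {"optics": (120, 30), "harness": (40, 160), "software": (80, 80)}
for name, (out_n, in_n) in areas.items():
    print(f"{name}: CPI = {cpi(out_n, in_n):+.2f}")
```

Under this reading, an area with CPI near +1 sends out far more changes than it receives and is a candidate for focused change management; an area near -1 mostly absorbs changes from elsewhere.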
Abstract:
Accurate knowledge of traffic demands in a communication network enables or enhances a variety of traffic engineering and network management tasks of paramount importance for operational networks. Directly measuring a complete set of these demands is prohibitively expensive because of the huge amounts of data that must be collected and the performance impact that such measurements would impose on the regular behavior of the network. As a consequence, we must rely on statistical techniques to produce estimates of actual traffic demands from partial information. The performance of such techniques is, however, constrained by the partial information they rely on and by the heavy computation they incur, which limits their convergence behavior. In this paper we study a two-step approach for inferring network traffic demands. First, we elaborate and evaluate a modeling approach for generating good starting points to be fed to iterative statistical inference techniques. We call these starting points informed priors, since they are obtained using actual network information such as packet traces and SNMP link counts. Second, we provide a very fast variant of the EM algorithm which extends its computation range, increasing its accuracy and decreasing its dependence on the quality of the starting point. Finally, we evaluate and compare alternative mechanisms for generating starting points and the convergence characteristics of our EM algorithm against a recently proposed Weighted Least Squares approach.
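One widely used way to construct a starting traffic matrix from aggregate per-node byte counts (a plausible stand-in for the paper's informed priors, not its actual construction) is a simple gravity model, in which demand from i to j is proportional to the traffic leaving i times the traffic entering j:

```python
# Gravity-model starting point for traffic matrix inference:
# T[i][j] = out_totals[i] * in_totals[j] / total traffic. The totals
# below are invented; in practice they would come from SNMP counts.

def gravity_prior(out_totals, in_totals):
    grand = sum(in_totals)  # total traffic (equals sum(out_totals))
    return [[o * i / grand for i in in_totals] for o in out_totals]

out_totals = [300.0, 100.0]  # bytes leaving each edge node
in_totals = [250.0, 150.0]   # bytes entering each edge node
prior = gravity_prior(out_totals, in_totals)
# Row sums reproduce out_totals and column sums reproduce in_totals,
# so the prior is consistent with the measured aggregates before an
# iterative method such as EM refines it.
print(prior)
```

The useful property for an iterative estimator is shown in the comment: the prior already satisfies the aggregate constraints, so the statistical technique starts from a feasible, informed point rather than from noise.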
Abstract:
With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly significant and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
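A toy sketch may help convey the basis-token idea. The class, token names, and versioning scheme below are illustrative inventions, not the BTC wire format defined in the paper:

```python
# Toy client cache using per-resource (token, version) annotations:
# the server tags each response with the tokens it depends on, and the
# cache evicts any entry whose tokens are older than the newest
# version observed for that token, even if the newer response came
# from a different intermediary cache.

class TokenCache:
    def __init__(self):
        self.entries = {}  # url -> (body, {token: version})
        self.latest = {}   # token -> newest version seen so far

    def store(self, url, body, tokens):
        self._observe(tokens)
        self.entries[url] = (body, tokens)

    def _observe(self, tokens):
        for tok, ver in tokens.items():
            self.latest[tok] = max(self.latest.get(tok, ver), ver)
        # Drop cached entries made obsolete by what we just saw.
        stale = [u for u, (_, toks) in self.entries.items()
                 if any(v < self.latest.get(t, v) for t, v in toks.items())]
        for u in stale:
            del self.entries[u]

    def get(self, url):
        hit = self.entries.get(url)
        return hit[0] if hit else None

cache = TokenCache()
cache.store("/page", "<old>", {"article:7": 1})
# A later response carries a newer version of the same token, so the
# cached "/page" is recognized as obsolete without any per-client
# state at the server.
cache.store("/other", "<x>", {"article:7": 2})
print(cache.get("/page"))  # None: the stale entry was evicted
```

The point illustrated is the one the abstract makes: staleness is detected purely from information carried in responses, so the client stays self-consistent regardless of which intermediaries the responses traversed.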
Abstract:
There has been considerable work done in the study of Web reference streams: sequences of requests for Web objects. In particular, many studies have looked at the locality properties of such streams, because of the impact of locality on the design and performance of caching and prefetching systems. However, a general framework for understanding why reference streams exhibit given locality properties has not yet emerged. In this work we take a first step in this direction, based on viewing the Web as a set of reference streams that are transformed by Web components (clients, servers, and intermediaries). We propose a graph-based framework for describing this collection of streams and components. We identify three basic stream transformations that occur at nodes of the graph: aggregation, disaggregation and filtering, and we show how these transformations can be used to abstract the effects of different Web components on their associated reference streams. This view allows a structured approach to the analysis of why reference streams show given properties at different points in the Web. Applying this approach to the study of locality requires good metrics for locality. These metrics must meet three criteria: 1) they must accurately capture temporal locality; 2) they must be independent of trace artifacts such as trace length; and 3) they must not involve manual procedures or model-based assumptions. We describe two metrics meeting these criteria that each capture a different kind of temporal locality in reference streams. The popularity component of temporal locality is captured by entropy, while the correlation component is captured by interreference coefficient of variation. We argue that these metrics are more natural and more useful than previously proposed metrics for temporal locality. We use this framework to analyze a diverse set of Web reference traces. 
We find that this framework can shed light on how and why locality properties vary across different locations in the Web topology. For example, we find that filtering and aggregation have opposing effects on the popularity component of the temporal locality, which helps to explain why multilevel caching can be effective in the Web. Furthermore, we find that all transformations tend to diminish the correlation component of temporal locality, which has implications for the utility of different cache replacement policies at different points in the Web.
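The two metrics can be sketched under their common definitions (assumed here to match the paper's usage): Shannon entropy of the object popularity distribution for the popularity component, and the coefficient of variation of per-object interreference gaps for the correlation component:

```python
# Two temporal-locality metrics for a web reference stream, computed
# over a tiny invented trace. Lower entropy means more concentrated
# popularity; the interreference coefficient of variation (CV)
# captures how bursty repeat references to the same object are.
import math
from collections import Counter

stream = ["a", "b", "a", "a", "c", "b", "a", "c"]

def popularity_entropy(refs):
    """Shannon entropy (bits) of the object popularity distribution."""
    counts = Counter(refs)
    n = len(refs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def interreference_cv(refs):
    """Std dev over mean of gaps between successive refs to an object."""
    gaps, last = [], {}
    for i, obj in enumerate(refs):
        if obj in last:
            gaps.append(i - last[obj])
        last[obj] = i
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return math.sqrt(var) / mean

print(popularity_entropy(stream))
print(interreference_cv(stream))
```

Both quantities depend only on the stream itself, consistent with the criteria above: no trace-length tuning, no manual procedure, and no model-based assumptions.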
Abstract:
This thesis examines the ways in which Otherworldly women acted as intermediaries between the Otherworld and the mortal world in early Irish literature. First, it establishes the position of women in early Ireland so that appropriate comparisons can be made between mortal and Otherworld women throughout the thesis. It also defines what is meant by the ‘Otherworld’ and its relevance to the early Irish. It then goes on to discuss the differing goals of various intermediaries in early Irish texts, and in what manner they interact with mortals. It briefly looks at how Otherworld male intermediaries are treated differently in the literature, and why early authors might have used women in these roles as often as they did.