908 results for throughput
Abstract:
This paper describes the spectral design and manufacture of the narrow bandpass filters and 6-18 µm broadband antireflection coatings for the 21-channel NASA EOS-AURA High Resolution Dynamics Limb Sounder (HIRDLS). A method of combining the measured spectral characteristics of each filter and antireflection coating, together with the spectral response of the other optical elements in the instrument, to obtain a predicted system throughput response is presented. The design methods used to define the filter and coating spectral requirements, the choice of filter materials, the multilayer designs and the deposition techniques are discussed.
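A minimal sketch of the combination step this abstract describes, assuming each element of the optical train is characterised by a measured transmittance resampled onto a common wavelength grid (every curve and value below is illustrative, not HIRDLS data):

```python
import numpy as np

# Illustrative wavelength grid (in µm) spanning the 6-18 µm HIRDLS range.
wavelengths = np.linspace(6.0, 18.0, 601)

# Hypothetical measured transmittances for each element of the optical
# train, all resampled onto the common grid (placeholder shapes only).
bandpass_filter = np.exp(-0.5 * ((wavelengths - 12.0) / 0.15) ** 4)   # narrow bandpass
ar_coating      = np.full_like(wavelengths, 0.98)                     # broadband AR coating
lens_material   = np.full_like(wavelengths, 0.92)                     # bulk lens transmission
detector_rsr    = np.clip(1.0 - 0.02 * (wavelengths - 6.0), 0.0, 1.0) # relative response

# Predicted system spectral throughput: the product of every spectral
# element in the optical train, as the combination method describes.
system_throughput = bandpass_filter * ar_coating * lens_material * detector_rsr

print(f"peak throughput: {system_throughput.max():.3f} "
      f"at {wavelengths[system_throughput.argmax()]:.2f} um")
```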
Abstract:
Dense deployments of wireless local area networks (WLANs) are becoming the norm in many cities around the world. However, increased interference and traffic demands can severely limit the aggregate throughput achievable unless an effective channel assignment scheme is used. In this work, a simple and effective distributed channel assignment (DCA) scheme is proposed. It is shown that, in order to maximise throughput, each access point (AP) should simply choose the channel with the minimum number of active neighbour nodes (i.e. nodes associated with neighbouring APs that have packets to send). However, applying such a scheme in practice depends critically on the ability to estimate the number of neighbour nodes in each channel, for which no practical estimator had previously been proposed. In view of this, an extended Kalman filter (EKF) estimator and a per-AP estimate of the number of nodes are proposed. These not only provide fast and accurate estimates but can also exploit the channel switching information of neighbouring APs. Extensive packet-level simulation results show that the proposed minimum neighbour and EKF estimator (MINEK) scheme is highly scalable and can provide significant throughput improvement over other channel assignment schemes.
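A minimal sketch of the selection rule as stated above, assuming each AP already holds per-channel estimates of the number of active neighbour nodes (channel numbers and values are illustrative; in practice the estimates would come from the EKF estimator):

```python
import random

def choose_channel(active_neighbour_estimates):
    """Return the channel with the fewest estimated active neighbour
    nodes, breaking ties at random. This is the minimum-neighbour
    selection rule as described in the abstract."""
    best = min(active_neighbour_estimates.values())
    candidates = [ch for ch, n in active_neighbour_estimates.items() if n == best]
    return random.choice(candidates)

# Hypothetical per-channel estimates held by one AP on three channels.
estimates = {1: 7.2, 6: 3.9, 11: 5.4}
print(choose_channel(estimates))  # -> 6
```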
Abstract:
A spectral performance model, designed to simulate the system spectral throughput for each of the 21 channels in the HIRDLS radiometer, is described. This model uses the measured spectral characteristics of each of the components in the optical train, appropriately corrected for their optical environment, to determine the end-to-end spectral throughput profile for each channel. This profile is then combined with the predicted thermal emission from the atmosphere, arising from the height of interest, to establish an in-band (wanted) to out-of-band (unwanted) radiance ratio. The results from the use of the model demonstrate that the instrument-level radiometric requirements will be achieved. The optical arrangement and spectral design requirements for filtering in the HIRDLS instrument are described, together with a presentation of the performance achieved for the complete set of manufactured filters. Compliance of the predicted passband throughput model with the spectral positioning requirements of the instrument is also demonstrated.
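A hedged sketch of the radiance-ratio step described above: the end-to-end throughput profile is weighted by a Planck-law estimate of the atmospheric emission and integrated inside and outside an assumed passband (the profile shape, temperature and band limits are illustrative, not HIRDLS values):

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wl_um, temp_k):
    """Black-body spectral radiance (arbitrary scale) at wl_um microns."""
    wl = wl_um * 1e-6
    return (2 * H * C**2 / wl**5) / np.expm1(H * C / (wl * K * temp_k))

wavelengths = np.linspace(6.0, 18.0, 1201)
# Hypothetical end-to-end channel throughput centred at 12 um
# (in practice this is the measured-component product).
throughput = np.exp(-0.5 * ((wavelengths - 12.0) / 0.15) ** 4)

radiance = planck(wavelengths, 250.0)        # assumed atmospheric temperature
weighted = throughput * radiance
in_band = np.abs(wavelengths - 12.0) <= 0.3  # assumed passband definition

total = np.trapz(weighted, wavelengths)
signal = np.trapz(np.where(in_band, weighted, 0.0), wavelengths)
print(f"in-band to out-of-band radiance ratio ~ {signal / (total - signal):.0f}")
```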
Abstract:
The HIRDLS instrument contains 21 spectral channels spanning a wavelength range from 6 to 18 µm. For each of these channels, the spectral bandwidth and position are isolated by an interference bandpass filter at 301 K placed at an intermediate focal plane of the instrument. A second filter, cooled to 65 K, centred at the same wavelength but designed with a wider bandwidth, is placed directly in front of each cooled detector element to reduce stray radiation from internally reflected in-band signals and to improve the out-of-band blocking. This paper describes the process of determining the spectral requirements for the two bandpass filters and the antireflection coatings used on the lenses and dewar window of the instrument. This process uses a system throughput performance approach, taking the instrument spectral specification as a target. It takes into account the spectral characteristics of the transmissive optical materials, the relative spectral response of the detectors, thermal emission from the instrument, and the predicted atmospheric signal to determine the radiance profile for each channel. Using this design approach, an optimal design for the filters can be achieved, minimising the number of layers to improve the in-band transmission and to aid manufacture. This design method also permits the instrument spectral performance to be verified using the measured response from manufactured components. The spectral calculations for an example channel are discussed, together with the spreadsheet calculation method. All the contributions made by the spectrally active components to the resulting instrument channel throughput are identified and presented.
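A minimal sketch of the warm/cold filter cascade described above, assuming idealised filter shapes with a residual out-of-band transmission floor (all shapes and values are illustrative): multiplying the wider cooled filter into the chain suppresses the out-of-band leakage left by the warm filter alone.

```python
import numpy as np

wavelengths = np.linspace(6.0, 18.0, 1201)

def bandpass(center, half_width, floor):
    """Idealised filter: super-Gaussian passband plus a residual
    out-of-band transmission floor (values illustrative only)."""
    return np.exp(-0.5 * ((wavelengths - center) / half_width) ** 6) + floor

warm = bandpass(12.0, 0.15, 1e-3)  # 301 K narrow filter, intermediate focal plane
cold = bandpass(12.0, 0.40, 1e-3)  # 65 K wider filter at the cooled detector

out_of_band = np.abs(wavelengths - 12.0) > 1.0
print(f"warm filter max leak: {warm[out_of_band].max():.1e}")           # ~1e-3
print(f"cascade max leak    : {(warm * cold)[out_of_band].max():.1e}")  # ~1e-6
```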
Abstract:
This paper analyzes the delay performance of the Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under ideal conditions and in the presence of transmission errors. Relays are nodes capable of supporting high data rates for other low data rate nodes. In an ideal channel, ErDCF achieves higher throughput and reduced energy consumption compared to the IEEE 802.11 Distributed Coordination Function (DCF). This gain is still maintained in the presence of errors. Relays are also expected to reduce the delay. However, the impact of transmission errors on the delay behavior of ErDCF was not known. In this work, we present the impact of transmission errors on delay. It turns out that under transmission errors of sufficient magnitude to increase the number of dropped packets, packet delay is reduced. This is due to the increase in the probability of failure: as a result, the packet drop time increases, reflecting the throughput degradation.
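A small numeric illustration of this effect under a simple truncated-retry model (per-attempt failure probability p, retry limit m; the model and numbers are illustrative, not taken from the ErDCF analysis): as p grows, more packets are dropped, but the packets that are actually delivered experience a bounded number of attempts, so their delay stays modest while throughput degrades.

```python
# Packets are retried up to m times; a packet that fails m+1 attempts
# is dropped. Delivered packets therefore took at most m+1 attempts.
def mean_attempts_of_delivered(p, m):
    attempts = range(1, m + 2)                       # 1 .. m+1 attempts
    probs = [(p ** (k - 1)) * (1 - p) for k in attempts]
    total = sum(probs)                               # = 1 - p**(m+1)
    return sum(k * q for k, q in zip(attempts, probs)) / total

for p in (0.1, 0.4, 0.7):                            # illustrative error rates
    drop = p ** 7                                    # retry limit m = 6
    print(f"p={p}: drop prob={drop:.3f}, "
          f"mean attempts of delivered={mean_attempts_of_delivered(p, 6):.2f}")
```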
Abstract:
A reconfigurable scalar quantiser capable of accepting n-bit input data is presented. The data length n can be varied in the range 1 ... N-1 under partial run-time reconfiguration (p-RTR). Issues such as the improvement in throughput obtained by using this reconfigurable quantiser with p-RTR, rather than full RTR, for data of variable length are considered. The quantiser design, referred to as the priority quantiser (PQ), is then compared against a direct design of the quantiser (DIQ). It is shown that, for practical quantiser sizes, PQ achieves better area usage when both are targeted at the same FPGA. Other benefits are also identified.
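The abstract does not spell out the PQ architecture; purely as a rough illustration, here is one plausible reading of a priority-style scalar quantiser, assuming it resolves the output level in the manner of a priority encoder over ordered thresholds (this is an assumption, not the paper's design):

```python
def priority_quantise(sample, thresholds):
    """Return the index of the first threshold the sample does not
    reach, in the manner of a priority encoder (assumed behaviour;
    thresholds must be in ascending order)."""
    for level, t in enumerate(thresholds):
        if sample < t:
            return level
    return len(thresholds)

# 3-bit input data quantised to 4 levels (illustrative thresholds).
print([priority_quantise(x, [2, 4, 6]) for x in range(8)])
# -> [0, 0, 1, 1, 2, 2, 3, 3]
```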
Abstract:
This paper presents a simple clocking technique to migrate classical synchronous pipelined designs to a synchronous, functionally equivalent alternative system in the context of FPGAs. When the new pipelined design runs at the same throughput as the original design, around a 30% better mW/MHz ratio was observed in Virtex-based FPGA circuits. The evaluation is done using a simple but representative and practical systolic design as an example. The technique is, in essence, a simple replacement of the clocking mechanism for the pipeline storage elements, and no extra design effort is needed. The results show that the proposed technique allows immediate power and area-time savings in existing designs, rather than exploring potential benefits through a new logic design of the problem using the classic pipeline clocking mechanism.
Abstract:
Since 1966, coded orthogonal frequency division multiplexing (COFDM) has been investigated to determine the possibility of reducing the overall throughput of a digitally modulated terrestrial television channel. In the investigations, many assumptions have emerged. One common misconception is that, in a terrestrial environment, COFDM has an inherent immunity to multipath interference. A theoretical analysis of a multipath channel, along with simulation results, has shown that this assumption does not hold when the radio frequency modulation and demodulation are taken into account. This paper presents a background to the inception of COFDM, a mathematical analysis of the digitally modulated television signal under multipath conditions, and the results of a European Digital Video Broadcasting-Terrestrial (DVB-T) compliant simulation model with MPEG-2 bitstreams transmitted under various multipath conditions.
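To make the frequency-selective nature of a multipath channel concrete, here is a minimal sketch (not taken from the paper's model, and with made-up echo parameters): the FFT of a two-ray channel impulse response gives the complex gain seen by each OFDM subcarrier, and some subcarriers fall into deep fades, so OFDM by itself has no inherent immunity to multipath.

```python
import numpy as np

# A two-ray multipath channel: direct path plus one delayed echo.
n_subcarriers = 2048                        # e.g. DVB-T 2K mode
impulse = np.zeros(n_subcarriers, dtype=complex)
impulse[0] = 1.0                            # direct ray
impulse[50] = 0.8 * np.exp(1j * np.pi / 3)  # echo: 0.8 gain, arbitrary phase

# The channel frequency response is the FFT of the impulse response:
# each bin is the gain one OFDM subcarrier experiences.
channel_gain = np.abs(np.fft.fft(impulse))
print(f"deepest fade     : {channel_gain.min():.3f}")
print(f"strongest carrier: {channel_gain.max():.3f}")
```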
Abstract:
Purpose – Computed tomography (CT) for 3D reconstruction entails a huge number of coplanar fan-beam projections for each of a large number of 2D slice images, and excessive radiation intensities and dosages. For some applications its rate of throughput is also inadequate. A technique for overcoming these limitations is outlined.
Design/methodology/approach – A novel method to reconstruct 3D surface models of objects is presented, using, typically, ten 2D projective images. These images are generated by relative motion between the set of objects and a set of ten fan-beam X-ray sources and sensors, with their viewing axes suitably distributed in 2D angular space.
Findings – The method entails a radiation dosage several orders of magnitude lower than CT, and requires far less computational power. Experimental results are given to illustrate the capability of the technique.
Practical implications – The substantially lower cost of the method and, more particularly, its dramatically lower irradiation make it relevant to many applications precluded by current techniques.
Originality/value – The method can be used in many applications such as aircraft hold-luggage screening and 3D industrial modelling and measurement, and it should also have important applications in medical diagnosis and surgery.
Abstract:
R. Benjamin (1995) addressed the application of the “object 3D” X-ray reconstruction technique for electronically “unpacking” suspect items when screening aircraft luggage. However, there is no satisfactory solution to the mass screening of hold luggage. Computed tomography (CT) entails excessive radiation dosages, and its rate of throughput is quite inadequate. A novel variant of “object 3D” is therefore put forward, adapting some of the technology of existing cabin luggage screening systems, but on a substantially larger scale, which does achieve the required throughput at an acceptable radiation dosage and cost.
Abstract:
This chapter considers Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) modulation and demodulation with the intention of optimizing Ultra-Wideband (UWB) system performance. OFDM is a type of multicarrier modulation and is the most important aspect of MB-OFDM system performance. It is also a low-cost digital signal processing component, efficiently using the Fast Fourier Transform (FFT) algorithm to implement the multicarrier orthogonality. Within the MB-OFDM approach, OFDM modulation is employed in each 528 MHz wide band to transmit the data across the different bands, while also using the frequency hopping technique across different bands. Each parallel bit stream can be mapped onto one of the OFDM subcarriers. Quadrature Phase Shift Keying (QPSK) and Dual Carrier Modulation (DCM) are currently used as the modulation schemes for MB-OFDM in the ECMA-368 defined UWB radio platform. A dual QPSK soft-demapper is suitable for ECMA-368 that exploits the inherent Time-Domain Spreading (TDS) and guard symbol subcarrier diversity to improve the receiver performance, yet merges decoding operations together to minimize hardware and power requirements. There are several methods to demap the DCM, namely soft bit demapping, Maximum Likelihood (ML) soft bit demapping, and Log Likelihood Ratio (LLR) demapping. The Channel State Information (CSI) aided scheme, coupled with the band hopping information, is used as a further technique to improve the DCM demapping performance. ECMA-368 offers up to 480 Mb/s instantaneous bit rate to the Medium Access Control (MAC) layer, but depending on radio channel conditions, dropped packets unfortunately result in a lower throughput. An alternative high data rate modulation scheme, termed Dual Circular 32-QAM, fits within the configuration of the current standard and increases system throughput, thus maintaining the high-rate throughput even with a moderate level of dropped packets.
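As a rough illustration of CSI-aided soft demapping with diversity combining, here is a generic textbook QPSK LLR demapper, not the chapter's exact ECMA-368 demapper; the channel gains and noise levels are made up:

```python
import numpy as np

def qpsk_llrs(y, h, noise_var):
    """Max-log LLRs for Gray-mapped QPSK, symbols (±1 ± 1j)/sqrt(2).
    Multiplying by conj(h) applies the channel state information;
    positive LLR favours the bit mapped to the +1 component."""
    z = np.conj(h) * y
    scale = 2.0 * np.sqrt(2.0) / noise_var
    return scale * z.real, scale * z.imag

# Two received copies of the same symbol (as with time-domain
# spreading): summing the per-copy LLRs performs diversity combining.
rng = np.random.default_rng(1)
h1, h2 = 0.9 * np.exp(1j * 0.4), 0.3 * np.exp(-1j * 1.1)  # made-up CSI
tx = (1 + 1j) / np.sqrt(2)                                # transmitted symbol
noise = lambda: rng.normal(0, 0.1) + 1j * rng.normal(0, 0.1)
y1, y2 = h1 * tx + noise(), h2 * tx + noise()

li1, lq1 = qpsk_llrs(y1, h1, 0.02)
li2, lq2 = qpsk_llrs(y2, h2, 0.02)
print(f"combined LLRs: I {li1 + li2:+.1f}, Q {lq1 + lq2:+.1f}")
```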
Abstract:
Aim: A nested case-control discovery study was undertaken to test whether information within the serum peptidome can improve on the utility of CA125 for early ovarian cancer detection. Materials and Methods: High-throughput matrix-assisted laser desorption ionisation mass spectrometry (MALDI-MS) was used to profile 295 serum samples from women pre-dating their ovarian cancer diagnosis and from 585 matched control samples. Classification rules incorporating CA125 and MS peak intensities were tested for discriminating ability. Results: Two peaks were found which, in combination with CA125, discriminated cases from controls up to 15 and 11 months before diagnosis, respectively, and earlier than using CA125 alone. One peak was identified as connective tissue-activating peptide III (CTAPIII), whilst the other was putatively identified as platelet factor 4 (PF4). ELISA data supported the down-regulation of PF4 in early cancer cases. Conclusion: Serum peptide information with CA125 improves the lead time for early detection of ovarian cancer. The candidate markers are platelet-derived chemokines, suggesting a link between platelet function and tumour development.
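The abstract does not give the form of the classification rules; purely as an illustration of combining CA125 with peak intensities, here is a toy logistic rule in which the function name, the coefficients and the input values are all hypothetical:

```python
import math

def combined_score(ca125, peak_ctapiii, peak_pf4, w=(-4.0, 0.035, -1.2, -0.9)):
    """Toy logistic rule combining CA125 (U/ml) with two MS peak
    intensities. Coefficients are made up for illustration; the
    negative PF4 weight reflects the reported down-regulation of
    PF4 in early cancer cases."""
    b0, b1, b2, b3 = w
    z = b0 + b1 * ca125 + b2 * peak_ctapiii + b3 * peak_pf4
    return 1.0 / (1.0 + math.exp(-z))  # in (0, 1): higher = more case-like

print(f"score: {combined_score(ca125=80, peak_ctapiii=1.4, peak_pf4=0.6):.2f}")
```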
Abstract:
Food security is one of this century’s key global challenges. By 2050 the world will require increased crop production in order to feed its predicted 9 billion people. This must be done in the face of changing consumption patterns, the impacts of climate change and the growing scarcity of water and land. Crop production methods will also have to sustain the environment, preserve natural resources and support livelihoods of farmers and rural populations around the world. There is a pressing need for the ‘sustainable intensification’ of global agriculture in which yields are increased without adverse environmental impact and without the cultivation of more land. Addressing the need to secure a food supply for the whole world requires an urgent international effort with a clear sense of long-term challenges and possibilities. Biological science, especially publicly funded science, must play a vital role in the sustainable intensification of food crop production. The UK has a responsibility and the capacity to take a leading role in providing a range of scientific solutions to mitigate potential food shortages. This will require significant funding of cross-disciplinary science for food security. The constraints on food crop production are well understood, but differ widely across regions. The availability of water and good soils are major limiting factors. Significant losses in crop yields occur due to pests, diseases and weed competition. The effects of climate change will further exacerbate the stresses on crop plants, potentially leading to dramatic yield reductions. Maintaining and enhancing the diversity of crop genetic resources is vital to facilitate crop breeding and thereby enhance the resilience of food crop production. Addressing these constraints requires technologies and approaches that are underpinned by good science. Some of these technologies build on existing knowledge, while others are completely radical approaches, drawing on genomics and high-throughput analysis. Novel research methods have the potential to contribute to food crop production through both genetic improvement of crops and new crop and soil management practices. Genetic improvements to crops can occur through breeding or genetic modification to introduce a range of desirable traits. The application of genetic methods has the potential to refine existing crops and provide incremental improvements. These methods also have the potential to introduce radical and highly significant improvements to crops by increasing photosynthetic efficiency, reducing the need for nitrogen or other fertilisers and unlocking some of the unrealised potential of crop genomes. The science of crop management and agricultural practice also needs to be given particular emphasis as part of a food security grand challenge. These approaches can address key constraints in existing crop varieties and can be applied widely. Current approaches to maximising production within agricultural systems are unsustainable; new methodologies that utilise all elements of the agricultural system are needed, including better soil management and enhancement and exploitation of populations of beneficial soil microbes. Agronomy, soil science and agroecology (the relevant sciences) have been neglected in recent years. Past debates about the use of new technologies for agriculture have tended to adopt an either/or approach, emphasising the merits of particular agricultural systems or technological approaches and the downsides of others.
This has been seen most obviously with respect to genetically modified (GM) crops, the use of pesticides and the arguments for and against organic modes of production. These debates have failed to acknowledge that there is no technological panacea for the global challenge of sustainable and secure global food production. There will always be trade-offs and local complexities. This report considers both new crop varieties and appropriate agroecological crop and soil management practices and adopts an inclusive approach. No techniques or technologies should be ruled out. Global agriculture demands a diversity of approaches, specific to crops, localities, cultures and other circumstances. Such diversity demands that the breadth of relevant scientific enquiry is equally diverse, and that science needs to be combined with social, economic and political perspectives. In addition to supporting high-quality science, the UK needs to maintain and build its capacity to innovate, in collaboration with international and national research centres. UK scientists and agronomists have in the past played a leading role in disciplines relevant to agriculture, but training in agricultural sciences and related topics has recently suffered from a lack of policy attention and support. Agricultural extension services, connecting farmers with new innovations, have been similarly neglected in the UK and elsewhere. There is a major need to review the support for and provision of extension services, particularly in developing countries. The governance of innovation for agriculture needs to maximise opportunities for increasing production, while at the same time protecting societies, economies and the environment from negative side effects. Regulatory systems need to improve their assessment of benefits. Horizon scanning will ensure proactive consideration of technological options by governments. Assessment of benefits, risks and uncertainties should be seen broadly, and should include the wider impacts of new technologies and practices on economies and societies. Public and stakeholder dialogue, with NGOs, scientists and farmers in particular, needs to be a part of all governance frameworks.
Abstract:
In basic network transactions, a datagram from source to destination is routed through numerous routers and paths depending on the available free and uncongested paths, which can result in transmission routes that are too long, incurring greater delay, jitter, congestion and reduced throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter. This cell delay variation is due to the queuing delay, which depends on the applied loading conditions. The accumulation of delay and jitter along the nodes of a transmission route, together with dropped packets, adds further complexity to multimedia traffic, because there is no guarantee that each traffic stream will be delivered according to its own jitter constraints; there is therefore a need to analyze the effects of jitter. IP routers enable a single path for the transmission of all packets. On the other hand, Multi-Protocol Label Switching (MPLS) allows the separation of packet forwarding and routing characteristics, enabling packets to use the appropriate routes and also optimizing and controlling the behavior of transmission paths, thus correcting some of the shortfalls associated with IP routing. MPLS has therefore been utilized in the analysis for effective transmission through the various networks. This paper analyzes the effect of delay, congestion, interference, jitter and packet loss in the transmission of signals from source to destination. The impact of link failures and repair paths in the various physical topologies, namely bus, star, mesh and hybrid topologies, is also analyzed under standard network conditions.
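A toy illustration of the jitter-accumulation point above, assuming each hop contributes an independent random queuing delay (the per-hop delay model and values are made up): the spread of the end-to-end delay grows with the number of hops along the route.

```python
import random

random.seed(7)

def end_to_end_delays(hops, packets=2000):
    """Sum a random queuing delay (~2 ms mean, exponential) per hop."""
    return [sum(random.expovariate(1 / 2.0) for _ in range(hops))
            for _ in range(packets)]

for hops in (2, 8, 16):
    d = end_to_end_delays(hops)
    mean = sum(d) / len(d)
    var = sum((x - mean) ** 2 for x in d) / len(d)
    # The standard deviation of the end-to-end delay is one simple
    # proxy for jitter; it grows as the route lengthens.
    print(f"{hops:2d} hops: mean delay {mean:6.1f} ms, jitter (std) {var ** 0.5:5.1f} ms")
```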
Abstract:
This paper proposes a practical approach to the enhancement of Quality of Service (QoS) routing by means of providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and a Label Switched Path (LSP) request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examined several methods, including single, double and multi-repair routes, and the prioritization of signals along the protected paths to improve QoS and throughput and to reduce the cost of protection path placement, delay, congestion and collision. The results indicated that creating multi-repair paths and prioritizing packets reduces delay and increases throughput: the delays at the ingress/egress LSPs were low compared to when the signals had not been classified. The proposed scheme therefore provides a means to improve QoS in path restoration in MPLS using available network resources. Prioritizing the packets in the data plane revealed that the amount of traffic transmitted using medium and low priority Label Switched Paths (LSPs) does not have any impact on the explicit rate of the high priority LSP, eliminating the problem of a knock-on effect.
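As a rough sketch of repair-path placement (plain breadth-first search on a made-up topology; the paper's placement and prioritization logic is considerably richer): establish a working path, then find a repair path that avoids every link of the working path.

```python
from collections import deque

def bfs_path(adj, src, dst, banned=frozenset()):
    """Shortest hop-count path from src to dst, avoiding the links
    in `banned` (checked in both directions)."""
    prev, seen, queue = {}, {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:                      # reconstruct path back to src
            path = [u]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for v in adj[u]:
            if v not in seen and (u, v) not in banned and (v, u) not in banned:
                seen.add(v)
                prev[v] = u
                queue.append(v)
    return None

# Illustrative mesh between an ingress "A" and an egress "F".
adj = {"A": ["B", "C"], "B": ["A", "D", "E"], "C": ["A", "D"],
       "D": ["B", "C", "F"], "E": ["B", "F"], "F": ["D", "E"]}

working = bfs_path(adj, "A", "F")
repair = bfs_path(adj, "A", "F", banned=frozenset(zip(working, working[1:])))
print("working path:", working)  # shortest route for the LSP
print("repair path :", repair)   # avoids every link of the working path
```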