988 results for All-optical packet routing


Relevance: 30.00%

Abstract:

The Internet has grown rapidly in size since BGP records began, and continues to do so. This has raised concerns about the scalability of the current BGP routing system, as the routing state at each router in a shortest-path routing protocol grows at a supra-linear rate as the network grows. The concern is that router memory capacity will not keep pace with demand, and that Internet growth will become ever more constrained as more of the world seeks the benefits of being connected. Compact routing schemes, in which routing state grows only sub-linearly relative to the growth of the network, could solve this problem and ensure that router memory is not a bottleneck to Internet growth. These schemes trade away shortest-path routing for scalable memory state by allowing some paths a bounded amount of “stretch”. The most promising such scheme is Cowen Routing, which can provide scalable, compact routing state for Internet routing while still providing shortest paths to nearly all other nodes, with only slightly stretched paths to a very small subset of the network. Currently, no fully distributed form of Cowen Routing exists that would be practical for the Internet. This dissertation describes a fully distributed and compact protocol for Cowen Routing, using the k-core graph decomposition. Previous compact routing work showed that the k-core graph decomposition is useful for Cowen Routing on the Internet, but no distributed form existed. This dissertation gives a distributed k-core algorithm optimised to be efficient on dynamic graphs, along with proofs of its correctness. The performance and efficiency of this distributed k-core algorithm are evaluated on large Internet AS graphs, with excellent results. This dissertation then describes a fully distributed and compact Cowen Routing protocol.
The protocol comprises a landmark selection process for Cowen Routing using the k-core algorithm, with mechanisms to ensure compact state at all times, including at bootstrap; a local cluster routing process, with mechanisms for policy application and control of cluster sizes, again ensuring that state remains compact at all times; and a landmark routing process with a prioritisation mechanism for announcements that ensures compact state at all times.
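The k-core decomposition underlying the landmark selection can be illustrated with a minimal sequential peeling sketch (the dissertation's contribution is a distributed, dynamic-graph version, which is not reproduced here):

```python
from collections import defaultdict

def core_numbers(edges):
    """Core number of every vertex by repeated peeling: iteratively
    remove the vertex of minimum remaining degree, recording the
    largest minimum degree seen so far as that vertex's core number."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    degree = {v: len(n) for v, n in adj.items()}
    core = {}
    k = 0
    remaining = set(adj)
    while remaining:
        v = min(remaining, key=lambda x: degree[x])
        k = max(k, degree[v])       # core number never decreases
        core[v] = k
        remaining.remove(v)
        for w in adj[v]:
            if w in remaining:
                degree[w] -= 1
    return core
```

For example, in a triangle with one pendant vertex, the triangle vertices have core number 2 and the pendant has core number 1.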

Relevance: 30.00%

Abstract:

This project analysed the characteristics and mode of operation of plastic optical fibres in a WDM (Wavelength Division Multiplexing) optical link operating in the visible spectrum. The active and passive components required for the link were studied, such as LED sources, multiplexers, filters and couplers. The nonlinear effects that can arise in the optical fibre, which are important to consider when transmitting WDM signals, were also analysed. To support the analysis, an optical link was simulated in MATLAB in the frequency domain, using LED sources emitting in the visible spectrum together with WDM multiplexers, absorption filters, couplers and, as the transmission medium, plastic optical fibre (POF).
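The simulation described above was done in MATLAB; as a rough illustration of the frequency-domain approach, the following Python sketch models three hypothetical visible LED channels multiplexed onto a POF with an assumed flat attenuation (all parameter values are illustrative, not taken from the project):

```python
import numpy as np

# Illustrative parameters (not from the project): three visible LEDs
# multiplexed onto a PMMA POF modelled with a flat 0.2 dB/m loss.
wavelengths = np.linspace(400e-9, 700e-9, 1000)      # visible band, m
led_centres = [450e-9, 530e-9, 650e-9]               # blue, green, red
led_fwhm = 30e-9                                     # typical LED linewidth

def led_spectrum(lam, centre, fwhm, peak_mw=1.0):
    """Gaussian approximation of an LED emission spectrum."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return peak_mw * np.exp(-0.5 * ((lam - centre) / sigma) ** 2)

# Multiplexed transmit spectrum, and the spectrum after 50 m of fibre.
tx = sum(led_spectrum(wavelengths, c, led_fwhm) for c in led_centres)
length_m, atten_db_per_m = 50.0, 0.2
rx = tx * 10.0 ** (-atten_db_per_m * length_m / 10.0)

loss_db = 10.0 * np.log10(tx.sum() / rx.sum())       # total link loss
```

With a flat loss model the total link loss is simply length times attenuation (10 dB here); a real POF model would make the attenuation and the nonlinearities wavelength-dependent.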

Relevance: 30.00%

Abstract:

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the period from 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives, acquired via open internet services, and from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi: 10.1594/PANGAEA.854832 (Valente et al., 2015).
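The merging step (averaging observations close in time and space) can be sketched as a simple binning procedure; the window and grid sizes below are illustrative assumptions, and the actual OC-CCI processing is more involved:

```python
from collections import defaultdict
from statistics import mean

def merge_observations(obs, time_window_h=1.0, grid_deg=0.05):
    """Average observations that fall in the same time window and
    spatial grid cell. Each observation is (time_h, lat, lon, value);
    the window and cell sizes are illustrative assumptions."""
    bins = defaultdict(list)
    for t, lat, lon, value in obs:
        key = (round(t / time_window_h),
               round(lat / grid_deg),
               round(lon / grid_deg))
        bins[key].append(value)
    # one averaged value per occupied time/space bin
    return [mean(values) for values in bins.values()]
```

Two nearby chlorophyll readings collapse to their average, while a distant reading survives on its own.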

Relevance: 30.00%

Abstract:

Ocean environmental monitoring and seafloor exploitation need in situ sensors and optical devices (cameras, lights) in various locations and on various carriers, in order to initialise and calibrate environmental models or to supervise underwater industrial processes. For more than 10 years, Ifremer has deployed in situ monitoring systems for various seawater parameters, and in situ observation systems based on lights and HD cameras. To be economically operational, these systems must be equipped with biofouling protection dedicated to the sensors and optical devices used in situ. Indeed, in less than 15 days [1], biofouling will modify the transducing interfaces of the sensors and cause unacceptable bias in the measurements provided by the in situ monitoring system. In the same way, biofouling will degrade the optical properties of windows, altering the lighting and the quality of the images recorded by the camera.

Relevance: 30.00%

Abstract:

In the standard Vehicle Routing Problem (VRP), we route a fleet of vehicles to deliver the demands of all customers such that the total distance traveled by the fleet is minimized. In this dissertation, we study variants of the VRP that minimize the completion time, i.e., we minimize the distance of the longest route. We call it the min-max objective function. In applications such as disaster relief efforts and military operations, the objective is often to finish the delivery or the task as soon as possible, not to plan routes with the minimum total distance. Even in commercial package delivery nowadays, companies are investing in new technologies to speed up delivery instead of focusing merely on the min-sum objective. In this dissertation, we compare the min-max and the standard (min-sum) objective functions in a worst-case analysis to show that the optimal solution with respect to one objective function can be very poor with respect to the other. The results motivate the design of algorithms specifically for the min-max objective. We study variants of min-max VRPs including one problem from the literature (the min-max Multi-Depot VRP) and two new problems (the min-max Split Delivery Multi-Depot VRP with Minimum Service Requirement and the min-max Close-Enough VRP). We develop heuristics to solve these three problems. We compare the results produced by our heuristics to the best-known solutions in the literature and find that our algorithms are effective. In the case where benchmark instances are not available, we generate instances whose near-optimal solutions can be estimated based on geometry. We formulate the Vehicle Routing Problem with Drones and carry out a theoretical analysis to show the maximum benefit from using drones in addition to trucks to reduce delivery time. The speed-up ratio depends on the number of drones loaded onto one truck and the speed of the drone relative to the speed of the truck.
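The two objective functions can be stated concretely; the small instance below is invented for illustration, showing that the plan that is best under min-sum need not be best under min-max:

```python
def route_length(route, dist):
    """Length of one route: depot (index 0) -> stops -> depot."""
    stops = [0] + list(route) + [0]
    return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

def min_sum(routes, dist):
    """Standard VRP objective: total distance travelled by the fleet."""
    return sum(route_length(r, dist) for r in routes)

def min_max(routes, dist):
    """Completion-time objective: distance of the longest route."""
    return max(route_length(r, dist) for r in routes)

# Invented 4-node instance: node 0 is the depot, 1-3 are customers.
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
plan_a = [(1,), (2, 3)]   # total 4 + 27 = 31, longest route 27
plan_b = [(1, 3), (2,)]   # total 16 + 18 = 34, longest route 18
```

Plan A wins under min-sum (31 vs 34) while plan B wins under min-max (18 vs 27), which is the tension the worst-case analysis above formalises.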

Relevance: 30.00%

Abstract:

This thesis presents an investigation of endoscopic optical coherence tomography (OCT). As a noninvasive imaging modality, OCT has emerged as an increasingly important diagnostic tool for many clinical applications. Despite its many merits, such as high resolution and depth resolvability, a major limitation is its relatively shallow penetration depth in tissue (about 2-3 mm), mainly due to tissue scattering and absorption. To overcome this limitation, many different endoscopic OCT systems have been developed. By utilizing a minimally invasive endoscope, the OCT probing beam can be brought into close vicinity of the tissue of interest, bypassing the scattering of intervening tissues, so that it can collect the reflected light signal from the desired depth and provide a clear image of the physiological structure of the region, which traditional OCT cannot reveal. In this thesis, three endoscope designs are studied. While they rely on vastly different principles, they all converge on solving this long-standing problem.

A hand-held endoscope with manual scanning is explored first. When a user holds a hand-held endoscope to examine samples, the movement of the device provides a natural scan. We proposed and implemented an optical tracking system to estimate and record the trajectory of the device. By registering the OCT axial scans with the spatial information obtained from the tracking system, one can use this system to simply ‘paint’ a desired volume and obtain any arbitrary scanning pattern by manually waving the endoscope over the region of interest. The accuracy of the tracking system was measured to be about 10 microns, which is comparable to the lateral resolution of most OCT systems. A targeted phantom sample and biological samples were manually scanned, and the reconstructed images verified the method.

Next, we investigated a mechanical way to steer the beam in an OCT endoscope, termed paired-angle-rotation scanning (PARS). This concept was proposed by my colleague, and we further developed the technology by enhancing the longevity of the device, reducing the diameter of the probe, and shrinking the form factor of the hand-piece. Several families of probes have been designed and fabricated with varying optical performance. They have been applied to different applications, including collector channel examination for glaucoma stent implantation, and vitreous remnant detection during live animal vitrectomy.

Lastly, a novel scanning method with no moving parts was devised. This approach is based on the electro-optic (EO) effect of a KTN crystal. With Ohmic contacts at the electrodes, the KTN crystal can exhibit a special mode of the EO effect, termed the space-charge-controlled electro-optic effect, in which carrier electrons are injected into the material via the Ohmic contact. By applying a high voltage across the material, a linear phase profile can be built up in this mode, which in turn deflects the light beam passing through. We constructed a relay telescope to adapt the KTN deflector into a bench-top OCT scanning system. One of the major technical challenges for this system is the strong chromatic dispersion of the KTN crystal within the wavelength band of the OCT system. We investigated its impact on the acquired OCT images and proposed a new approach to estimate and compensate the actual dispersion. Compared with traditional methods, the new method is more computationally efficient and accurate. Biological samples were scanned with this KTN-based system, and the acquired images demonstrated the feasibility of using it in an endoscopy setting. Overall, my research aims to provide solutions for implementing an OCT endoscope. As the technology evolves from manual, to mechanical, to electrical approaches, different solutions are presented. Since each has its own advantages and disadvantages, one has to determine the actual requirements and select the best fit for a specific application.
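Numerical dispersion compensation of the kind discussed above is commonly done by applying the conjugate of the estimated dispersive phase to the spectral interferogram before the Fourier transform to depth; the sketch below shows this standard approach (the thesis's KTN-specific estimation method is not reproduced, and the coefficients are assumed inputs):

```python
import numpy as np

def compensate_dispersion(spectrum, k, k0, a2, a3):
    """Multiply the spectral interferogram by the conjugate of the
    estimated dispersive phase (2nd- and 3rd-order terms around k0),
    then Fourier transform to depth. The coefficients a2, a3 are
    assumed to be known or estimated elsewhere."""
    phase = a2 * (k - k0) ** 2 + a3 * (k - k0) ** 3
    corrected = spectrum * np.exp(-1j * phase)
    return np.abs(np.fft.fft(corrected))

# Synthetic example: a single reflector whose interferogram is chirped
# by second-order dispersion; compensation restores a sharp axial peak.
n = 1024
k = np.arange(n, dtype=float)
k0, a2 = n / 2, 1e-5
signal = np.exp(1j * (2 * np.pi * 50 * k / n + a2 * (k - k0) ** 2))
blurred = np.abs(np.fft.fft(signal))
sharp = compensate_dispersion(signal, k, k0, a2, 0.0)
```

After compensation the axial point-spread function collapses back to a single sharp peak; with the wrong coefficients it stays broadened, which is the basis of iterative sharpness-based estimation.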

Relevance: 30.00%

Abstract:

Modern data centers host hundreds of thousands of servers to achieve economies of scale. Such a huge number of servers creates challenges for the data center network (DCN) to provide proportionally large bandwidth. In addition, the deployment of virtual machines (VMs) in data centers raises the requirements for efficient resource allocation and fine-grained resource sharing. Further, the large number of servers and switches in the data center consume significant amounts of energy. Even though servers have become more energy-efficient through various energy-saving techniques, the DCN still accounts for 20% to 50% of the energy consumed by the entire data center. The objective of this dissertation is to enhance DCN performance as well as its energy efficiency by conducting optimizations on both the host and network sides. First, as the DCN demands huge bisection bandwidth to interconnect all the servers, we propose a parallel packet switch (PPS) architecture that directly processes variable-length packets without segmentation-and-reassembly (SAR). The proposed PPS achieves large bandwidth by combining the switching capacities of multiple fabrics, and it further improves switch throughput by avoiding the padding bits of SAR. Second, since certain resource demands of a VM are bursty and stochastic in nature, we propose the Max-Min Multidimensional Stochastic Bin Packing (M3SBP) algorithm to satisfy both deterministic and stochastic demands in VM placement. M3SBP calculates an equivalent deterministic value for the stochastic demands, and maximizes the minimum resource utilization ratio of each server. Third, to provide the necessary traffic isolation for VMs that share the same physical network adapter, we propose the Flow-level Bandwidth Provisioning (FBP) algorithm. By reducing the flow scheduling problem to multiple stages of packet queuing problems, FBP guarantees the provisioned bandwidth and delay performance for each flow.
Finally, while DCNs are typically provisioned with full bisection bandwidth, DCN traffic demonstrates fluctuating patterns. We therefore propose a joint host-network optimization scheme to enhance the energy efficiency of DCNs during off-peak traffic hours. The proposed scheme utilizes a unified representation method that converts the VM placement problem into a routing problem, and employs depth-first and best-fit search to find efficient paths for flows.
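The core idea of M3SBP, converting a stochastic demand into an equivalent deterministic value and keeping server utilisation balanced, can be sketched as follows; this one-dimensional greedy version is only illustrative of the idea, not the dissertation's multidimensional algorithm:

```python
def equivalent_demand(mean, std, z=1.645):
    """Equivalent deterministic value of a stochastic demand: the level
    exceeded only with small probability (z = 1.645 for ~5% overflow,
    assuming a normally distributed demand)."""
    return mean + z * std

def place_vms(vms, capacity, n_servers):
    """Greedy sketch of the max-min idea: place each VM (given as
    (mean, std) of its demand) on the feasible server with the lowest
    current load, keeping utilisation balanced across servers."""
    load = [0.0] * n_servers
    placement = []
    for mean, std in vms:
        d = equivalent_demand(mean, std)
        feasible = [s for s in range(n_servers) if load[s] + d <= capacity]
        if not feasible:
            raise ValueError("no feasible server for demand %.2f" % d)
        s = min(feasible, key=lambda i: load[i])
        load[s] += d
        placement.append(s)
    return placement, load
```

Four identical VMs on two servers alternate between them, leaving the minimum utilisation as high as possible.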

Relevance: 30.00%

Abstract:

Opto-acoustic imaging has been a growing field of research in recent years, providing functional imaging of physiological biomarkers such as the oxygenation of haemoglobin. Piezoelectric transducers are the industry-standard detectors for ultrasonics, but their limited bandwidth, susceptibility to electromagnetic interference, and the inverse scaling of their sensitivity with size all limit detector performance. Sensors based on polymer optical fibres (POF) are immune to electromagnetic interference, and have lower acoustic impedance and a lower Young's modulus than silica fibres. Furthermore, POF enables the possibility of a wideband sensor with a size appropriate to endoscopy. Micro-structured POF (mPOF) used in an interferometric detector has been shown to be an order of magnitude more sensitive than silica fibre at 1 MHz and 3 times more sensitive at 10 MHz. We present the first opto-acoustic measurements obtained using a 4.7 mm PMMA mPOF Bragg grating with a fibre diameter of 130 μm, and present the lateral directivity pattern of a PMMA mPOF FBG ultrasound sensor over a frequency range of 1-50 MHz. We discuss the impact of the pattern with respect to the targeted application and draw conclusions on how to mitigate the problems encountered.

Relevance: 30.00%

Abstract:

Purpose: To compare measurements taken using a swept-source optical coherence tomography-based optical biometer (IOLMaster 700) and an optical low-coherence reflectometry biometer (Lenstar 900), and to determine the clinical impact of differences in their measurements on intraocular lens (IOL) power predictions. Methods: Eighty eyes of 80 patients scheduled to undergo cataract surgery were examined with both biometers. The measurements made using each device were axial length (AL), central corneal thickness (CCT), aqueous depth (AQD), lens thickness (LT), mean keratometry (MK), white-to-white distance (WTW), and pupil diameter (PD). Holladay 2 and SRK/T formulas were used to calculate IOL power. Differences in measurements between the two biometers were determined using the paired t-test. Agreement was assessed through intraclass correlation coefficients (ICC) and Bland–Altman plots. Results: Mean patient age was 76.3±6.8 years (range 59–89). Using the Lenstar, AL and PD could not be measured in 12.5% and 5.25% of eyes, respectively, while the IOLMaster 700 obtained all measurements in all eyes. The variables CCT, AQD, LT, and MK varied significantly between the two biometers. According to the ICCs, correlation between measurements made with the two devices was excellent except for WTW and PD. Using the SRK/T formula, IOL power predictions based on the data from the two devices were statistically different, but the differences were not clinically significant. Conclusions: No clinically relevant differences were detected between the biometers in terms of their measurements and IOL power predictions. Using the IOLMaster 700, it was easier to obtain biometric measurements in eyes with less transparent ocular media or longer AL.
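Agreement between two biometers is typically assessed with Bland-Altman limits of agreement, as in the study above; a minimal sketch (with invented readings, not the study's data):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements from
    two devices: mean difference (bias) and 95% limits of agreement
    (bias +/- 1.96 standard deviations of the differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented paired axial-length readings (mm) from two biometers.
device_a = [23.1, 23.5, 24.0, 22.8]
device_b = [23.0, 23.6, 23.9, 22.7]
bias, (lo, hi) = bland_altman(device_a, device_b)
```

A bias close to zero with narrow limits that straddle zero indicates good agreement; a clinically relevant threshold would then be compared against those limits.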

Relevance: 30.00%

Abstract:

Purpose: The purpose of this study was to develop and validate a multivariate predictive model to detect glaucoma by using a combination of retinal nerve fiber layer (RNFL), retinal ganglion cell-inner plexiform layer (GCIPL), and optic disc parameters measured using spectral-domain optical coherence tomography (OCT). Methods: Five hundred eyes from 500 participants and 187 eyes of another 187 participants were included in the study and validation groups, respectively. Patients with glaucoma were classified into five groups based on visual field damage. Sensitivity and specificity of all glaucoma OCT parameters were analyzed. Receiver operating characteristic (ROC) curves and areas under the ROC curve (AUC) were compared. Three predictive multivariate models (quantitative, qualitative, and combined) that used a combination of the best OCT parameters were constructed. A diagnostic calculator was created using the combined multivariate model. Results: The parameters with the best AUCs were: inferior RNFL, average RNFL, vertical cup/disc ratio, minimal GCIPL, and inferior-temporal GCIPL. Comparisons among the parameters did not show that the GCIPL parameters were better than the RNFL parameters in early and advanced glaucoma. The highest AUC was obtained by the combined predictive model (0.937; 95% confidence interval, 0.911–0.957), significantly (P = 0.0001) higher than that of the isolated parameters in early and advanced glaucoma. The validation group displayed results similar to those of the study group. Conclusions: The best GCIPL, RNFL, and optic disc parameters showed a similar ability to detect glaucoma. The combined predictive formula improved glaucoma detection compared to the best isolated parameters. The diagnostic calculator achieved good classification of participants in both the study and validation groups.
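The AUC values compared above can be computed directly from classifier scores via the Mann-Whitney formulation; a minimal sketch with invented scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen diseased eye scores higher than
    a randomly chosen healthy eye (ties count one half)."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented scores from a hypothetical combined model
# (higher = more likely glaucomatous).
glaucoma = [0.9, 0.8, 0.4]
healthy = [0.5, 0.3, 0.2]
```

An AUC of 1.0 means perfect separation and 0.5 means chance; the invented scores above give 8/9, roughly the 0.937 regime reported for the combined model.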

Relevance: 30.00%

Abstract:

This paper describes the optical design of the far-infrared imaging spectrometer for JAXA's SPICA mission. The SAFARI instrument is a cryogenic imaging Fourier transform spectrometer (iFTS) designed to perform background-limited spectroscopic and photometric imaging in the 34-210 μm band. The all-reflective optical system is highly modular and consists of three main modules: an input optics module, an interferometer module (FTS), and camera bay optics. A dedicated study has been made of the spectroscopic performance of the instrument, in which the spectral response and interference of the instrument were modeled as the FTS mechanism scans over the total desired OPD range.
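The basic FTS principle, recovering the spectrum from the interferogram recorded over the OPD scan, can be sketched as follows (simplified: no apodisation or phase correction, and the sampling parameters are illustrative):

```python
import numpy as np

def spectrum_from_interferogram(interferogram, opd_step_cm):
    """Recover the source spectrum from an FTS interferogram sampled at
    equal OPD steps, via a real FFT of the mean-subtracted record.
    Returns (wavenumbers in cm^-1, spectral magnitudes)."""
    n = len(interferogram)
    spec = np.abs(np.fft.rfft(interferogram - np.mean(interferogram)))
    wavenumbers = np.fft.rfftfreq(n, d=opd_step_cm)
    return wavenumbers, spec

# Synthetic check: a monochromatic line at 100 cm^-1 produces a cosine
# interferogram; the FFT peak must land at that wavenumber.
opd = np.arange(1000) * 1e-3              # OPD samples, cm
interferogram = np.cos(2 * np.pi * 100 * opd)
wn, spec = spectrum_from_interferogram(interferogram, 1e-3)
```

The total OPD range sets the spectral resolution (here 1 cm^-1 over a 1 cm scan) and the OPD step sets the maximum recoverable wavenumber, which is why the scan range is a key design driver.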

Relevance: 30.00%

Abstract:

The SpicA FAR-infrared Instrument, SAFARI, is one of the instruments planned for the SPICA mission. The SPICA mission is the next great leap forward in space-based far-infrared astronomy and will study the evolution of galaxies, stars and planetary systems. SPICA will utilize a deeply cooled 2.5 m-class telescope, provided by European industry, to realize zodiacal-background-limited performance and high spatial resolution. The SAFARI instrument is a cryogenic grating-based point-source spectrometer working in the wavelength range 34 to 230 μm, providing a spectral resolving power from 300 to at least 2000. The instrument shall provide low- and high-resolution spectroscopy in four spectral bands. The low-resolution mode is the native instrument mode, while the high-resolution mode is achieved by means of a Martin-Puplett interferometer. The optical system is all-reflective and consists of three main modules: an input optics module, followed by the band- and mode-distributing optics and the grating modules. The instrument utilizes Nyquist-sampled filled linear arrays of very sensitive TES detectors. The work presented in this paper describes the optical design architecture and a design concept compatible with the current instrument performance and volume design drivers.

Relevance: 30.00%

Abstract:

The objective of this thesis is to explore new and improved methods for greater sample introduction efficiency and enhanced analytical performance with inductively coupled plasma optical emission spectrometry (ICP-OES). Three projects are discussed in which the capabilities and applications of ICP-OES are expanded: 1. In the first project, a conventional ultrasonic nebuliser was modified to replace the heater/condenser with an infrared heated pre-evaporation tube. In continuation from previous works with pre-evaporation, the current work investigated the effects of heating with infrared block and rope heaters on two different ICP-OES instruments. Comparisons were made between several methods and setups in which temperatures were varied. By monitoring changes to sensitivity, detection limit, precision, and robustness, and analyzing two certified reference materials, a method with improved sample introduction efficiency and comparable analytical performance to a previous method was established. 2. The second project involved improvements to a previous work in which a multimode sample introduction system (MSIS) was modified by inserting a pre-evaporation tube between the MSIS and torch. The new work focused on applying an infrared heated ceramic rope for pre-evaporation. This research was conducted in all three MSIS modes (nebulisation mode, hydride generation mode, and dual mode) and on two different ICP-OES instruments, and comparisons were made between conventional setups in terms of sensitivity, detection limit, precision, and robustness. By tracking both hydride-forming and non-hydride forming elements, the effects of heating in combination with hydride generation were probed. Finally, optimal methods were validated by analysis of two certified reference materials. 3. A final project was completed in collaboration with ZincNyx Energy Solutions. 
This project sought to develop a method for the overall analysis of a 12 M KOH zincate fuel, which is used in green energy backup systems. By employing various techniques, including flow injection analysis and standard additions, a final procedure was formulated for the verification of K concentration, as well as the measurement of additives (Al, Fe, Mg, In, Si), corrosion products (such as C from CO₃²⁻), and Zn particles both in and filtered from solution. Furthermore, the effects of exposing the potassium zincate electrolyte fuel to air were assessed.
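The method of standard additions mentioned above extrapolates the signal-versus-added-concentration line to its x-intercept; a minimal sketch with invented calibration data:

```python
def standard_additions(added, signals):
    """Method of standard additions: least-squares fit of signal vs.
    added concentration; the analyte concentration in the original
    sample is the x-intercept magnitude (intercept / slope)."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signals) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(added, signals))
    slope /= sum((x - mx) ** 2 for x in added)
    intercept = my - slope * mx
    return intercept / slope

# Invented calibration: true concentration 2 (arbitrary units),
# detector responding at 5 counts per unit concentration.
added = [0.0, 1.0, 2.0, 3.0]
signals = [10.0, 15.0, 20.0, 25.0]
```

Because the additions are made into the sample matrix itself, matrix effects on the slope cancel, which is why the technique suits a concentrated 12 M KOH matrix.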

Relevance: 30.00%

Abstract:

A method is presented for accurate measurement of spectral flux-reflectance (albedo) in a laboratory, for media with long optical path lengths, such as snow and ice. The approach uses an acrylic hemispheric dome, which, when placed over the surface being studied, serves two functions: (i) it creates an overcast “sky” to illuminate the target surface from all directions within a hemisphere, and (ii) serves as a platform for measuring incident and backscattered spectral radiances, which can be integrated to obtain fluxes. The fluxes are relative measurements and because their ratio is used to determine flux-reflectance, no absolute radiometric calibrations are required. The dome and surface must meet minimum size requirements based on the scattering properties of the surface. This technique is suited for media with long photon path lengths since the backscattered illumination is collected over a large enough area to include photons that reemerge from the domain far from their point of entry because of multiple scattering and small absorption. Comparison between field and laboratory albedo of a portable test surface demonstrates the viability of this method.
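The flux-reflectance is the ratio of the backscattered to incident fluxes, each obtained by integrating the measured radiances over the hemisphere; a numerical sketch of that integration (any common radiometric calibration factor cancels in the ratio):

```python
import math

def flux_from_radiance(radiance, n=1000):
    """Integrate a zenith-angle-dependent radiance L(theta) over the
    hemisphere to obtain flux: F = 2*pi * integral over [0, pi/2] of
    L(theta)*cos(theta)*sin(theta) dtheta (midpoint rule; azimuthal
    symmetry assumed for simplicity)."""
    h = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * h
        total += radiance(theta) * math.cos(theta) * math.sin(theta) * h
    return 2 * math.pi * total

def albedo(incident_radiance, backscattered_radiance):
    """Flux-reflectance as the ratio of backscattered to incident flux;
    a shared calibration factor divides out of the ratio."""
    return (flux_from_radiance(backscattered_radiance)
            / flux_from_radiance(incident_radiance))
```

For an isotropic radiance of 1 the flux is pi, and an isotropically reflecting surface returning 80% of the radiance has albedo 0.8, independent of the calibration scale.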

Relevance: 30.00%

Abstract:

Conventional web search engines are centralised: a single entity crawls and indexes the documents selected for future retrieval, and controls the relevance models used to determine which documents are relevant to a given user query. As a result, these search engines suffer from several technical drawbacks, such as handling scale, timeliness and reliability, in addition to ethical concerns such as commercial manipulation and information censorship. To alleviate the need to rely entirely on a single entity, Peer-to-Peer (P2P) Information Retrieval (IR) has been proposed as a solution, as it distributes the functional components of a web search engine, from crawling and indexing documents to query processing, across the network of users (or peers) who use the search engine. This strategy for constructing an IR system poses several efficiency and effectiveness challenges which have been identified in past work. Accordingly, this thesis makes several contributions towards advancing the state of the art in P2P-IR effectiveness by improving the query processing and relevance scoring aspects of P2P web search. Federated search systems are a form of distributed information retrieval in which the user's information need, formulated as a query, is routed to distributed resources and the retrieved result lists are merged into a final list. P2P-IR networks are one form of federated search, routing queries and merging results among participating peers. The query is propagated through disseminated nodes to reach the peers that are most likely to contain relevant documents; the retrieved result lists are then merged at different points along the path from the relevant peers back to the query initiator (the 'customer').
However, query routing is considered one of the major challenges and a critical part of P2P-IR networks, as relevant peers may be lost through low-quality peer selection during query routing, inevitably leading to less effective retrieval results. This motivates this thesis to study and propose query routing techniques to improve retrieval quality in such networks. Cluster-based semi-structured P2P-IR networks exploit the cluster hypothesis to organise the peers into similar semantic clusters, where each semantic cluster is managed by super-peers. In this thesis, I construct three semi-structured P2P-IR models and examine their retrieval effectiveness. I also leverage the cluster centroids at the super-peer level, as content representations gathered from cooperative peers, to propose a query routing approach called the Inverted PeerCluster Index (IPI), which mimics the conventional inverted index of a centralised corpus to organise the statistics of peers' terms. The results show competitive retrieval quality in comparison to baseline approaches. Furthermore, I study the applicability of conventional information retrieval models as peer selection approaches, where each peer can be considered a big document of documents. The experimental evaluation shows comparable, significant results, and indicates that document retrieval methods are very effective for peer selection, reinforcing the analogy between documents and peers. Additionally, Learning to Rank (LtR) algorithms are exploited to build a learned classifier for peer ranking at the super-peer level. The experiments show significant results against state-of-the-art resource selection methods and competitive results against corresponding classification-based approaches. Finally, I propose reputation-based query routing approaches that exploit the idea, from social community networks, of providing feedback on a specific item and retaining it for future decision-making.
The system monitors users' behaviour as they click or download documents from the final ranked list, treats this as implicit feedback, and mines the information to build a reputation-based data structure. The data structure is used to score peers and then rank them for query routing. I conduct a set of experiments covering various scenarios, including noisy feedback information (i.e., positive feedback on non-relevant documents), to examine the robustness of the reputation-based approaches. The empirical evaluation shows significant results on almost all measurement metrics, with an approximate improvement of more than 56% over baseline approaches. Thus, based on these results, if one were to choose a single technique, reputation-based approaches are clearly the natural choice, and they can also be deployed on any P2P network.
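The Inverted PeerCluster Index idea, organising peers' term statistics in an inverted index and routing a query to the best-scoring peers, can be sketched as follows (a deliberately simplified, non-distributed illustration with plain term-frequency scoring):

```python
from collections import defaultdict

def build_peer_index(peer_terms):
    """Inverted index over peers' term statistics: term -> list of
    (peer, term frequency), treating each peer as one 'big document'
    made of the documents it holds."""
    index = defaultdict(list)
    for peer, terms in peer_terms.items():
        counts = defaultdict(int)
        for t in terms:
            counts[t] += 1
        for t, tf in counts.items():
            index[t].append((peer, tf))
    return index

def route_query(index, query, k=2):
    """Score each peer by the summed frequency of the query terms it
    holds, and route the query to the top-k peers (ties broken by
    peer name for determinism)."""
    scores = defaultdict(int)
    for t in query:
        for peer, tf in index.get(t, []):
            scores[peer] += tf
    return sorted(scores, key=lambda p: (-scores[p], p))[:k]
```

A real IPI would hold these postings at the super-peers and use a proper retrieval model rather than raw term frequency, but the routing decision has the same shape: look up the query terms and forward to the highest-scoring peers.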