963 results for Round Robin DNS


Relevance: 80.00%

Abstract:

Satellite data are increasingly used to provide observation-based estimates of the effects of aerosols on climate. The Aerosol-cci project, part of the European Space Agency's Climate Change Initiative (CCI), was designed to provide essential climate variables for aerosols from satellite data. Eight algorithms, developed for the retrieval of aerosol properties using data from AATSR (4), MERIS (3) and POLDER, were evaluated to determine their suitability for climate studies. The primary result from each of these algorithms is the aerosol optical depth (AOD) at several wavelengths, together with the Ångström exponent (AE), which describes the spectral variation of the AOD for a given wavelength pair. Other aerosol parameters that can be retrieved from satellite observations are not considered in this paper. The AOD and AE (AE only for Level 2) were evaluated against independent collocated observations from the ground-based AERONET sun photometer network and against “reference” satellite data provided by MODIS and MISR. Tools used for the evaluation were developed for daily products as produced by the retrieval with a spatial resolution of 10 × 10 km² (Level 2) and daily or monthly aggregates (Level 3). These tools include statistics for L2 and L3 products compared with AERONET, as well as scoring based on spatial and temporal correlations. In this paper we describe their use in a round robin (RR) evaluation of four months of data, one month for each season in 2008. The amount of data was restricted to four months because of the large effort made to improve the algorithms, and to evaluate the improvement and current status before larger data sets are processed. Evaluation criteria are discussed. The results presented show the current status of the European aerosol algorithms in comparison to AERONET, MODIS, and MISR data. The comparison leads to the preliminary conclusion that the scores are similar, including those for the references, but the coverage of AATSR needs to be enhanced and further improvements are possible for most algorithms. None of the algorithms, including the references, outperforms all others everywhere. AATSR data can be used for the retrieval of AOD and AE over land and ocean. PARASOL and one of the MERIS algorithms have been evaluated over ocean only, and both provide good results.
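As a rough illustration of the evaluation tools described above, the following Python sketch computes the basic collocation statistics (bias, RMSE, and Pearson correlation) between satellite-retrieved AOD and collocated AERONET observations. This is a minimal sketch on synthetic stand-in data, not project code; the noise model and array names are assumptions.

```python
# Minimal sketch: collocation statistics of the kind used when validating
# satellite AOD against AERONET. The data here are synthetic stand-ins.
import numpy as np

def collocation_stats(satellite_aod, aeronet_aod):
    """Return bias, RMSE, and Pearson correlation for collocated AOD pairs."""
    sat = np.asarray(satellite_aod, dtype=float)
    ref = np.asarray(aeronet_aod, dtype=float)
    diff = sat - ref
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    corr = np.corrcoef(sat, ref)[0, 1]
    return bias, rmse, corr

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    aeronet = rng.uniform(0.05, 0.8, size=200)           # "true" AOD values
    satellite = aeronet + rng.normal(0.0, 0.05, 200)     # retrieval with noise
    print("bias=%.3f rmse=%.3f r=%.3f" % collocation_stats(satellite, aeronet))
```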

Relevance: 80.00%

Abstract:

Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010–2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing precursor algorithms, three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of "best" versions of each of these algorithms (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for the AATSR and MERIS algorithms, in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions. The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun photometer observations for the different versions of each algorithm, globally (land and coastal) and for three regions with different aerosol regimes. The analysis allowed for an assessment of the sensitivities of all algorithms, which helped define the best algorithm versions for the subsequent round robin exercise; all algorithms (except for MERIS) showed some, in parts significant, improvement. In particular, using common aerosol components, and in part also the a priori aerosol-type climatology, is beneficial. On the other hand, the use of an AATSR-based common cloud mask meant a clear improvement (though with a significant reduction of coverage) for the MERIS standard product, but not for the algorithms using AATSR. It is noted that all these observations are mostly consistent across all five analyses (global land, global coastal, three regional), which can be understood well, since the set of aerosol components defined in Sect. 3.1 was explicitly designed to cover different global aerosol regimes (with low- and high-absorption fine mode, sea salt and dust).

Relevance: 80.00%

Abstract:

This thesis focuses on a hardware-based load-balancing solution for web traffic using an F5 content switch. In this project, the implemented scenario for distributing HTTP traffic load is based on the different CPU capacities (processing speeds) of multiple member servers. Two widely used load-balancing algorithms, Round Robin (RR) and the Ratio model (weighted Round Robin), are implemented through the F5 load balancer. To evaluate the performance of the F5 content switch, experimental tests were conducted on the implemented scenarios using the RR and Ratio model algorithms. Performance is examined in terms of throughput (bits/s) and response time of the member servers in a load-balancing pool. From these experiments we observed that the Ratio model algorithm is the more suitable choice when the pooled servers have different CPU capacities, as it allows weights to be assigned according to CPU usage in both static and dynamic load balancing.
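To make the difference between the two policies concrete, here is a minimal Python sketch (not the F5 implementation) contrasting plain Round Robin with a Ratio (weighted Round Robin) selection; the pool members and weights are hypothetical stand-ins for servers with different CPU capacities.

```python
# Minimal sketch: plain Round Robin vs. Ratio (weighted Round Robin).
import itertools

def round_robin(servers):
    """Cycle through servers in fixed order, one request each."""
    return itertools.cycle(servers)

def weighted_round_robin(servers, weights):
    """Repeat each server proportionally to its weight before moving on."""
    expanded = [s for s, w in zip(servers, weights) for _ in range(w)]
    return itertools.cycle(expanded)

if __name__ == "__main__":
    pool = ["web1", "web2", "web3"]   # hypothetical pool members
    ratios = [3, 2, 1]                # e.g. web1 has 3x the CPU of web3
    rr = round_robin(pool)
    wrr = weighted_round_robin(pool, ratios)
    print("RR :", [next(rr) for _ in range(6)])
    print("WRR:", [next(wrr) for _ in range(6)])
```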

Relevance: 80.00%

Abstract:

Load balance is a critical issue in distributed systems, such as server grids. In this paper, we propose a Balanced Load Queue (BLQ) model, which combines queuing theory and hydrodynamic theory, to model load balance in server grids. Based on the BLQ model, we claim that the performance of the whole system is best when the system is in a state of global fairness. We propose a load-balancing algorithm based on the model: it strives to keep the system in the global-fairness state using job deviation. We present three strategies for job deviation: best node, best neighbour, and random selection. A number of experiments were conducted to compare the three strategies, and the results show that best neighbour performs best among the proposed strategies. Furthermore, the proposed algorithm with the best neighbour strategy outperforms the traditional round robin algorithm in terms of processing delay, needs very limited system information, and is robust.
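A minimal sketch of the three job-deviation strategies named above, assuming queue length as the load measure; the ring topology and load values are illustrative and not part of the BLQ model itself.

```python
# Minimal sketch: best node, best neighbour, and random selection for
# deviating a job from an overloaded node. Loads are queue lengths.
import random

def best_node(loads, node, neighbours):
    """Deviate to the least-loaded node in the whole system."""
    return min(range(len(loads)), key=lambda i: loads[i])

def best_neighbour(loads, node, neighbours):
    """Deviate to the least-loaded direct neighbour."""
    return min(neighbours[node], key=lambda i: loads[i])

def random_selection(loads, node, neighbours):
    """Deviate to a uniformly chosen neighbour."""
    return random.choice(neighbours[node])

if __name__ == "__main__":
    loads = [9, 2, 5, 7]                                  # jobs queued per node
    ring = {i: [(i - 1) % 4, (i + 1) % 4] for i in range(4)}
    for strategy in (best_node, best_neighbour, random_selection):
        target = strategy(loads, 0, ring)
        print(f"{strategy.__name__}: node 0 deviates a job to node {target}")
```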

Relevance: 80.00%

Abstract:

DDoS attack source traceback is an open and challenging problem. Deterministic packet marking (DPM) is a simple and effective traceback mechanism, but current DPM-based traceback schemes are not practical due to their scalability constraint. We note that only a limited number of computers and routers are involved in any attack session. Therefore, we only need to mark these involved nodes for traceback purposes, rather than marking every node of the Internet as existing schemes do. Based on this finding, we propose a novel marking on demand (MOD) traceback scheme built on the DPM mechanism. To trace back to the involved attack sources, we only need to mark the involved ingress routers using the traditional DPM strategy. Similar to existing schemes, we require participating routers to install a traffic monitor. When a monitor notices a surge of suspicious network flows, it requests a unique mark from a globally shared MOD server and marks the suspicious flows with it. At the same time, the MOD server records the marks and their related requesting IP addresses. Once a DDoS attack is confirmed, the victim can obtain the attack sources by querying the MOD server with the marks extracted from attack packets. Moreover, we use the marking space in a round-robin style, which essentially addresses the scalability problem of the existing DPM-based traceback schemes. We establish a mathematical model for the proposed traceback scheme and thoroughly analyze the system. Theoretical analysis and extensive real-world data experiments demonstrate that the proposed traceback method is feasible and effective.
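A minimal sketch of the round-robin mark allocation idea: a server hands out marks from a fixed space cyclically and records which requesting router holds which mark, so a victim can later map a mark back to an ingress point. The MODServer class, mark-space size, and router address are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: round-robin mark allocation for on-demand traceback.
class MODServer:
    def __init__(self, mark_space_size):
        self.size = mark_space_size
        self.next_mark = 0
        self.mark_table = {}          # mark -> requesting router IP

    def request_mark(self, router_ip):
        """Assign the next mark in round-robin order and record the requester."""
        mark = self.next_mark
        self.next_mark = (self.next_mark + 1) % self.size   # reuse marks cyclically
        self.mark_table[mark] = router_ip
        return mark

    def trace(self, mark):
        """Map a mark extracted from attack packets back to an ingress router."""
        return self.mark_table.get(mark)

if __name__ == "__main__":
    server = MODServer(mark_space_size=1024)
    m = server.request_mark("192.0.2.7")   # a router sees suspicious flows
    print("attack traced to", server.trace(m))
```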

Relevance: 80.00%

Abstract:

The material presents scheduling policies for processes and threads. Process scheduling (or processor scheduling) deals with deciding which process will execute at a given instant and on which processor. The material also presents the relevant scheduling algorithms, including examples of preemptive and non-preemptive algorithms, the objectives and criteria of scheduling, and the different types of scheduling: FIFO (first-in first-out) scheduling, circular Round-Robin (RR) scheduling, Shortest Process First (SPF) scheduling, Shortest Remaining Time (SRT) scheduling, Fair Share Scheduling (FSS), real-time scheduling, Java thread scheduling (JVM), and scheduling in Windows XP and UNIX.
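As a minimal sketch of one of the policies listed, the following Python snippet simulates preemptive Round-Robin (RR) scheduling: each process runs for at most one quantum and is then requeued. The process names and burst times are made up for illustration.

```python
# Minimal sketch: preemptive Round-Robin scheduling with a fixed quantum.
from collections import deque

def round_robin_schedule(bursts, quantum):
    """Return the order in which processes run, as (name, run_time) slices."""
    queue = deque(bursts.items())
    timeline = []
    while queue:
        name, remaining = queue.popleft()
        run = min(quantum, remaining)
        timeline.append((name, run))
        if remaining > run:
            queue.append((name, remaining - run))   # preempted, requeued
    return timeline

if __name__ == "__main__":
    print(round_robin_schedule({"P1": 5, "P2": 3, "P3": 1}, quantum=2))
    # [('P1', 2), ('P2', 2), ('P3', 1), ('P1', 2), ('P2', 1), ('P1', 1)]
```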

Relevance: 80.00%

Abstract:

The video lecture presents process scheduling, with a focus on processor scheduling policies. It highlights priority scheduling (with static or dynamic priorities), how thread scheduling works in Java, the levels of scheduling (high level, intermediate level, low level), and the criteria taken into account by the scheduling algorithm. It also presents the objectives and criteria of scheduling and the following types: FIFO (first-in first-out) scheduling, circular Round-Robin (RR) scheduling, Shortest Process First (SPF) scheduling, Shortest Remaining Time (SRT) scheduling, Fair Share Scheduling (FSS), real-time scheduling, Java thread scheduling (JVM), and scheduling in Windows XP and UNIX.
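Complementing the Round-Robin sketch above, here is a minimal sketch of the static-versus-dynamic priority distinction the lecture highlights: with aging, waiting processes gain priority over time so low-priority work is not starved. The process names and priority values are illustrative.

```python
# Minimal sketch: dynamic-priority scheduling with aging.
def pick_next(ready, aging=1):
    """Pick the highest-priority process, then age everyone left waiting."""
    ready.sort(key=lambda p: p["priority"], reverse=True)
    chosen = ready.pop(0)
    for p in ready:
        p["priority"] += aging        # dynamic part: waiting raises priority
    return chosen

if __name__ == "__main__":
    ready = [{"name": "editor", "priority": 3},
             {"name": "backup", "priority": 1},
             {"name": "shell", "priority": 2}]
    while ready:
        print("run:", pick_next(ready)["name"])
```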

Relevance: 80.00%

Abstract:

The third primary production algorithm round robin (PPARR3) compares output from 24 models that estimate depth-integrated primary production from satellite measurements of ocean color, as well as from seven general circulation models (GCMs) coupled with ecosystem or biogeochemical models. Here we compare the global primary production fields corresponding to eight months of 1998 and 1999, as estimated from common input fields of photosynthetically available radiation (PAR), sea-surface temperature (SST), mixed-layer depth, and chlorophyll concentration. We also quantify the sensitivity of the ocean-color-based models to perturbations in their input variables. The pairwise correlation between ocean-color models was used to cluster them into groups of related output, which reflect the regions and environmental conditions under which they respond differently. The groups do not follow model complexity with regard to wavelength or depth dependence, though they are related to the manner in which temperature is used to parameterize photosynthesis. Global average primary production varies by a factor of two between models. The models diverge the most for the Southern Ocean, for SST below 10 °C, and for chlorophyll concentrations exceeding 1 mg Chl m⁻³. Based on the conditions under which the model results diverge most, we conclude that current ocean-color-based models are challenged by high-nutrient low-chlorophyll conditions and by extreme temperatures or chlorophyll concentrations. The GCM-based models predict primary production comparable to the ocean-color-based models: they estimate higher values in the Southern Ocean, at low SST, and in the equatorial band, while they estimate lower values in eutrophic regions (probably because the area of high chlorophyll concentrations is smaller in the GCMs). Further progress in primary production modeling requires improved understanding of the effect of temperature on photosynthesis and better parameterization of the maximum photosynthetic rate.
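A minimal sketch of the clustering step described above: pairwise correlation between model outputs is turned into a correlation distance and fed to hierarchical clustering. The model outputs below are synthetic stand-ins, not PPARR3 data, and the threshold is arbitrary.

```python
# Minimal sketch: group models by pairwise correlation of their output.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
sig_a, sig_b = rng.normal(size=(2, 500))            # two independent "regimes"
models = np.stack([sig_a + 0.1 * rng.normal(size=500),   # 4 hypothetical models:
                   sig_a + 0.1 * rng.normal(size=500),   # two track regime A,
                   sig_b + 0.1 * rng.normal(size=500),   # two track regime B
                   sig_b + 0.1 * rng.normal(size=500)])

corr = np.corrcoef(models)                          # pairwise correlation matrix
dist = 1.0 - corr                                   # correlation distance
np.fill_diagonal(dist, 0.0)
tree = linkage(squareform(dist, checks=False), method="average")
groups = fcluster(tree, t=0.5, criterion="distance")
print("model group labels:", groups)                # similar models share a label
```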

Relevance: 80.00%

Abstract:

In this action research study of my classroom of 10th grade Algebra II students, I investigated three related areas. First, I looked at how heterogeneous cooperative groups, where students in the group are responsible for presenting material, increase the number of students on task and the time on task when compared to individual practice. I noticed that their time on task might have been about the same, but they were communicating with each other mathematically. The second area I examined was the effect heterogeneous cooperative groups had on the teacher's and the students' verbal and nonverbal problem-solving skills and understanding when compared to individual practice. At the end of the action research, students were questioning each other, and the instructor was answering questions only when the entire group had a question. The third area of data collection focused on the effect heterogeneous cooperative groups had on students' listening skills when compared to individual practice. In the research I implemented individual quizzes and individual presentations. Both of these had a positive effect on listening in the groups. As a result of this research, I plan to continue implementing the round robin style of in-class practice with heterogeneous grouping and randomly selected individual presentations. For individual accountability I will continue the practice of individual quizzes one to two times a week.


Relevance: 80.00%

Abstract:

Point-of-care testing (POCT) remains under scrutiny by healthcare professionals because of its short and largely untried history. POCT methods are being developed by a few major equipment companies on the basis of rapid progress in informatics and nanotechnology. Issues such as POCT quality control, comparability with standard laboratory procedures, standardisation, traceability and round robin testing are being left to hospitals. As a result, the clinical and operational benefits of POCT first became evident for patients on the operating table. For the management of cardiovascular surgery patients, POCT technology is an indispensable aid. Improvements in the technology have meant that clinical laboratory pathologists now recognise the need for POCT beyond their high-throughput areas.

Relevance: 80.00%

Abstract:

There has been a continuous evolutionary process in asphalt pavement design. In the beginning it was crude and based on past experience. Through research, empirical methods were developed based on materials' response to specific loading at the AASHO Road Test. Today, pavement design has progressed to a mechanistic-empirical method. This methodology takes into account the mechanical properties of the individual layers and uses empirical relationships to relate them to performance. The mechanical tests used as part of this methodology include dynamic modulus and flow number, which have been shown to correlate with field pavement performance. This thesis was based on a portion of a research project being conducted at Michigan Technological University (MTU) for the Wisconsin Department of Transportation (WisDOT). The global scope of this project dealt with the development of a library of values for the mechanical properties of the asphalt pavement mixtures paved in Wisconsin. Additionally, a comparison of the current associated pavement design with that of the new AASHTO Design Guide was conducted. This thesis describes the development of the current pavement design methodology as well as the associated tests as part of a literature review. It also details the materials that were sampled from field operations around the state of Wisconsin and the testing preparation and procedures. Testing was conducted on available round robin mixtures and three Wisconsin mixtures, and the main results of the research were:

- The test history of the Superpave SPT (fatigue and permanent-deformation dynamic modulus) does not affect the mean response for either dynamic modulus or flow number, but it does increase the variability of the flow number results.
- The method of specimen preparation, compacting to test geometry versus sawing/coring to test geometry, does not statistically appear to affect the intermediate- and high-temperature dynamic modulus and flow number results.
- The 2002 AASHTO Design Guide simulations support the findings of the statistical analyses: the method of specimen preparation did not impact the performance of the HMA as a structural layer as predicted by the Design Guide software.
- The methodologies for determining the temperature-viscosity relationship as stipulated by Witczak are sensitive to the viscosity test temperatures employed.
- An increase in asphalt binder content of 0.3% was found to increase the dynamic modulus at the intermediate and high test temperatures, as well as the flow number. This result was based on the testing conducted and contradicts previous research and the hypothesis put forth for this thesis; it should be used with caution and requires further review.
- Based on the limited results presented herein, the asphalt binder grade appears to have a greater impact on performance in the Superpave SPT than aggregate angularity.
- Dynamic modulus and flow number were shown to increase with traffic level (requiring an increase in aggregate angularity) and with a decrease in air voids, confirming the hypotheses regarding these two factors.
- Accumulated micro-strain at flow number, as opposed to the flow number itself, appeared to be a promising measure for comparing the quality of specimens within a specific mixture.
- At the current time the Design Guide and its associated software need further improvement prior to implementation by owner/agencies.

Relevance: 80.00%

Abstract:

This work introduces the lines of research that the NGCPV project is pursuing and some of the first results obtained. Sponsored by the European Commission under the 7th Framework Programme and by NEDO (Japan) within the first collaborative call launched by both bodies in the field of energy, the NGCPV project aims at bringing the cost of the photovoltaic kWh to competitive levels in the framework of high-concentration photovoltaics (CPV) by exploring the development and assessment of concentrator photovoltaic solar cells and modules, novel materials and new solar cell structures, as well as methods and procedures to standardize measurement technology for concentrator photovoltaic cells and modules. The more specific objectives are: (1) to manufacture a cell prototype with an efficiency of at least 45% and to undertake the corresponding experimental activity, (2) to manufacture a 35% module prototype and elaborate the roadmap towards achieving 40%, (3) to develop reliable characterization techniques for III-V materials and quantum structures, (4) to achieve an agreement within 5% in the characterization of CPV cells and modules in a round robin scheme, and (5) to evaluate the potential of new materials, device technologies and quantum nanostructures to improve the efficiency of solar cells for CPV.

Relevance: 80.00%

Abstract:

In this Master's Thesis a new Distributed Award Protocol (DAP) for robot communication and cooperation is presented. Task assignment (contract awarding) is done dynamically, with contracts assigned to robots based upon the best bid received. Instead of having a manager and a contractor, a fully distributed bidding/awarding mechanism without a distinguished master is proposed. The best-bidding robots are awarded the contracts for execution, and the contractors make decisions locally. This brings the following benefits: no communication bottleneck, low computational power requirements, and increased robustness. DAP can handle multitasking: tasks can be injected into the system during the execution of already allocated tasks. Since tasks have priorities, they can be re-allocated in the next cycle after the current bid parameters of all robots are taken into account. The aim is to minimize a global cost function that is a compromise between the cost of task execution and the cost of resource usage. Information about tasks and bid values is spread among robots using a Round Robin Route, a novel solution proposed in this work. This method also allows failed robots to be identified. A failed robot is eliminated from the list of awarded robots and a replacement is found, so the task is still executed by the team. If the failure was temporary (e.g. communication noise) and the robot recovers, it can again participate in the next bidding/awarding process. Using a bidding/awarding mechanism allows robots to dynamically relocate among tasks, which also contributes to system robustness. DAP was evaluated through multiple experiments in a multi-robot simulation system. Various scenarios were tested to check the idea of the main algorithm. Different robot failures (communication failures, partial hardware malfunctions) were simulated, and observations were made regarding how DAP recovers from them. DAP's flexibility to environmental changes was also examined. The experiments in the simulated environment confirmed the above features of DAP.
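A minimal sketch of the bidding/awarding step, assuming an illustrative bid function (execution cost plus current load); robot names, costs, and priorities are made up, and the Round Robin Route message passing is abstracted away. Each robot could run this same award computation locally once all bids have circulated, so no master is needed.

```python
# Minimal sketch: distributed-style bidding/awarding without a master.
def bid(robot, task):
    """Illustrative bid: execution cost for the task plus current load."""
    return robot["cost"][task] + robot["busy"]

def award(robots, tasks):
    """Each task goes to the lowest bidder; higher-priority tasks go first."""
    awards = {}
    for task in sorted(tasks, key=lambda t: tasks[t], reverse=True):
        winner = min(robots, key=lambda r: (bid(r, task), r["name"]))
        awards[task] = winner["name"]
        winner["busy"] += 1           # awarded work raises the robot's next bid
    return awards

if __name__ == "__main__":
    robots = [{"name": "r1", "busy": 0, "cost": {"scan": 2, "carry": 5}},
              {"name": "r2", "busy": 0, "cost": {"scan": 4, "carry": 1}}]
    tasks = {"scan": 1, "carry": 2}   # task -> priority
    print(award(robots, tasks))       # {'carry': 'r2', 'scan': 'r1'}
```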

Relevance: 80.00%

Abstract:

Report published in the proceedings of the National Conference "Education in the Information Society" (Образованието в информационното общество), Plovdiv, May 2012.