966 results for FAST ALGORITHM


Relevance: 20.00%

Abstract:

This document describes algorithms based on Elliptic Curve Cryptography (ECC) for use within the Secure Shell (SSH) transport protocol. In particular, it specifies Elliptic Curve Diffie-Hellman (ECDH) key agreement, Elliptic Curve Menezes-Qu-Vanstone (ECMQV) key agreement, and the Elliptic Curve Digital Signature Algorithm (ECDSA) for use in the SSH Transport Layer protocol.
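
The ECDH key-agreement primitive at the core of this specification can be sketched independently of the SSH framing. The following minimal illustration uses the Python `cryptography` package with an assumed curve (SECP256R1) and an HKDF step for key derivation; SSH itself defines its own key-derivation function and message formats:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each peer generates an ephemeral key pair on an agreed named curve.
client_key = ec.generate_private_key(ec.SECP256R1())
server_key = ec.generate_private_key(ec.SECP256R1())

# Public keys are exchanged; both sides derive the same shared secret.
client_secret = client_key.exchange(ec.ECDH(), server_key.public_key())
server_secret = server_key.exchange(ec.ECDH(), client_key.public_key())
assert client_secret == server_secret

# Derive a session key from the shared secret (SSH uses its own hash-based
# KDF; HKDF is shown here purely for illustration).
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"ssh-demo").derive(client_secret)

# ECDSA: the server could sign the exchange hash with its host key.
signature = server_key.sign(b"exchange-hash", ec.ECDSA(hashes.SHA256()))
server_key.public_key().verify(signature, b"exchange-hash",
                               ec.ECDSA(hashes.SHA256()))
```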

Relevance: 20.00%

Abstract:

The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the population Monte Carlo algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs in applying importance sampling to high-dimensional problems. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
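
The dimensional blow-up described here is easy to reproduce numerically. The sketch below is an illustration of the general phenomenon rather than the paper's experiment: it samples from a slightly mismatched Gaussian proposal and shows how the effective sample size of the importance weights collapses as the dimension grows:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 10_000, 1.2          # proposal N(0, sigma^2 I); target N(0, I)

for d in (1, 5, 10, 20, 50):
    x = rng.normal(scale=sigma, size=(n, d))
    # log importance weight: log p(x) - log q(x), accumulated over dimensions
    log_w = (-0.5 * np.sum(x**2, axis=1)
             + 0.5 * np.sum((x / sigma)**2, axis=1)
             + d * np.log(sigma))
    w = np.exp(log_w - log_w.max())            # stabilise before normalising
    ess = w.sum()**2 / np.sum(w**2)            # effective sample size
    print(f"d={d:3d}  ESS/n = {ess / n:.3f}")  # drops sharply as d grows
```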

Relevance: 20.00%

Abstract:

In the field of the semantic grid, QoS-based Web service composition is an important problem. In a semantic- and service-rich environment such as the semantic grid, context constraints on Web services are common, so composition must consider not only the QoS properties of Web services but also the inter-service dependencies and conflicts that these context constraints create. In this paper, we present a repair genetic algorithm, namely a minimal-conflict hill-climbing repair genetic algorithm, to address the Web service composition optimization problem in the presence of domain constraints and inter-service dependencies and conflicts. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
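
As an illustration of the repair idea (not the paper's implementation, and with a hypothetical constraint encoding), a minimal-conflict hill-climbing repair can be sketched as follows: repeatedly pick the service whose chosen implementation participates in the most violated constraints and reassign it to the implementation that minimises its conflicts:

```python
# chromosome: chosen implementation index for each abstract service
# conflicts: set of ((service_a, impl_a), (service_b, impl_b)) pairs that may
#            not appear together in one composition (hypothetical format)
def count_conflicts(chromosome, conflicts, service):
    return sum(1 for (sa, ia), (sb, ib) in conflicts
               if (sa == service and chromosome[sa] == ia and chromosome[sb] == ib)
               or (sb == service and chromosome[sb] == ib and chromosome[sa] == ia))

def min_conflict_repair(chromosome, candidates, conflicts, max_steps=100):
    chromosome = dict(chromosome)
    for _ in range(max_steps):
        scores = {s: count_conflicts(chromosome, conflicts, s) for s in chromosome}
        worst = max(scores, key=scores.get)
        if scores[worst] == 0:          # already feasible
            break
        # move the most-conflicted service to the implementation that
        # minimises its remaining conflicts (hill climbing on conflict count)
        best = min(candidates[worst],
                   key=lambda impl: count_conflicts({**chromosome, worst: impl},
                                                    conflicts, worst))
        chromosome[worst] = best
    return chromosome

# toy usage: two services, two implementations each, one conflicting pair
candidates = {"s1": [0, 1], "s2": [0, 1]}
conflicts = {(("s1", 0), ("s2", 0))}
print(min_conflict_repair({"s1": 0, "s2": 0}, candidates, conflicts))
```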

Relevance: 20.00%

Abstract:

An algorithm based on the concept of Kalman filtering is proposed in this paper for the estimation of power system signal attributes such as amplitude, frequency, and phase angle. This technique can be used in protection relays, digital AVRs, DSTATCOMs, FACTS devices, and other power electronics applications. Furthermore, this algorithm is particularly suitable for the integration of distributed generation sources into power grids, where fast and accurate detection of small variations in signal attributes is needed. Practical considerations such as the effect of noise, higher-order harmonics, and computational issues of the algorithm are considered and tested in the paper. Several computer simulations are presented to highlight the usefulness of the proposed approach. Simulation results show that the proposed technique can simultaneously estimate the signal attributes, even when the signal is highly distorted due to the presence of non-linear loads and noise.
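
For a signal of known nominal frequency, a linear Kalman filter over the in-phase and quadrature components already recovers amplitude and phase. The following toy sketch is my own formulation of that standard idea, not the paper's algorithm (which also handles frequency deviations and harmonic distortion):

```python
import numpy as np

f0, fs, n = 50.0, 2000.0, 400                    # nominal frequency, sample rate, samples
t = np.arange(n) / fs
true_signal = 10.0 * np.sin(2 * np.pi * f0 * t + 0.3)
y = true_signal + 0.5 * np.random.default_rng(1).normal(size=n)

w = 2 * np.pi * f0 / fs                          # phase advance per sample
F = np.array([[np.cos(w), np.sin(w)],            # rotates the [A*sin, A*cos] state
              [-np.sin(w), np.cos(w)]])
H = np.array([[1.0, 0.0]])                       # we observe the sine component
Q, R = 1e-4 * np.eye(2), 0.25
x, P = np.zeros(2), np.eye(2)

amp = []
for yk in y:
    x, P = F @ x, F @ P @ F.T + Q                # predict
    S = H @ P @ H.T + R
    K = (P @ H.T) / S                            # Kalman gain (2x1)
    x = x + (K * (yk - H @ x)).ravel()           # update state
    P = (np.eye(2) - K @ H) @ P                  # update covariance
    amp.append(np.hypot(*x))                     # amplitude = |[sin, cos] components|

print(f"estimated amplitude ≈ {amp[-1]:.2f} (true 10.0)")
```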

Relevance: 20.00%

Abstract:

Unmanned Aerial Vehicles (UAVs) are emerging as an ideal platform for a wide range of civil applications such as disaster monitoring, atmospheric observation and outback delivery. However, the operation of UAVs is currently restricted to specially segregated regions of airspace outside of the National Airspace System (NAS). Mission Flight Planning (MFP) is an integral part of UAV operation that addresses some of the requirements (such as safety and the rules of the air) of integrating UAVs into the NAS. Automated MFP is a key enabler for a number of UAV operating scenarios as it helps to increase the level of onboard autonomy. For example, onboard MFP is required to ensure continued conformance with the NAS integration requirements when there is an outage in the communications link.

MFP is a motion planning task concerned with finding a path between a designated start waypoint and a goal waypoint. This path is described by a sequence of 4 Dimensional (4D) waypoints (three spatial dimensions and one time dimension) or, equivalently, by a sequence of trajectory segments (or tracks). It is necessary to consider the time dimension because the UAV operates in a dynamic environment. Existing methods for generic motion planning, UAV motion planning and general vehicle motion planning cannot adequately address the requirements of MFP. The flight plan needs to optimise for multiple decision objectives, including mission safety objectives, the rules of the air and mission efficiency objectives. Online (in-flight) replanning capability is needed as the UAV operates in a large, dynamic and uncertain outdoor environment.

This thesis derives a multi-objective 4D search algorithm entitled Multi-Step A* (MSA*), based on the seminal A* search algorithm. MSA* is proven to find the optimal (least cost) path given a variable successor operator (which enables arbitrary track angle and track velocity resolution). Furthermore, it is shown to be of comparable complexity to multi-objective, vector neighbourhood based A* (Vector A*, an extension of A*). A variable successor operator enables the imposition of a multi-resolution lattice structure on the search space (which results in fewer search nodes). Unlike cell decomposition based methods, soundness is guaranteed with multi-resolution MSA*. MSA* is demonstrated through Monte Carlo simulations to be computationally efficient. It is shown that multi-resolution, lattice based MSA* finds paths of equivalent cost (less than 0.5% difference) to Vector A* (the benchmark) in a third of the computation time (on average). This is the first contribution of the research.

The second contribution is the discovery of the additive consistency property for planning with multiple decision objectives. Additive consistency ensures that the planner is not biased (which would result in a suboptimal path) by ensuring that the cost of traversing a track in one step equals that of traversing the same track in multiple steps. MSA* mitigates uncertainty through online replanning, Multi-Criteria Decision Making (MCDM) and tolerance. Each trajectory segment is modelled with a cell sequence that completely encloses it. The tolerance, measured as the minimum distance between the track and the cell boundaries, is the third major contribution. Even though MSA* is demonstrated for UAV MFP, it is extensible to other 4D vehicle motion planning applications.

Finally, the research proposes a self-scheduling replanning architecture for MFP. This architecture replicates the decision strategies of human experts to meet the time constraints of online replanning. Based on a feedback loop, the proposed architecture switches between fast, near-optimal planning and optimal planning to minimise the need for hold manoeuvres. The derived MFP framework is original and is shown, through extensive verification and validation, to satisfy the requirements of UAV MFP. As MFP is an enabling factor for the operation of UAVs in the NAS, the presented work is both original and significant.
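
The multi-objective search idea can be illustrated with a plain A* in which each move carries both a distance cost and a risk cost combined by a weighted sum. This is a simple scalarisation on a fixed grid, not the thesis's MSA* with its variable successor operator, but it shows the mechanics of searching under multiple decision objectives:

```python
import heapq

def a_star(grid_risk, start, goal, w_dist=1.0, w_risk=5.0):
    """A* on a 4-connected grid; edge cost = w_dist * 1 + w_risk * risk of the
    cell entered; heuristic = weighted Manhattan distance (admissible here)."""
    rows, cols = len(grid_risk), len(grid_risk[0])
    h = lambda p: w_dist * (abs(p[0] - goal[0]) + abs(p[1] - goal[1]))
    open_set = [(h(start), 0.0, start, [start])]
    best = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return g, path
        if best.get(node, float("inf")) <= g:
            continue                              # already expanded more cheaply
        best[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                ng = g + w_dist + w_risk * grid_risk[nr][nc]
                heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), path + [(nr, nc)]))
    return None

risk = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0]]
print(a_star(risk, (0, 0), (3, 3)))   # detours around the high-risk cells
```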

Relevance: 20.00%

Abstract:

The following exegesis will detail the key advantages and disadvantages of combining a traditional talk show genre with a linear documentary format using a small production team and a limited budget in a fast-turnaround weekly environment. It deals with the Australian Broadcasting Corporation series Talking Heads, broadcast weekly in the early evening schedule for the network at 18.30 with the presenter Peter Thompson. As Executive Producer for the programme at its inception, I was responsible for setting it up for the ABC in Brisbane, a role that included selecting most of the team to work on the series and commissioning the music, titles and all other aspects required to bring the show to the screen. What emerged when producing this generic hybrid will be examined at length, including:

- The talk show/documentary hybrid format needs longer than 26'30" to be entirely successful.
- The type of presenter ideally suited to the talk show/documentary format is someone who is genuinely interested in their guests and flexible enough to maintain the format against tangential odds.
- The use of illustrative footage shot in a documentary-style narrative improves the talk show format.
- The fast turnaround of the talk show/documentary hybrid puts tremendous pressure on the time frames for archive research and copyright clearance and therefore needs to be well resourced.
- In a fast-turnaround talk show/documentary format the field components are advantageous but require very low shooting ratios to be sustainable.
- An intimate set works best for a talk show hybrid like this.

Also submitted are two DVDs of recordings of programmes I produced and directed from the first and third series. These are for consideration in the practical component of this project and reflect the changes that I made to the series.

Relevance: 20.00%

Abstract:

This study addresses calls in the literature for the external validation of Western-based marketing concepts and theory in the East. Using DINESERV, the relationships between service quality, overall service quality perceptions, customer satisfaction, and repurchase intentions in the Malaysian fast food industry are examined. A questionnaire was administered to Malaysian fast food consumers at a large university, resulting in findings that support the five-dimensional nature of DINESERV and three of four proposed hypotheses. This study contributes to knowledge of service quality in developing countries and is the first to examine DINESERV in the Malaysian fast food industry.

Relevance: 20.00%

Abstract:

This paper describes experiments conducted in order to simultaneously tune 15 joints of a humanoid robot. Two Genetic Algorithm (GA) based tuning methods were developed and compared against a hand-tuned solution. The system was tuned to minimise tracking error while at the same time achieving smooth joint motion. Joint smoothness is crucial for accurate online Zero Moment Point (ZMP) estimation, a prerequisite for a closed-loop, dynamically stable humanoid walking gait. Results in both simulation and on a real robot are presented, demonstrating the superior smoothness performance of the GA-based methods.
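
The two competing objectives, tracking accuracy and joint smoothness, are typically folded into a single fitness value for the GA to minimise. The sketch below uses a hypothetical weighting and synthetic trajectories; it is only an illustration of such a fitness function, not the paper's tuning setup:

```python
import numpy as np

def joint_fitness(reference, measured, smoothness_weight=0.1):
    """Lower is better: mean squared tracking error plus a penalty on joint
    acceleration, approximated by the second finite difference of the trajectory."""
    tracking_error = np.mean((reference - measured) ** 2)
    acceleration = np.diff(measured, n=2)
    smoothness_penalty = np.mean(acceleration ** 2)
    return tracking_error + smoothness_weight * smoothness_penalty

t = np.linspace(0, 2 * np.pi, 200)
reference = np.sin(t)
jittery = reference + 0.05 * np.random.default_rng(2).normal(size=t.size)  # accurate but rough
smooth = 0.95 * reference                                                   # slightly offset but smooth
print(joint_fitness(reference, jittery), joint_fitness(reference, smooth))
```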

Relevance: 20.00%

Abstract:

The paper presents a fast and robust stereo object recognition method. The method is currently unable to identify the rotation of objects, which makes it particularly well suited to locating spheres, which are rotationally invariant. Approximate methods for locating non-spherical objects have also been developed. Fundamental to the method is that the correspondence problem is solved using information about the dimensions of the object being located. This is in contrast to previous stereo object recognition systems, where the scene is first reconstructed by point matching techniques. The method is suitable for real-time application on low-power devices.
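
The geometric idea behind using object dimensions to solve correspondence can be written down directly: for a sphere of known diameter, its apparent image size fixes the depth, which in turn fixes the expected disparity between the two views. The sketch below uses hypothetical camera parameters and illustrates the principle rather than the authors' method:

```python
# pinhole stereo with hypothetical parameters (pixels / metres)
focal_px = 700.0         # focal length in pixels
baseline_m = 0.12        # distance between the two cameras
ball_diameter_m = 0.065  # known physical diameter of the sphere

def expected_disparity(apparent_diameter_px):
    """Known size -> depth -> expected disparity between the left/right views."""
    depth_m = focal_px * ball_diameter_m / apparent_diameter_px
    return focal_px * baseline_m / depth_m   # = baseline * apparent size / diameter

# a ball imaged 40 px wide should shift ~74 px between views, so the
# correspondence search can be restricted to a narrow disparity band
print(expected_disparity(40.0))
```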

Relevance: 20.00%

Abstract:

Purpose: To examine the influence of two different fast-start pacing strategies on performance and oxygen consumption (V̇O2) during cycle ergometer time trials lasting ∼5 min. Methods: Eight trained male cyclists performed four cycle ergometer time trials whereby the total work completed (113 ± 11.5 kJ; mean ± SD) was identical to the better of two 5-min self-paced familiarization trials. During the performance trials, initial power output was manipulated to induce either an all-out or a fast start. Power output during the first 60 s of the fast-start trial was maintained at 471.0 ± 48.0 W, whereas the all-out start approximated a maximal starting effort for the first 15 s (mean power: 753.6 ± 76.5 W) followed by 45 s at a constant power output (376.8 ± 38.5 W). Irrespective of starting strategy, power output was controlled so that participants would complete the first quarter of the trial (28.3 ± 2.9 kJ) in 60 s. Participants performed two trials using each condition, with their fastest time trial compared. Results: Performance time was significantly faster when cyclists adopted the all-out start (4 min 48 s ± 8 s) compared with the fast start (4 min 51 s ± 8 s; P < 0.05). The first-quarter V̇O2 during the all-out start trial (3.4 ± 0.4 L·min⁻¹) was significantly higher than during the fast-start trial (3.1 ± 0.4 L·min⁻¹; P < 0.05). After removal of an outlier, the percentage increase in first-quarter V̇O2 was significantly correlated (r = -0.86, P < 0.05) with the relative difference in finishing time. Conclusions: An all-out start produces superior middle-distance cycling performance when compared with a fast start. The improvement in performance may be due to a faster V̇O2 response rather than time saved due to a rapid acceleration.
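
The reported pacing numbers are internally consistent; a quick check using only the figures quoted above:

```python
total_work_kj = 113.0
first_quarter_kj = total_work_kj / 4               # 28.25 kJ, reported as 28.3 ± 2.9 kJ
fast_start_w = first_quarter_kj * 1000 / 60        # ≈ 471 W held for 60 s (reported 471.0 W)
all_out_kj = (753.6 * 15 + 376.8 * 45) / 1000      # ≈ 28.3 kJ: both starts match work after 60 s
print(first_quarter_kj, round(fast_start_w), round(all_out_kj, 1))
```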

Relevance: 20.00%

Abstract:

Cloud computing is a recent computing paradigm in which applications, data and IT services are provided over the Internet. Cloud computing has become a major medium for Software as a Service (SaaS) providers to host their SaaS, as it can provide the scalability a SaaS requires. The challenges in the composite SaaS placement process stem from several factors, including the large size of the Cloud network, competing SaaS resource requirements, interactions between SaaS components, and interactions between a SaaS and its data components. However, existing application placement methods for data centres do not consider the placement of a component's data. In addition, a Cloud network is much larger than the data centre networks discussed in existing studies. This paper proposes a penalty-based genetic algorithm (GA) for the composite SaaS placement problem in the Cloud. We believe this is the first attempt to address SaaS placement together with its data on a Cloud provider's servers. Experimental results demonstrate the feasibility and the scalability of the GA.
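
The penalty idea, letting the GA explore infeasible placements but making them expensive, can be sketched with a toy fitness for component-to-server placement. The cost model and data below are hypothetical and stand in for the paper's Cloud model:

```python
def placement_fitness(placement, comp_cpu, server_cpu, comm_cost, penalty=1000.0):
    """placement[i] = server hosting component i (data components included).
    Fitness = total communication cost + penalty per unit of CPU overload."""
    cost = sum(comm_cost[i][j] for i in range(len(placement))
               for j in range(len(placement))
               if placement[i] != placement[j])
    load = {}
    for comp, server in enumerate(placement):
        load[server] = load.get(server, 0) + comp_cpu[comp]
    overload = sum(max(0, used - server_cpu[s]) for s, used in load.items())
    return cost + penalty * overload   # infeasible placements survive but are penalised

comm_cost = [[0, 4, 1], [4, 0, 2], [1, 2, 0]]   # traffic between components (hypothetical)
print(placement_fitness([0, 0, 1], comp_cpu=[2, 3, 2],
                        server_cpu={0: 4, 1: 4}, comm_cost=comm_cost))
```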

Relevance: 20.00%

Abstract:

Web service composition is an important problem in web service based systems. It is about how to build a new value-added web service using existing web services. A web service may have many implementations, all of which have the same functionality but may have different QoS values. Thus, a significant research problem in web service composition is how to select an implementation for each of the web services so that the composite web service gives the best overall performance. This is the so-called optimal web service selection problem. There may be mutual constraints between some web service implementations. Sometimes, when an implementation is selected for one web service, a particular implementation for another web service must also be selected; this is a so-called dependency constraint. Sometimes, when an implementation is selected for one web service, a set of implementations for another web service must be excluded from the composition; this is a so-called conflict constraint. The optimal web service selection is therefore a typical constrained combinatorial optimization problem from the computational point of view. This paper proposes a new hybrid genetic algorithm for the optimal web service selection problem. The hybrid genetic algorithm has been implemented and evaluated, and the evaluation results show that it outperforms two other existing genetic algorithms when the number of web services and the number of constraints are large.
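
The dependency and conflict constraints described above translate directly into a feasibility check on a candidate selection, which a GA can use for repair or penalisation. The constraint encoding below is hypothetical and serves only to illustrate the check:

```python
def violations(selection, dependencies, conflicts):
    """selection: chosen implementation per abstract service, e.g. {"pay": "payA"}.
    dependencies: {(service, impl): (other_service, required_impl)}
    conflicts:    {(service, impl): set of (other_service, excluded_impl)}"""
    count = 0
    for (svc, impl), (other, required) in dependencies.items():
        if selection.get(svc) == impl and selection.get(other) != required:
            count += 1                  # dependency constraint violated
    for (svc, impl), excluded in conflicts.items():
        if selection.get(svc) == impl:
            count += sum(1 for other, bad in excluded if selection.get(other) == bad)
    return count                        # 0 means the selection is feasible

deps = {("ship", "shipX"): ("pay", "payA")}
confs = {("pay", "payB"): {("ship", "shipX")}}
print(violations({"ship": "shipX", "pay": "payB"}, deps, confs))   # -> 2
```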

Relevance: 20.00%

Abstract:

Composite web services comprise several component web services. When a composite web service is executed centrally, a single web service engine is responsible for coordinating the execution of the components, which may create a bottleneck and degrade the overall throughput of the composite service when there are a large number of service requests. Potentially this problem can be handled by decentralizing execution of the composite web service, but this raises the issue of how to partition a composite service into groups of component services such that each group can be orchestrated by its own execution engine while ensuring acceptable overall throughput of the composite service. Here we present a novel penalty-based genetic algorithm to solve the composite web service partitioning problem. Empirical results show that our new algorithm outperforms existing heuristic-based solutions.
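
A chromosome for this partitioning problem can simply map each component service to an execution-engine group, with a penalty applied when a group exceeds an engine's capacity. The toy model below (hypothetical traffic matrix and capacity, with exhaustive enumeration standing in for the GA on this tiny instance) illustrates the penalised objective:

```python
from itertools import product

def partition_cost(groups, traffic, capacity, penalty=100.0):
    """groups[i] = execution engine assigned to component i; the objective is
    cross-engine traffic plus a penalty when an engine hosts too many components."""
    cross = sum(traffic[i][j] for i in range(len(groups))
                for j in range(i + 1, len(groups)) if groups[i] != groups[j])
    sizes = {}
    for g in groups:
        sizes[g] = sizes.get(g, 0) + 1
    over = sum(max(0, n - capacity) for n in sizes.values())
    return cross + penalty * over

traffic = [[0, 5, 1, 1], [5, 0, 1, 1], [1, 1, 0, 5], [1, 1, 5, 0]]
best = min(product(range(2), repeat=4),
           key=lambda g: partition_cost(g, traffic, capacity=2))
print(best, partition_cost(best, traffic, capacity=2))  # chatty pairs (0,1) and (2,3) each share an engine
```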