964 results for efficient algorithm


Relevance: 20.00%

Abstract:

Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we produce a classic leaky NTRUSign signature and hide it with Gaussian noise, using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the $\tilde{O}(N^{1.5})$-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
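The hiding step described here follows Lyubashevsky-style noise flooding with rejection sampling. The sketch below illustrates the idea on plain real vectors; the distribution shapes, the parameters sigma and M, and the acceptance ratio are standard textbook choices, not the paper's concrete instantiation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hide_with_gaussian_noise(s, sigma, M=3.0):
    """Rejection-sampling sketch: output z = s + y with y ~ N(0, sigma^2 I),
    accepted with probability D_sigma(z) / (M * D_{sigma,s}(z)), so that the
    accepted z is distributed as a centered Gaussian, independent of the
    leaky vector s. sigma and M are illustrative choices."""
    while True:
        y = rng.normal(0.0, sigma, size=s.shape)
        z = s + y
        # log of D_sigma(z) / D_{sigma,s}(z), both spherical Gaussians
        log_ratio = (-(z @ z) + (z - s) @ (z - s)) / (2 * sigma**2)
        if rng.random() < min(1.0, np.exp(log_ratio) / M):
            return z

s = rng.integers(-5, 6, size=8).astype(float)  # stand-in for a leaky signature
z = hide_with_gaussian_noise(s, sigma=50.0)
```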

Relevance: 20.00%

Abstract:

This paper presents an improved field weakening algorithm for synchronous reluctance motor (RSM) drives. The proposed algorithm is robust to variations in the machine d- and q-axis inductances. The transition among the maximum torque per ampere (MTPA), current-limit, voltage-limit, and maximum torque per flux (MTPF) trajectories is smooth. The proposed technique is combined with the direct torque control method to attain a high-performance drive in the field weakening region. Simulation and experimental results are provided to verify the effectiveness of the proposed approach.
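As context for the trajectory hand-off mentioned above, here is a deliberately naive sketch of how a linear-inductance reference generator might move from MTPA onto the current and voltage limits. The torque model, pole-pair count, and the proportional weakening loop are assumptions for illustration, not the paper's robust algorithm.

```python
import numpy as np

def dq_reference(T_ref, w_e, Vmax, Imax, Ld, Lq, p=2):
    """Toy d/q current reference (linear inductances, resistance ignored).
    Torque model: T = 1.5 * p * (Ld - Lq) * id * iq; MTPA at id = iq."""
    i = np.sqrt(abs(T_ref) / (1.5 * p * (Ld - Lq)))      # MTPA magnitude
    id_ref = iq_ref = min(i, Imax / np.sqrt(2))          # current limit
    # voltage limit at speed w_e: w_e * |(Ld*id, Lq*iq)| <= Vmax
    while w_e * np.hypot(Ld * id_ref, Lq * iq_ref) > Vmax and id_ref > 1e-6:
        id_ref *= 0.99                                   # shed d-axis flux ...
        iq_ref = min(iq_ref / 0.99,                      # ... trade it for iq
                     np.sqrt(max(Imax**2 - id_ref**2, 0.0)))
    return id_ref, iq_ref

print(dq_reference(T_ref=10.0, w_e=800.0, Vmax=350.0, Imax=20.0,
                   Ld=0.10, Lq=0.02))
```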

Relevance: 20.00%

Abstract:

We consider online prediction problems where the loss between the prediction and the outcome is measured by the squared Euclidean distance and its generalization, the squared Mahalanobis distance. We derive the minimax solutions for the case where the prediction and action spaces are the simplex (this setup is sometimes called the Brier game) and the \ell_2 ball (this setup is related to Gaussian density estimation). We show that in both cases the value of each sub-game is a quadratic function of a simple statistic of the state, with coefficients that can be efficiently computed using an explicit recurrence relation. The resulting deterministic minimax strategy and randomized maximin strategy are linear functions of the statistic.
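For reference, the two losses this abstract works with are the standard definitions (notation is textbook, not taken from the paper):

```latex
\ell(a, x) = \lVert a - x \rVert_2^2,
\qquad
\ell_W(a, x) = (a - x)^\top W \, (a - x), \quad W \succ 0,
```

with $W = I$ recovering the squared Euclidean distance as a special case of the squared Mahalanobis distance.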

Relevance: 20.00%

Abstract:

A number of online algorithms have been developed that have small additional loss (regret) compared to the best "shifting expert". In this model, there is a set of experts and the comparator is the best partition of the trial sequence into a small number of segments, where the expert of smallest loss is chosen in each segment. The regret is typically defined for worst-case data/loss sequences. There has been a recent surge of interest in online algorithms that combine good worst-case guarantees with much improved performance on easy data. A practically relevant class of easy data is the case when the loss of each expert is iid and the best and second-best experts have a gap between their mean losses. In the full information setting, the FlipFlop algorithm by De Rooij et al. (2014) combines the best of the iid-optimal Follow-The-Leader (FL) and the worst-case-safe Hedge algorithms, whereas in the bandit information case SAO by Bubeck and Slivkins (2012) competes with the iid-optimal UCB and the worst-case-safe EXP3. We ask the same questions for the shifting expert problem. First, what are simple and efficient algorithms for the shifting experts problem when the loss sequence in each segment is iid with respect to a fixed but unknown distribution? Second, how can the performance of such algorithms on easy data be efficiently united with worst-case robustness? A particularly intriguing open problem is the case when the comparator shifts within a small subset of experts from a large set, under the assumption that the losses in each segment are iid.
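As a reference point for the worst-case-safe side of this trade-off, here is a minimal implementation of the classical Hedge (exponential weights) algorithm mentioned above; the loss range and fixed learning rate are illustrative assumptions.

```python
import numpy as np

def hedge(loss_matrix, eta=0.5):
    """Classical Hedge over K experts. loss_matrix[t, k] is expert k's loss
    in round t, assumed to lie in [0, 1]; eta is the learning rate."""
    T, K = loss_matrix.shape
    log_w = np.zeros(K)                      # log-weights for numerical stability
    total_loss = 0.0
    for t in range(T):
        p = np.exp(log_w - log_w.max())
        p /= p.sum()                         # current distribution over experts
        total_loss += p @ loss_matrix[t]     # learner's expected loss this round
        log_w -= eta * loss_matrix[t]        # exponential-weights update
    return total_loss

rng = np.random.default_rng(1)
losses = rng.random((1000, 5))               # toy iid losses for 5 experts
print(hedge(losses), losses.sum(axis=0).min())  # learner vs best fixed expert
```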

Relevance: 20.00%

Abstract:

We aim to design strategies for sequential decision making that adjust to the difficulty of the learning problem. We study this question both in the setting of prediction with expert advice and for more general combinatorial decision tasks. We are not satisfied with just guaranteeing minimax regret rates; we want our algorithms to perform significantly better on easy data. Two popular ways to formalize such adaptivity are second-order regret bounds and quantile bounds. The underlying notions of "easy data", which may be paraphrased as "the learning problem has small variance" and "multiple decisions are useful", are synergistic. But even though there are sophisticated algorithms that exploit one of the two, no existing algorithm is able to adapt to both. In this paper we outline a new method for obtaining such adaptive algorithms, based on a potential function that aggregates a range of learning rates (which are essential tuning parameters). By choosing the right prior we construct efficient algorithms, and we show that they reap both benefits by proving the first bounds that are both second-order and incorporate quantiles.
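To make the potential-over-learning-rates idea concrete, here is a toy discretization: each expert's weight averages an exponential potential in its cumulative regret and a second-order companion over a grid of learning rates. The grid, prior, and potential shape are assumptions in the spirit of the abstract, not the paper's exact construction.

```python
import numpy as np

def potential_weights(R, V, etas=np.geomspace(1e-3, 0.5, 25)):
    """Toy 'aggregate over learning rates' weighting: expert k gets weight
    proportional to the average over eta of eta * exp(eta*R[k] - eta**2*V[k]),
    where R[k] is the cumulative instantaneous regret vs expert k and V[k]
    its cumulative square (the second-order quantity)."""
    w = np.zeros_like(R, dtype=float)
    for eta in etas:
        w += eta * np.exp(eta * R - eta**2 * V)
    return w / w.sum()

# usage: after t rounds, R and V summarize the learner's history per expert
R = np.array([3.0, -1.0, 0.5])   # experts that outperformed us gain weight
V = np.array([5.0, 4.0, 6.0])
print(potential_weights(R, V))
```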

Relevance: 20.00%

Abstract:

Given the shift toward energy efficient vehicles (EEVs) in recent years, it is important that the effects of this transition are properly examined. This paper investigates some of these effects by analyzing annual kilometers traveled (AKT) of private vehicle owners in Stockholm in 2008. The difference in emissions associated with EEV adoption is estimated, along with the effect of a congestion-pricing exemption for EEVs on vehicle usage. Propensity score matching is used to compare AKT rates of different vehicle owner groups based on two treatments, EEV ownership and commuting across the cordon, controlling for confounding factors such as demographics. Through this procedure, rebound effects are identified, with some EEV owners found to have driven up to 12.2% further than non-EEV owners. Although some of these differences could be attributed to the congestion-pricing exemption, the results were not statistically significant. Overall, taking into account the lifecycle emissions of each fuel type, average EEV emissions were 50.5% less than average non-EEV emissions, with this reduction offset by 2.0% due to rebound effects. Although it is important for policy-makers to consider the potential for unexpected negative effects in similar transitions, the overall benefit of greatly reduced emissions appears to outweigh any rebound effects present in this case study.
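The matching step can be sketched in a few lines: fit a propensity model for the treatment (here, EEV ownership) on confounders, then compare each treated unit's outcome with its nearest-scoring control. Variable names, the synthetic data, and the one-to-one nearest-neighbour rule are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def psm_att(X, treated, outcome):
    """Minimal propensity-score matching: nearest neighbour on the estimated
    score, one control per treated unit; returns the average treatment
    effect on the treated."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
    effects = []
    for i in t_idx:
        j = c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))]  # closest control
        effects.append(outcome[i] - outcome[j])
    return np.mean(effects)

rng = np.random.default_rng(4)
X = rng.normal(size=(400, 3))                 # confounders (e.g. demographics)
treated = (X @ [0.8, -0.5, 0.3] + rng.normal(size=400) > 0).astype(int)
outcome = 12000 + 900 * treated + 500 * X[:, 0] + rng.normal(0, 300, 400)
print(psm_att(X, treated, outcome))           # roughly recovers the +900 km
```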

Relevance: 20.00%

Abstract:

The development and maintenance of large and complex ontologies are often time-consuming and error-prone. Thus, automated ontology learning and revision have attracted intensive research interest. In data-centric applications where ontologies are designed or automatically learnt from the data, when new data instances are added that contradict the ontology, it is often desirable to incrementally revise the ontology according to the added data. This problem can be intuitively formulated as the problem of revising a TBox by an ABox. In this paper we introduce a model-theoretic approach to such an ontology revision problem by using a novel alternative semantic characterisation of DL-Lite ontologies. We show some desirable properties of our ontology revision. We have also developed an algorithm for reasoning with the ontology revision without computing the revision result. The algorithm is efficient, as its computational complexity is in coNP in the worst case and in PTIME when the size of the new data is bounded.
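For intuition only, here is a deliberately naive, purely syntactic stand-in for "revising a TBox by an ABox": subsumption axioms directly contradicted by the new instance data are dropped. The paper's approach is model-theoretic over DL-Lite and explicitly avoids computing a revision result like this; every name below is hypothetical.

```python
def naive_revise(tbox, abox):
    """Toy revision: drop subsumption axioms (A, B), read 'A is subsumed
    by B', for which some individual is asserted to be in A and in the
    complement of B ('not_B')."""
    facts = set(abox)
    violated = {(a, b) for (a, b) in tbox
                if any(cls == a and ("not_" + b, ind) in facts
                       for cls, ind in abox)}
    return tbox - violated

tbox = {("Student", "Person"), ("Student", "Enrolled")}
abox = {("Student", "alice"), ("not_Enrolled", "alice")}  # contradicts 2nd axiom
print(naive_revise(tbox, abox))   # {('Student', 'Person')}
```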

Relevance: 20.00%

Abstract:

The introduction of dynamic pricing in today's retail market considerably increases customers' energy costs, so customers are compelled to control their loads in response to price variation. This paper proposes a new Home Energy Management technique that helps customers minimize their energy costs by appropriately controlling their loads. This study focuses on Thermostatically Controllable Appliances (TCAs) such as air conditioners and water heaters, as they account for more than 50% of total household energy consumption. The control process uses stochastic dynamic programming, which incorporates uncertainties in price and demand and leads to an accurate selection of appliance settings. This is followed by real-time control of the selected appliances at their optimal settings. Temperature set points of TCAs are adjusted based on a price droop, which reflects the actual cost of energy consumption. Customer satisfaction is maintained within limits using constrained optimization, and considerable energy savings are demonstrated.
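A minimal sketch of the price-droop idea, under the assumption that the droop is a linear function of the deviation of price from a reference, clipped to a comfort band; the gain and band values are made up for illustration, not the paper's tuned settings.

```python
def droop_setpoint(base_setpoint, price, ref_price, gain, comfort_band):
    """Illustrative price-droop adjustment of a TCA temperature set point
    (cooling case): raise the set point as the price rises above the
    reference, clipped to the comfort band."""
    lo, hi = comfort_band
    return min(max(base_setpoint + gain * (price - ref_price), lo), hi)

# e.g. 24 C base, price spikes to 0.45 $/kWh vs a 0.25 reference,
# droop gain of 10 C per $/kWh, comfort band 22-27 C
print(droop_setpoint(24.0, 0.45, 0.25, 10.0, (22.0, 27.0)))   # -> 26.0
```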

Relevance: 20.00%

Abstract:

Smart Card Automated Fare Collection (AFC) data has been extensively exploited to understand passenger behavior, passenger segments, and trip purposes, and to improve transit planning through spatial travel pattern analysis. The literature has evolved from simple to more sophisticated methods, such as from aggregated to individual travel pattern analysis and from stop-to-stop to flexible stop aggregation. However, high computational complexity has limited these methods in practical applications. This paper proposes a new algorithm named Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical Density Based Scanning Algorithm with Noise (DBSCAN), to detect and update daily changes in travel patterns. WS-DBSCAN reduces the classical DBSCAN's quadratic computational complexity to sub-quadratic complexity. A numerical experiment using real AFC data from South East Queensland, Australia, shows that the algorithm requires only 0.45% of the computation time of classical DBSCAN while producing the same clustering results.
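WS-DBSCAN itself is the paper's contribution, but the baseline it accelerates is easy to reproduce. The sketch below clusters synthetic boarding coordinates with classical DBSCAN (scikit-learn); the coordinates, eps, and min_samples are made up for the toy data.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two synthetic clusters of boarding locations around Brisbane-like coordinates
rng = np.random.default_rng(2)
boardings = np.vstack([rng.normal(loc, 0.01, size=(50, 2))
                       for loc in ([-27.47, 153.02], [-27.50, 153.01])])

labels = DBSCAN(eps=0.02, min_samples=5).fit_predict(boardings)
print(np.unique(labels, return_counts=True))   # cluster ids; -1 marks noise
```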

Relevance: 20.00%

Abstract:

Demand for automated gas metal arc welding (GMAW) is growing, and the need for intelligent systems to ensure procedural accuracy is increasing accordingly. To date, welding pool geometry has been the most widely used factor in the quality assessment of intelligent welding systems, but it has recently been found that the Mahalanobis distance (MD) can not only be used for this purpose but is also more efficient. In the present paper, an artificial neural network (ANN) is used to predict the MD parameter, and the advantages and disadvantages of other methods are discussed. The Levenberg-Marquardt algorithm was found to be the most effective training algorithm for the GMAW process. Since the number of neurons plays an important role in optimal network design, trial and error was used to determine that 30 neurons is optimal. The model was investigated with different numbers of layers in a multilayer perceptron (MLP) architecture, and for the aim of this work the optimal result was obtained with a one-layer MLP. Robustness of the system was evaluated by adding noise to the input data and studying its effect on the network's prediction capability. The experiments for this study were conducted in an automated GMAW setup integrated with a data acquisition system and prepared in a laboratory for welding 12-mm-thick steel plate. The accuracy of the network was evaluated by the root mean squared (RMS) error between the measured and estimated values. The low error value (about 0.008) reflects the good accuracy of the model, and the comparison of the ANN predictions with the test data set showed very good agreement, demonstrating the model's predictive power. Therefore, the ANN model presented here for the GMAW process can be used effectively for prediction.
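A sketch of the described network shape: a single hidden layer of 30 neurons predicting an MD-like target and scored by RMS error, on synthetic data. Note one substitution: scikit-learn's MLPRegressor trains with L-BFGS or Adam rather than the Levenberg-Marquardt algorithm the paper found most effective (LM is available in tools such as MATLAB); the input features and target here are stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4))                # e.g. current, voltage, speed, gas flow
y = (X**2).sum(axis=1) + 0.05 * rng.normal(size=500)   # stand-in MD target

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(30,), solver="lbfgs",
                     max_iter=2000, random_state=0).fit(Xtr, ytr)
rms = np.sqrt(np.mean((model.predict(Xte) - yte)**2))  # RMS error, as in the paper
print(f"RMS error: {rms:.3f}")
```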

Relevance: 20.00%

Abstract:

Lipopolysaccharide, which contains the O-specific polysaccharide (OPS) presented on the cell surface, is a major immunogenic structure of the pathogen Yersinia pseudotuberculosis. The OPS contains many repeats of the oligosaccharide O-unit and exhibits a preferred modal chain length that has been shown to be crucial for cell protection in Yersinia. It is well established that the Wzz protein determines the preferred chain length of the OPS, and in its absence the polymerization of O units by the Wzy polymerase is uncontrolled. However, for Y. pseudotuberculosis, a wzz mutation has never been described. In this study, we examine the effect of Wzz loss in Y. pseudotuberculosis serotype O:2a and compare the lipopolysaccharide chain-length profile to that of Escherichia coli serotype O111. In the absence of Wzz, the lipopolysaccharides of the two species showed significant differences in Wzy polymerization. Yersinia pseudotuberculosis O:2a exhibited only OPS with very short chain lengths, which is atypical of the wzz-mutant phenotypes observed for other species. We hypothesise that the Wzy polymerase of Y. pseudotuberculosis O:2a has a unique default activity in the absence of Wzz, revealing that Wzz is required to drive O-unit polymerization to greater lengths.

Relevance: 20.00%

Abstract:

The purpose of this research is to assess the daylight performance of buildings with climate-responsive envelopes of complex geometry that integrate shading devices in the façade. To this end, two case studies are chosen for their complex geometries and integrated daylight devices. The effect of different parameters of the daylight devices is analysed through climate-based daylight metrics.

Relevance: 20.00%

Abstract:

Terra Preta is a site-specific bio-energy project which aims to create a synergy between the public and the pre-existing engineered landscape of Freshkills Park on Staten Island, New York. The project challenges traditional paradigms of public space by proposing a dynamic and ever-changing landscape. The initiative allows the public to self-organise the landscape and to engage in 'algorithmic processes' of growth, harvest and space creation.

Relevance: 20.00%

Abstract:

This paper presents an efficient noniterative method for distribution state estimation using the conditional multivariate complex Gaussian distribution (CMCGD). In the proposed method, the mean and standard deviation (SD) of the state variables are obtained in one step, considering load uncertainties, measurement errors, and load correlations. First, the bus voltages, branch currents, and injection currents are represented by the MCGD using direct load flow and a linear transformation. Then, the mean and SD of the bus voltages, or other states, are calculated using the CMCGD and the estimation-of-variance method. The mean and SD of the pseudo measurements, as well as the spatial correlations between pseudo measurements, are modeled from historical data for different levels of the load duration curve. The proposed method can handle load uncertainties without resorting to time-consuming approaches such as Monte Carlo simulation. Simulation results for two case studies, a six-bus system and a realistic 747-bus distribution network, show the effectiveness of the proposed method in terms of speed, accuracy, and quality compared with the conventional approach.
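The conditioning step rests on the standard Gaussian identity below. This sketch states it for real-valued vectors; the paper works with complex-valued states and pseudo-measurements, so treat this as the textbook core rather than the paper's estimator.

```python
import numpy as np

def conditional_gaussian(mu, Sigma, idx_obs, x_obs):
    """Condition a joint N(mu, Sigma) on observed entries idx_obs = x_obs:
        mu_1|2    = mu_1 + S12 @ inv(S22) @ (x_2 - mu_2)
        Sigma_1|2 = S11 - S12 @ inv(S22) @ S21
    Returns the conditional mean and covariance of the remaining entries
    (their SDs are the square roots of the diagonal)."""
    idx_obs = np.asarray(idx_obs)
    idx_un = np.setdiff1d(np.arange(len(mu)), idx_obs)
    S11 = Sigma[np.ix_(idx_un, idx_un)]
    S12 = Sigma[np.ix_(idx_un, idx_obs)]
    S22 = Sigma[np.ix_(idx_obs, idx_obs)]
    gain = S12 @ np.linalg.inv(S22)
    return mu[idx_un] + gain @ (x_obs - mu[idx_obs]), S11 - gain @ S12.T

mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.5]])
m, C = conditional_gaussian(mu, Sigma, idx_obs=[2], x_obs=np.array([0.5]))
print(m, np.sqrt(np.diag(C)))   # estimated states and their SDs
```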