161 results for log reduction
Abstract:
Reset/inhibitor nets are Petri nets extended with reset arcs and inhibitor arcs. These extensions can be used to model cancellation and blocking. A reset arc allows a transition to remove all tokens from a certain place when the transition fires. An inhibitor arc can stop a transition from being enabled if the place contains one or more tokens. While reset/inhibitor nets increase the expressive power of Petri nets, they also result in increased complexity of analysis techniques. One way of speeding up Petri net analysis is to apply reduction rules. Unfortunately, many of the rules defined for classical Petri nets do not hold in the presence of reset and/or inhibitor arcs. Moreover, new rules can be added. This is the first paper systematically presenting a comprehensive set of reduction rules for reset/inhibitor nets. These rules are liveness and boundedness preserving and are able to dramatically reduce models and their state spaces. It can be observed that most of the modeling languages used in practice have features related to cancellation and blocking. Therefore, this work is highly relevant for all kinds of application areas where analysis is currently intractable.
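To make the reset/inhibitor semantics described above concrete, here is a minimal Python sketch (not taken from the paper) of the enabling and firing rules; the `Transition` class, place names, and marking representation are illustrative only.

```python
# Minimal sketch of reset/inhibitor-net semantics; names are illustrative.
from dataclasses import dataclass, field


@dataclass
class Transition:
    name: str
    inputs: dict = field(default_factory=dict)    # place -> tokens consumed
    outputs: dict = field(default_factory=dict)   # place -> tokens produced
    resets: set = field(default_factory=set)      # places emptied on firing
    inhibitors: set = field(default_factory=set)  # places that must be empty


def enabled(t: Transition, marking: dict) -> bool:
    # Ordinary input arcs require sufficient tokens in each input place.
    if any(marking.get(p, 0) < n for p, n in t.inputs.items()):
        return False
    # An inhibitor arc disables the transition if its place holds any token.
    return all(marking.get(p, 0) == 0 for p in t.inhibitors)


def fire(t: Transition, marking: dict) -> dict:
    assert enabled(t, marking)
    m = dict(marking)
    for p, n in t.inputs.items():
        m[p] -= n
    # A reset arc removes *all* tokens from its place, however many there are.
    for p in t.resets:
        m[p] = 0
    for p, n in t.outputs.items():
        m[p] = m.get(p, 0) + n
    return m


# Example: "cancel" consumes a token from p_ctrl, empties p_work (reset arc),
# and is blocked while p_done holds a token (inhibitor arc).
cancel = Transition("cancel", inputs={"p_ctrl": 1}, outputs={"p_cancelled": 1},
                    resets={"p_work"}, inhibitors={"p_done"})
print(fire(cancel, {"p_ctrl": 1, "p_work": 3, "p_done": 0}))
```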
Abstract:
There is worldwide interest in reducing aircraft emissions. The difficulty of reducing emissions, including water vapour, carbon dioxide (CO2) and oxides of nitrogen (NOx), stems mainly from the fact that a commercial aircraft is usually designed for a particular optimal cruise altitude but may be requested or required to operate at, and deviate to, different altitudes and speeds to achieve a desired or commanded flight plan, resulting in increased emissions. This is a multi-disciplinary problem with multiple trade-offs, such as optimising engine efficiency and minimising fuel burn and emissions while maintaining aircraft separation and air safety. This project presents the coupling of an advanced optimisation technique with mathematical models and algorithms for aircraft emission reduction through flight optimisation. Numerical results show that the method is able to capture a set of useful trade-offs between aircraft range and NOx, and between mission fuel consumption and NOx. In addition, alternative cruise operating conditions, including Mach number and altitude, that produce minimum NOx and CO2 (minimum mission fuel weight) are suggested.
Abstract:
Graphene, functionalized with oleylamine (OA) and soluble in non-polar organic solvents, was produced on a large scale with a high yield by combining the Hummers process for graphite oxidation, an amine-coupling process to make OA-functionalized graphite oxide (OA-GO), and a novel reduction process using trioctylphosphine (TOP). TOP acts as both a reducing agent and an aggregation-prevention surfactant in the reduction of OA-GO in 1,2-dichlorobenzene (DCB). The reduction of OA-GO is confirmed by X-ray photoelectron spectroscopy, Fourier-transform infrared spectroscopy, X-ray diffraction, thermogravimetric analysis, and Raman spectroscopy. The exfoliation of GO, OA-GO, and OA-functionalized graphene (OA-G) is verified by atomic force microscopy. The conductivity of TOP-reduced OA-G, which is deduced from the current–voltage characteristics of a vacuum-filtered thin film, shows that the reduction of functionalized GO by TOP is as effective as the reduction of GO by hydrazine.
Abstract:
Signal-degrading speckle is one factor that can reduce the quality of optical coherence tomography images. We demonstrate the use of a hierarchical model-based motion estimation processing scheme based on an affine motion model to reduce speckle in optical coherence tomography imaging, by image registration and the averaging of multiple B-scans. The proposed technique is evaluated against other methods available in the literature. The results from a set of retinal images show the benefit of the proposed technique, which provides an improvement in signal-to-noise ratio equal to the square root of the number of averaged images, leading to clearer visual information in the averaged image. The benefits of the proposed technique are also explored in the case of ocular anterior segment imaging.
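The register-then-average idea can be sketched as follows; this is a generic illustration, not the paper's hierarchical motion estimator, and it uses OpenCV's ECC alignment with an affine model as a stand-in. The B-scan list and image types are assumed.

```python
# Minimal sketch: affine-register B-scans to a reference, then average.
import cv2
import numpy as np


def register_and_average(bscans):
    """Affine-register each B-scan to the first one and average the stack."""
    ref = bscans[0].astype(np.float32)
    acc = ref.copy()
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    for img in bscans[1:]:
        img = img.astype(np.float32)
        warp = np.eye(2, 3, dtype=np.float32)
        # Estimate the affine warp aligning img to ref (ECC maximisation).
        _, warp = cv2.findTransformECC(ref, img, warp, cv2.MOTION_AFFINE, criteria)
        aligned = cv2.warpAffine(img, warp, (ref.shape[1], ref.shape[0]),
                                 flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
        acc += aligned
    # Averaging N frames with uncorrelated speckle raises SNR by roughly sqrt(N).
    return acc / len(bscans)
```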
Abstract:
-
Abstract:
Gold nanoparticles supported on CeO2 were found to be efficient photocatalysts for three selective reductions of organic compounds at ambient temperatures, under irradiation of visible light; their reduction ability can be tuned by manipulating the irradiation wavelength.
Abstract:
Inspection of solder joints has been a critical process in the electronic manufacturing industry to reduce manufacturing cost, improve yield, and ensure product quality and reliability. This paper proposes the use of the Log-Gabor filter bank, the Discrete Wavelet Transform and the Discrete Cosine Transform for feature extraction from solder joint images on Printed Circuit Boards (PCBs). A distance measure based on the Mahalanobis Cosine metric is also presented for classification of five different types of solder joints. The experimental results show that this methodology achieves high accuracy and well-generalised performance. It can be an effective method to reduce cost and improve quality in the production of PCBs in the manufacturing industry.
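As a rough illustration of classification with a Mahalanobis-Cosine distance, the sketch below whitens feature vectors with the training covariance and then compares them to per-class means by cosine distance. It is a generic sketch, not the paper's implementation; the feature extraction stage (Log-Gabor, DWT, DCT) is assumed to have produced the vectors already.

```python
# Generic sketch of Mahalanobis-Cosine classification over extracted features.
import numpy as np


def fit_whitener(X, eps=1e-8):
    """Return mean and whitening matrix estimated from training features X (n x d)."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T
    return mu, W


def mahalanobis_cosine(a, b):
    """Cosine distance between two already-whitened feature vectors."""
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)


def classify(x, mu, W, class_means):
    """Assign x to the class whose whitened mean is closest in cosine distance."""
    xw = W @ (x - mu)
    return min(class_means, key=lambda c: mahalanobis_cosine(xw, class_means[c]))
```

Here `class_means` is assumed to be a dict mapping each solder-joint class to the whitened mean of its training feature vectors.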
Abstract:
The paper investigates a detailed Active Shock Control Bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil, the RAE 5243, to reduce cruise drag at transonic flow conditions using Evolutionary Algorithms (EAs) coupled to a robust design approach. The position of boundary layer transition (xtr) and the coefficient of lift (Cl) are taken as the uncertain design parameters (250 stochastic samples in total). Two robust design methods are considered: the first is a standard robust design method, which evaluates each design model at 250 stochastic conditions for uncertainty; the second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50, 15) for uncertainty. Numerical results show that the evolutionary optimisation method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes that have low sensitivity and high aerodynamic performance while achieving significant total drag reduction. In addition, the results show the benefit of using the hierarchical robust method for detailed uncertainty design optimisation.
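A conceptual sketch of the two uncertainty-handling strategies is given below. The `toy_total_drag` evaluator is a hypothetical stand-in for the CFD analysis of one SCB design; only the sample sizes (250, 50, 15) and the mean-plus-spread robust fitness idea follow the abstract.

```python
# Conceptual sketch of robust fitness evaluation with hierarchical sampling.
import random
import statistics


def robust_fitness(design, samples, total_drag):
    """Mean + standard deviation of drag over stochastic (xtr, Cl) samples."""
    drags = [total_drag(design, xtr, cl) for xtr, cl in samples]
    return statistics.mean(drags) + statistics.stdev(drags)


def draw_samples(n, xtr_range=(0.0, 0.6), cl_range=(0.4, 0.8)):
    return [(random.uniform(*xtr_range), random.uniform(*cl_range)) for _ in range(n)]


def toy_total_drag(design, xtr, cl):
    # Purely illustrative stand-in for the drag evaluation of one SCB design.
    return design["bump_height"] * (1.0 + 4.0 * (cl - 0.6) ** 2) + 0.02 * xtr


# Standard robust design: every candidate sees the full 250-sample set.
full_set = draw_samples(250)

# Hierarchical sampling: a cheap 15-sample layer screens designs, 50 samples
# refine promising ones, and only the top layer uses all 250 samples.
layer_sets = {"top": full_set, "middle": draw_samples(50), "bottom": draw_samples(15)}

design = {"bump_height": 0.3}
print(robust_fitness(design, layer_sets["bottom"], toy_total_drag))
```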
Abstract:
Background: In-depth understanding of crash risks is essential for developing effective prevention and safety promotion programmes. Traditionally, in-depth investigations of crash risks are conducted using an exposure-controlled study or case-control methodology. However, these studies need either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive data collection effort on a system-wide basis. Aim: The objective of this research is to propose an alternative methodology to investigate crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user group for the same set of explanatory variables. The combination of these two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the quasi-induced exposure technique is a promising methodology and can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
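The two ingredients can be sketched generically in Python with statsmodels: a Poisson (log-linear) model over a crash count table, and a quasi-induced exposure ratio using not-at-fault involvements as the exposure proxy. The column names and file name are hypothetical, and the formula is not the paper's exact model specification.

```python
# Minimal sketch: log-linear model + quasi-induced exposure on aggregated crash counts.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

counts = pd.read_csv("crash_counts.csv")  # illustrative file and column names

# Log-linear (Poisson) model: interaction terms reveal over-represented
# combinations of road-user group and crash circumstance.
loglin = smf.glm(
    "count ~ C(road_user) * C(lighting) + C(road_user) * C(speed_zone)",
    data=counts,
    family=sm.families.Poisson(),
).fit()
print(loglin.summary())

# Quasi-induced exposure: not-at-fault involvements proxy for exposure, so a
# group's relative risk is its share of at-fault crashes divided by its share
# of not-at-fault crashes under the same conditions.
at_fault = counts[counts.at_fault == 1].groupby("road_user")["count"].sum()
not_fault = counts[counts.at_fault == 0].groupby("road_user")["count"].sum()
relative_risk = (at_fault / at_fault.sum()) / (not_fault / not_fault.sum())
print(relative_risk)
```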
Abstract:
This paper presents an analysis of the stream cipher Mixer, a bit-based cipher with structural components similar to the well-known Grain cipher and the LILI family of keystream generators. Mixer uses a 128-bit key and a 64-bit IV to initialise a 217-bit internal state. The analysis focuses on the initialisation function of Mixer and shows that there exist multiple key-IV pairs which, after initialisation, produce the same initial state and consequently generate the same keystream. Furthermore, if the number of iterations of the state update function performed during initialisation is increased, then the number of distinct initial states that can be obtained decreases. It is also shown that there exist some distinct initial states which produce the same keystream, resulting in a further reduction of the effective key space.
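The kind of check described above can be illustrated generically: enumerate key-IV pairs, run the initialisation function, and bucket pairs by the state they produce. The `mixer_init` function here is a hypothetical stand-in (not the real cipher), assumed to return the initial state in a hashable form such as an integer or a tuple of bits.

```python
# Generic sketch of searching for key-IV pairs that collide on the initial state.
from collections import defaultdict
from itertools import product


def find_state_collisions(mixer_init, keys, ivs):
    """Group (key, iv) pairs by the initial state produced by mixer_init."""
    by_state = defaultdict(list)
    for key, iv in product(keys, ivs):
        by_state[mixer_init(key, iv)].append((key, iv))
    # Any bucket with more than one entry is a set of colliding key-IV pairs,
    # all of which would go on to generate the same keystream.
    return {s: pairs for s, pairs in by_state.items() if len(pairs) > 1}
```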