38 results for penalty-based aggregation functions


Relevance:

30.00%

Publisher:

Abstract:

The emergence of pen-based mobile devices such as PDAs and tablet PCs provides a new way to input mathematical expressions to a computer: handwriting, which is much more natural and efficient for entering mathematics. This paper proposes a web-based handwriting mathematics system, called WebMath, for supporting mathematical problem solving. The proposed WebMath system is based on a client-server architecture. It comprises four major components: a standard web server, a handwriting mathematical expression editor, a computation engine and a web browser with an Ajax-based communicator. The handwriting mathematical expression editor adopts a progressive recognition approach for dynamic recognition of handwritten mathematical expressions. The computation engine supports mathematical functions such as algebraic simplification, factorization, integration and differentiation. The web browser provides a user-friendly interface for accessing the system using advanced Ajax-based communication. In this paper, we describe the different components of the WebMath system and present a performance analysis.
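
A minimal sketch of how such a computation engine might be exposed to the browser-side Ajax communicator, assuming a Flask server and SymPy for the symbolic operations (both are illustrative choices; the paper does not name its implementation):

# Hypothetical sketch of a WebMath-style computation endpoint.
# Flask and SymPy are assumptions for illustration, not the paper's actual stack.
from flask import Flask, request, jsonify
import sympy as sp

app = Flask(__name__)

@app.route("/compute", methods=["POST"])
def compute():
    # The Ajax communicator would POST a recognized expression string and an operation.
    data = request.get_json()
    expr = sp.sympify(data["expression"])        # e.g. "x**2 + 2*x + 1"
    op = data.get("operation", "simplify")
    x = sp.Symbol("x")
    operations = {
        "simplify": lambda e: sp.simplify(e),
        "factor": lambda e: sp.factor(e),
        "integrate": lambda e: sp.integrate(e, x),
        "differentiate": lambda e: sp.diff(e, x),
    }
    if op not in operations:
        return jsonify({"error": "unknown operation"}), 400
    return jsonify({"result": str(operations[op](expr))})

if __name__ == "__main__":
    app.run()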

Relevance:

30.00%

Publisher:

Abstract:

We propose and investigate a method for the stable determination of a harmonic function from knowledge of its value and its normal derivative on a part of the boundary of the (bounded) solution domain (Cauchy problem). We reformulate the Cauchy problem as an operator equation on the boundary using the Dirichlet-to-Neumann map. To discretize the obtained operator, we modify and employ a method denoted as Classic II given in [J. Helsing, Faster convergence and higher accuracy for the Dirichlet–Neumann map, J. Comput. Phys. 228 (2009), pp. 2578–2576, Section 3], which is based on Fredholm integral equations and Nyström discretization schemes. Then, for stability reasons, to solve the discretized integral equation we use the method of smoothing projection introduced in [J. Helsing and B.T. Johansson, Fast reconstruction of harmonic functions from Cauchy data using integral equation techniques, Inverse Probl. Sci. Eng. 18 (2010), pp. 381–399, Section 7], which makes it possible to solve the discretized operator equation in a stable way with minor computational cost and high accuracy. With this approach, for sufficiently smooth Cauchy data, the normal derivative can also be accurately computed on the part of the boundary where no data is initially given.
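
In outline, and with notation chosen here rather than taken from the paper: write Gamma_0 for the boundary part carrying the Cauchy data and Gamma_1 for the remainder; the reformulation as an operator equation on the boundary can then be sketched as follows.

% Sketch of the boundary reformulation (notation chosen here, not taken from the paper).
% Cauchy problem: find u harmonic in \Omega with data given only on \Gamma_0 \subset \partial\Omega:
\[
  \Delta u = 0 \ \text{in } \Omega, \qquad u = f \ \text{and} \ \partial_\nu u = g \ \text{on } \Gamma_0 .
\]
% With \varphi the unknown Dirichlet data on \Gamma_1 = \partial\Omega \setminus \overline{\Gamma_0}
% and \Lambda the Dirichlet-to-Neumann map, this becomes an operator equation for \varphi:
\[
  \bigl(\Lambda\bigl(f\,\chi_{\Gamma_0} + \varphi\,\chi_{\Gamma_1}\bigr)\bigr)\big|_{\Gamma_0} = g ,
\]
% where \chi denotes the indicator function of the indicated boundary part.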

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of stable determination of a harmonic function from knowledge of the solution and its normal derivative on a part of the boundary of the (bounded) solution domain. The alternating method is a procedure to generate an approximation to the harmonic function from such Cauchy data and we investigate a numerical implementation of this procedure based on Fredholm integral equations and Nyström discretization schemes, which makes it possible to perform a large number of iterations (millions) with minor computational cost (seconds) and high accuracy. Moreover, the original problem is rewritten as a fixed point equation on the boundary, and various other direct regularization techniques are discussed to solve that equation. We also discuss how knowledge of the smoothness of the data can be used to further improve the accuracy. Numerical examples are presented showing that accurate approximations of both the solution and its normal derivative can be obtained with much less computational time than in previous works.
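
For orientation, the classical alternating procedure can be sketched as follows (this is the standard formulation from the literature, with notation chosen here; the paper's integral-equation implementation is not reproduced).

% Classical alternating method for the Cauchy problem (standard formulation; notation chosen here).
% Cauchy data: u = f and \partial_\nu u = g on \Gamma_0; no data on \Gamma_1.
% Choose an initial guess \eta_0 for the normal derivative on \Gamma_1; for k = 0, 1, 2, \dots:
\[
\begin{aligned}
&\text{(i)}\ \ \text{solve } \Delta u_{2k} = 0 \ \text{in } \Omega, \quad u_{2k} = f \ \text{on } \Gamma_0, \quad \partial_\nu u_{2k} = \eta_k \ \text{on } \Gamma_1;\\
&\text{(ii)}\ \text{solve } \Delta u_{2k+1} = 0 \ \text{in } \Omega, \quad \partial_\nu u_{2k+1} = g \ \text{on } \Gamma_0, \quad u_{2k+1} = u_{2k} \ \text{on } \Gamma_1;\\
&\text{(iii)}\ \text{set } \eta_{k+1} = \partial_\nu u_{2k+1}\big|_{\Gamma_1}.
\end{aligned}
\]
% Eliminating the intermediate solves yields a fixed point equation \eta = T\eta on \Gamma_1,
% which is the form the abstract refers to.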

Relevance:

30.00%

Publisher:

Abstract:

We present a novel market-based method, inspired by retail markets, for resource allocation in fully decentralised systems where agents are self-interested. Our market mechanism requires no coordinating node or complex negotiation. The stability of outcome allocations, those at equilibrium, is analysed and compared for three buyer behaviour models. In order to capture the interaction between self-interested agents, we propose the use of competitive coevolution. Our approach is highly scalable and may be tuned to achieve specified outcome resource allocations. We demonstrate the behaviour of our approach in simulation, where evolutionary market agents act on behalf of service-providing nodes to adaptively price their resources over time in response to market conditions. We show that this leads the system to the predicted outcome resource allocation. Furthermore, when buyers' decision functions degrade gracefully, the system remains stable in the presence of small changes in price. © 2009 The Author(s).
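
The abstract gives no algorithmic detail, so the sketch below is purely illustrative of the kind of decentralised dynamic described (it is not the authors' mechanism): sellers independently adjust prices in response to the demand they see, while buyers choose sellers through a price-sensitive decision function.

# Hypothetical illustration of decentralised, market-style price adaptation.
# This is NOT the authors' mechanism; it only shows sellers adapting prices to
# observed demand while buyers choose via a price-sensitive decision function.
import random
import math

N_SELLERS, N_BUYERS, ROUNDS = 5, 50, 200
prices = [random.uniform(0.5, 2.0) for _ in range(N_SELLERS)]

def choose_seller(prices, sensitivity=3.0):
    # Softmax-style buyer decision: cheaper sellers are chosen more often.
    weights = [math.exp(-sensitivity * p) for p in prices]
    return random.choices(range(len(prices)), weights=weights)[0]

for _ in range(ROUNDS):
    demand = [0] * N_SELLERS
    for _ in range(N_BUYERS):
        demand[choose_seller(prices)] += 1
    for i in range(N_SELLERS):
        # Self-interested adjustment: raise the price when demand is above the
        # fair share, lower it otherwise (no coordinating node involved).
        share = demand[i] / N_BUYERS
        prices[i] *= 1.0 + 0.05 * (share - 1.0 / N_SELLERS)

print("final prices:", [round(p, 3) for p in prices])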

Relevance:

30.00%

Publisher:

Abstract:

Large-scale disasters are constantly occurring around the world, and in many cases evacuation of regions of a city is needed. Operational Research/Management Science (OR/MS) has been widely used in emergency planning for over five decades. Warning dissemination, evacuee transportation and shelter management are three 'Evacuation Support Functions' (ESFs) generic to many hazards. This thesis adopts a case study approach to illustrate the importance of an integrated approach to evacuation planning and, in particular, the role of OR/MS models. In the warning dissemination phase, uncertainty in households' behaviour as 'warning informants' was investigated along with uncertainties in the warning system. An agent-based model (ABM) was developed for ESF-1, with households as agents and 'warning informant' behaviour as the agent behaviour. The model was used to study warning dissemination effectiveness under various conditions of the official channel. In the transportation phase, uncertainties in households' behaviour, such as departure time (a function of ESF-1), means of transport and destination, were investigated. Households could evacuate as pedestrians, by car or by evacuation bus. An ABM was developed to study evacuation performance, measured in evacuation travel time. In this thesis, a holistic approach for planning public evacuation shelters, called the 'Shelter Information Management System' (SIMS), was also developed. A generic allocation framework was developed to match available shelter capacity to shelter demand by considering evacuation travel time; this was formulated using integer programming. In the sheltering phase, the uncertainty in household shelter choices (nearest, allocated or convenient) was studied for its impact on allocation policies using sensitivity analyses. Using analyses from the models and a detailed examination of household states from 'warning to safety', it was found that the three ESFs, though sequential in time, have many interdependencies from the perspective of evacuation planning. This thesis illustrates an OR/MS-based integrated approach that includes, and goes beyond, single-ESF preparedness. The developed approach will help in understanding the inter-linkages of the three evacuation phases and in preparing multi-agency-based evacuation plans.
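
A generic sketch of the kind of assignment-type integer programme involved (notation chosen here, not taken from the thesis): zones i have demand d_i, shelters j have capacity c_j, t_ij is the evacuation travel time from zone i to shelter j, and x_ij counts the evacuees from zone i allocated to shelter j.

% Generic shelter-allocation integer programme (illustrative; notation chosen here).
\[
\min \sum_{i}\sum_{j} t_{ij}\, x_{ij}
\quad \text{subject to} \quad
\sum_{j} x_{ij} = d_i \ \ \forall i, \qquad
\sum_{i} x_{ij} \le c_j \ \ \forall j, \qquad
x_{ij} \in \mathbb{Z}_{\ge 0}.
\]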

Relevance:

30.00%

Publisher:

Abstract:

An injection-locking-based pump recovery system for phase-sensitive amplified links, capable of handling 40 dB effective span loss, is demonstrated. Measurements with 10 GBd DQPSK signals show penalty-free recovery of a pump wave, phase modulated with two sinusoidal RF tones at 0.1 GHz and 0.3 GHz, with 64 dB amplification. The operating power limit of the pump recovery system is experimentally investigated and is found to be governed by the noise transfer and phase modulation transfer characteristics of the injection-locked laser. The corresponding link penalties are explained and quantified. This system enables, for the first time, WDM-compatible phase-sensitive amplified links over significant lengths. © 2013 Optical Society of America.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose to increase residual carrier frequency offset tolerance based on short perfect reconstruction pulse shaping for coherent optical orthogonal frequency division multiplexing. The proposed method suppresses the residual carrier frequency offset induced penalty at the receiver without requiring any additional overhead or exhaustive signal processing. The Q-factor improvement contributed by the proposed method is 1.6 dB and 1.8 dB for the time-frequency localization maximization and out-of-band energy minimization pulse shapes, respectively. Finally, the transmission span gain under the influence of residual carrier frequency offset is approximately 62% with the out-of-band energy minimization pulse shape. © 2014 Optical Society of America.
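
As background (standard OFDM analysis, not taken from the paper): a residual carrier frequency offset \Delta f rotates the received baseband signal, and after the receiver DFT it appears as a common phase error on all subcarriers plus inter-carrier interference, which is the penalty the proposed pulse shaping suppresses.

% Standard model of residual carrier frequency offset in a coherent receiver (background only).
\[
r(t) = s(t)\, e^{j 2\pi \Delta f\, t} + n(t).
\]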

Relevance:

30.00%

Publisher:

Abstract:

Aggregation and caking of particles are common and severe problems in many operations involving the processing of granular materials; granulated sugar is an important example. Prevention of aggregation and caking of granular materials requires a good understanding of moisture migration and caking mechanisms. In this paper, the modeling of solid bridge formation between particles is introduced, based on the migration of atmospheric moisture into containers packed with granular materials through vapor evaporation and condensation. A model for the caking process is then developed, based on the growth of liquid bridges (during condensation) and their hardening into solid bridges (during evaporation). The predicted caking strengths agree well with available experimental data on granulated sugar under storage conditions.

Relevance:

30.00%

Publisher:

Abstract:

In multicriteria decision problems many values must be assigned, such as the importance of the different criteria and the values of the alternatives with respect to subjective criteria. Since these assignments are approximate, it is very important to analyze the sensitivity of results when small modifications of the assignments are made. When solving a multicriteria decision problem, it is desirable to choose a decision function that leads to a solution as stable as possible. We propose here a method based on genetic programming that produces better decision functions than the commonly used ones. The theoretical expectations are validated by case studies. © 2003 Elsevier B.V. All rights reserved.
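
As an illustration of the stability notion involved (a hypothetical sketch, not the paper's method; the paper evolves decision functions with genetic programming), one can perturb the criterion weights of a simple weighted-sum decision function and count how often the preferred alternative changes:

# Hypothetical sensitivity check for a weighted-sum decision function.
# Illustration only; the data and perturbation size are invented.
import random

def weighted_sum(weights, scores):
    return sum(w * s for w, s in zip(weights, scores))

def best_alternative(weights, alternatives):
    return max(range(len(alternatives)),
               key=lambda i: weighted_sum(weights, alternatives[i]))

alternatives = [[0.7, 0.4, 0.9], [0.6, 0.8, 0.5], [0.9, 0.3, 0.6]]  # rows: alternatives
weights = [0.5, 0.3, 0.2]                                           # criterion importances
baseline = best_alternative(weights, alternatives)

TRIALS, changes = 10_000, 0
for _ in range(TRIALS):
    perturbed = [max(0.0, w + random.gauss(0, 0.05)) for w in weights]
    if best_alternative(perturbed, alternatives) != baseline:
        changes += 1

print(f"decision changed in {100 * changes / TRIALS:.1f}% of small perturbations")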

Relevance:

30.00%

Publisher:

Abstract:

In order to study the effect of washcoat composition on lean NOx trap (LNT) aging characteristics, fully formulated monolithic LNT catalysts containing varying amounts of La-stabilized CeO2 (5 wt% La2O3) or CeO2-ZrO2 (Ce:Zr = 70:30) were subjected to accelerated aging on a bench reactor. Subsequent catalyst evaluation revealed that aging resulted in deterioration of the NOx storage, NOx release and NOx reduction functions, whereas the observation of lean phase NO2 slip for all of the aged catalysts indicated that LNT performance was not limited by the kinetics of NO oxidation. After aging, all of the catalysts showed increased selectivity to NH3 in the temperature range 250–450 °C. TEM, H2 chemisorption, XPS and elemental analysis data revealed two main changes which can explain the degradation in LNT performance. First, residual sulfur in the catalysts, present as BaSO4, decreased catalyst NOx storage capacity. Second, sintering of the precious metals in the washcoat was observed, which can be expected to decrease the rate of NOx reduction. Additionally, sintering is hypothesized to result in segregation of the precious metal and Ba phases, resulting in less efficient NOx spillover from Pt to Ba during NOx adsorption, as well as decreased rates of reductant spillover from Pt to Ba and reverse NOx spillover during catalyst regeneration. Spectacular improvement in LNT durability was observed for catalysts containing CeO2 or CeO2-ZrO2 relative to their non-ceria containing analog. This was attributed to (i) the ability of ceria to participate in NOx storage/reduction as a supplement to the main Ba NOx storage component; (ii) the fact that Pt and CeO2(-ZrO2) are not subject to phase segregation; and (iii) the ability of ceria to trap sulfur, resulting in decreased sulfur accumulation on the Ba component.

Relevance:

30.00%

Publisher:

Abstract:

The behaviour of self-adaptive systems can be emergent, which means that the system's behaviour may be seen as unexpected by its customers and its developers. Therefore, a self-adaptive system needs to garner the confidence of its customers, and it also needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, we propose the use of goal-based requirements models at runtime to offer self-explanation of how a system is meeting its requirements. We demonstrate the analysis of run-time requirements models to yield a self-explanation codified in a domain specific language, and discuss possible future work.
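
A minimal sketch of the general idea of deriving an explanation from a runtime goal model (the goal names and output format here are invented for illustration; the paper's requirements models and domain specific language are not reproduced):

# Hypothetical sketch: a tiny goal tree evaluated at runtime yields a textual self-explanation.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    satisfied: bool = True
    subgoals: list = field(default_factory=list)

    def explain(self, indent=0):
        # Walk the goal tree and report the satisfaction status of each goal.
        status = "satisfied" if self.satisfied else "NOT satisfied"
        lines = [" " * indent + f"Goal '{self.name}' is {status}."]
        for g in self.subgoals:
            lines.extend(g.explain(indent + 2))
        return lines

root = Goal("Keep response time under 200 ms", satisfied=False, subgoals=[
    Goal("Scale out when load exceeds threshold", satisfied=True),
    Goal("Shed low-priority requests", satisfied=False),
])

print("\n".join(root.explain()))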

Relevance:

30.00%

Publisher:

Abstract:

A synchronization scheme for a two-channel phase-sensitive amplifier is implemented based on the injection locking of a single InP quantum-dash mode-locked laser. Error-free performance with a penalty of <1 dB is demonstrated for both channels. © 2011 Optical Society of America.

Relevance:

30.00%

Publisher:

Abstract:

An injection-locking-based pump recovery system for phase-sensitively amplified links is proposed and studied experimentally. Measurements with 10 Gbaud DQPSK signals show penalty-free recovery of a pump with 0.8 GHz FWHM bandwidth and 63 dB overall amplification. © 2012 OSA.

Relevance:

30.00%

Publisher:

Abstract:

There is a proliferation of categorization schemes in the scientific literature that have mostly been developed from psychologists' understanding of the nature of linguistic interactions. This has led to problems in defining the question types used by interviewers. Based on the principle that the overarching purpose of an interview is to elicit information, and that questions can function both as actions in their own right and as vehicles for other actions, a Conversation Analysis approach was used to analyse a small number of police interviews. The analysis produced a different categorization of question types; in particular, the conversational turns fell into two functional types: (i) Topic Initiation Questions and (ii) Topic Facilitation Questions. We argue that forensic interviewing requires a switch of focus from the 'words' used by interviewers in question types to the 'function' of conversational turns within interviews.

Relevance:

30.00%

Publisher:

Abstract:

Data fluctuation in multiple measurements of Laser-Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or standardizing the spectra over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, information on spectral data within the normal distribution is retained in the regression model, while information on outliers is restrained or removed. Copper concentration analysis experiments were carried out on 16 certified standard brass samples. The average relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness than quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function had better overall performance, in terms of model robustness and convergence speed, than four known weighting functions.
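
For background, a segmented weighting commonly cited in the WLS-SVM literature down-weights residuals according to a robust estimate of their spread; the paper's improved weighting function is not reproduced here, so the sketch below only illustrates the general form such functions take.

# Commonly cited segmented weighting used in weighted LS-SVM (background sketch only;
# the paper's improved weighting function and thresholds are not reproduced here).
import numpy as np

def segmented_weights(residuals, c1=2.5, c2=3.0, eps=1e-4):
    """Weight each residual by how far it lies from the bulk of the distribution."""
    s = 1.483 * np.median(np.abs(residuals - np.median(residuals)))  # robust scale (MAD)
    z = np.abs(residuals / s)
    w = np.ones_like(z)                  # full weight for residuals within the bulk
    mid = (z > c1) & (z <= c2)
    w[mid] = (c2 - z[mid]) / (c2 - c1)   # linearly down-weight moderate outliers
    w[z > c2] = eps                      # effectively remove gross outliers
    return w

residuals = np.array([0.1, -0.2, 0.05, 3.5, -0.15, 8.0])
print(segmented_weights(residuals))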