35 results for Generalized Monotone Bifunctions

in Deakin Research Online - Australia


Relevance: 30.00%

Abstract:

Attribute-based signature (ABS) is a novel cryptographic primitive that lets a signer sign a message with fine-grained control over identifying information. An ABS reveals only that the verified message was signed by a user whose set of attributes satisfies a predicate; it thus hides all other identifying information while supporting fine-grained control over signing. Many attribute-based signature schemes have been proposed, but most of them are not very efficient. Maji et al. recently presented a complete definition and construction of ABS for monotone predicates and gave three instantiations under their framework. Although the most practical of their instantiations is efficient, it is constructed in the generic group model and has been proved insecure. Okamoto et al. then proposed an attribute-based signature scheme in the standard model that supports generalized non-monotone predicates over access structures; however, their scheme is not efficient in practice. In this paper, we present a framework for ABS together with a detailed security model. Under this framework, we construct an attribute-based signature scheme for monotone predicates in the standard model, taking Waters' signature scheme as the prototype. Compared with Maji's scheme in the generic group model, the proposed scheme is constructed in the standard model; compared with Okamoto's scheme, it is more efficient, with lower computation cost.

Relevance: 20.00%

Abstract:

The need for monotone approximation of scattered data often arises in regression problems where monotonicity is semantically important. One such domain is fuzzy set theory, where membership functions and aggregation operators are order preserving. Least squares polynomial splines provide great flexibility when modeling non-linear functions, but may fail to be monotone. Linear restrictions on spline coefficients provide necessary and sufficient conditions for spline monotonicity. The spline basis is selected so that these restrictions take an especially simple form. The resulting non-negative least squares problem can be solved by a variety of standard, proven techniques. Additional interpolation requirements can also be imposed in the same framework. The method is applied to fuzzy systems, where membership functions and aggregation operators are constructed from empirical data.
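The linear-restriction idea can be sketched as follows. This is a minimal illustration, not the paper's actual spline basis: a non-decreasing piecewise-linear fit whose slope coefficients are constrained non-negative, solved as a non-negative least squares (NNLS) problem with a simple projected-gradient loop (the knot placement and the solver are assumptions of the sketch).

```python
import numpy as np

# Fit f(x) = c0 + sum_i c_i * max(x - t_i, 0). Requiring every slope
# coefficient c_i >= 0 is the linear restriction that guarantees
# monotonicity, turning the fit into an NNLS problem.

def nnls_pg(A, b, iters=20000):
    """Projected-gradient NNLS: minimize ||A c - b||^2 subject to c >= 0."""
    c = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    for _ in range(iters):
        c = np.maximum(c - (A.T @ (A @ c - b)) / L, 0.0)
    return c

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 40))
y = x + 0.05 * rng.normal(size=40)       # noisy samples of a monotone trend

knots = np.linspace(0, 1, 8, endpoint=False)
# Intercept is split as (a - b) with a, b >= 0, so all unknowns are non-negative.
A = np.column_stack([np.ones_like(x), -np.ones_like(x)] +
                    [np.maximum(x - t, 0) for t in knots])
c = nnls_pg(A, y)
fitted = A @ c
assert np.all(np.diff(fitted) >= -1e-9)  # monotone by construction
```

The non-negativity constraint is what makes the method simple: any off-the-shelf NNLS solver could replace the projected-gradient loop.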

Relevance: 20.00%

Abstract:

In this paper we generalize Besag's pseudo-likelihood function for spatial statistical models on a region of a lattice. The correspondingly defined maximum generalized pseudo-likelihood estimates (MGPLEs) are natural extensions of Besag's maximum pseudo-likelihood estimate (MPLE). The MGPLEs connect the MPLE and the maximum likelihood estimate. We carry out experimental calculations of the MGPLEs for spatial processes on the lattice. The simulation results clearly show that the MGPLEs perform better than the MPLE, and the performance of differently defined MGPLEs is compared. The results are also illustrated by application to two real data sets.

Relevance: 20.00%

Abstract:

This paper discusses identification of the parameters of generalized ordered weighted averaging (GOWA) operators from empirical data. Like ordinary OWA operators, GOWA operators are characterized by a vector of weights, as well as by the power to which the arguments are raised. We develop optimization techniques that allow one to fit such operators to observed data. We also generalize these methods to functionally defined GOWA operators and generalized Choquet integral based aggregation operators.
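For reference, the GOWA operator aggregates by reordering the inputs in descending order, raising them to a power p, weighting, and taking the p-th root; the weights and power below are illustrative choices, not fitted values from the paper.

```python
import numpy as np

# GOWA_{w,p}(x) = ( sum_i w_i * y_i^p )^(1/p), where y is x sorted in
# descending order. p = 1 recovers the ordinary OWA operator.

def gowa(x, w, p):
    y = np.sort(np.asarray(x, dtype=float))[::-1]  # descending reorder
    return float(np.dot(w, y ** p)) ** (1.0 / p)

w = np.array([0.5, 0.3, 0.2])                      # illustrative weights
print(gowa([0.2, 0.9, 0.6], w, p=1.0))             # ordinary OWA (p = 1)
print(gowa([0.2, 0.9, 0.6], w, p=2.0))             # quadratic GOWA
```

Fitting, as in the paper, would amount to optimizing w (and possibly p) so that gowa best matches observed input/output pairs.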

Relevance: 20.00%

Abstract:

One major difficulty frustrating the application of linear causal models is that they are not easily adapted to cope with discrete data. This is unfortunate, since most real problems involve both continuous and discrete variables. In this paper, we consider a class of graphical models that allow both continuous and discrete variables, and propose a parameter estimation method and a structure discovery algorithm based on Minimum Message Length and the parameter estimates. Experimental results are given to demonstrate the potential for application of this method.

Relevance: 20.00%

Abstract:

Aggregation operators model various operations on fuzzy sets, such as conjunction, disjunction and averaging. Recently, double aggregation operators have been introduced; they model a multistep aggregation process. The choice of aggregation operator depends on the particular problem and can be made by fitting the operator to empirical data. We examine fitting general aggregation operators using a new method of monotone Lipschitz smoothing, and study the various boundary conditions and constraints that determine specific types of aggregation.

Relevance: 20.00%

Abstract:

Objective: To assess from a health sector perspective the incremental cost-effectiveness of interventions for generalized anxiety disorder (cognitive behavioural therapy [CBT] and serotonin and noradrenaline reuptake inhibitors [SNRIs]) and panic disorder (CBT, selective serotonin reuptake inhibitors [SSRIs] and tricyclic antidepressants [TCAs]).

Method: The health benefit is measured as a reduction in disability-adjusted life years (DALYs), based on effect size calculations from meta-analyses of randomised controlled trials. An assessment against second-stage filters ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') is also undertaken to incorporate additional factors that bear on resource allocation decisions. Costs and benefits are calculated over one year for the eligible population (prevalent cases of generalized anxiety disorder/panic disorder identified in the National Survey of Mental Health and Wellbeing, extrapolated to the Australian population aged 18 years and older in the year 2000). Simulation modelling techniques are used to present 95% uncertainty intervals (UI) around the incremental cost-effectiveness ratios (ICERs).

Results: Compared to current practice, CBT by a psychologist on a public salary is the most cost-effective intervention for both generalized anxiety disorder (A$6900/DALY saved; 95% UI A$4000 to A$12 000) and panic disorder (A$6800/DALY saved; 95% UI A$2900 to A$15 000). Cognitive behavioural therapy results in a greater total health benefit than the drug interventions for both anxiety disorders, although equity and feasibility concerns for CBT interventions are also greater.

Conclusions: Cognitive behavioural therapy is the most effective and cost-effective intervention for generalized anxiety disorder and panic disorder. However, its implementation would require policy change to enable more widespread access to a sufficient number of trained therapists for the treatment of anxiety disorders.

Relevance: 20.00%

Abstract:

In epidemiologic studies, researchers often need to establish a nonlinear exposure-response relation between a continuous risk factor and a health outcome. Furthermore, periodic interviews are often conducted to take repeated measurements from an individual. The authors proposed to use fractional polynomial models to jointly analyze the effects of 2 continuous risk factors on a health outcome. This method was applied to an analysis of the effects of age and cumulative fluoride exposure on forced vital capacity in a longitudinal study of lung function carried out among aluminum workers in Australia (1995-2003). Generalized estimating equations and the quasi-likelihood under the independence model criterion were used. The authors found that the second-degree fractional polynomial models for age and fluoride fitted the data best. The best model for age was robust across different models for fluoride, and the best model for fluoride was also robust. No evidence was found to suggest that the effects of smoking and cumulative fluoride exposure on change in forced vital capacity over time were significant. The trend 1 model, which included the unexposed persons in the analysis of trend in forced vital capacity over tertiles of fluoride exposure, did not fit the data well, and caution should be exercised when this method is used.

Relevance: 20.00%

Abstract:

Citation matching is the problem of extracting bibliographic records from citation lists in technical papers and merging records that represent the same publication. Generally, there are three types of datasets in citation matching: sparse, dense and hybrid. Typical approaches are Joint Segmentation (Jnt-Seg) and Joint Segmentation Entity Resolution (Jnt-Seg-ER). The Jnt-Seg method is effective on sparse datasets but often produces many errors when applied to dense datasets; conversely, the Jnt-Seg-ER method handles dense datasets well but performs poorly when sparse datasets are presented. In this paper we propose an alternative joint inference approach, Generalized Joint Segmentation (Generalized-Jnt-Seg), which can deal effectively with the situation where the dataset type is unknown. In particular, when analyzing hybrid datasets there is often no a priori information for choosing between the Jnt-Seg and Jnt-Seg-ER methods for segmentation and entity resolution, and either method may produce many errors; our method avoids segmentation errors and produces well-formed field boundaries. Experimental results on both types of citation datasets show that our method outperforms many alternative approaches to citation matching.

Relevance: 20.00%

Abstract:

The performance of modified adaptive conjugate gradient (CG) algorithms based on the iterative CG method for adaptive filtering depends strongly on how the correlation matrix and the cross-correlation vector are estimated. Existing implementations of CG algorithms that use data windows of exponential or sliding form suffer either loss of convergence or increased misadjustment. This paper presents and analyzes a new approach to implementing CG algorithms for adaptive filtering based on a generalized data windowing scheme. For the new modified CG algorithms, we show that convergence is accelerated and that misadjustment and tracking capability comparable to those of the recursive least squares (RLS) algorithm are achieved. Computer simulations of a linear system modeling problem demonstrate the improvements brought by the new modifications.
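The computational kernel of such adaptive CG filters is a conjugate-gradient solve of the normal equations R w = p, where R is the estimated input autocorrelation matrix and p the cross-correlation vector. A textbook CG sketch (not the paper's windowed variant) looks like:

```python
import numpy as np

# Conjugate gradient for R w = p with R symmetric positive definite.
# In an adaptive filter, R and p would be re-estimated from windowed data
# at each step; here they are fixed illustrative values.

def cg_solve(R, p, iters=None, tol=1e-10):
    n = len(p)
    w = np.zeros(n)
    r = p - R @ w            # residual
    d = r.copy()             # search direction
    for _ in range(iters or n):
        Rd = R @ d
        alpha = (r @ r) / (d @ Rd)
        w += alpha * d
        r_new = r - alpha * Rd
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return w

R = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy autocorrelation estimate
p = np.array([1.0, 2.0])                 # toy cross-correlation vector
w = cg_solve(R, p)                       # converges in at most n iterations
```

The windowing scheme in the paper changes how R and p are formed from the data stream, not the CG iteration itself.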

Relevance: 20.00%

Abstract:

In this paper we investigate modeling capabilities of Bonferroni means and their generalizations. We will show that weighted Bonferroni means can model the concepts of hard and soft partial conjunction, and lead to several interesting special cases, with quite an intuitive interpretation.
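For reference, the unweighted Bonferroni mean with parameters p and q is BM^{p,q}(x) = (1/(n(n-1)) · Σ_{i≠j} x_i^p x_j^q)^{1/(p+q)}; a direct sketch:

```python
import numpy as np

def bonferroni_mean(x, p, q):
    # BM^{p,q}(x) = ( 1/(n(n-1)) * sum_{i != j} x_i^p x_j^q )^(1/(p+q))
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(x[i] ** p * x[j] ** q
            for i in range(n) for j in range(n) if i != j)
    return (s / (n * (n - 1))) ** (1.0 / (p + q))

# Idempotency: equal inputs aggregate to themselves.
print(bonferroni_mean([0.4, 0.4, 0.4], p=1, q=1))
# Special case: q = 0 reduces BM to the arithmetic mean.
print(bonferroni_mean([0.1, 0.5, 0.9], p=1, q=0))
```

The weighted variants studied in the paper attach weights to each (i, j) pair; the averaging-over-pairs structure is what gives the operator its partial-conjunction behaviour.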

Relevance: 20.00%

Abstract:

In information theory, entropies form the basis for distance and divergence measures between probability densities. In this paper we propose a novel metric for detecting DDoS attacks in networks, using the generalized (Rényi) entropy of order α to distinguish DDoS attack traffic from legitimate network traffic effectively. Our approach not only detects DDoS attacks earlier (one hop earlier than the Shannon metric for order α=2, and two hops earlier for order α=10) but also clearly reduces both the false positive rate and the false negative rate compared with the traditional Shannon entropy metric approach.
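The Rényi entropy of order α is H_α(p) = log(Σ_i p_i^α)/(1−α), recovering Shannon entropy as α → 1. A small sketch of the property the metric relies on (the example distributions are illustrative, not traffic data):

```python
import numpy as np

def renyi_entropy(p, alpha):
    # H_alpha(p) = log2( sum_i p_i^alpha ) / (1 - alpha) for alpha != 1;
    # the limit alpha -> 1 is the Shannon entropy.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if alpha == 1.0:
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

uniform = np.ones(8) / 8
print(renyi_entropy(uniform, 2))   # 3.0 bits: all orders agree when uniform

skewed = np.array([0.7, 0.1, 0.1, 0.05, 0.05])
# H_alpha is non-increasing in alpha; higher alpha weights the dominant
# probabilities more, widening the gap between concentrated (attack-like)
# and near-uniform (legitimate) traffic distributions.
print(renyi_entropy(skewed, 2) < renyi_entropy(skewed, 1))
```

This widening gap is what lets a higher-order entropy metric flag concentrated attack traffic earlier than the Shannon (α = 1) baseline.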

Relevance: 20.00%

Abstract:

In this paper we provide a systematic investigation of a family of composed aggregation functions that generalize the Bonferroni mean. These extensions of the Bonferroni mean can model the concepts of hard and soft partial conjunction and disjunction, as well as k-tolerance and k-intolerance. Several interesting special cases admit quite intuitive interpretations for applications.

Relevance: 20.00%

Abstract:

We advance the theory of aggregation operators and introduce non-monotone aggregation methods based on minimizing a penalty for disagreement among the inputs. The application in mind is processing data sets that may contain noisy values. Our aim is to filter out noise while at the same time preserving signs of unusual values. We review various methods of robust estimation of location, and then introduce a new estimator based on penalty minimization.
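Penalty-based aggregation can be sketched as choosing the output y that minimizes Σ_i ρ(x_i − y): ρ(t) = t² recovers the arithmetic mean, while ρ(t) = |t| gives the median, a classic robust estimator of location. The grid search below is an illustrative assumption, not the paper's estimator:

```python
import numpy as np

# The aggregate is the minimizer of a penalty P(x, y) = sum_i rho(x_i - y).
# A dense grid search stands in for a proper optimizer in this sketch.

def penalty_aggregate(x, rho, n_grid=10001):
    x = np.asarray(x, dtype=float)
    grid = np.linspace(x.min(), x.max(), n_grid)
    penalties = [np.sum(rho(x - y)) for y in grid]
    return float(grid[int(np.argmin(penalties))])

data = np.array([0.2, 0.25, 0.3, 0.28, 5.0])       # one gross outlier
mean_like = penalty_aggregate(data, lambda t: t ** 2)
median_like = penalty_aggregate(data, np.abs)
print(mean_like)     # quadratic penalty: pulled toward the outlier (the mean)
print(median_like)   # absolute penalty: stays near 0.28, robust to the outlier
```

Choosing other penalty functions ρ yields the family of robust, possibly non-monotone aggregation operators the abstract describes.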

Relevance: 20.00%

Abstract:

Multiwavelets are wavelets with multiplicity r, that is, r scaling functions and r wavelets, which define a multiresolution analysis similar to that of scalar wavelets. They are advantageous over scalar wavelets because they can simultaneously possess symmetry and orthogonality. In this work, a new method for constructing multiwavelets with any approximation order is presented. The method involves deriving a matrix equation for the desired approximation order; the condition for approximation order is similar to the conditions in the scalar case. Generalized left eigenvectors give the combinations of scaling functions required to reconstruct the desired spline or super function. The method is demonstrated by constructing a specific class of symmetric and non-symmetric multiwavelets with different approximation orders, including Geronimo-Hardin-Massopust (GHM), Daubechies-like and Alpert-like multiwavelets, as parameterized solutions. All multiwavelets constructed in this work possess the desirable properties of orthogonality, approximation order and short support.