998 results for risk minimization
Abstract:
Images obtained through fluorescence microscopy at low numerical aperture (NA) are noisy and have poor resolution. Images of specimens such as F-actin filaments obtained using confocal or widefield fluorescence microscopes contain directional information, and it is important that an image smoothing or filtering technique preserve this directionality. F-actin filaments are widely studied in pathology because abnormalities in actin dynamics play a key role in the diagnosis of cancer, cardiac diseases, vascular diseases, myofibrillar myopathies, neurological disorders, etc. We develop the directional bilateral filter as a means of filtering out the noise in the image without significantly altering the directionality of the F-actin filaments. The bilateral filter is anisotropic to start with, but we add a further degree of anisotropy by employing an oriented domain kernel for smoothing. The orientation is locally adapted using a structure tensor, and the parameters of the bilateral filter are optimized within the framework of statistical risk minimization. We show that the directional bilateral filter has better denoising performance than the traditional Gaussian bilateral filter and other denoising techniques such as SURE-LET, non-local means, and guided image filtering at various noise levels in terms of peak signal-to-noise ratio (PSNR). We also show quantitative improvements in low NA images of F-actin filaments. (C) 2015 Author(s). All article content, except where otherwise noted, is licensed under a Creative Commons Attribution 3.0 Unported License.
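As a rough illustration of the oriented domain kernel, the sketch below rotates an anisotropic Gaussian to a per-pixel orientation (which in the paper comes from a structure tensor) and combines it with the usual range kernel. The function name and parameter values are ours; this is a minimal sketch of the idea, not the authors' implementation.

import numpy as np

def oriented_bilateral_filter(img, theta, sigma_u=3.0, sigma_v=1.0,
                              sigma_r=0.1, half=5):
    # img: float image; theta: per-pixel orientation in radians, e.g. taken
    # from the dominant eigenvector of a structure tensor (assumed given).
    # sigma_u / sigma_v: domain spreads along / across the local orientation;
    # sigma_r: range (intensity) spread controlling edge preservation.
    H, W = img.shape
    pad = np.pad(img, half, mode='reflect')
    out = np.empty_like(img)
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    for i in range(H):
        for j in range(W):
            c, s = np.cos(theta[i, j]), np.sin(theta[i, j])
            u = c * xx + s * yy            # coordinate along the filament
            v = -s * xx + c * yy           # coordinate across it
            domain = np.exp(-0.5 * ((u / sigma_u) ** 2 + (v / sigma_v) ** 2))
            patch = pad[i:i + 2 * half + 1, j:j + 2 * half + 1]
            rng = np.exp(-0.5 * ((patch - img[i, j]) / sigma_r) ** 2)
            w = domain * rng
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out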
Abstract:
Quantile regression refers to the process of estimating the quantiles of a conditional distribution and has many important applications within econometrics and data mining, among other domains. In this paper, we show how to estimate these conditional quantile functions within a Bayes risk minimization framework using a Gaussian process prior. The resulting non-parametric probabilistic model is easy to implement and allows non-crossing quantile functions to be enforced. Moreover, it can directly be used in combination with tools and extensions of standard Gaussian Processes such as principled hyperparameter estimation, sparsification, and quantile regression with input-dependent noise rates. No existing approach enjoys all of these desirable properties. Experiments on benchmark datasets show that our method is competitive with state-of-the-art approaches. © 2009 IEEE.
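For readers unfamiliar with the risk being minimized here, the pinball (check) loss is the standard loss whose minimizer is the conditional tau-quantile. The snippet below is a self-contained numerical check of that fact, not the paper's Gaussian process model:

import numpy as np

def pinball_loss(y, f, tau):
    # Check loss: its expected value is minimized by the tau-quantile of y.
    r = y - f
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

rng = np.random.default_rng(0)
y = rng.standard_normal(100_000)
grid = np.linspace(-3, 3, 601)
tau = 0.9
best = grid[np.argmin([pinball_loss(y, f, tau) for f in grid])]
print(best, np.quantile(y, tau))   # both close to 1.2816 for N(0, 1)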
Abstract:
Accurate network traffic classification is fundamental to many network research tasks and has long been a hot topic in the network measurement field. In recent years, applying machine learning methods to the traffic classification problem has become an emerging research direction in this area. Naive Bayes (NB) and its refinements are the most widely applied methods in current research; they are simple to implement and efficient at classification, but they depend excessively on the distribution of the sample space and are therefore inherently unstable. To address this, we propose a traffic classification method based on the support vector machine (SVM). Using nonlinear transformations and the structural risk minimization (SRM) principle, the method converts the traffic classification problem into a quadratic optimization problem and achieves good classification accuracy and stability. Building on a theoretical analysis, comparative experiments against the naive Bayes algorithm on real network flow datasets show that the SVM approach to traffic classification has three advantages: 1) flow attributes need not satisfy the conditional independence assumption, so no attribute filtering is required; 2) high classification accuracy is maintained even when prior knowledge is relatively scarce; and 3) the method does not depend on the distribution of the sample space and therefore offers better classification stability.
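The general recipe can be sketched with scikit-learn in a few lines; the flow features and class labels below are random stand-ins (so accuracy is near chance), not the paper's traffic data, feature set, or kernel configuration:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((1000, 4))            # stand-in flow attributes (packet
y = rng.integers(0, 2, 1000)         # counts, sizes, ...); stand-in classes

clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
clf.fit(X[:800], y[:800])
print(clf.score(X[800:], y[800:]))   # held-out accuracy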
Abstract:
This dissertation was carried out in collaboration with the Monteiro, Ribas business group and had as its main objectives an assessment of the best available techniques for industrial refrigeration and for emissions resulting from storage. The first objective covered all Monteiro, Ribas facilities, while the second focused on Monteiro, Ribas, Embalagens Flexíveis, S.A. To meet these objectives, a survey was first made of the best available techniques presented in the respective reference documents. The techniques suited to the conditions and facilities under study were then selected, and an evaluation was performed to verify the degree of implementation of the measures suggested in the BREF (Best Available Techniques Reference Document). Regarding industrial refrigeration systems, almost all the measures referenced in the respective reference document were found to be in place. This is because the refrigeration systems at the Monteiro, Ribas industrial complex are relatively recent: they were installed in 2012 and feature a modern, highly efficient design. With regard to the storage of hazardous chemicals, the facility under study shows some non-conformities, since most of the techniques mentioned in the BREF have not been implemented. It was therefore necessary to carry out an environmental risk assessment using the methodology proposed by the Spanish standard UNE 150008:2008 (Environmental Risk Analysis and Assessment). Several risk scenarios were formulated and the risks for Monteiro, Ribas Embalagens Flexíveis S.A. were quantified, with the risks assessed as moderate to high. Finally, some risk prevention and minimization measures were suggested for the facility: for example, the hazardous waste yard should be equipped with spill containment kits (absorbent material), emergency procedures and safety data sheets, and the fire extinguisher should be placed in a clearly visible location. For the transport of hazardous waste to that yard, the use of portable spill containment basins and spill containment kits is advisable. As for the hazardous chemicals warehouse, it is recommended that it be redesigned taking into account the BAT presented in subchapter 5.2.3 of this dissertation.
Abstract:
This thesis deals with option pricing and hedging in a regime-switching exponential Lévy model. Such a model is built on a Markov additive process, much as the Black-Scholes model is built on Brownian motion. Because several sources of randomness are present, the market is incomplete, which renders inoperative the theoretical developments initiated by Black, Scholes and Merton for complete markets. We show in this thesis that results from the theory of Markov additive processes provide solutions to these pricing and hedging problems. In particular, we characterize the martingale measure that minimizes the relative entropy with respect to the historical probability measure, and, under certain conditions, we explicitly derive the optimal portfolio that allows an agent to locally minimize the associated quadratic risk. Moreover, from a more practical perspective, we characterize the price of a European option as the unique viscosity solution of a system of nonlinear integro-differential equations. This is a first step towards constructing numerical schemes to approximate that price.
Abstract:
In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined by Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
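Schematically, and in our own notation (a paraphrase, not the paper's exact statement), the two programs being related are an epsilon-insensitive regularized regression problem and an L1-penalized basis pursuit de-noising problem:

\min_{f \in \mathcal{H}} \; \sum_{i=1}^{\ell} \lvert y_i - f(x_i) \rvert_{\varepsilon} + \lambda \lVert f \rVert_K^2
\qquad \text{vs.} \qquad
\min_{a} \; \tfrac{1}{2} \lVert y - \Phi a \rVert_2^2 + \lambda \lVert a \rVert_1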
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem is very challenging: the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Training problems arising in some real applications with large data sets are therefore impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems, together with the evaluation of stopping criteria for the algorithm. We review previous approaches and present results and important details of our implementation of the algorithm, which uses a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
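To illustrate the decomposition idea in its extreme case (working sets of size two, in the style of Platt's SMO, rather than the second-order reduced-gradient sub-problem solver described above), here is a compact simplified sketch:

import numpy as np

def smo_train(K, y, C=1.0, tol=1e-3, max_passes=20):
    # K: precomputed kernel Gram matrix; y: labels in {-1, +1}.
    n = len(y)
    alpha, b, passes = np.zeros(n), 0.0, 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]    # KKT violation check
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.randint(n - 1)
                if j >= i:
                    j += 1                           # random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai, aj = alpha[i], alpha[j]
                if y[i] != y[j]:
                    L, H = max(0.0, aj - ai), min(C, C + aj - ai)
                else:
                    L, H = max(0.0, ai + aj - C), min(C, ai + aj)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(aj - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj) < 1e-5:
                    continue
                alpha[i] = ai + y[i] * y[j] * (aj - alpha[j])
                # Update the bias from whichever multiplier is strictly inside the box.
                b1 = b - Ei - y[i] * (alpha[i] - ai) * K[i, i] - y[j] * (alpha[j] - aj) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai) * K[i, j] - y[j] * (alpha[j] - aj) * K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b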
Abstract:
This paper discusses some aspects of hunter-gatherer spatial organization in southern South Patagonia in times later than 10,000 cal yr BP. Various methods of spatial analysis, elaborated with a Geographic Information System (GIS), were applied to the distributional pattern of archaeological sites with radiocarbon dates. The shift in the distributional pattern of chronological information was assessed in conjunction with other lines of evidence within a biogeographic framework. Accordingly, the varying degrees of occupation and integration of coastal and interior spaces in human spatial organization are explained in association with the adaptive strategies hunter-gatherers have used over time. Both are part of the same human response to variability in risk and uncertainty across the region in terms of resource availability and environmental dynamics.
Abstract:
The effect of multiplicative noise on a signal, compared with that of additive noise, is very large. In this paper, we address the problem of suppressing multiplicative noise in one-dimensional signals. To deal with signals corrupted by multiplicative noise, we propose a denoising algorithm based on minimization of an unbiased estimator (MURE) of mean-square error (MSE). We derive an expression for an unbiased estimate of the MSE. The proposed denoising is carried out in the wavelet domain (soft thresholding) by considering a time-domain MURE. The parameters of the thresholding function are obtained by minimizing the unbiased estimator MURE. We show that the parameters that are optimal for MURE are very close to the optimal parameters obtained with the oracle MSE. Experiments show that the SNR improvement of the proposed denoising algorithm is competitive with a state-of-the-art method.
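The wavelet-domain soft-thresholding step can be sketched as follows, assuming the PyWavelets package; in the paper the threshold parameters are chosen by minimizing the MURE estimate, which is not reproduced here, so the threshold is simply exposed as an argument:

import pywt

def soft_threshold_denoise(x, thresh, wavelet='db4', level=4):
    # Decompose, soft-threshold the detail coefficients, reconstruct.
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft')
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)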
Abstract:
A new model to explain animal spacing, based on a trade-off between foraging efficiency and predation risk, is derived from biological principles. The model is able to explain not only the general tendency for animal groups to form, but some of the attributes of real groups. These include the independence of mean animal spacing from group population, the observed variation of animal spacing with resource availability and also with the probability of predation, and the decline in group stability with group size. The appearance of "neutral zones" within which animals are not motivated to adjust their relative positions is also explained. The model assumes that animals try to minimize a cost potential combining the loss of intake rate due to foraging interference and the risk from exposure to predators. The cost potential describes a hypothetical field giving rise to apparent attractive and repulsive forces between animals. Biologically based functions are given for the decline in interference cost and increase in the cost of predation risk with increasing animal separation. Predation risk is calculated from the probabilities of predator attack and predator detection as they vary with distance. Using example functions for these probabilities and foraging interference, we calculate the minimum cost potential for regular lattice arrangements of animals before generalizing to finite-sized groups and random arrangements of animals, showing optimal geometries in each case and describing how potentials vary with animal spacing. (C) 1999 Academic Press.
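As a toy illustration of such a cost potential (the functional forms and constants below are stand-ins, not the biologically based functions derived in the paper), an interference term that decays with separation plus a predation-risk term that grows with separation yields an interior minimum, i.e. a preferred spacing:

import numpy as np

def cost_potential(r, a=1.0, b=0.5, c=1.0, d=0.2):
    interference = a * np.exp(-b * r)      # foraging interference falls with r
    predation = c * (1 - np.exp(-d * r))   # exposure to predators rises with r
    return interference + predation

r = np.linspace(0.1, 20, 2000)
print('minimum-cost spacing:', r[np.argmin(cost_potential(r))])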
Abstract:
This paper introduces the discrete choice modelling paradigm of Random Regret Minimisation (RRM) to the field of health economics. RRM is a regret-based model that explores a driver of choice different from the traditional utility-based Random Utility Maximisation (RUM). The RRM approach is based on the idea that, when choosing, individuals aim to minimise their regret, regret being defined as what one experiences when a non-chosen alternative in a choice set performs better than the chosen one in relation to one or more attributes. Analysing data from a discrete choice experiment on diet, physical activity and risk of a fatal heart attack in the next ten years, administered to a sample of the Northern Ireland population, we find that the combined use of RUM and RRM models offers additional information, providing useful behavioural insights for better informed policy appraisal.
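For reference, a common attribute-level specification of regret in the RRM literature (Chorus's formulation; the paper may use a variant) compares each alternative with every other alternative on every attribute, and choice probabilities follow a logit on negative regret:

import numpy as np

def regret(X, beta):
    # X: (J alternatives, M attributes); beta: attribute weights.
    J = X.shape[0]
    R = np.zeros(J)
    for i in range(J):
        for j in range(J):
            if j != i:
                R[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
    return R

X = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]])   # stand-in attributes
beta = np.array([0.5, 0.8])                          # stand-in weights
R = regret(X, beta)
p = np.exp(-R) / np.exp(-R).sum()                    # logit on -regret
print(p)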
Abstract:
Background: High risk medications are commonly prescribed to older US patients. Currently, less is known about high risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high risk medication prescribing in a subset of the older UK population (community/institutionalized) to inform harm minimization efforts. Methods: Three cross-sectional samples from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) in fiscal years 2003/04, 2007/08 and 2011/12 were taken. This yielded a sample of 13,900 people aged 65 years or over from 504 UK general practices. High risk medications were defined by the 2012 Beers Criteria adapted for the UK. Using descriptive statistical methods and regression modelling, we determined the prevalence and correlates of 'any' (drugs prescribed at least once per year) and 'long-term' (drugs prescribed in all quarters of the year) high risk medication prescribing. Results: While polypharmacy rates have risen sharply, high risk medication prevalence has remained stable across a decade. A third of older (65+) people are exposed to high risk medications, but only half of the total prevalence was long-term (any = 38.4 % [95 % CI: 36.3, 40.5]; long-term = 17.4 % [15.9, 19.9] in 2011/12). Long-term, but not any, high risk medication exposure was associated with older ages (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated. Ten drugs/drug classes accounted for most high risk medication prescribing in 2011/12. Conclusions: High risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of patients receiving high risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high risk medication use in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high risk medication use may need to target short-term and long-term use separately.
Abstract:
In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but will depend on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
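In schematic form (our notation, not the paper's), the three formulations differ in what is minimized and what is constrained: DDO enforces safety factors, RBDO enforces admissible failure probabilities, and RO folds expected failure costs into the objective itself:

\text{DDO:}\quad \min_d\, C_{\mathrm{man}}(d) \ \ \text{s.t.}\ \lambda_k(d) \ge \lambda_k^{\min}
\text{RBDO:}\quad \min_d\, C_{\mathrm{man}}(d) \ \ \text{s.t.}\ P_f(d) \le P_f^{\mathrm{adm}}
\text{RO:}\quad \min_d\, C_{\mathrm{man}}(d) + \textstyle\sum_k P_{f,k}(d)\, C_{\mathrm{fail},k}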
Abstract:
The main purpose of this paper is to propose a methodology for obtaining a hedge fund tail risk measure. Our measure builds on the methodologies proposed by Almeida and Garcia (2015) and Almeida, Ardison, Garcia, and Vicente (2016), which rely on solving dual minimization problems for Cressie-Read discrepancy functions in spaces of probability measures. Given the recently documented robustness of the Hellinger estimator (Kitamura et al., 2013), we adopt this specific discrepancy within the Cressie-Read family as the loss function. From this choice, we derive a minimum-Hellinger risk-neutral measure that correctly prices an observed panel of hedge fund returns. The estimated risk-neutral measure is used to construct our tail risk measure by pricing synthetic out-of-the-money put options on hedge fund returns for ten specific categories. We provide a detailed description of our methodology, extract the aggregate tail risk hedge fund factor for Brazilian funds, and, as a by-product, a set of individual tail risk factors for each specific hedge fund category.
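A toy primal version of the idea (the paper works with the dual Cressie-Read problem; the sizes and simulated returns below are illustrative stand-ins) chooses scenario probabilities q as close as possible to the empirical measure in Hellinger discrepancy while pricing the observed excess returns to zero:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
S, K = 200, 3
R = rng.normal(0.002, 0.03, size=(S, K))   # stand-in excess returns
p = np.full(S, 1.0 / S)                    # empirical (uniform) measure

hellinger = lambda q: np.sum((np.sqrt(q) - np.sqrt(p)) ** 2)
cons = [{'type': 'eq', 'fun': lambda q: q.sum() - 1.0},
        {'type': 'eq', 'fun': lambda q: R.T @ q}]      # E_q[R] = 0
res = minimize(hellinger, p, bounds=[(1e-10, 1.0)] * S, constraints=cons)
q = res.x
# q can then price synthetic out-of-the-money puts on each fund's return,
# e.g. np.maximum(strike - R[:, 0], 0.0) @ q for a hypothetical strike.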
Abstract:
Using the risk measure CVaR in financial analysis has recently become more and more popular. In this paper we apply CVaR to portfolio optimization. The problem is formulated as a two-stage stochastic programming model, and the SRA algorithm, a recently developed heuristic algorithm, is applied to minimize CVaR.
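The scenario-based core of CVaR minimization is often written as the Rockafellar-Uryasev linear program; below is a one-stage sketch with simulated stand-in returns (the paper's two-stage stochastic programming model and the SRA heuristic are not reproduced):

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
S, n, alpha = 500, 4, 0.95
R = rng.normal(0.001, 0.02, size=(S, n))   # scenario returns (stand-ins)

# Variables: weights w (n), VaR proxy t (1), excess losses u (S).
c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1 - alpha) * S))])
# u_s >= loss_s - t with loss_s = -R[s] @ w  =>  -R[s] @ w - t - u_s <= 0
A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])
b_ub = np.zeros(S)
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
b_eq = [1.0]                               # fully invested portfolio
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print('weights:', np.round(res.x[:n], 3), 'CVaR:', round(res.fun, 5))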