20 results for Boolean-like laws. Fuzzy implications. Fuzzy rule-based systems. Fuzzy set theories

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

In recent years, the fight against money laundering has emerged as a key issue of financial regulation. The Wolfsberg Group is an important multistakeholder agreement establishing corporate responsibility (CR) principles against money laundering in a domain where international coordination remains otherwise difficult. The fact that 10 out of the 25 top private banking institutions joined this initiative opens up an interesting puzzle concerning the conditions for the participation of key industry players in the Wolfsberg Group. The article presents a fuzzy-set analysis of seven hypotheses based on firm-level organizational factors, the macro-institutional context, and the regulatory framework. Results from the analysis of these 25 financial institutions show that public ownership of the bank and the existence of a code of conduct are necessary conditions for participation in the Wolfsberg Group, whereas factors related to the type of financial institution, combined with the existence of a blacklist, are sufficient for explaining participation.
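The necessity and sufficiency claims in such a fuzzy-set analysis rest on standard consistency measures. Below is a minimal sketch of those measures with hypothetical membership scores; the article's calibration and data are not reproduced here.

```python
# Standard fuzzy-set QCA consistency measures (a sketch; the membership
# scores below are hypothetical, not the article's data).
def consistency_necessity(x, y):
    """Consistency of 'X is necessary for Y': sum(min(x, y)) / sum(y)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(y)

def consistency_sufficiency(x, y):
    """Consistency of 'X is sufficient for Y': sum(min(x, y)) / sum(x)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

# Hypothetical memberships for five banks: x is the condition (e.g.,
# 'has a code of conduct'), y the outcome (Wolfsberg participation).
x = [0.9, 0.8, 0.6, 0.3, 1.0]
y = [0.8, 0.7, 0.4, 0.2, 0.9]
print(consistency_necessity(x, y))    # values near 1 support necessity
print(consistency_sufficiency(x, y))  # values near 1 support sufficiency
```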

Relevance: 100.00%

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than those of realized returns from portfolio strategies optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization with respect to only a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio returns distribution that second-order stochastically dominates those of virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
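The dominance checks described above can be reproduced mechanically. The sketch below runs a Kolmogorov-Smirnov test and then pointwise first- and second-order comparisons on simulated return series; the data, grid, and quantile range are illustrative, not the thesis's.

```python
# Sketch of the first/second-order stochastic dominance checks described
# above, on simulated return series (illustrative data, not the thesis's).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
r_agg = rng.normal(0.06, 0.10, 1000)     # returns, aggregated measure
r_single = rng.normal(0.04, 0.12, 1000)  # returns, a single measure

# Step 1: are the two distributions different at all?
print(ks_2samp(r_agg, r_single))

# First-order dominance: the empirical CDF of r_agg lies below r_single's.
grid = np.linspace(min(r_agg.min(), r_single.min()),
                   max(r_agg.max(), r_single.max()), 200)

def ecdf(x, g):
    return (x[:, None] <= g).mean(axis=0)

fosd = np.all(ecdf(r_agg, grid) <= ecdf(r_single, grid))

# Second-order dominance via the absolute Lorenz curve: the sequence of
# expected shortfalls over a range of quantiles, compared pointwise.
def abs_lorenz(x, qs):
    s = np.sort(x)
    return np.array([s[: max(1, int(q * len(s)))].mean() * q for q in qs])

qs = np.linspace(0.01, 1.0, 100)
sosd = np.all(abs_lorenz(r_agg, qs) >= abs_lorenz(r_single, qs))
print(fosd, sosd)
```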

Relevance: 100.00%

Abstract:

This paper deals with a phenomenologically motivated magneto-viscoelastic coupled finite strain framework for simulating the curing process of polymers under the application of a coupled magneto-mechanical load. Magneto-sensitive polymers are prepared by mixing micron-sized ferromagnetic particles into uncured polymers. Application of a magnetic field during the curing process causes the particles to align and form chain-like structures, lending an overall anisotropy to the material. Polymer curing is a complex viscoelastic process in which a transformation from fluid to solid occurs over time. During curing, volume shrinkage also occurs due to the packing of polymer chains by chemical reactions. Such reactions impart a continuous change of magneto-mechanical properties that can be modelled by an appropriate constitutive relation in which the temporal evolution of material parameters is considered. To model the shrinkage during curing, a magnetic-induction-dependent approach is proposed which is based on a multiplicative decomposition of the deformation gradient into a mechanical part and a magnetic-induction-dependent volume shrinkage part. The proposed model obeys the relevant laws of thermodynamics. Numerical examples, based on a generalised Mooney-Rivlin energy function, are presented to demonstrate the capability of the model in the case of a magneto-viscoelastically coupled load.
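The decomposition named in the abstract can be written compactly. The isotropic form of the shrinkage part below is an assumption for illustration, not taken from the paper:

```latex
% Multiplicative split of the deformation gradient (sketch; the isotropic
% shrinkage ansatz is an assumption, not the paper's stated form).
F = F_m \, F_s, \qquad
F_s = \bigl( J_s(\mathbf{B}, t) \bigr)^{1/3} \mathbf{I}, \qquad
J_s = \det F_s \le 1,
```

where F_m is the mechanical part, F_s the magnetic-induction-dependent volume shrinkage part, and J_s a shrinkage volume ratio that evolves with the magnetic induction B and the cure time t.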

Relevance: 100.00%

Abstract:

Interactions between the acoustic features of stimuli and experience-based internal models of the environment enable listeners to compensate for the disruptions in auditory streams that are regularly encountered in noisy environments. However, whether auditory gaps are filled in predictively or restored a posteriori remains unclear. The current lack of positive statistical evidence that internal models can actually shape brain activity as real sounds would precludes accepting predictive accounts of the filling-in phenomenon. We investigated the neurophysiological effects of internal models by testing whether single-trial electrophysiological responses to omitted sounds in a rule-based sequence of tones with varying pitch could be decoded from the responses to real sounds, and by analyzing the ERPs to the omissions with data-driven electrical neuroimaging methods. The decoding of the brain responses to different expected, but omitted, tones in both passive and active listening conditions was above chance when based on the responses to real sounds in active listening conditions. Topographic ERP analyses and electrical source estimations revealed that, in the absence of any stimulation, experience-based internal models elicit an electrophysiological activity different from noise and that the temporal dynamics of this activity depend on attention. We further found that the expected change in pitch direction of omitted tones modulated the activity of left posterior temporal areas 140-200 msec after the onset of the omissions. Collectively, our results indicate that, even in the absence of any stimulation, internal models modulate brain activity as real sounds do, indicating that auditory filling-in can be accounted for by predictive activity.
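The cross-decoding logic described above amounts to training a classifier on single-trial responses to real sounds and testing it on omission trials. A minimal sketch follows with scikit-learn on hypothetical epoched EEG data; the study's actual pipeline (electrodes, time windows, classifier) may differ.

```python
# Sketch of cross-decoding: train on real-sound epochs, test on omission
# epochs. All data here are randomly generated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_trials, n_features = 200, 64 * 50  # e.g., 64 channels x 50 time points

X_sound = rng.normal(size=(n_trials, n_features))  # real-sound epochs
y_sound = rng.integers(0, 2, n_trials)             # tone identity (2 classes)
X_omit = rng.normal(size=(80, n_features))         # omission epochs
y_omit = rng.integers(0, 2, 80)                    # expected tone identity

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_sound, y_sound)
# Above-chance accuracy here would indicate that omission responses carry
# tone-specific information predicted by the internal model.
print(clf.score(X_omit, y_omit))
```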

Relevance: 100.00%

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) is a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single and fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N (the number of contributors) and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that rests on categorical assumptions about N.
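The probabilistic strategy can be illustrated on a single locus. The sketch below computes a posterior distribution over N from hypothetical allele frequencies via Bayes' theorem; the paper's model also handles multiple loci and embeds the result in a full decision analysis.

```python
# Sketch of a posterior over the number of contributors N at one locus.
# Allele frequencies and the observed set are hypothetical.
from itertools import combinations

freqs = {"a": 0.3, "b": 0.2, "c": 0.1}  # rates of occurrence (hypothetical)
observed = set(freqs)                    # alleles seen in the mixed stain

def p_exactly(observed, freqs, n_contributors):
    """P(the 2N independent alleles show exactly the observed set),
    by inclusion-exclusion over subsets of the observed alleles."""
    n = 2 * n_contributors
    total = 0.0
    for k in range(len(observed) + 1):
        for subset in combinations(observed, k):
            total += (-1) ** (len(observed) - k) * \
                     sum(freqs[a] for a in subset) ** n
    return max(0.0, total)  # clamp tiny negative float residue

prior = {N: 1 / 4 for N in range(1, 5)}  # uniform prior on N = 1..4
post = {N: prior[N] * p_exactly(observed, freqs, N) for N in prior}
z = sum(post.values())
post = {N: p / z for N, p in post.items()}
print(post)  # a distribution over N, vs. the fixed 'minimum N' (2 here)
```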

Relevance: 100.00%

Abstract:

BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever greater number of articles are being published on this topic. To help curators cope with this growing body of information, we have developed a system which extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences with information on the type and site of modification. A ranked list of protein candidates for the modification is also provided. For PTM extraction, precision varies from 57% to 94%, and recall from 75% to 95%, according to the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated based on the results of large-scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method we have developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted for database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. This system can be accessed at http://eagl.unige.ch/PTM/.
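A toy version of the pattern-matching step conveys the idea: flag sentences that mention a PTM type together with a modification site. The patterns below are illustrative stand-ins; the actual system's rules are far more extensive.

```python
# Toy sketch of pattern-matching PTM extraction (illustrative patterns only).
import re

PTM_PATTERN = re.compile(
    r"\b(phosphorylat\w+|acetylat\w+|methylat\w+|ubiquitinat\w+)\b", re.I)
SITE_PATTERN = re.compile(r"\b(Ser|Thr|Tyr|Lys|Arg)-?(\d+)\b")

def extract_ptm_sentences(text):
    """Return (sentence, ptm, site) triples for sentences matching both rules."""
    hits = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        ptm = PTM_PATTERN.search(sentence)
        site = SITE_PATTERN.search(sentence)
        if ptm and site:
            hits.append((sentence, ptm.group(1), "".join(site.groups())))
    return hits

text = ("AKT1 is phosphorylated at Ser-473 by mTORC2. "
        "The protein localizes to the nucleus.")
print(extract_ptm_sentences(text))  # only the first sentence is extracted
```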

Relevance: 100.00%

Abstract:

A government would like to subsidize an indivisible good. Consumers' valuations of the good vary according to their wealth and benefits from the good. A subsidy scheme may be based on consumers' wealth or benefit information. We translate a wealth-based policy to a benefit-based policy, and vice versa, and give a necessary and sufficient condition for the pair of policies to implement the same assignment: consumers choose to purchase the good under the wealth-based policy if and only if they choose to do so under the translated benefit-based policy. General taxation allows equivalent policies to require the same budget.
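The equivalence in question can be given a compact illustrative form. The sketch below assumes a posted price p and a valuation v(w, b) that depends on wealth w and benefit b; the paper's formulation is more general and need not share these assumptions.

```latex
% Purchase decisions under the two policies (illustrative form only):
\text{wealth-based: buy} \iff v(w, b) + s_W(w) \ge p, \qquad
\text{benefit-based: buy} \iff v(w, b) + s_B(b) \ge p.
```

The translated pair implements the same assignment exactly when the two purchase regions in the (w, b) plane coincide; the paper's necessary and sufficient condition characterizes when such a translation exists.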

Relevance: 100.00%

Abstract:

Aim: Modelling species at the assemblage level is required to make effective forecasts of global change impacts on diversity and ecosystem functioning. Community predictions may be achieved using macroecological properties of communities (MEM) or by stacking individual species distribution models (S-SDMs). To obtain more realistic predictions of species assemblages, the SESAM framework suggests applying successive filters to the initial species source pool by combining different modelling approaches and rules. Here we provide a first test of this framework in mountain grassland communities. Location: The western Swiss Alps. Methods: Two implementations of the SESAM framework were tested: a "Probability ranking" rule based on species richness predictions and raw probabilities from SDMs, and a "Trait range" rule that uses the predicted upper and lower bounds of the community-level distribution of three different functional traits (vegetative height, specific leaf area and seed mass) to constrain a pool of environmentally filtered species from binary SDM predictions. Results: We showed that all independent constraints contributed, as expected, to reducing species richness overprediction. Only the "Probability ranking" rule slightly but significantly improved predictions of community composition. Main conclusion: We tested various ways to implement the SESAM framework by integrating macroecological constraints into S-SDM predictions, and report one that is able to improve compositional predictions. We discuss possible improvements, such as further improving the causality and precision of the environmental predictors, using other assembly rules, and testing other types of ecological or functional constraints.
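The "Probability ranking" rule is simple to state: at each site, keep the R species with the highest SDM probabilities, where R is the species richness predicted by a macroecological model. A minimal sketch with illustrative numbers follows; the study's SDMs and richness model are fitted to field data.

```python
# Sketch of the 'Probability ranking' rule on illustrative data.
import numpy as np

rng = np.random.default_rng(2)
n_sites, n_species = 5, 10
sdm_prob = rng.uniform(size=(n_sites, n_species))  # stacked SDM probabilities
richness = np.array([3, 4, 2, 5, 3])               # predicted richness per site

assemblage = np.zeros_like(sdm_prob, dtype=bool)
for i in range(n_sites):
    top = np.argsort(sdm_prob[i])[::-1][: richness[i]]
    assemblage[i, top] = True  # predicted community composition at site i

print(assemblage.sum(axis=1))  # matches the richness constraint by design
```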

Relevance: 100.00%

Abstract:

Children with Wiskott-Aldrich syndrome (WAS) are often first diagnosed with immune thrombocytopenia (ITP), potentially leading to both inappropriate treatment and the delay of life-saving definitive therapy. WAS is traditionally differentiated from ITP based on the small size of WAS platelets. In practice, microthrombocytopenia is often not present or not appreciated in children with WAS. To develop an alternative method of differentiating WAS from ITP, we retrospectively reviewed all complete blood counts and measurements of immature platelet fraction (IPF) in 18 subjects with WAS and 38 subjects with a diagnosis of ITP treated at our hospital. Examination of peripheral blood smears revealed a wide range of platelet sizes in subjects with WAS. Mean platelet volume (MPV) was not reported in 26% of subjects, and subjects in whom MPV was not reported had lower platelet counts than subjects in whom MPV was reported. Subjects with WAS had a lower IPF than would be expected for their level of thrombocytopenia, and the IPF in subjects with WAS was significantly lower than in subjects with a diagnosis of ITP. Using logistic regression, we developed and validated a rule based on platelet count and IPF that was more sensitive for the diagnosis of WAS than the MPV, and was applicable regardless of the level of platelets or the availability of the MPV. Our observations demonstrate that MPV is often not available in severely thrombocytopenic subjects, which may hinder the diagnosis of WAS. In addition, subjects with WAS have a low IPF, which is consistent with the notion that a platelet production defect contributes to the thrombocytopenia of WAS. Knowledge of this detail of WAS pathophysiology makes it possible to differentiate WAS from ITP with increased sensitivity, thereby allowing a physician to spare children with WAS from inappropriate treatment and to make definitive therapy available in a timely manner.
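The kind of diagnostic rule described above, a logistic regression on platelet count and IPF, can be sketched as follows. The data and resulting coefficients below are simulated placeholders; the study's rule was fitted and validated on its 18 WAS and 38 ITP subjects.

```python
# Sketch of a logistic-regression rule on platelet count and IPF
# (simulated data; not the study's fitted coefficients).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
# Simulated features: [platelet count (x1e9/L), IPF (%)]
itp = np.column_stack([rng.normal(25, 10, 38), rng.normal(12, 4, 38)])
was = np.column_stack([rng.normal(25, 10, 18), rng.normal(4, 2, 18)])
X = np.vstack([itp, was])
y = np.array([0] * 38 + [1] * 18)  # 1 = WAS

clf = LogisticRegression(max_iter=1000).fit(X, y)
# A low IPF for a given degree of thrombocytopenia pushes the rule
# toward WAS, consistent with a platelet production defect.
print(clf.predict_proba([[20.0, 3.0]]))  # hypothetical patient
```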

Relevance: 100.00%

Abstract:

PURPOSE: We conducted a comprehensive review of the design, implementation, and outcome of first-in-human (FIH) trials of monoclonal antibodies (mAbs) to clearly determine early clinical development strategies for this class of compounds. METHODS: We performed a PubMed search using appropriate terms to identify reports of FIH trials of mAbs published in peer-reviewed journals between January 2000 and April 2013. RESULTS: A total of 82 publications describing FIH trials were selected for analysis. Only 27 articles (33%) reported the criteria used for selecting the starting dose (SD). Dose escalation was performed using rule-based methods in 66 trials (80%). The median number of planned dose levels was five (range, two to 13). The median of the ratio between the highest planned dose and the SD was 27 (range, two to 3,333). Although in 56 studies (68%) at least one grade 3 or 4 toxicity event was reported, no dose-limiting toxicity was observed in 47 trials (57%). The highest planned dose was reached in all trials, but the maximum-tolerated dose (MTD) was defined in only 13 studies (16%). The median of the ratio between MTD and SD was eight (range, four to 1,000). The recommended phase II dose was indicated in 34 studies (41%), but in 25 (73%) of these trials, this dose was chosen without considering toxicity as the main selection criterion. CONCLUSION: This literature review highlights the broad design heterogeneity of FIH trials testing mAbs. Because of the limited observed toxicity, the MTD was infrequently reached, and therefore, the recommended phase II dose for subsequent clinical trials was only tentatively defined.

Relevance: 100.00%

Abstract:

Profiling miRNA levels in cells with miRNA microarrays is becoming a widely used technique. Although normalization methods for mRNA gene expression arrays are well established, miRNA array normalization has so far not been investigated in detail. In this study we investigate the impact of normalization on data generated with the Agilent miRNA array platform. We have developed a method to select nonchanging miRNAs (invariants) and use them to compute linear regression normalization coefficients or variance stabilizing normalization (VSN) parameters. We compared the invariants normalization to normalization by scaling, quantile, and VSN with default parameters, as well as to no normalization, using samples with strong differential expression of miRNAs (heart-brain comparison) and samples where only a few miRNAs are affected (p53 overexpression in squamous carcinoma cells versus control). All normalization methods performed better than no normalization. Normalization procedures based on the set of invariants and quantile were the most robust over all experimental conditions tested. Our method of invariant selection and normalization is not limited to Agilent miRNA arrays and can be applied to other data sets, including those from one-color miRNA microarray platforms, focused gene expression arrays, and gene expression analysis using quantitative PCR.
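The invariant-based normalization can be sketched in a few lines: pick the miRNAs whose rank varies least across arrays, then regress each array onto a reference using only those invariants. This is an illustrative implementation on simulated data; the study's selection criteria are more elaborate.

```python
# Sketch of invariant selection and linear-regression normalization
# (simulated data; illustrative criteria only).
import numpy as np

rng = np.random.default_rng(4)
data = rng.lognormal(5, 1, size=(200, 6))  # 200 miRNAs x 6 arrays
log_data = np.log2(data)

# Invariants: miRNAs with the smallest rank variance across arrays.
ranks = log_data.argsort(axis=0).argsort(axis=0)
invariant = np.argsort(ranks.var(axis=1))[:30]

reference = log_data[:, 0]  # first array as reference
normalized = np.empty_like(log_data)
for j in range(log_data.shape[1]):
    slope, intercept = np.polyfit(log_data[invariant, j],
                                  reference[invariant], 1)
    normalized[:, j] = slope * log_data[:, j] + intercept

print(normalized.mean(axis=0))  # array means pulled toward the reference
```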

Relevance: 100.00%

Abstract:

Stable isotope labels are routinely introduced into proteomes for quantification purposes. Full labeling of cells in varying biological states, followed by sample mixing, fractionation and intensive data acquisition, is used to obtain accurate large-scale quantification of total protein levels. However, biological processes often affect only a small group of proteins for a short time, resulting in changes that are difficult to detect against the total proteome background. An alternative approach could be the targeted analysis of the proteins synthesized in response to a given biological stimulus. Such proteins can be pulse-labeled with a stable isotope by metabolic incorporation of 'heavy' amino acids. In this study we investigated the specific detection and identification of labeled proteins using acquisition methods based on Precursor Ion Scans (PIS) on a triple-quadrupole ion trap mass spectrometer. PIS-based methods were set to detect unique immonium ions originating from labeled peptides. Different labels and methods were tested in standard mixtures to optimize performance. We showed that, in comparison with an untargeted analysis on the same instrument, the approach allowed a several-fold increase in the specificity of detection of labeled proteins over unlabeled ones. The technique was applied to the identification of proteins secreted by human cells into growth media containing bovine serum proteins, allowing the preferential detection of labeled cellular proteins over unlabeled bovine ones. However, compared with untargeted acquisitions on two different instruments, the PIS-based strategy showed some limitations in sensitivity. We discuss possible perspectives of the technique.
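The immonium ions targeted by such precursor ion scans are easy to compute from residue masses. The sketch below uses a 13C6,15N1-labeled leucine as an example of a common 'heavy' label; the specific labels used in the study may differ.

```python
# Sketch: m/z of the immonium ion for light vs. heavy-labeled leucine.
# Immonium m/z = residue mass - CO + proton.
RESIDUE_MASS_LEU = 113.08406  # monoisotopic residue mass of Leu
CO = 27.99491                 # neutral loss forming the immonium ion
PROTON = 1.00728

C13_SHIFT = 1.00336           # 13C - 12C mass difference
N15_SHIFT = 0.99703           # 15N - 14N mass difference

immonium_light = RESIDUE_MASS_LEU - CO + PROTON
label_shift = 6 * C13_SHIFT + 1 * N15_SHIFT   # 13C6,15N1 label (assumed)
immonium_heavy = immonium_light + label_shift

print(f"light Leu immonium m/z: {immonium_light:.3f}")  # ~86.096
print(f"heavy Leu immonium m/z: {immonium_heavy:.3f}")  # ~93.114
```

A precursor ion scan set on the heavy value selectively flags labeled peptides, since unlabeled peptides produce only the light immonium ion.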

Relevance: 100.00%

Abstract:

The diagnosis of chronic inflammatory demyelinating polyneuropathy (CIDP) is based on a set of clinical and neurophysiological parameters. However, in clinical practice, CIDP remains difficult to diagnose in atypical cases. In the present study, 32 experts from 22 centers (the French CIDP study group) were asked individually to score four typical and seven atypical CIDP observations (TOs and AOs, respectively) reported by other physicians, according to the Delphi method. The diagnoses of CIDP were confirmed by the group in 96.9% of the TOs and 60.1% of the AOs (p < 0.0001). There was a positive correlation between the consensus of CIDP diagnosis and the demyelinating features (r = 0.82, p < 0.004). The European CIDP classification was used in 28.3% of the TOs and 18.2% of the AOs (p < 0.002). The French CIDP study group diagnostic strategy was used in 90% of the TOs and 61% of the AOs (p < 0.0001). In 3% of the TOs and 21.6% of the AOs, the experts had difficulty determining a final diagnosis due to a lack of information. This study shows that a set of criteria and a diagnostic strategy are not sufficient to reach a consensus for the diagnosis of atypical CIDP in clinical practice.

Relevance: 100.00%

Abstract:

Magnetic resonance angiography (MRA) provides a noninvasive means to detect the presence, location and severity of atherosclerosis throughout the vascular system. In such studies, and especially those in the coronary arteries, the vessel luminal area is typically measured at multiple cross-sectional locations along the course of the artery. The advent of fast volumetric imaging techniques covering proximal to mid segments of the coronary arteries necessitates automatic analysis tools requiring minimal manual interaction to robustly measure cross-sectional area along the three-dimensional track of the arteries in under-sampled and non-isotropic datasets. In this work, we present a modular approach based on level set methods to track the vessel centerline, segment the vessel boundaries, and measure the transversal area using two user-selected endpoints in each coronary artery of interest. Arterial area and vessel length are measured using our method and compared to the standard Soap-Bubble reformatting and analysis tool in in vivo non-contrast-enhanced coronary MRA images.
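A level-set segmentation step of the kind described above can be sketched with scikit-image's morphological Chan-Vese implementation on a toy 2D cross-section; the paper's pipeline is 3D, tracks the centerline between two user-selected endpoints, and uses its own level set formulation.

```python
# Sketch: level-set segmentation of a toy vessel cross-section and a
# crude lumen area estimate (illustrative resolution and data).
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Toy cross-sectional image: a bright disk (vessel lumen) plus noise.
yy, xx = np.mgrid[:64, :64]
image = ((yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2).astype(float)
image += np.random.default_rng(5).normal(0, 0.2, image.shape)

# Evolve the level set; the final mask approximates the lumen boundary.
mask = morphological_chan_vese(image, 50, init_level_set="disk")

pixel_area_mm2 = 0.5 * 0.5          # hypothetical in-plane resolution
print(mask.sum() * pixel_area_mm2)  # cross-sectional area estimate
```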