922 results for binary logic
Abstract:
As the calibration and evaluation of flood inundation models are a prerequisite for their successful application, there is a clear need to ensure that the performance measures that quantify how well models match the available observations are fit for purpose. This paper evaluates the binary pattern performance measures that are frequently used to compare flood inundation models with observations of flood extent. This evaluation considers whether these measures are able to calibrate and evaluate model predictions in a credible and consistent way, i.e. identifying the underlying model behaviour for a number of different purposes such as comparing models of floods of different magnitudes or on different catchments. Through theoretical examples, it is shown that the binary pattern measures are not consistent for floods of different sizes, such that for the same vertical error in water level, a model of a flood of large magnitude appears to perform better than a model of a smaller magnitude flood. Further, the commonly used Critical Success Index (usually referred to as F(2)) is biased in favour of overprediction of the flood extent, and is also biased towards correctly predicting areas of the domain with smaller topographic gradients. Consequently, it is recommended that future studies consider carefully the implications of reporting conclusions using these performance measures. Additionally, future research should consider whether a more robust and consistent analysis could be achieved by using elevation comparison methods instead.
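The measure under discussion can be stated compactly. The sketch below (our notation, not the paper's code) computes the Critical Success Index from binary wet/dry maps and illustrates the magnitude dependence described above: the same absolute number of misclassified cells yields a better score for the larger flood.

```python
# Critical Success Index for binary flood extent maps (generic sketch).
# A: cells wet in both model and observation; B: wet in model only
# (overprediction); C: wet in observation only (underprediction).

def critical_success_index(model, observed):
    """model, observed: iterables of 0/1 wet flags, one per grid cell."""
    A = sum(1 for m, o in zip(model, observed) if m and o)
    B = sum(1 for m, o in zip(model, observed) if m and not o)
    C = sum(1 for m, o in zip(model, observed) if not m and o)
    return A / (A + B + C)

# Same absolute error (one miss, one false alarm) scores far better
# for the larger flood, illustrating the inconsistency discussed above.
small_obs = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
small_mod = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]   # 1 hit, 1 miss, 1 false alarm
large_obs = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]
large_mod = [1, 1, 1, 1, 1, 1, 1, 0, 1, 0]   # 7 hits, 1 miss, 1 false alarm
print(critical_success_index(small_mod, small_obs))  # 0.333...
print(critical_success_index(large_mod, large_obs))  # 0.777...
```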
Abstract:
In order to enhance the quality of care, healthcare organisations are increasingly resorting to clinical decision support systems (CDSSs), which provide physicians with appropriate health care decisions or recommendations. However, how to explicitly represent diverse, vague medical knowledge and how to reason effectively in the decision-making process are still open problems. In this paper, we incorporate semiotics into fuzzy logic to enhance CDSSs, with the aim of providing both the ability to describe medical domain concepts contextually and the ability to reason with vague knowledge. A semiotically inspired fuzzy CDSS framework is presented, based on which the vague knowledge representation and reasoning process are demonstrated.
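As a minimal illustration of the kind of vague-knowledge reasoning involved, the sketch below uses triangular membership functions and one max-min (Mamdani-style) rule. The concepts ("fever", "pulse", "risk") and all cut-off values are hypothetical illustrations, not taken from the framework in the paper.

```python
# Minimal fuzzy-reasoning sketch: triangular membership functions and one
# max-min rule. All clinical names and thresholds are hypothetical.

def triangular(a, b, c):
    """Return a triangular membership function rising on [a, b],
    peaking at b, and falling on [b, c]."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

fever_high = triangular(37.0, 39.0, 41.0)   # degrees Celsius
pulse_fast = triangular(90.0, 120.0, 150.0) # beats per minute

def risk_of_infection(temp, pulse):
    # Rule: IF fever is high AND pulse is fast THEN risk is high.
    # AND is modelled by min, as in standard Mamdani inference.
    return min(fever_high(temp), pulse_fast(pulse))

print(risk_of_infection(38.0, 105.0))  # 0.5: both antecedents half-satisfied
```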
Abstract:
This article examines the role played by ideas and their thinkers in Christopher Hill's histories of the English Revolution. Hill protested against a reductionist economic determinism with no place for the intrinsic power of ideas, but his account of ideas gave them a progressive logic parallel to, if not always easy to link with, that of economic development, and threatened to divorce them from their muddled and imperfect thinkers. This account of the logic of ideas had a striking impact on the way in which the more mainstream radicals of the English Revolution appeared in Hill's work, with both the Levellers and James Harrington being half assimilated to, and half pushed aside in favor of, the more thoroughgoing economic radicals who expressed, in however ragged a way, the intrinsic potential of their ideas. However, Hill's writings also betray a surprising attraction to religious over secular forms of radicalism.
Abstract:
This paper presents an approximate closed form sample size formula for determining non-inferiority in active-control trials with binary data. We use the odds-ratio as the measure of the relative treatment effect, derive the sample size formula based on the score test and compare it with a second, well-known formula based on the Wald test. Both closed form formulae are compared with simulations based on the likelihood ratio test. Within the range of parameter values investigated, the score test closed form formula is reasonably accurate when non-inferiority margins are based on odds-ratios of about 0.5 or above and when the magnitude of the odds ratio under the alternative hypothesis lies between about 1 and 2.5. The accuracy generally decreases as the odds ratio under the alternative hypothesis moves upwards from 1. As the non-inferiority margin odds ratio decreases from 0.5, the score test closed form formula increasingly overestimates the sample size irrespective of the magnitude of the odds ratio under the alternative hypothesis. The Wald test closed form formula is also reasonably accurate in the cases where the score test closed form formula works well. Outside these scenarios, the Wald test closed form formula can either underestimate or overestimate the sample size, depending on the magnitude of the non-inferiority margin odds ratio and the odds ratio under the alternative hypothesis. Although neither approximation is accurate for all cases, both approaches lead to satisfactory sample size calculation for non-inferiority trials with binary data where the odds ratio is the parameter of interest.
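For orientation, a common Wald-type sample size approximation on the log odds-ratio scale is sketched below. This is a generic textbook-style reconstruction under the usual asymptotic-normality assumptions, not the score-test formula derived in the paper, and the helper name `n_per_arm` is ours.

```python
# Wald-type sample size sketch for a non-inferiority trial with binary
# outcomes and the odds ratio as effect measure. Generic approximation,
# not the paper's score-test formula.
from math import ceil, log
from statistics import NormalDist

def n_per_arm(p_control, or_alt, or_margin, alpha=0.025, power=0.9):
    """or_margin: non-inferiority margin odds ratio (< 1);
    or_alt: assumed true odds ratio under the alternative."""
    z = NormalDist().inv_cdf
    odds_trt = or_alt * p_control / (1 - p_control)
    p_trt = odds_trt / (1 + odds_trt)
    # asymptotic variance of the estimated log odds ratio (per subject, per arm)
    var = 1 / (p_control * (1 - p_control)) + 1 / (p_trt * (1 - p_trt))
    effect = log(or_alt) - log(or_margin)  # distance from the margin
    return ceil((z(1 - alpha) + z(power)) ** 2 * var / effect ** 2)

# margin OR = 0.5, true OR = 1, 30% control event rate
print(n_per_arm(0.3, 1.0, 0.5))
```

As expected, tightening the margin (moving it toward 1) inflates the required sample size.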
Abstract:
It is believed that eta Carinae is actually a massive binary system, with the wind-wind interaction responsible for the strong X-ray emission. Although the overall shape of the X-ray light curve can be explained by the high eccentricity of the binary orbit, other features like the asymmetry near periastron passage and the short quasi-periodic oscillations seen at those epochs have not yet been accounted for. In this paper we explain these features assuming that the rotation axis of eta Carinae is not perpendicular to the orbital plane of the binary system. As a consequence, the companion star will face eta Carinae on the orbital plane at different latitudes for different orbital phases and, since both the mass-loss rate and the wind velocity are latitude dependent, they would produce the observed asymmetries in the X-ray flux. We were able to reproduce the main features of the X-ray light curve assuming that the rotation axis of eta Carinae forms an angle of 29 degrees +/- 4 degrees with the axis of the binary orbit. We also explained the short quasi-periodic oscillations by assuming nutation of the rotation axis, with an amplitude of about 5 degrees and a period of about 22 days. The nutation parameters, as well as the precession of the apsis, with a period of about 274 years, are consistent with what is expected from the torques induced by the companion star.
Abstract:
Various popular machine learning techniques, like support vector machines, are originally conceived for the solution of two-class (binary) classification problems. However, a large number of real problems present more than two classes. A common approach to generalize binary learning techniques to solve problems with more than two classes, also known as multiclass classification problems, consists of hierarchically decomposing the multiclass problem into multiple binary sub-problems, whose outputs are combined to define the predicted class. This strategy results in a tree of binary classifiers, where each internal node corresponds to a binary classifier distinguishing two groups of classes and the leaf nodes correspond to the problem classes. This paper investigates how measures of the separability between classes can be employed in the construction of binary-tree-based multiclass classifiers, adapting the decompositions performed to each particular multiclass problem. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Several real problems involve the classification of data into categories or classes. Given a data set containing data whose classes are known, Machine Learning algorithms can be employed for the induction of a classifier able to predict the class of new data from the same domain, performing the desired discrimination. Some learning techniques are originally conceived for the solution of problems with only two classes, also named binary classification problems. However, many problems require the discrimination of examples into more than two categories or classes. This paper presents a survey on the main strategies for the generalization of binary classifiers to problems with more than two classes, known as multiclass classification problems. The focus is on strategies that decompose the original multiclass problem into multiple binary subtasks, whose outputs are combined to obtain the final prediction.
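As a concrete instance of the decomposition strategies surveyed, here is a minimal one-vs-rest sketch: one binary scorer is trained per class and the most confident scorer wins. The nearest-centroid base learner is an illustrative stand-in for any binary technique (e.g. an SVM); all names and data are ours.

```python
# One-vs-rest decomposition sketch: one binary scorer per class,
# predictions combined by taking the most confident scorer.

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train_one_vs_rest(X, y):
    scorers = {}
    for c in set(y):
        pos = centroid([x for x, label in zip(X, y) if label == c])
        neg = centroid([x for x, label in zip(X, y) if label != c])
        # score: how much closer x is to the positive centroid than the negative
        scorers[c] = lambda x, pos=pos, neg=neg: sq_dist(x, neg) - sq_dist(x, pos)
    return scorers

def predict(scorers, x):
    return max(scorers, key=lambda c: scorers[c](x))

X = [[0, 0], [0, 1], [5, 5], [5, 6], [10, 0], [10, 1]]
y = ['a', 'a', 'b', 'b', 'c', 'c']
model = train_one_vs_rest(X, y)
print(predict(model, [0.5, 0.5]))  # 'a'
```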
Abstract:
In this paper, we study binary differential equations a(x, y)dy² + 2b(x, y)dxdy + c(x, y)dx² = 0, where a, b, and c are real analytic functions. Following the geometric approach of Bruce and Tari in their work on multiplicity of implicit differential equations, we introduce a definition of the index for this class of equations that coincides with Hopf's classical definition for positive binary differential equations. Our results also apply to implicit differential equations F(x, y, p) = 0, where F is an analytic function, p = dy/dx, F_p = 0, and F_pp ≠ 0 at the singular point. For these equations, we relate the index of the equation at the singular point with the index of the gradient of F and the index of the 1-form ω = dy − p dx defined on the singular surface F = 0.
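For reference, the following is the standard pointwise picture of such an equation (textbook material, not a result of the paper): away from zeros of a, it defines a pair of direction fields via the quadratic formula,

```latex
a(x,y)\,dy^2 + 2b(x,y)\,dx\,dy + c(x,y)\,dx^2 = 0
\qquad\Longrightarrow\qquad
\frac{dy}{dx} = \frac{-b \pm \sqrt{b^2 - ac}}{a} \quad (a \neq 0),
```

with two real directions exactly where the discriminant b² − ac is positive; "positive" binary differential equations, for which Hopf's classical index is defined, are the case where this holds throughout the region of interest.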
Abstract:
The use of liposomes to encapsulate materials has received widespread attention for drug delivery, transfection, diagnostic reagents, and immunoadjuvants. Phospholipid polymers form a new class of biomaterials with many potential applications in medicine and research. Of interest are polymeric phospholipids containing a diacetylene moiety along their acyl chain, since these kinds of lipids can be polymerized by ultraviolet (UV) irradiation to form chains of covalently linked lipids in the bilayer. In particular, the diacetylenic phosphatidylcholine 1,2-bis(10,12-tricosadiynoyl)-sn-glycero-3-phosphocholine (DC8,9PC) can form intermolecular cross-links through the diacetylenic group to produce a conjugated polymer within the hydrocarbon region of the bilayer. As knowledge of liposome structures is fundamental for improving system design for new and better applications, this work focuses on the structural properties of polymerized DC8,9PC:1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC) liposomes. Liposomes containing mixtures of DC8,9PC and DMPC, at different molar ratios and exposed to different numbers of polymerization cycles, were studied through the analysis of the electron spin resonance (ESR) spectra of a spin label incorporated into the bilayer, together with calorimetric data obtained from differential scanning calorimetry (DSC). Upon irradiation, if all lipids had been polymerized, no gel-fluid transition would be expected. However, even samples that went through 20 cycles of UV irradiation presented a DSC band, showing that around 80% of the DC8,9PC molecules were not polymerized. Both DSC and ESR indicated that the two different lipids scarcely mix at low temperatures; however, a few molecules of DMPC are present in DC8,9PC-rich domains and vice versa.
UV irradiation was found to affect the gel-fluid transition of both DMPC-rich and DC8,9PC-rich regions, indicating the presence of polymeric units of DC8,9PC in both areas. A model explaining the rearrangement of the lipids is proposed for this partially polymerized system.
Abstract:
The design of translation-invariant and locally defined binary image operators over large windows is made difficult by decreased statistical precision and increased training time. We present a complete framework for the application of stacked design, a recently proposed technique to create two-stage operators that circumvents that difficulty. We propose a novel algorithm, based on information theory, to find groups of pixels that should be used together to predict the output value. We employ this algorithm to automate the process of creating a set of first-level operators that are later combined in a global operator. We also propose a principled way to guide this combination, by using feature selection and model comparison. Experimental results show that the proposed framework leads to better results than single-stage design. (C) 2009 Elsevier B.V. All rights reserved.
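The information-theoretic grouping idea can be illustrated with a plug-in mutual information estimator: estimate I(G; Y) between a candidate pixel group G and the output value Y from training examples, and prefer groups with high I. This is a generic sketch of the underlying quantity, not the paper's algorithm.

```python
# Plug-in estimate of mutual information between a pixel group and the
# output value, from (group_pattern, output) training pairs.
from collections import Counter
from math import log2

def mutual_information(samples):
    """samples: list of (group_value_tuple, output_bit) pairs."""
    n = len(samples)
    joint = Counter(samples)
    g_marg = Counter(g for g, _ in samples)
    y_marg = Counter(y for _, y in samples)
    mi = 0.0
    for (g, y), c in joint.items():
        # p(g,y) * log2( p(g,y) / (p(g) p(y)) ), with counts c, cg, cy
        mi += (c / n) * log2(c * n / (g_marg[g] * y_marg[y]))
    return mi

# XOR-like dependence: each pixel alone says nothing about the output,
# while the pair determines it completely.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)] * 4
pair_mi = mutual_information(data)
single_mi = mutual_information([((g[0],), y) for g, y in data])
print(pair_mi, single_mi)  # 1.0 0.0
```

Groups like this pair would be selected together, which is exactly the situation univariate pixel ranking misses.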
Abstract:
The logic of proofs (lp) was proposed as Gödel's missed link between Intuitionistic and S4-proofs, but so far the tableau-based methods proposed for lp have not explored this closeness with S4 and contain rules whose analyticity is not immediately evident. We study possible formulations of analytic tableau proof methods for lp that preserve the subformula property. Two sound and complete tableau decision methods of increasing degree of analyticity are proposed, KELP and preKELP. The latter is particularly inspired by S4-proofs. The crucial role of proof constants in the structure of lp-proof methods is analysed. In particular, a method for the abduction of proof constant specifications in strongly analytic preKELP proofs is presented; abduction heuristics and the complexity of the method are discussed.
Abstract:
The design of binary morphological operators that are translation-invariant and locally defined by a finite neighborhood window corresponds to the problem of designing Boolean functions. As in any supervised classification problem, morphological operators designed from a training sample also suffer from overfitting. Large neighborhoods tend to lead to performance degradation of the designed operator. This work proposes a multilevel design approach to deal with the issue of designing large neighborhood-based operators. The main idea is inspired by stacked generalization (a multilevel classifier design approach) and consists of, at each training level, combining the outcomes of the previous-level operators. The final operator is a multilevel operator that ultimately depends on a larger neighborhood than that of the individual operators that have been combined. Experimental results show that two-level operators obtained by combining operators designed on subwindows of a large window consistently outperform the single-level operators designed on the full window. They also show that iterating two-level operators is an effective multilevel approach to obtain better results.
Abstract:
Canalizing genes possess such broad regulatory power, and their action sweeps across such a wide swath of processes, that the full set of affected genes is not highly correlated under normal conditions. When not active, the controlling gene will not be predictable to any significant degree by its subject genes, either alone or in groups, since their behavior will be highly varied relative to the inactive controlling gene. When the controlling gene is active, its behavior is not well predicted by any one of its targets, but can be very well predicted by groups of genes under its control. To investigate this question, we introduce in this paper the concept of intrinsically multivariate predictive (IMP) genes, and present a mathematical study of IMP in the context of binary genes with respect to the coefficient of determination (CoD), which measures the predictive power of a set of genes with respect to a target gene. A set of predictor genes is said to be IMP for a target gene if all properly contained subsets of the predictor set are bad predictors of the target but the full predictor set predicts the target with great accuracy. We show that the logic of prediction, the predictive power, the covariance between predictors, and the entropy of the joint probability distribution of the predictors jointly affect the appearance of IMP genes. In particular, we show that high predictive power, small covariance among predictors, a large entropy of the joint probability distribution of predictors, and certain logics, such as XOR in the 2-predictor case, are factors that favor the appearance of IMP. The IMP concept is applied to characterize the behavior of the gene DUSP1, which exhibits control over a central, process-integrating signaling pathway, thereby providing preliminary evidence that IMP can be used as a criterion for discovery of canalizing genes.
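The CoD underlying the IMP definition can be computed directly for small binary distributions. The sketch below (our notation) uses CoD = (e0 − e_opt)/e0, where e0 is the error of the best constant predictor of the target and e_opt the error of the best predictor built from the given predictor set; XOR reproduces the 2-predictor IMP case mentioned above.

```python
# Coefficient of determination (CoD) for binary predictors of a binary
# target, computed from an explicit sample of the joint distribution.
from collections import Counter
from itertools import product

def cod(samples, idx):
    """samples: list of (predictor_tuple, target_bit) pairs.
    idx: indices of the predictor subset to evaluate."""
    n = len(samples)
    # e0: error of the best constant (no-predictor) guess
    e0 = min(sum(1 for _, y in samples if y != c) for c in (0, 1)) / n
    groups = Counter()
    for x, y in samples:
        groups[(tuple(x[i] for i in idx), y)] += 1
    # e_opt: error of the majority-vote predictor within each pattern
    patterns = {g for (g, _) in groups}
    e_opt = sum(min(groups[(g, 0)], groups[(g, 1)]) for g in patterns) / n
    return (e0 - e_opt) / e0 if e0 else 0.0

xor = [((a, b), a ^ b) for a, b in product((0, 1), repeat=2)]
# each predictor alone is useless; the pair is perfectly predictive (IMP)
print(cod(xor, [0]), cod(xor, [1]), cod(xor, [0, 1]))  # 0.0 0.0 1.0
```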
Abstract:
Planning to reach a goal is an essential capability for rational agents. In general, a goal specifies a condition to be achieved at the end of the plan execution. In this article, we introduce nondeterministic planning for extended reachability goals (i.e., goals that also specify a condition to be preserved during the plan execution). We show that, when this kind of goal is considered, the temporal logic CTL turns out to be inadequate to formalize plan synthesis and plan validation algorithms. This is mainly due to the fact that the CTL`s semantics cannot discern among the various actions that produce state transitions. To overcome this limitation, we propose a new temporal logic called alpha-CTL. Then, based on this new logic, we implement a planner capable of synthesizing reliable plans for extended reachability goals, as a side effect of model checking.
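The plan-synthesis idea (reach the goal while preserving a condition, under nondeterministic actions) can be sketched as a backward fixed-point computation in the planning-as-model-checking style. This is a generic sketch of strong-plan synthesis, not the article's alpha-CTL algorithm, and the tiny domain is ours.

```python
# Strong-plan synthesis for an extended reachability goal: backward
# fixed-point over a nondeterministic transition system. A state is added
# once some action sends all its possible outcomes into solved states.

def strong_plan(actions, goal, preserve):
    """actions: {(state, action): set of possible successor states}.
    Returns a {state: action} policy for every state it solves."""
    solved = set(goal)
    plan = {}
    changed = True
    while changed:
        changed = False
        for (s, a), succs in actions.items():
            # s must lie in the region to be preserved, be unsolved, and
            # every nondeterministic outcome of a must already be solved
            if s in preserve and s not in solved and succs <= solved:
                solved.add(s)
                plan[s] = a
                changed = True
    return plan

# toy domain: 'risky' may leave the preserved region via 'bad', so a
# strong plan for the extended goal must choose 'safe' instead
actions = {
    ('s0', 'risky'): {'s1', 'bad'},
    ('s0', 'safe'):  {'s1'},
    ('s1', 'go'):    {'goal'},
}
plan = strong_plan(actions, goal={'goal'}, preserve={'s0', 's1'})
print(plan)  # {'s1': 'go', 's0': 'safe'}
```

Note that 'bad' is never treated as solved, so the risky action can never be selected for s0; this action-level discrimination is precisely what the abstract argues plain CTL semantics cannot express.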