998 results for 010200 APPLIED MATHEMATICS


Relevance: 80.00%

Abstract:

Radiocarbon dating is routinely used in paleoecology to build chronologies of lake and peat sediments, aiming at inferring a model that relates sediment depth to age. We present a new approach for chronology building (called “Bacon”) that has received enthusiastic attention from paleoecologists. Our methodology is based on controlling core accumulation rates using a gamma autoregressive semiparametric model with an arbitrary number of subdivisions along the sediment. Using prior knowledge about accumulation rates is crucial, and informative priors are routinely used. Since many sediment cores are currently analyzed, using different data sets and prior distributions, a robust (adaptive) MCMC is very useful. We use the t-walk (Christen and Fox, 2010), a self-adjusting, robust MCMC sampling algorithm that works acceptably well in many situations. Outliers are also addressed using a recent approach that considers a Student-t model for radiocarbon data. Two examples are presented here, that of a peat core and a core from a lake, and our results are compared with those of other approaches.
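As a rough illustration of the gamma autoregressive accumulation model, the following sketch simulates one age-depth realization. All names and numbers (section thickness, gamma shape, mean accumulation rate, memory weight) are illustrative assumptions; the actual Bacon model places priors on these quantities and samples them with the t-walk rather than fixing them.

```python
import numpy as np

def simulate_age_depth(n_sections=20, dx=5.0, shape=1.5, mean_acc=20.0,
                       memory=0.7, rng=None):
    """Sketch of a gamma autoregressive age-depth model.

    Accumulation rates (yr/cm) for equal-thickness sediment sections are
    gamma-distributed and correlated with the previous section through a
    'memory' weight; ages are the cumulative sum of rate * thickness.
    All parameter values here are illustrative, not the paper's.
    """
    rng = np.random.default_rng(rng)
    rates = np.empty(n_sections)
    rates[0] = rng.gamma(shape, mean_acc / shape)
    for i in range(1, n_sections):
        innov = rng.gamma(shape, mean_acc / shape)
        # convex combination keeps rates positive and autocorrelated
        rates[i] = memory * rates[i - 1] + (1 - memory) * innov
    depths = np.arange(n_sections + 1) * dx
    ages = np.concatenate([[0.0], np.cumsum(rates * dx)])
    return depths, ages
```

Because every rate is strictly positive, the resulting age-depth curve is guaranteed to be monotone, which is the key structural constraint such chronologies must satisfy.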

Relevance: 80.00%

Abstract:

As a class of defects in software requirements specifications, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often leaves other types of non-canonical requirements in place, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take the related non-canonical requirements into account in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency to techniques suitable for managing non-canonical requirements, dealing with incompleteness and redundancy in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising the requirements according to the chosen proposals. This generalization can be considered an attempt to handle non-canonical requirements alongside logic-based inconsistency handling in requirements engineering.
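As a toy illustration of the identification step, the sketch below brute-forces inconsistency and redundancy for a handful of propositional requirements. The function name and the encoding of requirements as Boolean predicates are hypothetical, chosen only to make the idea concrete; the paper's logic framework is far more general.

```python
from itertools import product

def analyse(requirements, n_vars):
    """Toy identification of non-canonical requirements by brute force.

    Each requirement is a predicate over a tuple of n_vars booleans.
    Returns (inconsistent, redundant_indices); usable only for tiny
    illustrative examples, not a scalable reasoning procedure.
    """
    worlds = list(product([False, True], repeat=n_vars))
    models = [w for w in worlds if all(r(w) for r in requirements)]
    inconsistent = not models
    redundant = []
    for i, r in enumerate(requirements):
        rest = requirements[:i] + requirements[i + 1:]
        # r is redundant if every model of the others already satisfies r
        if all(r(w) for w in worlds if all(q(w) for q in rest)):
            redundant.append(i)
    return inconsistent, redundant
```

For example, with requirements p, p → q, and q, the last two are each entailed by the others and would be flagged as redundant, while {p, ¬p} is flagged as inconsistent.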

Relevance: 80.00%

Abstract:

In this paper, the hydrodynamics and the pressure drop of liquid-liquid slug flow in round microcapillaries are presented. Two liquid-liquid flow systems are considered, viz. water-toluene and ethylene glycol/water-toluene. The slug lengths of the alternating continuous and dispersed phases were measured as a function of the slug velocity (0.03-0.5 m/s), the organic-to-aqueous flow ratio (0.1-4.0), and the microcapillary internal diameter (248 and 498 μm). The pressure drop is modeled as the sum of two contributions: the frictional and the interface pressure drop. Two models are presented, viz. the stagnant film model and the moving film model. Both models account for the presence of a thin liquid film between the dispersed-phase slug and the capillary wall. It is found that the film velocity has a negligible influence on the pressure drop; therefore, the stagnant film model is adequate to accurately predict the liquid-liquid slug flow pressure drop. The influence of inertia and the consequent change of the slug cap curvature are accounted for by modifying Bretherton's curvature parameter in the interface pressure drop equation. The stagnant film model is in good agreement with experimental data, with a mean relative error of less than 7%.
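A minimal numerical sketch of the two-contribution decomposition, assuming standard Hagen-Poiseuille friction and a classical Bretherton-type interface term. The coefficient 7.16 and the (3 Ca)^(2/3) scaling are textbook values from the original Bretherton analysis, not the modified curvature parameter fitted in the paper, and the function signature is invented for illustration.

```python
def slug_flow_dp(length, d, u, mu_c, mu_d, sigma, slug_len_c, slug_len_d):
    """Sketch: slug-flow pressure drop = frictional + interface contribution.

    Hagen-Poiseuille friction is applied to each phase over its own slug
    length; each slug unit adds a Bretherton-type interface term
    7.16 * (3 Ca)^(2/3) * sigma / d. Coefficients are illustrative
    textbook values, not fitted to the paper's data.
    """
    unit = slug_len_c + slug_len_d
    n_units = length / unit                    # number of slug pairs in the channel
    # frictional part: 32 mu u L / d^2 per phase, weighted by slug lengths
    dp_fric = 32.0 * u / d**2 * (mu_c * slug_len_c + mu_d * slug_len_d) * n_units
    ca = mu_c * u / sigma                      # capillary number of the continuous phase
    dp_int = n_units * 7.16 * (3.0 * ca) ** (2.0 / 3.0) * sigma / d
    return dp_fric + dp_int
```

With water-toluene-like property values and a 248 μm capillary, both contributions come out in the kPa-per-decimetre range and grow with slug velocity, consistent with the trends described in the abstract.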

Relevance: 80.00%

Abstract:

The majority of learning methods reported to date for Takagi-Sugeno-Kang fuzzy neural models mainly focus on improving their accuracy. However, one of the key design requirements in building an interpretable fuzzy model is that each obtained rule consequent must match the system's local behaviour well when all the rules are aggregated to produce the overall system output. This is one of the characteristics that distinguish such models from black-box models such as neural networks. Therefore, how to find a desirable set of fuzzy partitions and, hence, identify the corresponding consequent models which can be directly explained in terms of system behaviour presents a critical step in fuzzy neural modelling. In this paper, a new learning approach considering both nonlinear parameters in the rule premises and linear parameters in the rule consequents is proposed. Unlike the conventional two-stage optimization procedure widely practised in the field, where the two sets of parameters are optimized separately, the consequent parameters are transformed into a set dependent on the premise parameters, thereby enabling the introduction of a new integrated gradient-descent learning approach. A new Jacobian matrix is thus proposed and efficiently computed to achieve a more accurate approximation of the cost function using the second-order Levenberg-Marquardt optimization method. Several other interpretability issues of the fuzzy neural model are also discussed and integrated into this new learning approach. Numerical examples are presented to illustrate the resulting structure of the fuzzy neural models and the effectiveness of the proposed algorithm, and the results are compared with those from some well-known methods.
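The Levenberg-Marquardt update at the heart of the integrated approach can be sketched generically as follows. The model-specific Jacobian (with consequent parameters expressed as functions of premise parameters) is the paper's contribution and is not reproduced here; `residual_fn` and `jacobian_fn` are placeholder names for those quantities.

```python
import numpy as np

def lm_step(theta, residual_fn, jacobian_fn, lam=1e-2):
    """One Levenberg-Marquardt update:
    theta <- theta - (J^T J + lam I)^{-1} J^T r.

    Generic sketch of the second-order optimizer named in the abstract;
    residual_fn returns the residual vector r(theta) and jacobian_fn the
    Jacobian J(theta) of the residuals with respect to theta.
    """
    r = residual_fn(theta)
    J = jacobian_fn(theta)
    A = J.T @ J + lam * np.eye(len(theta))  # damped Gauss-Newton system
    return theta - np.linalg.solve(A, J.T @ r)
```

Iterating this step on even a trivial linear least-squares problem (fit y = a·x) converges to the least-squares solution, since the update vanishes exactly when the gradient JᵀR is zero.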

Relevance: 80.00%

Abstract:

In this paper we study the influence of interventions on self-interactions in a spatial Prisoner's Dilemma on a two-dimensional grid with periodic boundary conditions and synchronous updating of the dynamics. We investigate two different types of self-interaction modification. The first type (FSIP) is deterministic, scaling each player's self-interaction by a constant factor, whereas the second type (PSIP) performs probabilistic interventions. Both types of intervention reduce the payoff of the players and, hence, represent inhibiting effects. We find that a constant but moderate reduction of self-interactions has a very beneficial effect on the evolution of cooperators in the population, whereas probabilistic interventions on self-interactions are in general counterproductive for the coexistence of the two different strategies. © 2011 Elsevier Inc. All rights reserved.
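A minimal sketch of the FSIP variant, assuming the common weak-dilemma payoffs (R = 1, P = S = 0, T = b) and imitate-the-best synchronous updating; the parameter values and update details are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def step(strat, b=1.6, self_factor=0.8):
    """One synchronous update of a spatial Prisoner's Dilemma on a torus.

    strat: 2D int array, 1 = cooperator, 0 = defector. Cooperators earn
    R=1 per cooperating partner, defectors earn T=b; each cooperator also
    plays against itself, with that self-interaction scaled by
    `self_factor` (the deterministic FSIP idea). Every player then adopts
    the strategy of the highest-scoring player in its Moore neighbourhood
    (ties resolved in favour of keeping its own strategy).
    """
    def shifts(a):
        return [np.roll(np.roll(a, i, axis=0), j, axis=1)
                for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0)]

    coop_nbrs = sum(shifts(strat))                      # cooperating neighbours
    pay = np.where(strat == 1, coop_nbrs * 1.0, coop_nbrs * b)
    pay = pay + self_factor * (strat == 1)              # reduced self-interaction
    nb_pay = np.stack([pay] + shifts(pay))              # self first, then 8 nbrs
    nb_strat = np.stack([strat] + shifts(strat))
    rows, cols = np.indices(strat.shape)
    return nb_strat[nb_pay.argmax(axis=0), rows, cols]
```

Homogeneous populations are fixed points of this dynamics (an all-cooperator or all-defector grid never changes), which is a quick sanity check before studying mixed initial conditions.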

Relevance: 80.00%

Abstract:

In this paper, we show how interacting and occluding targets can be tackled successfully within a Gaussian approximation. For that purpose, we develop a general expansion of the mean and covariance of the posterior and consider a first-order approximation of it. The proposed method differs from the EKF in that neither a non-linear dynamical model nor a non-linear measurement-to-state relation has to be defined, so it works with any kind of interaction potential and likelihood. The approach has been tested on three sequences (of 10,400, 2,500, and 400 frames, respectively). The results show that our approach helps to reduce the number of failures without increasing the computation time too much with respect to methods that do not take target interactions into account.

Relevance: 80.00%

Abstract:

The ability of millimetre wave and terahertz systems to penetrate clothing is well known. The fact that the transmission of clothing and the reflectivity of the body vary as a function of frequency is less well known. Several instruments have now been developed to exploit this capability. The choice of operating frequency, however, has often been driven by the maturity and cost of the enabling technology rather than by a sound systems-engineering approach. Top-level user and systems requirements have been derived to inform the development of design concepts. Emerging micro- and nanotechnology concepts have been reviewed, and we have demonstrated how these can be evaluated against these requirements through simulation using OpenFx. OpenFx is an open-source suite of 3D tools for modelling, animation and visualization which has been modified for use at millimetre waves. © 2012 SPIE.

Relevance: 80.00%

Abstract:

We introduce the notion of a (noncommutative) C*-Segal algebra as a Banach algebra (A, ‖·‖_A) which is a dense ideal in a C*-algebra (C, ‖·‖_C), where ‖·‖_A is strictly stronger than ‖·‖_C on A. Several basic properties are investigated and, with the aid of the theory of multiplier modules, the structure of C*-Segal algebras with an order unit is determined.
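Spelled out in the standard way, "strictly stronger" amounts to a one-sided norm comparison without equivalence; the constant k below is generic, not taken from the abstract:

```latex
% Norm comparison behind "strictly stronger" (standard convention):
\exists\, k > 0 \ \forall a \in A:\quad \|a\|_{C} \le k\, \|a\|_{A},
\qquad \text{while } \|\cdot\|_{A} \text{ and } \|\cdot\|_{C}
\text{ are not equivalent on } A.
```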

Relevance: 80.00%

Abstract:

In this paper, we introduce a macroscopic model for road traffic accidents along highway sections. We discuss the motivation for and derivation of such a model, and we present its mathematical properties. The results are presented by means of examples in which a section of a crowded one-way highway contains, in the middle, a cluster of drivers whose dynamics are prone to road traffic accidents. We discuss the coupling conditions and present some existence results for weak solutions of the associated Riemann problems. Furthermore, we illustrate some features of the proposed model through numerical simulations. © The authors 2012.
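The abstract does not state the model equations. As background, a macroscopic (LWR-type) traffic model ρ_t + (ρ v(ρ))_x = 0 can be solved with a standard Godunov scheme, sketched below with the Greenshields flux; this is illustrative of the class of models involved, and the paper's accident-cluster coupling conditions are not implemented.

```python
import numpy as np

def godunov_lwr(rho, dx, dt, steps, v_max=1.0, rho_max=1.0):
    """Godunov scheme for the LWR model rho_t + (rho * v(rho))_x = 0
    with the Greenshields flux f(rho) = v_max * rho * (1 - rho/rho_max)
    on a periodic road. Standard textbook solver, not the paper's model.
    """
    def f(r):
        return v_max * r * (1.0 - r / rho_max)

    rho_c = rho_max / 2.0                     # density at which the flux peaks
    rho = rho.astype(float)
    for _ in range(steps):
        left, right = rho, np.roll(rho, -1)   # states around each interface
        lo, hi = np.minimum(left, right), np.maximum(left, right)
        # Godunov flux for a concave flux: minimum of f over [left, right]
        # if left <= right, otherwise maximum (attained nearest rho_c)
        flux = np.where(left <= right,
                        np.minimum(f(left), f(right)),
                        f(np.clip(rho_c, lo, hi)))
        rho = rho - dt / dx * (flux - np.roll(flux, 1))
    return rho
```

Because the scheme is conservative, the total density on the periodic domain is preserved exactly (up to rounding), and under the CFL condition dt ≤ dx / v_max the solution stays within [0, rho_max].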

Relevance: 80.00%

Abstract:

Background:
The physical periphery of a biological cell is mainly described by signaling pathways that are triggered by transmembrane proteins and receptors, which act as sentinels controlling the whole gene regulatory network of a cell. However, our current knowledge about the gene regulatory mechanisms that are governed by extracellular signals is severely limited.

Results:
The purpose of this paper is threefold. First, we infer a gene regulatory network from a large-scale B-cell lymphoma expression data set using the C3NET algorithm. Second, we provide a functional and structural analysis of the largest connected component of this network, revealing that this network component corresponds to the peripheral region of a cell. Third, we analyze the hierarchical organization of network components of the whole inferred B-cell gene regulatory network by introducing a new approach that exploits the variability within the data as well as the inferential characteristics of C3NET. As a result, we find a functional bisection of the network corresponding to different cellular components.

Conclusions:
Overall, our study allows us to highlight the peripheral gene regulatory network of B-cells and shows that it is centered around hub transmembrane proteins located at the physical periphery of the cell. In addition, we identify a variety of novel pathological transmembrane proteins, such as ion channel complexes and signaling receptors, in B-cell lymphoma. © 2012 Simoes et al.; licensee BioMed Central Ltd.
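The core C3NET inference rule can be sketched as follows, assuming a precomputed matrix of pairwise mutual-information estimates; how MI and the significance threshold are obtained (e.g. by permutation testing) is part of the original algorithm and omitted here.

```python
import numpy as np

def c3net(mi, threshold):
    """Sketch of the C3NET inference rule.

    Each gene is linked only to the neighbour with which it shares its
    maximal mutual information, provided that value passes a significance
    threshold. mi is a symmetric matrix of MI estimates; the result is a
    symmetric 0/1 adjacency matrix.
    """
    n = mi.shape[0]
    adj = np.zeros((n, n), dtype=int)
    m = mi.astype(float).copy()
    np.fill_diagonal(m, -np.inf)          # a gene never pairs with itself
    for i in range(n):
        j = int(m[i].argmax())            # strongest partner of gene i
        if m[i, j] > threshold:
            adj[i, j] = adj[j, i] = 1     # undirected edge
    return adj
```

Because each gene contributes at most one edge, the inferred network has at most n edges, which is what makes C3NET conservative and fast on large expression data sets.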

Relevance: 80.00%

Abstract:

Coxian phase-type distributions are becoming a popular means of representing survival times within a health care environment. They are favoured because they express a distribution as a system of phases and allow an easy visual representation of the rate of flow of patients through a system. Difficulties arise, however, in determining the parameter estimates of the Coxian phase-type distribution. This paper examines ways of making the fitting of the Coxian phase-type distribution less cumbersome by outlining the different software packages and algorithms available to perform the fit and assessing their capabilities through a number of performance measures. The performance measures rate each of the methods and help in identifying the most efficient. Conclusions drawn from these performance measures suggest that SAS is the most robust package: it has a high rate of convergence in each of the four example model fits considered, short computational times, detailed output, convergence-criteria options, and the ability to switch between different algorithms.
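For readers unfamiliar with the distribution being fitted, a Coxian phase-type survival time can be simulated directly from its phase interpretation; the rates and transition probabilities below are illustrative, not values produced by any of the packages compared.

```python
import random

def coxian_sample(lambdas, probs, rng=None):
    """Draw one survival time from a Coxian phase-type distribution.

    The process starts in phase 1; from phase i it leaves after an
    Exp(lambdas[i]) sojourn and then moves to phase i+1 with probability
    probs[i], otherwise it is absorbed (e.g. the patient leaves the
    system). lambdas has one entry per phase, probs one per transition.
    """
    rng = rng or random.Random()
    t = 0.0
    for lam, p in zip(lambdas, probs + [0.0]):  # last phase always absorbs
        t += rng.expovariate(lam)
        if rng.random() >= p:
            break
    return t
```

For a two-phase example with rates (2, 1) and onward probability 0.5, the mean survival time is 1/2 + 0.5 × 1 = 1.0, which a Monte Carlo sample reproduces; this is the kind of target density the compared packages estimate parameters for.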

Relevance: 80.00%

Abstract:

A general approach to information correction and fusion for belief functions is proposed, where not only may the information items be irrelevant, but sources may lie as well. We introduce a new correction scheme, which takes into account uncertain metaknowledge on the source's relevance and truthfulness and which generalizes Shafer's discounting operation. We then show how to reinterpret all connectives of Boolean logic in terms of source behavior assumptions with respect to relevance and truthfulness. We are led to generalize the unnormalized Dempster's rule to all Boolean connectives, while taking into account the uncertainties pertaining to assumptions concerning the behavior of sources. Eventually, we further extend this approach to an even more general setting, where source behavior assumptions do not have to be restricted to relevance and truthfulness. We also establish the commutativity property between correction and fusion processes when the behaviors of the sources are independent.
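Two of the standard operations the abstract generalizes can be sketched directly: Shafer's discounting of a mass function by a reliability weight, and the unnormalized (conjunctive) Dempster rule. Mass functions are encoded here as dicts from frozensets to masses; this is an illustrative encoding, not the paper's notation.

```python
def discount(m, alpha, frame):
    """Shafer's discounting with reliability alpha: m'(A) = alpha * m(A)
    for A != frame, and the withheld mass 1 - alpha is transferred to the
    whole frame of discernment (total ignorance)."""
    out = {A: alpha * v for A, v in m.items() if A != frame}
    out[frame] = alpha * m.get(frame, 0.0) + (1.0 - alpha)
    return out

def conjunctive(m1, m2):
    """Unnormalized Dempster (conjunctive) rule:
    m(C) = sum over A ∩ B = C of m1(A) * m2(B);
    conflicting mass stays on the empty set rather than being renormalized."""
    out = {}
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            out[C] = out.get(C, 0.0) + v1 * v2
    return out
```

Keeping the rule unnormalized, so that conflict accumulates on the empty set, is exactly the variant the abstract extends to the other Boolean connectives.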