843 results for mathematical programming
Abstract:
Polymers that can respond to externally applied stimuli have found much application in the biomedical field due to their (reversible) coil–globule transitions. Polymers displaying a lower critical solution temperature (LCST) are the most commonly used, but for blood-borne (i.e., soluble) biomedical applications the application of heat is not always possible, nor practical. Here we report the design and synthesis of poly(oligoethylene glycol methacrylate)-based polymers whose cloud points are easily varied by alkaline phosphatase-mediated dephosphorylation. By fine-tuning the density of phosphate groups on the backbone, it was possible to induce an isothermal transition: a change in solubility triggered by the removal of a small number of phosphate esters from the side chains, activating the LCST-type response. As no temperature change was involved, this serves as a model of a cell-instructed polymer response. Finally, both polymers were found to be non-cytotoxic against MCF-7 cells (at 1 mg·mL–1), confirming their promise for biomedical applications.
Abstract:
Mathematical relationships between Scoring Parameters can be used in Economic Scoring Formulas (ESFs) in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing tender specifications, and the strategy of each bidder will differ depending on the ESF selected and its weight in the overall proposal scoring. This paper introduces the various mathematical relationships and density distributions that describe and inter-relate not only the main Scoring Parameters but also the main Forecasting Parameters in any capped tender (one whose price is upper-limited). Forecasting Parameters, as variables that can be known before the deadline of a tender is reached, together with Scoring Parameters constitute the basis of a future Bid Tender Forecasting Model.
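As a concrete illustration, one widely used family of ESFs scores price bids linearly between the best bid received and the tender cap. The sketch below is a generic member of that family, not one of the specific formulas analysed in the paper; the function name and parameter choices are illustrative assumptions.

```python
def linear_score(bid: float, best_bid: float, tender_price: float,
                 max_points: float = 100.0) -> float:
    """Hypothetical linear price-scoring rule for a capped tender.

    The lowest bid gets all points; a bid at the cap (`tender_price`,
    the upper price limit of the tender) gets zero.
    """
    assert best_bid <= bid <= tender_price
    drop_range = tender_price - best_bid
    if drop_range == 0:            # every bidder offered the cap
        return max_points
    return max_points * (tender_price - bid) / drop_range

print(linear_score(bid=80.0, best_bid=80.0, tender_price=100.0))  # 100.0
print(linear_score(bid=90.0, best_bid=80.0, tender_price=100.0))  # 50.0
```

A bidder's strategy then depends on how steeply this line penalises prices above the best bid, which is exactly the kind of inter-relationship the Scoring Parameters capture.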
Abstract:
Bloom filters are a data structure for storing data in compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents the yes-no Bloom filter, a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if the objects included in the no-filter are chosen so that it recognises as many false positives as possible but no true positives, producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To this end, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best compared with a number of heuristics as well as the CPLEX built-in solver (B&B), and it is therefore the approach recommended for use in yes-no Bloom filters.
In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
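A minimal sketch of the two-part construction described above, assuming SHA-256-based hashing and a caller who supplies the false positives to store in the no-filter (the paper selects those optimally via the ILP/ADP models; here the selection is left to the caller):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: m bits, k hash functions (illustrative sketch)."""
    def __init__(self, m: int, k: int):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, item: str):
        # Derive k positions by salting one cryptographic hash.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item: str):
        for p in self._positions(item):
            self.bits[p] = True

    def __contains__(self, item: str) -> bool:
        return all(self.bits[p] for p in self._positions(item))

class YesNoBloomFilter:
    """Yes-filter for the stored set, no-filter for chosen false positives."""
    def __init__(self, m_yes: int, m_no: int, k: int):
        self.yes = BloomFilter(m_yes, k)
        self.no = BloomFilter(m_no, k)

    def add(self, item: str):
        self.yes.add(item)

    def add_false_positive(self, item: str):
        # Caller must ensure `item` is a false positive of the yes-filter and
        # that no true positive collides with it in the no-filter.
        self.no.add(item)

    def __contains__(self, item: str) -> bool:
        # Recognised only if the yes-filter accepts AND the no-filter rejects.
        return item in self.yes and item not in self.no
```

Objects stored in the no-filter are rejected at query time, which is how the construction claws back accuracy from the yes-filter's false positives.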
Abstract:
The congruential rule advanced by Graves for polarization basis transformation of the radar backscatter matrix is now often misinterpreted as an example of a consimilarity transformation. However, consimilarity transformations imply a physically unrealistic antilinear time-reversal operation. This is just one of the approaches found in the literature to the description of transformations in which the role of conjugation has been misunderstood. In this paper, the different approaches are examined, particularly with respect to the role of conjugation. In order to justify and correctly derive the congruential rule for polarization basis transformation, and to properly place the role of conjugation, the origin of the problem is traced back to the derivation of the antenna height from the transmitted field. In fact, careful consideration of the role played by the Green's dyadic operator relating the antenna height to the transmitted field shows that, under a general unitary basis transformation, it is not justified to assume a scalar relationship between them. Invariance of the voltage equation shows that antenna states and wave states must in fact lie in dual spaces, a distinction not captured in the conventional Jones vector formalism. By introducing spinor formalism and using an alternate spin frame for the transmitted field, a mathematically consistent implementation of the directional wave formalism is obtained. Examples are given comparing the wider generality of the congruential rule in both active and passive transformations with the consimilarity rule.
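Numerically, the congruential rule takes the form S' = U S U^T (under one common convention), with no conjugation of S, in contrast to a consimilarity transform. A small pure-Python check, assuming an illustrative linear-to-circular basis change (phase conventions for U vary in the literature):

```python
import cmath

def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested lists (complex entries OK)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose2(a):
    return [[a[0][0], a[1][0]], [a[0][1], a[1][1]]]

def congruential(u, s):
    """Congruential (not consimilarity) transformation: S' = U S U^T."""
    return matmul2(matmul2(u, s), transpose2(u))

# Assumed linear-to-circular unitary basis change (one illustrative choice).
r = 1 / cmath.sqrt(2)
U = [[r, 1j * r], [1j * r, r]]
S = [[1.0, 0.0], [0.0, 1.0]]  # identity (trihedral-like) scatterer
S_prime = congruential(U, S)
# In the circular basis the identity scatterer becomes purely off-diagonal:
# S_prime is numerically [[0, 1j], [1j, 0]].
```

Note that U^T, not U^{-1} or a conjugate, appears on the right, which is exactly the point the paper defends.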
Abstract:
SHIMMER (Soil biogeocHemIcal Model for Microbial Ecosystem Response) is a new numerical modelling framework designed to simulate microbial dynamics and biogeochemical cycling during initial ecosystem development in glacier forefield soils. It is, however, also transferable to other extreme ecosystem types (such as desert soils or the surfaces of glaciers). The rationale for model development arises from decades of empirical observations in glacier forefields and enables a quantitative, process-focused approach. Here, we provide a detailed description of SHIMMER, test its performance in two case study forefields, the Damma Glacier (Switzerland) and the Athabasca Glacier (Canada), and carry out a sensitivity analysis to identify the most sensitive and least constrained model parameters. Results show that the accumulation of microbial biomass is highly dependent on variation in microbial growth and death rate constants, Q10 values, the active fraction of microbial biomass and the reactivity of organic matter. The model correctly predicts the rapid accumulation of microbial biomass observed during the initial stages of succession in the forefields of both case study systems. Primary production is responsible for the initial build-up of labile substrate that subsequently supports heterotrophic growth; however, allochthonous contributions of organic matter and nitrogen fixation are important in sustaining this productivity. The development and application of SHIMMER also highlight aspects of these systems that require further empirical research: quantifying nutrient budgets and biogeochemical rates, and exploring seasonality, microbial growth and cell death. This will lead to an increased understanding of how glacier forefields contribute to global biogeochemical cycling and climate under future ice retreat.
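The Q10 temperature dependence to which the biomass results are sensitive is a simple scaling law; the sketch below states it generically (the reference temperature and parameter values are illustrative assumptions, not SHIMMER's calibrated values):

```python
def q10_rate(k_ref: float, q10: float, temp: float,
             temp_ref: float = 10.0) -> float:
    """Rate constant at temperature `temp` (deg C), given its value
    k_ref at the reference temperature `temp_ref`.

    A Q10 of 2 means the rate doubles for every 10 degC of warming.
    """
    return k_ref * q10 ** ((temp - temp_ref) / 10.0)

print(q10_rate(k_ref=1.0, q10=2.0, temp=20.0))  # 2.0 (doubled, +10 degC)
print(q10_rate(k_ref=1.0, q10=2.0, temp=0.0))   # 0.5 (halved, -10 degC)
```

Because microbial growth and decay rates enter the model multiplied by this factor, small changes in Q10 propagate strongly into the simulated biomass accumulation, consistent with the sensitivity result above.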
Abstract:
A mathematical model for the spread of Banana Xanthomonas Wilt (BXW) by insects is presented. The model incorporates inflorescence infection and vertical transmission from the mother corm to attached suckers, but not tool-based transmission by humans. Expressions for the basic reproduction number R0 are obtained, and it is verified that the disease persists, at a unique endemic level, when R0 > 1. Sensitivity analysis showed that the inflorescence infection rate and the roguing rate are the parameters with the most influence on disease persistence and equilibrium level, while vertical transmission parameters had less effect on persistence threshold values. Parameters were approximately estimated from field data. The model indicates that single stem removal is a feasible approach to eradication if spread is mainly via inflorescence infection. This requires continuous surveillance and debudding, such that a 50% reduction in inflorescence infection combined with a surveillance interval of 2–3 weeks would eventually lead to full recovery of banana plantations and hence improved production.
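The R0 > 1 threshold behaviour described above can be illustrated with a deliberately simplified single-route model: infection at rate beta, removal by natural death mu and roguing rho, giving R0 = beta*N/(mu + rho). This is a generic susceptible-infected sketch with made-up parameter values, not the paper's full BXW model (which adds inflorescence and vertical transmission routes):

```python
def r0(beta, mu, rho, N=1000.0):
    """Basic reproduction number of the toy model: beta*N / (mu + rho)."""
    return beta * N / (mu + rho)

def simulate(beta, mu, rho, N=1000.0, I0=10.0, dt=0.01, steps=200_000):
    """Forward-Euler integration; returns the infected count at the end.

    Removed plants are assumed replaced by susceptibles (replanting).
    """
    S, I = N - I0, I0
    for _ in range(steps):
        new_inf = beta * S * I              # mass-action infection
        dS = mu * (N - S) - new_inf         # replanting minus new infections
        dI = new_inf - (mu + rho) * I       # natural death mu plus roguing rho
        S, I = S + dt * dS, I + dt * dI
    return I
```

With beta = 0.0003, mu = 0.05, rho = 0.1 we get R0 = 2 and the infection settles at a positive endemic level; cutting beta to 0.0001 pushes R0 below 1 and the infection dies out, mirroring the roguing-rate sensitivity reported above.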
Abstract:
Low birth weight has been associated with increased obesity in adulthood. It has been shown that dietary salt restriction during intrauterine life induces low birth weight and insulin resistance in adult Wistar rats. The present study had a two-fold objective: to evaluate the effects that low salt intake during pregnancy and lactation has on the amount and distribution of adipose tissue; and to determine whether the phenotypic changes in fat mass in this model are associated with alterations in the activity of the renin-angiotensin system. Maternal salt restriction was found to reduce birth weight in male and female offspring. In adulthood, the female offspring of dams fed the low-salt diet presented higher adiposity indices than those seen in the offspring of dams fed a normal-salt diet. This was attributed to the fact that adipose tissue mass (retroperitoneal but not gonadal, mesenteric or inguinal) was greater in those rats than in the offspring of dams fed a normal diet. The adult offspring of dams fed the low-salt diet, compared to those dams fed a normal-salt diet, presented the following: plasma leptin levels higher in males and lower in females; plasma renin activity higher in males but not in females; and no differences in body weight, mean arterial blood pressure or serum angiotensin-converting enzyme activity. Therefore, low salt intake during pregnancy might lead to the programming of obesity in adult female offspring. (c) 2009 Elsevier Inc. All rights reserved.
Abstract:
We introduce a problem called maximum common characters in blocks (MCCB), which arises in applications of approximate string comparison, particularly in the unification of possibly erroneous textual data coming from different sources. We show that this problem is NP-complete, but can nevertheless be solved satisfactorily using integer linear programming for instances of practical interest. Two integer linear formulations are proposed and compared in terms of their linear relaxations. We also compare the results of the approximate matching with other known measures such as the Levenshtein (edit) distance. (C) 2008 Elsevier B.V. All rights reserved.
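The abstract above compares its block-based measure with the Levenshtein (edit) distance; for reference, the latter has a standard dynamic-programming implementation (this is the comparison baseline, not the MCCB formulation itself):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions and
    substitutions needed to turn `a` into `b`."""
    prev = list(range(len(b) + 1))          # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute ca -> cb
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
```

Unlike this character-by-character measure, MCCB credits common characters inside matched blocks, which is why the two measures can rank noisy string pairs differently.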
Abstract:
Production processes involving both lot-sizing and cutting stock problems are common in many industrial settings. However, the two problems are usually treated separately, which can lead to costly production plans. In this paper, a coupled mathematical model is formulated and a heuristic method based on Lagrangian relaxation is proposed. Computational results show its effectiveness. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Background Along the internal carotid artery (ICA), atherosclerotic plaques are often located in its cavernous sinus (parasellar) segments (pICA). Studies indicate that the incidence of pre-atherosclerotic lesions is linked with the complexity of the pICA; however, the pICA shape has never been objectively characterized. Our study aims at providing objective mathematical characterizations of the pICA shape. Methods and results Three-dimensional (3D) computer models, reconstructed from contrast-enhanced computed tomography (CT) data of 30 randomly selected patients (60 pICAs), were analyzed with modern visualization software and new mathematical algorithms. As objective measures of pICA shape complexity, we provide calculations of the curvature energy, torsion energy, and total complexity of 3D skeletons of the pICA lumen. We further measured the posterior knee of the so-called "carotid siphon" with a virtual goniometer and computed correlations between the objective mathematical calculations and the subjective angle measurements. Conclusions Firstly, our study provides mathematical characterizations of the pICA shape, which can serve as objective reference data for analyzing connections between pICA shape complexity and vascular diseases. Secondly, we provide an objective method for creating such data. Thirdly, we evaluate the usefulness of subjective goniometric measurements of the angle of the posterior knee of the carotid siphon.
Abstract:
Inside the 'cavernous sinus' or 'parasellar region', the human internal carotid artery takes the shape of a siphon that is twisted and torqued in three dimensions and surrounded by a network of veins. The parasellar section of the internal carotid artery is of broad biological and medical interest, as its peculiar shape is associated with temperature regulation in the brain and correlated with the occurrence of vascular pathologies. The present study aims to provide anatomical descriptions and objective mathematical characterizations of the shape of the parasellar section of the internal carotid artery in human infants and of its modifications during ontogeny. Three-dimensional (3D) computer models of the parasellar section of the internal carotid artery of infants were generated with a state-of-the-art 3D reconstruction method and analysed using both traditional morphometric methods and novel mathematical algorithms. We show that four constant, demarcated bends can be described along the infant parasellar section of the internal carotid artery, and we provide measurements of their angles. We further provide calculations of the curvature energy, torsion energy and total complexity of the 3D skeleton of the parasellar section of the internal carotid artery, and compare this complexity between infants and adults. Finally, we examine the relationship between shape parameters of the parasellar section of the internal carotid artery in infants and the occurrence of intima cushions, and evaluate the reliability of subjective angle measurements for characterizing the complexity of the parasellar section of the internal carotid artery in infants. The results can serve as objective reference data for comparative studies and for medical imaging diagnostics. They also form the basis for a new hypothesis explaining the mechanisms responsible for the ontogenetic transformation in the shape of the parasellar section of the internal carotid artery.
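One simple way to make a discrete "curvature energy" concrete for a polyline skeleton, such as the vessel centrelines in the two studies above, is to integrate the squared turning angle per unit length along the centreline. The sketch below is illustrative and may differ from the exact definitions used in those papers:

```python
import math

def curvature_energy(points):
    """Approximate integral of squared curvature along a 3D polyline.

    points: list of (x, y, z) vertices along the skeleton.
    """
    def sub(p, q): return tuple(pi - qi for pi, qi in zip(p, q))
    def norm(v): return math.sqrt(sum(c * c for c in v))
    def dot(u, v): return sum(a * b for a, b in zip(u, v))

    energy = 0.0
    for i in range(1, len(points) - 1):
        u = sub(points[i], points[i - 1])
        v = sub(points[i + 1], points[i])
        cosang = max(-1.0, min(1.0, dot(u, v) / (norm(u) * norm(v))))
        angle = math.acos(cosang)           # turning angle at vertex i
        seg = 0.5 * (norm(u) + norm(v))     # arc length attributed to vertex i
        energy += (angle / seg) ** 2 * seg  # discrete integral of kappa^2 ds
    return energy

straight = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)]
bent = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(curvature_energy(straight))  # 0.0
```

A straight centreline scores zero, and a more tortuous siphon-like path scores higher, which is the sense in which such energies serve as objective shape-complexity measures.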
Abstract:
Based on a divide-and-conquer approach, knowledge about nature has been organized into a set of interrelated facts, allowing a natural representation in terms of graphs: each 'chunk' of knowledge corresponds to a node, while relationships between such chunks are expressed as edges. This organization becomes particularly clear in the case of mathematical theorems, with their intense cross-implications and relationships. We have derived a web of mathematical theorems from Wikipedia and, thanks to the powerful concept of entropy, identified its most central and frontier elements. Our results also suggest that the central nodes are the oldest theorems, while the frontier nodes are those recently added to the network. The network communities have also been identified, allowing further insights into the organization of this network, such as its highly modular structure.
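The idea of separating central from frontier nodes by entropy can be illustrated with a simple Shannon entropy over a node's random-walk transition probabilities. This is a generic proxy chosen for illustration; the paper's exact entropy measure may differ:

```python
import math

def neighbour_entropy(adj, node):
    """Shannon entropy (bits) of the uniform transition distribution out
    of `node`; equals log2(degree) for uniform transitions.

    adj: dict mapping node -> set of neighbours.
    """
    deg = len(adj[node])
    if deg == 0:
        return 0.0
    p = 1.0 / deg
    return -deg * p * math.log2(p)

# Toy theorem graph: one hub connected to three leaves.
adj = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
print(neighbour_entropy(adj, "hub"))  # log2(3), about 1.585
print(neighbour_entropy(adj, "a"))    # 0.0
```

Well-connected hubs (candidate central theorems) have high entropy, while sparsely connected frontier nodes have low entropy.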
Abstract:
This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ doubly regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where next to focus our theoretical and computational efforts.
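The outer/inner structure described above can be sketched on a toy equality-constrained problem. In this sketch plain gradient descent stands in for the paper's conjugate-gradient inner solver, the inner loop runs a fixed number of steps rather than a relative error criterion, and no proximal regularization is included; all parameter values are illustrative:

```python
def solve(mu=10.0, outer_iters=20, inner_iters=2000, step=1e-3):
    """min x^2 + y^2  s.t.  x + y = 1  (optimum: x = y = 0.5)."""
    x = y = lam = 0.0
    for _ in range(outer_iters):
        # Inner loop: approximately minimize the augmented Lagrangian
        # L(x, y) = x^2 + y^2 + lam*h + (mu/2)*h^2, with h = x + y - 1.
        for _ in range(inner_iters):
            h = x + y - 1.0
            gx = 2 * x + lam + mu * h
            gy = 2 * y + lam + mu * h
            x, y = x - step * gx, y - step * gy
        lam += mu * (x + y - 1.0)   # outer loop: first-order multiplier update
    return x, y

x, y = solve()
print(round(x, 4), round(y, 4))  # both close to 0.5
```

The multiplier update after each inexact inner solve is what drives the constraint violation to zero across outer iterations.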
Abstract:
Given an algorithm A for solving some mathematical problem based on the iterative solution of simpler subproblems, an outer trust-region (OTR) modification of A is the result of adding a trust-region constraint to each subproblem. The trust-region size is adaptively updated according to the behavior of crucial variables. The new subproblems should not be more complex than the original ones, and the convergence properties of the OTR algorithm should be the same as those of algorithm A. In the present work, the OTR approach is exploited in connection with the "greediness phenomenon" of nonlinear programming. Convergence results for an OTR version of an augmented Lagrangian method for nonconvex constrained optimization are proved, and numerical experiments are presented.
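The adaptive radius update such schemes rely on is the textbook trust-region rule: compare the actual reduction achieved by a step with the reduction the model predicted, then shrink, keep, or grow the radius. The thresholds below are common illustrative choices, not the paper's specific parameters:

```python
def update_radius(delta, rho, eta1=0.25, eta2=0.75, shrink=0.5, grow=2.0,
                  delta_max=100.0):
    """Update the trust-region radius `delta`.

    rho: ratio of actual to predicted reduction for the last step.
    """
    if rho < eta1:          # poor model agreement: shrink the region
        return shrink * delta
    if rho > eta2:          # very good agreement: allow a larger step
        return min(grow * delta, delta_max)
    return delta            # acceptable agreement: keep the radius

print(update_radius(1.0, 0.1))   # 0.5
print(update_radius(1.0, 0.9))   # 2.0
print(update_radius(1.0, 0.5))   # 1.0
```

In an OTR scheme this constraint is simply appended to each subproblem, so the subproblem class (and hence its difficulty) is unchanged.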
Abstract:
We investigate several two-dimensional guillotine cutting stock problems and their variants in which orthogonal rotations are allowed. We first present two dynamic programming based algorithms for the Rectangular Knapsack (RK) problem and its variants in which the patterns must be staged. The first algorithm solves the recurrence formula proposed by Beasley; the second algorithm, for staged patterns, also uses a recurrence formula. We show that if the items are not too small compared to the dimensions of the bin, then these algorithms require polynomial time. Using these algorithms we solved all instances of the RK problem found in the OR-LIBRARY, including one for which no optimal solution was previously known. We also consider the Two-dimensional Cutting Stock problem. We present a column generation based algorithm for this problem that uses the first algorithm mentioned above to generate the columns. We propose two strategies to tackle the residual instances. We also investigate a variant of this problem in which the bins have different sizes. Finally, we study the Two-dimensional Strip Packing problem. For it we also present a column generation based algorithm, this time using the second algorithm mentioned above, in which staged patterns are imposed. In this case we solve instances for two-, three- and four-staged patterns. We report on computational experiments with the various algorithms proposed in this paper. The results indicate that these algorithms are suitable for solving real-world instances. We give a detailed description (a pseudo-code) of all the algorithms presented here, so that the reader may easily implement them. (c) 2007 Elsevier B.V. All rights reserved.
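The unstaged recurrence this line of work builds on (the formula attributed to Beasley) says that the best value V(w, h) of a w x h plate is either a single item that fits, or the sum of the values of the two sub-plates produced by one vertical or horizontal guillotine cut. A compact memoized sketch, without rotations or staging, on made-up instance data:

```python
from functools import lru_cache

def rectangular_knapsack(W, H, items):
    """Guillotine Rectangular Knapsack via the classic DP recurrence.

    items: list of (width, height, value); rotations not allowed here.
    """
    @lru_cache(maxsize=None)
    def V(w, h):
        # Base option: best single item that fits this plate (any remaining
        # area is recovered by the cut options below at higher levels).
        best = max((v for iw, ih, v in items if iw <= w and ih <= h),
                   default=0)
        # Vertical guillotine cut at x = 1..w//2 (symmetry covers the rest).
        for x in range(1, w // 2 + 1):
            best = max(best, V(x, h) + V(w - x, h))
        # Horizontal guillotine cut at y = 1..h//2.
        for y in range(1, h // 2 + 1):
            best = max(best, V(w, y) + V(w, h - y))
        return best
    return V(W, H)

items = [(2, 2, 4), (3, 2, 7), (1, 4, 3)]  # (width, height, value), illustrative
print(rectangular_knapsack(4, 4, items))
```

Staged variants restrict how many times the cut direction may alternate, which is what the second algorithm's recurrence enforces.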