27 results for Mixed Binary Linear Programming
Abstract:
Sugars affect the gelatinization of starch, with the effect varying significantly between sugars. Since many food products contain a mixture of sugar sources, it is important to understand how their mixtures affect starch gelatinization. In a Rapid Visco Analyser study of maize starch gelatinization, changing the proportions in binary mixtures of refined sugars produced a largely proportionate change in starch gelatinization properties. However, binary mixtures of pure sugars and honey, or of a model honey system (the main sugars in honey) and honey, responded differently. Generally, replacing 25% or 50% of the refined sugar or model honey system with honey gave a large change in starch gelatinization properties, while further increases in honey level had little further effect. Differences between honey and a buffered model honey system (buffered with either gluconic acid or a mixture of citric acid and di-sodium phosphate) showed the sensitivity of starch gelatinization to the composition of the nonsaccharide component. (c) 2004 Swiss Society of Food Science and Technology. Published by Elsevier Ltd. All rights reserved.
Abstract:
Standard factorial designs may sometimes be inadequate for experiments that aim to estimate a generalized linear model, for example, for describing a binary response in terms of several variables. A method is proposed for finding exact designs for such experiments that uses a criterion allowing for uncertainty in the link function, the linear predictor, or the model parameters, together with a design search. Designs are assessed and compared by simulating the distribution of efficiencies relative to locally optimal designs over a space of possible models. Exact designs are investigated for two applications, and their advantages over factorial and central composite designs are demonstrated.
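The assessment step described in this abstract, simulating the efficiency of a candidate exact design over a space of possible models, can be illustrated with a short sketch. The code below is a minimal sketch under stated assumptions: a logistic link, a two-factor linear predictor, and a normal prior on the parameters only (link and predictor uncertainty are not simulated), and it uses a 2^2 factorial as the comparator rather than genuinely locally optimal designs; all design points and prior settings are hypothetical.

# Minimal sketch: simulated D-efficiencies of a hypothetical exact design
# for a binary-response (logistic) model, relative to a 2^2 factorial comparator.
import numpy as np

def fisher_information(design, beta):
    """Fisher information of a logistic-regression model at a given design.

    design: (n, 2) array of factor settings; an intercept column is added.
    beta:   (3,) parameter vector (intercept + two slopes).
    """
    F = np.column_stack([np.ones(len(design)), design])   # model matrix f(x)
    eta = F @ beta
    p = 1.0 / (1.0 + np.exp(-eta))                         # success probabilities
    w = p * (1.0 - p)                                      # GLM weights
    return (F * w[:, None]).T @ F

def d_criterion(design, beta):
    """log det of the information matrix (larger is better)."""
    sign, logdet = np.linalg.slogdet(fisher_information(design, beta))
    return logdet if sign > 0 else -np.inf

rng = np.random.default_rng(0)
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float)      # comparator
candidate = np.array([[-1, -1], [0.2, -1], [1, 0.4], [1, 1]], float)   # hypothetical design

# Distribution of D-efficiencies of the candidate relative to the comparator,
# simulated over a prior on the model parameters (p = 3 parameters).
betas = rng.normal(loc=[0.0, 1.0, -0.5], scale=0.5, size=(1000, 3))
eff = [np.exp((d_criterion(candidate, b) - d_criterion(factorial, b)) / 3) for b in betas]
print("median simulated D-efficiency:", np.median(eff))

In the same spirit as the abstract, the whole distribution of simulated efficiencies, rather than a single number, is what would be used to compare designs.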
Abstract:
Computer modelling promises to be an important tool for analysing and predicting interactions between trees within mixed-species forest plantations. This study explored the use of an individual-based mechanistic model as a predictive tool for designing mixed-species plantations of Australian tropical trees. The 'spatially explicit individual-based forest simulator' (SeXI-FS) modelling system was used to describe the spatial interaction of individual tree crowns within a binary mixed-species experiment. The three-dimensional model was developed and verified with field data from three forest tree species grown in tropical Australia. The model predicted the interactions within monocultures and binary mixtures of Flindersia brayleyana, Eucalyptus pellita and Elaeocarpus grandis, accounting for an average of 42% of the growth variation exhibited by the species in different treatments. The model requires only structural dimensions and shade tolerance as species parameters. By modelling interactions in existing tree mixtures, the model predicted both increases and reductions in the growth of mixtures (up to +/- 50% of stem volume at 7 years) compared to monocultures. This modelling approach may be useful for designing mixed tree plantations. (c) 2006 Published by Elsevier B.V.
Abstract:
We consider a problem of robust performance analysis of linear discrete time-varying systems on a bounded time interval. The system is represented in state-space form. It is driven by a random input disturbance with imprecisely known probability distribution; this distributional uncertainty is described in terms of entropy. The worst-case performance of the system is quantified by its a-anisotropic norm. Computing the anisotropic norm reduces to solving a set of difference Riccati and Lyapunov equations together with an equation of special form.
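For orientation, the setting can be stated schematically as follows (the notation below is generic and not taken from the paper): the system is given in state-space form on a bounded horizon, and the a-anisotropic norm is the worst-case root-mean-square gain over input disturbances whose anisotropy does not exceed a.

\[
x_{k+1} = A_k x_k + B_k w_k, \qquad y_k = C_k x_k + D_k w_k, \qquad k = 0, \dots, N,
\]
\[
\|F\|_a \;=\; \sup\left\{ \frac{\|Y\|}{\|W\|} \;:\; \mathbf{A}(W) \le a \right\},
\]

where \(\mathbf{A}(W)\) is the anisotropy of the input sequence \(W = (w_0, \dots, w_N)\), a relative-entropy measure of its deviation from Gaussian white noise, and \(\|\cdot\|\) denotes the mean-square norm of a random sequence.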
Abstract:
Small-angle neutron scattering measurements on a series of monodisperse linear entangled polystyrene melts in nonlinear flow through an abrupt 4:1 contraction have been made. Clear signatures of melt deformation and subsequent relaxation can be observed in the scattering patterns, which were taken along the centerline. These data are compared with the predictions of a recently derived molecular theory. Two levels of molecular theory are used: a detailed equation describing the evolution of molecular structure over all length scales relevant to the scattering data and a simplified version of the model, which is suitable for finite element computations. The velocity field for the complex melt flow is computed using the simplified model, and scattering predictions are made by feeding these flow histories into the detailed model. The modeling quantitatively captures the full scattering intensity patterns over a broad range of data with independent variation of position within the contraction geometry, bulk flow rate and melt molecular weight. The study provides a strong, quantitative validation of current theoretical ideas concerning the microscopic dynamics of entangled polymers, one that builds upon existing comparisons with nonlinear mechanical stress data. Furthermore, we are able to confirm the appreciable length-scale dependence of relaxation in polymer melts and highlight some wider implications of this phenomenon.
Abstract:
In earth science simulations, length and time scales vary from a planetary scale and millions of years for convection problems to 100 km and 10 years for fault-system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) and the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we present the basic concepts of escript and show how it is used to implement a simulation code for interacting fault systems. We also show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
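To make the Data and linearPDE concepts concrete, here is a minimal sketch of the kind of script escript is designed to support, assuming the esys.escript and esys.finley Python packages; the Poisson-type problem, the mesh size and the coefficient values are purely illustrative, and module paths and call signatures are quoted from memory, so treat them as approximate rather than as the paper's own example.

# Minimal illustrative escript script (assumes the esys-escript/finley packages;
# the specific problem and coefficient values are hypothetical).
from esys.escript import kronecker, whereZero
from esys.escript.linearPDEs import LinearPDE
from esys.finley import Rectangle   # finley provides the finite element domain

# Build a 2D finite element mesh: a unit square with a 40 x 40 element grid.
domain = Rectangle(n0=40, n1=40, l0=1.0, l1=1.0)
x = domain.getX()                   # a Data object holding node coordinates

# Define a linear PDE in the generic escript form
#   -div(A grad u) = Y,  with u = r where the mask q is positive.
pde = LinearPDE(domain)
pde.setValue(A=kronecker(domain),   # identity diffusion tensor -> Poisson problem
             Y=1.0,                 # constant source term
             q=whereZero(x[0]),     # fix the solution on the x0 = 0 boundary
             r=0.0)                 # prescribed boundary value

u = pde.getSolution()               # solved by the underlying finley FEM solver
print("max of solution:", u.sup())  # sup() returns the maximum of a Data object

The point of the design is visible even in this sketch: the script states the PDE coefficients and boundary masks as Data-valued quantities, while the discretization technology behind LinearPDE can be swapped without changing the model description.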
Abstract:
Quantum computers promise to greatly increase the efficiency of solving problems such as factoring large integers, combinatorial optimization and quantum physics simulation. One of the greatest challenges now is to implement the basic quantum-computational elements in a physical system and to demonstrate that they can be reliably and scalably controlled. One of the earliest proposals for quantum computation is based on implementing a quantum bit with two optical modes containing one photon. The proposal is appealing because of the ease with which photon interference can be observed. Until now, it has suffered from the requirement for non-linear couplings between optical modes containing few photons. Here we show that efficient quantum computation is possible using only beam splitters, phase shifters, single-photon sources and photo-detectors. Our methods exploit feedback from photo-detectors and are robust against errors from photon loss and detector inefficiency. The basic elements are accessible to experimental investigation with current technology.
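As a small illustration of the dual-rail encoding mentioned above (one photon shared between two optical modes), the sketch below applies the single-photon-subspace unitaries of a 50:50 beam splitter and a phase shifter to the logical states; the matrix conventions are one common choice, not necessarily those of the paper.

# Dual-rail qubit: |0_L> = photon in mode a, |1_L> = photon in mode b.
# On the one-photon subspace, linear optics act as 2x2 unitaries (convention-dependent).
import numpy as np

def beam_splitter(theta):
    """Beam splitter mixing the two modes; theta = pi/4 gives the 50:50 case."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

def phase_shifter(phi):
    """Phase shift applied to the second mode only."""
    return np.diag([1.0, np.exp(1j * phi)])

ket0 = np.array([1.0, 0.0])                  # photon in the first mode
state = beam_splitter(np.pi / 4) @ ket0
print("after 50:50 beam splitter:", np.round(state, 3))   # equal-weight superposition

# A phase shifter then rotates the relative phase of the superposition.
state = phase_shifter(np.pi / 2) @ state
print("after phase shift:", np.round(state, 3))

The full proposal goes well beyond such passive transformations, adding single-photon sources, photo-detection and measurement feedback to obtain the effective non-linearity needed for two-qubit gates.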
Abstract:
Four adducts of triphenylphosphine oxide with aromatic carboxylic acids have been synthesized and tested for second-order non-linear optical properties. These were with N-methylpyrrole-2-carboxylic acid (1), indole-2-carboxylic acid (2), 3-dimethylaminobenzoic acid (3), and thiophen-2-carboxylic acid (4). Compound (1) produced clear, colourless crystals (space group P2(1)2(1)2(1) with a 9.892(1), b 14.033(1), c 15.305(1) Angstrom, Z 4), which allowed the structure to be determined by X-ray diffraction.
Abstract:
Background. Age-related motor slowing may reflect motor programming deficits, poorer movement execution, or mere strategic preferences for online guidance of movement. We controlled for such preferences by limiting the extent to which movements could be programmed. Methods. Twenty-four young and 24 older adults performed a line-drawing task that allowed movements to be prepared in advance in one case (i.e., a cue initially available indicating target location) and not in another (i.e., no cue initially available as to target location). Participants connected large or small targets illuminated by light-emitting diodes on a graphics tablet that sampled pen-tip position at 200 Hz. Results. Older adults had disproportionate difficulty initiating movement when prevented from programming in advance. Older adults produced slower, less efficient movements, particularly when prevented from programming under greater precision requirements. Conclusions. The slower movements of older adults do not simply reflect a preference for online control, as older adults have less efficient movements when forced to reprogram their movements. Age-related motor slowing kinematically resembles that seen in patients with cerebellar dysfunction.