964 results for Binary linear programming (BLP)
Abstract:
Background. Increased life expectancy in men during the last thirty years is largely due to the decrease in mortality from cardiovascular disease in the age group 29-69 yr. This change has resulted in a change in the disease profile of the population, with conditions such as aneurysm of the abdominal aorta (AAA) becoming more prevalent. The advent of endoluminal treatment for AAA has encouraged prophylactic intervention and fuelled the argument to screen for the disease. The feasibility of inserting an endoluminal graft depends on the morphology and growth characteristics of the aneurysm. This study used data from a randomized controlled trial of ultrasound screening for AAA in men aged 65-83 yr in Western Australia to determine the norms of the living anatomy in the pressurized infrarenal aorta. Aims. To examine (1) the diameters of the infra-renal aorta in aneurysmal and non-aneurysmal cases, (2) the implications for treatment modalities, with particular reference to endoluminal grafting, which is most dependent on normal and aneurysmal morphology, and (3) any evidence to support the notion that northern Europeans are predisposed to aneurysmal disease. Methods. Using ultrasound, a randomized controlled trial was established in Western Australia to assess the value of a screening program in males aged 65-83 yr. The infra-renal aorta was defined as aneurysmal if the maximum diameter was 30 mm or more. Aortic diameter was modelled both as a continuous variable (in mm) and as a binary outcome (infra-renal diameter of 30 mm or more). ANOVA and linear regression were used for modelling aortic diameter as a continuum, while chi-square analysis and logistic regression were used in comparing men with and without the diagnosis of AAA. Findings. By December 1998, of the 19,583 men who had been invited to undergo ultrasound screening for AAA, 12,203 accepted the invitation (corrected response fraction 70.8%). The prevalence of AAA increased with age, from 4.8% at 65 yr to 10.8% at 80 yr (chi(2) = 77.9, df = 3, P < 0.001). The median (IQR) diameter for the non-aneurysmal group was 21.4 mm (3.3 mm) and there was an increase (chi(2) = 76.0, df = 1, P < 0.001) in the diameter of the infra-renal aorta with age. Since 27 mm is the 95th centile for the non-aneurysmal infra-renal aorta, a diameter of 30 mm or more is justified as defining an aneurysm. The risk of AAA was higher in men of Australian (OR = 1.0) and northern European origin (OR = 1.0, 95% CL: 0.9, 1.2) compared with those of Mediterranean origin (OR = 0.5, 99% CL: 0.4, 0.7). Conclusion. Although screening has not yet been shown to reduce mortality from AAA, these population-based data assist the understanding of aneurysmal disease and the further development and use of endoluminal grafts for this condition. (C) 2001 Published by Elsevier Science Ltd on behalf of The International Society for Cardiovascular Surgery.
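The Methods above model aortic diameter in two ways: as a continuum (ANOVA and linear regression) and as a binary AAA outcome (chi-square and logistic regression). The Python sketch below is purely illustrative of that two-way modelling; the synthetic data, column names and coefficients are invented placeholders, not values from the trial.

```python
# Illustrative sketch only; data and coefficients are synthetic, not from the trial.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
age = rng.integers(65, 84, size=2000)
# non-aneurysmal bulk of diameters around 21 mm plus a small aneurysmal tail
diameter = rng.normal(21 + 0.05 * (age - 65), 2.0)
aneurysmal = rng.random(age.size) < (0.03 + 0.004 * (age - 65))   # prevalence rising with age
diameter = np.where(aneurysmal, diameter + rng.uniform(10, 25, age.size), diameter)
df = pd.DataFrame({"age": age, "diameter_mm": diameter,
                   "aaa": (diameter >= 30).astype(int)})           # binary outcome: >= 30 mm

linear = smf.ols("diameter_mm ~ age", data=df).fit()        # diameter as a continuum
logistic = smf.logit("aaa ~ age", data=df).fit(disp=0)      # odds of AAA by age
print(linear.params["age"], np.exp(logistic.params["age"]))  # mm/yr slope and per-year odds ratio
```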
Abstract:
1. A model of the population dynamics of Banksia ornata was developed, using stochastic dynamic programming (a state-dependent decision-making tool), to determine optimal fire management strategies that incorporate trade-offs between biodiversity conservation and fuel reduction. 2. The modelled population of B. ornata was described by its age and density, and was exposed to the risk of unplanned fires and stochastic variation in germination success. 3. For a given population in each year, three management strategies were considered: (i) lighting a prescribed fire; (ii) controlling the incidence of unplanned fire; (iii) doing nothing. 4. The optimal management strategy depended on the state of the B. ornata population, with the time since the last fire (age of the population) being the most important variable. Lighting a prescribed fire at an age of less than 30 years was only optimal when the density of seedlings after a fire was low (< 100 plants ha(-1)) or when there were benefits of maintaining a low fuel load by using more frequent fire. 5. Because the cost of management was assumed to be negligible (relative to the value of the persistence of the population), the do-nothing option was never the optimal strategy, although lighting prescribed fires had only marginal benefits when the mean interval between unplanned fires was less than 20-30 years.
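The Python sketch below illustrates the general shape of such a stochastic dynamic program: a backward induction over population states with three management actions. The state definition, transition probabilities, persistence values and horizon are invented placeholders for illustration, not the parameterisation used for B. ornata.

```python
# Hedged sketch of a stochastic dynamic program in the spirit of the study above.
# States, transitions and rewards are illustrative placeholders only.
import numpy as np

MAX_AGE = 50                        # years since last fire (capped)
DENSITIES = [0, 1]                  # 0 = low, 1 = high post-fire seedling density
ACTIONS = ["do nothing", "control unplanned fire", "prescribed burn"]
HORIZON = 100
P_WILDFIRE = {0: 0.05, 1: 0.02}     # annual unplanned-fire probability under actions 0 and 1
P_HIGH_GERM = 0.7                   # chance a fire produces a high-density cohort

def persistence_value(age, dens):
    # population is assumed vulnerable if burnt too young or left unburnt too long
    if age < 8:
        return 0.2 if dens == 0 else 0.5
    return 1.0 if age < 35 else 0.6

V = np.zeros((MAX_AGE + 1, len(DENSITIES)))                 # terminal values
policy = np.zeros((HORIZON, MAX_AGE + 1, len(DENSITIES)), dtype=int)

for t in reversed(range(HORIZON)):
    V_new = np.zeros_like(V)
    for age in range(MAX_AGE + 1):
        for d in DENSITIES:
            q = []
            for a, _name in enumerate(ACTIONS):
                post_fire = sum((P_HIGH_GERM if d2 else 1 - P_HIGH_GERM) * V[0, d2]
                                for d2 in DENSITIES)
                if a == 2:                                   # prescribed burn: fire this year
                    cont = post_fire
                else:
                    pf = P_WILDFIRE[a]
                    cont = pf * post_fire + (1 - pf) * V[min(age + 1, MAX_AGE), d]
                q.append(persistence_value(age, d) + cont)
            policy[t, age, d] = int(np.argmax(q))
            V_new[age, d] = max(q)
    V = V_new

print(ACTIONS[policy[0, 40, 1]])    # optimal action for an old, dense stand at t = 0
```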
Abstract:
A chance constrained programming model is developed to assist Queensland barley growers in making varietal and agronomic decisions in the face of changing product demands and volatile production conditions. Though deemed unsuitable for, or overlooked in, many risk programming applications, the chance constrained programming approach nonetheless aptly captures the single-stage decision problem faced by barley growers of whether to plant lower-yielding but potentially higher-priced malting varieties, given a particular expectation of meeting malting grade standards. Different expectations greatly affect the optimal mix of malting and feed barley activities. The analysis highlights the suitability of chance constrained programming for this specific class of farm decision problem.
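As a hedged illustration of the deterministic-equivalent form of a single-stage chance constrained plan (assuming normally distributed, independent gross margins), the sketch below chooses hectares of malting versus feed barley to maximise expected gross margin subject to a minimum-income chance constraint. All prices, yields, variances and the income floor are invented.

```python
# Hedged sketch of a single-stage chance constrained planting model; all figures invented.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

LAND = 100.0                          # hectares available
ALPHA = 0.9                           # required probability of meeting the income floor
FLOOR = 25_000.0                      # minimum acceptable gross margin ($)

mu    = np.array([420.0, 360.0])      # mean gross margin $/ha: [malting, feed]
sigma = np.array([180.0, 60.0])       # std dev $/ha (malting carries grade risk)
z = norm.ppf(ALPHA)

def neg_expected_margin(x):
    return -mu @ x

def chance_constraint(x):
    # P(income >= FLOOR) >= ALPHA, with independent normal margins, becomes
    # mu'x - z * sqrt(sum(sigma_i^2 * x_i^2)) >= FLOOR
    return mu @ x - z * np.sqrt((sigma**2) @ (x**2)) - FLOOR

cons = [{"type": "ineq", "fun": chance_constraint},
        {"type": "ineq", "fun": lambda x: LAND - x.sum()}]
res = minimize(neg_expected_margin, x0=np.array([50.0, 50.0]),
               bounds=[(0, LAND)] * 2, constraints=cons, method="SLSQP")
print(res.x)   # hectares of malting vs feed barley
```

Tightening ALPHA (or FLOOR) shifts area out of the riskier malting activity and into feed barley, which is the sensitivity the abstract describes.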
Abstract:
A modelling framework is developed to determine the joint economic and environmental net benefits of alternative land allocation strategies. Estimates of community preferences for preservation of natural land, derived from a choice modelling study, are used as input to a model of agricultural production in an optimisation framework. The trade-offs between agricultural production and environmental protection are analysed using the sugar industry of the Herbert River district of north Queensland as an example. Spatially differentiated resource attributes and the opportunity costs of natural land determine the optimal trade-offs between production and conservation for a range of sugar prices.
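A minimal sketch of such a land-allocation trade-off is given below, posed as a small binary linear program: each hypothetical parcel is either retained as natural land or converted to cane, weighing a choice-modelling-style environmental value against the forgone cane margin under a conservation-area target. All figures are invented for illustration.

```python
# Hedged parcel-level land-allocation sketch; parcel areas, margins and values are invented.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# five hypothetical parcels: x_i = 1 keeps parcel i as natural land, 0 converts it to cane
cane_margin  = np.array([900., 650., 400., 300., 150.])   # $/ha/yr if converted
enviro_value = np.array([200., 500., 450., 600., 700.])   # $/ha/yr community preference value
area         = np.array([120., 80., 60., 150., 90.])      # ha

# net benefit of keeping a parcel natural = environmental value minus the forgone cane margin
c = -(enviro_value - cane_margin) * area                   # milp minimises, so negate
min_natural_area = 400.0                                   # hypothetical conservation target (ha)
constraint = LinearConstraint(area, lb=min_natural_area, ub=np.inf)

res = milp(c, constraints=constraint, integrality=np.ones(5), bounds=Bounds(0, 1))
print(res.x)   # 0/1 conservation decision per parcel
```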
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite observable behaviour within functional programming that suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
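A small illustration of the "data as its own fold" idea, written here in Python rather than a typed functional language, is given below; the function names are invented for this sketch.

```python
# Hedged illustration: a list represented by the function that folds it
# (a "platonic combinator"), rather than by a concrete data structure.

def nil(cons_op, z):
    return z                                    # the empty list *is* its fold

def cons(head, tail):
    # a non-empty list is the function that folds itself: apply cons_op to its
    # head and to the fold of its tail
    return lambda cons_op, z: cons_op(head, tail(cons_op, z))

xs = cons(1, cons(2, cons(3, nil)))             # the list [1, 2, 3] as a combinator

total   = xs(lambda h, acc: h + acc, 0)         # fold with (+, 0)     -> 6
length  = xs(lambda _h, acc: 1 + acc, 0)        # fold with (succ, 0)  -> 3
as_list = xs(lambda h, acc: [h] + acc, [])      # reify back to a Python list
print(total, length, as_list)
```

Here `xs` is nothing other than the fold over [1, 2, 3] waiting for its operator and seed, which is the sense in which a platonic combinator is a partially-applied fold and inherits fold-theoretic properties.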
Abstract:
A study was conducted to verify whether the theory on the evolution of corporate environmental management (CEM) is applicable to organizations located in Brazil. Some of the most important proposals pertaining to the evolution of CEM were evaluated in a systematic fashion and integrated into a typical theoretical framework containing three evolutionary stages: reactive, preventive and proactive. The validity of this framework was tested by surveying 94 companies located in Brazil with ISO 14001 certification. Results indicated that the evolution of CEM tends to occur in a manner that is counter to what has generally been described in the literature. Two evolutionary stages were identified: 1) synergy for eco-efficiency and 2) environmental legislation view, which combine variables that were initially categorized into different theoretical CEM stages. These data, obtained from a direct study of Brazilian companies, suggest that the evolution of environmental management in organizations tends to occur in a non-linear fashion, requiring a re-analysis of traditional perceptions of CEM development, as suggested by Kolk and Mauser (2002). (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Surface pressure (pi)-molecular area (A) curves were used to characterize the packing of pseudo-ternary mixed Langmuir monolayers of egg phosphatidylcholine (EPC), 1,2-dioleoyl-3-trimethylammonium propane (DOTAP) and L-alpha-dioleoyl phosphatidylethanolamine (DOPE). This pseudo-ternary EPC/DOPE/DOTAP mixture has been successfully employed in liposome formulations designed as non-viral DNA vectors. Pseudo-binary mixtures were also studied as a control. Miscibility behavior was inferred from the pi-A curves by applying the additivity rule and calculating the excess free energy of mixing (Delta G(Exc)). The interaction between the lipids was also deduced from the surface compressional modulus (C(s)(-1)). The deviation from ideality depends on the type of lipid polar head and on the monolayer composition. For lower DOPE concentrations, the forces are predominantly attractive. However, if the monolayer is rich in DOPE, the presence of DOTAP disturbs the PE-PE intermolecular interaction and the net interaction is then repulsive. The ternary EPC/DOPE/DOTAP monolayer adopted two configurations, modulated by the DOPE content, behaving similarly to the DOPE/DOTAP monolayers. These results contribute to the understanding of lipid interactions and packing in self-assembled systems associated with the in vitro and in vivo stability of liposomes. (C) 2010 Elsevier B.V. All rights reserved.
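For concreteness, the sketch below shows how the additivity-rule quantities named above are commonly computed from pi-A isotherms: the excess free energy of mixing as the integral of the excess area over surface pressure, and the compressional modulus as -A(dpi/dA). The isotherm arrays are synthetic placeholders, not the measured EPC/DOPE/DOTAP data.

```python
# Hedged numerical sketch of the additivity-rule analysis; isotherms are synthetic.
import numpy as np

N_A = 6.022e23                       # Avogadro's number

def excess_free_energy(pi, A_mix, A_1, A_2, x1):
    """Delta G_exc = N_A * integral_0^pi (A_mix - x1*A_1 - (1-x1)*A_2) d(pi')  [J/mol],
    with areas in m^2/molecule and surface pressure in N/m."""
    A_excess = A_mix - (x1 * A_1 + (1 - x1) * A_2)    # additivity rule
    return N_A * np.trapz(A_excess, pi)

def compressional_modulus(pi, A):
    """C_s^-1 = -A * d(pi)/dA along the isotherm."""
    return -A * np.gradient(pi, A)

# synthetic pi-A isotherms (pressure converted from mN/m to N/m; area from nm^2 to m^2)
pi = np.linspace(0.1, 30.0, 100) * 1e-3
A_1   = (1.20 - 0.015 * pi * 1e3) * 1e-18     # pure component 1
A_2   = (0.90 - 0.010 * pi * 1e3) * 1e-18     # pure component 2
A_mix = (1.02 - 0.012 * pi * 1e3) * 1e-18     # mixed monolayer at x1 = 0.5 (synthetic)

print(excess_free_energy(pi, A_mix, A_1, A_2, x1=0.5))   # negative => net attractive interactions
print(compressional_modulus(pi, A_mix)[-1])              # modulus near the highest pressure
```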
Abstract:
We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.
Abstract:
Modeling volatile organic compound (VOC) adsorption onto cup-stacked carbon nanotubes (CSCNT) using the linear driving force model. Volatile organic compounds (VOCs) are an important category of air pollutants, and adsorption has been employed in the treatment (or simply the concentration) of these compounds. The current study used an ordinary analytical methodology to evaluate the properties of a cup-stacked carbon nanotube (CSCNT), a stacking morphology of truncated conical graphene with large amounts of open edges on the outer surface and empty central channels. This work used a Carbotrap bearing a cup-stacked structure (the composite); for comparison, Carbotrap without the nanotube was used as a reference. The retention and saturation capacities of both adsorbents were evaluated at each concentration used (1, 5, 20 and 35 ppm of toluene and phenol). The composite outperformed Carbotrap; its saturation capacity was on average 67% higher. The Langmuir isotherm model was used to fit the equilibrium data for both adsorbents, and a linear driving force (LDF) model was used to quantify intraparticle adsorption kinetics. The LDF model was suitable for describing the curves.
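A hedged sketch of the two models named above follows: a Langmuir isotherm fitted to equilibrium data and linear-driving-force uptake kinetics, dq/dt = k_LDF (q_eq - q). The concentrations match those quoted, but the loadings and rate constant are synthetic placeholders.

```python
# Hedged sketch: Langmuir isotherm fit plus LDF kinetics; all numbers are synthetic.
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import odeint

def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

# synthetic equilibrium data (C in ppm, q in mg/g)
C_data = np.array([1.0, 5.0, 20.0, 35.0])
q_data = np.array([12.0, 41.0, 78.0, 92.0])
(q_max, K), _ = curve_fit(langmuir, C_data, q_data, p0=[100.0, 0.05])

def ldf(q, t, k_ldf, q_eq):
    return k_ldf * (q_eq - q)                 # intraparticle uptake driven by (q_eq - q)

t = np.linspace(0.0, 60.0, 200)               # minutes
q_eq = langmuir(35.0, q_max, K)               # equilibrium loading at 35 ppm
uptake = odeint(ldf, 0.0, t, args=(0.15, q_eq))
print(q_max, K, uptake[-1])                   # fitted parameters and near-saturation loading
```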
Abstract:
The goal of this paper is to study the global existence of small data solutions to the Cauchy problem for the nonlinear wave equation $u_{tt} - a(t)^2 \Delta u = u_t^2 - a(t)^2 |\nabla u|^2$. In particular we are interested in statements for the 1D case. We will explain how the interplay between the increasing and oscillating behavior of the coefficient will influence global existence of small data solutions. Copyright (C) 2011 John Wiley & Sons, Ltd.
Abstract:
We examine a problem with n players, each facing the same binary choice. One choice is superior to the other. The simple assumption of competition - that an individual's payoff falls with a rise in the number of players making the same choice - guarantees the existence of a unique symmetric equilibrium (involving mixed strategies). As n increases, there are two opposing effects. First, events in the middle of the distribution - where a player finds itself having made the same choice as many others - become more likely, but the payoffs in these events fall. In opposition, events in the tails of the distribution - where a player finds itself having made the same choice as few others - become less likely, but the payoffs in these events remain high. We provide a sufficient condition (strong competition) under which an increase in the number of players leads to a reduction in the equilibrium probability that the superior choice is made.
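The sketch below is not the paper's model; it merely illustrates the symmetric mixed equilibrium numerically for one concrete congestion payoff, assuming a choice of base value v is shared equally among the m players who pick it (payoff v/m), with a = 2 for the superior choice and b = 1 for the inferior one.

```python
# Hedged numerical illustration of a symmetric mixed equilibrium in an n-player
# binary-choice game with congestion; the payoff form v/m and values a = 2, b = 1
# are assumptions made for this sketch, not taken from the paper.
from math import comb
from scipy.optimize import brentq

def expected_payoff(v, p_same, n):
    """Expected payoff from a choice of base value v when each of the other
    n-1 players makes the same choice independently with probability p_same."""
    return sum(comb(n - 1, k) * p_same**k * (1 - p_same)**(n - 1 - k) * v / (k + 1)
               for k in range(n))

def equilibrium_probability(n, a=2.0, b=1.0):
    """Probability p of choosing the superior option (value a > b) that makes a
    player indifferent between the two choices."""
    indifference = lambda p: expected_payoff(a, p, n) - expected_payoff(b, 1 - p, n)
    return brentq(indifference, 1e-9, 1 - 1e-9)

for n in (3, 5, 10, 50):
    print(n, round(equilibrium_probability(n), 3))
```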
Abstract:
New differential linear coherent scattering coefficient, mu(CS), data for four biological tissue types (pork fat, chicken tendon, and adipose and fibroglandular human breast tissue), covering a large momentum transfer interval (0.07 <= q <= 70.5 nm(-1)) obtained by combining WAXS and SAXS data, are presented in order to emphasize the need to update the default database by including molecular interference and large-scale arrangement effects. The results showed that the differential linear coherent scattering coefficient is influenced by large-scale arrangements in the low momentum transfer region, mainly due to collagen fibrils for the chicken tendon and fibroglandular breast samples and to triacylglycerides for the pork fat and adipose breast samples, while at high momentum transfer mu(CS) reflects molecular interference effects related to water for the chicken tendon and fibroglandular samples and to fatty acids for the pork fat and adipose samples. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Although planning is important for the functioning of patients with dementia of the Alzheimer Type (DAT), little is known about response programming in DAT. This study used a cueing paradigm coupled with quantitative kinematic analysis to document the preparation and execution of movements made by a group of 12 DAT patients and their age- and sex-matched controls. Participants connected a series of targets placed upon a WACOM SD420 graphics tablet in response to the pattern of illumination of a set of light-emitting diodes (LEDs). In one condition, participants could programme the upcoming movement, whilst in another they were forced to reprogramme this movement on-line (i.e. they were not provided with advance information about the location of the upcoming target). DAT patients were found to have programming deficits, taking longer to initiate movements, particularly in the absence of cues. While problems spontaneously programming a movement might cause a greater reliance upon on-line guidance, when both groups were required to guide the movement on-line, DAT patients continued to show slower and less efficient movements, implying declining sensori-motor function; these differences were not simply due to strategy or medication status. (C) 1997 Elsevier Science Ltd.
Abstract:
The classification rules of linear discriminant analysis are defined by the true mean vectors and the common covariance matrix of the populations from which the data come. Because these true parameters are generally unknown, they are commonly estimated by the sample mean vector and covariance matrix of the data in a training sample randomly drawn from each population. However, these sample statistics are notoriously susceptible to contamination by outliers, a problem compounded by the fact that the outliers may be invisible to conventional diagnostics. High-breakdown estimation is a procedure designed to remove this cause for concern by producing estimates that are immune to serious distortion by a minority of outliers, regardless of their severity. In this article we motivate and develop a high-breakdown criterion for linear discriminant analysis and give an algorithm for its implementation. The procedure is intended to supplement rather than replace the usual sample-moment methodology of discriminant analysis either by providing indications that the dataset is not seriously affected by outliers (supporting the usual analysis) or by identifying apparently aberrant points and giving resistant estimators that are not affected by them.
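A minimal sketch of the general idea follows, substituting the Minimum Covariance Determinant (MCD) estimator as one example of a high-breakdown estimator (not necessarily the criterion developed in the article): robust group means and a robust pooled covariance replace the usual sample moments in the linear discriminant rule.

```python
# Hedged sketch of high-breakdown discriminant analysis using MCD estimates
# in place of the sample mean vectors and pooled covariance matrix.
import numpy as np
from sklearn.covariance import MinCovDet

def fit_robust_lda(X_list):
    """X_list: one (n_i, p) training array per population."""
    means, covs, n_total = [], [], 0
    for X in X_list:
        mcd = MinCovDet(random_state=0).fit(X)
        means.append(mcd.location_)
        covs.append(len(X) * mcd.covariance_)
        n_total += len(X)
    pooled = sum(covs) / n_total                      # robust pooled covariance
    return np.array(means), np.linalg.inv(pooled)

def classify(x, means, prec):
    # assign to the population with the largest linear discriminant score (equal priors)
    scores = [m @ prec @ x - 0.5 * m @ prec @ m for m in means]
    return int(np.argmax(scores))

# toy data with a few gross outliers contaminating group 0
rng = np.random.default_rng(1)
X0 = np.vstack([rng.normal(0, 1, (95, 2)), rng.normal(15, 1, (5, 2))])
X1 = rng.normal(3, 1, (100, 2))
means, prec = fit_robust_lda([X0, X1])
print(classify(np.array([0.5, 0.5]), means, prec))    # expected: 0
```

Comparing the robust estimates with the ordinary sample moments on the same data is one way to obtain the diagnostic use described above: close agreement supports the usual analysis, while large discrepancies flag apparently aberrant points.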