941 results for Nonlinear programming model


Relevance:

80.00%

Publisher:

Abstract:

We consider modifications of the nonlinear Schrödinger (NLS) model to look at the recently introduced concept of quasi-integrability. We show that such models possess an infinite number of quasi-conserved charges which present intriguing properties in relation to very specific space-time parity transformations. For the case of two-soliton solutions where the fields are eigenstates of this parity, those charges are asymptotically conserved in the scattering process of the solitons: even though the charges vary in time, their values in the far past and the far future are the same. Such results are obtained through analytical and numerical methods, and employ adaptations of algebraic techniques used in integrable field theories. Our findings may have important consequences for the applications of these models in several areas of non-linear science. We make a detailed numerical study of the modified NLS potential of the form $V \sim (|\psi|^2)^{2+\epsilon}$, with $\epsilon$ a perturbation parameter. We perform numerical simulations of the scattering of solitons for this model and find good agreement with the results predicted by the analytical considerations. Our paper shows that the quasi-integrability concepts recently proposed in the context of modifications of the sine-Gordon model remain valid for perturbations of the NLS model.
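Since the abstract describes numerical simulations of soliton scattering in the modified model, here is a minimal split-step Fourier sketch of such a simulation. The normalization of the equation, the grid, and all parameter values are assumptions for illustration, not the paper's actual setup; the nonlinear term is taken as the derivative of the stated potential $(|\psi|^2)^{2+\epsilon}$, up to a constant.

```python
# Minimal split-step Fourier sketch of soliton scattering in a modified NLS,
#   i psi_t + (1/2) psi_xx + (|psi|^2)^(1+eps) psi = 0,
# where the nonlinearity is d/d|psi|^2 of (|psi|^2)^(2+eps) up to a constant.
# Normalization, grid, and parameters are illustrative assumptions.
import numpy as np

N, L, eps = 2048, 200.0, 0.06
dx = L / N
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

v = 1.0  # initial soliton speed
psi = (1 / np.cosh(x + 30) * np.exp(1j * v * x)      # right-moving soliton
       + 1 / np.cosh(x - 30) * np.exp(-1j * v * x))  # left-moving soliton

dt, steps = 0.005, 12000
lin = np.exp(-0.5j * k**2 * dt)                      # linear step in Fourier space
for _ in range(steps):
    psi *= np.exp(0.5j * dt * np.abs(psi)**(2 * (1 + eps)))  # half nonlinear step
    psi = np.fft.ifft(lin * np.fft.fft(psi))                 # full linear step
    psi *= np.exp(0.5j * dt * np.abs(psi)**(2 * (1 + eps)))  # half nonlinear step

# The norm is exactly conserved; quasi-conserved charges could be monitored
# along the run in the same way to observe their asymptotic recovery.
print("norm:", np.sum(np.abs(psi)**2) * dx)
```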

Relevance:

80.00%

Publisher:

Abstract:

Augmented Lagrangian methods are effective tools for solving large-scale nonlinear programming problems. At each outer iteration, a minimization subproblem with simple constraints, whose objective function depends on updated Lagrange multipliers and penalty parameters, is approximately solved. When the penalty parameter becomes very large, solving the subproblem becomes difficult; therefore, the effectiveness of this approach is associated with the boundedness of the penalty parameters. In this paper, it is proved that, under assumptions more natural than those employed until now, penalty parameters are bounded. For proving the new boundedness result, the original algorithm has been slightly modified. Numerical consequences of the modifications are discussed and computational experiments are presented.
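As a sketch of the method the abstract describes, the outer iteration below minimizes an augmented Lagrangian for a toy equality-constrained problem, applies the first-order multiplier update, and increases the penalty parameter only when infeasibility does not shrink enough (the mechanism whose boundedness the paper studies). The toy problem, tolerances, and update constants are illustrative assumptions, not the paper's algorithm.

```python
# Minimal augmented Lagrangian sketch for min f(x) s.t. h(x) = 0.
# Problem data and update rules are illustrative only.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2)**2 + (x[1] + 1)**2     # toy objective
h = lambda x: np.array([x[0] + x[1] - 1.0])     # toy equality constraint

x, lam, rho = np.zeros(2), np.zeros(1), 10.0
for k in range(20):
    aug = lambda x: f(x) + lam @ h(x) + 0.5 * rho * h(x) @ h(x)
    prev_infeas = np.linalg.norm(h(x))
    x = minimize(aug, x, method="BFGS").x       # approximate subproblem solve
    lam = lam + rho * h(x)                      # first-order multiplier update
    if np.linalg.norm(h(x)) > 0.5 * prev_infeas:
        rho *= 10.0                             # grow the penalty only if needed
    if np.linalg.norm(h(x)) < 1e-8:
        break

print(x, lam, rho)
```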

Relevance:

80.00%

Publisher:

Abstract:

A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered the industrial needs relevant to ESA and the main European space stakeholders, and consolidated them into a set of high-level technical requirements. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, assumptions and constraints of the computational model; (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model rests on: (i) rigorous separation of concerns, achieved with support for design views and by careful allocation of concerns to dedicated software entities; (ii) support for specification and model-based analysis of extra-functional properties; (iii) the inclusion of space-specific concerns.
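As a loose illustration of constituent (i), the sketch below keeps a component's functional interfaces separate from declarative extra-functional annotations that a static analysis can consume, echoing the separation-of-concerns idea. All names and attributes are hypothetical and far simpler than the thesis's actual component model.

```python
# Illustrative component-model sketch: functional code vs. declarative
# extra-functional annotations. Every name here is hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ExtraFunctional:
    period_ms: int      # activation period assumed by the computational model
    wcet_ms: int        # worst-case execution time budget
    deadline_ms: int

@dataclass
class Component:
    name: str
    provided: Dict[str, Callable] = field(default_factory=dict)  # provided interface
    required: Dict[str, Callable] = field(default_factory=dict)  # required interface
    annotations: Dict[str, ExtraFunctional] = field(default_factory=dict)

    def bind(self, service: str, impl: Callable) -> None:
        """Resolve a required interface at composition time."""
        self.required[service] = impl

def utilization(components) -> float:
    """Toy schedulability figure computed from annotations, not from code."""
    return sum(a.wcet_ms / a.period_ms
               for c in components for a in c.annotations.values())
```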

Relevance:

80.00%

Publisher:

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmers' responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages were proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It centers on OpenMP, the de-facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. The second part of the thesis focuses on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude gains in speedup and energy efficiency over the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability, and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU), which shares memory banks with the cores, is proposed, and a template for a scalable architecture that integrates the HWPUs through the shared-memory system is shown. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with them.
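As one concrete instance of the data-partitioning algorithms mentioned above, the sketch below reproduces the arithmetic of an OpenMP-style static schedule, which splits an iteration range into near-equal contiguous chunks, one per thread. It is written in Python for consistency with the other examples and is purely illustrative of the partitioning logic, not the thesis's runtime code.

```python
# Sketch of OpenMP-style static loop partitioning: iteration range [0, n)
# split into near-equal contiguous chunks, one per thread.
def static_chunk(n: int, num_threads: int, tid: int) -> range:
    base, extra = divmod(n, num_threads)
    # The first `extra` threads each take one extra iteration.
    start = tid * base + min(tid, extra)
    end = start + base + (1 if tid < extra else 0)
    return range(start, end)

# Example: 10 iterations over 4 threads -> chunk sizes 3, 3, 2, 2.
print([list(static_chunk(10, 4, t)) for t in range(4)])
```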

Relevance:

80.00%

Publisher:

Abstract:

The conclusion of the Doha Round negotiations is likely to influence Swiss agricultural policy substantially. The same goes for a free trade agreement in agriculture and food with the European Communities. Even though neither of them will bring about duty-free and quota-free market access, or restrict domestic support measures to green box-compatible support, both would represent a big step in that direction. There is no empirical evidence on the effects of such a counterfactual scenario for Swiss agriculture. We therefore use a normative mathematical programming model to illustrate possible effects on agricultural production and the corresponding agricultural income. Moreover, we discuss the results with respect to the provision of public goods under the assumption of continuing green box-compatible direct payments. The aim of our article is to bring more transparency into the discussion of the effects of freer and less distorted trade on income generation by a multifunctional agriculture. The article is organized as follows. In Section 1 we specify the background of our study. In Section 2 we state the problem and our research questions. In Section 3 we describe in detail a counterfactual scenario of "duty-free, quota-free and price support-free" agriculture from an economic as well as a legal perspective. Our methodology and results are presented in Sections 4 and 5, respectively. In Section 6 we discuss our results with respect to economic and legal aspects of multifunctional agriculture.

Relevance:

80.00%

Publisher:

Abstract:

Clustered data analysis is characterized by the need to describe both systematic variation in a mean model and cluster-dependent random variation in an association model. Marginalized multilevel models embrace the robustness and interpretations of a marginal mean model, while retaining the likelihood inference capabilities and flexible dependence structures of a conditional association model. Although there has been increasing recognition of the attractiveness of marginalized multilevel models, there has been a gap in their practical application arising from a lack of readily available estimation procedures. We extend the marginalized multilevel model to allow for nonlinear functions in both the mean and association aspects. We then formulate marginal models through conditional specifications to facilitate estimation with mixed model computational solutions already in place. We illustrate this approach on a cerebrovascular deficiency crossover trial.
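For orientation, a marginalized multilevel model of this kind is often written (in generic notation, not necessarily the authors' exact specification) as a marginal mean model paired with a conditional association model, linked through an implicitly defined intercept:

```latex
\begin{align*}
g\big(\mathrm{E}[Y_{ij}]\big) &= \mathbf{x}_{ij}^{\top}\boldsymbol{\beta}
  && \text{(marginal mean model)}\\
g\big(\mathrm{E}[Y_{ij}\mid \mathbf{b}_i]\big) &= \Delta_{ij}
  + \mathbf{z}_{ij}^{\top}\mathbf{b}_i,
  \qquad \mathbf{b}_i \sim N(\mathbf{0},\boldsymbol{\Sigma})
  && \text{(conditional association model)}
\end{align*}
```

Here $\Delta_{ij}$ is determined by requiring the conditional model, averaged over the random effects $\mathbf{b}_i$, to reproduce the marginal mean; formulating the marginal model through this conditional specification is what allows existing mixed-model machinery to be reused for estimation.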

Relevance:

80.00%

Publisher:

Abstract:

In evaluating the accuracy of diagnostic tests, it is common to apply two imperfect tests jointly or sequentially to a study population. In a recent meta-analysis of the accuracy of microsatellite instability testing (MSI) and traditional mutation analysis (MUT) in predicting germline mutations of the mismatch repair (MMR) genes, a Bayesian approach (Chen, Watson, and Parmigiani 2005) was proposed to handle missing data resulting from partial testing and the lack of a gold standard. In this paper, we demonstrate an improved estimation of the sensitivities and specificities of MSI and MUT by using a nonlinear mixed model and a Bayesian hierarchical model, both of which account for the heterogeneity across studies through study-specific random effects. The methods can be used to estimate the accuracy of two imperfect diagnostic tests in other meta-analyses when the prevalence of disease, the sensitivities and/or the specificities of diagnostic tests are heterogeneous among studies. Furthermore, simulation studies have demonstrated the importance of carefully selecting appropriate random effects for estimating diagnostic accuracy measurements in this scenario.
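In generic notation (not necessarily the paper's exact parameterization), the study-level random effects enter the accuracy parameters on the logit scale, so that each study $k$ has its own sensitivity and specificity varying around population means:

```latex
\operatorname{logit}(Se_k) = \mu_{Se} + u_k,\quad u_k \sim N(0,\sigma_{Se}^2),
\qquad
\operatorname{logit}(Sp_k) = \mu_{Sp} + v_k,\quad v_k \sim N(0,\sigma_{Sp}^2)
```

Setting $\sigma_{Se}^2 = \sigma_{Sp}^2 = 0$ recovers a fixed-accuracy model, which is the contrast the simulation studies probe when they examine the consequences of mis-specifying the random effects.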

Relevance:

80.00%

Publisher:

Abstract:

The objective of this research was to develop a high-fidelity dynamic model of a parafoil-payload system with respect to its application in the Ship Launched Aerial Delivery System (SLADS). SLADS is a concept in which cargo can be transferred from ship to shore using a parafoil-payload system. It is accomplished in two phases: an initial towing phase, when the glider follows the towing vessel in a passive lift mode, and an autonomous gliding phase, when the system is guided to the desired point. While many previous researchers have analyzed the parafoil-payload system when it is released from another airborne vehicle, limited work has been done in the area of towing the system up from ground or sea. One of the main contributions of this research was the development of a nonlinear dynamic model of a towed parafoil-payload system. After an extensive literature review of existing methods of modeling a parafoil-payload system, a five degree-of-freedom model was developed. The inertial and geometric properties of the system were investigated to predict accurate results in the simulation environment. Since extensive research has been done on determining the aerodynamic characteristics of a paraglider, an existing aerodynamic model was chosen to incorporate the effects of air flow around the flexible paraglider wing. During the towing phase, it is essential that the parafoil-payload system follow the line of the towing vessel's path to prevent an unstable flight condition called 'lockout'. A detailed study of the causes of lockout, its mathematical representation, and the flight conditions and parameters related to lockout constitutes another contribution of this work. A linearized model of the parafoil-payload system was developed and used to analyze the stability of the system about equilibrium conditions. The relationship between the control surface inputs and stability was investigated. In addition to stability of flight, another important objective of SLADS is to tow up the parafoil-payload system as fast as possible. The tension in the tow cable is directly proportional to the rate of ascent of the parafoil-payload system, but lockout is more likely when tow tensions are large. Thus there is a tradeoff between susceptibility to lockout and rapid deployment. Control strategies were also developed for optimal tow-up and to maintain stability in the event of disturbances.
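In generic terms (notation illustrative, not the dissertation's), the stability analysis mentioned above proceeds by linearizing the nonlinear dynamics $\dot{x} = f(x, u)$ about an equilibrium $(x_e, u_e)$:

```latex
\dot{\delta x} = A\,\delta x + B\,\delta u,
\qquad
A = \left.\frac{\partial f}{\partial x}\right|_{(x_e,\,u_e)},
\quad
B = \left.\frac{\partial f}{\partial u}\right|_{(x_e,\,u_e)}
```

Local stability then requires every eigenvalue of $A$ to have a negative real part, while $B$ captures how control-surface inputs perturb the equilibrium; studying how these matrices change with tow tension is one way to quantify the lockout tradeoff described above.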

Relevance:

80.00%

Publisher:

Abstract:

Small-scale farmers in the Chipata District of Zambia rely on their farm fields to grow maize and groundnuts for food security. Cotton production and surplus food-security crops are used to generate income to provide for their families. With increasing population pressure, available land has decreased and farmers struggle to provide the necessary food requirements and income to meet their family's needs. The purpose of the study was to determine how a farmer constrained by labor and capital resources can best allocate his land among maize, groundnuts and cotton to generate the highest potential for food security and financial gain. Data from the 2008-2009 growing season were compiled and analyzed using a linear programming model. The study determined that farmers make the most profit by allocating all additional land and resources to cotton after meeting their minimum food-security requirements. The study suggests growing cotton is a beneficial practice for small-scale subsistence farmers to generate income when restricted by limited resources.
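The land-allocation decision can be written as a small linear program: maximize net income subject to land, labor, and capital constraints and to minimum food-security plantings. The sketch below uses scipy with entirely hypothetical coefficients, not the study's data; with these invented numbers, the solution sends all residual resources to cotton, mirroring the study's qualitative finding.

```python
# Illustrative LP for the land-allocation idea. All numbers are hypothetical.
from scipy.optimize import linprog

# Decision variables: hectares of maize, groundnuts, cotton.
profit = [120.0, 150.0, 300.0]          # net income per hectare (hypothetical)
c = [-p for p in profit]                # linprog minimizes, so negate

A_ub = [[1, 1, 1],                      # land:    total hectares available
        [40, 55, 70],                   # labor:   person-days per hectare
        [30, 45, 60]]                   # capital: cash input per hectare
b_ub = [3.0, 180.0, 150.0]

# Minimum food-security production, expressed as minimum hectares planted.
bounds = [(0.8, None), (0.4, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)                  # optimal allocation and total income
```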

Relevance:

80.00%

Publisher:

Abstract:

Simulations of forest stand dynamics in a modelling framework such as the Forest Vegetation Simulator (FVS) are diameter driven, so the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies is presented that critically evaluates the sub-model for change in individual tree basal diameter used in FVS, a dominant forestry model in the Great Lakes region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also affirms that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual and philosophical differences among the individual tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches to diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and neither is conceptually different from a biological perspective, even though their model forms look different. No matter what modelling approach is used, the base function is the foundation of an increment model. Two base functions, gamma and Box-Lucas, were identified as candidate base functions for forestry applications. A comparative analysis of empirical fits showed that the quality of fit is essentially similar, and both are sufficiently detailed and flexible for forestry applications. The choice of base function for modelling diameter or basal area increment therefore comes down to personal preference; however, the gamma base function may be preferred over the Box-Lucas, as it fits the periodic increment data in both a linear and a nonlinear composite model form. Finally, the utility of site index as a predictor variable is criticized: it has been widely used in models for complex, mixed-species forest stands even though it is not well suited to this purpose. An alternative to site index in an increment model was explored, comparing site index against a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites, using data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
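To make the two candidate base functions concrete, the sketch below fits both to synthetic diameter-increment data: a gamma-shaped form and one common Box-Lucas parameterization. The data, functional forms as written, and starting values are invented for illustration, not taken from the dissertation.

```python
# Illustrative fit of two candidate base functions for diameter increment;
# data and parameter values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def gamma_base(d, a, b, c):
    """Gamma-shaped base function: rises, peaks, then declines with dbh."""
    return a * d**b * np.exp(-c * d)

def box_lucas(d, a, b, c):
    """One common Box-Lucas form: difference of two exponentials (b < c)."""
    return a * (np.exp(-b * d) - np.exp(-c * d))

rng = np.random.default_rng(0)
dbh = np.linspace(2, 60, 200)                        # cm
incr = gamma_base(dbh, 0.5, 0.9, 0.06) + rng.normal(0, 0.05, dbh.size)

p_gamma, _ = curve_fit(gamma_base, dbh, incr, p0=[0.5, 1.0, 0.05])
p_bl, _ = curve_fit(box_lucas, dbh, incr, p0=[5.0, 0.05, 0.2], maxfev=20000)
print("gamma params:    ", p_gamma)
print("box-lucas params:", p_bl)
```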

Relevance:

80.00%

Publisher:

Abstract:

Asthma is an increasing health problem worldwide, but the long-term temporal pattern of clinical symptoms is not understood and predicting asthma episodes is not generally possible. We analyse the time series of peak expiratory flows, a standard measurement of airway function that has been assessed twice daily in a large asthmatic population during a long-term crossover clinical trial. Here we introduce an approach to predict the risk of worsening airflow obstruction by calculating the conditional probability that, given the current airway condition, a severe obstruction will occur within 30 days. We find that, compared with a placebo, a regular long-acting bronchodilator (salmeterol) that is widely used to improve asthma control decreases the risk of airway obstruction. Unexpectedly, however, a regular short-acting beta2-agonist bronchodilator (albuterol) increases this risk. Furthermore, we find that the time series of peak expiratory flows show long-range correlations that change significantly with disease severity, approaching a random process with increased variability in the most severe cases. Using a nonlinear stochastic model, we show that both the increased variability and the loss of correlations augment the risk of unstable airway function. The characterization of fluctuations in airway function provides a quantitative basis for objective risk prediction of asthma episodes and for evaluating the effectiveness of therapy.
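The conditional-probability construction can be sketched directly: bin the current peak expiratory flow (PEF), and for each bin estimate the empirical probability that the series falls below a severe threshold within the next 30 days. The series, threshold, and bins below are synthetic stand-ins for the trial data.

```python
# Sketch of the conditional-risk idea: P(severe obstruction within 30 days
# | current PEF). All data and thresholds are synthetic.
import numpy as np

rng = np.random.default_rng(1)
pef = 400 + np.cumsum(rng.normal(0, 5, 2000))   # synthetic twice-daily series
pef = np.clip(pef, 150, 650)

severe = 250.0        # severe-obstruction threshold (illustrative)
horizon = 60          # 30 days at two measurements per day

def conditional_risk(series, lo, hi):
    """P(min over next `horizon` values < severe | current value in [lo, hi))."""
    hits, total = 0, 0
    for t in range(len(series) - horizon):
        if lo <= series[t] < hi:
            total += 1
            hits += series[t + 1:t + 1 + horizon].min() < severe
    return hits / total if total else float("nan")

for lo in (250, 300, 350, 400):
    print(f"PEF in [{lo}, {lo + 50}): risk = {conditional_risk(pef, lo, lo + 50)}")
```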

Relevance:

80.00%

Publisher:

Abstract:

More than 500 endemic haplochromine cichlid species inhabit Lake Victoria. This striking species diversity is a classical example of recent explosive adaptive radiation, thought to have happened within the last ~15,000 years. In this study, we examined the population structure and historical demography of 3 pelagic haplochromine cichlid species that resemble one another in morphology and occupy similar niches: Haplochromis (Yssichromis) laparogramma, Haplochromis (Y.) pyrrhocephalus, and Haplochromis (Y.) sp. "glaucocephalus". We investigated the sequences of the mitochondrial DNA control region and the insertion patterns of short interspersed elements (SINEs) of 759 individuals. We show that sympatric forms are genetically differentiated in 4 of 6 cases, but we also found apparent weakening of the genetic differentiation in areas with turbid water. We estimated the timings of population expansion and species divergence to coincide with the refilling of the lake at the Pleistocene/Holocene boundary. We also found that estimates can be altered significantly by the choice of the shape of the molecular clock: if we employ a nonlinear clock model of evolutionary rates, in which rates are higher towards the recent past, the population expansion is dated around the desiccation of the lake ca. 17,000 YBP. Thus, we succeeded in clarifying the species and population structure of closely related Lake Victoria cichlids and in showing the importance of applying appropriate clock calibrations in elucidating recent evolutionary events.

Relevance:

80.00%

Publisher:

Abstract:

A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch and bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff.

The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. Supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that staff type's ability to perform the job functions of an RN (e.g., eight hours of RN time = 8 points, LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints.

Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
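The allocation model can be sketched as a small integer program: minimize staff headcount subject to covering each unit's acuity points, with RNs worth 8 points and LVNs 6 per eight-hour shift as in the abstract, plus minimum-RN and availability constraints. The unit demands and staffing numbers below are hypothetical, and scipy's MILP solver stands in for the dissertation's branch and bound implementation.

```python
# Illustrative integer program in the spirit of the described system.
# All numbers are hypothetical.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

units = 3
value = {"RN": 8, "LVN": 6}                 # points delivered per 8-hour shift
demand = [30, 22, 18]                       # acuity points required per unit
min_rn = [2, 1, 1]                          # minimum RNs per unit
avail = {"RN": 8, "LVN": 6}                 # staff available per type

# Variable order: x = [RN_u1, RN_u2, RN_u3, LVN_u1, LVN_u2, LVN_u3]
c = np.ones(2 * units)                      # minimize headcount (penalty weights = 1)

A, lb, ub = [], [], []
for u in range(units):                      # acuity coverage per unit
    row = np.zeros(2 * units)
    row[u], row[units + u] = value["RN"], value["LVN"]
    A.append(row); lb.append(demand[u]); ub.append(np.inf)
for u in range(units):                      # minimum RN coverage per unit
    row = np.zeros(2 * units); row[u] = 1
    A.append(row); lb.append(min_rn[u]); ub.append(np.inf)
row = np.zeros(2 * units); row[:units] = 1  # RN availability
A.append(row); lb.append(0); ub.append(avail["RN"])
row = np.zeros(2 * units); row[units:] = 1  # LVN availability
A.append(row); lb.append(0); ub.append(avail["LVN"])

res = milp(c, constraints=LinearConstraint(np.array(A), lb, ub),
           integrality=np.ones(2 * units), bounds=Bounds(0, np.inf))
print(res.x.reshape(2, units), res.fun)     # staff per unit and total headcount
```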

Relevance:

80.00%

Publisher:

Abstract:

Vertebral compression fracture is a common medical problem in osteoporotic individuals. The quantitative computed tomography (QCT)-based finite element (FE) method may be used to predict vertebral strength in vivo, but needs to be validated with experimental tests. The aim of this study was to validate a nonlinear, anatomy-specific QCT-based FE model by using a novel testing setup. Thirty-seven human thoracolumbar vertebral bone slices were prepared by removing cortical endplates and posterior elements. The slices were scanned with QCT and the volumetric bone mineral density (vBMD) was computed with the standard clinical approach. A novel experimental setup was designed to induce a realistic failure in the vertebral slices in vitro. Rotation of the loading plate was allowed by means of a ball joint. To minimize device compliance, the specimen deformation was measured directly on the loading plate with three sensors. A nonlinear FE model was generated from the calibrated QCT images, and the computed vertebral stiffness and strength were compared to those measured during the experiments. In agreement with clinical observations, most of the vertebrae underwent an anterior wedge-shape fracture. As expected, the FE method predicted both stiffness and strength better than vBMD (R² improved from 0.27 to 0.49 and from 0.34 to 0.79, respectively). Despite the lack of fitting parameters, the linear regression of the FE prediction for strength was close to the 1:1 relation, with slope close to one (0.86) and intercept close to zero (0.72 kN). In conclusion, a nonlinear FE model was successfully validated through a novel experimental technique for generating wedge-shape fractures in human thoracolumbar vertebrae.

Relevance:

80.00%

Publisher:

Abstract:

Data compiled within the IMPENSO project. The Impact of ENSO on Sustainable Water Management and the Decision-Making Community at a Rainforest Margin in Indonesia (IMPENSO), http://www.gwdg.de/~impenso, was a German-Indonesian research project (2003-2007) that studied the impact of ENSO (El Niño-Southern Oscillation) on the water resources and agricultural production in the Palu River watershed in Central Sulawesi, Indonesia. ENSO is a climate variability that causes serious droughts in Indonesia and other countries of South-East Asia. The last ENSO event occurred in 1997. As in other regions, many farmers in Central Sulawesi suffered from reduced crop yields and lost their livestock. A better prediction of ENSO and the development of coping strategies would help local communities mitigate the impact of ENSO on rural livelihoods and food security. The project consists of three interrelated sub-projects, which study the local and regional manifestation of ENSO using the regional climate models REMO and GESIMA (Sub-project A), quantify the impact of ENSO on the availability of water for agriculture and other uses using the distributed hydrological model WaSiM-ETH (Sub-project B), and analyze the socio-economic impact and the policy implications of ENSO on the basis of a production function analysis, a household vulnerability analysis, and a linear programming model (Sub-project C). The models used in the three sub-projects will be integrated to simulate joint scenarios that are defined in collaboration with local stakeholders and are relevant for the design of coping strategies.