22 results for optimization model

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

60.00%

Publisher:

Abstract:

Background: Recent work on the complexity of life highlights the roles played by evolutionary forces at different levels of individuality. One of the central puzzles in explaining transitions in individuality for entities ranging from complex cells to multicellular organisms and societies is how different autonomous units relinquish control over their functions to others in the group. In addition to the necessity of reducing conflict over effecting specialized tasks, differentiating groups must control the exploitation of the commons, or else be out-competed by more fit groups.

Results: We propose that two forms of conflict – access to resources within groups and representation in germ line – may be resolved in tandem through individual and group-level selective effects. Specifically, we employ an optimization model to show the conditions under which different within-group social behaviors (cooperators producing a public good or cheaters exploiting the public good) may be selected to disperse, thereby not affecting the commons and functioning as germ line. We find that partial or complete dispersal specialization of cheaters is a general outcome. The propensity for cheaters to disperse is highest with intermediate benefit:cost ratios of cooperative acts and with high relatedness. An examination of a range of real biological systems tends to support our theory, although additional study is required to provide robust tests.

Conclusion: We suggest that trait linkage between dispersal and cheating should be operative regardless of whether groups ever achieve higher levels of individuality, because individual selection will always tend to increase exploitation, and stronger group structure will tend to increase overall cooperation through kin-selected benefits. Cheater specialization as dispersers offers simultaneous solutions to the evolution of cooperation in social groups and the origin of specialization of germ and soma in multicellular organisms.

Relevance:

40.00%

Publisher:

Abstract:

Laser tissue soldering (LTS) is a promising technique for tissue fusion based on a heat-denaturation process of proteins. Thermal damage of the fused tissue during the laser procedure has always been an important and challenging problem. Particularly in LTS of arterial blood vessels, strong heating of the endothelium should be avoided to minimize the risk of thrombosis. A precise knowledge of the temperature distribution within the vessel wall during laser irradiation is therefore indispensable. The authors developed a finite element model (FEM) to simulate the temperature distribution within blood vessels during LTS. Temperature measurements were used to verify and calibrate the model. Different parameters such as laser power, solder absorption coefficient, thickness of the solder layer, cooling of the vessel, and continuous vs. pulsed energy deposition were tested to elucidate their impact on the temperature distribution within the soldering joint, in order to reduce the amount of further animal experiments. Pulsed irradiation with high laser power and a highly absorbing solder yields the best results.
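
As a rough illustration of the kind of heat-transport calculation involved (not the finite element model described above), the following one-dimensional explicit finite-difference sketch propagates heat from an absorbing solder layer into the vessel wall. Every parameter value (thermal diffusivity, absorption coefficient, irradiance, layer thickness) is a placeholder rather than a value from the study.

# Minimal 1-D explicit finite-difference sketch of heat diffusion during
# laser soldering; illustrative placeholder values, not the authors' FEM.
import numpy as np

nx, dx = 200, 5e-6            # 200 nodes, 5 um spacing (~1 mm total depth)
alpha = 1.4e-7                # thermal diffusivity of soft tissue [m^2/s] (approx.)
dt = 0.4 * dx**2 / alpha      # time step respecting the explicit stability limit
mu_a = 5e3                    # solder absorption coefficient [1/m] (placeholder)
irradiance = 2e4              # laser irradiance at the surface [W/m^2] (placeholder)
rho_c = 3.6e6                 # volumetric heat capacity [J/(m^3 K)] (approx.)

x = np.arange(nx) * dx
T = np.full(nx, 37.0)                       # start at body temperature [degC]
solder = x < 100e-6                         # first 100 um is the solder layer
source = np.where(solder, mu_a * irradiance * np.exp(-mu_a * x), 0.0)  # Beer-Lambert deposition

t, t_end = 0.0, 5.0                         # simulate 5 s of continuous irradiation
while t < t_end:
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T += dt * (alpha * lap + source / rho_c)
    T[0] = T[-1] = 37.0                     # crude fixed-temperature boundaries (e.g. cooling)
    t += dt

print(f"peak temperature after {t_end:.0f} s: {T.max():.1f} degC "
      f"at depth {x[T.argmax()]*1e6:.0f} um")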

Relevance:

30.00%

Publisher:

Abstract:

Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
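
The shape metric quoted above, the ratio of smallest to largest moment of inertia of the tumor material, can be computed directly from a segmented tumor volume. The following sketch assumes a binary 3-D mask with isotropic voxels and unit mass per voxel; it illustrates the metric only, not the coupled simulator.

# Sketch: ratio of smallest to largest principal moment of inertia of a
# binary 3-D tumor mask (isotropic voxels, unit mass per voxel).
import numpy as np

def inertia_ratio(mask):
    coords = np.argwhere(mask).astype(float)          # voxel coordinates of tumor material
    coords -= coords.mean(axis=0)                     # move origin to the center of mass
    x, y, z = coords[:, 0], coords[:, 1], coords[:, 2]
    # Inertia tensor for unit point masses at the voxel centers.
    I = np.array([
        [np.sum(y**2 + z**2), -np.sum(x * y),       -np.sum(x * z)],
        [-np.sum(x * y),       np.sum(x**2 + z**2), -np.sum(y * z)],
        [-np.sum(x * z),      -np.sum(y * z),        np.sum(x**2 + y**2)],
    ])
    moments = np.linalg.eigvalsh(I)                   # principal moments, ascending
    return moments[0] / moments[-1]                   # ~1.0 for a sphere, smaller for elongated shapes

# Toy check: an ellipsoidal mask is clearly less isotropic than a spherical one.
zz, yy, xx = np.mgrid[-20:21, -20:21, -20:21]
sphere = xx**2 + yy**2 + zz**2 <= 15**2
ellipsoid = (xx / 20.0)**2 + (yy / 10.0)**2 + (zz / 10.0)**2 <= 1.0
print(inertia_ratio(sphere), inertia_ratio(ellipsoid))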

Relevance:

30.00%

Publisher:

Abstract:

The hERG voltage-gated potassium channel mediates the cardiac I(Kr) current, which is crucial for the duration of the cardiac action potential. Undesired block of the channel by certain drugs may prolong the QT interval and increase the risk of malignant ventricular arrhythmias. Although the molecular determinants of hERG block have been intensively studied, not much is known about its stereoselectivity. Levo-(S)-bupivacaine was the first drug reported to have a higher affinity to block hERG than its enantiomer. This study strives to understand the principles underlying the stereoselectivity of bupivacaine block with the help of mutagenesis analyses and molecular modeling simulations. Electrophysiological measurements of mutated hERG channels allowed for the identification of residues involved in bupivacaine binding and stereoselectivity. Docking and molecular mechanics simulations for both enantiomers of bupivacaine and terfenadine (a non-stereoselective blocker) were performed inside an open-state model of the hERG channel. The predicted binding modes enabled a clear depiction of ligand-protein interactions. Estimated binding affinities for both enantiomers were consistent with electrophysiological measurements. A similar computational procedure was applied to bupivacaine enantiomers towards two mutated hERG channels (Tyr652Ala and Phe656Ala). This study confirmed, at the molecular level, that bupivacaine stereoselectively binds the hERG channel. These results help to lay the foundation for structural guidelines to optimize the cardiotoxic profile of drug candidates in silico.

Relevance:

30.00%

Publisher:

Abstract:

During osteoporosis induction in sheep, side effects of the steroids were observed in previous studies. The aim of this study was to improve the induction regimen consisting of ovariectomy, a calcium/vitamin D-restricted diet and methylprednisolone (MP) medication with respect to bone metabolism and to reduce the adverse side effects. Thirty-six ewes (age 6.5 +/- 0.6 years) were divided into four MP-administration groups (n = 9) with a total dose of 1800 mg MP: group 1: 20 mg/day, group 2: 60 mg/every third day, group 3: 3 x 500 mg and 1 x 300 mg at intervals of three weeks, group 4: weekly administration, starting at 70 mg and weekly reduction by 10 mg. After double-labelling with Calcein Green and Xylenol Orange, bone biopsy specimens were taken from the iliac crest (IC) at the beginning and four weeks after the last MP injection, and additionally from the vertebral body (VB) at the end of the experiment. Bone samples were processed into stained and fluorescent sections, and static and dynamic measurements were performed. There were no significant differences in static parameters between the groups initially. The bone perimeter and bone area values were significantly higher in the VB than in the IC (Pm: 26%, p < 0.0001; Ar: 11%, p < 0.0166). A significant decrease (20%) of the bone area was observed after corticosteroid-induced osteoporosis (p < 0.0004). For the dynamic parameters, no significant difference between the groups was found. The presence of Calcein Green and Xylenol Orange labels was noted in 50% of the biopsies in the IC and in 100% in the VB. Group 3 showed the lowest prevalence of adverse side effects. Bone metabolism changes were observed in all four groups, and the VB bone metabolism was higher than that of the IC. In conclusion, when using equal amounts of steroids, adverse side effects can be reduced by decreasing the number of administrations without reducing the effect regarding corticosteroid-induced osteoporosis. This information is useful for reducing the discomfort of the animals in this sheep model of corticosteroid-induced osteoporosis.

Relevance:

30.00%

Publisher:

Abstract:

Many methodologies dealing with the prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass–spring networks or finite element methods (FEM). On the other hand, methodologies working directly on the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
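
The chapter's hierarchical MRF and its GPU implementation are not reproduced here. The following CPU-only sketch of a plain Markov random field labeling step (iterated conditional modes, combining an intensity data term with a Potts smoothness prior) only illustrates the idea of assigning material classes directly from image values; the image, the class means and the weight beta are invented.

# Toy illustration of MRF-style labeling of material classes from image
# intensities (iterated conditional modes); not the chapter's hierarchical
# MRF or its GPU implementation.
import numpy as np

def icm_labeling(image, class_means, beta=1.0, iterations=5):
    """Assign each pixel a material class by minimizing a data term
    (squared distance to the class mean intensity) plus a Potts smoothness
    term that penalizes label disagreement with the 4-neighborhood."""
    labels = np.argmin((image[..., None] - class_means) ** 2, axis=-1)
    h, w = image.shape
    for _ in range(iterations):
        for i in range(h):
            for j in range(w):
                neighbors = [labels[i + di, j + dj]
                             for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                             if 0 <= i + di < h and 0 <= j + dj < w]
                costs = [(image[i, j] - m) ** 2
                         + beta * sum(1 for n in neighbors if n != k)
                         for k, m in enumerate(class_means)]
                labels[i, j] = int(np.argmin(costs))
    return labels

# Example: noisy two-material image (e.g. soft tissue vs. a stiffer inclusion).
rng = np.random.default_rng(0)
truth = np.zeros((64, 64), dtype=int)
truth[20:44, 20:44] = 1
image = np.where(truth == 1, 0.8, 0.2) + 0.1 * rng.standard_normal((64, 64))
labels = icm_labeling(image, class_means=np.array([0.2, 0.8]), beta=0.5)
print("pixels matching ground truth:", (labels == truth).mean())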

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: To analyze computer-assisted diagnostics and virtual implant planning and to evaluate the indication for template-guided flapless surgery and immediate loading in the rehabilitation of the edentulous maxilla. MATERIALS AND METHODS: Forty patients with an edentulous maxilla were selected for this study. The three-dimensional analysis and virtual implant planning were performed with the NobelGuide software program (Nobel Biocare, Göteborg, Sweden). Prior to the computed tomography, aesthetic and functional aspects were checked clinically. Either a well-fitting denture or an optimized prosthetic setup was used and then converted to a radiographic template. This allowed for a computer-guided analysis of the jaw together with the prosthesis. Accordingly, the best implant position was determined in relation to the bone structure and the prospective tooth position. For all jaws, the hypothetical indications for (1) four implants with a bar overdenture and (2) six implants with a simple fixed prosthesis were planned. The planning of the optimized implant position was then analyzed as follows: the number of implants that could be placed in a sufficient quantity of bone was calculated; additional surgical procedures (guided bone regeneration, sinus floor elevation) that would be necessary due to the reduced bone quality and quantity were identified; and the indication for template-guided, flapless surgery or an immediate loading protocol was evaluated. RESULTS: Model (1) - bar overdentures: for 28 patients (70%), all four implants could be placed in sufficient bone (total 112 implants); thus, a fully flapless procedure could be suggested. For six patients (15%), sufficient bone was not available for any of their planned implants. The remaining six patients exhibited a combination of sufficient and insufficient bone. Model (2) - simple fixed prosthesis: for 12 patients (30%), all six implants could be placed in sufficient bone (total 72 implants); thus, a fully flapless procedure could be suggested. For seven patients (17%), sufficient bone was not available for any of their planned implants. The remaining 21 patients exhibited a combination of sufficient and insufficient bone. DISCUSSION: In the maxilla, advanced atrophy is often observed, and implant placement becomes difficult or impossible. Thus, flapless surgery or an immediate loading protocol can be performed only in a selected number of patients. Nevertheless, the use of a computer program for prosthetically driven implant planning is highly efficient and safe. The three-dimensional view of the maxilla allows the determination of the best implant position, the optimization of the implant axis, and the definition of the best surgical and prosthetic solution for the patient. Thus, a protocol that combines a computer-guided technique with conventional surgical procedures becomes a promising option, which needs to be further evaluated and improved.

Relevance:

30.00%

Publisher:

Abstract:

A basic prerequisite for in vivo X-ray imaging of the lung is the exact determination of radiation dose. Achieving resolutions of the order of micrometres may become particularly challenging owing to increased dose, which in the worst case can be lethal for the imaged animal model. A framework for linking image quality to radiation dose in order to optimize experimental parameters with respect to dose reduction is presented. The approach may find application for current and future in vivo studies to facilitate proper experiment planning and radiation risk assessment on the one hand and exploit imaging capabilities on the other.
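
As a back-of-the-envelope illustration of linking image quality to dose (not the framework described above), photon-counting statistics give a signal-to-noise ratio proportional to the square root of the number of detected photons, and hence of the dose, so a target SNR translates into a minimum exposure. The conversion factor below is a placeholder, not a value from the study.

# Back-of-the-envelope sketch: photon-counting (Poisson) statistics give
# SNR ~ sqrt(detected photons), so a target SNR implies a minimum exposure.
# The conversion factor is a placeholder, not a value from the study.
import math

photons_per_uGy_per_pixel = 50.0      # detected photons per microgray per pixel (placeholder)
target_snr = 20.0                     # desired per-pixel signal-to-noise ratio

# SNR = sqrt(N)  =>  N = SNR^2, and dose = N / (photons per unit dose)
required_photons = target_snr ** 2
required_dose_uGy = required_photons / photons_per_uGy_per_pixel
print(f"~{required_photons:.0f} photons/pixel, i.e. about {required_dose_uGy:.0f} uGy")

# Conversely, the best achievable SNR at a fixed dose budget:
dose_budget_uGy = 5.0
print(f"SNR at {dose_budget_uGy} uGy: "
      f"{math.sqrt(dose_budget_uGy * photons_per_uGy_per_pixel):.1f}")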

Relevance:

30.00%

Publisher:

Abstract:

Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
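
None of the four extended models is reproduced here; the following sketch only conveys the small-investor ingredients (integral transaction units, a fixed fee per traded stock, a cap on portfolio variance) by brute-force enumeration over a tiny three-stock universe. All prices, returns, covariances and fees are invented and are not Swissquote figures.

# Toy illustration of the small-investor ingredients (integral lots, a fixed
# fee per traded stock, a variance cap) via brute-force enumeration over a
# tiny three-stock universe; all figures are invented.
import itertools
import numpy as np

prices   = np.array([120.0, 80.0, 45.0])     # CHF per share
exp_ret  = np.array([0.06, 0.09, 0.12])      # expected annual returns
cov      = np.array([[0.04, 0.01, 0.00],
                     [0.01, 0.09, 0.02],
                     [0.00, 0.02, 0.16]])    # return covariance matrix
fixed_fee = 9.0                              # flat fee per stock actually bought
budget, max_variance = 10_000.0, 0.05

best = None
for lots in itertools.product(range(0, 61, 5), repeat=3):   # integral transaction units
    shares = np.array(lots, dtype=float)
    cost = shares @ prices + fixed_fee * np.count_nonzero(shares)
    if cost == 0 or cost > budget:
        continue
    weights = shares * prices / (shares @ prices)
    variance = weights @ cov @ weights
    if variance > max_variance:                              # prescribed risk level
        continue
    ret = weights @ exp_ret - fixed_fee * np.count_nonzero(shares) / budget
    if best is None or ret > best[0]:
        best = (ret, lots, cost, variance)

ret, lots, cost, variance = best
print(f"lots={lots}  cost={cost:.0f} CHF  variance={variance:.3f}  "
      f"expected net return={ret:.3%}")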

Relevance:

30.00%

Publisher:

Abstract:

In process industries, make-and-pack production is used to produce food and beverages, chemicals, and metal products, among others. This type of production process allows the fabrication of a wide range of products in relatively small amounts using the same equipment. In this article, we consider a real-world production process (cf. Honkomp et al. 2000. The curse of reality – why process scheduling optimization problems are difficult in practice. Computers & Chemical Engineering, 24, 323–328) comprising sequence-dependent changeover times, multipurpose storage units with limited capacities, quarantine times, batch splitting, partial equipment connectivity, and transfer times. The planning problem consists of computing a production schedule such that a given demand for packed products is fulfilled, all technological constraints are satisfied, and the production makespan is minimised. None of the models in the literature covers all of the technological constraints that occur in such make-and-pack production processes. To close this gap, we develop an efficient mixed-integer linear programming model that is based on a continuous time domain and general-precedence variables. We propose novel types of symmetry-breaking constraints and a preprocessing procedure to improve the model performance. In an experimental analysis, we show that small- and moderate-sized instances can be solved to optimality within short CPU times.
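
The full model is not reproduced here; as a minimal sketch of the general-precedence idea, the following sequences three batches on a single unit with sequence-dependent changeover times and minimizes the makespan, assuming the PuLP package and its bundled CBC solver are available. The batch data are invented.

# Minimal general-precedence sketch: sequence three batches on one unit with
# sequence-dependent changeover times, minimizing the makespan.  Invented
# data; assumes the PuLP package with its bundled CBC solver is installed.
import pulp

batches = ["A", "B", "C"]
proc = {"A": 4.0, "B": 3.0, "C": 5.0}                       # processing times
change = {(i, j): 1.0 if i != j else 0.0                    # changeover i -> j
          for i in batches for j in batches}
big_m = sum(proc.values()) + sum(change.values())

model = pulp.LpProblem("make_and_pack_sketch", pulp.LpMinimize)
start = {b: pulp.LpVariable(f"start_{b}", lowBound=0) for b in batches}
makespan = pulp.LpVariable("makespan", lowBound=0)
# General-precedence binaries: before[i, j] = 1 if batch i precedes batch j.
before = {(i, j): pulp.LpVariable(f"before_{i}_{j}", cat="Binary")
          for i in batches for j in batches if i != j}

model += makespan                                            # objective: minimize makespan
for i in batches:
    model += makespan >= start[i] + proc[i]
for i in batches:
    for j in batches:
        if i < j:
            model += before[i, j] + before[j, i] == 1        # exactly one order holds
for (i, j), var in before.items():
    # If i precedes j, batch j cannot start before i finishes plus the changeover.
    model += start[j] >= start[i] + proc[i] + change[i, j] - big_m * (1 - var)

model.solve(pulp.PULP_CBC_CMD(msg=False))
order = sorted(batches, key=lambda b: start[b].value())
print("sequence:", order, " makespan:", pulp.value(makespan))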

Relevance:

30.00%

Publisher:

Abstract:

Current advanced cloud infrastructure management solutions allow scheduling actions for dynamically changing the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs containing the services of distributed applications, which is able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed in order to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the SLAs that control the scaling of distributed services by combining data analysis mechanisms with application benchmarking using multiple VM configurations. Based on the processing of data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical Service Level Agreement (SLA) parameters. By combining this set of predictor metrics with a heuristic for selecting the appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used for controlling the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems. We show how dynamically generated SLAs can be successfully used to control the management of distributed services scaling.
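
The SLA-composition mechanism itself is not shown here; the following sketch only illustrates the final scaling decision under stated assumptions: benchmark measurements of response time for different (user load, instance count) configurations are fitted with a simple linear model in users per instance, and the smallest instance count whose predicted response time stays within the SLA is selected. The benchmark numbers and the model form are assumptions, not results from the paper.

# Illustrative sketch of the scaling decision: from benchmark data, fit a
# simple model of response time vs. (concurrent users / instances) and pick
# the smallest instance count that keeps the predicted time within the SLA.
# The benchmark numbers and the linear model form are assumptions.
import numpy as np

# Benchmark samples: (concurrent users, running instances, mean response time in ms)
benchmarks = np.array([
    (100, 1, 180), (200, 1, 320), (300, 1, 470),
    (200, 2, 190), (400, 2, 330), (600, 2, 480),
    (300, 3, 195), (600, 3, 340), (900, 3, 485),
], dtype=float)

users, instances, resp_ms = benchmarks.T
load_per_instance = users / instances
slope, intercept = np.polyfit(load_per_instance, resp_ms, 1)   # resp ~ a*(users/instances) + b

def instances_needed(expected_users, sla_ms, max_instances=50):
    """Smallest instance count whose predicted response time meets the SLA."""
    for n in range(1, max_instances + 1):
        predicted = slope * (expected_users / n) + intercept
        if predicted <= sla_ms:
            return n
    raise ValueError("SLA cannot be met within the allowed number of instances")

print(instances_needed(expected_users=1200, sla_ms=300))   # e.g. scale-out target for a load spike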