862 results for Sequential optimization
Abstract:
Magdeburg, Univ., Fak. für Verfahrens- und Systemtechnik, Diss., 2015
Abstract:
Magdeburg, Univ., Fak. für Informatik, Diss., 2015
Abstract:
Otto-von-Guericke-Universität Magdeburg, Fakultät für Mathematik, Univ., Dissertation, 2015
Abstract:
The paper documents MINTOOLKIT for GNU Octave. MINTOOLKIT provides functions for minimization and numeric differentiation. The main algorithms are BFGS, LBFGS, and simulated annealing. Examples are given.
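The abstract names BFGS among the toolkit's main algorithms. As a minimal illustration of that kind of routine (using SciPy in Python as a stand-in, not MINTOOLKIT's own Octave API, and with an objective function and starting point chosen here for demonstration):

```python
# Illustrative sketch: BFGS minimization with SciPy, as an analogue of the
# minimization routines MINTOOLKIT provides for GNU Octave. The Rosenbrock
# test function and starting point are illustrative choices, not from the paper.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic non-convex test function with its minimum at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="BFGS")
print(result.x)        # close to [1.0, 1.0]
print(result.success)  # True
```

BFGS builds an approximation to the inverse Hessian from successive gradients, which is why it needs no second-derivative code.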
Abstract:
We report on a series of experiments that test the effects of an uncertain supply on the formation of bids and prices in sequential first-price auctions with independent private values and unit demands. Supply is uncertain when buyers do not know the exact number of units to be sold (i.e., the length of the sequence). Although we observe non-monotonic behavior when supply is certain, as well as substantial overbidding, the data qualitatively support our price-trend predictions and the risk-neutral Nash equilibrium model of bidding for the last stage of a sequence, whether supply is certain or not. Our study shows that behavior in these markets changes significantly in the presence of an uncertain supply, and that it can be explained by assuming that bidders form pessimistic beliefs about the occurrence of another stage.
Abstract:
We study a sequential protocol of endogenous coalition formation based on a process of bilateral agreements among the players. We apply the game to a Cournot environment with linear demand and constant average costs. We show that the final outcome of any Subgame Perfect Equilibrium of the game is the grand coalition, provided the initial number of firms is high enough and they are sufficiently patient.
Abstract:
Some analysts use sequential dominance criteria, and others use equivalence scales in combination with non-sequential dominance tests, to make welfare comparisons of joint distributions of income and needs. In this paper we present a new sequential procedure which copes with situations in which sequential dominance fails. We also demonstrate that the recommendations deriving from the sequential approach are valid for distributions of equivalent income whatever equivalence scale the analyst might adopt. Thus the paper marries together the sequential and equivalizing approaches, seen as alternatives in much previous literature. All results are specified in forms which allow for demographic differences in the populations being compared.
Abstract:
Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
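The core of the replicate-based idea above can be sketched in a few lines: with two independent preparations of the same individual, any disagreement between genotype calls at a locus typed in both replicates reflects genotyping error. The function name and toy data below are illustrative, not part of the Stacks pipeline:

```python
# Illustrative sketch of quantifying genotyping error from sample replicates:
# both replicates come from the same individual, so every discordant call at
# a locus typed in both is counted as an error. Toy data, not from the study.
def replicate_error_rate(genotypes_a, genotypes_b):
    """Fraction of loci called in both replicates that disagree."""
    shared = [(a, b) for a, b in zip(genotypes_a, genotypes_b)
              if a is not None and b is not None]
    if not shared:
        return 0.0
    mismatches = sum(1 for a, b in shared if a != b)
    return mismatches / len(shared)

# Toy replicate pair: 5 loci, one missing call, one discordant call
rep1 = ["AA", "AG", "GG", None, "CT"]
rep2 = ["AA", "AA", "GG", "TT", "CT"]
print(replicate_error_rate(rep1, rep2))  # 0.25 (1 mismatch / 4 shared loci)
```

Assembly parameters can then be tuned by re-running this comparison across parameter settings and keeping the setting that minimizes the error rate while maximizing retained loci.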
Abstract:
Knowledge of the spatial distribution of hydraulic conductivity (K) within an aquifer is critical for reliable predictions of solute transport and the development of effective groundwater management and/or remediation strategies. While core analyses and hydraulic logging can provide highly detailed information, such information is inherently localized around boreholes that tend to be sparsely distributed throughout the aquifer volume. Conversely, larger-scale hydraulic experiments like pumping and tracer tests provide relatively low-resolution estimates of K in the investigated subsurface region. As a result, traditional hydrogeological measurement techniques contain a gap in terms of spatial resolution and coverage, and they are often alone inadequate for characterizing heterogeneous aquifers. Geophysical methods have the potential to bridge this gap. The recent increased interest in the application of geophysical methods to hydrogeological problems is clearly evidenced by the formation and rapid growth of the domain of hydrogeophysics over the past decade (e.g., Rubin and Hubbard, 2005).
Abstract:
The goal of the present work was to assess the feasibility of using a pseudo-inverse and null-space optimization approach in modeling shoulder biomechanics. The method was applied to a simplified musculoskeletal shoulder model. The mechanical system consisted of the arm, and the external forces were the arm weight, the forces of 6 scapulo-humeral muscles and the reaction at the glenohumeral joint, which was considered a spherical joint. Muscle wrapping was considered around the humeral head, assumed spherical. The dynamical equations were solved in a Lagrangian approach. The mathematical redundancy of the mechanical system was resolved in two steps: a pseudo-inverse optimization to minimize the square of the muscle stress, and a null-space optimization to restrict the muscle forces to physiological limits. Several movements were simulated. The mathematical and numerical aspects of the constrained redundancy problem were efficiently solved by the proposed method. The predicted muscle moment arms were consistent with cadaveric measurements, and the joint reaction force was consistent with in vivo measurements. This preliminary work demonstrates that the developed algorithm has great potential for more complex musculoskeletal modeling of the shoulder joint. In particular, it could be further applied to a non-spherical joint model, allowing for the natural translation of the humeral head in the glenoid fossa.
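The two-step scheme described in this abstract can be sketched on a toy problem: a single joint equation with three "muscles", where the pseudo-inverse gives the minimum-norm force distribution and a null-space shift then lifts a negative (non-physiological) force to zero without changing the net joint moment. The moment arms and target moment below are made-up numbers, not values from the shoulder model:

```python
# Illustrative numpy sketch of pseudo-inverse + null-space force distribution.
# A is a 1-joint moment-balance equation for 3 muscles; numbers are invented.
import numpy as np
from scipy.linalg import null_space

A = np.array([[0.03, 0.02, -0.01]])  # moment arms (m) of 3 muscles
b = np.array([1.5])                  # required net joint moment (N*m)

# Step 1: pseudo-inverse -> minimum-norm force vector (may contain negatives)
f = np.linalg.pinv(A) @ b            # third component comes out negative here

# Step 2: null-space correction. Any f + N @ z still satisfies A @ (f + N z) = b,
# so choose z to raise the negative force exactly to zero.
N = null_space(A)                                  # orthonormal basis, shape (3, 2)
z = np.linalg.pinv(N[2:3, :]) @ np.array([-f[2]])  # enforce f_adj[2] == 0
f_adj = f + N @ z

print(np.round(f_adj, 2))            # all forces now >= 0
print(np.allclose(A @ f_adj, b))     # True: moment balance preserved
```

In a full model the null-space step is posed as a constrained optimization over all muscles; this sketch only shows why moving within the null space cannot violate the dynamics equations.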
Abstract:
In this paper we analyse a simple two-person sequential-move contest game with heterogeneous players. Assuming that the heterogeneity could be the consequence of past discrimination, we study the effects of implementing an affirmative action policy, which tackles this heterogeneity by compensating discriminated players, and compare them with the situation in which the heterogeneity is ignored and the contestants are treated equally. In our analysis we consider different orders of moves. We show that the order of moves of the contestants is a very important factor in determining the effects of the affirmative action policy. We also prove that in such cases a significant role is played by the level of heterogeneity of the individuals. In particular, in contrast to predictions present in the literature, we demonstrate that, as a consequence of the interplay of these two factors, the response to the implementation of the affirmative action policy may be a decrease in the total equilibrium effort of the contestants in comparison with the unbiased contest game.
Abstract:
This work analyzes the performance of four shared-memory multiprocessor compute nodes in solving the N-body problem. The serial algorithm is parallelized and coded in C extended with OpenMP. The result is two variants that follow two different optimization criteria: minimizing memory requirements and minimizing the volume of computation. A performance analysis of the program on the compute nodes is then carried out: the performance of the sequential and parallel variants of the application, and of the compute nodes, is modeled; the programs are instrumented and executed to obtain results in the form of several metrics; finally, the results are presented and interpreted, providing insights that explain performance inefficiencies and bottlenecks, along with possible avenues for improvement. The experience gained from this case study has allowed us to outline an incipient methodology for performance analysis, problem identification and tuning of algorithms to shared-memory multiprocessor compute nodes.
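The compute-versus-memory trade-off mentioned in this abstract has a classic concrete form in direct N-body codes: evaluating every ordered pair independently (simple and trivially parallel) versus exploiting Newton's third law to compute each pair once and apply the reaction with opposite sign, halving the arithmetic at the cost of scattered accumulation. A small Python/numpy stand-in for the paper's C/OpenMP variants (all values here are synthetic):

```python
# Illustrative sketch: two direct-sum N-body force evaluations that trade
# computation for update pattern. Synthetic data; a stand-in for the
# C/OpenMP variants analyzed in the paper.
import numpy as np

def forces_full(pos, mass, G=1.0, eps=1e-3):
    """Evaluate all N*(N-1) interactions independently (easy to parallelize)."""
    n = len(mass)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = pos[j] - pos[i]
                r2 = d @ d + eps**2          # softening avoids division by zero
                f[i] += G * mass[i] * mass[j] * d / r2**1.5
    return f

def forces_symmetric(pos, mass, G=1.0, eps=1e-3):
    """Compute each pair once and apply the reaction with opposite sign."""
    n = len(mass)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            r2 = d @ d + eps**2
            fij = G * mass[i] * mass[j] * d / r2**1.5
            f[i] += fij
            f[j] -= fij                      # Newton's third law
    return f

rng = np.random.default_rng(0)
pos = rng.standard_normal((8, 3))
mass = rng.uniform(0.5, 2.0, 8)
print(np.allclose(forces_full(pos, mass), forces_symmetric(pos, mass)))  # True
```

In a shared-memory parallelization the symmetric variant's `f[j] -= fij` update is exactly what creates write conflicts between threads, which is one reason the two criteria lead to different codes.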
Abstract:
BACKGROUND: Among patients with steroid-refractory ulcerative colitis (UC) in whom a first rescue therapy has failed, a second-line salvage treatment can be considered to avoid colectomy. AIM: To evaluate the efficacy and safety of second- or third-line rescue therapy over a one-year period. METHODS: Response to single or sequential rescue treatments with infliximab (5 mg/kg intravenously (iv) at weeks 0, 2 and 6, then every 8 weeks), ciclosporin (iv 2 mg/kg/day, then oral 5 mg/kg/day) or tacrolimus (0.05 mg/kg divided in 2 doses) in steroid-refractory moderate-to-severe UC patients from 7 Swiss and 1 Serbian tertiary IBD centers was retrospectively studied. The primary endpoint was the one-year colectomy rate. RESULTS: 60% of patients responded to the first rescue therapy, 10% went to colectomy and the 30% non-responders were switched to a 2nd-line rescue treatment. 66% of patients responded to the 2nd-line treatment whereas 34% failed, of which 15% went to colectomy and 19% received a 3rd-line rescue treatment. Among those, 50% of patients went to colectomy. The overall colectomy rate of the whole cohort was 18%. The steroid-free remission rate was 39%. The adverse event rates were 33%, 37.5% and 30% for the first-, second- and third-line treatments, respectively. CONCLUSION: Our data show that medical intervention, even with 2nd- and 3rd-line rescue treatments, decreased colectomy frequency within one year of follow-up. A longer follow-up will be necessary to investigate whether sequential therapy only postpones colectomy and what percentage of patients will remain in long-term remission.
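The overall 18% colectomy rate follows from the cascade of percentages in the abstract, reading the 2nd-line figures (15% colectomy, 19% third line) as fractions of the patients who entered 2nd-line therapy; that reading is an interpretation made here for illustration, not a statement from the study:

```python
# Illustrative arithmetic check of the treatment cascade; the interpretation
# of which denominator each percentage refers to is an assumption, not from
# the study itself.
first_line_colectomy  = 0.10                    # 10% of the cohort
to_second_line        = 0.30                    # 30% switched to 2nd line
second_line_colectomy = to_second_line * 0.15   # 4.5% of the cohort
to_third_line         = to_second_line * 0.19   # 5.7% of the cohort
third_line_colectomy  = to_third_line * 0.50    # ~2.9% of the cohort

overall = first_line_colectomy + second_line_colectomy + third_line_colectomy
print(overall)  # about 0.1735, consistent with the reported 18%
```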
Abstract:
Purpose: To evaluate the feasibility, determine the optimal b-value, and assess the utility of 3-T diffusion-weighted MR imaging (DWI) of the spine in differentiating benign from pathologic vertebral compression fractures. Methods and Materials: Twenty patients with 38 vertebral compression fractures (24 benign, 14 pathologic) and 20 controls (total: 23 men, 17 women, mean age 56.2 years) were included from December 2010 to May 2011 in this IRB-approved prospective study. MR imaging of the spine was performed on a 3-T unit with T1-weighted, fat-suppressed T2-weighted, gadolinium-enhanced fat-suppressed T1-weighted and zoomed-EPI (2D RF excitation pulse combined with reduced field-of-view single-shot echo-planar readout) diffusion-weighted (b-values: 0, 300, 500 and 700 s/mm²) sequences. Two radiologists independently assessed zoomed-EPI image quality in random order using a 4-point scale: 1 = excellent to 4 = poor. They subsequently measured apparent diffusion coefficients (ADCs) in normal vertebral bodies and compression fractures, in consensus. Results: Lower b-values correlated with better image quality scores, with significant differences between b = 300 (mean ± SD = 2.6 ± 0.8), b = 500 (3.0 ± 0.7) and b = 700 (3.6 ± 0.6) (all p < 0.001). Mean ADCs of normal vertebral bodies (n = 162) were 0.23, 0.17 and 0.11 × 10⁻³ mm²/s with b = 300, 500 and 700 s/mm², respectively. In contrast, mean ADCs were 0.89, 0.70 and 0.59 × 10⁻³ mm²/s for benign vertebral compression fractures and 0.79, 0.66 and 0.51 × 10⁻³ mm²/s for pathologic fractures with b = 300, 500 and 700 s/mm², respectively. No significant difference was found between ADCs of benign and pathologic fractures. Conclusion: 3-T DWI of the spine is feasible and lower b-values (300 s/mm²) are recommended. However, our preliminary results show no advantage of DWI in differentiating benign from pathologic vertebral compression fractures.
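The ADC values reported above come from the standard mono-exponential diffusion model S(b) = S0 · exp(−b · ADC), fitted (here, in its simplest two-point form) from the signal at b = 0 and at the chosen b-value. The signal intensities in the sketch below are invented for illustration, not measurements from the study:

```python
# Illustrative sketch of a two-point ADC estimate from the mono-exponential
# model S(b) = S0 * exp(-b * ADC). Signal values are made up.
import math

def adc(s0, sb, b):
    """ADC in mm^2/s from signals at b = 0 and b (in s/mm^2)."""
    return math.log(s0 / sb) / b

s0, s300 = 1000.0, 765.0   # invented signal intensities at b = 0 and b = 300
print(adc(s0, s300, 300))  # ~0.89e-3 mm^2/s, in the range reported for benign fractures
```

In practice ADC maps are fitted per voxel from all acquired b-values, but the two-point form shows why higher b-values trade signal (and image quality) for diffusion weighting.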