17 results for Linear boundary value control problems
Abstract:
In this article, we develop the a priori and a posteriori error analysis of hp-version interior penalty discontinuous Galerkin finite element methods for strongly monotone quasi-Newtonian fluid flows in a bounded Lipschitz domain Ω ⊂ ℝ^d, d = 2, 3. For the a posteriori analysis, computable upper and lower bounds on the error are derived in terms of a natural energy norm, which are explicit in the local mesh size and local polynomial degree of the approximating finite element method. A series of numerical experiments illustrates the performance of the proposed a posteriori error indicators within an automatic hp-adaptive refinement algorithm.
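The abstract does not spell out the adaptive algorithm, but hp-adaptive drivers of the kind it refers to typically follow a solve-estimate-mark-refine loop. The Python sketch below shows that generic loop on a 1-D interval mesh; the solver, error estimator and smoothness test are caller-supplied placeholders, and nothing here is the authors' implementation.

```python
import math

def mark_bulk(eta, theta=0.5):
    """Doerfler (bulk) marking: the smallest set of elements carrying a
    theta-fraction of the total squared error indicator."""
    total2 = sum(e * e for e in eta.values())
    marked, acc = [], 0.0
    for elem, e in sorted(eta.items(), key=lambda kv: -kv[1]):
        marked.append(elem)
        acc += e * e
        if acc >= theta * total2:
            break
    return marked

def hp_adapt(elements, degrees, solve, estimate, is_smooth, tol, max_iter=25):
    """Generic solve-estimate-mark-refine driver on a 1-D interval mesh.

    elements  : list of (a, b) intervals
    degrees   : dict mapping element -> polynomial degree
    solve     : callable(elements, degrees) -> discrete solution
    estimate  : callable(u, elements, degrees) -> {element: eta_K}
    is_smooth : callable(u, element) -> True to raise p, False to bisect
    """
    u, total = None, float("inf")
    for _ in range(max_iter):
        u = solve(elements, degrees)
        eta = estimate(u, elements, degrees)
        total = math.sqrt(sum(e * e for e in eta.values()))
        if total <= tol:
            break
        for elem in mark_bulk(eta):
            if is_smooth(u, elem):
                degrees[elem] += 1                 # p-refinement
            else:                                  # h-refinement: bisect elem
                a, b = elem
                mid = 0.5 * (a + b)
                elements.remove(elem)
                elements += [(a, mid), (mid, b)]
                p = degrees.pop(elem)
                degrees[(a, mid)] = degrees[(mid, b)] = p
    return u, elements, degrees, total
```

The smoothness test is what makes the loop an hp-method rather than a pure h-method: where the solution is locally analytic the degree is raised, and near (suspected) singularities the element is bisected instead.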
Abstract:
In this paper, an Insulin Infusion Advisory System (IIAS) for Type 1 diabetes patients who use insulin pumps for Continuous Subcutaneous Insulin Infusion (CSII) is presented. The purpose of the system is to estimate the appropriate insulin infusion rates. The system is based on a Non-Linear Model Predictive Controller (NMPC) which uses a hybrid model. The model comprises a Compartmental Model (CM), which simulates the absorption of glucose into the blood due to meal intake, and a Neural Network (NN), which simulates the glucose-insulin kinetics. The NN is a Recurrent NN (RNN) trained with the Real Time Recurrent Learning (RTRL) algorithm. The output of the model consists of short-term glucose predictions, which provide input to the NMPC so that the latter can estimate the optimum insulin infusion rates. For the development and the evaluation of the IIAS, data generated from a Mathematical Model (MM) of a Type 1 diabetes patient have been used. The proposed control strategy is evaluated under multiple meal disturbances, various noise levels and additional time delays. The results indicate that the implemented IIAS is capable of handling multiple meals corresponding to realistic meal profiles, as well as large noise levels and time delays.
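As a hedged illustration of the receding-horizon idea described above (not the paper's controller), the sketch below replaces the hybrid CM/RNN predictor with a toy linear glucose model and solves the horizon problem with scipy; TARGET, HORIZON and the gains k_i, k_m are illustrative values, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

TARGET = 100.0      # target blood glucose [mg/dL]  (illustrative value)
HORIZON = 6         # prediction horizon [steps]    (illustrative value)

def predict_glucose(g0, insulin_seq, meal_seq, k_i=2.0, k_m=0.5):
    """Toy stand-in for the hybrid predictor: insulin lowers glucose,
    meals raise it, one step at a time."""
    g, traj = g0, []
    for u, m in zip(insulin_seq, meal_seq):
        g = g - k_i * u + k_m * m
        traj.append(g)
    return np.array(traj)

def nmpc_step(g0, meal_forecast, u_max=10.0, r=0.1):
    """Solve the horizon problem, return only the first infusion rate."""
    def cost(u):
        g = predict_glucose(g0, u, meal_forecast)
        # penalize deviation from target plus insulin effort
        return np.sum((g - TARGET) ** 2) + r * np.sum(u ** 2)
    res = minimize(cost, x0=np.zeros(HORIZON),
                   bounds=[(0.0, u_max)] * HORIZON)
    return res.x[0]   # receding horizon: apply first move, re-solve next step

# Example: current glucose 180 mg/dL, a meal expected two steps ahead.
print(nmpc_step(180.0, meal_forecast=np.array([0, 0, 60, 0, 0, 0])))
```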
Abstract:
We investigate a class of optimal control problems that exhibit constant, exogenously given delays of the control in the equation of motion of the differential states. To this end, we formulate an exemplary optimal control problem with one stock and one control variable and review some analytic properties of an optimal solution. However, analytical considerations are quite limited in the case of delayed optimal control problems. In order to overcome these limits, we reformulate the problem and apply direct numerical methods to calculate approximate solutions that give a better understanding of this class of optimization problems. In particular, we present two possibilities to reformulate the delayed optimal control problem into an instantaneous optimal control problem and show how these can be solved numerically with a state-of-the-art direct method by applying Bock's direct multiple shooting algorithm. We further demonstrate the strength of our approach by two economic examples.
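One standard way to make such a reformulation concrete (a sketch under assumed dynamics, not the authors' formulation or Bock's multiple shooting code): on a uniform time grid, the delayed control u(t − τ) is simply a shifted entry of the decision vector, so direct transcription turns the delayed problem into an ordinary nonlinear program.

```python
import numpy as np
from scipy.optimize import minimize

N, T, TAU_STEPS = 40, 4.0, 5        # grid points, horizon, delay in steps
dt = T / N

def simulate(u, x0=1.0):
    """Euler steps of xdot = -x + u(t - tau), with u(t) = 0 for t < 0."""
    x = np.empty(N + 1)
    x[0] = x0
    for k in range(N):
        u_delayed = u[k - TAU_STEPS] if k >= TAU_STEPS else 0.0
        x[k + 1] = x[k] + dt * (-x[k] + u_delayed)
    return x

def objective(u):
    x = simulate(u)
    return dt * np.sum(x[:-1] ** 2 + 0.1 * u ** 2)   # quadratic tracking cost

res = minimize(objective, np.zeros(N), bounds=[(-2.0, 2.0)] * N)
print("optimal cost:", res.fun)
```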
Abstract:
Given a short-arc optical observation with estimated angle-rates, the admissible region is a compact region in the range/range-rate space defined such that all likely and relevant orbits are contained within it. An alternative boundary-value problem formulation has recently been proposed, in which range/range hypotheses (one hypothesized range per track) are generated with two angle measurements from two tracks as input. In this paper, angle-rate information is reintroduced as a means to eliminate hypotheses by bounding their constants of motion before a more computationally costly Lambert solver or differential correction algorithm is run.
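A minimal sketch of the screening step, assuming the angle-rates yield an approximate velocity for each range hypothesis; the semi-major-axis bounds below are illustrative, not taken from the paper.

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

def specific_energy(r_vec, v_vec):
    """Specific orbital energy eps = v^2/2 - mu/r (vis-viva)."""
    r, v = np.linalg.norm(r_vec), np.linalg.norm(v_vec)
    return 0.5 * v * v - MU / r

def keep_hypothesis(r_vec, v_vec, a_min=6578.0, a_max=45000.0):
    """Keep only bound orbits with semi-major axis inside [a_min, a_max] km,
    discarding the rest before any expensive Lambert/correction step."""
    eps = specific_energy(r_vec, v_vec)
    if eps >= 0.0:                 # hyperbolic/parabolic: not a catalog orbit
        return False
    a = -MU / (2.0 * eps)          # from eps = -mu / (2a)
    return a_min <= a <= a_max

# Example: a rough LEO-like state passes, an escape-speed state is rejected.
print(keep_hypothesis(np.array([7000.0, 0, 0]), np.array([0, 7.5, 0])))   # True
print(keep_hypothesis(np.array([7000.0, 0, 0]), np.array([0, 12.0, 0])))  # False
```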
Abstract:
We introduce and analyze hp-version discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems in three-dimensional polyhedral domains. To resolve possible corner-, edge- and corner-edge singularities, we consider hexahedral meshes that are geometrically and anisotropically refined toward the corresponding neighborhoods. Similarly, the local polynomial degrees are increased linearly and possibly anisotropically away from singularities. We design interior penalty hp-dG methods and prove that they are well-defined for problems with singular solutions and stable under the proposed hp-refinements. We establish (abstract) error bounds that will allow us to prove exponential rates of convergence in the second part of this work.
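For intuition, the two mesh-design ingredients (geometric grading toward a singular point and linearly increasing polynomial degrees) can be written down in a few lines in 1-D; sigma and s below are the usual grading and slope parameters, and the code is illustrative only.

```python
import math

def geometric_mesh(sigma=0.5, n=6):
    """Nodes 0 < sigma^(n-1) < ... < sigma < 1 of a sigma-geometric mesh
    on (0, 1), graded toward the singular point x = 0."""
    return [0.0] + [sigma ** (n - j) for j in range(1, n + 1)]

def s_linear_degrees(n=6, s=1.0, p_min=1):
    """Degree p_j = max(p_min, ceil(s * j)) on the j-th layer away from
    the singularity: low order near x = 0, increasing linearly outward."""
    return [max(p_min, math.ceil(s * j)) for j in range(1, n + 1)]

mesh = geometric_mesh()
degrees = s_linear_degrees()
for (a, b), p in zip(zip(mesh[:-1], mesh[1:]), degrees):
    print(f"element ({a:.5f}, {b:.5f})  degree {p}")
```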
Abstract:
The goal of this paper is to establish exponential convergence of $hp$-version interior penalty (IP) discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems with homogeneous Dirichlet boundary conditions and piecewise analytic data in three-dimensional polyhedral domains. More precisely, we shall analyze the convergence of the $hp$-IP dG methods considered in [D. Schötzau, C. Schwab, T. P. Wihler, SIAM J. Numer. Anal., 51 (2013), pp. 1610--1633] based on axiparallel $\sigma$-geometric anisotropic meshes and $\bm{s}$-linear anisotropic polynomial degree distributions.
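For orientation (not quoted from the abstract), exponential convergence bounds of this type in three space dimensions are typically stated in the form

\[ \| u - u_{hp} \|_{\mathrm{dG}} \;\le\; C \exp\bigl(-b\,N^{1/5}\bigr), \]

where $N$ denotes the total number of degrees of freedom and the constants $C, b > 0$ are independent of $N$.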
Abstract:
Measurement association and initial orbit determination is a fundamental task when building up a database of space objects. This paper proposes an efficient and robust method to determine the orbit using the available information of two tracklets, i.e. their line-of-sights and their derivatives. The approach works with a boundary-value formulation to represent hypothesized orbital states and uses an optimization scheme to find the best-fitting orbits. The method is assessed and compared to an initial-value formulation using a measurement set taken by the Zimmerwald Small Aperture Robotic Telescope of the Astronomical Institute at the University of Bern. False associations of closely spaced objects on similar orbits cannot be completely eliminated due to the short duration of the measurement arcs. However, the presented approach uses the available information optimally, and the overall association performance and robustness are very promising. The boundary-value optimization takes only around 2% of the computational time of optimization approaches using an initial-value formulation. The full potential of the method in terms of run-time is additionally illustrated by comparing it to other published association methods.
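A hedged sketch of the boundary-value scoring idea: treat the two ranges along the observed lines-of-sight as unknowns, connect the implied positions with a Lambert solver (supplied by the caller; any two-point boundary-value routine returning the terminal velocities will do), and score a hypothesis by how well those velocities reproduce the measured line-of-sight rates. This illustrates the formulation only; it is not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

MU = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

def los_rate(r_rel, v_rel):
    """Analytic time derivative of the unit line-of-sight vector."""
    rho = np.linalg.norm(r_rel)
    u = r_rel / rho
    return (v_rel - np.dot(u, v_rel) * u) / rho

def association_cost(ranges, obs1, obs2, tof, lambert):
    """Residual of the measured line-of-sight rates for a range hypothesis.

    obs = (site_position, site_velocity, los_unit_vector, measured_los_rate),
    tof = time of flight between the tracklets,
    lambert(mu, r1, r2, tof) -> (v1, v2): caller-supplied BVP solver.
    """
    (s1, sv1, u1, du1), (s2, sv2, u2, du2) = obs1, obs2
    r1 = s1 + ranges[0] * u1          # inertial positions implied by the
    r2 = s2 + ranges[1] * u2          # hypothesized ranges
    v1, v2 = lambert(MU, r1, r2, tof) # solve the boundary-value problem
    return (np.sum((los_rate(r1 - s1, v1 - sv1) - du1) ** 2)
            + np.sum((los_rate(r2 - s2, v2 - sv2) - du2) ** 2))

def best_fit_ranges(obs1, obs2, tof, lambert, rho0=(1000.0, 1000.0)):
    """Optimize the two ranges; a small residual suggests a true association."""
    return minimize(association_cost, np.asarray(rho0),
                    args=(obs1, obs2, tof, lambert),
                    bounds=[(100.0, 50000.0)] * 2)
```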
Abstract:
To determine the local control and complication rates for children with papillary and/or macular retinoblastoma progressing after chemotherapy and undergoing stereotactic radiotherapy (SRT) with a micromultileaf collimator.
Abstract:
A small proportion of individuals with non-specific low back pain (NSLBP) develop persistent problems. Up to 80% of the total costs for NSLBP are attributable to chronic NSLBP. Psychosocial factors have been described as important in the transition from acute to chronic NSLBP. Guidelines recommend the use of the Acute Low Back Pain Screening Questionnaire (ALBPSQ) and the Örebro Musculoskeletal Pain Screening Questionnaire (ÖMPSQ) to identify individuals at risk of developing persistent problems, such as long-term absence from work, persistent restriction in function or persistent pain. These instruments can be used with a cutoff value, where patients with values above the threshold are further assessed with a more comprehensive examination.
Abstract:
The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. The curve exposes a single parameter by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms based on linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
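One way to realize a tunable, integral-conserving scheme of the kind described (a 1-D stand-in, not the authors' exact Hermitian curve): interpolate the cumulative integral of the data with a cubic Hermite whose slopes are damped by a single parameter c, then difference the interpolant at the new bin edges. Conservation of the total follows by construction, and c = 0 yields a monotone interpolant of the cumulative, i.e. no overshoot and no negative re-binned values for non-negative data.

```python
import numpy as np

def hermite_eval(x, xk, Fk, mk):
    """Evaluate the cubic Hermite interpolant with knots xk, values Fk and
    slopes mk at the points x (assumed to lie within [xk[0], xk[-1]])."""
    i = np.clip(np.searchsorted(xk, x) - 1, 0, len(xk) - 2)
    h = xk[i + 1] - xk[i]
    t = (x - xk[i]) / h
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * Fk[i] + h10 * h * mk[i] + h01 * Fk[i + 1] + h11 * h * mk[i + 1]

def rebin(edges_old, values, edges_new, c=0.5):
    """Conservative re-binning via Hermite interpolation of the cumulative
    integral. c = 0 damps all slopes to zero (monotone, no overshoot);
    c = 1 uses plain finite-difference slopes (smoother, may overshoot)."""
    edges_old = np.asarray(edges_old, float)
    F = np.concatenate(([0.0], np.cumsum(values * np.diff(edges_old))))
    m = c * np.gradient(F, edges_old)        # damped slopes of the cumulative
    F_new = hermite_eval(np.asarray(edges_new, float), edges_old, F, m)
    return np.diff(F_new) / np.diff(edges_new)

# The total integral is conserved whenever the new edges span the old range:
x_old = np.linspace(0.0, 1.0, 11)
vals = np.random.default_rng(0).uniform(0.0, 1.0, 10)
x_new = np.linspace(0.0, 1.0, 26)
print(np.sum(vals * np.diff(x_old)),
      np.sum(rebin(x_old, vals, x_new, c=0.0) * np.diff(x_new)))
```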
Abstract:
The popularity of Online Social Networks has recently been overshadowed by the privacy problems they pose. Users are becoming increasingly vigilant about the information they disclose and are strongly opposed to the use of their information for commercial purposes. Nevertheless, as long as the network is offered to users for free, providers have little choice but to generate revenue through personalized advertising to remain financially viable. Our study empirically investigates the ways out of this deadlock. Using conjoint analysis, we find that privacy is indeed important for users. We identify three groups of users with different utility patterns: Unconcerned Socializers, Control-conscious Socializers and Privacy-concerned. Our results provide relevant insights into how network providers can capitalize on different user preferences by specifically addressing the needs of distinct groups in the form of various premium accounts. Overall, our study is the first attempt to assess the value of privacy in monetary terms in this context.
Abstract:
AIM To assess the prevalence of vascular dementia, mixed dementia and Alzheimer's disease in patients with atrial fibrillation, and to evaluate the accuracy of the Hachinski ischemic score for these subtypes of dementia. METHODS A nested case-control study was carried out. A total of 103 of 784 consecutive patients evaluated for cognitive status at the Ambulatory Geriatric Clinic had a diagnosis of atrial fibrillation. Controls without atrial fibrillation were randomly selected from the remaining 681 patients using 1:2 matching for sex, age and education. RESULTS The prevalence of vascular dementia was twice as high in patients with atrial fibrillation as in controls (21.4% vs 10.7%, P = 0.024). Alzheimer's disease was also more frequent in the group with atrial fibrillation (12.6% vs 7.3%, P = 0.046), whereas mixed dementia had a similar distribution. The Hachinski ischemic score poorly discriminated between dementia subtypes, with misclassification rates between 46% (95% CI 28-66) and 70% (95% CI 55-83). In patients with atrial fibrillation, these rates ranged from 55% (95% CI 32-77) to 69% (95% CI 39-91). In patients in whom the diagnosis of dementia was excluded, the Hachinski ischemic score suggested the presence of vascular dementia in 11% and mixed dementia in 30%. CONCLUSIONS Vascular dementia and Alzheimer's disease, but not mixed dementia, are more prevalent in patients with atrial fibrillation. The discriminative accuracy of the Hachinski ischemic score for dementia subtypes in atrial fibrillation is poor, with a significant proportion of misclassifications.