998 results for "Robust feasibility"


Relevance: 70.00%

Abstract:

The multiobjective optimization model studied in this paper deals with the simultaneous minimization of finitely many linear functions subject to an arbitrary number of uncertain linear constraints. We first provide a radius of robust feasibility guaranteeing the feasibility of the robust counterpart under affine data parametrization. We then establish dual characterizations of robust solutions of our model that are immunized against data uncertainty, by way of characterizing the corresponding solutions of the robust counterpart of the model. Finally, we present robust duality theorems relating the value of the robust model to the corresponding value of its dual problem.
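As a sketch of the model class this abstract describes (the notation below is assumed for illustration, not taken from the paper), the uncertain multiobjective program and the role of the radius of robust feasibility can be written as:

```latex
% Uncertain multiobjective LP under affine data parametrization
% (all symbols are illustrative):
\begin{align*}
  &\min_{x \in \mathbb{R}^n} \ \left( c_1^\top x, \ldots, c_p^\top x \right) \\
  &\text{s.t.}\ \ a_j^\top x \le b_j \quad \text{for all } (a_j, b_j) \in \mathcal{U}_j(\alpha),\ j \in J,
\end{align*}
% with each uncertainty set an affinely parametrized ball of size alpha
% around the nominal data, e.g.
%   U_j(alpha) = { (a_j^0, b_j^0) + alpha * u : ||u|| <= 1 }.
% The radius of robust feasibility is then the supremum of alpha for which
% the robust counterpart (constraints enforced for every realization)
% remains feasible.
```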

Relevance: 70.00%

Abstract:

In this paper we examine multi-objective linear programming problems with data uncertainty in both the objective function and the constraints. First, we derive a formula for the radius of robust feasibility guaranteeing constraint feasibility for all possible scenarios within a specified uncertainty set under affine data parametrization. We then present numerically tractable optimality conditions for minmax robust weakly efficient solutions, i.e., the weakly efficient solutions of the robust counterpart. We also consider highly robust weakly efficient solutions, i.e., robust feasible solutions which are weakly efficient for any possible instance of the objective matrix within a specified uncertainty set, providing lower bounds for the radius of highly robust efficiency guaranteeing the existence of such solutions under affine and rank-1 objective data uncertainty. Finally, we provide numerically tractable optimality conditions for highly robust weakly efficient solutions.
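For illustration only (this definition is standard in the robust multiobjective literature; the notation is assumed rather than taken from the paper), a highly robust weakly efficient solution can be stated as:

```latex
% x* in the robust feasible set F_R is highly robust weakly efficient if it
% stays weakly efficient for every objective matrix C in the uncertainty
% set U_C:
\forall\, C \in \mathcal{U}_C:\quad
  \nexists\, x \in F_R \ \text{with}\ C x < C x^* \ \text{componentwise}.
```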

Relevance: 30.00%

Abstract:

Introduction: Management of osteoarthritis (OA) includes the use of non-pharmacological and pharmacological therapies. Although walking is commonly recommended for reducing pain and increasing physical function in people with OA, glucosamine sulphate has also been used to alleviate pain and slow the progression of OA. This study evaluated the effects of a progressive walking program and glucosamine sulphate intake on OA symptoms and physical activity participation in people with mild to moderate hip or knee OA.

Methods: Thirty-six low-active participants (aged 42 to 73 years) were provided with 1500 mg glucosamine sulphate per day for 6 weeks, after which they began a 12-week progressive walking program while continuing to take glucosamine. They were randomized to walk 3 or 5 days per week and given a pedometer to monitor step counts. For both groups, walking was gradually increased to 3000 steps/day during the first 6 weeks of the program, and to 6000 steps/day for the next 6 weeks. Primary outcomes included physical activity levels, physical function (self-paced step test), and the WOMAC Osteoarthritis Index for pain, stiffness, and physical function. Assessments were conducted at baseline and at 6-, 12-, 18-, and 24-week follow-ups. The Mann-Whitney U test was used to examine differences in outcome measures between groups at each assessment, and the Wilcoxon signed-rank test was used to examine differences in outcome measures between assessments.

Results: During the first 6 weeks of the study (glucosamine supplementation only), physical activity levels, physical function, and total WOMAC scores improved (P<0.05). Between the start of the walking program (Week 6) and the final follow-up (Week 24), further improvements were seen in these outcomes (P<0.05), although most improvements occurred between Weeks 6 and 12. No significant differences were found between walking groups.

Conclusions: In people with hip or knee OA, walking a minimum of 3000 steps (~30 minutes) at least 3 days/week, in combination with glucosamine sulphate, may reduce OA symptoms. A more robust study with a larger sample is needed to confirm these preliminary findings.

Trial Registration: Australian Clinical Trials Registry ACTRN012607000159459.
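A minimal sketch of the two statistical comparisons named in the Methods, using SciPy on synthetic scores (all numbers are invented, not study data):

```python
# Illustrative between-group and within-group nonparametric tests,
# mirroring the analysis plan described above.
import numpy as np
from scipy.stats import mannwhitneyu, wilcoxon

rng = np.random.default_rng(0)
# Hypothetical WOMAC total scores (lower = better) for the two walking groups.
womac_3day_wk12 = rng.normal(40, 10, 18)
womac_5day_wk12 = rng.normal(38, 10, 18)

# Between-group comparison at one assessment (Mann-Whitney U).
u_stat, p_between = mannwhitneyu(womac_3day_wk12, womac_5day_wk12,
                                 alternative='two-sided')

# Within-group comparison between assessments (Wilcoxon signed-rank),
# e.g. Week 6 vs Week 12 for the same participants.
womac_wk6 = womac_3day_wk12 + rng.normal(5, 3, 18)
w_stat, p_within = wilcoxon(womac_wk6, womac_3day_wk12)

print(f"between-group p = {p_between:.3f}, within-group p = {p_within:.3f}")
```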

Relevance: 30.00%

Abstract:

This paper demonstrates a robust form of pose estimation and scene reconstruction from camera image data. Our results suggest that the algorithm rivals RANSAC-based pose estimation refined by bundle adjustment in terms of solution robustness, speed, and accuracy, even when given poor initialisations. Simulations covering a number of novel scenarios reflective of real-world cases show that the algorithm handles large observation noise and difficult reconstruction scenes. These results have a number of implications for the vision and robotics communities, and show that online visual motion estimation on robotic platforms is approaching real-world feasibility.
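For context, a generic hypothesize-and-verify skeleton of the RANSAC-plus-bundle-adjustment baseline the abstract compares against; `estimate_pose` and `reprojection_error` are hypothetical placeholders, not the paper's code:

```python
# Generic RANSAC loop for pose estimation from 2D-3D correspondences.
import numpy as np

def ransac_pose(points_3d, points_2d, estimate_pose, reprojection_error,
                n_iters=500, sample_size=4, inlier_thresh=2.0):
    rng = np.random.default_rng(0)
    best_pose, best_inliers = None, np.zeros(len(points_2d), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(points_2d), sample_size, replace=False)
        pose = estimate_pose(points_3d[idx], points_2d[idx])   # minimal solver
        errs = reprojection_error(pose, points_3d, points_2d)  # pixel errors
        inliers = errs < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_pose, best_inliers = pose, inliers
    # A bundle-adjustment step would then refine best_pose on best_inliers.
    return best_pose, best_inliers
```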

Relevance: 30.00%

Abstract:

Modern database systems incorporate a query optimizer to identify the most efficient "query execution plan" for executing the declarative SQL queries submitted by users. A dynamic-programming-based approach is used to exhaustively enumerate the combinatorially large search space of plan alternatives and, using a cost model, to identify the optimal choice. While dynamic programming (DP) works very well for moderately complex queries with up to around a dozen base relations, it usually fails to scale beyond this stage due to its inherent exponential space and time complexity. DP therefore becomes practically infeasible for complex queries with a large number of base relations, such as those found in current decision-support and enterprise management applications.

To address this problem, a variety of approaches have been proposed in the literature. Some jettison the DP approach entirely and resort to alternative techniques such as randomized algorithms, whereas others retain DP by using heuristics to prune the search space to computationally manageable levels. In the latter class, a well-known strategy is "iterative dynamic programming" (IDP), wherein DP is employed bottom-up until it hits its feasibility limit and is then iteratively restarted with a significantly reduced subset of the execution plans currently under consideration. The experimental evaluation of IDP indicated that, with an appropriate choice of algorithmic parameters, it was possible to almost always obtain "good" (within a factor of two of the optimal) plans, in the few remaining cases mostly "acceptable" (within an order of magnitude of the optimal) plans, and rarely a "bad" plan.

While IDP is certainly an innovative and powerful approach, we have found a variety of common query frameworks in which it can fail to consistently produce good plans, let alone the optimal choice. This is especially so when star or clique components are present, increasing the complexity of the join graphs. Worse, this shortcoming is exacerbated as the number of relations participating in the query is scaled upwards.
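A simplified sketch of the IDP strategy as described above; the helper names (`dp_enumerate`, `cheapest`, and the subplan interface) are illustrative, not from any particular optimizer:

```python
# Iterative dynamic programming: run DP bottom-up until a size limit k,
# greedily keep one cheap subplan, collapse it into a single composite
# "relation", and restart DP on the reduced problem.
def idp(relations, k, dp_enumerate, cheapest):
    while len(relations) > 1:
        limit = min(k, len(relations))
        # DP table over join subsets of size <= limit; exponential only in `limit`.
        plan_table = dp_enumerate(relations, limit)
        best_subplan = cheapest(plan_table[limit])   # greedy pick at the limit
        # Replace the relations covered by the chosen subplan with one
        # composite relation, shrinking the search space for the next pass.
        relations = [r for r in relations if r not in best_subplan.leaves]
        relations.append(best_subplan.as_base_relation())
    return relations[0]
```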

Relevance: 30.00%

Abstract:

This paper introduces a new formulation of variable horizon model predictive control (VH-MPC) that utilises move blocking to reduce computational complexity. Various results pertaining to move blocking are derived, following which a generalised blocked VH-MPC controller is formulated for linear discrete-time systems. Robustness to bounded disturbances is ensured through the use of tightened constraints. The resulting time-varying control scheme is shown to guarantee robust recursive feasibility and finite-time completion. An example is then presented for a particular choice of blocking regime, as would be applicable to vehicle manoeuvring problems. Simulations demonstrate the efficacy of the formulation.
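A minimal sketch of the move-blocking idea: the horizon's N control moves are parametrized as T·u_blocked, so the optimizer only searches over one move per block. The blocking pattern below is illustrative, not the paper's regime:

```python
# Build a blocking matrix T that holds each decision variable constant
# over a block of consecutive time steps.
import numpy as np

def blocking_matrix(blocks):
    """blocks = lengths of each block, e.g. [1, 2, 3] for a 6-step horizon."""
    N, m = sum(blocks), len(blocks)
    T = np.zeros((N, m))
    row = 0
    for j, length in enumerate(blocks):
        T[row:row + length, j] = 1.0   # hold the j-th move for `length` steps
        row += length
    return T

T = blocking_matrix([1, 2, 3])   # 6 control moves reduced to 3 decision variables
```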

Relevance: 30.00%

Abstract:

In this paper, a multiloop robust control strategy based on H∞ control and a partial least squares (PLS) model (H∞_PLS) is proposed for multivariable chemical processes. It is developed especially for ill-conditioned and non-square multivariable systems. The advantage of PLS is that it extracts the strongest relationship between the input and output variables in the reduced space of the latent-variable model rather than in the original space of the highly dimensional variables. Without conventional decouplers, the dynamic PLS framework automatically decomposes the MIMO process into multiple single-loop systems in the PLS subspace, so the controller design can be simplified. Since plant/model mismatch is almost inevitable in practical applications, the controllers are designed in the PLS latent subspace based on the H∞ mixed-sensitivity problem to enhance the robustness of the control system. The feasibility and effectiveness of the proposed approach are illustrated by simulation results for a distillation column and a mixing-tank process. Comparisons between H∞_PLS control and conventional individual control (either H∞ control or PLS control alone) are also made.
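A sketch of the PLS projection step described above, on synthetic data: the MIMO records are mapped to a few latent input/output score pairs, each of which can then be treated as a SISO channel for controller design. The H∞ mixed-sensitivity design itself is not shown; the plant gains are made up:

```python
# Project MIMO input/output data into a low-dimensional PLS latent space.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
U = rng.normal(size=(500, 3))                    # plant inputs
G = np.array([[1.0, 0.8, 0.1],
              [0.9, 1.1, 0.2]])                  # ill-conditioned gain matrix
Y = U @ G.T + 0.05 * rng.normal(size=(500, 2))   # plant outputs + noise

pls = PLSRegression(n_components=2).fit(U, Y)
t_scores, u_scores = pls.transform(U, Y)   # paired latent channels:
# each (t_i, u_i) column pair behaves like a decoupled SISO loop,
# which is where the H-infinity controllers would be designed.
```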

Relevance: 30.00%

Abstract:

Adequate user authentication is a persistent problem, particularly with mobile devices, which tend to be highly personal and at the fringes of an organisation's influence. Yet these devices are increasingly used in various business settings, where they pose a risk to security and privacy, not only from the sensitive information they may contain but also from the means they typically offer to access such information over wireless networks. User authentication is the first line of defence for a mobile device that falls into the hands of an unauthorised user. However, motivating users to enable simple password mechanisms and periodically update their authentication information is difficult at best. This paper examines some of the issues relating to the use of biometrics as a viable method of authentication on mobile wireless devices. It also critically analyses some of the techniques currently employed and, where appropriate, suggests novel hybrid ways in which they could be improved or modified. Both the biometric-technology constraints and the wireless-setting constraints that determine the feasibility and performance of the authentication feature are specified. Some well-known biometric technologies are briefly reviewed, and their feasibility for wireless and mobile use is assessed. Furthermore, a number of quantitative and qualitative evaluation parameters are presented. Biometric technologies are continuously advancing toward commercial implementation in wireless devices. When carefully designed and implemented, the advantage of biometric authentication arises mainly from increased convenience and coexistent improved security.
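As an illustration of the kind of quantitative evaluation parameters mentioned above, a toy computation of false accept rate (FAR) and false reject rate (FRR) on synthetic match scores (all values are invented):

```python
# FAR/FRR at a given match-score threshold for a biometric matcher.
import numpy as np

rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.8, 0.1, 1000)    # same-user comparison scores
impostor_scores = rng.normal(0.4, 0.1, 1000)   # different-user comparison scores

def far_frr(threshold):
    far = np.mean(impostor_scores >= threshold)  # impostors wrongly accepted
    frr = np.mean(genuine_scores < threshold)    # genuine users wrongly rejected
    return far, frr

for t in (0.5, 0.6, 0.7):
    far, frr = far_frr(t)
    print(f"threshold {t:.1f}: FAR={far:.3f}, FRR={frr:.3f}")
```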

Relevance: 30.00%

Abstract:

Uncertainty affects all aspects of the property market, but one area where its impact is particularly significant is feasibility analysis. Any development is affected by differences between market conditions at the conception of the project and the market realities at the time of completion. A feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcomes for the sale price of the completed development, the construction costs, and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time-sensitive, and the analysis needs to show the impact of time on the viability of the project. The future is uncertain, and a full feasibility analysis should be able to model the upside and downside risk across a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value, but they tend to be single-point analyses based upon a single set of “likely” inputs. In this paper we look at the practical impact of uncertainty in the input variables using a simulation model (Crystal Ball) applied to an actual case study, an urban redevelopment plan for an Italian municipality. This allows the appraiser to address the uncertainty involved and thus provide the decision maker with a better understanding of the risk of development. The technique is then refined using a “two-dimensional” approach that distinguishes between “uncertainty” and “variability”, creating a more robust model.
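A minimal Monte Carlo sketch of the kind of simulation described (plain Python standing in for Crystal Ball; every figure below is hypothetical):

```python
# Residual land value = sale value - costs - finance, with uncertain inputs
# sampled many times to expose the upside and downside risk.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
sale_value = rng.normal(12_000_000, 1_500_000, n)   # completed development value
build_cost = rng.normal(7_000_000, 700_000, n)      # construction cost
months = rng.triangular(18, 24, 36, n)              # time to completion
finance = build_cost * 0.06 * (months / 12) / 2     # rough interest on drawn costs

residual_land_value = sale_value - build_cost - finance
print(f"mean residual land value: {residual_land_value.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(residual_land_value, [5, 95])}")
```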

Relevance: 30.00%

Abstract:

This paper proposes a new methodology to control the power flow between a distributed generator (DG) and the electrical power distribution grid. Droop voltage control is used to manage the active and reactive power. Through this control, a sinusoidal voltage reference is generated to be tracked by the voltage loop, which in turn generates the reference for the current loop. The proposed control introduces feed-forward states that improve control performance, yielding high-quality current injected into the grid. The controllers were obtained through linear matrix inequalities (LMIs), using D-stability analysis to place the closed-loop poles. The results show a quick transient response with low oscillation. The paper presents the proposed control technique and the main simulation results, and a 1000 VA prototype was developed in the laboratory to demonstrate the feasibility of the proposed control.
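For reference, the conventional droop relations that underlie this kind of power-flow control (generic textbook form; the symbols and gains are not taken from the paper):

```latex
% Active power is traded against frequency and reactive power against
% voltage amplitude when generating the sinusoidal voltage reference:
\begin{align*}
  f &= f_0 - k_p \,(P - P_0), \\
  V &= V_0 - k_q \,(Q - Q_0),
\end{align*}
% where f_0, V_0 are the nominal frequency and voltage, P_0, Q_0 the power
% set points, and k_p, k_q the droop gains.
```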

Relevance: 30.00%

Abstract:

T2 mapping techniques use the relaxation constant as an indirect marker of cartilage structure, and this constant has been shown to be a sensitive parameter for cartilage evaluation. As an additional robust biomarker, T2* relaxation time is a clinically feasible parameter for the biochemical evaluation of articular cartilage.
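T2* is commonly estimated by fitting a mono-exponential decay S(TE) = S0·exp(−TE/T2*) to a multi-echo acquisition; a minimal sketch with synthetic signal values (not data from this study):

```python
# Fit a mono-exponential decay to multi-echo signal intensities to recover T2*.
import numpy as np
from scipy.optimize import curve_fit

te = np.array([4.0, 8.0, 12.0, 16.0, 20.0])   # echo times, ms
signal = 100 * np.exp(-te / 25.0) + np.random.default_rng(0).normal(0, 1, 5)

def decay(te, s0, t2star):
    return s0 * np.exp(-te / t2star)

(s0_fit, t2star_fit), _ = curve_fit(decay, te, signal, p0=(100.0, 20.0))
print(f"fitted T2* = {t2star_fit:.1f} ms")
```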

Relevance: 30.00%

Abstract:

A novel solution to the long-standing issue of chip entanglement and breakage in metal cutting is presented in this dissertation. Through this work, an attempt is made to achieve universal chip control in machining by using chip guidance and subsequent breakage by backward bending (tensile loading of the chip's rough top surface) to break long continuous chips into small segments.

One big limitation of chip breaker geometries in disposable carbide inserts is that their application range is limited to a narrow band of cutting conditions. Even within a recommended operating range, chip breakers do not function as designed due to the inherent variations of the cutting process. Moreover, for a particular process, matching the chip breaker geometry with the right cutting conditions is a very iterative exercise. The existence of a large variety of proprietary chip breaker designs further exacerbates the problem of implementing a robust and comprehensive chip control technique.

To address the need for a robust and universal chip control technique, a new method is proposed in this work. By using a single tool top-form geometry coupled with a tooling system for inducing chip breaking by backward bending, the proposed method achieves comprehensive chip control over a wide range of cutting conditions. A geometry-based model is developed to predict a variable edge inclination angle that guides the chip flow to a predetermined target location. Chip kinematics for the new tool geometry is examined via photographic evidence from experimental cutting trials, using both qualitative and quantitative characterization methods.

Results from the chip characterization studies indicate that the chip flow and final form are remarkably consistent across multiple workpiece and tool configurations as well as cutting conditions. A new tooling system is then designed to break the chip comprehensively by backward bending. Test results with the new tooling system show that, by utilizing the chip guidance and backward-bending mechanism, long continuous chips can be consistently broken into smaller segments that are generally deemed acceptable or good chips. The proposed tool can be applied effectively over a wider range of cutting conditions than present chip breakers, taking possibly the first step towards universal chip control in machining.

Relevance: 30.00%

Abstract:

BACKGROUND: Multiple breath washout (MBW) derived Scond is an established index of ventilation inhomogeneity. Time-consuming post hoc calculation of the expirogram's slope of alveolar phase III (SIII) and the lack of available software have hampered widespread application of Scond.

METHODS: Seventy-two school-aged children (45 with cystic fibrosis; CF) performed three nitrogen MBW tests. We tested a new automated algorithm for Scond analysis (Scond_auto) comprising breath selection for SIII detection, calculation, and reporting of test quality. We compared Scond_auto to (i) standard Scond analysis (Scond_manual) with manual breath selection and (ii) pragmatic Scond analysis including all breaths (Scond_all). Primary outcomes were success rate, agreement between the different Scond protocols, and Scond fitting quality (linear regression R²).

RESULTS: Average Scond_auto (0.06 for CF and 0.01 for controls) was not different from Scond_manual (0.06 for CF and 0.01 for controls) and showed comparable fitting quality (R² 0.53 for CF and 0.13 for controls vs. R² 0.54 for CF and 0.13 for controls). Scond_all was similar in CF and controls but with inferior fitting quality compared to Scond_auto and Scond_manual.

CONCLUSIONS: Automated Scond calculation is feasible and produces robust results comparable to the standard manual way of calculating Scond. This algorithm provides a valid, fast, and objective tool for regular use, even in children.
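A sketch of the standard Scond computation that such an algorithm automates: a linear regression of the normalized phase III slopes (SnIII) against lung turnover (TO), conventionally over TO 1.5 to 6. The values below are synthetic, not study data:

```python
# Scond is the slope of SnIII vs. lung turnover within the conventional window.
import numpy as np

turnover = np.linspace(0.5, 8.0, 25)   # TO for each breath of the washout
sniii = 0.06 * turnover + 0.02 + np.random.default_rng(0).normal(0, 0.01, 25)

mask = (turnover >= 1.5) & (turnover <= 6.0)   # conventional TO window
scond, intercept = np.polyfit(turnover[mask], sniii[mask], 1)
print(f"Scond = {scond:.3f} /L")
```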

Relevance: 30.00%

Abstract:

Robotic assistance in the context of lateral skull base surgery, particularly during cochlear implantation procedures, has been the subject of considerable research over the last decade. The use of robotics during these procedures has the potential to provide significant benefits to the patient by reducing invasiveness when gaining access to the cochlea, as well as reducing intracochlear trauma when performing a cochleostomy. Presented herein is preliminary work on the combination of two robotic systems for reducing invasiveness and trauma in cochlear implantation procedures. A robotic system for minimally invasive inner ear access was combined with a smart drilling tool for robust and safe cochleostomy; evaluation was completed on a single human cadaver specimen. Access to the middle ear was successfully achieved through the facial recess without damage to surrounding anatomical structures; cochleostomy was completed at the planned position with the endosteum remaining intact after drilling as confirmed by microscope evaluation.