866 results for "Vehicle routing problems with gains"
Abstract:
We prove exponential rates of convergence of hp-version discontinuous Galerkin (dG) interior penalty finite element methods for second-order elliptic problems with mixed Dirichlet-Neumann boundary conditions in axiparallel polyhedra. The dG discretizations are based on axiparallel, σ-geometric anisotropic meshes of mapped hexahedra and anisotropic polynomial degree distributions of μ-bounded variation. We consider piecewise analytic solutions which belong to a larger analytic class than those for the pure Dirichlet problem considered in [11, 12]. For such solutions, we establish the exponential convergence of a nonconforming dG interpolant given by local L²-projections on elements away from corners and edges, and by suitable local low-order quasi-interpolants on elements at corners and edges. Due to the appearance of non-homogeneous, weighted norms in the analytic regularity class, new arguments are introduced to bound the dG consistency errors in elements abutting on Neumann edges. The non-homogeneous norms also entail some crucial modifications of the stability and quasi-optimality proofs, as well as of the analysis for the anisotropic interpolation operators. The exponential convergence bounds for the dG interpolant constructed in this paper generalize the results of [11, 12] for the pure Dirichlet case.
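For orientation, exponential convergence results of this type in three-dimensional polyhedra are typically stated as bounds of the following generic form (a sketch of the shape of such estimates; the norm, the constants, and the exponent 1/5 depend on the mesh grading and the regularity class, and are quoted here as an assumption rather than from this paper):

```latex
\| u - u_{\mathrm{dG}} \|_{\mathrm{dG}} \;\le\; C \exp\bigl(-b\,N^{1/5}\bigr),
```

where $N$ denotes the total number of degrees of freedom of the hp-dG space and $C, b > 0$ are constants independent of $N$.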
Abstract:
BACKGROUND Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices, were observed with soldered gold bars (G-bars). Computer-aided design/computer-assisted manufacturing (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. PURPOSE To compare the prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bars and soldered G-bars. MATERIALS AND METHODS Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded for a minimum of 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. RESULTS Data from 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. The Ti-bar and G-bar groups comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bars mostly exhibited distal bar extensions (96%) compared with 34% of G-bars (p < .001). The fracture rate of bar extensions (4.7% vs 14.8%, p < .001) and matrices (1% vs 13%, p < .001) was lower for Ti-bars. Matrix activation was required 2.4 times less often in the Ti-bar group. ΔBIC remained stable for both groups. CONCLUSIONS Implant overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. The fracture locations indicated that the titanium thickness around the screw-access hole should be increased.
Abstract:
We present a novel surrogate-model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to handle the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
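The standard expected improvement criterion mentioned above has a closed form under a Gaussian posterior. A minimal sketch for minimization follows; the function name and interface are illustrative, not SpLEGO's:

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Standard expected-improvement criterion for minimization.

    mu, sigma: GP posterior mean and standard deviation at a candidate point.
    f_min: best (lowest) objective value observed so far.
    """
    if sigma <= 0.0:
        return 0.0
    z = (f_min - mu) / sigma
    # Standard normal pdf and cdf via the error function (no SciPy needed).
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_min - mu) * cdf + sigma * pdf
```

EI vanishes where the model is certain (sigma = 0) and grows both with predicted improvement and with predictive uncertainty, which is what makes it an exploration-exploitation trade-off.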
Abstract:
This dissertation was written in the format of three journal articles. Paper 1 examined the influence of change and fluctuation in body mass index (BMI) over an eleven-year period on changes in serum lipid levels (total, HDL, and LDL cholesterol, and triglycerides) in a population of Mexican Americans with type 2 diabetes. Linear regression models containing initial lipid value, BMI and age, BMI change (the slope of BMI over time), and BMI fluctuation (the root mean square error around that trend) were used to investigate associations of these variables with change in lipids over time. Increasing BMI over time was associated with gains in total and LDL cholesterol and triglyceride levels in women. Fluctuation of BMI was not associated with detrimental lipid profiles. These effects were independent of age and were not statistically significant in men. In Mexican-American women with type 2 diabetes, weight reduction is likely to result in more favorable levels of total and LDL cholesterol and triglycerides, without concern for possible detrimental effects of weight fluctuation. Weight reduction may not be as effective in men, but does not appear to be harmful either. Paper 2 examined the associations of upper and total body fat with total cholesterol, HDL and LDL cholesterol, and triglyceride levels in the same population. Multilevel analysis was used to predict serum lipid levels from total body fat (BMI and triceps skinfold) and upper body fat (subscapular skinfold), while controlling for the effects of sex, age, and self-correlations across time. Body fat was not strikingly associated with trends in serum lipid levels. However, upper body fat was strongly associated with triglyceride levels. This suggests that loss of upper body fat may be more important than weight loss in the management of the hypertriglyceridemia commonly seen in type 2 diabetes. Paper 3 was a review of the literature reporting associations between weight fluctuation and lipid levels. Few studies have reported associations between weight fluctuation and total, LDL, and HDL cholesterol and triglyceride levels. The body of evidence to date suggests that weight fluctuation does not strongly influence these levels.
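The two BMI summary measures used in Paper 1 can be sketched as follows: the slope of a per-subject least-squares line plays the role of "BMI change", and the root mean square error around that line plays the role of "fluctuation". The function and the toy data are illustrative, not taken from the dissertation:

```python
import math

def bmi_change_and_fluctuation(years, bmis):
    """Per-subject BMI trend: OLS fit of BMI = a + b * year.

    Returns (slope, rmse): the slope is the 'BMI change' measure and the
    root mean square error around the fitted line is the 'fluctuation'.
    """
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(bmis) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, bmis))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    residuals = [y - (intercept + slope * x) for x, y in zip(years, bmis)]
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    return slope, rmse
```

A steadily rising BMI gives a positive slope with near-zero RMSE, while a BMI that oscillates around a flat trend gives a small slope with a large RMSE, separating the two constructs the dissertation analyzes.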
Abstract:
Introduction. Selectively manned units have a long international history, both military and civilian. Examples include SWAT teams, firefighters, the FBI, the DEA, the CIA, and military Special Operations. These special-duty operators are individuals who perform a highly skilled and dangerous job in a unique environment. A significant amount of money is spent by the Department of Defense (DoD) and other federal agencies to recruit, select, train, equip, and support these operators. When a critical incident or significant life event occurs that jeopardizes an operator's performance, there can be heavy losses in terms of training, time, money, and, potentially, lives. In order to limit the number of critical incidents, selection processes have been developed over time to "select out" those individuals most likely to perform below desired performance standards under pressure or stress and to "select in" those with the "right stuff". This study is part of a larger program evaluation to assess markers that identify whether a person will fail under the stresses of a selectively manned unit. The primary question of the study is whether there are indicators in the selection process that signify potential negative performance at a later date. Methods. The population studied included applicants to a selectively manned DoD organization between 1993 and 2001 who took part in a unit assessment and selection process (A&S). Approximately 1900 A&S records were included in the analysis. Over this nine-year period, seventy-two individuals were determined to have had a critical incident. A critical incident can come in the form of problems with the law; personal, behavioral, or family problems; integrity issues; or a skills deficit. Of the seventy-two individuals, fifty-four had full assessment data and subsequent supervisor performance ratings, which assessed how an individual performed while on the job.
This group was compared across a variety of variables, including demographics and psychometric testing, with a group of 178 individuals who did not have a critical incident and had been determined to be good performers with positive ratings by their supervisors. Results. In approximately 2004, an online pre-screen survey was developed in the hope of selecting out those individuals with items that would potentially make them ineligible for selection to this organization. This survey has helped the organization increase its selection rates and save resources in the process (Patterson, Howard Smith, & Fisher, Unit Assessment and Selection Project, 2008). When the same pre-screen was applied to the critical-incident individuals, it was found that over 60% of them would have been flagged as unacceptable, which would have saved the organization valuable resources. There were some subtle demographic differences between the two groups (e.g., those with critical incidents were almost twice as likely to be divorced compared with the positive performers). When the psychometric testing was compared, several differences were noted. The two groups were similar when their IQ levels were compared using the Multidimensional Aptitude Battery (MAB). On the Minnesota Multiphasic Personality Inventory (MMPI), there appeared to be a difference on Social Introversion: the critical-incident group scored somewhat higher. The number of MMPI critical items was similar between the two groups. When scores on the NEO Personality Inventory (NEO) were compared, the critical-incident individuals tended to score higher on Openness and its subscales (Ideas, Actions, and Feelings). There was a positive correlation between total Neuroticism T score and the number of MMPI critical items. Conclusions.
This study shows that the current pre-screening process is working and would have saved the organization significant resources. If one were to develop a profile of a candidate who could potentially suffer a critical incident and subsequently jeopardize the unit, the mission, and the safety of the public, it would look like the following: either divorced or never married; scores high on MMPI Social Introversion; shows an "excessive" number of MMPI critical items; and scores high on NEO Openness and its subscales Ideas, Feelings, and Actions. Based on the results gleaned from the analysis in this study, there seem to be several factors within psychometric testing that, taken together, will aid evaluators in selecting only the highest-quality operators, in order to save resources and to help protect the public from unfortunate critical incidents that may adversely affect health and safety.
Abstract:
Characteristics of Medicare-certified home health agencies in Texas and the contributions of selected agency characteristics to home health care costs were examined. Cost models were developed and estimated for both nursing and total visit costs using multiple regression procedures. The models included home health agency size, profit status, control, hospital-based affiliation, contract-cost ratio, service provision, competition, urban-rural input-price differences, and selected measures of patient case-mix. The study population comprised 314 home health agencies in Texas that had been certified at least one year as of July 1, 1986. Data for the analysis were obtained from Medicare Cost Reports for fiscal years ending between July 1, 1985 and June 30, 1986. Home health agency size, as measured by the logs of nursing and total visits, has a statistically significant negative linear relationship with nursing visit and total visit costs. Nursing and total visit costs decrease at a declining rate as size increases. The size-cost relationship is not altered when controlling for any other agency characteristic. The number of visits per patient per year, a measure of patient case-mix, is also negatively related to costs, suggesting that costs decline with the care of chronic patients. Hospital-based affiliation and urban location are positively associated with costs. Together, the four characteristics explain 19 percent of the variance in nursing visit costs and 24 percent of the variance in total visit costs. Profit status and control, although correlated with other agency characteristics, exhibit no observable effect on costs. Although no relationship was found between costs and competition, the contract-cost ratio, or the provision of non-reimbursable services, no conclusions can be drawn, due to problems with the measurement of these variables.
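The log-linear size-cost relationship described above can be sketched with a simple ordinary-least-squares fit of unit cost on the log of visit volume. This is an illustrative sketch with made-up numbers, not the study's actual regression specification:

```python
import math

def fit_cost_on_log_size(visits, unit_costs):
    """OLS fit of per-visit cost on log(visit volume): cost = b0 + b1*log(visits).

    A negative b1 reproduces the size-cost pattern described above:
    costs decrease at a declining rate as agency size grows.
    """
    xs = [math.log(v) for v in visits]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(unit_costs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, unit_costs))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1
```

Because the regressor is log(visits), each doubling of agency size lowers predicted cost by a constant amount, which is exactly a "decrease at a declining rate" in the raw size variable.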
Abstract:
Since the epoch-making "memoir" of Saint-Venant in 1855, the torsion of prismatic and cylindrical bars has been reduced to a mathematical problem: the calculation of an analytic function satisfying prescribed boundary values. For over a century, until the first applications of the FEM to the problem, the only possibilities for study in irregularly shaped domains were the beautiful, but limited, theory of complex function analysis, several functional approaches, and the finite difference method. Nevertheless, in 1963 Jaswon published an interesting paper which was nearly lost amid the splendid FEM boom. The method was extended by Rizzo to more complicated problems and definitively incorporated into the scientific community's background through several lecture notes of Cruse, recently published but widely circulated during past years. The work of several researchers has shown the tremendous possibilities of the method, which is today a recognized alternative to the well-established FE procedure. In fact, the first comprehensive attempt to cover the method has recently been published in textbook form. This paper is a contribution to the treatment of a difficulty which arises when the isoparametric element concept is applied to plane potential problems with sharp corners on the boundary of the domain. In previous works, these problems were avoided using two principal approximations: equating the fluxes around the corner, or establishing a binode element (in effect, truncating the corner). The first approximation heavily distorts the solution in the corner neighbourhood, and a great number of elements is necessary to reduce its influence. The second is better suited, but the price paid is an increase in the size of the system of equations to be solved. In this paper an alternative formulation, consistent with the shape functions chosen in the isoparametric representation, is presented. For ease of comprehension the formulation has been limited to the linear element. Nevertheless, its extension to more refined elements is straightforward. A direct procedure for assembling the equations is also presented, in an attempt to reduce the in-core computer requirements.
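As a minimal illustration of the linear isoparametric element to which the formulation is limited (a generic sketch, not the paper's corner treatment): on each boundary segment, both the geometry and the potential are interpolated with the same linear shape functions in the local coordinate ξ ∈ [-1, 1].

```python
def shape_functions(xi):
    """Linear isoparametric shape functions on the reference segment [-1, 1]."""
    return (1.0 - xi) / 2.0, (1.0 + xi) / 2.0

def interpolate(xi, node_a, node_b):
    """Interpolate nodal values (geometry or potential) at local coordinate xi."""
    n1, n2 = shape_functions(xi)
    return n1 * node_a + n2 * node_b
```

Using the same shape functions for geometry and field is what "isoparametric" means; the corner difficulty arises because the flux is discontinuous at a sharp corner while a single shared node forces one value there.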
Abstract:
Dominance measuring methods are a new approach to dealing with complex decision-making problems with imprecise information. These methods are based on the computation of pairwise dominance values and exploit the information in the dominance matrix in different ways to derive measures of dominance intensity and rank the alternatives under consideration. In this paper we propose a new dominance measuring method to deal with ordinal information about decision-maker preferences in both weights and component utilities. It takes advantage of the centroid of the polytope delimited by the ordinal information and builds triangular fuzzy numbers whose distances to the crisp value 0 constitute the basis for the definition of a dominance intensity measure. Monte Carlo simulation techniques have been used to compare the performance of this method with other existing approaches.
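For illustration, one simple way to reduce a triangular fuzzy number (a, b, c) to a distance from the crisp value 0 is via its centroid. This particular choice of distance is an assumption for the sketch, not necessarily the definition used in the paper:

```python
def tfn_centroid(a, b, c):
    """Centroid (defuzzified value) of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def distance_to_zero(a, b, c):
    """Distance from the TFN to the crisp value 0 via its centroid.

    Illustrative assumption: the paper defines its own distance for the
    dominance intensity measure.
    """
    return abs(tfn_centroid(a, b, c))
```

Under any such distance, a pairwise dominance value represented by a TFN lying further from 0 contributes a larger dominance intensity, which is what drives the ranking of alternatives.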
Abstract:
Wireless teleoperation of field robots for maintenance, inspection, and rescue missions is often performed in environments with low wireless connectivity, caused by signal losses from the environment and distance from the wireless transmitters. Various studies in the literature have addressed these problems with time-delay robust control systems and multi-hop wireless relay networks. However, such approaches do not solve the issue of how to present wireless data to the operator to avoid losing control of the robot. Although teleoperation for maintenance often already involves haptic devices, no studies look at the possibility of using this existing feedback channel to aid operators in navigating areas of variable wireless connectivity. We propose a method to incorporate haptic information into the velocity control of an omnidirectional robot to augment the operator's perception of wireless signal strength in the remote environment. In this paper we introduce a mapping from the wireless signal strengths of multiple receivers to the force feedback of a 6-degree-of-freedom haptic master, and evaluate the proposed approach using experimental data and randomly generated wireless maps.
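One simple way to realize such a mapping is to render a resistive force that ramps up as the received signal approaches a loss-of-control threshold. This is a sketch under assumed parameter values, not the authors' exact formulation:

```python
def wireless_force(rssi_dbm, threshold_dbm=-85.0, cutoff_dbm=-60.0, max_force=5.0):
    """Map a received signal strength (dBm) to a resistive force magnitude (N).

    Above cutoff_dbm the force is zero; it ramps up linearly as the signal
    approaches threshold_dbm, where control would be lost. All parameter
    values here are illustrative assumptions.
    """
    if rssi_dbm >= cutoff_dbm:
        return 0.0
    if rssi_dbm <= threshold_dbm:
        return max_force
    # Linear ramp between the comfortable cutoff and the loss threshold.
    frac = (cutoff_dbm - rssi_dbm) / (cutoff_dbm - threshold_dbm)
    return max_force * frac
```

Applying this force against the commanded velocity direction makes driving into a dead zone physically harder, so the operator feels the connectivity gradient instead of having to read it off a display.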
Abstract:
Dominance measuring methods are an approach for dealing with complex decision-making problems with imprecise information within multi-attribute value/utility theory. These methods are based on the computation of pairwise dominance values and exploit the information in the dominance matrix in different ways to derive measures of dominance intensity and rank the alternatives under consideration. In this paper we review the dominance measuring methods proposed in the literature for dealing with imprecise information (intervals, ordinal information, or fuzzy numbers) about decision-makers' preferences, and their performance in comparison with other existing approaches, such as SMAA and SMAA-II or Sarabando and Dias's method.
Abstract:
The objective of this thesis is research on numerical algorithms for developing simulation tools for both the seakeeping behaviour and the wave resistance of ships and floating structures. The first tool developed solves the wave diffraction and radiation problem. It is based on the finite element method (FEM) for solving the Laplace equation, together with schemes based on FEM, streamline integration, and finite differences developed for the free-surface condition. Numerical tools have also been developed for solving rigid-body dynamics in multibody systems with constraints. These tools have been integrated with the diffracted and radiated wave solver to address wave-body interaction problems. Coupling algorithms with other numerical tools have also been designed for solving multi-physics problems. In particular, couplings have been implemented with an FEM structural solver for fluid-structure interaction problems, with a mooring-line solver, and with a solver for flows in internal tanks for coupled seakeeping-sloshing problems. Numerical simulations have been carried out to validate and verify the developed algorithms, as well as to analyse different case studies with diverse applications in naval architecture, ocean engineering, and marine renewable energy.
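As a minimal illustration of the FEM building block mentioned above (a generic sketch, not the thesis code): for the 1D Laplace equation u'' = 0 on [0, 1] with Dirichlet ends, uniform linear elements yield a tridiagonal system solvable with the Thomas algorithm, and the discrete solution reproduces the exact linear interpolant between the boundary values.

```python
def solve_laplace_1d(n_elements, u_left, u_right):
    """Linear-FEM solution of u'' = 0 on [0, 1] with Dirichlet end values.

    Uniform linear elements give the stiffness stencil (-1, 2, -1)/h for
    the interior nodes; the resulting tridiagonal system is solved with
    the Thomas algorithm. Returns the n_elements + 1 nodal values.
    """
    h = 1.0 / n_elements
    m = n_elements - 1          # number of interior unknowns
    if m == 0:
        return [u_left, u_right]
    a = [-1.0 / h] * m          # sub-diagonal
    b = [2.0 / h] * m           # main diagonal
    c = [-1.0 / h] * m          # super-diagonal
    d = [0.0] * m               # load vector (zero source term)
    d[0] += u_left / h          # boundary contributions
    d[-1] += u_right / h
    for i in range(1, m):       # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * m               # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [u_left] + u + [u_right]
```

The same assemble-and-solve structure, with vastly larger sparse systems and a free-surface condition, underlies the 3D diffraction-radiation solver described in the abstract.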
Abstract:
Among many other problems, the migration, humanitarian, and policy crises in the European Union in 2015 and early 2016 highlighted a pressing need for reliable, timely, and comparable statistical data on migration, asylum, and arrivals at national borders. In this fast-moving policy field, data production and the timeliness of dissemination have seen some improvements, but the sources of data remain largely unchanged at the national level. In this paper the author examines the reasons for some of the problems with the data used for policy and for public discussion, and makes a set of recommendations that call for a complete and updated inventory of data sources and for an evaluation of the quality of data used for policy-making.
Abstract:
We have designed and tested an Internet-based videophone suitable for use in the homes of families in need of paediatric palliative care services. The equipment uses an ordinary telephone line and includes a PC, Web camera, and modem housed in a custom-made box. In initial field testing, six clinical consultations were conducted in a one-month trial of the videophone with a family in receipt of palliative care services who were living in the outer suburbs of Brisbane. Problems with variability in call quality, namely audio and video freezing and audio break-up, prompted further laboratory testing. We completed a programme of over 250 test calls. Fixing the modem connection parameters to use the V.34 modulation protocol at a set bandwidth of 24 kbit/s improved connection stability and the reliability of the videophone. In subsequent field testing, 47 of 50 calls (94%) connected without problems. The freezes that did occur were brief (with greatly reduced packet loss) and had little effect on the ability to communicate, unlike the problems arising in the home testing. The low-bandwidth Internet-based videophone we have developed provides a feasible means of delivering telemedicine in the home.
Abstract:
The testing of concurrent software components can be difficult due to the inherent non-determinism present in these components. For example, if the same test case is run multiple times, it may produce different results. This non-determinism may lead to problems with determining expected outputs. In this paper, we present and discuss several possible solutions to this problem in the context of testing concurrent Java components using the ConAn testing tool. We then present a recent extension to the tool that provides a general solution to this problem that is sufficient to deal with the level of non-determinism that we have encountered in testing over 20 components with ConAn. © 2005 IEEE
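The oracle problem described above can be made concrete with a small, deterministic enumeration (a language-agnostic sketch in Python, not ConAn itself): an unsynchronized `counter += 1` executed by two threads decomposes into a read step and a write step, and interleaving those steps yields a *set* of legal final values, so a test oracle must accept any member of that set rather than a single expected output.

```python
from itertools import permutations

def final_counter_values():
    """Enumerate final values of `counter += 1` run by two threads.

    Each increment is really two steps: read the counter, then write
    read-value + 1. Interleaving the steps of two threads can lose an
    update, so the set of possible outcomes has more than one element.
    """
    outcomes = set()
    # Each thread contributes two steps; the first occurrence of a label
    # in an ordering is that thread's read, the second is its write.
    steps = ["A", "A", "B", "B"]
    for order in set(permutations(steps)):
        counter = 0
        local = {}
        phase = {"A": 0, "B": 0}
        for t in order:
            if phase[t] == 0:
                local[t] = counter          # read
            else:
                counter = local[t] + 1      # write read-value + 1
            phase[t] += 1
        outcomes.add(counter)
    return outcomes

# A deterministic oracle for this component accepts any value in the set:
#   assert observed in final_counter_values()
```

The "lost update" outcome (1 instead of 2) is exactly the kind of legitimate alternative result that makes single-valued expected outputs inadequate for concurrent components.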
Abstract:
Physical distribution plays an important role in contemporary logistics management. Both the satisfaction level of customers and the competitiveness of a company can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem consisting of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, existing solution approaches for the MDVRP are not satisfactory because unrealistic assumptions were made about the first sub-problem of the MDVRP, i.e., the customer assignment problem. To refine the approaches, the focus of this paper is confined to this sub-problem only. This paper formulates the customer assignment problem as a minimax-type integer linear programming model with the objective of minimizing the cycle time of the depots, where setup times are explicitly considered. Since the problem is proven to be NP-complete, a genetic algorithm is developed for solving it. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
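A genetic algorithm for the minimax customer-assignment problem can be sketched as follows; the encoding, operators, and parameters are illustrative assumptions, not necessarily those of the paper:

```python
import random

def cycle_time(assignment, service, setup, n_depots):
    """Cycle time of each depot: setup time plus total assigned service time.

    Returns the maximum over depots (the minimax objective).
    """
    loads = [0.0] * n_depots
    for cust, depot in enumerate(assignment):
        loads[depot] += service[cust][depot]
    for d in range(n_depots):
        if any(depot == d for depot in assignment):
            loads[d] += setup[d]
    return max(loads)

def genetic_assignment(service, setup, pop_size=30, generations=200, seed=0):
    """Toy GA for the minimax customer-assignment problem.

    Chromosome = list mapping each customer to a depot. Tournament
    selection, one-point crossover, random-reset mutation. All operator
    choices and parameter values are illustrative assumptions.
    """
    rng = random.Random(seed)
    n_cust, n_depots = len(service), len(service[0])
    fit = lambda ind: cycle_time(ind, service, setup, n_depots)
    pop = [[rng.randrange(n_depots) for _ in range(n_cust)]
           for _ in range(pop_size)]
    best = min(pop, key=fit)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = min(rng.sample(pop, 3), key=fit)   # tournament selection
            p2 = min(rng.sample(pop, 3), key=fit)
            cut = rng.randrange(1, n_cust)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                  # random-reset mutation
                child[rng.randrange(n_cust)] = rng.randrange(n_depots)
            nxt.append(child)
        pop = nxt
        cand = min(pop, key=fit)
        if fit(cand) < fit(best):
            best = cand
    return best, fit(best)
```

On a small symmetric instance (e.g., four customers and two depots with equal service and setup times), a balanced assignment minimizes the maximum depot cycle time, and the GA quickly recovers it.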