46 results for Nonhomogeneous initial-boundary-value problems
Abstract:
This study has concentrated on the development of an impact simulation model for use at the sub-national level. The necessity for the development of this model was demonstrated by the growth of local economic initiatives during the 1970s and the lack of monitoring and evaluation exercises to assess their success and cost-effectiveness. The first stage of research involved confirming that the potential for micro-economic and spatial initiatives existed. This was done by identifying the existence of involuntary structural unemployment. The second stage examined the range of employment policy options from the macro-economic, micro-economic and spatial perspectives, and focused on the need for evaluation of those policies. The need for spatial impact evaluation exercises in respect of other exogenous shocks and structural changes was also recognised. The final stage involved the investigation of current techniques of evaluation and their adaptation for the purpose in hand. This led to the recognition of a gap in the armoury of techniques. The employment-dependency model has been developed to fill that gap, providing a low-budget model, capable of implementation at the small-area level and generating a vast array of industrially disaggregated data, in terms of employment, employment-income, profits, value-added and gross income, related to levels of United Kingdom final demand. This provides scope for a variety of impact simulation exercises.
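As a rough, hypothetical illustration of the kind of calculation an employment-dependency model rests on, the sketch below runs a standard Leontief-type computation: a final demand vector is converted into gross outputs via the inverse of (I - A), and industrially disaggregated employment follows from employment-per-output coefficients. The three-industry coefficients and demand figures are invented and this is not the author's actual model or data.

```python
import numpy as np

# Invented 3-industry technical coefficients (inter-industry purchases per
# unit of gross output), employment coefficients and final demand.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.08]])
emp_per_output = np.array([12.0, 8.0, 20.0])    # jobs per unit of gross output
final_demand = np.array([100.0, 250.0, 80.0])   # final demand by industry

leontief_inverse = np.linalg.inv(np.eye(3) - A)
gross_output = leontief_inverse @ final_demand   # output needed to meet demand
employment = emp_per_output * gross_output       # employment by industry

print(gross_output.round(1))
print(employment.round(1))
print("total employment dependent on this final demand:", round(employment.sum(), 1))
```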
Abstract:
We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS), in conjunction with the Tikhonov regularization method, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion, which ceases the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solutions increase, is also presented. The MFS-based iterative algorithms with relaxation are tested for Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
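As a hedged illustration of one building block named above, the sketch below applies Tikhonov regularization with the parameter chosen by generalized cross-validation (GCV) to a generic ill-conditioned least-squares system standing in for an MFS collocation matrix. It is not the authors' code; the test matrix, noise level and parameter grid are made up.

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Tikhonov-regularized least squares, with the regularization parameter
    picked from a candidate grid by generalized cross-validation (GCV)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    m = A.shape[0]
    best = (np.inf, None, None)
    for lam in lambdas:
        filt = s**2 / (s**2 + lam**2)                # Tikhonov filter factors
        x = Vt.T @ (s * beta / (s**2 + lam**2))      # regularized solution
        resid = np.linalg.norm(A @ x - b)**2
        gcv = m * resid / (m - filt.sum())**2        # GCV functional
        if gcv < best[0]:
            best = (gcv, lam, x)
    return best[1], best[2]

# Ill-conditioned stand-in for an MFS collocation system, with noisy data.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 60), 25, increasing=True)
x_true = rng.standard_normal(25)
b = A @ x_true + 1e-3 * rng.standard_normal(60)
lam, x = tikhonov_gcv(A, b, np.logspace(-8, 0, 40))
print(lam, np.linalg.norm(A @ x - b))
```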
Abstract:
We propose and investigate a method for the stable determination of a harmonic function from knowledge of its value and its normal derivative on a part of the boundary of the (bounded) solution domain (Cauchy problem). We reformulate the Cauchy problem as an operator equation on the boundary using the Dirichlet-to-Neumann map. To discretize the obtained operator, we modify and employ a method denoted as Classic II given in [J. Helsing, Faster convergence and higher accuracy for the Dirichlet–Neumann map, J. Comput. Phys. 228 (2009), pp. 2578–2586, Section 3], which is based on Fredholm integral equations and Nyström discretization schemes. Then, for stability reasons, to solve the discretized integral equation we use the method of smoothing projection introduced in [J. Helsing and B.T. Johansson, Fast reconstruction of harmonic functions from Cauchy data using integral equation techniques, Inverse Probl. Sci. Eng. 18 (2010), pp. 381–399, Section 7], which makes it possible to solve the discretized operator equation in a stable way at low computational cost and with high accuracy. With this approach, for sufficiently smooth Cauchy data, the normal derivative can also be accurately computed on the part of the boundary where no data is initially given.
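The sketch below is only meant to illustrate, under assumptions, the Nyström ingredient mentioned above: a second-kind Fredholm integral equation on a periodic interval is discretized with the trapezoidal rule, which converges super-algebraically for smooth periodic kernels. The kernel and the manufactured solution are invented for the test and are unrelated to the actual Dirichlet-to-Neumann formulation or the smoothing projection.

```python
import numpy as np

def solve_nystrom(n, kernel, f):
    """Nystrom solution of phi + K phi = f on [0, 2*pi) with the periodic
    trapezoidal rule, where (K phi)(t) = int_0^{2pi} k(t, s) phi(s) ds."""
    t = 2 * np.pi * np.arange(n) / n
    h = 2 * np.pi / n
    K = kernel(t[:, None], t[None, :])      # kernel sampled at node pairs
    return t, np.linalg.solve(np.eye(n) + h * K, f(t))

kernel = lambda t, s: 0.3 * np.exp(np.cos(t - s))   # smooth periodic kernel
phi_exact = lambda s: np.sin(2 * s)                  # manufactured solution

# Build the right-hand side f = phi + K phi with a very fine quadrature.
s_fine = 2 * np.pi * np.arange(4096) / 4096
def f(t):
    t = np.atleast_1d(t)
    Kphi = (2 * np.pi / s_fine.size) * (kernel(t[:, None], s_fine[None, :]) @ phi_exact(s_fine))
    return phi_exact(t) + Kphi

for n in (8, 16, 32):
    t, phi = solve_nystrom(n, kernel, f)
    print(n, np.max(np.abs(phi - phi_exact(t))))     # error decays rapidly with n
```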
Abstract:
We investigate an application of the method of fundamental solutions (MFS) to the one-dimensional parabolic inverse Cauchy–Stefan problem, where boundary data and the initial condition are to be determined from the Cauchy data prescribed on a given moving interface. In [B.T. Johansson, D. Lesnic, and T. Reeve, A method of fundamental solutions for the one-dimensional inverse Stefan problem, Appl. Math. Model. 35 (2011), pp. 4367–4378], the inverse Stefan problem was considered, where only the boundary data is to be reconstructed on the fixed boundary. We extend the MFS proposed in Johansson et al. (2011) and show that the initial condition can also be simultaneously recovered, i.e. the MFS is appropriate for the inverse Cauchy–Stefan problem. Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate results can be efficiently obtained with small computational cost.
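A minimal sketch of the MFS ansatz for the one-dimensional heat equation follows, under assumptions: the solution is written as a combination of heat-equation fundamental solutions with space-time sources placed outside the domain, and the coefficients are fitted to Cauchy data (u and u_x) on a moving interface by truncated-SVD least squares. The interface x = 1 + t/2, the source placement and the exact solution u = exp(x + t) are invented for this test and are not taken from the cited papers.

```python
import numpy as np

def F(x, t):
    """Fundamental solution of u_t = u_xx (zero for t <= 0)."""
    x, t = np.broadcast_arrays(x, t)
    out = np.zeros(x.shape)
    p = t > 0
    out[p] = np.exp(-x[p]**2 / (4 * t[p])) / np.sqrt(4 * np.pi * t[p])
    return out

def Fx(x, t):
    """x-derivative of the fundamental solution."""
    x, t = np.broadcast_arrays(x, t)
    out = np.zeros(x.shape)
    p = t > 0
    out[p] = -x[p] / (2 * t[p]) * F(x, t)[p]
    return out

u_exact = lambda x, t: np.exp(x + t)       # satisfies u_t = u_xx; also u_x = u

s = lambda t: 1.0 + 0.5 * t                # moving interface x = s(t)
t_col = np.linspace(0.05, 1.0, 40)         # collocation times on the interface
x_col = s(t_col)

# Space-time source points on two lines outside the domain, at times in [-1, 1].
tau = np.tile(np.linspace(-1.0, 1.0, 30), 2)
xi = np.repeat([-1.0, 3.0], 30)

# Collocation rows match u and u_x (Cauchy data) on the interface.
dx = x_col[:, None] - xi[None, :]
dt = t_col[:, None] - tau[None, :]
A = np.vstack([F(dx, dt), Fx(dx, dt)])
b = np.concatenate([u_exact(x_col, t_col), u_exact(x_col, t_col)])  # u_x = u here

c, *_ = np.linalg.lstsq(A, b, rcond=1e-12)   # crude regularization by truncation

# Recover the initial condition u(x, 0) and compare with the exact one.
x0 = np.linspace(0.0, 1.0, 5)
u0 = F(x0[:, None] - xi[None, :], -tau[None, :]) @ c
print(np.abs(u0 - u_exact(x0, 0.0)))
```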
Abstract:
In this paper, free surface problems of Stefan-type for the parabolic heat equation are investigated using the method of fundamental solutions. The additional measurement necessary to determine the free surface could be a boundary temperature, a heat flux or an energy measurement. Both one- and two-phase flows are investigated. Numerical results are presented and discussed.
Abstract:
We consider a Cauchy problem for the heat equation, where the temperature field is to be reconstructed from the temperature and heat flux given on a part of the boundary of the solution domain. We employ a Landweber-type method proposed in [2], where a sequence of mixed well-posed problems is solved at each iteration step to obtain a stable approximation to the original Cauchy problem. We develop an efficient boundary integral equation method for the numerical solution of these mixed problems, based on the method of Rothe. Numerical examples are presented both with exact and noisy data, showing the efficiency and stability of the proposed procedure and approximations.
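To make the flavour of such a regularizing iteration concrete, here is a hedged sketch of a plain Landweber iteration with discrepancy-principle stopping, applied to a generic discretized ill-posed linear system. The actual method in the abstract solves mixed well-posed PDE problems at each step via boundary integral equations and the method of Rothe; this toy example does not attempt that.

```python
import numpy as np

def landweber(K, y, omega, noise_level, tau=1.5, max_iter=50000):
    """Landweber iteration x_{k+1} = x_k + omega * K^T (y - K x_k),
    stopped by the discrepancy principle ||K x - y|| <= tau * noise_level."""
    x = np.zeros(K.shape[1])
    for k in range(max_iter):
        r = y - K @ x
        if np.linalg.norm(r) <= tau * noise_level:
            break
        x = x + omega * (K.T @ r)
    return x, k

# Mildly ill-posed convolution-type test system with noisy data.
rng = np.random.default_rng(1)
n = 50
K = np.array([[1.0 / (1 + abs(i - j)) for j in range(n)] for i in range(n)]) / n
x_true = np.sin(np.linspace(0.0, np.pi, n))
noise = 1e-3 * rng.standard_normal(n)
y = K @ x_true + noise

omega = 1.9 / np.linalg.norm(K, 2)**2        # step size below 2 / ||K||^2
x, k = landweber(K, y, omega, np.linalg.norm(noise))
print(k, np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```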
Abstract:
The shape of a plane acoustical sound-soft obstacle is detected from knowledge of the far field pattern for one time-harmonic incident field. Two methods based on solving a system of integral equations for the incoming wave and the far field pattern are investigated. Properties of the integral operators required in order to apply regularization, i.e. injectivity and denseness of the range, are proved.
Abstract:
PowerAqua is a Question Answering system, which takes as input a natural language query and is able to return answers drawn from relevant semantic resources found anywhere on the Semantic Web. In this paper we provide two novel contributions: first, we detail a new component of the system, the Triple Similarity Service, which is able to match queries effectively to triples found in different ontologies on the Semantic Web. Second, we provide a first evaluation of the system, which in addition to providing data about PowerAqua's competence, also gives us important insights into the issues related to using the Semantic Web as the target answer set in Question Answering. In particular, we show that, despite the problems related to the noisy and incomplete conceptualizations that can be found on the Semantic Web, good results can already be obtained.
Abstract:
An iterative method for the parabolic Cauchy problem in planar domains having a finite number of corners is implemented based on boundary integral equations. At each iteration, mixed well-posed problems are solved for the same parabolic operator. The presence of corner points gives rise to singularities in the solutions of these mixed problems; this is handled with weight functions together with, in the numerical implementation, mesh grading near the corners. Discretization of the time derivative yields an elliptic system of partial differential equations, and the mixed problems are then reformulated in terms of boundary integral equations. To solve these integral equations numerically, a Nyström method with super-algebraic convergence order is employed. Numerical results are presented showing the feasibility of the proposed approach. © 2014 IMACS.
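The sketch below is only a toy illustration of the mesh-grading idea: a composite trapezoidal rule on a mesh graded towards an endpoint recovers accuracy for an integrand with limited regularity there, mimicking how grading compensates for the loss of smoothness near a corner. It is not the authors' quadrature, weight functions or grading.

```python
import numpy as np

def trapezoid(f, x):
    """Composite trapezoidal rule on the (possibly non-uniform) nodes x."""
    y = f(x)
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

f = lambda x: np.sqrt(x)       # integrand with limited regularity at x = 0
exact = 2.0 / 3.0              # integral of sqrt(x) over [0, 1]

for n in (16, 64, 256):
    uniform = np.linspace(0.0, 1.0, n + 1)
    graded = uniform**3        # cubic grading clusters nodes near the "corner" x = 0
    print(n,
          abs(trapezoid(f, uniform) - exact),
          abs(trapezoid(f, graded) - exact))
```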
Abstract:
Purpose - This research note aims to present a summary of research concerning the economic lot scheduling problem (ELSP). Design/methodology/approach - The paper reviews over 100 selected studies published in the last 15 years (1997-2012), which are then grouped under different research themes. Findings - Five research themes are identified and insights for future studies are reported at the end of this paper. Research limitations/implications - The motivation for preparing this research note is to summarize key research studies in this field since 1997, when the ELSP was verified as NP-hard. Originality/value - The ELSP is an important scheduling problem that has been studied since the 1950s. Because of the difficulty of obtaining a feasible analytical closed-form solution, many studies in the last two decades employed heuristic algorithms in order to arrive at good and acceptable solutions. As a consequence, the solution approaches are quite diversified. The major contribution of this paper is to provide researchers who are interested in this area with a quick reference guide to the reviewed studies. © Emerald Group Publishing Limited.
Abstract:
Fuzzy data envelopment analysis (DEA) models emerge as another class of DEA models to account for imprecise inputs and outputs of decision-making units (DMUs). Although several approaches for solving fuzzy DEA models have been developed, they have drawbacks, ranging from insufficient discrimination power to simplistic numerical examples that handle only triangular or symmetrical fuzzy numbers. To address these drawbacks, this paper proposes using the concept of expected value in the generalized DEA (GDEA) model. This allows the unification of three models - the fuzzy expected CCR, fuzzy expected BCC and fuzzy expected FDH models - and enables them to handle both symmetrical and asymmetrical fuzzy numbers. We also explore the role of the fuzzy GDEA model as a ranking method and compare it to existing super-efficiency evaluation models. Our proposed model is always feasible, while infeasibility problems remain in certain cases under existing super-efficiency models. In order to illustrate the performance of the proposed method, it is first tested using two established numerical examples and compared with the results obtained from alternative methods. A third example on energy dependency among 23 European Union (EU) member countries is further used to validate and describe the efficacy of our approach under asymmetrical fuzzy numbers.
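As a hedged sketch of the expected-value route, the code below defuzzifies triangular fuzzy data using the credibilistic expected value E = (a + 2b + c) / 4 (one common convention) and then solves a crisp input-oriented CCR envelopment LP with scipy.optimize.linprog. The three-DMU data set is invented, and the full fuzzy GDEA / BCC / FDH and super-efficiency machinery of the paper is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def expected_value(tri):
    """Credibilistic expected value of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + 2 * b + c) / 4

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o, for crisp data
    X (inputs x DMUs) and Y (outputs x DMUs)."""
    m, n = X.shape
    r = Y.shape[0]
    c = np.zeros(n + 1)                          # variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0
    A_in = np.hstack([-X[:, [o]], X])            # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((r, 1)), -Y])    # sum_j lam_j y_rj >= y_ro
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

# Two fuzzy inputs and one fuzzy output for three DMUs (made-up triangular data).
fuzzy_X = [[(3, 4, 5), (2, 2, 3), (4, 5, 7)],
           [(1, 2, 2), (3, 3, 4), (2, 3, 3)]]
fuzzy_Y = [[(8, 9, 10), (6, 7, 9), (7, 8, 8)]]
X = np.array([[expected_value(t) for t in row] for row in fuzzy_X])
Y = np.array([[expected_value(t) for t in row] for row in fuzzy_Y])

for o in range(X.shape[1]):
    print("DMU", o, "efficiency", round(ccr_efficiency(X, Y, o), 4))
```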
Abstract:
It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately and with what degree of flexibility parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling their product and process verification systems. The aim of this research is to develop a system for capturing the company’s knowledge and expertise and then reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict the trend in manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability, and reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product.

As with every process, material variations have a significant impact on machining quality. The main causes of variation are chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat-treatment standards are designed for processing the material to the desired hardness levels based on application. In order to take corrective actions, a study on the material aspects of superalloys has been conducted. In this study, samples from different batches of material have been analysed. This involved material preparation for microscopy analysis, and the effect of chemical compositions on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
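As a small, generic illustration of the SPC element mentioned above, the sketch below computes Shewhart X-bar and R control-chart limits for subgroups of five measurements and flags out-of-control subgroups. The measurement data are made up, the constants A2, D3, D4 are the standard tabulated values for subgroup size five, and none of this reflects the company's actual system.

```python
import numpy as np

# Standard Shewhart control-chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

rng = np.random.default_rng(7)
subgroups = 25.00 + 0.02 * rng.standard_normal((20, 5))   # 20 samples of 5 parts

xbar = subgroups.mean(axis=1)                              # subgroup means
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)     # subgroup ranges
xbarbar, rbar = xbar.mean(), ranges.mean()

limits = {
    "xbar UCL": xbarbar + A2 * rbar, "xbar LCL": xbarbar - A2 * rbar,
    "R UCL": D4 * rbar, "R LCL": D3 * rbar,
}
flagged = np.flatnonzero((xbar > limits["xbar UCL"]) | (xbar < limits["xbar LCL"]))
print(limits)
print("subgroups flagged on the X-bar chart:", flagged)
```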
Abstract:
This work is an initial study of a numerical method for identifying multiple leak zones in saturated unsteady flow. Using the conventional saturated groundwater flow equation, the leak identification problem is modelled as a Cauchy problem for the heat equation, and the aim is to find the regions on the boundary of the solution domain where the solution vanishes, since leak zones correspond to zero pressure values. This problem is ill-posed, and to reconstruct the solution in a stable way we therefore modify and employ an iterative regularizing method proposed in [1] and [2]. In this method, mixed well-posed problems, obtained by changing the boundary conditions, are solved for the heat operator as well as for its adjoint, to obtain a sequence of approximations to the original Cauchy problem. The mixed problems are solved using a finite element method (FEM), and the numerical results indicate that the leak zones can be identified with the proposed method.
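As a toy illustration of the identification step only (leak zones correspond to vanishing boundary pressure), the sketch below scans a made-up reconstructed pressure profile on the inaccessible boundary and flags the parameter intervals where the values are numerically zero. It does not implement the iterative FEM reconstruction itself.

```python
import numpy as np

def leak_zones(s, p, tol):
    """Return (start, end) parameter intervals where |p| < tol."""
    mask = np.abs(p) < tol
    zones, start, prev = [], None, s[0]
    for si, flagged in zip(s, mask):
        if flagged and start is None:
            start = si                      # entering a candidate leak zone
        elif not flagged and start is not None:
            zones.append((start, prev))     # leaving the zone
            start = None
        prev = si
    if start is not None:
        zones.append((start, s[-1]))
    return zones

s = np.linspace(0.0, 1.0, 201)              # arc-length parameter on the boundary
# Made-up reconstructed pressure: positive except on (0.3, 0.4), plus small noise.
p = np.where((s > 0.3) & (s < 0.4), 0.0, 1.0 + 0.5 * np.sin(2 * np.pi * s))
p = p + 1e-3 * np.cos(40 * s)
print(leak_zones(s, p, tol=5e-3))
```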
Abstract:
Purpose: The purpose of this paper is to investigate the possibilities and problems for collaboration in the area of corporate social responsibility (CSR) and sustainability. The paper explores the nature and concept of collaboration and its forms, and critically evaluates the potential contribution a collaborative approach between agencies might offer to these agendas. Design/methodology/approach: The paper explores different forms of research on collaboration, together with a UK Government report on collaboration, to evaluate how the issue is addressed in theory and practice. Findings: Sustainable development creates extensive challenges for a wide range of agencies, including governments, non-governmental organizations, businesses and civil society. It is unlikely, however, that solutions will be found in any one quarter. Collaboration between agencies in some form would seem a logical step in supporting measures towards a more responsible and environmentally sustainable global economy. Originality/value: The paper offers new insights into developing a research and praxis agenda for collaborative possibilities towards the advancement of CSR and sustainability. © Emerald Group Publishing Limited.