894 results for Quasilinear Elliptic Problems
Abstract:
A neurofuzzy classifier identification algorithm is introduced for two-class problems. The initial fuzzy rule base construction is based on fuzzy clustering utilizing a Gaussian mixture model (GMM) and the analysis of variance (ANOVA) decomposition. The expectation-maximization (EM) algorithm is applied to determine the parameters of the fuzzy membership functions. The neurofuzzy model is then identified via the supervised subspace orthogonal least squares (OLS) algorithm. Finally, a logistic regression model is applied to produce the class probability. The effectiveness of the proposed neurofuzzy classifier has been demonstrated using a real data set.
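A minimal sketch of this pipeline, assuming scikit-learn and substituting its standard estimators for the authors' components; the supervised subspace OLS rule-selection step is omitted for brevity, and make_classification stands in for the real data set:

from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

# Illustrative two-class data (the paper uses a real data set).
X, y = make_classification(n_samples=400, n_features=4, random_state=0)

# 1. Rule-base construction: fit a GMM by EM; each Gaussian component
#    plays the role of one fuzzy rule with Gaussian membership functions.
gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
gmm.fit(X)

# 2. Fuzzy firing strengths: posterior responsibility of each rule.
firing = gmm.predict_proba(X)  # shape (n_samples, n_rules)

# 3. Class probability: logistic regression on the rule activations.
clf = LogisticRegression().fit(firing, y)
print("training accuracy:", clf.score(firing, y))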
Abstract:
This paper develops a conceptual framework for analyzing emerging agricultural hydrology problems in post-conflict Libya. Libya is one of the most arid regions on the planet. Thus, alongside substantial political and social changes, post-conflict Libyan administrators are confronted with important hydrological issues in Libya’s emerging water-land-use complex. This paper presents a substantial background to the water-land-use problem in Libya; reviews previous work in Libya and elsewhere on water-land-use issues and conflicts in dry and arid zones; outlines a conceptual framework for fruitful research interventions; and details the results of a survey of Libyan farmers’ water usage, their perceptions of emerging water-land-use conflicts, and the appropriate value to place on agricultural hydrological resources in Libya. Extensions are discussed.
The unsteady flow of a weakly compressible fluid in a thin porous layer II: three-dimensional theory
Abstract:
We consider the problem of determining the pressure and velocity fields for a weakly compressible fluid flowing in a three-dimensional layer, composed of an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting and/or extracting fluid. Numerical solution of this three-dimensional evolution problem may be expensive, particularly when the depth scale of the layer, h, is small compared to the horizontal length scale, l, a situation which occurs frequently in applications to oil and gas reservoir recovery and which leads to significant stiffness in the numerical problem. Under the assumption that $\epsilon\propto h/l\ll 1$, we show that, to leading order in $\epsilon$, the pressure field varies only in the horizontal directions away from the wells (the outer region). We construct asymptotic expansions in $\epsilon$ in both the inner region (near the wells) and the outer region, and use the asymptotic matching principle to derive expressions for all significant process quantities. The only computations required are the solutions of non-stiff, linear, elliptic, two-dimensional boundary-value and eigenvalue problems. This approach, via the method of matched asymptotic expansions, takes advantage of the small aspect ratio of the layer, $\epsilon$, at precisely the stage where full numerical computations become stiff, and also reveals the detailed structure of the dynamics of the flow, both in the neighbourhood of wells and away from them.
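To fix ideas, expansions of the kind invoked here take the following leading-order form (the notation is illustrative, not the paper's exact statement):

$$p^{\mathrm{outer}} = p_0(x,y,t) + \epsilon\, p_1(x,y,z,t) + O(\epsilon^2), \qquad p^{\mathrm{inner}} = P_0(r,z,t) + \epsilon\, P_1(r,z,t) + O(\epsilon^2),$$

with the two expansions matched term by term in an overlap region around each well, schematically $\lim_{r\to 0} p^{\mathrm{outer}} \sim \lim_{r/\epsilon\to\infty} p^{\mathrm{inner}}$.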
Abstract:
We describe a novel method for determining the pressure and velocity fields for a weakly compressible fluid flowing in a thin three-dimensional layer composed of an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting and/or extracting fluid. Our approach uses the method of matched asymptotic expansions to derive expressions for all significant process quantities, the computation of which requires only the solution of linear, elliptic, two-dimensional boundary-value and eigenvalue problems. In this article, we provide full implementation details and present numerical results demonstrating the efficiency and accuracy of our scheme.
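Since the reduced problems are linear, elliptic and two-dimensional, the following sketch illustrates the class of computation involved, using a model Poisson problem and a standard five-point finite-difference scheme (assuming numpy/scipy; these are not the paper's actual equations):

import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import spsolve

n = 64                     # interior grid points per direction
h = 1.0 / (n + 1)
T = diags([-np.ones(n - 1), 2.0 * np.ones(n), -np.ones(n - 1)],
          [-1, 0, 1]) / h**2                      # 1-D Laplacian, Dirichlet BCs
A = kron(identity(n), T) + kron(T, identity(n))   # 2-D Laplacian

x = np.linspace(h, 1 - h, n)
X, Y = np.meshgrid(x, x)
f = np.sin(np.pi * X) * np.sin(np.pi * Y)         # source term

u = spsolve(A.tocsr(), f.ravel()).reshape(n, n)
# exact solution of -Laplacian(u) = f is f / (2*pi^2)
print("max error:", np.max(np.abs(u - f / (2 * np.pi**2))))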
Abstract:
Red tape is undesirable because it impedes business growth. Relief from the administrative burdens that businesses face due to legislation can benefit the whole economy, especially at times of recession. Yet recent governmental initiatives aimed at reducing administrative burdens have met with some success, but also with failures. This article compares three national initiatives (in the Netherlands, the UK and Italy) aimed at cutting red tape by using the Standard Cost Model. The findings highlight the factors affecting the outcomes of measurement and reduction plans, and ways to improve the Standard Cost Model methodology.
Abstract:
Purpose – This paper summarises the main research findings from a detailed, qualitative set of structured interviews and case studies of private finance initiative (PFI) schemes in the UK which involve the construction of built facilities. The research, which was funded by the Foundation for the Built Environment, examines the emergence of PFI in the UK. Benefits and problems in the PFI process are investigated. Best practice, the key critical factors for success, and lessons for the future are also analysed. Design/methodology/approach – The research is based on 11 semi-structured interviews conducted with stakeholders in key PFI projects in the UK. Findings – The research demonstrates that value for money and risk transfer are key success criteria. High procurement and transaction costs are a feature of PFI projects, and the large-scale nature of PFI projects frequently acts as a barrier to entry. Research limitations/implications – The research is based on a limited number of in-depth case-study interviews. The paper also shows that further research is needed to find better ways to measure these concepts empirically. Practical implications – The paper highlights four main areas of practical improvement in the PFI process: value-for-money assessment; establishing end-user needs; developing competitive markets; and developing appropriate skills in the public sector. Originality/value – The paper examines the drivers, barriers and critical success factors for PFI in the UK in detail for the first time, and will be of value to property investors, financiers, and others involved in the PFI process.
Abstract:
The purpose of this lecture is to review recent developments in data analysis, initialization and data assimilation. The development of three-dimensional multivariate schemes has been very timely, because such schemes are well suited to handling the many different types of observations available during FGGE. Great progress has been made in the initialization of global models with the aid of the non-linear normal mode technique. In spite of this progress, however, several fundamental problems remain unsatisfactorily resolved. Of particular importance are the initialization of the divergent wind fields in the Tropics and the need for proper ways to initialize weather systems driven by non-adiabatic processes. The unsatisfactory ways in which such processes are currently initialized lead to excessively long spin-up times.
Abstract:
Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts of the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in a more original form (i.e. the primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extension of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5° and 15 vertical levels covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat and release of latent heat in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre’s model.
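The earliest models mentioned above integrate the barotropic vorticity equation. A minimal numpy sketch on a doubly periodic beta-plane is given below; all parameter values are illustrative assumptions, and forward-Euler stepping is used only for brevity:

import numpy as np

n, L = 128, 6.0e6                      # grid points, domain size (m)
beta = 1.6e-11                         # df/dy on the beta-plane (1/m/s)
dt = 600.0                             # time step (s)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
kx, ky = np.meshgrid(k, k)
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                         # avoid division by zero at the mean mode

rng = np.random.default_rng(0)
zeta = 1e-5 * rng.standard_normal((n, n))    # initial relative vorticity

def step(zeta):
    zh = np.fft.fft2(zeta)
    psih = -zh / k2                    # invert zeta = Laplacian(psi)
    u = np.real(np.fft.ifft2(-1j * ky * psih))   # u = -d(psi)/dy
    v = np.real(np.fft.ifft2(1j * kx * psih))    # v =  d(psi)/dx
    zx = np.real(np.fft.ifft2(1j * kx * zh))
    zy = np.real(np.fft.ifft2(1j * ky * zh))
    # conservation of absolute vorticity: d(zeta)/dt = -u.grad(zeta) - beta*v
    return zeta + dt * (-u * zx - v * zy - beta * v)

for _ in range(10):
    zeta = step(zeta)
print("vorticity std after 10 steps:", zeta.std())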
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis, in meteorology means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, mostly, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with its separated data-sparse and data-dense areas, four-dimensional analysis has in fact been in intensive use for many years: weather services have based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle, and this could be justified for the conventional observations as well: we have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. With a 3-hour step in the analysis-forecasting cycle, instead of the 12 hours applied most often, we may without difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even observations during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
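The closing arithmetic can be made explicit (the advection speed is an illustrative assumption, not taken from the text): with a cycle step $\Delta t = 3\,\mathrm{h}$, no observation is more than $\Delta t/2 = 90$ minutes off time, and a parcel advected at a strong $v = 50\,\mathrm{m\,s^{-1}}$ covers only $d = v\,t = 50\,\mathrm{m\,s^{-1}} \times 5400\,\mathrm{s} = 270\,\mathrm{km}$ in that interval, well within a $500\,\mathrm{km} \times 500\,\mathrm{km}$ mesh.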