807 results for Oberwolfach Problems


Relevance:

20.00%

Publisher:

Abstract:

A neurofuzzy classifier identification algorithm is introduced for two-class problems. The initial fuzzy base construction is based on fuzzy clustering, utilizing a Gaussian mixture model (GMM) and the analysis of variance (ANOVA) decomposition. The expectation-maximization (EM) algorithm is applied to determine the parameters of the fuzzy membership functions. The neurofuzzy model is then identified via the supervised subspace orthogonal least squares (OLS) algorithm. Finally, a logistic regression model is applied to produce the class probability. The effectiveness of the proposed neurofuzzy classifier has been demonstrated using a real data set.
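A minimal sketch of a comparable pipeline is given below, assuming Python with scikit-learn and synthetic data: EM-fitted GMM components serve as Gaussian fuzzy membership functions, and a logistic regression output stage produces the class probability. The paper's ANOVA decomposition and supervised subspace OLS selection are omitted, so this illustrates the general idea rather than the authors' algorithm.

```python
# Sketch only: GMM-derived fuzzy basis + logistic regression output stage.
# The ANOVA decomposition and supervised subspace OLS steps of the paper are
# omitted; synthetic data stands in for the real data set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# EM-fitted GMM: each component plays the role of a Gaussian fuzzy membership function.
gmm = GaussianMixture(n_components=6, covariance_type="full", random_state=0).fit(X_tr)

def fuzzy_basis(X):
    # Component responsibilities act as normalised fuzzy membership degrees.
    return gmm.predict_proba(X)

# Logistic regression maps the fuzzy basis activations to a two-class probability.
clf = LogisticRegression(max_iter=1000).fit(fuzzy_basis(X_tr), y_tr)
print("test accuracy:", clf.score(fuzzy_basis(X_te), y_te))
```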

Relevance:

20.00%

Publisher:

Abstract:

This paper develops a conceptual framework for analyzing emerging agricultural hydrology problems in post-conflict Libya. Libya is one of the most arid regions on the planet. Thus, in addition to substantial political and social changes, post-conflict Libyan administrators are confronted with important hydrological issues in Libya’s emerging water-land-use complex. This paper presents a substantial background to the water-land-use problem in Libya; reviews previous work in Libya and elsewhere on water-land-use issues and water-land-use conflicts in dry and arid zones; outlines a conceptual framework for fruitful research interventions; and details the results of a survey conducted on Libyan farmers’ water usage, their perceptions of emerging water-land-use conflicts, and the appropriate value one should place on agricultural-use hydrological resources in Libya. Extensions are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Red tape is not desirable, as it impedes business growth. Relief from the administrative burdens that businesses face due to legislation can benefit the whole economy, especially at times of recession. However, recent governmental initiatives aimed at reducing administrative burdens have met with some success, but also with failures. This article compares three national initiatives - in the Netherlands, UK and Italy - aimed at cutting red tape by using the Standard Cost Model. Findings highlight the factors affecting the outcomes of measurement and reduction plans and ways to improve the Standard Cost Model methodology.
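For context, the Standard Cost Model quantifies an administrative burden as a price-times-quantity product; the schematic relation below (symbols illustrative, not taken from the article) states the core formula, which is then summed over all information obligations imposed by a regulation.

```latex
% Core Standard Cost Model relation for one information obligation (schematic):
% tariff = hourly labour cost, time = hours spent per required action.
\[
  \text{Administrative burden}
  \;=\;
  \underbrace{(\text{tariff}\times\text{time})}_{P\;=\;\text{price per action}}
  \;\times\;
  \underbrace{(\text{number of businesses}\times\text{frequency per year})}_{Q\;=\;\text{quantity of actions}}
\]
```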

Relevance:

20.00%

Publisher:

Abstract:

Purpose – This paper summarises the main research findings from a detailed, qualitative set of structured interviews and case studies of private finance initiative (PFI) schemes in the UK, which involve the construction of built facilities. The research, which was funded by the Foundation for the Built Environment, examines the emergence of PFI in the UK. Benefits and problems in the PFI process are investigated. Best practice, the key critical factors for success, and lessons for the future are also analysed.

Design/methodology/approach – The research is based around 11 semi-structured interviews conducted with stakeholders in key PFI projects in the UK.

Findings – The research demonstrates that value for money and risk transfer are key success criteria. High procurement and transaction costs are a feature of PFI projects, and the large-scale nature of PFI projects frequently acts as a barrier to entry.

Research limitations/implications – The research is based on a limited number of in-depth case study interviews. The paper also shows that further research is needed to find better ways to measure these concepts empirically.

Practical implications – The paper is important in highlighting four main areas of practical improvement in the PFI process: value for money assessment; establishing end-user needs; developing competitive markets; and developing appropriate skills in the public sector.

Originality/value – The paper examines the drivers, barriers and critical success factors for PFI in the UK in detail for the first time and will be of value to property investors, financiers, and others involved in the PFI process.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this lecture is to review recent developments in data analysis, initialization and data assimilation. The development of three-dimensional multivariate schemes has been very timely because of their suitability for handling the many different types of observations available during FGGE. Great progress has taken place in the initialization of global models with the aid of the non-linear normal mode technique. In spite of this progress, however, several fundamental problems remain unsatisfactorily solved. Of particular importance are the initialization of the divergent wind field in the Tropics and the search for proper ways to initialize weather systems driven by non-adiabatic processes. The unsatisfactory ways in which such processes are currently initialized lead to excessively long spin-up times.

Relevance:

20.00%

Publisher:

Abstract:

Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts of the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. the primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extent of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5° and 15 vertical levels, covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat, release of latent heat, etc., in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts from the Centre’s model.
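For reference, the "conservation of absolute vorticity" mentioned above is the barotropic vorticity equation on which those earliest one-level 500 mb forecasts rested; a standard statement of it (not part of the abstract) is:

```latex
% Barotropic vorticity equation underlying the earliest one-level forecasts:
% absolute vorticity is conserved following the non-divergent flow.
\[
  \frac{d}{dt}\,(\zeta + f) = 0,
  \qquad
  \zeta = \nabla^{2}\psi,
  \qquad
  \mathbf{v} = \mathbf{k}\times\nabla\psi,
\]
% where \psi is the streamfunction (proportional to the 500 mb geopotential height),
% \zeta the relative vorticity and f the Coriolis parameter.
```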

Relevance:

20.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (though mostly two-dimensional) interpolation technique.

As a consequence of the structure of the conventional synoptic network, with its separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data valid at the analysis time and on climatology, but also on fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation.

From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the 12 hours most often applied, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
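A minimal sketch of such an objective-analysis step is shown below, assuming Python with NumPy: a Cressman-type distance weighting (one common classical choice, not necessarily the scheme discussed in the lecture) spreads observation increments around a first-guess field on a regular grid. The 500 km mesh echoes the text; the influence radius, observation values and flat background are purely illustrative.

```python
# Sketch of an objective-analysis step (Cressman-type weighting): irregularly
# placed observations correct a background (first-guess) field on a regular grid.
# All data, the radius and the background are illustrative only.
import numpy as np

def cressman_analysis(grid_x, grid_y, background, obs_x, obs_y, obs_val, radius):
    """Return background + distance-weighted observation increments."""
    X, Y = np.meshgrid(grid_x, grid_y)
    num = np.zeros_like(background)
    den = np.zeros_like(background)
    for xo, yo, vo in zip(obs_x, obs_y, obs_val):
        # Background value at the observation site (nearest grid point, for simplicity).
        i, j = np.abs(grid_y - yo).argmin(), np.abs(grid_x - xo).argmin()
        increment = vo - background[i, j]
        # Cressman weight: positive inside the influence radius, zero outside.
        r2 = (X - xo) ** 2 + (Y - yo) ** 2
        w = np.where(r2 < radius ** 2, (radius ** 2 - r2) / (radius ** 2 + r2), 0.0)
        num += w * increment
        den += w
    correction = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    return background + correction

# Usage: a 500 km mesh over a 5000 km domain, two off-time ("asynoptic") observations.
gx = gy = np.arange(0.0, 5000.0, 500.0)      # grid coordinates in km
first_guess = np.zeros((gy.size, gx.size))   # e.g. a short-range forecast field
analysis = cressman_analysis(gx, gy, first_guess,
                             obs_x=[1200.0, 3300.0], obs_y=[800.0, 2700.0],
                             obs_val=[2.0, -1.5], radius=1000.0)
print(analysis.round(2))
```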

Relevance:

20.00%

Publisher:

Abstract:

In order to reduce environmental impacts and achieve sustainability, it is important to balance the interactions between the built and natural environment. The construction industry is becoming more aware of ecological concerns and of the importance that biodiversity and the maintenance of ecosystem services have for sustainability. Bats constitute an important component of urban biodiversity, and several species in the UK are highly dependent on buildings, making them particularly vulnerable to anthropogenic and environmental changes. Many buildings suitable for use as bat roosts often require re-roofing as they age, and traditional bituminous roofing felts are frequently being replaced with breathable roofing membranes (BRMs). In the UK, new building regulations and modern materials may substantially reduce the viability of existing roosts, yet at the same time building regulations require that materials be fit for purpose. Reports suggest that both bats and BRMs may experience problems when the two interact. Such information makes it important to understand how house-dwelling bats and BRMs may be affected. This paper considers the possible ways in which bats and BRMs may interact, how this could affect existing bat roosts within buildings, and the implications for BRM service life predictions and warranties. Keywords – Breathable Roofing Membranes, Bats in Buildings, Material Deterioration, Sustainability, Conservation, Biodiversity

Relevance:

20.00%

Publisher:

Abstract:

In The Conduct of Inquiry in International Relations, Patrick Jackson situates methodologies in International Relations in relation to their underlying philosophical assumptions. One of his aims is to map International Relations debates in a way that ‘capture[s] current controversies’ (p. 40). This ambition is overstated: whilst Jackson’s typology is useful as a clarificatory tool, (re)classifying existing scholarship in International Relations is more problematic. One problem with Jackson’s approach is that he tends to run together the philosophical assumptions which decisively differentiate his methodologies (by stipulating a distinctive warrant for knowledge claims) and the explanatory strategies that are employed to generate such knowledge claims, suggesting that the latter are entailed by the former. In fact, the explanatory strategies which Jackson associates with each methodology reflect conventional practice in International Relations just as much as they reflect philosophical assumptions. This makes it more difficult to identify each methodology at work than Jackson implies. I illustrate this point through a critical analysis of Jackson’s controversial reclassification of Waltz as an analyticist, showing that whilst Jackson’s typology helps to expose inconsistencies in Waltz’s approach, it does not fully support the proposed reclassification. The conventional aspect of methodologies in International Relations also raises questions about the limits of Jackson’s ‘engaged pluralism’.