110 results for Two-point boundary value problems


Relevance:

40.00%

Publisher:

Abstract:

The potential of near infrared spectroscopy in conjunction with partial least squares regression to predict Miscanthus × giganteus and short rotation coppice willow quality indices was examined. Moisture, calorific value, ash and carbon content were predicted with a root mean square error of cross validation of 0.90% (R² = 0.99), 0.13 MJ/kg (R² = 0.99), 0.42% (R² = 0.58), and 0.57% (R² = 0.88), respectively. The moisture and calorific value prediction models had excellent accuracy, while the carbon and ash models were fair and poor, respectively. The results indicate that near infrared spectroscopy has the potential to predict quality indices of dedicated energy crops; however, the models must be further validated on a wider range of samples prior to implementation. The use of such models would assist in the optimal deployment of the feedstock based on its biomass properties.
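
As a rough illustration of the calibration step described above, the sketch below fits a PLS regression to synthetic spectra and computes a cross-validated RMSE. All data, shapes and the component count are hypothetical placeholders, not the authors' calibration.

```python
# Minimal sketch of NIR + PLS calibration with cross-validation.
# All data, shapes and the component count are synthetic placeholders,
# not the authors' calibration set or model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.random((120, 700))   # 120 samples x 700 NIR absorbance values
y = rng.random(120)          # quality index, e.g. moisture content (%)

pls = PLSRegression(n_components=10)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()  # 10-fold CV predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))          # RMSE of cross validation
print(f"RMSECV = {rmsecv:.3f}")
```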

Relevance:

40.00%

Publisher:

Abstract:

This paper discusses concepts of value from the point of view of the user of the space and the counter view of the provider of the same. Land and property are factors of production. The value of the land flows from the use to which it is put, which in turn is dependent upon the demand (and supply) for the product or service that is produced or provided from that space. If there is a high demand for the product (at a fixed level of supply), the price will increase and the economic rent for the land/property will increase accordingly. This is the underlying paradigm of Ricardian rent theory, where the supply of land is fixed and a single good is produced; in such a case the rent of land is wholly an economic rent. Economic theory generally distinguishes between two kinds of price: the price of production or “value in use” (as determined by the labour theory of value), and the market price or “value in exchange” (as determined by supply and demand). This distinction rests on a coherent and consistent theory of value and price. Effectively, the distinction is between what space is ‘worth’ to an individual and that space’s price of exchange in the market place. In a perfect market, where any individual has access to the same information as all others in the market, price and worth should coincide. However, in a market where access to information is not uniform, and where different uses compete for the same space, it is more likely that the two figures will diverge. This paper argues that valuers’ traditional reliance on methods of comparison to determine “price” has led to an artificial divergence of “value in use” and “value in exchange”. As such comparisons become more difficult, owing to the diversity of lettings in the market place, there will be a requirement to return to fundamentals and pay heed to the thought process of the user in assessing the worth of the space to be let.

Relevance:

40.00%

Publisher:

Abstract:

In the last decade, a vast number of land surface schemes have been designed for use in global climate models, atmospheric weather prediction, mesoscale numerical models, ecological models, and models of global change. Since land surface schemes are designed for different purposes, they have various levels of complexity in the treatment of bare soil processes, vegetation, and soil water movement. This paper is a contribution to a small group of papers dealing with the intercomparison of differently designed and oriented land surface schemes. For that purpose, following the Shao et al. (1995) classification, we have chosen three schemes: (i) BATS, designed for global climate models (Dickinson et al., 1986; Dickinson et al., 1992); (ii) LEAF, for mesoscale and ecological models (Lee, 1992); and (iii) LAPS, for mesoscale models (Mihailović, 1996; Mihailović and Kallos, 1997; Mihailović et al., 1999). These schemes were compared using surface flux and leaf temperature outputs obtained by time integrations of data sets derived from micrometeorological measurements above a maize field at an experimental site in De Sinderhoeve (The Netherlands) for 18 August, 8 September, and 4 October 1988. Finally, the comparison of the schemes was supported by a simple statistical analysis of the surface flux outputs.
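
The "simple statistical analysis" is not specified in this abstract; the sketch below shows the kind of bias and RMSE comparison such a flux intercomparison might use. The flux values are hypothetical placeholders.

```python
# Sketch of simple statistics that could support a flux intercomparison:
# bias and root mean square error of simulated surface fluxes against
# observations. The numbers are illustrative placeholders only.
import numpy as np

def flux_stats(simulated, observed):
    """Return (bias, rmse) of a simulated flux series vs. observations."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    bias = np.mean(simulated - observed)
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    return bias, rmse

# e.g. half-hourly latent heat flux (W m^-2) from one scheme vs. measurements
sim = [310.0, 295.5, 280.2, 250.9]
obs = [300.0, 301.0, 275.0, 260.0]
print("bias = %.1f W m^-2, rmse = %.1f W m^-2" % flux_stats(sim, obs))
```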

Relevance:

40.00%

Publisher:

Abstract:

In this paper we consider one-dimensional diffusions with constant coefficients in a finite interval with jump boundary and a certain deterministic jump distribution. We use coupling methods in order to identify the spectral gap in the case of a large drift and prove that there is a threshold drift above which the bottom of the spectrum no longer depends on the drift. As a corollary to our result we are able to answer two questions concerning elliptic eigenvalue problems with non-local boundary conditions formulated previously by Iddo Ben-Ari and Ross Pinsky.
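
For orientation, here is a schematic of the type of elliptic eigenvalue problem with a non-local boundary condition that this result concerns. The interval, drift b and jump distribution ν below are assumed placeholders in the spirit of Ben-Ari and Pinsky's formulation, not the paper's exact statement.

```latex
% Schematic only: a diffusion on (0,1) with drift b that, on hitting the
% boundary, restarts from a jump distribution \nu (deterministic in the
% paper's setting). The non-local eigenvalue problem reads, roughly,
\begin{align*}
  \tfrac{1}{2}\,u''(x) + b\,u'(x) &= -\lambda\,u(x), \qquad x \in (0,1),\\
  u(0) = u(1) &= \int_0^1 u(y)\,\nu(\mathrm{d}y).
\end{align*}
% The bottom of the spectrum is the smallest such \lambda; the paper shows
% it becomes independent of b above a threshold drift.
```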

Relevance:

40.00%

Publisher:

Abstract:

The goal of this paper is to study and further develop the orthogonality sampling or stationary waves algorithm for the detection of the location and shape of objects from the far field pattern of scattered waves in electromagnetics or acoustics. Orthogonality sampling can be seen as a special beamforming algorithm with some links to the point source method and to the linear sampling method. The basic idea of orthogonality sampling is to sample the space under consideration by calculating scalar products of the measured far field pattern $u^\infty(\hat{x})$ with a test function $e^{i\kappa\,\hat{x}\cdot y}$ for all y in a subset Q of the space $\mathbb{R}^m$, m = 2, 3. The way in which this is carried out is important for extracting the information that the scattered fields contain. The theoretical foundation of orthogonality sampling is only partly resolved, and the goal of this work is to initiate further research by numerical demonstration of the high potential of the approach. We implement the method in a two-dimensional setting for the Helmholtz equation, which represents electromagnetic scattering when the setup is independent of the third coordinate. We show reconstructions of the location and shape of objects from measurements of the scattered field for one or several directions of incidence and one or many frequencies or wave numbers, respectively. In particular, we visualize the indicator function with both the Dirichlet and Neumann boundary conditions and for complicated inhomogeneous media.
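
A minimal numerical sketch of the sampling idea follows, assuming the indicator is the modulus of the scalar product of the far field pattern with the test function above; the discretization, sign convention and the toy far field are illustrative, not the paper's implementation.

```python
# Sketch of an orthogonality sampling indicator in 2D (assumed form:
# |sum over angles of u_inf(xhat) * exp(i*k*xhat.y)|; discretization and
# sign convention are illustrative assumptions).
import numpy as np

def indicator(far_field, angles, k, points):
    """far_field: complex far field values at observation angles (radians);
    points: (N, 2) grid of sampling points y; returns the indicator per
    point, a Riemann-sum approximation over the unit circle."""
    xhat = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit dirs
    phase = np.exp(1j * k * points @ xhat.T)                   # (N, n_angles)
    return np.abs(phase @ far_field) * (2 * np.pi / len(angles))

# toy usage: far field of a point-like scatterer at z = (0.3, -0.2), k = 10
k = 10.0
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
z = np.array([0.3, -0.2])
ff = np.exp(-1j * k * (np.cos(angles) * z[0] + np.sin(angles) * z[1]))
grid = np.array([[0.3, -0.2], [0.0, 0.0], [-0.5, 0.5]])
print(indicator(ff, angles, k, grid))  # largest value at the scatterer
```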

Relevance:

40.00%

Publisher:

Abstract:

A neurofuzzy classifier identification algorithm is introduced for two-class problems. The initial fuzzy rule base construction is based on fuzzy clustering utilizing a Gaussian mixture model (GMM) and the analysis of variance (ANOVA) decomposition. The expectation maximization (EM) algorithm is applied to determine the parameters of the fuzzy membership functions. The neurofuzzy model is then identified via the supervised subspace orthogonal least squares (OLS) algorithm. Finally, a logistic regression model is applied to produce the class probability. The effectiveness of the proposed neurofuzzy classifier has been demonstrated using a real data set.
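
A drastically simplified sketch of the pipeline's flavour, not the authors' OLS-based identification: EM-fitted Gaussian mixture memberships serve as soft "fuzzy" features, followed by logistic regression for the class probability. Data and component counts are placeholders.

```python
# Simplified sketch (not the authors' algorithm): EM-fitted Gaussian
# mixture memberships as fuzzy features, then logistic regression for
# the class probability.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

X, y = make_classification(n_samples=300, n_features=4, random_state=0)

gmm = GaussianMixture(n_components=5, random_state=0).fit(X)  # EM fit
memberships = gmm.predict_proba(X)       # soft membership of each cluster
clf = LogisticRegression().fit(memberships, y)
print("class probabilities:", clf.predict_proba(memberships[:3]))
```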

Relevance:

40.00%

Publisher:

Abstract:

Model differences in projections of extratropical regional climate change due to increasing greenhouse gases are investigated using two atmospheric general circulation models (AGCMs): ECHAM4 (Max Planck Institute, version 4) and CCM3 (National Center for Atmospheric Research Community Climate Model version 3). Sea-surface temperature (SST) fields calculated from observations and coupled versions of the two models are used to force each AGCM in experiments based on time-slice methodology. Results from the forced AGCMs are then compared to coupled model results from the Coupled Model Intercomparison Project 2 (CMIP2) database. The time-slice methodology is verified by showing that the response of each model to doubled CO2 and SST forcing from the CMIP2 experiments is consistent with the results of the coupled GCMs. The differences in the responses of the models are attributed to (1) the different tropical SST warmings in the coupled simulations and (2) the different atmospheric model responses to the same tropical SST warmings. Both are found to have important contributions to differences in implied Northern Hemisphere (NH) winter extratropical regional 500 mb height and tropical precipitation climate changes. Forced teleconnection patterns from tropical SST differences are primarily responsible for sensitivity differences in the extratropical North Pacific, but have relatively little impact on the North Atlantic. There are also significant differences in the extratropical response of the models to the same tropical SST anomalies due to differences in numerical and physical parameterizations. Differences due to parameterizations dominate in the North Atlantic. Differences in the control climates of the two coupled models from the current climate, in particular for the coupled model containing CCM3, are also demonstrated to be important in leading to differences in extratropical regional sensitivity.

Relevance:

40.00%

Publisher:

Abstract:

A two-stage linear-in-the-parameters model construction algorithm is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. The prefiltering stage is a two-level process aimed at maximizing the model's generalization capability, in which a new elastic-net model identification algorithm using singular value decomposition is employed at the lower level, and two regularization parameters are then optimized at the upper level using a particle swarm optimization algorithm that minimizes the leave-one-out (LOO) misclassification rate. It is shown that the LOO misclassification rate based on the resultant prefiltered signal can be computed analytically without splitting the data set, and the associated computational cost is minimal due to orthogonality. The second stage of sparse classifier construction is based on orthogonal forward regression with the D-optimality algorithm. Extensive simulations on noisy data sets illustrate the competitiveness of this approach for noisy classification problems.
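
A toy analogue of the two-level tuning idea, with a grid search standing in for particle swarm optimisation and an explicit leave-one-out loop in place of the analytic LOO formula; data and parameter ranges are purely illustrative.

```python
# Toy analogue of tuning two elastic-net regularization parameters by
# LOO performance (grid search replaces PSO; explicit LOO replaces the
# analytic formula). Illustrative only, not the paper's algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = make_classification(n_samples=80, n_features=10, flip_y=0.2,
                           random_state=0)  # noisy two-class data

best = None
for alpha in (1e-3, 1e-2, 1e-1):          # overall regularization strength
    for l1_ratio in (0.1, 0.5, 0.9):      # elastic-net mixing parameter
        model = SGDClassifier(loss="log_loss", penalty="elasticnet",
                              alpha=alpha, l1_ratio=l1_ratio, random_state=0)
        acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
        if best is None or acc > best[0]:
            best = (acc, alpha, l1_ratio)
print("LOO accuracy %.3f at alpha=%g, l1_ratio=%g" % best)
```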

Relevance:

40.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with its separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the more usual 12 hours, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
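
The arithmetic in the passage can be made concrete: with a 3-hour cycle, assigning each observation to its nearest analysis time leaves an offset of at most 90 minutes. A small sketch with hypothetical observation times:

```python
# Sketch of the 3-hour cycle arithmetic: every observation rounds to the
# nearest analysis time with an offset of at most 90 minutes. Times are
# hypothetical, expressed in minutes after 00 UTC.
def nearest_analysis_minute(obs_minute, cycle_minutes=180):
    """Round an observation time to the nearest analysis time."""
    return round(obs_minute / cycle_minutes) * cycle_minutes

for obs in (70, 150, 265):  # 01:10, 02:30, 04:25 UTC
    t = nearest_analysis_minute(obs)
    print(f"obs at {obs} min -> analysis at {t} min (offset {abs(obs - t)} min)")
```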

Relevance:

40.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to explore, from a practical point of view, a number of key strategic issues that critically influence organisations' competitiveness. Design/methodology/approach – The paper is based on a semi-structured interview with Mr Paul Walsh, CEO of Diageo. Diageo is a highly successful company, and Mr Walsh has played a central role in making Diageo the number one branded drinks company in the world. Findings – The paper discusses the key attributes of a successful merger, lessons from a complex cross-border acquisition, the rationale for strategic alliances with competitors, distinctive resources, and the role of corporate social responsibility. Research limitations/implications – It is not often that management scholars have the opportunity to discuss with the CEOs of large multinationals the rationale behind key strategic decisions. In this paper these issues are explored from the perspective of the CEO of a large and successful company. The lessons, while not generalisable, offer unique insights to students of management and management researchers. Originality/value – The paper offers a bridge between theory and practice. It demonstrates that, from Diageo's perspective, the distinctive capabilities are intangible. It also offers insight into how to successfully execute strategic decisions. In terms of originality, it offers a view from the top, which is often missing from strategy research.

Relevance:

40.00%

Publisher:

Abstract:

New representations and efficient calculation methods are derived for the problem of propagation from an infinite regularly spaced array of coherent line sources above a homogeneous impedance plane, and for the Green's function for sound propagation in the canyon formed by two infinitely high, parallel rigid or sound-soft walls and an impedance ground surface. The infinite sum of source contributions is replaced by a finite sum, and the remainder is expressed as a Laplace-type integral. A pole subtraction technique is used to remove poles in the integrand which lie near the path of integration, yielding a smooth integrand better suited to numerical integration, and a specific numerical integration method is proposed. Numerical experiments show highly accurate results across the frequency spectrum for a range of ground surface types. It is expected that the methods proposed will prove useful in boundary element modeling of noise propagation in canyon streets and in ducts, and for problems of scattering by periodic surfaces.
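
The following sketches the pole subtraction idea on a model integrand with a simple pole near the integration path; the integrand and pole location are chosen for illustration and are not the paper's.

```python
# Sketch of pole subtraction on a model integrand g(t)/(t - p) with a
# simple pole p just off the path [0, 1]: subtract g(p)/(t - p), integrate
# the smooth remainder numerically, and add back the analytic pole term.
import numpy as np
from scipy.integrate import quad

p = 0.5 + 0.01j                      # pole close to the real path [0, 1]
g = lambda t: np.exp(-t)             # smooth numerator (illustrative)

def smooth_part(t):
    # (g(t) - g(p)) / (t - p) has a removable singularity at t = p
    return (g(t) - g(p)) / (t - p)

re, _ = quad(lambda t: smooth_part(t).real, 0.0, 1.0)
im, _ = quad(lambda t: smooth_part(t).imag, 0.0, 1.0)
# analytic integral of the subtracted pole term: g(p) * [log(1-p) - log(-p)]
pole_term = g(p) * (np.log(1.0 - p) - np.log(-p))
total = (re + 1j * im) + pole_term   # value of the original integral
print(total)
```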

Relevance:

40.00%

Publisher:

Abstract:

The endocannabinoid system (ECS) was only 'discovered' in the 1990s. Since then, many new ligands have been identified, as well as many new intracellular targets, ranging from the PPARs to mitochondria to lipid rafts. It was thought that blocking the CB-1 receptor might reverse obesity and the metabolic syndrome. This was based on the idea that the ECS was dysfunctional in these conditions. This has met with limited success. The reason may be that the ECS is a homeostatic system, which integrates energy seeking and storage behaviour with resistance to oxidative stress. It could be viewed as having thrifty actions. Thriftiness is an innate property of life, which is programmed to a set point by both environment and genetics, resulting in an epigenotype perfectly adapted to its environment. This thrifty set point can be modulated by hormetic stimuli, such as exercise, cold and plant micronutrients. We have proposed that the physiological and protective insulin resistance that underlies thriftiness encapsulates something called 'redox thriftiness', whereby insulin resistance is determined by the ability to resist oxidative stress. Modern man has removed most hormetic stimuli and replaced them with a calorific sedentary lifestyle, leading to increased risk of metabolic inflexibility. We suggest that there is a tipping point where lipotoxicity in adipose and hepatic cells induces mild inflammation, which switches thrifty insulin resistance to inflammation-driven insulin resistance. To understand this, we propose that the metabolic syndrome could be seen from the viewpoint of the ECS, the mitochondrion and the FOXO group of transcription factors. FOXO has many thrifty actions, including increasing insulin resistance and appetite, suppressing oxidative stress and shifting the organism towards using fatty acids. In concert with factors such as PGC-1, they also modify mitochondrial function and biogenesis. Hence, the ECS and FOXO may interact at many points, one of which may be via intracellular redox signalling. As cannabinoids have been shown to modulate reactive oxygen species production, it is possible that they can upregulate anti-oxidant defences. This suggests they may have an 'endohormetic' signalling function. The tipping point into the metabolic syndrome may be the result of a chronic lack of hormetic stimuli (in particular, physical activity), and thus, stimulus for PGC-1, with a resultant reduction in mitochondrial function and a reduced lipid capacitance. This, in the context of a positive calorie environment, will result in increased visceral adipose tissue volume, abnormal ectopic fat content and systemic inflammation. This would worsen the inflammatory-driven pathological insulin resistance and inability to deal with lipids. The resultant oxidative stress may therefore drive a compensatory anti-oxidative response epitomised by the ECS and FOXO. Thus, although blocking the ECS (e.g. via rimonabant) may induce temporary weight loss, it may compromise long-term stress resistance. Clues about how to modulate the system more safely are emerging from observations that some polyphenols, such as resveratrol and, possibly, some phytocannabinoids, can modulate mitochondrial function and might improve resistance to a modern lifestyle.

Relevance:

40.00%

Publisher:

Abstract:

Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so the observational data of flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two postevent surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, with positive results. The proposed smoothing algorithm can be applied to improve the assessment of flood inundation models and the determination of risk zones on the floodplain.
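
One plausible form such a smoothing step could take (not the paper's exact algorithm): flag marks that deviate strongly from the median of their nearest neighbours and replace them. The threshold, neighbour count and data below are placeholders.

```python
# Illustrative sketch of flagging and reducing localized inconsistencies
# in surveyed maximum-water-level points: compare each mark with the
# median of its k nearest neighbours (thresholds are placeholders; this
# is not the paper's exact algorithm).
import numpy as np

def smooth_marks(xy, level, k=5, tol=0.5):
    """xy: (N, 2) mark positions; level: (N,) water levels in metres.
    Replace points deviating more than tol from the local median."""
    xy, level = np.asarray(xy, float), np.asarray(level, float).copy()
    for i in range(len(level)):
        d = np.linalg.norm(xy - xy[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # k nearest, excluding self
        local = np.median(level[neighbours])
        if abs(level[i] - local) > tol:       # flag the inconsistency
            level[i] = local                  # reduce it
    return level

xy = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 0], [2, 1]])
z = np.array([10.1, 10.2, 10.0, 12.9, 10.3, 10.1])  # 12.9 is an outlier
print(smooth_marks(xy, z, k=3))
```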