129 results for Geospatial Data Model


Relevance:

30.00%

Publisher:

Abstract:

Soil erosion in the Philippine uplands is severe. Hedgerow intercropping is widely advocated as an effective means of controlling soil erosion from annual cropping systems in the uplands. However, few farmers adopt hedgerow intercropping even in areas where it has been vigorously promoted. This may be because farmers find hedgerow intercropping to be uneconomic compared to traditional methods of farming. This paper reports a cost-benefit analysis comparing the economic returns from traditional maize farming with those from hedgerow intercropping in an upland community with no past adoption of hedgerows. A simple erosion/productivity model, Soil Changes Under Agroforestry (SCUAF), is used to predict maize yields over 25 years. Economic data were collected through key informant surveys with experienced maize farmers in an upland community. Traditional methods of open-field farming of maize are economically attractive to farmers in the Philippine uplands. In the short term, establishment costs are a major disincentive to the adoption of hedgerow intercropping. In the long term, higher economic returns from hedgerow intercropping compared to open-field farming are realised, but these lie beyond farmers' limited planning horizons.
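
To make the planning-horizon argument concrete, here is a minimal discounting sketch comparing net present values of the two systems at different horizons. All figures (returns, erosion rate, establishment cost, discount rate) are invented placeholders, not values from the paper or from SCUAF.

```python
import numpy as np

def npv(net_benefits, rate=0.10):
    """Net present value of a stream of annual net benefits."""
    t = np.arange(1, len(net_benefits) + 1)
    return float(np.sum(net_benefits / (1.0 + rate) ** t))

years = np.arange(25)
open_field = 100.0 * 0.97 ** years   # erosion erodes yields by an assumed 3%/yr
hedgerow = np.full(25, 100.0)        # protected, stable returns (assumed)
hedgerow[0] -= 100.0                 # one-off establishment cost (assumed)

for horizon in (5, 10, 25):
    print(horizon, round(npv(open_field[:horizon]), 1),
          round(npv(hedgerow[:horizon]), 1))
# At short horizons the establishment cost dominates; only at the 25-year
# horizon does hedgerow intercropping pull ahead under these assumptions.
```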

Relevance:

30.00%

Publisher:

Abstract:

Experimental data for E. coli debris size reduction during high-pressure homogenisation at 55 MPa are presented. A mathematical model based on grinding theory is developed to describe the data. The model is based on first-order breakage and compensation conditions. It does not require any assumption of a specified distribution for debris size and can be used given information on the initial size distribution of whole cells and the disruption efficiency during homogenisation. The number of homogeniser passes is incorporated into the model and used to describe the size reduction of non-induced stationary and induced E. coli cells during homogenisation. Regressing the results to the model equations gave an excellent fit to experimental data (>98.7% of variance explained for both fermentations), confirming the model's potential for predicting size reduction during high-pressure homogenisation. This study provides a means to optimise both homogenisation and disc-stack centrifugation conditions for recombinant product recovery. (C) 1997 Elsevier Science Ltd.
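
A rough sketch of the first-order breakage idea in Python. The size grid, initial distribution, breakage rate k, and the uniform redistribution rule standing in for the compensation condition are all assumptions for illustration, not the paper's fitted model.

```python
import numpy as np

sizes = np.linspace(0.1, 2.0, 50)        # debris size classes (um), assumed
m = np.exp(-(sizes - 1.0) ** 2 / 0.1)    # assumed initial mass density
m /= m.sum()

def after_passes(m, n_passes, k=0.4):
    """First-order breakage: a fraction k of the mass in each size class
    breaks per pass and is redistributed to the smaller classes below it
    (a simple stand-in for the paper's compensation condition)."""
    m = m.copy()
    for _ in range(n_passes):
        broken = k * m
        broken[0] = 0.0                  # smallest class cannot break further
        m -= broken
        for i in range(len(m) - 1, 0, -1):
            m[:i] += broken[i] / i
    return m

for n in (1, 3, 5):
    mn = after_passes(m, n)
    print(n, round(float((sizes * mn).sum() / mn.sum()), 3))  # mean size falls
```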

Relevance:

30.00%

Publisher:

Abstract:

The distributed-tubes model of hepatic elimination is extended to include intermixing between sinusoids, resulting in the formulation of a new, interconnected-tubes model. The new model is analysed for the simple case of two interconnected tubes, where an exact solution is obtained. For the case of many strongly interconnected tubes, it is shown that a zeroth-order approximation leads to the convection-dispersion model. As a consequence the dispersion number is expressed, for the first time, in terms of its main physiological determinants: heterogeneity of flow and density of interconnections between sinusoids. The analysis of multiple indicator dilution data from a perfused liver preparation using the simplest version of the model yields an estimate of 10.3 for the average number of interconnections. The problem of boundary conditions for the dispersion model is considered from the viewpoint that the dispersion-convection equation is a zeroth-order approximation to the equations for the interconnected-tubes model. (C) 1997 Academic Press Limited.
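
A toy simulation of the interconnected-tubes picture: tracer molecules traverse a fast and a slow tube and may switch tubes at interconnection points, and a dispersion number is read off from the transit-time moments via the small-dispersion approximation D_N ≈ σ²/(2·t̄²). Velocities, segment count, and switching probability are invented; the paper treats this analytically.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
v_fast, v_slow = 2.0, 0.5        # assumed tube velocities (heterogeneous flow)
n_segments = 10                  # assumed interconnection points along the length

tube = rng.integers(0, 2, n)     # each tracer molecule starts in a random tube
t = np.zeros(n)
for _ in range(n_segments):
    v = np.where(tube == 0, v_fast, v_slow)
    t += (1.0 / n_segments) / v              # time to traverse this segment
    switch = rng.random(n) < 0.5             # strong intermixing at the junction
    tube = np.where(switch, 1 - tube, tube)

mean, var = t.mean(), t.var()
print("dispersion number ~", round(var / (2 * mean ** 2), 4))
# More interconnections per unit length average out the flow heterogeneity,
# driving this estimate down -- the qualitative link the paper formalises.
```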

Relevance:

30.00%

Publisher:

Abstract:

Analysis of a major multi-site epidemiologic study of heart disease has required estimation of the pairwise correlation of several measurements across sub-populations. Because the measurements from each sub-population were subject to sampling variability, the Pearson product moment estimator of these correlations produces biased estimates. This paper proposes a model that takes into account within- and between-sub-population variation, provides algorithms for obtaining maximum likelihood estimates of these correlations, and discusses several approaches for obtaining interval estimates. (C) 1997 by John Wiley & Sons, Ltd.
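
A small simulated illustration of the bias being corrected: when sub-population means carry sampling error, the naive Pearson estimator is attenuated toward zero. The population correlation of 0.8 and the sample sizes are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 30                                       # number of sub-populations
true = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=k)  # rho = 0.8

for n_per_pop in (5, 50, 500):
    # observed means = true means + sampling error of order 1/sqrt(n)
    obs = true + rng.normal(0, 1 / np.sqrt(n_per_pop), size=(k, 2))
    r = np.corrcoef(obs.T)[0, 1]
    print(n_per_pop, round(r, 3))
# The Pearson estimate approaches 0.8 only as within-sub-population sampling
# variability shrinks; the paper's ML estimator corrects for it directly.
```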

Relevance:

30.00%

Publisher:

Abstract:

The use of cell numbers rather than mass to quantify the size of the biotic phase in animal cell cultures causes several problems. First, the cell size varies with growth conditions, thus yields expressed in terms of cell numbers cannot be used in the normal mass balance sense. Second, experience from microbial systems shows that cell number dynamics lag behind biomass dynamics. This work demonstrates that this lag phenomenon also occurs in animal cell culture. Both the lag phenomenon and the variation in cell size are explained using a simple model of the cell cycle. The basis for the model is that onset of DNA synthesis requires accumulation of G1 cyclins to a prescribed level. This requirement is translated into a requirement for a cell to reach a critical size before commencement of DNA synthesis. A slower-growing cell will spend more time in G1 before reaching the critical mass. In contrast, the period between onset of DNA synthesis and mitosis, tau(B), is fixed. The two parameters in the model, the critical size and tau(B), were determined from eight steady-state measurements of mean cell size in a continuous hybridoma culture. Using these parameters, it was possible to predict with reasonable accuracy the transient behavior in a separate shift-up culture, i.e., a culture where cells were transferred from a lean environment to a rich environment. The implications for analyzing experimental data for animal cell culture are discussed. (C) 1997 John Wiley & Sons, Inc.
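
A minimal sketch of the critical-size model: time in G1 is whatever an exponentially growing cell needs to reach the critical mass, and the remainder of the cycle takes a fixed tau(B). The parameter values below are invented, not the ones fitted from the eight steady states.

```python
import numpy as np

S_CRIT = 1.6e-9   # assumed critical cell mass for onset of DNA synthesis (g)
TAU_B = 10.0      # assumed fixed time from DNA synthesis to mitosis (h)

def cycle_time(mu, birth_mass):
    """Exponentially growing cell: time in G1 is the time needed to grow
    from birth mass to the critical mass; S+G2+M takes a fixed tau(B)."""
    t_g1 = max(0.0, np.log(S_CRIT / birth_mass) / mu)
    return t_g1 + TAU_B

for mu in (0.02, 0.04, 0.06):                 # specific growth rates (1/h)
    print(mu, round(cycle_time(mu, 1.0e-9), 1))
# Slower-growing cells spend longer in G1 before dividing, which is why
# cell-number dynamics lag biomass dynamics after a shift in conditions.
```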

Relevance:

30.00%

Publisher:

Abstract:

The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
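
A compact sketch of the third approach (a flat prior truncated by a sign restriction), using a random-walk Metropolis sampler on simulated data rather than the mortgage data set; the model, prior, and tuning below are illustrative assumptions only.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one regressor
beta_true = np.array([-0.5, 1.0])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)  # probit DGP

def log_posterior(beta):
    if beta[1] <= 0:                     # inequality restriction: slope > 0
        return -np.inf                   # flat prior elsewhere (truncated uniform)
    p = np.clip(np.array([norm_cdf(z) for z in X @ beta]), 1e-10, 1 - 1e-10)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

beta = np.array([0.0, 0.5])              # start inside the restricted region
lp, draws = log_posterior(beta), []
for _ in range(5000):
    prop = beta + rng.normal(0, 0.1, size=2)   # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    draws.append(beta.copy())
print(np.round(np.mean(draws[1000:], axis=0), 2))  # posterior means after burn-in
```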

Relevance:

30.00%

Publisher:

Abstract:

There are two main types of data sources of income distributions in China: household survey data and grouped data. Household survey data are typically available for isolated years and individual provinces. In comparison, aggregate or grouped data are typically available more frequently and usually have national coverage. In principle, grouped data allow investigation of the change of inequality over longer, continuous periods of time, and the identification of patterns of inequality across broader regions. Nevertheless, a major limitation of grouped data is that only mean (average) income and income shares of quintile or decile groups of the population are reported. Directly using grouped data reported in this format is equivalent to assuming that all individuals in a quintile or decile group have the same income. This potentially distorts the estimate of inequality within each region. The aim of this paper is to apply an improved econometric method designed to use grouped data to study income inequality in China. A generalized beta distribution is employed to model income inequality in China at various levels and periods of time. The generalized beta distribution is more general and flexible than the lognormal distribution that has been used in past research, and also relaxes the assumption of a uniform distribution of income within quintile and decile groups of populations. The paper studies the nature and extent of inequality in rural and urban China over the period 1978 to 2002. Income inequality in the whole of China is then modeled using a mixture of province-specific distributions. The estimated results are used to study the trends in national inequality, and to discuss the empirical findings in the light of economic reforms, regional policies, and globalization of the Chinese economy.
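
A brief sketch of the GB2 machinery: draws are generated via the beta transform, and a Gini coefficient and quintile income shares (the format in which grouped data are reported) are computed from them. The parameter values are illustrative, not estimates for China.

```python
import numpy as np

rng = np.random.default_rng(3)

def gb2_sample(a, b, p, q, size):
    """GB2 draws via the beta transform: if U ~ Beta(p, q), then
    b * (U / (1 - U))**(1/a) follows a GB2(a, b, p, q) distribution."""
    u = rng.beta(p, q, size)
    return b * (u / (1 - u)) ** (1 / a)

def gini(x):
    x = np.sort(x)
    n = len(x)
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

x = gb2_sample(a=2.0, b=10_000.0, p=1.2, q=1.5, size=200_000)  # assumed parameters
print("simulated Gini ~", round(float(gini(x)), 3))

# Quintile income shares -- the format in which grouped data are reported:
xs, n = np.sort(x), len(x)
shares = [xs[i * n // 5:(i + 1) * n // 5].sum() / xs.sum() for i in range(5)]
print([round(float(s), 3) for s in shares])
```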

Relevance:

30.00%

Publisher:

Abstract:

Most cellular solids are random materials, while practically all theoretical structure-property results are for periodic models. To be able to generate theoretical results for random models, the finite element method (FEM) was used to study the elastic properties of solids with a closed-cell cellular structure. We have computed the density (ρ) and microstructure dependence of the Young's modulus (E) and Poisson's ratio (PR) for several different isotropic random models based on Voronoi tessellations and level-cut Gaussian random fields. The effect of partially open cells is also considered. The results, which are best described by a power law E ∝ ρ^n with 1 < n < 2, show the influence of randomness and isotropy on the properties of closed-cell cellular materials, and are found to be in good agreement with experimental data. (C) 2001 Acta Materialia Inc. Published by Elsevier Science Ltd. All rights reserved.
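
The exponent in such a power law can be recovered by log-log least squares; below is a sketch on synthetic stand-ins for the FEM results, with n = 1.6 assumed.

```python
import numpy as np

rng = np.random.default_rng(4)
rho = np.linspace(0.05, 0.5, 20)                           # relative density
E = 2.0 * rho ** 1.6 * rng.lognormal(0, 0.05, rho.size)    # noisy "FEM" data (assumed)

# log E = n * log(rho) + log C, so the slope of a log-log fit is the exponent n
slope, intercept = np.polyfit(np.log(rho), np.log(E), 1)
print("estimated exponent n ~", round(slope, 2))           # expect a value in (1, 2)
```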

Relevance:

30.00%

Publisher:

Abstract:

A mixture model incorporating long-term survivors has been adopted in the field of biostatistics where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceived that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
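
A minimal sketch of the mixture structure: overall survival is S(t) = π + (1 − π)·S_u(t), with a normal clinic-level random effect on logit(π) in the GLMM spirit. The baseline value, the exponential uncured survival, and the random-effect scale are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def survival(t, pi, hazard):
    """Mixture survival: cured patients never fail; uncured are exponential."""
    return pi + (1 - pi) * np.exp(-hazard * t)

# Clinic-level random effect shifts the cured fraction on the logit scale.
u = rng.normal(0, 0.5, size=5)               # one random effect per clinic
pi_clinic = 1 / (1 + np.exp(-(0.0 + u)))     # assumed baseline logit(pi) = 0

for c, pi in enumerate(pi_clinic):
    print(f"clinic {c}: cured fraction {pi:.2f}, "
          f"S(5) = {survival(5.0, pi, hazard=0.3):.2f}")
```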

Relevance:

30.00%

Publisher:

Abstract:

The collection of spatial information to quantify changes to the state and condition of the environment is a fundamental component of conservation and sustainable utilization of tropical and subtropical forests. Age is an important structural attribute of old-growth forests influencing biological diversity in Australian eucalypt forests. Aerial photograph interpretation has traditionally been used for mapping the age and structure of forest stands. However, this method is subjective and is not able to accurately capture the fine- to landscape-scale variation necessary for ecological studies. Identification and mapping of fine- to landscape-scale vegetative structural attributes will allow the compilation of information associated with Montreal Process indicators 1b and 1d, which seek to determine linkages between age structure and the diversity and abundance of forest fauna populations. This project integrated measurements of structural attributes derived from a canopy-height elevation model with results from a geometrical-optical/spectral mixture analysis model to map forest age structure at a landscape scale. The availability of multiple-scale data allows the transfer of high-resolution attributes to landscape-scale monitoring. Multispectral image data were obtained from a DMSV (Digital Multi-Spectral Video) sensor over St Mary's State Forest in Southeast Queensland, Australia. Local scene variance levels for different forest types calculated from the DMSV data were used to optimize the tree density and canopy size output in a geometric-optical model applied to a Landsat Thematic Mapper (TM) data set. Airborne laser scanner data obtained over the project area were used to calibrate a digital filter to extract tree heights from a digital elevation model that was derived from scanned colour stereopairs. The modelled estimates of tree height, crown size, and tree density were used to produce a decision-tree classification of forest successional stage at a landscape scale. The results obtained (72% accuracy) were limited in validation, but demonstrate the potential of the multi-scale methodology to provide spatial information for forestry policy objectives (i.e., monitoring forest age structure).
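
A toy version of the final classification step: simple threshold rules on the three modelled attributes (tree height, crown size, stem density). The thresholds and class labels are invented; the paper's decision tree was derived from the St Mary's data.

```python
def successional_stage(height_m, crown_m, stems_per_ha):
    """Illustrative decision-tree rules; all cut-offs are hypothetical."""
    if height_m < 15:
        return "regrowth"
    if crown_m < 6:
        return "young mature" if stems_per_ha > 300 else "disturbed mature"
    return "old-growth" if stems_per_ha < 200 else "mature"

stands = [(10, 3, 800), (22, 5, 350), (30, 9, 120)]   # (height, crown, density)
for h, c, d in stands:
    print(h, c, d, "->", successional_stage(h, c, d))
```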

Relevance:

30.00%

Publisher:

Abstract:

The Eysenck Personality Questionnaire-Revised (EPQ-R), the Eysenck Personality Profiler Short Version (EPP-S), and the Big Five Inventory (BFI-V4a) were administered to 135 postgraduate students of business in Pakistan. Whilst the Extraversion and Neuroticism scales from the three questionnaires were highly correlated, it was found that Agreeableness was most highly correlated with Psychoticism in the EPQ-R and Conscientiousness was most highly correlated with Psychoticism in the EPP-S. Principal component analyses with varimax rotation were carried out. The analyses generally suggested that the five-factor model rather than the three-factor model was more robust and better for interpretation of all the higher-order scales of the EPQ-R, EPP-S, and BFI-V4a in the Pakistani data. Results show that the superiority of the five-factor solution results from the inclusion of a broader variety of personality scales in the input data, whereas Eysenck's three-factor solution seems to be best when a less complete but possibly more important set of variables is input. (C) 2001 Elsevier Science Ltd. All rights reserved.
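
For reference, the core of the analysis (principal components of a correlation matrix followed by varimax rotation) fits in a few lines of numpy; the input below is random placeholder data standing in for the questionnaire scale scores.

```python
import numpy as np

rng = np.random.default_rng(6)
scores = rng.normal(size=(135, 9))      # e.g. 9 higher-order scales (assumed)

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Standard varimax rotation of a loading matrix."""
    p, k = loadings.shape
    R, d = np.eye(k), 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(0))))
        R, d_new = u @ vt, s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ R

corr = np.corrcoef(scores.T)
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1][:5]    # retain five components
loadings = eigvec[:, order] * np.sqrt(eigval[order])
print(np.round(varimax(loadings), 2))   # rotated loading matrix
```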

Relevance:

30.00%

Publisher:

Abstract:

A dynamic modelling methodology, which combines on-line variable estimation and parameter identification with physical laws to form an adaptive model for rotary sugar drying processes, is developed in this paper. In contrast to the conventional rate-based models using empirical transfer coefficients, the heat and mass transfer rates are estimated by using on-line measurements in the new model. Furthermore, a set of improved sectional solid transport equations with localized parameters is developed in this work to replace the global correlation for the computation of solid retention time. Since a number of key model variables and parameters are identified on-line using measurement data, the model is able to closely track the dynamic behaviour of rotary drying processes within a broad range of operational conditions. This adaptive model is validated against experimental data obtained from a pilot-scale rotary sugar dryer. The proposed modelling methodology can be easily incorporated into nonlinear model-based control schemes to form a unified modelling and control framework.
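
One common way to realise this kind of on-line identification is recursive least squares with a forgetting factor; the sketch below tracks a drifting two-parameter linear model. The regression form is illustrative, not the paper's heat- and mass-transfer equations.

```python
import numpy as np

rng = np.random.default_rng(7)
theta_hat = np.zeros(2)                        # parameter estimates
P = np.eye(2) * 100.0                          # estimate covariance
lam = 0.98                                     # forgetting factor (assumed)

theta_true = np.array([0.5, -0.2])
for k in range(200):
    if k == 100:
        theta_true = np.array([0.8, -0.1])     # process drifts mid-run
    phi = rng.normal(size=2)                   # regressor (on-line measurements)
    y = phi @ theta_true + rng.normal(0, 0.05) # measured output
    K = P @ phi / (lam + phi @ P @ phi)        # recursive least-squares gain
    theta_hat = theta_hat + K * (y - phi @ theta_hat)
    P = (P - np.outer(K, phi) @ P) / lam       # covariance update with forgetting

print(np.round(theta_hat, 2))                  # tracks the drifted parameters
```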

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a novel model for short-term load forecasting in the competitive electricity market. The prior electricity demand data are treated as a time series. The forecast model is based on wavelet multi-resolution decomposition via the autocorrelation shell representation and neural-network (multilayer perceptron, or MLP) modeling of the wavelet coefficients. To minimize the influence of noisy low-level coefficients, we applied the practical Bayesian Automatic Relevance Determination (ARD) method to choose the size of the MLPs, which are then trained to provide forecasts. The individual wavelet-domain forecasts are recombined to form the overall forecast. The proposed method is tested using Queensland electricity demand data from the Australian National Electricity Market. (C) 2001 Elsevier Science B.V. All rights reserved.
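
A skeletal version of the pipeline: a redundant à-trous decomposition (standing in here for the autocorrelation shell representation), a per-level forecaster (standing in for the trained MLPs), and recombination by summation. The kernel, number of levels, and toy demand series are all assumptions.

```python
import numpy as np

def a_trous(x, levels=3):
    """Redundant a-trous decomposition: returns detail layers + final smooth.
    Summing all layers reconstructs the original series exactly."""
    details, smooth = [], x.astype(float)
    for j in range(levels):
        kernel = np.zeros(2 * 2 ** j + 1)          # dilated [0.25, 0.5, 0.25]
        kernel[0] = kernel[-1] = 0.25
        kernel[len(kernel) // 2] = 0.5
        smoother = np.convolve(smooth, kernel, mode="same")
        details.append(smooth - smoother)
        smooth = smoother
    return details, smooth

def forecast_next(series):
    """Placeholder per-level forecaster: linear fit on the last few points."""
    t = np.arange(len(series))[-8:]
    a, b = np.polyfit(t, series[-8:], 1)
    return a * len(series) + b

demand = 100 + 10 * np.sin(np.arange(96) * 2 * np.pi / 48)  # fake half-hourly load
details, smooth = a_trous(demand)
prediction = forecast_next(smooth) + sum(forecast_next(d) for d in details)
print(round(float(prediction), 2))   # recombined one-step-ahead forecast
```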

Relevance:

30.00%

Publisher:

Abstract:

Bond's method for ball mill scale-up only gives the mill power draw for a given duty. This method is incompatible with computer modelling and simulation techniques. It might not be applicable for the design of fine grinding ball mills and ball mills preceded by autogenous and semi-autogenous grinding mills. Model-based ball mill scale-up methods have not been validated using a wide range of full-scale circuit data. Their accuracy is therefore questionable. Some of these methods also need expensive pilot testing. A new ball mill scale-up procedure is developed which does not have these limitations. This procedure uses data from two laboratory tests to determine the parameters of a ball mill model. A set of scale-up criteria then scales-up these parameters. The procedure uses the scaled-up parameters to simulate the steady state performance of full-scale mill circuits. At the end of the simulation, the scale-up procedure gives the size distribution, the volumetric flowrate and the mass flowrate of all the streams in the circuit, and the mill power draw.

Relevance:

30.00%

Publisher:

Abstract:

A new ball mill scale-up procedure is developed which uses laboratory data to predict the performance of full-scale ball mill circuits. This procedure contains two laboratory tests. These laboratory tests give the data for the determination of the parameters of a ball mill model. A set of scale-up criteria then scales-up these parameters. The procedure uses the scaled-up parameters to simulate the steady state performance of the full-scale mill circuit. At the end of the simulation, the scale-up procedure gives the size distribution, the volumetric flowrate and the mass flowrate of all the streams in the circuit, and the mill power draw. A worked example shows how the new ball mill scale-up procedure is executed. This worked example uses laboratory data to predict the performance of a full-scale re-grind mill circuit. This circuit consists of a ball mill in closed circuit with hydrocyclones. The full-scale ball mill has a diameter (inside liners) of 1.85 m. The scale-up procedure shows that the full-scale circuit produces a product (hydrocyclone overflow) that has an 80% passing size of 80 μm. The circuit has a recirculating load of 173%. The calculated power draw of the full-scale mill is 92 kW. (C) 2001 Elsevier Science Ltd. All rights reserved.
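
The recirculating-load figure translates into a simple mass balance around the mill and hydrocyclone, sketched below. The 173%, 80 μm, and 92 kW figures come from the worked example above; the fresh feed rate is an assumed number for illustration.

```python
# Closed-circuit mass balance: with a recirculating load of 173%, the cyclone
# underflow returned to the mill is 1.73x the fresh feed rate at steady state.
fresh_feed = 10.0                                    # t/h, assumed
recirc_ratio = 1.73                                  # recirculating load of 173%
mill_throughput = fresh_feed * (1 + recirc_ratio)    # mill sees feed + recycle

print(f"mill feed = {mill_throughput:.1f} t/h")
print("product P80 = 80 um, mill power draw = 92 kW (from the simulation)")
```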