185 results for model efficiency


Relevance: 30.00%

Abstract:

This paper summarises test results that were used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176.]. Verification of the model is based on results from four data sets that describe the performance of three industrial-scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation within the diamond mining industry and are represented by De Beers, BHP Billiton and Rio Tinto. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and HPGR laboratory-scale tests. Rio Tinto contributed an historical data set of tests completed during a previous research project. The results indicate that modelling of the HPGR process has matured to the point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material being tested. The prediction of throughput and corresponding power draw (based on throughput) is sensitive to inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations. (C) 2004 Elsevier Ltd. All rights reserved.
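
The gap/diameter sensitivity mentioned in this abstract enters through the throughput relation used for HPGR scale-up. Below is a minimal, continuity-based sketch of that relation, not the JKMRC model itself; the function names and all numerical values are illustrative assumptions.

    def hpgr_throughput_tph(roll_length_m, working_gap_m, roll_speed_ms, flake_density_tm3):
        # Mass flow through the gap = flake density * gap cross-section * peripheral speed.
        # A generic continuity estimate only, not the JKMRC scale-up procedure.
        return 3600.0 * flake_density_tm3 * roll_length_m * working_gap_m * roll_speed_ms

    def specific_throughput_constant(q_tph, roll_diameter_m, roll_length_m, roll_speed_ms):
        # m-dot = Q / (D * L * v); scale-up commonly assumes this constant carries from
        # laboratory to full-scale units, which is where inconsistent gap/diameter
        # ratios cause trouble.
        return q_tph / (roll_diameter_m * roll_length_m * roll_speed_ms)

    # Illustrative numbers only (not the paper's data):
    q = hpgr_throughput_tph(roll_length_m=1.0, working_gap_m=0.025,
                            roll_speed_ms=1.5, flake_density_tm3=2.4)
    print(q, specific_throughput_constant(q, 1.4, 1.0, 1.5))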

Relevance: 30.00%

Abstract:

Background and Aims: We have optimized the isolated perfused mouse kidney (IPMK) model for studying renal vascular and tubular function in vitro using 24-28 g C57BL6J mice, the wild-type controls for many transgenic mice. Methods and Results: Buffer composition was optimized for bovine serum albumin (BSA) concentration. The effect of adding erythrocytes on renal function and morphology was assessed. Autoregulation was investigated during stepped increases in perfusion pressure. Perfusion for 60 min at 90-110 mmHg with Krebs bicarbonate buffer containing 5.5% BSA and amino acids produced functional parameters within the in vivo range. Erythrocytes increased renal vascular resistance (3.8 ± 0.2 vs 2.4 ± 0.1 mL/min.mmHg, P < 0.05), enhanced sodium reabsorption (FENa = 0.3 ± 0.08 vs 1.5 ± 0.7%, P < 0.05), produced equivalent glomerular filtration rates (GFR; 364 ± 38 vs 400 ± 9 µL/min per g kidney weight) and reduced distal tubular cell injury in the inner stripe (5.8 ± 1.7 vs 23.7 ± 3.1%, P < 0.001) compared with cell-free perfusion. The IPMK was responsive to vasoconstrictor (angiotensin II, EC50 100 pM) and vasodilator (methacholine, EC50 75 nM) mediators and showed partial autoregulation of perfusate flow under control conditions over 65-85 mmHg, with an autoregulatory index (ARI) of 0.66 ± 0.11. Angiotensin II (100 pM) extended this range (to 65-120 mmHg) and enhanced efficiency (ARI 0.21 ± 0.02, P < 0.05). Angiotensin II facilitation was antagonized by methacholine (ARI 0.76 ± 0.08) and papaverine (ARI 0.91 ± 0.13). Conclusion: The IPMK model is useful for studying renal physiology and pathophysiology without systemic neurohormonal influences.
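
The abstract does not state which autoregulatory index definition was used; the sketch below assumes the common fractional-change form (0 = perfect autoregulation, 1 = a purely passive, pressure-proportional response), and all input values are illustrative.

    def autoregulatory_index(flow_1, flow_2, pressure_1, pressure_2):
        # ARI = (dF / F1) / (dP / P1): fractional flow change over fractional
        # pressure change across a perfusion-pressure step.  0 means flow is
        # unchanged despite the step; 1 means flow rises in proportion to pressure.
        return ((flow_2 - flow_1) / flow_1) / ((pressure_2 - pressure_1) / pressure_1)

    # Illustrative pressure step from 65 to 85 mmHg (not data from the paper):
    print(autoregulatory_index(flow_1=1.0, flow_2=1.2, pressure_1=65.0, pressure_2=85.0))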

Relevance: 30.00%

Abstract:

The paper investigates the effects of trade liberalisation on the technical efficiency of the Bangladesh manufacturing sector by estimating a combined stochastic frontier-inefficiency model using panel data for the period 1978-94 for 25 three-digit level industries. The results show that the overall technical efficiency of the manufacturing sector, as well as the technical efficiencies of the majority of the individual industries, has increased over time. The findings also clearly suggest that trade liberalisation, proxied by export orientation and capital deepening, has had a significant impact on reducing overall technical inefficiency. Similarly, the scale of operation and the proportion of non-production labour in total employment appear to be important determinants of technical inefficiency. The evidence also indicates that both export-promoting and import-substituting industries have experienced rises in technical efficiency over time. In addition, the results are suggestive of neutral technical change, although (at the 5 per cent level of significance) the empirical results indicate that there was no technical change in the manufacturing industries. Finally, the joint likelihood ratio (LR) test rejects the Cobb-Douglas production technology as a description of the data, given the specification of a translog production technology.
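
Two quantities underpin this kind of study: the technical efficiency implied by the one-sided inefficiency term of a stochastic frontier, and the LR test used to choose between the Cobb-Douglas and translog specifications. The sketch below is a generic illustration under those standard definitions, not the paper's estimation code; all numbers are invented, and frontier studies often use a mixed chi-square critical value rather than the ordinary one used here.

    import math
    from scipy import stats

    def technical_efficiency(u_it):
        # For a frontier  ln y = x'beta + v - u  with u >= 0, TE = exp(-u).
        return math.exp(-u_it)

    def lr_test(loglik_restricted, loglik_unrestricted, n_restrictions, alpha=0.05):
        # Generalised likelihood-ratio test, e.g. Cobb-Douglas (restricted)
        # against translog (unrestricted).
        lr = -2.0 * (loglik_restricted - loglik_unrestricted)
        critical = stats.chi2.ppf(1.0 - alpha, df=n_restrictions)
        return lr, critical, lr > critical

    # Illustrative values only (not from the paper):
    print(technical_efficiency(0.25))                     # ~0.78
    print(lr_test(-410.2, -395.7, n_restrictions=10))     # rejects the restricted form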

Relevance: 30.00%

Abstract:

Explants of the hard coral Seriatopora hystrix were exposed to sublethal concentrations of the herbicide diuron (DCMU; N′-(3,4-dichlorophenyl)-N,N-dimethylurea) and the heavy metal copper. Pulse amplitude modulated (PAM) chlorophyll fluorescence techniques were used to assess the effects on the photosynthetic efficiency of the algal symbionts in the tissue (in symbio), and chlorophyll fluorescence and counts of symbiotic algae (normalised to surface area) were used to assess the extent of coral bleaching. At 30 µg DCMU l⁻¹, there was a reduction in both the maximum effective quantum yield (ΔF/Fm′) and maximum potential quantum yield (Fv/Fm) of the algal symbionts in symbio. Corals subsequently lost their algal symbionts and discoloured (bleached), especially on their upper sunlight-exposed surfaces. At the same DCMU concentration but under low light (5% of growth irradiance), there was a marked reduction in ΔF/Fm′ but only a slight reduction in Fv/Fm and slight loss of algae. Loss of algal symbionts was also noted after a 7 d exposure to concentrations as low as 10 µg DCMU l⁻¹ under normal growth irradiance, and after 14 d exposure to 10 µg DCMU l⁻¹ under reduced irradiance. Collectively the results indicate that DCMU-induced bleaching is caused by a light-dependent photoinactivation of algal symbionts, and that bleaching occurs when Fv/Fm (measured 2 h after sunset) is reduced to a value of less than or equal to 0.6. Elevated copper concentrations (60 µg Cu l⁻¹ for 10 h) also induced rapid bleaching in S. hystrix but without affecting the quantum yield of the algae in symbio. Tests with isolated algae indicated that substantially higher concentrations (300 µg Cu l⁻¹ for 8 h) were needed to significantly reduce the quantum yield. Thus, copper-induced bleaching occurs without affecting algal photosynthesis and may be related to effects on the host (animal). It is argued that warm-water bleaching of corals resembles both types of chemically induced bleaching, suggesting the need for an integrated model of coral bleaching involving the effect of temperature on both the host (coral) and the algal symbionts.
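
The two fluorescence parameters quoted in the abstract are standard PAM quantities. The sketch below shows how they are computed from raw fluorescence signals under their usual definitions; the raw values are illustrative, not the study's measurements.

    def max_quantum_yield(f0, fm):
        # Dark-adapted maximum potential quantum yield, Fv/Fm = (Fm - F0) / Fm.
        return (fm - f0) / fm

    def effective_quantum_yield(f, fm_prime):
        # Light-adapted effective quantum yield, dF/Fm' = (Fm' - F) / Fm'.
        return (fm_prime - f) / fm_prime

    # Healthy symbionts typically sit near Fv/Fm ~ 0.6-0.7; the bleaching threshold
    # quoted above is Fv/Fm <= 0.6.  Illustrative raw fluorescence values:
    print(max_quantum_yield(f0=300.0, fm=900.0))           # ~0.67
    print(effective_quantum_yield(f=450.0, fm_prime=700.0))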

Relevance: 30.00%

Abstract:

The relationship between reported treatments of lameness, metabolic disorders (milk fever, ketosis) and digestive disorders, and technical efficiency (TE) was investigated using neutral and non-neutral stochastic frontier analysis (SFA). TE is estimated relative to the stochastic frontier production function for a sample of 574 Danish dairy herds collected in 1997. Contrary to most published results, but in line with the expected negative impact of disorders on average per-cow milk production, herds reporting higher frequencies of milk fever are less technically efficient. Unexpectedly, however, the opposite result was observed for lameness, ketosis and digestive disorders. The non-neutral stochastic frontier indicated that these opposite results are due to the relatively high productivities of inputs. The productivity of the cows is also reflected in the direction of impact of the herd management variables. Whereas efficient farms replace cows more frequently, enroll heifers in production at an earlier age and have shorter calving intervals, they also report a higher frequency of disorder treatments. The average estimated energy-corrected milk loss per cow is 1036, 451 and 242 kg for low-, medium- and high-efficiency farms, respectively. The study demonstrates the benefit of a stochastic frontier production function involving the estimation of individual technical efficiencies to evaluate farm performance and investigate the sources of inefficiency. (C) 2004 Elsevier B.V. All rights reserved.
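
Milk-loss figures of the kind quoted above can be read off a stochastic frontier by comparing actual output with the frontier (fully efficient) output implied by each herd's TE score. The sketch below shows that arithmetic; it is one common convention, not necessarily the computation used in the paper, and the numbers are illustrative.

    def milk_loss_per_cow(actual_ecm_kg, technical_efficiency):
        # Frontier output = actual / TE, so the energy-corrected milk (ECM)
        # foregone per cow is actual * (1/TE - 1).
        return actual_ecm_kg * (1.0 / technical_efficiency - 1.0)

    # A hypothetical herd producing 8000 kg ECM per cow at TE = 0.89 forgoes ~990 kg:
    print(milk_loss_per_cow(8000.0, 0.89))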

Relevance: 30.00%

Abstract:

Relationships between various reproductive disorders and the milk production performance of Danish dairy farms were investigated. A stochastic frontier production function was estimated using data collected in 1998 from 514 Danish dairy farms. Measures of farm-level milk production efficiency relative to this production frontier were obtained, and relationships between milk production efficiency and the incidence risk of reproductive disorders were examined. There were moderate positive relationships between milk production efficiency and retained placenta, induction of estrus, uterine infections, ovarian cysts and induction of birth. When reproductive management variables were included, these moderate relationships disappeared, although the directions of the coefficients for almost all of those variables remained the same. Dystocia showed a weak negative correlation with milk production efficiency. Farms that were mainly managed by young farmers had the highest average efficiency scores. The estimated milk losses due to inefficiency averaged 1142, 488 and 256 kg of energy-corrected milk per cow for low-, medium- and high-efficiency herds, respectively. It is concluded that the availability of younger cows, which enabled farmers to replace cows with reproductive disorders, contributed to high cow productivity on efficient farms. Thus, a high replacement rate more than compensates for the possible negative effect of reproductive disorders. The use of frontier production and efficiency/inefficiency functions to analyze herd data may enable dairy advisors to identify inefficient herds and to simulate the effect of alternative management procedures on an individual herd's efficiency.

Relevance: 30.00%

Abstract:

The resource potential of shallow water tables for cropping systems has been investigated using the Australian sugar industry as a case study. Literature concerning shallow water table contributions to sugarcane crops has been summarised, and an assessment of the irrigation required for water tables at depths down to 2 m was made using the SWIMv2.1 soil water balance model for three different soils. The study was undertaken because water availability is a major limitation for sugarcane and other crop production systems in Australia, and knowledge of how best to incorporate upflow from water tables in irrigation scheduling is limited. Our results showed that for the three soils studied (representing a range of permeabilities, as defined by near-saturated hydraulic conductivities), no irrigation would be required for static water tables within 1 m of the soil surface. Irrigation requirements when static water tables exceeded 1 m depth were dependent on the soil type and rooting characteristics (root depth and density). Our results also show that near-saturated hydraulic conductivities are a better indicator than soil textural classifications of the ability of water tables below 1 m to supply sufficient upflow. We conclude that there is potential for reductions in irrigation, and hence improvements in irrigation water use efficiency, in areas where shallow water tables pose a low salinity risk: either the groundwater is fresh, or the local hydrology results in net recharge. (C) 2003 Elsevier B.V. All rights reserved.
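
The scheduling question raised here (how much irrigation is still needed once water-table upflow is credited) can be framed as a simple daily water balance. The sketch below is a back-of-envelope illustration of that framing, not the SWIMv2.1 simulation, and all values are illustrative.

    def irrigation_requirement(crop_et_mm, rainfall_mm, upflow_mm):
        # Daily requirement = crop evapotranspiration - rainfall - water-table upflow,
        # floored at zero.  A rough balance only, not a Richards-equation model.
        return max(0.0, crop_et_mm - rainfall_mm - upflow_mm)

    # With a static water table within 1 m of the surface, upflow can meet the whole
    # crop demand (requirement = 0), consistent with the result reported above.
    print(irrigation_requirement(crop_et_mm=6.0, rainfall_mm=0.0, upflow_mm=6.0))  # 0.0
    print(irrigation_requirement(crop_et_mm=6.0, rainfall_mm=1.0, upflow_mm=2.0))  # 3.0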

Relevance: 30.00%

Abstract:

The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that, in order to make the next step towards more realistic experiments, it will be necessary to use models containing a significantly larger number of particles than current models, and those simulations will therefore require greatly increased computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e. to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
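
Parallel efficiency as quoted above is conventionally defined as speedup divided by processor count. The short sketch below states that definition; the timings are invented for illustration, not the paper's benchmark data.

    def parallel_efficiency(t_serial, t_parallel, n_procs):
        # E = speedup / processors = T1 / (p * Tp)
        return t_serial / (n_procs * t_parallel)

    # Illustrative timings: a run taking 1000 s on one processor and 9.8 s on
    # 128 processors gives E ~ 0.80, i.e. the ~80% figure quoted above.
    print(parallel_efficiency(t_serial=1000.0, t_parallel=9.8, n_procs=128))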

Relevance: 30.00%

Abstract:

Demonstrating the existence of trends in monitoring data is of increasing practical importance to conservation managers wishing to preserve threatened species or reduce the impact of pest species. However, the ability to do so can be compromised if the species in question has low detectability and the true occupancy level or abundance of the species is thus obscured. Zero-inflated models that explicitly model detectability improve the ability to make sound ecological inference in such situations. In this paper we apply an occupancy model including detectability to data from the initial stages of a fox-monitoring program on the Eyre Peninsula, South Australia. We find that detectability is extremely low (< 18%) and varies according to season and the presence or absence of roadside vegetation. We show that simple methods of using monitoring data to inform management, such as plotting the raw data or performing logistic regression, fail to accurately diagnose either the status of the fox population or its trajectory over time. We use the results of the detectability model to consider how future monitoring could be redesigned to achieve efficiency gains. A wide range of monitoring programs could benefit from similar analyses, as part of an active adaptive approach to improving monitoring and management.
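
The core of a zero-inflated occupancy model with detectability is the site-level likelihood: a site never recording the species may be genuinely unoccupied or occupied but always missed. The sketch below illustrates that likelihood in the standard single-season (MacKenzie-style) form; the parameter values are illustrative, not the fox estimates.

    import math

    def site_log_likelihood(detections, n_surveys, psi, p):
        # psi: probability the site is occupied; p: per-survey detection probability.
        # 'detections' is the number of visits (out of n_surveys) on which the
        # species was recorded, for a particular detection history.
        if detections > 0:
            # Occupied, detected on 'detections' visits and missed on the rest.
            lik = psi * (p ** detections) * ((1 - p) ** (n_surveys - detections))
        else:
            # Never detected: occupied but missed every time, or unoccupied.
            lik = psi * ((1 - p) ** n_surveys) + (1 - psi)
        return math.log(lik)

    # With p < 0.18 (as estimated for foxes above), an occupied site surveyed 4 times
    # is missed entirely with probability > 0.45, so raw presence/absence data badly
    # understate occupancy.  Illustrative parameter values:
    print(sum(site_log_likelihood(d, 4, psi=0.6, p=0.15) for d in [0, 0, 1, 2]))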

Relevance: 30.00%

Abstract:

Detailed microscopic examination using optical and electron microscopes suggests that Al4C3, often observed in the central regions of magnesium grains on polished sections, is a potent substrate for primary Mg. Calculations of the crystallographic relationships between magnesium and Al4C3 further support the experimental observations. (c) 2005 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Cereal-legume intercropping plays an important role in subsistence food production in developing countries, especially in situations of limited water resources. Crop simulation can be used to assess risk for intercrop productivity over time and space. In this study, a simple model for intercropping was developed for cereal and legume growth and yield under semi-arid conditions. The model is based on radiation interception and use, and incorporates a water stress factor. Total dry matter and yield are functions of photosynthetically active radiation (PAR), the fraction of radiation intercepted and radiation use efficiency (RUE). One of two PAR sub-models was used to estimate PAR from solar radiation: either PAR is 50% of solar radiation, or the ratio of PAR to solar radiation (PAR/SR) is a function of the clearness index (KT). The fraction of radiation intercepted was calculated from Beer's law with crop extinction coefficients (K) taken either from field experiments or from previous reports. RUE was calculated as a function of available soil water to a depth of 900 mm (ASW). Either the soil water balance method or the decay curve approach was used to determine ASW. Thus, two alternatives for each of three factors, i.e. PAR/SR, K and ASW, were considered, giving eight possible models (2^3 combinations). The model calibration and validation were carried out with maize-bean intercropping systems using data collected in a semi-arid region (Bloemfontein, Free State, South Africa) during seven growing seasons (1996/1997-2002/2003). The combination of PAR estimated from the clearness index, a crop extinction coefficient from the field experiment and the decay curve model gave the most reasonable and acceptable results. The intercrop model developed in this study is simple, so this modelling approach can be employed to develop other cereal-legume intercrop models for semi-arid regions. (c) 2004 Elsevier B.V. All rights reserved.
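
The daily growth calculation in a radiation-interception / RUE model of this kind reduces to a one-line product. The sketch below shows that structure using the simpler of the two PAR sub-models (PAR = 50% of solar radiation); the function name and all parameter values are illustrative assumptions, not the paper's calibrated model.

    import math

    def daily_biomass_gain(solar_mj_m2, lai, k, rue_g_mj, water_stress):
        # PAR taken as 50% of incident solar radiation; interception follows
        # Beer's law, 1 - exp(-K * LAI); RUE is scaled by a 0-1 water stress
        # factor derived from available soil water (ASW).
        par = 0.5 * solar_mj_m2                       # MJ PAR m-2 d-1
        f_intercepted = 1.0 - math.exp(-k * lai)      # Beer's law
        return par * f_intercepted * rue_g_mj * water_stress   # g DM m-2 d-1

    # Example: 22 MJ m-2 d-1 solar radiation, LAI 2.5, K 0.6, RUE 2.0 g MJ-1,
    # moderate water stress of 0.7 (all values illustrative):
    print(daily_biomass_gain(22.0, 2.5, 0.6, 2.0, 0.7))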

Relevance: 30.00%

Abstract:

With marine biodiversity conservation the primary goal of reserve planning initiatives, a site's conservation potential is typically evaluated on the basis of the biological and physical features it contains. By comparison, socio-economic information is seldom a formal consideration in the reserve system design problem and is generally limited to an assessment of threats, vulnerability or compatibility with surrounding uses. This is perhaps surprising given the broad recognition that the success of reserve establishment is highly dependent on widespread stakeholder and community support. Using information on the spatial distribution and intensity of commercial rock lobster catch in South Australia, we demonstrate the capacity of mathematical reserve selection procedures to integrate socio-economic and biophysical information for marine reserve system design. Analyses of trade-offs highlight the opportunities to design representative, efficient and practical marine reserve systems that minimise potential losses to commercial users. We found that the objective of minimising the areal extent of the reserve system was barely compromised by incorporating economic design constraints. With a small increase in area (< 3%) and boundary length (< 10%), the economic impact of marine reserves on the commercial rock lobster fishery was reduced by more than a third. We also considered how a reserve planner might prioritise conservation areas using information on a planning unit's selection frequency. We found that selection frequencies alone were not a reliable guide for the selection of marine reserve systems, but could be used with approaches such as summed irreplaceability to direct conservation effort for efficient marine reserve design.
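
The trade-off between reserve area and foregone catch can be illustrated with a toy greedy selection in the spirit of, but much simpler than, the mathematical reserve selection procedures used in the study (tools such as Marxan use simulated annealing with boundary-length terms). Everything in the sketch, including the planning units and their values, is hypothetical.

    def greedy_reserve(units, features_required, catch_weight=1.0):
        # units: dict id -> (set_of_features, area, lobster_catch_value)
        # Repeatedly add the planning unit with the best ratio of newly covered
        # conservation features to cost (area + weight * catch) until every
        # required feature is represented.
        selected, covered = [], set()
        while covered < features_required:
            def score(item):
                uid, (feats, area, catch) = item
                gain = len((feats & features_required) - covered)
                return gain / (area + catch_weight * catch) if gain else 0.0
            best_id, (feats, _, _) = max(units.items(), key=score)
            selected.append(best_id)
            covered |= feats & features_required
            units = {k: v for k, v in units.items() if k != best_id}
        return selected

    # Hypothetical planning units: (features present, area in km2, catch value):
    units = {"A": ({"reef", "seagrass"}, 10.0, 5.0),
             "B": ({"reef"}, 4.0, 0.5),
             "C": ({"seagrass"}, 3.0, 0.2)}
    print(greedy_reserve(units, features_required={"reef", "seagrass"}))  # ['C', 'B']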

Relevance: 30.00%

Abstract:

The Bunge-Wand-Weber (BWW) representation model defines ontological constructs for information systems. Based on these constructs, the completeness and efficiency of a modeling technique can be evaluated. Ontology plays an essential role in e-commerce: using or updating an existing ontology, and providing tools to resolve any semantic conflicts, are essential steps before putting a system online. We use conceptual graphs (CGs) to implement ontologies. This paper evaluates the capabilities of CGs using the BWW representation model. It finds that CGs are ontologically complete according to the Wand and Weber definition. It also finds that CGs exhibit construct overload and construct redundancy, which can undermine their ontological clarity. This leads us to build a meta-model that avoids some of these ontological clarity problems. We use some of the BWW constructs to build the meta-model. (c) 2004 Elsevier Ltd. All rights reserved.
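
A Wand-and-Weber style representational analysis reduces to inspecting the mapping between the modelling grammar's constructs and the BWW constructs: incompleteness (a BWW construct with no representation), overload (one grammar construct carrying several ontological meanings) and redundancy (several grammar constructs for one meaning). The sketch below illustrates that bookkeeping on a hypothetical mapping fragment, not the paper's actual evaluation of CGs.

    from collections import defaultdict

    def analyse_mapping(bww_constructs, cg_to_bww):
        # cg_to_bww maps each conceptual-graph construct to the set of BWW
        # constructs it can represent.
        represented = set().union(*cg_to_bww.values()) if cg_to_bww else set()
        incompleteness = bww_constructs - represented                       # no representation
        overload = {cg for cg, b in cg_to_bww.items() if len(b) > 1}        # one construct, many meanings
        bww_to_cg = defaultdict(set)
        for cg, bs in cg_to_bww.items():
            for b in bs:
                bww_to_cg[b].add(cg)
        redundancy = {b: cgs for b, cgs in bww_to_cg.items() if len(cgs) > 1}
        return incompleteness, overload, redundancy

    # Hypothetical fragment of a mapping (for illustration only):
    bww = {"thing", "property", "state", "transformation"}
    cg = {"concept": {"thing", "state"},         # overloaded
          "relation": {"property"},
          "actor": {"transformation"},
          "type-label": {"property"}}            # redundant with 'relation'
    print(analyse_mapping(bww, cg))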

Relevance: 30.00%

Abstract:

The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (which adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths but overcome its weaknesses in the face of local optima. Using the first of these methods, an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed, either by numerical instability incurred through problem ill-posedness or when a local objective function minimum is encountered. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and of detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model-run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
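
The second enhancement described here, restarting from points maximally removed from previous parameter trajectories, can be illustrated with a simple maximin criterion in normalised parameter space. The sketch below is an illustration of that idea only, not the paper's algorithm; the sampling scheme, function name and all numbers are assumptions.

    import numpy as np

    def max_distance_start(prior_points, bounds, n_candidates=1000, rng=None):
        # Sample candidate starting points uniformly within the parameter bounds
        # and keep the one whose nearest previously visited point is furthest away.
        rng = rng or np.random.default_rng(0)
        lo, hi = np.asarray(bounds, dtype=float).T        # bounds: [(lo, hi), ...]
        candidates = rng.uniform(lo, hi, size=(n_candidates, len(lo)))
        prior = np.asarray(prior_points, dtype=float)
        # Distance from every candidate to its nearest previously visited point:
        d = np.linalg.norm(candidates[:, None, :] - prior[None, :, :], axis=2).min(axis=1)
        return candidates[np.argmax(d)]

    # Hypothetical 2-parameter problem with two earlier calibration trajectories:
    prior = [[0.2, 0.3], [0.25, 0.35], [0.8, 0.7]]
    print(max_distance_start(prior, bounds=[(0.0, 1.0), (0.0, 1.0)]))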

Relevance: 30.00%

Abstract:

The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means of studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem, given that a discontinuity (the transition from static to dynamic frictional behaviour) must be accurately simulated by the numerical approach. This is achieved using a half-time-step integration scheme: at each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle pairs. Improved efficiency is achieved by adaptively adjusting the time step increment depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and an improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
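
For orientation, the integrator family underlying the modified scheme is plain velocity Verlet with half-step velocity updates, paired here with a velocity-based cap on the time step in the spirit of the adaptive increment mentioned above. The sketch omits the frictional-force solve at the half step entirely, uses a toy force law, and all values are illustrative.

    import numpy as np

    def velocity_verlet_step(x, v, accel, dt):
        # Standard velocity-Verlet update; accel(x) returns accelerations for all particles.
        a0 = accel(x)
        v_half = v + 0.5 * dt * a0          # first half kick
        x_new = x + dt * v_half             # drift
        a1 = accel(x_new)                   # forces at the new positions
        v_new = v_half + 0.5 * dt * a1      # second half kick
        return x_new, v_new

    def adaptive_dt(v, min_spacing, dt_max, safety=0.1):
        # Shrink the step when particles move fast: no particle should travel more
        # than a small fraction of the minimum particle spacing per step.
        vmax = np.max(np.linalg.norm(v, axis=1))
        return dt_max if vmax == 0 else min(dt_max, safety * min_spacing / vmax)

    # Illustrative use with a toy harmonic restoring force (not the lattice solid force law):
    x = np.array([[0.0, 0.0], [1.0, 0.0]])
    v = np.zeros_like(x)
    dt = adaptive_dt(v, min_spacing=1.0, dt_max=1e-3)
    x, v = velocity_verlet_step(x, v, lambda pos: -pos, dt)
    print(x, v, dt)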