39 results for soil data requirements

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

This report outlines an analysis of soil data requirements across the Gippsland region in order for local councils to carry out regional-scale land suitability analysis (LSA). As such, the primary objective of this study was to ascertain and source available relevant soil attribute data and information necessary to enable LSA.

Relevance:

90.00%

Publisher:

Abstract:

Biophysical investigations of estuaries require a diversity of tasks to be undertaken by a number of disciplines, leading to a range of data requirements and dataflow pathways. Technology changes relating to data collection and storage have led to the need for metadata systems that describe the vast amounts of data now able to be stored electronically. Such a system is described as the first step in the creation of an efficient data management system for biophysical estuarine data.

Relevance:

40.00%

Publisher:

Abstract:

Agricultural soils are a major source of nitrous oxide (N2O) emissions, and an understanding of the factors regulating such emissions across contrasting soil types is critical for improved estimation through modelling and for mitigation of N2O. In this study we investigated the role of soil texture and its interaction with plants in regulating N2O fluxes in agricultural systems. A measurement system that combined weighing lysimeters with automated chambers was used to directly compare continuously measured surface N2O fluxes, leaching losses of water and nitrogen, and evapotranspiration in three contrasting soil types of the Riverine Plain, NSW, Australia. The soils comprised a deep sand, a loam and a clay loam, with and without the presence of wheat plants. All soils were under the same fertilizer management, and irrigation was applied according to plant water requirements. In fallow soils, texture significantly affected N2O emissions in the order clay loam > loam > sand. However, when planted, the difference in N2O emissions among the three soil types became less pronounced. Nitrous oxide emissions were 6.2 and 2.4 times higher from fallow clay loam and loam cores, respectively, compared with cores planted with wheat. This is considered to be due to plant uptake of water and nitrogen, which reduced soil water and available nitrogen and therefore created less favourable conditions for denitrification. The effect of plants on N2O emissions was not apparent in the coarse-textured sandy soil, probably because of aerobic soil conditions, likely caused by low water holding capacity and rapid drainage irrespective of plant presence, resulting in reduced denitrification activity. More than 90% of N2O emissions were derived from denitrification in the fine-textured clay loam, as determined over a two-week period using K15NO3 fertilizer. The proportion of N2O not derived from K15NO3 was higher in the coarse-textured sand and loam, and may have been derived from soil N through nitrification or denitrification of mineralized N. Water-filled pore space was a poorer predictor of N2O emissions than volumetric water content because of variable bulk density among soil types. The data may better inform the calibration of greenhouse gas prediction models, as soil texture is one of the primary factors explaining spatial variation in N2O emissions by regulating soil oxygen. Defining the significance of N2O emissions between planted and fallow soils may enable improved yield-scaled N2O emission assessment and water and nitrogen scheduling in the pre-watering phase during early crop establishment and within rotations of irrigated arable cropping systems.
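The comparison between water-filled pore space and volumetric water content as predictors hinges on how bulk density enters the conversion. A minimal sketch of that conversion (the 2.65 g/cm3 particle density is the standard mineral-soil assumption, and the example bulk densities are illustrative, not values from the study):

```python
def wfps(theta_v, bulk_density, particle_density=2.65):
    """Water-filled pore space (fraction) from volumetric water
    content theta_v (cm3/cm3) and bulk density (g/cm3). Porosity is
    estimated as 1 - BD/PD, assuming a mineral-soil particle density
    of 2.65 g/cm3."""
    porosity = 1.0 - bulk_density / particle_density
    return theta_v / porosity

# The same volumetric water content gives different WFPS values when
# bulk density differs between soils (illustrative densities):
print(round(wfps(0.25, 1.60), 2))   # dense sand
print(round(wfps(0.25, 1.20), 2))   # looser clay loam
```

This is why a single WFPS threshold can mislead across soils of varying bulk density, while volumetric water content sidesteps the porosity estimate entirely.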

Relevance:

30.00%

Publisher:

Abstract:

Embodied energy (EE) analysis has become an important area of energy research, in attempting to trace the direct and indirect energy requirements of products and services throughout their supply chain. Typically, input-output (I-O) models have been used to calculate EE because they are considered to be comprehensive in their analysis. However, a major deficiency of using I-O models is that they have inherent errors and therefore cannot be reliably applied to individual cases. Thus, there is a need for the ability to disaggregate an I-O model into its most important 'energy paths', for the purpose of integrating case-specific data. This paper presents a new hybrid method for conducting EE analyses for individual buildings, which retains the completeness of the I-O model. This new method is demonstrated by application to an Australian residential building. Only 52% of the energy paths derived from the I-O model were substituted using case-specific data. This indicates that previous system boundaries for EE studies of individual residential buildings are less than optimal. It is envisaged that the proposed method will provide construction professionals with more accurate and reliable data for conducting life cycle energy analysis of buildings. Furthermore, by analysing the unmodified energy paths, further data collection can be prioritized effectively.
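The disaggregation of an I-O model into its most important energy paths can be sketched with structural path analysis: expand the Leontief series and rank individual supply-chain paths by their energy contribution, so the largest paths become candidates for substitution with case-specific data. A toy sketch with invented coefficients (the matrix, intensities and sector count are not from the paper):

```python
from itertools import product

# Toy direct-requirements matrix A (purchases per $ of output) and
# direct energy intensities e (GJ per $ of output) for three sectors.
# All values are illustrative only.
A = [[0.0, 0.1, 0.2],
     [0.3, 0.0, 0.1],
     [0.1, 0.2, 0.0]]
e = [5.0, 2.0, 8.0]

def energy_paths(target, max_stage=2):
    """Enumerate energy paths into `target` up to `max_stage`
    upstream stages; returns (path, GJ per $ of target output),
    sorted by descending contribution."""
    n = len(A)
    paths = [((target,), e[target])]          # stage 0: direct energy
    for stage in range(1, max_stage + 1):
        for chain in product(range(n), repeat=stage):
            nodes = list(chain) + [target]
            value = e[chain[0]]               # energy at the path's origin
            for i in range(len(nodes) - 1):
                value *= A[nodes[i]][nodes[i + 1]]
            paths.append((tuple(nodes), value))
    return sorted(paths, key=lambda p: -p[1])

# Rank paths into sector 0; the top-ranked paths are those worth
# replacing with case-specific (process) data first.
for path, gj in energy_paths(0)[:4]:
    print(path, round(gj, 3))
```

Ranking paths this way also shows which paths remain unmodified, which is what allows further data collection to be prioritized.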

Relevance:

30.00%

Publisher:

Abstract:

Water repellent soils are difficult to irrigate and are susceptible to preferential flow, which enhances the potential for accelerated leaching of hazardous substances to groundwater. Over 5 Mha of Australian soil is water repellent, while treated municipal sewage is increasingly used for irrigation. Repellent soils become wettable only if a critical water content is exceeded. To avoid excessive loss of water from the root zone via preferential flow paths, irrigation schemes should therefore aim to keep the soil wet enough to maintain soil wettability. Our objective was to monitor the near-surface water content and water repellency in a blue gum (Eucalyptus globulus) plantation irrigated with treated sewage. The plantation's sandy soil surface was strongly water repellent when dry. For 4 months, three rows of 15 blue gum trees each received no irrigation, three other rows received 50% of the estimated potential water use minus rainfall, and three more rows received 100%. During this period, 162 soil samples were obtained in three sampling rounds, and their water content (% dry mass) and degree of water repellency were determined. Both high and low irrigation effectively wetted up the soil and eliminated water repellency after 2 (high) or 4 (low) months. A single-peaked distribution of water contents was observed in the soil samples, but the water repellency distribution was dichotomous, with 44% of samples extremely water repellent and 36% wettable. This is consistent with a threshold water content at which a soil sample changes from water repellent to wettable, with spatial variability of this threshold creating a much wider transition zone at the field scale. We characterized this transition zone by expressing the fraction of wettable samples as a function of water content, and demonstrated a way to estimate from this the wettable portion of a field from a number of water content measurements. To keep the plantation soil wettable, the water content must be maintained at a level at which a significant downward flux is likely, with the associated enhanced leaching. At water contents with negligible downward flux, the field is water repellent, and leaching through preferential flow paths is likely. Careful management is needed to resolve these conflicting requirements.
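Expressing the fraction of wettable samples as a function of water content can be sketched by assuming the critical water content varies spatially with some distribution; a normal distribution is used below purely for illustration, and the mu/sigma values are invented rather than fitted to the study's data:

```python
import math

def wettable_fraction(theta, mu=0.05, sigma=0.02):
    """Probability that a sample is wettable at gravimetric water
    content theta, assuming the critical (threshold) water content
    varies spatially as Normal(mu, sigma). mu and sigma are
    illustrative assumptions, not values from the study."""
    z = (theta - mu) / (sigma * math.sqrt(2))
    return 0.5 * (1 + math.erf(z))   # normal CDF via the error function

def field_wettable_portion(water_contents):
    """Estimate the wettable portion of a field from a set of
    point water-content measurements."""
    return sum(wettable_fraction(t) for t in water_contents) / len(water_contents)

# Four hypothetical measurements spanning the transition zone:
print(round(field_wettable_portion([0.02, 0.04, 0.06, 0.09]), 2))
```

The spatial spread of the threshold (sigma) is what widens the sharp per-sample transition into the gradual field-scale transition zone the abstract describes.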

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the appropriateness for application to large data sets of standard machine learning algorithms, which were mainly developed in the context of small data sets. Sampling and parallelisation have proved useful means for reducing computation time when learning from large data sets. However, such methods assume that algorithms that were designed for use with what are now considered small data sets are also fundamentally suitable for large data sets. It is plausible that optimal learning from large data sets requires a different type of algorithm to optimal learning from small data sets. This paper investigates one respect in which data set size may affect the requirements of a learning algorithm — the bias plus variance decomposition of classification error. Experiments show that learning from large data sets may be more effective when using an algorithm that places greater emphasis on bias management, rather than variance management.
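The bias plus variance decomposition of 0/1 classification error can be estimated empirically by training a learner on many resampled training sets and examining the distribution of its predictions at fixed test points. A hedged sketch on a synthetic 1-D problem (the task, the k-NN learner and all sizes are illustrative, not the paper's experimental setup):

```python
import random
from collections import Counter

random.seed(0)

def knn_predict(train, x, k):
    """Majority label of the k nearest 1-D neighbours."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def bias_variance(k, n_train=30, n_trials=50):
    """Estimate bias- and variance-like components of 0/1 error for
    k-NN on a noisy 1-D step problem, in the spirit of the
    Kohavi-Wolpert decomposition. Everything here is illustrative."""
    def sample(n):
        # True label is x > 0.5, flipped with 10% label noise.
        pts = []
        for _ in range(n):
            x = random.random()
            y = (x > 0.5) != (random.random() < 0.1)
            pts.append((x, int(y)))
        return pts

    test_xs = [i / 20 for i in range(21)]
    votes = {x: Counter() for x in test_xs}
    for _ in range(n_trials):
        train = sample(n_train)
        for x in test_xs:
            votes[x][knn_predict(train, x, k)] += 1
    bias = variance = 0.0
    for x in test_xs:
        p1 = votes[x][1] / n_trials      # P(prediction = 1 at x)
        main = int(p1 >= 0.5)            # modal (main) prediction
        bias += (main != int(x > 0.5)) / len(test_xs)
        variance += min(p1, 1 - p1) / len(test_xs)
    return bias, variance

# k = 1 is a low-bias, high-variance learner; a large k trades
# variance for bias:
print(bias_variance(k=1))
print(bias_variance(k=15))
```

The paper's argument is that as the training set grows, variance shrinks naturally, so algorithms that spend their capacity on bias management (like small-k behaviour here) gain the advantage.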

Relevance:

30.00%

Publisher:

Abstract:

In recent years there has been considerable debate about the general decline in the number of students undertaking bachelor degrees and majors in economics. The discussion has stemmed mainly from a supply-side perspective of the economics education market. The goal of this paper is to add another dimension to the debate and report the results of a survey of employers of economics graduates. Drawing on the extensive customer services literature, it is argued that a market-oriented, or demand-side, analysis is also an important component in redressing low student enrolments and retention. A first step in adopting a market-oriented approach is to determine the skills required of economics graduates entering the jobs market. With the support of The Economics Society of Australia, twenty-nine public and private sector employers were surveyed in 2002. The aim of the survey was to establish the demand for economics graduates with bachelor and honours degrees, the skills and knowledge required of these graduates, and the performance of such graduates. The study found that economic knowledge and skill were important to employers. However, the skills rated most important by employers were the more general or 'generic' skills of clear writing, data analysis, interpersonal skills and a practical problem-solving orientation. While graduates generally performed satisfactorily in relation to the economic skills required by employers, this was not the case for generic skills. The survey findings have significant implications for the content and teaching of undergraduate economics programs. This paper outlines these implications and discusses areas for future research. It is argued that such research should aim to utilize both the demand- and supply-side perspectives, with the development of more precise definitions and measurement of the economic skills required by the various stakeholders in the economics education market.

Relevance:

30.00%

Publisher:

Abstract:

Australia is considered the driest populated continent in the world. Despite this, Australians consume the largest amount of water per capita. While little of this water is used for the operation of buildings, buildings are now being designed to use less water. Additionally, rainwater collection and grey water recycling systems offer the potential to significantly reduce demand for fresh water. However, little is known about the water required directly and indirectly by (i.e., embodied in) construction materials and products. Embodied water comprises the water required directly for construction itself and the water consumed indirectly in the production and delivery of materials, products and services to construction. Water required directly for construction is likely to be insignificant compared to the indirect water required for the manufacture of construction materials and products (i.e., through materials and other products required to support construction). There is currently a lack of research into embodied water requirements in the construction sector. The relationship between embodied water and operational water is also unknown, apart from a handful of studies based solely on national average statistics known as 'input-output' data. The aim of this paper is therefore to model the water required directly and indirectly by construction, integrating currently available public-domain industry data with input-output data. The coverage of the industry data relative to the input-output data was evaluated for a typical commercial building, and was found to be very low.
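The input-output side of such a model can be sketched with the Leontief series: total (direct plus indirect) water intensity per dollar of output is the direct intensities propagated through successive upstream purchasing stages. A toy two-sector example with invented coefficients (not data from the paper):

```python
# Toy direct-requirements matrix A (purchases per $ of output) and
# direct water intensities w (kL per $ of output). Values are
# illustrative only. Total intensity = w (I + A + A^2 + ...),
# i.e. w times the Leontief inverse (I - A)^-1.
A = [[0.05, 0.20],
     [0.10, 0.05]]
w = [1.5, 0.4]    # hypothetical sectors: 0 = materials, 1 = services

def total_intensity(w, A, terms=50):
    """Sum the Leontief power series to get total (direct + indirect)
    water intensity per $ of each sector's output."""
    n = len(w)
    total = list(w)           # stage 0: direct water
    stage = list(w)
    for _ in range(terms):    # propagate one upstream stage at a time
        stage = [sum(stage[i] * A[i][j] for i in range(n)) for j in range(n)]
        total = [t + s for t, s in zip(total, stage)]
    return total

print([round(t, 3) for t in total_intensity(w, A)])
```

In a hybrid analysis, the stages covered by industry data would replace the corresponding terms of this series, and the residual terms indicate how much of the total the industry data fails to cover.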

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a theoretical process model and the associated detailed information structure which reflects the complexity of information, stakeholder interaction and intellectual property concerns which are currently seen in the construction industry. This is being developed and tested against a field study renovation project. The field study project identifies information flows and interactions between stakeholders such as designers, project managers, clients, contractors, subcontractors and suppliers. The process model which is being established shows very high levels of complexity in dependencies and interdependencies between implicit and explicit information within the project design and construction teams. Without an understanding of these detailed and complex process interactions, proposals for the application of ICT to the construction industry will not reflect the requirements of those for whom they are being developed.

Relevance:

30.00%

Publisher:

Abstract:

Compulsory superannuation was introduced in Australia in July 1992, and has led to significant growth in funds under management. Reserve Bank of Australia data (2004) show that in September 2004 Australians had AUD$767 billion invested in managed funds. A large portion of this investment is based on the recommendations of financial planners. This paper provides a brief history of the development of the financial services industry in Australia, with particular reference to the development of the role of the financial planner in investment decisions.

The paper focuses in detail on the set of professional skills required by financial planners, given that the widely reported ASIC survey (2003) identified gaps between client expectations and the competencies of financial planners. Birkett (1996) described professional skills as the dominant individual attribute that describes a competent professional. The individual attributes of a financial planner include two categories: cognitive and behavioural skills. The paper provides strong support for the view that financial planning educators should ensure adequate development of behavioural skills to enable financial planners to meet the needs of the investors they serve.

Relevance:

30.00%

Publisher:

Abstract:

Accurate assessment of the fate of salts, nutrients, and pollutants in natural, heterogeneous soils requires a proper quantification of both spatial and temporal solute spreading during solute movement. The number of experiments with multisampler devices that measure solute leaching as a function of space and time is increasing. The breakthrough curve (BTC) can characterize the temporal aspect of solute leaching, and recently the spatial solute distribution curve (SSDC) was introduced to describe the spatial solute distribution. We combined and extended both concepts to develop a tool for the comprehensive analysis of the full spatio-temporal behavior of solute leaching. The sampling locations are ranked in order of descending amount of total leaching (defined as the cumulative leaching from an individual compartment at the end of the experiment), thus collapsing both spatial axes of the sampling plane into one. The leaching process can then be described by a curved surface that is a function of the single spatial coordinate and time. This leaching surface is scaled to integrate to unity and termed S; it can efficiently represent data from multisampler solute transport experiments or simulation results from multidimensional solute transport models. The mathematical relationships between the scaled leaching surface S, the BTC, and the SSDC are established. Any desired characteristic of the leaching process can be derived from S. The analysis was applied to a chloride leaching experiment on a lysimeter with 300 drainage compartments of 25 cm2 each. The sandy soil monolith in the lysimeter exhibited fingered flow in the water-repellent top layer. The observed S demonstrated the absence of a sharp separation between fingers and dry areas, owing to diverging flow in the wettable soil below the fingers. Times to peak, maximum solute fluxes, and total leaching varied more in high-leaching than in low-leaching compartments. This suggests a stochastic–convective transport process in the high-flow streamtubes, while convection–dispersion is predominant in the low-flow areas. S can be viewed as a bivariate probability density function. Its marginal distributions are the BTC of all sampling locations combined, and the SSDC of cumulative solute leaching at the end of the experiment. The observed S cannot be represented by assuming complete independence between its marginal distributions, indicating that S contains information about the leaching process that cannot be derived from the combination of the BTC and the SSDC.
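The construction of the scaled leaching surface S, and the recovery of the BTC and SSDC as its marginals, can be sketched as follows (the flux values are invented; a real application would use the 300-compartment lysimeter data):

```python
# Toy leaching data: rows = sampling compartments, columns = times.
# Each entry is the solute flux from one compartment at one time
# step. All values are illustrative only.
data = [
    [0.0, 0.1, 0.3, 0.1],
    [0.2, 0.6, 0.4, 0.2],
    [0.0, 0.0, 0.1, 0.1],
]

# Rank compartments by total leaching (descending), collapsing the
# two spatial axes of the sampling plane into one ranked coordinate.
ranked = sorted(data, key=sum, reverse=True)

# Scale so the surface S integrates (here: sums) to unity.
grand = sum(sum(row) for row in ranked)
S = [[v / grand for v in row] for row in ranked]

# Marginal over space -> BTC of all sampling locations combined.
btc = [sum(S[i][t] for i in range(len(S))) for t in range(len(S[0]))]
# Marginal over time -> SSDC of cumulative leaching per location.
ssdc = [sum(row) for row in S]

print([round(x, 3) for x in btc])
print([round(x, 3) for x in ssdc])
```

Because S sums to one, it behaves as a bivariate probability density; the test of independence mentioned in the abstract amounts to comparing S[i][t] against ssdc[i] * btc[t].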

Relevance:

30.00%

Publisher:

Abstract:

Tea has been Sri Lanka's major export earner for several decades. However, soil erosion on tea-producing land has had considerable on-site and off-site effects. This study quantifies soil erosion impacts for smallholder tea farms in Sri Lanka by estimating a yield damage function and an erosion damage function using a subjective elicitation technique. The Mitscherlich-Spillman type of function was found to yield acceptable results. The study indicates that high rates of soil erosion require earlier adoption of soil conservation measures than do low rates of erosion. Sensitivity analysis shows the optimum year to change to a conservation practice is very sensitive to the discount rate but less sensitive to the cost of production and price of tea.
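The choice of when to adopt conservation can be sketched as a discounted comparison of yield losses against conservation costs, with a Mitscherlich-Spillman response Y = A(1 - R^x) standing in for the yield damage function (x = remaining topsoil depth). All parameter values below are invented for illustration, and conservation is assumed, simplistically, to halt erosion outright:

```python
A, R = 2000.0, 0.96            # max yield (kg/ha), response parameter
price, cons_cost = 1.0, 300.0  # tea price ($/kg), annual conservation cost

def npv_of_switch(switch_year, erosion=2.0, depth0=60.0,
                  rate=0.08, horizon=30):
    """Net present value over `horizon` years when conservation is
    adopted in `switch_year`. Yield follows the Mitscherlich-Spillman
    form A*(1 - R**depth); all parameters are hypothetical."""
    depth, npv = depth0, 0.0
    for year in range(horizon):
        conserved = year >= switch_year
        revenue = price * A * (1 - R ** depth)
        cost = cons_cost if conserved else 0.0
        npv += (revenue - cost) / (1 + rate) ** year
        if not conserved:
            depth -= erosion    # topsoil (cm) lost this year
    return npv

# The optimum switch year balances earlier conservation costs against
# discounted future yield losses:
best = max(range(31), key=npv_of_switch)
print(best, round(npv_of_switch(best)))
```

Re-running the search with a different `rate` shows the sensitivity the study reports: a higher discount rate shrinks the present value of future yield losses and so pushes the optimal switch later.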

Relevance:

30.00%

Publisher:

Abstract:

Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. It can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud while performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviour by the research subjects. The analysis results may be cross-examined (or validated) through retrospective interviews with the research subjects. This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.