39 results for soil data requirements


Relevance: 30.00%

Publisher:

Abstract:

A hybrid neural network model, based on the fusion of fuzzy adaptive resonance theory (FA ART) and the general regression neural network (GRNN), is proposed in this paper. Both FA and the GRNN are incremental learning systems and are very fast in network training. The proposed hybrid model, denoted GRNNFA, retains these advantages while reducing the computational requirements of calculating and storing kernel information. A clustering version of the GRNN is designed, with FA providing data compression for noise removal. An adaptive gradient-based kernel-width optimization algorithm has also been devised; convergence of the gradient descent algorithm can be accelerated by geometric incremental growth of the updating factor. A series of experiments on four benchmark datasets was conducted to assess the effectiveness of GRNNFA and compare it with other approaches. The GRNNFA model is also employed in a novel application task: predicting the evacuation time of patrons at typical karaoke centers in Hong Kong in the event of fire. The results demonstrate the applicability of GRNNFA to noisy data regression problems.
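The GRNN half of the model admits a very small sketch. The following pure-Python fragment is illustrative only (all names are hypothetical; the fuzzy-ART compression and the adaptive kernel-width optimisation described in the abstract are not reproduced): it shows the core GRNN prediction, a Gaussian-kernel-weighted average of stored targets.

```python
import math

def grnn_predict(x, centers, targets, sigma):
    """GRNN output: a Gaussian-kernel-weighted average of stored targets.
    Training amounts to storing (input, target) pairs, or, as in GRNNFA,
    cluster prototypes produced by the fuzzy ART compression stage."""
    weights = [math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2)) for c in centers]
    return sum(w * t for w, t in zip(weights, targets)) / sum(weights)

# Stored exemplars of y = x**2 (illustrative data)
centers = [0.0, 1.0, 2.0, 3.0]
targets = [0.0, 1.0, 4.0, 9.0]
print(grnn_predict(1.0, centers, targets, sigma=0.3))  # close to 1.0
```

Note that prediction cost grows with the number of stored kernels, which is exactly why compressing the exemplars into a smaller set of cluster prototypes pays off.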

Relevance: 30.00%

Publisher:

Abstract:

The steady increase in regulations, and its acceleration due to the financial crisis, heavily affects the management of regulatory compliance. Regulations such as Basel III and Solvency II particularly impact data warehouses and lead to many organizational and technical changes. From an IS perspective, modeling techniques for data warehouse requirement elicitation help to manage conceptual requirements. From a legal perspective, attempts to visualize regulatory requirements (so-called legal visualization approaches) have been developed. This paper investigates whether a conceptual modeling technique for regulatory-driven data warehouse requirements is applicable to representing data warehouse requirements in a legal environment. Applying the modeling technique H2 for Reporting in three extensive modeling projects yields three contributions. First, evidence for the applicability of a modeling technique for regulatory-driven data warehouse requirements is given. Second, lessons learned for further modeling projects are provided. Third, a discussion towards a combined perspective of information modeling and legal visualization is presented.

Relevance: 30.00%

Publisher:

Abstract:

The effect of climate change on the shallow expansive foundation conditions of residential dwellings is costing several hundred billion dollars worldwide. The design and cost of constructing or repairing residential footings are greatly influenced by the degree of ground movement, which is driven by the magnitude of change in soil moisture. The impacts of climate change on urban infrastructure are expected to include accelerated degradation of materials and foundations of buildings and facilities, increased ground movement, changes in groundwater affecting the chemical structure of foundations, and fatigue of structures from extreme storm events. Previous research found that residential houses built less than five years earlier had suffered major cracks and other damage caused by slab movement after record rainfall. The Thornthwaite Moisture Index (TMI) categorises climate on the basis of rainfall, temperature, potential evapotranspiration and the water-holding capacity of the soil. Originally, TMI was mainly used to map soil moisture conditions for agriculture, but it soon became a method to predict pavement and foundation changes. A few researchers have developed TMI maps for Australia, but generally their accuracy is low or unknown and their use is limited. The aims of this paper are: (1) to produce accurate maps of TMI for the state of Victoria over 100 years (1913 to 2012), in 20-year periods, using long-term historical climatic data and advanced spatial statistics methods in GIS; and (2) to analyse the spatial and temporal changes of TMI in Victoria. Preliminary results suggest that a better understanding of climate change through long-term TMI mapping can assist urban planning and guide construction regulations towards the development of more resilient cities.
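For reference, one widely used simplified form of the index can be computed from annual precipitation and potential evapotranspiration. The abstract does not state which TMI formulation the authors use, so treat this sketch as an assumption rather than their method:

```python
def tmi(annual_precip_mm, annual_pet_mm):
    """Simplified Thornthwaite Moisture Index: 100 * (P - PET) / PET.
    Positive values indicate a moisture surplus (humid climates),
    negative values a deficit (semi-arid to arid climates)."""
    return 100.0 * (annual_precip_mm - annual_pet_mm) / annual_pet_mm

print(tmi(650.0, 1000.0))   # -35.0, a moisture deficit
print(tmi(1200.0, 1000.0))  # 20.0, a moisture surplus
```

Mapping the index over 20-year periods then amounts to evaluating this quantity per climate station per period and interpolating spatially.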

Relevance: 30.00%

Publisher:

Abstract:

Goat fibre production is affected by genetic and environmental influences. The environmental influences reviewed here include bio-geophysical factors (photoperiod, the climate-herbage system and soil-plant trace nutrient composition), nutritional factors and management factors. Nutrition and management influences discussed include stocking rate, supplementary feeding of energy and protein, liveweight change, parturition and management during shearing. While experimental data suggest that the effects of seasonal photoperiod on the growth of mohair and cashmere are large, these results may have confounded changes in temperature with photoperiod. Nutritional variation within and among years is the most important climatic factor influencing mohair and cashmere production and quality. Mohair quality and growth are affected significantly by stocking rate and, during periods of liveweight loss, by supplementary feeding of either energy or protein. Strategic use of supplements, methods for rapid introduction of cereal grains, the influence of dietary roughage on intake and the economics of supplementary feeding are discussed. Cashmere production of young, low-producing goats does not appear to be affected by energy supplementation, but large responses to energy supplementation have been measured in more productive cashmere goat strains. The designs of these cashmere nutrition experiments are reviewed, as is evidence for the hypothesis that energy-deprived cashmere goats divert nutrients preferentially to cashmere growth. The influence and potential use of liveweight manipulation in affecting mohair and cashmere production and quality are described. Estimates of the energy requirements for the maintenance of fibre goats and the effects of pregnancy and lactation on mohair and cashmere growth are summarised. The effects and importance of management and hygiene during fibre harvesting (shearing) in producing quality fibre are emphasised.

The review concludes that it is important to assess the results of scientific experiments within the total environmental context in which they were conducted. It supports the view that scientific experiments should use control treatments appropriate to the environment under study, as well as controls relevant to other environments. In Mediterranean and annual temperate environments, appropriate controls are liveweight-loss and liveweight-maintenance treatments. Mohair producers must graze goats at moderate stocking rates to maximise animal welfare, but in so doing they will produce heavier goats and coarser mohair. In Mediterranean and annual temperate environments, seasonal changes in liveweight are large and influence both the quality and production of mohair and cashmere. Mohair and cashmere producers can manipulate liveweight by supplementary feeding of energy during dry seasons to minimise liveweight loss, but the economics of such feeding need to be carefully examined. Strategic benefits can be obtained by enhancing the growth of young does prior to mating, and for higher-producing cashmere goats.

Relevance: 30.00%

Publisher:

Abstract:

Big data presents a remarkable opportunity for organisations to obtain critical intelligence to drive decisions and gain insights as never before. However, big data generates high network traffic, and the continuous growth in the variety of network traffic due to big data variety has made the network one of the key big data challenges. In this article, we present a comprehensive analysis of big data variety and its adverse effects on network performance. We present a taxonomy of big data variety and discuss the various dimensions of its features, including how those features influence interconnection network requirements. Finally, we discuss some of the challenges each dimension of big data variety presents and possible approaches to address them.

Relevance: 30.00%

Publisher:

Abstract:

The discovery of contexts is important for context-aware applications in pervasive computing. This is a challenging problem because of the streaming nature of the data and the complexity and changing nature of contexts. We propose a Bayesian nonparametric model for the detection of co-location contexts from Bluetooth signals. By using an Indian buffet process as the prior distribution, the model can discover the number of contexts automatically. We introduce a novel fixed-lag particle filter that processes data incrementally; this sampling scheme is especially suitable for pervasive computing because the computational requirements remain constant in spite of growing data. We examine our model on a synthetic dataset and two real-world datasets. To verify the discovered contexts, we compare them to the communities detected by the Louvain method, showing a strong correlation between the results of the two methods. The fixed-lag particle filter is compared with Gibbs sampling in terms of normalized factorization error, which shows comparable performance between the two inference methods. As the fixed-lag particle filter processes each small chunk of data as it arrives and does not need to be restarted, its execution time is significantly shorter than that of Gibbs sampling.
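The role of the Indian buffet process prior can be illustrated generatively. This pure-Python sketch shows only the prior's "restaurant" construction, not the paper's full model or its particle-filter inference, and all names are hypothetical: it draws a binary matrix whose number of columns (latent contexts) is determined by the draw itself rather than fixed in advance.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's inversion method; valid for lam > 0
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def sample_ibp(num_customers, alpha, seed=0):
    """Draw one binary feature matrix from the Indian buffet process.
    Row i is an observation; column k a latent feature ("context").
    Customer i takes existing dish k with probability m_k / i, then
    tries Poisson(alpha / i) new dishes."""
    rng = random.Random(seed)
    counts = []   # how many customers have taken each existing dish
    rows = []
    for i in range(1, num_customers + 1):
        row = []
        for k in range(len(counts)):
            take = 1 if rng.random() < counts[k] / i else 0
            counts[k] += take
            row.append(take)
        for _ in range(sample_poisson(alpha / i, rng)):
            counts.append(1)
            row.append(1)
        rows.append(row)
    width = len(counts)   # pad early rows: later dishes did not exist yet
    return [r + [0] * (width - len(r)) for r in rows]

matrix = sample_ibp(num_customers=10, alpha=2.0)
print(len(matrix[0]), "contexts discovered for 10 observations")
```

The key property for context discovery is visible here: the column count is random and grows (slowly, roughly as alpha times the harmonic number) with the amount of data, so the model never needs the number of contexts specified up front.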

Relevance: 30.00%

Publisher:

Abstract:

The need to estimate a particular quantile of a distribution is an important problem that frequently arises in many computer vision and signal processing applications. For example, our work was motivated by the requirements of many semi-automatic surveillance analytics systems which detect abnormalities in closed-circuit television (CCTV) footage using statistical models of low-level motion features. In this paper, we specifically address the problem of estimating the running quantile of a data stream with non-stationary stochasticity when the memory for storing observations is limited. We make several major contributions: (i) we derive an important theoretical result which shows that the change in the quantile of a stream is constrained regardless of the stochastic properties of the data; (ii) we describe a set of high-level design goals for an effective estimation algorithm that emerge as a consequence of our theoretical findings; (iii) we introduce a novel algorithm which implements these design goals by retaining a sample of data values in a manner adaptive to changes in the distribution of the data and by progressively narrowing its focus during periods of quasi-stationary stochasticity; and (iv) we present a comprehensive evaluation of the proposed algorithm, comparing it with the existing methods in the literature on both synthetic data sets and three large 'real-world' streams acquired in the course of operation of an existing commercial surveillance system. Our findings convincingly demonstrate that the proposed method is highly successful and vastly outperforms the existing alternatives, especially when the target quantile is high and the available buffer capacity is severely limited.
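The memory-limited setting can be made concrete with a simple baseline. This sketch is not the adaptive algorithm contributed in the paper (names are illustrative): it keeps a uniform reservoir sample of the stream and reports the empirical quantile of that sample, which is exactly the kind of non-adaptive estimator the paper's method improves on.

```python
import random

class ReservoirQuantile:
    """Bounded-memory running quantile baseline: keep a uniform
    reservoir sample of the stream (Vitter's Algorithm R) and report
    the empirical quantile of the sample."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buf = []
        self.n = 0
        self.rng = random.Random(seed)

    def update(self, x):
        self.n += 1
        if len(self.buf) < self.capacity:
            self.buf.append(x)
        else:
            j = self.rng.randrange(self.n)  # keep x with prob capacity/n
            if j < self.capacity:
                self.buf[j] = x

    def quantile(self, q):
        s = sorted(self.buf)
        return s[min(int(q * len(s)), len(s) - 1)]

est = ReservoirQuantile(capacity=100)
for x in range(10000):
    est.update(float(x))
print(est.quantile(0.95))  # near 9500; a few hundred of sampling error is expected
```

With only 100 retained values, high quantiles such as 0.95 rest on very few order statistics, which is why high target quantiles under tight buffers are the hard case highlighted in the evaluation.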

Relevance: 30.00%

Publisher:

Abstract:

The need to estimate a particular quantile of a distribution is an important problem that frequently arises in many computer vision and signal processing applications. For example, our work was motivated by the requirements of many semiautomatic surveillance analytics systems that detect abnormalities in closed-circuit television footage using statistical models of low-level motion features. In this paper, we specifically address the problem of estimating the running quantile of a data stream when the memory for storing observations is limited. We make several major contributions: 1) we highlight the limitations of approaches previously described in the literature that make them unsuitable for non-stationary streams; 2) we describe a novel principle for utilizing the available storage space; 3) we introduce two novel algorithms that exploit the proposed principle in different ways; and 4) we present a comprehensive evaluation and analysis of the proposed algorithms and the existing methods in the literature on both synthetic data sets and three large real-world streams acquired in the course of operation of an existing commercial surveillance system. Our findings convincingly demonstrate that both of the proposed methods are highly successful and vastly outperform the existing alternatives. We show that the better of the two algorithms (the data-aligned histogram) exhibits far superior performance in comparison with previously described methods, achieving more than 10 times lower estimation errors on real-world data, even when its available working memory is an order of magnitude smaller.
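The histogram idea can be sketched in a deliberately simplified form. This fragment uses fixed, preset bin edges; the data-aligned histogram in the paper instead places bin boundaries adaptively on observed data values, which this sketch does not attempt.

```python
class HistogramQuantile:
    """Quantile estimation in fixed memory via a histogram with preset
    bin edges. (A simplification: the data-aligned histogram in the
    paper places bin boundaries adaptively on observed data values.)"""

    def __init__(self, lo, hi, bins):
        self.lo, self.hi, self.bins = lo, hi, bins
        self.counts = [0] * bins
        self.n = 0

    def update(self, x):
        i = int((x - self.lo) / (self.hi - self.lo) * self.bins)
        self.counts[max(0, min(i, self.bins - 1))] += 1
        self.n += 1

    def quantile(self, q):
        # walk the bins until the cumulative count reaches q * n,
        # then return the upper edge of that bin
        target = q * self.n
        run = 0
        width = (self.hi - self.lo) / self.bins
        for i, c in enumerate(self.counts):
            run += c
            if run >= target:
                return self.lo + (i + 1) * width
        return self.hi

h = HistogramQuantile(lo=0.0, hi=10000.0, bins=50)
for x in range(10000):
    h.update(float(x))
print(h.quantile(0.95))  # 9600.0: bin resolution caps accuracy at width 200
```

Memory stays proportional to the bin count however long the stream runs, but accuracy is capped by bin width; aligning bin boundaries with the data is one way to spend that fixed budget more effectively.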

Relevance: 30.00%

Publisher:

Abstract:

The emerging field of blue carbon science is seeking cost-effective ways to estimate the organic carbon content of soils bound by coastal vegetated ecosystems. Organic carbon (Corg) content in terrestrial soils and marine sediments has been correlated with mud content (i.e. silt and clay); however, empirical tests of this relationship are lacking for coastal vegetated ecosystems. Here, we compiled data (n = 1345) on the relationship between Corg and mud (i.e. silt and clay, particle sizes <63 μm) contents in seagrass ecosystems (79 cores) and adjacent bare sediments (21 cores) to address whether mud can be used to predict soil Corg content. We also combined these data with the δ13C signatures of the soil Corg to understand the sources of the Corg stores. The results showed that mud is positively correlated with soil Corg content only when the contribution of seagrass-derived Corg to the sedimentary Corg pool is relatively low, such as in small, fast-growing meadows of the genera Zostera, Halodule and Halophila, and in bare sediments adjacent to seagrass ecosystems. In large, long-lived seagrass meadows of the genera Posidonia and Amphibolis there was a poor relationship, or none, between mud and soil Corg content, related to the higher contribution of seagrass-derived Corg to the sedimentary Corg pool in these meadows. Relatively high soil Corg contents at relatively low mud contents (i.e. mud-Corg saturation), together with significant allochthonous inputs of terrestrial organic matter, could disrupt the correlation expected between soil Corg and mud contents. This study shows that mud (i.e. silt and clay) content is not a universal proxy for blue carbon content in seagrass ecosystems, and therefore should not be applied generally across all seagrass habitats. Mud content can only be used as a proxy to estimate soil Corg content for scaling-up purposes when opportunistic and/or low-biomass seagrass species (i.e. Zostera, Halodule and Halophila) are present (explaining 34 to 91% of the variability), and in bare sediments (explaining 78% of the variability).
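At its base, the mud-to-Corg proxy being evaluated is a simple regression. This pure-Python sketch fits the kind of line such a proxy relies on; the numbers are made-up illustrative values, not data from the study.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of y = a + b * x, the kind of
    mud-to-Corg relationship the study evaluates."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Made-up illustrative values (% mud, % Corg), not data from the study
mud = [10.0, 25.0, 40.0, 60.0, 80.0]
corg = [0.4, 0.7, 1.1, 1.5, 2.0]
a, b = fit_line(mud, corg)
print(f"Corg = {a:.2f} + {b:.3f} * mud")  # Corg = 0.16 + 0.023 * mud
```

The study's point is that the explained variance of such a fit ranges from usable (34 to 91% for opportunistic genera, 78% for bare sediments) to negligible (Posidonia and Amphibolis meadows), so the fit must be validated per habitat before being used for scaling up.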