904 results for cacao trees

Relevance:

10.00%

Publisher:

Abstract:

As the graphics race subsides and gamers grow weary of predictable, deterministic game characters, game developers must put aside their "old faithful" finite state machines and look to more advanced techniques that give users the gaming experience they crave. The next industry breakthrough will come from characters that behave realistically and can learn and adapt, rather than from more polygons, higher-resolution textures and more frames per second. This paper explores the artificial intelligence techniques currently used by game developers, as well as techniques that are new to the industry: finite state machines, scripting, agents, flocking, fuzzy logic and fuzzy state machines, decision trees, neural networks, genetic algorithms and extensible AI. The paper introduces each of these techniques, explains how they can be applied to games, and shows how commercial games currently make use of them. Finally, the effectiveness of these techniques and their future role in the industry are evaluated.
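As a concrete illustration of the "old faithful" technique the paper starts from, a finite state machine reduces to a transition table over states and events. The states and events below (a patrolling guard NPC) are hypothetical examples, not taken from any particular game:

```python
from enum import Enum, auto

class State(Enum):
    PATROL = auto()
    CHASE = auto()
    FLEE = auto()

# Transition table: (current state, event) -> next state.
TRANSITIONS = {
    (State.PATROL, "player_spotted"): State.CHASE,
    (State.CHASE, "player_lost"): State.PATROL,
    (State.CHASE, "low_health"): State.FLEE,
    (State.FLEE, "health_restored"): State.PATROL,
}

def step(state, event):
    """Return the next state; stay in the current state if no transition matches."""
    return TRANSITIONS.get((state, event), state)

s = State.PATROL
s = step(s, "player_spotted")   # now State.CHASE
s = step(s, "low_health")       # now State.FLEE
```

The deterministic lookup is exactly what makes such characters predictable: the same event in the same state always yields the same behaviour.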


Trees, shrubs and other vegetation are of continued importance to the environment and our daily life. They provide shade around our roads and houses, offer a habitat for birds and wildlife, and absorb air pollutants. However, vegetation touching power lines is a risk to public safety and the environment, and one of the main causes of power supply problems. Vegetation management, which includes tree trimming and vegetation control, is a significant cost component of maintaining electrical infrastructure. For example, Ergon Energy, the Australian energy distributor with the largest geographic footprint, currently spends over $80 million a year inspecting and managing vegetation that encroaches on power line assets. Currently, most vegetation management programs for distribution systems rely on calendar-based ground patrols. However, calendar-based inspection by linesmen is labour-intensive, time-consuming and expensive. It also results in some zones being trimmed more frequently than needed and others not cut often enough. Moreover, it is seldom practicable to measure all the plants around power line corridors by field methods. Remote sensing data captured from airborne sensors has great potential to assist vegetation management in power line corridors. This thesis presents a comprehensive study on using spiking neural networks in a specific image analysis application: power line corridor monitoring. Theoretically, the thesis focuses on a biologically inspired spiking cortical model: the pulse coupled neural network (PCNN). The original PCNN model was simplified in order to better analyze the pulse dynamics and control the performance. New and effective algorithms were developed on top of the proposed spiking cortical model for object detection, image segmentation and invariant feature extraction. The developed algorithms were evaluated in a number of experiments using real image data collected from our flight trials.
The experimental results demonstrated the effectiveness and advantages of spiking neural networks in image processing tasks. Operationally, the knowledge gained from this research project offers a good reference to our industry partner (Ergon Energy) and other energy utilities that want to improve their vegetation management activities. The novel approaches described in this thesis show the potential of using cutting-edge sensor technologies and intelligent computing techniques to improve power line corridor monitoring. The lessons learnt from this project are also expected to increase the confidence of energy companies to move from a traditional vegetation management strategy to a more automated, accurate and cost-effective solution using aerial remote sensing techniques.
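The pulse dynamics of a simplified PCNN can be sketched in a few lines: each neuron's internal activity is its stimulus modulated by neighbouring pulses, and firing raises its own dynamic threshold, so pixels of similar intensity fire at similar times (a segmentation cue). The parameter values below are made up for illustration; this is not the thesis's actual model:

```python
import numpy as np

def simplified_pcnn(image, beta=0.2, v_theta=20.0, alpha_theta=0.3, steps=10):
    """Minimal simplified PCNN: returns the iteration at which each neuron
    first fires. Similar intensities produce similar firing times."""
    F = image.astype(float)               # feeding input = pixel intensity
    Y = np.zeros_like(F)                  # pulse output of the previous step
    theta = np.full_like(F, F.max())      # dynamic threshold, decays each step
    fire_time = np.zeros_like(F)
    for n in range(1, steps + 1):
        # Linking input: sum of pulses in the 3x3 neighbourhood (torus edges).
        L = sum(np.roll(np.roll(Y, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - Y
        U = F * (1 + beta * L)            # internal activity
        Y = (U > theta).astype(float)     # fire where activity beats threshold
        fire_time[(Y > 0) & (fire_time == 0)] = n
        theta = theta * np.exp(-alpha_theta) + v_theta * Y
    return fire_time
```

On a toy image with a bright block on a dark background, the bright pixels fire first and the background fires several iterations later, giving a two-region firing-time map.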


In recent years, the effect of ions and ultrafine particles on ambient air quality and human health has been well documented; however, knowledge about their sources, concentrations and interactions within different types of urban environments remains limited. This thesis presents the results of numerous field studies aimed at quantifying variations in ion concentration with distance from the source, as well as identifying the dynamics of the particle ionisation processes that lead to the formation of charged particles in the air. In order to select the most appropriate measurement instruments and locations for the studies, a literature review was also conducted of studies reporting ion and ultrafine particle emissions from different sources in a typical urban environment. The initial study involved laboratory experiments on the attachment of ions to aerosols, so as to gain a better understanding of the interaction between ions and particles. This study determined the efficiency of corona ions at charging and removing particles from the air, as a function of different particle number and ion concentrations. The results showed that particle number loss was directly proportional to particle charge concentration, and that higher small ion concentrations led to higher particle deposition rates in all size ranges investigated. Nanoparticle numbers were also observed to decrease with increasing particle charge concentration, due to their higher Brownian mobility and subsequent attachment to charged particles. Given that corona discharge from high voltage powerlines is considered one of the major ion sources in urban areas, a detailed study was then conducted under three parallel overhead powerlines, with a steady wind blowing in a direction perpendicular to the lines.
The results showed that large sections of the lines did not produce any corona at all, while strong positive emissions were observed from discrete components, such as a particular set of spacers on one of the lines. Measurements were also conducted at eight upwind and downwind points perpendicular to the powerlines, spanning a total distance of about 160 m. The maximum positive small and large ion concentrations and DC electric field were observed at a point 20 m downwind from the lines, with median values of 4.4×10³ cm⁻³, 1.3×10³ cm⁻³ and 530 V m⁻¹, respectively. It was estimated that, at this point, less than 7% of the total number of particles was charged. The electrical parameters decreased steadily with increasing downwind distance from the lines but remained significantly higher than background levels at the limit of the measurements. Vehicles are also among the most prevalent ion- and particle-emitting sources in urban environments, and therefore experiments were conducted behind a motor vehicle exhaust pipe and near busy motorways, with the aim of quantifying small ion and particle charge concentrations, as well as their distribution as a function of distance from the source. The study found approximately equal numbers of positive and negative ions in the vehicle exhaust plume, as well as near motorways, with heavy duty vehicles believed to be the main contributor. In addition, cluster ion concentration was observed to decrease rapidly within the first 10-15 m from the road, and ion-ion recombination and ion-aerosol attachment were the most likely causes of ion depletion, rather than dilution and turbulence related processes. Beyond these dominant ion sources, other sources also exist within urban environments where intensive human activities take place.
In this part of the study, airborne concentrations of small ions, particles and net particle charge were measured at 32 different outdoor sites in and around Brisbane, Australia, classified into seven groups: park, woodland, city centre, residential, freeway, powerlines and power substation. Whilst the study confirmed that powerlines, power substations and freeways were the main ion sources in an urban environment, it also suggested that not all powerlines emit ions, only those with discrete corona discharge points. In addition to the main ion sources, higher ion concentrations were also observed in environments affected by vehicle traffic and human activities, such as the city centre and residential areas. A considerable number of ions were also observed in a woodland area, and it is still unclear whether they were emitted directly from the trees or originated from some other local source. Overall, it was found that different types of environments had different types of ion sources, which could be classified as unipolar or bipolar particle sources, as well as ion sources that co-exist with particle sources. In general, fewer small ions were observed at sites with co-existing sources; however, particle charge was often higher due to the effect of ion-particle attachment. In summary, this study quantified ion concentrations in typical urban environments, identified major charge sources in urban areas, and determined the spatial dispersion of ions as a function of distance from the source, as well as their controlling factors. The study also presented ion-aerosol attachment efficiencies under high ion concentration conditions, both in the laboratory and in real outdoor environments. The outcomes of these studies addressed the aims of this work and advanced understanding of the charge status of aerosols in the urban environment.


In recent years, problems resulting from unsustainable subdivision development have become significant in the Bangkok Metropolitan Region (BMR), Thailand. A number of government departments and agencies have tried to address these problems by introducing rating tools that encourage higher levels of sustainability in subdivision development in BMR, such as the Environmental Impact Assessment Monitoring Award (EIA-MA) and the Thai's Rating for Energy and Environmental Sustainability of New construction and major renovation (TREES-NC). However, the EIA-MA includes neighbourhood design in its assessment criteria only for large projects, while TREES-NC focuses only on large-scale buildings, such as condominiums and office buildings, and is not specific to subdivision neighbourhood design. Recently, a new rating tool named the "Rating for Subdivision Neighbourhood Sustainability Design (RSNSD)" has been developed, and its validation is still required. This paper aims to validate this new rating tool for subdivision neighbourhood design in BMR. The RSNSD has been validated by applying it to eight case study subdivisions. The RSNSD results, generated from data collected by surveying the subdivisions, are compared with the existing results from the EIA-MA. The selected cases include one "Excellent Award", two "Very Good Award" and five non-rated subdivision developments. This paper aims to establish the credibility of RSNSD before it is introduced into real subdivision development practice. The RSNSD could be useful for encouraging higher levels of sustainable subdivision design, and thereby preventing problems in further subdivision development in BMR.


A new stormwater quality improvement device (SQID) called the 'Green Gully' has been designed and developed in this study, with the aim of reusing stormwater to irrigate plants and trees. The main purpose of the Green Gully is to collect road runoff/stormwater, make it suitable for irrigation, and provide an automated network system for watering roadside plants and irrigation areas. This paper presents the design and development of the Green Gully, along with experimental and computational investigations of its performance. Performance (in the form of efficiency, i.e. the percentage of water flowing through the gully grate) was first determined experimentally using a gully model in the laboratory; a three-dimensional numerical model was then developed and simulated to predict the efficiency of the Green Gully as a function of flow rate. The Computational Fluid Dynamics (CFD) code FLUENT was used for the simulation, with GAMBIT used for geometry creation and mesh generation. The predicted efficiency was compared with the laboratory-measured efficiency, and the simulated results were found to be in good agreement with the experimental results.


With the growing number of XML documents on the Web, it becomes essential to organise these documents effectively in order to retrieve useful information from them. A possible solution is to apply clustering to the XML documents to discover knowledge that promotes effective data management, information retrieval and query processing. However, many issues arise in discovering knowledge from these semi-structured documents due to their heterogeneity and structural irregularity. Most existing research on clustering techniques focuses on only one feature of XML documents, either their structure or their content, due to scalability and complexity problems. The knowledge gained in the form of clusters based on structure or content alone is not suitable for real-life datasets. It therefore becomes essential to include both the structure and the content of XML documents in order to improve the accuracy and meaning of the clustering solution. However, including both kinds of information in the clustering process imposes a huge overhead on the underlying clustering algorithm because of the high dimensionality of the data. The overall objective of this thesis is to address these issues by: (1) proposing methods that utilise frequent pattern mining techniques to reduce the dimensionality; (2) developing models to effectively combine the structure and content of XML documents; and (3) utilising the proposed models in clustering. This research first determines structural similarity in the form of frequent subtrees, and then uses these frequent subtrees to represent the constrained content of the XML documents in order to determine content similarity. A clustering framework with two types of models, implicit and explicit, is developed. The implicit model uses a Vector Space Model (VSM) to combine the structure and the content information.
The explicit model uses a higher-order model, namely a 3rd-order Tensor Space Model (TSM), to explicitly combine the structure and the content information. This thesis also proposes a novel incremental technique to decompose large-sized tensor models and uses the decomposed solution for clustering the XML documents. The proposed framework and its components were extensively evaluated on several real-life datasets exhibiting extreme characteristics, to understand the usefulness of the proposed framework in real-life situations. Additionally, this research evaluates the outcome of the clustering process on the collection selection problem in information retrieval, using the Wikipedia dataset. The experimental results demonstrate that the proposed frequent pattern mining and clustering methods outperform related state-of-the-art approaches. In particular, the proposed framework of utilising frequent structures to constrain the content shows an improvement in accuracy over content-only and structure-only clustering results. Scalability experiments conducted on large-scale datasets clearly show the strengths of the proposed methods over state-of-the-art methods. In particular, this thesis contributes to effectively combining the structure and the content of XML documents for clustering, in order to improve the accuracy of the clustering solution. It also contributes by addressing research gaps in frequent pattern mining, generating efficient and concise frequent subtrees with various node relationships that can be used in clustering.
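The implicit model's idea of combining structure and content in a single Vector Space Model can be sketched as follows: a frequent-subtree indicator vector is concatenated with a term-frequency vector, and documents are compared by cosine similarity. The feature names, weighting scheme and α parameter are illustrative assumptions, not the thesis's actual formulation:

```python
import math

def combined_vector(subtree_flags, term_counts, vocab, alpha=0.5):
    """Concatenate a frequent-subtree indicator vector (structure) with a
    term-frequency vector (content), weighted by alpha and 1 - alpha."""
    content = [term_counts.get(t, 0) for t in vocab]
    return [alpha * x for x in subtree_flags] + [(1 - alpha) * c for c in content]

def cosine(u, v):
    """Cosine similarity between two combined document vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Two documents sharing subtrees and terms are closer than unrelated ones.
vocab = ["price", "title", "author"]
a = combined_vector([1, 1, 0], {"price": 2, "title": 1}, vocab)
b = combined_vector([1, 1, 0], {"price": 1, "title": 2}, vocab)
c = combined_vector([0, 0, 1], {"author": 3}, vocab)
```

A clustering algorithm (e.g. k-means) would then operate on these combined vectors; the dimensionality reduction comes from using only *frequent* subtrees and the content they constrain.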


Accurate and detailed road models play an important role in a number of geospatial applications, such as infrastructure planning, traffic monitoring and driver assistance systems. In this thesis, an integrated approach for the automatic extraction of precise road features from high-resolution aerial images and LiDAR point clouds is presented. A framework for road information modeling is proposed, for rural and urban scenarios respectively, and an integrated system has been developed for road feature extraction using image and LiDAR analysis. For road extraction in rural regions, a hierarchical image analysis is first performed to maximize the exploitation of road characteristics at different resolutions. The rough locations and directions of roads are provided by road centerlines detected in low-resolution images, both of which can be further employed to facilitate road information generation in high-resolution images. Histogram thresholding is then used to classify road details in high-resolution images, with color space transformation used for data preparation. After road surface detection, anisotropic Gaussian and Gabor filters are employed to enhance road pavement markings while suppressing other ground objects, such as vegetation and houses. Pavement markings are then extracted from the filtered image using Otsu's clustering-based thresholding method. The final road model is generated by superimposing the lane markings on the road surfaces, and the digital terrain model (DTM) produced from LiDAR data can also be incorporated to obtain a 3D road model. As road extraction in urban areas is greatly affected by buildings, shadows, vehicles and parking lots, we combine high-resolution aerial images and dense LiDAR data to fully exploit the precise spectral and horizontal spatial resolution of aerial images and the accurate vertical information provided by airborne LiDAR.
Object-oriented image analysis methods are employed for feature classification and road detection in the aerial images. In this process, we first utilize an adaptive mean shift (MS) segmentation algorithm to segment the original images into meaningful object-oriented clusters. The support vector machine (SVM) algorithm is then applied to the MS-segmented image to extract road objects. The road surface detected in LiDAR intensity images is taken as a mask to remove the effects of shadows and trees. In addition, the normalized DSM (nDSM) obtained from LiDAR is employed to filter out other above-ground objects, such as buildings and vehicles. The proposed road extraction approaches are tested using rural and urban datasets respectively. The rural road extraction method is evaluated using pan-sharpened aerial images of the Bruce Highway, Gympie, Queensland, while the road extraction algorithm for urban regions is tested using datasets of Bundaberg that combine aerial imagery and LiDAR data. Quantitative evaluation of the extracted road information has been carried out for both datasets. The experiments and evaluation results using the Gympie datasets show that more than 96% of the road surfaces and over 90% of the lane markings are accurately reconstructed, with false alarm rates for road surfaces and lane markings below 3% and 2% respectively. For the urban test sites of Bundaberg, more than 93% of the road surface is correctly reconstructed, and the mis-detection rate is below 10%.
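Otsu's method, used above to extract pavement markings from the filtered image, picks the gray-level threshold that maximises the between-class variance of the two resulting pixel classes. A minimal sketch (not the thesis's implementation):

```python
import numpy as np

def otsu_threshold(gray):
    """Exhaustively search 0..255 for the threshold maximising
    between-class variance w0 * w1 * (mu0 - mu1)^2."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)   # sum of all pixel values
    best_t, best_var = 0, 0.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]                        # pixels at or below t
        if w0 == 0:
            continue
        w1 = total - w0                      # pixels above t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal intensity distribution (bright lane markings against darker pavement), the returned threshold separates the two modes without any manually tuned parameter, which is why it suits batches of images with varying illumination.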


With the wide diffusion of Business Process Management (BPM) automation suites, the possibility of managing process-related risks arises. This paper introduces an innovative framework for process-related risk management and describes a working implementation realized by extending the YAWL system. The framework covers three aspects of risk management: risk monitoring, risk prevention and risk mitigation. Risk monitoring is provided through a sensor-based architecture, where sensors are defined at design time and used at run time for monitoring purposes. Risk prevention is provided in the form of suggestions about what should be executed, by whom and how, through the use of decision trees. Finally, risk mitigation is provided as a sequence of remedial actions (e.g. reallocating, skipping or rolling back a work item) that should be executed to restore the process to a normal situation.
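The decision-tree-based prevention suggestions can be pictured as a walk over the attributes of the current work item. The tree below (workload and resource experience) is a made-up example for illustration, not one of the framework's actual trees:

```python
# A node is either a leaf (the suggestion string) or a pair
# (attribute, {attribute value: subtree}).
RISK_TREE = (
    "workload", {
        "high": ("experience", {
            "senior": "allocate to this resource",
            "junior": "reallocate to a less loaded senior resource",
        }),
        "low": "allocate to this resource",
    },
)

def suggest(tree, case):
    """Walk the decision tree using the work item's attributes
    until a leaf suggestion is reached."""
    while isinstance(tree, tuple):
        attribute, branches = tree
        tree = branches[case[attribute]]
    return tree

print(suggest(RISK_TREE, {"workload": "high", "experience": "junior"}))
# prints: reallocate to a less loaded senior resource
```

In the real framework such trees would be evaluated against sensor data gathered at run time, turning monitored risk indicators into concrete allocation advice.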


Divergence dating studies, which combine temporal data from the fossil record with branch length data from molecular phylogenetic trees, represent a rapidly expanding approach to understanding the history of life. The National Evolutionary Synthesis Center hosted the first Fossil Calibrations Working Group (3–6 March 2011, Durham, NC, USA), bringing together palaeontologists, molecular evolutionists and bioinformatics experts to present perspectives from the disciplines that generate, model and use fossil calibration data. Presentations and discussions focused on channels for interdisciplinary collaboration; best practices for justifying, reporting and using fossil calibrations; and roadblocks to the synthesis of palaeontological and molecular data. Bioinformatics solutions were proposed, with the primary objective being a new database of vetted fossil calibrations, with linkages to existing resources, targeted for a 2012 launch.


Australasian marsupials include three major radiations, the insectivorous/carnivorous Dasyuromorphia, the omnivorous bandicoots (Peramelemorphia), and the largely herbivorous diprotodontians. Morphologists have generally considered the bandicoots and diprotodontians to be closely related, most prominently because they are both syndactylous (with the 2nd and 3rd pedal digits being fused). Molecular studies have been unable to confirm or reject this Syndactyla hypothesis. Here we present new mitochondrial (mt) genomes from a spiny bandicoot (Echymipera rufescens) and two dasyurids, a fat-tailed dunnart (Sminthopsis crassicaudata) and a northern quoll (Dasyurus hallucatus). By comparing trees derived from pairwise base-frequency differences between taxa with standard (absolute, uncorrected) distance trees, we infer that composition bias among mt protein-coding and RNA sequences is sufficient to mislead tree reconstruction. This can explain incongruence between trees obtained from mt and nuclear data sets. However, after excluding major sources of compositional heterogeneity, both the “reduced-bias” mt and nuclear data sets clearly favor a bandicoot plus dasyuromorphian association, as well as a grouping of kangaroos and possums (Phalangeriformes) among diprotodontians. Notably, alternatives to these groupings could only be confidently rejected by combining the mt and nuclear data. Elsewhere on the tree, Dromiciops appears to be sister to the monophyletic Australasian marsupials, whereas the placement of the marsupial mole (Notoryctes) remains problematic. More generally, we contend that it is desirable to combine mt genome and nuclear sequences for inferring vertebrate phylogeny, but as separately modeled process partitions. This strategy depends on detecting and excluding (or accounting for) major sources of nonhistorical signal, such as from compositional nonstationarity.
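The pairwise base-frequency comparison used to diagnose composition bias can be sketched simply: compute each taxon's base composition and measure the distance between compositions, ignoring which sites actually differ. Taxa with similar composition can be artifactually drawn together in trees even without shared history. This is an illustrative sketch, not the authors' exact procedure:

```python
from collections import Counter
import math

def base_frequencies(seq):
    """Relative frequencies of A, C, G, T in a sequence."""
    counts = Counter(seq)
    total = sum(counts[b] for b in "ACGT")
    return [counts[b] / total for b in "ACGT"]

def composition_distance(seq1, seq2):
    """Euclidean distance between base-frequency vectors. Large values flag
    compositional heterogeneity that can mislead tree reconstruction,
    independently of how many individual sites differ."""
    f1, f2 = base_frequencies(seq1), base_frequencies(seq2)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))
```

Comparing a tree built from such composition distances against a standard uncorrected distance tree shows how much of the latter's signal is compositional rather than historical.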


Sequence data often contain competing signals that are detected by network programs or Lento plots. Such data can be simulated by generating sequences on more than one tree and combining the results: a mixture model. We report that, under such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias in ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods that allow a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so that some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best-practice analysis, along with ML and Bayesian trees.


Background: Evolutionary biologists are often misled by convergence of morphology, and this has been common in the study of bird evolution. However, the use of molecular data sets has its own problems, and phylogenies based on short DNA sequences have the potential to mislead us too. The relationships among clades and the timing of the evolution of modern birds (Neoaves) have not yet been well resolved, and evidence for convergence of morphology remains controversial. With six new bird mitochondrial genomes (hummingbird, swift, kagu, rail, flamingo and grebe), we test the proposed Metaves/Coronaves division within Neoaves and the parallel radiations in this primary avian clade.

Results: Our mitochondrial trees did not return the Metaves clade that had been proposed based on one nuclear intron sequence. We suggest that the high number of indels within the seventh intron of the β-fibrinogen gene at this phylogenetic level, which left a dataset with not a single site across the alignment shared by all taxa, resulted in artifacts during analysis. With respect to the overall avian tree, we find that the flamingo and grebe are sister taxa and basal to the shorebirds (Charadriiformes); using a novel site-stripping technique for noise reduction, we found this relationship to be stable. The hummingbird/swift clade falls outside the large and very diverse group of raptors, shore birds and sea birds. Unexpectedly, the kagu is not closely related to the rail in our analysis, but because neither the kagu nor the rail has close affinity to any taxa within this dataset of 41 birds, their placement is not yet resolved.

Conclusion: Our phylogenetic hypothesis based on 41 avian mitochondrial genomes (13,229 bp) rejects monophyly of the seven Metaves species, and we therefore conclude that the members of Metaves do not share a common evolutionary history within the Neoaves.


This article examines the recently introduced Neighbourhood Disputes Resolution Act 2011 (Qld). The operation of the Act is considered as it impacts upon the responsibility of neighbours for dividing fences and trees as well as disclosure obligations associated with sale transactions. A particular focus of the article is the interrelationship of the disclosure obligations imposed by the Act with the operation of standard contractual warranties in Queensland.


The Kyoto Protocol recognises trees as a carbon sink and a valid means to offset greenhouse gas emissions and meet internationally agreed emissions targets. This study details biological carbon sequestration rates for the common plantation species Araucaria cunninghamii (hoop pine), Eucalyptus cloeziana, Eucalyptus argophloia, Pinus elliottii and Pinus caribaea var. hondurensis, and the individual land areas required in north-eastern Australia to offset greenhouse gas emissions of 1000 t CO2e. The 3PG simulation model was used to predict above- and below-ground estimates of biomass carbon for a range of soil productivity conditions at six representative locations in agricultural regions of north-eastern Australia. The total area required to offset 1000 t CO2e ranges from 1 ha of E. cloeziana under high-productivity conditions in coastal North Queensland to 45 ha of hoop pine under low-productivity conditions in inland Central Queensland. These areas must remain planted for a minimum of 30 years to meet the offset of 1000 t CO2e.
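The quoted land areas imply a simple back-calculation: area equals the target offset divided by per-hectare sequestration over the accounting period. The per-hectare rates in the comments below are back-calculated from the quoted areas purely for illustration; they are not the study's measured rates:

```python
def area_to_offset(target_t_co2e, sequestration_t_co2e_per_ha):
    """Hectares that must stay planted so cumulative sequestration
    over the accounting period meets the target offset."""
    return target_t_co2e / sequestration_t_co2e_per_ha

# Back-calculated illustrations only:
# ~1000 t CO2e/ha over the period -> 1 ha (the E. cloeziana case)
# ~22.2 t CO2e/ha over the period -> 45 ha (the low-productivity hoop pine case)
```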


Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection and Integrated Pest Management (IPM), to determine grain quality, and to satisfy importing nations' biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Research has primarily focussed on developing statistical methodologies for in-storage sampling strategies, concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review considers the development of statistically based in-storage sampling and surveillance strategies and identifies methods that may be useful for both. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
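As an example of the traditional probabilistic methods the review covers, the chance that an in-storage sampling plan detects an infestation follows directly from the binomial model. The numbers below are illustrative; the sketch assumes homogeneous prevalence and independent samples:

```python
import math

def detection_probability(n_samples, prevalence):
    """Probability that at least one of n independent samples is infested,
    when a fraction `prevalence` of grain units carry the pest."""
    return 1 - (1 - prevalence) ** n_samples

def samples_needed(prevalence, confidence=0.95):
    """Smallest n whose detection probability reaches the required confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# At 1% prevalence, 299 samples give at least a 95% chance of detection.
n = samples_needed(0.01, 0.95)
```

The same calculation underpins area-wide surveillance design: as the prevalence to be detected falls, the required number of samples grows roughly inversely, which is why statistically defensible low-prevalence surveys are expensive.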