607 results for Water classification
Abstract:
This report presents the final deliverable from the project titled ‘Conceptual and statistical framework for a water quality component of an integrated report card’, funded by the Marine and Tropical Sciences Research Facility (MTSRF; Project 3.7.7). The key management driver of this, and a number of other MTSRF projects concerned with indicator development, is the requirement for state and federal government authorities and other stakeholders to provide robust assessments of the present ‘state’ or ‘health’ of regional ecosystems in the Great Barrier Reef (GBR) catchments and adjacent marine waters. An integrated report card format that encompasses both biophysical and socioeconomic factors is an appropriate framework through which to deliver these assessments and meet a variety of reporting requirements. It is now well recognised that a ‘report card’ format for environmental reporting is very effective for community and stakeholder communication and engagement, and can be a key driver in galvanising community and political commitment and action. Although a report card needs to be understandable by all levels of the community, it also needs to be underpinned by sound, quality-assured science. In this regard, this project set out to develop approaches to address the statistical issues that arise from the amalgamation or integration of sets of discrete indicators into a final score or assessment of the state of the system. In brief, the two main issues are (1) selecting, measuring and interpreting specific indicators that vary both in space and time, and (2) integrating a range of indicators in such a way as to provide a succinct but robust overview of the state of the system. Although there is considerable research on and knowledge of the use of indicators to inform the management of ecological, social and economic systems, methods for how best to integrate multiple disparate indicators remain poorly developed.
Therefore the objective of this project was to (i) focus on statistical approaches aimed at ensuring that estimates of individual indicators are as robust as possible, and (ii) present methods that can be used to report on the overall state of the system by integrating estimates of individual indicators. It was agreed at the outset that this project would focus on developing methods for a water quality report card. This was driven largely by the requirements of the Reef Water Quality Protection Plan (RWQPP) and led to strong partner engagement with the Reef Water Quality Partnership.
Abstract:
In this research, Agency Theory and Stewardship Theory are used to analyse the relative performance of different forms of privatisation of water infrastructure, and in doing so the analysis enriches understanding of previously underdeveloped aspects of both theories. The prior Agency Theory literature had established assumptions about the behaviour of principals and agents in contracts, and these were found not to hold in the context of contracts between modern government and private organisations. Agency Theory was extended to include steward-like behaviour of an agent, and Stewardship Theory was developed through the identification of factors within the contractual relationship which promote a sense of responsibility to the principal. The alliance, joint venture and Build Own Operate Transfer (BOOT) forms of privatisation were found to achieve stewardship of the infrastructure.
Abstract:
Fine-grained leaf classification has concentrated on the use of traditional shape and statistical features to classify ideal images. In this paper we evaluate the effectiveness of traditional hand-crafted features and propose the use of deep convolutional neural network (ConvNet) features. We introduce a range of condition variations to explore the robustness of these features, including: translation, scaling, rotation, shading and occlusion. Evaluations on the Flavia dataset demonstrate that in ideal imaging conditions, combining traditional and ConvNet features yields state-of-the-art performance with an average accuracy of 97.3% ± 0.6%, compared to traditional features, which obtain an average accuracy of 91.2% ± 1.6%. Further experiments show that this combined classification approach consistently outperforms the best set of traditional features by an average of 5.7% for all of the evaluated condition variations.
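The combination of hand-crafted and ConvNet features described in this abstract amounts to concatenating the two descriptor families before classification. A minimal sketch of that fusion step (the function name and the per-family L2 normalisation are our own assumptions, not details taken from the paper):

```python
import numpy as np

def fuse_features(handcrafted, convnet):
    """Concatenate two feature families into one descriptor per sample.

    Each family is L2-normalised row-wise first, so that neither family
    dominates the combined descriptor purely by numeric scale.
    """
    h = handcrafted / (np.linalg.norm(handcrafted, axis=1, keepdims=True) + 1e-12)
    c = convnet / (np.linalg.norm(convnet, axis=1, keepdims=True) + 1e-12)
    # combined descriptor: one row per sample, columns = both families
    return np.hstack([h, c])
```

Any off-the-shelf classifier (e.g. an SVM) can then be trained on the fused rows; the normalisation choice is one common option, not necessarily the one the authors used.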
Abstract:
Description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspelling correction is carried out by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept, instead of removing parts of phrases as stop words. Abbreviations appearing in many forms of entry are manually identified, and only one form of each abbreviation is used. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterised as high-dimensional and sparse: few features are irrelevant, but features are correlated with one another. Therefore, matrix factorisation techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space, and classifiers have been built on this reduced feature space. In experiments, a set of tests is conducted to identify which classification method is best for medical text classification.
The Non-Negative Matrix Factorization with Support Vector Machine method can achieve 93% precision, which is higher than all the tested traditional classifiers. We also found that TF/IDF weighting, which works well for long text classification, is inferior to binary weighting in short document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
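The NNMF reduction and the binary term weighting discussed above can be sketched together: binary weighting records only term presence/absence, and a classic multiplicative-update NMF (Lee & Seung) then maps documents into a lower-dimensional non-negative space. This is an illustrative sketch under those standard formulations, not the authors' code:

```python
import numpy as np

def binary_weight(term_counts):
    # presence/absence weighting, which the study found better than
    # TF/IDF for short injury narratives
    return (term_counts > 0).astype(float)

def nmf(V, k, iters=200, seed=0):
    """Factor a non-negative matrix V (docs x terms) as W @ H with W, H >= 0.

    Classic multiplicative updates minimising the Frobenius error; each
    row of W is the document's coordinates in the k-dimensional space.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # update topic-term factors
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)   # update doc-topic factors
    return W, H
```

A classifier such as an SVM would then be trained on the rows of W instead of the full term matrix.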
Abstract:
Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. HRV analysis is an important tool to observe the heart’s ability to respond to normal regulatory impulses that affect its rhythm. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. A computer-based arrhythmia detection system of cardiac states is very useful in diagnostics and disease management. In this work, we studied the identification of HRV signals using features derived from HOS. These features were fed to the support vector machine (SVM) for classification. Our proposed system can classify normal rhythm and four other classes of arrhythmia with an average accuracy of more than 85%.
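HOS features of the kind this abstract describes are typically derived from the signal's bispectrum. A minimal sketch of a direct bispectrum estimate and a few summary statistics that could be fed to an SVM (the particular features shown, such as bispectral entropy, are common illustrative choices, not necessarily the paper's exact set):

```python
import numpy as np

def bispectrum(x):
    """Direct bispectrum estimate B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2))."""
    X = np.fft.fft(x)
    n = len(x) // 2
    f1, f2 = np.meshgrid(np.arange(n), np.arange(n))
    return X[f1] * X[f2] * np.conj(X[(f1 + f2) % len(x)])

def hos_features(x):
    """A few scalar HOS features summarising the bispectral magnitude."""
    B = np.abs(bispectrum(x))
    p = B / (B.sum() + 1e-12)                  # normalise to a distribution
    entropy = -(p * np.log(p + 1e-12)).sum()   # bispectral entropy
    return np.array([entropy, B.mean(), B.max()])
```

In practice the bispectrum would be averaged over windowed segments of the RR-interval series before extracting features; that step is omitted here for brevity.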
Abstract:
In this paper we propose the hybrid use of illuminant invariant and RGB images to perform image classification of urban scenes despite challenging variation in lighting conditions. Coping with lighting change (and the shadows thereby invoked) is a non-negotiable requirement for long-term autonomy using vision. One aspect of this is the ability to reliably classify scene components in the presence of marked and often sudden changes in lighting, and this is the focus of this paper. Posed with the task of classifying all parts in a scene from a full colour image, we propose that lighting invariant transforms can reduce the variability of the scene, resulting in a more reliable classification. We leverage the ideas of “data transfer” for classification, beginning with full colour images for obtaining candidate scene-level matches using global image descriptors. This is commonly followed by superpixel-level matching with local features. However, we show that if the RGB images are subjected to an illuminant invariant transform before computing the superpixel-level features, classification is significantly more robust to scene illumination effects. The approach is evaluated using three datasets: the first is our own dataset, and the second is the KITTI dataset, with manually generated ground truth used for quantitative analysis. We qualitatively evaluate the method on a third custom dataset over a 750 m trajectory.
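One widely used one-channel illuminant-invariant transform is the log-chromaticity form (e.g. Maddern et al., 2014). The abstract does not specify which transform the authors use, so the sketch below is illustrative only; alpha depends on the camera's spectral response:

```python
import numpy as np

def illuminant_invariant(rgb, alpha=0.48):
    """Map an H x W x 3 RGB image to a one-channel illuminant-invariant image.

    Log-chromaticity form: I = 0.5 + log(G) - alpha*log(B) - (1-alpha)*log(R).
    Because the log-channel weights sum to zero, a global intensity scaling
    of all three channels cancels out.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-6  # guard against log(0)
    return 0.5 + np.log(g + eps) - alpha * np.log(b + eps) - (1 - alpha) * np.log(r + eps)
```

Superpixel-level features would then be computed on this single channel instead of the raw RGB values before matching.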
Abstract:
Two Archaean komatiitic flows, Fred’s Flow in Canada and the Murphy Well Flow in Australia, have similar thicknesses (120 and 160 m) but very different compositions and internal structures. Their contrasting differentiation profiles are keys to determining the cooling and crystallization mechanisms that operated during the eruption of Archaean ultramafic lavas. Fred’s Flow is the type example of a thick komatiitic basalt flow. It is strongly differentiated and consists of a succession of layers with contrasting textures and compositions. The layering is readily explained by the accumulation of olivine and pyroxene in a lower cumulate layer and by evolution of the liquid composition during downward growth of spinifex-textured rocks within the upper crust. The magmas that erupted to form Fred’s Flow had variable compositions, ranging from 12 to 20 wt% MgO, and phenocryst contents from 0 to 20 vol%. The flow was emplaced by two pulses. A first ~20-m-thick pulse was followed by another more voluminous but less magnesian pulse that inflated the flow to its present 120 m thickness. Following the second pulse, the flow crystallized in a closed system and differentiated into cumulates containing 30–38 wt% MgO and a residual gabbroic layer with only 6 wt% MgO. The Murphy Well Flow, in contrast, has a remarkably uniform composition throughout. It comprises a 20-m-thick upper layer of fine-grained dendritic olivine and 2–5 vol% amygdales, a 110–120 m intermediate layer of olivine porphyry and a 20–30 m basal layer of olivine orthocumulate. Throughout the flow, MgO contents vary little, from only 30 to 33 wt%, except for the slightly more magnesian basal layer (38–40 wt%). The uniform composition of the flow and dendritic olivine habits in the upper 20 m point to rapid cooling of a highly magnesian liquid with a composition like that of the bulk of the flow.
Under equilibrium conditions, this liquid should have crystallized olivine with the composition Fo94.9, but the most magnesian composition measured by electron microprobe in samples from the flow is Fo92.9. To explain these features, we propose that the parental liquid contained around 32 wt% MgO and 3 wt% H2O. This liquid degassed during the eruption, creating a supercooled liquid that solidified quickly and crystallized olivine with non-equilibrium textures and compositions.
Abstract:
The importance of clean drinking water in any community is absolutely vital if we as consumers are to sustain a life of health and wellbeing. Suspended particles in surface waters not only provide the means to transport micro-organisms which can cause serious infections and diseases; they can also affect the performance capacity of a water treatment plant. In such situations pre-treatment ahead of the main plant is recommended. Previous research using non-woven synthetic fabrics as pre-filter materials for protecting slow sand filters from high turbidity showed that filter run times can be extended several times over, and that filters can be regenerated by simply removing and washing the fabric (Mbwette and Graham, 1987; Mbwette, 1991). Geosynthetic materials have been extensively used for soil retention and dewatering in geotechnical applications, but little research exists on their application to turbidity reduction in water treatment. With the development of new geosynthetic materials, it was hypothesised that turbidity removal efficiency could be improved further by selecting appropriate materials. Two different geosynthetic materials (75 micron) tested at a filtration rate of 0.7 m/h yielded a 30-45% reduction in turbidity with relatively minor head loss. The non-woven geotextile Propex 1701 delivered the highest performance in both filtration efficiency and head loss across the varying turbidity ranges, in comparison to the other geotextiles tested. With 5 layers of the Propex 1701, an average reduction of approximately 67% was achieved, with an average head loss of 4 mm over the two-and-a-half-hour testing period. Using the data collected for the Propex 1701, a mathematical model was developed for predicting the expected percent reduction as a function of the number of layers used, allowing cost to be traded off against performance in a given filtration scenario.
Abstract:
We demonstrate potential applications for unusual dendrite-like Au–Ag alloy nanoparticles formed via a galvanic replacement reaction in the ionic liquid [BMIM][BF4]. In comparison to Au–Ag alloy nanoshells synthesised via a similar reaction in water, the unusual branched structure of the dendritic materials led to increased electrocatalytic activity for the oxidation of both formaldehyde and hydrazine, and increased sensitivity and spectral resolution for the surface enhanced Raman scattering (SERS) of 4,4′-bipyridyl.
Abstract:
A numerical study is carried out to investigate the transition from laminar flow to chaos in mixed convection heat transfer inside a lid-driven trapezoidal enclosure. In this study, the top wall is considered an isothermal cold surface moving in its own plane at a constant speed, and a constant high temperature is imposed at the bottom surface. The enclosure is assumed to be filled with a water-Al2O3 nanofluid. The governing Navier–Stokes and thermal energy equations are expressed in non-dimensional form and are solved using the Galerkin finite element method. Attention is focused in the present study on the pure mixed convection regime at Richardson number Ri = 1. The numerical simulations are carried out over a wide range of Reynolds (0.1 ≤ Re ≤ 10³) and Grashof (0.01 ≤ Gr ≤ 10⁶) numbers. Effects of the presence of the nanofluid on the characteristics of mixed convection heat transfer are also explored. The average Nusselt numbers of the heated wall are computed to demonstrate the influence of flow parameter variations on heat transfer. The corresponding change of flow and thermal fields is visualized from the streamline and isotherm contour plots.
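For reference, the non-dimensional governing equations for a lid-driven mixed convection problem of this kind typically take the following standard form (a sketch; the paper's nanofluid thermophysical property ratios are omitted):

```latex
% Continuity
\frac{\partial U}{\partial X} + \frac{\partial V}{\partial Y} = 0
% X-momentum
U\frac{\partial U}{\partial X} + V\frac{\partial U}{\partial Y}
  = -\frac{\partial P}{\partial X} + \frac{1}{Re}\nabla^{2}U
% Y-momentum, with the buoyancy term scaled by the Richardson number
U\frac{\partial V}{\partial X} + V\frac{\partial V}{\partial Y}
  = -\frac{\partial P}{\partial Y} + \frac{1}{Re}\nabla^{2}V + Ri\,\theta
% Thermal energy
U\frac{\partial \theta}{\partial X} + V\frac{\partial \theta}{\partial Y}
  = \frac{1}{Re\,Pr}\nabla^{2}\theta
% where Ri = Gr / Re^2, so Ri = 1 marks the pure mixed convection regime
```

The condition Ri = Gr/Re² = 1 is what the abstract calls the pure mixed convection regime, where buoyancy and lid-driven shear forces are comparable.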
Abstract:
We investigated the effect of cold water immersion (CWI) on the recovery of muscle function and physiological responses following high-intensity resistance exercise. Using a randomized, cross-over design, 10 physically active men performed high-intensity resistance exercise, followed by one of two recovery interventions: 10 min of cold water immersion at 10°C, or 10 min of active recovery (low-intensity cycling). After the recovery interventions, maximal muscle function was assessed after 2 h and 4 h by measuring jump height and isometric squat strength. Submaximal muscle function was assessed after 6 h by measuring the average load lifted during six sets of 10 squats at 80% 1RM. Intramuscular temperature (1 cm) was also recorded, and venous blood samples were analyzed for markers of metabolism, vasoconstriction and muscle damage. CWI did not enhance recovery of maximal muscle function. However, during the final three sets of the submaximal muscle function test, the participants lifted a greater load (p < 0.05; 38%; Cohen’s d = 1.3) following CWI compared with active recovery. During CWI, muscle temperature decreased 6°C below post-exercise values, and remained below pre-exercise values for another 35 min. Venous blood O2 saturation decreased below pre-exercise values for 1.5 h after CWI. Serum endothelin-1 concentration did not change after CWI, whereas it decreased after active recovery. Plasma myoglobin concentration was lower, whereas plasma interleukin-6 concentration was higher, after CWI compared with active recovery. These results suggest that cold water immersion after resistance exercise allows athletes to complete more work during subsequent training sessions, which could enhance long-term training adaptations.
Abstract:
Leptospirosis outbreaks have been associated with many common water events, including water consumption, water sports, environmental disasters and occupational exposure. The ability of leptospires to survive in moist environments makes them a high-risk agent for infection following contact with any contaminated water source. Water treatment processes reduce the likelihood of leptospirosis or other microbial agents causing infection, provided they do not malfunction and the distribution networks are maintained. Notably, there are many differences in water treatment systems around the world, particularly between developing and developed countries. Detection of leptospires in water samples by molecular methods is rarely performed.
Abstract:
Water education and conservation programs have grown exponentially in Australian primary and secondary schools and, although early childhood services have been slower to respond to the challenges of sustainability, they are catching up fast. One early program targeted at preschools was the Water Aware Centre Program in northern New South Wales, developed by the local water supply authority. This paper reports on a qualitative study of children’s and teachers’ experiences of the program in three preschools. The study’s aim was to identify program attributes and pedagogies that supported learning and action taking for water conservation, and to investigate if and how the program influenced children’s and teachers’ practices. Data were collected through an interview with the program designer, conversations with child participants of the program, and a qualitative survey of early childhood staff. A three-step thematic analysis was conducted on the children’s and teachers’ data. Findings revealed that the program expanded children’s and teachers’ ideas about water conservation and increased their water conservation practices. The children were found to influence the water conservation practices of the adults around them, thus changing practices at school and at home.
Abstract:
Calls from 14 species of bat were classified to genus and species using discriminant function analysis (DFA), support vector machines (SVM) and ensembles of neural networks (ENN). Both SVMs and ENNs outperformed DFA for every species, while ENNs (mean identification rate 97%) consistently outperformed SVMs (mean identification rate 87%). Correct classification rates produced by the ENNs varied from 91% to 100%; calls from six species were correctly identified with 100% accuracy. Calls from the five species of Myotis, a genus whose species are considered difficult to distinguish acoustically, had correct identification rates that varied from 91% to 100%. Five parameters were most important for classifying calls correctly, while seven others contributed little to classification performance.
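The ensemble step in an ENN is commonly implemented as a majority vote over the member networks' predicted classes. A generic sketch of that combination rule (the paper's actual ENN combination scheme may differ, e.g. it could average network outputs instead):

```python
import numpy as np

def ensemble_predict(member_predictions):
    """Majority vote across ensemble members.

    member_predictions: array-like of shape (n_members, n_calls), each row
    holding one network's integer class label per call. Ties are broken in
    favour of the lowest class index (argmax returns the first maximum).
    """
    P = np.asarray(member_predictions)
    n_classes = P.max() + 1
    # count votes per class for every call: result shape (n_classes, n_calls)
    votes = np.apply_along_axis(np.bincount, 0, P, minlength=n_classes)
    return votes.argmax(axis=0)
```

With three members predicting [0, 1, 2], [0, 1, 1] and [0, 2, 1] for three calls, the vote yields classes [0, 1, 1].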
Abstract:
We describe an investigation into how Massey University’s Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University’s pollen reference collection (2,890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. We additionally work through a real-world case study where we assess the ability of the system to determine the pollen make-up of samples of New Zealand honey. In addition to the Classifynder’s native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data, with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples.