955 results for Burnt area mapping
Abstract:
This paper presents the application of monocular visual SLAM on a fixed-wing small Unmanned Aerial System (sUAS) capable of simultaneous estimation of aircraft pose and scene structure. We demonstrate the robustness of unconstrained vision alone in producing reliable pose estimates of a sUAS at altitude. It is ultimately capable of online state estimation feedback for aircraft control and next-best-view estimation for complete map coverage without the use of additional sensors. We explore some of the challenges of visual SLAM from a sUAS, including dealing with planar structure, distant scenes and noisy observations. The developed techniques are applied to vision data gathered from a fast-moving fixed-wing radio control aircraft flown over a 1×1 km rural area at an altitude of 20-100 m. We present both raw Structure from Motion results and a SLAM solution that includes FAB-MAP based loop closures and graph-optimised poses. Timing information is also presented to demonstrate near-online capabilities. We compare the accuracy of the 6-DOF pose estimates to an off-the-shelf GPS-aided INS over a 1.7 km trajectory. We also present output 3D reconstructions of the observed scene structure and texture that demonstrate future applications in autonomous monitoring and surveying.
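The graph-optimised pose estimation mentioned in this abstract can be illustrated with a toy example. The sketch below is a hypothetical 1-D analogue, not the paper's pipeline: noisy odometry-style edges plus a single loop-closure edge are fused by solving a linear least-squares problem over the poses.

```python
# Minimal 1-D pose-graph optimisation sketch (invented example, not the
# paper's implementation): poses x0..x3 linked by odometry edges, with one
# loop-closure edge asserting that pose 3 revisits pose 0.

def solve_pose_graph(n, edges, anchor=0.0):
    """n poses; edges = [(i, j, measured_offset, weight)] meaning
    x[j] - x[i] ~ measured_offset. x[0] is pinned near `anchor`."""
    # Build the residual system A x ~ b with weights w.
    A = [[0.0] * n for _ in edges]
    b, w = [], []
    for r, (i, j, z, weight) in enumerate(edges):
        A[r][i] = -1.0
        A[r][j] = 1.0
        b.append(z)
        w.append(weight)
    # Remove the gauge freedom with a strong prior on x[0].
    A.append([1.0] + [0.0] * (n - 1)); b.append(anchor); w.append(1e6)
    # Weighted normal equations as an augmented matrix [A^T W A | A^T W b].
    rows = range(len(A))
    m = [[sum(w[r] * A[r][p] * A[r][q] for r in rows) for q in range(n)]
         + [sum(w[r] * A[r][p] * b[r] for r in rows)] for p in range(n)]
    # Gaussian elimination with back substitution.
    for p in range(n):
        for q in range(p + 1, n):
            f = m[q][p] / m[p][p]
            m[q] = [a - f * c for a, c in zip(m[q], m[p])]
    x = [0.0] * n
    for p in reversed(range(n)):
        x[p] = (m[p][n] - sum(m[p][q] * x[q] for q in range(p + 1, n))) / m[p][p]
    return x

# Odometry claims each step moves +1.0, but a loop closure says pose 3
# coincides with pose 0; optimisation spreads the conflict over all edges.
edges = [(0, 1, 1.0, 1.0), (1, 2, 1.0, 1.0), (2, 3, 1.0, 1.0), (0, 3, 0.0, 1.0)]
x = solve_pose_graph(4, edges)
```

With unit weights the accumulated drift implied by the loop closure is spread evenly over the three odometry edges, which is the essential behaviour of pose-graph optimisation after a FAB-MAP-style loop closure, here reduced to one dimension.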
Abstract:
Passive air samplers (PAS) consisting of polyurethane foam (PUF) disks were deployed at 6 outdoor air monitoring stations in different land use categories (commercial, industrial, residential and semi-rural) to assess the spatial distribution of polybrominated diphenyl ethers (PBDEs) in the Brisbane airshed. Air monitoring sites covered an area of 1143 km² and PAS were allowed to accumulate PBDEs in the city's airshed over three consecutive seasons commencing in the winter of 2008. The average sum of five (∑5) PBDE (BDEs 28, 47, 99, 100 and 209) levels was highest at the commercial and industrial sites (12.7 ± 5.2 ng PUF⁻¹), which were relatively close to the city centre, and was a factor of 8 higher than at residential and semi-rural sites located in outer Brisbane. To estimate the magnitude of the urban ‘plume’, an empirical exponential decay model was used to fit PAS data vs. distance from the CBD, with the best correlation observed when the particulate-bound BDE-209 was not included (∑5-209) (r² = 0.99), rather than ∑5 (r² = 0.84). At 95% confidence intervals the model predicts that, regardless of site characterisation, ∑5-209 concentrations in a PAS sample taken 4-10 km from the city centre would be half those in a sample taken at the city centre, and would reach a baseline or plateau (0.6 to 1.3 ng PUF⁻¹) approximately 30 km from the CBD. The observed exponential decay in ∑5-209 levels over distance corresponded with Brisbane's decreasing population density (persons/km²) from the city centre. The residual error associated with the model increased significantly when BDE-209 levels were included, primarily due to the highest level (11.4 ± 1.8 ng PUF⁻¹) being consistently detected at the industrial site, indicating a potential primary source at this site. Active air samples collected alongside the PAS at the industrial air monitoring site (B) indicated that BDE-209 dominated the congener composition and was entirely associated with the particulate phase.
This study demonstrates that PAS are effective tools for monitoring citywide regional differences; however, interpretation of spatial trends for POPs which are predominantly associated with the particulate phase, such as BDE-209, may be restricted to identifying ‘hotspots’ rather than broad spatial trends.
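The empirical exponential decay model described in this abstract has the generic form C(d) = baseline + (C0 - baseline)·exp(-k·d). A minimal sketch follows; the parameter values are invented for illustration, not the study's fitted coefficients:

```python
import math

def pas_concentration(d_km, c0, baseline, k):
    """Hypothetical exponential decay model: PAS level (ng PUF^-1) at
    distance d_km from the CBD, where c0 is the level at the city centre,
    baseline is the rural plateau and k is the decay constant (1/km)."""
    return baseline + (c0 - baseline) * math.exp(-k * d_km)

def half_distance(k):
    """Distance at which the excess over the baseline has halved."""
    return math.log(2) / k

# Illustrative parameters only: 12 ng PUF^-1 at the centre decaying
# towards a 1.0 ng PUF^-1 plateau with k = 0.1 km^-1.
c0, baseline, k = 12.0, 1.0, 0.1
```

In practice the three parameters would be fitted to the PAS measurements (e.g. by nonlinear least squares), and the fitted k then summarises how quickly the urban plume decays towards the rural baseline.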
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy, Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames the process of becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
There is an increasing interest in the use of information technology as a participatory planning tool, particularly the use of geographical information technologies to support collaborative activities such as community mapping. However, despite their promise, the introduction of such technologies does not necessarily promote better participation or improve collaboration. In part this can be attributed to a tendency for planners to focus on the technical considerations associated with these technologies at the expense of broader participation considerations. In this paper we draw on the experiences of a community mapping project with disadvantaged communities in suburban Australia to highlight the importance of selecting tools and techniques which support and enhance participatory planning. This community mapping project, designed to identify and document community-generated transport issues and solutions, had originally intended to use cadastral maps extracted from the government’s digital cadastral database as the foundation for its community mapping approach. It was quickly discovered that the local residents found the cadastral maps confusing, as the maps lacked sufficient detail to orient them to their suburb (the study area). In response to these concerns, and consistent with the project’s participatory framework, a conceptual base map based on residents’ views of landmarks of local importance was developed to support the community mapping process. Based on this community mapping experience we outline four key lessons learned regarding the process of community mapping and the place of geographical information technologies within this process.
Abstract:
Whilst there is an excellent and growing body of literature around female criminality underpinned by feminist methodologies, the nitty-gritty of the methodological journey is nowhere as well detailed as it is in the context of the Higher Degree Research (HDR) thesis. Thus, the purpose of this paper is threefold: i) to explore a range of feminist methodologies underpinning 20 Australian HDR theses focussing on female criminality; ii) to identify and map the governance/ethics tensions experienced by these researchers whilst undertaking high-risk research in the area of female offending; and iii) to document strategies drawn from negotiations, resolutions and outcomes to a range of gate-keeping issues. By exploring the strategies used by these researchers, this paper aims to: promote discussion on feminist methodologies; highlight pathways that may be created when negotiating the challenging process of accessing data pertinent to this relatively understudied area; contribute to a community of practice; and provide useful insights into what Mason & Stubbs (2010:16) refer to as “the open and honest reflexivity through the research process by describing the assumptions, and hiccups” for future researchers navigating governance landscapes.
Abstract:
Significant problems confront our child protection out-of-home care system including: high costs; increasing numbers of children and young people entering and remaining in care longer; high frequency of placement movement; and, negative whole-of-life outcomes for children and young people who have exited care. National policy and research agendas recognise the importance of enhancing the evidence base in out-of-home care to inform the development of policy, programs and practice, and improve longitudinal outcomes of children and young people. The authors discuss the concept of placement trajectory as a framework for research and systems analysis in the out-of-home context. While not without limitations, the concept of placement trajectory is particularly useful in understanding the factors influencing placement movement and stability. Increasing the evidence base in this area can serve to enhance improved outcomes across the lifespan for children and young people in the out-of-home care system.
Abstract:
Experience gained from numerous projects conducted by the U.S. Environmental Protection Agency's (EPA) Environmental Monitoring Systems Laboratory in Las Vegas, Nevada has provided insight into functional issues of mapping, monitoring, and modeling of wetland habitats. Three case studies in poster form describe these issues pertinent to managing wetland resources as mandated under Federal laws. A multiphase project was initiated by the EPA Alaska operations office to provide detailed wetland mapping of arctic plant communities in an area under petroleum development pressure. Existing classification systems did not meet EPA needs; therefore a Habitat Classification System (HCS) derived from aerial photography was compiled. In conjunction with this, photointerpretive keys were developed. These products enable EPA personnel to map large inaccessible areas of the arctic coastal plain and evaluate the sensitivity of various wetland habitats relative to petroleum development needs.
Abstract:
Assurance of learning (AOL) is a quality enhancement and quality assurance process used in higher education. It involves a process of determining programme learning outcomes and standards, and systematically gathering evidence to measure students' performance on these. The systematic assessment of whole-of-programme outcomes provides a basis for curriculum development and management, continuous improvement, and accreditation. To better understand how AOL processes operate, a national study of university practices across one discipline area, business and management, was undertaken. To solicit data on AOL practice, interviews were undertaken with a sample of business school representatives (n = 25). Two key processes emerged: (1) mapping of graduate attributes and (2) collection of assurance data. External drivers such as professional accreditation and government legislation were the primary reasons for undertaking AOL outcomes but intrinsic motivators in relation to continuous improvement were also evident. The facilitation of academic commitment was achieved through an embedded approach to AOL by the majority of universities in the study. A sustainable and inclusive process of AOL was seen to support wider stakeholder engagement in the development of higher education learning outcomes.
Abstract:
Accurate three-dimensional representations of cultural heritage sites are highly valuable for scientific study, conservation, and educational purposes. In addition to their use for archival purposes, 3D models enable efficient and precise measurement of relevant natural and architectural features. Many cultural heritage sites are large and complex, consisting of multiple structures spatially distributed over tens of thousands of square metres. The process of effectively digitising such geometrically complex locations requires measurements to be acquired from a variety of viewpoints. While several technologies exist for capturing the 3D structure of objects and environments, none are ideally suited to complex, large-scale sites, mainly due to their limited coverage or acquisition efficiency. We explore the use of a recently developed handheld mobile mapping system called Zebedee in cultural heritage applications. The Zebedee system is capable of efficiently mapping an environment in three dimensions by continually acquiring data as an operator holding the device traverses through the site. The system was deployed at the former Peel Island Lazaret, a culturally significant site in Queensland, Australia, consisting of dozens of buildings of various sizes spread across an area of approximately 400 × 250 m. With the Zebedee system, the site was scanned in half a day, and a detailed 3D point cloud model (with over 520 million points) was generated from the 3.6 hours of acquired data in 2.6 hours. We present results demonstrating that Zebedee was able to accurately capture both site context and building detail comparable in accuracy to manual measurement techniques, and at a greatly increased level of efficiency and scope. The scan allowed us to record derelict buildings that previously could not be measured because of the scale and complexity of the site. 
The resulting 3D model captures both interior and exterior features of buildings, including structure, materials, and the contents of rooms.
Abstract:
We identified, mapped, and characterized a widespread area (>1,020 km²) of patterned ground in the Saginaw Lowlands of Michigan, a wet, flat plain composed of waterlain tills, lacustrine deposits, or both. The polygonal patterned ground is interpreted as a possible relict permafrost feature, formed in the Late Wisconsin when this area was proximal to the Laurentide ice sheet. Cold-air drainage off the ice sheet might have pooled in the Saginaw Lowlands, which sloped toward the ice margin, possibly creating widespread but short-lived permafrost on this glacial lake plain. The majority of the polygons occur between the Glacial Lake Warren strandline (~14.8 cal. ka) and the shoreline of Glacial Lake Elkton (~14.3 cal. ka), providing a relative age bracket for the patterned ground. Most of the polygons formed in dense, wet, silt loam soils on flat-lying sites and take the form of reticulate nets with polygon long axes of 150 to 160 m and short axes of 60 to 90 m. Interpolygon swales, often shown as dark curvilinears on aerial photographs, are typically slightly lower than the polygon centers they bound. Some portions of these interpolygon swales are infilled with gravel-free, sandy loam sediments. The subtle morphology and sedimentological characteristics of the patterned ground in the Saginaw Lowlands suggest that thermokarst erosion, rather than ice-wedge replacement, was the dominant geomorphic process associated with the degradation of the Late-Wisconsin permafrost in the study area and, therefore, was primarily responsible for the soil patterns seen there today.
Abstract:
A precise representation of the spatial distribution of hydrophobicity, hydrophilicity and charges on the molecular surface of proteins is critical for the understanding of the interaction with small molecules and larger systems. The representation of hydrophobicity is rarely done at atom level, as this property is generally assigned to residues. A new methodology for the derivation of atomic hydrophobicity from any amino acid-based hydrophobicity scale was used to derive 8 sets of atomic hydrophobicities, one of which was used to generate the molecular surfaces for 35 proteins with convex structures, 5 of which, i.e., lysozyme, ribonuclease, hemoglobin, albumin and IgG, have been analyzed in more detail. Sets of the molecular surfaces of the model proteins have been constructed using spherical probes with increasingly large radii, from 1.4 to 20 Å, followed by the quantification of (i) the surface hydrophobicity; (ii) their respective molecular surface areas, i.e., total, hydrophilic and hydrophobic area; and (iii) their relative densities, i.e., divided by the total molecular area, or specific densities, i.e., divided by the property-specific area. Compared with the amino acid-based formalism, the atom-level description reveals molecular surfaces which (i) present approximately two times more hydrophilic area; with (ii) less extended, but 2 to 5 times more intense, hydrophilic patches; and (iii) 3 to 20 times more extended hydrophobic areas. The hydrophobic areas are also approximately 2 times more hydrophobicity-intense. This more pronounced "leopard skin"-like design of the protein molecular surface has been confirmed by comparing the results for a restricted set of homologous proteins, i.e., hemoglobins diverging by only one residue (Trp37).
These results suggest that the representation of hydrophobicity on protein molecular surfaces at atom-level resolution, coupled with the probing of the molecular surface at different geometric resolutions, can capture processes that are otherwise obscured in the amino acid-based formalism.
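The surface quantification described in this abstract (total, hydrophilic and hydrophobic areas, plus relative and specific densities) can be sketched for a toy set of surface patches. The patch areas, hydrophobicity values and sign convention below (positive = hydrophobic) are invented for illustration:

```python
def surface_summary(patches):
    """patches: list of (area_A2, atomic_hydrophobicity) for surface patches.
    Returns the areas and density-style quantities named in the abstract:
    relative density = property-summed hydrophobicity / total area,
    specific density = property-summed hydrophobicity / property area.
    Assumed sign convention: hydrophobicity > 0 means hydrophobic."""
    total = sum(a for a, h in patches)
    phob = [(a, h) for a, h in patches if h > 0]
    phob_area = sum(a for a, h in phob)
    phil_area = total - phob_area
    phob_sum = sum(a * h for a, h in phob)  # area-weighted hydrophobicity
    return {
        "total_area": total,
        "hydrophobic_area": phob_area,
        "hydrophilic_area": phil_area,
        "relative_hydrophobic_density": phob_sum / total,
        "specific_hydrophobic_density": phob_sum / phob_area if phob_area else 0.0,
    }

# Invented patches: (area in A^2, atomic hydrophobicity).
patches = [(10.0, 0.5), (30.0, -0.2), (10.0, 0.3)]
summary = surface_summary(patches)
```

The specific density isolates how intense the hydrophobic patches are, independent of how much of the surface they cover, which is the distinction the abstract draws between patch extent and patch intensity.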
Abstract:
description and analysis of geographically indexed health data with respect to demographic, environmental, behavioural, socioeconomic, genetic, and infectious risk factors (Elliott and Wartenberg 2004). Disease maps can be useful for estimating relative risk; ecological analyses, incorporating area and/or individual-level covariates; or cluster analyses (Lawson 2009). As aggregated data are often more readily available, one common method of mapping disease is to aggregate the counts of disease at some geographical areal level, and present them as choropleth maps (Devesa et al. 1999; Population Health Division 2006). Therefore, this chapter will focus exclusively on methods appropriate for areal data...
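For areal counts of the kind described, a common first step before drawing a choropleth map is to compute a standardised incidence ratio (SIR) per area: observed counts divided by the counts expected under the overall rate. A minimal sketch; the area names, counts and overall rate are invented:

```python
def standardised_incidence_ratios(observed, population, overall_rate):
    """SIR per area = observed count / expected count, where
    expected = population * overall rate. SIR > 1 indicates excess risk,
    SIR < 1 lower-than-expected risk; these values would then be
    classified into colour bins for a choropleth map."""
    return {area: observed[area] / (population[area] * overall_rate)
            for area in observed}

# Invented example: three areas, overall rate of 2 cases per 1,000 persons.
obs = {"A": 12, "B": 4, "C": 20}
pop = {"A": 5000, "B": 4000, "C": 5000}
sir = standardised_incidence_ratios(obs, pop, 0.002)
```

Raw SIRs from small counts are noisy, which is one motivation for the model-based smoothing methods that chapters on areal disease mapping typically go on to cover.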
Abstract:
The research is related to the Finnish Jabal Harun Project (FJHP), which is part of the research unit directed by Professor Jaakko Frösén. The project consists of two interrelated parts: the excavation of a Byzantine monastery/pilgrimage centre on Jabal Harun, and a multiperiod archaeological survey of the surrounding landscape. It is generally held that the Near Eastern landscape has been modified by millennia of human habitation and activity. Past climatic changes and human activities could be expected to have significantly changed the landscape of the Jabal Harun area as well. It was therefore considered that a study of erosion in the Jabal Harun area could shed light on the environmental and human history of the area. It was hoped that it would be possible to connect the results of the sedimentological studies either to wider climatic changes in the Near East, or to archaeologically observable periods of human activity and land use. As evidence of some archaeological periods is completely missing from the Jabal Harun area, it was also of interest whether catastrophic erosion or unfavourable environmental change, caused either by natural forces or by human agency, could explain the gaps in the archaeological record. Changes in climate and/or land use were expected to be reflected in the sedimentary record. The field research, carried out as part of the FJHP survey fieldwork, included the mapping of wadi terraces and the cleaning of sediment profiles, which were recorded and sampled for laboratory analyses of facies and lithology. To obtain a chronology for the sedimentation and erosion phases, OSL (optically stimulated luminescence) dating samples were also collected. The results were compared to the record of the Near Eastern palaeoclimate, and to data from geoarchaeological studies in central and southern Jordan.
The picture of the environmental development was then compared to the human history in the area, based on archaeological evidence from the FJHP survey and the published archaeological research in the Petra region, and the question of the relationship between human activity and environmental change was critically discussed. Using the palaeoclimatic data and the results from geoarchaeological studies it was possible to outline the environmental development in the Jabal Harun area from the Pleistocene to the present. It appears that there was a phase of accumulation of sediment before the Middle Palaeolithic period, possibly related to tectonic movement. This phase was later followed by erosion, tentatively suggested to have taken place during the Upper Palaeolithic. A period of wadi aggradation probably occurred during the Late Glacial and continued until the end of the Pleistocene, followed by significant channel degradation, attributed to increased rainfall during the Early Holocene. It seems that during the later Holocene channel incision has been dominant in the Jabal Harun area, although there have also been small-scale channel aggradation phases, two of which were OSL-dated to around 4000-3000 BP and 2400-2000 BP. As there is no evidence of tectonic movements in the Jabal Harun area after the early Pleistocene, it is suggested that climate change and human activity have been the major causes of environmental change in the area. At a brief glance it seems that many of the changes in the settlement and land use in the Jabal Harun area can be explained by climatic and environmental conditions. However, the responses of human societies to environmental change are dependent on many factors. Therefore an evaluation of the significance of environmental, cultural, socio-economic and political factors is needed to decide whether certain phenomena are environmentally induced.
Comparison with the wider Petra region is also needed to judge whether the phenomena are characteristic of the Jabal Harun area only, or whether they can be connected to social, political and economic development over a wider area.
Abstract:
This paper presents the identification and mapping of vulnerable and safe zones for liquefaction hazard. About 850 bore log datasets collected from geotechnical investigation reports have been used to estimate the liquefaction factor of safety for the Bangalore Mahanagara Palike (BMP) area of about 220 km². The liquefaction factor of safety is arrived at based on the surface-level peak ground acceleration presented by Anbazhagan and Sitharam (5) and the liquefaction resistance, using corrected standard penetration test (SPT) N values. The estimated factor of safety against liquefaction is used to estimate the liquefaction potential index and the liquefaction severity index. These values are mapped using a geographical information system (GIS) to identify the vulnerable and safe zones in Bangalore. This study shows that more than 95% of the BMP area is safe against liquefaction potential. However, the western part of the BMP is not safe against liquefaction, as it may be subjected to liquefaction with a probability of 35 to 65%. The three approaches used in this study show that: 1) mapping the least factor of safety irrespective of depth may be used to find the liquefiable area in the worst case; 2) mapping the liquefaction potential index can be used to assess the liquefaction severity of the area by considering layer thickness and factor of safety; and 3) mapping the liquefaction severity index can be used to assess the probability of liquefaction of the area.
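The liquefaction potential index mentioned in this abstract is commonly computed with the Iwasaki et al. formulation, integrating (1 - FS) over the top 20 m of a profile with a linear depth weight. The sketch below uses that standard form (the paper may apply a variant), with a hypothetical two-layer profile:

```python
def liquefaction_potential_index(layers):
    """Liquefaction potential index (LPI), Iwasaki-style formulation
    (a standard form; not necessarily the paper's exact variant).
    layers = [(depth_top_m, depth_bottom_m, factor_of_safety)] for the
    top 20 m. Each layer contributes F * w(z) * thickness, where
    F = 1 - FS for FS < 1 (else 0) and w(z) = 10 - 0.5 z at mid-depth."""
    lpi = 0.0
    for top, bottom, fs in layers:
        f = max(0.0, 1.0 - fs)              # only FS < 1 contributes
        z_mid = min((top + bottom) / 2.0, 20.0)
        lpi += f * (10.0 - 0.5 * z_mid) * (bottom - top)
    return lpi

# Hypothetical profile: a liquefiable layer (FS = 0.8) from 2-6 m over a
# safe layer (FS = 1.5) from 6-12 m.
layers = [(2.0, 6.0, 0.8), (6.0, 12.0, 1.5)]
lpi = liquefaction_potential_index(layers)
```

Because the weight decreases with depth, shallow liquefiable layers dominate the index, which is why mapping LPI (approach 2 above) captures severity better than the least factor of safety alone.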
Abstract:
Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally, absences comprise observations or are inferred from comprehensive sampling. When such information is not available, pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-stage approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fitted separately within each envelope, and for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM.
We retain an overall measure of model performance, the area under the curve (AUC) of the receiver operating characteristic (ROC) curve, but focus on its constituent measures of false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas a zero or low FOR is desirable since it indicates none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species that has previously been published as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence being in the same range (0.00–0.20) for both true absences and presences. In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR yet visible in FOR, FNR, and the comparison of the predicted probability of presence distributions for presence/absence.
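The four error rates discussed in this abstract all derive from the 2×2 confusion table of predicted versus observed presence/absence. A minimal sketch, with invented counts:

```python
def sdm_error_rates(tp, fp, fn, tn):
    """Error rates from a confusion table of predicted vs observed
    presence/absence. FPR and FNR condition on the observations;
    FOR and FDR condition on the predictions, which is what makes them
    directly relevant to users acting on predicted maps."""
    return {
        "FPR": fp / (fp + tn),   # observed absences predicted present
        "FNR": fn / (fn + tp),   # observed presences predicted absent
        "FOR": fn / (fn + tn),   # predicted absences hiding an observed presence
        "FDR": fp / (fp + tp),   # predicted presences that are observed absences
    }

# Invented counts: 40 detected presences, 10 missed presences,
# 30 false alarms, 120 correctly predicted absences.
rates = sdm_error_rates(tp=40, fp=30, fn=10, tn=120)
```

Note that all four rates move with the presence/absence threshold applied to the predicted probability of presence, which is why the abstract ties them to the choice of that threshold rather than to AUC alone.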