976 results for Coastal sensitivity mapping


Relevance:

20.00%

Publisher:

Abstract:

The occurrence of and conditions favourable to nucleation were investigated at an industrial and commercial coastal location in Brisbane, Australia during five different campaigns covering a total period of 13 months. To identify potential nucleation events, the difference in number concentration in the size range 14-30 nm (N14-30) between consecutive observations was calculated using first-order differencing. The data showed that nucleation events were a rare occurrence, and that in the absence of nucleation the particle number was dominated by particles in the range 30-300 nm. In many instances, total particle concentration declined during nucleation. There was no clear pattern in change in NO and NO2 concentrations during the events. SO2 concentration, in the majority of cases, declined during nucleation but there were exceptions. Most events took place in summer, followed by winter and then spring, and no events were observed for the autumn campaigns. The events were associated with sea breeze and long-range transport. Roadside emissions, in contrast, did not contribute to nucleation, probably due to the predominance of particles in the range 50-100 nm associated with these emissions.
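The event-screening step described above (first-order differencing of consecutive N14-30 readings) can be sketched as follows; the concentration threshold used to flag a jump is a hypothetical placeholder, not a value from the study.

```python
# First-order differencing of N14-30 number concentrations to flag
# candidate nucleation events. The 500 particles/cm^3 threshold is an
# illustrative assumption, not a figure from the campaign.

def candidate_nucleation_events(n14_30, threshold=500.0):
    """Return indices where N14-30 rises by more than `threshold`
    between consecutive observations."""
    events = []
    for i in range(1, len(n14_30)):
        if n14_30[i] - n14_30[i - 1] > threshold:  # first-order difference
            events.append(i)
    return events

print(candidate_nucleation_events([100.0, 200.0, 900.0, 950.0]))  # [2]
```

In practice the screened indices would then be inspected against the accompanying gas-phase (NO, NO2, SO2) records, as the abstract describes.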

Relevance:

20.00%

Publisher:

Abstract:

A month-long intensive measurement campaign was conducted in March/April 2007 at Agnes Water, a remote coastal site just south of the Great Barrier Reef on the east coast of Australia. Particle and ion size distributions were continuously measured during the campaign. Coastal nucleation events were observed in clean, marine air masses coming from the south-east on 65% of the days. The events usually began at ~10:00 local time and lasted for 1-4 hrs. They were characterised by the appearance of a nucleation mode with a peak diameter of ~10 nm. The freshly nucleated particles grew within 1-4 hrs up to sizes of 20-50 nm. The events occurred when solar intensity was high (~1000 W m⁻²) and RH was low (~60%). Interestingly, the events were not related to tide height. The volatile and hygroscopic properties of freshly nucleated particles (17-22.5 nm), simultaneously measured with a volatility-hygroscopicity tandem differential mobility analyser (VH-TDMA), were used to infer chemical composition. The majority of the volume of these particles was attributed to internally mixed sulphate and organic components. After ruling out coagulation as a source of significant particle growth, we conclude that the condensation of sulphate and/or organic vapours was most likely responsible for driving particle growth during the nucleation events. We cannot draw any direct conclusions regarding the chemical species that participated in the initial particle nucleation. However, we suggest that nucleation may have resulted from the photo-oxidation products of unknown sulphur or organic vapours emitted from the waters of Hervey Bay, or from the formation of DMS-derived sulphate clusters over the open ocean that were activated to observable particles by condensable vapours emitted from the nutrient-rich waters around Fraser Island or Hervey Bay. Furthermore, a unique and particularly strong nucleation event was observed during northerly wind. The event began early one morning (08:00) and lasted almost the entire day, resulting in the production of a large number of ~80 nm particles (average modal concentration during the event was 3200 cm⁻³). The Great Barrier Reef was the most likely source of precursor vapours responsible for this event.
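The growth figures quoted above (from a ~10 nm mode up to 20-50 nm within 1-4 hrs) imply a condensational growth rate; a trivial helper makes that arithmetic explicit. The function name is ours, not the paper's.

```python
def growth_rate_nm_per_hr(d_start_nm, d_end_nm, hours):
    """Mean particle diameter growth rate in nm per hour."""
    return (d_end_nm - d_start_nm) / hours

# Bounds implied by the reported observations:
print(growth_rate_nm_per_hr(10.0, 20.0, 4.0))  # 2.5 nm/hr (slowest case)
print(growth_rate_nm_per_hr(10.0, 50.0, 1.0))  # 40.0 nm/hr (fastest case)
```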

Relevance:

20.00%

Publisher:

Abstract:

For robots to operate in human environments they must be able to make their own maps because it is unrealistic to expect a user to enter a map into the robot’s memory; existing floorplans are often incorrect; and human environments tend to change. Traditionally robots have used sonar, infra-red or laser range finders to perform the mapping task. Digital cameras have become very cheap in recent years and they have opened up new possibilities as a sensor for robot perception. Any robot that must interact with humans can reasonably be expected to have a camera for tasks such as face recognition, so it makes sense to also use the camera for navigation. Cameras have advantages over other sensors such as colour information (not available with any other sensor), better immunity to noise (compared to sonar), and not being restricted to operating in a plane (like laser range finders). However, there are disadvantages too, with the principal one being the effect of perspective. This research investigated ways to use a single colour camera as a range sensor to guide an autonomous robot and allow it to build a map of its environment, a process referred to as Simultaneous Localization and Mapping (SLAM). An experimental system was built using a robot controlled via a wireless network connection. Using the on-board camera as the only sensor, the robot successfully explored and mapped indoor office environments. The quality of the resulting maps is comparable to those that have been reported in the literature for sonar or infra-red sensors. Although the maps are not as accurate as ones created with a laser range finder, the solution using a camera is significantly cheaper and is more appropriate for toys and early domestic robots.
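Purely as an illustration of the map-building step such a system performs (the thesis's own camera-based method is not reproduced here), the sketch below updates an occupancy grid from a single range-and-bearing measurement. The grid representation, cell size and pose format are our assumptions.

```python
import math

# Generic occupancy-grid update for one range reading: cells along the
# measured ray are marked free, the endpoint cell occupied. The grid is
# a sparse dict {(ix, iy): 0 or 1}; cell size and pose are illustrative.

def update_grid(grid, pose, bearing_rad, range_m, cell_size=0.1):
    """Apply one range-and-bearing measurement taken from `pose`
    (x, y, heading) to a sparse occupancy grid."""
    x, y, heading = pose
    theta = heading + bearing_rad
    for s in range(int(range_m / cell_size)):
        px = x + s * cell_size * math.cos(theta)
        py = y + s * cell_size * math.sin(theta)
        grid[(int(px / cell_size), int(py / cell_size))] = 0  # free space
    ex = x + range_m * math.cos(theta)
    ey = y + range_m * math.sin(theta)
    grid[(int(ex / cell_size), int(ey / cell_size))] = 1  # obstacle
    return grid
```

A full SLAM system would additionally estimate the pose itself from the same measurements; this fragment only shows the mapping half.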

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To explore the effects of glaucoma and aging on low-spatial-frequency contrast sensitivity by using tests designed to assess performance of either the magnocellular (M) or parvocellular (P) visual pathways. METHODS: Contrast sensitivity was measured for spatial frequencies of 0.25 to 2 cyc/deg by using a published steady- and pulsed-pedestal approach. Sixteen patients with glaucoma and 16 approximately age-matched control subjects participated. Patients with glaucoma were tested foveally and at two midperipheral locations: (1) an area of early visual field loss, and (2) an area of normal visual field. Control subjects were assessed in matched locations. An additional group of 12 younger control subjects (aged 20-35 years) were also tested. RESULTS: Older control subjects demonstrated reduced sensitivity relative to the younger group for the steady (presumed M)- and pulsed (presumed P)-pedestal conditions. Sensitivity was reduced foveally and in the midperiphery across the spatial frequency range. In the area of early visual field loss, the glaucoma group demonstrated further sensitivity reduction relative to older control subjects across the spatial frequency range for both the steady- and pulsed-pedestal tasks. Sensitivity was also reduced in the midperipheral location of "normal" visual field for the pulsed condition. CONCLUSIONS: Normal aging results in a reduction of contrast sensitivity for the low-spatial-frequency-sensitive components of both the M and P pathways. Glaucoma results in a further reduction of sensitivity that is not selective for M or P function. The low-spatial-frequency-sensitive channels of both pathways, which are presumably mediated by cells with larger receptive fields, are approximately equivalently impaired in early glaucoma.
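Contrast sensitivity, as measured by pedestal tasks like those above, is conventionally the reciprocal of the contrast threshold, often reported in log10 units. A minimal sketch of that standard conversion (not code from the study):

```python
import math

def contrast_sensitivity(threshold):
    """Sensitivity as the reciprocal of the contrast threshold (0 < c <= 1)."""
    return 1.0 / threshold

def log_sensitivity(threshold):
    """Contrast sensitivity expressed in log10 units."""
    return math.log10(contrast_sensitivity(threshold))

# A 1% contrast threshold corresponds to a sensitivity of 100 (log10 = 2).
print(contrast_sensitivity(0.01), log_sensitivity(0.01))
```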

Relevance:

20.00%

Publisher:

Abstract:

Coastal communities face the social, cultural and environmental challenges of managing rapid urban and industrial development, expanding tourism, and sensitive ecological environments. Enriching relationships between communities and universities through a structured engagement process can deliver integrated options towards sustainable coastal futures. This process draws on the embedded knowledge and values of all participants in the relationship, and offers a wide and affordable range of options for the future. This paper reviews lessons learnt from two projects with coastal communities, and discusses their application in a third. Queensland University of Technology has formed collaborative partnerships with industry in Queensland's Wide Bay-Burnett region to undertake a series of planning and design projects with community engagement as a central process. Senior students worked with community and produced design and planning drawings and reports outlining future options for project areas. A reflective approach has been adopted by the authors to assess the engagement process and outcomes of each project to learn lessons to apply in the next. Methods include surveying community and student participants regarding the value they place on process and outcomes respectively in planning for a sustainable future. All project participants surveyed have placed high importance on the process of engagement, emphasising the value of developing relationships between all project partners. The quality of these relationships is central to planning for sustainable futures, and while the outcomes the students deliver are valued, it is as much for their catalytic role as for their contents. Design and planning projects through community engagement have been found to develop innovative responses to the challenges faced by coastal communities seeking direction toward sustainable futures. 
The enrichment of engagement relationships and processes has an important influence on the quality of these design and planning responses.

Relevance:

20.00%

Publisher:

Abstract:

This chapter reports on Australian and Swedish experiences in the iterative design, development, and ongoing use of interactive educational systems we call ‘Media Maps.’ Like maps in general, Media Maps are usefully understood as complex cultural technologies; that is, they are not only physical objects, tools and artefacts, but also information creation and distribution technologies, the use and development of which are embedded in systems of knowledge and social meaning. Drawing upon Australian and Swedish experiences with one Media Map technology, this chapter illustrates this three-layered approach to the development of media mapping. It shows how media mapping is being used to create authentic learning experiences for students preparing for work in the rapidly evolving media and communication industries. We also contextualise media mapping as a response to various challenges for curriculum and learning design in Media and Communication Studies that arise from shifts in tertiary education policy in a global knowledge economy.

Relevance:

20.00%

Publisher:

Abstract:

Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure, such as maintenance, rehabilitation and construction works, can pose risks and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts.
For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
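For the likelihood/consequence rating systems mentioned in this abstract, a qualitative risk matrix is the usual form. The sketch below uses illustrative five-point scales and band cut-offs of our own devising, not the Australian Standard's actual tables.

```python
# Illustrative qualitative risk matrix: rating = likelihood x consequence,
# banded into an overall risk level. Scales and cut-offs are assumptions.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3,
              "likely": 4, "almost certain": 5}
CONSEQUENCE = {"insignificant": 1, "minor": 2, "moderate": 3,
               "major": 4, "catastrophic": 5}

def risk_rating(likelihood, consequence):
    """Map a likelihood/consequence pair to an overall risk band."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

print(risk_rating("likely", "major"))  # extreme
```

Risks rated "extreme" or "high" would then be candidates for the risk-reduction and residual-risk management steps the report describes.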

Relevance:

20.00%

Publisher:

Abstract:

As regulators, governments are often criticised for over‐regulating industries. This research project examines the regulation affecting the construction industry in a federal system of government. It uses a case study of the Australian system of government to focus on the implications of regulation for the construction industry. Having established the extent of the regulatory environment, the research project considers the costs associated with this environment, and then evaluates ways in which the regulatory burden on industry can be reduced. The Construction Industry Business Environment project is working with industry and government agencies to improve regulatory harmonisation in Australia, and thereby reduce the regulatory burden on industry. It is found that while taxation and compliance costs are not likely to be reduced in the short term, the costs of adapting to variation between regulatory regimes in a federal system of government offer the most promising avenue for reducing regulatory costs. Identifying and reducing these adaptive costs across jurisdictions is argued to present a novel approach to regulatory reform.

Relevance:

20.00%

Publisher:

Abstract:

There are currently a number of issues of great importance affecting universities and the way in which their programs are offered. Many of these issues are driven top-down and have impacts both university-wide and at the individual discipline level. This paper provides a brief history of cartography and digital mapping education at the Queensland University of Technology (QUT). It also provides an overview of curriculum mapping and presents some interesting findings from the program review process. Further, this review process has triggered discussion and action on the review, mapping and embedding of graduate attributes within the spatial science major. Some form of practice-based learning is expected in vocationally oriented degrees that lead to professional accreditation, and it is generally regarded as good learning exposure. With the restructure of academic programs across the Faculty of Built Environment and Engineering in 2006, spatial science and surveying students now undertake a formal work integrated learning unit. There is little doubt that students acquire the skills of their discipline (mapping science, spatial science) by being immersed in the industry culture: learning how to process information and solve real-world problems within context. The broad theme of where geo-spatial mapping skills are embedded in this broad-based tertiary education course is examined, with some focused discussion of the learning objectives, outcomes and examples of student learning experiences.

Relevance:

20.00%

Publisher:

Abstract:

Network crawling and visualisation tools and other data-mining systems are now advanced enough to provide significant new impetus to the study of cultural activity on the Web. A growing range of studies focus on communicative processes in the blogosphere – including, for example, Adamic & Glance's 2005 map of political allegiances during the 2004 U.S. presidential election and Kelly & Etling's 2008 study of blogging practices in Iran. However, a number of significant shortcomings remain in the application of such tools and methodologies to the study of blogging; these relate both to how the content of blogs is analysed and to how the network maps resulting from such studies are understood. Our project highlights and addresses these shortcomings.

Relevance:

20.00%

Publisher:

Abstract:

Areal bone mineral density (aBMD) is the most common surrogate measurement for assessing the bone strength of the proximal femur associated with osteoporosis. Additional factors, however, contribute to the overall strength of the proximal femur, primarily the anatomical geometry. Finite element analysis (FEA) is an effective and widely used computer-based simulation technique for modelling mechanical loading of various engineering structures, providing predictions of displacement and induced stress distribution due to the applied load. FEA is therefore inherently dependent upon both density and anatomical geometry. FEA may be performed on both three-dimensional and two-dimensional models of the proximal femur derived from radiographic images, from which the mechanical stiffness may be predicted. It is examined whether the computed-stiffness outcome measures of two-dimensional finite element analysis of X-ray images (FEXI) and three-dimensional FEA of the proximal femur were more sensitive than aBMD to changes in trabecular bone density and femur geometry. It is assumed that if an outcome measure follows known trends with changes in density and geometric parameters, then an increased sensitivity will be indicative of an improved prediction of bone strength. All three outcome measures increased non-linearly with trabecular bone density, increased linearly with cortical shell thickness and neck width, decreased linearly with neck length, and were relatively insensitive to neck-shaft angle. For femoral head radius, aBMD was relatively insensitive, with two-dimensional FEXI and three-dimensional FEA demonstrating a non-linear increase and decrease in sensitivity, respectively. For neck anteversion, aBMD decreased non-linearly, whereas both two-dimensional FEXI and three-dimensional FEA demonstrated a parabolic-type relationship, with maximum stiffness achieved at an angle of approximately 15°. Multi-parameter analysis showed that all three outcome measures demonstrated their highest sensitivity to a change in cortical thickness. When changes in all input parameters were considered simultaneously, three- and two-dimensional FEA had statistically equal sensitivities (0.41±0.20 and 0.42±0.16 respectively, p = ns) that were significantly higher than the sensitivity of aBMD (0.24±0.07; p = 0.014 and 0.002 for three-dimensional and two-dimensional FEA respectively). This simulation study suggests that since mechanical integrity and FEA are inherently dependent upon anatomical geometry, FEXI stiffness, being derived from conventional two-dimensional radiographic images, may provide an improvement in the prediction of bone strength of the proximal femur over that currently provided by aBMD.
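The sensitivity values quoted in this abstract (e.g. 0.41±0.20 for three-dimensional FEA versus 0.24±0.07 for aBMD) compare the fractional response of each outcome measure to a fractional change in an input parameter. A common dimensionless formulation is sketched below as an assumption, since the abstract does not give the exact definition used.

```python
def normalized_sensitivity(outcome_base, outcome_new, param_base, param_new):
    """Dimensionless sensitivity: fractional change in the outcome
    measure per fractional change in the input parameter. An
    illustrative formulation, not necessarily the study's own."""
    d_out = (outcome_new - outcome_base) / outcome_base
    d_par = (param_new - param_base) / param_base
    return abs(d_out / d_par)

# A 10% stiffness rise in response to a 25% increase in cortical
# thickness gives a sensitivity of 0.4:
print(normalized_sensitivity(100.0, 110.0, 2.0, 2.5))
```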

Relevance:

20.00%

Publisher:

Abstract:

The last three decades have seen consumers’ environmental consciousness grow as the environment has moved into the mainstream. Our study of green marketing blog-site comments from the first half of 2009 identified thirteen prominent concepts: carbon, consumers, global and energy were the largest themes, while crisis, power, people, water, fuel, product, work, time, organic, content and interest were the others. Sub-issues were also identified, as this information is being driven by consumer-led social networks. While marketers hold some power, consumers hold the real influence for change. They want to drive change and, importantly, they have the power. Power to the people.