842 results for Tangible-intangible
Abstract:
This dissertation develops the model of a prototype system for the digital lodgement of spatial data sets with statutory bodies responsible for the registration and approval of land related actions under the Torrens Title system. Spatial data pertain to the location of geographical entities together with their spatial dimensions and are classified as point, line, area or surface. This dissertation deals with a sub-set of spatial data: land boundary data that result from the activities performed by surveying and mapping organisations for the development of land parcels. The prototype system has been developed, utilising an event-driven paradigm for the user interface, to exploit the potential of digital spatial data generated by electronic techniques. The system provides for the creation of a digital model of the cadastral network and dependent data sets for an area of interest from hard copy records. This initial model is calibrated on registered control and updated by field survey to produce an amended model. The field-calibrated model is then electronically validated to ensure it complies with standards of format and content. The prototype system was designed specifically to create a database of land boundary data for subsequent retrieval by land professionals for surveying, mapping and related activities. Data extracted from this database are utilised for subsequent field survey operations without the need to create an initial digital model of an area of interest. Statistical reporting of the differences found when subsequent initial and calibrated models are compared replaces the traditional checking of spatial data performed by a land registry office. Digital lodgement of survey data is fundamental to the creation of the database of accurate land boundary data. Creation of the database is in turn fundamental to the efficient integration of accurate spatial data about land generated by modern technology, such as global positioning systems and remote sensing and imaging, with land boundary information and other information held in Government databases. The prototype system developed provides for the delivery of accurate, digital land boundary data for the land registration process to ensure the continued maintenance of the integrity of the cadastre. Such data should also meet the more general and encompassing requirements of, and prove to be of tangible, longer term benefit to, the developing electronic land information industry.
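A minimal sketch of the statistical comparison step described above, assuming a simple point-id to coordinate mapping; the structures, names and figures are illustrative, not the thesis's actual schema:

```python
import math

def compare_models(initial, calibrated):
    """Report coordinate differences between an initial (hard-copy derived)
    model and its field-calibrated counterpart.

    initial, calibrated: dicts mapping point id -> (easting, northing);
    hypothetical structures, standing in for the thesis's data sets.
    """
    diffs = []
    for pid, (e0, n0) in initial.items():
        if pid not in calibrated:
            continue  # point absent from the field survey
        e1, n1 = calibrated[pid]
        diffs.append(math.hypot(e1 - e0, n1 - n0))
    if not diffs:
        return None
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"points": len(diffs), "max": max(diffs), "rms": rms}

# Illustrative single-point comparison (coordinates in metres)
print(compare_models(
    {"PM123": (500010.00, 6950020.00)},
    {"PM123": (500010.03, 6950019.96)},
))
```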
Abstract:
This paper describes Electronic Blocks, a new robot construction element designed to allow children as young as age three to build and program robotic structures. The Electronic Blocks encapsulate input, output and logic concepts in tangible elements that young children can use to create a wide variety of physical agents. The children are able to determine the behavior of these agents by their choice of blocks and the manner in which the blocks are connected. The Electronic Blocks allow children without any knowledge of mechanical design or computer programming to create and control physically embodied robots, and they facilitate the development of technological capability by enabling children to design, construct, explore and evaluate dynamic robotic systems. A study of four- and five-year-old children using the Electronic Blocks has demonstrated that the interface is well suited to young children. The complexity of the implementation is hidden from the children, leaving them free to autonomously explore the functionality of the blocks. As a consequence, children can move their focus beyond the technology itself to the construction process, and work on goals related to the creation of robotic behaviors and interactions. As a resource for robot building, the blocks have proved effective in encouraging children to create robot structures, and in allowing them to design and program robot behaviors.
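A minimal sketch of the input, logic and output chaining idea, with Python functions standing in for the physical blocks; the block names and behaviours are hypothetical, not the actual product's:

```python
# Hypothetical blocks: a child composes behaviour by stacking one input
# block, zero or more logic blocks, and one output block.
def touch_sensor(env):
    return env["touched"]                      # input block: senses the world

def not_block(signal):
    return not signal                          # logic block: inverts the signal

def motor(signal):
    return "drive" if signal else "stop"       # output block: acts on the world

def run_stack(env, blocks):
    """Propagate a signal down a stack of blocks, mimicking how stacked
    Electronic Blocks pass a signal from input through logic to output."""
    signal = blocks[0](env)
    for block in blocks[1:-1]:
        signal = block(signal)
    return blocks[-1](signal)

# "Drive unless touched": touch sensor -> NOT -> motor
print(run_stack({"touched": True}, [touch_sensor, not_block, motor]))  # -> stop
```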
Abstract:
Geographic information is increasingly being touted for use in research and industrial projects. While the technology is now available and affordable, there is a lack of easy-to-use software that takes advantage of geographic information. This is an important problem because users are often researchers or scientists with limited software skills; by providing applications that are easier to use, time and financial resources can be diverted from training to the actual research and development work. A solution to this problem must cater to user and research needs. In particular, it must allow for mobile operation in fieldwork, flexibility and customisability of data input, sharing of data with other tools, and collaborative capabilities for the usual teamwork environment. This thesis has developed a new architecture and data model to achieve such a solution. The result is the Mobile Collaborative Annotation framework, an implementation of the new architecture and data model. Mobile Collaborative Mapping implements the framework as a Web 2.0 mashup rich internet application and has proven to be an effective solution through its positive application to a case study with fieldwork scientists. This thesis has contributed to research into mobile computing, collaborative computing and geospatial systems by creating a simpler entry point to mobile geospatial applications, enabling simplified collaboration and providing tangible time savings.
Abstract:
Nitrous oxide (N2O) is a major greenhouse gas (GHG) product of intensive agriculture. Fertilizer nitrogen (N) rate is the best single predictor of N2O emissions in row-crop agriculture in the US Midwest. We use this relationship to propose a transparent, scientifically robust protocol that can be utilized by developers of agricultural offset projects for generating fungible GHG emission reduction credits for the emerging US carbon cap and trade market. By coupling predicted N2O flux with the recently developed maximum return to N (MRTN) approach for determining economically profitable N input rates for optimized crop yield, we provide the basis for incentivizing N2O reductions without affecting yields. The protocol, if widely adopted, could reduce N2O from fertilized row-crop agriculture by more than 50%. Although other management and environmental factors can influence N2O emissions, fertilizer N rate can be viewed as a single unambiguous proxy—a transparent, tangible, and readily manageable commodity. Our protocol addresses baseline establishment, additionality, permanence, variability, and leakage, and provides for producers and other stakeholders the economic and environmental incentives necessary for adoption of agricultural N2O reduction offset projects.
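A hedged sketch of the credit arithmetic implied above, assuming an illustrative exponential flux-to-N-rate curve and the AR4 100-year global warming potential of 298 for N2O; the protocol's actual functional form and coefficients are in the paper, and the numbers below are assumptions for demonstration only:

```python
import math

def n2o_flux(n_rate_kg_ha):
    """Annual N2O-N flux (kg N/ha/yr) as a function of fertilizer N rate.
    Exponential shape and coefficients are assumed, not the paper's."""
    return 0.6 * math.exp(0.0067 * n_rate_kg_ha)

def credit_t_co2e_per_ha(baseline_n, project_n, gwp_n2o=298):
    """CO2-equivalent offset credit from lowering the fertilizer N rate
    from a baseline rate to the economically optimal (MRTN) rate."""
    delta_n2o_n = n2o_flux(baseline_n) - n2o_flux(project_n)  # kg N2O-N/ha
    delta_n2o = delta_n2o_n * 44.0 / 28.0                     # convert N2O-N to N2O
    return delta_n2o * gwp_n2o / 1000.0                       # t CO2e/ha

# e.g. a conventional 200 kg N/ha reduced to an assumed MRTN rate of 160 kg N/ha
print(round(credit_t_co2e_per_ha(200, 160), 3))
```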
Abstract:
Many of the costs associated with greenfield residential development are apparent and tangible. For example, regulatory fees, government taxes, acquisition costs, selling fees, commissions and others are all relatively easily identified, since they represent actual costs incurred at a given point in time. By contrast, holding costs are not always immediately evident, since they characteristically lack visibility. One reason for this is that, for the most part, they accrue over time in an ever-changing environment. In addition, wide variations exist in development pipeline components: pipelines typically span anywhere from two to over sixteen years, even for projects located within the same geographical region. Determining the starting and end points for holding cost computation can also prove problematic. Furthermore, the choice between applying prevailing inflation rates, interest rates, or a combination of both over time adds further complexity. Although research is emerging in these areas, a review of the literature reveals that attempts to identify holding cost components are limited. Their quantification (in terms of relative weight or proportionate cost to a development project) is even less apparent; in fact, the computation and methodology behind the calculation of holding costs varies widely, and in some instances holding costs are ignored completely. In addition, it may be demonstrated that ambiguities exist in the inclusion of various elements of holding costs and in the assessment of their relative contribution. Yet their impact on housing affordability is widely acknowledged to be profound, and their quantification could maximise the opportunities for delivering affordable housing. This paper seeks to build on earlier investigations into the elements related to holding costs, providing theoretical modelling of the size of their impact, specifically on the end user. At this point the research relies upon quantitative data sets; however, additional qualitative analysis (not included here) will be relevant to account for certain variations between developers' expectations and the actual outcomes achieved. Although this research stops short of a regional or international comparison study, it results in an improved understanding of the relationship between holding costs, regulatory charges, and housing affordability.
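A minimal worked sketch of the compounding choice discussed above, assuming a single capital outlay and one blended annual rate; all figures are illustrative, not drawn from the paper's data sets:

```python
def holding_cost(outlay, annual_rate, years):
    """Finance/opportunity cost of capital tied up over the holding period,
    compounded annually. The rate may be an interest rate, an inflation
    rate, or a blend of both, per the methodological choice noted above."""
    return outlay * ((1 + annual_rate) ** years - 1)

# A $500,000 outlay held for 2 versus 16 years at an assumed 7% p.a.,
# showing how pipeline length drives the (often invisible) holding cost.
for years in (2, 16):
    print(years, "years:", round(holding_cost(500_000, 0.07, years)))
```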
Abstract:
In November 2009 the researcher embarked on a project aimed at reducing the amount of paper used by Queensland University of Technology (QUT) staff in their daily workplace activities. The key goal was to communicate to staff that excessive printing has a tangible and negative effect on their workplace and local environment. The research objective was to better understand what motivates staff towards more ecologically sustainable printing practices while still meeting the demands of their jobs. The current study builds on previous research which found that a single interface does not address the needs of all users when creating persuasive Human Computer Interaction (HCI) interventions targeting resource consumption. In response, the current study created and trialled software that communicates individual paper consumption in precise metrics. Based on preliminary research data, different metric sets have been defined to address the different motivations and beliefs of user archetypes, using descriptive and injunctive normative information.
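A minimal sketch of how descriptive and injunctive normative information might be composed into feedback; the archetype name, wording and pages-per-tree figure are hypothetical, not the study's actual metric sets:

```python
def feedback(pages_user, pages_peer_avg, archetype):
    """Compose a normative feedback message for a print-awareness interface.
    Descriptive norms report what peers actually do; injunctive norms
    signal approval or disapproval of the behaviour."""
    msg = (f"You printed {pages_user} pages this week; "
           f"the average in your area is {pages_peer_avg}.")  # descriptive norm
    if archetype == "eco-motivated":
        trees = pages_user / 8333.0   # rough sheets-per-tree figure, assumed
        msg += f" That is roughly {trees:.2f} trees."
    if pages_user > pages_peer_avg:
        msg += " Most of your colleagues print less."         # injunctive norm
    return msg

print(feedback(420, 310, "eco-motivated"))
```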
Abstract:
Most infrastructure project developments are complex in nature, particularly in the planning phase. During this stage, many vague alternatives are tabled, from the strategic to the operational level. Human judgement and decision making are characterised by biases, errors and the use of heuristics. These factors are intangible and hard to measure because they are subjective and qualitative in nature. The problem with human judgement becomes more complex when a group of people is involved: the variety of stakeholders may cause conflict due to differences in personal judgements, and the available alternatives increase the complexity of the decision making process. It is therefore desirable to find ways of enhancing the efficiency of decision making to avoid misunderstandings and conflict within organisations. Numerous attempts have been made to solve problems in this area by leveraging technologies such as decision support systems. However, most construction project management decision support systems concentrate only on model development and neglect fundamentals of computing such as requirements engineering, data communication, data management and human-centred computing. As a result, these decision support systems are complicated and less efficient in supporting the decision making of project team members. It is desirable for decision support systems to be simpler, to provide a better collaborative platform, to allow for efficient data manipulation, and to adequately reflect user needs. In this chapter, a framework for a more desirable decision support system environment is presented, and some key issues related to decision support system implementation are described.
Abstract:
The purpose of this thesis is to outline the relationship that existed in the past, and exists in the present, between Australians and the war graves and 'Memorials to the Missing' commemorations of Australians who died during the First World War. Their final resting places are scattered all over the world. They provide a tangible record of the sacrifice of men and women in the war, and represent the final result of the attempts by official agencies, such as the Imperial (and later Commonwealth) War Graves Commission and its agency representative, the Office of Australian War Graves, to commemorate them appropriately. The study follows the path of history from the death of an individual in the First World War, through burial; temporary grave or memorial commemoration; permanent commemoration; the family and public reaction to the deaths; how the official agencies of related Commonwealth governments dealt with the dead; and finally, how the Australian dead are represented on the battlefields of the world in the 21st century. Australia's war dead of the First World War are scattered around the globe in more than 40 countries; they are represented in war cemeteries and civil cemeteries, and listed on large 'Memorials to the Missing', which commemorate individuals without a known grave or final resting place.
Abstract:
The authors currently engage in two projects to improve human-computer interaction (HCI) designs that can help conserve resources. The projects explore motivation and persuasion strategies relevant to ubiquitous computing systems that bring real-time consumption data into the homes and hands of residents in Brisbane, Australia. The first project seeks to increase understanding among university staff of the tangible and negative effects that excessive printing has on the workplace and local environment. The second project seeks to shift attitudes toward domestic energy conservation through software and hardware that monitor real-time, in situ electricity consumption in homes across Queensland. The insights drawn from these projects will help develop resource consumption user archetypes, providing a framework linking people to differing interface design requirements.
Abstract:
This paper presents research in response to the environmental concerns we face today. In a search for a better method to manage spaces and building resources, which are consumed excessively through traditional top-down architectural solutions, the research began by speculating that building spaces and resources can be managed by designing architectural systems that encourage a bottom-up approach. In other words, this research investigates how to design systems that encourage occupants and users of buildings to actively understand, manage and customise their own spaces. Specific attention is paid to the participation of building users because, no matter how sophisticated the system is, the building will become as wasteful as conventional buildings if users cannot, or do not want to, utilise the system effectively. The research is still in its early stages. The intention of this paper is to provide a background to the issue, discuss research and projects relevant to, but not necessarily about, architecture, and introduce a number of hypotheses and investigations towards realising adaptable, participatory and sustainable environments for users.
Abstract:
Rationale, aims and objectives: Patient preference for interventions aimed at preventing in-hospital falls has not previously been investigated. This study aims to contrast the amount patients are willing to pay to prevent falls through six intervention approaches.
Methods: This was a cross-sectional willingness-to-pay (WTP), contingent valuation survey conducted among hospital inpatients (n = 125) during their first week on a geriatric rehabilitation unit in Queensland, Australia. Contingent valuation scenarios were constructed for six falls prevention interventions: a falls consultation, an exercise programme, a face-to-face education programme, a booklet and video education programme, hip protectors and a targeted, multifactorial intervention programme. The benefit to participants in terms of reduction in risk of falls was held constant (30% risk reduction) within each scenario.
Results: Participants valued the targeted, multifactorial intervention programme the highest [mean WTP (95% CI): AUD$268 ($240, $296)], followed by the falls consultation [$215 ($196, $234)], exercise [$174 ($156, $191)], face-to-face education [$164 ($146, $182)], hip protector [$74 ($62, $87)] and booklet and video education interventions [$68 ($57, $80)]. A 'cost of provision' bias was identified, which adversely affected the valuation of the booklet and video education intervention.
Conclusion: There may be considerable indirect and intangible costs associated with interventions to prevent falls in hospitals that can substantially affect patient preferences. These costs could substantially influence the ability of these interventions to generate a net benefit in a cost-benefit analysis.
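A minimal sketch of the mean-WTP and confidence-interval calculation reported for each scenario above, assuming a normal approximation; the responses below are made up, not the survey's data:

```python
import statistics as st

def mean_wtp_ci(responses, z=1.96):
    """Mean willingness-to-pay with a normal-approximation 95% CI,
    computed per contingent valuation scenario."""
    n = len(responses)
    mean = st.mean(responses)
    se = st.stdev(responses) / n ** 0.5   # standard error of the mean
    return round(mean, 2), (round(mean - z * se, 2), round(mean + z * se, 2))

# Hypothetical AUD responses for one scenario (e.g. the falls consultation)
wtp_falls_consultation = [180, 250, 300, 150, 220, 240, 200, 210]
print(mean_wtp_ci(wtp_falls_consultation))
```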
Abstract:
The chapters of this book form a persuasive chorus of social practices that advocate the use of music to build a capacity for resilience in individuals and groups. As a whole they exemplify music projects that share common features aligned with an ecological view of reform in health, education and social work systems. Internationally renowned and early career academics have collaborated with practitioners to sing ‘Songs of Resilience’; some of which are narratives that report on the effects of music practices for a general population, and some are based on a specific approach, genre or service. Others are quite literally ‘songs’ that demonstrate aspects of resilience in action. The book makes the connection between music and resilience explicit by posing the following questions—Do music projects in education, health and social services build a measurable capacity for resilience amongst individuals? Can we replicate these projects’ outcomes to develop a capacity for resilience in diverse cultural groups? Does shared use of the term ‘resilience’ help to secure funding for innovative musical activities that provide tangible health, education and social outcomes?
Abstract:
“What did you think you were doing?” was the question posed to me by the conference organizers, as the inventor and constructor of the first working Tangible Interfaces over 40 years ago. I think the question was intended to encourage me to talk about the underlying ideas and intentionality rather than describe an endless sequence of electronic bricks, and that is what I shall do in this presentation. In the sixties the prevalent idea for a graphics interface was an analogue of sketching, which was somehow to be understood by the computer as three-dimensional form. I rebelled against this notion, for reasons I will explain in the presentation, and instead came up with tangible, physical, three-dimensional intelligent objects. I called these first prototypes “Intelligent Physical Modelling Systems”, which is a really dumb name for an obvious concept. I am eternally grateful to Hiroshi Ishii for coining the term “Tangible User Interfaces”: the same idea but with a much smarter name. Another motivator was user involvement in the design process, which led to the Generator project (1979) with Cedric Price for the world's first intelligent building, capable of organizing itself in response to the appetites of its users. The working model of that project is in MoMA. The same motivation led to a self-builders' design kit (1980) for Walter Segal, which enabled self-builders to design their own houses. And indeed, as the organizers' question implied, the motivation and intentionality of these projects developed over the years in step with advancing technology. The speaker will attempt to articulate these changes with medical, psychological and educational examples, much of this later work stemming from the Media Lab where we are talking. Related topics such as “tangible thinking” and “intelligent teacups” will be introduced, and the presentation will end with some speculations about the future. The presentation will be given against a background of images of early prototypes, many of which have never been previously published.
Abstract:
As a result of a broad invitation extended by Professor Martin Betts, Executive Dean of the Faculty of Built Environment and Engineering, to the community of interest at QUT, a cross-disciplinary collaborative workshop was conducted to contribute ideas in response to the Government of India's urgent requirement to implement a program to re-house slum dwellers. This is a complex problem facing the Indian Ministry of Housing: not only does the government aspire to eradicate existing slum conditions and achieve tangible results within five years, but it must also ensure that slums do not form in the future. The workshop focused on technological innovation in construction to deliver transformation from the current unsanitary and overcrowded informal urban settlements to places that provide the economically weaker sections of Indian society with healthy, environmentally sustainable, economically viable mass housing that supports successful urban living. The workshop was conducted as a two-part process. Initially, QUT academics from diverse fields shared current research and provided technical background to contextualise the challenge at a pre-workshop briefing session. This was followed by a one-day workshop during which participants worked intensively in multi-disciplinary groups through a series of exercises to develop innovative approaches to the complex problem of slum redevelopment. Dynamic, compressed work sessions, interspersed with cross-functional review and feedback by the whole group, took place throughout the day. Reviews emphasised testing the concepts for their level of complexity and likelihood of success. The two-stage workshop process achieved several objectives:
- inspired a sense of shared purpose amongst a diverse group of academics;
- built participants' knowledge of each other's capacity;
- engaged multi-disciplinary teams in an innovative design research process;
- built participants' confidence in the collaborative process;
- demonstrated that collaborative problem solving can create solutions that represent transformative change; and
- developed a framework for how workable solutions might be developed for the program through follow-up workshops and charrettes of a similar nature, involving stakeholders drawn from the context of the slum housing program management.
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance; capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration; to be calibrated using data acquired at those locations; and to produce output that could be validated against data acquired at the same sites, so that the outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they may be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for their calibration and validation over a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled: some unusual manoeuvres were considered unwarranted to model. Nevertheless, the models developed contain the principal processes of freeway operations, merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream (the kerb-lane traffic) exercises only a limited priority over the minor stream (the on-ramp traffic), and theory was established to account for this behaviour. Kerb-lane drivers were also found to change to the median lane where possible to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.
Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections are located upstream of on-ramps rather than signalised intersections, and smaller still when ramp metering is installed; smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
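As a hedged illustration of the gap acceptance machinery described above, the sketch below computes a minor-stream (on-ramp) entry capacity under Cowan's M3 headway distribution, P(h > t) = alpha * exp(-lambda * (t - delta)) for t >= delta, with lambda = alpha * q / (1 - delta * q). It uses the standard absolute-priority capacity formula; the thesis's net limited priority treatment would further adjust the effective major-stream flow, and all numeric values are illustrative rather than the thesis's calibrated parameters:

```python
import math

def m3_capacity(q_major, alpha, delta, t_c, t_f):
    """Minor-stream (on-ramp) entry capacity in veh/s under gap acceptance,
    with major-stream headways following Cowan's M3 distribution.
    Standard absolute-priority form; a limited priority model would
    reduce the effective major-stream flow.

    q_major : major-stream (kerb-lane) flow, veh/s (requires q_major < 1/delta)
    alpha   : proportion of free (unbunched) headways
    delta   : minimum bunched headway, s
    t_c     : critical gap, s
    t_f     : follow-on time, s
    """
    lam = alpha * q_major / (1.0 - delta * q_major)  # decay rate of free headways
    return (q_major * alpha * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

# Illustrative values: kerb-lane flow 1200 veh/h, 60% free headways,
# 1 s minimum headway, follow-on time 1.1 s and critical gap 2.0 s
# (consistent with t_f in 1-1.2 s and t_c between t_f and t_f + 1 s).
q = 1200.0 / 3600.0
print(round(m3_capacity(q, alpha=0.6, delta=1.0, t_c=2.0, t_f=1.1) * 3600), "veh/h")
```

With the calibrated ranges quoted above, small shifts in t_c and t_f move the predicted capacity noticeably, which is why their further calibration is flagged as future work.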