857 results for Mapping And Monitoring


Relevance: 90.00%

Publisher:

Abstract:

The trans-locative potential of the Internet has driven the design of many online applications. Online communities largely cluster around topics of interest, which take precedence over participants’ geographical locations. The site of production is often disregarded when creative content appears online. However, for some, a sense of place is a defining aspect of creativity. Yet environments that focus on the display and sharing of regionally situated content have, so far, been largely overlooked. Recent developments in geo-technologies have precipitated the emergence of a new field of interactive media. Entitled locative media, it emphasizes the geographical context of media. This paper argues that we might combine practices of locative media (experiential mapping and geo-spatial annotation) with aspects of online participatory culture (uploading, file-sharing and search categorization) to produce online applications that support geographically ‘located’ communities. It discusses the design considerations and possibilities of this convergence, making reference to an example, OurPlace 3G to 3D, which has to date been developed as a prototype. It goes on to discuss the benefits and potential uses of such convergent applications, including the co-production of spatial-temporal narratives of place.

Relevance: 90.00%

Publisher:

Abstract:

This project aims to develop a methodology for designing, and conducting a systems engineering analysis of, a solar-powered airplane able to fly continuously, day and night, propelled solely by solar energy, for one week, carrying a 0.25 kg payload consuming 0.5 W, without fuel or pollution. An airplane able to fly autonomously for many days could find many applications, including coastal or border surveillance; atmospheric and weather research and prediction; environmental, forestry, agricultural, and oceanic monitoring; and imaging for the media and real-estate industries. Additional advantages of solar airplanes are their low cost and the simplicity with which they can be launched. For example, where there is a risk of forest fires during a warm and dry period, swarms of solar airplanes, easily launched by hand, could efficiently monitor a large area and rapidly report any outbreak of fire. This would allow fast intervention and thus reduce the cost of such disasters, in terms of human and material losses. At larger scales, solar high-altitude long-endurance (HALE) platforms are expected to play a major role as communication relays and could advantageously replace satellites in the near future.

Relevance: 90.00%

Publisher:

Abstract:

RatSLAM is a vision-based SLAM system based on extended models of the rodent hippocampus. RatSLAM creates environment representations that can be processed by the experience mapping algorithm to produce maps suitable for goal recall. The experience mapping algorithm also allows RatSLAM to map environments many times larger than could be achieved with a one-to-one correspondence between the map and environment, by reusing the RatSLAM maps to represent multiple sections of the environment. This paper describes experiments investigating the effects of the environment-representation size ratio and visual ambiguity on mapping and goal navigation performance. The experiments demonstrate that system performance is weakly dependent on either parameter in isolation, but strongly dependent on their joint values.

Relevance: 90.00%

Publisher:

Abstract:

When asking the question, "How can institutions design science policies for the benefit of decision makers?" Sarewitz and Pielke [Sarewitz, D., Pielke Jr., R.A., this issue. The neglected heart of science policy: reconciling supply of and demand for science. Environ. Sci. Policy 10] posit the idea of "reconciling supply and demand of science" as a conceptual tool for assessment of science programs. We apply the concept to the U.S. Department of Agriculture's (USDA) carbon cycle science program. By evaluating the information needs of decision makers, or the "demand", along with the supply of information by the USDA, we can ascertain where matches between supply and demand exist, and where science policies might miss opportunities. We report the results of contextual mapping and of interviews with scientists at the USDA to evaluate the production and use of current agricultural global change research, which has the stated goal of providing "optimal benefit" to decision makers on all levels. We conclude that the USDA possesses formal and informal mechanisms by which scientists evaluate the needs of users, ranging from individual producers to Congress and the President. National-level demands for carbon cycle science evolve as national and international policies are explored. Current carbon cycle science is largely derived from those discussions and thus anticipates the information needs of producers. However, without firm agricultural carbon policies, such information is currently unimportant to producers. (C) 2006 Elsevier Ltd. All rights reserved.

Relevance: 90.00%

Publisher:

Abstract:

Diabetic peripheral neuropathy (DPN) is one of the most debilitating complications of diabetes. DPN is a major cause of foot ulceration and lower limb amputation. Early diagnosis and management is a key factor in reducing morbidity and mortality. Current techniques for clinical assessment of DPN are relatively insensitive for detecting early disease or involve invasive procedures such as skin biopsies. There is a need for less painful, non-invasive and safe evaluation methods. Eye care professionals already play an important role in the management of diabetic retinopathy; however, recent studies have indicated that the eye may also be an important site for the diagnosis and monitoring of neuropathy. Corneal nerve morphology has been shown to be a promising marker of diabetic neuropathy occurring elsewhere in the body, and emerging evidence tentatively suggests that retinal anatomical markers and a range of functional visual indicators could similarly provide useful information regarding neural damage in diabetes, although this line of research is, as yet, less well established. This review outlines the growing body of evidence supporting a potential diagnostic role for retinal structure and visual functional markers in the diagnosis and monitoring of peripheral neuropathy in diabetes.

Relevance: 90.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) completed an Australian National Data Service (ANDS) funded “Seeding the Commons Project” to contribute metadata to Research Data Australia. The project employed two Research Data Librarians from October 2009 to July 2010. Technical support for the project was provided by QUT’s High Performance Computing and Research Support Specialists.

The project identified and described QUT’s category 1 (ARC/NHMRC) research datasets. Metadata for the research datasets was stored in QUT’s Research Data Repository (Arcitecta Mediaflux). Metadata suitable for inclusion in Research Data Australia was made available to the Australian Research Data Commons (ARDC) in RIF-CS format.

Several workflows and processes were developed during the project. 195 data interviews took place in connection with 424 separate research activities, which resulted in the identification of 492 datasets.

The project had a high level of technical support from QUT’s High Performance Computing and Research Support Specialists, who developed the Research Data Librarian interface to the data repository that enabled manual entry of interview data and dataset metadata, and the creation of relationships between repository objects. The Research Data Librarians mapped the QUT metadata repository fields to RIF-CS, and an application was created by the HPC and Research Support Specialists to generate RIF-CS files for harvest by the Australian Research Data Commons (ARDC).

This poster will focus on the workflows and processes established for the project, including:

• Interview processes and instruments
• Data ingest from existing systems (including mapping to RIF-CS)
• Data entry and the Data Librarian interface to Mediaflux
• Verification processes
• Mapping and creation of RIF-CS for the ARDC
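The field mapping described above can be sketched as a small transformation from a repository record to a RIF-CS registry object. This is an illustrative sketch only, not QUT's actual application: the element names follow the public RIF-CS collection schema, but the record values, key and source URL are hypothetical.

```python
# Illustrative sketch only: render one dataset record as a minimal RIF-CS
# registry object. Element names follow the RIF-CS collection schema;
# the record values, key and source URL below are hypothetical.
import xml.etree.ElementTree as ET

RIFCS_NS = "http://ands.org.au/standards/rif-cs/registryObjects"

def to_rifcs(record):
    """Map a repository metadata record onto a minimal RIF-CS document."""
    ET.register_namespace("", RIFCS_NS)  # serialise without a prefix
    root = ET.Element(f"{{{RIFCS_NS}}}registryObjects")
    obj = ET.SubElement(root, f"{{{RIFCS_NS}}}registryObject",
                        group=record["group"])
    ET.SubElement(obj, f"{{{RIFCS_NS}}}key").text = record["key"]
    ET.SubElement(obj, f"{{{RIFCS_NS}}}originatingSource").text = record["source"]
    coll = ET.SubElement(obj, f"{{{RIFCS_NS}}}collection", type="dataset")
    name = ET.SubElement(coll, f"{{{RIFCS_NS}}}name", type="primary")
    ET.SubElement(name, f"{{{RIFCS_NS}}}namePart").text = record["title"]
    return ET.tostring(root, encoding="unicode")

xml_out = to_rifcs({
    "group": "Queensland University of Technology",
    "key": "qut.edu.au/dataset/example-492",            # hypothetical key
    "source": "https://researchdatafinder.qut.edu.au",  # hypothetical URL
    "title": "Example research dataset",
})
```

In a harvesting workflow, records generated this way would be exposed for periodic OAI-PMH-style collection by the ARDC rather than pushed.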

Relevance: 90.00%

Publisher:

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that the progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three dimensional optical medium; either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available. 
Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two dimensional pages of data are recorded into a photorefractive crystal, as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes and a CCD camera can be used to monitor the readout beam, and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal, lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask which contains a pattern of the information to be stored, illuminates the lithium niobate crystal and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing. Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is by using thermal fixing. Here the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium.
This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered.
The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the time at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to an application in image scrambling or cryptography for optical information storage. A two dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however this type of medium would also remove the degradation property of the patterns and the subsequent recovery process. To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible since the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with the stripes of smaller widths.
As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm, for accurate and reliable pattern storage.
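The oscillation-counting approach to temperature measurement described above can be sketched numerically: count the local maxima of the monitored intensity trace and multiply by the temperature change per oscillation. This is an illustrative sketch; the calibration constant `K_PER_OSC` is hypothetical, as the real value depends on the crystal's birefringence, thickness and the probe wavelength.

```python
# Hedged sketch of the oscillation-counting temperature measurement:
# count interior local maxima in the monitored intensity trace and
# multiply by the temperature change per oscillation. K_PER_OSC is a
# hypothetical calibration constant.
import math

K_PER_OSC = 1.2  # kelvin per full intensity oscillation (hypothetical)

def count_oscillations(intensity):
    """Count full oscillations as the number of interior local maxima."""
    peaks = 0
    for i in range(1, len(intensity) - 1):
        if intensity[i] > intensity[i - 1] and intensity[i] >= intensity[i + 1]:
            peaks += 1
    return peaks

def temperature_change(intensity, k_per_osc=K_PER_OSC):
    """Estimate the temperature change that produced an intensity trace."""
    return count_oscillations(intensity) * k_per_osc

# Synthetic trace with five full oscillations over 100 samples.
trace = [math.sin(2 * math.pi * t / 20) for t in range(100)]
```

With an expanded beam, the same counting would simply be applied per pixel region to map temperature gradients across the crystal.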

Relevance: 90.00%

Publisher:

Abstract:

This article reports on a research program that has developed new methodologies for mapping the Australian blogosphere and tracking how information is disseminated across it. The authors improve on conventional web crawling methodologies in a number of significant ways: First, the authors track blogging activity as it occurs, by scraping new blog posts when such posts are announced through Really Simple Syndication (RSS) feeds. Second, the authors use custom-made tools that distinguish between the different types of content and thus allow them to analyze only the salient discursive content provided by bloggers. Finally, the authors are able to examine these better quality data using both link network mapping and textual analysis tools, to produce both cumulative longer-term maps of interlinkages and themes, and specific shorter-term snapshots of current activity that indicate current clusters of heavy interlinkage and highlight their key themes. In this article, the authors discuss findings from a yearlong observation of the Australian political blogosphere, suggesting that Australian political bloggers consistently address current affairs, but interpret them differently from mainstream news outlets. The article also discusses the next stage of the project, which extends this approach to an examination of other social networks used by Australians, including Twitter, YouTube, and Flickr. This adaptation of the methodology moves away from narrow models of political communication, and toward an investigation of everyday and popular communication, providing a more inclusive and detailed picture of the Australian networked public sphere.
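The scraping step described above (detect posts announced via RSS, keep only unseen items) can be sketched as follows. This is not the authors' actual toolchain; the sample feed and the deduplication-by-link strategy are illustrative assumptions.

```python
# Illustrative sketch (not the authors' actual toolchain): parse an RSS 2.0
# feed and keep only posts whose <link> has not been seen before, so each
# scrape captures just the new blogging activity. The sample feed is made up.
import xml.etree.ElementTree as ET

def new_posts(rss_xml, seen_links):
    """Return (title, link) pairs for items not yet in seen_links."""
    root = ET.fromstring(rss_xml)
    fresh = []
    for item in root.iter("item"):
        title = item.findtext("title")
        link = item.findtext("link")
        if link and link not in seen_links:
            seen_links.add(link)
            fresh.append((title, link))
    return fresh

SAMPLE_FEED = """<rss version="2.0"><channel>
<item><title>Post A</title><link>http://example.org/a</link></item>
<item><title>Post B</title><link>http://example.org/b</link></item>
</channel></rss>"""

seen = {"http://example.org/a"}        # already scraped on a prior pass
posts = new_posts(SAMPLE_FEED, seen)   # only Post B is new
```

Polling feeds this way, rather than re-crawling whole sites, is what lets activity be tracked close to the moment of posting.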

Relevance: 90.00%

Publisher:

Abstract:

The first use of computing technologies and the development of land use models in order to support decision-making processes in urban planning date back to the mid-20th century. The main thrust of computing applications in urban planning is their contribution to sound decision-making and planning practices. During the last couple of decades many new computing tools and technologies, including geospatial technologies, have been designed to enhance planners' capability in dealing with complex urban environments and planning for prosperous and healthy communities. This chapter, therefore, examines the role of information technologies, particularly internet-based geographic information systems, as decision support systems to aid public participatory planning. The chapter discusses challenges and opportunities for the use of internet-based mapping applications and tools in collaborative decision-making, and introduces a prototype internet-based geographic information system that is developed to integrate public-oriented interactive decision mechanisms into urban planning practice. This system, referred to as the 'Community-based Internet GIS' model, incorporates advanced information technologies, distance learning, sustainable urban development principles and community involvement techniques in decision-making processes, and has been piloted in Shibuya, Tokyo, Japan.

Relevance: 90.00%

Publisher:

Abstract:

Overloaded truck traffic is a significant problem on highways around the world. In developing countries in particular, overloaded truck traffic causes large amounts of unexpected road-maintenance expenditure because of premature pavement damage. Overloaded truck traffic is a common phenomenon in developing countries because of inefficient road management and monitoring systems. According to the available literature, many developing countries are facing the same problem: economic loss caused by the existence of overloaded trucks in the traffic stream. This paper summarizes the available literature, news reports, journal articles and traffic research regarding overloaded traffic. It examines the issue of overloading and the strategies and legislation used in developed countries.

Relevance: 90.00%

Publisher:

Abstract:

In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented into the public health sector in Australia with a focus on rewarding quality, and it is unique in that it has a large state-wide focus and includes 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts including the identification of clinical indicators that met the set criteria of: high disease burden, a well-defined single diagnostic group or intervention, significant variation in clinical outcomes and/or practices, a good evidence base, and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme.
Three key studies were undertaken to ascertain responses to the key research questions. Firstly, a survey of clinicians was undertaken to examine levels of knowledge and understanding and their attitudes to the scheme. Secondly, the study sought to apply Statistical Process Control (SPC) to the process indicators to assess if this enhanced the scheme, and a third study examined a simple economic cost analysis. The CPIP survey elicited 192 clinician respondents. Over 70% of these respondents were supportive of the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey that identified positive attitudes in 6 of the 7 domains (including impact, awareness and understanding, and clinical relevance), all scored positive across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness for the CPIP scheme. This evaluative research study supports a previously identified need in the literature for a phased introduction of Pay for Performance (P4P) type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and direct movement towards more rigorous ‘pay for performance’ targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over $5 million, from a potential $10 million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model, and despite issues being identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000) as opposed to funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks that include the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
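The use of Statistical Process Control as a trending tool can be sketched with a simple p-chart: an indicator's monthly compliance proportion is flagged when it falls outside three-sigma control limits. The figures below are hypothetical, not CPIP data.

```python
# Hedged sketch of SPC applied to a process indicator: a p-chart flags
# months whose compliance proportion breaches three-sigma control limits.
# The monthly figures and sample size are hypothetical, not CPIP data.
import math

def p_chart_limits(proportions, n):
    """Centre line and three-sigma limits for a proportion indicator
    observed over samples of size n."""
    p_bar = sum(proportions) / len(proportions)
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lower = max(0.0, p_bar - 3 * sigma)
    upper = min(1.0, p_bar + 3 * sigma)
    return p_bar, lower, upper

def out_of_control(proportions, n):
    """Indices of months that breach the control limits."""
    _, lower, upper = p_chart_limits(proportions, n)
    return [i for i, p in enumerate(proportions) if p < lower or p > upper]

monthly = [0.82, 0.85, 0.80, 0.84, 0.55, 0.83]  # hypothetical compliance rates
flagged = out_of_control(monthly, n=60)         # month 4 breaches the limits
```

Flagging special-cause months this way is what allows early identification of indicator weakness, rather than reacting to a raw month-on-month dip.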

Relevance: 90.00%

Publisher:

Abstract:

Aim: To review the management of heart failure in patients not enrolled in specialist multidisciplinary programs. Method: A prospective clinical audit of patients admitted to hospital with either a current or past diagnosis of heart failure and not enrolled in a specialist heart failure program or under the direct care of the cardiology unit. Results: 81 eligible patients were enrolled (1 August to 1 October 2008). The median age was 81 ± 9.4 years and 48% were male. Most patients (63%) were in New York Heart Association Class II or Class III heart failure. On discharge, 59% of patients were prescribed angiotensin converting enzyme inhibitors and 43% were prescribed beta-blockers. During hospitalisation, 8.6% of patients with a past diagnosis of heart failure were started on an angiotensin converting enzyme inhibitor and 4.9% on a beta-blocker. There was evidence of suboptimal dosage on admission and discharge for angiotensin converting enzyme inhibitors (19% and 7.4%) and beta-blockers (29% and 17%). The results compared well with international reports regarding the under-treatment of heart failure. Conclusion: The demonstrated practice gap provides excellent opportunities for the involvement of pharmacists to improve the continuation of care for heart failure patients discharged from hospital in the areas of medication management review, dose titration and monitoring.

Relevance: 90.00%

Publisher:

Abstract:

In this paper we describe a body of work aimed at extending the reach of mobile navigation and mapping. We describe how running topological and metric mapping and pose estimation processes concurrently, using vision and laser ranging, has produced a full six-degree-of-freedom outdoor navigation system. It is capable of producing intricate three-dimensional maps over many kilometers and in real time. We consider issues concerning the intrinsic quality of the built maps and describe our progress towards adding semantic labels to maps via scene de-construction and labeling. We show how our choices of representation, inference methods and use of both topological and metric techniques naturally allow us to fuse maps built from multiple sessions with no need for manual frame alignment or data association.

Relevance: 90.00%

Publisher:

Abstract:

Background: Providing ongoing family-centred support is an integral part of childhood cancer care. For families living in regional and remote areas, opportunities to receive specialist support are limited by the availability of health care professionals and accessibility, which is often reduced due to distance, time, cost and transport. The primary aim of this work is to investigate the cost-effectiveness of videotelephony to support regional and remote families returning home for the first time with a child newly diagnosed with cancer. Methods/design: We will recruit 162 paediatric oncology patients and their families to a single centre randomised controlled trial. Patients from regional and remote areas, classified by an Accessibility/Remoteness Index of Australia (ARIA+) greater than 0.2, will be randomised to a videotelephone support intervention or a usual support control group. Metropolitan families (ARIA+ ≤ 0.2) will be recruited as an additional usual support control group. Families allocated to the videotelephone support intervention will have access to usual support plus education, communication, counselling and monitoring with specialist multidisciplinary team members via a videotelephone service for a 12-week period following first discharge home. Families in the usual support control group will receive standard care, i.e., specialist multidisciplinary team members provide support either face-to-face during inpatient stays, outpatient clinic visits or home visits, or via telephone for families who live far away from the hospital. The primary outcome measure is parental health related quality of life as measured using the Medical Outcome Survey (MOS) Short Form SF-12 at baseline, 4 weeks, 8 weeks and 12 weeks.
The secondary outcome measures are: parental informational and emotional support; parental perceived stress; parent reported patient quality of life and parent reported sibling quality of life; parental satisfaction with care; cost of providing improved support; health care utilisation; and financial burden for families. Discussion: This investigation will establish the feasibility, acceptability and cost-effectiveness of using videotelephony to improve the clinical and psychosocial support provided to regional and remote paediatric oncology patients and their families.

Relevance: 90.00%

Publisher:

Abstract:

Crisis holds the potential for profound change in organizations and industries. The past 50 years of crisis management highlight key shifts in crisis practice, creating opportunities for multiple theories and research tracks. Defining crises such as Tylenol, the Exxon Valdez, and the September 11 terrorist attacks have influenced or challenged the principles of best practice of crisis communication in public relations. This study traces the development of crisis process and practice by identifying shifts in crisis research and models and mapping these against key management theories and practices. The findings define three crisis domains: crisis planning, building and testing predictive models, and mapping and measuring external environmental influences. These crisis domains mirror but lag the evolution of management theory, suggesting challenges for researchers to reshape the research agenda to close the gap and lead the next stage of development in the field of crisis communication for effective organizational outcomes.