848 results for incomplete information
Abstract:
With an increasing level of collaboration amongst researchers, software developers and industry practitioners over the past three decades, building information modelling (BIM) is now recognized as an emerging technological and procedural shift within the architecture, engineering and construction (AEC) industry. BIM is not only considered a way to make a profound impact on the professions of AEC, but is also regarded as an approach to help the industry develop new ways of thinking and practice. Despite the widespread development and recognition of BIM, a succinct and systematic review of existing BIM research and achievements is scarce. It is also necessary to take stock of existing applications and have a fresh look at where BIM should be heading and how it can benefit from the advances being made. This paper first presents a review of BIM research and achievements in the AEC industry. A number of suggestions are then made for future research in BIM. This paper maintains that the value of BIM during the design and construction phases has been well documented over the last decade, and that new research needs to expand the level of development and analysis from the design/build stage to post-construction and facility asset management. New research in BIM could also move beyond the traditional building type to managing a broader range of facilities and built assets and providing preventative maintenance schedules for sustainable and intelligent buildings.
Abstract:
This paper raises some questions about teaching and teacher education in the social sciences in response to the decision to implement a national curriculum in Australia. In particular, it contends that the decision to focus on discipline-specific knowledge in the social sciences will not necessarily meet the hopes of the Melbourne Declaration and deliver a 21st-century curriculum that prepares students for the future. In doing so, it suggests that social educators need to engage with the broader discourse and political context shaping the push for curriculum reform in Australia, and makes reference to the marginalisation of civics and citizenship education in the latest draft of the Australian Curriculum: History.
Abstract:
We advance the proposition that dynamic stochastic general equilibrium (DSGE) models should not only be estimated and evaluated with full information methods. These require that the complete system of equations be specified properly. Some limited information analysis, which focuses upon specific equations, is therefore likely to be a useful complement to full system analysis. Two major problems occur when implementing limited information methods. These are the presence of forward-looking expectations in the system as well as unobservable non-stationary variables. We present methods for dealing with both of these difficulties, and illustrate the interaction between full and limited information methods using a well-known model.
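To make the contrast concrete, a limited-information approach typically estimates one forward-looking equation at a time, instrumenting the expectation rather than solving the whole system. The relation below is a schematic illustration of this kind of single-equation (GMM/IV) estimation, not an equation taken from the model the authors study:

\[
  y_t = \alpha\,\mathbb{E}_t[y_{t+1}] + \beta x_t + u_t,
  \qquad
  \mathbb{E}\big[\,z_t\,(y_t - \alpha y_{t+1} - \beta x_t)\,\big] = 0,
\]

where replacing \(\mathbb{E}_t[y_{t+1}]\) with the realised \(y_{t+1}\) introduces an expectational error that is assumed orthogonal to instruments \(z_t\) dated \(t\) or earlier.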
Contextualizing the tensions and weaknesses of information privacy and data breach notification laws
Abstract:
Data breach notification laws have detailed numerous failures relating to the protection of personal information that have blighted both corporate and governmental institutions. There are obvious parallels between data breach notification and information privacy law, as both involve the protection of personal information. However, a closer examination of both laws reveals conceptual differences that give rise to vertical tensions between the two laws and shared horizontal weaknesses within them. Tensions emanate from conflicting approaches to the implementation of information privacy law, which result in different regimes and different types of protections. Shared weaknesses arise from an overt focus on specified types of personal information, which results in ‘one size fits all’ legal remedies. The author contends that a greater contextual approach, one which promotes the importance of social context, is required, and highlights the effect that contextualization could have on both laws.
Abstract:
The concept of sustainable urban development has been pushed to the forefront of policy-making and politics as the world wakes up to the impacts of climate change and the effects of modern urban lifestyles. Today, sustainable development has become a very prominent element in the day-to-day debate on urban policy and in the expression of that policy in urban planning and development decisions. As a result, during the last few years sustainable development automation applications, such as sustainable urban development decision support systems, have become popular tools, as they offer new opportunities for local governments to realise their sustainable development agendas. This chapter explores a range of issues associated with the application of information and communication technologies and decision support systems in the process of underpinning sustainable urban development. The chapter considers how information and communication technologies can be applied to enhance urban planning, raise environmental awareness, share decisions and improve public participation. It introduces and explores three web-based geographical information systems projects as examples of best practice. These systems are developed as support tools to include public opinion in the urban planning and development processes, and to provide planners with comprehensive tools for the analysis of sustainable urban development variants in order to prepare the best plans for constructing sustainable urban communities and futures.
Abstract:
Most information retrieval (IR) models treat the presence of a term within a document as an indication that the document is somehow "about" that term; they do not take into account when a term might be explicitly negated. Medical data, by its nature, contains a high frequency of negated terms, e.g. "review of systems showed no chest pain or shortness of breath". This paper presents a study of the effects of negation on information retrieval. We present a number of experiments to determine whether negation has a significant negative effect on IR performance and whether language models that take negation into account might improve performance. We use a collection of real medical records as our test corpus. Our findings are that negation has some effect on system performance, but this will likely be confined to domains such as medical data where negation is prevalent.
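As a rough illustration of how an indexer can be made negation-aware, the sketch below marks terms that fall within a short window after a negation cue, so that "no chest pain" is not indexed as an ordinary occurrence of "chest pain". The cue list, window size and NEG_ prefix are illustrative assumptions, not the language models evaluated in the paper.

# Minimal sketch: mark terms that follow a negation cue so an indexer can
# treat "no chest pain" differently from "chest pain". Cue list and window
# size are assumptions for illustration only.
NEGATION_CUES = {"no", "not", "denies", "denied", "without"}
WINDOW = 3  # tokens after a cue treated as negated

def mark_negated_terms(text: str) -> list[str]:
    marked, negate_left = [], 0
    for token in text.lower().split():
        if token in NEGATION_CUES:
            negate_left = WINDOW
            continue
        marked.append("NEG_" + token if negate_left > 0 else token)
        negate_left = max(0, negate_left - 1)
    return marked

print(mark_negated_terms("review of systems showed no chest pain or shortness of breath"))
# ['review', 'of', 'systems', 'showed', 'NEG_chest', 'NEG_pain', 'NEG_or', 'shortness', 'of', 'breath']

A real system would also terminate the negation scope at conjunctions and punctuation, which this toy version deliberately ignores.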
Abstract:
The development of locally based healthcare initiatives, such as community health coalitions that focus on capacity-building programs and multi-faceted responses to long-term health problems, has become an increasingly important part of the public health landscape. As a result of their complexity and the level of investment involved, it has become necessary to develop innovative ways to help manage these new healthcare approaches. Geographical Information Systems (GIS) have been suggested as one of the innovative approaches that will allow community health coalitions to better manage and plan their activities. The focus of this paper is to provide a commentary on the use of GIS as a tool for community coalitions and to discuss some of the potential benefits and issues surrounding the development of these tools.
Informed learning: a pedagogical construct attending simultaneously to information use and learning.
Abstract:
The idea of informed learning, applicable in academic, workplace and community settings, has been derived largely from a program of phenomenographic research in the field of information literacy, which has illuminated the experience of using information to learn. Informed learning is about simultaneous attention to information use and learning, where both information and learning are considered to be relational; it is built upon a series of key concepts such as second-order perspective, simultaneity, awareness, and relationality. Informed learning also relies heavily on reflection as a strategy for bringing about learning. As a pedagogical construct, informed learning supports inclusive curriculum design and implementation. This paper reports aspects of the informed learning research agenda currently being pursued at the Queensland University of Technology (QUT). The first part elaborates the idea of informed learning, examines the key concepts underpinning this pedagogical construct, and explains its emergence from the research base of the QUT Information Studies research team. The second presents a case that demonstrates the ongoing development of informed learning theory and practice, through the development of inclusive informed learning for a culturally diverse higher education context.
Abstract:
This article explores the quality of accounting information in listed family firms. The authors exploit the features of the Italian equity market, characterized by high ownership concentration across all types of firms, to disentangle the effects of family ownership from those of other major blockholders on the quality of accounting information. The findings document that family firms convey financial information of higher quality compared to their nonfamily peers. Furthermore, the authors provide evidence that the determinants of accounting quality differ across family and nonfamily firms.
Abstract:
This paper presents a novel method for remaining useful life prediction using the Elliptical Basis Function (EBF) network and a Markov chain. The EBF structure is trained by a modified Expectation-Maximization (EM) algorithm in order to take into account the missing covariate set. No explicit extrapolation is needed for internal covariates, while a Markov chain is constructed to represent the evolution of external covariates in the study. The estimated external and the unknown internal covariates constitute an incomplete covariate set, which is then used and analyzed by the EBF network to provide survival information about the asset. It is shown in the case study that the method slightly underestimates the remaining useful life of an asset, which is a desirable result for early maintenance decisions and resource planning.
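The Markov-chain treatment of the external covariates can be pictured with a small simulation. The states, transition matrix and horizon below are invented for illustration; the EBF network and the modified EM training are not reproduced here.

import numpy as np

# Illustrative Markov chain for one external covariate (e.g. an operating-load
# level). States and transition probabilities are placeholder assumptions.
STATES = ["low", "medium", "high"]
P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.7, 0.2],
              [0.0, 0.3, 0.7]])  # each row sums to 1

def simulate_external_covariate(start: int, horizon: int, seed: int = 0) -> list[str]:
    """Simulate the future evolution of the external covariate."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(horizon):
        path.append(int(rng.choice(len(STATES), p=P[path[-1]])))
    return [STATES[s] for s in path]

print(simulate_external_covariate(start=0, horizon=10))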
Abstract:
The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask which contains a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is thermal fixing. Here the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium. This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to situations where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to image scrambling or cryptography for optical information storage. A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with stripes of smaller widths. As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
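The non-contact temperature-monitoring technique described in this abstract reduces to simple arithmetic once the fringe spacing is known: each full intensity oscillation corresponds to a change of one wavelength in the optical path difference Δn(T)·L. The sketch below assumes the crossed-polarizer geometry and uses placeholder values for the probe wavelength, crystal length and thermo-optic coefficient of the birefringence; it is not the calibration used in the thesis.

# Minimal sketch: infer temperature change from the number of full intensity
# oscillations observed for a beam through a birefringent crystal between
# crossed polarizers. One oscillation corresponds to a one-wavelength change
# in the optical path difference dn(T) * L (thermal expansion neglected).
WAVELENGTH = 633e-9        # probe wavelength, metres (assumed HeNe line)
CRYSTAL_LENGTH = 10e-3     # path length in the crystal, metres (placeholder)
D_BIREFRINGENCE_DT = 4e-5  # d(delta n)/dT, 1/K (illustrative placeholder)

def temperature_change(num_oscillations: float) -> float:
    """Temperature change in kelvin implied by the observed fringe count."""
    delta_t_per_fringe = WAVELENGTH / (CRYSTAL_LENGTH * D_BIREFRINGENCE_DT)
    return num_oscillations * delta_t_per_fringe

print(f"{temperature_change(5):.1f} K")  # ~7.9 K with these placeholder values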
Abstract:
Aim. This paper is a report of a review conducted to identify (a) best practice in information transfer from the emergency department for multi-trauma patients; (b) conduits and barriers to information transfer in trauma care and related settings; and (c) interventions that have an impact on information communication at handover and beyond. Background. Information transfer is integral to effective trauma care, and communication breakdown presents important challenges to it. However, evidence of the adequacy of structures and processes to ensure transfer of patient information through the acute phase of trauma care is limited. Data sources. Papers were sourced from a search of 12 online databases and by scanning references from relevant papers for 1990–2009. Review methods. The review was conducted according to the University of York’s Centre for Reviews and Dissemination guidelines. Studies were included if they concerned issues that influenced information transfer for patients in healthcare settings. Results. Forty-five research papers, four literature reviews and one policy statement were found to be relevant to parts of the topic, but not all of it. The main issues emerging concerned the impact of communication breakdown in some form, and included communication issues within trauma team processes; a lack of structure and clarity during handovers, including missing, irrelevant and inaccurate information; distractions; and poorly documented care. Conclusion. Many factors influence information transfer but are poorly identified in relation to trauma care. The measurement of information transfer, which is integral to patient handover, has not been the focus of research to date. Nonetheless, documented patient information is considered evidence of care and a resource that affects continuing care.
Abstract:
Information overload and mismatch are two fundamental problems affecting the effectiveness of information filtering systems. Even though both term-based and pattern-based approaches have been proposed to address the problems of overload and mismatch, neither of these approaches alone can provide a satisfactory solution. This paper presents a novel two-stage information filtering model which combines the merits of term-based and pattern-based approaches to effectively filter a sheer volume of information. In particular, the first filtering stage is supported by a novel rough analysis model which efficiently removes a large number of irrelevant documents, thereby addressing the overload problem. The second filtering stage is empowered by a semantically rich pattern taxonomy mining model which effectively fetches incoming documents according to the specific information needs of a user, thereby addressing the mismatch problem. The experimental results based on the RCV1 corpus show that the proposed two-stage filtering model significantly outperforms both term-based and pattern-based information filtering models.
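A two-stage pipeline of this general shape can be sketched as follows. The cheap term-overlap first stage and the phrase-matching second stage are simplified stand-ins, not the rough analysis and pattern taxonomy mining models evaluated on RCV1, and the profile terms and patterns are invented for illustration.

# Illustrative two-stage filter: a cheap first stage discards clearly
# irrelevant documents (overload); a richer second stage ranks the survivors
# against simple patterns (mismatch). Both stages are simplified stand-ins.
PROFILE_TERMS = {"information", "filtering", "user", "profile"}          # assumed
PROFILE_PATTERNS = [("information", "filtering"), ("user", "profile")]  # assumed

def stage_one(doc: str, min_overlap: int = 2) -> bool:
    """Keep a document only if it shares enough terms with the user profile."""
    return len(PROFILE_TERMS & set(doc.lower().split())) >= min_overlap

def stage_two(doc: str) -> float:
    """Score a surviving document by how many profile patterns it contains."""
    text = doc.lower()
    return float(sum(f"{a} {b}" in text for a, b in PROFILE_PATTERNS))

def filter_stream(docs: list[str]) -> list[tuple[float, str]]:
    survivors = [d for d in docs if stage_one(d)]
    return sorted(((stage_two(d), d) for d in survivors), reverse=True)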
Abstract:
In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions, few systematic and empirically based studies have taken place. This article will discuss a research project that fills this gap. Funded by the Australian Learning and Teaching Council, the project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence-based practice, learning and education, and personal traits. This article will provide a detailed discussion of each of these themes. The study’s findings also suggest that “librarian 2.0” is a state of mind, and that the Australian LIS profession is undergoing a significant shift in “attitude.”