978 results for Defense information, Classified
Abstract:
Most infrastructure project developments are complex in nature, particularly in the planning phase. During this stage, many vague alternatives are tabled, from the strategic to the operational level. Human judgement and decision making are characterised by biases, errors and the use of heuristics. These factors are intangible and hard to measure because they are subjective and qualitative in nature. The problem of human judgement becomes more complex when a group of people is involved: the variety of stakeholders may cause conflict due to differences in personal judgement, and the number of available alternatives further increases the complexity of the decision making process. It is therefore desirable to find ways of enhancing the efficiency of decision making to avoid misunderstandings and conflict within organisations. Numerous attempts have been made to address problems in this area by leveraging technologies such as decision support systems. However, most construction project management decision support systems concentrate only on model development and neglect fundamentals of computing such as requirements engineering, data communication, data management and human-centred computing. As a result, these decision support systems are complicated and less efficient in supporting the decision making of project team members. It is desirable for decision support systems to be simpler, to provide a better collaborative platform, to allow efficient data manipulation, and to adequately reflect user needs. In this chapter, a framework for a more desirable decision support system environment is presented, and some key issues related to decision support system implementation are described.
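The chapter presents a framework rather than a specific decision model, but weighted-sum aggregation of stakeholder scores is one common technique such systems use to rank competing alternatives. The sketch below is a minimal illustration of that general idea; the stakeholders, criteria and weights are invented for the example and are not taken from the chapter.

```python
# Hypothetical weighted-sum group decision example (illustrative only).
import numpy as np

# Rows: three candidate alternatives; columns: criteria (cost, risk, duration).
# Each stakeholder scores every alternative against every criterion (0-10).
scores = {
    "engineer": np.array([[7, 5, 6], [4, 8, 7], [6, 6, 5]]),
    "planner":  np.array([[6, 6, 7], [5, 7, 8], [7, 5, 4]]),
    "owner":    np.array([[8, 4, 5], [3, 9, 6], [6, 7, 6]]),
}
criterion_weights = np.array([0.5, 0.3, 0.2])                  # sums to 1
stakeholder_weights = {"engineer": 0.4, "planner": 0.3, "owner": 0.3}

# Weight criteria within each stakeholder's matrix, then weight stakeholders.
totals = sum(stakeholder_weights[s] * (m @ criterion_weights)
             for s, m in scores.items())
print("Aggregate score per alternative:", totals)
print("Preferred alternative:", int(np.argmax(totals)))
```

Making the weights explicit in this way is one route to the transparency the chapter calls for: disagreements surface as differences in scores or weights rather than as unstructured conflict.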
Abstract:
The trends of diminishing funding, demands for greater efficiency and higher public accountability have led to a rapid expansion of interest in the bibliometric assessment of the research performance of universities. A pilot study was conducted to provide a preliminary overview of the research performance of building and construction schools and departments through the analysis of bibliometric indicators, including the journal impact factor (JIF) published by the Institute for Scientific Information (ISI). The suitability of bibliometric evaluation approaches as a measure of research quality in the building and construction management research field is discussed.
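For reference, the ISI two-year impact factor for a journal in year Y is the number of citations received in Y by items the journal published in years Y-1 and Y-2, divided by the number of citable items it published in those two years. A trivial sketch (the figures are invented):

```python
def journal_impact_factor(citations, citable_items):
    """Two-year JIF: citations received in year Y to items published in
    years Y-1 and Y-2, over the citable items published in those years."""
    return citations / citable_items

# Hypothetical journal: 210 citations in 2010 to its 140 items from 2008-09.
print(journal_impact_factor(210, 140))  # 1.5
```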
Abstract:
With an increasing level of collaboration amongst researchers, software developers and industry practitioners over the past three decades, building information modelling (BIM) is now recognized as an emerging technological and procedural shift within the architecture, engineering and construction (AEC) industry. BIM is not only expected to have a profound impact on the AEC professions, but is also regarded as an approach that can help the industry develop new ways of thinking and practice. Despite the widespread development and recognition of BIM, succinct and systematic reviews of existing BIM research and achievements are scarce. It is also necessary to take stock of existing applications and to take a fresh look at where BIM should be heading and how it can benefit from the advances being made. This paper first presents a review of BIM research and achievements in the AEC industry. A number of suggestions are then made for future research in BIM. This paper maintains that the value of BIM during the design and construction phases has been well documented over the last decade, and that new research needs to expand the level of development and analysis from the design/build stage to post-construction and facility asset management. New research in BIM could also move beyond traditional building types to managing a broader range of facilities and built assets, and to providing preventative maintenance schedules for sustainable and intelligent buildings.
Abstract:
An essential challenge for organizations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interaction between otherwise disconnected groups. Using three case examples, this paper explores how Enterprise 2.0 technologies achieve such goals, allowing knowledge to be transferred by tapping into the tacit and explicit knowledge of disparate groups in complex engineering organizations. The paper is intended as a timely introduction to the benefits and issues associated with the use of Enterprise 2.0 technologies in pursuit of the positive outcomes promised by knowledge management.
Abstract:
This paper examines the fallout of the Lehman Brothers collapse in Hong Kong. As an international financial hub in Asia, Hong Kong was profoundly affected by the collapse of this company, which negatively impacted public confidence in the Hong Kong banking sector. Furthermore, the event exposed a number of regulatory deficiencies in Hong Kong. In response to the financial crisis, the Hong Kong government made an unprecedented move to negotiate with local banks to refund investors. In addition, the government sought public consultation on proposals to enhance the regulation of the sale of financial products. This paper argues that the prevailing laws need to be amended, and legal rules included to back up the proposed measures, so that information disclosed by financial institutions does not mislead investors or misrepresent the products offered.
Abstract:
Wireless Multimedia Sensor Networks (WMSNs) have become increasingly popular in recent years, driven in part by the increasing commoditization of small, low-cost CMOS sensors. The challenge of automatically calibrating these camera nodes has therefore become an important research problem, especially when large numbers of such devices are deployed. This paper presents a method for automatically calibrating a wireless camera node that is able to rotate around one axis. The method involves capturing images as the camera rotates and computing the homographies between the images. The camera parameters, including the focal length, principal point, and the angle and axis of rotation, can then be recovered from two or more homographies. The homography computation algorithm is designed to deal with the limited resources of the wireless sensor and to minimize energy consumption. A modified RANdom SAmple Consensus (RANSAC) algorithm is proposed to increase the efficiency and reliability of the calibration procedure.
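The paper's energy-aware RANSAC modifications are not reproduced here; the sketch below shows the standard RANSAC-plus-DLT pipeline it builds on, using only NumPy (the iteration count and inlier threshold are illustrative assumptions):

```python
# Standard RANSAC homography estimation via the Direct Linear Transform (DLT).
import numpy as np

def dlt_homography(src, dst):
    """Fit H such that dst ~ H @ src, for Nx2 point arrays (N >= 4)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)          # null vector of the stacked system

def reprojection_error(H, src, dst):
    """Euclidean distance between H-projected src points and dst points."""
    pts = np.hstack([src, np.ones((len(src), 1))])
    proj = (H @ pts.T).T
    with np.errstate(divide="ignore", invalid="ignore"):
        proj = proj[:, :2] / proj[:, 2:3]     # back to inhomogeneous coords
    err = np.linalg.norm(proj - dst, axis=1)
    return np.where(np.isfinite(err), err, np.inf)

def ransac_homography(src, dst, iters=500, thresh=3.0, seed=0):
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(iters):
        idx = rng.choice(len(src), size=4, replace=False)   # minimal sample
        H = dlt_homography(src[idx], dst[idx])
        inliers = reprojection_error(H, src, dst) < thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return dlt_homography(src[best], dst[best]), best       # refit on inliers
```

For a purely rotating camera, each inter-image homography has the form H = K R K^-1, so once the homographies are estimated robustly, the intrinsics K (focal length, principal point) and the rotation R can be factored out of them.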
Abstract:
Thoroughly revised and updated, this popular book provides a comprehensive yet easy to read guide to modern contact lens practice. Beautifully re-designed in a clean, contemporary layout, this second edition presents relevant and up-to-date information in a systematic manner, with a logical flow of subject matter from front to back. This book wonderfully captures the ‘middle ground’ in the contact lens field … somewhere between a dense research-based tome and a basic fitting guide. As such, it is ideally suited for both students and general eye care practitioners who require a practical, accessible and uncluttered account of the contact lens field.
Contents:
Part 1 (Introduction): Historical perspective; The anterior eye; Visual optics; Clinical instruments.
Part 2 (Soft contact lenses): Soft lens materials; Soft lens manufacture; Soft lens optics; Soft lens measurement; Soft lens design and fitting; Soft toric lens design and fitting; Soft lens care systems.
Part 3 (Rigid contact lenses): Rigid lens materials; Rigid lens manufacture; Rigid lens optics; Rigid lens measurement; Rigid lens design and fitting; Rigid toric lens design and fitting; Rigid lens care systems.
Part 4 (Lens replacement modalities): Unplanned lens replacement; Daily soft lens replacement; Planned soft lens replacement; Planned rigid lens replacement.
Part 5 (Special lenses and fitting considerations): Scleral lenses; Tinted lenses; Presbyopia; Continuous wear; Sport; Keratoconus; High ametropia; Paediatric fitting; Therapeutic applications; Post-refractive surgery; Post-keratoplasty; Orthokeratology; Diabetes.
Part 6 (Patient examination and management): History taking; Preliminary examination; Patient education; Aftercare; Complications; Digital imaging; Compliance; Practice management.
Appendices. Index.
Abstract:
We advance the proposition that dynamic stochastic general equilibrium (DSGE) models should not be estimated and evaluated only with full information methods, which require that the complete system of equations be properly specified. Limited information analysis, which focuses on specific equations, is therefore likely to be a useful complement to full system analysis. Two major problems arise when implementing limited information methods: the presence of forward-looking expectations in the system, and unobservable non-stationary variables. We present methods for dealing with both of these difficulties, and illustrate the interaction between full and limited information methods using a well-known model.
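The paper's own estimators are not reproduced here, but the mechanics of limited information estimation of a single equation with forward-looking expectations typically reduce to instrumental variables: the expectation E_t[y_{t+1}] is replaced by its realisation y_{t+1}, which is then instrumented with dated (lagged) variables. A minimal two-stage least squares sketch on toy data with an endogenous regressor:

```python
# Two-stage least squares (2SLS) on simulated data; the true coefficient on
# the endogenous regressor x is 2.0. Toy data-generating process, not a DSGE.
import numpy as np

def tsls(y, X, Z):
    """2SLS: project X onto the instrument space Z, then regress y on it."""
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]    # first stage
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]     # second stage

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                       # instrument: exogenous by design
u = rng.normal(size=n)                       # structural error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # regressor correlated with u
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
print("OLS :", np.linalg.lstsq(X, y, rcond=None)[0])    # biased slope (> 2.0)
print("2SLS:", tsls(y, X, Z))                           # close to [1.0, 2.0]
```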
Abstract:
Consumer personal information is now a valuable commodity for most corporations. Concomitant with this increased value is an expansion of new legal obligations to protect personal information. Mandatory data breach notification laws are an important new development in this regard. Such laws require a corporation that has suffered a data breach involving personal information, such as a computer hacking incident, to notify those persons who may have been affected by the breach. Regulators may also need to be notified. Australia currently does not have a mandatory data breach notification law, but this may be about to change. The Australian Law Reform Commission has suggested that a data breach notification scheme be implemented through the Privacy Act 1988 (Cth). However, the notification of data breaches may already be required under the continuous disclosure regime stipulated by the Corporations Act 2001 (Cth) and the Australian Stock Exchange (ASX) Listing Rules. Accordingly, this article examines whether the notification of data breaches is a statutory requirement of the existing continuous disclosure regime and whether the ASX should therefore be notified of such incidents.
Abstract:
Mandatory data breach notification has become a matter of increasing concern for law reformers. In Australia, the issue was recently addressed as part of a comprehensive review of privacy law conducted by the Australian Law Reform Commission (ALRC), which recommended a uniform national regime for protecting personal information applicable to both the public and private sectors. As in all federal systems, the distribution of powers between central and state governments poses problems for national consistency. In the authors’ view, a uniform approach to mandatory data breach notification has greater merit than the ‘jurisdiction specific’ approach epitomized by US state-based laws, which has given rise to unnecessary overlaps and inefficiencies, as demonstrated by a review of the differing notification triggers and encryption safe harbors across states. On this basis, the authors conclude that a uniform approach to data breach notification is inherently more efficient.
Abstract:
Type unions, pointer variables and function pointers are a long-standing source of subtle security bugs in C program code. Their use can lead to hard-to-diagnose crashes or exploitable vulnerabilities that allow an attacker to gain privileged access to classified data. This paper describes an automatable framework for detecting such weaknesses in C programs statically where possible, and for generating assertions that will detect them dynamically in other cases. Based exclusively on analysis of the source code, it identifies the required assertions using a type inference system supported by a custom-made symbol table. In our preliminary findings, the type system was able to infer the correct types of unions in different scopes without manual code annotation or rewriting. Whenever an evaluation is not possible or is difficult to resolve, appropriate runtime assertions are formed and inserted into the source code. The approach is demonstrated via a prototype C analysis tool.
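As a deliberately simplified illustration of the static-detection idea (the paper's tool performs genuine type inference over a custom symbol table; this toy does not), the following Python script linearly scans one C function body, tracks the last member written to each known union variable, and flags reads of a different member, falling back to suggesting a runtime assertion when nothing is known statically:

```python
import re

# Toy patterns for "u.member = ..." writes and "u.member" reads; a real tool
# would parse the C source properly instead of using regular expressions.
WRITE = re.compile(r"\b(\w+)\.(\w+)\s*=[^=]")
READ = re.compile(r"\b(\w+)\.(\w+)\b(?!\s*=[^=])")

def check_union_accesses(lines, union_vars):
    last_written = {}                 # union variable -> last member stored
    for lineno, line in enumerate(lines, 1):
        m = WRITE.search(line)
        if m and m.group(1) in union_vars:
            last_written[m.group(1)] = m.group(2)
            continue
        for var, member in READ.findall(line):
            if var not in union_vars:
                continue
            known = last_written.get(var)
            if known is None:
                print(f"line {lineno}: {var}.{member} unresolved statically;"
                      " insert a runtime assertion on the active member")
            elif known != member:
                print(f"line {lineno}: {var}.{member} read but {var}.{known}"
                      " was last written (possible type confusion)")

check_union_accesses(['v.i = 42;', 'printf("%f\\n", v.f);'], union_vars={"v"})
# -> line 2: v.f read but v.i was last written (possible type confusion)
```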
Abstract:
Groundwater is increasingly recognised as an important yet vulnerable natural resource and a key consideration in water cycle management. However, communicating the behaviour of sub-surface water systems, an important part of encouraging better water management, is visually difficult. Modern 3D visualisation techniques can communicate these complex behaviours effectively, engaging and informing community stakeholders, but most software developed for this purpose is expensive and requires specialist skills. The Groundwater Visualisation System (GVS) developed by QUT integrates a wide range of surface and sub-surface data to produce a 3D visualisation of the behaviour, structure and connectivity of groundwater/surface water systems. Surface data (elevation, surface water, land use, vegetation and geology) are combined with data collected from boreholes (bore locations and subsurface geology), and time-series data (water levels, groundwater quality, rainfall, stream flow and groundwater abstraction) are displayed as animations within the 3D framework, or graphically, to show how water system conditions change over time. GVS delivers an interactive, stand-alone 3D visualisation product that runs in a standard PC environment, with no specialised training or modelling skills required. The software has been used extensively in the South East Queensland (SEQ) region to inform and engage water managers and the community alike. Examples will be given of GVS visualisations developed in areas where there have been community concerns around groundwater over-use and contamination.
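GVS itself is not reproduced here, but the following matplotlib sketch gives a flavour of the kind of display described: synthetic boreholes drawn as vertical lines in 3D, with markers coloured by the standing water level in each bore at a single time step (all data invented):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n = 15
x, y = rng.uniform(0, 10, n), rng.uniform(0, 10, n)    # bore locations (km)
surface = 50 + 3 * np.sin(x) + 2 * np.cos(y)           # ground elevation (m)
depth = rng.uniform(10, 30, n)                         # bore depth (m)
water_level = surface - rng.uniform(2, 8, n)           # standing water level

ax = plt.figure(figsize=(8, 6)).add_subplot(projection="3d")
for i in range(n):                                     # draw each borehole
    ax.plot([x[i], x[i]], [y[i], y[i]],
            [surface[i] - depth[i], surface[i]], color="grey", linewidth=1)
sc = ax.scatter(x, y, water_level, c=water_level, cmap="viridis", s=40)
plt.colorbar(sc, label="water level (m)")
ax.set_xlabel("easting (km)")
ax.set_ylabel("northing (km)")
ax.set_zlabel("elevation (m)")
plt.show()
```

A time slider or animation over successive water-level snapshots would then show the condition changes over time that the abstract describes.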
Abstract:
Most information retrieval (IR) models treat the presence of a term within a document as an indication that the document is somehow "about" that term; they do not take into account cases where a term is explicitly negated. Medical data, by its nature, contains a high frequency of negated terms, e.g. "review of systems showed no chest pain or shortness of breath". This paper presents a study of the effects of negation on information retrieval. We present a number of experiments to determine whether negation has a significant negative effect on IR performance and whether language models that take negation into account might improve performance, using a collection of real medical records as our test corpus. Our finding is that negation has some effect on system performance, but this is likely to be confined to domains, such as medical data, where negation is prevalent.
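One simple way an indexer can account for negation, sketched below, is a NegEx-style scope rule: terms following a negation trigger, up to the next punctuation mark, are marked as negated so they can be excluded or indexed separately. The trigger list and scope rule are simplified assumptions for illustration, not the models evaluated in the paper:

```python
import re

NEGATION_TRIGGERS = {"no", "not", "without", "denies", "denied"}

def index_terms(text):
    """Return (term, negated) pairs; a negation trigger negates every
    following term within the same punctuation-delimited clause."""
    terms = []
    for clause in re.split(r"[.,;:]", text.lower()):
        negated = False
        for token in clause.split():
            if token in NEGATION_TRIGGERS:
                negated = True
                continue
            terms.append((token, negated))
    return terms

doc = "Review of systems showed no chest pain or shortness of breath."
print(index_terms(doc))
# 'chest', 'pain', 'breath' come out negated; 'review', 'systems' do not.
```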