1000 results for Reid, Robbie
Abstract:
As ‘The Architect’s Handbook of Professional Practice’ (cited by Riskus, 2007) suggests, Building Information Modelling, or BIM, is “the use of virtual building information models to develop building design solutions, design documentation, and to analyse construction processes”. We would suggest such a definition, while useful, should be extended to include the operational phases of built assets (such as maintenance and decommissioning), and also be applied to the whole area of infrastructure. As a set of technologies, BIM holds promise to deliver benefits for the property, construction, and infrastructure management industries – particularly improved efficiencies and effectiveness through enhanced collaboration at all stages of the construction cycle. There are, however, several important qualifiers, barriers, and enablers, as well as some disadvantages, associated with this suite of technologies. This report outlines the costs, benefits, enablers, and barriers associated with BIM, and makes suggestions about how these issues may be addressed.
Abstract:
Teaching literacy requires accurate and current knowledge in the field (Commonwealth of Australia, 2005). There have been persistent inquiries into what constitutes specialist knowledge and skills for teaching students to be literate. Preservice teacher education is fundamental to literacy development, which includes the approaches universities employ to prepare graduates for teaching literacy. Indeed, preservice teacher programs and literacy education also attract intense media coverage. There is a continued push to improve literacy outcomes for school students across the nation and to strengthen the literacy knowledge and skills of Australian teachers. This study mainly focuses on 10 final-year preservice teachers attending a regional university campus who volunteered for further experiences to teach students to read traditional texts. These preservice teachers completed three university literacy units before commencing with practical applications. A literacy program, titled Reading Squadron, was developed in partnership between a local primary school and the university. Primary students were identified by the school as requiring literacy support. Preservice teachers attended a whole-day training session run by school staff at the university and then visited the school for two one-hour sessions each week over a six-week period. Each preservice teacher was assigned two students and worked with each student for half an hour twice a week. The aim of this small-scale qualitative study was to investigate the perceptions of the preservice teachers and school staff as a result of their involvement in the Reading Squadron program. The preservice teachers completed a questionnaire to determine their views of the program and ascertain how it assisted their development. Further data were gathered from the preservice teachers through individual face-to-face interviews. Three school staff involved in the program also completed a questionnaire to determine the value of the program. Results indicated that the preservice teachers made links between theory and practice, and felt they gained knowledge about teaching reading. Three preservice teachers noted it was difficult to work around timetable commitments but gained from the experience and suggested embedding such experiences into university literacy units. Data gathered from school staff indicated that six weeks was not sufficient time to measure improvements in the school students; however, they were supportive of such a program and of its continuation. Collaborations between schools and universities can provide opportunities for preservice teachers to use theoretical knowledge gained from core university subjects with application to assist primary students’ literacy development in schools. Teachers in this study were supportive of the Reading Squadron program; however, more data needed to be collected to understand the literacy improvement of students. Longitudinal studies are required to ascertain specific knowledge and skills gained by preservice teachers to teach reading and how these programs enhance students’ literacy levels.
Abstract:
This work aims to take advantage of recent developments in joint factor analysis (JFA) in the context of a phonetically conditioned GMM speaker verification system. Previous work has shown performance advantages through phonetic conditioning, but this has not been shown to date within the JFA framework. Our focus is particularly on strategies for combining the phone-conditioned systems. We show that classic score-level fusion is suboptimal when using multiple GMM systems. We investigate several combination strategies in the model space, and demonstrate improvement over score-level combination as well as over a non-phonetic baseline system. This work was conducted during the 2008 CLSP Workshop at Johns Hopkins University.
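To make the baseline combination level concrete, the following minimal Python sketch illustrates score-level fusion, the approach the paper finds suboptimal: each phone-conditioned subsystem contributes a log-likelihood-ratio score for a verification trial and the scores are combined as a weighted sum. The subsystem names, scores, and weights are hypothetical and not taken from the paper; model-space combination would instead merge the subsystems before any scoring takes place.

```python
# Illustrative sketch only: linear score-level fusion of per-phone-class
# subsystem scores for a single verification trial. Names and weights are
# invented for the example.
from typing import Dict

def fuse_scores(subsystem_scores: Dict[str, float],
                weights: Dict[str, float]) -> float:
    """Weighted sum of per-subsystem log-likelihood-ratio scores."""
    return sum(weights[name] * score for name, score in subsystem_scores.items())

# Three hypothetical phone-conditioned GMM subsystems.
scores = {"vowels": 1.8, "nasals": 0.4, "fricatives": -0.2}
weights = {"vowels": 0.5, "nasals": 0.3, "fricatives": 0.2}
print(f"fused score: {fuse_scores(scores, weights):.2f}")  # accept/reject by comparing to a threshold
```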
Abstract:
The increasing prevalence of International New Ventures (INVs) during the past twenty years has been highlighted by numerous studies (Knight & Cavusgil, 1996; Moen, 2002). International New Ventures are firms, typically small to medium enterprises, that internationalise within six years of inception (Oviatt & McDougall, 1997). To date there has been no general consensus within the literature on a theoretical framework of internationalisation to explain the internationalisation process of INVs (Madsen & Servais, 1997). However, some researchers have suggested that the innovation diffusion model may provide a suitable theoretical framework (Chetty & Hamilton, 1996; Fan & Phan, 2007). The proposed model was based on the existing and well-established innovation diffusion theories drawn from the consumer behaviour and internationalisation literature to explain the internationalisation process of INVs (Lim, Sharkey, & Kim, 1991; Reid, 1981; Robertson, 1971; Rogers, 1962; Wickramasekera & Oczkowski, 2006). The results of this analysis indicated that the synthesised model of export adoption was effective in explaining the internationalisation process of INVs within the Queensland Food and Beverage Industry. Significantly, the results of the analysis also indicated that features of the original I-models developed in the consumer behaviour literature, which had received limited examination within the internationalisation literature, were confirmed. This includes the ability of firms, or specifically decision-makers, to skip stages based on previous experience.
Abstract:
While spoken term detection (STD) systems based on word indices provide good accuracy, there are several practical applications where it is infeasible or too costly to employ an LVCSR engine. An STD system is presented, which is designed to incorporate a fast phonetic decoding front-end and be robust to decoding errors whilst still allowing for rapid search speeds. This goal is achieved through mono-phone open-loop decoding coupled with fast hierarchical phone lattice search. Results demonstrate that an STD system that is designed with the constraint of a fast and simple phonetic decoding front-end requires a compromise to be made between search speed and search accuracy.
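As a rough illustration of the search problem, the minimal Python sketch below matches a query term’s phone sequence against a decoded phone string. All phone labels and the example pronunciation are hypothetical; the actual system searches hierarchical phone lattices and must be robust to decoding errors, which a flat exact match like this is not.

```python
# Illustrative sketch only: exact matching of a term's phone sequence within a
# 1-best mono-phone decoding. The real system searches phone lattices.
from typing import List

def find_term(decoded_phones: List[str], term_phones: List[str]) -> List[int]:
    """Return start indices where the term's phone sequence occurs exactly."""
    hits = []
    n, m = len(decoded_phones), len(term_phones)
    for i in range(n - m + 1):
        if decoded_phones[i:i + m] == term_phones:
            hits.append(i)
    return hits

# Hypothetical decoding output for one utterance and a query pronunciation.
decoded = ["s", "p", "ii", "ch", "t", "uw", "t", "eh", "k", "s", "t"]
query = ["t", "eh", "k", "s", "t"]          # phones for the term "text"
print(find_term(decoded, query))            # -> [6]
```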
Abstract:
This paper proposes the use of the Bayes factor to replace the Bayesian Information Criterion (BIC) as a criterion for speaker clustering within a speaker diarization system. The BIC is one of the most popular decision criteria used in speaker diarization systems today. However, it will be shown in this paper that the BIC is only an approximation to the Bayes factor, the ratio of the marginal likelihoods of the data given each hypothesis. This paper uses the Bayes factor directly as a decision criterion for speaker clustering, thus removing the error introduced by the BIC approximation. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, leading to a 14.7% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
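For clarity, the relationship the paper exploits can be sketched with the standard definitions (not reproduced from the paper itself): the Bayes factor compares the marginal likelihoods of the data under the same-speaker and different-speaker hypotheses, while the BIC replaces each log marginal likelihood with a penalised maximum-likelihood term, so the usual ΔBIC is only an asymptotic approximation to the log Bayes factor.

```latex
% Bayes factor between hypotheses H_1 (same speaker) and H_0 (different speakers)
K = \frac{p(X \mid H_1)}{p(X \mid H_0)}, \qquad
p(X \mid H_i) = \int p(X \mid \theta_i, H_i)\, p(\theta_i \mid H_i)\, d\theta_i .

% BIC (Laplace-type) approximation to each log marginal likelihood,
% with d_i free parameters and N data points:
\log p(X \mid H_i) \approx \log p(X \mid \hat{\theta}_i, H_i) - \frac{d_i}{2}\log N
\quad\Longrightarrow\quad
\Delta\mathrm{BIC} \approx \log K .
```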
Abstract:
UCON is an emerging access control framework that lacks an administration model. In this paper we define the problem of administration and propose a novel administrative model. At the core of this model is the concept of the attribute, which is also the central component of UCON. In our model, attributes are created by the assertions of subjects, which ascribe properties/rights to other subjects or objects. Through such a treatment of attributes, administration capabilities can be delegated from one subject to another, and as a consequence UCON is improved in three aspects. First, immutable attributes that are currently considered as external to the model can be incorporated and thereby treated as mutable attributes. Second, the current arbitrary categorisation of users (as modifiers of attributes) into system and administrator can be removed. Attributes and objects are only modifiable by those who possess administration capability over them. Third, the delegation of administration over objects and properties that is not currently expressible in UCON is made possible.
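The toy Python sketch below illustrates one reading of this idea: a subject’s assertion creates an attribute and grants that subject administration capability over it, administration can be delegated, and only current administrators may modify the attribute. The class and method names, entities, and attribute values are all hypothetical; this is not the paper’s formal model.

```python
# Illustrative toy only: attribute creation by assertion, delegation of
# administration capability, and admin-gated modification.
from dataclasses import dataclass, field
from typing import Dict, Set, Tuple

@dataclass
class Attribute:
    name: str
    value: str
    asserted_by: str                      # subject who created the attribute

@dataclass
class AdminState:
    # (entity, attribute name) -> subjects holding administration capability
    admins: Dict[Tuple[str, str], Set[str]] = field(default_factory=dict)
    attributes: Dict[Tuple[str, str], Attribute] = field(default_factory=dict)

    def assert_attribute(self, subject: str, entity: str, name: str, value: str) -> None:
        """Subject asserts an attribute of an entity and becomes its administrator."""
        self.attributes[(entity, name)] = Attribute(name, value, subject)
        self.admins.setdefault((entity, name), set()).add(subject)

    def delegate(self, delegator: str, delegatee: str, entity: str, name: str) -> bool:
        """Delegate administration of an attribute; allowed only for current admins."""
        holders = self.admins.get((entity, name), set())
        if delegator not in holders:
            return False
        holders.add(delegatee)
        return True

    def modify(self, subject: str, entity: str, name: str, value: str) -> bool:
        """Attributes are modifiable only by subjects with administration capability."""
        if subject not in self.admins.get((entity, name), set()):
            return False
        self.attributes[(entity, name)].value = value
        return True

# Example: alice asserts an attribute of file1, then delegates administration to bob.
state = AdminState()
state.assert_attribute("alice", "file1", "classification", "internal")
print(state.delegate("alice", "bob", "file1", "classification"))      # True
print(state.modify("bob", "file1", "classification", "restricted"))   # True
print(state.modify("carol", "file1", "classification", "public"))     # False
```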
Abstract:
This article presents a survey of authorisation models and considers their ‘fitness-for-purpose’ in facilitating information sharing. Network-supported information sharing is an important technical capability that underpins collaboration in support of dynamic and unpredictable activities such as emergency response, national security, infrastructure protection, supply chain integration and emerging business models based on the concept of a ‘virtual organisation’. The article argues that present authorisation models are inflexible and poorly scalable in such dynamic environments due to their assumption that the future needs of the system can be predicted, which in turn justifies the use of persistent authorisation policies. The article outlines the motivation and requirement for a new flexible authorisation model that addresses the needs of information sharing. It proposes that a flexible and scalable authorisation model must allow an explicit specification of the objectives of the system and access decisions must be made based on a late trade-off analysis between these explicit objectives. A research agenda for the proposed Objective-based Access Control concept is presented.
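As a purely hypothetical illustration of what a late trade-off analysis between explicit objectives might look like, the sketch below scores an access request against each stated objective at decision time and grants access when the weighted balance is positive. The objective names, scoring functions, and weights are invented; the article itself presents a research agenda rather than a concrete mechanism.

```python
# Hypothetical illustration only: an access decision made by a late trade-off
# between explicitly stated system objectives, rather than by a fixed policy.
from typing import Callable, Dict, List, Tuple

Request = Dict[str, object]
Objective = Tuple[str, float, Callable[[Request], float]]  # (name, weight, scorer)

def decide(request: Request, objectives: List[Objective]) -> bool:
    """Grant access when the weighted sum of objective scores is positive."""
    total = sum(weight * scorer(request) for _, weight, scorer in objectives)
    return total > 0.0

objectives: List[Objective] = [
    # Sharing objective: favour granting when the request supports collaboration.
    ("enable_sharing", 1.0, lambda r: 1.0 if r.get("purpose") == "emergency_response" else 0.2),
    # Confidentiality objective: penalise granting for sensitive resources.
    ("protect_confidentiality", 1.5, lambda r: -1.0 if r.get("sensitivity") == "high" else -0.1),
]

request = {"purpose": "emergency_response", "sensitivity": "low"}
print(decide(request, objectives))   # True: sharing outweighs the low confidentiality risk
```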
Abstract:
US state-based data breach notification laws have unveiled serious corporate and government failures regarding the security of personal information. These laws require organisations to notify persons who may be affected by an unauthorised acquisition of their personal information. Safe harbours to notification exist if personal information is encrypted. Three types of safe harbour have been identified in the literature: exemptions, rebuttable presumptions and factors. The underlying assumption of exemptions is that encrypted personal information is secure and therefore unauthorised access does not pose a risk. However, the viability of this assumption is questionable when examined against data breaches involving encrypted information and the demanding practical requirements of effective encryption management. Recent recommendations by the Australian Law Reform Commission (ALRC) would amend the Privacy Act 1988 (Cth) to implement a data breach scheme that includes a different type of safe harbour, factor-based analysis. The authors examine the potential capability of the ALRC’s proposed encryption safe harbour in relation to the US experience at the state legislature level.
Abstract:
The objective of this paper is to provide an overview of mine automation applications, developed at the Queensland Centre for Advanced Technology (QCAT), which make use of IEEE 802.11b wireless local area networks (WLANs). The paper has been prepared for a 2002 conference entitled "Creating the Virtual Enterprise - Leveraging wireless technology within existing business models for corporate advantage". Descriptions of the WLAN components have been omitted here as such details are presented in the accompanying papers. The structure of the paper is as follows. Application overviews are provided in Sections 2 to 7. Some pertinent strengths and weaknesses are summarised in Section 8. Please refer to http://www.mining-automation.com/ or contact the authors for further information.
Abstract:
This present paper reviews the reliability and validity of visual analogue scales (VAS) in terms of (1) their ability to predict feeding behaviour, (2) their sensitivity to experimental manipulations, and (3) their reproducibility. VAS correlate with, but do not reliably predict, energy intake to the extent that they could be used as a proxy of energy intake. They do predict meal initiation in subjects eating their normal diets in their normal environment. Under laboratory conditions, subjectively rated motivation to eat using VAS is sensitive to experimental manipulations and has been found to be reproducible in relation to those experimental regimens. Other work has found them not to be reproducible in relation to repeated protocols. On balance, it would appear, in as much as it is possible to quantify, that VAS exhibit a good degree of within-subject reliability and validity in that they predict, with reasonable certainty, meal initiation and amount eaten, and are sensitive to experimental manipulations. This reliability and validity appears more pronounced under the controlled (but more artificial) conditions of the laboratory, where the signal-to-noise ratio in experiments appears to be elevated relative to real life. It appears that VAS are best used in within-subject, repeated-measures designs where the effect of different treatments can be compared under similar circumstances. They are best used in conjunction with other measures (e.g. feeding behaviour, changes in plasma metabolites) rather than as proxies for these variables. New hand-held electronic appetite rating systems (EARS) have been developed to increase reliability of data capture and decrease investigator workload. Recent studies have compared these with traditional pen and paper (P&P) VAS. The EARS have been found to be sensitive to experimental manipulations and reproducible relative to P&P. However, subjects appear to exhibit a significantly more constrained use of the scale when using the EARS relative to the P&P. For this reason it is recommended that the two techniques are not used interchangeably.
Abstract:
One of the most celebrated qualities of the Internet is its enabling of simultaneity and multiplicity. By allowing users to open as many windows into the world as they (and their computers) can withstand, the Internet is understood to have brought places and cultures together on a scale and in a manner unprecedented. Yet, while the Internet has enabled many to reconnect with cultures and places long distanced and/or lost, it has also led to the belief that these reconnections are established with little correspondent cost to existent ties of belonging. In this paper, I focus on the dilemma multiple belongings engender for the ties of national belonging and question the sanguinity of multiple belongings as practised online. In particular, I use Lefebvre's notion of lived space to unpack the problems and contradictions of what has been called 'Greater China' for the ethnic Chinese minority in nations like Malaysia, Singapore and Australia.
Abstract:
Authorised users (insiders) are behind the majority of security incidents with high financial impacts. Because authorisation is the process of controlling users’ access to resources, improving authorisation techniques may mitigate the insider threat. Current approaches to authorisation suffer from the assumption that users will not (or cannot) depart from the expected behaviour implicit in the authorisation policy. In reality, however, users can and do depart from the canonical behaviour. This paper argues that the conflict of interest between insiders and authorisation mechanisms is analogous to the subset of problems formally studied in the field of game theory. It proposes a game-theoretic authorisation model that can ensure users’ potential misuse of a resource is explicitly considered while making an authorisation decision. The resulting authorisation model is dynamic in the sense that its access decisions vary according to changes in the explicit factors that influence the cost of misuse for both the authorisation mechanism and the insider.
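A minimal sketch of the underlying intuition, assuming invented payoff values and a simple expected-utility rule rather than the paper’s game formulation: the authorisation mechanism grants access only when the expected benefit of legitimate use outweighs the expected cost of misuse, so the same request can be granted or denied as the explicit factors behind the misuse estimate change.

```python
# Illustrative sketch only: an authorisation decision that weighs the expected
# benefit of legitimate use against the expected cost of misuse. Payoffs and
# the misuse probability are hypothetical.

def grant_access(benefit_legitimate: float,
                 cost_misuse: float,
                 p_misuse: float) -> bool:
    """Grant when the expected payoff to the authorisation mechanism is positive."""
    expected_payoff = (1.0 - p_misuse) * benefit_legitimate - p_misuse * cost_misuse
    return expected_payoff > 0.0

# The same request is granted or denied as the estimated misuse probability
# (an explicit, changeable factor) varies.
for p in (0.05, 0.40):
    print(p, grant_access(benefit_legitimate=10.0, cost_misuse=100.0, p_misuse=p))
# 0.05 -> True  (0.95*10 - 0.05*100 = 4.5)
# 0.40 -> False (0.60*10 - 0.40*100 = -34.0)
```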