Abstract:
Continuous biometric authentication schemes (CBAS) are built around the biometrics supplied by users' behavioural characteristics and continuously verify the identity of the user throughout the session. The current literature on CBAS primarily focuses on the accuracy of the system in order to reduce false alarms. However, these attempts do not consider various issues that might affect practicality in real-world applications and continuous authentication scenarios. One of the main issues is that existing CBAS rely on several samples of training data, drawn either from both intruders and valid users or from the valid users' profiles alone. This means that historical profiles for either the legitimate users or possible attackers must be available or collected before prediction time. However, in some cases it is impractical to obtain the biometric data of the user in advance (before detection time). Another issue is the variability of the user's behaviour between the registered profile obtained during enrolment and the profile from the testing phase. The aim of this paper is to identify the limitations of current CBAS in order to make them more practical for real-world applications. The paper also discusses a new application for CBAS that requires no training data from either intruders or valid users.
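The training-data limitation described above can be made concrete with a short sketch of a conventional profile-based check, the kind the abstract critiques: a valid-user profile has to be collected at enrolment before any detection is possible. The feature choice (keystroke hold times), the profile statistics and the threshold are illustrative assumptions, not the paper's method:

```python
import numpy as np

# Minimal sketch of a conventional profile-based CBAS check (the approach the
# abstract critiques): a valid-user profile must be collected at enrolment,
# and test-time behaviour is compared against it. Feature choice (keystroke
# hold times) and threshold are illustrative assumptions, not the paper's method.

def build_profile(enrolment_samples: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Summarise enrolment behaviour as per-feature mean and std."""
    return enrolment_samples.mean(axis=0), enrolment_samples.std(axis=0) + 1e-9

def anomaly_score(profile, session_sample: np.ndarray) -> float:
    """Mean absolute z-score of a session sample against the stored profile."""
    mean, std = profile
    return float(np.mean(np.abs((session_sample - mean) / std)))

# Enrolment: hold times (ms) for 5 typing samples over 4 keys.
enrolment = np.array([[95, 110, 102, 88],
                      [97, 108, 100, 90],
                      [93, 112, 104, 86],
                      [96, 109, 101, 89],
                      [94, 111, 103, 87]], dtype=float)
profile = build_profile(enrolment)

THRESHOLD = 3.0  # tuning this is exactly the false-alarm trade-off the paper notes
session = np.array([140, 150, 145, 130], dtype=float)  # markedly different rhythm
print("intruder" if anomaly_score(profile, session) > THRESHOLD else "valid user")
```

The sketch makes the paper's point visible: nothing can be flagged until the enrolment matrix exists, which is precisely the requirement the proposed training-data-free application removes.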
Abstract:
At QUT, research data refers to information that is generated or collected to be used as a primary source in the production of original research results, and which would be required to validate or replicate research findings (Callan, De Vine, & Baker, 2010). Making publicly funded research data discoverable by the broader research community and the public is a key aim of the Australian National Data Service (ANDS). Queensland University of Technology (QUT) has been innovating in this space by undertaking mutually dependent technical and content (metadata) focused projects funded by ANDS. Research Data Librarians identified and described datasets generated from Category 1 funded research at QUT by interviewing researchers, collecting metadata and creating metadata records for upload to the Australian Research Data Commons (ARDC) and exposure through the Research Data Australia interface. In parallel with this project, a Research Data Management Service and a Metadata Hub project were being undertaken by QUT High Performance Computing & Research Support specialists. These projects will collectively store and aggregate QUT’s metadata and research data from multiple repositories and administration systems, and contribute metadata directly by OAI-PMH compliant feed to RDA. The pioneering nature of the work resulted in a collaborative project dynamic in which good data management practices and the discoverability and sharing of research data were the shared drivers for all activity. Each project’s development and progress were dependent on feedback from the other: the metadata structure evolved in tandem with the development of the repository, and the development of the repository interface responded to the needs of the data interview process. The project environment was one of bottom-up collaborative approaches to process and system development, matched by top-down strategic alliances crossing organisational boundaries to provide the deliverables required by ANDS. This paper showcases the work undertaken at QUT, focusing on the Seeding the Commons project as a case study, and illustrates how the data management projects are interconnected. It describes the processes and systems being established to make QUT research data more visible, and the nature of the collaborations between organisational areas required to achieve this. The paper concludes with the Seeding the Commons project outcomes and the contribution this project made to getting more research data ‘out there’.
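The OAI-PMH compliant feed mentioned above can be sketched with a minimal harvesting client. This is an illustration under assumptions, not QUT's actual implementation: the endpoint URL is a placeholder, and oai_dc is the baseline Dublin Core prefix the OAI-PMH specification requires every repository to support (the ANDS feed itself used RIF-CS records):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Minimal OAI-PMH ListRecords harvest. ENDPOINT is a placeholder assumption;
# 'oai_dc' is the Dublin Core metadata prefix that the OAI-PMH specification
# requires every repository to support.
ENDPOINT = "https://example.edu.au/oai"  # hypothetical endpoint
url = f"{ENDPOINT}?verb=ListRecords&metadataPrefix=oai_dc"

with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}
# Print the identifier of each harvested record.
for record in tree.iter("{http://www.openarchives.org/OAI/2.0/}record"):
    header = record.find("oai:header", NS)
    if header is not None:
        print(header.findtext("oai:identifier", namespaces=NS))
```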
Abstract:
A review of Lloyd Jones's Mister Pip, winner of the 2007 Commonwealth Writers' Prize and shortlisted for the 2007 Man Booker Prize.
Abstract:
This interview with Paul Makeham was conducted in 2010 by Felipe Carneiro from Brazilian business magazine Exame. Structured around Exame's "seven questions" format ("Sete Perguntas"), the interview ranges across topics relating to the creative economy, including the increasingly important role of creativity in business, and the role of education in promoting creativity.
Abstract:
Occlusion is a major challenge for facial expression recognition (FER) in real-world situations. Previous FER efforts to address occlusion suffer from loss of appearance features and are largely limited to a few occlusion types and a single testing strategy. This paper presents a robust approach for FER in occluded images that addresses these issues. A set of Gabor-based templates is extracted from images in the gallery using a Monte Carlo algorithm. These templates are converted into distance features using template matching, and the resulting feature vectors are robust to occlusion. Occluded eye and mouth regions and randomly placed occlusion patches are used for testing. Two testing strategies analyze the effects of these occlusions on the overall recognition performance as well as on each facial expression. Experimental results on the Cohn-Kanade database confirm the high robustness of our approach and provide useful insights into the effects of occlusion on FER. Performance is also compared with previous approaches.
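A minimal sketch of the Gabor-template pipeline described above follows: build a small bank of Gabor filters, convolve a face image, and turn a stored template into a distance feature via template matching. All parameter values, the synthetic input image and the fixed template crop are illustrative assumptions, not the paper's configuration:

```python
import cv2
import numpy as np

def gabor_bank(ksize=21, sigma=4.0, lambd=10.0, gamma=0.5):
    """Small Gabor filter bank over 4 orientations (parameters are illustrative)."""
    thetas = np.arange(0, np.pi, np.pi / 4)
    return [cv2.getGaborKernel((ksize, ksize), sigma, t, lambd, gamma, psi=0)
            for t in thetas]

def distance_feature(image: np.ndarray, template: np.ndarray) -> float:
    """Smallest normalised squared-difference matching distance over the image."""
    result = cv2.matchTemplate(image, template, cv2.TM_SQDIFF_NORMED)
    return float(result.min())

# Synthetic stand-in for a grayscale face image (a real system would load
# gallery images, e.g. from the Cohn-Kanade database).
rng = np.random.default_rng(0)
face = rng.random((128, 128)).astype(np.float32)

features = []
for kernel in gabor_bank():
    response = cv2.filter2D(face, cv2.CV_32F, kernel)
    # The paper samples templates from gallery images with a Monte Carlo
    # algorithm; cropping one fixed patch here is purely for illustration.
    template = response[40:60, 40:60].copy()
    features.append(distance_feature(response, template))

print(features)  # one distance feature per Gabor orientation
```

Because each feature is the best match anywhere in the response map, a local occlusion leaves most features intact, which is the intuition behind the robustness claim.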
Abstract:
In an environment where economic, political and technological change is the rule, a fundamental business strategy should be the defence of traditional markets and thoughtful entry into new markets, with an aim to increase market penetration and stimulate profit. The success of such a strategy will depend on firms doing more and better for customers than their competitors. In other words, the firm’s primary competitive advantage will come from the changes it implements to please its customers. In the construction industry, the complexity of technical knowledge and construction processes has traditionally encouraged clients to play a largely passive role in the management of their projects. However, today’s clients not only want to know about the internal efficiency of their projects but also need to know how they and their contractors compare and compete against their competitors. Given the vulnerability of construction activities in the face of regional financial crises, constructors need to be proactive in the search to improve their internal firm and project processes to ensure profitability and market responsiveness. In this context, reengineering is a radical redesign that emphasises customer satisfaction rather than cost reduction. This paper discusses the crucial role of the client-project interface and how project networks could facilitate and improve information dissemination and sharing, collaborative efforts, decision-making and project climate. An intra-project network model is presented, and project managers’ roles and competencies in forming and coordinating project workgroups are discussed.
Abstract:
The introduction by the Australian federal government of its Carbon Pollution Reduction Scheme was a decisive step in the transformation of Australia into a low carbon economy. Since the release of the Scheme, however, political discourse relating to environmental sustainability and climate change in Australia has focused primarily on political, scientific and economic issues. Insufficient attention has been paid to the financial opportunities which commoditisation of the carbon market may offer, and little emphasis has been placed on the legal implications of the creation of a "new" asset and market. This article seeks to shed some light on the discernible opportunities which the Scheme should provide to participants in the Australian and international debt markets.
Abstract:
Prior to the decision of the High Court in Black v Garnock (2007) 230 CLR 438 it was an established principle in Queensland that a judgment creditor acting under an enforcement warrant could take no interest beyond what the judgment debtor could give. However, the decision of the High Court called this principle into question. This article examines the current position in the context of s 120 of the Land Title Act 1994 (Qld), Queensland Titles Office practice and standard contractual provisions. This examination is further informed by the recent decision of Martin J in Secure Funding Pty Ltd v Doneley [2010] QSC 91.
Abstract:
If a real estate agent describes a property as being “a golden opportunity to invest”, the expression will be readily construed as mere “puffery”. The legal landscape changes when a real estate agent describes a property as “leased” and having a “guaranteed net income”. Can an agent avoid potential liability for an inaccurate description by arguing that they were merely acting as a messenger, passing on information received from their vendor client? The potential liability of real estate agent “messengers” was recently considered by the Queensland Court of Appeal in Banks & Anor v Copas Newnham Pty Ltd & Ors [2002] QCA 217.
Abstract:
This article compares YouTube and the National Film and Sound Archive (NFSA) as resources for television historians interested in viewing old Australian television programs. The author searched for seventeen important television programs, identified in a previous research project, to compare what was available in the two archives and how easy it was to find. The analysis focused on differences in curatorial practices of accessioning and cataloguing. NFSA is stronger in current affairs and older programs, while YouTube is stronger in game shows and lifestyle programs. YouTube is stronger than the NFSA on “human interest” material—births, marriages, and deaths. YouTube accessioning more strongly accords with popular histories of Australian television. Both NFSA and YouTube offer complete episodes of programs, while YouTube also offers many short clips of “moments.” YouTube has more surprising pieces of rare ephemera. YouTube cataloguing is more reliable than that of the NFSA, with fewer broken links. The YouTube metadata can be searched more intuitively. The NFSA generally provides more useful reference information about production and broadcast dates.
Abstract:
Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by $A$ and the input dimension is $n$. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus $A^3 \sqrt{(\log n)/m}$ (ignoring $\log A$ and $\log m$ factors), where $m$ is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
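The bound can be written in display form as follows. This simply restates the abstract's statement; the symbol $\hat{\varepsilon}$ is my notation for the error estimate related to training squared error, and constants together with the $\log A$ and $\log m$ factors are suppressed as in the text:

```latex
% Restatement of the bound from the abstract: A bounds the per-unit sum of
% weight magnitudes, n is the input dimension, m is the number of training
% patterns (constants and log A, log m factors suppressed).
\[
  \Pr\bigl[\text{misclassification}\bigr]
    \;\le\; \hat{\varepsilon} \;+\; A^{3}\sqrt{\frac{\log n}{m}}
\]
```

Read this way, the dependence on the weight bound $A$ rather than on the parameter count is exactly what licenses the paper's support for weight decay and early stopping.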