44 results for Gessner, Conrad, 1764-1826.

at Queensland University of Technology - ePrints Archive


Relevance: 10.00%

Abstract:

While extensive literature exists on knowledge-based urban development (KBUD) in large metropolitan cities, there is a paucity of literature on similar developments in small regional towns. The major aim of this paper is to examine the nature of, and potential for, building knowledge precincts in regional towns. Through a review of the extant literature on knowledge precincts, five key value elements and principles for development are identified. These principles are then tested and applied in a case study of the small town of Cooroy in Noosa, Australia. The Cooroy Lower Mill Site and its surroundings are the designated location for what may be called a community-based creative knowledge precinct. The opportunities and challenges of establishing a creative knowledge precinct in Cooroy are examined. The study shows that there is potential to develop Cooroy through the provision of cultural and learning facilities, partnerships with government, business and educational institutions, and networking with other creative and knowledge precincts in the region. However, there are also specific challenges to developing a knowledge precinct in a regional town, relating to critical mass, competition and governance.

Relevance: 10.00%

Abstract:

This paper reports one aspect of a study of 28 young adults (18–26 years) engaging with the uncertain (contested) science of a television news report about recent research into mobile phone health risks. The aim of the study was to examine these young people’s ‘accounts of scientific knowledge’ in this context. Seven groups of friends responded to the news report, initially in focus group discussions. Later, in semi-structured interviews, they elaborated their understanding of the nature of science through their explanations of the scientists’ disagreement and described their mobile phone safety risk assessments. This paper presents their accounts in terms of their views of the nature of science and their concept understanding. Discussions were audio-recorded and then analysed by coding the talk in terms of issues raised, which were grouped into themes and interpreted within a moderate social constructionist theoretical framing. In this context, most participants expressed a ‘common sense’ view of the nature of science, describing it as an atheoretical, technical procedure of scientists testing their personal opinions on the issue, subject to the influence of funding sponsors. The roles of theory and data interpretation were largely ignored. It is argued that an understanding of the nature of science is crucial to engagement with contemporary socioscientific issues, particularly the roles of argumentation, theory, data interpretation, and the distinction of science from common sense. Implications for school science relate primarily to the teaching of the nature of science and the inclusion of socioscientific issues in school science curricula. Future research directions are considered.

Relevance: 10.00%

Abstract:

This architectural and urban design project was conducted as part of the Brisbane Airport Corporation's master-planning Atelier, run in conjunction with City Lab. This creation and innovation event brought together approximately 80 designers, associated professionals, and both local and state government representatives to research concepts for the future development and planning of the Brisbane airport site. The Team Delta research project explored the development of a new precinct cluster around the existing international terminal building, with a view to reinforcing the sense of place and arrival. The development zone explores options for developing a subtropical character through landscape elements such as open plazas, tourist attractions, links to existing adjacent waterways, and localised rapid-transport options. The proposal tests the possibilities of developing a cultural hub in conjunction with transport infrastructure and the airport terminal(s).

Relevance: 10.00%

Abstract:

It hasn’t been a good year for media barons. Actually, it’s not been a great century. In 2007 Baron Conrad Black was sent to jail in the US for defrauding his shareholders. Silvio Berlusconi’s grip on the Italian media hasn’t prevented a steady flow of allegations of sleaze and scandal since 2009, which have reduced him to a global laughing stock. And since July 2011, we have seen the dizzying fall of Rupert Murdoch and his son, James, from their positions of unquestioned (and unquestionable) authority at the helm of the world’s most powerful media empire.

Relevance: 10.00%

Abstract:

Mesenchymal stem cells (MSC) are emerging as a leading cellular therapy for a number of diseases. However, for such treatments to become available as a routine therapeutic option, efficient and cost-effective means for industrial manufacture of MSC are required. At present, clinical grade MSC are manufactured through a process of manual cell culture in specialized cGMP facilities. This process is open, extremely labor intensive, costly, and impractical for anything more than a small number of patients. While it has been shown that MSC can be cultivated in stirred bioreactor systems using microcarriers, providing a route to process scale-up, the degree of numerical expansion achieved has generally been limited. Furthermore, little attention has been given to the issue of primary cell isolation from complex tissues such as placenta. In this article we describe the initial development of a closed process for bulk isolation of MSC from human placenta, and subsequent cultivation on microcarriers in scalable single-use bioreactor systems. Based on our initial data, we estimate that a single placenta may be sufficient to produce over 7,000 doses of therapeutic MSC using a large-scale process.

Relevance: 10.00%

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centred algorithms from R to C++ becomes straightforward. The algorithms retain their overall structure as well as readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
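As a language-neutral illustration of the kind of linear-algebra-centred loop being benchmarked here, the sketch below implements a minimal one-dimensional Kalman filter in plain Python; the noise parameters and measurement data are purely illustrative and unrelated to the paper's actual benchmark code.

```python
# Minimal 1-D Kalman filter: the sort of iterative predict/update loop
# whose R and C++ implementations the abstract compares. All parameter
# values below are hypothetical.
def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    x, p = x0, p0              # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q              # predict: variance grows by process noise q
        k = p / (p + r)        # Kalman gain (r = measurement noise)
        x = x + k * (z - x)    # update: move estimate towards measurement
        p = (1 - k) * p        # update: shrink variance
        estimates.append(x)
    return estimates

smoothed = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0])
```

In R this loop would run under the interpreter; compiled via Rcpp/Armadillo, the same structure executes as native code, which is where the reported speedups come from.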

Relevance: 10.00%

Abstract:

Modelling video sequences by subspaces has recently shown promise for recognising human actions. Subspaces are able to accommodate the effects of various image variations and can capture the dynamic properties of actions. Subspaces form a non-Euclidean, curved Riemannian manifold known as a Grassmann manifold. Inference on manifold spaces is usually achieved by embedding the manifolds in higher-dimensional Euclidean spaces. In this paper, we instead propose to embed the Grassmann manifolds into reproducing kernel Hilbert spaces and then tackle the problem of discriminant analysis on such manifolds. To achieve efficient machinery, we propose graph-based local discriminant analysis that utilises within-class and between-class similarity graphs to characterise intra-class compactness and inter-class separability, respectively. Experiments on the KTH, UCF Sports, and Ballet datasets show that the proposed approach obtains marked improvements in discrimination accuracy in comparison to several state-of-the-art methods, such as the kernel version of the affine hull image-set distance, tensor canonical correlation analysis, spatial-temporal words, and the hierarchy of discriminative space-time neighbourhood features.
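The kernel-embedding step can be illustrated with the projection kernel, a standard positive-definite kernel on Grassmann manifolds; the abstract does not specify which kernel the authors actually use, so this is only a representative example, shown for the simplest case of one-dimensional subspaces.

```python
# Projection kernel on a Grassmann manifold, k(U, V) = ||U^T V||_F^2,
# specialised to 1-D subspaces spanned by single unit vectors, where it
# reduces to (u . v)^2. Toy illustration only.
def proj_kernel(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    # squaring makes the kernel invariant to the sign (basis choice)
    # of each spanning vector, so it depends only on the subspaces
    return dot ** 2

u = [1.0, 0.0]
v = [0.0, 1.0]      # orthogonal subspace to u
w = [-1.0, 0.0]     # same subspace as u, different basis vector
```

Because the kernel depends only on the subspaces (not on the particular orthonormal bases chosen), `proj_kernel(u, w)` equals `proj_kernel(u, u)`, which is exactly the property needed to treat points on the Grassmann manifold consistently inside an RKHS.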

Relevance: 10.00%

Abstract:

Background subtraction is a fundamental low-level processing task in numerous computer vision applications. The vast majority of algorithms process images on a pixel-by-pixel basis, where an independent decision is made for each pixel. A general limitation of such processing is that rich contextual information is not taken into account. We propose a block-based method capable of dealing with noise, illumination variations, and dynamic backgrounds, while still obtaining smooth contours of foreground objects. Specifically, image sequences are analyzed on an overlapping block-by-block basis. A low-dimensional texture descriptor obtained from each block is passed through an adaptive classifier cascade, where each stage handles a distinct problem. A probabilistic foreground mask generation approach then exploits block overlaps to integrate interim block-level decisions into final pixel-level foreground segmentation. Unlike many pixel-based methods, ad-hoc postprocessing of foreground masks is not required. Experiments on the difficult Wallflower and I2R datasets show that the proposed approach obtains on average better results (both qualitatively and quantitatively) than several prominent methods. We furthermore propose the use of tracking performance as an unbiased approach for assessing the practical usefulness of foreground segmentation methods, and show that the proposed approach leads to considerable improvements in tracking accuracy on the CAVIAR dataset.
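The overlap-voting idea can be sketched as follows. This is a toy simplification, not the paper's method: the real approach uses a low-dimensional texture descriptor and an adaptive classifier cascade per block, whereas here each block's "descriptor" is simply its mean intensity compared against a static background frame.

```python
# Toy block-based foreground segmentation: classify overlapping blocks
# whole, then combine per-pixel votes from the overlaps into a mask.
def block_foreground_mask(frame, background, block=2, step=1, thresh=20.0):
    h, w = len(frame), len(frame[0])
    votes = [[0] * w for _ in range(h)]    # foreground votes per pixel
    counts = [[0] * w for _ in range(h)]   # blocks covering each pixel
    for i in range(0, h - block + 1, step):
        for j in range(0, w - block + 1, step):
            # block-level decision: mean intensity vs. background model
            f_mean = sum(frame[i + a][j + b]
                         for a in range(block) for b in range(block)) / block ** 2
            b_mean = sum(background[i + a][j + b]
                         for a in range(block) for b in range(block)) / block ** 2
            is_fg = abs(f_mean - b_mean) > thresh
            for a in range(block):
                for b in range(block):
                    counts[i + a][j + b] += 1
                    votes[i + a][j + b] += is_fg
    # pixel is foreground if a majority of its overlapping blocks said so
    return [[votes[y][x] * 2 > counts[y][x] for x in range(w)] for y in range(h)]

background = [[10] * 6 for _ in range(6)]
frame = [row[:] for row in background]
for y in range(2, 5):                      # paste a bright 3x3 "object"
    for x in range(2, 5):
        frame[y][x] = 200
mask = block_foreground_mask(frame, background)
```

Because every pixel's label aggregates several block-level decisions, the mask comes out smooth without the ad-hoc postprocessing that pixel-wise methods often need, which is the point the abstract makes.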

Relevance: 10.00%

Abstract:

In the field of face recognition, Sparse Representation (SR) has received considerable attention during the past few years. Most of the relevant literature focuses on holistic descriptors in closed-set identification applications. The underlying assumption in SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the more challenging face verification scenario, where an algorithm is required to determine whether two faces (where one or both have not been seen before) belong to the same person. In this paper, we first discuss why previous attempts with SR might not be applicable to verification problems. We then propose an alternative approach to face verification via SR. Specifically, we propose to use explicit SR encoding on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which are then concatenated to form an overall face descriptor. Due to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, we evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN), and an implicit probabilistic technique based on Gaussian Mixture Models. Thorough experiments on the AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the proposed local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, in both verification and closed-set identification problems. The experiments also show that l1-minimisation based encoding has a considerably higher computational cost than the other techniques, but leads to higher recognition rates.
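The encode/pool/concatenate pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the tiny dictionary and the hard-assignment encoder below are stand-ins for the l1-minimisation, SANN, or GMM encoders actually evaluated in the paper.

```python
# Sketch of the local-patch face descriptor: encode each patch sparsely,
# average-pool the codes within each region, then concatenate regions.
def encode_patch(patch, dictionary):
    # trivial "sparse code": 1.0 for the nearest dictionary atom, 0 elsewhere
    dists = [sum((p - a) ** 2 for p, a in zip(patch, atom)) for atom in dictionary]
    code = [0.0] * len(dictionary)
    code[dists.index(min(dists))] = 1.0
    return code

def face_descriptor(regions, dictionary):
    descriptor = []
    for patches in regions:                # one list of patches per region
        codes = [encode_patch(p, dictionary) for p in patches]
        pooled = [sum(c[k] for c in codes) / len(codes)   # average pooling
                  for k in range(len(dictionary))]
        descriptor.extend(pooled)          # concatenate region descriptors
    return descriptor

dictionary = [[0.0, 0.0], [1.0, 1.0]]      # two hypothetical 2-D atoms
regions = [[[0.1, 0.0], [0.9, 1.0]],       # region 1: one patch near each atom
           [[1.0, 0.9], [0.8, 1.1]]]       # region 2: both patches near atom 2
desc = face_descriptor(regions, dictionary)
```

Averaging within a region discards where each patch sat inside that region, which is precisely the deliberate loss of spatial relations that gives the descriptor its robustness to misalignment.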

Relevance: 10.00%

Abstract:

Recent advances in computational geodynamics are applied to explore the link between Earth’s heat, its chemistry and its mechanical behavior. Computational thermal-mechanical solutions are now allowing us to understand Earth patterns by solving the basic physics of heat transfer. This approach is currently used to solve basic convection patterns of terrestrial planets. Applying the same methodology to smaller scales delivers promising similarities between observed and predicted structures which are often the site of mineral deposits. The new approach involves a fully coupled solution to the energy, momentum and continuity equations of the system at all scales, allowing the prediction of fractures, shear zones and other typical geological patterns out of a randomly perturbed initial state. The results of this approach are linking a global geodynamic mechanical framework over regional-scale mineral deposits down to the underlying micro-scale processes. Ongoing work includes the challenge of incorporating chemistry into the formulation.

Relevance: 10.00%

Abstract:

Understanding network traffic behaviour is crucial for managing and securing computer networks. One important technique is to mine frequent patterns or association rules from analysed traffic data. On the one hand, association rule mining usually generates a huge number of patterns and rules, many of them meaningless or unwanted by users; on the other hand, it can miss necessary knowledge if it does not consider the hierarchy relationships in the network traffic data. Aiming to address such issues, this paper proposes a hybrid association rule mining method for characterising network traffic behaviour. Rather than frequent patterns, the proposed method generates non-similar closed frequent patterns from network traffic data, which can significantly reduce the number of patterns. The method also derives new attributes from the original data to discover novel knowledge according to hierarchy relationships in network traffic data and user interests. Experiments performed on real network traffic data show that the proposed method is promising and can be used in real applications. Copyright © 2013 John Wiley & Sons, Ltd.
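The "closed frequent patterns" idea the abstract relies on can be illustrated with a toy brute-force miner (not the paper's algorithm, and deliberately exponential, so suitable only for tiny inputs): a frequent itemset is closed if no proper superset has the same support, so reporting only closed itemsets prunes redundant patterns without losing any support information.

```python
# Toy miner for closed frequent itemsets over a list of transactions.
from itertools import combinations

def closed_frequent_itemsets(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    support = {}
    # brute-force: count support of every candidate itemset
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            s = sum(1 for t in transactions if set(combo) <= set(t))
            if s >= min_support:
                support[frozenset(combo)] = s
    # keep an itemset only if no proper superset has identical support
    closed = {}
    for itemset, s in support.items():
        if not any(itemset < other and s == support[other] for other in support):
            closed[itemset] = s
    return closed

# hypothetical traffic "transactions" (protocol labels per flow)
transactions = [{"tcp", "http"}, {"tcp", "http"}, {"tcp", "dns"}]
closed = closed_frequent_itemsets(transactions, min_support=2)
```

Here {http} is frequent but not closed, because its superset {http, tcp} occurs in exactly the same flows; dropping it loses nothing, which is how the closed-pattern representation shrinks the output the abstract complains about.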