Abstract:
Most recommender systems use collaborative filtering, content-based filtering or a hybrid approach to recommend items to new users. Collaborative filtering recommends items to new users based on their similar neighbours, while content-based filtering tries to recommend items that are similar to new users' profiles. The fundamental issues include how to profile new users and how to deal with over-specialization in content-based recommender systems. Indeed, the terms used to describe items can be organised as a concept hierarchy. Therefore, we aim to describe user profiles or information needs using concept vectors. This paper presents a new method to acquire user information needs, which allows new users to describe their preferences on a concept hierarchy rather than by rating items. It also develops a new ranking function to recommend items to new users based on their information needs. The proposed approach is evaluated on Amazon book datasets. The experimental results demonstrate that the proposed approach substantially improves the effectiveness of recommender systems.
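The idea of matching users to items through concept vectors can be illustrated with a minimal sketch. The concept names, weights, and the weighted-overlap scoring below are illustrative assumptions, not the paper's actual ranking function:

```python
# Sketch of concept-vector recommendation: a new user states preferences over
# concepts (not item ratings); items are scored by weighted concept overlap.
# All concept names and weights here are hypothetical.

def score_item(user_concepts, item_concepts):
    """Score an item by the overlap between the user's concept weights
    and the concepts describing the item."""
    return sum(user_concepts.get(c, 0.0) * w for c, w in item_concepts.items())

user = {"science_fiction": 0.8, "history": 0.2}      # elicited preferences
items = {
    "Dune":       {"science_fiction": 1.0},
    "SPQR":       {"history": 1.0},
    "Foundation": {"science_fiction": 0.9, "history": 0.1},
}
# Rank all items for this user, best match first.
ranked = sorted(items, key=lambda i: score_item(user, items[i]), reverse=True)
```

Because preferences are expressed over a shared concept hierarchy rather than over previously rated items, this kind of scoring can produce a ranking for a brand-new user with no rating history.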
Abstract:
Different reputation models are used on the web to generate reputation values for products from users' review data. Most current reputation models use review ratings and neglect users' textual reviews because text is more difficult to process. However, we argue that an overall reputation score for an item does not reflect the actual reputation of each of its features, which is why users' textual reviews are necessary. In our work we introduce a new reputation model that defines a new method for aggregating users' opinions about product features extracted from review text. Our model uses a feature ontology to define the general features and sub-features of a product, and it reflects the frequencies of positive and negative opinions. We provide a case study to show how our results compare with those of other reputation models.
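A feature-level aggregation of this kind can be sketched as follows. The particular aggregation (per-feature positive fraction, weighted by how often each feature is mentioned) is an assumption for illustration, not the paper's model, and the feature names are hypothetical:

```python
# Sketch of feature-level reputation: each feature gets a score from its
# positive/negative opinion counts, and the product score weights features
# by mention frequency. The aggregation formula is an illustrative assumption.

def feature_reputation(pos, neg):
    """Reputation of a single feature from opinion counts."""
    total = pos + neg
    return pos / total if total else 0.5  # neutral prior when unmentioned

def product_reputation(opinions):
    """Aggregate feature scores, weighting each feature by mention frequency."""
    total_mentions = sum(p + n for p, n in opinions.values())
    return sum((p + n) / total_mentions * feature_reputation(p, n)
               for p, n in opinions.values())

opinions = {"battery": (40, 10), "screen": (25, 25)}  # (positive, negative)
rep = product_reputation(opinions)
```

Note how a single overall rating would hide that "battery" (0.8) and "screen" (0.5) have very different reputations, which is the paper's motivation for feature-level aggregation.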
Abstract:
Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management; it has become increasingly important and has attracted wide attention from researchers in different fields. This paper first introduces the main feature selection methods, implementation algorithms and applications of text classification. However, the knowledge extracted by current data-mining techniques for text classification contains much noise, which introduces uncertainty into both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilization of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to introduce an innovative evaluation metric, named CEI, which is very effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
Abstract:
The Australian Curriculum: English (AC:E) is being implemented in Queensland and asks teachers and curriculum designers to incorporate the cross-curriculum priority of Sustainability. This paper examines some texts suitable for inclusion in classroom study and suggests some companion texts that may be studied alongside them, including online resources by the ABC and those developed online for the Australian Curriculum. We also suggest some formative and summative assessment possibilities for responding to the selected works in this guide. We have endeavoured to investigate literature that enables students to explore and produce text types across the three AC:E categories: persuasive, imaginative and informative. The selected texts cover traditional novels, novellas, sci-fi and speculative fiction, non-fiction, documentary, feature film and animation. Some of the texts reviewed here also cover the other cross-curriculum priorities, including texts by Aboriginal and Torres Strait Islander writers and some which also include Asian representations. We have also indicated which of the AC:E general capabilities are addressed in each text.
Abstract:
It has been almost five years since I first published the article entitled “Much Ado About Staining” in Review of Optometry, which explored what we really knew in 2006 about the relationship between “corneal staining” and contact lens multipurpose solutions (MPS). This was published just prior to the controversial “staining grid.” While the Grid showed MPS-associated hyperfluorescence under the slit lamp at two hours, it did not explain the “what” or “why” behind it; even so, many proponents of the Grid continue to suggest that it shows us which solution/lens combinations are “biocompatible” and which are not. New evidence suggests that the preservative-associated transient hyperfluorescence (or PATH) observed at two hours after lens insertion is a benign phenomenon due to an interaction between fluorescein, MPS preservatives, and corneal cell membranes. The misinterpretation of PATH as “real” corneal staining, like that observed in pathological conditions, may be due in part to the fact that there is not a lot of teaching regarding the true properties of fluorescein and what is actually occurring when we see either PATH or corneal staining. To discuss the science of fluorescein, corneal staining, and PATH, I have asked some of the preeminent research experts in the study of fluorescence spectroscopy and corneal staining from around the world to share their new research and personal opinions on these topics...
Abstract:
The Australian masonry standard allows either prism tests or correction factors based on block height and mortar thickness to evaluate masonry compressive strength. The correction factor ensures that taller units with conventional 10 mm mortar joints are not disadvantaged by the size effect. In recent times, 2-4 mm thick, high-adhesive mortars and H blocks with only a mid-web shell have been used in masonry construction. H blocks and thinner, higher-adhesive mortars have renewed interest in the compression behaviour of hollow concrete masonry, which is therefore revisited in this paper. This paper presents an experimental study carried out to examine the effects of the thickness of mortar joints, the type of mortar adhesive and the presence of web shells on hollow concrete masonry prisms under axial compression. A non-contact digital image correlation technique was used to measure the deformation of the prisms and was found adequate for determining the strain field of the loaded face shells subjected to axial compression. It is found that the absence of end web shells lowers the compressive strength and stiffness of the prisms, and that the thinner, higher-adhesive mortars increase the compressive strength and stiffness while lowering the Poisson's ratio. © Institution of Engineers Australia, 2013.
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, these notions are exponential to compute and yield only a Boolean result. In many cases, however, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems, provided unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
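The contrast between a Boolean equivalence check and a graded consistency degree can be sketched with a toy example. This reduces profiles to a strict-order relation derived from example traces; the real causal behavioural profiles also capture exclusiveness and co-occurrence and are computed structurally from the model, not from logs:

```python
# Sketch: derive an order relation over activity pairs and quantify agreement
# between two behavioural descriptions as a fraction in [0, 1], instead of a
# Boolean equivalence verdict. A simplified stand-in for behavioural profiles.

from itertools import combinations

def order_relation(traces):
    """Pairs (a, b) such that a occurs before b in some trace and never after."""
    before, after = set(), set()
    for t in traces:
        for i, a in enumerate(t):
            for b in t[i + 1:]:
                before.add((a, b))
                after.add((b, a))
    return {p for p in before if p not in after}

def consistency_degree(rel_spec, rel_impl, activities):
    """Fraction of activity pairs on which the two relations agree."""
    pairs = list(combinations(sorted(activities), 2))
    agree = sum(((a, b) in rel_spec) == ((a, b) in rel_impl)
                and ((b, a) in rel_spec) == ((b, a) in rel_impl)
                for a, b in pairs)
    return agree / len(pairs)

spec = order_relation([["a", "b", "c"]])   # specification behaviour
impl = order_relation([["a", "c", "b"]])   # implementation behaviour
degree = consistency_degree(spec, impl, {"a", "b", "c"})
```

Here the two behaviours disagree only on the ordering of b and c, so the degree is 2/3 rather than a flat "not equivalent", which is the kind of quantification the article argues for.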
Abstract:
Very little is known about the influence of the mechanical environment on the healing of large segmental defects. This partly reflects the lack of standardised, well-characterised technologies to enable such studies. Here we report the design, construction and characterisation of a novel external fixator for use in conjunction with rat femoral defects. This device not only imposes a predetermined axial stiffness on the lesion, but also enables the stiffness to be changed during the healing process. The main frame of the fixator consists of polyetheretherketone with titanium alloy mounting pins. The stiffness of the fixator is determined by interchangeable connection elements of different thicknesses. Fixators were shown to stabilise 5 mm femoral defects in rats in vivo for at least 8 weeks during unrestricted cage activity. No distortion or infections, including pin infections, were noted. The healing process was simulated in vitro by inserting into a 5 mm femoral defect materials whose Young’s moduli approximated those of the different tissues present in regenerating bone. These studies confirmed that, although the external fixator is the major determinant of axial stiffness during the early phase of healing, the regenerate within the lesion subsequently dominates this property. There is much clinical interest in altering the mechanics of the defect to enhance bone healing. Our data suggest that, if alteration of the mechanical environment is to be used to modulate the healing of large segmental defects, it needs to be performed before the tissue properties become dominant.
Abstract:
This volume stems from the 1st International Conference on Epithelial-Mesenchymal Transitions (EMT), which was convened by the editors on October 5–8, 2003 in the beautiful setting of Port Douglas, Queensland, Australia. EMT, the name given to the transformation of cells arranged in a coherent layer – epithelial cells – into more individualistic and potentially motile cells – mesenchymal cells – was recognized decades ago by Prof. Elisabeth (Betty) Hay (Harvard Medical School, Boston, Mass., USA) as a primary mechanism in embryogenesis for remodelling tissues. More recently, EMT has been seen as crucial to the spread and invasion of carcinoma, and more recently still, EMT-like changes have been detected in various pathologies marked by fibrosis. Despite the basic and clinical importance of EMT, this extremely rapidly growing field had never previously had a conference devoted to it; indeed, the disciplines of developmental biology, cancer and pathology rarely interact, although they have much to share. The chapters assembled for this volume encompass the three major themes of the meeting – development, pathology and cancer – and further highlight the commonality among them in terms of mechanisms and outcomes...
Abstract:
We present a text watermarking scheme that embeds a bitstream watermark Wi in a text document P while preserving the meaning, context, and flow of the document. The document is viewed as a set of paragraphs, each paragraph being a set of sentences. The sequence of paragraphs and sentences used to embed watermark bits is permuted using a secret key. Then, English-language sentence transformations are used to modify sentence lengths, thus embedding watermark bits in the Least Significant Bits (LSB) of the sentences’ cardinalities. The embedding and extracting algorithms are public, while the secrecy and security of the watermark depend on a secret key K. The probability of False Positives is extremely small, which avoids incidental occurrences of our watermark in random text documents. Majority voting provides security against text addition, deletion, and swapping attacks, further reducing the probability of False Positives. The scheme is secure against general attacks on text watermarks such as reproduction (photocopying, FAX), reformatting, synonym substitution, text addition, text deletion, text swapping, paragraph shuffling and collusion attacks.
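The extraction side of such a scheme can be sketched in a few lines. This toy version uses word count as the sentence cardinality and a key-seeded shuffle as the secret permutation; both choices, like the example sentences, are illustrative assumptions rather than the paper's exact construction, and the embedding step (meaning-preserving transformations that set each parity) is omitted:

```python
# Toy sketch of watermark extraction: sentence order is permuted with a
# key-seeded PRNG, and each watermark bit is read from the parity (LSB) of a
# sentence's word count. Embedding (rewriting sentences to set the parity)
# is not shown; the permutation and cardinality measure are assumptions.

import random

def extract_watermark(sentences, key, n_bits):
    order = list(range(len(sentences)))
    random.Random(key).shuffle(order)        # secret embedding order
    return [len(sentences[i].split()) % 2    # LSB of sentence cardinality
            for i in order[:n_bits]]

sentences = [
    "the cat sat",                  # 3 words -> bit 1
    "dogs bark loudly at night",    # 5 words -> bit 1
    "rain fell",                    # 2 words -> bit 0
    "we left early today",          # 4 words -> bit 0
]
bits = extract_watermark(sentences, key="K", n_bits=4)
```

Without the key, an attacker cannot recover which sentences carry which bit positions, which is why the algorithms themselves can remain public.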
Abstract:
"This work forms part of a much larger collaborative album project in progress between Tim Bruniges, Julian Knowles and David Trumpmanis which explores the intersections between traditional rock instrumentation and analogue and digital media. All of the creative team are performers, composers and producers. The material for the album was thus generated by a series of in-studio improvisations and performances, with each collaborator assuming a range of different and alternating roles – guitars, electronics, drums, percussion, bass, keyboards, production. Thematically the work explores the intersection of instrumental (post) rock, ambient music, and historical electro-acoustic tape composition traditions. Over the past 10 years, musical practice has become increasingly hybrid, with the traditional boundaries between genres becoming progressively eroded. At the same time, digital tools have replaced many of the major analogue technologies that dominated music production and performance in the 20th century. The disappearance of analogue media in mainstream musical practice has had a profound effect on the sonic characteristics of contemporary music and the gestural basis for its production. Despite the increasing power of digital technologies, a small but dedicated group of practitioners has continued to prize and use analogue technology for its unique sounds and the non-linearity of the media, aestheticising its inherent limitations and flaws. At the most radical end of this spectrum lie glitch and lo-fi musical forms, seen in part as reactions to the clinical nature of digital media and the perceived lack of character associated with its transparency. Such developments have also problematised the traditional relationships between media and genre, where specific techniques and their associated sounds have become genre markers.
Tristate is an investigation into this emerging set of dialogues between analogue and digital media across composition, production and performance. It employs analogue tape loops in performance, where a tape machine ‘performer’ records and hand-manipulates loops of an electric guitar performer on ‘destroyed’ tape stock (intentionally damaged tape), processing the output of this analogue system in the digital domain with contemporary sound processors. In doing so it investigates how the most extreme sonic signatures of analogue media – tape dropout and noise – can be employed alongside contemporary digital sound gestures in both compositional and performance contexts, and how the extremes of the two media signatures can be brought together both compositionally and performatively. In respect of genre, the work established strategies for merging compositional techniques from the early musique concrète tradition of the 1940s with late-60s popular music experimentalism and the laptop glitch electronica movement of the early 2000s. Lastly, the work explores how analogue recording studio technologies can be used as performance tools, thus illuminating and foregrounding the performative/gestural dimensions of traditional analogue studio tools in use."
Abstract:
Metarhizium anisopliae is a well-characterized biocontrol agent of a wide range of insects, including cane grubs. In this study, two-dimensional (2D) electrophoresis was used to display secreted proteins of M. anisopliae strain FI-1045 growing on whole greyback cane grubs and their isolated cuticles. Hydrolytic enzymes secreted by M. anisopliae play a key role in insect cuticle degradation and initiation of the infection process. We identified all 101 protein spots displayed by cross-species identification (CSI) from the fungal kingdom. Among the identified proteins were a 64-kDa serine carboxypeptidase, 1,3-beta-exoglucanase, dynamin GTPase, THZ kinase, calcineurin-like phosphoesterase, and phosphatidylinositol kinase, secreted by M. anisopliae (FI-1045) in response to exposure to greyback cane grubs and their isolated cuticles. These proteins have not previously been identified in the culture supernatant of M. anisopliae during infection. To our knowledge, this is the first proteomic map established to study the extracellular proteins secreted by M. anisopliae (FI-1045) during infection of greyback cane grubs and their cuticles.
Abstract:
Metarhizium anisopliae is a naturally occurring cosmopolitan fungus infecting greyback canegrubs (Dermolepida albohirtum). The main molecular factors involved in the complex interactions occurring between greyback canegrubs and M. anisopliae (FI-1045) were investigated by comparing the proteomes of healthy canegrubs, canegrubs infected with Metarhizium, and the fungus alone. Differentially expressed proteins from the infected canegrubs were subjected to mass spectrometry to search for pathogenicity-related proteins. Immune-related proteins of canegrubs identified in this study include cytoskeletal proteins (actin), cell communication proteins, proteases and peptidases. Fungal proteins identified include metalloproteins, acyl-CoA, cyclin proteins and chorismate mutase. Comparative proteome analysis provided a view into the cellular reactions triggered in the canegrub in response to fungal infection at the onset of biological control.
Abstract:
This was a catalogue essay for the emerging Brisbane artist Sarah Byrne's exhibition Trenchmouth at MetroArts from 31st October until 21st November 2009. The essay contextualised her practice in the history of experimental sound art and discussed her methods of practice and approach to sound from the position of a multi-disciplinary artist. The essay also discussed the way in which her practice engages with and recontextualises camp, trash, and lo-fi aspects of popular culture.
Abstract:
Drawing on recent doctoral research (Gillespie 2013), this paper presents snapshots of teacher librarians’ lived experience as evidence-based practitioners in Australian schools, and it defines what constitutes evidence for teacher librarians. Gillespie’s doctoral research responded to findings of the Australian Government’s Inquiry into School Libraries and Teacher Librarians in Australian Schools (House of Representatives, Standing Committee on Education and Employment 2011), which indicated the urgent need for empirical evidence about the professional practices of teacher librarians and their contributions to school goals and student academic, social and cultural achievements. Such evidence is crucial to ensure the future viability of teacher librarians and their contribution to learning and teaching in schools. This qualitative, interpretive research responds to this need by exploring the experiences of 15 Australian teacher librarians to discover how they gathered and used evidence in performing their professional roles.