Abstract:
The identification and classification of network traffic and protocols is a vital step in many quality of service and security systems. Traffic classification strategies must evolve, alongside the protocols utilising the Internet, to overcome the use of ephemeral or masquerading port numbers and transport layer encryption. This research expands the concept of using machine learning on the initial statistics of a flow of packets to determine its underlying protocol. Recognising the need for efficient training/retraining of a classifier and the requirement for fast classification, the authors investigate a new application of k-means clustering referred to as 'two-way' classification. The 'two-way' classification uniquely analyses a bidirectional flow as two unidirectional flows and is shown, through experiments on real network traffic, to improve classification accuracy by as much as 18% when measured against similar proposals. It achieves this accuracy while generating fewer clusters, so that fewer comparisons are needed to classify a flow. 'Two-way' classification offers a new way to improve the accuracy and efficiency of machine learning statistical classifiers while maintaining the fast training times associated with k-means.
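As a concrete illustration of the 'two-way' idea, the hedged Python sketch below splits a bidirectional flow into its two unidirectional halves, assigns each half to a k-means cluster trained on initial-packet statistics, and combines the two votes. The feature layout, cluster count and vote-combination rule are illustrative assumptions, not the authors' exact design.

```python
# Hedged sketch of 'two-way' k-means traffic classification: each
# bidirectional flow is treated as two unidirectional flows. Feature
# choice and the vote-combination rule are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def train(features, labels, n_clusters=40):
    """features: one row of initial-packet statistics per unidirectional flow."""
    labels = np.asarray(labels)
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(features)
    cluster_label = {}
    for c in range(n_clusters):
        members = labels[km.labels_ == c]
        if len(members):
            vals, counts = np.unique(members, return_counts=True)
            # Each cluster inherits the majority protocol of its members.
            cluster_label[c] = vals[np.argmax(counts)]
    return km, cluster_label

def classify(km, cluster_label, fwd_feats, rev_feats):
    """Classify a bidirectional flow from its two unidirectional halves."""
    fwd = cluster_label.get(int(km.predict(fwd_feats.reshape(1, -1))[0]))
    rev = cluster_label.get(int(km.predict(rev_feats.reshape(1, -1))[0]))
    if fwd == rev or rev is None:
        return fwd
    return rev if fwd is None else fwd  # on disagreement, trust forward (assumption)
```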
Abstract:
This letter derives mathematical expressions for the received signal-to-interference-plus-noise ratio (SINR) of uplink single-carrier frequency-division multiple-access (SC-FDMA) multiuser MIMO systems. An improved frequency-domain receiver algorithm is derived for the studied systems and is shown to be significantly superior to the conventional linear MMSE-based receiver in terms of SINR and bit error rate (BER) performance.
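For context, the conventional linear MMSE baseline against which the improved receiver is compared has a standard closed-form SINR. The expressions below are the textbook per-stream result for a generic frequency-domain model y = Hx + n; the symbols (H, E_s, sigma squared) are assumptions for illustration, and this is not the letter's derived expression.

```latex
% Standard per-stream linear MMSE SINR (textbook baseline, not the
% letter's improved receiver). Model: y = Hx + n, with E[xx^H] = E_s I
% and E[nn^H] = \sigma^2 I.
\mathbf{W}_{\mathrm{MMSE}} = \Bigl(\mathbf{H}^{H}\mathbf{H} + \tfrac{\sigma^{2}}{E_s}\,\mathbf{I}\Bigr)^{-1}\mathbf{H}^{H},
\qquad
\mathrm{SINR}_{k} = \frac{1}{\Bigl[\bigl(\tfrac{E_s}{\sigma^{2}}\,\mathbf{H}^{H}\mathbf{H} + \mathbf{I}\bigr)^{-1}\Bigr]_{kk}} - 1 .
```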
Abstract:
This paper considers a Q-ary orthogonal direct-sequence code-division multiple-access (DS-CDMA) system with high-rate space-time linear dispersion codes (LDCs) in time-varying Rayleigh fading multiple-input-multiple-output (MIMO) channels. We propose a joint multiuser detection, LDC decoding, Q-ary demodulation, and channel-decoding algorithm and apply the turbo processing principle to improve system performance in an iterative fashion. The proposed iterative scheme demonstrates faster convergence and superior performance compared with the V-BLAST-based DS-CDMA system and is shown to approach the single-user performance bound. We also show that the CDMA system is able to exploit the time diversity offered by the LDCs in rapid-fading channels.
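The turbo principle invoked here follows a familiar pattern: soft-input soft-output detection and channel decoding exchange extrinsic information over several passes. The Python sketch below shows only that generic loop; the detector and decoder callables, the LLR representation and the iteration count are placeholders, not the paper's joint algorithm.

```python
# Generic turbo-processing loop (illustrative, not the paper's exact
# scheme): a soft detector and a channel decoder exchange extrinsic
# log-likelihood ratios (LLRs). `detector` and `decoder` are assumed
# callables returning numpy LLR arrays.
def turbo_receive(y, H, detector, decoder, n_iter=5):
    prior = 0.0  # no a priori information on the first pass
    for _ in range(n_iter):
        llr_det = detector(y, H, prior)  # joint detection / LDC decoding / demodulation
        ext_det = llr_det - prior        # subtract the prior: keep only extrinsic info
        llr_dec = decoder(ext_det)       # channel decoding refines the soft estimates
        prior = llr_dec - ext_det        # extrinsic feedback to the detector
    return llr_dec
```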
Abstract:
The importance and use of text extraction from camera-based coloured scene images is rapidly increasing. Text within a camera-grabbed image can contain a huge amount of metadata about that scene, which can be useful for identification, indexing and retrieval purposes. While the segmentation and recognition of text from document images is quite successful, detection of coloured scene text remains a challenge for camera-based images. Common problems for text extraction from camera-based images are the lack of prior knowledge of text features such as colour, font, size and orientation, as well as the location of the probable text regions. In this paper, we document the development of a fully automatic and extremely robust text segmentation technique that can be used for any type of camera-grabbed frame, be it a single image or video. A new algorithm is proposed which can overcome the current problems of text segmentation. The algorithm exploits text appearance in terms of colour and spatial distribution. When the new text extraction technique was tested on a variety of camera-based images, it was found to outperform existing techniques. The proposed technique also overcomes problems that can arise from an unconstrained complex background. The novelty of the work arises from the fact that this is the first time colour and spatial information have been used simultaneously for the purpose of text extraction.
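A hedged sketch of how colour and spatial distribution might be combined in such a pipeline: quantise the image into a few colour layers, then keep connected components whose geometry is plausible for characters. The cluster count and the size/aspect thresholds are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative colour + spatial text-candidate detection: colour
# quantisation via k-means, then per-layer connected components
# filtered by size and aspect ratio. Thresholds are assumptions.
import numpy as np
from scipy import ndimage
from sklearn.cluster import KMeans

def candidate_text_regions(img, n_colours=6, min_area=30, max_area=5000):
    h, w, _ = img.shape
    layers = KMeans(n_clusters=n_colours, n_init=4).fit_predict(
        img.reshape(-1, 3).astype(float)).reshape(h, w)
    regions = []
    for c in range(n_colours):
        comp, _ = ndimage.label(layers == c)  # spatial grouping per colour layer
        for s in ndimage.find_objects(comp):
            ch, cw = s[0].stop - s[0].start, s[1].stop - s[1].start
            if min_area <= ch * cw <= max_area and 0.1 <= ch / max(cw, 1) <= 10:
                regions.append(s)  # bounding box plausible for a character
    return regions
```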
Abstract:
The decision of the U.S. Supreme Court in 1991 in Feist Publications, Inc. v. Rural Tel. Service Co. affirmed originality as a constitutional requirement for copyright. Originality has a specific sense and is constituted by a minimal degree of creativity and independent creation. The not original is the more developed concept within the decision. It includes the absence of a minimal degree of creativity as a major constituent. Different levels of absence of creativity are also distinguished, from the extreme absence of creativity to insufficient creativity. There is a gestalt effect of analogy between the delineation of the not original and the concept of computability. More specific correlations can be found within the extreme absence of creativity. "[S]o mechanical" in the decision can be correlated with an automatic mechanical procedure and clauses with a historical resonance with understandings of computability as what would naturally be regarded as computable. The routine within the extreme absence of creativity can be regarded as the product of a computational process. The concern of this article is with rigorously establishing an understanding of the extreme absence of creativity, primarily through the correlations with aspects of computability. The understanding established is consistent with the other elements of the not original. It is also revealed as testable under real-world conditions. The possibilities for understanding insufficient creativity, a minimal degree of creativity, and originality, from the understanding developed of the extreme absence of creativity, are indicated.
Abstract:
Context. Complex molecules such as ethanol and dimethyl ether have been observed in a number of hot molecular cores and hot corinos. Attempts to model the molecular formation process using gas phase only models have so far been unsuccessful. Aims. To demonstrate that grain surface processing is a viable mechanism for complex molecule formation in these environments. Methods. A variable environment parameter computer model has been constructed which includes both gas and surface chemistry. This is used to investigate a variety of cloud collapse scenarios. Results. Comparison between model results and observation shows that, by combining grain surface processing with gas phase chemistry, complex molecules can be produced in observed abundances in a number of core and corino scenarios. Differences in abundances are due to the initial atomic and molecular composition of the core/corino and varying collapse timescales. Conclusions. Grain surface processing, combined with variation of physical conditions, can be regarded as a viable method for the formation of complex molecules in the environments found in the vicinity of a hot core/corino, producing abundances comparable to those observed.
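As a toy illustration of what a combined gas/surface chemistry model integrates, the sketch below evolves a single freeze-out step and a single grain-surface reaction as coupled rate equations. The species, rate coefficients and reaction are assumptions chosen for illustration, and are vastly simpler than a real astrochemical network.

```python
# Toy gas-grain rate-equation model (illustrative only): gas-phase CO
# accretes onto grains, where it is converted towards methanol. The
# two rate coefficients are placeholder assumptions.
from scipy.integrate import solve_ivp

K_ACC = 1e-17   # accretion rate coefficient (cm^3 s^-1), assumed
K_SURF = 1e-12  # effective surface conversion rate (s^-1), assumed

def rhs(t, y, n_h):
    co_gas, co_ice, ch3oh_ice = y
    acc = K_ACC * n_h * co_gas  # freeze-out of gas-phase CO onto grains
    surf = K_SURF * co_ice      # surface processing: CO(ice) -> CH3OH(ice)
    return [-acc, acc - surf, surf]

# Abundances relative to H, integrated over ~3 x 10^5 yr at n_H = 1e5 cm^-3.
sol = solve_ivp(rhs, (0.0, 1e13), [1e-4, 0.0, 0.0], args=(1e5,))
print(sol.y[:, -1])  # final CO(gas), CO(ice), CH3OH(ice) abundances
```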
Abstract:
Within the ever-changing arenas of architectural design and education, the core element of architectural education remains constant: the design process. The consideration of how to design, in addition to what to design, presents architectural educators with a most constant and demanding challenge: how do we best teach the design process?
This challenge is arguably most acute in the early stages of a student's architectural education. In their first years in architecture, students will commonly concentrate on the end product rather than the process. This is, in many ways, understandable. A great deal of time, money and effort goes into their final presentations. They believe that it is what is on the wall that is going to be assessed. Armed with new computer skills, they want to produce eye-catching graphics that are often no more than a celebration of a CAD package. In an era of increasing speed, immediacy of information and powerful advertising, it is unsurprising that students want to race quickly to presenting an end product.
Recognising that trend, new teaching methods and models were introduced into the second-year undergraduate studio over the past two years at Queen's University Belfast, aimed at promoting student self-reflection and making the design process more relevant to the students. This paper first generates a critical discussion of the difficulties associated with the design process before outlining some of the methods employed to promote the following: an understanding of concept; personalisation of the design process for the individual student; adding realism and value to the design process; and, finally, getting the students to play to their strengths in illustrating their design process as an element of the product. Frameworks, examples, outcomes and student feedback are all presented to help illustrate the effectiveness of the new strategies employed in making the design process firstly more relevant, and therefore secondly of greater value, to the architecture student.
Abstract:
Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base, and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool.
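A minimal sketch of the kind of query such a tool might answer, assuming (hypothetically) that the elicited knowledge is stored as per-feature probability intervals combined under a naive independence assumption; the actual system's representation and inference machinery will differ.

```python
# Hypothetical imprecise-probability query: each substructural feature
# carries an expert-elicited interval for the probability that a
# compound bearing it is a substrate. Intervals are multiplied under
# an (assumed) naive independence model.
def predict_substrate(compound_features, knowledge_base):
    lo, hi = 1.0, 1.0
    for feat in compound_features:
        p_lo, p_hi = knowledge_base.get(feat, (0.5, 0.5))  # uninformative default
        lo *= p_lo
        hi *= p_hi
    return lo, hi  # interval bounding the probability of being a substrate

# Illustrative feature names and intervals, not from any real dataset.
kb = {"hydroxyl_at_C3": (0.7, 0.9), "bulky_at_C5": (0.2, 0.4)}
print(predict_substrate(["hydroxyl_at_C3", "bulky_at_C5"], kb))
```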
Abstract:
The role of rhodopsin as a structural prototype for the study of the whole superfamily of G protein-coupled receptors (GPCRs) is reviewed in a historical perspective. Discovered at the end of the nineteenth century, fully sequenced since the early 1980s, and with direct three-dimensional information available since the 1990s, rhodopsin has served as a platform to gather indirect information on the structure of the other superfamily members. Recent breakthroughs have yielded the structures of additional receptors, namely the beta 1- and beta 2-adrenergic receptors and the A(2A) adenosine receptor, now providing an opportunity to gauge the accuracy of homology modeling and molecular docking techniques and to perfect the computational protocol. Notably, in coordination with the solution of the structure of the A(2A) adenosine receptor, the first "critical assessment of GPCR structural modeling and docking" has been organized, the results of which highlighted that the construction of accurate models, although challenging, is certainly achievable. The docking of the ligands and the scoring of the poses clearly emerged as the most difficult components. A further goal in the field is certainly to derive the structure of receptors in their signaling state, possibly in complex with agonists. These advances, coupled with the introduction of more sophisticated modeling algorithms and the increase in computer power, raise the expectation of a substantial boost in the robustness and accuracy of computer-aided drug discovery techniques in the coming years.
Abstract:
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques, not requiring knowledge of the receptor structure, were historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however, they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent of training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand- and structure-based methodologies, in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models, results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
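One simple way to realise a ligand- and structure-based consensus model is sketched below: bring each method's scores onto a common scale, then rank compounds by their mean. The z-score normalisation, the equal weighting and the sign conventions are illustrative choices, not a prescribed published protocol.

```python
# Hedged consensus-scoring sketch: z-normalise a ligand-based (3D-QSAR)
# predicted activity and a structure-based (docking) score, then rank
# by their mean. Weighting and sign conventions are assumptions.
import numpy as np

def consensus_rank(qsar_scores, docking_scores):
    z = lambda s: (s - s.mean()) / s.std()
    qsar = z(np.asarray(qsar_scores, dtype=float))     # higher = more active
    dock = z(np.asarray(docking_scores, dtype=float))  # more negative = better
    consensus = (qsar - dock) / 2.0                    # flip the docking sign
    return np.argsort(consensus)[::-1]                 # best compounds first

# Toy values: predicted pKi-like activities and docking energies.
print(consensus_rank([6.2, 7.9, 5.1], [-8.4, -9.6, -7.0]))
```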
Abstract:
Listeners experience electroacoustic music as full of significance and meaning, and they experience spatiality as one of the factors contributing to its meaningfulness. If we want to understand spatiality in electroacoustic music, we must understand how the listener’s mental processes give rise to the experience of meaning. In electroacoustic music, as in everyday life, these mental processes unite the peripheral auditory system with human spatial cognition. In the discussion that follows we consider a range of the listener’s mental processes relating space and meaning, from the perceptual attributes of spatial imagery to the spatial reference frames for places and navigation. When considering multichannel loudspeaker systems in particular, an important part of the discussion is focused on the distinctive and idiomatic ways in which this particular mode of sound production contributes to and situates meaning. These idiosyncrasies include the phenomenon of image dispersion, the important consequences of the precedence effect and the influence of source characteristics on spatial imagery. These are discussed in close relation to the practicalities of artistic practice and to the potential for artistic meaning experienced by the listener.
Abstract:
The key question posed here is how listeners experience meaning when listening to electroacoustic music, especially how they experience it as art. This question is addressed by connecting electroacoustic listening with the ways that the mind constructs meaning in everyday life. Initially, the topic of the everyday mind provides a framework for discussing cognitive schemas, mental spaces, the Event schema and auditory gist. Then, specific idioms of electroacoustic music are examined that give rise to artistic meaning. These include the creative binding of circumstances with events and the conceptual blending that creates metaphorical meaning. Finally, the listener's experience of long-term events is discussed in relation to the location event-structure metaphor.