40 results for Graph-based methods
Abstract:
The group of haemosporidian parasites is of general interest to basic and applied science, since several species infect mammals, causing malaria and its associated disease symptoms. Although the great majority of haemosporidian parasites occur in bird hosts, as in the case of Leucocytozoon buteonis, there is little genomic information about the genetics of their co-evolution with hosts. Consequently, there is a pressing need for parasite-enrichment strategies that enable further genomic analyses, in particular without exposure to DNA-intercalating dyes. Here, we used flow cytometry, without an additional labelling step, to enrich L. buteonis from infected buzzard blood. A specific, defined area of the two-dimensional scattergrams was sorted and the resulting fraction was analysed further. The successful enrichment of L. buteonis in the sorted fraction was demonstrated by Giemsa staining and by qPCR, which revealed a clear increase in parasite-specific genes while host-specific genes decreased significantly. This is the first report describing a labelling-free approach to enriching L. buteonis from infected buzzard blood. The enrichment presented here avoids nucleic acid-intercalating dyes, which may interfere with fluorescence-based methods or subsequent sequencing approaches.
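The qPCR comparison of parasite- and host-specific genes is typically quantified as a fold change; a minimal sketch using the standard 2^-ΔΔCt calculation (the Ct values below are invented for illustration, not data from the study):

```python
# Fold-change estimate of parasite enrichment from qPCR Ct values,
# using the standard 2^-ddCt method. All Ct values here are invented
# illustrations, not measurements from the study.
def fold_change(ct_target_sorted, ct_ref_sorted, ct_target_pre, ct_ref_pre):
    # ddCt: target gene relative to a reference gene, sorted vs. pre-sort
    ddct = (ct_target_sorted - ct_ref_sorted) - (ct_target_pre - ct_ref_pre)
    return 2.0 ** (-ddct)

# Parasite-specific gene amplifies earlier (lower Ct) after sorting:
enrichment = fold_change(22.0, 18.0, 27.0, 18.0)
print(enrichment)  # 32-fold increase in parasite-specific signal
```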
Abstract:
This paper describes a study of the use of immersive virtual reality technologies in the design of a new hospital. It uses Schön's concept of reflective practice and video-based methods to analyse the ways design teams approach and employ a full-scale 3D immersive environment (a CAVE) in collaborative design work. The analysis describes four themes relating to reflective practice in this setting: orienting to the CAVE technology itself; orienting to the representation of the specific design within the CAVE; activities accounting for, or exploring alternatives within, the design for the use and users of the space; and more strategic interactions around how best to represent the design and model to the client within the CAVE setting. The analysis also reveals some unique aspects of design work in this environment. Perhaps most significantly, rather than enhancing or adding to an existing understanding of the design built up through paper-based or non-immersive digital representations, the CAVE often acts to challenge or surprise the participants as they experience the immersive, full-scale version of their own design.
Abstract:
Using the novel technique of topic modelling, this paper examines thematic patterns, and their changes over time, in a large corpus of corporate social responsibility (CSR) reports produced in the oil sector. Whereas previous research on corporate communications has been small-scale or focused on selected lexical aspects and thematic categories identified ex ante, our approach allows thematic patterns to emerge from the data. The analysis reveals a number of major trends and topic shifts pointing to changing practices of CSR: nowadays 'people', 'communities' and 'rights' seem to be given more prominence, whereas 'environmental protection' appears to be less relevant. Using more established corpus-based methods, we subsequently explore two top phrases, 'human rights' and 'climate change', that were identified as representative of the shifting thematic patterns. Our approach strikes a balance between purely quantitative and purely qualitative methodologies and offers applied linguists new ways of exploring discourse in large collections of texts.
Abstract:
For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of the background field removed is explored for the Northern Hemisphere winter storm tracks, for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods of removing a background field are explored, ranging from simple subtraction of the climatology to more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos filter and a 20-day high-pass Fourier filter. The analysis indicates that simple subtraction of the climatology tends to change the nature of the systems, to the extent that the systems are redistributed relative to the climatological background, resulting in very similar statistical distributions for positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. Of the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in storm tracks with a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
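The 20-day high-pass Fourier filter can be sketched for a single daily time series by zeroing Fourier components with periods of 20 days or longer (illustrative only; the study applies such filters to gridded model fields):

```python
# Sketch of a 20-day high-pass Fourier filter on a daily time series.
# Components slower than the cutoff (including the mean) are zeroed,
# retaining the fast, synoptic-scale part of the signal.
import numpy as np

def highpass_fourier(x, dt_days=1.0, cutoff_days=20.0):
    n = len(x)
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n, d=dt_days)   # cycles per day
    X[freqs < 1.0 / cutoff_days] = 0.0      # remove slow background
    return np.fft.irfft(X, n=n)

t = np.arange(360.0)
slow = np.sin(2 * np.pi * t / 90)           # 90-day background wave
fast = 0.5 * np.sin(2 * np.pi * t / 5)      # 5-day synoptic wave
filtered = highpass_fourier(slow + fast)
# The slow component is removed; the 5-day wave survives.
```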
Abstract:
Modern methods of spawning new technological motifs are not appropriate when it is desired to realize artificial life as an actual real-world entity unto itself (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent from common methods, which generally lack methodologies of construction. In this paper we mix classical and modern studies in an attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.
Abstract:
The proportional odds model provides a powerful tool for analysing ordered categorical data and setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models which includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds where the resulting statistics define the Mann-Whitney test. A strategy is described involving selecting one of these models in advance, requiring assumptions as strong as those underlying proportional odds, but allowing a choice of such models. The accuracy of the new procedure and its power are evaluated.
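Since the efficient score under proportional odds reduces to the Mann-Whitney statistic, the resulting test can be illustrated on ordered categorical data (the response codes below are invented, not data from the paper):

```python
# Mann-Whitney test on ordinal outcomes, the test that the abstract
# notes arises from the efficient score under proportional odds.
# Ties (shared category codes) are handled by the rank-based statistic.
from scipy.stats import mannwhitneyu

treatment = [2, 3, 3, 1, 2, 3, 2, 3]   # ordinal response codes, 0-3
control   = [0, 1, 1, 2, 0, 1, 2, 1]

stat, p = mannwhitneyu(treatment, control, alternative="two-sided")
print(stat, p)
```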
Abstract:
This paper formally derives a blocked version, with blocks of size two, of a new path-based neural branch prediction algorithm (FPP), yielding a lower-cost hardware solution while maintaining input-output characteristics similar to those of the original algorithm. The blocked solution, referred to here as the B2P algorithm, is obtained using graph theory and retiming methods. Verification showed that the prediction performance of the FPP and B2P algorithms differs by less than one misprediction per thousand instructions on a known framework for branch prediction evaluation. For a chosen FPGA device, circuits generated from the B2P algorithm showed average area savings of over 25% compared with circuits for the FPP algorithm, with similar timing performance, making the proposed blocked predictor superior from a practical viewpoint.
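The FPP and B2P algorithms themselves are not specified in the abstract; as background, a minimal sketch of a perceptron predictor in the spirit of neural branch prediction (table size, history length, and threshold are invented):

```python
# Minimal perceptron branch predictor sketch: each branch PC indexes a
# weight vector; the dot product with recent outcome history gives the
# prediction, and weights are trained only on mispredictions or
# low-confidence predictions.
class PerceptronPredictor:
    def __init__(self, n_entries=64, history_len=8, theta=16):
        self.history_len = history_len
        self.theta = theta                        # training threshold
        self.weights = [[0] * (history_len + 1) for _ in range(n_entries)]
        self.history = [1] * history_len          # +1 taken, -1 not taken

    def predict(self, pc):
        w = self.weights[pc % len(self.weights)]
        y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], self.history))
        return y, y >= 0                          # (confidence, taken?)

    def update(self, pc, taken):
        y, pred = self.predict(pc)
        t = 1 if taken else -1
        w = self.weights[pc % len(self.weights)]
        if pred != taken or abs(y) <= self.theta:
            w[0] += t
            for i in range(self.history_len):
                w[i + 1] += t * self.history[i]
        self.history = [t] + self.history[:-1]    # shift in new outcome
```

After a short warm-up, such a predictor learns simple correlated patterns (e.g. strict alternation) almost perfectly.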
Abstract:
This article describes a number of velocity-based moving mesh numerical methods for multidimensional nonlinear time-dependent partial differential equations (PDEs). It consists of a short historical review followed by a detailed description of a recently developed multidimensional moving mesh finite element method based on conservation. Finite element algorithms are derived for both mass-conserving and non-mass-conserving problems, and results are shown for a number of multidimensional nonlinear test problems, including the second-order porous medium equation and the fourth-order thin film equation, as well as a two-phase problem. Further applications and extensions are referenced.
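A minimal sketch of the conservation idea behind such methods in one dimension: mesh nodes are placed so that each cell holds a fixed fraction of the total mass of the solution (the density profile below is invented; the paper's finite element formulation is far more general):

```python
# Place 1-D mesh nodes so each cell carries an equal share of the mass
# of u, the conservation principle that drives the mesh movement.
import numpy as np

def conserve_mass_mesh(u, x, n_nodes):
    # Cumulative mass via the trapezoid rule on a fine reference grid.
    mass = np.concatenate(
        [[0.0], np.cumsum(0.5 * (u[1:] + u[:-1]) * np.diff(x))]
    )
    targets = np.linspace(0.0, mass[-1], n_nodes)  # equal mass fractions
    return np.interp(targets, mass, x)             # invert mass(x)

x = np.linspace(-1.0, 1.0, 201)
u = np.maximum(1 - x**2, 0.0)       # parabolic profile (cf. porous medium)
mesh = conserve_mass_mesh(u, x, 11)
# Nodes cluster where u is large, equidistributing mass per cell.
```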
Abstract:
Crystallization must occur in honey in order to produce set or creamed honey; however, the process must proceed in a controlled manner to obtain an acceptable product. Consequently, reliable methods are needed to measure the crystal content of honey (φ, expressed as kg crystal per kg honey) that can also be implemented with relative ease in industrial production facilities. Unfortunately, suitable methods do not currently exist. This article reports the development of two independent offline methods to measure the crystal content of honey, based on differential scanning calorimetry and high-performance liquid chromatography. The two methods gave highly consistent results in a paired t-test involving 143 experimental points (P > 0.05, r^2 = 0.99). The crystal content also correlated with the relative viscosity, defined as the ratio of the viscosity of crystal-containing honey to that of the same honey with all crystals dissolved, giving the correlation μ_r = 1 + 1398.8 φ^2.318. This correlation can be used to estimate the crystal content of honey in industrial production facilities. The crystal growth rate at 14 °C, the crystallization temperature normally used in practice, was linear, and the growth rate increased with the total glucose content of the honey.
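The reported correlation μ_r = 1 + 1398.8 φ^2.318 can be inverted to estimate crystal content from a measured relative viscosity; a minimal sketch (the example values are invented):

```python
# Crystal content estimation from relative viscosity via the reported
# power-law correlation mu_r = 1 + 1398.8 * phi**2.318.

def relative_viscosity(phi):
    """Relative viscosity of honey with crystal fraction phi (kg/kg)."""
    return 1.0 + 1398.8 * phi ** 2.318

def crystal_content(mu_r):
    """Closed-form inversion of the correlation for phi."""
    return ((mu_r - 1.0) / 1398.8) ** (1.0 / 2.318)

# Round trip with an invented crystal fraction of 0.15 kg/kg:
phi = 0.15
mu_r = relative_viscosity(phi)
print(mu_r, crystal_content(mu_r))
```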