101 results for Computational Aeroacoustics (CAA)
Abstract:
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government data sets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing upon international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the delivery of original investigative journalism, and to attract and retain readers online.
Abstract:
This chapter focuses on the interplay and respective roles of delays and intrinsic noise effects within cellular pathways and regulatory networks. We address these aspects by focusing on genetic regulatory networks that share a common network motif, namely the negative feedback loop, leading to oscillatory gene expression and protein levels. In this context, we discuss computational simulation algorithms for addressing the interplay of delays and noise within signaling pathways, based on biological data, and we address implementation issues associated with efficiency and robustness. In a molecular biology setting we present two case studies of temporal models for the Hes1 gene (Monk, 2003; Hirata et al., 2002), known to act as a molecular clock, and the Her1/Her7 regulatory system controlling the periodic somite segmentation in vertebrate embryos (Giudicelli and Lewis, 2004; Horikawa et al., 2006).
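As a rough illustration of the kind of delayed stochastic simulation discussed in this chapter, the sketch below runs a minimal delay SSA for a hypothetical single-gene negative-feedback loop (Hes1-like): transcription is repressed by the protein through a Hill function, and the resulting mRNA appears only after a fixed delay, tracked in a queue of pending completions. All rate constants, the delay value and the model structure are illustrative assumptions, not parameters taken from the cited studies.

```python
import heapq
import random

# Illustrative parameters for a Hes1-like negative-feedback loop (assumptions).
K_M, K_P = 1.0, 2.0       # max transcription rate, translation rate
D_M, D_P = 0.03, 0.03     # mRNA and protein degradation rates
P0, HILL = 100.0, 4       # repression threshold and Hill coefficient
TAU_M = 20.0              # transcriptional delay

def propensities(m, p):
    """Reaction propensities for the current state (m mRNA, p protein)."""
    repression = 1.0 / (1.0 + (p / P0) ** HILL)
    return [
        K_M * repression,   # 0: delayed transcription -> M (appears after TAU_M)
        K_P * m,            # 1: translation           -> P
        D_M * m,            # 2: mRNA degradation      M -> 0
        D_P * p,            # 3: protein degradation   P -> 0
    ]

def delay_ssa(t_end=1000.0, seed=1):
    random.seed(seed)
    t, m, p = 0.0, 0, 0
    pending = []                      # min-heap of delayed transcription completion times
    trajectory = [(t, m, p)]
    while t < t_end:
        a = propensities(m, p)
        a0 = sum(a)
        tau = random.expovariate(a0) if a0 > 0 else float("inf")
        if pending and pending[0] <= t + tau:
            # A previously initiated (delayed) transcription completes first.
            t = heapq.heappop(pending)
            m += 1
        elif a0 == 0:
            break                     # nothing left to simulate
        else:
            t += tau
            r = random.uniform(0, a0)
            if r < a[0]:
                heapq.heappush(pending, t + TAU_M)   # schedule delayed mRNA appearance
            elif r < a[0] + a[1]:
                p += 1
            elif r < a[0] + a[1] + a[2]:
                m -= 1
            else:
                p -= 1
        trajectory.append((t, m, p))
    return trajectory

if __name__ == "__main__":
    traj = delay_ssa()
    print("final state (t, mRNA, protein):", traj[-1])
```

Whether the protein trajectory shows sustained noisy oscillations depends on the chosen delay, Hill coefficient and degradation rates; the point of the sketch is only to show where the delay queue enters the stochastic simulation loop.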
Abstract:
Computational Intelligence Systems (CIS) are a class of advanced software. CIS occupy an important position in solving single-objective, inverse and multi-objective design problems in engineering. This paper hybridises a CIS for optimisation with the concept of Nash equilibrium as an optimisation pre-conditioner to accelerate the optimisation process. The hybridised CIS (Hybrid Intelligence System), coupled to a Finite Element Analysis (FEA) tool and a Computer Aided Design (CAD) system, GiD, is applied to solve an inverse engineering design problem: the reconstruction of High Lift Systems (HLS). Numerical results obtained by the hybridised CIS are compared to the results obtained by the original CIS. The benefits of using the concept of Nash equilibrium are clearly demonstrated in terms of solution accuracy and optimisation efficiency.
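To make the Nash-equilibrium pre-conditioning idea concrete, the sketch below splits a design vector between two "players" who alternately optimise only their own variables against a shared objective until neither can improve, i.e. an approximate Nash equilibrium that could then seed a global optimiser. The toy analytic objective and the crude stochastic hill-climber are stand-ins (assumptions) for the FEA-based fitness and the CIS optimiser used in the paper; the coupling to FEA and the GiD CAD system is not reproduced here.

```python
import random

def objective(x):
    """Hypothetical design objective (to minimise); stands in for an FEA-based fitness."""
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + 0.5 * x[0] * x[1]

def local_search(x, indices, steps=200, sigma=0.1, seed=0):
    """Crude stochastic hill-climber over only the variables listed in `indices`."""
    rng = random.Random(seed)
    best, best_f = list(x), objective(x)
    for _ in range(steps):
        cand = list(best)
        for i in indices:
            cand[i] += rng.gauss(0.0, sigma)
        f = objective(cand)
        if f < best_f:
            best, best_f = cand, f
    return best

def nash_preconditioner(rounds=20):
    """Alternate best responses: each player optimises only its own variables."""
    x = [5.0, 5.0]                                   # shared design vector
    for r in range(rounds):
        x = local_search(x, indices=[0], seed=r)         # player 1 owns x[0]
        x = local_search(x, indices=[1], seed=100 + r)   # player 2 owns x[1]
    return x, objective(x)

if __name__ == "__main__":
    x_eq, f_eq = nash_preconditioner()
    print("approximate Nash design:", [round(v, 3) for v in x_eq], "objective:", round(f_eq, 4))
```

In the paper's setting each player's evaluation would invoke the FEA tool on the HLS geometry rebuilt in the CAD system; here a closed-form function keeps the sketch self-contained.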
An experimental and computational investigation of performance of Green Gully for reusing stormwater
Abstract:
A new stormwater quality improvement device (SQID) called 'Green Gully' has been designed and developed in this study with the aim of reusing stormwater for irrigating plants and trees. The main purpose of the Green Gully is to collect road runoff/stormwater, make it suitable for irrigation and provide an automated network system for watering roadside plants and irrigation areas. This paper presents the design and development of the Green Gully along with experimental and computational investigations of its performance. Performance (in the form of efficiency, i.e. the percentage of water flow through the gully grate) was first determined experimentally using a gully model in the laboratory; a three-dimensional numerical model was then developed and simulated to predict the efficiency of the Green Gully as a function of flow rate. The Computational Fluid Dynamics (CFD) code FLUENT was used for the simulation, and GAMBIT was used for geometry creation and mesh generation. Experimental and simulation results are discussed and compared in this paper, and the predicted efficiency was compared with the laboratory-measured efficiency. It was found that the simulated results are in good agreement with the experimental results.
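For reference, the capture efficiency reported above can be written compactly as the fraction of the approaching flow that passes through the gully grate; the symbols Q_grate and Q_total below are generic notation introduced here, not taken from the paper.

```latex
% Capture efficiency: percentage of the approach flow Q_total
% that passes through the gully grate (Q_grate).
\eta = \frac{Q_{\mathrm{grate}}}{Q_{\mathrm{total}}} \times 100\%
```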
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains amongst others. Currently, the burden of validating both the interactive functionality and visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted in assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
Abstract:
We report on analysis of discussions in an online community of people with chronic illness using socio-cognitively motivated, automatically produced semantic spaces. The analysis aims to further the emerging theory of "transition" (how people can learn to incorporate the consequences of illness into their lives). An automatically derived representation of sense of self for individuals is created in the semantic space by the analysis of the email utterances of the community members. The movement over time of the sense of self is visualised, via projection, with respect to axes of "ordinariness" and "extra-ordinariness". Qualitative evaluation shows that the visualisation is paralleled by the transitions of people during the course of their illness. The research aims to progress tools for analysis of textual data to promote greater use of tacit knowledge as found in online virtual communities. We hope it also encourages further interest in representation of sense-of-self.
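One plausible reading of the projection step described above is sketched below: each person's aggregated utterance vector in the semantic space is projected onto an axis running from an "ordinary" anchor vector to an "extra-ordinary" one, giving a scalar coordinate per time slice whose drift can be plotted. The anchor vectors, the simple linear projection and the random data are illustrative assumptions rather than the authors' exact construction.

```python
import numpy as np

def axis_projection(doc_vectors, ordinary_vec, extraordinary_vec):
    """Project each time-slice vector onto the ordinary -> extra-ordinary axis;
    returns one scalar coordinate per slice."""
    axis = extraordinary_vec - ordinary_vec
    axis = axis / np.linalg.norm(axis)
    return (doc_vectors - ordinary_vec) @ axis

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 50
    ordinary = rng.normal(size=dim)        # hypothetical anchor vectors
    extraordinary = rng.normal(size=dim)
    # Hypothetical time-ordered "sense of self" vectors for one community member.
    slices = rng.normal(size=(6, dim))
    coords = axis_projection(slices, ordinary, extraordinary)
    print("trajectory along the ordinary -> extra-ordinary axis:", np.round(coords, 2))
```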
Abstract:
Cyclic nitroxide radicals represent promising alternatives to the iodine-based redox mediator commonly used in dye-sensitized solar cells (DSSCs). To date, DSSCs with nitroxide-based redox mediators have achieved energy conversion efficiencies of just over 5 %, but efficiencies of over 15 % might be achievable, given an appropriate mediator. The efficacy of the mediator depends upon two main factors: it must reversibly undergo one-electron oxidation and it must possess an oxidation potential in the range of 0.600-0.850 V (vs. the standard hydrogen electrode (SHE) in acetonitrile at 25 °C). Herein, we have examined the effect that structural modifications have on the value of the oxidation potential of cyclic nitroxides, as well as on the reversibility of the oxidation process. These included alterations to the N-containing skeleton (pyrrolidine, piperidine, isoindoline, azaphenalene, etc.), as well as the introduction of different substituents (alkyl-, methoxy-, amino-, carboxy-, etc.) to the ring. Standard oxidation potentials were calculated using high-level ab initio methodology that was demonstrated to be very accurate (with a mean absolute deviation from experimental values of only 16 mV). An optimal value of 1.45 for the electrostatic scaling factor for UAKS radii in acetonitrile solution was obtained. Established trends in the values of oxidation potentials were used to guide the molecular design of stable nitroxides with the desired E°ox, and a number of compounds were suggested for potential use as enhanced redox mediators in DSSCs. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
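As a hedged reminder of the usual route from computed free energies to a standard oxidation potential (a generic textbook convention, not a detail stated in this abstract): for a one-electron oxidation R -> R+ + e-, the potential against the reference electrode follows from the solution-phase free-energy change and the absolute potential of the SHE in the same solvent.

```latex
% One-electron oxidation R -> R+ + e-. Delta G_ox is the solution-phase
% free-energy change of oxidation, F the Faraday constant, and
% E_abs(SHE) the absolute potential of the reference electrode.
E^{\circ}_{\mathrm{ox}} = \frac{\Delta G^{\circ}_{\mathrm{ox}}}{F} - E_{\mathrm{abs}}(\mathrm{SHE})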
Abstract:
Purpose: The management of unruptured aneurysms remains controversial, as treatment carries potentially significant risk to a currently well patient. The decision to treat is based upon aneurysm location, size and abnormal morphology (e.g. bleb formation). A method to predict bleb formation would thus help stratify patient treatment. Our study aims to investigate possible associations between intra-aneurysmal flow dynamics and bleb formation within intracranial aneurysms. Competing theories on aetiology appear in the literature, and our purpose is to further clarify this issue. Methodology: We recruited data from 3D rotational angiograms (3DRA) of 30 patients with cerebral aneurysms and bleb formation. Models representing the aneurysms before bleb formation were reconstructed by digitally removing the bleb, and computational fluid dynamics simulations were then run on both the pre- and post-bleb models. Pulsatile flow conditions and standard boundary conditions were imposed. Results: Aneurysmal flow structure, impingement regions, and wall shear stress magnitudes and gradients were produced for all models, and correlation of these parameters with bleb formation was sought. Certain CFD parameters show significant inter-patient variability, making statistically significant correlation difficult on the partial data subset obtained so far. Conclusion: CFD models are readily producible from 3DRA data. Preliminary results indicate that bleb formation appears to be related to regions of high wall shear stress and to regions of direct impingement on the aneurysm wall.
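For readers unfamiliar with the haemodynamic quantities mentioned in the Results, wall shear stress and its spatial gradient are typically evaluated from the CFD velocity field as below; the notation is standard rather than specific to this study.

```latex
% Newtonian wall shear stress: dynamic viscosity mu times the wall-normal
% gradient of the tangential velocity, evaluated at the vessel wall;
% WSSG is the magnitude of its spatial gradient over the wall surface.
\tau_w = \mu \left. \frac{\partial u_t}{\partial n} \right|_{\mathrm{wall}},
\qquad
\mathrm{WSSG} = \left\lVert \nabla \tau_w \right\rVert
```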
Abstract:
The feasibility of using an in-hardware implementation of a genetic algorithm (GA) to solve the computationally expensive travelling salesman problem (TSP) is explored, especially in regard to hardware resource requirements for problem and population sizes. We investigate via numerical experiments whether a small population size might prove sufficient to obtain reasonable quality solutions for the TSP, thereby permitting relatively resource efficient hardware implementation on field programmable gate arrays (FPGAs). Software experiments on two TSP benchmarks involving 48 and 532 cities were used to explore the extent to which population size can be reduced without compromising solution quality, and results show that a GA allowed to run for a large number of generations with a smaller population size can yield solutions of comparable quality to those obtained using a larger population. This finding is then used to investigate feasible problem sizes on a targeted Virtex-7 vx485T-2 FPGA platform via exploration of hardware resource requirements for memory and data flow operations.
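A minimal software counterpart of the GA studied here is sketched below: tournament selection, order crossover, swap mutation and a deliberately small population run for many generations. The random 30-city instance and all parameter values are illustrative assumptions, not the 48- and 532-city benchmarks or the FPGA implementation from the paper.

```python
import math
import random

def tour_length(tour, cities):
    """Total closed-tour distance for a permutation of city indices."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """OX: copy a slice from parent 1, fill the remaining cities in parent 2's order."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child[a:b]]
    j = 0
    for i in range(n):
        if child[i] is None:
            child[i] = fill[j]
            j += 1
    return child

def mutate(tour, rng, rate=0.2):
    """Swap two cities with a small probability."""
    if rng.random() < rate:
        i, j = rng.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]

def ga_tsp(cities, pop_size=16, generations=2000, seed=0):
    """Small-population GA with elitism and size-3 tournament selection."""
    rng = random.Random(seed)
    n = len(cities)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=lambda t: tour_length(t, cities))
    for _ in range(generations):
        new_pop = [best[:]]                         # keep the best tour found so far
        while len(new_pop) < pop_size:
            p1 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, cities))
            p2 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, cities))
            child = order_crossover(p1, p2, rng)
            mutate(child, rng)
            new_pop.append(child)
        pop = new_pop
        best = min(pop, key=lambda t: tour_length(t, cities))
    return best, tour_length(best, cities)

if __name__ == "__main__":
    rng = random.Random(42)
    cities = [(rng.random(), rng.random()) for _ in range(30)]   # hypothetical instance
    tour, length = ga_tsp(cities)
    print("best tour length:", round(length, 3))
```

Reducing pop_size while increasing generations mirrors the trade-off the paper explores, where a smaller population reduces the memory footprint of a hardware implementation.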
Abstract:
CTAC2012 was the 16th biennial Computational Techniques and Applications Conference, and took place at Queensland University of Technology from 23 to 26 September 2012. The ANZIAM Special Interest Group in Computational Techniques and Applications is responsible for the CTAC meetings, the first of which was held in 1981.
Abstract:
Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene-mapping approach based on sophisticated mixed linear models that is applicable to any population structure. LDLA can use population history information, in addition to pedigree and molecular markers, to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, hence allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.
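The mixed linear model underlying LDLA-style mapping can be written, in generic variance-component notation (a sketch of the usual form, not necessarily the exact model used in the paper), as:

```latex
% y: phenotypes; beta: fixed effects; u: polygenic effects with pedigree
% relationship matrix A; q: QTL effects whose covariance G_q is built from
% identity-by-descent probabilities combining linkage (pedigree) and
% linkage-disequilibrium (population history) information; e: residuals.
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{u} + \mathbf{W}\mathbf{q} + \mathbf{e},
\qquad
\mathbf{u} \sim \mathcal{N}(\mathbf{0}, \mathbf{A}\sigma_u^2),\;
\mathbf{q} \sim \mathcal{N}(\mathbf{0}, \mathbf{G}_q\sigma_q^2),\;
\mathbf{e} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}\sigma_e^2)
```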
Abstract:
Triangle-shaped nanohole, nanodot, and lattice antidot structures in hexagonal boron nitride (h-BN) monolayer sheets are characterized with density functional theory calculations utilizing the local spin density approximation. We find that such structures may exhibit very large magnetic moments and associated spin splitting. N-terminated nanodots and antidots show strong spin anisotropy around the Fermi level, that is, half-metallicity. While B-terminated nanodots are shown to lack magnetism due to edge reconstruction, B-terminated nanoholes can retain magnetic character due to the enhanced structural stability of the surrounding two-dimensional matrix. In spite of significant lattice contraction due to the presence of multiple holes, antidot superlattices are predicted to be stable, exhibiting amplified magnetism as well as greatly enhanced half-metallicity. Collectively, the results indicate new opportunities for designing h-BN-based nanoscale devices with potential applications in the areas of spintronics, light emission, and photocatalysis.
Abstract:
Selective separation of nitrogen (N2) from methane (CH4) is highly significant in natural gas purification, yet it is very challenging to achieve because of their nearly identical sizes (the molecular diameters of N2 and CH4 are 3.64 Å and 3.80 Å, respectively). Here we theoretically study the adsorption of N2 and CH4 on the B12 cluster and on the solid boron surfaces a-B12 and c-B28. Our results show that these electron-deficient boron materials have higher selectivity in adsorbing and capturing N2 than CH4, which provides very useful information for experimentally exploiting boron materials for natural gas purification.
Abstract:
First-principles computational studies indicate that (B, N, or O)-doped graphene ribbon edges can substantially reduce the energy barrier for H2 dissociative adsorption. The low barrier is competitive with many widely used metal or metal oxide catalysts. This suggests that suitably functionalized graphene architectures are promising metal-free alternatives for low-cost catalytic processes.