630 results for Computational studies
Abstract:
Proton-bound dimers consisting of two glycerophospholipids with different headgroups were prepared using negative ion electrospray ionization and dissociated in a triple quadrupole mass spectrometer. Analysis of the tandem mass spectra of the dimers using the kinetic method provides, for the first time, an order of acidity for the phospholipid classes in the gas phase of PE < PA << PG < PS < PI. Hybrid density functional calculations on model phospholipids were used to predict the absolute deprotonation enthalpies of the phospholipid classes from isodesmic proton transfer reactions with phosphoric acid. The computational data largely support the experimental acidity trend, with the exception of the relative acidity ranking of the two most acidic phospholipid species. Possible causes of the discrepancy between experiment and theory are discussed and the experimental trend is recommended. The sequence of gas phase acidities for the phospholipid headgroups is found to (1) have little correlation with the relative ionization efficiencies of the phospholipid classes observed in the negative ion electrospray process, and (2) correlate well with fragmentation trends observed upon collisional activation of phospholipid \[M - H](-) anions. (c) 2005 American Society for Mass Spectrometry.
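The kinetic method used above infers relative gas-phase acidities from the fragment-ion abundances observed when a proton-bound dimer of the two species dissociates. A minimal sketch of that calculation, where the abundances, the effective temperature and the sign convention are illustrative assumptions rather than values from this study:

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def delta_acidity(i_a, i_b, t_eff):
    """Estimate the difference in deprotonation enthalpy (kJ/mol) between
    acids A and B from the abundances of their deprotonated fragments when
    the proton-bound dimer dissociates.
    Kinetic method: ln(I_A / I_B) ~ d(DeltaH_acid) / (R * T_eff)."""
    return R * t_eff * math.log(i_a / i_b)

# Hypothetical abundances (3:1) at an assumed effective temperature of 450 K;
# a positive value indicates A retains the charge more readily than B.
print(delta_acidity(3.0, 1.0, 450.0))
```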
Abstract:
First-principles computational studies indicate that (B, N, or O)-doped graphene ribbon edges can substantially reduce the energy barrier for H2 dissociative adsorption. The reduced barrier is competitive with those of many widely used metal or metal oxide catalysts, suggesting that suitably functionalized graphene architectures are promising metal-free alternatives for low-cost catalytic processes.
Abstract:
The vibration serviceability limit state is an important design consideration for two-way, suspended concrete floors that is not always well understood by many practicing structural engineers. Although the field of floor vibration has been extensively developed, at present there are no convenient design tools that deal with this problem. Results from this research have enabled the development of a much-needed new method for assessing the vibration serviceability of flat, suspended concrete floors in buildings. This new method has been named the Response Coefficient-Root Function (RCRF) method. Full-scale laboratory tests have been conducted on a post-tensioned floor specimen at Queensland University of Technology's structural laboratory. Special support brackets were fabricated to act as frictionless, pinned connections at the corners of the specimen. A series of static and dynamic tests were performed in the laboratory to obtain basic material and dynamic properties of the specimen. Finite element models have been calibrated against data collected from the laboratory experiments, and the computational finite element analysis has been extended to investigate a variety of floor configurations. Field measurements of floors in existing buildings are in good agreement with the computational studies. Results from this parametric investigation have led to the development of a new approach for predicting the design frequencies and accelerations of flat concrete floor structures. The RCRF method is a convenient tool to assist structural engineers in designing for the vibration serviceability limit state of in-situ concrete floor systems.
Abstract:
Fractures of long bones are sometimes treated using various types of fracture fixation devices, including internal plate fixators. These are specialised plates used to bridge the fracture gap(s) whilst anatomically aligning the bone fragments. The plate is secured in position by screws. The aim of such a device is to support and promote the natural healing of the bone. When using an internal fixation device, the clinician must decide upon many parameters: for example, the type of plate and where to position it, and how many screws to use and where to position them. While a number of experimental and computational studies concerning screw configuration have been reported in the literature, there is still inadequate information available on the influence of screw configuration on fracture healing. Because screw configuration influences the amount of flexibility at the fracture site, it has a direct influence on the fracture healing process. Therefore, it is important that the chosen screw configuration does not inhibit the healing process. In addition to its impact on the fracture healing process, screw configuration plays an important role in the distribution of stresses in the plate due to the applied loads. A plate that experiences high stresses is prone to early failure. Hence, the screw configuration used should not encourage the occurrence of high stresses. This project develops a computational program, written in the Fortran programming language, that performs mathematical optimisation to determine the screw configuration of an internal fixation device within constraints on interfragmentary movement while minimising the corresponding stress in the plate. Thus, the optimal solution suggests the positioning and number of screws that satisfy the predefined constraints on interfragmentary movement.
For a set of screw configurations, the interfragmentary displacement and the stress occurring in the plate were calculated by the finite element method. The screw configurations were iteratively changed, and each time the corresponding interfragmentary displacements were compared with the predefined constraints. Additionally, the corresponding stress was compared with the previously calculated stress value to determine whether there was a reduction. These processes continued until an optimal solution was achieved. The optimisation program has been shown to successfully predict the optimal screw configuration in two cases. The first was a simplified bone construct, for which the screw configuration solution was comparable with those recommended in the biomechanical literature. The second was a femoral construct, for which the resultant screw configuration was shown to be similar to those used in clinical cases. The optimisation method and program developed in this study have shown potential for further investigations, given improvements to the optimisation criteria and the efficiency of the program.
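The constrained search described above can be sketched in outline: iterate over candidate screw configurations, keep the lowest-stress configuration whose interfragmentary movement (IFM) stays inside the prescribed window, and discard the rest. The finite element run is replaced here by a placeholder with purely illustrative trends; the hole layout, constraint window and stress model are all hypothetical, not taken from the study:

```python
from itertools import combinations

def fe_analysis(config):
    """Placeholder for a finite element run: returns (ifm_mm, stress_mpa).
    A real implementation would solve an FE model for this screw layout;
    the trends below (fewer screws -> more flexible, more screws -> higher
    plate stress) are illustrative assumptions only."""
    n = len(config)
    ifm = 2.0 / n
    stress = 50.0 + 10.0 * n
    return ifm, stress

def optimise(hole_positions, n_screws, ifm_min, ifm_max):
    """Exhaustively search screw placements; return the configuration with
    the lowest plate stress whose IFM satisfies the healing constraints."""
    best, best_stress = None, float("inf")
    for config in combinations(hole_positions, n_screws):
        ifm, stress = fe_analysis(config)
        # configurations violating the IFM window are rejected outright
        if ifm_min <= ifm <= ifm_max and stress < best_stress:
            best, best_stress = config, stress
    return best, best_stress

holes = range(1, 9)  # eight screw holes, four per fragment (assumed)
print(optimise(holes, 4, 0.2, 1.0))
```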
Abstract:
This review focuses on one of the fundamental phenomena that occur upon application of sufficiently strong electric fields to gases, namely the formation and propagation of ionization waves, or streamers. The dynamics of streamers is controlled by strongly nonlinear coupling, in localized streamer tip regions, between the electric field, enhanced due to charge separation, and the ionization and transport of charged species in that enhanced field. Streamers appear in nature (as initial stages of sparks and lightning, and as the huge structures known as sprites above thunderclouds), and are also found in numerous technological applications of electrical discharges. Here we discuss the fundamental physics of guided streamer-like structures, known as plasma bullets, which are produced in cold atmospheric-pressure plasma jets. Plasma bullets are guided ionization waves moving in a thin column of a jet of plasma-forming gases (e.g., He or Ar) expanding into ambient air. In contrast to streamers in free (unbounded) space, which propagate in a stochastic manner and often branch, guided ionization waves are repetitive and highly reproducible and propagate along the same path: the jet axis. This property of guided streamers enables many advanced time-resolved experimental studies of ionization waves with nanosecond precision. In particular, experimental studies on the manipulation of streamers by external electric fields and on streamer interactions are critically examined. This review also introduces the basic theories and recent advances in experimental and computational studies of guided streamers, in particular those related to the propagation dynamics of ionization waves and the various parameters of relevance to plasma streamers. This knowledge is very useful for optimizing the efficacy of applications of plasma streamer discharges in fields ranging from health care and medicine to materials science and nanotechnology.
Abstract:
Developing nano/micro-structures that can effectively upgrade the intriguing properties of electrode materials for energy storage devices is always a key research topic. Ultrathin nanosheets have proved to be one of the most promising nanostructures owing to their high specific surface area, good active contact areas and porous channels. Herein, we report a unique hierarchical micro-spherical morphology of well-stacked and completely miscible molybdenum disulfide (MoS2) nanosheets and graphene sheets, successfully synthesized via a simple, industrial-scale spray-drying technique to exploit the advantages of both MoS2 and graphene, namely their high practical capacity and high electronic conductivity, respectively. Computational studies were performed to understand the interfacial behaviour of MoS2 and graphene, indicating high stability of the composite with a high interfacial binding energy (−2.02 eV) between the two components. Further, the lithium and sodium storage properties were tested and reveal excellent cyclic stability over 250 and 500 cycles, respectively, with initial capacities of up to 1300 mAh g−1 and 640 mAh g−1 at 0.1 A g−1.
Abstract:
The results of a recent study have shown a severe shortage of donor hearts to meet the demand of patients suffering from acute heart failure, and patients who receive a left ventricular assist device (LVAD) have had their lives extended. However, some of them develop right heart failure syndrome and require a right ventricular assist device (RVAD). Hence, current research focuses on the development of a bi-ventricular assist device (Bi-VAD). Computational Fluid Dynamics (CFD) is useful for estimating blood damage in the design of a Bi-VAD centrifugal heart pump intended to meet the demand of the left and right ventricles of a normal heart, with a flow rate of 5 L/min and supply pressures of 100 mmHg for the left ventricle and 20 mmHg for the right ventricle. Numerical studies have been conducted to predict the pressure, flow rate, velocity profiles, and streamlines in a continuous-flow Bi-VAD. Based on the predictions of the numerical simulations, only a few flow regions in the Bi-VAD exhibited low-velocity zones and stagnation points, signifying potentially low levels of thrombogenesis.
Abstract:
This chapter traces the development of the global digital storytelling movement from its origins in California to its adoption by the BBC in the UK and its subsequent dispersal around the world. It identifies the foundational practices, uneven development and diffusion, and emergent practices internationally.
Abstract:
The Guardian reportage of the United Kingdom Member of Parliament (MP) expenses scandal of 2009 used crowdsourcing and computational journalism techniques. Computational journalism can be broadly defined as the application of computer science techniques to the activities of journalism. Its foundation lies in computer-assisted reporting techniques, and its importance is increasing due to: (a) the increasing availability of large-scale government datasets for scrutiny; (b) the declining cost, increasing power and ease of use of data mining and filtering software, and of Web 2.0 tools; and (c) the explosion of online public engagement and opinion. This paper provides a case study of the Guardian MP expenses scandal reportage and reveals some key challenges and opportunities for digital journalism. It finds that journalists may increasingly take an active role in understanding, interpreting, verifying and reporting clues or conclusions that arise from the interrogation of datasets (computational journalism). Secondly, a distinction should be made between information reportage and computational journalism in the digital realm, just as a distinction might be made between citizen reporting and citizen journalism. Thirdly, an opportunity exists for online news providers to take a 'curatorial' role, selecting and making easily available the best data sources for readers to use (information reportage). These activities have always been fundamental to journalism; however, the way in which they are undertaken may change. Findings from this paper may suggest opportunities and challenges for the implementation of computational journalism techniques in practice by digital Australian media providers, as well as further areas of research.
Abstract:
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government data sets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing upon international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the delivery of original investigative journalism, and to attract and retain readers online.
Abstract:
This chapter focuses on the interactions between delays and intrinsic noise effects, and their roles, within cellular pathways and regulatory networks. We address these aspects by focusing on genetic regulatory networks that share a common network motif, namely the negative feedback loop, leading to oscillatory gene expression and protein levels. In this context, we discuss computational simulation algorithms for addressing the interplay of delays and noise within signaling pathways based on biological data, and we address implementation issues associated with efficiency and robustness. In a molecular biology setting, we present two case studies of temporal models: the Hes1 gene (Monk, 2003; Hirata et al., 2002), known to act as a molecular clock, and the Her1/Her7 regulatory system controlling the periodic somite segmentation in vertebrate embryos (Giudicelli and Lewis, 2004; Horikawa et al., 2006).
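One common way to combine delays and intrinsic noise, in the spirit of the simulation algorithms discussed above, is a Gillespie-type stochastic simulation in which the product of a delayed reaction appears only after a fixed lag. The sketch below models a generic self-repressing gene with assumed, illustrative parameters; it is not the Hes1 or Her1/Her7 model of the chapter:

```python
import heapq
import math
import random

def delayed_ssa(t_end, tau=20.0, k_syn=1.0, k_deg=0.05, K=10.0, seed=1):
    """Stochastic simulation of a self-repressing gene whose synthesis
    completes only after a fixed delay tau (rejection-style handling of
    delayed events). All parameter values are illustrative assumptions.
    Returns a trajectory of (time, protein copy number)."""
    random.seed(seed)
    t, p = 0.0, 0
    pending = []   # min-heap of completion times for delayed syntheses
    history = []
    while t < t_end:
        a_syn = k_syn / (1.0 + (p / K) ** 2)  # repressed by own product
        a_deg = k_deg * p
        a0 = a_syn + a_deg
        dt = -math.log(random.random()) / a0  # exponential waiting time
        if pending and pending[0] <= t + dt:
            # a delayed synthesis completes before the next reaction fires
            t = heapq.heappop(pending)
            p += 1
        else:
            t += dt
            if random.random() < a_syn / a0:
                heapq.heappush(pending, t + tau)  # product appears later
            else:
                p -= 1  # degradation (propensity is zero when p == 0)
        history.append((t, p))
    return history

trajectory = delayed_ssa(200.0)
print(len(trajectory), trajectory[-1])
```

The delay between initiating and completing synthesis is what allows the negative feedback loop to overshoot and oscillate, while the discrete random events supply the intrinsic noise.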
Abstract:
We present a novel, web-accessible scientific workflow system which makes large-scale comparative studies accessible without programming or excessive configuration requirements. GPFlow allows a workflow defined on single input values to be automatically lifted to operate over collections of input values and supports the formation and processing of collections of values without the need for explicit iteration constructs. We introduce a new model for collection processing based on key aggregation and slicing which guarantees processing integrity and facilitates automatic association of inputs, allowing scientific users to manage the combinatorial explosion of data values inherent in large scale comparative studies. The approach is demonstrated using a core task from comparative genomics, and builds upon our previous work in supporting combined interactive and batch operation, through a lightweight web-based user interface.
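The key-aggregation idea can be illustrated schematically: each value carries a key, a step defined on single values is lifted to map over a keyed collection, and several collections are associated by slicing on their shared keys. This is an illustrative sketch of the model only, not the GPFlow API; the GC-content step and strain names are hypothetical:

```python
def lift(step):
    """Turn a function on single values into one on {key: value} dicts,
    preserving each input's key so results stay associated with it."""
    def lifted(collection):
        return {key: step(value) for key, value in collection.items()}
    return lifted

def associate(*collections):
    """Slice several keyed collections into per-key input tuples, keeping
    only keys present in every collection (automatic association)."""
    keys = set.intersection(*(set(c) for c in collections))
    return {k: tuple(c[k] for c in collections) for k in sorted(keys)}

# A single-value comparative-genomics step (hypothetical): GC count.
gc_content = lift(lambda seq: seq.count("G") + seq.count("C"))

genomes = {"strainA": "GATTACA", "strainB": "GGCC"}
counts = gc_content(genomes)          # lifted over the whole collection
print(counts)                         # {'strainA': 2, 'strainB': 4}
print(associate(genomes, counts))     # per-key (sequence, count) pairs
```

Because lifting and association are handled by the framework rather than by explicit loops, the user never writes iteration code even when inputs are crossed over large collections.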
Abstract:
Researchers in the social sciences and humanities have been interested in blogs, online diaries and online journals for more than a decade now. Even though the growth rate of the blogosphere has stagnated since the heyday of blogging in the 2000s, blogs remain one of the most significant genres of internet-based communication. Indeed, after the mass migration to Facebook, Twitter and other more recently emerged communication tools, a somewhat smaller but all the more firmly established blogosphere of committed and devoted participants has remained. Blogs are now accepted as part of institutional, personal and group communication strategies. In style and content, they sit between the more static information on conventional websites and the constantly updated Facebook and Twitter news feeds. Blogs allow their authors (and their commenters) to think through particular topics at lengths of a few hundred to a few thousand words, to go into detail in shorter posts, and, where appropriate, to publish more thoroughly considered texts elsewhere. They are also a highly flexible medium: images, audio, video and other materials can be embedded effortlessly, as can, of course, the fundamental instrument of blogging: hyperlinks.