912 results for Images - Computational methods
Abstract:
Solid-extracellular fluid interaction is believed to play an important role in the strain-rate dependent mechanical behaviors of shoulder articular cartilage. The kangaroo shoulder joint is believed to be anatomically and biomechanically similar to the human shoulder joint, and kangaroo tissue is readily available in Australia; kangaroo humeral head cartilage was therefore used as a suitable tissue for this study. Indentation tests from quasi-static (10^-4/sec) to moderately high strain rates (10^-2/sec) were conducted on kangaroo humeral head cartilage tissues to investigate the strain-rate dependent behaviors. A finite element (FE) model was then developed, in which cartilage was conceptualized as a porous solid matrix filled with an incompressible fluid. In this model, the solid matrix was modeled as an isotropic hyperelastic material and the percolating fluid was assumed to follow Darcy's law. Using an inverse FE procedure, the constitutive parameters related to the stiffness and compressibility of the solid matrix and to the permeability were obtained from the experimental results. The effect of solid-extracellular fluid interaction and drag force (the resistance to fluid movement) on strain-rate dependent behavior was investigated by comparing the influence of constant, strain dependent and strain-rate dependent permeability on the FE model predictions. The newly developed porohyperelastic cartilage model with strain-rate dependent permeability was found to be able to predict the strain-rate dependent behaviors of cartilage.
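For reference, Darcy's law for the fluid flux through the porous solid matrix, together with one commonly used strain-dependent permeability form, can be written as follows (a standard formulation given for illustration; the specific permeability laws compared in the paper are not spelled out in the abstract):

```latex
\mathbf{q} = -k\,\nabla p, \qquad k(\varepsilon) = k_0\, e^{M\varepsilon}
```

where q is the fluid flux, p the pore pressure, k the permeability, k_0 the permeability at zero strain, ε the volumetric strain and M a material constant; a strain-rate dependent permeability generalises this by letting k also depend on the rate of deformation.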
Abstract:
The advent of very high resolution (VHR) optical satellites capable of producing stereo images led to a new era in extracting digital elevation models (DEMs), which commenced with the launch of IKONOS. The special capabilities of VHR optical satellites, together with the significant economic benefits, stimulated other countries and companies to build their own constellations, such as EROS-A1 and EROS-B1, developed through the cooperation between Israel and ImageSat. QuickBird, WorldView-1 and WorldView-2 were launched by DigitalGlobe, while ALOS and GeoEye-1 were offered by Japan and GeoEye, respectively. In addition to the aforementioned satellites, India and South Korea initiated their own constellations by launching CartoSat-1 and KOMPSAT-2, respectively. The availability of all these satellites has created a large market of stereo images for the extraction of digital elevation models and related applications, such as producing orthorectified images and updating maps. Therefore, there is a need for a comprehensive comparison that allows scientific and commercial clients to choose appropriate satellite images and methods of generating digital elevation models to obtain optimum results. This paper therefore gives a review of the specifications of VHR optical satellites. It then discusses the automatic generation of digital elevation models. Finally, an overview of studies and their corresponding results is reported.
Abstract:
This chapter discusses the methodological aspects and empirical findings of a large-scale, funded project investigating public communication through social media in Australia. The project concentrates on Twitter, but we approach it as representative of broader current trends toward the integration of large datasets and computational methods into media and communication studies in general, and social media scholarship in particular. The research discussed in this chapter aims to empirically describe networks of affiliation and interest in the Australian Twittersphere, while reflecting on the methodological implications and imperatives of ‘big data’ in the humanities. Using custom network crawling technology, we have conducted a snowball crawl of Twitter accounts operated by Australian users to identify more than one million users and their follower/followee relationships, and have mapped their interconnections. In itself, the map provides an overview of the major clusters of densely interlinked users, largely centred on shared topics of interest (from politics through arts to sport) and/or sociodemographic factors (geographic origins, age groups). Our map of the Twittersphere is the first of its kind for the Australian part of the global Twitter network, and also provides a first independent and scholarly estimation of the size of the total Australian Twitter population. In combination with our investigation of participation patterns in specific thematic hashtags, the map also enables us to examine which areas of the underlying follower/followee network are activated in the discussion of specific current topics – allowing new insights into the extent to which particular topics and issues are of interest to specialised niches or to the Australian public more broadly. Specifically, we examine the Twittersphere footprint of dedicated political discussion, under the #auspol hashtag, and compare it with the heightened, broader interest in Australian politics during election campaigns, using #ausvotes; we explore the different patterns of Twitter activity across the map for major television events (the popular competitive cooking show #masterchef, the British #royalwedding, and the annual #stateoforigin Rugby League sporting contest); and we investigate the circulation of links to the articles published by a number of major Australian news organisations across the network. Such analysis, which combines the ‘big data’-informed map and a close reading of individual communicative phenomena, makes it possible to trace the dynamic formation and dissolution of issue publics against the backdrop of longer-term network connections, and the circulation of information across these follower/followee links. Such research sheds light on the communicative dynamics of Twitter as a space for mediated social interaction. Our work demonstrates the possibilities inherent in the current ‘computational turn’ (Berry, 2010) in the digital humanities, as well as adding to the development and critical examination of methodologies for dealing with ‘big data’ (boyd and Crawford, 2011). Our tools and methods for doing Twitter research, released under Creative Commons licences through our project website, provide the basis for replicable and verifiable digital humanities research on the processes of public communication which take place through this important new social network.
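As a rough illustration of the snowball-crawl idea described above (not the project's actual crawler; fetch_followees() and the toy graph below are invented stand-ins for the custom crawling technology and API client):

```python
import networkx as nx

# In the real project this lookup would call the custom crawler / Twitter API;
# here a tiny hard-coded toy graph stands in for it.
TOY_FOLLOWEES = {
    "user_a": ["user_b", "user_c"],
    "user_b": ["user_c"],
    "user_c": ["user_a", "user_d"],
    "user_d": [],
}

def fetch_followees(user_id):
    return TOY_FOLLOWEES.get(user_id, [])

def snowball_crawl(seed_ids, max_users=1_000_000):
    """Breadth-first snowball crawl of follower/followee relationships."""
    graph = nx.DiGraph()
    frontier = list(seed_ids)
    seen = set(seed_ids)
    while frontier and len(seen) < max_users:
        user = frontier.pop(0)
        for followee in fetch_followees(user):
            graph.add_edge(user, followee)      # directed user -> followee edge
            if followee not in seen:
                seen.add(followee)
                frontier.append(followee)
    return graph

twittersphere = snowball_crawl(["user_a"])
print(twittersphere.number_of_nodes(), twittersphere.number_of_edges())
# Densely interlinked clusters could then be located with community detection, e.g.
# nx.algorithms.community.greedy_modularity_communities(twittersphere.to_undirected()).
```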
Numerical investigation of motion and deformation of a single red blood cell in a stenosed capillary
Abstract:
It is generally assumed that the influence of red blood cells (RBCs) is predominant in blood rheology. Healthy RBCs are highly deformable and can thus easily squeeze through the smallest capillaries, whose internal diameters are less than the cells' characteristic size. On the other hand, RBCs infected by malaria or affected by other diseases are stiffer and less deformable, making it harder for them to flow through the smallest capillaries. Therefore, it is very important to critically and realistically investigate the mechanical behavior of both healthy and infected RBCs, which is a current gap in knowledge. The motion and the steady-state deformed shape of an RBC depend on many factors, such as the geometrical parameters of the capillary through which blood flows, the membrane bending stiffness and the mean velocity of the blood flow. In this study, the motion and deformation of a single two-dimensional RBC in a stenosed capillary is explored using the smoothed particle hydrodynamics (SPH) method. An elastic spring network is used to model the RBC membrane, while the fluids inside and outside the RBC are treated as SPH particles. The effects of the membrane bending stiffness (kb), the inlet pressure (P) and the geometrical parameters of the capillary on the motion and deformation of the RBC are studied. The deformation index, the RBC's mean velocity and the cell membrane energy are analyzed as the cell passes through the stenosed capillary. The simulation results demonstrate that kb, P and the geometrical parameters of the capillary have a significant impact on the RBC's motion and deformation in the stenosed section.
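A generic form of the elastic energy of such a spring-network membrane, with harmonic stretching and bending contributions, is shown below for illustration (the exact functional form and coefficients used in the paper may differ):

```latex
E = \sum_{i} \frac{k_s}{2}\,\frac{(l_i - l_0)^2}{l_0} \;+\; \sum_{j} \frac{k_b}{2}\,(\theta_j - \theta_0)^2
```

where l_i are the spring lengths, θ_j the angles between neighbouring springs, l_0 and θ_0 their reference values, and k_s, k_b the stretching and bending stiffnesses; the cell membrane energy analysed in the paper is of this general kind.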
Abstract:
Environmental sensors collect massive amounts of audio data. This thesis investigates computational methods to support human analysts in identifying faunal vocalisations from that audio. A series of experiments was conducted to trial the effectiveness of novel user interfaces. This research examines the rapid scanning of spectrograms, decision support tools for users, and cleaning methods for folksonomies. Together, these investigations demonstrate that providing computational support to human analysts increases their efficiency and accuracy; this allows bioacoustics projects to efficiently utilise their valuable human analysts.
Abstract:
This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio, single stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One of the advantages of this approach is that it does not require any modification of the CFD code for the propagation of random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius for the total-to-static efficiency of the turbine, compared to the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters. The results show that the most influential coupled random variables are the trailing edge tip radius coupled with the angular velocity.
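For reference, a gPC representation with pseudo-spectral projection takes the standard form below (the particular polynomial basis and quadrature rule used in the paper are not stated in the abstract):

```latex
u(\boldsymbol{\xi}) \approx \sum_{k=0}^{P} u_k\,\Psi_k(\boldsymbol{\xi}), \qquad
u_k = \frac{1}{\langle \Psi_k^2 \rangle}\int u(\boldsymbol{\xi})\,\Psi_k(\boldsymbol{\xi})\,\rho(\boldsymbol{\xi})\,d\boldsymbol{\xi}
```

where the Ψ_k are polynomials orthogonal with respect to the density ρ of the random inputs ξ, and the projection integral is evaluated by quadrature using deterministic CFD runs at the quadrature nodes, which is why no modification of the CFD code is required.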
Abstract:
Environmental acoustic recordings can be used to perform avian species richness surveys, whereby a trained ornithologist can observe the species present by listening to the recording. This could be made more efficient by using computational methods for iteratively selecting the richest parts of a long recording for the human observer to listen to, a process known as “smart sampling”. This allows scaling up to much larger ecological datasets. In this paper we explore computational approaches based on information and diversity of selected samples. We propose to use an event detection algorithm to estimate the amount of information present in each sample. We further propose to cluster the detected events for a better estimate of this amount of information. Additionally, we present a time dispersal approach to estimating diversity between iteratively selected samples. Combinations of approaches were evaluated on seven 24-hour recordings that have been manually labeled by bird watchers. The results show that on average all the methods we have explored would allow annotators to observe more new species in fewer minutes compared to a baseline of random sampling at dawn.
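As a rough sketch of the iterative "smart sampling" idea (not the paper's algorithm; the detector output and the dispersal weighting below are invented placeholders):

```python
import numpy as np

# Greedily pick one-minute segments estimated to be information-rich, while
# spreading picks out in time. event_counts stands in for the per-minute output
# of an acoustic event detector over a 24-hour recording (placeholder data).
rng = np.random.default_rng(0)
event_counts = rng.poisson(3.0, size=24 * 60)          # hypothetical detector output

def smart_sample(event_counts, n_samples=60, dispersal_weight=0.5):
    minutes = np.arange(len(event_counts))
    chosen = []
    for _ in range(n_samples):
        if chosen:
            # Time-dispersal bonus: distance (in minutes) to the nearest already-chosen sample
            nearest = np.min(np.abs(minutes[:, None] - np.array(chosen)[None, :]), axis=1)
        else:
            nearest = np.full(len(minutes), len(minutes))
        score = event_counts + dispersal_weight * nearest
        score[chosen] = -np.inf                          # never re-pick a minute
        chosen.append(int(np.argmax(score)))
    return chosen

print(smart_sample(event_counts)[:10])
```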
Abstract:
Human activities such as the combustion of fossil fuels, biomass burning, and industrial and agricultural processes emit a large amount of particulates into the atmosphere. As a consequence, the air we inhale contains a significant amount of suspended particles, including organic and inorganic solids and liquids as well as various microorganisms, which are responsible for a number of pulmonary diseases. Developing a numerical model for the transport and deposition of foreign particles in realistic lung geometry is very challenging due to the complex geometrical structure of the human lung. In this study, we have numerically investigated airborne particle transport and deposition on the human lung surface. In order to obtain appropriate results for particle transport and deposition in the human lung, we generated a realistic lung geometry from a CT scan obtained from a local hospital. For a more accurate approach, we also created a mucus layer inside the geometry, adjacent to the lung surface, and added the appropriate mucus layer properties to the wall surface. The Lagrangian particle tracking technique is employed, using the ANSYS FLUENT solver, to simulate the steady-state inspiratory flow. Various injection techniques are used to release the foreign particles through the inlet of the geometry. In order to investigate the effects of particle size on deposition, numerical calculations are carried out for particles ranging in size from 1 micron to 10 microns. The numerical results show that the particle deposition pattern is completely dependent on the particle's initial position and that, in the case of the realistic geometry, most of the particles are deposited on the rough wall surface of the lung geometry rather than in the carinal region.
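For reference, Lagrangian particle tracking of micron-sized particles typically integrates an equation of motion of the following standard form (shown as an assumed illustration; the solver settings in the paper may include further force terms):

```latex
m_p \frac{d\mathbf{v}_p}{dt} = \frac{3\pi\mu d_p}{C_c}\,(\mathbf{u} - \mathbf{v}_p) + m_p\,\mathbf{g}\left(1 - \frac{\rho}{\rho_p}\right)
```

where v_p is the particle velocity, u the local air velocity, d_p the particle diameter, μ the air viscosity, ρ and ρ_p the air and particle densities, and C_c the Cunningham slip correction; the first term is the Stokes drag, valid for small particle Reynolds numbers.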
Abstract:
Aerosol deposition in cylindrical tubes is a subject of interest to researchers and engineers in many applications of aerosol physics and metrology. Investigation of nano-particles in contexts such as the lungs, upper airways, batteries and vehicle exhaust gases is vital because of their smaller size, adverse health effects and the greater difficulty of trapping them compared with micro-particles. Lagrangian particle tracking provides an effective method for simulating the deposition of nano-particles as well as micro-particles, since it accounts for the particle inertia effect as well as Brownian excitation. However, the use of the Lagrangian approach for simulating ultrafine particles has been limited due to computational cost and numerical difficulties. In this paper, the deposition of nano-particles in cylindrical tubes under laminar conditions is studied using the Lagrangian particle tracking method. The commercial Fluent software is used to simulate the fluid flow in the pipes and to study the deposition and dispersion of nano-particles. Different particle diameters as well as different flow rates are examined. A point analysis in a uniform flow is performed to validate the Brownian motion model. The results show good agreement between the calculated deposition efficiency and the analytic correlations in the literature. Furthermore, for nano-particles with diameters larger than 40 nm, the deposition efficiency calculated by the Lagrangian method is lower than that given by the analytic correlations based on the Eulerian method, due to statistical error or the inertia effect.
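The Brownian motion validation mentioned here can be illustrated with a minimal random-walk check of the kind below (a sketch under assumed parameter values, written independently of the paper; Fluent's actual discrete random walk implementation differs):

```python
import numpy as np

# All parameter values are illustrative assumptions; the diffusivity follows the
# Stokes-Einstein relation with a commonly used Cunningham slip correction fit.
kB  = 1.380649e-23          # Boltzmann constant, J/K
T   = 293.15                # gas temperature, K (assumed)
mu  = 1.8e-5                # air dynamic viscosity, Pa.s (assumed)
lam = 68e-9                 # mean free path of air, m (assumed)
d_p = 40e-9                 # particle diameter, m (assumed)

Kn = 2.0 * lam / d_p                                   # Knudsen number
Cc = 1.0 + Kn * (1.257 + 0.4 * np.exp(-1.1 / Kn))      # slip correction (common fit)
D  = kB * T * Cc / (3.0 * np.pi * mu * d_p)            # diffusivity, m^2/s

dt, n_steps, n_particles = 1e-5, 1000, 2000            # assumed step size and sample sizes
rng = np.random.default_rng(0)

# Euler-Maruyama integration of pure Brownian motion: dx = sqrt(2 D dt) * N(0, 1)
steps = np.sqrt(2.0 * D * dt) * rng.standard_normal((n_particles, n_steps))
x_final = steps.sum(axis=1)

# The ensemble mean-square displacement should approach 2 D t
print("simulated MSD:", np.mean(x_final**2), "expected 2Dt:", 2.0 * D * n_steps * dt)
```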
Abstract:
Several mechanisms have been proposed to explain the action of enzymes at the atomic level. Among them, the recent proposals involving short hydrogen bonds as a step in catalysis by Gerlt and Gassman [1] and proton transfer through low barrier hydrogen bonds (LBHBs) [2, 3] have attracted attention. There are several limitations to experimentally testing such hypotheses. Recent developments in computational methods facilitate the study of active site-ligand complexes to high levels of accuracy. Our previous studies, which involved the docking of the dinucleotide substrate UpA to the active site of RNase A [4, 5], enabled us to obtain a realistic model of the ligand-bound active site of RNase A. From these studies, based on empirical potential functions, we were able to obtain the molecular dynamics averaged coordinates of RNase A bound to the ligand UpA. A quantum mechanical study is required to investigate the catalytic process, which involves the cleavage and formation of covalent bonds. In the present study, we have investigated the strengths of some of the hydrogen bonds between the active site residues of RNase A and UpA at the ab initio quantum chemical level, using the molecular dynamics averaged coordinates as the starting point. The 49-atom system and other model systems were optimized at the 3-21G level and the energies of the optimized systems were obtained at the 6-31G* level. The results clearly indicate the strengthening of hydrogen bonds between neutral residues due to the presence of charged species at appropriate positions. Such strengthening manifests itself in the form of short hydrogen bonds and a low barrier for proton transfer. In the present study, the proton transfer between the 2'-OH of ribose (from the substrate) and the imidazole group of His12 of RNase A is influenced by K41, which plays a crucial role in strengthening the neutral hydrogen bond, reducing the barrier for proton transfer.
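Hydrogen bond strengths of this kind are usually estimated with the standard supermolecular energy difference (given here for reference; the abstract does not state the exact expression used):

```latex
\Delta E_{\mathrm{HB}} = E(\mathrm{A}\cdots\mathrm{B}) - E(\mathrm{A}) - E(\mathrm{B})
```

where the energies of the complex and of the isolated fragments would here be computed at the 6-31G* level on the 3-21G optimised geometries, and the proton-transfer barrier can be estimated analogously as the energy difference between the hydrogen-bonded and proton-transferred configurations.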
Abstract:
This thesis discusses the ways in which concepts and methodology developed in evolutionary biology can be applied to the explanation and study of language change. The parallel nature of the mechanisms of biological evolution and language change is explored, along with the history of the exchange of ideas between these two disciplines. Against this background, computational methods developed in evolutionary biology are considered in terms of their applicability to the study of historical relationships between languages. Different phylogenetic methods are explained in common terminology, avoiding the technical language of statistics. The thesis is, on the one hand, a synthesis of earlier scientific discussion and, on the other, an attempt to map out the problems of earlier approaches and to find new guidelines for the study of language change on that basis. The source material consists primarily of literature on the connections between evolutionary biology and language change, along with research articles describing applications of phylogenetic methods to language change. The thesis starts out by describing the initial development of the disciplines of evolutionary biology and historical linguistics, a process which right from the beginning can be seen to have involved an exchange of ideas concerning the mechanisms of language change and biological evolution. The historical discussion lays the foundation for the treatment of the generalised account of selection developed during recent decades. This account aims to create a theoretical framework capable of explaining both biological evolution and cultural change as selection processes acting on self-replicating entities. This thesis focusses on the capacity of the generalised account of selection to describe language change as a process of this kind. In biology, the mechanisms of evolution are seen to form populations of genetically related organisms through time. One of the central questions explored in this thesis is whether selection theory makes it possible to picture languages as forming populations of a similar kind, and what such a perspective can offer to the understanding of language in general. In historical linguistics, the comparative method and other complementary methods have traditionally been used to study the development of languages from a common ancestral language. Computational, quantitative methods have not become widely used as part of the central methodology of historical linguistics. After the limited popularity enjoyed by the lexicostatistical method since the 1950s faded, it is only in recent years that the computational methods of phylogenetic inference used in evolutionary biology have also been applied to the study of early language history. In this thesis the possibilities offered by the traditional methodology of historical linguistics and the new phylogenetic methods are compared. The methods are approached through the ways in which they have been applied to the Indo-European languages, the language family most thoroughly investigated with both the traditional and the phylogenetic methods. The problems of these applications, along with the optimal form of the linguistic data used in these methods, are explored in the thesis.
The mechanisms of biological evolution are seen in the thesis as parallel, in a limited sense, to the mechanisms of language change, yet sufficiently so that the development of a generalised account of selection is deemed potentially fruitful for understanding language change. These similarities are also seen to support the validity of using phylogenetic methods in the study of language history, although the use of linguistic data and the models of language change employed by these methods are seen to await further development.
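As a toy illustration of the kind of distance-based phylogenetic computation discussed in the thesis (the languages, cognate matrix and clustering choice below are invented placeholders, not material from the thesis):

```python
import numpy as np
from scipy.cluster.hierarchy import average
from scipy.spatial.distance import squareform

# Distance-based tree building from cognate data, in the spirit of lexicostatistics.
languages = ["LangA", "LangB", "LangC", "LangD"]

# Binary cognate matrix: rows = languages, columns = cognate classes
# (1 = the language has a word in that cognate class). Placeholder values.
cognates = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1, 1],
])

# Pairwise distance = fraction of cognate classes in which two languages differ
n = len(languages)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        dist[i, j] = np.mean(cognates[i] != cognates[j])

# Average-linkage (UPGMA-style) clustering yields a rooted tree; the literature
# discussed in the thesis uses more elaborate methods such as Bayesian inference.
tree = average(squareform(dist))
print(tree)
```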