630 results for Computational studies
Abstract:
Characterization of the human epigenetic profile since the initial breakthrough of the Human Genome Project has firmly established the key role of histone modifications and DNA methylation. These dynamic elements interact to determine the normal level of expression or methylation status of the constituent genes in the genome. Recently, considerable evidence has been put forward to demonstrate that environmental stress implicitly alters epigenetic patterns, causing an imbalance that can lead to cancer initiation. This chain of consequences has motivated attempts to computationally model the influence of histone modification and DNA methylation on gene expression and to investigate their intrinsic interdependency. In this paper, we explore the relation between DNA methylation and transcription and characterize in detail the histone modifications for specific DNA methylation levels using a stochastic approach.
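A minimal sketch of what a stochastic treatment of this kind might look like: a toy two-state methylation model with a simple repressive coupling to expression. This is not the model used in the paper, and all rates and parameter names are invented for illustration.

```python
# Illustrative toy model (not the paper's model): each CpG site flips between
# methylated/unmethylated states, and the transcription rate decreases with
# the promoter methylation fraction.
import numpy as np

rng = np.random.default_rng(0)

n_sites = 50          # CpG sites in a promoter region (hypothetical)
p_meth = 0.02         # per-step methylation probability (assumed)
p_demeth = 0.01       # per-step demethylation probability (assumed)
k_max = 10.0          # maximal transcription rate (arbitrary units)
steps = 1000

state = np.zeros(n_sites, dtype=bool)   # False = unmethylated
expression = []

for _ in range(steps):
    gain = (~state) & (rng.random(n_sites) < p_meth)
    loss = state & (rng.random(n_sites) < p_demeth)
    state = (state | gain) & ~loss
    meth_frac = state.mean()
    # Simple repressive coupling: expression falls with methylation fraction.
    expression.append(k_max * (1.0 - meth_frac))

print(f"final methylation fraction: {state.mean():.2f}")
print(f"mean expression over run:   {np.mean(expression):.2f}")
```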
Abstract:
Over the last few years, investigations of human epigenetic profiles have identified the key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. In order to extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts to date have been limited with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigating the associated dinucleotide distributions. The focus of this paper is to explore the frequencies of specific dinucleotides across defined regions within the human genome, and to identify new patterns between epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transformations, are employed and the principal results are reported.
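An illustrative sketch of the kind of analysis described: dinucleotide frequency counting plus a Fourier transform of a CG-indicator signal. The sequence, helper names and parameters below are placeholders, not the paper's pipeline.

```python
# Illustrative sketch: count dinucleotide frequencies in a DNA sequence and
# inspect the spectrum of a CG-dinucleotide indicator signal with an FFT.
from collections import Counter
import numpy as np

def dinucleotide_frequencies(seq):
    """Return relative frequencies of the 16 overlapping dinucleotides."""
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    counts = Counter(pairs)
    total = sum(counts.values())
    return {d: counts[d] / total for d in sorted(counts)}

def cg_spectrum(seq):
    """Magnitude spectrum of the binary 'CG starts here' indicator signal."""
    indicator = np.array([1.0 if seq[i:i + 2] == "CG" else 0.0
                          for i in range(len(seq) - 1)])
    return np.abs(np.fft.rfft(indicator - indicator.mean()))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    seq = "".join(rng.choice(list("ACGT"), size=5000))   # placeholder sequence
    freqs = dinucleotide_frequencies(seq)
    print("CG frequency:", round(freqs.get("CG", 0.0), 4))
    print("dominant spectral bin:", int(np.argmax(cg_spectrum(seq)[1:]) + 1))
```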
Abstract:
Systems-level identification and analysis of cellular circuits in the brain will require the development of whole-brain imaging with single-cell resolution. To this end, we performed comprehensive chemical screening to develop a whole-brain clearing and imaging method, termed CUBIC (clear, unobstructed brain imaging cocktails and computational analysis). CUBIC is a simple and efficient method involving the immersion of brain samples in chemical mixtures containing aminoalcohols, which enables rapid whole-brain imaging with single-photon excitation microscopy. CUBIC is applicable to multicolor imaging of fluorescent proteins or immunostained samples in adult brains and is scalable from a primate brain to subcellular structures. We also developed a whole-brain cell-nuclear counterstaining protocol and a computational image analysis pipeline that, together with CUBIC reagents, enable the visualization and quantification of neural activities induced by environmental stimulation. CUBIC enables time-course expression profiling of whole adult brains with single-cell resolution.
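A hedged sketch of the nuclear-counting step such a computational pipeline might contain (Gaussian smoothing, thresholding and connected-component labelling with scipy.ndimage). This is not the published CUBIC code; the threshold heuristic and synthetic data are assumptions.

```python
# Illustrative sketch only (not the published CUBIC pipeline): count
# nuclei-like objects in a cleared-brain image stack by smoothing,
# thresholding and connected-component labelling.
import numpy as np
from scipy import ndimage

def count_nuclei(stack, sigma=1.0, threshold=None):
    """stack: 3D numpy array (z, y, x) of nuclear-counterstain intensities."""
    smoothed = ndimage.gaussian_filter(stack.astype(float), sigma=sigma)
    if threshold is None:
        threshold = smoothed.mean() + 2 * smoothed.std()   # crude heuristic
    binary = smoothed > threshold
    labels, n_objects = ndimage.label(binary)
    return n_objects, labels

if __name__ == "__main__":
    # Synthetic stack with a few bright blobs standing in for nuclei.
    rng = np.random.default_rng(2)
    stack = rng.normal(100, 5, size=(32, 64, 64))
    for z, y, x in [(8, 10, 10), (16, 30, 40), (24, 50, 20)]:
        stack[z - 1:z + 2, y - 2:y + 3, x - 2:x + 3] += 200
    n, _ = count_nuclei(stack)
    print("objects detected:", n)
```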
Abstract:
Epigenetic changes correspond to heritable modifications of chromosome structure which do not involve alteration of the DNA sequence but do affect gene expression. These mechanisms play an important role in normal cell differentiation, but aberrant changes are also associated with several diseases, including cancer and neural disorders. Despite intensive study in recent years, however, the contribution of these modifications remains largely unquantified due to overall system complexity and insufficient data. Computational models can provide powerful auxiliary tools to experimentation, not least because they can span scales from the sub-cellular level through cell populations (or networks of genes). In this paper, the challenges to developing realistic cross-scale models are discussed and illustrated with respect to current work.
Abstract:
This thesis studies the structural behaviour of high bond strength masonry shear walls by developing a combined interface and surface contact model. The results are further verified by a cost-effective structural-level model, which was then used extensively to predict all possible failure modes of high bond strength masonry shear walls. It is concluded that increasing the bond strength of masonry changes the failure mode from diagonal cracking to base sliding and does not proportionally increase the in-plane shear capacity. This can be overcome by increasing the pre-compression pressure, which causes failure through the blocks. A design equation is proposed, and high bond strength masonry is recommended for taller buildings and/or pre-stressed masonry applications.
Abstract:
This paper reports the details of an experimental study of cold-formed steel hollow section columns at ambient and elevated temperatures. In this study, the global buckling behaviour of cold-formed Square Hollow Section (SHS) slender columns under axial compression was investigated at various uniform elevated temperatures up to 700°C. The results of these column tests are reported in this paper, including the buckling/failure modes at elevated temperatures and the ultimate load versus temperature curves. Finite element models of the tested columns were also developed, and their behaviour and ultimate capacities at ambient and elevated temperatures were studied. Fire design rules given in European and American standards, including the Direct Strength Method (DSM) based design rules, were used to predict the ultimate capacities of the tested columns at elevated temperatures. Elevated temperature mechanical properties and stress-strain models given in European steel design standards and past research were used with the design rules and finite element models to investigate their effects on SHS column capacities. Comparisons of column capacities from tests and finite element analyses with those predicted by current design rules were used to determine the accuracy of currently available column design rules in predicting the capacities of SHS columns at elevated temperatures and the need to use appropriate elevated temperature material stress-strain models. This paper presents the important findings derived from these comparisons.
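For readers unfamiliar with the DSM, the sketch below applies the standard DSM global (flexural) buckling column curve with reduced elevated-temperature properties. The section, length and reduction factors are assumed for illustration and are not values from the tests or from any specific standard.

```python
# Hedged sketch of a Direct Strength Method (DSM) global-buckling check of the
# kind referred to above; elevated-temperature reduction factors are placeholders.
import math

def dsm_global_capacity(A_g, f_y, E, L_eff, r):
    """Nominal member (global buckling) capacity Pne per the DSM column curve."""
    P_y = A_g * f_y                                   # squash load
    f_cre = math.pi**2 * E / (L_eff / r)**2           # Euler buckling stress
    P_cre = A_g * f_cre
    lam_c = math.sqrt(P_y / P_cre)
    if lam_c <= 1.5:
        return (0.658 ** (lam_c ** 2)) * P_y
    return (0.877 / lam_c ** 2) * P_y

if __name__ == "__main__":
    # Hypothetical SHS column; reduction factors k_y, k_E at 500 °C are assumed.
    A_g, f_y20, E20, L_eff, r = 560e-6, 450e6, 200e9, 2.8, 0.04   # SI units
    k_y, k_E = 0.6, 0.5
    P_amb = dsm_global_capacity(A_g, f_y20, E20, L_eff, r)
    P_500 = dsm_global_capacity(A_g, k_y * f_y20, k_E * E20, L_eff, r)
    print(f"ambient capacity ~ {P_amb/1e3:.0f} kN, at 500 °C ~ {P_500/1e3:.0f} kN")
```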
Abstract:
Purpose – In structural, earthquake and aeronautical engineering and in mechanical vibration, the solution of the dynamic equations for a structure subjected to dynamic loading leads to a high-order system of differential equations. Numerical methods are usually used for integration when either the data are discrete or there is no analytical solution for the equations. Since numerical methods with greater accuracy and stability give more accurate structural responses, there is a need to improve the existing methods or to develop new ones. The paper aims to discuss these issues.
Design/methodology/approach – In this paper, a new time integration method is proposed mathematically and numerically, and is applied to single-degree-of-freedom (SDOF) and multi-degree-of-freedom (MDOF) systems. The results are then compared with existing methods, such as Newmark's method, and with the closed-form solution.
Findings – It is concluded that, in the proposed method, the data variance of each set of structural responses (displacement, velocity or acceleration) at different time steps is less than in Newmark's method, and that the proposed method is more accurate and stable than Newmark's method and can analyse the structure with fewer iterations or computation cycles, and is hence less time-consuming.
Originality/value – A new mathematical and numerical time integration method is proposed for computing structural responses with higher accuracy and stability, lower data variance, and fewer iterations or computation cycles.
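As a point of reference, the following is a minimal sketch of the classical Newmark-beta scheme that the proposed method is compared against, applied to an SDOF oscillator. The loading, damping and parameter values are illustrative only.

```python
# Minimal Newmark-beta integrator for an SDOF system m*u'' + c*u' + k*u = F(t).
import numpy as np

def newmark_sdof(m, c, k, force, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
    u = np.zeros(n_steps + 1); v = np.zeros(n_steps + 1); a = np.zeros(n_steps + 1)
    u[0], v[0] = u0, v0
    a[0] = (force(0.0) - c * v0 - k * u0) / m
    a_coef = m + gamma * dt * c + beta * dt**2 * k      # effective "mass"
    for n in range(n_steps):
        t_next = (n + 1) * dt
        # Predictors from the Newmark displacement/velocity expansions.
        u_pred = u[n] + dt * v[n] + (0.5 - beta) * dt**2 * a[n]
        v_pred = v[n] + (1.0 - gamma) * dt * a[n]
        # Enforce equilibrium at t_{n+1} to solve for the new acceleration.
        a[n + 1] = (force(t_next) - c * v_pred - k * u_pred) / a_coef
        u[n + 1] = u_pred + beta * dt**2 * a[n + 1]
        v[n + 1] = v_pred + gamma * dt * a[n + 1]
    return u, v, a

if __name__ == "__main__":
    m, c, k = 1.0, 0.1, 4.0 * np.pi**2          # lightly damped ~1 Hz oscillator
    u, v, a = newmark_sdof(m, c, k, force=lambda t: 0.0,
                           u0=1.0, v0=0.0, dt=0.01, n_steps=500)
    print("displacement after 5 s:", round(u[-1], 4))
```

With gamma = 1/2 and beta = 1/4 this is the unconditionally stable average-acceleration variant commonly used as a baseline in such comparisons.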
Abstract:
This chapter discusses the methodological aspects and empirical findings of a large-scale, funded project investigating public communication through social media in Australia. The project concentrates on Twitter, but we approach it as representative of broader current trends toward the integration of large datasets and computational methods into media and communication studies in general, and social media scholarship in particular. The research discussed in this chapter aims to empirically describe networks of affiliation and interest in the Australian Twittersphere, while reflecting on the methodological implications and imperatives of ‘big data’ in the humanities. Using custom network crawling technology, we have conducted a snowball crawl of Twitter accounts operated by Australian users to identify more than one million users and their follower/followee relationships, and have mapped their interconnections. In itself, the map provides an overview of the major clusters of densely interlinked users, largely centred on shared topics of interest (from politics through arts to sport) and/or sociodemographic factors (geographic origins, age groups). Our map of the Twittersphere is the first of its kind for the Australian part of the global Twitter network, and also provides a first independent and scholarly estimate of the size of the total Australian Twitter population. In combination with our investigation of participation patterns in specific thematic hashtags, the map also enables us to examine which areas of the underlying follower/followee network are activated in the discussion of specific current topics – allowing new insights into the extent to which particular topics and issues are of interest to specialised niches or to the Australian public more broadly. Specifically, we examine the Twittersphere footprint of dedicated political discussion under the #auspol hashtag, and compare it with the heightened, broader interest in Australian politics during election campaigns, using #ausvotes; we explore the different patterns of Twitter activity across the map for major television events (the popular competitive cooking show #masterchef, the British #royalwedding, and the annual #stateoforigin Rugby League sporting contest); and we investigate the circulation of links to articles published by a number of major Australian news organisations across the network. Such analysis, which combines the ‘big data’-informed map with a close reading of individual communicative phenomena, makes it possible to trace the dynamic formation and dissolution of issue publics against the backdrop of longer-term network connections, and the circulation of information across these follower/followee links. Such research sheds light on the communicative dynamics of Twitter as a space for mediated social interaction. Our work demonstrates the possibilities inherent in the current ‘computational turn’ (Berry, 2010) in the digital humanities, as well as adding to the development and critical examination of methodologies for dealing with ‘big data’ (boyd and Crawford, 2011). Our tools and methods for doing Twitter research, released under Creative Commons licences through our project website, provide the basis for replicable and verifiable digital humanities research on the processes of public communication which take place through this important new social network.
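As a small-scale illustration of the kind of network analysis involved (not the project's crawler or dataset), the sketch below builds a follower/followee graph from an invented edge list and extracts densely interlinked clusters by modularity using networkx.

```python
# Illustrative sketch: build a small follower/followee graph and find densely
# interlinked clusters by modularity. The edge list is invented for demonstration.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# (follower, followee) pairs — hypothetical accounts
edges = [
    ("alice", "bob"), ("bob", "alice"), ("carol", "alice"), ("carol", "bob"),
    ("dave", "erin"), ("erin", "dave"), ("frank", "erin"), ("frank", "dave"),
    ("alice", "erin"),   # a single bridge between the two clusters
]

G = nx.DiGraph()
G.add_edges_from(edges)

print("accounts:", G.number_of_nodes(), "follow relationships:", G.number_of_edges())
print("most followed:", max(G.nodes, key=lambda n: G.in_degree(n)))

# Cluster detection on the undirected projection of the follow network.
communities = greedy_modularity_communities(G.to_undirected())
for i, community in enumerate(communities, start=1):
    print(f"cluster {i}: {sorted(community)}")
```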
Abstract:
This research developed and applied an evaluative framework to analyse multiple scales of decision-making for environmental management planning. It is the first exploration of the sociological theory of structural-functionalism and its usefulness in supporting evidence-based decision-making in a planning context. The framework was applied to analyse decision-making in Queensland's Cape York Peninsula and Wet Tropics regions.
Abstract:
This chapter explores the possibility and exigencies of employing hypotheses, or educated guesses, as the basis for ethnographic research design. The authors’ goal is to examine whether using hypotheses might provide a path to resolving some of the challenges to knowledge claims produced by ethnographic studies. Through a resolution of the putative division between the qualitative and quantitative research traditions, it is argued that hypotheses can serve as inferential warrants in qualitative and ethnographic studies.
Abstract:
Morphological and physiological characteristics of neurons located in the dorsolateral and two ventral subdivisions of the lateral amygdala (LA) have been compared in order to differentiate their roles in the formation and storage of fear memories (Alphs et al., SfN abs. 623.1, 2003). Briefly, in these populations, significant differences are observed in input resistance, membrane time constant, firing frequency, dendritic tortuosity, and the numbers of primary dendrites, dendritic segments and dendritic nodes...
Abstract:
Vicki Mayer’s book is unusual in that, despite its title, it is not about television producers at all, or at least not in the sense that scholars and the television industry itself have traditionally understood the role. Rather than referring to those in creative, managerial or financial control, or those with substantial intellectual input into a program, Mayer uses the term in a deliberately broad sense to mean, essentially, anyone ‘whose labor, however small, contributes to [television] production’ (179).
Abstract:
Raman and infrared spectra of the uranyl mineral phurcalite, Ca2(UO2)3O2(PO4)2⋅7H2O, from Red Canyon, Utah, USA, were studied and tentatively interpreted. The observed bands were assigned to the stretching and bending vibrations of (UO2)2+ and (PO4)3− units and to the stretching and bending vibrations and libration modes of water molecules. Approximate U–O bond lengths in (UO2)2+ and O–H⋯O hydrogen bond lengths were inferred from the observed stretching vibrations. The presence of structurally nonequivalent U6+ and P5+ was inferred from the number of corresponding stretching bands of the (UO2)2+ and (PO4)3− units observed in the Raman and infrared spectra.
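A sketch of the kind of empirical estimate referred to, assuming the widely used Bartlett–Cooney relations between uranyl stretching wavenumbers and U–O bond length. The band positions in the example are placeholders, not values reported for phurcalite, and the constants should be checked against the original reference before use.

```python
# Assumed Bartlett–Cooney relations: approximate U–O bond lengths in (UO2)2+
# from the symmetric (v1) and antisymmetric (v3) stretching wavenumbers.
def u_o_length_from_v1(v1_cm):
    """Approximate U–O bond length (Å) from the symmetric stretch v1 (cm^-1)."""
    return 106.5 * v1_cm ** (-2.0 / 3.0) + 0.575

def u_o_length_from_v3(v3_cm):
    """Approximate U–O bond length (Å) from the antisymmetric stretch v3 (cm^-1)."""
    return 91.41 * v3_cm ** (-2.0 / 3.0) + 0.804

if __name__ == "__main__":
    for v1 in (800.0, 820.0, 840.0):          # hypothetical Raman band positions
        print(f"v1 = {v1:.0f} cm^-1  ->  R(U-O) ~ {u_o_length_from_v1(v1):.3f} Å")
    print(f"v3 = 910 cm^-1  ->  R(U-O) ~ {u_o_length_from_v3(910.0):.3f} Å")
```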
Abstract:
In vitro studies and mathematical models are now being widely used to study the underlying mechanisms driving the expansion of cell colonies. This can improve our understanding of cancer formation and progression. Although much progress has been made in developing and analysing mathematical models, far less progress has been made in understanding how to estimate model parameters using experimental in vitro image-based data. To address this issue, a new approximate Bayesian computation (ABC) algorithm is proposed to estimate key parameters governing the expansion of melanoma cell (MM127) colonies, including the cell diffusivity, D, the cell proliferation rate, λ, and the cell-to-cell adhesion, q, in two experimental scenarios, namely with and without a chemical treatment to suppress cell proliferation. Even when little prior biological knowledge about the parameters is assumed, all parameters are precisely inferred, with a small posterior coefficient of variation of approximately 2–12%. The ABC analyses reveal that the posterior distributions of D and q depend on the experimental elapsed time, whereas the posterior distribution of λ does not. The posterior mean values of D are in the ranges 226–268 µm² h⁻¹ and 311–351 µm² h⁻¹, and those of q are in the ranges 0.23–0.39 and 0.32–0.61, for the experimental periods of 0–24 h and 24–48 h, respectively. Furthermore, we found that the posterior distribution of q also depends on the initial cell density, whereas the posterior distributions of D and λ do not. The ABC approach also enables information from the two experiments to be combined, resulting in greater precision for all estimates of D and λ.
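To illustrate the general idea behind ABC (the paper proposes a more sophisticated algorithm than plain rejection), the sketch below runs rejection-ABC on a toy colony-growth surrogate. The simulator, priors, tolerance and parameter couplings are all invented for illustration and are not the paper's model or data.

```python
# Minimal rejection-ABC sketch on a toy logistic colony-area surrogate.
import numpy as np

rng = np.random.default_rng(3)

def simulate_colony_area(D, lam, t, A0=1.0, K=100.0):
    """Toy surrogate: logistic-like growth whose rate loosely mixes D and lam."""
    r = 0.05 * D + lam
    return K / (1.0 + (K / A0 - 1.0) * np.exp(-r * t))

def abc_rejection(observed, t, n_samples=20000, tol=3.0):
    accepted = []
    for _ in range(n_samples):
        D = rng.uniform(0.0, 10.0)       # prior on "diffusivity" (toy units)
        lam = rng.uniform(0.0, 0.2)      # prior on proliferation rate
        simulated = simulate_colony_area(D, lam, t)
        if np.linalg.norm(simulated - observed) < tol:   # distance on summaries
            accepted.append((D, lam))
    return np.array(accepted)

if __name__ == "__main__":
    t = np.array([0.0, 12.0, 24.0, 36.0, 48.0])
    observed = simulate_colony_area(4.0, 0.05, t) + rng.normal(0, 0.5, t.size)
    posterior = abc_rejection(observed, t)
    print(f"accepted draws: {len(posterior)}")
    print(f"posterior mean D ~ {posterior[:, 0].mean():.2f}, "
          f"lambda ~ {posterior[:, 1].mean():.3f}")
```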
Abstract:
Cold-formed steel members have been widely used in residential and commercial buildings as primary load-bearing structural elements. They are often made of thin steel sheets and hence are more susceptible to local buckling. The buckling behaviour of cold-formed steel compression members under fire conditions has not yet been fully investigated, and hence there is a lack of knowledge of the fire performance of cold-formed steel compression members. Current cold-formed steel design standards do not provide adequate guidelines for the fire design of cold-formed steel compression members. Therefore, a research project based on extensive experimental and numerical studies was undertaken to investigate the local buckling behaviour of light gauge cold-formed steel compression members under simulated fire conditions. First, a series of 91 local buckling tests was conducted at ambient and uniform elevated temperatures up to 700°C on cold-formed lipped and unlipped channels. Suitable finite element models were then developed to simulate the behaviour of the tested columns and were validated using the test results. All the ultimate load capacity results for local buckling were compared with the predictions from the available design rules based on AS/NZS 4600, BS 5950 Part 5, Eurocode 3 Parts 1.2 and 1.3 and the direct strength method (DSM), based on which suitable recommendations have been made for the fire design of cold-formed steel compression members subject to local buckling at uniform elevated temperatures.
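A hedged sketch of the DSM local-buckling check referred to above, using the standard DSM local-buckling expressions. The section loads and elevated-temperature reduction factors are assumed for illustration only and are not values from the test programme.

```python
# Standard DSM local-buckling capacity from the member capacity Pne and the
# elastic local buckling load Pcrl; all numerical inputs below are assumed.
def dsm_local_capacity(P_ne, P_crl):
    """Nominal local-buckling capacity Pnl (standard DSM expressions)."""
    lam_l = (P_ne / P_crl) ** 0.5
    if lam_l <= 0.776:
        return P_ne
    ratio = (P_crl / P_ne) ** 0.4
    return (1.0 - 0.15 * ratio) * ratio * P_ne

if __name__ == "__main__":
    # Hypothetical lipped channel stub column at 500 °C (all values assumed).
    P_y_20, P_crl_20 = 180e3, 120e3        # N: ambient squash and local buckling loads
    k_y, k_E = 0.6, 0.5                    # assumed strength/stiffness reductions
    P_ne = k_y * P_y_20                    # short column: take Pne ~ reduced Py
    P_crl = k_E * P_crl_20
    print(f"Pnl at 500 °C ~ {dsm_local_capacity(P_ne, P_crl)/1e3:.1f} kN")
```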