256 results for Computational tools


Relevance: 20.00%

Publisher:

Abstract:

Systems-level identification and analysis of cellular circuits in the brain will require the development of whole-brain imaging with single-cell resolution. To this end, we performed comprehensive chemical screening to develop a whole-brain clearing and imaging method, termed CUBIC (clear, unobstructed brain imaging cocktails and computational analysis). CUBIC is a simple and efficient method involving the immersion of brain samples in chemical mixtures containing aminoalcohols, which enables rapid whole-brain imaging with single-photon excitation microscopy. CUBIC is applicable to multicolor imaging of fluorescent proteins or immunostained samples in adult brains and is scalable from a primate brain to subcellular structures. We also developed a whole-brain cell-nuclear counterstaining protocol and a computational image analysis pipeline that, together with CUBIC reagents, enable the visualization and quantification of neural activities induced by environmental stimulation. CUBIC enables time-course expression profiling of whole adult brains with single-cell resolution.
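The nuclear-counterstaining-plus-quantification step can be illustrated generically (this is not the authors' pipeline): threshold a nuclear-stain image and count connected components. A minimal sketch with NumPy/SciPy on a synthetic image:

```python
import numpy as np
from scipy import ndimage

def count_nuclei(image, threshold):
    """Count stained nuclei as connected components above an intensity threshold."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    return n

# synthetic 2-D slice with two bright "nuclei"
img = np.zeros((20, 20))
img[2:5, 2:5] = 1.0      # first nucleus
img[10:14, 10:14] = 1.0  # second, well-separated nucleus
n_found = count_nuclei(img, 0.5)
```

In practice a real pipeline would add filtering and watershed splitting of touching nuclei; this sketch only shows the core label-and-count idea.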

Relevance: 20.00%

Publisher:

Abstract:

Epigenetic changes are heritable modifications of chromosome structure that do not alter the DNA sequence but do affect gene expression. These mechanisms play an important role in normal cell differentiation, but aberrant changes are also associated with several diseases, including cancer and neural disorders. Consequently, despite intensive study in recent years, the contribution of epigenetic modifications remains largely unquantified, owing to overall system complexity and insufficient data. Computational models can provide powerful auxiliary tools to experimentation, not least because they can span scales from the sub-cellular level through cell populations (or networks of genes). In this paper, the challenges in developing realistic cross-scale models are discussed and illustrated with respect to current work.

Relevance: 20.00%

Publisher:

Abstract:

Purpose – In structural, earthquake and aeronautical engineering and in mechanical vibration, solving the dynamic equations of a structure subjected to dynamic loading leads to a high-order system of differential equations. Numerical methods are typically used for integration when dealing with discrete data or when no analytical solution of the equations exists. Since numerical methods with greater accuracy and stability yield more accurate structural responses, there is a need to improve existing methods or develop new ones. The paper aims to discuss these issues. Design/methodology/approach – In this paper, a new time integration method is proposed mathematically and numerically, and is applied to single-degree-of-freedom (SDOF) and multi-degree-of-freedom (MDOF) systems. Finally, the results are compared to those of existing methods such as Newmark’s method and the closed-form solution. Findings – It is concluded that, in the proposed method, the variance of each set of structural responses (displacement, velocity or acceleration) across time steps is lower than in Newmark’s method; the proposed method is more accurate and stable than Newmark’s method, and it can analyse a structure in fewer iterations or computation cycles, making it less time-consuming. Originality/value – A new mathematical and numerical time integration method is proposed for computing structural responses with higher accuracy and stability, lower data variance, and fewer computational cycles.
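Since the paper benchmarks against Newmark's method, a minimal sketch of the classical incremental Newmark-beta scheme for an SDOF system may help orient readers. The parameters below (average-acceleration rule, beta = 1/4, gamma = 1/2, and the demo oscillator) are illustrative choices, not values from the paper:

```python
import numpy as np

def newmark_sdof(m, c, k, p, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Incremental Newmark-beta integration of m*u'' + c*u' + k*u = p(t)."""
    n = len(p)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (p[0] - c * v0 - k * u0) / m          # initial acceleration from equilibrium
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    for i in range(n - 1):
        # effective incremental load
        dp = ((p[i + 1] - p[i])
              + (m / (beta * dt) + gamma * c / beta) * v[i]
              + (m / (2 * beta) + dt * c * (gamma / (2 * beta) - 1)) * a[i])
        du = dp / keff
        dv = (gamma * du / (beta * dt) - gamma * v[i] / beta
              + dt * (1 - gamma / (2 * beta)) * a[i])
        da = du / (beta * dt**2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u, v, a

# demo: free vibration of an undamped oscillator (omega = 2 rad/s, T = pi s)
dt, n = 0.01, 315
u, v, a = newmark_sdof(m=1.0, c=0.0, k=4.0, p=np.zeros(n), dt=dt, u0=1.0)
```

With these parameters the scheme is unconditionally stable and the computed displacement tracks the exact solution cos(2t) closely over one period.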

Relevance: 20.00%

Publisher:

Abstract:

This chapter discusses the methodological aspects and empirical findings of a large-scale, funded project investigating public communication through social media in Australia. The project concentrates on Twitter, but we approach it as representative of broader current trends toward the integration of large datasets and computational methods into media and communication studies in general, and social media scholarship in particular. The research discussed in this chapter aims to empirically describe networks of affiliation and interest in the Australian Twittersphere, while reflecting on the methodological implications and imperatives of ‘big data’ in the humanities. Using custom network crawling technology, we have conducted a snowball crawl of Twitter accounts operated by Australian users to identify more than one million users and their follower/followee relationships, and have mapped their interconnections. In itself, the map provides an overview of the major clusters of densely interlinked users, largely centred on shared topics of interest (from politics through arts to sport) and/or sociodemographic factors (geographic origins, age groups). Our map of the Twittersphere is the first of its kind for the Australian part of the global Twitter network, and also provides a first independent and scholarly estimation of the size of the total Australian Twitter population. In combination with our investigation of participation patterns in specific thematic hashtags, the map also enables us to examine which areas of the underlying follower/followee network are activated in the discussion of specific current topics – allowing new insights into the extent to which particular topics and issues are of interest to specialised niches or to the Australian public more broadly. 
Specifically, we examine the Twittersphere footprint of dedicated political discussion, under the #auspol hashtag, and compare it with the heightened, broader interest in Australian politics during election campaigns, using #ausvotes; we explore the different patterns of Twitter activity across the map for major television events (the popular competitive cooking show #masterchef, the British #royalwedding, and the annual #stateoforigin Rugby League sporting contest); and we investigate the circulation of links to the articles published by a number of major Australian news organisations across the network. Such analysis, which combines the ‘big data’-informed map and a close reading of individual communicative phenomena, makes it possible to trace the dynamic formation and dissolution of issue publics against the backdrop of longer-term network connections, and the circulation of information across these follower/followee links. Such research sheds light on the communicative dynamics of Twitter as a space for mediated social interaction. Our work demonstrates the possibilities inherent in the current ‘computational turn’ (Berry, 2010) in the digital humanities, as well as adding to the development and critical examination of methodologies for dealing with ‘big data’ (boyd and Crawford, 2011). Our tools and methods for doing Twitter research, released under Creative Commons licences through our project website, provide the basis for replicable and verifiable digital humanities research on the processes of public communication which take place through this important new social network.
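The snowball-crawl strategy described above can be sketched as a breadth-first traversal of follower/followee links. Here `get_followees` is a stand-in for the (rate-limited, authenticated) Twitter API call, replaced by a toy in-memory graph; the crawler itself is a generic sketch, not the project's released tooling:

```python
from collections import deque

def snowball_crawl(seeds, get_followees, max_users=1000):
    """Breadth-first snowball crawl: starting from seed accounts, follow
    outgoing followee links, recording edges until max_users are seen."""
    seen = set(seeds)
    edges = []
    queue = deque(seeds)
    while queue and len(seen) < max_users:
        user = queue.popleft()
        for other in get_followees(user):
            edges.append((user, other))
            if other not in seen:
                seen.add(other)
                queue.append(other)
    return seen, edges

# toy follower graph standing in for API responses
toy = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["a"]}
users, edges = snowball_crawl(["a"], lambda u: toy.get(u, []))
```

Note that a crawl along outgoing links only discovers accounts reachable from the seeds ("d" above is never found), which is one reason seed selection matters for population estimates.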

Relevance: 20.00%

Publisher:

Abstract:

Morphological and physiological characteristics of neurons located in the dorsolateral and two ventral subdivisions of the lateral amygdala (LA) have been compared in order to differentiate their roles in the formation and storage of fear memories (Alphs et al., SfN abstract 623.1, 2003). Briefly, in these populations, significant differences are observed in input resistance, membrane time constant, firing frequency, dendritic tortuosity, numbers of primary dendrites, dendritic segments and dendritic nodes...

Relevance: 20.00%

Publisher:

Abstract:

This thesis explored the use of e-learning tools within Learning Management Systems in higher education and developed a distinct framework to explain the factors influencing users' engagement with these tools. The study revealed that Learning Management System design, preferences for other tools, availability of time, lack of adequate knowledge about the tools, pedagogical practices, and social influences all affect the uptake of Learning Management System tools. Semi-structured interviews with 74 students and lecturers at a major Australian university were used as the source of data, and applied thematic analysis was used to analyse the collected data.

Relevance: 20.00%

Publisher:

Abstract:

Environmental sensors collect massive amounts of audio data. This thesis investigates computational methods to support human analysts in identifying faunal vocalisations from that audio. A series of experiments was conducted to trial the effectiveness of novel user interfaces. This research examines the rapid scanning of spectrograms, decision support tools for users, and cleaning methods for folksonomies. Together, these investigations demonstrate that providing computational support to human analysts increases their efficiency and accuracy; this allows bioacoustics projects to efficiently utilise their valuable human analysts.
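Generating a spectrogram from field audio is the first step behind the rapid-scanning interfaces described above. A minimal sketch with SciPy on a synthetic recording (the 2 kHz tone standing in for a vocalisation, and all parameters illustrative):

```python
import numpy as np
from scipy import signal

# synthetic 1-second "recording": a 2 kHz tone over background noise
fs = 16000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
audio = np.sin(2 * np.pi * 2000 * t) + 0.1 * rng.standard_normal(fs)

# short-time Fourier analysis; nperseg sets the time/frequency trade-off
freqs, times, sxx = signal.spectrogram(audio, fs=fs, nperseg=512)

# dominant frequency band, averaged over time
peak_band = freqs[sxx.mean(axis=1).argmax()]
```

An analyst-facing tool would render `sxx` (usually log-scaled) as tiles for visual scanning; automated cues such as the dominant band can then help direct the analyst's attention.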

Relevance: 20.00%

Publisher:

Abstract:

This thesis introduces a new way of using prior information in a spatial model and develops scalable algorithms for fitting this model to large imaging datasets. These methods are employed for image-guided radiation therapy and satellite-based classification of land use and water quality. This study utilised a pre-computation step to achieve a hundredfold improvement in the elapsed runtime for model fitting. This makes it much more feasible to apply these models to real-world problems, and enables full Bayesian inference for images with a million or more pixels.
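The abstract does not detail the thesis's specific pre-computation step; a common instance of the idea in spatial models is caching the eigenvalues of the neighbourhood matrix once, so the log-determinant in the likelihood costs O(n) per sampler iteration instead of an O(n^3) determinant. A sketch under that assumption, on a toy 4-pixel lattice:

```python
import numpy as np

# toy adjacency matrix of a 4-pixel cycle (binary neighbourhood, an assumption)
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# pre-computation step: eigenvalues of the symmetric W, computed once
lam = np.linalg.eigvalsh(W)

def logdet_precision(rho):
    """log det(I - rho*W) via cached eigenvalues: O(n) per call,
    evaluated repeatedly inside an MCMC sampler."""
    return np.sum(np.log(1.0 - rho * lam))
```

For a million-pixel image the per-iteration saving from such caching dominates the one-off cost of the eigendecomposition (or, for sparse lattices, of an equivalent analytic expression).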

Relevance: 20.00%

Publisher:

Abstract:

Introduction & Aims: Optimising fracture treatments requires a sound understanding of the relationships between stability, callus development and healing outcomes. This has been the goal of computational modelling, but discrepancies remain between simulations and experimental results. We compared healing patterns as a function of fixation stiffness between a novel computational callus growth model and corresponding experimental data.

Hypothesis: We hypothesised that callus growth is stimulated by diffusible signals, whose production is in turn regulated by mechanical conditions at the fracture site. We proposed that introducing this scheme into computational models would better replicate the observed tissue patterns and the inverse relationship between callus size and fixation stiffness.

Method: Finite element models of bone healing under stiff and flexible fixation were constructed, based on the parameters of a parallel rat femoral osteotomy study. An iterative procedure was implemented to simulate the development of callus and its mechanical regulation, with tissue changes regulated according to published mechano-biological criteria. Predictions of healing patterns were compared between standard models, with a pre-defined domain for callus development, and a novel approach in which periosteal callus growth is driven by a diffusible signal whose production is governed by local mechanical conditions. Finally, each model’s predictions were compared to the corresponding histological data.

Results: Models in which healing progressed within a prescribed callus domain predicted that greater interfragmentary movements would displace early periosteal bone formation further from the fracture. This results from artificially large distortional strains predicted near the fracture edge. While experiments showed increased hard callus size under flexible fixation, this was not reflected in the standard models. Allowing the callus to grow from a thin soft tissue layer, in response to a mechanically stimulated diffusible signal, produced a callus shape and tissue distribution closer to those observed histologically. Importantly, the callus volume increased with increasing interfragmentary movement.

Conclusions: A novel method of incorporating callus growth into computational models of fracture healing allowed us to capture the relationship between callus size and fixation stability observed in our rat experiments. This approach expands our toolkit for understanding the influence of different fixation strategies on healing outcomes.
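The growth mechanism described (mechanically regulated production of a diffusible signal that drives tissue formation) can be caricatured in one dimension. This is an illustrative toy, not the authors' finite element model, and every parameter below is an assumption:

```python
import numpy as np

n, dt, D = 50, 0.1, 1.0
x = np.arange(n)
strain = np.exp(-((x - 25) ** 2) / 20.0)   # mechanical strain peaks at the "fracture"
signal = np.zeros(n)                       # diffusible signal concentration
callus = np.zeros(n)                       # accumulated callus tissue

for _ in range(200):
    production = 0.5 * strain                              # mechanically regulated source
    lap = np.roll(signal, 1) - 2 * signal + np.roll(signal, -1)
    signal += dt * (D * lap + production - 0.1 * signal)   # diffuse and decay
    callus += dt * 0.2 * (signal > 0.5)                    # grow only where signal is high
```

Even this caricature reproduces the qualitative behaviour in the abstract: tissue accumulates around the high-strain fracture site, and raising the strain amplitude (larger interfragmentary movement) widens the region where the signal exceeds threshold, enlarging the callus.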

Relevance: 20.00%

Publisher:

Abstract:

The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs.

Relevance: 20.00%

Publisher:

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. 
Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biological future-predictor Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences. Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. 
In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 20.00%

Publisher:

Abstract:

There is a major effort in medical imaging to develop algorithms to extract information from DTI and HARDI, which provide detailed information on brain integrity and connectivity. As the images have recently advanced to provide extraordinarily high angular resolution and spatial detail, including an entire manifold of information at each point in the 3D images, there has been no readily available means to view the results. This impedes developments in HARDI research, which need some method to check the plausibility and validity of image processing operations on HARDI data or to appreciate data features or invariants that might serve as a basis for new directions in image segmentation, registration, and statistics. We present a set of tools to provide interactive display of HARDI data, including both a local rendering application and an off-screen renderer that works with a web-based viewer. Visualizations are presented after registration and averaging of HARDI data from 90 human subjects, revealing important details for which there would be no direct way to appreciate using conventional display of scalar images.

Relevance: 20.00%

Publisher:

Abstract:

The Australian housing sector contributes about a fifth of national greenhouse gas (GHG) emissions. GHG emissions contribute to climate change, which leads to an increase in the occurrence or intensity of natural disasters and damage to houses. To ensure housing performance in the face of climate change, various rating tools for residential property have been introduced in different countries. The aim of this paper is to present a preliminary comparison between international and Australian rating tools in terms of purpose, use and sustainability elements for residential property. The methodology is to review, classify and compare rating tools, identifying similarities and differences between them. Two international tools, the Building Research Establishment Environmental Assessment Methodology (BREEAM) (UK) and Leadership in Energy and Environmental Design for Homes (LEED-Homes) (USA), are compared to two Australian tools, Green Star – Multi Unit Residential v1 and EnviroDevelopment. All four rating tools include management, energy, water and material aspects. The findings reveal thirteen elements that fall under three categories: spatial planning, occupants’ health and comfort, and environmental conditions. The variations between tools may result from differences in local prevailing climate. Not all sustainability elements covered by the international rating tools are included in the Australian ones. The voluntary nature of the tools implies that they are not broadly applied in their respective markets and that there is a policy implementation gap. A comprehensive rating tool could be developed in Australia to promote sustainable housing and lessen the confusion around it, which in turn would assist in improving the supply of, and demand for, sustainable housing.

Relevance: 20.00%

Publisher:

Abstract:

Graphite-like layered materials exhibit intriguing electronic structures, so the search for new types of two-dimensional (2D) monolayer materials is of great interest for developing novel nano-devices. Using the density functional theory (DFT) method, we investigate here, for the first time, the structure, stability, and electronic and optical properties of monolayer lead iodide (PbI2). The stability of the PbI2 monolayer is first confirmed by a phonon dispersion calculation. Compared to calculations using the generalized gradient approximation, including a screened hybrid functional and spin–orbit coupling effects predicts not only an accurate bandgap (2.63 eV) but also the correct positions of the valence and conduction band edges. Biaxial strain can tune the bandgap over a wide range, from 1 eV to 3 eV, which can be understood as a strain-induced uniform change of the electric field between the Pb and I atomic layers. The calculated imaginary part of the dielectric function of a 2D graphene/PbI2 van der Waals heterostructure shows a significant red shift of the absorption edge compared to that of a pure PbI2 monolayer. Our findings highlight an interesting new 2D material with potential applications in nanoelectronics and optoelectronics.