Abstract:
This dissertation seeks to define and classify potential forms of nonlinear structure and explore the possibilities they afford for the creation of new musical works. It provides the first comprehensive framework for the discussion of nonlinear structure in musical works and a detailed overview of the rise of nonlinearity in music during the 20th century. Nonlinear events are shown to emerge through significant parametrical discontinuity at the boundaries between regions of relatively strong internal cohesion. The dissertation situates nonlinear structures in relation to linear structures and unstructured sonic phenomena, and provides a means of evaluating nonlinearity in a musical structure by considering the degree to which the structure is integrated, contingent, compressible and determinate as a whole. It is proposed that nonlinearity can be classified within a three-dimensional space described by three continua: the temporal continuum, encompassing sequential and multilinear forms of organization; the narrative continuum, encompassing processual, game-structure and developmental narrative forms; and the referential continuum, encompassing stylistic allusion, adaptation and quotation. The use of spectrograms of recorded musical works is proposed as a means of evaluating nonlinearity in a musical work through the visual representation of parametrical divergence in pitch, duration, timbre and dynamics over time. Spectral and structural analysis of repertoire works is undertaken as part of an exploration of musical nonlinearity and the compositional and performative features that characterize it. The contribution of cultural, ideological, scientific and technological shifts to the emergence of nonlinearity in music is discussed, and a range of compositional factors that contributed to its emergence is examined.
The evolution of notational innovations from the mobile score to the screen score is plotted, and a novel framework for the discussion of these forms of musical transmission is proposed. A computer-coordinated performative model is discussed, in which a computer synchronises the screening of notational information, provides temporal coordination of the performers through click-tracks or similar methods, and synchronises the audio processing and synthesized elements of the work. It is proposed that such a model constitutes a highly effective means of realizing complex nonlinear structures. A creative folio comprising 29 original works that explore nonlinearity is presented, discussed and categorised utilising the proposed classifications. Spectrograms of these works are employed where appropriate to illustrate the instantiation of parametrically divergent substructures and examples of structural openness through multiple versioning.
Abstract:
Parallel interleaved converters are finding more applications every day; for example, they are frequently used as voltage regulator modules (VRMs) on PC motherboards, mainly to obtain better transient response. Parallel interleaved converters can have their inductances uncoupled, directly coupled or inversely coupled, each of which suits different applications with associated advantages and disadvantages. Coupled systems offer more control over converter features such as ripple currents, inductance volume and transient response. To gain an intuitive understanding of which type of parallel interleaved converter, what amount of coupling, what number of levels and how much inductance should be used for a given application, a simple equivalent model is needed. As all phases of an interleaved converter are assumed to be identical, the equivalent model is nothing more than a separate inductance that is common to all phases. Without this simplification, the design of a coupled system is quite daunting. Designing a coupled system involves solving and understanding the RMS currents of the input, the individual phases (or cells) and the output. A procedure using this equivalent model and a small amount of modulo arithmetic is detailed.
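The ripple-cancellation benefit of interleaving that motivates this analysis can be illustrated numerically. The following sketch (illustrative only: the triangular waveform shape, duty cycle and phase count are assumptions, not taken from the abstract) sums phase-shifted ripple currents and compares the RMS of the combined ripple against a single phase:

```python
# Sketch: ripple cancellation in an N-phase interleaved converter.
# Assumptions (not from the abstract): identical phases carrying a
# unit-amplitude triangular ripple with duty cycle D, phase-shifted
# by 1/N of the switching period.
import numpy as np

def phase_ripple(t, D):
    """Zero-mean triangular ripple for one phase (period 1, peak +/-0.5)."""
    t = t % 1.0
    return np.where(t < D, t / D, (1.0 - t) / (1.0 - D)) - 0.5

def output_ripple_rms(n_phases, D, samples=10000):
    """RMS of the summed ripple when n_phases are interleaved."""
    t = np.linspace(0.0, 1.0, samples, endpoint=False)
    total = sum(phase_ripple(t + k / n_phases, D) for k in range(n_phases))
    return float(np.sqrt(np.mean(total**2)))

rms1 = output_ripple_rms(1, 0.3)  # single phase
rms4 = output_ripple_rms(4, 0.3)  # four interleaved phases
```

In this toy model the summed ripple cancels exactly when the duty cycle is a multiple of 1/N (e.g. D = 0.25 with four phases) and is substantially attenuated elsewhere, which is the kind of behaviour the equivalent-inductance model is meant to make intuitive.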
Abstract:
This thesis concerns the mathematical model of moving fluid interfaces in a Hele-Shaw cell: an experimental device in which fluid flow is studied by sandwiching the fluid between two closely separated plates. Analytic and numerical methods are developed to gain new insights into interfacial stability and bubble evolution, and the influence of different boundary effects is examined. In particular, the properties of the velocity-dependent kinetic undercooling boundary condition are analysed, with regard to the selection of only discrete possible shapes of travelling fingers of fluid, the formation of corners on the interface, and the interaction of kinetic undercooling with the better known effect of surface tension. Explicit solutions to the problem of an expanding or contracting ring of fluid are also developed.
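For readers unfamiliar with the boundary condition named above, a common statement of the Hele-Shaw problem with kinetic undercooling, as it appears in the broader literature (the exact nondimensionalisation used in the thesis may differ), is:

```latex
\begin{align*}
  \nabla^2 p &= 0 \quad \text{in the viscous fluid (Darcy flow: } \mathbf{u} = -\nabla p\text{)},\\
  p &= \gamma\,\kappa + c\,v_n \quad \text{on the moving interface},\\
  v_n &= -\frac{\partial p}{\partial n} \quad \text{(normal velocity of the interface)},
\end{align*}
```

where $\gamma$ is the surface tension coefficient, $\kappa$ the interface curvature, and $c$ the kinetic undercooling coefficient; setting $\gamma = 0$ isolates the velocity-dependent kinetic undercooling effect whose interplay with surface tension the thesis examines.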
Abstract:
This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing literature-based discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. It has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of the indirect associations underpinning literature-based discovery has been demonstrated extensively in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks including predicting future disruptive innovations. In this paper we perform a computational complexity analysis of four successful corpus-based distributional models to evaluate their fitness for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
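To make the fixed-dimension idea concrete, here is a toy random-indexing-style encoder (a sketch under our own assumptions; it is not one of the four models analysed in the paper) whose storage per term is a constant-size vector regardless of vocabulary or corpus size:

```python
# Toy random-indexing-style encoder: every term gets a fixed-size
# vector, so memory does not grow with the square of the vocabulary.
# DIM, nnz and the hashing scheme are illustrative assumptions.
import zlib
import numpy as np

DIM = 256  # fixed dimensionality, independent of corpus size

def index_vector(term, dim=DIM, nnz=8):
    """Sparse ternary 'fingerprint' for a term, seeded from its CRC32."""
    rng = np.random.default_rng(zlib.crc32(term.encode("utf-8")))
    vec = np.zeros(dim)
    pos = rng.choice(dim, size=nnz, replace=False)
    vec[pos] = rng.choice([-1.0, 1.0], size=nnz)
    return vec

def context_vectors(documents):
    """Accumulate, per term, the index vectors of co-occurring terms.
    Storage is O(vocabulary x DIM), never O(vocabulary^2)."""
    ctx = {}
    for doc in documents:
        terms = doc.lower().split()
        doc_vec = sum(index_vector(t) for t in terms)
        for t in terms:
            ctx.setdefault(t, np.zeros(DIM))
            ctx[t] += doc_vec - index_vector(t)  # exclude the term itself
    return ctx

def similarity(ctx, a, b):
    """Cosine similarity between two context vectors."""
    va, vb = ctx[a], ctx[b]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-12))

docs = ["fish oil reduces blood viscosity",
        "raynaud disease involves blood viscosity",
        "fish oil studied for raynaud disease"]
ctx = context_vectors(docs)
```

Because every term maps to a DIM-length vector, memory grows as O(vocabulary × DIM) rather than O(vocabulary²) as it would for a full co-occurrence matrix; this is the property on which the efficiency argument for fixed-dimension models turns.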
Abstract:
Electricity is the cornerstone of modern life. It is essential to economic stability and growth, jobs and improved living standards. Electricity is also the fundamental ingredient of a dignified life; it is the source of such basic human requirements as cooked food, a comfortable living temperature and essential health care. For these reasons, it is unimaginable that today's economies could function without electricity and the modern energy services that it delivers. Somewhat ironically, however, the current approach to electricity generation also contributes to two of the gravest and most persistent problems threatening the livelihood of humans: anthropogenic climate change and sustained human poverty. To address these challenges, the global electricity sector must reduce its reliance on fossil fuel sources. In this context, the object of this research is twofold. The first is to consider the design of the Renewable Energy (Electricity) Act 2000 (Cth) (Renewable Electricity Act), which represents Australia's primary regulatory approach to increasing the production of renewable sourced electricity. This analysis is conducted by reference to the regulatory models that exist in Germany and Great Britain. The second is to evaluate whether the Renewable Electricity Act is designed effectively to contribute to a more sustainable and dignified electricity generation sector in Australia. On the basis of this appraisal, the thesis contends that while certain aspects of the regulatory regime have merit, ultimately its design does not represent an effective and coherent regulatory approach to increasing the production of renewable sourced electricity. In this regard, the thesis proposes a number of recommendations to reform the existing regime. These recommendations are not intended to provide instantaneous or simple solutions to the current regulatory regime.
Instead, the purpose of these recommendations is to establish the legal foundations for an effective regulatory regime designed to increase the production of renewable sourced electricity in Australia, in order to contribute to a more sustainable and dignified approach to electricity production.
Abstract:
The Safe System approach to road safety takes a holistic view of the interactions among vehicles, roads and road users. Yet the contribution of each of these factors to crashes is vastly different: the role of road users is widely acknowledged as an overwhelming contributor to road crashes. Substantial gains have been made through improvements to vehicles and roads over a number of years; however, improvements in road user behaviour have been (in some cases) less substantial. One road user behaviour that is relatively unregulated is driver sleepiness, which is part of the ‘fatal five’ risky road user behaviours. The effect of sleepiness is ubiquitous: sleepiness is a state that most, if not all, drivers on our roads have experienced and are habitually exposed to. The quality and quantity of daily sleep is integral to our level of neurobehavioural performance during wakefulness and as such can have a compounding effect on a number of other risky driving behaviours. This paper discusses the potential influence of sleepiness as an interceding factor in a number of risky driving behaviours. Little effort has been given to increasing awareness of the deleterious and wide-ranging effects that sleepiness has on road safety. Given this wide-ranging influence, improvement of ‘sleep health’ as a protective factor at the community or individual level could lead to significant reductions in road trauma and increases in general well-being. A discussion of potential actions to reduce sleepiness is required if reductions in road trauma are to continue.
Abstract:
Monte Carlo simulations were used to investigate the relationship between the morphological characteristics and the diffusion tensor (DT) of partially aligned networks of cylindrical fibres. The orientation distributions of the fibres in each network were approximately uniform within a cone of a given semi-angle (θ0). This semi-angle was used to control the degree of alignment of the fibres. The networks studied ranged from perfectly aligned (θ0 = 0) to completely disordered (θ0 = 90°). Our results are qualitatively consistent with previous numerical models in the overall behaviour of the DT. However, we report a non-linear relationship between the fractional anisotropy (FA) of the DT and the collagen volume fraction, which differs from the findings of previous work. We discuss our results in the context of diffusion tensor imaging of articular cartilage. We also demonstrate how appropriate diffusion models have the potential to enable quantitative interpretation of the experimentally measured diffusion-tensor FA in terms of collagen fibre alignment distributions.
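The cone-constrained orientation distribution and the FA readout can be sketched as follows (a simplified illustration: the tensor here is a bare orientation average, not the Monte Carlo diffusion simulation of the study):

```python
# Sketch: fibre orientations uniform (by solid angle) within a cone of
# semi-angle theta0, and the FA of a toy orientation-averaged tensor.
# The tensor construction is a simplification for illustration only.
import numpy as np

def sample_cone(theta0_deg, n, rng):
    """Unit vectors uniformly distributed within a cone about +z."""
    theta0 = np.radians(theta0_deg)
    # uniform in cos(theta) over [cos(theta0), 1] => uniform solid angle
    cos_t = rng.uniform(np.cos(theta0), 1.0, n)
    sin_t = np.sqrt(1.0 - cos_t**2)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    return np.column_stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])

def fa(tensor):
    """Fractional anisotropy computed from the tensor eigenvalues."""
    lam = np.linalg.eigvalsh(tensor)
    return np.sqrt(1.5 * np.sum((lam - lam.mean()) ** 2) / np.sum(lam**2))

def toy_dt(theta0_deg, n=20000, seed=1):
    """Orientation average of v v^T as a crude diffusion-tensor surrogate."""
    rng = np.random.default_rng(seed)
    v = sample_cone(theta0_deg, n, rng)
    return v.T @ v / n

fa_aligned = fa(toy_dt(1e-6))    # near-perfect alignment: FA close to 1
fa_disordered = fa(toy_dt(90.0)) # completely disordered: FA close to 0
```

With θ0 → 0 the surrogate tensor approaches perfect anisotropy (FA → 1), while θ0 = 90° recovers an isotropic tensor (FA → 0), bracketing the range of alignment the study examines.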
Abstract:
In vivo, small molecules serve as necessary intermediates in numerous critical metabolic pathways and biological processes associated with many essential biological functions and events. There is growing evidence that MS-based metabolomics is emerging as a powerful tool to facilitate the discovery of functional small molecules that can better our understanding of development, infection, nutrition, disease, toxicity, drug therapeutics, gene modifications and host-pathogen interactions. However, further progress must still be made in MS-based metabolomics because of shortcomings in current technologies and knowledge. This technique-driven review aims to explore the discovery of in vivo functional small molecules facilitated by MS-based metabolomics and to highlight the analytical capabilities and promising applications of this discovery strategy. Moreover, the biological significance of discovering in vivo functional small molecules in different biological contexts is also interrogated from a metabolic perspective.
Abstract:
Introduction. The purpose of this chapter is to address the question raised in the chapter title: specifically, how can models of motor control help us understand low back pain (LBP)? Several classes of models have been used in the past for studying spinal loading, stability, and risk of injury (see Reeves and Cholewicki (2003) for a review of past modeling approaches), but for the purpose of this chapter we will focus primarily on models used to assess motor control and its effect on spine behavior. This chapter consists of four sections. The first section discusses why a shift in modeling approaches is needed to study motor control issues. We will argue that the current approach for studying the spine system is limited and not well suited to assessing motor control issues related to spine function and dysfunction. The second section will explore how models can be used to gain insight into how the central nervous system (CNS) controls the spine. This segues nicely into the third section, which will address how models of motor control can be used in the diagnosis and treatment of LBP. Finally, the last section will deal with the issue of model verification and validity. This issue is important since modeling accuracy is critical for obtaining useful insight into the behavior of the system being studied. This chapter is not intended to be a critical review of the literature, but instead to capture some of the discussion raised during the 2009 Spinal Control Symposium, with some elaboration on certain issues. Readers interested in more details are referred to the cited publications.
Abstract:
This project’s aim was to create new experimental models in small animals for the investigation of infections related to bone fracture fixation implants. Animal models are essential in orthopaedic trauma research and this study evaluated new implants and surgical techniques designed to improve standardisation in these experiments, and ultimately to minimise the number of animals needed in future work. This study developed and assessed procedures using plates and inter-locked nails to stabilise fractures in rabbit thigh bones. Fracture healing was examined with mechanical testing and histology. The results of this work contribute to improvements in future small animal infection experiments.
Abstract:
Pavlovian fear conditioning is a robust technique for examining behavioral and cellular components of fear learning and memory. In fear conditioning, the subject learns to associate a previously neutral stimulus with an inherently noxious co-stimulus. The learned association is reflected in the subject's behavior upon subsequent re-exposure to the previously neutral stimulus or the training environment. Using fear conditioning, investigators can obtain a large amount of data that describe multiple aspects of learning and memory. In a single test, researchers can evaluate functional integrity in fear circuitry, which is both well characterized and highly conserved across species. Additionally, the availability of sensitive and reliable automated scoring software makes fear conditioning amenable to high-throughput experimentation in the rodent model; thus, this model of learning and memory is particularly useful for pharmacological and toxicological screening. Due to the conserved nature of fear circuitry across species, data from Pavlovian fear conditioning are highly translatable to human models. We describe equipment and techniques needed to perform and analyze conditioned fear data. We provide two examples of fear conditioning experiments, one in rats and one in mice, and the types of data that can be collected in a single experiment.
Abstract:
Pavlovian fear conditioning, also known as classical fear conditioning, is an important model in the study of the neurobiology of normal and pathological fear. Progress in the neurobiology of Pavlovian fear also enhances our understanding of disorders such as posttraumatic stress disorder (PTSD) and assists in developing effective treatment strategies. Here we describe how Pavlovian fear conditioning is a key tool for understanding both the neurobiology of fear and the mechanisms underlying variations in fear memory strength observed across different phenotypes. First, we discuss how Pavlovian fear models aspects of PTSD. Second, we describe the neural circuits of Pavlovian fear and the molecular mechanisms within these circuits that regulate fear memory. Finally, we show how fear memory strength is heritable, and describe genes that are specifically linked both to changes in Pavlovian fear behavior and to its underlying neural circuitry. These emerging data begin to define the essential genes, cells and circuits that contribute to normal and pathological fear.
Communication models of institutional online communities: the role of the ABC cultural intermediary
Abstract:
The co-creation of cultural artefacts has been democratised by the recent technological affordances of information and communication technologies. Web 2.0 technologies have enabled greater possibilities for citizen inclusion within the media conversations of their nations. For example, the Australian audience has more opportunities to collaboratively produce and tell their story to a broader audience via the public service media (PSM) facilitated platforms of the Australian Broadcasting Corporation (ABC). However, providing open collaborative production for the audience gives rise to a problem: how might the PSM manage the interests of all stakeholders and align those interests with its legislated Charter? This paper considers this problem through the ABC’s user-created content participatory platform, ABC Pool, and highlights the cultural intermediary as the role responsible for managing these tensions. The paper also suggests that cultural intermediation is a useful framework for other media organisations engaging in co-creative activities with their audiences.
Abstract:
Most mathematical models of collective cell spreading make the standard assumption that the cell diffusivity and cell proliferation rate are constants that do not vary across the cell population. Here we present a combined experimental and mathematical modeling study which aims to investigate how differences in the cell diffusivity and cell proliferation rate amongst a population of cells can impact the collective behavior of the population. We present data from a three-dimensional transwell migration assay which suggests that the cell diffusivity of some groups of cells within the population can be as much as three times higher than the cell diffusivity of other groups of cells within the population. Using this information, we explore the consequences of explicitly representing this variability in a mathematical model of a scratch assay where we treat the total population of cells as two, possibly distinct, subpopulations. Our results show that when we make the standard assumption that all cells within the population behave identically we observe the formation of moving fronts of cells where both subpopulations are well-mixed and indistinguishable. In contrast, when we consider the same system where the two subpopulations are distinct, we observe a very different outcome where the spreading population becomes spatially organized with the more motile subpopulation dominating at the leading edge while the less motile subpopulation is practically absent from the leading edge. These modeling predictions are consistent with previous experimental observations and suggest that standard mathematical approaches, where we treat the cell diffusivity and cell proliferation rate as constants, might not be appropriate.
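A minimal numerical sketch of the kind of two-subpopulation model described (a 1D explicit finite-difference scheme with logistic crowding; all parameter values are illustrative, not fitted to the experimental data) reproduces the qualitative sorting at the leading edge:

```python
# Sketch: 1D two-subpopulation scratch-assay model with distinct
# diffusivities D1 > D2 and a shared logistic crowding term.
# All parameter values are illustrative, not fitted to data.
import numpy as np

def laplacian(c, dx):
    """Second difference with zero-flux (reflecting) boundaries."""
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    lap[0] = 2 * (c[1] - c[0]) / dx**2
    lap[-1] = 2 * (c[-2] - c[-1]) / dx**2
    return lap

def simulate(D1=1.0, D2=1.0 / 3.0, r=0.5, L=40.0, nx=200,
             dt=0.01, steps=1500):
    dx = L / nx
    x = np.linspace(0.0, L, nx)
    # both subpopulations initially occupy only the left quarter
    u = np.where(x < L / 4, 0.25, 0.0)  # more motile subpopulation
    v = np.where(x < L / 4, 0.25, 0.0)  # less motile subpopulation
    for _ in range(steps):
        crowding = 1.0 - (u + v)        # shared carrying capacity
        u = u + dt * (D1 * laplacian(u, dx) + r * u * crowding)
        v = v + dt * (D2 * laplacian(v, dx) + r * v * crowding)
    return x, u, v

x, u, v = simulate()
# locate the leading edge: first point where the total density is low
edge = int(np.argmax((u + v) < 0.05))
```

Because the two subpopulations share the same crowding term but differ in diffusivity, the faster subpopulation u outruns v and dominates the invading front, mirroring the spatial organisation reported above.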