921 results for visualisation formalism
Abstract:
We propose a bridge between two important parallel programming paradigms: data parallelism and communicating sequential processes (CSP). Data-parallel pipelined architectures obtained with the Alpha language can be embedded in a control-intensive application expressed in the CSP-based Handel formalism. The interface is formally defined from the semantics of the Alpha and Handel languages. This work will ease the design of compute-intensive applications on FPGAs.
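As a purely conceptual aside (this is neither Alpha nor Handel syntax, nor the paper's formal interface), the basic idea of coupling a data-parallel pipeline stage to a control-intensive CSP process over blocking channels can be sketched in Python; all names here are hypothetical:

    # Conceptual sketch: a pipeline stage and a CSP-style controller that
    # communicate over channels (queue.Queue stands in for CSP channels).
    import threading
    import queue

    def pipeline_stage(cin, cout):
        """Data-parallel stage: applies the same operation to every datum."""
        while True:
            x = cin.get()              # channel receive (blocking)
            if x is None:              # sentinel terminates the stage
                cout.put(None)
                return
            cout.put(x * x)            # the per-datum computation

    def controller(cout, cin, data):
        """Control-intensive process: sequences the communication events."""
        for x in data:
            cout.put(x)                # channel send
            print("result:", cin.get())
        cout.put(None)                 # signal termination
        cin.get()

    a, b = queue.Queue(maxsize=1), queue.Queue(maxsize=1)
    t = threading.Thread(target=pipeline_stage, args=(a, b))
    t.start()
    controller(a, b, [1, 2, 3])
    t.join()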
Abstract:
A new Bayesian algorithm for retrieving surface rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect in the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of estimator and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
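For intuition only, a toy one-dimensional version of such a retrieval (a discretised prior multiplied by a Gaussian likelihood, yielding a full posterior over rain rate) might look like the sketch below. The saturating forward model mimics the weak radiometric constraint at high rain rates discussed above, and comparing the posterior mean with the posterior mode illustrates why the choice of estimator matters. All functions and parameter values are illustrative assumptions, not the operational algorithm:

    # Toy Bayesian rain-rate retrieval on a discretised grid (illustrative only).
    import numpy as np

    rain = np.linspace(0.0, 50.0, 501)            # candidate rain rates (mm/h)
    prior = np.exp(-rain / 5.0)                   # assumed exponential prior
    prior /= prior.sum()

    def forward_tb(r):
        """Hypothetical forward model: brightness temperature saturates at high r."""
        return 180.0 + 100.0 * (1.0 - np.exp(-r / 10.0))

    def posterior(tb_obs, sigma=3.0):
        """Bayes' theorem: posterior is likelihood times prior, normalised."""
        like = np.exp(-0.5 * ((tb_obs - forward_tb(rain)) / sigma) ** 2)
        post = like * prior
        return post / post.sum()

    post = posterior(tb_obs=265.0)
    mean_est = np.sum(rain * post)                # posterior-mean estimator
    mode_est = rain[np.argmax(post)]              # posterior-mode estimator
    print(f"mean {mean_est:.1f} mm/h, mode {mode_est:.1f} mm/h")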
Abstract:
As the building industry moves towards low-impact buildings, research attention is being drawn to the reduction of carbon dioxide emissions and waste. From design and construction through operation and demolition, a variety of building materials are used throughout the building lifecycle, involving significant energy consumption and waste generation. Building Information Modelling (BIM) is emerging as a tool that can support holistic design decision-making for reducing embodied carbon and waste production across the building lifecycle. This study aims to establish a framework for assessing embodied carbon and waste underpinned by BIM technology. On the basis of a review of current research, the framework comprises a module for embodied carbon computation, a module for waste estimation, a knowledge base of construction and demolition methods, a repository of building-component information, and an inventory of construction materials' energy and carbon. Through both static 3D model visualisation and dynamic modelling supported by the framework, embodied energy (carbon), waste and associated costs can be analysed within the boundaries of cradle-to-gate, construction, operation, and demolition. The proposed holistic modelling framework makes it possible to analyse embodied carbon and waste, including associated costs, from different building lifecycle perspectives. It brings together existing segmented embodied carbon and waste estimation into a unified model, so that interactions between the various parameters across the building lifecycle phases can be better understood. It can thus improve design decision support for optimal low-impact building development. The framework is expected to be developed further and tested on industrial projects in the near future.
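To make the core computation concrete, a minimal sketch of the cradle-to-gate part of such a framework (embodied carbon as quantity times material carbon factor, summed over a bill of quantities taken from a BIM model) could look like this; the material names and coefficients are invented for illustration:

    # Minimal cradle-to-gate embodied carbon sketch (hypothetical factors).
    # Inventory: kgCO2e per kg of material (illustrative values only).
    carbon_factors = {"concrete": 0.15, "steel": 1.45, "timber": 0.45}

    # Bill of quantities extracted from a BIM model: material -> mass in kg.
    bill_of_quantities = {"concrete": 120_000, "steel": 8_000, "timber": 3_500}

    def embodied_carbon(boq, factors):
        """Sum quantity times carbon factor over all materials (kgCO2e)."""
        return sum(mass * factors[mat] for mat, mass in boq.items())

    total = embodied_carbon(bill_of_quantities, carbon_factors)
    print(f"cradle-to-gate embodied carbon: {total / 1000:.1f} tCO2e")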
Abstract:
An AHRC-funded project titled: Picturing ideas? Visualising and Synthesising Ideas as art (2009-10). Outputs included: 4 exhibitions; 4 publications; 3 papers; 2 large-scale backlit digital prints; 1 commissioned print. (See Additional Information) ----ABSTRACT: Utilising the virtuality of digital imagery, this practice-led project explored the possibility of cross-articulation between text and image and the bridging or synthesising potential of the visual affect of ideas. A series of digital images was produced 'picturing' or 'visualising' philosophical ideas derived from the writings of the philosopher Gilles Deleuze, as remodellings of pre-existing philosophical ideas; these were developed through dialogues and consultation with specialists in the fields from which the ideas were drawn (philosophy, psychology, film) as well as with artists and theorists concerned with ideas of 'mental imagery' and visualisation. Final images were produced as a synthesis (or combination) of these visualisations and presented in the format of large-scale, backlit digital prints at a series of prestigious international exhibitions (see details above). Evaluation took the form of a four-page illustrated text in Frieze magazine (August 2009) and three papers delivered at the University of Ulster, Goldsmiths College of Art and Loughborough University. The project also included the publication of a catalogue essay (EAST 09) and an illustrated poem (in the Dark Monarch publication). A print version of the image was commissioned by Invisible Exports Gallery, New York, and subsequently exhibited in The Devos Art Museum, School of Art & Design at Northern Michigan University, and in a publication edited by Cedar Lewisohn for Tate Publishing. The project was funded by an AHRC practice-led grant (17K) and an Arts Council of England award (1.5K). The outputs, including high-profile, publicly accessible exhibitions, prestigious publications and conference papers, ensured the dissemination of the research to a wide range of audiences, including scholars/researchers across the arts and humanities engaged in practice-based and interdisciplinary theoretical work (in particular in the fields of contemporary art and art theory, and those working on the integration of art and theory/philosophy/psychology), but also the wider audience for contemporary art.
Abstract:
Virtual reality has the potential to improve visualisation of building design and construction, but its implementation in the industry has yet to reach maturity. Present-day translation of building data to virtual reality is often unidirectional and unsatisfactory. Three different approaches to the creation of models are identified and described in this paper. Consideration is given to the potential of both advances in computer-aided design and the emerging standards for data exchange to facilitate an integrated use of virtual reality. Commonalities and differences between computer-aided design and virtual reality packages are reviewed, and trials of current systems are described. The trials were conducted to explore the technical issues related to the integrated use of CAD and virtual environments within the house-building sector of the construction industry, and to investigate the practical use of the new technology.
Abstract:
Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
Abstract:
The use of light microscopy and DMACA staining strongly suggested that plant and animal cell nuclei act as sinks for flavanols [1, 2]. Detailed UV-vis spectroscopic titration experiments indicated that histone proteins are the likely binding sites in the nucleus [2]. Here we report the development of a multi-photon excitation microscopy technique combined with fluorescence lifetime measurements of flavanols. Using this technique, (+)-catechin, (-)-epicatechin and (-)-epigallocatechin gallate (EGCG) showed strikingly different excited-state lifetimes in solution. Interaction of histone proteins with flavanols was indicated by the appearance of a significant τ2 component of 1.7 to 4.0 ns. Tryptophan interference could be circumvented in the in vivo fluorescence lifetime imaging microscopy (FLIM) experiments with two-photon excitation at 630 nm. This enabled visualisation and semi-quantitative measurements that demonstrated unequivocally the absorption of (+)-catechin, (-)-epicatechin and EGCG by the nuclei of onion cells. 3D FLIM revealed for the first time that externally added EGCG penetrated the whole nucleus in onion cells. The relative proportions of EGCG in cytoplasm:nucleus:nucleoli were ca. 1:10:100. FLIM experiments may therefore facilitate probing the health effects of EGCG, which is the major constituent of green tea.
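As background for readers unfamiliar with lifetime analysis, extracting a τ2 component of the kind reported here is typically a bi-exponential decay fit. A generic sketch follows (synthetic data and hypothetical parameter values, not the authors' analysis pipeline):

    # Generic bi-exponential fluorescence-decay fit (synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def biexp(t, a1, tau1, a2, tau2):
        """Two-component decay: fast component (tau1) + slow, e.g. bound (tau2)."""
        return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

    t = np.linspace(0.0, 12.0, 400)                        # time in ns
    rng = np.random.default_rng(0)
    signal = biexp(t, 0.7, 0.4, 0.3, 2.5) + rng.normal(0, 0.005, t.size)

    popt, _ = curve_fit(biexp, t, signal, p0=[1, 0.5, 0.5, 3])
    a1, tau1, a2, tau2 = popt
    print(f"tau1 = {tau1:.2f} ns, tau2 = {tau2:.2f} ns")   # tau2: slow component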
Abstract:
2011 is the centenary year of the short paper (Wilson, 1911) first describing the cloud chamber, the device for visualising high-energy charged particles which earned the Scottish physicist Charles Thomas Rees (‘CTR’) Wilson the 1927 Nobel Prize for Physics. His many achievements in atmospheric science, some of which have current relevance, are briefly reviewed here. CTR Wilson’s lifetime of scientific research work was principally in atmospheric electricity at the Cavendish Laboratory, Cambridge; he was Reader in Electrical Meteorology from 1918 and Jacksonian Professor from 1925 to 1935. However, he is immortalised in physics for his invention of the cloud chamber, because of its great significance as an early visualisation tool for particles such as cosmic rays (Galison, 1997). Sir Lawrence Bragg summarised its importance:
Abstract:
Software representations of scenes, i.e. the modelling of objects in space, are used in many application domains. Current modelling and scene description standards focus on the visualisation dimensions, and are intrinsically limited by their dependence upon semantic interpretation and contextual application by humans. In this paper we propose the need for an open, extensible and semantically rich modelling language which facilitates a machine-readable semantic structure. We critically review existing standards and techniques, and highlight the need for a semantically focussed scene description language. On the basis of this need we propose a preliminary solution, based on hypergraph theory, and reflect on application domains.
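To indicate what a hypergraph-based scene structure might look like in code (a sketch under our own assumptions, not the language proposed in the paper), each hyperedge can relate any number of scene entities under a named semantic relation:

    # Minimal hypergraph scene structure: hyperedges relate arbitrarily many
    # entities under a named semantic relation (illustrative sketch only).
    from dataclasses import dataclass, field

    @dataclass
    class Hypergraph:
        nodes: set = field(default_factory=set)
        edges: list = field(default_factory=list)   # (relation, frozenset of nodes)

        def relate(self, relation: str, *entities: str):
            """Add one hyperedge covering all given entities."""
            self.nodes.update(entities)
            self.edges.append((relation, frozenset(entities)))

        def involving(self, entity: str):
            """All semantic relations an entity participates in."""
            return [(r, members) for r, members in self.edges if entity in members]

    scene = Hypergraph()
    scene.relate("supports", "table", "vase", "book")    # one edge, three nodes
    scene.relate("adjacent", "table", "chair")
    print(scene.involving("table"))

Unlike an ordinary graph, a single "supports" edge here covers three entities at once, which is the machine-readable expressiveness the hypergraph approach is meant to provide.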
Abstract:
Throughout pregnancy the cytotrophoblast, the stem cell of the placenta, gives rise to the differentiated forms of trophoblasts. The two main cell lineages are the syncytiotrophoblast and the invading extravillous trophoblast. A successful pregnancy requires extravillous trophoblasts to migrate and invade through the decidua and then remodel the maternal spiral arteries. Many invasive cells use specialised cellular structures called invadopodia or podosomes in order to degrade extracellular matrix. Although trophoblasts are highly invasive cells, the presence of invadopodia or podosomes has not previously been investigated in them. In this study these structures have been identified and characterised in extravillous trophoblasts. The role of specialised invasive structures in trophoblast degradation of the extracellular matrix was compared with that of the well-characterised podosomes and invadopodia of other invasive cells, and the trophoblast-specific structures were characterised using a sensitive matrix degradation assay that enabled visualisation of the structures and their dynamics. We show that trophoblasts form actin-rich protrusive structures which have the ability to degrade the extracellular matrix during invasion. The degradation ability and dynamics of these structures closely resemble those of podosomes, but they have unique characteristics that have not previously been described in other cell types. The composition of these structures does not conform to the classic podosome structure, with no distinct ring of plaque proteins such as paxillin or vinculin. In addition, trophoblast podosomes protrude more deeply into the extracellular matrix than established podosomes, resembling invadopodia in this regard. We also identify several signalling pathways, including Src kinase, MAPK kinase and PKC, along with MMP-2 and MMP-9, as key regulators of extracellular matrix degradation activity in trophoblasts, while podosome activity was regulated by the rigidity of the extracellular matrix.
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisation complexity was performed: it was demonstrated that the use of global optimisation is necessary for most classes, and this informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes: Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces, and effective optimisation over the reduced search bounds.
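By way of illustration, a generic use of Differential Evolution over box bounds (with a multimodal stand-in cost function, not the mission design tool described above) can be sketched as follows:

    # Differential Evolution on a toy multimodal cost function, standing in
    # for a trajectory cost over pruned search bounds (illustrative only).
    import numpy as np
    from scipy.optimize import differential_evolution

    def cost(x):
        """Hypothetical stand-in for a gravity-assist trajectory cost."""
        return np.sum(x**2) + 10.0 * np.sum(1.0 - np.cos(2.0 * np.pi * x))

    bounds = [(-5.0, 5.0)] * 4      # e.g. scaled launch date and flyby epochs
    result = differential_evolution(cost, bounds, seed=1, tol=1e-8)
    print(result.x, result.fun)

Pruning the search bounds before running such a global optimiser is exactly what makes the subsequent search tractable, which is the point of the six-orders-of-magnitude reduction reported above.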
Abstract:
We present an intercomparison and verification analysis of 20 GCMs (Global Circulation Models) included in the 4th IPCC assessment report regarding their representation of the hydrological cycle over the Danube river basin for 1961–2000 and for the 2161–2200 SRESA1B scenario runs. The basin-scale properties of the hydrological cycle are computed by spatially integrating the precipitation, evaporation, and runoff fields using the Voronoi-Thiessen tessellation formalism. The span of the model-simulated mean annual water balances is of the same order of magnitude as the observed Danube discharge at the Delta; the true value is within the range simulated by the models. Some land components seem to have deficiencies, since there are cases of violation of water conservation when annual means are considered. The overall performance and the degree of agreement of the GCMs are comparable to those of the RCMs (Regional Climate Models) analyzed in a previous work, in spite of the much higher resolution and common nesting of the RCMs. The reanalyses are shown to feature several inconsistencies and cannot be used as a verification benchmark for the hydrological cycle in the Danubian region. In the scenario runs, the water balance decreases for essentially all models, whereas its interannual variability increases. Changes in the strength of the hydrological cycle are not consistent among models: it is confirmed that capturing the impact of climate change on the hydrological cycle is not an easy task over land areas. Moreover, in several cases we find that qualitatively different behaviors emerge among the models: the ensemble mean does not represent any sort of average model, and often it falls between the models’ clusters.
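For concreteness, area weighting of point values with a Voronoi (Thiessen) tessellation can be approximated by nearest-neighbour assignment on a fine grid, as in the sketch below (synthetic points in a unit square standing in for the basin; not the study's code):

    # Thiessen (Voronoi) area weighting: each station's weight is the
    # fraction of basin area closest to it (synthetic data, illustrative).
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(2)
    stations = rng.random((10, 2))          # station coordinates
    precip = rng.uniform(400, 900, 10)      # annual precipitation per station (mm)

    # Fine grid over the basin; assign each cell to its nearest station.
    g = np.linspace(0, 1, 400)
    gx, gy = np.meshgrid(g, g)
    cells = np.column_stack([gx.ravel(), gy.ravel()])
    _, owner = cKDTree(stations).query(cells)

    weights = np.bincount(owner, minlength=10) / cells.shape[0]
    basin_mean = np.sum(weights * precip)   # area-weighted basin mean (mm)
    print(f"Thiessen-weighted basin precipitation: {basin_mean:.0f} mm")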
Abstract:
Using the formalism of Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, in both the additive and the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and that of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary parts of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of the application of our results by considering the spatially extended chaotic Lorenz 96 model. These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and the related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.
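In a notation introduced here for convenience (S_0 and S_σ the power spectra of a given observable in the unperturbed and perturbed systems, σ² the noise variance, and χ(ω) the linear susceptibility for the spatial pattern of the forcing), the central spectral identity stated above reads

    S_\sigma(\omega) - S_0(\omega) = \sigma^2 \, |\chi(\omega)|^2 \;\ge\; 0,

which makes the positive-definiteness result explicit: noise raises the power spectrum at every frequency.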
Abstract:
Variational data assimilation in continuous time is revisited. The central techniques applied in this paper are in part adopted from the theory of optimal nonlinear control. Alternatively, the investigated approach can be considered a continuous-time generalization of what is known as weakly constrained four-dimensional variational assimilation (4D-Var) in the geosciences. The technique makes it possible to assimilate trajectories in the case of partial observations and in the presence of model error. Several mathematical aspects of the approach are studied. Computationally, it amounts to solving a two-point boundary value problem. For imperfect models, the trade-off between small dynamical error (i.e. the trajectory obeys the model dynamics) and small observational error (i.e. the trajectory closely follows the observations) is investigated. This trade-off turns out to be trivial if the model is perfect. However, even in this situation, allowing for minute deviations from the perfect model is shown to have positive effects, namely to regularize the problem. The presented formalism is dynamical in character. No statistical assumptions on dynamical or observational noise are imposed.
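In a notation we adopt purely for illustration (model dynamics f, observation operator h, observations y, and positive weights Q and R penalising dynamical and observational error respectively), the weakly constrained continuous-time cost functional minimised by such an approach can be written as

    J[x] = \frac{1}{2} \int_0^T \Big[ (\dot{x} - f(x))^\top Q^{-1} (\dot{x} - f(x)) + (y - h(x))^\top R^{-1} (y - h(x)) \Big] \, dt,

whose Euler-Lagrange equations yield the two-point boundary value problem mentioned above; the strong-constraint (perfect-model) limit corresponds to Q → 0.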
Abstract:
The dielectric constant, ε', and the dielectric loss, ε'', for gelatin films were measured in the glassy and rubbery states over a frequency range from 20 Hz to 10 MHz; ε' and ε'' were transformed into the M* formalism (M* = 1/(ε' - iε'') = M' + iM''; i the imaginary unit). The peak of ε'' was masked, probably owing to DC conduction, but the peak of M'', i.e. the conductivity relaxation, was observed for the gelatin used. By fitting the M'' data to a Havriliak-Negami-type equation, the relaxation time τ_HN was evaluated. The activation energy E_τ, evaluated from an Arrhenius plot of 1/τ_HN, agreed well with the value E_σ evaluated from the DC conductivity σ0 in both the glassy and rubbery states, indicating that the conductivity relaxation observed for the gelatin films is ascribed to ionic conduction. The activation energy in the glassy state was larger than that in the rubbery state.
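For readers wishing to reproduce this kind of analysis, a generic Havriliak-Negami fit of an electric-modulus loss peak might be sketched as follows (synthetic data and invented parameters, not the study's measurements):

    # Generic Havriliak-Negami fit of a loss peak (synthetic M'' data).
    import numpy as np
    from scipy.optimize import curve_fit

    def hn_loss(omega, dM, tau, alpha, beta):
        """Negative imaginary part of a Havriliak-Negami function (loss peak)."""
        hn = dM / (1.0 + (1j * omega * tau) ** alpha) ** beta
        return -hn.imag

    omega = np.logspace(2, 8, 200)                      # angular frequency (rad/s)
    true = hn_loss(omega, 0.02, 1e-5, 0.85, 0.6)
    rng = np.random.default_rng(3)
    data = true + rng.normal(0, 2e-5, omega.size)       # synthetic M'' data

    popt, _ = curve_fit(hn_loss, omega, data, p0=[0.01, 1e-5, 0.9, 0.7])
    print(f"tau_HN = {popt[1]:.2e} s")                  # relaxation time estimate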