120 results for Multi-scale hierarchical framework
Abstract:
Fingerprinting is a well-known approach for identifying multimedia data without the original data being present, relying instead on what amounts to its essence or 'DNA'. Current approaches make insufficient use of the various types of knowledge that could be brought to bear in a fingerprinting framework that is effective and efficient and that can protect both whole artefacts and their elements at levels of abstraction appropriate to the various Zones of Interest (ZoI) in an image or cross-media artefact. The proposed framework aims to deliver selective composite fingerprinting, aided by multi-modal information and by a rich spectrum of collateral context knowledge: image-level collaterals as well as market intelligence, such as profiling of customers' social-network interests, deployed as a crucial component of the fingerprinting collateral knowledge.
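The abstract does not specify an algorithm; as a purely illustrative sketch of what a "selective composite" fingerprint over Zones of Interest could look like, the following computes a simple difference-hash for the whole image and for each hand-chosen ZoI (the zone boxes, the hash choice and the file name are hypothetical, and collateral/context knowledge is not modelled here):

```python
from PIL import Image

def dhash(img, size=8):
    """64-bit difference hash of a PIL image or image region."""
    g = img.convert("L").resize((size + 1, size), Image.LANCZOS)
    px = list(g.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def composite_fingerprint(path, zones):
    """zones: {name: (left, top, right, bottom)} boxes, chosen by hand here."""
    img = Image.open(path)
    fingerprint = {"whole": dhash(img)}
    fingerprint.update({name: dhash(img.crop(box)) for name, box in zones.items()})
    return fingerprint

def hamming(a, b):
    return bin(a ^ b).count("1")   # 0 = identical, 64 = maximally different

# Hypothetical usage: protect a logo region separately from the whole image.
# fp = composite_fingerprint("artwork.png", {"logo": (0, 0, 128, 128)})
```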
Abstract:
We present a novel kinetic multi-layer model that explicitly resolves mass transport and chemical reaction at the surface and in the bulk of aerosol particles (KM-SUB). The model is based on the PRA framework of gas-particle interactions (Pöschl-Rudich-Ammann, 2007), and it includes reversible adsorption, surface reactions and surface-bulk exchange as well as bulk diffusion and reaction. Unlike earlier models, KM-SUB does not require simplifying assumptions about steady-state conditions and radial mixing. The temporal evolution and concentration profiles of volatile and non-volatile species at the gas-particle interface and in the particle bulk can be modeled along with surface concentrations and gas uptake coefficients. In this study we explore and exemplify the effects of bulk diffusion on the rate of reactive gas uptake for a simple reference system, the ozonolysis of oleic acid particles, in comparison to experimental data and earlier model studies. We demonstrate how KM-SUB can be used to interpret and analyze experimental data from laboratory studies, and how the results can be extrapolated to atmospheric conditions. In particular, we show how interfacial and bulk transport, i.e., surface accommodation, bulk accommodation and bulk diffusion, influence the kinetics of the chemical reaction. Sensitivity studies suggest that in fine air particulate matter, oleic acid and compounds with similar reactivity against ozone (carbon-carbon double bonds) can reach chemical lifetimes of many hours only if they are embedded in a (semi-)solid matrix with very low diffusion coefficients (< 10⁻¹⁰ cm² s⁻¹). Depending on the complexity of the investigated system, unlimited numbers of volatile and non-volatile species and chemical reactions can be flexibly added and treated with KM-SUB. We propose and intend to pursue the application of KM-SUB as a basis for the development of a detailed master mechanism of aerosol chemistry as well as for the derivation of simplified but realistic parameterizations for large-scale atmospheric and climate models.
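To make the layered-bulk idea concrete, here is a heavily simplified sketch (not the published KM-SUB formulation, which additionally treats gas-phase diffusion, adsorption, accommodation and spherical geometry): the particle bulk is split into layers, dissolved ozone diffuses between neighbouring layers and reacts with oleic acid at a second-order rate, and all parameter values are chosen only for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative kinetic multi-layer sketch: n bulk layers, Fickian exchange of
# dissolved ozone between layers and second-order loss against oleic acid (OL).
n       = 20        # number of bulk layers
r_p     = 0.2e-4    # particle radius [cm]
dr      = r_p / n   # layer thickness [cm]
D_O3    = 1e-10     # ozone bulk diffusion coefficient [cm2 s-1]
D_OL    = 1e-10     # oleic acid bulk diffusion coefficient [cm2 s-1]
k2      = 1e-15     # second-order rate coefficient [cm3 s-1]
O3_surf = 1e13      # fixed dissolved O3 just below the surface [cm-3]
                    # (stands in for adsorption and surface-bulk exchange)
OL0     = 1.2e21    # initial oleic acid concentration [cm-3]

def rhs(t, y):
    o3, ol = y[:n], y[n:]
    d_o3, d_ol = np.zeros(n), np.zeros(n)
    for i in range(n):
        # diffusive exchange with the layer above (surface feeds layer 0)
        up_o3 = O3_surf if i == 0 else o3[i - 1]
        d_o3[i] += D_O3 * (up_o3 - o3[i]) / dr**2
        d_ol[i] += 0.0 if i == 0 else D_OL * (ol[i - 1] - ol[i]) / dr**2
        # diffusive exchange with the layer below (zero flux at the centre)
        if i < n - 1:
            d_o3[i] += D_O3 * (o3[i + 1] - o3[i]) / dr**2
            d_ol[i] += D_OL * (ol[i + 1] - ol[i]) / dr**2
        # chemical loss: O3 + OL -> products
        loss = k2 * o3[i] * ol[i]
        d_o3[i] -= loss
        d_ol[i] -= loss
    return np.concatenate([d_o3, d_ol])

y0 = np.concatenate([np.zeros(n), np.full(n, OL0)])
sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA")
print(f"oleic acid remaining after 1 h: {100 * sol.y[n:, -1].mean() / OL0:.1f}%")
```

With the low diffusion coefficient used here, most of the oleic acid survives the first hour, which is the qualitative point of the sensitivity studies summarised above.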
Abstract:
Where users are interacting in a distributed virtual environment, the actions of each user must be observed by peers with sufficient consistency and within a limited delay so as not to be detrimental to the interaction. The consistency control issue may be split into three parts: update control; consistent enactment and evolution of events; and causal consistency. The delay in the presentation of events, termed latency, is primarily dependent on the network propagation delay and the consistency control algorithms. The latency induced by the consistency control algorithm, in particular causal ordering, is proportional to the number of participants. This paper describes how the effect of network delays may be reduced and introduces a scalable solution that provides sufficient consistency control while minimising its effect on latency. The principles described have been developed at Reading over the past five years. Similar principles are now emerging in the simulation community through the HLA standard. This paper attempts to validate the suggested principles within the schema of distributed simulation and virtual environments and to compare and contrast them with those described by the HLA definition documents.
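For readers unfamiliar with why causal-ordering cost grows with the number of participants, the sketch below shows a standard vector-clock causal delivery scheme (a textbook technique, not the scalable solution proposed in the paper): every update carries one counter per participant, and delivery of an update is delayed until all causally preceding updates have arrived.

```python
class Participant:
    def __init__(self, pid, n):
        self.pid = pid
        self.clock = [0] * n      # one counter per participant: O(n) metadata
        self.buffer = []          # updates waiting on missing causes

    def send(self, payload):
        self.clock[self.pid] += 1
        return (self.pid, list(self.clock), payload)

    def _deliverable(self, sender, vc):
        # next update from that sender, and no other causal dependency missing
        return (vc[sender] == self.clock[sender] + 1 and
                all(vc[k] <= self.clock[k]
                    for k in range(len(vc)) if k != sender))

    def receive(self, msg):
        self.buffer.append(msg)
        delivered, progress = [], True
        while progress:
            progress = False
            for m in list(self.buffer):
                sender, vc, payload = m
                if self._deliverable(sender, vc):
                    self.clock[sender] += 1
                    delivered.append(payload)
                    self.buffer.remove(m)
                    progress = True
        return delivered

# Example: participant 1 sees p0's second update before its first.
p0, p1 = Participant(0, 2), Participant(1, 2)
m1 = p0.send("move A")
m2 = p0.send("move B")
print(p1.receive(m2))   # []  -- buffered until the missing cause arrives
print(p1.receive(m1))   # ['move A', 'move B']
```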
Abstract:
A theoretical framework for the joint conservation of energy and momentum in the parameterization of subgrid-scale processes in climate models is presented. The framework couples a hydrostatic resolved (planetary-scale) flow to a nonhydrostatic subgrid-scale (mesoscale) flow. The temporal and horizontal spatial scale separation between the planetary scale and the mesoscale is imposed using multiple-scale asymptotics. Energy and momentum are exchanged through subgrid-scale flux convergences of heat, pressure, and momentum. The generation and dissipation of subgrid-scale energy and momentum are understood using wave-activity conservation laws that are derived by exploiting the (mesoscale) temporal and horizontal spatial homogeneities in the planetary-scale flow. The relations between these conservation laws and the planetary-scale dynamics represent generalized nonacceleration theorems. A derived relationship between the wave-activity fluxes, which represents a generalization of the second Eliassen-Palm theorem, is key to ensuring consistency between energy and momentum conservation. The framework includes a consistent formulation of heating and entropy production due to kinetic energy dissipation.
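For orientation only, the familiar quasi-geostrophic special case of the Eliassen-Palm relations is written out below; the paper derives generalized multi-scale analogues of these, and nothing in this block is specific to its framework.

```latex
% Textbook quasi-geostrophic Eliassen-Palm flux and transformed-Eulerian-mean
% momentum balance (Boussinesq, beta-plane), shown for orientation only.
\mathbf{F} \;=\; \Bigl(-\,\overline{u'v'},\;
      \frac{f_0\,\overline{v'\theta'}}{\partial\bar\theta/\partial z}\Bigr),
\qquad
\frac{\partial \bar u}{\partial t} \;-\; f_0\,\bar v^{\,*} \;=\; \nabla\cdot\mathbf{F}.
```

For steady, conservative, linear waves the flux divergence vanishes, so the waves induce no mean-flow acceleration; this is the classical nonacceleration result that the generalized theorems above extend.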
Abstract:
Undeniably, anticipation plays a crucial role in cognition. By what means it does so, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011), and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but with no reference to some of the historical debates that shaped cognitive science, related to the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or accounting for the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).
Abstract:
Cover crops are sown to provide a number of ecosystem services including nutrient management, mitigation of diffuse pollution, improving soil structure and organic matter content, weed suppression, nitrogen fixation and provision of resources for biodiversity. Although the decision to sow a cover crop may be driven by a desire to achieve just one of these objectives, the diversity of cover crop species and mixtures available means that there is potential to combine a number of ecosystem services within the same crop and growing season. Designing multi-functional cover crops would potentially help to reconcile the often conflicting agronomic and environmental agendas and contribute to the optimal use of land. We present a framework for integrating multiple ecosystem services delivered by cover crops that aims to design a mixture of species with complementary growth habit and functionality. The optimal number and identity of species will depend on the services included in the analysis, the functional space represented by the available species pool and the community dynamics of the crop in terms of dominance and co-existence. Experience from a project that applied the framework to fertility-building leys in organic systems demonstrated its potential and emphasised the importance of the initial choice of species to include in the analysis.
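The framework itself is not specified in the abstract; as a deliberately naive illustration of choosing species with complementary functionality, the sketch below greedily adds species while doing so raises the mixture's weakest service. The species, services and scores are invented, and real mixture design would rest on measured functional traits and community dynamics.

```python
# Invented species-by-service scores (0-1); a real analysis would use
# measured functional traits, not this toy table.
scores = {
    "rye":          {"N retention": 0.9, "weed suppression": 0.7, "N fixation": 0.0},
    "vetch":        {"N retention": 0.3, "weed suppression": 0.4, "N fixation": 0.9},
    "phacelia":     {"N retention": 0.5, "weed suppression": 0.8, "N fixation": 0.0},
    "white clover": {"N retention": 0.2, "weed suppression": 0.3, "N fixation": 0.8},
}
services = ["N retention", "weed suppression", "N fixation"]

def profile(mix):
    # assume the mixture delivers, for each service, the best of its members
    return {s: max(scores[sp][s] for sp in mix) for s in services}

def weakest(mix):
    return min(profile(mix).values())

mixture, candidates = [], set(scores)
while candidates:
    # add the species that most improves the currently weakest service
    best = max(candidates, key=lambda sp: weakest(mixture + [sp]))
    if mixture and weakest(mixture + [best]) <= weakest(mixture):
        break                             # no further complementarity gain
    mixture.append(best)
    candidates.remove(best)

print(mixture, profile(mixture))
```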
Abstract:
This paper describes the development and first results of the “Community Integrated Assessment System” (CIAS), a unique multi-institutional, modular and flexible integrated assessment system for modelling climate change. Key to this development is the supporting software infrastructure, SoftIAM. Through it, CIAS is distributed between a community of institutions, each of which has contributed modules to the CIAS system. At the heart of SoftIAM is the Bespoke Framework Generator (BFG), which enables flexibility in the assembly and composition of individual modules from a pool to form coupled models within CIAS, and flexibility in their deployment onto the available software and hardware resources. Such flexibility greatly enhances modellers’ ability to re-configure the CIAS coupled models to answer different questions, thus tracking evolving policy needs. It also allows rigorous testing of the robustness of IA modelling results to the use of different component modules representing the same processes (for example, the economy). Such processes are often modelled in very different ways, using different paradigms, at the participating institutions. An illustrative application to the study of the relationship between the economy and the earth’s climate system is provided.
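As a rough illustration of what "flexible assembly and composition of modules" means in practice (purely hypothetical code, not the actual SoftIAM/BFG interfaces), interchangeable economy and climate components can be coupled through a shared interface so that swapping one implementation for another changes a single argument:

```python
from typing import Protocol

class EconomyModule(Protocol):
    def step(self, year: int, temperature: float) -> float:
        """Return global CO2 emissions [GtC/yr] given warming so far."""

class ClimateModule(Protocol):
    def step(self, year: int, emissions: float) -> float:
        """Return global-mean warming [K] given this year's emissions."""

class SimpleEconomy:
    def step(self, year, temperature):
        growth = 1.02 ** (year - 2000)
        damage = max(0.0, 1.0 - 0.05 * temperature)   # crude climate damage
        return 10.0 * growth * damage

class SimpleClimate:
    def __init__(self):
        self.cumulative = 0.0
    def step(self, year, emissions):
        self.cumulative += emissions
        return 0.0017 * self.cumulative               # crude TCRE-style response

def run_coupled(economy: EconomyModule, climate: ClimateModule, years):
    temperature = 0.0
    for year in years:
        emissions = economy.step(year, temperature)
        temperature = climate.step(year, emissions)
    return temperature

# Swapping in a different economy module only changes one argument:
print(run_coupled(SimpleEconomy(), SimpleClimate(), range(2000, 2101)))
```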
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram with modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
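The accumulation step described above is simple enough to show directly. In the sketch below, the stage lags and variance components are invented numbers standing in for the output of a hierarchical ANOVA or REML fit; the semivariance at a stage's lag is taken as the sum of the components at that stage and all finer stages.

```python
import numpy as np

# Invented stage separations [m], coarse to fine, and their variance components.
lags       = np.array([1000.0, 300.0, 100.0, 30.0])
components = np.array([0.8,    0.5,   0.3,   0.2])

# Accumulate from the shortest lag upward to obtain the rough variogram.
order = np.argsort(lags)                 # fine -> coarse
gamma = np.cumsum(components[order])
for lag, g in zip(lags[order], gamma):
    print(f"lag {lag:7.1f} m   semivariance {g:.2f}")
```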
Correlating Bayesian date estimates with climatic events and domestication using a bovine case study
Abstract:
The tribe Bovini contains a number of commercially and culturally important species, such as cattle. Understanding their evolutionary time scale is important for distinguishing between post-glacial and domestication-associated population expansions, but estimates of bovine divergence times have been hindered by a lack of reliable calibration points. We present a Bayesian phylogenetic analysis of 481 mitochondrial D-loop sequences, including 228 radiocarbon-dated ancient DNA sequences, using a multi-demographic coalescent model. By employing the radiocarbon dates as internal calibrations, we co-estimate the bovine phylogeny and divergence times in a relaxed-clock framework. The analysis yields evidence for significant population expansions in the taurine cattle, zebu cattle, European aurochs and yak clades. The divergence age estimates support domestication-associated expansion times (less than 12 kyr) for the major haplogroups of cattle. We compare the molecular and palaeontological estimates for the Bison-Bos divergence.
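To give an intuition for why radiocarbon-dated ancient sequences can act as internal calibrations, here is a toy root-to-tip regression (a far cruder device than the Bayesian relaxed-clock coalescent analysis in the paper, and every number below is invented): dated tips stopped accumulating substitutions at known times, so the slope of divergence against sample age estimates the substitution rate.

```python
import numpy as np

# Invented tip ages (kyr before present) and root-to-tip divergences
# (substitutions per site); older samples have accumulated less divergence.
ages_kyr   = np.array([0.0, 2.0, 5.0, 8.0, 12.0])
divergence = np.array([0.0300, 0.0298, 0.0294, 0.0290, 0.0286])

# Slope of divergence vs. age gives the substitution rate; extrapolating to
# zero divergence gives a (very rough) implied root age.
slope, intercept = np.polyfit(ages_kyr, divergence, 1)
rate = abs(slope)                        # substitutions per site per kyr
root_age_kyr = intercept / rate
print(f"rate ≈ {rate:.1e} subst/site/kyr, implied root age ≈ {root_age_kyr:.0f} kyr")
```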
Abstract:
This paper considers the potential contribution of secondary quantitative analyses of large scale surveys to the investigation of 'other' childhoods. Exploring other childhoods involves investigating the experience of young people who are unequally positioned in relation to multiple, embodied, identity locations, such as (dis)ability, 'class', gender, sexuality, ethnicity and race. Despite some possible advantages of utilising extensive databases, the paper outlines a number of methodological problems with existing surveys which tend to reinforce adultist and broader hierarchical social relations. It is contended that scholars of children's geographies could overcome some of these problematic aspects of secondary data sources by endeavouring to transform the research relations of large scale surveys. Such endeavours would present new theoretical, ethical and methodological complexities, which are briefly considered.