975 results for Novel theory
Abstract:
We present a new version of non-local density functional theory (NL-DFT) adapted to the description of vapor adsorption isotherms on amorphous materials such as non-porous silica. The novel feature of this approach is that it accounts for the roughness of the adsorbent surface. The solid–fluid interaction is described in the same framework as the fluid–fluid interactions, using the Weeks–Chandler–Andersen (WCA) scheme and the Carnahan–Starling (CS) equation for the attractive and repulsive parts of the Helmholtz free energy, respectively. Application to nitrogen and argon adsorption isotherms on non-porous silica LiChrospher Si-1000 at their boiling points, recently published by Jaroniec and co-workers, has shown the excellent correlative ability of our approach over the complete range of pressures, which suggests that surface roughness is the main reason for the observed behavior of the adsorption isotherms. From the analysis of these data, we found that in the case of nitrogen adsorption, short-range interactions between oxygen atoms on the silica surface and the quadrupole moment of nitrogen molecules play an important role. The approach presented in this paper may be further used in the quantitative analysis of adsorption and desorption isotherms in cylindrical pores such as those of MCM-41 and carbon nanotubes.
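For reference, a commonly used form of the Carnahan–Starling contribution mentioned in this abstract is the excess Helmholtz free energy per particle of the hard-sphere fluid (a standard textbook result quoted here for orientation, not an equation taken from the paper itself):

```latex
\beta f_{\mathrm{ex}}^{\mathrm{CS}}(\eta) = \frac{\eta\,(4 - 3\eta)}{(1 - \eta)^{2}},
\qquad \eta = \frac{\pi}{6}\,\rho\, d^{3},
```

where $\eta$ is the hard-sphere packing fraction, $\rho$ the number density and $d$ the hard-sphere diameter.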
Abstract:
In this thesis we develop a new generative model of social networks belonging to the family of Time-Varying Networks. Correctly modelling the mechanisms shaping the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted a switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modeling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes the fundamental mechanisms shaping social networks' topology and temporal structure: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We then empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model and solve it analytically. We check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two.
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks. This model is inspired by Kauffman's adjacent possible theory and is based on a generalized version of Pólya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
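The Activity-Driven paradigm this thesis builds on can be sketched as follows: each node fires with a heavy-tailed activity rate and, when active, wires a few links to random peers, producing a sequence of instantaneous contact networks. This is a minimal illustration of the base model only, not the thesis's extended model with social capital allocation and burstiness; all parameter names are our own.

```python
import random

def activity_driven_snapshots(n=100, m=2, gamma=2.5, eps=0.01, steps=50, seed=42):
    """Generate instantaneous contact networks under the Activity-Driven
    model: at each time step, node i becomes active with probability a_i
    and creates m links to uniformly chosen distinct other nodes."""
    rng = random.Random(seed)
    # heavy-tailed activities, roughly F(a) ~ a^(-gamma), clipped to [eps, 1]
    acts = [min(1.0, eps * (1.0 - rng.random()) ** (-1.0 / (gamma - 1.0)))
            for _ in range(n)]
    snapshots = []
    for _ in range(steps):
        edges = set()  # the instantaneous network is rebuilt from scratch
        for i, a in enumerate(acts):
            if rng.random() < a:
                for j in rng.sample([k for k in range(n) if k != i], m):
                    edges.add((min(i, j), max(i, j)))
        snapshots.append(edges)
    return snapshots
```

Memory effects such as social capital allocation are typically added on top of this skeleton by biasing the target choice toward already-contacted nodes instead of sampling uniformly.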
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
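As a concrete illustration of the mean field method discussed in this abstract, the naive mean field approximation for the Ising model replaces each spin's neighbors by their average magnetization m, yielding the self-consistency equation m = tanh(beta*(J*z*m + h)), which can be solved by fixed-point iteration. This is a textbook sketch, not code from the book itself:

```python
import math

def mean_field_magnetization(beta, J, z, h=0.0, m0=0.5, tol=1e-10, max_iter=10000):
    """Solve the naive mean field self-consistency m = tanh(beta*(J*z*m + h))
    for an Ising model with coupling J and coordination number z."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * (J * z * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m
```

Above the mean field critical temperature (beta*J*z < 1, h = 0) the iteration collapses to m = 0; below it a spontaneous magnetization appears. The TAP approach mentioned above improves on this by adding an Onsager reaction term to the effective field.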
Abstract:
Purpose – Qualitative theory building approaches, such as grounded theory method (GTM), are still not very widespread or rigorously applied in operations management (OM) research. Yet it is agreed that more systematic observation of current industrial phenomena is necessary to help managers deal with their problems. The purpose of this paper is to provide an example to help guide other researchers on using GTM for theory building in OM research. Design/methodology/approach – A GTM study in the German automotive industry consisting of 31 interviews is followed by a validation stage comprising a survey (110 responses) and a focus group. Findings – The result is an example of conducting GTM research in OM, illustrated by the development of the novel collaborative enterprise governance framework for inter-firm relationship governance in the German automotive industry. Research limitations/implications – GTM is appropriate for qualitative theory building research, but the resultant theories need further testing. Research is necessary to identify the transferability of the collaborative enterprise governance concept to industries other than automotive, to organisational areas other than R&D, and to product and service settings that are less complex and innovative. Practical implications – The paper helps researchers make more informed use of GTM when engaging in qualitative theory building research in OM. Originality/value – There is a lack of explicit and well-informed use of GTM in OM research because of poor understanding. This paper addresses this deficiency. The collaborative enterprise governance framework is a significant contribution in an area of growing importance within OM.
Abstract:
Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
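Of the methods this abstract lists, the running median is perhaps the simplest to state: slide a window over the signal and replace each sample by the median of its neighborhood. A minimal sketch follows; the window size and edge handling are our own choices, not prescriptions from the paper.

```python
from statistics import median

def running_median(x, half_window=1):
    """Denoise a piecewise constant signal with a sliding-window median.
    The window shrinks near the ends of the signal."""
    n = len(x)
    return [median(x[max(0, i - half_window): min(n, i + half_window + 1)])
            for i in range(n)]
```

Unlike a running mean, the median preserves sharp jumps between constant levels while rejecting isolated impulsive noise, which is exactly the behavior PWC denoising requires.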
Abstract:
Purpose – To propose and investigate a stable numerical procedure for the reconstruction of the velocity of a viscous incompressible fluid flow in linear hydrodynamics from knowledge of the velocity and fluid stress force given on a part of the boundary of a bounded domain. Design/methodology/approach – Earlier works have addressed the similar problem for the stationary case (time-independent fluid flow). Extending these ideas, a procedure is proposed and investigated for the time-dependent case as well. Findings – The paper presents a novel variational method for the Cauchy problem. It proves convergence and also proposes a new boundary element method. Research limitations/implications – The fluid flow domain is limited to annular domains; this restriction can be removed by undertaking analyses in appropriate weighted spaces to incorporate singularities that can occur on general bounded domains. Future work involves numerical investigations and the consideration of Oseen-type flow. A challenging problem is to consider the non-linear Navier-Stokes equations. Practical implications – Fluid flow problems where data are known only on a part of the boundary occur in a range of engineering situations such as colloidal suspension and swimming of microorganisms. For example, the solution domain can be the region between two spheres where only the outer sphere is accessible for measurements. Originality/value – A novel variational method for the Cauchy problem is proposed which preserves the unsteady Stokes operator, convergence is proved and, using recent results for the fundamental solution of the unsteady Stokes system, a new boundary element method for this system is also proposed.
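For orientation, the Cauchy problem described in this abstract has the following generic form (our notation; the paper's exact formulation may differ): find the velocity u and pressure p satisfying the unsteady Stokes system with both velocity and stress data prescribed on the accessible part $\Gamma_a$ of the boundary,

```latex
\begin{aligned}
\partial_t u - \nu \Delta u + \nabla p &= 0, \qquad \nabla \cdot u = 0
  && \text{in } \Omega \times (0, T),\\
u &= f, \qquad T(u, p)\, n = g
  && \text{on } \Gamma_a \times (0, T),
\end{aligned}
```

where $T(u,p) = -p I + \nu\,(\nabla u + \nabla u^{\top})$ is the fluid stress tensor, $n$ the outward unit normal, and no data are given on the remaining, inaccessible part of the boundary; this over-determination on $\Gamma_a$ is what makes the problem ill-posed.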
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, of a novel user interaction framework - uInteract - for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but also illustrated the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.
Abstract:
TEST is a novel taxonomy of knowledge representations based on three distinct hierarchically organized representational features: Tropism, Embodiment, and Situatedness. Tropic representational features reflect constraints of the physical world on the agent's ability to form, reactivate, and enrich embodied (i.e., resulting from the agent's bodily constraints) conceptual representations embedded in situated contexts. The proposed hierarchy entails that representations can, in principle, have tropic features without necessarily having situated and/or embodied features. On the other hand, representations that are situated and/or embodied are likely to be simultaneously tropic. Hence, although we propose tropism as the most general feature, embodiment and situatedness stand more on a par with each other in the hierarchy, such that the dominance of one component over the other relies on the distinction between offline storage versus online generation as well as on representation-specific properties. © 2013 Cognitive Science Society, Inc.
Abstract:
A large number of studies have been devoted to modeling the contents and interactions between users on Twitter. In this paper, we propose a method inspired by Social Role Theory (SRT), which assumes that a user behaves differently in different roles in the generation process of Twitter content. We consider the two most distinctive social roles on Twitter: originator and propagator, who respectively post original messages and retweet or forward messages from others. In addition, we also consider role-specific social interactions, especially implicit interactions between users who share some common interests. All the above elements are integrated into a novel regularized topic model. We evaluate the proposed method on real Twitter data. The results show that our method is more effective than existing methods that do not distinguish social roles. Copyright 2013 ACM.
Abstract:
This study presents the first part of a CFD study on the performance of a downer reactor for biomass pyrolysis. The reactor was equipped with a novel gas-solid separation method, developed by the co-authors from ICFAR (Canada). The separator, which was designed to allow for fast separation of clean pyrolysis gas, consisted of a cone deflector and a gas exit pipe installed inside the downer reactor. A multi-fluid model (Eulerian-Eulerian) with constitutive relations adopted from the kinetic theory of granular flow was used to simulate the multiphase flow. The effects of various parameters, including operating conditions, separator geometry and particle properties, on the overall hydrodynamics and separation efficiency were investigated. The model prediction of the separator efficiency was compared with experimental measurements. The results revealed distinct hydrodynamic features around the cone separator, allowing for up to 100% separation efficiency. The developed model provided a platform for the second part of the study, where the biomass pyrolysis is simulated and the product quality as a function of operating conditions is analyzed. Crown Copyright © 2014 Published by Elsevier B.V. All rights reserved.
Abstract:
The extremely surface sensitive technique of metastable de-excitation spectroscopy (MDS) has been utilized to probe the bonding and reactivity of crotyl alcohol over Pd(111) and provide insight into the selective oxidation pathway to crotonaldehyde. Auger de-excitation (AD) of metastable He(2³S) atoms reveals distinct features associated with the molecular orbitals of the adsorbed alcohol, corresponding to emission from the hydrocarbon skeleton, the O n nonbonding, and C═C π states. The O n and C═C π states of the alcohol are reversed when compared to those of the aldehyde. Density functional theory (DFT) calculations of the alcohol show that an adsorption mode with both C═C and O bonds aligned somewhat parallel to the surface is energetically favored at a substrate temperature below 200 K. Density of states calculations for such configurations are in excellent agreement with experimental MDS measurements. MDS revealed oxidative dehydrogenation of crotyl alcohol to crotonaldehyde between 200 and 250 K, resulting in small peak shifts to higher binding energy. Intramolecular changes lead to the opposite assignment of the first two MOs in the alcohol versus the aldehyde, in accordance with DFT and UPS studies of the free molecules. Subsequent crotonaldehyde decarbonylation and associated propylidyne formation above 260 K could also be identified by MDS and complementary theoretical calculations as the origin of deactivation and selectivity loss. Combining MDS and DFT in this way represents a novel approach to elucidating surface catalyzed reaction pathways associated with a “real-world” practical chemical transformation, namely the selective oxidation of alcohols to aldehydes.
Abstract:
In this paper, we explore the idea of social role theory (SRT) and propose a novel regularized topic model which incorporates SRT into the generative process of social media content. We assume that a user can play multiple social roles, and each social role serves to fulfil different duties and is associated with a role-driven distribution over latent topics. In particular, we focus on social roles corresponding to the most common social activities on social networks. Our model is instantiated on microblogs, i.e., Twitter, and community question-answering (cQA), i.e., Yahoo! Answers, where social roles on Twitter include "originators" and "propagators", and roles on cQA are "askers" and "answerers". Both explicit and implicit interactions between users are taken into account and modeled as regularization factors. To evaluate the performance of our proposed method, we have conducted extensive experiments on two Twitter datasets and two cQA datasets. Furthermore, we also consider multi-role modeling for scientific papers, where an author's research expertise area is considered as a social role. We also present a novel application: detecting users' research interests through topical keyword labeling based on the results of our multi-role model. The evaluation results have shown the feasibility and effectiveness of our model.
Abstract:
The literature on inter-organisational collaboration, although wide-ranging, offers little guidance on collaboration as a process. It focuses in the main on human attributes such as leadership, trust and agency, but gives little consideration to the role of objects in the development of inter-organisational collaborations. A central aim of this thesis is to understand the interaction of objects and humans in the development of a particular health and social care partnership in the North East of England. This socio-material perspective was achieved through actor-network theory (ANT) as a methodology, in which the researcher is equally sensitised to the role of human and non-human entities in the development of a network. The case study is that of the North East Lincolnshire Care Trust Plus (CTP). This was a unique health and social care collaboration arrangement between North East Lincolnshire Council and North East Lincolnshire Primary Care Trust, set up to address health inequalities in the region. The CTP was conceived and developed at a local level by the respective organisations' decision makers in the face of considerable opposition from regional policy makers and national regulators. However, despite this opposition, the directors eventually achieved their goal and the CTP became operational on 1st September 2007. This study seeks to understand how the CTP was conceived and developed in the face of this opposition. The thesis makes a number of original contributions. Firstly, it adds to the current body of literature on collaboration by identifying how objects can help problematize issues and cement inter-organisational collaborations.
Secondly, it provides a novel account describing how two public sector organisations created a unique collaboration despite pressing resistance from the regulatory authorities; and thirdly, it extends Callon's (1996) notion of problematization to examine how what is rather vaguely described as ‘context’ in the literature becomes enmeshed in decisions to collaborate.
Abstract:
In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, just a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. Surprise measures how observed data affect the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise for the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
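The definition in this abstract can be made concrete for a discrete hypothesis space: surprise is the Kullback-Leibler divergence between the posterior and prior over models after observing the data. The following is a generic sketch of the Bayesian surprise idea, not the paper's dynamic-decision-network implementation; all names are our own.

```python
import math

def bayesian_surprise(prior, likelihood):
    """Surprise S(D) = KL(P(M|D) || P(M)) over a discrete model space,
    given prior probabilities P(M) and per-model likelihoods P(D|M)."""
    # Bayes' rule: posterior proportional to prior times likelihood
    evidence = sum(p * l for p, l in zip(prior, likelihood))
    posterior = [p * l / evidence for p, l in zip(prior, likelihood)]
    # KL divergence of posterior from prior (in nats)
    surprise = sum(q * math.log(q / p)
                   for q, p in zip(posterior, prior) if q > 0.0)
    return surprise, posterior
```

Data that are equally likely under every model leave the belief untouched (zero surprise); data that strongly favor one model produce a large divergence, which a self-adaptive system could use as a trigger to adapt or to flag an abnormal situation.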
Abstract:
Since the development of large-scale power grid interconnections and power markets, research on available transfer capability (ATC) has attracted great attention. The challenges for accurate assessment of ATC originate from the numerous uncertainties in the electricity generation, transmission, distribution and utilization sectors. Power system uncertainties can be mainly described as two types: randomness and fuzziness. However, the traditional transmission reliability margin (TRM) approach only considers randomness. Based on credibility theory, this paper first builds models of generators, transmission lines and loads according to their features of both randomness and fuzziness. A random fuzzy simulation is then applied, and a novel method for ATC assessment is proposed, in which both randomness and fuzziness are considered. The bootstrap method and multi-core parallel computing are introduced to enhance the processing speed. By implementing the simulation for the IEEE 30-bus system and a real-life system located in Northwest China, the viability of the models and the proposed method is verified.
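As an illustration of the bootstrap step mentioned in this abstract, resampling a set of simulated ATC values with replacement yields an empirical confidence interval for the estimate. This is a generic percentile-bootstrap sketch with data and names of our own choosing, not the authors' code.

```python
import random
from statistics import mean

def bootstrap_ci(samples, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean of `samples`:
    resample with replacement, recompute the mean, take empirical quantiles."""
    rng = random.Random(seed)
    boot_means = sorted(mean(rng.choices(samples, k=len(samples)))
                        for _ in range(n_boot))
    lo = boot_means[int(alpha / 2 * n_boot)]
    hi = boot_means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

The resampling loop is embarrassingly parallel, which is why a multi-core implementation, as the authors use, speeds it up almost linearly with the number of cores.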