947 results for model categories homotopy theory quillen functor equivalence derived adjunction cofibrantly generated


Relevance: 100.00%

Abstract:

A spatial process observed over a lattice or a set of irregular regions is usually modeled using a conditionally autoregressive (CAR) model. The neighborhoods within a CAR model are generally formed deterministically using the inter-distances or boundaries between the regions. An extension of the CAR model is proposed in this article in which the selection of the neighborhood depends on unknown parameter(s). This extension is called a Stochastic Neighborhood CAR (SNCAR) model. The resulting model shows flexibility in accurately estimating covariance structures for data generated from a variety of spatial covariance models. Specific examples are illustrated using data generated from some common spatial covariance functions, as well as real data concerning radioactive contamination of the soil in Switzerland after the Chernobyl accident.
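
For orientation, a commonly used CAR specification has the following conditional form (a generic sketch of the usual parameterisation, not necessarily the exact one adopted by the authors); the SNCAR idea is that the neighborhood itself depends on unknown parameters:

\[
  \phi_i \mid \phi_{-i} \;\sim\; N\!\left( \rho\,\frac{\sum_{j \in \partial_i(\theta)} w_{ij}\,\phi_j}{w_{i+}(\theta)},\; \frac{\sigma^2}{w_{i+}(\theta)} \right),
  \qquad w_{i+}(\theta) = \sum_{j \in \partial_i(\theta)} w_{ij},
\]

where $\partial_i(\theta)$ denotes the neighborhood of region $i$, written here as a function of an unknown parameter $\theta$ to indicate that the neighborhood selection is itself estimated from the data.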

Relevance: 100.00%

Abstract:

Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, checking these notions is computationally expensive (exponential in the general case), and they yield only a Boolean result. In many cases, however, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
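
To make the relations concrete, the following sketch derives behavioural-profile-style relations (strict order, interleaving, exclusiveness) for activity pairs from a handful of example traces. The traces and activity names are hypothetical, and the article itself computes such profiles structurally from sound free-choice workflow nets rather than from logs:

from itertools import product

# Illustrative only: classify each pair of activities by whether one can
# occur before the other in the example traces below.
traces = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["a", "d"],
]

activities = {x for t in traces for x in t}

def occurs_before(x, y):
    """True if x occurs before y in at least one trace."""
    for t in traces:
        if x in t and y in t and t.index(x) < t.index(y):
            return True
    return False

for x, y in product(sorted(activities), repeat=2):
    if x == y:
        continue
    before, after = occurs_before(x, y), occurs_before(y, x)
    if before and not after:
        rel = "strict order ->"
    elif after and not before:
        rel = "reverse order <-"
    elif before and after:
        rel = "interleaving ||"
    else:
        rel = "exclusiveness +"   # the pair never co-occurs in any trace
    print(f"{x} {rel} {y}")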

Relevance: 100.00%

Abstract:

Organisations employ Enterprise Social Networks (ESNs) (such as Yammer) expecting better intra-organisational communication and collaboration. However, ESNs are struggling to gain momentum and wide adoption among users. Promoting user participation is a challenge, particularly in relation to lurkers – the silent ESN members who do not contribute any content. Building on behaviour change research, we propose a three-route model consisting of the central, peripheral and coercive routes of influence that depict users' cognitive strategies, and we examine how management interventions (e.g. sending promotional emails) impact users' beliefs and (consequent) posting and lurking behaviours in ESNs. Furthermore, we identify users' salient motivations to lurk or post. We employ a multi-method research design to conceptualise, operationalise and validate the research model. This study has implications for academics and practitioners regarding the nature, patterns and outcomes of management interventions in prompting ESN use.

Relevance: 100.00%

Abstract:

As Business Process Management (BPM) evolves and organisations become more process oriented, the need for Expertise in BPM amongst practitioners has increased. Proactively managing Expertise in BPM is essential to unlock the potential of BPM as a management paradigm and source of competitive advantage. Whilst great attention is being paid by the BPM community to the technological aspects of BPM, relatively little research has been done concerning the expertise aspect of BPM. There is a substantial body of knowledge on expertise itself; however, at the time of writing no common framework exists that describes the fundamental attributes characterising Expertise in the illustrative context of BPM. Understanding and characterising Expertise in the context of BPM has direct implications for BPM itself, as a key strategic component and success factor, as well as for those involved in BPM. Expertise in the context of BPM needs to be characterised in order to understand it and to manage it proactively. Given the relative infancy of research into Expertise in the context of BPM, an exploration of its relevance and importance was considered essential to ensure the study itself is of value to the BPM field.

The aims of this research are, firstly, to address the two research questions 'why is expertise important and relevant in the context of BPM?' and 'how can Expertise in the context of BPM be characterised?', and secondly, to develop a comprehensive and validated a-priori model characterising Expertise in the illustrative context of BPM. The study is theory-guided. It was undertaken via an extensive literature review across relevant literature domains and a revelatory case study utilising several methods: informal discussions, an open-ended survey, and participant observation. An a-priori model was then developed, comprising several Constructs and Sub-constructs and several overall aspects of Expertise in BPM. This was followed by interviews conducted in the validation phase of the revelatory case study.

The primary contributions of this study are to the fields of expertise, BPM and research. Contributions to the field of expertise include a comprehensive review of the expertise literature in general and a synthesised critique of expertise research, a characterisation of expertise in an illustrative context as a system, and a comprehensive narrative of the dynamics and interrelationships of the core attributes characterising expertise. Contributions to the field of BPM include, firstly, establishing the importance of understanding Expertise in the context of BPM, including a comprehensive overview of its role, relevance and importance through an explanation of the effect of Expertise in BPM; and secondly, a model characterising Expertise in the context of BPM, which BPM practitioners can use to clearly articulate and illuminate the state of Expertise in BPM in organisations. Contributions to the field of research include an extended view of Systems Theory, reflecting the importance of the system context in systems thinking, and a narrative on ontological innovation through the positioning of ontology as a meta-model of Expertise in the context of BPM.

Relevance: 100.00%

Abstract:

Intelligent Transport Systems (ITS) have the potential to substantially reduce the number of crashes caused by human error at railway level crossings. Such systems, however, will only exert an influence on driving behaviour if they are accepted by the driver. This study aimed at assessing driver acceptance of different ITS interventions designed to enhance driver behaviour at railway crossings. Fifty-eight participants, divided into three groups, took part in a driving simulator study in which three ITS devices were tested: an in-vehicle visual ITS, an in-vehicle audio ITS, and an on-road valet system. Driver acceptance of each ITS intervention was assessed with a questionnaire guided by the Technology Acceptance Model and the Theory of Planned Behaviour. Overall, results indicated that the strongest intentions to use the ITS devices belonged to participants exposed to the road-based valet system at passive crossings. The utility of both models in explaining drivers' intention to use the systems is discussed, with results showing greater support for the Theory of Planned Behaviour. Directions for future studies, along with strategies that target attitudes and subjective norms to increase drivers' behavioural intentions, are also discussed.

Relevance: 100.00%

Abstract:

Theories of individual attitudes toward IT include task-technology fit (TTF), the technology acceptance model (TAM), the unified theory of acceptance and use of technology (UTAUT), cognitive fit, expectation disconfirmation, and computer self-efficacy. Examination of these theories reveals three main concerns. First, the theories mostly "black box" (or omit) the IT artifact. Second, appropriate mid-range theory has not been developed to contribute to disciplinary progress and to serve the needs of our practitioner community. Third, the theories are overlapping but incommensurable. We propose a theoretical framework that harmonizes these attitudinal theories and shows how they can be specialized to include relevant IS phenomena.

Relevance: 100.00%

Abstract:

Prevention and management of childhood overweight and obesity is a health priority for governments and communities throughout the developed world. A conceptual model, Research around Practice in Childhood Obesity (RAPICO), has been developed to guide capacity building in a coordinated 'bench to fieldwork' initiative to address this public health problem. Translation of research findings into sustainable responses with optimal fit requires consideration of context-specific relevance, cost-effectiveness, feasibility and levels of available support. The RAPICO model uses program theory to describe a framework for progressing practitioner-community-research partnerships to address low, medium and high levels of risk for childhood overweight and obesity within community settings. A case study describing the development of a logic model to inform risk-linked responses to childhood overweight and obesity is presented for the Ipswich community in south-east Queensland.

Relevance: 100.00%

Abstract:

The steady MHD mixed convection flow of a viscoelastic fluid in the vicinity of a two-dimensional stagnation point with a magnetic field has been investigated under the assumption that the fluid obeys the upper-convected Maxwell (UCM) model. Boundary layer theory is used to simplify the equations of motion, induced magnetic field and energy, which results in three coupled non-linear ordinary differential equations that are well-posed. These equations have been solved using a finite difference method. The results indicate a reduction in the surface velocity gradient, surface heat transfer and displacement thickness with an increase in the elasticity number. These trends are opposite to those reported in the literature for a second-grade fluid. The surface velocity gradient and heat transfer are enhanced by the magnetic and buoyancy parameters. The surface heat transfer increases with the Prandtl number, but the surface velocity gradient decreases.
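
For orientation, the classical Newtonian, non-magnetic plane stagnation-point limit of such a boundary-layer problem can be solved with a standard two-point BVP solver. The sketch below integrates the well-known Hiemenz similarity equations f''' + f f'' + 1 - (f')^2 = 0 and theta'' + Pr f theta' = 0; it is purely illustrative and is not the authors' UCM/MHD system or their finite difference scheme, and the Prandtl number is an assumed value:

import numpy as np
from scipy.integrate import solve_bvp

Pr = 0.7  # assumed Prandtl number for the illustration

def rhs(eta, y):
    # State vector: y = [f, f', f'', theta, theta']
    f, fp, fpp, th, thp = y
    return np.vstack([fp,
                      fpp,
                      -f * fpp - 1.0 + fp**2,   # Hiemenz momentum equation
                      thp,
                      -Pr * f * thp])           # energy equation

def bc(y0, yinf):
    # f(0) = 0, f'(0) = 0, f'(inf) = 1, theta(0) = 1, theta(inf) = 0
    return np.array([y0[0], y0[1], yinf[1] - 1.0, y0[3] - 1.0, yinf[3]])

eta = np.linspace(0.0, 10.0, 200)
y_init = np.zeros((5, eta.size))
y_init[0] = eta - 1.0 + np.exp(-eta)   # rough guess for f
y_init[1] = 1.0 - np.exp(-eta)         # rough guess for f'
y_init[2] = np.exp(-eta)               # rough guess for f''
y_init[3] = np.exp(-eta)               # rough guess for theta
sol = solve_bvp(rhs, bc, eta, y_init)
print("surface velocity gradient f''(0) =", sol.y[2, 0])
print("surface heat transfer  -theta'(0) =", -sol.y[4, 0])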

Relevance: 100.00%

Abstract:

The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S}_I$ and $\mathcal{S}_C$, and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S}_I$ and $\mathcal{S}_C$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. The average-case CC of the relevant greater-than (GT) function is characterized within two bits. In the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.
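
A minimal sketch of the second approach, under assumed parameters (the threshold, noise level, signal model and k-out-of-n counting fusion rule below are illustrative placeholders, not the paper's exact design):

import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters
n_sensors = 10
tau = 0.5            # per-sensor quantization threshold (assumed)
k = 4                # fusion rule: declare "intruder" if at least k bits are 1
noise_sigma = 0.3

def sensor_readings(intruder_present):
    # Readings have a higher underlying level when an intruder is present and a
    # weaker, diffuse level for clutter (crude stand-ins for points on the
    # surfaces S_I and S_C), observed in additive Gaussian noise.
    signal = 1.0 if intruder_present else 0.2
    return signal * rng.uniform(0.3, 1.0, n_sensors) + rng.normal(0.0, noise_sigma, n_sensors)

def local_bits(readings):
    # Each node broadcasts one bit from a two-level (threshold) quantizer.
    return (readings > tau).astype(int)

def fusion(bits):
    # Counting rule applied at the local fusion center.
    return int(bits.sum() >= k)

# Quick Monte Carlo check of detection and false-alarm rates
trials = 5000
pd = np.mean([fusion(local_bits(sensor_readings(True))) for _ in range(trials)])
pfa = np.mean([fusion(local_bits(sensor_readings(False))) for _ in range(trials)])
print(f"P(detect) ~ {pd:.3f}, P(false alarm) ~ {pfa:.3f}")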

Relevance: 100.00%

Abstract:

Societal reactions to the norm-breaking behavior of children reveal how we understand childhood, the relations between generations, and a community's threshold of tolerance. In Finland, children who repeatedly commit crimes receive social service measures based on the Child Welfare Act. In the city of Helsinki (Stadi in local slang), an agency specifically established for ill-behaving children existed until the 1980s, after which a unified agency for maltreated and maladjusted children was founded. Through five boys' welfare cases, this research aims at defining what kinds of positions, social relations and structures are constructed in the social dynamics of these children's everyday lives. The cases cover different decades from the 1940s to the present. At the same time, the cases reflect child welfare and societal practices, and reveal how communities have participated in constructing deviance in different eras.

The research is meta-theoretically based on critical realism, specifically on Roy Bhaskar's transformational model of social activity. The cases are analyzed in the framework of Edwin M. Lemert's societal reaction theory. Thus the focus of the study is on the wide structural context of institutional and societal definitions of deviance. The research is methodologically based on a qualitative multiple-case study design. The primary data consist of classified child welfare case files collected from the archives of the city of Helsinki. The data at the institutional level consist of annual reports from 1943 to 2004, ordinances from 1907 onwards, and various committee documents produced in the law-making processes of 20th-century child welfare, youth and criminal legislation. Empirical findings are interpreted in a dialogue with previous historical and child welfare research, contemporary literature and studies on urban development. The analysis is based on Derek Layder's model of adaptive theory.

The research forms a viewpoint on the historical study of child welfare in which the historical era, its agents and the dynamics of their mutual relations are studied through an individual-level reconstruction based on societal reaction theory. The case analyses reveal how the positions of the children are formed differently in different eras of child welfare practice. In the 1940s the child is positioned as a psychopath and a criminal type; the measures are aimed at protecting the community from the disturbed child, and at adjusting the individual by isolation. From the 1960s to the 1980s the child is positioned as a child in need of help and support; the child becomes a victim, a subject with rights, and a target of protection. At the turn of the millennium a norm-breaking child is positioned as a dangerous individual who, in the name of community safety, has to be confined. The case analyses also reveal the prevailing academic and practical paradigms of each period.

Keywords: childhood, youth, child protection, child welfare, delinquency, crime, deviance, history, critical realism, case study research

Relevance: 100.00%

Abstract:

Neuroblastoma has successfully served as a model system for the identification of neuroectoderm-derived oncogenes. However, in spite of various efforts, only a few clinically useful prognostic markers have been found. Here, we present a framework, which integrates DNA, RNA and tissue data to identify and prioritize genetic events that represent clinically relevant new therapeutic targets and prognostic biomarkers for neuroblastoma.

Relevance: 100.00%

Abstract:

We show how, for large classes of systems with purely second-class constraints, further information can be obtained about the constraint algebra. In particular, a subset consisting of half the full set of constraints is shown to have vanishing mutual brackets. Some other constraint brackets are also shown to be zero. The class of systems for which our results hold includes examples from non-relativistic particle mechanics as well as relativistic field theory. The results are derived at the classical level for Poisson brackets, but in the absence of commutator anomalies the same results will hold for the commutators of the constraint operators in the corresponding quantised theories.
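
As a trivial orienting example (a textbook-style toy system, not one of the examples treated in the paper): a theory with $2n$ second-class constraints

\[
  \chi_{2a-1} = q^a \approx 0, \qquad \chi_{2a} = p_a \approx 0, \qquad a = 1, \dots, n,
\]

has $\{\chi_{2a-1}, \chi_{2b}\} = \delta_{ab}$, so the constraint matrix is invertible and the full set is second class, while the subset $\{\chi_2, \chi_4, \dots, \chi_{2n}\} = \{p_1, \dots, p_n\}$, comprising half the constraints, has vanishing mutual Poisson brackets. The article establishes statements of this kind for much broader classes of constrained systems.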

Relevance: 100.00%

Abstract:

A new theory of gravitation has been proposed in a space-time more general than the Riemannian one. It is a generalization of the ECSK and Brans-Dicke (BD) theories of gravitation. It is found that, in contrast to the standard ECSK theory, a parity-violating propagating torsion is generated by the BD scalar field. An interesting consequence of the theory is that it can successfully predict solar system experimental results to the desired accuracy.

Relevance: 100.00%

Abstract:

The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S}_I$ and $\mathcal{S}_C$, and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S}_I$ and $\mathcal{S}_C$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication-complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. Extensions to the multi-party case are straightforward and are briefly discussed. The average-case CC of the relevant greater-than (GT) function is characterized within two bits. Under the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.

Relevance: 100.00%

Abstract:

The last few decades have witnessed the application of graph theory and of topological indices derived from molecular graphs in structure-activity analysis. Such applications are based on regression and various multivariate analyses. Most topological indices are computed for the whole molecule and used as descriptors for explaining the properties and activities of chemical compounds. However, some substructural descriptors in the form of topological-distance-based vertex indices have been found useful for identifying activity-related substructures and for predicting the pharmacological and toxicological activities of bioactive compounds. Another important aspect of drug discovery, namely the design of novel pharmaceutical candidates, can also be approached via the distance distribution associated with such vertex indices. In this article, we review the development and applications of this approach both in activity prediction and in the design of novel compounds.
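
As a concrete instance of a distance-based topological index, the Wiener index sums the shortest-path distances over all vertex pairs of the hydrogen-depleted molecular graph; it is used here only as a familiar illustration of the general idea, not as one of the specific vertex indices reviewed in the article:

import networkx as nx

# Hydrogen-depleted molecular graph of 2-methylbutane (illustrative example):
# carbon chain C1-C2-C3-C4 with a methyl branch C5 attached to C2.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (2, 5)])

# Wiener index: sum of shortest-path distances over all unordered vertex pairs.
dist = dict(nx.all_pairs_shortest_path_length(G))
wiener = sum(dist[u][v] for u in G for v in G if u < v)
print("Wiener index:", wiener)   # prints 18 for 2-methylbutane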