64 results for Multi-level Analysis


Relevance: 100.00%

Publisher:

Abstract:

Concern with what can explain variation in generalized social trust has led to an abundance of theoretical models. Defining generalized social trust as a belief in human benevolence, we focus on the emancipation theory and social capital theory as well as the ethnic diversity and economic development models of trust. We then determine which dimensions of individuals’ behavior and attitudes, as well as of their national context, are the most important predictors. Using data from 20 countries that participated in round one of the European Social Survey, we test these models at their respective levels of analysis, individual and/or national. Our analysis revealed that individuals’ own trust in the political system as a moral and competent institution was the most important predictor of generalized social trust at the individual level, while a country’s level of affluence was the most important contextual predictor, indicating that different dimensions are significant at the two levels of analysis. This analysis also raised further questions as to the meaning of social capital at the two levels of analysis and the conceptual equivalence of its civic engagement dimension across cultures.
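
For readers who want to see what such a two-level analysis looks like in practice, the sketch below fits a random-intercept model of individuals nested within countries on synthetic data. The variable names (political_trust, gdp_per_capita, social_trust) and the use of statsmodels are illustrative assumptions, not the authors' actual ESS variables or specification.

    # A minimal sketch of a two-level random-intercept model of the kind the
    # abstract describes (individuals nested within countries), fitted on
    # synthetic data. Variable names and effect sizes are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_countries, n_per_country = 20, 100

    gdp = rng.normal(0, 1, n_countries)               # country-level predictor (affluence)
    country_effect = rng.normal(0, 0.5, n_countries)  # unobserved country heterogeneity

    rows = []
    for c in range(n_countries):
        political_trust = rng.normal(0, 1, n_per_country)   # individual-level predictor
        social_trust = (0.6 * political_trust                # individual-level effect
                        + 0.4 * gdp[c]                       # contextual effect of affluence
                        + country_effect[c]
                        + rng.normal(0, 1, n_per_country))
        rows.append(pd.DataFrame({"country": c,
                                  "political_trust": political_trust,
                                  "gdp_per_capita": gdp[c],
                                  "social_trust": social_trust}))
    data = pd.concat(rows, ignore_index=True)

    # Random-intercept multilevel model: fixed effects at both levels,
    # random intercept for country.
    model = smf.mixedlm("social_trust ~ political_trust + gdp_per_capita",
                        data, groups=data["country"])
    print(model.fit().summary())

Here the fixed effect of gdp_per_capita stands in for the contextual predictor, while the country random intercept absorbs the remaining between-country variation.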

Relevance: 100.00%

Publisher:

Abstract:

Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Relationships between pairs of factors have also been studied extensively, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
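
As a point of reference for the rough-set terminology, the sketch below identifies all minimal reducts of a tiny, invented decision table by brute force; each reduct supports a different rule set, which is the sense in which multiple reducts yield multi-knowledge. This is only an illustration of the reduct concept, not the authors' combination of rough sets with fuzzy logic and particle swarm optimisation, and the crime-factor attributes are hypothetical.

    # Brute-force identification of all minimal reducts of a toy decision table,
    # using the rough-set criterion that a reduct must predict the decision
    # exactly as well as the full attribute set does. Attributes and rows are
    # hypothetical (impulsivity and family_history are deliberately redundant).
    from itertools import combinations

    attributes = ["age_group", "impulsivity", "family_history"]
    table = [
        # (age_group, impulsivity, family_history) -> violent? (decision)
        (("young", "high", "yes"), "yes"),
        (("young", "low",  "no"),  "no"),
        (("adult", "high", "yes"), "yes"),
        (("adult", "low",  "no"),  "no"),
    ]

    def consistent(attr_idx):
        """True if objects identical on the chosen attributes never disagree on the decision."""
        seen = {}
        for values, decision in table:
            key = tuple(values[i] for i in attr_idx)
            if seen.setdefault(key, decision) != decision:
                return False
        return True

    # A reduct is a minimal attribute subset that is still consistent.
    reducts = []
    for size in range(1, len(attributes) + 1):
        for subset in combinations(range(len(attributes)), size):
            if consistent(subset) and not any(set(r) <= set(subset) for r in reducts):
                reducts.append(subset)

    for r in reducts:
        print("reduct:", [attributes[i] for i in r])
    # -> two reducts here ({impulsivity} and {family_history}), i.e. two
    #    alternative rule sets describing the same data: multi-knowledge.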

Relevance: 100.00%

Publisher:

Abstract:

From a macro perspective, it is widely acknowledged that university incubation models within a region are important stimulants of economic development through innovation and job creation. With the emergence of quadruple helix innovation ecosystems, universities have had to re-evaluate their incubation activity and models to engage more fully with industry and end users. However, within a given region, the type of university may influence its ability to engage with quadruple helix stakeholders and consequently affect its incubation activity. To date, there is a scarcity of research exploring this 'meso' environment and its subsequent impact on university incubation models. The aim of this paper is therefore to use a stakeholder lens to explore university incubation models within their unique regional and organisational characteristics and constraints. The research methodology employed was a comparative case analysis of incubation at two different universities within a peripheral UK region. Variances were found between the two universities' incubation models, resulting from both regional (macro-environment) and organisational (meso-environment) influences (i.e. university type). This research contributes to both regional and national agendas by empirically illustrating the need for appropriate design and tailoring of university incubation models (via acknowledgement of quadruple helix stakeholder influence) to incorporate contextual influences rather than adopting a best-practice approach.

Relevance: 100.00%

Publisher:

Abstract:

Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services for different users based on their quality requirements is becoming increasingly important. For this, routers need the capability to distinguish and isolate traffic belonging to different flows. This ability to determine the flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification due to their non-deterministic performance. Although content addressable memories (CAMs) are favoured by technology vendors for their deterministic, high lookup rates, they suffer from high power dissipation and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that mixes CAMs with algorithms based on multi-level cutting of the classification space into smaller subspaces. The proposed solution utilizes the geometrical distribution of rules in the classification space. It provides the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.
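
As background on the cutting idea (the abstract does not detail the CAM/algorithm split, so that hybrid is not modelled), the sketch below recursively cuts a two-field rule space into equal sub-regions until each region holds only a few rules, then classifies a packet by walking the resulting tree; the rule table and fields are invented.

    # A simplified illustration of multi-level cutting for packet classification:
    # the rule space over two fields (source and destination port here) is
    # recursively cut into equal sub-regions until each region holds only a few
    # rules; classification then walks the tree and linearly scans a small leaf.

    # Each rule: (priority, (src_lo, src_hi), (dst_lo, dst_hi)); lower number wins.
    RULES = [
        (0, (0, 1023),     (80, 80)),        # low source ports to port 80
        (1, (0, 65535),    (0, 1023)),       # any source to well-known ports
        (2, (1024, 65535), (1024, 65535)),   # ephemeral-to-ephemeral traffic
        (3, (0, 65535),    (0, 65535)),      # default rule
    ]
    LEAF_SIZE = 2   # stop cutting once a region holds at most this many rules
    CUTS = 2        # equal cuts per level (binary cuts for simplicity)

    def overlaps(interval, lo, hi):
        return interval[0] <= hi and interval[1] >= lo

    def build(rules, src, dst, depth=0):
        """Recursively cut the (src, dst) region, alternating the cut dimension."""
        if len(rules) <= LEAF_SIZE or depth > 16:   # depth cap: some rules genuinely overlap
            return ("leaf", rules)
        dim = depth % 2                             # 0: cut the src range, 1: cut the dst range
        lo, hi = (src, dst)[dim]
        step = (hi - lo + 1) // CUTS or 1
        children = []
        for i in range(CUTS):
            c_lo = lo + i * step
            c_hi = hi if i == CUTS - 1 else c_lo + step - 1
            sub_src, sub_dst = ((c_lo, c_hi), dst) if dim == 0 else (src, (c_lo, c_hi))
            sub_rules = [r for r in rules
                         if overlaps(r[1], *sub_src) and overlaps(r[2], *sub_dst)]
            children.append(((sub_src, sub_dst), build(sub_rules, sub_src, sub_dst, depth + 1)))
        return ("node", children)

    def classify(tree, src_port, dst_port):
        """Walk the cutting tree, then match the few rules left in the leaf."""
        kind, payload = tree
        while kind == "node":
            for (s, d), child in payload:
                if s[0] <= src_port <= s[1] and d[0] <= dst_port <= d[1]:
                    kind, payload = child
                    break
            else:
                return None
        matches = [p for p, s, d in payload
                   if s[0] <= src_port <= s[1] and d[0] <= dst_port <= d[1]]
        return min(matches) if matches else None

    tree = build(RULES, (0, 65535), (0, 65535))
    print(classify(tree, 500, 80))       # -> 0 (most specific matching rule)
    print(classify(tree, 40000, 50000))  # -> 2

In a CAM-assisted design of the kind the abstract describes, the small per-region rule sets would be the natural candidates to place in CAM for deterministic lookup, but that mapping is not specified in the abstract and is not attempted here.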

Relevance: 100.00%

Publisher:

Abstract:

Adaptability to changing circumstances is a key feature of living creatures. Understanding such adaptive processes is central to developing successful autonomous artifacts. In this paper two perspectives are brought to bear on the issue of adaptability. The first is a short-term perspective, which looks at adaptability in terms of the interactions between the agent and the environment. The second perspective involves a hierarchical evolutionary model which seeks to identify higher-order forms of adaptability based on the concept of adaptive meta-constructs. Task-oriented and agent-centred models of adaptive processes in artifacts are considered from these two perspectives. The former is represented by the fitness function approach found in evolutionary learning, and the latter in terms of the concepts of empowerment and homeokinesis found in models derived from the self-organizing systems approach. A meta-construct approach to adaptability based on the identification of higher-level meta-metrics is also outlined.
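
To ground the fitness-function approach mentioned above, the toy sketch below evolves a population of candidate parameter vectors against an explicitly specified task objective; the task, parameters, and selection scheme are invented for illustration, and agent-centred measures such as empowerment or homeokinesis are not modelled.

    # A minimal sketch of the task-oriented, fitness-function view of adaptation:
    # a population of candidate parameter vectors is evolved against an externally
    # specified objective (matching a hidden target vector).
    import random

    random.seed(1)
    TARGET = [0.2, -0.5, 0.9, 0.0]       # the "task" the artifact must adapt to

    def fitness(genome):
        """Task-oriented fitness: negative squared distance to the target."""
        return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

    def mutate(genome, sigma=0.1):
        return [g + random.gauss(0, sigma) for g in genome]

    # Truncation selection plus mutation: keep the fittest, refill by mutating them.
    population = [[random.uniform(-1, 1) for _ in TARGET] for _ in range(30)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]
        population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]

    best = max(population, key=fitness)
    print("best:", [round(g, 3) for g in best], "fitness:", round(fitness(best), 4))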