944 results for Complexity theory


Relevance: 20.00%

Abstract:

We test the broken windows theory using a field experiment in a shared area of an academic workplace (the department common room). More specifically, we explore academics’ and postgraduate students’ behavior under an order condition (a clean environment) and a disorder condition (a messy environment). We find strong evidence that signs of disorderly behavior trigger littering: subjects litter in 59% of cases in the disorder condition, compared with 18% in the order condition. These results remain robust in a multivariate analysis, even when controlling for a large set of factors not directly examined by previous studies. Overall, when academic staff and postgraduate students observe that others have violated the social norm of keeping the common room clean, all else being equal, the probability of littering increases by around 40 percentage points.

Relevance: 20.00%

Abstract:

Substantial research effort has been expended to deal with the complexity inherent in the analysis of concurrent systems, e.g., work that tackles the well-known state space explosion problem. Approaches differ in the classes of properties they are able to check, largely as a result of how they balance the trade-off between analysis time and the space employed to describe a concurrent system. One interesting class of properties concerns behavioral characteristics, which are conveniently expressed in terms of computations, or runs, of concurrent systems. This article introduces the theory of untanglings, which exploits a particular representation of a collection of runs in a concurrent system. It is shown that a representative untangling of a bounded concurrent system can be constructed that captures all and only the behavior of the system. Representative untanglings strike a unique balance between time and space, yet provide a single model for the convenient extraction of various behavioral properties. Performance measurements, in terms of construction time and the size of representative untanglings relative to the original specifications of concurrent systems, conducted on a collection of models from practice, confirm the scalability of the approach. Finally, the article demonstrates the practical benefits of using representative untanglings when checking various behavioral properties of concurrent systems.
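The state space explosion the abstract refers to is easy to see on a toy example: the number of distinct interleavings (runs) of a few independent sequential processes grows combinatorially with the number of processes. The Python sketch below illustrates only that explosion; it is not the untangling construction described in the article, and the process and step counts are arbitrary.

```python
from math import factorial

def interleavings(num_procs: int, steps_per_proc: int) -> int:
    """Count the distinct interleavings (runs) of independent sequential
    processes, each executing `steps_per_proc` steps.
    This is the multinomial coefficient (n*k)! / (k!)^n."""
    total = num_procs * steps_per_proc
    return factorial(total) // factorial(steps_per_proc) ** num_procs

# The number of runs explodes as processes are added:
for n in range(1, 7):
    print(n, "processes ->", interleavings(n, 3), "interleavings")
```

Explicit enumeration of all runs quickly becomes infeasible, which is why compact representations of collections of runs, such as the representative untanglings studied in the article, matter.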

Relevance: 20.00%

Abstract:

Australian queer (GLBTIQ) university student activist media is an important site of self-representation. Community media is a significant site for the development of queer identity and community, and a key part of queer politics. This paper reviews my research into queer student media, which is grounded in a queer theoretical perspective. Rob Cover argues that queer theoretical approaches that study media products fail to consider the material contexts that contribute to their construction. I use an ethnographic approach to examine how editors construct queer identity and community in queer student media. My research contributes to queer media scholarship by addressing the gap that Cover identifies, and to the rich scholarship on negotiations of queer community.

Relevance: 20.00%

Abstract:

A clear understanding of the cognitive-emotional processes underpinning desires to overconsume foods and adopt sedentary lifestyles can inform the development of more effective interventions to promote healthy eating and physical activity. The Elaborated Intrusion Theory of Desires offers a framework that can help in this endeavor through its emphases on the roles of intrusive thoughts and the elaboration of multisensory imagery. There is now substantial evidence that tasks that compete with food-related imagery for limited working memory resources can reduce desires to eat that food, and that positive imagery can promote functional behavior. Mindfulness meditation can also short-circuit the elaboration of dysfunctional cognition. Functional Decision Making is an approach that applies laboratory-based research on desire to provide a motivational intervention to establish and entrench behavior changes, so that healthy eating and physical activity become everyday habits.

Relevance: 20.00%

Abstract:

Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model, and subsequently understand, motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving accounts for a significant proportion of crashes, yet is rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and chronic roadside distractions (advertising, pedestrians, etc.) play a role in crash occurrence and yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g., AADT, number of lanes, speed limits, lane widths, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events that arise from a single underlying crash-generating process. These models and their statistical kin dominate the literature; however, it is argued in this paper that they fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. The paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant and will lead to poor decision-making. Exploiting the current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and ‘apparent’ random influences that largely reflect driver behavior. It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that the resulting model is a more realistic depiction of reality than the state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
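As a rough illustration of the latent-mixture idea (not the paper's actual model), the sketch below simulates site-level crash counts as the sum of three independent processes: a component driven by an observed network feature, a spatially shared random effect, and a purely behavioral noise component. The aggregate counts are overdispersed across sites, which is exactly what a single NB fit would absorb into its dispersion parameter without separating the underlying sources. All variable names and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 1000

# 1) Observed network features (e.g., traffic exposure) -> expected counts.
aadt = rng.uniform(1_000, 20_000, n_sites)       # hypothetical exposure values
mu_network = 0.0002 * aadt

# 2) Unobserved spatial effects: sites in the same "region" share a random effect.
region = rng.integers(0, 50, n_sites)
region_effect = rng.gamma(shape=2.0, scale=0.5, size=50)
mu_spatial = region_effect[region]

# 3) "Apparent" random behavioral influences (e.g., distraction, impairment).
mu_behavior = rng.exponential(scale=0.5, size=n_sites)

# Observed crash counts mix the three generating processes.
counts = (rng.poisson(mu_network)
          + rng.poisson(mu_spatial)
          + rng.poisson(mu_behavior))

# A single homogeneous Poisson process would give mean == variance across sites;
# the heterogeneous mixture is overdispersed, which an NB model soaks up in one
# dispersion parameter without explaining where the extra variance comes from.
print("mean:", counts.mean(), "variance:", counts.var())
```

Modeling the three components separately, as the paper advocates, attributes the overdispersion to interpretable causes rather than to a nuisance parameter.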

Relevance: 20.00%

Abstract:

Underlying all assessments are human judgements regarding the quality of students’ understandings. Despite their ubiquity, these judgements are conceptually elusive. The articles selected for inclusion in this issue explore the complexity of judgement practice, raising critical questions that challenge existing views and accepted policy and practice.

Relevance: 20.00%

Abstract:

What are the information practices of teen content creators? In the United States, over two thirds of teens have participated in creating and sharing content in online communities developed for the purpose of allowing users to be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience that information. From this investigation emerged a model of their information practices while creating and sharing content such as film-making, visual art, storytelling, music, programming, and web site design in digital participatory communities. The research uses grounded theory methodology in a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens’ digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied throughout. As findings were constructed from the data, additional data were collected until a substantive theory had been developed and no new information emerged. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing these five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. Information actions include activities in the categories of gathering, thinking, and creating. The experiences of information and the information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact and build upon one another, as represented in a graphic model and accompanying explanation.

Relevance: 20.00%

Abstract:

Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

Relevance: 20.00%

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, a set of indices, usually both qualitative and quantitative, is employed. However, it may not be possible to synthesize these indices directly, and different degrees of interaction may exist among them. In existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among indices are not taken into account. The vague set, an extension of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Against this background, vague sets are first employed in this work to represent the indices and facilitate comparisons among them. Then the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
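For readers unfamiliar with vague sets: a vague set records, for each element, a truth-membership t and a false-membership f with t + f <= 1, so the true degree of membership is only known to lie in the interval [t, 1 - f]. The sketch below shows this basic representation and a simple (t - f) score for comparing vague values; it is not the paper's vague-valued fuzzy measure model, and the index names and numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VagueValue:
    """A vague-set membership value: the degree of membership lies in
    [t, 1 - f], where t is truth-membership and f is false-membership."""
    t: float  # evidence for membership
    f: float  # evidence against membership

    def __post_init__(self):
        assert 0.0 <= self.t and 0.0 <= self.f and self.t + self.f <= 1.0

    @property
    def interval(self) -> tuple[float, float]:
        return (self.t, 1.0 - self.f)

    def score(self) -> float:
        # A simple score function (t - f) commonly used to rank vague values.
        return self.t - self.f

# Hypothetical indices of one candidate black-start strategy as vague values.
strategy = {
    "restoration_speed": VagueValue(t=0.7, f=0.1),
    "generator_startup_reliability": VagueValue(t=0.5, f=0.3),
    "voltage_quality": VagueValue(t=0.6, f=0.2),
}
for name, v in strategy.items():
    print(name, v.interval, "score:", round(v.score(), 2))
```

The interval form is what lets qualitative judgements ("fairly reliable, some doubt") and quantitative indices be expressed in a common representation before they are aggregated.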

Relevance: 20.00%

Abstract:

Student performance on examinations is influenced by the level of difficulty of the questions. It seems reasonable to propose, therefore, that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
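The kind of analysis described, relating per-question complexity scores to a difficulty rating, can be sketched in a few lines. The example below uses randomly generated, clearly synthetic scores on the six measures named in the abstract (the scales and the use of Pearson correlation are assumptions for illustration, not the study's actual data or method).

```python
import numpy as np

rng = np.random.default_rng(1)
n_questions = 40

# Hypothetical per-question scores on the six complexity measures.
measures = {
    "external_domain_references": rng.integers(0, 3, n_questions),
    "explicitness": rng.integers(0, 3, n_questions),
    "linguistic_complexity": rng.integers(0, 4, n_questions),
    "conceptual_complexity": rng.integers(0, 4, n_questions),
    "code_length": rng.integers(0, 30, n_questions),
    "bloom_level": rng.integers(1, 7, n_questions),
}
# Synthetic difficulty rating that loosely tracks the measures (illustration only).
difficulty = sum(m / m.max() for m in measures.values()) + rng.normal(0, 0.5, n_questions)

# Correlate each complexity measure with the difficulty rating.
for name, scores in measures.items():
    r = np.corrcoef(scores, difficulty)[0, 1]
    print(f"{name:30s} r = {r:+.2f}")
```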

Relevance: 20.00%

Abstract:

Lankes and Silverstein (2006) introduced the “participatory library” and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this work has been based on empirical data, and such models have had a strong focus on technical aspects but less on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for the study: six librarians took part in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study’s findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.

Relevance: 20.00%

Abstract:

Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by it. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could produce similar positive effects.

Relevance: 20.00%

Abstract:

According to Karl Popper, widely regarded as one of the greatest philosophers of science of the 20th century, falsifiability is the primary characteristic that distinguishes scientific theories from ideologies or dogma. For example, if schools were to treat creationism as a scientific theory comparable to modern theories of evolution, advocates of creationism would need to engage in the generation of falsifiable hypotheses and abandon the practice of discouraging questioning and inquiry. Ironically, scientific theories themselves are accepted or rejected based on a principle that might be called survival of the fittest. So, for healthy theories of development to emerge, four Darwinian functions should operate: (a) variation, avoiding orthodoxy and encouraging divergent thinking; (b) selection, submitting all assumptions and innovations to rigorous testing; (c) diffusion, encouraging the shareability of new and/or viable ways of thinking; and (d) accumulation, encouraging the reusability of viable aspects of productive innovations.