997 results for question complexity


Relevance:

80.00%

Publisher:

Abstract:

Online enquiry communities such as Question Answering (Q&A) websites allow people to seek answers to all kinds of questions. With the growing popularity of such platforms, it is important for community managers to constantly monitor the performance of their communities. Although different metrics have been proposed for tracking the evolution of such communities, maturity, the process by which communities become more topic proficient over time, has been largely ignored despite its potential to help identify robust communities. In this paper, we interpret community maturity as the proportion of complex questions in a community at a given time. We use Server Fault (SF), a Q&A community of system administrators, as our case study and analyse question complexity, the level of expertise required to answer a question. We show that question complexity depends on both the length of involvement and the level of contributions of the users who post questions within their community. We extract features relating to askers, answerers, questions and answers, and analyse which features are strongly correlated with question complexity. Although our findings highlight the difficulty of automatically identifying question complexity, we find that complexity is influenced most by the topical focus and the length of community involvement of askers. Following the identification of question complexity, we define a measure of maturity and analyse the evolution of different topical communities. Our results show that different topical communities exhibit different maturity patterns: some show high maturity from the beginning, while others mature slowly. Copyright 2013 ACM.
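A minimal sketch of the maturity measure as defined above (the proportion of complex questions posted in a given period); the data layout and the complexity labels are illustrative assumptions, not the paper's dataset:

```python
from collections import defaultdict

def maturity_by_period(questions):
    """Compute maturity per period as the share of complex questions.

    `questions` is an iterable of (period, is_complex) pairs, where
    `period` identifies a time bucket (e.g. a month) and `is_complex`
    is a boolean label for the question. Both are illustrative inputs.
    """
    totals = defaultdict(int)
    complex_counts = defaultdict(int)
    for period, is_complex in questions:
        totals[period] += 1
        if is_complex:
            complex_counts[period] += 1
    return {p: complex_counts[p] / totals[p] for p in totals}

# Example: maturity rises as the share of complex questions grows.
sample = [("2011-01", False), ("2011-01", True),
          ("2011-02", True), ("2011-02", True), ("2011-02", False)]
print(maturity_by_period(sample))  # {'2011-01': 0.5, '2011-02': 0.666...}
```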

Relevance:

70.00%

Publisher:

Abstract:

Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessing the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the subjective assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
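A hedged sketch of the kind of per-measure correlation analysis described above, assuming per-question scores for the six measures and a subjective difficulty rating; the column names, values and the choice of Spearman rank correlation are our illustration, not taken from the paper:

```python
import pandas as pd

# Illustrative per-question data: six complexity measures plus a
# subjective difficulty rating (1 = low, 3 = high). Values are made up.
questions = pd.DataFrame({
    "external_domain_refs":  [0, 2, 1, 3],
    "explicitness":          [3, 1, 2, 1],
    "linguistic_complexity": [1, 2, 2, 3],
    "conceptual_complexity": [1, 2, 3, 3],
    "code_length":           [5, 20, 12, 35],
    "bloom_level":           [1, 3, 2, 4],
    "subjective_difficulty": [1, 2, 2, 3],
})

measures = [c for c in questions.columns if c != "subjective_difficulty"]
# Spearman rank correlation of each measure with perceived difficulty.
for m in measures:
    rho = questions[m].corr(questions["subjective_difficulty"], method="spearman")
    print(f"{m}: rho = {rho:.2f}")
```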

Relevance:

30.00%

Publisher:

Abstract:

Project Management (PM) as an academic field is relatively new in Australian universities. Moreover, the field is distributed across four main areas: business (management), built environment and construction, engineering and, more recently, ICT (information systems). At an institutional level, with notable exceptions, there is little engagement between researchers working in those individual areas. Consequently, an initiative was launched in 2009 to create a network of PM researchers to build a disciplinary base for PM in Australia. The initiative took the form of a bi-annual forum. The first forum established the constituency and spread of PM research in Australia (Sense et al., 2011). This special issue of IJPM arose out of the second forum, held in 2012, which explored the notion of an Australian perspective on PM. At the forum, researchers were invited to collaborate to explore issues, methodological approaches, and theoretical positions underpinning their research and to answer the question: is there a distinctly Australian research agenda which responds to the current challenges of large and complex projects in our region? From a research point of view, it was abundantly clear at the forum that many of the issues facing Australian researchers are shared around the world. However, what emerged from the forum as the Australian perspective was a set of themes and research issues that dominate the Australian research agenda.

Relevance:

30.00%

Publisher:

Abstract:

In 2001, 45% (2.7 billion) of the world’s population of approximately 6.1 billion lived in ‘moderate poverty’ on less than US$2 per person per day (World Population Summary, 2012). Over the last 60 years many theories have attempted to explain development, and why some countries experience the fastest growth in history while others stagnate, yet none fully accounts for the differences. Traditional views imply that development is the aggregation of successes from multiple individual business enterprises, but this ignores the interactions between and among institutions, organisations and individuals in the economy, which can often have unpredictable effects. Complexity Development Theory (CDT) proposes that by viewing development as an emergent property of society, we can help create better development programs at the organisational, institutional and national levels. This paper asks how the principles of complex adaptive systems (CAS) can be used to develop CDT principles for designing and operating development programs at the bottom of the pyramid in developing economies. To investigate this research question we conduct a literature review to define and describe CDT and create propositions for testing. We illustrate these propositions using a case study of an Asset Based Community Development (ABCD) program for existing and nascent entrepreneurs in the Democratic Republic of the Congo (DRC). We found evidence that all the principles of CDT were related to the characteristics of CAS. If this is the case, development programs will be able to select which CAS characteristics to draw on, although further work is needed to test these propositions.

Relevance:

30.00%

Publisher:

Abstract:

In the POSSIBLE WINNER problem in computational social choice theory, we are given a set of partial preferences and the question is whether a distinguished candidate can be made the winner by extending the partial preferences to linear preferences. Previous work has provided, for many common voting rules, fixed-parameter tractable algorithms for the POSSIBLE WINNER problem, with the number of candidates as the parameter. However, the corresponding kernelization question is still open and has in fact been mentioned as a key research challenge [10]. In this paper, we settle this open question for many common voting rules. We show that the POSSIBLE WINNER problem for maximin, Copeland, Bucklin, ranked pairs, and a class of scoring rules that includes the Borda voting rule does not admit a polynomial kernel with the number of candidates as the parameter. We show, however, that the COALITIONAL MANIPULATION problem, an important special case of the POSSIBLE WINNER problem, does admit a polynomial kernel for maximin, Copeland, ranked pairs, and a class of scoring rules that includes the Borda voting rule, when the number of manipulators is polynomial in the number of candidates. A significant conclusion of our work is that the POSSIBLE WINNER problem is harder than the COALITIONAL MANIPULATION problem, since the latter admits a polynomial kernel whereas the former does not. (C) 2015 Elsevier B.V. All rights reserved.
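An illustrative brute-force check of the POSSIBLE WINNER question for the Borda rule on tiny instances, shown only to make the problem definition concrete (the paper's contribution concerns kernelization lower and upper bounds, not this enumeration; the co-winner convention and the encoding of partial votes as sets of pairwise constraints are our assumptions):

```python
from itertools import permutations, product

def borda_scores(profile, candidates):
    """Borda score of each candidate given complete rankings (best first)."""
    m = len(candidates)
    scores = {c: 0 for c in candidates}
    for ranking in profile:
        for pos, c in enumerate(ranking):
            scores[c] += m - 1 - pos
    return scores

def extends(ranking, constraints):
    """True if a complete ranking respects the given pairwise constraints."""
    pos = {c: i for i, c in enumerate(ranking)}
    return all(pos[a] < pos[b] for a, b in constraints)

def possible_winner_borda(candidates, partial_votes, distinguished):
    """Brute force: can each partial vote be extended to a linear order
    so that the distinguished candidate has a (co-)winning Borda score?"""
    extensions_per_vote = [
        [r for r in permutations(candidates) if extends(r, constraints)]
        for constraints in partial_votes
    ]
    for profile in product(*extensions_per_vote):
        scores = borda_scores(profile, candidates)
        if scores[distinguished] == max(scores.values()):
            return True
    return False

# Tiny example: two voters, each vote given as a set of (before, after) pairs.
candidates = ["a", "b", "c"]
partial_votes = [{("a", "b")}, {("c", "b")}]
print(possible_winner_borda(candidates, partial_votes, "a"))  # True
```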

Relevance:

30.00%

Publisher:

Abstract:

For any q > 1, let MOD_q be a quantum gate that determines whether the number of 1s in the input is divisible by q. We show that for any q, t > 1, MOD_q is equivalent to MOD_t (up to constant depth). Based on the case q = 2, Moore has shown that quantum analogs of AC^(0), ACC[q], and ACC, denoted QAC^(0)_wf, QACC[2], and QACC respectively, define the same class of operators, leaving q > 2 as an open question. Our result resolves this question, implying that QAC^(0)_wf = QACC[q] = QACC for all q. We also prove the first upper bounds for QACC in terms of related language classes. We define classes of languages EQACC, NQACC (both for arbitrary complex amplitudes) and BQACC (for rational-number amplitudes) and show that they are all contained in TC^(0). To do this, we show that a TC^(0) circuit can keep track of the amplitudes of the state resulting from the application of a QACC operator using a constant-width, polynomial-size tensor sum. In order to accomplish this, we also show that TC^(0) can perform iterated addition and multiplication in certain field extensions.
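For reference, the gate's action on classical bit strings, as described in the abstract, can be restated as follows (our notation):

```latex
\mathrm{MOD}_q(x_1,\dots,x_n) =
\begin{cases}
  1 & \text{if } \sum_{i=1}^{n} x_i \equiv 0 \pmod{q},\\
  0 & \text{otherwise,}
\end{cases}
\qquad q > 1 .
```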

Relevance:

30.00%

Publisher:

Abstract:

The North Carolinian author Thomas Wolfe (1900‐1938) has long suffered under the “charge of autobiography,” which lingers to this day in critical assessments of his work. Criticism of Wolfe is frequently concerned with questions of generic classification, but since the 1950s, re‐assessments of Wolfe’s work have suggested that Wolfe’s “autobiographical fiction” exhibits a complexity that merits further investigation. Strides in autobiographical and narrative theory have prompted reconsiderations of texts that defy the artificial boundaries of autobiography and fiction. Wolfe has been somewhat neglected in the canon of American fiction of his era, but deserves to be reconsidered in terms of how he engages with the challenges and contradictions of writing about or around the self. This thesis investigates why Wolfe’s work has been the source of considerable critical discomfort and confusion with regard to the relationship between Wolfe’s life and his writing. It explores this issue through an examination of elements of Wolfe’s work that problematise categorisation. Firstly, it investigates the concept of Wolfe as “storyteller.” It explores the motivations and philosophies that underpin Wolfe’s work and his concept of himself as a teller of tales, and examines aspects of Wolfe’s writing process that have their roots in medieval traditions of the memorisation and recitation of tales. The thesis then conducts a detailed examination of how Wolfe describes the process of transforming his memory into narrative through writing. The latter half of the thesis examines narrative techniques used by Wolfe, firstly analysing his extensive use of the iterative and pseudo‐iterative modes, and then his unusual deployment of narrators and focalization. This project sheds light on elements of Wolfe’s approach to writing and narrative strategies that he employs that have previously been overlooked, and that have created considerable critical confusion with regard to the supposedly “autobiographical” genesis of his work.

Relevance:

30.00%

Publisher:

Abstract:

This paper offers a new insight into how organizations engage with external complexity. It applies a political action perspective that draws attention to the hitherto neglected question of how the relative power organizational leaders enjoy within their environments is significant for the actions they can take on behalf of their organizations when faced with external complexity. It identifies cognitive and relational complexity as two dimensions of the environment with which organizations have to engage. It proposes three modes whereby organizations may engage with environmental complexity that are conditioned by an organization's power within its environment. It also considers the intention associated with each mode, as well as the implications of these modes of engagement for how an organization can learn about its environment and for the use of rationality and intuition in its strategic decision-making. The closing discussion considers how this analysis integrates complexity and political action perspectives in a way that contributes to theoretical development and provides the basis for a dynamic political co-evolutionary approach. © The Author(s) 2011.

Relevance:

30.00%

Publisher:

Abstract:

There has long been substantial interest in understanding consumer food choices, where a key complexity is the potentially large degree of heterogeneity in tastes across individual consumers, as well as the role of underlying attitudes towards food and cooking. The present paper underlines that both tastes and attitudes are unobserved, and makes the case for a latent variable treatment of these components. Using empirical data collected in Northern Ireland as part of a wider study to elicit intra-household trade-offs between home-cooked meal options, we show how these latent sensitivities and attitudes drive both the choice behaviour and the answers to supplementary questions. We find significant heterogeneity across respondents in these underlying factors and show how incorporating them in our models leads to important insights into preferences.
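A generic integrated choice and latent variable structure of the kind the abstract alludes to, shown only as a hedged sketch (the notation and the linear forms are our illustration, not the paper's specification): a latent attitude $\alpha_n$ for respondent $n$ enters both the utility of meal alternative $i$ and the answers to the supplementary attitudinal questions.

```latex
\alpha_n = \gamma^{\top} z_n + \eta_n
  \quad \text{(structural equation for the latent attitude)} \\
I_{kn} = \zeta_k \,\alpha_n + \nu_{kn}
  \quad \text{(measurement equation for supplementary question $k$)} \\
U_{in} = \beta^{\top} x_{in} + \lambda_i \,\alpha_n + \varepsilon_{in}
  \quad \text{(utility of alternative $i$, driving the observed choice)}
```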

Relevance:

30.00%

Publisher:

Abstract:

Influence diagrams are intuitive and concise representations of structured decision problems. When the problem is non-Markovian, an optimal strategy can be exponentially large in the size of the diagram. We can avoid the inherent intractability by constraining the size of admissible strategies, giving rise to limited-memory influence diagrams. A valuable question is then how small strategies need to be to enable efficient optimal planning. Arguably, the smallest strategies one can conceive simply prescribe an action for each time step, without considering past decisions or observations. Previous work has shown that finding such optimal strategies even for polytree-shaped diagrams with ternary variables and a single value node is NP-hard, but the case of binary variables was left open. In this paper we address that case, first noting that optimal strategies can be obtained in polynomial time for polytree-shaped diagrams with binary variables and a single value node. We then show that the same problem is NP-hard if the diagram has multiple value nodes. These two results close the fixed-parameter complexity analysis of optimal strategy selection in influence diagrams parametrized by the shape of the diagram, the number of value nodes and the maximum variable cardinality.
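In the memoryless setting described above, a strategy simply fixes one action per time step regardless of past observations, so the selection problem can be written as follows (our notation, a hedged restatement of the problem the abstract discusses, where $A_t$ is the set of actions available at step $t$ and $U$ is the utility defined by the diagram's value node or nodes):

```latex
\delta^{*} \;=\; \arg\max_{\delta \,\in\, A_1 \times \dots \times A_T}\;
  \mathbb{E}\!\left[\, U \mid a_1 = \delta_1,\; \dots,\; a_T = \delta_T \,\right]
```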

Relevance:

30.00%

Publisher:

Abstract:

Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) The system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.

Relevance:

30.00%

Publisher:

Abstract:

This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.

Relevance:

30.00%

Publisher:

Abstract:

Addressing building energy use is a pressing issue for building-sector decision makers across Europe. In Sweden, some regions have adopted a target of reducing energy use in buildings by 50% by 2050. However, building codes do not currently support objectives as ambitious as these, and novel approaches to addressing energy use in buildings from a regional perspective are called for. The purpose of this licentiate thesis was to provide a deeper understanding of the most relevant issues regarding energy use in buildings from a broad perspective and to suggest pathways towards reaching the long-term savings objective. Current trends in building-sector structure and energy use point to detached houses constructed before 1981 playing a key role in the energy transition, especially in the rural areas of Sweden. In the Swedish county of Dalarna, which was used as the study area in this thesis, these houses account for almost 70% of the residential heating demand. Building energy simulations of eight sample houses from the county show that there is considerable techno-economic potential for energy savings in these houses, but not quite enough to reach the 50% savings objective. Two case studies from rural Sweden show that savings well beyond 50% are achievable, both when access to capital and use of high technology are available and when they are not. However, on a broader scale both direct and indirect rebound effects will have to be expected, which calls for more refined approaches to energy savings. Furthermore, research has shown that the techno-economic potential is in fact never realised, not even in the most well-designed intervention programmes, due to the inherent complexity of human behaviour with respect to energy use. This is not taken into account in either current or previous Swedish energy use legislation. Therefore an approach that considers the technical prerequisites, economic aspects and the perspective of the many home owners, based on Community-Based Social Marketing methodology, is suggested as a way forward towards reaching the energy savings target.

Relevance:

30.00%

Publisher:

Abstract:

We used the statistical measures of information entropy, disequilibrium and complexity to infer a hierarchy of equations of state for two types of compact stars from the broad class of neutron stars, namely those with hadronic composition and those with strange quark composition. Our results show that, since order costs energy, Nature would favor the exotic strange stars, even though the question of how strange stars form cannot be answered within this approach. (C) 2012 Elsevier B.V. All rights reserved.
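A hedged note on the quantities involved: in the commonly used Lopez-Ruiz, Mancini and Calbet (LMC) formulation, which analyses of this kind typically follow (an assumption on our part, not stated in the abstract), the statistical complexity is the product of the information entropy and the disequilibrium, with $\{p_i\}$ the probability distribution characterising the configuration and $N$ the number of accessible states:

```latex
C = H \cdot D, \qquad
H = -\sum_{i} p_i \ln p_i, \qquad
D = \sum_{i} \left( p_i - \tfrac{1}{N} \right)^{2} .
```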