Abstract:
What are the information practices of teen content creators? In the United States, over two thirds of teens have participated in creating and sharing content in online communities that are developed for the purpose of allowing users to be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience that information. From this investigation emerged a model of their information practices while creating and sharing content such as film-making, visual artwork, storytelling, music, programming, and website design in digital participatory communities. The research uses grounded theory methodology in a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens' digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied in analysis. As findings were constructed from the data, additional data were collected until a substantive theory was constructed and no new information emerged from data collection. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing the five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. Information actions include activities that occur in the categories of gathering, thinking, and creating.
The experiences of information and information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact and build upon one another, as represented in a graphic model and accompanying explanation.
Abstract:
Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The methodology is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful for investigating complex issues and behaviours not previously addressed, as well as concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis involves open, axial and selective coding levels. The purpose of this article is to explore the grounded theory research process and provide an initial understanding of this methodology.
Abstract:
The Gallery of Modern Art (GoMA) in Brisbane, Australia's third largest city, recently staged '21st Century: Art of the First Decade'. The gallery spaces were replete with a commissioned slide by Carsten Höller, an installation of Rivane Neuenschwander's I Wish Your Wish (2003), a table of white Legos, a room of purple balloons and other participatory or interactive artworks designed to engage multiple publics and encourage audience participation in a variety of ways. Many of the featured projects used day-to-day experiences and offered new conceptions of art practice and what it can elicit in its public – raising awareness about local issues, helping audiences imagine different ways of negotiating their environs, or offering a new way to experience a museum. At times, the bottom floor galleries resembled a theme park – adults and children playing with Legos and using Höller's slide. This article examines the benefits and limitations of such artistic interventions by relating the GoMA exhibition to Brisbane City Council's 'Together Brisbane' campaign (featuring images of Neuenschwander's ribbons), a response to the devastation brought to the city and its surrounds in January 2011. During the Brisbane floods, GoMA's basement was damaged and the museum closed; upon reopening, visitor numbers soared. In this context, GoMA's use of engaged art practice – always verging on the ephemeral and 'fun' – has been used to project a wider notion of a collective urban public. What questions does this raise, not only regarding the cultural politics around the social and participatory 'turn' in art practice, but also regarding its use to address a much wider urban public in a moment of crisis?
Abstract:
Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, some indices, usually both qualitative and quantitative, are employed. However, it may not be possible to directly synthesize these indices, and different extents of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices and facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with existing methods, the proposed method can deal with the interactions among indices and represent fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
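The vague-set representation in this abstract can be illustrated with a minimal sketch. A vague value pairs a truth-membership degree t with a false-membership degree f (with t + f ≤ 1), and one common way to rank vague values is the score S = t − f. The strategies and index values below are hypothetical, and the simple additive aggregation shown deliberately ignores index interactions; the paper's actual model (a vague-valued fuzzy measure) is more elaborate.

```python
# Minimal sketch of vague-set scoring for comparing candidate strategies.
# A vague value is the interval [t, 1 - f], where t is the truth-membership
# degree and f the false-membership degree, with t + f <= 1.
# The score S = t - f is one common ranking criterion for vague values.

def vague_score(t: float, f: float) -> float:
    """Rank a vague value [t, 1 - f] by the score S = t - f."""
    assert 0.0 <= t and 0.0 <= f and t + f <= 1.0, "invalid vague value"
    return t - f

# Hypothetical index assessments for two black-start strategies,
# each index given as a (t, f) pair.
strategy_a = [(0.7, 0.2), (0.6, 0.3)]
strategy_b = [(0.5, 0.1), (0.8, 0.1)]

# Simple additive aggregation (ignores interactions among indices,
# which is exactly the limitation the vague-valued fuzzy measure addresses).
score_a = sum(vague_score(t, f) for t, f in strategy_a)
score_b = sum(vague_score(t, f) for t, f in strategy_b)
print(score_a, score_b)
```

Under this toy aggregation, strategy B would be preferred; a fuzzy-measure-based aggregation could rank them differently once index interactions are weighted in.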
Abstract:
This article discusses different accounts of Shanghai Modern, the period between the 1920s and 1940s in which the city occupied a unique position within China and the world. It places discussions of this period in the context of the resurgence of urban-led modernization in China, led by Shanghai. It looks in particular at Leo Ou-Fan Lee's attempt to link the cosmopolitanism of Shanghai Modern with prospects for the new post-reform China. I then discuss Ackbar Abbas' response to this book and use it as a way of reflecting on the progress of Shanghai's urban development and its divergence/convergence with similar processes in the West. The article then looks at the other significant moment, the Cultural Revolution, as a way of opening up discussions of Chinese and Shanghainese modernity beyond that of simply an absorption into Western capitalist modernity. It concludes by briefly introducing this volume.
Abstract:
'Contemporary Australia: Women' is the second in a series of triennial exhibitions at the Gallery of Modern Art in Queensland, providing a survey of contemporary art practices across the country. This exhibition's focus on women artists comes in the wake of a number of high profile international exhibitions looking at women artists in both contemporary and historical contexts. This review situates the exhibition within this field and considers its significance.
Abstract:
Lankes and Silverstein (2006) introduced the “participatory library” and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this work has been based on empirical data, and such models have focused strongly on technical aspects but less on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for this study; six librarians took part in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, social-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study's findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.
Abstract:
This study presents an improved method of dealing with embedded tax liabilities in portfolio choice. We argue that a risk-free discount rate is appropriate for calculating the present value of future tax liabilities. Consistent with recent research, our results show a taxation-induced preference for holding equities over bonds, and a location preference for holding equities in the taxable account and bonds in retirement accounts. These findings contrast with traditional investment advice, which suggests a greater capacity for risk in retirement accounts.
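The discounting argument can be made concrete with a rough illustration (all numbers below are hypothetical, not from the study): the tax owed on an unrealized gain is a relatively certain liability, so its present value is obtained by discounting at the risk-free rate rather than at the portfolio's risky expected return.

```python
# Sketch: present value of an embedded capital-gains tax liability,
# discounted at the risk-free rate as the abstract argues.
# The gain, tax rate, horizon, and risk-free rate are illustrative.

def pv_tax_liability(unrealized_gain: float, tax_rate: float,
                     risk_free_rate: float, years: float) -> float:
    """Tax owed on an unrealized gain, discounted at the risk-free rate."""
    future_tax = unrealized_gain * tax_rate
    return future_tax / (1.0 + risk_free_rate) ** years

# A $10,000 unrealized gain taxed at 15% when realized in 10 years,
# discounted at a 3% risk-free rate:
pv = pv_tax_liability(10_000, 0.15, 0.03, 10)
print(round(pv, 2))
```

Discounting the same liability at a higher risky rate would understate its present value, which is the distortion the proposed method avoids.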
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by it. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
According to Karl Popper, widely regarded as one of the greatest philosophers of science of the 20th century, falsifiability is the primary characteristic that distinguishes scientific theories from ideologies or dogma. For example, for creationism to be treated in schools as a scientific theory comparable to modern theories of evolution, its advocates would need to engage in the generation of falsifiable hypotheses and abandon the practice of discouraging questioning and inquiry. Ironically, scientific theories themselves are accepted or rejected based on a principle that might be called survival of the fittest. So, for healthy theories of development to emerge, four Darwinian functions should operate: (a) variation – avoid orthodoxy and encourage divergent thinking; (b) selection – submit all assumptions and innovations to rigorous testing; (c) diffusion – encourage the shareability of new and/or viable ways of thinking; and (d) accumulation – encourage the reusability of viable aspects of productive innovations.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The average classification accuracies of the participants on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, indicating high agreement (91%, 83% and 91% across the three stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions rose from 29% before the tutorial to 75% after. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
The numerical solution of stochastic differential equations (SDEs) has recently focused on the development of numerical methods with good stability and order properties. These implementations have used a fixed stepsize, but there are many situations where a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case, it has been necessary to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional integral (PI) control is applied to a variable-stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable-stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers from a digital filter theory point of view via PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained when using these control-theory approaches compared with the regular stepsize-change strategy.
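A generic PI stepsize controller of the kind described can be sketched as follows. The gains k_i and k_p, the clamping limits, and the error-guard constant below are illustrative choices, not the paper's tuned values for SDE solvers.

```python
# Sketch of a PI stepsize controller driven by the local error estimates
# from an embedded pair. The next step grows when the error is below the
# tolerance and shrinks when it is above, with the proportional term
# damping oscillations in the step-size sequence.

def pi_step(h: float, err: float, err_prev: float, tol: float,
            order: int, k_i: float = 0.3, k_p: float = 0.4,
            h_min: float = 1e-8, h_max: float = 1.0) -> float:
    """Propose the next stepsize from current and previous error estimates."""
    err = max(err, 1e-14)        # guard against division by zero
    err_prev = max(err_prev, 1e-14)
    factor = ((tol / err) ** (k_i / (order + 1))
              * (err_prev / err) ** (k_p / (order + 1)))
    factor = min(2.0, max(0.5, factor))  # limit how fast the step changes
    return min(h_max, max(h_min, h * factor))
```

In an integration loop, the step is typically accepted when err ≤ tol and otherwise retried with the smaller proposed stepsize; the PID extension mentioned in the abstract adds a derivative-like term over three successive error estimates.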
Abstract:
This paper reports a study that explored a new construct: ‘climate of fear’. We hypothesised that climate of fear would vary across work sites within organisations, but not across organisations. This is in contrast to measures of organisational culture, which were expected to vary both within and across organisations. To test our hypotheses, we developed a new 13-item measure of perceived fear in organisations and tested it in 20 sites across two organisations (N = 209). The culture variables measured were innovative leadership culture and communication culture. As anticipated, climate of fear varied across sites in both organisations, while differences across organisations were not significant. Organisational culture, however, varied between the organisations, and within one of them. The climate of fear scale exhibited acceptable psychometric properties.
Abstract:
An engaging narrative is maintained throughout this edited collection of articles that address the issue of militarism in international relations. The book seamlessly integrates historical and contemporary perspectives on militarism with theory and relevant international case studies, resulting in a very informative read. The work comprises three parts. Part 1 deals with the theorisation of militarism and includes chapters by Anna Stavrianakis and Jan Selby, Martin Shaw, Simon Dalby, and Nicola Short. It covers a range of topics relating to historical and contemporary theories of militarism, geopolitical threat construction, political economy, and the US military’s ‘cultural turn’.
Multi-level knowledge transfer in software development outsourcing projects: the agency theory view
Abstract:
In recent years, software development outsourcing has become even more complex. Outsourcing partners have begun ‘re-outsourcing’ components of their projects to other outsourcing companies to minimize cost and gain efficiencies, creating a multi-level hierarchy of outsourcing. This research-in-progress paper presents preliminary findings of a study designed to understand knowledge transfer effectiveness in multi-level software development outsourcing projects. We conceptualize the SD-outsourcing entities using Agency Theory. This study conceptualizes, operationalises and validates the concept of knowledge transfer as a three-phase multidimensional formative index of 1) domain knowledge, 2) communication behaviours, and 3) clarity of requirements. Data analysis identified substantial, significant differences between the Principal and the Agent on two of the three constructs. Using Agency Theory, supported by the preliminary findings, the paper also provides prescriptive guidelines for reducing the friction between the Principal and the Agent in multi-level software outsourcing.