994 results for reaction window theory
Abstract:
What are the information practices of teen content creators? In the United States, over two-thirds of teens have participated in creating and sharing content in online communities designed to let users be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience that information. From this investigation emerged a model of their information practices while creating and sharing content such as filmmaking, visual artwork, storytelling, music, programming, and website design in digital participatory communities. The research uses grounded theory methodology in a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens' digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied throughout. As findings were constructed from the data, additional data were collected until a substantive theory was constructed and no new information emerged. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing these five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. Information actions include activities in the categories of gathering, thinking, and creating. The experiences of information and the information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact and build upon one another, as represented in a graphic model and accompanying explanation.
Abstract:
Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The methodology is grounded in a critique of the then-dominant approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful for investigating complex issues and behaviours not previously addressed, and for concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.
Abstract:
Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system could therefore be developed to find the optimal black-start strategy. To evaluate candidate black-start strategies, several indices, usually both qualitative and quantitative, are employed. However, it may not be possible to synthesize these indices directly, and different degrees of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems involving interacting attributes. Given this background, the vague set is first employed in this work to represent the indices and facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
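To make the idea of aggregating interacting indices with respect to a fuzzy measure concrete, here is a minimal sketch of a discrete Choquet integral, the standard aggregation tool for interacting criteria; the measure values and scores are toy assumptions, and the paper's vague-valued extension is not reproduced here.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of criterion scores with respect to a fuzzy measure.

    scores : dict criterion -> value in [0, 1]
    mu     : dict frozenset of criteria -> measure value in [0, 1]
    """
    crits = sorted(scores, key=scores.get)          # criteria in ascending score order
    total, prev = 0.0, 0.0
    for i, c in enumerate(crits):
        remaining = frozenset(crits[i:])            # criteria scoring at least scores[c]
        total += (scores[c] - prev) * mu[remaining]
        prev = scores[c]
    return total

# Toy example with two interacting indices 'a' and 'b' (values are assumptions).
mu = {frozenset(): 0.0, frozenset({'a'}): 0.4,
      frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 1.0}
print(choquet({'a': 0.7, 'b': 0.3}, mu))            # 0.46
```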
Abstract:
The structure and composition of reaction products between Bi-Sr-Ca-Cu-oxide (BSCCO) thick films and alumina substrates have been characterized using a combination of electron diffraction, scanning electron microscopy and energy-dispersive X-ray spectrometry (EDX). Sr and Ca are found to be the cations most reactive with alumina. Sr4Al6O12SO4 forms between the alumina substrates and BSCCO thick films prepared from paste with composition close to Bi-2212 (and Bi-2212 + 10 wt.% Ag). For paste with composition close to Bi(Pb)-2223 + 20 wt.% Ag, a new phase with f.c.c. structure, lattice parameter a ≈ 24.5 Å and approximate composition Al3Sr2CaBi2CuOx has been identified in the interface region. Understanding and control of these reactions is essential for the growth of high-quality BSCCO thick films on alumina.
Abstract:
Lankes and Silverstein (2006) introduced the "participatory library" and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this work has been based on empirical data, and such models have focused strongly on technical aspects and less on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for the study. Six librarians took part in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study's findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracies on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, indicating high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% afterwards. Our key contribution is demonstrating the feasibility of the Neo-Piagetian approach to classifying assessment materials, by showing that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These methods have typically been implemented with a fixed stepsize, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case it has been necessary to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional-integral (PI) control is applied to a variable-stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable-stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers, from a digital filter theory point of view, to PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained by using these control theory approaches compared with the regular stepsize change strategy.
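As a rough illustration of the control-theoretic stepsize selection described above, the sketch below applies a standard proportional-integral (PI) controller to the local error estimate returned by an embedded pair; the gains, safety factor, and the hypothetical `step_pair` routine are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def pi_stepsize(h, err, err_prev, tol, order,
                k_i=0.3, k_p=0.4, safety=0.9, fac_min=0.2, fac_max=5.0):
    """Proportional-integral (PI) stepsize controller.

    The integral term reacts to the current scaled error, the proportional
    term to the trend between consecutive errors.  Gains and clipping
    factors are illustrative defaults.
    """
    k = order + 1.0
    factor = safety * (tol / err) ** (k_i / k) * (err_prev / err) ** (k_p / k)
    return h * min(fac_max, max(fac_min, factor))

def integrate(step_pair, y0, t0, t_end, h0, tol, order):
    """Drive a hypothetical embedded pair `step_pair(y, t, h) -> (y_new, err)`
    with PI stepsize control, accepting a step only when err <= tol."""
    t, y, h, err_prev = t0, np.asarray(y0, dtype=float), h0, tol
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = step_pair(y, t, h)
        err = max(err, 1e-14)            # guard against division by zero
        if err <= tol:                   # accept the step
            t, y = t + h, y_new
        h = pi_stepsize(h, err, err_prev, tol, order)
        err_prev = err
    return t, y
```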
Abstract:
In this paper we give an overview of some very recent work, as well as presenting a new approach, on the stochastic simulation of multi-scaled systems involving chemical reactions. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix of small numbers of key regulatory proteins and medium to large numbers of other molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge-Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques for the treatment of coupled slow and fast reactions in stochastic chemical kinetics and present a new approach that couples the three regimes mentioned above. We apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion of the significance of this work.
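For the slow-reaction regime, the stochastic simulation algorithm is commonly realised as Gillespie's direct method; the sketch below is a minimal version of that method, and the toy two-species reaction set at the end is an assumption, not the LacZ/LacY model from the paper.

```python
import numpy as np

def ssa_direct(x0, stoich, propensities, t_end, rng=None):
    """Gillespie's direct method (the stochastic simulation algorithm).

    x0           : initial copy numbers of each species
    stoich       : (n_reactions, n_species) array of state-change vectors
    propensities : function x -> array of reaction propensities a_j(x)
    Returns the sampled trajectory as (times, states).
    """
    rng = rng or np.random.default_rng()
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = propensities(x)
        a0 = a.sum()
        if a0 <= 0.0:                      # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)     # time to the next reaction
        j = rng.choice(len(a), p=a / a0)   # which reaction fires
        x += stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Toy example (an assumption): A -> B at rate 0.5*A, B -> A at rate 0.3*B.
stoich = np.array([[-1, 1], [1, -1]])
props = lambda x: np.array([0.5 * x[0], 0.3 * x[1]])
times, states = ssa_direct([100, 0], stoich, props, t_end=10.0)
```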
Abstract:
This paper reports a study that explored a new construct: 'climate of fear'. We hypothesised that climate of fear would vary across work sites within organisations, but not across organisations. This is in contrast to measures of organisational culture, which were expected to vary both within and across organisations. To test our hypotheses, we developed a new 13-item measure of perceived fear in organisations and tested it in 20 sites across two organisations (N = 209). The culture variables measured were innovative leadership culture and communication culture. As anticipated, climate of fear varied across sites in both organisations, while differences across organisations were not significant. Organisational culture, however, varied between the organisations and within one of them. The climate of fear scale exhibited acceptable psychometric properties.
Abstract:
This paper identifies two major forces driving change in media policy worldwide: media convergence, and renewed concerns about media ethics, with the latter seen in the U.K. Leveson Inquiry. It focuses on two major public inquiries in Australia during 2011-2012 – the Independent Media Inquiry (Finkelstein Review) and the Convergence Review – and the issues raised about future regulation of journalism and news standards. Drawing upon perspectives from media theory, it observes the strong influence of social responsibility theories of the media in the Finkelstein Review, and the adverse reaction these received from those arguing from Fourth Estate/free press perspectives, which were also consistent with the longstanding opposition of Australian newspaper proprietors to government regulation. It also discusses the approaches taken in the Convergence Review to regulating for news standards, in light of the complexities arising from media convergence. The paper concludes with consideration of the fast-changing environment in which such proposals to transform media regulation are being considered, including the crisis of news media organisation business models, as seen in Australia with major layoffs of journalists from the leading print media publications.
Abstract:
A polymerase chain reaction (PCR) assay was developed for the detection of Banana bunchy top virus (BBTV), requiring at most 210 min and as little as 90 min using Pc-1 and Pc-2, respectively. PCR detection of BBTV in crude sap indicated that freezing banana tissue in liquid nitrogen (LN2) before extraction was more effective than grinding with sand. BBTV was also detected by PCR in 69 healthy and diseased plants using Na-PO4 buffer containing 1% SDS. PCR detection of BBTV in nucleic acid extracts prepared with seven different extraction buffers was studied in order to adapt the assay for routine detection in the field. The results showed that BBTV was detected with higher sensitivity in nucleic acid extracts than in infectious sap. The results also suggested a common aetiology for BBTV, based on the PCR reactions of nucleic acid extracts from Australia, Burundi, Egypt, France, Gabon, the Philippines and Taiwan. A positive relation was also found between the Egyptian BBTV isolate and an abaca bunchy top isolate from the Philippines, but no relation was found with Cucumber mosaic cucumovirus (CMV) isolates from Egypt and the Philippines or with Banana bract mosaic virus (BBMV).
Abstract:
Objective: To determine the prevalence, severity, location, etiology, treatment, and healing of medical device-related pressure ulcers in intensive care patients for up to 7 days. Design: Prospective repeated-measures study. Setting and participants: Patients in 6 intensive care units of 2 major medical centers, one each in Australia and the United States, were screened 1 day per month for 6 months. Those with device-related ulcers were followed daily for up to 7 days. Outcome measures: Device-related ulcer prevalence, pain, infection, treatment, and healing. Results: 15/483 patients had device-related ulcers, and 9/15 patients with 11 ulcers were followed beyond screening. Their mean age was 60.5 years; most were men, overweight, and at increased pressure ulcer risk. Endotracheal and nasogastric tubes were the cause of most device-related ulcers. Repositioning was the most frequent treatment. 4/11 ulcers healed within the 7-day observation period. Conclusion: Device-related ulcer prevalence was 3.1%, similar to that reported in the limited literature available, indicating an ongoing problem. Systematic assessment and repositioning of devices are the mainstays of care. We recommend continued prevalence determination and that nurses remain vigilant to prevent device-related ulcers, especially in patients with nasogastric and endotracheal tubes.
Abstract:
This work investigates the accuracy and efficiency tradeoffs between centralized and collective (distributed) algorithms for (i) sampling and (ii) n-way data analysis techniques in multidimensional stream data, such as Internet chatroom communications. Its contributions are threefold. First, we use the Kolmogorov-Smirnov goodness-of-fit test to show that the statistical differences between real data obtained by collective sampling in the time dimension from multiple servers and data obtained from a single server are insignificant. Second, we show using the real data that collective analysis of 3-way data arrays (users x keywords x time), known as high-order tensors, is more efficient than centralized algorithms with respect to both space and computational cost. Furthermore, we show that this gain is obtained without loss of accuracy. Third, we examine the sensitivity of collective construction and analysis of high-order data tensors to the choice of server selection and sampling window size. We construct 4-way tensors (users x keywords x time x servers) and analyze them to show the impact of server and window size selections on the results.
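The two building blocks described above can be sketched briefly: a two-sample Kolmogorov-Smirnov comparison of timestamps gathered from a single server versus collectively from several servers, and assembly of a (users x keywords x time-window) count tensor from message records. The record layout and significance level below are assumptions for illustration.

```python
import numpy as np
from scipy.stats import ks_2samp

def compare_samples(single_server_times, collective_times, alpha=0.05):
    """Two-sample Kolmogorov-Smirnov test: are timestamps gathered collectively
    from several servers distributed like those from a single server?
    Returns True when the difference is statistically insignificant."""
    stat, p_value = ks_2samp(single_server_times, collective_times)
    return p_value >= alpha

def build_tensor(records, n_users, n_keywords, window, t_max):
    """Assemble a 3-way (users x keywords x time-window) count tensor from
    message records; the (user, keyword, timestamp) layout is an assumption."""
    n_windows = int(np.ceil(t_max / window))
    T = np.zeros((n_users, n_keywords, n_windows))
    for user, keyword, t in records:
        T[user, keyword, int(t // window)] += 1
    return T
```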
Abstract:
We consider a two-dimensional space-fractional reaction-diffusion equation with a fractional Laplacian operator and homogeneous Neumann boundary conditions. The finite volume method is used with the matrix transfer technique of Ilić et al. (2006) to discretise in space, yielding a system of equations whose solution at each timestep requires the action of a matrix function on a vector. Rather than form this matrix function explicitly, we use Krylov subspace techniques to approximate its action. Specifically, we apply the Lanczos method after a suitable transformation of the problem to recover symmetry. To improve the convergence of this method, we utilise a preconditioner that deflates the smallest eigenvalues from the spectrum. We demonstrate the efficiency of our approach for a fractional Fisher's equation on the unit disk.
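A minimal sketch of the core idea, approximating the action of a matrix function f(A)b for a symmetric matrix with the Lanczos method, is given below; the restarting, deflation preconditioning and stopping criteria from the paper are omitted, and the interface is an assumption.

```python
import numpy as np

def lanczos_matfunc(A_mul, b, f, m=50):
    """Approximate f(A) b for symmetric A using m Lanczos iterations.

    A_mul : function implementing the matrix-vector product v -> A v
    b     : right-hand-side vector
    f     : scalar function applied to the eigenvalues (e.g. np.exp)
    The approximation is beta * V_m f(T_m) e_1, with T_m the small
    tridiagonal Lanczos matrix.
    """
    b = np.asarray(b, dtype=float)
    n = b.size
    beta = np.linalg.norm(b)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    off = np.zeros(m - 1)
    V[:, 0] = b / beta
    for j in range(m):
        w = np.asarray(A_mul(V[:, j]), dtype=float)
        if j > 0:
            w -= off[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j < m - 1:
            off[j] = np.linalg.norm(w)
            if off[j] == 0.0:              # breakdown: Krylov space is exhausted
                m = j + 1
                V, alpha, off = V[:, :m], alpha[:m], off[:m - 1]
                break
            V[:, j + 1] = w / off[j]
    # Evaluate f on the small tridiagonal matrix via its eigendecomposition.
    T = np.diag(alpha) + np.diag(off, 1) + np.diag(off, -1)
    evals, Q = np.linalg.eigh(T)
    fT_e1 = Q @ (f(evals) * Q[0, :])       # f(T_m) applied to the first unit vector
    return beta * (V @ fT_e1)
```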