933 results for effective approaches
Abstract:
This research has established, through ultrasound, near infrared spectroscopy and biomechanics experiments, parameters and parametric relationships that can form the framework for quantifying the integrity of the articular cartilage-on-bone laminate, and objectively distinguish between normal/healthy and abnormal/degenerated joint tissue, with a focus on articular cartilage. This has been achieved by: 1. using traditional experimental methods to produce new parameters for cartilage assessment; 2. using novel methodologies to develop new parameters; and 3. investigating the interrelationships between mechanical, structural and molecular properties to identify and select those parameters and methodologies that can be used in a future arthroscopic probe based on points 1 and 2. By combining the molecular, micro- and macro-structural characteristics of the tissue with its mechanical properties, we arrive at a set of critical benchmarking parameters for viable and early-stage non-viable cartilage. The interrelationships between these characteristics, examined using a multivariate analysis based on principal components analysis, multiple linear regression and general linear modeling, could then be used to determine those parameters and relationships which have the potential to be developed into a future clinical device. Specifically, this research has found that the ultrasound and near infrared techniques can subsume the mechanical parameters and combine to characterise the tissue at the molecular, structural and mechanical levels over the full depth of the cartilage matrix. It is the opinion in this thesis that by enabling the determination of the precise area of influence of a focal defect or disease in the joint, and by demarcating the boundaries of articular cartilage with different levels of degeneration around a focal defect, better surgical decisions will be achieved that will advance the processes of joint management and treatment.
Providing the basis for a surgical tool, this research will contribute to the enhancement and quantification of arthroscopic procedures, extending to post-treatment monitoring; as a research tool, it will enable a robust method for evaluating developing (particularly focalised) treatments.
Abstract:
Research in the early years places increasing importance on participatory methods to engage children. The playback of video-recording to stimulate conversation is a research method that enables children’s accounts to be heard and attends to a participatory view. During video-stimulated sessions, participants watch an extract of video-recording of a specific event in which they were involved, and then account for their participation in that event. Using an interactional perspective, this paper draws distinctions between video-stimulated accounts and a similar research method, popular in education, that of video-stimulated recall. Reporting upon a study of young children’s interactions in a playground, video-stimulated accounts are explicated to show how the participants worked toward the construction of events in the video-stimulated session. This paper discusses how the children account for complex matters within their social worlds, and manage the accounting of others in the video-stimulated session. When viewed from an interactional perspective and used alongside fine-grained analytic approaches, video-stimulated accounts are an effective method to provide the standpoint of the children involved and further the competent child paradigm.
Abstract:
Regional safety program managers face a daunting challenge in the attempt to reduce deaths, injuries, and economic losses that result from motor vehicle crashes. This difficult mission is complicated by the combination of a large perceived need, small budget, and uncertainty about how effective each proposed countermeasure would be if implemented. A manager can turn to the research record for insight, but the measured effect of a single countermeasure often varies widely from study to study and across jurisdictions. The challenge of converting widespread and conflicting research results into a regionally meaningful conclusion can be addressed by incorporating "subjective" information into a Bayesian analysis framework. Engineering evaluations of crashes provide the subjective input on countermeasure effectiveness in the proposed Bayesian analysis framework. Empirical Bayes approaches are widely used in before-and-after studies and "hot-spot" identification; however, in these cases, the prior information was typically obtained from the data (empirically), not subjective sources. The power and advantages of Bayesian methods for assessing countermeasure effectiveness are presented. Also, an engineering evaluation approach developed at the Georgia Institute of Technology is described. Results are presented from an experiment conducted to assess the repeatability and objectivity of subjective engineering evaluations. In particular, the focus is on the importance, methodology, and feasibility of the subjective engineering evaluation for assessing countermeasures.
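The core of the framework described above is updating a subjective prior belief about countermeasure effectiveness with observed evaluation data. As a minimal sketch of that idea, the conjugate Beta-Binomial update below combines a hypothetical engineering prior with hypothetical crash-evaluation counts; it is a generic textbook illustration, not the specific model developed at Georgia Tech.

```python
# Illustrative Bayesian update: combine a subjective engineering prior on the
# probability that a countermeasure would prevent a target crash with observed
# crash-evaluation data. All numbers below are invented for illustration.

def beta_binomial_update(prior_alpha, prior_beta, successes, trials):
    """Return posterior Beta parameters after observing binomial data."""
    return prior_alpha + successes, prior_beta + (trials - successes)

# Subjective prior: engineers judge the countermeasure prevents roughly 60%
# of target crashes; Beta(6, 4) encodes this belief with modest confidence.
alpha, beta = 6, 4

# Hypothetical evaluation data: 14 of 20 reviewed crashes judged preventable.
alpha_post, beta_post = beta_binomial_update(alpha, beta, 14, 20)

posterior_mean = alpha_post / (alpha_post + beta_post)
print(alpha_post, beta_post, round(posterior_mean, 3))  # 20 10 0.667
```

Because the posterior is again a Beta distribution, repeated evaluations from new jurisdictions can be folded in the same way, which is one practical appeal of the conjugate setup for pooling conflicting study results.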
Abstract:
The critical factor in determining students' interest and motivation to learn science is the quality of the teaching. However, science typically receives very little time in primary classrooms, with teachers often lacking the confidence to engage in inquiry-based learning because they do not have a sound understanding of science or its associated pedagogical approaches. Developing teacher knowledge in this area is a major challenge. Addressing these concerns with didactic "stand and deliver" modes of Professional Development (PD) has been shown to have little relevance or effectiveness, yet is still the predominant approach used by schools and education authorities. In response to that issue, the constructivist-inspired Primary Connections professional learning program applies contemporary theory relating to the characteristics of effective primary science teaching, the changes required for teachers to use those pedagogies, and professional learning strategies that facilitate such change. This study investigated the nature of teachers' engagement with the various elements of the program. Summative assessments of such PD programs have been undertaken previously; however, there was an identified need for a detailed view of the changes in teachers' beliefs and practices during the intervention. This research was a case study of a Primary Connections implementation. PD workshops were presented to a primary school staff, then two teachers were observed as they worked in tandem to implement related curriculum units with their Year 4/5 classes over a six-month period. Data including interviews, classroom observations and written artefacts were analysed to identify common themes and develop a set of assertions related to how teachers changed their beliefs and practices for teaching science. When teachers implement Primary Connections, their students "are more frequently curious in science and more frequently learn interesting things in science" (Hackling & Prain, 2008).
This study has found that teachers who observe such changes in their students consequently change their beliefs and practices about teaching science. They enhance science learning by promoting student autonomy through open-ended inquiries, and they and their students enhance their scientific literacy by jointly constructing investigations and explaining their findings. The findings have implications for teachers and for designers of PD programs. Assertions related to teaching science within a pedagogical framework consistent with the Primary Connections model are that: (1) promoting student autonomy enhances science learning; (2) student autonomy presents perceived threats to teachers but these are counteracted by enhanced student engagement and learning; (3) the structured constructivism of Primary Connections resources provides appropriate scaffolding for teachers and students to transition from didactic to inquiry-based learning modes; and (4) authentic science investigations promote understanding of scientific literacy and the "nature of science". The key messages for designers of PD programs are that: (1) effective programs model the pedagogies being promoted; (2) teachers benefit from taking the role of student and engaging in the proposed learning experiences; (3) related curriculum resources foster long-term engagement with new concepts and strategies; (4) change in beliefs and practices occurs after teachers implement the program or strategy and see positive outcomes in their students; and (5) implementing this study's PD model is efficient in terms of resources. Identified topics for further investigation relate to the role of assessment in providing evidence to support change in teachers' beliefs and practices, and of teacher reflection in making such change more sustainable.
Abstract:
Patients undergoing radiation therapy for cancer face a series of challenges that require support from a multidisciplinary team which includes radiation oncology nurses. However, the specific contribution of nursing, and the models of care that best support the delivery of nursing interventions in the radiotherapy setting, is not well described. In this case study, the Interaction Model of Client Health Behaviour and the associated principles of person-centred care were incorporated into a new model of care that was implemented in one radiation oncology setting in Brisbane, Australia. The new model of care was operationalised through a Primary Nursing/Collaborative Practice framework. To evaluate the impact of the new model for patients and health professionals, multiple sources of data were collected from patients and clinical staff prior to, during, and 18 months following introduction of the practice redesign. One cohort of patients and clinical staff completed surveys incorporating measures of key outcomes immediately prior to implementation of the model, while a second cohort of patients and clinical staff completed these same surveys 18 months following introduction of the model. In-depth interviews were also conducted with nursing, medical and allied health staff throughout the implementation phase to obtain a more comprehensive account of the processes and outcomes associated with implementing such a model. From the patients’ perspectives, this study demonstrated that, although adverse effects of radiotherapy continue to affect patient well-being, patients continue to be satisfied with nursing care in this specialty, and that they generally reported high levels of functioning despite undergoing a curative course of radiotherapy. 
From the health professionals’ perspective, there was evidence of attitudinal change by nursing staff within the radiotherapy department which reflected a greater understanding and appreciation of a more person-centred approach to care. Importantly, this case study has also confirmed that a range of factors need to be considered when redesigning nursing practice in the radiotherapy setting, as the challenges associated with changing traditional practices, ensuring multidisciplinary approaches to care, and resourcing a new model were experienced. The findings from this study suggest that the move from a relatively functional approach to a person-centred approach in the radiotherapy setting has contributed to some improvements in the provision of individualised and coordinated patient care. However, this study has also highlighted that primary nursing may be limited in its approach as a framework for patient care unless it is supported by a whole team approach, an appropriate supportive governance model, and sufficient resourcing. Introducing such a model thus requires effective education, preparation and ongoing support for the whole team. The challenges of providing care in the context of complex interdisciplinary relationships have been highlighted by this study. Aspects of this study may assist in planning further nursing interventions for patients undergoing radiotherapy for cancer, and continue to enhance the contribution of the radiation oncology nurse to improved patient outcomes.
Abstract:
Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task.
Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
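The intrinsic/extrinsic distinction drawn above can be made concrete with purity, a standard extrinsic measure that scores a clustering against human-assigned "ground truth" labels. The sketch below uses invented cluster assignments and topic labels purely for illustration; it is not code from the thesis.

```python
# Purity: for each cluster, count the documents carrying that cluster's
# majority ground-truth label, then divide the total by the corpus size.
# A perfect clustering scores 1.0; labels here are invented examples.
from collections import Counter

def purity(clusters, truth):
    """Fraction of documents matching the majority truth label of their cluster."""
    total = 0
    for c in set(clusters):
        members = [t for cl, t in zip(clusters, truth) if cl == c]
        total += Counter(members).most_common(1)[0][1]
    return total / len(truth)

clusters = [0, 0, 0, 1, 1, 1]               # algorithm's cluster assignments
truth    = ['a', 'a', 'b', 'b', 'b', 'a']   # human-assigned topic labels
print(round(purity(clusters, truth), 3))    # 0.667
```

Because purity compares only label agreement, not vector-space geometry, it allows the cross-representation comparisons that intrinsic measures like distortion cannot provide; its weakness, as the abstract notes, is that the human labels themselves can be subjective.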
Abstract:
Nitrous oxide (N2O) is primarily produced by the microbially-mediated nitrification and denitrification processes in soils. It is influenced by a suite of climate (i.e. temperature and rainfall) and soil (physical and chemical) variables, interacting soil and plant nitrogen (N) transformations (either competing or supplying substrates) as well as land management practices. It is not surprising that N2O emissions are highly variable both spatially and temporally. Computer simulation models, which can integrate all of these variables, are required for the complex task of providing quantitative determinations of N2O emissions. Numerous simulation models have been developed to predict N2O production. Each model has its own philosophy in constructing simulation components as well as performance strengths. The models range from those that attempt to comprehensively simulate all soil processes to more empirical approaches requiring minimal input data. These N2O simulation models can be classified into three categories: laboratory, field and regional/global levels. Process-based field-scale N2O simulation models, which simulate whole agroecosystems and can be used to develop N2O mitigation measures, are the most widely used. The current challenge is how to scale up the relatively more robust field-scale model to catchment, regional and national scales. This paper reviews the development history, main construction components, strengths, limitations and applications of N2O emissions models, which have been published in the literature. The three scale levels are considered and the current knowledge gaps and challenges in modelling N2O emissions from soils are discussed.
Abstract:
The effective atomic number is widely employed in radiation studies, particularly for the characterisation of interaction processes in dosimeters, biological tissues and substitute materials. Gel dosimeters are unique in that they comprise both the phantom and dosimeter material. In this work, effective atomic numbers for total and partial electron interaction processes have been calculated for the first time for a Fricke gel dosimeter, five hypoxic and nine normoxic polymer gel dosimeters. A range of biological materials are also presented for comparison. The spectrum of energies studied spans 10 keV to 100 MeV, over which the effective atomic number varies by 30 %. The effective atomic numbers of gels match those of soft tissue closely over the full energy range studied; greater disparities exist at higher energies but are typically within 4 %.
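For readers unfamiliar with the quantity, the simplest textbook definition of effective atomic number is the single-exponent power law Z_eff = (Σᵢ aᵢ Zᵢ^m)^(1/m), where aᵢ is the electron fraction of element i and m ≈ 2.94 for photoelectric-dominated photon interactions. The sketch below computes this for water only as orientation; the energy-dependent electron-interaction calculations reported in the abstract are considerably more involved than this approximation.

```python
# Classic power-law (Mayneord-type) effective atomic number:
#   Z_eff = (sum_i a_i * Z_i**m) ** (1/m),  m ~ 2.94 for photon interactions
# where a_i is the fraction of electrons contributed by element i.

def z_eff_power_law(fractions_z, m=2.94):
    """fractions_z: list of (electron fraction a_i, atomic number Z_i) pairs."""
    return sum(a * z ** m for a, z in fractions_z) ** (1.0 / m)

# Water, H2O: of the 10 electrons per molecule, 2 come from hydrogen (Z=1)
# and 8 from oxygen (Z=8).
water = [(2 / 10, 1), (8 / 10, 8)]
print(round(z_eff_power_law(water), 2))  # 7.42, the familiar value for water
```

The single-exponent formula breaks down exactly where the abstract's results matter most: Z_eff is energy dependent, which is why the reported values vary by 30 % across the 10 keV to 100 MeV range.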
Abstract:
Fractional Fokker-Planck equations (FFPEs) have gained much interest recently for describing transport dynamics in complex systems that are governed by anomalous diffusion and nonexponential relaxation patterns. However, effective numerical methods and analytic techniques for the FFPE are still in their embryonic state. In this paper, we consider a class of time-space fractional Fokker-Planck equations with a nonlinear source term (TSFFPE-NST), which involve the Caputo time fractional derivative (CTFD) of order α ∈ (0, 1) and the symmetric Riesz space fractional derivative (RSFD) of order μ ∈ (1, 2). Approximating the CTFD and RSFD using the L1-algorithm and shifted Grünwald method, respectively, a computationally effective numerical method is presented to solve the TSFFPE-NST. The stability and convergence of the proposed numerical method are investigated. Finally, numerical experiments are carried out to support the theoretical claims.
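Both discretisations named in the abstract reduce, at each grid point, to weighted sums with coefficient sequences that have simple closed forms in the standard textbook definitions. The sketch below generates the L1 coefficients for a Caputo derivative of order α ∈ (0, 1) and the Grünwald-Letnikov weights used in the shifted Grünwald scheme for order μ ∈ (1, 2); the orders and truncation lengths are illustrative, and this is not the paper's full solver.

```python
# L1 coefficients for the Caputo derivative of order alpha in (0, 1):
#   b_j = (j + 1)**(1 - alpha) - j**(1 - alpha)
# Grunwald-Letnikov weights for fractional order mu, via the binomial-type
# recursion g_0 = 1, g_k = g_{k-1} * (1 - (mu + 1) / k), i.e. g_k = (-1)^k C(mu, k).

def l1_coefficients(alpha, n):
    """First n coefficients of the L1 time-stepping sum."""
    return [(j + 1) ** (1 - alpha) - j ** (1 - alpha) for j in range(n)]

def grunwald_weights(mu, n):
    """First n shifted-Grunwald weights for fractional order mu."""
    g = [1.0]
    for k in range(1, n):
        g.append(g[-1] * (1 - (mu + 1) / k))
    return g

print(l1_coefficients(0.5, 4))   # positive, monotonically decreasing sequence
print(grunwald_weights(1.8, 4))  # [1.0, -1.8, 0.72, 0.048]
```

The sign pattern of the Grünwald weights (g₀ > 0, g₁ < 0, g_k > 0 for k ≥ 2 when μ ∈ (1, 2)) and the monotone decay of the L1 coefficients are exactly the structural properties that stability and convergence proofs for such schemes typically exploit.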
Abstract:
Increasingly, software is no longer developed as a single system, but rather as a smart combination of so-called software services. Each of these provides an independent, specific and relatively small piece of functionality, which is typically accessible through the Internet from internal or external service providers. To the best of our knowledge, there are no standards or models that describe the sourcing process for these software-based services (SBS). We identify the sourcing requirements for SBS and relate the key characteristics of SBS to these sourcing requirements. Furthermore, we compare the sourcing of SBS with related work in the fields of classical procurement, business process outsourcing, and information systems sourcing. Based on this analysis, we conclude that the direct adoption of these approaches for SBS is not feasible and that new approaches are required for sourcing SBS.