921 results for Primary level


Relevance:

20.00%

Publisher:

Abstract:

Introduction and Aims: Remote delivery of interventions is needed to reach the large numbers of people with alcohol use disorders who are spread over large areas. Previous correspondence trials typically examined correspondence treatment as a stand-alone intervention. This study aimed to test whether adding postal treatment to general practitioner (GP) support would lower alcohol use more than GP intervention alone. Design and Methods: A single-blind, randomised controlled trial with a crossover design was conducted over 12 months with 204 people with alcohol use disorders. Participants in the immediate correspondence condition received treatment over the first 3 months; those in the delayed condition received it in months 3–6. Results: Few participants were referred by GPs, and the GPs themselves offered little intervention. At 3 months, 78% of participants remained in the study. Those in immediate treatment showed greater reductions in alcohol consumed per week, drinking days, anxiety, depression and distress than those in the delayed condition. However, post-treatment and follow-up outcomes still showed elevated alcohol use, depression, anxiety and distress. Greater baseline anxiety predicted better alcohol outcomes, although greater baseline mental distress predicted dropout. Discussion and Conclusions: The results were consistent with previous research on correspondence treatments and showed that high levels of participant engagement over 3 months can be obtained. Substantial reductions in alcohol use were seen, with indications that they are well maintained. However, many participants continued to show high-risk alcohol use and psychological distress.


In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. For evaluating knowledge management and organisational performance, non-financial measurements are argued to be more suitable, given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance such as the balanced scorecard or the concept of Intellectual Capital (IC) are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations because of the close link between KM and IC. KM is concerned with the processes of creating, storing, sharing and applying knowledge, and with the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures knowledge stocks at different ontological levels: the individual level (human capital), the group level (relational capital) and the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM.
As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on the existing literature, a theoretical model was developed to enable investigation of the relationship between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital and relational capital). As a result, five research questions (RQ) concerning the various factors of the KM infrastructure and the relationship between KM infrastructure and IC were raised and included in the research model:
RQ 1: Do nonprofit organisations with a Hierarchy culture have stronger IT support than nonprofit organisations with an Adhocracy culture?
RQ 2: Do nonprofit organisations with a centralised organisational structure have stronger IT support than nonprofit organisations with a decentralised organisational structure?
RQ 3: Do nonprofit organisations with stronger IT support have a higher value of Human Capital than nonprofit organisations with weaker IT support?
RQ 4: Do nonprofit organisations with stronger IT support have a higher value of Structural Capital than nonprofit organisations with weaker IT support?
RQ 5: Do nonprofit organisations with stronger IT support have a higher value of Relational Capital than nonprofit organisations with weaker IT support?
In order to investigate the research questions, measurements for IC were developed and linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach, focusing on two nonprofit organisations that provide trade promotion services through local offices worldwide. Data were collected via both qualitative and quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research; its purpose was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study was carried out through an online survey of staff in the various local offices; its purpose was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes emerged from the study:
• Knowledge Management and Intellectual Capital are complementary, which should be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing and applying knowledge) were performed.
• Where IT support was strong, participants reported higher levels of IC (human capital, structural capital and relational capital).
The study supported previous research in the field of KM and replicated the findings of other case studies in this area. It also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From a managerial perspective, the findings give clear indications that would allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.


Since the 1980s, industries and researchers have sought to better understand the quality of services, due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as multidimensional and multi-level; this hierarchical approach to SQ measurement better reflects how customers form perceptions. In line with the original intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating it. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level, encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it may be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work is needed to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension (each higher-level SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
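The contrast between the two scoring approaches can be illustrated with a short sketch. The five dimensions follow SERVQUAL, but the ratings, function names and 7-point scale below are hypothetical illustrations, not data or code from the candidate's study:

```python
# Illustrative sketch of the two SQ scoring approaches discussed above.
# All ratings are hypothetical 7-point Likert scores.

DIMENSIONS = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]

def perceptions_only(perceptions):
    """Overall SQ as the mean of perception ratings (perceptions-only approach)."""
    return sum(perceptions[d] for d in DIMENSIONS) / len(DIMENSIONS)

def disconfirmation_gaps(perceptions, expectations):
    """Per-dimension gap scores (perception minus expectation);
    negative gaps flag dimensions falling short of expectations."""
    return {d: perceptions[d] - expectations[d] for d in DIMENSIONS}

# Hypothetical ratings for one service encounter.
p = {"reliability": 5, "assurance": 6, "tangibles": 4, "empathy": 5, "responsiveness": 3}
e = {"reliability": 6, "assurance": 6, "tangibles": 4, "empathy": 6, "responsiveness": 6}

print(perceptions_only(p))               # single overall SQ score: 4.6
gaps = disconfirmation_gaps(p, e)
print(min(gaps, key=gaps.get))           # largest shortfall: responsiveness
```

The sketch mirrors the trade-off stated above: perceptions-only yields one straightforward overall score, while the disconfirmation gaps pinpoint which dimension needs improvement.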


Objective: The aim of this paper was to examine self-efficacy and perceived appropriateness among rural general practitioners (GPs) in regard to screening and intervention for physical, lifestyle and mental health issues.
Method: Fifty GPs from 25 practices in eight rural Queensland towns completed a written survey designed for the study.
Results: GPs rated opportunistic screening or assessment for smoking and for detection of relapse of mental disorders as the most appropriate, with even cardiovascular and diabetes risk falling behind these. Self-efficacy was highest for medical disorders and for smoking assessment; it was significantly lower for alcohol, mental health issues, and addressing risks of physical disorder in people with mental disorders.
Conclusions: High appropriateness ratings suggest that current strategies to boost GPs' self-efficacy in addressing mental health issues are timely.


INTRODUCTION
In their target article, Yuri Hanin and Muza Hanina outlined a novel multidisciplinary approach to performance optimisation for sport psychologists called the Identification-Control-Correction (ICC) programme. According to the authors, this empirically verified, psycho-pedagogical strategy is designed to improve the quality of coaching and consistency of performance in highly skilled athletes, and involves a number of steps including: (i) identifying and increasing self-awareness of ‘optimal’ and ‘non-optimal’ movement patterns for individual athletes; (ii) learning to deliberately control the process of task execution; and (iii) correcting habitual and random errors and managing radical changes of movement patterns. Although no specific examples were provided, the ICC programme has apparently been successful in enhancing the performance of Olympic-level athletes. In this commentary, we address what we consider to be some important issues arising from the target article. We specifically focus attention on the contentious topic of optimisation in neurobiological movement systems, the role of constraints in shaping emergent movement patterns, and the functional role of movement variability in producing stable performance outcomes. In our view, the target article and, indeed, the proposed ICC programme would benefit from a dynamical systems theoretical backdrop rather than the cognitive scientific approach that appears to be advocated. Although Hanin and Hanina made reference to, and attempted to integrate, constructs typically associated with dynamical systems theoretical accounts of motor control and learning (e.g., Bernstein’s problem, movement variability, etc.), these ideas required more detailed elaboration, which we provide in this commentary.


This report focuses on risk-assessment practices in the private rental market, with particular consideration of their impact on low-income renters. It is based on fieldwork undertaken in the second stage of the research process, following completion of the Positioning Paper. The key research questions this study addressed were: What are the various factors included in ‘risk-assessments’ by real estate agents in allocating ‘affordable’ tenancies? How are these risks quantified and managed? What are the key outcomes of their decision-making? The study builds on previous research demonstrating that a relatively large proportion of low-cost private rental accommodation is occupied by moderate- to high-income households (Wulff and Yates 2001; Seelig 2001; Yates et al. 2004). This is occurring in an environment where the private rental sector is now the de facto main provider of rental housing for lower-income households across Australia (Seelig et al. 2005) and where a number of factors are implicated in patterns of ‘income–rent mismatching’. These include ongoing shifts in public housing assistance; issues concerning eligibility for rent assistance; ‘supply’ factors, such as loss of low-cost rental stock through upgrading and/or transfer to owner-occupied housing; patterns of supply and demand driven largely by middle- to high-income owner-investors and renters; and patterns of housing need among low-income households for whom affordable housing is not appropriate. In formulating a way of approaching the analysis of ‘risk-assessment’ in rental housing management, this study has applied three sociological perspectives on risk: Beck’s (1992) formulation of risk society as entailing processes of ‘individualisation’; a socio-cultural perspective which emphasises the situated nature of perceptions of risk; and a perspective which has drawn attention to different modes of institutional governance of subjects, as ‘carriers of specific indicators of risk’.
The private rental market was viewed as a social institution, and the research strategy was informed by ‘institutional ethnography’ as a method of enquiry. The study was based on interviews with property managers, real estate industry representatives, tenant advocates and community housing providers. The primary focus of inquiry was on ‘the moment of allocation’. Six local areas across metropolitan and regional Queensland, New South Wales and South Australia were selected as case study localities. In terms of the main findings, it is evident that access to private rental housing is not just a matter of ‘supply and demand’. It is also about assessment of risk among applicants. Risk – perceived or actual – is thus a critical factor in deciding who gets housed, and how. Risk and its assessment matter in the context of housing provision and in the development of policy responses. The outcomes from this study also highlight a number of salient points:
1. There are two principal forms of risk associated with property management: financial risk and risk of litigation.
2. Certain tenant characteristics and/or circumstances – ability to pay and ability to care for the rented property – are the main factors focused on in assessing risk among applicants for rental housing. Signals of either ‘(in)ability to pay’ and/or ‘(in)ability to care for the property’ are almost always interpreted as markers of high levels of risk.
3. The processing of tenancy applications entails a complex and variable mix of formal and informal strategies of risk-assessment and allocation, where sorting (out), ranking, discriminating and handing over characterise the process.
4. In the eyes of property managers, ‘suitable’ tenants can be conceptualised as those who are resourceful, reputable, competent, strategic and presentable.
5. Property managers clearly articulated concern about risks entailed in a number of characteristics or situations.
Being on a low income was the principal and overarching factor which agents considered. Others included:
- unemployment
- ‘big’ families; sole-parent families
- domestic violence
- marital breakdown
- a shift from home ownership to private rental
- Aboriginality and specific ethnicities
- physical incapacity
- aspects of ‘presentation’.
The financial vulnerability of applicants in these groups can be invoked, alongside expressed concerns about compromised capacities to manage income and/or ‘care for’ the property, as legitimate grounds for rejection or a lower ranking.
6. At the level of face-to-face interaction between the property manager and applicants, more intuitive assessments of risk based upon past experience or ‘gut feelings’ come into play. These judgements are interwoven with more systematic procedures of tenant selection.
The findings suggest that considerable ‘risk’ is associated with low-income status, either directly or insofar as it is associated with other forms of perceived risk, and that such risks are likely to impede access to the professionally managed private rental market. Detailed analysis suggests that opportunities for access to housing by low-income householders also arise where, for example:
- the ‘local experience’ of an agency and/or property manager works in favour of particular applicants
- applicants can demonstrate available social support and financial guarantors
- an applicant’s preference or need for longer-term rental is seen to provide a level of financial security for the landlord
- applicants are prepared to agree to specific, more stringent conditions for inspection of properties and review of contracts
- the particular circumstances and motivations of landlords lead them to consider a wider range of applicants
- in particular circumstances, property managers are prepared to give special consideration to applicants who appear worthy, albeit ‘risky’.
The strategic actions of demonstrating and documenting on the part of vulnerable (low-income) tenant applicants can improve their chances of being perceived as resourceful, capable and ‘savvy’. Such actions are significant because they help to persuade property managers not only that the applicant may have sufficient resources (personal and material) but that they accept that the onus is on themselves to show they are reputable, and that they have valued ‘competencies’ and understand ‘how the system works’. The parameters of the market do shape the processes of risk-assessment and, ultimately, the strategic relation of power between property manager and the tenant applicant. Low vacancy rates and limited supply of lower-cost rental stock, in all areas, mean that there are many more tenant applicants than available properties, creating a highly competitive environment for applicants. The fundamental problem of supply is an aspect of the market that severely limits the chances of access to appropriate and affordable housing for low-income rental housing applicants. There is recognition of the impact of this problem of supply. The study indicates three main directions for future focus in policy and program development: providing appropriate supports to tenants to access and sustain private rental housing, addressing issues of discrimination and privacy arising in the processes of selecting suitable tenants, and addressing problems of supply.


Efforts to improve mathematics and science content knowledge have in many institutions required redefining teacher education through new teaching and learning. See, for example, Peard & Pumadevi (2007) for an account of one such attempt involving the development of a Foundations Unit, Scientific and Quantitative Literacy. This unit is core for all first year pre-service primary teacher education students at Queensland University of Technology (QUT) and two Education Institutes in Malaysia, Institute Perguruan Raja Melewar (IPRM), and Institute Perguruan Teknik (IPT) Kuala Lumpur. Since then, QUT has modified the unit to adopt a thematic approach to the same content. An aim of the unit rewrite was the development of a positive attitude and disposition to the teaching and learning of mathematics and science, with a curiosity and willingness to speculate about and explore the world. Numeracy was specifically identified within the mathematics encountered and appropriately embedded in the science learning area. The importance of the ability to engage in communication of and about mathematics and science was considered crucial to the development of pre-service primary teachers. Cognisance was given to the appropriate selection and use of technology to enhance learning: digital technologies were embedded in the teaching, learning and assessment of the unit to avoid being considered as an optional extra. This was achieved around the theme of “the sustainable school”. This ‘sustainability’ theme was selected due to its prominence in Australia’s futures-oriented National Curriculum which will be implemented in 2011. This paper outlines the approach taken to the implementation of the unit and discusses early indicators of its effectiveness.



This paper examines the enabling effect of using blended learning and synchronous internet-mediated communication technologies to improve learning and develop a Sense of Community (SOC) in a group of post-graduate students consisting of a mix of on-campus and off-campus students. Both quantitative and qualitative data collected over a number of years support the assertion that the blended learning environment enhanced both teaching and learning. The development of a SOC was pivotal to the success of the blended approach when working with geographically isolated groups within a single learning environment.


Examined the social adaptation of 32 children in grades 3–6 with mild intellectual disability: 13 Ss were partially integrated into regular primary school classes and 19 Ss were full-time in separate classes. Sociometric status was assessed using best friend and play rating measures. Consistent with previous research, children with intellectual disability were less socially accepted than were a matched group of 32 children with no learning disabilities. Children in partially integrated classes received more play nominations than those in separate classes, but had no greater acceptance as a best friend. On teachers' reports, disabled children had higher levels of inappropriate social behaviours, but there was no significant difference in appropriate behaviours. Self-assessments by integrated children were more negative than those by children in separate classes, and their peer-relationship satisfaction was lower. Ratings by disabled children of their satisfaction with peer relationships were associated with ratings of appropriate social skills by themselves and their teachers, and with self-ratings of negative behaviour. The study confirmed that partial integration can have negative consequences for children with an intellectual disability.


Tested the hypothesis that level of performance and persistence in completing tasks is affected by mood. 44 female and 41 male college students received tape-recorded instructions to recall vividly happy or sad experiences or to imagine a neutral situation. Results for the primary dependent variables on which a mood difference was predicted were analyzed with a multivariate analysis of variance (MANOVA). After the induction, happy Ss persisted longer at an anagrams task and solved more anagrams than sad Ss. Women were also faster at reaching solutions when happy than when sad. Results support the hypothesis that positive moods promote persistence and ultimate success, but they raise questions about the role of self-efficacy and the sources of gender differences.


Worldwide, education systems have undergone unprecedented change due to a variety of economic, social, and political forces (Limerick, Cunnington & Crowther, 2002). The People’s Republic of China (PRC) is no exception. Continuous educational reform at primary and secondary levels in Mainland China has created new challenges and accountabilities for school principals. The important role of principals in primary and secondary schools has been acknowledged in both policy documents and the broader literature (Central Committee of the Chinese Communist Party, 1985; F. Chen, 2005; Chu, 2003; W. Huang, 2005; T. Wang, 2003). Yet most of the literature on primary and secondary school principals in Mainland China is prescriptive in nature, identifying from the perspectives of researchers and academics what principals should do and how they should enact leadership. Lacking in this research is an awareness of the daily practices and lived experiences of principals. Furthermore, within the small body of writing on primary and secondary school principals in Mainland China, gender is seldom given any attention. To date, only a small number of empirical studies have focused on female principals as a specific category of research (Zen, 2004; Zhong, 2004). This study aimed to explore the professional lives of two exemplary female principals in urban primary schools in Mainland China. A qualitative exploratory case study was used. Semi-structured interviews were conducted with each female principal, with six teachers at each school site, and with each principal’s superintendent. Field observations and document analysis were also undertaken to obtain multiple insights into their leadership practices.
The conceptual framework was based largely on the theory of Gronn (1999) and incorporated five core leadership practices (vision building, ethical considerations, teaching and learning, power utilisation, and dealing with risks and challenges) taken from the wider literature. The key findings of this study were twofold. Firstly, while all five leadership practices were evident in the leadership of the two principals, the study identified some subtle differences in the way each principal approached them. Secondly, contextual factors such as Chinese traditional culture, the contemporary societal context, and the school organisational context, together with the biographical experiences of each principal, were significant in shaping the way in which they exercised their leadership practices in their schools.

Resumo:

Interactive educational courseware has been adopted across diverse education sectors, including primary, secondary, and tertiary education as well as vocational and professional training. In the Malaysian educational context, the Ministry of Education has implemented the Smart School Project, which aims to raise academic achievement in primary and secondary schools through the use of interactive educational courseware. However, many researchers have reported that much courseware fails to accommodate learner and teacher needs; in particular, the interface is not designed appropriately in terms of quality of learning. This paper reviews the educational courseware development process with respect to defining the quality of interface design, and suggests a conceptual model of interface design that integrates design components and the interactive learning experience into the development process. As a result, it defines the concept of interactive learning experience in a more practical way, so that each stage of the development process can be implemented in a seamless and integrated manner.

Resumo:

Association rule mining is one technique that is widely used when querying databases, especially transactional ones, in order to obtain useful associations or correlations among sets of items. Much work has been done on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now being common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this paper we propose two approaches for measuring multi-level association rules to help evaluate their interestingness. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
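The Support-Confidence framework criticised above can be stated concretely. The following is a minimal illustrative sketch of those two baseline measures on a toy transactional dataset; it does not implement the diversity and peculiarity measures proposed in the paper, and the item names are invented for illustration.

```python
# Toy transactional dataset; each transaction is a set of items.
transactions = [
    {"milk", "bread"},
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset, transactions):
    """supp(X): fraction of transactions containing every item in X."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(antecedent, consequent, transactions):
    """conf(A -> B) = supp(A ∪ B) / supp(A)."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

print(support({"milk", "bread"}, transactions))       # 0.5
print(confidence({"milk"}, {"bread"}, transactions))  # ~0.667
```

Note that both measures treat items as flat symbols: a rule over "milk" and a rule over its parent category "dairy" would be scored independently, which is exactly the hierarchy-blindness the paper's multi-level measures are meant to address.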

Resumo:

Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a major concern and has drawn increasing attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often a huge number of rules can be extracted from a dataset, but many of them are redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, covering both exact and approximate rules. An important contribution of this paper is the proposal to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, so as to eliminate as many redundant rules as possible without reducing the inference capacity of, or the belief in, the remaining non-redundant rules. We prove that redundancy elimination based on the proposed Reliable basis does not reduce the belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis. The Reliable basis is therefore a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
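The abstract does not define the certainty factor it adopts, so the sketch below assumes the standard Shortliffe-Buchanan formulation commonly used in rule mining: the factor compares the rule's confidence conf(A → B) against the prior supp(B), normalised so that the result lies in [-1, 1].

```python
def certainty_factor(conf_ab, supp_b):
    """Certainty factor of rule A -> B (standard Shortliffe-Buchanan form;
    assumed here, since the paper's exact definition is not given in the abstract).

    Positive when A raises belief in B above its prior supp(B),
    negative when A lowers it, and 0 when A is irrelevant to B.
    """
    if conf_ab > supp_b:
        return (conf_ab - supp_b) / (1.0 - supp_b)
    if conf_ab < supp_b:
        return (conf_ab - supp_b) / supp_b
    return 0.0

# A rule whose confidence merely equals the prior of B carries no information:
print(certainty_factor(0.6, 0.6))  # 0.0
# A rule lifting B from a prior of 0.5 to a confidence of 0.9:
print(certainty_factor(0.9, 0.5))  # 0.8
```

Under this formulation, rules whose confidence does not exceed the consequent's prior score at or below zero, which illustrates how a certainty-factor threshold can draw a principled boundary between informative and redundant rules.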