892 results for Lifelong learning: one focus, different systems
Abstract:
This article describes the types of discourse 10 Australian grade 4-6 teachers used after they had been trained to embed cooperative learning in their curriculum and to use communication skills to promote students' thinking and scaffold their learning. One audiotaped classroom social science lesson involving cooperative learning was analyzed for each teacher. We provide vignettes from 2 teachers as they worked with groups and from 2 student groups. The data from the audiotapes showed that the teachers used a range of mediated-learning behaviors in their interactions with the children, including challenging their perspectives, asking more cognitive and metacognitive questions, and scaffolding their learning. In turn, in their interactions with each other, the children modelled many of the types of discourse they heard their teachers use. Follow-up interviews with the teachers revealed that they believed it was important to set expectations for children's group behaviors, teach the social skills students needed to deal with disagreement in groups, and establish group structures so children understood what was required both of each other and of the task. The teachers reported that mixed-ability, mixed-gender groups worked best and that groups should be no larger than 5 students. All teachers' programs were based on a child-centered philosophy that recognized the importance of constructivist approaches to learning and the key role interaction plays in promoting social reasoning and learning.
Abstract:
Historically, perhaps because of its matching-process traditions, career counselling has tended to be viewed more simplistically than other fields of counselling. However, in the latter part of the 20th century the career development industry witnessed rapid growth and seems set for a promising future. Such growth has corresponded with irreversible change in the world of work, the emergence of lifelong learning as integral to people's careers, and broader and more holistic definitions of career and career development that have gained widespread acceptance. With the increased influence of constructivism, career counselling has emerged from its vocational guidance origins as a profession in its own right. Increasingly, policymakers are recognizing the importance of career guidance and counselling in helping to achieve policy goals related to lifelong learning, employment, and social equity. Thus, closer links have been created between policymakers and practitioner associations such as the Australian Association of Career Counsellors (AACC). Such intense focus on career guidance and counselling has also resulted in closer scrutiny of its professional standards and qualifications. Consequently, at the same time as demand for and interest in career counselling have increased, practitioner associations are faced with issues related to redefining their roles with members, the community, and policymakers. This article describes the changed context of career counselling and current issues such as standards, accreditation, and the redefinition of the profession, focusing on the AACC's response to these challenges.
Abstract:
The relative merits of different systems of property rights to allocate water among different extractive uses are evaluated for the case where variability of supply is important. Three systems of property rights are considered. In the first, variable supply is dealt with through the use of water entitlements defined as shares of the total quantity available. In the second, there are two types of water entitlements, one for water with a high security of supply and the other a lower-security right for the residual supply. The third is a system of entitlements specified as state-contingent claims. With zero transaction costs, all systems are efficient. In the realistic situation where transaction costs matter, the system based on state-contingent claims is globally optimal, and the system with high-security and lower-security entitlements is preferable to the system with share entitlements.
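The difference between the first two systems can be made concrete with a small allocation sketch (hypothetical holders and quantities, not from the article; the state-contingent system is omitted because it requires specifying a full set of supply states):

```python
def allocate_shares(available, shares):
    # Share entitlements: each holder owns a fixed fraction of whatever
    # total supply actually materialises in a given season.
    return {h: available * s for h, s in shares.items()}

def allocate_two_tier(available, high, low):
    # High-security entitlements are filled first (pro rata if supply
    # falls short); low-security holders share any residual pro rata,
    # capped at their nominal entitlement.
    total_high = sum(high.values())
    scale_high = min(1.0, available / total_high) if total_high else 0.0
    alloc = {h: q * scale_high for h, q in high.items()}
    residual = max(0.0, available - total_high)
    total_low = sum(low.values())
    scale_low = min(1.0, residual / total_low) if total_low else 0.0
    for h, q in low.items():
        alloc[h] = alloc.get(h, 0.0) + q * scale_low
    return alloc
```

With 80 units available, a 50/50 share split gives each holder 40 units, whereas a 50-unit high-security holder receives the full 50 and a 50-unit low-security holder only the 30-unit residual, mirroring the risk ranking the abstract describes.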
Abstract:
‘Adolescence’ has become increasingly recognised as a nebulous concept. Previous conceptualisations of adolescence have adopted a ‘deficit’ view, regarding teenagers as ‘unfinished’ adults. The deficit view of adolescence is highly problematic in an era where adulthood itself is difficult to define. The terms ‘kidult’ and ‘adultescent’ have emerged to describe adult-age people whose interests and priorities match those of their teenage counterparts. Rather than relying on ‘lock-step’ models of physical, cognitive and social growth put forward by developmental psychology, adolescence can be more usefully defined by looking at the common experiences of people in their teenage years. Common experiences arise at an institutional level; for example, all adolescents are treated the same by legal and education systems. The transition from primary to secondary schooling is a milestone for all children, exposing them to a new type of educational environment. Shared experiences also arise from generational factors. Today’s adolescents belong to the millennial generation, characterised by technological competence, global perspectives, high susceptibility to media influence, individualisation and rapid interactions. This generation focuses on teamwork, achievement, modesty and good conduct, and has great potential for significant collective accomplishments. These generational factors challenge educators to provide relevant learning experiences for today’s students. Many classrooms still utilise textbook-based pedagogy more suited to previous generations, resulting in disengagement among millennial students. Curriculum content must also be tailored to generational needs. The rapid pace of change, as well as the fluidity of identity created by dissolving geographical and vocational boundaries, means that the millennial generation will need more than a fixed set of skills and knowledge to enter adulthood.
Teachers must enable their students to think like ‘expert novices’, adept at assimilating new concepts in depth and prepared to engage in lifelong learning.
Abstract:
To participate effectively in the post-industrial information societies and knowledge/service economies of the 21st century, individuals must be better informed; have greater thinking and problem-solving abilities; be self-motivated; have a capacity for cooperative interaction; possess varied and specialised skills; and be more resourceful and adaptable than ever before. This paper reports on one outcome from a national project funded by the Ministerial Council on Education, Employment, Training and Youth Affairs, which investigated what practices, processes, strategies and structures best promote lifelong learning and the development of lifelong learners in the middle years of schooling. The investigation linked lifelong learning with middle schooling because there were indications that middle schooling reform practices also lead to the development of lifelong learning attributes, which is regarded as a desirable outcome of schooling in Australia. While the larger project provides depth around these questions, this paper specifically reports on the development of a three-phase model that can guide the sequence in which schools undertaking middle schooling reform attend to particular core component changes. The model is developed from an extensive analysis of 25 innovative schools around the nation, and provides a unique insight into the desirable sequences and time spent achieving reforms, along with typical pitfalls that lead to a regression in the reform process. Importantly, the model confirms that schooling reform takes much more time than planners typically expect or allocate, and that there are predictable and identifiable inhibitors to achieving it.
Abstract:
NASA is working on complex future missions that require cooperation between multiple satellites or rovers. To implement these systems, developers are proposing and using intelligent and autonomous systems. These autonomous missions are new to NASA, and the software development community is just learning to develop such systems. With these new systems, new verification and validation techniques must be used. Current techniques have been developed based on large monolithic systems. These techniques have worked well and reliably, but do not translate to the new autonomous systems that are highly parallel and nondeterministic.
Abstract:
In recent years there has been increased interest in applying non-parametric methods to real-world problems. Significant research has been devoted to Gaussian processes (GPs) due to their increased flexibility when compared with parametric models. These methods use Bayesian learning, which generally leads to analytically intractable posteriors. This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior. In the first step we adapt Bayesian online learning to GPs: the final approximation to the posterior is the result of propagating the first and second moments of intermediate posteriors obtained by combining a new example with the previous approximation. The propagation of functional forms is solved by showing the existence of a parametrisation of the posterior moments that uses combinations of the kernel function at the training points, transforming the Bayesian online learning of functions into a parametric formulation. The drawback is the prohibitive quadratic scaling of the number of parameters with the size of the data, making the method inapplicable to large datasets. The second step solves the problem of the exploding parameter size and makes GPs applicable to arbitrarily large datasets. The approximation is based on a measure of distance between two GPs, the KL-divergence between GPs. This second approximation uses a constrained GP in which only a small subset of the whole training dataset is used to represent the GP. This subset is called the Basis Vector (BV) set, and the resulting GP is a sparse approximation to the true posterior. As this sparsity is based on the KL-minimisation, it is probabilistic and independent of the way the posterior approximation from the first step is obtained. We combine the sparse approximation with an extension to the Bayesian online algorithm that allows multiple iterations for each input, thus approximating a batch solution.
The resulting sparse learning algorithm is generic: for different problems we only change the likelihood. The algorithm is applied to a variety of problems, and we examine its performance both on classical regression and classification tasks and on data assimilation and a simple density-estimation problem.
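The mechanics of predicting with a GP restricted to a small basis set can be sketched as follows (a plain subset-of-data simplification with an assumed RBF kernel and noise level, not the thesis's KL-based BV selection):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def sparse_gp_predict(x_bv, y_bv, x_star, noise=0.1):
    # GP posterior mean and variance computed using only the
    # basis-vector (BV) subset of the training data.
    K = rbf(x_bv, x_bv) + noise * np.eye(len(x_bv))
    k_star = rbf(x_star, x_bv)
    alpha = np.linalg.solve(K, y_bv)
    mean = k_star @ alpha
    # Prior variance is 1 for this kernel; subtract the explained part.
    var = 1.0 - np.sum(k_star * np.linalg.solve(K, k_star.T).T, axis=1)
    return mean, var
```

In the full algorithm the BV set is chosen by minimising the KL-divergence to the current posterior; here it is simply a given subset, which already shows how the cost drops from the full training size to the BV-set size.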
Abstract:
Objective: To investigate the extent to which measures of nonlinearity derived from surrogate data analysis can quantify the changes of epileptic activity related to varying vigilance levels. Methods: Surface and intracranial EEG from foramen ovale (FO) electrodes was recorded over one night from a patient with temporal lobe epilepsy under presurgical evaluation. Different measures of nonlinearity were estimated for non-overlapping 30-s segments for selected channels from surface and intracranial EEG. Additionally, spectral measures were calculated. Sleep stages were scored according to Rechtschaffen and Kales, and epileptic transients were counted and classified by visual inspection. Results: In the intracranial recordings, stronger nonlinearity was found ipsilateral to the epileptogenic focus, more pronounced in NREM sleep and weaker in REM sleep. The dynamics within the NREM episodes varied with the different nonlinearity measures. Some nonlinearity measures showed variations with the sleep cycle also in the intracranial recordings contralateral to the epileptic focus and in the surface EEG. It is shown that the nonlinearity is correlated with short-term fluctuations of the delta power. The higher frequency of occurrence of clinically relevant epileptic spikes in the first NREM episode was not clearly reflected in the nonlinearity measures. Conclusions: It was confirmed that epileptic activity renders the EEG nonlinear. However, it was also shown that the sleep dynamics itself affects the nonlinearity measures. Therefore, at the present stage it is not possible to establish a unique connection between the studied nonlinearity measures and specific types of epileptic activity in sleep EEG recordings.
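The surrogate-data logic can be illustrated with a minimal sketch (assumed choices throughout: FFT phase-randomised surrogates, a time-reversal-asymmetry statistic, and a rank score; the paper's actual measures may differ):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    # Surrogate with the same power spectrum but randomised Fourier
    # phases, destroying nonlinear structure while preserving linear
    # correlations.
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0   # keep the mean
    phases[-1] = 0.0  # keep the Nyquist bin real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

def time_asymmetry(x):
    # A simple nonlinearity statistic: third-order asymmetry under
    # time reversal of the increments.
    d = np.diff(x)
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

def nonlinearity_score(x, n_surrogates=19, seed=0):
    # Rank the statistic of the original segment against its surrogates;
    # values near 1 indicate nonlinearity.
    rng = np.random.default_rng(seed)
    t0 = abs(time_asymmetry(x))
    ts = [abs(time_asymmetry(phase_randomized_surrogate(x, rng)))
          for _ in range(n_surrogates)]
    return sum(t0 > t for t in ts) / n_surrogates
```

Applied per 30-s segment, a score near 1 means the segment's statistic exceeds almost all of its surrogates, i.e. evidence that the segment is nonlinear.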
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they lack universality and cannot provide adequate estimates of effort and hence cost. To address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method.

The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which indicates the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects.

The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
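The two estimation routes, productivity-scaled size metrics (the FPA route) and lines-of-code models (the COCOMO route), can be sketched as follows (illustrative only: hypothetical project data, and the standard basic-COCOMO organic-mode coefficients rather than anything calibrated for JSD):

```python
def effort_from_productivity(new_size, past_sizes, past_efforts):
    # FPA-style top-down estimate: productivity observed on completed
    # projects (size units per person-month) scales the new size metric.
    productivity = sum(past_sizes) / sum(past_efforts)
    return new_size / productivity

def cocomo_basic_effort(kdsi, a=2.4, b=1.05):
    # Basic COCOMO, organic mode: effort in person-months from KDSI
    # (thousands of delivered source instructions).
    return a * kdsi ** b
```

A JSD-COCOMO-style pipeline would first turn process-structure counts into an instruction count and then feed that KDSI figure to the second function.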
Abstract:
In the field of mental health risk assessment, there is no standardisation between the data used in different systems. As a first step towards the possible interchange of data between assessment tools, an ontology has been constructed for a particular one, GRiST (Galatean Risk Screening Tool). We briefly introduce GRiST and its data structures, then describe the ontology and the benefits that have already been realised from the construction process. For example, the ontology has been used to check the consistency of the various trees used in the model. We then consider potential uses in integration of data from other sources. © 2009 IEEE.
Abstract:
This work presents a model for the development of project proposals by students as an approach to teaching information technology while promoting entrepreneurship and reflection. In teams of 3 to 5 participants, students elaborate a project proposal on a topic they have negotiated with each other and with the teacher. The project domain is related to the practical application of state-of-the-art information technology in areas of substantial public interest or of immediate interest to the participants. This gives them ample opportunities for reflection not only on technical but also on social, economic, environmental and other dimensions of information technology. This approach has long been used with students of different years and programs of study at the Faculty of Mathematics and Informatics, Plovdiv University “Paisiy Hilendarski”. It has been found to develop all eight key competences for lifelong learning set forth in the Reference Framework, as well as procedural skills required in real life.
Abstract:
This empirical study investigates the performance of cross-border M&A. The first stage is to identify the determinants of completing cross-border M&A deals. One focus here is to extend the existing empirical evidence in the field of cross-border M&A and to examine the likelihood of M&A from a different perspective. Given the determinants of cross-border M&A completions, the second stage is to investigate the effects of cross-border M&A on post-acquisition firm performance for both targets and acquirers. The thesis exploits a hitherto unused database consisting of firms that were rumoured to be undertaking M&A, following each deal to completion or abandonment. This approach highlights a number of limitations of the previous literature, which relies on statistical methodology to identify potential but non-existent mergers. This thesis challenges some conventional understanding of M&A activity. Cross-border M&A activity is underpinned by various motives, such as synergy, management discipline, and the acquisition of complementary resources. Traditionally, it is believed that these motives will boost international M&A activity and improve firm performance after takeovers. However, this thesis shows that factors based on these motives, such as the acquirer's profitability and liquidity and the target's intangible resources, actually deterred the completion of cross-border M&A in the period 2002-2011. The overall finding suggests that cross-border M&A is an efficiency-seeking activity rather than a resource-seeking activity. Furthermore, compared with firms in takeover rumours, the completion of M&A lowers firm performance. More specifically, difficulties in the transfer of competitive advantages and the integration of strategic assets lead to low firm performance in terms of productivity. Moreover, firms cannot realise the synergistic and managerial disciplinary effects once a cross-border M&A is completed, which suggests a low post-acquisition profitability level.
Abstract:
There is nothing more difficult to plan, more doubtful of success, nor more dangerous to manage than the creation of a new system. For the initiator has the enmity of all who would profit by the preservation of the old system, and merely lukewarm defenders in those who should gain by the new one. (N. Machiavelli, 1513)

The purpose of this paper is twofold. First, we want to challenge the notion of "human capital" as "education, training and work experience" and suggest that it is the "quality of the workforce" that matters, here defined as the set of characteristics that allow workers to function in a specific institutional and historical context. Our main conclusion is that the quality of the workforce is affected by the institutional environment where the workers live, and that it can therefore vary across countries and institutional contexts. Second, we want to show the empirical relevance of this last point by testing the extent to which the quality of institutions (here proxied by the governance indicators of Kaufmann et al. (2007)) affects the quality of the workforce (proxied by the percentage of the working-age population registered in a lifelong learning program). Our empirical analysis is conducted on a dataset of 11 European countries observed over the period 1996-2006. The results indicate that countries with better governance indicators are also endowed with a more qualified workforce. © 2011 American Journal of Economics and Sociology, Inc.
Abstract:
In the year 2001, the Commission on Dietetic Registration (CDR) will begin a new process of recertifying Registered Dietitians (RDs) using a self-directed lifelong learning portfolio model. The model, entitled Professional Development 2001 (PD 2001), is designed to increase competency through targeted learning. The portfolio consists of five steps: reflection, learning needs assessment, formulation of a learning plan, maintenance of a learning log, and evaluation of the learning plan. By targeting learning, PD 2001 is predicted to foster more up-to-date practitioners than the current method, which requires only a quantity of continuing education hours. This is the first major change in the credentialing system since 1975. The success or failure of the new system will affect the future of approximately 60,000 practitioners. The purpose of this study was to determine the readiness of RDs to change to the new system. Since the model depends on setting goals and developing learning plans, this study examined the methods dietitians use to determine their five-year goals and direction in practice. It also determined RDs' attitudes towards PD 2001 and identified some of the factors that influenced their beliefs. A dual methodological design using focus groups and questionnaires was utilized. Sixteen focus groups were held during state dietetic association meetings. Demographic data were collected, using a self-administered questionnaire, on the 132 registered dietitians who participated in the focus groups. The audiotaped sessions were transcribed into 643 pages of text and analyzed using Non-numerical Unstructured Data Indexing, Searching and Theorizing (NUD*IST, version 4). Thirty-four of the 132 participants (26%) had formal five-year goals. Fifty-four participants (41%) performed annual self-assessments.
In general, dietitians did not currently have professional goals, nor did they conduct self-assessments, and they claimed they did not have the skills or confidence to perform these tasks. Major barriers to successful implementation of PD 2001 are uncertainty, misinterpretation, and misinformation about the process and its purpose, which in turn contribute to negative impressions. A renewed effort to communicate a positive, accurate message, along with strategies for goal setting, will be necessary for better acceptance of this professional development process.
Abstract:
Land use and transportation interaction has been a research topic for several decades. There have been efforts to identify the impacts of transportation on land use from several different perspectives. One focus has been the role of transportation improvements in encouraging new land developments or the relocation of activities due to improved accessibility. The impacts studied have included property values and increased development. Another focus has been on changes in travel behavior due to better mobility and accessibility. Most studies to date have been conducted at the metropolitan level and are thus unable to account for interactions spatially and temporally at smaller geographic scales.

In this study, a framework for studying the temporal interactions between transportation and land use was proposed and applied to three selected corridor areas in Miami-Dade County, Florida. The framework consists of two parts: one is the development of temporal data, and the other is the application of time series analysis to these data to identify their dynamic interactions. Temporal GIS databases were constructed and used to compile building permit data and transportation improvement projects. Two types of time series approaches were utilized: univariate models and multivariate models. Time series analysis describes the dynamics of a series by developing models and forecasting the future of the system based on historical trends. Model estimation results from the selected corridors were then compared.

It was found that the time series models predicted residential development better than commercial development. It was also found that results from the three study corridors varied in terms of the magnitude of impacts, length of lags, significance of the variables, and model structure. The long-run, cumulative impact of transportation improvements on land development was also measured with time series techniques.
The study offered evidence that congestion negatively impacted development and that transportation investments encouraged land development.
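As a flavour of the univariate side of such an analysis, an AR(1) model fitted to a permit-count series by least squares might look like this (a minimal sketch with made-up data; the study's actual model specifications are not given in the abstract):

```python
import numpy as np

def fit_ar1(y):
    # Least-squares fit of y[t] = c + phi * y[t-1] + e[t].
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return c, phi

def forecast_ar1(c, phi, last, steps):
    # Iterate the fitted recursion forward from the last observation.
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

Multivariate extensions would add lagged transportation-investment terms to the regressors, which is how lag lengths and long-run cumulative effects like those reported above are read off.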