886 results for “learning on the job”
Abstract:
This qualitative, phenomenological study investigated first-generation students' perceptions of the challenges they experienced in accessing higher education and the type of school-based support they received. Particular emphasis was placed on the impact of parental education level on access to postsecondary education (PSE) and on how differences in support at the primary and secondary levels of schooling influenced access. Purposeful, homogeneous sampling was used to select six first-generation students attending a postsecondary institution in Ontario. Analysis of the data revealed several interrelated factors that affect first-generation students' access to postsecondary education: familial experiences and expectations, school streaming practices, secondary school teachers' and guidance counselors' representations of postsecondary education, and the nature of the school-based support that participants received. The implications for theory, research, and practice are discussed, and recommendations for enhancing school-based support to ensure equitable access to postsecondary education for first-generation students are provided.
Abstract:
Neither democracy nor globalization can explain the doubling of the peacetime public share in many Western countries between World Wars I and II. Here we examine two other explanations that are consistent with the timing of the observed changes, namely, (1) a shift in the demand for public goods and (2) the effect of war on the willingness to share. We first model each of these approaches as a contingency-learning phenomenon within Schelling’s Multi-Person Dilemma. We then derive verifiable propositions from each hypothesis. National time series of public spending as a share of GNP reveal no unit root but a break in trend, a result shown to favor explanation (2) over (1).
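The trend-break result can be sketched in a few lines: fit the spending share on a linear trend with and without a break regressor at a hypothesised date, then compare the two fits with an F statistic. The series below is simulated purely for illustration; the sample length, break date, and coefficients are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 80, 40  # sample length and hypothesised break date (both assumptions)
t = np.arange(n, dtype=float)
# Simulated "public share" series whose trend slope doubles after the break
y = 0.1 * t + 0.1 * np.clip(t - k, 0.0, None) + rng.normal(0.0, 0.5, n)

def ssr(X, y):
    """Sum of squared residuals from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

X_trend = np.column_stack([np.ones(n), t])                             # linear trend only
X_break = np.column_stack([np.ones(n), t, np.clip(t - k, 0.0, None)])  # trend + break term
ssr0, ssr1 = ssr(X_trend, y), ssr(X_break, y)
# Chow-style F statistic for the single extra break regressor
F = (ssr0 - ssr1) / (ssr1 / (n - X_break.shape[1]))
```

A large F favours the broken-trend specification, which is the shape of evidence the abstract reports for explanation (2).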
Abstract:
Production Planning and Control (PPC) systems have grown and changed with developments in planning tools and models and with the use of computers and information systems in this area. Although much has been published in research journals, the practice of PPC lags behind and makes little use of published research. PPC practices in SMEs lag behind for many reasons, which need to be explored. This work examines the effect on firm performance of identified variables such as forecasting, the planning and control methods adopted, the demographics of the key person, the standardization practices followed, training, learning, and IT usage. A model and framework were developed from the literature. The model was tested empirically on data collected through a questionnaire administered to selected respondents from Small and Medium Enterprises (SMEs) in India; the final data set comprised 382 responses. Hypotheses linking SME performance with the use of forecasting, planning, and control were formulated and tested. Exploratory factor analysis was used for data reduction and for identifying the factor structure. High- and low-performing firms were classified using a logistic regression model, and confirmatory factor analysis was used to study the structural relationship between firm performance and the explanatory variables.
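The high/low-performer classification step can be sketched as a minimal logistic regression fitted by gradient ascent. The three synthetic "factor scores" and their weights below are illustrative assumptions, not the study's survey variables; only the sample size of 382 is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 382  # matches the study's sample size; everything else is simulated
# Three hypothetical factor scores (e.g. forecasting use, planning maturity, IT usage)
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, -0.8, 0.6])  # assumed "true" effects, for simulation only
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = rng.binomial(1, p)               # 1 = high-performing firm, 0 = low-performing

# Fit logistic regression by gradient ascent on the mean log-likelihood
w = np.zeros(3)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    w += 0.5 * X.T @ (y - pred) / n  # gradient step on the mean log-likelihood

accuracy = float(np.mean(((X @ w) > 0) == (y == 1)))
```

The fitted coefficients recover the signs of the simulated effects, which is the kind of direction-of-influence conclusion the study draws from its model.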
Abstract:
We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine (SVM) classifiers stem from this class of machines. The bounds are derived through computations of the $V_\gamma$ dimension of a family of loss functions to which the SVM loss belongs. Bounds that use functions of margin distributions (i.e., functions of the slack variables of the SVM) are also derived.
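Bounds of this family typically take the following schematic form (the constants and log factors vary; this is an illustration of the shape, not the paper's exact statement): with probability at least $1-\delta$ over $n$ samples,

```latex
\Pr\big[\, y f(x) \le 0 \,\big]
  \;\le\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{\, y_i f(x_i) < \gamma \,\}
  \;+\; O\!\left( \sqrt{\frac{h_\gamma \log^{2} n + \log(1/\delta)}{n}} \right),
```

where $h_\gamma$ is the $V_\gamma$ dimension of the loss class at margin scale $\gamma$, and the first term is the empirical margin error, which can be measured through the slack variables.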
Abstract:
Recent developments in the area of reinforcement learning have yielded a number of new algorithms for the prediction and control of Markovian environments. These algorithms, including the TD(lambda) algorithm of Sutton (1988) and the Q-learning algorithm of Watkins (1989), can be motivated heuristically as approximations to dynamic programming (DP). In this paper we provide a rigorous proof of convergence of these DP-based learning algorithms by relating them to the powerful techniques of stochastic approximation theory via a new convergence theorem. The theorem establishes a general class of convergent algorithms to which both TD(lambda) and Q-learning belong.
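A minimal instance of the Q-learning update covered by such convergence results, on a hypothetical four-state chain MDP (the environment, constant step size, and episode count are illustrative choices; the theory assumes suitably decaying step sizes):

```python
import numpy as np

rng = np.random.default_rng(2)
n_states, n_actions, gamma, alpha = 4, 2, 0.9, 0.1
Q = np.zeros((n_states, n_actions))

def step(s, a):
    """Deterministic chain: action 1 moves right, action 0 moves left;
    reward 1 on entering the terminal state 3."""
    s2 = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return s2, float(s2 == 3)

for _ in range(500):
    s = 0
    while s != 3:
        a = int(rng.integers(n_actions))  # uniform exploration: Q-learning is off-policy
        s2, r = step(s, a)
        target = r + gamma * (0.0 if s2 == 3 else Q[s2].max())
        Q[s, a] += alpha * (target - Q[s, a])  # the stochastic-approximation update
        s = s2

# Q approaches Q*: e.g. Q*(0, right) = gamma**2 = 0.81 for this chain
```

The update is exactly a stochastic-approximation iteration on the Bellman optimality operator, which is the connection the paper's convergence theorem formalizes.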
Abstract:
To recognize a previously seen object, the visual system must overcome the variability in the object's appearance caused by factors such as illumination and pose. Developments in computer vision suggest that it may be possible to counter the influence of these factors, by learning to interpolate between stored views of the target object, taken under representative combinations of viewing conditions. Daily life situations, however, typically require categorization, rather than recognition, of objects. Due to the open-ended character both of natural kinds and of artificial categories, categorization cannot rely on interpolation between stored examples. Nonetheless, knowledge of several representative members, or prototypes, of each of the categories of interest can still provide the necessary computational substrate for the categorization of new instances. The resulting representational scheme based on similarities to prototypes appears to be computationally viable, and is readily mapped onto the mechanisms of biological vision revealed by recent psychophysical and physiological studies.
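The prototype-based scheme can be sketched as a nearest-prototype rule: represent each category by a stored prototype and assign a novel view to the category whose prototype it most resembles. The two 2-D "categories" below are invented for illustration; real schemes operate in a high-dimensional similarity space and may keep several prototypes per category.

```python
import numpy as np

# Each category is summarised by a single prototype (e.g. a class mean in
# some feature space); these two prototypes are invented for illustration.
prototypes = {
    "category_a": np.array([0.0, 0.0]),
    "category_b": np.array([3.0, 3.0]),
}

def categorize(view):
    """Assign a novel view to the category with the most similar prototype
    (here: smallest Euclidean distance, i.e. a similarity-to-prototype rule)."""
    return min(prototypes, key=lambda c: np.linalg.norm(view - prototypes[c]))
```

For example, `categorize(np.array([0.4, -0.2]))` returns `"category_a"`; graded similarity to several prototypes, rather than a hard nearest rule, is the richer variant the abstract has in mind.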
Abstract:
Blogging has become one of the key ingredients of so-called social networks, and the phenomenon has reached the world of education. Connections between people, comments on each other's posts, and assessment of innovation are typically interesting characteristics of blogs kept by students and scholars. Blogs have become a new kind of authority, giving rise to (divergent) discussions that lead to the creation of knowledge. The use of blogs as an innovative educational tool is not at all new; however, their use in universities is not yet widespread. Blogging for personal affairs is commonplace, but blogging for professional affairs (teaching, research, and service) is scarce, despite the availability of ready-to-use, free tools. Unfortunately, the Information Society has not yet sufficiently reached some universities: not only are (student) blogs rarely used as an educational tool, but it is also rare to find a blog written by a university professor. The Institute of Computational Chemistry of the University of Girona and the Department of Chemistry of the Universitat Autònoma de Barcelona have joined forces to create “InnoCiència”, a new Group on Digital Science Communication. This group, formed by about ten researchers, has promoted the use of blogs, Twitter, wikis, and other Web 2.0 tools in science-dissemination activities in Catalonia, such as Science Week, Open Day, and Researchers' Night. Its members likewise promote the use of social networking tools in chemistry- and communication-related courses. This communication reports the outcome of social-network experiences in teaching undergraduate students and in organizing research communication events. We provide live, hands-on examples and interactive demonstrations of how blogs and Twitter can be used to enhance the yield of teaching and research.
The impact of blogging and other social networking tools on the outcome of the learning process depends strongly on the target audience and the environmental conditions. A few examples are provided, and some proposals for using these techniques efficiently to help students are suggested.
Abstract:
An interactive tutorial on how to reference books correctly. It begins with an example, and interactively draws the student through the stages of accessing the relevant information through to how to include the final citation in the bibliography. It concludes with a ‘test your knowledge’ set of activities. When you view this object note that the panel on the left generated by the repository can be dragged sideways to view the learning object full screen.
Abstract:
This document reports an investigation into managers' attitudes toward adopting e-learning as a working tool in organizations in Bogotá. A survey of 101 managers was carried out using convenience sampling, with the aim of identifying their attitudes toward the use of e-learning and its influence within the organization. The results show that managers' attitudes influence the use of e-learning tools, as well as the actions that promote their use and the attitudes of employees; it was also found that beliefs about the appropriation of e-learning tools, and the factors that facilitate their use, influence managers' attitudes. These conclusions derive from analyses of the results, contrasted with the empirical studies found in the literature and with the theoretical framework developed.
Abstract:
The present study contributes to the literature on the Job Demands-Resources model in the Italian school context. The aim of this paper is to examine how the interaction between work-family conflict (a typical job demand) and opportunities to learn and develop and self-efficacy (typical job and personal resources, respectively) affects the core dimensions of burnout (exhaustion and depersonalization) and work engagement (vigor and dedication). Hypotheses were tested with a cross-sectional design among 143 teachers of a junior high school in the north of Italy. Results of moderated multiple regression analysis partially supported the hypotheses: opportunities to learn and develop buffered the adverse effects of work-family conflict on depersonalization, whereas self-efficacy moderated the relationship between work-family conflict and vigor. From a practical viewpoint, our findings suggest that opportunities to learn and develop and self-efficacy are important resources that help teachers reduce the negative effects of work-family conflict.
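Moderated multiple regression of this kind comes down to testing the coefficient on a product term. The sketch below simulates standardised scores with a buffering interaction; the variable names, coefficients, and data are invented for illustration and only the sample size of 143 comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 143  # matches the study's sample; the scores below are simulated
wfc = rng.normal(size=n)   # hypothetical work-family conflict score
res = rng.normal(size=n)   # hypothetical opportunities-to-learn score
# Simulated outcome with a buffering interaction: the resource weakens wfc's effect
y = 0.5 * wfc - 0.3 * res - 0.5 * wfc * res + rng.normal(0.0, 1.0, n)

# Moderated regression: include the product term alongside the main effects
X = np.column_stack([np.ones(n), wfc, res, wfc * res])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[3] estimates the moderation (interaction) effect
```

A significantly negative interaction coefficient is the statistical signature of the buffering effect the study reports.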
Abstract:
The aim of this experience is to offer students the opportunity to take part in constructing a play-centred pedagogical strategy for promoting integral health and preventing disease within an educational community, directed at supporting and strengthening well-being both individually and in groups. The project is designed to run for five years and is interdisciplinary (Speech Therapy, Medicine, Psychology, Nursing, Occupational Therapy), interinstitutional (Universidad del Rosario, Universidad de San Buenaventura, and Universidad de Cundinamarca), and intersectoral (Education and Health). It considers the different actors of the educational community and treats the school and the home as favourable settings for strengthening potential, as well as fundamental spaces for constructing knowledge and learning about integral health. To this end, a pedagogical strategy centred on play and creativity has been under construction since the second semester of 2003; through it, actions are planned, carried out, and evaluated to promote skills, values, behaviours, and attitudes in health care and disease prevention, oriented to the early, timely, and effective detection of risk factors and developmental problems that affect integral health. The strategy proposes a setting called Bienestarópolis: A Healthy World to Conquer, built around characters, spaces, and elements that alternate between fantasy and reality to facilitate the approach to, internalization of, and appropriation of integral health. Through it, children, encouraged by adults, enter an imaginary world in which their desires, knowledge, and attitudes are the axis of their development. As Vygotsky argues, in play the child performs actions in order to adapt to the surrounding world, thereby acquiring skills for learning.
The project's actions have involved 410 children and 25 teachers in grades zero, one, and two of primary school; 90 parents; and an average of 40 students and 8 teachers from the disciplines mentioned above.
Abstract:
In June 2000 Colombia's National Statistics Department adopted a new definition for measuring unemployment, following the standards suggested by the International Labour Organization (ILO). The change of definition implied a reduction of the unemployment rate of about two percentage points. In this paper we contrast the Colombian experience with other international experiences, and we analyze the empirical and theoretical implications of this change of definition using two kinds of quantitative estimation: the first contrasts the main characteristics of the categories classified under the new and old definitions of unemployment (employed, unemployed, and out of the labour force) using the EM algorithm; the second tests the implication for structural unemployment, its relation to the educational profile of unemployed persons, and the theoretical issues the ILO standards face in defining employment.
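The first estimation uses the EM algorithm to contrast category characteristics. A minimal EM for a two-component Gaussian mixture, the standard workhorse behind such latent-classification contrasts, looks like this; the simulated data and starting values are illustrative stand-ins, not the household-survey variables.

```python
import numpy as np

rng = np.random.default_rng(4)
# Two simulated latent groups (illustrative stand-ins for survey categories)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(4.0, 1.0, 200)])

mu = np.array([-1.0, 1.0])    # starting means
sigma = np.array([1.0, 1.0])  # starting standard deviations
w = np.array([0.5, 0.5])      # starting mixture weights

for _ in range(50):
    # E-step: responsibility of each component for each observation
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    w = nk / len(x)
```

After convergence the responsibilities give each observation's probability of belonging to each latent group, which is what allows the characteristics of the new and old unemployment categories to be contrasted.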
Abstract:
This paper examines the clinical application of research done on aphasia and the learning characteristics of aphasics.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is used currently for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a particular one year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they are running on the user's own computer. Although G-Rex is general purpose middleware it has two key features that make it particularly suitable for remote execution of climate models: (1) Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) The client component is a command-line program that can easily be incorporated into existing model work-flow scripts. 
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with “mpirun”) are simply replaced with calls to “GRexRun”. (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
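The interaction pattern in steps (1)-(5) — submit a run, then repeatedly check it while outputs stream back — can be sketched as a client-side polling loop. The status values and the stubbed status source below are hypothetical; this is not G-Rex's actual REST API, only the general shape of a client that talks to one.

```python
# Client-side polling loop mirroring the submit-then-monitor workflow.
# The status sequence is stubbed so the sketch runs offline; a real client
# would issue HTTP GETs against the server's status resource instead.

def poll_until_finished(get_status, max_polls=100):
    """Poll a remote run until it reports FINISHED (step 5 of the workflow)."""
    for _ in range(max_polls):
        state = get_status()
        if state == "FINISHED":
            return state
        # In a real client: fetch any new output files here, then sleep briefly.
    raise TimeoutError("run did not finish within the polling budget")

stub_states = iter(["PENDING", "RUNNING", "RUNNING", "FINISHED"])
final_state = poll_until_finished(lambda: next(stub_states))
```

Because outputs are fetched on each poll rather than at the end, data never accumulates on the remote system, which is the first of the two key G-Rex features described above.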
Abstract:
In this review, we consider three possible criteria by which knowledge might be regarded as implicit or inaccessible: It might be implicit only in the sense that it is difficult to articulate freely, or it might be implicit according to either an objective threshold or a subjective threshold. We evaluate evidence for these criteria in relation to artificial grammar learning, the control of complex systems, and sequence learning, respectively. We argue that the convincing evidence is not yet in, but construing the implicit nature of implicit learning in terms of a subjective threshold is most likely to prove fruitful for future research. Furthermore, the subjective threshold criterion may demarcate qualitatively different types of knowledge. We argue that (1) implicit, rather than explicit, knowledge is often relatively inflexible in transfer to different domains, (2) implicit, rather than explicit, learning occurs when attention is focused on specific items and not underlying rules, and (3) implicit learning and the resulting knowledge are often relatively robust.