833 results for Learning from one Example


Relevance:

100.00%

Publisher:

Abstract:

Networked Learning, e-Learning and Technology Enhanced Learning have each been defined in different ways as people's understanding of technology in education has developed. Yet each could also be considered a terminology competing for a contested conceptual space. Theoretically this can be a ‘fertile trans-disciplinary ground for represented disciplines to affect and potentially be re-orientated by others’ (Parchoma and Keefer, 2012), as differing perspectives on terminology and subject disciplines yield new understandings. Yet when used in government policy texts to describe connections between humans, learning and technology, terms tend to become fixed in less fertile positions linguistically. A deceptively spacious policy discourse that suggests people are free to make choices conceals an economically based assumption that implementing new technologies, in itself, determines learning. It actually narrows the choices open to people, as one route is repeatedly foregrounded and humans are not visibly involved in it. An impression that the effective use of technology for endless improvement is inevitable cuts off the critical social interactions and new knowledge needed for multiple understandings of technology in people's lives. This paper explores findings from a corpus-based Critical Discourse Analysis of UK policy for educational technology over the last 15 years, to help illuminate the choices made. This is important when, through political economy, a hierarchical or dominant neoliberal logic promotes a single ‘universal model’ of technology in education, without reference to a wider social context (Rustin, 2013). Discourse matters, because it can ‘mould identities’ (Massey, 2013) in narrow, objective, economically based terms which ‘colonise discourses of democracy and student-centredness’ (Greener and Perriton, 2005: 67). This undermines the subjective social, political, material and relational contexts (Jones, 2012: 3) for those learning, when humans are omitted. Critically confronting these structures is not considered a negative activity. Whilst deterministic discourse for educational technology may leave people unconsciously restricted, I argue that, through close analysis, it offers a deceptively spacious theoretical tool for debate about the wider social and economic context of educational technology. Methodologically, it provides insights into the ways technology, language and learning intersect across disciplinary borders (Giroux, 1992) as powerful, mutually constitutive elements, ever-present in networked learning situations. In sharing a replicable approach for the linguistic analysis of policy discourse, I hope to contribute to others' visions of a broader theoretical underpinning for educational technology as a developing field of networked knowledge and research (Conole and Oliver, 2002; Andrews, 2011).
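
The corpus-based step of such an analysis can be illustrated with a simple collocation count. The sketch below is a generic, minimal example assuming a folder of plain-text policy documents and the node word "technology"; it is not the author's actual pipeline, and the directory name and window size are assumptions for illustration.

```python
# Minimal corpus sketch (not the author's pipeline): count which words co-occur
# near a node term such as "technology" across a folder of plain-text policy
# documents. The directory name, node word, and window size are illustrative.
import re
from collections import Counter
from pathlib import Path

def collocates(corpus_dir: str, node: str, window: int = 4) -> Counter:
    """Count words appearing within `window` tokens of `node` in all .txt files."""
    counts = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        tokens = re.findall(r"[a-z']+", path.read_text(encoding="utf-8").lower())
        for i, tok in enumerate(tokens):
            if tok == node:
                left = tokens[max(0, i - window):i]
                right = tokens[i + 1:i + 1 + window]
                counts.update(left + right)
    return counts

if __name__ == "__main__":
    # Hypothetical directory of UK policy documents converted to plain text.
    print(collocates("policy_corpus", "technology").most_common(20))
```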

Relevance:

100.00%

Publisher:

Abstract:

New labor movements are currently emerging across the Global South, in countries as disparate as China, Egypt, and Iran, and new developments are taking place within labor movements in places such as Colombia, Indonesia, Iraq, Mexico, Pakistan and Venezuela. Activists and leaders in these movements are seeking information from workers and unions around the world. However, many labor activists today know little or nothing about the last period of intense efforts to build international labor solidarity, the years 1978-2007. One of the key labor movements of this period, and one which continues today, is the KMU Labor Center of the Philippines. It is this author's contention that much about the KMU that remains little known would help advance global labor solidarity today. This paper focuses specifically on the KMU's development and shares five findings that have emerged from this author's study of the KMU: a new type of trade unionism, new union organizations, an emphasis on rank-and-file education, building relations with sectoral organizations, and the need to build international labor solidarity.

Relevance:

100.00%

Publisher:

Abstract:

Subspaces and manifolds are two powerful models for high-dimensional signals. Subspaces model linear correlation and are a good fit for signals generated by physical systems, such as frontal images of human faces and multiple sources impinging on an antenna array. Manifolds model sources that are not linearly correlated but whose signals are determined by a small number of parameters; examples are images of human faces under different poses or expressions, and handwritten digits in varying styles. However, there will always be some degree of model mismatch between the subspace or manifold model and the true statistics of the source. This dissertation exploits subspace and manifold models as prior information in various signal processing and machine learning tasks.

A near-low-rank Gaussian mixture model measures proximity to a union of linear or affine subspaces. This simple model can effectively capture the signal distribution when each class lies near a subspace. This dissertation studies how the pairwise geometry between these subspaces affects classification performance. When the model mismatch is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces; when the mismatch is more significant, it is determined by the sum of the squares of the sines of the principal angles. The reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes the principal angles. This linear transformation, termed TRAIT, also preserves specific features of each class, making it complementary to the recently developed Low Rank Transform (LRT); moreover, when the model mismatch is more significant, TRAIT shows superior performance to LRT.
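
The role of principal angles can be made concrete with a few lines of linear algebra. The sketch below is a generic illustration (the dimensions and random subspaces are invented), not code from the dissertation: it computes the principal angles between two subspaces and the two angle-based quantities mentioned above.

```python
# Sketch: principal angles between two subspaces, the geometric quantity the
# abstract ties to misclassification probability (product of sines for small
# mismatch, sum of squared sines otherwise). Dimensions are illustrative.
import numpy as np

def principal_angles(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Principal angles (radians) between the column spans of A and B."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)   # cosines of the angles
    return np.arccos(np.clip(s, -1.0, 1.0))

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))   # basis for a 3-dim subspace of R^50
B = rng.standard_normal((50, 3))
theta = principal_angles(A, B)
print("product of sines:", np.prod(np.sin(theta)))
print("sum of squared sines:", np.sum(np.sin(theta) ** 2))
```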

The manifold model enforces a constraint on the freedom of data variation. Learning features that are robust to data variation is very important, especially when the training set is small. A learning machine with a large number of parameters, e.g., a deep neural network, can describe a very complicated data distribution well. However, it is also more likely to be sensitive to small perturbations of the data and to suffer from degraded performance when generalizing to unseen (test) data.

From the perspective of the complexity of function classes, such a learning machine has a huge capacity (complexity) and tends to overfit. The manifold model provides a way of regularizing the learning machine so as to reduce the generalization error and thereby mitigate overfitting. Two approaches to preventing overfitting are proposed, one from the perspective of data variation, the other from capacity/complexity control. In the first approach, the learning machine is encouraged to make decisions that vary smoothly for data points in local neighborhoods on the manifold. In the second approach, a graph adjacency matrix is derived for the manifold, and the learned features are encouraged to align with the principal components of this adjacency matrix. Experimental results on benchmark datasets demonstrate a clear advantage of the proposed approaches when the training set is small.
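
The first of these two approaches can be sketched generically as a graph-Laplacian smoothness penalty over a k-nearest-neighbor graph built on the data. The sketch below is an illustration under assumed sizes and an assumed value of k, not the dissertation's exact regularizer.

```python
# Sketch of the smoothness idea above (not the author's exact method): penalize
# decision functions that vary sharply across neighboring points on the data
# manifold, via a k-NN graph Laplacian. Sizes and k are illustrative.
import numpy as np

def knn_adjacency(X: np.ndarray, k: int = 5) -> np.ndarray:
    """Symmetric 0/1 adjacency matrix of the k-NN graph over the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    W = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, :k]
    rows = np.repeat(np.arange(len(X)), k)
    W[rows, idx.ravel()] = 1.0
    return np.maximum(W, W.T)                      # symmetrize

def smoothness_penalty(f: np.ndarray, W: np.ndarray) -> float:
    """f^T L f = 0.5 * sum_ij W_ij (f_i - f_j)^2, small when f is smooth on the graph."""
    L = np.diag(W.sum(1)) - W                      # graph Laplacian
    return float(f @ L @ f)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))                 # toy data
f = rng.standard_normal(100)                       # outputs of some learning machine
W = knn_adjacency(X, k=5)
# In training, this term would be added to the usual loss:
#   total_loss = data_loss + lam * smoothness_penalty(f, W)
print(smoothness_penalty(f, W))
```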

Stochastic optimization makes it possible to track a slowly varying subspace underlying streaming data. By approximating local neighborhoods with affine subspaces, a slowly varying manifold can be tracked efficiently as well, even with corrupted and noisy data. The more local neighborhoods are used, the better the approximation, but the higher the computational complexity. A multiscale approximation scheme is proposed, in which the local approximating subspaces are organized in a tree structure; splitting and merging of the tree nodes then allows efficient control of the number of neighborhoods. The deviation of each datum from the learned model is estimated, yielding a series of statistics for anomaly detection. This framework extends the classical changepoint detection technique, which only works for one-dimensional signals. Simulations and experiments highlight the robustness and efficacy of the proposed approach in detecting an abrupt change in an otherwise slowly varying low-dimensional manifold.
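
A single-subspace version of this idea can be sketched in a few lines: track the subspace with a stochastic gradient step and use the projection residual of each datum as the anomaly statistic. The multiscale tree of local subspaces is not reproduced here, and the step size, dimensions, and change point below are invented for illustration.

```python
# Sketch: track a slowly varying subspace from streaming data and flag an abrupt
# change via the projection residual. A stand-in for the dissertation's
# multiscale tree-based scheme; parameters and the change point are invented.
import numpy as np

def track_subspace(stream, d, k, step=0.1):
    """Yield each datum's residual norm w.r.t. a stochastically updated k-dim subspace."""
    U, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((d, k)))
    for x in stream:
        w = U.T @ x                                          # coefficients in current subspace
        r = x - U @ w                                        # residual (deviation from the model)
        U, _ = np.linalg.qr(U + step * np.outer(r, w))       # gradient step, re-orthonormalized
        yield np.linalg.norm(r)

# Toy stream: data near one 3-dim subspace, switching abruptly at t = 500.
rng = np.random.default_rng(1)
d, k, n = 50, 3, 1000
U1, _ = np.linalg.qr(rng.standard_normal((d, k)))
U2, _ = np.linalg.qr(rng.standard_normal((d, k)))
stream = [(U1 if t < 500 else U2) @ rng.standard_normal(k) + 0.01 * rng.standard_normal(d)
          for t in range(n)]
residuals = list(track_subspace(stream, d, k))
print("mean residual before change:", np.mean(residuals[400:500]))
print("mean residual just after change:", np.mean(residuals[500:520]))
```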

Relevance:

100.00%

Publisher:

Abstract:

Although errors might foster learning, they can also be perceived as something to avoid if they are associated with negative consequences (e.g., receiving a bad grade or being mocked by classmates). Such adverse perceptions may trigger negative emotions and error-avoidance attitudes, limiting the possibility of using errors for learning. These student reactions may be influenced by relational and cultural aspects of errors that characterise the learning environment. Accordingly, the main aim of this research was to investigate whether relational and cultural characteristics associated with errors affect the psychological mechanisms triggered by making mistakes. In the theoretical part, we described the role of errors in learning using an integrated multilevel approach (i.e., psychological, relational, and cultural levels of analysis). We then presented three studies that analysed how cultural and relational error-related variables affect psychological aspects. The studies each adopted a specific empirical methodology (qualitative, experimental, and correlational, respectively) and investigated different samples (teachers, primary school pupils, and middle school students). Findings of study one (cultural level) highlighted that errors acquire different meanings, which are associated with different teachers' error-handling strategies (e.g., supporting or penalising errors). Study two (relational level) demonstrated that teachers' supportive error-handling strategies promote students' perceptions of being in a positive error climate. Findings of study three (relational and psychological levels) showed that a positive error climate fosters students' adaptive reactions towards errors and their learning outcomes. Overall, our findings indicate that different variables influence students' learning-from-errors process and that teachers play an important role in conveying specific meanings of errors during learning activities, dealing with students' mistakes supportively, and establishing an error-friendly classroom environment.

Relevance:

100.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, dissertation, 2010

Relevance:

100.00%

Publisher:

Abstract:

Two claims pervade the literature on the political economy of market reforms: that economic crises cause reforms, and that crises matter because they call into question the validity of the economic model held responsible for them. Economic crises are said to spur a process of learning that is conducive to the abandonment of failing models and to the adoption of successful ones. Although these claims have become the conventional wisdom, they have hardly been tested empirically, owing to the lack of agreement on what constitutes a crisis and to the difficulty of measuring learning from crises. I propose a model of rational learning from experience and apply it to the decision to open the economy. Using data from 1964 through 1990, I show that learning from the 1982 debt crisis was relevant to the first wave of adoption of an export promotion strategy, but that learning was conditional on the high variability of economic outcomes in countries that opened up to trade. Learning was also symbolic, in that the sheer number of other countries that liberalized was a more important driver of others' decisions to follow suit.
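
The paper's learning model is not specified in the abstract; as a generic illustration of "rational learning from experience", the sketch below performs Bayesian (normal-normal) updating of a government's belief about the payoff of trade openness as outcomes in other countries are observed. The prior, noise level, and observed growth rates are invented for the example.

```python
# Generic sketch of rational learning from experience as Bayesian updating of
# beliefs about the payoff of an open-economy policy. Not the paper's estimated
# model; all numbers are illustrative.
import numpy as np

def update_normal(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal-normal update of beliefs after observing outcomes `obs`."""
    obs = np.atleast_1d(obs)
    post_prec = 1.0 / prior_var + len(obs) / obs_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
    return post_mean, post_var

# Prior belief about growth under trade openness (percent per year), updated as
# a government observes outcomes in countries that already liberalized.
mean, var = 1.0, 4.0
for outcome in [3.1, 2.4, 4.0]:          # hypothetical observed growth rates
    mean, var = update_normal(mean, var, outcome, obs_var=1.0)
    print(f"posterior belief: {mean:.2f} +/- {np.sqrt(var):.2f}")
```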

Relevance:

100.00%

Publisher:

Abstract:

We study a general static noisy rational expectations model in which investors have private information about asset payoffs, with common and private components, and about their own exposure to an aggregate risk factor, and we derive conditions for the existence and uniqueness (or multiplicity) of equilibria. We find that a main driver of the characterization of equilibria is whether the actions of investors are strategic substitutes or complements. This property is in turn driven by the strength of a private learning channel from prices, arising from the multidimensional sources of asymmetric information, relative to the usual public learning channel. When the private learning channel is strong (weak) relative to the public one, we have strong (weak) strategic complementarity in actions and potentially multiple (unique) equilibria. The results enable a precise characterization of whether information acquisition decisions are strategic substitutes or complements. We find that the strategic substitutability in information acquisition obtained in Grossman and Stiglitz (1980) is robust.

JEL Classification: D82, D83, G14

Keywords: Rational expectations equilibrium, asymmetric information, risk exposure, hedging, supply information, information acquisition.
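
For reference, the canonical CARA-normal noisy rational expectations setup (in the spirit of Grossman and Stiglitz, 1980) can be written compactly as below. This is only a notational sketch of the standard benchmark; the paper's richer model, in which investors also hold private information about their own risk exposure, is not reproduced here.

```latex
% Notational sketch only: the standard CARA-normal noisy rational expectations
% benchmark. The paper's model additionally endows investors with private
% information about their own risk exposure, which is not reproduced here.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
Investor $i$ has CARA utility with risk aversion $\gamma$, observes a private
signal $s_i = v + \varepsilon_i$ about the asset payoff $v$, and conditions on
the equilibrium price $p$. Demand and market clearing with noisy supply $u$ are
\begin{equation}
  x_i = \frac{\mathbb{E}[v \mid s_i, p] - p}{\gamma \,\operatorname{Var}[v \mid s_i, p]},
  \qquad
  \int_0^1 x_i \, di = u .
\end{equation}
In a linear equilibrium $p = a + b v - c u$, the price acts as a public signal
about $v$: the ``public learning channel'' of the abstract is inference from
$p$, while the private learning channel arises from the extra, multidimensional
private information and lies outside this benchmark.
\end{document}
```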

Relevance:

100.00%

Publisher:

Abstract:

This booklet outlines advice on many key nutritional issues for children aged one to five. It includes information on how to provide a healthy, balanced diet for this age group, guidance on suitable snacks and drinks, feeding a vegetarian child, vitamin supplements and iron, making the most of mealtimes and how to deal with fussy eaters.

Relevance:

100.00%

Publisher:

Abstract:

A hemodialysis population from a dialysis unit in the city of Recife, Northeastern Brazil, was screened to assess the prevalence of hepatitis C virus (HCV) infection and to investigate the associated risk factors. Hemodialysis patients (n = 250) were interviewed, and serum samples were tested for anti-HCV antibodies by enzyme-linked immunosorbent assay (ELISA). All samples were also tested for HCV RNA by reverse transcriptase nested polymerase chain reaction (RT-nested PCR). Of the 250 patients, 21 (8.4%) were found to be seropositive by ELISA, and 19 (7.6%) were HCV RNA positive; HCV viraemia was thus present in 90.5% of the anti-HCV positive patients. The predominant genotype was HCV 1a (8/19), followed by 3a (7/19) and 1b (4/19). None of the anti-HCV negative patients were shown to be viraemic by PCR. Univariate analysis of risk factors showed that time spent on hemodialysis, the number of blood transfusions, and a blood transfusion before November 1993 were associated with HCV positivity. However, multivariate analysis revealed that blood transfusion before November 1993 remained significantly associated with HCV infection in this population. Low prevalence levels were encountered in this center; however, prospective studies are necessary to confirm these findings.
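
The abstract does not name the multivariate method used; logistic regression is the usual choice for a binary outcome of this kind and is shown here purely as an illustration on invented data (the study's actual dataset of 250 patients is not available).

```python
# Illustration only: a multivariate risk-factor analysis of the kind described
# above, using logistic regression on invented data. Not the study's method or
# data; all values are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 250
years_on_hd = rng.exponential(4.0, n)                    # years on hemodialysis (invented)
n_transfusions = rng.poisson(3.0, n)                     # number of transfusions (invented)
transfusion_pre_1993 = rng.binomial(1, 0.4, n)           # transfusion before Nov 1993 (invented)
logit_p = -3.5 + 0.05 * years_on_hd + 0.05 * n_transfusions + 1.5 * transfusion_pre_1993
hcv_positive = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([years_on_hd, n_transfusions, transfusion_pre_1993]))
model = sm.Logit(hcv_positive, X).fit(disp=False)
print(model.summary(xname=["const", "years_on_hd", "n_transfusions", "transfusion_pre_1993"]))
```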