868 results for on-line education
Abstract:
In the present paper, we discuss the time before the “age of reports”. Besides the Coleman Report, the Lady Plowden Report also appeared in this period, while there were important studies in France (Bourdieu & Passeron, 1964; Peyre, 1959) and studies that inaugurated comprehensive education in the Nordic countries. We focus on the period after World War II, which was marked by rising economic nationalism, on the one hand, and by the second wave of mass education, on the other, bearing the promise of more equality and a reduction of several social inequalities, both supposed to be ensured by school. It was a period of great expectations related to the power of education and the rise of educational meritocracy. Against this background, in the second part of the paper, the authors attempt to explore the phenomenon of the aforementioned reports, which significantly questioned the power of education and, at the same time, enabled the formation of evidence-based education policies. In this part of the paper, the central place is devoted to the case of socialist Yugoslavia/Slovenia and its striving for more equality and equity through education. Through the socialist ideology of more education for all, socialist Yugoslavia, with its exaggerated stress on the unified school and its overemphasised belief in simple equality, overstepped the line between relying on comprehensive education as an important mechanism for increasing the possibility of more equal and just education, on the one hand, and the myth of the almighty unified school capable of eradicating social inequalities, especially class inequalities, on the other. With this radical approach to the reduction of inequalities, socialist policy in the then Yugoslavia paradoxically reduced the opportunity for greater equality, and even more so for more equitable education. (DIPF/Orig.)
Abstract:
Marketization has changed the education system. If we say that education is a market, this transforms the understanding of education and influences how people act. In this paper, adult-education school-leaders’ talk is analysed and seven metaphors for education are found: education as administration, market, matching, democracy, policy work, integration and learning. Exploring empirical metaphors provides a rich illustration of coinciding meanings. In line with studies on policy texts, economic metaphors are found to dominate. This should be understood not only as representing liberal ideology, as is often discussed in analyses of policy papers, but also as representing economic theory. In other words, contemporary adult education can be understood as driven by economic theories. The difference and relation between ideology and theory should be further examined since they have an impact on our society and on our everyday lives. (DIPF/Orig.)
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Geographic Data Warehouses (GDW) are one of the main technologies used in decision-making processes and spatial analysis, and the literature proposes several conceptual and logical data models for GDW. However, little effort has been focused on studying how spatial data redundancy affects SOLAP (Spatial On-Line Analytical Processing) query performance over GDW. In this paper, we investigate this issue. Firstly, we compare redundant and non-redundant GDW schemas and conclude that redundancy is related to high performance losses. We also analyse the issue of indexing, aiming at improving SOLAP query performance on a redundant GDW. Comparisons of the SB-index approach, the star-join aided by R-tree and the star-join aided by GiST indicate that the SB-index significantly reduces the elapsed time of query processing, by 25% up to 99%, for SOLAP queries defined over the spatial predicates of intersection, enclosure and containment and applied to roll-up and drill-down operations. We also investigate the impact of an increase in data volume on performance. The increase did not impair the performance of the SB-index, which still greatly reduced the elapsed time of query processing. Performance tests also show that the SB-index is far more compact than the star-join, requiring at most 0.20% of its volume. Moreover, we propose a specific enhancement of the SB-index to deal with spatial data redundancy. This enhancement improved performance by 80% to 91% for redundant GDW schemas.
Abstract:
Glyceraldehyde-3-phosphate dehydrogenase (GAPDH) plays an important role in the life cycle of Trypanosoma cruzi, and an immobilized enzyme reactor (IMER) has been developed for use in on-line screening for GAPDH inhibitors. An IMER containing human GAPDH has been previously reported; however, those conditions produced a T. cruzi GAPDH-IMER with poor activity and stability. The factors affecting the stability of the human and T. cruzi GAPDHs in the immobilization process, and the influence of pH and buffer type on the stability and activity of the IMERs, have been investigated. The resulting T. cruzi GAPDH-IMER was coupled to an analytical octyl column, which was used to achieve chromatographic separation of NAD+ from NADH. The production of NADH stimulated by D-glyceraldehyde-3-phosphate was used to investigate the activity and kinetic parameters of the immobilized T. cruzi GAPDH. The Michaelis-Menten constants (Km) determined for D-glyceraldehyde-3-phosphate and NAD+ were Km = 0.5 ± 0.05 mM and 0.648 ± 0.08 mM, respectively, which were consistent with the values obtained using the non-immobilized enzyme.
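The reported Km values fit the standard Michaelis-Menten rate law, v = Vmax·[S] / (Km + [S]). A minimal sketch using the abstract's Km values; Vmax is a hypothetical placeholder, since the abstract does not report it:

```python
# Michaelis-Menten kinetics: v = Vmax * [S] / (Km + [S]).
# Km values below are taken from the abstract; Vmax = 1.0 is a
# hypothetical placeholder (not reported in the abstract).

def michaelis_menten_rate(s_mM, km_mM, vmax=1.0):
    """Reaction rate at substrate concentration s_mM (in mM)."""
    return vmax * s_mM / (km_mM + s_mM)

KM_G3P = 0.5    # mM, D-glyceraldehyde-3-phosphate (from abstract)
KM_NAD = 0.648  # mM, NAD+ (from abstract)

# Sanity check: at [S] = Km the rate is exactly half of Vmax.
half = michaelis_menten_rate(KM_G3P, KM_G3P, vmax=1.0)
print(half)  # 0.5
```

At saturating substrate ([S] much larger than Km) the rate approaches Vmax, which is how such constants are read off a Lineweaver-Burk or direct nonlinear fit.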
Abstract:
We introduce the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS). CATT-BRAMS is an on-line transport model fully consistent with the simulated atmospheric dynamics. Emission sources from biomass burning and urban-industrial-vehicular activities for trace gases, and from biomass burning aerosol particles, are obtained from several published datasets and remote sensing information. The tracer and aerosol mass concentration prognostics include the effects of sub-grid scale turbulence in the planetary boundary layer, convective transport by shallow and deep moist convection, wet and dry deposition, and plume rise associated with vegetation fires, in addition to the grid-scale transport. The radiation parameterization takes into account the interaction between the simulated biomass burning aerosol particles and shortwave and longwave radiation. The atmospheric model BRAMS is based on the Regional Atmospheric Modeling System (RAMS), with several improvements associated with cumulus convection representation, soil moisture initialization and a surface scheme tuned for the tropics, among others. In this paper, the CATT-BRAMS model is used to simulate carbon monoxide and particulate material (PM2.5) surface fluxes and atmospheric transport during the 2002 LBA field campaigns, conducted during the transition from the dry to the wet season in the southwest Amazon Basin. Model evaluation is addressed with comparisons between model results and near-surface, radiosonde and airborne measurements performed during the field campaign, as well as remote sensing derived products. We show the matching of emission strengths to observed carbon monoxide in the LBA campaign. A relatively good comparison to the MOPITT data is also obtained, in spite of the fact that the MOPITT a priori assumptions imply several difficulties.
Abstract:
An (n, d)-expander is a graph G = (V, E) such that for every X ⊆ V with |X| ≤ 2n − 2 we have |Γ_G(X)| ≥ (d + 1)|X|. A tree T is small if it has at most n vertices and maximum degree at most d. Friedman and Pippenger (1987) proved that any (n, d)-expander contains every small tree. However, their elegant proof does not seem to yield an efficient algorithm for obtaining the tree. In this paper, we give an alternative result that does admit a polynomial-time algorithm for finding the immersion of any small tree in subgraphs G of (N, D, λ)-graphs Λ, as long as G contains a positive fraction of the edges of Λ and λ/D is small enough. In several applications of the Friedman-Pippenger theorem, including the ones in the original paper of those authors, the (n, d)-expander G is a subgraph of an (N, D, λ)-graph as above. Therefore, our result suffices to provide efficient algorithms for such previously non-constructive applications. As an example, we discuss a recent result of Alon, Krivelevich, and Sudakov (2007) concerning embedding nearly spanning bounded-degree trees, the proof of which makes use of the Friedman-Pippenger theorem. We shall also show a construction inspired by Wigderson-Zuckerman expander graphs for which any sufficiently dense subgraph contains all trees of sizes and maximum degrees achieving essentially optimal parameters. Our algorithmic approach is based on a reduction of the tree embedding problem to a certain on-line matching problem for bipartite graphs, solved by Aggarwal et al. (1996).
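The on-line bipartite matching problem mentioned at the end can be illustrated with the simplest on-line rule: left vertices arrive one at a time and each is greedily matched to the first currently free neighbour. This greedy sketch is only an illustrative stand-in, not the algorithm of Aggarwal et al. (1996) used in the paper:

```python
# Greedy on-line bipartite matching: left vertices arrive one at a
# time, each with its list of right-side neighbours, and are matched
# to the first still-free neighbour (if any). The classic greedy rule
# guarantees a matching at least half the size of the maximum one.

def online_greedy_matching(arrivals):
    """arrivals: iterable of (left_vertex, neighbours) pairs,
    in arrival order. Returns the matching as {left: right}."""
    taken = {}     # right vertex -> left vertex that claimed it
    matching = {}  # left vertex -> matched right vertex
    for left, neighbours in arrivals:
        for right in neighbours:
            if right not in taken:
                taken[right] = left
                matching[left] = right
                break  # left is matched; move to next arrival
    return matching

m = online_greedy_matching([("a", [1, 2]), ("b", [1]), ("c", [2, 3])])
print(m)  # {'a': 1, 'c': 2} -- 'b' stays unmatched, vertex 1 is taken
```

The on-line aspect is that each left vertex must be matched (or abandoned) irrevocably before the next one arrives, which mirrors how tree vertices are embedded one by one in the reduction described above.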
Abstract:
There is little empirical data about the impact of digital inclusion on cognition among older adults. This paper aimed at investigating the effects of a digital inclusion program on the cognitive performance of older individuals who participated in a computer learning workshop named “Idosos On-Line” (Elderly Online). Forty-two aged individuals participated in the research study: 22 completed the computer training workshop and 20 constituted the control group. All subjects answered a sociodemographic questionnaire and completed the Addenbrooke's cognitive examination, revised (ACE-R), which examines five cognitive domains: orientation and attention, memory, verbal fluency, language, and visuo-spatial skills. It was noted that the experimental group's cognitive performance significantly improved after the program, particularly in the language and memory domains, when compared to the control group. These findings suggest that the acquisition of new knowledge and the use of a new tool that makes it possible to access the Internet may bring gains to cognition. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Since Physical Education first entered Brazilian public schools, the game has frequently been used as content, and over time that practice seems to have intensified. In spite of the many approaches with different purposes used to justify its pedagogic usefulness, the game has been employed in an indiscriminate way due to the fascination it provides to the students. The present study describes and analyses the attitudes of children (10-12 years old) during games in Physical Education classes at a public school. The study was carried out with the researcher also acting as the teacher (action research). For the study, 55 children were filmed in four different games of different kinds (exposed, transformed, and spontaneous). The description and analysis of the classes focused on the attitudinal axis, and four topics were defined for discussion: conflicts, respect for rules, expressiveness, and competitiveness. The relationship between the individual, the game and its culture was pointed out as the main characteristic in the configuration of the atmosphere of play activity. It was also possible to observe specific situations of this relationship, since the games were limited to social games (Piaget's category) in a school atmosphere where children occupy student roles. Based on the results obtained, the study proposes a reflexive practice in which the students notice their own attitudes and try to adapt the game to their needs, and not the other way around. In this perspective, the teacher has an important mediating role, since he or she will be responsible for pointing out the students' difficulties and promoting discussions in order to foster teamwork.
Abstract:
This work deals with neural network (NN)-based gait pattern adaptation algorithms for an active lower-limb orthosis. Stable trajectories with different walking speeds are generated during an optimization process considering the zero-moment point (ZMP) criterion and the inverse dynamics of the orthosis-patient model. Additionally, a set of NNs is used to decrease the time-consuming analytical computation of the model and ZMP. The first NN approximates the inverse dynamics including the ZMP computation, while the second NN works in the optimization procedure, giving an adapted desired trajectory according to the orthosis-patient interaction. This trajectory adaptation is added directly to the trajectory generator, also reproduced by a set of NNs. With this strategy, it is possible to adapt the trajectory during the walking cycle in an on-line procedure, instead of changing the trajectory parameters after each step. The dynamic model of the actual exoskeleton, with interaction forces included, is used to generate simulation results. Also, an experimental test is performed with an active ankle-foot orthosis, where the dynamic variables of this joint are replaced in the simulator by actual values provided by the device. It is shown that the final adapted trajectory follows the patient's intention of increasing the walking speed, thus changing the gait pattern. (C) Koninklijke Brill NV, Leiden, 2011
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured. Data can be text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and of high dimensionality. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. So far, a number of classification algorithms have come into practice. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
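Of the example-based methods listed above, k-nearest neighbours is the simplest to sketch: a query point takes the majority label of its k closest training points. A minimal pure-Python illustration on a toy 2-D dataset invented for this example:

```python
# Minimal k-nearest-neighbours classifier (Duda & Hart, 1973 style):
# classify a query point by majority vote of the k training points
# nearest to it in Euclidean distance. Toy data, for illustration only.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of ((x, y), label) pairs.
    Returns the majority label among the k nearest points."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]

print(knn_predict(train, (0.5, 0.5)))  # A
print(knn_predict(train, (5.5, 5.5)))  # B
```

The same lazy-evaluation character (no training phase, all work at query time) is what the survey literature means by "example-based": the training set itself is the model.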
Abstract:
To determine the effect of slurry rheology on industrial grinding performance, 45 surveys were conducted on 16 full-scale grinding mills at five sites. Four operating variables (mill throughput, slurry density, slurry viscosity and feed fines content) were investigated. The rheology of the mill discharge slurries was measured either on-line or off-line, and the data were processed using a standard procedure to obtain a full range of flow curves. Multi-linear regression was employed as a statistical analysis tool to determine whether or not rheological effects exert an influence on industrial grinding, and to assess the influence of the four mill operating conditions on mill performance in terms of the Grinding Index, a criterion describing the overall breakage of particles across the mill. The results show that slurry rheology does influence industrial grinding. The trends of these effects on the Grinding Index depend upon the rheological nature of the slurry: whether the slurries are dilatant or pseudoplastic, and whether they exhibit a high or low yield stress. The interpretation of the regression results is discussed, the observed effects are summarised, and the potential for incorporating rheological principles into process control is considered. Guidelines are established to improve industrial grinding operations based on knowledge of the rheological effects. This study confirms some trends in the effect of slurry rheology on grinding reported in the literature, and extends these to a broader understanding of the relationship between slurry properties and rheology, and their effects on industrial milling performance. (C) 2002 Elsevier Science B.V. All rights reserved.
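The multi-linear regression step described above fits the Grinding Index as a linear combination of the four operating variables. A sketch with ordinary least squares; the variable names mirror the abstract, but all numbers are synthetic, generated purely to show the mechanics:

```python
# Multi-linear regression of a Grinding Index on four mill operating
# variables, via ordinary least squares (numpy.linalg.lstsq).
# All data here are synthetic; only the structure follows the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 45  # the abstract reports 45 surveys

throughput = rng.uniform(100, 300, n)  # t/h (synthetic range)
density = rng.uniform(60, 75, n)       # % solids
viscosity = rng.uniform(10, 50, n)     # mPa.s
fines = rng.uniform(20, 40, n)         # % fines in feed

# Synthetic response built from known coefficients plus noise,
# so the fit can be checked against the generating values.
grinding_index = (0.05 * throughput - 0.3 * density
                  - 0.1 * viscosity + 0.2 * fines
                  + 40 + rng.normal(0, 0.5, n))

# Design matrix: intercept column plus the four predictors.
X = np.column_stack([np.ones(n), throughput, density, viscosity, fines])
coef, *_ = np.linalg.lstsq(X, grinding_index, rcond=None)
print(coef)  # intercept followed by the four fitted effects
```

With real survey data, the signs and magnitudes of the fitted coefficients are what indicate whether, say, higher viscosity helps or hurts breakage, which is the kind of conclusion the study draws.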
Abstract:
Ten surveys of the ball milling circuit at the Mt Isa Mines (MIM) Copper Concentrator were conducted, aiming to identify any changes in slurry rheology caused by the use of a chrome ball charge, and the associated effect on grinding performance. Slurry rheology was measured using an on-line viscometer. The data were mass-balanced and analysed with statistical tools. Comparison of the rheograms demonstrated that slurry density and fines content affected slurry rheology significantly, while the effect of the chrome ball charge was negligible. Statistical analysis showed the effects of mill throughput and cyclone efficiency on the Grinding Index (a term describing the overall breakage). There was no difference in the Grinding Index between using the chrome ball charge and the ordinary steel ball charge. (C) 2002 Elsevier Science Ltd. All rights reserved.