Abstract:
The aim of this study has been to challenge or expand the present views on special education. In a series of six articles, this thesis directly or indirectly debates questions relating to inclusive and exclusive mechanisms in society. It is claimed that the tension between traditionalism and inclusionism within special education may harm the legitimation of special education as a profession of the welfare state. The articles address the relationship between these two approaches. The traditionalism-inclusionism controversy is partly rooted in different ways of understanding the role of special education with respect to democracy. It seems, however, that the controversy tends to lead researchers to debate paradigmatic positions with each other rather than to develop alternative strategies for dealing with the delicate challenge of differences within education.

There are three major areas of this discussion. The first part presents the theory of research programmes as a way of describing the content, the possibilities, and the problems of the different approaches. The main argument is that the concept of research programmes emphasizes the ethical responsibilities involved in research within the field of special education more clearly than the paradigmatic approach does. The second part considers the social aspects of the debate between traditionalism and inclusionism from different perspectives. A central claim is that the work carried out within special education must be understood as a reaction to the social and political world that the profession is part of, and that this reaction is itself part of a specific historical development. Even though it is possible to claim that the main aim of special education is to help people who are regarded as disabled or who feel disabled, it is also necessary to understand that the profession is highly constrained by the grand narrative of the welfare state and the historical discourse that the profession is part of. The third part focuses on a central aspect of special education: the humanistic response to people who are left behind by ordinary education. The humanistic obligation of special education is part of the general aim of the welfare state to provide an education for a democratic and inclusive society. This humanistic aim and the goal of offering an education for democracy therefore seem to dominate the understanding of how special education works.
Abstract:
The aim of the doctoral dissertation was to further our theoretical and empirical understanding of media education as practised in the context of Finnish basic education. The current era of intensive use of the Internet is also recognised. The dissertation presents the subject-didactic dimension of media education as one of the main results of the conceptual analysis. The theoretical foundation is based on the idea of dividing the concept of media education into media and education (Vesterinen et al., 2006). As the two ends of the dimension, these can be understood didactically as content and pedagogy, respectively. In the middle, subject didactics is considered to have one form closer to content matter (Subject Didactics I: learning about media) and another closer to general pedagogical questions (Subject Didactics II: learning with/through media). The empirical case studies of the dissertation are reported with foci on media literacy in the era of Web 2.0 (Kynäslahti et al., 2008), teacher reasoning in media-educational situations (Vesterinen, Kynäslahti & Tella, 2010), and the research-methodological implications of the use of information and communication technologies in the school (Vesterinen, Toom & Patrikainen, 2010). As a conclusion, Media-Based Media Education and Cross-Curricular Media Education are presented as two subject-didactic modes of media education in the school context. Episodic Media Education is discussed as a third mode of media education, in which less organised teaching, studying and learning related to media take place, and in which situations (i.e. episodes) without proper planning or thorough reflection are in focus. Based on the theoretical and empirical understanding gained in this dissertation, it is proposed that instead of occupying a corner of its own in the school curriculum, media education should lead a wider change in Finnish schools.
Abstract:
The majority of Internet traffic uses the Transmission Control Protocol (TCP) as the transport-level protocol. TCP provides a reliable, ordered byte stream to applications. However, applications such as live video streaming place an emphasis on timeliness over reliability, and a smooth sending rate can be preferable to sharp changes in the sending rate. For these applications TCP is not necessarily suitable. Rate control attempts to address the demands of such applications. An important design requirement for all rate control mechanisms is TCP friendliness: a new mechanism should not degrade TCP performance, since TCP is still the dominant protocol. Rate control mechanisms fall into two classes: window-based mechanisms and rate-based mechanisms. Window-based mechanisms increase their sending rate after the successful transfer of a window of packets, much as TCP does, and typically decrease their sending rate sharply after a packet loss. Rate-based mechanisms control their sending rate in some other way. A large subset of rate-based mechanisms is equation-based: these mechanisms have a control equation that provides an allowed sending rate. Typically, rate-based mechanisms react more slowly to both packet losses and increases in available bandwidth, making their sending rate smoother than that of window-based mechanisms. This report surveys rate control mechanisms and discusses their relative strengths and weaknesses. A section is dedicated to enhancements for wireless environments. Another topic of the report is bandwidth estimation, which is divided into capacity estimation and available bandwidth estimation. We describe techniques that enable the calculation of a fair sending rate and that can be used to create novel rate control mechanisms.
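To make the notion of a control equation concrete, a widely cited example (not specific to this report) is the TCP throughput equation used by TCP-Friendly Rate Control (TFRC, RFC 5348). It bounds the allowed sending rate X (in bytes per second) in terms of the segment size s, the round-trip time R, the loss event rate p, the retransmission timeout t_RTO, and the number of packets acknowledged per ACK b:

X = \frac{s}{R\sqrt{2bp/3} + t_{RTO}\,\bigl(3\sqrt{3bp/8}\bigr)\,p\,(1 + 32p^{2})}

A sender limiting itself to this rate achieves a long-term throughput roughly equal to what a TCP flow would obtain under the same loss rate and round-trip time, which is what TCP friendliness requires.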
Abstract:
Clustering is a process of partitioning a given set of patterns into meaningful groups. The clustering process can be viewed as consisting of three phases: (i) a feature selection phase, (ii) a classification phase, and (iii) a description generation phase. Conventional clustering algorithms implicitly use knowledge about the clustering environment to a large extent in the feature selection phase. This reduces the need for environmental knowledge in the remaining two phases, permitting the use of a simple numerical measure of similarity in the classification phase. Conceptual clustering algorithms proposed by Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] and Stepp and Michalski [Artif. Intell., pp. 43–69 (1986)] make use of knowledge about the clustering environment in the form of a set of predefined concepts to compute conceptual cohesiveness during the classification phase. Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] have argued that the results obtained with conceptual clustering algorithms are superior to those of conventional methods of numerical classification. However, this claim was not supported by the experimental results obtained by Dale [IEEE Trans. PAMI, PAMI-7, 241–244 (1985)]. In this paper a theoretical framework, based on an intuitively appealing set of axioms, is developed to characterize the equivalence between conceptual clustering and conventional clustering. In other words, it is shown that any classification obtained using conceptual clustering can also be obtained using conventional clustering, and vice versa.
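The distinction between a numerical similarity measure and concept-based cohesiveness can be sketched in a few lines of Python. The snippet below is purely illustrative and not taken from the paper: the function names, the interval-based representation of concepts, and the slack-based score are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not from the paper): two grouping criteria.
# Conventional clustering scores pairs of patterns with a numerical
# distance; conceptual clustering scores a candidate group by how well
# it fits one of a set of predefined concepts (here, an axis-aligned
# interval per feature).

def numerical_similarity(x, y):
    """Conventional criterion: negative Euclidean distance."""
    return -np.linalg.norm(np.asarray(x) - np.asarray(y))

def conceptual_cohesiveness(group, concepts):
    """Score a group by the tightest predefined concept that covers it.

    `concepts` is a list of (low, high) per-feature bounds; a group is
    cohesive if some concept covers all its points with little slack.
    Returns -inf if no concept covers the group.
    """
    group = np.asarray(group)
    best = -np.inf
    for low, high in concepts:
        low, high = np.asarray(low), np.asarray(high)
        if np.all(group >= low) and np.all(group <= high):
            slack = np.sum(high - low)   # smaller concept -> higher score
            best = max(best, -slack)
    return best

if __name__ == "__main__":
    a, b = [0.0, 0.1], [0.2, 0.0]
    print(numerical_similarity(a, b))
    concepts = [([0.0, 0.0], [0.5, 0.5]), ([0.0, 0.0], [2.0, 2.0])]
    print(conceptual_cohesiveness([a, b], concepts))
```

The equivalence result of the paper says, informally, that any grouping favoured by a concept-based score like the second function could also be produced by a conventional algorithm with a suitably chosen numerical measure, and vice versa.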
Abstract:
A state-of-the-art model of the coupled ocean-atmosphere system, the Climate Forecast System (CFS) from the National Centers for Environmental Prediction (NCEP), USA, has been ported onto the PARAM Padma parallel computing system at the Centre for Development of Advanced Computing (CDAC), Bangalore, and retrospective predictions for the summer monsoon (June-September) season of 2009 have been generated, using five initial conditions for the atmosphere and one initial condition for the ocean for May 2009. Whereas a large deficit in the Indian summer monsoon rainfall (ISMR; June-September) was experienced over the Indian region (with all-India rainfall 22% below average), the ensemble-average prediction was for above-average rainfall during the summer monsoon. The retrospective predictions of ISMR with CFS from NCEP for 1981-2008 have been analysed. The retrospective prediction from NCEP for the summer monsoon of 1994 and that from CDAC for 2009 have been compared with simulations of each season by the stand-alone atmospheric component of the model, the Global Forecast System (GFS), and with observations. It is shown that the GFS simulation for 2009 produced deficit rainfall, as observed. The large error in the prediction for the monsoon of 2009 can be attributed to a positive Indian Ocean Dipole event present in the prediction from July onwards but absent in the observations. This suggests that the error could be reduced by improving the ocean model over the equatorial Indian Ocean.
Abstract:
The distribution of particle reinforcements in cast composites is determined by the morphology of the solidification front. Interestingly, during solidification, the morphology of the interface is itself intrinsically affected by the presence of dispersed reinforcements. Thus the dispersoid distribution and the length scale of the matrix microstructure result from the interplay between the two. A proper combination of material and process parameters can be used to obtain composites with tailored microstructures. This requires the generation of a broad database and optimization of the complete solidification process. The length scale of the solidification microstructure has a large influence on the mechanical properties of the composites. This presentation addresses the concept of a particle distribution map, which can help in predicting particle distribution under different solidification conditions. Future research directions are also indicated.