755 results for Computer Uses in Education
Abstract:
This study analyses, following a quantitative methodology based on a representative sample of 2,093 lecturers and 23,864 students and reinforced with qualitative elements, the transition taking place in the Catalan public university system towards a model better adapted to the new needs of the network society. To this end, special emphasis is placed on the analysis of the uses made of the Internet (the key tool of the network society) in the university world and on the transformations that are occurring, or will occur, as a consequence of these uses.
Abstract:
When subjects studied at school are close to societal discourses and to students' social identities, and when they have high emotional resonance, is it possible to enable students to distance themselves from their emotions and personal experience and to conceptualise them? Examining the relation between emotion and learning through the lens of socio-cultural psychology, the aim of our study was to shed light on "secondarisation" processes, that is, processes that transform personal experience and emotions into conceptualised forms of thinking. We analysed 85 video-recorded lessons in education for cultural diversity involving 12 teachers (of primary and secondary schools). Having identified episodes in which emotions were put into words or personal experience was reported, we analysed the use of pronouns (taken as indicators of secondarisation processes) and found a recurrent pattern: "the unicity-genericity routine". We illustrate the functioning of this routine with various excerpts taken from lessons in education for diversity taught in the classes of two primary-school teachers. The results show that the interplay between unicity and genericity works as a discursive resource for the development of secondarisation processes.
Abstract:
The neo-liberal capitalist ideology has come under heavy fire, with anecdotal evidence indicating a link between these same values and unethical behavior. Academic institutions reflect social values and act as socializing agents for the young. Can this explain the high and increasing rates of cheating that currently prevail in education? Our first chapter examines the question of whether self-enhancement values of power and achievement, the individual-level equivalent of neo-liberal capitalist values, predict positive attitudes towards cheating. Furthermore, we explore the mediating role of motivational factors. Results of four studies reveal that self-enhancement value endorsement predicts the adoption of performance-approach goals, a relationship mediated by introjected regulation, namely the desire for social approval, and that self-enhancement value endorsement also predicts the condoning of cheating, a relationship mediated by performance-approach goal adoption. However, self-transcendence values prescribed by a normatively salient source have the potential to reduce the link between self-enhancement value endorsement and attitudes towards cheating. Normative assessment constitutes a key tool used by academic institutions to socialize young people to accept the competitive, meritocratic nature of a society driven by a neo-liberal capitalist ideology. As such, the manifest function of grades is to motivate students to work hard and to buy into the competitive ethos. Does normative assessment fulfill these functions? Our second chapter explores the reward-intrinsic motivation question in the context of grading, arguably a high-stakes reward. Two experiments assess the relative capacity of graded high performance, as compared with the task autonomy experienced in an ungraded task, to predict post-task intrinsic motivation. Results show that whilst graded task performance predicts post-task appreciation, it fails to predict ongoing motivation; perceived autonomy experienced in the ungraded condition, however, predicts both post-task appreciation and ongoing motivation. Our third chapter asks whether normative assessment inspires the spirit of competition in students. Results of three experimental studies reveal that the expectation of a grade for a task, compared to no grade, induces greater adoption of performance-avoidance, but not performance-approach, goals. Experiment 3 provides an explanatory mechanism for this, showing that reduced autonomous motivation experienced in previous graded tasks mediates the relationship between grading and the adoption of performance-avoidance goals in a subsequent task. Taken together, these results provide evidence of the deleterious effects of self-enhancement values, and of the associated practice of normative assessment in school, on student motivation, goals and ethics. We conclude by using value and motivation theory to explore solutions to this problem.
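As a hedged illustration of the mediation logic described above (value endorsement predicting goal adoption through introjected regulation), the following sketch runs a Baron-Kenny-style analysis on simulated data; the variable names, effect sizes and sample size are hypothetical and are not taken from the four studies.

# Minimal mediation sketch on simulated data (hypothetical variables and effects).
# X: self-enhancement value endorsement, M: introjected regulation,
# Y: performance-approach goal adoption.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(scale=0.8, size=n)             # assumed a-path
Y = 0.6 * M + 0.2 * X + rng.normal(scale=0.8, size=n)   # assumed b-path and direct c'-path

def ols(y, *regressors):
    # Least-squares fit with an intercept; returns the slope coefficients only.
    Z = np.column_stack([np.ones_like(y)] + list(regressors))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta[1:]

a = ols(M, X)[0]             # X -> M
b, c_prime = ols(Y, M, X)    # M -> Y controlling for X, and the direct X -> Y effect
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")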
Abstract:
Report for the scientific sojourn carried out at the School of Computing of the University of Dundee, United Kingdom, from 2010 to 2012. This document is a scientific report of the work done, the main results, the publications and the accomplishment of the objectives of the 2-year post-doctoral research project with reference number BP-A 00239. The project has addressed the topic of older people (60+) and Information and Communication Technologies (ICT), a topic of growing social and research interest, from a Human-Computer Interaction perspective. Over a 2-year period (June 2010-June 2012), we conducted a classical ethnography of ICT use in a computer clubhouse in Scotland, addressing interaction barriers and strategies, social sharing practices in Social Network Sites, and ICT learning, and carried out rapid ethnographic studies related to geo-enabled ICT and e-government services towards supporting independent living and active ageing. The main results have provided a much deeper understanding of (i) the everyday use of Computer-Mediated Communication tools, such as video-chats and blogs, and its evolution as older people's experience with ICT increases over time, (ii) cross-cultural aspects of ICT use in the north and south of Europe, (iii) the relevance of cognition over vision in interacting with geographical information and a wide range of ICT tools, despite common stereotypes (e.g. make things bigger), (iv) the importance of the offline-online relationship in providing older people with socially inclusive and meaningful e-services for independent living and active ageing, (v) how older people carry out social sharing practices on the popular YouTube platform, (vi) their user experiences and (vii) the challenges they face in ICT learning and the strategies they use to become successful ICT learners over time. The research conducted in this project has been published in 17 papers: 4 in journals (two of them in JCR-indexed journals), 5 in conferences, 4 in workshops and 4 in magazines. Other public output consists of 10 invited talks and seminars.
Abstract:
The article presents and discusses estimates of social and economic indicators for Italy's regions in benchmark years roughly from Unification to the present day: life expectancy, education, GDP per capita at purchasing power parity, and the new Human Development Index (HDI). A broad interpretative hypothesis, based on the distinction between passive and active modernization, is proposed to account for the evolution of regional imbalances over the long run. In the absence of active modernization, Southern Italy converged thanks to passive modernization, i.e., State intervention: however, this was more effective for life expectancy, less successful in education, and expensive and on the whole ineffective for GDP. As a consequence, convergence in the HDI occurred from the late nineteenth century to the 1970s, but came to a sudden halt in the last decades of the twentieth century.
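For reference, a hedged sketch of how a "new" HDI of this kind is typically assembled (in the spirit of the UNDP post-2010 formulation; the article's exact indicators, bounds and weights may differ): each dimension is first normalized to the unit interval and the three indices are then combined by a geometric mean,

\[
I_{\text{dim}} = \frac{x - x_{\min}}{x_{\max} - x_{\min}}, \qquad
\mathrm{HDI} = \bigl(I_{\text{health}}\, I_{\text{education}}\, I_{\text{income}}\bigr)^{1/3}.
\]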
Abstract:
The optimization of the pilot overhead in single-user wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
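A hedged numerical sketch of the scaling claimed above, using a textbook block-fading/MMSE-estimation lower bound rather than the paper's continuous-fading expansion (the coherence length N = 1/(2 f_D), equal pilot and data power, and the grid search are all assumptions): the pilot fraction that maximizes the bound grows roughly like the square root of the normalized Doppler frequency.

# Sketch: optimal pilot overhead vs normalized Doppler (illustrative model only).
import numpy as np

def pilot_se(alpha, fd, snr):
    # Pilot-based spectral-efficiency lower bound for a block-fading model.
    # alpha: pilot overhead (fraction of symbols), fd: normalized Doppler, snr: average SNR.
    N = 1.0 / (2.0 * fd)                      # assumed coherence block length
    mse = 1.0 / (1.0 + alpha * N * snr)       # MMSE channel-estimation error variance
    snr_eff = snr * (1.0 - mse) / (1.0 + snr * mse)
    return (1.0 - alpha) * np.log2(1.0 + snr_eff)

snr = 10.0                                    # 10 dB
alphas = np.linspace(1e-4, 0.5, 5000)
for fd in (1e-4, 1e-3, 1e-2):
    best = alphas[np.argmax([pilot_se(a, fd, snr) for a in alphas])]
    print(f"fd = {fd:.0e}: optimal overhead ~ {best:.3f}, ratio to sqrt(fd) = {best / np.sqrt(fd):.2f}")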
Abstract:
In the context of fading channels it is well established that, with a constrained transmit power, the bit rates achievable by signals that are not peaky vanish as the bandwidth grows without bound. Stepping back from the limit, we characterize the highest bit rate achievable by such non-peaky signals and the approximate bandwidth where that apex occurs. As it turns out, the gap between the highest rate achievable without peakedness and the infinite-bandwidth capacity (with unconstrained peakedness) is small for virtually all settings of interest to wireless communications. Thus, although strictly achieving capacity in wideband fading channels does require signal peakedness, bit rates not far from capacity can be achieved with conventional signaling formats that do not exhibit the serious practical drawbacks associated with peakedness. In addition, we show that the asymptotic decay of bit rate in the absence of peakedness usually takes hold at bandwidths so large that wideband fading models are called into question. Rather, ultrawideband models ought to be used.
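For orientation, the infinite-bandwidth benchmark invoked above can be made explicit. With received power P, noise spectral density N_0 and bandwidth B, the coherent capacity saturates rather than growing without bound, and the wideband fading-channel capacity with unconstrained peakedness approaches the same value:

\[
\lim_{B \to \infty} B \log_2\!\Bigl(1 + \frac{P}{N_0 B}\Bigr) = \frac{P}{N_0}\,\log_2 e .
\]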
Abstract:
A contemporary perspective on the tradeoff between transmit antenna diversity and spatial multiplexing is provided. It is argued that, in the context of most modern wireless systems and for the operating points of interest, transmission techniques that utilize all available spatial degrees of freedom for multiplexing outperform techniques that explicitly sacrifice spatial multiplexing for diversity. In the context of such systems, therefore, there essentially is no decision to be made between transmit antenna diversity and spatial multiplexing in MIMO communication. Reaching this conclusion, however, requires that the channel and some key system features be adequately modeled and that suitable performance metrics be adopted; failure to do so may bring about starkly different conclusions. As a specific example, this contrast is illustrated using the 3GPP Long-Term Evolution system design.
Abstract:
The analysis of the multiantenna capacity in the high-SNR regime has hitherto focused on the high-SNR slope (or maximum multiplexing gain), which quantifies the multiplicative capacity increase as a function of the number of antennas. This traditional characterization is unable to assess the impact of prominent channel features since, for a majority of channels, the slope equals the minimum of the number of transmit and receive antennas. Furthermore, a characterization based solely on the slope captures only the scaling and has no notion of the power required for a certain capacity. This paper advocates a more refined characterization whereby, as a function of SNR|dB, the high-SNR capacity is expanded as an affine function in which the impact of channel features such as antenna correlation, unfaded components, etc., resides in the zero-order term, or power offset. The power offset, for which we find insightful closed-form expressions, is shown to play a chief role for SNR levels of practical interest.
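In the notation commonly used for this expansion (a sketch of its general form rather than of the paper's specific results), with S_\infty the high-SNR slope in bits/s/Hz per 3 dB and \mathcal{L}_\infty the power offset in 3-dB units:

\[
C(\mathrm{SNR}) = S_\infty \bigl(\log_2 \mathrm{SNR} - \mathcal{L}_\infty\bigr) + o(1)
               = S_\infty \Bigl(\tfrac{\mathrm{SNR}|_{\mathrm{dB}}}{3\,\mathrm{dB}} - \mathcal{L}_\infty\Bigr) + o(1),
\qquad 3\,\mathrm{dB} = 10 \log_{10} 2 .
\]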
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, allows for reduced computational storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
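A hedged sketch of the kind of predictability check described above, using a synthetic stand-in for a music-descriptor time series and a plain least-squares AR(p) predictor; the descriptor, model order and train/test split are illustrative assumptions rather than the paper's configuration.

# Illustrative AR(p) predictability check on a synthetic descriptor series.
import numpy as np

rng = np.random.default_rng(1)
T, p = 2000, 8                                   # series length and AR order (assumptions)
# Synthetic stand-in for a music descriptor (e.g. a loudness- or chroma-like series).
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + 0.3 * np.sin(2 * np.pi * t / 50) + rng.normal(scale=0.2)

train, test = x[: T // 2], x[T // 2 :]

# Fit AR(p) by least squares on the training half.
X = np.column_stack([train[i : len(train) - p + i] for i in range(p)])
y = train[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead prediction on the test half, compared against a last-value baseline.
Xt = np.column_stack([test[i : len(test) - p + i] for i in range(p)])
yt = test[p:]
mse_ar = np.mean((yt - Xt @ coef) ** 2)
mse_naive = np.mean((yt - test[p - 1 : -1]) ** 2)
print(f"AR({p}) MSE = {mse_ar:.4f}, naive last-value MSE = {mse_naive:.4f}")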
Abstract:
Two important challenges that teachers currently face are the sharing and the collaborative authoring of their learning design solutions, such as didactical units and learning materials. On the one hand, there are tools that can be used for the creation of design solutions, and only some of them facilitate co-editing; however, they do not incorporate mechanisms that support the sharing of the designs among teachers. On the other hand, there are tools that serve as repositories of educational resources, but they do not enable the authoring of the designs. In this paper we present LdShake, a web tool whose novelty lies in its combined support for the social sharing and co-editing of learning design solutions within communities of teachers. Teachers can create and share learning designs with other teachers using different access rights so that they can read, comment on or co-edit the designs. Therefore, each design solution is associated with a group of teachers able to work on its definition, and another group that can only see the design. The tool is generic in that it allows the creation of designs based on any pedagogical approach. However, it can be particularized in instances providing pre-formatted designs structured according to a specific didactic method (such as Problem-Based Learning, PBL). A particularized LdShake instance has been used in the context of Human Biology studies, where teams of teachers are required to work together in the design of PBL solutions. A controlled user study has also been carried out, comparing the use of a generic LdShake instance with a Moodle system configured to enable the creation and sharing of designs. The combined results of the real and controlled studies show that the social structure and the commenting, co-editing and publishing features of LdShake provide a useful, effective and usable approach for facilitating teachers' teamwork.
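As a purely hypothetical sketch (not LdShake's actual code or API) of the per-design access-rights model described above, with one group able to co-edit and another only able to see the design:

# Hypothetical data model for per-design access rights (illustrative only).
from dataclasses import dataclass, field

@dataclass
class LearningDesign:
    title: str
    body: str = ""
    editors: set = field(default_factory=set)      # may read, comment and co-edit
    commenters: set = field(default_factory=set)   # may read and comment
    viewers: set = field(default_factory=set)      # may only read
    comments: list = field(default_factory=list)

    def can_view(self, user):
        return user in self.editors or user in self.commenters or user in self.viewers

    def can_comment(self, user):
        return user in self.editors or user in self.commenters

    def can_edit(self, user):
        return user in self.editors

    def add_comment(self, user, text):
        if not self.can_comment(user):
            raise PermissionError(f"{user} may not comment on this design")
        self.comments.append((user, text))

design = LearningDesign("PBL unit on human biology", editors={"anna"}, viewers={"joan"})
assert design.can_edit("anna") and design.can_view("joan") and not design.can_edit("joan")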
Abstract:
In the last few years, there has been a growing focus on faster computational methods to support clinicians in planning stenting procedures. This study investigates the possibility of introducing computational approximations in modelling stent deployment in aneurysmatic cerebral vessels to achieve simulations compatible with the constraints of real clinical workflows. The release of a self-expandable stent in a simplified aneurysmatic vessel was modelled in four different initial positions. Six progressively simplified modelling approaches (based on the Finite Element method and on Fast Virtual Stenting, FVS) have been used. Comparing the accuracy of the results, the final configuration of the stent is more affected by neglecting the mechanical properties of the materials (FVS) than by adopting 1D instead of 3D stent models. Nevertheless, the differences shown are acceptable compared to those obtained by considering different initial stent positions. Regarding computational costs, simulations involving 1D stent models are the only ones feasible in a clinical context.
Abstract:
In a distributed key distribution scheme, a set of servers helps a set of users in a group to securely obtain a common key. Security means that an adversary who corrupts some servers and some users has no information about the key of a non-corrupted group. In this work, we formalize the security analysis of one such scheme, which was not considered in the original proposal. We prove the scheme is secure in the random oracle model, assuming that the Decisional Diffie-Hellman (DDH) problem is hard to solve. We also detail a possible modification of that scheme, and of a related one, which allows us to prove the security of the schemes without assuming that a specific hash function behaves as a random oracle. As usual, this improvement in the security of the schemes comes at the cost of an efficiency loss.
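As a heavily hedged toy illustration of the general flavour of such schemes (not the protocol analysed in this work, and with a group far too small for real security): servers holding additive shares of a secret can let a user derive a group key without the full secret ever being reconstructed.

# Toy distributed key derivation: servers hold additive shares of a secret s and a
# user combines their responses into a group key. Illustration only; the parameters
# are tiny and this is NOT the scheme analysed above.
import hashlib
import secrets

p, q, g = 2039, 1019, 4          # small safe-prime group (p = 2q + 1), g of order q

def h_to_exp(label: str) -> int:
    # Hash a group identifier to a nonzero exponent mod q (toy hash-to-exponent).
    return 1 + int.from_bytes(hashlib.sha256(label.encode()).digest(), "big") % (q - 1)

# Dealer-free setup sketch: each server picks its own additive share s_i; the
# implicit master secret is s = sum(s_i) mod q and is never reconstructed.
shares = [secrets.randbelow(q) for _ in range(3)]

def server_response(share: int, group_id: str) -> int:
    base = pow(g, h_to_exp(group_id), p)
    return pow(base, share, p)

def user_combine(responses: list) -> bytes:
    raw = 1
    for r in responses:
        raw = (raw * r) % p                      # product equals g^(u * sum s_i) mod p
    return hashlib.sha256(str(raw).encode()).digest()

gid = "project-group-42"
key = user_combine([server_response(s, gid) for s in shares])
print("derived group key:", key.hex())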