38 results for "Learning from one Example"


Relevance: 100.00%

Abstract:

Automatic Term Recognition (ATR) is a fundamental processing step preceding more complex tasks such as semantic search and ontology learning. Of the large number of methodologies available in the literature, only a few are able to handle both single- and multi-word terms. In this paper we present a comparison of five such algorithms and propose a combined approach using a voting mechanism. We evaluated the six approaches using two different corpora and show that the voting algorithm performs best on one corpus (a collection of texts from Wikipedia) and less well on the Genia corpus (a standard life-science corpus). This indicates that the choice and design of corpus has a major impact on the evaluation of term recognition algorithms. Our experiments also showed that single-word terms can be equally important and occupy a fairly large proportion of the terminology in certain domains. As a result, algorithms that ignore single-word terms may cause problems for tasks built on top of ATR. Effective ATR systems also need to take into account both the unstructured text and its structured aspects, which means that information extraction techniques need to be integrated into the term recognition process.
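The abstract does not specify how the voting mechanism combines the five algorithms; one common, minimal scheme is Borda-style rank voting over each algorithm's ranked term list. A sketch under that assumption (the function and the toy rankings are hypothetical, not the paper's code):

```python
def vote(rankings):
    """Combine ranked term lists by Borda-style voting.

    rankings: list of term lists, each ordered best-first
              (one list per ATR algorithm).
    Returns all terms ordered by combined score, highest first.
    """
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for pos, term in enumerate(ranking):
            # A term near the top of a list earns more points.
            scores[term] = scores.get(term, 0) + (n - pos)
    return sorted(scores, key=scores.get, reverse=True)

# Toy rankings from two hypothetical term recognisers.
algo_a = ["gene expression", "cell", "protein binding"]
algo_b = ["protein binding", "gene expression", "enzyme"]
combined = vote([algo_a, algo_b])
```

A term ranked highly by several algorithms outscores a term ranked highly by only one, which is the intuition behind combining weak rankers by voting.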

Relevance: 100.00%

Abstract:

This article aims to gain a greater understanding of relevant and successful methods of stimulating an ICT culture and skills development in rural areas. The paper distils good practice activities, utilizing criteria derived from a review of the rural dimensions of ICT learning, from a range of relevant initiatives and programmes. These good practice activities cover: community resource centres providing opportunities for ‘tasting’ ICTs; video games and Internet cafés as tools removing ‘entry barriers’; emphasis on ‘user management’ as a means of creating ownership; service delivery beyond fixed locations; use of ICT capacities in the delivery of general services; and selected use of financial support.

Relevance: 100.00%

Abstract:

Purpose - This article examines the internationalisation of Tesco and extracts the salient lessons learned from this process. Design/methodology/approach - This research draws on a dataset of 62 in-depth interviews with key executives, sell- and buy-side analysts and corporate advisers at the leading investment banks in the City of London to detail the experiences of Tesco's European expansion. Findings - The case study of Tesco illuminates a number of different dimensions of the company's international experience. It offers some new insights into learning in international distribution environments, such as the idea that learning is facilitated by uncertainty or "shocks" in the international retail marketplace; the size of the domestic market may inhibit change and so disable international learning; and learning is not necessarily facilitated by step-by-step incremental approaches to expansion. Research limitations/implications - The paper explores learning from a rather broad perspective, although it is hoped that these parameters can be used to raise a new set of more detailed priorities for future research on international retail learning. It is also recognised that the data gathered for this case study focus on Tesco's European operations. Practical implications - This paper raises a number of interesting issues, such as whether the extremities of the business may be a more appropriate place for management to experiment and test new retail innovations, and the extent to which retailers take self-reflection seriously. Originality/value - The paper applies a new theoretical learning perspective to capture the variety of experiences during the internationalisation process, thus addressing a major gap in our understanding of the whole internationalisation process. © Emerald Group Publishing Limited.

Relevance: 100.00%

Abstract:

Vaccines are the greatest single instrument of prophylaxis against infectious diseases, with immeasurable benefits to human wellbeing. The accurate and reliable prediction of peptide-MHC binding is fundamental to the robust identification of T-cell epitopes and thus the successful design of peptide- and protein-based vaccines. The prediction of MHC class II peptide binding has hitherto proved recalcitrant and refractory. Here we illustrate the utility of existing computational tools for in silico prediction of peptides binding to class II MHCs. Most of the methods tested in the present study detect more than half of the true binders in the top 5% of all possible nonamers generated from one protein. This number increases in the top 10% and 15%, and then does not change significantly. For the top 15%, the identified binders approach 86%. In terms of lab work, this means 85% less expenditure on materials, labour and time. We show that while existing caveats are well founded, the use of computational models of class II binding can nonetheless offer viable help to the work of the immunologist and vaccinologist.
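The threshold evaluation described above (the fraction of true binders recovered among the top-scoring 5%, 10% or 15% of all candidate nonamers) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name and the toy data are hypothetical:

```python
def recall_at_top(scored_peptides, true_binders, top_fraction):
    """Fraction of true binders found in the top-scoring slice.

    scored_peptides: list of (peptide, predicted_score) pairs.
    true_binders:    set of peptides known experimentally to bind.
    top_fraction:    e.g. 0.05 for the top 5% threshold.
    """
    ranked = sorted(scored_peptides, key=lambda p: p[1], reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    top = {peptide for peptide, _ in ranked[:k]}
    return len(top & true_binders) / len(true_binders)

# Toy example: 4 candidate nonamers, 2 known binders.
scored = [("AAAAAAAAA", 0.9), ("CCCCCCCCC", 0.8),
          ("DDDDDDDDD", 0.4), ("EEEEEEEEE", 0.1)]
binders = {"AAAAAAAAA", "EEEEEEEEE"}
recall = recall_at_top(scored, binders, 0.5)
```

Sweeping `top_fraction` over 0.05, 0.10 and 0.15 reproduces the kind of recall-versus-workload trade-off the abstract reports.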

Relevance: 100.00%

Abstract:

Since 1999, the European Antimicrobial Resistance Surveillance System (EARSS) has monitored the rise in infection due to a number of organisms, including meticillin-resistant Staphylococcus aureus (MRSA). The EARSS reported that MRSA infections within intensive care units account for 25-50% of infections in many central and southern European countries, including France, Spain, Great Britain, Malta, Greece and Italy. Each country has defined epidemic MRSA (EMRSA) strains; however, the method of spread of these strains from one country to another is unknown. In the current study, DNA profiles of 473 isolates of MRSA collected from the UK and Malta were determined by pulsed-field gel electrophoresis (PFGE). Analysis of the data showed that two countries separated by a large geographical distance had similar DNA profile patterns. Additionally, it was demonstrated that strains of EMRSA normally found in the UK were also found in the Maltese cohort (EMRSA-15 and -16). A distinct DNA profile, which may be a local EMRSA, was found in the Maltese cohort and accounted for 14.4% of all Maltese isolates. The appearance of the same MRSA and EMRSA profiles in two separate countries suggests that MRSA can be transferred out of its country of origin and potentially become established in a new locality or country.

Relevance: 100.00%

Abstract:

A homologous series of ultra-violet stabilisers containing the 2-hydroxybenzophenone (HBP) moiety as a UV-absorbing chromophore, with varying alkyl chain lengths and sizes, was prepared by known chemical syntheses. The strong absorbance of the HBP chromophore was utilized to evaluate the concentration of these stabilisers in low density polyethylene (LDPE) films and in relevant solvents by ultra-violet/visible spectroscopy. Intrinsic diffusion coefficients, equilibrium solubilities, volatilities from LDPE films and the volatility of the pure stabilisers were studied over a temperature range of 5-100 °C. The effects of structure, molecular weight and temperature on the above parameters were investigated, and the results were analysed on the basis of theoretical models published in the literature. It was found that an increase in alkyl chain length does not change the diffusion coefficients to a significant extent, while attachment of polar or branched alkyl groups changes their values considerably. An Arrhenius-type relationship for the temperature dependence of diffusion coefficients seems to be valid only for a narrow temperature range, and therefore extrapolation of data from one temperature to another leads to considerable error. The evidence showed that additive solubility in the polymer is favoured by lower heats of fusion and lower melting points of the additives, which implies that simple regular solution theory provides an adequate basis for understanding the solubility of additives in polymers. The volatility of stabilisers from low density polyethylene films showed that the loss of an additive from a polymer can be expressed in terms of a first-order kinetic equation.
In addition, the rate of loss of stabilisers was discussed in relation to their diffusion, solubility and volatility; all of these factors may contribute to additive loss, although one may be rate-determining. Stabiliser migration from LDPE into various solvents and food simulants was studied at 5, 23, 40 and 70 °C; from plots of the extent of migration versus the square root of time, characteristic diffusion coefficients were obtained by using the solution of Fick's diffusion equations. It was shown that the rate of migration depends primarily on the partition coefficient of the additive between the solvent and the polymer, and also on the swelling action of the contacting media. Characteristic diffusion coefficients were found to approach the intrinsic values in non-swelling solvents, whereas in the case of highly swollen polymer samples the former may be orders of magnitude greater than the latter.
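Extracting a characteristic diffusion coefficient from a migration-versus-square-root-of-time plot can be sketched under a standard assumption: the short-time Fickian solution for a plane sheet losing additive from both faces. The thesis's exact geometry and boundary conditions are not given in the abstract, so this is a minimal illustrative sketch, not the author's method:

```python
import math

def diffusion_from_slope(slope, thickness):
    """Estimate a characteristic diffusion coefficient D from the
    initial slope of a migration plot, M_t/M_inf versus sqrt(t).

    Assumes the short-time Fickian solution for a plane sheet of
    the given thickness losing additive from both faces:
        M_t/M_inf = (4 / thickness) * sqrt(D * t / pi)
    which rearranges to D = pi * (slope * thickness / 4)**2.
    Units: slope in 1/sqrt(s), thickness in m, D in m^2/s.
    """
    return math.pi * (slope * thickness / 4.0) ** 2

# Synthetic round trip: a 100 micrometre film with D = 1e-13 m^2/s
# gives this initial slope, and the estimate recovers D.
film = 1e-4
slope = (4.0 / film) * math.sqrt(1e-13 / math.pi)
d_est = diffusion_from_slope(slope, film)
```

In a swelling solvent the measured slope rises sharply, so the "characteristic" D obtained this way can exceed the intrinsic value by orders of magnitude, as the abstract notes.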

Relevance: 100.00%

Abstract:

Exporting is one of the main ways in which organizations internationalize. With the export environment becoming more turbulent, heterogeneous, sophisticated and less familiar, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building on an export information processing framework, this research focuses particularly on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been identified: instrumental, conceptual, legitimizing and manipulating. Results from the quantitative study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn was found to be positively related to export performance. Furthermore, the results show that only one aspect of export memory use significantly affects export performance: the extent of export memory use.
This finding could mean that no particular type of export memory use is favoured, since the choice of the type of use is situation-specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.

Relevance: 100.00%

Abstract:

This thesis is organised into three parts. In Part 1 relevant literature is reviewed and three critical components in the development of a cognitive approach to instruction are identified. These three components are considered to be the structure of the subject-matter, the learner's cognitive structures, and the learner's cognitive strategies, which act as control and transfer devices between the instructional materials and the learner's cognitive structures. Six experiments are described in Part 2, which is divided into two methodologically distinct units. The three experiments of Unit 1 examined how learning from materials constructed from concept-name by concept-attribute matrices is influenced by learner- or experimenter-controlled sequence and organisation. The results suggested that the relationships between input organisation, output organisation and recall are complex, and highlighted the importance of investigating organisational strategies at both acquisition and recall. The role of subjects' previously acquired knowledge and skills in relation to the instructional material was considered to be an important factor. The three experiments of Unit 2 utilised a "diagramming relationships" methodology, which was devised as one means of investigating the processes by which new information is assimilated into an individual's cognitive structure. The methodology was found to be useful in identifying cognitive strategies related to successful task performance. The results suggested that errors could be minimised and comprehension improved on the diagramming relationships task by instructing subjects in ways which induced successful processing operations. Part 3 of this thesis highlights salient issues raised by the experimental work within the framework outlined in Part 1 and discusses potential implications for future theoretical developments and research.

Relevance: 100.00%

Abstract:

This thesis explores how the world wide web can be used to support English language teachers doing further studies at a distance. The future of education worldwide is moving towards a requirement that we, as teacher educators, use the latest web technology not as a gambit, but as a viable tool to improve learning. By examining the literature on knowledge, teacher education and web training, a model of teacher knowledge development is developed, along with statements of advice for web developers based upon the model. Next, the applicability and viability of both the model and the statements of advice are examined by developing a teacher support site (http://www.philseflsupport.com) according to these principles. The data collected from one focus group of users from sixteen different countries, all studying on the same distance Masters programme, is then analysed in depth. The outcomes from the research are threefold. A functioning website that is averaging around 15,000 hits a month provides a professional contribution. An expanded model of teacher knowledge development, based upon five theoretical principles that reflect the ever-expanding cyclical nature of teacher learning, provides an academic contribution. Finally, a series of six statements of advice for developers of teacher support sites, grounded in the theoretical principles behind the model of teacher knowledge development and incorporating nine keys to effective web facilitation, provides a forward-looking contribution to the praxis of web-supported teacher education, and thus to the potential dissemination of the research presented here. The research has succeeded in reducing the proliferation of terminology in teacher knowledge into a succinct model of teacher knowledge development. The model may now be used to further our understanding of how teachers learn and develop as other research builds upon the individual study here.
NB: Appendix 4 is only available for consultation at Aston University Library with prior arrangement.

Relevance: 100.00%

Abstract:

The theatre director (metteur en scène in French) is a relatively new figure in theatre practice. It was not until the 1820s that the term 'mise en scène' gained currency. The term 'director' was not in general use until the 1880s. The emergence and the role of the director have been considered from a variety of perspectives: through the history of theatre (Allevy, Jomaron, Sarrazac, Viala, Biet and Triau); the history of directing (Chinoy and Cole, Boll, Veinstein, Roubine); semiotic approaches to directing (Whitmore, Miller, Pavis); the semiotics of performance (De Marinis); generic approaches to the mise en scène (Thomasseau, Banu); post-dramatic approaches to theatre (Lehmann); and approaches to performance process and the specifics of rehearsal methodology (Bradby and Williams, Giannachi and Luckhurst, Picon-Vallin, Styan). What the scholarly literature has not done so far is to map the parameters necessarily involved in the directing process, to incorporate an analysis of the emergence of the theatre director during the modern period, and to consider its impact on contemporary performance practice. Directing relates primarily to the making of the performance guided by a director, a single figure charged with the authority to make binding artistic decisions. Each director may have her/his own personal approach to the process of preparation prior to a show. This is exemplified, for example, by the variety of terms now used to describe the role and function of directing, from producer to facilitator or outside eye. However, it is essential at the outset to make two observations, each of which contributes to a justification for a generic analysis (as opposed to a genetic approach). Firstly, a director does not work alone, and cooperation with others is involved at all stages of the process. Secondly, beyond individual variation, the role of the director remains twofold.
The first is to guide the actors (meneur de jeu, directeur d'acteurs, coach); the second is to make a visual representation in the performance space (set designer, stage designer, costume designer, lighting designer, scénographe). The increasing place of scenography has led contemporary theatre directors such as Wilson, Castellucci and Fabre to produce performances in which the performance space becomes a semiotic dimension that displaces the primacy of the text. The play is not, therefore, the sole artistic vehicle for directing. This definition of directing obviously calls for a definition of what the making of the performance might be. The thesis defines the making of the performance as the activity of bringing about a social event, by at least one performer, that provides visual and/or textual meaning in a performance space. This definition enables us to evaluate four consistent parameters throughout theatre history: first, the social aspect associated with the performance event; second, the devising process, which may be based on visual and/or textual elements; third, the presence of at least one performer in the show; fourth, the performance space (which is not simply related to the theatre stage). Although the thesis focuses primarily on theatre practice, such a definition blurs the boundaries between theatre and other collaborative artistic disciplines (cinema, opera, music and dance). These parameters illustrate the possibility of undertaking a generic analysis of directing, and resonate with the historical, political and artistic dimensions considered.
Such a generic perspective on the role of the director addresses three significant questions: a historical question: how and why has the director emerged?; a socio-political question: how and why was the director a catalyst for the politicisation of theatre, and how did this subsequently contribute to the rise of State-funded theatre policy?; and an artistic one: how and why has the director changed theatre practice and theory in the twentieth century? Directing for the theatre as an artistic activity is a historically situated phenomenon. It would seem only natural from a contemporary perspective to associate the activity of directing with the function of the director. This is relativised, however, by the question of how the performance was produced before the modern period. The thesis demonstrates that the rise of the director is a progressive and historical phenomenon (Dort) rather than a mere invention (Viala, Sarrazac). A chronological analysis of the making of the performance throughout theatre history is the most useful way to open the study. In order to understand the emergence of the director, the research methodology assesses the interconnection of the four parameters above throughout four main periods of theatre history: the beginning of the Renaissance (meneur de jeu), the classical age (actor-manager and stage designer-manager), the modern period (director) and the contemporary period (director-facilitator, performer). This allows us properly to appraise the progressive emergence of the director, as well as to make an analysis of her/his modern and contemporary role. The first chapter argues that the physical separation between the performance space and its audience, which appeared in the early fifteenth century, has been a crucial feature in the scenographic, aesthetic, political and social organisation of the performance.
At the end of the Middle Ages, French farces which raised socio-political issues (see Bakhtin) made a clear division on a single outdoor stage (tréteau) between the actors and the spectators, while religious plays (drame liturgique, mystère) were mostly performed on various open outdoor multi-spaces. As long as the performance was liturgical or religious, and therefore confined within an acceptable framework, it was allowed. At the time, the French ecclesiastical and civil authorities tried, on several occasions, to prohibit staged performances. As a result, practitioners developed non-official indoor spaces, the Théâtre de la Trinité (1398) being the first French indoor theatre recognized by scholars. This self-exclusion from the open public space involved practitioners (e.g. Les Confrères de la Passion) breaking the accepted rules, in terms of themes but also through individual input into a secular performance rather than the repetition of commonly known religious canvases. These developments heralded the authorised theatres that began to emerge from the mid-sixteenth century, which in some cases were subsidised in their construction. The construction of authorised indoor theatres, associated with the development of printing, led to a considerable increase in the production of dramatic texts for the stage. Profoundly affecting the reception of the dramatic text by the audience, the distance between the stage and the auditorium accompanied the changing relationship between practitioners and spectators. This distance gave rise to a major development of the role of the actor and of the stage designer. The second chapter looks at the significance of both the actor and the set designer in the devising process of the performance from the sixteenth century to the end of the nineteenth century.
The actor underwent an important shift in function in this period, from the delivery of an unwritten text learned in the medieval oral tradition to a structured improvisation produced by the commedia dell'arte. In this new form of theatre, a chef de troupe or an experienced actor shaped the story, but the text existed only through the improvisation of the actors. The preparation of those performances was, moreover, centred on acting technique and the individual skills of the actor. From this point, there is clear evidence that acting began to be the subject of a number of studies in the mid-sixteenth century, and more significantly in the seventeenth century, in Italy and France. This is revealed through the implementation of a system of notes written by the playwright to the actors (stage directions) in a range of plays (Gérard de Vivier, Comédie de la Fidélité Nuptiale, 1577). The thesis also focuses on Leone de' Sommi (Quattro dialoghi, 1556 or 1565), who wrote about actors' techniques and introduced the meneur de jeu in Italy. The actor-manager (meneur de jeu), a professional actor whom scholars have compared to the director (see Strihan), trained the actors. Nothing, however, indicates that the actor-manager was directing the visual representation of the text in the performance space. From the end of the sixteenth century, the dramatic text began to dominate the process of the performance and led to an expansion of acting techniques, such as declamation. Stage designers came from outside the theatre tradition and played a decisive role in the staging of religious celebrations (e.g. Actes des Apôtres, 1536). In the sixteenth century, both the proscenium arch and the borders, incorporated into the architecture of the new indoor theatres (théâtre à l'italienne), contributed to creating all kinds of illusions on the stage, principally the revival of perspective. This chapter shows ongoing audience demands for more elaborate visual effects on the stage.
This led, throughout the classical age, and even more so during the eighteenth century, to the stage design practitioner being granted a major role in the making of the performance (see Ciceri). The second chapter demonstrates that the guidance of the actors and the scenographic conception, which are the artistic components of the role of the director, appear to have developed independently from one another until the nineteenth century. The third chapter investigates the emergence of the director per se. The causes of this have been considered by a number of scholars, who have mainly identified two: the influence of Naturalism (illustrated by the Meiningen Company, Antoine, and Stanislavski) and the invention of electric lighting. The influence of the Naturalist movement on the emergence of the modern director in the late nineteenth century is often considered a radical factor in the history of theatre practice. Naturalism undoubtedly contributed to changes in staging, costume and lighting design, and to a more rigorous commitment to the harmonisation and visualisation of the overall production of the play. Although the art of theatre was dependent on the dramatic text, scholars (Osborne) demonstrate that the Naturalist directors of the late nineteenth century did not strictly follow the playwright's indications written in the play. On the other hand, the main characteristic of directing in Naturalism at that time depended on a comprehensive understanding of the scenography, which had to respond to the requirements of verisimilitude. Electric lighting contributed to this by allowing for the construction of a visual narrative on stage. However, it was a master technician, rather than an emergent director, who was responsible for key operational decisions over how to use this emerging technology in venues such as the new Bayreuth theatre in 1876.
Electric lighting reflects a normal technological evolution and cannot be considered one of the main causes of the emergence of the director. Two further causes of the emergence of the director, not considered in previous studies, are the invention of cinema and the Symbolist movement (Lugné-Poe, Meyerhold). Cinema had an important technological influence on the practitioners of the Naturalist movement. In order to achieve a photographic truth on the stage (tableau, image), Naturalist directors strove to decorate the stage with the detailed elements that would be expected if the situation were happening in reality. Film production had an influence on the work of actors (Walter). The filmmaker took on a primary role in the making of the film, as the source of the script, the filming process and the editing of the film. This role influenced the conception that theatre directors had of their own work, and it is this concept which influenced the development of the theatre director. As for the Symbolist movement, the director's approach was to dematerialise the text of the playwright, trying to expose the spirit, movement, colour and rhythm of the text. The Symbolists therefore disengaged themselves from the material aspect of the production, and contributed to giving greater artistic autonomy to the role of the director. Although the emergence of the director finds its roots amongst the Naturalist practitioners (through a rigorous attempt to provide a strict visual interpretation of the text on stage), the Symbolist director heralded the modern perspective on the making of performance. The emergence of the director significantly changed theatre practice and theory. For instance, the rehearsal period became a clear work in progress, a platform for both developing practitioners' techniques and staging the show.
This chapter explores and contrasts several practitioners' methods based on the two aspects proposed for the definition of the director (guidance of the actors and materialisation of a visual space). The fourth chapter argues that the role of the director became stronger, more prominent and more hierarchical through a more political and didactic approach to theatre, as exemplified by the cases of France and Germany from the end of the nineteenth century through the First World War. This didactic perspective on theatre defines the notion of political theatre. Political theatre is often approached in the literature (Esslin, Willett) through a Marxist interpretation of the great German directors' productions (Reinhardt, Piscator, Brecht). These directors certainly had a great influence on many directors after the Second World War, such as Jean Vilar, Judith Malina, Jean-Louis Barrault, Roger Planchon, Augusto Boal and others. This chapter demonstrates, moreover, that the director was confirmed through both ontological and educational approaches to the process of making the performance, and consequently became a central and paternal figure in the organisational and structural processes practised within her/his theatre company. In this way, the stance taken by the director influenced the State authorities in establishing theatrical policy. This is an entirely novel scholarly contribution to the study of the director. The German and French States were not indifferent to the development of political theatre. A network of public theatres was thus developed in the inter-war period, and more significantly after the Second World War. The fifth chapter shows how State theatre policy has its sources in the development of political theatre, and more specifically in the German theatre trade union movement (Volksbühne) and the great directors of the end of the nineteenth century.
French political theatre was more influenced by playwrights and actors (Romain Rolland, Louise Michel, Louis Lumet, Émile Berny). French theatre policy was based primarily on theatre directors who decentralised their activities in France during both the inter-war period and the German occupation. After the Second World War, the government established, through directors, a strong network of public theatres. Directors became both the artistic director and the executive director of those institutionalised theatres. The institution was, however, seriously shaken by the social and political upheaval of 1968. It is the link between the State and the institution in which established directors were entangled that was challenged by the young emerging directors who, in the 1960s, rejected institutionalised responsibility in favour of the autonomy of the artist. This process is elucidated in chapter five. The final chapter defines the contemporary role of the director by contrasting the work of a number of significant young theatre practitioners of the 1960s, such as Peter Brook, Ariane Mnouchkine, The Living Theatre, Jerzy Grotowski, Augusto Boal and Eugenio Barba, all of whom decided early on to detach their companies from any form of public funding. This chapter also demonstrates how they promoted new forms of performance, such as the performance of the self. First, these practitioners explored new performance spaces outside the traditional theatre building. Producing performances in a non-dedicated theatre place (warehouse, street, etc.) was a more frequent practice in the 1960s than before. However, the recent development of cybertheatre has, since the 1990s, questioned both the separation of the audience from the practitioners and the place of the director's role. Secondly, the role of the director has been multifaceted since the 1960s.
On the one hand, those directors, despite all their different working methods, explored western and non-western acting techniques based on both personal input and collective creation. They challenged theatrical conventions of both the character and the process of making the performance. On the other hand, recent observations and studies distinguish the two main functions of the director, the acting coach and the scenographer, both having found new developments in cinema, television, and various other events. Thirdly, the contemporary director challenges the performance of the text. In this sense, Antonin Artaud was a visionary. His theatre illustrates the need for the consideration of the totality of the text, as well as that of theatrical production. By contrasting the theories of Artaud, based on a non-dramatic form of theatre, with one of his plays (Le Jet de Sang), this chapter demonstrates how Artaud examined the process of making the performance as a performance. Live art and autobiographical performance, both taken as directing the self, reinforce this suggestion. Finally, since the 1990s, autobiographical performance or the performance of the self has been a growing practical and theoretical perspective in both performance studies and psychology-related studies. This relates to the premise that each individual is making a representation (through memory, interpretation, etc.) of her/his own life (performativity). This last section explores the links between the place of the director in contemporary theatre and performers in autobiographical practices. The role of the traditional actor is challenged through non-identification with the character in the play, while performers (such as Chris Burden, Ron Athey, Orlan, Franko B, Stelarc) have, likewise, explored their own story/life as a performance. The thesis demonstrates the validity of the four parameters (performer, performance space, devising process, social event) defining a generic approach to the director.
A generic perspective on the role of the director would encompass: a historical dimension relative to the reasons for and stages of the 'emergence' of the director; a socio-political analysis concerning the relationship between the director, her/his institutionalisation, and the political realm; and the relationship between performance theory, practice and the contemporary role of the director. Such a generic approach is a new departure in theatre research and might resonate in the study of other collaborative artistic practices.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

We explore the effects of over-specificity in learning algorithms by investigating the behavior of a student, suited to learn optimally from a teacher B, learning from a teacher B′ ≠ B. We consider only the supervised, on-line learning scenario with teachers selected from a particular family. We found that, in the general case, the application of the optimal algorithm to the wrong teacher produces a residual generalization error, even if the right teacher is harder. By imposing mild conditions on the form of the learning algorithm, we obtained an approximation for the residual generalization error. Simulations carried out in finite networks validate the estimate found.
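The mismatch effect described here can be caricatured in a few lines of Python. This is a toy batch least-squares sketch, not the paper's on-line model or teacher family: a student whose algorithm is tuned to a narrower teacher class than the true teacher B′ retains a residual generalization error that more data does not remove.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20                      # input dimension
n_train, n_test = 500, 5000

# True teacher B': labels depend on all d input components.
w_teacher = rng.normal(size=d)

X = rng.normal(size=(n_train, d))
y = X @ w_teacher

# "Over-specific" student: its algorithm is optimal for a narrower
# teacher family B that uses only the first k components, so it
# fits weights on that subspace alone.
k = 10
w_student = np.linalg.lstsq(X[:, :k], y, rcond=None)[0]

# Generalization error on fresh examples.
Xt = rng.normal(size=(n_test, d))
err = np.mean((Xt[:, :k] @ w_student - Xt @ w_teacher) ** 2)

# The error does not vanish with more data: the teacher mismatch
# leaves a residual term, here roughly the squared norm of the
# weights the student's model ignores.
residual = np.sum(w_teacher[k:] ** 2)
print(err, residual)
```

With independent Gaussian inputs, `err` settles close to `residual` however large the training set grows, which is the qualitative point of the abstract.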

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Recent National Student Surveys revealed that many U.K. university students are dissatisfied with the timeliness and usefulness of the feedback received from their tutors. Ensuring timeliness in marking often results in a reduction in the quality of feedback. In Computer Science, where learning relies on practice and learning from mistakes, feedback that pin-points errors and explains means of improvement is important to achieve a good student learning experience. Though suitable use of Information and Communication Technology should alleviate this problem, existing Virtual Learning Environments and e-Assessment applications such as Blackboard/WebCT, BOSS, MarkTool and GradeMark are inadequate to support a coursework assessment process that promotes timeliness and usefulness of feedback while maintaining consistency in marking involving multiple tutors. We have developed a novel Internet application, called eCAF, for facilitating an efficient and transparent coursework assessment and feedback process. The eCAF system supports detailed marking scheme editing and enables tutors to use such schemes to pin-point errors in students' work so as to provide helpful feedback efficiently. Tutors can also highlight areas in a submitted work and associate helpful feedback that clearly links to the identified mistakes and the respective marking criteria. In light of the results obtained from a recent trial of eCAF, we discuss how the key features of eCAF may facilitate an effective and efficient coursework assessment and feedback process.
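The abstract does not publish eCAF's internals, but the linkage it describes, feedback pinned to a highlighted span of a submission and tied back to a marking criterion, can be sketched as a data structure. All names here are hypothetical, not eCAF's API:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    # One entry in a detailed marking scheme.
    code: str
    description: str
    max_marks: int

@dataclass
class Annotation:
    # Feedback attached to a highlighted span of the submission,
    # linked to the marking criterion it concerns.
    start: int            # character offset in the submission
    end: int
    comment: str
    criterion: Criterion
    marks_awarded: int

@dataclass
class Assessment:
    student_id: str
    annotations: list = field(default_factory=list)

    def total(self) -> int:
        return sum(a.marks_awarded for a in self.annotations)

style = Criterion("C1", "Code style and comments", 10)
a = Assessment("s123")
a.annotations.append(Annotation(40, 72, "Variable names unclear", style, 6))
print(a.total())
```

Because every comment carries its criterion, marks and feedback stay traceable to the scheme, which is one way multiple tutors can mark consistently.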

Relevância:

100.00% 100.00%

Publicador:

Resumo:

OBJECTIVE: Recent critiques of incident reporting suggest that its role in managing safety has been over-emphasized. The objective of this study was to examine the perceived effectiveness of incident reporting in improving safety in mental health and acute hospital settings by asking staff about their perceptions and experiences. DESIGN: Qualitative research design using documentary analysis and semi-structured interviews. SETTING: Two large teaching hospitals in London; one providing acute and the other mental healthcare. PARTICIPANTS: Sixty-two healthcare practitioners with experience of reporting and analysing incidents. RESULTS: Incident reporting was perceived as having a positive effect on safety, not only by leading to changes in care processes but also by changing staff attitudes and knowledge. Staff discussed examples of both instrumental and conceptual uses of the knowledge generated by incident reports. There are difficulties in using incident reports to improve safety in healthcare at all stages of the incident reporting process. Differences in the risks encountered and the organizational systems developed in the two hospitals to review reported incidents could be linked to the differences we found in attitudes to incident reporting between the two hospitals. CONCLUSION: Incident reporting can be a powerful tool for developing and maintaining an awareness of risks in healthcare practice. Using incident reports to improve care is challenging and the study highlighted the complexities involved and the difficulties faced by staff in learning from incident data.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy greatly diminishes otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error, and when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
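The growth-accounting baseline in such comparisons is the standard Solow-residual decomposition: TFP growth is the log-change in output minus the cost-share-weighted log-changes in inputs. A minimal sketch with illustrative numbers (not figures from the study):

```python
import math

def ga_tfp_growth(y0, y1, inputs0, inputs1, shares):
    """Growth-accounting (Solow residual) estimate of TFP growth:
    log-change in output minus the cost-share-weighted sum of
    log-changes in inputs. Under constant returns to scale the
    shares sum to one."""
    dy = math.log(y1 / y0)
    dx = sum(s * math.log(x1 / x0)
             for s, x0, x1 in zip(shares, inputs0, inputs1))
    return dy - dx

# Illustrative: output grows 5%, capital 4%, labour 2%,
# with cost shares 0.3 (capital) and 0.7 (labour).
g = ga_tfp_growth(100.0, 105.0, [50.0, 200.0], [52.0, 204.0], [0.3, 0.7])
print(round(g, 4))  # ~0.0232, i.e. roughly 2.3% TFP growth
```

Because this residual absorbs everything output growth does not attribute to inputs, technical inefficiency and measurement error contaminate it directly, which is precisely the vulnerability the simulation experiments probe.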

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The semantic web vision is one in which rich, ontology-based semantic markup will become widely available. The availability of semantic markup on the web opens the way to novel, sophisticated forms of question answering. AquaLog is a portable question-answering system which takes queries expressed in natural language and an ontology as input, and returns answers drawn from one or more knowledge bases (KBs). We say that AquaLog is portable because the configuration time required to customize the system for a particular ontology is negligible. AquaLog presents an elegant solution in which different strategies are combined in a novel way. It makes use of the GATE NLP platform, string metric algorithms, WordNet and a novel ontology-based relation similarity service to make sense of user queries with respect to the target KB. Moreover, it also includes a learning component, which ensures that the performance of the system improves over time, in response to the particular community jargon used by end users.
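The string-metric step mentioned above, mapping a word from the user's query onto the closest ontology label, can be illustrated with Python's standard-library `difflib`. This is a stand-in for whichever metrics AquaLog actually uses, and the labels and threshold are invented for the example:

```python
from difflib import SequenceMatcher

def best_label(term, labels, threshold=0.6):
    """Map a query word onto the most similar ontology label,
    or None when nothing is close enough."""
    scored = [(SequenceMatcher(None, term.lower(), lab.lower()).ratio(), lab)
              for lab in labels]
    score, label = max(scored)
    return label if score >= threshold else None

ontology_labels = ["Researcher", "Project", "Organization", "Publication"]
print(best_label("organisations", ontology_labels))  # → Organization
print(best_label("xyz", ontology_labels))            # → None
```

A fuzzy match like this tolerates spelling variants ("organisations" vs. "Organization"); in a full system it would be combined with lexical resources such as WordNet, as the abstract describes.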