833 results for Learning from one Example


Relevance:

100.00%

Publisher:

Abstract:

The question of a new beginning for educational science in Germany is posed here with regard to the societal function of science, and is provisionally answered using the example of one sub-discipline, vocational and business education (Berufs- und Wirtschaftspädagogik, BWP). The basis for this is the career biographies, the theory and knowledge production, and the form of discourse of 23 university professors of BWP, presumably the complete corpus of persons in this discipline and the first generation of its representatives in West Germany after 1945. On this evidence, BWP is characterised by personal continuity, theoretical homogeneity and discursive self-referentiality, and in this constancy it appears as a closed scientific process from 1930 to 1960. That process was, admittedly, politically open; seen in societal terms, the discipline was functional and behaved accordingly. (DIPF/Orig.)

Relevance:

100.00%

Publisher:

Abstract:

We examined how international food price shocks have impacted local inflation processes in Brazil, Chile, Colombia, Mexico, and Peru in the past decade. Using impulse-response analysis from cointegrated VARs, we find that international food inflation shocks take from one to six quarters to pass through to domestic headline inflation, depending on the country. In addition, by calculating the elasticity of local prices to an international food price shock, we find that this pass-through is not complete. We also take a closer look at how this type of shock affects local food and core prices separately, and assess the possibility of second-round effects on core inflation stemming from the shock. We find that a transmission to headline prices does occur, and that part of the transmission is associated with rising core prices, both directly and through possible second-round effects, which implies a role for monetary policy when such a shock takes place. This is especially relevant given that international food prices have recently been on an upward trend after falling considerably during the Great Recession.
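The pass-through calculation the abstract describes can be sketched as follows; note that the impulse-response numbers below are hypothetical placeholders for illustration, not the study's estimates.

```python
import numpy as np

# Hypothetical quarterly impulse responses of domestic headline inflation
# (percentage points) to a 1% international food price shock. The paper
# estimates these from cointegrated VARs; these numbers are invented
# purely to illustrate the calculation.
irf_headline = np.array([0.00, 0.05, 0.12, 0.10, 0.05, 0.02, 0.01, 0.00])

# Cumulative response of the domestic price level h quarters after the shock
cumulative = np.cumsum(irf_headline)

# Long-run pass-through elasticity: total change in local prices per 1%
# shock. A value below 1.0 means the pass-through is incomplete.
elasticity = cumulative[-1]

# Quarters until at least half of the long-run pass-through has occurred
half_pass = int(np.argmax(cumulative >= 0.5 * elasticity))

print(f"pass-through elasticity: {elasticity:.2f}")
print(f"quarters to half of long-run pass-through: {half_pass}")
```

With these placeholder responses, the long-run elasticity is well below one, matching the abstract's finding of incomplete pass-through, and the bulk of the transmission occurs within a few quarters.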

Relevance:

100.00%

Publisher:

Abstract:

Purpose – Graffiti, both ancient and contemporary, could be argued to be significant and therefore worthy of protection. Attaching value is, however, subjective, with no single method being used to evaluate these items. The purpose of this paper is to help those attempting to evaluate the merit of graffiti to do so by determining "cultural significance", a widely adopted concept for attaching value to the historic built environment. The current Scottish system used to assess "cultural significance" is the Scottish Historic Environment Policy (SHEP), which shares many common features with determinants of cultural significance in other countries. The SHEP document, as with other systems, could however be criticised as insufficiently sensitive to enable the evaluation of historic graffiti, due in part to the subjective nature of determining aesthetic value. Design/methodology/approach – A review of the literature is followed by consideration of case studies taken from a variety of historical and geographical contexts. Most of the examples of graffiti included in this paper were selected for their relatively high profile, previous academic study, and breadth of geographic spread. This selection should enable a relatively comprehensive, rational assessment to be undertaken. That said, one example has been included to reflect commonly occurring graffiti typical of the built environment as a whole. Findings – The determination of aesthetic value is particularly problematic for the evaluator, and the use of additional art-based mechanisms such as "significant form", "self-expression" and "meaning" may aid this process. Regrettably, these determinants are themselves subjective, adding to the complexity of evaluation. Almost all graffiti could be said to have artistic merit, using the aforementioned determinants. However, whether it is "good" art is an altogether different question.
The evaluation of "good" art and graffiti would traditionally have been carried out by experts. Today, graffiti should be evaluated, and value attached, by broader society, community groups, and experts alike. Originality/value – This research will assist those responsible for historic building conservation in evaluating whether graffiti is worthy of conservation.

Relevance:

100.00%

Publisher:

Abstract:

In this master’s thesis, I examine the development of writer-characters and metafiction from John Irving’s The World According to Garp to Last Night in Twisted River and how this development relates to the development of late twentieth century postmodern literary theory to twenty-first century post-postmodern literary theory. The purpose of my study is to determine how the prominently postmodern feature metafiction, created through the writer-character’s stories-within-stories, has changed in form and function in the two novels published thirty years apart from one another, and what possible features this indicates for future post-postmodern theory. I establish my theoretical framework on the development of metafiction largely on late twentieth-century models of author and authorship as discussed by Roland Barthes, Wayne Booth and Michel Foucault. I base my close analysis of metafiction mostly on Linda Hutcheon’s model of overt and covert metafiction. At the end of my study, I examine Irving’s later novel through Suzanne Rohr’s models of reality constitution and fictional reality. The analysis of the two novels focuses on excerpts that feature the writer-characters, their stories-within-stories and the novels’ other characters and the narrators’ evaluations of these two. I draw examples from both novels, but I illustrate my choice of focus on the novels at the beginning of each section. Through this, I establish a method of analysis that best illustrates the development as a continuum from pre-existing postmodern models and theories to the formation of new post-postmodern theory. Based on my findings, the thesis argues that twenty-first century literary theory has moved away from postmodern overt deconstruction of the narrative and its meaning. New post-postmodern literary theory reacquires the previously deconstructed boundaries that define reality and truth and re-establishes them as having intrinsic value that cannot be disputed. 
In establishing fictional reality as self-governing and non-intrudable, post-postmodern theory takes a stance against postmodern nihilism, which indicates the re-founded, unquestionable value of the text's reality. To continue mapping other possible features of future post-postmodern theory, I recommend further analysis focusing solely on John Irving's novels published in the twenty-first century.

Relevance:

100.00%

Publisher:

Abstract:

A supernova (SN) is the explosion of a star at the end of its lifetime. SNe are classified into two types, type I and type II, on the basis of their optical spectra. They are also categorised by explosion mechanism into core-collapse supernovae (CCSNe) and thermonuclear supernovae. The CCSN group, which includes types IIP, IIn, IIL, IIb, Ib, and Ic, is produced when a massive star with an initial mass of more than 8 M⊙ explodes due to the collapse of its iron core. Thermonuclear SNe, on the other hand, originate from white dwarfs (WDs) made of carbon and oxygen in binary systems. Infrared astronomy covers observations of astronomical objects in infrared radiation. The infrared sky is not completely dark, and it is variable. Observations of SNe in the infrared give different information than optical observations. Data reduction is required to correct the raw data for, for example, unusable pixels and the sky background. In this project, the NOTCam package in IRAF was used for the data reduction. For measuring the magnitudes of the SNe, the aperture photometry method in the Gaia program was used. In this Master's thesis, near-infrared (NIR) observations of three supernovae of type IIn (LSQ13zm, SN 2009ip and SN 2011jb), one of type IIb (SN 2012ey), one of type Ic (SN 2012ej) and one of type IIP (SN 2013gd) are studied, with emphasis on luminosity and colour evolution. All observations were made with the Nordic Optical Telescope (NOT). Here, we use the classification by Mattila & Meikle (2001) [76], in which SNe are differentiated by their infrared light curves into two groups, 'ordinary' and 'slowly declining'. The light curves and colour evolution of these supernovae were obtained in the J, H and Ks bands. In this study, our data, combined with other observations, provide evidence to categorise LSQ13zm, SN 2012ej and SN 2012ey as part of the ordinary type. We found interesting NIR behaviour in SN 2011jb, which led to its classification as the slowly declining type.

Relevance:

100.00%

Publisher:

Abstract:

This project aims to present a different way of approaching the concepts of quadratic functions, proposing motivation and content-consolidation exercises with contextualised questions geared to the student's daily life. The intention is to assist teachers and thus make their students' learning more enjoyable, demystifying Mathematics and giving students the opportunity to use mathematical concepts in their everyday lives. Using the technological tools already available in most Basic Education schools, activities will be presented using at least one piece of free software, in order to inspire teachers in preparing their lessons and consequently help students build their knowledge.

Relevance:

100.00%

Publisher:

Abstract:

A Bayesian optimisation algorithm for a nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. When a human scheduler works, he normally builds a schedule systematically following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not yet completed, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this paper, we design a more human-like scheduling algorithm, by using a Bayesian optimisation algorithm to implement explicit learning from past solutions. A nurse scheduling problem from a UK hospital is used for testing. Unlike our previous work that used Genetic Algorithms to implement implicit learning [1], the learning in the proposed algorithm is explicit, i.e. we identify and mix building blocks directly. The Bayesian optimisation algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, new rule strings have been obtained. Sets of rule strings are generated in this way, some of which will replace previous strings based on fitness. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. For clarity, consider the following toy example of scheduling five nurses with two rules (1: random allocation, 2: allocate nurse to low-cost shifts). 
At the beginning of the search, the probabilities of choosing rule 1 or 2 for each nurse are equal, i.e. 50%. After a few iterations, due to selection pressure and reinforcement learning, we observe two solution pathways: because pure low-cost or pure random allocation produces low-quality solutions, either rule 1 is used for the first 2-3 nurses and rule 2 for the remainder, or vice versa. In essence, the Bayesian network learns 'use rule 2 after using rule 1 two to three times' or vice versa. It should be noted that for ours and most other scheduling problems, the structure of the network model is known and all variables are fully observed. In this case, the goal of learning is to find the rule values that maximise the likelihood of the training data; learning thus amounts to 'counting' in the case of multinomial distributions. For our problem, we use four rules: Random, Cheapest Cost, Best Cover, and Balance of Cost and Cover. In more detail, the steps of our Bayesian optimisation algorithm for nurse scheduling are: 1. Set t = 0, and generate an initial population P(0) at random; 2. Use roulette-wheel selection to choose a set of promising rule strings S(t) from P(t); 3. Compute the conditional probabilities of each node according to this set of promising solutions; 4. Assign each nurse using roulette-wheel selection based on the rules' conditional probabilities; a set of new rule strings O(t) is generated in this way; 5. Create a new population P(t+1) by replacing some rule strings in P(t) with O(t), and set t = t+1; 6. If the termination conditions are not met (we use 2000 generations), go to step 2. Computational results from 52 real data instances demonstrate the success of this approach. They also suggest that the learning mechanism in the proposed approach might be suitable for other scheduling problems. Another direction for further research is to see whether there is a good constructing sequence for individual data instances, given a fixed nurse scheduling order.
If so, the good patterns could be recognised and then extracted as new domain knowledge. By using this extracted knowledge, we could assign specific rules to the corresponding nurses beforehand and schedule only the remaining nurses with all available rules, making it possible to reduce the solution space. Acknowledgements: The work was funded by the UK Government's major funding agency, the Engineering and Physical Sciences Research Council (EPSRC), under grant GR/R92899/01. References: [1] Aickelin U, "An Indirect Genetic Algorithm for Set Covering Problems", Journal of the Operational Research Society, 53(10): 1118-1126,
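The numbered steps above can be sketched in code. This toy Python version makes two simplifying assumptions that the abstract does not: the full Bayesian network over rule strings is replaced by independent per-nurse marginals (a UMDA-style model), and the fitness function is a placeholder, since building and costing a real nurse schedule is beyond the scope of an illustration.

```python
import random

RULES = [0, 1, 2, 3]   # Random, Cheapest Cost, Best Cover, Balance of Cost and Cover
N_NURSES = 5           # toy size; real instances are larger
POP_SIZE = 30
GENERATIONS = 50       # the paper uses 2000

def fitness(rule_string):
    # Placeholder: a real fitness would build and cost the schedule
    # that this rule string produces.
    return 1.0 + len(set(rule_string)) + random.random()

def select_promising(pop, scores, k):
    # Step 2: roulette-wheel (fitness-proportional) selection with replacement
    return random.choices(pop, weights=scores, k=k)

def learn_probs(promising):
    # Step 3: with known structure and fully observed variables, learning
    # reduces to counting rule frequencies at each nurse position.
    probs = []
    for pos in range(N_NURSES):
        counts = {r: 1 for r in RULES}          # Laplace smoothing
        for s in promising:
            counts[s[pos]] += 1
        total = sum(counts.values())
        probs.append({r: c / total for r, c in counts.items()})
    return probs

def sample_string(probs):
    # Step 4: generate a new rule string position by position
    return [random.choices(list(p), weights=p.values())[0] for p in probs]

# Step 1: random initial population of rule strings
pop = [[random.choice(RULES) for _ in range(N_NURSES)] for _ in range(POP_SIZE)]
for t in range(GENERATIONS):                     # steps 2-6 loop
    scores = [fitness(s) for s in pop]
    promising = select_promising(pop, scores, POP_SIZE // 2)
    probs = learn_probs(promising)
    offspring = [sample_string(probs) for _ in range(POP_SIZE // 2)]
    # Step 5: keep the better half of the old population, replace the rest
    ranked = sorted(zip(scores, pop), key=lambda x: x[0], reverse=True)
    pop = [s for _, s in ranked[:POP_SIZE - len(offspring)]] + offspring

best = max(pop, key=fitness)
```

Replacing the independent marginals with a learned network over adjacent nurse positions would recover behaviour closer to the paper's, e.g. learning "use rule 2 after 2-3 uses of rule 1".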

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to contrast two ways of grounding political systems: the first is based on appealing to history and national identity, while the second places the emphasis on institutional mechanisms. We address these two ways of understanding politics through the example of the Netherlands of the sixteenth and seventeenth centuries, and the use made in that period of the Batavian myth and the myth of Venice. While the Batavian myth is purely historical, the myth of Venice has both a historical component and a political and constitutional aspect, represented by Espinosa.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, we propose to infer pixel-level labelling in video by utilising only object category information, exploiting the intrinsic structure of video data. Our motivation is the observation that image-level labels are much easier to acquire than pixel-level labels, and that it is natural to find a link between image-level recognition and pixel-level classification in video data, which would transfer learned recognition models from one domain to the other. To this end, this thesis proposes two domain adaptation approaches that adapt a deep convolutional neural network (CNN) image recognition model trained on labelled image data to the target domain, exploiting both the semantic evidence learned by the CNN and the intrinsic structure of unlabelled video data. Our proposed approaches explicitly model and compensate for the domain shift from the source domain to the target domain, which in turn underpins a robust semantic object segmentation method for natural videos. We demonstrate the superior performance of our methods through extensive evaluations on challenging datasets, comparing with state-of-the-art methods.

Relevance:

100.00%

Publisher:

Abstract:

This research arose from the notable need to promote oral production among the adult learners of the English Extension courses at Universidad del Valle in 2014. This qualitative research was carried out in a 60-hour course divided into 15 sessions on Saturdays, with an adult population between the ages of 22 and 65. Its main objective was to describe the impact of games aimed at promoting oral production in English with a group of adult learners. Data were collected from a demographic survey, video recordings of classroom events during the implementation of the games, student surveys after each game, and a teacher's journal. The analysis of the data showed that the games did have an impact on students' performance, which was related to a positive atmosphere in the classroom. Students showed progress in terms of fluency, interaction and even pronunciation; however, they still showed difficulties with accuracy in their spontaneous utterances. These learners' achievements seemed to be related to the class atmosphere during the games, in which students showed high levels of involvement, confidence, mutual support and enjoyment.

Relevance:

100.00%

Publisher:

Abstract:

Ecosystems can provide many services. Wetlands, for example, can help mitigate water pollution from point sources as well as non-point sources, serve as habitat for wildlife, sequester carbon, and serve as a place for recreation. Studies have found that these services can have substantial value to society. The sale of ecosystem credits has been found to be a possible way to finance construction investments in wetlands and easements to farmers to take their land out of production. At the same time, selling one ecosystem service credit may not always be enough to justify the investment. Traditionally, market participants have only been allowed to sell a single credit from one piece of land, but recently there have been discussions about the possibility of selling more than one credit from a piece of land, because it could potentially lead to more efficient ecosystem service provision. Selling multiple credits is sometimes referred to as credit stacking. This paper is an empirical study of the potential for credit stacking applied to the services provided by wetlands in the Upper Mississippi River Basin, specifically nitrogen, phosphorus and wildlife credits. In the setting of our study, where costs are discrete rather than continuous, we found that wetlands are a cost-effective way to reduce the nitrogen loads from wastewater treatment plants and that stacking nitrogen, phosphorus and wildlife credits may improve social welfare while leading to a higher level of ecosystem services. However, for credit stacking to be welfare-improving, we found that there needs to be a substantial demand for the credit that covers the majority of the investment in wetlands, while the credit aggregator has a choice of which ecosystem projects to undertake.
If the credit that covers the majority of the investment is sold first and is the sole basis of the investment decision, and the objective is to improve welfare, a sequential implementation of ecosystem credits is not recommended; it would not lead to an increase in the total amount of ecosystem services provided, though it would increase profit for the credit producer.
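The investment logic in the abstract can be illustrated with a deliberately simple sketch; all monetary figures below are hypothetical and not taken from the study.

```python
# Hypothetical figures illustrating credit stacking; none of these
# numbers come from the study itself.
wetland_cost = 100_000        # cost of constructing one wetland

# Revenue from each ecosystem service credit the wetland could sell
nitrogen_credit = 70_000      # covers the majority of the investment
phosphorus_credit = 25_000
wildlife_credit = 15_000

# A single credit does not justify the investment on its own...
single_credit_viable = nitrogen_credit >= wetland_cost   # False

# ...but stacking the three credits does.
stacked_revenue = nitrogen_credit + phosphorus_credit + wildlife_credit
stacking_viable = stacked_revenue >= wetland_cost        # True

print(single_credit_viable, stacking_viable)
```

The abstract's caveat about sequential sale maps onto this sketch: if the nitrogen credit alone were sold first and used as the sole basis of the decision, the wetland would not be built at all, even though the stacked bundle is viable.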

Relevance:

100.00%

Publisher:

Abstract:

The ocean bottom pressure records from eight stations of the Cascadia array are used to investigate the properties of short surface gravity waves with frequencies ranging from 0.2 to 5 Hz. It is found that the pressure spectrum at all sites is a well-defined function of the wind speed U10 and frequency f, with only a minor shift of a few dB from one site to another that can be attributed to variations in bottom properties. This observation can be combined with the theoretical prediction that the ocean bottom pressure spectrum is proportional to the square of the surface gravity wave spectrum E(f) times the overlap integral I(f), which is given by the directional wave spectrum at each frequency. This combination, using E(f) estimated from modelled or parametric spectra, yields an overlap integral I(f) that is a function of the local wave age. This function is maximum for f/fPM = 8 and decreases by 10 dB for f/fPM = 2 and f/fPM = 30. This shape of I(f) can be interpreted as a maximum width of the directional wave spectrum at f/fPM = 8, possibly equivalent to an isotropic directional spectrum, and a narrower directional distribution toward both the dominant low frequencies and the higher capillary-gravity wave frequencies.
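The theoretical prediction invoked above is commonly written as a double-frequency relation; the form below is the standard Longuet-Higgins/Hasselmann expression with the proportionality constant (involving water density and gravity) omitted, not a formula quoted from this paper:

```latex
F_p(2f) \;\propto\; f \, E(f)^2 \, I(f),
\qquad
I(f) = \int_0^{2\pi} M(f,\theta)\, M(f,\theta+\pi)\, \mathrm{d}\theta ,
```

where M(f, θ) is the normalised directional distribution of the wave spectrum, so I(f) is large only when waves of the same frequency travel in opposing directions, which is why the bottom pressure constrains the directional width of the spectrum.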

Relevance:

100.00%

Publisher:

Abstract:

Today, the industrial use of heavy metals and chemical products has expanded. The presence of significant amounts of pollutants in industrial wastewater can pose serious risks to the environment and human health; the heavy metal chromium is one example. Chromium is an essential element in the diet, but high doses of this element are very dangerous. Hence, chemical methods can be used as a tool for the removal of metals from wastewater. The aim of this study was to investigate the mineral kaolin as an adsorbent for the removal of chromium from water. Thus, the effect of different adsorbent concentrations on the adsorption of micro amounts of chromium was studied, along with the effects of varying temperature, pH and electrolytes. A UV-VIS spectrophotometer (Varian) was used for the measurements. The chromium adsorption mechanism of the kaolin adsorbent was compared with that of the nano-kaolin adsorbent. According to the studies done under the same conditions of temperature, pH and shaking rate, chromium adsorption by nano-kaolin is much greater than by kaolin. Therefore, its use as an abundant, cheap, accessible, efficient and effective adsorbent is proposed.

Relevance:

100.00%

Publisher:

Abstract:

The TOPEX/POSEIDON mission offers the first opportunity to observe rain cells over the ocean with a dual-frequency radar altimeter (TOPEX) and simultaneously observe their natural radiative properties with a three-frequency radiometer (the TOPEX microwave radiometer, TMR). This work is a feasibility study aimed at understanding the capability and potential of the active/passive TOPEX/TMR system for oceanic rainfall detection. On the basis of past experience in rain flagging, a joint TOPEX/TMR rain probability index is proposed. This index integrates several advantages of the two sensors and provides a more reliable rain estimate than the radiometer alone. One year's TOPEX/TMR data are used to test the performance of the index. The resulting rain frequency statistics show quantitative agreement with those obtained from the Comprehensive Ocean-Atmosphere Data Set (COADS) in the Intertropical Convergence Zone (ITCZ), while qualitative agreement is found for other regions of the world ocean. A recent finding that the latitudinal frequency of precipitation over the Southern Ocean increases steadily toward the Antarctic continent is confirmed by our results. Annual and seasonal precipitation maps are derived from the index. Notable features revealed include an overall similarity in rainfall pattern across the Pacific, Atlantic, and Indian Oceans and a general phase reversal between the two hemispheres, as well as a number of regional anomalies in rain intensity. Comparisons with simultaneous Global Precipitation Climatology Project (GPCP) multisatellite precipitation rates and COADS rain climatology suggest that systematic differences also exist. One example is that the maximum rainfall in the ITCZ of the Indian Ocean appears more intensive and concentrated in our results than in those of the GPCP.
Another example is that the annual precipitation produced by TOPEX/TMR is consistently higher than that from GPCP and COADS in the extratropical regions of the northern hemisphere, especially in the northwest Pacific Ocean. Analyses of the seasonal variations of prominent rainy and dry zones in the tropics and subtropics show various behaviours, such as systematic migration, expansion and contraction, merging and breakup, and pure intensity variations. The seasonality of regional features is largely influenced by local atmospheric events such as monsoon, storm, or snow activities. The results of this study suggest that TOPEX and its follow-on may serve as a complementary sensor to the Special Sensor Microwave/Imager in observing global oceanic precipitation.