833 results for Learning from one Example
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, the results suggest a balance between the number of empirical and conceptual papers on multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. This study also found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end. Organizations are made of interacting layers; between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For example, an organization's reward system may influence employees' intentions to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher-level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press).
For example, the affectivity of individual employees may influence their team's interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understanding real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in the practice of multilevel research is less clear. In fact, how much is known about the quantity and quality of the multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has actually been studied empirically and published. First, this article outlines a review of multilevel theory, followed by what has been theoretically "put forward" by researchers. Second, it presents what has actually been "practiced," based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research by describing the last 10 years of research: it quantitatively depicts the types of articles being written and where the majority of empirical and conceptual publications related to multilevel thinking can be found.
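The top-down and bottom-up effects described in this abstract are what multilevel (random-effects) analyses operationalize. As a minimal sketch, with all team names and scores hypothetical, the intraclass correlation ICC(1) estimates how much of the variance in an individual-level outcome lies between teams, a common first step before modeling cross-level effects:

```python
from statistics import mean

# Hypothetical individual scores nested in teams (all numbers illustrative).
teams = {
    "A": [4.0, 4.5, 5.0],
    "B": [2.0, 2.5, 3.0],
    "C": [3.5, 4.0, 4.5],
}

def icc1(groups):
    """One-way ANOVA estimate of ICC(1): share of variance between teams."""
    k = len(groups)                        # number of teams
    n = len(next(iter(groups.values())))   # members per team (balanced case)
    grand = mean(x for g in groups.values() for x in g)
    # Between-team mean square
    msb = n * sum((mean(g) - grand) ** 2 for g in groups.values()) / (k - 1)
    # Within-team mean square
    msw = sum((x - mean(g)) ** 2 for g in groups.values() for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

print(round(icc1(teams), 2))
```

A value near 0 would mean team membership explains little of the individual-level variance; a high value signals that individual outcomes cluster strongly by team, warranting a multilevel model.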
Abstract:
This working paper depicts two innovative examples from Japan of the direct supply of food, which involve the development of closer producer-consumer relations as well as closer producer-producer networks. Choku-bai-jo and Teikei networks are considered as examples of practices implicated in alternative food networks (AFNs). One example has become a quasi-public endeavour, is seen by the Japanese state as a legitimate part of rural development, and is promoted in support of small producers. The other is born of consumer concern over food quality and, despite its long-lived status, remains marginal, with little institutional or governmental support. A model which blends the organization and aims of both examples holds potential for a more sustainable eco-economic future.
Abstract:
At its inception, the paradigm of system dynamics was deliberately made distinct from that of OR. Yet developments in soft OR now have much in common with current system dynamics modeling practice. This article briefly traces the parallel development of system dynamics and soft OR, and argues that a dialogue between the two would be mutually rewarding. To support this claim, examples of soft OR tools are described along with some of the field's philosophical grounding and current issues. Potential benefits resulting from a dialogue are explored, with particular emphasis on the methodological framework of system dynamics and the need for a complementarist approach. The article closes with some suggestions on how to begin learning from one another.
Abstract:
The prospective European Supergrid would consist of an integrated power system network in which electricity demand in one country could be met by generation from another. This paper uses a bi-linear fixed-effects model to analyse the determinants of cross-border electricity trading among 34 countries connected by the European Supergrid. The key question this paper addresses is the extent to which the privatisation of European electricity markets has brought about higher cross-border trade in electricity. The analysis uses distance, price ratios, gate closure times, size of peaks, and aggregate demand as standard determinants. Controlling for these determinants, it is concluded that privatisation in most cases led to higher power exchange and that the benefits are more significant where privatisation measures have been in place for a longer period.
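A fixed-effects model of this kind absorbs country-specific effects before estimating the determinants of trade. As a simplified sketch (one-way exporter fixed effects only, with entirely made-up flows and distances rather than the paper's data), the within transformation demeans each exporter's observations and then estimates the distance coefficient:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical bilateral electricity flows (all values illustrative):
# (exporting country, log flow, log distance)
rows = [
    ("FR", 5.0, 1.0), ("FR", 4.0, 2.0),
    ("DE", 5.5, 1.0), ("DE", 3.5, 2.5),
    ("ES", 4.5, 2.0), ("ES", 3.0, 2.5),
]

# Within transformation: subtracting each exporter's mean absorbs the
# exporter fixed effect before the slope is estimated.
by_exporter = defaultdict(list)
for exp, y, x in rows:
    by_exporter[exp].append((y, x))

num = den = 0.0
for obs in by_exporter.values():
    ybar = mean(y for y, _ in obs)
    xbar = mean(x for _, x in obs)
    for y, x in obs:
        num += (x - xbar) * (y - ybar)
        den += (x - xbar) ** 2

beta = num / den  # distance coefficient, net of exporter effects
print(round(beta, 3))
```

The paper's bi-linear specification would include importer effects as well; the demeaning idea is the same, applied in both dimensions.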
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project have been compared over the altimetry period 1993–2011. The main differences among the reanalyses come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly, and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea-ice parameters. The eddy-permitting nature of the global reanalyses also allows the eddy kinetic energy to be estimated. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and the eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as one single system with regard to its reliability in reproducing climate signals, where both variability and uncertainties are assessed through the ensemble spread and the signal-to-noise ratio. The main advantage of having access to several reanalyses that differ in how data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods.
This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion of this study is that the eddy-permitting multi-system ensemble approach has matured, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
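The ensemble diagnostics mentioned above (ensemble mean, spread, and signal-to-noise ratio) can be sketched in a few lines. All numbers below are invented placeholders, not MyOcean values:

```python
from statistics import mean, stdev

# Hypothetical global-mean sea-level anomalies (cm) from four reanalyses,
# one value per year; both the systems and the numbers are illustrative.
ensemble = [
    [0.0, 0.3, 0.6, 0.9],    # reanalysis 1
    [0.1, 0.4, 0.5, 1.0],    # reanalysis 2
    [-0.1, 0.2, 0.7, 0.8],   # reanalysis 3
    [0.0, 0.3, 0.6, 1.1],    # reanalysis 4
]

n_years = len(ensemble[0])
# Ensemble mean and spread at each time step
ens_mean = [mean(r[t] for r in ensemble) for t in range(n_years)]
spread = [stdev(r[t] for r in ensemble) for t in range(n_years)]

# Signal-to-noise ratio: amplitude of the ensemble-mean signal over the
# time-averaged ensemble spread.
signal = max(ens_mean) - min(ens_mean)
noise = mean(spread)
snr = signal / noise
print(ens_mean, round(snr, 1))
```

A signal-to-noise ratio well above 1, as in this toy case, is what allows a trend to be called robust against the uncertainty sampled by the ensemble spread.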
Abstract:
Housing Associations (HAs) contribute circa 20% of the UK's housing supply. HAs are, however, under increasing pressure as a result of funding cuts and rent reductions. Due to this increased pressure, a number of processes are currently being reviewed by HAs, especially how they manage and learn from defects. Learning from defects is considered a useful approach to achieving defect reduction within the UK housebuilding industry. This paper contributes to our understanding of how HAs learn from defects by undertaking an initial round-table discussion with key HA stakeholders as part of an ongoing collaborative research project with the National House Building Council (NHBC) to better understand how house builders and HAs learn from defects to reduce their prevalence. The initial discussion shows that defect information runs through a number of groups, both internal and external to a HA, during both the defects management process and the organizational learning (OL) process. Furthermore, HAs rely on capturing and recording defect data as the foundation of the OL process. During the OL process, defect data analysis is the primary enabler for recognizing the need to change organizational routines. When a need for change has been recognized, new options are typically pursued to design out defects via updates to a HA's Employer's Requirements. Proposed solutions are selected by a review board and committed to organizational routine. After implementing a change, both structured and unstructured feedback is sought to establish the change's success. The findings from the HA discussion demonstrate that OL can achieve defect reduction within the house building sector in the UK. The paper concludes by outlining a potential 'learning from defects model' for the housebuilding industry as well as describing future work.
Abstract:
Rapid growth in the production of new homes in the UK is putting build quality under pressure, as evidenced by an increase in the number of defects. Housing associations (HAs) contribute approximately 20% of the UK's new housing supply. HAs are currently experiencing central government funding cuts and rental revenue reductions. As part of HAs' quest to ramp up supply despite tight budget conditions, they are reviewing how they learn from defects. Learning from defects is argued to be a means of reducing the persistent defect problem within the UK housebuilding industry, yet how HAs learn from defects is under-researched. The aim of this research is to better understand how HAs, in practice, learn from past defects to reduce the prevalence of defects in future new homes. The theoretical lens for this research is organizational learning. The results, drawn from 12 HA case studies, indicate that effective organizational learning has the potential to reduce defects within the housing sector. The results further identify that HAs are restricting their learning, focusing primarily on reducing defects through product and system adaptations. Focusing on product and system adaptations alone suppresses HAs' ability to reduce defects in the future.
Abstract:
We study the relationship between the sentiment levels of Twitter users and the evolving network structure that the users create by @-mentioning each other. We use a large dataset of tweets to which we apply three sentiment scoring algorithms, including the open-source SentiStrength program. Specifically, we make three contributions. Firstly, we find that people who have potentially the largest communication reach (according to a dynamic centrality measure) use sentiment differently from the average user: for example, they use positive sentiment more often and negative sentiment less often. Secondly, we find that when we follow structurally stable Twitter communities over a period of months, their sentiment levels are also stable, and sudden changes in community sentiment from one day to the next can in most cases be traced to external events affecting the community. Thirdly, based on our findings, we create and calibrate a simple agent-based model that is capable of reproducing measures of emotive response comparable to those obtained from our empirical dataset.
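An agent-based model of community sentiment, of the general kind this abstract describes, can be sketched as follows. This is an illustrative toy, not the paper's calibrated model: the pull parameter, shock size, and update rule are all assumptions. Agents nudge each other's sentiment at every @-mention, so the community mean stays stable until an external event shifts it.

```python
import random

random.seed(42)

# Each agent holds a sentiment score in [-1, 1].
N, STEPS, PULL = 50, 200, 0.1
sentiment = [random.uniform(-1, 1) for _ in range(N)]

def day(shock=0.0):
    """One simulated day: random @-mention interactions, then an
    optional external shock applied to the whole community."""
    for _ in range(STEPS):
        a, b = random.sample(range(N), 2)          # a @-mentions b
        sentiment[a] += PULL * (sentiment[b] - sentiment[a])
    for i in range(N):
        sentiment[i] = max(-1.0, min(1.0, sentiment[i] + shock))

before = sum(sentiment) / N
day()                  # ordinary day: community mean drifts little
stable = sum(sentiment) / N
day(shock=-0.5)        # external negative event hits the community
after = sum(sentiment) / N
print(round(before, 2), round(stable, 2), round(after, 2))
```

The qualitative behavior matches the abstract's second finding: sentiment is stable from day to day, and a sudden drop in the community mean is traceable to the external shock.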
Abstract:
The incidence of Listeria monocytogenes in three cheese manufacturing plants in the northeastern region of São Paulo, Brazil, was evaluated from October 2008 to September 2009. L. monocytogenes was found in samples from two plants, at percentages of 13.3% (n = 128) and 9.6% (n = 114). Samples of raw and pasteurized milk, water, and Minas Frescal cheese were negative for L. monocytogenes, although the pathogen was isolated from the surface of Prato cheese and from brine in one of the plants evaluated. L. monocytogenes was also isolated from different sites in the facilities, mainly on non-food-contact surfaces such as drains, floors, and platforms. Serotype 4b was the most prevalent in the plants studied. The results of this study indicate the need for control strategies to prevent the dispersion of L. monocytogenes in the environment of cheese manufacturing plants.
Abstract:
Morphological and molecular studies were carried out on Palisada papillosa and P. perforata from the Canary Islands (the type locality of P. perforata), Mexico, and Brazil. The two species have been distinguished by features of their external morphology, such as the size and degree of compactness of the thalli, the presence or absence of arcuate branches, the branching pattern, and the basal system. A detailed morphological comparison between these taxa showed that none of the vegetative anatomical or reproductive characters was sufficient to separate the species. The presence or absence of cortical cells in a palisade-like arrangement, also previously used to distinguish these species, is not applicable. The species present all the characters typical of the genus, and both share the production of the first pericentral cell underneath the basal cell of the trichoblast, the production of two fertile pericentral cells (the second and the third additional, the first remaining sterile), spermatangial branches produced from one of two laterals on the suprabasal cell of trichoblasts, and a procarp-bearing segment with four pericentral cells. Details of the procarp are described for these species for the first time. The phylogenetic position of the species was inferred by analysis of chloroplast-encoded rbcL gene sequences from 39 taxa, using one other rhodomelacean taxon and two Ceramiaceae as outgroups. Relationships within the clade formed by P. papillosa and P. perforata were not resolved due to the low level of genetic variation in their rbcL sequences (0-0.4%). Considering this and the morphological similarities, we conclude that P. papillosa is a taxonomic synonym of P. perforata. The phylogenetic analyses also supported the nomenclatural transfer of two species of Chondrophycus to Palisada, namely P. patentiramea (Montagne) Cassano, Senties, Gil-Rodriguez & M.T. Fujii comb. nov. and P. thuyoides (Kutzing) Cassano, Senties, Gil-Rodriguez & M.T. Fujii comb. nov.
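The 0-0.4% rbcL divergence reported above is an uncorrected pairwise distance between aligned sequences. A minimal sketch, using toy aligned fragments rather than real rbcL data:

```python
# Uncorrected (p) distance: the proportion of sites that differ between
# two aligned sequences of equal length.
def p_distance(seq1, seq2):
    assert len(seq1) == len(seq2), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return diffs / len(seq1)

# Toy aligned fragments (illustrative only, not real rbcL sequences).
papillosa = "ATGGCTCCACAAACAGAAACTAAAGCAGGT"
perforata = "ATGGCTCCACAAACAGAAACTAAAGCAGGT"
other_sp  = "ATGGCACCACAAACTGAAACTAAAGCTGGT"

print(p_distance(papillosa, perforata))                  # conspecific: 0.0
print(round(100 * p_distance(papillosa, other_sp), 1))   # divergence in %
```

A distance at or near zero, as between the two conspecific fragments here, is the kind of signal that led the authors to treat P. papillosa as a synonym of P. perforata.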
Abstract:
Sociable robots are embodied agents that are part of a heterogeneous society of robots and humans. They should be able to recognize human beings and each other, and to engage in social interactions. The use of a robotic architecture may strongly reduce the time and effort required to construct a sociable robot. Such an architecture must have structures and mechanisms that allow social interaction, behavior control, and learning from the environment. Learning processes described in the science of Behavior Analysis may lead to the development of promising methods and structures for constructing robots able to behave socially and to learn through interactions with the environment by a process of contingency learning. In this paper, we present a robotic architecture inspired by Behavior Analysis. Methods and structures of the proposed architecture, including a hybrid knowledge representation, are presented and discussed. The architecture has been evaluated in the context of a nontrivial real problem: the learning of shared attention, employing an interactive robotic head. The learning capabilities of this architecture have been analyzed by observing the robot interacting with humans and the environment. The obtained results show that the robotic architecture is able to produce appropriate behavior and to learn from social interaction. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Learning from anywhere at any time is a contemporary phenomenon in the field of education that is thought to be flexible and to save time and cost. The phenomenon is evident in the way computer technology mediates knowledge processes among learners. Computer technology is, however, sometimes faulted: there are studies that highlight drawbacks of computer technology use in learning. In this study we aimed to conduct a SWOT analysis of ubiquitous computing and computer-mediated social interaction and their effect on education. Students and teachers were interviewed on these concepts using focus group interviews. Our contribution in this study is to identify what teachers and students perceive to be the strengths, weaknesses, opportunities, and threats of ubiquitous computing and computer-mediated social interaction in education. We also relate the findings to the literature and present a common understanding of the SWOT of these concepts. Results show positive perceptions: respondents revealed that ubiquitous computing and computer-mediated social interaction are important in their education due to advantages such as flexibility, efficiency in terms of cost and time, and the ability to acquire computer skills. Nevertheless, disadvantages were also mentioned, for example health effects, privacy and security issues, and noise in the learning environment, to mention but a few. This paper gives suggestions on how to overcome the threats mentioned.
Abstract:
A current topic in Swedish schools is the use of computer games and gaming. One reason is that computers are becoming more and more integrated into schools, and the technology plays a large role in the everyday lives of pupils. Since teachers should integrate pupils' interests into formal teaching, it is of interest to know what attitudes teachers have towards gaming. Therefore, the aim of this empirical study is to gain an insight into the attitudes Swedish primary teachers have towards online and offline computer games in the EFL classroom. An additional aim is to investigate to what extent teachers use games. Five interviews were conducted with teachers in different Swedish schools in a small to medium-sized municipality. After the interviews were transcribed, the results were analyzed and discussed in relation to relevant research and sociocultural theory. The results show that teachers are positive towards games and gaming, mostly because gaming often involves interaction with others, and learning from peers is a main component of sociocultural theory. However, only one of the five participants had at some point used games. The conclusion is that teachers are unsure about how to use games in their teaching and that training and courses in this area would be valuable. More research is needed in this area, and it would be of value to investigate what such courses would contain and exactly how games can be used in teaching.
Abstract:
The Sandy River in central Maine is flanked along much of its length by low terraces. Approximately 100 kg of sediment from one terrace in Starks, Somerset County, Maine was wet-sieved in the field. Over 1100 subfossil Coleoptera were recovered, representing 53 species among a total of 99 taxa. Wood associated with the fauna is 2000 +/- 80 14C yr in age (1-16,038). The fauna is dominated by species characteristic of habitats apparent in modern central Maine. The subfossil assemblage is indicative of a wide variety of environments, including open ground (e.g., Harpalus pensylvanicus), dense forest (e.g., Pterostichus honestus), aquatic environments (e.g., Gyrinus, Helophorus), riparian environments with sand and gravel substrates (e.g., Bembidion inaequale, Schizogenius lineolatus), and moist, organic-rich terrestrial environments (e.g., Micropeplus sculptus). The ecological requirements of each taxon permit an environmental reconstruction suggesting an area vegetationally, climatically, and ecologically similar to that of the Sandy River today. The lowest terraces apparently represent the modern-day floodplain of the Sandy River. An average sedimentation rate of 1.00 to 1.04 mm per year has been inferred from radiocarbon dates here and elsewhere on the Sandy River. The coleopteran fauna suggests that sand and gravel were distinctly abundant, and that the aggradation of point bars, as seen today, contributed to the flood history. Lateral bank erosion of the modern Sandy River accelerated after the State of Maine mandated the cessation of bar removal in 1975, and flood severity has dramatically increased since that time. This suggests that mining of the bars may be necessary to minimize future flooding problems.
Abstract:
Flood forecasting systems can be used effectively when the lead time is sufficient compared with the time needed for preventive or corrective action. In addition, the reliability and precision of the forecasts are fundamentally important. Forecasts of flood levels are always approximations, and confidence intervals are not always applicable, especially under high degrees of uncertainty, which produce very wide confidence intervals. Such intervals are problematic in the presence of very high or very low river levels. In this study, flood-level forecasts are produced both in the traditional numerical form and in the form of categories, the latter using a rule-based expert system with fuzzy inference. Methodologies and computational procedures for learning, simulation, and querying are conceived and then implemented as an application (SELF – Sistema Especialista com uso de Lógica "Fuzzy", a fuzzy-logic expert system), for research and operational purposes. Comparisons of fuzzy expert systems and linear empirical models, from the standpoint of their use for forecasting, reveal a strong analogy despite their fundamental theoretical differences. The methodologies are applied to forecasting in the Camaquã river basin (15,543 km2), for lead times between 10 and 48 hours. Practical difficulties in the application are identified, and the resulting solutions constitute advances in knowledge and technique. Forecasts in both numerical and categorical form are carried out successfully with the new resources. The forecasts are evaluated and compared using a new set of statistics derived from the simultaneous frequencies of occurrence of observed and predicted values in the same category during the simulation.
The effects of varying the gauge-network density are analysed, showing that real-time rainfall-streamflow forecasting systems are feasible even with a small number of rain-gauge stations, for forecasts in the form of fuzzy categories.
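The categorical fuzzy forecasts and the simultaneous-frequency evaluation described above can be sketched as follows; the triangular membership functions, category thresholds, and stage values are hypothetical, not those of the SELF system:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical river-stage categories (metres) -> triangular parameters.
CATEGORIES = {
    "low":    (-1.0, 0.0, 2.0),
    "normal": (1.0, 3.0, 5.0),
    "high":   (4.0, 6.0, 9.0),
}

def classify(level):
    """Fuzzy category with the highest membership degree."""
    return max(CATEGORIES, key=lambda c: tri(level, *CATEGORIES[c]))

# Evaluate forecasts by the frequency with which observed and predicted
# levels fall in the same category (a simultaneous-frequency statistic).
observed = [0.5, 2.8, 6.2, 3.1, 5.8]
predicted = [0.8, 3.0, 5.5, 1.2, 6.5]
hits = sum(classify(o) == classify(p) for o, p in zip(observed, predicted))
print(hits / len(observed))
```

Scoring by category rather than by numeric error is what makes this evaluation usable even when confidence intervals on the raw levels would be too wide to be informative.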