950 results for Meaning transfer model
Abstract:
Cloud imagery is not currently used in numerical weather prediction (NWP) to extract the type of dynamical information that experienced forecasters have extracted subjectively for many years. For example, rapidly developing mid-latitude cyclones have characteristic signatures in cloud imagery that are most fully appreciated from a sequence of images rather than from a single image. The Met Office is currently developing a technique to extract dynamical development information from satellite imagery using its full incremental 4D-Var (four-dimensional variational data assimilation) system. We investigate a simplified form of this technique in a fully nonlinear framework. We convert information on the vertical wind field, w(z), and profiles of temperature, T(z, t), and total water content, qt(z, t), as functions of height, z, and time, t, into a single brightness temperature by defining a 2D (vertical and time) variational assimilation testbed. The profiles of w, T and qt are updated using a simple vertical advection scheme. We define a basic cloud scheme to obtain the fractional cloud amount and, having developed a simple radiative transfer scheme, we convert this information, combined with the temperature field, into a brightness temperature. With the exception of some matrix inversion routines, all our code is developed from scratch. Throughout the development process we test all aspects of our 2D assimilation system, and then run identical twin experiments to try to recover information on the vertical velocity from a sequence of observations of brightness temperature. This thesis contains a comprehensive description of our nonlinear models and assimilation system, and the first experimental results.
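A simple vertical advection scheme of the kind this abstract mentions can be sketched as a first-order upwind update of a profile given the vertical wind. This is purely illustrative; the thesis' actual scheme, grid and time stepping may differ, and the lapse rate and velocities below are made-up values.

```python
import numpy as np

def advect_upwind(field, w, dz, dt):
    """One first-order upwind vertical-advection step for a profile.

    field : 1D array, e.g. T(z) or qt(z), on evenly spaced levels
    w     : vertical velocity on the same levels (m/s)
    dz,dt : grid spacing (m) and time step (s)
    Boundary levels are left unchanged for simplicity.
    """
    f = field.copy()
    new = field.copy()
    for k in range(1, len(f) - 1):
        if w[k] > 0:   # upward motion: take the gradient from below
            new[k] = f[k] - w[k] * dt / dz * (f[k] - f[k - 1])
        else:          # downward motion: take the gradient from above
            new[k] = f[k] - w[k] * dt / dz * (f[k + 1] - f[k])
    return new

# constant upward advection of a linearly decreasing temperature profile
z = np.arange(10)
T = 300.0 - 6.5 * z          # illustrative ~6.5 K per level lapse
w = np.full(10, 1.0)
T1 = advect_upwind(T, w, dz=1.0, dt=0.1)
# interior levels warm as warmer air is advected up from below
```

The upwind choice keeps the scheme stable for Courant numbers below one, at the cost of numerical diffusion.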
Abstract:
A successful innovation diffusion process may well take the form of a knowledge transfer process. The primary objectives of this paper are therefore: first, to evaluate the interrelations between transfer of knowledge and diffusion of innovation; and second, to develop a model that establishes a connection between the two. This has been achieved using a four-step approach. The first step is to assess and discuss the theories relating to knowledge transfer (KT) and innovation diffusion (ID). The second step focuses on developing basic models for KT and ID, based on the key theories surrounding these areas. A considerable amount of literature has been written on the association between knowledge management and innovation, the respective parent fields of KT and ID. The third step therefore explores the relationship between innovation and knowledge management in order to identify the connections between KT and ID. Finally, step four proposes and develops an integrated model for KT and ID. As the developed model suggests, the sub-processes of knowledge transfer can be connected to the innovation diffusion process at several points, as discussed and illustrated in the paper.
Abstract:
Government and institutionally driven ‘good practice transfer’ initiatives are consistently presented as a means to enhance construction firm and industry performance. Two implicit tenets of these initiatives appear to be: that knowledge embedded in good practice will transfer automatically; and that the potential of implementing good practice will be realised regardless of the context in which it is used. The validity of these tenets is increasingly being questioned and, concurrently, more nuanced understandings of knowledge production are being developed which recognise and incorporate context-specificity. This research contributes to this growing, more critical agenda by examining the actual benefits accrued from good practice transfer from the perspective of a small specialist trade contracting firm. A concept model for successful good practice transfer is developed from a single longitudinal case study within a small heating and plumbing firm. The concept model consists of five key variables: environment, strategy, people, technology, and organisation of work. The key findings challenge the implicit assumptions prevailing in the existing literature and support a contingency approach which argues that successful good practice transfer is not just a matter of adopting good practice and mechanistically inserting it into the firm, but requires addressing ‘behavioural’ aspects. For successful good practice transfer, small specialist trade contracting firms need to develop and operationalise organisational slack and mechanisms for scanning external stimuli and absorbing knowledge. They also need to formulate and communicate client-driven external strategies; to motivate and educate people at all levels; to possess internal or accessible complementary skills and knowledge; to have ‘soft focus’ immediate and mid-term benefits at a project level; and to embed good practice in current work practices.
Abstract:
Common approaches to the simulation of borehole heat exchangers (BHEs) assume heat transfer in the circulating fluid and grout to be in a quasi-steady state and ignore fluctuations in fluid temperature due to transport of the fluid around the loop. However, in domestic ground source heat pump (GSHP) systems, the heat pump and circulating pumps switch on and off during a given hour; therefore, the thermal mass of the circulating fluid and the dynamics of fluid transport through the loop have important implications for system design. This may also be important in commercial systems that are used intermittently. This article presents a transient simulation of a domestic GSHP system with a single BHE using a dynamic three-dimensional (3D) numerical BHE model. The results show that the delayed response associated with the transit of fluid along the pipe loop is of some significance in moderating swings in temperature during heat pump operation. In addition, when 3D effects are considered, a lower heat transfer rate is predicted during steady operation. These effects could be important when considering heat exchanger design and system control. The results will be used to develop refined two-dimensional models.
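The transit delay this abstract highlights can be illustrated with a toy plug-flow model: the pipe loop is a FIFO queue of fluid "parcels", so a temperature change at the inlet reaches the outlet only after a full transit time. This is a minimal sketch of the delay effect only, not the article's 3D numerical model; parcel count and temperatures are made-up values.

```python
from collections import deque

class PipeLoop:
    """Plug-flow caricature of fluid transit around a BHE pipe loop.

    A step change at the inlet appears at the outlet only after
    n_parcels time steps, mimicking the transport delay that
    quasi-steady BHE models ignore.
    """
    def __init__(self, n_parcels, t_init):
        self.parcels = deque([t_init] * n_parcels)

    def step(self, t_inlet):
        """Push one inlet parcel in; return the outlet parcel."""
        self.parcels.appendleft(t_inlet)
        return self.parcels.pop()

# heat pump switches on: inlet jumps from 10 C to 15 C
loop = PipeLoop(n_parcels=5, t_init=10.0)
outs = [loop.step(15.0) for _ in range(6)]
# outlet still reads 10 C for five steps, then the change arrives
```

A real model would also diffuse and attenuate the signal; pure plug flow only delays it.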
Abstract:
With the rapid growth of information and technology, knowledge has become a valuable asset in organisations and a significant strategic resource. Many studies have focused on managing knowledge in organisations. In particular, knowledge transfer has become a significant issue, concerned with the movement of knowledge across organisational boundaries. It enables other organisations to exploit and apply existing knowledge, reducing the time needed to create knowledge and minimising the cost of organisational learning. One way to capture knowledge in a transferrable form is through practice. In this paper, we discuss how organisations can transfer knowledge through practice effectively and propose a model for a semiotic approach to practice-oriented knowledge transfer. In this model, practice is treated as a sign that represents knowledge, and its localisation is analysed as a semiotic process.
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or a semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web’s strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organise the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but many of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production.
The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc. About the works and artists: Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art around questions of collectivism and participation, authorship and individualism.
His print SPACE concretises the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly together developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops that are found in the south west of England, it is possible for the user, for example, to create a balanced but protein-poor diet, or to simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk Pil and Galia Kollectiv, who also curated the exhibition, are London-based artists who run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies. But gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners.
www.kollectiv.co.uk
Abstract:
Knowledge is a valuable asset in organisations that has become significant as a strategic resource in the information age. Many studies have focused on managing knowledge in organisations. In particular, knowledge transfer has become a significant issue concerned with the movement of knowledge across organisational boundaries. One way to capture knowledge in a transferrable form is through practice. In this paper, we discuss how organisations can transfer knowledge through practice effectively and propose a model for a semiotic approach to practice-oriented knowledge transfer. In this model, practice is treated as a sign that represents knowledge, and its localisation is analysed as a semiotic process.
Abstract:
This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy base construction method is based on fuzzy clustering utilising a Gaussian mixture model (GMM) combined with the analysis of variance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The mean and covariance of the Gaussian membership functions are found by the expectation maximisation (EM) algorithm, with the merit of revealing the underlying density distribution of system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as the model output, as is conventional, a logistic regression model is applied to present the classifier’s output, in which the sigmoid-type logistic transfer function scales the outputs of the neurofuzzy model to class probabilities. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
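The final stage described above, scaling a real-valued model output to a class probability via the logistic transfer function, is the standard sigmoid map. A minimal sketch (the score value below is hypothetical, not from the paper):

```python
import math

def logistic(x):
    """Sigmoid logistic transfer function: maps any real-valued model
    score into the open interval (0, 1), interpretable as a class
    probability."""
    return 1.0 / (1.0 + math.exp(-x))

# a raw neurofuzzy output score of 0 maps to probability 0.5,
# i.e. maximal uncertainty between the two classes
p = logistic(0.0)
```

Large positive scores approach probability 1 and large negative scores approach 0, which is why logistic regression is a natural post-processing layer for a regressor used as a classifier.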
Abstract:
One central question in the formal linguistic study of adult multilingual morphosyntax (i.e., L3/Ln acquisition) involves determining the role(s) the L1 and/or the L2 play(s) at the L3 initial state (e.g., Bardel & Falk, Second Language Research 23: 459–484, 2007; Falk & Bardel, Second Language Research: forthcoming; Flynn et al., The International Journal of Multilingualism 8: 3–16, 2004; Rothman, Second Language Research: forthcoming; Rothman & Cabrelli, On the initial state of L3 (Ln) acquisition: Selective or absolute transfer?: 2007; Rothman & Cabrelli Amaro, Second Language Research 26: 219–289, 2010). The present article adds to this general program, testing Rothman's (Second Language Research: forthcoming) model for L3 initial state transfer, which, when relevant in light of specific language pairings, maintains that typological proximity between the languages is the decisive variable determining the selection of syntactic transfer. Herein, I present empirical evidence from the later part of the beginning stages of L3 Brazilian Portuguese (BP) by native speakers of English and Spanish, who have attained an advanced level of proficiency in either English or Spanish as an L2. Examining the related domains of syntactic word order and relative clause attachment preference in L3 BP, the data clearly indicate that Spanish is transferred for both experimental groups irrespective of whether it was the L1 or L2. These results are expected by Rothman's (Second Language Research: forthcoming) model, but not necessarily predicted by other current hypotheses of multilingual syntactic transfer; the implications of this are discussed.
Abstract:
This study investigates transfer at the third-language (L3) initial state, testing between the following possibilities: (1) the first language (L1) transfer hypothesis (an L1 effect for all adult acquisition), (2) the second language (L2) transfer hypothesis, where the L2 blocks L1 transfer (often referred to in the recent literature as the ‘L2 status factor’; Williams and Hammarberg, 1998), and (3) the Cumulative Enhancement Model (Flynn et al., 2004), which proposes selective transfer from all previous linguistic knowledge. We provide data from successful English-speaking learners of L2 Spanish at the initial state of acquiring L3 French and L3 Italian relating to properties of the Null-Subject Parameter (e.g. Chomsky, 1981; Rizzi, 1982). We compare these groups to each other, as well as to groups of English learners of L2 French and L2 Italian at the initial state, and conclude that the data are consistent with the predictions of the ‘L2 status factor’. However, we discuss an alternative possible interpretation based on (psycho)typologically-motivated transfer (borrowing from Kellerman, 1983), providing a methodology for future research in this domain to meaningfully tease apart the ‘L2 status factor’ from this alternative account.
Abstract:
We present new radiative transfer simulations to support determination of sea surface temperature (SST) from Along Track Scanning Radiometer (ATSR) imagery. The simulations are to be used within the ATSR Reprocessing for Climate project. The simulations are based on the “Reference Forward Model” line-by-line model linked with a sea surface emissivity model that accounts for wind speed and temperature, and with a discrete ordinates scattering model (DISORT). Input to the forward model is a revised atmospheric profile dataset, based on full-resolution ERA-40, with a wider range of high-latitude profiles to address known retrieval biases in those regions. Analysis of the radiative impacts of atmospheric trace gases shows that geographical and temporal variation of N2O, CH4, HNO3, and CFC-11 and CFC-12 have effects of order 0.05, 0.2 and 0.1 K on the 3.7, 11 and 12 μm channels respectively. In addition, several trace gases neglected in previous studies are included using fixed profiles, contributing ~0.04 K to top-of-atmosphere brightness temperatures (BTs). Comparison against observations for ATSR2 and AATSR indicates that forward model biases have been reduced from 0.2–0.5 K in previous simulations to ~0.1 K.
Abstract:
Models for water transfer in the crop-soil system are key components of agro-hydrological models for irrigation, fertiliser and pesticide practices. Many of the hydrological models for water transfer in the crop-soil system are either too approximate, due to oversimplified algorithms, or employ complex numerical schemes. In this paper we developed a simple and sufficiently accurate algorithm which can easily be adopted in agro-hydrological models for the simulation of water dynamics. We used the dual crop coefficient approach proposed by the FAO for estimating potential evaporation and transpiration, and a dynamic model for calculating relative root length distribution on a daily basis. Using a small time step of 0.001 d, we implemented separate algorithms for actual evaporation, root water uptake and soil water content redistribution by decoupling these processes. The Richards equation describing soil water movement was solved using an integration strategy over the soil layers instead of complex numerical schemes. This drastically simplified the procedures of modelling soil water and led to much shorter computer code. The validity of the proposed model was tested against data from field experiments on two contrasting soils cropped with wheat. Good agreement was achieved between measurement and simulation of soil water content at various depths, collected at intervals during crop growth. This indicates that the model is satisfactory in simulating water transfer in the crop-soil system, and can therefore reliably be adopted in agro-hydrological models. Finally, we demonstrated how the developed model could be used to study the effect of changes in the environment, such as lowering of the groundwater table caused by the construction of a motorway, on crop transpiration.
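The FAO dual crop coefficient approach mentioned above splits reference evapotranspiration into basal (transpiration) and soil evaporation components. A minimal sketch, with illustrative coefficient values that are not taken from the paper:

```python
def dual_crop_et(et0, kcb, ke):
    """FAO-56 style dual crop coefficient split of reference
    evapotranspiration.

    et0 : reference evapotranspiration (mm/day)
    kcb : basal crop coefficient (transpiration component)
    ke  : soil evaporation coefficient
    Returns (potential transpiration, potential evaporation), mm/day.
    """
    return kcb * et0, ke * et0

# illustrative mid-season values for a wheat crop with mostly dry topsoil
tp, ep = dual_crop_et(et0=5.0, kcb=0.9, ke=0.2)
# 4.5 mm/day potential transpiration, 1.0 mm/day potential evaporation
```

In a full model, ke itself would be reduced dynamically as the soil surface dries, and tp would drive the root water uptake term.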
Abstract:
Photoelectron spectroscopy and scanning tunneling microscopy have been used to investigate how the oxidation state of Ce in CeO2-x(111) ultrathin films is influenced by the presence of Pd nanoparticles. Pd induces an increase in the concentration of Ce3+ cations, which is interpreted as charge transfer from Pd to CeO2-x(111) on the basis of DFT+U calculations. Charge transfer from Pd to Ce4+ is found to be energetically favorable even for individual Pd adatoms. These results have implications for our understanding of the redox behavior of ceria-based model catalyst systems.
Abstract:
Nocturnal cooling of air within a forest canopy and the resulting temperature profile may drive local thermally driven motions, such as drainage flows, which are believed to impact measurements of ecosystem–atmosphere exchange. To model such flows, it is necessary to accurately predict the rate of cooling. Cooling occurs primarily due to radiative heat loss. However, much of the radiative loss occurs at the surface of canopy elements (leaves, branches, and boles of trees), while radiative divergence in the canopy air space is small due to high transmissivity of air. Furthermore, sensible heat exchange between the canopy elements and the air space is slow relative to radiative fluxes. Therefore, canopy elements initially cool much more quickly than the canopy air space after the switch from radiative gain during the day to radiative loss during the night. Thus in modeling air cooling within a canopy, it is not appropriate to neglect the storage change of heat in the canopy elements or even to assume equal rates of cooling of the canopy air and canopy elements. Here a simple parameterization of radiatively driven cooling of air within the canopy is presented, which accounts implicitly for radiative cooling of the canopy volume, heat storage in the canopy elements, and heat transfer between the canopy elements and the air. Simulations using this parameterization are compared to temperature data from the Morgan–Monroe State Forest (IN, USA) FLUXNET site. While the model does not perfectly reproduce the measured rates of cooling, particularly near the top of the canopy, the simulated cooling rates are of the correct order of magnitude.
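The two-reservoir picture above (canopy elements cool radiatively and exchange sensible heat with the canopy air) can be sketched as a pair of coupled budgets stepped forward in time. This is a caricature of the idea only, not the paper's parameterization; the radiative flux, heat capacities and exchange coefficient below are made-up values.

```python
def canopy_cooling_step(t_air, t_can, dt,
                        q_rad=100.0, c_can=2.0e4, c_air=1.2e3, h=5.0):
    """One explicit Euler step of a two-reservoir canopy cooling sketch.

    Canopy elements lose heat radiatively (q_rad, W m-2) and exchange
    sensible heat with the canopy air at rate h * (t_can - t_air);
    the air itself has no direct radiative loss (air is nearly
    transparent in the infrared, as the abstract notes).
    c_can, c_air : areal heat capacities (J m-2 K-1), illustrative.
    """
    exchange = h * (t_can - t_air)                    # elements -> air
    t_can_new = t_can + dt * (-q_rad - exchange) / c_can
    t_air_new = t_air + dt * exchange / c_air
    return t_air_new, t_can_new

# one hour of 60 s steps, starting from a uniform 15 C
ta, tc = 15.0, 15.0
for _ in range(60):
    ta, tc = canopy_cooling_step(ta, tc, dt=60.0)
# the elements cool first and drag the air down after them
```

The element temperature drops below the air temperature almost immediately, reproducing the abstract's point that assuming equal cooling rates for canopy air and canopy elements is not appropriate.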
Abstract:
Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes, and a slab ocean, to model the effects of climate change mechanisms on the atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T-21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
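The radiative forcing concept tested above rests on the linear relation between equilibrium surface temperature change and imposed forcing, with the climate sensitivity parameter as the constant of proportionality. A minimal worked example, using round illustrative numbers rather than the paper's results:

```python
def sensitivity_parameter(delta_t, delta_f):
    """Climate sensitivity parameter: equilibrium surface temperature
    change per unit radiative forcing, in K per W m-2."""
    return delta_t / delta_f

# illustrative: a 4 W m-2 CO2 forcing producing 3 K of equilibrium
# warming gives lambda = 0.75 K (W m-2)-1
lam = sensitivity_parameter(3.0, 4.0)

# the abstract reports CO2 experiments ~17% more sensitive than solar
# ones; under that ratio an equivalent solar forcing would imply
lam_solar = lam / 1.17
```

The paper's conclusion is that, despite such differences between forcing agents and distributions, the single-parameter relation predicts global temperature change to within about 30% in the IGCM.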