855 results for Technology in motion pictures


Relevance: 100.00%

Abstract:

Several mechanisms for self-enhancing feedback instabilities in marine ecosystems are identified and briefly elaborated. It appears that adverse phases of operation may be abruptly triggered by explosive breakouts in abundance of one or more previously suppressed populations. Moreover, an evident capacity of marine organisms to accomplish extensive geographic habitat expansions may expand and perpetuate a breakout event. This set of conceptual elements provides a framework for interpreting a sequence of events that has occurred in the Northern Benguela Current Large Marine Ecosystem (off south-western Africa). This history illustrates how multiple feedback loops might interact in unanticipated and quite malignant ways, leading not only to collapse of customary resource stocks but also to degradation of the ecosystem to such an extent that disruption of customary goods and services may go beyond fisheries alone to adversely affect other major global ecosystem concerns (e.g. proliferations of jellyfish and other slimy, stinging, toxic and/or noxious organisms, perhaps even climate change itself). The wisdom of management interventions designed to interrupt an adverse mode of feedback operation is pondered. Research pathways are proposed that may lead to the improved insights needed: (i) to avoid potential 'triggers' that might set adverse phases of feedback loop operation into motion; and (ii) to diagnose and properly evaluate plausible actions to reverse adverse phases of feedback operation that might already have been set in motion. These pathways include drawing inferences from available 'quasi-experiments' produced either by short-term climatic variation or inadvertently in the course of biased exploitation practices, and inter-regional application of the comparative method of science.
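
The trigger-and-breakout dynamic described above can be caricatured in a few lines of Python. This is purely an illustrative sketch, not taken from the paper: a harvested 'fish' stock suppresses a 'jellyfish' stock until exploitation pressure crosses a threshold, after which the suppressed population breaks out and a self-enhancing feedback flips the system. All parameter values are hypothetical.

    # Illustrative toy model of a feedback 'breakout'; parameters are
    # hypothetical caricatures, not estimates for the Benguela system.
    def simulate(harvest_rate, years=200.0, dt=0.01):
        fish, jelly = 1.0, 0.05                 # suppressor and suppressed stocks
        for _ in range(int(years / dt)):
            # fish grow logistically, are harvested, and are crowded by jellyfish
            dfish = fish * (1.0 - fish) - harvest_rate * fish - 0.8 * jelly * fish
            # jellyfish are grazed down by fish; once grazing weakens they take off
            djelly = jelly * (0.25 * (1.0 - jelly) - 0.6 * fish)
            fish = max(fish + dt * dfish, 0.0)
            jelly = max(jelly + dt * djelly, 0.0)
        return round(fish, 3), round(jelly, 3)

    for h in (0.1, 0.3, 0.5, 0.7):              # increasing exploitation pressure
        print(h, simulate(h))                   # past a threshold, the system flips

For low harvest rates the jellyfish remain suppressed; past a threshold the fish stock collapses and the jellyfish break out, illustrating how a single trigger can flip the whole system.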

Relevance: 100.00%

Abstract:

Carbon possesses unique electrical and structural properties that make it an ideal material for use in fuel cell construction. In alkaline, phosphoric acid and proton-exchange membrane fuel cells (PEMFCs), carbon is used in fabricating the bipolar plate and the gas-diffusion layer. It can also act as a support for the active metal in the catalyst layer. Various forms of carbon - from graphite and carbon blacks to composite materials - have been chosen for fuel-cell components. The development of carbon nanotubes and the emergence of nanotechnology in recent years have therefore opened up new avenues of materials development for low-temperature fuel cells, particularly the hydrogen PEMFC and the direct methanol PEMFC. Carbon nanotubes and aerogels are also being investigated for use as catalyst supports, and this could lead to the production of more stable, high-activity catalysts with low platinum loadings (<0.1 mg cm⁻²) and therefore low cost. Carbon can also be used as a fuel in high-temperature fuel cells based on solid oxide, alkaline or molten carbonate technology. In the direct carbon fuel cell (DCFC), the energy of combustion of carbon is converted to electrical power with a thermodynamic efficiency close to 100%. The DCFC could therefore help to extend the use of fossil fuels for power generation as society moves towards a more sustainable energy future.
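
The 'close to 100%' claim follows directly from reaction thermodynamics: a fuel cell's ideal efficiency is the ratio of Gibbs free energy to enthalpy, η = ΔG/ΔH, and for C + O2 → CO2 the entropy change is nearly zero, so ΔG ≈ ΔH. A quick check with standard 25 °C textbook values (a sketch for illustration, not taken from the paper):

    # Ideal (reversible) fuel-cell efficiency eta = dG/dH for C + O2 -> CO2.
    # Standard-state values at 298 K: dH ~ -393.5 kJ/mol, dS ~ +2.9 J/(mol K).
    dH = -393.5e3                  # J/mol, enthalpy of combustion of graphite
    dS = 2.9                       # J/(mol K), nearly zero entropy change
    T = 298.15                     # K
    dG = dH - T * dS               # ~ -394.4 kJ/mol
    print(f"eta = {dG / dH:.3f}")  # ~1.002: ideal efficiency even exceeds 100%

Because ΔS is slightly positive, the reversible efficiency is marginally above unity; a hydrogen cell, by contrast, has an ideal efficiency of only about 83% at the same temperature.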

Relevance: 100.00%

Abstract:

This paper reports on the early stages of a three-year study investigating the impact of a technology-enriched teacher education program on beginning teachers' integration of computers, graphics calculators, and the Internet into secondary school mathematics classrooms. Whereas much of the existing research on the role of technology in mathematics learning has been concerned with effects on curriculum content or student learning, less attention has been given to the relationship between technology use and issues of pedagogy, in particular the impact on teachers' professional learning in the context of specific classroom and school environments. Our research applies sociocultural theories of learning to consider how beginning teachers are initiated into a collaborative professional community featuring both web-based and face-to-face interaction, and how participation in such a community shapes their pedagogical beliefs and practices. The aim of this paper is to analyse the processes through which the emerging community was established and sustained during the first year of the study. We examine features of this community in terms of identity formation, shifts in values and beliefs, and interaction patterns revealed in bulletin-board discussions between students and lecturers.

Relevance: 100.00%

Abstract:

In the early 21st century, we need to prepare university students to navigate local and global cultures effectively and sensitively. These future professionals must develop comprehensive intercultural communication skills and understanding. Yet university assessment in Australia is often based on a Western template of knowledge, which automatically places international and Indigenous students, as well as certain groups of local students, at a disadvantage in their studies. It also means that Australian students from dominant groups are not given the opportunity to develop these vital intercultural skills. This paper explores the issues embedded in themes 1 and 4 of this conference and provides details of an innovative website developed at Queensland University of Technology in Brisbane, Australia, which encourages academic staff to investigate the hidden assumptions that can underpin their assessment practices. The website also suggests strategies academics can use to make their assessment more socially and culturally responsive.

Relevance: 100.00%

Abstract:

Length- and time-scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM), the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and LinearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we present the basic concepts of escript and show how it is used to implement a simulation code for interacting fault systems. We also show results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
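
As a small illustration of this programming model, a Poisson-type problem on a rectangular finley mesh might be set up as follows. This is a sketch against the esys.escript Python API as described above; exact module paths and argument names may differ between escript releases.

    # Sketch: solve -div(grad u) = 1 on the unit square with escript/finley.
    from esys.escript import kronecker, whereZero
    from esys.escript.linearPDEs import LinearPDE
    from esys.finley import Rectangle

    domain = Rectangle(n0=40, n1=40, l0=1.0, l1=1.0)  # 40x40 FEM mesh
    x = domain.getX()

    pde = LinearPDE(domain)               # general form includes -div(A grad u) = Y
    pde.setValue(A=kronecker(domain),     # A = identity -> Laplace operator
                 Y=1.0,                   # constant right-hand side
                 q=whereZero(x[0]),       # mask: constrain nodes on the x0 = 0 edge
                 r=0.0)                   # prescribed value on those nodes
    u = pde.getSolution()                 # a Data object holding nodal values

The Data object u carries its values together with information about where on the mesh they live, so it can be fed directly into further escript expressions or written out for visualisation.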

Relevance: 100.00%

Abstract:

Corpus linguistics is a young discipline. The earliest work was done in the 1960s, but corpora only began to be widely used by lexicographers and linguists in the late 1980s, by language teachers in the late 1990s, and by language students only very recently. This course in corpus linguistics was held at the Departamento de Linguistica Aplicada, E.T.S.I. de Minas, Universidad Politecnica de Madrid, from June 15 to 19, 1998. About 45 teachers registered for the course; 30% had PhDs in linguistics, 20% in literature, and the rest were doctoral candidates or qualified English teachers. The course was designed to introduce the use of corpora and other computational resources in teaching and research, with special reference to scientific and technological discourse in English. Each participant had a computer networked with the lecturer's machine, whose display could be projected onto a large screen. Application programs were loaded onto the central server, and telnet and a web browser were available. COBUILD gave us permission to access the 323-million-word Bank of English corpus, Mike Scott allowed us to use his WordSmith Tools software, and Tim Johns gave us a copy of his MicroConcord program.
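
The central tool behind programs like MicroConcord and WordSmith Tools is the keyword-in-context (KWIC) concordance. As an illustration of the idea (a toy sketch, not the actual course software), a few lines of Python suffice:

    # A minimal keyword-in-context (KWIC) concordancer: print each occurrence
    # of a node word with a fixed window of left and right context.
    import re

    def kwic(text, node, window=6, width=40):
        tokens = re.findall(r"\w+|[^\w\s]", text.lower())
        for i, tok in enumerate(tokens):
            if tok == node:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                print(f"{left:>{width}}  [{node}]  {right}")

    sample = ("The corpus shows that the verb 'show' collocates with results. "
              "Results show trends; the data show a clear pattern.")
    kwic(sample, "show")

Sorting such lines by the first word of the right (or left) context is what turns a raw concordance into a view of a word's collocational behaviour.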

Relevance: 100.00%

Abstract:

The point of departure for this study was a recognition of the differences between suppliers' and acquirers' judgements of the value of technology when it is transferred between the two, and of the significant impact of technology valuation on the establishment of technology partnerships and the effectiveness of technology collaborations. The perceptions, transfer strategies and objectives, perceived benefits and assessed technology contributions, as well as the associated costs and risks of both suppliers and acquirers, were seen to be at the core of these differences. This study hypothesised that the capability embodied in technology to yield future returns makes technology valuation distinct from the process of valuing manufactured products. The study hence goes beyond the dimensions of cost calculation and price determination discussed in the existing literature, taking a broader view of how to achieve and share future added value from transferred technology. The core of technology valuation was argued to be the evaluation of the 'quality' of the capability (technology) in generating future value and the effectiveness of the transfer arrangement for the best use of such a capability. A dynamic approach comprising future value generation and realisation within the context of specific forms of collaboration was therefore adopted. The research investigations focused on the UK and Chinese machine tool industries, where there are many technology transfer activities and the value issue has already been recognised in practice. Data were gathered from three groups: machine tool manufacturing technology suppliers in the UK, acquirers in China, and machine tool users in China. Data collection methods included questionnaire surveys and case studies within all three groups. The study focused on identifying and examining the major factors affecting value, as well as their interactive effects on technology valuation, from both the supplier's and the acquirer's point of view. The survey results showed the perceptions and assessments of the owner's value and the transfer value from the supplier's and the acquirer's points of view respectively. Benefits, costs and risks related to the technology transfer were the major factors affecting the value of technology. The impact of transfer payment on the value of technology, through the sharing of financial benefits, costs and risks between partners, was assessed. A close relationship between technology valuation and transfer arrangements was established, in which technical requirements and strategic implications were considered. The case studies reflected the research propositions and revealed that benefits, costs and risks in the financial, technical and strategic dimensions interact in the process of technology valuation within the context of technology collaboration. Further to the assessment of factors affecting value, a technology valuation framework was developed which suggests that technology attributes for the enhancement of contributory factors, and their contributions to the realisation of transfer objectives, need to be measured and compared with the associated costs and risks. The study concluded that technology valuation is a dynamic process including the generation and sharing of future value and the interactions between financial, technical and strategic achievements.
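
The 'future value' view argued for here can be made concrete with a standard discounted-value sketch: value the transferred capability by the risk-adjusted stream of added value it is expected to generate, then look at how a payment scheme splits that value between supplier and acquirer. This is a generic illustration of the idea, not the thesis's framework; all figures are hypothetical.

    # Hypothetical sketch: risk-adjusted present value of a technology transfer,
    # and a royalty-style split of that value between the two partners.
    def transfer_value(benefits, costs, discount=0.10, success_prob=0.8):
        """Expected present value of yearly added value, damped by a flat risk."""
        return sum(success_prob * (b - c) / (1.0 + discount) ** t
                   for t, (b, c) in enumerate(zip(benefits, costs), start=1))

    added_value = [200, 400, 500, 500, 400]    # acquirer's extra earnings per year
    running_costs = [150, 150, 100, 100, 100]  # adaptation, training, support
    total = transfer_value(added_value, running_costs)
    royalty = 0.3                              # supplier's share of the added value
    print(f"total {total:.0f}, supplier {royalty * total:.0f}, "
          f"acquirer {(1 - royalty) * total:.0f}")

In the study's terms, the 'quality' of the capability governs the benefit stream, the transfer arrangement governs the costs and the split, and risk enters through the probability that the expected benefits are realised at all.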

Relevance: 100.00%

Abstract:

Developing countries are subject to the same global pressures as their developed counterparts, but face additional domestic challenges that may place them at a significant, and perhaps insurmountable, disadvantage. However, technology still offers them the opportunity to participate in the international economy. Difficult conditions in their countries do not absolve managers from formulating and implementing technology policies that can make their firms globally competitive. At a macro-economic level, a number of broad developmental issues impact on the use of technology in developing countries. The subject of this paper is the challenge facing South African firms in their efforts to master technology, despite internal and external difficulties. Owners of technology need to consider the local context when supplying their technology to developing markets. The paper aims to investigate the views of technology recipients by examining the perceptions of South African managers regarding technology integration in a manufacturing environment. A number of technology suppliers were also interviewed in order to obtain their opinions on the issues raised by the technology acquirers. The importance of different factors in integrating technology is studied in relation to managers' abilities to control these variables. An importance-control grid framework is used to identify critical parameters and to assess how they can be managed in a complex environment.
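
The importance-control grid mentioned above can be sketched as a simple two-way classification: score each technology-integration factor on importance and on managerial control, then bucket it into a quadrant. The factor names and scores below are hypothetical placeholders, not data from the paper.

    # Hypothetical sketch of an importance-control grid: classify factors by
    # whether they matter and whether managers can actually control them.
    factors = {                     # name: (importance, control), scores 0..10
        "operator training":       (9, 8),
        "supplier documentation":  (8, 3),
        "exchange-rate stability": (7, 1),
        "plant layout":            (4, 9),
    }

    def quadrant(importance, control, cut=5):
        row = "critical" if importance >= cut else "secondary"
        col = "manageable" if control >= cut else "external"
        return f"{row}/{col}"

    for name, (imp, ctl) in factors.items():
        print(f"{name:<24} -> {quadrant(imp, ctl)}")

The interesting quadrant is critical/external: factors that matter greatly but lie largely outside managers' control, which is where the complex-environment discussion bites hardest.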

Relevance: 100.00%

Abstract:

Advances in technology, coupled with increasing labour costs, have caused service firms to explore self-service delivery options. Although some studies have focused on self-service and the use of technology in service delivery, few have explored the role of service quality in consumer evaluation of technology-based self-service options. By integrating and extending the self-service quality framework, the service evaluation model and the Technology Acceptance Model, the authors address this emerging issue by empirically testing a comprehensive model that captures the antecedents and consequences of perceived service quality to predict continued customer interaction in the technology-based self-service context of Internet banking. Important service evaluation constructs such as perceived risk, perceived value and perceived satisfaction are modelled in this framework. The results show that perceived control has the strongest influence on service quality evaluations. Perceived speed of delivery, reliability and enjoyment also have a significant impact on service quality perceptions. The study also found that even though perceived service quality, perceived risk and satisfaction are important predictors of continued interaction, perceived customer value plays a pivotal role in influencing continued interaction.
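
As a toy illustration of how antecedent weights in such a model might be estimated (the paper itself fits a full structural model; this sketch substitutes an ordinary least-squares regression on synthetic data):

    # Toy sketch: regress a service-quality score on its hypothesised
    # antecedents using ordinary least squares on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 4))                 # control, speed, reliability, enjoyment
    true_beta = np.array([0.6, 0.3, 0.3, 0.2])  # 'control' given the largest weight
    quality = X @ true_beta + rng.normal(scale=0.5, size=n)

    beta_hat, *_ = np.linalg.lstsq(X, quality, rcond=None)
    for name, b in zip(["control", "speed", "reliability", "enjoyment"], beta_hat):
        print(f"{name:<12} {b:+.2f}")

The recovered coefficients mirror the pattern the study reports, with perceived control carrying the largest weight, but only because the synthetic data were generated that way; the point is the mechanics, not the numbers.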

Relevance: 100.00%

Abstract:

This paper expands research into self-service technology in the service encounter. Self-service technology is where customers deliver a service themselves using some form of technological interface. A great deal is still unknown about self-service technology, in particular its impact on consumer satisfaction and consumer commitment. With that in mind, this empirical study explores the relative impact of self-service technology on consumer satisfaction and on a multidimensional measure of consumer commitment comprising affective commitment, temporal commitment and instrumental commitment. The results reveal that, in a hotel context, personal service remains very important for assessments of satisfaction and of affective and temporal commitment. What is particularly interesting is that self-service technology, while impacting these constructs, also impacts instrumental commitment. This suggests that positive evaluations of self-service technology may tie consumers into relationships with hotels. A discussion of these and other results, with implications for managers, is provided, and the paper concludes with potential avenues for further research.

Relevance: 100.00%

Abstract:

Purpose - To consider the role of technology in knowledge management in organizations, both actual and desired. Design/methodology/approach - Facilitated, computer-supported group workshops were conducted with 78 people from ten different organizations. The objective of each workshop was to review the current state of knowledge management in that organization and develop an action plan for the future. Findings - Only three organizations had adopted a strongly technology-based "solution" to knowledge management problems, and these followed three substantially different routes. There was a clear emphasis on the use of general information technology tools to support knowledge management activities, rather than on tools specific to knowledge management. Research limitations/implications - Further research is needed to help organizations make the best use of generally available software, such as intranets and e-mail, for knowledge management. Many issues, especially human ones, relate to the implementation of any technology. Participation was restricted to organizations that wished to produce an action plan for knowledge management; the findings may therefore represent only "average" organizations, not the very best practice. Practical implications - Each organization must resolve four tensions: between the quantity and quality of information/knowledge, between centralized and decentralized organization, between head office and organizational knowledge, and between "push" and "pull" processes. Originality/value - Although it is the group rather than the individual that determines what counts as knowledge, hardly any previous studies of knowledge management have collected data in a group context.

Relevance: 100.00%

Abstract:

Data envelopment analysis defines the relative efficiency of a decision making unit (DMU) as the ratio of the sum of its weighted outputs to the sum of its weighted inputs, allowing the DMUs to freely allocate weights to their inputs/outputs. However, this measure may not reflect a DMU's true efficiency, as some inputs/outputs may not contribute reasonably to the efficiency measure. Traditionally, weights restrictions have been imposed to overcome this problem. This paper offers a new approach to the problem where DMUs operate a constant returns-to-scale technology in a single-input multi-output context. The approach is based on introducing unobserved DMUs, created by adjusting the output levels of certain observed relatively efficient DMUs, reflecting a combination of technical information on feasible production levels and the decision maker's (DM's) value judgements. Its main advantage is that the information conveyed by the DM is local, with reference to a specific observed DMU. The approach is illustrated on a real-life application.
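
In the constant returns-to-scale (CCR) multiplier form, the efficiency of DMU j0 is the optimum of: maximise the weighted output sum u·y_j0 subject to v·x_j0 = 1 and u·y_j <= v·x_j for every DMU j, with u, v >= 0. In the single-input, multi-output setting considered here this is a small linear program. A sketch with made-up data (not the paper's application):

    # CCR multiplier model, single input, multiple outputs, via scipy's linprog.
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(x, Y, j0):
        """max u.y_j0  s.t.  v*x_j0 = 1,  u.y_j - v*x_j <= 0,  u, v >= 0."""
        n, m = Y.shape                           # n DMUs, m outputs
        c = np.concatenate([-Y[j0], [0.0]])      # maximise -> minimise the negative
        A_ub = np.hstack([Y, -x[:, None]])       # u.y_j - v*x_j <= 0 for all j
        b_ub = np.zeros(n)
        A_eq = np.zeros((1, m + 1)); A_eq[0, m] = x[j0]   # normalisation v*x_j0 = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (m + 1))
        return -res.fun

    x = np.array([1.0, 1.0, 1.0])                # constant single input
    Y = np.array([[4.0, 2.0], [2.0, 4.0], [1.0, 1.0]])
    print([round(ccr_efficiency(x, Y, j), 3) for j in range(3)])  # [1.0, 1.0, 0.333]

Introducing an unobserved DMU in the spirit of the paper amounts to appending an extra row to Y (an adjusted copy of an efficient DMU's outputs), which tightens the constraint set and can only lower the measured efficiencies of the others.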

Relevance: 100.00%

Abstract:

Blurred edges appear sharper in motion than when they are stationary. We proposed a model of this motion sharpening that invokes a local, nonlinear contrast transducer function (Hammett et al., 1998, Vision Research 38, 2099-2108). Response saturation in the transducer compresses or 'clips' the input spatial waveform, rendering the edges sharper. To explain the increasing distortion of drifting edges at higher speeds, the degree of nonlinearity must increase with speed or temporal frequency. A dynamic contrast gain control before the transducer can account for both the speed dependence and the approximate contrast invariance of motion sharpening (Hammett et al., 2003, Vision Research, in press). We show here that this model also predicts perceived sharpening of briefly flashed and flickering edges, and that it can account fairly well for experimental data from all three modes of presentation (motion, flash, and flicker). At moderate durations and lower temporal frequencies the gain control attenuates the input signal, thus protecting it from later compression by the transducer. The gain control is somewhat sluggish, however, and so it suffers both a slow onset and a loss of power at high temporal frequencies. Consequently, brief presentations and high temporal frequencies of drift and flicker are less protected from distortion, and show greater perceptual sharpening.
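
The clipping mechanism is easy to demonstrate numerically: pass a blurred edge through a compressive transducer and the transition steepens. In the sketch below a generic Naka-Rushton-style saturation stands in for the model's fitted transducer; the parameter values are illustrative only.

    # Sketch: a compressive transducer 'clips' a blurred edge, steepening it.
    import numpy as np

    x = np.linspace(-1.0, 1.0, 9)
    edge = np.tanh(x / 0.4)                  # blurred edge, contrast from -1 to +1

    def transducer(c, s=0.2):
        # compressive response; smaller s means stronger saturation (higher speed)
        return np.sign(c) * np.abs(c) / (np.abs(c) + s)

    out = transducer(edge)
    out /= np.abs(out).max()                 # renormalise for a fair comparison
    for xi, e, o in zip(x, edge, out):
        print(f"x={xi:+.2f}  input={e:+.2f}  output={o:+.2f}")

The output traverses most of its range over a narrower spatial region than the input, which is the 'sharpened' profile; making s smaller (the higher-speed regime) clips harder and sharpens more.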

Relevance: 100.00%

Abstract:

Blurred edges appear sharper in motion than when they are stationary. We have previously shown (Vision Research 38 (1998) 2099-2108) how such distortions in perceived edge blur may be accounted for by a model which assumes that luminance contrast is encoded by a local contrast transducer whose response becomes progressively more compressive as speed increases. If the form of the transducer is fixed (independent of contrast) for a given speed, then a strong prediction of the model is that motion sharpening should increase with increasing contrast. We measured the sharpening of periodic patterns over a large range of contrasts, blur widths and speeds. The results indicate that whilst sharpening increases with speed, it is practically invariant with contrast. The contrast invariance of motion sharpening is not explained by an early, static compressive nonlinearity alone. However, several alternative explanations are also inconsistent with these results. We show that if a dynamic contrast gain control precedes the static nonlinear transducer, then motion sharpening, its speed dependence, and its invariance with contrast can be predicted with reasonable accuracy.

Relevance: 100.00%

Abstract:

Blurred edges appear sharper in motion than when they are stationary. We have previously shown how such distortions in perceived edge blur may be explained by a model which assumes that luminance contrast is encoded by a local contrast transducer whose response becomes progressively more compressive as speed increases. To test this model further, we measured the sharpening of drifting, periodic patterns over a large range of contrasts, blur widths, and speeds. The results indicate that, while sharpening increased with speed, it was practically invariant with contrast. This contrast invariance cannot be explained by a fixed compressive nonlinearity, since that predicts almost no sharpening at low contrasts. We show by computational modelling of spatiotemporal responses that, if a dynamic contrast gain control precedes the static nonlinear transducer, then motion sharpening, its speed dependence, and its invariance with contrast can be predicted with reasonable accuracy.
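
The role of the gain control can be shown schematically: divide the input by an estimate of its own contrast before the static transducer, and low-contrast edges are pushed into the compressive range just as high-contrast ones are, so the clipping (and hence the sharpening) barely varies with contrast. In this sketch an idealised, instantaneous normalisation stands in for the model's sluggish dynamic gain control, and all parameters are illustrative.

    # Sketch: contrast gain control before a static compressive transducer
    # makes edge steepening roughly contrast-invariant.
    import numpy as np

    x = np.linspace(-1.0, 1.0, 201)

    def transducer(c, s=0.2):
        return np.sign(c) * np.abs(c) / (np.abs(c) + s)

    def rise_width(profile):
        # width of the central transition between -50% and +50% of peak
        p = profile / np.abs(profile).max()
        return x[p >= 0.5][0] - x[p <= -0.5][-1]

    for contrast in (0.1, 0.4, 0.8):
        raw = contrast * np.tanh(x / 0.4)               # blurred edge
        static = transducer(raw)                        # static nonlinearity alone
        gained = transducer(raw / np.abs(raw).max())    # gain control first
        print(f"c={contrast}: width static={rise_width(static):.2f}, "
              f"gained={rise_width(gained):.2f}")

With the static transducer alone, the transition width shrinks sharply as contrast rises (sharpening grows with contrast, contrary to the data); with the normalising stage in front, the width is essentially the same at all contrasts.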