974 results for Interactive technology
Abstract:
Higher education institutions have an active role in the development of a sustainable future, and for this reason it is essential that they become environmentally sustainable institutions, applying methods such as Ecological Footprint analysis. This study aims to strengthen the potential of the ecological footprint as an indicator of the sustainability of students of the Lisbon School of Health Technology, and to identify the relationship between the ecological footprint and different socio-demographic variables.
Abstract:
According to the Intergovernmental Panel on Climate Change, the average temperature of the Earth's surface has risen about 1 °C in the last 100 years and will continue to increase, depending on the greenhouse-gas emissions scenario. Rising temperatures could trigger environmental effects such as rising sea levels, floods, droughts, heat waves and hurricanes. With growing concern about environmental issues and the need to address climate change, institutions of higher education should create knowledge, integrate sustainability into their teaching and research programmes, and promote environmental awareness in society. The aim of this study is to determine the carbon footprint of the academic community of the Lisbon School of Health Technology (ESTeSL) in 2013, identifying possible links between the carbon footprint and different socio-demographic variables.
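The footprint studies above rest on a simple activity-based accounting: each activity amount is multiplied by an emission factor and the products are summed. A minimal sketch of that calculation follows; the activity names and emission factors are illustrative placeholders, not the values used in the ESTeSL study.

```python
# Activity-based carbon footprint estimate:
#   footprint = sum(activity_amount * emission_factor)
# The factors below are illustrative assumptions, not study values.

EMISSION_FACTORS = {          # kg CO2e per unit of activity
    "car_km": 0.19,           # per km driven (assumed factor)
    "bus_km": 0.10,           # per km travelled by bus (assumed factor)
    "electricity_kwh": 0.35,  # per kWh consumed (assumed factor)
}

def carbon_footprint(activities: dict) -> float:
    """Return total kg CO2e for a dict of {activity: amount}."""
    return sum(EMISSION_FACTORS[name] * amount
               for name, amount in activities.items())

# One respondent's yearly activity data (invented numbers).
student = {"car_km": 2000, "bus_km": 1500, "electricity_kwh": 900}
total = carbon_footprint(student)   # 380 + 150 + 315 = 845.0 kg CO2e
```

Per-respondent totals computed this way can then be cross-tabulated against the socio-demographic variables the study mentions.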
Abstract:
This paper studies the effects of the diffusion of a General Purpose Technology (GPT) that spreads first within the developed North country of its origin, and then to a developing South country. In the general equilibrium growth model developed here, each final good can be produced by one of two technologies. Each technology is characterized by a specific labor input complemented by a specific set of intermediate goods, which are enhanced periodically by Schumpeterian R&D activities. When quality reaches a threshold level, a GPT arises in one of the technologies and spreads first to the other technology within the North. It then propagates to the South, following a similar sequence. Since diffusion is uneven, both intra- and inter-country, the GPT produces successive changes in the direction of technological knowledge and in inter- and intra-country wage inequality. Through this mechanism, the different observed paths of wage inequality can be accommodated.
Abstract:
With the advent of Web 2.0, new kinds of tools became available which are no longer seen as novel but are widely used. For instance, according to Eurostat data, in 2010 32% of individuals aged 16 to 74 used the Internet to post messages to social media sites or instant messaging tools, ranging from 17% in Romania to 46% in Sweden (Eurostat, 2012). Web 2.0 applications have been used in technology-enhanced learning environments, and Learning 2.0 is a concept used to describe the use of social media for learning. Many Learning 2.0 initiatives have been launched by educational and training institutions in Europe, and Web 2.0 applications have also been used for informal learning. Web 2.0 tools can be used in classrooms, virtual or not, not only to engage students but also to support collaborative activities. Many of these tools allow users to attach tags to resources, both to organize them and to facilitate their retrieval later on. The aim of this chapter is to describe how tagging has been used in systems that support formal or informal learning and to summarize the functionalities common to these systems. In addition, common and unusual tagging applications used in some Learning Objects Repositories are analysed.
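The core mechanism the chapter surveys, free-form tags attached to resources and later used for retrieval, can be sketched as an inverted index from tag to resource set. The class and resource names below are illustrative, not taken from any of the systems analysed.

```python
from collections import defaultdict

# Minimal sketch of tag-based organisation and retrieval: users attach
# free-form tags to resources; retrieval intersects the tag postings.

class TagIndex:
    def __init__(self):
        self._by_tag = defaultdict(set)   # tag -> set of resource ids

    def tag(self, resource: str, *tags: str) -> None:
        """Attach one or more tags to a resource (case-insensitive)."""
        for t in tags:
            self._by_tag[t.lower()].add(resource)

    def find(self, *tags: str) -> set:
        """Return the resources carrying ALL the given tags."""
        sets = [self._by_tag[t.lower()] for t in tags]
        return set.intersection(*sets) if sets else set()

index = TagIndex()
index.tag("jack-and-the-beanstalk.pdf", "English", "story")
index.tag("city-vocab.ppt", "English", "vocabulary")
found = index.find("english", "story")   # {"jack-and-the-beanstalk.pdf"}
```

Real Learning Objects Repositories layer extra functionality on this core, such as per-user tag clouds or tag-based recommendation, which is what the chapter compares across systems.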
Abstract:
Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented e-commerce applications. One of the major design challenges of VR environments is the placement of the rendering process, which converts the abstract description of a scene, as contained in an object database, into an image. This process is usually done on the client side, as in VRML [1], a technology that relies on the client's computational power for smooth rendering. The vision of VR is also strongly connected to the issue of Quality of Service (QoS), as the perceived realism depends on an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms and realistic image quality. These requirements push traditional home computers, and even sophisticated graphical workstations, beyond their limits. Our work therefore introduces an approach to a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that a distributed rendering approach as described in this paper has three major benefits: it reduces the client's workload, it decreases network traffic and it allows the re-use of already rendered scenes.
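The balancing idea, shifting rendering work toward the server whenever the client falls out of the 10-30 fps QoS window, can be sketched as a simple feedback rule. The adjustment rule below is an illustrative assumption for exposition, not the paper's actual balancing algorithm.

```python
# Sketch of client/server workload balancing driven by the measured
# frame rate. server_share is the fraction of rendering done on the
# cluster-based server; the step size and thresholds are assumptions.

MIN_FPS = 10.0   # lower bound of the interactive QoS window
MAX_FPS = 30.0   # upper bound; above this the client has headroom

def rebalance(server_share: float, measured_fps: float,
              step: float = 0.1) -> float:
    """Return the new fraction of rendering done on the server."""
    if measured_fps < MIN_FPS:            # client overloaded: offload
        server_share = min(1.0, server_share + step)
    elif measured_fps > MAX_FPS:          # headroom: cut network load
        server_share = max(0.0, server_share - step)
    return round(server_share, 2)

share = 0.5
share = rebalance(share, measured_fps=7.0)   # client too slow -> 0.6
```

Within the QoS window the split is left untouched, which avoids oscillating between client and server and keeps already rendered scenes reusable.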
Abstract:
When exploring a virtual environment, realism depends mainly on two factors: realistic images and real-time feedback (motion, behaviour, etc.). In this context, the photorealism and physical validity of computer-generated images required by emerging applications, such as advanced e-commerce, still pose major challenges in rendering research, while the complexity of lighting phenomena further demands powerful and predictable computing if time constraints are to be met. In this technical report we review the state of the art in rendering, focusing on approaches, techniques and technologies that might enable real-time, interactive, web-based client-server rendering systems. The focus is on the end systems, not on the networking technologies used to interconnect client(s) and server(s).
Abstract:
Doctoral thesis in Environmental Engineering, speciality in Social Systems
Abstract:
In this paper we present research carried out in a state primary school that is very well equipped with ICT resources, including interactive whiteboards. The interactive whiteboard was used in the context of a unit of work for English learning based on a traditional oral story, 'Jack and the Beanstalk'. It was also used to reinforce other topics such as 'At the beach', 'In the city' and 'Jobs'. An analysis of the use of the digital board, comprising observation records as well as questionnaires for teachers and pupils, was carried out.
Abstract:
Operational Modal Analysis is currently applied in structural dynamic monitoring studies using conventional wire-based sensors and data acquisition platforms. This approach, however, becomes inadequate when tests are performed on ancient structures with aesthetic concerns, or where the use of wires greatly increases the cost of the monitoring system and complicates the deployment and maintenance of data acquisition platforms. In such cases, sensor platforms based on wireless communication and MEMS would clearly benefit these applications. This work presents a first attempt to apply this wireless technology to the structural monitoring of historical masonry constructions in the context of operational modal analysis. Commercial WSN platforms were used to study one laboratory specimen and one of the structural elements of a 15th-century building in Portugal. Results showed that, in comparison with conventional wired sensors, the wireless platforms performed poorly with respect to the recorded acceleration time series and the detection of mode shapes. For frequency detection, however, reliable results were obtained, especially when random excitation was used as the noise source.
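The task where the wireless platforms did give reliable results, identifying natural frequencies from an acceleration time series, amounts to picking the dominant peak of the signal's spectrum. A minimal, dependency-free sketch follows; a plain O(N^2) DFT stands in for the FFT-based peak picking an actual modal analysis tool would use, and the 5 Hz test signal is invented.

```python
import math

# Sketch of frequency detection by spectral peak picking: compute the
# DFT magnitude of the acceleration record and return the frequency of
# the largest positive-frequency bin.

def dominant_frequency(signal, sample_rate):
    """Return the dominant frequency (Hz) of a real-valued signal."""
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):            # skip DC; positive bins only
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(signal))
        im = sum(-x * math.sin(2 * math.pi * k * i / n)
                 for i, x in enumerate(signal))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n       # bin index -> Hz

fs = 100.0                                 # assumed sampling rate, Hz
t = [i / fs for i in range(200)]
accel = [math.sin(2 * math.pi * 5.0 * ti) for ti in t]   # 5 Hz "mode"
f = dominant_frequency(accel, fs)          # -> 5.0
```

Frequency resolution is sample_rate/N, so longer records sharpen the estimate, which is one reason frequency detection tolerates the noisy amplitudes of wireless MEMS sensors better than mode-shape extraction does.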
Abstract:
Electricity is regarded as one of the indispensable means to the growth of any country's economy. This source of power is the heartbeat of everything from huge metropolises, industries, worldwide computer networks and global communication systems down to our homes. Electrical energy is the lifeline of economic and societal development in any region or country. It is central to developed countries for maintaining acquired lifestyles, and essential to developing countries for industrialisation and escaping poverty.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in the Integrated Master's programme in Chemical and Biochemical Engineering
Abstract:
In this paper we demonstrate an add/drop filter based on SiC technology. Tailoring of the channel bandwidth and wavelength is experimentally demonstrated. The concept is extended to implement a 1-by-4 wavelength division multiplexer with channel separation in the visible range. The device consists of a p-i'(a-SiC:H)-n/p-i(a-Si:H)-n heterostructure. Several monochromatic pulsed lights, separately or in a polychromatic mixture, illuminated the device. Independent tuning of each channel is performed by a steady-state violet bias superimposed from either the front or the back side. Results show that the front background enhances the light-to-dark sensitivity of the long- and medium-wavelength channels and strongly quenches the others, while the back violet background has the opposite behaviour. This nonlinearity provides the possibility of selectively removing or adding wavelengths. An optoelectronic model is presented that explains the light-filtering properties of the add/drop filter under different optical bias conditions.
Abstract:
The work agenda includes the production of a report on different doctoral programmes in Technology Assessment in Europe, the US and Japan, in order to analyse collaborative post-graduation activities. Finally, the proposals for a collaborative post-graduation programme between FCTUNL and ITAS-FZK will be developed through an ongoing discussion process with colleagues from ITAS.
Abstract:
In global scientific experiments with collaborative scenarios involving multinational teams, there are big challenges related to data access: data movement to other regions or Clouds is precluded by constraints on latency, costs, data privacy and data ownership. Furthermore, each site processes local data sets using specialized algorithms and produces intermediate results that are useful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution for running workflow activities on distributed data centers in different regions without the need for large data movements. AWARD workflow activities are independently monitored and can be dynamically reconfigured and steered by different users, namely by hot-swapping algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
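The hot-swapping idea described above, replacing a running activity's algorithm between iterations without stopping the workflow, can be sketched in a few lines. The class and function names are illustrative, not the AWARD API.

```python
# Sketch of run-time algorithm hot-swapping in a workflow activity:
# the activity holds a reference to its processing algorithm, and a
# steering user may replace that reference between invocations.

class Activity:
    def __init__(self, algorithm):
        self._algorithm = algorithm

    def swap(self, algorithm):
        """Hot-swap the processing algorithm without stopping the activity."""
        self._algorithm = algorithm

    def process(self, data):
        return self._algorithm(data)

def coarse(xs):                 # first algorithm: mean of the inputs
    return sum(xs) / len(xs)

def refined(xs):                # steered-in replacement: median
    return sorted(xs)[len(xs) // 2]

act = Activity(coarse)
r1 = act.process([1, 2, 9])     # mean -> 4.0
act.swap(refined)               # user steers the running workflow
r2 = act.process([1, 2, 9])     # median -> 2
```

Feedback dependencies fit the same shape: the successor's output is fed back as the next `process` input, so swapping and restructuring compose without redeploying the workflow.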