7 results for Software tools

in the Repositório Institucional da Universidade de Aveiro - Portugal


Relevance:

70.00%

Publisher:

Abstract:

The use of ICT occupies an increasingly important place in our schools, driven above all by the evolution of technology and by the use of many Web 2.0 tools in educational contexts. This is particularly evident in the subject of Educação Visual e Tecnológica (EVT), of an eminently practical nature, in which a variety of digital tools can be explored both to address the subject's contents and to create graphic and plastic products. With the emergence of Web 2.0 and the availability of thousands of new digital tools to Internet users, there is growing interest in adopting methodologies and strategies based on these media that support more effective and motivating learning for students, articulating the traditional media of EVT with the new digital media. In this context, the present study is the result of an action research project carried out within the Programa Doutoral em Multimédia em Educação of the Universidade de Aveiro, in which the integration of Web, Web 2.0 and free software tools was implemented in the educational context of EVT, allowing both the traditional techniques most commonly used in the subject and their integration and articulation with digital tools, supported by free software (and other freely available tools), the Web and Web 2.0, to support the teaching and learning of the subject's various contents and exploration areas. The study, designed in three cycles, first involved setting up an extended community of practice of teachers, with six training classes bringing together a total of 112 teachers who intended to integrate digital tools into EVT. In addition to the search, analysis, selection and cataloguing of the 430 digital tools surveyed, 371 support manuals for their use were produced, and these resources were made available on the EVTdigital space. In a second cycle, following the evaluation carried out, the EVTux distribution was created to simplify access to and use of digital tools in the context of EVT. Finally, the third cycle followed from the elimination of EVT from the curriculum of the 2nd cycle of basic education and its replacement by two new subjects; a content analysis of the new curricular goals was carried out and the application As ferramentas digitais do Mundo Visual was produced, designed to contextualize and index the digital tools selected for the new subject of Educação Visual. The results of this study clearly point to the possibility of integrating digital tools into Educação Visual e Tecnológica (or, at present, Educação Visual) to address its contents and exploration areas, to the ease with which communities of practice can be formed (as was the case here) to collaborate in cataloguing these tools in the specific context of the subject, and to the need felt by teachers for information and training that keeps them up to date on the integration of ICT into the curriculum. The limitations of the study are also presented, chiefly the negative impact that the elimination of the subject had on teachers' motivation and, consequently, on their participation in some phases of the work, as well as the difficulty of managing such a large and diverse team of collaborating teachers. Suggestions for future studies are also put forward.

Relevance:

60.00%

Publisher:

Abstract:

In the last decade, mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. In particular, a highly relevant research topic in telecommunications nowadays is the design and implementation of 4th generation (4G) mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (the all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the high data rates expected from the type of multimedia applications (e.g. YouTube and Skype) to be available in the near future. 4G wireless communication systems will therefore be of paramount importance to the development of the information society. As 4G wireless services continue to grow, they will put more and more pressure on spectrum availability. There is worldwide recognition that current spectrum management methods have reached their limit and are no longer optimal, so new paradigms must be sought. Studies show that most of the assigned spectrum is under-utilized; the problem in most cases is therefore inefficient spectrum management rather than spectrum shortage. There are currently trends towards a more liberalized approach to spectrum management, tightly linked to what is commonly termed Cognitive Radio (CR). Furthermore, conventional deployments of 4G wireless systems (one BS per cell, with mobile terminals distributed around it) are known to have problems in providing fairness (users closer to the BS benefit more than cell-edge users) and in covering zones affected by shadowing, so the use of relays has been proposed as a solution. To evaluate and analyse the performance of 4G wireless systems, software tools are normally used. These tools have matured considerably in recent years, and their role in providing a high-level evaluation of proposed algorithms and protocols is now more important than ever. System level simulation (SLS) tools provide a fundamental and flexible way to test all the envisioned algorithms and protocols under realistic conditions, without having to deal with the problems of live networks or reduced-scope prototypes. Furthermore, such tools allow network designers to rapidly collect a wide range of performance metrics that are useful for the analysis and optimization of different algorithms. This dissertation proposes the design and implementation of a conventional system level simulator (SLS), which is afterwards enhanced for 4G wireless technologies, namely cognitive radio (IEEE802.22) and relays (IEEE802.16j). The SLS is then used for the analysis of the proposed algorithms and protocols.
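
To make the role of a system level simulator more concrete, the following is a minimal, hypothetical snapshot-based SLS loop in Python: users are dropped uniformly in a single cell, received power follows an assumed log-distance path-loss model with log-normal shadowing, and SINR is mapped to spectral efficiency through the Shannon bound. All parameter values and model choices are illustrative assumptions and are not taken from the simulator developed in this dissertation.

import numpy as np

rng = np.random.default_rng(0)

CELL_RADIUS_M = 500.0      # assumed cell radius
TX_POWER_DBM = 43.0        # assumed BS transmit power
NOISE_DBM = -104.0         # assumed noise power in the user bandwidth
PATHLOSS_EXP = 3.5         # assumed path-loss exponent
SHADOWING_STD_DB = 8.0     # assumed log-normal shadowing standard deviation

def drop_users(n):
    # Uniform drop inside a circular cell centred on the base station.
    r = CELL_RADIUS_M * np.sqrt(rng.uniform(size=n))
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return r * np.cos(theta), r * np.sin(theta)

def snapshot(n_users=50):
    # One SLS snapshot: path loss + shadowing -> SINR -> spectral efficiency.
    x, y = drop_users(n_users)
    d_km = np.maximum(np.hypot(x, y), 1.0) / 1000.0
    pathloss_db = 128.1 + 10.0 * PATHLOSS_EXP * np.log10(d_km)
    shadowing_db = rng.normal(0.0, SHADOWING_STD_DB, size=n_users)
    rx_dbm = TX_POWER_DBM - pathloss_db - shadowing_db
    sinr_db = rx_dbm - NOISE_DBM              # single, noise-limited cell
    return np.log2(1.0 + 10.0 ** (sinr_db / 10.0))   # bit/s/Hz per user

# Collect spectral-efficiency statistics over many independent drops.
eff = np.concatenate([snapshot() for _ in range(100)])
print(f"mean: {eff.mean():.2f} bit/s/Hz, cell edge (5th pct): {np.percentile(eff, 5):.2f} bit/s/Hz")

Averaging metrics such as the mean and 5th-percentile (cell-edge) spectral efficiency over many independent drops is what enables fairness comparisons, for instance between relay-enhanced and conventional deployments.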

Relevance:

60.00%

Publisher:

Abstract:

As products and creative processes become increasingly digitally mediated, there has been recent reflection on the relationship between images and the tools used to produce them. The natural and close relationship between the conceptual and the physical dimensions opens the discussion at the level of the semantics and the processes of designing and manipulating images, in which CAD tools are naturally included. Since drawing plays an unequivocal and fundamental role in design practice and in 3D modelling, it is pertinent to understand the relationship and articulation between these two tools. While drawing is recognized as a tool of the physical domain, capable of expressing the thinking that transforms abstract conceptions into concrete ones, recognizing it reflected in the virtual dimension through 3D CAD software is not trivial, since the latter is generally driven by a mode of thinking whose context is distant from materiality. Methodologically, we approach this question by seeking to verify the hypothesis through a practical exercise designed to assess the effect that analogue images may have on the recognition and operability of the Blender tool in an academic setting. The aim is thus to understand how analogue drawing can be integrated into the 3D modelling process and what relationship it maintains with those who operate it. Articulating drawing with design production tools, specifically 3D CAD, will make it possible to understand in detail how tools of different natures work together, both in the design process and in the creation of visual artefacts, and may also open the discussion on pedagogical strategies for teaching drawing and 3D in a Design course.

Relevance:

30.00%

Publisher:

Abstract:

The exponential growth of the world population has led to an increase in settlements located in areas prone to natural disasters, including earthquakes. Consequently, despite important advances in the modelling of natural catastrophes and in risk mitigation actions, overall human losses have continued to increase and unprecedented economic losses have been registered. In the research work presented herein, various areas of earthquake engineering and seismology are thoroughly investigated, and a case study application for mainland Portugal is performed. Seismic risk assessment is a critical link in the reduction of casualties and damage due to earthquakes. Recognition of this has led to a rapid rise in demand for accurate, reliable and flexible numerical tools and software. In the present work, an open-source platform for seismic hazard and risk assessment is developed. This software is capable of computing the distribution of losses or damage for a single earthquake scenario (deterministic event-based) or the earthquake losses due to all the possible seismic events that might occur within a region over a given interval of time (probabilistic event-based). This effort has followed an open and transparent philosophy, and the platform is therefore available to any individual or institution. The estimation of seismic risk depends mainly on three components: seismic hazard, exposure and vulnerability. The latter component assumes special importance because, by intervening with appropriate retrofitting solutions, it may be possible to directly decrease the seismic risk. The use of analytical methodologies is fundamental in the assessment of structural vulnerability, particularly in regions where post-earthquake building damage data might not be available. Several common methodologies are investigated, and conclusions are drawn regarding the method that provides an optimal balance between accuracy and computational effort. In addition, a simplified approach based on displacement-based earthquake loss assessment (DBELA) is proposed, which allows the rapid estimation of fragility curves while considering a wide spectrum of uncertainties. A novel vulnerability model for the reinforced concrete building stock in Portugal is proposed in this work, using statistical information collected from hundreds of real buildings. An analytical approach based on nonlinear time history analysis is adopted, and the impact of a set of key parameters is investigated, including the damage state criteria and the chosen intensity measure type. A comprehensive review of previous studies that contributed to the understanding of seismic hazard and risk in Portugal is presented. An existing seismic source model was employed together with recently proposed attenuation models to calculate probabilistic seismic hazard throughout the territory. These results are combined with information from the 2011 Building Census and the aforementioned vulnerability model to estimate economic loss maps for a return period of 475 years. These losses are disaggregated across the different building typologies, and conclusions are drawn regarding the construction types most vulnerable to seismic activity.
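
As a rough illustration of the probabilistic event-based calculation described above, the sketch below combines a stochastic event set, a lognormal fragility curve per building typology and an exposure model of replacement costs into an average annual loss. Every number in it (event rate, ground-motion model, fragility medians and dispersions, asset values) is a made-up placeholder rather than data from the Portuguese model developed in this work.

import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)

# Hypothetical exposure model: replacement cost (EUR) per asset.
asset_values = np.array([200e3, 150e3, 300e3, 250e3])

# Hypothetical fragility model: probability of reaching the complete damage
# state given PGA (g), lognormal with median theta and dispersion beta.
theta = np.array([0.45, 0.35, 0.55, 0.40])
beta = np.array([0.50, 0.60, 0.45, 0.55])
loss_ratio = 1.0                     # full loss at the complete damage state

def event_loss(pga):
    # Expected loss for one event, assuming every asset feels the same PGA.
    p_damage = lognorm(s=beta, scale=theta).cdf(pga)
    return float(np.sum(p_damage * loss_ratio * asset_values))

# Stochastic event set over a long catalogue of simulated years.
years = 10_000
rate_per_year = 0.05                 # assumed rate of damaging events
n_events = rng.poisson(rate_per_year * years)
event_pga = rng.lognormal(mean=np.log(0.2), sigma=0.7, size=n_events)

losses = np.array([event_loss(pga) for pga in event_pga])
aal = losses.sum() / years           # average annual loss over the catalogue
print(f"{n_events} events simulated, average annual loss: {aal:,.0f} EUR")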

Relevance:

30.00%

Publisher:

Abstract:

The present work deals with the development of robust numerical tools for Isogeometric Analysis suitable for problems of solid mechanics in the nonlinear regime. To that end, a new solid-shell element, based on the Assumed Natural Strain method, is proposed for the analysis of thin shell-like structures. The formulation is extensively validated using a set of well-known benchmark problems available in the literature, in both the linear and nonlinear (geometric and material) regimes. An alternative formulation is also proposed, focused on alleviating the volumetric locking pathology in linear elastic problems. In addition, an introductory study in the field of contact mechanics, in the context of Isogeometric Analysis, is presented, with special focus on the implementation of the Point-to-Segment algorithm. All the methodologies presented in the current work were implemented in an in-house code, together with several pre- and post-processing tools. In addition, user subroutines for the commercial software Abaqus were also implemented.
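
The defining feature of Isogeometric Analysis is that the spline basis describing the CAD geometry is reused to discretise the unknown fields. As a small, self-contained illustration (not code from the in-house solver mentioned above), the sketch below evaluates B-spline basis functions with the Cox-de Boor recursion and checks the partition-of-unity property on an open knot vector.

import numpy as np

def bspline_basis(i, p, knots, xi):
    # Value of the i-th B-spline basis function of degree p at parameter xi
    # (Cox-de Boor recursion, with the 0/0 := 0 convention).
    if p == 0:
        return 1.0 if knots[i] <= xi < knots[i + 1] else 0.0
    left_den = knots[i + p] - knots[i]
    right_den = knots[i + p + 1] - knots[i + 1]
    left = 0.0 if left_den == 0.0 else (xi - knots[i]) / left_den * bspline_basis(i, p - 1, knots, xi)
    right = 0.0 if right_den == 0.0 else (knots[i + p + 1] - xi) / right_den * bspline_basis(i + 1, p - 1, knots, xi)
    return left + right

# Quadratic basis on an open knot vector: the functions form a partition of unity.
knots = np.array([0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0])
degree = 2
xi = 0.3
values = [bspline_basis(i, degree, knots, xi) for i in range(len(knots) - degree - 1)]
print(values, "sum =", sum(values))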

Relevance:

30.00%

Publisher:

Abstract:

When developing software for autonomous mobile robots, one inevitably has to tackle some kind of perception. Moreover, when dealing with agents that possess some level of reasoning for executing their actions, there is the need to model the environment and the robot's internal state in a way that represents the scenario in which the robot operates. Carried out within the ATRI group, part of the IEETA research unit at Aveiro University, this work uses two of the group's projects as test beds, particularly in the scenario of robotic soccer with real robots. With the main objective of developing algorithms for sensor and information fusion that could be used effectively by these teams, several state-of-the-art approaches were studied, implemented and adapted to each of the robot types. Within the MSL RoboCup team CAMBADA, the main focus was the perception of the ball and obstacles, with the creation of models capable of providing extended information so that the reasoning of the robot can be ever more effective. To achieve this, several methodologies were analyzed, implemented, compared and improved. Concerning the ball, an analysis of filtering methodologies for the stabilization of its position and the estimation of its velocity was performed. Also, with the goalkeeper in mind, work has been done to provide it with information on aerial balls. As for obstacles, a new definition of the way they are perceived by the vision system and of the type of information provided was created, as well as a methodology for identifying which of the obstacles are team mates. A tracking algorithm was also developed, which ultimately assigns each obstacle a unique identifier. Associated with the improvement of obstacle perception, a new reactive obstacle avoidance algorithm was created. In the context of the SPL RoboCup team Portuguese Team, besides the inevitable adaptation of many of the algorithms already developed for sensor and information fusion, and considering that the team was recently created, the objective was to create a sustainable software architecture that could be the basis for future modular development. The software architecture created is based on a series of different processes and the means of communication among them. All processes were created or adapted for the new architecture, and a base set of roles and behaviors was defined during this work to achieve a functional base framework. In terms of perception, the main focus was to define a projection model and camera pose extraction that could provide information in metric coordinates. The second main objective was to adapt the CAMBADA localization algorithm to work on the NAO robots, considering all the limitations they present when compared to the MSL team, especially in terms of computational resources. A set of support tools was developed or improved in order to support testing and development in both teams. In general, the work developed during this thesis improved the performance of the teams during play, as well as the effectiveness of the development team during development and test phases.
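
As one concrete example of the kind of filtering referred to above for ball position stabilization and velocity estimation, the following sketch implements a plain constant-velocity Kalman filter over noisy (x, y) detections. It is an illustrative stand-in rather than the specific filter adopted in the CAMBADA code base, and its noise parameters are assumed values.

import numpy as np

class BallFilter:
    # Constant-velocity Kalman filter with state [x, y, vx, vy]; the vision
    # system is assumed to provide noisy (x, y) observations every cycle.
    def __init__(self, dt=0.033, meas_std=0.05, accel_std=2.0):
        self.x = np.zeros(4)                        # state estimate
        self.P = np.eye(4) * 10.0                   # initial uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.R = np.eye(2) * meas_std ** 2          # measurement noise
        self.Q = np.eye(4) * (accel_std * dt) ** 2  # crude process noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# One filter cycle per camera frame: predict, then correct with the detection.
f = BallFilter()
for z in [(1.00, 2.00), (1.03, 2.02), (1.07, 2.03)]:
    f.predict()
    f.update(z)
print("position:", f.x[:2], "velocity:", f.x[2:])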

Relevance:

30.00%

Publisher:

Abstract:

The main motivation for the work presented here began with previously conducted experiments with a programming concept at the time named "Macro". These experiments led to the conviction that it would be possible to build an engine control system from scratch which could eliminate many of the current problems of engine management systems in a direct and intrinsic way. It was also hoped that it would minimize the full range of software and hardware needed to produce a final, fully functional system. Initially, this work presents a comprehensive survey of the state of the art in the specific area of software, and corresponding hardware, of automotive tools and automotive ECUs. Problems arising from such software are identified, and it becomes clear that practically all of them stem, directly or indirectly, from the continued and extensive use of extremely long and complex "tool chains". Similarly, on the hardware side, it is argued that the problems stem from the extreme complexity and inter-dependency inside processor architectures. The conclusions are presented through an extensive list of "pitfalls", which are thoroughly enumerated, identified and characterized. Solutions to the various current issues are also proposed, along with their implementation. All this final work is part of a "proof-of-concept" system called "ECU2010". The central element of this system is the aforementioned "Macro" concept: a graphical block representing one of the many operations required in an automotive system, with arithmetic, logic, filtering, integration and multiplexing functions, among others. The end result of the proposed work is a single, fully integrated tool enabling the development and management of the entire system through one simple visual interface. Part of the presented result relies on a hardware platform fully adapted to the software, enabling high flexibility and scalability and using exactly the same technology for the ECU, data logger and peripherals alike. Current systems follow a mostly evolutionary path, allowing only the online calibration of parameters, but never the online alteration of the automotive functionality algorithms themselves. By contrast, the system developed and described in this thesis had the advantage of following a "clean-slate" approach, whereby everything could be rethought globally. In the end, of all the system characteristics, "LIVE-Prototyping" is the most relevant feature, allowing the adjustment of automotive algorithms (e.g. injection, ignition, lambda control, etc.) 100% online, keeping the engine constantly running, without ever having to stop or reboot to make such changes. This consequently eliminates the "turnaround delay" typically present in current automotive systems, thereby enhancing their efficiency and handling.
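
A rough, hypothetical reading of the "Macro" idea is sketched below: each Macro is a self-contained block with named inputs and one output, a chain of Macros forms a control function, and individual blocks can be swapped while the chain keeps executing, which is the essence of the LIVE-Prototyping behaviour described above. The classes, signal names and toy injection-time calculation are invented for illustration and do not reflect the actual ECU2010 implementation.

from typing import Callable, Dict, List

class Macro:
    # One elementary operation (arithmetic, logic, filtering, ...) with named
    # inputs taken from a shared signal pool and one named output.
    def __init__(self, name: str, inputs: List[str], fn: Callable[..., float]):
        self.name, self.inputs, self.fn = name, inputs, fn
    def evaluate(self, signals: Dict[str, float]) -> float:
        return self.fn(*(signals[i] for i in self.inputs))

class MacroChain:
    # Executes Macros in order, writing each output back to the signal pool.
    def __init__(self, macros: List[Macro]):
        self.macros = macros
    def step(self, signals: Dict[str, float]) -> Dict[str, float]:
        for m in self.macros:
            signals[m.name] = m.evaluate(signals)
        return signals
    def replace(self, name: str, new_macro: Macro) -> None:
        # "LIVE-Prototyping" in miniature: swap one block while the loop keeps running.
        self.macros = [new_macro if m.name == name else m for m in self.macros]

# A toy injection-time calculation: base map value corrected by lambda error.
chain = MacroChain([
    Macro("lambda_error", ["lambda_target", "lambda_measured"], lambda t, m: t - m),
    Macro("inj_time_ms", ["base_inj_ms", "lambda_error"], lambda base, e: base * (1.0 + 0.1 * e)),
])
signals = {"lambda_target": 1.0, "lambda_measured": 0.95, "base_inj_ms": 3.2}
print(chain.step(signals)["inj_time_ms"])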