36 results for Uniform Eberlein Compacta
Abstract:
Master's in Chemical Engineering - Energy Optimization in the Chemical Industry branch
Abstract:
As the variety of mobile devices connected to the Internet grows, there is a corresponding increase in the need to deliver content tailored to their heterogeneous characteristics. At the same time, we have witnessed the growth of e-learning in universities through the adoption of electronic platforms and standards. Not surprisingly, the concept of mLearning (Mobile Learning) has appeared in recent years, reducing the constraint of learning location through the mobility of portable devices. However, this large number and variety of Web-enabled devices poses several challenges for Web content creators who want to automatically determine the delivery context and adapt the content to the client's mobile device. In this paper we analyze several approaches to defining the delivery context and present an architecture for delivering uniform mLearning content to mobile devices, called eduMCA - Educational Mobile Content Adaptation. With the eduMCA system, Web authors will not need to create specialized pages for each kind of device, since the content is automatically transformed to suit the capabilities of any mobile device, from WAP to XHTML MP-compliant devices.
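The delivery-context detection step described in the abstract can be sketched, under assumptions, as a User-Agent lookup against a small capability table. The device tokens and capability fields below are illustrative examples, not the actual eduMCA repository.

```python
# Minimal sketch of delivery-context detection by User-Agent matching.
# Device tokens and capability fields are illustrative, not the eduMCA data.
DEVICE_PROFILES = {
    "iPhone": {"markup": "XHTML MP", "max_image_width": 320},
    "Nokia6230": {"markup": "WAP/WML", "max_image_width": 128},
}

def detect_profile(user_agent, default=None):
    """Return the first capability profile whose token appears in the UA string."""
    for token, profile in DEVICE_PROFILES.items():
        if token.lower() in user_agent.lower():
            return profile
    # Fall back to a conservative default profile for unknown devices
    return default or {"markup": "WAP/WML", "max_image_width": 96}

profile = detect_profile("Mozilla/5.0 (iPhone; CPU iPhone OS 15_0)")
```

A real system would back this lookup with a full capabilities repository rather than a hand-written table, but the control flow is the same: match the request's User-Agent, then transform the content to the matched profile.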
Abstract:
The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain, using OpenFOAM. To this end, tools and procedures for preprocessing and data extraction were developed apart from this code, and thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method was developed to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict). The outcome was a structured mesh in which the user can define the number of control volumes and their dimensions. With this procedure, the difficulties of case setup and the high computational effort reported in the literature for snappyHexMesh, the OpenFOAM resource explored until then for this task, were considered to be overcome. Procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions and the calculation of internal-field first guesses for the iterative solution process, taking as input experimental data supplied by the user. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible and neutrally stratified atmospheric flows, always relying on RaNS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure allowed the user either to define idealized conditions, such as a uniform aerodynamic roughness length or a value varying as a function of characteristic topography features, or to use real site data; it was complemented by techniques for the visual inspection of the generated roughness maps. The absence of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths.
The developed tools and procedures were then applied to the simulation of a neutrally stratified atmospheric flow over the Askervein hill. In the simulations performed, the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness and formulations of the k-ε and k-ω models was evaluated. When compared to experimental data, the calculated values showed good agreement of the speed-up at the hill top and on the lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward side and hill top, and grossly predicted on the lee side, where a zone of flow separation was also identified. Although more work is needed to evaluate the influence of the downstream recirculation zone on the quality of the results, the agreement between calculated and experimental values and the sensitivity of OpenFOAM to the tested parameters were considered to be generally in line with the simulations presented in the reviewed bibliographic sources.
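Idealized inlet vertical profiles for a neutral atmospheric boundary layer are conventionally built from the log law. The sketch below assumes the standard log-law and k-ε inlet relations common in ABL CFD; the thesis' actual tool and coefficient choices may differ.

```python
import math

KAPPA = 0.41  # von Karman constant

def log_law_profile(z, u_star, z0):
    """Mean streamwise velocity at height z (m) for a neutral ABL,
    u(z) = (u*/kappa) * ln((z + z0) / z0), with friction velocity u_star
    and aerodynamic roughness length z0."""
    return (u_star / KAPPA) * math.log((z + z0) / z0)

def k_epsilon_inlet(z, u_star, z0, c_mu=0.09):
    """Idealized k and epsilon inlet values consistent with the log law."""
    k = u_star ** 2 / math.sqrt(c_mu)       # turbulent kinetic energy
    eps = u_star ** 3 / (KAPPA * (z + z0))  # dissipation rate
    return k, eps

# Example: 10 m above ground, u* = 0.6 m/s, z0 = 0.03 m (short grass)
u10 = log_law_profile(10.0, 0.6, 0.03)
```

Evaluating these functions over the mesh's inlet-face heights yields the vertical profiles that a preprocessing tool would write into the boundary-condition files.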
Abstract:
In recent years, mobile learning has emerged as an educational approach to reduce the constraint of learning location and adapt the teaching-learning process to all types of students. However, the large number and variety of Web-enabled devices poses challenges for Web content creators who want to automatically determine the delivery context and adapt the content to mobile devices. In this paper we study several approaches to adapting learning content to mobile phones. We present an architecture for delivering uniform m-Learning content to students in a higher education school. The system development is organized in two phases: first enabling the educational content for mobile devices and then adapting it to all the heterogeneous mobile platforms. With this approach, Web authors will not need to create specialized pages for each kind of device, since the content is automatically transformed to suit the capabilities of any mobile device, from WAP to XHTML MP-compliant devices.
Abstract:
Recent studies of mobile Web trends show a continuing explosion of mobile-friendly content. However, the increasing number and heterogeneity of mobile devices pose several challenges for Web programmers who want to automatically determine the delivery context and adapt the content to mobile devices. In this process, the device detection phase plays an important role, since an inaccurate detection can result in a poor mobile experience for the end-user. In this paper we compare the most promising approaches to mobile device detection. Based on this study, we present the architecture of a system to detect and deliver uniform m-Learning content to students in a higher education school. We focus mainly on the device capabilities repository, manageable and accessible through an API. We detail the structure of the capabilities XML Schema that formalizes the data within the device capabilities XML repository, and the REST Web Service API for selecting the device capabilities data corresponding to a specific request. Finally, we validate our approach by presenting access and usage statistics for the mobile Web interface of the proposed system, such as hits and new visitors, mobile platforms, average time on site and bounce rate.
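A device-capabilities record of the kind such an XML repository serves could, for illustration, be parsed as follows. The element and attribute names here are assumptions for the sketch, not the system's actual capabilities XML Schema.

```python
# Sketch of reading one device-capabilities XML record of the kind a REST
# capabilities API might return. Element/attribute names are illustrative.
import xml.etree.ElementTree as ET

SAMPLE_RECORD = """
<device id="apple_iphone">
  <capability name="markup">XHTML MP</capability>
  <capability name="max_image_width">320</capability>
</device>
"""

def parse_capabilities(xml_text):
    """Return a {capability name: value} dict for one device record."""
    root = ET.fromstring(xml_text)
    return {cap.get("name"): cap.text for cap in root.findall("capability")}

caps = parse_capabilities(SAMPLE_RECORD)
```

In a REST setting, a client would fetch such a record for the detected device (e.g. by an id derived from the User-Agent) and use the parsed capabilities to drive content adaptation.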
Abstract:
Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the wide number and heterogeneity of mobile devices pose several challenges for Web programmers who want automatic detection of the delivery context and adaptation of the content to mobile devices. Hence, the device detection phase plays an important role in this process. In this chapter, the authors compare the most used approaches to mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a higher education school. The authors focus mainly on the XML device capabilities repository and on the REST Web Service API for dealing with device data. For the former, the authors detail the respective capabilities schema and present a new caching approach; for the latter, they present an extension of the current API. Finally, the authors validate their approach by presenting the overall data and statistics collected through the Google Analytics service, in order to better understand the adoption of the mobile Web interface, its evolution over time, and its main weaknesses.
Abstract:
Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of the variation of soil properties, allowing us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labor-intensive and demands considerable manpower and equipment, both in field work and in laboratory analysis. Moreover, the sampling scheme to be used in a forest-field data collection campaign is not simple to design, as the chosen sampling strategies depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil at all is found, or if large trees bar the collection. Consequently, a proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some soil physical-chemical properties.
Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different collection tools were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results allowed us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but that a pre-defined grid assumption often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.
Abstract:
Earthquakes are associated with negative events, such as large numbers of casualties, destruction of buildings and infrastructure, or the emergence of tsunamis. In this paper, we apply Multidimensional Scaling (MDS) analysis to earthquake data. MDS is a set of techniques that produce spatial or geometric representations of complex objects, such that objects perceived to be similar/distinct in some sense are placed nearby/distant on the MDS maps. The interpretation of the charts is based on the resulting clusters, since MDS produces a different locus for each similarity measure. In this study, over three million seismic occurrences, covering the period from January 1, 1904 up to March 14, 2012, are analyzed. The events, characterized by their magnitude and spatiotemporal distributions, are divided into groups, either according to the Flinn–Engdahl seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Space-time and space-frequency correlation indices are proposed to quantify the similarities among events. MDS has the advantage of avoiding sensitivity to the non-uniform spatial distribution of seismic data resulting from poorly instrumented areas, and is well suited for assessing the dynamics of complex systems. MDS maps prove to be an intuitive and useful visual representation of the complex relationships present among seismic events, which may not be perceived on traditional geographic maps. Therefore, MDS constitutes a valid alternative to classic visualization tools for understanding the global behavior of earthquakes.
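The MDS step itself can be sketched with the classical (Torgerson) algorithm on a generic distance matrix; the paper's space-time and space-frequency correlation indices are not reproduced here, so the input below is a plain geometric example.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed n points in R^k from an
    n-by-n matrix of pairwise distances d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    b = -0.5 * j @ (d ** 2) @ j               # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:k]        # keep the k largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Three collinear points with pairwise distances 1, 1 and 2
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
coords = classical_mds(d, k=1)
```

For seismic data, `d` would be built from a dissimilarity (e.g. one minus a correlation index) between event groups, and the resulting 2-D or 3-D coordinates form the MDS map whose clusters are then interpreted.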
Abstract:
In recent years, mobile learning has emerged as an educational approach to reduce the constraint of learning location and adapt the teaching-learning process to all types of students. However, the large number and variety of Web-enabled devices poses challenges for Web content creators who want to automatically determine the delivery context and adapt the content to mobile devices. This paper studies several approaches to adapting learning content to mobile phones. It presents an architecture for delivering uniform m-Learning content to students in a higher education school. The system development is organized in two phases: first enabling the educational content for mobile devices and then adapting it to all the heterogeneous mobile platforms. With this approach, Web authors will not need to create specialized pages for each kind of device, since the content is automatically transformed to suit the capabilities of any mobile device, from WAP to XHTML MP-compliant devices.
Abstract:
Cosmic microwave background (CMB) radiation is the imprint of an early stage of the Universe, and investigation of its properties is crucial for understanding the fundamental laws governing the structure and evolution of the Universe. Measurements of the CMB anisotropies are decisive for cosmology, since any cosmological model must explain them. The brightness, strongest at microwave frequencies, is almost uniform in all directions, but tiny variations reveal a spatial pattern of small anisotropies. Active research is underway, seeking better interpretations of the phenomenon. This paper analyses the recent data from the perspective of fractional calculus. By taking advantage of the inherent memory of fractional operators, some hidden properties are captured and described.
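The memory property of fractional operators mentioned above can be illustrated with the Grünwald–Letnikov definition, in which the derivative at t is a weighted sum over the whole past of the signal. This is a generic sketch of the definition, not the paper's actual analysis of CMB data.

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_j = (-1)^j * C(alpha, j),
    computed by the standard recurrence w_j = w_{j-1} * (j - 1 - alpha) / j."""
    w = [1.0]
    for j in range(1, n + 1):
        w.append(w[-1] * (j - 1 - alpha) / j)
    return w

def gl_derivative(f, t, alpha, h=1e-3, n=2000):
    """Approximate the fractional derivative of order alpha of f at t,
    using the truncated Grunwald-Letnikov sum with step h. Every past
    sample f(t - j*h) contributes: this is the operator's 'memory'."""
    w = gl_weights(alpha, n)
    return sum(w[j] * f(t - j * h) for j in range(n + 1)) / h ** alpha

# For alpha = 1 the weights reduce to (1, -1, 0, ...): a backward difference.
d1 = gl_derivative(lambda t: t, 2.0, 1.0)      # d/dt of t, expect ~1
d_half = gl_derivative(lambda t: t, 2.0, 0.5)  # half-derivative of t
```

For f(t) = t the half-derivative has the closed form 2·sqrt(t/π), which the truncated sum approaches as h shrinks; that long-memory weighting is what distinguishes fractional from integer-order operators.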
Abstract:
Master's in Chemical Engineering - Energy Optimization in the Chemical Industry branch
Abstract:
This dissertation focused on a technical and economic study of two future scenarios for continuing to supply thermal energy to an existing swimming-pool complex in the Tâmega valley region. The existing cogeneration plant has exceeded its operating licence and needs to be replaced. The two scenarios under study are the purchase of a new natural-gas boiler to cover the thermal load of the existing fuel-oil boiler, or the use of a compact cogeneration system that may be available from a company of the group. In the first scenario the investment involved is about €456,640, with no revenue beyond meeting the thermal requirements; in the second scenario the results are quite different, even though an investment of €1,000,000 in the installation would be required. For this scenario, the national legislation on cogeneration was surveyed and building data were collected, such as operating hours, number of users, consumption of electricity, heat and water, pool water temperature and hall air temperature, as well as the main characteristics of the compact cogeneration installation. With this information, mass and energy balances were carried out and a model of the new installation was built in process modelling software (Aspen Plus® from AspenTech). The thermal and electrical efficiencies obtained for the new compact cogeneration plant were 38.1% and 39.8% respectively, with losses of 12.5%, yielding an overall efficiency of 78%. The primary energy saving evaluated for this compact cogeneration installation was 19.6%, which allows it to be classified as high-efficiency. The model made it possible to understand the energy needs, determine some of the costs associated with the process and simulate the operation of the unit at different ambient air temperatures (summer and winter scenarios with average temperatures of 20 °C and 5 °C).
The results revealed a decrease of €1.14/h in the cost of electricity and an increase of €62.47/h in natural gas consumption during the coldest winter period, due to the increased losses caused by the lower outdoor temperature. With this new compact cogeneration unit, the total annual saving can average €267,780, assuming a maintenance cost of €97,698/year. On that basis, the project pays back the investment after 5 years, with an NPV of €1,030,430 and an internal rate of return (IRR) of 14% (positive, considering a discount rate of 3% over a 15-year lifetime). Although the initial cost is high, the economic parameters show that the project is economically viable and will generate profit for about 9 years.
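The economic figures quoted in the abstract can be checked with standard NPV/IRR formulas. The sketch below assumes a constant net annual saving (gross saving minus maintenance) over the 15-year lifetime, an assumption the abstract does not state explicitly.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is at t = 0 (investment as a negative flow)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-8):
    """Internal rate of return by bisection on npv(rate) = 0, assuming the
    usual sign pattern (one negative outflow followed by positive inflows)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Illustrative check against the dissertation's figures: a 1,000,000 euro
# investment and a net saving of 267,780 - 97,698 euro/year for 15 years.
flows = [-1_000_000] + [267_780 - 97_698] * 15
value = npv(0.03, flows)  # close to the reported NPV of ~1,030,430 euro
rate = irr(flows)
```

With these assumptions the 3% NPV lands very near the reported value, which suggests the abstract's NPV was computed on the net-of-maintenance saving.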
Abstract:
The steady evolution of bonded-joint technology has made adhesive bonding attractive, with applications in a wide range of industries. This is due not only to economic aspects, such as improved production rates, but also to the mechanical strength these joints provide. The ability to easily join dissimilar materials, a more uniform stress distribution, better fatigue resistance and a high vibration-damping capacity are among the main advantages of this type of joint. These properties make bonded joints among the preferred options when selecting a joining method. The work developed in this dissertation falls within the scope of adhesive bonding, and its main objectives are the production of a tool for manufacturing bulk adhesive specimens and the determination of their tensile mechanical properties, in order to assess the performance of the manufactured mould. To this end, a brittle adhesive (Araldite® AV 138), a ductile one (Araldite® 2015) and a very ductile one (SikaForce® 7888) were used. In parallel, the most suitable method for obtaining these specimens was selected, namely choosing between casting in an open mould and injection into a closed mould. To produce the specimens, a steel mould was designed and built. Using a Shimadzu AG-X 100 tensile testing machine, the respective tensile tests were carried out to determine all the mechanical properties of the adhesives. For comparison purposes, two types of extensometer were used: a mechanical one and an optical one. The experimental results showed that the presence of voids especially affected the failure strain and the failure stress. Small discrepancies relative to published studies were detected in some of the mechanical properties obtained for the various adhesives.
A slight mismatch between the values acquired with the two types of extensometer was also observed.
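As an illustration of how a tensile modulus is extracted from such tests, the sketch below fits the initial linear region of a stress-strain curve by least squares. The data are synthetic, merely of the order of magnitude reported in the literature for a stiff epoxy such as Araldite® AV 138, not measurements from this work.

```python
def youngs_modulus(strain, stress, elastic_limit=0.01):
    """Estimate Young's modulus (same units as stress) as the least-squares
    slope of the stress-strain curve restricted to strains <= elastic_limit."""
    pts = [(e, s) for e, s in zip(strain, stress) if e <= elastic_limit]
    n = len(pts)
    se = sum(e for e, _ in pts)
    ss = sum(s for _, s in pts)
    see = sum(e * e for e, _ in pts)
    ses = sum(e * s for e, s in pts)
    return (n * ses - se * ss) / (n * see - se * se)

# Synthetic, perfectly linear data at E = 4900 MPa up to 1% strain
strain = [0.000, 0.002, 0.004, 0.006, 0.008, 0.010]
stress = [4900 * e for e in strain]  # MPa
E = youngs_modulus(strain, stress)
```

With real extensometer data the choice of `elastic_limit` matters, since including points past the proportional limit biases the slope low; comparing the fit from the mechanical and the optical extensometer readings would quantify the mismatch the abstract mentions.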
Abstract:
Robotica 2012: 12th International Conference on Autonomous Robot Systems and Competitions, April 11, 2012, Guimarães, Portugal
Abstract:
Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning Systems, extreme lighting variations and geometrically smooth surfaces may be expected. So far, the state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant, a limitation that stems from the scan matching algorithms used to solve the localization and registration problems. This paper contributes to expanding the modelling capabilities to structures characterized by uniform geometry and smooth surfaces, such as road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the Extended Kalman Filter with inertial data, we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted with the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected through 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing measurements from a high-grade Inertial Navigation System and L1/L2 RTK-GPS data acquired outside the tunnel.
Results from the localization strategy are presented and analyzed.
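The role of inertial measurements in the EKF prediction step can be illustrated with a minimal 1-D sketch: measured acceleration enters the prediction as a control input, which is what anchors the metric scale. This is only the accelerometer-as-control-input idea, not the paper's full MonoSLAM state with inverse-depth features.

```python
import numpy as np

def predict(x, P, a_meas, dt, sigma_a=0.1):
    """Kalman prediction of a 1-D [position, velocity] state, with a measured
    acceleration a_meas (e.g. from an IMU) used as the control input."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])              # constant-velocity transition
    B = np.array([0.5 * dt ** 2, dt])       # acceleration (control) input
    G = B.reshape(2, 1)                     # noise enters through acceleration
    Q = (sigma_a ** 2) * (G @ G.T)          # process noise covariance
    x_new = F @ x + B * a_meas
    P_new = F @ P @ F.T + Q
    return x_new, P_new

x = np.array([0.0, 1.0])    # start at the origin, moving at 1 m/s
P = np.eye(2) * 0.01
x, P = predict(x, P, a_meas=0.0, dt=0.1)
```

In a full MonoSLAM filter the same mechanism runs on a 6DOF state: because the accelerometer reading is in metric units, the predicted displacement pins down the scale that a purely monocular filter cannot observe.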