130 results for "Acoustic architecture - Computer simulation" at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This thesis aims to describe and demonstrate a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without using quantitative information. The concept intends to combine the power of analysis of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's reflection process about performance aspects. Throughout the thesis, the problem is investigated in three fields: the professional, technical and theoretical fields. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) investigate previous research on the use of simulation tools and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria from each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, allowed the identification of some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
BRITTO, Ricardo S.; MEDEIROS, Adelardo A. D.; ALSINA, Pablo J. Uma arquitetura distribuída de hardware e software para controle de um robô móvel autônomo. In: SIMPÓSIO BRASILEIRO DE AUTOMAÇÃO INTELIGENTE, 8., 2007, Florianópolis. Anais... Florianópolis: SBAI, 2007.
Abstract:
This Master's dissertation presents a comparative study of internal air temperature data, simulated with the thermal simulation application DesignBuilder 1.2, and data registered in loco with HOBO® Temp Data Loggers, in a Social Housing Prototype (HIS) located at the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following the thermal comfort strategies recommended for the local climate, using cellular concrete panels, by Construtora DoisA, a collaborator of the research project REPESC - Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), part of the Habitare program. The methodology comprised a careful examination of the problem and a literature review, analyzing the major aspects related to computer simulation of the thermal performance of buildings, such as the climate characterization of the region under study and the users' thermal comfort requirements. The DesignBuilder 1.2 application was used as the simulation tool, and theoretical alterations were made to the prototype and compared with the adopted thermal comfort parameters, based on the current technical literature of the field. The comparative studies were analyzed through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used to characterize the external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN). The author also compared the TRY data with data registered in the years 2006, 2007 and 2008 at the Davis Precision Station weather station, located at the Instituto Nacional de Pesquisas Espaciais (INPE-CRN, National Institute of Space Research), in an area neighboring UFRN's Central Campus. The conclusions drawn from the comparisons between the computer simulations and the records obtained from the studied prototype point out that simulating naturally ventilated buildings is quite a complex task, mainly due to the application's limitations in handling the complexity of air flow phenomena, the influence of the comfort conditions of the surrounding areas and the climate records. Regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation, although it needs some adjustments to improve the reliability of its use. Continued research is needed, considering the occupancy of the prototype by users as well as the thermal loads of the equipment, in order to check the model's sensitivity.
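As a minimal sketch of the kind of comparison described above (simulated versus logged internal air temperatures), the Python snippet below computes a mean bias error and an RMSE. The function name, the chosen metrics and all temperature values are illustrative assumptions, not data or procedures from the dissertation.

```python
import math

def compare_series(simulated, measured):
    """Return mean bias error (MBE) and RMSE between two equal-length series, in degC."""
    diffs = [s - m for s, m in zip(simulated, measured)]
    mbe = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return mbe, rmse

# Illustrative values only, not the study's simulated or logged data
sim = [29.1, 30.4, 31.0, 30.2]   # hourly internal air temperature, simulated
obs = [28.7, 30.9, 31.5, 30.0]   # hourly internal air temperature, logged
mbe, rmse = compare_series(sim, obs)
print(f"MBE = {mbe:+.2f} degC, RMSE = {rmse:.2f} degC")
```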
Abstract:
Natural ventilation is the most important passive strategy for providing thermal comfort in hot and humid climates and a significant low-energy strategy. However, a naturally ventilated building requires more attention in the architectural design than a conventional building with air conditioning systems, and the results are less reliable. Therefore, this thesis focuses on software tools and methods to predict natural ventilation performance from the point of view of the architect, with limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled due to its simplified geometry, low cost and occurrence on the local campus. Firstly, the study emphasized the use of computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building. A series of approaches was developed to make the simulations feasible, at the cost of some fidelity in the results. Secondly, the results of the CFD simulations were used as input to an energy tool to simulate the thermal performance under different air renewal rates. Thirdly, the resulting temperatures were assessed in terms of thermal comfort. Complementary simulations were carried out to detail the analyses. The results show the potential of these tools; however, the discussions concerning the simplifications of the approaches, the limitations of the tools and the level of knowledge of the average architect are the major contribution of this study.
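The coupling step described above, in which CFD results feed an energy tool as air renewal rates, can be sketched with the usual conversion from a volumetric flow rate to air changes per hour (ACH = flow rate x 3600 / room volume). The snippet below is an illustration under assumed units and hypothetical values; it does not reproduce the thesis workflow or its tools.

```python
def air_changes_per_hour(flow_rate_m3_s: float, room_volume_m3: float) -> float:
    """ACH = volumetric flow rate (m3/s) times 3600 s/h, divided by room volume (m3)."""
    return flow_rate_m3_s * 3600.0 / room_volume_m3

room_volume = 7.2 * 7.2 * 3.0          # hypothetical room, m3
cfd_flow_rates = [0.15, 0.40, 0.80]    # hypothetical CFD-derived flow rates, m3/s
for q in cfd_flow_rates:
    print(f"Q = {q:.2f} m3/s -> {air_changes_per_hour(q, room_volume):.1f} ACH")
```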
Abstract:
In this work, we propose a solution to the scalability problem found in large-scale collaborative, virtual and mixed reality environments that use the hierarchical client-server model. Basically, we use a hierarchy of servers: when the capacity of a server is reached, a new server is created as a child of the first one, and the system load is distributed between them (parent and child). We propose efficient tools and techniques for solving problems inherent to the client-server model, such as the definition of clusters of users, the distribution and redistribution of users across the servers, and some mixing and filtering operations that are necessary to reduce the flow between servers. The new model was tested in simulation, in emulation and in interactive applications that were implemented. The results of these experiments show improvements over the traditional, previous models, indicating the applicability of the proposed model to all-to-all communication problems. This is the case of interactive games and other Internet applications (including multi-user environments), as well as interactive applications of the Brazilian Digital Television System to be developed by the research group. Keywords: large scale virtual environments, interactive digital tv, distributed
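The load-splitting idea summarized above (a server that reaches its capacity spawns a child server and redistributes part of its users to it) can be sketched as follows. The class, the capacity value and the halving policy are hypothetical simplifications, not the dissertation's actual algorithm.

```python
class Server:
    def __init__(self, name: str, capacity: int = 4):
        self.name = name
        self.capacity = capacity
        self.users = []
        self.children = []

    def add_user(self, user: str) -> "Server":
        if len(self.users) < self.capacity:
            self.users.append(user)
            return self
        # Capacity reached: create a child server and split the current load with it.
        child = Server(f"{self.name}.{len(self.children) + 1}", self.capacity)
        self.children.append(child)
        half = len(self.users) // 2
        child.users, self.users = self.users[half:], self.users[:half]
        return child.add_user(user)

root = Server("root", capacity=4)
for i in range(10):
    root.add_user(f"user{i}")
print(len(root.users), [len(c.users) for c in root.children])
```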
Abstract:
In this work, we present a hardware-software architecture for controlling the autonomous mobile robot Kapeck. The hardware of the robot is composed of a set of sensors and actuators organized on a CAN bus. Two embedded computers and eight microcontroller-based boards are used in the system. One of the computers hosts the vision system, due to the significant processing needs of this kind of system. The other computer is used to coordinate and access the CAN bus and to accomplish the other activities of the robot. The microcontroller-based boards are used with the sensors and actuators. The robot has this distributed configuration in order to exhibit good real-time behavior, where the response time and the temporal predictability of the system are important. We adopted the hybrid deliberative-reactive paradigm in the proposed architecture to reconcile the reactive behavior of the sensor-actuator network and the deliberative activities required to accomplish more complex tasks.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
This work proposes a hardware architecture, described in VHDL, developed to embed an Artificial Neural Network (ANN) of the Multilayer Perceptron (MLP) type. The idea is that, with this architecture, applications could easily embed several different MLP network topologies for the industrial field. The MLP topology in which the architecture is configured is defined by a simple and specific data input (instructions) that determines the number of layers and perceptrons of the network. In order to support several MLP topologies, datapath components and a controller were developed to execute these instructions. Thus, a user defines a group of previously known instructions which determine the ANN characteristics, and the system guarantees the MLP execution through the neural processors (perceptrons), the datapath components and the controller that were developed. On the other hand, the biases and weights must be static: the ANN to be embedded must have been previously trained off-line. The user does not need to know the internal characteristics of the system or the VHDL language. A reconfigurable FPGA device was used to implement, simulate and test the whole system, allowing its application to several real everyday problems.
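A software analogue of the embedded MLP described above is sketched below: the topology is given as a simple description, and the weights and biases are static, as if trained off-line. This Python sketch only mirrors the idea; it is not the VHDL architecture itself, and all numbers are illustrative.

```python
import math

def forward(x, layers):
    """layers: list of (weights, biases); weights is a list of per-neuron weight lists."""
    for weights, biases in layers:
        x = [math.tanh(sum(w * xi for w, xi in zip(neuron_w, x)) + b)
             for neuron_w, b in zip(weights, biases)]
    return x

# Topology "instruction": 2 inputs -> 2 hidden perceptrons -> 1 output,
# with static, previously trained (here: made-up) weights and biases.
layers = [
    ([[0.5, -0.3], [0.8, 0.2]], [0.1, -0.1]),   # hidden layer
    ([[1.0, -1.0]],             [0.0]),         # output layer
]
print(forward([0.7, 0.2], layers))
```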
Abstract:
There is still a high demand for quality control in the manufacturing processes of mechanical parts. This keeps alive the need for the inspection of final products, ranging from dimensional analysis to the chemical composition of the products. Usually this task may be carried out through various non-destructive and destructive methods that ensure the integrity of the parts. The results generated by these modern inspection tools, however, are not able to geometrically define the real damage and, therefore, cannot be properly displayed on a computer screen. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. There are some commercial software packages that address the stages of design and simulation of mechanical parts in order to predict possible damage and diminish potential undesirable events. However, the challenge of developing software capable of integrating the various activities of design, product inspection and non-destructive testing results, as well as the simulation of damage, still needs the attention of researchers. This was the motivation to conduct a methodological study for the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software applied to the design and simulation of mechanical parts under stress. This research presents interesting results obtained from the use of the developed kernel, showing that it was successfully applied to design case studies involving parts with specific geometries, namely mechanical prostheses, heat exchangers and oil and gas piping. Finally, the conclusions regarding the experience of merging CAD and CAE theories to develop the kernel, so as to result in a tool adaptable to various applications of the metalworking industry, are presented.
Abstract:
Multi-agent system designers need to be able to determine the quality of their systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, embodied agent researchers have aimed to create agents capable of affective and natural interaction with users that produces a beneficial or desirable result. To this end, several studies proposing architectures of agents with emotions arose, without appropriate methods for assessing these architectures. The objective of this study is to propose a methodology for evaluating emotional agent architectures, which assesses the quality attributes of the architectural design and, through the evaluation of human-computer interaction, the effects on the subjective experience of users of applications that implement it. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes evaluated are extensibility, modularity and complexity. In assessing the effects on the users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, as well as the average time and average number of attempts. We experimented with this approach and evaluated five emotional agent architectures: BDIE, DETT, Camurra-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated with respect to human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In the assessment of the users' subjective experience, the differences between the architectures were insignificant.
Abstract:
This work analyses the integration of thermal and energy simulation into the design process during the early design stages, based on six practical cases. It aims to schematize the integration process, identifying the contributions of the thermal and energy analyses at each design phase and the parameters with the highest impact on building performance. The simulations were run in the DesignBuilder energy tool, which uses the validated EnergyPlus engine. This tool was chosen due to its flexible and user-friendly graphical interface for modeling and output assessment, including parametric simulation to compare design alternatives. The six case studies comprise three architectural and three retrofit projects, and the author ran the simulations either as a consultant or as a designer. The case studies were selected based on the commitment of the designers to achieving performance goals, and on their availability to share the process from the early pre-design analyses, allowing the whole process to be schematized and the design decisions to be supported with quantifications, including energy targets. The integration of thermal and energy performance analyses is feasible from the early stages, except when only a short time is available to run the simulations. The simulation contributions are more important during the sketch and detail design phases, while the pre-design phase can be assisted by reliable bioclimatic guidelines. It was verified that every case study had two design variables dominating the overall performance. These variables differ according to the building characteristics and always coincide with the local bioclimatic strategies. The earlier an alternative is considered, the better it is adapted to the design. The use of simulation proved very useful: to the architects, to prove and convince; to the retrofit designer, to quantify the cost-benefit and payback period; and to the simulation specialist, to confirm the desired result and report the performance to the client.
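The parametric comparison of design alternatives mentioned above can be illustrated by ranking alternatives against a simulated performance indicator. The alternative names and energy figures below are hypothetical, not results from the six case studies.

```python
# Illustrative sketch only: ranking design alternatives by a simulated indicator.
alternatives = {
    "base case":           142.0,   # kWh/m2.year, illustrative
    "shading + low WWR":   118.5,
    "cool roof":           127.3,
    "natural ventilation": 109.8,
}

baseline = alternatives["base case"]
for name, energy in sorted(alternatives.items(), key=lambda kv: kv[1]):
    saving = 100.0 * (baseline - energy) / baseline
    print(f"{name:22s} {energy:6.1f} kWh/m2.year  ({saving:4.1f}% vs base case)")
```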
Abstract:
The city of Natal has significant daylight availability, although its use is not systematically explored in school architecture. In this context, this research aims to establish procedures for the analysis of daylight performance in school design in Natal-RN. The method of analysis comprises the Visible Sky Factor (VSF), simulation and analysis of the results. The annual variation of daylight behavior requires the adoption of dynamic simulation as the data procedure. The classrooms were modelled in SketchUp, simulated in the Daysim program and the results were assessed by means of spreadsheets in Microsoft Excel. The classroom dimensions are 7.20 m x 7.20 m, with window-to-wall ratios (WWR) of 20%, 40% and 50%, and with different shading devices, such as a standard horizontal overhang, a sloped overhang, a standard horizontal overhang with side view protection, a standard horizontal overhang with a dropped edge, a standard horizontal overhang with three horizontal louvers, a double standard horizontal overhang, and a double standard horizontal overhang with three horizontal louvers, plus the use of a light shelf in half of the models with WWR of 40% and 50%. The data were organized in spreadsheets with two UDI intervals: between 300 lux and 2000 lux, and between 300 lux and 3000 lux. The simulation was performed with the 2009 weather file for the city of Natal-RN. The graphical outputs are illuminance curves, isolines of UDI between 300 lux and 2000 lux, and tables with the index of glare occurrences and the UDI between 300 lux and 3000 lux. The best UDI 300-2000 lux performance was found for Phase 1 (models with WWR of 20%) and Phase 2 (models with WWR of 40% and 50% with light shelf). The best UDI 300-3000 lux performance was found for Phase 1 (models with WWR of 20% and 40% with light shelf) and Phase 2 (models with WWR of 40% and 50% with light shelf). The outputs show that daylight quality depends mainly on the efficacy of the shading system in avoiding glare, which determines daylight discomfort. The bioclimatic recommendation of large openings with partial shading (with an opening receiving direct sunlight) resulted in illuminance levels higher than the acceptable upper threshold. Increasing the shading percentage (from 73% to 91%) in medium-sized openings (WWR 40% and 50%) reduced or eliminated glare without compromising the daylight zone depth (7.20 m). The passive zone was determined for classrooms with satisfactory daylight performance, and the daylight zone depth rule-of-thumb was calculated as the ratio between the daylight zone depth and the window height for different opening sizes. The ratio ranged from 1.54 to 2.57 for WWR of 20%, 40% and 50%, respectively. There was a reduction or elimination of glare in the passive area with a light shelf or with awning window shading.
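The UDI tallies used above amount to counting, for each point, the share of occupied hours whose simulated illuminance falls within a useful range (300-2000 lux or 300-3000 lux). The sketch below illustrates that calculation with hypothetical hourly values, not Daysim output.

```python
def udi_fraction(illuminances_lux, low=300.0, high=2000.0):
    """Share of hours with illuminance within [low, high] lux."""
    within = sum(1 for e in illuminances_lux if low <= e <= high)
    return within / len(illuminances_lux)

hourly_lux = [120, 450, 900, 1800, 2600, 3200, 700, 280]   # hypothetical hours
print(f"UDI 300-2000 lux: {udi_fraction(hourly_lux):.0%}")
print(f"UDI 300-3000 lux: {udi_fraction(hourly_lux, high=3000.0):.0%}")
```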
Abstract:
Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease characterized by progressive muscle weakness that leads the patient to death, usually due to respiratory complications. Thus, as the disease progresses the patient requires noninvasive ventilation (NIV) and constant monitoring. This paper presents a distributed architecture for homecare monitoring of nocturnal NIV in patients with ALS. The implementation of this architecture used single-board computers and mobile devices placed in the patients' homes, to display alert messages for caregivers, and a web server for remote monitoring by the healthcare staff. The architecture used software based on fuzzy logic and computer vision to capture data from the mechanical ventilator screen and generate alert messages with instructions for caregivers. The monitoring was performed on 29 patients for 7 continuous hours daily during 5 days, generating a total of 126000 samples for each monitored variable at a sampling rate of one sample per second. The system was evaluated regarding the hit rate of character recognition and its correction through an algorithm for the detection and correction of errors. Furthermore, a healthcare team evaluated the system regarding the time intervals at which the alert messages were generated and the correctness of such messages. The system showed an average hit rate of 98.72%, and 98.39% in the worst case. As for the messages generated, the system agreed 100% with the overall assessment, with disagreement in only 2 cases with one of the physician evaluators.
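The sampling figures quoted above follow directly from the monitoring setup: one sample per second for 7 hours over 5 nights gives 7 x 3600 x 5 = 126000 samples per variable. The sketch below reproduces that arithmetic and adds a simple hit-rate helper; the recognition counts are hypothetical.

```python
SAMPLING_RATE_HZ = 1
HOURS_PER_NIGHT = 7
NIGHTS = 5
samples_per_variable = SAMPLING_RATE_HZ * HOURS_PER_NIGHT * 3600 * NIGHTS
print(samples_per_variable)   # 126000, matching the abstract

def hit_rate(correct: int, total: int) -> float:
    """Character-recognition hit rate as a percentage."""
    return 100.0 * correct / total

print(f"{hit_rate(124387, 126000):.2f}%")   # hypothetical counts
```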