130 results for Acoustic architecture - Computer simulation
Abstract:
The present study investigates the conceptions held by teachers and the management team of Colégio Nossa Senhora das Neves - Natal/RN about curriculum, school architecture, and the possible relationships established between these components. The study draws on the theoretical contributions of Viñao Frago (2001), Escolano (2001), and Benconstta (2005), among others, regarding school architecture, and, with regard to curriculum, is anchored in the theoretical reflections of Silva (2000, 2006, 2008). We assume that the school space is a social construct and, as such, reflects the interests of certain groups in organizing it and in establishing ways of conditioning its functions and uses. In this space the lives of people are planned, both of those who work there and of those who study there. Thus, school architecture conveys, through representations, signs, symbols, and shapes, certain demands that affect the subjects' ways of being and acting, establishing the appropriation and expropriation of rights and legitimizing forms of inclusion and exclusion. It is therefore an expression of power: a power expressed in the way it directs how people should behave in a given space. Clarity about these aspects of school architecture is important because, just as the opinion of various experts (environmentalists, architects, engineers, planners) matters in discussing the adequacy of school architecture, teachers and managers must also understand its educational nature, so that they can contribute to making the school space conducive to multiple forms of learning. From this perspective, we analyze the conceptions of four teachers and eight members of the management team of the CNSN, whose views were gathered through participant observation, semi-structured interviews, and documentary analysis. The data indicated varied levels of curriculum conceptions, ranging from those rooted in traditional curriculum theories to those that tie the curriculum to discursive and contextual aspects. The conceptions of school architecture focused predominantly on its material aspects, and most subjects established, in different ways, relations between curriculum and school architecture.
Abstract:
Industrial automation is directly linked to the development of information technology. Better hardware solutions, as well as improvements in software development methodologies, have enabled the rapid growth of production process control. In this thesis, we propose an architecture that joins two technologies, one from the hardware field (industrial networks) and one from the software field (multiagent systems). The objective of this proposal is to combine these technologies in a multiagent architecture that allows control strategies to be implemented in field devices. With this, we intend to develop an agent architecture able to detect and solve problems that may occur in the industrial network environment. Our work allies machine learning with the industrial context, making the proposed multiagent architecture adaptable to unfamiliar or unexpected production environments. We use neural networks and present strategies for allocating these networks to industrial network field devices. With this, we intend to improve decision support at the plant level and to allow operations that are independent of human intervention.
Abstract:
Hospital automation has been the subject of a great deal of research, addressing relevant issues that can be automated, such as management and control (electronic medical records, appointment scheduling, hospitalization, among others); communication (tracking of patients, staff, and materials); development of medical, hospital, and laboratory equipment; monitoring (of patients, staff, and materials); and support for medical diagnosis (according to each speciality). This thesis presents an architecture for a patient monitoring and alert system. The architecture is based on intelligent systems techniques and is applied to hospital automation, specifically to patient monitoring in the Intensive Care Unit (ICU). Its main goal is to transform multiparameter monitor data into useful information: using specialist knowledge and the normal ranges of vital signs, a fuzzy logic model extracts information about the clinical condition of ICU patients and provides a pre-diagnosis. Alerts are then dispatched to medical professionals whenever an abnormality is found during monitoring. After validation of the architecture, the fuzzy inferences were used to train and validate an Artificial Neural Network that classifies the cases previously validated with the fuzzy system.
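As an illustration of the kind of fuzzy pre-diagnosis described in this abstract, the sketch below evaluates two vital signs against triangular membership functions and a tiny rule base; the ranges, rules, and severity values are assumptions for illustration only, not the thesis implementation.

```python
# Illustrative sketch only: a tiny fuzzy pre-diagnosis rule base over two vital
# signs (heart rate and SpO2). Membership ranges and rules are hypothetical,
# not taken from the thesis.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify_vitals(hr, spo2):
    """Degrees of membership for 'abnormal' conditions (assumed ranges)."""
    return {
        "hr_high":  tri(hr, 100, 140, 180),
        "hr_low":   tri(hr, 20, 40, 60),
        "spo2_low": tri(spo2, 75, 85, 94),
    }

def alert_level(hr, spo2):
    """Mamdani-style rules; output defuzzified as a weighted average of
    singleton severities (0 = normal, 1 = critical)."""
    m = fuzzify_vitals(hr, spo2)
    rules = [
        (max(m["hr_high"], m["hr_low"]), 0.6),   # abnormal heart rate -> warning
        (m["spo2_low"], 0.8),                    # low oxygen saturation -> severe
        (min(m["hr_high"], m["spo2_low"]), 1.0), # both together -> critical
    ]
    num = sum(w * s for w, s in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    severity = alert_level(hr=128, spo2=88)
    print(f"pre-diagnosis severity: {severity:.2f}")
    if severity > 0.5:
        print("dispatch alert to medical staff")
```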
Abstract:
Some approaches take advantage of unused computational resources in Internet nodes, that is, users' machines. In recent years, peer-to-peer (P2P) networks have gained momentum, mainly due to their support for scalability and fault tolerance. However, current P2P architectures present problems such as node overhead caused by message routing, a great number of node reconfigurations when the network topology changes, routing of traffic inside a specific network even when the traffic is not directed to a machine of that network, and the lack of a relationship between the proximity of nodes in the P2P overlay and their proximity in the IP network. Although some architectures use information about node distances in the IP network, they rely on methods that require dynamic information. In this work we propose a P2P architecture that addresses the aforementioned problems. It is composed of three parts. The first is a basic P2P architecture, called SGrid, which maintains a relationship between the position of nodes in the P2P network and their position in the IP network by assigning adjacent key regions to nodes of the same organization. The second is a protocol called NATal (Routing and NAT application layer), which extends the basic architecture in order to remove from the nodes the responsibility of routing messages. The third is a special kind of node, called LSP (Lightware Super-Peer), which is responsible for maintaining the P2P routing table. In addition, this work presents a simulator that validates the architecture and a module of the NATal protocol for use in Linux routers.
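The sketch below illustrates the general idea of giving nodes of the same organization adjacent key regions in a DHT-like key space; the key-space size, grouping policy, and function names are assumptions, not the SGrid algorithm itself.

```python
# Illustrative sketch only: partitioning a DHT-like key space so that nodes of
# the same organization receive adjacent key regions, as the SGrid idea above
# suggests. The key space size and the grouping policy are assumptions.

from collections import defaultdict

KEY_SPACE = 2 ** 16  # assumed identifier space [0, 2^16)

def assign_key_regions(nodes):
    """nodes: list of (node_id, organization). Returns {node_id: (start, end)}.

    Nodes are grouped by organization, organizations are laid out one after
    another, and each node gets a contiguous slice, so neighbours in the key
    space tend to be neighbours in the IP network as well."""
    by_org = defaultdict(list)
    for node_id, org in nodes:
        by_org[org].append(node_id)

    regions = {}
    region_size = KEY_SPACE // len(nodes)
    cursor = 0
    for org in sorted(by_org):               # deterministic layout per org
        for node_id in sorted(by_org[org]):  # adjacent slices inside the org
            regions[node_id] = (cursor, cursor + region_size - 1)
            cursor += region_size
    return regions

def responsible_node(key, regions):
    """Return the node whose region contains the key."""
    for node_id, (start, end) in regions.items():
        if start <= key <= end:
            return node_id
    return None  # keys beyond the last region (rounding leftover)

if __name__ == "__main__":
    nodes = [("n1", "org-A"), ("n2", "org-A"), ("n3", "org-B"), ("n4", "org-B")]
    regions = assign_key_regions(nodes)
    print(regions)
    print("key 40000 ->", responsible_node(40000, regions))
```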
Abstract:
Due to industry's current need to integrate production-process data originating from several sources and to transform them into useful information for decision making, there is an ever greater demand for information visualization systems that support this functionality. At the same time, a common practice nowadays, driven by the high competitiveness of the market, is the development of industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability, and web access. These characteristics provide extra agility and make it easier to adapt to frequent changes in market demand. Based on the arguments above, this work specifies a component-based architecture, together with the development of a system based on it, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information about variables originating from the production process. The work shows that the component-based architecture developed meets the requirements for obtaining a robust, reliable, easily maintained system, in agreement with industrial needs. The architecture also allows components to be added, removed, or updated at run time, through a web-based component manager, further strengthening the system's capacity for adaptation and updating.
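A minimal sketch of the run-time add/remove/update idea described above, using a simple in-process component registry; the class, component names, and data fields are hypothetical and stand in for the thesis's web-based component manager.

```python
# Illustrative sketch only: a minimal component registry allowing visualization
# components to be registered, replaced, or removed at run time, in the spirit
# of the component-based architecture described above. Names are hypothetical.

from typing import Callable, Dict

class ComponentRegistry:
    """Holds named visualization components (callables: reading -> str)."""

    def __init__(self) -> None:
        self._components: Dict[str, Callable[[dict], str]] = {}

    def add(self, name: str, component: Callable[[dict], str]) -> None:
        self._components[name] = component      # add or hot-update

    def remove(self, name: str) -> None:
        self._components.pop(name, None)

    def render_all(self, reading: dict) -> None:
        for name, component in self._components.items():
            print(f"[{name}] {component(reading)}")

def gauge_view(reading: dict) -> str:
    return f"{reading['tag']}: {reading['value']} {reading['unit']}"

def alarm_view(reading: dict) -> str:
    state = "ALARM" if reading["value"] > reading.get("limit", float("inf")) else "ok"
    return f"{reading['tag']} -> {state}"

if __name__ == "__main__":
    registry = ComponentRegistry()
    registry.add("gauge", gauge_view)
    registry.add("alarm", alarm_view)
    registry.render_all({"tag": "FT-101", "value": 7.2, "unit": "m3/h", "limit": 5.0})
    registry.remove("alarm")                    # removed without restarting
    registry.render_all({"tag": "FT-101", "value": 4.1, "unit": "m3/h", "limit": 5.0})
```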
Abstract:
Simulations based on cognitively rich agents can become a very intensive computing task, especially when the simulated environment represents a complex system. This situation becomes worse when time constraints are present. Such simulations would benefit from a mechanism that improves the way agents perceive and react to changes in these environments; in other words, an approach that improves the efficiency (performance and accuracy) of the decision process of autonomous agents in a simulation would be useful. In complex environments full of variables, not every piece of information available to the agent is necessarily needed for its decision-making process; what is needed depends on the task being performed. The agent therefore needs to filter incoming perceptions, in the same way that we do with our focus of attention. By using a focus of attention, only the information that really matters to the agent's running context is perceived (cognitively processed), which can improve the decision-making process. The architecture proposed herein structures cognitive agents in two parts: 1) a main part containing the reasoning/planning process, the knowledge, and the affective state of the agent, and 2) a set of behaviors that are triggered by the planner in order to achieve the agent's goals. Each of these behaviors has a focus of attention that is dynamically adjustable at run time, according to the variation of the agent's affective state. The focus of each behavior is divided into a qualitative focus, responsible for the quality of the perceived data, and a quantitative focus, responsible for the quantity of the perceived data. Thus, the behavior is able to filter the information sent by the agent's sensors and build a list of perceived elements containing only the information necessary to the agent, according to the context of the behavior that is currently running. Besides the human-like attention focus, the agent is also endowed with an affective state. The agent's affective state is based on theories of human emotion, mood, and personality, and serves as the basis for the mechanism that continuously adjusts the agent's attention focus, both qualitative and quantitative. With this mechanism, the agent can adjust its focus of attention during the execution of a behavior in order to become more efficient in the face of environmental changes. The proposed architecture can be used very flexibly: the foci of attention can be kept fixed (neither the qualitative nor the quantitative focus changes), or different combinations of qualitative and quantitative focus variation can be used. The architecture was built on a platform for BDI agents, but its design allows it to be used with any other type of agent, since the implementation affects only the perception layer of the agent. In order to evaluate the contribution proposed in this work, an extensive series of experiments was conducted on an agent-based simulation of a fire-spreading scenario. In the simulations, agents using the proposed architecture are compared with similar agents (with the same reasoning model) that process all the information sent by the environment. Intuitively, one would expect the omniscient agents to be more efficient, since they can consider every possible option before making a decision. However, the experiments showed that attention-focus-based agents can be as efficient as the omniscient ones, with the advantage of solving the same problems in significantly less time. Thus, the experiments indicate the efficiency of the proposed architecture.
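The sketch below shows one possible shape of such a perception filter, with a qualitative focus (which kinds of percepts pass) and a quantitative focus (how many pass) adjusted by a scalar standing in for the affective state; all class names, fields, and the adjustment rule are assumptions, not the thesis model.

```python
# Illustrative sketch only: a perception filter with a qualitative focus (which
# kinds of percepts pass) and a quantitative focus (how many pass), adjusted by
# a scalar "arousal" standing in for the affective state. All names and the
# adjustment rule are assumptions, not the thesis model.

from dataclasses import dataclass
from typing import List

@dataclass
class Percept:
    kind: str        # e.g. "fire", "smoke", "tree"
    relevance: float # 0..1, as estimated by the sensor layer

class AttentionFocus:
    def __init__(self, relevant_kinds, max_items):
        self.relevant_kinds = set(relevant_kinds)  # qualitative focus
        self.max_items = max_items                 # quantitative focus

    def adjust(self, arousal: float) -> None:
        """Higher arousal -> wider focus (more percepts admitted)."""
        self.max_items = max(1, int(round(self.max_items * (1.0 + arousal))))

    def filter(self, percepts: List[Percept]) -> List[Percept]:
        selected = [p for p in percepts if p.kind in self.relevant_kinds]
        selected.sort(key=lambda p: p.relevance, reverse=True)
        return selected[: self.max_items]

if __name__ == "__main__":
    focus = AttentionFocus(relevant_kinds={"fire", "smoke"}, max_items=2)
    sensed = [
        Percept("tree", 0.9), Percept("fire", 0.8),
        Percept("smoke", 0.4), Percept("fire", 0.6),
    ]
    print([p.kind for p in focus.filter(sensed)])   # ['fire', 'fire']
    focus.adjust(arousal=0.7)                       # affective state changed
    print([p.kind for p in focus.filter(sensed)])   # now admits more percepts
```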
Abstract:
In this work we propose the Interperception paradigm, a new approach that comprises a set of rules and a software architecture for merging users of different interfaces into the same virtual environment. The system detects the user's resources and transforms the data so that it can be visualized on 3D, 2D, and textual (1D) interfaces. This allows any user to connect, access information, and exchange information with other users in a feasible way, without any need to change hardware or software. As results, two virtual environments built according to this paradigm are presented.
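A rough sketch of the 3D/2D/1D degradation idea mentioned above: one virtual-environment entity rendered differently depending on the client's capability; the entity fields and degradation rules are invented for illustration and are not the Interperception rules themselves.

```python
# Illustrative sketch only: transforming one virtual-environment entity for
# clients with different capabilities (3D, 2D, or text-only), in the spirit of
# the Interperception idea above. The entity fields and the degradation rules
# are assumptions.

from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    x: float
    y: float
    z: float
    mesh: str  # identifier of a 3D model

def render_for(entity: Entity, capability: str):
    """Return the representation suited to the client's interface."""
    if capability == "3d":
        return {"mesh": entity.mesh, "position": (entity.x, entity.y, entity.z)}
    if capability == "2d":
        # drop the vertical axis and the mesh; keep a map-like view
        return {"icon": entity.name, "position": (entity.x, entity.y)}
    # textual (1D) fallback
    return f"{entity.name} is at ({entity.x:.0f}, {entity.y:.0f})"

if __name__ == "__main__":
    avatar = Entity("avatar-joana", 10.0, 4.0, 1.7, mesh="humanoid_01")
    for cap in ("3d", "2d", "text"):
        print(cap, "->", render_for(avatar, cap))
```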
Abstract:
The motivation for this work was the need for a software architecture that supports the development of a SCADA supervisory system for monitoring simulated industrial processes, with the flexibility to add intelligent modules and devices such as PLCs according to the specifications of the problem. In the present study, we developed an intelligent supervisory system on top of a simulation of a distillation column modeled in Unisim. OLE Automation was used for communication between the supervisory and simulation software, which, together with the use of a database, resulted in an architecture that is both scalable and easy to maintain. Moreover, intelligent modules were developed for preprocessing, feature extraction, and variable inference, based fundamentally on the Encog library.
Abstract:
Microstrip antennas are constantly in evidence in current research due to the several advantages they present. Fractal geometry, coupled with the good performance and convenience of planar structures, is an excellent combination for the design and analysis of ever smaller, multi-resonant, and broadband structures. This geometry has been applied to microstrip patch antennas to reduce their size and to highlight their multi-band behavior. Compared with conventional microstrip antennas, quasi-fractal patch antennas have lower resonance frequencies, enabling the manufacture of more compact antennas. The aim of this work is the design of quasi-fractal patch antennas by applying Koch and Minkowski fractal curves to the radiating and non-radiating edges of a conventional rectangular patch fed by an inset-fed microstrip line, initially designed for the frequency of 2.45 GHz. The inset-fed technique is investigated for the impedance matching of the fractal antennas, which are fed through microstrip lines. Its efficiency is investigated experimentally and compared with simulations carried out in the commercial software Ansoft Designer, used for precise analysis of the electromagnetic behavior of the antennas by the method of moments, and with the proposed neural model. The dissertation reviews the literature on microstrip antenna theory and on fractal geometry, emphasizing its various forms, techniques for generating fractals, and their applicability. It also presents a study of artificial neural networks, showing the types and architectures of the networks used and their characteristics, as well as the training algorithms employed; the parameter-update equations for the networks used in this study were derived from the gradient method. Research is also carried out with emphasis on the miniaturization of the proposed structures, showing how an antenna designed with fractal contours can miniaturize a conventional rectangular patch antenna. The study further includes modeling, through artificial neural networks, of several electromagnetic parameters of the quasi-fractal antennas. The results demonstrate the excellent capacity of neural techniques for modeling microstrip antennas; all the algorithms used in this work to obtain the proposed models were implemented in the commercial simulation software Matlab 7. In order to validate the results, several antenna prototypes were built, measured on a vector network analyzer, and simulated in software for comparison.
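As background for the fractal contours mentioned above, the sketch below generates the points of a generic Koch curve iteration; the segment length and number of iterations are arbitrary examples, not the dissertation's antenna dimensions.

```python
# Illustrative sketch only: generating the points of a Koch curve iteration, the
# kind of fractal contour applied to the patch edges described above. This is a
# generic Koch construction, not the dissertation's antenna geometry.

import cmath

def koch_curve(p0: complex, p1: complex, iterations: int):
    """Return the ordered points of a Koch curve between p0 and p1."""
    if iterations == 0:
        return [p0, p1]
    delta = (p1 - p0) / 3
    a = p0 + delta                                   # first third
    b = p0 + 2 * delta                               # second third
    tip = a + delta * cmath.exp(-1j * cmath.pi / 3)  # apex of the indentation
    points = []
    for s, e in ((p0, a), (a, tip), (tip, b), (b, p1)):
        segment = koch_curve(s, e, iterations - 1)
        points.extend(segment[:-1])                  # avoid duplicating joints
    points.append(p1)
    return points

if __name__ == "__main__":
    # One edge of a 29.4 mm-wide patch (a hypothetical dimension), 2 iterations.
    pts = koch_curve(0 + 0j, 29.4 + 0j, iterations=2)
    print(f"{len(pts)} points")
    print([(round(p.real, 2), round(p.imag, 2)) for p in pts[:5]], "...")
```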
Abstract:
The exponential growth of radio frequency (RF) applications is accompanied by great challenges, such as the more efficient use of the spectrum and the design of new architectures for multi-standard receivers or software-defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band receiver that is reconfigurable, low cost, and low power, with a higher level of integration and flexibility. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, using the five-port junction as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique, based on independent component analysis and requiring no knowledge of training or pilot sequences of the transmitted signal, for the regeneration of the I/Q components in five-port downconversion, by exploiting the statistical properties of the three output signals.
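The sketch below shows the general principle of blind I/Q regeneration by independent component analysis, using scikit-learn's FastICA on three synthetic mixtures; the mixing matrix and QPSK-like sources are stand-ins for the five-port front end, not the thesis's signal model.

```python
# Illustrative sketch only: blind separation of I/Q components from mixed
# observations using FastICA (scikit-learn). The mixing matrix below is an
# arbitrary stand-in for the five-port front end, not the thesis's model.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Hypothetical baseband I/Q sources: two independent QPSK-like symbol streams.
n_samples = 2000
I = rng.choice([-1.0, 1.0], size=n_samples)
Q = rng.choice([-1.0, 1.0], size=n_samples)
sources = np.column_stack([I, Q])

# Three "five-port outputs" as unknown linear mixtures of I and Q plus noise.
mixing = np.array([[0.9, 0.3],
                   [0.4, 0.8],
                   [0.7, 0.6]])
observations = sources @ mixing.T + 0.05 * rng.standard_normal((n_samples, 3))

# Blind regeneration: no pilots, only the statistics of the observations.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observations)

# The recovered components match I and Q up to scaling, sign, and ordering,
# which is the usual ICA ambiguity; correlation magnitude shows the match.
corr = np.corrcoef(np.column_stack([sources, recovered]).T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```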
Abstract:
In this work we propose a new approach to Interactive Digital Television (IDTV), aimed at exploring the concept of immersivity. Several architectures have been proposed for IDTV, but they do not address questions related to immersion coherently. The goal of this thesis is to define formally what immersion and interactivity mean for digital TV and how they can be used to improve the user experience in this new television model. The approach raises questions such as the appropriate choice of equipment to support the sense of immersion; which forms of interaction between users can be exploited in the interaction-immersion context; whether the environment in which an immersive and interactive application is used can influence the user experience; and which new forms of interactivity between users, and between users and interactive applications, can be explored through immersion. As one of the goals of this proposal, we point out new solutions to these issues that require further study. We intend to formalize the concepts surrounding interactivity in the Brazilian digital TV system. In an initial study, this definition is organized into categories, or levels, of interactivity. From this point, analyses and specifications are made to achieve immersion in DTV. We intend to carry out case studies of immersive interactive applications for digital television in order to validate the proposed architecture. We also address the use of remote devices and propose a middleware architecture that allows their use in conjunction with immersive interactive applications.
Abstract:
The objective of the present work is to develop a model to simulate electrical energy networks in transient and steady states, using the ATP (Alternative Transients Program) software, as a way to join two distinct themes present in the classical network planning methodology: short-circuit analysis and load flow theory. Beyond that, using a relay simulation tool, this work uses the newly developed model to investigate the influence of transient phenomena on the operation of protection relays and to calibrate the utility's protection relays. To test the model, relays actually installed at COSERN were used.
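For context on relay calibration, the sketch below implements the IEC 60255 standard-inverse time-overcurrent characteristic, a common basis for setting protection relays; the pickup current and time-multiplier values are arbitrary examples and are not COSERN settings or the thesis's results.

```python
# Illustrative sketch only: the IEC standard-inverse time-overcurrent
# characteristic often used when calibrating protection relays. The pickup
# current and time-multiplier values below are arbitrary examples, not COSERN
# settings.

def iec_standard_inverse(i_fault: float, i_pickup: float, tms: float) -> float:
    """Operating time (s) of an IEC 60255 standard-inverse relay element:
    t = TMS * 0.14 / ((I/Is)^0.02 - 1), valid for I > Is."""
    m = i_fault / i_pickup
    if m <= 1.0:
        return float("inf")  # below pickup: the relay does not operate
    return tms * 0.14 / (m ** 0.02 - 1.0)

if __name__ == "__main__":
    i_pickup = 400.0   # A (example setting)
    tms = 0.2          # time multiplier setting (example)
    for i_fault in (800.0, 2000.0, 5000.0):
        t = iec_standard_inverse(i_fault, i_pickup, tms)
        print(f"fault {i_fault:6.0f} A -> trip in {t:.2f} s")
```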
Abstract:
This work describes the development of models in the DIgSILENT PowerFactory program for transient stability studies of power systems with wind turbines. The main goal is to provide the means to use a widely published dynamic simulation program for power systems and to employ it as a tool that helps in evaluating the results of programs used for this purpose. The simulation and result-analysis process starts after the model configuration phase, during which the results obtained with DIgSILENT PowerFactory are compared with those of ATP, the internationally recognized program chosen for validation. The main tools and guidelines for using PowerFactory are presented, directed toward the solution of the problem at hand. The simulation uses a real system to which a wind farm will be connected. Two different wind turbine technologies were implemented: a doubly-fed induction generator with a frequency converter connecting the rotor to the stator and to the grid, and a synchronous wind generator with a frequency converter interconnecting the generator to the grid. Besides presenting the basic concepts of dynamic simulation, the implemented control strategies and the turbine and converter models are described. The stability of the grid-connected wind turbine is analyzed under many operating conditions resulting from different kinds of disturbances.
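As general background for transient stability studies of this kind, the sketch below integrates the classical single-machine infinite-bus swing equation through a fault-and-clear scenario; the machine constants, reactances, and clearing time are textbook-style assumptions, not the thesis's wind-farm models or data.

```python
# Illustrative sketch only: the classical single-machine infinite-bus swing
# equation, the basic model behind transient stability studies like the one
# above. Machine constants and the fault scenario are textbook-style
# assumptions, not the thesis's wind-farm models.

import math

H = 3.5          # inertia constant (s), assumed
F = 60.0         # system frequency (Hz)
PM = 0.9         # mechanical power (pu)
E, V = 1.05, 1.0 # internal and infinite-bus voltages (pu)
X_PRE, X_FAULT, X_POST = 0.65, 1.80, 0.80  # transfer reactances (pu)
T_CLEAR = 0.15   # fault-clearing time (s)

def p_electrical(delta: float, x: float) -> float:
    return E * V / x * math.sin(delta)

def simulate(t_end=2.0, dt=1e-3):
    omega_s = 2 * math.pi * F
    delta = math.asin(PM * X_PRE / (E * V))  # pre-fault equilibrium angle
    d_omega = 0.0                            # speed deviation (rad/s)
    t, trace = 0.0, []
    while t < t_end:
        x = X_FAULT if t < T_CLEAR else X_POST
        accel = (omega_s / (2 * H)) * (PM - p_electrical(delta, x))
        d_omega += accel * dt                # explicit Euler integration
        delta += d_omega * dt
        trace.append((t, math.degrees(delta)))
        t += dt
    return trace

if __name__ == "__main__":
    trace = simulate()
    peak = max(angle for _, angle in trace)
    print(f"maximum rotor angle: {peak:.1f} deg "
          f"({'stable swing' if peak < 180 else 'loss of synchronism'})")
```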
Abstract:
Foundation Fieldbus industrial networks are a high-standard technology that allows users to create complex, fully decentralized control logic. Although they are very advanced, they still have some limitations imposed by the technology itself. Attempting to overcome one of these limitations, this work describes how to design a fuzzy controller in a Foundation Fieldbus network using its basic programming elements, the function blocks, so that the network remains fully independent of devices other than the instruments that constitute it. In addition, a tool was developed to aid the process of building the fuzzy controller, setting the internal parameters of the function blocks and indicating how many and which blocks should be used for a given structure. The biggest challenge in creating this controller is precisely the choice of blocks and how to arrange them so that they perform the same functions as a fuzzy controller implemented in another kind of environment. The methodology adopted was to split each phase of a traditional fuzzy controller and then create simple structures of function blocks to implement them. At the end of the work, the developed controller is compared with a fuzzy controller implemented in a mathematical program that has a dedicated tool for the development of fuzzy controllers, producing comparative performance graphs for both.
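The sketch below shows the phase decomposition referred to above for a generic fuzzy controller, with fuzzification, rule evaluation, and defuzzification kept as separate functions (the role the function blocks play in the work); the membership limits and rules are arbitrary examples, not the controller built in the thesis.

```python
# Illustrative sketch only: a traditional fuzzy controller split into its three
# classical phases (fuzzification, rule evaluation, defuzzification), the same
# decomposition that the work above reproduces with Foundation Fieldbus
# function blocks. Membership limits and rules are arbitrary examples.

def trap(x, a, b, c, d):
    """Trapezoidal membership function."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def fuzzify(error):
    """Phase 1: map the control error to linguistic terms."""
    return {
        "negative": trap(error, -10, -10, -4, 0),
        "zero":     trap(error, -2, -0.5, 0.5, 2),
        "positive": trap(error, 0, 4, 10, 10),
    }

def evaluate_rules(memberships):
    """Phase 2: rule base mapping each input term to an output singleton."""
    singletons = {"negative": -1.0, "zero": 0.0, "positive": 1.0}
    return [(memberships[term], singletons[term]) for term in memberships]

def defuzzify(activated):
    """Phase 3: weighted-average (centroid of singletons) defuzzification."""
    num = sum(w * out for w, out in activated)
    den = sum(w for w, _ in activated)
    return num / den if den > 0 else 0.0

def fuzzy_controller(setpoint, measurement):
    error = setpoint - measurement
    return defuzzify(evaluate_rules(fuzzify(error)))

if __name__ == "__main__":
    for pv in (44.0, 49.5, 50.0, 53.0):
        print(f"PV={pv:5.1f} -> control action {fuzzy_controller(50.0, pv):+.2f}")
```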
Abstract:
This work deals with the hardware implementation of an OFDMA baseband processor for the LTE downlink. LTE (Long Term Evolution) is the latest stage of development of the third-generation (3G) mobile technology, offering an increase in data rate and more efficiency and flexibility in transmission through advanced antenna and multicarrier techniques. In its physical layer, this technology applies OFDMA (Orthogonal Frequency Division Multiple Access) for signal generation and mapping of physical resources in the downlink, and it has the OFDM multicarrier technique (Orthogonal Frequency Division Multiplexing) as its theoretical basis. With the recent completion of the LTE specifications, different hardware solutions have been developed, mainly at the symbol-processing level, where the implementation of a baseband OFDMA processor is commonly considered, since it is also a basic architecture for other important applications. For the implementation of the processor, reconfigurable hardware offered by devices such as FPGAs is considered, which not only helps to meet LTE's high requirements of flexibility and adaptability but also makes a quick and efficient implementation possible. The processor implemented in reconfigurable hardware meets the LTE physical layer specifications and has the flexibility necessary to meet other standards and applications that use an OFDMA processor as the basic architecture of their systems. The results obtained through simulation and functional verification confirm the functionality and flexibility of the implemented processor.
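To make the underlying OFDM operation concrete, the sketch below maps QPSK symbols onto subcarriers, takes the IFFT, and prepends a cyclic prefix, then inverts the process; the FFT size, data-subcarrier count, and prefix length are generic examples, not the LTE parameter set used in the thesis hardware.

```python
# Illustrative sketch only: the core OFDM operation behind an OFDMA baseband
# processor, namely mapping QPSK symbols onto subcarriers, taking the IFFT, and
# prepending a cyclic prefix. The FFT size and prefix length are generic
# examples, not the LTE parameter set used in the thesis hardware.

import numpy as np

N_FFT = 64          # number of subcarriers (assumed)
N_CP = 16           # cyclic prefix length in samples (assumed)
N_DATA = 48         # subcarriers actually carrying data (assumed)

rng = np.random.default_rng(1)

def qpsk_symbols(n):
    bits = rng.integers(0, 2, size=(n, 2))
    return ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

def ofdm_symbol(data):
    """Map data symbols to the central subcarriers, IFFT, add cyclic prefix."""
    freq = np.zeros(N_FFT, dtype=complex)
    start = (N_FFT - N_DATA) // 2
    freq[start:start + N_DATA] = data
    time = np.fft.ifft(np.fft.ifftshift(freq)) * np.sqrt(N_FFT)
    return np.concatenate([time[-N_CP:], time])     # cyclic prefix + symbol

def ofdm_demod(rx):
    """Remove the cyclic prefix and recover the data subcarriers."""
    freq = np.fft.fftshift(np.fft.fft(rx[N_CP:]) / np.sqrt(N_FFT))
    start = (N_FFT - N_DATA) // 2
    return freq[start:start + N_DATA]

if __name__ == "__main__":
    tx_data = qpsk_symbols(N_DATA)
    symbol = ofdm_symbol(tx_data)
    rx_data = ofdm_demod(symbol)                     # ideal channel
    print("samples per OFDM symbol:", symbol.size)   # 64 + 16 = 80
    print("max reconstruction error:", np.max(np.abs(rx_data - tx_data)))
```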