37 results for "automatic environment systems"
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The purpose of this research is to analyze different daylighting systems in schools in the city of Natal/RN. Although daylight is abundantly available locally, architectural recommendations relating sky conditions, the dimensions of daylighting systems, shading, sky-visibility fraction, required illuminance, glare, occupation period and the depth of the lit area are scarce and diffuse. This research explores selected aperture systems to assess the daylighting potential of each. The method is divided into three phases. The first is modeling, which involves building a three-dimensional model of a classroom in SketchUp 2014, following recommendations from the literature for good environmental comfort in school settings. The second is dynamic computer simulation of daylight performance with the Daysim software; the input data are the 2009 climate file for the city of Natal/RN, the classroom volume in 3ds format with the optical properties assigned to each surface, the sensor mapping file and the user load file. The simulation results are organized in a spreadsheet prepared by Carvalho (2014) to determine the occurrence of useful daylight illuminance (UDI) in the 300 to 3000 lux range, and to build illuminance curves and UDI contour graphs that identify the uniformity of light distribution, compliance with the minimum illuminance level and the occurrence of glare.
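The UDI tally the spreadsheet performs can be illustrated with a minimal sketch (the function name and sample values below are ours, not from the thesis): count the fraction of occupied hours in which a sensor's illuminance falls inside the 300-3000 lux band.

```python
# Hypothetical sketch of the UDI (useful daylight illuminance) computation:
# the fraction of hourly readings within the useful band.

def udi_fraction(illuminances, lower=300.0, upper=3000.0):
    """Fraction of hourly illuminance readings (lux) within [lower, upper]."""
    if not illuminances:
        return 0.0
    inside = sum(1 for lux in illuminances if lower <= lux <= upper)
    return inside / len(illuminances)

# One sensor over six occupied hours (values in lux).
readings = [150.0, 420.0, 980.0, 2500.0, 3400.0, 700.0]
print(udi_fraction(readings))  # 4 of the 6 readings fall in the band
```

Repeating this per sensor yields the spatial UDI distribution from which the contour graphs are drawn.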
Abstract:
Binary systems are key environments for studying the fundamental properties of stars. In this work, we analyze 99 binary systems identified by the CoRoT space mission. From the study of their phase diagrams, our sample is divided into three groups: systems whose variability is due to the binary eclipses; those presenting strong modulations, probably caused by stellar spots on the star's surface; and those whose variability is associated with the expansion and contraction of the surface layers. For eclipsing binaries, the phase diagrams are used to estimate a morphological classification based on the study of equipotential surfaces. To determine the rotation period, identify the presence of active regions, investigate whether a star exhibits differential rotation, and study stellar pulsation, we apply the wavelet procedure. The wavelet transform has been used as a powerful tool for a large number of problems in astrophysics; through it, one can perform a time-frequency analysis of light curves rich in details that contribute significantly to the study of phenomena associated with rotation, magnetic activity and stellar pulsation. We apply the 6th-order Morlet wavelet, which offers high time and frequency resolution, and obtain local (energy distribution of the signal) and global (time integration of the local map) wavelet power spectra. Using this analysis, we identify thirteen systems with periodicities related to rotational modulation, besides the beating-pattern signature in the local wavelet map of five pulsating stars over the entire time span.
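The local wavelet power computation can be sketched as follows; this is a simplified, self-contained Morlet convolution (our toy implementation, with no cone-of-influence or normalization subtleties), not the thesis's analysis pipeline.

```python
import cmath
import math

def morlet_power(signal_vals, scale, omega0=6.0):
    """Local wavelet power of a uniformly sampled signal at one scale,
    using a 6th-order Morlet mother wavelet (simplified sketch)."""
    n = len(signal_vals)
    half = int(4 * scale)  # truncate the wavelet support
    power = []
    for t in range(n):
        acc = 0j
        for k in range(-half, half + 1):
            if 0 <= t + k < n:
                u = k / scale
                # Morlet wavelet: plane wave times a Gaussian envelope.
                w = cmath.exp(1j * omega0 * u) * math.exp(-u * u / 2)
                acc += signal_vals[t + k] * w.conjugate()
        power.append(abs(acc / math.sqrt(scale)) ** 2)
    return power

# A sine wave of period 20 samples: power is largest at the matching scale.
period = 20.0
sig = [math.sin(2 * math.pi * t / period) for t in range(200)]
matched = morlet_power(sig, scale=period * 6.0 / (2 * math.pi))
mismatched = morlet_power(sig, scale=3.0)
print(max(matched) > max(mismatched))  # True
```

Averaging the local power over time gives the global wavelet power spectrum mentioned in the abstract.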
Abstract:
In academia, it is common to create didactic processors for practical courses in computer hardware; they can also serve as subjects in software platform, operating system and compiler courses. Often, these processors are described without a standard ISA, which requires creating compilers and other basic software to provide the hardware/software interface and hinders their integration with other processors and devices. Reconfigurable devices described in an HDL allow the creation or modification of any microarchitecture component, so the functional units of the processor's data path, as well as the state machine that implements the control unit, can be altered as new needs arise. In particular, RISP processors enable the modification of machine instructions, allowing instructions to be added or changed and even adaptation to a new architecture. Taking educational soft-core processors described in VHDL as its object of study, this work proposes a methodology and applies it to two processors of different complexity levels, showing that it is possible to adapt processors to a standard ISA without increasing hardware complexity, i.e. without a significant increase in chip area, while their performance in application execution remains unchanged or is enhanced. The implementations also show that, besides being possible to replace the architecture of a processor without changing its organization, a RISP processor can switch between different instruction sets, which can be extended to toggling between different ISAs, allowing a single processor to become an adaptive hybrid architecture usable in embedded systems and heterogeneous multiprocessor environments.
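The instruction-set switching idea can be illustrated in software (a hedged sketch, not the VHDL implementation): a processor whose decode table is swapped at run time while its organization, here a single accumulator, stays the same.

```python
# Toy model of a RISP-style processor: the ISA is a replaceable decode table.

class SwitchableCPU:
    def __init__(self):
        self.acc = 0   # the organization (data path state) never changes
        self.isa = {}  # opcode -> semantics; replaceable at run time

    def load_isa(self, table):
        self.isa = table  # swap the entire instruction set at once

    def run(self, program):
        for opcode, arg in program:
            self.isa[opcode](self, arg)
        return self.acc

# Two hypothetical ISAs sharing the same data path.
ISA_A = {"ADD": lambda cpu, v: setattr(cpu, "acc", cpu.acc + v)}
ISA_B = {"SUB": lambda cpu, v: setattr(cpu, "acc", cpu.acc - v)}

cpu = SwitchableCPU()
cpu.load_isa(ISA_A)
print(cpu.run([("ADD", 5), ("ADD", 3)]))  # 8
cpu.load_isa(ISA_B)
print(cpu.run([("SUB", 2)]))              # 6: state survives the ISA switch
```

The point mirrors the abstract's claim: the architecture (instruction set) is replaced while the organization (the accumulator data path) is untouched.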
Abstract:
SANTOS, Raimunda Fernanda dos; SILVA, Eliane Ferreira da. A importância da Arquitetura da Informação no planejamento de ambientes digitais inclusivos. In: SEMINÁRIO DE PESQUISA DO CENTRO DE CIÊNCIAS SOCIAIS APLICADAS, 17., 2012, Natal/RN. Anais... Natal/RN: Centro de Ciências Sociais Aplicadas, 2012. Oral presentation.
Abstract:
This work has two objectives: to evaluate the usability of three distance-learning virtual environment interfaces using two evaluation techniques, and to identify the factors that influence the perceived usability of the evaluated environments. The distance-learning systems chosen were AulaNet, E-Proinfo and Teleduc, because they are developed in Brazil and freely distributed. The usability evaluation was carried out with two techniques documented in the literature. The first, a predictive (diagnostic) evaluation, was performed by the author and a finishing student of the Information Systems course at the Federal Center of Technological Education of Piauí (CEFET-PI), guided by a checklist called Ergolist. The second, a prospective evaluation, had the users themselves assess the interfaces through a questionnaire; the sample comprised 15 teachers and 15 students from CEFET-PI. The collected data were analyzed with descriptive statistics and chi-square tests. The results showed that the environments have adaptability problems: they are not flexible and do not take the user's experience into account. The inferential analysis found that time of Internet use did not significantly affect the usability evaluation of the three environments, and most usability variables were not influenced by user type, gender or education level. On the other hand, for several of the evaluated ergonomic criteria, the system variables (environment type and computer experience) and the demographic variable age group affected the perceived usability of the distance-learning virtual environments.
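The chi-square test used in the inferential analysis can be sketched for a 2x2 contingency table; the counts below are hypothetical, not the thesis's data.

```python
# Pearson chi-square statistic for a 2x2 contingency table, e.g.
# user type (teacher/student) vs. whether a usability criterion was rated adequate.

def chi_square_2x2(table):
    """Return the Pearson chi-square statistic (1 degree of freedom)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Hypothetical counts: 9 of 15 teachers vs. 6 of 15 students rated a criterion adequate.
stat = chi_square_2x2([[9, 6], [6, 9]])
print(round(stat, 3))  # compared against the critical value for 1 dof
```

A statistic below the 5% critical value (3.841 for 1 degree of freedom) would indicate no significant association, matching the kind of conclusion drawn for most demographic variables.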
Abstract:
This thesis proposes a method for a mobile robot to build a hybrid map of an indoor, semi-structured environment. The topological part of the map deals with spatial relationships among rooms and corridors: it is a topology-based map in which the nodes of the graph are rooms or corridors, and each link between two distinct nodes represents a door. The metric part of the map consists of a set of parameters describing a geometric figure that adapts to the free space of the local environment. This figure is calculated from a set of points that sample the boundaries of the local free space; these points are obtained with range sensors and knowledge of the robot's pose. A method based on the generalized Hough transform is applied to this set of points to obtain the geometric figure. Building the hybrid map is an incremental procedure, accomplished while the robot explores the environment. Each room is associated with a local metric map and, consequently, with a node of the topological map. During the mapping procedure, the robot may use recent metric information about the environment to improve its global or relative pose.
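The hybrid map structure can be sketched as a small data structure (names and parameters are ours, for illustration): a graph of rooms linked by doors, with a metric descriptor attached to each room.

```python
# Minimal sketch of a hybrid map: topological graph + per-room metric parameters.

class HybridMap:
    def __init__(self):
        self.metric = {}   # room name -> parameters of the fitted geometric figure
        self.doors = set() # undirected links between rooms

    def add_room(self, name, params):
        self.metric[name] = params

    def add_door(self, room_a, room_b):
        self.doors.add(frozenset((room_a, room_b)))

    def neighbours(self, room):
        """Rooms reachable from `room` through one door."""
        return sorted(r for link in self.doors if room in link
                      for r in link if r != room)

m = HybridMap()
# Hypothetical rectangles fitted to the local free space (metres).
m.add_room("room1", {"cx": 2.0, "cy": 3.0, "width": 4.0, "height": 5.0})
m.add_room("corridor", {"cx": 6.0, "cy": 3.0, "width": 10.0, "height": 2.0})
m.add_door("room1", "corridor")
print(m.neighbours("room1"))  # ['corridor']
```

In the thesis's scheme, the per-room parameters come from the generalized Hough transform fitted to the sampled free-space boundary.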
Abstract:
Maps obtained from remote-sensing orbital images submitted to digital processing have become fundamental to optimizing conservation and monitoring actions for coral reefs. However, the accuracy reached in mapping submerged areas is limited by variation in the water column, which degrades the signal received by the orbital sensor and introduces errors into the final classification. The limited capacity of traditional methods based on conventional statistical techniques to solve inter-class problems motivated the search for alternative strategies in the area of computational intelligence. In this work, an ensemble of classifiers was built by combining Support Vector Machines and a minimum distance classifier to classify remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the classification process is progressively refined: patterns that receive an ambiguous classification at one stage are re-evaluated in the subsequent stage, and an unambiguous prediction for all the data is reached by reducing or eliminating false positives. The images were classified into five bottom types: deep water, underwater corals, intertidal corals, algal bottom and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM with a polynomial kernel. Using an error matrix, the accuracy of the classified image was compared to the results obtained with other classification methods based on a single classifier (a neural network and the k-means algorithm). The comparison demonstrated the potential of ensembles of classifiers as a tool for classifying images of submerged areas subject to noise caused by atmospheric effects and the water column.
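The deferral mechanism can be illustrated with a minimum-distance stage on toy data (centroids and margin are ours, not the reef imagery): a pattern is labeled only when the two nearest class centroids are clearly separated, otherwise it is passed to the next stage.

```python
import math

def min_distance_stage(pattern, centroids, margin=1.0):
    """Minimum-distance classification with an ambiguity margin.
    Returns a class label, or None if the decision is deferred."""
    dists = sorted((math.dist(pattern, c), label) for label, c in centroids.items())
    (best, best_label), (second, _) = dists[0], dists[1]
    # Ambiguous if the runner-up centroid is almost as close as the winner.
    return best_label if second - best >= margin else None

# Hypothetical 2-feature centroids for three bottom types.
centroids = {"sand": (8.0, 1.0), "coral": (2.0, 6.0), "deep": (1.0, 1.0)}
print(min_distance_stage((7.5, 1.2), centroids))  # 'sand': unambiguous
print(min_distance_stage((4.5, 3.5), centroids))  # None: deferred to the next stage
```

In the thesis's three-stage system the deferred patterns are re-evaluated by the SVM stages, which is how false positives are progressively eliminated.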
Abstract:
In this work, we propose a solution to the scalability problem found in large-scale collaborative virtual and mixed-reality environments that use the hierarchical client-server model. Basically, we use a hierarchy of servers: when the capacity of a server is reached, a new server is created as a child of the first one, and the system load is distributed between them (parent and child). We propose efficient tools and techniques for solving problems inherent to the client-server model, such as defining clusters of users, distributing and redistributing users across servers, and the mixing and filtering operations needed to reduce the flow between servers. The new model was tested in simulation, in emulation and in interactive applications that were implemented. The results of these experiments show improvements over the traditional models, indicating the applicability of the proposal to all-to-all communication problems. This is the case for interactive games and other Internet applications (including multi-user environments) and for interactive applications of the Brazilian Digital Television System to be developed by the research group. Keywords: large-scale virtual environments, interactive digital TV, distributed
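The splitting rule can be sketched as follows (a hedged toy model; capacities, names and the half-and-half split are our illustration, not the thesis's exact policy): when a server exceeds its capacity, it spawns a child and moves part of its users there.

```python
# Toy hierarchical server: exceeding capacity triggers a child server
# and a redistribution of users between parent and child.

class Server:
    def __init__(self, name, capacity):
        self.name, self.capacity = name, capacity
        self.users, self.children = [], []

    def join(self, user):
        self.users.append(user)
        if len(self.users) > self.capacity:
            child = Server("%s/child%d" % (self.name, len(self.children)),
                           self.capacity)
            half = len(self.users) // 2
            # Parent keeps the first half; the child takes the rest.
            child.users, self.users = self.users[half:], self.users[:half]
            self.children.append(child)

root = Server("root", capacity=4)
for i in range(5):
    root.join("user%d" % i)
print(len(root.users), len(root.children[0].users))  # 2 3
```

A real system would also cluster users by region of interest before splitting, so that the mixing and filtering traffic between parent and child stays small.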
Abstract:
The goal of this work is to propose a SLAM (Simultaneous Localization and Mapping) solution based on the Extended Kalman Filter (EKF), enabling a robot to navigate through the environment using information from odometry and pre-existing lines on the floor. Initially, a segmentation step is necessary to classify parts of the image as "floor" or "non-floor". Image processing then identifies floor lines, and the parameters of these lines are mapped to the world frame using a homography matrix. Finally, the identified lines are used as landmarks in the SLAM in order to build a feature map. In parallel, using the corrected robot pose, the uncertainty about the pose and the "non-floor" part of the image, it is possible to build an occupancy-grid map and generate a metric map describing the obstacles. Greater autonomy for the robot is attained by using the two types of map obtained (the metric map and the feature map); thus, path-planning tasks can run in parallel with localization and mapping. Practical results are presented to validate the proposal.
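The homography step amounts to a projective mapping of image coordinates onto the floor plane. A minimal sketch (the matrix below is a toy scale-only H; the real one comes from camera calibration):

```python
# Apply a 3x3 homography H (row-major) to an image point, with the usual
# division by the homogeneous coordinate.

def apply_homography(H, x, y):
    """Map an image point (x, y) to world coordinates."""
    xw = H[0][0] * x + H[0][1] * y + H[0][2]
    yw = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xw / w, yw / w

# Hypothetical H: 0.01 m per pixel, no rotation or perspective.
H = [[0.01, 0.0, 0.0],
     [0.0, 0.01, 0.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, 320, 240))  # (3.2, 2.4) in metres
```

Mapping the endpoints of each detected floor line this way yields the world-frame line parameters used as EKF landmarks.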
Abstract:
The advent of the Internet stimulated the appearance of several services. Examples are the communication services present in users' day-to-day lives: services such as chat and e-mail reach an increasing number of users, turning the Net into a powerful communication medium. The following work explores the use of conventional communication services on the Net infrastructure. We introduce the concept of communication social protocols applied to a shared virtual environment, and argue that communication tools have to be adapted to the Internet's potential. To do so, we draw on some theories from the Communication field and their applicability in a virtual environment context. We define a multi-agent architecture to support the offer of these services, as well as a software and hardware platform to support experiments using Mixed Reality. Finally, we present the obtained results, experiments and products.
Abstract:
In this work, we propose the Interperception paradigm, a new approach comprising a set of rules and a software architecture for merging users from different interfaces into the same virtual environment. The system detects the user's resources and transforms the data to allow its visualization in 3D, 2D and textual (1D) interfaces. This allows any user to connect, access information and exchange information with other users in a feasible way, without needing to change hardware or software. As results, two virtual environments built according to this paradigm are presented.
Abstract:
The Java Platform is increasingly being adopted in the development of distributed systems with high user demand. This kind of application is more complex because, beyond meeting the functional requirements, it needs to fulfill pre-established performance parameters. This work studies the Java Virtual Machine (JVM), approaching its internal aspects and exploring the garbage-collection strategies existing in the literature and used by the JVM. It also presents a set of tools that help in optimizing applications, and others that help in monitoring applications in the production environment. Due to the great number of technologies that aim to solve problems common to the application layer, it becomes difficult to choose the one with the best response time and the least memory usage. This work presents a brief introduction to each of the candidate technologies and carries out comparative tests through a statistical analysis of the response-time and garbage-collection-activity random variables. The obtained results give engineers and managers a basis for deciding which technologies to use in large applications, through knowledge of how they behave in their environments and the amount of resources they consume. The relation between a technology's productivity and its performance is also considered an important factor in this choice.
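The statistical comparison can be illustrated in miniature (sample values are fabricated for illustration, not the thesis's measurements): summarize the response-time samples of two candidate technologies by mean and dispersion, then prefer the one that is both faster on average and steadier.

```python
import statistics

def summarize(samples):
    """Mean and sample standard deviation of response-time samples (ms)."""
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical response times (ms) collected under the same load.
tech_a = [12.0, 14.0, 13.0, 15.0, 12.5]
tech_b = [11.0, 25.0, 9.0, 30.0, 10.0]

mean_a, sd_a = summarize(tech_a)
mean_b, sd_b = summarize(tech_b)
print(mean_a < mean_b and sd_a < sd_b)  # True: A is faster and steadier
```

High dispersion, as in the second sample, is precisely the symptom of long garbage-collection pauses that the thesis's analysis is designed to expose.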
Abstract:
The monitoring of patients in hospitals is usually done in a manual or semi-automated way, in which members of the healthcare team must constantly visit the patients to ascertain their health condition. This procedure, however, compromises the quality of the monitoring, since the shortage of physical and human resources in hospitals tends to overwhelm the healthcare team, preventing its members from visiting patients with adequate frequency. Given this, many works in the literature specify alternatives aimed at improving this monitoring through the use of wireless networks. In these works, the network is intended only for the traffic generated by medical sensors and cannot be allocated for transmitting data from applications on the user stations present in the hospital. For hospital automation environments this is a drawback, considering that the data generated by such applications can be directly related to the patient monitoring being conducted. Thus, this thesis defines Wi-Bio as a communication protocol for establishing IEEE 802.11 networks for patient monitoring, capable of enabling the harmonious coexistence of the traffic generated by medical sensors and by user stations. The formal specification and verification of Wi-Bio were carried out through the design and analysis of Petri net models, and its validation was performed through simulations with the Network Simulator 2 (NS2) tool. The NS2 simulations were designed to portray a real patient monitoring environment corresponding to a floor of the nursing wards sector of the University Hospital Onofre Lopes (HUOL), located in Natal, Rio Grande do Norte.
Moreover, to verify the feasibility of Wi-Bio with respect to the wireless network standards prevailing in the market, the test scenario was also simulated under a perspective in which the network elements used the HCCA access mechanism described in the IEEE 802.11e amendment. The results confirmed the validity of the designed Petri nets and showed that Wi-Bio, in addition to outperforming HCCA on most of the items analyzed, was also able to promote an efficient integration between the data generated by medical sensors and by user applications on the same wireless network.
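The Petri net formalism behind the specification can be sketched with a minimal interpreter (our toy model with invented place names, not the Wi-Bio nets themselves): a transition fires only when every input place holds a token.

```python
# Minimal Petri net firing rule: consume one token from each input place,
# produce one token in each output place.

def fire(marking, transition):
    """Fire a transition {'in': [...], 'out': [...]} if enabled.
    Returns the new marking (unchanged if the transition is not enabled)."""
    if any(marking.get(p, 0) < 1 for p in transition["in"]):
        return marking  # not enabled: some input place lacks a token
    new = dict(marking)
    for p in transition["in"]:
        new[p] -= 1
    for p in transition["out"]:
        new[p] = new.get(p, 0) + 1
    return new

# Hypothetical fragment: a sensor seizes the free medium to transmit.
marking = {"sensor_ready": 1, "medium_free": 1}
t_send = {"in": ["sensor_ready", "medium_free"], "out": ["sensor_sending"]}
marking = fire(marking, t_send)
print(marking["sensor_sending"], marking["medium_free"])  # 1 0
```

Analyses like the thesis's check properties such as the absence of deadlocks, i.e. that from every reachable marking some transition remains enabled.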