794 results for BIM, Building Information Modeling, Cloud Computing, CAD, FM, GIS
Abstract:
In this work, a perceptron neural-network technique is applied to estimate hourly values of the diffuse solar radiation at the surface in São Paulo City, Brazil, using as input the global solar radiation and other meteorological parameters measured from 1998 to 2001. The neural-network verification was performed using the hourly measurements of diffuse solar radiation obtained during the year 2002. The neural network was developed based on both feature determination and pattern selection techniques. It was found that the inclusion of the atmospheric long-wave radiation as input improves the neural-network performance. On the other hand, traditional meteorological parameters, such as air temperature and atmospheric pressure, are not as important as long-wave radiation, which acts as a surrogate for cloud-cover information on the regional scale. An objective evaluation has shown that the diffuse solar radiation is better reproduced by the neural-network synthetic series than by a correlation model. (C) 2004 Elsevier Ltd. All rights reserved.
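To make the approach concrete, the sketch below trains a small multilayer perceptron to estimate diffuse radiation from global and atmospheric long-wave radiation, the two inputs the abstract highlights. The synthetic data, input choices, and network size are illustrative assumptions, not the authors' actual configuration.

```python
# Hypothetical sketch of an MLP estimating hourly diffuse solar radiation
# from global and long-wave radiation; data and architecture are stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in hourly measurements: global radiation (W/m^2) and atmospheric
# long-wave radiation (W/m^2) as inputs, diffuse radiation as the target.
X = rng.uniform([0.0, 250.0], [1000.0, 450.0], size=(1000, 2))
y = 0.4 * X[:, 0] + 0.1 * (X[:, 1] - 250.0) + rng.normal(0.0, 20.0, 1000)

model = make_pipeline(
    StandardScaler(),                       # inputs on very different scales
    MLPRegressor(hidden_layer_sizes=(10,),  # one small hidden layer
                 max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict([[600.0, 350.0]]))      # estimated diffuse radiation (W/m^2)
```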
Abstract:
Searching for an understanding of how the brain supports conscious processes, cognitive scientists have proposed two main classes of theory: Global Workspace and Information Integration theories. These theories seem to be complementary, but both still lack grounding in terms of the brain mechanisms responsible for the production of coherent and unitary conscious states. Here we propose, following James Robertson's "Astrocentric Hypothesis", that conscious processing is based on analog computing in astrocytes. The "hardware" for these computations is calcium waves mediated by adenosine triphosphate signaling. Besides presenting our version of this hypothesis, we also review recent findings on astrocyte morphology that lend support to their functioning as Local Hubs (composed of protoplasmic astrocytes) that integrate synaptic activity, and as a Master Hub (composed, in the human brain, of a combination of interlaminar, fibrous, polarized, and varicose projection astrocytes) that integrates whole-brain activity.
Astrocytes and human cognition: Modeling information integration and modulation of neuronal activity
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The design and implementation of an ERP system involve capturing the information necessary to implement the system's structure and behavior in support of enterprise management. This process should start at the enterprise modeling level and finish at the coding level, going down through different abstraction layers. In the case of Free/Open Source ERP, the lack of proper modeling methods and tools jeopardizes the advantages of source code availability. Moreover, the distributed, decentralized decision-making, source-code driven development culture of open source communities generally does not rely on methods for modeling the higher abstraction levels necessary for an ERP solution. The aim of this paper is to present a model driven development process for the open source ERP ERP5. The proposed process covers the different abstraction levels involved, taking into account well established standards and common practices, as well as new approaches, by supplying Enterprise, Requirements, Analysis, Design, and Implementation workflows. Copyright 2008 ACM.
Abstract:
Includes bibliography
Abstract:
Advances in computer graphics techniques and in computer processing capacity have made possible applications such as the modeling of human anatomical structures, in order to investigate diseases, plan surgery, or even provide images for training Computer-Aided Diagnosis (CAD) systems. In this context, this work presents an anatomical model of cardiac structures represented in a three-dimensional environment. The model was built from geometrical elements and has anatomical details, such as the different tunics that compose the cardiac wall, with measurements that preserve the characteristics found in real structures. The anatomical model was validated through quantitative comparisons with measurements of real structures available in the specialized literature. The results obtained, evaluated by two specialists, are compatible with real anatomies and respect the anatomical particularities. This degree of representation will allow verification of the influence of radiological parameters, morphometric peculiarities, and the stage of cardiac disease on the quality of the images, as well as on the performance of the CAD. © 2010 IEEE.
Abstract:
Technologies are developing rapidly, but some of those present in computers, such as processing capacity, are reaching their physical limits. It falls to quantum computation to offer solutions to these limitations and to the issues that may arise. In the field of information security, encryption is of paramount importance, motivating the development of quantum methods in place of classical ones, given the computational power offered by quantum computing. In the quantum world, physical states can be interrelated, giving rise to the phenomenon called entanglement. This study presents both a theoretical essay on the fundamentals of quantum mechanics, computing, information, cryptography, and quantum entropy, and some simulations, implemented in the C language, of the effects of the entanglement entropy of photons in a data transmission, using the von Neumann entropy and the Tsallis entropy.
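The two entropy measures named above have compact definitions: the von Neumann entropy S = -Tr(ρ log₂ ρ) and the Tsallis entropy S_q = (1 - Tr ρ^q)/(q - 1). The study's simulations were written in C; the sketch below uses Python for brevity and evaluates both measures on the reduced state of a partially entangled photon pair. It is a hypothetical illustration, not the study's code.

```python
# Illustrative sketch (Python rather than the C of the study) of the two
# entropy measures, evaluated on the reduced density matrix of a partially
# entangled two-qubit (photon-pair) state.
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerically zero weights
    return float(-np.sum(evals * np.log2(evals)))

def tsallis_entropy(rho, q):
    """S_q = (1 - Tr(rho^q)) / (q - 1), for entropic index q != 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float((1.0 - np.sum(evals ** q)) / (q - 1.0))

# |psi> = a|00> + b|11>; tracing out one photon leaves rho = diag(a^2, b^2).
a2 = 0.8
rho = np.diag([a2, 1.0 - a2])

print(von_neumann_entropy(rho))   # 0 for a product state, 1 when maximally entangled
print(tsallis_entropy(rho, q=2))  # Tsallis entropy with q = 2
```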
Abstract:
Modeling is a necessary step in performing a finite element analysis. Different methods of model construction are reported in the literature, such as Bio-CAD modeling. The purpose of this study was to evaluate and apply two methods of Bio-CAD modeling of a human edentulous hemi-mandible in finite element analysis. A stereolithographic model was reconstructed from CT scans of a dried human skull. Two modeling methods were used: an STL conversion approach combined with STL simplification (Model 1) and a reverse engineering approach (Model 2). For the finite element analysis, the action of the lateral pterygoid muscle was used as the loading condition to assess total displacement (D), equivalent von Mises stress (VM), and maximum principal stress (MP). The two models differed in geometry with respect to the number of surfaces (1834 for Model 1; 282 for Model 2) and in the finite element mesh (30,428 nodes/16,683 elements for Model 1; 15,801 nodes/8,410 elements for Model 2). The D, VM, and MP stress areas showed similar distributions in the two models, but the maximum and minimum values differed: D ranged from 0 to 0.511 mm (Model 1) and from 0 to 0.544 mm (Model 2); VM stress from 6.36E-04 to 11.4 MPa (Model 1) and from 2.15E-04 to 14.7 MPa (Model 2); and MP stress from -1.43 to 9.14 MPa (Model 1) and from -1.2 to 11.6 MPa (Model 2). Of the two Bio-CAD modeling methods, reverse engineering gave a better anatomical representation than the STL conversion approach. The models differed in finite element mesh, total displacement, and stress distribution.
Abstract:
The Frequency-Modulated Atomic Force Microscope (FM-AFM) is a powerful tool for performing surface investigations with true atomic resolution. The control system of the FM-AFM must keep both the frequency and the amplitude of oscillation of the microcantilever constant during the scanning of the sample. However, tip-sample interaction forces cause modulations in the microcantilever motion. A Phase-Locked Loop (PLL) is used as a demodulator and to generate the feedback signal for the FM-AFM control system. The PLL performance is vital to the FM-AFM performance, since the image information lies in the modulated microcantilever motion. Nevertheless, little attention is paid to PLL performance in the FM-AFM literature. Here, the FM-AFM control system is simulated, comparing the performance of different PLL designs.
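As a rough illustration of the demodulation task, the sketch below simulates a digital PLL tracking the instantaneous frequency of a frequency-modulated tone, standing in for the modulated cantilever signal. The sample rate, modulation depth, and PI loop gains are illustrative assumptions, not the paper's designs.

```python
# Minimal sketch of a digital PLL as an FM demodulator: a numerically
# controlled oscillator (NCO) driven by a PI loop filter tracks the
# instantaneous frequency of a modulated tone. All values are illustrative.
import numpy as np

fs = 100_000.0                          # sample rate, Hz
f0 = 10_000.0                           # nominal oscillation frequency, Hz
t = np.arange(0.0, 0.05, 1.0 / fs)

# Frequency modulation standing in for tip-sample interaction forces.
f_dev = 50.0 * np.sin(2.0 * np.pi * 40.0 * t)              # +/- 50 Hz at 40 Hz
x = np.exp(1j * 2.0 * np.pi * np.cumsum(f0 + f_dev) / fs)  # analytic signal

kp, ki = 1400.0, 63.0                   # PI loop gains (~1 kHz bandwidth)
phi, integ = 0.0, 0.0                   # NCO phase and integrator state
f_est = np.empty_like(t)

for n in range(t.size):
    err = np.angle(x[n] * np.exp(-1j * phi))  # phase error, radians
    integ += ki * err                          # integral path (Hz)
    ctrl = kp * err + integ                    # frequency correction (Hz)
    phi += 2.0 * np.pi * (f0 + ctrl) / fs      # advance the NCO
    f_est[n] = f0 + ctrl                       # demodulated frequency estimate

# After the loop locks, f_est - f0 recovers the +/- 50 Hz modulation.
```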
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show under several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
Atmospheric aerosol particles serving as cloud condensation nuclei (CCN) are key elements of the hydrological cycle and climate. Knowledge of the spatial and temporal distribution of CCN in the atmosphere is essential to understand and describe the effects of aerosols in meteorological models. In this study, CCN properties were measured in polluted and pristine air of different continental regions, and the results were parameterized for efficient prediction of CCN concentrations.

The continuous-flow CCN counter used for size-resolved measurements of CCN efficiency spectra (activation curves) was calibrated with ammonium sulfate and sodium chloride aerosols for a wide range of water vapor supersaturations (S=0.068% to 1.27%). A comprehensive uncertainty analysis showed that the instrument calibration depends strongly on the applied particle generation techniques, Köhler model calculations, and water activity parameterizations (relative deviations in S up to 25%). Laboratory experiments and a comparison with other CCN instruments confirmed the high accuracy and precision of the calibration and measurement procedures developed and applied in this study.

The mean CCN number concentrations (N_CCN,S) observed in polluted mega-city air and biomass burning smoke (Beijing and Pearl River Delta, China) ranged from 1000 cm−3 at S=0.068% to 16 000 cm−3 at S=1.27%, which is about two orders of magnitude higher than in pristine air at remote continental sites (Swiss Alps, Amazonian rainforest). Effective average hygroscopicity parameters, κ, describing the influence of chemical composition on the CCN activity of aerosol particles were derived from the measurement data. They varied in the range of 0.3±0.2, were size-dependent, and could be parameterized as a function of organic and inorganic aerosol mass fraction. At low S (≤0.27%), substantial portions of externally mixed CCN-inactive particles with much lower hygroscopicity were observed in polluted air (fresh soot particles with κ≈0.01). Thus, the aerosol particle mixing state needs to be known for highly accurate predictions of N_CCN,S. Nevertheless, the observed CCN number concentrations could be efficiently approximated using measured aerosol particle number size distributions and a simple κ-Köhler model with a single proxy for the effective average particle hygroscopicity. The relative deviations between observations and model predictions were on average less than 20% when a constant average value of κ=0.3 was used in conjunction with variable size distribution data. With a constant average size distribution, however, the deviations increased up to 100% and more. The measurement and model results demonstrate that the aerosol particle number and size are the major predictors for the variability of the CCN concentration in continental boundary layer air, followed by particle composition and hygroscopicity as relatively minor modulators. Depending on the required and applicable level of detail, the measurement results and parameterizations presented in this study can be directly implemented in detailed process models as well as in large-scale atmospheric and climate models for efficient description of the CCN activity of atmospheric aerosols.
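The single-parameter κ-Köhler framework underlying this parameterization gives a closed-form approximation for the critical supersaturation at which a dry particle activates (Petters and Kreidenweis, 2007). The sketch below is a minimal illustration with the campaign-average κ = 0.3; the physical constants and the 100 nm example diameter are assumptions for demonstration, not the study's data.

```python
# Minimal sketch of the kappa-Koehler relation: critical supersaturation
# s_c for a dry particle of diameter d_dry with hygroscopicity kappa,
# s_c = exp(sqrt(4 A^3 / (27 kappa d_dry^3))) - 1, A = 4 sigma M_w / (R T rho_w).
import numpy as np

def critical_supersaturation(d_dry, kappa, T=298.15):
    """Return the critical supersaturation (%) for dry diameter d_dry (m)."""
    sigma_w = 0.072        # surface tension of water, J/m^2
    M_w = 0.018015         # molar mass of water, kg/mol
    rho_w = 997.0          # density of water, kg/m^3
    R = 8.314              # gas constant, J/(mol K)
    A = 4.0 * sigma_w * M_w / (R * T * rho_w)   # Kelvin term, m
    s_c = np.exp(np.sqrt(4.0 * A**3 / (27.0 * kappa * d_dry**3))) - 1.0
    return 100.0 * s_c

# A 100 nm particle with kappa = 0.3 activates near S ~ 0.2%:
print(critical_supersaturation(100e-9, kappa=0.3))
```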
Abstract:
In recent years, attention to energy efficiency in historical buildings has grown, as several research projects have taken place across Europe. The effort to combine the need to preserve the buildings, their value, and their characteristics with the need to reduce energy consumption and improve indoor comfort conditions has stimulated a discussion between two points of view that are usually in contradiction: that of the building engineer and that of the conservation institutions. The results are surprising, because a common field is growing, although the need to balance the respective exigencies remains. From this experience it becomes clear that many questions should also be answered by the building physicist regarding correct assessment: of the energy consumption of this class of buildings, of the effectiveness of the measures that could be adopted, and much more. This thesis contributes to answering these questions by developing a procedure to analyse historic buildings. The procedure provides a guideline for the energy audit of a historical building, considering the experimental activities needed to deal with the uncertainty in the estimation of the energy balance. It offers a procedure to simulate the energy balance of a building with a validated dynamic model, including a calibration procedure that increases the accuracy of the model. An approach to the design of energy efficiency measures through an optimization that considers different aspects is also presented. The whole process is applied to a real case study to give the reader a practical understanding.
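As a toy illustration of the calibration step described above, the sketch below tunes two uncertain parameters of a simple degree-day energy balance so that simulated consumption matches metered data. The stand-in model, data, and parameter names are hypothetical and far simpler than the validated dynamic model used in the thesis.

```python
# Hypothetical sketch of model calibration against metered consumption:
# fit a toy degree-day model (UA * HDD + baseload) to monthly data.
import numpy as np
from scipy.optimize import least_squares

measured = np.array([3200.0, 2900.0, 2400.0, 1700.0, 1100.0, 800.0])  # kWh/month
hdd = np.array([420.0, 380.0, 300.0, 200.0, 110.0, 60.0])             # degree-days

def simulate(params, hdd):
    """Toy energy balance: heat-loss coefficient times degree-days, plus baseload."""
    ua, base = params
    return ua * hdd + base

def residuals(params):
    return simulate(params, hdd) - measured

fit = least_squares(residuals, x0=[5.0, 500.0], bounds=([0.0, 0.0], [np.inf, np.inf]))
print(fit.x)   # calibrated UA (kWh per degree-day) and baseload (kWh/month)
```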