954 results for Building methods
Abstract:
Graphics and audio programming have moved toward three dimensions in order to better simulate the way we experience our world. In this project I looked at methods for coming closer to such simulation through realistic graphics and sound combined with a natural interface. I did most of my work on a Dell OptiPlex with an 800 MHz Pentium III processor and an NVIDIA GeForce 256 AGP Plus graphics accelerator, both high-end products in the consumer market as of April 2000. For graphics, I used OpenGL [1], an open-source, multi-platform set of graphics libraries that is relatively easy to use, coded in C. The basic engine I first put together was a system to place objects in a scene and to navigate around the scene in real time. Once I accomplished this, I was able to investigate specific techniques for making parts of a scene more appealing.
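To make the last two sentences concrete, here is a minimal sketch, in Python with NumPy rather than the author's C/OpenGL code, of the two ingredients such a basic engine needs: a table of object positions ("placing objects in a scene") and a look-at view matrix of the kind gluLookAt builds ("navigating around the scene"). The object names and coordinates are invented for illustration.

# Minimal sketch (not the author's C/OpenGL engine): a tiny scene plus a
# look-at view matrix for real-time navigation. Names and positions are
# illustrative assumptions.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed view matrix, as gluLookAt would."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)                 # forward direction
    s = np.cross(f, up)
    s /= np.linalg.norm(s)                 # right direction
    u = np.cross(s, f)                     # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye      # translate world into camera space
    return view

# A tiny "scene": object name -> world position.
scene = {"crate": np.array([0.0, 0.0, -5.0]),
         "lamp":  np.array([2.0, 1.0, -3.0])}

camera = look_at(eye=[0.0, 1.5, 2.0], target=[0.0, 0.0, -5.0])
for name, pos in scene.items():
    eye_space = camera @ np.append(pos, 1.0)   # position as seen by the camera
    print(name, eye_space[:3].round(2))

In a real engine this matrix would be handed to the graphics API each frame before the objects are drawn.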
Abstract:
In this work, the plate bending formulation of the boundary element method (BEM) based on Reissner's hypothesis is extended to the analysis of zoned plates in order to model a building floor structure. In the proposed formulation each sub-region defines a beam or a slab and, depending on the way the sub-regions are represented, one can have two different types of analysis. In the simple bending problem all sub-regions are defined by their middle surface; on the other hand, for the coupled stretching-bending problem all sub-regions are referred to a chosen reference surface, so eccentricity effects are taken into account. Equilibrium and compatibility conditions are automatically imposed by the integral equations, which treat this composed structure as a single body. The bending and stretching values defined on the interfaces are approximated along the beam width, thereby reducing the number of degrees of freedom. Thus, in the proposed model the set of equations is written in terms of the problem values on the beam axis and on the external boundary without beams. Finally, some numerical examples are presented to show the accuracy of the proposed model.
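As a point of reference for readers unfamiliar with the BEM, formulations of this kind rest on boundary integral equations of the following generic form (standard BEM notation, not reproduced from the paper): for a source point \xi on the boundary \Gamma of a sub-region,

c_{ij}(\xi)\,u_j(\xi) + \int_{\Gamma} T_{ij}(\xi,x)\,u_j(x)\,d\Gamma(x) = \int_{\Gamma} U_{ij}(\xi,x)\,t_j(x)\,d\Gamma(x) + \int_{\Omega} U_{ij}(\xi,x)\,b_j(x)\,d\Omega(x),

where U_{ij} and T_{ij} are the fundamental-solution kernels for generalized displacements and tractions, u_j and t_j are the boundary values, b_j is the domain loading, and c_{ij} is a free term depending on the boundary geometry at \xi. One such equation is written per sub-region (beam or slab), and the interface equilibrium and compatibility conditions couple the sub-regions into a single system.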
Abstract:
This paper presents the overall methodology that has been used to encode both the Brazilian Portuguese WordNet (WordNet.Br) standard language-independent conceptual-semantic relations (hyponymy, co-hyponymy, meronymy, cause, and entailment) and the so-called cross-lingual conceptual-semantic relations between different wordnets. Accordingly, after contextualizing the project and outlining the current lexical database structure and statistics, it describes the WordNet.Br editing GUI that was designed to aid the linguist in carrying out the tasks of building synsets, selecting sample sentences from corpora, writing synset concept glosses, and encoding both language-independent conceptual-semantic relations and cross-lingual conceptual-semantic relations between WordNet.Br and Princeton WordNet.
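To make the editing tasks listed above more tangible, the sketch below shows one possible in-memory representation of a synset with its gloss, corpus examples, language-independent relations, and cross-lingual links to Princeton WordNet; the class, field names, and identifiers are illustrative assumptions, not the actual WordNet.Br schema.

from dataclasses import dataclass, field

@dataclass
class Synset:
    # Illustrative structure only; not the WordNet.Br database schema.
    synset_id: str
    words: list                                        # lemmas grouped by synonymy
    gloss: str = ""                                    # concept gloss written by the linguist
    examples: list = field(default_factory=list)       # sample sentences selected from corpora
    relations: dict = field(default_factory=dict)      # language-independent relations, e.g. hyponymy
    cross_lingual: list = field(default_factory=list)  # ids of equivalent Princeton WordNet synsets

carro = Synset("br-n-0001", ["carro", "automóvel"],
               gloss="veículo motorizado de quatro rodas")
carro.examples.append("O carro parou na esquina.")
carro.relations["hyponym_of"] = ["br-n-0002"]      # hypothetical synset for 'veículo'
carro.cross_lingual.append("eng-30-02958343-n")    # hypothetical id for the 'car' synset
print(carro.words, "->", carro.cross_lingual)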
Abstract:
The prominent buildings of the city of Belém were clad during the 19th century with tiles (azulejos) produced in Portugal and Germany, which already show distinct degrees of degradation. The Palacete Pinho is one of the most important of these buildings and was selected in order to investigate the action of the Amazonian tropical climate on the degradation of these tiles. To achieve these objectives, the tiles of this building were mapped in order to identify the modifications of organic and inorganic origin, and samples were collected for analysis. The minerals were determined by XRD, the chemical composition by classical wet methods and SEM/EDS, and the microorganisms by microscopy. The results show that the Portuguese and German tiles are distinct from each other. While the biscuit is composed of SiO2 and Al2O3, CaO was found only in the Portuguese tiles. The low contents of Na2O and K2O indicate the addition of materials to reduce the melting temperature. SiO2 and PbO make up the glaze, while CoO and FeO were added as pigments. The biscuit of the German tiles consists of quartz, mullite and cristobalite, unlike the Portuguese one, which contains quartz, gehlenite, diopside, calcite and feldspars. The glazes are amorphous to XRD. The chemical and mineralogical differences between the Portuguese and German tiles indicate that they were produced from distinct raw materials as well as distinct thermal processes. The weathering-related alterations are thin layers of debris (on the German tiles), oxidation stains, dark stains and detachment of the tile (on the Portuguese); loss of glaze and biscuit, which becomes powdery as a consequence of the establishment of Cyanophyta and Bacillariophyta (Portuguese). The distinct degradation features of the tiles reflect their mineralogical and chemical differences when exposed to the Amazonian tropical climate.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
During autumn 2003, several thousand European starlings (Sturnus vulgaris) began roosting on exposed I-beams in a newly constructed, decorative glass canopy that covered the passenger pick-up area at the terminal building for Cleveland Hopkins International Airport, Ohio. The use of lethal control or conventional dispersal techniques, such as pyrotechnics and fire hoses, was not feasible in the airport terminal area. The design and aesthetics of the structure precluded the use of netting and other exclusion materials. In January 2004, an attempt was made to disperse the birds using recorded predator and distress calls broadcast from speakers installed in the structure. This technique failed to disperse the birds. In February 2004, we developed a technique using compressed air to physically and audibly harass the birds. We used a trailer-mounted commercial air compressor producing 185 cubic feet per minute of air at 100 pounds per square inch pressure and a 20-foot long, 1-inch diameter PVC pipe attached to the outlet hose. One person slowly (< 5 mph) drove a pick-up truck through the airport terminal at dusk while a second person sat on a bench in the truck bed and directed the compressed air from the pipe into the canopy to harass starlings attempting to enter the roost site. After 5 consecutive nights of compressed-air harassment, virtually no starlings attempted to roost in the canopy. Once familiar with the physical effects of the compressed air, the birds dispersed at the sound of the air. Only occasional harassment at dusk was needed through the remainder of the winter to keep the canopy free of starlings. Similar harassment with the compressor was conducted successfully in autumn 2004 with the addition of a modified leaf blower, wooden clappers, and a laser. In conclusion, we found compressed air to be a safe, unobtrusive, and effective method for dispersing starlings from an urban roost site. This technique would likely be applicable to other urban-roosting species such as crows, house sparrows, and blackbirds.
Abstract:
Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processor technology, in order to provide a higher level of processing. The utilization of many-core technology has boosted the computing power provided by clusters of workstations or SMPs, offering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed in such cluster and multi-cluster computing systems. In order to guarantee correct execution of message-passing parallel applications in a computing environment other than the one in which the parallel application was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmark parallel applications.
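For context, the kind of message-passing application such a strategy must run unchanged across heterogeneous clusters looks like the sketch below; it uses the generic mpi4py API and is not the hybrid interfacing layer proposed in the paper.

# Generic master/worker MPI program (mpi4py); illustrative of the applications
# targeted by the interfacing strategy, not the strategy itself.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank == 0:
    # Root distributes one task per worker, then collects the results.
    for worker in range(1, size):
        comm.send({"task": worker}, dest=worker, tag=11)
    results = [comm.recv(source=w, tag=22) for w in range(1, size)]
    print("collected:", results)
else:
    task = comm.recv(source=0, tag=11)
    comm.send(task["task"] ** 2, dest=0, tag=22)

Launched with, for example, mpirun -np 4 python app.py; the strategy described above aims to let such a program span nodes that live in different clusters, with different operating systems, MPI implementations, and address spaces.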
Abstract:
Background: Using univariate and multivariate variance components linkage analysis methods, we studied possible genotype × age interaction in cardiovascular phenotypes related to the aging process from the Framingham Heart Study. Results: We found evidence for genotype × age interaction for fasting glucose and systolic blood pressure. Conclusions: There is polygenic genotype × age interaction for fasting glucose and systolic blood pressure and quantitative trait locus × age interaction for a linkage signal for systolic blood pressure phenotypes located on chromosome 17 at 67 cM.
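As a hedged sketch of the variance components machinery involved (generic notation; not necessarily the exact parameterization used in this study), genotype × age interaction can be modeled by letting the additive genetic variance and the genetic correlation depend on age, so that for relatives i and j measured at ages a_i and a_j

\mathrm{Cov}(y_i, y_j) = 2\phi_{ij}\,\rho_g(a_i, a_j)\,\sigma_g(a_i)\,\sigma_g(a_j) + \delta_{ij}\,\sigma_e^2,

where \phi_{ij} is the kinship coefficient and \delta_{ij} the Kronecker delta. The null hypothesis of no interaction corresponds to \sigma_g(a) being constant in a and \rho_g(a_i, a_j) = 1; rejecting either restriction is evidence of genotype × age interaction.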
Abstract:
OBJECTIVE: To evaluate tools for the fusion of images generated by tomography and structural and functional magnetic resonance imaging. METHODS: Magnetic resonance and functional magnetic resonance imaging were performed while a volunteer who had previously undergone cranial tomography performed motor and somatosensory tasks in a 3-Tesla scanner. Image data were analyzed with different programs, and the results were compared. RESULTS: We constructed a flow chart of computational processes that allowed measurement of the spatial congruence between the methods. There was no single computational tool that contained the entire set of functions necessary to achieve the goal. CONCLUSION: The fusion of the images from the three methods proved to be feasible with the use of four free-access software programs (OsiriX, Register, MRIcro and FSL). Our results may serve as a basis for building software that will be useful as a virtual tool prior to neurosurgery.
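One simple way to quantify the spatial congruence mentioned in the results, shown here only as an illustrative sketch and not as part of the authors' pipeline, is an overlap measure such as the Dice coefficient between binary masks of the same structure segmented from two co-registered modalities.

import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap between two boolean volumes of identical shape."""
    a, b = np.asarray(mask_a, dtype=bool), np.asarray(mask_b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 3-D masks standing in for the same structure on co-registered CT and MRI.
ct_mask = np.zeros((10, 10, 10), dtype=bool); ct_mask[2:7, 2:7, 2:7] = True
mr_mask = np.zeros((10, 10, 10), dtype=bool); mr_mask[3:8, 2:7, 2:7] = True
print(f"Dice overlap: {dice(ct_mask, mr_mask):.2f}")   # 0.80 for this toy case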
Abstract:
Machine learning comprises a series of techniques for the automatic extraction of meaningful information from large collections of noisy data. In many real-world applications, data is naturally represented in structured form. Since traditional methods in machine learning deal with vectorial information, they require an a priori form of preprocessing. Among all the learning techniques for dealing with structured data, kernel methods are recognized to have a strong theoretical background and to be effective approaches. They do not require an explicit vectorial representation of the data in terms of features, but rely on a measure of similarity between any pair of objects of a domain, the kernel function. Designing fast and good kernel functions is a challenging problem. In the case of tree-structured data two issues become relevant: kernels for trees should not be sparse and should be fast to compute. The sparsity problem arises when, given a dataset and a kernel function, most structures of the dataset are completely dissimilar to one another. In those cases the classifier has too little information for making correct predictions on unseen data. In fact, it tends to produce a discriminating function behaving like the nearest-neighbour rule. Sparsity is likely to arise for some standard tree kernel functions, such as the subtree and subset tree kernels, when they are applied to datasets with node labels belonging to a large domain. A second drawback of using tree kernels is the time complexity required in both the learning and classification phases. Such complexity can sometimes prevent the application of the kernel in scenarios involving large amounts of data. This thesis proposes three contributions for resolving the above issues of kernels for trees. A first contribution aims at creating kernel functions which adapt to the statistical properties of the dataset, thus reducing its sparsity with respect to traditional tree kernel functions. Specifically, we propose to encode the input trees by an algorithm able to project the data onto a lower dimensional space with the property that similar structures are mapped similarly. By building kernel functions on the lower dimensional representation, we are able to perform inexact matchings between different inputs in the original space. A second contribution is the proposal of a novel kernel function based on the convolution kernel framework. A convolution kernel measures the similarity of two objects in terms of the similarities of their subparts. Most convolution kernels are based on counting the number of shared substructures, partially discarding information about their position in the original structure. The kernel function we propose is, instead, especially focused on this aspect. A third contribution is devoted to reducing the computational burden related to the calculation of a kernel function between a tree and a forest of trees, which is a typical operation in the classification phase and, for some algorithms, also in the learning phase. We propose a general methodology applicable to convolution kernels. Moreover, we show an instantiation of our technique when kernels such as the subtree and subset tree kernels are employed. In those cases, Directed Acyclic Graphs can be used to compactly represent shared substructures in different trees, thus reducing the computational burden and storage requirements.
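For readers unfamiliar with the baseline the thesis starts from, the sketch below is a naive implementation of the classic subset tree kernel (Collins and Duffy), which sums a decayed count of shared tree fragments over all node pairs with the same production; it is the standard kernel whose sparsity and cost the thesis addresses, not one of the new kernels proposed here. The toy trees are invented.

# Naive subset tree kernel (Collins & Duffy style): K(T1, T2) accumulates,
# with decay factor lam, the tree fragments shared by the two input trees.
class Node:
    def __init__(self, label, children=()):
        self.label, self.children = label, list(children)
    def production(self):
        return (self.label, tuple(c.label for c in self.children))
    def nodes(self):
        yield self
        for c in self.children:
            yield from c.nodes()

def is_preterminal(n):
    return bool(n.children) and all(not c.children for c in n.children)

def delta(n1, n2, lam=0.5):
    if n1.production() != n2.production():
        return 0.0
    if is_preterminal(n1):
        return lam
    prod = lam
    for c1, c2 in zip(n1.children, n2.children):
        prod *= 1.0 + delta(c1, c2, lam)
    return prod

def subset_tree_kernel(t1, t2, lam=0.5):
    pairs = ((a, b) for a in t1.nodes() if a.children for b in t2.nodes() if b.children)
    return sum(delta(a, b, lam) for a, b in pairs)

t1 = Node("S", [Node("NP", [Node("dog")]), Node("VP", [Node("barks")])])
t2 = Node("S", [Node("NP", [Node("dog")]), Node("VP", [Node("sleeps")])])
print(subset_tree_kernel(t1, t2))   # 1.25 with lam = 0.5

The quadratic pairing over nodes illustrates the computational burden that the third contribution reduces by representing the shared substructures of a forest with Directed Acyclic Graphs.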
Abstract:
This work of thesis involves various aspects of crystal engineering. Chapter 1 focuses on crystals containing crown ether complexes. Aspects such as the possibility of preparing these materials by non-solution methods, i.e. by direct reaction of the solid components, thermal behavior, and also isomorphism and interconversion between hydrates are taken into account. In chapter 2 a study is presented aimed at understanding the relationship between hydrogen bonding capability and shape of the building blocks chosen to construct crystals. The focus is on the control exerted by shape on the organization of sandwich cations such as cobalticinium, decamethylcobalticinium and bisbenzenechromium(I), and on the aggregation of monoanions all containing carboxylic and carboxylate groups, into 0-D, 1-D, 2-D and 3-D networks. Reactions conducted in multi-component molecular assemblies or co-crystals have been recognized as a way to control reactivity in the solid state. The [2+2] photodimerization of olefins is a successful demonstration of how templated solid-state synthesis can efficiently produce unique materials with remarkable stereoselectivity and under environment-friendly conditions. A demonstration of this synthetic strategy is given in chapter 3. The combination of various types of intermolecular linkages, leading to the formation of high-order aggregation and crystalline materials or to a random aggregation resulting in an amorphous precipitate, may not go to completion. In such rare cases an aggregation process intermediate between crystalline and amorphous materials is observed, resulting in the formation of a gel, i.e. a viscoelastic solid-like or liquid-like material. In chapter 4 the design of new Low Molecular Weight Gelators is presented. Aspects such as the relationships between molecular structure, crystal packing and gelation properties, and the application of this kind of gel as a medium for crystal growth of organic molecules, such as APIs, are also discussed.
Abstract:
Decomposition-based approaches are recalled from both the primal and the dual point of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulation to a gradual choice of alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve them. This trade-off is explored for several sets of instances and the results are compared with the ones obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. In order to provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality by using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed. Here, an undirected graph with edge costs is given together with a discrete set of balance matrices, representing different supply/demand scenarios. The goal is to determine the minimum cost installation of capacities on the edges such that the flow exchange is feasible for every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented. The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
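For reference, the Frank-Wolfe iteration mentioned above can be sketched as follows (generic notation): given the current flow x^k on the feasible polytope \mathcal{F},

y^k \in \arg\min_{y \in \mathcal{F}} \nabla f(x^k)^{\top} y, \qquad x^{k+1} = x^k + \gamma_k (y^k - x^k), \quad \gamma_k \in [0, 1],

where the linearized subproblem defining y^k is itself a min-cost flow problem, \gamma_k is chosen by line search (or the classical 2/(k+2) rule), and the decomposition techniques recalled earlier supply the initial feasible flow x^0.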
Abstract:
The present work is aimed at the study and analysis of the defects detected in civil structures that are the object of civil litigation, in order to create instruments capable of helping the different actors involved in the building process. It is divided into three main sections. The first part is focused on the collection of data related to the civil proceedings of 2012 and on an in-depth analysis of the main aspects regarding defects in existing buildings. The research center "Osservatorio Claudio Ceccoli" developed a system for the collection of information coming from the civil proceedings of the Court of Bologna. Statistical analyses have been performed and the results are shown and discussed in the first chapters. The second part analyzes the main issues that emerged during the study of the real cases, related to the activities of the technical consultant. The idea is to create documents, called "focus", intended to clarify and codify specific problems in order to develop guidelines that help the technician in drafting the technical advice. The third part is centered on the evaluation of the methods used for data collection. The first results show that these are not efficient. The critical analysis of the database, the results, and the experience gained throughout allowed the improvement of the data collection system.
Abstract:
Resilience research has been applied to socioeconomic as well as agroecological studies over the last 20 years. It provides a conceptual and methodological approach for a better understanding of the interrelations between the performance of ecological and social systems. In the research area of Alto Beni, Bolivia, the production of cocoa (Theobroma cacao L.) is one of the main sources of income. Since the 1980s, farmers in the region have formed producers’ associations to enhance organic cocoa cultivation and obtain fair prices. In cooperation with the long-term system comparisons run by the Research Institute of Organic Agriculture (FiBL) in Alto Beni, aspects of the field trial are adapted for use in on-farm research: a comparison of soil fertility, biomass and crop diversity is combined with qualitative interviews and participatory observation methods. Fieldwork is carried out together with Bolivian students through the Swiss KFPE programme Echanges Universitaires. For the system comparisons, four different land-use types were classified according to their ecological complexity during a preliminary study in 2009: successional agroforestry systems, simple agroforestry systems (both organically managed and certified), traditional systems and conventional monocultures. The study focuses on the interrelations between different ways of cocoa cultivation, livelihoods and the related socio-cultural rationales behind them. In particular this second aspect is innovative as it allows broadening the biophysical perspective to a more comprehensive evaluation with socio-ecological aspects, thereby increasing the relevance of the agronomic field studies for development policy and practice. Moreover, such a socio-ecological baseline allows assessing the potential of organic agriculture regarding resilience-building in the face of socio-environmental stress factors. Among others, the results of the pre-study illustrate local farmers’ perceptions of climate change and the consequences for the different crop systems: all interviewees mentioned rising temperatures and/or an extended dry season as negative impacts, more with regard to their own working conditions than to their crops. This was the case in particular for conventional monocultures and in plots where slash-and-burn cultivation was practised, whereas for organic agroforestry systems the advantage of working in the shade was stressed, indicating that their relevance rises in the context of climate change.
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008 among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and allowing access to the relevant methodological process in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between making a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judgment of the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysis of one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.
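As a hedged illustration of the kind of calculation such an analysis supports (a simplified sketch, not the authors' model; the stemma, witness sigla, and readings below are invented), one can ask, for each reading at a variant location, whether the witnesses attesting it form a connected group on the hypothesized stemma; readings that do not are candidates for coincident variation or contamination, which is exactly where a significance profile of variant types becomes informative.

import networkx as nx

# Hypothetical stemma: edges from exemplar to copy, treated as undirected
# for the connectivity test.
stemma = nx.Graph([("A", "B"), ("A", "C"), ("B", "D"), ("B", "E"), ("C", "F")])

# Hypothetical variant location: each reading and the witnesses attesting it.
variant_location = {
    "ende":  {"A", "B", "D", "E"},   # coherent branch of the stemma
    "unde":  {"C", "F"},             # coherent branch of the stemma
    "einde": {"D", "F"},             # split across branches: suspect variation
}

for reading, witnesses in variant_location.items():
    coherent = nx.is_connected(stemma.subgraph(witnesses))
    print(f"{reading!r}: {sorted(witnesses)} connected on the stemma? {coherent}")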