819 results for Graphical representations
Abstract:
This paper presents software, developed in the Delphi programming language, for computing a reservoir's annual regulated active storage using the sequent-peak algorithm. Mathematical models used for this purpose generally require long hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. On that basis, software was developed to calculate reservoir active capacity. An example calculation uses 30 years (1977 to 2009) of monthly mean flow records from the Corrente River, in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed for water resources management, helping to manipulate data and to highlight information of interest to the user. With this interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water-demand scenarios. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to the study of reservoir projects. With this functionality, the program is a useful tool for decision making in water resources management.
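The sequent-peak calculation itself is compact. A minimal sketch in Python (not the Delphi implementation described in the abstract, and assuming a constant draft per period) could look like:

```python
def sequent_peak(inflows, demand):
    """Required active storage via the sequent-peak algorithm.

    inflows: sequence of period inflow volumes
    demand:  constant draft (release) per period, same units
    Returns the maximum cumulative deficit, i.e. the active
    capacity needed to meet the demand over the record.
    Note: for a cyclic (wrap-around) analysis the record is
    often processed twice; omitted here for brevity.
    """
    deficit = 0.0    # running cumulative (demand - inflow), floored at 0
    capacity = 0.0   # largest deficit seen so far
    for q in inflows:
        deficit = max(0.0, deficit + demand - q)
        capacity = max(capacity, deficit)
    return capacity

# Toy example: seven monthly inflows against a constant demand of 5 units.
storage = sequent_peak([8, 6, 2, 1, 3, 9, 10], demand=5)
```

The dry spell (inflows 2, 1, 3) accumulates a deficit of 9 units, so the sketch reports an active capacity of 9 for this toy record.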
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Porosity imaging is a graphical representation of the lateral distribution of rock porosity, estimated from geophysical well-log data. We present a methodology for producing this geological image, fully independent of the interpreter's intervention, through a so-called interpretive algorithm based on two types of artificial neural networks. The first part of the algorithm is based on a competitive-layer neural network and is built to perform an automatic interpretation of the classical ρb–ΦN crossplot, producing a zonation of the log and an estimate of porosity. The second part is based on a radial basis function neural network, designed to perform a spatial integration of the data, which can be divided into two stages: the first addresses well-log correlation, and the second produces an estimate of the lateral distribution of porosity. This methodology will help the interpreter define the geological model of the reservoir and, perhaps most importantly, help develop more efficient strategies for oil and gas field development. The results, or porosity images, are quite similar to conventional geological cross-sections, especially in a simple clastic-dominated depositional environment, where a color map, scaled in units of apparent porosity for shales and effective porosity for sandstones, shows the porosity variation and the geometric arrangement of the geological layers along the section. The methodology is applied to real data from the Lagunillas Formation, in the Maracaibo Lake Basin, Venezuela.
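The second stage, spatial integration with a radial basis function network, can be illustrated with a toy Gaussian RBF interpolator. This is a sketch only: the 1-D setting, the function names, and the sample values are illustrative, not the paper's algorithm or data.

```python
import numpy as np

def rbf_interpolate(x_known, y_known, x_query, sigma=1.0):
    """Exact Gaussian RBF interpolation of scattered 1-D data
    (a simplified stand-in for the paper's RBF network)."""
    # Pairwise Gaussian kernel between known positions
    K = np.exp(-((x_known[:, None] - x_known[None, :]) ** 2) / (2 * sigma**2))
    w = np.linalg.solve(K, y_known)  # weights that fit the data exactly
    Kq = np.exp(-((x_query[:, None] - x_known[None, :]) ** 2) / (2 * sigma**2))
    return Kq @ w

# Hypothetical porosity known at two well positions, estimated between them.
wells = np.array([0.0, 4.0])
phi = np.array([0.25, 0.10])
profile = rbf_interpolate(wells, phi, np.array([0.0, 2.0, 4.0]), sigma=2.0)
```

At the well positions the interpolant reproduces the measured porosities exactly, and between wells it produces a smooth lateral estimate, which is the behavior the porosity image relies on.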
Abstract:
Learning to read and write at an early stage is the process of transferring the sound form of spoken language to the graphical form of writing; in our alphabetic writing system, letters are graphical representations at the level of the phoneme. For this representation to occur, the individual must already be able, in some way, to perceive and manipulate the different sound segments of the word. This capacity for perception directed at the segments of the word is called phonological awareness. The objective was therefore to verify the performance of schoolchildren from 1st to 4th grade, with and without learning disabilities, on phonological awareness tests. Eighty schoolchildren of both sexes, with a mean age of 9 years and 3 months, took part in the study and were assessed with the Phonological Awareness Protocol (CIELO, 2002), using only the phonological tasks of that instrument. The data received a quantitative-qualitative treatment, from whose results inferences were drawn. Statistically significant results occurred in the tasks of nominal realism, syllable detection, phoneme detection, phonemic synthesis, and phonemic reversal. Based on the results, we observed that children without learning difficulties performed better on all the tasks mentioned above.
Abstract:
Abstract Background In honeybees, differential feeding of female larvae promotes the occurrence of two different phenotypes, a queen and a worker, from identical genotypes, through incremental alterations, which affect general growth, and character state alterations that result in the presence or absence of specific structures. Although previous studies revealed a link between incremental alterations and differential expression of physiometabolic genes, the molecular changes accompanying character state alterations remain unknown. Results By using cDNA microarray analyses of >6,000 Apis mellifera ESTs, we found 240 differentially expressed genes (DEGs) between developing queens and workers. Many genes recorded as up-regulated in prospective workers appear to be unique to A. mellifera, suggesting that the workers' developmental pathway involves the participation of novel genes. Workers up-regulate more developmental genes than queens, whereas queens up-regulate a greater proportion of physiometabolic genes, including genes coding for metabolic enzymes and genes whose products are known to regulate the rate of mass-transforming processes and the general growth of the organism (e.g., tor). Many DEGs are likely to be involved in processes favoring the development of caste-biased structures, like brain, legs and ovaries, as well as genes that code for cytoskeleton constituents. Treatment of developing worker larvae with juvenile hormone (JH) revealed 52 JH responsive genes, specifically during the critical period of caste development. Using Gibbs sampling and Expectation Maximization algorithms, we discovered eight overrepresented cis-elements from four gene groups. Graph theory and complex networks concepts were adopted to attain powerful graphical representations of the interrelation between cis-elements and genes and objectively quantify the degree of relationship between these entities. 
Conclusion We suggest that clusters of functionally related DEGs are co-regulated during caste development in honeybees. This network of interactions is activated by nutrition-driven stimuli in early larval stages. Our data are consistent with the hypothesis that JH is a key component of the developmental determination of queen-like characters. Finally, we propose a conceptual model of caste differentiation in A. mellifera based on gene-regulatory networks.
Abstract:
For years, the role of graphical representation in architecture has been questioned in light of new computational graphics technologies and the new design processes they enable. Nevertheless, freehand drawing remains active in the design process, marked by an attentive gaze, an individual perception, and an execution time that allows immersion, dedication, and reflection. This article presents an analysis based on a closer look at the project drawings of architect Paulo Mendes da Rocha, as a contribution to the discussion on the role of analog drawing in the design process. Selected projects include: the competition entry for the remodeling of the urban center of Santiago (Chile); the Serra Dourada Stadium; the Jaraguá Building; Caetano de Campos; the Clube Atlético Paulistano gymnasium (1958); the Butantã residence (1964); the Fernando Millan residence (1970); and the Museu Brasileiro da Escultura (1986). The article presents reading studies of the architect's original drawings, aiming to detect the design intentions, concepts, and characteristics of each project. Graphical markings were made over the original drawings, revealing the researcher's particular reading, which allowed a better understanding of the projects.
Abstract:
This article reports on research within the Núcleo de Apoio à Pesquisa em Estudos de Linguagem em Arquitetura e Cidade (N.ELAC), which conducts research on Language and Representation. Among the various forms of representation in architecture, this research takes the physical three-dimensional model as a tool that makes a project easier to read, being more concrete than technical drawings. The aim is thus to highlight the importance of the physical model as a means of bringing the public closer to architectural heritage. The E1 Building, a work by Ernest Mange and Hélio Duarte, was chosen as the case study. Located on the USP campus in São Carlos, it is considered part of the city's heritage, yet it stands practically cloistered inside the campus, hindering closer contact between the community and the building. The building's design used only drawing as representation, with no three-dimensional model of any kind (physical or digital). From a survey of the graphical representations used by the designers, it was possible to compare the level of understanding of the project attained from the architects' drawings alone with that attained from the physical model produced by the researcher. A pre-test was carried out at a municipal public school, awakening the students' interest in the building in question.
Abstract:
This article is linked to the research of the Núcleo de Apoio à Pesquisa em Estudos de Linguagem em Arquitetura e Cidade (N.ELAC), which works in the area of Language and Representation. Given the various forms of representation in architecture (drawing, scale model, digital models), in this research the physical three-dimensional model is taken as a tool that makes a project easier to read, and is treated as a means of bringing the community closer to architectural heritage, involving, above all, modern architecture in São Paulo. The E1 Building, by Ernest Mange and Hélio Duarte, was chosen as the case study. Located on the USP campus in São Carlos, it is considered part of the city's heritage, yet it stands practically cloistered inside the campus, hindering closer contact between the community and the building. During its execution, only drawing was used as a design-representation tool, with no three-dimensional model of any kind (physical or digital). From a survey of the graphical representations used, it was possible to compare the level of understanding of the project attained from the architects' drawings alone with that attained from the physical model produced by the researcher. A pre-test carried out at a municipal public school indicated increased student interest in the building in question.
Abstract:
Background: The recent development of semi-automated techniques for staining and analyzing flow cytometry samples has presented new challenges. Quality control and quality assessment are critical when developing new high-throughput technologies and their associated information services. Our experience suggests that significant bottlenecks remain in the development of high-throughput flow cytometry methods for data analysis and display; in particular, data quality control and quality assessment are crucial steps in processing and analyzing high-throughput flow cytometry data. Methods: We propose a variety of graphical exploratory data analysis tools for exploring ungated flow cytometry data. We have implemented a number of specialized functions and methods in the Bioconductor package rflowcyt. We demonstrate the use of these approaches by investigating two independent sets of high-throughput flow cytometry data. Results: We found that graphical representations can reveal substantial non-biological differences in samples. Empirical cumulative distribution function plots and summary scatterplots were especially useful for rapidly identifying problems not found by manual review. Conclusions: Graphical exploratory data analysis tools are a quick and useful means of assessing data quality. We propose that the described visualizations be used as quality assessment tools and, where possible, for quality control.
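The ECDF-based screening described above can be sketched as follows. This is a hypothetical illustration in Python with simulated data, not code from the rflowcyt package; the function names and the shift magnitude are made up for the example.

```python
import numpy as np

def ecdf(values):
    """Empirical CDF: sorted values and cumulative probabilities."""
    x = np.sort(np.asarray(values, dtype=float))
    p = np.arange(1, len(x) + 1) / len(x)
    return x, p

def max_ecdf_gap(a, b):
    """Maximum vertical distance between two ECDFs (Kolmogorov-Smirnov
    style), a crude numeric screen for samples that look 'off'."""
    grid = np.union1d(a, b)
    fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(fa - fb))

# Two simulated samples: the second is shifted, mimicking a
# non-biological difference such as a staining artifact.
rng = np.random.default_rng(0)
ok = rng.normal(0, 1, 500)
shifted = rng.normal(2, 1, 500)
gap = max_ecdf_gap(ok, shifted)
```

A large gap flags a sample for manual review; overlaying the two ECDFs makes the discrepancy visible at a glance, which is the point the abstract makes about rapid identification of problem samples.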
Abstract:
The intelligent tutoring system LARGO for law is intended to help law students learn argumentation strategies. In the approach used, court transcripts serve as learning materials: students annotate them and create graphical representations of the course of the argumentation. The system can thereby encourage reflection on the arguments advanced by the attorneys and point learners to possible weaknesses in their analysis of the dispute. To detect weaknesses, the system uses graph grammars and collaborative filtering mechanisms. This article describes how feedback in LARGO is adapted to the user on the basis of a determined "usage context". Furthermore, based on the results of a controlled study with the system, conducted with law students at the University of Pittsburgh, we discuss to what extent the automatically determined usage context can be used to predict students' learning outcomes.
Abstract:
Visualization of program executions has been used in applications which include education and debugging. However, traditional visualization techniques often fall short of expectations or are altogether inadequate for new programming paradigms, such as Constraint Logic Programming (CLP), whose declarative and operational semantics differ in some crucial ways from those of other paradigms. In particular, traditional ideas regarding the behavior of data often cannot be lifted in a straightforward way to (C)LP from other families of programming languages. In this chapter we discuss techniques for visualizing data evolution in CLP. We briefly review some previously proposed visualization paradigms, and also propose a number of (to our knowledge) novel ones. The graphical representations have been chosen based on the perceived needs of a programmer trying to analyze the behavior and characteristics of an execution. In particular, we concentrate on the representation of the run-time values of the variables, and the constraints among them. Given our interest in visualizing large executions, we also pay attention to abstraction techniques, i.e., techniques which are intended to help in reducing the complexity of the visual information.
Abstract:
Visualization of program executions has been found useful in applications which include education and debugging. However, traditional visualization techniques often fall short of expectations or are altogether inadequate for new programming paradigms, such as Constraint Logic Programming (CLP), whose declarative and operational semantics differ in some crucial ways from those of other paradigms. In particular, traditional ideas regarding flow control and the behavior of data often cannot be lifted in a straightforward way to (C)LP from other families of programming languages. In this paper we discuss techniques for visualizing program execution and data evolution in CLP. We briefly review some previously proposed visualization paradigms, and also propose a number of (to our knowledge) novel ones. The graphical representations have been chosen based on the perceived needs of a programmer trying to analyze the behavior and characteristics of an execution. In particular, we concentrate on the representation of the program execution behavior (control), the runtime values of the variables, and the runtime constraints. Given our interest in visualizing large executions, we also pay attention to abstraction techniques, i.e., techniques which are intended to help in reducing the complexity of the visual information.
Abstract:
Visualisation of program executions has been used in applications which include education and debugging. However, traditional visualisation techniques often fall short of expectations or are altogether inadequate for new programming paradigms, such as Constraint Logic Programming (CLP), whose declarative and operational semantics differ in some crucial ways from those of other paradigms. In particular, traditional ideas regarding the behaviour of data often cannot be lifted in a straightforward way to (C)LP from other families of programming languages. In this chapter we discuss techniques for visualising data evolution in CLP. We briefly review some previously proposed visualisation paradigms, and also propose a number of (to our knowledge) novel ones. The graphical representations have been chosen based on the perceived needs of a programmer trying to analyse the behaviour and characteristics of an execution. In particular, we concentrate on the representation of the run-time values of the variables, and the constraints among them. Given our interest in visualising large executions, we also pay attention to abstraction techniques, i.e., techniques which are intended to help in reducing the complexity of the visual information.
Abstract:
Land cover is subject to continuous change on a wide variety of temporal and spatial scales, and those changes have significant effects on human and natural activities. Maintaining a spatial database updated with the changes that have occurred allows better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery and aerial photographs, have proven to be suitable and reliable data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are computed; then, different change/no_change thresholding algorithms are applied to these indices in order to better estimate the statistical parameters of the two categories; finally, the indices are integrated into a multisource fusion process, which generates a single CD result from several combinations of indices. The methodology has been applied to datasets with different spectral and spatial resolution properties, and the results are evaluated by means of a quality control analysis as well as complementary graphical representations. The methodology has also proved efficient for identifying the change-detection index with the highest contribution.
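The index-then-threshold step can be illustrated minimally. The absolute-difference index and the mean-plus-k-sigma rule below are simplified stand-ins for the change indices and thresholding algorithms the paper actually compares.

```python
import numpy as np

def change_index(img_t1, img_t2):
    """Absolute-difference change index between two co-registered bands."""
    return np.abs(img_t2.astype(float) - img_t1.astype(float))

def threshold_change(index, k=1.0):
    """Label change/no_change with a simple mean + k*std rule
    (one of many possible thresholding schemes)."""
    t = index.mean() + k * index.std()
    return index > t

# Toy pair of 4x4 "images": a small bright patch appears at time 2.
t1 = np.zeros((4, 4))
t2 = np.zeros((4, 4))
t2[1:3, 1:3] = 10.0
mask = threshold_change(change_index(t1, t2))
```

In a multisource setting, one such binary mask would be produced per index and sensor, and the masks then combined in the fusion step to yield a single change map.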
Abstract:
On numerous occasions throughout history, the imagination of creators has run ahead of the technical possibilities of their time, and many new ideas have required long periods to materialize as built reality, until technological and industrial development reached sufficient maturity. In architecture, these technical limitations have gradually narrowed, leading to the current situation in which any formal proposal can be represented graphically and analyzed from a structural point of view, thus overcoming the barrier that historically constrained the treatment of form. This doctoral thesis analyzes how the formulation of the Finite Element Method in the 1950s and of Bézier curves in the 1960s, together with the subsequent spread of personal computers and associated software (chiefly CAD and FEM) through architecture and engineering offices from the 1990s onward, made it possible to develop any architectural proposal, however complex, provoking a true formal revolution in architecture, especially in the field of singular or iconic buildings. This process is studied through eight buildings, four before and four after the disappearance of the barrier referred to above, set symbolically in the 1980s: the Recoletos fronton in Madrid, the Seagram Building in New York, Habitat '67 in Montreal, the Sydney Opera House, the Guggenheim Museum Bilbao, the Victoria & Albert Museum extension in London, the "Meiso no Mori" crematorium in Gifu, and the new CCTV headquarters in Beijing.
Among them, the Sydney Opera House, by Danish architect Jørn Utzon, condenses many of the relevant aspects investigated concerning the influence that methods of representation and structural analysis exert on the conception and construction of works of architecture. For this reason, and because it is considered a global architectural milestone, it is taken as the case study. The general idea of the building, which dates from 1956, belongs to a period immediately preceding the scientific and technological developments mentioned above. The absence of design tools matching the formal complexity of the proposal conditioned the project enormously: it stretched dramatically in time and its cost soared, to the point that the Danish architect himself was removed from the works before their completion. Moreover, the roof structure finally built differs greatly from what Utzon initially envisioned. Where he had imagined thin concrete shells floating over the landscape, a heavier structure was built, formed of prestressed concrete ribs of notably larger section, and the form itself also had to be modified substantially with respect to the initial proposal. If this building were to be built today, events would surely follow a very different course. Under this assumption, the following questions arise: would it be possible, with the tools available today, to perform a structural analysis of the shell roof Utzon originally proposed in the competition? Would that proposal be structurally feasible? The following pages seek to answer these questions, highlighting the impact that personal computers and associated software have had on the way buildings are conceived and constructed.
Variants of the shell solution proposed in the competition phase have also been analyzed, attempting as far as possible to follow the suggestions that Ove Arup and his team made to Jørn Utzon over the course of the lengthy design process, in order to improve the overall behavior of the building's structure. Finally, starting from scratch and from a contemporary perspective, possible methodological approaches have been proposed for seeking structural solutions compatible with the form Utzon originally intended for the roofs of the Sydney Opera House, which was never built (nor analyzed), considering the technological, scientific, and industrial means available today.
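The Bézier curves mentioned above are commonly evaluated with de Casteljau's algorithm, which reduces the control polygon by repeated linear interpolation. A brief sketch (the control points are illustrative, not taken from any of the buildings discussed):

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by
    repeatedly interpolating adjacent control points."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [
            tuple((1 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

# A cubic curve: the endpoints are interpolated exactly,
# the two middle control points shape the arc between them.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
mid = de_casteljau(ctrl, 0.5)
```

Sampling `t` densely traces the smooth free-form curve; it is this kind of parametric control of shape, combined with FEM analysis, that removed the formal barrier the thesis describes.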