960 results for Compositional kriging
Abstract:
The OMEX core CD110 W90, retrieved from the Douro Mud Patch (DMP) off the River Douro in northern Portugal, records the period since the beginning of the Little Ice Age (LIA). The core chronology is based upon Pb-210 and Cs-137 data and a C-14 date from a level near the core base. Geochemical, granulometric, microfaunal (benthic foraminifera) and compositional data suggest the occurrence of precipitation changes which may have been, at least partially, influenced by the North Atlantic Oscillation (NAO), which contributes to the regulation of ocean-atmosphere dynamics in the North Atlantic. A southwesterly Atlantic storm track is associated with the negative phases of the NAO, when the Azores High is anomalously weak and stronger oceanographic hydrodynamism, downwelling events and increased rainfall generally occur. The prevalence of these conditions during the LIA left a record that corresponds to phases of major floods. During these phases the DMP received a higher contribution of relatively coarse-grained terrigenous sediments, enriched in quartz particles, which diluted the contribution of other minerals, as indicated by reduced concentrations of several lithogenic chemical elements such as Al, As, Ba, Ce, Co, Cu, Fe, K, La, Li, Mg, Mn, Mo, Na, Ni, P, Rb, Sc, Sn, Th, V and Y. The contribution of biogenic carbonate particles was also diluted, as revealed by the lower abundance of foraminifera and correlative lower concentrations of Ca and Sr. During this period, the DMP also received an increased contribution of organic matter, indicated by higher values of lignin remains and of a benthic foraminifera high-productivity index (BFHP), which gave rise to early diagenetic changes with pyrite formation. Since the beginning of the 20th century this contribution has diminished, probably due to several drier periods and the impact of human activities in the river basins (e.g. the construction of dams) and on the littoral (e.g. the construction of hard-engineering structures and sand extraction). During the first half of the 20th century mainly positive phases of the NAO prevailed, caused by above-normal strengthening of the Azores subtropical high-pressure centre and deepening of the Icelandic low-pressure centre. These phases may have contributed to the reduction in the supply of both terrigenous sediments and organic matter from shallow water to the DMP. During the positive phases of the NAO, sedimentation became finer. The development of mining and industrial activities during the 20th century is marked, in this core, by higher concentrations of Pb. Furthermore, the erosion of heaps left by wolfram (tungsten) mining leaves its signature as a peak in W concentrations recorded in DMP sediments deposited between the 1960s and the 1990s. Wolfram mining was an important activity in the middle of the 20th century, particularly during the Second World War. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
Abstract Background Transcript enumeration methods such as SAGE, MPSS, and sequencing-by-synthesis EST "digital northern" are important high-throughput techniques for digital gene expression measurement. As with other counting or voting processes, these measurements constitute compositional data, exhibiting properties particular to the simplex space, where the summation of the components is constrained. These properties are not present in regular Euclidean spaces, on which hybridization-based microarray data are often modeled. Therefore, pattern recognition methods commonly used for microarray data analysis may be non-informative for data generated by transcript enumeration techniques, since they ignore certain fundamental properties of this space. Results Here we present a software tool, Simcluster, designed to perform clustering analysis for data on the simplex space. We present Simcluster as a stand-alone command-line C package and as a user-friendly on-line tool. Both versions are available at: http://xerad.systemsbiology.net/simcluster. Conclusion Simcluster is designed in accordance with a well-established mathematical framework for compositional data analysis, which provides principled procedures for dealing with the simplex space, and is thus applicable in a number of contexts, including enumeration-based gene expression data.
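The Aitchison-style workflow this abstract alludes to can be sketched briefly: counts are closed to proportions, mapped out of the simplex with a log-ratio transform, and clustered in the resulting Euclidean space. Below is a minimal sketch of that idea, assuming a centered log-ratio (clr) transform and hierarchical clustering; it is not Simcluster's actual implementation, and the count data are hypothetical.

```python
# Minimal sketch: clustering compositional (simplex) count data via the
# centered log-ratio transform, then ordinary hierarchical clustering.
# Illustration only; not Simcluster's implementation.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def clr(counts, pseudocount=0.5):
    """Centered log-ratio transform of a counts matrix (rows = samples)."""
    x = counts + pseudocount              # avoid log(0) for unseen tags
    x = x / x.sum(axis=1, keepdims=True)  # closure: project onto the simplex
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

# Hypothetical tag counts for 4 libraries over 5 transcripts.
counts = np.array([[120, 30, 5, 0, 45],
                   [110, 25, 8, 1, 50],
                   [10, 200, 90, 3, 2],
                   [12, 190, 85, 5, 1]])

z = clr(counts)                           # now in Euclidean coordinates
labels = fcluster(linkage(z, method="average"), t=2, criterion="maxclust")
print(labels)                             # e.g. [1 1 2 2]
```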
Abstract:
Abstract Background A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. To decide whether a given probability is sufficient, the most common approach is Bayesian binary classification, in which the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model; a null model can serve as this alternative. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final result of classifications. In particular, we are interested in minimizing the number of false predictions in a classification, a crucial issue for reducing the costs of biological validation. Results Across all tests, the target null model yielded the lowest number of false positives when using random sequences as a test. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study was performed using randomly generated sequences. Previous studies were performed on amino acid sequences, using only one probabilistic model (HMM) and a specific benchmark, and lacked more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
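The scoring scheme described here, a log-odds ratio between a family model and a position-independent null model, can be sketched minimally as follows. The "family" composition, the test sequence, and all numbers are hypothetical, and the zeroth-order family model is a placeholder for a real profile HMM.

```python
# Minimal sketch of HMMER/SAM-style null-model scoring: the score is the
# log-odds of the family model versus a position-independent null model.
# The "uniform" and "target" nulls mirror two of the models compared in
# the study; the family model here is a toy composition model.
import math
from collections import Counter

def log_prob_iid(seq, probs):
    """log P(seq) under a position-independent residue distribution."""
    return sum(math.log(probs[c]) for c in seq)

def null_uniform(seq, alphabet="ACGT"):
    return len(seq) * math.log(1.0 / len(alphabet))

def null_target(seq):
    """Null model built from the target sequence's own composition."""
    freq = Counter(seq)
    return sum(n * math.log(n / len(seq)) for n in freq.values())

# Hypothetical GC-rich "family" composition, for illustration only.
family = {"A": 0.15, "C": 0.35, "G": 0.35, "T": 0.15}

seq = "GCGCGGCCGCATGCGG"
for name, null in [("uniform", null_uniform), ("target", null_target)]:
    score = log_prob_iid(seq, family) - null(seq)
    print(f"{name:7s} null: log-odds = {score:+.2f}")
# A GC-rich sequence scores well against the uniform null purely on
# composition (the GC bias discussed above); the target null removes
# most of that advantage, raising specificity.
```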
Abstract:
Abstract Background All organisms living under an aerobic atmosphere have powerful mechanisms that protect their macromolecules against reactive oxygen species. Microorganisms have developed biomolecule-protecting systems in response to starvation and/or oxidative stress, such as DNA biocrystallization with Dps (DNA-binding protein from starved cells). Dps is a protein produced in large amounts when the bacterial cell faces harm, resulting in DNA protection. In this work, we evaluated glycosylation of the Dps extracted from Salmonella enterica serovar Typhimurium. This Dps was purified from the crude extract as an 18-kDa protein by means of affinity chromatography on an immobilized jacalin column. Results N-terminal sequencing of the jacalin-bound protein revealed 100% identity with the Dps of S. enterica serovar Typhimurium. Methyl-alpha-galactopyranoside inhibited the binding of Dps to jacalin in an enzyme-linked lectin assay, suggesting that the carbohydrate recognition domain (CRD) of jacalin is involved in the interaction with Dps. Furthermore, monosaccharide compositional analysis showed that Dps contained mannose, glucose, and an unknown sugar residue. Finally, jacalin-binding Dps was detected in larger amounts during the earlier bacterial growth periods, whereas total Dps was detected at high levels throughout the bacterial growth period. Conclusion Taken together, these results indicate that Dps undergoes post-translational modifications in the pre- and early stationary phases of bacterial growth. There is also evidence that a small mannose-containing oligosaccharide is linked to this bacterial protein.
Abstract:
Abstract Background In recent years, the growing demand for biofuels has encouraged the search for different sources of underutilized lignocellulosic feedstocks that are available in sufficient abundance to be used for sustainable biofuel production. Much attention has been focused on biomass from grass. However, large amounts of timber residues such as eucalyptus bark are available and represent a potential source for conversion to bioethanol. In the present paper, we investigate the effects of a delignification process with increasing sodium hydroxide concentrations, preceded or not by a dilute acid step, on the bark of two eucalyptus clones: Eucalyptus grandis (EG) and the hybrid E. grandis x urophylla (HGU). The enzymatic digestibility and total cellulose conversion were measured, along with the effect on the composition of the solid and liquor fractions. Barks were also assessed using Fourier-transform infrared spectroscopy (FTIR), solid-state nuclear magnetic resonance (NMR), X-ray diffraction, and scanning electron microscopy (SEM). Results Compositional analysis revealed an increase in cellulose content, reaching around 81% and 76% glucose for HGU and EG, respectively, using a two-step treatment with 1% HCl followed by 4% NaOH. Lignin removal was 84% (HGU) and 79% (EG), while hemicellulose removal was 95% and 97% for HGU and EG, respectively. However, when we applied a one-step treatment with 4% NaOH, higher hydrolysis efficiencies were found after 48 h for both clones, reaching almost 100% for HGU and 80% for EG, in spite of the lower lignin and hemicellulose removal. Total cellulose conversion increased from 5% and 7% to around 65% for HGU and 59% for EG. NMR and FTIR provided important insight into the lignin and hemicellulose removal, and SEM studies shed light on the destructuring of the cell wall after pretreatment and on lignin migration and precipitation on the fiber surfaces, which explain the different hydrolysis rates found for the clones. Conclusion Our results show that the single-step alkaline pretreatment improves the enzymatic digestibility of eucalyptus bark. Furthermore, the chemical and physical methods combined in this study provide a better understanding of the pretreatment effects on the cell wall and of the factors that influence the enzymatic digestibility of this forest residue.
Abstract:
This article seeks to explicate the concept of irony in the Theory of the Novel. The explication of the concept unfolds in a twofold development: as a normative-compositional requirement and as a subjective radicalization that exceeds normativity. In the first sense, irony subjectively configures a totality in the epic work, starting from its objective fragmentation in modern social relations. In this sense, irony presents itself as a subjective maneuver in the service of the novel's epic normativity, since its purpose is to harmonize the subjective ideal with bourgeois historical objectivity. Its paradigm is represented, in this article, by Goethe. The other sense in which romantic irony appears is marked by the extreme form of subjectivity. Recognizing the impossibility of realizing its harmonic ideal in modernity, because the modern world presents itself as an effectivity opposed to subjective aspirations, subjectivity takes refuge in its own interiority and distances itself from the present world, seeking refuge in times and places more propitious to poetic realization. Novalis is the model of this radicalized irony. This ironic form, unlike Goethe's "ironic cadence", annihilates the novel form, since the subjective aspect of pure reflection, the lyric, overrides the present historical objectivity that the novel must also necessarily encompass.
Abstract:
There is special interest in the incorporation of metallic nanoparticles into a surrounding dielectric matrix to obtain composites with desirable characteristics, such as surface plasmon resonance, which can be used in photonics and sensing, and controlled surface electrical conductivity. We investigated nanocomposites produced through metallic ion implantation into insulating substrates, where the implanted metal self-assembles into nanoparticles. During implantation, the excess of metal atom concentration above the solubility limit leads to nucleation and growth of metal nanoparticles, driven by the temperature and temperature gradients within the implanted sample, including beam-induced thermal effects. The nanoparticles nucleate near the maximum of the implantation depth profile (projected range), which can be estimated by computer simulation using TRIDYN, a Monte Carlo simulation program based on the TRIM (Transport and Range of Ions in Matter) code that takes into account compositional changes in the substrate due to two factors: previously implanted dopant atoms, and sputtering of the substrate surface. Our study suggests that the nanoparticles form a two-dimensional array buried a few nanometers below the substrate surface. More specifically, we have studied the Au/PMMA (polymethylmethacrylate), Pt/PMMA, Ti/alumina and Au/alumina systems. Transmission electron microscopy of the implanted samples showed the metallic nanoparticles formed in the insulating matrix. The nanocomposites were characterized by measuring the resistivity of the composite layer as a function of the implanted dose. These experimental results were compared with a model based on percolation theory, in which electron transport through the composite is explained by conduction through a random resistor network formed by the metallic nanoparticles. Excellent agreement was found between the experimental results and the predictions of the theory. It was possible to conclude, in all cases, that the conduction process is due only to percolation (when the conducting elements are in geometric contact) and that the contribution from tunneling conduction is negligible.
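The percolation picture invoked here, conduction appearing only once metallic elements form geometrically connected paths, can be illustrated with a minimal site-percolation sketch. This is a generic 2D toy model, not the authors' random-resistor-network calculation; grid size and fill fractions are arbitrary.

```python
# Minimal site-percolation sketch of the conduction picture above:
# nanoparticles are randomly occupied sites on a 2D grid, and the layer
# "conducts" when occupied sites form a connected path between opposite
# edges (geometric contact, no tunneling). Generic illustration only.
import numpy as np
from scipy.ndimage import label

def percolates(occupied):
    """True if an occupied cluster spans from the left edge to the right."""
    clusters, _ = label(occupied)          # 4-connected cluster labeling
    left, right = set(clusters[:, 0]), set(clusters[:, -1])
    return len((left & right) - {0}) > 0   # 0 is the empty background

rng = np.random.default_rng(0)
L = 200
for p in (0.40, 0.55, 0.70):               # fraction of metal-covered sites
    grid = rng.random((L, L)) < p
    print(f"fill fraction {p:.2f}: spanning cluster = {percolates(grid)}")
# The 2D site-percolation threshold is near p ≈ 0.593, so in this sketch
# conduction switches on sharply between 0.55 and 0.70.
```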
Abstract:
Size effects on phase stability and phase transitions in technologically relevant materials have received growing attention. Several works have reported that metastable phases can be retained at room temperature in nanomaterials, these phases generally corresponding to the high-temperature polymorph of the same material in the bulk state. Additionally, size-dependent shifts in solubility limits and/or in the transition temperatures on heating or cooling cycles have been observed. ZrO2-Sc2O3 (zirconia-scandia) solid solutions are known to exhibit very high oxygen ion conductivity provided their structure is composed of cubic and/or pseudocubic tetragonal phases. Unfortunately, for solid zirconia-scandia polycrystalline samples with typical micrometric average crystal sizes, the high-conductivity cubic phase is only stable above 600°C. Depending on composition, three low-conductivity rhombohedral phases (β, γ and δ) are stable below 600°C down to room temperature, within the compositional range of interest for SOFCs. In previous investigations, we showed that the rhombohedral phases can be avoided in nanopowders with average crystallite size lower than 35 nm.
Abstract:
Sinking particles through the pelagic ocean have traditionally been considered the most important vehicle by which the biological pump sequesters carbon in the ocean interior. Nevertheless, regional-scale variability in particle flux is a major outstanding issue in oceanography. Here, we have studied the regional and temporal variability of total particulate organic matter fluxes, as well as chloropigment and total hydrolyzed amino acid (THAA) compositions and fluxes, in the Canary Current region, between 20–30° N, during two contrasting periods: August 2006, characterized by warm and stratified waters but also intense winds which enhanced eddy development south of the Canary Islands, and February 2007, characterized by colder waters, less stratification and higher productivity. We found that the eddy field generated south of the Canary Islands enhanced particulate organic carbon (POC) export by >2 times with respect to stations (FF; far-field) outside the eddy-field influence. We also observed flux increases of one order of magnitude in chloropigments and 70% in THAA in the eddy field relative to FF stations. Principal Components Analysis (PCA) was performed to assess changes in particulate organic matter composition between stations. At eddy-field stations, higher chlorophyll enrichment reflected "fresher" material, while at FF stations a higher proportion of pheophytin indicated greater degradation due to microbes and microzooplankton. PCA also suggests that phytoplankton community structure, particularly the dominance of diatoms versus carbonate-rich plankton, is the major factor influencing POC export within the eddy field. In February, POC export fluxes were the highest ever reported for this area, reaching values of 15 mmol C m−2 d−1 at 200 m depth. Compositional changes in pigments and THAA indicate that the source of sinking particles varies zonally and meridionally, and suggest that sinking particles were more degraded at near-coastal stations relative to open-ocean stations.
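As an aside on the PCA step: applied to composition profiles like these, it reduces each station to a few scores whose leading axis separates "fresh" from degraded material. The sketch below illustrates this on hypothetical pigment/THAA fractions; it is not the study's dataset or its exact preprocessing.

```python
# Minimal sketch of PCA on hypothetical particulate-organic-matter
# composition profiles (relative abundances per station). Names and
# numbers are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

stations = ["eddy-1", "eddy-2", "FF-1", "FF-2"]
# Columns: chlorophyll-a, pheophytin, THAA fraction (hypothetical values).
X = np.array([[0.60, 0.15, 0.25],
              [0.55, 0.20, 0.25],
              [0.20, 0.55, 0.25],
              [0.25, 0.50, 0.25]])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for name, (pc1, pc2) in zip(stations, scores):
    print(f"{name:7s} PC1 = {pc1:+.2f}  PC2 = {pc2:+.2f}")
# PC1 separates chlorophyll-rich ("fresh") eddy stations from degraded,
# pheophytin-rich far-field stations, mirroring the interpretation above.
```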
Abstract:
The continuous increase in genome sequencing projects has produced a huge amount of data in the last 10 years: currently more than 600 prokaryotic and 80 eukaryotic genomes are fully sequenced and publicly available. However, the sequencing process alone determines just the raw nucleotide sequences. This is only the first step of the genome annotation process, which deals with assigning biological information to each sequence. Annotation is performed at each level of the biological information-processing mechanism, from DNA to protein, and cannot be accomplished only by in vitro analysis procedures, which are extremely expensive and time-consuming when applied at such a large scale. Thus, in silico methods are needed to accomplish the task. The aim of this work was the implementation of predictive computational methods allowing fast, reliable, and automated annotation of genomes and proteins starting from amino acid sequences. The first part of the work focused on the implementation of a new machine learning based method for predicting the subcellular localization of soluble eukaryotic proteins. The method, called BaCelLo, was developed in 2006. Its main peculiarity is its independence from biases present in the training dataset, which cause the over-prediction of the most represented examples in all other available predictors developed so far. This important result was achieved by a modification I made to the standard Support Vector Machine (SVM) algorithm, creating the so-called Balanced SVM. BaCelLo is able to predict the most important subcellular localizations in eukaryotic cells, and three kingdom-specific predictors were implemented. In two extensive comparisons, carried out in 2006 and 2008, BaCelLo was shown to outperform all the currently available state-of-the-art methods for this prediction task. BaCelLo was subsequently used to completely annotate 5 eukaryotic genomes, by integrating it into a pipeline of predictors developed at the Bologna Biocomputing group by Dr. Pier Luigi Martelli and Dr. Piero Fariselli. An online database, called eSLDB, was developed by integrating, for each amino acid sequence extracted from the genomes, the predicted subcellular localization merged with experimental and similarity-based annotations. In the second part of the work a new machine learning based method was implemented for the prediction of GPI-anchored proteins. The method efficiently predicts, from the raw amino acid sequence, both the presence of the GPI anchor (by means of an SVM) and the position in the sequence of the post-translational modification event, the so-called ω-site (by means of a Hidden Markov Model, HMM). The method, called GPIPE, was shown to greatly improve prediction performance for GPI-anchored proteins over all previously developed methods. GPIPE was able to predict up to 88% of the experimentally annotated GPI-anchored proteins while maintaining a false positive rate as low as 0.1%. GPIPE was used to completely annotate 81 eukaryotic genomes, and more than 15000 putative GPI-anchored proteins were predicted, 561 of which are found in H. sapiens. On average, 1% of a proteome is predicted as GPI-anchored. A statistical analysis was performed on the composition of the regions surrounding the ω-site, which allowed the definition of specific amino acid abundances in the different regions considered.
Furthermore, the hypothesis proposed in the literature that compositional biases are present among the four major eukaryotic kingdoms was tested and rejected. All the developed predictors and databases are freely available at: BaCelLo http://gpcr.biocomp.unibo.it/bacello eSLDB http://gpcr.biocomp.unibo.it/esldb GPIPE http://gpcr.biocomp.unibo.it/gpipe
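The class-balancing idea behind the Balanced SVM mentioned above can be sketched with off-the-shelf tools: error penalties are weighted inversely to class frequency, so over-represented localizations do not dominate the decision function. BaCelLo's actual modification is not detailed in the abstract; scikit-learn's class_weight="balanced" option and the synthetic data below are stand-ins.

```python
# Minimal sketch of class-balanced SVM training on an imbalanced dataset,
# standing in for the "Balanced SVM" idea described above. Illustration
# only; not BaCelLo's actual algorithm or features.
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Hypothetical, heavily imbalanced training set (e.g. 90% of examples from
# one localization class, 10% from another).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

plain = SVC(kernel="rbf").fit(X, y)
balanced = SVC(kernel="rbf", class_weight="balanced").fit(X, y)

# The balanced model trades some accuracy on the majority class for far
# better recall on the rare class, which is the point of the modification.
for name, model in [("plain", plain), ("balanced", balanced)]:
    recall = model.score(X[y == 1], y[y == 1])
    print(f"{name:9s} minority-class recall on training data: {recall:.2f}")
```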
Abstract:
The application of Concurrency Theory to Systems Biology is in its earliest stage of progress. The metaphor of cells as computing systems, due to Regev and Shapiro, opened the way to the employment of concurrent languages for the modelling of biological systems. Their peculiar characteristics led to the design of many bio-inspired formalisms which achieve higher faithfulness and specificity. In this thesis we present pi@, an extremely simple and conservative extension of the pi-calculus representing a keystone in this respect, thanks to its expressive capabilities. The pi@ calculus is obtained by adding polyadic synchronisation and priority to the pi-calculus, in order to achieve compartment semantics and atomicity of complex operations, respectively. In its direct application to biological modelling, the stochastic variant of the calculus, Spi@, is shown to be able to model consistently several phenomena such as formation of molecular complexes, hierarchical subdivision of the system into compartments, inter-compartment reactions, and dynamic reorganisation of the compartment structure consistent with volume variation. The pivotal role of pi@ is evidenced by its capability of encoding several bio-inspired formalisms in a compositional way, so that it represents the optimal core of a framework for the analysis and implementation of bio-inspired languages. In this respect, encodings of BioAmbients, Brane Calculi and a variant of P Systems into pi@ are formalised. The conciseness of their translations into pi@ allows their indirect comparison by means of their encodings. Furthermore, it provides a ready-to-run implementation of minimal effort whose correctness is guaranteed by the correctness of the respective encoding functions. Further important results of general validity are stated on the expressive power of priority. Several impossibility results are described, which clearly establish the superior expressiveness of prioritised languages and the problems arising in the attempt to provide a parallel implementation of them. To this aim, a new setting in distributed computing (the last man standing problem) is singled out and exploited to prove the impossibility of providing a purely parallel implementation of priority by means of point-to-point or broadcast communication.
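To give a concrete feel for what priority buys, the atomicity of complex operations mentioned above, here is a toy prioritised transition system: among all enabled transitions, only those of maximal priority may fire, so a high-priority sequence completes before any low-priority step can interleave. This is a generic illustration of prioritised reduction, not the pi@ semantics themselves; all state names and reactions are hypothetical.

```python
# Toy prioritised transition system: only maximal-priority enabled
# transitions may fire. Generic illustration of prioritised reduction.
import random

# Each transition: (priority, label, guard, effect) over a state dict.
transitions = [
    (1, "bind",   lambda s: s["A"] > 0 and s["B"] > 0,
                  lambda s: s.update(A=s["A"]-1, B=s["B"]-1, AB=s["AB"]+1)),
    (2, "finish", lambda s: s["mid"] > 0,
                  lambda s: s.update(mid=s["mid"]-1, done=s["done"]+1)),
    (1, "start",  lambda s: s["A"] > 0,
                  lambda s: s.update(A=s["A"]-1, mid=s["mid"]+1)),
]

def step(state):
    enabled = [t for t in transitions if t[2](state)]
    if not enabled:
        return None
    top = max(p for p, *_ in enabled)            # maximal priority wins
    choice = random.choice([t for t in enabled if t[0] == top])
    choice[3](state)
    return choice[1]

state = {"A": 2, "B": 2, "AB": 0, "mid": 0, "done": 0}
while (label := step(state)):
    print(label, state)
# Once "start" creates an intermediate, the priority-2 "finish" fires
# before any further priority-1 step, so start;finish behaves atomically.
```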
Abstract:
In the whole of Europe the most important composer of concertos for two violins is indubitably Vivaldi (1678-1741), who produced almost thirty works of this type over nearly the full length of his creative career. The dissertation examines this particular side of Vivaldi's activity, starting with an examination of the concerto in Rome, Bologna, and Venice at the turn of the seventeenth and eighteenth centuries. The aspects investigated include the 'conceptual' origins of the double concerto for two violins in Vivaldi; the nature, distribution and interrelationship of their sources (particular attention being given to compositional revisions in the autograph manuscripts); and an analysis of the works themselves that takes in form, tonal structure, technical-instrumental character and performance practice. The concertos that have come down to us in particularly problematic non-autograph sources are discussed in detail and presented in critical editions. A reconstruction is offered of the two works (RV 520 and 526) that have survived only in incomplete form, lacking the part of the first soloist. The concertos for two violins composed in Germany by Telemann and J. S. Bach, the contemporaries of Vivaldi who paid greatest attention to the double concerto genre, are then described and analysed. The thesis ends with a complete list of modern editions of Vivaldi's concertos for two violins and a select discography.
Abstract:
The purpose of this research is to deepen the study of the section in architecture. The survey takes as its principal object the project Teatro Domestico by Aldo Rossi, built for the XVII Triennale di Milano in 1986, and, by applying it to several topics of architecture, verifies its timeliness and fertility for new compositional exercises. Through the study of certain areas of Rossi's theory, we have tried to find a common thread for the reading of the theater project. The theater is the place of the ephemeral and the artificial, which is why its destiny is its end and its fatal loss. The design and construction of theater settings has always carried a double meaning, between the value of civil architecture and the testing of newly available technologies. Rossi's experiences in this area are clear examples of the inseparable relationship between the representation of architecture as art and the design of architecture as a model of reality. In the Teatro Domestico, the distinction between representation and the real world is constantly cancelled and restored through the reversal of meaning and through shifts of scale. At present, studies of Rossi's work address the relationship between architectural composition and the theory of form, focusing on the compositional development of a design process situated between typological analysis and formal invention. This research, through the analysis of a few projects and drawings, tries to examine this issue through the rules of composition, both graphic and built, hoping to decipher the mechanism underlying the invention. The almost total lack of published material on the Teatro Domestico project, together with the opportunity to visit the archives that preserve the drawings, has allowed the author to explore the project's internal issues in depth, placing this research as a first step toward possible further analysis of Rossi's works linked to the world of performance. The final aim is therefore to produce material that can best describe Rossi's work. Through the reading of material published by the author himself and the examination of unpublished material preserved in the archives, it was possible to develop new material and increase knowledge of the work, which is otherwise difficult to analyze. The research is divided into two parts. The first, taking into account the close relationship, frequently noted by Rossi himself, between archeology and architectural composition, stresses the importance of the tipo (type) as a system for reading urban composition as well as an open tool of invention. Resuming Ezio Bonfanti's essay on the architect's work, we investigate how the paratactic method is applied to the early works and how, subsequently, the process reaches an accentuated complexity while keeping its basic terms stable. Following a brief introduction to the concept of the section and the different interpretations the term has had over time, we try to use it as a methodology for reading Rossi's projects. The result is a consistently typological interpretation of the term, related not only to composition in plan but also to the elevations. The section is therefore understood as the overturning of the elevation onto the same plane: the terms used reveal not a different approach but a similarity of characters. The identification of architectural phonemes allows comparison with the other arts.
The research then moves in the direction of language, trying to identify the relationship between representation and construction, between the ephemeral and the real world. In this sense it highlights the similarities between the graphic material produced by Rossi and some important examples by contemporary authors. The comparison of his compositional system with the surrealist world of painting and literature facilitates the understanding and identification of possible rules applied by Rossi. The second part of the research focuses on the intentions of the chosen project. Teatro Domestico embodies a number of elements that seem to conclude (marking an end point but also a new start) the author's itinerary. With it, the experiments on the theater begun with the project for the Teatrino Scientifico (1978) and continued with the Teatro del Mondo (1979) culminate in a lay tabernacle representing the collective and private memory of the city. Starting from a reading of the project, through the collection of published material, we analyze the explicit themes of the work and trace their conceptual references. Drawing on the original unpublished materials kept in the Aldo Rossi Collection of the Canadian Centre for Architecture archive in Montréal, a virtual reconstruction of the project is implemented using current techniques of digital representation, adding to the existing material a new element for future studies. The reconstruction is part of a larger research program in which current technologies of composition and representation in architecture stand side by side with research on this architect's compositional method. The results achieved join past experiences in the reconstruction of some of Aldo Rossi's lost works. A partial objective is to reactivate a discourse around a work considered minor among the many born of his prolific activity, and to reassess ephemeral projects by restoring to them the value they have earned. In conclusion, the research aims to open a new field of interest in the section, not only as a technical instrument for representing an idea but as an actual mechanism through which composition takes shape and the idea is developed.