914 results for DFT piperidine morpholine computational study diastereoselection chemodivergent synthesis
Abstract:
A computational study of line-focus generation was performed using a self-written ray-tracing code and compared with experimental data. Two line-focusing geometries were compared: exploiting the sagittal astigmatism of a tilted spherical mirror, or using the spherical aberration of an off-axis-illuminated spherical mirror. Line focusing by means of astigmatism or spherical aberration gave identical results, as expected from the equivalence of the two frames of reference. Varying the incidence angle on the target changes the line-focus length and hence the amplification length; as long as the irradiance remains above the amplification threshold, a longer line focus is advantageous. The amplification threshold depends physically on the operating parameters and plasma-column conditions; the present study addresses four possible cases.
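For reference, the two geometries can be related through the standard astigmatic-focus relations for a spherical mirror of radius R illuminated at incidence angle θ; these textbook expressions are quoted for orientation only and are not taken from the paper itself:

```latex
f_t = \frac{R}{2}\cos\theta, \qquad
f_s = \frac{R}{2\cos\theta}, \qquad
\Delta f = f_s - f_t = \frac{R}{2}\left(\frac{1}{\cos\theta} - \cos\theta\right)
```

The separation Δf between the tangential and sagittal foci is what stretches the focus into a line, and it grows with the incidence angle θ.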
Abstract:
The abundance of alpha-fetoprotein (AFP), a natural protein produced by the fetal yolk sac during pregnancy, correlates with a lower incidence of estrogen receptor positive (ER+) breast cancer. The pharmacophore region of AFP has been narrowed down to a four-amino-acid (AA) region in the third domain of the 591-AA protein. Our computational study focuses on a 4-mer segment consisting of the amino acids threonine-proline-valine-asparagine (TPVN). We ran replica exchange molecular dynamics (REMD) simulations and used 120 configurational snapshots from the total trajectory as starting configurations for quantum chemical calculations. We optimized structures using semiempirical (PM3, PM6, PM6-D2, PM6-H2, PM6-DH+, PM6-DH2) and density functional methods (TPSS, PBE0, M06-2X). By comparing the accuracy of these methods against RI-MP2 benchmarks, we devised a protocol for calculating the lowest-energy conformers of these peptides accurately and efficiently. The protocol screens out high-energy conformers using lower levels of theory and outlines a general method for predicting small peptide structures.
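To make the screening protocol concrete, here is a minimal Python sketch of the kind of energy-window funnel the abstract describes; the 5 kcal/mol window and the callables pm6_energy, dft_energy, and rimp2_energy are illustrative assumptions, not details reported in the study:

```python
# Hedged sketch of a hierarchical conformer-screening funnel; the energy
# window and the external QM energy functions are assumptions.
KCAL_WINDOW = 5.0  # keep conformers within 5 kcal/mol of the running minimum (assumed)

def screen(conformers, rank_fn, window=KCAL_WINDOW):
    """Rank conformers with a cheap method and keep a low-energy window."""
    energies = {c: rank_fn(c) for c in conformers}
    e_min = min(energies.values())
    return [c for c, e in energies.items() if e - e_min <= window]

def funnel(snapshots, pm6_energy, dft_energy, rimp2_energy):
    """Funnel: semiempirical pre-screen -> DFT re-ranking -> benchmark ordering."""
    survivors = screen(snapshots, pm6_energy)   # e.g. a PM6-class pre-screen
    survivors = screen(survivors, dft_energy)   # e.g. an M06-2X re-ranking
    return min(survivors, key=rimp2_energy)     # final RI-MP2 ordering
```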
Abstract:
Bidirectional promoters regulate adjacent genes organized in a divergent fashion (head-to-head orientation). Several reports on genome-scale analyses of bidirectional promoters exist for mammals. This work provides the essential theoretical and experimental background for carrying out a genome-scale analysis of bidirectional promoters in plants. A computational study was performed to identify putative bidirectional promoters and the over-represented cis-regulatory motifs in three sequenced plant genomes, rice (Oryza sativa), Arabidopsis thaliana, and Populus trichocarpa, using the Plant Cis-acting Regulatory DNA Elements (PLACE) and PlantCARE databases. Over-represented motifs, along with their possible functions, were described with the help of a few conserved representative putative bidirectional promoters from the three model plants. This laid a foundation for the experimental evaluation of bidirectional promoters in plants. A novel Agrobacterium tumefaciens mediated transient expression assay (AmTEA) was developed for young plants of different cereal species and the model dicot Arabidopsis thaliana. AmTEA was evaluated using five promoters (six constructs) and two reporter genes, gus and egfp. The efficacy and stability of AmTEA were compared with stable transgenics using the Arabidopsis DEAD-box RNA helicase family gene promoter. AmTEA was primarily developed to overcome the many problems associated with the development of transgenics and expression studies in plants. Finally, a possible mechanism for the activity of bidirectional promoters was highlighted. Deletion analysis using promoter-reporter gene constructs identified three rice promoters as bidirectional. Regulatory elements located in the 5′-untranslated region (UTR) of one of the genes of the divergent gene pair were found to be responsible for their bidirectional activity.
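As an illustration of the first, computational step, the sketch below shows one common way to flag putative bidirectional promoters as divergent (head-to-head) gene pairs separated by a short intergenic region; the 1 kb cutoff and the gene-record layout are assumptions for illustration, not parameters reported in this work:

```python
# Hedged sketch of a divergent gene-pair scan; cutoff and data layout assumed.
from collections import namedtuple

Gene = namedtuple("Gene", "name chrom start end strand")  # start < end

def divergent_pairs(genes, max_gap=1000):
    """Return head-to-head gene pairs whose intergenic gap is <= max_gap bp."""
    pairs = []
    by_chrom = {}
    for g in genes:
        by_chrom.setdefault(g.chrom, []).append(g)
    for chrom_genes in by_chrom.values():
        chrom_genes.sort(key=lambda g: g.start)
        for a, b in zip(chrom_genes, chrom_genes[1:]):
            # head-to-head: upstream gene on '-' strand, downstream gene on '+',
            # so both transcription start sites face the shared intergenic region
            if a.strand == "-" and b.strand == "+" and b.start - a.end <= max_gap:
                pairs.append((a.name, b.name, b.start - a.end))
    return pairs
```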
Abstract:
The physics of the operation of single-electron tunneling devices (SEDs) and single-electron tunneling transistors (SETs), especially those with multiple nanometer-sized islands, has remained poorly understood despite intensive experimental and theoretical research. This computational study examines the current-voltage (IV) characteristics of multi-island single-electron devices using a newly developed multi-island transport simulator (MITS) based on semi-classical tunneling theory and kinetic Monte Carlo simulation. The dependence of device characteristics on physical device parameters is explored, and physical mechanisms that lead to the Coulomb blockade (CB) and Coulomb staircase (CS) characteristics are proposed. Simulations using MITS demonstrate that the overall IV characteristics of a device with a random distribution of islands result from a complex interplay between the factors that affect the tunneling rates and are fixed a priori (e.g., island sizes, island separations, temperature, gate bias) and the evolving charge state of the system, which changes as the source-drain bias (VSD) is changed. With increasing VSD, a multi-island device has to overcome multiple discrete energy barriers (up-steps) before it reaches the threshold voltage (Vth). Beyond Vth, current flow is rate-limited by slow junctions, which leads to the CS structures in the IV characteristic. Each step in the CS is characterized by a unique distribution of island charges with an associated distribution of tunneling probabilities. MITS simulation studies on one-dimensional (1D) disordered chains show that longer chains are better suited for switching applications, as Vth increases with increasing chain length. They also retain CS structures at higher temperatures better than shorter chains do. In sufficiently disordered 2D systems, we demonstrate that there may exist a dominant conducting path (DCP), which makes the 2D device behave as a quasi-1D device. The existence of a DCP is sensitive to the device structure but is robust with respect to changes in temperature, gate bias, and VSD. A side gate in 1D and 2D systems can effectively control Vth. We argue that devices with smaller islands and narrower junctions may be better suited for practical applications, especially at room temperature.
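For orientation, a kinetic Monte Carlo driver of the general kind underlying MITS selects the next tunneling event with probability proportional to its rate and draws the waiting time from an exponential distribution; the sketch below is a generic illustration under those assumptions, not MITS itself:

```python
# Hedged sketch of one kinetic Monte Carlo step; the rate table is assumed.
import math
import random

def kmc_step(rates):
    """Pick a tunneling event with probability proportional to its rate and
    advance time by an exponentially distributed increment."""
    total = sum(rates.values())
    r = random.random() * total
    acc = 0.0
    for event, rate in rates.items():
        acc += rate
        if r <= acc:
            dt = -math.log(1.0 - random.random()) / total
            return event, dt
    raise RuntimeError("unreachable for positive rates")

# usage: rates maps each junction's candidate tunneling event to its rate,
# e.g. computed from the current island charge state and bias (values assumed)
event, dt = kmc_step({"source->island0": 1.1e8, "island0->drain": 2.3e9})
```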
Abstract:
Prevention and treatment of osteoporosis rely on understanding the micromechanical behaviour of bone and its influence on fracture toughness and cell-mediated adaptation processes. Postyield properties may be assessed by nonlinear finite element simulations of nanoindentation using elastoplastic and damage models. This computational study aims at determining the influence of yield surface shape and damage on the depth-dependent response of bone to nanoindentation using spherical and conical tips. Yield surface shape and damage were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio, as well as the elastic-to-total work ratio is well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not a statistically significant factor; for spherical tips, damage was not significant (significance threshold p < 0.0001). The knowledge gained can be used to develop an inverse method for identifying the postelastic properties of bone from nanoindentation.
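For context, indentation modulus and hardness are conventionally extracted from the load-displacement curve through the Oliver-Pharr relations; these standard definitions are given for orientation and are not necessarily the exact forms used in the study:

```latex
H = \frac{P_{\max}}{A_c}, \qquad
E_r = \frac{\sqrt{\pi}}{2\beta}\,\frac{S}{\sqrt{A_c}}, \qquad
\frac{1}{E_r} = \frac{1-\nu^2}{E} + \frac{1-\nu_i^2}{E_i}
```

Here P_max is the peak load, A_c the projected contact area, S = dP/dh the unloading stiffness, β a tip-geometry factor, and E, ν (E_i, ν_i) the elastic constants of the sample (indenter).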
Abstract:
Refractive losses in laser-produced plasmas used as gain media are caused by electron density gradients and limit the energy transport range. The pump pulse is thus deflected from the high-gain region, and the short-wavelength laser signal also steers away, causing loss of collimation. A Hohlraum used as a target makes the plasma homogeneous and can mitigate refractive losses by means of wave-guiding. A computational study combining a hydrodynamics code and an atomic physics code is presented, which includes ray-tracing modeling based on the eikonal theory of the trajectory equation. The study presents gain calculations based on population inversion produced by free-electron collisions exciting bound electrons into metastable levels of the 3d⁹4d¹ (J = 0) → 3d⁹4p¹ (J = 1) transition of Ni-like Sn. Further, the results suggest that the Hohlraum dramatically enhances the conversion efficiency of collisionally excited x-ray lasing in Ni-like Sn.
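For reference, eikonal ray tracing in a plasma typically integrates the trajectory equation below, with the refractive index set by the local electron density; this standard form is quoted for orientation and is not reproduced from the paper:

```latex
\frac{d}{ds}\!\left(n\,\frac{d\mathbf{r}}{ds}\right) = \nabla n, \qquad
n(\mathbf{r}) = \sqrt{1 - \frac{n_e(\mathbf{r})}{n_c}}, \qquad
n_c = \frac{\varepsilon_0 m_e \omega^2}{e^2}
```

Here n_e is the electron density and n_c the critical density at the laser frequency ω; rays bend down the density gradient, which is the refractive-loss mechanism the Hohlraum geometry is meant to suppress.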
Abstract:
Aging societies suffer from an increasing incidence of bone fractures. Bone strength depends on the amount of mineral measured by clinical densitometry, but also on the micromechanical properties of the bone hierarchical organization. A good understanding has been reached for elastic properties on several length scales, but up to now there is a lack of reliable postyield data on the lower length scales. In order to describe the behavior of bone at the microscale, an anisotropic elastic-viscoplastic damage model was developed using an eccentric generalized Hill criterion and nonlinear isotropic hardening. The model was implemented as a user subroutine in Abaqus and verified using single-element tests. A FE simulation of microindentation in lamellar bone was finally performed, showing that the new constitutive model can capture the main characteristics of the indentation response of bone. As the generalized Hill criterion is limited to elliptical and cylindrical yield surfaces and the correct shape for bone is not known, a new yield surface was developed that takes any convex quadratic shape. The main advantage is that in the case of material identification the shape of the yield surface does not have to be anticipated; a minimization yields the optimal shape among all convex quadrics. The generality of the formulation was demonstrated by showing its degeneration to classical yield surfaces. Also, existing yield criteria for bone at multiple length scales were converted to the quadric formulation. Then, a computational study was performed to determine the influence of yield surface shape and damage on the indentation response of bone using spherical and conical tips. The constitutive model was adapted to the quadric criterion, and yield surface shape and critical damage were varied. They were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio, as well as the elastic-to-total work ratio was found to be very well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not a significant factor, while for spherical tips damage was insignificant. All inverse methods based on microindentation suffer from a lack of uniqueness of the identified material properties in the case of nonlinear material behavior. Therefore, monotonic and cyclic micropillar compression tests in a scanning electron microscope, allowing a straightforward interpretation, complemented by microindentation and macroscopic uniaxial compression tests, were performed on dry ovine bone to identify modulus, yield stress, plastic deformation, damage accumulation, and failure mechanisms. While the elastic properties were highly consistent, the postyield deformation and failure mechanisms differed between the two length scales. A majority of the micropillars showed a ductile behavior with strain hardening until failure by localization in a slip plane, while the macroscopic samples failed in a quasi-brittle fashion with microcracks coalescing into macroscopic failure surfaces. In agreement with a proposed rheological model, these experiments illustrate a transition from a ductile mechanical behavior of bone at the microscale to a quasi-brittle response driven by the growth of preexisting cracks along interfaces or in the vicinity of pores at the macroscale. Subsequently, a study was undertaken to quantify the topological variability of indentations in bone and examine its relationship with mechanical properties.
Indentations were performed in dry human and ovine bone in axial and transverse directions, and their topography was measured by AFM. Statistical shape modeling of the residual imprint allowed a mean shape to be defined and the variability to be described with 21 principal components related to imprint depth, surface curvature, and roughness. The indentation profile of bone was highly consistent and free of any pile-up. A few of the topological parameters, in particular depth, showed significant correlations with variations in mechanical properties, but the correlations were not very strong or consistent. We could thus verify that bone is rather homogeneous in its micromechanical properties and that indentation results are not strongly influenced by small deviations from the ideal case. As the uniaxial properties measured by micropillar compression are in conflict with the current literature on bone indentation, another dissipative mechanism has to be present. The elastic-viscoplastic damage model was therefore extended to viscoelasticity. The viscoelastic properties were identified from macroscopic experiments, while the quasistatic postelastic properties were extracted from micropillar data. It was found that viscoelasticity governed by macroscale properties has very little influence on the indentation curve and results in a clear underestimation of the creep deformation. Adding viscoplasticity leads to increased creep, but hardness is still highly overestimated. It was possible to obtain a reasonable fit with experimental indentation curves for both Berkovich and spherical indentation when abandoning the assumption of shear strength being governed by an isotropy condition. These results remain to be verified by independent tests probing the micromechanical strength properties in tension and shear. In conclusion, in this thesis several tools were developed to describe the complex behavior of bone on the microscale, and experiments were performed to identify its material properties. Micropillar compression highlighted a size effect in bone due to the presence of preexisting cracks and pores or interfaces like cement lines. It was possible to obtain a reasonable fit between experimental indentation curves using different tips and simulations using the constitutive model and uniaxial properties measured by micropillar compression. Additional experimental work is necessary to identify the exact nature of the size effect and the mechanical role of interfaces in bone. Deciphering the micromechanical behavior of lamellar bone, its evolution with age, disease, and treatment, and its failure mechanisms on several length scales will help prevent fractures in the elderly in the future.
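One common way to write an eccentric quadric yield criterion of the kind developed in this thesis is sketched below; this is a plausible generic form quoted for orientation, not necessarily the thesis's exact formulation. A fourth-order tensor sets the shape of the quadric, a second-order tensor its eccentricity, and convexity reduces to a positive-semidefiniteness condition on the quadratic term:

```latex
Y(\boldsymbol{\sigma}) = \sqrt{\boldsymbol{\sigma} : \mathbb{F} : \boldsymbol{\sigma}}
  \;+\; \mathbf{F} : \boldsymbol{\sigma} \;-\; 1 \;\le\; 0
```

Ellipsoidal and cylindrical surfaces (the generalized Hill cases) are then special cases, which is consistent with the degeneration to classical criteria described above.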
Abstract:
As the major anionic phospholipids predominantly found in the mitochondrial inner membrane of eukaryotic cells, cardiolipin (CL) and its precursor phosphatidylglycerol (PG) are of great importance in many critical mitochondrial processes. Pgs1Δ cells of Saccharomyces cerevisiae lacking both PG and CL display severe mitochondrial defects. Translation of several proteins, including the products of four mitochondrial DNA (mtDNA) encoded genes (COX1, COX2, COX3, and COB) and one nuclear-encoded gene (COX4), is inhibited. The molecular basis of this phenotype was analyzed using a combined biochemical, molecular, and genetic approach. Using a mitochondrially targeted green fluorescent protein (mtGFP) fused to the COX4 promoter and its 5′ and 3′ untranslated regions (UTRs), the lack of mtGFP expression, independent of carbon source and strain background, was confirmed to be at the translational level. The translational defect was not due to a deficiency in mitochondrial respiratory function but rather was caused directly by the lack of PG/CL in the mitochondrial membrane. Re-introduction of a functional PGS1 gene restored PG synthesis and expression of the above mtGFP. Deletion analysis of the 5′ UTR of COX4 mRNA revealed the presence of a 50-nt sequence acting as a cis-acting element inhibiting COX4 translation. Using similar constructs with HIS3 and lacZ as reporter genes, extragenic spontaneous mutations that allowed expression of His3p and β-galactosidase were isolated; these appeared to be recessive and derived from loss-of-function mutations, as determined by mating analysis. Using a tetracycline-repressible plasmid-borne PGS1 expression system and an in vivo mitochondrial protein translation assay, the translation of the mtDNA-encoded COX1 and COX3 mRNAs was shown to be significantly inhibited in parallel with reduced levels of PG/CL content. The cytoplasmic translation machinery therefore appears to be able to sense the level of PG/CL in mitochondria and to regulate COX4 translation coordinately with the mtDNA-encoded subunits. The essential requirement of PG and CL for mitochondrial function was further demonstrated in a study of the regulation of CL synthesis, at the level of transcription, by factors affecting mitochondrial biogenesis such as carbon source, growth phase, and mitochondrial mutations. We also demonstrated that CL synthesis depends on the level of PG and on the INO2/INO4 regulatory genes.
Abstract:
This study is a synthesis of paleomagnetic and mineral magnetic results for Sites 819 through 823 of Ocean Drilling Program (ODP) Leg 133, which lie on a transect from the outer edge of the Great Barrier Reef (GBR) down the continental slope to the bottom of the Queensland Trough. Because of viscous remagnetization and pervasive overprinting, few reversal boundaries can be identified in these extremely high-resolution Quaternary sequences. Some of the magnetic instability, and the differences in the quality of the paleomagnetic signal among sites, can be explained in terms of the dissolution of primary iron oxides in the high near-surface geochemical gradients. Well-defined changes in magnetic properties, notably susceptibility, reflect responses to glacio-eustatic sea-level fluctuations and changes in slope sedimentation processes resulting from formation of the GBR. Susceptibility can be used to correlate between adjacent holes at a given site to an accuracy of about 20 cm. Among-site correlation of susceptibility is also possible for certain parts of the sequences and permits (tentative) extension of the reversal chronology. The reversal boundaries that can be identified are generally compatible with the calcareous nannofossil biostratigraphy and demonstrate a high level of biostratigraphic consistency among sites. A revised chronology based on an optimum match with the susceptibility stratigraphy is presented. Throughout most of the sequences there is a strong inverse correlation both between magnetic susceptibility and calcium carbonate content, and between susceptibility and δ18O. In the upper, post-GBR, sections a more complicated type of magnetic response occurs during glacial maxima and subsequent transgressions, resulting in a positive correlation between susceptibility and δ18O. Prior to and during formation of the outer-reef barrier, the sediments have relatively uniform magnetic properties showing multidomain behavior and displaying cyclic variations in susceptibility related to sea-level change. The susceptibility oscillations are controlled more by carbonate dilution than by variation in terrigenous influx. Establishment of the outer reef between 1.01 and 0.76 Ma restricted the supply of sediment to the slope, causing a four-fold reduction in sedimentation rates and a transition from prograding to aggrading seismic geometries (see other chapters in this volume). The Brunhes/Matuyama boundary and the end of the transition period mark a change to lower and more subdued susceptibility oscillations with higher carbonate contents. The major change in magnetic properties comes at about 0.4 Ma in the aggrading sequence, which contains prominent sharp susceptibility peaks associated with glacial cycles, with distinctive single-domain magnetite and mixed single-domain/superparamagnetic characteristics. Bacterial magnetite has been found in the sediments, particularly where there are high susceptibility peaks, but its importance has not yet been assessed. A possible explanation for the characteristic pattern of magnetic properties in the post-GBR glacial cycles can be found in terms of fluvio-deltaic processes and inter-reefal lagoonal reservoirs that develop when the shelf becomes exposed at low sea level.
Abstract:
The computational study commented on by Touchette opens the door to a desirable generalization of standard large deviation theory for special, though ubiquitous, correlations. We focus on three interrelated aspects: (i) numerical results strongly suggest that the standard exponential probability law is asymptotically replaced by a power-law dominant term; (ii) a subdominant term appears to reinforce the thermodynamically extensive entropic nature of the q-generalized rate function; (iii) the correlations we discussed correspond to q-Gaussian distributions, differing from Lévy's, except in the case of Cauchy-Lorentz distributions. Touchette has agreeably discussed point (i), but, unfortunately, points (ii) and (iii) escaped his analysis. Claiming the absence of a connection with q-exponentials is unjustified.
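For reference, the q-Gaussian distributions mentioned in point (iii) are built on the standard q-exponential; with the usual definitions:

```latex
e_q(x) \equiv \left[1 + (1-q)\,x\right]_+^{\frac{1}{1-q}}, \qquad
P_q(x) \propto e_q\!\left(-\beta x^2\right)
```

so that q → 1 recovers the ordinary Gaussian and q = 2 yields the Cauchy-Lorentz distribution, the one case in which the q-Gaussian and Lévy families coincide, consistent with the exception noted above.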
Abstract:
This research was carried out at the Universidad Politécnica de Madrid (Spain) jointly with the Universidad Nacional Experimental del Táchira (Venezuela). The study consisted of designing an internal cavity within the 2415-3s airfoil, which was developed at the Czech Technical University (Prague, Czech Republic). A computational study of different internal cavity models in this airfoil was performed using CFD techniques in order to select the most suitable design, and a 3D prototype was manufactured; in this way the computational simulation was validated against the experimental data obtained from tests in the AF6109 wind tunnel of the Universidad Nacional Experimental del Táchira. Flow visualization techniques, such as smoke streamlines and oil films on the airfoil surface, were also applied in the wind tunnel and corroborated the validation of the computational simulation. The selected airfoil was named 2415-3s-TC; its main feature is a set of three mutually independent channels housed within the internal cavity, which allow the airflow forced through the cavity to change direction and discharge both as tangentially and as perpendicularly as possible to the step of the 2415-3s airfoil. This design configuration raised the lift coefficient for angles of attack above 8° as well as for angles close to the critical angle.
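For context, the lift coefficient referred to above is the standard nondimensional sectional lift; this textbook definition is quoted for orientation only:

```latex
c_l = \frac{L'}{\tfrac{1}{2}\,\rho\,V_\infty^{2}\,c}
```

Here L′ is the lift per unit span, ρ the air density, V∞ the freestream speed, and c the chord.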
Abstract:
Design and construction processes in architecture have historically shown very poor optimization when compared with typical industrial activities. The constant aspiration to effective industrialization, both to reach higher levels of quality and to save resources, today receives an unbeatable opportunity from the field of information technology: Building Information Modelling, or BIM. What may at first seem to be merely a certain type of software is in reality a "process" concept that subverts many routines that are common today in the development of architectural projects and construction. Including and developing data linked to the project, from its inception to the end of its life cycle, brings the opportunity to create a dynamic and updatable virtual reality, which in turn makes it possible to test and optimize the project in every respect: before and during execution, as well as over its service life. Added to this is the opportunity to transmit complete project data efficiently, with hardly any loss or rework, to the manufacturing chain, which facilitates the step toward a truly significant industrialization of building. In the face of a worldwide call to optimize resources and the undeniable interest in increasing economic benefits by reducing the uncertainty factor of processes, BIM represents an unquestionable option for improvement, and it has been recognized as such through its imminent mandatory implementation by governments (e.g., the United Kingdom in 2016 and Spain in 2018). The modification of processes and professional roles brought about by the adoption of BIM is highly significant and will shape the professional practice of future graduates in the Architecture, Engineering, and Construction (AEC) disciplines. Universities must respond quickly to these new needs, incorporating this methodology into formal teaching and contributing a synergistic vision that draws out the educational benefits underlying the BIM framework itself.
In this respect BIM, by gathering the whole data set in a single virtual model, offers a singularly interesting potential. The three-dimensional reality of the model, continuously developed and updated, offers students a radically different way of handling graphic representation, in which the partial views of sections and plans, so hard to assimilate at the beginning of university training, become a mere a posteriori request, to be extracted from the virtual model as needed. Design is always carried out on the single model itself, regardless of the working view chosen at any moment, with the data and their constructive relationships remaining permanently updated and fully coherent. This condensed description of the characteristics of BIM already prefigures a large part of the educational benefits offered by BIM processes, especially with regard to the development of integrated design and information management (including ICT). Also notable is the ease of visual comprehension of architectural elements, technical systems, their intrinsic relationships, and construction processes. To this is added the experimental development that the BIM platform offers through its collaborative software: simulation of the structural, energy, and economic behavior, among many others, of the virtual model on the basis of the data inherent in the project.
This thesis describes an overall study intended to make explicit both the qualities of, and possible reservations about, the use of BIM processes within a specific discipline: the teaching of Architecture. To this end, a general literature review on BIM and a specific one on teaching in Architecture were carried out, and the experiences of different interest groups were analyzed in the concrete setting of architectural education at the Universidad Europea de Madrid. The analysis of benefits and reservations regarding the use of BIM was approached through surveys of students and interviews with AEC professionals, whether or not they work with BIM. The conclusions of the study are synthesized into an implementation of BIM methodology that, for greater clarity and ease of communication and use, has been condensed into an eminently graphic Implementation Framework. It provides guidance on teaching actions for the development of specific competencies, taking advantage of the conceptual flexibility of curricula within the European Higher Education Area (Bologna Declaration) to incorporate the new teaching tool naturally in the service of the legally established educational objectives. The global approach of the proposed Implementation Framework facilitates the planning of teaching actions from an overall perspective: combining one-off and vehicular BIM formats, establishing transversal synergies, and harmonizing resources, so that the methodology can benefit both the assimilation of the knowledge and skills established for the degree and the BIM learning flow itself. In the same way it reserves, even visually, those areas of knowledge in which, at least under current planning, the inclusion of BIM processes is not considered advantageous with respect to other methodologies, or is even inadequate for the established teaching objectives. It is this last categorization that characterizes the set of conclusions of this research, centered on: 1. the unquestionable need to teach BIM concepts and processes from the very early stages of university education in Architecture; 2. the additional educational benefits that BIM brings to the development of the very diverse competencies contemplated in the academic curriculum; and 3. the specificity of the architect's professional role, which will demand a careful and measured implementation of BIM that respects the traditionally effective methodologies of creative development and adds value through a symbiotic reorientation with parametric design and digital fabrication, finally enabling generative design.
Analysis of the ORCC and Vivado HLS Tools for the Synthesis of RVC-CAL Dataflow Models
Abstract:
This Final Degree Project studies how to generate VHDL (VHSIC Hardware Description Language) models from dataflow models written in RVC-CAL (Reconfigurable Video Coding - CAL Actor Language) by means of Vivado HLS (Vivado High Level Synthesis), included in the tools available in Xilinx's Vivado suite. Once the resulting VHDL model is obtained, the intention is to use the Xilinx tools to program it onto an FPGA (Field Programmable Gate Array) or onto the Zynq device, also developed by Xilinx. RVC-CAL is a dataflow language that describes the functionality of functional blocks called actors. The functionalities carried out by an actor are defined as actions, which may differ within a single actor. Actors can communicate with one another and form a network of actors. With Vivado HLS, a VHDL design can be obtained from a model in the C language, so generating VHDL models from RVC-CAL models requires a prior phase in which the RVC-CAL models are compiled to obtain their equivalents in C. The ORCC compiler (Open RVC-CAL Compiler) is the tool that produces C-language designs from RVC-CAL models. ORCC does not directly create executable code; rather, it generates source code to be compiled by another tool, in the case of this project the GCC (GNU C Compiler) on Linux. In summary, this project comprises three clearly differentiated points of study: 1. We start from dataflow models in RVC-CAL, which are compiled by ORCC to obtain their C translations. 2. Once the equivalent C designs are obtained, they are synthesized in Vivado HLS to obtain the VHDL models. 3. The resulting VHDL models would then be processed by the Xilinx tools to produce the bitstream programmed onto an FPGA or the Zynq device. In the study of the second point, a number of conflicting elements were found that affect the Vivado HLS synthesis of the C designs generated by ORCC. These elements are related to the way the C specification generated by ORCC is structured, which Vivado HLS cannot support at certain points of the synthesis. A "manual" transformation of the ORCC-generated designs was therefore proposed, one that affects the original models as little as possible, so that the synthesis can be carried out with Vivado HLS and the correct VHDL file created. This document is accordingly structured along the lines of a research project. First, the motivations and objectives that support this work, and that it hopes to achieve, are set out. Next, an analysis of the state of the art of the elements needed for its development is presented, providing the basic concepts for a correct understanding and study of the document. The RVC-CAL and VHDL languages are described, together with an introduction to the ORCC and Vivado tools, analyzing the strengths and main features of both. Once the behavior of both tools is known, the solutions developed in our study of the synthesis of RVC-CAL models are described, setting out the conflicting points mentioned above that Vivado HLS cannot support when synthesizing the C designs generated by the ORCC compiler. The solutions proposed for these errors encountered during synthesis are then presented; with them, the aim is to reach a C specification better suited to correct synthesis in Vivado HLS and thus obtain the appropriate VHDL models. Finally, as the end result of this work, a set of conclusions is drawn from all the analyses and developments carried out, and a series of future lines of work is proposed with which the study could be continued and the research developed in this document completed.
Abstract:
Protein folding is a grand challenge of the postgenomic era. In this paper, 58 folding events sampled during 47 molecular dynamics trajectories for a total simulation time of more than 4 μs provide an atomic detail picture of the folding of a 20-residue synthetic peptide with a stable three-stranded antiparallel β-sheet fold. The simulations successfully reproduce the NMR solution conformation, irrespective of the starting structure. The sampling of the conformational space is sufficient to determine the free energy surface and localize the minima and transition states. The statistically predominant folding pathway involves the formation of contacts between strands 2 and 3, starting with the side chains close to the turn, followed by association of the N-terminal strand onto the preformed 2–3 β-hairpin. The folding mechanism presented here, formation of a β-hairpin followed by consolidation, is in agreement with a computational study of the free energy surface of another synthetic three-stranded antiparallel β-sheet by Bursulaya and Brooks [(1999) J. Am. Chem. Soc. 121, 9947–9951]. Hence, it might hold in general for antiparallel β-sheets with short turns.
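As a concrete illustration of how a free energy surface is obtained from such sampling, the sketch below histograms two order parameters and applies F = -kT ln P; the choice of order parameters and the bin count are assumptions for illustration, not details taken from the paper:

```python
# Hedged sketch of building a 2-D free-energy surface from simulation samples.
import numpy as np

KT = 0.593  # kcal/mol at ~298 K

def free_energy_surface(x, y, bins=50):
    """Histogram two order parameters (e.g. fractions of native contacts in
    the 1-2 and 2-3 hairpins, both assumed here) and convert populations P
    to free energies F = -kT ln P, shifted so the global minimum is zero."""
    hist, xedges, yedges = np.histogram2d(x, y, bins=bins, density=True)
    with np.errstate(divide="ignore"):
        fes = -KT * np.log(hist)          # empty bins become +inf
    fes -= fes[np.isfinite(fes)].min()    # shift minimum to zero
    return fes, xedges, yedges
```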
Abstract:
Tuberculosis (TB) is a contagious infectious disease caused by mycobacteria of the Mycobacterium complex, mainly M. tuberculosis. Practically extinct in developed countries, formerly called First World countries, tuberculosis has returned to worldwide attention owing to its growing incidence and mortality rates. According to the World Health Organization, TB today stands as the leading cause of death from infectious disease worldwide, with an incidence of 8.6 million new cases per year and about 1.5 million deaths. The main challenge in treating tuberculosis is the multidrug resistance of M. tuberculosis to the available drugs. The search for new antituberculosis drugs and the study of new targets are therefore necessary to overcome this situation. Given the need to explore new targets, and in view of the indication of maltosyltransferase (GlgE) as a potentially promising new target against M. tuberculosis, this project aimed to enable the synthesis of glucose analogues (analogous to GlgE's natural substrate, maltose 1-phosphate) through synthetic routes that make use of microwave irradiation. These synthetic routes follow the principles of click chemistry, that is, modular chemical reactions whose reaction conditions are simple and whose products are easily purified. This work also compared the conventional method of triazole synthesis with the microwave-assisted one with respect to reaction times, reaction conditions, and yields, using derivatives synthesized in the Laboratório de Planejamento e Síntese de Quimioterápicos Potencialmente Ativos em Doenças Negligenciadas (LAPEN). However, the final step of the synthetic route, the glycosylation, was not successful. In the other synthetic methods, the microwave proved to be a valuable tool for obtaining the triazole compounds.