978 results for Optimizing Compilation
Abstract:
An unusual feature of the mammalian genome is the number of genes exhibiting monoallelic expression. Recently, random monoallelic expression of autosomal genes has been reported for olfactory and Ly-49 NK receptor genes, as well as for Il-2, Il-4 and Pax5. RNA fluorescence in situ hybridization (FISH) has been exploited to monitor allelic expression by visualizing the number of sites of transcription in individual nuclei. However, the sensitivity of this technique is difficult to determine for a given gene. We show that by combining DNA and RNA FISH it is possible to control for the hybridization efficiency and the accessibility and visibility of fluorescent probes within the nucleus.
Abstract:
Linker length and composition were varied in libraries of single-chain Arc repressor, resulting in proteins with effective concentrations ranging over six orders of magnitude (10 μM–10 M). Linkers of 11 residues or more were required for biological activity. Equilibrium stability varied substantially with linker length, reaching a maximum for glycine-rich linkers containing 19 residues. The effects of linker length on equilibrium stability arise from significant and sometimes opposing changes in folding and unfolding kinetics. By fixing the linker length at 19 residues and varying the ratio of Ala/Gly or Ser/Gly in a randomized 16-residue region, the effects of linker flexibility were examined. In these libraries, composition rather than sequence appears to determine stability. Maximum stability in the Ala/Gly library was observed for a protein containing 11 alanines and five glycines in the randomized region of the linker. In the Ser/Gly library, the most stable protein had seven serines and nine glycines in this region. Analysis of folding and unfolding rates suggests that alanine acts largely by accelerating folding, whereas serine acts predominantly to slow unfolding. These results demonstrate an important role for linker design in determining the stability and folding kinetics of single-chain proteins and suggest strategies for optimizing these parameters.
Abstract:
Materials with high electrical conductivity and optical transparency are needed for future flat panel display, solar energy, and other optoelectronic technologies. InxCd1-xO films having a simple cubic microstructure have been grown on amorphous glass substrates by a straightforward chemical vapor deposition process. The x = 0.05 film conductivity of 17,000 S/cm, carrier mobility of 70 cm²/(V·s), and visible-region optical transparency window considerably exceed the corresponding parameters for commercial indium-tin oxide. Ab initio electronic structure calculations reveal small conduction electron effective masses, a dramatic shift of the CdO band gap with doping, and a conduction band hybridization gap caused by extensive Cd 5s + In 5s mixing.
Abstract:
Due to the growing volume of data being processed and the increasing demand for high-performance computing, significant changes are taking place in computer architecture design. As a result, the field has migrated from the sequential to the parallel paradigm, with hundreds or thousands of processing cores on a single chip. In this context, power management becomes increasingly important, especially in embedded systems, which are usually battery-powered. According to Moore's Law, processor performance doubles every 18 months, yet battery capacity doubles only every 10 years. This situation creates an enormous gap, which can be mitigated through the use of heterogeneous multi-core architectures. A fundamental open challenge for these architectures is integrating embedded code development, scheduling, and hardware-level power management. The general goal of this doctoral work is to investigate techniques for optimizing the performance/energy-consumption trade-off in single-ISA heterogeneous multi-core architectures implemented on FPGA. To this end, we sought solutions that achieve the best possible performance at an optimal energy consumption. This was accomplished by combining data mining for the analysis of thread-based software with traditional power-management techniques, such as dynamic way-shutdown, and a new heterogeneity-aware scheduling policy. The main contributions include the combination of power-management techniques at several levels (hardware, scheduling, and compilation), and a scheduling policy integrated with a multi-core architecture that is heterogeneous with respect to L1 cache size.
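The abstract does not detail how the heterogeneity-aware scheduling policy maps threads to cores. A minimal sketch of one plausible approach, assuming hypothetical per-thread profiles of cache sensitivity (CPI as a function of L1 size) and core descriptors that are not part of the original work, could look like this:

```python
from dataclasses import dataclass

@dataclass
class Core:
    name: str
    l1_kb: int        # L1 cache size, the heterogeneity axis in this thesis
    power_w: float    # active power estimate (hypothetical figure)

@dataclass
class Thread:
    name: str
    cpi_by_l1: dict   # assumed profile: cycles-per-instruction per L1 size

def pick_core(thread: Thread, cores: list, freq_hz: float = 1e8) -> Core:
    """Choose the core minimizing the energy-delay product (EDP) for a thread.

    time = CPI / freq (normalized to one instruction); energy = power * time,
    so EDP = power * time**2. EDP is one common performance/energy metric,
    not necessarily the one used in the thesis.
    """
    def edp(core: Core) -> float:
        time = thread.cpi_by_l1[core.l1_kb] / freq_hz
        return core.power_w * time * time
    return min(cores, key=edp)

cores = [Core("big", 32, 1.2), Core("little", 8, 0.4)]
# cache-sensitive thread: CPI drops sharply with a larger L1
sensitive = Thread("mcf-like", {32: 1.1, 8: 3.0})
# cache-insensitive thread: similar CPI either way
insensitive = Thread("crc-like", {32: 1.0, 8: 1.1})

print(pick_core(sensitive, cores).name)    # prints "big"
print(pick_core(insensitive, cores).name)  # prints "little"
```

The point of the sketch is the decision rule: cache-sensitive threads justify the big-L1 core's power, while insensitive threads run more efficiently on the small core.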
Abstract:
Extensive experimental and computational studies have been carried out on the enantioselective titanium(IV)-catalyzed cyanobenzoylation of aldehydes using 1:n Binolam:Ti(OiPr)4 mixtures as precatalysts, with the purpose of identifying the key mechanistic aspects governing enantioselectivity. HCN and isopropyl benzoate were detected in the reacting mixtures. This, as well as the reaction’s response to the presence of an exogenous base, and the failure to react in the presence of Binol:Ti(OiPr)4 mixtures, led us to propose not a direct cyanobenzoylation but an indirect process involving enantioselective hydrocyanation followed by O-benzoylation. Computational work provided positive evidence for the intervention of both indirect and direct cyanobenzoylation routes, the former being faster. However, the standard Curtin–Hammett-based optimization search ended with unsatisfactory results. Experimental and computational DFT studies (B3LYP/6-31G*) led us to conclude that: (1) the overall cyanobenzoylation of aldehydes catalyzed by 1:n Binolam:Ti(OiPr)4 mixtures involves an enantioselective hydrocyanation followed by a stereochemically inert O-benzoylation; (2) the initial complexes prevailing in a 1:1 Binolam:Ti(OiPr)4 mixture are the solvated mononuclear monomer 5·2(iPrOH) and solvated dinuclear dimer 9·2(iPrOH), whereas 9·2(iPrOH) is the major component in a 1:2 or higher 1:n mixture; (3) since the slowest step is that of benzoylation of ligated iPrOH, which yields the actual catalysts 5–9, the catalytic system fits into a non-Curtin–Hammett framework, the final products deriving from a kinetic quench of the competing routes; and (4) accordingly, catalysis by 1:1 Binolam:Ti(OiPr)4 mixtures should involve cyanobenzoylations promoted by mononuclear 5, contaminated with those promoted by some dinuclear open dimer 9, whereas cyanobenzoylations catalyzed by 1:2 and higher 1:n mixtures should be the result of catalysis promoted by the large amounts of dinuclear open dimer 9.
Abstract:
Array measurements have become a valuable tool for non-invasive site response characterization. The array design, i.e. size, geometry and number of stations, has a great influence on the quality of the obtained results. Among these parameters, the number of available stations is usually the main limitation in field experiments, because of the economic and logistical constraints it involves. Sometimes one or more stations of the initially planned array layout, carefully designed before the fieldwork campaign, do not work properly, modifying the prearranged geometry. At other times, it is not possible to set up the desired array layout because of a lack of stations. Therefore, for a planned array layout, the number of operative stations and their arrangement in the array become crucial in the acquisition stage and, subsequently, in the dispersion curve estimation. In this paper we carry out an experimental study to determine the minimum number of stations that still provides reliable dispersion curves for three prearranged array configurations (triangular, circular with central station, and polygonal geometries). For the optimization study, we jointly analyze the theoretical array responses and the experimental dispersion curves obtained through the f-k method. For the f-k method, we compare the dispersion curves obtained for the original or prearranged arrays with those obtained for the modified arrays, i.e. the dispersion curves obtained when a certain number of stations n is removed, each time, from the original layout of X geophones. The comparison is evaluated by means of a misfit function, which helps us determine how robust the studied geometries are to station removal, and which station or combination of stations most degrades the array capability when unavailable.
All this information may prove crucial for improving future array designs, indicating when the number of deployed stations can be reduced without losing the reliability of the obtained results.
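The station-removal procedure described above is essentially a subset search: remove every combination of n stations from the full layout, re-estimate the dispersion curve, and score it against the full-array reference. A minimal sketch of that loop, using an RMS relative difference as one plausible misfit form (the paper's exact misfit function is not given in the abstract) and a toy stand-in for the f-k dispersion-curve estimator:

```python
import itertools
import math

def misfit(curve_ref, curve_mod):
    """RMS relative difference between two dispersion curves sampled at the
    same frequencies (one plausible misfit form, assumed here)."""
    return math.sqrt(sum(((m - r) / r) ** 2 for r, m in zip(curve_ref, curve_mod))
                     / len(curve_ref))

def rank_removals(stations, n_removed, estimate_curve, curve_ref):
    """Score every way of removing n_removed stations from the full layout.

    estimate_curve(subset) stands in for the f-k dispersion-curve estimator,
    which in practice runs on the recorded wavefield of that station subset.
    """
    scores = []
    for removed in itertools.combinations(stations, n_removed):
        kept = [s for s in stations if s not in removed]
        scores.append((misfit(curve_ref, estimate_curve(kept)), removed))
    return sorted(scores)

# toy example: 6 stations, a fake estimator in which losing "S1" hurts most
stations = ["S1", "S2", "S3", "S4", "S5", "S6"]
freqs = range(1, 6)
def estimate_curve(kept):
    bias = 0.10 * ("S1" not in kept) + 0.02 * (len(stations) - len(kept))
    return [200.0 / f * (1.0 + bias) for f in freqs]  # phase velocity vs freq

curve_ref = estimate_curve(stations)
ranking = rank_removals(stations, 2, estimate_curve, curve_ref)
best_misfit, best_removed = ranking[0]     # least damaging pair to lose
worst_misfit, worst_removed = ranking[-1]  # most damaging pair to lose
```

With a real estimator plugged in, the ranking directly answers the paper's question: which stations the layout can afford to lose, and which combinations break the dispersion-curve estimate.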
Abstract:
A key target for reducing current hydrocarbon emissions from vehicular exhaust is improving their abatement under cold-start conditions. Herein, we demonstrate the potential of factorial analysis to design a highly efficient catalytic trap. The impact of the synthesis conditions on the preparation of copper-loaded ZSM-5 is clearly revealed by XRD, N2 sorption, FTIR, NH3-TPD, SEM and TEM. A high concentration of copper nitrate precursor in the synthesis improves the removal of hydrocarbons, providing both strong adsorption sites for hydrocarbon retention at low temperature and copper oxide nanoparticles for full hydrocarbon catalytic combustion at high temperature. The use of copper acetate precursor leads to a more homogeneous dispersion of copper oxide nanoparticles, also providing enough catalytic sites for the total oxidation of hydrocarbons released from the adsorption sites, although lower copper loadings are achieved. Thus, synthesis conditions leading to high copper loadings together with highly dispersed copper oxide nanoparticles would result in an exceptional catalytic trap able to reach superior hydrocarbon abatement under highly demanding operational conditions.
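The factorial analysis invoked above screens synthesis factors by their main effects. A generic sketch of how main effects are estimated in a two-level full factorial design, with a toy response function standing in for the measured hydrocarbon conversion (the factor names and coefficients are illustrative, not from the paper):

```python
import itertools

def main_effects(factors, response):
    """Estimate main effects in a two-level full factorial design.

    factors: list of factor names; response(run) returns the outcome for a
    dict of {factor: +1 or -1}. The main effect of a factor is the average
    response at its high level minus the average at its low level.
    """
    runs = [dict(zip(factors, combo))
            for combo in itertools.product([-1, 1], repeat=len(factors))]
    effects = {}
    for f in factors:
        hi = [response(r) for r in runs if r[f] == +1]
        lo = [response(r) for r in runs if r[f] == -1]
        effects[f] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects

# toy response: conversion dominated by the precursor concentration factor
factors = ["Cu_conc", "acetate_precursor"]
def conversion(r):
    return 70 + 12 * r["Cu_conc"] + 4 * r["acetate_precursor"] \
              + 1 * r["Cu_conc"] * r["acetate_precursor"]

fx = main_effects(factors, conversion)
# Cu_conc main effect is 24; acetate_precursor main effect is 8, so the
# copper concentration would be flagged as the dominant synthesis factor
```

Interaction terms average out of the main effects by construction of the balanced design, which is why the full factorial layout is favored for this kind of screening.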