877 results for TOP-DOWN


Relevance: 60.00%

Abstract:

SILVA, Alexandre Reche e. Rudimentos de uma inspeção topográfica aplicados à Passacaglia para orquestra, opus 1, de Anton Webern. Ictus - Periódico do PPGMUS/UFBA, Salvador, v. 7, p.189-208, 2010

Relevance: 60.00%

Abstract:

Background. Tremendous advances in biomaterials science and nanotechnologies, together with thorough research on stem cells, have recently driven an intriguing development of regenerative medicine/tissue engineering. Nanotechnology is a broad interdisciplinary field involving the manipulation of different materials at the nanometre level to create constructs that mimic the nanoscale architecture of native tissues. Aim. The purpose of this article is to highlight significant new knowledge on this subject. Emerging acquisitions. To widen the range of scaffold materials, recourse has been made either to materials generated by recombinant DNA technology, such as a collagen-like protein, or to the incorporation of bioactive molecules, such as RGD (arginine-glycine-aspartic acid), into synthetic products. Both bottom-up and top-down fabrication approaches may be used, respectively, to obtain supramolecular architectures or to incorporate micro-/nanostructures within a pre-existing complex scaffold construct. Computer-aided design/manufacturing (CAD/CAM) scaffold techniques make it possible to produce patient-tailored organs. Stem cells, because of their peculiar properties - the ability to proliferate, self-renew, and differentiate into specific cell lineages under appropriate conditions - represent an attractive source for tissue engineering/regenerative medicine applications. Future research activities. New developments in the tissue engineering of different organs will depend on further progress in both the science of nanoscale-based materials and the knowledge of stem cell biology. Moreover, in vivo tissue engineering appears to be the logical next step of current research.

Relevance: 60.00%

Abstract:

The commercial fisheries of Lake Victoria are presently dominated by three species: the stocked Lates niloticus and Oreochromis niloticus, and the endemic cyprinid Rastrineobola argentea. The three comprise at least 90% of the commercial catch, while the remaining endemic species mostly occur as by-catch (incidental catch), except in localised areas. Apart from being a major source of food, the three species, especially the Nile perch, represent the usually recognized main forms of predation. As they exert a "top-down" effect on production, they are important in the trophic dynamics of the Lake Victoria ecosystem. However, another form of predation that usually goes unrecognized in the lake's productivity mechanisms is that due to fishing mortality. Fishermen essentially behave as predatory elements in the ecosystem. This is manifested in ways that parallel the effects of fish as predators: for example, some fishermen are habitat restricted and specialised in catching particular species or sizes, while others are opportunistic and switch to whatever species (prey) are available, which may depend on season, etc. There are also indirect factors that influence fishing mortality as a form of predation, e.g. the availability of different gears on the market, thefts of nets and of fish from nets, civil strife, market demand, etc. The latter are essentially socioeconomic factors. Application of the principles of fisheries management requires a data base from which effective options can be generated. One of the fundamental requirements for such a data base is information on the spatial distribution of the species fishery. This can be combined with information on landings, which can eventually be incorporated into a programme of stock monitoring. The aim of this paper is to highlight information on the Tilapia fishery that may benefit fisheries management.

Relevance: 60.00%

Abstract:

Ecological network analysis was applied to the Seine estuary ecosystem, northern France, integrating ecological data from the years 1996 to 2002. The Ecopath with Ecosim (EwE) approach was used to model the trophic flows in 6 spatial compartments, leading to 6 distinct EwE models: the navigation channel and the two channel flanks in the estuary proper, and 3 marine habitats in the eastern Seine Bay. Each model included 12 consumer groups, 2 primary producers, and one detritus group. Ecological network analysis was performed, including a set of indices, keystoneness, and trophic spectrum analysis, to describe the contribution of the 6 habitats to the functioning of the Seine estuary ecosystem. Results showed that the two habitats whose functioning was most indicative of a stressed state were the northern and central navigation channels, where building works and constant maritime traffic are considered major anthropogenic stressors. The strong top-down control highlighted in the other 4 habitats was not present in the central channel, which showed instead (i) a change in keystone roles in the ecosystem towards sediment-based, lower trophic levels, and (ii) a higher system omnivory. The southern channel evidenced the highest system activity (total system throughput), the highest trophic specialisation (low system omnivory), and the least indication of stress (low cycling and relative redundancy). Marine habitats showed higher fish biomass proportions and higher transfer efficiencies per trophic level than the estuarine habitats, with a transition area between the two that presented intermediate ecosystem structure. Modelling the habitats separately made it possible to disclose each one's response to the different pressures, based on a priori knowledge of them. Network indices responded to these differences, although not monotonically, and seem a promising operational tool for defining the ecological status of transitional water ecosystems.
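As a pointer for readers unfamiliar with these indices, the sketch below computes total system throughput, the measure of overall system activity cited for the southern channel, from a small hypothetical flow matrix in Python; the group names and flow values are invented for the example and are not taken from the Seine estuary EwE models.

```python
import numpy as np

# Hypothetical trophic flow matrix (t km^-2 yr^-1): flows[i, j] is the flow
# from group i to group j. Groups and values are illustrative only.
groups = ["phytoplankton", "zooplankton", "fish", "detritus"]
flows = np.array([
    [0.0, 120.0,  0.0, 60.0],   # phytoplankton ->
    [0.0,   0.0, 40.0, 30.0],   # zooplankton   ->
    [0.0,   0.0,  0.0, 10.0],   # fish          ->
    [0.0,  25.0,  5.0,  0.0],   # detritus      ->
])
imports = np.array([200.0, 0.0, 0.0, 15.0])     # e.g. primary production inputs
exports = np.array([10.0, 5.0, 12.0, 20.0])     # e.g. catches, outflows
respiration = np.array([50.0, 40.0, 18.0, 0.0])

# Total system throughput (TST): the sum of all flows in the system, used
# above as the measure of overall system activity.
tst = flows.sum() + imports.sum() + exports.sum() + respiration.sum()
# Share of activity carried by internal (inter-group) flows.
internal_share = flows.sum() / tst
print(f"TST = {tst:.1f} t km^-2 yr^-1, internal flows = {internal_share:.1%}")
```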

Relevance: 60.00%

Abstract:

The role played by bacteria is recognized as fundamental to the metabolism of any aquatic system, not only through the mineralization of organic matter but also through the transfer of matter and energy to higher trophic levels (the "microbial loop"). For this study, four lakes with different trophic states were chosen on the Carreiros Campus of the Universidade Federal do Rio Grande (FURG), RS. Lago Biguás and Lago da Base have the characteristics of eutrophic to hypereutrophic environments, whereas Lago Polegar is characterized as oligo-mesotrophic and Lago Negro is considered dystrophic. In a previous study of nine shallow lakes in this same region, including the four analyzed here, Souza (2007) suggested that free-living bacteria act as mineralizers and that their growth is limited by phosphate availability (bottom-up control), whereas attached bacteria take part in the decomposition of organic aggregates. It was also suggested that attached bacteria are controlled mainly by predation by flagellates and ciliates (top-down control), probably because of their larger biovolume. However, this information was derived from statistical relationships in data collected in a single sampling. Thus, in this study the bacterial community (abundance and biomass) and other physical, chemical, and biological parameters of the four subtropical shallow lakes were studied in fortnightly samplings over one year, between June 2008 and May 2009. Our results indicate that the availability of dissolved organic carbon produced by phytoplankton appears to be one of the main factors controlling bacterial dynamics in these lakes. However, predation seems to have played a larger role in controlling bacteria in Lago Negro, since bacterial abundance in this lake did not increase in proportion to the increase in chlorophyll a. The presence of larger numbers of nano- and microflagellates in this lake supports this hypothesis. To test this hypothesis, an experiment was carried out combining the dilution technique with FISH (fluorescence in situ hybridization) to determine production and consumption rates not only for the different morphotypes but also for the different phylogenetic groups (Archaea, Eubacteria, Alpha-, Beta-, and Gammaproteobacteria, and Cytophaga-Flavobacter) in a water sample from Lago Negro. The results of this experiment indicated that bacteria are indeed being consumed by protozoans at the same rate at which they are being produced. Moreover, in Lago Negro predation appears to be linked to cell size/biovolume, with the smaller morphotypes being more resistant to predation and therefore more abundant.
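For readers unfamiliar with the dilution technique mentioned above, the sketch below illustrates the standard calculation behind such experiments: the apparent growth rate in each bottle, k = ln(N_t/N_0)/t, is regressed against the dilution fraction to separate intrinsic growth (intercept) from grazing mortality (slope). The numbers are invented and are not data from the Lago Negro experiment.

```python
import numpy as np

# Invented dilution-experiment data: the fraction of unfiltered lake water in
# each bottle and the apparent bacterial growth rate over the incubation,
# k = ln(N_t / N_0) / t (d^-1).
dilution_fraction = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
apparent_growth = np.array([0.50, 0.37, 0.28, 0.15, 0.06])

# In the classical dilution method k = mu - g * D, so regressing k on the
# dilution fraction D gives the intrinsic growth rate mu (intercept) and the
# grazing mortality rate g (minus the slope).
slope, intercept = np.polyfit(dilution_fraction, apparent_growth, 1)
mu, g = intercept, -slope
print(f"intrinsic growth mu = {mu:.2f} d^-1, grazing mortality g = {g:.2f} d^-1")
```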

Relevance: 60.00%

Abstract:

This thesis comprises two separate studies aimed at further understanding the role of incomplete combustion products in atmospheric chemistry. The first explores the sensitivity of black carbon (BC) forcing to aerosol vertical location, since BC has an increased forcing per unit mass when it is located above reflective clouds. We used a column radiative transfer model to produce globally averaged values of normalized direct radiative forcing (NDRF) for BC over and under different types of clouds. We developed a simple column-weighting scheme based on the mass fractions of BC that are over and under clouds in measured vertical profiles. The resulting NDRF is in good agreement with global 3-D model estimates, supporting the column-weighted model as a tool for exploring uncertainties due to diversity in vertical distribution. BC above low clouds accounts for about 20% of the global burden but 50% of the forcing. We estimate the maximum-minimum spread in NDRF due to modeled profiles as about 40% and the uncertainty as about 25%. Models overestimate BC in the upper troposphere compared with measurements; modeled NDRF might need to be reduced by about 15%. Redistributing BC within the lowest 4 km of the atmosphere affects modeled NDRF by only about 5% and cannot account for very high forcing estimates. The second study estimated global year 2000 carbon monoxide (CO) emissions using a traditional bottom-up inventory. We applied literature-derived emission factors to a variety of fuel and technology combinations. Combining these with regional fuel-use and production data, we produced CO emission estimates separable by sector, fuel type, technology, and region. We estimated year 2000 stationary-source emissions of 685.9 Tg/yr, and 885 Tg/yr when mobile sources adopted from EDGAR v3.2FT2000 were included. Open/biomass burning contributed most significantly to the global CO burden, while the residential sector, primarily in Asia and Africa, was the largest contributor among contained combustion sources. Industrial production in Asia, including brick, cement, and iron and steel making, also contributed significantly to CO emissions. Our estimates of biofuel emissions are lower than most previously published bottom-up estimates, while our estimates for other fuels are generally in good agreement. Our values are also universally lower than recently estimated CO emissions from models using top-down methods.
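The column-weighting idea described for the first study can be illustrated with a short, hedged sketch: per-column NDRF values for BC above cloud, below cloud, and under clear sky are combined using BC mass fractions from a vertical profile and the cloud fraction. The function and all numbers below are illustrative assumptions, not the exact scheme or values used in the thesis.

```python
# Column-weighted NDRF sketch. NDRF values (W per g of BC) and fractions are
# illustrative placeholders.

def column_weighted_ndrf(frac_above, frac_below, ndrf_above, ndrf_below,
                         ndrf_clear, cloud_fraction):
    """Weight per-column NDRF values by BC mass fractions and cloud cover."""
    cloudy_column = frac_above * ndrf_above + frac_below * ndrf_below
    return cloud_fraction * cloudy_column + (1.0 - cloud_fraction) * ndrf_clear

# Example: the 20% of BC mass sitting above low cloud contributes
# disproportionately because its per-mass forcing is much larger than that of
# below-cloud BC.
ndrf = column_weighted_ndrf(frac_above=0.2, frac_below=0.8,
                            ndrf_above=3000.0, ndrf_below=600.0,
                            ndrf_clear=1200.0, cloud_fraction=0.4)
print(f"globally averaged NDRF ~ {ndrf:.0f} W/g (illustrative)")
```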

Relevance: 60.00%

Abstract:

Top-down (grazing) and bottom-up (nutrient, light) controls are important in the regulation of freshwater ecosystems. The relative importance of these factors can change in space and time, but in tropical lakes bottom-up regulation has been reported as the more influential. The present study tested the hypothesis that the phytoplankton growth rate in the Armando Ribeiro reservoir, a large eutrophic reservoir in the semi-arid region of Rio Grande do Norte state, is limited more by nutrient availability than by zooplankton grazing pressure. Bioassays were conducted monthly from September 2008 to August 2009, manipulating two levels of nutrients (with/without addition) and two levels of grazers (with/without removal). The experimental design was a 2×2 factorial with four treatments (×5 replicates): (i) control with water and zooplankton from the sampling site (C), (ii) nutrient addition (+NP), (iii) zooplankton removal (-Z), and (iv) zooplankton removal plus nutrient addition (-Z+NP). Transparent 500 ml plastic bottles were incubated for 4 or 5 days at two depths: the Secchi depth (high light) and three times the Secchi depth (low light). Water samples were collected from each bottle at the beginning and end of the incubation period for analysis of chlorophyll a concentration and the density of zooplanktonic organisms, and phytoplankton growth rates were calculated. A two-way ANOVA was performed to test whether nutrient addition and grazer removal had significant effects (p < 0.005) on phytoplankton growth rates, and whether there was a significant interaction between the factors. Effect magnitudes were calculated to assess the relative importance of each process. The results show that phytoplankton growth was generally stimulated by nutrient addition, whereas zooplankton removal rarely stimulated phytoplankton growth. Some significant interactions between nutrient addition and grazer removal on phytoplankton growth did occur. In conclusion, this study suggests that in the studied reservoir phytoplankton growth is controlled more by bottom-up (ascending) than by top-down (descending) factors.
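A minimal sketch of the growth-rate calculation and the two-way ANOVA described above is given below, assuming invented chlorophyll-a data for the 2x2 nutrient x grazer design; it relies on pandas and statsmodels and is not the original analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Invented chlorophyll-a data (ug/L) for a 2x2 factorial design (nutrient
# addition x grazer removal), 5 replicate bottles per treatment, 4-day
# incubation. Not the reservoir data.
rng = np.random.default_rng(42)
days = 4
rows = []
for nutrients in ("ambient", "+NP"):
    for grazers in ("present", "removed"):
        true_rate = 0.10                      # baseline growth rate (d^-1)
        if nutrients == "+NP":
            true_rate += 0.25                 # strong bottom-up response
        if grazers == "removed":
            true_rate += 0.05                 # weak top-down response
        for _ in range(5):
            chl0 = rng.uniform(8.0, 12.0)
            chl_t = chl0 * np.exp((true_rate + rng.normal(0, 0.03)) * days)
            # Phytoplankton growth rate: r = ln(Chl_t / Chl_0) / t
            rows.append({"nutrients": nutrients, "grazers": grazers,
                         "r": np.log(chl_t / chl0) / days})

df = pd.DataFrame(rows)
# Two-way ANOVA: main effects of nutrients and grazers plus their interaction.
model = ols("r ~ C(nutrients) * C(grazers)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```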

Relevance: 60.00%

Abstract:

Information findability is an element situated between the functionalities of an analog, digital, or hybrid informational environment and the characteristics of informational subjects. It derives from info-communicational mediation, since it is related to the processes that make up the info-communicational flow, from the production to the appropriation of information. Considering that information professionals, IT professionals, and the users of informational environments themselves are mediators, the influence of their mediating actions on information findability becomes apparent. With the aim of understanding how the info-communicational mediation practiced by these mediators can influence information findability in informational environments, bibliographic, descriptive, and documentary research was carried out with a qualitative approach, enabling a discussion of the concepts studied and making it possible to analyze the self-archiving process in the Institutional Repository of the Universidade Federal do Rio Grande do Norte. Using the observation technique, the actions performed by the different mediators in this environment were mapped, based on the top-down and bottom-up dimensions of the Information Findability Model (MEI). From this mapping, the info-communicational and technological actions carried out by the different mediators in the repository were identified, confirming the hypothesis that their actions significantly affect information findability.

Relevance: 60.00%

Abstract:

The twenty-first-century Portuguese school has been the target of successive reforms driven by economic, political, and social impositions, without the corresponding bottom-up innovation. The resulting change has taken place mainly at the formal level (top-down innovation), still largely disregarding people and their capital (intellectual, emotional, social, and psychological), at a time when globalization is driving the transition from the functional paradigm to the competence paradigm. Since individuals are an integral part of organizational solutions and successes, and since leaders are increasingly concerned with the effectiveness, efficiency, and quality of the service provided by schools of public interest, we focus the analysis on human resource management, more specifically on the duality between instability in the teaching career and the stability of the class director (diretor de turma). The main interest of the topic lies in the potential of this middle-management position to help fulfil the school's mission, given the role its holder assumes in the web of relationships imposed by the position's duties, a role that can be strengthened when the holder has certain characteristics. This essentially quantitative case study presents a critical analysis of the impact of continuity in the position throughout the second cycle of basic education on students' educational success, based on the results of measures adopted in the 2010/2011 school year. To this end, we analyzed achievement, the work of the class council, and school-family communication in six sixth-grade classes, three of which had the same class director, at a public school in the municipality of Câmara de Lobos. We conclude that keeping the same class director in the classes studied had repercussions mainly on the relational side; no connection was confirmed between continuity in the position throughout the cycle and students' academic results, despite the contrary opinion of 87.9% of the teachers surveyed and the fact that most students and their guardians were in favour of maintaining it.

Relevance: 60.00%

Abstract:

Special issue: Translational Nanomedicine

Relevance: 60.00%

Abstract:

This work presents the modeling and FPGA implementation of digital compensation systems for TIADC mismatches. The development of the whole work follows a top-down methodology. Following this methodology, a behavioral model of a two-channel TIADC and its offset, gain, and clock-skew mismatches was developed in Simulink, together with behavioral models of the digital mismatch compensation systems. For clock-skew mismatch compensation, fractional delay filters were used, more specifically the efficient Farrow structure. Choosing which filter design methodology and which Farrow structure to use required a study of the various design methods presented in the literature. The digital compensation system models were converted to VHDL for FPGA implementation and validation, which was carried out using the FPGA-in-the-loop test methodology. The results obtained with the TIADC mismatch compensators show the high performance gain provided by these structures. Beyond this result, the work illustrates the potential of the design, implementation, and FPGA test methodologies employed.
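For context on the Farrow structure mentioned above, the sketch below implements a fractional-delay filter in Farrow form using a generic 3rd-order Lagrange design in Python; it is a textbook-style illustration, not the specific filter designed in the thesis or its VHDL implementation.

```python
import numpy as np

ORDER = 3            # cubic Lagrange -> 4 taps, 4 Farrow branches
TAPS = ORDER + 1

def lagrange_taps(mu, order=ORDER):
    """FIR taps of a Lagrange fractional-delay filter with total delay D = 1 + mu."""
    D = 1.0 + mu
    h = np.ones(order + 1)
    for i in range(order + 1):
        for j in range(order + 1):
            if j != i:
                h[i] *= (D - j) / (i - j)
    return h

# Each tap h[k](mu) is a polynomial in mu, so the filter can be realised as a
# bank of fixed FIR subfilters (the Farrow structure) combined by powers of mu.
# Fit those polynomials once to obtain C, where C[m, k] multiplies mu**m on tap k.
mus = np.linspace(0.0, 1.0, TAPS)
H = np.array([lagrange_taps(m) for m in mus])            # rows: sampled mu values
C = np.array([np.polyfit(mus, H[:, k], ORDER)[::-1]      # ascending powers of mu
              for k in range(TAPS)]).T                   # shape (ORDER + 1, TAPS)

def farrow_delay(x, mu):
    """Delay signal x by 1 + mu samples (0 <= mu < 1) using the Farrow structure."""
    # One fixed FIR branch per power of mu.
    v = [np.convolve(x, C[m])[:len(x)] for m in range(ORDER + 1)]
    # Horner's rule combines the branch outputs with the single parameter mu.
    y = v[-1]
    for m in range(ORDER - 1, -1, -1):
        y = y * mu + v[m]
    return y

# Quick sanity check on a low-frequency sine: the output should match the
# analytically delayed signal away from the edges.
n = np.arange(64)
x = np.sin(2 * np.pi * 0.05 * n)
y = farrow_delay(x, mu=0.3)
print(np.allclose(y[10:60], np.sin(2 * np.pi * 0.05 * (n[10:60] - 1.3)), atol=1e-3))
```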

Relevance: 60.00%

Abstract:

There are classical methods of relational database design: decomposition and synthesis. The goal of these classical approaches is to reach the highest level of normalization. The decomposition design method is top-down: it starts from an existing relation, determines its normal form, and decomposes it via projections until the relational schema reaches the desired degree of normalization.
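A minimal sketch of the decomposition method just described, under simplifying assumptions: only the explicitly listed functional dependencies are checked for BCNF violations (a full implementation would also consider implied dependencies), and the example schema is the classic student-course-instructor case rather than anything from the text.

```python
def closure(attrs, fds):
    """Attribute closure of `attrs` under the functional dependencies `fds`."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def bcnf_decompose(relation, fds):
    """Top-down decomposition: split `relation` by projection until BCNF holds."""
    for lhs, rhs in fds:
        if not lhs <= relation:
            continue
        if not (rhs & relation) - lhs:
            continue                              # dependency is trivial here
        if relation <= closure(lhs, fds):
            continue                              # lhs is a superkey: no violation
        r1 = closure(lhs, fds) & relation         # project out the violating dependency
        r2 = (relation - r1) | lhs                # keep the rest together with lhs
        return bcnf_decompose(r1, fds) + bcnf_decompose(r2, fds)
    return [relation]

# R(student, course, instructor) with instructor -> course and
# {student, course} -> instructor: in 3NF but not in BCNF.
R = {"student", "course", "instructor"}
FDS = [({"instructor"}, {"course"}),
       ({"student", "course"}, {"instructor"})]
print(bcnf_decompose(R, FDS))
```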

Relevance: 60.00%

Abstract:

The work outlined in this dissertation will allow biochemists and cellular biologists to characterize the polyubiquitin chains present in their cellular environment by following a facile mass-spectrometry-based workflow. The characterization of polyubiquitin chains has been of interest since their discovery in 1984. The profound effects of ubiquitination on the movement and processing of cellular proteins depend exclusively on the structures of the mono- and polyubiquitin modifications, anchored or unanchored, on the protein within the cellular environment. However, structure-function studies have been hindered by the difficulty of identifying complex chain structures, owing to the limited instrument capabilities of the past. Genetic mutations or reiterative immunoprecipitations have previously been used to characterize polyubiquitin chains, but their tedium makes it difficult to study a broad ubiquitinome. Top-down and middle-out mass-spectrometry-based proteomic studies have been reported for polyubiquitin and have had success in characterizing parts of the chain, but no method to date has been successful at differentiating all theoretical ubiquitin chain isomers (ubiquitin chain lengths from dimer to tetramer alone have 1340 possible isomers). The workflow presented here can identify chain length, topology, and the linkages present on a chromatographic time scale using LC-MS/MS. To accomplish this feat, the strategy had to exploit the most recent advances in top-down mass spectrometry, including the advanced electron transfer dissociation (ETD) activation and the sensitivity for large masses offered by the Orbitrap Fusion Lumos. The spectral interpretation had to be done manually, with the aid of a graphical interface to assign mass shifts, because of the lack of software capable of interpreting fragmentation across isopeptide linkages. However, the method outlined can be applied to any mass-spectrometry-based system, provided it produces extensive fragmentation across the polyubiquitin chain, making the method adaptable to future advances in the field.
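As an illustration of the kind of mass-shift assignment described above, the sketch below scans a list of deconvoluted fragment masses for pairs separated by a candidate shift within a ppm tolerance. The Gly-Gly remnant value (114.0429 Da) is used only as a familiar example; the shifts relevant to intact polyubiquitin branches, and all observed masses shown, are placeholders rather than parameters or data from the study.

```python
# Candidate mass shifts (Da) to look for between fragments. The Gly-Gly
# remnant mass (two glycine residues) is a familiar example; the shifts
# relevant to a given polyubiquitin construct would be added here instead.
CANDIDATE_SHIFTS = {"GlyGly remnant": 114.0429}
PPM_TOL = 10.0

def assign_shifts(masses, shifts=CANDIDATE_SHIFTS, tol_ppm=PPM_TOL):
    """Return (lighter, heavier, label) for fragment pairs separated by a known shift."""
    masses = sorted(masses)
    hits = []
    for i, m1 in enumerate(masses):
        for m2 in masses[i + 1:]:
            delta = m2 - m1
            for name, shift in shifts.items():
                # Tolerance expressed in ppm of the heavier fragment mass.
                if abs(delta - shift) / m2 * 1e6 <= tol_ppm:
                    hits.append((m1, m2, name))
    return hits

# Invented deconvoluted fragment masses (Da), not data from the study.
observed = [1045.583, 1159.626, 2301.180, 2415.223]
for m1, m2, name in assign_shifts(observed):
    print(f"{m1:.3f} -> {m2:.3f}: {name} (+{m2 - m1:.4f} Da)")
```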

Relevance: 60.00%

Abstract:

Vision plays a very important role in the prevention of danger. Pain also serves to prevent bodily harm. We therefore tested the hypothesis that blindness leads to hypersensitivity to pain as a form of sensory compensation. Indeed, an extensive literature indicates that cross-modal plasticity occurs in blind individuals, upregulating the sensitivity of their remaining senses. Moreover, several studies show that pain can be modulated by vision and by temporary visual deprivation. In a first study, we measured thermal detection thresholds and pain thresholds in congenitally blind and sighted participants using a thermode that can heat or cool the skin. Participants also had to rate the pain they perceived in response to CO2 laser stimuli and to complete questionnaires measuring their attitudes toward painful situations in daily life. The results show that congenitally blind individuals have lower pain thresholds and higher pain ratings than their sighted counterparts. Moreover, the psychometric results indicate that blind individuals pay more attention to pain. In a second study, we measured the impact of visual experience on pain perception by replicating the first study in a sample of late-blind individuals. The results show that the latter are similar to sighted individuals in every respect regarding their sensitivity to pain. In a third study, we tested the temperature discrimination abilities of congenitally blind individuals, since detecting rapid temperature changes is crucial for avoiding burns. It turned out that congenitally blind individuals have finer temperature discrimination and are more sensitive to the spatial summation of heat. In a fourth study, we examined the contribution of Aδ and C fibers to nociceptive processing in blind individuals, since these receptors signal first and second pain, respectively. We observed that congenitally blind individuals detect more easily, and respond more quickly to, the sensations generated by C-fiber activation. In a fifth and final study, we probed the potential changes that loss of vision might bring about in the descending (top-down) modulation of nociceptive inputs by measuring the effects of anticipating a noxious stimulus on pain perception. The results show that, unlike sighted individuals, congenitally blind individuals experience heightened pain under uncertainty about danger, suggesting that central pain modulation is facilitated in the blind. Overall, this work indicates that the absence of visual experience, rather than blindness per se, leads to increased nociceptive sensitivity, adding another dimension to the model of multisensory integration of vision and pain.