Abstract:
We studied the effects of different protocols of post-disuse rehabilitation on angiogenesis and myosin heavy chain (MHC) content in rat hindlimb muscles after caudal suspension. Thirty female Wistar rats were divided into five groups: (1) Control I, (2) Control II, (3) Suspended, (4) Suspended trained on a declined treadmill, and (5) Suspended trained on a flat treadmill. Fragments of the soleus and tibialis anterior (TA) muscles were frozen and processed by electrophoresis and immunohistochemistry (CD31 antibody). Hindlimb suspension reduced the capillary/fiber (C/F) ratio and the content of MHC type I (MHCI) in the soleus, in parallel with increased capillary density. Flat treadmill protocols increased the content of the MHCI isoform. The C/F ratio was increased by concentric training after hypokinesis, but was not modified by eccentric training, which caused a greater reduction of capillary density than the other protocols. In the TA muscle, hindlimb suspension caused a non-significant increase in capillary density and C/F ratio, with limited changes in MHC. The present data demonstrate that the different training protocols adopted and the functional performance of the muscles analyzed caused specific changes in capillarization and in the content of the various MHC types. (C) 2010 Published by Elsevier GmbH.
Abstract:
We compared the lignin contents of tropical forages as measured by different analytical methods and evaluated their correlations with parameters related to the degradation of neutral detergent fiber (NDF). The lignin content was evaluated by five methods: cellulose solubilization in sulfuric acid [Lignin (sa)], oxidation with potassium permanganate [Lignin (pm)], the Klason lignin method (KL), solubilization in acetyl bromide from acid detergent fiber (ABLadf), and solubilization in acetyl bromide from the cell wall (ABLcw). Samples from ten grasses and ten legumes were used. The lignin content values obtained by gravimetric methods were also corrected for protein contamination, and the corrected values were referred to as Lignin (sa)p, Lignin (pm)p and KLp. The indigestible fraction of NDF (iNDF), the discrete lag (LAG) and the fractional rate of degradation (kd) of NDF were estimated using an in vitro assay. Correcting for protein resulted in reductions (P < 0.05) in the lignin contents as measured by the Lignin (sa), Lignin (pm) and, especially, the KL methods. There was an interaction (P < 0.05) of analytical method and forage group for lignin content. In general, the KLp method provided the highest (P < 0.05) lignin contents. The estimates of lignin content obtained by the Lignin (sa)p, Lignin (pm)p and KLp methods were correlated (P < 0.05) with all of the NDF degradation parameters. However, the strongest correlation coefficients among all methods evaluated were obtained with Lignin (pm)p and KLp. The lignin content estimated by the ABLcw method did not correlate (P > 0.05) with any parameter of NDF degradation. There was a correlation (P < 0.05) between the lignin content estimated by the ABLadf method and the iNDF content; nonetheless, this correlation was weaker than those found with the gravimetric methods. From these results, we concluded that the gravimetric methods produce residues that are contaminated by nitrogenous compounds. Adjustment for these contaminants is suggested, particularly for the KL method, to express lignin content with greater accuracy. The relationships between lignin content measurements and NDF degradation parameters can be better determined using the KLp and Lignin (pm)p methods. (C) 2011 Elsevier B.V. All rights reserved.
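As an illustration of the kind of correlation analysis this abstract describes, the sketch below computes Pearson correlations between a protein-corrected lignin estimate and NDF degradation parameters. All values, and the choice of scipy, are hypothetical stand-ins, not the study's data or exact statistics.

```python
# Hypothetical sketch: Pearson correlations between protein-corrected
# Klason lignin (KLp) and NDF degradation parameters.
import numpy as np
from scipy.stats import pearsonr

klp  = np.array([4.1, 5.3, 6.8, 8.0, 9.4, 11.2])             # % DM, assumed
indf = np.array([18.0, 22.0, 29.0, 33.0, 40.0, 46.0])        # % iNDF, assumed
kd   = np.array([0.050, 0.046, 0.040, 0.036, 0.031, 0.027])  # 1/h, assumed

for name, param in [("iNDF", indf), ("kd", kd)]:
    r, p = pearsonr(klp, param)
    print(f"KLp vs {name}: r = {r:.2f}, p = {p:.3f}")
```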
Abstract:
Different monomer structures lead to different physical and mechanical properties, both for the monomers and for the polymers. The objective of this study was to determine the influence of the bisphenol A glycidyl dimethacrylate (BisGMA) concentration (33, 50 or 66 mol%) and of the co-monomer content [triethylene glycol dimethacrylate (TEGDMA), ethoxylated bisphenol-A dimethacrylate (BisEMA), or both in equal parts] on viscosity (η), degree of conversion (DC), and flexural strength (FS). η was measured using a viscometer, DC was obtained by Fourier transform Raman (FT-Raman) spectroscopy, and FS was determined by three-point bending. At 50 and 66% BisGMA, increases in η were observed following the partial and total substitution of TEGDMA by BisEMA. For 33% BisGMA, η increased significantly only when no TEGDMA was present. The DC was influenced by both BisGMA content and co-monomer type. Mixtures containing 66% BisGMA showed a lower DC than mixtures containing the other concentrations of BisGMA. The BisEMA mixtures had a lower DC than the TEGDMA mixtures. The FS was influenced by co-monomer content only. BisEMA mixtures presented a statistically lower FS, followed by TEGDMA + BisEMA mixtures, and then by TEGDMA mixtures. Partial or total replacement of TEGDMA by BisEMA increased η, which was associated with the observed decreases in DC and FS. Although the BisGMA content influenced the DC, it did not affect the FS results.
Abstract:
Trichogramma australicum larvae develop most rapidly in younger eggs of their host, the pest lepidopteran Helicoverpa armigera. To establish how the developmental stage of the host affects the diet of T. australicum, larvae were fixed in situ in eggs of H. armigera of different ages, and the structure of the egg contents and the parasitoid gut contents were examined histologically. Larvae feeding on newly laid host eggs contain primarily yolk particles in their gut, while larvae feeding on older hosts contain necrotic cells and yolk particles. The gut of T. australicum larvae does not contain organised tissue remnants, indicating that larvae feed primarily by sucking food into their pharynx and feed best on a mixture of particulate semisolids in a liquid matrix. Secretory structures of T. australicum larvae that could be involved in modifying the host environment were also examined. The hindgut is modified to form an anal vesicle with a number of attributes suggesting that it may be a specialised secretory structure. The paired salivary glands open to the exterior via a common duct.
Abstract:
Passive avoidance learning is conveniently studied in day-old chicks trained to distinguish between beads of two different colors, one of which was associated at training with an aversive taste. During the first 30 min post-training, two periods of glutamate release occur in the forebrain. One period is immediately after the aversive experience, when glutamate release is confined to the left hemisphere. A second release, 30 min later, may be bilateral, perhaps with a preponderance of the right hemisphere. The present study showed increased pool sizes of glutamate and glutamine, specifically in the left hemisphere, at the time when the first glutamate release occurs, indicating de novo synthesis of glutamate/glutamine from glucose or glycogen, which are the only possible substrates. Behavioral evidence that memory is extinguished by intracranial administration at this time of iodoacetate, an inhibitor of glycolysis and glycogenolysis, and that the extinction of memory is counteracted by injection of glutamine, supports this concept. A decrease in forebrain glycogen of similar magnitude, coinciding with the increase in glutamate and glutamine, suggests that glycogen rather than glucose is the main source of newly synthesized glutamate/glutamine. The second activation of glutamatergic activity, 30 min after training, when memory is consolidated into stable long-term memory, is associated with a bilateral increase in the pool size of glutamate/glutamine. No glycogenolysis was observed at this time, but again there is a temporal correlation with sensitivity to inhibition by iodoacetate and rescue by glutamine, indicating the importance of de novo synthesis of glutamate/glutamine from glucose or glycogen. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Websites are nowadays the face of institutions, yet they are often neglected, especially when it comes to content. In the present paper, we describe ongoing research whose final goal is the development of a model for measuring data quality in the institutional websites of health units. To that end, we carried out a bibliographic review of the available approaches for evaluating website content quality, in order to identify the most recurrent dimensions and attributes, and we are currently conducting a Delphi Method process, presently in its second stage, with the purpose of reaching an adequate set of attributes for the measurement of content quality.
Abstract:
Nowadays there is more and more audiovisual information, and multimedia streams or files can be shared easily and efficiently. However, the tampering of video content such as financial information, news, or videoconference sessions used in court can have serious consequences, given the importance of that kind of information. Hence the need to ensure the authenticity and integrity of audiovisual information. This dissertation proposes an H.264/Advanced Video Coding (AVC) video authentication system, called Autenticação de Fluxos utilizando Projecções Aleatórias (AFPA), whose authentication procedures operate at the level of each video frame. This scheme allows a more flexible kind of authentication, since it makes it possible to define a maximum limit of modifications between two frames. Authentication relies on a new image authentication technique that combines random projections with an error-correction mechanism applied to the data. Each video frame can thus be authenticated with a reduced set of parity bits of its random projection. Since video information is typically transported over unreliable protocols, it may suffer packet losses. To reduce the effect of packet losses on video quality and on the authentication rate, Unequal Error Protection (UEP) is used. For validation and comparison of results, a classical system was implemented that authenticates video streams in the typical way, i.e., using digital signatures and hash codes. Both schemes were evaluated with respect to the overhead introduced and the authentication rate. The results show that the AFPA system, using a high-quality video, reduces the authentication overhead fourfold compared with the scheme based on digital signatures and hash codes.
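A minimal sketch of the frame-level idea follows: compute a secretly seeded random projection of each frame, quantize it, and accept a received frame if its projection differs from the reference in at most a fixed number of bits, which is what tolerating a bounded amount of modification means. The actual AFPA scheme transmits only error-correction parity bits of the projection and adds UEP; the dimensions, seed, and threshold below are illustrative assumptions.

```python
import numpy as np

def project_and_quantize(frame, n_proj=128, seed=42):
    """Random projection of a frame followed by 1-bit quantization.
    AFPA itself transmits only error-correction parity bits of the
    projection; keeping the whole bit vector here simplifies the sketch."""
    rng = np.random.default_rng(seed)           # seed acts as a shared secret
    x = frame.astype(np.float64).ravel()
    R = rng.standard_normal((n_proj, x.size))   # random projection matrix
    y = R @ x
    return (y > np.median(y)).astype(np.uint8)  # 1-bit quantization

def authenticate(received_frame, reference_bits, max_flips=8, **kw):
    """Accept the frame if its projection differs from the reference in
    at most max_flips bits, i.e. tolerate bounded modifications."""
    bits = project_and_quantize(received_frame, **kw)
    return int(np.sum(bits != reference_bits)) <= max_flips

# Toy usage: mild noise passes, tampering with a whole block does not.
frame = np.random.default_rng(0).integers(0, 256, (64, 64))
ref = project_and_quantize(frame)
noisy = frame + np.random.default_rng(1).integers(-1, 2, frame.shape)
tampered = frame.copy()
tampered[:16, :16] = 0
print(authenticate(noisy, ref), authenticate(tampered, ref))
```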
Abstract:
Augmented Reality has changed the way humans perceive the real world. Extending our reality into Virtual Reality makes it possible to create new experiences whose applicability is already regarded as natural in many situations. However, enabling this kind of interaction can be a complex process, both because of technological limitations and because of the management of the resources involved. Developing augmented reality projects for commercial purposes therefore often requires optimizing the resources used, taking into account the limitations of the surrounding technologies (motion and voice detection systems, pattern detection, GPS, image analysis, biometric sensors, etc.). With the popularization and acceptance of Augmented Reality techniques in many areas (medicine, education, leisure, etc.), these techniques also need to work across the devices we use daily (computers, tablets, mobile phones, etc.). A common denominator among these devices is the internet, since online applications can reach a larger number of people. The goal of this project was to create a web application with Augmented Reality techniques whose contents are managed by the users. The research and development process of this work therefore went through a fundamental prototyping phase to select the technologies that best fitted the intended application architecture and the development tools used by the company where the project was carried out. The final application consists of a FrontOffice, responsible for displaying and interpreting the created applications and enabling integration with other applications, and a BackOffice that allows users without programming knowledge to create new augmented reality applications and manage the multimedia contents used. The developed application can serve as a basis for other applications and be reused in other contexts, always with the goal of reducing development and content management costs, thereby providing a Framework for content management in different areas (medicine, education, leisure, etc.), where users can create their own applications, games, and work tools. During the project, the application was validated by experts, ensuring the fulfilment of the proposed objectives.
Abstract:
Smartphones and other mobile devices have been gaining ever more computational power and are now able to run a wide range of applications, from simple note-taking programs to sophisticated navigation software. However, even with the evolution of their hardware, current mobile devices still do not have the same capabilities as desktop or laptop computers. One possible solution to this problem is to distribute the application, running parts of it on the local device and the rest on other devices connected to the network. Additionally, some types of applications, such as multimedia applications, electronic games, or immersive environments, have Quality of Service requirements, particularly real-time ones. This thesis proposes a remote code execution system for distributed systems with real-time constraints. The proposed architecture suits systems that need to execute the same set of functions periodically and in parallel with real-time guarantees, even when the execution times of those functions are unknown. The proposed platform was developed for mobile systems capable of running the Android operating system.
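The scheduling idea can be illustrated, purely conceptually, with the Python sketch below (the platform itself targets Android): dispatch the same set of functions in parallel each cycle and keep only the results that arrive before the cycle deadline, since execution times are unknown in advance. Function names and the workload are hypothetical.

```python
# Conceptual sketch only: periodic parallel execution under a deadline.
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def run_cycle(functions, data, deadline_s, pool):
    start = time.monotonic()
    futures = {name: pool.submit(fn, data) for name, fn in functions.items()}
    results = {}
    for name, fut in futures.items():
        # Wait only for the time remaining in this cycle's budget.
        remaining = max(0.0, deadline_s - (time.monotonic() - start))
        try:
            results[name] = fut.result(timeout=remaining)
        except TimeoutError:
            # Missed deadline; a real system would cancel the task
            # or offload it to a remote node.
            results[name] = None
    return results

pool = ThreadPoolExecutor(max_workers=4)
funcs = {"mean": lambda xs: sum(xs) / len(xs), "peak": max}  # stand-in workload
print(run_cycle(funcs, [1.0, 3.0, 2.0], deadline_s=0.05, pool=pool))
```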
Abstract:
In Invisible Cities (1972), Italo Calvino contrasts a rigid outline structure with a flexible textual content. The tension created by the numerical structure proposed in the table of contents stands out against the set of polysemic texts which make up the subject matter of the book. The opposition between form and content points to a fruitful dichotomy in the conception of the novel, linked to the theories of the open and the closed work. This essay investigates the structural construction of Invisible Cities by looking at its table of contents, seeking to discuss some models of formalistic representation proposed by the criticism and the specific contribution they may, or may not, provide. The objective is to analyse the pertinence of such theories in the light of historical and cultural approaches. Aiming to uncover possible meanings which arise from the debate, this essay questions to what extent structural complexities can be considered literary if they are not ultimately related to the culture in which a text is found.
Abstract:
A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently a part of standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related with knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information, and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work.
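The snippet below sketches what comparing two classification approaches on biosignal-derived features can look like; the feature names, the classifier pair (k-NN and an RBF SVM) and the synthetic data are assumptions for illustration, not the paper's actual setup.

```python
# Illustrative comparison of two classifiers on synthetic biosignal features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# One row per answer: mean EDA, EDA peak rate, mean HR, HR variability
X = rng.normal(size=(200, 4))
y = rng.integers(0, 2, 200)  # ground-truth annotation: 1 = deceptive answer

for name, clf in [("k-NN", KNeighborsClassifier(5)), ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f} cross-validated accuracy")
```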
Flavoured versus natural waters: macromineral (Ca, Mg, K, Na) and micromineral (Fe, Cu, Zn) contents
Abstract:
The macromineral (Ca, Mg, K, Na) and micromineral (Fe, Zn, Cu) composition of 39 waters was analysed. Determinations were made by flame atomic spectrophotometry for macrominerals and by electrothermal atomisation in a graphite furnace for microminerals. The mineral contents of still or sparkling natural waters (without flavours) varied from brand to brand. The Mann–Whitney test was used to search for significant differences between flavoured and natural waters. For that purpose, the concentration of each mineral was compared with the presence of flavours, preservatives, acidifying agents, fruit juice and/or sweeteners, according to the labelled composition. The statistical study demonstrated that flavoured waters generally have increased contents of K, Na, Fe and Cu. The added preservatives also led to significant differences in the mineral composition. Acidifying agents and fruit juice can also be correlated with an increase of Mg, K, Na, Fe and Cu. Sweeteners do not produce any significant difference in Ca, Mg, Fe and Zn contents.
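In outline, the per-mineral comparison reads as a Mann–Whitney U test between the two labelled groups, as in the sketch below; the potassium values are hypothetical stand-ins, not the study's measurements.

```python
# Sketch of the per-mineral comparison between flavoured and natural waters.
from scipy.stats import mannwhitneyu

natural_K   = [0.8, 1.1, 0.9, 1.3, 0.7]   # mg/L, assumed
flavoured_K = [4.2, 5.0, 3.8, 6.1, 4.7]   # mg/L, assumed

stat, p = mannwhitneyu(natural_K, flavoured_K, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # p < 0.05 -> significantly different K
```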
Abstract:
This work reports a relatively rapid procedure for forecasting the remediation time (RT) of sandy soils contaminated with cyclohexane using vapour extraction. The RT estimated through the mathematical fitting of experimental results was compared with that of real soils. The main objectives were: (i) to predict the RT of soils with natural organic matter (NOM) and water contents different from those used in the experiments; and (ii) to analyse the time and efficiency of remediation, and the distribution of contaminants in the soil matrix after the remediation process, according to the soil contents of (ii1) NOM and (ii2) water. For sandy soils with negligible clay contents, artificially contaminated with cyclohexane before vapour extraction, it was concluded that: (i) if the NOM and water contents were within the range of the prepared soils, the RT of real soils could be predicted with relative differences not higher than 12%; (ii1) increasing the NOM content from 0% to 7.5% increased the RT (1.8–13 h) and decreased the remediation efficiency (RE) (99–90%); and (ii2) increasing the soil water content from 0% to 6% increased the RT (1.8–4.9 h) and decreased the RE (99–97%). NOM increases the monolayer capacity, leading to higher sorption into the solid phase. Increasing the soil water content reduces the mass transfer coefficient between phases. In conclusion, NOM and water contents influence the remediation process negatively, making it less efficient and more time consuming, and consequently more expensive.
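A minimal sketch of forecasting an RT by mathematical fitting follows: assuming, for illustration only, first-order removal of the contaminant (the paper's fitted expression is not given here), one fits the decay constant to extraction data and extrapolates the time to reach a target residual level. All numbers are hypothetical.

```python
# Illustrative RT forecast by fitting a first-order removal model.
import numpy as np
from scipy.optimize import curve_fit

def residual(t, c0, k):
    return c0 * np.exp(-k * t)  # assumed first-order removal of cyclohexane

t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0])      # h, hypothetical samples
c = np.array([100.0, 61.0, 37.0, 14.0, 2.1, 0.4])  # % of initial content

(c0, k), _ = curve_fit(residual, t, c, p0=(100.0, 0.5))
rt = np.log(c0 / 1.0) / k  # time to reach a 1% residual, a proxy for RT
print(f"k = {k:.2f} 1/h, forecast RT ~ {rt:.1f} h")
```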
Abstract:
The objectives of this work were: (1) to identify an isotherm model relating the contaminant contents in the gas phase to those in the solid and non-aqueous liquid phases; (2) to develop a methodology for estimating the contaminant distribution in the different phases of the soil; and (3) to evaluate the influence of soil water content on the contaminant distribution in soil. For sandy soils with negligible contents of clay and natural organic matter, contaminated with benzene, toluene, ethylbenzene, xylene, trichloroethylene (TCE), and perchloroethylene (PCE), it was concluded that: (1) Freundlich's model proved adequate for relating the contaminant contents in the gas phase to those in the solid and non-aqueous liquid phases; (2) the distribution of the contaminants among the different phases present in the soil could be estimated with differences lower than 10% in 83% of the cases; and (3) an increase of the soil water content led to a decrease of the amount of contaminant in the solid and non-aqueous liquid phases, increasing the amount in the other phases.
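Since the abstract reports that Freundlich's model, q = K_F · C^(1/n), adequately relates gas-phase contents to those in the other phases, here is a short sketch of fitting that isotherm; K_F, n and the data points are hypothetical assumptions, not the study's results.

```python
# Sketch of fitting Freundlich's isotherm to assumed gas/sorbed data.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c_gas, k_f, n):
    return k_f * c_gas ** (1.0 / n)  # sorbed content vs gas-phase content

c_gas  = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # mg/L in gas phase, assumed
q_sorb = np.array([1.2, 1.8, 2.6, 3.9, 5.7])   # mg/kg sorbed, assumed

(k_f, n), _ = curve_fit(freundlich, c_gas, q_sorb, p0=(1.0, 2.0))
print(f"K_F = {k_f:.2f}, 1/n = {1/n:.2f}")
```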