851 results for Relative and point positioning
Abstract:
The objective of this work is to test the Purchasing Power Parity theory in its absolute and relative versions for Brazil over the period 1995 to 2010, using econometric procedures to validate or reject the theory through hypothesis testing. The verification employs the Dickey-Fuller (DF) and Augmented Dickey-Fuller (ADF) tests and the Engle-Granger and Johansen cointegration tests. The study focuses on the United States and Brazil, given the trade flow between these countries and their importance in the world economy. Using the IPA and PPI price indices, the validity of the Purchasing Power Parity theory is analysed in both its relative and absolute versions, leading to the conclusion that the relative version is accepted and the absolute version rejected.
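The unit-root logic behind the DF/ADF tests named in this abstract can be sketched in a few lines. This is a minimal illustration on synthetic data (plain Dickey-Fuller regression only, with no lag augmentation and no Dickey-Fuller critical-value tables), not the study's actual procedure or data:

```python
import numpy as np

def dickey_fuller_t(y):
    """t-statistic of rho in: diff(y)_t = alpha + rho * y_{t-1} + e_t.
    A strongly negative value is evidence against a unit root."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)   # has a unit root, like a non-stationary price level
ar1 = np.zeros(500)          # stationary AR(1) with phi = 0.5
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + e[t]
```

In the study's setting, rejecting the unit root for a real-exchange-rate series is what supports relative PPP; the full ADF test additionally includes lagged differences and compares the statistic against Dickey-Fuller (not normal) critical values.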
Abstract:
How does one study a culture or community lost in biblical times? This is the driving question for the author, and it is how his interest arose in discussing the possibility of using the cosmogonic myth to understand the community of Judahite captives in Babylon. It is an initiative that needed to be pursued by researchers willing to study the cultures of the biblical world. Thus the theme Cosmogonic Myth in the First Testament was chosen as an instrument for deepening biblical research. Myth is a more or less obvious choice, given its capacity to function as paradigm, pragmatics and counter-hegemonic tradition within an interethnic social context. These considerations came from sources such as the phenomenologist Mircea Eliade, the anthropologist Roger Bastide, and the theologian and phenomenologist José Severino Croatto. For this reason a parallel is drawn between the myth of Marduk and the text of Isaiah 51:9-11, which speaks of Yahweh as creator of the world who fights against the forces of chaos. This is done with a view to perceiving the prophecy of the Isaiah of the exile in its kinship and juxtaposition with Babylonian mythology, the two being quite close in syntagmatic and socio-historical terms. It remained to be asked whether the prophecy of Deutero-Isaiah worked in the same way that the poem Enuma Elish did for the Babylonians; that is, whether it gave rise to social models for the slave communities within the Neo-Babylonian Empire; whether, on the basis of these songs, the captives were able to build an order for their communities, which enjoyed relative autonomy as colonies and ghettos; and whether, armed with this bold prophecy, the Jews of the golah were capable of elaborating a civil disobedience in the sense of nurturing in their hearts a utopia that would break with the status quo of the past, committing them to hope in Yahweh the creator. (AU)
Abstract:
This work addresses the role played by the musical element, the jingle, within the structure of the advertising piece commonly called the "commercial". To examine the premise that the jingle lost expressive importance with the arrival of the Digital Media Era, relative to the advertising messages broadcast on radio and television until the end of the twentieth century, two methodological techniques were applied: Content Analysis, intended to measure and demonstrate the significant decrease in the average time devoted to the jingle in advertising pieces in the Digital Media Era; and Discourse Analysis, which made it possible to assess the impoverishment of the language resources used in structuring the jingle's musical-literary discourse in Brazilian advertising over the first decade of the twenty-first century. After evaluating the quantitative and qualitative parameters of jingle production in the Digital Media Era, the work concludes that there was indeed a loss of space and importance, and that the jingle, which once played the leading role, is today a supporting player in the process of communication with the market.
Abstract:
This project aims to analyse how Internal Communication has been carried out in practice in Brazil through house-organs. To this end, it draws on two main sources: the published literature on the subject and the winners of the Aberje Award over the last seven years. The intention is to verify, on the one hand, what authors and specialists suggest and indicate as ideal for conducting Internal Communication through house-organs; and on the other, to analyse, on the basis of that theory, what has been done in practice. The choice of the winners of the Aberje Award in the House-organ for Internal Communication category is intended only to provide a logical criterion for selecting the media analysed, since it is an award granted by the main representative body of Corporate Communication in Brazil. The focus of this study is exclusively the content of the vehicles, which includes items such as the flow of organisational discourse (upward, downward, etc.), messages, and the type of language used. Aspects of layout and aesthetics will therefore not be addressed. (AU)
Abstract:
This work analyses the major challenges that new technologies and the transformations of post-industrial society are imposing on Brazilian digital TV and its business model. Its main objective is to reflect on the financial viability of free-to-air broadcasters with the arrival of digital TV. To this end, it analyses the previous business model of analogue TV, based on thirty-second commercials, and how this format may be affected, undermining the structure of content production and distribution by free-to-air digital broadcasters. It also seeks to highlight other factors that contribute to the migration of audiences to other content distribution platforms, refuting the common-sense view that broadband internet access is the main cause of falling audience ratings. This study draws on a broad bibliography that goes beyond the specific field of Communication and widens the view of the free-to-air television industry in Brazil, enumerating the sector's weaknesses and pointing out possible strategies for Brazilian television to adapt to the new communication structure taking shape in the country.
Abstract:
Resilience refers to the human capacity to succeed in the face of life's adversities, to overcome them, and even to be strengthened or transformed by them. The construct has been studied for about forty years in psychiatry with a focus on children, but its investigation in the adult population is much more recent. In the world of competitive sport, studies are scarce. The sporting context presents great challenges and constant adversities that athletes must overcome to meet their professional goals; as a result, they very frequently live at their physical and psychological limits. Resilience may therefore be an important aspect of their professional lives. This study aims to describe the resilience levels of basketball players and to identify possible relationships between resilience and selected indicators of statistical efficiency. Seventy-one adult professional athletes active in the sport took part voluntarily in the research. The variables were assessed using the Resilience Assessment Scale (EAR), a sociodemographic questionnaire, and efficiency indices recorded by the Federação Paulista de Basquetebol. The results of descriptive statistical analyses and bivariate Pearson correlations showed that the athletes displayed a high level of resilience, with persistence in the face of difficulties and positive acceptance of change standing out. The factors that make up resilience showed no significant correlation with the athletes' efficiency coefficient. Comparison of means by analysis of variance showed that athletes with between five and ten years in the profession had better mean efficiency coefficients.
The results also reveal that athletes who play less than eight minutes per game, on average, produce lower statistical efficiency indices, and that athletes on mid-table teams tend to show greater perceived personal competence than athletes on the worst-placed teams. The resilience factors did not differ according to the athletes' experience or their average time on court. These results raise the question of whether statistical efficiency indicators are the most suitable criteria for verifying the role of resilience in the lives of basketball players, and point to the need for more studies on the influence of individual characteristics in the world of professional sport.
Abstract:
Guest editorial: This special issue has been drawn from papers that were published as part of the Second European Conference on Management of Technology (EuroMOT), held at Aston Business School (Birmingham, UK), 10-12 September 2006. This was the official European conference for the International Association for Management of Technology (IAMOT); the overall theme of the conference was "Technology and global integration." Many high-calibre papers were submitted to the conference and published in the associated proceedings (Bennett et al., 2006). The streams of interest that emerged from these submissions were the importance of: technology strategy, innovation, process technologies, managing change, national policies and systems, research and development, supply chain technology, service and operational technology, education and training, small company incubation, technology transfer, virtual operations, technology in developing countries, partnership and alliance, and financing and investment. This special issue focuses upon the streams of interest that accentuate the importance of collaboration between different organisations. Such organisations vary greatly in character; for instance, they may be large or small, publicly or privately owned, and operate in manufacturing or service sectors. Despite these varying characteristics they all have something in common: they all stress the importance of inter-organisational collaboration as a critical success factor for their organisation. In today's global economy it is essential that organisations decide what their core competences are and what those of complementary organisations are. Core competences should be developed to become a basis of differentiation, leverage and competitive advantage, whilst those that are less mature should be outsourced to other organisations that can claim more recognition and success in that particular competence (Porter, 2001).
This strategic trend can be observed throughout advanced economies and is growing strongly. If a posteriori reasoning is applied here it follows that organisations could continue to become more specialised in fewer areas whilst simultaneously becoming more dependent upon other organisations for critical parts of their operations. Such actions seem to fly in the face of rational business strategy and so the question must be asked: why are organisations developing this way? The answer could lie in the recent changes in endogenous and exogenous factors of the organisation; the former emphasising resource-based issues in the short term and strategic positioning in the long term, whilst the latter emphasises transaction costs in the short term and acquisition of new skills and knowledge in the long term. For a harmonious balance of these forces to prevail, organisations must firstly declare a shared meta-strategy, then put cross-organisational processes into place whose routine operations are automated as far as possible. A rolling business plan would review, assess and reposition each organisation within this meta-strategy according to how well it has contributed (Binder and Clegg, 2006). The important common issue here is that an increasing number of businesses today are gaining direct benefit from increasing their levels of inter-organisational collaboration. Such collaboration has largely been possible due to recent technological advances which can make organisational structures more agile (e.g. the extended or the virtual enterprise), organisational infrastructure more connected, and the sharing of real-time information an operational reality. This special issue consists of research papers that have explored the above phenomenon in some way.
For instance, the role of government intervention, the use of internet-based technologies, the role of research and development organisations, the changing relationships between start-ups and established firms, the importance of cross-company communities of practice, the practice of networking, the front-loading of large-scale projects, innovation and the probabilistic uncertainties that organisations experience are explored in these papers. The cases cited in these papers are limited as they have a Eurocentric focus. However, it is hoped that readers of this special issue will gain a valuable insight into the increasing importance of collaborative practices via these studies.
Abstract:
The character of right-wing extremism in the Federal Republic underwent extensive transformations in the seventies. As electoral support for the extreme Right declined, a whole range of new groupings emerged pursuing a militant extra-parliamentary strategy. Their essential characteristics are an increasing tendency to use violence and a close ideological affinity to the NSDAP. They attract a growing number of young people. The increasing susceptibility of young people to rightist ideologies coincides with an economic recession of which young people especially are the victims. Widespread ignorance about Nazism and the prevalence of anti-democratic political attitudes constitute important contributory factors and point to a considerable potential for right-extremism in the Federal Republic. This potential can be attributed to the negative effects of much of the material dealing with the NS past, to serious deficiencies in the area of historical-political education in schools and, above all, to the absence of any real process of "coming to terms with the past" in the postwar period. Neo-Nazism is not completely isolated from other trends in West German society. Rightist elements within the established party system and broad sections of the population hold similar views and attitudes. This similarity, linked with an over-exaggerated concern with a perceived threat from the extreme Left, may explain the absence of any concerted effort to deal with neo-Nazi tendencies. The response of the courts exemplifies a widespread tendency to under-estimate the significance of the extreme Right. Opposition to the Right is restricted primarily to those circles which suffered most under the Nazi regime. The analysis suggests that one must reject the simplistic view that at the present time the Right does not constitute a serious threat to West German democracy.
The study evaluates the wide range of views to be found in secondary sources on the subject of neo-Nazism and is intended to contribute to the ongoing discussion concerning the potential for right-extremism in West Germany.
Abstract:
This thesis describes the procedure and results from four years research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique VERT (Venture Evaluation and Review Technique) was used to model the pre-tender costs of public health, heating ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost significant items were isolated for closer examination. 
The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
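The Monte Carlo core of the VERT approach described above can be illustrated with a toy sketch. The three cost components and their distributions below are invented stand-ins for the study's beta/normal/uniform inputs, not real pre-tender data:

```python
import random

random.seed(42)

def sample_total_cost():
    """One simulated pre-tender services estimate. The components and
    their distributions are hypothetical, for illustration only."""
    heating = random.uniform(90_000, 130_000)           # uniform range
    electrical = random.gauss(200_000, 15_000)          # normal
    lifts = random.triangular(60_000, 110_000, 80_000)  # skewed, beta-like
    return heating + electrical + lifts

# Repeat the sampling many times and read cost ranges off the sorted runs.
runs = sorted(sample_total_cost() for _ in range(10_000))
p10, p50, p90 = (runs[int(len(runs) * q)] for q in (0.10, 0.50, 0.90))
spread = p90 - p10   # the kind of "cost range" a surveyor would report
```

Cumulative frequency curves and risk statements (e.g. "a 90% chance the services cost stays below the p90 value") fall directly out of the sorted runs, which is what lets alternative design solutions be compared on risk rather than on a single-point estimate.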
Abstract:
This paper explores a new method of analysing muscle fatigue within the muscles predominantly used during microsurgery. The captured electromyographic (EMG) data retrieved from these muscles are analysed for any defining patterns relating to muscle fatigue. The analysis consists of dynamically embedding the EMG signals from a single muscle channel into an embedded matrix. The muscle fatigue is determined by defining its entropy characterized by the singular values of the dynamically embedded (DE) matrix. The paper compares this new method with the traditional method of using mean frequency shifts in the EMG signal's power spectral density. Linear regressions are fitted to the results from both methods, and the coefficients of variation of both their slope and point of intercept are determined. It is shown that the complexity method is slightly more robust in that the coefficient of variation for the DE method has lower variability than the conventional method of mean frequency analysis.
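The dynamic-embedding entropy idea described above can be sketched on synthetic signals (the embedding dimension and the test signals are assumptions, not the paper's parameters): a broadband, disordered signal spreads energy over many singular values of its embedded matrix and so has high entropy, while a narrowband signal concentrates it in a few and has low entropy.

```python
import numpy as np

def de_entropy(signal, dim=10):
    """Shannon entropy of the normalised singular values of the
    dynamically embedded (delay) matrix of a 1-D signal."""
    # Each row of M is a length-`dim` window of the signal.
    M = np.lib.stride_tricks.sliding_window_view(signal, dim)
    s = np.linalg.svd(M, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
noise = rng.standard_normal(2000)                    # broadband signal
tone = np.sin(2 * np.pi * 0.05 * np.arange(2000))    # narrowband tone
```

A fatigue analysis in this spirit would track how this entropy of windowed EMG evolves over time, rather than compare two unrelated signals as done here.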
Abstract:
The recent expansion of clinical applications for optical coherence tomography (OCT) is driving the development of approaches for consistent image acquisition. There is a simultaneous need for time-stable, easy-to-use imaging targets for calibration and standardization of OCT devices. We present calibration targets consisting of three-dimensional structures etched into nanoparticle-embedded resin. Spherical iron oxide nanoparticles with a predominant particle diameter of 400 nm were homogeneously dispersed in a two-part polyurethane resin and allowed to harden overnight. These samples were then etched using a precision micromachining femtosecond laser with a center wavelength of 1026 nm, a 100 kHz repetition rate and a 450 fs pulse duration. A series of lines in depth were etched, varying the percentage of inscription energy and the speed of the translation stage moving the target with respect to the laser. Samples were imaged with a dual-wavelength spectral-domain OCT system and the point-spread function of nanoparticles within the target was measured.
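A point-spread-function measurement of the kind mentioned above typically reduces to estimating the width of an intensity profile through a sub-resolution particle. A sketch that estimates the FWHM of a synthetic Gaussian profile (the profile, sampling and units are hypothetical, not the system's data):

```python
import numpy as np

def fwhm(z, profile):
    """Full width at half maximum of a sampled profile, by linear
    interpolation of the two half-maximum crossings."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    i0, i1 = above[0], above[-1]

    def cross(lo, hi):
        # Interpolate the depth at which the profile crosses `half`.
        return z[lo] + (half - profile[lo]) * (z[hi] - z[lo]) / (profile[hi] - profile[lo])

    return cross(i1, i1 + 1) - cross(i0 - 1, i0)

z = np.linspace(-10.0, 10.0, 2001)       # depth axis (arbitrary units)
sigma = 2.0
psf = np.exp(-z**2 / (2 * sigma**2))     # synthetic Gaussian axial PSF
```

For a Gaussian the analytic FWHM is 2*sqrt(2*ln 2)*sigma, so the numerical estimate can be checked against it; for a real OCT target the profile would come from an A-scan through a nanoparticle.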
Abstract:
Albumin is not endogenous to the tear film and is present as a product of plasma leakage. It is used as a diagnostic marker of ocular insult and inflammation. Tear albumin is, however, poorly understood, with large variations in reported concentrations between studies. There is also no authoritative information on whether its presence in tears is responsive or part of an adaptive reaction. The presented research aimed to resolve the disparities in published tear albumin concentrations and investigate the role of albumin in the tear film. Collation and evaluation of the available literature identified collection method, stimulus, assay technique, and disease state as factors able to influence quoted tear albumin concentrations to different extents. Differences in sampling technique exhibited the largest variations in mean tear albumin concentrations. Review of the literature also highlighted that little systematic investigation of the daily cycle of tear albumin levels, and of subject-to-subject variation, had been carried out. To remedy this shortcoming, variations in tear albumin concentration were investigated in 13 subjects throughout the waking day. Results identified a time period when albumin levels are relatively stable (2-6 hours post-waking), which was designated a suitable baseline for determinations of tear albumin concentrations and subject-to-subject comparisons. Significantly, a previously unrecognised progressive increase in albumin concentration during the latter part of the day was also identified in the population. This increase suggests that albumin may play a more active and dynamic role in the ocular environment than is commonly perceived. To facilitate the collection of additional tear albumin data, tear sampling and point-of-care analysis in contact lens clinics were investigated. Two instruments were evaluated and found to be suitable for the analysis of tear albumin in commercial settings.
Collectively, the described research has provided new insight into tear albumin and a strong foundation for further studies.
Abstract:
Background - Carbon monoxide, the gaseous product of heme oxygenase, is a signalling molecule with a broad spectrum of biological activities. The aim of this study was to investigate the effects of carbon monoxide on proliferation of human pancreatic cancer. Methods - In vitro studies were performed on human pancreatic cancer cells (CAPAN-2, BxPc3, and PaTu-8902) treated with a carbon monoxide-releasing molecule or its inactive counterpart, or exposed to carbon monoxide gas (500 ppm/24 h). For in vivo studies, pancreatic cancer cells (CAPAN-2/PaTu-8902) were xenotransplanted subcutaneously into athymic mice, subsequently treated with carbon monoxide-releasing molecule (35 mg/kg b.w. i.p./day), or exposed to safe doses of carbon monoxide (500 ppm 1 h/day; n = 6 in each group). Results - Both carbon monoxide-releasing molecule and carbon monoxide exposure significantly inhibited proliferation of human pancreatic cancer cells (p < 0.05). A substantial decrease in Akt phosphorylation was observed in carbon monoxide-releasing molecule compared with inactive carbon monoxide-releasing molecule treated cancer cells (by 30–50%, p < 0.05). Simultaneously, carbon monoxide-releasing molecule and carbon monoxide exposure inhibited tumour proliferation and microvascular density of xenotransplanted tumours (p < 0.01), and doubled the survival rates (p < 0.005). Exposure of mice to carbon monoxide led to an almost 3-fold increase in carbon monoxide content in tumour tissues (p = 0.006). Conclusion - These data suggest a new biological function for carbon monoxide in carcinogenesis, and point to the potential chemotherapeutic/chemoadjuvant use of carbon monoxide in pancreatic cancer.
Abstract:
We propose a scheme for multilevel (nine or more) amplitude regeneration based on a nonlinear optical loop mirror (NOLM) and demonstrate through numerical modeling its efficiency and cascadability on circular 16-, 64-, and 256-symbol constellations. We show that the amplitude noise is efficiently suppressed. The design is flexible and enables variation of the number of levels and their positioning. The scheme is compatible with phase regenerators. Also, compared to the traditional single-NOLM configuration, new features, such as a reduced and sign-varied power-dependent phase shift, are available. The model is simple to implement, as it requires only two couplers in addition to the traditional NOLM, and offers a vast range of optimization parameters. © 2014 Optical Society of America.