989 results for Distributed linear precoding
Abstract:
ABSTRACT: Introduction – Radiotherapy (RT) is a therapeutic approach for the treatment of breast cancer. However, different irradiation techniques (IT) can be used. Objectives – To compare four ITs with respect to the irradiation of the planning target volume (PTV) and the organs at risk (OAR). Methods – Seven patients with an indication for RT of the left breast were selected. The PTV and the OAR were contoured on computed tomography images. Four treatment plans per patient were calculated, one for each IT: external conformal radiotherapy (EBRT), intensity-modulated radiotherapy with 2 fields (IMRT2) and with 5 fields (IMRT5), and dynamic arc therapy (DART). Results – Dose-volume histograms were compared for all ITs using the statistical analysis software IBM SPSS v20. With IMRT5 and DART, the OAR receive more low doses. However, IMRT5 shows better conformity and homogeneity indices for the PTV. Conclusions – IMRT5 presents the best conformity index; EBRT and IMRT2 give better results than DART. There are statistically significant differences between the ITs, above all at the lower doses received by the OAR.
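For context, two plan-quality metrics of the kind compared here, in one common formulation each (the RTOG conformity index and the ICRU Report 83 homogeneity index; the study may use different variants):

```latex
% RTOG conformity index: reference-isodose volume over planning target volume
CI = \frac{V_{RI}}{TV}
% ICRU 83 homogeneity index: near-max minus near-min dose over median dose
HI = \frac{D_{2\%} - D_{98\%}}{D_{50\%}}
```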
Abstract:
OBJECTIVE: Myocardial infarction is an acute and severe cardiovascular disease that generally leads to admission to intensive care units, with few cases initially admitted to infirmaries. The objective of the study was to assess whether estimates of air pollution effects on myocardial infarction morbidity are modified by the source of health information. METHODS: The study was carried out in hospitals of the Brazilian Health System in the city of São Paulo, Southeastern Brazil. A time series study (1998-1999) was performed using two outcomes: infarction admissions to infirmaries and to intensive care units, both for people older than 64 years of age. Generalized linear models controlling for seasonality (long- and short-term trends) and weather were used. The eight-day cumulative effects of air pollutants were assessed using third-degree polynomial distributed lag models. RESULTS: Almost 70% of daily hospital admissions due to myocardial infarction were to infirmaries. Despite that, the effects of air pollutants on infarction were higher for intensive care unit admissions. All pollutants were positively associated with the study outcomes, but SO2 presented the strongest statistically significant association. An interquartile range increase in SO2 concentration was associated with increases of 13% (95% CI: 6-19) and 8% (95% CI: 2-13) in intensive care unit and infirmary infarction admissions, respectively. CONCLUSIONS: A misclassification of myocardial infarction admissions to infirmaries may be assumed, leading to overestimation. Also, despite the smaller absolute number of events, intensive care unit admission data provide a more adequate estimate of the magnitude of air pollution effects on infarction admissions.
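As an illustration of the lag structure described, a minimal sketch of a third-degree polynomial (Almon) distributed lag over eight days inside a Poisson GLM, on simulated data; variable names are invented, and the seasonality terms used in the actual study are omitted for brevity:

```python
# Sketch of a third-degree polynomial (Almon) distributed lag over 8 days
# (lags 0..7); data and variable names are hypothetical, not the study's.
import numpy as np
import statsmodels.api as sm

def almon_basis(x, max_lag=7, degree=3):
    """Project lags 0..max_lag of series x onto a polynomial basis."""
    lags = np.column_stack([np.roll(x, k) for k in range(max_lag + 1)])
    lags = lags[max_lag:]                     # drop rows with wrapped values
    poly = np.vander(np.arange(max_lag + 1), degree + 1, increasing=True)
    return lags @ poly                        # shape: (n - max_lag, degree+1)

# y: daily infarction admission counts; so2, temp: pollutant and weather
rng = np.random.default_rng(0)
n = 730                                       # two years, 1998-1999
so2 = rng.gamma(4.0, 5.0, n)
temp = 20 + 5 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
y = rng.poisson(5, n)

Z = almon_basis(so2)                          # constrained lag structure
X = sm.add_constant(np.column_stack([Z, temp[7:]]))
fit = sm.GLM(y[7:], X, family=sm.families.Poisson()).fit()

# cumulative 8-day effect = sum of the implied per-lag coefficients
poly = np.vander(np.arange(8), 4, increasing=True)
beta_lags = poly @ fit.params[1:5]
print("cumulative log-RR per unit SO2:", beta_lags.sum())
```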
Abstract:
Master's Dissertation, Economics and Business Sciences, 16 December 2013, Universidade dos Açores.
Abstract:
This paper presents a distributed model predictive control (DMPC) scheme for indoor thermal comfort that simultaneously optimizes the consumption of a limited shared energy resource. The control objective of each subsystem is to minimize the heating/cooling energy cost while keeping the indoor temperature and the power used within bounds. In a distributed, coordinated environment, the control uses multiple dynamically decoupled agents (one per subsystem/house) that aim to satisfy the coupling constraints. According to its hourly power demand profile, each house assigns a priority level that indicates how much it is willing to bid in an auction to consume the limited clean resource. This procedure allows the bid value to vary hourly and, consequently, the order in which the agents access the clean energy also varies. In addition to the power constraints, all houses also have thermal comfort constraints that must be fulfilled. The system is simulated with several houses in a distributed environment.
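A minimal sketch of the auction layer described above, with hypothetical houses, demands, and bids; each agent's underlying thermal MPC problem is omitted:

```python
# The limited clean resource is granted in bid order; houses left short must
# draw from elsewhere (or their MPC tightens its comfort/power bounds).
# All numbers are invented for illustration.

def allocate_clean_energy(requests, bids, capacity):
    """requests/bids: dicts house -> kW demanded, bid (priority) value."""
    allocation = {}
    remaining = capacity
    for house in sorted(bids, key=bids.get, reverse=True):
        granted = min(requests[house], remaining)
        allocation[house] = granted
        remaining -= granted
    return allocation

requests = {"house1": 3.0, "house2": 2.5, "house3": 4.0}   # kW this hour
bids = {"house1": 0.7, "house2": 0.9, "house3": 0.4}       # priority levels
print(allocate_clean_energy(requests, bids, capacity=5.0))
# -> house2 fully served (2.5), house1 partially (2.5), house3 gets nothing
```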
Abstract:
OBJECTIVE: To identify the determinants of protein-energy malnutrition that cause weight and linear growth deficits in children. METHODS: Cross-sectional study involving 1,041 children (under two years of age) from 10 municipalities of the State of Bahia, Brazil, from 1999 to 2000. Logistic regression and a hierarchical modelling strategy were used to identify the factors associated with anthropometric status. RESULTS: The final model for linear growth deficit revealed as a basic determinant the ownership of two or fewer household appliances (OR=2.9; 95% CI: 1.74-4.90) and, at the underlying level, the absence of prenatal care (OR=2.7; 95% CI: 1.47-4.97); among the immediate determinants were low birth weight (<2,500 g) (OR=3.6; 95% CI: 1.72-7.70) and reported hospitalization in the 12 months before the interview (OR=2.4; 95% CI: 1.42-4.10). The determinants of weight deficit at the basic, underlying, and immediate levels were, respectively: monthly per capita income below one quarter of the minimum wage (OR=3.4; 95% CI: 1.41-8.16), absence of prenatal care (OR=2.1; 95% CI: 1.03-4.35), and low birth weight (OR=4.8; 95% CI: 2.00-11.48). CONCLUSIONS: The children's weight and linear growth deficits were explained by the interplay between precarious material living conditions, restricted access to health care, and the burden of morbidity. Interventions that improve living conditions and broaden access to health services are strategies toward equity in child health and nutrition.
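A sketch of the hierarchical entry strategy with logistic regression, on simulated data with invented variable names (the study's actual selection rules may differ):

```python
# Blocks of determinants are entered from distal (basic) to proximal
# (immediate), retaining at each level the variables that stay associated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_block(df, outcome, kept, block, p_enter=0.05):
    """Add one hierarchical level, retaining variables with p < p_enter."""
    X = sm.add_constant(df[kept + block].astype(float))
    model = sm.Logit(df[outcome], X).fit(disp=False)
    retained = [v for v in block if model.pvalues[v] < p_enter]
    return kept + retained, model

rng = np.random.default_rng(3)
n = 1041
df = pd.DataFrame({
    "few_appliances":   rng.integers(0, 2, n),   # basic (distal) level
    "no_prenatal_care": rng.integers(0, 2, n),   # underlying level
    "low_birth_weight": rng.integers(0, 2, n),   # immediate level
})
logit = -2 + 1.0 * df.few_appliances + 1.0 * df.low_birth_weight
df["stunting"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

kept, _ = fit_block(df, "stunting", [], ["few_appliances"])
kept, _ = fit_block(df, "stunting", kept, ["no_prenatal_care"])
kept, m = fit_block(df, "stunting", kept, ["low_birth_weight"])
print(np.exp(m.params))                          # odds ratios, final model
```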
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information or several times to refine its quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information becomes available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
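A static sketch of the fusion step on synthetic data: hypotheses are combined with weights inversely proportional to their estimated virtual-channel noise variances, a denoising-style weighted average (the paper's algorithm additionally iterates as newly decoded data arrives):

```python
# Pixel-wise inverse-variance fusion of side-information hypotheses; this is
# the MMSE combination under independent Gaussian noise. Frames and variance
# estimates below are synthetic.
import numpy as np

def fuse_side_information(hypotheses, noise_vars):
    """hypotheses: list of HxW arrays; noise_vars: per-hypothesis variance."""
    weights = 1.0 / np.asarray(noise_vars)
    weights /= weights.sum()
    return sum(w * h for w, h in zip(weights, hypotheses))

rng = np.random.default_rng(1)
original = rng.integers(0, 256, (4, 4)).astype(float)
hyps = [original + rng.normal(0, s, original.shape) for s in (2.0, 8.0)]
fused = fuse_side_information(hyps, noise_vars=[4.0, 64.0])
print(np.abs(fused - original).mean())   # fused error < worst hypothesis
```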
Abstract:
OBJECTIVE: Methodological strategies have been developed to minimize the effect of dietary measurement error. The objective of this study was to describe the application of a strategy for correcting dietary intake data for measurement error. METHODS: Food consumption data were obtained by applying a Food Frequency Questionnaire to 79 adolescents in the city of São Paulo in 1999. The dietary data were corrected by linear regression, after energy adjustment using the residual method. The reference method was the 24-hour dietary recall, applied on three separate occasions. RESULTS: The corrected values approached the reference values. The correction factor lambda was 0.89 for energy. For the macronutrients, the factors were 0.41, 0.22, and 0.20 for carbohydrates, lipids, and proteins, respectively. CONCLUSIONS: The means and standard deviations of the corrected values indicate that the measurement error was corrected. Nevertheless, the performance of these methods is debated, as they are notoriously imperfect when their theoretical assumptions are not met, which is common in dietary studies using self-report-based measurement instruments.
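A sketch of the linear-regression calibration step on simulated intakes: the mean of the three 24-hour recalls serves as the reference, and the slope of the reference regressed on the FFQ gives the correction factor lambda (the residual-method energy adjustment is omitted here):

```python
# Regression calibration: FFQ values are shrunk toward the mean by the
# estimated factor lambda. All intake data below are simulated.
import numpy as np

rng = np.random.default_rng(2)
true_intake = rng.normal(2200, 300, 79)            # kcal, 79 adolescents
ffq = true_intake + rng.normal(0, 400, 79)         # error-prone FFQ report
recalls = true_intake[:, None] + rng.normal(0, 250, (79, 3))
reference = recalls.mean(axis=1)                   # three 24h recalls

lam, intercept = np.polyfit(ffq, reference, 1)     # slope = lambda
corrected = intercept + lam * ffq
print(f"lambda = {lam:.2f}")
print("sd FFQ vs corrected:", ffq.std(), corrected.std())
```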
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error-correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to change the compression ratio of the source (DCT coefficient bitplane) gracefully, according to the correlation between the original and the side information. The proposed LDPC codes achieve good performance over a wide range of source correlations and a better RD performance when compared to the popular turbo codes.
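A toy illustration of check-node merging on a small parity-check matrix; real DVC codecs apply this to much larger LDPC matrices and choose the rows to merge carefully:

```python
# Merging two rows of H over GF(2) (XOR) removes one parity constraint:
# fewer rows means fewer syndrome bits, hence a higher compression ratio
# for the Wyner-Ziv source. H below is a toy matrix.
import numpy as np

def merge_check_nodes(H, i, j):
    """Replace rows i and j of parity-check matrix H by their GF(2) sum."""
    merged = (H[i] + H[j]) % 2
    keep = [r for r in range(H.shape[0]) if r not in (i, j)]
    return np.vstack([H[keep], merged])

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])
H2 = merge_check_nodes(H, 0, 1)      # 3 checks -> 2: fewer syndrome bits
print(H2)
```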
Abstract:
Processes are a central entity in enterprise collaboration. Collaborative processes need to be executed and coordinated on a distributed computational platform where computers are connected through heterogeneous networks and systems. Life cycle management of such collaborative processes requires a framework able to handle their diversity, based on different computational and communication requirements. This paper proposes a rationale for such a framework, points out key requirements, and proposes a strategy for a supporting technological infrastructure. Beyond the portability of collaborative process definitions among different technological bindings, a framework to handle the different life cycle phases of those definitions is presented and discussed.
Abstract:
The automotive components and accessories industry is a key link in the production chain of the automotive industry. Within this sector is Preh Portugal, Lda, a supplier of electronic components, more specifically climate control panels. The panels Preh supplies to its customers are subject to rigorous quality and functionality tests. One of these is the functional key test, which relates key travel to the applied actuation force; this relationship must follow a standard characteristic curve for the key type. In addition, the key must also close and open its electrical contact. This thesis focuses on the development of the key test, presenting a modification of the current system through the introduction of an embedded system, with the aim of making the test setup more flexible and reducing costs. The embedded system provides processing capability for the test and thus replaces the current computer as the processing element. The implemented solution consisted of a structural change: the embedded system was inserted between the computer and the displacement system. With the core of the test process now residing in the embedded system, it must communicate with the other elements involved in the test: RS-232 serial with the displacement system (reading key travel and force), Ethernet with the computer (commands, parameters, and results), and CAN with the climate control panel (closing/opening of the electrical contact). The project resulted in a new structure and application that is easily integrated into the production line, with the advantages of being less costly and more flexible, as intended.
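A rough sketch of the three communication links using common Python bindings (pyserial, python-can); port names, IDs, and message framing are hypothetical, not those of the actual Preh test system, and the real firmware runs on the embedded target:

```python
# One measurement cycle: read travel/force over RS-232, contact state over
# CAN, and report to the PC over Ethernet. Framing is invented.
import socket
import serial           # pyserial
import can              # python-can

displacement = serial.Serial("/dev/ttyS0", 9600, timeout=1)   # RS-232
bus = can.interface.Bus(channel="can0", bustype="socketcan")  # CAN

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)    # Ethernet
server.bind(("0.0.0.0", 5000))
server.listen(1)
conn, _ = server.accept()            # PC sends commands/parameters here

line = displacement.readline()       # e.g. b"travel_mm;force_N\n"
travel, force = (float(v) for v in line.decode().strip().split(";"))

status = bus.recv(timeout=1.0)       # contact open/closed from the panel
contact_closed = bool(status.data[0]) if status else False

conn.sendall(f"{travel},{force},{contact_closed}\n".encode())  # results
```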
Abstract:
In basaltic dykes the magnetic lineation K1 (maximum magnetic susceptibility axis) is generally taken to indicate the flow direction during solidification of the magma. This assumption was tested in Tertiary basaltic dykes from Greenland displaying independent evidence of subhorizontal flow. The digital processing of microphotographs from thin sections cut in (K1, K2) planes yields the preferred linear orientation of plagioclase, which apparently marks the magma flow lineation. In up to 60% of cases, the angular separation between K1 and the assumed flow direction is greater than 45°. This suggests that the uncorroborated use of magnetic lineations in dykes is risky. A simple geometrical method is proposed to infer the flow vector from AMS in dykes based solely on magnetic foliations.
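The angular separation test reduces to the angle between two axes (sign is irrelevant for both K1 and the flow axis); a minimal sketch with illustrative vectors:

```python
# Angle between two axes: the absolute dot product folds antiparallel
# directions together, so the result is always in [0, 90] degrees.
import numpy as np

def axis_angle_deg(u, v):
    u, v = (np.asarray(a, float) for a in (u, v))
    c = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

k1 = [0.2, 0.9, 0.4]             # maximum-susceptibility axis (example)
flow = [1.0, 0.1, 0.0]           # assumed subhorizontal flow direction
print(axis_angle_deg(k1, flow))  # > 45 deg would flag a misleading K1
```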
Abstract:
Chapter in Book Proceedings with Peer Review: Second Iberian Conference, IbPRIA 2005, Estoril, Portugal, June 7-9, 2005, Proceedings, Part II
Abstract:
Chapter in Book Proceedings with Peer Review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003, Proceedings
Abstract:
We use a simple model of associating fluids which consists of spherical particles having a hard-core repulsion, complemented by three short-ranged attractive sites on the surface (sticky spots). Two of the spots are of type A and one is of type B; the bonding interactions between each pair of spots have strengths ε_AA, ε_BB, and ε_AB. The theory is applied over the whole range of bonding strengths and the results are interpreted in terms of the equilibrium cluster structures of the phases. In addition to our numerical results, we derive asymptotic expansions for the free energy in the limits for which there is no liquid-vapor critical point: linear chains (ε_AA ≠ 0, ε_AB = ε_BB = 0), hyperbranched polymers (ε_AB ≠ 0, ε_AA = ε_BB = 0), and dimers (ε_BB ≠ 0, ε_AA = ε_AB = 0). These expansions also allow us to calculate the structure of the critical fluid by perturbing around the above limits, yielding three different types of condensation: of linear chains (AA clusters connected by a few AB or BB bonds); of hyperbranched polymers (AB clusters connected by AA bonds); or of dimers (BB clusters connected by AA bonds). Interestingly, there is no critical point when ε_AA vanishes, despite the fact that AA bonds alone cannot drive condensation.
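The abstract does not reproduce the formalism, but models of this type are commonly treated with Wertheim's first-order thermodynamic perturbation theory; assuming that framework, the bonding free energy per particle takes the form:

```latex
% Wertheim TPT1 (assumed framework, not quoted from the paper): X_alpha is
% the fraction of sites of type alpha NOT bonded; M = 3 sites per particle.
\frac{f_{\mathrm{bond}}}{k_B T} = \sum_{\alpha \in \{A,A,B\}}
    \left( \ln X_\alpha - \frac{X_\alpha}{2} \right) + \frac{M}{2},
\qquad
X_\alpha = \frac{1}{1 + \rho \sum_\beta \Delta_{\alpha\beta} X_\beta}
```

Here Δ_αβ is the bonding-volume integral, which grows with the corresponding strength ε_αβ, so the three limits in the abstract correspond to switching off all but one Δ.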
Abstract:
Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore mono-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type (one of them including environmental costs) and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as a case study, considering five topologies of different complexity, obtained mainly by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analyzed, and discussed from various perspectives: average deviation of the results for each algorithm, capacity to produce a high-purity product, screening of topologies, robustness of the objective functions in the screening of topologies, trade-offs between economic and environmental objective functions, and variability of the optimal solutions.
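A generic simulated-annealing skeleton of the kind used in such screening studies; the objective and neighbour move below are toy placeholders, not the actual HDA process model:

```python
# Standard simulated annealing: accept improvements always and uphill moves
# with Boltzmann probability, cooling geometrically.
import math
import random

def simulated_annealing(x0, objective, neighbour, t0=100.0, cooling=0.95,
                        iters_per_t=50, t_min=1e-3):
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(iters_per_t):
            y = neighbour(x)
            fy = objective(y)
            if fy < fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= cooling
    return best, fbest

# toy usage: minimise a 1-D surrogate "cost" over one operating variable
best, cost = simulated_annealing(
    x0=5.0,
    objective=lambda x: (x - 2.0) ** 2 + math.sin(5 * x),
    neighbour=lambda x: x + random.uniform(-0.5, 0.5),
)
print(best, cost)
```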