39 results for "source parameters"
Abstract:
Coastal areas are highly exposed to natural hazards associated with the sea. Wherever there is historical evidence of devastating tsunamis, as is the case for the southern coasts of the Iberian Peninsula, quantitative tsunami hazard assessment is needed to support spatial planning. Local authorities must also be able to act pre-emptively to protect the population, informing people 'what to do' and 'where to go', and, when an alarm is issued, to make them aware of the incoming danger. With this in mind, we investigated the inundation extent, run-up and water depths of a 1755-like event in the region of Huelva, on the Spanish southwestern coast, one of the regions affected in the past by several high-energy events, as shown by historical documents and sedimentological data. Modelling was performed with a slightly modified version of the COMCOT (Cornell Multi-grid Coupled Tsunami Model) code. Sensitivity tests were performed for a single source in order to understand the relevance and influence of the source parameters on the inundation extent and the fundamental impact parameters. We show that a 1755-like event would have a dramatic impact on a large area close to Huelva, inundating between 82 and 92 km² and reaching a maximum run-up of around 5 m. In this sense, our results show that small variations in the characteristics of the tsunami source are not very significant for the impact assessment. We show that the maximum flow depth and the maximum run-up increase with the average slip on the source, while the strike of the fault is not a critical factor because Huelva is far from the potential sources identified so far. We also show that the maximum flow depth within the inundated area is very dependent on the tidal level, while the maximum run-up is less affected, as a consequence of the complex morphology of the area.
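As a minimal sketch of the kind of sensitivity sweep over source parameters described above (illustrative only, not COMCOT: the run-up estimator below is a hypothetical placeholder standing in for a full inundation simulation, and the slip and strike values are assumed):

    # Illustrative sensitivity sweep over tsunami source parameters.
    # estimate_runup() is a toy placeholder; a real study would run a
    # shallow-water solver such as COMCOT for each parameter combination.
    import itertools

    def estimate_runup(slip_m, strike_deg):
        """Hypothetical placeholder; strike is accepted but ignored here,
        consistent with the abstract's finding that strike is not critical
        for a site far from the source."""
        return 0.6 * slip_m  # assumes run-up grows roughly with average slip

    slips = [8.0, 10.0, 12.0]      # average slip on the source (m), assumed values
    strikes = [20.0, 40.0, 60.0]   # fault strike (degrees), assumed values

    for slip, strike in itertools.product(slips, strikes):
        runup = estimate_runup(slip, strike)
        print(f"slip={slip:5.1f} m  strike={strike:5.1f} deg  run-up~{runup:4.1f} m")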
Abstract:
On 26 January 1531, a strong earthquake heavily damaged downtown Lisbon. Immediately after the earthquake, eyewitnesses reported large waves in the Tagus estuary, mainly north of the city and along the northern bank of the river. Descriptions include severe impacts on ships anchored in the estuary and even morphological changes in the riverbed. We present a synthesis of the available information concerning both the earthquake and the water disturbance as a basis for discussing the probable tectonic source and the magnitude of the associated river oscillations. We hypothesize that the initial disturbance of the water can be attributed to the coseismic deformation of the estuary riverbed, and we use a nonlinear shallow water model to simulate the tsunami propagation and inundation. We show that the Vila Franca de Xira fault is the most probable source of the 1531 event. The largest inundation effects in the model correlate well with the historical descriptions: the impact is significant in the inner Tagus estuary, but the inundation in downtown Lisbon is small.
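For reference, the nonlinear shallow water model invoked above is commonly written, in one horizontal dimension and without bottom friction (a standard textbook form, not necessarily the exact equations solved in the paper), as

\[
\frac{\partial \eta}{\partial t} + \frac{\partial}{\partial x}\big[(h+\eta)\,u\big] = 0,
\qquad
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + g\,\frac{\partial \eta}{\partial x} = 0,
\]

where \(\eta\) is the free-surface elevation, \(h\) the still-water depth, \(u\) the depth-averaged velocity and \(g\) the gravitational acceleration.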
Abstract:
This review aims to identify strategies to optimise radiography practice using digital technologies for full-spine studies in paediatrics, focusing particularly on the methods used to diagnose and measure the severity of spinal curvatures. The literature search was performed on different databases (PubMed, Google Scholar and ScienceDirect) and relevant websites (e.g., American College of Radiology and International Commission on Radiological Protection) to identify guidelines and recent studies focused on dose optimisation in paediatrics using digital technologies. Plain radiography was identified as the most accurate method. The American College of Radiology (ACR) and the European Commission (EC) provided the two guidelines identified as most relevant to the subject. The ACR guidelines were updated in 2014; however, they do not provide detailed guidance on technical exposure parameters. The EC guidelines are more complete but are dedicated to screen-film systems. Other studies reviewed the exposure parameters that should be included in optimisation, such as tube current, tube voltage and source-to-image distance; however, they explored only a few of these parameters rather than all of them together. One publication explored all parameters together, but only for adults. Given the lack of literature on exposure parameters for paediatric patients, more research is required to guide and harmonise practice.
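As a rough illustration of how the exposure parameters listed above interact, entrance dose scales approximately linearly with the tube current-time product (mAs), roughly with the square of tube voltage (kVp), and with the inverse square of the source-to-image distance. The sketch below encodes this widely used rule of thumb; it is an assumption for illustration, not a prescription from the reviewed guidelines.

    # Rule-of-thumb sketch (approximation, not from the cited guidelines):
    # relative entrance dose ~ mAs * kVp^2 / SID^2, relative to a reference technique.
    def relative_dose(mas, kvp, sid_cm, ref_mas=10.0, ref_kvp=70.0, ref_sid_cm=100.0):
        """Return entrance dose relative to a reference technique."""
        return (mas / ref_mas) * (kvp / ref_kvp) ** 2 * (ref_sid_cm / sid_cm) ** 2

    # Example: halving mAs while raising kVp from 70 to 81 reduces the relative
    # entrance dose to about 0.67 under this approximation.
    print(relative_dose(mas=5.0, kvp=81.0, sid_cm=100.0))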
Abstract:
In the early 1990s, companies began to feel the need to improve access to information about their activities in order to support decision-making. As a result, the Business Intelligence (BI) sector emerged in the IT world, initially composed of data warehousing and reporting tools. Over the years, the concept of BI has evolved in line with business needs, making the analysis of organisations' activities and performance a critical aspect of their management. BI covers several areas, with reporting and data analysis being those that best meet the requirements for controlling access to business information and its processes. Nowadays, time and information are competitive advantages, and for that reason companies are increasingly concerned that the growing volume of information is becoming unsustainable, since the time needed to process it keeps increasing. For this reason, many software companies, such as Microsoft, IBM and Oracle, are competing for a place in this expanding BI market. For companies to be competitive, the ability to forecast and respond to market needs in real time is a key requirement, rather than merely reacting to a need once it is already too late. BI products have a reputation for working only with stored historical data, which means companies cannot rely on those solutions when some businesses require near-real-time information. The latency introduced by a data warehouse is too high for acceptable performance. This gave rise to Business Activity Monitoring (BAM) technology, which provides near-real-time data analysis and alerts about business processes, using data sources such as Web Services, message queues, etc. The BAM concept was introduced in July 2001 by Gartner as an event-driven extension of BI. BAM is defined as real-time access to business performance indicators with the aim of increasing the speed and effectiveness of business processes. BAM solutions are becoming increasingly common and sophisticated.
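A minimal sketch of the event-driven monitoring idea described above (illustrative only: the event source, KPI name and threshold are hypothetical, and Python's standard queue stands in for a real message broker or Web Service feed):

    # BAM-style sketch: consume business events and raise an alert when a
    # key performance indicator crosses a threshold.
    import queue

    events = queue.Queue()
    for latency in (120, 180, 260, 90, 310):   # order-processing times (s), sample data
        events.put({"kpi": "order_latency_s", "value": latency})

    THRESHOLD_S = 240  # assumed service-level target

    while not events.empty():
        event = events.get()
        if event["value"] > THRESHOLD_S:
            print(f"ALERT: {event['kpi']} = {event['value']} exceeds {THRESHOLD_S}")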
Abstract:
This thesis aims to contribute to the study and analysis of the factors related to digital radiographic image acquisition techniques, diagnostic quality and radiation dose management in digital radiology systems. The methodology is organised in two components. The observational component is based on a retrospective, cross-sectional study design. Data collected from CR and DR systems allowed the evaluation of the technical exposure parameters used in digital radiology, the absorbed dose and the detector exposure index. Within this methodological framework (retrospective and cross-sectional), it was also possible to carry out diagnostic quality studies in digital systems: observer studies based on images archived in the PACS. The experimental component of the thesis was based on phantom experiments to assess the relationship between dose and image quality. These experiments made it possible to characterise the physical properties of digital radiology systems by manipulating the variables related to the exposure parameters and assessing their influence on dose and image quality. Using a contrast-detail phantom, anthropomorphic phantoms and an animal-bone phantom, it was possible to obtain objective measures of diagnostic quality and of object detectability. Several conclusions can be drawn from this investigation. Quantitative measures of detector performance are the basis of the optimisation process, allowing the measurement and determination of the physical parameters of digital radiology systems. The exposure parameters used in clinical practice show that practice does not comply with the European reference guidelines. There is a need to evaluate, improve and implement a reference standard for the optimisation process, through new good-practice references adjusted to digital systems. Exposure parameters influence patient dose, but the perception of digital image quality does not appear to be affected by variations in exposure. The studies carried out involving both phantom and patient images show that overexposure is a potential risk in digital radiology. The evaluation of the diagnostic quality of the images showed no substantial degradation of image quality when dose reduction was applied. The study and implementation of new diagnostic reference levels adjusted to digital radiology systems is proposed. As a contribution of the thesis, a model (STDI) is proposed for the optimisation of digital radiology systems.
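One common objective image-quality metric in phantom studies of the kind described above is the contrast-to-noise ratio computed from regions of interest. The sketch below is a generic illustration with synthetic data; it is not the STDI model proposed in the thesis.

    # Generic contrast-to-noise ratio (CNR) from two regions of interest.
    import numpy as np

    def cnr(roi_signal: np.ndarray, roi_background: np.ndarray) -> float:
        return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std()

    rng = np.random.default_rng(0)
    signal = rng.normal(120.0, 5.0, size=(32, 32))      # simulated detail insert
    background = rng.normal(100.0, 5.0, size=(32, 32))  # simulated uniform region
    print(f"CNR = {cnr(signal, background):.1f}")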
Evaluation of exposure parameters in plain radiography: a comparative study with European guidelines
Abstract:
The typical distribution of exposure parameters in plain radiography is unknown in Portugal. This study aims to identify the exposure parameters being used in plain radiography in the Lisbon area and to compare the collected data with European references [Commission of the European Communities (CEC) guidelines]. The results show that in four examinations (skull, chest, lumbar spine and pelvis) there is a strong tendency to use exposure times above the European recommendation. The X-ray tube potential values (in kV) are below the values recommended in the CEC guidelines. This study shows that, at a local level (Lisbon region), radiographic practice does not comply with the CEC guidelines concerning exposure techniques. Further national and local studies are recommended with the objective of improving exposure optimisation and technical procedures in plain radiography. This study also suggests the need to establish national/local diagnostic reference levels and to carry out effective measurements for exposure optimisation.
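A minimal sketch of the kind of guideline comparison described above (the reference values below are placeholders chosen for illustration, not the actual CEC figures; the example deviations mirror the pattern reported in the study, with exposure time above and tube potential below the reference):

    # Illustrative compliance check against guideline reference values.
    REFERENCE = {"chest PA": {"max_time_ms": 20, "min_kvp": 125}}  # placeholder values

    def check(exam, time_ms, kvp):
        ref = REFERENCE[exam]
        issues = []
        if time_ms > ref["max_time_ms"]:
            issues.append("exposure time above reference")
        if kvp < ref["min_kvp"]:
            issues.append("tube potential below reference")
        return issues or ["compliant"]

    print(check("chest PA", time_ms=45, kvp=102))  # both deviations flagged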
Abstract:
The goal of this work is the development of a dynamic radio-resource simulation tool for LTE in the downlink, based on the OMNeT++ framework. The developed tool supports base-station planning, simulation and analysis of results. The main aspects of the radio access technology are described, namely the network architecture, coding, the definition of the radio resources, the transmission rates supported at channel level and the admission-control mechanism. A radio-resource usage scenario was defined, including packet- and circuit-oriented traffic and service models. A reference scenario was also considered for the verification and validation of the simulation model. The simulation is performed at system level, supported by a dynamic, stochastic, discrete-event model, in order to capture the different mechanisms characteristic of OFDMA technology. The results obtained allow the performance analysis of services, base stations and the overall system in terms of average network throughput, average throughput per eNodeB and average throughput per mobile, and also make it possible to analyse the contribution of other parameters, namely bandwidth, coverage radius, service profile and modulation scheme, among others. From the results it was possible to verify that, for a scenario with base stations with a coverage radius of 100 m, an end-user throughput of 4.69494 Mbps was obtained, i.e. 7 times higher than with base stations with a coverage radius of 200 m.
Abstract:
The 27 December 1722 Algarve earthquake destroyed a large area in southern Portugal, generating a local tsunami that inundated the shallow areas of Tavira. It is unclear whether its source was located onshore or offshore and, in any case, what tectonic source was responsible for the event. We analyze the available historical information concerning macroseismicity and the tsunami to discuss the most probable location of the source. We also review the available seismotectonic knowledge of the offshore region close to the probable epicenter, selecting a set of four candidate sources. We simulate the tsunamis produced by these candidate sources assuming that the sea-bottom displacement is caused by a compressive dislocation over a rectangular fault, as given by the homogeneous elastic half-space approach, and we use numerical modeling to study wave propagation and run-up. We conclude that the 27 December 1722 Tavira earthquake and tsunami were probably generated offshore, close to 37°01'N, 7°49'W.
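The size of a rectangular-fault dislocation source of the kind used above is conventionally summarised by the seismic moment and moment magnitude, through the standard relations

\[
M_0 = \mu\, L\, W\, \bar{u},
\qquad
M_w = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right),
\]

where \(\mu\) is the shear modulus, \(L\) and \(W\) are the fault length and width, \(\bar{u}\) is the average slip, and \(M_0\) is expressed in N·m.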
Abstract:
An optimized ZnO:Al/a-SixC1-x:H p-i-n/Al configuration for the laser scanned photodiode (LSP) imaging detector is proposed and the read-out parameters are improved. The effects of the sensing element structure, cell configuration and light source flux are investigated and correlated with the sensor output characteristics. The data reveal that, for sensors with wide-band-gap doped layers, an increase in the image signal optimized for the blue is achieved, with a dynamic range of two orders of magnitude, a responsivity of 6 mA/W and a sensitivity of 17 µW/cm² at 530 nm. The main output characteristics, such as image responsivity, resolution, linearity and dynamic range, were analyzed under reverse, forward and short-circuit modes. The results show that the sensor performance can be optimized in short-circuit mode. A trade-off between the scan time and the required resolution is needed, since the spot size limits the resolution due to cross-talk between dark and illuminated regions, leading to blurring effects.
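The responsivity quoted above relates photocurrent to incident optical power in the usual photodetector sense,

\[
R = \frac{I_{\mathrm{ph}}}{P_{\mathrm{opt}}},
\]

so, assuming for illustration that the 530 nm flux of 17 µW/cm² is collected over a 1 cm² sensing area (an assumed area, not stated in the abstract), the corresponding photocurrent would be roughly 6 mA/W × 17 µW ≈ 0.1 µA.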
Abstract:
This work reports on the synthesis of CrO2 thin films by atmospheric-pressure CVD using chromium trioxide (CrO3) and oxygen. Highly oriented (100) CrO2 films containing highly oriented (0001) Cr2O3 were grown onto Al2O3(0001) substrates. The films display a sharp magnetic transition at 375 K and a saturation magnetization of 1.92 µB/f.u., close to the bulk value of 2 µB/f.u. for CrO2.
Abstract:
In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of the profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are duly drawn.
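A generic form of this kind of scenario-based, risk-constrained offering model (written here as a common textbook formulation, not necessarily the exact model in the paper) is

\[
\max_{x}\;\; \sum_{\omega \in \Omega} \pi_{\omega}\,\mathrm{profit}_{\omega}(x) \;+\; \beta\,\mathrm{CVaR}_{\alpha}(x),
\qquad \sum_{\omega \in \Omega} \pi_{\omega} = 1,
\]

where \(x\) denotes the offering decisions, \(\pi_{\omega}\) the probability of scenario \(\omega\), \(\mathrm{CVaR}_{\alpha}\) the conditional value-at-risk of the profit distribution, and \(\beta \ge 0\) weights risk aversion.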
Abstract:
A biosensor for urea has been developed based on the observation that urea is a powerful active-site inhibitor of amidase, which catalyzes the hydrolysis of amides such as acetamide to produce ammonia and the corresponding organic acid. Cell-free extract from Pseudomonas aeruginosa was the source of amidase (acylamide hydrolase, EC 3.5.1.4), which was immobilized on a polyethersulfone membrane in the presence of glutaraldehyde; an ion-selective electrode for ammonium ions was used for biosensor development. Analysis of variance was used to optimize the biosensor response and showed that 30 µL of cell-free extract containing 7.47 mg of protein per mL, 2 µL of glutaraldehyde (5%, v/v) and 10 µL of gelatin (15%, w/v) gave the highest response. Optimization of the other parameters showed that pH 7.2 and a 30 min incubation time were optimal for incubation of the membranes in urea. The biosensor exhibited a linear response in the range of 4.0-10.0 µM urea, a detection limit of 2.0 µM urea, a response time of 20 s, a sensitivity of 58.245% per µM urea and a storage stability of over 4 months. It was successfully used for the quantification of urea in samples such as wine and milk; recovery experiments revealed an average substrate recovery of 94.9%. The urea analogs hydroxyurea, methylurea and thiourea inhibited amidase activity by about 90%, 10% and 0%, respectively, compared with urea inhibition.
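For a sensor with a linear response like the one above, the calibration slope and detection limit are commonly obtained from a straight-line fit, with the detection limit estimated by the usual 3·sigma(blank)/slope criterion. The sketch below uses synthetic numbers for illustration, not the measurements reported in the paper.

    # Generic calibration sketch for a linear-response biosensor (synthetic data).
    import numpy as np

    conc_um = np.array([4.0, 6.0, 8.0, 10.0])       # urea concentration (uM)
    response = np.array([23.0, 35.0, 46.0, 58.0])    # sensor response (arbitrary units)
    sigma_blank = 3.5                                 # std. dev. of blank signal (assumed)

    slope, intercept = np.polyfit(conc_um, response, 1)
    lod_um = 3.0 * sigma_blank / slope
    print(f"slope = {slope:.2f} units/uM, detection limit ~ {lod_um:.1f} uM")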
Abstract:
This work addresses the present-day (<100 ka) mantle heterogeneity in the Azores region through the study of two active volcanic systems on Terceira Island. Our study shows that mantle heterogeneities are detectable even when "coeval" volcanic systems (Santa Barbara and Fissural) erupted less than 10 km apart. These volcanic systems reflect, respectively, the influence of the Terceira and D. Joao de Castro Bank end-members defined by Beier et al. (2008) for the Terceira Rift. Santa Barbara magmas are interpreted as the result of mixing between a HIMU-type component, carried to the upper mantle by the Azores plume, and the regional depleted MORB magmas/source. Fissural lavas are characterized by higher Ba/Nb and Nb/U ratios and less radiogenic 206Pb/204Pb, 143Nd/144Nd and 176Hf/177Hf, requiring a small contribution of delaminated sub-continental lithospheric mantle residing in the upper mantle. Published noble gas data on lavas from both volcanic systems also indicate the presence of a relatively undegassed component, interpreted as inherited from a lower-mantle reservoir sampled by the ascending Azores plume. As inferred from trace and major elements, melting began in the garnet stability field, while magma extraction occurred within the spinel zone. The chemical heterogeneity within each volcanic system is mainly explained by variable proportions of the above-mentioned local end-members and by crystal fractionation processes.
Abstract:
Information and Communication Technologies (ICT) have unequivocally influenced the development of libraries on a global scale. In recent decades, ICT have changed the dynamics of libraries, enabling their modernisation (by making existing tasks more efficient), fostering innovation (by using technologies as a basis for developing new services and techniques) and promoting their transformation (in terms of the functional paradigm, the provision of content, etc.), creating, in short, a new relationship with their publics.
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is therefore imperative to estimate the value of LVEF precisely, a process that can be done with myocardial perfusion scintigraphy. The present study aimed to establish and compare the estimation performance of the quantitative parameters of two reconstruction methods: filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several numbers of iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is seen with the OSEM reconstruction. However, OSEM gives a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: The OSEM reconstruction provides better estimates of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP, as the best settings for quantifying left ventricular volumes and ejection fraction in myocardial perfusion scintigraphy.
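For reference, the quantities estimated above are linked by the standard ejection-fraction definition, and the Butterworth window used in the FBP reconstructions has the usual low-pass form controlled by the cutoff frequency \(f_c\) and order \(n\):

\[
\mathrm{LVEF} = \frac{\mathrm{EDV}-\mathrm{ESV}}{\mathrm{EDV}} \times 100\%,
\qquad
H(f) = \frac{1}{\sqrt{1+\left(f/f_c\right)^{2n}}}.
\]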