814 results for Observational Methodology


Relevance:

20.00%

Publisher:

Abstract:

This thesis aims to contribute to the study and analysis of the factors related to digital radiographic image acquisition techniques, diagnostic quality, and radiation dose management in digital radiology systems. The methodology is organised into two components. The observational component is based on a retrospective, cross-sectional study design. Data collected from CR and DR systems allowed the evaluation of the technical exposure parameters used in digital radiology, the absorbed dose, and the detector exposure index. Within this methodological classification (retrospective and cross-sectional), it was also possible to carry out diagnostic quality studies in digital systems: observer studies based on images archived in the PACS. The experimental component of the thesis was based on phantom experiments to assess the relationship between dose and image quality. These experiments allowed the physical properties of digital radiology systems to be characterised, by manipulating the variables related to the exposure parameters and assessing their influence on dose and image quality. Using a contrast-detail phantom, anthropomorphic phantoms, and an animal bone phantom, it was possible to obtain objective measures of diagnostic quality and of object detectability. From this investigation, several conclusions can be highlighted. Quantitative measurements of detector performance are the basis of the optimisation process, allowing the physical parameters of digital radiology systems to be measured and determined. The exposure parameters used in clinical practice show that current practice does not comply with the European reference framework. There is a need to evaluate, improve, and implement a reference standard for the optimisation process, through new good-practice guidelines adjusted to digital systems. Exposure parameters influence patient dose, but the perceived quality of the digital image does not appear to be affected by variations in exposure. The studies carried out, involving both phantom and patient images, show that overexposure is a potential risk in digital radiology. The assessment of diagnostic image quality showed no substantial degradation of image quality when dose reduction was applied. The study and implementation of new diagnostic reference levels adjusted to digital radiology systems is proposed. As a contribution of the thesis, a model (STDI) is proposed for the optimisation of digital radiology systems.
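The detectability and image-quality measures mentioned above are typically derived from simple signal and noise statistics in phantom images. As a rough illustration only (not the thesis's STDI model), the sketch below computes a contrast-to-noise ratio from an object region and a background region; the image, ROI coordinates, and noise levels are hypothetical.

```python
import numpy as np

def contrast_to_noise_ratio(image, roi_object, roi_background):
    """Simple CNR between an object ROI and a background ROI.

    image          : 2-D array of pixel values from a phantom exposure
    roi_object     : (row_slice, col_slice) covering the test object
    roi_background : (row_slice, col_slice) covering nearby background
    """
    obj = image[roi_object]
    bkg = image[roi_background]
    # CNR = |mean_object - mean_background| / std_background
    return abs(obj.mean() - bkg.mean()) / bkg.std()

# Hypothetical example: a synthetic "phantom" with a low-contrast square
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, size=(256, 256))   # noisy background
img[100:140, 100:140] += 8.0                    # low-contrast test object
cnr = contrast_to_noise_ratio(img,
                              (slice(100, 140), slice(100, 140)),
                              (slice(10, 50), slice(10, 50)))
print(f"CNR = {cnr:.2f}")
```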

Relevance:

20.00%

Publisher:

Abstract:

Although security plays an important role in the development of multiagent systems, a careful analysis of software development processes shows that the definition of security requirements is usually considered only after the design of the system. One reason is that agent-oriented software engineering methodologies have not integrated security concerns throughout their development stages. Integrating security concerns across the whole range of development stages can help towards the development of more secure multiagent systems. In this paper we introduce extensions to the Tropos methodology to enable it to model security concerns throughout the whole development process. A description of the new concepts and modelling activities is given, along with a discussion of how these concepts and modelling activities are integrated into the current stages of Tropos. A real-life case study from the health and social care sector is used to illustrate the approach.

Relevance:

20.00%

Publisher:

Abstract:

The residence time has long been used as a classification parameter for estuaries and other semi-enclosed water bodies. It aims to quantify the time water remains inside the estuary and is used as an indicator both for pollution assessment and for ecological processes. Estuaries with a short residence time will export nutrients from upstream sources more rapidly than estuaries with a longer residence time. On the other hand, the residence time determines whether micro-algae can stay long enough to generate a bloom. As a consequence, estuaries with a very short residence time are expected to have much smaller algal blooms than estuaries with a longer residence time. In addition, estuaries with residence times shorter than the doubling time of algal cells will inhibit the formation of algal blooms (EPA, 2001). The residence time is also an important issue for processes taking place in the sediment. The fluxes of particulate matter and associated adsorbed species from the water column to the sediment depend on the particles' vertical velocity, the water depth, and the residence time. This is particularly important for the fine fractions, which have lower sinking velocities. The question is how to compute the residence time and how it depends on the computation method adopted.
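One common first-order estimate, among the several computation methods the abstract alludes to, is the tidal-prism flushing time. The sketch below implements that textbook formula purely to illustrate how strongly the result depends on the chosen method and on the assumed return-flow factor; all numbers are hypothetical.

```python
def tidal_prism_flushing_time(volume_m3, tidal_prism_m3,
                              tidal_period_h=12.42, return_flow=0.0):
    """First-order flushing (residence) time of a well-mixed estuary.

    T_f = V * T_tide / ((1 - b) * P)

    volume_m3      : mean estuary volume V
    tidal_prism_m3 : intertidal volume P exchanged each tidal cycle
    tidal_period_h : tidal period (M2 tide by default)
    return_flow    : fraction b of ebbed water that re-enters on the flood
    """
    return volume_m3 * tidal_period_h / ((1.0 - return_flow) * tidal_prism_m3)

# Hypothetical estuary: 5e8 m3 mean volume, 1e8 m3 tidal prism
print(tidal_prism_flushing_time(5e8, 1e8), "hours (no return flow)")
print(tidal_prism_flushing_time(5e8, 1e8, return_flow=0.5), "hours (50% return flow)")
```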

Relevance:

20.00%

Publisher:

Abstract:

A new electrochemical methodology to study labile trace metal/natural organic matter complexation at low concentration levels in natural waters is presented. This methodology consists of three steps: (i) estimation of the complex diffusion coefficient (DML), (ii) determination at low pH of the total metal concentration initially present in the sample, and (iii) a metal titration at the desired pH. The free and bound metal concentrations are determined for each point of the titration and modeled with the non-ideal competitive adsorption (NICA-Donnan) model in order to obtain the binding parameters. In this methodology, it is recommended to determine the hydrodynamic transport parameter, α, for each set of hydrodynamic conditions used in the voltammetric measurements. The methodology was tested using two fractions of natural organic matter (NOM) isolated from the Loire River, namely the hydrophobic organic matter (HPO) and the transphilic organic matter (TPI), and a well-characterized fulvic acid (Laurentian fulvic acid, LFA). The complex diffusion coefficients obtained at pH 5 were (0.4 ± 0.2) × 10⁻¹⁰ m² s⁻¹ for Pb- and Cu/HPO, (1.8 ± 0.2) × 10⁻¹⁰ m² s⁻¹ for Pb/TPI, and (0.612 ± 0.009) × 10⁻¹⁰ m² s⁻¹ for Pb/LFA. NICA-Donnan parameters for lead binding were obtained for the HPO and TPI fractions. The new lead/LFA results were successfully predicted using parameters derived in our previous work.
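As an illustration of the kind of binding model used in the final step, the sketch below evaluates a mono-modal form of the NICA isotherm for a metal competing with protons. It omits the Donnan electrostatic correction and uses hypothetical parameter values, so it is a sketch of the functional form only, not the fitted model from the paper.

```python
def nica_bound(c_metal, c_proton, q_max_h,
               log_k_metal, n_metal, log_k_proton, n_proton, p):
    """Mono-modal NICA isotherm (Donnan term omitted): amount of metal bound.

    Q_M = (n_M/n_H) * Qmax_H * ((K_M c_M)^n_M / S) * (S^p / (1 + S^p)),
    with S = (K_H c_H)^n_H + (K_M c_M)^n_M.
    Inputs are free-ion concentrations; all parameter values are hypothetical.
    """
    k_m, k_h = 10.0 ** log_k_metal, 10.0 ** log_k_proton
    term_m = (k_m * c_metal) ** n_metal
    term_h = (k_h * c_proton) ** n_proton
    s = term_m + term_h
    return (n_metal / n_proton) * q_max_h * (term_m / s) * (s ** p / (1.0 + s ** p))

# Hypothetical Pb/fulvic-like parameters at pH 5 (c_H = 1e-5 M)
for c_pb in (1e-9, 1e-8, 1e-7, 1e-6):
    q = nica_bound(c_pb, 1e-5, q_max_h=5.0,
                   log_k_metal=4.0, n_metal=0.7,
                   log_k_proton=3.0, n_proton=0.8, p=0.6)
    print(f"free Pb = {c_pb:.0e} M -> bound = {q:.3f} mol/kg")
```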

Relevance:

20.00%

Publisher:

Abstract:

The computer games industry is big business, and the demand for graduates is high; indeed, there is a continuing shortage of skilled employees. As with most professions, the skill set required is both specific and diverse. There are currently over 30 Higher Education Institutions (HEIs) in the UK offering computer games related courses. We expect that, as the demand from the industry is sustained, more HEIs will respond with the introduction of games-related degrees. This is a considerable undertaking, involving many issues, from the integration of new modules or complete courses into the existing curriculum to staff development. In this paper we share our experiences of introducing elements of game development into our curriculum. This has occurred over the past two years, starting with the inclusion of elements of game development in existing programming modules, followed by the validation of complete modules, and culminating in a complete degree course. Our experience is that adopting a progressive approach to development, spread over a number of years, was crucial in achieving a successful outcome.

Relevance:

20.00%

Publisher:

Abstract:

This paper links research and teaching through an applied Soft Systems Methodology case study. The case study focuses on the redevelopment of a Research and Professional Skills module to provide support for international postgraduate students through the use of formative feedback, with the aim of increasing academic research skills and confidence. The stages of the Soft Systems Methodology were used as a structure for the redevelopment of the module content and assessment. It proved to be a valuable tool for identifying complex issues and provided a basis for discussion and debate, from which an enhanced understanding was gained, a successful solution was implemented, and a case study was produced that could be utilised for teaching Soft Systems Methodology concepts. The changes to the module were very successful and resulted in significantly higher grades and a higher pass rate.

Relevance:

20.00%

Publisher:

Abstract:

We introduce a quality-controlled observational atmospheric, snow, and soil data set from Snoqualmie Pass, Washington, U.S.A., to enable testing of hydrometeorological and snow process representations within a rain-snow transitional climate where existing observations are sparse and limited. Continuous meteorological forcing (including air temperature, total precipitation, wind speed, specific humidity, air pressure, and short- and longwave irradiance) is provided at hourly intervals for a 24-year historical period (water years 1989-2012) and at half-hourly intervals for a more recent period (water years 2013-2015), separated based on the availability of observations. Additional observations include 40 years of snow-board new snow accumulation, multiple measurements of total snow depth, and manual snow pits, while more recent years include sub-daily surface temperature, snowpack drainage, soil moisture and temperature profiles, and eddy-covariance-derived turbulent heat flux. This data set is ideal for testing hypotheses about energy balance and soil and snow processes in the rain-snow transition zone. Plots of live data can be found here: http://depts.washington.edu/mtnhydr/cgi/plot.cgi
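Because the forcing is distributed as regular hourly or half-hourly time series, it lends itself to a quick pandas workflow. The sketch below is only an example of that kind of use; the file name and column names are hypothetical placeholders, not the data set's actual layout.

```python
import pandas as pd

# Hypothetical file layout: a timestamp column plus the forcing variables
# listed in the abstract (air temperature, precipitation, wind speed, ...).
forcing = pd.read_csv("snoqualmie_forcing_wy1989_2012.csv",
                      parse_dates=["datetime"], index_col="datetime")

# Basic sanity checks before using the data to drive a snow model
print(forcing.index.freq or pd.infer_freq(forcing.index))   # expect hourly
print(forcing[["air_temp_C", "precip_mm"]].describe())

# Example: daily aggregates for a quick look at the rain-snow transition
daily = forcing.resample("D").agg({"precip_mm": "sum", "air_temp_C": "mean"})
mixed_days = daily[daily["air_temp_C"].between(-1, 2) & (daily["precip_mm"] > 0)]
print(f"{len(mixed_days)} days with precipitation near the rain-snow threshold")
```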

Relevance:

20.00%

Publisher:

Abstract:

Objective: To describe (1) the relationship between nutrition and the preterm-at-term infant phenotype, (2) phenotypic differences between preterm-at-term infants and healthy term-born infants and (3) relationships between somatic and brain MRI outcomes. Design: Prospective observational study. Setting: UK tertiary neonatal unit. Participants: Preterm infants (<32 weeks gestation) (n=22) and healthy term infants (n=39). Main outcome measures: Preterm nutrient intake; total and regional adipose tissue (AT) depot volumes; brain volume and proximal cerebral arterial vessel tortuosity (CAVT) in preterm infants and in term infants. Results: Preterm nutrition was deficient in protein and high in carbohydrate and fat. Preterm nutrition was not related to AT volumes, brain volume or proximal CAVT score; a positive association was noted between human milk intake and proximal CAVT score (r=0.44, p=0.05). In comparison to term infants, preterm infants had increased total adiposity, comparable brain volumes and reduced proximal CAVT scores. There was a significant negative correlation between deep subcutaneous abdominal AT volume and brain volume in preterm infants (r=−0.58, p=0.01). Conclusions: Although there are significant phenotypic differences between preterm infants at term and term infants, preterm macronutrient intake does not appear to be a determinant. Our preliminary data suggest that (1) human milk may exert a beneficial effect on cerebral arterial vessel tortuosity and (2) there is a negative correlation between adiposity and brain volume in preterm infants at term. Further work is warranted to see whether our findings can be replicated and to understand the causal mechanisms.

Relevance:

20.00%

Publisher:

Abstract:

Collaborative networks are typically formed by heterogeneous and autonomous entities, so it is natural that each member has its own set of core values. Since these values drive, to some extent, the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core values is an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core values are lacking. Since the concept of 'alignment' in this context is still ill-defined and multifaceted, three perspectives are discussed. The first uses a causal maps approach to capture, structure, and represent the influence relationships among core values. This representation provides the basis to measure alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach allows dealing with variables that are vaguely defined and whose inter-relationships are difficult to establish. Another advantage of this method is the possibility of incorporating expert human judgement in the definition of the alignment level. The last perspective uses a Bayesian belief network method and was selected in order to assess the alignment level based on members' past behaviour. An example of application is presented in which the details of each method are discussed.
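For the second perspective, a minimal Mamdani-style fuzzy inference sketch is shown below, using triangular membership functions and centroid defuzzification over a coarse grid in pure Python. The linguistic variables, rules, and membership parameters are hypothetical stand-ins for whatever the paper's fuzzy system would define.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alignment_level(compatibility, incompatibility):
    """Estimate a core-value alignment level in [0, 1] from two fuzzy inputs.

    Hypothetical rules:
      R1: IF compatibility is high AND incompatibility is low THEN alignment is high
      R2: IF compatibility is medium                          THEN alignment is medium
      R3: IF incompatibility is high                          THEN alignment is low
    """
    comp_med  = tri(compatibility, 0.2, 0.5, 0.8)
    comp_high = tri(compatibility, 0.5, 1.0, 1.5)
    inc_low   = tri(incompatibility, -0.5, 0.0, 0.5)
    inc_high  = tri(incompatibility, 0.5, 1.0, 1.5)

    r1 = min(comp_high, inc_low)   # -> alignment high
    r2 = comp_med                  # -> alignment medium
    r3 = inc_high                  # -> alignment low

    # Mamdani aggregation + centroid defuzzification on a coarse grid
    num = den = 0.0
    for i in range(101):
        y = i / 100
        mu = max(min(r1, tri(y, 0.5, 1.0, 1.5)),
                 min(r2, tri(y, 0.2, 0.5, 0.8)),
                 min(r3, tri(y, -0.5, 0.0, 0.5)))
        num += mu * y
        den += mu
    return num / den if den else 0.5

print(alignment_level(compatibility=0.9, incompatibility=0.1))  # high alignment
print(alignment_level(compatibility=0.3, incompatibility=0.8))  # low alignment
```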

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To assess the neuromotor changes, after an intervention based on the Bobath concept, in the postural adjustments during functional reaching with the upper limbs in three children with cerebral palsy. A further aim was to assess the effect of this approach on activities and participation, and to highlight each child's individual capacity for change after the intervention. Methods: The assessment was carried out before and three months after physiotherapy intervention according to the Bobath concept. Observational recording was performed with a digital camera, a video camera system, and a force platform, together with instruments such as the Gross Motor Function Measure (88-item version), the Gross Motor Function Classification System, the Modified Functional Reach Test, and the International Classification of Functioning, Disability and Health for Children and Youth. Results: Improvements were observed in postural adjustments and in overall functionality, which were reflected in participation restriction and activity limitation. Sitting posture, centre-of-pressure displacement, anterior reaching capacity, and gross motor abilities changed in all children, with child B showing the greatest and child A the smallest capacity for change after the intervention. Conclusion: The intervention according to the Bobath concept promoted neuromotor changes, which led to improvements in the children's overall functionality, mobility, and postural control, reflected in the postural adjustments during functional reaching with the upper limbs in the sitting position. Improvements were also observed in participation restriction and limitation of daily activity.

Relevance:

20.00%

Publisher:

Abstract:

In recent years the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has significantly increased. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, and optimization strategies are needed to adjust consumption to the supply profile. These optimization strategies can be integrated into demand response programs. Controlling the energy consumption of an intelligent house aims at optimizing the load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer strategy, taking into account the renewable based micro-generation, the energy price, supplier solicitations, and the consumers' preferences. The proposed approach is compared with a mixed-integer non-linear approach.
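A minimal sketch of the kind of genetic algorithm the abstract describes is shown below: binary chromosomes decide which controllable loads stay on, and the fitness function rewards keeping preferred loads running while penalising consumption above the limit. The load powers, preference weights, and GA settings are hypothetical, not the authors' actual configuration.

```python
import random

# Hypothetical controllable loads: (power_kW, consumer preference weight)
LOADS = [(2.0, 0.9), (1.5, 0.4), (1.0, 0.7), (0.5, 0.2), (3.0, 0.6)]
LIMIT_KW = 4.0            # consumption limit set by the consumer strategy

def fitness(chromosome):
    """Reward preferred loads kept on; penalise exceeding the power limit."""
    power = sum(p for (p, _), on in zip(LOADS, chromosome) if on)
    comfort = sum(w for (_, w), on in zip(LOADS, chromosome) if on)
    penalty = 10.0 * max(0.0, power - LIMIT_KW)    # soft limit violation
    return comfort - penalty

def genetic_algorithm(pop_size=30, generations=50, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in LOADS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(LOADS))     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [not g if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print("loads kept on:", best, "fitness:", round(fitness(best), 2))
```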

Relevance:

20.00%

Publisher:

Abstract:

In many countries the use of renewable energy is increasing due to the introduction of new energy and environmental policies. Thus, the focus on the efficient integration of renewable energy into electric power systems is becoming extremely important. Several European countries have already achieved high penetration of wind-based electricity generation and are gradually evolving towards intensive use of this generation technology. The introduction of wind-based generation in power systems poses new challenges for power system operators, mainly due to the variability and uncertainty in weather conditions and, consequently, in the wind-based generation. In order to deal with this uncertainty and to improve power system efficiency, adequate wind forecasting tools must be used. This paper proposes a data-mining-based methodology for very short-term wind forecasting, which is suitable for dealing with large real databases. The paper includes a case study based on a real database covering the last three years of wind speed data, and presents results for wind speed forecasting at 5-minute intervals.
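As a rough illustration of data-mining-style very short-term forecasting (not the paper's actual model), the sketch below builds a k-nearest-neighbour regressor that predicts the next 5-minute wind speed from the previous half hour of observations. The series is synthetic and all settings are hypothetical.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(42)
# Synthetic 5-minute wind speed series (m/s) standing in for the real database
t = np.arange(5000)
speed = 6 + 2 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 0.5, t.size)

LAGS = 6   # use the previous 30 minutes (6 x 5-minute values) as features
X = np.column_stack([speed[i:i - LAGS] for i in range(LAGS)])
y = speed[LAGS:]

split = int(0.8 * len(y))
model = KNeighborsRegressor(n_neighbors=10).fit(X[:split], y[:split])

pred = model.predict(X[split:])
mae = np.mean(np.abs(pred - y[split:]))
print(f"5-minute-ahead MAE on held-out data: {mae:.2f} m/s")
```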

Relevance:

20.00%

Publisher:

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, deregulation of the electricity sector was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The sector's liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, the electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive closer generation to reduce the exceeded power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power will be lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to an electric power generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas that may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes that represent a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
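A minimal sketch of the clustering step is given below, using scikit-learn's k-means on a small synthetic table of bus-level prices; the CAISO data themselves are not reproduced, and the number of buses, price levels, and cluster count are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# Synthetic stand-in for hourly LMPs ($/MWh) at 60 buses in three price zones
zone_means = np.repeat([35.0, 48.0, 62.0], 20)
lmp = zone_means[:, None] + rng.normal(0, 3, size=(60, 24))   # 60 buses x 24 hours

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(lmp)
for k in range(3):
    members = np.where(kmeans.labels_ == k)[0]
    print(f"zone {k}: {len(members)} buses, "
          f"mean LMP {lmp[members].mean():.1f} $/MWh")
```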

Relevance:

20.00%

Publisher:

Abstract:

A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. The methodology uses clustering algorithms to group the buses into typical classes that include a set of buses with similar LMP values. Two different clustering algorithms have been used to determine the LMP clusters: the two-step and k-means algorithms. In order to evaluate the quality of the partition, as well as the best-performing algorithm, adequacy measurement indices are used. The paper includes a case study using a Locational Marginal Prices (LMP) database from the California ISO (CAISO) in order to identify zonal prices.
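One commonly used adequacy index for comparing partitions is the silhouette coefficient; the sketch below applies it to synthetic LMP-like data purely to illustrate how such indices guide the choice of cluster count. It is not the paper's actual index, algorithm settings, or data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(3)
# Synthetic bus-level LMP features standing in for the CAISO database
lmp = np.vstack([rng.normal(mu, 2.5, size=(25, 24)) for mu in (30, 45, 60, 75)])

# Compare candidate partitions with the silhouette adequacy index
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(lmp)
    print(f"k = {k}: silhouette = {silhouette_score(lmp, labels):.3f}")
```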