33 results for multiple measurements


Relevance:

20.00%

Publisher:

Abstract:

Background: Multiple sclerosis (MS) is a chronic disease of the central nervous system, with no cure and unknown causes, that most often affects young adults in the prime of their careers and personal development. The most common signs and symptoms are fatigue, muscle weakness, changes in sensation, ataxia, impaired balance, gait difficulties, memory difficulties, cognitive impairment and difficulties in problem solving. MS is a relatively common neurological disorder in which various impairments and disabilities have a strong impact on function and daily life activities. Purpose: The aim of this study is to examine the impact of an Intervention Program of Physical Activity (IPPA) on quality of life in MS patients, six months after the intervention.

Relevance:

20.00%

Publisher:

Abstract:

Background: Multiple sclerosis is a disease of the central nervous system that most frequently affects young women. It is a progressive and unpredictable disease, resulting in some cases in disabilities and limitations at the physical, psychological and social levels. Purpose: To review the literature for evidence of the effectiveness of physiotherapy intervention in multiple sclerosis.

Relevance:

20.00%

Publisher:

Abstract:

In global scientific experiments with collaborative scenarios involving multinational teams, there are major challenges related to data access: data movements to other regions or Clouds are precluded by latency costs and by constraints on data privacy and data ownership. Furthermore, each site processes local data sets using specialized algorithms, producing intermediate results that serve as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution for running the workflow activities on distributed data centers in different regions without the need for large data movements. The AWARD workflow activities are independently monitored and can be dynamically reconfigured and steered by different users, namely by hot-swapping the algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
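
The hot-swapping behaviour described above can be pictured with a minimal sketch. The class and function names below are hypothetical, not the AWARD API; the sketch only illustrates the general pattern of replacing an activity's algorithm between iterations while the workflow keeps running.

```python
# Minimal sketch (hypothetical names, not the AWARD API): a workflow
# activity whose processing algorithm can be hot-swapped at runtime
# by a steering user.
from typing import Callable

class Activity:
    def __init__(self, name: str, algorithm: Callable[[list], list]):
        self.name = name
        self.algorithm = algorithm  # current processing algorithm

    def hot_swap(self, new_algorithm: Callable[[list], list]) -> None:
        # Steering operation: replace the algorithm between iterations
        # without stopping the workflow.
        self.algorithm = new_algorithm

    def run_iteration(self, local_data: list) -> list:
        # Each site processes its local data set and emits an
        # intermediate result consumed by activities on remote sites.
        return self.algorithm(local_data)

# Usage: swap a coarse algorithm for a refined one mid-run.
activity = Activity("site-A", lambda xs: [x * 2 for x in xs])
print(activity.run_iteration([1, 2, 3]))   # [2, 4, 6]
activity.hot_swap(lambda xs: [x ** 2 for x in xs])
print(activity.run_iteration([1, 2, 3]))   # [1, 4, 9]
```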

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this paper is to discuss the linear solution of equality-constrained problems using the Frontal solution method without explicit assembling. Design/methodology/approach - A re-written frontal solution method with a priori pivot and front sequences. OpenMP parallelization, nearly linear (in elimination and substitution) up to 40 threads. Constraints are enforced at the local assembling stage. Findings - When compared with both standard sparse solvers and classical frontal implementations, memory requirements and code size are significantly reduced. Research limitations/implications - Large, non-linear problems with constraints typically make use of the Newton method with Lagrange multipliers. In the context of the solution of problems with a large number of constraints, matrix transformation methods (MTM) are often more cost-effective. The paper presents a complete solution, with topological ordering, for this problem. Practical implications - A complete software package in Fortran 2003 is described. Examples of clique-based problems are shown, with large systems solved in core. Social implications - More realistic non-linear problems can be solved with this Frontal code at the core of the Newton method. Originality/value - Use of topological ordering of constraints. A priori pivot and front sequences. No need for symbolic assembling. Constraints treated at the core of the Frontal solver. Use of OpenMP in the main Frontal loop, now quantified. Availability of software.
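
To make the frontal idea concrete, here is a minimal sketch, in Python/NumPy rather than the paper's Fortran 2003 package, of elimination without a fully assembled global matrix: each variable is eliminated as soon as it is fully summed. Pivoting, constraint handling and the compact front storage of a real frontal solver are deliberately omitted, and the example data at the bottom are hypothetical.

```python
# Toy frontal elimination: no global symbolic assembly; each dof is
# eliminated as soon as the last element touching it has been scattered.
import numpy as np

def frontal_solve(elements, n):
    """elements: list of (dofs, Ke, fe) contributions; n: number of dofs."""
    K = np.zeros((n, n))           # toy workspace; a real frontal solver
    f = np.zeros(n)                # stores only the active front
    last = {}                      # dof -> last element that touches it
    for e, (dofs, _, _) in enumerate(elements):
        for d in dofs:
            last[d] = e
    order, rows, rhss, active = [], {}, {}, set()
    for e, (dofs, Ke, fe) in enumerate(elements):
        idx = np.asarray(dofs)
        K[np.ix_(idx, idx)] += Ke  # scatter the element matrix
        f[idx] += fe
        active.update(dofs)
        for p in (d for d in dofs if last[d] == e):
            # p is fully summed: eliminate it immediately (no pivoting;
            # acceptable for SPD stiffness matrices, used here for clarity).
            rows[p], rhss[p] = K[p].copy(), f[p]
            for r in active - {p}:
                if K[r, p] != 0.0:
                    c = K[r, p] / K[p, p]
                    K[r] -= c * K[p]
                    f[r] -= c * f[p]
            active.discard(p)
            order.append(p)
    x = np.zeros(n)
    for p in reversed(order):      # back substitution over the saved rows
        x[p] = (rhss[p] - rows[p] @ x) / rows[p][p]
    return x

# Two 1-D spring elements plus a grounding spring on dof 1 (hypothetical).
elements = [
    ([0, 1], np.array([[2.0, -1.0], [-1.0, 2.0]]), np.array([1.0, 0.0])),
    ([1],    np.array([[1.0]]),                    np.array([1.0])),
]
print(frontal_solve(elements, 2))  # -> [0.8 0.6]
```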

Relevance:

20.00%

Publisher:

Abstract:

The article reports density measurements of dipropyl (DPA), dibutyl (DBA) and bis(2-ethylhexyl) (DEHA) adipates, using a vibrating U-tube densimeter, model DMA HP, from Anton Paar GmbH. The measurements were performed in the temperature range (293 to 373) K and at pressures up to about 68 MPa, except for DPA, for which the upper limits were 363 K and 65 MPa, respectively. The density data for each liquid were correlated with temperature and pressure using a modified Tait equation. The expanded uncertainty of the present density results is estimated as 0.2% at a 95% confidence level. No literature density data at pressures higher than 0.1 MPa could be found. DEHA literature data at atmospheric pressure agree with the correlation of the present measurements, in the corresponding temperature range, within +/- 0.11%. The isothermal compressibility and the isobaric thermal expansivity were calculated by differentiation of the modified Tait correlation equation. These two parameters were also calculated for dimethyl adipate (DMA), from density data reported in a previous work. The uncertainties of the isothermal compressibility and the isobaric thermal expansivity are estimated to be less than +/- 1.7% and +/- 1.1%, respectively, at a 95% confidence level. Literature data for the isothermal compressibility and isobaric thermal expansivity of DMA agree within +/- 1% and +/- 2.4%, respectively, with the results calculated in this work.
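
For reference, a commonly used form of the modified Tait correlation and the two properties derived from it are sketched below. The abstract does not give the authors' exact parameterisation of ρ0(T) and B(T), so this form is an assumption.

```latex
% A common form of the modified Tait correlation (assumed; the exact
% parameterisation of \rho_0(T) and B(T) is not given in the abstract):
\[
  \rho(T,p) = \frac{\rho_0(T)}{1 - C \ln\frac{B(T)+p}{B(T)+p_0}},
  \qquad p_0 = 0.1~\mathrm{MPa}.
\]
% Differentiation then gives the two derived properties:
\[
  \kappa_T = \frac{1}{\rho}\left(\frac{\partial\rho}{\partial p}\right)_T
           = \frac{C}{\bigl(B(T)+p\bigr)\left(1 - C \ln\frac{B(T)+p}{B(T)+p_0}\right)},
  \qquad
  \alpha_p = -\frac{1}{\rho}\left(\frac{\partial\rho}{\partial T}\right)_p .
\]
```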

Relevance:

20.00%

Publisher:

Abstract:

No literature data above atmospheric pressure could be found for the viscosity of TOTM. As a consequence, the present viscosity results could only be compared upon extrapolation of the vibrating-wire data to 0.1 MPa. Independent viscosity measurements were performed at atmospheric pressure, using an Ubbelohde capillary, in order to compare with the vibrating-wire results, extrapolated by means of the above-mentioned correlation. The two data sets agree within +/- 1%, which is commensurate with the mutual uncertainty of the experimental methods. Comparisons of the literature data obtained at atmospheric pressure with the present extrapolated vibrating-wire viscosity measurements show agreement within +/- 2% for temperatures up to 339 K and within +/- 3.3% for temperatures up to 368 K.

Relevance:

20.00%

Publisher:

Abstract:

In Part I of the present work we describe the viscosity measurements performed on tris(2-ethylhexyl) trimellitate, or 1,2,4-benzenetricarboxylic acid, tris(2-ethylhexyl) ester (TOTM), up to 65 MPa and at six temperatures from (303 to 373) K, using a new vibrating-wire instrument. The main aim is to contribute to the proposal of that liquid as a potential reference fluid for high viscosity, high pressure and high temperature. The present Part II reports the density measurements of TOTM necessary not only to compute the viscosity data presented in Part I, but also as complementary data for the mentioned proposal. The density measurements were obtained using a vibrating U-tube densimeter, model DMA HP, with a model DMA 5000 as reading unit, both instruments from Anton Paar GmbH. The measurements were performed along five isotherms from (293 to 373) K and at eleven different pressures up to 68 MPa. As far as the authors are aware, the viscosity and density results are the first above atmospheric pressure to be published for TOTM. Due to TOTM's high viscosity, its density data were corrected for the viscosity effect on the U-tube density measurements. This effect was estimated using two Newtonian viscosity standard liquids, 20 AW and 200 GW. The density data were correlated with temperature and pressure using a modified Tait equation, with deviations within +/- 0.25%. The expanded uncertainty of the present density results is estimated as +/- 0.2% at a 95% confidence level. Furthermore, the isothermal compressibility, κ_T, and the isobaric thermal expansivity, α_p, were obtained by differentiation of the modified Tait equation used for correlating the density data. The corresponding uncertainties, at a 95% confidence level, are estimated to be less than +/- 1.5% and +/- 1.2%, respectively. No isobaric thermal expansivity or isothermal compressibility data for TOTM were found in the literature.

Relevance:

20.00%

Publisher:

Abstract:

The erosion depth profile of planar targets in balanced and unbalanced magnetron cathodes with cylindrical symmetry is measured along the target radius. The magnetic fields have rotational symmetry. The horizontal and vertical components of the magnetic field B are measured at points located z = 2 × 10⁻³ m above the cathode target. The experimental data reveal that the target erosion depth profile is a function of the angle θ made by B with the horizontal line defined by z = 2 × 10⁻³ m. To explain this dependence, a simplified model of the discharge is developed. Within the scope of the model, the pathway lengths of the secondary electrons in the pre-sheath region are calculated by analytical integration of the Lorentz differential equations. Weighting these lengths with the distribution law of the mean free path of the secondary electrons, we estimate the densities of the ionizing events over the cathode and the relative flux of the sputtered atoms. The expression so deduced correlates, for the first time, the erosion depth profile of the target with the angle θ. The model shows reasonably good fits to the experimental target erosion depth profiles, confirming that ionization occurs mainly in the pre-sheath zone.
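
While the paper integrates the Lorentz equations analytically, a short numerical sketch conveys the quantity involved: the pathway length of a secondary electron in the pre-sheath slab. The field values, time step and initial velocity below are illustrative assumptions, not the paper's discharge parameters.

```python
# Minimal sketch (illustrative values, not the paper's model): integrate the
# Lorentz equation m dv/dt = -e (E + v x B) for a secondary electron leaving
# the cathode, accumulating its pathway length inside the pre-sheath slab.
import numpy as np

E_CHARGE, E_MASS = 1.602e-19, 9.109e-31  # C, kg

def path_length(E, B, v0, dt=1e-12, steps=20000, z_max=2e-3):
    r = np.zeros(3)                       # start at the target surface
    v = np.asarray(v0, dtype=float)
    length = 0.0
    for _ in range(steps):
        a = (-E_CHARGE / E_MASS) * (E + np.cross(v, B))
        v = v + a * dt                    # explicit Euler step
        step = v * dt
        r = r + step
        length += np.linalg.norm(step)
        if r[2] < 0.0 or r[2] > z_max:
            break                         # electron left the slab
    return length

# Hypothetical fields: E normal to the target, B tilted by an angle theta
# with respect to the horizontal, as in the model's geometry.
theta = np.deg2rad(30.0)
E = np.array([0.0, 0.0, -1.0e4])                          # V/m
B = 0.05 * np.array([np.cos(theta), 0.0, np.sin(theta)])  # T
print(path_length(E, B, v0=[0.0, 0.0, 1.0e5]))            # metres
```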

Relevance:

20.00%

Publisher:

Abstract:

This review aims to identify strategies to optimise radiography practice using digital technologies for full spine studies in paediatrics, focusing particularly on methods used to diagnose and measure the severity of spinal curvatures. The literature search was performed on different databases (PubMed, Google Scholar and ScienceDirect) and relevant websites (e.g., the American College of Radiology and the International Commission on Radiological Protection) to identify guidelines and recent studies focused on dose optimisation in paediatrics using digital technologies. Plain radiography was identified as the most accurate method. The American College of Radiology (ACR) and the European Commission (EC) provided the two guidelines identified as most relevant to the subject. The ACR guidelines were updated in 2014; however, they do not provide detailed guidance on technical exposure parameters. The EC guidelines are more complete but are dedicated to screen-film systems. Other studies reviewed the several exposure parameters that should be included in optimisation, such as tube current, tube voltage and source-to-image distance; however, each explored only a few of these parameters rather than all of them together. One publication explored all parameters together, but only for adults. Given the lack of literature on exposure parameters for paediatrics, more research is required to guide and harmonise practice.

Relevance:

20.00%

Publisher:

Abstract:

Aim: To optimise a set of exposure factors, at the lowest effective dose (E), for delineating spinal curvature with the modified Cobb method on a full spine, using computed radiography (CR) and a 5-year-old paediatric anthropomorphic phantom. Methods: Images were acquired by varying a set of parameters: position (antero-posterior (AP), postero-anterior (PA) and lateral), kilovoltage peak (kVp) (66-90), source-to-image distance (SID) (150 to 200 cm), broad focus and the use of a grid (grid in/out), to analyse the impact on E and image quality (IQ). IQ was analysed using two approaches: objective (contrast-to-noise ratio, CNR) and perceptual, using five observers. Monte Carlo modelling was used for dose estimation. Cohen's kappa coefficient was used to calculate inter-observer variability. The angle was measured using Cobb's method on lateral projections under different imaging conditions. Results: PA positioning promoted the lowest effective dose (0.013 mSv) compared to AP (0.048 mSv) and lateral (0.025 mSv). The exposure parameters that allowed the lowest dose were 200 cm SID, 90 kVp, broad focus and grid out, for paediatrics using an Agfa CR system. Thirty-seven images were assessed for IQ and thirty-two were classified as adequate. Cobb angle measurements varied between 16° ± 2.9° and 19.9° ± 0.9°. Conclusion: Cobb angle measurements can be performed using the lowest dose with a low contrast-to-noise ratio. The variation in measurements, ±2.9°, is within the range of acceptable clinical error and has no impact on clinical diagnosis. Further work is recommended on a larger sample size and a more robust perceptual IQ assessment protocol for observers.
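
The objective IQ measure can be illustrated with a minimal sketch. The region-of-interest (ROI) based CNR definition and the pixel statistics below are assumptions for illustration; the abstract does not specify the authors' exact ROI protocol.

```python
# Minimal sketch of a common ROI-based contrast-to-noise ratio (CNR);
# assumed definition, since the abstract does not detail the protocol.
import numpy as np

def cnr(signal_roi: np.ndarray, background_roi: np.ndarray) -> float:
    # CNR = |mean(signal) - mean(background)| / std(background)
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std()

rng = np.random.default_rng(0)
spine = rng.normal(120.0, 8.0, size=(64, 64))        # hypothetical pixel values
soft_tissue = rng.normal(100.0, 8.0, size=(64, 64))  # hypothetical background
print(f"CNR = {cnr(spine, soft_tissue):.1f}")
```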

Relevance:

20.00%

Publisher:

Abstract:

All children, regardless of their needs, should have access to a quality education and be included in their families and communities. This statement includes the most vulnerable children, in particular children with intellectual disabilities and multiple disabilities. Research on the education of children with intellectual disabilities and multiple disabilities has not yet produced enough information that can be used to develop quality indicators for the evaluation of practices and services. Research in this area is limited by ethical constraints, difficulties in defining samples and methodological challenges, and the number of studies capable of producing the necessary information is small. This article aims to discuss factors that contribute to the quality of the engagement of children with intellectual disabilities and multiple disabilities in educational activities, based on the authors' experience and on the available information published on this subject. Based on this discussion, a set of indicators is suggested that may help professionals direct their observations towards the quality of the educational provision and towards significant aspects of children's performance when engaged in curricular activities.

Relevance:

20.00%

Publisher:

Abstract:

Session 7: Playing with Roles, Images and Improvising New States of Awareness. 3rd Global Conference, 1st–3rd November 2014, Prague, Czech Republic.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, with oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructures face the ocean, southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults (HSMPF) as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario in about 60% of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km².
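
A plausible reading of the aggregate scenario is the per-cell worst case over the individual scenarios. The sketch below assumes that construction and uses synthetic data; the paper's exact aggregation procedure is not detailed in the abstract.

```python
# Minimal sketch of an aggregate scenario assumed to be the per-cell
# maximum of a hazard metric over the individual scenarios.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical grids: max wave height (m) for 6 scenarios on a 100 x 100 grid.
wave_height = rng.gamma(2.0, 1.5, size=(6, 100, 100))

aggregate = wave_height.max(axis=0)    # worst case per grid cell
dominant = wave_height.argmax(axis=0)  # which scenario dominates each cell

# Share of the area dominated by each scenario (cf. the ~60 % HSMPF figure).
share = np.bincount(dominant.ravel(), minlength=6) / dominant.size
print(share)
```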

Relevance:

20.00%

Publisher:

Abstract:

Materials selection is a matter of great importance to engineering design, and software tools are valuable for informing decisions in the early stages of product development. However, when a set of alternative materials is available for the different parts a product is made of, the question of which optimal material mix to choose for a group of parts is not trivial, and the engineer/designer typically proceeds part by part. Optimizing each part per se can lead to a globally sub-optimal solution from the product point of view. An optimization procedure is therefore needed that deals with products with multiple parts, each with discrete design variables, and that determines the optimal solution under different objectives. To solve this multiobjective optimization problem, a new routine based on the Direct MultiSearch (DMS) algorithm is created. Results from the Pareto front can help the designer align his/her materials selection for a complete set of materials with the product attribute objectives, depending on the relative importance of each objective.
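
The flavour of the problem can be shown with a minimal sketch that, instead of DMS, simply enumerates the discrete material mixes of a two-part product and filters the Pareto-optimal ones. The materials, part volumes and the two objectives (mass and cost, both minimised) are hypothetical.

```python
# Minimal sketch (brute-force enumeration, not the paper's DMS routine):
# find the Pareto-optimal material mixes for a two-part product.
from itertools import product

materials = {                      # name: (density kg/m3, cost per kg)
    "steel": (7850.0, 0.8),
    "aluminium": (2700.0, 2.5),
    "CFRP": (1600.0, 25.0),
}
part_volumes = [0.002, 0.001]      # m3, hypothetical part volumes

def objectives(mix):
    # Product-level objectives: total mass and total material cost.
    mass = sum(materials[m][0] * v for m, v in zip(mix, part_volumes))
    cost = sum(materials[m][0] * materials[m][1] * v
               for m, v in zip(mix, part_volumes))
    return mass, cost

def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in one.
    return all(x <= y for x, y in zip(a, b)) and a != b

mixes = list(product(materials, repeat=len(part_volumes)))
front = [m for m in mixes
         if not any(dominates(objectives(o), objectives(m)) for o in mixes)]
for mix in front:
    print(mix, objectives(mix))
```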

Relevance:

20.00%

Publisher:

Abstract:

In the last decade, local image features have been widely used in robot visual localization. To assess image similarity, a strategy exploiting these features compares raw descriptors extracted from the current image with those in the models of places. This paper addresses the ensuing step in this process, where a combining function must be used to aggregate results and assign each place a score. Casting the problem in the multiple classifier systems framework, we compare several candidate combiners with respect to their performance in the visual localization task. For this evaluation, we selected the most popular methods in the class of non-trained combiners, namely the sum rule and the product rule. A deeper insight into the potential of these combiners is provided through a discriminativity analysis involving the algebraic rules and two extensions of these methods: the threshold and the weighted modifications. In addition, a voting method previously used in robot visual localization is assessed. Furthermore, we address the process of constructing a model of the environment by describing how model granularity impacts performance. All combiners are tested on a visual localization task carried out on a public dataset. It is experimentally demonstrated that the sum rule extensions globally achieve the best performance, confirming the general agreement on the robustness of this rule in other classification problems. The voting method, whilst competitive with the product rule in its standard form, is shown to be outperformed by its modified versions.
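
The non-trained combiners named above are simple to state in code. This sketch assumes per-feature scores already normalised as posteriors; the score matrix and the weights are hypothetical.

```python
# Minimal sketch of non-trained combiners for place scoring: the sum and
# product rules, a weighted sum variant, and a per-feature voting scheme.
import numpy as np

# Hypothetical scores: rows = local features in the query image,
# columns = candidate places; entries = per-feature posteriors.
scores = np.array([[0.7, 0.2, 0.1],
                   [0.6, 0.3, 0.1],
                   [0.2, 0.5, 0.3]])

sum_rule = scores.sum(axis=0)                  # add evidence per place
product_rule = scores.prod(axis=0)             # multiply evidence per place
weights = np.array([0.5, 0.3, 0.2])            # hypothetical feature weights
weighted_sum = (weights[:, None] * scores).sum(axis=0)
votes = np.bincount(scores.argmax(axis=1),     # each feature votes for
                    minlength=scores.shape[1]) # its best-matching place

for name, s in [("sum", sum_rule), ("product", product_rule),
                ("weighted sum", weighted_sum), ("votes", votes)]:
    print(f"{name:>12}: place {int(np.asarray(s).argmax())}  {s}")
```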