865 results for Assessment methods


Relevance: 30.00%

Abstract:

BACKGROUND: Cytomegalovirus (CMV) disease remains an important problem in solid-organ transplant recipients, with the greatest risk among donor CMV-seropositive, recipient-seronegative (D(+)/R(-)) patients. CMV-specific cell-mediated immunity may be able to predict which patients will develop CMV disease. METHODS: We prospectively included D(+)/R(-) patients who received antiviral prophylaxis. We used the QuantiFERON-CMV assay to measure interferon-γ levels following in vitro stimulation with CMV antigens. The test was performed at the end of prophylaxis and 1 and 2 months later. The primary outcome was the incidence of CMV disease at 12 months after transplant. We calculated positive and negative predictive values of the assay for protection from CMV disease. RESULTS: Overall, 28 of 127 (22%) patients developed CMV disease. Of 124 evaluable patients, 31 (25%) had a positive result, 81 (65.3%) had a negative result, and 12 (9.7%) had an indeterminate result (negative mitogen and CMV antigen) with the QuantiFERON-CMV assay. At 12 months, patients with a positive result had a lower subsequent incidence of CMV disease than patients with a negative or an indeterminate result (6.4% vs 22.2% vs 58.3%, respectively; P < .001). Positive and negative predictive values of the assay for protection from CMV disease were 0.90 (95% confidence interval [CI], .74-.98) and 0.27 (95% CI, .18-.37), respectively. CONCLUSIONS: This assay may be useful to predict whether patients are at low, intermediate, or high risk for the development of subsequent CMV disease after prophylaxis. CLINICAL TRIALS REGISTRATION: NCT00817908.
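Predictive values like those quoted are simple functions of a 2×2 table of assay result against outcome. A minimal sketch with purely illustrative counts (not the trial's actual tabulation; the condition predicted by a positive assay is protection, i.e. no CMV disease):

```python
def diagnostic_summary(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy summary.

    Condition = protection from CMV disease, so:
    tp = assay positive & protected,  fp = assay positive & diseased,
    fn = assay negative & protected,  tn = assay negative & diseased.
    """
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # P(protected | positive assay)
        "npv": tn / (tn + fn),  # P(diseased  | negative assay)
    }

# Illustrative counts only, not taken from the study
s = diagnostic_summary(tp=29, fp=2, fn=63, tn=18)
print({k: round(v, 2) for k, v in s.items()})
```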

Relevance: 30.00%

Abstract:

Healthcare accreditation models generally include indicators related to healthcare employees' perceptions (e.g. satisfaction, career development, and health safety). During the accreditation process, organizations are asked to demonstrate the methods with which assessments are made. However, none of the models provide standardized systems for the assessment of employees. In this study, we analyzed the psychometric properties of an instrument for the assessment of nurses' perceptions as indicators of human capital quality in healthcare organizations. The Human Capital Questionnaire was applied to a sample of 902 nurses in four European countries (Spain, Portugal, Poland, and the UK). Exploratory factor analysis identified six factors: satisfaction with leadership, identification and commitment, satisfaction with participation, staff well-being, career development opportunities, and motivation. The results showed the validity and reliability of the questionnaire, which, when applied to healthcare organizations, provides a better understanding of nurses' perceptions and is a parsimonious instrument for assessment and organizational accreditation. From a practical point of view, improving the quality of human capital by analyzing the perceptions of nurses and other healthcare employees is related to workforce empowerment.

Relevance: 30.00%

Abstract:

Daily precipitation is recorded as the total amount of water collected by a rain gauge in 24 h. Events are modelled as a Poisson process and the 24 h precipitation by a Generalized Pareto Distribution (GPD) of excesses. Hazard assessment is complete when estimates of the Poisson rate and the distribution parameters, together with a measure of their uncertainty, are obtained. The shape parameter of the GPD determines the support of the variable: the Weibull domain of attraction (DA) corresponds to finite-support variables, as should be the case for natural phenomena. However, the Fréchet DA has been reported for daily precipitation, which implies an infinite support and a heavy-tailed distribution. We use the fact that a log scale is better suited to the type of variable analyzed to overcome this inconsistency, thus showing that using the appropriate natural scale can be extremely important for proper hazard assessment. The approach is illustrated with precipitation data from the Eastern coast of the Iberian Peninsula affected by severe convective precipitation. The estimation is carried out using Bayesian techniques.
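The peaks-over-threshold setup described here — a Poisson rate for exceedance events plus a GPD fit to the excesses, taken on the log scale the abstract advocates — can be sketched as follows. The data are synthetic, the threshold choice is arbitrary, and a maximum-likelihood fit stands in for the paper's Bayesian estimation; `scipy` is assumed available:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic daily precipitation (mm) for 30 years; a real analysis would
# use the Iberian rain-gauge series described in the abstract.
precip = rng.lognormal(mean=1.0, sigma=0.8, size=30 * 365)

logp = np.log(precip)            # work on the log scale
u = np.quantile(logp, 0.95)      # illustrative threshold choice
excesses = logp[logp > u] - u

rate = excesses.size / 30.0      # Poisson rate of exceedance events per year

# GPD fit to the excesses; a negative shape parameter corresponds to the
# Weibull domain of attraction (finite upper endpoint).
shape, loc, scale = genpareto.fit(excesses, floc=0.0)
print(f"rate = {rate:.1f}/yr, shape = {shape:.3f}, scale = {scale:.3f}")
```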

Relevance: 30.00%

Abstract:

BACKGROUND: The purpose of the present study was to challenge the hypothetical advantage of single port laparoscopy (SPL) over conventional laparoscopy by prospectively measuring the morbidity specifically related to conventional trocar sites (TS). METHODS: From November 2010 to December 2011, 300 patients undergoing various laparoscopic procedures were enrolled. Patient, surgery, and trocar characteristics were recorded. At three time points (in-hospital, and 1 and 6 months postoperatively) we evaluated, for each TS, pain (Visual Analog Scale), morbidity (infection, hematoma, hernia), and cosmesis (Patient Scar Assessment Score; PSAS). Patients designated their "worst TS," and a composite endpoint "bad TS" was defined to include any adverse outcome at a TS. RESULTS: We analyzed 1,074 TS. Follow-up was >90%. Pain scores of >3/10 at 1 and 6 months postoperatively were reported by 3% and 1% of patients at the 5 mm TS and by 9% and 1% at the larger TS, respectively (5 mm TS vs larger TS; p = 0.001). Pain was significantly lower for TS located in the lower abdomen than for the upper abdomen or the umbilicus (p = 0.001). The overall complication rate was <1% and significantly lower for the 5 mm TS (hematoma p = 0.046; infection p = 0.0001). No hernia was found. The overall PSAS score was low and significantly lower for the 5 mm TS (p = 0.0001). Significant predictors of "bad TS" were larger TS (p = 0.001), umbilical position (p = 0.0001), emergency surgery (p = 0.0001), accidental trocar exit (p = 0.022), fascia closure (p = 0.006), and specimen extraction site (p = 0.0001). CONCLUSIONS: Specific trocar morbidity is low and almost negligible for 5 mm trocars. The umbilicus appears to be an unfavorable TS.

Relevance: 30.00%

Abstract:

Nanotechnology encompasses the design, characterisation, production and application of materials and systems by controlling shape and size at the nanoscale (nanometres). Nanomaterials may differ from other materials because of their relatively large specific surface area, such that surface properties become particularly important. There has been rapid growth in investment in nanotechnology by both the public and private sectors worldwide. In the EU, nanotechnology is expected to become an important strategic contributor to achieving economic gain and societal and individual benefits. At the same time there is continuing scientific uncertainty and controversy about the safety of nanomaterials. It is important to ensure that timely policy development takes this into consideration. Uncertainty about safety may lead to polarised public debate and to business unwillingness to invest further. A clear regulatory framework to address potential health and environmental impacts, within the wider context of evaluating and communicating the benefit-risk balance, must be a core part of Europe's integrated efforts for nanotechnology innovation. While a number of studies have been carried out on the effect of environmental nanoparticles, e.g. from combustion processes, on human health, there is yet no generally acceptable paradigm for safety assessment of nanomaterials in consumer and other products. Therefore, a working group was established to consider issues for the possible impact of nanomaterials on human health focussing specifically on engineered nanomaterials. This represents the first joint initiative between EASAC and the Joint Research Centre of the European Commission. 
The working group was given the remit to describe the state of the art of benefits and potential risks and of current methods for safety assessment, to evaluate their relevance, to identify knowledge gaps in studying the safety of current nanomaterials, and to make recommendations on priorities for nanomaterial research and the regulatory framework. This report focuses on key principles and issues, cross-referencing other sources for detailed information, rather than attempting a comprehensive account of the science. The focus is on human health, although environmental effects are also discussed when directly relevant to health.

Relevance: 30.00%

Abstract:

Wiss, Janney, Elstner Associates, Inc. (WJE) evaluated potential nondestructive evaluation (NDE) methodologies that may be effective in 1) identifying internal defects within slip formed concrete barriers and 2) assessing the corrosion condition of barrier dowel bars. The evaluation was requested by the Bridge Maintenance and Inspection Unit of the Iowa Department of Transportation (IaDOT) and the Bureau of Bridges and Structures of the Illinois Department of Transportation (IDOT). The need arose from instances in each Department's existing inventory of bridge barriers in which internal voids and other defects associated with slip forming construction methods were linked to poor barrier performance after completion of construction, and from other barrier walls in which unintentional exposure of the dowel bars revealed extensive corrosion-related section loss at previously uninspectable locations, reducing the capacity of the barriers to resist traffic impact loads. WJE trial tested potential NDE techniques on laboratory mock-up samples built with known defects, trial sections of cast-in-place barriers at in-service bridges in Iowa, and slip formed and cast-in-place barrier walls at in-service bridges in Illinois. The work included review of available studies performed by others, field trial testing to assess candidate test methods, verification of the test methods in identifying internal anomalies and dowel bar corrosion, and preparation of this report and nondestructive evaluation guidelines.

Relevance: 30.00%

Abstract:

Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas.

Risk characterization at the global level and identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remained limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.

Relevance: 30.00%

Abstract:

BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on the entire relevant research evidence. This, however, can rarely be achieved because a considerable amount of research findings are not published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches to modify effect sizes to adjust a pooled estimate when the presence of publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows: • To systematically review methodological articles which focus on non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. • To appraise strengths and weaknesses of methods, the resources they require, and the conditions under which the method could be used, based on findings of included studies. We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form is developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013.
This systematic review together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings) will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
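Among the graphical and statistical approaches such a review would catalogue, Egger's regression test for funnel-plot asymmetry is a common example: the standardized effect is regressed on precision, and an intercept far from zero suggests small-study effects such as publication bias. A sketch with simulated, bias-free study data (so the intercept should be near zero):

```python
import numpy as np

def egger_intercept(effects, ses):
    """Egger's regression for funnel-plot asymmetry: regress the
    standardized effect (effect/SE) on precision (1/SE) and return
    the intercept and slope of the fitted line."""
    z = np.asarray(effects) / np.asarray(ses)   # standardized effects
    precision = 1.0 / np.asarray(ses)
    X = np.column_stack([np.ones_like(precision), precision])
    (intercept, slope), *_ = np.linalg.lstsq(X, z, rcond=None)
    return intercept, slope

# Simulated studies: true effect 0.3, no publication bias
rng = np.random.default_rng(1)
ses = rng.uniform(0.05, 0.5, size=40)           # study standard errors
effects = 0.3 + rng.normal(0, ses)              # observed effects
b0, b1 = egger_intercept(effects, ses)
print(f"intercept = {b0:.2f}, slope = {b1:.2f}")
```

With no bias in the simulation, the slope recovers the true effect and the intercept stays close to zero; selective non-publication of small negative studies would pull the intercept away from zero.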

Relevance: 30.00%

Abstract:

ABSTRACT: BACKGROUND: Perfusion-cardiovascular magnetic resonance (CMR) is generally accepted as an alternative to SPECT to assess myocardial ischemia non-invasively. However, its performance vs gated-SPECT and in sub-populations is not fully established. The goal was to compare, in a multicenter setting, the diagnostic performance of perfusion-CMR and gated-SPECT for the detection of CAD in various populations using conventional x-ray coronary angiography (CXA) as the standard of reference. METHODS: In 33 centers (in the US and Europe) 533 patients, eligible for CXA or SPECT, were enrolled in this multivendor trial. SPECT and CXA were performed within 4 weeks before or after CMR in all patients. Prevalence of CAD in the sample was 49% and 515 patients received MR contrast medium. Drop-out rates for CMR and SPECT were 5.6% and 3.7%, respectively (ns). The study was powered for the primary endpoint of non-inferiority of CMR vs SPECT for both sensitivity and specificity for the detection of CAD (using a single-threshold reading); the results for the primary endpoint were reported elsewhere. In this article secondary endpoints are presented, i.e. the diagnostic performance of CMR versus SPECT in subpopulations such as multi-vessel disease (MVD), in men, in women, and in patients without prior myocardial infarction (MI). For diagnostic performance assessment the area under the receiver-operator-characteristics curve (AUC) was calculated. Readers were blinded to clinical data, CXA, and imaging results. RESULTS: The diagnostic performance (area under the ROC curve, AUC) of CMR was superior to SPECT (p = 0.0004, n = 425) and to gated-SPECT (p = 0.018, n = 253). CMR performed better than SPECT in MVD (p = 0.003 vs all SPECT, p = 0.04 vs gated-SPECT), in men (p = 0.004, n = 313) and in women (p = 0.03, n = 112) as well as in the non-infarct patients (p = 0.005, n = 186 in 1-3 vessel disease and p = 0.015, n = 140 in MVD).
CONCLUSION: In this large multicenter, multivendor study the diagnostic performance of perfusion-CMR to detect CAD was superior to perfusion SPECT in the entire population and in sub-groups. Perfusion-CMR can be recommended as an alternative for SPECT imaging. TRIAL REGISTRATION: ClinicalTrials.gov, Identifier: NCT00977093.
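The AUC used for these comparisons has a simple rank-based interpretation, equivalent to the Mann-Whitney U statistic: the probability that a randomly chosen diseased patient receives a higher imaging score than a randomly chosen non-diseased one. A minimal sketch with hypothetical scores:

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the probability that a positive (diseased) case scores
    higher than a negative (non-diseased) one; ties count one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical reader scores for 3 diseased and 3 non-diseased patients
print(auc_mann_whitney([0.9, 0.8, 0.6], [0.7, 0.3, 0.2]))  # 8/9 ≈ 0.889
```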

Relevance: 30.00%

Abstract:

OBJECTIVE: To determine the incidence and risk factors of electrical seizures and other electrical epileptic activity using continuous EEG (cEEG) in patients with acute stroke. METHODS: One hundred consecutive patients with acute stroke admitted to our stroke unit underwent cEEG using 10 electrodes. In addition to electrical seizures, repetitive focal sharp waves (RSHWs), repetitive focal spikes (RSPs), and periodic lateralized epileptic discharges (PLEDs) were recorded. RESULTS: In the 100 patients, cEEG was recorded for a mean duration of 17 hours 34 minutes (range 1 hour 12 minutes to 37 hours 10 minutes). Epileptic activity occurred in 17 patients and consisted of RSHWs in seven, RSPs in seven, and PLEDs in three. Electrical seizures occurred in two patients. On univariate Cox regression analysis, predictors for electrical epileptic activity were stroke severity (high score on the National Institutes of Health Stroke Scale) (hazard ratio [HR] 1.12; p = 0.002), cortical involvement (HR 5.71; p = 0.021), and thrombolysis (HR 3.27; p = 0.040). Age, sex, stroke type, use of EEG-modifying medication, and cardiovascular risk factors were not predictors of electrical epileptic activity. On multivariate analysis, stroke severity was the only independent predictor (HR 1.09; p = 0.016). CONCLUSION: In patients with acute stroke, electrical epileptic activity occurs more frequently than previously suspected.

Relevance: 30.00%

Abstract:

Objectives: Quantitative ultrasound (QUS) is an attractive method for assessing fracture risk because it is portable, inexpensive, without ionizing radiation, and available in areas of the world where DXA is not readily accessible or affordable. However, the diversity of QUS scanners and the variability of fracture outcomes measured in different studies are an important obstacle to widespread utilisation of QUS for fracture risk assessment. We aimed in this review to assess the predictive power of heel QUS for fractures, considering different characteristics of the association (QUS parameters and fracture outcomes measured, QUS devices, study populations, and independence from DXA-measured bone density). Materials/Methods: We conducted an inverse-variance random-effects meta-analysis of prospective studies with heel QUS measures at baseline and fracture outcomes in their follow-up. Relative risks (RR) per standard deviation (SD) of different QUS parameters (broadband ultrasound attenuation [BUA], speed of sound [SOS], stiffness index [SI], and quantitative ultrasound index [QUI]) for various fracture outcomes (hip, vertebral, any clinical, any osteoporotic, and major osteoporotic fractures) were reported based on study questions. Results: 21 studies including 55,164 women and 13,742 men were included, with a total follow-up of 279,124 person-years. All four QUS parameters were associated with risk of different fractures. For instance, the RR of hip fracture per 1 SD decrease was 1.69 (95% CI 1.43-2.00) for BUA, 1.96 (95% CI 1.64-2.34) for SOS, 2.26 (95% CI 1.71-2.99) for SI, and 1.99 (95% CI 1.49-2.67) for QUI. Validated devices from different manufacturers predicted fracture risks with a similar performance (meta-regression p-values > 0.05 for difference of devices). There was no sign of publication bias among the studies. QUS measures predicted fracture with a similar performance in men and women. Meta-analysis of studies with QUS measures adjusted for hip DXA showed a significant and independent association with fracture risk (RR/SD for BUA = 1.34 [95% CI 1.22-1.49]). Conclusions: This study confirms that QUS of the heel using validated devices predicts risk of different fracture outcomes in elderly men and women. Further research and international collaborations are needed for standardisation of QUS parameters across various manufacturers and inclusion of QUS in fracture risk assessment tools. Disclosure of Interest: None declared.
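The pooling named above — an inverse-variance random-effects meta-analysis — is commonly done with the DerSimonian-Laird estimator on the log-RR scale. A sketch with hypothetical study data (only the method, not the numbers, comes from the abstract):

```python
import math

def dersimonian_laird(log_rrs, ses):
    """Inverse-variance random-effects pooling (DerSimonian-Laird)
    of log relative risks; returns pooled RR and its 95% CI."""
    w = [1.0 / se**2 for se in ses]                     # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_star = [1.0 / (se**2 + tau2) for se in ses]       # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical log-RRs per SD decrease of BUA from three studies
rr, lo, hi = dersimonian_laird(
    [math.log(1.5), math.log(1.8), math.log(1.6)], [0.10, 0.15, 0.12])
print(f"pooled RR/SD = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```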

Relevance: 30.00%

Abstract:

The aim of this work is to harmonize uniform structures for the significant environmental aspects of UPM's paper and pulp mills and for their environmental risk management systems, giving the company's units consistent objectives and means of analysis. The harmonization process is part of the development of the company-wide environmental management system, and the group's EMS process in turn converges with the development of the group's integrated management system. In addition, the case study of this work examined the integration potential of risk management systems; integration would better realize the synergy benefits of a large company and the interaction between actors, and would improve the adaptability and usability of the risk management system. The work discusses three examples, on the basis of which a proposal is made for harmonized significant environmental aspects and for the parameters of the risk management systems. The research problem is approached through interviews, the literature, a study the company commissioned from PWC, and the author's own conclusions. The work also presents how the effectiveness of the environmental management system can be verified in relation to environmental performance variables. The goal of continuous improvement rests on organizational learning, among individual employees, teams, and different units alike; this gives impetus to exploiting intangible assets, such as environmental know-how, in the best possible way. The main outcomes of the work are proposals for harmonized significant environmental aspects and for the defined components of the environmental risk management system: definitions and scales for risk probability, consequences, and risk classes. As the final part of the work, a case study lays the groundwork for integrating the two different risk management systems of the wastewater treatment plant at the Rauma mill.
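Scales for probability, consequence, and risk class of the kind the work proposes typically feed a conventional risk matrix. A hypothetical sketch — the 1-5 scales and class thresholds below are invented for illustration, not UPM's actual definitions:

```python
def risk_class(probability, consequence):
    """Map ordinal probability and consequence scores (1-5) to a risk
    class via a conventional risk matrix (thresholds are illustrative)."""
    score = probability * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_class(4, 5))  # likely event, severe consequence -> high
print(risk_class(2, 2))  # unlikely, minor -> low
```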

Relevance: 30.00%

Abstract:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and assess the resulting improvement in image quality in the case of linear reconstruction of one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject and anisotropy of the skull was approximated from the structural information. A method for incorporation of anisotropy in the forward model and its use in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data, and then linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality.
This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
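The linear, sensitivity-matrix reconstruction step amounts to regularized least squares on the linearized forward problem. A toy sketch — a random Jacobian stands in for the FEM-derived one, and the zeroth-order Tikhonov regularization parameter is chosen arbitrarily:

```python
import numpy as np

# Given a sensitivity (Jacobian) matrix J from the forward model and a
# boundary-data change b, recover the conductivity change x.
rng = np.random.default_rng(0)
n_meas, n_elem = 200, 50
J = rng.normal(size=(n_meas, n_elem))   # stands in for the FEM Jacobian
x_true = np.zeros(n_elem)
x_true[10] = -0.1                       # a 10% conductivity decrease
b = J @ x_true                          # simulated (noiseless) boundary data

lam = 1e-3                              # Tikhonov parameter (illustrative)
x_rec = np.linalg.solve(J.T @ J + lam * np.eye(n_elem), J.T @ b)
print(int(np.argmin(x_rec)), float(x_rec.min()))
```

In a real pipeline the quality of `J` is exactly where the anisotropic versus isotropic forward model matters: an inaccurate Jacobian misplaces the reconstructed change even when the algebra above is solved exactly.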

Relevance: 30.00%

Abstract:

Accurate measurement of knee kinematics during functional activities suffers mainly from soft tissue artifact (STA): the combination of local surface deformations and rigid movement of markers relative to the underlying bone (also called rigid STA movement: RSTAM). This study proposes to assess RSTAM on the thigh, shank, and knee joint and to observe possible features between subjects. Nineteen subjects with knee arthroplasty were asked to walk on a treadmill while a biplane fluoroscopic system (X-rays) and a stereophotogrammetric system (skin markers) recorded their knee movement. The RSTAM was defined as the rigid movement of the cluster of skin markers relative to the prosthesis. The results showed that RSTAM amplitude represents approximately 80-100% of the STA. The vertical axis of the anatomical frame of the femur was influenced the most by RSTAM. Combined with tibial error, internal/external rotation angle and distraction-compression were the knee kinematics parameters most affected by RSTAM during the gait cycle, with average rms values of 3.8° and 11.1 mm. This study highlighted higher RSTAM during the swing phase particularly in the thigh segment and suggests new features for RSTAM such as the particular shape of some RSTAM waveforms and the absence of RSTAM in certain kinematics during the gait phases. The comparison of coefficient of multiple correlations showed some similarities of RSTAM between subjects, while some correlations were found with gait speed and BMI. These new insights could potentially allow the development of new methods of compensation to avoid STA.
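The rms values quoted summarize the error waveform over the gait cycle. A minimal sketch (the angle samples are hypothetical; in the study the reference kinematics come from biplane fluoroscopy):

```python
import math

def rms_error(measured, reference):
    """RMS difference between skin-marker-derived and reference
    kinematics sampled over a gait cycle."""
    return math.sqrt(
        sum((m - r) ** 2 for m, r in zip(measured, reference)) / len(measured))

# Hypothetical internal/external rotation angles (degrees) at four samples
skin = [2.0, 5.5, 7.0, 4.0]
fluoro = [1.0, 2.5, 3.0, 1.5]
print(round(rms_error(skin, fluoro), 2))  # -> 2.84
```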

Relevance: 30.00%

Abstract:

PURPOSE: Determine the effect of repeated intravitreal injections of ranibizumab (0.5 mg; 0.05 ml) on retrobulbar blood flow velocities (BFVs) using ultrasound imaging quantification in twenty patients with exudative age-related macular degeneration treated for 6 months. METHODS: Visual acuity (ETDRS), central macular thickness (OCT), peak-systolic, end-diastolic and mean-BFVs in central retinal (CRA), temporal posterior ciliary (TPCA) and ophthalmic (OA) arteries were measured before, 2 days, 3 weeks and 6 months after the first injection. Patients were examined monthly and received 1-5 additional injections depending on ophthalmologic examination results. RESULTS: Six months after the first injection, a significant increase in visual acuity 50.9 ± 25.9 versus 44.4 ± 21.7 (p < 0.01) and decrease in mean central macular thickness 267 ± 74 versus 377 ± 115 μm (p < 0.001) were observed compared to baseline. Although mean-BFVs decreased by 16%±3% in CRA and 20%±5% in TPCA (p < 0.001) 2 days after the first injection, no significant change was seen thereafter. Mean-BFVs in OA decreased by 19%±5% at week 3 (p < 0.001). However, the smallest number of injections (two injections) was associated with the longest time interval between the last injection and month 6 (20 weeks) and with the best return to baseline levels for mean-BFVs in CRA, suggesting that ranibizumab had reversible effects on native retinal vascular supply after its discontinuation. Moreover, a significant correlation between the number of injections and percentage of changes in mean-BFVs in CRA was observed at month 6 (R = 0.74, p < 0.001) unlike TPCA or OA. CONCLUSION: Ranibizumab could impair the native choroidal and retinal vascular networks, but its effect seems reversible after its discontinuation.