909 results for measurement and metrology
Abstract:
Until recently, much effort was devoted to the estimation of panel data regression models without adequate attention being paid to the drivers of diffusion and interaction across cross-sectional and spatial units. We discuss some new methodologies in this emerging area and demonstrate their use in measurement and inference on cross-sectional and spatial interactions. Specifically, we highlight the important distinction between spatial dependence driven by unobserved common factors and spatial dependence based on a spatial weights matrix. We argue that purely factor-driven models of spatial dependence may be somewhat inadequate because of their connection with the exchangeability assumption. Limitations and potential enhancements of the existing methods are discussed, and several directions for new research are highlighted.
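A minimal numerical sketch (not from the paper) of the two sources of spatial dependence contrasted above: a spatial-lag term built from a hypothetical row-standardized weights matrix W versus a term driven by a single unobserved common factor. All values are illustrative.

```python
import numpy as np

# Hypothetical 4-unit example: binary contiguity, then row-standardized.
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)    # row-standardize so each row sums to 1

y = np.array([2.0, 3.5, 1.0, 4.2])      # outcomes on the 4 spatial units

spatial_lag = W @ y                      # weights-matrix-driven dependence: neighbour averages
factor_loading = np.array([0.5, 1.2, 0.8, 1.0])
common_factor = 0.9                      # one unobserved factor hitting every unit
factor_term = factor_loading * common_factor

print("spatial lag:", spatial_lag)
print("common-factor term:", factor_term)
```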
Abstract:
Precise focusing is essential for transcranial MRI-guided focused ultrasound (TcMRgFUS) to minimize collateral damage to non-diseased tissues and to achieve temperatures capable of inducing coagulative necrosis at acceptable power deposition levels. CT is usually used for this refocusing but requires a separate study (CT) ahead of the TcMRgFUS procedure. The goal of this study was to determine whether MRI with an appropriate sequence would be a viable alternative to CT for planning ultrasound refocusing in TcMRgFUS. We tested three MRI pulse sequences (a T1-weighted 3D volume-interpolated breath-hold examination (VIBE), a proton-density-weighted 3D sampling perfection with application-optimized contrasts using different flip angle evolutions, and T2-weighted 3D true fast imaging with steady-state precession) on patients who had already undergone a CT scan. We made detailed measurements of the calvarial structure based on the MRI data and compared this so-called 'virtual CT' to the same measurements based on the CT data, used as the reference standard. We then loaded both the standard and the virtual CT into a TcMRgFUS device and compared the calculated phase correction values, as well as the temperature elevation in a phantom. A series of Bland-Altman agreement analyses identified T1 3D VIBE as the optimal MRI sequence for minimizing the discrepancy between the MRI-derived and CT-derived total skull thickness measurements (mean discrepancy: 0.025; 95% CL (-0.22 to 0.27); p = 0.825). The T1-weighted sequence was also the best for estimating skull CT density and skull layer thickness. The mean difference between the phase shifts calculated with the standard CT and with the virtual CT reconstructed from the T1 dataset was 0.08 ± 1.2 rad in patients and 0.1 ± 0.9 rad in the phantom. Compared with the real CT, the MR-based correction showed a 1 °C drop in the maximum temperature elevation in the phantom (7% relative drop). Without any correction, the maximum temperature dropped by 6 °C (43% relative drop). We have thus developed an approach that allows a virtual CT dataset to be reconstructed from MRI to perform phase correction in TcMRgFUS.
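For readers unfamiliar with the agreement analysis reported above, the following is a hedged sketch of a standard Bland-Altman computation (mean difference plus 1.96·SD limits of agreement); the paired thickness values are made-up placeholders, not the study's data.

```python
import numpy as np

# Placeholder paired measurements (mm): CT-derived vs MRI-derived total skull thickness.
ct  = np.array([6.8, 7.2, 5.9, 8.1, 6.5, 7.7])
mri = np.array([6.9, 7.0, 6.1, 8.0, 6.4, 7.9])

diff = mri - ct
mean_diff = diff.mean()                      # bias (mean measurement discrepancy)
sd_diff = diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd_diff,           # 95% limits of agreement
       mean_diff + 1.96 * sd_diff)

print(f"bias = {mean_diff:.3f} mm, 95% limits of agreement = {loa[0]:.2f} to {loa[1]:.2f} mm")
```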
Abstract:
In Switzerland, individuals exposed to a risk of radioactivity intake are required to undergo regular monitoring. Monitoring consists of a screening measurement and is meant to be performed with commonly available laboratory instruments; in particular, iodine intake is measured using a surface contamination monitor. The goal of the present paper is to report the calibration method developed for thyroid screening instruments. It consists of measuring the instrument response to a known activity located in the thyroid gland of a standard neck phantom. One issue with this procedure is that the iodine radioisotopes have short half-lives. The adequacy and limitations of simulating the short-lived radionuclides with so-called mock radionuclides of longer half-life were therefore also evaluated. In light of the results, it was decided to use only the appropriate iodine sources to perform the calibration.
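As a hedged illustration of the calibration step and of the short-half-life issue mentioned above, the sketch below decay-corrects a hypothetical ¹³¹I source activity and derives a calibration factor from an assumed instrument reading; the numbers and the cps/Bq convention are illustrative, not the paper's values.

```python
import math

def decayed_activity(a0_bq: float, elapsed_days: float, half_life_days: float) -> float:
    """Activity remaining after radioactive decay: A(t) = A0 * exp(-ln2 * t / T_half)."""
    return a0_bq * math.exp(-math.log(2) * elapsed_days / half_life_days)

# Illustrative values: a 5000 Bq certified source measured 3 days later (131I half-life ~ 8.02 d).
a_now = decayed_activity(a0_bq=5000.0, elapsed_days=3.0, half_life_days=8.02)

# Hypothetical instrument reading (counts per second) with the source in the neck phantom.
reading_cps = 1250.0
calibration_factor = reading_cps / a_now    # instrument response per unit activity

print(f"activity today ~ {a_now:.0f} Bq, calibration factor ~ {calibration_factor:.3f} cps/Bq")
```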
Abstract:
PURPOSE: In Switzerland, nationwide large-scale radon surveys have been conducted since the early 1980s to establish the distribution of indoor radon concentrations (IRC). The aim of this work was to study the factors influencing IRC in Switzerland using univariate analyses that take into account biases caused by spatial irregularities of sampling. METHODS: About 212,000 IRC measurements carried out in more than 136,000 dwellings were available for this study. A probability map assessing the risk of exceeding an IRC of 300 Bq/m³ was produced using basic geostatistical techniques. Univariate analyses of IRC were performed for different variables, namely the type of radon detector, various building characteristics such as foundation type, year of construction and building type, as well as altitude, average outdoor temperature during measurement and lithology, by comparing 95% confidence intervals among the classes of each variable. Furthermore, a map showing the spatial aggregation of the number of measurements was generated for each class of each variable in order to assess biases due to spatially irregular sampling. RESULTS: IRC measurements carried out with electret detectors were 35% higher than measurements performed with track detectors. Regarding building characteristics, the IRC of apartments is significantly lower than that of individual houses, and buildings with concrete foundations have the lowest IRC. A significant decrease in IRC was found in buildings constructed after 1900 and again after 1970. Moreover, IRC decreases at higher outdoor temperatures, and there is a tendency towards higher IRC at higher altitudes. Regarding lithology, carbonate rock in the Jura Mountains produces significantly higher IRC, by almost a factor of 2, than carbonate rock in the Alps. Sedimentary rock and sediment produce the lowest IRC, while carbonate rock from the Jura Mountains and igneous rock produce the highest IRC. Potential biases due to spatially unbalanced sampling of measurements were identified for several influencing factors. CONCLUSIONS: Significant associations were found between IRC and all variables under study. However, we showed that the spatial distribution of samples strongly affects the relevance of these associations. Therefore, future methods to estimate local radon hazards should take the multidimensionality of the process underlying IRC into account.
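A hedged sketch of one simple way to estimate the probability of exceeding 300 Bq/m³ from local measurements, assuming an approximately lognormal IRC distribution; the paper's geostatistical mapping is more elaborate, and the data below are invented.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical IRC measurements (Bq/m3) falling in one map cell; treating IRC as
# roughly lognormal is an assumption here, not necessarily the paper's exact model.
irc = np.array([45, 120, 80, 310, 95, 60, 150, 220])
log_irc = np.log(irc)

mu, sigma = log_irc.mean(), log_irc.std(ddof=1)
p_exceed = 1 - norm.cdf(np.log(300), loc=mu, scale=sigma)   # P(IRC > 300 Bq/m3)

print(f"estimated probability of exceeding 300 Bq/m3: {p_exceed:.2f}")
```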
Abstract:
A solution of ¹⁸F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch diameter cylindrical UPS89 plastic scintillator positioned at the bottom of a well-type 5″ × 5″ NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this ¹⁸F solution were also measured using 4πγ NaI(Tl) integral counting with Monte Carlo calculated efficiencies, as well as with the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The ¹⁸F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
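A minimal sketch of the idealized 4πβ-4πγ coincidence relation that underlies such a standardisation, with illustrative count rates; real measurements require background, dead-time and efficiency-extrapolation corrections not shown here.

```python
# Idealized coincidence relation for background- and dead-time-corrected rates:
#   N_beta = N0 * eff_b,  N_gamma = N0 * eff_g,  N_coinc = N0 * eff_b * eff_g
# so the source activity follows as N0 ~ N_beta * N_gamma / N_coinc.
# The rates below are illustrative only, not the paper's data.
n_beta, n_gamma, n_coinc = 9800.0, 4200.0, 4100.0   # counts per second
activity = n_beta * n_gamma / n_coinc                # disintegrations per second (Bq)
print(f"estimated activity ~ {activity:.0f} Bq")
```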
Abstract:
This study describes the major electrocardiogram (ECG) measurements and diagnoses in a population of African individuals; most reference data have been collected in Caucasian populations, and evidence exists for interethnic differences in ECG findings. The study was conducted in the Seychelles islands (Indian Ocean) and included 709 black individuals (343 men and 366 women) aged 25 to 64 years randomly selected from the general population. Resting ECGs were recorded using a validated ECG unit equipped with measurement and interpretation software (Cardiovit AT-6, Schiller, Switzerland). The epidemiology of 14 basic ECG measurements, 6 composite criteria for left ventricular hypertrophy, and 19 specific ECG diagnoses, including abnormal rhythms, conduction abnormalities, repolarization abnormalities, and myocardial infarction, was examined. Substantial gender and age differences were found for several ECG parameters. Moreover, tracings recorded in African individuals of the Seychelles differed in many respects from those collected similarly in Caucasian populations. For instance, heart rate was approximately 5 beats per minute lower in the African individuals than in selected Caucasian populations, the prevalence of first-degree atrio-ventricular block was especially high (4.8%), and the average Sokolow-Lyon voltage was markedly higher in African individuals of the Seychelles than in black and white Americans. The integrated interpretation software detected "old myocardial infarction" in 3.8% of men and 0% of women and "old myocardial infarction possible" in 6.1% and 3%, respectively. Cardiac infarction injury scores are also provided. In conclusion, the study provides reference values for ECG findings in a specific population of people of African descent and stresses the need to systematically consider gender, age, and ethnicity when interpreting ECG tracings.
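As a hedged illustration of the Sokolow-Lyon voltage mentioned above: the index is commonly computed as the S-wave amplitude in V1 plus the larger R-wave amplitude in V5 or V6, with 3.5 mV as a widely used LVH cut-off; the amplitudes below are invented, and the exact criteria applied by the study's software may differ.

```python
def sokolow_lyon_voltage(s_v1_mv: float, r_v5_mv: float, r_v6_mv: float) -> float:
    """Sokolow-Lyon index: S in V1 plus the larger of R in V5 and R in V6 (millivolts)."""
    return s_v1_mv + max(r_v5_mv, r_v6_mv)

# Illustrative tracing; 3.5 mV is the commonly used voltage criterion for LVH.
voltage = sokolow_lyon_voltage(s_v1_mv=1.4, r_v5_mv=2.3, r_v6_mv=2.0)
verdict = "meets LVH voltage criterion" if voltage >= 3.5 else "below criterion"
print(f"{voltage:.1f} mV -> {verdict}")
```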
Abstract:
The sample size, the types of variables, the measurement format, and the construction of instruments that collect valid and reliable data must all be considered during the research process. In the social and health sciences, and more specifically in nursing, data-collection instruments are usually composed of latent variables, i.e. variables that cannot be directly observed. This emphasizes the importance of deciding how to measure study variables (using an ordinal scale, or a Likert or Likert-type scale). Psychometric scales are examples of instruments that are affected by the type of variables that compose them, which can cause problems with measurement and statistical analysis (parametric versus non-parametric tests). Hence, investigators using these variables must rely on assumptions grounded in simulation studies or on recommendations based on scientific evidence in order to make the best decisions.
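A small illustrative comparison of a parametric and a non-parametric test on the same ordinal (Likert-type) responses, echoing the measurement issue described above; the scores are invented.

```python
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu

# Illustrative 5-point Likert responses for two groups (ordinal data).
group_a = np.array([4, 5, 3, 4, 5, 4, 2, 5, 4, 3])
group_b = np.array([3, 2, 4, 3, 2, 3, 3, 2, 4, 3])

t_stat, p_param = ttest_ind(group_a, group_b)          # parametric: treats scores as interval data
u_stat, p_nonparam = mannwhitneyu(group_a, group_b)    # non-parametric: uses ranks only

print(f"t-test p = {p_param:.3f}, Mann-Whitney p = {p_nonparam:.3f}")
```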
Abstract:
Management control today is oriented toward acting before undesirable events occur, ensuring that the objectives set by management are achieved within the established timeframe. Moreover, management control should be the engine that drives the best performance in a company's critical areas, not only in the economic and financial domain but also in growth, safety and productivity. One of the most important concerns of today's administrations is to determine whether the organization's performance is in line with what was previously established, that is, with its objectives and targets. This performance can be verified through effective performance evaluation methods and systems. In this context, the present work is an exploratory, descriptive study that identifies and examines how banking institutions in Cape Verde manage certain aspects, especially performance evaluation and strategic control, and which indicators they use. Although the specific objectives of the work lie elsewhere, we also pay special attention to the characteristics of the Cape Verdean market and to the importance of the banking sector for the economy. Finally, we present the Balanced Scorecard as a tool capable of overcoming the difficulties of performance evaluation, together with the set of indicators we consider most appropriate. Here we concentrate on the four basic perspectives and the strategy map, noting the role of the Balanced Scorecard in strategic alignment and in evaluating organizational performance. We conclude the study with an interview with an expert (the Financial Director) of one of the local banks, whose name we agreed not to publish. In this way, we hope to contribute to a better understanding of the reality under study, both from a theoretical standpoint and in terms of verifying practice in the sector.
Abstract:
The change in accounting standards that took place in 2009 altered the paradigm for the recognition and measurement of assets. Although the nature of the operations remains present in the accounting process, many things have changed in view of the substance of the information and its economic reality. Concession contracts are a good example: in some cases, assets that were recognized as tangible fixed assets under the previous standards are now recognized as intangible assets. The main objective of this study is to analyse the concept of concession contracts, as well as the procedures for their recognition, measurement and disclosure in the financial statements. As they are considered intangible assets (the entity in fact ends up holding a "right" to exploit a particular asset), the accounting treatment follows Financial Reporting Standard No. 6 (Intangible Assets). Concession contracts have their own specific characteristics, and for this reason the IASB issued an interpretation (IFRIC 12) to clarify their accounting treatment. Since no such interpretative standard exists in the national framework, national companies that face this reality must, on a supplementary basis, resort to the international accounting standards to resolve the matter; this is the case of ELECTRA for the assets assigned to distribution. The study therefore addresses this issue, presents a theoretical framework, analyses the main recognition aspects under the two national accounting frameworks (the former National Accounting Plan and the current Accounting Standardization and Financial Reporting System), and ends by using information from ELECTRA, SARL to illustrate this process of accounting recognition.
Abstract:
OBJECTIVES: To provide a global, up-to-date picture of the prevalence, treatment, and outcomes of Candida bloodstream infections in intensive care unit patients and to compare Candida with bacterial bloodstream infection. DESIGN: A retrospective analysis of the Extended Prevalence of Infection in the ICU Study (EPIC II). Demographic, physiological, infection-related and therapeutic data were collected. Patients were grouped as having Candida, Gram-positive, Gram-negative, or combined Candida/bacterial bloodstream infection. Outcome data were assessed at intensive care unit and hospital discharge. SETTING: EPIC II included 1265 intensive care units in 76 countries. PATIENTS: Patients in participating intensive care units on the study day. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Of the 14,414 patients in EPIC II, 99 had Candida bloodstream infections, a prevalence of 6.9 per 1000 patients. Sixty-one patients had candidemia alone and 38 had combined bloodstream infections. Candida albicans (n = 70) was the predominant species. Primary therapy included monotherapy with fluconazole (n = 39), caspofungin (n = 16), and a polyene-based product (n = 12); combination therapy was infrequently used (n = 10). Compared with patients with Gram-positive (n = 420) and Gram-negative (n = 264) bloodstream infections, patients with candidemia were more likely to have solid tumors (p < .05) and appeared to have been in an intensive care unit longer (14 days [range, 5-25 days], 8 days [range, 3-20 days], and 10 days [range, 2-23 days], respectively), although this difference was not statistically significant. Severity of illness and organ dysfunction scores were similar between groups. Patients with Candida bloodstream infections, compared with patients with Gram-positive and Gram-negative bloodstream infections, had the greatest crude intensive care unit mortality rate (42.6%, 25.3%, and 29.1%, respectively) and the longest intensive care unit length of stay (median [interquartile range]: 33 days [18-44], 20 days [9-43], and 21 days [8-46], respectively); however, these differences were not statistically significant. CONCLUSION: Candidemia remains a significant problem in intensive care unit patients. In the EPIC II population, Candida albicans was the most common organism and fluconazole remained the predominant antifungal agent used. Candida bloodstream infections are associated with high intensive care unit and hospital mortality rates and with high resource use.
Abstract:
This study was performed to analyse the prevalence of obesity in children living in six different areas of north-east Italy. The study included 1523 children (749 males, 774 females), divided into four age categories (4, 8, 10, and 12 ± 0.5 years of age). The physical characteristics of the children were measured by trained and standardized examiners. In accordance with the guidelines of the Italian Consensus Conference on Obesity (Rome, 4-6 June 1991), a child was defined as obese when his or her weight was higher than 120% of the weight predicted for height, as calculated from Tanner's tables. On average, the prevalence of obesity was higher in males than in females (15.7% vs. 11%). The highest prevalence was seen in 10-year-old males (23.4%). The prevalence increased with age both in males (4 years = 3.6%, 8 years = 11.2%, 10 years = 23.4%, 12 years = 17.3%) and in females (4 years = 2%, 8 years = 13.3%, 10 years = 12.7%, 12 years = 11.9%). This tendency was maintained when the obesity prevalence was calculated by other methods, such as BMI, triceps skinfold and fat mass, although the magnitude of the prevalence differed depending on the criteria used to define it. A consensus on more precise criteria for defining obesity is needed for a better diagnosis of obesity in childhood and to allow more reliable measurement and comparison of the prevalence of obesity among populations.
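A minimal sketch of the obesity criterion described above (measured weight above 120% of the weight predicted for height); the predicted weight would come from Tanner's tables, and the values below are invented.

```python
def is_obese(weight_kg: float, predicted_weight_kg: float, threshold: float = 1.20) -> bool:
    """Obesity per the criterion above: measured weight exceeding 120% of the
    weight predicted for height (prediction taken e.g. from Tanner's tables)."""
    return weight_kg > threshold * predicted_weight_kg

# Illustrative child: 42 kg measured against a hypothetical 33 kg predicted-for-height value.
print(is_obese(42.0, 33.0))   # True -> classified as obese under this definition
```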
Abstract:
Medical textbooks generally concentrate almost exclusively on methodology for analyzing fixed measures, i.e. measurements made at a single moment in time; nevertheless, the evolution of a measurement over time and the correct interpretation of missing values are very important and can sometimes hold the key to the results obtained. Time-series analysis and spectral analysis (analysis of time series in the frequency domain) are therefore appropriate tools for this kind of study. In this work, the frequency of the pulsatile secretion of luteinizing hormone (LH), which regulates the fertile life of women, was analyzed in order to determine the significant frequencies obtained by Fourier analysis. Detecting the frequencies at which the pulsatile secretion of LH takes place is a rather difficult problem because of random errors in the measurements and in the sampling, i.e. pulsatile secretions of small amplitude are not detected and are disregarded. In physiology it is accepted that cyclical patterns in the secretion of LH exist; the results of this research confirm this pattern and determine its frequency, presented in the periodograms corresponding to each of the cycles studied. The results can be used to guide future sampling frequencies so as to "catch" the significant peaks of luteinizing hormone and thus inform timely fertility treatment in women.
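A hedged sketch of the periodogram approach described above, applied to a simulated pulsatile series rather than real LH data; the sampling interval and pulse period are invented.

```python
import numpy as np
from scipy.signal import periodogram

# Simulated LH-like series: a 90-minute pulsatile component plus noise, sampled
# every 10 minutes for 24 h. All values are illustrative, not the study's data.
rng = np.random.default_rng(0)
dt_min = 10.0
t = np.arange(0, 24 * 60, dt_min)                    # sampling times in minutes
lh = 5 + 2 * np.sin(2 * np.pi * t / 90) + rng.normal(0, 0.5, t.size)

freqs, power = periodogram(lh, fs=1.0 / dt_min)      # frequencies in cycles per minute
dominant = freqs[np.argmax(power[1:]) + 1]           # skip the zero-frequency term
print(f"dominant period ~ {1 / dominant:.0f} min")
```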
Abstract:
In normal tissues, a balance between pro- and anti-angiogenic factors tightly controls angiogenesis, and alterations of this balance may have pathological consequences. In the retina, for instance, vascular endothelial growth factor (VEGF) is a potent pro-angiogenic factor and has been identified as a key player in the ocular neovascularization implicated in a variety of retinal diseases. In the exudative (wet) form of age-related macular degeneration (AMD), neovascularization arising from the choroidal vessels is responsible for a rapid and dramatic loss of visual acuity, while in diabetic retinopathy and retinopathy of prematurity, sprouting from the retinal vessels leads to vision loss. Ocular neovascularization is the leading cause of blindness in developed countries, and the aging of the population, the increasing prevalence of diabetes and the better survival of premature infants will increase the incidence of these conditions. Anti-VEGF strategies therefore represent an important therapeutic approach for ocular neovascular disorders. Several anti-angiogenic factors have also been identified; among them, pigment epithelium-derived factor (PEDF) is both a neurotrophic and an anti-angiogenic factor, and its administration prevents photoreceptor cell death in a model of light-induced retinal degeneration. Previous results based on end-point morphology showed that the light damage (LD) model mimics retinal degenerations arising from environmental insult as well as from aging and genetic diseases such as advanced atrophic AMD, and light has been identified as a co-factor in a number of retinal diseases, speeding up the degeneration process. The protective effect of PEDF in the light-damaged retina raises the possibility that the balance between pro- and anti-angiogenic factors is involved not only in angiogenesis but also in cell survival and maintenance. The aim of the work presented here was therefore to evaluate the importance of this balance in neurodegenerative processes, using a characterized model of light-induced retinal degeneration and focusing mainly on factors that simultaneously control neuron survival and angiogenesis, such as PEDF and VEGF.
In most species, prolonged intense light exposure can damage photoreceptor cells, and this damage can progress to cell death and vision loss. A previously described protocol was used to induce retinal degeneration in Balb/c mice, and retinas were characterized at different time points after light injury by several methods at the functional and molecular levels. The data confirmed that toxic levels of light induce photoreceptor cell death. Variations were observed in VEGF pathway players in both the neural retina and the eye-cup containing the retinal pigment epithelium (RPE), suggesting a flux of VEGF from the RPE towards the neuroretina. Concomitantly, the integrity of the outer blood-retinal barrier (BRB) was altered, leading to extravascular albumin leakage from the choroid throughout the photoreceptor layer. To evaluate the importance of VEGF during the light-induced degeneration process, a lentiviral vector encoding the cDNA of a single-chain antibody directed against all VEGF-A isoforms was developed (LV-V65). The bioactivity of this vector in blocking VEGF was validated in a mouse model of laser-induced choroidal neovascularization mediated by VEGF upregulation, and the vector was then used in the LD model. Administration of LV-V65 contributed to the maintenance of functional photoreceptors, as assessed by ERG recording, visual acuity measurement and histological analyses. At the RPE level, BRB integrity was preserved, as shown by the absence of albumin leakage and the maintenance of RPE cell cohesion.
Taken together, these results indicate that VEGF is a mediator of the light-induced photoreceptor degeneration process and confirm the crucial role of the balance between pro- and anti-angiogenic factors in photoreceptor cell survival. This work also highlights the prime importance of BRB integrity and of the functional coupling between RPE and photoreceptor cells in maintaining photoreceptor survival. VEGF dysregulation has already been shown to be involved in the wet form of AMD; our study suggests that it may also occur at early stages of AMD and could thus be a potential therapeutic target for several RPE-related diseases.
Abstract:
BACKGROUND: Multislice CT (MSCT) combined with D-dimer measurement can safely exclude pulmonary embolism in patients with a low or intermediate clinical probability of this disease. We compared this combination with a strategy in which both a negative venous ultrasonography of the leg and a negative MSCT were needed to exclude pulmonary embolism. METHODS: We included 1819 consecutive outpatients with clinically suspected pulmonary embolism in a multicentre non-inferiority randomised controlled trial comparing two strategies: clinical probability assessment and either D-dimer measurement and MSCT (DD-CT strategy [n=903]) or D-dimer measurement, venous compression ultrasonography of the leg, and MSCT (DD-US-CT strategy [n=916]). Randomisation was by computer-generated blocks with stratification according to centre. Patients with a high clinical probability according to the revised Geneva score and a negative work-up for pulmonary embolism were further investigated in both groups. The primary outcome was the 3-month thromboembolic risk in patients who were left untreated on the basis of the exclusion of pulmonary embolism by the diagnostic strategy. Clinicians assessing outcome were blinded to group assignment. Analysis was per protocol. This study is registered with ClinicalTrials.gov, number NCT00117169. FINDINGS: The prevalence of pulmonary embolism was 20.6% in both groups (189 cases in the DD-US-CT group and 186 in the DD-CT group). We analysed 855 patients in the DD-US-CT group and 838 in the DD-CT group per protocol. The 3-month thromboembolic risk was 0.3% (95% CI 0.1-1.1) in the DD-US-CT group and 0.3% (0.1-1.2) in the DD-CT group (difference 0.0% [-0.9 to 0.8]). In the DD-US-CT group, ultrasonography showed a deep-venous thrombosis in 53 (9% [7-12]) of 574 patients, in whom MSCT was therefore not undertaken. INTERPRETATION: The strategy combining D-dimer and MSCT is as safe as the strategy using D-dimer followed by venous compression ultrasonography of the leg and MSCT for the exclusion of pulmonary embolism. An ultrasound could be of use in patients with a contraindication to CT.
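As a hedged illustration of the 3-month thromboembolic risk estimates quoted above, the sketch below computes a proportion with a Wilson score 95% confidence interval; the event count and denominator are invented, and the trial's exact interval method may differ.

```python
import math

def wilson_ci(events: int, n: int, z: float = 1.96):
    """Wilson score 95% CI for a proportion (one common choice of interval)."""
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Illustrative: a handful of events among roughly 840 untreated patients, as in each study arm.
events, n = 3, 840
low, high = wilson_ci(events, n)
print(f"3-month risk ~ {events / n:.1%} (95% CI {low:.1%} to {high:.1%})")
```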
Abstract:
Light-emitting diodes (LEDs) are taking an increasing place in the market for domestic lighting because they produce light with low energy consumption. In the EU, by 2016, no traditional incandescent light sources will be available, and LEDs may become the main domestic light sources. Because of the specific spectral and energetic characteristics of white LEDs compared with other domestic light sources, some concerns have been raised regarding their safety for human health, and in particular regarding potential harmful risks for the eye. To conduct a health risk assessment of systems using LEDs, the French Agency for Food, Environmental and Occupational Health & Safety (ANSES), a public body reporting to the French ministers for ecology, health and employment, organized a task group. The group consisted of physicists, lighting and metrology specialists, a retinal biologist, and an ophthalmologist, who worked together for a year. Part of this work comprised the evaluation, according to the applicable standards, of the risk groups of different white LEDs commercialized on the French market; some of these lights were found to belong to risk group 1 or 2. This paper gives a comprehensive analysis of the potential risks of white LEDs, taking into account pre-clinical knowledge as well as epidemiologic studies, and reports the French Agency's recommendations for avoiding potential retinal hazards.