885 results for new in ILL units


Relevance:

100.00%

Publisher:

Abstract:

Introduction: Evidence from studies conducted mainly in the US and mainland Europe suggests that characteristics of the workforce, such as nurse-patient ratios and workload (measured in a number of different ways), may be linked to variations in patient outcomes across health care settings (Carmel and Rowan 2001). Few studies have tested this relationship in the UK, so questions remain about whether we are justified in extrapolating evidence from studies conducted in very different health care systems. Objectives: To investigate whether characteristics of the nursing workforce affect patient mortality in UK Intensive Care Units. Data: Patient data came from the Case Mix Programme of the Intensive Care National Audit and Research Centre (ICNARC), while information about the units came from a survey of all ICUs in England (Audit Commission 1998). The merged data set contained information on 43,859 patients in 69 units across England. ICNARC also supplied a risk adjustment variable to control for patient characteristics, which are often the most important determinants of survival. Methods: Multivariate multilevel logistic regression. Findings: Higher numbers of direct care nurses and lower scores on measures of workload (proportion of occupied beds at the time the patient was admitted and mean daily transfers into the unit) were associated with lower mortality rates. Furthermore, the effect of the number of direct care nurses was greatest on the life chances of the patients who were most at risk of dying. Implications: This study has wide implications for workforce policy and planning because it shows that the size of the nursing workforce is associated with mortality (West et al 2006); few studies have demonstrated this relationship in the UK. The study has a number of strengths and weaknesses, and further research is required to determine whether the relationship between the nursing workforce and patient outcomes is causal.
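
The analysis described above treats patients as nested within units, which is what a multilevel (random-intercept) logistic regression captures. The sketch below illustrates that model form only; the data are synthetic, the column names (icnarc_risk, direct_care_nurses, occupancy, unit) are hypothetical, and this is not the study's actual analysis code.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Synthetic stand-in data: 2,000 patients nested in 20 units (illustrative only).
rng = np.random.default_rng(0)
n = 2000
data = pd.DataFrame({
    "unit": rng.integers(0, 20, n),
    "icnarc_risk": rng.normal(0, 1, n),          # risk-adjustment score
    "direct_care_nurses": rng.normal(0, 1, n),   # standardized staffing level
    "occupancy": rng.normal(0, 1, n),            # workload at admission
})
logit = -1.0 + 1.2 * data["icnarc_risk"] - 0.3 * data["direct_care_nurses"]
data["mortality"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Random-intercept logistic regression: fixed effects for risk and workforce
# variables, plus a random intercept per ICU to absorb between-unit variation.
model = BinomialBayesMixedGLM.from_formula(
    "mortality ~ icnarc_risk + direct_care_nurses + occupancy",
    {"unit": "0 + C(unit)"},
    data,
)
print(model.fit_vb().summary())   # variational Bayes fit; fit_map() is an alternative
```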

Relevance:

100.00%

Publisher:

Abstract:

Responses evoked in muscle sympathetic nerve activity (MSNA) by systemic hypoxia have received relatively little attention. Moreover, MSNA is generally identified from firing characteristics in fibres supplying whole limbs: their actual destination is not determined. We aimed to address these limitations by using a novel preparation of the spinotrapezius muscle in anaesthetised rats. Using focal recording electrodes, multi-unit and discriminated single-unit activity were recorded from the surface of arterial vessels. This activity showed the cardiac- and respiratory-related components expected of MSNA, and was increased by baroreceptor unloading, decreased by baroreceptor stimulation and abolished by autonomic ganglion blockade. Progressive, graded hypoxia (breathing 12, 10 and 8% O2 sequentially for 2 min each) evoked graded increases in MSNA. In single units, mean firing frequency increased from 0.2 ± 0.04 Hz in 21% O2 to 0.62 ± 0.14 Hz in 8% O2, while instantaneous frequencies ranged from 0.04–6 Hz in 21% O2 to 0.09–20 Hz in 8% O2. Concomitantly, arterial pressure (ABP) fell and heart rate (HR) and respiratory frequency (RF) increased progressively, while spinotrapezius vascular resistance (SVR, ABP/spinotrapezius blood flow) decreased, indicating muscle vasodilatation. During 8% O2 for 10 min, the falls in ABP and SVR were maintained, but RF, HR and MSNA waned towards baseline from the second to the tenth minute. Thus, we directly show that MSNA increases during systemic hypoxia to an extent that is mainly determined by the increases in peripheral chemoreceptor stimulation and respiratory drive, but its vasoconstrictor effects on the muscle vasculature are largely blunted by local dilator influences, despite high instantaneous frequencies in single fibres.

Relevance:

100.00%

Publisher:

Abstract:

We report general practitioners' views on Helicobacter pylori-associated dyspepsia and their use of screening tests in the community. The use of office serology tests for screening is of concern, as independent validation in specialist units has been disappointing.

Relevance:

100.00%

Publisher:

Abstract:

Pain management in premature and sick babies has long been recognised as a vital component of neonatal care; however, practices pertaining to pain assessment and administration of analgesia remain variable in neonatal units (NNUs). Sucrose has been identified as an effective agent in reducing pain during minor painful procedures in premature babies, but its uptake has been modest. This article is the first of two and describes the rationale for implementing sucrose administration for pain relief during minor procedures in one neonatal unit in Northern Ireland. Current literature on the use of sucrose is used to generate debate and discussion around the implementation of Clinical Practice Guidelines (CPG) for sucrose use.

Relevance:

100.00%

Publisher:

Abstract:

Gravel aquifers act as important potable water sources in central western Europe yet they are subject to numerous contamination pressures. Compositional and textural heterogeneity makes protection zone delineation around groundwater supplies in these units challenging; artificial tracer testing aids characterization. This paper reappraises previous tracer test results in light of new geological and microbiological data. Comparative passive gradient testing, using a fluorescent solute (Uranine), virus (H40/1 bacteriophage), and comparably sized bacterial tracers Escherichia coli and Pseudomonas putida, was used to investigate a calcareous gravel aquifer’s ability to remove microbiological contaminants at a test site near Munich, Germany. Test results revealed E. coli relative recoveries could exceed those of H40/1 at monitoring wells 10 m and 20 m from an injection well by almost four times; P. putida recoveries varied by a factor of up to three between wells. Application of filtration theory suggested greater attenuation of H40/1 relative to similarly charged E. coli occurred due to differences in microorganism size, while estimated collision efficiencies appeared comparable. By contrast, more positively charged P. putida experienced greater attenuation at one monitoring point, while lower attenuation rates at the second location indicated the influence of geochemical heterogeneity. Test findings proved consistent with observations from nearby fresh outcrops that suggested thin open framework gravel beds dominated mass transport in the aquifer, while discrete intervals containing stained clasts reflect localized geochemical heterogeneity. Study results highlight the utility of reconciling outcrop observations with artificial tracer test responses, using microbiological tracers with well-defined properties, to characterize aquifer heterogeneity.
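
As a rough illustration of the filtration-theory reasoning invoked above, the sketch below applies the classical clean-bed colloid filtration (Yao-type) relation, in which the filter coefficient scales with the collision (attachment) efficiency and the single-collector contact efficiency. All parameter values are illustrative and are not the study's measured quantities.

```python
import numpy as np

def filtration_recovery(L, d_c, porosity, alpha, eta0):
    """Classical colloid-filtration estimate of relative recovery C/C0
    after transport over distance L (clean-bed, Yao-type formulation).

    L        : transport distance (m)
    d_c      : median collector (grain) diameter (m)
    porosity : aquifer porosity (-)
    alpha    : collision (attachment) efficiency (-)
    eta0     : single-collector contact efficiency (-)
    """
    lam = 1.5 * (1.0 - porosity) / d_c * alpha * eta0  # filter coefficient (1/m)
    return np.exp(-lam * L)

# Illustrative values only: a smaller, phage-sized particle is assigned a higher
# contact efficiency (diffusion-dominated) than a bacteria-sized particle.
for label, eta0 in [("phage-sized particle", 2e-3), ("bacteria-sized particle", 8e-4)]:
    rel = filtration_recovery(L=10.0, d_c=5e-3, porosity=0.3, alpha=0.05, eta0=eta0)
    print(f"{label}: predicted C/C0 over 10 m = {rel:.2f}")
```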

Relevance:

100.00%

Publisher:

Abstract:

Emerging market trends point towards positioning-based services, placing a new perspective on the way we obtain and exploit positioning information. On one hand, innovations in information technology and wireless communication systems have enabled the development of numerous location-based applications such as vehicle navigation and tracking, sensor network applications, home automation, asset management, security and context-aware location services. On the other hand, wireless networks themselves may benefit from localization information to improve the performance of different network layers: location-based routing, synchronization and interference cancellation are prime examples of applications where location information can be useful. Typical positioning solutions rely on the measurement and exploitation of distance-dependent signal metrics, such as received signal strength, time of arrival or angle of arrival. They are cheaper and easier to implement than dedicated positioning systems based on fingerprinting, but at the cost of accuracy. Intelligent localization algorithms and signal processing techniques therefore have to be applied to mitigate the lack of accuracy in distance estimates. Cooperation between nodes is used in cases where conventional positioning techniques do not perform well, due to a lack of existing infrastructure or an obstructed indoor environment. The objective is to concentrate on a hybrid architecture in which some nodes have points of attachment to an infrastructure and are simultaneously interconnected via short-range ad hoc links. The availability of more capable handsets enables more innovative scenarios that take advantage of multiple radio access networks as well as peer-to-peer links for positioning. Link selection is used to optimize the tradeoff between the power consumption of participating nodes and the quality of target localization. The Geometric Dilution of Precision and the Cramér-Rao Lower Bound can be used as criteria for choosing the appropriate set of anchor nodes and corresponding measurements before attempting location estimation itself. This work analyzes the existing solutions for node selection to improve localization performance and proposes a novel method based on utility functions. The proposed method is then extended to mobile and heterogeneous environments. Simulations have been carried out, as well as evaluation with real measurement data. In addition, some specific cases have been considered, such as localization in ill-conditioned scenarios and the use of negative information. The proposed approaches have been shown to enhance estimation accuracy while significantly reducing complexity, power consumption and signalling overhead.
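
As an illustration of the GDOP criterion mentioned above, the sketch below computes the range-based 2D GDOP from the unit line-of-sight vectors and selects, by brute force, the anchor subset that minimizes it. This is not the thesis's utility-function method, and the coordinates are illustrative.

```python
import numpy as np
from itertools import combinations

def gdop(anchors, target):
    """Geometric Dilution of Precision for range-based 2D positioning:
    sqrt(trace((H^T H)^-1)), where H stacks the unit vectors from the
    target towards each anchor."""
    diffs = anchors - target
    H = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)
    return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

def select_anchors(anchors, target, k=3):
    """Pick the k-anchor subset with the lowest GDOP (brute-force search)."""
    best = min(combinations(range(len(anchors)), k),
               key=lambda idx: gdop(anchors[list(idx)], target))
    return best, gdop(anchors[list(best)], target)

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 1.0]])
target = np.array([4.0, 6.0])
subset, value = select_anchors(anchors, target, k=3)
print("chosen anchors:", subset, "GDOP:", round(value, 2))
```

In practice the selection criterion would be traded off against the power cost of activating each link, which is where the utility-function formulation comes in.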

Relevance:

100.00%

Publisher:

Abstract:

The main scope of this work was to evaluate the metabolic effects of anticancer agents (three conventional and one new) in osteosarcoma (OS) cells and osteoblasts, by measuring alterations in the metabolic profile of cells by nuclear magnetic resonance (NMR) spectroscopy metabolomics. Chapter 1 gives the theoretical framework of this work, beginning with the main metabolic characteristics that globally describe cancer as well as the families and mechanisms of action of drugs used in chemotherapy. The drugs currently used to treat OS are also presented, together with the palladium(II) complex with spermine, Pd2Spm, which is potentially active against cancer. The global strategy for cell metabolomics is then explained, and the state of the art of metabolomic studies analysing the effects of anticancer agents on cells is presented. In Chapter 2, the fundamentals of the analytical techniques used in this work (biological assays, NMR spectroscopy, and multivariate and statistical analysis of the results) are described. A detailed description of the experimental procedures adopted throughout this work is given in Chapter 3. The biological and analytical reproducibility of the metabolic profile of MG-63 cells by high-resolution magic angle spinning (HRMAS) NMR is evaluated in Chapter 4. The metabolic impact of several factors (cellular integrity, spinning rate, temperature, time and acquisition parameters) on the 1H HRMAS NMR spectral profile and quality is analysed, enabling the definition of the best acquisition parameters for further experiments. The metabolic consequences of an increasing number of passages of MG-63 cells, as well as of the duration of storage, are also investigated. Chapter 5 describes the metabolic impact of drugs conventionally used in OS chemotherapy, through NMR metabolomics studies of lysed cells and aqueous extracts. The results show that MG-63 cells treated with cisplatin (cDDP) undergo a strong up-regulation of lipid content, alterations in phospholipid constituents (choline compounds) and biomarkers of DNA degradation, all associated with cell death by apoptosis. Cells exposed to doxorubicin (DOX) or methotrexate (MTX) showed much milder metabolic changes, without any relevant alteration in lipid content; however, metabolic changes associated with an altered Krebs cycle, oxidative stress and nucleotide metabolism were detected and were tentatively interpreted in the light of the known mechanisms of action of these drugs. The metabolic impact of the exposure of MG-63 cells and osteoblasts to cDDP and the Pd2Spm complex is described in Chapter 6. Results show that, despite the ability of the two agents to bind DNA, the metabolic consequences of exposure to them are distinct, namely regarding the variation in lipid content (absent for Pd2Spm). Apoptosis detection assays showed that, unlike what was seen for MG-63 cells treated with cDDP, the decreased number of living cells upon exposure to Pd2Spm was not due to cell death by apoptosis or necrosis. Moreover, the latter agent induces more marked alterations in osteoblasts than in cancer cells, while the opposite seemed to occur upon cDDP exposure. Nevertheless, the results from MG-63 cells exposed to combination regimens with cDDP- or Pd2Spm-based cocktails, described in Chapter 7, revealed that, in combination, the two agents induce similar metabolic responses, arising from synergy mechanisms between the tested drugs.
Finally, the main conclusions of this thesis are summarized in Chapter 8, and future perspectives in the light of this work are presented.
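
Not the thesis's actual pipeline, but a minimal sketch of the multivariate step such NMR metabolomics studies rely on: bucketed spectra are mean-centred and projected by PCA to look for separation between control and drug-exposed cells. The data here are synthetic and the bin layout is hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic bucketed spectra: 20 samples x 150 chemical-shift bins, with the
# "treated" group shifted in a few bins to mimic a metabolic response.
rng = np.random.default_rng(1)
spectra = rng.normal(1.0, 0.05, size=(20, 150))
labels = np.array(["control"] * 10 + ["treated"] * 10)
spectra[labels == "treated", 40:45] += 0.3   # pretend a lipid/choline region changes

X = spectra - spectra.mean(axis=0)           # mean-centre each bucket
scores = PCA(n_components=2).fit_transform(X)

for group in ("control", "treated"):
    print(group, "PC1/PC2 centroid:", np.round(scores[labels == group].mean(axis=0), 2))
```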

Relevance:

100.00%

Publisher:

Abstract:

Senior thesis written for Oceanography 445

Relevance:

100.00%

Publisher:

Abstract:

The progressive improvement in health care delivery seen today is largely due to the development of new medical technologies, which translate into innovative medical devices intended to support the diagnosis, prevention and treatment of disease, thereby improving working conditions and the care offered to patients. These improvements, however, are only advantageous when the new technologies are used safely, which has led to growing concern with the safety of health professionals and patients in the hospital environment. As a way of reducing and controlling existing risks, health care units have introduced management mechanisms that provide knowledge of the sources of risk and of the corresponding mechanisms of action. This master's dissertation presents a proposal for a Procedures Manual for Medical Device Risk Management, applicable to all medical devices present in health care units. To create this manual, the risk management stages defined in the ISO 14971:2007 standard were adapted and combined with the risk management method used by the Unidade Local de Saúde de Matosinhos. The development of this procedures manual will allow this health care unit to obtain and provide useful information for decision-making on medical device risk control procedures, with the aim of keeping the risk of these devices within previously established levels and supporting decisions on preventive maintenance programmes and on the acquisition of medical devices.
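
For illustration only: ISO 14971-style procedures estimate risk from the probability of occurrence of harm and the severity of that harm, and compare the result against acceptability criteria defined by the organisation. The sketch below shows that step with an illustrative scale and thresholds; it is not the scheme of the manual proposed in the dissertation or of the Unidade Local de Saúde de Matosinhos.

```python
# Illustrative risk-estimation step: scales and thresholds are assumptions,
# not those of the proposed manual.
SEVERITY = {"negligible": 1, "moderate": 2, "critical": 3}
PROBABILITY = {"improbable": 1, "occasional": 2, "frequent": 3}

def risk_level(severity: str, probability: str) -> str:
    score = SEVERITY[severity] * PROBABILITY[probability]
    if score <= 2:
        return "acceptable"
    if score <= 4:
        return "acceptable with monitoring"   # e.g. reinforced preventive maintenance
    return "unacceptable - risk control required"

print(risk_level("critical", "occasional"))   # -> unacceptable - risk control required
print(risk_level("moderate", "improbable"))   # -> acceptable
```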

Relevance:

100.00%

Publisher:

Abstract:

Progress in Industrial Ecology, An International Journal, 4(5), p. 363-381

Relevance:

100.00%

Publisher:

Abstract:

Transport is an essential sector in modern societies: it connects economic sectors and industries. Alongside its contribution to economic development and social interconnection, it also causes adverse impacts on the environment and results in health hazards. Transport is a major source of ground-level air pollution, especially in urban areas, and therefore contributes to health problems such as cardiovascular and respiratory diseases, cancer, and physical injuries. This thesis presents the results of a health risk assessment that quantifies the mortality and the diseases associated with particulate matter pollution resulting from urban road transport in Hai Phong City, Vietnam. The focus is on the integration of modelling and GIS approaches in the exposure analysis, to increase the accuracy of the assessment and to produce timely and consistent results. The modelling was done to estimate traffic conditions and concentrations of particulate matter based on geo-referenced data. A simplified health risk assessment was also done for Ha Noi, based on monitoring data, which allows a comparison of the results between the two cases. The results of the case studies show that a health risk assessment based on modelled data can provide much more detailed results and allows the health impacts of different mobility development options to be assessed at the micro level. The use of modelling and GIS as a common platform for integrating different assessments (environmental, health, socio-economic, etc.) offers various strengths, especially in capitalising on the available data stored in different units and forms, and allows large amounts of data to be handled. From a decision-making point of view, the use of models and GIS in a health risk assessment can reduce the processing and waiting time while providing views at different scales, from the micro scale (sections of a city) to the macro scale. It also helps visualise the links between air quality and health outcomes, which is useful when discussing different development options. However, a number of improvements can be made to further advance the integration. An improved data integration programme will facilitate the application of integrated models in policy-making. Data from mobility surveys and environmental monitoring and measurement must be standardised and legalised. Various traffic models, together with emission and dispersion models, should be tested, and more attention should be given to their uncertainty and sensitivity.
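
The mortality quantification in assessments of this kind typically rests on a log-linear concentration-response function: a relative risk per unit of PM exposure is converted into an attributable fraction and applied to the baseline mortality of the exposed population (per GIS grid cell, in an integrated model). The sketch below shows that calculation with illustrative numbers; the coefficient and population figures are not those of the Hai Phong or Ha Noi case studies.

```python
import math

def attributable_deaths(delta_pm, beta, baseline_deaths, population_exposed):
    """Log-linear concentration-response health impact estimate.

    delta_pm          : PM concentration above the reference level (ug/m3)
    beta              : concentration-response coefficient (per ug/m3)
    baseline_deaths   : baseline mortality rate (deaths per person per year)
    population_exposed: number of people exposed at this concentration
    """
    rr = math.exp(beta * delta_pm)          # relative risk at this exposure level
    af = (rr - 1.0) / rr                    # population attributable fraction
    return af * baseline_deaths * population_exposed

# Illustrative inputs only (not values from the case studies):
cases = attributable_deaths(delta_pm=20.0, beta=0.0008,
                            baseline_deaths=0.006, population_exposed=100_000)
print(f"Estimated attributable deaths per year: {cases:.1f}")
```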

Relevance:

100.00%

Publisher:

Abstract:

Neurological disorders are a major concern in modern societies, with increasing prevalence mainly related to higher life expectancy. Most of the currently available therapeutic options can only control and ameliorate the patients' symptoms, often becoming refractory over time. Therapeutic breakthroughs and advances have been hampered by the lack of accurate central nervous system (CNS) models. The development of such models allows the study of disease onset/progression mechanisms and the preclinical evaluation of novel therapeutics. This has traditionally relied on genetically engineered animal models, which often diverge considerably from the human phenotype (developmentally, anatomically and physiologically), and on 2D in vitro cell models, which fail to recapitulate the characteristics of the target tissue (cell-cell and cell-matrix interactions, cell polarity). The in vitro recapitulation of CNS phenotypic and functional features requires the implementation of advanced culture strategies that mimic the in vivo structural and molecular complexity. Models based on the differentiation of human neural stem cells (hNSC) in 3D cultures have great potential as complementary tools in preclinical research, bridging the gap between human clinical studies and animal models. This thesis aimed at the development of novel human 3D in vitro CNS models by integrating agitation-based culture systems and a wide array of characterization tools. Neural differentiation of hNSC as 3D neurospheres was explored in Chapter 2. Here, it was demonstrated that human midbrain-derived neural progenitor cells of fetal origin (hmNPC) can generate complex tissue-like structures containing functional dopaminergic neurons, as well as astrocytes and oligodendrocytes. Chapter 3 focused on the development of cellular characterization assays for cell aggregates based on light-sheet fluorescence imaging systems, which resulted in increased spatial resolution both for fixed samples and for live imaging. The applicability of the developed human 3D cell model to preclinical research was explored in Chapter 4 by evaluating the potential of a viral vector candidate for gene therapy. The efficacy and safety of helper-dependent CAV-2 (hd-CAV-2) for gene delivery in human neurons were evaluated, demonstrating increased neuronal tropism, efficient transgene expression and minimal toxicity. The potential of human 3D in vitro CNS models to mimic brain functions was further addressed in Chapter 5. Exploring the use of 13C-labeled substrates and nuclear magnetic resonance (NMR) spectroscopy tools, neural metabolic signatures were evaluated, showing lineage-specific metabolic specialization and the establishment of neuron-astrocytic shuttles upon differentiation. Chapter 6 focused on transferring the knowledge and strategies described in the previous chapters to the implementation of a scalable and robust process for the 3D differentiation of hNSC derived from human induced pluripotent stem cells (hiPSC). Here, software-controlled perfusion stirred-tank bioreactors were used as the technological system to sustain cell aggregation and differentiation. The work developed in this thesis provides practical and versatile new in vitro approaches to model the human brain. Furthermore, the culture strategies described herein can be extended to other sources of neural phenotypes, including patient-derived hiPSC.
The combination of this 3D culture strategy with the implemented characterization methods represents a powerful complementary tool applicable in drug discovery, toxicology and disease modeling.

Relevance:

100.00%

Publisher:

Abstract:

Studies assessing skin irritation by chemicals have traditionally used laboratory animals; however, such methods are questionable regarding their relevance for humans. New in vitro methods have been validated, such as the reconstructed human epidermis (RHE) models (Episkin®, Epiderm®), but their accuracy compared with in vivo results such as the 4-h human patch test (HPT) is 76% at best (Epiderm®). There is a need to develop an in vitro method that better simulates the anatomo-pathological changes encountered in vivo. Our aim was to develop an in vitro method to determine skin irritation using viable human skin and histopathology, and to compare the results for 4 tested substances with those of the main in vitro methods and the in vivo animal method (Draize test). Human skin removed during surgery was dermatomed and mounted on an in vitro flow-through diffusion cell system. Ten chemicals with known non-irritant (heptyl butyrate, hexyl salicylate, butyl methacrylate, isoproturon, bentazon, DEHP and methylisothiazolinone (MI)) or irritant properties (folpet, 1-bromohexane and methylchloroisothiazolinone (MCI/MI)), a negative control (sodium chloride) and a positive control (sodium lauryl sulphate) were applied. The skin was exposed for at least 4 h. Histopathology was performed to investigate signs of irritation (spongiosis, necrosis, vacuolization). We obtained 100% accuracy against the HPT model, versus 75% for the RHE models and 50% for the Draize test, for the 4 tested substances. The coefficients of variation (CV) between our three test batches were <0.1, showing good reproducibility. Furthermore, we objectively graded histopathological signs of irritation (irritation scale): strong (folpet), significant (1-bromohexane), slight (MCI/MI at 750/250 ppm) and none (isoproturon, bentazon, DEHP and MI). This new in vitro test method gave effective results for the tested chemicals. It should be further validated using a greater number of substances, and tested in different laboratories, in order to properly evaluate reproducibility.
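
For clarity, the two summary statistics reported above can be read as a simple concordance count against a reference classification and a coefficient of variation across batches. The classifications and batch scores in the sketch below are illustrative, not the study's data.

```python
import statistics

# Concordance (accuracy) of a test method against a reference classification
# for four compared substances (illustrative labels only).
reference = {"substance_A": "irritant", "substance_B": "non-irritant",
             "substance_C": "irritant", "substance_D": "non-irritant"}
test_method = {"substance_A": "irritant", "substance_B": "non-irritant",
               "substance_C": "non-irritant", "substance_D": "non-irritant"}

matches = sum(test_method[s] == reference[s] for s in reference)
print(f"accuracy: {matches / len(reference):.0%}")          # 3/4 -> 75%

# Reproducibility across test batches, expressed as a coefficient of variation.
batch_scores = [0.82, 0.79, 0.85]                            # illustrative batch results
cv = statistics.stdev(batch_scores) / statistics.mean(batch_scores)
print(f"coefficient of variation: {cv:.2f}")                 # < 0.1 indicates good reproducibility
```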

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Red blood cell-derived microparticles (RMPs) are small phospholipid vesicles shed from RBCs in blood units, where they accumulate during storage. Because microparticles are bioactive, RMPs could be mediators of posttransfusion complications or, on the contrary, constitute a potential hemostatic agent. STUDY DESIGN AND METHODS: This study was performed to establish the impact on coagulation of RMPs isolated from blood units. Using calibrated automated thrombography, we investigated whether RMPs affect thrombin generation (TG) in plasma. RESULTS: We found that RMPs were not only able to increase TG in plasma in the presence of a low exogenous tissue factor (TF) concentration, but also to initiate TG in plasma in the absence of exogenous TF. TG induced by RMPs in the absence of exogenous TF was affected neither by the presence of blocking anti-TF antibodies nor by the absence of Factor (F)VII. It was significantly reduced in plasma deficient in FVIII or FIX and abolished in FII-, FV-, FX- or FXI-deficient plasma. TG was also totally abolished when the anti-FXI antibody 01A6 was added to the sample. Finally, neither Western blotting, flow cytometry, nor immunogold labeling allowed the detection of traces of TF antigen. In addition, RMPs did not contain polyphosphate, an important modulator of coagulation. CONCLUSIONS: Taken together, our data show that RMPs have FXI-dependent procoagulant properties and are able to initiate and propagate TG. The anionic surface of RMPs might be the site of FXI-mediated TG amplification and of intrinsic tenase and prothrombinase complex assembly.

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT - Multispectral satellite images, particularly those with high spatial resolution (finer than 30 m on the ground), are an invaluable source of information for decision-making in various fields related to natural resource management, environmental protection, and urban planning and management. Study scales can range from local (resolutions finer than 5 m) to regional (resolutions coarser than 5 m). These images characterize the variation of object reflectance across the spectrum, which is the key information for a large number of applications of these data. However, satellite sensor measurements are also affected by "parasitic" factors related to illumination and viewing conditions, the atmosphere, topography and sensor properties. Two questions guided this research. What is the best approach for retrieving ground reflectances from the digital numbers recorded by the sensors, taking these parasitic factors into account? And is this retrieval a sine qua non condition for extracting reliable information from the images, given the problems specific to the different application domains (land mapping, environmental monitoring, landscape change detection, resource inventories, etc.)? Research carried out over the last 30 years has produced a series of techniques for correcting the data for the effects of parasitic factors, some of which make it possible to retrieve ground reflectances. Several questions nevertheless remain open, and others require further work, both to improve the accuracy of the results and to make these techniques more versatile by adapting them to a wider range of data acquisition conditions. A few can be mentioned:
- How can atmospheric characteristics (notably aerosol particles) adapted to local and regional conditions be taken into account, rather than relying on default models that describe long-term spatio-temporal trends but fit poorly with instantaneous, spatially restricted observations?
- How can the "contamination" of the signal from the target object by signals from neighbouring objects (the adjacency effect) be taken into account? This phenomenon becomes very important for images with resolutions finer than 5 m.
- What are the effects of off-nadir viewing angles, which are increasingly common since they offer better temporal resolution and the possibility of acquiring stereoscopic image pairs?
- How can the efficiency of automatic processing and analysis techniques for multispectral images be increased over rugged and mountainous terrain, taking into account the multiple effects of topographic relief on the remotely sensed signal?
Moreover, although researchers have repeatedly demonstrated that the information extracted from satellite images can be degraded by all these parasitic factors, radiometric corrections are still rarely applied on a routine basis, unlike geometric corrections. For the latter, commercial remote sensing software packages offer versatile, powerful algorithms that are accessible to users.
Radiometric correction algorithms, when they are offered at all, remain inflexible black boxes that usually require expert users. The objectives of this research were the following: 1) to develop ground reflectance retrieval software that addresses the questions raised above, modular enough to be extended, improved and adapted to various satellite image applications; and 2) to apply this software in different contexts (urban, agricultural, forest) and analyse the results, in order to evaluate the gain in accuracy of the information extracted from satellite images converted to ground reflectance images, and hence the need to proceed in this way regardless of the application. Through this research, we therefore built a ground reflectance retrieval tool (the new version of the REFLECT software). This software is based on the formulation (and routines) of the 6S code (Second Simulation of the Satellite Signal in the Solar Spectrum) and on the dark target method for estimating aerosol optical depth (AOD), which is the most difficult factor to correct. Substantial improvements were made to the existing models. These improvements essentially concern aerosol properties (integration of a more recent model, improved dark target search for AOD estimation), accounting for the adjacency effect using a specular reflection model, support for most of the high-resolution multispectral sensors currently in use (Landsat TM and ETM+, all SPOT 1 to 5 HR sensors, EO-1 ALI and ASTER) and very high resolution sensors (QuickBird and Ikonos), and correction of topographic effects using a model that separates the direct and diffuse components of solar radiation and also adapts to forest canopy. Validation work showed that REFLECT retrieves ground reflectance with an accuracy of about ±0.01 reflectance units (for the visible, near-infrared and mid-infrared spectral bands), even for surfaces with variable topography. Through simulations of apparent reflectances, the software made it possible to show how strongly the parasitic factors affecting the digital numbers of the images can alter the useful signal, namely the ground reflectance (errors of 10 to more than 50%). REFLECT was also used to assess the importance of using ground reflectances rather than raw digital numbers in various common remote sensing applications in classification, change detection, agriculture and forestry. For most applications (multi-date change monitoring, use of vegetation indices, estimation of biophysical parameters, ...), image correction is a crucial step for obtaining reliable results.
From a software point of view, REFLECT is organised as a series of easy-to-use menus corresponding to the different steps: entering the scene inputs, computing gaseous transmittances, estimating the AOD with the dark target method and, finally, applying the radiometric corrections to the image, notably through a fast option that processes a 5000 × 5000 pixel image in about 15 minutes. This research opens up several avenues for further improvement of the models and methods related to radiometric corrections, in particular the integration of the BRDF (bidirectional reflectance distribution function) into the formulation, the handling of translucent clouds through modelling of non-selective scattering, and the automation of the equivalent-slopes method proposed for topographic corrections.
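
A minimal sketch of the top-of-atmosphere to ground-reflectance inversion that codes in the 6S family implement, assuming the widely used single-layer correction equation. The coefficient values in the example are illustrative, and REFLECT itself additionally handles the adjacency and topographic effects described above, which this sketch omits.

```python
def surface_reflectance(rho_toa, t_gas, rho_atm, t_down, t_up, s_alb):
    """Invert the standard single-layer atmospheric-correction equation
    rho_toa = t_gas * (rho_atm + t_down*t_up*rho_s / (1 - s_alb*rho_s))
    to recover the ground reflectance rho_s.

    rho_toa : apparent (top-of-atmosphere) reflectance
    t_gas   : total gaseous transmittance
    rho_atm : atmospheric (path) reflectance
    t_down  : total downward scattering transmittance
    t_up    : total upward scattering transmittance
    s_alb   : spherical albedo of the atmosphere
    """
    y = (rho_toa / t_gas - rho_atm) / (t_down * t_up)
    return y / (1.0 + s_alb * y)

# Illustrative coefficients only; in practice they come from a radiative
# transfer code such as 6S for the scene's geometry, aerosol and gas content.
print(round(surface_reflectance(0.12, 0.95, 0.04, 0.85, 0.90, 0.10), 3))
```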