866 results for modelling and simulation


Relevance:

90.00%

Publisher:

Abstract:

This thesis investigates the problems caused by common-mode currents in a low-power multi-motor frequency converter drive. The goal is to produce guidance for implementers of multi-motor drives so that the special requirements of such drives can be taken into account in the early stages of electric drive design. The theoretical part presents the problems caused by the frequency converter's inverter stage, discusses the modelling of the motor cable, and compares solutions to these problems from the perspective of low-power frequency converters. The common-mode current is studied by measurements on a purpose-built multi-motor drive and by simulations in which the cable length and the number of motors are varied. Current filtering solutions are also compared by measurement. The simulation model developed is very simple, and more reliable current values are obtained by measurement. Nevertheless, the work supports several guidelines for the design of multi-motor drives: for example, separate protection must be designed for each motor, and the frequency converter should be dimensioned according to the actual load current. The problems caused by the common-mode current become smaller the fewer motors are connected in parallel and the shorter the total length of the motor cables is.
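The abstract notes that common-mode problems grow with the number of parallel motors and the total motor cable length. As a rough illustration of that scaling only (not the simulation model built in the thesis), the sketch below treats the common-mode path as a lumped capacitance proportional to cable length and motor count, driven by the inverter's common-mode voltage slew; every parameter value is an assumption.

```python
# Illustrative first-order estimate of common-mode (CM) current peaks in a
# multi-motor drive; all parameter values are assumptions, not thesis data.

def peak_cm_current(n_motors, cable_length_m,
                    c_cable_per_m=150e-12,   # cable CM capacitance per metre (assumed)
                    c_motor=1.5e-9,          # per-motor winding-to-frame capacitance (assumed)
                    dv_dt=2e9):              # inverter common-mode dv/dt in V/s (assumed)
    """i_cm ~ C_cm * dv/dt for a capacitively coupled CM path."""
    c_cm = n_motors * c_motor + cable_length_m * c_cable_per_m
    return c_cm * dv_dt

for motors in (1, 2, 4):
    for length in (10, 50, 100):
        i_pk = peak_cm_current(motors, length)
        print(f"{motors} motor(s), {length} m cable -> ~{i_pk:.1f} A peak CM current")
```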

Relevance:

90.00%

Publisher:

Abstract:

Cells of epithelial origin, e.g. from breast and prostate cancers, effectively differentiate into complex multicellular structures when cultured in three dimensions (3D) instead of on conventional two-dimensional (2D) adherent surfaces. The spectrum of different organotypic morphologies is highly dependent on the culture environment, which can be either non-adherent or scaffold-based. When embedded in physiological extracellular matrices (ECMs), such as laminin-rich basement membrane extracts, normal epithelial cells differentiate into acinar spheroids reminiscent of glandular ductal structures. Transformed cancer cells, in contrast, typically fail to undergo acinar morphogenic patterns, forming poorly differentiated or invasive multicellular structures. 3D cancer spheroids are widely accepted to better recapitulate various tumorigenic processes and drug responses. So far, however, 3D models have been employed predominantly in academia, whereas the pharmaceutical industry has yet to adopt them for wider and more routine use. This is mainly due to the poor characterisation of cell models, the lack of standardised workflows and high-throughput cell culture platforms, and the limited availability of proper readout and quantification tools. In this thesis, a complete workflow has been established, entailing well-characterised 3D cell culture models for prostate cancer, a standardised 3D cell culture routine based on a high-throughput-ready platform, automated image acquisition with concomitant morphometric image analysis, and data visualisation, in order to enable large-scale high-content screens. Our integrated suite of software and statistical analysis tools was optimised and validated using a comprehensive panel of prostate cancer cell lines and 3D models. The tools quantify multiple key cancer-relevant morphological features, ranging from cancer cell invasion through multicellular differentiation to growth, and detect dynamic changes both in morphology and in function, such as cell death and apoptosis, in response to experimental perturbations including RNA interference and small molecule inhibitors. Our panel of cell lines included many non-transformed and most currently available classic prostate cancer cell lines, which were characterised for their morphogenetic properties in 3D laminin-rich ECM. The phenotypes and gene expression profiles were evaluated concerning their relevance for pre-clinical drug discovery, disease modelling and basic research. In addition, a spontaneous model for invasive transformation was discovered, displaying a high degree of epithelial plasticity. This plasticity is mediated by an abundant bioactive serum lipid, lysophosphatidic acid (LPA), and its receptor LPAR1. The invasive transformation was caused by abrupt cytoskeletal rearrangement through impaired G protein alpha 12/13 and RhoA/ROCK signalling, and mediated by upregulated adenylyl cyclase/cyclic AMP (cAMP)/protein kinase A and Rac/PAK pathways. The spontaneous invasion model tangibly exemplifies the biological relevance of organotypic cell culture models. Overall, this thesis work underlines the power of novel morphometric screening tools in drug discovery.
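As an illustration of the kind of morphometric readout described above, the following sketch computes a few per-structure shape descriptors from a labelled segmentation mask with scikit-image; the feature selection and the toy mask are assumptions for demonstration, and this is not the thesis's actual analysis pipeline.

```python
# Minimal sketch: quantify per-spheroid morphology from a binary mask with
# scikit-image. Illustrative only; not the thesis's analysis software.
import numpy as np
from skimage import measure

def spheroid_features(mask: np.ndarray) -> list[dict]:
    """Return simple shape descriptors for each connected structure in a 2D mask."""
    labels = measure.label(mask > 0)
    feats = []
    for region in measure.regionprops(labels):
        perimeter = region.perimeter or 1.0
        feats.append({
            "label": region.label,
            "area": region.area,
            "eccentricity": region.eccentricity,              # 0 = circle-like (acinar)
            "solidity": region.solidity,                      # low values suggest branching/invasion
            "roundness": 4 * np.pi * region.area / perimeter ** 2,
        })
    return feats

if __name__ == "__main__":
    demo = np.zeros((64, 64), dtype=np.uint8)
    demo[20:40, 20:40] = 1                                    # a toy "spheroid"
    print(spheroid_features(demo))
```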

Relevance:

90.00%

Publisher:

Abstract:

Welding has a growing role in modern manufacturing. Welded joints are used extensively, from pipes to the aerospace industry. Prediction of welding residual stresses and distortions is necessary for an accurate evaluation of fillet welds with respect to design and safety requirements. Residual stresses may be beneficial or detrimental, depending on whether they are tensile or compressive and on the loading. They directly affect the fatigue life of the weld by affecting the crack growth rate. Besides the theoretical background of residual stresses, this study calculates the residual stresses and deformations in fillet welds caused by the localized heating of the welding process and the subsequent rapid cooling. Validated methods are required for this purpose because of the complexity of the process: localized heating, the temperature dependence of material properties, and the heat source. In this research, both empirical and simulation methods were used for the analysis of welded joints. Finite element simulation has become a popular tool for predicting welding residual stresses and distortion. Three different cases, with and without preload, were modeled in this study. The thermal load is applied by calculating the heat flux from the given heat input energy. First a linear and then a nonlinear material behavior model is used for the calculation of residual stresses. Experimental work is carried out to determine the stresses empirically, and the results from the two methods are compared to check their reliability. Residual stresses can have a significant effect on the fatigue performance of welded joints made of high-strength steel. Both the initial residual stress state and the subsequent residual stress relaxation need to be considered for an accurate description of fatigue behavior. Tensile residual stresses are detrimental and will reduce the fatigue life, whereas compressive residual stresses will increase it. The residual stresses scale with the yield strength of the base or filler material, and components made of high-strength steel are typically thin, so the role of distortion is emphasized.
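The thermal load mentioned above is derived from the welding heat input. A minimal sketch of that standard calculation is given below; the arc efficiency, current, voltage, travel speed and heated area are assumed example values, not the parameters of this study.

```python
# Illustrative welding heat-input calculation; numbers are assumed examples,
# not the parameters used in the study.

def heat_input_per_length(voltage_v, current_a, travel_speed_mm_s, efficiency=0.8):
    """Heat input Q [J/mm] = eta * U * I / v (common arc-welding formula)."""
    return efficiency * voltage_v * current_a / travel_speed_mm_s

def surface_heat_flux(voltage_v, current_a, heated_area_mm2, efficiency=0.8):
    """Average flux [W/mm^2] if the arc power is spread over a given area."""
    return efficiency * voltage_v * current_a / heated_area_mm2

q = heat_input_per_length(voltage_v=25.0, current_a=250.0, travel_speed_mm_s=5.0)
flux = surface_heat_flux(voltage_v=25.0, current_a=250.0, heated_area_mm2=50.0)
print(f"heat input ~{q:.0f} J/mm, average flux ~{flux:.0f} W/mm^2")
```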

Relevance:

90.00%

Publisher:

Abstract:

Industrial applications demand that robots be operated in terms of the position and orientation of their end effector, which makes it necessary to solve the inverse kinematics problem. Solving it determines the joint displacements of the manipulator required to accomplish a given objective. A complete study of the dynamic control of robotic joints is also necessary. Initially, this article focuses on the implementation of numerical algorithms for the solution of the inverse kinematics problem and on the modeling and simulation of dynamic systems, using a real-time implementation. The modeling and simulation of the dynamic systems are performed with an emphasis on off-line programming. Next, a complete study of the control strategies is carried out through the study of the elements of a robotic joint, such as the DC motor, inertia and gearbox. Finally, a trajectory generator, used as input for a generic group of joints, is developed, and a proposal for implementing the joint controllers on an EPLD development system is presented.
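As an illustration of a numerical solution of the inverse kinematics problem, the sketch below applies a damped Jacobian (Newton-style) iteration to a planar two-link arm; the link lengths, target and damping factor are arbitrary assumptions, and this is not the article's manipulator or algorithm.

```python
# Illustrative Jacobian-based inverse kinematics for a planar 2-link arm.
# Arbitrary example, not the manipulator or algorithm of the article.
import numpy as np

L1, L2 = 1.0, 0.8  # assumed link lengths [m]

def forward(q):
    """End-effector position for joint angles q = [q1, q2]."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def solve_ik(target, q0, iters=100, damping=1e-3, tol=1e-6):
    """Damped least-squares (Levenberg-Marquardt style) IK iteration."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - forward(q)
        if np.linalg.norm(err) < tol:
            break
        J = jacobian(q)
        dq = np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ err)
        q += dq
    return q

q_sol = solve_ik(target=np.array([1.2, 0.6]), q0=[0.3, 0.3])
print("joint angles:", q_sol, "reached:", forward(q_sol))
```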

Relevance:

90.00%

Publisher:

Abstract:

The report presents the results of the commercialization project Container logistic services for forest bioenergy. The project promotes the new business that is emerging around overall container logistics services in the bioenergy sector. The results assess the European market for container logistics for biomass, the enablers for new business creation and the service bundles required for the concept. We also demonstrate the customer value of the container logistics services for different market segments. The concept analysis is based on concept mapping, the quality function deployment (QFD) process and business network analysis. The business network analysis assesses the key stakeholders and their mutual connections. The performance of the roadside chipping chain is analysed by logistics cost simulation, an RFID system demonstration and freezing tests. The EU has set a renewable energy target of 20% for 2020, of which biomass could account for two-thirds. In Europe, the production of wood fuels was 132.9 million solid m3 in 2012, and the production of wood chips and particles was 69.0 million solid m3. The wood-based chip and particle flows are suitable for container transportation, providing a market of 180.6 million loose m3, which means 4.5 million container loads per year. Intermodal truck and train logistics are promising for composite containers because the biomass does not freeze onto the inner surfaces during unloading. The overall service concept includes several packages: container rental, container maintenance, terminal services, an RFID tracking service, and a simulation and ERP integration service. Container rental and maintenance would provide transportation entrepreneurs with a way to increase capacity without high investment costs. The RFID concept would lead to better work planning, improving profitability throughout the logistics chain, and the simulation supports fuel supply optimization.
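The market figures quoted above imply a solid-to-loose conversion factor of roughly 2.6 and a container payload of roughly 40 loose m3. The short sketch below only reproduces that arithmetic; both factors are back-calculated from the reported numbers rather than taken from the report's methodology.

```python
# Back-of-the-envelope reproduction of the market figures in the abstract.
# The conversion factor and container payload are inferred from the reported
# numbers (69.0 -> 180.6 million m3, and 180.6 million m3 -> 4.5 million loads).

chips_solid_m3 = 69.0e6                 # wood chips and particles, solid m3 (2012)
solid_to_loose = 180.6e6 / 69.0e6       # ~2.62 loose m3 per solid m3 (implied)
loose_per_container = 180.6e6 / 4.5e6   # ~40 loose m3 per container load (implied)

loose_m3 = chips_solid_m3 * solid_to_loose
container_loads = loose_m3 / loose_per_container

print(f"loose volume: {loose_m3 / 1e6:.1f} million m3")                 # ~180.6
print(f"container loads per year: {container_loads / 1e6:.1f} million")  # ~4.5
```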

Relevance:

90.00%

Publisher:

Abstract:

Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information and offering better information visibility to business ecosystem actors. The product, component and raw material flows in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the processes parallel to the service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBEs in which information logistics integration has a significant role as a value driver. However, traditional economic and computing theories do not focus on digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks that can be used to explore digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not the current practice of a company's strategic process. In this thesis, we have developed and tested a framework to explore the digital business ecosystems developed and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which sought cost savings, and on Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we utilized the design of the core information model for B2B integration. We built the quantitative analysis using the Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, in which the current literature needs to be improved. This research was executed by high-level experts and managers responsible for global business network B2B integration. However, the research was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Based on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to collaboration issues in integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formulation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
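The quantitative analysis combines a Monte Carlo cost-savings simulation with Real Option Valuation. A generic sketch of the Monte Carlo part is shown below; the cost distributions and all parameter values are invented for illustration and do not represent the thesis's model or data.

```python
# Generic Monte Carlo sketch of B2B-integration cost savings; all
# distributions and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Assumed inputs: transactions per year, manual vs. automated handling cost.
transactions = rng.normal(50_000, 5_000, N).clip(min=0)
manual_cost = rng.triangular(4.0, 6.0, 9.0, N)      # EUR per manual transaction
auto_cost = rng.triangular(0.5, 1.0, 2.0, N)        # EUR per automated transaction
automation_rate = rng.uniform(0.6, 0.9, N)          # share of flows integrated

savings = transactions * automation_rate * (manual_cost - auto_cost)

print(f"mean annual savings : {savings.mean():,.0f} EUR")
print(f"5th-95th percentile : {np.percentile(savings, 5):,.0f} "
      f"- {np.percentile(savings, 95):,.0f} EUR")
```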

Relevance:

90.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is one of the most established quantitative tools for the environmental impact assessment of products. To be able to support environmentally aware decision makers on the environmental impacts of biomass value chains, the scope of the LCA methodology needs to be augmented to cover land-use-related environmental impacts. This dissertation analyses and discusses potential impact assessment methods, conceptual models and environmental indicators that have been proposed for implementation into the LCA framework to cover the impacts of land use. The applicability of the proposed indicators and impact assessment frameworks is tested from the practitioner's perspective, focusing especially on forest biomass value chains. The impacts of land use on biodiversity, resource depletion, climate change and other ecosystem services are analysed and discussed, and the interplay between value choices in LCA modelling and the decision-making situations to be supported is critically examined. It was found that land-use impact indicators are necessary in LCA for highlighting differences in impacts between distinct land-use classes. However, many open questions remain about how certainly the actual impacts of land use can be captured, especially regarding the impacts of managed forest land use on biodiversity and on ecosystem services such as water regulation and purification. The climate impact of the energy use of boreal stemwood was found to be higher in the short term and lower in the long term in comparison with fossil fuels that emit an identical amount of CO2 in combustion, owing to the changes induced in forest carbon stocks. The climate impacts of the energy use of boreal stemwood were found to be higher than the previous estimates on forest residues and stumps suggest. The product lifetime was found to have a much greater influence on the climate impacts of wood-based value chains than the origin of the stemwood in either thinnings or final fellings. Climate neutrality seems likely only in the case where almost all the carbon of the harvested wood is stored in long-lived wooden products. In their current form, the land-use impacts cannot be modelled with a high degree of certainty nor communicated with an adequate level of clarity to decision makers. Academia needs to keep improving the modelling framework and, more importantly, clearly communicate to decision makers the limited certainty on whether land-use-intensive activities can help in meeting the strict mitigation targets we are globally facing.

Relevance:

90.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. The design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the functional correctness of the system, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support in the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
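The thesis integrates Event-B development with quantitative techniques such as discrete-event simulation in SimPy. The small SimPy sketch below shows the type of question such a simulation can answer, namely the availability of a component that fails and is reconfigured onto a spare; the failure and reconfiguration rates are invented, and the model is not taken from the thesis's case studies.

```python
# Generic SimPy sketch: availability of a component that fails and is
# reconfigured onto a spare. Rates are invented, not the thesis's models.
import random
import simpy

MTTF = 100.0      # mean time to failure (assumed)
RECONF = 2.0      # mean reconfiguration/repair delay (assumed)
SIM_TIME = 100_000

downtime = 0.0

def system(env):
    global downtime
    while True:
        yield env.timeout(random.expovariate(1.0 / MTTF))    # operate until failure
        t_fail = env.now
        yield env.timeout(random.expovariate(1.0 / RECONF))  # reconfigure to spare
        downtime += env.now - t_fail

random.seed(1)
env = simpy.Environment()
env.process(system(env))
env.run(until=SIM_TIME)

print(f"estimated availability: {1 - downtime / SIM_TIME:.4f}")
```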

Relevance:

90.00%

Publisher:

Abstract:

Wind turbines based on doubly fed induction generators (DFIG) have become the most popular solution in the high-power wind generation industry. While this topology provides good performance with a reduced power rating of the power converter, it has a more complicated structure than full-rated topologies and therefore leads to more complex control algorithms and electromechanical processes in the system. The purpose of the presented study is to develop a proper vector control scheme for the DFIG and an overall control for the wind turbine (WT), and to investigate its behavior at different wind speeds and in different grid voltage conditions: voltage sags, and magnitude and frequency variations. The key principles of the variable-speed wind turbine were implemented in a simulation model and demonstrated during the study. Then, based on the developed control scheme and mathematical model, a set of simulations is carried out to analyze the reactive power capabilities of the DFIG wind turbine. Further, the rating of the rotor-side converter is modified not only to generate the rated active power but also to fulfill the grid codes. Results of the modelling and analysis of the DFIG WT behavior at different speeds and in different voltage conditions are presented in the work.
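A key property of the DFIG topology mentioned above is the reduced converter rating: neglecting losses, the rotor-side converter carries only the slip power, roughly s times the stator power. The sketch below illustrates this standard relation for an assumed two-megawatt machine and a ±30% speed range; it is not the control scheme or rating study of the thesis.

```python
# Illustrative slip-power calculation for a DFIG; values are generic
# textbook assumptions, not the converter rating study of the thesis.

def rotor_power(stator_power_mw, slip):
    """Rotor (converter) power P_r ~ -s * P_s for a DFIG, neglecting losses."""
    return -slip * stator_power_mw

p_stator = 2.0  # MW, assumed stator power
for slip in (-0.3, -0.1, 0.0, 0.1, 0.3):  # +/-30 % speed range around synchronous
    p_r = rotor_power(p_stator, slip)
    print(f"slip {slip:+.1f}: rotor power {p_r:+.2f} MW "
          f"({abs(p_r) / p_stator:.0%} of stator power)")
```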

Relevance:

90.00%

Publisher:

Abstract:

This thesis addresses the coolability of porous debris beds in the context of severe accident management of nuclear power reactors. In a hypothetical severe accident at a Nordic-type boiling water reactor, the lower drywell of the containment is flooded, for the purpose of cooling the core melt discharged from the reactor pressure vessel in a water pool. The melt is fragmented and solidified in the pool, ultimately forming a porous debris bed that generates decay heat. The properties of the bed determine the limiting value for the heat flux that can be removed from the debris to the surrounding water without the risk of re-melting. The coolability of porous debris beds has been investigated experimentally by measuring the dryout power in electrically heated test beds that have different geometries. The geometries represent the debris bed shapes that may form in an accident scenario. The focus is especially on heap-like, realistic geometries which facilitate the multi-dimensional infiltration (flooding) of coolant into the bed. Spherical and irregular particles have been used to simulate the debris. The experiments have been modeled using 2D and 3D simulation codes applicable to fluid flow and heat transfer in porous media. Based on the experimental and simulation results, an interpretation of the dryout behavior in complex debris bed geometries is presented, and the validity of the codes and models for dryout predictions is evaluated. According to the experimental and simulation results, the coolability of the debris bed depends on both the flooding mode and the height of the bed. In the experiments, it was found that multi-dimensional flooding increases the dryout heat flux and coolability in a heap-shaped debris bed by 47–58% compared to the dryout heat flux of a classical, top-flooded bed of the same height. However, heap-like beds are higher than flat, top-flooded beds, which results in the formation of larger steam flux at the top of the bed. This counteracts the effect of the multi-dimensional flooding. Based on the measured dryout heat fluxes, the maximum height of a heap-like bed can only be about 1.5 times the height of a top-flooded, cylindrical bed in order to preserve the direct benefit from the multi-dimensional flooding. In addition, studies were conducted to evaluate the hydrodynamically representative effective particle diameter, which is applied in simulation models to describe debris beds that consist of irregular particles with considerable size variation. The results suggest that the effective diameter is small, closest to the mean diameter based on the number or length of particles.
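The effective particle diameter discussed above refers to different ways of averaging a particle size distribution. The sketch below evaluates the standard moment-based mean diameters (number, length, Sauter and mass means) for an arbitrary example distribution; the size classes are assumptions, and only the generic definitions are shown, not results of the study.

```python
# Standard moment-based mean diameters for a particle size distribution.
# The example counts/sizes are arbitrary; the definitions are generic.
import numpy as np

d = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # particle diameters [mm] (example)
n = np.array([500, 300, 120, 60, 20])          # number of particles in each class

def mean_diameter(p, q):
    """D[p,q] = (sum n d^p / sum n d^q) ** (1 / (p - q))."""
    return (np.sum(n * d**p) / np.sum(n * d**q)) ** (1.0 / (p - q))

print(f"number mean  D[1,0] = {mean_diameter(1, 0):.2f} mm")
print(f"length mean  D[2,1] = {mean_diameter(2, 1):.2f} mm")
print(f"Sauter mean  D[3,2] = {mean_diameter(3, 2):.2f} mm")
print(f"mass mean    D[4,3] = {mean_diameter(4, 3):.2f} mm")
```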

Relevance:

90.00%

Publisher:

Abstract:

In today's complex and volatile business environment, companies that are able to turn the operational data they generate into data warehouses can achieve a significant competitive advantage. Using predictive analytics to anticipate future trends enables companies to identify the key factors with which they can differentiate themselves from their competitors. Using predictive analytics as part of the decision-making process enables more agile, real-time decision making. The purpose of this Master's thesis is to compile a theoretical framework for analytics modelling from the perspective of a business end user and to apply this modelling process to the case company of the thesis. The theoretical model was applied to customer modelling and to identifying leading indicators for sales forecasting. The work was carried out for a Finnish wholesaler of industrial filters with operations in Finland, Russia and the Baltic countries. This study is a quantitative case study in which the case company's transaction data was used as the most important data collection method. The data for the work was obtained from the company's enterprise resource planning (ERP) system.
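As a minimal illustration of a leading-indicator sales forecast of the kind described above, the sketch below fits an ordinary least-squares model to invented monthly data; it is not the thesis's model, and the figures are not the case company's data.

```python
# Minimal leading-indicator sales forecast via ordinary least squares.
# The monthly data below are invented; this is not the case company's data.
import numpy as np

# Invented history: order inquiries (leading indicator) and next-month sales.
inquiries = np.array([120, 135, 150, 160, 155, 170, 180, 175, 190, 200], float)
sales_next = np.array([95, 102, 110, 118, 115, 124, 130, 128, 138, 144], float)

X = np.column_stack([np.ones_like(inquiries), inquiries])   # intercept + indicator
coef, *_ = np.linalg.lstsq(X, sales_next, rcond=None)

new_inquiries = 210.0
forecast = coef[0] + coef[1] * new_inquiries
print(f"fitted model: sales ~ {coef[0]:.1f} + {coef[1]:.2f} * inquiries")
print(f"forecast for next month: {forecast:.0f} units")
```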

Relevance:

90.00%

Publisher:

Abstract:

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was always done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon; first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage is an indicator of the high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: the network structure and coordination, the transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software so that we can prove its correctness, together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node can be separated from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity compared to creating the models from written specifications. Finally, we consider the decoding part of a media distribution system by showing how video decoding can be done in parallel. This is based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. The modelling and proving in this thesis is, for the most part, tool-based. This demonstrates the maturity of formal methods as well as their increased reliability, and thus advocates their more widespread usage in the future.
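One of the contributions above is adapting BitTorrent's piece selection for on-demand streaming. A common way to express the idea, sketched here purely as an illustration rather than as the thesis's verified Event-B model, is to restrict rarest-first selection to a sliding window ahead of the playback position, with an urgent in-order region at the front of the window.

```python
# Illustrative sliding-window piece selection for streaming (not the thesis's
# Event-B model): pick the rarest missing piece within a window ahead of the
# playback position, preferring the in-order "urgent" pieces just ahead of it.

def select_piece(playback_pos, have, availability, window=16):
    """Return the index of the next piece to request, or None."""
    end = min(playback_pos + window, len(availability))
    candidates = [i for i in range(playback_pos, end) if i not in have]
    if not candidates:
        return None
    # Urgent region: the first few pieces are needed in order for smooth playback.
    urgent = [i for i in candidates if i < playback_pos + 3]
    if urgent:
        return min(urgent)
    # Otherwise choose the rarest piece in the window (classic rarest-first).
    return min(candidates, key=lambda i: availability[i])

availability = [5, 4, 1, 3, 2, 6, 1, 2, 4, 3, 2, 1, 5, 2, 3, 1, 2, 4, 2, 1]
have = {0, 1, 2}
print(select_piece(playback_pos=3, have=have, availability=availability))
```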

Relevance:

90.00%

Publisher:

Abstract:

Social insects are known for their ability to display swarm intelligence, where the cognitive capabilities of the collective surpass those of the individuals forming it by orders of magnitude. The rise of crowdsourcing in recent years has sparked speculation as to whether something similar might be taking place on crowdsourcing sites, where hundreds or thousands of people interact with each other. The phenomenon has been dubbed collective intelligence. This thesis focuses on exploring the role of collective intelligence in crowdsourcing innovations. The task is approached through three research questions: 1) what is collective intelligence; 2) how is collective intelligence manifested in websites involved in crowdsourcing innovation; and 3) how important is collective intelligence for the functioning of the crowdsourcing sites. After developing a theoretical framework for collective intelligence, a multiple case study is conducted using an ethnographic data collection approach for the most part. A variety of qualitative, quantitative and simulation modelling methods are used to analyse the complex phenomenon from several theoretical viewpoints or ‘lenses’. Two possible manifestations of collective intelligence are identified: discussion, typical of web forums; and the wisdom of crowds in evaluating crowd submissions to websites. However, neither of these appears to be specific to crowdsourcing or critical for the functioning of the sites. Collective intelligence appears to play only a minor role in the cases investigated here. In addition, this thesis shows that feedback loops, which are found in all the cases investigated, reduce the accuracy of the crowd’s evaluations when a count of votes is used for aggregation.
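The closing observation, that feedback loops reduce the accuracy of crowd evaluations when votes are simply counted, can be illustrated with a small herding simulation: when later voters see the running count and are partly swayed by it, early random fluctuations get amplified. The sketch below is a generic model with invented parameters, not the simulation used in the thesis.

```python
# Generic herding simulation: independent votes vs. votes influenced by the
# running count. Parameters are invented; not the thesis's simulation model.
import random

def run_trials(n_trials=2000, n_voters=200, p_correct=0.6, herd_weight=0.7):
    correct_independent = correct_feedback = 0
    for _ in range(n_trials):
        # Independent voters: each is right with probability p_correct.
        indep = sum(random.random() < p_correct for _ in range(n_voters))
        correct_independent += indep > n_voters / 2

        # Feedback condition: later voters may simply copy the visible majority.
        yes = 0
        for k in range(n_voters):
            if k > 0 and random.random() < herd_weight:
                vote = yes / k > 0.5               # follow the running count
            else:
                vote = random.random() < p_correct  # vote on own judgement
            yes += vote
        correct_feedback += yes > n_voters / 2
    return correct_independent / n_trials, correct_feedback / n_trials

random.seed(0)
ind, fb = run_trials()
print(f"majority correct, independent votes: {ind:.2%}")
print(f"majority correct, with feedback    : {fb:.2%}")
```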

Relevance:

90.00%

Publisher:

Abstract:

This study examined the effect of explicitly instructing students to use a repertoire of reading comprehension strategies. Specifically, this study examined whether providing students with a "predictive story-frame" which combined the use of prediction and summarization strategies improved their reading comprehension relative to providing students with generic instruction on prediction and summarization. Results were examined in terms of instructional condition and reading ability. Students from 2 grade 4 classes participated in this study. The reading component of the Canadian Achievement Tests, Second Edition (CAT/2) was used to identify students as either "average or above average" or "below average" readers. Students received either strategic prediction and summarization instruction (story-frame) or generic prediction and summarization instruction (notepad). Students were provided with new but comparable stories for each session. For both groups, the researcher modelled the strategic tools and provided guided practice, independent practice, and independent reading sessions. Comprehension was measured with an immediate and a 1-week delayed comprehension test for each of the 4 stories. In addition, students participated in a 1-week delayed interview, where they were asked to retell the story and to answer questions about the central elements (character, setting, problem, solution, beginning, middle, and ending events) of each story. There were significant differences, with medium to large effect sizes, in comprehension and recall scores as a function of both instructional condition and reading ability. Students in the story-frame condition outperformed students in the notepad condition, and average to above average readers performed better than below average readers. Students in the story-frame condition outperformed students in the notepad condition on the comprehension tests and on the oral retellings when teacher modelling and guidance were present. In the cued recall sessions, students in the story-frame instructional condition recalled more correct information and generated fewer errors than students in the notepad condition. Average to above average readers performed better than below average readers across comprehension and retelling measures. The majority of students in both instructional conditions reported that they would use their strategic tool again.

Relevance:

90.00%

Publisher:

Abstract:

ABSTRACT - Multispectral satellite images, in particular those with high spatial resolution (finer than 30 m on the ground), are an invaluable source of information for decision making in various fields related to the management of natural resources, the preservation of the environment, or the planning and management of urban centres. Study scales can range from local (resolutions finer than 5 m) to regional (resolutions coarser than 5 m). These images characterize the variation of object reflectance across the spectrum, which is the key information for a large number of applications of these data. However, satellite sensor measurements are also affected by "parasitic" factors related to the illumination and viewing conditions, the atmosphere, the topography and the sensor properties. Two questions concerned us in this research. What is the best approach for retrieving ground reflectances from the digital numbers recorded by the sensors, taking these parasitic factors into account? And is this retrieval the sine qua non for extracting reliable information from the images, given the issues specific to the different application domains of the images (land mapping, environmental monitoring, landscape change monitoring, resource inventories, etc.)? Research carried out over the last 30 years has produced a series of techniques for correcting the data for the effects of these parasitic factors, some of which make it possible to retrieve ground reflectances. Several questions nonetheless remain open, and others require further work in order, on the one hand, to improve the accuracy of the results and, on the other hand, to make these techniques more versatile by adapting them to a wider range of data acquisition conditions. A few of these can be mentioned:
- How can atmospheric characteristics (in particular aerosol particles) adapted to local and regional conditions be taken into account, rather than relying on default models that capture long-term spatiotemporal trends but fit poorly to instantaneous, spatially restricted observations?
- How can the "contamination" of the signal from the object targeted by the sensor by signals from surrounding objects (the adjacency effect) be accounted for? This phenomenon becomes very important for images with resolutions finer than 5 m.
- What are the effects of off-nadir viewing angles, which are increasingly common since they offer better temporal resolution and the possibility of obtaining stereoscopic image pairs?
- How can automatic processing and analysis techniques for multispectral images be made more effective over rugged and mountainous terrain, taking into account the multiple effects of topographic relief on the remotely sensed signal?
Moreover, although researchers have demonstrated many times that the information extracted from satellite images can be degraded by all these parasitic factors, radiometric corrections are still rarely applied on a routine basis, unlike geometric corrections, for which commercial remote sensing software packages offer versatile, powerful algorithms that are within the reach of users. Radiometric correction algorithms, when they are provided at all, remain inflexible black boxes that usually require expert users. The objectives of this research were the following: 1) to develop software for retrieving ground reflectances that addresses the questions raised above, modular enough to be extended, improved and adapted to various satellite image applications; and 2) to apply this software in different contexts (urban, agricultural, forest) and analyse the results in order to evaluate the gain in accuracy of the information extracted from satellite images converted to ground reflectance images, and hence the necessity of doing so regardless of the application. Through this research, we therefore produced a ground reflectance retrieval tool (the new version of the REFLECT software). This software is based on the formulation (and routines) of the 6S code (Second Simulation of the Satellite Signal in the Solar Spectrum) and on the dark-target method for estimating the aerosol optical depth (AOD), which is the most difficult factor to correct. Substantial improvements were made to the existing models. They mainly concern the aerosol properties (integration of a more recent model, improved dark-target search for AOD estimation), the treatment of the adjacency effect using a specular reflection model, support for most of the high-resolution (Landsat TM and ETM+, all the HR sensors of SPOT 1 to 5, EO-1 ALI and ASTER) and very-high-resolution (QuickBird and Ikonos) multispectral sensors currently in use, and the correction of topographic effects using a model that separates the direct and diffuse components of solar radiation and also adapts to the forest canopy. Validation showed that REFLECT retrieves ground reflectance with an accuracy of about ±0.01 reflectance units (for the visible, near-infrared and mid-infrared spectral bands), even for surfaces with variable topography. Through simulations of apparent reflectances, the software also showed to what extent the parasitic factors affecting the image digital numbers can alter the useful signal, i.e. the ground reflectance (errors of 10 to more than 50%). REFLECT was also used to assess the importance of using ground reflectances rather than raw digital numbers in common remote sensing applications in the fields of classification, change monitoring, agriculture and forestry. In most applications (change monitoring with multi-date images, use of vegetation indices, estimation of biophysical parameters, ...), image correction is a crucial step for obtaining reliable results.
From a software point of view, REFLECT is organized as a series of easy-to-use menus corresponding to the successive steps of entering the scene inputs, computing the gaseous transmittances, estimating the AOD with the dark-target method and, finally, applying the radiometric corrections to the image, notably through a fast option that processes a 5000 by 5000 pixel image in about 15 minutes. This research opens up several avenues for further improvement of the models and methods related to radiometric corrections, notably the integration of the BRDF (bidirectional reflectance distribution function) into the formulation, the handling of translucent clouds through modelling of non-selective scattering, and the automation of the equivalent-slopes method proposed for the topographic corrections.