994 results for data source


Relevance:

60.00%

Publisher:

Abstract:

Wetlands perform several important ecological functions and contribute to the biodiversity of fauna and flora. Although there is growing recognition of the importance of protecting these environments, their integrity remains threatened by the pressure of human activities. Systematic inventory and monitoring of wetlands is a necessity, and remote sensing is the only realistic means of achieving this goal. The objective of this thesis is to contribute to and improve the characterization of wetlands using satellite data acquired by polarimetric radars in L-band (ALOS-PALSAR) and C-band (RADARSAT-2). The thesis rests on two hypotheses (chap. 1). The first hypothesis states that vegetation physiognomy classes, based on plant structure, are more appropriate than plant species classes because they are better suited to the information content of polarimetric radar images. The second hypothesis states that polarimetric decomposition algorithms allow optimal extraction of polarimetric information compared with a multipolarized approach based on the HH, HV and VV polarization channels (chap. 3). In particular, the contribution of the incoherent Touzi decomposition to wetland inventory and monitoring is examined in detail. This decomposition characterizes the scattering type, phase, orientation, symmetry, degree of polarization and backscattered power of a target through a series of parameters extracted from an eigenvector and eigenvalue analysis of the coherency matrix. The Lac Saint-Pierre region was selected as the study site because of the great diversity of its wetlands, which cover more than 20,000 ha there.
One of the challenges of this thesis is that no standard system exists enumerating the full set of possible physiognomic classes, nor precise guidance on their characteristics and dimensions. Great care was therefore taken in creating these classes by cross-referencing diverse data sources, and more than 50 plant species were grouped into 9 physiognomic classes (chap. 7, 8 and 9). Several analyses are proposed to validate the hypotheses of the thesis (chap. 9). Scatterplot-based sensitivity analyses are used to study the characteristics and dispersion of vegetation physiognomies in different spaces made up of polarimetric parameters or polarization channels (chap. 10 and 12). Time series of RADARSAT-2 images are used to deepen understanding of the seasonal evolution of vegetation physiognomies (chap. 12). The transformed divergence algorithm is used to quantify the separability between physiognomic classes and to identify the parameter or parameters contributing most to that separability (chap. 11 and 13). Classifications are also proposed, and the results are compared with an existing wetland map of Lac Saint-Pierre (chap. 14). Finally, an analysis of the potential of C- and L-band polarimetric parameters for monitoring peatland hydrology is proposed (chap. 15 and 16). The sensitivity analyses show that the parameters of the first component, relating to the dominant (polarized) portion of the signal, are sufficient for a general characterization of vegetation physiognomies. The parameters of the second and third components are nevertheless necessary to obtain better separability between classes (chap. 11 and 13) and better discrimination between wetlands and dry lands (chap. 14).
This thesis shows that it is preferable to consider the parameters of the first, second and third components individually rather than their sum weighted by the respective eigenvalues (chap. 10 and 12). It also examines the complementarity between structure parameters and those relating to backscattered power, which is often ignored and normalized out by most polarimetric decompositions. The temporal (seasonal) dimension is essential for characterizing and classifying vegetation physiognomies (chap. 12, 13 and 14). Images acquired in spring (April and May) are needed to discriminate dry lands from wetlands, while images acquired in summer (July and August) are needed to refine the classification of vegetation physiognomies. A hierarchical classification tree developed in this thesis synthesizes the knowledge acquired (chap. 14). Using a relatively small number of polarimetric parameters and simple decision rules, it is possible to identify, among other things, three low-marsh classes and to successfully discriminate herbaceous high marshes from the other physiognomic classes without resorting to auxiliary data sources. The results are comparable to those of a supervised classification using two Landsat-5 images, with overall accuracies of 77.3% and 79.0% respectively. Various classifications using support vector machines (SVM) reproduce the results obtained with the hierarchical classification tree. Exploiting higher dimensionality with the SVM, with a maximum overall accuracy of 79.1%, does not, however, yield significantly better results. Finally, the phase of the Touzi decomposition appears to be the only parameter (in L-band) sensitive to variations in the water level beneath the surface of open peatlands (chap. 16).
This parameter therefore offers great potential for monitoring peatland hydrology, compared with the phase difference between the HH and VV channels. The thesis demonstrates that the parameters of the Touzi decomposition allow better characterization, better separability and better classification of wetland vegetation physiognomies than the HH, HV and VV polarization channels. Grouping plant species into physiognomic classes is a valid concept. However, some plant species sharing a similar physiognomy but occupying a different environment (high vs. low marsh) showed significant differences in their backscattering properties.
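The eigenvector/eigenvalue analysis underlying this family of decompositions can be sketched as follows. This is only the shared first step (eigen-decomposition of the coherency matrix and normalized eigenvalues as pseudo-probabilities); the full Touzi decomposition derives its further parameters (scattering type, phase, orientation, symmetry) from the eigenvectors and is not reproduced here. The matrix `T` below is synthetic.

```python
import numpy as np

def coherency_eigen(T):
    """Eigen-decompose a 3x3 Hermitian coherency matrix T.

    Returns eigenvalues sorted in descending order, the normalized
    pseudo-probabilities p_i = lambda_i / sum(lambda), and the matching
    eigenvectors. This is the common first step of eigenvector-based
    incoherent decompositions; further target parameters would be read
    off the eigenvectors.
    """
    vals, vecs = np.linalg.eigh(T)           # ascending order for Hermitian input
    order = np.argsort(vals)[::-1]           # dominant component first
    vals = np.clip(vals[order].real, 0.0, None)
    p = vals / vals.sum()
    return vals, p, vecs[:, order]

# Example: a synthetic coherency matrix dominated by one scattering mechanism
T = np.array([[0.80, 0.10, 0.00],
              [0.10, 0.15, 0.00],
              [0.00, 0.00, 0.05]], dtype=complex)
vals, p, _ = coherency_eigen(T)
```

The dominant (first) component here carries most of the polarized power, which is why, as the thesis finds, its parameters alone already support a general characterization.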


Cooperative caching is an attractive solution for reducing bandwidth demands and network latency in mobile ad hoc networks. Deploying caches in mobile nodes can reduce the overall traffic considerably. Cache hits eliminate the need to contact the data source frequently, which avoids additional network overhead. In this paper we propose a data discovery and cache management policy for cooperative caching, which reduces the caching overhead and delay by reducing the number of control messages flooded into the network. A cache discovery process based on the location of neighboring nodes is developed for this. The cache replacement policy we propose aims at increasing the cache hit ratio. The simulation results are promising on the metrics studied.
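The paper's replacement policy is not specified in this abstract, so the sketch below uses plain LRU as a stand-in to show the mechanism being optimized: a per-node cache whose hits avoid contacting the data source over the network. `NodeCache` and the toy request stream are hypothetical.

```python
from collections import OrderedDict

class NodeCache:
    """Minimal per-node cache with LRU replacement and hit-ratio tracking."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.requests = 0

    def get(self, key, fetch_from_source):
        self.requests += 1
        if key in self.store:
            self.hits += 1
            self.store.move_to_end(key)      # mark as recently used
            return self.store[key]
        value = fetch_from_source(key)       # network round trip, avoided on hits
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict the least recently used item
        return value

    @property
    def hit_ratio(self):
        return self.hits / self.requests if self.requests else 0.0

cache = NodeCache(capacity=2)
for k in ["a", "b", "a", "c", "a"]:
    cache.get(k, fetch_from_source=lambda key: key.upper())
```

A replacement policy tuned for cooperative caching would replace the LRU eviction line with rules that also weigh neighbor locations and expected access cost.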


Data caching is an attractive solution for reducing bandwidth demands and network latency in mobile ad hoc networks. Deploying caches in mobile nodes can reduce the overall traffic considerably. Cache hits eliminate the need to contact the data source frequently, which avoids additional network overhead. In this paper we propose a data discovery and cache management policy for cooperative caching, which reduces the power usage, caching overhead and delay by reducing the number of control messages flooded into the network. A cache discovery process based on the position coordinates of neighboring nodes is developed for this. The simulation results are promising on the metrics studied.


Kerala, a classic ecotourism destination in India, provides significant livelihood opportunities to people who depend on forest resources and who live in difficult terrain. This article analyses the socio-demographic, psychographic and travel behavior patterns, and their sub-characteristics, of foreign and domestic tourists. The data source for the article is a primary survey of 350 randomly chosen tourists, 175 each domestic and foreign, visiting Kerala's ecotourism destinations during August-December 2010-11. Several socio-demographic, psychographic and lifestyle factors have been identified from the field survey. There is considerable divergence between domestic and international tourists in most of the factors identified. Post-trip attributes such as satisfaction and intention to return show that the ecotourism destinations in Kerala have significant potential that can help communities in the region.


This study attempts to situate the quality of life and standard of living of local communities in ecotourism destinations, inter alia their perception of forest conservation and the satisfaction level of the local community. The data source comprises 650 EDC/VSS members from Kerala, demarcated into three zones. Four variables have been considered for evaluating the quality of life of the stakeholders of ecotourism sites, which are then funnelled into the income-education spectrum for hypothesizing within the SLI framework. Zone-wise analysis of community members working in the tourism sector shows that they have benefited substantially from tourism development in the region, gaining both employment and secure livelihood options. Most quality-of-life indicators of the community in the ecotourism centres show a promising position. Community perception does not indicate any negative impact on the environment or on local culture.


Objectives: To evaluate the efficacy and safety of human immunoglobulin versus plasmapheresis in the management of autoimmune neurologic diseases. Length of hospital stay and duration of ventilator support were also compared. Methods: Randomized controlled trials and analytical observational studies of more than 10 cases were reviewed. The Cochrane Neuromuscular Disease Group trials register, MEDLINE, EMBASE, HINARI Ovid, the Database of Abstracts of Reviews of Effectiveness and the Economic Evaluation Database were searched as data sources. Reference lists were examined for further relevant articles. A random-effects model was used to derive a pooled risk ratio. Results: 725 articles were found and 27 met the criteria, for a studied population of 4,717 cases: 14 articles on Guillain-Barré syndrome, 10 on myasthenia gravis, one on Sydenham chorea, one on chronic inflammatory demyelinating polyneuropathy, and one on PANDAS. No evidence was found in favor of either treatment as regards effectiveness (OR 0.94, CI 0.63-1.41, p = 0.77) or ventilator support time; IVIG had a significantly better safety profile than plasmapheresis (OR 0.70, CI 0.51-0.96, p = 0.03) and patients needed a shorter hospital stay (p = 0.03). Conclusions: There is no evidence for superiority in effectiveness of immunoglobulin or plasmapheresis in the management of autoimmune neurologic diseases. Nevertheless, patients treated with immunoglobulin have statistically significantly fewer adverse effects, a shorter hospital stay and a tendency toward less ventilator support time. These premises could lead to lower costs for health services, but an economic study should be done.
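The random-effects pooling used to derive such pooled ratios can be sketched with the DerSimonian-Laird estimator. The three studies below (log effect sizes and variances) are hypothetical, not the review's data.

```python
import math

def dersimonian_laird(log_effects, variances):
    """Pool study-level log effect sizes with a DerSimonian-Laird
    random-effects model, returning the pooled ratio on the original
    scale, its 95% confidence interval and the between-study variance."""
    k = len(log_effects)
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)               # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci, tau2

# Three hypothetical studies: log odds ratios and their variances
ratio, ci, tau2 = dersimonian_laird([-0.2, 0.1, -0.05], [0.04, 0.09, 0.06])
```

When the heterogeneity statistic Q falls below its degrees of freedom, as in this toy example, the between-study variance is truncated at zero and the model collapses to a fixed-effect pooling.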


Objective: To explore and define the utility of different strategies for primary prevention (ASA, diet, physical activity) and of screening tests (FOBT, sigmoidoscopy, colonoscopy, etc.) for colorectal cancer. Data sources: The databases consulted were MEDLINE (1966 to 2006), DARE (1980 to 2006), the Cochrane Central Register of Controlled Trials, the Cochrane Collaboration's Registry of Clinical Trials, the Cochrane Database of Systematic Reviews and LILACS. Study selection: Studies such as clinical trials, cohort studies and case-control studies of the effectiveness of screening tests and primary prevention of adenoma and colorectal cancer were identified by two reviewers. Data extraction: Data extraction and evaluation were carried out in paired fashion for most of the process. Limitations: The study does not strictly comply with the methodology of a systematic review, so its reproducibility is questionable and its conclusions should be extrapolated with caution. Conclusions: The most effective screening strategy for early detection of premalignant lesions or cancer is colonoscopy every 5 years; however, cost-effectiveness studies are needed to evaluate this measure. For primary prevention, aspirin and cyclooxygenase-2 inhibitors reduce the incidence of colorectal adenomas, and aspirin can reduce colorectal cancer incidence. However, these medications may be associated with a significant risk of cardiovascular events and gastrointestinal bleeding. The balance between risks and benefits must be evaluated in future studies.


Workplace absenteeism is a problem for company economics and for society in general; its main cause is illness of common (non-occupational) origin. Total costs are difficult to quantify because of the many factors that influence absenteeism. METHODOLOGY: A descriptive cross-sectional study characterized workplace absenteeism using as its source the database of a company in the tertiary sector of the Colombian economy engaged in telecommunications nationwide. The universe population was 3,647 workers as of mid-2013. The study population comprised 889 workers who took sick leave between January 1, 2013 and December 31, 2013. RESULTS: Common illness accounted for 1,097 (83.87%) sick-leave episodes, 740 (83.24%) workers on leave and 7,526 (57.28%) days lost. Women accounted for the greater number of episodes, 63.3% (694). According to the ICD-10 diagnostic group classification, the system generating the most lost days was the musculoskeletal system, at 13.40%. The individual diagnosis generating the most lost days was mixed anxiety and depressive disorder, 4% (301). Workers with five or more episodes during the year numbered 26 (3.51%) and generated 33.3% (2,506) of the days lost. Occupational accidents were not an important cause of absenteeism. RECOMMENDATIONS: Implement an absenteeism data-collection system fed systematically and continuously, so as to have sufficient, timely and reliable information for analysis and action plans; review the activities the company has implemented to identify, evaluate and manage psychosocial occupational risk factors; and optimize early detection and prevention of musculoskeletal disorders.
Finally, monitor, evaluate and follow up workers presenting more than five sick-leave episodes per year, with an interdisciplinary health team that can implement health promotion and prevention actions.
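The kind of descriptive aggregation reported above (days lost per diagnostic group, workers with repeated episodes) can be sketched as follows; the records are invented for illustration, not the study's data.

```python
from collections import defaultdict

# Hypothetical sick-leave records: (worker_id, diagnosis_group, days_lost),
# mirroring the structure of the company database described above.
records = [
    (1, "musculoskeletal", 10), (1, "musculoskeletal", 5),
    (2, "anxiety/depression", 30), (3, "respiratory", 3),
    (2, "anxiety/depression", 14), (1, "respiratory", 2),
]

days_by_group = defaultdict(int)       # total days lost per diagnostic group
episodes_by_worker = defaultdict(int)  # sick-leave episode count per worker
for worker, group, days in records:
    days_by_group[group] += days
    episodes_by_worker[worker] += 1

total_days = sum(days_by_group.values())
share = {g: d / total_days for g, d in days_by_group.items()}
# Workers with repeated episodes, flagged for interdisciplinary follow-up
frequent = sorted(w for w, n in episodes_by_worker.items() if n >= 2)
```

In practice this aggregation would run over the full incapacity database, with the episode threshold set to the study's five-or-more criterion.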



Large scale image mosaicing methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots in gathering optical data from the seafloor. Cost and weight constraints mean that low-cost Remotely operated vehicles (ROVs) usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predetermined trajectory that provides several non time-consecutive overlapping image pairs. Finding these pairs (a process known as topology estimation) is indispensable to obtaining globally consistent mosaics and accurate trajectory estimates, which are necessary for a global view of the surveyed area, especially when optical sensors are the only data source. This thesis presents a set of consistent methods aimed at creating large area image mosaics from optical data obtained during surveys with low-cost underwater vehicles. First, a global alignment method developed within a Feature-based image mosaicing (FIM) framework, where nonlinear minimisation is substituted by two linear steps, is discussed. Then, a simple four-point mosaic rectifying method is proposed to reduce distortions that might occur due to lens distortions, error accumulation and the difficulties of optical imaging in an underwater medium. The topology estimation problem is addressed by means of a combined augmented-state and extended Kalman filter framework, aimed at minimising the total number of matching attempts while simultaneously obtaining the best possible trajectory. Potential image pairs are predicted by taking into account the uncertainty in the trajectory. The contribution of matching an image pair is investigated using information theory principles. Lastly, a different solution to the topology estimation problem is proposed in a bundle adjustment framework.
Innovative aspects include the use of a fast image similarity criterion combined with a Minimum spanning tree (MST) solution to obtain a tentative topology. This topology is improved by attempting image matching with the pairs for which there is the most overlap evidence. Unlike previous approaches to large-area mosaicing, our framework deals naturally with cases where time-consecutive images cannot be matched successfully, such as completely unordered sets. Finally, the efficiency of the proposed methods is discussed and a comparison made with other state-of-the-art approaches, using a series of challenging datasets in underwater scenarios.
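The MST step over a fast similarity criterion can be sketched as follows (Kruskal's algorithm on negated similarity). The similarity scores are invented, and the fast criterion itself (e.g. comparing low-resolution descriptors) is abstracted away.

```python
import itertools

def tentative_topology(similarity):
    """Build a tentative mosaic topology as a maximum-similarity spanning
    tree over image pairs. `similarity` is a symmetric score keyed by
    (i, j) with i < j; higher means more overlap evidence."""
    nodes = set(itertools.chain.from_iterable(similarity))
    parent = {n: n for n in nodes}

    def find(n):                          # union-find with path compression
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    tree = []
    # Consider the most similar pairs first (Kruskal on negated similarity)
    for (i, j), s in sorted(similarity.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:                      # adding this edge keeps a tree
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Four images; pair (0, 3) is non time-consecutive but overlaps strongly,
# while the time-consecutive pair (2, 3) matches poorly
sim = {(0, 1): 0.9, (1, 2): 0.8, (2, 3): 0.2, (0, 3): 0.7, (0, 2): 0.1}
edges = tentative_topology(sim)
```

Note how the tree keeps the strong non-consecutive pair (0, 3) and drops the weak consecutive pair (2, 3), which is precisely why the approach tolerates unordered image sets.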


An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degrees by 0.5 degrees latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in preference to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion compensated interpolation based on the estimation of motion vectors using a hierarchical block matching scheme. Examples are shown of the various stages in the process. Also shown are examples of the usefulness of this type of data in GCM validation.
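The kernel-weighted gridding with zenith-angle downweighting can be sketched as follows. A single Gaussian distance kernel stands in for the paper's hierarchical spherical kernel estimators, and cos(zenith) downweights limb pixels relative to near-nadir ones; the observations are synthetic.

```python
import math

def grid_cell_temperature(obs, cell_lat, cell_lon, bandwidth_deg=0.5):
    """Estimate one grid-cell value as a kernel-weighted mean of nearby
    pixel brightness temperatures. Each observation is
    (lat, lon, temp_K, zenith_deg). Illustrative simplification only."""
    num = den = 0.0
    for lat, lon, temp, zenith in obs:
        d2 = (lat - cell_lat) ** 2 + (lon - cell_lon) ** 2
        k = math.exp(-0.5 * d2 / bandwidth_deg ** 2)      # distance kernel
        w = k * max(math.cos(math.radians(zenith)), 0.0)  # limb downweighting
        num += w * temp
        den += w
    return num / den if den else float("nan")

obs = [
    (10.0, 20.0, 280.0, 10.0),   # near-nadir pixel at the cell centre
    (10.2, 20.3, 284.0, 60.0),   # limb pixel, downweighted
]
t = grid_cell_temperature(obs, cell_lat=10.0, cell_lon=20.0)
```

The estimate lands much closer to the near-nadir pixel's temperature, which is the intended effect of the zenith-angle weighting.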


Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than say 1m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear if the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data.
We are attempting to use digital map data (Mastermap structured topography data) to help to distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc. as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high resolution FE model to act as a benchmark for a more practical lower resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size, e.g. for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment. But how could a 5m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
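The local-minima scheme for deriving a DTM from a DSM, and vegetation height by subtraction, can be sketched in one dimension. Real processing is 2-D and interpolates between minima; as noted above, this low-pass filter also wrongly flattens sharp man-made features such as walls and embankments. The profile below is invented.

```python
def dtm_from_dsm(dsm, window=3):
    """Derive a crude ground model (DTM) from a LiDAR surface model (DSM)
    by taking local minima in a sliding window, then vegetation height
    as DSM minus DTM. 1-D profile for clarity."""
    half = window // 2
    dtm = []
    for i in range(len(dsm)):
        lo = max(0, i - half)
        hi = min(len(dsm), i + half + 1)
        dtm.append(min(dsm[lo:hi]))          # local minimum = assumed ground
    veg_height = [s - g for s, g in zip(dsm, dtm)]
    return dtm, veg_height

# Profile: flat ground at 10 m with a 2 m hedge at indices 2-3
dsm = [10.0, 10.0, 12.0, 12.0, 10.0, 10.0]
dtm, veg = dtm_from_dsm(dsm)
```

The recovered vegetation heights could then drive a spatially varying friction coefficient, in the spirit of the approach described above, rather than a single calibrated floodplain value.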


A new database of weather and circulation type catalogs is presented comprising 17 automated classification methods and five subjective classifications. It was compiled within COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions" in order to evaluate different methods for weather and circulation type classification. This paper gives a technical description of the included methods using a new conceptual categorization for classification methods reflecting the strategy for the definition of types. Methods using predefined types include manual and threshold based classifications while methods producing types derived from the input data include those based on eigenvector techniques, leader algorithms and optimization algorithms. In order to allow direct comparisons between the methods, the circulation input data and the methods' configuration were harmonized for producing a subset of standard catalogs of the automated methods. The harmonization includes the data source, the climatic parameters used, the classification period as well as the spatial domain and the number of types. Frequency based characteristics of the resulting catalogs are presented, including variation of class sizes, persistence, seasonal and inter-annual variability as well as trends of the annual frequency time series. The methodological concept of the classifications is partly reflected by these properties of the resulting catalogs. It is shown that the types of subjective classifications compared to automated methods show higher persistence, inter-annual variation and long-term trends. Among the automated classifications optimization methods show a tendency for longer persistence and higher seasonal variation. However, it is also concluded that the distance metric used and the data preprocessing play at least an equally important role for the properties of the resulting classification compared to the algorithm used for type definition and assignment.
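One of the frequency-based catalog characteristics compared above, persistence, can be computed as the mean run length of a daily type sequence; the toy catalog below is illustrative, not drawn from the COST 733 database.

```python
def mean_persistence(catalog):
    """Mean residence time (in days) of a daily circulation-type catalog:
    the average length of runs of consecutive identical types."""
    runs = []
    current = 1
    for prev, cur in zip(catalog, catalog[1:]):
        if cur == prev:
            current += 1            # run continues
        else:
            runs.append(current)    # run ends, record its length
            current = 1
    runs.append(current)            # close the final run
    return sum(runs) / len(runs)

# Toy daily catalog of circulation types
catalog = ["W", "W", "W", "NW", "NW", "C", "W", "W"]
p = mean_persistence(catalog)
```

Comparing this statistic across catalogs is one way to see the finding reported above, that subjective classifications tend to produce longer-lived types than automated ones.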


The Arctic is an important region in the study of climate change, but monitoring surface temperatures in this region is challenging, particularly in areas covered by sea ice. Here in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques. Kriging techniques provided the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, which are not currently utilised in temperature anomaly datasets, would improve estimates of Arctic surface air temperature anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at certain supplementary data source locations over Arctic land and sea ice areas. For the in situ data study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in Root Mean Square Error can be up to 0.2K for Arctic-average anomalies and more than 1K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
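The finding that interpolating techniques beat non-interpolating ones can be illustrated with a minimal 1-D sketch, using inverse-distance weighting as a simple stand-in for kriging (kriging additionally models spatial covariance, which IDW does not); the anomaly field and stations are synthetic.

```python
def idw(stations, x, power=1):
    """Inverse-distance-weighted estimate of an anomaly at position x
    from (position, anomaly) station pairs."""
    num = den = 0.0
    for xs, anom in stations:
        d = abs(x - xs)
        if d == 0:
            return anom              # exactly at a station
        w = 1.0 / d ** power
        num += w * anom
        den += w
    return num / den

def rmse(pairs):
    return (sum((a - b) ** 2 for a, b in pairs) / len(pairs)) ** 0.5

# Synthetic 1-D anomaly field (truth = x / 10 kelvin) sampled at two stations
stations = [(0.0, 0.0), (10.0, 1.0)]
grid = [2.0, 5.0, 7.0]
truth = [x / 10 for x in grid]

interp = [idw(stations, x) for x in grid]                     # interpolating
station_mean = sum(a for _, a in stations) / len(stations)    # non-interpolating
err_interp = rmse(list(zip(interp, truth)))
err_mean = rmse([(station_mean, t) for t in truth])
```

On this (deliberately favourable) linear field the interpolated estimates track the truth while the station mean does not, mirroring the qualitative result of the testbed study.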


Purpose – The purpose of this paper is to investigate to what extent one can apply experiential learning theory (ELT) to the public-private partnership (PPP) setting in Russia and to draw insights regarding the learning cycle's nature. Additionally, the paper assesses whether the PPP case confirms Kolb's ELT. Design/methodology/approach – The case study draws upon primary data which the authors collected by interviewing informants including a PPP operator's managers, lawyers from Russian law firms and an expert from the National PPP Centre. The authors accomplished data source triangulation in order to ensure a high degree of research validity. Findings – Experiential learning has resulted in a successful and relatively fast PPP project launch without the concessionary framework. The lessons learned include the need for effective stakeholder engagement; avoiding getting stuck in bureaucracy, such as collaboration with federal ministries and the anti-trust agency; avoiding application for government funding, as the approval process is tangled and lengthy; attracting strategic private investors; shaping positive public perception of a PPP project; and making continuous efforts to effectively mitigate the public acceptance risk. Originality/value – The paper contributes to ELT by incorporating the impact of the social environment in the learning model. Additionally, the paper tests the applicability of ELT to learning in a complex organisational setting, i.e., a PPP.