917 results for Model Driven Architecture (MDA)


Relevance:

100.00%

Publisher:

Abstract:

The present set of investigations aimed to study parental involvement in the sport competition of children and young people. Based on the model of parental involvement in sport (Teques & Serpa, 2009), the study achieved two fundamental objectives. First, to develop a set of valid and reliable scales to access the constructs included in the theoretical model. Second, to test the hypotheses grounded in the conceptual structure of the model in order to understand (1) why parents become involved in their children's sport, (2) which behaviours parents use during involvement, and (3) how involvement influences the young athlete's achievement context. In total, 1620 parents and 1665 young athletes from various individual and team sports, aged between 9 and 18 years, participated voluntarily. The objectives were pursued through a series of three independent studies. The results of the first study suggest that parental role beliefs, self-efficacy, perceptions of invitations from the coach and the young athlete, available time and energy, and knowledge and skills are related to parents' involvement activities. In the second study, the results showed that perceptions of the parental behaviours of encouragement, reinforcement, instruction, and modelling mediate the relationship between the behaviours reported by parents and the young people's psychological variables of self-efficacy, social self-efficacy, intrinsic motivation, and self-regulation strategies. The results of the third study indicate that perceptions of parents' behaviours are related to sport achievement through the mediating effects of self-efficacy, social self-efficacy, and self-regulation strategies. Implications for intervention, limitations, and future directions for research are also discussed.

Relevance:

100.00%

Publisher:

Abstract:

The file contains the ontology created and instantiated according to a case study, as well as a brief explanation of the framework in which it is included.

Relevance:

100.00%

Publisher:

Abstract:

Today's software industry faces increasingly complex challenges in a world where software is almost ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, yet also affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one reason why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement. Third, knowledge dissemination within large companies is improved through methods that put collaboration at the centre. The agile movement emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that the bulk of the daily work consists of creating something new that did not exist before.
Every software developer must be an expert in her field and spends a large part of her working day creating solutions to problems she has never solved before. Although this has been a well-known fact for many decades, many software projects are still managed as if they were factory production lines. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile working methods have proven to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interplay with the context in question can be gathered incrementally, which makes it easier to develop and adapt the methods to the specific demands of that context. Second, resistance to change can be more easily overcome by introducing cultural changes carefully and by giving the target group direct first-hand contact with the new methods. Relevant product metrics can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive for decision-making when prioritizing the backlog of tasks. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support concerning refactoring, i.e. continuous quality improvement of the program's code and design.
Deciding to refactor can be difficult, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed using the model-driven paradigm, and we construct a way to integrate this metric into agile and Lean working methods. An important part of any process improvement initiative is disseminating knowledge about the new software process. This holds regardless of the kind of process being introduced, whether plan-driven or agile. We propose that collaborative methods for creating and evolving the process are a good way to support knowledge dissemination, and we give an overview of process authoring tools on the market with that proposal in mind.
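The abstract does not give the concrete design-quality metric; as a hedged illustration only, a coupling-based score over a class model might feed refactoring decisions along these lines (the model format, class names and threshold are all hypothetical):

```python
# Hypothetical sketch: a coupling-based design-quality score for a class
# model, as one might compute for refactoring decision support.
# The model encoding and the threshold are illustrative assumptions.

def coupling_scores(model):
    """model: dict mapping class name -> set of referenced class names."""
    return {name: len(refs - {name}) for name, refs in model.items()}

def refactoring_candidates(model, threshold=3):
    """Classes whose fan-out exceeds the threshold, sorted worst-first."""
    scores = coupling_scores(model)
    flagged = [(n, s) for n, s in scores.items() if s > threshold]
    return sorted(flagged, key=lambda pair: -pair[1])

model = {
    "Order":    {"Customer", "Invoice", "Product", "Shipping", "Audit"},
    "Customer": {"Order"},
    "Invoice":  {"Order", "Customer"},
}
print(refactoring_candidates(model))  # [('Order', 5)]
```

A backlog-prioritization step could then rank refactoring stories by these scores alongside business-value estimates.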

Relevance:

100.00%

Publisher:

Abstract:

A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior; such systems can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioral REST web service interfaces and their compositions. A behavioral interface specifies which methods can be invoked on a service and the pre- and post-conditions of those methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving.
We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps capture faults in the design models and other elements of the software development environment by tracing unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. The third contribution of this thesis is the verification and validation of REST web services, for which we have used model checking techniques with the UPPAAL model checker.
The timed automata corresponding to the UML-based service design models are generated with our transformation tool and verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach: test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The pre-conditions constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
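The behavioral-interface idea described above, service methods guarded by pre- and post-conditions, can be sketched as follows. This is an illustrative toy, not the thesis's generated code; the resource model, state names and status codes are assumptions:

```python
# Hypothetical sketch of a stateful REST handler guarded by pre- and
# post-conditions, in the spirit of behavioral interfaces: the
# precondition rejects out-of-sequence requests, the postcondition
# checks that the implementation performed the promised transition.

class BookingResource:
    def __init__(self):
        self.state = "available"   # available -> reserved -> paid

    def put_reservation(self):
        # Precondition: the room must still be available.
        assert self.state == "available", "412 Precondition Failed"
        self.state = "reserved"
        # Postcondition: the advertised transition actually happened.
        assert self.state == "reserved"
        return {"status": 200, "state": self.state}

room = BookingResource()
print(room.put_reservation())  # {'status': 200, 'state': 'reserved'}
```

A second PUT on the same resource violates the precondition, which is exactly the kind of request-sequencing constraint a stateless uniform interface cannot express on its own.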

Relevance:

100.00%

Publisher:

Abstract:

This master's thesis deals with the design and productization of the corner car used in special elevator cases. The work was carried out for KONE Oyj. In the thesis, a modular product architecture was created for the corner car and the car's delivery process was defined. The goal of the work was to cover 48.12% of possible customer requirements and to reduce design time from the previous 24 hours to four hours. The goal was achieved, as judged by the comments of an experienced designer of case-specific corner cars: 48.12% of customer requirements were included in the product model as configuration options. The beginning of the thesis introduces product design, quality management, parametric modelling, mass customization and product data management. All the variables most important for productizing the corner car are then discussed. After that, the corner car product model is designed and modelled systematically using a top-down modelling approach, and manufacturing drawings are created for the parts and assemblies. The main tool used in the work was the Pro/ENGINEER software, with which the parametric product model was modelled; the Ansys software was used for the strength analysis of the structures. The goal of the work was achieved by analysing the most essential elements of the fundamentals of mass customization and by following an analytical and systematic product development process. With an emphasis on quality, the product architecture was validated by carrying out a limited production run comprising three corner cars configured with the product model. One of the cars was test-assembled at the Hyvinkää factory.

Relevance:

100.00%

Publisher:

Abstract:

We present a new simulation approach for the joint density function of the surplus prior to ruin and the deficit at ruin, for risk models driven by Lévy subordinators. This approach is inspired by the "ladder height" decomposition of the ruin probability in the Classical Model. That model, driven by a compound Poisson process, is a special case of the more general model driven by a subordinator, for which the ladder height decomposition of the ruin probability also applies. The Expected Discounted Penalty Function, also called the Gerber-Shiu function (GS function), was introduced as a unifying approach to the study of quantities related to the event of ruin. The ruin probability and the joint density function of the surplus prior to ruin and the deficit at ruin are special cases of the GS function. Expressions for these two quantities can be found in the literature, but they are difficult to exploit because they take the form of infinite series of convolutions with no closed analytical form. However, since they are derived from the GS function, the expressions for the two quantities share a certain resemblance, which allows us to draw on the ladder height decomposition of the ruin probability to derive a simulation approach for this joint density function. We give a detailed introduction to the risk models studied in this thesis and for which the simulation can be carried out. To motivate this work, we briefly introduce the vast field of risk measures and compute some of them for these risk models. This work contributes to a better understanding of the behaviour of risk models driven by subordinators with respect to the eventuality of ruin, as it provides a numerical perspective absent from the literature.
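For the compound Poisson special case mentioned above, the two quantities of interest can be sampled by straightforward Monte Carlo. This sketch only illustrates what the surplus prior to ruin and the deficit at ruin are; it is not the ladder-height-based method of the thesis, and all parameter values are assumptions:

```python
# Hedged illustration: Monte Carlo sampling of the surplus just before
# ruin and the deficit at ruin in the classical compound Poisson risk
# model U(t) = u + c*t - (sum of claims up to t), with exponential
# claims. Parameters are illustrative, not taken from the thesis.
import random

def sample_ruin(u=5.0, c=1.2, lam=1.0, mean_claim=1.0, horizon=200.0):
    """Simulate one path; return (surplus_before, deficit_at) or None."""
    t, surplus = 0.0, u
    while t < horizon:
        wait = random.expovariate(lam)       # time to the next claim
        t += wait
        surplus += c * wait                  # premiums accrue continuously
        claim = random.expovariate(1.0 / mean_claim)
        if claim > surplus:                  # this claim causes ruin
            return surplus, claim - surplus  # (surplus before, deficit at)
        surplus -= claim
    return None                              # no ruin within the horizon

random.seed(1)
ruins = [r for r in (sample_ruin() for _ in range(2000)) if r is not None]
print(len(ruins) / 2000)  # crude finite-horizon ruin-probability estimate
```

Histogramming the collected pairs gives an empirical picture of the joint density that the thesis's decomposition-based approach targets.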

Relevance:

100.00%

Publisher:

Abstract:

Model transformation consists of transforming a source model into a target model in accordance with source and target meta-models. We distinguish two types of transformations. The first is exogenous, where the source and target meta-models represent different formalisms and all the elements of the source model are transformed. When a single formalism is involved, the transformation is endogenous. This type of transformation generally requires two steps: identifying the elements of the source model to transform, then transforming those elements. In this thesis, we propose three main contributions related to these transformation problems. The first contribution is the automation of model transformations. We propose to treat the transformation problem as a combinatorial optimization problem in which a target model can be automatically generated from a small number of transformation examples. This first contribution can be applied to exogenous or endogenous transformations (after the elements to transform have been detected). The second contribution relates to endogenous transformation, where the elements of the source model to transform must be detected. We propose an approach for detecting design defects as a step prior to refactoring. This approach is inspired by the principle by which the human immune system detects viruses, called negative selection. The idea is to use good implementation practices to detect the parts of the code at risk. The third contribution aims to test a transformation mechanism using an oracle function to detect errors. We adapted the negative selection mechanism so that any deviation between the transformation traces under evaluation and a base of examples containing good-quality transformation traces is considered an error.
The oracle function computes this dissimilarity, and the errors are ranked by this score. The various contributions were evaluated on large projects, and the results obtained demonstrate their effectiveness.
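As a hedged illustration of the negative-selection idea (the abstract does not give the actual trace encoding or detectors), a candidate transformation trace can be scored by its distance to a base of known-good traces and ranked by that score. The feature-vector encoding and the threshold below are assumptions:

```python
# Hypothetical sketch of negative-selection-style anomaly scoring:
# a trace is suspicious when it is far from every known-good trace.
# Traces are encoded as feature vectors; encoding and threshold are
# illustrative assumptions.

def dissimilarity(trace, good_traces):
    """Euclidean distance from `trace` to its nearest known-good trace."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(dist(trace, g) for g in good_traces)

def rank_errors(candidates, good_traces, threshold=1.0):
    """Return (score, trace) pairs above the threshold, worst first."""
    scored = [(dissimilarity(t, good_traces), t) for t in candidates]
    return sorted((s, t) for s, t in scored if s > threshold)[::-1]

good = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.0)]
candidates = [(1.0, 1.05), (4.0, 0.0)]
print(rank_errors(candidates, good))  # flags only the outlier (4.0, 0.0)
```

The ordering mirrors the abstract's oracle: higher dissimilarity, higher suspicion, so reviewers inspect the worst-scoring traces first.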

Relevance:

100.00%

Publisher:

Abstract:

Thesis digitized by the Document and Archives Management Division of the Université de Montréal.

Relevance:

100.00%

Publisher:

Abstract:

Model-driven engineering (MDE) is a well-established software engineering paradigm that advocates the use of models as first-class artefacts in software development and maintenance activities. Manipulating several models during the software life cycle motivates the use of model transformations (MT) to automate model generation and update operations where possible. Writing model transformations nevertheless remains an arduous task, requiring both extensive knowledge and effort, which calls into question the benefits brought by MDE. To address this problem, much research has focused on automating MT. Learning model transformations by example (MTBE) is a promising approach in this regard. MTBE aims to learn model transformation programs from a set of pairs of source and target models provided as examples. In this work, we propose a process for learning model transformations by example. It aims to learn complex model transformations by addressing three observed requirements, namely, exploring the context in the source model, checking source attribute values, and deriving complex target attributes. We validate our approach experimentally on seven model transformation cases. Three of the seven learned transformations yield perfect target models. Moreover, precision and recall above 90% are recorded for the target models obtained by the four remaining transformations.

Relevance:

100.00%

Publisher:

Abstract:

A key problem in object recognition is selection, namely, the problem of identifying regions in an image within which to start the recognition process, ideally by isolating regions that are likely to come from a single object. Such a selection mechanism has been found to be crucial in reducing the combinatorial search involved in the matching stage of object recognition. Even though selection is of help in recognition, it has largely remained unsolved because of the difficulty of isolating regions belonging to objects under complex imaging conditions involving occlusions, changing illumination, and changing object appearance. This thesis presents a novel approach to the selection problem by proposing a computational model of visual attentional selection as a paradigm for selection in recognition. In particular, it proposes two modes of attentional selection, namely, attracted and pay attention modes, as being appropriate for data- and model-driven selection in recognition. An implementation of this model has led to new ways of extracting color, texture and line group information in images, and their subsequent use in isolating areas of the scene likely to contain the model object. Among the specific results in this thesis are: a method of specifying color by perceptual color categories for fast color region segmentation and color-based localization of objects, and a result showing that the recognition of texture patterns on model objects is possible under changes in orientation and occlusions without detailed segmentation. The thesis also presents an evaluation of the proposed model by integrating it with a 3D-from-2D object recognition system and recording the improvement in performance. These results indicate that attentional selection can significantly overcome the computational bottleneck in object recognition, both due to a reduction in the number of features and due to a reduction in the number of matches during recognition using the information derived during selection.
Finally, these studies have revealed a surprising use of selection, namely, in the partial solution of the pose of a 3D object.

Relevance:

100.00%

Publisher:

Abstract:

Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments, and for estimating energy and water fluxes between the Earth's surface and the atmosphere. Because fluxes are not specifically observed but determined by the data assimilation system, they are influenced not only by the utilized observations but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system in which all humidity data have been excluded from the observational data base. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of the individual weather systems in the extratropics and tropics using an objective feature tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role in the hydrological cycle, stressing the need to better understand these aspects of model parametrization.

Relevance:

100.00%

Publisher:

Abstract:

Enhanced release of CO2 to the atmosphere from soil organic carbon as a result of increased temperatures may lead to a positive feedback between climate change and the carbon cycle, resulting in much higher CO2 levels and accelerated global warming. However, the magnitude of this effect is uncertain and critically dependent on how the decomposition of soil organic C (heterotrophic respiration) responds to changes in climate. Previous studies with the Hadley Centre's coupled climate–carbon cycle general circulation model (GCM) (HadCM3LC) used a simple, single-pool soil carbon model to simulate the response. Here we present results from numerical simulations that use the more sophisticated 'RothC' multipool soil carbon model, driven with the same climate data. The results show strong similarities in the behaviour of the two models, although RothC tends to simulate slightly smaller changes in global soil carbon stocks for the same forcing. RothC simulates global soil carbon stocks decreasing by 54 GtC by 2100 in a climate change simulation, compared with an 80 GtC decrease in HadCM3LC. The multipool carbon dynamics of RothC cause it to exhibit a slower transient response to both increased organic carbon inputs and changes in climate. We conclude that the projection of a positive feedback between climate and carbon cycle is robust, but the magnitude of the feedback is dependent on the structure of the soil carbon model.
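The single-pool versus multipool distinction can be illustrated with a minimal first-order decay sketch. The pool structure, rate constants and Q10 temperature response below are illustrative assumptions, not RothC's actual parameterization:

```python
# Hedged sketch: first-order decay of soil carbon pools with a Q10
# temperature response. A single-pool model is the special case of one
# pool. All rate constants and the Q10 value are illustrative only.

def step_pools(pools, rates, temp, dt=1.0, q10=2.0, t_ref=10.0):
    """Advance pool sizes (GtC) by one step of `dt` years at `temp` degC."""
    factor = q10 ** ((temp - t_ref) / 10.0)   # warming speeds up decay
    return [c * (1.0 - k * factor * dt) for c, k in zip(pools, rates)]

# Two pools (fast, slow) vs. one aggregate pool with the mean rate.
multi = [50.0, 50.0]
rates = [0.10, 0.01]
single = [100.0]

for _ in range(10):
    multi = step_pools(multi, rates, temp=15.0)
    single = step_pools(single, [0.055], temp=15.0)

print(sum(multi), single[0])
```

After a decade of warming-enhanced decay, the multipool stock retains more carbon than the single aggregate pool, which is the qualitative point above: pool structure shapes the magnitude of the transient response.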

Relevance:

100.00%

Publisher:

Abstract:

Models of functional connectivity in cortical cultures on multi-electrode arrays may aid in understanding how cognitive pathways form and may improve techniques that aim to interface with neuronal systems. To enable research on such models, this study uses both data- and model-driven approaches to determine what dependencies are present in and between functional connectivity networks derived from bursts of extracellularly recorded activity. Properties of excitation in bursts were analysed using correlative techniques to assess the degree of linear dependence, and then two parallel techniques were used to assess functional connectivity. Three models presenting increasing levels of spatio-temporal dependency were used to capture the dynamics of individual functional connections, and their consistencies were verified using surrogate data. By comparing network-wide properties between model-generated networks and functional networks from data, complex interdependencies were revealed. This indicates the persistent co-activation of neuronal pathways in spontaneous bursts, as can be found in whole brain structures.

Relevance:

100.00%

Publisher:

Abstract:

In the tropical middle atmosphere the climatological radiative equilibrium temperature is inconsistent with gradient-wind balance and the available angular momentum, especially during solstice seasons. Adjustment toward a balanced state results in a type of Hadley circulation that lies outside the “downward control” view of zonally averaged dynamics. This middle-atmosphere Hadley circulation is reexamined here using a zonally symmetric balance model driven through an annual cycle. It is found that the inclusion of a realistic radiation scheme leads to a concentration of the circulation near the stratopause and to its closing off in the mesosphere, with no need for relaxational damping or a rigid lid. The evolving zonal flow is inertially unstable, leading to a rapid process of inertial adjustment, which becomes significant in the mesosphere. This short-circuits the slower process of angular momentum homogenization by the Hadley circulation itself, thereby weakening the latter. The effect of the meridional circulation associated with extratropical wave drag on the Hadley circulation is considered. It is shown that the two circulations are independent for linear (quasigeostrophic) zonal-mean dynamics, and interact primarily through the advection of temperature and angular momentum. There appears to be no significant coupling in the deep Tropics via temperature advection since the wave-driven circulation is unable to alter meridional temperature gradients in this region. However, the wave-driven circulation can affect the Hadley circulation by advecting angular momentum out of the Tropics. The validity of the zonally symmetric balance model with parameterized inertial adjustment is tested by comparison with a three-dimensional primitive equations model. Fields from a middle-atmosphere GCM are also examined for evidence of these processes. 
While many aspects of the GCM circulation are indicative of the middle-atmosphere Hadley circulation, particularly in the upper stratosphere, it appears that the circulation is obscured in the mesosphere and lower stratosphere by other processes.

Relevance:

100.00%

Publisher:

Abstract:

ERA-Interim/Land is a global land surface reanalysis data set covering the period 1979–2010. It describes the evolution of soil moisture, soil temperature and snowpack. ERA-Interim/Land is the result of a single 32-year simulation with the latest ECMWF (European Centre for Medium-Range Weather Forecasts) land surface model driven by meteorological forcing from the ERA-Interim atmospheric reanalysis and precipitation adjustments based on monthly GPCP v2.1 (Global Precipitation Climatology Project). The horizontal resolution is about 80 km and the time frequency is 3-hourly. ERA-Interim/Land includes a number of parameterization improvements in the land surface scheme with respect to the original ERA-Interim data set, which makes it more suitable for climate studies involving land water resources. The quality of ERA-Interim/Land is assessed by comparing with ground-based and remote sensing observations. In particular, estimates of soil moisture, snow depth, surface albedo, turbulent latent and sensible fluxes, and river discharges are verified against a large number of site measurements. ERA-Interim/Land provides a global integrated and coherent estimate of soil moisture and snow water equivalent, which can also be used for the initialization of numerical weather prediction and climate models.