886 results for IT -project
Abstract:
Organisations typically define and execute their selected strategy by developing and managing a portfolio of projects. The governance of this portfolio has proved to be a major challenge, particularly for large organisations. Executives and managers face even greater pressures when the nature of the strategic landscape is uncertain. This paper explores approaches for dealing with different levels of certainty in business IT projects and provides a contingent governance framework. Historically, business IT projects have relied on a structured sequential approach, also referred to as a waterfall method. There is a distinction between the development stages of a solution and the management stages of a project that delivers the solution, although these are often integrated in a business IT systems project. Prior research has demonstrated that the level of certainty varies between development projects. There can be uncertainty about what needs to be developed and also about how this solution should be developed. The move to agile development and management reflects a greater level of uncertainty, often on both dimensions, and this has led to the adoption of more iterative approaches. What has been less well researched is the impact of uncertainty on the governance of the change portfolio and the corresponding implications for business executives. This paper poses this research question and proposes a governance framework to address these aspects. The governance framework has been reviewed in the context of a major anonymous organisation, FinOrg. Findings are reported in this paper with a focus on the need to apply different approaches. In particular, the governance of uncertain business change is contrasted with the management approach for defined IT projects. Practical outputs from the paper include a consideration of some innovative approaches that can be used by executives. It also investigates the role of the business change portfolio group in evaluating and executing the appropriate level of governance. These results lead to recommendations for executives and also to proposals for further research.
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to that used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will allow the two maps to be combined to generate consistent data across the Americas, facilitating continental monitoring and modeling.
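As a rough illustration of the class-membership approach described above (a sketch with invented arrays and a hypothetical pixel size, not the SERENA production workflow), the area of a scarce, scattered class can be estimated by summing its fractional memberships over all pixels rather than counting only the pixels where it wins the hard classification; the same memberships also yield the discrete map and the per-pixel confidence layer:

```python
# Illustrative sketch: class areas from per-pixel class memberships vs. a hard map.
# Array shapes, class labels and the pixel size are hypothetical.
import numpy as np

# memberships: (n_classes, n_rows, n_cols), values in [0, 1], summing to ~1 per pixel
rng = np.random.default_rng(0)
memberships = rng.dirichlet(alpha=[8, 2, 1, 0.5], size=(100, 100)).transpose(2, 0, 1)
pixel_area_km2 = 0.25  # e.g. ~500 m MODIS-like pixels (assumption)

# Discrete map: the class with the highest membership per pixel
discrete_map = memberships.argmax(axis=0)
# Per-pixel confidence layer: the winning membership value itself
confidence = memberships.max(axis=0)

# Hard-count area estimate (tends to under-represent scarce, scattered classes)
hard_area = np.bincount(discrete_map.ravel(), minlength=memberships.shape[0]) * pixel_area_km2
# Membership-weighted area estimate: sum fractional memberships over all pixels
soft_area = memberships.reshape(memberships.shape[0], -1).sum(axis=1) * pixel_area_km2

for c, (h, s) in enumerate(zip(hard_area, soft_area)):
    print(f"class {c}: hard-count area {h:9.1f} km2, membership-weighted area {s:9.1f} km2")
```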
Abstract:
The new Max-Planck-Institute Earth System Model (MPI-ESM) is used in the Coupled Model Intercomparison Project phase 5 (CMIP5) in a series of climate change experiments for either idealized CO2-only forcing or forcings based on observations and the Representative Concentration Pathway (RCP) scenarios. The paper gives an overview of the model configurations, experiments, related forcings, and initialization procedures and presents results for the simulated changes in climate and carbon cycle. It is found that the climate feedback depends on the global warming and possibly the forcing history. The global warming from climatological 1850 conditions to 2080–2100 ranges from 1.5°C under the RCP2.6 scenario to 4.4°C under the RCP8.5 scenario. Over this range, the patterns of temperature and precipitation change are nearly independent of the global warming. The model shows a tendency to reduce the ocean heat uptake efficiency toward a warmer climate, and hence an acceleration of warming in the later years. The precipitation sensitivity can be as high as 2.5% K−1 if the CO2 concentration is constant, or as small as 1.6% K−1 if the CO2 concentration is increasing. The oceanic uptake of anthropogenic carbon increases over time in all scenarios, being smallest in the experiment forced by RCP2.6 and largest in that for RCP8.5. The land also serves as a net carbon sink in all scenarios, predominantly in boreal regions. The strong tropical carbon sources found in the RCP2.6 and RCP8.5 experiments are almost absent in the RCP4.5 experiment, which can be explained by reforestation in the RCP4.5 scenario.
Abstract:
Massive economic and population growth, and urbanization are expected to lead to a tripling of anthropogenic emissions in southern West Africa (SWA) between 2000 and 2030. However, the impacts of this on human health, ecosystems, food security, and the regional climate are largely unknown. An integrated assessment is challenging due to (a) a superposition of regional effects with global climate change, (b) a strong dependence on the variable West African monsoon, (c) incomplete scientific understanding of interactions between emissions, clouds, radiation, precipitation, and regional circulations, and (d) a lack of observations. This article provides an overview of the DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa) project. DACCIWA will conduct extensive fieldwork in SWA to collect high-quality observations, spanning the entire process chain from surface-based natural and anthropogenic emissions to impacts on health, ecosystems, and climate. Combining the resulting benchmark dataset with a wide range of modeling activities will allow (a) assessment of relevant physical, chemical, and biological processes, (b) improvement of the monitoring of climate and atmospheric composition from space, and (c) development of the next generation of weather and climate models capable of representing coupled cloud-aerosol interactions. The latter will ultimately contribute to reducing uncertainties in climate predictions. DACCIWA collaborates closely with operational centers, international programs, policy-makers, and users to actively guide sustainable future planning for West Africa. It is hoped that some of DACCIWA’s scientific findings and technical developments will be applicable to other monsoon regions.
Abstract:
Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010–2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing precursor algorithms, three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of "best" versions of each of these algorithms (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment, a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for AATSR and MERIS algorithms in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions. The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun photometer observations for the different versions of each algorithm globally (land and coastal) and for three regions with different aerosol regimes. The analysis allowed for an assessment of sensitivities of all algorithms, which helped define the best algorithm versions for the subsequent round robin exercise; all algorithms (except for MERIS) showed some improvement, in parts significant. In particular, using common aerosol components and, in part, also the a priori aerosol-type climatology is beneficial. On the other hand, the use of an AATSR-based common cloud mask brought a clear improvement (though with a significant reduction in coverage) for the MERIS standard product, but not for the algorithms using AATSR. It is noted that all these observations are mostly consistent for all five analyses (global land, global coastal, three regional), which is readily understood, since the set of aerosol components defined in Sect. 3.1 was explicitly designed to cover different global aerosol regimes (with low and high absorption fine mode, sea salt and dust).
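The kind of daily satellite-versus-AERONET comparison used in the quantitative assessment can be sketched as follows; the data layout, variable names and the simple statistics below are illustrative assumptions, not the Aerosol_cci validation code:

```python
# Sketch: match each AERONET station's daily mean AOD to the satellite grid cell
# containing it and compute summary statistics (bias, RMSE, correlation).
import numpy as np

def validate_daily_aod(sat_aod, lat_edges, lon_edges, station_lat, station_lon, station_aod):
    """sat_aod: 2-D daily gridded AOD (NaN where no retrieval);
    station_*: 1-D arrays of AERONET daily means and positions (hypothetical inputs)."""
    i = np.digitize(station_lat, lat_edges) - 1
    j = np.digitize(station_lon, lon_edges) - 1
    valid = (i >= 0) & (i < sat_aod.shape[0]) & (j >= 0) & (j < sat_aod.shape[1])
    sat, ref = sat_aod[i[valid], j[valid]], station_aod[valid]
    ok = ~np.isnan(sat) & ~np.isnan(ref)
    sat, ref = sat[ok], ref[ok]
    bias = np.mean(sat - ref)
    rmse = np.sqrt(np.mean((sat - ref) ** 2))
    corr = np.corrcoef(sat, ref)[0, 1] if sat.size > 1 else np.nan
    return {"n": int(sat.size), "bias": bias, "rmse": rmse, "corr": corr}

# Tiny synthetic demo on an assumed 1-degree grid with one retrieved cell and one station
lat_edges = np.linspace(-90, 90, 181)
lon_edges = np.linspace(-180, 180, 361)
grid = np.full((180, 360), np.nan)
grid[120, 190] = 0.25
print(validate_daily_aod(grid, lat_edges, lon_edges,
                         np.array([30.5]), np.array([10.5]), np.array([0.30])))
```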
Abstract:
Observations from the Heliospheric Imager (HI) instruments aboard the twin STEREO spacecraft have enabled the compilation of several catalogues of coronal mass ejections (CMEs), each characterizing the propagation of CMEs through the inner heliosphere. Three such catalogues are the Rutherford Appleton Laboratory (RAL)-HI event list, the Solar Stormwatch CME catalogue, and, presented here, the J-tracker catalogue. Each catalogue uses a different method to characterize the location of CME fronts in the HI images: manual identification by an expert, the statistical reduction of the manual identifications of many citizen scientists, and an automated algorithm. We provide a quantitative comparison of the differences between these catalogues and techniques, using 51 CMEs common to each catalogue. The time-elongation profiles of these CME fronts are compared, as are the estimates of the CME kinematics derived from application of three widely used single-spacecraft-fitting techniques. The J-tracker and RAL-HI profiles are most similar, while the Solar Stormwatch profiles display a small systematic offset. Evidence is presented that these differences arise because the RAL-HI and J-tracker profiles follow the sunward edge of CME density enhancements, while Solar Stormwatch profiles track closer to the antisunward (leading) edge. We demonstrate that the method used to produce the time-elongation profile typically introduces more variability into the kinematic estimates than differences between the various single-spacecraft-fitting techniques. This has implications for the repeatability and robustness of these types of analyses, arguably especially so in the context of space weather forecasting, where it could make the results strongly dependent on the methods used by the forecaster.
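As an illustration of single-spacecraft fitting of a time-elongation profile, the sketch below applies the fixed-phi approximation, one commonly used technique in which a point-like front moves radially at constant speed at a fixed angle from the observer-Sun line; the observer distance, noise level and the synthetic "observed" profile are invented for the example:

```python
# Hedged sketch of fixed-phi fitting of a HI time-elongation profile (not the
# J-tracker, RAL-HI or Solar Stormwatch pipelines); all numbers are made up.
import numpy as np
from scipy.optimize import curve_fit

AU_KM = 1.496e8
D_OBS_AU = 0.96  # observer (STEREO-like) heliocentric distance, assumed here

def fixed_phi_elongation(t_hours, v_kms, phi_deg, t0_hours):
    """Elongation (deg) vs time for a front at constant radial speed v, angle phi."""
    r = v_kms * (t_hours - t0_hours) * 3600.0 / AU_KM  # front distance from Sun, AU
    phi = np.radians(phi_deg)
    return np.degrees(np.arctan2(r * np.sin(phi), D_OBS_AU - r * np.cos(phi)))

# Synthetic time-elongation profile standing in for one catalogue entry
t = np.linspace(2.0, 40.0, 30)                                    # hours after onset
truth = fixed_phi_elongation(t, 450.0, 55.0, 0.0)
obs = truth + np.random.default_rng(1).normal(0.0, 0.3, t.size)   # ~0.3 deg scatter

popt, _ = curve_fit(fixed_phi_elongation, t, obs, p0=(500.0, 45.0, 0.0))
v_fit, phi_fit, t0_fit = popt
print(f"fitted speed {v_fit:.0f} km/s, propagation angle {phi_fit:.1f} deg, t0 {t0_fit:.2f} h")
```

Because the fitted speed and angle depend on the shape of the elongation curve, small systematic offsets between catalogues (such as tracking the sunward versus leading edge) propagate directly into the kinematic estimates, which is the sensitivity the comparison above quantifies.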
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters, usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also allows eddy kinetic energy to be estimated. The results show that in general there is a good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of reanalyses can be evaluated as one single system regarding its reliability in reproducing the climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given the fact that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially for global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
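A minimal sketch of the ensemble diagnostics mentioned above, using invented time series rather than the MyOcean reanalyses: the ensemble mean summarises the common signal, the spread across systems provides the uncertainty estimate, and the signal-to-noise ratio compares the two:

```python
# Treat several reanalyses as one ensemble: mean, spread and signal-to-noise ratio.
# The data here are synthetic (a common trend plus system-dependent noise).
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1993, 2012)
signal = 0.02 * (years - years[0])                      # a common warming-like trend
ensemble = signal + rng.normal(0.0, 0.05, size=(4, years.size))  # 4 hypothetical systems

ens_mean = ensemble.mean(axis=0)                        # ensemble mean time series
ens_spread = ensemble.std(axis=0, ddof=1)               # spread across the 4 systems
signal_to_noise = np.abs(ens_mean - ens_mean[0]) / ens_spread

for y, m, s, snr in zip(years[::6], ens_mean[::6], ens_spread[::6], signal_to_noise[::6]):
    print(f"{y}: mean {m:+.3f}, spread {s:.3f}, S/N {snr:.1f}")
```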
Abstract:
TIGGE was a major component of the THORPEX (The Observing System Research and Predictability Experiment) research program, whose aim was to accelerate improvements in forecasting high-impact weather. By providing ensemble prediction data from leading operational forecast centers, TIGGE has enhanced collaboration between the research and operational meteorological communities and enabled research studies on a wide range of topics. The paper covers the objective evaluation of the TIGGE data. For a range of forecast parameters, it is shown to be beneficial to combine ensembles from several data providers into a Multi-model Grand Ensemble. Alternative methods to correct systematic errors, including the use of reforecast data, are also discussed. TIGGE data have been used for a range of research studies on predictability and dynamical processes. Tropical cyclones are the most destructive weather systems in the world, and are a focus of multi-model ensemble research. Their extra-tropical transition also has a major impact on the skill of mid-latitude forecasts. We also review how TIGGE has added to our understanding of the dynamics of extra-tropical cyclones and storm tracks. Although TIGGE is a research project, it has proved invaluable for the development of products for future operational forecasting. Examples include the forecasting of tropical cyclone tracks, heavy rainfall, strong winds, and flood prediction through coupling hydrological models to ensembles. Finally, the paper considers the legacy of TIGGE. We discuss the priorities and key issues in predictability and ensemble forecasting, including the new opportunities of convective-scale ensembles, links with ensemble data assimilation methods, and extension of the range of useful forecast skill.
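The two ideas of pooling ensembles from several providers into a grand ensemble and removing model-dependent systematic error with reforecast-based bias estimates can be sketched as follows; the centres, biases and member counts are hypothetical:

```python
# Sketch of a multi-model grand ensemble with a simple reforecast-based bias correction.
# Truth, biases and ensemble sizes are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
truth = 15.0  # verifying observation for one forecast case (made up)

# Ensembles from three hypothetical centres, each with its own systematic bias
centres = {"A": rng.normal(16.2, 1.0, 50),   # warm bias
           "B": rng.normal(14.1, 1.2, 20),   # cold bias
           "C": rng.normal(15.3, 0.8, 30)}

# Mean biases estimated from each centre's reforecasts (assumed known here)
reforecast_bias = {"A": +1.2, "B": -0.9, "C": +0.3}

grand_raw = np.concatenate(list(centres.values()))
grand_corrected = np.concatenate([m - reforecast_bias[k] for k, m in centres.items()])

for name, ens in [("raw grand ensemble", grand_raw), ("bias-corrected", grand_corrected)]:
    print(f"{name}: mean error {ens.mean() - truth:+.2f}, spread {ens.std(ddof=1):.2f}")
```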
Abstract:
This paper introduces the special issue of Climatic Change on the QUEST-GSI project, a global-scale multi-sectoral assessment of the impacts of climate change. The project used multiple climate models to characterise plausible climate futures with consistent baseline climate and socio-economic data and consistent assumptions, together with a suite of global-scale sectoral impacts models. It estimated impacts across sectors under specific SRES emissions scenarios, and also constructed functions relating impact to change in global mean surface temperature. This paper summarises the objectives of the project and its overall methodology, outlines how the project approach has been used in subsequent policy-relevant assessments of future climate change under different emissions futures, and summarises the general lessons learnt in the project about model validation and the presentation of multi-sector, multi-region impact assessments and their associated uncertainties to different audiences.
Abstract:
The goal of the Palaeoclimate Modelling Intercomparison Project (PMIP) is to understand the response of the climate system to changes in different climate forcings and to feedbacks. Through comparison with observations of the environmental impacts of these climate changes, or with climate reconstructions based on physical, chemical or biological records, PMIP also addresses the issue of how well state-of-the-art models simulate climate changes. Palaeoclimate states are radically different from those of the recent past documented by the instrumental record and thus provide an out-of-sample test of the models used for future climate projections and a way to assess whether they have the correct sensitivity to forcings and feedbacks. Five distinctly different periods have been selected as focus for the core palaeoclimate experiments that are designed to contribute to the objectives of the sixth phase of the Coupled Model Intercomparison Project (CMIP6). This manuscript describes the motivation for the choice of these periods and the design of the numerical experiments, with a focus upon their novel features compared to the experiments performed in previous phases of PMIP and CMIP as well as the benefits of common analyses of the models across multiple climate states. It also describes the information needed to document each experiment and the model outputs required for analysis and benchmarking.
Abstract:
Video exposure monitoring (VEM) is a group of methods used for occupational hygiene studies. The method is based on the combined use of video recordings and measurements taken with real-time monitoring instruments. A commonly used name for VEM is PIMEX. Since PIMEX was invented in the mid-1980s, the method has been implemented and developed in a number of countries. With the aim of giving an updated picture of how VEM methods are used and of investigating needs for further development, a number of workshops have been organised in Finland, the UK, the Netherlands, Germany and Austria. Field studies have also been carried out with the aim of studying to what extent the PIMEX method can improve workers' motivation to actively take part in actions aimed at workplace improvements. The results from the workshops illustrate clearly that there is an impressive amount of experience and ideas for the use of VEM within the network of groups participating in the workshops. The sharing of these experiences between the groups, as well as their dissemination to wider groups, is, however, limited. The field studies made together with a number of welders indicate that their motivation to take part in workplace improvements increased after the PIMEX intervention. The results are, however, not fully conclusive, and further studies focusing on motivation are called for. It is recommended that strategies for VEM, for interventions in single workplaces as well as for exposure categorisation and the production of training material, are developed further. It is also recommended to conduct a research project with the intention of evaluating the effects of the use of VEM, and to disseminate knowledge about the potential of VEM to occupational hygiene experts and others who may benefit from its use.
Abstract:
Many projects fail, and one of the reasons is poor project governance in general and within the IT industry in particular. Based on criticism of the traditional methods in recent years, several lightweight methods known as agile methods have emerged. Scrum is the best-known agile method in use today. The method promises good results, but an article in the magazine Computer Sweden (February 2009) states that "figures show that nine out of ten Scrum projects fail". The article triggered our interest in finding out which Scrum-specific problems have been widely criticised, and we therefore chose to direct our study towards this. The thesis aims to investigate whether the local IT companies Headlight and Sogeti in Borlänge, and the state-owned network capacity provider Trafikverket ICT, suffer from the general problems that other Scrum users experience in connection with the use of the method. This thesis focuses on four problem areas: inadequate documentation, reduced efficiency in the work process, reduced efficiency in the work process in large projects, and insufficient support for evaluation. For our study, literature studies and interviews were conducted. Interview series were carried out with eleven people at our case companies. The target group for our interviews consisted of Product Owners (PO), ScrumMasters (SM) and developers. After completing the study, we can conclude that the general problems experienced by other Scrum users could also be identified at our case companies. The results have been confirmed by the collected data and our theoretical framework. In the discussion, we present recommendations for avoiding the problems related to Scrum.
Abstract:
In this project, Stora Enso's newly developed building system has been further developed to allow building to the Swedish passive house standard for the Swedish climate. The building system is based on a building framework of CLT (cross-laminated timber) boards. The concept has been tested on a small test building. The experience gained from this test building has also been used for planning a larger building (two storeys with the option of a third storey) to passive house standard with this building system. The main conclusions from the project are:
- It is possible to build airtight buildings with this technique without using traditional vapour barriers. Initial measurements show that this can be done without reaching critical humidity levels in the walls and roof, at least where wood fibre insulation is used, as this has a greater capacity for storing and evening out moisture than mineral wool. However, the test building has so far not been exposed to internal generation of moisture (added moisture from showers, food preparation etc.). This needs to be investigated, and this will be done during the winter of 2013-14.
- A new fixing method for doors and windows has been tested without traditional fibre filling between them and the CLT panel. The door or window is instead pressed directly onto the CLT panel, with an expandable sealing strip between them. This has proved to be successful.
- The air tightness between the CLT panels is achieved with expandable sealing strips between the panels. The position of the sealing strips is important, both for the air tightness itself and to allow rational assembly.
- Recurrent air tightness measurements show that the air tightness decreased somewhat during the first six months, but not to such an extent that the passive house criteria were no longer fulfilled. The reason for the decreased air tightness is not clear, but it may be due to small movements in the CLT construction and also to the sealing strips being affected by changing outdoor temperatures.
- Long-term measurements (at least two years) have to be carried out before more reliable conclusions can be drawn regarding the long-term effect of the construction on air tightness and humidity in the walls.
- An economic analysis comparing a concrete frame with the studied CLT frame for a three-storey building shows that it is probably more expensive to build with CLT. For buildings higher than three floors, the CLT frame has economic advantages, mainly because of the shorter building time compared to using concrete for the frame. In this analysis, no consideration has been given to differences in the influence on the environment or the global climate between the two construction methods.
Abstract:
Fan culture is a subculture that has developed explosively on the internet over the last decades. Fans are creating their own films, translations, fiction, fan art, blogs, role play and various other forms, all based on familiar popular culture creations like TV series, bestsellers, anime, manga stories and games. In our project, we analyze two of these subculture genres, fan fiction and scanlation. Amateurs, and sometimes professional writers, create new stories by adapting and developing existing storylines and characters from the original. In this way, a "network" of texts emerges, and writers step into an intertextual dialogue with established writers such as JK Rowling (Harry Potter) and Stephanie Meyer (Twilight). Literary reception and creation then merge into a rich reciprocal creative activity which includes comments and feedback from the participants in the community. The critical attitude of the fans regarding quality and the frustration at waiting for the official translation of manga books led to the development of scanlation, which is an amateur translation of manga distributed on the internet. Today, young internet users get involved in conceptual discussions of intertextuality and narrative structures through fan activity. In the case of scanlation, the scanlators practice the skills and techniques of translating in an informal environment. This phenomenon of participatory culture has been observed by scholars, and it has been concluded that such activities contribute to the development of students’ literacy and foreign language skills. Furthermore, there is no doubt that the fandom related to Japanese cultural products such as manga, anime and videogames is one of the strong motives for foreign students to start learning Japanese. This is something to take into pedagogical consideration when we develop web-based courses. Fan fiction and fan culture make an intensive transcultural dialogue possible between participants throughout the world and are of great interest when studying the interaction between formal and informal learning that puts the student in focus.
Abstract:
Being able to carry out an effective examination of volatile memory is becoming more and more important in IT forensic investigations, both on Linux- and Windows-based PC installations and on mobile devices in the form of Android and devices based on other mobile operating systems. Android uses a modified Linux kernel, where the modifications adapt the kernel to the special requirements of a mobile operating system. These modifications include message passing between processes as well as changes to how internal memory is managed and monitored. Since the two kernels are so closely related, the same basic principles can be used to dump and examine memory. The dump is made via a kernel module, which in this report is the software LiME, which can handle both kernels. Analysis of the memory requires that the tools used understand the memory layout in question. Depending on which method the tool uses, information about various symbols may also be needed. The tool used in this thesis is Volatility, which on paper is capable of extracting all the information needed to perform a correct examination. The work was intended to further develop existing methods for analysing volatile memory on Linux-based machines (PCs) and embedded systems (Android). Problems arose during the examination of volatile memory on Android, and the stated goals could not be fully achieved. It turned out that memory analysis aimed at the PC platform is both easier and smoother than it is for Android.
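For orientation, the acquisition-and-analysis workflow described above typically looks like the sketch below; the paths, module names and Volatility profile are placeholders, and on Android the LiME module must be cross-compiled against the device's exact kernel and an Android-specific Volatility profile built from the kernel symbols, so this is an assumed outline rather than commands taken from the thesis:

```python
# Hedged sketch of a LiME + Volatility workflow; all paths/profile names are placeholders.
import subprocess

DRY_RUN = True  # flip to False only on a prepared target; every step needs root access

def step(cmd):
    """Echo each step, and optionally run it."""
    print("+", cmd)
    if not DRY_RUN:
        subprocess.run(cmd, shell=True, check=True)

# 1) Acquire volatile memory with the LiME kernel module (same principle on Linux and Android)
step('insmod /tmp/lime.ko "path=/tmp/mem.lime format=lime"')          # Linux PC
# On Android the module is typically pushed and loaded over adb instead, e.g.:
step("adb push lime-android.ko /sdcard/")
step('adb shell su -c \'insmod /sdcard/lime-android.ko "path=/sdcard/mem.lime format=lime"\'')

# 2) Analyse the dump with Volatility 2; the profile must match the dumped kernel,
#    which for Android means building one from the kernel's System.map and dwarf info
step("python vol.py --profile=LinuxExampleProfilex64 -f /tmp/mem.lime linux_pslist")
```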