36 results for event mapping


Relevance:

20.00%

Publisher:

Abstract:

The topic of this master's thesis is translation and foreign-language communication in NGO development cooperation projects funded by the Ministry for Foreign Affairs of Finland. The thesis aims to map the translation and foreign-language communication needs and practices of these development cooperation projects. The subject has not otherwise been studied extensively, so the empirical part plays a particularly significant role. The thesis begins by presenting those of the Ministry for Foreign Affairs' terms and guidelines for Finnish NGOs' development cooperation projects that are relevant to this study. The theoretical part consists mainly of methodological reflection, because little prior research exists. The analysis of research methods is also emphasized because the empirical part carries such weight. The empirical material derives, on the one hand, from the researcher's observations, obtained by drawing on the researcher's own earlier project administration experience and through participant observation in the development cooperation projects of two student organizations between 2009 and 2012. On the other hand, the material was collected from the responses to a questionnaire. The questionnaire was sent to 35 Finnish NGOs whose development cooperation projects are funded by the Ministry for Foreign Affairs of Finland. The organizations were selected by random sampling. The analysis of the empirical material showed, among other things, that the researcher's original hypothesis about how commonly development cooperation texts are translated holds partly true, although fewer texts were translated in the NGOs than the researcher expected. The material is, however, not extensive enough for the results to be generalized, as the NGOs, their projects, translation needs and expertise proved, as expected, to be very diverse. The results also cover, for example, the backgrounds of those who translate and communicate in foreign languages within the organizations.
Further research is nevertheless needed, and there are plenty of possible research topics.

Relevance:

20.00%

Publisher:

Abstract:

Despite the fact that the literature on mergers and acquisitions is extensive, relatively little effort has been made to examine the relationship between acquiring firms' financial slack and short-term post-takeover announcement abnormal stock returns. In this study, the case is made that the financial slack of a firm is not only an outcome of past business and financing activities but may also affect the quality of acquisition decisions. We hypothesize that the level of financial slack in a firm is negatively associated with the abnormal returns following acquisition announcements, because slack reduces managerial discipline over the use of corporate funds and because it may give rise to managerial self-serving behavior. In this study, financial slack is measured in terms of three financial statement ratios: the leverage ratio, the cash and equivalents to total assets ratio, and the free cash flow to total assets ratio. The data used in this paper are collected from two main sources. A list comprising 90 European acquisition announcements is retrieved from the Thomson One Banker database. The stock price data and financial statement information for the respective firms are collected using Datastream. Our empirical analysis is twofold. First, we conduct a two-sample t-test and find that the most slack-rich firms experience lower abnormal returns than the most slack-poor firms in the event window [-1, +1], significant at the 5% risk level. Second, we perform a cross-sectional regression for the sample firms using the three financial statement ratios to explain cumulative abnormal returns (CAR). We find that leverage shows a statistically significant positive relationship with cumulative abnormal returns in the event window [-1, +1] (significant at 5%). Moreover, the cash to total assets ratio shows a weak negative relationship with CAR (significant at 10%) in the same event window.
We conclude that our hypothesis of an inverse relationship between slack and abnormal returns receives empirical support. Based on the results of the event study, we find empirical support for the hypothesis that capital markets expect acquisitions undertaken by slack-rich firms to be driven by managerial self-serving behavior and hubris more often than those undertaken by slack-poor firms, signaling possible agency problems and behavioral biases.
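A computation of cumulative abnormal returns over the [-1, +1] window can be sketched as follows. This is a minimal market-model illustration on synthetic data, not the study's actual estimation setup; the estimation-period length is an arbitrary illustrative choice.

```python
import numpy as np

def car_event_window(stock_ret, market_ret, event_idx, est_len=20, window=(-1, 1)):
    """Cumulative abnormal return (CAR) over an event window via the market
    model: expected returns are fitted on an estimation period preceding the
    window, and abnormal return = actual - (alpha + beta * market return)."""
    est = slice(event_idx - est_len + window[0], event_idx + window[0])
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)
    win = slice(event_idx + window[0], event_idx + window[1] + 1)
    abnormal = stock_ret[win] - (alpha + beta * market_ret[win])
    return abnormal.sum()

# Synthetic daily returns with the announcement on day 30.
rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 40)
stock = 0.8 * market + rng.normal(0.0, 0.005, 40)
car = car_event_window(stock, market, event_idx=30)
```

Cross-sectional CARs computed this way per firm would then be regressed on the slack proxies.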

Relevance:

20.00%

Publisher:

Abstract:

Vision affords us the ability to consciously see and to use this information in our behavior. While research has produced a detailed account of the function of the visual system, the neural processes that underlie conscious vision are still debated. One of the aims of the present thesis was to examine the time-course of the neuroelectrical processes that correlate with conscious vision. The second aim was to study the neural basis of unconscious vision, that is, situations where a stimulus that is not consciously perceived nevertheless influences behavior. According to current prevalent models of conscious vision, the activation of visual cortical areas is not, as such, sufficient for consciousness to emerge, although it might be sufficient for unconscious vision. Conscious vision is assumed to require reciprocal communication between cortical areas, but views differ substantially on the extent of this recurrent communication. Visual consciousness has been proposed to emerge from recurrent neural interactions within the visual system, while other models claim that more widespread cortical activation is needed for consciousness. Studies I-III compared models of conscious vision by studying event-related potentials (ERPs). An ERP represents the brain's average electrical response to stimulation. The results support the model that associates conscious vision with activity localized in the ventral visual cortex. The timing of this activity corresponds to an intermediate stage in visual processing. Earlier stages of visual processing may influence what becomes conscious, although these processes do not directly enable visual consciousness. Late processing stages, when more widespread cortical areas are activated, reflect the access to and manipulation of the contents of consciousness. Studies IV and V concentrated on unconscious vision.
By using transcranial magnetic stimulation (TMS), we show that when early visual cortical processing is disturbed so that subjects fail to consciously perceive visual stimuli, they may nevertheless guess (above chance level) the location where the visual stimuli were presented. However, the results also suggest that in a similar situation, the early visual cortex is necessary for both conscious and unconscious perception of chromatic information (i.e. color). Chromatic information that remains unconscious may influence behavioral responses when activity in the visual cortex is not disturbed by TMS. Our results support the view that early stimulus-driven (feedforward) activation may be sufficient for unconscious processing. In conclusion, the results of this thesis support the view that conscious vision is enabled by a series of processing stages. The processes that most closely correlate with conscious vision take place in the ventral visual cortex ~200 ms after stimulus presentation, although preceding time periods and contributions from other cortical areas such as the parietal cortex are also indispensable. Unconscious vision relies on intact early visual activation, although the location of a visual stimulus may be unconsciously resolved even when activity in the early visual cortex is interfered with.
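Since Studies I-III rest on ERPs, the trial averaging that defines them can be illustrated with a self-contained sketch; the component latency, amplitude and trial count are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Hypothetical illustration: a stimulus-locked negative component peaking
# around 200 ms is buried in noise on single trials, but averaging across
# trials cancels activity not time-locked to the stimulus.
fs = 1000                       # sampling rate in Hz
t = np.arange(0, 0.5, 1 / fs)   # one 500 ms epoch, stimulus at t = 0
component = -3.0 * np.exp(-((t - 0.2) ** 2) / (2 * 0.02 ** 2))

rng = np.random.default_rng(1)
trials = component + rng.normal(0.0, 5.0, size=(300, t.size))  # 300 trials
erp = trials.mean(axis=0)       # the event-related potential

peak_ms = t[np.argmin(erp)] * 1000   # latency of the largest negativity
```

The averaged waveform recovers the ~200 ms deflection that no single trial shows clearly.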

Relevance:

20.00%

Publisher:

Abstract:

The ongoing global financial crisis has demonstrated the importance of a system-wide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the tasks of early identification and assessment of risks and vulnerabilities that may eventually lead to a systemic financial crisis. Effective tools are crucial, as they allow early policy actions to decrease or prevent the further build-up of risks, or to otherwise enhance the shock-absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, the systemic risks are matched by three categories of analytical methods for decision support: i) early-warning models, ii) macro stress-testing models, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on a tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to serve as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, data and methods.
The following five questions comprise subsequent steps addressed in the process of this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. This thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
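The SOM underlying the SOFSM can be sketched in plain NumPy. This is a generic, minimal online SOM trained on synthetic two-cluster data, not the SOFSM itself; the grid size and decay schedules are illustrative choices.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal online Self-Organizing Map: each input pulls its best-matching
    unit (BMU) and that unit's grid neighbours toward it, so that nearby map
    units come to represent similar regions of the data space."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)               # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighbourhood
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            neigh = np.exp(-d2 / (2.0 * sigma ** 2))
            weights += lr * neigh[:, None] * (x - weights)
            step += 1
    return weights

def best_matching_unit(weights, x):
    return int(np.argmin(((weights - x) ** 2).sum(axis=1)))
```

In the SOFSM, map units trained this way on macro-financial indicators are labelled by crisis stage, and an economy's state is tracked as a trajectory of BMUs on the display.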

Relevance:

20.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis. These challenges include survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias caused by them in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey and register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1-5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey and register data can be used to analyse and compare the non-response and attrition processes, test the missingness mechanism type and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
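The weighted Kaplan-Meier estimation that IPCW builds on can be sketched as follows. This is a generic illustration, not the study's estimator; in actual IPCW the weights would come from a model of the censoring process.

```python
import numpy as np

def weighted_km(time, event, weights=None):
    """Kaplan-Meier survival curve with observation weights. With inverse
    probability of censoring weights (IPCW), subjects with a low estimated
    probability of remaining under observation are up-weighted, offsetting
    bias from dependent censoring."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=bool)
    w = np.ones_like(time) if weights is None else np.asarray(weights, float)
    surv, curve = 1.0, []
    for t in np.unique(time[event]):        # distinct event times
        at_risk = w[time >= t].sum()        # weighted risk set at t
        failed = w[(time == t) & event].sum()
        surv *= 1.0 - failed / at_risk
        curve.append((t, surv))
    return curve
```

With all weights equal, the function reduces to the ordinary Kaplan-Meier estimator.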

Relevance:

20.00%

Publisher:

Abstract:

In this work, image-based estimation methods, also known as direct methods, are studied which avoid feature extraction and matching completely. The cost functions use raw pixels as measurements, and the goal is to produce precise 3D pose and structure estimates. The cost functions presented minimize the sensor error, because the measurements are not transformed or modified. In photometric camera pose estimation, 3D rotation and translation parameters are estimated by minimizing a sequence of image-based cost functions, which are non-linear due to perspective projection and lens distortion. In image-based structure refinement, on the other hand, the 3D structure is refined using a number of additional views and an image-based cost metric. Image-based estimation methods are particularly useful in conditions where the Lambertian assumption holds and the 3D points have constant color regardless of viewing angle. The goal is to improve image-based estimation methods and to produce computationally efficient methods which can be accommodated into real-time applications. The developed image-based 3D pose and structure estimation methods are finally demonstrated in practice in indoor 3D reconstruction use and in a live augmented reality application.
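The defining idea of a direct method, a cost computed on raw pixels with no feature extraction, can be reduced to a toy 1-D example: estimating the shift between two signals by minimizing the photometric sum of squared differences. The real problem is 6-DoF and non-linear; this brute-force integer-shift sketch only illustrates the cost function.

```python
import numpy as np

def photometric_shift(ref, cur, max_shift=5):
    """Estimate an integer 1-D shift directly from raw pixel intensities by
    minimizing the photometric sum-of-squared-differences cost over the
    overlapping region, with no feature extraction or matching."""
    n = len(ref)
    best, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, -s), n - max(0, s)          # valid overlap
        cost = ((ref[lo:hi] - cur[lo + s:hi + s]) ** 2).mean()
        if cost < best_cost:
            best, best_cost = s, cost
    return best
```

In the thesis setting, the same photometric residual is minimized over continuous pose parameters with non-linear optimization instead of this exhaustive search.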

Relevance:

20.00%

Publisher:

Abstract:

Value networks have been studied extensively in academic research, but a tool for value network mapping has been missing. The objective of this study was to design a tool (a process) for value network mapping in cross-sector collaboration. Furthermore, the study addressed a future perspective of collaboration, aiming to map the value network's potential. The study also investigated how to realize the full potential of collaboration by creating new value in the collaboration process; these actions are part of the mapping process proposed in the study. The implementation and testing of the mapping process were realized through a case study of cross-sector collaboration in welfare services for the elderly in Eastern Finland. Key representatives in elderly care from the public, private and third sectors were interviewed, and a workshop with experts from every sector was also conducted. The value network mapping process designed in this study consists of specific steps that help managers and experts understand how to produce a complex value network map and how to enhance it. Furthermore, it makes it easier to understand how new value can be created in the collaboration process. The map can be used to motivate participants to engage responsibly in collaboration and to commit fully to their interactions. It can also be used as a motivating tool for organizations that intend to enter a collaboration process. Additionally, a value network map is a starting point for many value network analyses. Furthermore, the enhanced value network map can be used as a performance measurement tool in cross-sector collaboration.

Relevance:

20.00%

Publisher:

Abstract:

This bachelor's thesis, written for Lappeenranta University of Technology and implemented in a medium-sized enterprise (SME), examines a distributed document migration system. The system was created to migrate a large number of electronic documents, along with their metadata, from one document management system to another, so as to enable a rapid switchover of an enterprise resource planning system inside the company. The paper examines, through theoretical analysis, messaging as a possible enabler of distributed applications and how it naturally fits an event-based model, whereby system transitions and states are expressed through recorded behaviours. This is put into practice by analysing the implemented migration system and how its core components, MassTransit, RabbitMQ and MongoDB, were orchestrated together to realize such a system. As a result, the paper presents an architecture for a scalable and distributed system that could migrate hundreds of thousands of documents over a weekend, serving its goal of enabling a rapid system switchover.
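The competing-consumers pattern at the heart of such a system can be sketched with Python's standard library standing in for the broker (in the thesis, MassTransit over RabbitMQ fills this role and MongoDB records the events; the in-process queue, worker count and document shape here are illustrative).

```python
import queue
import threading

def migrate(documents, n_workers=4):
    """Message-driven migration sketch: a producer publishes one message per
    document and competing consumers process them in parallel, mirroring how
    a broker distributes work across consumer instances."""
    q = queue.Queue()
    migrated = []
    lock = threading.Lock()

    def consumer():
        while True:
            doc = q.get()
            if doc is None:            # poison pill: shut this worker down
                return
            with lock:                 # stands in for the target-system write
                migrated.append(doc["id"])

    workers = [threading.Thread(target=consumer) for _ in range(n_workers)]
    for w in workers:
        w.start()
    for doc in documents:              # "publish" one message per document
        q.put(doc)
    for _ in workers:                  # one poison pill per worker
        q.put(None)
    for w in workers:
        w.join()
    return migrated

migrated_ids = migrate([{"id": i} for i in range(1000)])
```

Scaling out then amounts to adding consumer processes on more machines, which is what the broker-based design buys over this in-process stand-in.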

Relevance:

20.00%

Publisher:

Abstract:

Biomedical natural language processing (BioNLP) is a subfield of natural language processing, an area of computational linguistics concerned with developing programs that work with natural language: written texts and speech. Biomedical relation extraction concerns the detection of semantic relations such as protein-protein interactions (PPI) from scientific texts. The aim is to enhance information retrieval by detecting relations between concepts, not just individual concepts as with a keyword search. In recent years, events have been proposed as a more detailed alternative for simple pairwise PPI relations. Events provide a systematic, structural representation for annotating the content of natural language texts. Events are characterized by annotated trigger words, directed and typed arguments and the ability to nest other events. For example, the sentence “Protein A causes protein B to bind protein C” can be annotated with the nested event structure CAUSE(A, BIND(B, C)). Converted to such formal representations, the information of natural language texts can be used by computational applications. Biomedical event annotations were introduced by the BioInfer and GENIA corpora, and event extraction was popularized by the BioNLP'09 Shared Task on Event Extraction. In this thesis we present a method for automated event extraction, implemented as the Turku Event Extraction System (TEES). A unified graph format is defined for representing event annotations and the problem of extracting complex event structures is decomposed into a number of independent classification tasks. These classification tasks are solved using SVM and RLS classifiers, utilizing rich feature representations built from full dependency parsing. Building on earlier work on pairwise relation extraction and using a generalized graph representation, the resulting TEES system is capable of detecting binary relations as well as complex event structures. 
We show that this event extraction system has good performance, reaching first place in the BioNLP'09 Shared Task on Event Extraction. Subsequently, TEES has achieved several first ranks in the BioNLP'11 and BioNLP'13 Shared Tasks, as well as showing competitive performance in the binary-relation Drug-Drug Interaction Extraction 2011 and 2013 shared tasks. The Turku Event Extraction System is published as a freely available open-source project, documenting the research in detail as well as making the method available for practical applications. In particular, in this thesis we describe the application of the event extraction method to PubMed-scale text mining, showing that the developed approach not only performs well, but is generalizable and applicable to large-scale real-world text mining projects. Finally, we discuss related literature, summarize the contributions of the work and present some thoughts on future directions for biomedical event extraction. This thesis includes and builds on six original research publications. The first of these introduces the analysis of dependency parses that led to the development of TEES. The entries in the three BioNLP Shared Tasks, as well as in the DDIExtraction 2011 task, are covered in four publications, and the sixth demonstrates the application of the system to PubMed-scale text mining.
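The nested annotation CAUSE(A, BIND(B, C)) from the example sentence can be made concrete with a small sketch of an event structure and its flattening into trigger-argument edges, loosely in the spirit of the unified graph format; the class design and role labels such as "Theme2" are illustrative assumptions, not the actual TEES schema.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """An annotated event: an event type (for its trigger word) plus typed
    arguments, each of which is an entity name or a nested Event."""
    etype: str
    args: list = field(default_factory=list)   # (role, entity-or-Event)

def to_edges(event):
    """Flatten a nested event into (source, role, target) edges, in the
    spirit of a unified graph representation of event annotations."""
    edges = []
    for role, arg in event.args:
        target = arg.etype if isinstance(arg, Event) else arg
        edges.append((event.etype, role, target))
        if isinstance(arg, Event):
            edges.extend(to_edges(arg))
    return edges

# "Protein A causes protein B to bind protein C" -> CAUSE(A, BIND(B, C))
bind = Event("BIND", [("Theme", "B"), ("Theme2", "C")])
cause = Event("CAUSE", [("Cause", "A"), ("Theme", bind)])
```

Decomposing the structure into such edges is what lets each trigger and argument be predicted as an independent classification task and the events be reassembled afterwards.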

Relevance:

20.00%

Publisher:

Abstract:

Continuous loading and unloading can cause the breakdown of cranes. In seeking a solution to this problem, the use of intelligent control systems for improving the fatigue life of cranes has been under study in mechatronics since 1994. This research focuses on the use of neural networks for developing an algorithm to map stresses on a crane. The intelligent algorithm was designed to be part of the crane's control system. The design process started with SolidWorks and ANSYS, followed by co-simulation using MSC Adams software incorporated into MATLAB/Simulink, and finally MATLAB's neural network (NN) tools for the optimization process. The flexibility of the boom accounted for the accuracy of the maximum stress results in the ADAMS model. The flexible model created in ANSYS produced more accurate results than the flexibility model built in ADAMS/View using discrete links. The compatibility between the ADAMS and ANSYS software was paramount for the efficiency and accuracy of the results. Von Mises stress analysis was well suited to this thesis work because the hydraulic boom was made from construction steel FE-510 of steel grade S355 with a yield strength of 355 MPa. Von Mises theory was appropriate for further analysis due to the ductility of the material and the repeated tensile and shear loading. Neural network predictions for the maximum stresses were then compared with the co-simulation results, and the comparison showed that the neural network model was sufficiently accurate in predicting the maximum stresses on the boom relative to co-simulation.
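A minimal version of such a stress-predicting network can be sketched in plain NumPy: a generic one-hidden-layer regressor trained on a synthetic load-to-stress function, standing in for the MATLAB NN toolbox and the co-simulation data used in the thesis.

```python
import numpy as np

def train_mlp(X, y, hidden=16, lr=0.05, epochs=5000, seed=0):
    """Train a one-hidden-layer tanh network by full-batch gradient descent
    to regress a scalar target (here standing in for maximum boom stress)
    from input loading parameters. Returns a prediction function."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(X)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)              # forward pass
        err = (h @ W2 + b2) - y               # prediction error
        gW2 = h.T @ err / n                   # backprop of squared loss
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)    # through the tanh layer
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: np.tanh(Xn @ W1 + b1) @ W2 + b2
```

Trained on co-simulation input/output pairs, such a surrogate replaces the expensive multibody simulation at prediction time.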

Relevance:

20.00%

Publisher:

Abstract:

Context: Web services have been gaining popularity due to the success of service-oriented architecture and cloud computing. Web services offer a tremendous opportunity for service developers to publish their services and applications across the boundaries of the organization or company. However, to fully exploit these opportunities it is necessary to find an efficient discovery mechanism; Web service discovery has thus attracted considerable attention in Semantic Web research. Nevertheless, there have been no literature surveys that systematically map the present research results, so the overall impact of these research efforts and the level of maturity of their results are still unclear. This thesis aims at providing an overview of the current state of research into Web service discovery mechanisms using systematic mapping. The work is based on papers published from 2004 to 2013 and elaborates various aspects of the analyzed literature, including classifying the papers in terms of the architectures, frameworks and methods used for Web service discovery. Objective: The objective of this work is to summarize the current knowledge available on Web service discovery mechanisms, and to systematically identify and analyze the currently published research in order to identify the different approaches presented. Method: A systematic mapping study has been employed to assess the various Web service discovery approaches presented in the literature. Systematic mapping studies are useful for categorizing and summarizing the level of maturity of a research area. Results: The results indicate that numerous approaches are consistently being researched and published in this field. In terms of where this research is published, conferences are the major publishing arena: 48% of the selected papers were published at conferences, illustrating the level of maturity of the research topic. Additionally, the 52 selected papers are categorized into two broad segments, functional and non-functional approaches, taking into consideration architectural aspects and information retrieval approaches, semantic matching, syntactic matching, behavior-based matching, as well as QoS and other constraints.

Relevance:

20.00%

Publisher:

Abstract:

Acid sulfate (a.s.) soils constitute a major environmental issue. Severe ecological damage results from the considerable amounts of acidity and metals leached by these soils in the recipient watercourses. As even small hot spots may affect large areas of coastal waters, mapping represents a fundamental step in the management and mitigation of a.s. soil environmental risks (i.e. to target strategic areas). Traditional mapping in the field is time-consuming and therefore expensive. Additional more cost-effective techniques have, thus, to be developed in order to narrow down and define in detail the areas of interest. The primary aim of this thesis was to assess different spatial modeling techniques for a.s. soil mapping, and the characterization of soil properties relevant for a.s. soil environmental risk management, using all available data: soil and water samples, as well as datalayers (e.g. geological and geophysical). Different spatial modeling techniques were applied at catchment or regional scale. Two artificial neural networks were assessed on the Sirppujoki River catchment (c. 440 km2) located in southwestern Finland, while fuzzy logic was assessed on several areas along the Finnish coast. Quaternary geology, aerogeophysics and slope data (derived from a digital elevation model) were utilized as evidential datalayers. The methods also required the use of point datasets (i.e. soil profiles corresponding to known a.s. or non-a.s. soil occurrences) for training and/or validation within the modeling processes. Applying these methods, various maps were generated: probability maps for a.s. soil occurrence, as well as predictive maps for different soil properties (sulfur content, organic matter content and critical sulfide depth). The two assessed artificial neural networks (ANNs) demonstrated good classification abilities for a.s. soil probability mapping at catchment scale. 
Slightly better results were achieved using a Radial Basis Function (RBF) based ANN than with the Radial Basis Functional Link Net (RBFLN) method, narrowing down the most probable areas for a.s. soil occurrence more accurately and defining the least probable areas more properly. The RBF-based ANN also demonstrated promising results for the characterization of different soil properties in the most probable a.s. soil areas at catchment scale. Since a.s. soil areas constitute highly productive land for agricultural purposes, the combination of a probability map with more specific soil property predictive maps offers a valuable toolset for targeting strategic areas for subsequent environmental risk management more precisely. Notably, the use of laser scanning (i.e. Light Detection And Ranging, LiDAR) data enabled a more precise definition of the a.s. soil probability areas, as well as of the soil property modeling classes for sulfur content and critical sulfide depth. Given suitable training/validation points, ANNs can be trained to yield a more precise modeling of the occurrence of a.s. soils and their properties. By contrast, fuzzy logic represents a simple, fast and objective alternative for carrying out preliminary surveys, at catchment or regional scale, in areas offering a limited amount of data. This method enables delimiting and prioritizing the most probable areas for a.s. soil occurrence, which can be particularly useful in the field. Being easily transferable from area to area, fuzzy logic modeling can be carried out at regional scale; mapping at this scale would be extremely time-consuming through manual assessment. The use of spatial modeling techniques enables the creation of valid and comparable maps, which represents an important development within the a.s. soil mapping process. The a.s. soil mapping was also assessed using water chemistry data for 24 different catchments along the Finnish coast (in all, covering c. 21,300 km2), which were mapped with different methods (i.e. conventional mapping, fuzzy logic and an artificial neural network). Two a.s. soil related indicators measured in the river water (the sulfate content and the sulfate/chloride ratio) were compared to the extent of the most probable areas for a.s. soils in the surveyed catchments. High sulfate contents and sulfate/chloride ratios measured in most of the rivers demonstrated the presence of a.s. soils in the corresponding catchments. The calculated extent of the most probable a.s. soil areas is supported by this independent water chemistry data, suggesting that the a.s. soil probability maps created with the different methods are reliable and comparable.
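The fuzzy logic overlay used in such preliminary surveys typically combines evidential layers with an operator such as the fuzzy gamma operator; a minimal sketch follows, with layer names and membership values invented for illustration.

```python
import numpy as np

def fuzzy_gamma(memberships, gamma=0.8):
    """Combine fuzzy membership layers with the fuzzy gamma operator, a
    compromise between the fuzzy algebraic product (pessimistic, AND-like)
    and the fuzzy algebraic sum (optimistic, OR-like)."""
    m = np.asarray(memberships, dtype=float)
    product = m.prod(axis=0)                 # fuzzy algebraic product
    fsum = 1.0 - (1.0 - m).prod(axis=0)      # fuzzy algebraic sum
    return fsum ** gamma * product ** (1.0 - gamma)

# Hypothetical memberships for three pixels from three evidential layers
# (e.g. Quaternary geology, aerogeophysics, slope); values are invented.
layers = [
    [0.9, 0.4, 0.1],   # fine-grained sediment membership
    [0.8, 0.5, 0.2],   # low-resistivity anomaly membership
    [0.7, 0.6, 0.3],   # low-slope membership
]
probability = fuzzy_gamma(layers)
```

Pixels supported by all layers score high, while a single low membership pulls the combined score down less drastically than a pure product would.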

Relevance:

20.00%

Publisher:

Abstract:

Successful management of rivers requires an understanding of the fluvial processes that govern them. This, in turn, cannot be achieved without a means of quantifying their geomorphology and hydrology and the spatio-temporal interactions between them, that is, their hydromorphology. For a long time, it has been laborious and time-consuming to measure river topography, especially in the submerged part of the channel. The measurement of the flow field has been challenging as well, and hence such measurements have long been sparse in natural environments. Technological advancements in the field of remote sensing in recent years have opened up new possibilities for capturing synoptic information on river environments. This thesis presents new developments in fluvial remote sensing of both topography and water flow. A set of close-range remote sensing methods is employed to eventually construct a high-resolution unified empirical hydromorphological model, that is, the river channel and floodplain topography and the three-dimensional areal flow field. Empirical as well as hydraulic theory-based optical remote sensing methods are tested and evaluated using normal colour aerial photographs and sonar calibration and reference measurements on a rocky-bed sub-Arctic river. The empirical optical bathymetry model is developed further by the introduction of a deep-water radiance parameter estimation algorithm that extends the field of application of the model to shallow streams. The effect of this parameter on the model is also assessed in a study of a sandy-bed sub-Arctic river using close-range high-resolution aerial photography, presenting one of the first examples of fluvial bathymetry modelling from unmanned aerial vehicles (UAVs). Further close-range remote sensing methods are added to complete the topography, integrating the river bed with the floodplain to create a seamless high-resolution topography.
Boat-, cart- and backpack-based mobile laser scanning (MLS) is used to measure the topography of the dry part of the channel at high resolution and accuracy. Multitemporal MLS is evaluated, along with UAV-based photogrammetry, against terrestrial laser scanning reference data and merged with UAV-based bathymetry to create a two-year series of seamless digital terrain models. These allow the evaluation of the methodology for conducting high-resolution change analysis of the entire channel. The remote sensing based model of hydromorphology is completed by a new methodology for mapping the flow field in 3D. An acoustic Doppler current profiler (ADCP) is deployed on a remote-controlled boat with a survey-grade global navigation satellite system (GNSS) receiver, allowing the areally sampled 3D flow vectors to be positioned in 3D space as a point cloud; interpolating this point cloud into a 3D matrix allows quantitative volumetric flow analysis. Multitemporal areal 3D flow field data show the evolution of the flow field during a snow-melt flood event. The combination of the underwater and dry topography with the flow field yields a complete model of river hydromorphology at the reach scale.
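The interpolation of scattered ADCP samples onto a regular 3D grid can be sketched with simple inverse-distance weighting. This is a generic illustration; the thesis does not prescribe this particular scheme, and real data would be interpolated one velocity component at a time.

```python
import numpy as np

def idw(points, values, grid_points, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of scattered samples
    (e.g. one ADCP velocity component at GNSS-positioned 3D locations)
    onto arbitrary grid nodes."""
    points = np.asarray(points, dtype=float)
    values = np.asarray(values, dtype=float)
    grid_points = np.asarray(grid_points, dtype=float)
    # Squared distances from every grid node to every sample point.
    d2 = ((grid_points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    w = 1.0 / (d2 ** (power / 2.0) + eps)    # closer samples weigh more
    return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)
```

Repeating this for the three velocity components fills the 3D matrix from which volumetric flow quantities can be integrated.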

Relevance:

20.00%

Publisher:

Abstract:

The goal of this study was to explore and understand the definition of technical debt. Technical debt refers to a situation in software development where shortcuts or workarounds are taken in technical decisions. However, the original definition has since been applied to other parts of software development, and technical debt is currently difficult to define. We used the mapping study process as a research methodology to collect literature related to the research topic. We collected 159 papers that referred to the original definition of technical debt, retrieved from scientific literature databases during the search process. From these papers we retrieved 107 definitions, which were split into keywords. The keyword map is one of the main results of this work. Apart from that, the resulting synonyms and different types of technical debt were analyzed and added to the map as branches. Overall, 33 keywords or phrases, 6 synonyms and 17 types of technical debt were distinguished.
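The splitting of definitions into keywords can be illustrated with a short sketch; the stopword list and example definitions are invented here, and the study's actual keywording was an analytical process rather than this automated count.

```python
from collections import Counter
import re

STOPWORDS = frozenset({"the", "a", "an", "is", "to", "in", "of", "and",
                       "or", "that", "are", "it"})  # illustrative list

def keyword_map(definitions):
    """Split each collected definition into candidate keywords and count in
    how many definitions each keyword occurs; frequently recurring keywords
    become the branches of the keyword map."""
    counts = Counter()
    for text in definitions:
        words = set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS
        counts.update(words)
    return counts

definitions = [  # invented example definitions
    "Technical debt is a shortcut taken in a technical decision.",
    "Technical debt refers to workarounds that accumulate in code.",
]
counts = keyword_map(definitions)
```

Keywords shared by many definitions (here "technical" and "debt") form the trunk of the map, while rarer terms, synonyms and debt types hang off it as branches.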