12 results for Digital Journalism
in Helda - Digital Repository of the University of Helsinki
Abstract:
This thesis examines the professionalism of Finnish television subtitlers, their translation process, and the effects of digital subtitling software on the subtitling process from the perspective of professional subtitlers. The digitalisation of Finnish television has caused upheavals in the subtitling field as well, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theoretical section discusses translation and subtitling research and training in Finland, professional competence and professionalism, and translation aids. Subtitling is presented as a specialised form of translation; it should be noted, however, that translation is only one stage in the subtitling process. The theoretical section concludes with a discussion of the everyday work and current professional field of Finnish television subtitlers: subtitlers work under a wide variety of employment terms, and quality criteria may have to be re-evaluated. The empirical section opens by noting that surprisingly few Finnish television subtitlers have been interviewed and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unstudied, particularly the Finnish subtitling process. The subjects of the study are translators who produce television subtitles for a living. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specialising in subtitling; through both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents have a neutral or even negative view of their profession; what these subtitlers have in common is that all of them have less than five years of experience in the field. The majority of respondents, however, are proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview stage, a translation stage, a timing stage and a revision viewing stage. The subtitlers were asked, among other things, to estimate the total duration of their subtitling process. Large differences emerged in the durations, at least some of which correlate with experience. A good half of the respondents have acquired digital subtitling software for their own use, while some still do the timing at the translation agency, partly because of the high cost of the software. Digital software has brought changes to the subtitling process and working practices, as subtitlers have moved from video recorders and television sets to working on a computer alone. It is now possible to work remotely from distant locations, to alternate between translating and timing, or to pre-time first and then translate. Digital technology has thus enabled the subtitling process to change and made alternative working methods possible, but not all of these methods necessarily benefit the subtitler. The traditional subtitling process (preview, marking subtitle divisions in the script, translating and composing the subtitles, corrections and a final check viewing) still appears to be the most efficient. Although working practices differ, the overall impression is that, after the initial stumbles of digitalisation, subtitlers' work has become more efficient.
Abstract:
Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, owing to their geomorphological importance as the reference surface for gravitation-driven material flow, as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. The investigation of this phenomenon is known as error propagation analysis, which has a direct influence on decision-making based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented in a 5-50 m grid and used at application scales of 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analysis, and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, a global characterisation of DEM error is a gross generalisation of reality, because the areas within which the assumption of stationarity is not violated are small in extent. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning, together with local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the vertical error of the DEM increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged, because none of the DEM derivatives investigated in the study showed maximum variation with spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
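As a rough illustration of the simulation-based error propagation described in this abstract, the following Python sketch generates spatially autocorrelated DEM error realisations by process convolution (smoothing white noise with a Gaussian kernel) and propagates them through a slope computation. The synthetic terrain, grid size and error parameters are illustrative assumptions, not values from the thesis.

# Minimal sketch of Monte Carlo error propagation with a process-convolution
# DEM error model; all parameter values are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

def error_realisation(shape, sigma_z, correlation_range):
    """One spatially autocorrelated error field: smooth white noise with a
    Gaussian kernel (process convolution), then rescale to the target
    error standard deviation sigma_z (metres)."""
    noise = rng.standard_normal(shape)
    field = gaussian_filter(noise, sigma=correlation_range)
    return field * (sigma_z / field.std())

def slope_degrees(dem, cell_size):
    """Slope from finite differences, a common DEM surface derivative."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Synthetic 'true' DEM on a 25 m grid, within the 5-50 m fine toposcale range.
cell = 25.0
x, y = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 10, 200))
dem = 50 * np.sin(x) * np.cos(y)

# Monte Carlo loop: perturb the DEM and collect the derivative each time.
slopes = np.stack([
    slope_degrees(dem + error_realisation(dem.shape, sigma_z=1.5,
                                          correlation_range=4.0), cell)
    for _ in range(100)
])
# Per-cell spread of the derivative summarises the propagated uncertainty.
print("mean per-cell slope std (deg):", slopes.std(axis=0).mean())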
Abstract:
Purpose: The aim of the present study was to develop and test new digital imaging equipment and methods for the diagnosis and follow-up of ocular diseases. Methods: The whole material comprised 398 subjects (469 examined eyes), including 241 patients with melanocytic choroidal tumours, 56 patients with melanocytic iris tumours, 42 patients with diabetes, a 52-year-old patient with the chronic phase of VKH disease, a 30-year-old patient with an old blunt eye injury, and 57 normal healthy subjects. Digital 50° (Topcon TRC 50 IA) and 45° (Canon CR6-45NM) fundus cameras, a new handheld digital colour video camera for eye examinations (MediTell), a new subtraction method using the Topcon Image Net Program (Topcon Corporation, Tokyo, Japan), a new method we developed for digital infrared transillumination (IRT) imaging of the iris, and a Zeiss photo slit lamp with a digital camera body were used for digital imaging. Results: Digital 50° red-free imaging had a sensitivity of 97.7%, and two-field 45° and 50° colour imaging a sensitivity of 88.9-94%. The specificity of the digital 45°-50° imaging modalities was 98.9-100% versus the reference standard, with 1.2-1.6% of images ungradeable. Using the handheld digital colour video camera alone, only the optic disc and the central fundus within 20° of the fovea could be recorded, giving a sensitivity of 6.9% for the detection of at least mild NPDR compared with the reference standard. Comparative use of digital colour, red-free, and red light imaging showed 85.7% sensitivity, 99% specificity, and 98.2% exact agreement versus the reference standard in differentiating small choroidal melanoma from pseudomelanoma. The new subtraction method showed growth in four of 94 melanocytic tumours (4.3%) during a mean ± SD follow-up of 23 ± 11 months. The new digital IRT imaging of the iris showed the sphincter muscle and the radial contraction folds of Schwalbe in the pupillary zone, and the radial structural folds of Schwalbe and circular contraction furrows in the ciliary zone of the iris. The 52-year-old patient with the chronic phase of VKH disease showed extensive atrophy and occasional pigment clumps in the iris stroma, detachment of the ciliary body with severe ocular hypotony, and shallow retinal detachment of the posterior pole in both eyes. Infrared transillumination imaging and fluorescein angiographic findings of the iris showed that IR translucence (p=0.53), complete masking of fluorescence (p=0.69), presence of disorganized vessels (p=0.32), and fluorescein leakage (p=1.0) at the site of the lesion did not differentiate an iris nevus from a melanoma. Conclusions: Digital 50° red-free and two-field 50° or 45° colour imaging were suitable for DR screening, whereas the handheld digital video camera did not fulfill the needs of DR screening. Comparative use of digital colour, red-free, and red light imaging was a suitable method for differentiating small choroidal melanoma from various pseudomelanomas. The subtraction method may reveal early growth of melanocytic choroidal tumours. Digital IRT imaging may be used to study changes in the stroma and posterior surface of the iris in various diseases of the uvea. It contributed to revealing iris atrophy and serous detachment of the ciliary body with ocular hypotony, together with shallow retinal detachment of the posterior pole, as new findings in the chronic phase of VKH disease. Infrared translucence and angiographic findings are useful in the differential diagnosis of melanocytic iris tumours, but they cannot be used to determine whether a lesion is benign or malignant.
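For readers unfamiliar with the screening figures quoted above, the following Python sketch shows how sensitivity, specificity and exact agreement versus a reference standard are computed from a confusion matrix. The counts are invented for illustration only, not data from the study.

# Minimal sketch of screening metrics; the counts are illustrative only.
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and exact agreement vs. a reference standard."""
    sensitivity = tp / (tp + fn)            # diseased eyes correctly detected
    specificity = tn / (tn + fp)            # healthy eyes correctly ruled out
    agreement = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, agreement

sens, spec, agree = screening_metrics(tp=85, fn=2, tn=300, fp=1)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, agreement {agree:.1%}")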
Abstract:
The methods for estimating patient exposure in x-ray imaging are based on the measurement of the radiation incident on the patient. In digital imaging, the useful dose range of the detector is large and excessive doses may remain undetected. Therefore, real-time monitoring of radiation exposure is important. According to international recommendations, the measurement uncertainty should be lower than 7% (at the 95% confidence level). The kerma-area product (KAP) is a measurement quantity used for monitoring patient exposure to radiation. A field KAP meter is typically attached to an x-ray device, and it is important to recognize the effect of this measurement geometry on the response of the meter. In the tandem calibration method introduced in this study, a field KAP meter is used in its clinical position and calibration is performed with a reference KAP meter. This method provides a practical way to calibrate field KAP meters. However, the reference KAP meters themselves require comprehensive calibration. In the calibration laboratory it is recommended to use standard radiation qualities, but these qualities do not entirely correspond to the large range of clinical radiation qualities. In this work, the energy dependence of the response of different KAP meter types was examined. According to our findings, the recommended accuracy in KAP measurements is difficult to achieve with conventional KAP meters because of their strong energy dependence. The energy dependence of the response of a novel large KAP meter was found to be much lower than that of a conventional KAP meter. The accuracy of the tandem method can be improved by using this meter type as the reference meter. A KAP meter cannot be used to determine the radiation exposure of patients in mammography, in which part of the radiation beam is always aimed directly at the detector without attenuation by tissue. This work assessed whether pixel values from this detector area could be used to monitor the radiation beam incident on the patient. The results were congruent with the tube output calculation, the method generally used for this purpose, and the recommended accuracy can be achieved with the studied method. New optimization of radiation qualities and dose levels is needed when other detector types are introduced. In this work, the optimal selections were examined with one direct digital detector type. For this device, the use of radiation qualities with higher energies was recommended, and appropriate image quality was achieved by increasing the low dose level of the system.
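The tandem calibration idea can be sketched in a few lines of Python: the field KAP meter stays in its clinical position on the x-ray device, its readings are compared against a calibrated reference KAP meter over several exposures, and the averaged ratio serves as the calibration factor. All readings below are illustrative assumptions, not measurements from the study.

# Minimal sketch of tandem calibration of a field KAP meter; all readings
# are illustrative assumptions. KAP values in Gy*cm^2.
def tandem_calibration_factor(reference_kap, field_kap):
    """Calibration factor for the field meter in its clinical geometry."""
    return reference_kap / field_kap

# Compare the two meters over several exposures and average the ratios.
reference_readings = [1.02, 1.05, 0.98]   # reference KAP meter
field_readings = [0.91, 0.95, 0.88]       # field KAP meter, same exposures

factors = [tandem_calibration_factor(r, f)
           for r, f in zip(reference_readings, field_readings)]
k = sum(factors) / len(factors)
print(f"field-meter calibration factor: {k:.3f}")

# A patient-exposure reading from the field meter is then corrected as:
print(f"corrected KAP: {k * 0.90:.3f} Gy*cm^2")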
Abstract:
The study Slogans of Change. Three Outlooks on Finnish Television Contents is concerned with alleged changes in television contents during the 1990s and 2000s, such as 'dumbing down', 'tabloidisation', 'entertainisation', and the like. Specifically, the focus is on the ways these changes might manifest in Finnish television. The aim of the study has been threefold: 1. to operationalise public and academic discussions about change via specific slogans emerging from the debates; 2. consequently, to study the slogans empirically and reflect on the findings against earlier research, including studies on institutional and audience-related aspects; 3. finally, to suggest what the findings might mean for discussions about television's role, and what kinds of slogans or concepts might best serve future discussions and research. The empirical outlooks presented in this study offer analyses built around three different sets of opposing slogans of change. The outlooks also follow three different traditions in the study of television. The first outlook focuses on quantity, giving a longitudinal (1993-2004), macro-level view of programme structures. The methodological approach is derived from media economics and policy studies. The claims framing the analysis are convergence versus diversification of programme structures. The second outlook provides quantitative and qualitative views on the characteristics and quality (the term signifying essence as well as worth) of Finnish television journalism during sample weeks from the years 2002 and 2003. This outlook follows the traditions of quantitative content analysis found in journalism studies, coupled with descriptive qualitative content analyses. The slogans reflected in this section are the 'lightening' or 'widening' of journalism. The third outlook narrows down the material and focuses at a micro-level on form, that is, on communicative conventions in a small array of selected programmes from 1993, 2000 and 2002-2004. The analyses are inspired by the conversation-analytic method for studying verbal interaction, coupled with qualitative close readings, with a focus on the different communicative situations in the programmes. The catchphrases employed in this part are 'emotainment' versus 'democratainment', coupled with more specific claims of discursive hybridisation and conversationalisation. The findings show that, empirically, changes in Finnish television contents are not clear linear trends and cannot easily be moulded into neat slogans. The quantitative outlook on programme output during 1993-2004 shows a tendency towards the differentiation of channels, paving the way for the multi-channel digital system. The change in programme structures, however, is not dramatic at the level of total output. The second outlook suggests that dualistic concepts, such as the pair information-entertainment, are not sufficient for understanding the array of, and changes in, programmes that could be called journalism. The outlook on communicative conventions highlights hybridisation in the manner of television talk and its relation to broader debates on contents. Despite the three dissimilar empirical approaches, unifying aspects emerge: the outlooks suggest, albeit in different ways, tendencies towards distinction and polarisation. This study proposes that, in order to facilitate a more nuanced understanding of changes in television contents, dualistic slogans should be replaced with a multi-dimensional understanding of the concept of diversity.
Abstract:
The Cold War era was characterized by ideological struggles that had a major impact on economic decision-making, and also on management practice. To date, however, these ideological struggles have received little attention from management and organizational scholars. To partially fill this research gap, we focus on the role of the media in these ideological struggles. Our starting point is that the media not only reflect more general societal debates but also act as an agency promoting specific kinds of ideas and ideologies. In this sense, the media exercise significant power in society; this influence, however, is often subtle and easily dismissed in historical analyses focusing on political and corporate decision-making. In this article, we focus on the role of business journalism in the ideological struggles of the Cold War era. Our case in point is Finland, which is arguably a particularly interesting example due to its geo-political position between East and West. Our approach is socio-historical: we focus on the emergence and development of business journalism in the context of the specific struggles in the Finnish political and economic fields. Our analysis shows how business journalists struggled between nationalist, pro-Soviet and pro-West political forces, but gradually developed into an increasingly influential force promoting neo-liberal ideology.
Abstract:
The new paradigm of connectedness and empowerment brought by the interactivity of Web 2.0 has been challenging the traditional centralized performance of mainstream media. The media corporation has been able to survive these strong winds by transforming itself into a global multimedia business network embedded in the network society. By establishing networks, e.g. networks of production and distribution, the global multimedia business network has been able to identify potential solutions by opening the doors to innovation in a decentralized and flexible manner. In this emerging context of re-organization, traditional practices like sourcing need to be re-explained, and that is precisely what this thesis attempts to do. Drawing on ICT and the network society, the study seeks to explain, within the Finnish context, the particular case of Helsingin Sanomat (HS) and its relations with the youth news agency, Youth Voice Editorial Board (NÄT). In that sense, the study can be regarded as an explanatory embedded single case study, in which HS is the principal unit of analysis and NÄT its embedded unit of analysis. The thesis reached its explanations through interrelated steps. First, it determined the role of ICT in HS's sourcing practices. Then it mapped an overview of HS's sourcing relations and provided the context in which NÄT is located. Finally, it established conceptualized institutional relational data between HS and NÄT for subsequent measurement through social network analysis. The data set was collected via qualitative interviews with online and offline editors of HS, as well as interviews with NÄT's personnel. The study concluded that ICT's interactivity and User Generated Content (UGC) are not sourcing tools as such but mechanisms used by HS for getting ideas that could turn into potential news stories. When it comes to visual communication, however, some exceptions were found: the lack of official sources amid the demand for immediacy leads HS to rely on ICT's interaction and UGC. ICT's input into the sourcing practice becomes more noticeable when interaction and UGC are well organized and coordinated into proper and innovative networks of alternative content collaboration. Currently, HS performs this sourcing practice via two projects that differ precisely in how they are coordinated. The first project, Omakaupunki, is coordinated internally by the Sanoma Group's media houses HS, Vartti and Metro. The second project is coordinated externally. The external alternative sourcing network, as it was labeled, consists of three actors, namely HS, NÄT (the professionals in charge) and the youth. This network is a balanced and complete triad in which the actors are connected through relations of feedback, recognition, creativity and filtering. However, as innovation is approached very reluctantly, this content collaboration remains a laboratory of experiments: a 'COLLABORATORY'.
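As a rough illustration of how the triad described above could be represented for social network analysis, the following Python sketch (using the networkx library) builds the three-actor network with the four relation types named in the abstract and verifies that the triad is complete. It illustrates the representation only, not the study's actual measurement procedure.

# Minimal sketch of the three-actor sourcing network as a complete triad;
# the structure follows the abstract, the code is illustrative.
import networkx as nx

G = nx.Graph()
actors = ["HS", "NÄT", "youth"]
relations = ["feedback", "recognition", "creativity", "filtering"]

# A complete triad: every pair of actors is tied, here with the same
# relation types attached to each tie for simplicity.
for i, a in enumerate(actors):
    for b in actors[i + 1:]:
        G.add_edge(a, b, relations=relations)

# Completeness check: a triad is complete when all 3 possible ties exist.
print("complete triad:", G.number_of_edges() == 3)
print("density:", nx.density(G))   # 1.0 for a complete graph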
Abstract:
This paper describes the cost-benefit analysis of digital long-term preservation (LTP) that was carried out in the context of the Finnish National Digital Library Project (NDL) in 2010. The analysis was based on the assumption that as many as 200 archives, libraries, and museums will share an LTP system. The term 'system' should be understood as encompassing not only information technology, but also human resources, organisational structures, policies and funding mechanisms. The cost analysis shows that an LTP system will incur, over the first 12 years, cumulative costs of €42 million, i.e. an average of €3.5 million per annum. Human resources and investments in information technology are the major cost factors. After the initial stages, the analysis predicts annual costs of circa €4 million. The analysis compared scenarios with and without a shared LTP system. The results indicate that a shared system will have considerable benefits. At the development and implementation stages, a shared system shows an advantage of €30 million over the alternative scenario of five independent LTP solutions. During the later stages, the advantage is estimated at €10 million per annum. The cumulative cost benefit over the first 12 years would amount to circa €100 million.
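The figures above can be reconciled with a quick arithmetic check; the assumption that the development and implementation stages span roughly the first five years is an illustrative reading used to make the numbers meet, not a figure stated in the paper.

# Quick arithmetic check of the reported figures; the 5-year development
# phase is an illustrative assumption, not stated in the analysis.
total_cost = 42.0            # EUR million, cumulative over 12 years
years = 12
print(total_cost / years)    # 3.5 EUR million per annum, as reported

dev_advantage = 30.0         # EUR million at development/implementation
later_per_year = 10.0        # EUR million per annum in later stages
later_years = years - 5      # assume ~5 development years, 7 later years
print(dev_advantage + later_per_year * later_years)   # 100.0, circa EUR 100M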
Abstract:
The loss and degradation of forest cover is currently a globally recognised problem, and the fragmentation of forests is further affecting the biodiversity and well-being of ecosystems in Kenya as well. This study focuses on two indigenous tropical montane forests in the Taita Hills in southeastern Kenya. The study is part of the TAITA project within the Department of Geography at the University of Helsinki. The study forests, Ngangao and Chawia, are examined by remote sensing and GIS methods. The main data include black-and-white aerial photography from 1955 and true colour digital camera data from 2004, which are used to produce aerial mosaics of the study areas. The land cover of the study areas is analysed by visual interpretation, pixel-based supervised classification and object-oriented supervised classification. The change in forest cover is studied with GIS methods using the visual interpretations from 1955 and 2004. Furthermore, the present state of the study forests is assessed with leaf area index and canopy closure parameters retrieved from hemispherical photographs, as well as with additional, previously collected forest health monitoring data. The canopy parameters are also compared with textural parameters derived from the digital aerial mosaics. This study concludes that the classification of forest areas using true colour data is not an easy task, even though the digital aerial mosaics proved to be very accurate. The best classifications are still achieved with visual interpretation, as the accuracies of the pixel-based and object-oriented supervised classification methods are not satisfactory. According to the change detection of land cover in the study areas, the area of indigenous woodland in both forests decreased between 1955 and 2004. In Ngangao, however, the overall woodland area has grown, mainly because of plantations of exotic species. In general, the land cover of both study areas is more fragmented in 2004 than in 1955. Although the forest area has decreased, the forests seem to have a more promising future than before, owing to the increasing appreciation of forest areas.
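As a generic illustration of the pixel-based supervised classification mentioned above, the following Python sketch trains a classifier on a handful of labelled pixels and classifies an RGB mosaic by its band values. The random forest classifier, the synthetic mosaic and the class labels are illustrative assumptions, not the method or data of the study.

# Generic sketch of pixel-based supervised classification on a true-colour
# mosaic; classifier, data and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in for an RGB aerial mosaic (rows x cols x 3 bands, values 0-255).
mosaic = rng.integers(0, 256, size=(100, 100, 3), dtype=np.uint8)

# Training pixels digitised from visual interpretation: (row, col) -> class.
train_px = {(10, 12): "woodland", (40, 80): "cropland",
            (70, 20): "woodland", (90, 90): "bare"}
X_train = np.array([mosaic[r, c] for (r, c) in train_px])
y_train = np.array(list(train_px.values()))

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Classify every pixel by its three band values, then restore the grid shape.
labels = clf.predict(mosaic.reshape(-1, 3)).reshape(100, 100)
print(np.unique(labels, return_counts=True))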