18 results for Multicast application level


Relevance:

80.00%

Abstract:

The goal of this thesis was to develop the GRI reporting of the Finnair Group. The thesis's contribution to the overall GRI project was the definition of performance indicators. The aim was to define the performance indicators for the Finnair Group's first GRI report so that the report meets at least the requirements of application level C defined in the guidelines. The application level C target was reached for the G3 profile disclosures and was clearly exceeded for the G3 performance indicators, where especially the economic and environmental indicators were brought to a good state with future reports and a higher application level in mind. The background part of the thesis examined the significance of a corporate responsibility report for an organization and its stakeholders, and reviewed the different corporate responsibility reporting alternatives on the market and their most significant differences. The empirical part focused on determining which indicators are relevant and feasible to report in the Finnair Group's first GRI report.

Relevance:

80.00%

Abstract:

Talvivaara Mining Company Plc aims to publish its first corporate social responsibility report in 2011. The responsibility report is intended to meet the C-level reporting requirements of the GRI (Global Reporting Initiative) guidelines. This Master's thesis is an integral part of the development of Talvivaara's corporate responsibility reporting. The goal of the thesis was to define the GRI-based indicators suitable for Talvivaara's first report. The thesis examines Talvivaara's annual report for 2009 and establishes how the report should be supplemented so that it meets the C-level requirements for the GRI standard disclosures. In addition, a stakeholder survey was carried out to determine the company's view of stakeholder expectations. The selection of the indicators planned for the forthcoming responsibility report was driven not only by stakeholder interest but also by how material the indicators are to Talvivaara's operations. With the selected indicators, the company's forthcoming responsibility report clearly meets the C-level reporting requirements. The thesis concludes with a proposal for further measures to develop the communication.

Relevance:

80.00%

Abstract:

Cloud computing, despite its success and promise, presents issues for businesses migrating their legacy applications to the cloud. In this research, legacy-to-cloud migration issues are reviewed based on literature findings and an experience report. Solutions are applied to the Tieto Open Application Suite (TOAS) software development platform running on cloud infrastructure. It is observed that the migration strategy heavily affects the migration approach. For TOAS, a strategy of redesigning the applications for the cloud is suggested. Common migration-driven application-level modifications include adaptation to a service-oriented architecture, load balancing, and runtime and technology changes. A cloud platform such as TOAS might introduce additional needs. Decision making on the migration strategy is found to be an issue that must be solved case by case. The use of assistive decision-making tools is suggested.

Relevance:

40.00%

Abstract:

High availability is an essential part of modern integrated enterprise systems. As companies become international, information must be available around the clock, which places ever stricter requirements on the availability of the individual parts of the system. Growing information system integration, in turn, makes the nodes of the system critical to the business. This thesis examines the characteristics of distributed systems and the challenges they pose. The technologies presented include middleware, clusters and load balancing. The Java 2 Enterprise Edition (J2EE) technology used as the basis for enterprise applications is covered in its essential parts. BEA WebLogic Server is used as the application server platform, and its features are reviewed from the viewpoint of distribution. In the practical part of the thesis, a high-availability application server environment is implemented for two different existing enterprise applications, taking into account the constraints set by the applications.
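
To make the load-balancing idea above concrete, here is a minimal round-robin dispatcher with failover, as a hedged sketch only: the host names and the plain TCP health probe are hypothetical examples, not taken from the thesis or from WebLogic.

```python
# Illustrative sketch: round-robin load balancing with a simple health check.
# Server list and probe are hypothetical; real clusters use richer heartbeats.
import itertools
import socket

SERVERS = [("app1.example.com", 7001), ("app2.example.com", 7001)]  # hypothetical nodes

def is_alive(host, port, timeout=1.0):
    """Probe a cluster node with a plain TCP connect."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def round_robin(servers):
    """Yield healthy nodes in rotation, skipping failed ones (failover).
    Note: cycles forever if no node responds."""
    for host, port in itertools.cycle(servers):
        if is_alive(host, port):
            yield host, port

# next(round_robin(SERVERS)) would return the first healthy node.
```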

Relevance:

30.00%

Abstract:

Langattoman laajakaistaisen tietoliikennetekniikan kehittyminen on herättänyt kiinnostuksen sen ammattimaiseen hyödyntämiseen yleisen turvallisuuden ja kriisinhallinnan tarpeisiin. Hätätilanteissa usein olemassa olevat kiinteät tietoliikennejärjestelmät eivät ole ollenkaan käytettävissä tai niiden tarjoama kapasiteetti ei ole riittävä. Tästä syystä on noussut esiin tarve nopeasti toimintakuntoon saatettaville ja itsenäisille langattomille laajakaistaisille järjestelmille. Tässä diplomityössä on tarkoitus tutkia langattomia ad hoc monihyppy -verkkoja yleisen turvallisuuden tarpeiden pohjalta ja toteuttaa testialusta, jolla voidaan demonstroida sekä tutkia tällaisen järjestelmän toimintaa käytännössä. Työssä tutkitaan pisteestä pisteeseen sekä erityisesti pisteestä moneen pisteeseen suoritettavaa tietoliikennettä. Mittausten kohteena on testialustan tiedonsiirtonopeus, lähetysteho ja vastaanottimen herkkyys. Näitä tuloksia käytetään simulaattorin parametreina, jotta simulaattorin tulokset olisivat mahdollisimman aidot ja yhdenmukaiset testialustan kanssa. Sen jälkeen valitaan valikoima yleisen turvallisuuden vaatimusten mukaisia ohjelmia ja sovellusmalleja, joiden suorituskyky mitataan erilaisten reititysmenetelmien alaisena sekä testialustalla että simulaattorilla. Tuloksia arvioidaan ja vertaillaan. Multicast monihyppy -video päätettiin sovelluksista valita tutkimusten pääkohteeksi ja sitä sekä sen ominaisuuksia on tarkoitus myös oikeissa kenttäkokeissa.
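
For illustration, point-to-multipoint delivery of the kind measured above can be sketched with standard IP multicast sockets; the group address and port below are arbitrary example values, not the test bed's configuration.

```python
# Minimal sketch of point-to-multipoint (IP multicast) delivery.
import socket
import struct

GROUP, PORT = "239.1.2.3", 5004  # administratively scoped group, example values

def send(data: bytes, ttl: int = 4):
    """Send one datagram to every receiver joined to the group."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    s.sendto(data, (GROUP, PORT))

def receive():
    """Join the group and block until one datagram arrives."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return s.recvfrom(65536)
```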

Relevance:

30.00%

Abstract:

This thesis studies gray-level distance transforms, particularly the Distance Transform on Curved Space (DTOCS). The transform is produced by calculating distances on a gray-level surface. The DTOCS is improved by defining more accurate local distances and by developing a faster transformation algorithm. The Optimal DTOCS enhances the locally Euclidean Weighted DTOCS (WDTOCS) with local distance coefficients, which minimize the maximum error from the Euclidean distance in the image plane and produce more accurate global distance values. Convergence properties of the traditional mask operation, or sequential local transformation, and of the ordered propagation approach are analyzed and compared to the new, efficient priority pixel queue algorithm. The Route DTOCS algorithm developed in this work can be used to find and visualize shortest routes between two points, or two point sets, along a varying-height surface. In a digital image there can be several paths sharing the same minimal length, and the Route DTOCS visualizes them all. A single optimal path can be extracted from the route set using a simple backtracking algorithm. A new extension of the priority pixel queue algorithm produces the nearest neighbor transform, or Voronoi or Dirichlet tessellation, simultaneously with the distance map. The transformation divides the image into regions so that each pixel belongs to the region surrounding the reference point that is nearest according to the distance definition used. Applications and application ideas for the DTOCS and its extensions are presented, including obstacle avoidance, image compression and surface roughness evaluation.
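
The priority pixel queue algorithm mentioned above can be sketched as a Dijkstra-style propagation on the gray-level surface. This is an illustrative sketch only: the local step cost |Δg| + 1 is one simple DTOCS-like choice, not the optimized coefficients derived in the thesis.

```python
# Sketch of priority pixel queue propagation of a gray-level distance map.
import heapq

def gray_level_distance_map(image, seeds):
    """image: 2D list of gray values; seeds: iterable of (row, col) sources."""
    rows, cols = len(image), len(image[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    heap = []
    for r, c in seeds:
        dist[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r][c]:
            continue  # stale queue entry
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    step = abs(image[nr][nc] - image[r][c]) + 1  # example local cost
                    if d + step < dist[nr][nc]:
                        dist[nr][nc] = d + step
                        heapq.heappush(heap, (d + step, nr, nc))
    return dist
```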

Relevance:

30.00%

Abstract:

Reusability has become a more prominent factor in modern software engineering, mainly because object-orientation has brought methods that make reuse easier. Today more and more application developers think about how they can reuse already existing applications in their work. A developer who wants to use existing components outside the current project can turn to design patterns, class libraries or frameworks. These provide solutions to specific or general problems that have already been encountered. Application frameworks are collections of classes that provide a base for the developer. They are mostly implementation-phase tools, but can also be used in application design. The main purpose of a framework is to separate domain-specific functionality from application-specific functionality. Frameworks are usually divided into two categories, black box and white box; the difference between the categories is the way reuse is done. Application frameworks have properties that can be examined and compared between different frameworks: extensibility, reusability, modularity and scalability. These examine how a framework handles different platforms, changes in the framework, increasing demand for resources, and so on. Generally, application frameworks possess these properties at a good level. When comparing a general-purpose framework with a more specific-purpose framework, the main difference lies in the reusability of the frameworks, mainly because a framework designed for a specific domain can have constraints imposed by external systems and resources. With a general-purpose framework, these are set by the application developed on top of the framework.
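
As a toy illustration of the black box / white box distinction (not an example from the thesis): white-box reuse extends framework classes and overrides hooks, while black-box reuse composes ready components behind an interface.

```python
# White-box reuse: the developer subclasses the framework and overrides hooks.
class Framework:
    def run(self):
        self.setup()        # template method calls the hook
        print("framework event loop")
    def setup(self):        # hook meant to be overridden
        pass

class MyApp(Framework):
    def setup(self):
        print("application-specific setup")

# Black-box reuse: the developer plugs a component in, no subclassing needed.
class BlackBoxFramework:
    def __init__(self, handler):
        self.handler = handler  # composed component
    def run(self):
        self.handler()

MyApp().run()
BlackBoxFramework(lambda: print("plugged-in behaviour")).run()
```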

Relevance:

30.00%

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints on size, power consumption and price as embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and a relatively short time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software codesign and simulation, and an extendable library of automatically configured reusable hardware blocks. Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. As a test case for the environment, a simulation model of a processor for TCP/IP packet validation was designed and tested.
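
As a rough, language-neutral sketch of the transport triggered principle behind TACO (illustrative only, not the SystemC environment itself): a program is a sequence of moves between functional unit ports, and writing the trigger port starts the operation.

```python
# Toy transport triggered architecture: operations fire on a trigger-port write.
class Adder:
    def __init__(self):
        self.operand = 0
        self.result = 0
    def write(self, port, value):
        if port == "operand":
            self.operand = value
        elif port == "trigger":          # writing here triggers the add
            self.result = self.operand + value

fu = Adder()
program = [("operand", 40), ("trigger", 2)]   # two move instructions
for port, value in program:
    fu.write(port, value)
print(fu.result)  # -> 42
```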

Relevance:

30.00%

Abstract:

This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray level images are presented, and as a new application, they are applied to gray level image compression. The new distance transforms are both extensions of the well known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-valued distance transform (EDTOCS) on gray level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image that defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map, whose weights are not constant but the gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms: three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally, and it is shown to be independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are shown; the best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
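
The two-pass calculation described above can be sketched as follows. This is an illustrative approximation only: forward and backward raster scans propagate a chessboard-like distance whose step weight is the gray value difference, here |Δg| + 1, while the exact DTOCS kernel definitions are those in the thesis.

```python
# Sketch of a two-pass, chamfer-style gray-level distance transform.
def dtocs_like(image, inside):
    """image: 2D gray values; inside: 2D bools marking the calculation region."""
    rows, cols = len(image), len(image[0])
    INF = float("inf")
    d = [[INF if inside[r][c] else 0.0 for c in range(cols)] for r in range(rows)]

    def relax(r, c, nbrs):
        for dr, dc in nbrs:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = abs(image[r][c] - image[nr][nc]) + 1  # example local cost
                d[r][c] = min(d[r][c], d[nr][nc] + step)

    changed = True
    while changed:            # only a few iteration rounds are typically needed
        before = [row[:] for row in d]
        for r in range(rows):                 # forward pass: upper-left neighbours
            for c in range(cols):
                relax(r, c, [(-1, -1), (-1, 0), (-1, 1), (0, -1)])
        for r in range(rows - 1, -1, -1):     # backward pass: lower-right neighbours
            for c in range(cols - 1, -1, -1):
                relax(r, c, [(1, 1), (1, 0), (1, -1), (0, 1)])
        changed = d != before
    return d
```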

Relevance:

30.00%

Abstract:

In general, models of ecological systems can be broadly categorized as 'top-down' or 'bottom-up' models, based on the hierarchical level on which the model processes are formulated. The structure of a top-down, also known as phenomenological, population model can be interpreted in terms of population characteristics, but it typically lacks an interpretation on a more basic level. In contrast, bottom-up, also known as mechanistic, population models are derived from assumptions and processes on a more basic level, which allows interpretation of the model parameters in terms of individual behavior. Both approaches, phenomenological and mechanistic modelling, can have their advantages and disadvantages in different situations. However, mechanistically derived models may be better at capturing the properties of the system at hand, and thus give more accurate predictions. In particular, when models are used for evolutionary studies, mechanistic models are more appropriate, since natural selection takes place on the individual level, and in mechanistic models the direct connection between model parameters and individual properties has already been established. The purpose of this thesis is twofold. Firstly, a systematic way to derive mechanistic discrete-time population models is presented. The derivation is based on combining explicitly modelled, continuous processes on the individual level within a reproductive period with a discrete-time maturation process between reproductive periods. Secondly, as an example of how evolutionary studies can be carried out in mechanistic models, the evolution of the timing of reproduction is investigated. Thus, these two lines of research, derivation of mechanistic population models and evolutionary studies, are complementary to each other.
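
The derivation recipe above can be illustrated with a deliberately simple example (the rates and fecundity below are invented for illustration): a within-season mortality ODE solved in closed form, composed with a discrete maturation step between reproductive periods.

```python
# Toy mechanistic discrete-time population model.
import math

mu, b = 0.8, 5.0   # example juvenile death rate within a season; offspring per adult

def season(adults):
    juveniles = b * adults                 # reproduction at the start of the season
    survivors = juveniles * math.exp(-mu)  # ODE dJ/dt = -mu * J solved over one season
    return survivors                       # survivors mature into next year's adults

n = 10.0
for year in range(5):
    n = season(n)
    print(year, round(n, 2))
```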

Relevance:

30.00%

Abstract:

Knowledge of the behaviour of cellulose, hemicelluloses, and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of the monosaccharide composition gives information about the structural polysaccharide composition of wood material and helps in determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information on the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods developed for the analysis of monosaccharides and for the determination of aliphatic carboxylic acids from alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE) the fused silica capillary is filled with an electrolyte solution, and an applied voltage generates a field across the capillary. The movement of the ions under the electric field depends on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only under strongly alkaline conditions. After ionisation, the structures are suitable for electrophoretic analysis and identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method relying on an in-capillary reaction and direct UV detection at a wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged, UV-absorbing compounds. The optimised method was applied to real samples; the methodology is fast since no sample preparation other than dilution is required. A new method for aliphatic carboxylic acids in highly alkaline process liquids was also developed. The goal was the simultaneous analysis of the dicarboxylic acids, hydroxy acids and volatile acids that are oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the level of carboxylic acids in the process solutions. In the second application, the degradation of spruce wood under alkaline and catalysed alkaline oxidation was compared by determining carboxylic acids in the process solutions. Finally, the effectiveness of membrane filtration and preparative liquid chromatography in the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.
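
The dependence on charge and hydrodynamic radius noted above is conventionally summarized by the classical electrophoretic mobility relation; this is a standard textbook expression added here for reference, not a formula from the abstract.

```latex
% Electrophoretic mobility of a (spherical) ion in the Stokes approximation:
% q: ion charge, \eta: electrolyte viscosity, r: hydrodynamic radius.
\mu_{\mathrm{ep}} = \frac{q}{6\pi\eta r}
```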

Relevance:

30.00%

Abstract:

The aim of this study was to describe the demographic, clinicopathological, biological and morphometric features of Libyan breast cancer patients. The supporting value of nuclear morphometry and static image cytometry for the sensitivity of detecting breast cancer in conventional fine-needle aspiration biopsies was estimated, and the findings were compared with findings on breast cancer in Finland and Nigeria. In addition, the value of ER and PR was evaluated. The material comprised 131 histological samples, 41 cytological samples, and demographic and clinicopathological data from 234 Libyan patients. Libyan breast cancer is predominantly premenopausal; in this respect it is similar to breast cancer in sub-Saharan Africans, but clearly different from breast cancer in Europeans, which is predominantly postmenopausal in character. At presentation most Libyan patients have locally advanced disease, which is associated with poor survival rates. Nuclear morphometry and image DNA cytometry agree with earlier published data on the Finnish population and indicate that nuclear size and DNA analysis of nuclear content can be used to increase the cytological sensitivity and specificity in doubtful breast lesions, particularly when the free cell sampling method is used. Combining the morphometric data with earlier free cell data gave the following diagnostic guidelines: range of overlap in free cell samples, 55-71 μm²; cut-off values for diagnostic purposes, mean nuclear area (MNA) > 54 μm² for 100% detection of malignant cases (specificity 84%) and MNA < 72 μm² for 100% detection of benign cases (sensitivity 91%). Histomorphometry showed a significant correlation between the MNA and most clinicopathological features, with the strongest association observed for histological grade (p < 0.0001). The MNA seems to be a prognosticator in Libyan breast cancer (Pearson's test r = -0.29, p = 0.019), but at a lower level of significance than in the European material. A corresponding relationship was not found in shape-related morphometric features. ER and PR staining scores correlated with the clinical stage (p = 0.017 and 0.015, respectively) and were also associated with lymph node negative patients (p = 0.03 and p = 0.05, respectively). Hormone receptor positive (HR+) patients had better survival. The fraction of HR+ cases among Libyan breast cancers is about the same as the fraction of positive cases in European breast cancer. The study suggests that even weak staining (corresponding to as few as 1% positive cells) has prognostic value; this prognostic significance may be associated with the practice of using antihormonal therapy in HR+ cases. The low survival and advanced presentation are associated with active cell proliferation, atypical nuclear morphology and aneuploid nuclear DNA content in Libyan breast cancer patients. The findings support the idea that breast cancer is not one type of disease, but should probably be classified into premenopausal and postmenopausal types.
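
Purely as an illustration of how the quoted cut-offs might be applied (a sketch, not a validated diagnostic tool; the thresholds are those reported above):

```python
# Apply the reported MNA cut-offs; values inside the 55-71 um^2 overlap
# range stay indeterminate.
def classify_mna(mna_um2: float) -> str:
    if mna_um2 < 55.0:
        return "benign"          # below the reported overlap range
    if mna_um2 > 71.0:
        return "malignant"       # above the reported overlap range
    return "indeterminate"       # inside the 55-71 um^2 overlap

for mna in (48.0, 63.0, 85.0):
    print(mna, classify_mna(mna))
```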

Relevance:

30.00%

Abstract:

Technological developments in microprocessors and the ICT landscape have brought about a shift to a new era where computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform a particular task and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience. Similarly, cars have on-board computers that use information from many different sensors to control things such as fuel injectors and spark plugs to perform their tasks efficiently. These individual devices make life easy by helping in decision making and removing the burden from their users. All these objects and devices obtain some piece of information about the physical environment, yet each of them is an island with no proper connectivity or information sharing with the others. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications, but it is a difficult task due to the heterogeneity and interoperability of the devices. The Smart Space vision is to overcome these issues of heterogeneity and interoperability so that the devices can understand each other and utilize each other's services by sharing information. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one example of Smart Spaces; through the intelligent interconnection of resources and their collective behavior they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years, and they have become an essential part of everyday life. Mobile phones offer a wide range of services to their users, including text and multimedia messages, Internet, audio, video and email applications, and most recently TV services. Interactive TV provides a variety of applications for viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent, by enabling heterogeneous systems to collaborate with each other by sharing information. There are many challenges in designing the frameworks and application development tools needed for rapid and easy development of such applications. The research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for the development of these applications. We demonstrate the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware, ubiquitous iTV applications.

Relevance:

30.00%

Abstract:

Thousands of tons of pharmaceuticals are consumed yearly worldwide. Due to this continuous and increasing consumption and their incomplete elimination in wastewater treatment plants (WWTP), pharmaceuticals and their metabolites can be detected in receiving waters, although at low concentrations (ng to low μg/L). As pharmaceuticals are bioactive molecules, their presence in the aquatic environment must be considered potentially hazardous for aquatic organisms. In this thesis, the biotransformation and excretion of pharmaceuticals in fish were studied. The main biotransformation pathways of three anti-inflammatory drugs, diclofenac, naproxen and ibuprofen, in rainbow trout were glucuronidation and taurine conjugation of the parent compounds and their phase I metabolites. The same metabolites were present in fish bile in aquatic exposures as in fish dosed by intraperitoneal injection. A higher bioconcentration factor in bile (BCFbile) was found for ibuprofen than for diclofenac and naproxen. The laboratory exposure studies were followed by a study of the uptake of pharmaceuticals in a wild fish population living in a lake contaminated with WWTP effluents. Of the 17 pharmaceuticals and six phase I metabolites analyzed, only diclofenac, naproxen and ibuprofen were present in bream and roach bile. It was shown that diclofenac, naproxen and ibuprofen excreted by the liver can be found in rainbow trout and in two native fish species living in the receiving waters. In bream and roach bile, the concentrations of diclofenac, naproxen and ibuprofen were roughly 1000 times higher than those found in the lake water, while in the laboratory exposures the bioconcentration of the compounds and their metabolites in rainbow trout bile was at the same level as in wild fish or an order of magnitude higher. Thus, the parent compounds and their metabolites in fish bile can be used as a reliable biomarker to monitor the exposure of fish to environmental pharmaceuticals in waters receiving discharges from WWTPs.
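
For reference, the bile bioconcentration factor referred to above is, by definition, the ratio of the bile concentration to the exposure-water concentration (a standard definition, not a formula from the abstract):

```latex
% Bioconcentration factor in bile; the roughly 1000-fold bile/water ratio
% reported above corresponds to a BCF_bile on the order of 10^3.
\mathrm{BCF}_{\mathrm{bile}} = \frac{C_{\mathrm{bile}}}{C_{\mathrm{water}}}
```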

Relevance:

30.00%

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1-5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
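
As an illustrative sketch of the IPCW idea (not the study's actual estimator): each subject is weighted by the inverse of its probability of remaining uncensored, and the Kaplan-Meier product-limit steps use these weights. The censoring probabilities are assumed to come from a separate censoring model and are passed in precomputed here.

```python
# IPCW-weighted Kaplan-Meier sketch.
def ipcw_kaplan_meier(times, events, uncensored_probs):
    """times: spell lengths; events: 1 = event, 0 = censored;
    uncensored_probs: P(still uncensored at t_i) for each subject."""
    data = sorted(zip(times, events, uncensored_probs))
    surv, s = [], 1.0
    for i, (t, e, p) in enumerate(data):
        w = 1.0 / p                                     # IPCW weight of this subject
        at_risk = sum(1.0 / q for _, _, q in data[i:])  # weighted size of the risk set
        if e == 1 and at_risk > 0:
            s *= 1.0 - w / at_risk                      # weighted product-limit step
        surv.append((t, s))
    return surv

# With all probabilities equal to 1 this reduces to the ordinary Kaplan-Meier.
print(ipcw_kaplan_meier([2, 3, 5, 7], [1, 0, 1, 1], [0.95, 0.9, 0.8, 0.7]))
```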