28 results for Earnings manipulation
in Helda - Digital Repository of University of Helsinki
Abstract:
Detecting Earnings Management Using Neural Networks. In an effort to balance the relevance and reliability of accounting data, generally accepted accounting principles (GAAP) allow company management, to some extent, to use its judgment and to make subjective assessments when preparing financial statements. The opportunistic use of this discretion in financial reporting is called earnings management. A considerable number of methods have been suggested for detecting accrual-based earnings management, the majority of them based on linear regression. The problem with using linear regression is that a linear relationship between the dependent variable and the independent variables must be assumed. However, previous research has shown that the relationship between accruals and some of the explanatory variables, such as company performance, is non-linear. Neural networks are an alternative to linear regression that can handle non-linear relationships. The type of neural network used in this study is the feed-forward back-propagation neural network. Three neural network-based models are compared with four commonly used linear regression-based earnings management detection models. All seven models are based on the earnings management detection model presented by Jones (1991). The performance of the models is assessed in three steps. First, a random data set of companies is used. Second, the discretionary accruals from the random data set are ranked according to six different variables. The discretionary accruals in the highest and lowest quartiles for these six variables are then compared. Third, a data set containing simulated earnings management is used. Both expense and revenue manipulation ranging between -5% and 5% of lagged total assets are simulated. Furthermore, two neural network-based models and two linear regression-based models are used with a data set containing financial statement data from 110 failed companies. Overall, the results show that the linear regression-based models, except for the model using a piecewise linear approach, produce biased estimates of discretionary accruals. The neural network-based model with the original Jones model variables and the neural network-based model augmented with ROA as an independent variable, however, perform well in all three steps. Especially in the second step, where the highest and lowest quartiles of ranked discretionary accruals are examined, the neural network-based model augmented with ROA as an independent variable outperforms the other models.
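To make the comparison concrete, the sketch below estimates discretionary accruals as residuals from the Jones (1991) regression and, alternatively, from a small feed-forward neural network. It is a minimal illustration of the general approach, not the thesis's exact specification; the column names and the toy data frame layout are hypothetical.

```python
# Illustrative sketch: discretionary accruals as the residual between observed
# total accruals and the "normal" accruals fitted by either OLS (Jones model)
# or a small feed-forward neural network. Column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

def jones_features(df: pd.DataFrame) -> np.ndarray:
    """Jones (1991) regressors, all scaled by lagged total assets."""
    lag_ta = df["lagged_total_assets"].to_numpy()
    return np.column_stack([
        1.0 / lag_ta,                              # scale term
        df["delta_revenue"].to_numpy() / lag_ta,   # change in revenues
        df["ppe"].to_numpy() / lag_ta,             # gross property, plant and equipment
    ])

def discretionary_accruals(df: pd.DataFrame, use_nn: bool = False) -> np.ndarray:
    """Residual (total accruals minus fitted normal accruals) per firm-year."""
    X = jones_features(df)
    y = df["total_accruals"].to_numpy() / df["lagged_total_assets"].to_numpy()
    if use_nn:
        # A small feed-forward network can capture non-linear relations
        # (e.g. between accruals and performance) that OLS cannot.
        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    else:
        model = LinearRegression()
    model.fit(X, y)
    return y - model.predict(X)
```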
Abstract:
Defects in mitochondrial DNA (mtDNA) maintenance cause a range of human diseases, including autosomal dominant progressive external ophthalmoplegia (adPEO). This study aimed to clarify the molecular background of adPEO. We discovered that deoxynucleoside triphosphate (dNTP) metabolism plays a crucial role in mtDNA maintenance and were thus prompted to search for therapeutic strategies based on the modulation of cellular dNTP pools or mtDNA copy number. Human mtDNA is a 16.6 kb circular molecule present in hundreds to thousands of copies per cell. mtDNA is compacted into nucleoprotein clusters called nucleoids. mtDNA maintenance diseases result from defects in nuclear-encoded proteins that maintain the mtDNA. These syndromes typically afflict highly differentiated, post-mitotic tissues such as muscle and nerve, but virtually any organ can be affected. adPEO is a disease where mtDNA molecules with large-scale deletions accumulate in patients' tissues, particularly in skeletal muscle. Mutations in five nuclear genes, encoding the proteins ANT1, Twinkle, POLG, POLG2 and OPA1, have previously been shown to cause adPEO. Here, we studied a large North American pedigree with adPEO, and identified a novel heterozygous mutation in the gene RRM2B, which encodes the p53R2 subunit of the enzyme ribonucleotide reductase (RNR). RNR is the rate-limiting enzyme in dNTP biosynthesis, and is required both for nuclear and mitochondrial DNA replication. The mutation results in the expression of a truncated form of p53R2, which is likely to compete with the wild-type allele. A change in enzyme function leads to defective mtDNA replication due to altered dNTP pools. Therefore, RRM2B is a novel adPEO disease gene. The importance of adequate dNTP pools and RNR function for mtDNA maintenance has been established in many organisms. In yeast, induction of RNR has previously been shown to increase mtDNA copy number, and to rescue the phenotype caused by mutations in the yeast mtDNA polymerase. To further study the role of RNR in mammalian mtDNA maintenance, we used mice that broadly overexpress the RNR subunits Rrm1, Rrm2 or p53R2. Active RNR is a heterotetramer consisting of two large subunits (Rrm1) and two small subunits (either Rrm2 or p53R2). We also created bitransgenic mice that overexpress Rrm1 together with either Rrm2 or p53R2. In contrast to the previous findings in yeast, bitransgenic RNR overexpression led to mtDNA depletion in mouse skeletal muscle, without mtDNA deletions or point mutations. The mtDNA depletion was associated with imbalanced dNTP pools. Furthermore, the mRNA expression levels of Rrm1 and p53R2 were found to correlate with mtDNA copy number in two independent mouse models, suggesting nuclear-mitochondrial cross-talk with regard to mtDNA copy number. We conclude that tight regulation of RNR is needed to prevent harmful alterations in the dNTP pool balance, which can lead to disordered mtDNA maintenance. Increasing the copy number of wild-type mtDNA has been suggested as a strategy for treating PEO and other mitochondrial diseases. Only two proteins are known to cause a robust increase in mtDNA copy number when overexpressed in mice: the mitochondrial transcription factor A (TFAM) and the mitochondrial replicative helicase Twinkle. We studied the mechanisms by which Twinkle and TFAM elevate mtDNA levels, and showed that Twinkle specifically stimulates mtDNA synthesis. Furthermore, both Twinkle and TFAM were found to increase mtDNA content per nucleoid.
Increased mtDNA content in mouse tissues correlated with an age-related accumulation of mtDNA deletions, depletion of mitochondrial transcripts, and progressive respiratory dysfunction. Simultaneous overexpression of Twinkle and TFAM led to a further increase in the mtDNA content of nucleoids, and aggravated the respiratory deficiency. These results suggested that high mtDNA levels have detrimental long-term effects in mice. These data have to be considered when developing and evaluating treatment strategies for elevating mtDNA copy number.
Abstract:
There is a large literature developing theories of when and where earnings management occurs. Among the several possible motives driving earnings management behaviour in firms, this thesis focuses on motives that aim to influence the valuation of the firm. Earnings management that makes the firm look better than it really is may result in disappointment for the individual investor and potentially leads to a welfare loss in society when resource allocation is distorted. More specific knowledge of the occurrence of earnings management presumably increases investor awareness and thus leads to better investments and increased welfare. This thesis contributes to the literature by increasing knowledge of where and when earnings management is likely to occur. More specifically, essay 1 adds to existing research connecting earnings management to IPOs by arguing that the tendency to manage earnings differs between IPOs. Evidence is found that entrepreneur-owned IPOs are more likely to be earnings managers than institutionally owned ones. Essay 2 considers the reliability of quarterly earnings reports that precede insider selling binges. The essay contributes by suggesting that earnings management is likely to occur before high insider selling. Essay 3 examines the widely studied phenomenon of income smoothing and investigates whether income smoothing can be explained by proxies for information asymmetry. The essay argues that smoothing is more pervasive in private and smaller firms.
Abstract:
A growing body of empirical research examines the structure and effectiveness of corporate governance systems around the world. An important insight from this literature is that corporate governance mechanisms address the excessive use of managerial discretionary powers to extract private benefits by expropriating shareholder value. One possible way of expropriation is to reduce the quality of disclosed earnings by manipulating the financial statements. This lower quality of earnings should then be reflected in the firm's stock price, according to the value relevance theorem. Hence, instead of testing the direct effect of corporate governance on the firm's market value, it is important to understand the causes of the lower quality of accounting earnings. This thesis contributes to the literature by increasing knowledge about the extent of earnings management, measured as the extent of discretionary accruals in total disclosed earnings, and its determinants across transitional European countries. The thesis comprises three essays of empirical analysis, of which the first two utilize data on Russian listed firms, whereas the third uses data from 10 European economies. More specifically, the first essay adds to existing research connecting earnings management to corporate governance. It tests the impact of the Russian corporate governance reforms of 2002 on the quality of disclosed earnings in all publicly listed firms. This essay provides empirical evidence that the desired impact of the reforms is not fully realized in Russia without proper enforcement. Instead, firm-level factors such as long-term capital investments and compliance with International Financial Reporting Standards (IFRS) determine the quality of the earnings. The results presented in the essay support the notion proposed by Leuz et al. (2003) that reforms aimed at bringing transparency do not produce the desired results in economies where investor protection is lower and legal enforcement is weak. The second essay focuses on the relationship between internal control mechanisms, such as the types and levels of ownership, and the quality of disclosed earnings in Russia. The empirical analysis shows that the controlling shareholders in Russia use their powers to manipulate reported performance in order to obtain private benefits of control. Comparatively, firms owned by the State have significantly better quality of disclosed earnings than firms controlled by oligarchs or foreign corporations. Interestingly, the market performance of firms controlled by either the State or oligarchs is better than that of widely held firms. The third essay provides useful evidence that both ownership structures and economic characteristics are important factors in determining the quality of disclosed earnings in three groups of countries in Europe. Evidence suggests that ownership structure is a more important determinant in developed and transparent countries, while economic characteristics are more important in developing and transitional countries.
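As a rough illustration of how such determinants of earnings quality are typically estimated, the sketch below regresses the magnitude of discretionary accruals on firm-level factors. The variable names (post_2002, ifrs, capex, state_owned, foreign_owned, firm_id) are hypothetical placeholders, not the specification used in the essays.

```python
# Hedged illustration only: a determinants-of-earnings-quality regression of the
# kind described above. All column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def earnings_quality_determinants(panel: pd.DataFrame):
    """Regress the magnitude of discretionary accruals on firm-level factors."""
    panel = panel.assign(abs_da=panel["discretionary_accruals"].abs())
    model = smf.ols(
        "abs_da ~ post_2002 + ifrs + capex + state_owned + foreign_owned",
        data=panel,
    )
    # Cluster standard errors by firm, since each firm appears in several years.
    return model.fit(cov_type="cluster", cov_kwds={"groups": panel["firm_id"]})
```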
Abstract:
Activation of midbrain dopamine systems is thought to be critically involved in the addictive properties of abused substances. Drugs of abuse increase dopamine release in the nucleus accumbens and dorsal striatum, which are the target areas of the mesolimbic and nigrostriatal dopamine pathways, respectively. Dopamine release in the nucleus accumbens is thought to mediate the attribution of incentive salience to rewards, and dorsal striatal dopamine release is involved in habit formation. In addition, changes in the function of the prefrontal cortex (PFC), the target area of the mesocortical dopamine pathway, may skew information processing and memory formation such that the addict pays an abnormal amount of attention to drug-related cues. In this study, we wanted to explore how long-term forced oral nicotine exposure or the lack of catechol-O-methyltransferase (COMT), one of the dopamine-metabolizing enzymes, would affect the functioning of these pathways. We also wanted to find out how the forced nicotine exposure or the lack of COMT would affect the consumption of nicotine, alcohol, or cocaine. First, we studied the effect of forced chronic nicotine exposure on the sensitivity of dopamine D2-like autoreceptors in microdialysis and locomotor activity experiments. We found that the sensitivity of these receptors was unchanged after forced oral nicotine exposure, although an increase in the sensitivity was observed in mice treated with intermittent nicotine injections twice daily for 10 days. Thus, the effect of nicotine treatment on dopamine autoreceptor sensitivity depends on the route, frequency, and time course of drug administration. Second, we investigated whether the forced oral nicotine exposure would affect the reinforcing properties of nicotine injections. The chronic nicotine exposure did not significantly affect the development of conditioned place preference to nicotine. In the intravenous self-administration paradigm, however, the nicotine-exposed animals self-administered nicotine at a lower unit dose than the control animals, indicating that their sensitivity to the reinforcing effects of nicotine was enhanced. Next, we wanted to study whether the Comt gene knock-out animals would be a suitable model to study alcohol and cocaine consumption or addiction. Although previous work had shown male Comt knock-out mice to be less sensitive to the locomotor-activating effects of cocaine, the present study found that the lack of COMT did not affect the consumption of cocaine solutions or the development of cocaine-induced place preference. However, the present work did find that male Comt knock-out mice, but not female knock-out mice, consumed ethanol more avidly than their wild-type littermates. This finding suggests that COMT may be one of the factors, albeit not a primary one, contributing to the risk of alcoholism. Last, we explored the effect of COMT deficiency on dorsal striatal, accumbal, and prefrontal cortical dopamine metabolism under no-net-flux conditions and under levodopa load in freely moving mice. The lack of COMT did not affect the extracellular dopamine concentrations under baseline conditions in any of the brain areas studied. In the prefrontal cortex, the dopamine levels remained high for a prolonged time after levodopa treatment in male, but not female, Comt knock-out mice. COMT deficiency induced accumulation of 3,4-dihydroxyphenylacetic acid, which increased further under levodopa load.
Homovanillic acid was not detectable in Comt knock-out animals either under baseline conditions or after levodopa treatment. Taken together, the present results show that although forced chronic oral nicotine exposure affects the reinforcing properties of self-administered nicotine, it is not an addiction model itself. COMT seems to play a minor role in dopamine metabolism and in the development of addiction under baseline conditions, indicating that dopamine function in the brain is well-protected from perturbation. However, the role of COMT becomes more important when the dopaminergic system is challenged, such as by pharmacological manipulation.
Abstract:
What are the musical features that turn a song into a hit? The aim of this research is to explore the musical features of hit tunes by studying the 224 most popular Finnish evergreens from the 1930s to the 1990s. It is remarkable that 80–90% of Finnish oldies are in a minor key, though parallel major keys have also been widely employed within single pieces through, for example, modulations. Furthermore, melodies are usually diatonic, staying mostly in the same key. Consequently, chromatically altered tones in the melody and short modulations in the bridge sections become more prominent. I have concentrated in particular on the melodic lines in order to find the most typical melodic formulas in the data. These analyzed melodic formulas play an important role because they serve as leading phrases and punchlines in songs. Analysis has revealed three major melodic formulas, which most often appear in the melodic lines of hit tunes. All of these formulas share common thematic ground because they originate from the triadic tonic chord. Because the tonic chord is the most conventional opening chord in the verse parts, it is logical that these formulas occur most often in verses. The strong dominance of these formulas is very much a result of the rhythmic flexibility they possess; for instance, they can be found in every musical style from waltz to foxtrot. Alongside the major formulas lies a miscellaneous group of other tonic-related melodic formulas. One group of melodic formulas consists of melodic quotations. These quotations appear in a different musical context, for instance in a harmonically altered form, and are therefore often difficult to recognize as such. Yet despite the contextual manipulation, the distinctive character of the cited melody usually remains the same. Composers have also made use of certain popular chord progressions in order to create new but familiar-sounding melodies. The most important individual progression in this case is what is known as the "circle of fifths" and its shortened, prolonged and altered versions. Because that progression is harmonically strong, it is also a contrastive tool used especially in chorus parts and middle sections (AABA). I have also paid attention to ragtime and jazz influences, which can be found in harmony parts and in certain melody notes that extend, suspend or alter the accompanying chords. Other influences from jazz and ragtime in the Finnish evergreen are evident in the use of typical Tin Pan Alley popular song forms. The most important is the AABA form, which dominates the data along with the verse/chorus-type popular song form. To briefly illustrate the main results, the basic concept of the hit tune can be traced back to Tin Pan Alley songs, whereas the major stylistic aspects, such as minor keys and musical styles, bear influences from Russian, Western European, and Finnish traditions.
Abstract:
The objective of my dissertation Pull (or Draught, or Moves) at the Parnassus is to provide a deeper understanding of Nordic Middle Class radicalism of the 1960s as featured in Finland-Swedish literature. My approach is cultural materialist in a broad sense; social class is regarded as a crucial aspect of the contents and contexts of the novels and literary discussions explored. In the first volume, Middle Class With A Human Face, novels by Christer Kihlman, Jarl Sjöblom, Marianne Alopaeus, and Ulla-Lena Lundberg, respectively, are read from the points of view of place, emotion, and power. The term "cryptotope" is used to designate the hidden places found to play an important role in all four narratives. Also, the "chronotope of the provincial small town", described by Mikhail Bakhtin in 1938, is exemplified in Kihlman's satirical novel, as is the chronotope of war (Algeria, Vietnam) in those of Alopaeus and Lundberg. All four novels signal changes in the way general "scripts of emotions", e.g. jealousy, are handled and described. The power relations in the novels are also read, with reference to Michel Foucault. As the protagonists in two of them work as journalists, a critical discussion about media and Bourgeois hegemony is found; the term "repressive legitimation" is created to grasp these patterns of manipulation. The Modernist Debate, part II of the study, concerns a literary discussion between mainly Finland-Swedish authors and critics. Essayist Johannes Salminen (40) provided much of the fuel for the debate in 1963, questioning the relevance to contemporary life of the Finland-Swedish modernist tradition of the 1910s and 1920s. In 1965, a group of younger authors and critics, including poet Claes Andersson (28), followed up this critique in a debate taking place mainly in the newspaper Vasabladet. Poets Rabbe Enckell (62), Bo Carpelan (39) and others defended a timeless poetry. This debate is contextualized and the changing literary field is analyzed using concepts provided by sociologist Pierre Bourdieu. In the thesis, the historical moment of Middle Class radicalism with a human face is regarded as a temporary luxury that new social groups could afford themselves, as long as they were knocking over the statues and symbols of the Old Bourgeoisie. This is not to say that all components of the Sixties strategy have lost their power. Some of them have survived and even grown; others remain latent in the gene bank of utopias, waiting for new moments of change.
Abstract:
When experts construct mental images, they do not rely only on perceptual features; they also access domain-specific knowledge and skills in long-term memory, which enables them to exceed the capacity limitations of the short-term working memory system. The central question of the present dissertation was whether the facilitating effect of long-term memory knowledge on working memory imagery tasks is primarily based on perceptual chunking or whether it relies on higher-level conceptual knowledge. Three domains of expertise were studied: chess, music, and taxi driving. The effects of skill level, stimulus surface features, and the stimulus structure on incremental construction of mental images were investigated. A method was developed to capture the chunking mechanisms that experts use in constructing images: chess pieces, street names, and visual notes were presented in a piecemeal fashion for later recall. Over 150 experts and non-experts participated in a total of 13 experiments, as reported in five publications. The results showed skill effects in all of the studied domains when experts performed memory and problem solving tasks that required mental imagery. Furthermore, only experts' construction of mental images benefited from meaningful stimuli. Manipulation of the stimulus surface features, such as replacing chess pieces with dots, did not significantly affect experts' performance in the imagery tasks. In contrast, the structure of the stimuli had a significant effect on experts' performance in every task domain. For example, taxi drivers recalled more street names from lists that formed a spatially continuous route than from alphabetically organised lists. The results suggest that the mechanisms of conceptual chunking rather than automatic perceptual pattern matching underlie expert performance, even though the tasks of the present studies required perception-like mental representations. The results show that experts are able to construct skilled images that surpass working memory capacity, and that their images are conceptually organised and interpreted rather than merely depictive.
Abstract:
This thesis studies human gene expression space using high throughput gene expression data from DNA microarrays. In molecular biology, high throughput techniques allow numerical measurements of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a large new ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree-based method for automatic conversion of human-readable free text microarray data annotations into a categorised format. Data comparability and the minimisation of the systematic measurement errors that are characteristic of each laboratory in this large cross-laboratory integrated dataset were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology. A preface and motivation for the construction and analysis of a global map of human gene expression are given by an analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
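A minimal sketch of the kind of global-structure analysis described above is given below: principal component analysis followed by hierarchical clustering of a samples-by-genes expression matrix. The input matrix, the number of components and the number of clusters are placeholders under assumed, already-normalised data, not the thesis's actual pipeline.

```python
# Minimal sketch: PCA followed by hierarchical (Ward) clustering of a
# samples-by-genes expression matrix. Inputs and parameters are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

def map_expression_space(expr: np.ndarray, n_components: int = 10, n_groups: int = 15):
    """expr: samples x genes matrix of log-scale, normalised expression values."""
    # Reduce dimensionality so clustering operates on the dominant axes of
    # variation rather than on tens of thousands of noisy gene dimensions.
    pcs = PCA(n_components=n_components).fit_transform(expr)
    # Agglomerative clustering on the principal components.
    tree = linkage(pcs, method="ward")
    groups = fcluster(tree, t=n_groups, criterion="maxclust")
    return pcs, groups
```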
Abstract:
Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lifes, bringing more and more activities into the realm of information. The computationalization, informationalization of everyday activities increases not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on one hand, tools and methods for both the construction as well as the evaluation of ubiquitous and mobile systems---on the other hand an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems as regards data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for locationing in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications called ContextPhone is described and released as open source. Finally, a set of methodological findings for the use of smartphones in social scientific field research is reported. A central contribution of this thesis are the pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general as well as an analysis of privacy implications. The usage of the system is analyzed in the light of how users make inferences of others based on real-time contextual cues mediated by the system, based on several long-term field studies. The analysis of privacy implications draws together the social psychological theory of self-presentation and research in privacy for ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows: The fact that ubiquitous computing systems gather more data about users can be used to not only study the use of such systems in an effort to create better systems but in general to study phenomena previously unstudied, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for the users---but the self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to the users can be used to allow them to use the data themselves, rather than just being passive subjects of data gathering.
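The design principle of keeping location inference on the terminal can be illustrated with a small, purely hypothetical sketch: the device learns a local map from observed cell identifiers to coordinates and estimates its own position from that map, so no location data ever leaves the device. This is not the algorithm proposed in the thesis, only an illustration of the privacy goal it pursues.

```python
# Hypothetical illustration of on-device cellular locationing (not the thesis's
# algorithm): all data stays in a local map on the user's terminal.
from dataclasses import dataclass

@dataclass
class CellFix:
    lat: float
    lon: float

LOCAL_CELL_MAP: dict[str, CellFix] = {}  # cell-id -> last known coordinates, kept on-device

def record_cell(cell_id: str, lat: float, lon: float) -> None:
    """Learn a cell's position locally, e.g. from an occasional GPS fix."""
    LOCAL_CELL_MAP[cell_id] = CellFix(lat, lon)

def estimate_position(visible_cells: list[str]) -> CellFix | None:
    """Average the stored coordinates of currently visible cells, entirely on-device."""
    known = [LOCAL_CELL_MAP[c] for c in visible_cells if c in LOCAL_CELL_MAP]
    if not known:
        return None
    return CellFix(
        lat=sum(f.lat for f in known) / len(known),
        lon=sum(f.lon for f in known) / len(known),
    )
```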
Abstract:
Unvalued Work: Gender and fragmented labour before national collective bargaining. Systematically irregular work creates economic and social insecurity. A novelty? To think that globalisation results in unprecedented labour conditions turns out to be questionable when the idea is put in perspective. In the light of history, there is nothing new in the frequency of today's short-term employment, for instance, and ranking genders in labour relations is an old custom. Unvalued Work (Halvennettu työ) examines the regulation and management of labour before the time of collective bargaining. In the study, present trends engage in a dialogue with empirical findings from the past. Preventing trade unions from taking the initiative has been, and remains, an employer interest. The analysis focuses on female employment in the 1920s and 1930s. The inferences challenge us to ask under what conditions the history of Finnish labour relations warrants, on the whole, speaking of contractual security, stable earnings and regular waged work that provides a livelihood. Success in selling one's labour is not synonymous with good employment that yields a decent income. Juxtaposing labour relations between the world wars and the 21st century helps us to understand the currently transforming labour market. Present policies are informed by past choices and patterns of thought. Unvalued Work (Halvennettu työ) offers instruments for making sense of today's labour relations.
Abstract:
The dissertation consists of four essays examining questions in empirical labour economics. The first essay studies the effect of the level of unemployment benefits on re-employment in Finland. In 2003, earnings-related unemployment benefits were raised for workers with a long employment history. The increase averaged 15% and applied to the first 150 days of unemployment. The study evaluates the effect of the increase by comparing re-employment probabilities between the group that received the increase and a comparison group, before and after the reform. The results indicate that the benefit increase significantly lowered the probability of re-employment, on average by about 16%. The effect is largest at the beginning of the unemployment spell and disappears when entitlement to the increased earnings-related benefit ends. The second essay studies the long-term costs of unemployment in Finland, focusing on the deep recession of 1991–1993. During the recession, plant closures increased sharply and the unemployment rate rose by more than 13 percentage points. The study compares prime working-age men who became unemployed because of a plant closure during the recession with men who remained employed. The effect of unemployment is examined over a six-year follow-up period. In 1999, the annual earnings of the group that experienced unemployment during the recession were on average 25% lower than in the comparison group. The earnings loss was due both to lower employment and to a lower wage level. The third essay examines the unemployment problem caused by Finland's recession of the early 1990s by studying the determinants of unemployment duration at the individual level. The focus is on how changes in the composition of the unemployed and in labour demand affect average duration. It is often assumed that a recession leaves unemployed people with below-average employment prospects, which would in itself lengthen the average duration of unemployment. The results indicate that the macro-level demand effect was central to unemployment duration, while compositional changes had only a small duration-increasing effect during the recession. The final essay studies the effect of business-cycle fluctuations on the incidence of workplace accidents. The study uses Swedish individual-level hospitalisation data linked to a population database. The data make it possible to examine alternative explanations for the increase in accidents during economic upturns, which has been attributed, for example, to stress or time pressure. The results show that workplace accidents are cyclical, but only for certain groups. Variation in the composition of the workforce may explain part of the cyclicality of women's accidents. For men, only less severe accidents are cyclical, which may reflect strategic behaviour.
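The before/after comparison between the treated and comparison groups described in the first essay is, in essence, a difference-in-differences design. The sketch below shows a minimal version of such an estimate; the column names (reemployed, treated, post_reform) are hypothetical placeholders, not the essay's actual data or specification.

```python
# Illustrative difference-in-differences sketch: re-employment regressed on
# treatment, period, and their interaction. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def diff_in_diff(spells: pd.DataFrame):
    """The coefficient on treated:post_reform estimates the reform's effect."""
    model = smf.ols("reemployed ~ treated + post_reform + treated:post_reform", data=spells)
    return model.fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
```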