53 results for empirical testing
Abstract:
A vast amount of public services and goods is contracted through procurement auctions, so it is important to design these auctions optimally. Typically, we are interested in two objectives. The first is efficiency: the contract should be awarded to the bidder that values it the most, which in the procurement setting means the bidder with the lowest cost of providing a service of a given quality. The second is maximizing public revenue, which means minimizing the costs of procurement. Both goals are important from the welfare point of view. In this thesis, I analyze field data from procurement auctions and show how empirical analysis can be used to help design auctions that maximize public revenue. In particular, I concentrate on how competition, i.e. the number of bidders, should be taken into account in auction design. In the first chapter, the main policy question is whether the auctioneer should spend resources to induce more competition. The information paradigm is essential in analyzing the effects of competition. We speak of a private values paradigm when the bidders know their valuations exactly; in a common values paradigm, information about the value of the object is dispersed among the bidders. With private values, more competition always increases public revenue, but with common values the effect of competition is uncertain. I study the effects of competition in the City of Helsinki bus transit market by conducting tests for common values, and I extend an existing test by allowing bidder asymmetry. The information paradigm appears to be one of common values: the bus companies with garages close to the contracted routes are influenced more by the common value elements than those whose garages are further away. Therefore, attracting more bidders does not necessarily lower procurement costs, and the City should not implement costly policies to induce more competition. In the second chapter, I ask how the auctioneer can increase its revenue by changing contract characteristics such as contract sizes and durations. I find that the City of Helsinki should shorten the contract duration in the bus transit auctions, because that would decrease the importance of common value components and cheaply increase entry, which would then have a more beneficial impact on public revenue. Typically, cartels decrease public revenue significantly. In the third chapter, I propose a new statistical method for detecting collusion and compare it with an existing test. I argue that my test, unlike the existing one, is robust to unobserved heterogeneity. I apply both methods to procurement auctions for snow removal at schools in Helsinki. According to these tests, the bidding behavior of two of the bidders seems consistent with a contract allocation scheme.
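The policy relevance of the information paradigm comes from the winner's curse: with common values, rational bidders bid less aggressively as the number of competitors grows, so more entry need not lower procurement costs. A minimal, hypothetical sketch of one standard reduced-form diagnostic in this spirit (illustrative variable names and simulated data, not the tests developed in the thesis): regress log bids on the number of bidders and contract controls and inspect the sign of the competition coefficient.

```python
# Hypothetical sketch: reduced-form check of how bids respond to competition.
# Not the thesis's test; column names and data are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
bids = pd.DataFrame({
    "n_bidders": rng.integers(2, 9, size=n),          # observed competition
    "route_km": rng.uniform(5, 40, size=n),           # contract-size control
    "garage_dist_km": rng.uniform(0.5, 20, size=n),   # proximity proxy
})
# Simulate log bids; in a real application these come from the auction data.
bids["log_bid"] = (
    10 + 0.02 * bids["route_km"] + 0.01 * bids["garage_dist_km"]
    + 0.005 * bids["n_bidders"] + rng.normal(0, 0.1, size=n)
)

# Under private values more bidders should push bids down (negative coefficient);
# a zero or positive coefficient is consistent with winner's-curse (common value) effects.
fit = smf.ols("log_bid ~ n_bidders + route_km + garage_dist_km", data=bids).fit()
print(fit.summary().tables[1])
```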
Abstract:
Technical or contaminated ethanol products are sometimes ingested, either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g. windshield washer fluids. In addition to less toxic solvents, these liquids may contain methanol, which can be deadly. Symptoms of even lethal solvent poisoning are often non-specific at an early stage. The present series of studies was carried out to develop a breath-based method for diagnosing solvent intoxication, in order to speed up a diagnostic procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method must be sufficiently sensitive and accurate to detect even small amounts of methanol in a mixture of ethanol and other less toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared (FT-IR) analyzer was modified for breath testing: the sample cell fittings were widened and the cell size reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared with blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated using artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication. In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath, and it was not equipped to identify the interfering component. According to the studies, the Gasmet FT-IR analyzer was sufficiently sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, fast breath solvent analysis proved feasible for following the ethanol and methanol concentrations during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.
Abstract:
The autonomic nervous system is an important modulator of ventricular repolarization and arrhythmia vulnerability. This study explored the effects of cardiovascular autonomic function tests on repolarization and its heterogeneity, with special reference to congenital arrhythmogenic disorders typically associated with stress-induced fatal ventricular arrhythmias. The first part explored the effects of standardized autonomic tests on QT intervals in the 12-lead electrocardiogram and in multichannel magnetocardiography in 10 healthy adults. The second part studied the effects of deep breathing, the Valsalva manoeuvre, mental stress, sustained handgrip and mild exercise on QT intervals in asymptomatic patients with the LQT1 subtype of the hereditary long QT syndrome (n=9) and in patients with arrhythmogenic right ventricular dysplasia (ARVD, n=9). Even strong sympathetic activation had no effect on spatial QT interval dispersion in healthy subjects, but deep respiratory efforts and the Valsalva manoeuvre influenced it in opposite ways in electrocardiographic and magnetocardiographic recordings. LQT1 patients showed blunted QT interval and sinus nodal responses to sympathetic challenge, as well as exaggerated QT prolongation during the recovery phases. LQT1 patients showed a QT interval recovery overshoot in 2.4 ± 1.7 tests compared with 0.8 ± 0.7 in healthy controls (P = 0.02). Valsalva strain prolonged the T wave peak to T wave end interval only in the LQT1 patients, which is considered to reflect the arrhythmogenic substrate in this syndrome. ARVD patients showed signs of abnormal repolarization in the right ventricle, modulated by abrupt sympathetic activation. An electrocardiographic marker reflecting interventricular dispersion of repolarization was introduced. It showed that LQT1 patients exhibit a repolarization gradient from the left ventricle towards the right ventricle that is significantly larger than in controls, whereas ARVD patients show a repolarization gradient from the right ventricle towards the left. Valsalva strain amplified the repolarization gradient in LQT1 patients, whereas it transiently reversed the gradient in patients with ARVD. In conclusion, intrathoracic volume and pressure changes influence regional electrocardiographic and magnetocardiographic QT interval measurements differently. In particular, the recovery phases of standard cardiovascular autonomic function tests and the Valsalva manoeuvre reveal abnormal repolarization in asymptomatic LQT1 patients. Both LQT1 and ARVD patients have abnormal interventricular repolarization gradients that are modulated by abrupt sympathetic activation. Autonomic testing, and in particular the Valsalva manoeuvre, is potentially useful in unmasking abnormal repolarization in these syndromes.
Abstract:
This thesis focuses on testing sleepiness quantitatively. The issue is relevant to policymakers concerned with traffic and occupational safety, since such testing provides a tool for safety legislation and surveillance. The findings of this thesis provide guidelines for a posturographic sleepiness tester. Sleepiness ensuing from staying awake for merely 17 h impairs performance as much as the legal blood alcohol concentration limit of 0.5 does. Hence, sleepiness is a major risk factor in transportation and occupational accidents. Unlike breath testing for alcohol intoxication, however, there is no convenient commercial test for measuring impending sleepiness. Posturography is a potential sleepiness test, since clinical diurnal balance testing suggests the hypothesis that time awake could be estimated posturographically. Relying on this hypothesis, this thesis examines posturographic sleepiness testing for instrumentation purposes. Empirical results from 63 subjects, whose balance was tested with a force platform during wakefulness lasting up to 36 h, show that sustained wakefulness impairs balance. The results show that time awake can be estimated posturographically with 88% accuracy and 97% precision, which supports the hypothesis. The results also show that balance scores tested at 13:30 hours serve as a threshold for detecting excessive sleepiness. Analytical results show that test length has a marked effect on estimation accuracy: 18 s tests suffice to identify sleepiness-related balance changes, but trade off some of the accuracy achieved with 30 s tests. The procedure for estimating time awake relies on equating the subject's test score to a reference table (comprising balance scores tested during sustained wakefulness, regressed against time awake). Empirical results showed that sustained wakefulness explains 60% of the diurnal balance variation, whereas the time of day explains 40%. The latter fact implies that time awake estimates must also rely on knowing the local times of both the test and the reference scores.
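As a rough illustration of the reference-table procedure described above, a minimal sketch with assumed variable names and a simple linear reference model (not the thesis's actual regression): balance scores collected during sustained wakefulness are regressed against time awake, and a new test score is mapped back to an estimated time awake. In practice the time-of-day component noted above would also have to be taken into account.

```python
# Hypothetical sketch of the reference-table idea: balance scores recorded during
# sustained wakefulness are regressed against time awake, and a new subject's
# score is mapped back to an estimated time awake. Names and the linear model
# are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

# Reference data: hours awake and the corresponding balance score (e.g. sway area).
hours_awake = np.arange(0, 37, 3, dtype=float)   # 0..36 h in 3 h steps
balance_score = 50 + 1.8 * hours_awake + np.random.default_rng(1).normal(0, 4, hours_awake.size)

# Fit balance score as a function of time awake (the "reference table").
ref = LinearRegression().fit(hours_awake.reshape(-1, 1), balance_score)

def estimate_time_awake(score: float) -> float:
    """Invert the fitted reference to estimate hours awake from a test score."""
    slope, intercept = ref.coef_[0], ref.intercept_
    return (score - intercept) / slope

print(f"Estimated time awake for score 95: {estimate_time_awake(95.0):.1f} h")
```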
Abstract:
This doctoral dissertation in sociology examines how human heredity became a scientific, political and personal issue in 20th-century Finland. The study focuses on the institutionalisation of rationales and technologies concerning heredity in the context of Finnish medicine and health care. The analysis concentrates specifically on the introduction and development of prenatal screening within maternity care. The data comprise medical articles, policy documents and committee reports, as well as popular guidebooks and health magazines. The study begins with an analysis of early 20th-century discussions on racial hygiene and ends with an analysis of the choices currently offered to pregnant women and families. Freedom to choose, considered by geneticists and many others a guarantee of the ethicality of medical applications, is presented in this study as a historically, politically and scientifically constructed issue. New medical testing methods have generated new possibilities for governing life itself, but they have also created new ethical problems. Drawing on recent historical data, the study illustrates how the medical profession has introduced medical risk rationales concerning heredity into Finnish health care. It also depicts the medical profession's ambivalence between maintaining patient autonomy and using, for example, prenatal testing according to health policy interests. Personalized risk is discussed as a result of the empirical analysis. It is shown that increasing risk awareness among the public, as well as offering choices, has had unintended consequences. According to doctors, present-day parents often want to control risks more than is considered justified or acceptable. People's hopes of anticipating the health and normality of their future children have exceeded the limits offered by medicine. The individualization of the government of heredity is closely linked to a process termed depoliticization, which refers to the disembedding of medical genetics from its social contexts. Prenatal screening is regarded as being based on individual choice facilitated by neutral medical knowledge; however, prenatal screening within maternity care is also grounded in health policy aims and economic calculations. The methodological basis of the study lies in Michel Foucault's writings on the history of thought, as well as in science and technology studies.
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the earlier literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of static and dynamic probit models in forecasting U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictors. The empirical results suggest that recession periods are predictable and that dynamic probit models, especially those with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite-sample properties of the LM test are examined with simulation experiments. The results indicate that the two alternative LM test statistics have reasonable size and power in large samples; in small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the sign of the stock return. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2–4 to the bivariate case. A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of the predictability of both cycle indicators is obtained, and the bivariate model is found to outperform the univariate models in terms of predictive power.
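To make the model structure concrete, here is a minimal sketch of a dynamic probit on simulated data, with assumed variable names; the autoregressive variant, which lags the probit index itself and requires a custom likelihood, is only noted in the comments.

```python
# Hypothetical sketch of a dynamic probit: the recession indicator is regressed on
# its own lag and a lagged financial predictor (term spread). Variable names and
# data are illustrative; the autoregressive specification additionally lags the
# probit index itself and needs a custom likelihood, which is omitted here.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 300
spread = rng.normal(1.0, 1.0, T)                  # lagged term spread (assumed)
y = np.zeros(T, dtype=int)
for t in range(1, T):                             # simulate a persistent binary series
    idx = -0.5 + 2.0 * y[t - 1] - 0.8 * spread[t - 1]
    y[t] = int(rng.normal() < idx)

df = pd.DataFrame({
    "y": y,
    "y_lag": np.r_[0, y[:-1]],
    "spread_lag": np.r_[0, spread[:-1]],
}).iloc[1:]
X = sm.add_constant(df[["y_lag", "spread_lag"]])
probit_fit = sm.Probit(df["y"], X).fit(disp=0)
print(probit_fit.params)                          # dynamics show up in the y_lag coefficient
```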
Abstract:
The Earth's magnetic field is 99% of internal origin and is generated in the liquid outer core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. This theory is now the basis of, e.g., the IGRF (International Geomagnetic Reference Field) models, which are the most accurate available description of the geomagnetic field. On average, the dipole makes up 3/4 and the non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a pure geocentric axial dipole field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been assessed using several methods. In this work, the testing rests on the frequency distribution of inclination, which depends on latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with distributions calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that over the long-term course of continental drift the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of observation quality, as well as of age and rock type, have been tested. Chi-square testing has been applied to compare the theoretical and empirical distributions, and spatiotemporal binning has been used to effectively remove the errors caused by multiple observations. Modelling from the igneous rock data indicates that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10% of GAD). Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased with the age of the rocks. The distribution calculated from the so-called key poles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole of 1% of GAD. In no earlier study have past-400-Ma rocks given a result so close to GAD, although low inclinations have been prominent especially in the sedimentary data. Despite these results, more high-quality data and proof of the long-term randomness of the Earth's continental motions are needed to confirm that the dipole model holds true.
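The latitude dependence underlying these tests comes from the dipole formula tan(I) = 2 tan(latitude). A minimal sketch of the GAD-only case (assuming uniform sampling of sites over the sphere, with arbitrary bin choices) shows how a theoretical inclination frequency distribution can be generated for comparison with observed inclinations, e.g. via a chi-square test.

```python
# Minimal sketch, GAD-only case: the dipole formula tan(I) = 2*tan(latitude) gives the
# expected inclination at each site; sampling sites uniformly over the sphere yields a
# theoretical inclination frequency distribution against which observed data can be
# tested (e.g. with a chi-square statistic). Bin edges and sample size are arbitrary.
import numpy as np

rng = np.random.default_rng(3)
# Uniform sampling on the sphere: sin(latitude) is uniform on [-1, 1].
lat = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, 100_000)))
inc = np.degrees(np.arctan(2.0 * np.tan(np.radians(lat))))   # GAD inclination

# Theoretical frequencies of |inclination| in 10-degree bins.
bins = np.arange(0, 100, 10)
expected, _ = np.histogram(np.abs(inc), bins=bins)
expected = expected / expected.sum()
print(dict(zip(bins[:-1], np.round(expected, 3))))
```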
Abstract:
This report derives from the EU-funded research project “Key Factors Influencing Economic Relationships and Communication in European Food Chains” (FOODCOMM). The research consortium consisted of the following organisations: University of Bonn (UNI BONN), Department of Agricultural and Food Marketing Research (overall project co-ordination); Institute of Agricultural Development in Central and Eastern Europe (IAMO), Department for Agricultural Markets, Marketing and World Agricultural Trade, Halle (Saale), Germany; University of Helsinki, Ruralia Institute Seinäjoki Unit, Finland; Scottish Agricultural College (SAC), Food Marketing Research Team - Land Economy Research Group, Edinburgh and Aberdeen; Ashtown Food Research Centre (AFRC), Teagasc, Food Marketing Unit, Dublin; Institute of Agricultural & Food Economics (IAFE), Department of Market Analysis and Food Processing, Warsaw; and Government of Aragon, Center for Agro-Food Research and Technology (CITA), Zaragoza, Spain. The aim of the FOODCOMM project was to examine the role (prevalence, necessity and significance) of economic relationships in selected European food chains and to identify the economic, social and cultural factors that influence co-ordination within these chains. The project considered meat and cereal commodities in six European countries (Finland, Germany, Ireland, Poland, Spain, UK/Scotland) and was commissioned against a background of changing European food markets. The research project as a whole consisted of seven work packages. This report presents the results of the qualitative research conducted for work package 5 (WP5) on the pig meat and rye bread chains in Finland. The Ruralia Institute would like to give special thanks to all the individuals and companies that kindly gave up their time to take part in the study; their input has been invaluable to the project. The contribution of research assistant Sanna-Helena Rantala to the data gathering was significant. The FOODCOMM project was coordinated by the University of Bonn, Department of Agricultural and Food Market Research. Special thanks go to Professor Monika Hartmann for acting as the project leader of FOODCOMM.
Abstract:
Governance has been one of the most popular buzzwords in recent political science. As with any term shared by numerous fields of research, as well as everyday language, governance is encumbered by a jungle of definitions and applications. This work elaborates on the concept of network governance. Network governance refers to complex policy-making situations where a variety of public and private actors collaborate in order to produce and define policy; governance consists of processes in which autonomous, self-organizing networks of organizations exchange information and deliberate. Network governance is a theoretical concept that corresponds to an empirical phenomenon, and this phenomenon is often used to describe a historical development: changes in the political processes of Western societies since the 1980s. In this work, empirical governance networks are used as an organizing framework, and the concepts of autonomy, self-organization and network structure are developed as tools for the empirical analysis of any complex decision-making process. The work develops this framework and explores governance networks in the case of environmental policy-making in the City of Helsinki, Finland. The crafting of a local ecological sustainability programme required support and knowledge from all sectors of administration, a number of entrepreneurs and companies, and the inhabitants of Helsinki. The policy process relied explicitly on networking, with public and private actors collaborating to design policy instruments. Communication between individual organizations led to the development of network structures and patterns. This research analyses these patterns and their effects on policy choice by applying the methods of social network analysis. A variety of social network analysis methods are used to uncover different features of the networked process: links between individual network positions, network subgroup structures and macro-level network patterns are compared to the types of organizations involved and the final policy instruments chosen. By using governance concepts to depict a policy process, the work aims to assess whether they contribute to models of policy-making. The conclusion is that the governance literature sheds light on events that would otherwise go unnoticed, or whose conceptualization would remain atheoretical. The framework of network governance should be in the toolkit of the policy analyst.
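As an illustration of the three analysis levels mentioned above (individual positions, subgroup structures, and macro-level patterns), a hypothetical sketch using networkx on invented toy data; the study's actual network data and method choices are not reproduced here.

```python
# Hypothetical sketch of the three analysis levels on toy data: node-level positions
# (centrality), subgroup structure (communities), and macro-level patterns (density).
# Organization names are invented placeholders.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([
    ("EnvDept", "CityPlanning"), ("EnvDept", "EnergyCo"), ("EnvDept", "NGO_A"),
    ("CityPlanning", "TransportDept"), ("EnergyCo", "Business_B"),
    ("NGO_A", "Residents"), ("Residents", "NGO_B"), ("NGO_B", "EnvDept"),
])

print("degree centrality:", nx.degree_centrality(G))        # individual positions
print("betweenness:", nx.betweenness_centrality(G))         # brokerage positions
print("communities:", [sorted(c) for c in greedy_modularity_communities(G)])  # subgroups
print("density:", nx.density(G))                            # macro-level pattern
```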
Abstract:
The open access (OA) model for journals is compared to the open source principle for computer software. Since the early 1990s nearly 1,000 OA scientific journals have emerged, mostly as voluntary community efforts, although recently some professionally operating publishers have adopted author charges or institutional membership. This study of OA journals without author charges shows that their impact is still relatively small, but that awareness of the model is increasing. The average number of research articles per year is lower than for major scientific journals, but the publication times are shorter.
Abstract:
The current mainstream scientific publication process has so far been only marginally affected by the possibilities offered by the Internet, despite some pioneering attempts with free electronic-only journals and electronic preprint archives. Additional electronic versions of traditional paper journals, for which one needs a subscription, are not a solution. A clear trend, for young researchers in particular, is to go around subscription barriers (both for paper and electronic material) and rely almost exclusively on what they can find free on the Internet, which often includes working versions posted on the home pages of the authors. A survey of how scientists retrieve publications was conducted in February 2000, aimed at measuring to what extent the opportunities offered by the Internet are already changing scientific information exchange and how researchers feel about this. This paper presents the results, based on 236 replies to an extensive Web-based questionnaire which was announced to around 3,000 researchers in the domains of construction information technology and construction management. The questions dealt with how researchers find, access and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and the criteria for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, and one final section was dedicated to opinions about electronic publishing. According to the survey, researchers already download from the Web half of the material that they read. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's Web site. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university.
Abstract:
This thesis analyzes how matching takes place in the Finnish labor market from three different angles. The Finnish labor market has undergone severe structural changes following the economic crisis of the early 1990s. The labor market has had problems adjusting to these changes, and hence high and persistent unemployment has followed. In this thesis I analyze whether matching problems, and in particular changes in matching, can explain some of this persistence. The thesis consists of three essays. In the first essay, "Finnish Evidence of Changes in the Labor Market Matching Process", the matching process in the Finnish labor market is analyzed. The key finding is that the matching process changed thoroughly between the booming 1980s and the post-crisis period: the importance of the number of unemployed, and in particular the long-term unemployed, for the matching process has vanished. More unemployed do not increase matching as theory predicts, but rather the opposite. The second essay, "The Aggregate Matching Function and Directed Search – Finnish Evidence", studies stock-flow matching as a potential micro foundation of the aggregate matching function. In the essay I show that the newly unemployed match mainly with the stock of vacancies, while the longer-term unemployed match with the inflow of vacancies. When aggregating, I still find evidence of the traditional aggregate matching function, which could explain the huge support the aggregate matching function has received despite its odd randomness assumption. The third essay, "How do Registered Job Seekers really match? – Finnish occupational level Evidence", studies matching for nine occupational groups and finds that very different matching problems exist for different occupations. This essay also deals with misspecification stemming from non-corresponding variables by introducing a completely new set of variables: the new outflow measure is vacancies filled with registered job seekers, and it is matched by the supply-side measure of registered job seekers.
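The aggregate matching function referred to in these essays is conventionally specified in Cobb-Douglas form, M = A·U^alpha·V^beta, and estimated log-linearly. A minimal sketch under that conventional assumption (simulated data, not the series used in the thesis):

```python
# Minimal sketch of estimating a Cobb-Douglas aggregate matching function,
# log(M) = log(A) + alpha*log(U) + beta*log(V), on simulated monthly data.
# The functional form is the conventional one; the data are not the thesis's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 240
log_U = np.log(rng.uniform(50_000, 300_000, T))     # unemployed job seekers
log_V = np.log(rng.uniform(5_000, 40_000, T))       # open vacancies
log_M = -1.0 + 0.6 * log_U + 0.4 * log_V + rng.normal(0, 0.05, T)  # filled vacancies

X = sm.add_constant(np.column_stack([log_U, log_V]))
fit = sm.OLS(log_M, X).fit()
print(fit.params)   # elasticities of matches with respect to U and V
```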
Abstract:
Financing trade between economic agents located in different countries is affected by many types of risk, resulting from incomplete information about the debtor, the problems of enforcing international contracts, or the prevalence of political and financial crises. Trade is important for economic development, and the availability of trade finance is essential, especially for developing countries. Relatively few studies treat the topic of political risk, particularly in the context of international lending. This thesis explores new ground to identify links between political risk and international debt defaults. The core hypothesis of the study is that the default probability of debt increases with increasing political risk in the country of the borrower. The thesis consists of three essays that support the hypothesis from different angles of the credit evaluation process. The first essay takes the point of view of an international lender assessing the credit risk of a public borrower. The second investigates the creditworthiness assessment of companies. The results obtained are substantiated in the third essay, which deals with an extensive political risk survey among finance professionals in developing countries. The financial instruments of core interest are export credit guaranteed debts initiated between the Export Credit Agency of Finland and buyers in 145 countries between 1975 and 2006. Default events of the foreign credit counterparts are conditioned on country-specific macroeconomic variables, corporate-specific accounting information and political risk indicators from various international sources. Essay 1 examines debt issued to government-controlled institutions and conditions public default events on traditional macroeconomic fundamentals, in addition to selected political and institutional risk factors. Confirming previous research, the study finds country indebtedness and the GDP growth rate to be significant indicators of public default. Further, it shows that public defaults respond to various political risk factors, although the impact of the risk varies between countries at different stages of economic development. Essay 2 proceeds by investigating political risk factors as conceivable drivers of corporate default and uses traditional accounting variables together with new political risk indicators in the credit evaluation of private debtors. The study finds links between corporate default and leverage, as well as between corporate default and the general investment climate and measures of conflict in the debtor country. Essay 3 concludes the thesis by offering survey evidence on the impact of political risk on debt default, as perceived and experienced by 103 finance professionals in 38 developing countries. Taken together, the results of the thesis suggest that various forms of political risk are associated with international debt defaults and continue to pose great concerns for both international creditors and borrowers in developing countries. The study provides new insights into the importance of variable selection in country risk analysis and shows how political risk is actually perceived and experienced in the riskier, often lower-income countries of the global economy.
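To make the conditioning described above concrete, a hypothetical sketch of a default-probability model: a logit of a default indicator on macroeconomic fundamentals and a political risk index, on simulated data. Variable names, functional form and coefficients are illustrative assumptions, not the essays' estimated specifications.

```python
# Hypothetical sketch: default probability conditioned on macro fundamentals and a
# political risk indicator, estimated as a logit on simulated data. Variable names
# and coefficients are illustrative, not the thesis's estimates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 2000
d = pd.DataFrame({
    "debt_to_gdp": rng.uniform(10, 120, n),      # indebtedness, %
    "gdp_growth": rng.normal(3, 3, n),           # annual growth, %
    "political_risk": rng.uniform(0, 10, n),     # higher = riskier (assumed scale)
})
logit_index = -4.0 + 0.02 * d["debt_to_gdp"] - 0.15 * d["gdp_growth"] + 0.25 * d["political_risk"]
d["default"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_index))).astype(int)

fit = smf.logit("default ~ debt_to_gdp + gdp_growth + political_risk", data=d).fit(disp=0)
print(fit.params)   # the core hypothesis corresponds to a positive political_risk coefficient
```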