917 results for Earnings and dividend announcements, high frequency data, information asymmetry
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This fully operational web application allows consortium users to upload and retrieve information through a single graphical interface that is remotely accessible to all members. A solution based on a Linux platform with MySQL and PHP scripts was selected. Research was conducted to evaluate mechanisms for electronically transferring diverse datasets from different hospitals and collecting the clinical data together with the related functional magnetic resonance imaging (fMRI) data. A unique aspect of the approach is that all pertinent clinical information about patients is synthesized, with input from clinical experts, into four data entry forms: Clinical, fMRI scoring, Image information, and Neuropsychological. A first contribution of this dissertation was an integrated processing platform, site and scanner independent, that uniformly processes the varied fMRI datasets and generates comparative brain activation patterns. The data collection from the consortium complied with IRB requirements and provided all the safeguards needed for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. Lateralization Indices (LIs) of healthy control (HC) subjects were evaluated in contrast to those of localization-related epilepsy (LRE) subjects. Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%); (b) right lateralization (HC = 2%, LRE = 10%); (c) bilateral (HC = 20%, LRE = 15%); (d) left lateralization (HC = 42%, LRE = 26%); (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of these activations. The intent was not to seek the highest output measures, given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
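As a hedged illustration of the lateralization-index computation described above, the following Python sketch computes an LI from per-hemisphere activated-voxel counts and bins it into the five groups reported in the abstract. The LI formula is the standard (L - R)/(L + R); the cutoff values (±0.2 and ±0.5) are illustrative assumptions, since the dissertation's actual thresholds are not given here.

```python
def lateralization_index(left_voxels: int, right_voxels: int) -> float:
    """Standard LI: +1 is fully left-lateralized, -1 fully right-lateralized."""
    total = left_voxels + right_voxels
    if total == 0:
        raise ValueError("no activated voxels in either hemisphere")
    return (left_voxels - right_voxels) / total

def classify(li: float) -> str:
    # Cutoffs of +/-0.2 and +/-0.5 are assumed for illustration only.
    if li <= -0.5:
        return "strong right lateralization"
    if li <= -0.2:
        return "right lateralization"
    if li < 0.2:
        return "bilateral"
    if li < 0.5:
        return "left lateralization"
    return "strong left lateralization"

print(classify(lateralization_index(900, 500)))  # LI ~= 0.29 -> "left lateralization"
```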
Abstract:
The need for a high-quality tourism database is well known. Planners and managers, for example, need high-quality data for budgeting, forecasting, planning marketing and advertising strategies, and staffing. The concepts of quality and need are thus intertwined, posing a problem for tourism professionals, whether they work in the private or the public sector. One could argue that collaboration between public- and private-sector tourism professionals could provide the best sources and uses of high-quality tourism data. This discussion proposes just such a collaboration and a detailed methodology for operationalizing the arrangement.
Abstract:
The increase in the number of financial restatements in recent years has resulted in a significant decrease in the market capitalization of restating companies. Prior literature does not differentiate between single and multiple restatement announcements. This research investigates the inter-relationships among multiple financial restatements, corporate governance, market microstructure, and the firm's rate of return in three essays that differentiate between single- and multiple-restatement companies. The first essay examines the stock performance of companies announcing financial restatements multiple times. The postulation is that prior research overestimates the abnormal return by not separating single-restatement companies from multiple-restatement companies. This study investigates how the market penalizes companies that announce restatements more than once. Differentiating the announcement data by the number of restatement announcements, the results support the non-persistence hypothesis: the market has no memory, and the negative abnormal returns obtained after each restatement announcement are completely random. The second essay examines multiple restatement announcements and the resulting perceived information asymmetry around the announcement day, in terms of whether the bid-ask spread widens. The empirical analysis supports the hypothesis that the spread widens not only around the first restatement announcement day but around every subsequent announcement day as well. The third essay empirically examines the financial and corporate governance characteristics of single- and multiple-restatement companies. The analysis shows that corporate governance variables influence the occurrence of multiple restatement announcements and can distinguish multiple-restatement companies from single-restatement companies.
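The abnormal-return analysis in the first essay is a standard event study. As a sketch only (the essay's actual estimation windows, benchmark model, and test statistics are not specified in this abstract), a market-model cumulative abnormal return around one announcement can be computed as follows, with the window lengths as assumed defaults.

```python
import numpy as np

def cumulative_abnormal_return(stock, market, event_idx,
                               est_window=120, gap=10, event_window=(-1, 1)):
    """Market-model CAR around a restatement announcement.

    stock, market: aligned arrays of daily returns; event_idx: index of the
    announcement day (assumed far enough from the series start).
    est_window, gap and event_window are illustrative defaults.
    """
    est = slice(event_idx - gap - est_window, event_idx - gap)  # estimation period
    beta, alpha = np.polyfit(market[est], stock[est], 1)        # market-model fit
    lo, hi = event_window
    days = slice(event_idx + lo, event_idx + hi + 1)
    abnormal = stock[days] - (alpha + beta * market[days])      # AR_t
    return abnormal.sum()                                       # CAR over window
```

Under the non-persistence hypothesis described above, CARs computed at a firm's first, second, and later announcements would show no systematic attenuation or amplification.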
Abstract:
The objective of this research was to investigate why lumps occur in high-slump concrete and to develop adequate batching procedures for a lump-free high-slump ready-mix concrete used by the Florida Department of Transportation (FDOT). Cement balls are round lumps of cement, sand, and coarse aggregate, typically about the size of a baseball, that frequently occur in high-slump concrete. Such lumps or balls jeopardize the integrity of structural members. Experiments were conducted at the CSR Rinker concrete plant in Miami, Florida, based on a protocol developed by a team of FDOT concrete engineers, Rinker personnel, and Florida International University faculty. A total of seventeen truckloads were investigated in two phases between April 2001 and March 2002. The tests gathered data by varying load size, discharge rate, headwater content, and number of mixing revolutions. The major finding was that a usual load size and discharge rate, an initial headwater ratio of 30%, and an initial 100 revolutions at 12 revolutions per minute appear to produce a lump-free high-slump concrete. It was concluded that inadequate mixing and batching procedures caused the cement lumps. Recommendations regarding specific load sizes, discharge rates, numbers of mixing revolutions, and initial water content are made. With further testing based on these conclusions, clear guidelines for a high-slump concrete batching protocol can be developed.
Abstract:
In this thesis, the first-order radar cross section (RCS) of an iceberg is derived and simulated. This analysis takes place in the context of a monostatic high frequency surface wave radar with a vertical dipole source that is driven by a pulsed waveform. The starting point of this work is a general electric field equation derived previously for an arbitrarily shaped iceberg region surrounded by an ocean surface. The condition of monostatic backscatter is applied to this general field equation and the resulting expression is inverse Fourier transformed. In the time domain the excitation current of the transmit antenna is specified to be a pulsed sinusoid signal. The resulting electric field equation is simplified and its physical significance is assessed. The field equation is then further simplified by restricting the iceberg's size to fit within a single radar patch width. The power received by the radar is calculated using this electric field equation. Comparing the received power with the radar range equation gives a general expression for the iceberg RCS. The iceberg RCS equation is found to depend on several parameters including the geometry of the iceberg, the radar frequency, and the electrical parameters of both the iceberg and the ocean surface. The RCS is rewritten in a form suitable for simulations and simulations are carried out for rectangularly shaped icebergs. Simulation results are discussed and are found to be consistent with existing research.
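The step "comparing the received power with the radar range equation" uses the standard monostatic relation. In its free-space form (the thesis's surface-wave treatment adds propagation and attenuation factors not detailed in this abstract), the RCS follows directly:

```latex
% Free-space monostatic radar range equation and the implied RCS,
% with P_t the transmitted power, G the antenna gain, \lambda the
% wavelength, and r the range to the target.
P_r = \frac{P_t \, G^2 \lambda^2 \sigma}{(4\pi)^3 r^4}
\qquad\Longrightarrow\qquad
\sigma = \frac{(4\pi)^3 r^4 \, P_r}{P_t \, G^2 \lambda^2}
```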
Abstract:
The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive-definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69-104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite-sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high frequency data for three US financial assets, the new model is estimated and evaluated. Its forecasting performance is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
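The positive-definiteness guarantee is a general property of the matrix exponential: for any real symmetric matrix, exp maps it to a symmetric positive-definite matrix, so a latent process can move freely in the space of symmetric matrices while the implied covariance stays valid. A minimal numerical check in Python (this illustrates the mathematical property the model relies on, not the authors' estimation code):

```python
import numpy as np
from scipy.linalg import expm

# expm of a real symmetric matrix is symmetric positive definite,
# since each eigenvalue e maps to exp(e) > 0.
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2                             # unconstrained symmetric "log-covariance"
Sigma = expm(A)                               # implied covariance matrix
print(np.allclose(Sigma, Sigma.T))            # True: symmetric
print(np.all(np.linalg.eigvalsh(Sigma) > 0))  # True: positive definite
```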
Abstract:
The compositional record of the AND-2A drillcore is examined using petrological, sedimentological, volcanological and geochemical analysis of clasts, sediments and pore waters. Preliminary investigations of basement clasts (granitoids and metasediments) indicate both local and distal sources, corresponding to variable ice volume and ice-flow directions. The low abundance of sedimentary clasts (e.g., arkose, litharenite) suggests reduced contributions from sedimentary covers, while intraclasts (e.g., diamictite, conglomerate) attest to intrabasinal reworking. Volcanic material includes pyroclasts (e.g., pumice, scoria), sediments and lava. Primary and reworked tephra layers occur within the Early Miocene interval (1093 to 640 metres below sea floor, mbsf). The compositions of volcanic clasts reveal a diversity of alkaline types derived from the McMurdo Volcanic Group. Finer-grained sediments (e.g., sandstone, siltstone) show increases in biogenic silica and volcanic glass from 230 to 780 mbsf, and higher proportions of terrigenous material from c. 350 to 750 mbsf and below 970 mbsf. Basement clast assemblages suggest a dominant provenance from the Skelton Glacier - Darwin Glacier area and from the Ferrar Glacier - Koettlitz Glacier area. The provenance of sand grains is consistent with the clast sources. Thirteen Geochemical Units are established based on compositional trends derived from continuous XRF scanning. High values of Fe and Ti indicate terrigenous and volcanic sources, whereas high Ca values signify either biogenic or diagenetic sources. Highly alkaline and saline pore waters were produced by chemical exchange with glass at moderately elevated temperatures.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Understanding how aquatic species grow is fundamental in fisheries because stock assessment often relies on growth-dependent statistical models. Length-frequency-based methods become important when more suitable data for growth-model estimation are either unavailable or very expensive to obtain. In this article, we develop a new framework for growth estimation from length-frequency data using a generalized von Bertalanffy growth model (VBGM) that allows time-dependent covariates to be incorporated. A finite mixture of normal distributions is used to model the length-frequency cohorts of each month, with the means constrained to follow a VBGM. The variances of the mixture components are constrained to be a function of mean length, reducing the number of parameters and allowing the variance to be estimated at any length. To optimize the likelihood, we use a minorization-maximization (MM) algorithm with a Nelder-Mead sub-step. This work was motivated by the decline in catches of the blue swimmer crab (BSC, Portunus armatus) off the east coast of Queensland, Australia. We test the method in a simulation study and then apply it to the BSC fishery data.
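As a sketch of the likelihood being maximized, the Python snippet below writes the monthly mixture log-likelihood with component means on the von Bertalanffy curve. A constant coefficient of variation is assumed here as one concrete choice of "variance as a function of mean length", and the covariate-dependent generalization is omitted; function names are illustrative, not the paper's code.

```python
import numpy as np
from scipy.stats import norm

def vbgm_mean(age, L_inf, K, t0):
    """von Bertalanffy mean length-at-age: L_inf * (1 - exp(-K (age - t0)))."""
    return L_inf * (1.0 - np.exp(-K * (age - t0)))

def mixture_loglik(lengths, cohort_ages, weights, L_inf, K, t0, cv):
    """Log-likelihood of one month's length sample under a finite normal
    mixture whose component means follow the VBGM; component standard
    deviations are proportional to the mean (assumed constant-CV form)."""
    lengths = np.asarray(lengths, dtype=float)
    mu = vbgm_mean(np.asarray(cohort_ages, dtype=float), L_inf, K, t0)
    sd = cv * mu
    dens = np.asarray(weights) * norm.pdf(lengths[:, None], mu, sd)  # (n, k)
    return np.log(dens.sum(axis=1)).sum()
```

In the paper this likelihood is optimized with an MM algorithm with a Nelder-Mead sub-step; for experimentation, a generic optimizer such as scipy.optimize.minimize would serve.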
Abstract:
Chiasmata and crossovers are two related biological processes of great importance in understanding genetic variation. The study of these processes is straightforward in organisms where all products of meiosis are recovered and can be observed; this is not the case in mammals. Our understanding of these processes therefore depends on our ability to model them. In this study I describe the biological processes that underlie chiasma formation and crossover, as well as the two main inference problems associated with these processes: (i) in mammals we recover only one of the four products of meiosis, and (ii) in general we do not observe where the crossovers actually happen, but instead find an interval containing type-2 censored information. A nonparametric maximum likelihood (NPML) estimate was proposed and used in this work to compare chromosome length and chromosome expansion across the crosses.
Abstract:
Personal information is increasingly gathered and used to provide services tailored to user preferences, but the datasets used to provide such functionality can represent serious privacy threats if not appropriately protected. Work in privacy-preserving data publishing has targeted privacy guarantees that protect against record re-identification, by making records indistinguishable, or against sensitive-attribute disclosure, by introducing diversity or noise into the sensitive values. However, most approaches fail in the high-dimensional case, and the ones that don't introduce a utility cost incompatible with tailored recommendation scenarios. This paper aims at a sensible trade-off between privacy and the benefits of tailored recommendations, in the context of privacy-preserving data publishing. We empirically demonstrate that significant privacy improvements can be achieved at a utility cost compatible with tailored recommendation scenarios, using a simple partition-based sanitization method.
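The abstract names only "a simple partition-based sanitization method" without details. One common form of partition-based sanitization is Mondrian-style k-anonymization; the sketch below (assuming numeric quasi-identifiers; names and parameters are illustrative, not the paper's method) recursively partitions the data and generalizes each partition's quasi-identifiers to ranges.

```python
import pandas as pd

def mondrian_k_anonymize(df, quasi_ids, k):
    """Greedy Mondrian-style partitioning: recursively split on the
    quasi-identifier with the widest range while both halves keep at
    least k records, then generalize each partition to min-max ranges."""
    def split(part):
        col = max(quasi_ids, key=lambda c: part[c].max() - part[c].min())
        median = part[col].median()
        left, right = part[part[col] <= median], part[part[col] > median]
        if len(left) >= k and len(right) >= k:
            return split(left) + split(right)
        return [part]  # too small to split further: one k-anonymous group

    pieces = []
    for part in split(df):
        part = part.copy()
        for c in quasi_ids:
            part[c] = f"[{part[c].min()}-{part[c].max()}]"  # generalize to range
        pieces.append(part)
    return pd.concat(pieces)

# e.g. mondrian_k_anonymize(df, ["age", "zipcode"], k=5)
```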
Abstract:
The design demands on water and sanitation engineers are changing rapidly. The global population is set to rise from 7 billion to 10 billion by 2083. Urbanisation in developing regions is increasing at such a rate that a predicted 56% of the global population will live in an urban setting by 2025. Compounding these problems, the global water and energy crises are impacting the Global North and South alike. High-rate anaerobic digestion offers a low-cost, low-energy treatment alternative to the energy-intensive aerobic technologies used today. Widespread implementation, however, is hindered by the lack of capacity to engineer high-rate anaerobic digestion for the treatment of complex wastes such as sewage. This thesis uses the Expanded Granular Sludge Bed (EGSB) bioreactor as a model system in which to study the ecology, physiology and performance of high-rate anaerobic digestion of complex wastes. The impacts of a range of engineered parameters, including reactor geometry, wastewater type, operating temperature and organic loading rate, are systematically investigated using lab-scale EGSB bioreactors. Next-generation sequencing of 16S amplicons is used to monitor microbial ecology, microbial community physiology is monitored by specific methanogenic activity testing, and a range of physical and chemical methods are applied to assess reactor performance. Finally, the limit-state approach is trialled as a method for testing the EGSB and is proposed as a standard method for biotechnology testing, enabling improved process control at full scale. The resulting data are assessed both qualitatively and quantitatively. Lab-scale reactor design is demonstrated to significantly influence the spatial distribution of the underlying ecology and community physiology, a vital finding for both researchers and full-scale plant operators responsible for monitoring EGSB reactors. Recurrent trends in the data indicate that hydrogenotrophic methanogenesis dominates in high-rate anaerobic digestion at both full and lab scale when subject to engineered or operational stresses, including low temperature and variable feeding regimes. This is relevant both for those seeking to define new directions in the fundamental understanding of syntrophic and competitive relations in methanogenic communities and for design engineers determining operating parameters for full-scale digesters. The adoption of the limit-state approach enabled the identification of biological indicators providing early warning of failure under high-solids loading, a vital insight for those currently working empirically towards new biotechnologies at lab scale.
Abstract:
The under-reporting of cases of infectious diseases is a substantial impediment to the control and management of infectious diseases in both epidemic and endemic contexts. Information about infectious disease dynamics can be recovered from sequence data using time-varying coalescent approaches, and phylodynamic models have been developed to reconstruct demographic changes in the number of infected hosts through time. In this study I have demonstrated the general concordance between empirically observed epidemiological incidence data and viral demography inferred through analysis of foot-and-mouth disease virus VP1 coding sequences belonging to the CATHAY topotype over large temporal and spatial scales. However, a more precise and robust relationship between the effective population size (