900 results for Data Structures, Cryptology and Information Theory
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This thesis attempts to provide deeper historical and theoretical grounding for sense-making, thereby illustrating its applicability to practical information-seeking research. In Chapter One I trace the philosophical origins of Brenda Dervin’s theory of “sense-making,” reaching beyond current scholarship, which locates the origins of sense-making in twentieth-century phenomenology and communication theory, to find a rich ontological, epistemological, and etymological heritage dating back to the Pre-Socratics. After exploring sense-making’s Greek roots, I examine its philosophical undercurrents in Hegel’s Phenomenology of Spirit (1807), where he too returns to the simplicity of the Greeks for his concept of sense. In Chapter Two I explore sense-making methodology and find, in light of the Greek and Hegelian dialectic, a dialogical bridge connecting sense-making’s theory with its pragmatic uses. This bridge between Dervin’s situation and use occupies a distinct position in sense-making theory. Moreover, building upon Dervin’s model of sense-making, I use her metaphors of gap and bridge to discuss the dialectic and dialogic components of sense-making. The purpose of Chapter Three is pragmatic: to gain insight into the online information-seeking needs, experiences, and motivations of first-degree relatives (FDRs) of breast cancer survivors through the lens of sense-making. This research analyzes four questions: 1) information-seeking behavior among FDRs of cancer survivors compared to survivors and to undiagnosed, non-related online cancer information seekers in the general population; 2) the types of information sought and the places where it is sought; 3) the barriers or gaps FDRs face in their cancer information quest, and their satisfaction rates; and 4) the types and degrees of cancer information and resources FDRs want and use when searching for themselves and other family members.
An online survey instrument designed to investigate these questions was developed and pilot tested. Via an email communication, the Susan Love Breast Cancer Research Foundation distributed 322,000 invitations to its membership to complete the survey; from March 24th to April 5th, 10,692 women agreed to take the survey and 8,804 actually completed it. Of the 8,804 respondents, 95% of FDRs had searched for cancer information online, and 84% of FDRs used the Internet as a sense-making tool to supplement information they had received from doctors or nurses. FDRs reported needing much more information than either survivors or family/friends in ten out of fifteen categories related to breast and ovarian cancer. When searching for cancer information online, FDRs also reported higher levels of several of sense-making’s emotions (uncertainty, confusion, frustration, doubt, and disappointment) than did either survivors or friends and family. The sense-making process has existed in theory and praxis since the early Greeks. In applying sense-making’s theory to a contemporary problem, the survey reveals unaddressed situations and gaps in FDRs’ information search process. FDRs are a highly motivated group of online information seekers whose needs go largely unmet because available online information is not targeted to their specific needs. Since FDRs represent a quarter of the population, further research addressing their specific online information needs and experiences is necessary.
Abstract:
This dissertation comprises three chapters. The first chapter motivates the use of a novel data set combining survey and administrative sources for the study of internal labor migration. By following a sample of individuals from the American Community Survey (ACS) across their employment outcomes over time according to the Longitudinal Employer-Household Dynamics (LEHD) database, I construct a measure of geographic labor mobility that allows me to exploit information about individuals prior to their move. This enables me to explore aspects of the migration decision, such as homeownership and employment status, in ways that have not previously been possible. In the second chapter, I use this data set to test the theory that falling home prices affect a worker’s propensity to take a job in a different metropolitan area from where he is currently located. Employing a within-CBSA and time estimation that compares homeowners to renters in their propensities to relocate for jobs, I find that homeowners who have experienced declines in the nominal value of their homes are approximately 12% less likely than average to take a new job in a location outside of the metropolitan area where they currently reside. This evidence is consistent with the hypothesis that housing lock-in has contributed to the decline in labor mobility of homeowners during the recent housing bust. The third chapter focuses on a sample of unemployed workers in the same data set, in order to compare the unemployment durations of those who find subsequent employment by relocating to a new metropolitan area, versus those who find employment in their original location. Using an instrumental variables strategy to address the endogeneity of the migration decision, I find that out-migrating for a new job significantly reduces the time to re-employment. These results stand in contrast to OLS estimates, which suggest that those who move have longer unemployment durations. 
This implies that those who migrate for jobs in the data may be particularly disadvantaged in their ability to find employment, and thus have strong short-term incentives to relocate.
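The contrast between OLS and instrumental-variables estimates described above can be illustrated with a toy just-identified two-stage least squares (2SLS) computation on simulated data. This is a hypothetical sketch, not the dissertation's actual specification; the instrument, variable names, and coefficients are all invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

# Simulated workers: `hardship` is an unobserved trait that both raises the
# propensity to migrate for a job and lengthens unemployment duration.
z = rng.normal(size=n)                       # instrument: shifts migration only
hardship = rng.normal(size=n)
migrate = (0.8 * z + hardship + rng.normal(size=n) > 0).astype(float)
duration = 5.0 - 2.0 * migrate + 3.0 * hardship + rng.normal(size=n)
# true causal effect of migrating on unemployment duration: -2.0

X = np.column_stack([np.ones(n), migrate])   # regressors
Z = np.column_stack([np.ones(n), z])         # instruments

b_ols = np.linalg.lstsq(X, duration, rcond=None)[0]
b_iv = np.linalg.solve(Z.T @ X, Z.T @ duration)   # just-identified 2SLS

# OLS is biased upward: disadvantaged workers migrate more AND stay
# unemployed longer, so movers look worse. IV recovers the negative effect.
```

In this simulation the OLS coefficient on `migrate` is positive (movers appear to have longer durations), while the IV estimate is close to the true -2.0, mirroring the pattern the chapter reports.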
Abstract:
Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to ambiguity over the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on the accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customized. IMUs are subject to significant error accumulation, but they can be used for the statistical parametrisation of an Eulerian-Lagrangian model, for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at 4 Hz from a spherical instrumented particle (111 mm diameter, 2383 kg/m3 density) in a series of entrainment-threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximated at 44 and 51 mg for slopes of 0.026 and 0.037 respectively. The saddle inertial acceleration entrainment threshold was 32 and 25 mg for slopes of 0.044 and 0.057 respectively. For the evaluation of the complete Eulerian-Lagrangian model, two prototype sensors are presented: an idealised spherical sensor with a diameter of 90 mm, and an ellipsoidal sensor with axes of 100, 70 and 30 mm.
Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrise sediment movement, but they do not contain positional information. The two sensors (spherical and ellipsoidal) were tested in a series of entrainment experiments, similar to those used to evaluate the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
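As a minimal illustration of how an mg-scale threshold like those reported above might be applied to IMU output, the following sketch flags the first time the smoothed deviation of acceleration magnitude from 1 g exceeds a threshold. It is a hypothetical post-processing step on synthetic data, not the thesis's actual signal-analysis pipeline:

```python
import numpy as np

def detect_entrainment(acc_mag_g, fs, threshold_mg, window_s=0.5):
    """Return the time (s) of the first sample where the moving-average
    deviation of acceleration magnitude from 1 g exceeds threshold_mg,
    or None if the threshold is never crossed."""
    dev_mg = np.abs(acc_mag_g - 1.0) * 1000.0        # deviation from rest, in mg
    w = max(1, int(window_s * fs))
    smooth = np.convolve(dev_mg, np.ones(w) / w, mode="same")
    idx = np.flatnonzero(smooth > threshold_mg)
    return (idx[0] / fs) if idx.size else None

# Synthetic 4 Hz record: 10 s at rest, then a 60 mg disturbance,
# which exceeds the 44-51 mg grain-top thresholds quoted above.
fs = 4.0
t = np.arange(0.0, 20.0, 1.0 / fs)
acc = np.ones_like(t)
acc[t >= 10.0] += 0.060

onset = detect_entrainment(acc, fs, threshold_mg=44.0)
```

The smoothing window trades responsiveness against noise rejection; a real pipeline would also need bias and drift correction, which is exactly the error accumulation the thesis works around by restricting IMU use to individual entrainment events.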
Abstract:
Analytics is the technology of manipulating data to produce information capable of changing the world we live in every day. Over the last decade, analytics has been widely used to cluster people’s behaviour and predict their preferences for items to buy, music to listen to, movies to watch and even electoral choices. The most advanced companies have succeeded in steering people’s behaviour using analytics. Despite this evidence of the power of analytics, it is rarely applied to the big data collected within supply chain systems (i.e. distribution networks, storage systems and production plants). This PhD thesis explores the fourth research paradigm (i.e. the generation of knowledge from data) applied to supply chain system design and operations management. An ontology defining the entities and the metrics of supply chain systems is used to design data structures for data collection in supply chain systems. The consistency of these data is ensured by mathematical demonstrations inspired by factory physics theory. The availability, quantity and quality of the data within these data structures define different decision patterns. Ten decision patterns are identified, and validated in the field, to address ten different classes of design and control problems in supply chain systems research.
Abstract:
The thesis deals with the problem of Model Selection (MS), motivated by information and prediction theory and focusing on parametric time series (TS) models. The main contribution of the thesis is the extension to the multivariate case of the Misspecification-Resistant Information Criterion (MRIC), a recently introduced criterion that solves Akaike’s original research problem, posed 50 years ago, which led to the definition of the AIC. The importance of MS is witnessed by the huge amount of literature devoted to it and published in scientific journals of many different disciplines. Despite such widespread treatment, the contributions that adopt a mathematically rigorous approach are not numerous, and one of the aims of this project is to review and assess them. Chapter 2 discusses methodological aspects of MS from the viewpoint of information theory. Information criteria (IC) for the i.i.d. setting are surveyed along with their asymptotic properties, together with the cases of small samples, misspecification, and alternative estimators. Chapter 3 surveys criteria for TS. IC and prediction criteria are considered for univariate models (AR, ARMA) in the time and frequency domains; parametric multivariate models (VARMA, VAR); nonparametric nonlinear models (NAR); and high-dimensional models. The MRIC answers Akaike’s original question on efficient criteria for possibly-misspecified (PM) univariate TS models in multi-step prediction with high-dimensional data and nonlinear models. Chapter 4 extends the MRIC to PM multivariate TS models for multi-step prediction, introducing the Vectorial MRIC (VMRIC). We show that the VMRIC is asymptotically efficient by proving the decomposition of the MSPE matrix and the consistency of its Method-of-Moments Estimator (MoME), for least-squares multi-step prediction with a univariate regressor. Chapter 5 extends the VMRIC to the general multiple-regressor case by showing that the MSPE matrix decomposition holds, obtaining consistency for its MoME, and proving its efficiency.
The chapter concludes with a digression on the conditions for PM VARX models.
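The MRIC itself is beyond the scope of a toy example, but the order-selection workflow it refines can be sketched with the classical AIC that it generalizes, applied to least-squares AR(p) fits on a simulated series. The function name and the simulated AR(2) model are illustrative, not from the thesis:

```python
import numpy as np

def ar_aic(x, p):
    """Least-squares AR(p) fit; returns AIC = n*log(RSS/n) + 2p
    for the conditional regression sample of size n = len(x) - p."""
    n = len(x) - p
    X = np.column_stack([x[p - k : p - k + n] for k in range(1, p + 1)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return n * np.log(rss / n) + 2 * p

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
rng = np.random.default_rng(0)
x = np.zeros(1100)
e = rng.normal(size=1100)
for t in range(2, 1100):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
x = x[100:]  # discard burn-in

best_p = min(range(1, 7), key=lambda p: ar_aic(x, p))  # AIC order choice
```

A more careful comparison would condition all candidate fits on the same initial values; the MRIC goes further still, targeting the multi-step MSPE directly so that the criterion remains meaningful when every candidate model is misspecified.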
Abstract:
Nowadays the concept of information has become crucial in physics; therefore, since the best theory we have for making predictions about the universe is quantum mechanics, the development of a quantum version of information theory takes on particular importance. This centrality is confirmed by the fact that black holes have entropy. For this reason, this work presents elements of quantum information theory and quantum communication, and some of them are illustrated with reference to highly idealized quantum models of black-hole mechanics. In particular, the first chapter provides all the quantum-mechanical tools for quantum information and communication theory. Next, quantum information theory is addressed, and the Bekenstein bound on the amount of information that can be enclosed within any spatial region is derived. This question is treated using an idealized quantum model of black-hole mechanics supported by thermodynamics. In the final chapter, the problem of finding an achievable rate for quantum communication is examined, again making use of an idealized quantum model of a black hole in order to illustrate elements of the theory. Finally, a brief summary of black-hole physics is provided in an appendix.
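For context, the bound referred to above can be stated in one line. For a system of total energy $E$ that fits inside a sphere of radius $R$, Bekenstein's bound limits the entropy, and hence the information content, of the region:

```latex
S \;\le\; \frac{2\pi k_B\, R\, E}{\hbar c}
```

The bound is saturated by a Schwarzschild black hole, whose Bekenstein-Hawking entropy is proportional to the horizon area $A$ rather than the volume,

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad \ell_P = \sqrt{\frac{G\hbar}{c^{3}}},
```

which is why idealized black-hole models are a natural testing ground for the information-theoretic questions treated in the thesis.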
Abstract:
A catalogue is provided of the type material of four superfamilies of "Acalyptrate" flies (Conopoidea, Diopsoidea, Nerioidea and Tephritoidea) held in the collection of the Museu de Zoologia da Universidade de São Paulo (MZUSP), São Paulo, Brazil. Concerning the taxa dealt with herein, the Diptera collection of the MZUSP held 77 holotypes, 4 "allotypes" and 194 paratypes. In this paper, information about data labels, preservation and missing structures of the type specimens is given.
Abstract:
Taking as a starting point the acknowledgment that the principles and methods used to build and manage documentary systems are dispersed and lack systematization, this study hypothesizes that the notion of structure, by assuming mutual relationships among a system's elements, promotes more organic systems and assures better quality and consistency in the retrieval of information concerning users' matters. Accordingly, it aims to explore the fundamentals of information records and documentary systems, starting from the notion of structure. To that end, it presents basic concepts and matters relative to documentary systems and information records. Next, it reviews the theoretical groundwork on the notion of structure as studied by Benveniste, Ferrater Mora, Levi-Strauss, Lopes, Penalver Simo and Saussure, as well as by Ducrot, Favero and Koch. Appropriations of these ideas already made in Documentation by Paul Otlet, Garcia Gutierrez and Moreiro Gonzalez come as a further topic. It concludes that the notion of structure, adopted to make explicit a hypothesis of real systematization, yields more organic systems and provides a pedagogical reference for documentary tasks.
Abstract:
The Brazilian Network of Food Data Systems (BRASILFOODS) has maintained the Brazilian Food Composition Database-USP (TBCA-USP) (http://www.fcf.usp.br/tabela) since 1998. Besides the constant compilation, analysis and update work on the database, the network seeks to innovate through the introduction of food information that may contribute to decreasing the risk of non-transmissible chronic diseases, such as the profile of carbohydrates and flavonoids in foods. In 2008, individually analyzed carbohydrate data for 112 foods, along with 41 data points on the glycemic response produced by foods widely consumed in the country, were included in the TBCA-USP. Data (773 values) on the different flavonoid subclasses of 197 Brazilian foods were compiled, and the quality of each value was evaluated according to the USDA's data quality evaluation system. In 2007, BRASILFOODS/USP and INFOODS/FAO organized the 7th International Food Data Conference, "Food Composition and Biodiversity". This conference was a unique opportunity for interaction between renowned researchers and participants from several countries, and it allowed the discussion of aspects that may improve the food composition area. During the period, the LATINFOODS Regional Technical Compilation Committee and BRASILFOODS disseminated to Latin America the Form and Manual for Data Compilation, version 2009, delivered a Food Composition Data Compilation course and developed many activities related to data production and compilation.
Abstract:
Reviews the ecological status of the mahogany glider and describes its distribution, habitat and abundance, life history and threats to it. Three serial surveys of Brisbane residents provide data on respondents' knowledge of the mahogany glider. The results provide information about respondents' attitudes to the mahogany glider, to its conservation and to relevant public policies, and about variations in these factors as participants' knowledge of the mahogany glider alters. Similarly, data are provided and analysed on respondents' willingness to pay to conserve the mahogany glider. Population viability analysis is applied to estimate the habitat area required for a minimum viable population of the mahogany glider to ensure at least a 95% probability of its survival for 100 years. Places are identified in Queensland where the requisite minimum area of critical habitat can be conserved. Using the survey results as a basis, the likely willingness of groups of Australians to pay for the conservation of the mahogany glider is estimated, and consequently their willingness to pay for the minimum required area of its habitat. Methods for estimating the cost of protecting this habitat are outlined. Australia-wide benefits seem to exceed the costs. Establishing a national park containing the minimum viable population of the mahogany glider is an appealing management option. It would also help conserve other endangered wildlife species, so economic benefits additional to those estimated on account of the mahogany glider itself can be obtained.
Abstract:
The integral of the Wigner function of a quantum-mechanical system over a region of the classical phase plane, or over its boundary, is called a quasiprobability integral. Unlike a true probability integral, its value may lie outside the interval [0, 1]. It is characterized by a corresponding self-adjoint operator, called a region or contour operator as appropriate, which is determined by the characteristic function of that region or contour. The spectral problem is studied for commuting families of region and contour operators associated with concentric discs and circles of given radius a. Their respective eigenvalues are determined as functions of a, in terms of the Gauss-Laguerre polynomials. These polynomials provide a basis of vectors in a Hilbert space carrying the positive discrete series representation of the algebra su(1,1) ≃ so(2,1). The explicit relation between the spectra of operators associated with discs and circles of proportional radii is given in terms of the discrete-variable Meixner polynomials.
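A small numerical illustration (an independent sketch, not the paper's method) of a quasiprobability integral escaping [0, 1]: the Wigner function of the first excited oscillator (Fock) state is negative near the origin, so its integral over a small disc is negative. With ħ = 1, the standard formula is W_n(x, p) = ((-1)^n/π) e^{-(x²+p²)} L_n(2(x²+p²)):

```python
import numpy as np
from numpy.polynomial.laguerre import lagval

def wigner_fock(n, x, p):
    """Wigner function of the n-th harmonic-oscillator Fock state (hbar = 1)."""
    r2 = x ** 2 + p ** 2
    c = np.zeros(n + 1)
    c[n] = 1.0                                # coefficient vector selecting L_n
    return ((-1) ** n / np.pi) * np.exp(-r2) * lagval(2 * r2, c)

def disc_quasiprobability(n, a, num=20_000):
    """Integrate W_n over the disc of radius a centred at the origin,
    exploiting the radial symmetry of Fock-state Wigner functions."""
    r = np.linspace(0.0, a, num)
    g = 2 * np.pi * r * wigner_fock(n, r, np.zeros_like(r))
    return float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(r)))  # trapezoid rule

# For n = 1 the disc integral is 1 - (2a^2 + 1) e^{-a^2}, minimised at a^2 = 1/2.
q = disc_quasiprobability(1, 1 / np.sqrt(2))
```

Here `q` is roughly 1 - 2e^{-1/2} ≈ -0.21: a perfectly well-defined eigenvalue of the corresponding disc operator, yet impossible for a genuine probability.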
Abstract:
Geospatial clustering must be designed in such a way that it takes into account the special features of geoinformation and the peculiar nature of geographical environments in order to successfully derive geospatially interesting global concentrations and localized excesses. This paper examines families of geospatial clustering methods recently proposed in the data mining community and identifies several features and issues especially important to geospatial clustering in data-rich environments.
Abstract:
With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' ability to detect data anomalies in a relational database. The results demonstrate that both unnormalised data structures and those at higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First-normal-form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
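A minimal sketch of the kind of anomaly-detection query the experiments study, run against a hypothetical first-normal-form table (the schema, data, and the specific anomaly are invented for illustration):

```python
import sqlite3

# A small first-normal-form table with a deliberately inconsistent row:
# product 102 is recorded with two different unit prices.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE order_line (
    order_id   INTEGER,
    product_id INTEGER,
    unit_price REAL
);
INSERT INTO order_line VALUES
    (1, 101, 9.99),
    (1, 102, 4.50),
    (2, 102, 4.75),   -- anomaly: disagrees with the price in order 1
    (3, 103, 2.00);
""")

# Detect products whose unit price is recorded inconsistently.
rows = conn.execute("""
    SELECT product_id, COUNT(DISTINCT unit_price) AS n_prices
    FROM order_line
    GROUP BY product_id
    HAVING COUNT(DISTINCT unit_price) > 1
""").fetchall()
```

In first normal form this check is a single GROUP BY; in an unnormalised layout with repeating price columns, or in a highly decomposed schema requiring several joins, the equivalent query is harder for end-users to formulate correctly, which is the effect the experiments measure.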
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling the complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.