981 results for abstract data type


Relevance: 80.00%

Abstract:

The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. The 2014 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning-making that the participants engage in about the event. We continued the data collection commenced in 2012: "Game jams are the creative festivals of the game development community and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling." [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia.

Relevance: 80.00%

Abstract:

The mitochondrial (mt) genome is, to date, the most extensively studied genomic system in insects, with sequenced mt genomes outnumbering nuclear genomes tenfold and representing all orders, versus very few orders for nuclear genomes. Phylogenomic analysis methods have been tested extensively, identifying compositional bias and rate variation, both within and between lineages, as the principal issues confronting accurate analyses. Major studies at both inter- and intraordinal levels have contributed to our understanding of phylogenetic relationships within many groups. Genome rearrangements are an additional data type for defining relationships, with rearrangement synapomorphies identified across multiple orders and at many different taxonomic levels. Hymenoptera and Psocodea have greatly elevated rates of rearrangement, offering both opportunities and pitfalls for identifying rearrangement synapomorphies in each group. Finally, insects are model systems for studying aberrant mt genomes, including truncated tRNAs and multichromosomal genomes. Greater integration of nuclear and mt genomic studies is necessary to further our understanding of insect genomic evolution.

Relevance: 80.00%

Abstract:

Objective: To evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random fields (CRF) classifiers informed by linguistic and lexical features, as well as features extracted by pattern-matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness.

Methods and materials: The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in the type of health records, the source of the data, and their quality, with one of the datasets containing optical character recognition errors.

Results: Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training.

Conclusion: The findings show that Anonym is comparable to the best approach from the 2006 i2b2 shared task. Anonym is easy to retrain with new datasets; given sufficient training data, the system is robust to variations in training size, data type and quality.
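A small sketch may help make the feature-driven, sequence-labelling approach concrete. The token features, regular expressions and example sentence below are hypothetical simplifications, not Anonym's actual feature set; they only illustrate how lexical features and pattern-matching features can be combined per token before training a conditional random fields classifier.

```python
import re

# Hypothetical pattern-matching features; a real de-identification system
# would use far richer dictionaries and regular expressions.
PATTERNS = {
    "looks_like_date": re.compile(r"^\d{1,2}/\d{1,2}/\d{2,4}$"),
    "looks_like_phone": re.compile(r"^\(?\d{3}\)?[-\s]?\d{3}[-\s]?\d{4}$"),
    "looks_like_id": re.compile(r"^[A-Z]{2,3}\d{4,}$"),
}

def token_features(tokens, i):
    """Build a feature dict for token i, mixing lexical and pattern features."""
    tok = tokens[i]
    feats = {
        "lower": tok.lower(),
        "is_title": tok.istitle(),
        "is_digit": tok.isdigit(),
        "prefix3": tok[:3],
        "suffix3": tok[-3:],
        "prev_lower": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next_lower": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }
    for name, pattern in PATTERNS.items():
        feats[name] = bool(pattern.match(tok))
    return feats

# Example: feature dicts for each token of a toy sentence; such sequences,
# paired with BIO-style PHI labels, would be fed to a CRF trainer.
sentence = "Seen by Dr. Smith on 12/03/2006".split()
X = [token_features(sentence, i) for i in range(len(sentence))]
```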

Relevance: 80.00%

Abstract:

The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. The 2015 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning-making that the participants engage in about the event. We are continuing the data collection commenced in 2012: "Game jams are the creative festivals of the game development community and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling." [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia.

Relevance: 80.00%

Abstract:

A Function Definition Language (FDL) is presented. Though designed for describing specifications, FDL is also a general-purpose functional programming language. It uses context-free languages as data types, supports pattern-matching definitions of functions, offers several forms of function definition, and is executable. It is shown that FDL is highly expressive, is easy to use, and describes algorithms concisely and naturally. An interpreter for FDL is introduced, and experiments and discussion are included.
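As an illustration of the pattern-matching style of function definition that FDL supports, here is a minimal sketch using Python's own structural pattern matching (Python 3.10+); the syntax is Python's, not FDL's, and only conveys the idea of defining a function by cases over the shape of its argument.

```python
# Define a function by cases on the structure of its argument.
def length(xs):
    match xs:
        case []:            # empty list
            return 0
        case [_, *rest]:    # head followed by the rest of the list
            return 1 + length(rest)

assert length([3, 1, 4]) == 3
```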

Relevance: 80.00%

Abstract:

This report describes our attempt to add animation as another data type to be used on the World Wide Web. Our current network infrastructure, the Internet, is incapable of carrying the video and audio streams that would be needed for presentation purposes on the web. In contrast, object-oriented animation proves to be efficient in terms of network resource requirements. We defined an animation model to support drawing-based and frame-based animation. We also extended the HyperText Markup Language in order to include this animation model. BU-NCSA Mosanim, a modified version of NCSA Mosaic for X (v2.5), is available to demonstrate the concept and potential of animation for presentations and interactive game playing over the web.
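A rough sketch of the distinction between the two animation styles mentioned above, using hypothetical class and field names rather than the paper's actual model or its HTML extension:

```python
from dataclasses import dataclass, field

@dataclass
class FrameAnimation:
    """Frame-based: a sequence of pre-rendered frames shown at a fixed rate."""
    frames: list                     # e.g. raster images or references to them
    frames_per_second: int = 12

@dataclass
class DrawingAnimation:
    """Drawing-based: compact drawing commands replayed by the client."""
    commands: list = field(default_factory=list)   # e.g. ("line", x1, y1, x2, y2)

    def add_line(self, x1, y1, x2, y2):
        self.commands.append(("line", x1, y1, x2, y2))

# A drawing-based clip is just a short command list, far smaller than an
# equivalent video stream -- the bandwidth argument made in the abstract.
clip = DrawingAnimation()
clip.add_line(0, 0, 100, 100)
```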

Relevance: 80.00%

Abstract:

Personal communication devices are increasingly equipped with sensors that are able to collect and locally store information from their environs. The mobility of users carrying such devices, and hence the mobility of sensor readings in space and time, opens new horizons for interesting applications. In particular, we envision a system in which the collective sensing, storage and communication resources, and mobility of these devices could be leveraged to query the state of (possibly remote) neighborhoods. Such queries would have spatio-temporal constraints which must be met for the query answers to be useful. Using a simplified mobility model, we analytically quantify the benefits from cooperation (in terms of the system's ability to satisfy spatio-temporal constraints), which we show to go beyond simple space-time tradeoffs. In managing the limited storage resources of such cooperative systems, the goal should be to minimize the number of unsatisfiable spatio-temporal constraints. We show that Data Centric Storage (DCS), or "directed placement", is a viable approach for achieving this goal, but only when the underlying network is well connected. Alternatively, we propose "amorphous placement", in which sensory samples are cached locally, and shuffling of cached samples is used to diffuse the sensory data throughout the whole network. We evaluate conditions under which directed versus amorphous placement strategies would be more efficient. These results lead us to propose a hybrid placement strategy, in which the spatio-temporal constraints associated with a sensory data type determine the most appropriate placement strategy for that data type. We perform an extensive simulation study to evaluate the performance of directed, amorphous, and hybrid placement protocols when applied to queries that are subject to timing constraints. Our results show that directed placement is better for queries with moderately tight deadlines, whereas amorphous placement is better for queries with looser deadlines, and that under most operational conditions the hybrid technique gives the best compromise.
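A minimal sketch of the hybrid idea described above: the spatio-temporal constraint attached to a sensory data type (here reduced to a deadline) selects the placement strategy. The threshold, names and signature are illustrative assumptions, not the paper's protocol.

```python
# Assumed threshold: below this deadline, samples are routed to a designated
# home region (DCS-style directed placement).
TIGHT_DEADLINE_S = 60.0

def choose_placement(deadline_s, network_well_connected):
    """Return 'directed' (DCS-style) or 'amorphous' (local cache + shuffling)."""
    if network_well_connected and deadline_s <= TIGHT_DEADLINE_S:
        return "directed"
    return "amorphous"

assert choose_placement(30.0, network_well_connected=True) == "directed"
assert choose_placement(600.0, network_well_connected=True) == "amorphous"
```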

Relevance: 80.00%

Abstract:

Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) is developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programs directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation, which means that applying any operation to a random structure results in an output isomorphic to one or more random structures; this is key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain-specific scripting language based on randomness-preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The notion of a labelled partial order (LPO) is the basic data type in the language. The programmer uses built-in MOQA operations together with restricted control-flow statements to design MOQA programs. This MOQA language is formally specified both syntactically and semantically in this thesis. A practical language interpreter implementation is provided and discussed. By analysing new algorithms and data restructuring operations, we demonstrate the wide applicability of the MOQA approach. We also extend MOQA theory to a number of other domains besides average-case analysis. We show the strong connection between MOQA and parallel computing, reversible computing and data entropy analysis.
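As a rough illustration of the labelled partial order (LPO) data type mentioned above, the following sketch stores node labels and covering relations; it is an assumption-laden simplification, not MOQA's actual implementation.

```python
class LabelledPartialOrder:
    """A set of nodes, a label per node, and an order given by covering pairs."""

    def __init__(self):
        self.labels = {}   # node -> label
        self.above = {}    # node -> set of nodes directly above it

    def add_node(self, node, label):
        self.labels[node] = label
        self.above.setdefault(node, set())

    def add_order(self, lower, upper):
        """Record that `lower` precedes `upper` in the partial order."""
        self.above[lower].add(upper)

# Example: a three-element order a <= b, a <= c with integer labels.
lpo = LabelledPartialOrder()
for n, lab in [("a", 1), ("b", 5), ("c", 3)]:
    lpo.add_node(n, lab)
lpo.add_order("a", "b")
lpo.add_order("a", "c")
```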

Relevance: 80.00%

Abstract:

Often the modification and enhancement of large scientific software systems are severely hampered because many components of the system are written in an implementation-dependent fashion, they are inadequately documented, and their functionalities are not precisely known. In this paper we consider how mathematics may be employed to alleviate some of these problems. In particular, we illustrate how the formal specification notation VDM-SL is being used to precisely specify abstract data types for use in the development of scientific software.
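In the same spirit, the following sketch (in Python rather than VDM-SL) shows an abstract data type whose operations carry explicit pre- and postconditions; the bounded-stack example and its conditions are illustrative and are not taken from the paper.

```python
class BoundedStack:
    """An ADT whose operations state their pre- and postconditions explicitly."""

    def __init__(self, capacity):
        assert capacity > 0                          # invariant: positive capacity
        self.capacity = capacity
        self._items = []

    def push(self, x):
        assert len(self._items) < self.capacity     # pre: stack not full
        self._items.append(x)
        assert self._items[-1] == x                 # post: x is on top

    def pop(self):
        assert self._items                          # pre: stack not empty
        return self._items.pop()
```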

Relevance: 80.00%

Abstract:

Dissolved oxygen data off the Peruvian coast were analysed to understand variations in the oxygen minimum zone (OMZ), characterised by: a) the thickness of that zone, bounded by the 0.5 mL·L-1 oxygen isolines; and b) the depth of its upper limit in the coastal marine strip. Results are presented for water-column surveys in Callao Bay over the period 1999-2009, and additionally for January-February 2009, when the OMZ was studied during Cruise Meteor 77-4 0901-02: Interaction in the Tropical Ocean, Biogeochemistry and Climate. In the coastal zone off Callao (12°S) hypoxia is accentuated; the OMZ is constrained by the shelf, and its shallowest upper limit was located at a depth of 2.5 m. Using the information obtained in January-February 2009 (Cruise 0901-02), the spatial variability of the OMZ was analysed, and a thickness of ~637.8 m was found in the Punta Falsa section (6°S). The dynamics of the OMZ and its upper limit were analysed, given the great interest they have attracted owing to the possible increase in the zone's thickness in the context of climate change, with major repercussions for fishery resources.

Relevance: 80.00%

Abstract:

Abstract - Data on the surgical management of tricuspid valve disease rest on small cohort studies, and few have examined echocardiographic outcomes or risk factors for mortality and morbidity. A retrospective descriptive and analytical cohort study was conducted to analyse the experience of the Institut de Cardiologie de Montréal with tricuspid valve (TV) surgery. Data were collected from medical records. Over the period 1977-2008, 792 TV repairs and 134 TV replacements were performed (median age: 62 years). Operative mortality was 13.8%. Actuarial survival at 5, 10 and 15 years was 67±2%, 47±2% and 29±2%, respectively. At last follow-up, tricuspid regurgitation (TR) ≥3/4 was present in 31% of patients in the repair group and in 12% of patients in the replacement group (p<0.001). NYHA functional class improved significantly at last follow-up compared with the preoperative period (p<0.001). Propensity analysis shows that, compared with repair, TV replacement is associated with significantly higher operative and late mortality, but with less TR ≥2/4 or ≥3/4 at follow-up. This study shows that despite the substantial surgical risk associated with TV surgery, patients benefit from significant functional improvement. Risk factors for mortality and morbidity are described, and subgroup studies on triple-valve surgery and isolated TV surgery are presented.

Relevance: 80.00%

Abstract:

ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean-buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the second-generation ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.

Relevance: 80.00%

Abstract:

1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species' traits and/or extrinsic factors. This approach requires comprehensive species data, but information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall studied group.
2. Our study challenges this assumption and quantifies the taxonomic, spatial, and data-type biases associated with the quantity of data available for 5415 mammalian species using the freely available life-history database PanTHERIA.
3. Moreover, we explore how existing biases influence the results of comparative analyses of extinction risk by using subsets of data that attempt to correct for detected biases. In particular, we focus on the links between four species' traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions.
4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear not to be missing at random, occur frequently in all traits (14-99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and changing which traits are identified as important predictors.
5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction. Missing data represent a prevalent problem in comparative analyses and, unfortunately, because data are not missing at random, conventional approaches to filling data gaps are either not valid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches to effectively correct existing biases.
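A minimal sketch of the kind of bias check described in points 2 and 4, using pandas; the column names and the toy records are illustrative, not PanTHERIA's actual schema.

```python
import pandas as pd

traits = ["range_area_km2", "adult_body_mass_g", "population_density", "gestation_length_d"]
df = pd.DataFrame([
    {"species": "sp1", "range_area_km2": 1.2e6, "adult_body_mass_g": 250000.0,
     "population_density": 0.4, "gestation_length_d": 210.0},
    {"species": "sp2", "range_area_km2": 3.0e4, "adult_body_mass_g": 35.0,
     "population_density": None, "gestation_length_d": None},
    {"species": "sp3", "range_area_km2": None, "adult_body_mass_g": None,
     "population_density": None, "gestation_length_d": None},
])

# Share of species missing each trait, and share with no trait data at all.
print(df[traits].isna().mean())
print(df[traits].isna().all(axis=1).mean())

# Crude non-randomness check: among species with known mass, compare median mass
# between those with and without gestation data.
known_mass = df.dropna(subset=["adult_body_mass_g"])
print(known_mass.groupby(known_mass["gestation_length_d"].notna())["adult_body_mass_g"].median())
```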

Relevance: 80.00%

Abstract:

A new scheme of nomenclature for the pyrochlore supergroup, approved by the CNMNC-IMA, is based on the ions at the A, B and Y sites. What has until now been referred to as the pyrochlore group should be referred to as the pyrochlore supergroup, and the subgroups should be changed to groups. Five groups are recommended, based on the atomic proportions of the B atoms Nb, Ta, Sb, Ti, and W. The recommended groups are pyrochlore, microlite, roméite, betafite, and elsmoreite, respectively. The new names are composed of two prefixes and one root name (identical to the name of the group). The first prefix refers to the dominant anion (or cation) of the dominant valence [or H2O or a vacancy] at the Y site. The second prefix refers to the dominant cation of the dominant valence [or H2O or a vacancy] at the A site. The prefix "keno-" represents "vacancy". Where the first and second prefixes are equal, only one prefix is applied. Complete descriptions are missing for the majority of the pyrochlore-supergroup species. Only seven names refer to valid species on the grounds of their complete descriptions: oxycalciopyrochlore, hydropyrochlore, hydroxykenomicrolite, oxystannomicrolite, oxystibiomicrolite, hydroxycalcioroméite, and hydrokenoelsmoreite. Fluornatromicrolite is an IMA-approved mineral, but the complete description has not yet been published. The following 20 names refer to minerals that need to be completely described in order to be approved as valid species: hydroxycalciopyrochlore, fluornatropyrochlore, fluorcalciopyrochlore, fluorstrontiopyrochlore, fluorkenopyrochlore, oxynatropyrochlore, oxyplumbopyrochlore, oxyyttropyrochlore-(Y), kenoplumbopyrochlore, fluorcalciomicrolite, oxycalciomicrolite, kenoplumbomicrolite, hydromicrolite, hydrokenomicrolite, oxycalciobetafite, oxyuranobetafite, fluornatroroméite, fluorcalcioroméite, oxycalcioroméite, and oxyplumboroméite. For these, there are only chemical or crystal-structure data; type specimens need to be defined. Potential candidates for several other species exist, but are not sufficiently well characterized to grant them any official status. Older chemical data refer to wet-chemical analyses and commonly represent a mixture of minerals; these data were not used here. All data used represent results of electron-microprobe analyses or were obtained by crystal-structure refinement. We also verified the scarcity of crystal-chemical data in the literature: crystal-structure determinations have been published for only nine pyrochlore-supergroup minerals: hydropyrochlore, hydroxykenomicrolite, hydroxycalcioroméite, hydrokenoelsmoreite, hydroxycalciopyrochlore, fluorcalciopyrochlore, kenoplumbomicrolite, oxycalciobetafite, and fluornatroroméite. The following mineral names are now discarded: alumotungstite, bariomicrolite, bariopyrochlore, bindheimite, bismutomicrolite, bismutopyrochlore, bismutostibiconite, calciobetafite, ceriopyrochlore-(Ce), cesstibtantite, ferritungstite, jixianite, kalipyrochlore, monimolite, natrobistantite, partzite, plumbobetafite, plumbomicrolite, plumbopyrochlore, stannomicrolite, stetefeldtite, stibiconite, stibiobetafite, stibiomicrolite, strontiopyrochlore, uranmicrolite, uranpyrochlore, yttrobetafite-(Y), and yttropyrochlore-(Y).
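The naming rule summarised above (root name from the dominant B-site cation, first prefix from the dominant Y-site constituent, second prefix from the dominant A-site constituent, with a single prefix where the two coincide) can be sketched as a small function. The prefix table and example site occupancies below are simplified for illustration and ignore the valence-dominance subtleties of the full scheme.

```python
# Root name from the dominant B-site cation, per the five recommended groups.
GROUP_BY_B = {"Nb": "pyrochlore", "Ta": "microlite", "Sb": "roméite",
              "Ti": "betafite", "W": "elsmoreite"}

def dominant(site_occupancy):
    """Return the dominant constituent of a site, given {constituent: apfu}."""
    return max(site_occupancy, key=site_occupancy.get)

def prefix_for(constituent):
    """Map a dominant site constituent to a name prefix (simplified table)."""
    table = {"O": "oxy", "OH": "hydroxy", "F": "fluor", "H2O": "hydro",
             "vacancy": "keno", "Ca": "calcio", "Na": "natro", "Pb": "plumbo",
             "Sn": "stanno", "Sb": "stibio", "U": "urano", "Sr": "strontio"}
    return table[constituent]

def supergroup_name(a_site, b_site, y_site):
    root = GROUP_BY_B[dominant(b_site)]
    first, second = prefix_for(dominant(y_site)), prefix_for(dominant(a_site))
    # Where the first and second prefixes are equal, only one is applied.
    prefixes = first if first == second else first + second
    return prefixes + root

# Example: O dominant at Y, Ca dominant at A, Nb dominant at B -> oxycalciopyrochlore.
print(supergroup_name({"Ca": 1.6, "Na": 0.3}, {"Nb": 1.7, "Ti": 0.3}, {"O": 0.8, "F": 0.2}))
```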

Relevance: 80.00%

Abstract:

Urban sustainability expresses how well a city is conserved while its inhabitants live in it and consume its urban resources, but measuring urban sustainability depends on which indicators of conservation are considered important and on the consumption levels permitted under the adopted criteria. Such criteria should include common factors shared by all the cities being evaluated, in this case Abu Dhabi, as well as specific factors related to the geographic place, community and culture; here we consider measures of urban sustainability specific to a Middle Eastern climate, community and culture, and the role that GIS vector and raster analysis can play in measuring or grading urban sustainability. Scenarios were tested using various GIS data types to replicate the urban history of Abu Dhabi City over a ten-year period, its current status, and its expected future, with factors set according to climate, community needs and culture. For each scenario, the relevant vector or raster GIS datasets were selected and analysed in terms of how, and how much, they can contribute to urban sustainability ranking in quantitative and qualitative tests. This included assessing the suitable data nature, type and format, the topology rules to be considered, the useful attributes to be added, and the relationships to be maintained between data types in a geodatabase, specifying how each is used in a given scenario test, and then assigning weights to every data type that represents elements of a phenomenon related to an urban sustainability factor. Assessing the role of GIS analysis in this way yielded data collection specifications, such as the measures of accuracy required for a given type of GIS functional analysis used in an urban sustainability ranking scenario test.

This paper reflects the preliminary results of research conducted to test a multidisciplinary evaluation of urban sustainability using different indicator metrics, with vector and raster GIS analysis as the basic tools to support the evaluation and increase its reliability. A hypothetical implementation of the chosen evaluation model, represented by various scenarios, was then applied to the planned urban sustainability factors over a certain period of time in order to appraise the expected future grade of urban sustainability and to offer scenario-based advice for filling gaps and assuring relatively high future urban sustainability. The results presented here concentrate on the elements of vector and raster GIS analysis that support proper urban sustainability grading within the chosen model, the reliability of the spatial data collected, the analyses selected, and the resulting spatial information. The model is built from indicators that reflect regional culture, climate and community needs; one example used is energy demand and consumption (cooling systems).

This factor is climate-related and region-specific, as temperatures in city areas range around 30-45 degrees Celsius. GIS 3D building polygons were used to analyse building volumes, with a 'building heights' attribute used to estimate the number of floors; energy demand and consumption per unit volume were then calculated and compared, in scenarios, with possible sustainable energy supplies or with the use of different environmentally friendly cooling systems. This was followed by calculating the effects of cooling systems over an area unit of 1 sq. km, combined with the level of greenery and open space as represented by park polygons, tree polygons, empty areas, pedestrian polygons and road surface polygons. Initial measures showed that cooling-system consumption can be reduced by around 15-20% with well-planned building distribution, proper spacing, and the use of environmentally friendly products and building materials; temperature levels, interpreted from the thermal bands of satellite images acquired three times during the assessment period, were also incorporated into the scenario. Other examples of assessing the contribution of GIS analysis to urban sustainability included waste productivity and some effects of greenhouse gases, measured by the density of road polygons and their closeness to dwelling and industrial areas, as defined from land-use/land-cover thematic maps produced from classified satellite images and then converted to vectors for use in the scenarios. City noise and light intensity were also investigated, as the region is experiencing rapid development and noise is magnified by construction activity and the closeness of airports and highways; the assessment examined the measures taken by urban planners to reduce or properly manage this degradation. Finally, as a conclusion, tables are presented that combine the scenario results with the GIS data types, the analysis types, and the level of GIS data reliability needed to measure the sustainability of a city in relation to cultural and regional demands.
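A minimal sketch of the building-volume and cooling-demand step described above; the constants and function names are illustrative assumptions rather than the study's calibrated values.

```python
FLOOR_HEIGHT_M = 3.5          # assumed average floor-to-floor height
COOLING_KWH_PER_M3 = 30.0     # assumed annual cooling demand per m3 of built volume

def estimate_floors(building_height_m):
    """Estimate floor count from the 'building heights' attribute of a 3D polygon."""
    return max(1, round(building_height_m / FLOOR_HEIGHT_M))

def cooling_demand_kwh(footprint_m2, building_height_m):
    """Annual cooling demand of one building, from its volume."""
    return footprint_m2 * building_height_m * COOLING_KWH_PER_M3

def cell_demand_kwh(buildings, reduction=0.15):
    """Aggregate demand over a 1 sq. km cell; `reduction` stands in for the
    15-20% saving reported above for well-planned layouts and materials."""
    total = sum(cooling_demand_kwh(fp, h) for fp, h in buildings)
    return total * (1.0 - reduction)

# Example: two buildings given as (footprint m2, height m) pairs.
buildings = [(1200.0, 45.0), (800.0, 28.0)]
print(estimate_floors(45.0), cell_demand_kwh(buildings))
```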