872 results for Space and time.


Relevance:

100.00%

Publisher:

Abstract:

Coastal lagoons represent habitats with widely heterogeneous environmental conditions, particularly as regards salinity and temperature, which fluctuate in both space and time. These characteristics suggest that physical and ecological factors could contribute to the genetic divergence among populations occurring in coastal lagoon and open-coast environments. This study investigates the genetic structure of Holothuria polii at a micro-geographic scale across the Mar Menor coastal lagoon and nearby marine areas, estimating the mitochondrial DNA variation in two gene fragments, cytochrome oxidase I (COI) and 16S rRNA (16S). The dataset of mitochondrial sequences was also used to test the influence of environmental differences between coastal lagoon and marine waters on population genetic structure. All sampled locations exhibited high levels of haplotype diversity and low values of nucleotide diversity. The two genes showed contrasting signals of genetic differentiation (non-significant differences using COI and slight differences using 16S), which could be due to different mutation rates or to differences in the number of exclusive haplotypes. We detected an excess of recent mutations and exclusive haplotypes, which can be generated as a result of population growth. However, selective processes may also be acting on the gene markers used; highly significant generalized additive models were obtained relating genetic data from the 16S gene to independent variables such as temperature and salinity.
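
The final sentence refers to generalized additive models (GAMs) linking a genetic summary statistic to environmental predictors. The abstract does not give the response variable, data or software used, so the following Python sketch is only an illustration of that kind of model, with hypothetical per-site values and the pygam package assumed to be available.

```python
import numpy as np
from pygam import LinearGAM, s  # assumes the pygam package is installed

# Hypothetical per-site summaries (not the study's data): mean temperature (C),
# mean salinity (psu), and a 16S genetic diversity value at each sampled location.
temperature = np.array([18.5, 19.2, 20.8, 22.1, 23.4, 24.0])
salinity = np.array([37.0, 37.5, 40.2, 42.8, 44.5, 45.1])
hap_div_16s = np.array([0.62, 0.65, 0.71, 0.78, 0.83, 0.86])

X = np.column_stack([temperature, salinity])

# GAM with one smooth term per environmental predictor (few knots, tiny sample).
gam = LinearGAM(s(0, n_splines=4) + s(1, n_splines=4)).fit(X, hap_div_16s)
gam.summary()  # significance of each smooth term
```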

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, densities of newly built roadways are checked by direct sampling (cores) or by nuclear density gauge measurements. For roadway engineers, the density of asphalt pavement surfaces is essential to determining pavement quality. Unfortunately, field measurements of density by direct sampling or by nuclear measurement are slow processes. Therefore, I have explored the use of rapidly deployed ground penetrating radar (GPR) as an alternative means of determining pavement quality. The dielectric constant of the pavement surface may be a substitute parameter that correlates with pavement density, and can be used as a proxy when the density of the asphalt is not known from nuclear or destructive methods. The dielectric constant of the asphalt can be determined using GPR. In order to use GPR for evaluation of road surface quality, the relationship between the dielectric constants of asphalt and their densities must be established. Field measurements of GPR were taken at four highway sites in Houghton and Keweenaw Counties, Michigan, where density values were also obtained using nuclear methods in the field. Laboratory studies involved asphalt samples taken from the field sites and samples created in the laboratory. These were tested in various ways, including density, thickness, and time domain reflectometry (TDR). In the field, GPR data were acquired using a 1000 MHz air-launched unit and a ground-coupled unit at 200 and 500 MHz. The equipment was owned and operated by the Michigan Department of Transportation (MDOT) and available for this study for a total of four days during summer 2005 and spring 2006. The analysis of the reflected waveforms included "routine" processing for velocity using commercial software and direct evaluation of reflection coefficients to determine a dielectric constant. The dielectric constants computed from velocities do not agree well with those obtained from reflection coefficients. Perhaps due to the limited range of asphalt types studied, no correlation between density and dielectric constant was evident. Laboratory measurements were taken with samples removed from the field and samples created for this study. Samples from the field were studied using TDR, in order to obtain the dielectric constant directly, and these correlated well with the estimates made from reflection coefficients. Samples created in the laboratory were measured using 1000 MHz air-launched GPR and 400 MHz ground-coupled GPR, each under both wet and dry conditions. On the basis of these observations, I conclude that the dielectric constant of asphalt can be reliably measured from waveform amplitude analysis of GPR data, based on the consistent agreement with that obtained in the laboratory using TDR. Because of the uniformity of the asphalts studied here, any correlation between dielectric constant and density is not yet apparent.
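
The two estimation routes mentioned in the abstract, velocity analysis and reflection-coefficient analysis, can be illustrated with the standard air-launched GPR relations. The abstract does not give the exact processing steps or calibration values used in the study, so the function names, the metal-plate calibration and the numeric inputs below are assumptions for illustration only.

```python
import numpy as np

C = 0.2998  # speed of light, m/ns

def eps_from_velocity(thickness_m: float, two_way_time_ns: float) -> float:
    """Dielectric constant from layer velocity: eps = (c / v)^2,
    with v estimated from a known layer thickness and two-way travel time."""
    v = 2.0 * thickness_m / two_way_time_ns
    return (C / v) ** 2

def eps_from_amplitude_ratio(a_surface: float, a_plate: float) -> float:
    """Dielectric constant from the surface reflection coefficient, using a
    metal-plate shot as a perfect-reflector calibration:
    eps = ((1 + |A_s/A_p|) / (1 - |A_s/A_p|))^2."""
    r = abs(a_surface / a_plate)
    return ((1.0 + r) / (1.0 - r)) ** 2

# Hypothetical values, not results from the study; both give roughly 5.
print(eps_from_velocity(thickness_m=0.05, two_way_time_ns=0.75))
print(eps_from_amplitude_ratio(a_surface=0.38, a_plate=1.0))
```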

Relevance:

100.00%

Publisher:

Abstract:

The work of knowledge organization requires a particular set of tools. For instance, we need standards of content description like the Anglo-American Cataloging Rules, 2nd edition, Resource Description and Access (RDA), Cataloging Cultural Objects, and Describing Archives: A Content Standard. When we intellectualize the process of knowledge organization, that is, when we do basic theoretical research in knowledge organization, we need another set of tools. For this latter exercise we need constructs. Constructs are ideas with many conceptual elements, largely considered subjective. They allow us to be inventive, and they allow us to see a particular point of view in knowledge organization. For example, Patrick Wilson's ideas of exploitative control and descriptive control, or S. R. Ranganathan's fundamental categories, are constructs. They allow us to identify functional requirements or operationalizations of functional requirements, or at least come close to them, for our systems and schemes. They also allow us to carry out meaningful evaluation. What is even more interesting, from a research point of view, is that constructs, once offered to the community, can be contested and reinterpreted, and this has an effect on how we view knowledge organization systems and processes. Fundamental categories are again a good example, in that some members of the Classification Research Group (CRG) argued against Ranganathan's point of view. The CRG posited more fundamental categories than Ranganathan's five: Personality, Matter, Energy, Space, and Time (Ranganathan, 1967). The CRG needed significantly more fundamental categories for their work. And these are just two voices in this space; we can also consider the fundamental categories of Johannes Kaiser (1911), Shera and Egan, Barbara Kyle (Vickery, 1960), and Eric de Grolier (1962). We can also reference contemporary work that continues the comparison and analysis of fundamental categories (e.g., Dousa, 2011). In all these cases we are discussing a construct. The fundamental category is not discovered; it is constructed by a classificationist. This is done because it is useful in engaging in the act of classification. And while we are accustomed to using constructs or debating their merit in one knowledge organization activity or another, we have not analyzed their structure, nor have we created a typology. In an effort to probe the epistemological dimension of knowledge organization, we think it would be a fruitful exercise to do this. This is because we might benefit from clarity around not only our terminology, but the manner in which we talk about our terminology. We are all creative workers examining what is available to us, but doing so through particular lenses (constructs), identifying particular constructs. Knowing these constructs, and being able to refer to them, is what we would consider a core competency for knowledge organization researchers.

Relevance:

100.00%

Publisher:

Abstract:

Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components (those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver) can be reassembled to generate an authentic copy of a record, or reformulated per a user's request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are being created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of requirements generated by InterPARES 1. Because of the need to communicate the work of InterPARES in a meaningful way not only across other disciplines but also across different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records, however, but a range of digital information objects (referred to as "entities" by InterPARES 2, but not to be confused with the term "entities" as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or "potential records." To be determined to be records, the entities had to meet the criteria outlined by archival theory: they had to have a fixed documentary format and stable content. It was not sufficient that they be considered or treated as records by the creator. "Potential records" is a new construct indicating that a digital system has the potential to create records upon demand, but does not actually fix and set aside records in the normal course of business. The work of the Description Cross-Domain Group, therefore, addresses the metadata needs of all three categories of entities. Finally, since "metadata" as a term is used today so ubiquitously, and in so many different ways by different communities, that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. It also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this research was to determine the correlation of floor space and cage sanitation with the occurrence of mastitis in dairy cattle. Sixty dairy cows infected with mastitis in Banyumas regency were used in the research. A survey method was applied and linear multiple regression was used for the analysis. The results showed that there were correlations between floor space and cage sanitation and mastitis, following the regression line Yi = 15.355 + 1.087 X1 - 0.249 X2 (Animal Production 2(1): 9-12). Key words: mastitis, floor space, sanitation.
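
The reported equation can be read as a predictive model with floor space (X1) and cage sanitation (X2) as regressors. Since the abstract does not define the units or scoring scales for Y, X1 and X2, the Python sketch below simply encodes the published coefficients and evaluates them on made-up inputs.

```python
# Reported model (from the abstract): Y = 15.355 + 1.087*X1 - 0.249*X2
# Y = mastitis measure, X1 = floor space, X2 = cage sanitation score.
# Units and scales are not stated in the abstract, so the inputs are hypothetical.

def predicted_mastitis(floor_space: float, sanitation: float) -> float:
    return 15.355 + 1.087 * floor_space - 0.249 * sanitation

if __name__ == "__main__":
    # Two hypothetical pens; only the relative effect of the inputs matters here.
    print(predicted_mastitis(4.0, 10.0))  # 15.355 + 4.348 - 2.49  = 17.213
    print(predicted_mastitis(6.0, 30.0))  # 15.355 + 6.522 - 7.47  = 14.407
```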

Relevance:

100.00%

Publisher:

Abstract:

The building life cycle process is complex and prone to fragmentation as it moves through its various stages. The number of participants, and the diversity, specialisation and isolation in both space and time of their activities, have increased dramatically over time. The data generated within the construction industry have become increasingly overwhelming. Most currently available computer tools for the building industry have offered productivity improvements in the transmission of graphical drawings and textual specifications, without addressing more fundamental changes in building life cycle management. Facility managers and building owners are primarily concerned with highlighting areas of existing or potential maintenance problems so as to improve building performance, satisfy occupants and minimise turnover and, especially, the operational cost of maintenance. In doing so, they collect large amounts of data that are stored in the building's maintenance database. The work described in this paper is targeted at adding value to the design and maintenance of buildings by turning maintenance data into information and knowledge. Data mining technology presents an opportunity to increase significantly the rate at which the volumes of data generated through the maintenance process can be turned into useful information. This can be done using classification algorithms to discover patterns and correlations within a large volume of data. This paper presents how, and which, data mining techniques can be applied to the maintenance data of buildings to identify impediments to the better performance of building assets, and demonstrates what sorts of knowledge can be found in maintenance records. The benefits to the construction industry lie in turning passive data in databases into knowledge that can improve the efficiency of the maintenance process and of future designs that incorporate that maintenance knowledge.
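
As an illustration of the classification approach the abstract describes, the sketch below trains a decision tree on a toy table of maintenance work orders. The field names, categories and labels are invented for the example; they are not taken from the paper's dataset.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy maintenance records (hypothetical fields and values).
records = pd.DataFrame({
    "asset_type":   ["HVAC", "HVAC", "lift", "plumbing", "lift", "plumbing"],
    "asset_age_yr": [12, 3, 20, 8, 15, 2],
    "failures_yr":  [4, 1, 6, 2, 5, 0],
    "high_cost":    [1, 0, 1, 0, 1, 0],   # label: costly maintenance item?
})

X = pd.get_dummies(records[["asset_type", "asset_age_yr", "failures_yr"]])
y = records["high_cost"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # human-readable rules
```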

Relevance:

100.00%

Publisher:

Abstract:

Belonging to and identifying with a nation has, since the latter half of the 18th century, been a distinctly human quality. To be human is to be part of a nation. Yet contemporary theorists such as Appadurai and Fukuyama argue that this universal human trait is undergoing vast change, threatened, it seems, by irrelevance and obsolescence, by a return to tribalism, and by the widened conceptual horizons represented by the likes of transnationalism and cosmopolitanism. These same threats are often attributed to the changing ideas and experience of spatiality and temporality enabled by information and communication technologies such as the Internet, spurred on by the rising intensity of flows amongst and within the human population. This paper argues that in the analysis of changes to the nation (which I suggest is best considered as the nexus of the body politic, the social body and human bodies) it is the notion of lived time and lived space that is most appropriate. The notion of the lived is borrowed and extended from Henri Lefebvre, who theorises that between mentally conceived and physically perceived space lies its socially lived counterpart, which he defines as "the materialisation of social being". As such, lived space (and time) draws on both its material and mental aspects. It is the thesis of this paper that against such a background of lived time and lived space the nation becomes much more than a political concept and/or project and is revealed as a lived phenomenon, experienced in and through the dynamics of everyday praxis. Inherent to this argument is the understanding that it is the interplay between the possibilities imagined of the nation and its eventual realisation through social acts and practices that marks it as a profoundly human institution.

Relevance:

100.00%

Publisher:

Abstract:

Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence / absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales. The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
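
The core quantity named at the start of the abstract, the probability that an area is infested given surveillance results, can be illustrated with a deliberately simplified Bayesian update for repeated negative surveys with imperfect detection. The prior and sensitivity values below are hypothetical, and this static calculation ignores the dynamic invasion process and parameter uncertainty that the hierarchical models in the thesis account for.

```python
def prob_infested_given_negatives(prior: float, sensitivity: float, n_negative: int) -> float:
    """Posterior probability an area is infested after n_negative surveys that
    would each detect an established pest with probability `sensitivity`:
    P(I | all negative) = p*(1-s)^n / (p*(1-s)^n + (1-p))."""
    miss = (1.0 - sensitivity) ** n_negative
    return prior * miss / (prior * miss + (1.0 - prior))

# Hypothetical values: 20% prior probability of infestation, 60% per-survey
# detection sensitivity, four negative surveys -> posterior of about 0.006.
print(prob_infested_given_negatives(prior=0.2, sensitivity=0.6, n_negative=4))
```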

Relevance:

100.00%

Publisher:

Abstract:

This paper develops a composite participation index (PI) to identify patterns of transport disadvantage in space and time. It is operationalised using data from 157 weekly activity-travel diaries collected from three case study areas in rural Northern Ireland. A review of activity space and travel behaviour research found that six dimensional indicators of activity spaces are typically used to identify transport disadvantage: the number of unique locations visited, distance travelled, area of the activity space, frequency of activity participation, types of activity participated in, and duration of participation. A combined measure was developed from six individual indices based on these dimensional indicators, taking into account the relativity of the measures for weekdays, weekends, and the week as a whole. Factor analyses were conducted to derive the weights of these indices to form the PI measure. Multivariate analysis of the different indicators/indices using general linear models identified new patterns of transport disadvantage. The research found that indicator-based measures and index-based measures complement each other; that interactions between different factors generated new patterns of transport disadvantage; and that these patterns vary in space and time. The analysis also indicates that the transport needs of different disadvantaged groups vary.
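
The weighting step can be sketched as follows: standardise the six indices, extract a single factor, and use the normalised loadings as weights for the composite PI. The indicator values and the one-factor choice below are assumptions for illustration; the abstract does not give the paper's exact factor-analysis settings.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical respondent-by-indicator matrix: the six columns stand in for the
# activity-space indices (locations, distance, area, frequency, types, duration).
indices = rng.gamma(shape=2.0, scale=1.0, size=(157, 6))

Z = StandardScaler().fit_transform(indices)            # standardise each index
fa = FactorAnalysis(n_components=1, random_state=0).fit(Z)
loadings = np.abs(fa.components_[0])                   # loading of each index
weights = loadings / loadings.sum()                    # normalise to sum to 1

participation_index = Z @ weights                      # composite PI per person
print(weights.round(3), participation_index[:5].round(2))
```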

Relevance:

100.00%

Publisher:

Abstract:

Two major difficulties facing widespread clinical implementation of existing Tissue Engineering (TE) strategies for the treatment of musculoskeletal disorders are (1) the cost, space and time required for ex vivo culture of a patient’s autologous cells prior to re-implantation as part of a TE construct, and (2) the potential risks and availability constraints associated with transplanting exogenous (foreign) cells. These hurdles have led to recent interest in endogenous TE strategies, in which the regenerative potential of a patient’s own cells is harnessed to promote tissue regrowth without ex vivo cell culture. This article provides a focused perspective on key issues in the development of endogenous TE strategies, progress to date, and suggested future research directions toward endogenous repair and regeneration of musculoskeletal tissues and organs.

Relevance:

100.00%

Publisher:

Abstract:

Modern technology now has the ability to generate large datasets over space and time. Such data typically exhibit high autocorrelation over all dimensions. The field trial data motivating the methods of this paper were collected to examine the behaviour of traditional cropping and to determine a cropping system which could maximise water use for grain production while minimising leakage below the crop root zone. They consist of moisture measurements made at 15 depths across 3 rows and 18 columns, in the lattice framework of an agricultural field. Bayesian conditional autoregressive (CAR) models are used to account for local site correlations. Conditional autoregressive models have not been widely used in analyses of agricultural data. This paper serves to illustrate the usefulness of these models in this field, along with the ease of implementation in WinBUGS, a freely available software package. The innovation is the fitting of separate conditional autoregressive models for each depth layer, the 'layered CAR model', while simultaneously estimating depth profile functions for each site treatment. Modelling interest also lay in how best to model the treatment effect depth profiles, and in the choice of neighbourhood structure for the spatial autocorrelation model. The favoured model fitted the treatment effects as splines over depth, treated depth, the basis for the regression model, as measured with error, and fitted CAR neighbourhood models by depth layer. It is hierarchical, with separate conditional autoregressive spatial variance components at each depth, and the fixed terms, which involve an errors-in-measurement model, treat depth errors as interval-censored measurement error. The Bayesian framework permits transparent specification and easy comparison of the various complex models considered.
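
A CAR prior for a single depth layer of the 3 x 18 lattice can be sketched in a few lines. The proper-CAR parameterisation Q = tau * (D - rho * W) used below is a common choice and is assumed here only for illustration; the abstract does not state which CAR variant or neighbourhood weights the WinBUGS models used.

```python
import numpy as np

def lattice_adjacency(nrow: int, ncol: int) -> np.ndarray:
    """0/1 rook-neighbour adjacency matrix W for an nrow x ncol lattice."""
    n = nrow * ncol
    W = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            i = r * ncol + c
            if r + 1 < nrow:          # neighbour in the next row
                W[i, i + ncol] = W[i + ncol, i] = 1.0
            if c + 1 < ncol:          # neighbour in the next column
                W[i, i + 1] = W[i + 1, i] = 1.0
    return W

def car_precision(W: np.ndarray, rho: float, tau: float) -> np.ndarray:
    """Precision matrix of a proper CAR model: Q = tau * (D - rho * W)."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

# One depth layer of the 3-row x 18-column field lattice; the layered model
# would use a separate Q (with its own variance component) at each of the 15 depths.
W = lattice_adjacency(3, 18)
Q = car_precision(W, rho=0.9, tau=1.0)
print(Q.shape)  # (54, 54)
```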

Relevance:

100.00%

Publisher:

Abstract:

Everything (2008) is a looped 3-channel digital video (extracted from a 3D computer animation) that appropriates a range of media including photography, drawing, painting, and pre-shot video. The work departs from traditional time-based video, which is generally based on a recording of an external event. Instead, “Everything” constructs an event and space more as a painting or drawing might. The work combines constructed events (including space, combinations of objects, and aesthetic relationships of forms) with pre-recorded video footage and pre-made paintings and drawings. The result is a montage of objects, images – both still and moving – and abstracted ‘painterly’ gestures. This technique creates a complex temporal displacement: 'past' refers to pre-recorded media such as painting and photography, while 'future' refers to a possible virtual space, not in the present, that these objects may occupy together. Through this simultaneity between the real and the virtual, the work comments on a disembodied sense of space and time, while also puncturing the virtual with a sense of materiality through the tactility of drawing and painting forms and processes. In so doing, the work challenges the perspectival Cartesian space synonymous with the virtual. In this work the disembodied, wandering virtual eye is met with an uncanny combination of scenes, where scale and the relationships between objects are disrupted and changed. Everything is one of the first international examples of 3D animation technology being utilised in contemporary art. The work won the inaugural $75,000 Premier of Queensland National New Media Art Award and was subsequently acquired by the Queensland Art Gallery. The work has been exhibited and reviewed nationally and internationally.

Relevance:

100.00%

Publisher:

Abstract:

Relics is a single-channel video derived from a 3D computer animation that combines a range of media including photography, drawing, painting, and pre-shot video. It is constructed around a series of pictorial stills which become interlinked by the more traditionally filmic processes of panning, zooming and crane shots. In keeping with these ideas, the work revolves around a series of static architectural forms within the strangely menacing enclosure of a geodesic dome. These clinical aspects of the work are complemented by a series of elements that evoke fluidity: fireworks, mirrored biomorphic forms and oscillating projections. The visual dimension of the work is complemented by a soundtrack of rainforest bird calls. Through its ambiguous combination of recorded and virtual imagery, Relics explores the indeterminate boundaries between real and virtual space. On the one hand, it represents actual events and spaces drawn from the artist's studio and image archive; on the other, it represents the highly idealised spaces of drawing and 3D animation. In this work the disembodied, wandering virtual eye is met with an uncanny combination of scenes, where scale and the relationships between objects are disrupted and changed. Through this simultaneity between the real and the virtual, the work conveys a disembodied sense of space and time that carries a powerful sense of affect. Relics was among the first international examples of 3D animation technology in contemporary art. It was originally exhibited in the artist's solo show, ‘Places That Don’t Exist’ (2007, George Petelin Gallery, Gold Coast), and went on to be included in the group shows ‘d/Art 07/Screen: The Post Cinema Experience’ (2007, Chauvel Cinema, Sydney), ‘Experimenta Utopia Now: International Biennial of Media Art’ (2010, Arts Centre, Melbourne and national touring venues) and ‘Move on Asia’ (2009, Alternative space Loop, Seoul and Para-site Art Space, Hong Kong), and was broadcast on Souvenirs from Earth (Video Art Cable Channel, Germany and France). The work was analysed in catalogue texts for ‘Places That Don’t Exist’ (2007), ‘d/Art 07’ (2007) and ‘Experimenta Utopia Now’ (2010) and on the ‘Souvenirs from Earth’ website.

Relevance:

100.00%

Publisher:

Abstract:

Percolation flow problems are discussed in many research fields, such as seepage hydraulics, groundwater hydraulics, groundwater dynamics and fluid dynamics in porous media. Many physical processes appear to exhibit fractional-order behavior that may vary with time, with space, or with space and time, and the theory of pseudodifferential operators and equations has been used to deal with this situation. In this paper we use a fractional Darcy's law with variable-order Riemann-Liouville fractional derivatives, which leads to a new variable-order fractional percolation equation. A new two-dimensional variable-order fractional percolation equation is considered, and a new implicit numerical method and an alternating direction method for the two-dimensional variable-order fractional model are proposed. Consistency, stability and convergence of the implicit finite difference method are established. Finally, some numerical examples are given; the numerical results demonstrate the effectiveness of the methods. This technique can also be used to simulate a three-dimensional variable-order fractional percolation equation.
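
Variable-order fractional derivatives of Riemann-Liouville type are commonly discretised with Grunwald-Letnikov weights, which is the usual starting point for implicit finite-difference schemes of this kind. The sketch below shows that weight recursion and a pointwise approximation; it is a generic illustration under that assumption, not the paper's specific scheme (whose shifts, boundary handling and ADI splitting are not given in the abstract).

```python
import numpy as np

def gl_weights(alpha: float, n: int) -> np.ndarray:
    """Grunwald-Letnikov weights g_k = (-1)^k * C(alpha, k), computed with the
    recursion g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1) / k)."""
    g = np.empty(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def frac_derivative_variable_order(u: np.ndarray, h: float, alpha_of_x: np.ndarray) -> np.ndarray:
    """Pointwise GL approximation of a variable-order fractional derivative:
    D^{alpha(x_i)} u(x_i) ~ h^(-alpha_i) * sum_{k=0..i} g_k(alpha_i) * u_{i-k}."""
    n = len(u)
    d = np.zeros(n)
    for i in range(n):
        a = alpha_of_x[i]
        g = gl_weights(a, i)
        d[i] = h ** (-a) * np.dot(g, u[i::-1])
    return d

# Hypothetical test: order varying linearly between 1.2 and 1.8 across the grid.
x = np.linspace(0.0, 1.0, 21)
u = x ** 2
alpha = 1.2 + 0.6 * x
print(frac_derivative_variable_order(u, h=x[1] - x[0], alpha_of_x=alpha)[:5])
```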

Relevance:

100.00%

Publisher:

Abstract:

Conducting research into crime and criminal justice carries unique challenges. This Handbook focuses on the application of 'methods' to address the core substantive questions that currently motivate contemporary criminological research. It maps a canon of methods that are more elaborated than in most other fields of social science, and the intellectual terrain of research problems with which criminologists are routinely confronted. Drawing on exemplary studies, chapters in each section illustrate the techniques (qualitative and quantitative) that are commonly applied in empirical studies, as well as the logic of criminological enquiry. Organized into five sections, each prefaced by an editorial introduction, the Handbook covers: • Crime and Criminals • Contextualizing Crimes in Space and Time: Networks, Communities and Culture • Perceptual Dimensions of Crime • Criminal Justice Systems: Organizations and Institutions • Preventing Crime and Improving Justice Edited by leaders in the field of criminological research, and with contributions from internationally renowned experts, The SAGE Handbook of Criminological Research Methods is set to become the definitive resource for postgraduates, researchers and academics in criminology, criminal justice, policing, law, and sociology.