867 results for database management
Abstract:
With a focus on optimising the life cycle performance of Australian railway bridges, new bridge classification and environmental classification systems are proposed. The new bridge classification system is mainly intended to facilitate the implementation of a novel Bridge Management System (BMS) that optimises life cycle cost at both project and network levels, while the environmental classification is mainly intended to improve the accuracy of the Remaining Service Potential (RSP) module of the proposed BMS. In fact, the limited capacity of existing BMSs to trigger maintenance intervention points is an indirect result of inadequacies in the existing bridge and environmental classification systems. The proposed bridge classification system permits intervention points to be identified from the percentage deterioration of individual elements and the associated maintenance cost, while allowing a performance-based rating technique to be implemented for maintenance optimisation and prioritisation. At the same time, the proposed environmental classification system will enhance the accuracy of deterioration predictions for steel components.
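As a rough illustration of the kind of element-level trigger this abstract describes, the sketch below flags a bridge element for intervention once its percentage deterioration crosses a threshold and repair remains cheaper than replacement, then ranks flagged elements by a simple deterioration-per-cost score. The class, threshold, and decision rule are assumed for illustration only and are not taken from the paper.

# Hypothetical sketch of an element-level intervention trigger, not the
# authors' actual BMS logic. Names, thresholds, and cost rules are assumed.
from dataclasses import dataclass

@dataclass
class BridgeElement:
    element_id: str
    deterioration_pct: float   # assumed 0-100 condition loss for the element
    repair_cost: float         # assumed estimated repair cost
    replacement_cost: float    # assumed cost of full replacement

def needs_intervention(el: BridgeElement, threshold_pct: float = 40.0) -> bool:
    """Flag an element when deterioration passes a threshold and repair
    is still cheaper than replacement (assumed decision rule)."""
    return el.deterioration_pct >= threshold_pct and el.repair_cost < el.replacement_cost

def prioritise(elements: list[BridgeElement]) -> list[BridgeElement]:
    """Rank flagged elements by a simple performance-based score:
    deterioration per unit repair cost (assumed rating)."""
    flagged = [e for e in elements if needs_intervention(e)]
    return sorted(flagged, key=lambda e: e.deterioration_pct / e.repair_cost, reverse=True)

if __name__ == "__main__":
    elements = [
        BridgeElement("girder-01", 55.0, 12_000.0, 80_000.0),
        BridgeElement("bearing-03", 30.0, 2_000.0, 9_000.0),
        BridgeElement("deck-02", 70.0, 40_000.0, 150_000.0),
    ]
    for e in prioritise(elements):
        print(e.element_id, round(e.deterioration_pct / e.repair_cost, 4))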
Abstract:
Hydrogeophysics is a growing discipline that holds significant promise to help elucidate details of dynamic processes in the near surface, built on the ability of geophysical methods to measure properties from which hydrological and geochemical variables can be derived. For example, bulk electrical conductivity is governed by, among other factors, interstitial water content, fluid salinity, and temperature, and can be measured using a range of geophysical methods. In many cases, electrical resistivity tomography (ERT) is well suited to characterize these properties in multiple dimensions and to monitor dynamic processes, such as water infiltration and solute transport. In recent years, ERT has been used increasingly for ecosystem research in a wide range of settings, in particular to characterize vegetation-driven changes in root-zone and near-surface water dynamics. This increased popularity is due to operational factors (e.g., improved equipment, low site impact), data considerations (e.g., excellent repeatability), and the fact that ERT operates at scales significantly larger than traditional point sensors. Current limitations to a more widespread use of the approach include the high equipment costs and the need for site-specific petrophysical relationships between properties of interest. In this presentation we will discuss recent equipment advances and theoretical and methodological aspects involved in the accurate estimation of soil moisture from ERT results. Examples will be presented from two studies in a temperate climate (Michigan, USA) and one from a humid tropical location (Tapajos, Brazil).
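To make the petrophysical step concrete, the sketch below converts ERT-derived bulk electrical conductivity to volumetric water content using Archie's law, one common choice of petrophysical relationship. As the abstract stresses, such relationships are site-specific; the porosity, cementation exponent (m), and saturation exponent (n) used here are assumed example values, not calibrated ones from these studies.

# Illustrative conversion of ERT bulk conductivity to volumetric water
# content via Archie's law; parameters are assumed, not site-calibrated.
import numpy as np

def water_content_from_conductivity(sigma_bulk, sigma_water, porosity, m=1.5, n=2.0):
    """Invert Archie's law: sigma_bulk = sigma_water * porosity**m * S_w**n,
    then return volumetric water content theta = porosity * S_w."""
    saturation = (np.asarray(sigma_bulk) / (sigma_water * porosity**m)) ** (1.0 / n)
    saturation = np.clip(saturation, 0.0, 1.0)   # keep physically plausible values
    return porosity * saturation

# Example: bulk conductivities (S/m) recovered from an ERT inversion
sigma_bulk = np.array([0.005, 0.010, 0.020])
theta = water_content_from_conductivity(sigma_bulk, sigma_water=0.05, porosity=0.35)
print(theta)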
Abstract:
This paper addresses research from a three-year longitudinal study that engaged children in data modeling experiences from their first school year through to their third year (ages 6-8). A data modeling approach to statistical development differs in several ways from what is typically done in early classroom experiences with data. In particular, data modeling immerses children in problems that evolve from their own questions and reasoning, with core statistical foundations established early. These foundations include a focus on posing and refining statistical questions within and across contexts, structuring and representing data, making informal inferences, and developing conceptual, representational, and metarepresentational competence. Examples are presented of how young learners developed and sustained informal inferential reasoning and metarepresentational competence across the study to become “sophisticated statisticians”.
Abstract:
The study of data modelling with elementary students involves the analysis of a developmental process beginning with children’s investigations of meaningful contexts: visualising, structuring, and representing data and displaying data in simple graphs (English, 2012; Lehrer & Schauble, 2005; Makar, Bakker, & Ben-Zvi, 2011). A 3-year longitudinal study investigated young children’s data modelling, integrating mathematical and scientific investigations. One aspect of this study involved a researcher-led teaching experiment with 21 mathematically able Grade 1 students. The study aimed to describe explicit developmental features of students’ representations of continuous data...
Abstract:
Protein adsorption at solid-liquid interfaces is critical to many applications, including biomaterials, protein microarrays and lab-on-a-chip devices. Despite this general interest, and a large amount of research over the last half century, protein adsorption cannot be predicted with engineering-level, design-oriented accuracy. Here we describe a Biomolecular Adsorption Database (BAD), freely available online, which archives the published protein adsorption data. Piecewise linear regression with a breakpoint, applied to the data in the BAD, suggests that the input variables to protein adsorption, i.e., protein concentration in solution; protein descriptors derived from primary structure (number of residues, global protein hydrophobicity and range of amino acid hydrophobicity, isoelectric point); surface descriptors (contact angle); and fluid environment descriptors (pH, ionic strength), correlate well with the output variable: the protein concentration on the surface. Furthermore, neural network analysis revealed that the size of the BAD makes it sufficiently representative, with a neural network-based predictive error of 5% or less. Interestingly, a consistently better fit is obtained if the BAD is divided into two separate subsets representing protein adsorption on hydrophilic and hydrophobic surfaces, respectively. Based on these findings, selected entries from the BAD have been used to construct neural network-based estimation routines, which predict the amount of adsorbed protein, the thickness of the adsorbed layer and the surface tension of the protein-covered surface. While the BAD is of general interest, the predictions of the thickness and surface tension of the protein-covered layers are of particular relevance to the design of microfluidic devices.
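For readers unfamiliar with the regression technique mentioned above, the sketch below fits a single-breakpoint piecewise linear model by nonlinear least squares. The data are synthetic; in the BAD analysis the inputs would be protein, surface, and fluid descriptors and the response would be the adsorbed protein concentration. Function names and starting values are illustrative assumptions, not the authors' code.

# Minimal sketch of piecewise linear regression with one breakpoint,
# fitted to synthetic data for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def piecewise_linear(x, x0, y0, k1, k2):
    """Two line segments joined continuously at the breakpoint x0."""
    return np.where(x < x0, y0 + k1 * (x - x0), y0 + k2 * (x - x0))

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 80)
y = piecewise_linear(x, 4.0, 2.0, 0.8, 0.1) + rng.normal(scale=0.1, size=x.size)

# Fit breakpoint and slopes; p0 is a rough initial guess.
params, _ = curve_fit(piecewise_linear, x, y, p0=[5.0, 1.0, 1.0, 0.0])
print("breakpoint, value at break, slope1, slope2:", np.round(params, 3))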
Abstract:
Critical to the research of urban morphologists is the availability of historical records that document the urban transformation of the study area. However, thus far little work has been done towards an empirical approach to the validation of archival data in this field. Outlined in this paper, therefore, is a new methodology for validating the accuracy of archival records and mapping data, accrued through the process of urban morphological research, so as to establish a reliable platform from which analysis can proceed. The paper particularly addresses the problems of inaccuracies in existing curated historical information, as well as errors in archival research by student assistants, which together give rise to unacceptable levels of uncertainty in the documentation. The paper discusses the problems relating to the reliability of historical information, demonstrates the importance of data verification in urban morphological research, and proposes a rigorous method for objective testing of collected archival data through the use of qualitative data analysis software.