970 results for Accessibility metrics
Abstract:
This was presented during the 2nd annual Library Research and Innovation Practices at the University of Maryland Libraries, McKeldin Library, on June 8, 2016.
Abstract:
This presentation was one of four during a Mid-Atlantic Regional Archives Conference presentation on April 15, 2016. Digitization of collections can help to improve internal workflows, make materials more accessible, and create new and engaging relationships with users. Laurie Gemmill Arp will discuss the LYRASIS Digitization Collaborative, created to assist institutions with their digitization needs, and how it has worked to help institutions increase connections with users. Robin Pike from the University of Maryland will discuss how they factor requests for access into selection for digitization and how they track the use of digitized materials. Laura Drake Davis of James Madison University will discuss the establishment of a formal digitization program, its impact on users, and the resulting increased use of their collections. Linda Tompkins-Baldwin will discuss Digital Maryland’s partnership with the Digital Public Library of America to provide access to archives held by institutions without a digitization program.
Abstract:
Even though the use of recommender systems is already widespread in several application areas, there is still a lack of studies in the accessibility research field. One attempt to apply the benefits of recommender systems to accessibility needs is Vulcanus. The Vulcanus recommender system uses similarity analysis to compare users' trails, making it possible to take advantage of a user's past behavior and deliver personalized content and services. Vulcanus combines concepts from ubiquitous computing, such as user profiles, context awareness, trails management, and similarity analysis. It uses two different approaches for trail similarity analysis: resource patterns and category patterns. In this work we performed an asymptotic analysis, identifying the complexity of Vulcanus' algorithm. Furthermore, we propose improvements achieved through a dynamic programming technique: by adopting a bottom-up approach, many unnecessary comparisons can be skipped, and Vulcanus 2.0 is presented with improvements in its average-case scenario.
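The bottom-up dynamic programming idea the abstract describes can be sketched as follows. The abstract does not give Vulcanus' exact similarity formulation, so this uses a longest-common-subsequence recurrence over trails (modeled as sequences of visited resource IDs) purely as an illustrative assumption:

```python
# Hypothetical sketch: bottom-up dynamic-programming similarity between two
# user trails, modeled as sequences of visited resource IDs. The LCS
# recurrence is an illustrative assumption, not Vulcanus' actual algorithm.
def trail_similarity(trail_a, trail_b):
    """Return a [0, 1] similarity score based on the longest common
    subsequence of two trails, filled bottom-up so that each subproblem
    is computed exactly once (no repeated comparisons)."""
    m, n = len(trail_a), len(trail_b)
    if m == 0 or n == 0:
        return 0.0
    # dp[i][j] = LCS length of trail_a[:i] and trail_b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if trail_a[i - 1] == trail_b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n] / max(m, n)
```

Because the table is filled bottom-up, the ordinary O(2^n) recursive exploration collapses to O(mn) table entries, which is the kind of average-case saving the abstract refers to.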
Abstract:
Geographically isolated wetlands, those entirely surrounded by uplands, provide numerous ecological functions, some of which are dependent on the degree to which they are hydrologically connected to nearby waters. There is a growing need for field-validated, landscape-scale approaches for classifying wetlands based on their expected degree of connectivity with stream networks. During the 2015 water year, flow duration was recorded in non-perennial streams (n = 23) connecting forested wetlands and nearby perennial streams on the Delmarva Peninsula (Maryland, USA). Field and GIS-derived landscape metrics (indicators of catchment, wetland, non-perennial stream, and soil characteristics) were assessed as predictors of wetland-stream connectivity (duration, seasonal onset and offset dates). Connection duration was most strongly correlated with non-perennial stream geomorphology and wetland characteristics. A final GIS-based stepwise regression model (adj-R2 = 0.74, p < 0.0001) described wetland-stream connection duration as a function of catchment area, wetland area and number, and soil available water storage.
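A regression of the form the abstract describes (connection duration as a function of catchment area, wetland area and number, and soil available water storage) can be sketched with ordinary least squares. The data below are synthetic; the study's coefficients and its adj-R2 = 0.74 come from its own field and GIS measurements:

```python
import numpy as np

# Illustrative OLS sketch only; predictors, coefficients, and noise level
# are made up for demonstration, not taken from the study.
rng = np.random.default_rng(0)
n = 23  # matches the number of non-perennial streams sampled in the study
X = rng.uniform(size=(n, 4))              # four standardized landscape predictors
true_beta = np.array([40.0, 25.0, -10.0, 15.0])
y = X @ true_beta + rng.normal(scale=2.0, size=n)  # connection duration, days

# Add an intercept column and solve the least-squares problem
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination (R^2)
resid = y - A @ beta
r2 = 1 - resid.var() / y.var()
```

In the actual study the predictor set was chosen by stepwise selection; here the four predictors are simply fixed in advance.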
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively, these imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and mitigation; they can be deconstructed into their atomic units and responsibility for their resolution delegated appropriately. We continue by describing how the manifestation of risks typically spans an entire organisational environment, and how risk, as the focus of our analysis, safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements; doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conducted our analyses through the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and by exposing the resource and associated applications to evaluation by the digital preservation community.
Abstract:
The interaction of bovine serum albumin (BSA) with the ionic surfactants sodium dodecyl sulfate (SDS, anionic), cetyltrimethylammonium chloride (CTAC, cationic) and N-hexadecyl-N,N-dimethyl-3-ammonio-1-propanesulfonate (HPS, zwitterionic) was studied by electron paramagnetic resonance (EPR) spectroscopy of a spin label covalently bound to the single free thiol group of the protein. Simulation of the EPR spectra makes it possible to monitor the protein dynamics at the labeling site and to estimate the changes in standard Gibbs free energy, enthalpy and entropy for transferring the nitroxide side chain from the more motionally restricted to the less restricted component. Whereas SDS and CTAC showed similar increases in the dynamics of the protein backbone at all measured concentrations, HPS presented a smaller effect at concentrations above 1.5 mM. At 10 mM surfactant and 0.15 mM BSA, the standard Gibbs free energy change was consistent with protein backbone conformations more expanded and exposed to the solvent than in the native protein, but with a less pronounced effect for HPS. In the presence of the surfactants, the enthalpy change, related to the energy required to dissociate the nitroxide side chain from the protein, was greater, suggesting a lower water activity. The nitroxide side chain also detected a higher-viscosity environment in the vicinity of the paramagnetic probe induced by the addition of the surfactants. The results suggest that the surfactant-BSA interaction at higher surfactant concentrations is affected by the affinities of the surfactants for their own micelles and micelle-like aggregates. Complementary DLS data suggest that the temperature-induced changes monitored by the nitroxide probe reflect local changes in the vicinity of the single thiol group of the Cys-34 BSA residue.
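The Gibbs free energy estimate the abstract mentions follows from standard thermodynamics: for a two-component spectrum, the population ratio of the two components acts as an equilibrium constant K, and dG0 = -RT ln K. The population fractions below are made up for illustration only:

```python
import math

# Worked example of dG0 = -RT ln K for a two-component spin-label spectrum.
# The fractions are hypothetical, not values from the study.
R = 8.314    # gas constant, J mol^-1 K^-1
T = 298.15   # temperature, K

def delta_g0(frac_less_restricted, frac_more_restricted, temperature=T):
    """Standard Gibbs free energy change (J/mol) for transferring the
    nitroxide from the more restricted to the less restricted component."""
    K = frac_less_restricted / frac_more_restricted
    return -R * temperature * math.log(K)

# Example: 30% of the label in the less restricted component.
# A positive dG0 means the transfer is unfavourable at this temperature.
dG = delta_g0(0.30, 0.70)
```

Repeating this at several temperatures would give the enthalpy and entropy contributions via a van't Hoff analysis, which is the kind of decomposition the abstract refers to.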
Abstract:
The dependence of some species on landscape structure has been demonstrated in numerous studies. So far, however, little progress has been made in integrating landscape metrics into the prediction of species associated with coastal features. Specific landscape metrics were tested as predictors of coastal shape using three coastal features of the Iberian Peninsula (beaches, capes and gulfs) at different scales. We used the landscape metrics in combination with environmental variables to model the niche and find suitable habitats for a seagrass species (Cymodocea nodosa) throughout its entire range of distribution. Landscape metrics able to capture variation in the coastline significantly enhanced the accuracy of the models, despite the limitations imposed by the scale of the study. We provide the first global model of the factors that may be shaping the environmental niche and distribution of C. nodosa throughout its range. Sea surface temperature and salinity were the most relevant variables. We identified areas that seem unsuitable for C. nodosa as well as suitable habitats not occupied by the species. We also present preliminary results from testing historical biogeographical hypotheses derived from distribution predictions under Last Glacial Maximum conditions and genetic diversity data.
Abstract:
This study presents the results of a systematic literature review of the combined field of accessibility and Massive Open Online Courses (MOOCs), covering the period from 2008 to July 2016. This dataset updates the previous release, which covered 2008 to May 2016 (http://hdl.handle.net/10045/54846).
Abstract:
Determination of combustion metrics for a diesel engine has the potential to provide feedback for closed-loop combustion phasing control to meet current and upcoming emission and fuel consumption regulations. This thesis focused on the estimation of combustion metrics including start of combustion (SOC), crank angle location of 50% cumulative heat release (CA50), peak pressure crank angle location (PPCL), peak pressure amplitude (PPA), peak apparent heat release rate crank angle location (PACL), mean absolute pressure error (MAPE), and peak apparent heat release rate amplitude (PAA). In-cylinder pressure has been used in the laboratory as the primary mechanism for characterizing combustion rates, and more recently in-cylinder pressure has been used in series production vehicles for feedback control. However, intrusive measurement with an in-cylinder pressure sensor is expensive and requires a special mounting process and engine structure modification. As an alternative, this work investigated block-mounted accelerometers to estimate combustion metrics in a 9L I6 diesel engine. To do so, the transfer path between the accelerometer signal and the in-cylinder pressure signal must be modeled. Given the transfer path, the in-cylinder pressure signal and the combustion metrics can be accurately estimated, i.e., recovered, from accelerometer signals. The method and applicability for determining the transfer path are critical in utilizing accelerometers for feedback. The single-input single-output (SISO) frequency response function (FRF) is the most common transfer path model; however, it is shown here to have low robustness under varying engine operating conditions. This thesis examines mechanisms to improve the robustness of the FRF for combustion metrics estimation. First, an adaptation process based on the particle swarm optimization algorithm was developed and added to the single-input single-output model.
Second, a multiple-input single-output (MISO) FRF model coupled with principal component analysis and an offset compensation process was investigated and applied. Improvement in FRF robustness was achieved with both approaches. Furthermore, a neural network was investigated as a nonlinear model of the transfer path between the accelerometer signal and the apparent heat release rate. The transfer path between acoustical emissions and the in-cylinder pressure signal was also investigated in this dissertation on a high pressure common rail (HPCR) 1.9L TDI diesel engine. Acoustical emissions are an important factor in the powertrain development process. In this part of the research a transfer path was developed between the two and then used to predict the engine noise level with the measured in-cylinder pressure as the input. Three methods for transfer path modeling were applied; the method based on the cepstral smoothing technique led to the most accurate results, with an averaged estimation error of 2 dBA and a root mean square error of 1.5 dBA. Finally, a linear model for engine noise level estimation was proposed with the in-cylinder pressure signal and the engine speed as inputs.
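The SISO FRF model the thesis starts from can be sketched with the standard H1 estimator (averaged cross-spectrum over input auto-spectrum). The signal shapes and the synthetic transfer path in the usage below are assumptions for illustration, not the thesis' engine data:

```python
import numpy as np

# Hedged sketch of a SISO frequency response function between an
# accelerometer signal x and an in-cylinder pressure signal y, using the
# common H1 estimator averaged across engine cycles.
def h1_frf(x_cycles, y_cycles):
    """x_cycles, y_cycles: arrays of shape (n_cycles, n_samples).
    Returns the H1 FRF estimate, one complex value per frequency bin."""
    X = np.fft.rfft(x_cycles, axis=1)
    Y = np.fft.rfft(y_cycles, axis=1)
    Sxy = np.mean(np.conj(X) * Y, axis=0)   # averaged cross-spectrum
    Sxx = np.mean(np.conj(X) * X, axis=0)   # averaged input auto-spectrum
    return Sxy / Sxx

# Once the FRF is estimated from training cycles, pressure spectra for new
# cycles can be recovered as Y_hat = H1 * X_new and inverse-transformed.
```

Averaging over cycles suppresses uncorrelated measurement noise on the output; the thesis' point is that a single such FRF drifts with operating condition, motivating the adaptive and MISO extensions described above.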
Abstract:
As users continually request additional functionality, software systems will continue to grow in complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring high levels of reliability, faulty system modules increase development and maintenance costs. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. Research effort to predict software modules likely to contain faults has, as a consequence, been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous research studies show that statistical models can provide reasonable estimates at predicting faulty modules using software metrics. However, as context-specific metrics differ from project to project, predicting across projects is difficult. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, taking full benefit of the existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.
Abstract:
Perceived accessibility has been acknowledged as an important aspect of transport policy since the 1970s. Nevertheless, very few empirical studies have been conducted in this field. When aiming to improve social inclusion by making sustainable transport modes accessible to all, it is important to understand the factors driving perceived accessibility. Unlike conventional accessibility measures, perceived accessibility focuses on the perceived possibilities and ease of engaging in preferred activities using different transport modes. We define perceived accessibility in terms of how easy it is to live a satisfactory life with the help of the transport system, which is not necessarily the same thing as the objective standard of the system. According to previous research, perceived accessibility varies with the subjectively rated quality of the mode of transport. Thus, improvements in quality (e.g. trip planning, comfort, or safety) increase perceived accessibility and make life easier to live using the chosen mode of transport. This study (n=750) focuses on the perceived accessibility of public transport, captured using the Perceived Accessibility Scale (PAC; Lättman, Olsson, & Friman, 2015). More specifically, this study aims to determine how the level of quality affects perceived accessibility in public transport. A conditional process model shows that, in addition to quality, feeling safe and frequency of travel are important predictors of perceived accessibility. Furthermore, the elderly and those in their thirties report a lower level of perceived accessibility to their day-to-day activities using public transport. The basic premise of this study is that subjective experiences may be as important as objective indicators when planning and designing for socially inclusive transport systems.
Abstract:
In this paper we discuss the temporal aspects of indexing and classification in information systems. We base this discussion on three sources of research on scheme change in indexing: (1) analytical research on the types of scheme change, (2) empirical data on scheme change in systems, and (3) evidence of cataloguer decision-making in the context of scheme change. From this general discussion we propose two constructs along which we might craft metrics to measure scheme change: collocative integrity and semantic gravity. The paper closes with a discussion of these constructs.
Abstract:
Nowadays, the scientific community has devoted consistent effort to the sustainable development of the waste management sector and resource efficiency in building infrastructures. Waste is the fourth largest source sector of emissions, and the municipal solid waste management system is considered the most complex system to manage, due to its diverse composition and the fragmentation of producers and responsibilities. Given the deep complexity that characterizes the waste management sector, sustainability remains a challenging task, and open issues arise when dealing with the sustainability of the waste sector. This thesis presents some recent advances in the waste management sector. Specifically, through the analysis of four author publications, this thesis attempts to fill the gap in the following open issues: (i) waste collection and the generation of waste considering the pillars of sustainability; (ii) environmental and social analysis in designing building infrastructures; (iii) the role of waste collection in boosting sustainable waste management systems; and (iv) the ergonomic impacts of waste collection. For this purpose, four author publications in international peer-reviewed journals (i.e., at final publication stage) were selected from among the author's contributions.