974 results for user requirements


Relevance: 60.00%

Abstract:

Indicators which summarise the characteristics of spatiotemporal data coverages significantly simplify quality evaluation, decision making and justification processes by providing a number of quality cues that are easy to manage while avoiding information overflow. Criteria which are commonly prioritised in evaluating spatial data quality and assessing a dataset's fitness for use include lineage, completeness, logical consistency, positional accuracy, and temporal and attribute accuracy. However, user requirements may go far beyond these broadly accepted spatial quality metrics, to incorporate specific and complex factors which are less easily measured. This paper discusses the results of a study of high-level user requirements in geospatial data selection and data quality evaluation. It reports on the geospatial data quality indicators which were identified as user priorities, and which can potentially be standardised to enable intercomparison of datasets against user requirements. We briefly describe the implications for tools and standards to support the communication and intercomparison of data quality, and the ways in which these can contribute to the generation of a GEO label.
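
As an illustration only (the field names and thresholds below are assumptions, not indicators from the study), a standardised indicator record and a fitness-for-use check against user requirements might be sketched as:

    from dataclasses import dataclass

    @dataclass
    class QualityIndicators:
        # Commonly prioritised spatial data quality criteria (illustrative fields)
        completeness: float            # fraction of the coverage populated, 0..1
        positional_accuracy_m: float   # RMS positional error in metres
        temporal_accuracy_days: float  # timestamp uncertainty in days

    def fitness_for_use(ds: QualityIndicators, req: QualityIndicators) -> bool:
        """Check a dataset's indicators against a user's minimum requirements."""
        return (ds.completeness >= req.completeness
                and ds.positional_accuracy_m <= req.positional_accuracy_m
                and ds.temporal_accuracy_days <= req.temporal_accuracy_days)

    # Example: a user needing 95% completeness and sub-10 m positional accuracy
    req = QualityIndicators(0.95, 10.0, 30.0)
    ds = QualityIndicators(0.97, 5.2, 14.0)
    print(fitness_for_use(ds, req))  # True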

Relevance: 60.00%

Abstract:

As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents. Like many new technologies, however, online questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors.

Although, like all survey errors, coverage error ("the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey" [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing, and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, thereby reinforcing the advantages of online questionnaire delivery.

The second error type, non-response error, occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today's societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online questionnaires makes estimation of questionnaire length and the time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation, and indeed to provide respondents with context-sensitive assistance during the response process, thereby reducing abandonment while eliciting feelings of accomplishment [6].

For online questionnaires, sampling error ("the result of attempting to survey only some, and not all, of the units in the survey population" [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors ("the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained" [2, pg. 11]) will lead to respondents becoming confused and frustrated.

Sampling, measurement, and non-response errors are likely to occur when an online questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online questionnaire delivery will not be fully realized. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaires (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online questionnaires reduce traditional delivery costs (e.g. paper, mail-out, and data entry), set-up costs can be high, given the need either to adopt and acquire training in questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge of questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.
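
As a purely illustrative sketch of one such design feature (the question types and median timings below are hypothetical, not taken from the guidelines), completion-time estimation could be supported server-side like this:

    # Hypothetical helper: estimate time remaining in an online questionnaire
    # from median per-question completion times, so a respondent can be shown
    # progress and a time estimate alongside a progress bar.
    MEDIAN_SECONDS = {"multiple_choice": 10, "likert": 8, "free_text": 45}

    def estimate_remaining(questions_left: list[str]) -> int:
        """Return an estimate, in seconds, of the time needed to finish."""
        return sum(MEDIAN_SECONDS.get(q, 15) for q in questions_left)

    remaining = ["likert", "likert", "free_text"]
    mins, secs = divmod(estimate_remaining(remaining), 60)
    print(f"About {mins} min {secs} s left")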

Relevance: 60.00%

Abstract:

The approaches to the analysis of various information resources relevant to user requirements at a semantic level are determined by the thesauruses of the appropriate subject domains. Algorithms for the formation and normalization of a multilingual thesaurus, as well as methods for comparing such thesauruses, are given.
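
The paper's algorithms are not reproduced in the abstract; a minimal sketch, assuming thesauri are modelled as term sets that are normalized (case-folded, accents stripped) and compared by set overlap, might look like:

    import unicodedata

    def normalize(term: str) -> str:
        """Case-fold and strip accents so terms from different languages'
        thesauri can be matched on a common form."""
        folded = unicodedata.normalize("NFKD", term.casefold())
        return "".join(c for c in folded if not unicodedata.combining(c))

    def similarity(thesaurus_a: set[str], thesaurus_b: set[str]) -> float:
        """Jaccard similarity of two normalized term sets."""
        a = {normalize(t) for t in thesaurus_a}
        b = {normalize(t) for t in thesaurus_b}
        return len(a & b) / len(a | b) if a | b else 1.0

    print(similarity({"Usuário", "Requisito"}, {"usuario", "requisito", "sistema"}))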

Relevance: 60.00%

Abstract:

This research focuses on the optimisation of resource utilisation in wireless mobile networks, taking into account users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems.

A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It is shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content-independent.

Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for given user requirements, communication system specifications and network performance. This approach addresses both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) needed to achieve the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined, and the results are compared with existing scheduling methods in terms of achievable fairness, efficiency and correlation.

Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users according to the perceived quality of the services they receive, while a trade-off between fairness and efficiency is maintained through online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of an adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst users for different scheduling policies, are demonstrated in the context of LTE.

Finally, work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading of user data (e.g. video traffic), while a new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of resource allocation across the fairness-efficiency spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) is evaluated and compared with that of standard operator-controlled WiFi hotspots.
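
The abstract does not give the Pause Intensity formula; a minimal sketch, assuming the metric combines the pause ratio (total paused time over session time) with the pause frequency (pauses per second), is shown below:

    def pause_intensity(pauses: list[float], session_seconds: float) -> float:
        """Illustrative Pause Intensity: product of pause ratio (total paused
        time over session time) and pause frequency (pauses per second).
        The exact formulation in the thesis may differ; this is an assumption.
        """
        total_pause = sum(pauses)
        ratio = total_pause / session_seconds
        frequency = len(pauses) / session_seconds
        return ratio * frequency

    # A 60 s session with three rebuffering pauses of 1.5, 2.0 and 0.5 s
    print(pause_intensity([1.5, 2.0, 0.5], 60.0))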

Relevance: 60.00%

Abstract:

This is an extended version of an article presented at the Second International Conference on Software, Services and Semantic Technologies, Sofia, Bulgaria, 11–12 September 2010.

Relevance: 60.00%

Abstract:

Software applications built on top of the service-oriented architecture (SOA) are increasingly popular, but testing them remains a challenge. In this paper a framework named TASSA, for testing the functional and non-functional behaviour of service-based applications, is presented. The paper focuses on the concept of design-time testing, the corresponding testing approach, and the architectural integration of the constituent TASSA tools. The individual TASSA tools, with sample validation scenarios, have already been presented, along with a general view of how they relate to one another. This paper's contribution is the structured testing approach, based on the integral use of the tools, and their architectural integration. The framework is based on SOA principles and is composable depending on user requirements.
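
TASSA's actual tool interfaces are not shown here; as a hedged illustration of design-time testing, the sketch below exercises a fragment of a service composition against a local stub standing in for a service that is not yet deployed (all names are hypothetical):

    # Hypothetical design-time test: the real payment service does not exist
    # yet, so the composition fragment is tested against a local stub.
    class PaymentServiceStub:
        def authorize(self, amount: float) -> dict:
            # Canned response standing in for the not-yet-deployed service
            return {"status": "OK", "amount": amount}

    def checkout(order_total: float, payment_service) -> str:
        """Fragment of the service composition under test."""
        result = payment_service.authorize(order_total)
        return "confirmed" if result["status"] == "OK" else "rejected"

    def test_checkout_confirms_on_authorized_payment():
        assert checkout(99.90, PaymentServiceStub()) == "confirmed"

    test_checkout_confirms_on_authorized_payment()
    print("design-time test passed")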

Relevance: 60.00%

Abstract:

Our modular approach to data hiding is an innovative concept in the data hiding research field. It enables the creation of modular digital watermarking methods that have extendable features and are designed for use in web applications. The methods consist of two types of modules: a basic module and an application-specific module. The basic module mainly provides features connected with a specific image format. As JPEG is a preferred image format on the Internet, we have focused on achieving robust, error-free embedding and retrieval of the embedded data in JPEG images. The application-specific modules are adaptable to the user requirements of the concrete web application. The experimental results of the modular data watermarking are very promising: they indicate excellent image quality, a satisfactory embedded data size, and perfect robustness against JPEG transformations with prespecified compression ratios. ACM Computing Classification System (1998): C.2.0.
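
A structural sketch of the two-module design, assuming the basic module exposes format-specific embed/extract operations and an application-specific module adapts them to one web application (the interfaces and the toy stub are illustrative, not the authors' scheme):

    from abc import ABC, abstractmethod

    class BasicModule(ABC):
        """Format-specific embedding/retrieval (e.g. a JPEG-robust scheme)."""
        @abstractmethod
        def embed(self, image: bytes, payload: bytes) -> bytes: ...
        @abstractmethod
        def extract(self, image: bytes) -> bytes: ...

    class CopyrightModule:
        """Application-specific module: adapts the payload to one web
        application's needs (here, a copyright notice)."""
        def __init__(self, basic: BasicModule):
            self.basic = basic

        def mark(self, image: bytes, owner: str) -> bytes:
            return self.basic.embed(image, owner.encode("utf-8"))

        def owner_of(self, image: bytes) -> str:
            return self.basic.extract(image).decode("utf-8")

    class AppendBytesStub(BasicModule):
        """Toy stand-in for a real JPEG embedder (not robust; illustration only)."""
        MARK = b"--wm--"
        def embed(self, image, payload): return image + self.MARK + payload
        def extract(self, image): return image.split(self.MARK, 1)[1]

    module = CopyrightModule(AppendBytesStub())
    marked = module.mark(b"<jpeg bytes>", "Alice")
    print(module.owner_of(marked))  # Alice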

Relevance: 60.00%

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking.

Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the trade-offs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
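
For readers unfamiliar with the bug class McPatom targets, a minimal example of an atomicity violation follows: two threads perform a non-atomic check-then-act on a single shared variable, the thread-pair/single-variable pattern the tool analyses (the banking scenario is illustrative):

    import threading

    balance = 100  # single shared variable

    def withdraw(amount: int) -> None:
        global balance
        # Atomicity violation: the check and the update are not one atomic
        # step, so another thread can interleave between them.
        if balance >= amount:
            current = balance           # read
            balance = current - amount  # write based on a possibly stale read

    threads = [threading.Thread(target=withdraw, args=(100,)) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(balance)  # usually 0, but can be -100 under an unlucky interleaving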

Relevance: 60.00%

Abstract:

Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of those platforms. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, i.e. dependency of the application on a particular cloud platform, which is harmful in the case of degradation or failure of platform services, or of price increases on service usage; and (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or due to the failure of a service. In a multi-cloud scenario it is possible to replace a failing service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, it is necessary to create mechanisms capable of selecting which cloud services/platforms should be used in accordance with the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying cloud services and platforms should be used, based on the user requirements defined in terms of functionality and quality; (ii) continually monitoring dynamic information about cloud services (such as response time, availability and price), in addition to the wide variety of services; and (iii) adapting the application if QoS violations affect user-defined requirements.

This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration would meet them more efficiently. The proposed strategy is composed of two phases. The first phase consists of modeling the application, exploiting the capacity to represent commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. In this phase an extended feature model is used to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability); furthermore, the non-functional requirements associated with cloud services are specified as properties in this model, describing dynamic information about those services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation; in this work we implement it using several techniques, including aspect-oriented, context-oriented, and component- and service-oriented programming.

Based on the proposed approach, we set out to assess: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared to a sequential approach; and (iii) which techniques offer the best trade-off between development/modularity effort and performance.
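
A condensed sketch of the second phase, assuming a MAPE-K-style cycle that monitors one service's QoS and rebinds to an equivalent service from another provider when a user-defined requirement is violated (providers, metrics and thresholds are invented for illustration):

    # Illustrative MAPE-K cycle for one service of a multi-cloud application.
    REQUIREMENT = {"max_response_ms": 200, "min_availability": 0.99}

    candidates = {  # equivalent services on different cloud platforms
        "cloud-A": {"response_ms": 350, "availability": 0.995, "price": 0.8},
        "cloud-B": {"response_ms": 120, "availability": 0.999, "price": 1.1},
    }

    def violates(metrics: dict) -> bool:  # Monitor + Analyse
        return (metrics["response_ms"] > REQUIREMENT["max_response_ms"]
                or metrics["availability"] < REQUIREMENT["min_availability"])

    def plan(current: str) -> str:  # Plan: cheapest compliant alternative
        ok = {name: m for name, m in candidates.items() if not violates(m)}
        return min(ok, key=lambda n: ok[n]["price"]) if ok else current

    current = "cloud-A"
    if violates(candidates[current]):
        current = plan(current)  # Execute: rebind the service
    print(current)  # cloud-B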

Relevance: 60.00%

Abstract:

Requirements for space-based monitoring of permafrost features were already defined within the IGOS Cryosphere Theme Report at the start of the IPY in 2007 (IGOS, 2007). The WMO Polar Space Task Group (PSTG, http://www.wmo.int/pages/prog/sat/pstg_en.php) identified the need to review the requirements for permafrost monitoring and to update them in 2013. Relevant surveys with a focus on satellite data are already available from the ESA DUE Permafrost user requirements survey (2009), the United States National Research Council (2014) and the ESA-CliC-IPA-GTN-P workshop in February 2014. These reports have been reviewed, specific needs have been discussed within the community, and a white paper has been submitted to the WMO PSTG. Acquisition requirements for monitoring, especially of terrain changes (incl. rock glaciers and coastal erosion) and lakes (extent, ice properties, etc.), with respect to current satellite missions have been specified. About 50 locations ('cold spots') where permafrost (Arctic and Antarctic) in situ monitoring has been taking place for many years, or where field stations are currently established, have been identified. These sites have been proposed to the WMO Polar Space Task Group as focus areas for future monitoring by high-resolution satellite data. The specifications of these sites, including metadata on site instrumentation, have been published as a supplement to the white paper (Bartsch et al. 2014, doi:10.1594/PANGAEA.847003).

The representativeness of the 'cold spots' around the Arctic was subsequently assessed based on a landscape-units product developed as part of the FP7 project PAGE21. The ESA DUE Permafrost service was utilised to produce a pan-Arctic database (25 km, 2000-2014) comprising mean annual surface temperature, annual and summer amplitude of surface temperature, and mean summer (July-August) surface temperature. Products related to surface status (frozen/unfrozen) were also derived from the ESA DUE Permafrost service, including the length of the unfrozen period, the first unfrozen day and the first frozen day. In addition, SAR (ENVISAT ASAR GM) statistics as well as topographic parameters were considered. The circumpolar datasets were assessed for redundancy in their information content, and 12 distinct units could be derived. The landscape units reveal similarities between the North Slope of Alaska and the region from the Yamal Peninsula to the Yenisei estuary; northern Canada is characterised by the same landscape units as western Siberia, and north-eastern Canada shows similarities to the Laptev coast region. This paper presents the results of this assessment, formulates recommendations for extensions of the in situ monitoring networks, and categorises the sites by satellite data requirements (specifically the Sentinels) with respect to landscape type and related processes.
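
The abstract does not state how the 12 landscape units were derived; one hedged way to reproduce that kind of reduction is to cluster standardised per-cell feature vectors (mean annual surface temperature, amplitudes, unfrozen-period length, etc.), for example:

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical feature table: one row per 25 km grid cell with
    # (mean annual surface temp, summer amplitude, unfrozen days, ...)
    rng = np.random.default_rng(0)
    features = rng.normal(size=(5000, 4))  # stand-in for the real datasets

    # Standardise, then cluster into the 12 units reported in the paper
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    units = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(z)
    print(np.bincount(units))  # cells per landscape unit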

Relevance: 60.00%

Abstract:

The studies have aimed to overcome the confusing variety of existing persistent identifier systems by: analysing the current national URN:NBN and other identifier initiatives; providing guidelines for an internationally harmonised persistent identifier framework that serves the long-term preservation needs of the research and cultural heritage communities; and advising these communities on a roadmap for gaining the potential benefits. This roadmap also includes a blueprint for an organisation for the distribution and maintenance of the persistent identifier infrastructure. These studies are connected to the broader PersID project with DEFF, SURF, DANS, the national libraries of Germany, Finland and Sweden, and CNR and FDR from Italy. A number of organisations have been involved in the process: Europeana, the British Library, the Dutch Royal Library, the National Library of Norway and the Ministry of Education, Flanders, Belgium.

PersID - III: Current State and State of the Art (IIIa) & User Requirements (IIIb) (Persistent Identifier: urn:nbn:nl:ui:13-9g4-i1s)
PersID - IV: Prototype for a Meta Resolver System / Work on Standards (Persistent Identifier: urn:nbn:nl:ui:13-wt1-6n9)
PersID - V: Sustainability (Persistent Identifier: urn:nbn:nl:ui:13-o4p-8py)

Please note that there are also two broader reports on the project as a whole: PersID - I: Project Report and PersID - II: Communication. For further information please visit the website of the Persistent Identifier project: www.persid.org
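
The report identifiers above are URN:NBN names; as a small illustration, the sketch below decomposes such a name and shows how a meta-resolver of the kind prototyped in PersID - IV might delegate by country code (the resolver base URL is an assumption):

    def parse_urn_nbn(urn: str) -> dict:
        """Split a URN:NBN such as 'urn:nbn:nl:ui:13-9g4-i1s' into its parts."""
        scheme, nid, nss = urn.split(":", 2)
        assert scheme.lower() == "urn" and nid.lower() == "nbn"
        country = nss.split(":", 1)[0]
        return {"namespace": nid, "country": country, "name": nss}

    def resolve(urn: str, resolver="https://example-resolver.org/") -> str:
        """Hypothetical meta-resolver lookup: delegate by country code."""
        return resolver + parse_urn_nbn(urn)["country"] + "/" + urn

    print(parse_urn_nbn("urn:nbn:nl:ui:13-9g4-i1s"))
    print(resolve("urn:nbn:nl:ui:13-9g4-i1s"))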

Relevance: 60.00%

Abstract:

High-quality, well-designed medical devices are necessary to provide safe and effective clinical care for patients, as well as to ensure the health and safety of professional and lay device users. Capturing the requirements of users and incorporating these into design is an essential component of this. The field of ergonomics has an opportunity to assist, not only in this area, but also by encouraging a more general consideration of the user during medical device development. A review of the literature on methods for assessing user requirements in engineering and ergonomics found that little published work exists on the ergonomics aspects of medical device development. In particular, there is little advice available to developers on which issues to consider during design and development, or recommendations for good practice in terms of the methods and approaches needed to capture the full range of user requirements. The Multidisciplinary Assessment of Technology Centre for Healthcare (MATCH) is a research collaboration that is working with industrial collaborators to apply ergonomics methods to real case-study projects, with the ultimate aim of producing an industry-focused guide to applying ergonomics principles in medical device development.

Relevance: 60.00%

Abstract:

MATCH (Multidisciplinary Assessment of Technology Centre for Healthcare) is a new collaboration in the UK that aims to support the healthcare sector by creating methods to assess the value of medical devices from concept through to mature product. A major aim of MATCH is to encourage the inclusion of the user throughout the product lifecycle in order to achieve devices that truly meet the requirements of their users. A review of the published literature indicates that user requirements are mainly collected during the design and evaluation stage of the product lifecycle whilst other areas, including the concept stage, have less user involvement. Complementing the literature review is an in-depth consultation with the medical device industry, which has identified a number of barriers encountered by companies when attempting to capture user requirements. These will be addressed by a number of case study projects, performed in collaboration with our industrial partners, that will examine the application and utility of different approaches to collecting and analysing data on user requirements. MATCH is focused on providing advice to device developers on how to select and apply methods that have maximum theoretical strength, practical application, cost-effectiveness and likelihood of wide sector acceptance. Feedback will be sought in order to ensure that the needs of the diverse medical device sector are met.

Relevance: 60.00%

Abstract:

Today, all services converge on a Next Generation Network [NGN]. At the same time, quality of service [QoS] demands driven by user requirements are becoming stricter, making it necessary to define QoS procedures that guarantee effective transport of the most critical and real-time services (such as voice), reducing problems of latency, jitter, packet loss and echo. Telecommunications operators must apply the regulations issued by the Communications Regulation Commission of Colombia [CRC] and comply with Recommendations Y.1540 and Y.1541 of the International Telecommunication Union [ITU]. This document presents a procedure for applying QoS mechanisms in an NGN over xDSL access, in order to maintain a level of QoS for Voice over IP (VoIP) that allows its provision, with economic and technical efficiency, to the benefit of both the customer and the telecommunications operator.
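
As an illustration of checking measured transport metrics against such recommendations, the sketch below compares a VoIP path against commonly cited ITU-T Y.1541 Class 0 objectives (verify the thresholds against the recommendation itself before relying on them):

    # ITU-T Y.1541 Class 0 objectives as commonly cited (verify against the
    # recommendation): mean delay <= 100 ms, delay variation <= 50 ms,
    # packet loss ratio <= 1e-3.
    Y1541_CLASS0 = {"delay_ms": 100.0, "jitter_ms": 50.0, "loss_ratio": 1e-3}

    def meets_class0(delay_ms: float, jitter_ms: float, loss_ratio: float) -> bool:
        return (delay_ms <= Y1541_CLASS0["delay_ms"]
                and jitter_ms <= Y1541_CLASS0["jitter_ms"]
                and loss_ratio <= Y1541_CLASS0["loss_ratio"])

    # Measurements for a VoIP flow over an xDSL access segment (example values)
    print(meets_class0(delay_ms=72.0, jitter_ms=18.5, loss_ratio=4e-4))  # True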
