986 results for information metrics


Relevance:

100.00%

Abstract:

The second theme of this book concerns L&D’s ‘Contributions’, specifically how L&D professionals articulate, communicate and demonstrate the value that L&D brings to the organization. Chapter 3, titled ‘Using information, metrics and developing business cases for L&D’, discusses how L&D professionals can do this using the business case as a vehicle. The business case is a tool that L&D professionals can use to show how new L&D initiatives can benefit the organization and its stakeholders; the value of such benefits can be articulated both quantitatively and qualitatively. Chapter 3 adopts a holistic approach to developing a business case. L&D professionals must have a working knowledge of accounting and finance without needing to be experts, as their expertise lies in L&D. Therefore, to complete a business case successfully, L&D professionals need to form teams comprising the right members, depending on what the business case is about. The political realities associated with developing a business case are also important considerations. How well L&D is able to ‘sell’ a business case depends on how well it is framed, usually as either a problem or an opportunity. We then discuss the information, data and metrics required to build a typical business case, specifically in terms of identifying the benefits and costs. The chapter concludes with some suggestions on how the findings from the business case can be presented in infographics-inspired form.

Relevance:

60.00%

Abstract:

In this paper, the performance of the network-coded amplify-and-forward cooperative protocol is studied. The use of network coding reduces the bandwidth consumed by relay transmission and hence increases the spectral efficiency of cooperative diversity. A distributed relay-selection strategy is applied to the cooperative scheme, which reduces system overhead and also facilitates the derivation of explicit expressions for information metrics such as outage probability and ergodic capacity. Both analytical and numerical results demonstrate that the proposed protocol can achieve large ergodic capacity and full diversity gain simultaneously.
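The closed-form expressions themselves are not given in the abstract, but the information metrics it names are straightforward to estimate numerically. Below is a minimal Monte Carlo sketch, assuming Rayleigh fading and the standard amplify-and-forward end-to-end SNR approximation γ_eq = γ₁γ₂/(γ₁ + γ₂ + 1); the channel model, rate threshold and SNR values are illustrative assumptions, not the paper's system model.

```python
import numpy as np

rng = np.random.default_rng(0)

def af_outage(snr_db, rate=1.0, n=200_000):
    """Monte Carlo estimate of outage probability for a two-hop
    amplify-and-forward link under Rayleigh fading (illustrative model)."""
    snr = 10 ** (snr_db / 10)
    # Exponentially distributed instantaneous SNRs on each hop.
    g1 = snr * rng.exponential(size=n)
    g2 = snr * rng.exponential(size=n)
    # Standard AF end-to-end SNR approximation.
    g_eq = g1 * g2 / (g1 + g2 + 1)
    capacity = 0.5 * np.log2(1 + g_eq)  # 1/2: two channel uses per symbol
    return np.mean(capacity < rate)     # outage: capacity below target rate

for s in (5, 10, 15, 20):
    print(f"{s:2d} dB: P_out ~ {af_outage(s):.4f}")
```

The same sampled `g_eq` values could be reused to estimate ergodic capacity as the mean of `0.5 * np.log2(1 + g_eq)`.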

Relevance:

60.00%

Abstract:

Master's dissertation in Information and Enterprise Systems presented to Universidade Aberta in association with Instituto Superior Técnico

Relevance:

40.00%

Abstract:

Effective enterprise information security policy management requires review and assessment activities to ensure that information security policies are aligned with business goals and objectives. Because security policy management involves both the policy development process and the security policy it produces, assessing it requires goal-based metrics for these two elements. However, current security management assessment methods provide only checklist-style assessments predefined by industry best practice and do not allow specific goal-based metrics to be developed. Drawing on theories from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed approach is then applied in a case scenario to illustrate its practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include empirical studies to validate the propositions and application of the approach in practical case studies, which would provide opportunities to introduce further enhancements.
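The GQM idea of deriving metrics from goals via questions can be sketched with a small data structure. The goal, questions and metrics below are invented examples for a policy-development-process assessment, not those of the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    """A GQM goal (purpose, issue, object, viewpoint), refined into
    questions, each answered by one or more metrics."""
    purpose: str
    issue: str
    object: str
    viewpoint: str
    questions: list[Question] = field(default_factory=list)

# Hypothetical example goal for assessing the policy development process.
goal = Goal(
    purpose="Evaluate",
    issue="alignment with business objectives",
    object="the security policy development process",
    viewpoint="the security manager",
    questions=[
        Question("How often is the policy reviewed?",
                 [Metric("months since last review", "months")]),
        Question("Are stakeholders involved in reviews?",
                 [Metric("stakeholder roles represented per review", "count")]),
    ],
)

for q in goal.questions:
    print(q.text, "->", [m.name for m in q.metrics])
```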

Relevance:

40.00%

Abstract:

Metrics are often used to compare the climate impacts of emissions from various sources, sectors or nations. These are usually based on global-mean input, and so there is the potential that important information on smaller scales is lost. Assuming a non-linear dependence of the climate impact on local surface temperature change, we explore the loss of information about regional variability that results from using global-mean input in the specific case of heterogeneous changes in ozone, methane and aerosol concentrations resulting from emissions from road traffic, aviation and shipping. Results from equilibrium simulations with two general circulation models are used. An alternative metric for capturing the regional climate impacts is investigated. We find that the application of a metric that is first calculated locally and then averaged globally captures a more complete and informative signal of climate impact than one that uses global-mean input. The loss of information when heterogeneity is ignored is largest in the case of aviation. Further investigation of the spatial distribution of temperature change indicates that although the pattern of temperature response does not closely match the pattern of the forcing, the forcing pattern still influences the response pattern on a hemispheric scale. When the short-lived transport forcing is superimposed on present-day anthropogenic CO2 forcing, the heterogeneity in the temperature response to CO2 dominates. This suggests that the importance of including regional climate impacts in global metrics depends on whether small sectors are considered in isolation or as part of the overall climate change.
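The distinction the paper draws, between feeding global-mean input into a non-linear impact function and computing the impact locally before averaging, can be illustrated with a toy calculation. The quadratic impact function and the regional temperature changes below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical local temperature changes (K) in four regions of equal area.
dT = np.array([0.1, 0.2, 0.4, 1.3])

impact = lambda t: t**2  # assumed non-linear (quadratic) impact function

global_mean_first = impact(dT.mean())  # metric computed from global-mean input
local_first = impact(dT).mean()        # metric computed locally, then averaged

print(global_mean_first)  # 0.25
print(local_first)        # 0.475 -- heterogeneity raises the aggregate impact
```

For any convex impact function, the locally computed metric is at least as large as the global-mean one (Jensen's inequality), so averaging first systematically discards the regional signal.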

Relevance:

40.00%

Abstract:

DDoS attacks are one of the major threats to Internet services. Sophisticated attackers mimic the features of legitimate network events, such as flash crowds, to fly under the radar, which makes DDoS attacks hard to detect. In this paper, we propose an attack-feature-independent method for detecting DDoS flooding attacks in local area networks. We employ flow entropy on local area network routers to supervise the network traffic and raise a potential DDoS flooding attack alarm when the flow entropy drops significantly in a short period of time. Furthermore, information distance is employed to differentiate DDoS attacks from flash crowds. In general, the attack traffic of one DDoS flooding attack session is generated by many bots from one botnet, all executing the same attack program; as a result, the similarity among attack flows should be higher than that among flash-crowd flows, which are generated by many independent users. Mathematical models have been established for the proposed detection strategies. Analysis based on the models indicates that the proposed methods can raise alarms for potential DDoS flooding attacks and, under certain conditions, can differentiate DDoS flooding attacks from flash crowds. Extensive experiments and simulations confirm the effectiveness of the proposed detection strategies.
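A minimal sketch of the two quantities involved: Shannon entropy over a flow-count distribution, and a symmetric information distance between two traffic distributions. Jensen-Shannon distance is used here as one common choice; the paper's exact distance measure, traffic features and thresholds are not reproduced, and all counts are hypothetical.

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (bits) of a flow-count distribution."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

def js_distance(c1, c2):
    """Jensen-Shannon distance between two flow distributions.
    (One common symmetric information distance; an assumption,
    not necessarily the exact measure used in the paper.)"""
    p = np.asarray(c1, dtype=float); p /= p.sum()
    q = np.asarray(c2, dtype=float); q /= q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float((a[a > 0] * np.log2(a[a > 0] / b[a > 0])).sum())
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

# Hypothetical packets-per-flow counts observed on a LAN router.
normal = [12, 9, 15, 11, 10, 14, 8, 13]  # traffic spread over many flows
attack = [95, 90, 2, 1, 3, 2, 1, 1]      # concentrated on victim flows
print(entropy(normal), entropy(attack))  # entropy drops under attack -> alarm

# Bots running one program look alike; random flash-crowd users do not.
print(js_distance([95, 90, 2, 1], [93, 92, 3, 2]))   # small distance: attack
print(js_distance([40, 3, 18, 9], [5, 33, 12, 25]))  # larger distance: flash crowd
```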

Relevance:

40.00%

Abstract:

Health Information Exchange (HIE) will play a key part in our nation’s effort to improve healthcare. The evidence of HIEs’ transformational role in healthcare delivery systems is quite limited, which led us to explore what exists in the healthcare industry that may provide evidence of the effectiveness and efficiency of HIEs. The objective of the study was to find out how many fully functional HIEs are using measurements or metrics to gauge the impact of HIE on quality improvement (QI) and on return on investment (ROI). A web-based survey was used to determine the number of operational HIEs using metrics for QI and ROI. Our study highlights the fact that only 50 percent of the HIEs who responded use or plan to use metrics. However, 95 percent of the respondents believed HIEs improve quality of care, while only 56 percent believed HIE showed positive ROI. Although operational HIEs present numerous opportunities to demonstrate the business model for improving healthcare quality, evidence to document the impact of HIEs is lacking.

Relevance:

40.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the ‘classic’ metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs.

Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which ‘an estimate’ is intimately tied to an understanding of what tasks are being planned; current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage.

The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of ‘classic’ program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead’s operators/operands, McCabe’s cyclomatic complexity, Henry & Kafura’s data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog and, thus, that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
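As a rough illustration of the kind of recounting involved when ‘classic’ size metrics are extended to a KBS language, the sketch below counts non-comment lines and clauses in Prolog source. The counting rules are deliberately simplified assumptions and do not reproduce the thesis's redefined metric counts.

```python
import re

def prolog_size_metrics(source: str) -> dict:
    """Crude size metrics for Prolog source: non-comment lines and
    clause count (simplified; real counting rules are more subtle)."""
    # Strip block comments, then drop blank and %-comment lines.
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)
    lines = [ln for ln in source.splitlines()
             if ln.strip() and not ln.strip().startswith("%")]
    code = "\n".join(ln.split("%")[0] for ln in lines)
    clauses = len(re.findall(r"\.(?:\s|$)", code))  # terms ended by '.'
    return {"loc": len(lines), "clauses": clauses}

example = """
% naive list length
len([], 0).
len([_|T], N) :- len(T, M), N is M + 1.
"""
print(prolog_size_metrics(example))  # {'loc': 2, 'clauses': 2}
```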

Relevance:

30.00%

Abstract:

Measuring quality attributes of object-oriented designs (e.g. maintainability and performance) has been covered by a number of studies. However, these studies have not considered security as much as other quality attributes. Also, most security studies focus on the level of individual program statements; this approach makes it hard and expensive to discover and fix vulnerabilities caused by design errors. In this work, we focus on the security design of an object-oriented application and define a number of security metrics. These metrics allow designers to discover and fix security vulnerabilities at an early stage, and help compare the security of various alternative designs. In particular, we propose seven security metrics to measure the data encapsulation (accessibility) and cohesion (interactions) of a given object-oriented class from the point of view of potential information flow.
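The abstract does not define the seven metrics, but an accessibility-style metric in this spirit can be sketched as the proportion of a class's classified (security-critical) attributes that are publicly reachable. The metric definition and the example class below are illustrative assumptions, not the paper's.

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    public: bool      # visible outside the class
    classified: bool  # carries security-critical data

def classified_attribute_accessibility(attrs: list[Attribute]) -> float:
    """Fraction of classified attributes that are public.
    Lower is better: fewer exposed paths for information flow.
    (Illustrative metric, not the paper's exact definition.)"""
    classified = [a for a in attrs if a.classified]
    if not classified:
        return 0.0
    return sum(a.public for a in classified) / len(classified)

# Hypothetical account class with one badly exposed attribute.
account = [
    Attribute("balance", public=False, classified=True),
    Attribute("pin_hash", public=True, classified=True),  # design flaw
    Attribute("nickname", public=True, classified=False),
]
print(classified_attribute_accessibility(account))  # 0.5
```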

Relevance:

30.00%

Abstract:

In this paper, cognitive load analysis via acoustic- and CAN-Bus-based driver performance metrics is employed to assess two different commercial speech dialog systems (SDS) during in-vehicle use. Several metrics are proposed to measure increases in stress, distraction and cognitive load, and we compare these measures with a statistical analysis of the speech recognition component of each SDS. It is found that care must be taken when designing an SDS, as it may increase cognitive load, which can be observed through increased speech response delay (SRD), changes in speech production due to negative emotion towards the SDS, and decreased driving performance on lateral control tasks. From this study, guidelines are presented for designing systems that are to be used in vehicular environments.

Relevance:

30.00%

Abstract:

Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library that is based on rigorous environmental scanning of an institution, the profession and the sector.

Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced.

Findings – While the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice.

Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning.

Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills and, due to its scalability, can be applied at department or wider level.

Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning.

Keywords – Australia, University libraries, Academic libraries, Change management, Manpower planning

Paper type – Case study

Relevance:

30.00%

Abstract:

Intelligent surveillance systems typically use a single visual-spectrum modality for their input. These systems work well in controlled conditions but often fail when lighting is poor or environmental effects such as shadows, dust or smoke are present. Thermal-spectrum imagery is not as susceptible to environmental effects; however, thermal imaging sensors are more sensitive to noise and produce only grayscale images, making distinguishing between objects difficult. Several approaches to combining the visual and thermal modalities have been proposed; however, they are limited by the assumption that both modalities are performing equally well. When one modality fails, existing approaches are unable to detect the drop in performance and disregard the underperforming modality. In this paper, a novel middle-fusion approach for combining visual- and thermal-spectrum images for object tracking is proposed. Motion and object detection is performed on each modality, and the object detection results are fused based on the current performance of each modality. Modality performance is determined by comparing the number of objects tracked by the system with the number detected by each mode, with a small allowance made for objects entering and exiting the scene. The tracking performance of the proposed fusion scheme is compared with the performance of the visual and thermal modes individually, and with a baseline middle-fusion scheme. Improvement in tracking performance using the proposed fusion approach is demonstrated. The proposed approach is also shown to be able to detect the failure of an individual modality and disregard its results, ensuring performance is not degraded in such situations.
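The performance-weighted fusion idea can be sketched simply: weight each modality's detection score by how closely its detection count agrees with the tracker's current object count. The weighting formula and the slack tolerance below are illustrative assumptions, not the paper's exact scheme.

```python
def modality_weight(n_detected: int, n_tracked: int, slack: int = 1) -> float:
    """Confidence in a modality: 1.0 when its detection count matches the
    tracker's object count (within a small slack for objects entering or
    leaving the scene), decaying toward 0 as the counts diverge.
    Illustrative formula, not the paper's exact scheme."""
    diff = max(0, abs(n_detected - n_tracked) - slack)
    return 1.0 / (1.0 + diff)

def fuse(score_visual: float, score_thermal: float,
         n_vis: int, n_thm: int, n_tracked: int) -> float:
    """Performance-weighted middle fusion of per-region detection scores."""
    w_v = modality_weight(n_vis, n_tracked)
    w_t = modality_weight(n_thm, n_tracked)
    if w_v + w_t == 0:
        return 0.0
    return (w_v * score_visual + w_t * score_thermal) / (w_v + w_t)

# Smoke obscures the visual camera: it reports 0 objects while 4 are tracked,
# so the thermal result dominates the fused score.
print(fuse(score_visual=0.1, score_thermal=0.9, n_vis=0, n_thm=4, n_tracked=4))
```

A failed modality thus down-weights itself automatically, which is the behaviour the abstract describes for detecting and disregarding an underperforming mode.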

Relevance:

30.00%

Abstract:

Several studies have developed metrics for software quality attributes of object-oriented designs, such as reusability and functionality. However, metrics which measure the quality attribute of information security have received little attention. Moreover, existing security metrics measure the system either at a high level (i.e. the whole-system level) or at a low level (i.e. the program-code level). These approaches make it hard and expensive to discover and fix vulnerabilities caused by software design errors. In this work, we focus on the design of an object-oriented application and define a number of information security metrics derivable from a program’s design artifacts. These metrics allow software designers to discover and fix security vulnerabilities at an early stage, and help compare the potential security of various alternative designs. In particular, we present security metrics based on composition, coupling, extensibility, inheritance, and the design size of a given object-oriented, multi-class program from the point of view of potential information flow.
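As with the class-level metrics above, the design-level metrics are not defined in the abstract. One coupling-flavoured example would measure the fraction of inter-class associations in a class diagram that can carry classified data; the representation and formula below are illustrative assumptions, not the paper's definitions.

```python
# Hypothetical class-diagram edges: (source class, target class, carries_classified).
associations = [
    ("LoginForm", "AuthService", True),
    ("AuthService", "UserStore", True),
    ("ReportView", "Renderer", False),
    ("AuthService", "AuditLog", True),
    ("ReportView", "UserStore", True),
]

def classified_coupling(edges) -> float:
    """Fraction of associations in the design that can transport
    classified data; lower means fewer design-level flow paths.
    (Illustrative metric, not the paper's definition.)"""
    return sum(carries for _, _, carries in edges) / len(edges)

print(classified_coupling(associations))  # 0.8
```

Because it needs only the class diagram, such a metric can be compared across alternative designs before any code exists, which is the early-stage use the abstract emphasizes.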

Relevance:

30.00%

Abstract:

In vector-space-based approaches to natural language processing, similarity is commonly measured by taking the angle between two vectors representing words or documents in a semantic space. This is natural from a mathematical point of view, as the angle between unit vectors is, up to constant scaling, the only unitarily invariant metric on the unit sphere. However, similarity judgement tasks reveal that human subjects fail to produce data which satisfies the symmetry and triangle-inequality requirements for a metric space. A possible conclusion, reached in particular by Tversky et al., is that some of the most basic assumptions of geometric models are unwarranted in the case of psychological similarity, a result which would impose strong limits on the validity and applicability of vector-space-based (and hence also quantum-inspired) approaches to the modelling of cognitive processes. This paper proposes a resolution to this fundamental criticism of the applicability of vector space models of cognition. We argue that pairs of words imply a context which in turn induces a point of view, allowing a subject to estimate semantic similarity. Context is here introduced as a point-of-view vector (POVV) and the expected similarity is derived as a measure over the POVVs. Different pairs of words will invoke different contexts and different POVVs. Hence the triangle inequality ceases to be a valid constraint on the angles. We test the proposal on a few triples of words and outline further research.
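A toy sketch of the underlying idea: the angle between two fixed vectors is fixed, but if each word pair is compared after reweighting by a pair-dependent context vector, the resulting angles need not satisfy the triangle inequality. The vectors and the reweighting rule below are invented for illustration; they are not the paper's POVV construction.

```python
import numpy as np

def angle(u, v):
    """Angle (radians) between two vectors."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def contextual_angle(u, v, pov):
    """Angle after reweighting coordinates by a pair-dependent
    point-of-view vector (a toy stand-in for a POVV measure)."""
    return angle(u * pov, v * pov)

# Invented 2-d 'word' vectors.
a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0]) / np.sqrt(2)
c = np.array([0.0, 1.0])

d_ab = contextual_angle(a, b, np.array([1.0, 0.1]))  # context stresses dim 1
d_bc = contextual_angle(b, c, np.array([0.1, 1.0]))  # context stresses dim 2
d_ac = contextual_angle(a, c, np.array([1.0, 1.0]))  # neutral context

print(d_ab + d_bc, d_ac)    # ~0.20 < ~1.57
print(d_ab + d_bc >= d_ac)  # False: the triangle inequality fails
```

Because each pair is judged under its own context, the three "distances" come from three different metrics, so no single metric-space constraint binds them.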