820 results for information security management assessment
Abstract:
In the IS literature, commitment is typically considered to involve organizational or managerial support for a system rather than that of its users. This paper, however, reports on a field study involving 16 organizations that attempted to build user involvement in developing a knowledge management strategy by having the users design it. Twenty-two IT-supported group workshops (involving 183 users) were run to develop action plans for better knowledge management that users would like to see implemented. Each workshop adopted the same problem structuring technique to help group members develop a politically feasible action plan to which they were psychologically and emotionally dedicated. In addition to reviewing the problem structuring method, this paper provides qualitative insight into the factors a knowledge management strategy should have to encourage user commitment. © 2004 Elsevier B.V. All rights reserved.
Abstract:
Our study investigated the impact of ICT expansion on economic freedom in the Middle East (Bahrain, Iran, Jordan, Kuwait, Lebanon, Oman, Qatar, Saudi Arabia, Syria, United Arab Emirates, and Yemen). Our empirical analysis used archival data from 1995 to 2005; it showed that ICT expansion in the Middle East has been effective both in bridging the digital divide and in promoting economic freedom in a region that was vulnerable to political, social, and global conflict. However, differences between countries, such as the educational attainment of their citizens and institutional resistance to technology acceptance, both enhanced and restricted the relationship between ICT and economic freedom.
Abstract:
Based on a Belief-Action-Outcome framework, we produced a model that captures senior managers' perceptions of both the antecedents and the consequences of Green IS adoption by a firm. This conceptual model and its associated hypotheses were empirically tested using a dataset generated from a survey of 405 organizations. The results suggest that coercive pressure influences attitude toward Green IS adoption while mimetic pressure does not. In addition, we found a significant relationship between Green IS adoption, attitude, and consideration of future consequences. Finally, we found that only long-term Green IS adoption was positively related to environmental performance. © 2013 Elsevier B.V.
Abstract:
The impact of ICT (information and communications technology) on the logistics service industry is reshaping its organisation and structure. Within this process, the nature of the changes resulting from ICT dissemination in small 3PLs (third party logistics providers) is still unclear, even though many logistics service markets, especially in the EU context, are populated by a large number of small 3PLs. In addition, there is a gap in the literature, in which the role of technological capability in small 3PLs is seriously underestimated. This gives rise to the need for further investigation in this area. The paper presents the preliminary results of a case study analysis of ICT usage in a sample of 7 small Italian 3PLs. The results highlight some of the barriers to effective ICT implementation, as well as some of the critical success factors.
Abstract:
The paper describes some basic types of archiving programs, along with their advantages and disadvantages with respect to the security of archiving. The results obtained in the described experiments are analysed and appraised.
Abstract:
This paper deals with presentations of the Bulgarian Cultural and Historical Heritage in cyberspace. The study took place in the Information Management course with bachelor students in Information Technologies, Information Brokerage, and Information Security at the University of Library Studies and Information Technologies. The students described about 300 different objects – cultural and historical, material and immaterial.
Abstract:
The shifting of global economic power from mature, established markets to emerging markets (EMs) is a fundamental feature of the new realities in the global political economy. Due to a combination of reasons (such as the scarcity of reliable information on the management systems of EMs and the growing contribution of human resource management (HRM) to organisational performance, amongst others), understanding the dynamics of HRM in the EMs context, and the need for proactive efforts by key stakeholders (e.g., multinational and local firms, policy makers, and institutions such as trade unions) to develop appropriate HRM practice and policy for EMs, have now become more critical than ever. This is all the more so given the phenomenal significance of the EMs predicted for the future of the global economy. For example, Antoine van Agtmael predicts that "in about 25 years the combined gross national product (GNP) of emergent markets will overtake that of currently mature economies causing a major shift in the centre of gravity of the global economy away from the developed to emerging economies" (van Agtmael 2007: 10–11). Despite the present (late 2013 and early 2014) slowdown in the contribution of EMs to global industrial growth (e.g., Das, 2013; Reuters, 2014), EMs are predicted to produce 70 per cent of world GDP growth, and a further ten years later their equity market capitalisation is expected to reach US$80 trillion, 1.2 times more than that of the developed world (see Goldman Sachs, 2010).
Abstract:
The complexity of many organizational tasks requires perspectives, expertise, and talents that are often not found in a single individual. Organizations have therefore been placing employees into groups, assigning them to tasks they would formerly have undertaken individually. The use of these groups, known as workgroups, has become an important strategy for managing this increased complexity. Empirical research on participative budgeting, however, has been limited almost exclusively to individuals. This dissertation empirically examines the effect of the information that management and workgroups have about group members' performance capabilities on the work standards that workgroups select during the participative budgeting process. A laboratory experiment was conducted in which two hundred and forty undergraduate business students were randomly assigned to three-member groups. The study provides empirical evidence which suggests that when management is unaware of group members' performance capabilities, workgroups select higher work standards and have higher performance levels than when management is aware of their performance capabilities.
Abstract:
The function of assessment in higher-education hospitality programs is to improve student learning. Although the assessment process is common in higher-education institutions, examples of assessment practices in hospitality programs have not been made available to academic practitioners. This paper describes a method that has proved successful in formulating assessment in a professional hospitality college program.
Abstract:
An assessment of how hotel guests view in-room entertainment-technology amenities was conducted to compare the importance of these technologies with how they performed. In-room entertainment technology continues to evolve in the hotel industry. However, given the multitude of entertainment products available in the marketplace today, hoteliers have little understanding of guests' expectations and of which in-room entertainment-technology amenities will drive guest satisfaction and increase loyalty to the hotel brand. Given that technology is integral to a hotel stay, this study seeks to evaluate the importance and performance of in-room entertainment-technology amenities. Findings indicate that free-to-guest television (FTG TV) and high-speed Internet access were the two most important in-room entertainment-technology amenities when it comes to the selection of a hotel for both leisure and business travelers. The Importance/Satisfaction Matrix presented in the current study showed that many of the in-room entertainment-technology amenities are currently a low priority for guests.
Keywords: importance-performance analysis, hotel, in-room entertainment technologies
Abstract:
This project is about retrieving data in a given range without allowing the server to read it, when the database is stored on the server. Basically, our goal is to build a database that allows the client to maintain the confidentiality of the stored data, even though all the data are held in a location other than the client's hard disk. This means that all the information written to that disk could easily be read by another person, who could do anything with it, for example sell it or log into accounts and use them to steal money or identities. To prevent this, we need to encrypt the data stored on the drive so that only the possessor of the key can read the information, while everyone else sees only encrypted data. Accordingly, all the data management must be done by the client; otherwise a malicious party could easily retrieve the data and use it for malicious purposes. All the methods analysed here rely on encrypting the data in transit. At the end of this project we analyse two theoretical and practical methods for building such databases and then test them with three datasets and with 10, 100, and 1000 queries. The aim of this work is to identify a trend that can be useful for future work based on this project.
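As an illustration of the general idea only (not one of the two methods compared in the project), the sketch below shows one simple, well-known way to support range retrieval over data the server cannot read: the client buckets values coarsely, encrypts each record with a key that never leaves the client, and filters the exact range locally after decryption. The library choice, bucket width, and record format are assumptions made for this example.

```python
# Minimal sketch of bucketized client-side encryption for range queries.
# Assumes the "cryptography" package; bucket width and record layout are
# hypothetical choices for illustration, not the project's actual schemes.
from cryptography.fernet import Fernet

BUCKET_WIDTH = 100  # coarse bucket size; tuning parameter assumed here


class Client:
    def __init__(self):
        self.key = Fernet.generate_key()   # key never leaves the client
        self.cipher = Fernet(self.key)

    def bucket(self, value: int) -> int:
        # The server only ever sees the bucket id, not the value itself.
        return value // BUCKET_WIDTH

    def encrypt_record(self, value: int, payload: str):
        token = self.cipher.encrypt(f"{value}|{payload}".encode())
        return self.bucket(value), token

    def range_query(self, server, low: int, high: int):
        # Ask the server for every bucket overlapping [low, high], then
        # decrypt and filter the exact range locally.
        buckets = range(self.bucket(low), self.bucket(high) + 1)
        results = []
        for token in server.fetch(buckets):
            value, payload = self.cipher.decrypt(token).decode().split("|", 1)
            if low <= int(value) <= high:
                results.append((int(value), payload))
        return results


class Server:
    """Stores only (bucket_id, ciphertext) pairs; it cannot read the values."""
    def __init__(self):
        self.store = {}

    def insert(self, bucket_id: int, token: bytes):
        self.store.setdefault(bucket_id, []).append(token)

    def fetch(self, buckets):
        for b in buckets:
            yield from self.store.get(b, [])


# Usage: the client inserts encrypted records, then retrieves a range.
client, server = Client(), Server()
for v in (42, 150, 151, 999):
    server.insert(*client.encrypt_record(v, f"row-{v}"))
print(client.range_query(server, 100, 200))   # rows with values 150 and 151
```

Note that the server still learns bucket identifiers and access patterns; that leakage-versus-efficiency trade-off is the usual design tension in schemes of this kind.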
Abstract:
The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and functional imaging capability. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited the clinical application of DCE-MRI in radiotherapy assessment: the technical limitations of accurate DCE-MRI implementation and the need for novel DCE-MRI data analysis methods that extract richer functional heterogeneity information.
This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods for radiotherapy assessment. The study is accordingly divided into two parts. The first part focuses on DCE-MRI temporal resolution as one of the key DCE-MRI technical factors and proposes improvements to it; the second part explores the potential value of image heterogeneity analysis and the combination of multiple PK models for therapeutic response assessment, and develops several novel DCE-MRI data analysis methods.
I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm was built on the recently developed compressed sensing (CS) theory. By utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective, IRB-approved study of brain radiosurgery patient DCE-MRI scans, the clinically obtained image data were selected as reference data, and the simulated accelerated k-space acquisition was generated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated, one from the undersampled data and one from the fully sampled data. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of the PK maps generated from the undersampled data with reference to the PK maps generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. These results suggest that DCE-MRI acceleration using the investigated image reconstruction method is feasible and promising.
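For reference, a generic compressed-sensing reconstruction of this kind can be written as the optimization problem below; this is only a schematic form, and the exact data-fidelity weighting and the order of the TGV penalty used in the study are not reproduced here.

```latex
\hat{x} \;=\; \arg\min_{x} \; \tfrac{1}{2}\,\bigl\lVert \mathcal{F}_{u}\,x - y \bigr\rVert_{2}^{2} \;+\; \lambda\,\mathrm{TGV}(x)
```

where \( \mathcal{F}_{u} \) is the undersampled k-space (Fourier) sampling operator, \( y \) the acquired k-space data, and \( \lambda \) the regularization weight.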
Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better calculation accuracy and efficiency. This method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is usually presented as an integral expression. The method also includes an advanced Kolmogorov-Zurbenko (KZ) filter to remove the potential noise effect in the data, and it solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current calculation methods at clinically relevant noise levels; at high temporal resolutions, the calculation efficiency of the new method was superior to that of current methods by about two orders of magnitude (10²). In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that the new method can be used for accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
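For context, the standard Tofts model and its equivalent differential form (the kind of derivative-based reformulation described above, which is linear in the parameters) can be written schematically as:

```latex
C_t(t) \;=\; K^{\mathrm{trans}} \int_{0}^{t} C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau
\qquad\Longleftrightarrow\qquad
\frac{dC_t(t)}{dt} \;=\; K^{\mathrm{trans}}\, C_p(t) \;-\; k_{ep}\, C_t(t)
```

where \( C_t \) is the tissue contrast-agent concentration, \( C_p \) the plasma (arterial input) concentration, and \( K^{\mathrm{trans}} \), \( k_{ep} \) the PK rate constants; the specific deformation and KZ filtering steps of the proposed method are not reproduced here.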
II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part aims at methodology development along two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity. This approach is motivated by the rationale that radiotherapy-induced functional change can be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment and control groups received multiple-fraction treatments, with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map showed significant differences between the treatment and control groups; when Rényi dimensions were used for treatment/control group classification, the achieved accuracy was higher than that obtained using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed. It is intended to address the lack of temporal information and the poor calculation efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second developed method is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity during contrast agent uptake on DCE images. In the same small-animal experiment, the selected parameters from dynamic FSD analysis showed significant differences between the treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. When dynamic FSD parameters were used, treatment/control group classification after the first treatment fraction was better than when using conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
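For reference, the Rényi (generalized) dimensions used in the fractal analysis above are conventionally defined as:

```latex
D_q \;=\; \lim_{\varepsilon \to 0} \frac{1}{q-1}\, \frac{\log \sum_{i} p_i^{\,q}}{\log \varepsilon}
```

where \( p_i \) is the fraction of the measure (here, of the PK parameter map) falling in box \( i \) of size \( \varepsilon \), and varying \( q \) weights different parts of the heterogeneity spectrum. The box sizes and the range of \( q \) actually used in the study are not specified here.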
The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative version is widely adopted for DCE-MRI analysis as a gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine the information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using regional mean values of the PK parameters. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model was shown to be superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from these two models. When evaluated within the biological subvolume, these biomarkers reflected significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.
In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method for accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.
Abstract:
Cyber-physical systems tightly integrate physical processes and information and communication technologies. As today’s critical infrastructures, e.g., the power grid or water distribution networks, are complex cyber-physical systems, ensuring their safety and security becomes of paramount importance. Traditional safety analysis methods, such as HAZOP, are ill-suited to assess these systems. Furthermore, cybersecurity vulnerabilities are often not considered critical, because their effects on the physical processes are not fully understood. In this work, we present STPA-SafeSec, a novel analysis methodology for both safety and security. Its results show the dependencies between cybersecurity vulnerabilities and system safety. Using this information, the most effective mitigation strategies to ensure safety and security of the system can be readily identified. We apply STPA-SafeSec to a use case in the power grid domain, and highlight its benefits.
Abstract:
There has been a great deal of interest in the area of cyber security in recent years. But what is cyber security exactly? And should society really care about it? We look at some of the challenges of being an academic working in the area of cyber security and explain why cyber security is, to put it rather simply, hard!
Speaker biography: Prof. Keith Martin is Professor of Information Security at Royal Holloway, University of London. He received his BSc (Hons) in Mathematics from the University of Glasgow in 1988 and a PhD from Royal Holloway in 1991. Between 1992 and 1996 he held a Research Fellowship at the University of Adelaide, investigating mathematical modelling of cryptographic key distribution problems. In 1996 he joined the COSIC research group of the Katholieke Universiteit Leuven in Belgium, working on security for third generation mobile communications. Keith rejoined Royal Holloway in January 2000, became a Professor in Information Security in 2007, and was Director of the Information Security Group between 2010 and 2015. Keith's research interests range across cyber security, with a focus on cryptographic applications. He is the author of 'Everyday Cryptography', published by Oxford University Press.