975 results for COMPUTER SCIENCE, THEORY


Relevance:

90.00%

Publisher:

Abstract:

In e-Science experiments, it is vital to record the experimental process for later use, such as interpreting results, verifying that the correct process took place, or tracing where data came from. The process that led to some data is called the provenance of that data, and a provenance architecture is the software architecture for a system that provides the functionality needed to record, store and use process documentation. However, there has been little principled analysis of what is actually required of a provenance architecture, so it has not been possible to determine the functionality such an architecture should ideally support. In this paper, we present use cases for a provenance architecture drawn from current experiments in biology, chemistry, physics and computer science, and analyse the use cases to determine the technical requirements of a generic, technology- and application-independent architecture. We propose an architecture that meets these requirements and evaluate a preliminary implementation by attempting to realise two of the use cases.

Relevance:

90.00%

Publisher:

Abstract:

This paper is the result of a fruitful cooperation between computer science and dental diagnosis expertise. The study presents a new approach that applies computer algorithms to radiographic images of dental implantation used for bone regeneration. We focus here only on the contribution of computer assistance to the clinical research, as the periodontal therapy itself is beyond the scope of this paper. The proposed system is based on a pattern recognition approach directed at recognizing density changes in the intra-bony affected areas of patients. It comprises different modules with new algorithms specially designed to treat the patients' radiographic images more accurately. The system includes digitizing the images, detecting the complicated region of interest (ROI), defining a reference area to correct any projection discrepancy between follow-up images, and finally extracting the distinguishing features of the ROI as a basis for determining the rate of new bone density accumulation. This study is applied to two typical dental cases for a patient who received two different operations. The results are very encouraging and more accurate than the traditional techniques reported before.
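The abstract above does not give implementation details; a minimal sketch of the underlying idea, comparing the mean grey-level density of a detected ROI against a reference area to compensate for projection differences between follow-up radiographs, could look like the following (array and function names are hypothetical, not the authors' actual modules):

```python
import numpy as np

def normalized_roi_density(image, roi_mask, ref_mask):
    """Mean grey-level density of an ROI, normalized by a reference area.

    image    -- 2D array of grey levels from a digitized radiograph
    roi_mask -- boolean mask of the intra-bony region of interest
    ref_mask -- boolean mask of a stable reference area used to correct
                projection/exposure differences between follow-up images
    """
    roi_mean = image[roi_mask].mean()
    ref_mean = image[ref_mask].mean()
    return roi_mean / ref_mean  # ratio is comparable across follow-up images

def density_change(image_before, image_after, roi_mask, ref_mask):
    """Illustrative rate of new bone density accumulation between two visits."""
    return (normalized_roi_density(image_after, roi_mask, ref_mask)
            - normalized_roi_density(image_before, roi_mask, ref_mask))
```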

Relevance:

90.00%

Publisher:

Abstract:

The under-representation of women in the computing profession in many parts of the western world has received our attention through numerous publications, the noticeably low representation of women at computer science conferences, and the lecture halls themselves. Over the past two decades, the situation has become worse. Please refer to the other articles in this special issue for detailed statistics, a discussion of factors that contribute to the low participation rate by women, and suggestions on how to reverse the current trend. This paper seeks to add to the dialogue by presenting preliminary findings from a research project conducted in four countries. The aim of this research was to gain insight into the perceptions future computer professionals hold of the category of employment loosely defined by the term "a computer professional." One goal was to determine whether there is a difference between female and male students in their view of computer professionals. Other goals were to determine whether there was any difference between female and male students in different parts of the world, and who or what most influences students to undertake their courses in computing. The team of researchers gave an extensive questionnaire to undergraduate students enrolled in a variety of computing degree programs at Victoria University of Technology in Melbourne, the University of East London, the Chinese University of Hong Kong, and Ithaca College in Ithaca, New York. This article reports on the analysis of the results from the questionnaire. It discusses gender differences in the responses from the students in these countries in order to obtain a worldwide perspective. At this time, it does not report on the similarities and differences between the groups of participants from each of the four countries. Instead, it investigates whether there are gendered differences in the views of this rather broad sample of future computer professionals.

Relevance:

90.00%

Publisher:

Abstract:

Attempts to produce adequate and long-lived subject indexes of information systems and computer science research have failed. In this paper we report preliminary results of an approach by which the terms expressed in research literature, such as that in information systems, can be systematically and meaningfully categorised. The approach is based on Roman Ingarden's ontological theory of the written scholarly work: its nature, existence, and categorisation. It builds on Grounded Theory, a rigorous grounded qualitative research method addressing how meaningful categories can be analysed from text and related to each other. We have found that the key unit of analysis for operationalising Ingarden's approach through Grounded Theory is the "reported research activity", and that the process is possible although labour intensive. On the basis of using the approach, we propose simple steps to improve the quality of keywords in reported research.

Relevance:

90.00%

Publisher:

Abstract:

The aim of the research is to investigate factors that may explain success in elementary computer programming at the tertiary level. The first phase of the research included the identification of possible explanatory factors through a literature review, a survey of students studying introductory computing, a focus-group session with teachers of computer programming, and interviews with programming students. The second phase of the research, called the main study, involved testing the identified factors. Two different groups of programming students, one majoring in business computing and another majoring in computer science, completed a survey questionnaire. The findings of the research are as follows. Gender is of little significance for business students, but there is an adverse gender penalty for females in computer science. Secondary school assessment is inversely related to outcomes in business computing but directly influences outcomes in the first programming unit of the computer science course. As in prior research, previous knowledge and experience were demonstrated to matter. A range of other variables was found to be of little importance. The research suggests that different problem-solving techniques might be relevant in business computing compared with those of use in computer science.

Relevance:

90.00%

Publisher:

Abstract:

Requirements engineering is the commencing phase in the development of either software applications or information systems. It is concerned with understanding and specifying the customer's requirements of the system to be delivered. Throughout the literature, this is agreed to be one of the most crucial and, unfortunately, problematic phases in development. Despite the diversity of research directions, approaches and methods, understanding and management of the process remain limited. Among contemporary approaches to improving the current practice of requirements engineering, the Formal Object-Oriented Method (FOOM) has been introduced as a promising solution. The FOOM approach to requirements engineering is based on a synthesis of socio-organisational theory, the object-oriented approach, and mathematical formal specification. The entire FOOM specification process is evolutionary and involves a large volume of changes in requirements. During this process, requirements evolve through various informal, semi-formal, and formal forms, while maintaining a semantic link between these forms and, most importantly, conforming to the customer's requirements. A deep understanding of the complexity of the requirements model and its dynamics is critical to improving requirements engineering process management. This thesis investigates the benefits of documenting both the evolution of the requirements model and the rationale for that evolution. Design explanation explains and justifies the deliberations of, and decisions made during, the design activity. In this thesis, design explanation is used to describe the requirements engineering process in order to improve understandability of, and traceability within, the evolving requirements specification. The design explanation recorded during this research project also assisted the researcher in gaining insights into the creative and opportunistic characteristics of the requirements engineering process. This thesis offers an interpretive investigation into incorporating design explanation within FOOM in order to extend and enhance the method. The researcher's interpretation and analysis of the collected data highlight an insight-driven and opportunistic process rather than a strictly and systematically predefined one. In fact, the process was not smoothly evolutionary, but involved occasional 'crisis' points at which the model was reconceptualised, simplified and restructured. The contributions of the thesis therefore lie not only in an effective incorporation of design explanation within FOOM, but also in a deep understanding of the dynamic process of requirements engineering. The new understanding of the complexity of the requirements model and its dynamics suggests new directions for future research and forms a basis for a new approach to process management.

Relevance:

90.00%

Publisher:

Abstract:

Electronic commerce and the Internet have created demand for automated systems that can make complex decisions utilizing information from multiple sources. Because the information is uncertain, dynamic, distributed, and heterogeneous in nature, these systems require a great diversity of intelligent techniques including expert systems, fuzzy logic, neural networks, and genetic algorithms. However, in complex decision making, many different components or sub-tasks are involved, each of which requires a different type of processing. Thus multiple such techniques are required, resulting in systems called hybrid intelligent systems. That is, hybrid solutions are crucial for complex problem solving and decision making. There is a growing demand for these systems in many areas including financial investment planning, engineering design, medical diagnosis, and cognitive simulation. However, the design and development of these systems is difficult because they have a large number of parts or components that have many interactions. From a multi-agent perspective, agents in multi-agent systems (MAS) are autonomous and can engage in flexible, high-level interactions. MASs are good at complex, dynamic interactions. Thus a multi-agent perspective is suitable for the modeling, design, and construction of hybrid intelligent systems. The aim of this thesis is to develop an agent-based framework for constructing hybrid intelligent systems which are mainly used for complex problem solving and decision making. Existing software development techniques (typically object-oriented) are inadequate for modeling agent-based hybrid intelligent systems; there is a fundamental mismatch between the concepts used by object-oriented developers and the agent-oriented view. Although there are some agent-oriented methodologies, such as the Gaia methodology, there is still no methodology specifically tailored to analyzing and designing agent-based hybrid intelligent systems. To this end, a methodology is proposed that is specifically tailored to the analysis and design of agent-based hybrid intelligent systems. The methodology consists of six models: a role model, an interaction model, an agent model, a skill model, a knowledge model, and an organizational model. This methodology differs from other agent-oriented methodologies in its skill and knowledge models. As good decisions and problem solutions are mainly based on adequate information, rich knowledge, and appropriate skills to use that knowledge and information, these two models are of paramount importance in modeling complex problem solving and decision making. Following the methodology, an agent-based framework for constructing hybrid intelligent systems for complex problem solving and decision making was developed. The framework has several crucial characteristics that differentiate this research from others. Four important issues relating to the framework are also investigated: the building of an ontology for financial investment, matchmaking in middle agents, reasoning in problem solving and decision making, and decision aggregation in MASs. The thesis demonstrates how to build a domain-specific ontology and how to access it in a MAS by building a financial ontology. It is argued that the practical performance of service provider agents has a significant impact on the matchmaking outcomes of middle agents, and it is proposed that service provider agents' track records be considered in matchmaking. A way to provide initial values for the track records of service provider agents is also suggested. The concept of 'reasoning with multimedia information' is introduced, and reasoning with still-image information using symbolic projection theory is proposed. How to choose suitable aggregation operations is demonstrated through a financial investment application, and three approaches to implementing decision aggregation in MASs are proposed: the stationary agent approach, the token-passing approach, and the mobile agent approach. Based on the framework, a prototype was built and applied to financial investment planning. This prototype consists of one serving agent, one interface agent, one decision aggregation agent, one planning agent, four decision making agents, and five service provider agents. Experiments were conducted on the prototype. The experimental results show that the framework is flexible, robust, and fully workable. All agents derived from the methodology exhibit their behaviors correctly as specified.
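The abstract does not reproduce the thesis's matchmaking mechanism; as an illustration only, a toy matchmaker that ranks service provider agents by a smoothed track record (with a prior standing in for the suggested initial values) could be sketched as below. All names here (`Provider`, `track_record`, `matchmake`) are assumptions, not the thesis's actual components.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    capabilities: set          # services the agent advertises
    successes: int = 0         # completed requests
    failures: int = 0

    def track_record(self, prior=0.5, weight=2):
        # Smoothed success rate; `prior` and `weight` act as the initial
        # value before any real outcomes have been observed.
        return (self.successes + prior * weight) / (self.successes + self.failures + weight)

def matchmake(providers, required_capability):
    """Rank providers that advertise the capability by their track record."""
    candidates = [p for p in providers if required_capability in p.capabilities]
    return sorted(candidates, key=lambda p: p.track_record(), reverse=True)

providers = [Provider("fund-analyst", {"stock-advice"}, successes=8, failures=2),
             Provider("novice-bot", {"stock-advice"}, successes=1, failures=3)]
print([p.name for p in matchmake(providers, "stock-advice")])
```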

Relevance:

90.00%

Publisher:

Abstract:

This chapter examines the nature and role of theory in criminal justice evaluation. A distinction between theories of and theories for evaluation is offered to clarify what is meant by 'theory' in the context of contemporary evaluation practice. Theories of evaluation provide a set of prescriptions and principles that can be used to guide the design, conduct and use of evaluation. Theories for evaluation include programme theory and the application of social science theory to understand how and why criminal justice interventions work to generate desired outcomes. The fundamental features of these three types of theory are discussed in detail, with a particular focus on demonstrating their combined value and utility for informing and improving the practice of criminal justice evaluation.

Relevance:

90.00%

Publisher:

Abstract:

Smartphones are pervasively used in society and have been both the target and victim of malware writers. Motivated by the significant threat that malware presents to legitimate users, we survey the current status of smartphone malware and its propagation models. The content of this paper is presented in two parts. In the first part, we review the short history of mobile malware evolution since 2004, and then list the classes of mobile malware and their infection vectors. At the end of the first part, we enumerate the possible damage caused by smartphone malware. In the second part, we focus on smartphone malware propagation modeling. In order to understand the propagation behavior of smartphone malware, we recall generic epidemic models as a foundation for further exploration. We then extensively survey the smartphone malware propagation models. At the end of this paper, we highlight issues of the current smartphone malware propagation models and discuss possible future trends based on our understanding of this topic. © 2014 IEEE.
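For context, one of the generic epidemic models such surveys build on is the classic SIR model; its standard form (background material, not a contribution of the surveyed paper) is

```latex
\frac{dS}{dt} = -\beta S I, \qquad
\frac{dI}{dt} = \beta S I - \gamma I, \qquad
\frac{dR}{dt} = \gamma I,
```

where S, I and R are the susceptible, infected and recovered fractions of devices, β is the infection rate and γ the recovery rate, with S + I + R = 1.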

Relevance:

90.00%

Publisher:

Abstract:

Smartphone applications are becoming more and more popular and pervasive in our daily life, and are also attractive to malware writers due to their limited computing resources and vulnerabilities. At the same time, we possess limited understanding of our opponents in cyberspace. In this paper, we investigate the propagation model of SMS/MMS-based worms by integrating a semi-Markov process with a social relationship graph. In our modeling, we use the semi-Markov process to characterize state transitions among mobile nodes, and employ social network theory, a missing element in many previous works, to enhance the proposed mobile malware propagation model. In order to evaluate the proposed models, we developed dedicated software and collected large-scale real-world data for this purpose. The extensive experiments indicate that the proposed models and algorithms are effective and practical. © 2014 Elsevier Ltd. All rights reserved.
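As a rough illustration of the kind of propagation being modelled, the sketch below runs a toy discrete-time simulation of an SMS worm over a contact-list graph. It is not the paper's semi-Markov formulation; the states and transition probabilities are assumed values chosen only for the example.

```python
import random

# Node states for the toy model (assumed, not the paper's state set).
SUSCEPTIBLE, INFECTED, IMMUNE = 0, 1, 2

def simulate(contacts, p_infect=0.1, p_clean=0.05, steps=50, seed_node=0):
    """Spread a worm over a contact graph; return the final infected count."""
    state = {n: SUSCEPTIBLE for n in contacts}
    state[seed_node] = INFECTED
    for _ in range(steps):
        nxt = dict(state)
        for n, s in state.items():
            if s == INFECTED:
                for friend in contacts[n]:          # worm messages each contact
                    if state[friend] == SUSCEPTIBLE and random.random() < p_infect:
                        nxt[friend] = INFECTED
                if random.random() < p_clean:       # user removes the worm
                    nxt[n] = IMMUNE
        state = nxt
    return sum(s == INFECTED for s in state.values())

contacts = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
print(simulate(contacts))
```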

Relevance:

90.00%

Publisher:

Abstract:

A new portfolio risk measure, defined as the uncertainty of the portfolio's fuzzy return, is introduced in this paper. Going beyond the well-known Sharpe ratio (i.e., the reward-to-variability ratio) of modern portfolio theory, we introduce the so-called fuzzy Sharpe ratio in the fuzzy modeling context. In addition to the new risk measure, we also put forward the reward-to-uncertainty ratio to assess portfolio performance in fuzzy modeling. Corresponding to two approaches based on TM and TW fuzzy arithmetic, two portfolio optimization models are formulated in which the uncertainty of portfolio fuzzy returns is minimized while the fuzzy Sharpe ratio is maximized. These models are solved by the fuzzy approach or by a genetic algorithm (GA). Solutions of the two proposed models are shown to dominate those of the conventional mean-variance optimization (MVO) model used prevalently in the financial literature in terms of portfolio return uncertainty. In terms of portfolio performance evaluated by the fuzzy Sharpe ratio and the reward-to-uncertainty ratio, the model using TW fuzzy arithmetic yields higher-performance portfolios than both the MVO model and the fuzzy model that employs TM fuzzy arithmetic. We also find that the fuzzy approach for solving multiobjective problems appears to achieve better solutions than the GA, although the GA can offer a series of well-diversified portfolio solutions diagrammed in a Pareto frontier.
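For reference, the classical Sharpe ratio that the fuzzy Sharpe ratio generalises is the reward-to-variability ratio

```latex
S = \frac{E[R_p] - R_f}{\sigma_p},
```

where E[R_p] is the expected portfolio return, R_f the risk-free rate and σ_p the standard deviation of the portfolio return; per the abstract, the fuzzy variant replaces this variability term with the uncertainty of the portfolio's fuzzy return (its precise definition is not reproduced here).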

Relevance:

90.00%

Publisher:

Abstract:

As the new millennium approaches, we are living in a society that is increasingly dependent upon information technology. However, whilst technology can deliver a number of benefits, it also introduces new vulnerabilities that can be exploited by persons with the necessary technical skills. Hackers represent a well-known threat in this respect and are responsible for a significant degree of disruption and damage to information systems. However, they are not the only criminal element that has to be taken into consideration. Evidence suggests that technology is increasingly seen as a potential tool for terrorist organizations. This is leading to the emergence of a new threat in the form of 'cyber terrorists', who attack technological infrastructures such as the Internet in order to help further their cause. The paper discusses the problems posed by these groups and considers the nature of the responses necessary to preserve the future security of our society.