890 results for knowledge-based systems


Relevance: 90.00%

Abstract:

Part 1: Introduction

Relevance: 90.00%

Abstract:

As unmanned autonomous vehicles (UAVs) are being widely utilized in military and civil applications, concerns are growing about mission safety and how to integrate different phases of mission design. One important barrier to a cost-effective and timely safety certification process for UAVs is the lack of a systematic approach for bridging the gap between understanding high-level commander/pilot intent and implementing that intent through low-level UAV behaviors. In this thesis we demonstrate an entire systems design process for a representative UAV mission, beginning from an operational concept and requirements and ending with a simulation framework for segments of the mission design, such as path planning and decision making in collision avoidance. We divided this complex system into sub-systems: path planning, collision detection and collision avoidance. We then developed software modules for each sub-system.
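
For illustration only, the sketch below (hypothetical module names and logic, not the thesis's actual software) shows how the three sub-systems named above might be organized as separate modules: a path planner producing waypoints, a collision detector checking them against known obstacles, and a collision avoider adjusting the offending waypoint.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class Obstacle:
    position: Point
    radius: float

class PathPlanner:
    """Hypothetical path-planning module: straight-line waypoints from start to goal."""
    def plan(self, start: Point, goal: Point, n_waypoints: int = 10) -> List[Point]:
        return [(start[0] + (goal[0] - start[0]) * t / n_waypoints,
                 start[1] + (goal[1] - start[1]) * t / n_waypoints)
                for t in range(n_waypoints + 1)]

class CollisionDetector:
    """Hypothetical collision-detection module: distance-threshold check against obstacles."""
    def first_conflict(self, path: List[Point], obstacles: List[Obstacle]):
        for i, p in enumerate(path):
            for obs in obstacles:
                if math.dist(p, obs.position) < obs.radius:
                    return i, obs
        return None

class CollisionAvoider:
    """Hypothetical collision-avoidance module: push the conflicting waypoint outside the obstacle."""
    def resolve(self, path: List[Point], conflict, margin: float = 1.0) -> List[Point]:
        i, obs = conflict
        px, py = path[i]
        ox, oy = obs.position
        d = math.dist((px, py), (ox, oy)) or 1e-9
        scale = (obs.radius + margin) / d
        path[i] = (ox + (px - ox) * scale, oy + (py - oy) * scale)
        return path

# Usage: plan, detect, and avoid in sequence, mirroring the sub-system split described above.
planner, detector, avoider = PathPlanner(), CollisionDetector(), CollisionAvoider()
path = planner.plan((0.0, 0.0), (10.0, 10.0))
conflict = detector.first_conflict(path, [Obstacle((5.0, 4.0), 1.5)])
if conflict is not None:
    path = avoider.resolve(path, conflict)
print(path)
```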

Relevance: 90.00%

Abstract:

There is increasing concern to reduce the cost and overheads involved in the development of reliable systems. Selective protection of the most critical parts of a system represents a viable solution for obtaining a high level of reliability at a fraction of the cost. In particular, to design a selective fault mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates for protection (hardening). This paper presents an application-based metric to estimate the criticality of each register in the register file of microprocessor-based systems. The proposed metric relies on the combination of three different criteria based on common features of the executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy compared to previous approaches, regardless of the underlying architecture.
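
The abstract does not name the paper's three criteria, so the sketch below is only an assumed illustration of the general idea: per-register statistics from an application trace are normalized and combined into a weighted score used to rank hardening candidates. The field names, criteria, and weights are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RegisterProfile:
    """Per-register statistics gathered from an application trace (illustrative fields)."""
    name: str
    live_cycles: int   # cycles during which the register holds a value that is read later
    reads: int         # number of read accesses
    writes: int        # number of write accesses

def criticality(profile: RegisterProfile, total_cycles: int,
                weights=(0.5, 0.3, 0.2)) -> float:
    """Combine three normalized criteria into a single criticality score in [0, 1].

    The criteria and weights here are assumptions for illustration; the paper
    combines its own three application-based criteria."""
    lifetime = profile.live_cycles / total_cycles                        # exposure to soft errors
    read_rate = profile.reads / max(profile.reads + profile.writes, 1)   # chance a fault propagates
    activity = min((profile.reads + profile.writes) / total_cycles, 1.0) # overall usage
    w1, w2, w3 = weights
    return w1 * lifetime + w2 * read_rate + w3 * activity

# Rank registers and pick the top candidates to harden.
profiles = [
    RegisterProfile("r1", live_cycles=8000, reads=1200, writes=300),
    RegisterProfile("r2", live_cycles=1500, reads=200, writes=600),
]
ranked = sorted(profiles, key=lambda p: criticality(p, total_cycles=10000), reverse=True)
print([p.name for p in ranked])
```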

Relevance: 90.00%

Abstract:

Bilinear pairings can be used to construct cryptographic systems with very desirable properties. A pairing maps members of groups on elliptic and genus 2 hyperelliptic curves to an extension of the finite field over which the curves are defined. The finite fields must, however, be large to ensure adequate security. The complicated group structure of the curves and the expensive field operations result in time-consuming computations that are an impediment to the practicality of pairing-based systems. The Tate pairing can be computed efficiently using the ηT method. Hardware architectures can be used to accelerate the required operations by exploiting the parallelism inherent in the algorithmic and finite field calculations. The Tate pairing can be performed on elliptic curves of characteristic 2 and 3 and on genus 2 hyperelliptic curves of characteristic 2. Curve selection depends on several factors, including the desired computational speed, the area constraints of the target device and the required security level. In this thesis, custom hardware processors for the acceleration of the Tate pairing are presented and implemented on an FPGA. The underlying hardware architectures are designed with care to exploit available parallelism while ensuring resource efficiency. The characteristic 2 elliptic curve processor contains novel units that return a pairing result in a very low number of clock cycles. Despite the more complicated computational algorithm, the speed of the genus 2 processor is comparable. Pairing computation on each of these curves can be appealing in applications with different attributes. A flexible processor that can perform pairing computation on elliptic curves of characteristic 2 and 3 has also been designed. An integrated hardware/software design and verification environment has been developed. This system automates the procedures required for robust processor creation and enables the rapid provision of solutions for a wide range of cryptographic applications.
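
As a rough illustration of the kind of finite field arithmetic such processors accelerate, the sketch below multiplies two elements of a binary field GF(2^m) with a carry-free multiply followed by reduction. The choice of m = 283 and the pentanomial is an assumption for the example (a commonly used reduction polynomial); the thesis's actual field parameters are not given in the abstract.

```python
def gf2m_mul(a: int, b: int, m: int, f: int) -> int:
    """Multiply field elements a, b in GF(2^m), reducing by the irreducible polynomial f.

    Elements and polynomials are encoded as integers whose bits are coefficients.
    In hardware this carry-free multiply-and-reduce is what a pairing processor
    parallelizes; here it is done serially for clarity."""
    # Carry-free (XOR-based) polynomial multiplication.
    product = 0
    for i in range(m):
        if (b >> i) & 1:
            product ^= a << i
    # Reduce the (2m - 1)-bit product modulo f.
    for i in range(2 * m - 2, m - 1, -1):
        if (product >> i) & 1:
            product ^= f << (i - m)
    return product

# Example field: GF(2^283) with the pentanomial x^283 + x^12 + x^7 + x^5 + 1 (assumed).
M = 283
F = (1 << 283) | (1 << 12) | (1 << 7) | (1 << 5) | 1
x, y = 0x1234567, 0x89ABCDE
assert gf2m_mul(x, 1, M, F) == x                       # 1 is the multiplicative identity
assert gf2m_mul(x, y, M, F) == gf2m_mul(y, x, M, F)    # multiplication is commutative
```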

Relevance: 90.00%

Abstract:

Knowledge organization (KO) research is a field of scholarship concerned with the design, study and critique of the processes of organizing and representing documents that societies see as worthy of preserving (Tennis, 2008). In this context we are concerned with the relationship between language and action.

On the one hand, we are concerned with what language can and does do for our knowledge organization systems (KOS). For example, how do the words NEGRO or INDIAN work in historical and contemporary indexing languages? In relation to this, we are also concerned with how we know about knowledge organization (KO) and its languages. On the other hand, we are concerned with how to act given this knowledge. That is, how do we carry out research and how do we design, implement, and evaluate KO systems?

It is important to consider these questions in the context of our work because we are delegated by society to disseminate cultural memory. We are endowed with a perspective, prepared by an education, and granted positions whereby society asks us to ensure that documentary material is accessible to future generations. There is a social value in our work, and as such there is a social imperative to our work. We must act with good conscience, and use language judiciously, for the memory of the world is a heavy burden.

In this paper, I explore these two weights of language and action that bear down on KO researchers. I first summarize what the extant literature says about the knowledge claims we make with regard to KO practices and systems. To make it clear what it is that I think we know, I create a schematic that links claims (language) to actions in advising, implementing, or evaluating information practices and systems.

I will then contrast this with what we do not know, that is, what the unanswered questions might be (Gnoli, 2008; Dahlberg, 2011), and I will discuss them in relation to the two weights in our field of KO.

Further, I will try to provide a systematic overview of possible ways to address these open questions in KO research. I will draw on the concept of elenchus, that is, the forms of epistemology, theory, and methodology in KO (Tennis, 2008), and on framework analysis, which covers the structures, work practices, and discourses of KO systems (Tennis, 2006). In so doing, I will argue for a Neopragmatic stance on the weight of language and action in KO (Rorty, 1982; 2000). I will close by addressing the lacuna left in Neopragmatic thought: the ethical imperative to use language and action in a particular, good and moral way. That is, I will address the ethical imperative of KO given its weights, epistemologies, theories, and methods. To do this, I will review a sample of relevant work on deontology in both western and eastern philosophical schools (e.g., Harvey, 1995).

The perspective I want to communicate in this section is that the good in carrying out KO research may begin with epistemic stances (cf. language), but ultimately stands on ethical actions. I will present an analysis describing the micro and macro ethical concerns in relation to KO research and its advice on practice. I hope this demonstrates that the direction of epistemology, theory, and methodology in KO, while burdened with the dual weights of language and action, is clear when provided an ethical sounding board. We know how to proceed when we understand how our work can benefit the world.

KO is an important, if not always understood, division of labor in a society that values its documentary heritage and memory institutions. Being able to do good requires us to understand how to balance the weights of language and action. We must understand where we stand and be able to chart a path forward, one that does not cause harm, but adds value to the world and those that want to access recorded knowledge.

Relevance: 90.00%

Abstract:

Knowledge organization in the networked environment is guided by standards. Standards in knowledge organization are built on principles. For example, NISO Z39.19-1993 Guide to the Construction of Monolingual Thesauri (now undergoing revision) and NISO Z39.85-2001 Dublin Core Metadata Element Set are two standards used in many implementations. Both of these standards were crafted with knowledge organization principles in mind. It is therefore standards work guided by knowledge organization principles which can affect the design of information services and technologies. This poster outlines five threads of thought that inform knowledge organization principles in the networked environment. An understanding of each of these five threads informs system evaluation. The evaluation of knowledge organization systems should be tightly linked to a rigorous understanding of the principles of construction. Thus some foundational evaluation questions grow from an understanding of standards and principles: on what principles is this knowledge organization system built? How well does this implementation meet the ideal conceptualization of those principles? How does this tool compare to others built on the same principles?

Relevance: 90.00%

Abstract:

The work of knowledge organization requires a particular set of tools. For instance, we need standards of content description like the Anglo-American Cataloguing Rules, 2nd edition, Resource Description and Access (RDA), Cataloging Cultural Objects, and Describing Archives: A Content Standard. When we intellectualize the process of knowledge organization, that is, when we do basic theoretical research in knowledge organization, we need another set of tools. For this latter exercise we need constructs. Constructs are ideas with many conceptual elements, largely considered subjective. They allow us to be inventive, and to see a particular point of view in knowledge organization. For example, Patrick Wilson's ideas of exploitative control and descriptive control, or S. R. Ranganathan's fundamental categories, are constructs. They allow us to identify functional requirements, or operationalizations of functional requirements, or at least come close to them, for our systems and schemes. They also allow us to carry out meaningful evaluation.

What is even more interesting, from a research point of view, is that constructs, once offered to the community, can be contested and reinterpreted, and this has an effect on how we view knowledge organization systems and processes. Fundamental categories are again a good example, in that some members of the Classification Research Group (CRG) argued against Ranganathan's point of view. The CRG posited more fundamental categories than Ranganathan's five of Personality, Matter, Energy, Space, and Time (Ranganathan, 1967); the CRG needed significantly more fundamental categories for their work.1 And these are just two voices in this space: we can also consider the fundamental categories of Johannes Kaiser (1911), Shera and Egan, Barbara Kyle (Vickery, 1960), and Eric de Grolier (1962). We can also reference contemporary work that continues the comparison and analysis of fundamental categories (e.g., Dousa, 2011).

In all these cases we are discussing a construct. The fundamental category is not discovered; it is constructed by a classificationist. This is done because it is useful in engaging in the act of classification. And while we are accustomed to using constructs, or debating their merit in one knowledge organization activity or another, we have not analyzed their structure, nor have we created a typology. In an effort to probe the epistemological dimension of knowledge organization, we think it would be a fruitful exercise to do this, because we might benefit from clarity around not only our terminology, but the manner in which we talk about our terminology. We are all creative workers examining what is available to us, but doing so through particular lenses (constructs), identifying particular constructs. Knowing these constructs, and being able to refer to them, is what we would consider a core competency for knowledge organization researchers.

Relevance: 90.00%

Abstract:

The purpose of this monograph is to analyze the influence of globalization as a worldwide process, and of neoliberalism as an economic policy, on the definition of educational policies. Its objective is to analyze how the concepts of university autonomy and democracy in the Colombian public university have been modified, within the framework of globalization and through competency-based education, from 1992 to 2013. Starting from a conceptual approximation of the terms university autonomy and democracy, built on theoretical and historical elements, it analyzes how the current Colombian educational system and competency-based education have modified the concepts of university autonomy and democracy in the Colombian public university.

Relevance: 80.00%

Abstract:

Faced with the difficulty of propagating and synthesizing information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach demonstrate a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules for mechanism layout sketching based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20-fold increase in design efficacy with these enhancement modules in the GPAL system on a general 3D CAD platform.
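
The following sketch is a loose, assumed illustration of a knowledge-based function-to-form mapping with a default fallback in the spirit of default reasoning; the function names, candidate forms, and rules are invented for the example and are not the GPAL system's actual knowledge base.

```python
# Hypothetical illustration: explicit rules map required functions to candidate forms,
# and a default rule supplies a placeholder form when no specific knowledge applies.

FUNCTION_TO_FORMS = {
    "transmit_rotation": ["gear_pair", "belt_drive", "chain_drive"],
    "convert_rotation_to_translation": ["rack_and_pinion", "lead_screw", "cam_follower"],
    "grip_object": ["parallel_jaw", "three_finger_chuck"],
}

DEFAULT_FORM = "undetermined_placeholder"  # to be resolved later in embodiment design

def map_functions_to_forms(functions, constraints=None):
    """Return one candidate form per required function.

    A constraint is a predicate over a form name (e.g. excluding forms that fail some
    green-design check); if every specific candidate is ruled out or no rule exists,
    fall back to the default, mirroring the default-logic flavour described above."""
    constraints = constraints or []
    layout = {}
    for fn in functions:
        candidates = [c for c in FUNCTION_TO_FORMS.get(fn, [])
                      if all(ok(c) for ok in constraints)]
        layout[fn] = candidates[0] if candidates else DEFAULT_FORM
    return layout

# Example: a simple mechanical-hand style requirement with one exclusion constraint.
required = ["grip_object", "transmit_rotation", "hover_in_place"]
no_belts = lambda form: form != "belt_drive"
print(map_functions_to_forms(required, constraints=[no_belts]))
# -> grip_object: parallel_jaw, transmit_rotation: gear_pair, hover_in_place: placeholder
```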

Relevance: 80.00%

Abstract:

Information and communication technologies (ICTs) have established their place in knowledge management and are now evolving towards an era of self-intelligence (Klosterman, 2001). In the 21st century, ICTs for urban development and planning are imperative for improving the quality of life and place. This includes the management of traffic, waste, electricity, sewerage and water quality; the monitoring of fire and crime; the conservation of renewable resources; and the coordination of urban policies and programs for urban planners, civil engineers, and government officers and administrators. Handling tasks in the field of urban management often requires complex, interdisciplinary knowledge as well as profound technical information. Most of this information has been compiled during the last few years in the form of manuals, reports, databases, and programs. Frequently, however, these information resources and services are either not known or not readily available to the people who need them. To provide urban administrators and the public with comprehensive information and services, various ICTs are being developed. In the early 1990s Mark Weiser (1993) proposed the Ubiquitous Computing project at the Xerox Palo Alto Research Center in the US. He provided a vision of a built environment in which digital networks link individual residents not only to other people but also to goods and services whenever and wherever they need them (Mitchell, 1999). Since then the Republic of Korea (ROK) has continuously developed national strategies for knowledge-based urban development (KBUD) through the agendas of Cyber Korea, E-Korea and U-Korea. Among these agendas, the U-Korea agenda in particular aims at the convergence of ICTs and urban space for prosperous urban and economic development. U-Korea strategies create a series of U-cities based on ubiquitous computing and ICTs by providing ubiquitous city (U-city) infrastructure and services in urban space. The goal of U-city development is not only to boost the national economy but also to create value in knowledge-based communities. It provides an opportunity for central and local governments to collaborate on U-city projects, optimize information utilization, and minimize regional disparities. This chapter introduces the Korean-led U-city concept, planning and design schemes, and management policies, and discusses the implications of the U-city concept in planning for KBUD.

Relevance: 80.00%

Abstract:

Building on the strengths of its popular previous edition, Management: Core Concepts and Applications, 2nd Australasian edition has been thoroughly revised and updated to reflect the three key aspects of contemporary undergraduate introductory management education: management theory, concept application and skills development. The text's 16 chapters are presented in a lively and concise manner, ideal for the typical 12 or 13 teaching weeks of a semester. Its flexible framework allows instructors to teach students through the use of interactive tools such as case studies, exercises and projects. These action-oriented learning activities complement the text's solid foundation of knowledge-based theory material. There is balanced coverage of both small to medium sized enterprises and larger multinational organisations operating in Australia, New Zealand and the Asia-Pacific region. A critical thinking perspective is integrated throughout the book, asking and encouraging students to analyse the theory in light of real-world examples. Each copy of the printed textbook comes with a free copy of the Wiley Desktop Edition: a full electronic version of the text that allows students to easily search for key concepts, create their own colour-coded highlights and make electronic notes in the text for revision. Key themes of the text include: the importance of ethical and socially responsible management; recognition of the continuing need to cater for the increasing diversity of the workforce; the importance of managing people, technology, knowledge and quality in achieving organisational goals; and an appreciation of the challenges and opportunities presented by the ever-changing environment in which contemporary managers operate.

Relevance: 80.00%

Abstract:

Successful product innovation and the ability of companies to continuously improve their innovation processes are rapidly becoming essential requirements for competitive advantage and long-term growth in both manufacturing and service industries. It is now recognized that companies must develop innovation capabilities across all stages of the product development, manufacture, and distribution cycle. These Continuous Product Innovation (CPI) capabilities are closely associated with a company's knowledge management systems and processes, and companies must develop mechanisms to continuously improve these capabilities over time. Using the results of an international survey on CPI practices, sets of companies are identified by similarities in specific contingencies related to the complexity of their products, processes, technologies, and customer interfaces. Differences between the learning behaviors present in the company groups, and in the levers used to develop and support these behaviors, are identified and discussed. The paper also discusses appropriate mechanisms for firms with similar complexities, and some approaches they can use to improve their organizational learning and product innovation.

Relevance: 80.00%

Abstract:

Data generated in a normal gravity environment are often used in design and risk assessment for reduced gravity applications. It has been clearly demonstrated that this is a conservative approach for non-metallic materials, which have repeatedly been shown to be less flammable in a reduced gravity environment. However, recent work has demonstrated that this is not true for metallic materials. This work, conducted in a newly completed drop tower, observed a significant increase in both lowest burn pressure and burn rate in reduced gravity. Hence the normal gravity qualification of a metallic material's lowest burn pressure or burn rate for reduced-gravity or space-based systems is clearly not conservative. This paper presents a summary of this work and the results obtained for several metallic materials showing increased flammability and burn rates over a range of oxygen pressures, and discusses the implications of this work for the fire safety of space-based systems.

Relevance: 80.00%

Abstract:

Web service composition is an important problem in web service based systems. It concerns how to build a new value-added web service using existing web services. A web service may have many implementations, all of which have the same functionality but may have different QoS values. Thus, a significant research problem in web service composition is how to select an implementation for each web service such that the composite web service gives the best overall performance. This is the so-called optimal web service selection problem. There may be mutual constraints between some web service implementations. Sometimes when an implementation is selected for one web service, a particular implementation for another web service must also be selected; this is the so-called dependency constraint. Sometimes when an implementation for one web service is selected, a set of implementations for another web service must be excluded from the composition; this is the so-called conflict constraint. Thus, from a computational point of view, optimal web service selection is a typical constrained combinatorial optimization problem. This paper proposes a new hybrid genetic algorithm for the optimal web service selection problem. The hybrid genetic algorithm has been implemented and evaluated, and the evaluation results show that it outperforms two other existing genetic algorithms when the number of web services and the number of constraints are large.
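
As a hedged illustration of the selection problem (not the paper's hybrid genetic algorithm), the sketch below encodes one implementation choice per service as a chromosome and handles dependency and conflict constraints with fitness penalties; the QoS values and constraints are made up for the example.

```python
import random

# Illustrative problem instance: qos[i][j] is the utility of implementation j of service i.
qos = [
    [0.9, 0.6, 0.7],
    [0.5, 0.8],
    [0.4, 0.9, 0.6, 0.7],
]
# Dependency: choosing impl a for service s forces impl b for service t.
dependencies = [((0, 1), (1, 0))]
# Conflict: choosing impl a for service s forbids impl b for service t.
conflicts = [((2, 1), (0, 0))]

PENALTY = 10.0

def fitness(chromosome):
    """Total QoS utility minus penalties for violated dependency/conflict constraints."""
    score = sum(qos[s][impl] for s, impl in enumerate(chromosome))
    for (s, a), (t, b) in dependencies:
        if chromosome[s] == a and chromosome[t] != b:
            score -= PENALTY
    for (s, a), (t, b) in conflicts:
        if chromosome[s] == a and chromosome[t] == b:
            score -= PENALTY
    return score

def random_chromosome():
    return [random.randrange(len(impls)) for impls in qos]

def crossover(p1, p2):
    cut = random.randrange(1, len(qos))
    return p1[:cut] + p2[cut:]

def mutate(chromosome, rate=0.1):
    return [random.randrange(len(qos[s])) if random.random() < rate else g
            for s, g in enumerate(chromosome)]

def evolve(generations=200, pop_size=30):
    """Plain elitist GA: keep the best half, refill with mutated offspring."""
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print(best, fitness(best))
```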