932 results for Knowledge Information Objects
Abstract:
Effects of context on the perception of, and incidental memory for, real-world objects have predominantly been investigated in younger individuals, under conditions involving a single static viewpoint. We examined the effects of prior object context and object familiarity on both older and younger adults' incidental memory for real objects encountered while they traversed a conference room. Recognition memory for context-typical and context-atypical objects was compared with a third group of unfamiliar objects that were not readily named and that had no strongly associated context. Both older and younger adults demonstrated a typicality effect, showing significantly lower 2-alternative-forced-choice recognition of context-typical than context-atypical objects; for these objects, the recognition of older adults either significantly exceeded, or numerically surpassed, that of younger adults. Awareness of testing elevated recognition but did not interact with age or with object type. Older adults showed significantly higher recognition for context-atypical objects than for unfamiliar objects that had no prior strongly associated context. The observation of a typicality effect in both age groups is consistent with preserved processing of semantic schemata in aging. The incidental recognition advantage of older over younger adults for the context-typical and context-atypical objects may reflect aging-related differences in goal-related processing, with older adults under comparatively more novel circumstances being more likely to direct their attention to the external environment, or age-related differences in top-down effortful distraction regulation, with older individuals' attention more readily captured by salient objects in the environment. Older adults' reduced recognition of unfamiliar objects compared to context-atypical objects may reflect possible age differences in contextually driven expectancy violations. The latter finding underscores the theoretical and methodological value of including a third type of object, comparatively neutral with respect to its contextual associations, to help differentiate between contextual integration effects (for schema-consistent objects) and expectancy violations (for schema-inconsistent objects).
Abstract:
A Low-Level Geographic Information System (LL-GIS) was developed to provide a simple, low-cost mapping program that can run on any personal computer and be used by individuals with varying levels of computing knowledge. MAPPER is an add-on module of FishBase, a global database with key information on the biology of fish, where it creates on-screen maps with information on biodiversity and the occurrence of species. In another application, MAPPER is used to display and analyze geographical information on the Philippines.
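To illustrate the kind of output such a low-level mapping module produces, here is a minimal sketch that plots species-occurrence records on a latitude/longitude scatter map. The records, field names, and the use of matplotlib are illustrative assumptions, not part of MAPPER or FishBase.

```python
# Minimal sketch of an occurrence map, assuming matplotlib is installed.
# The records below are hypothetical examples, not FishBase data.
import matplotlib.pyplot as plt

occurrences = [
    {"species": "Oreochromis niloticus", "lat": 14.6, "lon": 121.0},
    {"species": "Oreochromis niloticus", "lat": 10.3, "lon": 123.9},
    {"species": "Chanos chanos", "lat": 13.4, "lon": 122.5},
]

fig, ax = plt.subplots()
for species in {rec["species"] for rec in occurrences}:
    pts = [rec for rec in occurrences if rec["species"] == species]
    ax.scatter([p["lon"] for p in pts], [p["lat"] for p in pts], label=species)

ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.legend()
plt.show()
```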
Abstract:
The population of belugas, Delphinapterus leucas, in Cook Inlet, Alaska, is geographically isolated and appears to be declining. Conservation efforts require appropriate information about population levels and trends, feeding and behavior, reproduction, and natural and anthropogenic impacts. This study documents traditional ecological knowledge of the Alaska Native hunters of belugas in Cook Inlet to add information from this critical source. Traditional knowledge about belugas has been documented elsewhere by the author, and the same methods were used in Cook Inlet to systematically gather information concerning knowledge of the natural history of this beluga population and its habitat. The hunters' knowledge is largely consistent with what is known from previous research, and it extends the published descriptions of the ecology of beluga whales in Cook Inlet. Making this information available and involving the hunters to a greater extent in research and management are important contributions to the conservation of Cook Inlet belugas.
Abstract:
In western civilization, the knowledge of the elasmobranch or selachian fishes (sharks and rays) begins with Aristotle (384–322 B.C.). Two of his extant works, the “Historia Animalium” and the “Generation of Animals,” both written about 330 B.C., demonstrate knowledge of elasmobranch fishes acquired by observation. Roman writers of works on natural history, such as Aelian and Pliny, who followed Aristotle, were compilers of available information. Their contribution was that they prevented the Greek knowledge from being lost, but they added few original observations. The fall of Rome, around 476 A.D., brought a period of economic regression and political chaos. These in turn brought intellectual thought to a standstill for nearly one thousand years, the period known as the Dark Ages. It would not be until the middle of the sixteenth century, well into the Renaissance, that knowledge of elasmobranchs would advance again. The works of Belon, Salviani, Rondelet, and Steno mark the beginnings of ichthyology, including the study of sharks and rays. The knowledge of sharks and rays increased slowly during and after the Renaissance, and the introduction of the Linnaean System of Nomenclature in 1735 marks the beginning of modern ichthyology. However, the first major work on sharks would not appear until the early nineteenth century. Knowledge acquired about sea animals usually follows their economic importance and exploitation, and this was also true with sharks. The first to learn about sharks in North America were the native fishermen, who learned how, when, and where to catch them for food or for their oils. The early naturalists in America studied the land animals and plants; they had little interest in sharks. When faunistic works on fishes started to appear, naturalists just enumerated the species of sharks that they could discern. Throughout the U.S. colonial period, sharks were seldom utilized for food, although their liver oil or skins were often utilized. Throughout the nineteenth century, the Spiny Dogfish, Squalus acanthias, was the only shark species utilized on a large scale on both coasts. It was fished for its liver oil, which was used as a lubricant and for lighting and tanning, and for its skin, which was used as an abrasive. During the early part of the twentieth century, the Ocean Leather Company was started to process sea animals (primarily sharks) into leather, oil, fertilizer, fins, etc. The Ocean Leather Company enjoyed a monopoly on the shark leather industry for several decades. In 1937, the liver of the Soupfin Shark, Galeorhinus galeus, was found to be a rich source of vitamin A, and because the outbreak of World War II in 1939 interrupted the shipping of vitamin A from European sources, an intensive shark fishery soon developed along the U.S. West Coast. By 1939 the American shark leather fishery had transformed into the shark liver oil fishery of the early 1940’s, encompassing both coasts. By the late 1940’s, these fisheries were depleted because of overfishing and fishing in the nursery areas. Synthetic vitamin A appeared on the market in 1950, causing the fishery to be discontinued. During World War II, shark attacks on the survivors of sunken ships and downed aviators engendered the search for a shark repellent. This led to research aimed at understanding shark behavior and the sensory biology of sharks. From the late 1950’s to the 1980’s, funding from the Office of Naval Research was responsible for most of what was learned about the sensory biology of sharks.
Abstract:
The Internet of Things (IOT) concept and enabling technologies such as RFID offer the prospect of linking the real world of physical objects with the virtual world of information technology to improve visibility and traceability information within supply chains and across the entire lifecycles of products, as well as enabling more intuitive interactions and greater automation possibilities. There is a huge potential for savings through process optimization and profit generation within the IOT, but the sharing of financial benefits across companies remains an unsolved issue. Existing approaches towards sharing of costs and benefits have failed to scale so far. The integration of payment solutions into the IOT architecture could solve this problem. We have reviewed different possible levels of integration and researched multiple payment solutions. Finally, we have developed a model that meets the requirements of the IOT in relation to openness and scalability. It supports both hardware-centric and software-centric approaches to integration of payment solutions with the IOT. Different requirements concerning payment solutions within the IOT have been defined and considered in the proposed model. Possible solution providers include telcos, e-payment service providers and new players such as banks and standardization bodies. The proposed model of integrating the Internet of Things with payment solutions will lower the barrier to invoicing for the more granular visibility information generated using the IOT. Thus, it has the potential to enable recovery of the necessary investments in IOT infrastructure and accelerate adoption of the IOT, especially for projects that are only viable when benefits arising at multiple points in the supply chain are accumulated in order to achieve a Return on Investment (ROI). In a long-term perspective, it may enable IT departments to become profit centres instead of cost centres. © 2010 IOS Press and the authors. All rights reserved.
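One way to picture the general idea of coupling payment with IOT traceability is a pluggable payment interface invoked per delivered visibility record, so different providers (telco, e-payment, bank) can be swapped in. The sketch below is a hypothetical illustration under that assumption; the class and method names are invented and do not come from the paper's model.

```python
# Hypothetical sketch: micro-invoicing each delivered traceability record
# through a pluggable payment provider. All names are illustrative assumptions.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class TraceEvent:
    object_id: str   # e.g. an RFID/EPC identifier
    location: str
    consumer: str    # party receiving the visibility information


class PaymentProvider(ABC):
    """Abstract provider so telco, e-payment or bank back-ends can be swapped in."""

    @abstractmethod
    def charge(self, payer: str, amount: float, reference: str) -> None: ...


class LoggingProvider(PaymentProvider):
    def charge(self, payer: str, amount: float, reference: str) -> None:
        print(f"charged {payer} {amount:.4f} EUR for {reference}")


def bill_visibility(event: TraceEvent, provider: PaymentProvider, price: float = 0.001) -> None:
    # Each delivered traceability record triggers a small charge to its consumer.
    provider.charge(event.consumer, price, f"{event.object_id}@{event.location}")


bill_visibility(TraceEvent("urn:epc:id:sgtin:0614141.107346.2017", "DC-Hamburg", "RetailerA"),
                LoggingProvider())
```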
Abstract:
Our knowledge regarding ethno-medico-zoology is scanty and scattered. The present work is an endeavour to collect information on indigenous traditional knowledge (ITK) of disease cures through fish consumption, to prepare a consolidated report on this aspect, and to document this ITK so that, in the long run and after due verification by medical experts, it can be patented. We also suggest recognition of age-old tribal medicine and the establishment of a national research institute for tribal medicines at a suitable location for the welfare of all human beings.
Abstract:
RFID is a technology that enables the automated capture of observations of uniquely identified physical objects as they move through supply chains. Discovery Services provide links to repositories that have traceability information about specific physical objects. Each supply chain party publishes records to a Discovery Service to create such links and also specifies access control policies to restrict who has visibility of link information, since it is commercially sensitive and could reveal inventory levels, flow patterns, trading relationships, etc. The requirement of being able to share information on a need-to-know basis, e.g. within the specific chain of custody of an individual object, poses a particular challenge for authorization and access control: in many supply chain situations the information owner might not know in advance all the companies that should be authorized to view the information, because the path taken by an individual physical object only emerges over time rather than being fully pre-determined at the time of manufacture. This led us to consider novel approaches to delegating trust and controlling access to information. This paper presents an assessment of visibility restriction mechanisms for Discovery Services capable of handling emergent object paths. We compare three approaches: enumerated access control (EAC), chain-of-communication tokens (CCT), and chain-of-trust assertions (CTA). A cost model was developed to estimate the additional cost of restricting visibility in a baseline traceability system, and the estimates were used to compare the approaches and to discuss the trade-offs. © 2012 IEEE.
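To make the contrast between enumerated access control and the token-based approach concrete, here is a minimal bearer-token sketch in the spirit of chain-of-communication tokens: the publisher attaches a capability token to its record, and whoever receives the token along the chain of custody can resolve the link, without the publisher enumerating authorized parties up front. The scheme, class names, and identifiers are illustrative assumptions, not the protocol assessed in the paper.

```python
# Hedged sketch of the bearer-token idea behind chain-of-communication tokens (CCT).
# Names and the exact scheme are assumptions, not the paper's mechanism.
import secrets


class DiscoveryService:
    def __init__(self) -> None:
        self._records: dict[str, tuple[str, str]] = {}   # token -> (object_id, link)

    def publish(self, object_id: str, link: str) -> str:
        # The publishing party receives a capability token, which it forwards,
        # together with the physical object, to the next party in the chain.
        token = secrets.token_hex(16)
        self._records[token] = (object_id, link)
        return token

    def lookup(self, token: str):
        # Any party that has received the token along the chain of custody can
        # resolve the link; parties outside the chain never see the token.
        return self._records.get(token)


ds = DiscoveryService()
token = ds.publish("urn:epc:id:sgtin:0614141.107346.2017",
                   "https://repo.example.com/events/2017")
# token travels Manufacturer -> Carrier -> Retailer along with the object
print(ds.lookup(token))
```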
Abstract:
The Internet has enabled the creation of a growing number of large-scale knowledge bases in a variety of domains containing complementary information. Tools for automatically aligning these knowledge bases would make it possible to unify many sources of structured knowledge and answer complex queries. However, the efficient alignment of large-scale knowledge bases still poses a considerable challenge. Here, we present Simple Greedy Matching (SiGMa), a simple algorithm for aligning knowledge bases with millions of entities and facts. SiGMa is an iterative propagation algorithm that leverages both the structural information from the relationship graph and flexible similarity measures between entity properties in a greedy local search, thus making it scalable. Despite its greedy nature, our experiments indicate that SiGMa can efficiently match some of the world's largest knowledge bases with high precision. We provide additional experiments on benchmark datasets which demonstrate that SiGMa can outperform state-of-the-art approaches both in accuracy and efficiency.
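As a rough illustration of the greedy-propagation idea (not the authors' implementation), the sketch below seeds a candidate queue with property-similarity scores, greedily accepts the best unmatched pair, and then re-scores the neighbours of each new match using both property and structural agreement. The toy knowledge bases, similarity functions, and weights are assumptions for illustration.

```python
# Simplified sketch of greedy knowledge-base alignment in the spirit of SiGMa.
# Data structures, similarity measures and weights are illustrative assumptions.
import heapq

# Each toy KB: entity -> (set of name tokens, set of neighbour entities)
kb1 = {
    "Paris_fr":  ({"paris", "france"}, {"France_fr"}),
    "France_fr": ({"france"},          {"Paris_fr"}),
}
kb2 = {
    "Paris":  ({"paris", "city"}, {"France"}),
    "France": ({"france"},        {"Paris"}),
}

def prop_sim(e1, e2):
    a, b = kb1[e1][0], kb2[e2][0]
    return len(a & b) / len(a | b)                # Jaccard over name tokens

def struct_sim(e1, e2, aligned):
    # Fraction of e1's neighbours already aligned to a neighbour of e2.
    n1, n2 = kb1[e1][1], kb2[e2][1]
    if not n1:
        return 0.0
    return sum(1 for n in n1 if aligned.get(n) in n2) / len(n1)

def score(e1, e2, aligned, alpha=0.5):
    return alpha * prop_sim(e1, e2) + (1 - alpha) * struct_sim(e1, e2, aligned)

aligned = {}
used2 = set()
# Candidate queue over all pairs (fine for a toy example; SiGMa restricts candidates).
heap = [(-prop_sim(a, b), a, b) for a in kb1 for b in kb2]
heapq.heapify(heap)

while heap:
    _, a, b = heapq.heappop(heap)
    if a in aligned or b in used2:
        continue
    aligned[a] = b
    used2.add(b)
    # Propagation step: re-score neighbours of the new match and push them back.
    for na in kb1[a][1]:
        for nb in kb2[b][1]:
            if na not in aligned and nb not in used2:
                heapq.heappush(heap, (-score(na, nb, aligned), na, nb))

print(aligned)   # {'France_fr': 'France', 'Paris_fr': 'Paris'}
```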
Abstract:
Design rationale is an effective way of capturing knowledge, since it records the issues addressed, the options considered, and the arguments used when specific decisions are made during the design process. Design rationale is generally captured by identifying elements and their dependencies, i.e. in a structured way. Current retrieval methods focus mainly on either the classification of rationale or on keyword-based searches of records. Keyword-based retrieval is reasonably effective as the information in design rationale records is mainly described using text. However, most current keyword-based retrieval methods discard the implicit structure of these records, resulting either in poor retrieval precision or in isolated pieces of information that are difficult to understand. This ongoing research aims to go beyond keyword-based retrieval by developing methods and tools to facilitate the provision of useful design knowledge in new design projects. Our first step is to understand the structured information derived from the relationships between chunks of text held in different nodes of the design rationale captured via a software tool currently used in industry, and to study how this information can be utilised to improve retrieval performance. Specifically, methods for utilising various kinds of structured information are developed and implemented on a prototype keyword-based retrieval system developed in our earlier work. The implementation and evaluation of these methods show that the structured information can be utilised in a number of ways, such as filtering the results and providing more complete information. This allows the retrieval system to present results that are easy to understand and that closely match designers' queries. Like design rationale, other methods for representing design knowledge in essence also involve structured information, and thus the methods proposed can be generalised, adapted, and applied to the retrieval of other kinds of design knowledge. Copyright © 2002-2012 The Design Society. All rights reserved.
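One simple way to picture how the implicit structure can complement keyword search: retrieve nodes by keyword, then pull in the nodes they are linked to so each hit is returned with its surrounding issue/option/argument context. The sketch below is a hypothetical illustration of that idea, not the prototype system described in the paper; the node contents and link structure are invented.

```python
# Hypothetical sketch: keyword retrieval over design-rationale nodes, expanded
# along the recorded links so each result carries its surrounding context.
nodes = {
    "I1": "Issue: how should the casing be sealed against dust?",
    "O1": "Option: rubber gasket around the lid",
    "O2": "Option: ultrasonic welding of the two shells",
    "A1": "Argument: welding prevents later disassembly for repair",
}
links = {"I1": ["O1", "O2"], "O2": ["A1"]}   # parent -> children

def keyword_hits(query):
    terms = query.lower().split()
    return [nid for nid, text in nodes.items()
            if any(t in text.lower() for t in terms)]

def with_context(hit):
    # Return the hit together with its directly linked nodes (parents and children).
    related = set(links.get(hit, []))
    related |= {parent for parent, children in links.items() if hit in children}
    return {nid: nodes[nid] for nid in {hit, *related}}

for hit in keyword_hits("welding"):
    print(with_context(hit))
```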
Abstract:
The lack of viable methods to map and label existing infrastructure is one of the engineering grand challenges for the 21st century. For instance, over two thirds of the effort needed to geometrically model even simple infrastructure is spent on manually converting a cloud of points to a 3D model. The result is that few facilities today have a complete record of as-built information and that as-built models are not produced for the vast majority of new construction and retrofit projects. This leads to rework and design changes that can cost up to 10% of the installed costs. Automatically detecting building components could address this challenge. However, existing methods for detecting building components are not view- and scale-invariant, or have only been validated in restricted scenarios that require a priori knowledge and do not consider occlusions. This leads to their constrained applicability in complex civil infrastructure scenes. In this paper, we test a pose-invariant method of labeling existing infrastructure. This method simultaneously detects objects and estimates their poses. It takes advantage of a recent novel formulation for object detection and customizes it to generic civil infrastructure scenes. Our preliminary experiments demonstrate that this method achieves convincing recognition results.
Abstract:
Change propagates, potentially affecting many aspects of a design and requiring much rework to implement. This article introduces a cross-domain approach to decompose a design and identify possible change propagation linkages, complemented by an interactive tool that generates dynamic checklists to assess change impact. The approach considers the information domains of requirements, functions, components, and the detail design process. Laboratory experiments using a vacuum cleaner as the case-study product suggest that cross-domain modelling helps analyse a design to create and capture the information required for change prediction. Further experiments using an electronic product show that this information, coupled with the interactive tool, helps to quickly and consistently assess the impact of a proposed change. © 2012 Springer-Verlag London Limited.
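To illustrate how a cross-domain dependency model can drive a change-impact checklist, the sketch below walks outward from a changed element over linkages spanning requirements, functions, components, and process tasks, and groups the potentially affected elements by domain. The graph and element names are invented for illustration and do not come from the article's case studies or tool.

```python
# Hypothetical sketch: breadth-first traversal of cross-domain linkages to build
# a dynamic checklist of elements potentially affected by a proposed change.
from collections import deque

# element -> (domain, elements it is linked to)
model = {
    "REQ: suction power >= 250 W": ("requirement", ["FUNC: generate airflow"]),
    "FUNC: generate airflow":      ("function",    ["COMP: motor", "COMP: impeller"]),
    "COMP: motor":                 ("component",   ["PROC: motor supplier qualification"]),
    "COMP: impeller":              ("component",   []),
    "PROC: motor supplier qualification": ("process", []),
}

def impact_checklist(changed):
    seen, queue = {changed}, deque([changed])
    checklist = {}
    while queue:
        element = queue.popleft()
        for linked in model[element][1]:
            if linked not in seen:
                seen.add(linked)
                queue.append(linked)
                checklist.setdefault(model[linked][0], []).append(linked)
    return checklist

for domain, elements in impact_checklist("REQ: suction power >= 250 W").items():
    print(domain, "->", elements)
```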
Abstract:
During product development, engineering designers raise numerous information requests that lead them to search through human and documentary sources. This paper reports research to characterise, in detail, these requests for designers working in a major aerospace engineering company. The research found that, at a high level, a distinction can be made between requests to acquire information and requests to process information. The former are raised to access design and domain information; the latter are raised to define designs. For researchers, this study extends existing knowledge of information requests by characterising key differences in their nature and explaining how they are used in the design process. For practitioners, these findings can be used as a basis to understand the diversity of information requests and how to channel efforts to support designers in information seeking. In particular, the research indicates that a strategy to support designers should enable the development of engineering communities that share information effectively and the introduction of techniques that facilitate the documentation of information. © 2012 Springer-Verlag London Limited.