862 results for Interactive Video Instruction: A Training Tool Whose Time Has Come
Abstract:
This paper studies trends in the development of enterprise modelling methods, proposes an integrated enterprise modelling method, presents the modelling framework and the associated modelling methods, and designs an integrated enterprise modelling, simulation, and optimization system based on a CORBA software bus. The proposed modelling method and the software tool designed here are of practical value for advancing research on integrated enterprise modelling methodology and for developing modelling and optimization tool systems with independent Chinese intellectual property rights.
Abstract:
Allen's theory of time is widely admired for being intuitive and easy to understand, but it has shortcomings, such as its inability to handle continuously changing events. This paper proposes a more general temporal framework that extends Allen's theory. The framework has three features: (1) it subsumes Allen's theory; (2) time intervals can be constructed from time points, and both intervals and points can be handled; (3) its constraint-propagation algorithm, based on a 2D graphical representation of sets of temporal elements, is both efficient and visualizable.
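The point-based construction in (2) can be illustrated with ordinary numeric intervals. The sketch below is a minimal illustration of the base theory being extended, not the paper's framework: it classifies a pair of intervals into one of Allen's 13 basic relations using endpoint comparisons only.

```python
def allen_relation(i, j):
    """Classify intervals i=(a,b), j=(c,d), with a < b and c < d,
    into one of Allen's 13 basic relations via endpoint comparisons."""
    (a, b), (c, d) = i, j
    if b < c: return "before"
    if d < a: return "after"
    if b == c: return "meets"
    if d == a: return "met-by"
    if a == c and b == d: return "equal"
    if a == c: return "starts" if b < d else "started-by"
    if b == d: return "finishes" if a > c else "finished-by"
    if a > c and b < d: return "during"
    if a < c and b > d: return "contains"
    return "overlaps" if a < c else "overlapped-by"
```

For example, `allen_relation((0, 2), (1, 3))` yields `"overlaps"`. A point can be treated as a degenerate interval only after the kind of extension the paper proposes; this crisp classifier cannot handle it.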
Abstract:
Stimulus-response compatibility is a key concept in human-machine interaction. Mapping stimuli to responses according to the salient-feature coding principle has been shown to produce compatible pairs. In the design of Chinese Pinyin code input devices, stimulus-response compatibility makes a device easier to use and easier to learn. In this study, response times and error rates were measured for two designs based on the salient-feature coding principle and one design with a random mapping, alongside the QWERTY keyboard. Cross-modal compatibility effects were found: the two salient-feature coding designs did not differ significantly in either response time or error rate, but response times did differ between the salient-feature coding designs and the random-mapping design. Compared with the QWERTY keyboard group, the chord keyboard group showed no significant difference in error rates, but subjects assigned to the QWERTY keyboard group had shorter response times. One possible reason is that the chord keyboard subjects were still beginners after at most 6 hours of practice, whereas the QWERTY subjects were at least novices, having taken a foundation computer class at their own college.
Abstract:
Gibbs, N., Getting Constitutional Theory into Proportion: A Matter of Interpretation?, Oxford Journal of Legal Studies, 27 (1), 175-191. RAE2008
Abstract:
In the analysis of relations among the elements of two sets, it is common to obtain different values depending on the point of view from which these relations are measured. The main goal of this paper is to model such situations by means of a generalization of L-fuzzy concept analysis called the L-fuzzy bicontext. We study the L-fuzzy concepts of these L-fuzzy bicontexts and obtain some interesting results; in particular, we are able to classify the biconcepts of an L-fuzzy bicontext. Finally, a practical case is developed using this new tool.
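For readers unfamiliar with the crisp case that L-fuzzy concept analysis generalizes, the following sketch enumerates the formal concepts (extent, intent pairs) of an ordinary binary context by brute force. The function names and the tiny example context are illustrative assumptions, not taken from the paper.

```python
from itertools import combinations

def concepts(objects, attrs, rel):
    """Enumerate all formal concepts (extent, intent) of a crisp context.

    rel is a set of (object, attribute) pairs.  Brute force over attribute
    subsets -- suitable only for tiny illustrative contexts.
    """
    def extent(B):  # objects possessing every attribute in B
        return frozenset(o for o in objects if all((o, a) in rel for a in B))

    def intent(A):  # attributes shared by every object in A
        return frozenset(a for a in attrs if all((o, a) in rel for o in A))

    found = set()
    for r in range(len(attrs) + 1):
        for B in combinations(sorted(attrs), r):
            A = extent(B)            # close the subset: extent, then intent
            found.add((A, intent(A)))
    return found
```

Every concept arises this way because each concept's intent is itself an attribute subset. In the L-fuzzy setting, membership in `rel` is a degree in a lattice L rather than a boolean, and a bicontext carries two such relations over the same sets.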
Abstract:
There is much common ground between the areas of coding theory and systems theory. Fitzpatrick has shown that a Gröbner basis approach leads to efficient algorithms in the decoding of Reed-Solomon codes and in scalar interpolation and partial realization. This thesis simultaneously generalizes and simplifies that approach and presents applications to discrete-time modelling, multivariable interpolation, and list decoding. Gröbner basis theory has come into its own in the context of software and algorithm development. By generalizing the concept of polynomial degree, term orders are provided for multivariable polynomial rings and for free modules over polynomial rings. The orders are not, in general, unique, and this adds, in no small way, to the power and flexibility of the technique. As well as being generating sets for ideals or modules, Gröbner bases always contain an element which is minimal with respect to the corresponding term order. Central to this thesis is a general algorithm, valid for any term order, that produces a Gröbner basis for the solution module (or ideal) of elements satisfying a sequence of generalized congruences. These congruences, based on shifts and homomorphisms, are applicable to a wide variety of problems, including key equations and interpolations. At the core of the algorithm is an incremental step. Iterating this step lends a recursive/iterative character to the algorithm. As a consequence, not all of the input to the algorithm need be available from the start, and different "paths" can be taken to reach the final solution. The existence of a suitable chain of modules satisfying the criteria of the incremental step is a prerequisite for applying the algorithm.
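As a small illustration of the term-order machinery (not the thesis's congruence algorithm), a reduced lexicographic Gröbner basis can be computed with SymPy. For the ideal generated by x² + y and xy − 1, the reduced lex basis is {x + y², y³ + 1}, and the order singles out x + y² as the element with minimal leading term in x.

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Reduced Groebner basis of <x**2 + y, x*y - 1> under lex order with x > y.
# Eliminating x: from x*y = 1 and y = -x**2 one derives x + y**2 = 0 and
# y**3 + 1 = 0, so the reduced basis consists of x + y**2 and y**3 + 1.
gb = groebner([x**2 + y, x*y - 1], x, y, order='lex')
print(gb.exprs)
```

The second basis element lies in the elimination ideal in y alone, which is how lex bases solve triangularization problems; other term orders yield different bases for the same ideal, reflecting the flexibility noted above.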
Abstract:
Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance becomes an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, a MAC protocol that can switch from energy-efficient operation in normal monitoring to reliable and fast delivery for emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet deliveries from all sensor nodes. Topology design supports fault-tolerance by ensuring that there are alternative acceptable routes to data sinks when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes – those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays with minimal cost to make the network double-covered and non-critical, i.e. all sensor nodes must have length-bounded alternative paths to sinks when an arbitrary sensor node fails.
We then evaluate the fault-tolerance of each topology in data gathering simulations using ER-MAC.
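The (k,l)-sink-connectivity requirement can be made concrete with a simple heuristic: repeatedly extract a shortest path of at most l edges by BFS and block its internal vertices. This is a hedged sketch of the underlying idea, offered as a greedy lower bound, not the thesis's Counting-Paths algorithm; the graph representation and function name are assumptions for illustration.

```python
from collections import deque

def count_disjoint_paths(adj, src, dst, max_len):
    """Greedily count vertex-disjoint src->dst paths of at most max_len edges.

    Each round, a BFS finds a shortest surviving path; its internal vertices
    are then blocked.  Greedy extraction can undercount the true maximum,
    so the result is only a lower bound.
    """
    blocked = set()   # internal vertices already used by an accepted path
    banned = set()    # a direct src->dst edge may be used at most once
    count = 0
    while True:
        parent = {src: None}
        q = deque([(src, 0)])
        while q:
            v, depth = q.popleft()
            if v == dst:
                break                       # shortest surviving path found
            if depth == max_len:
                continue                    # extending would exceed the bound
            for w in adj.get(v, []):
                if w in parent or w in blocked or (v, w) in banned:
                    continue
                parent[w] = v
                q.append((w, depth + 1))
        else:
            return count                    # BFS exhausted: no path remains
        v = parent[dst]
        if v == src:
            banned.add((src, dst))          # direct edge consumed
        while v != src:
            blocked.add(v)                  # enforce vertex-disjointness
            v = parent[v]
        count += 1
```

A node is then (k,l)-sink-connected if such a count, taken against the sink set, reaches k. Exact counting requires a flow-style argument, which is where a dedicated algorithm earns its keep.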
Abstract:
The insider threat is a well-known security problem with a long history, yet it remains an invisible enemy. Insiders know the security processes and hold access rights that allow them to cover their tracks easily. In recent years the idea of monitoring separately for these threats has come into its own. However, the tools currently in use have disadvantages, and the most effective technique, human review, is costly. This paper explores the development of an intelligent agent that uses already in-place computing resources for inference, as an inexpensive monitoring tool for insider threats. Design Science Research (DSR) is the methodology used to explore and develop an IT artifact such as this intelligent agent. The methodology provides a structure that can guide a deep search method for problems that may not be solvable outright or that could contribute to a phenomenological instantiation.
Abstract:
Among the signal developments of the last third of the twentieth century has been the emergence of a new politics of human rights. The transnational circulation of norms, networks, and representations has advanced human rights claims in ways that have reshaped global practices. Just as much as the transnational flow of capital, the new human rights politics are part of the phenomenon that has come to be termed globalization. Shifting the focus from the sovereignty of the nation to the rights of individuals, regardless of nationality, the interplay between the local and the global in these new human rights claims is fundamentally redrawing the boundaries between the rights of individuals, states, and the international community. Truth Claims brings together for the first time some of the best new work from a variety of disciplinary and geographic perspectives exploring the making of human rights claims and the cultural politics of their representations. All of the essays, whether dealing with the state and its victims, receptions of human rights claims, or the status of transnational rights claims in the era of globalization, explore the potentialities of an expansive humanistic framework. Here, the authors move beyond the terms -- and the limitations -- of the universalism/relativism debate that has so defined existing human rights literature.
Abstract:
This paper considers a variant of the classical problem of minimizing makespan in a two-machine flow shop. In this variant, each job has three operations, where the first operation must be performed on the first machine, the second operation can be performed on either machine but cannot be preempted, and the third operation must be performed on the second machine. The NP-hard nature of the problem motivates the design and analysis of approximation algorithms. It is shown that a schedule in which the operations are sequenced arbitrarily, but without inserted machine idle time, has a worst-case performance ratio of 2. Also, an algorithm that constructs four schedules and selects the best is shown to have a worst-case performance ratio of 3/2. A polynomial time approximation scheme (PTAS) is also presented.
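To make the schedule evaluation concrete, the sketch below computes the makespan of a permutation schedule once each flexible second operation has been assigned to a machine. It is an illustrative helper under assumed conventions (identical job order on both machines), not the paper's ratio-3/2 algorithm or its PTAS.

```python
def makespan(jobs, assign, order):
    """Makespan of a permutation schedule for the three-operation variant.

    jobs[j] = (a, b, c): op1 time on M1, flexible op2 time, op3 time on M2.
    assign[j] in {1, 2}: machine chosen for job j's second operation.
    order: processing sequence, assumed identical on both machines.
    M1 runs its operations back to back; M2 starts each operation as soon
    as both the machine and the job's preceding operation are finished.
    """
    t1 = t2 = 0
    for j in order:
        a, b, c = jobs[j]
        p1 = a + b if assign[j] == 1 else a   # workload routed to M1
        p2 = c if assign[j] == 1 else b + c   # workload routed to M2
        t1 += p1                              # no inserted idle time on M1
        t2 = max(t2, t1) + p2                 # M2 may wait for M1 to finish
    return t2
```

The worst-case ratio of 2 for an arbitrary no-idle sequence follows because such a schedule's makespan is at most the total work on one machine plus one job's delay, while the optimum is at least the larger machine load; enumerating assignments and sequences with this evaluator makes small instances easy to check by hand.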
Abstract:
Executive Summary 1. The Marine Life Information Network (MarLIN) has been developed since 1998. Defra funding has supported a core part of its work, the Biology and Sensitivity Key Information Sub-programme. This report relates to Biology and Sensitivity work for the period 2001-2004. 2. MarLIN Biology and Sensitivity research takes information on the biology of species to identify the likely effects of changing environmental conditions linked to human activities on those species. In turn, species that are key functional, key structural, dominant, or characteristic in a biotope (the habitat and its associated species) are used to identify biotope sensitivity. Results are displayed over the World Wide Web and can be accessed via a range of search tools that make the information of relevance to environmental management. 3. The first Defra contract enabled the development of criteria and methods of research, database storage methods and the research of a wide range of species. A contract from English Nature and Scottish Natural Heritage enabled biotopes relevant to marine SACs to be researched. 4. Defra funding in 2001-2004 has especially enabled recent developments to be targeted for research. Those developments included the identification of threatened and declining species by the OSPAR Biodiversity Committee, the development of a new approach to defining sensitivity (part of the Review of Marine Nature Conservation), and the opportunity to use Geographical Information Systems (GIS) more effectively to link survey data to MarLIN assessments of sensitivity. 5. The MarLIN database has been developed to provide a resource to 'pick-and-mix' information depending on the questions being asked. Using GIS, survey data that provides locations for species and biotopes has been linked to information researched by MarLIN to map the likely sensitivity of an area to a specified factor. 
Projects undertaken for the Irish Sea pilot (marine landscapes), in collaboration with CEFAS (fishing impacts) and with the Countryside Council for Wales (oil spill response) have demonstrated the application of MarLIN information linked to survey data in answering, through maps, questions about likely impacts of human activities on seabed ecosystems. 6. GIS applications that use MarLIN sensitivity information give meaningful results when linked to localized and detailed survey information (lists of species and biotopes as point source or mapped extents). However, broad landscape units require further interpretation. 7. A new mapping tool (SEABED map) has been developed to display data on species distributions and survey data according to search terms that might be used by an environmental manager. 8. MarLIN outputs are best viewed on the Web site where the most up-to-date information from live databases is available. The MarLIN Web site receives about 1600 visits a day. 9. The MarLIN approach to assessing sensitivity and its application to environmental management were presented in papers at three international conferences during the current contract and a 'touchstone' paper is to be published in the peer-reviewed journal Hydrobiologia. The utility of MarLIN information for environmental managers, amongst other sorts of information, has been described in an article in Marine Pollution Bulletin. 10. MarLIN information is being used to inform the identification of potential indicator species for implementation of the Water Framework Directive including initiatives by ICES. 11. Non-Defra funding streams are supporting the updating of reviews and increasing the amount of peer review undertaken; both of which are important to the maintenance of the resource. 
However, whilst MarLIN information is sufficiently wide ranging to be used in an 'operational' way for marine environmental protection and management, new initiatives and the new biotopes classification have introduced additional species and biotopes that will need to be researched in the future. 12. By the end of the contract, the Biology and Sensitivity Key Information database contained full Key Information reviews on 152 priority species and 117 priority biotopes, together with basic information on 412 species; a total of 564 marine benthic species.
Abstract:
The Continuous Plankton Recorder has been sampling the northeast Pacific on a routine basis since 2000. Although this is still a relatively short time series, climate variability within that time has caused noticeable related changes in the plankton. The earlier part of the time series followed the 1999 La Niña, and conditions were cool, but conditions between 2003 and 2005 were anomalously warm. Oceanic zooplankton have responded to this warming in several ways that are discernible in CPR data. The seasonal cycle of mesozooplankton biomass in the eastern Gulf of Alaska has shifted earlier in the spring by a few weeks (sampling resolution is too coarse to be more accurate). The copepod Neocalanus plumchrus/flemingeri is largely responsible, as it makes up a high proportion of the spring surface biomass, and stage-based determinations have shown an earlier maximum in warmer years across much of the northeast Pacific, spanning nearly 20 degrees of latitude. Summer copepod populations are more diverse than in spring, although lower in biomass. The northwards extension of southern taxa in the summer correlates with surface temperature, and in warmer years southern taxa are found further north than in cooler years. These findings support the importance of monitoring the open ocean, particularly as it is an important foraging ground for large fish, birds and mammals. Higher trophic levels may time their reproduction or migration to coincide with the abundance of particular prey, which may be of a different composition and/or lower abundance at a particular time in warmer conditions.
Abstract:
Much of the evidence suggesting that inequalities in health have been increasing over the last two decades has come from studies that compared the changes in relative health status of areas over time. Such studies ignore the movement of people between areas. This paper examines the population movement between small areas in Northern Ireland in the year prior to the 1991 census, as well as the geographical distribution of migrants to Northern Ireland over the same period. It shows that deprived areas tended to become depopulated and that those who left these areas were the more affluent residents. While immigrants differed a little from the indigenous population, the overall effect of their distribution would be to maintain the geographical socio-economic status quo. The selective movement of people between areas would result in the distribution of health and ill-health becoming more polarized, i.e. produce a picture of widening inequalities between areas even though the distribution between individuals is unchanged. These processes suggest potentially significant problems with area-based approaches to monitoring health and inequalities in health.
Abstract:
This is an invited paper for a special issue on pupil voice, focusing on methodological issues arising from the ESRC/TLRP project on consulting pupils about assessment practices in their classrooms. The issue of consulting pupils about assessment has rarely been researched before, and this article illustrates some of the difficulties, tensions and positive outcomes of engaging with students as researchers within a nationally funded (and therefore externally driven), university-based project. The study adds considerably to the body of knowledge in this area by engaging students in the process as researchers in different capacities within the project. Issues discussed include the use of student advisory groups, ethical negotiation, students undertaking videotaped classroom observations, and their subsequent role in co-interpreting video excerpts and visual images. The paper has already attracted considerable interest through the ESRC pupil seminar series forum, and also from a prior paper presented to the Children's Rights SIG of the European Educational Research Association in September 2006 in Switzerland, because of researchers' current interest in embedding democratic principles and practices within research with children and young people.
Abstract:
This book examines credit in working class communities since 1880, focusing on forms of borrowing that were dependent on personal relationships and social networks. It provides an extended historical discussion of credit unions, legal and illegal moneylenders (loan sharks), and looks at the concept of ‘financial exclusion’. Initially, the book focuses on the history of tallymen, check traders, and their eventual movement into moneylending following the loss of their more affluent customers, due to increased spending power and an increasingly liberalized credit market. They also faced growing competition from mail order companies operating through networks of female agents, whose success owed much to the reciprocal cultural and economic conventions that lay at the heart of traditional working class credit relationships. Discussion of these forms of credit is related to theoretical debates about cultural aspects of credit exchange that ensured the continuing success of such forms of lending, despite persistent controversies about their use. The book contrasts commercial forms of credit with formal and informal co-operative alternatives, such as the mutuality clubs operated by co-operative retailers and credit unions. It charts the impact of post-war immigration upon credit patterns, particularly in relation to the migrant (Irish and Caribbean) origins of many credit unions and explains the relative lack of success of the credit union movement. The book contributes to anti-debt debates by exploring the historical difficulties of developing legislation in relation to the millions of borrowers who have patronized what has come to be termed the sub-prime sector.