732 results for "Burden of proof"

Relevance:

80.00%

Publisher:

Abstract:

Motivated by a real-world application, this paper proposes a collision-avoidance and deadlock-prevention algorithm for multiple mobile robots. The algorithm formally describes the robots' operating environment in terms of a set of elementary motions, a conflict graph, an overall task set, and a set of robot jobs, and uses methods and techniques from set theory and graph theory to achieve collision avoidance and deadlock prevention among the robots. When the operating environment changes, only the corresponding set-description files need to be modified; no changes to the program itself are required. Another feature of the algorithm is that it accomplishes deadlock prevention through the collision-avoidance machinery itself. Simulation and field operation have shown the algorithm to be efficient and reliable.
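The conflict-graph idea in the abstract above can be illustrated with a toy scheduler. The sketch below is a hypothetical illustration, not the paper's algorithm: elementary motions are vertices, an edge marks two motions that would collide if executed concurrently, and a motion is granted only when no conflicting motion is currently active.

```python
# Toy mutual-exclusion scheduler over a conflict graph (illustrative only).
class ConflictScheduler:
    def __init__(self, conflicts):
        # conflicts: pairs of elementary motions that must not run concurrently
        self.conflicts = {frozenset(c) for c in conflicts}
        self.active = set()

    def request(self, motion):
        """Grant `motion` iff it conflicts with no currently active motion."""
        for other in self.active:
            if frozenset((motion, other)) in self.conflicts:
                return False          # would collide: the robot must wait
        self.active.add(motion)
        return True

    def release(self, motion):
        """Mark `motion` as finished, freeing any conflicting motions."""
        self.active.discard(motion)
```

Changing the environment then amounts to editing the `conflicts` description rather than the scheduler code, mirroring the paper's set-description files. (A common, independent trick for the deadlock side, not taken from the paper, is to grant conflicting motions in a fixed global order.)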


This paper adopts a centralized pre-planning approach that achieves multi-robot collision avoidance by adjusting the robots' travel speeds. The basic idea of the proposed algorithm is as follows: each robot's path is divided into segments, and the times at which the robots traverse each segment are constrained according to the collision-avoidance requirements; the collision-avoidance problem is thereby transformed into an optimization problem in a high-dimensional linear space, and further reduced to solving a system of linear equations, so that the problem admits an explicit analytical solution. Because the method's complexity is high, several techniques are employed in the implementation to reduce complexity and simplify the computation. The paper presents the basic idea of the algorithm, the relevant theorems and their proofs, and the simplification techniques, and concludes with experimental results and analysis.
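The segment-timing idea can be made concrete with a deliberately tiny example. This is an illustrative toy, not the paper's formulation: with fixed nominal speeds, the requirement "robot B may enter a shared segment only after robot A has left it" becomes a linear constraint on traversal times, and the smallest feasible start delay for B has a closed-form solution.

```python
# Toy segment-time scheduling for two robots sharing one path segment.
def segment_times(lengths, speed, start=0.0):
    """Cumulative times at which a robot finishes each of its path segments."""
    t, times = start, []
    for length in lengths:
        t += length / speed
        times.append(t)
    return times

def min_delay(a_times, b_times, shared_a, shared_b):
    """Smallest start delay for robot B so that it enters its shared segment
    (index shared_b) only after robot A exits its own (index shared_a)."""
    a_exit = a_times[shared_a]
    b_enter = b_times[shared_b - 1] if shared_b > 0 else 0.0
    return max(0.0, a_exit - b_enter)
```

With many robots and many shared segments, stacking one such inequality per potential conflict yields exactly the kind of linear system the abstract describes.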


A newly developed experimental paradigm, simulation of a real mission, was used to explore the laws of time perception and users' endurance of feedback delay in network-supported cooperative work. Several non-technological factors influencing time perception and user endurance (mission type, difficulty level, feedback method, partner type, gender, and Type A behavior pattern) were also examined. The results showed that: (1) Under the condition of waiting without feedback, mission type and difficulty level had significant main effects on judgments of waiting duration. People will wait longer for a partner's feedback if they perceive the partner's task to be difficult, and the longest waiting duration (LWD) for the computation mission was longer than the LWD for the proof-searching mission. (2) Under the condition of waiting with feedback, the experimental data supported Vierordt's law: short durations are overestimated, long durations are underestimated, and only durations in an intermediate range (2-6 seconds) are estimated correctly. This intermediate range varied with the difficulty level of the mission. The longer the waiting duration, the greater the estimation error. The type of partner had no significant effect on these laws of time perception. (3) Under the condition of waiting with feedback, non-technological factors significantly affected users' endurance. When subjects were told their partner was human, mission type and difficulty level significantly affected endurance; when subjects were told their partner was a computer, Type A behavior pattern and difficulty level significantly affected endurance. A two-way interaction between Type A behavior pattern and gender was also detected.


Type-omega DPLs (Denotational Proof Languages) are languages for proof presentation and search that offer strong soundness guarantees. LCF-type systems such as HOL offer similar guarantees, but their soundness relies heavily on static type systems. By contrast, DPLs ensure soundness dynamically, through their evaluation semantics; no type system is necessary. This is possible owing to a novel two-tier syntax that separates deductions from computations, and to the abstraction of assumption bases, which is factored into the semantics of the language and allows for sound evaluation. Every type-omega DPL properly contains a type-alpha DPL, which can be used to present proofs in a lucid and detailed form, exclusively in terms of primitive inference rules. Derived inference rules are expressed as user-defined methods, which are "proof recipes" that take arguments and dynamically perform appropriate deductions. Methods arise naturally via parametric abstraction over type-alpha proofs. In that light, the evaluation of a method call can be viewed as a computation that carries out a type-alpha deduction. The type-alpha proof "unwound" by such a method call is called the "certificate" of the call. Certificates can be checked by exceptionally simple type-alpha interpreters, and thus they are useful whenever we wish to minimize our trusted base. Methods are statically closed over lexical environments, but dynamically scoped over assumption bases. They can take other methods as arguments, they can iterate, and they can branch conditionally. These capabilities, in tandem with the bifurcated syntax of type-omega DPLs and their dynamic assumption-base semantics, allow the user to define methods in a style that is disciplined enough to ensure soundness yet fluid enough to permit succinct and perspicuous expression of arbitrarily sophisticated derived inference rules. 
We demonstrate every major feature of type-omega DPLs by defining and studying NDL-omega, a higher-order, lexically scoped, call-by-value type-omega DPL for classical zero-order natural deduction---a simple choice that allows us to focus on type-omega syntax and semantics rather than on the subtleties of the underlying logic. We start by illustrating how type-alpha DPLs naturally lead to type-omega DPLs by way of abstraction; present the formal syntax and semantics of NDL-omega; prove several results about it, including soundness; give numerous examples of methods; point out connections to the lambda-phi calculus, a very general framework for type-omega DPLs; introduce a notion of computational and deductive cost; define several instrumented interpreters for computing such costs and for generating certificates; explore the use of type-omega DPLs as general programming languages; show that DPLs do not have to be type-less by formulating a static Hindley-Milner polymorphic type system for NDL-omega; discuss some idiosyncrasies of type-omega DPLs such as the potential divergence of proof checking; and compare type-omega DPLs to other approaches to proof presentation and discovery. Finally, a complete implementation of NDL-omega in SML-NJ is given for users who want to run the examples and experiment with the language.
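The assumption-base discipline at the heart of the abstract above can be illustrated with a toy interpreter. The sketch below is only loosely inspired by DPLs; it is emphatically not NDL-omega (there is no two-tier syntax and no certificate generation), but it shows the core soundness mechanism: every primitive rule dynamically checks its premises against a threaded assumption base, so no type system is needed to prevent unsound conclusions.

```python
# Propositions: atoms are strings; compound formulas are nested tuples.
def conj(p, q): return ("and", p, q)
def cond(p, q): return ("if", p, q)

class ProofError(Exception):
    pass

# Primitive rules: each takes the current assumption base `ab` plus arguments,
# checks its premises against the base, and returns its conclusion.
def claim(ab, p):
    if p not in ab:
        raise ProofError(f"{p} not in assumption base")
    return p

def both(ab, p, q):                      # conjunction introduction
    return conj(claim(ab, p), claim(ab, q))

def left_and(ab, pq):                    # conjunction elimination (left)
    if not (isinstance(pq, tuple) and pq[0] == "and" and pq in ab):
        raise ProofError("bad premise")
    return pq[1]

def right_and(ab, pq):                   # conjunction elimination (right)
    if not (isinstance(pq, tuple) and pq[0] == "and" and pq in ab):
        raise ProofError("bad premise")
    return pq[2]

def modus_ponens(ab, imp, p):
    if not (isinstance(imp, tuple) and imp[0] == "if" and imp in ab
            and p in ab and imp[1] == p):
        raise ProofError("bad premises")
    return imp[2]

# Hypothetical reasoning: run deduction d with hypothesis p added to the base,
# then discharge p into a conditional.
def assume(ab, p, d):
    return cond(p, d(ab | {p}))

# A "method" is just a function from an assumption base to a conclusion;
# composing primitive rules inside it plays the role of a derived rule.
def and_commute(ab, pq):
    p = left_and(ab, pq)                 # derive the left conjunct
    q = right_and(ab, pq)                # derive the right conjunct
    ab = ab | {p, q}                     # thread the conclusions into the base
    return both(ab, q, p)
```

Any attempt to use a premise that is not in the base raises `ProofError`; that dynamic check, rather than a static type system, is what rules out unsound "proofs" in this toy.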


PILOT is a programming system constructed in LISP. It is designed to facilitate the development of programs by easing the familiar sequence: write some code, run the program, make some changes, write some more code, run the program again, etc. As a program becomes more complex, making these changes becomes harder and harder because the implications of changes are harder to anticipate. In the PILOT system, the computer plays an active role in this evolutionary process by providing the means whereby changes can be effected immediately, and in ways that seem natural to the user. The user of PILOT feels that he is giving advice, or making suggestions, to the computer about the operation of his programs, and that the system then performs the work necessary. The PILOT system is thus an interface between the user and his program, monitoring both the requests of the user and the operation of his program. The user may easily modify the PILOT system itself by giving it advice about its own operation. This allows him to develop his own language and to shift gradually onto PILOT the burden of performing routine but increasingly complicated tasks. In this way, he can concentrate on the conceptual difficulties in the original problem, rather than on the niggling tasks of editing, rewriting, or adding to his programs. Two detailed examples are presented. PILOT is a first step toward computer systems that will help man to formulate problems in the same way they now help him to solve them. Experience with it supports the claim that such "symbiotic systems" allow the programmer to attack and solve more difficult problems.
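The "advice" mechanism described above survives in modern Lisps (and in Emacs) as function advising. As a hedged illustration of the idea, not of PILOT's actual implementation, the sketch below wraps extra behavior around an existing function without editing its source; all names here are invented for the example.

```python
# Minimal function-advising combinator (illustrative, not PILOT's mechanism).
def advise(fn, before=None, after=None):
    """Return fn wrapped with optional before/after advice.

    `before` may rewrite the arguments; `after` may rewrite the result."""
    def advised(*args, **kwargs):
        if before:
            args, kwargs = before(args, kwargs)
        result = fn(*args, **kwargs)
        return after(result) if after else result
    return advised

def area(w, h):
    return w * h

# "Advice": clamp negative dimensions to zero before area() ever sees them.
area = advise(area, before=lambda a, k: (tuple(max(x, 0) for x in a), k))
```

The original `area` is untouched; the advice sits between the caller and the function, much as PILOT sits between the user and his program.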


Woods, T. (2007). African Pasts: Memory and History in African Literatures. Manchester: Manchester University Press. RAE2008


Faculty of Modern Languages (Wydział Neofilologii): Institute of Russian Philology, Department of Ukrainian Studies


The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up times, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool-of-choice with which to continuously monitor the brain where temporal resolution, ease-of-use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness, and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automated neurological event detection. This thesis therefore contributes to the further improvement of automated neurological event detection systems, by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients.
In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, using supervised machine learning classifiers to do so. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional physiological signals, in the form of gyroscopes, are used to detect head-movements and, in doing so, bring additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection which compares favourably with other state-of-the-art systems is achieved.
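The fusion step described above can be sketched schematically. This is a hypothetical illustration, not the thesis's method: the scores and the down-weighting rule are invented, but the shape is the same, namely that evidence from an artefact-specific classifier suppresses the event classifier's confidence before thresholding, which is what reduces false detections.

```python
# Toy score-level fusion of an event detector with an artefact detector.
def fuse(event_scores, artefact_scores, threshold=0.5, penalty=0.6):
    """Flag an epoch as epileptiform only if its event score, penalised by
    concurrent artefact evidence, still clears the detection threshold."""
    detections = []
    for ev, art in zip(event_scores, artefact_scores):
        adjusted = ev * (1.0 - penalty * art)   # artefact evidence suppresses
        detections.append(adjusted >= threshold)
    return detections
```

An epoch with a strong event score but equally strong artefact evidence is thereby rejected, whereas a clean strong epoch is kept.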
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world clinical domain one step closer.
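As a hedged illustration of reference-signal artefact removal, a simpler relative of the blind-source-separation methods the thesis uses, the sketch below estimates the component of an EEG epoch correlated with a respiration reference channel by least squares and subtracts it. All signals here are synthetic.

```python
# Toy least-squares regression of a reference artefact out of an EEG epoch.
def remove_artefact(eeg, reference):
    """Subtract the least-squares projection of `reference` from `eeg`."""
    num = sum(e * r for e, r in zip(eeg, reference))
    den = sum(r * r for r in reference)
    gain = num / den if den else 0.0        # estimated artefact coupling
    return [e - gain * r for e, r in zip(eeg, reference)]
```

This recovers the underlying signal exactly only when it is uncorrelated with the reference; real BSS methods relax that assumption, which is why they are preferred for the neonatal EEG problem.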


BACKGROUND: Outcome assessment can support the therapeutic process by providing a way to track symptoms and functionality over time, providing insights to clinicians and patients, as well as offering a common language to discuss patient behavior/functioning. OBJECTIVES: In this article, we examine the patient-based outcome assessment (PBOA) instruments that have been used to determine outcomes in acupuncture clinical research and highlight measures that are feasible, practical, economical, reliable, valid, and responsive to clinical change. The aims of this review were to assess and identify the commonly available PBOA measures, describe a framework for identifying appropriate sets of measures, and address the challenges associated with these measures and acupuncture. Instruments were evaluated in terms of feasibility, practicality, economy, reliability, validity, and responsiveness to clinical change. METHODS: This study was a systematic review. A total of 582 abstracts were reviewed using PubMed (from inception through April 2009). RESULTS: A total of 582 citations were identified. After screening of title/abstract, 212 articles were excluded. From the remaining 370 citations, 258 manuscripts identified explicit PBOA; 112 abstracts did not include any PBOA. The five most common PBOA instruments identified were the Visual Analog Scale, Symptom Diary, Numerical Pain Rating Scales, SF-36, and depression scales such as the Beck Depression Inventory. CONCLUSIONS: The way a questionnaire or scale is administered can have an effect on the outcome. Also, developing and validating outcome measures can be costly and difficult. Therefore, reviewing the literature on existing measures before creating or modifying PBOA instruments can significantly reduce the burden of developing a new measure.


BACKGROUND: Most information about the lifetime prevalence of mental disorders comes from retrospective surveys, but how much these surveys have undercounted due to recall failure is unknown. We compared results from a prospective study with those from retrospective studies. METHOD: The representative 1972-1973 Dunedin New Zealand birth cohort (n=1037) was followed to age 32 years with 96% retention, and compared to the national New Zealand Mental Health Survey (NZMHS) and two US National Comorbidity Surveys (NCS and NCS-R). Measures were research diagnoses of anxiety, depression, alcohol dependence and cannabis dependence from ages 18 to 32 years. RESULTS: The prevalence of lifetime disorder to age 32 was approximately doubled in prospective as compared to retrospective data for all four disorder types. Moreover, across disorders, prospective measurement yielded a mean past-year-to-lifetime ratio of 38% whereas retrospective measurement yielded higher mean past-year-to-lifetime ratios of 57% (NZMHS, NCS-R) and 65% (NCS). CONCLUSIONS: Prospective longitudinal studies complement retrospective surveys by providing unique information about lifetime prevalence. The experience of at least one episode of DSM-defined disorder during a lifetime may be far more common in the population than previously thought. Research should ask what this means for etiological theory, construct validity of the DSM approach, public perception of stigma, estimates of the burden of disease and public health policy.


Malaria and other vector-borne diseases represent a significant and growing burden in many tropical countries. Successfully addressing these threats will require policies that expand access to and use of existing control methods, such as insecticide-treated bed nets (ITNs) and artemisinin combination therapies (ACTs) for malaria, while weighing the costs and benefits of alternative approaches over time. This paper argues that decision analysis provides a valuable framework for formulating such policies and combating the emergence and re-emergence of malaria and other diseases. We outline five challenges that policy makers and practitioners face in the struggle against malaria, and demonstrate how decision analysis can help to address and overcome these challenges. A prototype decision analysis framework for malaria control in Tanzania is presented, highlighting the key components that a decision support tool should include. Developing and applying such a framework can promote stronger and more effective linkages between research and policy, ultimately helping to reduce the burden of malaria and other vector-borne diseases.


Whereas common infectious and parasitic diseases such as malaria and the HIV/AIDS pandemic remain major unresolved health problems in many developing countries, emerging non-communicable diseases relating to diet and lifestyle have been increasing over the last two decades, thus creating a double burden of disease and impacting negatively on already over-stretched health services in these countries. Prevalence rates for type 2 diabetes mellitus and CVD in sub-Saharan Africa have seen a 10-fold increase in the last 20 years. In the Arab Gulf, current prevalence rates are between 25 and 35% for the adult population, whilst evidence of the metabolic syndrome is emerging in children and adolescents. The present review focuses on the concept of the epidemiological and nutritional transition. It looks at historical trends in socio-economic status and lifestyle and trends in nutrition-related non-communicable diseases over the last two decades, particularly in developing countries with rising income levels, as well as at the other extreme of poverty: chronic hunger, coping strategies and the metabolic adaptations in fetal life that predispose to non-communicable disease risk in later life. The role of preventable environmental risk factors for obesity and the metabolic syndrome in developing countries is emphasized, and these challenges are related to meeting the Millennium Development Goals. The possible implications of these changing trends for human and economic development in poorly-resourced healthcare settings, and the implications for nutrition training, are also discussed.