220 results for Grew, Nehemiah, 1641-1712


Relevance: 10.00%

Abstract:

Virtual reality has a number of advantages for analyzing sports interactions, such as the standardization of experimental conditions, stereoscopic vision, and complete control of animated humanoid movement. Nevertheless, to be useful for sports applications, accurate perception of simulated movement in the virtual sports environment is essential. This perception depends on parameters of the synthetic character, such as the number of degrees of freedom of its skeleton or the level of detail (LOD) of its graphical representation. This study focuses on the influence of the latter parameter on the perception of movement. To evaluate it, the study analyzes the judgments of immersed handball goalkeepers who play against a graphically modified virtual thrower. Five graphical representations of the throwing action were defined: a textured reference level (L0), a nontextured level (L1), a wire-frame level (L2), a moving point-light display (MLD) level with a normal-sized ball (L3), and an MLD level where the ball is represented by a point of light (L4). The results show that judgments made by goalkeepers in the L4 condition are significantly less accurate than in all the other conditions (p

Relevance: 10.00%

Abstract:

Environments that are hostile to life are characterized by reduced microbial activity, which results in poor soil and plant health, low biomass and biodiversity, and feeble ecosystem development. Whereas the functional biosphere may primarily be constrained by water activity (a_w), the mechanism(s) by which this occurs have not been fully elucidated. Remarkably, we found that, for diverse species of xerophilic fungi at a_w values of ≥ 0.72, water activity per se did not limit cellular function. We provide evidence that chaotropic activity determined their biotic window, and obtained mycelial growth at water activities as low as 0.647 (below that recorded for any microbial species) by addition of compounds that reduced the net chaotropicity. Unexpectedly, we found that some fungi grew optimally under chaotropic conditions, providing evidence for a previously uncharacterized class of extremophilic microbes. Further studies to elucidate the way in which solute activities interact to determine the limits of life may lead to enhanced biotechnological processes, and increased productivity of agricultural and natural ecosystems in arid and semiarid regions.

Relevance: 10.00%

Abstract:

Purpose – The Six Sigma approach to business improvement has emerged as a phenomenon in both the practitioner and academic literature, with potential for achieving increased competitiveness. However, there is a lack of critical reviews covering both theory and practice. Therefore, the purpose of this paper is to critically review the literature of Six Sigma using a consistent theoretical perspective, namely absorptive capacity.

Design/methodology/approach – The literature from peer-reviewed journals has been critically reviewed using the absorptive capacity framework and dimensions of acquisition, assimilation, transformation, and exploitation.

Findings – There is evidence of emerging theoretical underpinning in relation to Six Sigma, borrowing from an eclectic range of organisational theories. However, this theoretical development lags behind practice in the area. The development of Six Sigma in practice is expanding mainly through more rigorous studies and applications in service-based environments (profit and not-for-profit). The absorptive capacity framework is found to be a useful overarching framework within which to situate existing theoretical and practice studies.

Research limitations/implications – Agendas for further research from the critical review, in relation to both theory and practice, have been established in relation to each dimension of the absorptive capacity framework.

Practical implications – The paper shows that Six Sigma is both a strategic and an operational issue and that focussing solely on define-measure-analyse-improve-control (DMAIC) based projects can limit the strategic effectiveness of the approach within organisations.

Originality/value – Despite the increasing volume of Six Sigma literature and organisational applications, there is a paucity of critical reviews which cover both theory and practice and which suggest research agendas derived from such reviews.

Relevance: 10.00%

Abstract:

Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands, and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base, and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool.
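The screening idea can be made concrete with interval arithmetic over elicited probabilities. The sketch below is illustrative only: the feature names, the numbers, and the independence assumption are all hypothetical, not the paper's actual knowledge-base format or reasoning engine.

```python
# Illustrative sketch (not the paper's system): screen compounds against
# imprecise probabilistic knowledge elicited as probability intervals.
# All feature names and probabilities below are hypothetical.

def combine_intervals(intervals):
    """Combine per-feature probability intervals under an (assumed)
    independence model: multiply lower bounds and upper bounds."""
    lo, hi = 1.0, 1.0
    for l, h in intervals:
        lo *= l
        hi *= h
    return lo, hi

# Hypothetical elicited knowledge: P(substrate | feature present) intervals.
knowledge = {
    "hydroxyl_at_R1": (0.7, 0.9),
    "methyl_at_R2": (0.5, 0.8),
    "ring_substituent": (0.6, 0.7),
}

def screen(compound_features):
    """Return a probability interval that the compound is a substrate."""
    intervals = [knowledge[f] for f in compound_features if f in knowledge]
    return combine_intervals(intervals)

lo, hi = screen(["hydroxyl_at_R1", "methyl_at_R2"])
print(f"P(substrate) in [{lo:.2f}, {hi:.2f}]")  # [0.35, 0.72]
```

A wide returned interval signals that the elicited knowledge is too imprecise to decide, which is exactly the situation where a follow-up experiment is worthwhile.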

Relevance: 10.00%

Abstract:

With the rapid growth in the quantity and complexity of scientific knowledge available to scientists and allied professionals, the problems associated with harnessing this knowledge are well recognized. Some of these problems are a result of the uncertainties and inconsistencies that arise in this knowledge. Other problems arise from heterogeneous and informal formats for this knowledge. To address these problems, developments in the application of knowledge representation and reasoning technologies can allow scientific knowledge to be captured in logic-based formalisms. Using such formalisms, we can undertake reasoning with the uncertainty and inconsistency to allow automated techniques to be used for querying and combining of scientific knowledge. Furthermore, by harnessing background knowledge, the querying and combining tasks can be carried out more intelligently. In this paper, we review some of the significant proposals for formalisms for representing and reasoning with scientific knowledge.

Relevance: 10.00%

Abstract:

The number of clinical trial reports is increasing rapidly as more and more clinical trials are conducted, which raises an urgent need to utilize the clinical knowledge contained in those reports. In this paper, we focus on qualitative rather than quantitative knowledge. More precisely, we aim to model and reason with qualitative comparison (QC for short) relations, which consider qualitatively how strongly one drug/therapy is preferred to another from a clinical point of view. To this end, we first formalize the QC relations and introduce the notions of QC language, QC base, and QC profile; second, we propose a set of induction rules for the QC relations, provide grading interpretations for QC bases, and show how to determine whether a QC base is consistent. Furthermore, when a QC base is inconsistent, we analyze how to measure inconsistencies among QC bases, and we propose different approaches to merging multiple QC bases. Finally, a case study on lowering intraocular pressure is conducted to illustrate our approaches.
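One way to make the consistency question concrete is a hedged sketch, not the paper's grading semantics: read each QC statement "a is preferred to b with strength at least g" as a numeric constraint grade(a) − grade(b) ≥ g, and test whether any grade assignment satisfies all constraints using Bellman-Ford on the difference-constraint graph.

```python
# Hedged sketch (not the paper's formalism): a QC base is modelled as
# difference constraints grade(a) - grade(b) >= g and checked for
# feasibility, which fails exactly when the constraint graph has a
# negative cycle (Bellman-Ford with an implicit source at distance 0).

def qc_consistent(statements, items):
    """statements: list of (a, b, g) meaning grade(a) - grade(b) >= g.
    Returns True iff some numeric grading satisfies every statement."""
    dist = {x: 0.0 for x in items}            # implicit source -> all nodes
    edges = [(a, b, -g) for (a, b, g) in statements]  # x_b <= x_a - g
    for _ in range(len(items) - 1):           # standard relaxation rounds
        for a, b, w in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w
    # Feasible iff no edge can still be relaxed (no negative cycle).
    return all(dist[a] + w >= dist[b] - 1e-12 for a, b, w in edges)

# Consistent: A beats B by >= 2 and B beats C by >= 1.
print(qc_consistent([("A", "B", 2), ("B", "C", 1)], ["A", "B", "C"]))  # True
# Inconsistent: A beats B by >= 2 but B also beats A by >= 1.
print(qc_consistent([("A", "B", 2), ("B", "A", 1)], ["A", "B", "C"]))  # False
```

Under this reading, an inconsistent QC base corresponds to a cycle of preferences whose strengths cannot all be honoured, which is the kind of conflict the paper's inconsistency measures then quantify.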

Relevance: 10.00%

Abstract:

Christ’s life, as related through the Gospel narratives and early Apocrypha, was subject to a riot of literary-devotional adaptation in the medieval period. This collection provides a series of groundbreaking studies centring on the devotional and cultural significance of Christianity’s pivotal story during the Middle Ages.

The collection represents an important milestone in terms of mapping the meditative modes of piety that characterize a number of Christological traditions, including the Meditationes vitae Christi and the numerous versions it spawned in both Latin and the vernacular. A number of chapters in the volume track how and why meditative piety grew in popularity to become a mode of spiritual activity advised not only for recluses and cenobites, as in the writings of Aelred of Rievaulx, but also for diverse lay audiences, through the pastoral regimens prescribed by devotional authors such as the Carthusian prior Nicholas Love in England and the Parisian theologian Jean Gerson, chancellor of the University of Paris.

Through exploring these texts from a variety of perspectives — theoretical, codicological, theological — and through tracing their complex lines of dissemination in ideological and material terms, this collection promises to be invaluable to students and scholars of medieval religious and literary culture.

Relevance: 10.00%

Abstract:

The traditional Time Division Multiple Access (TDMA) protocol provides deterministic, periodic, collision-free data transmissions. However, TDMA lacks flexibility and exhibits low efficiency in dynamic environments such as wireless LANs. On the other hand, contention-based MAC protocols such as the IEEE 802.11 DCF are adaptive to network dynamics but are generally inefficient in heavily loaded or large networks. To take advantage of both types of protocol, a D-CVDMA protocol is proposed. It is based on the k-round elimination contention (k-EC) scheme, which provides fast contention resolution for wireless LANs. D-CVDMA uses a contention mechanism to achieve TDMA-like collision-free data transmissions without needing to reserve time slots for forthcoming transmissions. These features make D-CVDMA robust and adaptive to network dynamics such as nodes leaving and joining and changes in packet size and arrival rate, which in turn make it suitable for the delivery of hybrid traffic, including multimedia and data content. Analyses and simulations demonstrate that D-CVDMA outperforms the IEEE 802.11 DCF and k-EC in terms of network throughput, delay, jitter, and fairness.
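The elimination-contention idea behind k-EC can be illustrated with a toy simulation. The round behaviour, transmit probability, and parameters below are assumptions made for illustration; the precise k-EC rules are defined in the paper.

```python
import random

def elimination_contention(n_contenders, k, p_burst=0.5, rng=random):
    """Toy sketch of k-round elimination contention: in each round every
    surviving node transmits a short burst with probability p_burst; if
    anyone bursts, the nodes that stayed silent hear it and withdraw.
    Returns the number of contenders left after k rounds."""
    survivors = n_contenders
    for _ in range(k):
        bursts = sum(1 for _ in range(survivors) if rng.random() < p_burst)
        if bursts > 0:
            survivors = bursts  # silent listeners withdrew
    return survivors

random.seed(1)
trials = 10_000
wins = sum(elimination_contention(8, k=3) == 1 for _ in range(trials))
print(f"P(single winner after 3 rounds) ~ {wins / trials:.2f}")
```

Each round roughly halves the contender set, which is why a few rounds suffice to resolve contention quickly even for many nodes, the property D-CVDMA exploits to build its TDMA-like schedule without reserved slots.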

Relevance: 10.00%

Abstract:

Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered as a Shapley Inconsistency Value. Moreover, it can be axiomatized completely in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set has the same amount of conflict. However, it conflicts with the intuition illustrated by the lottery paradox, which states that as the size of a minimal inconsistent belief base increases, the degree of inconsistency of that belief base becomes smaller. To address this, we present two kinds of revised inconsistency measures for a belief base from its minimal inconsistent subsets. Each of these measures considers the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of a belief base. More specifically, we first present a vectorial measure to capture the inconsistency for a belief base, which is more discriminative than MIVC. Then we present a family of weighted inconsistency measures based on the vectorial inconsistency measure, which allow us to capture the inconsistency for a belief base in terms of a single numerical value as usual. We also show that each of the two kinds of revised inconsistency measures can be considered as a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
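The classical MIVC computation can be sketched by brute force on a toy belief base. The formulas below are hypothetical examples; the paper's revised measures additionally take the size of each minimal inconsistent subset into account, whereas this sketch shows only the classical value that shares each conflict equally.

```python
from itertools import combinations, product

# Toy propositional belief base: (name, evaluation function over a
# truth-value assignment). Formulas chosen purely for illustration.
atoms = ["a", "b"]
base = [
    ("a",          lambda v: v["a"]),
    ("not a",      lambda v: not v["a"]),
    ("b",          lambda v: v["b"]),
    ("a or not b", lambda v: v["a"] or not v["b"]),
]

def satisfiable(subset):
    """Brute-force SAT over the (tiny) set of atoms."""
    return any(all(f(dict(zip(atoms, vals))) for _, f in subset)
               for vals in product([False, True], repeat=len(atoms)))

def minimal_inconsistent_subsets(kb):
    """Enumerate subsets by increasing size; keep the unsatisfiable ones
    that contain no previously found minimal inconsistent subset."""
    mis = []
    for r in range(1, len(kb) + 1):
        for sub in combinations(kb, r):
            if not satisfiable(sub) and \
               all(not set(m) <= set(sub) for m in mis):
                mis.append(sub)
    return mis

def mivc(kb, mis):
    """Classical MIVC-style value: each formula earns 1/|M| for every
    minimal inconsistent subset M it belongs to."""
    return {name: sum(1 / len(m) for m in mis if any(n == name for n, _ in m))
            for name, _ in kb}

mis = minimal_inconsistent_subsets(base)
print([{name for name, _ in m} for m in mis])
scores = mivc(base, mis)
print(scores)
```

Here "not a" participates in both conflicts (one of size 2, one of size 3) and so scores highest; a size-sensitive revision in the spirit of the paper would additionally discount the larger subset more.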

Relevance: 10.00%

Abstract:

The design and implementation of a programmable cyclic redundancy check (CRC) computation circuit architecture, suitable for deployment in network-related system-on-chips (SoCs), is presented. The architecture has been designed to be field-reprogrammable so that it is fully flexible in terms of the polynomial deployed and the input port width. The circuit includes an embedded configuration controller that has a low reconfiguration time and hardware cost. The circuit has been synthesised and mapped to 130-nm UMC standard cell [application-specific integrated circuit (ASIC)] technology and is capable of supporting line speeds of 5 Gb/s. © 2006 IEEE.
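A software sketch of the same idea (a bit-serial CRC whose generator polynomial and width are run-time parameters rather than hard-wired constants, as in the programmable circuit) can be written as follows; this models the arithmetic only, not the paper's hardware architecture.

```python
def crc(data: bytes, poly: int, width: int, init: int = 0) -> int:
    """Bit-serial CRC with a run-time configurable polynomial and width.
    `poly` is the generator polynomial without its leading x^width term;
    MSB-first, no reflection, no final XOR."""
    mask = (1 << width) - 1
    reg = init & mask
    for byte in data:
        for i in range(7, -1, -1):                     # MSB of byte first
            bit = (byte >> i) & 1
            fb = ((reg >> (width - 1)) & 1) ^ bit      # feedback bit
            reg = ((reg << 1) & mask) ^ (poly if fb else 0)
    return reg

# Same routine, two different "configurations":
# CRC-16/CCITT-FALSE (poly 0x1021, init 0xFFFF) and CRC-8 (poly 0x07).
print(hex(crc(b"123456789", poly=0x1021, width=16, init=0xFFFF)))  # 0x29b1
print(hex(crc(b"123456789", poly=0x07, width=8)))                  # 0xf4
```

Reconfiguring the hardware analogue of this routine amounts to loading a new `poly`/`width` pair, which is precisely what the embedded configuration controller in the paper makes cheap.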

Relevance: 10.00%

Abstract:

For some time there has been considerable interest in variable step-size methods for adaptive filtering. Recently, a few stochastic gradient algorithms have been proposed that are based on cost functions with an exponential dependence on the chosen error. However, we have found that the cost function based on the exponential of the squared error does not always converge satisfactorily. In this paper we modify this cost function in order to improve its convergence, obtaining the novel ECVSS (exponentiated convex variable step-size) stochastic gradient algorithm. The proposed technique has attractive properties in both stationary and abrupt-change situations. (C) 2010 Elsevier B.V. All rights reserved.
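The general variable step-size idea can be illustrated with a hedged sketch: an LMS filter whose step size depends exponentially on the squared error, so that it is large during adaptation (and after an abrupt change) and small near convergence. The rule below is illustrative only and is not the paper's exact ECVSS recursion.

```python
import numpy as np

def vss_lms(x, d, n_taps, mu_max=0.05, alpha=50.0):
    """Variable step-size LMS sketch: the step size rises toward mu_max
    through an exponential function of the squared a-priori error.
    (Illustrative rule; the ECVSS update in the paper differs.)"""
    w = np.zeros(n_taps)
    errs = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]             # tap-delay input
        e = d[n] - w @ u                              # a-priori error
        mu = mu_max * (1.0 - np.exp(-alpha * e * e))  # exponential VSS rule
        w += mu * e * u
        errs.append(e * e)
    return w, np.array(errs)

# Identify a hypothetical 4-tap FIR system from noisy observations.
rng = np.random.default_rng(1)
h = np.array([0.8, -0.4, 0.2, 0.1])
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[:len(x)] + 0.001 * rng.standard_normal(len(x))
w, errs = vss_lms(x, d, n_taps=4)
print(np.round(w, 2))  # close to h
```

Large errors drive `mu` to `mu_max` (fast tracking after an abrupt change), while small errors make `mu` shrink roughly like `mu_max * alpha * e**2` (low steady-state misadjustment), which is the trade-off variable step-size schemes target.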

Relevance: 10.00%

Abstract:

We present mid-infrared (MIR) observations of the Type II-plateau supernova (SN) 2004et, obtained with the Spitzer Space Telescope between 64 and 1406 days past explosion. Late-time optical spectra are also presented. For the period 300-795 days past explosion, we argue that the spectral energy distribution (SED) of SN 2004et comprises (1) a hot component due to emission from optically thick gas, as well as free-bound radiation; (2) a warm component due to newly formed, radioactively heated dust in the ejecta; and (3) a cold component due to an IR echo from the interstellar-medium dust of the host galaxy, NGC 6946. There may also have been a small contribution to the IR SED due to free-free emission from ionized gas in the ejecta. We reveal the first-ever spectroscopic evidence for silicate dust formed in the ejecta of a supernova. This is supported by our detection of a large, but progressively declining, mass of SiO. However, we conclude that the mass of directly detected ejecta dust grew to no more than a few times 10⁻⁴ M☉. We also provide evidence that the ejecta dust formed in comoving clumps of fixed size. We argue that, after about two years past explosion, the appearance of wide, box-shaped optical line profiles was due to the impact of the ejecta on the progenitor circumstellar medium and that the subsequent formation of a cool, dense shell was responsible for a later rise in the MIR flux. This study demonstrates the rich, multifaceted ways in which a typical core-collapse supernova and its progenitor can produce and/or interact with dust grains. The work presented here adds to the growing number of studies that do not support the contention that SNe are responsible for the large mass of observed dust in high-redshift galaxies.

Relevance: 10.00%

Abstract:

A new domain-specific, reconfigurable system-on-a-chip (SoC) architecture is proposed for video motion estimation. This has been designed to cover most of the common block-based video coding standards, including MPEG-2, MPEG-4, H.264, WMV-9 and AVS. The architecture exhibits simple control, high throughput and relatively low hardware cost when compared with existing circuits. It can also easily handle flexible search ranges without any increase in silicon area and can be configured prior to the start of the motion estimation process for a specific standard. The computational rates achieved make the circuit suitable for high-end video processing applications, such as HDTV. Silicon design studies indicate that circuits based on this approach incur only a relatively small penalty in terms of power dissipation and silicon area when compared with implementations for specific standards. Indeed, the cost/performance achieved exceeds that of existing but specific solutions and greatly exceeds that of general purpose field programmable gate array (FPGA) designs.
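The kernel that such architectures accelerate, block-based motion estimation, can be sketched in software as an exhaustive (full-search) sum-of-absolute-differences match; this illustrates the computation only, not the paper's reconfigurable circuit.

```python
import numpy as np

def full_search(ref, cur, bx, by, bsize=8, srange=4):
    """Exhaustive block matching: find the motion vector (dx, dy) that
    minimises the sum of absolute differences (SAD) between the current
    block and candidate blocks in the reference frame."""
    h, w = ref.shape
    block = cur[by:by + bsize, bx:bx + bsize].astype(int)
    best, best_sad = (0, 0), None
    for dy in range(-srange, srange + 1):
        for dx in range(-srange, srange + 1):
            y, x = by + dy, bx + dx
            if 0 <= y and 0 <= x and y + bsize <= h and x + bsize <= w:
                cand = ref[y:y + bsize, x:x + bsize].astype(int)
                sad = np.abs(block - cand).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best = sad, (dx, dy)
    return best, best_sad

# Synthetic check: shift a random frame and recover the motion vector.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (32, 32), dtype=np.uint8)
cur = np.roll(np.roll(ref, -1, axis=0), -2, axis=1)   # content moved
mv, sad = full_search(ref, cur, bx=8, by=8)
print(mv, sad)  # (2, 1) 0
```

The nested search loop is what a hardware engine parallelises; a "flexible search range" in the paper's sense corresponds to making `srange` a run-time parameter rather than a fixed array dimension.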

Relevance: 10.00%

Abstract:

A BSP superstep is a distributed computation comprising a number of simultaneously executing processes which may generate asynchronous messages. A superstep terminates with a barrier which enforces a global synchronisation and delivers all ongoing communications. Multilevel supersteps can utilise barriers in which subsets of processes, interacting through shared memories, are locally synchronised (partitioned synchronisation). In this paper a state-based semantics, closely related to the classical sequential programming model, is derived for distributed BSP with partitioned synchronisation.
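A single BSP superstep (concurrent computation, asynchronous message queueing, then a barrier that both synchronises and delivers) can be sketched with Python threads. This models only the global-barrier case; the paper's partitioned synchronisation would replace the single global barrier with per-subset barriers over shared memory.

```python
import threading
from collections import defaultdict

N = 4
outbox = defaultdict(list)   # messages queued during the superstep
inbox = defaultdict(list)    # messages visible after the barrier
lock = threading.Lock()

def deliver():
    """Barrier action: flush all ongoing communications, as the BSP
    barrier semantics require, before any process starts the next step."""
    for dest, msgs in outbox.items():
        inbox[dest].extend(msgs)
    outbox.clear()

barrier = threading.Barrier(N, action=deliver)

def process(pid):
    # Computation phase: asynchronously send our id to the next process.
    with lock:
        outbox[(pid + 1) % N].append(pid)
    barrier.wait()           # superstep ends: global synchronisation

threads = [threading.Thread(target=process, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(dict(inbox))           # each process received its predecessor's id
```

The state-based semantics in the paper formalises exactly this discipline: messages sent during a superstep become part of the observable state only at the synchronisation point.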