220 results for Grew, Nehemiah, 1641-1712


Relevance:

10.00%

Publisher:

Abstract:

MinneSPEC proposes reduced input sets that microprocessor designers can use to model representative short-running workloads. A four-step methodology verifies the program behavior similarity of these input sets to reference sets.

Randomising set index functions can reduce the number of conflict misses in data caches by spreading the cache blocks uniformly over all sets. Typically, the randomisation functions compute the exclusive ors of several address bits. Not all randomising set index functions perform equally well, which calls for the evaluation of many set index functions. This paper discusses and improves a technique that tackles this problem by predicting the miss rate incurred by a randomisation function, based on profiling information. A new way of looking at randomisation functions is used, namely the null space of the randomisation function. The members of the null space describe pairs of cache blocks that are mapped to the same set. This paper presents an analytical model of the error made by the technique and uses this to propose several optimisations to the technique. The technique is then applied to generate a conflict-free randomisation function for the SPEC benchmarks. (C) 2003 Elsevier Science B.V. All rights reserved.
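
The null-space view above can be illustrated with a toy sketch (not the paper's code): for a GF(2)-linear set index function built from XORs of address bits, an address difference d maps any two blocks a and a XOR d to the same set exactly when the function sends d itself to zero. The cache size, bit widths and masks below are illustrative assumptions.

```python
# Sketch of an XOR-based randomising set index function and its null space,
# assuming a toy cache with 2^4 = 16 sets indexed by a GF(2)-linear hash of
# block-address bits. Masks are illustrative, not taken from the paper.

def set_index(block_addr, bit_masks):
    """Map a block address to a set: each index bit is the XOR (parity)
    of the address bits selected by one mask."""
    idx = 0
    for i, mask in enumerate(bit_masks):
        parity = bin(block_addr & mask).count("1") & 1
        idx |= parity << i
    return idx

# One mask per index bit: each set-index bit XORs a few address bits.
masks = [0b0001_0001, 0b0010_0010, 0b0100_0100, 0b1000_1000]

def in_null_space(delta, bit_masks):
    """Because the function is GF(2)-linear, delta is in the null space iff
    two blocks whose addresses differ by XOR-delta always map to the same set,
    i.e. iff the function maps delta itself to set 0."""
    return set_index(delta, bit_masks) == 0

# Enumerate small null-space members: address differences describing block
# pairs that are guaranteed to collide in the same set.
members = [d for d in range(1, 256) if in_null_space(d, masks)]
```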

Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising due to the change source of ‘market’, for the remainder of the change sources, results indicate a significant difference in cost, value to the customer and management considerations. Findings show that higher cost and value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.

As a promising method for pattern recognition and function estimation, least squares support vector machines (LS-SVM) express training as the solution of a linear system instead of the quadratic programming problem of conventional support vector machines (SVM). In this paper, by using the information provided by the equality constraint, we transform the minimization problem with a single equality constraint in LS-SVM into an unconstrained minimization problem, and then propose reduced formulations for LS-SVM. With this transformation, the number of times the conjugate gradient (CG) method, the most time-consuming step in obtaining the numerical solution, must be applied is reduced from two, as proposed by Suykens et al. (1999), to one. Comparisons of the computational speed of our method against the CG method of Suykens et al. and against first-order and second-order SMO methods on several benchmark data sets show a reduction in training time of up to 44%. (C) 2011 Elsevier B.V. All rights reserved.
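
The linear-system view of LS-SVM training can be sketched as follows. This is a minimal illustration of the standard dual formulation, solved directly with `numpy.linalg.solve` rather than with the CG or reduced formulations discussed in the abstract; the RBF kernel and all parameter values are assumptions.

```python
# Minimal LS-SVM sketch: training reduces to one symmetric linear system
# (the KKT conditions), not a quadratic program. RBF kernel and the
# hyperparameters gamma (regularisation) and sigma (kernel width) are
# illustrative assumptions.
import numpy as np

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    n = len(y)
    # RBF kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    # KKT system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(Xnew, Xtrain, alpha, b, sigma=1.0):
    sq = np.sum((Xnew[:, None, :] - Xtrain[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2)) @ alpha + b
```

Note that the first KKT row enforces the equality constraint sum(alpha) = 0, which is exactly the constraint the paper eliminates to obtain its unconstrained reduced formulation.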

Nitride-strengthened, reduced-activation martensitic steel is anticipated to have higher creep strength because of the remarkable thermal stability of nitrides. Two nitride-strengthened, reduced-activation martensitic steels with different carbon contents were prepared to investigate how the microstructure and mechanical properties change as carbon content decreases. Both steels had a fully martensitic microstructure with fine nitrides dispersed homogeneously in the matrix, and both displayed extremely high strength but poor toughness. Compared with the low-carbon steel (0.005 wt pct), the high-carbon steel (0.012 wt pct) had not only higher strength but also higher impact toughness and a higher grain-coarsening temperature. On the one hand, carbon reduction led to Ta-rich inclusions; on the other hand, in the absence of Ta carbonitrides, grains grew larger when the steel was normalized at high temperature, which decreased impact toughness. Complex Al2O3 inclusions in both steels were found to initiate cleavage fracture by acting as the critical cracks.

We propose a data-flow-based run-time system as an efficient tool for supporting the execution of parallel code on heterogeneous architectures hosting both multicore CPUs and GPUs. We discuss how the proposed run-time system may serve as the target both of structured parallel applications developed using algorithmic skeletons/parallel design patterns and of more "domain-specific" programming models. Experimental results demonstrating the feasibility of the approach are presented. © 2012 World Scientific Publishing Company.

Hopanoids are pentacyclic triterpenoids that are thought to be bacterial surrogates for eukaryotic sterols, such as cholesterol, acting to stabilize membranes and to regulate their fluidity and permeability. To date, very few studies have evaluated the role of hopanoids in bacterial physiology. The synthesis of hopanoids depends on the enzyme squalene-hopene cyclase (Shc), which converts the linear squalene into the basic hopene structure. Deletion of the 2 genes encoding Shc enzymes in Burkholderia cenocepacia K56-2, BCAM2831 and BCAS0167, resulted in a strain that was unable to produce hopanoids, as demonstrated by gas chromatography and mass spectrometry. Complementation of the Delta shc mutant with only BCAM2831 was sufficient to restore hopanoid production to wild-type levels, while introducing a copy of BCAS0167 alone into the Delta shc mutant produced only very small amounts of the hopanoid peak. The Delta shc mutant grew as well as the wild type in medium buffered to pH 7 and demonstrated no defect in its ability to survive and replicate within macrophages, despite transmission electron microscopy (TEM) revealing defects in the organization of the cell envelope. The Delta shc mutant displayed increased sensitivity to low pH, detergent, and various antibiotics, including polymyxin B and erythromycin. Loss of hopanoid production also resulted in severe defects in both swimming and swarming motility. This suggests that hopanoid production plays an important role in the physiology of B. cenocepacia.

Colour-based particle filters have been used extensively in the literature, giving rise to multiple applications. However, tracking coloured objects through time has an important drawback, since the way in which the camera perceives the colour of the object can change. Simple updates are often used to address this problem, which implies a risk of distorting the model and losing the target. In this paper, a joint image characteristic-space tracking is proposed, which updates the model simultaneously with the object location. In order to avoid the curse of dimensionality, a Rao-Blackwellised particle filter has been used. Using this technique, the hypotheses are evaluated depending on the difference between the model and the current target appearance during the updating stage. Convincing results have been obtained in sequences under both sudden and gradual illumination condition changes. Crown Copyright (C) 2010 Published by Elsevier B.V. All rights reserved.
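
The model-update risk described above can be illustrated with a naive baseline (not the paper's Rao-Blackwellised filter): a colour-histogram appearance model blended with each new observation at a fixed learning rate, together with the Bhattacharyya-style similarity commonly used to weight particles. The histogram size, the values, and the learning rate are all illustrative assumptions.

```python
# Naive fixed-rate appearance update: the "simple update" the abstract warns
# about, which can drift the model onto the background. The paper instead
# estimates the appearance model jointly with the target location.
import numpy as np

def update_model(model_hist, observed_hist, rate=0.1):
    """Exponential blend of the normalised colour histogram model with the
    histogram observed at the current estimated target location."""
    new = (1 - rate) * model_hist + rate * observed_hist
    return new / new.sum()

def likelihood(model_hist, candidate_hist):
    """Bhattacharyya coefficient between two normalised histograms;
    commonly used to weight particle hypotheses (1.0 = identical)."""
    return float(np.sum(np.sqrt(model_hist * candidate_hist)))
```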

In this paper, a novel framework for visual tracking of human body parts is introduced. The approach presented demonstrates the feasibility of recovering human poses with data from a single uncalibrated camera by using a limb-tracking system based on a 2-D articulated model and a double-tracking strategy. Its key contribution is that the 2-D model is only constrained by biomechanical knowledge about human bipedal motion, instead of relying on constraints that are linked to a specific activity or camera view. These characteristics make our approach suitable for real visual surveillance applications. Experiments on a set of indoor and outdoor sequences demonstrate the effectiveness of our method on tracking human lower body parts. Moreover, a detailed comparison with current tracking methods is presented.

In this paper, we consider the problem of tracking similar objects. We show how a mean field approach can be used to deal with interacting targets and we compare it with Markov Chain Monte Carlo (MCMC). Two mean field implementations are presented. The first one is more general and uses particle filtering. We discuss some simplifications of the base algorithm that reduce the computation time. The second one is based on suitable Gaussian approximations of probability densities that lead to a set of self-consistent equations for the means and covariances. These equations give the Kalman solution if there is no interaction. Experiments have been performed on two kinds of sequences. The first kind is composed of a single long sequence of twenty roaming ants and was previously analysed using MCMC. In this case, our mean field algorithms obtain substantially better results. The second kind corresponds to selected sequences of a football match in which the interaction avoids tracker coalescence in situations where independent trackers fail.
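
The no-interaction limit noted in the abstract (the Gaussian mean-field equations reduce to the Kalman solution) can be made concrete with a minimal 1-D Kalman step for a single target. The random-walk motion model and the noise values q and r are illustrative assumptions, not taken from the paper.

```python
# One predict/update cycle of a 1-D Kalman filter for a single target whose
# position follows a random walk and is observed directly. With no interaction
# between targets, each target's Gaussian posterior evolves by exactly this rule.
def kalman_step(mean, var, z, q=0.1, r=0.5):
    """mean, var: prior Gaussian; z: measurement; q, r: process and
    measurement noise variances (assumed values)."""
    # Predict: the random-walk motion model inflates uncertainty.
    var_pred = var + q
    # Update: fuse the measurement with Kalman gain k.
    k = var_pred / (var_pred + r)
    mean_new = mean + k * (z - mean)
    var_new = (1 - k) * var_pred
    return mean_new, var_new
```

The interacting case replaces this closed-form update with self-consistent equations coupling the means and covariances of nearby targets.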

To optimize the performance of wireless networks, one needs to consider the impact of key factors such as interference from hidden nodes, the capture effect, the network density and network conditions (saturated versus non-saturated). In this research, our goal is to quantify the impact of these factors and to propose effective mechanisms and algorithms for throughput guarantees in multi-hop wireless networks. For this purpose, we have developed a model that takes into account all these key factors, based on which an admission control algorithm and an end-to-end available bandwidth estimation algorithm are proposed. Given the necessary network information and traffic demands as inputs, these algorithms are able to provide predictive control via an iterative approach. Evaluations using analytical comparison with simulations as well as existing research show that the proposed model and algorithms are accurate and effective.

Convincing conversational agents require a coherent set of behavioral responses that can be interpreted by a human observer as indicative of a personality. This paper discusses the continued development and subsequent evaluation of virtual agents based on sound psychological principles. We use Eysenck's theoretical basis to explain aspects of the characterization of our agents, and we describe an architecture where personality affects the agent's global behavior quality as well as their back-channel productions. Drawing on psychological research, we evaluate perception of our agents' personalities and credibility by human viewers (N = 187). Our results suggest that we succeeded in validating theoretically grounded indicators of personality in our virtual agents, and that it is feasible to place our characters on Eysenck's scales. A key finding is that the presence of behavioral characteristics reinforces the prescribed personality profiles that are already emerging from the still images. Our long-term goal is to enhance agents' ability to sustain realistic interaction with human users, and we discuss how this preliminary work may be further developed to include more systematic variation of Eysenck's personality scales. © 2012 IEEE.


--------------------------------------------------------------------------------

Reaxys Database Information

--------------------------------------------------------------------------------

We report the occurrence of four red macroalgae new to Europe. Two species were unambiguously determined to the species level with a DNA barcoding approach, while the remaining two species could only be assigned to a genus. Gelidium vagum was found in the Oosterschelde estuary (the Netherlands). Gracilariopsis chorda, Chondracanthus sp. and Solieria sp. were found in the Gulf of Morbihan in Brittany (France); Solieria sp. was also subsequently observed in the Thau Lagoon (France). Gelidium vagum and Gracilariopsis chorda are species originating from the north-western Pacific, around the Japanese archipelago. Phylogenetic analyses also show a likely Pacific origin for Chondracanthus sp. and Solieria sp. Three of these species are likely to have been introduced after 2008, indicating some active transport pathways between the Pacific and the north-eastern Atlantic. These findings also underline the importance of consistent and continuous local expertise (versus rapid assessment) in early warning systems.

In recent years unmanned vehicles have grown in popularity, with an ever-increasing number of applications in industry, the military and research within the air, ground and marine domains. In particular, raising the level of autonomy of unmanned marine vehicles poses challenges that include automatic obstacle avoidance and conformance with the Rules of the Road when navigating in the presence of other maritime traffic. The USV Master Plan established for the US Navy outlines a list of objectives for improving autonomy in order to increase mission diversity and reduce the amount of supervisory intervention. This paper addresses the specific development needs based on notable research carried out to date, primarily with regard to navigation, guidance, control and motion planning. Integrating the International Regulations for Preventing Collisions at Sea within the obstacle avoidance protocols seeks to prevent maritime accidents attributed to human error. The addition of these critical safety measures may be key to future growth in demand for USVs, as they serve to pave the way for establishing legal policies for unmanned vessels.

A general approach to information correction and fusion for belief functions is proposed, where not only may the information items be irrelevant, but sources may lie as well. We introduce a new correction scheme, which takes into account uncertain metaknowledge on the source's relevance and truthfulness and that generalizes Shafer's discounting operation. We then show how to reinterpret all connectives of Boolean logic in terms of source behavior assumptions with respect to relevance and truthfulness. We are led to generalize the unnormalized Dempster's rule to all Boolean connectives, while taking into account the uncertainties pertaining to assumptions concerning the behavior of sources. Eventually, we further extend this approach to an even more general setting, where source behavior assumptions do not have to be restricted to relevance and truthfulness. We also establish the commutativity property between correction and fusion processes, when the behaviors of the sources are independent.
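
The two operations being generalised can be sketched concretely. The frame of discernment and the mass values below are illustrative assumptions, but discounting and the unnormalised conjunctive rule follow their standard definitions.

```python
# Sketch of Shafer's discounting and the unnormalised conjunctive
# (Dempster-like) rule. Mass functions are dicts mapping frozenset focal
# elements to mass; FRAME is an illustrative frame of discernment.
FRAME = frozenset({"a", "b", "c"})

def discount(m, alpha):
    """Shafer discounting: keep each focal mass with reliability weight alpha
    and move the remaining 1 - alpha onto the whole frame (total ignorance)."""
    out = {A: alpha * v for A, v in m.items()}
    out[FRAME] = out.get(FRAME, 0.0) + (1.0 - alpha)
    return out

def conjunctive(m1, m2):
    """Unnormalised Dempster's rule: multiply masses onto intersections of
    focal elements; mass on the empty set (conflict) is kept, not
    renormalised away."""
    out = {}
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            out[C] = out.get(C, 0.0) + v1 * v2
    return out
```

In the paper's reinterpretation, discounting corresponds to uncertainty about a source's relevance, and replacing the intersection in `conjunctive` by other Boolean connectives captures further assumptions about source truthfulness.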