9 results for dynamic group discovery

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

Exascale systems are the next frontier in high-performance computing and are expected to deliver a performance of the order of 10^18 operations per second using massive multicore processors. Very large- and extreme-scale parallel systems pose critical algorithmic challenges, especially related to concurrency, locality and the need to avoid global communication patterns. This work investigates a novel protocol for dynamic group communication that can be used to remove the global communication requirement and to reduce the communication cost in parallel formulations of iterative data mining algorithms. The protocol is used to provide a communication-efficient parallel formulation of the k-means algorithm for cluster analysis. The approach is based on a collective communication operation for dynamic groups of processes and exploits non-uniform data distributions. Non-uniform data distributions can be either found in real-world distributed applications or induced by means of multidimensional binary search trees. The analysis of the proposed dynamic group communication protocol has shown that it does not introduce significant communication overhead. The parallel clustering algorithm has also been extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
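
The central mechanism, a collective operation restricted to a dynamic subgroup of processes, can be sketched with MPI communicators. The following is a minimal illustration in Python with mpi4py, not the paper's actual protocol; all names and data shapes are our own assumptions, and a real implementation would avoid re-splitting the communicator on every iteration, since `Split` is itself a collective over the parent communicator.

```python
# Minimal sketch of per-centroid dynamic-group reductions for parallel
# k-means. Illustrative only; names and data shapes are assumptions.
import numpy as np
from mpi4py import MPI

def kmeans_step(comm, points, centroids):
    """One iteration: each centroid update is reduced only over the
    ranks that actually hold points assigned to that centroid."""
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)          # nearest centroid per local point

    new_centroids = centroids.copy()
    for c in range(len(centroids)):
        mask = labels == c
        # Form a dynamic group of the ranks contributing to centroid c,
        # instead of a global all-reduce over MPI.COMM_WORLD.
        color = 1 if mask.any() else MPI.UNDEFINED
        group = comm.Split(color, comm.rank)
        if group != MPI.COMM_NULL:
            local = np.append(points[mask].sum(axis=0), mask.sum())
            total = np.empty_like(local)
            group.Allreduce(local, total, op=MPI.SUM)  # coordinate sums + count
            new_centroids[c] = total[:-1] / total[-1]
            group.Free()
    return new_centroids
```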

Relevance: 80.00%

Abstract:

Results from two studies on longitudinal friendship networks are presented, exploring the impact of a gratitude intervention on positive and negative affect dynamics in a social network. The gratitude intervention had previously been shown to increase positive affect and decrease negative affect in individuals, but dynamic group effects had not been considered. In the first study the intervention was administered to the whole network. In the second study two social networks were considered, and in each only a subset of individuals, initially low or high in negative affect respectively, received the intervention as 'agents of change'. Data were analyzed using stochastic actor-based modelling techniques to identify resulting network changes, the impact on positive and negative affect, and potential contagion of mood within the group. The first study found a group-level increase in positive affect and a decrease in negative affect. Homophily was detected with regard to positive and negative affect, but no evidence of contagion was found. The network itself became more volatile, along with a fall in the rate of change of negative affect. Centrality measures indicated that the best broadcasters were the individuals with the lowest negative affect levels at the beginning of the study. In the second study, the positive and negative affect levels for the whole group depended on the initial levels of negative affect of the intervention recipients. There was evidence of positive affect contagion in the group whose intervention recipients had a low initial level of negative affect, and of negative affect contagion in the group whose recipients had an initially high level of negative affect.
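
As a rough illustration of the 'broadcaster' idea, centrality on a small friendship graph can be computed as below. This is a toy sketch using networkx and eigenvector centrality; the studies themselves used stochastic actor-based models, and the graph and centrality choice here are purely our assumptions.

```python
# Toy sketch: ranking potential 'broadcasters' in a friendship network
# by centrality. The graph and the centrality measure are assumptions.
import networkx as nx

g = nx.Graph()
g.add_edges_from([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "e")])

# Higher eigenvector centrality = better placed to spread a change in affect.
centrality = nx.eigenvector_centrality(g)
best_broadcaster = max(centrality, key=centrality.get)
print(best_broadcaster, centrality[best_broadcaster])
```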

Relevance: 30.00%

Abstract:

Sensitivity, specificity, and reproducibility are vital to interpret neuroscientific results from functional magnetic resonance imaging (fMRI) experiments. Here we examine the scan–rescan reliability of the percent signal change (PSC) and parameters estimated using Dynamic Causal Modeling (DCM) in scans taken in the same scan session, less than 5 min apart. We find fair to good reliability of PSC in regions that are involved with the task, and fair to excellent reliability with DCM. Also, the DCM analysis uncovers group differences that were not present in the analysis of PSC, which implies that DCM may be more sensitive to the nuances of signal changes in fMRI data.
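
For context, percent signal change is conventionally the task signal expressed relative to baseline, and scan-rescan reliability is typically summarized with an intraclass correlation coefficient (the labels fair, good, and excellent usually denote ICC bands). A minimal sketch follows, assuming a simple one-way ICC rather than the paper's exact variant.

```python
# Sketch: percent signal change (PSC) and a one-way ICC for scan-rescan
# reliability. The paper's exact PSC definition and ICC variant are
# assumptions here.
import numpy as np

def percent_signal_change(task, baseline):
    """PSC = 100 * (task - baseline) / baseline."""
    return 100.0 * (task - baseline) / baseline

def icc_oneway(scan1, scan2):
    """One-way random-effects ICC(1,1) for two repeated measurements."""
    data = np.column_stack([scan1, scan2])              # subjects x 2 scans
    n, k = data.shape
    ms_between = k * np.var(data.mean(axis=1), ddof=1)  # between-subject MS
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```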

Relevance: 30.00%

Abstract:

Planning is a vital element of project management, but it is still not recognized as a process variable. Its objective should be to outperform the initially defined processes and to foresee and overcome possible undesirable events. Detailed task-level master planning is unrealistic, since one cannot accurately predict all the requirements and obstacles before work has even started. The process planning methodology (PPM) has therefore been developed to overcome common problems arising from overwhelming project complexity. The essential elements of the PPM are the process planning group (PPG), including a control team that dynamically links the production/site and management, and the planning algorithm embodied within two continuous-improvement loops. The methodology was tested on a factory project in Slovenia and in four successive projects of a similar nature. In addition to a number of improvement ideas and enhanced communication, the applied PPM resulted in 32% higher total productivity and 6% total savings, and created a synergistic project environment.

Relevance: 30.00%

Abstract:

Observation of adverse drug reactions during drug development can cause closure of the whole programme. However, if an association between genotype and the risk of an adverse event is discovered, then it might suffice to exclude patients of certain genotypes from future recruitment. Various sequential and non-sequential procedures are available to identify an association between the whole genome, or at least a portion of it, and the incidence of adverse events. In this paper we start with a suspected association between the genotype and the risk of an adverse event and suppose that the genetic subgroups with elevated risk can be identified. Our focus is determining whether the patients identified as being at risk should be excluded from further studies of the drug. We propose using a utility function to determine the appropriate action, taking into account the relative costs of suffering an adverse reaction and of failing to alleviate the patient's disease. Two illustrative examples are presented, one comparing patients who suffer from an adverse event with contemporary patients who do not, and the other making use of a reference control group. We also illustrate two classification methods, LASSO and CART, for identifying patients at risk, but we stress that any appropriate classification method could be used in conjunction with the proposed utility function. Our emphasis is on determining the action to take rather than on providing definitive evidence of an association.
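
The flavour of such a utility-based decision can be conveyed with a toy expected-utility comparison. All probabilities, costs, and the utility form below are illustrative assumptions, not the paper's calibrated model.

```python
# Toy sketch: expected-utility comparison for excluding an at-risk
# genetic subgroup. All numbers and the utility form are assumptions.

def expected_utility_treat(p_adverse, p_benefit, cost_adverse, cost_untreated):
    """Expected utility of treating: adverse events incur cost_adverse;
    a treatment that fails to help leaves the untreated cost in place."""
    return -(p_adverse * cost_adverse + (1 - p_benefit) * cost_untreated)

def should_exclude(p_adverse, p_benefit, cost_adverse, cost_untreated):
    """Exclude the subgroup if certain non-treatment beats treating."""
    u_treat = expected_utility_treat(p_adverse, p_benefit,
                                     cost_adverse, cost_untreated)
    u_exclude = -cost_untreated
    return u_exclude > u_treat

# Example: a subgroup with an elevated adverse-event risk.
# u_treat = -(0.3*10 + 0.4*4) = -4.6 < u_exclude = -4.0, so exclude.
print(should_exclude(p_adverse=0.30, p_benefit=0.60,
                     cost_adverse=10.0, cost_untreated=4.0))  # True
```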

Relevance: 30.00%

Abstract:

Strategy is a contested concept. The generic literature is characterized by a diverse range of competing theories and alternative perspectives. Traditional models of the competitive strategy of construction firms have tended to focus on exogenous factors. In contrast, the resource-based view of strategic management emphasizes the importance of endogenous factors. The more recently espoused concept of dynamic capabilities extends consideration beyond static resources to focus on the ability of firms to reconfigure their operating routines in response to changing environments. The relevance of the dynamic capabilities framework to the construction sector is investigated through an exploratory case study of a regional contractor. The focus on how firms continuously adapt to changing environments provides new insights into competitive strategy in the construction sector. Strong support is found for the importance of path dependency in shaping strategic choice. The case study further suggests that strategy is a collective endeavour enacted by a loosely defined group of individual actors. Dynamic capabilities are characterized by an empirical elusiveness and as such are best construed as situated practices embedded within a social and physical context.

Relevance: 30.00%

Abstract:

One of the most pervasive concepts underlying computational models of information processing in the brain is linear input integration of rate-coded univariate information by neurons. After a suitable learning process this results in neuronal structures that statically represent knowledge as a vector of real-valued synaptic weights. Although this general framework has contributed to the many successes of connectionism, in this paper we argue that for all but the most basic of cognitive processes, a more complex, multivariate dynamic neural coding mechanism is required: knowledge should not be spatially bound to a particular neuron or group of neurons. We conclude the paper with a discussion of a simple experiment that illustrates dynamic knowledge representation in a spiking neuron connectionist system.
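
The contrast can be made concrete: a rate-coded unit reduces to a static weighted sum over its inputs, whereas a spiking unit carries information in the timing of threshold crossings. Below is a minimal sketch with a leaky integrate-and-fire neuron; all parameter values are illustrative assumptions, not taken from the paper.

```python
# Sketch: rate-coded weighted sum vs. leaky integrate-and-fire (LIF)
# dynamics. Parameter values are illustrative assumptions.
import numpy as np

def rate_coded_unit(inputs, weights):
    """Static knowledge: one real-valued weight vector, one output rate."""
    return np.dot(weights, inputs)

def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Dynamic knowledge: the output is a spike train, so timing matters."""
    v, spike_times = 0.0, []
    for t, i_t in enumerate(input_current):
        v += dt * (-v + i_t) / tau        # leaky integration of input
        if v >= v_thresh:                 # threshold crossing -> spike
            spike_times.append(t * dt)
            v = v_reset
    return spike_times
```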

Relevance: 30.00%

Abstract:

Purpose: To quantify the extent to which the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the required smoothing kernel width, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated-atrophy approach was employed to explore the role of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8–10 mm for groups of 25, at P < 0.05 with familywise error correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, but at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, kernel selection was also affected by the threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
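
The kernel widths quoted are Gaussian FWHM values; FWHM relates to the filter's standard deviation by sigma = FWHM / (2 * sqrt(2 * ln 2)), roughly FWHM / 2.355. A minimal sketch of applying such a kernel to a volume follows, where the voxel size, data shape, and use of SciPy are our assumptions.

```python
# Sketch: Gaussian smoothing of a brain volume at a given FWHM.
# Voxel size and data shape are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_volume(volume, fwhm_mm, voxel_size_mm=2.0):
    """Smooth with an isotropic Gaussian kernel of the given FWHM."""
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # ~ fwhm / 2.355
    sigma_vox = sigma_mm / voxel_size_mm                     # mm -> voxels
    return gaussian_filter(volume, sigma=sigma_vox)

# e.g. the 6 mm kernel found best for groups of 50:
volume = np.random.rand(91, 109, 91)   # placeholder volume
smoothed = smooth_volume(volume, fwhm_mm=6.0)
```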

Relevance: 30.00%

Abstract:

While an awareness of age-related changes in memory may help older adults gain insight into their own cognitive abilities, it may also have a negative impact on memory performance through a mechanism of stereotype threat (ST). The consequence of ST is under-performance in abilities related to the stereotype. Here, we examined the degree to which explicit and implicit memory were affected by ST across a wide age range. We found that explicit memory was affected by ST, but only in an Early-Aging group (mean age 67.83), and not in a Later-Aging group (mean age 84.59). Implicit memory was not affected in either the Early- or Later-Aging group. These results demonstrate that ST for age-related memory decline affects memory processes requiring controlled retrieval while sparing item encoding. Furthermore, this form of ST appears to dissipate as aging progresses. These results have implications for understanding psychological development across the span of aging.