863 results for critical path methods


Relevance:

30.00%

Publisher:

Abstract:

We study the problem of preprocessing a large graph so that point-to-point shortest-path queries can be answered very fast. Computing shortest paths is a well studied problem, but exact algorithms do not scale to huge graphs encountered on the web, social networks, and other applications. In this paper we focus on approximate methods for distance estimation, in particular using landmark-based distance indexing. This approach involves selecting a subset of nodes as landmarks and computing (offline) the distances from each node in the graph to those landmarks. At runtime, when the distance between a pair of nodes is needed, we can estimate it quickly by combining the precomputed distances of the two nodes to the landmarks. We prove that selecting the optimal set of landmarks is an NP-hard problem, and thus heuristic solutions need to be employed. Given a budget of memory for the index, which translates directly into a budget of landmarks, different landmark selection strategies can yield dramatically different results in terms of accuracy. A number of simple methods that scale well to large graphs are therefore developed and experimentally compared. The simplest methods choose central nodes of the graph, while the more elaborate ones select central nodes that are also far away from one another. The efficiency of the suggested techniques is tested experimentally using five different real world graphs with millions of edges; for a given accuracy, they require as much as 250 times less space than the current approach in the literature which considers selecting landmarks at random. Finally, we study applications of our method in two problems arising naturally in large-scale networks, namely, social search and community detection.
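
As a minimal illustration of the landmark idea described above (not the paper's selection strategies), the sketch below precomputes BFS distances from a few chosen landmarks offline and answers point-to-point queries with the triangle-inequality bound min over landmarks of d(u, l) + d(l, v). The graph, the landmark choice, and the function names are illustrative assumptions.

```python
from collections import deque

def bfs_distances(adj, source):
    """Unweighted single-source shortest-path distances via BFS."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def build_index(adj, landmarks):
    """Offline phase: distances from every node to each landmark."""
    return [bfs_distances(adj, l) for l in landmarks]

def estimate_distance(index, u, v):
    """Online phase: triangle-inequality upper bound min_l d(u,l) + d(l,v)."""
    return min(d[u] + d[v] for d in index if u in d and v in d)

# Toy usage on a path graph 0-1-2-3-4 with a single landmark at node 2.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
index = build_index(adj, landmarks=[2])
print(estimate_distance(index, 0, 4))  # 4 (exact here, since node 2 lies on the path)
```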

Relevance:

30.00%

Publisher:

Abstract:

We survey several of the research efforts pursued by the iBench and snBench projects in the CS Department at Boston University over the last half dozen years. These activities use ideas and methodologies inspired by recent developments in other parts of computer science -- particularly in formal methods and in the foundations of programming languages -- but now specifically applied to the certification of safety-critical networking systems. This is research jointly led by Azer Bestavros and Assaf Kfoury with the participation of Adam Bradley, Andrei Lapets, and Michael Ocean.

Relevance:

30.00%

Publisher:

Abstract:

Colloidal photonic crystals have potential light-manipulation applications, including the fabrication of efficient lasers and LEDs, improved optical sensors and interconnects, and improved photovoltaic efficiencies. One roadblock of colloidal self-assembly is the inherent defects in the resulting films; nevertheless, these films can be manufactured cost-effectively over large areas compared with micro-fabrication methods. This thesis investigates the production of ‘large-area’ colloidal photonic crystals by sonication, under-oil co-crystallisation, and controlled evaporation, with a view to reducing cracking and other defects. A simple monotonic Stöber particle synthesis method was developed, producing silica particles in the range of 80 to 600 nm in a single step. An analytical method that assesses the quality of surface particle ordering in a semi-quantitative manner was also developed: a grey-scale symmetry area method based on fast Fourier transform (FFT) spot intensities is used to quantify the FFT profiles. Adding ultrasonic vibrations during film formation demonstrated that large areas could be assembled rapidly; however, film ordering suffered as a result. Under-oil co-crystallisation binds the particles together during film formation; while it has the potential to form large areas, it requires further refinement to be established as a production technique. Achieving high-quality photonic crystals bonded with low concentrations (<5%) of polymeric adhesives while maintaining refractive-index contrast proved difficult and degraded the films’ uniformity. A controlled evaporation method, using a mixed-solvent suspension, represents the most promising route to high-quality films over large areas (75 mm × 25 mm). In this mixed-solvent approach, the film is kept in the wet state longer, reducing crack development during the drying stage. These films are crack-free up to a critical thickness and show very large domains, which are visible in low-magnification SEM images as Moiré fringe patterns. Higher magnification reveals that the separations between alternate fringe patterns are domain boundaries between individual crystalline growth fronts.
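
The thesis's grey-scale symmetry area analysis is not reproduced here, but a rough sketch of the underlying idea of quantifying ordering from FFT spot intensities might look like the following. The function name, the annulus radii, and the six-spot assumption (hexagonal packing) are illustrative choices, not the method actually used.

```python
import numpy as np

def fft_order_metric(image, r_min, r_max):
    """Crude ordering metric: how concentrated the FFT power is in a few
    bright spots within an annulus (first-order diffraction ring) versus
    spread around it. Values near 1 suggest sharp spots (good ordering);
    values near 0 suggest a diffuse ring (poor ordering)."""
    f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(f) ** 2
    ny, nx = power.shape
    y, x = np.indices(power.shape)
    r = np.hypot(x - nx / 2, y - ny / 2)
    ring = power[(r >= r_min) & (r <= r_max)]
    top = np.sort(ring)[-6:]  # six brightest pixels ~ hexagonal spot pattern
    return float(top.sum() / ring.sum())

# Usage (hypothetical SEM micrograph loaded as a 2-D numpy array):
# score = fft_order_metric(sem_image, r_min=20, r_max=40)
```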

Relevance:

30.00%

Publisher:

Abstract:

The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain a consistency in the ‘production process’. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects have expanded greatly, yet this has been predominantly practitioner driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study aims at addressing this paucity by examining the adoption of Agile through a theoretical lens. The lens used in this research is that of double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours - potential power mechanisms - that can inhibit the double loop learning inherent in an Agile adoption, determines how the Agile processes and behaviours can create these power mechanisms, and examines how these power mechanisms impact on double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers. An a priori framework is created to represent these layers, which are categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately impact on the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement to the framework, with changes required due to observations which were often different from what existing literature would have predicted. The study concludes by explaining how the teams of developers, the individual developers, and the project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND & AIMS: Eosinophils are observed in several liver diseases, but their contribution in the pathogenesis of these disorders remains poorly investigated. Concanavalin A (Con A)-induced hepatitis is an experimental model of immune-mediated liver injury in which natural killer T (NKT) cells play a critical role through the production of interleukin (IL)-4 and the expression of Fas ligand (FasL). Because activated NKT cells also produce IL-5, a critical cytokine for eosinophil maturation and function, the role of IL-5 was investigated in this model. METHODS: IL-5-deficient mice, eosinophil depletion in wild-type (WT) mice, and NKT cell transfer from WT- or IL-5-deficient mice into NKT cell-deficient mice were used to assess the role of IL-5 and eosinophils. RESULTS: Liver eosinophil infiltrate and IL-5 production were observed after Con A challenge. Liver injury was dramatically reduced in IL-5-deficient or eosinophil-depleted mice. In addition, residual hepatitis observed in Fas-deficient mice was abolished after IL-5 neutralization. Finally, we showed that NKT cells constituted a critical source of IL-5. Indeed, transfer of WT NKT cells to mice lacking NKT cells restored liver injury, whereas transfer of IL-5-deficient NKT cells did not. CONCLUSIONS: These observations highlight the pathologic role of IL-5 and eosinophils in experimental immune-mediated hepatitis.

Relevance:

30.00%

Publisher:

Abstract:

¹H NMR spectroscopy is used to investigate a series of microporous activated carbons derived from a poly(ether ether ketone) (PEEK) precursor with varying amounts of burnoff (BO). In particular, properties relevant to hydrogen storage are evaluated, such as pore structure, average pore size, uptake, and binding energy. High-pressure NMR with in situ H₂ loading is employed, with H₂ pressure ranging from 100 Pa to 10 MPa. An N₂-cooled cryostat allows for NMR isotherm measurements at both room temperature (approximately 290 K) and 100 K. Two distinct ¹H NMR peaks appear in the spectra, representing the gaseous H₂ in intergranular pores and the H₂ residing in micropores. The chemical shift of the micropore peak is observed to evolve with changing pressure, the magnitude of this effect being correlated with the amount of BO and therefore the structure. This is attributed to the different pressure dependence of the amount of adsorbed and non-adsorbed molecules within micropores, which experience significantly different chemical shifts due to the strong distance dependence of the ring current effect. In pores with a critical diameter of 1.2 nm or less, no pressure dependence is observed because they are not wide enough to host non-adsorbed molecules; this is the case for samples with less than 35% BO. The largest pore size that can contribute to the micropore peak is estimated to be around 2.4 nm. The total H₂ uptake associated with pores of this size or smaller is evaluated via a calibration of the isotherms, with the highest amount being observed at 59% BO. Two binding energies are present in the micropores, with the lower, more dominant one being on the order of 5 kJ mol⁻¹ and the higher one ranging from 7 to 9 kJ mol⁻¹.

Relevance:

30.00%

Publisher:

Abstract:

Computer simulations of reaction processes in solution generally rely on the definition of a reaction coordinate and the determination of the thermodynamic changes of the system along that coordinate. The reaction coordinate is often constituted of characteristic geometrical properties of the reactive solute species, while the contributions of solvent molecules are implicitly included in the thermodynamics of the solute degrees of freedom. However, solvent dynamics can provide the driving force for the reaction process, and in such cases an explicit description of the solvent contribution to the free energy of the reaction process becomes necessary. We report here a method that can be used to analyze the solvent contributions to reaction activation free energies from combined QM/MM minimum free-energy path simulations. The method was applied to the self-exchange SN2 reaction of CH₃Cl + Cl⁻, showing the importance of solvent-solute interactions in the reaction process. The results are further discussed in the context of the coupling between solvent and solute molecules in reaction processes.
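
One generic way to express such a decomposition (illustrative only, with assumed notation, and not necessarily the authors' exact formulation) is to split the activation free energy along the minimum free-energy path into an internal solute term plus a solvent contribution, the latter being the transition-state-versus-reactant difference in solute-solvent free energy:

```latex
% Illustrative decomposition with assumed symbols, not the paper's own:
\[
  \Delta A^{\ddagger}
  \;\approx\;
  \Delta E^{\ddagger}_{\mathrm{solute}}
  \;+\;
  \Delta\Delta A^{\ddagger}_{\mathrm{solv}},
  \qquad
  \Delta\Delta A^{\ddagger}_{\mathrm{solv}}
  \;=\;
  \Delta A_{\mathrm{solv}}(\mathrm{TS})
  \;-\;
  \Delta A_{\mathrm{solv}}(\mathrm{RS}).
\]
```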

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Physical activity self-report instruments in the US have largely been developed for and validated in White samples. Despite calls to validate existing instruments in more diverse samples, relatively few instruments have been validated in US Blacks. Emerging evidence suggests that these instruments may have differential validity in Black populations. PURPOSE: This report reviews and evaluates the validity and reliability of self-reported measures of physical activity in Blacks and makes recommendations for future directions. METHODS: A systematic literature review was conducted to identify published reports with construct or criterion validity evaluated in samples that included Blacks. Studies that reported results separately for Blacks were examined. RESULTS: The review identified 10 instruments validated in nine manuscripts. Criterion validity correlations tended to be low to moderate. No study has compared the validity of multiple instruments in a single sample of Blacks. CONCLUSION: There is a need for efforts validating self-report physical activity instruments in Blacks, particularly those evaluating the relative validity of instruments in a single sample.

Relevance:

30.00%

Publisher:

Abstract:

The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.
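
The accuracy figures quoted above are RMSD-type comparisons against the manually solved reference structures. As a hedged illustration of the core calculation only (not the CASD-NMR evaluation pipeline, which relies on established structure-comparison tools and well-defined residue ranges), the sketch below superimposes two coordinate sets with the Kabsch algorithm and reports the resulting RMSD.

```python
import numpy as np

def rmsd_after_superposition(P, Q):
    """RMSD between two (N, 3) coordinate sets (e.g., backbone atoms of a
    calculated model and the reference structure) after optimal rigid-body
    superposition via the Kabsch algorithm."""
    P = P - P.mean(axis=0)          # center both coordinate sets
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation (no reflection)
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

# Usage (hypothetical coordinate arrays): rmsd_after_superposition(model_xyz, reference_xyz)
```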

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, namely the node-similarity graph matching framework (NSGM framework), is proposed, from which many existing graph matching algorithms can be subsumed, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs-and-authorities method of Kleinberg, and the Kronecker product successive projection methods of Wyk, among others. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. As an observation, it is pointed out that, in general, any algorithm that can be subsumed under the NSGM framework fails to work well for graphs with non-trivial auto-isomorphism structure.
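
As a concrete, hedged illustration of the node-similarity scheme that the NSGM framework abstracts (not any specific algorithm from the paper), the sketch below iterates a coupled similarity matrix in the style of the hubs-and-authorities generalization and then reads off a node correspondence greedily. The update rule, the normalization, and the greedy extraction are all illustrative choices.

```python
import numpy as np

def node_similarity_matching(A, B, iters=50):
    """Minimal node-similarity matching sketch: iterate S <- B S A^T + B^T S A
    with Frobenius normalization (a coupled similarity update in the spirit of
    Kleinberg's hubs-and-authorities), then extract a correspondence greedily."""
    nA, nB = A.shape[0], B.shape[0]
    S = np.ones((nB, nA))                     # S[i, j]: similarity of B-node i to A-node j
    for _ in range(iters):
        S = B @ S @ A.T + B.T @ S @ A
        S /= np.linalg.norm(S)
    # Greedy assignment: repeatedly pick the largest remaining similarity entry.
    match, S_work = {}, S.copy()
    for _ in range(min(nA, nB)):
        i, j = np.unravel_index(np.argmax(S_work), S_work.shape)
        match[j] = i                          # node j of graph A -> node i of graph B
        S_work[i, :] = -np.inf
        S_work[:, j] = -np.inf
    return match

# Usage (hypothetical adjacency matrices as numpy arrays): node_similarity_matching(A, B)
```

Note that on a graph with non-trivial automorphisms, symmetric nodes receive identical similarity scores, so the greedy readout cannot distinguish them, which is exactly the failure mode the paper points out for this class of methods.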

Relevance:

30.00%

Publisher:

Abstract:

To date there has been little research on young people and sexuality in Northern Ireland. This paper draws on the first major study in this area to analyse the delivery of formal sex education in schools. Both quantitative and qualitative methods were used to access young people's opinions about the quality of the sex education they had received at school. Overall, they reported high levels of dissatisfaction, with notable variations in relation to both gender and religious affiliation. In one sense their opinions mesh well with those of young people in other parts of these islands. At the same time the specificity of sexuality in Ireland plays a key role in producing the moral system that underlies much of formal sex education in schools. Underpinned by a particularly traditional and conservative strain of Christian morality, sex education in Northern Ireland schools is marked by conservatism and silence and by the avoidance of opportunities for informed choice in relation to sexuality on the part of young people.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the paper is to explore teachers’ methods of delivering an ethos of tolerance, respect and mutual understanding in one integrated secondary school in Northern Ireland. Drawing on interviews with teachers in the school, it is argued that most teachers make ‘critical choices’ which both reflect and reinforce a ‘culture of avoidance’, whereby politically or religiously contentious issues are avoided rather than explored. Although teachers are well-intentioned in making these choices, it is shown that they have the potential to create the conditions that maintain or even harden psychological boundaries between Catholics and Protestants rather than dilute them.

Relevance:

30.00%

Publisher:

Abstract:

Reported mast-cell counts in endobronchial biopsies from asthmatic subjects are conflicting, with different methodologies often being used. This study compared three standard methods of counting mast cells in endobronchial biopsies from asthmatic and normal subjects. Endobronchial biopsies were obtained from atopic asthmatic subjects (n=17), atopic nonasthmatic subjects (n=6), and nonatopic nonasthmatic control subjects (n=5). After overnight fixation in Carnoy's fixative, mast cells were stained by the short and long toluidine blue methods and antitryptase immunohistochemistry and were counted by light microscopy. Method comparison was made according to Bland & Altman. The limits of agreement were unacceptable for each of the comparisons, suggesting that the methods are not interchangeable. Coefficients of repeatability were excellent, and not different for the individual techniques. These results suggest that some of the reported differences in mast-cell numbers in endobronchial biopsies in asthma may be due to the staining method used, making direct comparisons between studies invalid. Agreement on a standard method is required for counting mast cells in bronchial biopsies, and we recommend the immunohistochemical method, since fixation is less critical and the resultant tissue sections facilitate clear, accurate, and rapid counts.
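
For readers unfamiliar with the Bland & Altman method-comparison approach used here, a minimal sketch follows. The mast-cell counts are hypothetical and serve only to show how the mean bias and 95% limits of agreement are computed from paired measurements.

```python
import numpy as np

def bland_altman(counts_a, counts_b):
    """Bland & Altman comparison of two measurement methods applied to the
    same specimens: returns the mean bias and the 95% limits of agreement."""
    a = np.asarray(counts_a, dtype=float)
    b = np.asarray(counts_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired mast-cell counts (cells per mm^2) from two staining methods.
toluidine = [12, 20, 15, 30, 22, 18]
tryptase = [14, 25, 17, 36, 27, 21]
bias, (lo, hi) = bland_altman(toluidine, tryptase)
print(f"bias = {bias:.1f}, 95% limits of agreement = ({lo:.1f}, {hi:.1f})")
```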

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we assess realistic evaluation’s articulation with evidence-based practice (EBP) from the perspective of critical realism. We argue that the adoption by realistic evaluation of a realist causal ontology means that it is better placed to explain complex healthcare interventions than the traditional method used by EBP, the randomized controlled trial (RCT). However, we do not conclude from this that the use of RCTs is without merit, arguing that it is possible to use both methods in combination under the rubric of realist theory. More negatively, we contend that the rejection of critical theory and utopianism by realistic evaluation in favour of the pragmatism of piecemeal social engineering means that it is vulnerable to accusations that it promotes technocratic interpretations of human problems. We conclude that, insofar as realistic evaluation adheres to the ontology of critical realism, it provides a sound contribution to EBP, but insofar as it rejects the critical turn of Bhaskar’s realism, it replicates the technocratic tendencies inherent in EBP.