853 results for height partition clustering
Abstract:
BACKGROUND: We used four years of paediatric severe acute respiratory illness (SARI) sentinel surveillance in Blantyre, Malawi to identify factors associated with clinical severity and co-viral clustering.
METHODS: From January 2011 to December 2014, 2363 children aged 3 months to 14 years presenting to hospital with SARI were enrolled. Nasopharyngeal aspirates were tested for influenza and other respiratory viruses. We assessed risk factors for clinical severity and conducted clustering analysis to identify viral clusters in children with co-viral detection.
RESULTS: Hospital-attended influenza-positive SARI incidence was 2.0 cases per 10,000 children annually; it was highest in children aged under 1 year (6.3 cases per 10,000) and in HIV-infected children aged 5 to 9 years (6.0 cases per 10,000). 605 (26.8%) SARI cases had warning signs, which were positively associated with HIV infection (adjusted risk ratio [aRR]: 2.4, 95% CI: 1.4, 3.9), RSV infection (aRR: 1.9, 95% CI: 1.3, 3.0) and the rainy season (aRR: 2.4, 95% CI: 1.6, 3.8). We identified six co-viral clusters; one cluster was associated with SARI with warning signs.
CONCLUSIONS: Influenza vaccination may benefit young children and HIV-infected children in this setting. Viral clustering may be associated with SARI severity; its assessment should be included in routine SARI surveillance.
Abstract:
The structure of a turbulent non-premixed flame of a biogas fuel in a hot and diluted coflow mimicking moderate and intense low dilution (MILD) combustion is studied numerically. Biogas fuel is obtained by diluting Dutch natural gas (DNG) with CO2. The results of biogas combustion are compared with those of DNG combustion in the Delft Jet-in-Hot-Coflow (DJHC) burner. New experimental measurements of lift-off height and of velocity and temperature statistics have been made to provide a database for evaluating the capability of numerical methods in predicting the flame structure. Compared to the lift-off height of the DNG flame, addition of 30% carbon dioxide to the fuel increases the lift-off height by less than 15%. Numerical simulations are conducted by solving the RANS equations using the Reynolds stress model (RSM) as the turbulence model in combination with the Eddy Dissipation Concept (EDC) and transported probability density function (PDF) as turbulence-chemistry interaction models. The DRM19 reduced mechanism is used as the chemical kinetics with the EDC model. A tabulated chemistry model based on the Flamelet Generated Manifold (FGM) is adopted in the PDF method. The table describes a non-adiabatic three-stream mixing problem between fuel, coflow and ambient air based on igniting counterflow diffusion flamelets. The results show that the EDC/DRM19 and PDF/FGM models predict the experimentally observed decreasing trend of lift-off height with increasing coflow temperature. Although more detailed chemistry is used with EDC, the temperature fluctuations at the coflow inlet (approximately 100 K) cannot be included, resulting in a significant overprediction of the flame temperature. Only the PDF modeling results with temperature fluctuations predict the correct mean temperature profiles of the biogas case and compare well with the experimental temperature distributions.
Abstract:
Modelling of massive stars and supernovae (SNe) plays a crucial role in understanding galaxies. From this modelling we can derive fundamental constraints on stellar evolution, mass-loss processes, mixing, and the products of nucleosynthesis. Proper account must be taken of all important processes that populate and depopulate the levels (collisional excitation, de-excitation, ionization, recombination, photoionization, bound–bound processes). For the analysis of Type Ia SNe and core-collapse SNe (Types Ib, Ic and II), Fe-group elements are particularly important. Unfortunately, little data is currently available, and most notably absent are the photoionization cross-sections for the Fe-peak elements, which have high abundances in SNe. Important interactions for both photoionization and electron-impact excitation are calculated using the relativistic Dirac atomic R-matrix codes (DARC) for low-ionization stages of cobalt. All results are calculated up to photon energies of 45 eV and electron energies of up to 20 eV. The wavefunction representation of Co III has been generated using GRASP0 by including the dominant 3d^7, 3d^6[4s, 4p], 3p^4 3d^9 and 3p^6 3d^9 configurations, resulting in 292 fine-structure levels. Electron-impact collision strengths and Maxwellian-averaged effective collision strengths across a wide range of astrophysically relevant temperatures are computed for Co III. In addition, statistically weighted, level-resolved ground and metastable photoionization cross-sections are presented for Co II and compared directly with existing work.
Abstract:
We consider the problem of resource selection in clustered Peer-to-Peer Information Retrieval (P2P IR) networks with cooperative peers. The clustered P2P IR framework presents a significant departure from general P2P IR architectures by employing clustering to ensure content coherence between resources at the resource selection layer, without disturbing document allocation. We propose that such a property could be leveraged in resource selection by adapting well-studied and popular inverted lists for centralized document retrieval. Accordingly, we propose the Inverted PeerCluster Index (IPI), an approach that adapts the inverted lists, in a straightforward manner, for resource selection in clustered P2P IR. IPI also encompasses a strikingly simple peer-specific scoring mechanism that exploits the said index for resource selection. Through an extensive empirical analysis on P2P IR testbeds, we establish that IPI competes well with the sophisticated state-of-the-art methods in virtually every parameter of interest for the resource selection task, in the context of clustered P2P IR.
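The abstract does not spell out the IPI data structures or the peer-specific scoring formula; the following is a minimal sketch, assuming each peer cluster is summarized by aggregate term counts and ranked by a simple sum of term frequencies over the query terms (the resource names, counts and scoring rule are illustrative assumptions, not the authors' method).

```python
from collections import defaultdict

# Minimal sketch: each "resource" (peer cluster) is summarized by aggregate term counts.
# Illustrative stand-in data, not the IPI statistics described in the paper.
resources = {
    "cluster_A": {"clustering": 40, "graph": 12, "partition": 30},
    "cluster_B": {"retrieval": 55, "index": 20, "clustering": 5},
    "cluster_C": {"neural": 33, "spiking": 18, "clustering": 9},
}

# Build an inverted list: term -> [(resource, term frequency), ...]
inverted_index = defaultdict(list)
for resource, term_counts in resources.items():
    for term, freq in term_counts.items():
        inverted_index[term].append((resource, freq))

def select_resources(query_terms, top_k=2):
    """Rank resources by summed term frequencies over the query terms."""
    scores = defaultdict(int)
    for term in query_terms:
        for resource, freq in inverted_index.get(term, []):
            scores[resource] += freq
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

print(select_resources(["clustering", "partition"]))
# e.g. [('cluster_A', 70), ('cluster_C', 9)]
```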
Abstract:
Twitter is the biggest social network in the world, and every day millions of tweets are posted, expressing various views and opinions. A large variety of research activities have been conducted to study how these opinions can be clustered and analyzed, so that some tendencies can be uncovered. Due to the inherent weaknesses of tweets - very short texts and very informal styles of writing - it is rather hard to achieve good performance and accuracy in tweet data analysis. In this paper, we attack the problem from another angle, using a two-layer structure to analyze the Twitter data: LDA with topic map modelling. The experimental results demonstrate that this approach shows progress in Twitter data analysis. However, more experiments with this method are needed to confirm that accurate analytic results can be maintained.
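The two-layer pipeline (LDA followed by topic map modelling) is not detailed in the abstract; as a minimal sketch of the first layer only, the snippet below fits an LDA topic model to a handful of toy tweets with scikit-learn (the corpus, topic count and vectorizer settings are illustrative assumptions).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy tweets standing in for a real corpus (illustrative only).
tweets = [
    "love the new phone camera",
    "camera quality on this phone is great",
    "traffic this morning was terrible",
    "stuck in traffic again this morning",
]

# Bag-of-words representation of the short texts.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)

# First layer: LDA topic model (the topic count is an arbitrary choice here).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

# Print the top words per topic; a second "topic map" layer could then relate
# these topics to each other, in the spirit of the paper's two-layer structure.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:3]]
    print(f"topic {k}: {top}")
```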
Abstract:
This paper introduces a new stochastic clustering methodology devised for the analysis of categorized or sorted data. The methodology reveals consumers' common category knowledge as well as individual differences in using this knowledge to classify brands in a designated product class. A small study involving the categorization of 28 brands of U.S. automobiles is presented, in which the results of the proposed methodology are compared with those obtained from KMEANS clustering. Finally, directions for future research are discussed.
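The abstract names KMEANS clustering as the comparison baseline; a minimal sketch of such a baseline on toy sorting data is shown below, assuming each brand is represented by a one-hot encoding of the pile each respondent placed it in (the data and encoding are illustrative assumptions, not the study's design).

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy sorting data (illustrative, not the study's 28-automobile data):
# each row is a brand, each column a respondent, entries are the pile
# (category) that respondent placed the brand into.
sort_data = np.array([
    [0, 0, 1],   # brand A
    [0, 0, 1],   # brand B
    [1, 1, 0],   # brand C
    [1, 1, 0],   # brand D
    [2, 1, 2],   # brand E
])

# One-hot encode pile membership per respondent so brands sorted together
# end up with similar feature vectors.
n_brands, n_resp = sort_data.shape
features = np.concatenate(
    [np.eye(sort_data[:, j].max() + 1)[sort_data[:, j]] for j in range(n_resp)],
    axis=1,
)

# KMEANS baseline, as used for comparison in the paper.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)  # e.g. brands A/B in one cluster, C/D/E in another
```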
Abstract:
NCD Risk Factor Collaboration (NCD-RisC)
Abstract:
Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if it works on one code instance, can be applied to crack all the other instances. A solution to mitigate this problem is Software Diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subjected to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimisation problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
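The paper's transformation catalogue and search heuristics are not given in the abstract; as a rough illustration of the subset-selection step, here is a minimal greedy max-min diversity heuristic over toy "binary versions", using Hamming distance as a stand-in diversity measure (all of this is assumed for illustration, not the authors' formulation).

```python
def pairwise_distance(a, b):
    """Toy diversity measure: Hamming distance between fixed-length version strings."""
    return sum(x != y for x, y in zip(a, b))

def greedy_diverse_subset(versions, k):
    """Greedily pick k versions maximizing the minimum pairwise distance:
    a simple heuristic stand-in for the paper's search-based optimisation."""
    names = list(versions)
    chosen = [names[0]]
    while len(chosen) < k:
        best = max(
            (n for n in names if n not in chosen),
            key=lambda n: min(pairwise_distance(versions[n], versions[c]) for c in chosen),
        )
        chosen.append(best)
    return chosen

# Toy "binary versions" produced by different transformation sequences.
versions = {
    "v0": "AAAABBBB", "v1": "AAAABBBA", "v2": "CCCCBBBB",
    "v3": "AADDBBBB", "v4": "CCDDAABB",
}
print(greedy_diverse_subset(versions, k=3))  # e.g. ['v0', 'v4', 'v2']
```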
Abstract:
This paper proposes a novel demand response model using a fuzzy subtractive clustering approach. The model supports domestic consumers' decisions on controllable load management, considering consumers' consumption needs and the load shaping or rescheduling needed to achieve possible economic benefits. The model, based on the fuzzy subtractive clustering method, considers clusters of domestic consumption covering an adequate consumption range. Analysis of different scenarios is presented considering the available electric power and electric energy prices. Simulation results are presented and conclusions of the proposed demand response model are discussed.
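The clustering parameters used in the paper are not stated in the abstract; below is a minimal sketch of the underlying fuzzy subtractive clustering step in Chiu's potential-based formulation, with illustrative radii, stopping threshold and toy consumption profiles, and without the full accept/reject criteria of the complete algorithm.

```python
import numpy as np

def subtractive_clustering(points, ra=0.5, rb=0.75, stop_ratio=0.15):
    """Chiu-style subtractive clustering: pick high-potential points as centers."""
    points = np.asarray(points, dtype=float)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    # Potential of each point: density of neighbours within radius ra.
    potential = np.exp(-4.0 * d2 / ra**2).sum(axis=1)
    first_peak = potential.max()
    centers = []
    while potential.max() > stop_ratio * first_peak:
        c = potential.argmax()
        centers.append(points[c])
        # Subtract the chosen center's influence (radius rb) from all potentials.
        potential -= potential[c] * np.exp(-4.0 * d2[c] / rb**2)
    return np.array(centers)

# Toy daily consumption profiles (kWh per period), purely illustrative.
profiles = [[0.2, 0.3], [0.25, 0.35], [0.9, 1.0], [0.95, 1.1], [0.5, 0.6]]
print(subtractive_clustering(profiles, ra=0.4))  # two cluster centers for this toy data
```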
Abstract:
A method is outlined for optimising graph partitions which arise in mapping unstructured mesh calculations to parallel computers. The method employs a combination of iterative techniques to both evenly balance the workload and minimise the number and volume of interprocessor communications. These techniques are designed to work efficiently in parallel as well as sequentially, and when combined with a fast direct partitioning technique (such as the Greedy algorithm) to give an initial partition, the resulting two-stage process proves to be both a powerful and flexible solution to the static graph-partitioning problem. The algorithms can also be used for dynamic load-balancing, and a clustering technique can additionally be employed to speed up the whole process. Experiments indicate that the resulting parallel code can provide high-quality partitions, independent of the initial partition, within a few seconds.
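As a rough illustration of the two-stage idea (a fast direct initial partition followed by iterative boundary refinement), the sketch below greedily grows parts by breadth-first search and then applies one simple gain-based refinement sweep; it ignores load-balance constraints and is only a simplified stand-in for the methods described here.

```python
from collections import deque

def greedy_partition(adj, parts=2):
    """Grow each part by BFS from an unassigned seed until it reaches its share."""
    n = len(adj)
    target = (n + parts - 1) // parts
    assignment = {}
    for p in range(parts):
        seed = next(v for v in adj if v not in assignment)
        queue, size = deque([seed]), 0
        while queue and size < target:
            v = queue.popleft()
            if v in assignment:
                continue
            assignment[v] = p
            size += 1
            queue.extend(u for u in adj[v] if u not in assignment)
    # Any vertices left unreached go to the last part.
    for v in adj:
        assignment.setdefault(v, parts - 1)
    return assignment

def refine(adj, assignment):
    """One sweep: move a vertex to the neighbouring part it is most connected to.
    A real refinement (e.g. Kernighan-Lin style) must also preserve load balance."""
    for v in adj:
        here = sum(assignment[u] == assignment[v] for u in adj[v])
        for p in set(assignment[u] for u in adj[v]):
            there = sum(assignment[u] == p for u in adj[v])
            if there > here:  # moving v would cut fewer edges
                assignment[v], here = p, there
    return assignment

# Toy mesh-like graph: two 4-cliques joined by a single edge.
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
print(refine(adj, greedy_partition(adj)))
```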
Abstract:
Unstructured mesh codes for modelling continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Parallelisation of such codes using Single Program Multiple Data (SPMD) domain decomposition techniques implemented with message passing has been demonstrated to provide high parallel efficiency, scalability to large numbers of processors P, and portability across a wide range of parallel platforms. High efficiency, especially for large P, requires that load balance be achieved in each parallel loop. For a code in which loops span a variety of mesh entity types, for example, elements, faces and vertices, some compromise is required between load balance for each entity type and the quantity of inter-processor communication required to satisfy data dependence between processors.
Abstract:
Understanding how biodiversity is spatially distributed over both the short term and the long term, and which factors affect that distribution, is critical for modelling the spatial pattern of biodiversity and for promoting effective conservation planning and practice. This dissertation examines factors that influence short-term and long-term avian distributions from the perspective of the geographical sciences. The research develops landscape-level habitat metrics to characterize forest height heterogeneity and examines their efficacy in modelling avian richness at the continental scale. Two types of novel vegetation-height-structured habitat metrics are created, based on second-order texture algorithms and the concepts of patch-based habitat metrics. I correlate the height-structured metrics with the richness of different forest guilds and also examine their efficacy in multivariate richness models. The results suggest that height heterogeneity, beyond canopy height alone, supplements habitat characterization and richness models of two forest bird guilds. The metrics and models derived in this study demonstrate practical examples of utilizing three-dimensional vegetation data for improved characterization of spatial patterns in species richness. The second and third projects focus on analyzing centroids of avian distributions and testing hypotheses regarding the direction and speed of distribution shifts. I first showcase the usefulness of centroid analysis for characterizing the distribution changes of a few case-study species. Applying the centroid method to 57 permanent-resident bird species, I show that multi-directional distribution shifts occurred in a large number of the studied species. I also demonstrate that plains birds are not shifting their distributions faster than mountain birds, contrary to the prediction based on the climate change velocity hypothesis. By modelling the abundance change rate at the regional level, I show that extreme climate events and precipitation measures associate closely with some of the long-term distribution shifts. This dissertation improves our understanding of bird habitat characterization for species richness modelling and expands our knowledge of how avian populations have shifted their ranges in North America in response to changing environments over the past four decades. The results provide an important scientific foundation for more accurate predictive species distribution modelling in the future.
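The centroid method itself is only named in the abstract; a minimal sketch of one plausible version (an abundance-weighted planar centroid computed per time period, with the shift taken as the difference between periods) is given below, using made-up survey values.

```python
import numpy as np

def abundance_centroid(lons, lats, abundance):
    """Abundance-weighted centroid of survey locations (simple planar average,
    not a geodesic computation)."""
    w = np.asarray(abundance, dtype=float)
    return (np.average(lons, weights=w), np.average(lats, weights=w))

# Toy survey data for one species in two periods (illustrative values only).
lons = np.array([-100.0, -98.0, -96.0, -94.0])
lats = np.array([35.0, 36.0, 37.0, 38.0])
counts_1970s = np.array([30, 20, 10, 5])
counts_2010s = np.array([5, 10, 20, 30])

c_old = abundance_centroid(lons, lats, counts_1970s)
c_new = abundance_centroid(lons, lats, counts_2010s)
shift = np.subtract(c_new, c_old)
print(c_old, c_new, shift)  # an eastward/northward shift in this toy example
```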
Abstract:
Post-inhibitory rebound is a nonlinear phenomenon present in a variety of nerve cells. Following a period of hyperpolarization, this effect allows a neuron to fire a spike or packet of spikes before returning to rest. It is an important mechanism underlying central pattern generation for heartbeat, swimming and other motor patterns in many neuronal systems. In this paper we consider how networks of neurons which do not intrinsically oscillate may make use of inhibitory synaptic connections to generate large-scale coherent rhythms in the form of cluster states. We distinguish between two cases: (i) where the rebound mechanism is due to anode break excitation, and (ii) where rebound is due to a slow T-type calcium current. In the former case we use a geometric analysis of a McKean-type model to obtain expressions for the number of clusters in terms of the speed and strength of synaptic coupling. Results are found to be in good qualitative agreement with numerical simulations of the more detailed Hodgkin-Huxley model. In the second case we consider a particular firing rate model of a neuron with a slow calcium current that admits an exact analysis. Once again, existence regions for cluster states are explicitly calculated. Both mechanisms are shown to prefer globally synchronous states for slow synapses, as long as the strength of coupling is sufficiently large. With a decrease in the duration of synaptic inhibition, both systems are found to break into clusters. A major difference between the two mechanisms for cluster generation is that anode break excitation can support clusters with several groups, whilst slow T-type calcium currents predominantly give rise to clusters of just two (anti-synchronous) populations.