883 results for Cluster Computer


Relevance: 30.00%

Abstract:

This paper considers a model-based approach to the clustering of tissue samples of a very large number of genes from microarray experiments. It is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. Frequently in practice, there are also clinical data available on those cases on which the tissue samples have been obtained. Here we investigate how to use the clinical data in conjunction with the microarray gene expression data to cluster the tissue samples. We propose two mixture model-based approaches in which the number of components in the mixture model corresponds to the number of clusters to be imposed on the tissue samples. One approach specifies the components of the mixture model to be the conditional distributions of the microarray data given the clinical data with the mixing proportions also conditioned on the latter data. Another takes the components of the mixture model to represent the joint distributions of the clinical and microarray data. The approaches are demonstrated on some breast cancer data, as studied recently in van't Veer et al. (2002).
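As a toy illustration of the mixture-model machinery, the sketch below fits a two-component one-dimensional Gaussian mixture by plain EM, where each component corresponds to one tissue cluster. The paper's actual models additionally condition the components and mixing proportions on clinical covariates; that conditioning is omitted here, and the data values are invented.

```python
import math

def em_gmm_1d(x, k=2, iters=50):
    """Minimal EM for a k-component 1-D Gaussian mixture.
    Each mixture component corresponds to one tissue cluster."""
    xs = sorted(x)
    # deterministic initialization: means spread over the data range
    mu = [xs[round(j * (len(xs) - 1) / (k - 1))] for j in range(k)]
    var = [1.0] * k
    pi = [1.0 / k] * k
    n = len(x)
    for _ in range(iters):
        # E-step: responsibilities r[i][j] = P(component j | x[i])
        r = []
        for xi in x:
            w = [pi[j] * math.exp(-((xi - mu[j]) ** 2) / (2 * var[j]))
                 / math.sqrt(2 * math.pi * var[j]) for j in range(k)]
            s = sum(w)
            r.append([wj / s for wj in w])
        # M-step: update mixing proportions, means and variances
        for j in range(k):
            nj = sum(r[i][j] for i in range(n))
            pi[j] = nj / n
            mu[j] = sum(r[i][j] * x[i] for i in range(n)) / nj
            var[j] = sum(r[i][j] * (x[i] - mu[j]) ** 2 for i in range(n)) / nj + 1e-6
    return pi, mu, var

# two well-separated groups of invented "expression values"
data = [0.1, 0.2, -0.1, 0.0, 5.0, 5.2, 4.9, 5.1]
pi, mu, var = em_gmm_1d(data)
```

The number of components k plays exactly the role described in the abstract: it is the number of clusters imposed on the samples.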

Relevance: 30.00%

Abstract:

This paper describes an experiment in designing, implementing and testing a Transport layer cluster scheduling and dispatching architecture. The motivation for the experiment was the hypothesis that a Transport layer clustering solution may offer advantages over the existing industry-standard Network layer and Data Link layer approaches. The critical success factors initially established to guide and evaluate the experiment were reduced dispatcher workload, reduced dispatcher internal state memory requirements, distributed denial of service resilience, and cluster software design simplicity. The functional design stage of the experiment produced a Transport layer strategy for scheduling and load balancing based on the specification of two new TCP options. Implementation required the introduction of the newly specified TCP options into the Linux (2.4) kernel. The implementation produced an extended Linux Socket API to facilitate user-process access to the additional TCP capability. The testing stage of the experiment confirmed the operational efficiency of the solution.
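The abstract does not specify the contents of the two new TCP options, so the sketch below only illustrates the generic kind/length/value encoding that every TCP option uses on the wire. The option kind numbers and the backend-node-id payload are hypothetical (253 and 254 are the experimental-use kinds reserved by RFC 4727).

```python
import struct

# Hypothetical kinds for the cluster scheduling/dispatching options;
# the study's real assigned numbers are not given in the abstract.
TCPOPT_CLUSTER_SCHED = 253     # RFC 4727 experimental-use kind
TCPOPT_CLUSTER_DISPATCH = 254  # RFC 4727 experimental-use kind

def encode_tcp_option(kind: int, payload: bytes) -> bytes:
    """Encode a TCP option as kind (1 byte), length (1 byte), value.
    Per the TCP spec, length covers the kind and length octets too."""
    return struct.pack("!BB", kind, 2 + len(payload)) + payload

def decode_tcp_option(raw: bytes):
    """Return (kind, value) from an encoded option."""
    kind, length = struct.unpack("!BB", raw[:2])
    return kind, raw[2:length]

# e.g. a hypothetical 16-bit backend-node id carried in the option
opt = encode_tcp_option(TCPOPT_CLUSTER_SCHED, struct.pack("!H", 42))
kind, value = decode_tcp_option(opt)
```

In the experiment itself such options would be produced and consumed inside the 2.4 kernel's TCP stack, with the extended Socket API exposing them to user processes.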

Relevance: 30.00%

Abstract:

The manufacture of copper alloy flat rolled metals involves hot and cold rolling operations, together with annealing and other secondary processes, to transform castings (mainly slabs and cakes) into such shapes as strip, plate, sheet, etc. Production is mainly to customer orders in a wide range of specifications for dimensions and properties. However, order quantities are often small, and so process planning plays an important role in this industry. Much research work has been done in the past on the technology of flat rolling and the details of the operations; however, there is little or no evidence of any research into the planning of processes for this type of manufacture. Practical observation in a number of rolling mills has established the type of manual process planning traditionally used in this industry. This manual approach, however, has inherent drawbacks, being particularly dependent on individual planners who gain their knowledge over a long span of practical experience. The introduction of the retrieval CAPP approach to this industry was a first step towards reducing these problems. But this could not provide a long-term answer because of the need for an experienced planner to supervise the generation of any plan. It also fails to take account of the dynamic nature of the parameters involved in planning, such as the availability of resources, operating conditions and variations in costs. The other alternative is the use of a generative approach to planning in the rolling mill context. In this thesis, generative methods are developed for the selection of optimal routes for single orders and then for batches of orders, bearing in mind equipment restrictions, production costs and material yield. The batch order process planning involves the use of a special cluster analysis algorithm for optimal grouping of the orders. This research concentrates on cold-rolling operations.
A prototype model of the proposed CAPP system, including both single order and batch order planning options, has been developed and tested on real order data in the industry. The results were satisfactory and compared very favourably with the existing manual and retrieval methods.
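The thesis's special cluster analysis algorithm is not described in the abstract. As a purely hypothetical illustration of batch-order grouping, the sketch below greedily groups orders whose finished gauge and width are close; the distance function, threshold and order fields are all invented for the example.

```python
def order_distance(a, b):
    """Dissimilarity between two orders: gap in finished gauge (mm)
    plus a scaled gap in strip width (mm). A real criterion would
    also weigh alloy, temper, route and yield."""
    return abs(a["gauge"] - b["gauge"]) + 0.01 * abs(a["width"] - b["width"])

def group_orders(orders, threshold=0.5):
    """Greedy single-pass clustering: each order joins the first
    existing batch whose seed order is within `threshold`,
    otherwise it starts a new batch."""
    batches = []
    for o in orders:
        for batch in batches:
            if order_distance(batch[0], o) <= threshold:
                batch.append(o)
                break
        else:
            batches.append([o])
    return batches

orders = [
    {"id": 1, "gauge": 0.50, "width": 300},
    {"id": 2, "gauge": 0.55, "width": 310},
    {"id": 3, "gauge": 2.00, "width": 600},
]
batches = group_orders(orders)
```

Orders 1 and 2 end up in one batch (similar gauge and width) while order 3 starts its own, mirroring how similar orders can share a rolling route.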

Relevance: 30.00%

Abstract:

Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, which takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could.

The idea of the ionized cluster beam (ICB) thin-film deposition technique was first proposed by Takagi in 1972. It was based on using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for the formation of cluster beams suitable for thin-film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development.

In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices.

The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory.

A novel in situ real-time mass, energy and velocity measurement apparatus has been designed, built and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from the problems that affect other methods of cluster size measurement, such as the requirement for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size range, and the lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured value of the background gas mass is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes.

Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully-ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
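The mass calculation m = 2E/v² can be sketched directly. The 1000-atom zinc cluster and the 500 eV beam energy below are illustrative values only, used here to round-trip the formula.

```python
import math

EV = 1.602176634e-19      # joules per electron-volt
AMU = 1.66053906660e-27   # kilograms per atomic mass unit

def cluster_mass(energy_ev, velocity):
    """Mass from measured kinetic energy and velocity, m = 2E/v^2.
    Energy in eV, velocity in m/s; returns mass in kg."""
    return 2.0 * energy_ev * EV / velocity ** 2

# hypothetical 1000-atom zinc cluster ion (Zn ~ 65.38 u per atom)
m_true = 1000 * 65.38 * AMU
# the speed such a cluster would have at 500 eV kinetic energy
v = math.sqrt(2 * 500 * EV / m_true)
m_measured = cluster_mass(500, v)
```

Recovering m_true from the simulated (E, v) pair confirms the formula is self-consistent; in the apparatus, E comes from the electrostatic analyzer and v from the pulse-to-signal delay.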

Relevance: 30.00%

Abstract:

Computers have dramatically changed the way we live, conduct business, and deliver education. They have infiltrated the Bahamian public school system to the extent that many educators now feel the need for a national plan. The development of such a plan is a challenging undertaking, especially in developing countries where physical, financial, and human resources are scarce. This study assessed the situation with regard to computers within the Bahamian public school system, and provided recommended guidelines to the Bahamian government based on the results of a survey, the body of knowledge about trends in computer usage in schools, and the country's needs.

This was a descriptive study for which an extensive review of the literature in the areas of computer hardware, software, teacher training, research, curriculum, support services and local context variables was undertaken. One objective of the study was to establish what should or could be, relative to the state of the art in educational computing. A survey was conducted involving 201 teachers and 51 school administrators from 60 randomly selected Bahamian public schools. A random stratified cluster sampling technique was used.

This study used both quantitative and qualitative research methodologies. Quantitative methods were used to summarize the data about numbers and types of computers, categories of software available, peripheral equipment, and related topics through the use of forced-choice questions in a survey instrument. The results were displayed in tables and charts. Qualitative methods, data synthesis and content analysis, were used to analyze the non-numeric data obtained from open-ended questions on the teachers' and school administrators' questionnaires, such as those regarding teachers' perceptions and attitudes about computers and their use in classrooms. Interpretative methodologies were also used to analyze the qualitative results of several interviews conducted with senior public school system officials. Content analysis was used to gather data from the literature on topics pertaining to the study.

Based on the literature review and the data gathered for this study, a number of recommendations are presented. These recommendations may be used by the government of the Commonwealth of The Bahamas to establish policies with regard to the use of computers within the public school system.

Relevance: 30.00%

Abstract:

In this paper an architecture for an estimator of short-term wind farm power is proposed. The estimator is made up of a Linear Machine classifier and a set of k Multilayer Perceptrons, each one trained for a specific subspace of the input space. The splitting of the input dataset into the k clusters is done using a k-means technique, with the equivalent Linear Machine classifier obtained from the cluster centroids...
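A minimal sketch of the routing idea, assuming nothing beyond the abstract: k-means finds cluster centroids, and the equivalent nearest-centroid (Linear Machine) rule then routes each input to the expert trained on its subspace. The per-cluster Perceptrons themselves are omitted, and the data points are invented.

```python
def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    """Plain k-means; the resulting centroids define the equivalent
    Linear Machine (nearest-centroid) classifier."""
    centroids = points[:k]  # deterministic init, adequate for a sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: dist2(p, centroids[j]))
            clusters[j].append(p)
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids

def assign(p, centroids):
    """Route input p to the expert (MLP) of its nearest cluster."""
    return min(range(len(centroids)), key=lambda j: dist2(p, centroids[j]))

# invented wind measurements (speed, direction) split into k = 2 regimes
data = [(2.0, 0.1), (2.2, 0.2), (9.0, 1.0), (9.5, 1.1)]
cents = kmeans(data, 2)
```

At prediction time, `assign` picks the regime and the corresponding Perceptron would produce the power estimate for that subspace.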

Relevance: 30.00%

Abstract:

Data without labels are commonly analyzed using unsupervised machine learning techniques. Such techniques can provide representations that are more meaningful, and more useful for understanding the problem at hand, than the raw data alone. Although abundant expert knowledge exists in many areas where unlabelled data are examined, such knowledge is rarely incorporated into automatic analysis. Incorporating expert knowledge is frequently a matter of combining multiple data sources from disparate hypothetical spaces. When such spaces belong to different data types, this task becomes even more challenging. In this paper we present a novel immune-inspired method that enables the fusion of such disparate types of data for a specific set of problems. We show that our method provides a better visual understanding of one hypothetical space with the help of data from another hypothetical space. We believe that our model has implications for the field of exploratory data analysis and knowledge discovery.

Relevance: 30.00%

Abstract:

Over the years, research efforts in High Performance Computing have produced important results in raising performance, both in terms of the number of operations performed per unit of time and by introducing or improving the parallel algorithms found in the literature. These advances have changed the internal structure of the machines: processor architectures have evolved, and GPUs have been adopted as additional computing resources. The consequence of continually increasing performance is a large energy expenditure, since HPC machines are designed to carry out intense computation over very long periods; the energy needed to power each node and dissipate the heat it generates entails high costs. Among the solutions proposed to limit energy consumption, the one that has attracted the most interest, both in research and on the market, is the integration of RISC (Reduced Instruction Set Computer) CPUs, which can achieve satisfactory performance with lower energy use than CISC (Complex Instruction Set Computer) CPUs. This thesis presents a performance analysis of Monte Cimone, a cluster composed of 8 compute nodes based on the RISC-V architecture and distributed across 4 dual-board blades. Benchmarks are run to evaluate: the performance of long- and short-distance data exchange; performance on problems with little spatial locality; and performance on graph problems, specifically breadth-first search and single-source shortest paths.
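A minimal sketch of the breadth-first-search kernel underlying the graph benchmarks mentioned above; the adjacency list is a toy example, not one of the thesis's benchmark graphs.

```python
from collections import deque

def bfs_distances(adj, source):
    """Breadth-first search: hop distance from `source` to every
    reachable vertex, visiting vertices in FIFO order."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:          # first visit = shortest hop count
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# toy undirected graph as an adjacency list
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
d = bfs_distances(adj, 0)
```

BFS stresses memory bandwidth and irregular access rather than arithmetic, which is exactly why it is a useful probe of a cluster whose spatial locality is poor.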

Relevance: 20.00%

Abstract:

The new social panorama resulting from the aging of the Brazilian population is leading to significant transformations within healthcare. Using a cluster analysis strategy, this study sought to describe the specific care demands of the elderly population in terms of frailty components. It was a cross-sectional study based on a review of medical records, conducted at the geriatric outpatient clinic of the Hospital de Clínicas, Universidade Estadual de Campinas (Unicamp). Ninety-eight elderly users of this clinic were evaluated using cluster analysis and instruments for assessing their overall geriatric status and frailty characteristics. The variables that most strongly influenced the formation of clusters were age, functional capacity, cognitive capacity, presence of comorbidities and number of medications used. Three main groups of elderly people could be identified: one with good cognitive and functional performance but a high prevalence of comorbidities (mean age 77.9 years, cognitive impairment in 28.6% and a mean of 7.4 comorbidities); a second with more advanced age, greater cognitive impairment and greater dependence (mean age 88.5 years, cognitive impairment in 84.6% and a mean of 7.1 comorbidities); and a third, younger group with poor cognitive performance and a greater number of comorbidities but functionally independent (mean age 78.5 years, cognitive impairment in 89.6% and a mean of 7.4 comorbidities). These data characterize the profile of this population and can be used as the basis for developing efficient strategies aimed at diminishing functional dependence, poor self-rated health and impaired quality of life.

Relevance: 20.00%

Abstract:

The [Ru3O(Ac)6(py)2(CH3OH)]+ cluster provides an effective electrocatalytic species for the oxidation of methanol under mild conditions. This complex exhibits characteristic electrochemical waves at -1.02, 0.15 and 1.18 V, associated with the successive Ru3(III,II,II)/Ru3(III,III,II)/Ru3(III,III,III)/Ru3(IV,III,III) redox couples, respectively. Above 1.7 V, formation of two Ru(IV) centers enhances the 2-electron oxidation of the methanol ligand, yielding formaldehyde, in agreement with the theoretical evolution of the HOMO levels as a function of the oxidation states. This work illustrates an important strategy for improving the efficiency of oxidation catalysis: using a multicentered redox catalyst and accessing its multiple higher oxidation states.

Relevance: 20.00%

Abstract:

This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on a CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the use of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.

Relevance: 20.00%

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on prior knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, computer virus propagation dynamics are modeled and related to other notable events occurring in the network, permitting preventive policies to be established in network management. Data on three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus's propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
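As a minimal illustration of the autoregressive identification technique mentioned (not the authors' actual model), the sketch below fits a first-order autoregressive coefficient by least squares and uses it to forecast the next value of an incidence series; the series is a noiseless toy.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + noise,
    the simplest autoregressive model of a virus's daily incidence."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast(x0, phi, steps):
    """Iterate the fitted model forward from the last observation."""
    out, x = [], x0
    for _ in range(steps):
        x = phi * x
        out.append(x)
    return out

# toy noiseless incidence decaying geometrically with phi = 0.8
series = [100 * 0.8 ** t for t in range(10)]
phi = fit_ar1(series)
next_vals = forecast(series[-1], phi, 3)
```

Fitting on one virus's series and forecasting with the recovered coefficient is the spirit of the paper's claim that the dynamics of past infections can predict new ones.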

Relevance: 20.00%

Abstract:

Background: Identifying clusters of acute paracoccidioidomycosis cases could potentially help in identifying the environmental factors that influence the incidence of this mycosis. However, unlike other endemic mycoses, there are no published reports of clusters of paracoccidioidomycosis. Methodology/Principal Findings: A retrospective cluster detection test was applied to verify whether an excess of acute-form (AF) paracoccidioidomycosis cases in time and/or space occurred in Botucatu, an endemic area in São Paulo State. The scan test SaTScan v7.0.3 was set to find clusters for a maximum temporal period of 1 year. The temporal test indicated a significant cluster in 1985 (P<0.005). This cluster comprised 10 cases, although only 2.19 were expected for this year in this area. The age and clinical presentation of these cases were typical of AF paracoccidioidomycosis. The space-time test confirmed the temporal cluster in 1985 and showed the localities where the risk was higher in that year. The cluster suggests that some particular conditions arose in the antecedent years in those localities. Analysis of climate variables showed that soil water storage was atypically high in 1982/83 (~2.11/2.5 SD above the mean), and the absolute air humidity in 1984, the year preceding the cluster, was much higher than normal (~1.6 SD above the mean), conditions that may have favored, respectively, antecedent fungal growth in the soil and conidia liberation in 1984, the probable year of exposure. These climatic anomalies in this area were due to the 1982/83 El Niño event, the strongest in the last 50 years. Conclusions/Significance: We describe the first cluster of AF paracoccidioidomycosis, which was potentially linked to a climatic anomaly caused by the 1982/83 El Niño Southern Oscillation.
This finding is important because it may help to clarify the conditions that favor Paracoccidioides brasiliensis survival and growth in the environment and that enhance human exposure, thus allowing the development of preventive measures.
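A minimal sketch of the Poisson likelihood-ratio statistic that SaTScan-style temporal scan tests maximize, applied to toy yearly counts shaped like the 1985 excess (10 observed against roughly 2 expected); the counts for the other years are invented, and no Monte Carlo significance testing is done here.

```python
import math

def scan_llr(c, e, C):
    """Poisson log-likelihood ratio of a candidate one-year window:
    c observed and e expected cases inside, C total cases overall.
    Expectations are conditioned so they sum to C; only excesses
    (c > e) count as candidate clusters."""
    if c <= e or c >= C:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

# toy yearly AF case counts; a flat baseline gives each year
# the same expected count C / number_of_years
cases = {1983: 2, 1984: 3, 1985: 10, 1986: 1}
C = sum(cases.values())
e = C / len(cases)
llrs = {year: scan_llr(n, e, C) for year, n in cases.items()}
best = max(llrs, key=llrs.get)
```

The window with the largest ratio (here 1985) is the most likely cluster; SaTScan then assesses its significance by comparing the maximum against Monte Carlo replicates under the null.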

Relevance: 20.00%

Abstract:

Context. Abundance variations in moderately metal-rich globular clusters can give clues about the formation and chemical enrichment of globular clusters. Aims. CN, CH, Na, Mg and Al indices are measured in spectra of 89 stars of the template metal-rich globular cluster M71, and the implications for internal mixing are discussed. Methods. Stars from the turn-off up to the Red Giant Branch (0.87 < log g < 4.65), observed with the GMOS multi-object spectrograph at the Gemini-North telescope, are analyzed. Radial velocities, colours, effective temperatures, gravities and spectral indices are determined for the sample. Results. Previous findings related to the CN bimodality and the CN-CH anticorrelation in stars of M71 are confirmed. We also find a CN-Na correlation, an Al-Na correlation, and an Mg2-Al anticorrelation. Conclusions. A combination of convective mixing and primordial pollution by AGB or massive stars in the early stages of globular cluster formation is required to explain the observations.