968 results for Ammassi,Galassie,emissioni,non termiche,cluster,relitti,radio


Relevance: 100.00%

Abstract:

The computers of a non-dedicated cluster are often idle (users attend meetings, have lunch or coffee breaks) or lightly loaded (users carry out simple computations). These underutilized computers can be employed to execute parallel applications not only during weekends and at night but also during office hours. They then have to be shared by parallel and sequential applications, which could lead to improved execution performance. However, there is a lack of experimental studies showing the behavior and performance of parallel and sequential applications executing concurrently on clusters. We present here the results of an experimental study of load-balancing-based scheduling of a mixture of parallel and sequential applications on a non-dedicated cluster.
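
To make the idea of load-balancing-based placement concrete, the sketch below greedily assigns the processes of a parallel application to the currently least-loaded workstations of a non-dedicated cluster. The node names, load values and the `assign_parallel_tasks` policy are hypothetical illustrations of the concept, not the scheduler actually used in the experiments.

import heapq

def assign_parallel_tasks(node_loads, num_tasks):
    """Greedy load balancing: place each task on the currently least-loaded node.

    node_loads: dict mapping node name -> current load (e.g. owner's sequential jobs).
    Returns a dict mapping node name -> number of parallel tasks assigned.
    """
    # Min-heap keyed by load, so the least-loaded node is always popped first.
    heap = [(load, node) for node, load in node_loads.items()]
    heapq.heapify(heap)
    placement = {node: 0 for node in node_loads}
    for _ in range(num_tasks):
        load, node = heapq.heappop(heap)
        placement[node] += 1
        # Each parallel task adds one unit of load to the node it runs on.
        heapq.heappush(heap, (load + 1, node))
    return placement

if __name__ == "__main__":
    # Hypothetical snapshot of owner (sequential) load on four workstations.
    loads = {"ws1": 0.2, "ws2": 1.0, "ws3": 0.0, "ws4": 0.5}
    print(assign_parallel_tasks(loads, 8))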

Relevance: 100.00%

Abstract:

Although the individual PCs of a cluster are used by their owners to run sequential applications (local jobs), the cluster as a whole, or a subset of it, can also be employed to run parallel applications (cluster jobs) even during working hours. This implies that these computers have to be shared by parallel and sequential applications, which could improve execution performance and resource utilization. However, there is a lack of experimental studies showing the behavior and performance of parallel and sequential applications executing concurrently on a non-dedicated cluster. The results of such research would benefit the development of new global scheduling algorithms. We present the results of an experimental study of the scheduling of a mixture of parallel and sequential applications on a non-dedicated cluster. The aim of this study is to learn how the concurrent execution of a communication-intensive parallel application and sequential applications influences their execution performance and the utilization of the cluster.

Relevance: 100.00%

Abstract:

Coordinated scheduling of multiple parallel applications across computers is currently considered the critical factor in achieving high execution performance. We claim in this report that the performance and cost of executing parallel applications could be improved if not only dedicated but also non-dedicated clusters were used and several parallel applications were executed concurrently. To support this claim we carried out an experimental study of the performance of multiple NAS parallel programs executing concurrently on a non-dedicated cluster.

Relevance: 100.00%

Abstract:

Studies have shown that most of the computers in a non-dedicated cluster are often idle or lightly loaded. These underutilized computers can be employed to execute parallel applications. The aim of this study is to learn how the concurrent execution of a computation-bound parallel application and sequential applications influences their execution performance and cluster utilization. The results of the study demonstrate that a computation-bound parallel application benefits from load balancing while sequential applications suffer only an insignificant slowdown. Overall, the utilization of the non-dedicated cluster is improved.
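
The two quantities reported here, the slowdown of the sequential applications and the overall cluster utilization, can be written down explicitly. The metric definitions below (slowdown as the ratio of shared to dedicated execution time, utilization as busy node-time over available node-time) are standard, but all numbers are purely illustrative and are not the measurements of the study.

def slowdown(t_shared, t_dedicated):
    """Slowdown of a job: execution time when sharing the node / time when running alone."""
    return t_shared / t_dedicated

def cluster_utilization(busy_node_seconds, num_nodes, wall_seconds):
    """Fraction of the available node-time actually spent doing useful work."""
    return busy_node_seconds / (num_nodes * wall_seconds)

if __name__ == "__main__":
    # Illustrative numbers only: a sequential job takes 105 s instead of 100 s
    # when a parallel application shares its node.
    print(f"sequential slowdown: {slowdown(105.0, 100.0):.2f}x")
    # 8 nodes observed for 3600 s; 24000 node-seconds of useful work recorded.
    print(f"utilization: {cluster_utilization(24000.0, 8, 3600.0):.0%}")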

Relevance: 100.00%

Abstract:

Part I: A zinc finger gene, Tzf1, was cloned in earlier work of the lab by screening a λ-DASH2 cDNA expression library with an anti-rat SC antibody. A λ-DASH2 genomic DNA library and a Lawrist 4 cosmid genomic DNA library were screened with the cDNA fragment of Tzf1 to determine the genomic organization of Tzf1. Another putative zinc finger gene, Tzf2, was found about 700 bp upstream of Tzf1. RACE experiments were carried out for both genes to establish the full-length cDNAs. The cDNA sequences of Tzf and Tzf2 were used to search FlyBase (version of Nov. 2000). They correspond to two genes found in FlyBase, CG4413 and CG4936. The CG4413 transcript appears to be a splice variant of the Tzf transcripts. Another two zinc finger genes, Tzf3 and Tzf4, were discovered in silico. They are located 300 bp away from Tzf and Tzf2, and the four genes form a non-tandem cluster. All four genes encode proteins with a very similar modular structure, since they all have five C2H2-type zinc fingers at their C-terminal ends. This is the most compact zinc finger protein gene cluster found in Drosophila melanogaster. Part II: 34,056 bp insert of the cosmid 19G11
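
As an illustration of the kind of in silico motif scan by which C2H2 zinc finger genes can be spotted, the sketch below searches a protein sequence with a simplified textbook consensus, C-x(2,4)-C-x(12)-H-x(3,5)-H. The regular expression, the toy sequence and the function name are illustrative assumptions, not the screening procedure actually used in the thesis.

import re

# Simplified C2H2 zinc finger consensus: two cysteines followed by two histidines
# with loosely constrained spacer lengths. This is a rough approximation of the
# canonical motif, not a validated profile or HMM.
C2H2 = re.compile(r"C.{2,4}C.{12}H.{3,5}H")

def find_zinc_fingers(protein_seq):
    """Return (start, matched_substring) for each putative C2H2 finger."""
    return [(m.start(), m.group()) for m in C2H2.finditer(protein_seq)]

if __name__ == "__main__":
    # Hypothetical toy sequence containing one motif-like stretch.
    toy = "MASQKRPC" "AQCGKSFSQKGNLLRH" "QRTHTGEKP"
    print(find_zinc_fingers(toy))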

Relevance: 100.00%

Abstract:

This work was carried out within the NUCLEX collaboration, an experiment of INFN's Commissione Nazionale 3 devoted to the study of the dynamics of nuclear reactions induced by heavy ions. In particular, the thesis concerns the analysis of the competition among the different light-charged-particle emission processes from compound systems. More precisely, this thesis studies and compares the emission of alpha particles and protons from equilibrated and pre-equilibrium sources for two different fusion reactions, 16O + 65Cu at 256 MeV and 19F + 62Ni at 304 MeV, both leading to the formation of the compound nucleus 81Rb. The two systems were chosen so as to have one reaction induced by a projectile made of an integer number of alpha particles (alpha-cluster, 16O) and a second one induced by a non-alpha-cluster projectile (19F), at the same beam energy (16 MeV/nucleon). The aim is to search for evidence of alpha-cluster structure in nuclei with mass number A a multiple of 4 and N = Z. In order to disentangle the contributions of the different particle-emission sources, the Moving Source Fit technique was applied to the spectra of the particles emitted and detected with the GARFIELD apparatus in coincidence with the evaporation residues measured by the Ring Counter annular detector. The experimental results obtained by applying the Moving Source Fit technique to our data are interesting and seem to contradict the expected larger alpha-particle emission from the system with the alpha-cluster projectile 16O. The possible hypotheses behind this rather surprising result are still under discussion and will be the subject of further checks through the study of more exclusive correlations and comparison with more complex pre-equilibrium models.
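
For readers unfamiliar with the Moving Source Fit technique mentioned above, the following is a minimal sketch: the laboratory energy spectrum of light charged particles at a fixed angle is fitted with a sum of Maxwellian sources, each with its own normalization, temperature and source velocity. The parameterization shown (a simplified non-relativistic moving Maxwellian) and all numbers are illustrative assumptions; the actual analysis of the GARFIELD and Ring Counter data uses a more complete prescription.

import numpy as np
from scipy.optimize import curve_fit

def moving_maxwellian(E, N, T, Es, theta):
    """Simplified lab-frame spectrum of a Maxwellian source moving along the beam
    axis: E is the particle kinetic energy (MeV), T the source temperature (MeV),
    Es the source kinetic energy per emitted particle (MeV), theta the lab angle."""
    Eprime = E - 2.0 * np.sqrt(E * Es) * np.cos(theta) + Es  # energy in the source frame
    return N * np.sqrt(E) * np.exp(-Eprime / T)

def two_source_model(E, N1, T1, Es1, N2, T2, Es2, theta=np.radians(40.0)):
    # Equilibrated (slow, cool) source plus pre-equilibrium (fast, hot) source.
    return (moving_maxwellian(E, N1, T1, Es1, theta)
            + moving_maxwellian(E, N2, T2, Es2, theta))

if __name__ == "__main__":
    # Synthetic "data": a hot pre-equilibrium tail on top of an evaporative spectrum.
    E = np.linspace(5, 80, 60)
    truth = (4.0, 3.0, 1.0, 0.4, 8.0, 6.0)
    rng = np.random.default_rng(0)
    y = two_source_model(E, *truth) * rng.normal(1.0, 0.05, E.size)
    p0 = (1.0, 2.0, 0.5, 0.1, 10.0, 5.0)
    popt, _ = curve_fit(two_source_model, E, y, p0=p0, bounds=(0, np.inf))
    print("fitted (N1, T1, Es1, N2, T2, Es2):", np.round(popt, 2))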

Relevance: 100.00%

Abstract:

This thesis is an empirical study of how two words in Icelandic, "nú" and "núna", are used in contemporary Icelandic conversation. My aims in this study are, first, to explain the differences between the temporal functions of "nú" and "núna", and, second, to describe the non-temporal functions of "nú". In the analysis, a focus is placed on comparing the sequential placement of the two words, on their syntactic distribution, and on their prosodic realization. The empirical data comprise 14 hours and 11 minutes of naturally occurring conversation recorded between 1996 and 2003. The selected conversations represent a wide range of interactional contexts including informal dinner parties, institutional and non-institutional telephone conversations, radio programs for teenagers, phone-in programs, and, finally, a political debate on television. The theoretical and methodological framework is interactional linguistics, which can be described as linguistically oriented conversation analysis (CA). A comparison of "nú" and "núna" shows that the two words have different syntactic distributions. "Nú" has a clear tendency to occur in the front field, before the finite verb, while "núna" typically occurs in the end field, after the object. It is argued that this syntactic difference reflects a functional difference between "nú" and "núna". A sequential analysis of "núna" shows that the word refers to an unspecified period of time which includes the utterance time as well as some time in the past and in the future. This temporal relation is referred to as reference time. "Nú", by contrast, is mainly used in three different environments: (1) in temporal comparisons, (2) in transitions, and (3) when the speaker is taking an affective stance. The non-temporal functions of "nú" are divided into three categories: (1) "nú" as a tone particle, (2) "nú" as an utterance particle, and (3) "nú" as a dialogue particle. "Nú" as a tone particle is syntactically integrated and can occur in two syntactic positions: pre-verbally and post-verbally. I argue that these instances are employed in utterances in which a speaker is foregrounding information or marking it as particularly important. The study shows that, although these instances are typically prosodically non-prominent and unstressed, they are in some cases delivered with stress and with a higher pitch than the surrounding talk. "Nú" as an utterance particle occurs turn-initially and is syntactically non-integrated. By using "nú", speakers show continuity between turns and link new turns to prior ones. These instances initiate either continuations by the same speaker or new turns after speaker shifts. "Nú" as a dialogue particle occurs as a turn of its own. The study shows that these instances register informings in prior turns as unexpected or as a departure from the normal state of affairs. "Nú" as a dialogue particle is often delivered with a prolonged vowel and a recognizable intonation contour. A comparative sequential and prosodic analysis shows that in these cases there is a correlation between the function of "nú" and the intonation contour with which it is delivered. Finally, I argue that despite the many functions of "nú", all the instances can be said to have a common denominator, which is to display attention towards the present moment and the utterances produced before or after "nú". Instead of anchoring the utterances in external time or reference time, these instances position the utterance in discourse-internal time, or discourse time.

Relevance: 100.00%

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even to cosmology. This is due to the fact that in most phase transitions the new phase is separated from the mother phase by a free energy barrier. This barrier is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as well as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place at given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few or some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest of clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory by Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. We also show that the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
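
As a concrete illustration of the Classical Nucleation Theory quantities discussed above, the sketch below evaluates the liquid-drop work of formation W(n) = -n kT ln S + theta * n^(2/3) together with the resulting critical cluster size and barrier height. The water parameters are rough textbook values and are only meant to show the structure of the model, not to reproduce the thesis results.

import numpy as np

# Rough illustrative parameters for water vapour near 298 K.
kT = 1.380649e-23 * 298.0          # J
sigma = 0.072                      # planar surface tension, N/m
v_l = 2.99e-29                     # molecular volume in the liquid, m^3
S = 3.0                            # saturation ratio

# Surface-term prefactor of the liquid drop model: A(n)*sigma = theta * n^(2/3).
theta = (36.0 * np.pi) ** (1.0 / 3.0) * v_l ** (2.0 / 3.0) * sigma

def work_of_formation(n):
    """CNT liquid-drop work of forming an n-molecule cluster, in units of kT."""
    return (-n * kT * np.log(S) + theta * n ** (2.0 / 3.0)) / kT

# Critical cluster size and barrier height from dW/dn = 0.
n_star = (2.0 * theta / (3.0 * kT * np.log(S))) ** 3
W_star = theta * n_star ** (2.0 / 3.0) / (3.0 * kT)   # equals work_of_formation(n_star)

print(f"critical size n* ~ {n_star:.0f} molecules, barrier ~ {W_star:.0f} kT")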


Relevance: 100.00%

Abstract:

The methods of secondary wood processing are assumed to evolve over time and to affect the requirements set for the wood material and its suppliers. The study aimed at analysing the industrial operating modes applied by joinery and furniture manufacturers as sawnwood users. Industrial operating mode was defined as a pattern of important decisions and actions taken by a company which describes the company's level of adjustment in the late-industrial transition. A non-probabilistic sample of 127 companies was interviewed, including companies from Denmark, Germany, the Netherlands, and Finland. Fifty-two of the firms were furniture manufacturers and the other 75 produced windows and doors. Variables related to business philosophy, production operations, and supplier choice criteria were measured and used as a basis for a customer typology; variables related to wood usage and perceived sawmill performance were measured to profile the customer types. Factor analysis was used to determine the latent dimensions of the industrial operating mode. Canonical correlation analysis was applied in developing the final basis for classifying the observations. Non-hierarchical cluster analysis was employed to build a five-group typology of secondary wood processing firms; these ranged from traditional mass producers to late-industrial flexible manufacturers. There is a clear connection between the number of late-industrial elements in a company and the share of special and customised sawnwood it uses. Joinery or furniture manufacturers that are more late-industrial are also likely to use more component-type wood material and to appreciate customer-oriented technical precision. The results show that the change is towards the use of late-industrial sawnwood materials and late-industrial supplier relationships.
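
The analysis chain described above (factor analysis for latent dimensions, canonical correlation for the classification basis, non-hierarchical clustering for the typology) can be sketched with standard tools. The sketch below uses synthetic data; the variable groups, component counts and sample size of 127 are illustrative placeholders, not the study's actual data or settings.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cross_decomposition import CCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical survey responses from 127 firms: operating-mode items
# (business philosophy, production operations) and supplier-choice items.
X_mode = rng.normal(size=(127, 12))       # operating-mode statements
X_supplier = rng.normal(size=(127, 6))    # supplier-choice criteria

# 1) Latent dimensions of the industrial operating mode.
fa = FactorAnalysis(n_components=3, random_state=0)
mode_scores = fa.fit_transform(X_mode)

# 2) Canonical correlation between operating-mode factors and supplier criteria,
#    used here as the basis for classifying the firms.
cca = CCA(n_components=2)
mode_canon, supplier_canon = cca.fit_transform(mode_scores, X_supplier)

# 3) Non-hierarchical (k-means) clustering into a five-group typology.
km = KMeans(n_clusters=5, n_init=10, random_state=0)
typology = km.fit_predict(np.hstack([mode_canon, supplier_canon]))

print("firms per cluster:", np.bincount(typology))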

Relevance: 100.00%

Abstract:

I. The 3.7 Å Crystal Structure of Horse Heart Ferricytochrome C.

The crystal structure of horse heart ferricytochrome c has been determined to a resolution of 3.7 Å using the multiple isomorphous replacement technique. Two isomorphous derivatives were used in the analysis, leading to a map with a mean figure of merit of 0.458. The quality of the resulting map was extremely high, even though the derivative data did not appear to be of high quality.

Although it was impossible to fit the known amino acid sequence to the calculated structure in an unambiguous way, many important features of the molecule could still be determined from the 3.7 Å electron density map. Among these was the fact that cytochrome c contains little or no α-helix. The polypeptide chain appears to be wound about the heme group in such a way as to form a loosely packed hydrophobic core in the molecule.

The heme group is located in a cleft on the molecule with one edge exposed to the solvent. The fifth coordinating ligand is His 18 and the sixth coordinating ligand is probably neither His 26 nor His 33.

The high resolution analysis of cytochrome c is now in progress and should be completed within the next year.

II. The Application of the Karle-Hauptman Tangent Formula to Protein Phasing.

The Karle-Hauptman tangent formula has been shown to be applicable to the refinement of previously determined protein phases. Tests were made with both the cytochrome c data from Part I and a theoretical structure based on the myoglobin molecule. The refinement process was found to be highly dependent upon the manner in which the tangent formula was applied. Iterative procedures did not work well, at least at low resolution.
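
For reference, the tangent formula referred to in this section estimates the phase of a reflection h from pairs of already-phased reflections k and h-k. In the usual notation, with E denoting normalized structure factors, it reads

\tan\varphi_{\mathbf{h}} \;\approx\; \frac{\sum_{\mathbf{k}} \left|E_{\mathbf{k}}\,E_{\mathbf{h}-\mathbf{k}}\right| \sin\!\left(\varphi_{\mathbf{k}} + \varphi_{\mathbf{h}-\mathbf{k}}\right)}{\sum_{\mathbf{k}} \left|E_{\mathbf{k}}\,E_{\mathbf{h}-\mathbf{k}}\right| \cos\!\left(\varphi_{\mathbf{k}} + \varphi_{\mathbf{h}-\mathbf{k}}\right)},

where the sum runs over reflection pairs with large |E| values.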

The tangent formula worked very well in selecting the true phase from the two possible phase choices resulting from a single isomorphous replacement phase analysis. The only restriction on this application is that the heavy atoms form a non-centric cluster in the unit cell.

Pages 156 through 284 in this Thesis consist of previously published papers relating to the above two sections. References to these papers can be found on page 155.

Relevance: 100.00%

Abstract:

This study assessed the tuberculosis situation in Brazil between 2001 and 2003, using indicators of the implementation of the National Tuberculosis Control Program (PNCT), and estimated the effects of determinants of the disease's incidence rate. For the assessment, non-hierarchical cluster analysis was used to group Brazilian municipalities according to tuberculosis (TB) and AIDS morbidity and to the performance of the PNCT. These clusters were mapped, comparing their distribution across municipalities, metropolitan regions, and priority municipalities, and by population size. Pearson's chi-squared test was used to test for association between categories. Longitudinal multilevel modelling was used to identify and estimate the effects of the determinants of the disease. The levels were: years, municipalities, and metropolitan regions. The model had random intercepts and slopes. Variables capable of reducing the variance of the levels were retained, since in this way they explain the hierarchical variability of the disease. Income, population density, cure proportion, AIDS incidence rate, and the major Brazilian regions were included. The assessment showed that the most worrying epidemiological situation occurred in municipalities with Low TB and High AIDS, and with High TB and AIDS. The Very Low TB and AIDS cluster contained 50% of the municipalities, which may indicate notification problems. There were six program-performance clusters: Good and Good with low DOTS predominated among small, non-priority municipalities outside metropolitan regions; the Moderate cluster had a higher proportion of priority municipalities; the Fair and Weak clusters contained 10% of the municipalities, with high treatment default and very low cure rates; and the Very Weak cluster was characterized by missing data in the performance indicators. The multilevel model identified AIDS as a factor with an impact on tuberculosis not previously found in other studies, an interaction between income and AIDS, and an important contribution of the metropolitan regions to the distribution of tuberculosis, which manifests itself heterogeneously across the country's major regions. The analysis discriminated among municipalities and showed no association between higher morbidity and better PNCT performance, revealing a mismatch between surveillance and the epidemiological reality of Brazil. The program needs to be strengthened so that it takes AIDS into account when establishing its control strategies. Moreover, low income and population density, already analysed in several studies, also appeared as important factors in these results.
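
The longitudinal multilevel model described above (random intercept and slope, with an income-by-AIDS interaction) can be sketched as follows. The column names and synthetic panel are hypothetical, and the sketch keeps only two levels (years within municipalities), whereas the study also nests municipalities within metropolitan regions.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical long-format panel: 200 municipalities x 3 years (2001-2003).
n_mun, years = 200, [2001, 2002, 2003]
mun = np.repeat(np.arange(n_mun), len(years))
income = rng.normal(0, 1, n_mun)[mun]
aids = rng.gamma(2.0, 5.0, n_mun)[mun] + rng.normal(0, 1, mun.size)
pop_density = rng.lognormal(4, 1, n_mun)[mun]
u = rng.normal(0, 5, n_mun)[mun]   # municipality-level random intercept
tb = 40 + 0.8 * aids - 3 * income + 0.002 * pop_density + u + rng.normal(0, 3, mun.size)

df = pd.DataFrame({"municipality": mun, "tb_incidence": tb, "income": income,
                   "aids_incidence": aids, "pop_density": pop_density})

# Random intercept and random slope for AIDS incidence across municipalities,
# plus an income x AIDS interaction. A small synthetic panel like this may
# trigger convergence warnings; the point is only the model structure.
model = smf.mixedlm("tb_incidence ~ income * aids_incidence + pop_density",
                    data=df, groups=df["municipality"],
                    re_formula="~aids_incidence")
result = model.fit()
print(result.summary())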

Relevance: 100.00%

Abstract:

We present the Pan-STARRS1 discovery of the long-lived and blue transient PS1-11af, which was also detected by the Galaxy Evolution Explorer with coordinated observations in the near-ultraviolet (NUV) band. PS1-11af is associated with the nucleus of an early-type galaxy at redshift z = 0.4046 that exhibits no evidence for star formation or active galactic nucleus activity. Four epochs of spectroscopy reveal a pair of transient broad absorption features in the UV on otherwise featureless spectra. Despite the superficial similarity of these features to P-Cygni absorptions of supernovae (SNe), we conclude that PS1-11af is not consistent with the properties of known types of SNe. Blackbody fits to the spectral energy distribution are inconsistent with the cooling, expanding ejecta of a SN, and the velocities of the absorption features are too high to represent material in homologous expansion near a SN photosphere. However, the constant blue colors and slow evolution of the luminosity are similar to previous optically selected tidal disruption events (TDEs). The shape of the optical light curve is consistent with models for TDEs, but the minimum accreted mass necessary to power the observed luminosity is only 0.002 M_⊙, which points to a partial disruption model. A full disruption model predicts higher bolometric luminosities, which would require most of the radiation to be emitted in a separate component at high energies where we lack observations. In addition, the observed temperature is lower than that predicted by pure accretion disk models for TDEs and requires reprocessing to a constant, lower temperature. Three deep non-detections in the radio with the Very Large Array over the first two years after the event set strict limits on the production of any relativistic outflow comparable to Swift J1644+57, even if off-axis.
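
A minimal sketch of the kind of blackbody fit to a UV/optical spectral energy distribution mentioned above is shown below: a Planck spectrum scaled by a free normalization is fitted to broadband fluxes. The photometric wavelengths, flux values and fitting setup are illustrative assumptions, not the actual PS1-11af data or analysis.

import numpy as np
from scipy.optimize import curve_fit

h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23   # SI units

def blackbody_flux(wavelength_m, T, norm):
    """Planck spectrum B_lambda(T) times a free normalization (dilution and
    solid-angle factors absorbed into `norm`)."""
    x = h * c / (wavelength_m * k_B * T)
    return norm * (2 * h * c**2 / wavelength_m**5) / np.expm1(x)

if __name__ == "__main__":
    # Illustrative NUV/optical photometry (wavelengths in metres, flux in arbitrary units).
    wl = np.array([231e-9, 481e-9, 617e-9, 752e-9, 866e-9])   # roughly NUV, g, r, i, z
    true_T, true_norm = 2.0e4, 3.0e-10
    rng = np.random.default_rng(1)
    flux = blackbody_flux(wl, true_T, true_norm) * rng.normal(1.0, 0.05, wl.size)

    popt, pcov = curve_fit(blackbody_flux, wl, flux, p0=(1.5e4, 1e-10))
    print(f"fitted blackbody temperature: {popt[0]:.3g} K")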

Relevance: 100.00%

Abstract:

Several projects in the recent past have aimed at promoting Wireless Sensor Networks as an infrastructure technology in which several independent users can submit applications that execute concurrently across the network. Concurrent multiple applications cause significant energy-usage overhead on sensor nodes that cannot be eliminated by traditional schemes optimized for single-application scenarios. In this paper, we outline two main optimization techniques for reducing power consumption across applications. First, we describe a compiler-based approach that identifies redundant sensing requests across applications and eliminates them. Second, we cluster radio transmissions by concatenating packets from independent applications based on Rate-Harmonized Scheduling.
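
A toy sketch of the packet-concatenation idea follows: application reporting periods are first harmonized to powers of two of a base period, so that packets from independent applications become due at the same instants and can be sent in a single radio burst. The periods, application names and the specific harmonization rule shown are illustrative simplifications, not the Rate-Harmonized Scheduling algorithm of the paper.

import math
from collections import defaultdict

def harmonize(period, base):
    """Round a period down to base * 2^k so that all periods are harmonics of `base`."""
    return base * 2 ** int(math.floor(math.log2(period / base)))

def batch_transmissions(app_periods, base, horizon):
    """Group packet releases that fall on the same harmonized instant, so one
    radio wake-up can carry the concatenated payloads of several applications."""
    harmonized = {app: harmonize(p, base) for app, p in app_periods.items()}
    bursts = defaultdict(list)
    for app, period in harmonized.items():
        t = period
        while t <= horizon:
            bursts[t].append(app)
            t += period
    return dict(sorted(bursts.items())), harmonized

if __name__ == "__main__":
    # Hypothetical sensing applications with their requested reporting periods (s).
    apps = {"temp": 11, "humidity": 23, "vibration": 6}
    bursts, harmonized = batch_transmissions(apps, base=5, horizon=60)
    print("harmonized periods:", harmonized)
    for t, group in bursts.items():
        print(f"t={t:>2}s -> one radio burst carrying {group}")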

Relevance: 100.00%

Abstract:

This article presents the results of a study carried out among the staff of the Ayuntamiento de Palma de Mallorca (Palma de Mallorca City Council). Its aim is to characterize the senior management positions, as well as the middle-management positions, in a large local public administration, and to obtain homogeneous groups of professionals with management responsibilities on the basis of self-assessed competencies. A descriptive cross-sectional study was carried out, based on a self-administered questionnaire. The 126 people who met the condition of holding management responsibilities were selected. A broad set of variables centred on professional competencies was analysed, and several descriptive analyses were performed, among them a factor analysis of the self-assessed competencies in preparation for a non-hierarchical cluster analysis. The results indicate the existence of three distinct and consistent clusters according to gender, age, and route of access to the management role.