788 results for cluster sampling
Abstract:
The sampling of a given solid angle is a fundamental operation in realistic image synthesis, where the rendering equation describing light propagation in closed domains is solved. Monte Carlo methods for solving the rendering equation sample the solid angle subtended by the unit hemisphere or unit sphere in order to perform the numerical integration of the rendering equation. In this work we consider the problem of generating uniformly distributed random samples over the hemisphere and sphere. Our aim is to construct and study a parallel sampling scheme for the hemisphere and sphere. First we apply the symmetry property to partition the hemisphere and sphere. The solid angle subtended by a hemisphere is divided into a number of equal sub-domains. Each sub-domain represents the solid angle subtended by an orthogonal spherical triangle with fixed vertices and computable parameters. Then we introduce two new algorithms for sampling orthogonal spherical triangles. Both algorithms are based on a transformation of the unit square. Like Arvo's algorithm for sampling an arbitrary spherical triangle, the suggested algorithms accommodate stratified sampling. We derive the necessary transformations for the algorithms. The first sampling algorithm generates a sample by mapping the unit square onto the orthogonal spherical triangle. The second algorithm directly computes the unit radius vector of a sampling point inside the orthogonal spherical triangle. The sampling of the whole hemisphere and sphere is performed in parallel for all sub-domains simultaneously, using the symmetry property of the partitioning. The applicability of the corresponding parallel sampling scheme to Monte Carlo and quasi-Monte Carlo solution of the rendering equation is discussed.
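The paper's square-to-spherical-triangle maps are specific to its orthogonal-triangle partitioning, but the underlying idea — a unit-square parameterisation that lets stratified samples carry over to the spherical domain — can be sketched with the standard unit-square-to-hemisphere map (an illustration only, not the authors' algorithm):

```python
import math
import random

def sample_hemisphere(u, v):
    """Map (u, v) in the unit square to a uniformly distributed
    direction on the unit upper hemisphere. Taking z = u gives a
    uniform solid-angle density, because the spherical area element
    is dA = dz dphi."""
    z = u                      # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * v
    return (r * math.cos(phi), r * math.sin(phi), z)

def stratified_hemisphere(n):
    """Stratified (jittered) sampling: the unit square is divided
    into an n-by-n grid and one random point is drawn per cell; the
    square-to-hemisphere map preserves the stratification."""
    samples = []
    for i in range(n):
        for j in range(n):
            u = (i + random.random()) / n
            v = (j + random.random()) / n
            samples.append(sample_hemisphere(u, v))
    return samples
```

Because the map is area-preserving up to a constant factor, stratifying the square stratifies the hemisphere — the same property the abstract claims for the orthogonal-triangle algorithms.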
Abstract:
High spatial resolution vertical profiles of pore-water chemistry have been obtained for a peatland using diffusive equilibrium in thin films (DET) gel probes. Comparison of DET pore-water data with more traditional depth-specific sampling shows good agreement, and the DET profiling method is less invasive and less likely to induce mixing of pore-waters. Chloride mass balances as water tables fell in the early summer indicate that evaporative concentration dominates and there is negligible lateral flow in the peat. Lack of lateral flow allows element budgets for the same site at different times to be compared. The high spatial resolution of sampling also enables gradients to be observed that permit calculations of vertical fluxes. Sulfate concentrations fall at two sites with net rates of 1.5 and 5.0 nmol cm⁻³ day⁻¹, likely due to a dominance of bacterial sulfate reduction, while a third site showed a net gain in sulfate due to oxidation of sulfur over the study period at an average rate of 3.4 nmol cm⁻³ day⁻¹. Behaviour of iron is closely coupled to that of sulfur; there is net removal of iron at the two sites where sulfate reduction dominates and addition of iron where oxidation dominates. The profiles demonstrate that, in addition to strong vertical redox-related chemical changes, there is significant spatial heterogeneity. Whilst overall there is evidence for net reduction of sulfate within the peatland pore-waters, this can be reversed, at least temporarily, during periods of drought when sulfide oxidation with resulting acid production predominates.
Abstract:
(Sub)picosecond transient absorption (TA) and time-resolved infrared (TRIR) spectra of the cluster [Os-3(CO)(10)(AcPy-MV)](2+) (the dication AcPy-MV2+ = [2-pyridylacetimine-N-(2-(1'-methyl-4,4'-bipyridine-1,1'-diium-1-yl)ethyl)](PF6)(2)) (1(2+)) reveal that photoinduced electron transfer to the electron-accepting 4,4'-bipyridine-1,1'-diium (MV2+) moiety competes with the fast relaxation of the initially populated sigmapi* excited state of the cluster to the ground state and/or cleavage of an Os-Os bond. The TA spectra of cluster 1(2+) in acetone, obtained by irradiation into its lowest-energy absorption band, show the characteristic absorptions of the one-electron-reduced MV*(+) unit at 400 and 615 nm, in accordance with population of a charge-separated (CS) state in which a cluster-core electron has been transferred to the lowest pi* orbital of the remote MV2+ unit. This assignment is confirmed by picosecond TRIR spectra that show a large shift of the highest-frequency nu(CO) band of 1(2+) by ca. +40 cm(-1), reflecting the photooxidation of the cluster core. The CS state is populated via fast (4.2 x 10(11) s(-1)) and efficient (88%) oxidative quenching of the optically populated sigmapi* excited state and decays biexponentially with lifetimes of 38 and 166 ps (1:2:1 ratio) with complete regeneration of the parent cluster. About 12% of the cluster molecules in the sigmapi* excited state form long-lived open-core biradicals. In strongly coordinating acetonitrile, however, the cluster core-to-MV2+ electron transfer in cluster 1(2+) results in the irreversible formation of secondary photoproducts with a photooxidized cluster core. The photochemical behavior of the [Os-3(CO)(10)(alpha-diimine-MV)](2+) (donor-acceptor) dyad can be controlled by an externally applied electronic bias.
Electrochemical one-electron reduction of the MV2+ moiety prior to the irradiation reduces its electron-accepting character to such an extent that the photoinduced electron transfer to MV*(+) is no longer feasible. Instead, the irradiation of the reduced cluster 1(*+) results in the reversible formation of an open-core zwitterion, the ultimate photoproduct also observed upon irradiation of related nonsubstituted clusters [Os-3(CO)(10)(alpha-diimine)] in strongly coordinating solvents such as acetonitrile.
Abstract:
Recent research in multi-agent systems incorporates fault tolerance concepts, but does not explore the extension and implementation of such ideas for large-scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely 'Intelligent Agents'. A task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to successfully complete the task. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator, and by the implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
Abstract:
Recent research in multi-agent systems incorporates fault tolerance concepts. However, the research does not explore the extension and implementation of such ideas for large-scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely ‘Intelligent Agents’. In the approach considered, a task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to successfully complete the task. The agents hence contribute towards fault tolerance and towards building reliable systems. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator, and by the implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
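The parallel reduction used for validation follows the classic binary-tree combining pattern that MPI's `MPI_Reduce` implements. A minimal serial sketch of that pattern (illustrative only; a cluster implementation would distribute each round's pairs across processors):

```python
def tree_reduce(values, op):
    """Pairwise (binary-tree) reduction: in each round, adjacent
    'ranks' exchange and combine their values, halving the number of
    active participants, so p values need ceil(log2(p)) rounds."""
    vals = list(values)
    while len(vals) > 1:
        nxt = []
        for i in range(0, len(vals) - 1, 2):
            nxt.append(op(vals[i], vals[i + 1]))   # partner exchange
        if len(vals) % 2 == 1:                     # odd rank carries over
            nxt.append(vals[-1])
        vals = nxt
    return vals[0]
```

The logarithmic round count is what makes the reduction scale on a cluster; the same tree shape also bounds how far a single core/processor failure can propagate before the agents detect it.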
Abstract:
A first step in interpreting the wide variation in trace gas concentrations measured over time at a given site is to classify the data according to the prevailing weather conditions. In order to classify measurements made during two intensive field campaigns at Mace Head, on the west coast of Ireland, an objective method of assigning data to different weather types has been developed. Air-mass back trajectories calculated using winds from ECMWF analyses, arriving at the site in 1995–1997, were allocated to clusters based on a statistical analysis of the latitude, longitude and pressure of the trajectory at 12 h intervals over 5 days. The robustness of the analysis was assessed by using an ensemble of back trajectories calculated for four points around Mace Head. Separate analyses were made for each of the 3 years, and for four 3-month periods. The use of these clusters in classifying ground-based ozone measurements at Mace Head is described, including the need to exclude data which have been influenced by local perturbations to the regional flow pattern, for example, by sea breezes. Even with a limited data set, based on 2 months of intensive field measurements in 1996 and 1997, there are statistically significant differences in ozone concentrations in air from the different clusters. The limitations of this type of analysis for classification and interpretation of ground-based chemistry measurements are discussed.
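The clustering step can be sketched with a plain k-means on trajectory feature vectors; the feature layout assumed here (a flattened series of latitude, longitude and pressure values at 12 h steps) and the algorithm itself are illustrative stand-ins for the paper's statistical analysis:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on feature vectors. For trajectory clustering,
    each point would be a flattened (lat, lon, pressure) series
    sampled every 12 h along a 5-day back trajectory (hypothetical
    preprocessing; the paper's exact method may differ)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)

    def nearest(p):
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))

    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[nearest(p)].append(p)
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, [nearest(p) for p in points]
```

Each resulting label plays the role of a "weather type" to which ozone measurements from that arrival time can be assigned.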
Abstract:
The overall operation and internal complexity of a particular piece of production machinery can be depicted in terms of clusters of multidimensional points which describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis is seen to provide a promising step forward in the field of multivariable control of manufacturing systems.
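The details of the mean-tracking algorithm are not given in the abstract; as a hedged sketch of the same density-seeking family, a mean-shift-style update that moves a window to the mean of the points it contains looks like this:

```python
def track_mean(points, start, radius, tol=1e-6, max_iter=100):
    """Move a search window to the mean of the points inside it,
    repeating until the shift is negligible. This is the mean-shift
    idea behind mean-tracking-type cluster location; the paper's
    actual algorithm may differ in its window and update rules."""
    centre = list(start)
    for _ in range(max_iter):
        inside = [p for p in points
                  if sum((a - b) ** 2 for a, b in zip(p, centre)) <= radius ** 2]
        if not inside:
            break                                    # window fell in empty space
        new = [sum(col) / len(inside) for col in zip(*inside)]
        if sum((a - b) ** 2 for a, b in zip(new, centre)) < tol ** 2:
            return new                               # converged on a dense region
        centre = new
    return centre
```

Started from many seeds, the converged centres mark dense regions of process states — candidate "good behaviour" or "error" regions in the machinery data.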
Abstract:
Clusters of computers can be used together to provide a powerful computing resource. Large Monte Carlo simulations, such as those used to model particle growth, are computationally intensive and take considerable time to execute on conventional workstations. By spreading the work of the simulation across a cluster of computers, the elapsed execution time can be greatly reduced. Thus a user apparently gains the performance of a supercomputer by using the spare cycles of other workstations.
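The work-spreading idea can be sketched as a Monte Carlo simulation split into independently seeded chunks whose partial results are combined. Here the chunks run serially for illustration (estimating pi rather than particle growth); on a cluster each chunk would be farmed out to a different workstation:

```python
import random

def mc_pi_chunk(n, seed):
    """One worker's share: count darts landing inside the unit
    quarter-circle. Each chunk gets its own seed so the streams are
    independent when run on separate machines."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def mc_pi_parallel(total, workers):
    """Split the simulation into independent chunks and combine the
    partial counts -- the pattern that lets a workstation cluster
    act as one large sampler."""
    n = total // workers
    hits = sum(mc_pi_chunk(n, seed) for seed in range(workers))
    return 4.0 * hits / (n * workers)
```

Because the chunks share no state, the elapsed time falls nearly linearly with the number of workstations, which is what gives the user supercomputer-like throughput from spare cycles.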
Abstract:
Women sex workers (MTS, from the Portuguese) are recognised as a population at increased risk of HIV infection, owing to high prevalence, to their social vulnerability, and to factors related to the professional activity itself. However, studying subgroups at higher risk of HIV through conventional sampling strategies is generally problematic, because these populations are small in population terms and are linked to stigmatised behaviours or illegal activities. In 1997, a probability sampling method for hard-to-reach populations called Respondent-Driven Sampling (RDS) was proposed. The method is considered a variant of chain sampling and allows statistical estimation of the parameters of interest. In the international literature, many authors have analysed data collected by RDS with traditional multivariate statistical techniques, without taking into account the dependence structure of the observations present in RDS data. This thesis aims to contribute information on HIV-related risk practices among women sex workers (MTS) through the development of a statistical method for analysing data collected with RDS sampling. To this end, we used data from the Pesquisa Corrente da Saúde [Current Health Survey] conducted in ten Brazilian cities, with 2,523 MTS recruited by RDS between 2008 and 2009. The questionnaire was self-completed and included modules on characteristics of the professional activity, sexual practices, drug use, periodic HIV testing, and access to health services. First, some assumptions of RDS and all stages of the survey's implementation are described. Then, multivariate analysis methods are proposed, treating RDS as a complex sampling design.
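One widely used point estimator for RDS data (not necessarily the method developed in the thesis) is the Volz-Heckathorn RDS-II estimator, which weights each respondent by the inverse of her reported network degree:

```python
def rds_ii(outcomes, degrees):
    """Volz-Heckathorn (RDS-II) prevalence estimator: respondents
    with many network contacts are more likely to be recruited into
    a chain-referral sample, so each respondent is down-weighted by
    her degree. `outcomes` are 0/1 indicators, `degrees` the
    reported personal network sizes."""
    if len(outcomes) != len(degrees):
        raise ValueError("one degree per respondent required")
    w = [1.0 / d for d in degrees]
    return sum(wi for wi, y in zip(w, outcomes) if y) / sum(w)
```

For example, if the respondents reporting the outcome are all highly connected, the weighted estimate falls below the naive sample proportion, correcting the over-representation of well-connected recruits.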
Abstract:
While the Cluster spacecraft were located near the high-latitude magnetopause, between 1010 and 1040 UT on 16 January 2004, three typical flux transfer event (FTE) signatures were observed. During this interval, simultaneous and conjugate all-sky camera measurements, recorded at Yellow River Station, Svalbard, are available at 630.0 and 557.7 nm; these show poleward-moving auroral forms (PMAFs), consistent with magnetic reconnection at the dayside magnetopause. The simultaneous FTEs seen at the magnetopause mainly move northward, but with duskward (eastward) and tailward velocity components, roughly consistent with the observed direction of motion of the PMAFs in the all-sky images. Between the PMAFs, meridional keograms extracted from the all-sky images show intervals of lower-intensity aurora which migrate equatorward just before the PMAFs intensify. This is strong evidence for an equatorward-eroding and poleward-moving open-closed boundary associated with a variable magnetopause reconnection rate under variable IMF conditions. From the durations of the PMAFs, we infer that the evolution time of an FTE is 5–11 minutes from its origin on the magnetopause to its addition to the polar cap.
Abstract:
Passive samplers have been predominantly used to monitor environmental conditions in single volumes. However, measurements using a calibrated passive sampler, a Solid Phase Microextraction (SPME) fibre, in three houses with cold pitched roofs successfully demonstrated the potential of the SPME fibre as a device for monitoring air movement between two volumes. The roofs monitored were pitched at 15°–30°, with insulation thickness varying between 200 and 300 mm on the ceiling. For effective analysis, two constant sources of volatile organic compounds were diffused steadily in each house. Emission rates and air movement from the house to the roof were predicted using developed algorithms. The airflow rates, which were calibrated against conventional tracer gas techniques, were introduced into a HAM software package to predict the effects of air movement on other varying parameters. On average, the in situ measurements showed that about 20–30% of the air entering the three houses left through gaps and cracks in the ceiling into the roof. Although these field measurements focus on the airflows, they are associated with energy benefits: if these flows were reduced, energy losses would also be reduced significantly (as modelled), consequently improving the energy efficiency of the house. Other results illustrated that condensation formation risks were dependent on the airtightness of the building envelopes, including the configurations of their roof constructions.
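The airflow prediction can be illustrated with the textbook steady-state constant-injection tracer balance; this is a simplification of the developed SPME-based algorithms, and the function and parameter names are illustrative:

```python
def zone_airflow(emission_rate, conc_zone, conc_supply=0.0):
    """Steady-state mass balance for a constant tracer source in a
    well-mixed zone: E = Q * (C_zone - C_supply), hence
    Q = E / (C_zone - C_supply). Units must be consistent, e.g.
    E in mg/h and C in mg/m3 give Q in m3/h."""
    dc = conc_zone - conc_supply
    if dc <= 0:
        raise ValueError("zone concentration must exceed supply concentration")
    return emission_rate / dc
```

With one constant VOC source per volume, the house-to-roof flow follows from the roof-zone balance with the house concentration as the supply term, which is how a fraction like the reported 20–30% ceiling leakage could be derived.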
Abstract:
This paper discusses how numerical gradient estimation methods may be used to reduce the computational demands of a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density-based cluster identification algorithms could benefit from a reduction of computational demand if approximate a priori estimates of the cluster centres present in a given data set could be supplied as starting conditions for these algorithms. In this particular presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and related disciplines.
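As a hedged sketch of how such a priori centre estimates might be obtained, here is a one-dimensional density-peak finder that uses histogram count differences as a crude numerical gradient estimate (an illustration, not the paper's method):

```python
def density_peaks(points, bins=10):
    """Crude a priori centre estimates: histogram the 1-D data and
    report bins whose count exceeds both neighbours, i.e. where the
    finite-difference density gradient changes sign from positive
    to negative. Returned values are bin midpoints."""
    lo, hi = min(points), max(points)
    width = (hi - lo) / bins or 1.0        # guard against zero range
    counts = [0] * bins
    for x in points:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    peaks = []
    for i in range(bins):
        left = counts[i - 1] if i > 0 else -1
        right = counts[i + 1] if i < bins - 1 else -1
        if counts[i] > left and counts[i] > right:
            peaks.append(lo + (i + 0.5) * width)
    return peaks
```

Feeding such peak locations to a mean-tracking-style algorithm as starting points would spare it from searching the whole data space, which is the computational saving the paper targets.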