863 results for parallel sorting


Relevance: 20.00%

Publisher:

Abstract:

This study analyses the area of construction and demolition waste (C&DW) auditing. The production of C&DW has grown year after year since the Environmental Protection Agency (EPA) first published a report in 1996 providing data on C&DW quantities for 1995 (EPA, 1996a). The most recent EPA report is based on data for 2005 (EPA, 2006) and estimated the quantity of C&DW produced in that period at 14,931,486 tonnes. However, this is a 'data update' report containing updates to selected waste statistics, so any total it provides is not a true reflection of the waste produced for the period. This illustrates that a more construction-site-specific form of data is required. The Department of Building and Civil Engineering at the Galway-Mayo Institute of Technology has carried out two recent research projects in this area (Grimes, 2005; Kelly, 2006), which produced waste production indicators based on site-specific data. This involved the design and testing of an original auditing tool based on visual characterisation and the application of conversion factors. One of the main recommendations of these studies was to compare the visual characterisation approach with a photogrammetric sorting methodology. This study investigates the application of photogrammetric sorting on a residential construction site in the Galway region. A visual characterisation study was also carried out on the same project to compare the two methodologies and assess their practical application in a construction site environment. Data collected from the waste management contractor on site was also used to provide further evaluation. From this, a set of waste production indicators for new residential construction was produced:
□ 50.8 kg/m² for new residential construction, using data provided by the visual characterisation method and the Landfill Levy conversion factors.
□ 43 kg/m² for new residential construction, using data provided by the photogrammetric sorting method and the Landfill Levy conversion factors.
□ 23.8 kg/m² for new residential construction, using data provided by the Waste Management Contractor (WMC).
The acquisition of the data from the waste management contractor was a key element in testing the information produced by the visual characterisation and photogrammetric sorting methods: the actual weights provided by the contractor differ significantly from the quantities estimated by the two methods.
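As an illustrative aside (not part of the original study), the gap between the three reported indicators can be quantified directly from the abstract's own figures, taking the contractor's actual weights as the baseline:

```python
# Reported waste production indicators (kg/m^2) for new residential
# construction, one per data source in the study.
indicators = {
    "visual characterisation": 50.8,
    "photogrammetric sorting": 43.0,
    "waste management contractor": 23.8,
}

# Treat the contractor's weighed data as the reference value and express
# each estimation method's result as a relative overestimate.
baseline = indicators["waste management contractor"]
for method in ("visual characterisation", "photogrammetric sorting"):
    excess = (indicators[method] - baseline) / baseline
    print(f"{method}: +{excess:.0%} relative to contractor data")
```

The visual characterisation figure comes out roughly 113% above the contractor's weights and the photogrammetric figure roughly 81% above, which is the "significant difference" the abstract refers to.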

Relevance: 20.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Process and Systems Engineering, Diss., 2012

Relevance: 20.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Process and Systems Engineering, Diss., 2012

Relevance: 20.00%

Publisher:

Abstract:

Considerable attention in academia and among research teams is currently directed at the potential of the 60 GHz frequency band for wireless communications. The 60 GHz band offers great possibilities for a wide variety of applications that are yet to be implemented, but these applications also pose substantial implementation challenges. One example is building a high-data-rate transceiver that at the same time has very low power consumption. In this paper we present a prototype single-carrier (SC) transceiver system, giving a brief overview of the baseband design and emphasizing the most important design decisions that need to be made. A brief overview of possible approaches to implementing the equalizer, the most complex module in the SC transceiver, is also presented. The main focus of this paper is to propose a parallel architecture for the receiver in a single-carrier communication system. This increases the data rates the communication system can achieve, at the price of higher power consumption. The proposed receiver architecture is illustrated in the paper, and the results of its implementation are compared with those of the corresponding serial implementation.

Relevance: 20.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Natural Sciences, Diss., 2014

Relevance: 20.00%

Publisher:

Abstract:

Advances in computer memory technology justify research into new and different views of computer organization. This paper proposes a novel memory-centric computing architecture with the goal of merging memory and processing elements to provide better conditions for parallelization and performance. The paper introduces the architectural concepts and then presents the design and implementation of a corresponding assembler and simulator.

Relevance: 20.00%

Publisher:

Abstract:

This paper shows how a high-level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. Parallelization is implemented in such a way that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. A detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.

Relevance: 20.00%

Publisher:

Abstract:

This note describes ParallelKnoppix, a bootable CD that allows creation of a Linux cluster in very little time. An experienced user can create a cluster ready to execute MPI programs in less than 10 minutes. The computers used may be heterogeneous machines of the IA-32 architecture. When the cluster is shut down, all machines except one are left in their original state, and the remaining machine can be returned to its original state by deleting a single directory. The system thus provides a means of using non-dedicated computers to create a cluster. An example session is documented.

Relevance: 20.00%

Publisher:

Abstract:

We have used massively parallel signature sequencing (MPSS) to sample the transcriptomes of 32 normal human tissues to an unprecedented depth, thus documenting the patterns of expression of almost 20,000 genes with high sensitivity and specificity. The data confirm the widely held belief that differences in gene expression between cell and tissue types are largely determined by transcripts derived from a limited number of tissue-specific genes, rather than by combinations of more promiscuously expressed genes. Expression of a little more than half of all known human genes seems to account for both the common requirements and the specific functions of the tissues sampled. A classification of tissues based on patterns of gene expression largely reproduces classifications based on anatomical and biochemical properties. The unbiased sampling of the human transcriptome achieved by MPSS supports the idea that most human genes have been mapped, if not functionally characterized. This data set should prove useful for the identification of tissue-specific genes, for the study of global changes induced by pathological conditions, and for the definition of a minimal set of genes necessary for basic cell maintenance. The data are available on the Web at http://mpss.licr.org and http://sgb.lynxgen.com.

Relevance: 20.00%

Publisher:

Abstract:

We develop a neoclassical trade model with heterogeneous factors of production. We consider a world with two factors, labor and 'managers', each with a distribution of ability levels. Production combines a manager of some type with a group of workers. The output of a unit depends on the types of the two factors, with complementarity between them, while exhibiting diminishing returns to the number of workers. We examine the sorting of factors to sectors and the matching of factors within sectors, and we use the model to study the determinants of the trade pattern and the effects of trade on the wage and salary distributions. Finally, we extend the model to include search frictions and consider the distribution of employment rates.
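The abstract does not give the functional form, but as a hedged sketch, a production unit of this kind is commonly written as a manager of ability $q_M$ paired with $\ell$ workers of ability $q_L$:

```latex
% An assumed formalization, not the paper's stated equations:
% output of one unit run by a manager of type q_M with \ell workers of type q_L.
x = \psi(q_M, q_L)\,\ell^{\gamma}, \qquad 0 < \gamma < 1,
\qquad \frac{\partial^{2}\psi}{\partial q_M\,\partial q_L} > 0 .
```

Here $\ell^{\gamma}$ with $\gamma < 1$ captures the diminishing returns to the number of workers, and the positive cross-partial of $\psi$ captures the complementarity between manager and worker types that drives the sorting and matching results the abstract describes.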

Relevance: 20.00%

Publisher:

Abstract:

Participation is a key indicator of the potential effectiveness of any population-based intervention. Defining, measuring and reporting participation in cancer screening programmes has become more heterogeneous as the number and diversity of interventions have increased, and the purposes of this benchmarking parameter have broadened. This study, centred on colorectal cancer, addresses current issues that affect the increasingly complex task of comparing screening participation across settings. Reports from programmes with a defined target population and active invitation scheme, published between 2005 and 2012, were reviewed. Differences in defining and measuring participation were identified and quantified, and participation indicators were grouped by aims of measure and temporal dimensions. We found that consistent terminology, clear and complete reporting of participation definition and systematic documentation of coverage by invitation were lacking. Further, adherence to definitions proposed in the 2010 European Guidelines for Quality Assurance in Colorectal Cancer Screening was suboptimal. Ineligible individuals represented 1% to 15% of invitations, and variable criteria for ineligibility yielded differences in participation estimates that could obscure the interpretation of colorectal cancer screening participation internationally. Excluding ineligible individuals from the reference population enhances comparability of participation measures. Standardised measures of cumulative participation to compare screening protocols with different intervals and inclusion of time since invitation in definitions are urgently needed to improve international comparability of colorectal cancer screening participation. Recommendations to improve comparability of participation indicators in cancer screening interventions are made.
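The abstract's central measurement point, that excluding ineligible individuals from the reference population changes and improves the comparability of participation estimates, can be made concrete with a small sketch (the function name and figures below are illustrative, not from the study):

```python
def participation_rate(participants, invited, ineligible=0):
    """Participation as a share of the eligible invited population.

    Removing ineligible invitees from the denominator follows the
    recommendation discussed in the abstract; ineligible=0 gives the
    crude (unadjusted) rate.
    """
    eligible = invited - ineligible
    if eligible <= 0:
        raise ValueError("no eligible invitees")
    return participants / eligible

# Illustrative figures: the abstract reports that ineligible individuals
# represented 1% to 15% of invitations in the programmes reviewed.
invited = 100_000
participants = 40_000
crude = participation_rate(participants, invited)             # 0.40
adjusted = participation_rate(participants, invited, 15_000)  # ~0.47
print(f"crude {crude:.1%}, adjusted {adjusted:.1%}")
```

With 15% of invitations ineligible, the adjusted rate is about 7 percentage points higher than the crude one, which is the scale of distortion that makes inconsistent eligibility criteria obscure international comparisons.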