105 results for Iterative decoding


Relevance:

10.00%

Publisher:

Abstract:

It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review provides some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.

Relevance:

10.00%

Publisher:

Abstract:

Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in architectural design practice, i.e., to relate design by nature to design by human. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that make it possible to transform almost instantaneously any 2D design representation into a physical three-dimensional model using a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be established with design by nature, where the natural laying down of earth layers shapes the earth's surface, a process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hard copy and use it to validate their design ideas. Concurrent design is a systematic approach aiming to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and a quality improvement of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent to which design practice applies rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though they acknowledge a lack of knowledge in relation to rapid prototyping.

Relevance:

10.00%

Publisher:

Abstract:

An alternative approach to research is described that has been developed through a succession of significant construction management research projects. The approach follows the principles of iterative grounded theory, whereby researchers iterate between alternative theoretical frameworks and emergent empirical data. Of particular importance is an orientation toward mixing methods, thereby overcoming the existing tendency to dichotomize quantitative and qualitative approaches. The approach is positioned against the existing contested literature on grounded theory, and the possibility of engaging with empirical data in a “theory free” manner is discounted. Emphasis instead is given to the way in which researchers must be theoretically sensitive as a result of being steeped in relevant literatures. Knowledge of existing literatures therefore shapes the initial research design; but emergent empirical findings cause fresh theoretical perspectives to be mobilized. The advocated approach is further aligned with notions of knowledge coproduction and the underlying principles of contextualist research. It is this unique combination of ideas which characterizes the paper's contribution to the research methodology literature within the field of construction management. Examples are provided and consideration is given to the extent to which the emergent findings are generalizable beyond the specific context from which they are derived.

Relevance:

10.00%

Publisher:

Abstract:

The research uses a sociological perspective to build an improved, context-specific understanding of innovation diffusion within the UK construction industry. It is argued that there is an iterative interplay between actors and the social system they occupy that directly influences the diffusion process as well as the methodology adopted. The research builds upon previous findings that argued for a level of best fit for the three innovation diffusion concepts of cohesion, structural equivalence and thresholds. That level of best fit is analysed here using empirical data from the UK construction industry. This analysis allows an understanding of how the relative importance of these concepts actually varies within the stages of the innovation diffusion process. The conclusion that the level of relevance fluctuates in relation to the stages of the diffusion process is a new development in the field.

Relevance:

10.00%

Publisher:

Abstract:

Current literature offers little understanding of how procurement methods are enacted in practice. Developments in procurement are often viewed as the result of responding to recommendations from particular constituents within the sector. The research seeks to remove itself from such deterministic leanings, counselling instead that procurement should not be viewed in static terms, but as dynamically manifesting over time within a complex web of interconnections between various actors, their situated context and the broader industrial structure. Attention is given to how a client and a construction firm engaged in a collusive interaction to realise an innovative procurement method that derived its legitimacy from a backcloth of initiatives promoted by various commentators. A case study of a medium-sized regional contractor demonstrates how the first partnering arrangement was enacted within the UK affordable housing maintenance sector. The case study finds that the enactment of new procurement methods strongly relies on iterative learning between clients and contractors. It is further suggested that construction firms need to initiate new procurement methods in order to remain competitive within the sector. The findings point towards a pro-active initiative by the contractor and client to enact a 'procurement first'. Encouragement may be drawn from this example by other contractors seeking to offer more than simply responsive procurement solutions.

Relevance:

10.00%

Publisher:

Abstract:

Decoding emotional prosody is crucial for successful social interactions, and continuous monitoring of emotional intent via prosody requires working memory. It has been proposed by Ross and others that emotional prosody cognitions in the right hemisphere are organized in an analogous fashion to propositional language functions in the left hemisphere. This study aimed to test the applicability of this model in the context of prefrontal cortex working memory functions. BOLD response data were therefore collected during performance of two emotional working memory tasks by participants undergoing fMRI. In the prosody task, participants identified the emotion conveyed in pre-recorded sentences, and working memory load was manipulated in the style of an N-back task. In the matched lexico-semantic task, participants identified the emotion conveyed by sentence content. Block-design neuroimaging data were analyzed parametrically with SPM5. At first, working memory for emotional prosody appeared to be right-lateralized in the PFC; however, further analyses revealed that it shared much bilateral prefrontal functional neuroanatomy with working memory for lexico-semantic emotion. Supplementary separate analyses of males and females suggested that these language functions were less bilateral in females, but their inclusion did not alter the direction of laterality. It is concluded that Ross et al.'s model is not applicable to prefrontal cortex working memory functions, that the evidence that working memory in prefrontal cortex cannot be subdivided according to material type is strengthened, and that incidental working memory demands may explain the frontal lobe involvement in emotional prosody comprehension revealed by neuroimaging studies.

Relevance:

10.00%

Publisher:

Abstract:

We frequently encounter conflicting emotion cues. This study examined how the neural response to emotional prosody differed in the presence of congruent and incongruent lexico-semantic cues. Two hypotheses were assessed: (i) decoding emotional prosody with conflicting lexico-semantic cues would activate brain regions associated with cognitive conflict (anterior cingulate and dorsolateral prefrontal cortex), or (ii) the increased attentional load of incongruent cues would modulate the activity of regions that decode emotional prosody (right lateral temporal cortex). While the participants indicated the emotion conveyed by prosody, functional magnetic resonance imaging data were acquired on a 3T scanner using blood oxygenation level-dependent contrast. Using SPM5, the response to congruent cues was contrasted with that to emotional prosody alone, as was the response to incongruent lexico-semantic cues (for the 'cognitive conflict' hypothesis). Region-of-interest analyses of the right lateral temporal lobe examined modulation of activity in this brain region between these two contrasts (for the 'prosody cortex' hypothesis). Dorsolateral prefrontal and anterior cingulate cortex activity was not observed, and neither was attentional modulation of activity in the right lateral temporal cortex. However, decoding emotional prosody with incongruent lexico-semantic cues was strongly associated with left inferior frontal gyrus activity. This specialist form of conflict is therefore not processed by the brain using the same neural resources as non-affective cognitive conflict, nor can it be handled by associated sensory cortex alone. The recruitment of inferior frontal cortex may indicate increased semantic processing demands, but other contributory functions of this region should be explored.

Relevance:

10.00%

Publisher:

Abstract:

A Fractal Quantizer is proposed that replaces the expensive division operation in scalar quantization with more modest and widely available multiplication, addition and shift operations. Although the proposed method is iterative in nature, simulations show virtually undetectable distortion to the naked eye for JPEG-compressed images using a single iteration. The method requires a change to the usual tables used in JPEG algorithms, but of similar size. For practical purposes, performing quantization is reduced to a multiplication plus an addition, easily programmed on low-end embedded processors and suitable for efficient, very high speed implementation in ASIC or FPGA hardware. An FPGA hardware implementation shows up to 15x area-time savings compared to standard solutions for devices with dedicated multipliers. The method can also be immediately extended to perform adaptive quantization.
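
The paper's modified tables are not reproduced in this abstract, but the core trick - replacing q = round(c / Q) by a multiply, an add and a shift against a precomputed fixed-point reciprocal table - can be sketched as follows. This is a minimal illustration assuming a 16-bit reciprocal, not the paper's exact scheme; results may differ from true division by at most one quantization level, consistent with the claim of virtually undetectable distortion.

    import numpy as np

    def quantize_div(coeffs, table):
        # Reference JPEG-style scalar quantization using division.
        return np.rint(coeffs / table).astype(np.int32)

    def quantize_mul_shift(coeffs, table, k=16):
        # Division-free variant: multiply by a precomputed fixed-point
        # reciprocal, add a rounding offset, then shift right by k bits.
        recip = np.rint((1 << k) / table).astype(np.int64)  # replaces 1/Q
        c = coeffs.astype(np.int64)
        return ((c * recip + (1 << (k - 1))) >> k).astype(np.int32)

    coeffs = np.arange(-1024, 1024, 33)   # stand-in DCT coefficients
    table = np.full(coeffs.shape, 24)     # stand-in quantization step size
    print(np.max(np.abs(quantize_div(coeffs, table)
                        - quantize_mul_shift(coeffs, table))))  # 0 or 1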

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a new iterative algorithm for OFDM joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Although addressing it is essential for this joint approach, the overfitting problem has received relatively little attention in existing algorithms. In this paper, specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the phase noise, and finally a more robust and compact fast process based on Givens rotation is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm.
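
The abstract does not spell out the algorithm, but the role of the hard decision can be illustrated on a deliberately simplified model (common phase error only, QPSK; all names below are illustrative, not the paper's): re-deciding symbols onto the constellation at each iteration stops the data estimate from absorbing the phase noise, which is exactly the trivial "overfitted" solution being warned against.

    import numpy as np

    def hard_decision_qpsk(z):
        # Project soft symbol estimates onto the QPSK constellation.
        return (np.sign(z.real) + 1j * np.sign(z.imag)) / np.sqrt(2)

    def joint_detect_cpe(r, n_iter=5):
        # Toy joint detection / phase-noise cancellation loop: at each
        # iteration the common phase error is re-estimated from the
        # current hard decisions, then the symbols are re-decided on
        # the derotated samples.
        s_hat = hard_decision_qpsk(r)                   # initial decisions
        for _ in range(n_iter):
            phi = np.angle(np.sum(r * np.conj(s_hat)))  # LS phase estimate
            s_hat = hard_decision_qpsk(r * np.exp(-1j * phi))
        return s_hat, phi

    # usage: 64 QPSK symbols with a 0.4 rad common phase error plus noise
    rng = np.random.default_rng(0)
    s = hard_decision_qpsk(rng.standard_normal(64) + 1j * rng.standard_normal(64))
    r = s * np.exp(1j * 0.4) + 0.05 * (rng.standard_normal(64)
                                       + 1j * rng.standard_normal(64))
    s_hat, phi = joint_detect_cpe(r)
    print(phi, np.all(s_hat == s))   # phi close to 0.4, decisions correct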

Relevance:

10.00%

Publisher:

Abstract:

The General Packet Radio Service (GPRS) has been developed for the mobile radio environment to allow migration from the traditional circuit-switched connection to a more efficient packet-based communication link, particularly for data transfer. GPRS requires not only the addition of the GPRS software protocol stack, but also more baseband functionality for the mobile, as new coding schemes have been defined, along with uplink status flag detection, multislot operation and dynamic coding scheme detection. This paper concentrates on evaluating the performance of the GPRS coding scheme detection methods in the presence of a multipath fading channel with a single co-channel interferer as a function of various soft-bit data widths. It has been found that compressing the soft-bit data widths at the output of the equalizer to save memory can influence the likelihood decision of the coding scheme detection function and hence contribute to the overall performance loss of the system. Coding scheme detection errors can therefore force the channel decoder either to select the incorrect decoding scheme or to have no clear decision as to which coding scheme to use, resulting in the decoded radio block failing the block check sequence and contributing to the block error rate. For accurate performance simulation, the full coding scheme detection process must be taken into account.
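
As a rough illustration of the trade-off described above (the function and values are assumptions, not the paper's receiver), compressing the soft-bit width amounts to saturating and re-quantizing the equalizer's soft outputs, which coarsens the values the coding scheme detector's likelihood decision is built from:

    import numpy as np

    def compress_soft_bits(soft, width):
        # Saturate and quantize soft values to a signed integer of
        # `width` bits - the memory-saving step discussed above.
        max_mag = 2 ** (width - 1) - 1
        return np.clip(np.rint(soft), -max_mag, max_mag).astype(int)

    soft = np.array([0.4, -2.7, 11.3, -45.0, 78.9])  # stand-in equalizer outputs
    for width in (8, 4, 2):
        print(width, compress_soft_bits(soft, width))  # resolution shrinks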

Relevance:

10.00%

Publisher:

Abstract:

In the past decade, airborne Light Detection And Ranging (LIDAR) has been recognised by both the commercial and public sectors as a reliable and accurate source for land surveying in environmental, engineering and civil applications. Commonly, the first task in investigating LIDAR point clouds is to separate ground and object points. Skewness Balancing has been proven to be an efficient non-parametric, unsupervised classification algorithm to address this challenge. Initially developed for moderate terrain, this algorithm needs to be adapted to handle sloped terrain. This paper addresses the difficulty of object and ground point separation in LIDAR data over hilly terrain. A case study has been carried out on a LIDAR data set that is diverse in terms of data provider, resolution and LIDAR echo. Several sites in urban and rural areas with man-made structures and vegetation in moderate and hilly terrain have been investigated, and three categories have been identified. An urban scene with a river bank has been selected for deeper investigation in order to extend the existing algorithm. The results show that an iterative use of Skewness Balancing is suitable for sloped terrain.
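
The published algorithm's details, and the iterative sloped-terrain extension, are not given in this abstract; the sketch below shows only the core Skewness Balancing idea as commonly described - peel off the highest returns until the height distribution is no longer positively skewed - under the assumption that ground heights are roughly symmetric while objects add a positive tail.

    import numpy as np

    def skewness(z):
        # Sample skewness of the height values.
        z = np.asarray(z, dtype=float)
        return np.mean((z - z.mean()) ** 3) / z.std() ** 3

    def skewness_balancing(z):
        # Remove the highest point until the remaining heights are no
        # longer positively skewed; what remains is labelled ground.
        # (Skewness is recomputed from scratch each pass for clarity.)
        order = np.argsort(z)          # indices by ascending height
        n = len(z)
        while n > 3 and skewness(z[order[:n]]) > 0:
            n -= 1                     # drop the current highest point
        ground = np.zeros(len(z), dtype=bool)
        ground[order[:n]] = True
        return ground                  # True = ground, False = object

    rng = np.random.default_rng(3)
    z = np.concatenate([rng.normal(10.0, 0.3, 200),   # flat ground returns
                        rng.normal(18.0, 0.5, 20)])   # elevated "buildings"
    print(skewness_balancing(z).sum(), "of", len(z), "points labelled ground")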

Relevance:

10.00%

Publisher:

Abstract:

In this paper we introduce a new algorithm, based on the successful work of Fathi and Alexandrov on hybrid Monte Carlo algorithms for matrix inversion and solving systems of linear algebraic equations. This algorithm consists of two parts: approximate inversion by Monte Carlo and iterative refinement using a deterministic method. Here we present a parallel hybrid Monte Carlo algorithm which uses Monte Carlo to generate an approximate inverse and then improves the accuracy of that inverse with iterative refinement. The new algorithm is applied efficiently to sparse non-singular matrices. When solving a system of linear algebraic equations, Bx = b, the inverse matrix is used to compute the solution vector x = B^(-1)b. We present results that show the efficiency of the parallel hybrid Monte Carlo algorithm in the case of sparse matrices.
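
A minimal sketch of the two-part structure, assuming a simple uniform-transition random-walk estimator for the Neumann series and a Newton-Schulz iteration as the deterministic refinement step (the actual Fathi-Alexandrov sampler and the paper's refinement scheme may differ):

    import numpy as np

    def mc_approx_inverse(B, n_walks=200, walk_len=20, seed=0):
        # Monte Carlo estimate of B^(-1) via the Neumann series
        # B^(-1) = I + C + C^2 + ...,  C = I - B, sampled with random
        # walks over column indices (uniform transition probabilities).
        rng = np.random.default_rng(seed)
        n = B.shape[0]
        C = np.eye(n) - B
        X = np.zeros_like(B)
        for _ in range(n_walks):
            for i in range(n):
                j, w = i, 1.0
                X[i, j] += 1.0                 # k = 0 (identity) term
                for _ in range(walk_len):
                    nxt = rng.integers(n)
                    w *= C[j, nxt] * n         # importance weight C_jk / p_jk
                    j = nxt
                    X[i, j] += w
        return X / n_walks

    def newton_schulz_refine(B, X, iters=10):
        # Deterministic refinement: X <- X(2I - BX) squares the residual
        # I - BX at each step, so it converges quadratically whenever the
        # Monte Carlo estimate is good enough (residual norm < 1).
        I = np.eye(B.shape[0])
        for _ in range(iters):
            X = X @ (2 * I - B @ X)
        return X

    B = np.eye(4) + 0.05 * np.random.default_rng(1).standard_normal((4, 4))
    X = newton_schulz_refine(B, mc_approx_inverse(B))
    print(np.linalg.norm(X @ B - np.eye(4)))   # ~ machine precision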

Relevance:

10.00%

Publisher:

Abstract:

In this work we study the computational complexity of a class of grid Monte Carlo algorithms for integral equations. The idea of the algorithms is to approximate the integral equation by a system of algebraic equations, which is then solved by Markov chain iterative Monte Carlo. The assumption here is that the corresponding Neumann series for the iteration matrix does not necessarily converge, or converges slowly. We use a special technique to accelerate the convergence. An estimate of the computational complexity of the Monte Carlo algorithm using the considered approach is obtained. This estimate is compared with the corresponding complexity of the grid-free Monte Carlo algorithm, and the conditions under which the class of grid Monte Carlo algorithms is more efficient are given.
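
The paper's acceleration technique is not described in this abstract. For orientation, the plain Markov chain scheme it builds on - the classical von Neumann-Ulam random-walk estimator for x = Ax + b, valid when the Neumann series converges - can be sketched as follows (names and parameters are illustrative):

    import numpy as np

    def mc_solve_component(A, b, i, n_walks=5000, walk_len=30, seed=0):
        # Estimate x_i for x = Ax + b (Neumann series x = sum_k A^k b)
        # by random walks over the indices: each step multiplies the
        # weight by A_jk / p_jk and accumulates the visited b-value.
        rng = np.random.default_rng(seed)
        n = len(b)
        total = 0.0
        for _ in range(n_walks):
            j, w, acc = i, 1.0, b[i]      # k = 0 term
            for _ in range(walk_len):
                nxt = rng.integers(n)     # uniform transitions, p = 1/n
                w *= A[j, nxt] * n
                j = nxt
                acc += w * b[j]
            total += acc
        return total / n_walks

    A = 0.1 * np.random.default_rng(2).random((5, 5))  # Neumann series converges
    b = np.ones(5)
    x = np.linalg.solve(np.eye(5) - A, b)              # exact, for comparison
    print(mc_solve_component(A, b, 0), x[0])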

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a new iterative algorithm for orthogonal frequency division multiplexing (OFDM) joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the relatively little-studied problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the PHN, and finally a more robust and compact fast process based on Givens rotation is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm.
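
The "compact fast process" itself is not described in the abstract. As a hedged illustration, the building block it names - the Givens rotation, which zeroes one matrix entry at a time by rotating a pair of rows - looks like this inside a generic QR factorization (not the paper's specific receiver structure):

    import numpy as np

    def givens(a, b):
        # c, s such that [[c, s], [-s, c]] @ [a, b] = [r, 0].
        if b == 0.0:
            return 1.0, 0.0
        r = np.hypot(a, b)
        return a / r, b / r

    def qr_givens(A):
        # QR by Givens rotations: zero subdiagonal entries one at a
        # time. Each rotation touches only two rows, which is what makes
        # Givens-based processes compact and numerically robust inside
        # iterative receivers.
        A = A.astype(float).copy()
        m, n = A.shape
        Q = np.eye(m)
        for j in range(n):
            for i in range(m - 1, j, -1):
                c, s = givens(A[i - 1, j], A[i, j])
                G = np.array([[c, s], [-s, c]])
                A[[i - 1, i], :] = G @ A[[i - 1, i], :]
                Q[[i - 1, i], :] = G @ Q[[i - 1, i], :]
        return Q.T, A    # A is now upper-triangular R, with Q @ R = input

    M = np.random.default_rng(4).standard_normal((4, 3))
    Q, R = qr_givens(M)
    print(np.allclose(Q @ R, M), np.allclose(np.tril(R, -1), 0))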

Relevance:

10.00%

Publisher:

Abstract:

This correspondence proposes a new algorithm for OFDM joint data detection and phase noise (PHN) cancellation for constant modulus modulations. We highlight the importance of addressing the overfitting problem, since it is a major detrimental factor impairing the joint detection process. To attack the overfitting problem, we propose an iterative approach based on minimum mean square prediction error (MMSPE), subject to the constraint that the estimated data symbols have constant power. The proposed constrained MMSPE algorithm (C-MMSPE) significantly improves the performance of existing approaches with little extra complexity. Simulation results are also given to verify the proposed algorithm.
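
The C-MMSPE update equations are not given in this abstract; the sketch below only illustrates the constant-power constraint itself - a projection that keeps each symbol estimate's phase but pins its magnitude, which is the kind of constraint that stops the joint estimate from collapsing to a trivial, overfitted solution (names are illustrative):

    import numpy as np

    def project_constant_modulus(s_hat, amplitude=1.0):
        # Enforce |s| = amplitude for every symbol estimate: keep the
        # phase, normalize the magnitude.
        mag = np.abs(s_hat)
        mag[mag == 0] = 1.0            # guard against division by zero
        return amplitude * s_hat / mag

    s_hat = np.array([0.7 + 0.6j, -1.3 + 0.1j, 0.2 - 0.9j])
    print(np.abs(project_constant_modulus(s_hat)))   # all 1.0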