861 results for Channel capacity and propagation modelling


Relevance: 100.00%

Abstract:

Hybrid face recognition, using image (2D) and structural (3D) information, has explored the fusion of Nearest Neighbour classifiers. This paper examines the effectiveness of feature modelling for each individual modality, 2D and 3D. It is further demonstrated that fusing the feature modelling techniques for the 2D and 3D modalities yields performance improvements over the individual classifiers. By fusing the feature modelling classifiers for the two modalities with equal weights, the average Equal Error Rate improves from 12.60% for the 2D classifier and 12.10% for the 3D classifier to 7.38% for the hybrid 2D+3D classifier.
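As a rough illustration of the score-level fusion described in this abstract, the Python sketch below fuses two sets of match scores with equal weights and computes an Equal Error Rate by sweeping a decision threshold. The score arrays, their normalisation, and the function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def equal_error_rate(scores, labels):
    """EER found by sweeping a threshold until false accept ~= false reject.

    scores: higher means more likely a genuine match.
    labels: 1 for genuine comparisons, 0 for impostor comparisons.
    """
    best_eer, best_gap = 1.0, np.inf
    for t in np.unique(scores):
        far = np.mean(scores[labels == 0] >= t)   # false accept rate
        frr = np.mean(scores[labels == 1] < t)    # false reject rate
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

def fuse_equal_weights(scores_2d, scores_3d):
    # Equal-weight score fusion; assumes both modalities are already
    # normalised to a comparable range (e.g. min-max per modality).
    return 0.5 * scores_2d + 0.5 * scores_3d

# Toy usage with random scores standing in for real matcher output.
rng = np.random.default_rng(0)
labels = np.concatenate([np.ones(100), np.zeros(100)]).astype(int)
s2d = np.concatenate([rng.normal(0.70, 0.15, 100), rng.normal(0.40, 0.15, 100)])
s3d = np.concatenate([rng.normal(0.65, 0.15, 100), rng.normal(0.35, 0.15, 100)])
print(equal_error_rate(s2d, labels))
print(equal_error_rate(fuse_equal_weights(s2d, s3d), labels))
```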

Relevance: 100.00%

Abstract:

The anatomy and microstructure of the spine, and in particular of the intervertebral disc, are intimately linked to how they operate in vivo and how they distribute loads to the adjacent musculature and bony anatomy. Degeneration of the intervertebral discs may be characterised by a loss of hydration, loss of disc height, a granular texture and the presence of annular lesions. As such, degeneration of the intervertebral discs compromises the mechanical integrity of their components and results in adaptation and modification of the mechanical means by which loads are distributed between adjacent spinal motion segments.

Relevance: 100.00%

Abstract:

In this article I outline and demonstrate a synthesis of the methods developed by Lemke (1998) and Martin (2000) for analyzing evaluations in English. I demonstrate the synthesis using examples from a 1.3-million-word technology policy corpus drawn from institutions at the local, state, national, and supranational levels. Lemke's (1998) critical model is organized around the broad 'evaluative dimensions' that are deployed to evaluate propositions and proposals in English. Martin's (2000) model is organized with a more overtly systemic-functional orientation around the concept of 'encoded feeling'. In applying both these models at different times, whilst recognizing their individual usefulness and complementarity, I found specific limitations that led me to work towards a synthesis of the two approaches. I also argue for the need to consider genre, media, and institutional aspects more explicitly when claiming intertextual and heteroglossic relations as the basis for inferred evaluations. A basic assertion made in this article is that the perceived Desirability of a process, person, circumstance, or thing is identical to its 'value'. But the Desirability of anything is a socially and thus historically conditioned attribution that requires significant institutional inculcation of other 'types' of value: appropriateness, importance, beauty, power, and so on. I therefore propose a method informed by critical discourse analysis (CDA) that sees evaluation as happening on at least four interdependent levels of abstraction.

Relevance: 100.00%

Abstract:

An important trend in the Chilean retailing industry is the increase in channel blurring. This investigation attempts to identify the relevant store attributes for different retail formats (grocery, department store, drug store, and home improvement). Does the saliency of store attributes vary for consumers across retail formats? Interviews identified twelve salient store attributes for the different retail formats, and survey results showed differences in attribute saliency when consumers shop at different formats: seven of the twelve variables showed significant differences across formats. However, two attributes were relatively important for all four retail formats: product quality and responsiveness of employees.
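As a hedged sketch of the kind of test that could underpin "significant differences across formats", the snippet below runs a one-way ANOVA on hypothetical importance ratings for a single attribute collected from shoppers of four retail formats. The ratings, scale and group sizes are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical 1-7 importance ratings for one attribute (e.g. product quality)
# from shoppers of four retail formats; invented for illustration only.
rng = np.random.default_rng(3)
grocery = rng.integers(5, 8, size=60)
department_store = rng.integers(4, 8, size=60)
drug_store = rng.integers(4, 8, size=60)
home_improvement = rng.integers(3, 8, size=60)

# One-way ANOVA: does mean attribute saliency differ across formats?
f_stat, p_value = f_oneway(grocery, department_store, drug_store, home_improvement)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests saliency differs by format
```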

Relevance: 100.00%

Abstract:

The multi-criteria decision making methods, Preference METHods for Enrichment Evaluation (PROMETHEE) and Graphical Analysis for Interactive Assistance (GAIA), and the two-way Positive Matrix Factorization (PMF) receptor model were applied to airborne fine particle compositional data collected at three sites in Hong Kong during two monitoring campaigns, held from November 2000 to October 2001 and from November 2004 to October 2005. PROMETHEE/GAIA indicated that air quality at all three sites was worse during the later campaign, and that the order of air quality at the sites during each campaign was: rural site > urban site > roadside site. The PMF analysis, on the other hand, identified six common sources at all of the sites (diesel vehicle, fresh sea salt, secondary sulphate, soil, aged sea salt and oil combustion), which together accounted for approximately 68.8 ± 8.7% of the fine particle mass. In addition, road dust, gasoline vehicle, biomass burning, secondary nitrate, and metal processing were identified at some of the sites. Secondary sulphate was found to be the highest contributor to the fine particle mass at the rural and urban sites, with vehicle emissions a high contributor at the roadside site. The PMF results are broadly similar to those obtained in a previous PCA/APCS analysis; however, PMF resolved more factors at each site. Finally, the study demonstrated that combining multi-criteria decision making analysis with receptor modelling can provide more detailed information for formulating the scientific basis for mitigating air pollution in the region.
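To make the receptor-modelling step more concrete, here is a minimal sketch of a PMF-style decomposition of a samples-by-species concentration matrix. Note the simplification: true PMF weights each observation by its measurement uncertainty, whereas this sketch uses plain non-negative matrix factorisation (scikit-learn's NMF) on invented data as a stand-in; the six-source count follows the abstract, everything else is an assumption.

```python
import numpy as np
from sklearn.decomposition import NMF

# X: samples x chemical species concentration matrix (toy random data here).
rng = np.random.default_rng(1)
true_profiles = rng.random((6, 20))        # 6 hypothetical source profiles
true_contrib = rng.random((300, 6))        # per-sample source contributions
X = true_contrib @ true_profiles + 0.01 * rng.random((300, 20))

# Factorise X ~ G @ F with non-negative G (source contributions) and
# F (source profiles), mirroring the PMF decomposition X = GF + E.
model = NMF(n_components=6, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)   # samples x sources
F = model.components_        # sources x species

# Share of the reconstructed fine particle mass attributed to each source.
mass_by_source = (G * F.sum(axis=1)).sum(axis=0)
print(mass_by_source / mass_by_source.sum())
```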

Relevance: 100.00%

Abstract:

Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further biological research. These biological sequences, which may be considered multiscale sequences, have specific features that require more refined methods to characterise. This project studies some of these biological challenges using multiscale analysis methods and a stochastic modelling approach.

The first part of the thesis aims to cluster some unknown proteins and to classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions, and the first step is to classify proteins and predict their families. This motivates us to study some unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins.

The second part of the thesis aims to explore the relationships of proteins based on a layered comparison of their components. Many methods are based on protein homology because resemblance at the sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We consider protein sequences at a detailed level to investigate the problem of comparing proteins. The comparison is based on the empirical mode decomposition (EMD), with protein sequences decomposed into intrinsic mode functions, and a measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships of proteins.

The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried for many years to understand the mechanisms of the yeast genome, yet how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
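For the third part, a minimal sketch of how a stochastic differential equation for a gene expression profile could be simulated with the Euler-Maruyama scheme is given below. The drift term (transcription driven by an assumed regulator signal minus first-order decay), the rate constants, and all names are illustrative assumptions, not the thesis model, which additionally incorporates a Gaussian membership function.

```python
import numpy as np

def euler_maruyama(x0, drift, sigma, dt, n_steps, rng):
    """Simulate dX = drift(X, t) dt + sigma dW with the Euler-Maruyama scheme."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        t = k * dt
        x[k + 1] = x[k] + drift(x[k], t) * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

# Illustrative drift: transcription driven by one assumed regulator signal
# minus first-order degradation of the target transcript.
rng = np.random.default_rng(42)
regulator = lambda t: 1.0 + 0.5 * np.sin(2 * np.pi * t / 10.0)  # assumed regulator profile
alpha, gamma = 0.8, 0.3                                         # assumed transcription/decay rates
drift = lambda x, t: alpha * regulator(t) - gamma * x

profile = euler_maruyama(x0=1.0, drift=drift, sigma=0.05, dt=0.1, n_steps=500, rng=rng)
print(profile[:5])
```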

Relevance: 100.00%

Abstract:

In this conceptual article, we extend earlier work on Open Innovation and Absorptive Capacity. We suggest that the literature on Absorptive Capacity does not place sufficient emphasis on distributed knowledge and learning or on the application of innovative knowledge. To accomplish physical transformations, organisations need specific Innovative Capacities that extend beyond knowledge management. Accessive Capacity is the ability to collect, sort and analyse knowledge from both internal and external sources. Adaptive Capacity is needed to ensure that new pieces of equipment are suitable for the organisation's own purposes even though they may have been originally developed for other uses. Integrative Capacity makes it possible for a new or modified piece of equipment to be fitted into an existing production process with a minimum of inessential and expensive adjustment elsewhere in the process. These Innovative Capacities are controlled and coordinated by Innovative Management Capacity, a higher-order dynamic capability.

Relevance: 100.00%

Abstract:

Human spatial environments must adapt to climate change. Spatial planning is central to climate change adaptation and potentially well suited to the task; however, neoliberal influences and trends threaten this capacity. This paper explores the potential interaction of two emerging research areas, the first of which pursues climate change adaptation through spatial planning and the second of which has observed the neoliberalisation of urban planning. The potential capacity and form of spatial adaptation within the context of a planning environment influenced by neoliberal principles is evaluated. This influence relates to the themes of spatial scale, temporal scale, responsibility for action, strategies and mechanisms, accrual of benefits, negotiation of priorities and approach to uncertainty. The paper presents a conceptual framework of the influence of neoliberalism on spatial adaptation, together with examples of this approach in documents which underpin adaptation in Australia, and identifies the potential characteristics, challenges and opportunities of spatial adaptation under a neoliberal frame. The neoliberal frame does not entirely preclude spatial adaptation but significantly influences its form. Neoliberal approaches favour individual action in response to private incentives and near-term impacts, while collective action, regulatory mechanisms and long-term planning are approached cautiously. Challenges concern the degree to which collective action and a long-term orientation are necessary, how individual adaptation relates to collective vulnerability, and the prioritisation of adaptation by markets. Opportunities might involve the operability of individual and local adaptation, the existence of private incentives to adapt, and the potential to align adaptation with entrepreneurial projects.