933 results for interpreting paradigm


Relevance: 20.00%

Publisher:

Abstract:

The correspondence problem in computer vision is basically a matching task between two or more sets of features. In this paper, we introduce a vectorized image representation, which is a feature-based representation where correspondence has been established with respect to a reference image. This representation has two components: (1) shape, or (x, y) feature locations, and (2) texture, defined as the image grey levels mapped onto the standard reference image. This paper explores an automatic technique for "vectorizing" face images. Our face vectorizer alternates back and forth between computation steps for shape and texture, and a key idea is to structure the two computations so that each one uses the output of the other. A hierarchical coarse-to-fine implementation is discussed, and applications are presented to the problems of facial feature detection and registration of two arbitrary faces.
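The alternating structure described above can be sketched with a toy version in which "shape" is reduced to a single global translation (the actual system estimates dense feature correspondences with a coarse-to-fine hierarchy). The function names and the brute-force search are illustrative assumptions, not the paper's API:

```python
import numpy as np

def estimate_shift(texture, reference, window=3):
    # "Shape" step (toy version): estimate one global integer translation by
    # brute-force search, standing in for dense feature correspondence.
    best, best_err = (0, 0), np.inf
    for dy in range(-window, window + 1):
        for dx in range(-window, window + 1):
            err = np.sum((np.roll(texture, (dy, dx), axis=(0, 1)) - reference) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def vectorize(image, reference, iters=3):
    # Alternate shape and texture steps, each consuming the other's output.
    total_dy, total_dx = 0, 0
    texture = image
    for _ in range(iters):
        dy, dx = estimate_shift(texture, reference)               # shape from texture
        total_dy, total_dx = total_dy + dy, total_dx + dx
        texture = np.roll(image, (total_dy, total_dx), axis=(0, 1))  # texture from shape
    return (total_dy, total_dx), texture
```

Each pass refines the correspondence using the current texture estimate, then resamples the texture using the refined correspondence, mirroring the back-and-forth structure of the vectorizer.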

Relevance: 20.00%

Abstract:

Over the next five years, computer games will find their way into a vast number of American homes, creating a unique educational opportunity: the development of "computer coaches" for the serious intellectual skills required by some of these games. From the player's perspective, the coach will provide advice regarding strategy and tactics for better play. But, from the perspective of the coach, the request for help is an opportunity to tutor basic mathematical, scientific or other kinds of knowledge that the game exercises.

Relevance: 20.00%

Abstract:

The problems under consideration center around the interpretation of binocular stereo disparity. In particular, the goal is to establish a set of mappings from stereo disparity to corresponding three-dimensional scene geometry. An analysis has been developed that shows how disparity information can be interpreted in terms of three-dimensional scene properties, such as surface depth, discontinuities, and orientation. These theoretical developments have been embodied in a set of computer algorithms for the recovery of scene geometry from input stereo disparity. The results of applying these algorithms to several disparity maps are presented. Comparisons are made to the interpretation of stereo disparity by biological systems.
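The best-known of these mappings recovers surface depth: for a rectified binocular pair with focal length f (in pixels) and baseline B, the textbook relation is Z = fB/d. A minimal sketch of that one mapping (the analysis in the abstract also covers discontinuities and surface orientation, which this does not attempt):

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline):
    # Rectified-stereo relation Z = f * B / d: larger disparity means a
    # nearer surface; zero disparity maps to a point at infinity.
    d = np.asarray(disparity, dtype=float)
    safe = np.where(d > 0, d, 1.0)                    # avoid division by zero
    return np.where(d > 0, focal_px * baseline / safe, np.inf)
```

For example, with a 700-pixel focal length and a 0.1 m baseline, a 10-pixel disparity corresponds to a surface 7 m away.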

Relevance: 20.00%

Abstract:

Purpose and rationale
The purpose of this exploratory research is to provide a deeper understanding of how the work environment enhances or constrains organisational creativity (creativity and innovation) within the context of the advertising sector. The argument for the research is that the contemporary literature is dominated by quantitative instruments for measuring climate and work environment across many different sectors. The most influential theory in the extant literature, the componential theory of organisational creativity and innovation (Amabile, 1997; Figure 8), is used as an analytical guide for an ethnographic study within a creative advertising agency based in Scotland. The theory suggests that creative people (skills, expertise and task motivation) are influenced by the work environment in which they operate: challenging work (+), work group supports (+), supervisory encouragement (+), freedom (+), sufficient resources (+), workload pressures (+ or -), organisational encouragement (+) and organisational impediments (-), which, it is argued, enhance (+) or constrain (-) both creativity and innovation. An interpretive research design is conducted to confirm, challenge or extend the componential theory (Amabile, 1997; Figure 8) and to contribute to knowledge as well as practice.

Design/methodology/approach
Scholarly activity within the context of the creative industries and the advertising sector is in its infancy, and research from the alternative paradigm using qualitative methods is limited, which may provide new guidelines for this industry sector. An ethnographic case study is therefore a suitable research design for providing a deeper understanding of the subject area, and it is consistent with a constructivist ontology and an interpretive epistemology. This ontological position is conducive to the researcher's axiology and values, in that meaning is not discovered as an objective truth but socially constructed from the multiple realities of social actors. Ethnography is the study of people in naturally occurring settings, and the creative advertising agency involved in the research is an appropriate purposive sample within an industry renowned for its creativity and innovation. Qualitative methods such as participant observation (field notes, meetings, rituals, social events and tracking a client brief), material artefacts (documents, websites, annual reports, emails, scrapbooks and photographic evidence) and focused interviews (informal and formal conversations, six taped and transcribed interviews and use of Survey Monkey) are used to provide a written account of the agency's work environment. The analytical process of interpreting the ethnographic text is supported by thematic analysis (selective, axial and open coding) using manual analysis and NVivo9 software.

Findings
The findings highlight a complex interaction between the people within the agency and the enhancers and constraints of the work environment in which they operate. This involves the creative work environment (Amabile, 1997; Figure 8) as well as the physical work environment (Cain, 2012; Dul and Ceylan, 2011; Dul et al., 2011) and that of social control and power (Foucault, 1977; Gahan et al., 2007; Knights and Willmott, 2007). The overarching themes to emerge from the data on how the work environment enhances or constrains organisational creativity include creative people (skills, expertise and task motivation), creative process (creative work environment and physical work environment) and creative power (working hours, value of creativity, self-fulfilment and surveillance). The findings therefore confirm that creative people interact with, and are influenced by, aspects of the creative work environment outlined by Amabile (1997; Figure 8). However, the results also challenge and extend the theory to include the physical work environment and creative power.

Originality/value/implications
Methodologically, no other interpretive research uses an ethnographic case study approach within the context of the advertising sector to explore and provide a deeper understanding of the subject area. The contribution to knowledge, in the form of a new interpretive framework (Figure 16), challenges and extends the existing body of knowledge (Amabile, 1997; Figure 8). Moreover, the contribution to practice includes a flexible set of industry guidelines (Appendix 13) that may be transferable to other organisational settings.

Relevance: 20.00%

Abstract:

Rowland, J.J. (2002). Interpreting Analytical Spectra with Evolutionary Computation. In: Fogel, G.B. and Corne, D.W. (eds), Evolutionary Computation in Bioinformatics. Morgan Kaufmann, San Francisco, pp. 341-365. ISBN 1-55860-797-8.

Relevance: 20.00%

Abstract:

We propose and evaluate an admission control paradigm for RTDBS, in which a transaction is submitted to the system as a pair of processes: a primary task and a recovery block. The execution requirements of the primary task are not known a priori, whereas those of the recovery block are known a priori. Upon the submission of a transaction, an Admission Control Mechanism is employed to decide whether to admit or reject that transaction. Once admitted, a transaction is guaranteed to finish executing before its deadline. A transaction is considered to have finished executing if exactly one of two things occurs: either its primary task is completed (successful commitment), or its recovery block is completed (safe termination). Committed transactions bring a profit to the system, whereas terminated transactions bring none. The goal of the admission control and scheduling protocols (e.g., concurrency control, I/O scheduling, memory management) employed in the system is to maximize system profit. We describe a number of admission control strategies and contrast their relative performance through simulations.
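One way such a guarantee can be enforced is sketched below, under the simplifying (and assumed) model of a single server that, in the worst case, must run every admitted recovery block in earliest-deadline order; the paper's actual strategies differ and also account for primary-task execution:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    deadline: float
    recovery_cost: float   # known a priori; the primary task's cost is not

class AdmissionController:
    # Admit a transaction only if every admitted recovery block, run in
    # earliest-deadline-first order, would still finish by its deadline.
    # This guarantees safe termination even if every primary task fails.
    def __init__(self):
        self.admitted = []

    def try_admit(self, now, tx):
        candidate = sorted(self.admitted + [tx], key=lambda t: t.deadline)
        finish = now
        for job in candidate:
            finish += job.recovery_cost
            if finish > job.deadline:
                return False           # reject: the guarantee would break
        self.admitted.append(tx)
        return True                    # admitted: the deadline is now guaranteed
```

Rejecting at submission time, rather than aborting later, is what lets every admitted transaction end in either successful commitment or safe termination.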

Relevance: 20.00%

Abstract:

The proliferation of inexpensive workstations and networks has created a new era in distributed computing. At the same time, non-traditional applications such as computer-aided design (CAD), computer-aided software engineering (CASE), geographic-information systems (GIS), and office-information systems (OIS) have placed increased demands for high-performance transaction processing on database systems. The combination of these factors gives rise to significant challenges in the design of modern database systems. In this thesis, we propose novel techniques whose aim is to improve the performance and scalability of these new database systems. These techniques exploit client resources through client-based transaction management. Client-based transaction management is realized by providing logging facilities locally even when data is shared in a global environment. This thesis presents several recovery algorithms which utilize client disks for storing recovery related information (i.e., log records). Our algorithms work with both coarse and fine-granularity locking and they do not require the merging of client logs at any time. Moreover, our algorithms support fine-granularity locking with multiple clients permitted to concurrently update different portions of the same database page. The database state is recovered correctly when there is a complex crash as well as when the updates performed by different clients on a page are not present on the disk version of the page, even though some of the updating transactions have committed. This thesis also presents the implementation of the proposed algorithms in a memory-mapped storage manager as well as a detailed performance study of these algorithms using the OO1 database benchmark. The performance results show that client-based logging is superior to traditional server-based logging. This is because client-based logging is an effective way to reduce dependencies on server CPU and disk resources and, thus, prevents the server from becoming a performance bottleneck as quickly when the number of clients accessing the database increases.
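The core client-side discipline can be sketched as an ordinary write-ahead log kept on the client's own disk (an illustrative sketch only; the thesis algorithms additionally handle fine-granularity locking and pages updated concurrently by multiple clients, which this does not model):

```python
class ClientLog:
    """Toy client-side write-ahead log; the list stands in for the client's
    local log disk, so client logs never need to be merged at the server."""
    def __init__(self):
        self.records = []
        self.durable_lsn = -1          # highest log sequence number on "disk"

    def append(self, page_id, before, after):
        # Record before/after images locally instead of shipping them
        # to the server, keeping recovery work off the server's disk.
        lsn = len(self.records)
        self.records.append((lsn, page_id, before, after))
        return lsn

    def flush(self, lsn):
        # Force log records up to `lsn` to the local disk.
        self.durable_lsn = max(self.durable_lsn, lsn)

    def can_write_back(self, lsn):
        # WAL rule: a page version may reach the server only after the log
        # records describing its updates are durable at the client.
        return lsn <= self.durable_lsn
```

Because each client forces only its own log, the server sees neither log traffic nor log-merge work, which is the source of the scalability advantage the abstract reports.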

Relevance: 20.00%

Abstract:

SomeCast is a novel paradigm for the reliable multicast of real-time data to a large set of receivers over the Internet. SomeCast is receiver-initiated and thus scalable in the number of receivers, the diverse characteristics of paths between senders and receivers (e.g. maximum bandwidth and round-trip time), and the dynamic conditions of such paths (e.g. congestion-induced delays and losses). SomeCast enables receivers to dynamically adjust the rate at which they receive multicast information to enable the satisfaction of real-time QoS constraints (e.g. rate, deadlines, or jitter). This is done by enabling a receiver to join SOME number of concurrent multiCAST sessions, whereby each session delivers a portion of an encoding of the real-time data. By adjusting the number of such sessions dynamically, client-specific QoS constraints can be met independently. The SomeCast paradigm can be thought of as a generalization of the AnyCast (e.g. Dynamic Server Selection) and ManyCast (e.g. Digital Fountain) paradigms, which have been proposed in the literature to address issues of scalability of UniCast and MultiCast environments, respectively. In this paper we overview the SomeCast paradigm, describe an instance of a SomeCast protocol, and present simulation results that quantify the significant advantages gained from adopting such a protocol for the reliable multicast of data to a diverse set of receivers subject to real-time QoS constraints.
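The receiver-side adaptation can be sketched as a simple control rule. The function names, the per-session rate model, and the 20% overshoot threshold are illustrative assumptions; an actual SomeCast protocol reacts to measured per-session path conditions:

```python
import math

def sessions_for_target(target_rate, per_session_rate, max_sessions):
    # Join just enough concurrent sessions, each carrying one portion of
    # the encoding, to cover the receiver's QoS rate target.
    return min(max_sessions, math.ceil(target_rate / per_session_rate))

def adjust_sessions(current, measured_rate, target_rate, max_sessions):
    # React to dynamic path conditions: join one more session when the
    # aggregate rate falls short, leave one when it overshoots by >20%.
    if measured_rate < target_rate and current < max_sessions:
        return current + 1
    if measured_rate > 1.2 * target_rate and current > 1:
        return current - 1
    return current
```

Because each receiver runs this loop independently, client-specific QoS targets are met without any sender-side coordination, which is what makes the scheme receiver-initiated and scalable.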

Relevance: 20.00%

Abstract:

Since at least the early 1990s, stage and risk migration have been seen in patients with prostate cancer, likely corresponding to the institution of prostate specific antigen (PSA) screening in health systems. Preoperative risk factors, including PSA level and clinical stage, have decreased significantly. These improved prognostic variables have led to a larger portion of men being stratified with low-risk disease, as per the classification of D'Amico and associates. This, in turn, has corresponded with more favorable postoperative variables, including decreased extraprostatic tumor extension and prolonged biochemical recurrence-free intervals. The advent of focal therapy is bolstered by findings of increased unilateral disease with decreased tumor volume. Increasingly, targeted or delayed therapies may be possible within the current era of lower risk disease.

Relevance: 20.00%

Abstract:

Functional neuroimaging studies of episodic memory retrieval generally measure brain activity while participants remember items encountered in the laboratory ("controlled laboratory condition") or events from their own life ("open autobiographical condition"). Differences in activation between these conditions may reflect differences in retrieval processes, memory remoteness, emotional content, retrieval success, self-referential processing, visual/spatial memory, and recollection. To clarify the nature of these differences, a functional MRI study was conducted using a novel "photo paradigm," which allows greater control over the autobiographical condition, including a measure of retrieval accuracy. Undergraduate students took photos in specified campus locations ("controlled autobiographical condition"), viewed in the laboratory similar photos taken by other participants (controlled laboratory condition), and were then scanned while recognizing the two kinds of photos. Both conditions activated a common episodic memory network that included medial temporal and prefrontal regions. Compared with the controlled laboratory condition, the controlled autobiographical condition elicited greater activity in regions associated with self-referential processing (medial prefrontal cortex), visual/spatial memory (visual and parahippocampal regions), and recollection (hippocampus). The photo paradigm provides a way of investigating the functional neuroanatomy of real-life episodic memory under rigorous experimental control.

Relevance: 20.00%

Abstract:

A key challenge in promoting decent work worldwide is how to improve the position of both firms and workers in value chains and global production networks driven by lead firms. This article develops a framework for analysing the linkages between the economic upgrading of firms and the social upgrading of workers. Drawing on studies which indicate that firm upgrading does not necessarily lead to improvements for workers, with a particular focus on the Moroccan garment industry, it outlines different trajectories and scenarios to provide a better understanding of the relationship between economic and social upgrading. © 2011 The authors; journal compilation © International Labour Organization 2011.

Relevance: 20.00%

Abstract:

Natural distributed systems are adaptive, scalable and fault-tolerant. Emergence science describes how higher-level self-regulatory behaviour arises in natural systems from many participants following simple rulesets. Emergence advocates simple communication models, autonomy and independence, enhancing robustness and self-stabilization. High-quality distributed applications such as autonomic systems must satisfy the appropriate nonfunctional requirements, which include scalability, efficiency, robustness, low latency and stability. However, the traditional design of distributed applications, especially in terms of the communication strategies employed, can introduce compromises between these characteristics. This paper discusses ways in which emergence science can be applied to distributed computing, avoiding some of the compromises associated with traditionally-designed applications. To demonstrate the effectiveness of this paradigm, an emergent election algorithm is described and its performance evaluated. The design incorporates nondeterministic behaviour. The resulting algorithm has very low communication complexity, and is simultaneously very stable, scalable and robust.
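The flavour of such a design can be conveyed with a gossip-style max-ID election, an illustrative stand-in rather than the paper's algorithm: each node follows one simple local rule with nondeterministic (random) peer selection, and global agreement emerges with no coordinator and no global communication:

```python
import random

def emergent_election(node_ids, max_exchanges=2000, seed=42):
    # Simple ruleset: in each exchange, two random nodes compare the leader
    # candidate each currently believes in, and both adopt the larger ID.
    # Agreement on max(node_ids) emerges from this purely local behaviour.
    rng = random.Random(seed)
    belief = {n: n for n in node_ids}   # every node starts as its own candidate
    for _ in range(max_exchanges):
        a, b = rng.sample(node_ids, 2)  # one pairwise gossip exchange
        belief[a] = belief[b] = max(belief[a], belief[b])
        if len(set(belief.values())) == 1:
            break                       # consensus has emerged
    return belief
```

Each exchange involves only two nodes and one comparison, so communication cost per step is constant, which is the kind of low communication complexity the abstract claims for the emergent approach.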