889 results for "Almost always propositional logic"


Relevance: 20.00%

Publisher:

Abstract:

Purpose: This paper takes a customer view on corporate image and value, and discusses the value of image in service. We propose a model depicting how the customer's corporate brand image affects the customer's value-in-use.
Methodology/approach: The paper is a conceptual development of customers' value and image construction processes. By integrating ideas and elements from the current service and branding literature, a model is proposed that extends current views on how value-in-use emerges.
Findings: From a current service perspective, it is the customer who makes value assessments when experiencing service. Similarly, if branding is the concept used to denote the service provider's intentions and attempts to create a corporate brand, image construction is the corresponding process in which the customer constructs the corporate image. This image construction process is always present, both in service interactions and in communication, and affects the customer's value-in-use. We argue that two interrelated concepts are needed to capture corporate image construction, its dynamics, and value-in-use: image-in-use and image heritage.
Research implications: The model integrates two different streams of research, pointing to the need to consider traditional marketing communication and service interactions as inherently related to each other from the customer's point of view. Additionally, the model provides a platform for understanding how value-in-use emerges over time. New methodological approaches and techniques are needed to capture image-in-use and image heritage and their interplay with value-in-use.
Practical implications: The company may not be able to control the emergence of value-in-use, but may influence it, not only in interactions with the customer but also through pure communication. Branding activities should therefore be considered as related to service operations and service development. Additionally, practitioners need to apply qualitative methods to understand the customer's view on image and value-in-use.
Originality/value: The paper presents a novel approach for understanding and studying how the customer's image of a company influences the emergence of value-in-use. The model implies that the customer's corporate image plays a crucial role in experienced value-in-use.

Relevance: 20.00%

Publisher:

Abstract:

The discussion of a service-dominant logic has made the findings of decades of service marketing research a topic of interest for marketing at large. Some fundamental aspects of the logic, such as value creation and its marketing implications, are more complex than they have so far been treated and need to be developed further to serve marketing theory and practice well. Following the analysis in the present article, it is argued that although customers are co-producers in service processes, according to the value-in-use notion adopted in the contemporary marketing and management literature they are fundamentally the creators of value for themselves. Furthermore, it is concluded that although firms, by providing goods and services as input resources into customers' consumption and value-generating processes, are fundamentally value facilitators, interactions with customers that exist or can be created enable firms to engage themselves with their customers' processes and thereby become co-creators of value with their customers. As marketing implications it is observed that: 1) the goal of marketing is to support customers' value creation; 2) following a service logic, and owing to the existence of interactions where the firm's and the customer's processes merge into an integrated joint value creation process, the firm is not restricted to making value propositions only, but can directly and actively influence the customer's value fulfilment as well and extend its marketing process to include activities during customer-firm interactions; and 3) although all goods and services are consumed as service, customers' purchasing decisions can be expected to be dependent on whether they have the skills and interest to use a resource, such as a good, as service, or whether they want to buy extended market offerings that include process-related elements. Finally, the analysis concludes with five service logic theses.

Relevance: 20.00%

Publisher:

Abstract:

The k-colouring problem is to colour a given k-colourable graph with k colours. This problem is known to be NP-hard even for fixed k ≥ 3. The best known polynomial-time approximation algorithms require n^δ colours (for a positive constant δ depending on k) to colour an arbitrary k-colourable n-vertex graph. The situation is entirely different if we look at the average performance of an algorithm rather than its worst-case performance. It is well known that a k-colourable graph drawn from certain classes of distributions can be k-coloured almost surely in polynomial time. In this paper, we present further results in this direction. We consider k-colourable graphs drawn from the random model in which each allowed edge is chosen independently with probability p(n) after the vertex set has initially been partitioned into k colour classes. We present polynomial-time algorithms of two different types. The first type of algorithm always runs in polynomial time and succeeds almost surely. Algorithms of this type have been proposed before, but our algorithms have provably exponentially small failure probabilities. The second type of algorithm always succeeds and has polynomial running time on average. Such algorithms are more useful, and more difficult to obtain, than the first type. Our algorithms work as long as p(n) ≥ n^(-1+ε), where ε is a constant greater than 1/4.
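
As a concrete illustration of the random model described above, the following minimal Python sketch (function and parameter names are our own, not from the paper) partitions the vertex set into k colour classes and then includes each allowed edge independently with probability p(n):

    import random

    def planted_k_colourable_graph(n, k, p, seed=0):
        """Generate a k-colourable graph: partition the n vertices into k colour
        classes, then include each allowed edge (between vertices of different
        classes) independently with probability p."""
        rng = random.Random(seed)
        colour = [rng.randrange(k) for _ in range(n)]
        edges = [(u, v) for u in range(n) for v in range(u + 1, n)
                 if colour[u] != colour[v] and rng.random() < p]
        return colour, edges

    # Example: n = 200, k = 3, and p(n) = n^(-1+eps) with eps = 0.3 (> 1/4),
    # i.e. within the regime for which the algorithms are stated to work.
    n, k = 200, 3
    planted, graph = planted_k_colourable_graph(n, k, p=n ** (-0.7))
    print(len(graph), "edges; the planted colouring uses", k, "colours")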

Relevance: 20.00%

Publisher:

Abstract:

This review article, based on a lecture delivered in Madras in 1985, is an account of the author's experience in working out the molecular structure and conformation of the collagen triple helix over the years 1952–78. It starts with the first proposal of the correct triple helix in 1954, but with three residues per turn, which was later refined in 1955 into a coiled-coil structure with approximately 3.3 residues per turn. The structure readily fitted proline and hydroxyproline residues and required glycine as every third residue in each of the three chains. The controversy regarding the number of hydrogen bonds per tripeptide could not be resolved by X-ray diffraction or energy minimization, but physicochemical data obtained in other laboratories during 1961–65 strongly pointed to two hydrogen bonds, as suggested by the author. However, it was felt that the structure with one straight NH…O bond was better. A reconciliation of the two was obtained in Chicago in 1968, by showing that the second hydrogen bond is via a water molecule, which makes it weaker, as found in the physicochemical studies mentioned above. This water molecule was also shown, in 1973, to take part in further cross-linking hydrogen bonds with the OH group of hydroxyproline, which always occurs in the position preceding glycine and is at the right distance from the water. Thus, almost all features of the primary structure, the X-ray pattern, optical and hydrodynamic data, and the role of hydroxyproline in stabilising the triple-helical structure have been satisfactorily accounted for. These also lead to a confirmation of Pauling's theory that vitamin C improves immunity to diseases, as explained in the last section.

Relevance: 20.00%

Publisher:

Abstract:

An algebraic generalization of the well-known binary q-function array to a multivalued q-function array is presented. Tree-structure realizations can be associated with both binary and multivalued q-functions. Synthesis of multivalued functions using this array is very simple.

Relevance: 20.00%

Publisher:

Abstract:

We show that the ratio of matched individuals to blocking pairs grows linearly with the number of propose–accept rounds executed by the Gale–Shapley algorithm for the stable marriage problem. Consequently, the participants can arrive at an almost stable matching even without full information about the problem instance; for each participant, knowing only its local neighbourhood is enough. In distributed-systems parlance, this means that if each person has only a constant number of acceptable partners, an almost stable matching emerges after a constant number of synchronous communication rounds. We apply our results to give a distributed (2 + ε)-approximation algorithm for maximum-weight matching in bicoloured graphs and a centralised randomised constant-time approximation scheme for estimating the size of a stable matching.
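
To make the round-by-round behaviour concrete, here is a minimal Python sketch (the toy instance and all names are ours, not the paper's) that runs synchronous propose-accept rounds of the Gale-Shapley algorithm for a fixed round budget and counts the blocking pairs left in the resulting partial matching:

    import random

    def truncated_gale_shapley(men_prefs, women_prefs, rounds):
        """Run a fixed number of synchronous propose-accept rounds and return
        engaged_to, where engaged_to[w] is the man currently holding woman w."""
        n = len(men_prefs)
        next_choice = [0] * n        # index of the next woman each man proposes to
        engaged_to = [None] * n
        rank = [{m: r for r, m in enumerate(p)} for p in women_prefs]
        for _ in range(rounds):
            proposals = {}
            for m in range(n):       # every free man proposes once per round
                if m not in engaged_to and next_choice[m] < n:
                    w = men_prefs[m][next_choice[m]]
                    next_choice[m] += 1
                    proposals.setdefault(w, []).append(m)
            for w, suitors in proposals.items():   # each woman keeps the best man seen
                best = min(suitors, key=lambda m: rank[w][m])
                if engaged_to[w] is None or rank[w][best] < rank[w][engaged_to[w]]:
                    engaged_to[w] = best
        return engaged_to

    def blocking_pairs(men_prefs, women_prefs, engaged_to):
        """Count pairs (m, w) who both prefer each other to their current situation."""
        n = len(men_prefs)
        partner = {m: w for w, m in enumerate(engaged_to) if m is not None}
        mrank = [{w: r for r, w in enumerate(p)} for p in men_prefs]
        wrank = [{m: r for r, m in enumerate(p)} for p in women_prefs]
        count = 0
        for m in range(n):
            for w in range(n):
                m_prefers = m not in partner or mrank[m][w] < mrank[m][partner[m]]
                w_prefers = engaged_to[w] is None or wrank[w][m] < wrank[w][engaged_to[w]]
                if m_prefers and w_prefers and partner.get(m) != w:
                    count += 1
        return count

    # Toy instance: the number of blocking pairs shrinks as the round budget grows.
    random.seed(1)
    n = 6
    men_prefs = [random.sample(range(n), n) for _ in range(n)]
    women_prefs = [random.sample(range(n), n) for _ in range(n)]
    for rounds in (1, 2, 4, 8):
        matching = truncated_gale_shapley(men_prefs, women_prefs, rounds)
        print(rounds, "rounds:", blocking_pairs(men_prefs, women_prefs, matching), "blocking pairs")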

Relevance: 20.00%

Publisher:

Abstract:

The aim of this thesis was to examine the understanding of community in George Lindbeck's The Nature of Doctrine (ND). Intrinsic to this question was also examining how Lindbeck understands the relation between the text and the world, which both meet in a Christian community. Thirdly, this study also aimed at understanding what the persuasiveness of this understanding depends on. The method applied for this task was systematic analysis. The study was conducted by first providing an orientation into the non-theological substance of the ND, which was assumed useful with respect to the aim of this study. The study then went on to explore Lindbeck in his own context of postliberal theology in order to see how the ND was received. It also attempted to provide a picture of how the ND relates to Lindbeck as a theologian. The third chapter was a descriptive analysis of the cultural-linguistic perspective, which is understood as being directly proportional to his understanding of community. The fourth chapter was an analysis of how the cultural-linguistic perspective sees the relation between the text and the world. When religion is understood from a cultural-linguistic perspective, it presents itself as a cultural-linguistic entity, which Lindbeck understands as a comprehensive interpretive scheme that structures human experience and one's understanding of oneself and the world in which one lives. When one exists in this entity, it is the entity which shapes the subjectivities of all those who are at home in it, which makes participation in the life of a cultural-linguistic entity a condition for understanding it. Religion is above all an external word that moulds and shapes our religious existence and experience. Understanding faith as coming from hearing is then something that correlates with the cultural-linguistic depiction of reality. Religion informs us of a religious reality; it does not originate in any way from ourselves. This externality, linked to the axiomatic nature of religion, is also something that distinguishes Lindbeck sharply from liberalist tendencies, which understand religion as ultimately expressing the prereflective depths of the inner self. Language is the central analogy for understanding the medium in which one moves when inhabiting a cultural-linguistic system, because language is the transmitting medium in which the cultural-linguistic system is embodied. The realism entailed in Lindbeck's understanding of a community is that we are fundamentally on the receiving end when it comes to our identities, whether cultural or religious. We always witness to something. Its persuasiveness rests on the fact that we never exist in an unpersuaded reality. The language of Christ is a self-sustaining and irreducible cultural-linguistic entity, which is ontologically founded upon Christ. It transmits the reality of a new being. The basic relation to the world for a Christian is that of witnessing salvation in Christ: witnessing Christ as the home of hearing the message of salvation, which is the God-willed way. Following this logic, the relation of the world and the text is one of relating to the world from the text, i.e. in Christ, through the word (text), for the world, because it assumes its logic from the way Christ ontologically relates to us.

Relevance: 20.00%

Publisher:

Abstract:

In order to further develop the logic of service, value creation, value co-creation and value have to be formally and rigorously defined, so that the nature, content and locus of value, and the roles of service providers and customers in value creation, can be unambiguously assessed. In the present article, following the underpinning logic of value-in-use, it is demonstrated that to achieve this, value creation is best defined as the customer's creation of value-in-use. The analysis shows that the firm's and the customer's processes and activities can be divided into a provider sphere, closed to the customer, and a customer sphere, closed to the firm. Value creation occurs in the customer sphere, whereas firms in the provider sphere facilitate value creation by producing resources and processes which represent potential value, or expected value-in-use, for their customers. By getting access to the closed customer sphere, firms can create a joint value sphere and engage in customers' value creation as co-creators of value with them. This approach establishes a theoretically sound foundation for understanding value creation in service logic, and enables meaningful managerial implications, for example regarding what is required for the co-creation of value, as well as further theoretical elaboration.

Relevance: 20.00%

Publisher:

Abstract:

Coastal lagoons are complex ecosystems exhibiting a high degree of non-linearity in the distribution and exchange of nutrients dissolved in the water column, owing to their spatio-temporal characteristics. This factor has a direct influence on the concentration of chlorophyll-a, an indicator of primary productivity in water bodies such as lakes and lagoons. Moreover, the seasonal variability in the characteristics of large-scale basins further contributes to the uncertainties in the data on the physico-chemical and biological characteristics of the lagoons. Considering the above, modelling the distribution of nutrients with respect to chlorophyll-a concentrations requires an effective approach that appropriately accounts for the non-linearity of the ecosystem as well as the uncertainties in the available data. In the present investigation, fuzzy logic was used to develop a new model of primary production for Pulicat lagoon, on the southeast coast of India. Multiple regression analysis revealed that the concentration of chlorophyll-a in the lagoon was highly influenced by the dissolved concentrations of nitrates, nitrites and phosphorus, to different extents over different seasons and years. A high degree of agreement was obtained between the actual field values and those predicted by the new fuzzy model (d = 0.881 to 0.788) for the years 2005 and 2006, illustrating the efficiency of the model in predicting chlorophyll-a values in the lagoon.
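
The abstract does not give the rule base, so the following Python sketch is only a toy illustration of fuzzy-logic prediction (zero-order Sugeno style, with triangular membership functions); all variable names, membership ranges and rule outputs are assumptions, not values from the Pulicat study:

    def tri(x, a, b, c):
        """Triangular membership function with support (a, c) and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def predict_chlorophyll(nitrate, phosphorus):
        """Toy inference: fuzzify the two nutrient inputs, fire two rules with
        min as the AND operator, and return the weighted average of the rule
        outputs as a crisp chlorophyll-a estimate (units are illustrative)."""
        w_high = min(tri(nitrate, 0.5, 1.5, 2.5), tri(phosphorus, 0.05, 0.15, 0.25))
        w_low = min(tri(nitrate, -0.5, 0.25, 1.0), tri(phosphorus, -0.05, 0.02, 0.10))
        if w_high + w_low == 0.0:
            return None                      # inputs fall outside both rules
        return (w_high * 12.0 + w_low * 2.0) / (w_high + w_low)

    print(predict_chlorophyll(nitrate=1.2, phosphorus=0.12))   # high-nutrient case -> ~12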

Relevance: 20.00%

Publisher:

Abstract:

The increasing variability in device leakage has made the design of keepers for wide OR structures a challenging task. Conventional feedback keepers (CONV) can no longer improve the performance of wide dynamic gates in future technologies. In this paper, we propose an adaptive keeper technique called the rate sensing keeper (RSK) that enables faster switching and tracks variation across different process corners. It can switch up to 1.9x faster (for 20 legs) than CONV and can scale up to 32 legs, as against 20 legs for CONV, in a 130-nm 1.2-V process. The delay tracking is within 8% across the different process corners. We demonstrate the circuit operation of RSK using a 32 x 8 register file implemented in an industrial 130-nm 1.2-V CMOS process. The performance of individual dynamic logic gates is also evaluated on chip for various keeper techniques. We show that the RSK technique gives superior performance compared to alternatives such as the conditional keeper (CKP) and the current-mirror-based keeper (LCR).

Relevance: 20.00%

Publisher:

Abstract:

Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Further, an automatic verification tool is necessary to reduce human errors in the reasoning process. In this regard, this paper is an attempt towards the development of DL, a novel executable specification language for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatic verification of the satisfiability of DL specifications, and testing conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions to the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
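
Since the paper's solutions extend the classical tableau algorithm, a minimal propositional tableau satisfiability check may help fix the idea; this Python sketch is not the DL procedure, and the tuple encoding of formulas is our own:

    def tableau_sat(formulas):
        """Classical tableau method for propositional satisfiability.
        Formulas are 'p' (atom), ('not', f), ('and', f, g) or ('or', f, g)."""
        formulas = list(formulas)
        for i, f in enumerate(formulas):
            if isinstance(f, tuple):
                rest = formulas[:i] + formulas[i + 1:]
                if f[0] == 'and':                    # alpha rule: add both conjuncts
                    return tableau_sat(rest + [f[1], f[2]])
                if f[0] == 'or':                     # beta rule: branch on the disjuncts
                    return tableau_sat(rest + [f[1]]) or tableau_sat(rest + [f[2]])
                if f[0] == 'not' and isinstance(f[1], tuple):
                    g = f[1]
                    if g[0] == 'not':                # double negation
                        return tableau_sat(rest + [g[1]])
                    if g[0] == 'and':                # push negation inwards (De Morgan)
                        return tableau_sat(rest + [('or', ('not', g[1]), ('not', g[2]))])
                    if g[0] == 'or':
                        return tableau_sat(rest + [('and', ('not', g[1]), ('not', g[2]))])
        # Only literals remain: the branch is open unless it has a contradiction.
        atoms = {f for f in formulas if isinstance(f, str)}
        negated = {f[1] for f in formulas if isinstance(f, tuple) and f[0] == 'not'}
        return not (atoms & negated)

    print(tableau_sat([('or', 'p', 'q'), ('not', 'p'), ('not', 'q')]))   # False (unsatisfiable)
    print(tableau_sat([('or', 'p', 'q'), ('not', 'p')]))                 # True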

Relevance: 20.00%

Publisher:

Abstract:

Coalescence processes are investigated during phase separation in a density-matched liquid mixture (partially deuterated cyclohexane and methanol) under near-critical conditions. As a result of the interplay between capillary and lubrication forces, "nose" coalescence appears to be always associated with the slow growth of isolated droplets (exponent ≈ 1/3), whereas "dimple" coalescence corresponds to the fast growth of interconnected droplets (exponent ≈ 1). At each stage of growth, the distribution of droplets trapped during dimple coalescence is reminiscent of all of the previous coalescence events.
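
Written out as growth laws (with R the characteristic droplet radius and t time; the notation is assumed here, only the exponents come from the abstract), the two regimes read:

    R_{\text{nose}}(t) \propto t^{1/3}, \qquad R_{\text{dimple}}(t) \propto t^{1}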

Relevance: 20.00%

Publisher:

Abstract:

The aim of logic synthesis is to produce circuits which implement the given Boolean function while meeting timing constraints and requiring the minimum silicon area. Logic synthesis involves two steps, namely logic decomposition and technology mapping. Existing methods treat the two as separate operations. The traditional approach is to minimize the number of literals without considering the target technology during the decomposition phase; the decomposed expressions are then mapped onto the target technology to optimize the area, and timing optimization is carried out subsequently. A new approach which treats logic decomposition and technology mapping as a single operation is presented. The logic decomposition is based on the parameters of the target technology, and the area and timing optimization is carried out during the logic decomposition phase itself. Results using MCNC benchmark circuits are presented to show that this method produces circuits which are 38% faster while requiring a 14% increase in area.
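
As a toy illustration of the literal-count objective that the traditional decomposition step optimizes (the expression encoding below is our own, not the paper's):

    def literal_count(expr):
        """Count literal occurrences in a Boolean expression encoded as nested
        tuples: 'a' is a literal, ('not', e), ('and', e1, e2, ...), ('or', e1, e2, ...)."""
        if isinstance(expr, str):
            return 1
        return sum(literal_count(arg) for arg in expr[1:])

    # The same function in sum-of-products and in factored form:
    sop = ('or', ('and', 'a', 'b'), ('and', 'a', 'c'))   # a.b + a.c -> 4 literals
    factored = ('and', 'a', ('or', 'b', 'c'))            # a.(b + c) -> 3 literals
    print(literal_count(sop), literal_count(factored))   # 4 3

The abstract's argument is that a form which looks cheaper by this technology-independent metric is not necessarily the best starting point; decomposition should instead be guided by the parameters of the target technology.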