949 results for Interior One Flange Load Case
Abstract:
Doherty, A. M. and Alexander, N. (2004). Relationship development in international retail franchising: Case study evidence from the UK fashion sector. European Journal of Marketing, 38(9-10), pp. 1215-1235.
Abstract:
Dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Environmental Engineering and Management, Industrial Systems branch
Abstract:
Thesis submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Doctor in Business Sciences, with a specialization in Management
Abstract:
Thesis submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Doctor in Social Sciences, with a specialization in Psychology
Abstract:
This dissertation illustrates the merits of an interdisciplinary approach to religious conversion by employing Lewis Rambo’s systemic stage model to illumine the process of St. Augustine’s conversion. Previous studies of Augustine’s conversion have commonly explored his narrative of transformation from the perspective of one specific discipline, such as theology, history, or psychology. In doing so, they have necessarily restricted attention to a limited set of questions and problems. By bringing these disciplines into a structured, critical conversation, this study demonstrates how formulating and responding to the interplay among personal, social, cultural, and religious dimensions of Augustine’s conversion process may eventuate in the consideration of issues previously unarticulated and thus unaddressed. Rambo (1993) formulates a model of religious change that consists of what he calls context, crisis, quest, encounter, interaction, commitment, and consequences. Change is explained by drawing upon the research and scholarship of psychologists, sociologists, anthropologists, and religionists, in conjunction with the contributions of theologians. This study unfolds in the following chapters: I. Introduction; II. Literature review of scholarship about conversion, with emphasis on explication of Rambo’s model; III. A description of the case of Augustine, drawn from a close reading of the Confessions; IV. Literature review of scholarship about Augustine’s conversion; V. Interdisciplinary interpretation of Augustine’s conversion; and VI. Implications for scholars of conversion, and for pastoral caregivers, as well as recommendations for future research. This dissertation demonstrates how Augustine’s conversion experience was deeply influenced by 1) psychological distress and crisis; 2) the quest to know himself and the divine; 3) interactions with significant others; 4) participation in Christian communities; 5) philosophical and cultural changes; and 6) the encounter with the divine. As such, this study reveals the value of interpreting Augustine’s conversion as an evolving process constituted in multiple factors that can be differentiated from one another, yet clearly interact with one another. It examines the implications of constructing an interdisciplinary approach to Augustine’s conversion narrative for both the academy and the Christian community, and recommends the use of Rambo’s model in studies of other cases of religious change.
Abstract:
To support the diverse Quality of Service (QoS) requirements of real-time (e.g. audio/video) applications in integrated services networks, several routing algorithms that allow for the reservation of the needed bandwidth over a Virtual Circuit (VC) established on one of several candidate routes have been proposed. Traditionally, such routing is done using the least-loaded concept, and thus results in balancing the load across the set of candidate routes. In a recent study, we have established the inadequacy of this load balancing practice and proposed the use of load profiling as an alternative. Load profiling techniques allow the distribution of "available" bandwidth across a set of candidate routes to match the characteristics of incoming VC QoS requests. In this paper we thoroughly characterize the performance of VC routing using load profiling and contrast it to routing using load balancing and load packing. We do so both analytically and via extensive simulations of multi-class traffic routing in Virtual Path (VP) based networks. Our findings confirm that for routing guaranteed bandwidth flows in VP networks, load balancing is not desirable as it results in VP bandwidth fragmentation, which adversely affects the likelihood of accepting new VC requests. This fragmentation is more pronounced when the granularity of VC requests is large. Typically, this occurs when a common VC is established to carry the aggregate traffic flow of many high-bandwidth real-time sources. For VP-based networks, our simulation results show that our load-profiling VC routing scheme performs as well as or better than the traditional load-balancing VC routing in terms of revenue under both skewed and uniform workloads. Furthermore, load-profiling routing improves routing fairness by proactively increasing the chances of admitting high-bandwidth connections.
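As a rough Python sketch of the contrast described above (not the authors' actual scheme), the two functions below assign an incoming VC request to one of several candidate virtual paths: one by the least-loaded rule, the other by a simple profiling-style rule that prefers the tightest-fitting path so that large blocks of bandwidth are preserved for large requests. The path capacities, request size, and selection rules are illustrative assumptions.

```python
# Illustrative sketch only (assumed rules, not the paper's algorithm): routing a
# VC request onto one of several candidate virtual paths (VPs).

def route_load_balancing(residual_bw, request_bw):
    """Least-loaded rule: pick the feasible VP with the most residual bandwidth."""
    feasible = [i for i, r in enumerate(residual_bw) if r >= request_bw]
    return max(feasible, key=lambda i: residual_bw[i]) if feasible else None

def route_load_profiling(residual_bw, request_bw):
    """Profiling-style rule (assumed): pick the feasible VP whose residual bandwidth
    is the tightest fit, preserving large bandwidth blocks for large requests."""
    feasible = [i for i, r in enumerate(residual_bw) if r >= request_bw]
    return min(feasible, key=lambda i: residual_bw[i] - request_bw) if feasible else None

if __name__ == "__main__":
    vps = [40.0, 25.0, 10.0]   # residual bandwidth per candidate VP (Mb/s, assumed)
    request = 8.0              # bandwidth demanded by the incoming VC request
    print("balancing ->", route_load_balancing(vps, request))   # 0: spreads the load
    print("profiling ->", route_load_profiling(vps, request))   # 2: packs small requests
```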
Abstract:
We consider the problem of task assignment in a distributed system (such as a distributed Web server) in which task sizes are drawn from a heavy-tailed distribution. Many task assignment algorithms are based on the heuristic that balancing the load at the server hosts will result in optimal performance. We show this conventional wisdom is less true when the task size distribution is heavy-tailed (as is the case for Web file sizes). We introduce a new task assignment policy, called Size Interval Task Assignment with Variable Load (SITA-V). SITA-V purposely operates the server hosts at different loads, and directs smaller tasks to the lighter-loaded hosts. The result is that SITA-V provably decreases the mean task slowdown by significant factors (up to 1000 or more) where the more heavy-tailed the workload, the greater the improvement factor. We evaluate the tradeoff between improvement in slowdown and increase in waiting time in a system using SITA-V, and show conditions under which SITA-V represents a particularly appealing policy. We conclude with a discussion of the use of SITA-V in a distributed Web server, and show that it is attractive because it has a simple implementation which requires no communication from the server hosts back to the task router.
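A minimal sketch of the size-interval idea follows, assuming illustrative cutoffs rather than the thesis's analytically derived ones: tasks are routed purely by size, with the hosts serving the small-size intervals deliberately run at lighter load.

```python
# Minimal sketch of size-interval task assignment in the spirit of SITA-V.
# The cutoffs and host ordering are illustrative assumptions, not the thesis's values.
import bisect

class SizeIntervalRouter:
    def __init__(self, cutoffs):
        # cutoffs[i] is the upper bound of the task-size interval handled by host i;
        # hosts serving the small intervals are intentionally kept lightly loaded.
        self.cutoffs = sorted(cutoffs)

    def assign(self, task_size):
        """Return the index of the host responsible for a task of this size."""
        i = bisect.bisect_left(self.cutoffs, task_size)
        return min(i, len(self.cutoffs) - 1)   # anything larger goes to the last host

if __name__ == "__main__":
    router = SizeIntervalRouter(cutoffs=[10_000, 100_000, float("inf")])  # bytes (assumed)
    for size in (512, 50_000, 5_000_000):
        print(f"task of {size} bytes -> host {router.assign(size)}")
```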
Abstract:
In this paper, we propose and evaluate an implementation of a prototype scalable web server. The prototype consists of a load-balanced cluster of hosts that collectively accept and service TCP connections. The host IP addresses are advertised using the Round Robin DNS technique, allowing any host to receive requests from any client. Once a client attempts to establish a TCP connection with one of the hosts, a decision is made as to whether or not the connection should be redirected to a different host---namely, the host with the lowest number of established connections. We use the low-overhead Distributed Packet Rewriting (DPR) technique to redirect TCP connections. In our prototype, each host keeps information about connections in hash tables and linked lists. Every time a packet arrives, it is examined to see if it has to be redirected or not. Load information is maintained using periodic broadcasts amongst the cluster hosts.
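The redirection decision itself can be sketched as below; the function and host names are assumptions for illustration and not the prototype's code. Each host consults a load table refreshed by the periodic broadcasts and redirects the new connection (via DPR) only if some other host currently has fewer established connections.

```python
# Illustrative sketch (assumed names): deciding whether a newly arrived TCP
# connection is served locally or redirected via DPR to the cluster host with
# the lowest number of established connections.

def choose_host(local_host, conn_counts):
    """conn_counts: {host_id: established connections}, refreshed by periodic broadcasts."""
    target = min(conn_counts, key=conn_counts.get)
    return None if target == local_host else target   # None means: serve locally

if __name__ == "__main__":
    counts = {"hostA": 42, "hostB": 17, "hostC": 23}   # assumed load table
    print(choose_host("hostA", counts))   # hostB -> redirect this connection
    print(choose_host("hostB", counts))   # None  -> hostB serves it locally
```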
Abstract:
When people work from home, the domains of home and work are co-located, often under one roof. Home-workers have to cope with the meeting of two practices that have traditionally been physically separated. In light of this, we need to understand: how do people who work from home negotiate the boundaries between their home and work practices? What kinds of boundaries do people construct? How do boundaries affect the relationship between home and work as domains? What kinds of boundaries are available to home-workers? Are home-workers in charge of their boundaries or do they co-create them with others? How does this position home-workers in their domains? In order to address these questions, I analysed a variety of data, including newspaper columns, online forum discussions, interviews, and personal diary entries, using a discourse analytic approach that lends itself to issues of positioning. Current literature clashes over whether home-workers are in control of their boundaries, and over the relationship between home and work that arises out of boundary negotiations, i.e. whether home and work are dichotomous or layered. I seek to contribute to boundary theory by adopting a practice theory stance (Wenger, 1998) to guide my analysis. By viewing home and work as practices, I show that boundary negotiations depend on how home-workers are positioned, e.g. if they are positioned as peripheral in a domain, they lack influence over boundaries. I demonstrate that home and work constitute a number of different practices, rather than a rigid dichotomy, and that the way home and work are related is not the same for all home-workers. The application of practice concepts further shows how relationships between practices are created. The contribution of this work is a reconceptualisation of current boundary theory away from individual and cognitive notions (Nippert-Eng, 1996) into the realm of positioning.
Abstract:
The fundamental aim of this thesis is to examine the effect of New Public Management (NPM) on the traditional roles of elected representatives, management and community activists in Irish local government. This will be achieved through a case study analysis of one local authority, Cork County Council. NPM promises greater democracy in decision-making. Therefore, one can hypothesise that the roles of the three key groupings identified will become more influenced by principles of participatory decision-making. Thus, a number of related questions will be addressed by this work, such as, have the local elected representatives been empowered by NPM? Has a managerial revolution taken place? Has local democracy been enhanced by more effective community participation? It will be seen in chapter 2 that these questions have not been adequately addressed to date in NPM literature. The three groups identified can be regarded as stakeholders although the researcher is cautious in using this term because of its value-laden nature. Essentially, in terms of Cork County Council, stakeholders can be defined as decision-makers and people within the organization and its environment who are interested in or could be affected directly or indirectly by organizational performance. This is an all-embracing definition and includes all citizens, residents, community groups and client organizations. It is in this context that the term 'stakeholder' should be understood when it is occasionally used in this thesis. In this case, the perceptions of elected councilors, management and community representatives with regard to their changing roles are as significant as the changes themselves. The chapter begins with a brief account of the background to this research. This is followed by an explanation of the methodology which is used and then concludes with short statements about the remaining chapters in the thesis.
Abstract:
Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) is developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programs directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation, which means that applying any operation to a random structure results in an output isomorphic to one or more random structures; this property is key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain specific scripting language based on randomness preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The notion of a labelled partial order (LPO) is the basic data type in the language. The programmer uses built-in MOQA operations together with restricted control flow statements to design MOQA programs. This MOQA language is formally specified both syntactically and semantically in this thesis. A practical language interpreter implementation is provided and discussed. By analysing new algorithms and data restructuring operations, we demonstrate the wide applicability of the MOQA approach. We also extend MOQA theory to a number of other domains besides average-case analysis. We show the strong connection between MOQA and parallel computing, reversible computing and data entropy analysis.
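For readers unfamiliar with the data type mentioned above, the following is a minimal sketch of a labelled partial order as a plain Python structure; the representation (nodes, labels, and an explicit order relation) is an assumption for illustration and is not the MOQA implementation.

```python
# Minimal sketch (assumed representation, not MOQA's): a labelled partial order (LPO)
# stores a label per node plus the order relations between nodes.

class LPO:
    def __init__(self):
        self.labels = {}   # node -> label
        self.above = {}    # node -> set of nodes that are strictly greater

    def add_node(self, node, label):
        self.labels[node] = label
        self.above.setdefault(node, set())

    def add_order(self, lower, upper):
        """Record the relation lower < upper."""
        self.above[lower].add(upper)

if __name__ == "__main__":
    p = LPO()
    for node, label in [("a", 3), ("b", 1), ("c", 2)]:
        p.add_node(node, label)
    p.add_order("b", "a")   # b < a
    p.add_order("c", "a")   # c < a
    print(p.labels)
    print(p.above)
```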
Abstract:
Case-Based Reasoning (CBR) uses past experiences to solve new problems. The quality of the past experiences, which are stored as cases in a case base, is a big factor in the performance of a CBR system. The system's competence may be improved by adding problems to the case base after they have been solved and their solutions verified to be correct. However, from time to time, the case base may have to be refined to reduce redundancy and to get rid of any noisy cases that may have been introduced. Many case base maintenance algorithms have been developed to delete noisy and redundant cases. However, different algorithms work well in different situations and it may be difficult for a knowledge engineer to know which one is the best to use for a particular case base. In this thesis, we investigate ways to combine algorithms to produce better deletion decisions than the decisions made by individual algorithms, and ways to choose which algorithm is best for a given case base at a given time. We analyse five of the most commonly-used maintenance algorithms in detail and show how the different algorithms perform better on different datasets. This motivates us to develop a new approach: maintenance by a committee of experts (MACE). MACE allows us to combine maintenance algorithms to produce a composite algorithm which exploits the merits of each of the algorithms that it contains. By combining different algorithms in different ways we can also define algorithms that have different trade-offs between accuracy and deletion. While MACE allows us to define an infinite number of new composite algorithms, we still face the problem of choosing which algorithm to use. To make this choice, we need to be able to identify properties of a case base that are predictive of which maintenance algorithm is best. We examine a number of measures of dataset complexity for this purpose. These provide a numerical way to describe a case base at a given time. We use the numerical description to develop a meta-case-based classification system. This system uses previous experience about which maintenance algorithm was best to use for other case bases to predict which algorithm to use for a new case base. Finally, we give the knowledge engineer more control over the deletion process by creating incremental versions of the maintenance algorithms. These incremental algorithms suggest one case at a time for deletion rather than a group of cases, which allows the knowledge engineer to decide whether or not each case in turn should be deleted or kept. We also develop incremental versions of the complexity measures, allowing us to create an incremental version of our meta-case-based classification system. Since the case base changes after each deletion, the best algorithm to use may also change. The incremental system allows us to choose which algorithm is the best to use at each point in the deletion process.
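As a hedged illustration of the committee idea (the voting rule and member interface below are assumptions, not necessarily how MACE combines its algorithms), the sketch deletes a case only if enough of the member maintenance algorithms vote for deletion.

```python
# Sketch of a committee-style combination of deletion decisions, in the spirit of MACE.
# The majority-vote rule and the member interface are illustrative assumptions.

def committee_delete(case, members, threshold=0.5):
    """members: callables returning True if they would delete `case`.
    Delete only if the fraction of deletion votes exceeds `threshold`."""
    votes = sum(1 for member in members if member(case))
    return votes / len(members) > threshold

if __name__ == "__main__":
    # Stand-in maintenance algorithms (assumed); real ones would inspect the whole case base.
    flags_noisy     = lambda case: case.get("misclassified", False)
    flags_redundant = lambda case: case.get("coverage", 1) == 0
    conservative    = lambda case: False
    case = {"misclassified": True, "coverage": 0}
    print(committee_delete(case, [flags_noisy, flags_redundant, conservative]))  # True (2/3 votes)
```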
Abstract:
This work considers the static calculation of a program’s average-case time. The number of systems that currently tackle this research problem is quite small due to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is individually discussed in this work, only one of them forms the basis of this research. That particular system is known as MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labeling distribution. This research develops and evaluates the MOQA language implementation, and adds to the functions already available in this language. Furthermore, the theory that backs MOQA is generalised and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Also, some of the MOQA applications and extensions suggested in other works are logically examined here. For example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses that take place during the course of this research reveal some of the MOQA strengths and weaknesses. This thesis aims to be pragmatic when evaluating the current MOQA theory, the advancements set forth in the following work and the benefits of MOQA when compared to similar systems. Succinctly, this work’s significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA’s accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
Abstract:
Three indicators of health and diet were selected to examine the health status in three socioeconomic groups in post-medieval Ireland. The aim was to examine the reliability of traditional skeletal markers of health in highly contextualised populations. The link between socio-economic status and health was examined to determine if traditional linking of poor health with poverty was evident in skeletal samples. The analysis indicated that this was indeed the case and that health was significantly compromised in populations of low socio-economic status. Thus it indicated that status intimately influences the physical body form. Sex was also found to be a major defining factor in the response of an individual to physiological stress. It was also evident that contemporary populations may suffer from different physiological stresses, and their responses to those stresses may differ. Adaptation was a key factor here. This has implications for studies of earlier populations that may lack detailed contextual data in terms of blanket applications of interpretations. The results also show a decline in health from the medieval through to the post-medieval period, which is intimately linked with the immense social changes and all the related effects of these. The socio-economic structure of post-medieval Ireland was a direct result of the British policies in Ireland. The physical form of the Irish may be seen to have occurred as a result of those policies, with the Irish poor in particular suffering substantial health problems, even in contrast to the poor of Britain. This study has enriched the recorded historical narrative of this period of the recent past, and highlights that more nuanced narratives may emerge from the osteoarchaeological analysis when sound contextual information is available. It also examines a period in Irish history that, until very recently, had been virtually untouched in terms of archaeological study.
Abstract:
Vietnam launched its first-ever stock market, the Ho Chi Minh City Securities Trading Center (HSTC), on July 20, 2000. This is one of the pioneering works on the HSTC, and it finds empirical evidence for the following: anomalies in HSTC stock returns in the form of clusters of limit-hits and limit-hit sequences; a strong herd effect toward extreme positive returns of the market portfolio; and an ARMA-GARCH specification that captures fairly well issues such as serial correlation and fat tails for the stabilized period. By using further information and policy dummy variables, it is shown that policy decisions on the technicalities of trading can have influential impacts on the level of risk, through the conditional variance behavior of HSTC stock returns. Policies on trading and disclosure practices have had profound impacts on the Vietnam Stock Market (VSM). The over-use of policy tools can harm the market and investor mentality. Price limits become increasingly irrelevant and prevent the market from self-adjusting to equilibrium. These results on the VSM have not been reported before in the literature on Vietnam’s financial markets. Given the policy implications, we suggest that the Vietnamese authorities re-think the use of price limits and give more freedom to market participants.
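A minimal sketch of the kind of specification described above, assuming synthetic data and the Python `arch` package rather than the study's actual series and estimation setup: an AR(1) mean with a policy dummy as an exogenous regressor and a GARCH(1,1) conditional variance with Student-t errors.

```python
# Illustrative sketch only: an AR(1)-GARCH(1,1) fit with a policy dummy regressor,
# in the spirit of the specification described above. The synthetic returns and
# the break point are placeholders, not HSTC data.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=1000)                       # fat-tailed placeholder returns (%)
policy_dummy = (np.arange(1000) >= 500).astype(float)[:, None]  # assumed policy change at t = 500

model = arch_model(returns, x=policy_dummy, mean="ARX", lags=1,
                   vol="GARCH", p=1, q=1, dist="t")
result = model.fit(disp="off")
print(result.summary())
```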