Abstract:
This study develops the theme of violence against the popular movement in Galilee, according to the text of Luke 13:1-5. This text has no parallel in the other two Synoptic sources, nor in John, nor in Thomas, nor in the Galilean group that wrote the Q source; as for the historical events the text narrates, there is no reference to them in Flavius Josephus or in other historians of the period. This means that these verses are Luke's own source, an autonomous source, called by some the L source (or S source). The approach to this Lucan text taken by a large part of researchers in the biblical field is concerned with the themes of sin and repentance, leaving at the margins the situation of the victims and Jesus' warnings to his listeners. In this sense, this passage of Luke is of great importance. These verses express a socio-political reality. Their content is a sign of conflict and of denunciation of the Roman imperial system, one that did not go unnoticed by the redactor of the text or by his audience. It is, therefore, the memory of the victims of oppression. The research is presented below in three briefly outlined chapters. The first describes the specific conduct of the Roman procurators or governors in the provinces under their command, together with the people's reaction and protests. Our emphasis falls on the Roman procurator Pontius Pilate. We draw on biblical, extra-biblical and pseudepigraphal sources. Finally, we highlight the relevance and central role of the text of Luke 13:1-5. The second chapter centres on the exegesis of Luke 13:1-5, relating it to its wider context, in our case the so-called travel itinerary to Jerusalem, and to its immediate context, chapter 13 of Luke. At the end, we ask which group or groups may lie behind these verses, and consider the importance of the L source as a primary source inserted into the Gospel of Luke.
The text of Luke 13:1-5 appears as an autonomous text, a memory of the victims; it contrasts with the moderate view of the Passion accounts in the Synoptics in the face of a reality of oppression. The third chapter is an essay articulating the two preceding chapters with present-day reality, specifically the situation of war, violence and death in Colombia, alongside current efforts to reconstruct the memory of the victims among the Colombian people, a memory that gives meaning to and dignifies the offering of their lives. (AU)
Abstract:
The general objective of this research was to describe and analyse the concepts of redemption, freedom and service in the writings of Ellen Gould White (1827-1915), one of the founders of the Seventh-day Adventist Church, reflecting on the implications for educational praxis. Four specific objectives were proposed, each of which became one of the chapters of this thesis. These chapters were guided by three hypotheses. The first suggested that Ellen White displays characteristics that configure her as a liminal thinker, one who lived on the border. Using theories of Walter Mignolo, Boaventura de Souza Santos and Gloria Anzaldúa, it was observed that traces of liminality are present in Ellen White, manifested in a different life and discourses which at times contradicted the thinking of the dominant majority, but which proved positive for the denomination. The second hypothesis of this work concerned the existence of three possible axes that guide the Whitean notion of education: redemption, freedom and service. What had been merely a conjecture was confirmed by the reading and analysis of White's fundamental works. Thus, a full understanding of the Whitean notion of education requires an articulated reflection on these three themes. That is, a separate reading of them, or even a juxtaposed approach, is not sufficient; articulation is necessary. Finally, the third hypothesis stated that if the theme of freedom is convincingly present in Ellen White's writings, then one may conclude that her educational philosophy, and consequently her pedagogical praxis, is directly conditioned by this approach.
What can be said at the conclusion of this work is that, in the Whitean writings, freedom constitutes a kind of golden thread running through the various subjects and hopes addressed by the author, beginning with the narrative of the Fall, which implies the loss of freedom, and ending with the hopeful narrative of a perfect world where responsible freedom will finally be recovered. Consequently, the Whitean notion of education is driven by the theme of human freedom: freedom to think, to develop, to learn, to create, to serve, to let oneself be redeemed. (AU)
Abstract:
A pioneering study that aims to determine the image of Getúlio Vargas constructed through cinema, more specifically through newsreels, examining how this constructed image was used in the 1950 electoral period, taking into account actions of political, ideological and electoral propaganda. Our research object is the presence of Getúlio Vargas in the newsreels broadcast during the 1950 presidential campaign, analysed on the basis of qualitative content analysis. We also work methodologically with documentary and historical research, since we address Vargas's government, his suicide and, subsequently, the history of the presidential candidates who succeeded him in power; for this reason, available documents from that period were gathered to support the work. We conclude that although Vargas's electoral, political and ideological campaign was meticulously structured, achieving the expected result at the polls, the intense opposition of the parties and the press resulted in a tragic end that marked the history of Brazilian politics. (AU)
Abstract:
Liver fibrosis and its end-stage disease, cirrhosis, are a major cause of mortality and morbidity worldwide. Thus far, there is no efficient pharmaceutical intervention for the treatment of liver fibrosis. Liver fibrosis is characterized by excessive accumulation of extracellular matrix (ECM) proteins. Transglutaminase (TG)-mediated covalent cross-linking has been implicated in the stabilization and accumulation of ECM in a number of fibrotic diseases. Thus, the use of tissue TG2 inhibitors has potential in the treatment of liver fibrosis. Recently, we introduced a novel group of site-directed irreversible specific inhibitors of TGs. Here, we describe the development of a liposome-based drug-delivery system for the site-specific delivery of these TG inhibitors into the liver. By using anionic or neutral DSPC-based liposomes, the TG inhibitor can be successfully incorporated into these liposomes and delivered specifically to the liver. Liposomes can therefore be used as a potential carrier system for site-specific delivery of TG2 inhibitors into the liver, opening up a potential new avenue for the treatment of liver fibrosis and its end-stage disease, cirrhosis.
Abstract:
This special issue of the Journal of the Operational Research Society is dedicated to papers on the related subjects of knowledge management and intellectual capital. These subjects continue to generate considerable interest amongst both practitioners and academics. This issue demonstrates that operational researchers have many contributions to offer to the area, especially by bringing multi-disciplinary, integrated and holistic perspectives. The papers included are both theoretical as well as practical, and include a number of case studies showing how knowledge management has been implemented in practice, which may assist other organisations in their search for a better means of managing what is now recognised as a core organisational activity. It has been accepted by a growing number of organisations that the precise handling of information and knowledge is a significant factor in facilitating their success, but that there is a challenge in how to implement a strategy and processes for this handling. It is here, in the particular area of knowledge process handling, that we can see the contributions of operational researchers most clearly, as is illustrated in the papers included in this journal edition. The issue comprises nine papers, contributed by authors based in eight different countries on five continents. Lind and Seigerroth describe an approach that they call team-based reconstruction, intended to help articulate knowledge in a particular organisational context. They illustrate the use of this approach with three case studies, two in manufacturing and one in public sector health care. Different ways of carrying out reconstruction are analysed, and the benefits of team-based reconstruction are established. Edwards and Kidd, and Connell, Powell and Klein both concentrate on knowledge transfer.
Edwards and Kidd discuss the issues involved in transferring knowledge across frontiers of various kinds, from those borders within organisations to those between countries. They present two examples, one in distribution and the other in manufacturing. They conclude that trust and culture both play an important part in facilitating such transfers, that IT should be kept in a supporting role in knowledge management projects, and that a staged approach to this IT support may be the most effective. Connell, Powell and Klein consider the oft-quoted distinction between explicit and tacit knowledge, and argue that such a distinction is sometimes unhelpful. They suggest that knowledge should rather be regarded as a holistic systemic property. The consequences of this for knowledge transfer are examined, with a particular emphasis on what this might mean for the practice of OR. Their view of OR in the context of knowledge management very much echoes Lind and Seigerroth's focus on knowledge for human action. This is an interesting convergence of views given that, broadly speaking, one set of authors comes from within the OR community, and the other from outside it. Hafeez and Abdelmeguid present the nearest to a 'hard' OR contribution of the papers in this special issue. In their paper they construct and use system dynamics models to investigate alternative ways in which an organisation might close a knowledge gap or skills gap. The methods they use have the potential to be generalised to any other quantifiable aspects of intellectual capital. The contribution by Revilla, Sarkis and Modrego is also at the 'hard' end of the spectrum. They evaluate the performance of public–private research collaborations in Spain, using an approach based on data envelopment analysis. They found that larger organisations tended to perform relatively better than smaller ones, even though the approach used takes into account scale effects.
Perhaps more interesting was that many factors that might have been thought relevant, such as the organisation's existing knowledge base or how widely applicable the results of the project would be, had no significant effect on the performance. It may be that how well the partnership between the collaborators works (not a factor it was possible to take into account in this study) is more important than most other factors. Mak and Ramaprasad introduce the concept of a knowledge supply network. This builds on existing ideas of supply chain management, but also integrates the design chain and the marketing chain, to address all the intellectual property connected with the network as a whole. The authors regard the knowledge supply network as the natural focus for considering knowledge management issues. They propose seven criteria for evaluating knowledge supply network architecture, and illustrate their argument with an example from the electronics industry—integrated circuit design and fabrication. In the paper by Hasan and Crawford, their interest lies in the holistic approach to knowledge management. They demonstrate their argument—that there is no simple IT solution for organisational knowledge management efforts—through two case study investigations. These case studies, in Australian universities, are investigated through cultural historical activity theory, which focuses the study on the activities that are carried out by people in support of their interpretations of their role, the opportunities available and the organisation's purpose. Human activities, it is argued, are mediated by the available tools, including IT and IS and in this particular context, KMS. It is this argument that places the available technology into the knowledge activity process and permits the future design of KMS to be improved through the lessons learnt by studying these knowledge activity systems in practice. 
Wijnhoven concentrates on knowledge management at the operational level of the organisation. He is concerned with studying the transformation of certain inputs to outputs—the operations function—and the consequent realisation of organisational goals via the management of these operations. He argues that the inputs and outputs of this process in the context of knowledge management are different types of knowledge and names the operation method the knowledge logistics. The method of transformation he calls learning. This theoretical paper discusses the operational management of four types of knowledge objects—explicit understanding; information; skills; and norms and values; and shows how through the proposed framework learning can transfer these objects to clients in a logistical process without a major transformation in content. Millie Kwan continues this theme with a paper about process-oriented knowledge management. In her case study she discusses an implementation of knowledge management where the knowledge is centred around an organisational process and the mission, rationale and objectives of the process define the scope of the project. In her case they are concerned with the effective use of real estate (property and buildings) within a Fortune 100 company. In order to manage the knowledge about this property and the process by which the best 'deal' for internal customers and the overall company was reached, a KMS was devised. She argues that process knowledge is a source of core competence and thus needs to be strategically managed. Finally, you may also wish to read a related paper originally submitted for this Special Issue, 'Customer knowledge management' by Garcia-Murillo and Annabi, which was published in the August 2002 issue of the Journal of the Operational Research Society, 53(8), 875–884.
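The system dynamics contribution by Hafeez and Abdelmeguid, mentioned above, lends itself to a small illustration. The sketch below is a minimal single-stock model of closing a skills gap: hiring/training inflow is driven by the current gap, while attrition drains the stock. The structure and all parameter values (`hire_fraction`, `attrition_rate`, and so on) are illustrative assumptions, not the authors' actual model.

```python
# Minimal stock-and-flow sketch of closing a skills gap, in the spirit of
# system dynamics modelling. All parameters are hypothetical.

def simulate_skills_gap(required=100.0, initial_skills=40.0,
                        hire_fraction=0.3, attrition_rate=0.05,
                        dt=0.25, horizon=20.0):
    """Euler-integrate a single 'skills' stock toward a required level.

    The inflow (hiring/training) is a fraction of the current gap;
    the outflow (attrition) is proportional to the stock itself.
    """
    skills = initial_skills
    trajectory = [skills]
    steps = int(horizon / dt)
    for _ in range(steps):
        gap = required - skills
        inflow = hire_fraction * gap          # recruit/train toward the gap
        outflow = attrition_rate * skills     # staff turnover
        skills += (inflow - outflow) * dt
        trajectory.append(skills)
    return trajectory

traj = simulate_skills_gap()
# The gap shrinks toward the steady state at which
# hire_fraction * gap == attrition_rate * skills.
```

Even this toy version shows the characteristic system dynamics result: the gap never closes completely while attrition is non-zero, settling instead at an equilibrium determined by the ratio of the two rates.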
Abstract:
The evaluation of industrial policy interventions has attracted increasing policy and academic attention in recent years. Despite the widespread consensus regarding the need for evaluation, the issue of how to evaluate, and the associated methodological considerations, continue to be matters of considerable debate. The authors develop an approach to estimate the net additionality of financial assistance from Enterprise Ireland to indigenously owned firms in Ireland for the period 2000 to 2002. With a sample of Enterprise Ireland assisted firms, an innovative self-assessment, in-depth, face-to-face interview methodology was adopted. The authors also explore a way of incorporating the indirect benefits of assistance into the derived deadweight estimate, an issue which is seldom discussed in the context of deadweight estimates. They conclude by reflecting on the key methodological lessons learned from the evaluation process, and highlight some pertinent evaluation issues which should form the focus of future discussion in this field of research.
Abstract:
If product cycle time reduction is the mission, and the multifunctional team is the means of achieving the mission, what then is the modus operandi by which this means is to accomplish its mission? This paper asserts that a preferred modus operandi for the multifunctional team is to adopt a process-oriented view of the manufacturing enterprise, and for this it needs the medium of a process map [16]. The substance of this paper is a methodology which enables the creation of such maps. Specific examples of process models drawn from the product development life cycle are presented and described in order to support the methodology's integrity and value. The specific deliverables we have so far obtained are: a methodology for process capture and analysis; a collection of process models spanning the product development cycle; and an engineering handbook which hosts these models and presents a computer-based means of navigating through these processes, in order to allow users a better understanding of the nature of the business, their role in it, and why the job that they do benefits the work of the company. We assert that this kind of thinking is the essence of concurrent engineering implementation, and further that the systemigram process models uniquely stimulate and organise such thinking.
Abstract:
For many decades, the Kingdom of Saudi Arabia has been widely known for being a reliable oil exporter. This fact, however, has not exempted it from facing significant domestic energy challenges. One of the most pressing of these challenges involves bridging the widening electricity supply-demand gap where, currently, the demand is growing at a very fast rate. One crucial means to address this challenge is through delivering power supply projects with maximum efficiency. Project delivery delay, however, is not uncommon in this highly capital-intensive industry, indicating electricity supplies are not coping with the demand increases. To provide a deeper insight into the challenges of project implementation and efficient practice, this research adopts a pragmatic approach by triangulating literature, questionnaires and semi-structured interviews. The research was conducted in the Saudi Arabian power supply industry – Western Operating Area. A total of 105 usable questionnaires were collected, and 28 recorded semi-structured interviews were conducted, analysed and synthesised to produce a conceptual model of what constitutes the project implementation challenges in the investigated industry. This was achieved by conducting a comprehensive ranking analysis applied to all 58 identified and surveyed factors which, according to project practitioners in the investigated industry, contribute to project delay. Twenty-eight of these project delay factors were selected as the "most important" ones.
Factor Analysis was employed to structure these 28 most important project delay factors into the following meaningful set of 7 project implementation challenges: Saudi Electricity Company's contractual commitments, Saudi Electricity Company's communication and coordination effectiveness, contractors' project planning and project control effectiveness, consultant-related aspects, manpower challenges and material uncertainties, Saudi Electricity Company's tendering system, and lack of project requirements clarity. The study has implications for industry policy in that it provides a coherent assessment of the key project stakeholders' central problems. From this analysis, pragmatic recommendations are proposed that, if enacted, will minimise the significance of the identified problems on future project outcomes, thus helping to ensure the electricity supply-demand gap is diminished.
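The grouping step described above can be illustrated with a hedged sketch: synthetic survey ratings are generated, latent factors are extracted from the correlation matrix, and each delay factor is assigned to the latent "challenge" on which it loads most heavily. The data, the small dimensions, and the principal-axis-style extraction without rotation are all simplifications; the study itself used 105 questionnaires, 28 factors and a 7-factor solution, and factor analysis in practice typically applies a rotation such as varimax.

```python
import numpy as np

# Synthetic responses: 105 respondents rate 7 delay factors driven by 2
# latent challenges (factors 0-3 share one cause, factors 4-6 the other).
rng = np.random.default_rng(0)
latent = rng.normal(size=(105, 2))
loadings_true = np.zeros((7, 2))
loadings_true[:4, 0] = 0.9
loadings_true[4:, 1] = 0.9
ratings = latent @ loadings_true.T + 0.3 * rng.normal(size=(105, 7))

# Principal-axis-style extraction: eigendecompose the correlation matrix
# and keep the leading components as factor loadings.
corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)          # ascending eigenvalues
order = np.argsort(eigvals)[::-1][:2]            # two retained factors
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

# Assign each delay factor to its dominant latent challenge.
assignment = np.argmax(np.abs(loadings), axis=1)
```

With the block structure above, the two groups of delay factors recover their two underlying challenges; the study's analysis is the same idea at the scale of 28 factors and 7 challenges.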
Abstract:
The purpose of this study is to develop econometric models to better understand the economic factors affecting inbound tourist flows from each of six origin countries that contribute to Hong Kong’s international tourism demand. To this end, we test alternative cointegration and error correction approaches to examine the economic determinants of tourist flows to Hong Kong, and to produce accurate econometric forecasts of inbound tourism demand. Our empirical findings show that permanent income is the most significant determinant of tourism demand in all models. The variables of own price, weighted substitute prices, trade volume, the share price index (as an indicator of changes in wealth in origin countries), and a dummy variable representing the Beijing incident (1989) are also found to be important determinants for some origin countries. The average long-run income and own price elasticities were measured at 2.66 and -1.02, respectively. It was hypothesised that permanent income is a better explanatory variable of long-haul tourism demand than current income. A novel approach (a grid search process) has been used to empirically derive the weights to be attached to the lagged income variable for estimating permanent income. The results indicate that permanent income, estimated with empirically determined, relatively small weighting factors, was capable of producing better results than the current income variable in explaining long-haul tourism demand. This finding suggests that the use of current income in previous empirical tourism demand studies may have produced inaccurate results. The share price index, as a measure of wealth, was also found to be significant in two models. Studies of tourism demand rarely include wealth as an explanatory variable when forecasting long-haul tourism demand. However, finding a satisfactory proxy for wealth common to different countries is problematic.
This study indicates that ECMs (error correction models) based on the Engle-Granger (1987) approach produce more accurate forecasts than ECMs based on the Pesaran and Shin (1998) and Johansen (1988, 1991, 1995) approaches for all of the long-haul markets and Japan. Overall, ECMs produce better forecasts than the OLS, ARIMA and naïve models, indicating the superiority of the cointegration approach for tourism demand forecasting. The results show that permanent income is the most important explanatory variable for tourism demand from all countries, but there are substantial variations between countries, with the long-run elasticity ranging between 1.1 for the U.S. and 5.3 for the U.K. Price is the next most important variable, with long-run elasticities ranging between -0.8 for Japan and -1.3 for Germany, and short-run elasticities ranging between -0.14 for Germany and -0.7 for Taiwan. The fastest growing market is Mainland China. The findings have implications for policies and strategies on investment, marketing promotion and pricing.
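The two-step Engle-Granger procedure that performed best here can be sketched on synthetic data: first estimate the long-run cointegrating regression in levels, then regress the differenced series on the lagged equilibrium error to obtain the error-correction (adjustment) term. The variable names and the data-generating process below are illustrative, not the study's actual series.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
income = np.cumsum(rng.normal(size=n))                   # I(1) driver, e.g. log income
arrivals = 2.0 * income + rng.normal(scale=0.5, size=n)  # cointegrated with it

# Step 1: long-run regression  arrivals_t = a + b * income_t + u_t
X = np.column_stack([np.ones(n), income])
beta_lr, *_ = np.linalg.lstsq(X, arrivals, rcond=None)
resid = arrivals - X @ beta_lr                           # equilibrium error u_t

# Step 2: ECM  d(arrivals)_t = c + g * d(income)_t + lam * u_{t-1} + e_t
dy = np.diff(arrivals)
dx = np.diff(income)
Z = np.column_stack([np.ones(n - 1), dx, resid[:-1]])
beta_ecm, *_ = np.linalg.lstsq(Z, dy, rcond=None)
adjustment_speed = beta_ecm[2]   # lam should be negative: deviations correct
```

A negative `adjustment_speed` is the signature of cointegration: deviations from the long-run relationship are pulled back toward equilibrium, which is what gives ECM forecasts their edge over models estimated purely in levels or differences.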
Abstract:
This thesis is focussed on the role differentiation hypothesis as it relates to small groups (Bales, 1958). The hypothesis is systematically examined, both conceptually and empirically, in the light of the Equilibrium Hypothesis (Bales, 1953) and the Negotiated Order Theory of leadership (e.g. Hosking, 1988). Chapter 1 sketches in a context for the research, which was stimulated by attempts during the 60s and 70s to organise small groups without leaders (the leaderless group, based on isocratic principles). Chapter 2 gives a conceptual and developmental overview of Bales' work, concentrating on the Equilibrium Hypothesis. It is argued that Bales' conceptual approach, if developed, can potentially integrate the disparate small groups and leadership literatures. Chapters 3 and 4 examine the concepts `group', `leader' and `leadership' in terms of the Negotiated Order perspective. In chapter 3 it is argued that two aspects of the concept `group' need to be taken separately into account: physical attributes and social psychological aspects (the metaphysical glue). It is further argued that a collection of people becomes a group only when they begin to establish a shared sense of social order. In chapter 4 it is argued that leadership is best viewed as a process of negotiation between those who influence and those who are influenced, in the context of shared values about means and ends. It is further argued that leadership is the process by which a shared sense of social order is established and maintained, thus linking the concepts `leadership' and `group' in a single formulation. The correspondences with Bales' approach are discussed at the end of the chapter. Chapters 5 to 8 present a detailed critical description and evaluation of the empirical work which claims to show role differentiation or test the hypothesis, both Bales' original work and subsequent studies.
It is argued here that the measurement and analytical procedures adopted by Bales and others, in particular the use of simple means as summaries of group structures, are fundamentally flawed, and that role differentiation in relation to particular identifiable groups has not been demonstrated clearly anywhere in the literature. Chapters 9 to 13 present the empirical work conducted for the thesis. Eighteen small groups are examined systematically for evidence of role differentiation using an approach based on early sociometry (Moreno, 1934). The results suggest that role differentiation, as described by Bales, does not occur as often as is implied in the literature, and not unequivocally in any case. In particular, structures derived from Liking are typically distributed or weak. This suggests that one of Bales' principal findings, that Liking varies independently of his other main dimensions, is the product of statistical artifact. Chapter 14 presents a general summary of results and offers some considerations about future research.
Abstract:
The present thesis investigates the targeted (local and systemic) delivery of a novel group of inhibitors of the transglutaminase enzymes (TGs). TGs are a widely distributed group of enzymes that catalyse the formation of isopeptide bonds between the γ-carboxamide group of protein-bound glutamines and the α-amino group of protein-bound lysines or polyamines. The first group of the novel inhibitors tested were the fluorescently labelled inhibitors of Factor XIIIa (FXIIIa). These small, non-toxic inhibitors have the potential to prevent stabilisation of thrombi by FXIIIa and consequently increase the natural rate of thrombolysis; in addition, they reduce staphylococcal colonisation of catheters by inhibiting the bacteria's FXIIIa-mediated cross-linking to blood clot proteins on the central venous catheter (CVC) surface. The aim of this work was to incorporate the FXIIIa inhibitor either within the coating of polyurethane (PU) catheters or to integrate it into silicone catheters, so as to reduce the incidence of thrombotic occlusion and associated bacterial infection in CVCs. The initial work focused on the incorporation of FXIIIa inhibitors within polymeric coatings of PU catheters. After defining the key characteristics desired for an effective polymeric coating, polyvinylpyrrolidone (PVP), poly(lactic-co-glycolic acid) (PLGA) or their combination were studied as polymers of choice for coating the catheters. The coating was applied by a dip-coating method in a polymer solution containing the inhibitor. Upon incubation of the inhibitor- and polymer-coated strips in buffer, PVP dissolved instantly, generating fast and significant drug release, whilst PLGA did not dissolve, yielding slow and insufficient drug release. Nevertheless, the drug release profile was enhanced upon employing a blend solution of PVP and PLGA.
The second part of the study was to incorporate the FXIIIa inhibitor into a silicone elastomer; results demonstrated that the FXIIIa inhibitor can be incorporated into and released from silicone by using citric acid (CA) and sodium bicarbonate (SB) as additives, and that the drug release rate can be controlled by the amount of additives incorporated in the silicone matrix. Furthermore, the inhibitor remained biologically active after being released from the silicone elastomer strips. Morphological analysis confirmed the formation of channels and cracks inside the specimens upon the addition of CA and SB. Nevertheless, the tensile strength, as well as the Young's modulus, of the silicone elastomer strips decreased steadily with increasing amounts of incorporated CA/SB in the formulations. According to our results, incorporation of the FXIIIa inhibitor into catheters and other medical implant devices could offer new perspectives in preventing biomaterial-associated infections and thrombosis. The use of a tissue transglutaminase (TG2) inhibitor for treating liver fibrosis was also investigated. Liver fibrosis is characterized by increased synthesis and decreased degradation of the extracellular matrix (ECM). Transglutaminase-mediated covalent cross-linking is involved in the stabilization of the ECM in human liver fibrosis. Thus, TG2 inhibitors may be used to counteract the decreased degradation of the ECM. The potential of a liposome-based drug delivery system for site-specific delivery of the fluorescent TG2 inhibitor into the liver was investigated; results indicated that the TG2 inhibitor can be successfully integrated into liposomes and delivered to the liver, demonstrating that liposomes can be employed for site-specific delivery of TG2 inhibitors into the liver, and that TG2 inhibitor-incorporating liposomes could offer a new approach to treating liver fibrosis and its end-stage disease, cirrhosis.
Abstract:
The thesis presents an account of an attempt to utilize expert systems within the domain of production planning and control. The use of expert systems was proposed due to the problematical nature of a particular function within British Steel Strip Products' Operations Department: the function of Order Allocation, allocating customer orders to a production week and site. Approaches to tackling problems within production planning and control are reviewed, as are the general capabilities of expert systems. The conclusions drawn are that the domain of production planning and control contains both `soft' and `hard' problems, and that while expert systems appear to be a useful technology for this domain, this usefulness has by no means yet been demonstrated. Also, it is argued that the mainstream methodology for developing expert systems is unsuited to the domain. A problem-driven approach is developed and used to tackle the Order Allocation function. The resulting system, UAAMS, contained two expert components. One of these, the scheduling procedure, was not fully implemented due to inadequate software. The second expert component, the product routing procedure, was untroubled by such difficulties, though it was unusable on its own; thus a second system was developed. This system, MICRO-X10, duplicated the function of X10, a complex database query routine used daily by Order Allocation. A prototype version of MICRO-X10 proved too slow to be useful but allowed implementation and maintenance issues to be analysed. In conclusion, the usefulness of the problem-driven approach to expert systems development within production planning and control is demonstrated, but restrictions imposed by current expert system software are highlighted: the limited ability of such software to cope with `hard' scheduling constructs, together with its slow processing speeds, can restrict the current usefulness of expert systems within production planning and control.
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. They consist of tiny devices with limited processing and power capabilities, called motes, which can be deployed in large numbers to provide useful sensing capabilities. Even though they are flexible and easy to deploy, a number of considerations concerning their fault tolerance, energy conservation and re-programmability need to be addressed before we draw any substantial conclusions about the effectiveness of this technology. In order to overcome their limitations, we propose a middleware solution. The proposed scheme is based on two main methods. The first method involves the creation of a flexible communication protocol based on technologies such as Mobile Code/Agents and Linda-like tuple spaces. In this way, every node of the wireless sensor network will produce and process data based on what is best both for itself and for the group to which it belongs. The second method incorporates the above protocol in a middleware that aims to bridge the gap between the application layer and low-level constructs such as the physical layer of the wireless sensor network. A fault-tolerant platform for deploying and monitoring applications in real time offers a number of possibilities to end users, giving them, in parallel, the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through a number of trials aiming to test its merits under real-time conditions and to identify its effectiveness against other similar approaches. Finally, parameters which determine the characteristics of the proposed scheme are also examined.
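The Linda-like tuple spaces on which the proposed protocol is built can be illustrated with a toy example: processes coordinate by depositing tuples into a shared space and matching them with wildcard templates. This is a generic sketch of the Linda coordination model, not the authors' implementation; the class and method names are hypothetical.

```python
# Toy Linda-like tuple space: motes coordinate by writing tuples ("out")
# and matching them with templates ("rd" reads, "take" removes).

class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, tup):
        """Deposit a tuple into the shared space."""
        self._tuples.append(tuple(tup))

    def _match(self, template, tup):
        # None acts as a wildcard field in the template.
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def rd(self, template):
        """Non-destructively read the first tuple matching the template."""
        for tup in self._tuples:
            if self._match(template, tup):
                return tup
        return None

    def take(self, template):
        """Destructively remove and return a matching tuple (Linda's 'in')."""
        tup = self.rd(template)
        if tup is not None:
            self._tuples.remove(tup)
        return tup

space = TupleSpace()
space.out(("temperature", "mote-7", 21.5))        # a mote publishes a reading
reading = space.take(("temperature", None, None))  # a consumer claims it
```

The appeal for sensor networks is the decoupling this gives: producers and consumers of data need not know each other's identities or be active at the same time, which suits motes that sleep aggressively to conserve energy.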
Abstract:
To ensure state synchronization of signaling operations, many signaling protocol designs choose to establish “soft” state that expires if it is not refreshed. The approaches to refreshing state in a multi-hop signaling system can be classified as either end-to-end (E2E) or hop-by-hop (HbH). Although both state refresh approaches have been widely used in practical signaling protocols, the design tradeoffs between state synchronization and signaling cost have not yet been fully investigated. In this paper, we investigate this issue from the perspectives of state refresh and state removal. We propose simple but effective Markov chain models for both approaches and obtain closed-form solutions which depict the state refresh performance in terms of state consistency and refresh message rate, as well as the state removal performance in terms of state removal delay. Simulations verify the analytical models. It is observed that the HbH approach yields much better state synchronization, at the cost of higher signaling overhead, than the E2E approach. While the state refresh performance can be improved by increasing the values of the state refresh and timeout timers, the state removal delay increases significantly for both the E2E and HbH approaches. The analysis here sheds light on the design of signaling protocols and on the configuration of the timers to adapt to changing network conditions.
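The refresh/timeout trade-off analysed in the paper can be illustrated with a toy discrete-time simulation of a single soft-state entry on a lossy link. The paper itself derives closed-form Markov chain results; this sketch, with illustrative loss probability and timer values, only reproduces the qualitative behaviour of the state-consistency and message-rate metrics.

```python
import random

def simulate_soft_state(refresh_interval, timeout, loss_prob,
                        slots=100_000, seed=42):
    """Return (state consistency, refresh messages per slot).

    The sender emits a refresh every `refresh_interval` slots; each refresh
    is lost independently with probability `loss_prob`; the receiver holds
    the state for `timeout` slots after the last refresh it received.
    """
    rng = random.Random(seed)
    since_last_received = timeout          # state initially absent
    installed_slots = 0
    messages = 0
    for slot in range(slots):
        if slot % refresh_interval == 0:   # sender emits a refresh
            messages += 1
            if rng.random() > loss_prob:   # refresh survives the link
                since_last_received = 0
        if since_last_received < timeout:  # state still considered fresh
            installed_slots += 1
        since_last_received += 1
    return installed_slots / slots, messages / slots

consistency, msg_rate = simulate_soft_state(
    refresh_interval=10, timeout=30, loss_prob=0.2)
```

With the timeout set to three refresh intervals, the state only expires after three consecutive refresh losses, so consistency stays high at a modest message rate; shrinking either timer trades one metric against the other, which is the design tension the paper's models quantify.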
Abstract:
Decentralised supply chain formation involves determining the set of producers within a network able to supply goods to one or more consumers at the lowest cost. This problem is frequently tackled using auctions and negotiations. In this paper we show how it can be cast as an optimisation of a pairwise cost function. Optimising this class of functions is NP-hard, but good approximations to the global minimum can be obtained using Loopy Belief Propagation (LBP). Here we detail a LBP-based approach to the supply chain formation problem, involving decentralised message-passing between potential participants. Our approach is evaluated against a well-known double-auction method and an optimal centralised technique, showing several improvements: it obtains better solutions for most networks that admit a competitive equilibrium (competitive equilibrium, as defined in [3], is used as a means of classifying results on certain networks, to allow for minor inefficiencies in the auction protocol and agent bidding strategies), while also solving problems where no competitive equilibrium exists, for which the double-auction method frequently produces inefficient solutions.
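The kind of pairwise cost function LBP optimises can be illustrated with a minimal min-sum message-passing sketch. The costs below are synthetic, and the graph is a three-node chain: on a tree-shaped graph like this the messages converge in one forward-backward sweep and the result is exact, whereas on loopy graphs (as in the paper) the identical updates are simply iterated until approximate convergence.

```python
import itertools
import numpy as np

states = 2  # each participant chooses one of two options

# Node (unary) costs and a shared pairwise cost on edges (0,1) and (1,2):
# agreeing neighbours are cheap, disagreeing ones pay a penalty of 1.
unary = [np.array([0.0, 1.5]),   # node 0 prefers state 0
         np.array([0.4, 0.6]),
         np.array([2.0, 0.0])]   # node 2 prefers state 1
pair = np.array([[0.0, 1.0],
                 [1.0, 0.0]])

# Forward and backward min-sum messages along the chain 0 - 1 - 2.
m01 = np.array([min(unary[0][s] + pair[s, t] for s in range(states))
                for t in range(states)])
m12 = np.array([min(unary[1][s] + m01[s] + pair[s, t] for s in range(states))
                for t in range(states)])
m21 = np.array([min(unary[2][s] + pair[s, t] for s in range(states))
                for t in range(states)])
m10 = np.array([min(unary[1][s] + m21[s] + pair[s, t] for s in range(states))
                for t in range(states)])

# Min-marginals: each node picks its best state from local cost + messages.
b0 = unary[0] + m10
b1 = unary[1] + m01 + m21
b2 = unary[2] + m12
assignment = [int(np.argmin(b)) for b in (b0, b1, b2)]

# Brute-force check of the global minimum of the pairwise cost function.
def total_cost(x):
    return (unary[0][x[0]] + unary[1][x[1]] + unary[2][x[2]]
            + pair[x[0], x[1]] + pair[x[1], x[2]])

best = min(itertools.product(range(states), repeat=3), key=total_cost)
```

Here the message-passing assignment matches the brute-force optimum. The decentralisation claim in the paper rests on exactly this structure: each message depends only on a node's local costs and incoming messages, so no participant needs a global view of the network.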