815 results for multi-concern autonomic management


Relevance:

30.00%

Publisher:

Abstract:

One of the pioneer firms in the leisure cruise industry embarked on a bold idea in 2000 to offer an unregimented experience unlike most cruises. Despite the appeal of the concept from a marketing perspective, the service innovation posed operational challenges, many of which continue to undermine the firm’s competitive position. Using a multi-method empirical approach and interdisciplinary views that draw on research from marketing and operations management, the authors analyze this business case to identify challenges that service firms face when services are developed and managed from siloed functional perspectives. Based on their research findings and guided by the literature, the authors derive a service-systems model to aid service planning and management. The authors further highlight a new organizational form and function for services under the domain of service experience management that is positioned as a means to unify service operations and marketing for delivering on service promises. The authors offer direction for further research on service operations systems and service experience management.

Relevance:

30.00%

Publisher:

Abstract:

This multi-perspectival Interpretive Phenomenological Analysis (IPA) study explored how people in the ‘networks of concern’ around four children with severe learning disabilities tried to make sense of the children’s challenging behaviours. The study also aimed to explore what affected relationships between people. It focussed on the four children through interviews with their mothers, their teachers and the CAMHS Learning Disability team members who were working with them; two fathers also joined part of the interviews. All interviews were conducted separately using a semi-structured approach. IPA allowed both a consideration of the participants’ lived experiences and ‘objects of concern’ and a deconstruction of the multiple contexts of people’s lives, with a particular focus on disability. The analysis rendered five themes: the importance of love and affection; the difficulties, and the differences, of living with a challenging child; the importance of being able to make sense of the challenges; and the value of good relationships between people. Findings were interpreted through the lens of Coordinated Management of Meaning (CMM), which facilitated a systemic deconstruction and reconstruction of the findings. The research found that making sense of the challenges was a key concern for parents. Sharing meanings, including through diagnostic and behavioural narratives, was important for people’s relationships with each other. The importance of context is also highlighted, including a consideration of how societal views of disability influence people in the ‘network of concern’ around the child. A range of systemic approaches, methods and techniques are suggested as one way of improving services to these children and their families.
It is suggested that adopting a ‘both/and’ position is important in such work: both applying evidence-based approaches and being alert to, and exploring, the different ways people try to make sense of the children’s challenges. Implications for practice included helping professionals be alert to their own constructions and professional narratives, slowing the pace with families, staying close to the concerns of families and addressing network issues.

Relevance:

30.00%

Publisher:

Abstract:

This research thesis explored the concept of empathy. The specific purpose was to further understand empathy in relation to the experience of male support workers who provide residential care to adults with intellectual disabilities (ID) and challenging behaviour. The thesis aimed to provide some insights into how support workers develop and extract meaning from their experiences of relationships with clients, and the impact of this on their own self-care, namely self-compassion. Since personal accounts of experience were required, a qualitative methodology was employed: Interpretative Phenomenological Analysis (IPA) (Smith, 2004). This methodology was selected as it allows for the exploration and interpretation of idiographic lived experience and meaning making. Eight experienced support workers were interviewed using a semi-structured interview. Four superordinate themes emerged from the data: 1. Making sense of the other’s inner world; 2. Processes that enhance empathic practice; 3. Tensions and conflicts; and 4. Management of distressing feelings. Differing accounts of interpreting the needs of clients were identified, which helped participants understand and make sense of their interpersonal experience and participate in their role. These included utilising academic knowledge and the senses, particularly sight and hearing, which were seemingly complemented by a level of reflective practice. Additionally, to make sense of the experience of a client, participants appeared to put themselves in the client’s position, suggesting a form of empathy. Participants appeared to engage in a process of reflection on their relationships with clients, which helped them think about what they had learned about the person’s needs; moreover, this process enabled them to identify some of their own responses and feelings.
However, participants seemed to struggle to recognise the occurrence or impact of distressing emotional experience and to express their feelings, possibly in response to a deep sense of responsibility and a fear of transferring emotional distress to others. This dilemma of holding two potentially conflicting views of experience seemed to inhibit self-compassion. Although the study did not specifically test theories of empathy, the overall findings suggest that empathy may be a dynamic, transient process that is influenced by reflexivity, values and context. The context in which participants discussed their practice, as situated within their accounts, suggested a sense of confusion and uncertainty. Consequently, it is suggested that this affected how participants understood and related to clients, and to themselves. There were some specific implications for Counselling Psychology practice, mostly concerning training and supervision. These included recommendations for staff training and supervision, systemic organisational intervention, policy development, revisions to models of specialist care frameworks and clinical training.

Relevance:

30.00%

Publisher:

Abstract:

Design is being performed on an ever-increasing spectrum of complex practices arising in response to emerging markets and technologies: co-design, digital interaction, service design and cultures of innovation. This emerging notion of design has led to an expansive array of collaborative and facilitation skills to demonstrate and share how such methods can shape innovation. The meaning of these design things in practice cannot be taken for granted as matters of fact, which raises a key challenge for design: to represent its role through the contradictory nature of matters of concern. This paper explores an innovative, object-oriented approach within the field of design research, visually combining an actor-network theory framework with situational analysis, to report on the role of design for fledgling companies in Scotland, established and funded through the knowledge exchange hub Design in Action (DiA). Key findings and visual maps are presented from reflective discussions with actors from a selection of the businesses within DiA's portfolio. The suggestion is that any notions of strategic value, of engendering meaningful change, of sharing the vision of design through design things, should be grounded in the reflexive interpretations of the matters of concern that emerge.

Relevance:

30.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
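The neighborhood-centric idea can be illustrated with a minimal sketch (this is not NSCALE's actual API, which the abstract does not detail): extract the k-hop neighborhood of a vertex via breadth-first search and build the induced subgraph, which then becomes the unit over which analysis tasks such as ego-network analysis or motif counting operate.

```python
from collections import deque

def k_hop_neighborhood(adj, root, k):
    """Return the set of vertices within k hops of `root`, via BFS."""
    seen = {root}
    frontier = deque([(root, 0)])
    while frontier:
        v, d = frontier.popleft()
        if d == k:
            continue  # stop expanding at the k-hop boundary
        for w in adj.get(v, ()):
            if w not in seen:
                seen.add(w)
                frontier.append((w, d + 1))
    return seen

def ego_subgraph(adj, root, k=1):
    """Induced subgraph on the k-hop neighborhood of `root`."""
    nodes = k_hop_neighborhood(adj, root, k)
    return {v: [w for w in adj.get(v, ()) if w in nodes] for v in nodes}

# Toy undirected graph as an adjacency dict.
adj = {
    "a": ["b", "c"], "b": ["a", "c", "d"],
    "c": ["a", "b"], "d": ["b", "e"], "e": ["d"],
}
sub = ego_subgraph(adj, "a", k=1)  # induced on {'a', 'b', 'c'}
```

A neighborhood-centric program would then run per-subgraph logic (e.g. triangle counting) over each `sub`, rather than exchanging messages one vertex at a time.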

Relevance:

30.00%

Publisher:

Abstract:

Lake Albert is one of the largest lakes in Uganda that still supports a multi-species fishery, which, as a result of the variable adult sizes of the species, poses management challenges, especially in relation to gear mesh-size enforcement. Prior to the 1980s, the commercial fishery was based on 17 large-sized fish species, especially Citharinus citharinus, Distichodus niloticus and Lates spp., that were confined to inshore habitats of the lake and were thus rapidly overfished. Frame and catch-assessment surveys conducted in this study revealed a >80% dominance of small-sized fish species (Neobola bredoi and Brycinus nurse) and a 40-60% decrease in the contribution of the large commercial species. Sustainability of the small-sized species is uncertain due to seasonal fluctuations and low beach value. At about 150,000 tons of fish recorded from Lake Albert and the Albert Nile, the beach value was estimated at 55.3 million USD. Despite the noted decline in catches of the large-sized fishes, their contribution was more than 50% of total beach value. Therefore, management measures should couple value addition for the small-sized species with continued effort regulation targeting recovery of the large, previously important commercial species.
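As a quick sanity check on the figures reported above, the catch and beach-value estimates imply an average landed value per ton (both numbers are taken from the abstract; the per-ton figure is derived, not reported):

```python
catch_tons = 150_000      # combined Lake Albert and Albert Nile catch (from the abstract)
beach_value_usd = 55.3e6  # estimated total beach value in USD (from the abstract)

value_per_ton = beach_value_usd / catch_tons
print(f"average beach value: {value_per_ton:.0f} USD per ton")  # ≈ 369 USD/ton
```

The low average reflects the dominance of low-value small species in the landings, which is consistent with the large species contributing over half the value from a much smaller share of the catch.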

Relevance:

30.00%

Publisher:

Abstract:

After having elective percutaneous coronary intervention (PCI), patients are expected to self-manage their coronary heart disease (CHD) by modifying their risk factors, adhering to medication and effectively managing any recurring angina symptoms, but that self-management may be ineffective. Objective: To explore how patients self-manage their CHD after elective PCI and to identify any factors that may influence this. Design and method: This mixed-methods study recruited a convenience sample of patients (n=93) approximately three months after elective PCI. Quantitative data were collected using a survey and were subject to univariate, bivariate and multivariate analysis. Qualitative data from participant interviews were analysed using thematic analysis. Findings: After PCI, 74% of participants managed their angina symptoms inappropriately. Younger participants and those with threatening perceptions of their CHD were more likely to know how to manage their angina symptoms effectively. Few patients adopted a healthier lifestyle after PCI. Qualitative analysis revealed that intentional non-adherence to some medicines was an issue. Some participants felt unsupported by healthcare providers and social networks in relation to their self-management. Participants reported strong emotional responses to CHD, and this had a detrimental effect on their self-management. Few patients accessed cardiac rehabilitation.

Relevance:

30.00%

Publisher:

Abstract:

A multi-residue gas chromatography-mass spectrometry method was developed in order to evaluate the presence of 39 pesticides of different chemical families (organophosphorus, triazines, imidazole, organochlorine), as well as some of their transformation products, in surface water samples from Ria de Aveiro. Ria de Aveiro is an estuarine coastal lagoon, located in the north-west region of Portugal, which receives inputs from agricultural, urban and industrial activities. The analytical method was developed and validated according to international guidelines and showed good linearity, with correlation coefficients higher than 0.9949 for all compounds, adequate precision and accuracy, and high sensitivity. Pesticides were chosen from the priority pollutants list of Directive 2008/105/EC of the European Parliament and of the Council (on environmental quality standards in the field of water policy), or were selected due to their common use in agricultural practices. Some of these 39 pesticides are, or are suspected to be, endocrine disruptor compounds (EDCs), capable of altering the endocrine system of wildlife and humans, causing malfunction and ultimately health problems. Even those pesticides which are not EDCs are known to be highly toxic and have a recognised impact on human health. The aquatic environment is particularly susceptible to pollution due to intentional and accidental release of chemicals to water [3]. Pesticide contamination of surface water is a national issue, as surface water is often used as drinking water. This concern is especially important in rural agricultural areas, where the population uses small private water supplies, regularly without any laboratory surveillance. The study was performed at seven sampling points, and the results showed pesticide contamination of considerable concern in all samples.
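The linearity criterion mentioned above (correlation coefficients higher than 0.9949) can be checked for a calibration curve with a few lines of standard-library Python; the concentration/peak-area values below are purely illustrative, not data from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical calibration curve: spiked concentration (µg/L) vs. peak area.
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [1020, 2050, 4080, 10150, 20300]

r = pearson_r(conc, area)
assert r > 0.9949, "calibration fails the linearity criterion"
```

In validation practice, this check would be repeated per compound, alongside precision, accuracy and sensitivity assessments.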

Relevance:

30.00%

Publisher:

Abstract:

The report of the proceedings of the New Delhi workshop on the SSF Guidelines (Voluntary Guidelines for Securing Sustainable Small-scale Fisheries in the Context of Food Security and Poverty Eradication). The workshop brought together 95 participants from 13 states representing civil society organizations, governments, FAO, and fishworker organizations from both the marine and inland fisheries sectors. This report will be useful for fishworker organizations, researchers, policy makers, members of civil society and anyone interested in small-scale fisheries, tenure rights, social development, livelihoods, post-harvest and trade, and disasters and climate change.

Relevance:

30.00%

Publisher:

Abstract:

It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and how best to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health is reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins, or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards knowledge generation to support a more accurate human risk assessment and risk management. For this purpose, a physiologically-based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment.
On the other hand, the application of predictive mathematical models to estimate mycotoxins’ combined effects from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project that explored the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial to improving data quality and will contribute to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
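The additivity assumption that the conventional paradigm incorporates is commonly operationalised as a hazard index: the sum of per-toxin hazard quotients (exposure divided by a health-based reference dose). A minimal sketch follows; the toxin names, exposures and reference values are purely illustrative, not data from the presentation:

```python
# Hazard index under the additivity (dose-addition) assumption:
#   HI = sum(exposure_i / reference_dose_i); HI > 1 flags potential concern.
# All values below are illustrative placeholders, not measured data.

exposure = {"toxin A": 0.002, "toxin B": 0.010, "toxin C": 0.050}  # µg/kg bw/day
ref_dose = {"toxin A": 0.004, "toxin B": 0.017, "toxin C": 1.000}  # µg/kg bw/day

hazard_quotients = {t: exposure[t] / ref_dose[t] for t in exposure}
hazard_index = sum(hazard_quotients.values())
print(f"HI = {hazard_index:.2f}")
```

The abstract's point is precisely that such additive summaries ignore possible interactions (synergism, antagonism) between toxins, which is why more refined, physiologically-based approaches are proposed.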

Relevance:

30.00%

Publisher:

Abstract:

What does this thesis do? This thesis uses Actor-Network Theory (ANT) to examine how a UK retailer’s organization and strategy, and, in turn, its form of management accounting was shaped by its supply chain. The thesis does this by reporting on four related themes in the form of four inter-connected essays. The first essay undertakes a state-of-the-art review of the literature. It examines how accounting issues within supply chains permeate ‘matters of concern’. In accordance with this idea of ANT, the essay illustrates how issues emerged, controversies developed, and matters evolved through an actor-network of accounting researchers within the supply chain domain. This leads on to the second essay, which exemplifies the nature of the UK’s retailing industry within which the supply chain case organization emerged and developed. The purposes of the essay are twofold: to introduce the contextual ramifications of the case organization; and to illustrate the emergence of a new market logic, which led to the creation of a global supply chain and a new form of management accounting therein. The third essay reports on a qualitative case study. It analyses the dualistic relation between ostensive and performative aspects of supply chain strategy, reveals how accounting numbers act as an obligatory passage point within this dualism, and makes a contribution to the ANT debate around the issue of whether and how a dualism between ostensive and performative aspects exists. The final essay reports on another case analysis of institutionalizing a heterarchical form of management accounting: a distributed form of intelligence that penetrates through lateral accountable relations. The analysis reveals a new form of management accounting characterised by ambiguity; it emphasizes the possibilities of compromises and negotiations, and it thus contributes to knowledge by combining an aspect of ANT with heterarchical tendencies in the world of contemporary organizations. 
Finally, the thesis concludes that it is the supply chain that organises today’s neoliberal capitalism; and it is management accounting that unites both human and non-human actors within such supply chains, despite that form of management accounting being ambiguous. The thesis comprises the introduction, these four essays, and the conclusion.

Relevance:

30.00%

Publisher:

Abstract:

Understanding and predicting patterns of distribution and abundance of marine resources is important for conservation and management purposes in small-scale artisanal fisheries and industrial fisheries worldwide. The goose barnacle (Pollicipes pollicipes) is an important shellfish resource and its distribution is closely related to wave exposure at different spatial scales. We modelled the abundance (percent coverage) of P. pollicipes as a function of a simple wave exposure index based on fetch estimates from digitized coastlines at different spatial scales. The model accounted for 47.5% of the explained deviance and indicated that barnacle abundance increases non-linearly with wave exposure at both the smallest (metres) and largest (kilometres) spatial scales considered in this study. Distribution maps were predicted for the study region in SW Portugal. Our study suggests that the relationship between fetch-based exposure indices and P. pollicipes percent cover may be used as a simple tool for providing stakeholders with information on barnacle distribution patterns. This information may improve assessment of harvesting grounds and the dimension of exploitable areas, aiding management plans and supporting decision making on conservation, harvesting pressure and surveillance strategies for this highly appreciated and socio-economically important marine resource.
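A fetch-based exposure index of the general kind described can be sketched as follows. The abstract does not give the exact formula, so this is a generic illustration: fetch distances (distance to the nearest coastline) are measured in compass sectors around a site, capped so that open-ocean sectors saturate, and averaged. The sector layout, cap value and site data are all assumptions:

```python
def exposure_index(fetch_km, cap_km=300.0):
    """Simple fetch-based wave exposure index: the mean of per-sector fetch
    distances, each capped at `cap_km` so open-ocean sectors saturate.
    `fetch_km` maps compass sector (degrees) -> distance to nearest coast (km)."""
    capped = [min(d, cap_km) for d in fetch_km.values()]
    return sum(capped) / len(capped)

# Hypothetical site: open to the W/NW (long fetch), sheltered to the N/E.
site = {0: 2.0, 45: 1.0, 90: 0.5, 135: 3.0,
        180: 40.0, 225: 900.0, 270: 1200.0, 315: 700.0}
print(round(exposure_index(site), 1))
```

An abundance model like the one in the study would then regress percent cover on indices of this kind computed at several spatial scales (i.e. with fetch measured from coastlines digitized at different resolutions).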

Relevance:

30.00%

Publisher:

Abstract:

The status of five species of commercially exploited sharks within the Great Barrier Reef Marine Park (GBRMP) and south-east Queensland was assessed using a data-limited approach. Annual harvest rate, U, estimated empirically from tagging between 2011 and 2013, was compared with an analytically-derived proxy for optimal equilibrium harvest rate, UMSY-Lim. Median estimates of U for three principal retained species, Australian blacktip shark, Carcharhinus tilstoni, spot-tail shark, Carcharhinus sorrah, and spinner shark, Carcharhinus brevipinna, were 0.10, 0.06 and 0.07 year⁻¹, respectively. Median U for two retained, non-target species, pigeye shark, Carcharhinus amboinensis, and Australian sharpnose shark, Rhizoprionodon taylori, were 0.27 and 0.01 year⁻¹, respectively. For all species except the Australian blacktip, the median ratio of U/UMSY-Lim was <1. The high vulnerability of this species to fishing, combined with its life history characteristics, meant UMSY-Lim was low (0.04-0.07 year⁻¹) and that U/UMSY-Lim was likely to be >1. Harvest of the Australian blacktip shark above UMSY could place this species at a greater risk of localised depletion in parts of the GBRMP. Results of the study indicated that the much higher catches, and presumably higher U, during the early 2000s were likely unsustainable. The unexpectedly high level of U on the pigeye shark indicated that output-based management controls may not have been effective in reducing harvest levels on all species, particularly those caught incidentally by other fishing sectors, including the recreational sector. © 2016 Elsevier B.V.
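The status check described above reduces to comparing the ratio U/UMSY-Lim against 1 for each species. A small sketch using the median harvest rates reported in the abstract; note the abstract only gives the UMSY-Lim range for the Australian blacktip (0.04-0.07 year⁻¹), so the upper bound is used here and no limits are assumed for the other species:

```python
# Median annual harvest rates U (year^-1), as reported in the abstract.
U = {"Australian blacktip": 0.10, "spot-tail": 0.06, "spinner": 0.07,
     "pigeye": 0.27, "Australian sharpnose": 0.01}

# UMSY-Lim is reported only for the Australian blacktip (0.04-0.07 year^-1);
# the upper bound of that range is used here.
U_msy_lim = {"Australian blacktip": 0.07}

for species, u in U.items():
    lim = U_msy_lim.get(species)
    if lim is None:
        continue  # no limit reference point reported for this species
    ratio = u / lim
    status = "above proxy limit" if ratio > 1 else "within proxy limit"
    print(f"{species}: U/UMSY-Lim = {ratio:.2f} ({status})")
```

Even against the most optimistic end of its reported UMSY-Lim range, the blacktip ratio exceeds 1, which matches the abstract's conclusion that U/UMSY-Lim for this species was likely >1.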

Relevance:

30.00%

Publisher:

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics, rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight, but also to test new theories and practices, without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is very well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, this would allow you to answer some of the following questions:
- Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
- Which kind of behaviour will a given target system display in the future?
- Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to a topic that we think is a valuable asset in the toolset of analysts and decision makers. We give a summary of information we have gathered from the literature and of the experience we have gained first hand during the last five years, whilst obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further studies, and finally in Section 7 we conclude the chapter with a short summary.
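The "run, don't solve" character of simulation described above can be made concrete with a toy agent-based model (this is an illustrative sketch, not the chapter's own example): agents repeatedly apply a local rule, and the analyst observes the trajectory of system states rather than a single closed-form answer.

```python
import random

class Agent:
    """A minimal agent: holds a binary opinion and imitates a random peer."""
    def __init__(self, opinion):
        self.opinion = opinion

    def step(self, population, rng):
        peer = rng.choice(population)
        self.opinion = peer.opinion  # naive imitation rule

def run_simulation(n_agents=50, n_steps=200, seed=42):
    """Run the model and record the system state (number of '1' opinions)
    after every step -- the trajectory, not just the final value."""
    rng = random.Random(seed)  # seeded for repeatable experiments
    agents = [Agent(rng.randint(0, 1)) for _ in range(n_agents)]
    history = []
    for _ in range(n_steps):
        for agent in agents:
            agent.step(agents, rng)
        history.append(sum(a.opinion for a in agents))
    return history

history = run_simulation()
```

Per the chapter's advice, `history` would then be analysed statistically across many seeds; a single run's numbers are not, on their own, a basis for recommendations.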

Relevance:

30.00%

Publisher:

Abstract:

Forested areas within cities host a large number of species, responsible for many ecosystem services in urban areas. The biodiversity in these areas is influenced by human disturbances such as atmospheric pollution and the urban heat island effect. To ameliorate the effects of these factors, an increase in urban green areas is often considered sufficient. However, this approach assumes that all types of green cover have the same importance for species. Our aim was to show that not all forested green areas are of equal importance for species, and that a multi-taxa and functional diversity approach makes it possible to value green infrastructure in urban environments. After evaluating the diversity of lichens, butterflies and other arthropods, birds and mammals in 31 Mediterranean urban forests in south-west Europe (Almada, Portugal), bird and lichen functional groups responsive to urbanization were found. A community shift (tolerant species replacing sensitive ones) along the urbanization gradient was found, and this must be considered when using these groups as indicators of the effect of urbanization. Bird and lichen functional groups were then analyzed together with the characteristics of the forests and their surroundings. Our results showed that, contrary to previous assumptions, vegetation density and, more importantly, the amount of urban area around the forest (the matrix) are more important for biodiversity than forest quantity alone. This indicated that not all types of forested green areas have the same importance for biodiversity. An index of forest functional diversity was then calculated for all sampled forests of the area. This could help decision-makers to improve the management of urban green infrastructure with the goal of increasing functionality and ultimately ecosystem services in urban areas.
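One common way to summarize functional-group data into a per-forest index, of the general kind mentioned above, is a Shannon-type diversity computed over functional-group abundances. The abstract does not specify the authors' index, so this is a generic sketch with invented counts:

```python
import math

def shannon_diversity(abundances):
    """Shannon diversity H' = -sum(p_i * ln p_i) over functional groups."""
    total = sum(abundances)
    proportions = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in proportions)

# Hypothetical counts per functional group (e.g. lichen response groups)
# in two forests with contrasting surroundings.
forest_dense_matrix = [30, 2, 1]   # dominated by one urbanization-tolerant group
forest_green_matrix = [12, 10, 11] # groups evenly represented

assert shannon_diversity(forest_green_matrix) > shannon_diversity(forest_dense_matrix)
```

An evenly represented set of functional groups scores higher than a community dominated by a single tolerant group, which is the pattern a functional diversity index is meant to capture along an urbanization gradient.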