864 results for Support tool


Relevance: 30.00%

Abstract:

Purpose: Although business models that deliver sustainability are increasingly popular in the literature, few tools that assist in sustainable business modelling have been identified. This paper investigates how businesses might create balanced social, environmental and economic value through integrating sustainability more fully into the core of their business. A value mapping tool is developed to help firms create value propositions better suited for sustainability. Design/methodology/approach: In addition to a literature review, six sustainable companies were interviewed to understand their approaches to business modelling, using a case study approach. Building on the literature and practice, a tool was developed which was pilot tested through use in a workshop. The resulting improved tool and process was subsequently refined through use in 13 workshops. Findings: A novel value mapping tool was developed to support sustainable business modelling, which introduces three forms of value (value captured, missed/destroyed or wasted, and opportunity) and four major stakeholder groups (environment, society, customer, and network actors). Practical implications: This tool intends to support business modelling for sustainability by assisting firms in better understanding their overall value proposition, both positive and negative, for all relevant stakeholders in the value network. Originality/value: The tool adopts a multiple stakeholder view of value, a network rather than firm-centric perspective, and introduces a novel way of conceptualising value that specifically introduces value destroyed or wasted/missed, in addition to the current value proposition and new opportunities for value creation. © Emerald Group Publishing Limited.

Relevance: 30.00%

Abstract:

Security policies are increasingly being implemented by organisations. Policies are mapped to device configurations to enforce them, a task typically performed manually by network administrators, and the development and management of these enforcement policies is difficult and error prone. This thesis describes the development and evaluation of an off-line firewall policy parser and validation tool. It provides the system administrator with a textual interface and the vendor-specific low-level languages they trust and are familiar with, but with the support of an off-line compiler tool. The tool was created using the Microsoft C#.NET language and the Microsoft Visual Studio Integrated Development Environment (IDE). This provided an object environment for creating a flexible and extensible system, as well as simple Web and Windows prototyping facilities for building GUI front-end applications for testing and evaluation. A CLI was provided with the tool for more experienced users, but it was also designed to be easily integrated into GUI-based applications for non-expert users. The evaluation of the system was performed from a custom-built GUI application, which can create test firewall rule sets containing synthetic rules to supply a variety of experimental conditions, as well as record various performance metrics. The validation tool was created with a pragmatic outlook on the needs of the network administrator. Modularity was important in the design, due to the fast-changing nature of the network device languages being processed. An object-oriented approach was taken for maximum changeability and extensibility, and a flexible tool was developed to serve the needs of different types of users: system administrators desire low-level, CLI-based tools that they can trust and use easily from scripting languages, whereas inexperienced users may prefer a more abstract, high-level GUI or wizard that is easier to learn. Built around these ideas, the tool was implemented and proved to be a usable and complementary addition to the many network policy-based systems currently available. The tool has a flexible design and comprehensive functionality, in contrast to some other tools, which work across multiple vendor languages but do not implement a deep range of options for any of them. It complements existing systems such as policy compliance tools and abstract policy analysis systems. Its validation algorithms were evaluated for both completeness and performance, and the tool was found to correctly process large firewall policies in just a few seconds. A framework for a policy-based management system, with which the tool would integrate, is also proposed. This is based around a vendor-independent XML-based repository of device configurations, which could be used to bring together existing policy management and analysis systems.
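
The thesis tool itself is written in C#; purely as an illustration of the kind of check such a validation tool performs, here is a minimal Python sketch that detects shadowed firewall rules, i.e. a later rule that can never fire because an earlier rule with the opposite action already covers all of its traffic. The rule representation and the example policy are invented for the sketch, not taken from the thesis.

```python
import ipaddress
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Rule:
    action: str                      # "permit" or "deny"
    src: ipaddress.IPv4Network
    dst: ipaddress.IPv4Network
    port: Optional[int] = None       # None matches any port

def covers(earlier: Rule, later: Rule) -> bool:
    """True if `earlier` matches every packet that `later` matches."""
    return (later.src.subnet_of(earlier.src)
            and later.dst.subnet_of(earlier.dst)
            and (earlier.port is None or earlier.port == later.port))

def shadowed(rules: list) -> list:
    """Indices of rules hidden by an earlier rule with a different action."""
    hits = []
    for i, later in enumerate(rules):
        if any(covers(earlier, later) and earlier.action != later.action
               for earlier in rules[:i]):
            hits.append(i)
    return hits

policy = [
    Rule("deny", ipaddress.ip_network("10.0.0.0/8"),
         ipaddress.ip_network("0.0.0.0/0")),
    Rule("permit", ipaddress.ip_network("10.1.0.0/16"),
         ipaddress.ip_network("0.0.0.0/0"), port=80),
]
print(shadowed(policy))  # [1]: the permit rule can never fire
```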

Relevance: 30.00%

Abstract:

Political drivers such as the Kyoto Protocol, the EU Energy Performance of Buildings Directive and the Energy End-use and Services Directive have been implemented in response to an identified need for a reduction in human-related CO2 emissions. Buildings account for a significant portion of global CO2 emissions, approximately 25-30%, and it is widely acknowledged by industry and research organisations that they operate inefficiently. In parallel, unsatisfactory indoor environmental conditions have been shown to negatively impact occupant productivity. Legislative drivers and client education are seen as the key motivating factors for an improvement in the holistic environmental and energy performance of a building. A symbiotic relationship exists between building indoor environmental conditions and building energy consumption, yet traditional Building Management Systems and Energy Management Systems treat these separately. Conventional performance analysis compares building energy consumption with a previously recorded value or with the consumption of a similar building, and does not recognise the fact that all buildings are unique. What is required, therefore, is a new framework which incorporates performance comparison against a theoretical building-specific ideal benchmark. Traditionally, Energy Managers, who work at the operational level of organisations with respect to building performance, do not have access to ideal performance benchmark information and as a result cannot operate buildings optimally. This thesis systematically defines Holistic Environmental and Energy Management and specifies the Scenario Modelling Technique, which in turn uses an ideal performance benchmark. The holistic technique uses quantified expressions of building performance, enabling the profiled Energy Manager to visualise his actions and their downstream consequences in the context of overall building operation. The Ideal Building Framework facilitates the use of this technique by acting as a Building Life Cycle (BLC) data repository through which ideal building performance benchmarks are systematically structured and stored in parallel with actual performance data. The Ideal Building Framework utilises transformed data in the form of the Ideal Set of Performance Objectives and Metrics, which are capable of defining the performance of any building at any stage of the BLC. It is proposed that the union of Scenario Models for an individual building would result in a building-specific Combination of Performance Metrics, which would in turn be stored in the BLC data repository. The Ideal Data Set underpins the Ideal Set of Performance Objectives and Metrics and is the set of measurements required to monitor the performance of the Ideal Building. A Model View describes the unique building-specific data relevant to a particular project stakeholder. The energy management data and information exchange requirements that underlie a Model View implementation are detailed, and incorporate both traditional and proposed energy management. This thesis also specifies the Model View Methodology, which complements the Ideal Building Framework. The developed Model View and Rule Set methodology uses stakeholder-specific rule sets to define the environmental and energy performance data pertinent to each stakeholder. This generic process further enables each stakeholder to define the desired data resolution, for example basic, intermediate or detailed.
The Model View methodology is applicable to all project stakeholders, each requiring its own customised rule set. Two rule sets are defined in detail: the Energy Manager rule set and the LEED Accreditor rule set. This measurement generation process, together with the defined Model View, would filter and expedite data access for all stakeholders involved in building performance. Information presentation is critical for effective use of the data provided by the Ideal Building Framework and the Energy Management View definition. The specifications for a customised Information Delivery Tool account for the established profile of Energy Managers and best-practice user interface design. Components of the developed tool could also be used by Facility Managers working at the tactical and strategic levels of organisations. Informed decision making is made possible through specified decision assistance processes which incorporate the Scenario Modelling and Benchmarking techniques, the Ideal Building Framework, the Energy Manager Model View, the Information Delivery Tool and the established profile of Energy Managers. The Model View and Rule Set Methodology is effectively demonstrated on an appropriate existing mixed-use 'green' building, the Environmental Research Institute at University College Cork, using the Energy Management and LEED rule sets, and informed decision making is demonstrated using a prototype scenario for the demonstration building.
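
As a loose illustration of the rule-set idea (not code from the thesis), the sketch below shows how a stakeholder-specific rule set could filter a shared store of performance metrics down to the resolution that stakeholder has chosen. All metric names, rule contents and values are invented.

```python
# Each metric in the shared repository is tagged with a resolution level;
# a stakeholder's rule set states the deepest level it wants to see.
METRICS = {
    "electricity_kwh":  ("detailed", 1420.5),
    "gas_kwh":          ("detailed", 980.0),
    "co2_kg":           ("intermediate", 610.2),
    "total_energy_kwh": ("basic", 2400.5),
}

RULE_SETS = {
    "energy_manager":  "detailed",      # operational level: everything
    "leed_accreditor": "intermediate",  # aggregate figures suffice
}

LEVELS = ["basic", "intermediate", "detailed"]

def model_view(stakeholder: str) -> dict:
    """Metrics visible to a stakeholder under its rule set."""
    limit = LEVELS.index(RULE_SETS[stakeholder])
    return {name: value for name, (res, value) in METRICS.items()
            if LEVELS.index(res) <= limit}

print(model_view("leed_accreditor"))
# {'co2_kg': 610.2, 'total_energy_kwh': 2400.5}
```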

Relevance: 30.00%

Abstract:

The objective of my Portfolio is to explore the working hypothesis that the organic growth of a firm is governed by the perspectives of individuals, and that such perspectives are governed by their meaning-making. The Portfolio presents explorations of the transformation of my meaning-making and of adopting new practices to support the organic growth of a firm. I use the work of other theorists to transition my understanding of how the world works. This transition process is an essential tool for engaging with and understanding the perspectives of others and developing the mental capacity to "train one's imagination to go visiting" (Arendt, 1982, p. 43). The Portfolio, therefore, is primarily located in reflective research. Using Kegan's (1994) approach to Adult Mental Development, and Sowell's (2007) understanding of the visions which silently shape our thoughts, I organise the developments of my meaning-making around three transformation pillars of change. In pillar one I seek to transform an unthinking respect for authority and break down a blind pervasiveness of thought within my reasoning process arising from an instinct for attachment and support from others whom I trust. In pillar two I seek to discontinue using autocratic leadership and learn to use the thoughts and contributions of a wider team to make improved choices about uncertain future events. In pillar three I explore the use of a more reflective thinking framework to test the accuracy of my perceptions and apply a high level of integrity in my reasoning process. The transformation of my meaning-making has changed my perspectives and, in turn, my preferred practices to support the organic growth of a firm. I identify from practice that a transformative form of leadership is far more effective than a transactional form of leadership in stimulating the trust and teamwork required to sustain the growth of a firm. Creating an environment where one feels free to share thoughts and feelings with others is an essential tool for building a team whose members can critique one another's thoughts. Furthermore, the entrepreneurial wisdom to grow a firm must come from a wider team, located both inside and outside the boundaries of a firm. No individual or small team has the mental capacity to provide the entrepreneurship required to drive the organic growth of a firm. I address my Portfolio to leaders in organisations who have no considered framework on the best practices required to lead a social organisation. These individuals may have no sense of what they implicitly believe drives social causation, and they may not know whether their meaning-making supports or curtails the practices required to grow a firm. They may have a very limited capacity to think in a logical manner, with the result that they rely on guesses from their 'gut' and make poor judgements in the management of a firm.

Relevance: 30.00%

Abstract:

Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to determine whether data generated by over 500 citizen scientists replicate internally and in comparison to previously published findings. Half of the participants participated for free, while the other half paid for access. The website provided each participant with a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world, and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors, in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
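
For readers unfamiliar with the factor-analytic step, the sketch below shows one way such an analysis could be run in Python, comparing how well one versus several latent factors explain per-dog task scores. The data here are synthetic placeholders; the paper's actual dataset and modelling choices may differ.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
scores = rng.normal(size=(500, 10))   # 500 dogs x 10 cognitive tests

for k in (1, 2, 3, 4):
    fa = FactorAnalysis(n_components=k)
    ll = cross_val_score(fa, scores).mean()  # held-out log-likelihood
    print(f"{k} factor(s): mean score {ll:.2f}")
# A better fit for k > 1 supports multiple, independently varying
# cognitive domains rather than a single general factor.
```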

Relevance: 30.00%

Abstract:

BACKGROUND: The detection of latent tuberculosis infection (LTBI) is a major component of tuberculosis (TB) control strategies. In addition to the tuberculin skin test (TST), novel blood tests, based on the in vitro release of IFN-gamma in response to the Mycobacterium tuberculosis-specific antigens ESAT-6 and CFP-10 (IGRAs), are used for TB diagnosis. However, neither IGRAs nor the TST can separate acute TB from LTBI, and there is concern that responses in IGRAs may decline with time after infection. We have therefore evaluated the potential of the novel antigen heparin-binding hemagglutinin (HBHA) for the in vitro detection of LTBI. METHODOLOGY AND PRINCIPAL FINDINGS: HBHA was compared to purified protein derivative (PPD) and ESAT-6 in IGRAs on lymphocytes drawn from 205 individuals living in Belgium, a country with low TB prevalence, where BCG vaccination is not routinely used. Among these subjects, 89 had active TB, 65 had LTBI based on well-standardized TST reactions, and 51 were negative controls. HBHA was significantly more sensitive than ESAT-6 and more specific than PPD for the detection of LTBI. PPD-based tests yielded 90.00% sensitivity and 70.00% specificity for the detection of LTBI, whereas the sensitivity and specificity of the ESAT-6-based tests were 40.74% and 90.91%, and those of the HBHA-based tests were 92.06% and 93.88%, respectively. The QuantiFERON-TB Gold In-Tube (QFT-IT) test applied to 20 LTBI subjects yielded 50% sensitivity. The HBHA IGRA was not influenced by prior BCG vaccination and, in contrast to the QFT-IT test, detected remote (>2 years) infections as well as recent (<2 years) infections. CONCLUSIONS: The use of ESAT-6- and CFP-10-based IGRAs may underestimate the incidence of LTBI, whereas the use of HBHA may combine the operational advantages of IGRAs with high sensitivity and specificity for latent infection.
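
The sensitivity and specificity figures quoted above follow from simple confusion-matrix ratios. The sketch below reproduces the reported HBHA percentages from illustrative counts; the study's actual denominators may differ.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly infected subjects the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of uninfected controls the test clears."""
    return tn / (tn + fp)

# Counts chosen so the ratios match the abstract's HBHA figures.
print(f"sensitivity = {sensitivity(58, 5):.2%}")  # 92.06%
print(f"specificity = {specificity(46, 3):.2%}")  # 93.88%
```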

Relevance: 30.00%

Abstract:

We provide a select overview of tools supporting traditional Jewish learning. We then discuss our own HyperJoseph/HyperIsaac project in instructional hypermedia, whose application is to teaching, teacher training and self-instruction in given Bible passages. The treatment of two narratives has been developed thus far. The tool enables analysis of the text in several respects: linguistic, narratological, etc. Moreover, the Scriptures' focality throughout cultural history makes this domain of application particularly challenging, in that the tool is required to encompass the accretion of receptions in the cultural repertoire, i.e. several layers of textual traditions, either hermeneutic (i.e. interpretive) or appropriations, related to the given core passage. This includes "secondary" texts (i.e. texts that respond to or derive from the core passage) from realms as disparate as Roman-age and later homiletics, Medieval and later commentaries or supercommentaries, literary appropriations, and references in the arts and modern scholarship. In particular, the Midrash (homiletic expansions) is adept at narrative gap-filling, so the narratives mushroom at the interstices where the primary text is silent. The genealogy of the project is rooted in Weiss' index of the novelist Agnon's writings, which was eventually upgraded into a hypertextual tool including Agnon's full text and ancillary materials. As those early tools were intended primarily for reference and research support in literary studies, the Agnon hypertext system was initially emulated in the conception of HyperJoseph, which is applied to the Joseph story from Genesis. The transition from a reference tool to an instructional tool then required a thorough reconception from an educational perspective, which led to HyperIsaac, on the sacrifice of Isaac, and to a redesign and upgrade of HyperJoseph patterned after HyperIsaac.

Relevance: 30.00%

Abstract:

The Computer Aided Parallelisation Tools (CAPTools) [Ierotheou C, Johnson SP, Cross M, Leggett PF. Computer aided parallelisation tools (CAPTools) - conceptual overview and performance on the parallelisation of structured mesh codes. Parallel Computing 1996;22:163-195] are a set of interactive tools aimed at providing automatic parallelisation of serial FORTRAN Computational Mechanics (CM) programs. CAPTools analyses the user's serial code and then, through stages of array partitioning, mask and communication calculation, generates parallel SPMD (Single Program Multiple Data) message-passing FORTRAN. The parallel code generated by CAPTools contains calls to a collection of routines that form the CAPTools Communications Library (CAPLib). The library provides a portable layer and user-friendly abstraction over the underlying parallel environment. CAPLib contains optimised message-passing routines for data exchange between parallel processes, together with utility routines for parallel execution control, initialisation and debugging. By compiling and linking with different implementations of the library, the user is able to run on many different parallel environments. Even with today's parallel systems, the concept of a single version of a parallel application code is more an aspiration than a reality. However, for CM codes the data-partitioning SPMD paradigm requires a relatively small set of message-passing communication calls. This set can be implemented as an intermediate 'thin layer' library of message-passing calls that enables the parallel code (especially that generated automatically by a parallelisation tool such as CAPTools) to be as generic as possible. CAPLib is just such a 'thin layer' message-passing library that supports parallel CM codes by mapping generic calls onto machine-specific libraries (such as CRAY SHMEM) and portable general-purpose libraries (such as PVM and MPI). This paper describes CAPLib, together with its three perceived advantages over other routes: as a high-level abstraction, it is easy both to understand (especially when generated automatically by tools) and to implement by hand for the CM community (who are not generally parallel computing specialists); the one parallel version of the application code is truly generic and portable; and the parallel application can readily utilise whatever message-passing libraries on a given machine yield optimum performance.
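
To make the 'thin layer' idea concrete, here is a minimal sketch in Python (using mpi4py) rather than FORTRAN: the application calls only generic cap_* routines, so retargeting to a different underlying library means rewriting this one module, not the application. The function names are invented for illustration, not CAPLib's actual interface.

```python
from mpi4py import MPI

_comm = MPI.COMM_WORLD

def cap_init():
    """Execution-control helper: (process rank, number of processes)."""
    return _comm.Get_rank(), _comm.Get_size()

def cap_exchange(send_data, neighbour):
    """Swap partition-boundary data with one neighbouring process."""
    return _comm.sendrecv(send_data, dest=neighbour, source=neighbour)

if __name__ == "__main__":        # run with: mpiexec -n 2 python thin.py
    rank, size = cap_init()
    if size == 2:
        got = cap_exchange(f"halo from {rank}", neighbour=1 - rank)
        print(rank, "received:", got)
```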

Relevance: 30.00%

Abstract:

Software metrics are a key tool in software quality management. In this paper, we propose using support vector machines for regression, applied to software metrics, to predict software quality. In experiments we compare this method with other regression techniques such as Multivariate Linear Regression, Conjunctive Rule and Locally Weighted Regression. Results on the MIS benchmark dataset, using mean absolute error and correlation coefficient as regression performance measures, indicate that support vector machine regression is a promising technique for software quality prediction. In addition, our investigation of PCA-based metrics extraction shows that using only the first few Principal Components (PC) we can still obtain relatively good performance.
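
A hedged sketch of the comparison described above, using scikit-learn: support vector regression versus ordinary linear regression on metric features, scored by mean absolute error and correlation, plus a PCA variant that keeps only the first few principal components. Synthetic data stands in for the MIS dataset, and the model settings are defaults rather than the paper's.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(390, 11))                 # stand-in metric matrix
y = X @ rng.normal(size=11) + rng.normal(scale=0.5, size=390)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "Linear":  LinearRegression(),
    "SVR":     make_pipeline(StandardScaler(), SVR()),
    "PCA+SVR": make_pipeline(StandardScaler(), PCA(n_components=3), SVR()),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, pred):.3f}, "
          f"r={np.corrcoef(y_te, pred)[0, 1]:.3f}")
```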

Relevance: 30.00%

Abstract:

This article presents the application of a theatrical technique, Playback Theatre, which was developed in the United States during the 1970s, to social intervention, as a narrative and listening space that confers value and dignity upon the person and upon the unique and distinct individual experiences that facilitate their social and relational integration. This art of being oneself, as the author states, takes the oral tradition and the spontaneous and creative communication of psychodrama and combines them with theatrical expression. The technique has been shown to be pertinent to both community social work and support groups for persons in problematic situations. Its aim is to celebrate some specific moment of their lives, as individuals or as a community, and to define strategies for improving living conditions or for resolving or alleviating conflicts. It is also used to assess the achievement of the proposed objectives, to strengthen the motivation to change and to transform existing relationships into collaborative ones. This is possible not only owing to the participation of persons, but also to the assumption of different roles, which can permit the overcoming of certain traumatic events. In addition to support groups, it is used for the training and supervision of social work professionals. The technique allows them to assume roles as diverse as narrator, audience or actor, whether simultaneously or successively. Taking the role of «performer», or guide to the theatrical action, requires prior preparation so that the group of participants is able to pool their individualities and emotions and reflect on them. The participatory methodology that Playback Theatre proposes is important in community social work and offers a new and transformative approach.

Relevance: 30.00%

Abstract:

This paper addresses the problems often faced by social workers and their supervisors in decision making where human rights considerations and child protection concerns collide. High-profile court cases in the United Kingdom and Europe have consistently called for social workers to convey more clarity when justifying their reasons for interfering with human rights in child protection cases. The themes emerging from these case law decisions imply that social workers need to be better at giving reasons and evidence in more explicit ways to support any actions they propose which interfere with Convention rights. Toulmin (1958, 1985) offers a structured approach to argumentation which may have relevance to the supervision of child protection cases where social workers and managers are required to balance these human rights considerations. One of the key challenges in this balancing act is the need for decision makers to feel confident that any interventions resulting in the interference of human rights are both justified and proportionate. Toulmin's work has already been shown to have relevance in assisting social workers to navigate pathways through cases involving competing ethical and moral demands (Osmo and Landau, 2001) and, more recently, in human rights and decision making in child protection (Duffy et al., 2006). Toulmin's model takes the practitioner through a series of stages where any argument or proposed recommendation (claim) is subjected to intense critical analysis involving exposition of its strengths and weaknesses. The author therefore proposes that explicit argumentation (Osmo and Landau, 2001) may help supervisors and practitioners towards safer and more confident decision making in child protection cases involving interference with the human rights of children and parents. In addition to highlighting the broader context of human rights currently permeating child protection decision making, the paper includes case material to demonstrate practically the application of Toulmin's model of argumentation to the supervision context. In this way the paper adopts a strong practice approach, helping to assist practitioners with the problems and dilemmas they may come up against in decision making in complex cases.
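
To show the structure the model imposes, here is an illustrative sketch of Toulmin's stages (claim, grounds, warrant, backing, qualifier, rebuttal) as a record a supervisor might complete. The field names follow Toulmin's standard terminology; the case content is entirely invented.

```python
from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    claim: str                                   # proposed recommendation
    grounds: list = field(default_factory=list)  # evidence offered
    warrant: str = ""      # why the grounds support the claim
    backing: str = ""      # authority standing behind the warrant
    qualifier: str = ""    # stated strength of the claim
    rebuttal: str = ""     # conditions under which the claim fails

argument = ToulminArgument(
    claim="Apply for an interim care order",
    grounds=["repeated missed medical appointments",
             "unexplained injuries noted by the school"],
    warrant="Together the grounds indicate a risk of significant harm",
    backing="Statutory threshold criteria for state intervention",
    qualifier="probably",
    rebuttal="unless the injuries receive a verified medical explanation",
)
print(argument.qualifier, "-", argument.claim)
```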

Relevance: 30.00%

Abstract:

Researchers have long reported that dogs and cats improve the physical and psychological health of their human caregivers, and while the evidence is still inconclusive, a substantial amount of research now lends support to the commonly held view that "pets are good for us." Recently, studies have directed attention toward exploring the use of animals, most notably dogs, in the detection of disease and other types of health problems in people. This article reviews the evidence for dogs' ability to detect ill health in humans, focusing specifically on the detection of cancer, epileptic seizures, and hypoglycemia. The author describes the research carried out in this area and evaluates it in an effort to determine whether dogs have a role to play in modern health care as an "alert" tool or screening system for ill health. Where necessary, the author highlights weaknesses in the work and proposes directions for future studies.

Relevance: 30.00%

Abstract:

Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty in determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, whereby the number of nodes, the location of centres and the weights between the hidden layer and the output layer can be identified simultaneously for radial basis function (RBF) networks. The topology problem for nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its application in process monitoring, since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.
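
As a loose illustration of the combined approach (not the authors' implementation), the sketch below uses kernel PCA as a stand-in for the RBF-network nonlinear PCA, and a one-class SVM, which for an RBF kernel is equivalent to support vector data description, to draw a monitoring boundary around the nonlinear scores. The data are synthetic.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_ops = rng.normal(size=(200, 5))            # in-control process data
fault = normal_ops + np.array([4, 0, 0, 0, 0])    # shifted, faulty data

# The nonlinear scores need no parametric distribution: the one-class
# boundary is fitted directly in the score space.
kpca = KernelPCA(n_components=2, kernel="rbf").fit(normal_ops)
boundary = OneClassSVM(nu=0.05).fit(kpca.transform(normal_ops))

flags = boundary.predict(kpca.transform(fault))   # -1 = out of control
print(f"{(flags == -1).mean():.0%} of faulty samples flagged")
```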

Relevance: 30.00%

Abstract:

Aim: To develop an expert system model for the diagnosis of fine needle aspiration cytology (FNAC) of the breast.

Methods: Knowledge and uncertainty were represented in the form of a Bayesian belief network which permitted the combination of diagnostic evidence in a cumulative manner and provided a final probability for the possible diagnostic outcomes. The network comprised 10 cytological features (evidence nodes), each independently linked to the diagnosis (decision node) by a conditional probability matrix. The system was designed to be interactive in that the cytopathologist entered evidence into the network in the form of likelihood ratios for the outcomes at each evidence node.
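
The cumulative combination the network performs can be pictured as Bayes' rule on the odds scale: each evidence node contributes a likelihood ratio that multiplies the running odds. The sketch below is an illustrative simplification assuming independent features (as the network structure implies); the prior and ratios are invented.

```python
def posterior(prior_prob: float, likelihood_ratios: list) -> float:
    """Probability of malignancy after all evidence nodes are entered."""
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:      # one cytological feature at a time
        odds *= lr                    # cumulative Bayesian update
    return odds / (1 + odds)

# Three observed features, each as P(feature|malignant)/P(feature|benign)
print(f"{posterior(0.5, [3.0, 0.8, 5.0]):.2f}")   # 0.92
```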

Results: The efficiency of the network was tested on a series of 40 breast FNAC specimens. The highest diagnostic probability provided by the network agreed with the cytopathologists' diagnosis in 100% of cases for the assessment of discrete benign and malignant aspirates. Atypical, probably benign cases were given probabilities in favour of a benign diagnosis. Suspicious cases tended to have similar probabilities for both diagnostic outcomes and so, correctly, could not be assigned as benign or malignant. A closer examination of cumulative belief graphs for the diagnostic sequence of each case provided insight into the diagnostic process, and quantitative data which improved the identification of suspicious cases.

Conclusion: The further development of such a system will have three important roles in breast cytodiagnosis: (1) to aid the cytologist in making a more consistent and objective diagnosis; (2) to provide a teaching tool on breast cytological diagnosis for the non-expert; and (3) to provide the first stage in the development of a system capable of automated diagnosis through the use of expert system machine vision.

Relevance: 30.00%

Abstract:

The application of slurry nutrients to land can be associated with unintended losses to the environment, depending on soil and weather conditions. Correct timing of slurry application, however, can increase plant nutrient uptake and reduce losses. A decision support system (DSS), which predicts optimum conditions for slurry spreading based on the Hybrid Soil Moisture Deficit (HSMD) model, was investigated for use as a policy tool. The DSS recommendations were compared to farmer perception of suitable conditions for slurry spreading for three soil drainage classes (well, moderately and poorly drained) to better understand on-farm slurry management practices and to identify potential conflict with farmer opinion. Six farmers participated in a survey over two and a half years, during which they completed a daily diary, and their responses were compared to Soil Moisture Deficit (SMD) calculations and weather data recorded by on-farm meteorological stations. The perception of land drainage quality differed between farmers and was related to their local knowledge and experience. It was found that allocating grass fields to HSMD drainage classes using a visual assessment method aligned with farmer perception of drainage at the national scale. Farmer opinion corresponded to the theoretical understanding that slurry should not be applied when the soil is wetter than field capacity, i.e. when drainage can occur. While weather and soil conditions (especially trafficability) were the principal reasons given by farmers not to spread slurry, farm management practices (grazing and silage) and current Nitrates Directive policies (a closed winter period for spreading), combined with limited storage capacities, were obstacles to the utilisation of slurry nutrients. Despite the slightly more restrictive advice of the DSS regarding the number of suitable spreading opportunities, the system has the potential to address an information deficit and help farmers reduce nutrient losses and optimise plant nutrient uptake through improved slurry management. The DSS advice was in general agreement with the farmers' views and, therefore, they should not be resistant to adopting the tool for day-to-day management.
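
As a toy illustration of the DSS logic described above (not the published HSMD rules), the sketch below encodes the core principle that slurry should not be spread when the soil is wetter than field capacity, i.e. when the soil moisture deficit is negative; the rain-forecast check and its threshold are invented stand-ins for the fuller model.

```python
def spreading_advice(smd_mm: float, rain_forecast_mm: float) -> str:
    """Toy decision rule; thresholds are illustrative only."""
    if smd_mm < 0:                # soil wetter than field capacity
        return "do not spread: drainage losses likely"
    if rain_forecast_mm > 10:     # heavy rain would re-wet the soil
        return "delay: heavy rain forecast"
    return "conditions suitable for spreading"

print(spreading_advice(smd_mm=-2.0, rain_forecast_mm=0.0))
print(spreading_advice(smd_mm=5.0, rain_forecast_mm=2.0))
```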