Abstract:
Purpose – In the 1990s, a growing number of companies in the UK adopted value-based management (VBM) techniques. The purpose of this paper is to explore the motivations for the adoption or non-adoption of VBM for managing a business. Design/methodology/approach – An interview-based study of 37 large UK companies. Insights from diffusion theory and institutional theory are used to theorise these motivations. Findings – The rate of adoption of VBM in the sample companies follows the classical S-shaped curve. The findings also suggest that the supply side of the diffusion process, most notably the role played by consultants, influenced many companies. This was not, however, a sufficient condition for companies to adopt the technique. The research also finds evidence of relocation diffusion: several adopters were influenced by new officers, for example chief executive officers and finance directors, importing VBM techniques that they had used in organizations where they had previously worked. Research limitations/implications – This is a small-scale study, and further work would be needed to develop the findings. Practical implications – Understanding and theorising the adoption of new management techniques helps in understanding how businesses are managed. Originality/value – This research adds further evidence of the value of studying management accounting, and more specifically management accounting change, in practice. It traces the developments in the adoption of a new technique and hence how a technique becomes accepted in practice.
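For context, the classical S-shape referred to in the findings is commonly modelled in diffusion-of-innovations research by a logistic curve; the standard form below is from diffusion theory generally, not a formula given in the paper:

```latex
% Cumulative adopters over time: K = saturation level (potential adopters),
% r = adoption rate, t_0 = inflection point where adoption is fastest.
N(t) = \frac{K}{1 + e^{-r\,(t - t_0)}}
```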
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, companies must equip themselves with intelligent tools that enable managers to make better tactical decisions. However, implementing an intelligent system is always a challenge in small and medium-sized enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, to make sound decisions for continual process improvement.
Abstract:
Our understanding of the nature of competitive advantage has not been helped by a tendency for theorists to adopt a unitary position, suggesting, for example, that advantage is industry-based or resource-based. In examining the nature of competitive advantage in an electronic business (e-business) environment, this paper adopts a contingency perspective. Several intriguing questions emerge. Do 'new economy' companies have different resource profiles from 'old economy' companies? Are the patterns of resource development and accumulation different? Are attained advantages less sustainable for e-businesses? These are the kinds of themes examined in this paper. The literature on competitive advantage is reviewed, as are the challenges posed by recent changes in the business environment. Two broad sets of firms are identified as emerging from the e-business shake-up, and the resource profiles of these firms are discussed. Several research propositions are advanced, and the implications for research and practice are discussed.
Abstract:
When applying multivariate analysis techniques in information systems and social science disciplines, such as management information systems (MIS) and marketing, the assumption that the empirical data originate from a single homogeneous population is often unrealistic. When applying a causal modeling approach, such as partial least squares (PLS) path modeling, segmentation is a key issue in coping with heterogeneity in estimated cause-and-effect relationships. This chapter presents a new PLS path modeling approach which classifies units on the basis of the heterogeneity of the estimates in the inner model. If unobserved heterogeneity significantly affects the estimated path model relationships at the aggregate data level, the methodology allows homogeneous groups of observations to be created that exhibit distinctive path model estimates. The approach thus provides differentiated analytical outcomes that permit more precise interpretations of each segment formed. An application to a large data set from the American Customer Satisfaction Index (ACSI) example substantiates the methodology's effectiveness in evaluating PLS path modeling results.
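The segmentation idea can be illustrated outside the full PLS machinery. The sketch below is not the chapter's algorithm: it uses a two-segment mixture of linear regressions fitted by EM, so that units are probabilistically assigned to segments according to heterogeneous cause-and-effect (regression) coefficients, mirroring the way the approach classifies units by inner-model heterogeneity. All data are simulated.

```python
import numpy as np

# Minimal EM for a two-segment mixture of linear regressions (illustrative
# stand-in for segmentation on heterogeneous path coefficients).
rng = np.random.default_rng(0)

# Simulate two latent segments with different effect coefficients
n = 400
x = rng.normal(size=n)
seg = rng.integers(0, 2, size=n)             # true (unobserved) segment labels
beta_true = np.where(seg == 0, 0.8, -0.5)    # heterogeneous effects
y = beta_true * x + rng.normal(scale=0.3, size=n)

K = 2
beta = rng.normal(size=K)                    # segment-specific coefficients
sigma = np.ones(K)
pi = np.full(K, 1.0 / K)                     # mixing proportions

for _ in range(100):
    # E-step: posterior probability that each unit belongs to each segment
    resid = y[:, None] - x[:, None] * beta[None, :]
    dens = pi * np.exp(-0.5 * (resid / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted least squares per segment
    for k in range(K):
        w = resp[:, k]
        beta[k] = np.sum(w * x * y) / np.sum(w * x * x)
        sigma[k] = np.sqrt(np.sum(w * (y - beta[k] * x) ** 2) / np.sum(w))
    pi = resp.mean(axis=0)

print("estimated segment coefficients:", np.round(beta, 2))  # approx 0.8 and -0.5
print("mixing proportions:", np.round(pi, 2))
```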
Abstract:
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on representing the geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain, we have a requirement to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. Previous studies of the application of spatial rules (often called 'business logic') in GIS have emphasised the representation of topological constraints and data integrity checks. There is very little GIS literature describing models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules change often and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1.
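As a minimal sketch of evaluating one such parameterized offset rule (the GML feature-property representation is omitted; the geometries, parameter names and the use of the shapely library are illustrative assumptions, not the paper's implementation):

```python
from shapely.geometry import LineString

# Hypothetical parameterized rule: a give-way line must lie 1100-3000 mm
# from the edge of the crossing pattern (held as parameters, not hard-coded).
rule = {"min_offset_mm": 1100.0, "max_offset_mm": 3000.0}

crossing_edge = LineString([(0, 0), (10000, 0)])       # coordinates in mm
give_way_line = LineString([(0, 2000), (10000, 2000)])

def check_offset_rule(subject, reference, rule):
    """Check a parameterized offset constraint between two linear features."""
    d_min = subject.distance(reference)            # closest approach
    d_max = subject.hausdorff_distance(reference)  # largest deviation
    # For markings of comparable extent these two values bound the offset.
    return rule["min_offset_mm"] <= d_min and d_max <= rule["max_offset_mm"]

print(check_offset_rule(give_way_line, crossing_edge, rule))  # True: offset 2000 mm
```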
Abstract:
When constructing and using environmental models, many of the inputs will typically not be known perfectly. In some cases it is possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often unavailable or even impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is carried out within a Bayesian framework then prior distributions must be specified. One option for gathering, or at least estimating, this information is expert elicitation. Expert elicitation is well studied within statistics and psychology and involves assessing the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to reach a consensus decision within the group. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables, and a system designed to elicit uncertainty about categorical random variables in the setting of land-cover classification uncertainty. The first of these is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical land-cover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated land-cover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible land-cover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
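A core step behind a SHELF-style continuous elicitation tool is turning a handful of elicited quantiles into a full probability distribution. The sketch below is not necessarily the fitting used in UncertWeb; the elicited values and the choice of a lognormal are illustrative assumptions. It fits a distribution to three elicited percentiles by least squares:

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical elicited judgements from one expert: the 5th, 50th and 95th
# percentiles of an uncertain model input (values invented for illustration).
probs = np.array([0.05, 0.50, 0.95])
elicited = np.array([2.0, 5.0, 11.0])

def loss(params):
    """Squared mismatch between elicited quantiles and a lognormal's quantiles."""
    mu, sigma = params
    q = stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu))
    return np.sum((q - elicited) ** 2)

res = optimize.minimize(loss, x0=[np.log(5.0), 0.5],
                        bounds=[(None, None), (1e-3, None)])
mu, sigma = res.x
print(f"fitted lognormal: mu={mu:.3f}, sigma={sigma:.3f}")

# The fitted distribution can then be shown back to the expert for feedback,
# e.g. by reporting its quartiles:
print(stats.lognorm.ppf([0.25, 0.75], s=sigma, scale=np.exp(mu)))
```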
Abstract:
Since 1988, quasi-markets have been introduced into many areas of social policy in the UK; the NHS internal market is one example. Markets operate by price signals, and the NHS internal market, if it is to operate efficiently, requires purchasers and providers to respond to them. The research hypothesis is that cost accounting methods can be developed to enable healthcare contracts to be priced on a cost basis in a manner that will facilitate the achievement of economic efficiency in the NHS internal market. Surveys of hospitals in 1991 and 1994 established the cost methods adopted in deriving the prices for healthcare contracts in the first year of the market and three years on. An in-depth view of the costing-for-pricing process was gained through case studies. Hospitals had inadequate cost information on which to price healthcare contracts at the inception of the internal market: prices did not reflect the relative performance of healthcare providers closely enough to enable the market's espoused efficiency aims to be achieved. Price variations were often due to differing costing approaches rather than efficiency. Furthermore, price comparisons were often meaningless because of inadequate definition of the services (products). In April 1993, the NHS Executive issued guidance on costing for contracting to all NHS providers in an attempt to improve the validity of price comparisons between alternative providers. The case studies and the 1994 survey show that although price comparison has improved, considerable problems remain. Consistency is not assured, and the problem of adequate product definition is still to be solved. Moreover, the case studies clearly highlight the mismatch between rigid, full-cost pricing rules and both the financial management considerations at local level and the emerging internal market(s). Incentives exist to cost-shift, and healthcare prices can easily be manipulated. In the search for a new health policy paradigm to replace traditional bureaucratic provision, cost-based pricing cannot be used to ensure a more efficient allocation of healthcare resources.
Abstract:
We introduce the concept of discriminating a noncoherent optical pulse from a coherent (or partially coherent) signal of the same energy using the phenomenon of soliton generation. The impact of randomization of the optical signal content on the observable characteristics of soliton generation is examined and quantified for the particular example of a rectangular pulse.
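For context, the soliton content of such pulses is governed by Zakharov-Shabat scattering theory for the focusing nonlinear Schrödinger equation; for a rectangular pulse a classical textbook result (not a derivation from the paper) ties the soliton count to the pulse area:

```latex
% Focusing NLSE in normalized form:
i\,\frac{\partial q}{\partial z} + \frac{1}{2}\,\frac{\partial^2 q}{\partial t^2} + |q|^2 q = 0
% For a rectangular input q(0,t) = A on |t| < T/2, the soliton count is set
% by the pulse area S = AT: the first soliton appears for S > \pi/2, and
N = \left\lfloor \tfrac{1}{2} + \tfrac{S}{\pi} \right\rfloor
```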
Abstract:
Almost all manufacturers offer services, but some use these as the basis for their competitive strategy. This is a growing area of interest among practitioners, policy makers, and academics, yet little is known about the adoption of servitization by UK manufacturers. In this paper a survey is presented that has been used to explore the extent, motivations, challenges, and successes of servitization within the business-to-business sector. The findings indicate, for example, that many manufacturers are succeeding with their service strategies, that they are attracted to these as a source of customer focus and revenue growth, and that such strategies require less organizational change than might be expected. Although the findings from the survey should be treated as preliminary, and further work is needed to confirm their reliability and insight, they indicate that servitization is proving to be a powerful competitive weapon for many companies.
Abstract:
Introduction: Lower back pain treatment and compensation cost more than $80 billion overall in the US. 75% of back pain is due to disc degeneration in the lumbar region of the spine. Current treatment comprises painkillers and bed rest or, as a more radical solution, interbody cage fusion. In the early stages of disc degeneration the patient would benefit from the addition of an injectable gel which polymerises in situ to support the degenerated nucleus pulposus. This requires a material which is an analogue of the natural tissue, capable of restoring the biomechanical properties of the natural disc. The nucleus pulposus of the intervertebral disc is an example of a natural proteoglycan, consisting of a protein core with negatively charged keratan sulphate and chondroitin sulphate attached. As a result of the high fixed charge density of the proteoglycan, the matrix exerts an osmotic swelling pressure, drawing in sufficient water to support the spinal system. Materials and Methods: NaAMPS (the sodium salt of 2-acrylamido-2-methylpropane sulphonic acid) and KSPA (potassium 3-sulphopropyl acrylate) were selected as monomers, the sulphonate group being used to mimic the natural sulphate group. These are used in dermal applications involving chronic wounds and have acceptably low cytotoxicity. Other hydrophilic carboxyl, amide and hydroxyl monomers such as 2-hydroxyethyl acrylamide, β-carboxyethyl acrylate, acryloyl morpholine, and polyethylene glycol (meth)acrylate were used as diluents, together with polyethylene glycol di(meth)acrylate and hydrophilic multifunctional macromers as cross-linkers. Redox was the chosen method of polymerisation and a range of initiators was investigated. Components were packaged in two solutions, each containing one half of a redox pair. A dual-syringe method of injection into the cavity was used; the required time for polymerisation is circa 3-7 minutes. The final materials were tested using a Bohlin CVO rheometer, cycling from 0.5 to 25 Hz at 37 °C, to measure the modulus. An in-house compression testing method was developed using dialysis tubing to mimic the cavity; the gels were swelled in solutions of various osmolarity and compressed to ~20%. The pre-gel has also been injected into sheep spinal segments for mechanical compression testing to demonstrate the restoration of properties upon use of the gel. Results and Discussion: Two systems resulted, using similar monomer compositions but different initiation and crosslinking agents. NaAMPS and KSPA were used together at a ratio of ~1:1 in both systems, with 0.25-2% crosslinking agent, diacrylate or dimethacrylate. The two initiation systems were ascorbic acid/Oxone and N,N,N′,N′-tetramethylethylenediamine (TEMED)/potassium persulphate. These systems produced gelation within 3-7 and 3-5 minutes respectively. The two-component systems were shown to be stable in storage for approximately one month after mixing, in the dark, refrigerated at 1-4 °C. The gelation was carried out at 37 °C. Literature values for the natural disc give elastic constants ranging from 3 to 8 kPa. The properties of the polymer can be tailored by altering crosslink density and monomer composition, and can be matched to those of the natural disc. It is possible to incorporate a radio-opaque agent (Histodenz) to make the gel visible under X-ray during and after injection. At an inclusion level of 5% the gel is clearly visible, and polymerisation and mechanical properties are not altered. Conclusion: A two-pack injection system which polymerises in situ and can incorporate a radio-opaque agent has been developed. This will reinforce the damaged nucleus pulposus in degenerative disc disease, restoring adequate hydration and thus biomechanical properties. Tests on sheep spine segments are currently being carried out to demonstrate that a disc containing the gel has properties similar to those of an intact disc, in contrast to one with a damaged nucleus.
Abstract:
We provide an overview of our recent work on the shaping and stability of optical continua in the long-pulse regime. Fibers with normal group-velocity dispersion at all wavelengths are shown to allow highly coherent continua that can be nonlinearly shaped using appropriate initial conditions. In contrast, supercontinua generated in the anomalous dispersion regime are shown to exhibit large fluctuations in the temporal and spectral domains that can be controlled using a carefully chosen seed. A particular example of this is the first experimental observation of the Peregrine soliton, which constitutes a prototype of optical rogue waves.
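For reference, the Peregrine soliton mentioned above has a simple closed form: it is the rational solution of the focusing NLSE, localized in both time and propagation distance, that peaks at three times its background (standard form reproduced for context, not taken from the paper):

```latex
% With the NLSE normalized as  i\psi_z + \tfrac{1}{2}\psi_{tt} + |\psi|^2\psi = 0:
\psi(z,t) = \left[\, 1 - \frac{4\,(1 + 2iz)}{1 + 4t^2 + 4z^2} \,\right] e^{iz}
% |\psi(0,0)| = 3 on a unit background; |\psi| \to 1 as |t|, |z| \to \infty.
```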
Abstract:
Monitoring land-cover changes on sites of conservation importance allows environmental problems to be detected, solutions to be developed and the effectiveness of actions to be assessed. However, the remoteness of many sites or a lack of resources means these data are frequently not available. Remote sensing may provide a solution, but large-scale mapping and change detection may not be appropriate, necessitating site-level assessments. These need to be easy to undertake, rapid and cheap. We present an example of a Web-based solution built on free and open-source software and standards (including PostGIS, OpenLayers, Web Map Services, Web Feature Services and GeoServer) to support assessments of land-cover change (and validation of global land-cover maps). Authorised users are provided with means to assess land-cover visually and may optionally provide uncertainty information at various levels: from a general rating of their confidence in an assessment to a quantification of the proportions of land-cover types within a reference area. Versions of this tool have been developed for the TREES-3 initiative (Simonetti, Beuchle and Eva, 2011), which monitors tropical land-cover change through ground-truthing at latitude/longitude degree confluence points, and for monitoring change within and around Important Bird Areas (IBAs) by BirdLife International and the Royal Society for the Protection of Birds (RSPB). In this paper we present results from the second of these applications. We also present further details on the potential use of the land-cover change assessment tool on sites of recognised conservation importance, in combination with NDVI and other time-series data from the eStation (a system for receiving, processing and disseminating environmental data). We show how the tool can be used to increase the usability of earth observation data by local stakeholders and experts, and assist in evaluating the impact of protection regimes on land-cover change.
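The stack described above serves standard OGC requests. As a minimal illustration of the kind of call such a tool issues to fetch imagery for visual assessment (the endpoint URL and layer name are hypothetical; the parameters are standard WMS 1.1.1):

```python
import requests

# Minimal WMS 1.1.1 GetMap request against a GeoServer/PostGIS stack.
# Endpoint and layer name are hypothetical placeholders.
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "landcover:iba_assessments",  # hypothetical layer
    "bbox": "33.0,-2.5,34.0,-1.5",          # lon/lat window (minx,miny,maxx,maxy)
    "srs": "EPSG:4326",
    "width": "512",
    "height": "512",
    "format": "image/png",
}
resp = requests.get("https://example.org/geoserver/wms", params=params, timeout=30)
resp.raise_for_status()
with open("assessment_window.png", "wb") as f:
    f.write(resp.content)  # map image for visual land-cover assessment
```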
Abstract:
Data envelopment analysis (DEA) has proven to be an excellent data-oriented efficiency analysis method for comparing decision making units (DMUs) with multiple inputs and multiple outputs. In conventional DEA, it is assumed that the status of each measure is clearly known as either input or output. However, in some situations a performance measure can play an input role for some DMUs and an output role for others. Cook and Zhu [Eur. J. Oper. Res. 180 (2007) 692–699] referred to these variables as flexible measures. This paper proposes an alternative model in which each flexible measure is treated as either an input or an output variable so as to maximize the technical efficiency of the DMU under evaluation. The main focus is on the impact that flexible measures have on the definition of the production possibility set (PPS) and the assessment of technical efficiency. An example from UK higher education institutions shows the applicability of the proposed approach.
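A minimal sketch of the idea, not the authors' exact model: solve an input-oriented CCR envelopment LP twice per DMU, once with the flexible measure as an input and once as an output, and keep the classification that maximizes efficiency. All data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (X: m-by-n inputs, Y: s-by-n outputs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                 # variables: [theta, lambda_1..lambda_n]
    A_in = np.hstack([-X[:, [j0]], X])         # sum_j lam_j x_ij <= theta * x_i,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum_j lam_j y_rj >= y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun                             # theta* in (0, 1]

def flexible_efficiency(X, Y, f, j0):
    """Treat flexible measure f as input or output, whichever maximizes efficiency."""
    as_input = ccr_efficiency(np.vstack([X, f]), Y, j0)
    as_output = ccr_efficiency(X, np.vstack([Y, f]), j0)
    return max(as_input, as_output), ("input" if as_input > as_output else "output")

# Invented data: 5 DMUs, one input (e.g. staff), one output (e.g. graduates),
# one flexible measure (e.g. research income).
X = np.array([[20.0, 30.0, 40.0, 25.0, 35.0]])
Y = np.array([[100.0, 120.0, 180.0, 90.0, 150.0]])
f = np.array([[5.0, 9.0, 4.0, 8.0, 6.0]])

for j in range(X.shape[1]):
    eff, role = flexible_efficiency(X, Y, f, j)
    print(f"DMU {j}: efficiency {eff:.3f} with flexible measure as {role}")
```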
Abstract:
Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing the logistics cost by using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates a fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights about how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Given the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of an optimal transshipment network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors are considered simultaneously under a fuzzy environment, and that the viewpoints of both service deliverers and customers are addressed. It is therefore believed to be useful and applicable for transshipment service network design.
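A minimal sketch of the flavour of such a method, not the paper's actual model: criterion weights are derived from a triangular-fuzzy pairwise comparison by centroid defuzzification and geometric-mean AHP, then used to blend cost and service scores into the arc costs of a small transshipment problem. A plain LP stands in for the paper's integer model, and all numbers and the network structure are invented.

```python
import numpy as np
from scipy.optimize import linprog

# --- Step 1: fuzzy AHP weights for two criteria: cost vs. service quality ---
# Triangular fuzzy judgement "cost is moderately more important than service":
# (l, m, u) = (1, 2, 3); its reciprocal is (1/3, 1/2, 1).
tfn = {("cost", "service"): (1.0, 2.0, 3.0),
       ("service", "cost"): (1/3, 1/2, 1.0)}

def defuzzify(t):  # centroid of a triangular fuzzy number
    return sum(t) / 3.0

criteria = ["cost", "service"]
M = np.array([[1.0 if i == j else defuzzify(tfn[(a, b)])
               for j, b in enumerate(criteria)]
              for i, a in enumerate(criteria)])
gm = M.prod(axis=1) ** (1.0 / len(criteria))  # geometric-mean AHP
w = gm / gm.sum()                             # approx [0.64, 0.36]

# --- Step 2: transshipment LP with weighted composite arc costs ---
# 2 plants -> 1 hub -> 2 customers (invented). Normalized unit costs and
# (1 - service score) per arc, blended by the FAHP weights.
cost  = np.array([0.4, 0.6, 0.5, 0.3])        # arcs: p1-hub, p2-hub, hub-c1, hub-c2
badsv = np.array([0.2, 0.1, 0.3, 0.4])        # 1 - service score per arc
c = w[0] * cost + w[1] * badsv

supply = [60.0, 40.0]                         # plant capacities
demand = [50.0, 50.0]                         # customer demands
A_eq = np.array([
    [1, 1, -1, -1],                           # hub flow conservation
    [0, 0, 1, 0],                             # customer 1 demand
    [0, 0, 0, 1],                             # customer 2 demand
])
b_eq = [0.0, demand[0], demand[1]]
A_ub = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])  # plant capacity rows
res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print("flows:", res.x, "composite objective:", round(res.fun, 2))
```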