198 results for Customization


Relevance: 10.00%

Abstract:

Purpose. This article explores the experiences of 26 assistive technology (AT) users with a range of physical impairments as they optimized their use of technology in the workplace. Method. A qualitative research design was employed, using in-depth, open-ended interviews and observations of AT users in the workplace. Results. Participants identified many factors that limited their use of technology, such as discomfort and pain, limited knowledge of the technology's features, and the complexity of the technology. The amount of time required for training, the limited work time available for mastery, the cost of training, and the limitations of the training provided resulted in an over-reliance on trial and error and on informal support networks, and in a sense of isolation. AT users enhanced their use of technology by addressing the ergonomics of the workstation and customizing the technology to address individual needs and strategies. Other key strategies included tailored training and learning support, as well as opportunities to practice using the technology and explore its features away from work demands. Conclusions. This research identified structures important for effective AT use in the workplace that need to be put in place to ensure that AT users are able to master and optimize their use of technology.

Relevance: 10.00%

Abstract:

In the modern world, comfort and convenience are factors directly linked to people's everyday needs: time is increasingly scarce, and the search for convenience has become a constant. Given this scenario, the growing number of food-court patrons is matched by the intense competition waged by the fast-food companies established in these shopping centers. In this context, this study identifies and analyzes the determinants of service quality in fast-food restaurants from the consumers' perspective. The fast-food sector was divided into three categories, themed restaurants, pay-by-weight restaurants, and snack bars, in order to identify possible differences among these segments. The research universe comprises consumers of the fast-food restaurants located in the main shopping center of the city of Mauá. The research is descriptive and exploratory; data were collected through a questionnaire based on the SERVQUAL instrument, administered to a non-probabilistic sample of 390 food-court users. Factor analysis was performed with the statistical software SPSS v19, extracting five determinants of fast-food restaurant service quality. Among the extracted factors, those linked to Excellence and to Personalization of services stand out. The analyzed segments showed convergent results, such as high consumer expectations and a deficit in the perceived quality of fast-food restaurant services, as well as divergences in their individual analyses, revealing peculiarities of each segment.
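A minimal sketch of the gap computation that SERVQUAL-based studies like this one rest on: perceived-minus-expected scores per quality dimension. The dimension names and scores below are invented for illustration; the study's actual instrument, sample, and SPSS factor analysis are not reproduced here.

```python
# Hypothetical illustration of the SERVQUAL gap score (perception - expectation).
# Dimension names and Likert-scale scores are invented for the example.
import numpy as np

dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]

# Rows: respondents; columns: the five SERVQUAL dimensions (1-7 scale).
expectations = np.array([[6.8, 6.9, 6.5, 6.7, 6.4],
                         [6.5, 7.0, 6.8, 6.6, 6.2]])
perceptions  = np.array([[5.9, 5.5, 5.8, 6.0, 5.7],
                         [5.6, 5.8, 5.4, 6.1, 5.3]])

# Negative mean gaps indicate a deficit in perceived service quality,
# matching the pattern the study reports for fast-food restaurants.
gaps = (perceptions - expectations).mean(axis=0)
for name, gap in zip(dimensions, gaps):
    print(f"{name}: {gap:+.2f}")
```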

Relevance: 10.00%

Abstract:

Purpose: Short product life cycles and/or mass customization necessitate reconfiguring the operational enablers of a supply chain (SC) from time to time in order to harness high levels of performance. The purpose of this paper is to identify the key operational enablers under a stochastic environment on which practitioners should focus while reconfiguring a SC network. Design/methodology/approach: The paper uses an interpretive structural modeling (ISM) approach, which yields a hierarchy-based model and the mutual relationships among the enablers. The contextual relationships needed for developing the structural self-interaction matrix (SSIM) among the various enablers are obtained by conducting experiments through simulation of a hypothetical SC network. Findings: The research identifies various operational enablers having a high driving power towards the assumed performance measures. These enablers require maximum attention and are of strategic importance when reconfiguring the SC. Practical implications: ISM provides SC managers with a useful tool for strategically focusing on the key enablers that have comparatively greater potential to enhance SC performance under given operational settings. Originality/value: The present research recognizes the importance of SC flexibility under the premise of reconfiguring the operational units in order to harness high SC performance. Given the digraph resulting from ISM, the decision maker can focus on the key enablers for effective reconfiguration. The study is one of the first efforts to develop contextual relations among operational enablers for the SSIM through the integration of discrete event simulation with ISM. © Emerald Group Publishing Limited.
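A minimal sketch of the ISM step that follows the SSIM: computing the reachability matrix by transitive closure and ranking enablers by driving power (row sums) and dependence (column sums). The relation matrix below is invented, not the paper's data.

```python
# Hypothetical sketch of the core ISM computation: from an initial binary
# relation matrix (derived from the SSIM), compute the final reachability
# matrix by transitive closure, then rank enablers by driving power.
import numpy as np

# 1 in cell (i, j) means "enabler i influences enabler j"; invented data.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]], dtype=bool)

R = A.copy()
for k in range(len(R)):           # Warshall's algorithm for transitive closure
    R = R | (R[:, [k]] & R[[k], :])

driving_power = R.sum(axis=1)     # row sum: how many enablers each one reaches
dependence    = R.sum(axis=0)     # column sum: how many enablers reach it
for i, (dp, dep) in enumerate(zip(driving_power, dependence)):
    print(f"enabler {i}: driving power={dp}, dependence={dep}")
```

Enablers with high driving power and low dependence are the ones the paper flags as strategically important, since they sit at the base of the resulting ISM hierarchy.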

Relevance: 10.00%

Abstract:

In recent years, the Web has become a mainstream medium for communication and information dissemination. This paper presents approaches and methods for implementing adaptive learning that are used in some contemporary web-interfaced Learning Management Systems (LMSs). The problem is not how to create electronic learning materials, but how to locate and utilize the available information in a personalized way. Different attitudes to personalization are briefly described in Section 1. Real personalization requires a user profile, containing information about preferences, aims, and educational history, to be stored and used by the system. These issues are considered in Section 2. A method for the development and design of adaptive learning content in terms of learning strategy system support is presented in Section 3. Section 4 includes a set of innovative personalization services suggested by several important research projects of the last few years (the SeLeNe project, the ELENA project, etc.). This section also describes a model for role- and competency-based learning customization that uses a Web Services approach. The last part presents how personalization techniques are implemented in Learning Grid-driven applications.
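A minimal sketch of the kind of user profile such systems store, with preferences, aims, and educational history driving content selection. Field names, the matching rule, and the catalog are hypothetical; none of the cited projects' actual schemas are reproduced.

```python
# Hypothetical sketch of a learner profile used to drive personalization:
# preferences, aims, and educational history, as described above.
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    user_id: str
    preferences: dict = field(default_factory=dict)   # e.g. {"media": "video"}
    aims: list = field(default_factory=list)          # target competencies
    history: list = field(default_factory=list)       # completed learning objects

    def recommend(self, catalog: dict) -> list:
        """Return catalog items matching an aim and not yet completed."""
        return [item for item, tags in catalog.items()
                if set(tags) & set(self.aims) and item not in self.history]

profile = LearnerProfile("u42", aims=["sql-basics"], history=["intro-db"])
catalog = {"intro-db": ["databases"], "joins-101": ["sql-basics"]}
print(profile.recommend(catalog))   # -> ['joins-101']
```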

Relevance: 10.00%

Abstract:

In the contemporary business environment, the need to adhere to customer requirements has caused a shift from mass production to mass customization. This requires the supply chain (SC) to be effectively flexible. The purpose of this paper is to seek flexibility through the adoption of family-based dispatching rules under the influence of the inventory system implemented at the downstream echelons of an industrial supply chain network. We compare the family-based dispatching rules in the existing literature under the purview of the inventory system and of information sharing within a supply chain network. The dispatching rules are compared on Average Flow Time performance, averaged over the three product families, measured using an extensive discrete event simulation process. Given the various inventory-related operational factors at the downstream echelons, the present paper highlights the importance of strategically adopting an appropriate family-based dispatching rule at the manufacturing end. In a mass customization environment, it becomes imperative to adopt the family-based dispatching rule from a system-wide SC perspective, which warrants intra- as well as inter-echelon information coordination. The holonic paradigm emerges in this research stream, combining a holistic with a systemic approach. The novelty of the present research is threefold. First, it gives managers leverage to strategically adopt a dispatching rule from the inventory system perspective. Second, the findings provide direction for attenuating the adverse impact of demand amplification (the bullwhip effect) on inventory levels by appropriately adopting a family-based dispatching rule. Third, the information environment is conceptualized under the paradigm of Koestler's holonic theory.
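A minimal sketch of a family-based dispatching decision of the kind the paper compares: prefer jobs from the product family currently set up on the machine (avoiding a setup), falling back to first-come-first-served. The queue data and this particular setup-avoidance rule are illustrative assumptions, not the paper's specific rules.

```python
# Hypothetical sketch of family-based dispatching: exhaust the family
# currently set up before switching, FCFS within the chosen set.
from collections import namedtuple

Job = namedtuple("Job", ["job_id", "family", "arrival_time"])

def next_job(queue, current_family):
    """Pick the next job, avoiding a family setup when possible."""
    same_family = [j for j in queue if j.family == current_family]
    candidates = same_family if same_family else queue
    return min(candidates, key=lambda j: j.arrival_time)

queue = [Job("J1", "A", 0.0), Job("J2", "B", 1.0), Job("J3", "A", 2.0)]
print(next_job(queue, current_family="B"))  # -> J2: no family setup needed
```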

Relevance: 10.00%

Abstract:

Productivity measurement poses a challenge for service organizations. Conventional management wisdom holds that this challenge is rooted in the difficulty of accurately quantifying service inputs and outputs. Few service firms have adequate service productivity measurement (SPM) systems in place, and implementing such systems may involve organizational transformation. Combining field interviews and literature-based insights, the authors develop a conceptual model of antecedents of SPM in service firms and test it using data from 276 service firms. Results indicate that one out of five antecedents affects the choice to use SPM, namely, the degree of service standardization. In addition, all five hypothesized antecedents and one additional antecedent (perceived appropriateness of the current SPM) predict the degree of SPM usage. In particular, the degree of SPM usage is positively influenced by the degree of service standardization, service customization, investments in service productivity gains, and the appropriateness of current service productivity measures. In turn, customer integration and the perceived difficulty of measuring service productivity negatively affect SPM usage. The fact that customer integration impedes actual measurement of service productivity is a surprising finding, given that customer integration is widely seen as a means to increase service productivity. The authors conclude with implications for service organizations and directions for research.
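A minimal sketch of the second part of the modeling such a study implies: regressing the degree of SPM usage on antecedents such as standardization and customization. The data, variable names, and coefficients below are entirely fabricated for illustration; the paper's actual measures and estimation method are not reproduced.

```python
# Hypothetical illustration: degree of SPM usage regressed on two invented
# antecedents via ordinary least squares. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 276  # the paper's sample size, used here only to shape fake data
standardization = rng.uniform(1, 7, n)
customization   = rng.uniform(1, 7, n)
# Fake outcome: positive effects of both antecedents, plus noise.
spm_usage = 0.5 * standardization + 0.3 * customization + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), standardization, customization])
coef, *_ = np.linalg.lstsq(X, spm_usage, rcond=None)
print(dict(zip(["intercept", "standardization", "customization"],
               np.round(coef, 2))))
```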

Relevance: 10.00%

Abstract:

Because some Web users are able to design a template to visualize information from scratch, while other users need information visualized automatically by changing some parameters, providing different levels of customization of the information is a desirable goal. Our system allows the automatic generation of visualizations given the semantics of the data, and static or pre-specified visualization through an interface language. We address information visualization with the Web in mind, where the presentation of retrieved information is a challenge. We provide a model that narrows the gap between the user's way of expressing queries and database manipulation languages (SQL) without changing the system itself, thus improving the query specification process. We develop a Web interface model, integrated with the HTML language, to create a powerful language that facilitates the construction of Web-based database reports. In contrast to other work, this model offers a new way of exploring databases, focusing on providing Web connectivity to databases with minimal or no result buffering, formatting, or extra programming. We describe how to easily connect a database to the Web. In addition, we offer an enhanced way of viewing and exploring the contents of a database, allowing users to customize their views depending on the contents and the structure of the data. Current database front-ends typically attempt to display database objects in a flat view, making it difficult for users to grasp the contents and structure of their result. Our model narrows the gap between databases and the Web. The overall objective of this research is to construct a model that accesses different databases easily across the net and generates SQL, forms, and reports across all platforms without requiring the developer to code a complex application. This increases the speed of development. In addition, using only a Web browser, the end user can retrieve data from databases remotely and make the necessary modifications and manipulations of data using Web-formatted forms and reports, independent of the platform, without having to open different applications or learn to use anything but their Web browser. We introduce a strategic method to generate and construct SQL queries, enabling inexperienced users who are not well versed in SQL to build syntactically and semantically valid SQL queries and to understand the retrieved data. The generated SQL query is validated against the database schema to ensure safe and efficient SQL execution. (Abstract shortened by UMI.)
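A minimal sketch of guided SQL generation with schema validation, in the spirit described above: user selections are checked against a known schema before a parameterized query is produced. The schema, table, and column names are hypothetical; this is not the dissertation's actual interface language.

```python
# Hypothetical sketch: validate user selections against the schema, then
# emit parameterized SQL so only safe, well-formed queries reach the database.
SCHEMA = {"orders": {"id", "customer", "total"}}  # invented example schema

def build_query(table, columns, filters):
    """Return (sql, params), or raise if the request violates the schema."""
    if table not in SCHEMA:
        raise ValueError(f"unknown table: {table}")
    bad = [c for c in list(columns) + list(filters) if c not in SCHEMA[table]]
    if bad:
        raise ValueError(f"unknown columns: {bad}")
    where = " AND ".join(f"{c} = ?" for c in filters)  # placeholders, not values
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql, list(filters.values())

print(build_query("orders", ["id", "total"], {"customer": "Acme"}))
# -> ('SELECT id, total FROM orders WHERE customer = ?', ['Acme'])
```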

Relevance: 10.00%

Abstract:

In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The model is intended to model, analyze, and improve the supply chain management operations of an enterprise. It provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and offers tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated using a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain that integrates multiple perspectives and provides a systematic procedure for the improvement of a company's supply chain operations.

Relevance: 10.00%

Abstract:

Digital systems can generate left and right audio channels that create the effect of virtual sound source placement (spatialization) by processing an audio signal through pairs of Head-Related Transfer Functions (HRTFs) or, equivalently, Head-Related Impulse Responses (HRIRs). The spatialization effect is better when individually measured HRTFs or HRIRs are used than when generic ones (e.g., from a mannequin) are used. However, the measurement process is not available to the majority of users, so there is ongoing interest in mechanisms to customize HRTFs or HRIRs to a specific user in order to achieve an improved spatialization effect for that subject. Unfortunately, the current models used for HRTFs and HRIRs contain over a hundred parameters, none of which can be easily related to the characteristics of the subject. This dissertation proposes an alternative model for the representation of HRTFs, which contains at most 30 parameters, all of which have a defined functional significance. It also presents methods to obtain the values of the parameters in the model that make it approximately equivalent to an individually measured HRTF. This conversion is achieved by the systematic deconstruction of HRIR sequences through an augmented version of the Hankel Total Least Squares (HTLS) decomposition approach. An average 95% match (fit) was observed between the original HRIRs and those reconstructed from the Damped and Delayed Sinusoids (DDSs) found by the decomposition process, for ipsilateral source locations. The dissertation also introduces and evaluates an HRIR customization procedure, based on a multilinear model implemented through a 3-mode tensor, for mapping anatomical data from the subjects to the HRIR sequences at different sound source locations. This model uses the Higher-Order Singular Value Decomposition (HOSVD) method to represent the HRIRs and is capable of generating customized HRIRs from easily attainable anatomical measurements of a new intended user of the system. Listening tests were performed to compare the spatialization performance of customized, generic, and individually measured HRIRs when used for synthesized spatial audio. Statistical analysis of the results confirms that the type of HRIRs used for spatialization is a significant factor in spatialization success, with the customized HRIRs yielding better results than generic HRIRs.
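A minimal sketch of the damped-and-delayed-sinusoid (DDS) representation at the heart of the proposed model: an HRIR approximated as a short sum of delayed, exponentially damped sinusoids. All parameter values below are invented; in the dissertation they come from the HTLS-based decomposition, which is not shown here.

```python
# Hypothetical sketch of the DDS model: an HRIR built from a few delayed,
# exponentially damped sinusoids. Parameters are invented for illustration.
import numpy as np

fs = 44100                     # sample rate (Hz)
n = np.arange(256)             # HRIR length in samples

def dds(amplitude, damping, freq_hz, phase, delay_samples):
    """One damped, delayed sinusoid evaluated on the sample grid."""
    t = n - delay_samples
    active = t >= 0            # the component is silent before its delay
    return np.where(active,
                    amplitude * np.exp(-damping * t)
                    * np.cos(2 * np.pi * freq_hz * t / fs + phase),
                    0.0)

# A toy 3-component HRIR: (amplitude, damping, frequency, phase, delay).
components = [(1.0, 0.05, 3000, 0.0, 10),
              (0.4, 0.02, 7500, 1.2, 14),
              (0.2, 0.01, 12000, -0.7, 20)]
hrir = sum(dds(*c) for c in components)
print(hrir.shape, float(np.max(np.abs(hrir))))
```

The appeal of this form, as the abstract argues, is that each parameter (onset delay, decay rate, frequency) has a physical interpretation, unlike the hundred-plus opaque coefficients of conventional HRIR models.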

Relevance: 10.00%

Abstract:

Optimizing GIS capability does not always require that the municipality obtain cutting-edge professionals and resources. This paper offers a disaster risk reduction (DRR) design methodology for small towns and rural areas that employs a multi-variable classification system, enabling customization for effective DRR. Determining appropriate GIS capacity requires that a community first be evaluated in order to identify its disaster risk reduction/disaster management (DRR/DM) requirements. These requirements are then considered in conjunction with the municipality's resources to establish the desired capability. Qualification levels for the major aspects of GIS capability with respect to DRR/DM are provided, along with descriptions of each level and suggested procedures for advancement to the next level. Note that a municipality can be classified at different levels with respect to different variables: needs vary according to the community, so attainment of a uniform capability level may not be necessary.
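A minimal sketch of the multi-variable classification idea: each GIS capability variable is leveled independently, so a municipality's profile is a vector of levels rather than one uniform grade. The variables and level descriptions below are hypothetical, not the paper's actual qualification scheme.

```python
# Hypothetical sketch of a multi-variable GIS/DRR capability classification:
# each variable is leveled on its own, so one municipality can sit at
# different levels for different variables. Variables/levels are invented.
LEVELS = {
    "data":     ["paper maps", "digitized layers", "live feeds"],
    "staff":    ["none", "part-time GIS user", "dedicated analyst"],
    "hardware": ["shared PC", "GIS workstation", "server + backups"],
}

def classify(profile):
    """Map each variable's observed state to its level index and name."""
    return {var: (LEVELS[var].index(state), state)
            for var, state in profile.items()}

town = {"data": "digitized layers", "staff": "none", "hardware": "shared PC"}
for var, (level, state) in classify(town).items():
    print(f"{var}: level {level} ({state})")
```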

Relevance: 10.00%

Abstract:

The increasing emphasis on mass customization, shortened product lifecycles, and synchronized supply chains, coupled with advances in information systems, is driving most firms towards make-to-order (MTO) operations. Increasing global competition, lower profit margins, and higher customer expectations force MTO firms to plan their capacity by managing the effective demand. The goal of this research was to maximize the operational profits of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage. To integrate the two decisions, a Mixed-Integer Linear Program (MILP) was formulated that can aid an operations manager in an MTO environment in selecting a set of potential customer orders such that all the selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes, the computational time required to determine an optimal solution is prohibitive. The formulation has a block diagonal structure and can be decomposed into one or more sub-problems (one sub-problem for each customer order) and a master problem by applying Dantzig-Wolfe's decomposition principles. To solve the original MILP efficiently, an exact Branch-and-Price algorithm was developed, and various approximation algorithms were developed to further improve the runtime. The experiments conducted unequivocally show the efficiency of these algorithms compared to a commercial optimization solver. The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, with an objective to maximize profits under a penalty for tardiness. This dissertation solves the order acceptance and capacity planning problem for a job shop environment with multiple resources, considering both regular and overtime resources. The Branch-and-Price algorithms developed in this dissertation are faster and can be incorporated in a decision support system that can be used on a daily basis to help make intelligent decisions in an MTO operation.
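A minimal sketch of the order acceptance idea: a small MILP that picks a profit-maximizing subset of orders that a single resource can finish by each deadline. This deliberately simplified single-machine variant, with invented data, only illustrates the coupling of acceptance and capacity; it is not the dissertation's job-shop model, its overtime treatment, or its Branch-and-Price algorithm. Requires the PuLP library.

```python
# Hypothetical sketch: accept a subset of orders maximizing profit, subject
# to the work due by each deadline fitting into the available time on one
# unit-capacity resource (earliest-due-date feasibility). Data are invented.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum

orders = {  # order: (processing_time, deadline, profit) -- invented data
    "A": (4, 6, 10),
    "B": (3, 6, 9),
    "C": (5, 12, 7),
}

prob = LpProblem("order_acceptance", LpMaximize)
x = {o: LpVariable(f"accept_{o}", cat="Binary") for o in orders}
prob += lpSum(orders[o][2] * x[o] for o in orders)  # maximize total profit

# For every deadline d: accepted work due by d must fit in the interval [0, d].
for d in sorted({orders[o][1] for o in orders}):
    prob += lpSum(orders[o][0] * x[o] for o in orders if orders[o][1] <= d) <= d

prob.solve()
print({o: int(x[o].value()) for o in orders})  # -> {'A': 1, 'B': 0, 'C': 1}
```

Here A and B compete for the time before their common deadline, so the solver rejects the less profitable B; in the dissertation this acceptance decision is made jointly with detailed job-shop scheduling.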


Relevance: 10.00%

Abstract:

Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and enables the selection and customization of a set of variabilities that distinguish each product of the family from the others. To address time-to-market, the software industry has been using the clone-and-own technique to create and manage new software products or product lines. Despite its advantages, the clone-and-own approach brings several difficulties for the evolution and reconciliation of software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called Source, and its cloned products, called Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of different kinds of code conflicts (lexical, structural, and semantic) that can occur when integrating development tasks (bug corrections, enhancements, and new use cases) from the original evolved software product line into the cloned product line. We have also conducted an empirical study characterizing the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results of our study demonstrate the approach's potential to automatically or semi-automatically solve several existing code conflicts, thus contributing to reducing the complexity and costs of reconciling cloned software product lines.
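A minimal sketch of the lexical layer of such conflict detection: a three-way comparison that flags regions of the common base changed by both the Source and the Target side. The example files are invented, and the thesis's actual mining and conflict-analysis tooling is not reproduced.

```python
# Hypothetical sketch of lexical conflict detection between a Source product
# line and its cloned Target: diff both sides against their common base and
# flag overlapping changed regions of the base as conflicts.
import difflib

def changed_ranges(base, version):
    """Base-line ranges (i1, i2) that this version modified."""
    sm = difflib.SequenceMatcher(a=base, b=version)
    return [(i1, i2) for tag, i1, i2, _, _ in sm.get_opcodes() if tag != "equal"]

def lexical_conflicts(base, source, target):
    """Pairs of overlapping base ranges changed by both sides."""
    return [(s, t) for s in changed_ranges(base, source)
                   for t in changed_ranges(base, target)
                   if s[0] < t[1] and t[0] < s[1]]

base   = ["def pay():", "    rate = 1.0", "    return rate"]
source = ["def pay():", "    rate = 1.1", "    return rate"]   # Source edit
target = ["def pay():", "    rate = 0.9", "    return rate"]   # Target edit
print(lexical_conflicts(base, source, target))  # both changed base line 1
```

Structural and semantic conflicts, which the thesis also targets, need syntax-tree and behavior-level analysis beyond this line-based view.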

Relevance: 10.00%

Abstract:

Software product line engineering brings advantages over traditional software development regarding the mass customization of system components. However, there are scenarios in which maintaining separate clones of a software system seems an easier and more flexible approach to managing the variabilities of a software product line. This dissertation qualitatively evaluates an approach that aims to support the reconciliation of functionalities between cloned systems. The analyzed approach is based on mining data about the issues and source code of evolved cloned web systems. The next step is to process the merge conflicts collected by the approach, and not indicated by traditional version control systems, to identify potential integration problems in the cloned software systems. The results of the study show that the approach is feasible for performing a systematic characterization and analysis of merge conflicts in large-scale web-based systems.

Relevance: 10.00%

Abstract:

The uncontrolled disposal by industry of wastewaters containing phenolic compounds has caused irreversible damage to the environment, so it is now mandatory to develop new methods to treat these effluents before they are disposed of. One of the most promising and low-cost approaches is the degradation of phenolic compounds via photocatalysis. This work, in particular, has as its main goals the customization of a bench-scale photoreactor and the preparation of catalysts using char originated from the fast pyrolysis of sewage sludge. The experiments were carried out at constant temperature (50°C) under oxygen (410, 515, 650, and 750 mL min⁻¹). The reaction took place in the liquid phase (3.4 liters), where the catalyst concentration was 1 g L⁻¹, the initial concentration of phenol was 500 mg L⁻¹, and the reaction time was set to 3 hours. A 400 W lamp was fitted to the reactor. The flow of oxygen was optimized at 650 mL min⁻¹. The pH of the liquid and the nature of the catalyst (acidified and calcined palygorskite, palygorskite impregnated with 3.8% Fe, and the pyrolysis char) were investigated. The catalytic materials were characterized by XRD, XRF, and BET. In the photocatalytic degradation of phenol, the results showed that the pH has a significant influence on phenol conversion, with the best results at pH 5.5. Phenol conversion ranged from 51.78% (sewage sludge char) to 58.02% (acidified, calcined palygorskite). Liquid samples were analyzed by liquid chromatography, and the following compounds were identified: hydroquinone, catechol, and maleic acid. A reaction mechanism was proposed in which phenol is transformed in the homogeneous phase while the other compounds react on the catalyst surface. For the latter, the Langmuir-Hinshelwood model was applied; its mass balances led to a system of differential equations that were solved numerically to estimate the kinetic and adsorption parameters. The model fitted the experimental results satisfactorily. From the proposed mechanism and the operating conditions used in this study, the most favored step, regardless of the catalyst, was the transformation of the acid group (originating from quinone compounds) into CO2 and water, whose rate constant k4 was 0.578 mol L⁻¹ min⁻¹ for acidified calcined palygorskite, 0.472 mol L⁻¹ min⁻¹ for Fe2O3/palygorskite, and 1.276 mol L⁻¹ min⁻¹ for the sewage sludge char, the latter being the best catalyst for the mineralization of the acid to CO2 and water. The quinones were adsorbed on the acidic sites of the calcined palygorskite and of Fe2O3/palygorskite, whose adsorption constants were similar (~4.45 L mol⁻¹) and higher than that of the sewage sludge char (3.77 L mol⁻¹).
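A minimal sketch of fitting Langmuir-Hinshelwood-type kinetics of the kind described: a small ODE system with a first-order homogeneous step for phenol and a saturating surface step for an adsorbed intermediate, integrated numerically. The two-step scheme and all constants are illustrative assumptions (chosen only so the 3-hour conversion lands near the 52-58% range the study reports), not the thesis's fitted mechanism. Requires SciPy.

```python
# Hypothetical sketch of Langmuir-Hinshelwood-type kinetics for photocatalytic
# phenol degradation: phenol -> intermediate -> CO2 + H2O, with the surface
# step saturating in the intermediate concentration. All values are invented.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, K = 0.0045, 0.03, 4.45   # rate constants and adsorption constant (fake)

def rates(t, y):
    phenol, inter = y
    r1 = k1 * phenol                         # homogeneous-phase step
    r2 = k2 * K * inter / (1 + K * inter)    # L-H surface step
    return [-r1, r1 - r2]

# Normalized initial phenol concentration; 3-hour run (180 min).
sol = solve_ivp(rates, (0, 180), [1.0, 0.0], t_eval=np.linspace(0, 180, 7))
conversion = 1 - sol.y[0] / sol.y[0][0]
print(np.round(conversion, 3))   # phenol conversion over the run (~0.56 at end)
```

In the thesis, parameters like k4 and the adsorption constants quoted above are obtained by fitting such mass-balance ODEs to the chromatography data rather than being assumed.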