881 results for DDM Data Distribution Management testbed benchmark design implementation instance generator
Abstract:
Purpose – The aim of this study is to analyze consumers' price knowledge in the apparel market. Design/methodology/approach – After reviewing earlier attempts at assessing the construct, the price estimation error (PEE), a measure based on explicit price knowledge stored in long-term memory, was used as a valid indicator of price knowledge. Findings – The results, based on data from about 1,527 consumers on 66 products from the German apparel market, indicate that price knowledge is relatively low. Originality/value – Although the literature contains several studies of price knowledge in the food industry, little is known about price knowledge in other industry sectors. This is surprising, since pricing strategy is vitally important to all retailers. This study is therefore a first contribution to extending the concept of behavioral pricing to the apparel market.
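For reference, price estimation error is commonly operationalized as the relative deviation of a consumer's price estimate from the actual price; a minimal formulation of this kind (the paper's exact specification may differ) is:

\[ \mathrm{PEE}_{ij} \;=\; \frac{\lvert \hat{p}_{ij} - p_j \rvert}{p_j} \]

where \(\hat{p}_{ij}\) is consumer i's price estimate for product j and \(p_j\) is the actual market price; lower values indicate better explicit price knowledge.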
Abstract:
Purpose: The purpose of this paper is to propose and test a model to better understand brand equity. It investigates the effects of this construct on consumers' responses using data from two European countries. Design/methodology/approach: Hypotheses were tested using structural equation modeling (SEM). Measurement invariance and the stability of the model across the two national samples were assessed using multigroup confirmatory factor analysis. Findings: Results indicate that brand equity dimensions inter-relate. Brand awareness positively impacts perceived quality and brand associations. Brand loyalty is mainly influenced by brand associations. Finally, perceived quality, brand associations and brand loyalty are the main drivers of overall brand equity. Findings also corroborate the positive impact of brand equity on consumers' responses. In addition, the general framework proposed is found to be empirically robust across the studied countries; only a few differences are observed. Research limitations/implications: A limited set of product categories, brands and countries was used. Practical implications: Findings provide useful guidelines for brand equity management. Managers can complement financial metrics with consumer-based brand equity measures to track brand performance over time and to benchmark against other brands. Building brand equity generates more value for corporations, since a more favourable consumer response results from positive brand equity. Originality/value: This study contributes to the scarce international brand equity literature by testing the proposed model using data from a sample of consumers in two European countries. It also enriches the brand equity literature by empirically examining the relationships among consumer-based brand equity dimensions and their effects on consumers' responses.
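To make the modeling approach concrete, below is a minimal sketch of how such a consumer-based brand equity SEM could be specified in Python with the semopy package; the latent variables, indicator names (aw1, q1, ..., obe3) and input file are hypothetical illustrations, not the paper's actual items or data:

import pandas as pd
import semopy

# Measurement model (hypothetical survey items) plus the hypothesized
# structural paths among brand equity dimensions and overall brand equity (obe).
MODEL_DESC = """
awareness =~ aw1 + aw2 + aw3
quality   =~ q1 + q2 + q3
assoc     =~ as1 + as2 + as3
loyalty   =~ l1 + l2 + l3
obe       =~ obe1 + obe2 + obe3

quality ~ awareness
assoc   ~ awareness
loyalty ~ assoc
obe     ~ quality + assoc + loyalty
"""

data = pd.read_csv("brand_equity_survey.csv")  # hypothetical item-level data
model = semopy.Model(MODEL_DESC)
model.fit(data)
print(model.inspect())  # path coefficients, standard errors, p-values

Cross-country invariance of the kind reported in the paper would additionally require fitting the measurement model separately per national sample and comparing constrained and unconstrained solutions.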
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015.
Abstract:
Lean management has become a defining force at both the strategic and operative levels of value-creating processes over recent decades. This working paper discusses full lean implementation from a strategic point of view. It gives a detailed overview of the lean principles laid down by Womack and Jones (2003). Besides the customer value created by operational performance improvements, it also addresses shareholder value creation, the synergistic combination of lean and MRP, and the ideal organizational environment for lean. Since lean redesigns value-creating processes and requires the functional fit of related areas, the paper also briefly covers the most important questions of human resources, performance measurement, supply chain management and product design.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed, but they have many shortcomings: few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships; no empirical study has experimentally tested the effectiveness of any of these KB tools; and the problem-solving behavior of the non-experts whom the systems were intended to assist has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses these shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation, system restrictiveness and decisional guidance, were used and compared. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge-implementation strategy, restrictive or guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results were: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system as easier to use than the Guidance system.
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need; service providers can offer a new charging model based on VM performance instead of configured size, so that clients pay exactly for the performance they actually experience; and administrators can maximize total revenue by combining application performance models with SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue for a data center.
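As an illustration of the black-box performance modeling described above, here is a minimal sketch using scikit-learn's Support Vector Regression; the resource control parameters, the synthetic response-time function and the data are stand-in assumptions, not the thesis's actual workloads:

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Each row: a resource allocation for a VM (illustrative control parameters);
# target: measured application performance (e.g., mean response time in ms).
# X columns: cpu_cap (%), memory (MB), io_bandwidth (MB/s), vcpus
rng = np.random.default_rng(0)
X = rng.uniform([10, 512, 10, 1], [100, 8192, 200, 8], size=(500, 4))
y = 2000.0 / (X[:, 0] * X[:, 3]) + 50000.0 / X[:, 1] + 100.0 / X[:, 2]  # synthetic stand-in

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
print("R^2 on held-out allocations:", model.score(X_te, y_te))

An Artificial Neural Network variant, the other tool the thesis evaluates, would simply swap SVR for sklearn.neural_network.MLPRegressor in the same pipeline.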
Abstract:
The deployment of wireless communications, coupled with the popularity of portable devices, has led to significant research in the area of mobile data caching. Prior research has focused on the development of solutions that allow applications to run in wireless environments using proxy-based techniques. Most of these approaches are semantics-based and do not provide adequate support for representing the context of a user (i.e., the interpreted human intention). Although the context may be treated implicitly, it is still crucial to data management. To address this challenge, this dissertation focuses on predicting two things: (i) the future location of the user and (ii) the locations for which the fetched data item remains a valid answer to the query. Using this approach, more complete information about the dynamics of an application environment is maintained. The contribution of this dissertation is a novel data caching mechanism for pervasive computing environments that can adapt dynamically to a mobile user's context. We design and develop a conceptual model and context-aware protocols for wireless data caching management. Our replacement policy uses the validity of the data fetched from the server and the neighboring locations to decide which cache entries are less likely to be needed in the future, and are therefore good candidates for eviction when cache space is needed. The context-aware prefetching algorithm exploits the query context, defined by the mobile user's movement pattern and the requested information context, to effectively guide the prefetching process. Numerical results and simulations show that the proposed prefetching and replacement policies significantly outperform conventional ones. Anticipated applications of these solutions include biomedical engineering, tele-health, medical information systems and business.
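A minimal sketch of the validity-driven eviction idea follows; the class and scoring rule are illustrative assumptions (the dissertation's actual protocol also incorporates movement-pattern prediction and prefetching):

from dataclasses import dataclass

@dataclass
class CacheEntry:
    key: str
    value: object
    valid_locations: set   # regions in which the cached answer is still valid

class ValidityAwareCache:
    """Evicts the entry least likely to remain valid along the user's predicted path."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries: dict[str, CacheEntry] = {}

    def _validity_score(self, entry: CacheEntry, predicted_path: list) -> float:
        # Fraction of predicted future locations at which this entry stays valid.
        hits = sum(1 for loc in predicted_path if loc in entry.valid_locations)
        return hits / max(len(predicted_path), 1)

    def put(self, entry: CacheEntry, predicted_path: list) -> None:
        if entry.key not in self.entries and len(self.entries) >= self.capacity:
            victim = min(self.entries.values(),
                         key=lambda e: self._validity_score(e, predicted_path))
            del self.entries[victim.key]
        self.entries[entry.key] = entry

# Usage with hypothetical location cells:
cache = ValidityAwareCache(capacity=2)
cache.put(CacheEntry("nearest_cafe", "...", {"cellA"}), predicted_path=["cellB", "cellC"])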
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem-solving behavior of the non-experts whom the systems were intended to assist has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was employed to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance or Restrictive (experimental task). Analysis of the experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. The subjects perceived the Restrictive system as easier to use than the Guidance system.
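The contrast between the two knowledge-delivery strategies can be sketched as follows; the rule content and function names are hypothetical, since CODA's actual knowledge base is not reproduced here:

# Decisional guidance vs. system restrictiveness (Silver 1990), as a toy sketch.
ADVICE = {
    "add_ternary_relationship": "Check whether the ternary decomposes into binaries.",
    "add_binary_relationship": "Verify cardinality and optionality with the user.",
}

def restrictive_options(all_moves, sanctioned_path_moves):
    # Restrictive system: only moves on the prescribed design path are offered.
    return [m for m in all_moves if m in sanctioned_path_moves]

def guidance_options(all_moves):
    # Guidance system: every move stays available, with context-specific advice attached.
    return [(m, ADVICE.get(m, "No specific advice for this step.")) for m in all_moves]

moves = ["add_binary_relationship", "add_ternary_relationship"]
print(restrictive_options(moves, sanctioned_path_moves={"add_binary_relationship"}))
print(guidance_options(moves))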
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from it. In addition, the Bond Graph methodology is used for dynamic simulation; the Bond Graph model comprises individual component models of the actuator along with the control, and the required torque was simulated with it. Results indicate that high acceleration (around 20g) can be achieved with a modest (3 N·m or less) torque input. A practical prototype of the actuator was designed in SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10g at the middle point of the travel when the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate standalone, and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved around 17g acceleration (over a stroke from 0.2 m to 0.78 m), and the results are in good agreement with the developed dynamic model. Finally, a Design of Experiments (DOE) based statistical approach, with data collected from the Bond Graph model, is introduced to identify the parametric combination that yields the greatest performance; this approach helps in designing the actuator without much complexity.
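As a purely kinematic illustration of why a rhombus linkage can amplify acceleration, the sketch below symbolically differentiates the output diagonal of a rhombus of side L with sympy; the geometry and symbols are assumptions for illustration, not the thesis's full Lagrangian or Bond Graph model:

import sympy as sp

t = sp.symbols("t")
L = sp.symbols("L", positive=True)    # bar length (illustrative)
alpha = sp.Function("alpha")(t)       # half vertex angle of the rhombus

# For a rhombus of side L with half vertex angle alpha, one diagonal has
# length x = 2*L*cos(alpha); the end effector rides on this diagonal.
x = 2 * L * sp.cos(alpha)

velocity = sp.diff(x, t)
acceleration = sp.diff(x, t, 2)
print(sp.simplify(acceleration))
# ddx = -2*L*(cos(alpha)*alpha_dot**2 + sin(alpha)*alpha_ddot): even at constant
# motor speed (alpha_ddot = 0) the alpha_dot**2 term produces end-effector
# acceleration, and chaining several rhombi multiplies the stroke.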
Abstract:
The European CloudSME project, which incorporated 24 European SMEs besides five academic partners, finished its funded phase in March 2016. This presentation provides a summary of the results of the project and analyzes the challenges and differences encountered when developing “SME Gateways” as compared to “Science Gateways”. CloudSME started in 2013 with the aim of developing a cloud-based simulation platform for manufacturing and engineering SMEs. The project was built around industry use-cases: five were incorporated from the start, and seven more were added as the outcome of an open call in January 2015. CloudSME utilized science gateway technologies, such as the commercial CloudBroker Platform and the WS-PGRADE/gUSE Gateway Framework, that were developed in the preceding SCI-BUS project. As its most important outcome, the project successfully implemented 12 industry-quality demonstrators that showcase how SMEs in the manufacturing and engineering sector can utilize cloud-based simulation services. Some of these solutions are already market-ready and are currently being rolled out by the software vendor companies; others require further fine-tuning and the implementation of commercial interfaces before being put on the market. The CloudSME use-cases came from a very wide application spectrum. The project implemented, for example, an open marketplace for micro-breweries to optimize their production and distribution processes, an insole design validation service to be used by podiatrists and shoe manufacturers, a generic stock management solution for manufacturing SMEs, and also several “classical” high-performance computing case studies, such as fluid dynamics simulations for model helicopter design and dual-fuel internal combustion engine simulation. As the project generated significant impact and interest in the manufacturing sector, 10 CloudSME stakeholders established a follow-up company called CloudSME UG for the future commercialization of the results. Besides the success stories, this talk also highlights the difficulties of transferring the outcomes of an academic research project to real commercial applications. The different mindsets and approaches of academic and industry partners presented a real challenge for the CloudSME project, with some interesting and valuable lessons learnt. The academic way of supporting SMEs did not always work well with the rather different working practices and culture of many participants. Also, the quality of operational support required by the SMEs is well beyond the typical support services academic institutions are prepared for. Finally, a clear lack of trust in academic solutions, when compared to commercial ones, was also evident. The talk highlights some of these challenges, underpinned by the implementation of the CloudSME use-cases.
Abstract:
Purpose – The purpose of this paper is to explore the links between various characteristics of hospital administration and the utilization of classes of volunteer resource management (VRM) practices. Design/methodology/approach – This paper uses original data collected via surveys of volunteer directors in 122 hospitals in five Northeastern and Southern US states. Findings – Structural equation modeling results suggest that the number of paid volunteer management staff, the scope of responsibility of the primary volunteer administrator, and hospital size are positively associated with increased usage of certain VRM practices. Research limitations/implications – First, the authors begin the exploration of VRM antecedents, and encourage others to continue this line of inquiry; second, the authors assess the dimensionality of practices, allowing future researchers to consider whether specific dimensions have a differential impact on key individual and organizational outcomes. Practical implications – Given the finding of a relationship between administrative characteristics and the on-the-ground execution of VRM practice, a baseline audit comparing current practices to the VRM practices presented here might be useful in determining what next steps may be taken to focus investments in VRM that can ultimately drive practice utilization. Originality/value – The exploration of the dimensionality of volunteer management adds a novel perspective to both the academic study, and practice, of volunteer management. To the authors' knowledge, this is the first empirical categorization of VRM practices.
Abstract:
IT capability is an organizational ability to perform IT-related activities more effectively, and it is an important mechanism for creating value. It is built (through stages of creation and development) via management initiatives that improve the performance of activities, drawing on human resources and complementary IT assets that drive the evolution of organizational routines. This research examines the IT capabilities related to SIG (integrated institutional management systems), built and deployed at UFRN (Universidade Federal do Rio Grande do Norte) to carry out and control administrative, academic and human resources activities. Since 2009, through cooperation agreements with federal institutions of direct administration and educational institutions, UFRN has supported the implementation of these systems, which currently involve more than 30 institutions. The present study aims to understand how the IT capabilities relevant to the design, implementation and dissemination of SIG were built over time. This is a single case study of a qualitative and longitudinal nature, performed by capturing, coding and analyzing secondary data and semi-structured interviews conducted primarily with members of the Superintendência de Informática, the organizational unit responsible for the SIG systems at UFRN. As a result, three capabilities — technical, internal relationship and external cooperation — were identified as relevant to the successful trajectory of the SIG systems, and they evolved in different ways. The technical capability, initiated in 2004, passed through the stages of creation and development until it reached stability in 2013, due to technological limits. The internal relationship capability, begun in 2006, passed through the stages of creation and development, extended its scope of activities in 2009, and has been in development since then. Unlike the standard life cycle observed in the literature, the external cooperation capability started with an intense burst of initiatives and routine developments in 2009, which then decreased and ceased in 2013 in order to stabilize the technological infrastructure already created for the cooperating institutions. The start of cooperation in 2009 was also identified as an important selection event, responsible for changing or creating evolution trajectories in all three capabilities. The most frequent improvement initiatives were organizational in nature, and the internal planning activity was transformed across the routines of the three capabilities. Resources and complementary assets were identified as important for realizing these initiatives: the technical knowledge of human resources for the technical and external cooperation capabilities, business knowledge for all of them, and IT assets such as the iproject application for controlling development processes and the wiki document repository. All these resources and complementary assets grew along with the capabilities, demonstrating their strategic value to SINFO/UFRN.
Abstract:
Marine ecosystems are facing a diverse range of threats, including climate change, prompting international efforts to safeguard marine biodiversity through the use of spatial management measures. Marine Protected Areas (MPAs) have been implemented as a conservation tool throughout the world, but their usefulness and effectiveness are strongly affected by climate change. However, few MPA programmes have directly considered climate change in the design, management or monitoring of an MPA network. Under international obligations and EU, UK and national targets, Scotland has developed an MPA network that aims to protect marine biodiversity and contribute to the vision of a clean, healthy and productive marine environment. This is the first study to critically analyse the Scottish MPA process and highlight areas which may be improved upon in further iterations of the network in the context of climate change. Initially, a critical review of the Scottish MPA process considered how ecological principles for MPA network design were incorporated into the process, how stakeholder perceptions were considered and, crucially, what consideration was given to the influence of climate change on the eventual effectiveness of the network. The results indicated that, to make a meaningful contribution to marine biodiversity protection for Europe, the Scottish MPA network should: i) fully adopt best-practice ecological principles, ii) ensure effective protection and iii) explicitly consider climate change in the management, monitoring and future iterations of the network. However, this review also highlighted the difficulties of incorporating considerations of climate change into an already complex process. A series of international case studies from British Columbia, Canada; central California, USA; the Great Barrier Reef, Australia; and the Hauraki Gulf, New Zealand was then conducted to investigate perceptions of how climate change has been considered in the design, implementation, management and monitoring of MPAs. The key lessons from this study included: i) strictly protected marine reserves are considered essential for climate change resilience and will be necessary as scientific reference sites to understand climate change effects, ii) adaptive management of MPA networks is important but hard to implement, and iii) strictly protected reserves managed as ecosystems are the best option for an uncertain future. This work provides new insights into the policy and practical challenges MPA managers face under climate change scenarios. Based on the Scottish and international studies, the need to facilitate clear communication between academics, policy makers and stakeholders was recognised in order to progress MPA policy delivery and to ensure decisions were jointly formed and acceptable. A Delphi technique was used to develop a series of recommendations for considering climate change in Scotland's MPA process. The Delphi participant panel was selected for their knowledge of the Scottish MPA process and included stakeholders, policy makers and academics with expertise in MPA research. The results from the first round of the Delphi technique suggested that differing views of success would likely influence opinions regarding the required management of MPAs and, in turn, the data requirements to support management action decisions.
The second round of the Delphi technique explored this further and indicated that there was a fundamental dichotomy in panellists' views of a successful MPA network, depending upon whether they believed the MPAs should be strictly protected or should allow for sustainable use. A third, focus-group round of the Delphi technique developed a feature-based management scenario matrix to aid in deciding upon management actions in light of changes occurring in the MPA network. This thesis highlights that if the Scottish MPA network is to fulfil its objectives of conservation and restoration, the implications of climate change for the design, management and monitoring of the network must be considered. In particular, there needs to be a greater focus on: i) incorporating ecological principles that directly address climate change, ii) effective protection that builds the resilience of the marine and linked social environment, iii) developing a focused, strong and adaptable monitoring framework, and iv) ensuring mechanisms for adaptive management.
Abstract:
Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our implementation is based on the distribution of continuous-variable Einstein–Podolsky–Rosen entangled light. It is one-sided device-independent, which means the security of the generated key is independent of any memory-free attacks on the remote detector. Since continuous-variable encoding is compatible with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components.
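For context, the continuous-variable EPR entanglement distributed here is conventionally modeled as a two-mode squeezed vacuum state, which in the photon-number basis takes the standard textbook form (not specific to this particular implementation):

\[ \lvert \mathrm{EPR} \rangle \;=\; \sqrt{1-\lambda^{2}} \, \sum_{n=0}^{\infty} \lambda^{n} \, \lvert n \rangle_{A} \lvert n \rangle_{B}, \qquad \lambda = \tanh r, \]

where r is the squeezing parameter; the stronger the squeezing, the stronger the correlations between quadrature measurements on modes A and B from which the secret key is distilled.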