953 results for customer driven development
Abstract:
As identified by Griffin (1997) and Kahn (2012), manufacturing organisations typically improve their market position by accelerating their product development (PD) cycles. One method for achieving this is to reduce the time taken to design, test and validate new products so that they reach the end customer before the competition. This paper adds to existing research on PD testing procedures by reporting on an exploratory investigation carried out in a UK-based manufacturing plant. We explore the organisational and managerial factors that contribute to the time spent testing new products during development. The investigation consisted of three parts, viz. observations and process modelling, utilisation metrics and a questionnaire-based investigation, from which a framework to improve and reduce the PD cycle time is proposed. This research focuses specifically on improving the utilisation of product testing facilities and the links to their main internal stakeholders: PD engineers.
Abstract:
Anthropogenic climate change is causing unprecedented rapid responses in marine communities, with species across many different taxonomic groups showing faster shifts in biogeographic ranges than in any other ecosystem. Spatial and temporal trends for many marine species are difficult to quantify, however, due to the lack of long-term datasets across complete geographical distributions and the occurrence of small-scale variability from both natural and anthropogenic drivers. Understanding these changes requires a multidisciplinary approach to bring together patterns identified within long-term datasets and the processes driving those patterns using biologically relevant mechanistic information to accurately attribute cause and effect. This must include likely future biological responses, and detection of the underlying mechanisms in order to scale up from the organismal level to determine how communities and ecosystems are likely to respond across a range of future climate change scenarios. Using this multidisciplinary approach will improve the use of robust science to inform the development of fit-for-purpose policy to effectively manage marine environments in this rapidly changing world.
Abstract:
Following and contributing to the ongoing shift from more structuralist, system-oriented to more pragmatic, socio-culturally oriented anglicism research, this paper examines to what extent the global spread of English affects naming patterns in Flanders. To this end, a diachronic database of first names is constructed, containing the top 75 most popular boy and girl names from 2005 until 2014. In a first step, the etymological background of these names is documented and the evolution in popularity of the English names in the database is tracked. The results reveal no notable surge in the preference for English names. The paper complements these database-driven results with an experimental study, aiming to show how associations through referents are in this case more telling than associations through phonological form (here based on etymology). Focusing on the socio-cultural background of first names in general and of Anglo-American pop culture in particular, the second part of the study reports on results from a survey in which participants were asked to name the first three celebrities that leap to mind when hearing a certain first name (e.g. Lana, triggering the response Del Rey). Very clear associations are found between certain first names and specific celebrities from Anglo-American pop culture. Linking back to marketing research and the social turn in onomastics, we discuss how these celebrities might function as referees, and how social stereotypes surrounding these referees are metonymically attached to their first names. Similar to the country-of-origin effect in marketing, these metonymical links could very well be the reason why parents select specific "celebrity names". Although further attitudinal research is needed, this paper supports the importance of including socio-cultural parameters when conducting onomastic research.
Abstract:
Control of the collective response of plasma particles to intense laser light is intrinsic to relativistic optics, the development of compact laser-driven particle and radiation sources, as well as investigations of some laboratory astrophysics phenomena. We recently demonstrated that a relativistic plasma aperture produced in an ultra-thin foil at the focus of intense laser radiation can induce diffraction, enabling polarization-based control of the collective motion of plasma electrons. Here we show that under these conditions the electron dynamics are mapped into the beam of protons accelerated via strong charge-separation-induced electrostatic fields. It is demonstrated experimentally and numerically via 3D particle-in-cell simulations that the degree of ellipticity of the laser polarization strongly influences the spatial-intensity distribution of the beam of multi-MeV protons. The influence on both sheath-accelerated and radiation pressure-accelerated protons is investigated. This approach opens up a potential new route to control laser-driven ion sources.
Abstract:
Land Ownership and Development: Evidence from Postwar Japan
This paper analyzes the effect of land ownership on technology adoption and structural transformation. A large-scale land reform in postwar Japan forced the transfer of land ownership to a large number of tenant farmers who were cultivating it. I find that the municipalities with many owner farmers after the land reform tended to experience a quick entry of new agricultural machines that became available after the reform. The adoption of the machines reduced dependence on family labor and led to a reallocation of labor from agriculture to industry and service sectors in urban centers while these sectors were growing. I also analyze the aggregate impact of labor reallocation on economic growth using a simple growth model and micro data. I find that over 1955-74 it increased GDP by about 12 percent of 1974 GDP. I also find a large and positive effect on agricultural productivity.
Loyalty and Treason: Theory and Evidence from Japan's Land Reform
A historically large-scale land reform in Japan after World War II, enforced by the occupation forces, redistributed a large area of farmland to tenant farmers. The reform demolished hierarchical structures by weakening landlords' power in villages and towns. This paper investigates how this change in the social and economic structure of small communities affects electoral outcomes in the presence of clientelism. I find a considerable decrease in the vote share of conservative parties in highly affected areas after the reform. I find supporting evidence that the effect was driven by the tenant farmers who had obtained land exiting long-term tenancy contracts and becoming independent landowners. The effect was relatively persistent. Finally, I also find the surprising result that turnout decreased, rather than increased, in these areas after the reform.
Geography and State Fragmentation
We examine how geography affects the location of borders between sovereign states in Europe and surrounding areas from 1500 until today at the grid-cell level. This is motivated by the observation that the richest places in this region also have the highest historical border presence, suggesting a hitherto unexplored link between geography and modern development, working through state fragmentation. The raw correlations show that borders tend to be located on mountains, by rivers, closer to coasts, and in areas suitable for rainfed, but not irrigated, agriculture. Many of these patterns also hold with rigorous spatial controls. For example, cells with more rivers and more rugged terrain than their neighboring cells have higher border densities. However, the fragmenting effects of suitability for rainfed agriculture are reversed with such neighbor controls. Moreover, we find that borders are less likely to survive over time when they separate large states from small, but this size-difference effect is mitigated by, e.g., rugged terrain.
Abstract:
Today it is common for companies to compete with products that combine a physical good with various services in order to satisfy customer needs. As a result, product complexity increases and higher demands are placed on suppliers. This development has led many companies that manufacture products with high variety to work entirely make-to-order in order to meet customers' increasing demands. The companies that manufacture these complex products have production characterized by high variety and low volume, abbreviated HVLV. The challenge for these HVLV companies lies in achieving high product-mix flexibility with as little resource use as possible. To make operations more efficient, many companies have therefore taken an interest in Lean production, which has proven to be a successful concept for manufacturing companies around the world that have streamlined their production. Several articles have highlighted limitations in implementing Lean production in HVLV environments, and they further point to the need for additional research on the applicability of Lean production in such environments; this was the origin of the background and purpose of this thesis. A case study was conducted at Tibrokök, a company with entirely make-to-order production characterized by HVLV. Using the case study, the thesis investigates whether the tools of process mapping (including time studies) and layout flow diagrams can help create the conditions for implementing Lean production in a single manufacturing process in an HVLV environment. Although the tools required some adaptation, they proved useful and contributed to creating the conditions for implementing Lean production in this HVLV environment, since we were able to identify many causes of waste and develop a proposal that makes Tibrokök's surface-treatment process more efficient in the future state.
Abstract:
The power of computer game technology is currently being harnessed to produce “serious games”. These “games” are targeted at the education and training marketplace, and employ key game-engine components such as the graphics and physics engines to produce realistic “digital-world” simulations of the real “physical world”. Many approaches are driven by the technology and often lack consideration of a firm pedagogical underpinning. The authors believe that the analysis and deployment of the technological and pedagogical dimensions should occur together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions: how “pedagogy may inform the use of technology” and how various learning theories may be mapped onto the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to theories of knowledge structures, especially “concept maps”. The temporal dimension is related to “experiential learning”, especially the approach of Kolb. The multi-player aspect of serious games is related to theories of “collaborative learning”, discussed in terms of “discourse” versus “dialogue”. Several general guiding principles are explored, such as the use of “metaphor” (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the “serious game community”.
Abstract:
Purpose – The aim of this study is to investigate the role of key strategic factors in new service development (NSD). In particular, the roles of service development strategy, a formalised development process, integrated development teams and customer co-creation were investigated, and the results were compared with managers' beliefs. Design/methodology/approach – The study used a sample of more than 500 service development projects to test an NSD conceptual model. Regression analysis was used to test the relative importance of the key strategic factors, and the results were compared with managers' beliefs. Findings – The results show that managers believe customer co-creation is most important for succeeding with NSD. However, contrary to management belief, a service development strategy is the “missing link” in improving NSD performance. In addition, the research highlighted an interaction effect between integrated development teams and customer co-creation, which means that project managers should focus on the individual competencies on the development team and how they interact with customers throughout the NSD process. Originality/value – For a long time, NSD has failed to receive the attention it deserves, not just in practice but also in service research. This study shows that the proportion of new services put on the market and then withdrawn because of low sales remains as high as 43 per cent. This paper contributes knowledge on how to reduce the number of failures in NSD by pointing out the key strategic factors that influence NSD performance.
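As an illustrative aside, the kind of regression with an interaction effect described in this abstract can be sketched in a few lines. This is a minimal sketch only: the column names (nsd_performance, dev_strategy, formal_process, integrated_team, cocreation) and the simulated data are hypothetical assumptions, not the study's actual measures or model.

```python
# Hedged sketch: OLS with main effects for the key strategic factors and an
# interaction between integrated development teams and customer co-creation.
# All names and data are hypothetical, not taken from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # roughly the sample size mentioned in the abstract

df = pd.DataFrame({
    "dev_strategy": rng.normal(size=n),     # service development strategy
    "formal_process": rng.normal(size=n),   # formalised development process
    "integrated_team": rng.normal(size=n),  # integrated development teams
    "cocreation": rng.normal(size=n),       # customer co-creation
})
# Simulated outcome with a strategy main effect and a team x co-creation interaction.
df["nsd_performance"] = (
    0.5 * df["dev_strategy"]
    + 0.2 * df["integrated_team"]
    + 0.2 * df["cocreation"]
    + 0.3 * df["integrated_team"] * df["cocreation"]
    + rng.normal(scale=1.0, size=n)
)

# The ':' term adds the product of the two predictors, i.e. the interaction.
model = smf.ols(
    "nsd_performance ~ dev_strategy + formal_process + integrated_team"
    " + cocreation + integrated_team:cocreation",
    data=df,
).fit()
print(model.summary())
```

A significant coefficient on the interaction term is what an "interaction effect between integrated development teams and customer co-creation" would look like in such a model.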
Abstract:
Nervous system disorders are associated with cognitive and motor deficits, and are responsible for the highest disability rates and global burden of disease. Their recovery paths are vulnerable and depend on effectively combining the plastic properties of brain tissue with complex, lengthy and expensive neurorehabilitation programs. This work explores two lines of research, envisioning sustainable solutions to improve the treatment of cognitive and motor deficits. Both projects were developed in parallel and shared a new, sensible approach in which low-cost technologies were integrated with common clinical operative procedures. The aim was to achieve more intensive treatments under specialized monitoring, improve clinical decision-making and increase access to healthcare. The first project (articles I – III) concerned the development and evaluation of a web-based cognitive training platform (COGWEB), suitable for intensive use, either at home or at institutions, and across a wide spectrum of ages and diseases that impair cognitive functioning. It was tested for usability in a memory clinic setting and implemented in a collaborative network comprising 41 centers and 60 professionals. An adherence and intensity study revealed a compliance of 82.8% at six months and an average of six hours/week of continued online cognitive training activity. The second project (articles IV – VI) was designed to create and validate an intelligent rehabilitation device to administer proprioceptive stimuli to the hemiparetic side of stroke patients while performing ambulatory movement characterization (SWORD). Targeted vibratory stimulation was found to be well tolerated, and an automatic motor characterization system retrieved results comparable to the first items of the Wolf Motor Function Test. The global system was tested in a randomized placebo-controlled trial to assess its impact on a common motor rehabilitation task in a relevant clinical environment (early post-stroke). The number of correct movements on a hand-to-mouth task increased by an average of 7.2 per minute, while the probability of performing an error decreased from 1:3 to 1:9. Neurorehabilitation and neuroplasticity are shifting to more neuroscience-driven approaches. At the same time, their ultimate utility for patients and society depends largely on the development of more effective technologies that facilitate the dissemination of the knowledge produced during the process. The results attained through this work represent a step forward in that direction. Their impact on the quality of rehabilitation services and public health is discussed from clinical, technological and organizational perspectives. This process of thinking and oriented speculation has led to further hypotheses, already being explored in novel research paths.
Abstract:
This article discusses the potential of audio games based on the evaluation of three projects: a story-driven audio role-playing game (RPG), an interactive audiobook with RPG elements, and a set of casual sound-based games. This potential is understood in terms of both popularity and playability. The first factor concerns the degree of players' interest, while the second concerns the degree of their engagement in sound-based game worlds. Although the presented projects are embedded within the landscape of past and contemporary audio games and gaming platforms, the authors also look to the near future, concluding with possible development directions for this non-visual interactive entertainment.
Abstract:
In this thesis, proactive marketing is suggested to be a broader concept than existing research assumes. Although the concept has been mentioned in the context of competitive advantage in previous research, it has not been comprehensively described. This thesis shows that proactive marketing is more than investing in a company's marketing communications. Proactive marketing is described as a three-phase process comprising different customer value identification, creation, and delivery activities. The purpose of proactive marketing is essentially to anticipate and pursue market opportunities that bring value to the company's stakeholders. Ultimately, proactive marketing aims at acting first on the market, shaping the markets, and thus reaching competitive advantage. The proactive marketing process is supported by the structures of an organization. Suitable structures for proactive marketing are identified in the thesis based on existing research and through an empirical analysis. Moreover, proactive marketing is related to two management theories: the dynamic capabilities framework and the empowerment of employees. A dynamic environment requires companies that pursue proactive marketing to change continuously. Dynamic capabilities are considered tools of management that enable companies to create suitable conditions for constant change. Empowerment of employees is a management practice that creates proactive behaviors in individuals. The empirical analysis is conducted in an online company operating in the rapidly changing marketplace of the Internet. Through the empirical analysis, the thesis identifies in practice how proactiveness manifests in the marketing process of a company, how organizational structures facilitate proactive marketing, and how proactive marketing is managed. The theoretical contribution of this thesis consists of defining the proactive marketing concept comprehensively and providing further research suggestions related to proactive marketing.
Abstract:
Winegrowing involves a diverse set of characteristics, present in the soil, territory and community, that form part of the cultural heritage of a given region. When tradition is translated into a concept such as terroir, formed by the territorial, social and cultural characteristics of a rural region, the wine carries a "signature" that is "naturally" written into a regionally identified palate. The wines of the Nemea region in Greece and of Basto (Vinho Verde region) in Portugal are both protected by Denomination of Origin regulations. However, although both are regulated by institutional systems of certification and quality control, it is necessary to ask whether the cultural heritage and the specific territorial identity "imprinted" in both terroirs can be protected in a broader sense than origin and quality alone. In Nemea, the discussion among producers concerns the establishment of sub-zones, that is, including in the PDO regulation a different territorial categorization based on terroir. In other words, besides carrying the PDO designation on the label, bottles would also include certified information on the specific area (within the same terroir) where the wine was produced. If this happens, it would result in different quality statuses according to the different villages of Nemea where the vineyards are located, with possible impacts on property values and land use. Furthermore, the non-participation of the Nemea Cooperative in SON (the local association of wine producers), and hence in the main discussion on the changes and challenges concerning the Nemea terroir, constitutes a problem for Nemea's wine sector. First, it establishes a relationship of non-communication between the two most important agents in the sector: the wine companies and the Cooperative. Second, it creates a real possibility not only that grape growers are left out of that discussion but also that, because they are not represented by the cooperative, no consensus on the changes under discussion can be reached. This may create a 'climate' of distrust, moving the discussion into delocalized 'arenas' and thus towards 'deterritorialized' decisions. In Basto, several producers have started selling their production to distributors located outside the Basto sub-region but within the Vinho Verde region, since those companies have a stronger national and international standing and a better export network. This is also related to competition for a better network of contacts and a stronger status, making discussions on common strategies for the rural and regional development of Basto harder to bring about (the word 'impossible' was used repeatedly on this point in interviews with wine producers). The predominant relationship between producers is individualistic. These positions are, moreover, marked by distrust within the local interprofessional network: conflicts over the same potential clients; buying grapes from growers with a better quality/price ratio; and individual strategies to obtain a better political standing with the Comissão dos Vinhos Verdes.
In addition, the absence of active institutional intermediation (municipal authorities and the Comissão de Vinho Verde), the absence of a producers' association in Basto, and even the absence of a local cooperative have left the Basto region under-promoted in Vinho Verde promotion strategies compared with other sub-regions. It is also evident from the results that changes in the wine sector in the Basto region have been stimulated from outside the region (also in response to the needs of international markets) and rarely from within: once again, non-localized 'arenas' and thus deterritorialized decisions. In this sense, such discussion and strategic planning will play a vital role in preserving the localized identity of the terroir in the face of the risks of mischaracterization and deterritorialization. In short, in both cases one of the greatest challenges appears to be how to preserve the wine terroir, and hence its local character and identity, when the interprofessional network in both regions is characterized by non-consensual relations in Nemea and by a modus operandi of isolation without communication in Basto. There is therefore a need for engagement between the various agents and local authorities towards a localized governance network. In both regions, the existence of such a network is essential to prevent negative effects on the identity of the product and its production. An integrated planning strategy for the sector will be vital to preserve that identity, preventing its deterritorialization through a restructuring of traditional knowledge alongside the democratization of access to knowledge of modern winemaking techniques.
Abstract:
Following the workshop on new developments in daily licensing practice in November 2011, fourteen representatives from national consortia (from Denmark, Germany, the Netherlands and the UK) and publishers (Elsevier, SAGE and Springer) met in Copenhagen on 9 March 2012 to discuss provisions in licences to accommodate new developments. The one-day workshop aimed to: present background and ideas regarding the provisions the KE Licensing Expert Group developed; introduce and explain the provisions the invited publishers currently use; ascertain agreement on the wording for long-term preservation, continuous access and course packs; give insight and more clarity about the use of open access provisions in licences; discuss a roadmap for inclusion of the provisions in the publishers' licences; and produce a report to disseminate the outcome of the meeting. Participants of the workshop were: United Kingdom: Lorraine Estelle (Jisc Collections); Denmark: Lotte Eivor Jørgensen (DEFF), Lone Madsen (Southern University of Denmark), Anne Sandfær (DEFF/Knowledge Exchange); Germany: Hildegard Schaeffler (Bavarian State Library), Markus Brammer (TIB); The Netherlands: Wilma Mossink (SURF), Nol Verhagen (University of Amsterdam), Marc Dupuis (SURF/Knowledge Exchange); Publishers: Alicia Wise (Elsevier), Yvonne Campfens (Springer), Bettina Goerner (Springer), Leo Walford (Sage); Knowledge Exchange: Keith Russell. The main outcome of the workshop was that it would be valuable to have a standard set of clauses that could be used in negotiations; this would make concluding licences much easier and more efficient. The comments on the model provisions the Licensing Expert Group had drafted will be taken into account and the provisions will be reformulated. Data and text mining is a new development, and demand for access to allow for it is growing. It would be easier if there were a simpler way to access materials so they could be more easily mined. However, there are still outstanding questions on how the authors of articles that have been mined can be properly attributed.
Abstract:
Cancer and cardiovascular diseases are the leading causes of death worldwide. Caused by systemic genetic and molecular disruptions in cells, these disorders are the manifestation of a profound disturbance of normal cellular homeostasis. People suffering from, or at high risk of, these disorders need early diagnosis and personalized therapeutic intervention. Successful implementation of such clinical measures can significantly improve global health. However, the development of effective therapies is hindered by the challenges of identifying the genetic and molecular determinants of disease onset; and, where therapies already exist, the main challenge is to identify the molecular determinants that drive resistance to them. Due to progress in sequencing technologies, access to large-scale genome-wide biological data now extends far beyond a few experimental labs to the global research community. The unprecedented availability of the data has revolutionized the capabilities of computational researchers, enabling them to collaboratively address long-standing problems from many different perspectives. This thesis tackles these two public health challenges using data-driven approaches. Numerous association studies have been proposed to identify genomic variants that determine disease. However, their clinical utility remains limited due to their inability to distinguish causal variants from associated variants. In this thesis, we first propose a simple scheme that improves association studies in a supervised fashion and demonstrate its applicability in identifying genomic regulatory variants associated with hypertension. Next, we propose a coupled Bayesian regression approach, eQTeL, which leverages epigenetic data to estimate regulatory and gene-interaction potential and identifies combinations of regulatory genomic variants that explain gene expression variance. On human heart data, eQTeL not only explains a significantly greater proportion of expression variance in samples, but also predicts gene expression more accurately than other methods. Using simulation, we demonstrate that eQTeL accurately detects causal regulatory SNPs, particularly those with small effect sizes. Using various functional data, we show that the SNPs detected by eQTeL are enriched for allele-specific protein binding and histone modifications, which potentially disrupt the binding of core cardiac transcription factors and are spatially proximal to their targets. eQTeL SNPs capture a substantial proportion of the genetic determinants of expression variance, and we estimate that 58% of these SNPs are putatively causal. So far, the challenge of identifying the molecular determinants of cancer resistance could only be addressed with labor-intensive and costly experimental studies, and for experimental drugs such studies are infeasible. Here we take a fundamentally different, data-driven approach to understand the evolving landscape of emerging resistance. We introduce a novel class of genetic interactions in cancer termed synthetic rescues (SR), which denote a functional interaction between two genes in which a change in the activity of one vulnerable gene (which may be the target of a cancer drug) is lethal, but subsequently altered activity of its partner rescuer gene restores cell viability. Next we describe a comprehensive computational framework, termed INCISOR, for identifying the SRs underlying cancer resistance.
Applying INCISOR to mine The Cancer Genome Atlas (TCGA), a large collection of cancer patient data, we identified the first pan-cancer SR networks, composed of interactions common to many cancer types. We experimentally test and validate a subset of these interactions involving the master regulator gene mTOR. We find that rescuer genes become increasingly activated as breast cancer progresses, testifying to pervasive ongoing rescue processes. We show that SRs can be utilized to successfully predict patients' survival and response to the majority of current cancer drugs and, importantly, to predict the emergence of drug resistance from the initial tumor biopsy. Our analysis suggests a potential new strategy for enhancing the effectiveness of existing cancer therapies by targeting their rescuer genes to counteract resistance. The thesis provides statistical frameworks that can harness ever-increasing high-throughput genomic data to address challenges in determining the molecular underpinnings of hypertension, cardiovascular disease and cancer resistance. We discover novel molecular mechanistic insights that will advance progress in early disease prevention and personalized therapeutics. Our analyses shed light on the fundamental biological understanding of gene regulation and interaction, and open up exciting avenues of translational applications in risk prediction and therapeutics.
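As a purely illustrative aside, the synthetic-rescue definition given above can be made concrete with a toy screen over paired gene-activity and viability measurements. The sketch below is not INCISOR: the data, thresholds, and the function sr_score are hypothetical assumptions intended only to show what "loss of a vulnerable gene is lethal unless the rescuer is upregulated" looks like in data.

```python
# Toy sketch of the synthetic-rescue (SR) idea: inactivation of gene A
# ("vulnerable") is lethal on its own, but high activity of gene B ("rescuer")
# restores viability. Data, thresholds and names are hypothetical.
import numpy as np
import pandas as pd

def sr_score(samples: pd.DataFrame, gene_a: str, gene_b: str,
             low: float = 0.33, high: float = 0.67) -> float:
    """Difference in mean viability between samples where the vulnerable gene
    is low AND the rescuer is high, versus samples where the vulnerable gene
    is low and the rescuer is not high. A large positive value is consistent
    with a rescue effect (toy criterion only)."""
    a = samples[gene_a].rank(pct=True)   # gene A activity as percentiles
    b = samples[gene_b].rank(pct=True)   # gene B activity as percentiles
    a_low = a < low
    rescued = samples.loc[a_low & (b > high), "viability"]
    unrescued = samples.loc[a_low & (b <= high), "viability"]
    if len(rescued) < 5 or len(unrescued) < 5:
        return float("nan")              # too few samples to say anything
    return rescued.mean() - unrescued.mean()

# Simulated example in which gene B rescues loss of gene A.
rng = np.random.default_rng(1)
n = 300
gA = rng.uniform(size=n)
gB = rng.uniform(size=n)
viability = np.where((gA < 0.33) & (gB <= 0.67),
                     rng.normal(0.2, 0.05, n),   # A lost, no rescue: low viability
                     rng.normal(0.8, 0.05, n))   # otherwise viable
df = pd.DataFrame({"geneA": gA, "geneB": gB, "viability": viability})
print(sr_score(df, "geneA", "geneB"))   # clearly positive: candidate SR pair
```

A genuine screen such as the one described in the abstract would of course combine multiple molecular data types and statistical controls; this snippet only illustrates the underlying pairwise rescue pattern.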