809 results for Graduation in technology


Relevance: 80.00%

Abstract:

The aim of this thesis was to identify, through a systematic literature review, the empirical research conducted on technology transfer from an interaction perspective. The theoretical viewpoint was based on the Triple Helix framework, which emphasizes building interactive innovation networks between universities, companies, and government. The steering role of government was excluded from the thesis. The research questions were: how has technology transfer been empirically studied as interaction between universities and companies, what are the forms of interaction between universities and companies, and how should technology transfer be studied. The data, consisting of published studies, was collected through a systematic literature search and limited to empirical studies examining technology transfer from an interaction perspective. Content analysis was used as the analysis method. Based on the results, the theoretical approaches to studying technology transfer were the "entrepreneurial university" school, which encourages an entrepreneurial orientation in universities; innovations and innovation systems; researchers' characteristics and social capital; interaction and knowledge transfer processes; organizational learning; and university technology transfer policies. Technology transfer had been studied more from the university than from the company perspective. Half of the studies took into account that tacit knowledge can be transferred only through interaction. As a result of the thesis, 34 different forms of interaction in technology transfer between universities and companies were identified. The most studied forms of interaction were patents and licences, expert assistance and consulting, informal contacts and networks, university spin-off companies, and contract research. Research collaboration had been studied relatively little. The effectiveness of technology transfer should be studied from the company perspective by examining the connection between research collaboration and innovation. Technology transfer as an interaction process should be studied from the perspectives of all participating stakeholders. Factors affecting the success of technology transfer projects could be examined through comparative case studies. The interaction involved in the knowledge transfer process should not be studied in isolation from the knowledge transfer conditions surrounding it. The number of forms of interaction in Finnish technology transfer would also be worth investigating, since it could be used to assess the depth of the collaborative relationship between universities and companies. Companies should be seen as active actors in technology transfer, and technology transfer should also be studied from their perspective.

Relevance: 80.00%

Abstract:

Research and development (R&D) plays an increasingly significant role in the technology sector. The aim of this study is to examine how R&D activities and their reporting practices developed in technology companies during 2007-2011. The study was carried out as a descriptive analysis, using as data the financial statements of the technology companies listed on the Helsinki Stock Exchange in 2012. The results show that the combined R&D expenditure of these companies decreased during 2007-2011, mainly because of Nokia. Clear differences were observed between companies both in the scale of their R&D activities and in their accounting treatment of development costs. Based on the scale of R&D activity, the companies were classified as R&D-passive, R&D-neutral, or R&D-intensive. Although IAS 38 requires development costs to be capitalized on the balance sheet when the capitalization criteria are met, a company's level of conservatism ultimately determines how development costs are recognized. Differences in conservatism may therefore weaken the transparency and comparability of financial statements with respect to R&D activity.
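
To make the reporting difference concrete, here is a minimal sketch, with invented figures rather than data from the study, of how capitalizing versus immediately expensing the same development outlay changes reported profit in the year the cost is incurred.

```python
# Hypothetical sketch: effect of IAS 38 development-cost treatment on reported
# profit in the year the cost is incurred. All figures are invented.

def first_year_profit(ebit_before_rd, dev_cost, capitalize, useful_life=5):
    """Capitalizing spreads the cost as amortization; expensing hits profit at once."""
    charge = dev_cost / useful_life if capitalize else dev_cost
    return ebit_before_rd - charge

EBIT, DEV = 100.0, 50.0  # assumed operating profit and development spend (MEUR)
print("capitalized:", first_year_profit(EBIT, DEV, capitalize=True))   # 90.0
print("expensed:   ", first_year_profit(EBIT, DEV, capitalize=False))  # 50.0
```

Two equally profitable firms can thus report very different earnings purely because one is more conservative in applying the capitalization criteria, which is the comparability problem the study identifies.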

Relevance: 80.00%

Abstract:

Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption. Thus, we propose three different approaches for alleviating congestion in the network. The first approach is based on measuring congestion information in different regions of the network, distributing that information over the network, and utilizing it when making routing decisions. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique that makes better routing decisions when traffic information for different routes is available. Faults also degrade performance significantly, since packets must take longer paths to be routed around them, which in turn increases congestion around the faulty regions. We propose four methods to tolerate faults at the link and switch level, using only shortest paths as long as such a path exists. The distinguishing characteristic of these methods is that they tolerate faults while maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first to bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication cause a significant performance loss for unicast traffic, because the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
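
As an illustration of the first congestion-management approach, the sketch below shows a minimal adaptive routing decision in a 2D mesh: among the shortest-path output ports, the least congested one is chosen. This is a hypothetical sketch, not the thesis's algorithm; the port names, coordinate scheme, and congestion table are assumptions.

```python
# Hypothetical sketch of congestion-aware minimal adaptive routing in a 2D mesh NoC.
# Among the shortest-path (minimal) output ports, pick the least congested one.

def route(cur, dst, congestion):
    """cur, dst: (x, y) router coordinates; congestion: dict port -> load in [0, 1]."""
    cx, cy = cur
    dx, dy = dst
    candidates = []                      # minimal-path ports only
    if dx > cx: candidates.append("EAST")
    if dx < cx: candidates.append("WEST")
    if dy > cy: candidates.append("NORTH")
    if dy < cy: candidates.append("SOUTH")
    if not candidates:
        return "LOCAL"                   # packet has arrived
    # Adaptive choice: least congested among the minimal directions.
    return min(candidates, key=lambda p: congestion[p])

# Example: EAST and SOUTH are both minimal, but EAST is congested, so SOUTH wins.
print(route((1, 3), (3, 1), {"EAST": 0.9, "WEST": 0.1, "NORTH": 0.2, "SOUTH": 0.3}))
```

Restricting the choice to minimal directions keeps routing deadlock- and livelock-friendly analyses tractable, which is why the fault-tolerant methods above likewise insist on shortest paths whenever one exists.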

Relevance: 80.00%

Abstract:

Supplier companies in industrial markets have recognized that good customer-supplier relationships improve retention, customer loyalty, and the competitiveness of the companies. This is the key reason for initiating key account management. Another key reason is the globalization of customers, because suppliers then have to serve customers all over the world in a coordinated way. This study aimed to identify and understand the success factors of key account management in order to find out how to improve key account management in the target company. Additional aims were to explore the benefits of key account management from the perspectives of the supplier and the customer, and to find out how to measure them. This qualitative study utilizes a case study approach. Semi-structured interviews with one supplier company in the technology sector and one key account company in the forest industry were conducted and content analyzed. The challenges of the supplier company's key account management were related to the internal organization and the role of key account management in the company. Benefits cited for key account management were ease of doing business from the customer's point of view, strengthened business from the supplier's point of view, and a deepened relationship resulting from increased trust. Key success factors for key account management were the key account manager and the suitability and willingness of the customer for key account management. Sales volume, profitability, and development of market share were identified as suitable measures of the success of key account management. Development proposals for the supplier company's key account management are presented in the report: the role of key account management, the role of the key account manager, the nature and selection of key accounts, the measurement of key account management, and a corporate-wide customer relationship database.

Relevance: 80.00%

Abstract:

The future of privacy in the information age is a highly debated topic. In particular, new and emerging technologies such as ICTs and cognitive technologies are seen as threats to privacy. This thesis explores images of the future of privacy among non-experts within the time frame from the present until the year 2050. The aims of the study are to conceptualise privacy as a social and dynamic phenomenon, to understand how privacy is conceptualised among citizens and to analyse ideal-typical images of the future of privacy using the causal layered analysis method. The theoretical background of the thesis combines critical futures studies and critical realism, and the empirical material is drawn from three focus group sessions held in spring 2012 as part of the PRACTIS project. From a critical realist perspective, privacy is conceptualised as a social institution which creates and maintains boundaries between normative circles and preserves the social freedom of individuals. Privacy changes when actors with particular interests engage in technology-enabled practices which challenge current privacy norms. The thesis adopts a position of technological realism as opposed to determinism or neutralism. In the empirical part, the focus group participants are divided into four clusters based on differences in privacy conceptions and perceived threats and solutions. The clusters are fundamentalists, pragmatists, individualists and collectivists. Correspondingly, four ideal-typical images of the future are composed: ‘drift to low privacy’, ‘continuity and benign evolution’, ‘privatised privacy and an uncertain future’, and ‘responsible future or moral decline’. The images are analysed using the four layers of causal layered analysis: litany, system, worldview and myth. Each image has its strengths and weaknesses. The individualistic images tend to be fatalistic in character while the collectivistic images are somewhat utopian. In addition, the images have two common weaknesses: lack of recognition of ongoing developments and simplistic conceptions of privacy based on a dichotomy between the individual and society. The thesis argues for a dialectical understanding of futures as present images of the future and as outcomes of real processes and mechanisms. The first steps in promoting desirable futures are the awareness of privacy as a social institution, the awareness of current images of the future, including their assumptions and weaknesses, and an attitude of responsibility where futures are seen as the consequences of present choices.

Relevance: 80.00%

Abstract:

Advances in technology have provided new ways of using entertainment and game technology to foster human interaction. Games and playing have always been an important part of people's everyday lives. Traditionally, human-computer interaction (HCI) research was seen as a psychological cognitive science focused on human factors, with the engineering sciences supplying the computer science side. Although cognitive science has made significant progress over the past decade, the influence of people's emotions on design is increasingly important, especially when the primary goal is to challenge and entertain users (Norman 2002). Game developers have explored the key issues in game design and identified user experience as the driving force in the success of games. User-centered design integrates knowledge of users' activity practices, needs, and preferences into the design process. Geocaching is a location-based treasure hunt game created by a community of players. Players use GPS (Global Positioning System) technology to find "treasures" and to create their own geocaches; the game evolves as players invent new caches and use their imagination in creating them. This doctoral dissertation explores the user experience of geocaching and its applications in tourism and education. Globally, according to the Geocaching.com webpage, geocaching has been played in about 180 countries and there are more than 10 million registered geocachers worldwide (Geocaching.com, 25.11.2014). This dissertation develops and presents an interaction model called the GameFlow Experience model that can be used to support the design of treasure hunt applications in tourism and education contexts. The GameFlow model presents and clarifies various experiences; it situates those experiences in a real-life context, offers desirable design targets to be utilized in service design, and offers a perspective for evaluating the success of adventure game concepts. User-centered game design has adapted human factors research from mainstream computing science. For many years, the user-centered design approach has been the most important research field in software development. Research has focused on user-centered design in software such as office programs, but the same ideas and theories are now also being applied to game design (Charles et al. 2005). For several years, we have seen a growing interest in user experience design. Digital games are experience providers, and game developers need tools to better understand the user experience related to the products and services they create. This thesis presents what the user experience is in geocaching and treasure hunt games and how it can be used to develop new treasure hunt concepts. Engineers, designers, and researchers should have a clear understanding of what user experience is, what its parts are, and, most importantly, how we can influence user satisfaction. In addition, we need to understand how users interact with electronic products and people, and how different elements combine to shape their experiences. This doctoral dissertation represents pioneering work on the user experience of geocaching and treasure hunt games in the context of tourism and education. The research also provides a model for game developers who are planning treasure hunt concepts.
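
Since geocaching play turns on GPS proximity to a cache, the following minimal sketch, which is not part of the dissertation, shows the kind of distance check a treasure hunt application might perform; the coordinates, the 10-metre radius, and the function names are assumptions.

```python
# Hypothetical sketch: great-circle (haversine) distance check for a treasure hunt app.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two WGS84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_cache(player, cache, radius_m=10.0):
    """True when the player is within radius_m metres of the cache."""
    return haversine_m(*player, *cache) <= radius_m

# Example: a player roughly 8 m from a cache triggers the 'found' state.
print(near_cache((60.4518, 22.2666), (60.45187, 22.26666)))
```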

Relevance: 80.00%

Abstract:

This bachelor's thesis examines how the business cycle fluctuations caused by the financial crisis that escalated in 2008 affected the performance of technology companies listed on Nasdaq OMX Helsinki. The main objective of the thesis is to determine which kinds of listed technology companies the financial crisis affected most. The analysis was carried out with quantitative methods, using cluster analysis and the Kruskal-Wallis test.
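
A minimal sketch of this kind of analysis is shown below, using invented stand-in data for a single performance measure; the three-cluster choice, the variable, and the libraries used are assumptions, not details from the thesis.

```python
# Hypothetical sketch: cluster firms by a performance measure, then test for
# differences between clusters with the Kruskal-Wallis test.
import numpy as np
from scipy.stats import kruskal
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Assumed stand-in data: one performance measure (e.g. return on assets)
# per firm, drawn around three different levels.
roa = rng.normal(loc=[[8.0], [2.0], [-4.0]], scale=2.0, size=(3, 30)).reshape(-1, 1)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(roa)
groups = [roa[labels == k].ravel() for k in range(3)]

# H0: the clusters come from populations with the same distribution.
stat, p = kruskal(*groups)
print(f"H = {stat:.2f}, p = {p:.4f}")
```

The Kruskal-Wallis test is the natural companion to clustering here because it compares the groups without assuming normally distributed performance figures.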

Relevance: 80.00%

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. The problem is how to define secure software, or how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis focuses on the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming language specifics are not discussed in this work; organizational policy, management issues, and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server, Apache James, whose changelog, security issue history, and source code were available. The research revealed that there is a consensus in the terminology of software security. Security verification activities are commonly divided into evaluation and assurance; the focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of software, but in practice they were limited to comparing different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design, and interpreting their results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out areas where security metrics must improve if verification of security at the design level is to be achieved.
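
To make the "relative only" limitation concrete, here is a toy sketch, emphatically not one of the thesis's 34 metrics: a design-level score that counts a design's externally reachable operations as a crude attack-surface proxy. Its value is meaningful only when comparing versions of the same software, which mirrors the limitation the thesis reports.

```python
# Toy sketch of a relative, design-level security score (not from the thesis):
# the share of a design's operations that are externally reachable, as a crude
# attack-surface proxy. Comparable only across versions of the same system.

def exposure_ratio(design):
    """design: mapping of component -> {'public': n_public_ops, 'total': n_ops}."""
    public = sum(c["public"] for c in design.values())
    total = sum(c["total"] for c in design.values())
    return public / total if total else 0.0

# Invented component figures for two versions of the same hypothetical system.
v1 = {"smtp": {"public": 6, "total": 20}, "mailet": {"public": 10, "total": 25}}
v2 = {"smtp": {"public": 4, "total": 22}, "mailet": {"public": 8, "total": 30}}

# A lower ratio in v2 suggests a reduced attack surface relative to v1 --
# but the absolute number says nothing on its own, and the score cannot
# point to a specific design flaw, just as the thesis observes.
print(f"v1: {exposure_ratio(v1):.2f}  v2: {exposure_ratio(v2):.2f}")
```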

Relevance: 80.00%

Abstract:

The study focuses on five lower secondary school pupils' daily use of their one-to-one computers, the overall aim being to investigate literacy in this form of computing. Theoretically, the study is rooted in the New Literacy tradition with an ecological perspective, in combination with socio-semiotic theory in a multimodal perspective. New Literacy in the ecological perspective focuses on literacy practices and place/space and on the links between them. Literacy is viewed as socially based, in specific situations and in recurring social practices. Socio-semiotic theory embodying the multimodal perspective is used for the text analysis. The methodology is socio-semiotic ethnography. The ethnographic methods encompass just over two years of fieldwork with participant observation of the five participants' computing activities at home, at school and elsewhere. The participants, one boy and two girls from the Blue (Anemone) School and two girls from the White (Anemone) School, were chosen to reflect a broad spectrum of sociocultural and socioeconomic backgrounds. The study shows a both broad and deep variation in the way digital literacy features in the participants' one-to-one computing. These variations are associated with experience in relation to the home, the living environment, place, personal qualities and school. The more varied computer usage of the Blue School participants is connected with the interests they developed in their homes and living environments and with the computing practices undertaken in school. Their more varied usage of the computer is reflected in their broader digital literacy repertoires and their greater number and variety of digital literacy abilities. The Blue School participants' text production is more multifaceted, covers a wider range of subjects and displays a broader palette of semiotic resources. It also combines more text types, and the texts are generally longer than those of the White School participants. The Blue School girls have developed a text culture that is close to that of the school. In their case, there is clear linkage between school-initiated and self-initiated computing activities, while the other participants do not have the same opportunities to link and integrate self-initiated computing activities into the school context. It also becomes clear that the Blue School girls can relate and adapt their texts to different communicative practices and recipients. In addition, the study shows that the Blue School girls have gained some degree of scope in their school practice by incorporating into it certain communicative practices that they have developed in non-school contexts. Quite contrary to the hopes that one-to-one computing would reduce digital inequality, it has increased among these participants. Whether the same or similar results apply in a larger perspective, on a more structural level, is a question that this study cannot answer; it can only draw attention to the need to investigate the matter. The study shows in a variety of ways that the White School participants do not have the same opportunity to develop their digital literacy as the Blue School participants. From an equity perspective, schools have a compensatory task to perform. It is abundantly clear from the study that investing in one-to-one projects is not enough to combat digital inequality and achieve the digitisation goals established for school education. Alongside their investments in technology, schools need to develop a didactic approach that legitimises and compensates for the differing circumstances of different pupils. The compensatory role of schools in this connection is important not only for the present participants but also for the community at large, in that it can help to secure a cohesive, open and democratic society.

Relevance: 80.00%

Abstract:

Don Chapman was a Silver Badger, a unique distinction given to the first class of Brock University students upon their graduation in 1967 and 1968. Mr. Chapman was an active participant in student life during his years at Brock University. After graduation he continued to take an active role as a member of the alumni of Brock University. Mr. Chapman was a teacher at St. John's-Kilmarnock School, Waterloo, Ont., until his death in 2005.

Relevance: 80.00%

Abstract:

Norah and Fred Fisher welcomed John Fisher into the world on November 29, 1912, not knowing what an influential role he would play in shaping Canada's history. John Fisher grew up as the middle child of five brothers and sisters in Frosty Hollow, New Brunswick, close to today's town of Sackville. Sackville's main industry was the Enterprise Foundry, which the Fisher family owned and operated; however, Fisher had no plans to go into the family business. He was more inspired by his maternal grandfather, Dr. Cecil Wiggins, who lived with the family after retiring from the Anglican ministry. Wiggins encouraged all his grandchildren to be well read and to take part in discussions on current events. There were often visitors in the Fisher household taking part in discussions about politics, religion, and daily life. Fisher forced himself to take part in these conversations to help overcome his shyness in social settings. These conversations did help with his shyness and also in forming many opinions and observations about Canada. They put Fisher on the road to becoming Mr. Canada and delivering the many eloquent speeches for which he was known. Fisher did not venture far from home to complete his first degree: in 1934 he graduated from Mount Allison University in Sackville, NB with an Arts degree. The same year, Fisher enrolled in Dalhousie's law school. During his time at Dalhousie, Fisher discovered radio through Hugh Mills. Mills, or "Uncle Mel", was on CHNS, Halifax's only radio station at the time. Fisher began by making appearances on the radio drama show. By 1941 he had begun writing and broadcasting his own works and had joined the staff as an announcer and continuity writer. In 1936 the Canadian Broadcasting Corporation, the first national radio network, was formed. Fisher joined the CBC shortly after its inception and remained with it, as well as with the Halifax Herald newspaper, even after his law school graduation in 1937. By 1943 Fisher's talks had become part of the CBC's programming for a group of Maritime radio stations. Fisher once described his talks as follows: "my talks weren't meant to be objective... they were meant to be favourable. They were 'pride builders'". He began his famed John Fisher Reports at CBC Toronto when he transferred there shortly after the war. This program brought immense pride to the fellow Canadians he spoke about, leading to approximately 3,500 requests per year to speak at banquets and meetings throughout Canada and the United States. Fisher was a well-travelled individual who would draw on personal experiences to connect with his audience. His stories were told in simple, straightforward language for anyone to enjoy. He became a smooth, dynamic and passionate speaker who sold Canada to Canadians, and a renowned journalist, folk historian, writer and broadcaster. Fisher was able to reach a vast array of people through his radio work and build Canadian pride, but he did not stop there. His other contributions to Canada and Canadians include: being honoured by five Canadian universities; becoming Director of the Canadian Tourist Association in 1956; being appointed Special Assistant to the Prime Minister of Canada in 1961; serving from 1963 as Commissioner of the Centennial Commission (the federal agency responsible for Canada's 100th birthday); receiving the Service Medal, a coveted Order of Canada honour, in 1968; and serving as president of John Fisher Enterprises Ltd., doing private consulting work specializing in Centennial planning, broadcasts, lectures and promotion.
John Fisher continued recording radio broadcasts even after his diagnosis with cancer. He would record three or four at a time so that he was free to travel across Canada, the U.S., Europe and Mexico in search of treatments. Fisher passed away from the disease on February 15, 1981, and is buried at Mount Pleasant Cemetery in Toronto.

Relevance: 80.00%

Abstract:

This thesis examines the appropriation of the Internet and multimedia by the university population of French-speaking Africa in 2001. It covers six countries: Benin, Burkina Faso, Cameroon, Côte d'Ivoire, Mali and Togo. The research draws on a census of demographic research centres in French-speaking sub-Saharan Africa and on a survey conducted at the universities of Yaoundé II and Douala in Cameroon. The issue of access and use is central to our approach. It is expressed in the following research question: "In a context dominated by representations of ICTs as symbols of modernity and factors of integration into the global economy, what are the modes of appropriation of these technologies by academics in the African teaching and research institutions considered in this study?" To approach the empirical material, we chose two theoretical approaches: development theories in relation to (new) media, and the sociology of technical innovations. Rooted in Enlightenment thought, complemented and refined by evolutionist approaches inspired by Spencer, by Parsonian functionalism and by a political economy centred on the thought of W. W. Rostow, development theories have drawn heavily on communication theories to reach their object. While the crisis of Western modernity threatens to delegitimize these paradigms, emerging technologies give them new life: in continuity with the thought of Auguste Comte, development is now conceived in terms of integration into a new type of society, the information society. This new eschatological promise, and this faith in technology as a factor of integration into the networked society and economy, pervade all the projects carried out on the continent, whether NEPAD, the Digital Solidarity Fund, the $100 computer project for underprivileged children, or the pan-African satellite coverage project, RASCOM. The second strand of our theoretical framework centres on the sociology of technical innovations. We draw on Vedel and Vitalis's sociopolitics of uses to bring critical reason back into the debate on the development of the African continent, in order to show that the political prerogative assumed by states still has its place if digital resources are to serve social demands and not only the solvent demand essentially located in urban centres. By rejecting the technical determinism so common in development thinking, we aim to show that the future of a technology is not inscribed in its essence, like a cast shadow, but that human action, notably political action, can bend the trajectory of technical innovations towards serving the aspirations of citizens. Methodologically, the approach combines quantitative and qualitative methods. The former allow us to measure the presence of the Internet and multimedia in the respondents' environment; the latter help us grasp the representations users develop in contact with these tools. From a socioconstructivist perspective, these discourses are constitutive of the technologies, insofar as they are so many modes of appropriation, of the social construction of use. Ultimately, the integration of the technical language of multimedia tools into users' everyday language marks the final stage of this appropriation. The research found that few users employ audiovisual technologies in a professional context. As for the Internet and multimedia tools, their presence and use remain limited, since physical access is not yet guaranteed to all respondents in the study. The Internet raises great hopes but remains largely inaccessible in professional settings, with the majority of users falling back on public venues such as cybercafés to compensate for the lack of resources within their own institutions. Representations, for their part, remain largely shaped by the dominant political and institutional discourses, according to which the future will be digital or will not be at all. The thesis nevertheless goes beyond these data to draw the current digital map of the continent, integrating into the new technological landscape the meteoric rise of mobile telephony. It appears that the Internet, whose diffusion on the continent has been more than modest, could benefit greatly from the emerging mobile culture on the continent, fostered in particular by the convergence between mini-laptops and mobile telephony.

Relevance: 80.00%

Abstract:

The main objective of this thesis is to explore and analyze the reception of Eugen Wüster's work in order to explain how it influenced the disciplinary development of terminology. Historically, Wüster's work, in particular the General Theory of Terminology, stimulated research in terminology. Despite diverging opinions, there is broad agreement that Wüster's work is the cornerstone of modern terminology. Our research specifically explores the reception of Wüster's work by studying writings about it in the academic literature in English, Spanish and French between 1979 and 2009, in Europe and America. Conducted within the debate on the reception of Wüster's work, this study concentrates exclusively on the analysis of criticisms and commentaries of his work. To this end, we took into account Wüster's intellectual production, its positive or negative reception, new theoretical approaches in terminology, and studies on the state of the art in terminology between 1979 and 2009. Using exploratory qualitative research, we analyzed a corpus of texts in which the authors: (a) quoted at least one excerpt from a text written by Wüster; (b) referred to Wüster's works in the article's bibliography; or (c) commented on those works. In this way we identified the broad lines of the debate around the reception of his work. The results of our study are telling and offer a clear picture of the reception of Wüster's work in the scientific community. First, Wüster is a central figure of modern terminology with respect to terminological standardization, and he was the first to propose a theory of terminology. Second, an appropriate contextualization of his work is an essential starting point for an informed and fair appreciation of his contribution to the evolution of the discipline. Third, the results of our research reveal how new theoretical approaches to terminology have adapted to scientific and technical progress. Fourth, a study of 166 articles published in scholarly journals confirms that Wüster's work provoked varied reactions in both Europe and America and that its reception is, on the whole, rather positive. Our results also point to a tendency among authors to criticize works of Wüster with which, in most cases, they do not seem to be well acquainted. The "methodology of scientific research programmes" proposed by Lakatos (1978), applied as an interpretive model, allowed us to demonstrate that Wüster played a decisive role in the development of terminology as a discipline and that terminology can be seen as a scientific research programme. The main conclusion of our thesis is that terminology has undergone considerable and progressive changes that have helped it become, in Lakatosian terms, a strong discipline both theoretically and descriptively.

Relevance: 80.00%

Abstract:

Scientific and technological progress is not without flaws: the unforeseen consequences of its application can create new problems. This is the Machiavellian observation underlying the Canadian Centre for Architecture project En imparfaite santé : la médicalisation de l'architecture (Imperfect Health: The Medicalization of Architecture, 2011-2012), presented as an exhibition and a catalogue. This thesis studies how the two platforms, the first experiential and the second theoretical, formulate a critique of the current process of medicalization, which has entered the field of contemporary architecture. The exhibition is approached both as a discourse and as an installation of objects for a public; particular attention is therefore paid to the scenography and to the visitor's route. Further reflections concern the graphic design, a tool supporting the leitmotif of confrontation. In the study of the catalogue, the emphasis is placed on the introductory essay, which is implicitly traversed by the fundamentally ambivalent concept of the pharmakon. The peritext, the physical framing of the book's main content, is also examined. The comparative analysis then proposes that each platform conveys a different message, a strategy made possible by the ambivalence of the notion of the body, understood both literally and metaphorically. The thesis concludes by sketching a short proposal for contextualizing both this duality and the questioning of the authority of techno-scientific discourse. Although En imparfaite santé directs its critique at the persistence of the modernist vision of architecture, we argue that the project concerns just as much, if not more, the current omnipresence of the digital. The digital, like modern architecture, not only modifies the conception of the human and architectural body; it also reinforces a positivist belief in technology that is not always counterbalanced by critical thinking.

Relevance: 80.00%

Abstract:

This research examines the challenges of the future housing of the baby-boomer generation, particularly those born between 1945 and 1953, who are now reaching retirement. The research is situated at the crossroads of, on the one hand, the vision of what it means to dwell according to authors such as Benoit Goetz (2011) and philosophers such as Heidegger (1958), Bachelard (1957), Benjamin (1955), Buber (1962) and Deleuze (1980), who emphasize factors of porosity and links to others, and, on the other hand, the characteristics of the baby boomers as presented by Jean-François Sirinelli (2003) and Josée Garceau (2012). This well-informed generation intends to remain active and engages in practices of "adeption" (the author's term) that shape, through everyday gestures, a know-how of dwelling and thereby the dwelling itself. The field study probed baby boomers' aspirations concerning their residential choices for the future, in order to understand on what values and towards what goals their plans are built. The choice of qualitative methods builds on a prior viewing of a recent film, Et si on vivait tous ensemble. Semi-structured interviews with five baby boomers aged 60 to 65, conducted in two phases with approved transcripts, were based on three themes: memory, "adeption" and the project. Among other results, several theoretical concepts are confirmed with variations, such as porosity and openness through the window, at once physical and virtual, but the findings also highlight the spectre of the retirement home and financial concerns about the future of a necessarily autonomous dwelling. This generation, steeped in the technological world, wants access to everything modernity offers without losing the sense of the historicity of their lives. Born into a world in ferment, the baby boomers have reinvented every stage of their existence, which suggests that this generation is about to reinvent retirement and its residential settings. Design approaches will therefore have to renew themselves completely for these new users.