37 results for Virtual business


Relevance: 30.00%

Abstract:

This article expands the discussion of the impact of technology on services and contributes to a broader understanding of the nature of virtual services. It does so by identifying dimensions that distinguish physical services from virtual services, i.e. services that are delivered by electronic means and in which the customer has no direct human interaction with the service provider. Differences in the core characteristics of services, the servicescape and service delivery are discussed, and dimensions that differentiate between virtual services are analysed. A classification scheme for virtual services is proposed, covering the origin of the service, the element of the service offering, the customisation process, the stage of the service process performed, and the degree of mobility of the service.
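As a reading aid, the five proposed dimensions could be captured in a simple record type. The sketch below is illustrative only: the field names follow the abstract, but the example values are hypothetical, since the abstract does not enumerate the categories within each dimension.

```python
from dataclasses import dataclass

@dataclass
class VirtualServiceProfile:
    """One record per virtual service, with one field per proposed dimension."""
    origin: str              # origin of the service
    offering_element: str    # element of the service offering
    customisation: str       # customisation process
    process_stage: str       # stage of the service process performed
    mobility: str            # degree of mobility of the service

# Hypothetical example: an online banking service classified along the five dimensions.
example = VirtualServiceProfile(
    origin="existing physical service moved online",
    offering_element="core service delivered electronically",
    customisation="automated, rule-based",
    process_stage="delivery and after-sales",
    mobility="accessible from any networked device",
)
print(example)
```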

Relevance: 20.00%

Abstract:

Background. Contemporary Finnish, both spoken and written, contains loanwords or foreignisms in the form of hybrids, mixtures of Finnish and foreign syllables (alumiinivalua). Sometimes loanwords are inserted into a Finnish sentence in their raw form, just as they appear in the source language (pulp, after sales palvelu). At other times loanwords are calques, which appear Finnish but are spelled and pronounced in an altogether foreign manner (Protomanageri, Promenadi kampuksella).

Research questions. What role does Finnish business translation play in the migration of foreignisms into Finnish if we consider translation "as a construct of solutions determined by the ideological constraints and conflicts characterizing the target culture" (Robyns 1992: 212)? What attitudes do Finns display toward the presence of foreignisms in their language? What socio-economic or ideological conditions (Bassnett 1994: 321) are responsible for these attitudes? Are these conditions dynamic? What tools can be used to measure such attitudes? This dissertation set out to answer these and similar questions. Attitudes may be imperialist (otherness is both denied and transformed), defensive (otherness is acknowledged, transformed, and vilified), transdiscursive (a neutral attitude to both otherness and transformation), or defective (alien migration is acknowledged and "stimulated") (Robyns 1994: 60).

Methodology. The research method follows Rose's schema (1984: 8): (a) take an existing theory, (b) develop from it a proposition specific enough to be tested, (c) devise a scheme that tests this proposition, (d) carry the scheme through in practice, (e) draw up the results and discuss the conclusions in relation to the original theory. In other words, the method attempts to explain a Finnish social phenomenon through systematic analyses of translated evidence (Lewins 1992: 4), where what really matters is the logical sequence connecting the empirical data to the research questions raised above and, ultimately, to the conclusion (Yin 1984: 29).

Results. The research found that the Finnish translators of the Nokia annual reports used a foreignism whenever possible, such as komponentin instead of rakenneosa or investoida instead of sijoittaa, often with no apparent justification (Pryce 2003: 203-12) beyond the translator's personal preference. In the old documents (minutes of meetings of the Board of Directors of Osakeyhtio H. Saastamoinen, Ltd. dated 5 July 1912-1917, a NOPSA booklet (1932), an Enzo-Gutzeit-Tornator Oy document (1938), the Imatra Steel Oy Annual Report 1964, and the Nokia Oy Annual Report 1946), foreignisms under Haugen's (1950: 210-31) Classification #1 occurred an average of 0.6 times, while in the new documents (the Nokia 1998 translated Annual Reports) they occurred an average of 6.5 times. This large difference suggests transdiscursive and defective attitudes in Finnish society toward the other. In the 1850s, by contrast, Finnish attitudes toward alien persons and cultures were hard, intolerant and prohibitive: language politics were still nascent, and Finns adopted a defensive stance (Paloposki 2002: 102 ff.) to protect cultural and national treasures such as language and folklore.

Innovation. No prior doctoral-level research has measured Finnish attitudes toward foreignisms using a business translation approach, and this is the first time that Haugen's classification has been modified and applied to target-language analysis. It is hoped that the method will be replicated in similar research in the future.

Applications. Researchers interested in languages, language development, language influences, language ideologies, and the power structures that affect national language policies will find this thesis useful, especially the model for collecting, grouping, and analyzing foreignisms demonstrated here. The thesis also documents for posterity the attitudes of Finns toward the other as revealed in business translations from 1912-1964 and in 1998, so that future language researchers can explore a timeline of Finnish language development and attitudes toward the other. Communication firms may also find this research interesting. Could the model adopted here be used in the future to analyze, for example, literary or religious texts?

Future trends. Although business documents show transdiscursive attitudes, other segments of Finnish society may show defensive or imperialist attitudes. When the ideology of industrialization changes in the future, will Finnish attitudes toward the other change as well? Will it then be possible to use the same kind of analytical tools to measure Finnish attitudes? More broadly, will linguistic change continue in the direction of transdiscursive attitudes, or will it slow down or even reverse into xenophobic attitudes? Is the model culture-specific, or can it be used in the context of other cultures?

Conclusion. Newspaper publications and television broadcasts show anger against foreignisms in Finland, but research shows that a majority of Finns consider foreignisms, and the languages from which they come, to be sources of enrichment for Finnish culture (Laitinen 2000; Eurobarometer series 41 of July 1994, 44 of Spring 1996, 50 of Autumn 1998). The ideologies of industrialization and globalization in Finland have facilitated transdiscursive tendencies. When Finland's political ideology was intolerant toward foreign influences in the 1850s, as the country was consolidating its nascent nationhood and language, attitudes toward the importation of loanwords were likewise intolerant. Now that industrialization and globalization have become the dominant ideologies, attitudes have shifted toward transdiscursive tendencies. Ideology is usually unseen and too often ignored by translation researchers, yet it reveals itself as the most powerful factor affecting language attitudes in a target culture.

Key words: Finnish, business translation, ideology, foreignisms, imperialist attitudes, defensive attitudes, transdiscursive attitudes, defective attitudes, the other, old documents, new documents.
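The frequency comparison reported above (0.6 vs. 6.5 average occurrences) can be reproduced mechanically once the foreignisms have been classified. The sketch below is a minimal illustration with a hard-coded, hypothetical word list; it does not reproduce Haugen's actual classification criteria or the dissertation's corpus.

```python
import re

def count_foreignisms(text: str, foreignisms: set[str]) -> int:
    """Count occurrences of pre-classified foreignisms in one document."""
    tokens = re.findall(r"\w+", text.lower())
    return sum(1 for token in tokens if token in foreignisms)

def average_per_document(documents: list[str], foreignisms: set[str]) -> float:
    """Average number of foreignism occurrences per document in a corpus."""
    if not documents:
        return 0.0
    return sum(count_foreignisms(d, foreignisms) for d in documents) / len(documents)

# Hypothetical word list; in the dissertation the items would come from the
# modified Haugen Classification #1, not from this hard-coded set.
classified = {"komponentin", "investoida"}

old_corpus = ["..."]   # placeholder for the 1912-1964 documents
new_corpus = ["..."]   # placeholder for the translated Nokia 1998 annual reports

print(average_per_document(old_corpus, classified))
print(average_per_document(new_corpus, classified))
```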

Relevance: 20.00%

Abstract:

A 26-hour English reading comprehension course was taught to two groups of second-year Finnish pharmacy students: a virtual group (33 students) and a teacher-taught group (25 students). The aims of the teaching experiment were to find out: 1. What has to be taken into account when teaching English reading comprehension to students of pharmacy via the Internet, using TopClass? 2. How will the learning outcomes of the virtual group and the control group differ? 3. How will the students and the Department of Pharmacy respond to the new and different method, i.e. the virtual teaching method? 4. Will it be possible to test English reading comprehension learning material using the groupware tool TopClass? The virtual exercises were written within the Internet authoring environment TopClass. The virtual group was given the reading material and grammar booklet on paper, but did the reading comprehension tasks (written by the teacher) autonomously via the Internet. The control group was taught by the same teacher in 12 two-hour sessions, while the virtual group could work independently within the given six weeks. Both groups studied the same material: ten pharmaceutical articles with reading comprehension tasks as well as grammar and vocabulary exercises. Both groups took the same final test. Students in both groups were asked to evaluate the course on a 1-to-5 rating scale and to assess their respective courses verbally. A detailed analysis of the different aspects of the student evaluation is given. Conclusions: 1. The virtual students learned pharmaceutical English relatively well, but not significantly better than the classroom students. 2. Overall student satisfaction in the virtual pharmacy English reading comprehension group was higher than in the teacher-taught control group. 3. Virtual learning is easier for linguistically more able students; less able students need more time with the teacher. 4. The sample in this study is rather small, but it is a pioneering study. 5. The Department of Pharmacy at the University of Helsinki wishes to incorporate virtual English reading comprehension teaching into its curriculum. 6. The sophisticated and versatile TopClass system is relatively easy for a traditional teacher, and quite easy for the students, to learn. It can be used, for example, for automatic checking of routine answers and for document transfer, both of which lighten the workload of both parties. It is especially convenient for teaching reading comprehension. Key words: English reading comprehension, teacher-taught class, virtual class, attitudes of students, learning outcomes
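The conclusion that the virtual group did not learn significantly better than the classroom group implies a statistical comparison of the two groups' final-test results. A minimal sketch of such a comparison is shown below with hypothetical scores and an independent-samples t-test; the abstract does not state which statistical procedure was actually used.

```python
from scipy import stats

# Hypothetical final-test scores (0-100); the real data are not given in the abstract.
virtual_scores = [72, 68, 80, 75, 64, 78, 70, 83, 69, 74]      # virtual group
classroom_scores = [70, 66, 77, 73, 65, 71, 69, 76, 68, 72]    # teacher-taught group

t_stat, p_value = stats.ttest_ind(virtual_scores, classroom_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above the chosen significance level would match the reported finding
# of "relatively well but not significantly better".
```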

Relevance: 20.00%

Abstract:

My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925-1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition, which Deleuze calls representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and by explaining the manifest diversity of the given through such notions as essence, idea, God, or the totality of the world. In contrast, Deleuze states that such abstractions do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze calls his philosophy transcendental empiricism and seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; the concept itself must be explained. From this point of view, the process of individualisation becomes the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis a morphogenetic theory of individuation-as-process develops. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality. As the virtual concerns becoming, or the continuous process of actualisation, time, rather than space, is the privileged field of consideration. Chapter three is devoted to the temporal aspect of the virtual and to difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic, the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. On the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice, and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity with regard to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.

Relevance: 20.00%

Abstract:

Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study brings a postal-historical aspect to the academic discussion, and it adds another new aspect as well: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, the possibility to react rapidly was at least equally important. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes, and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds, with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system. The evolution generally proceeded as follows: originally there was a more or less irregular system, then a regular system, and finally a more frequent regular system of mail services. The trend was from sail to steam, but both means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year if the communication was not frequent enough. Neither did improved frequency advance information circulation if the trip was very long or if the sailings overlapped instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.), but especially by organizing sailings so that recipients could reply to arriving mail without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts across isthmuses (e.g. Panama, Suez), together with cooperation between steamships and railways, enabled the most effective improvements in global communications before the introduction of the telegraph.
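The abstract does not spell out how "consecutive information circles per year" is computed. Assuming one circle is a complete exchange, outbound transit plus the recipient's wait for the next departure plus the return transit, a minimal sketch of the measure would be:

```python
def information_circles_per_year(outbound_days: float,
                                 reply_wait_days: float,
                                 return_days: float) -> float:
    """Number of complete correspondence round trips possible in a year,
    assuming each reply leaves with the next available sailing."""
    circle_length = outbound_days + reply_wait_days + return_days
    return 365.0 / circle_length

# Hypothetical figures: a 40-day outbound passage, a 14-day wait for the
# next departure, and a 40-day return passage.
print(information_circles_per_year(40, 14, 40))   # ~3.9 circles per year

# With a more frequent service the wait shrinks, even if the passage itself
# is no faster, and the number of circles rises.
print(information_circles_per_year(40, 3, 40))    # ~4.4 circles per year
```

This toy calculation also illustrates the point made above: faster sailings alone help little if the service is infrequent, because the waiting time dominates the length of the circle.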

Relevance: 20.00%

Abstract:

The aim of the study was to analyze and facilitate collaborative design in a virtual learning environment (VLE). Discussions of virtual design in design education have typically focused on technological or communication issues, not on pedagogical issues. Yet in order to facilitate collaborative design, it is also necessary to address the pedagogical issues related to the virtual design process. In this study, the progressive inquiry model of collaborative designing was used to give a structural level of facilitation to students working in the VLE. According to this model, all aspects of inquiry, such as creating the design context, constructing a design idea, evaluating the idea, and searching for new information, can be shared in a design community. The study consists of three design projects: 1) designing clothes for premature babies, 2) designing conference bags for an international conference, and 3) designing tactile books for visually impaired children. These design projects constituted a continuum of design experiments, each of which highlighted certain perspectives on collaborative designing. The design experiments were organized so that the participants worked in design teams, both face-to-face and virtually. The first design experiment focused on peer collaboration among textile teacher students in the VLE. The second design experiment took end-users' needs into consideration by using a participatory design approach. The third design experiment intensified computer-supported collaboration between students and domain experts. The virtual learning environments in these design experiments were designed to support knowledge-building pedagogy and progressive inquiry learning. These environments enabled detailed recording of all computer-mediated interactions and data related to virtual designing. The data analysis was based on qualitative content analysis of design statements in the VLE. The study indicated four crucial issues concerning collaborative design in the VLE in craft and design education. Firstly, using the collaborative design process in craft and design education gives rise to the special challenges of building learning communities, creating appropriate design tasks for them, and providing tools for collaborative activities. Secondly, the progressive inquiry model of collaborative designing can be used as a scaffold for design thinking and for reflection on the design process. Thirdly, participation and distributed expertise can be facilitated by identifying the key stakeholders related to the design task or design context and getting them to participate in virtual designing. Fourthly, in the collaborative design process it is important that team members create and improve visual and technical ideas together, not just agree or disagree about proposed ideas. Viewing the VLE as a medium for the collaborative construction of design objects therefore appears crucial for understanding and facilitating the complex processes in collaborative designing.

Relevance: 20.00%

Abstract:

Information visualization is the process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize, with little effort, patterns, trends and anomalies inherent in data shown in a visual display; such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment in order to facilitate business intelligence (BI) and performance management activities. In this study, we review research on the principles of human visual perception and information visualization as well as the application of visualization in business decision support systems. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.

Relevance: 20.00%

Abstract:

The dissertation consists of four essays that examine questions in empirical labour economics. The first essay examines the effect of the level of unemployment benefits on re-employment in Finland. In 2003, earnings-related unemployment benefits were raised for workers with a long employment history; the increase was on average 15% and applied to the first 150 days of unemployment. The essay estimates the effect of the increase by comparing re-employment probabilities between the group that received the increase and a comparison group, before and after the reform. The results indicate that the benefit increase lowered the probability of re-employment significantly, by about 16% on average. The effect is largest at the beginning of the unemployment spell and disappears once entitlement to the increased earnings-related benefit ends. The second essay studies the long-term costs of unemployment in Finland, focusing on the deep recession of 1991-1993, during which plant closures increased sharply and the unemployment rate rose by more than 13 percentage points. The essay compares prime-working-age men who became unemployed because of a plant closure during the recession with men who remained employed, following them over a six-year period. In 1999, the annual earnings of the group that had experienced unemployment during the recession were on average 25% lower than those of the comparison group; the earnings loss was due to both lower employment and lower wages. The third essay examines the unemployment problem caused by Finland's recession of the early 1990s by studying the individual-level determinants of unemployment duration, focusing on how changes in the composition of the unemployed and in labour demand affected average duration. It is often assumed that a recession leaves unemployed those with below-average employment prospects, which would in itself lengthen the average duration of unemployment. The results indicate that the macro-level demand effect was central to unemployment duration, while compositional changes had only a small duration-increasing effect during the recession. The final essay studies the effect of business-cycle fluctuations on the incidence of workplace accidents, using Swedish individual-level hospitalization data linked to a population database. The data make it possible to examine alternative explanations for the increase in accidents during booms, which has been attributed, for example, to stress or haste. The results indicate that workplace accidents are cyclical, but only for certain groups. Variation in the composition of the workforce may explain part of the cyclicality of women's accidents. For men, only less severe accidents are cyclical, which may be due to strategic behaviour.
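The first essay's design, comparing re-employment probabilities in the treated and comparison groups before and after the 2003 reform, amounts to a difference-in-differences comparison. The sketch below shows that calculation with hypothetical probabilities; the essay's actual estimator and figures are not given in the abstract.

```python
def difference_in_differences(treated_pre: float, treated_post: float,
                              control_pre: float, control_post: float) -> float:
    """Difference-in-differences estimate from four group-level averages."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical re-employment probabilities, before and after the reform.
effect = difference_in_differences(
    treated_pre=0.20, treated_post=0.17,   # workers entitled to the benefit increase
    control_pre=0.21, control_post=0.21,   # comparison group, unaffected by the reform
)
print(f"Estimated effect on re-employment probability: {effect:+.3f}")
# A negative estimate of this kind corresponds to the reported finding that the
# benefit increase lowered the probability of re-employment.
```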

Relevance: 20.00%

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of a user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of a user's data could be stored on some of his family members' computers and on some of his own computers, but also at some online services which he uses. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and being monitored. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
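One standard way of making content references cryptographically verifiable, as described above, is to derive the name from a hash of the content and check it on retrieval. The sketch below illustrates that general idea in isolation; it is not the API of Peerscape or of the systems studied.

```python
import hashlib

def content_name(data: bytes) -> str:
    """Derive a verifiable name from the content itself (SHA-256 digest)."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, name: str) -> bool:
    """Check that data received from any replica still matches its name."""
    return content_name(data) == name

photo_album_entry = b"family photo album, entry 1"
name = content_name(photo_album_entry)

# The same name stays valid no matter which replica (a relative's machine,
# the user's own computers, or an online service) supplied the bytes.
print(verify(photo_album_entry, name))      # True
print(verify(b"tampered content", name))    # False
```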

Relevance: 20.00%

Abstract:

The ProFacil model is a generic process model, defined as a framework model showing the links between the facilities management process and the building end user's business process. The purpose of using the model is to support more detailed process modelling. The model has been developed using the IDEF0 modelling method. The ProFacil model describes business activities from a generalized point of view as management, support, and core processes and their relations. The model defines basic activities in the provision of a facility. Examples of these activities are "operate facilities", "provide new facilities", "provide re-build facilities", "provide maintained facilities" and "perform dispose of facilities". These are all generic activities that provide a basis for further specialisation of company-specific FM activities and their tasks. A facilitator can establish a specialized process model by using the ProFacil model and interacting with company experts to describe their company's specific processes. These modelling seminars or interviews are conducted in an informal way, supported by the high-level process model as a common reference.
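For illustration, a specialized model built on the generic activities could be recorded as a simple mapping from ProFacil's generic activities to company-specific ones. The sketch below is purely hypothetical: the generic activity names come from the abstract, but the company-specific entries are invented, and the IDEF0 notation and the model's actual decomposition are not reproduced.

```python
# Generic activities named in the ProFacil model (as listed in the abstract).
GENERIC_ACTIVITIES = [
    "operate facilities",
    "provide new facilities",
    "provide re-build facilities",
    "provide maintained facilities",
    "perform dispose of facilities",
]

# Hypothetical company-specific specialization, as a facilitator might record it
# after modelling seminars with company experts.
company_model = {
    "operate facilities": ["run helpdesk", "manage cleaning contracts"],
    "provide maintained facilities": ["plan preventive maintenance", "handle repair orders"],
}

for generic in GENERIC_ACTIVITIES:
    specific = company_model.get(generic, [])
    print(f"{generic}: {', '.join(specific) if specific else '(not yet specialized)'}")
```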

Relevance: 20.00%

Abstract:

The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
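The proposed gradual transition, bundling a traditional "big deal" license with author charges for the articles that are opened, can be illustrated with a toy calculation. All figures below are hypothetical; the paper's actual pricing scheme is not reproduced here.

```python
def bundled_invoice(license_fee: float, opened_articles: int,
                    author_charge: float, discount_rate: float = 0.0) -> float:
    """Hypothetical annual invoice: subscription license plus charges for the
    individual articles opened, with an optional bundle discount on the charges."""
    return license_fee + opened_articles * author_charge * (1.0 - discount_rate)

# Toy figures: a 100,000-unit license, 50 opened articles at 1,500 each,
# with a 20% bundle discount on the author charges.
print(bundled_invoice(100_000, 50, 1_500, discount_rate=0.20))  # 160000.0
```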