912 results for "Optics in computing"
Abstract:
This contribution presents the first stage of a project to assist the transition from a traditional to a blended program in higher nursing education. We describe the goals and context of this project, present the evaluation framework, discuss some early results, and assess the usefulness of the first version of the evaluation framework.
Abstract:
An experiment was carried out to examine the impact on people's electrodermal activity when approached by groups of one or four virtual characters at varying distances. It was premised on proxemics theory: the closer the approach of the virtual characters to the participant, the greater the level of physiological arousal. Physiological arousal was measured by the number of skin conductance responses within a short time period after the approach, and the maximum change in skin conductance level 5 s after the approach. The virtual characters were each either female or a cylinder of human size, and one or four characters approached each subject a total of 12 times. Twelve male subjects were recruited for the experiment. The results suggest that the number of skin conductance responses after the approach and the change in skin conductance level increased the closer the virtual characters came to the participants. Moreover, these response variables were inversely correlated with the number of visits, showing a typical adaptation effect. There was some evidence to suggest that the number of characters who simultaneously approached (one or four) was positively associated with the responses. Surprisingly, there was no evidence of a difference in response between the humanoid characters and the cylinders on the basis of this physiological data. It is suggested that the similarity in this quantitative arousal response to virtual characters and virtual objects might mask a profound difference in qualitative response, an interpretation supported by questionnaire and interview results. Overall, the experiment supported the premise that people exhibit heightened physiological arousal the closer they are approached by virtual characters.
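The two arousal measures described above can be made concrete with a short analysis sketch. The following Python fragment is a minimal, hypothetical implementation, assuming a uniformly sampled skin conductance signal in microsiemens and a simple trough-to-peak amplitude criterion for counting responses; the function name, threshold, and windowing are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def arousal_measures(scl, fs, t_approach, window_s=5.0, scr_threshold=0.05):
    """Compute two arousal measures from a skin conductance signal.

    scl          : 1-D array, skin conductance level in microsiemens
    fs           : sampling rate in Hz
    t_approach   : time (s) at which the character's approach ends
    window_s     : analysis window after the approach (the study uses 5 s)
    scr_threshold: minimum rise (uS) counted as a response; an assumed
                   value, typical SCR criteria range over 0.01-0.05 uS
    """
    start = int(t_approach * fs)
    stop = min(start + int(window_s * fs), len(scl))
    window = scl[start:stop]

    # Maximum change in skin conductance level within the window,
    # relative to the level at the moment of approach.
    max_scl_change = float(np.max(window) - window[0])

    # Count skin conductance responses: a trough followed by a rise
    # exceeding the threshold. A deliberately simple detection rule.
    n_scr = 0
    trough = window[0]
    rising = False
    for x in window[1:]:
        if x < trough:
            trough, rising = x, False
        elif not rising and x - trough >= scr_threshold:
            n_scr += 1
            rising = True
    return n_scr, max_scl_change
```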
Abstract:
This paper focuses on the use of FLOSS to promote vendor independence and avoid lock-in in the enterprise. It looks at how FLOSS projects follow open standards, how forking prevents lock-in if a project threatens to migrate to a closed-source strategy, and how FLOSS lowers the barrier to entry for SMEs wishing to implement and support software. However, it also looks at how the adoption of policies mandating open standards instead of FLOSS, and the success of cloud computing, threaten to erode those benefits. It discusses ways in which cloud computing can be adopted in the enterprise without forfeiting those advantages, and urges corporate and government policy makers to mandate FLOSS rather than be satisfied with open standards.
Abstract:
Performance measurements of computer system components and software yield information that can be used to improve performance and to support hardware procurement decisions. This thesis examines performance measurement and measurement programs, i.e. so-called benchmark software. Different types of freely available benchmark programs suitable for analyzing the performance of a Linux computing cluster were surveyed and evaluated. The benchmarks were grouped and assessed by testing their features on a Linux cluster. The thesis also discusses the challenges of making measurements and of parallel computing. Benchmarks were found for many purposes, and they proved to vary in quality and scope. They have also been assembled into software suites in order to give a broader picture of hardware performance than any single program can provide. It is essential to understand the rate at which data can be transferred to the processor from main memory, from the disk subsystem, and from other compute nodes. A typical benchmark program contains a computationally intensive mathematical algorithm of the kind used in scientific software. Depending on the benchmark, understanding and exploiting the results can be challenging.
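The abstract's central point, that cluster performance hinges on the rate at which data reaches the processor from main memory, disk, and other nodes, can be illustrated with a toy benchmark. The sketch below is a minimal, hypothetical memory-bandwidth test in the spirit of STREAM, not one of the benchmark suites evaluated in the thesis; the array size and repetition count are arbitrary assumptions.

```python
import time
import numpy as np

def memory_bandwidth_gb_s(n=50_000_000, repeats=5):
    """Estimate sustainable memory bandwidth with a STREAM-like copy kernel.

    Copies an n-element float64 array repeatedly and reports the best rate.
    Each copy moves 16 bytes per element (8 read + 8 written).
    """
    src = np.random.rand(n)
    dst = np.empty_like(src)
    best = 0.0
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.copyto(dst, src)
        elapsed = time.perf_counter() - t0
        best = max(best, 16 * n / elapsed / 1e9)
    return best

if __name__ == "__main__":
    print(f"copy bandwidth: {memory_bandwidth_gb_s():.1f} GB/s")
```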
Abstract:
A reinforcement learning (RL) method was used to train a virtual character to move participants to a specified location. The virtual environment depicted an alleyway displayed through a wide field-of-view head-tracked stereo head-mounted display. Based on proxemics theory, we predicted that when the character approached within a personal or intimate distance to the participants, they would be inclined to move backwards out of the way. We carried out a between-groups experiment with 30 female participants, with 10 assigned arbitrarily to each of the following three groups: In the Intimate condition the character could approach within 0.38m and in the Social condition no nearer than 1.2m. In the Random condition the actions of the virtual character were chosen randomly from among the same set as in the RL method, and the virtual character could approach within 0.38m. The experiment continued in each case until the participant either reached the target or 7 minutes had elapsed. The distributions of the times taken to reach the target showed significant differences between the three groups, with 9 out of 10 in the Intimate condition reaching the target significantly faster than the 6 out of 10 who reached the target in the Social condition. Only 1 out of 10 in the Random condition reached the target. The experiment is an example of applied presence theory: we rely on the many findings that people tend to respond realistically in immersive virtual environments, and use this to get people to achieve a task of which they had been unaware. This method opens up the door for many such applications where the virtual environment adapts to the responses of the human participants with the aim of achieving particular goals.
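As a sketch of how such a character could be trained, the following Python fragment implements tabular Q-learning over a discretised participant position, with a reward when the participant moves closer to the target location. The state and action encoding, reward shaping, and learning parameters are all illustrative assumptions; the paper does not specify its RL formulation at this level of detail.

```python
import random
from collections import defaultdict

ACTIONS = ["approach", "retreat", "step_left", "step_right", "wait"]

class CharacterPolicy:
    """Tabular Q-learning for a virtual character herding a participant."""

    def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)      # (state, action) -> value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def choose(self, state):
        # Epsilon-greedy action selection.
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        # Standard one-step Q-learning backup.
        best_next = max(self.q[(next_state, a)] for a in ACTIONS)
        target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (target - self.q[(state, action)])

def reward(dist_before, dist_after):
    # Positive reward when the participant moved closer to the goal.
    return dist_before - dist_after
```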
Abstract:
Differential X-ray phase-contrast tomography (DPCT) refers to a class of promising methods for reconstructing the X-ray refractive index distribution of materials that present weak X-ray absorption contrast. The tomographic projection data in DPCT, from which an estimate of the refractive index distribution is reconstructed, correspond to one-dimensional (1D) derivatives of the two-dimensional (2D) Radon transform of the refractive index distribution. There is an important need for the development of iterative image reconstruction methods for DPCT that can yield useful images from few-view projection data, thereby mitigating the long data-acquisition times and large radiation doses associated with the use of analytic reconstruction methods. In this work, we analyze the numerical and statistical properties of two classes of discrete imaging models that form the basis for iterative image reconstruction in DPCT. We also investigate the use of one of the models with a modern image reconstruction algorithm for performing few-view image reconstruction of a tissue specimen.
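The data model stated above, projections equal to 1D derivatives of the 2D Radon transform of the refractive index distribution delta, can be written as g(theta, xi) = d/dxi [R delta](theta, xi). The sketch below builds a discrete differential sinogram from a test image; scikit-image's Radon transform and a finite-difference derivative stand in for the physical measurement, purely as an assumption for illustration, and the phantom is not the paper's tissue specimen.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon

# Refractive-index test object (Shepp-Logan phantom as a stand-in).
delta = shepp_logan_phantom()

# 2D Radon transform sampled at a few view angles, as in few-view DPCT.
angles = np.linspace(0.0, 180.0, 20, endpoint=False)
sinogram = radon(delta, theta=angles)

# DPCT measures the 1D derivative of each projection along the detector
# coordinate; approximate it with a finite difference along that axis.
diff_sinogram = np.gradient(sinogram, axis=0)

print(sinogram.shape, diff_sinogram.shape)
```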
Abstract:
The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged and various parallel and distributed environments have been designed and implemented. Each of the environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best environment for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, the networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important in cases where the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize the overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm. The scheduling algorithm is executed by the communication thread, so the scheduling does not affect the execution of the parallel application. Performance results from MPIT show considerable improvements over conventional MPI applications.
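The key idea of the MPIT paradigm, overlapping communication and computation by dedicating one thread to message passing, can be sketched with mpi4py and Python threads. This is an illustrative reconstruction under assumed names (comm_loop, outbox, N_CHUNKS), not the thesis's MPIT implementation, and a real code needs an MPI build initialised with full thread support.

```python
import queue
import threading
import time
from mpi4py import MPI   # assumes an MPI library with thread support

comm = MPI.COMM_WORLD
outbox = queue.Queue()   # results waiting to be shipped out
N_CHUNKS = 4

def comm_loop():
    """Dedicated communication thread: sends local results and receives
    peers' results while the main thread keeps computing (the MPIT idea)."""
    received = 0
    while received < N_CHUNKS or not outbox.empty():
        try:
            dest, payload = outbox.get_nowait()
            comm.send(payload, dest=dest, tag=0)
        except queue.Empty:
            pass
        if comm.iprobe(source=MPI.ANY_SOURCE, tag=0):
            comm.recv(source=MPI.ANY_SOURCE, tag=0)
            received += 1
        time.sleep(0.001)          # avoid spinning at 100% CPU

t = threading.Thread(target=comm_loop)
t.start()

# The main thread computes while the communication thread overlaps I/O.
for _ in range(N_CHUNKS):
    result = sum(i * i for i in range(100_000))        # stand-in workload
    outbox.put(((comm.rank + 1) % comm.size, result))  # non-blocking handoff

t.join()
```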
Abstract:
Technological development brings more and more complex systems to the consumer markets. The time required to bring a new product to market is crucial for a company's competitive edge. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power. Distributed simulation can be used to meet these demands, but it has its own problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, communication protocols, partitioning of the problem, distribution of the problem, capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. The use of this computing power for distributed simulation requires the simulation to adapt to a changing load situation. This requires all or part of the simulation work to be removed from a workstation when the owner wishes to use the workstation again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
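The load-balancing behaviour described, withdrawing simulation work from a workstation as soon as its owner returns, can be sketched as a simple policy loop. The code below is a hypothetical illustration of such a policy, not part of Diworse; the load threshold and the migrate callback are assumed names.

```python
import os

OWNER_LOAD_THRESHOLD = 0.5   # assumed: above this, the owner is active

def balance(workstations, migrate):
    """Withdraw simulation partitions from workstations in use by owners.

    workstations: mapping of host name -> current 1-minute load average
    migrate:      callback that moves a host's simulation work elsewhere
    """
    idle = [h for h, load in workstations.items()
            if load < OWNER_LOAD_THRESHOLD]
    for host, load in workstations.items():
        if load >= OWNER_LOAD_THRESHOLD and idle:
            # Owner is using this machine: move its share of the
            # simulation to the least-loaded idle workstation.
            target = min(idle, key=lambda h: workstations[h])
            migrate(host, target)

# Example input signal: the local load average (POSIX only).
print(os.getloadavg())
```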
Abstract:
This study examines how firms interpret new, potentially disruptive technologies in their own strategic context. The work presents a cross-case analysis of four potentially disruptive technologies or technical operating models: Bluetooth, WLAN, Grid computing and the Mobile Peer-to-peer paradigm. The technologies were investigated from the perspective of three mobile operators, a device manufacturer and a software company in the ICT industry. The theoretical background for the study consists of the resource-based view of the firm with a dynamic perspective, theories on the nature of technology and innovations, and the concept of the business model. The literature review builds up a propositional framework for estimating the amount of radical change in the companies' business models with two intermediate variables: the disruptiveness potential of a new technology, and the strategic importance of a new technology to a firm. The data was gathered in group discussion sessions in each company. The results of each case analysis were brought together to evaluate how firms interpret the potential disruptiveness in terms of changes in product characteristics and added value, technology and market uncertainty, changes in product-market positions, possible competence disruption and changes in value network positions. The results indicate that perceived disruptiveness in terms of product characteristics does not necessarily translate into strategic importance. In addition, firms did not see the new technologies as a threat in terms of potential competence disruption.
Abstract:
1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49. These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now in the year 2005, after more than half a decade of domestic implementation, it remains uncertain what the proper meaning and construction is of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD, and second, to realise the potential and risks inherent in the new legislation in the economic, cultural and societal dimensions. 2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches either qualitatively or quantitatively the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer with markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, thus weakening access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy. 3. Particular issues in Digital Economy and Information Networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, importantly, numerous databases are compiled in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential investment? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only when the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill, or lead to interpretations that are at variance with the analogous domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, the disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically.
Consequently, guidelines on the correct interpretation of the Directive that import practical, business-oriented solutions may well find application at the European level. This underlines the exigency of a thorough analysis of the implications, meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least not yet, have a Sui Generis database regime or its kin, while both the political and academic discourse on the matter abounds. 5. The objectives of the study The background described above, with its several open issues, calls for a detailed study of the following questions: - What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? - How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context? - What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? - The difficult question of the relation between database protection and the protection of factual information as such. 6. Disposition The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of the machinery in real-life societal and particularly commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinize its implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.
Abstract:
Our efforts are directed towards understanding the coscheduling mechanism in a NOW system when a parallel job is executed jointly with local workloads, balancing parallel performance against local interactive response. Explicit and implicit coscheduling techniques have been implemented in a PVM-Linux NOW (or cluster). Furthermore, dynamic coscheduling remains an open question when parallel jobs are executed in a non-dedicated cluster. A basic model for dynamic coscheduling in cluster systems is presented in this paper, and one dynamic coscheduling algorithm for this model is proposed. The applicability of this algorithm has been proved and its performance analyzed by simulation. Finally, a new tool (named Monito) for monitoring the different message queues in such environments is presented. The main aim of implementing this facility is to provide a means of capturing the bottlenecks and overheads of the communication system in a PVM-Linux cluster.
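The general dynamic coscheduling idea analysed above, favouring a parallel task on the local scheduler when messages for it arrive so that communicating processes tend to run simultaneously across the cluster, can be illustrated with a small sketch. This is a schematic reconstruction of the technique in general, not the paper's algorithm or the Monito tool; the queue lengths and priority boost are assumed parameters.

```python
def dynamic_coschedule(tasks, base_priority=0, boost=10):
    """Pick the next local task to run, favouring tasks with pending messages.

    tasks: list of dicts with 'name' and 'pending_msgs' (incoming queue
    length). Returns the task with the highest effective priority, so a
    parallel process runs when its peers are trying to communicate with it.
    """
    def priority(task):
        # A task with queued incoming messages gets a boost proportional
        # to its backlog, approximating message-driven coscheduling.
        return base_priority + boost * task["pending_msgs"]
    return max(tasks, key=priority)

tasks = [
    {"name": "local-editor", "pending_msgs": 0},
    {"name": "pvm-worker",   "pending_msgs": 3},
]
print(dynamic_coschedule(tasks)["name"])   # -> pvm-worker
```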
Abstract:
A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than achieving meaningful learning. This paper presents a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curricula to cope with this problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more meaningful learning of the course topics.
Abstract:
PURPOSE: To identify risk factors associated with mortality in patients with severe community-acquired pneumonia (CAP) caused by S. pneumoniae who require intensive care unit (ICU) management, and to assess the prognostic value of these risk factors at the time of admission. METHODS: Retrospective analysis of all consecutive patients with CAP caused by S. pneumoniae who were admitted to the 32-bed medico-surgical ICU of a community and referral university hospital between 2002 and 2011. Univariate and multivariate analyses were performed on variables available at admission. RESULTS: Among the 77 adult patients with severe CAP caused by S. pneumoniae who required ICU management, 12 patients died (observed mortality rate 15.6 %). Univariate analysis indicated that septic shock and low C-reactive protein (CRP) values at admission were associated with an increased risk of death. In a multivariate model, after adjustment for age and gender, septic shock [odds ratio (OR), 95 % confidence interval; 4.96, 1.11-22.25; p = 0.036] and CRP (OR 0.99, 0.98-0.99; p = 0.034) remained significantly associated with death. Finally, we assessed the discriminative ability of CRP to predict mortality by computing its receiver operating characteristic curve. The CRP cut-off giving the best sensitivity and specificity for predicting hospital mortality was 169.5 mg/L, with an area under the curve of 0.72 (0.55-0.89). CONCLUSIONS: The mortality of patients with S. pneumoniae CAP requiring ICU management was much lower than predicted by severity scores. The presence of septic shock and a CRP value at admission <169.5 mg/L predicted a fatal outcome.
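The discrimination analysis reported above (ROC curve, optimal CRP cut-off, AUC) can be reproduced in outline with scikit-learn. The sketch below uses made-up data purely to show the computation, not the study's cohort; it selects the cut-off by Youden's index, which matches the "best sensitivity and specificity" criterion described.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical admission CRP values (mg/L) and outcomes (1 = died).
# Note: in this cohort *low* CRP predicted death, so we score -CRP.
crp = np.array([210, 180, 95, 340, 60, 150, 400, 120, 80, 260])
died = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])

scores = -crp                     # lower CRP -> higher risk of death
fpr, tpr, thresholds = roc_curve(died, scores)
auc = roc_auc_score(died, scores)

# Youden's index J = sensitivity + specificity - 1, maximised over cut-offs.
j = tpr - fpr
best = np.argmax(j)
print(f"AUC = {auc:.2f}, best CRP cut-off < {-thresholds[best]:.1f} mg/L")
```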