961 results for General-purpose computing on graphics processing units (GPGPU)


Relevance:

100.00%

Publisher:

Abstract:

Although advertising is pervasive in our daily lives, it is not always effective, owing to poor conditions or contexts of reception. Indeed, the communication process can be jeopardized at its very last stage by the quality of advertising exposure. However critical it may be, ad exposure quality has received little attention from researchers or practitioners. In this paper, we investigate how tiredness combined with ad complexity influences the way consumers extract and process ad elements. Tiredness is worth investigating because it is a common daily state experienced by everyone at various moments of the day, and although it can drastically alter ad reception, it has not yet been studied in advertising. In this regard, we observe the eye-movement patterns of tired and rested consumers viewing simple or complex advertisements. Surprisingly, we find that tired subjects viewing complex ads do not adopt an effort-reducing visual strategy; rather, they use a resource-demanding one. We suggest that the Sustained Attention strategy observed is an adaptive strategy that allows them to cope with an anticipated lack of resources.

Relevance:

100.00%

Publisher:

Abstract:

A General Sales Agent (GSA) is an airline's outsourced counterpart that markets and manages cargo services. An empirical investigation is undertaken to ascertain whether GSAs contribute economically to the air cargo industry, using three 'litmus test' indicators: 1) contribution to the airline's sales and profitability by expanding operating networks; 2) viability as a marketing option for emerging or struggling airlines to help cut operating costs and reduce prices; 3) cost-effectiveness. GSAs were found to establish an airline's market presence through wide network coverage and good local knowledge, leading to an expansion of the airline's operating networks and generating greater sales revenue. Copyright © 2012 Inderscience Enterprises Ltd.

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing is perhaps the most sought-after buzzword in today's IT world (the Hungarian term follows the EU's Digital Agenda translation, 2010). A detailed description of the cloud computing business model is given by Bőgel (2009), who presents the emergence of this new, utility-like IT service and its economic benefits, predicting a great future for the cloud in the competition among business models. Alongside the business benefits of cloud computing, the author places greater emphasis in this paper on the factors hindering its rapid adoption, and on what the advantages and disadvantages mean for a business, IT, or compliance manager. Without diminishing the economic significance of the cloud model, he considers it important to address its problems and risks as well, and the paper lists the expected cloud-specific risks. He highlights that the risks, particularly security and data-protection risks, differ substantially between the European Economic Area and the rest of the world, e.g. the United States. The article points out these differences and explains why cloud computing is expected to spread more slowly in Europe than in other parts of the world. It also presents the EU's efforts to promote the adoption of cloud computing in Europe, given the model's positive effect on competitiveness.

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the history of schema theory and how culture has been incorporated into it. The author further argues that cultural schema affects students' use of reader-based processing and text-based processing in reading.

Relevance:

100.00%

Publisher:

Abstract:

The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated by DNA typing processes. Although the chemistries used during these procedures have been enhanced to mitigate the effects of such deleterious compounds, some challenges remain. Inhibitors can be components of the samples themselves, of the substrate on which samples were deposited, or chemicals associated with the DNA purification step. A thorough understanding of the extraction processes and of their ability to handle the various types of inhibitory substances can therefore help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input in the sample, and its overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be unsuitable for DNA typing, owing to large amounts of inhibitory substances and/or environmental degradation, was tested; this included generating data on microbial peak signatures to identify the locations of clandestine human graves. The results demonstrate that current methods for assessing inhibition are not necessarily accurate: samples that appear inhibited during quantification can yield full DNA profiles, while those showing no sign of inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested removed more than 90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effect on downstream applications, analysts should exercise caution when selecting the best extraction method for particular samples, as casework DNA samples are often present in small quantities and can contain overwhelming amounts of inhibitory substances.

Relevance:

100.00%

Publisher:

Abstract:

Background: Potentially inappropriate prescribing (PIP) is common in older people in primary care, as evidenced by a significant body of quantitative research. However, relatively few qualitative studies have investigated the phenomenon of PIP and its underlying processes from the perspective of general practitioners (GPs). The aim of this paper is to qualitatively explore GP perspectives on prescribing and PIP in older primary care patients.

Method: Semi-structured qualitative interviews were conducted with GPs participating in a randomised controlled trial (RCT) of an intervention to decrease PIP in older patients (≥70 years) in Ireland. Participants (from both intervention and control arms) were drawn from the OPTI-SCRIPT cluster RCT, and interviews took place between January and July 2013 as part of the trial process evaluation. All interviews were conducted by a single interviewer, audio recorded, transcribed verbatim, and analysed thematically.

Results: Seventeen semi-structured interviews were conducted (13 male, 4 female). Three main, inter-related themes emerged: a complex prescribing environment, a paternalistic doctor-patient relationship, and the relevance of the PIP concept. Patient complexity (e.g. polypharmacy, multimorbidity) and prescriber complexity (e.g. multiple prescribers, poor communication, restricted autonomy) were identified as factors contributing to a complex prescribing environment in which PIP could occur, as was a paternalistic doctor-patient relationship. The concept of PIP was perceived to be of variable usefulness to GPs, and the criteria used to measure it may be at odds with the complex processes of prescribing for this patient population.

Conclusions: Several inter-related factors contributing to the occurrence of PIP were identified, some of which may be amenable to intervention. Improvement strategies focused on better management of polypharmacy and multimorbidity, and on communication across primary and secondary care, could yield substantial reductions in PIP.

Relevance:

100.00%

Publisher:

Abstract:

Solving a complex Constraint Satisfaction Problem (CSP) is a computationally hard task that may require a considerable amount of time. Parallelism has been applied successfully to the job, and many applications already harness the parallel power of modern CPUs to speed up the solving process. Current Graphics Processing Units (GPUs), containing from a few hundred to a few thousand cores, possess a level of parallelism that surpasses that of CPUs, yet far fewer applications are capable of solving CSPs on GPUs, leaving room for further improvement. This paper describes work in progress on solving CSPs on GPUs, CPUs, and other devices, such as Intel Many Integrated Cores (MICs), in parallel. It presents the gains obtained when applying more devices to some problems, and the main challenges that must be faced when using devices with architectures as different as CPUs and GPUs, focusing on how to effectively achieve good load balancing between such heterogeneous devices.
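The load-balancing idea lends itself to a small illustration. Below is a minimal Python sketch, not the paper's implementation, of the pull-based scheme heterogeneous solvers commonly use: the search space is split into many more sub-spaces than there are devices, and each worker takes the next sub-space as soon as it becomes idle, so faster devices naturally absorb a larger share of the work. All names and the toy constraint are illustrative.

```python
# Pull-based load balancing over a shared queue of search sub-spaces.
from multiprocessing.pool import ThreadPool
from queue import Queue, Empty
from itertools import product

def solve_subspace(subspace):
    # Placeholder for a device-specific search over one sub-space;
    # a real solver would run this on a CPU core, a GPU, or a MIC.
    return [a for a in subspace if sum(a) == 5]

def balanced_solve(subspaces, n_workers=4):
    tasks = Queue()
    for s in subspaces:
        tasks.put(s)

    def worker(_):
        # Each worker pulls work until the queue is empty.
        found = []
        while True:
            try:
                sub = tasks.get_nowait()
            except Empty:
                return found
            found.extend(solve_subspace(sub))

    with ThreadPool(n_workers) as pool:
        parts = pool.map(worker, range(n_workers))
    return [sol for part in parts for sol in part]

# Example: 3 variables with domain 0..4, split into one sub-space
# per value of the first variable.
space = list(product(range(5), repeat=3))
subspaces = [[a for a in space if a[0] == v] for v in range(5)]
print(len(balanced_solve(subspaces)))
```

Because the chunks far outnumber the workers, no static estimate of relative device speed is needed; any imbalance is bounded by the cost of the largest single sub-space.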

Relevance:

100.00%

Publisher:

Abstract:

To reduce the time needed to solve the most complex Constraint Satisfaction Problems (CSPs), multi-core CPUs are usually employed, and many applications already harness the parallel power of these devices to speed up the solving process. Nowadays, Graphics Processing Units (GPUs), containing from a few hundred to a few thousand cores, possess a level of parallelism that surpasses that of CPUs, yet far fewer applications are capable of solving CSPs on GPUs, leaving room for improvement. This article describes work in progress on solving CSPs on GPUs and CPUs and compares results with some state-of-the-art solvers, already showing good results on GPUs.
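As a rough illustration of how such solvers partition work, here is a minimal Python sketch, assuming a plain backtracking search and a toy all-different constraint (neither is from the article): fixing the first k variables splits the search space into disjoint sub-problems that CPU threads or GPU blocks could explore independently.

```python
# Splitting a CSP search space by enumerating prefixes of the first
# k variables; each consistent prefix becomes an independent task.
from itertools import product

def backtrack(assignment, domains, constraint):
    # Depth-first search extending 'assignment' one variable at a time.
    if len(assignment) == len(domains):
        return [tuple(assignment)]
    solutions = []
    for value in domains[len(assignment)]:
        assignment.append(value)
        if constraint(assignment):  # prune inconsistent partials
            solutions.extend(backtrack(assignment, domains, constraint))
        assignment.pop()
    return solutions

def split_search(domains, constraint, k=1):
    tasks = [list(p) for p in product(*domains[:k]) if constraint(list(p))]
    # In a parallel solver each task would go to a separate worker.
    return [sol for prefix in tasks
            for sol in backtrack(prefix, domains, constraint)]

# Toy all-different constraint over 4 variables with domain 0..3.
doms = [range(4)] * 4
alldiff = lambda a: len(set(a)) == len(a)
print(len(split_search(doms, alldiff, k=1)))  # 24 permutations
```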

Relevance:

100.00%

Publisher:

Abstract:

Safe collaboration between a robot and a human operator is a critical requirement for deploying a robotic system into a manufacturing and testing environment. In this dissertation, the safety requirement is developed and implemented for the navigation system of mobile manipulators, and a methodology for human-robot co-existence through 3D scene analysis is investigated. The proposed approach exploits advances in computing capability by relying on graphics processing units (GPUs) for volumetric, predictive human-robot contact checking. Beyond guaranteeing operator safety, human-robot collaboration is also fundamental when cooperative activities are required, as on an appliance test-automation floor. To achieve this, a generalized hierarchical task-controller scheme for collision avoidance is developed, allowing the robotic arm to safely approach and inspect the interior of the appliance without collision during the testing procedure. The unpredictable presence of operators also constitutes a fast-changing dynamic obstacle, requiring a quick reaction from the robot. For this reason, a GPU-accelerated distance field is computed to speed up the reaction time needed to avoid collisions between the human operator and the robot. Automated appliance testing also involves robotized laundry loading and unloading during life-cycle testing, a task comprising laundry detection, grasp-pose estimation, and manipulation in a container, inside the drum, and during recovery grasping. Wrinkle and blob detection algorithms for grasp-pose estimation are developed, and grasp poses are computed along the wrinkles and blobs to perform grasping efficiently. By ranking the estimated laundry grasp poses according to a predefined cost function, the robotic arm attempts the poses that are kinematically most favourable for the robot and collision-free on the appliance side. This is achieved through appliance detection, full-model registration, and collision-free trajectory execution with online collision avoidance.
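A small sketch may clarify the distance-field idea. The dissertation computes the field on the GPU; the Python sketch below uses SciPy's CPU distance transform purely to illustrate the data structure: every cell of an occupancy grid stores the distance to the nearest obstacle, so a human-robot proximity check at runtime reduces to a single lookup. Grid size, resolution, and margin are illustrative, not values from the dissertation.

```python
# Precompute a Euclidean distance field over a 3D occupancy grid so
# that proximity queries become O(1) lookups.
import numpy as np
from scipy.ndimage import distance_transform_edt

def build_distance_field(occupancy, cell_size):
    # distance_transform_edt measures each nonzero cell's distance to
    # the nearest zero cell, so obstacles must be the zero cells.
    free_space = occupancy == 0
    return distance_transform_edt(free_space) * cell_size

def too_close(field, point, cell_size, margin):
    # Convert a metric point to grid indices and compare the stored
    # clearance against the required safety margin.
    idx = tuple((np.asarray(point) / cell_size).astype(int))
    return field[idx] < margin

# 1 m^3 workspace at 1 cm resolution with a box-shaped obstacle.
grid = np.zeros((100, 100, 100), dtype=np.uint8)
grid[40:60, 40:60, 40:60] = 1
field = build_distance_field(grid, cell_size=0.01)
print(too_close(field, (0.30, 0.50, 0.50), 0.01, margin=0.15))  # True
```

On a GPU the same table is typically rebuilt every sensor frame, which is what makes the fast reaction to moving operators possible.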

Relevance:

100.00%

Publisher:

Abstract:

Given the growing need to carry out innovation projects driven by new technologies, this research addresses the need for an effective methodology with which to tackle them profitably. First, the work proposes a methodology for classifying technology-driven innovation projects into classes, together with a set of tools for recognizing which class a project belongs to. Second, having established that each class corresponds to a specific goal achievable with a specific methodology, the study describes the methodology followed to arrive at an effective and repeatable elaboration of design principles, good practices, and tools that allow an organization to appropriate the value of a General Purpose Technology (GPT) through the conception of innovative solutions. The design work stems from a Design Thinking (DT) approach, both because DT was used in carrying out the research itself and because the DT methodology underlies the modelling of the process proposed for the class.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a non-rigid, free-form 2D-3D registration approach using a statistical deformation model (SDM). In our approach, the SDM is first constructed from a set of training data using a non-rigid registration algorithm based on B-spline free-form deformation, encoding a priori information about the underlying anatomy. A novel intensity-based non-rigid 2D-3D registration algorithm is then presented that iteratively fits the 3D B-spline-based SDM to the 2D X-ray images of an unseen subject, which requires a computationally expensive inversion of the instantiated deformation in each iteration. We propose to solve this challenge with a fast B-spline pseudo-inversion algorithm implemented on the graphics processing unit (GPU). Experiments conducted on C-arm and X-ray images of cadaveric femurs demonstrate the efficacy of the present approach.
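The inversion step can be illustrated with a standard fixed-point scheme (the paper's fast GPU-based B-spline pseudo-inversion algorithm is not reproduced here). For a forward map y = x + d(x), the inverse displacement g satisfies g(y) = -d(y + g(y)), which the Python sketch below iterates on a dense 2D field; the synthetic field and iteration count are illustrative.

```python
# Fixed-point inversion of a dense displacement field (2D example).
import numpy as np
from scipy.ndimage import map_coordinates

def invert_displacement(d, n_iters=20):
    # d: (2, H, W) displacement field on a regular grid.
    g = np.zeros_like(d)
    grid = np.indices(d.shape[1:]).astype(float)
    for _ in range(n_iters):
        coords = grid + g  # sample d at y + g(y)
        d_at = np.stack([map_coordinates(d[i], coords, order=1,
                                         mode='nearest')
                         for i in range(d.shape[0])])
        g = -d_at          # g <- -d(y + g(y))
    return g

# Smooth synthetic field; composing the map with its inverse should
# be close to the identity, so the residual below should be small.
h = w = 64
yy, xx = np.indices((h, w)).astype(float)
d = np.stack([2.0 * np.sin(xx / 10.0), 2.0 * np.cos(yy / 10.0)])
g = invert_displacement(d)
coords = np.indices((h, w)).astype(float) + g
residual = g + np.stack([map_coordinates(d[i], coords, order=1,
                                         mode='nearest')
                         for i in range(2)])
print(float(np.abs(residual).max()))
```

The iteration converges when the displacement gradients are below 1; each pass is an independent per-pixel update, which is why the operation maps well onto a GPU.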

Relevance:

100.00%

Publisher:

Abstract:

We compared reading acquisition in English and Italian children up to late primary school, analyzing reaction times (RTs) and errors as a function of various psycholinguistic variables and of changes due to experience. Our results show that reading becomes progressively more reliant on larger processing units with age, but that this is modulated by the consistency of the language. In English, an inconsistent orthography, reliance on larger units emerges earlier, as shown by faster RTs, a stronger effect of lexical variables, and the absence of a length effect (by fifth grade). However, not all English children master this mode of processing, yielding greater inter-individual variability. In Italian, a consistent orthography, reliance on larger units emerges later and is less pronounced, as shown by larger length effects, which remain significant even in older children, and by larger effects of a global factor (related to the speed of orthographic decoding) explaining changes in performance across ages. Our results show the importance of considering not only overall performance but also inter-individual variability and variability between conditions when interpreting cross-linguistic differences.