878 results for Multi-agent computing


Relevance:

30.00%

Publisher:

Abstract:

This final degree project consists of the creation of an online, distributed, multi-device job board. It was built with recent technologies such as Play Framework and Twitter Bootstrap, using the Java and Scala languages with HTML5 markup, and deployed on a cloud computing server called Heroku.

Relevance:

30.00%

Publisher:

Abstract:

Cases of fatal outcome after surgical intervention are autopsied to determine the cause of death and to investigate whether medical error caused or contributed to the death. For medico-legal purposes, it is imperative that autopsy findings are documented clearly. Modern imaging techniques such as multi-detector computed tomography (MDCT) and postmortem CT angiography, which is used for vascular system imaging, are useful tools for determining the cause of death. The aim of this study was to determine the utility of postmortem CT angiography for medico-legal death investigation. This study investigated 10 medico-legal cases with a fatal outcome after surgical intervention using multi-phase postmortem whole-body CT angiography. A native CT scan was performed as well as three angiographic phases (arterial, venous, and dynamic) using a Virtangio® perfusion device and the oily contrast agent Angiofil®. The results of conventional autopsy were compared to those of the radiological investigations. We also investigated whether the radiological findings affected the final interpretation of the cause of death. Causes of death were hemorrhagic shock, intracerebral hemorrhage, septic shock, and a combination of hemorrhage and blood aspiration. The diagnoses were made by conventional autopsy as well as by postmortem CT angiography. Hemorrhage played an important role in eight of ten cases. The radiological exam revealed the exact source of bleeding in seven of the eight cases, whereas conventional autopsy localized the source of bleeding only generally in five of the seven cases. In one case, neither conventional autopsy nor CT angiography identified the source of hemorrhage. We conclude that postmortem CT angiography is extremely useful for investigating deaths following surgical interventions. This technique helps document autopsy findings and allows a second examination if needed; specifically, it detects and visualizes the sources of hemorrhage in detail, which is often of particular interest in such cases.

Relevance:

30.00%

Publisher:

Abstract:

In this article we propose a novel method for calculating cardiac 3-D strain. The method requires the acquisition of myocardial short-axis (SA) slices only and produces the 3-D strain tensor at every point within every pair of slices. Three-dimensional displacement is calculated from SA slices using zHARP, and is then used to calculate the local displacement gradient and thus the local strain tensor. There are three main advantages of this method. First, the 3-D strain tensor is calculated for every pixel without interpolation; this is unprecedented in cardiac MR imaging. Second, the method is fast, in part because there is no need to acquire long-axis (LA) slices. Third, the method is accurate because the 3-D displacement components are acquired simultaneously, which reduces motion artifacts without the need for registration. This article presents the theory of computing 3-D strain from two slices using zHARP, the imaging protocol, and both phantom and in-vivo validation.
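
Since the strain tensor follows directly from the local displacement gradient, the core computation can be sketched in a few lines of NumPy. This is a generic illustration of that step, assuming a dense displacement array, not the authors' implementation:

```python
import numpy as np

def strain_tensor(u, spacing=(1.0, 1.0, 1.0)):
    """Green-Lagrange strain at every voxel from a 3-D displacement
    field u of shape (3, nx, ny, nz), e.g. as measured by zHARP."""
    # displacement gradient: G[i, j, ...] = d u_i / d x_j
    G = np.stack([np.stack(np.gradient(u[i], *spacing), axis=0)
                  for i in range(3)], axis=0)
    I = np.eye(3).reshape(3, 3, 1, 1, 1)
    F = I + G                                   # deformation gradient
    # E = 0.5 * (F^T F - I), contracting over the first index
    return 0.5 * (np.einsum('ki...,kj...->ij...', F, F) - I)
```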

Relevance:

30.00%

Publisher:

Abstract:

The use of contextual information in mobile devices is receiving increasing attention in mobile and ubiquitous computing research. An important requirement for mobile development today is that devices should be able to interact with the context. In this paper we present a series of contributions that build on previous work on context-awareness. First, we describe a client-server architecture that provides a mechanism for preparing non-context-aware target applications so that they can be delivered as context-aware applications in a semi-automatic way. Second, the framework used in the server to instantiate specific components for context-awareness, the Implicit Plasticity Framework, provides independence from the underlying mobile technology used in the client device, as shown in the case studies presented. Finally, the proposed infrastructure deals with the interaction among different context constraints provided by diverse sensors. All of these contributions are extensions to the infrastructure based on the Dichotomic View of plasticity, which now offers multi-purpose support.
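
As a loose illustration of the constraint-driven adaptation idea (the Implicit Plasticity Framework API is not given in this abstract, so every name below is hypothetical):

```python
# Toy sketch: a wrapped application action runs only when all
# sensor-derived context constraints are satisfied.
class ContextConstraint:
    def __init__(self, name, predicate):
        self.name, self.predicate = name, predicate

    def satisfied(self, context):
        return self.predicate(context)

def adapt(app_action, constraints, context):
    """Run the wrapped action only if every constraint holds."""
    if all(c.satisfied(context) for c in constraints):
        return app_action(context)
    return None

low_light = ContextConstraint("low_light", lambda ctx: ctx["lux"] < 50)
print(adapt(lambda ctx: "switch to dark theme", [low_light], {"lux": 20}))
```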

Relevance:

30.00%

Publisher:

Abstract:

Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on the coronary artery disease of patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images, affecting the diagnostic information provided. This thesis thus focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) towards an easy-to-use and multipurpose tool that can be translated to the clinical environment.

The first part of the thesis focused on the study of coronary artery motion in patients using the gold-standard imaging technique (x-ray angiography), in order to measure the precision with which the coronary arteries return to the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur at peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) that enables peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image-quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR).

The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running) instead of being ECG-triggered, thus increasing the efficiency of the acquisition and producing four-dimensional (4D) images of the whole heart, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle-based 3D radial trajectory, which allows continuous sampling of k-space and retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset while improving the overall image quality by removing undersampling artifacts. The obtained 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries in the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched the values measured with the gold-standard stack of 2D cine approaches.

Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows for the detection of coronary artery plaque calcification ex vivo, since it captures the short-T2 components of the calcification. Heart motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated, 3D radial, triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired to extract the short-T2 components of the calcification, while a water-fat separation technique allows proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification.

In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization of coronary artery plaques.
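
The golden-angle idea can be made concrete: one common golden-means construction places successive 3D radial spokes quasi-uniformly on the sphere, so that any contiguous subset of the continuously acquired data can be binned retrospectively. The sketch below uses the 2-D golden-means scheme; the thesis may use a different variant (e.g., spiral phyllotaxis):

```python
import numpy as np

PHI1, PHI2 = 0.4656, 0.6823  # 2-D golden means

def radial_spokes_3d(n_spokes):
    """Unit read-out directions for a golden-means 3D radial
    trajectory: any contiguous run of spokes covers k-space
    near-uniformly, which permits retrospective timing selection."""
    n = np.arange(n_spokes)
    z = 2.0 * np.mod(n * PHI1, 1.0) - 1.0        # quasi-uniform in [-1, 1]
    phi = 2.0 * np.pi * np.mod(n * PHI2, 1.0)    # azimuthal angle
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

print(radial_spokes_3d(4))
```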

Relevance:

30.00%

Publisher:

Abstract:

Statistical properties of binary complex networks are well understood, and recently many attempts have been made to extend this knowledge to weighted ones. There are, however, subtle yet important considerations to be made regarding the nature of the weights used in this generalization. Weights can be either continuous or discrete magnitudes, and in the latter case they can additionally be of indistinguishable or distinguishable nature. This fact has not been addressed in the literature thus far and has deep implications for the network statistics. In this work we address this problem by introducing multiedge networks as graphs where multiple (distinguishable) connections between nodes are considered. We develop a statistical mechanics framework in which it is possible to obtain information about the most relevant observables given a large spectrum of linear and nonlinear constraints, including those depending both on the number of multiedges per link and on their binary projection. The latter case is particularly interesting, as we show that binary projections can be understood from multiedge processes. The implications of these results are important, as many real agent-based problems mapped onto graphs require this treatment for a proper characterization of their collective behavior.
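
As a toy illustration of the multiedge-to-binary relation (assuming Poisson-distributed multiedge counts, a special case of the paper's more general maximum-entropy ensembles):

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_multiedge(rates):
    """Sample a multiedge network: the number of distinguishable
    edges on each node pair is Poisson with the given rate."""
    return rng.poisson(rates)

def binary_projection(w):
    """A binary link exists wherever at least one multiedge does."""
    return (w > 0).astype(int)

# Under the Poisson assumption the binary connection probability
# follows analytically: p_ij = 1 - exp(-rate_ij).
rates = np.array([[0.0, 2.0], [2.0, 0.0]])
print(binary_projection(poisson_multiedge(rates)))
```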

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Specifically, we aim to demonstrate that the results of our earlier safety data hold true in this much larger multi-national and multi-ethnic population. BACKGROUND: We sought to re-evaluate the frequency, manifestations, and severity of acute adverse reactions associated with the administration of several gadolinium-based contrast agents during routine CMR on a European level. METHODS: Multi-centre, multi-national, and multi-ethnic registry with consecutive enrolment of patients in 57 European centres. RESULTS: During the current observation, 37,788 doses of gadolinium-based contrast agent were administered to 37,788 patients. The mean dose was 24.7 ml (range 5-80 ml), which is equivalent to 0.123 mmol/kg (range 0.01-0.3 mmol/kg). Forty-five acute adverse reactions due to contrast administration occurred (0.12%). Most reactions were classified as mild (43 of 45) according to the American College of Radiology definition. The most frequent complaints following contrast administration were rashes and hives (15 of 45), followed by nausea (10 of 45) and flushes (10 of 45). The event rate ranged from 0.05% (linear non-ionic agent gadodiamide) to 0.42% (linear ionic agent gadobenate dimeglumine). Interestingly, we also found different event rates between the three main indications for CMR, ranging from 0.05% (risk stratification in suspected CAD) to 0.22% (viability in known CAD). CONCLUSIONS: The current data indicate that the results of the earlier safety data hold true in this much larger multi-national and multi-ethnic population. Thus, the "off-label" use of gadolinium-based contrast agents in cardiovascular MR should be regarded as safe concerning the frequency, manifestations, and severity of acute events.

Relevance:

30.00%

Publisher:

Abstract:

In accordance with Moore's law, the increasing number of on-chip integrated transistors has enabled modern computing platforms with not only higher processing power but also more affordable prices. As a result, these platforms, including portable devices, workstations and data centres, are becoming an inevitable part of human society. However, with the demand for portability and the rising cost of power, energy efficiency has emerged as a major concern for modern computing platforms. As the complexity of on-chip systems increases, Network-on-Chip (NoC) has proven to be an efficient communication architecture that can further improve system performance and scalability while reducing the design cost. Therefore, in this thesis, we study and propose energy optimization approaches based on the NoC architecture, with a special focus on the following aspects. As the architectural trend of future computing platforms, 3D systems have many benefits including higher integration density, smaller footprint, heterogeneous integration, etc. Moreover, 3D technology can significantly improve network communication and effectively avoid long wires, thereby providing higher system performance and energy efficiency. Given the dynamic nature of on-chip communication in large-scale NoC-based systems, run-time system optimization is of crucial importance in order to achieve higher system reliability and, essentially, energy efficiency. In this thesis, we propose an agent-based system design approach where agents are on-chip components which monitor and control system parameters such as supply voltage, operating frequency, etc. With this approach, we have analysed the implementation alternatives for dynamic voltage and frequency scaling and power gating techniques at different granularities, which reduce both dynamic and leakage energy consumption. Topologies, being one of the key factors for NoCs, are also explored for energy-saving purposes. A Honeycomb NoC architecture is proposed in this thesis with turn-model-based deadlock-free routing algorithms. Our analysis and simulation-based evaluation show that Honeycomb NoCs outperform their Mesh-based counterparts in terms of network cost, system performance and energy efficiency.
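
A minimal sketch of what such a monitoring-and-control agent could look like (the voltage/frequency levels and thresholds are hypothetical; the thesis's agent hierarchy and on-chip transport are not modelled):

```python
# Toy per-region DVFS agent: watch utilization and temperature,
# then pick one of a few voltage/frequency operating points.
LEVELS = [(0.8, 400e6), (0.9, 600e6), (1.0, 800e6), (1.1, 1000e6)]  # (V, Hz)

def choose_level(level, utilization, temperature,
                 util_hi=0.85, util_lo=0.40, temp_limit=80.0):
    if temperature > temp_limit and level > 0:
        return level - 1          # thermal emergency: scale down
    if utilization > util_hi and level < len(LEVELS) - 1:
        return level + 1          # near saturation: scale up
    if utilization < util_lo and level > 0:
        return level - 1          # under-utilized: save dynamic energy
    return level

print(LEVELS[choose_level(2, utilization=0.95, temperature=60.0)])
```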

Relevance:

30.00%

Publisher:

Abstract:

Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the utilization of the resources has to be efficient in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
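
One dynamic reconfiguration mechanism of the kind described is task remapping after a fault; the toy policy below (migrate to the least-loaded healthy core, a hypothetical stand-in for the thesis's formally refined mechanisms) shows the shape of such an agent action:

```python
# Toy reconfiguration step: when a core is flagged faulty, migrate
# each of its tasks to the least-loaded healthy core.
def remap_on_fault(mapping, loads, faulty_core):
    """mapping: task -> core; loads: core -> task count."""
    for task, core in list(mapping.items()):
        if core == faulty_core:
            target = min((c for c in loads if c != faulty_core),
                         key=loads.get)
            mapping[task] = target
            loads[target] += 1
    loads.pop(faulty_core, None)
    return mapping

m = remap_on_fault({"t1": 0, "t2": 1}, {0: 1, 1: 1, 2: 0}, faulty_core=0)
print(m)   # t1 moved to core 2, the least-loaded healthy core
```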

Relevance:

30.00%

Publisher:

Abstract:

In this work an agent-based model (ABM) was proposed using the main idea of the Jabłonska-Capasso-Morale (JCM) model and the concept of maximized greediness. Using a multi-agent simulator, the power of the ABM was assessed against historical silver prices from 01.03.2000 to 01.03.2013. The model results, analysed in two different situations, with and without maximized greediness, show that the ABM is capable of explaining silver price dynamics even during extreme events. The ABM without maximal greediness explained the prices with more irrationality, whereas the ABM with maximal greediness tracked the price movements with more rational decisions. In the comparison test, the model without maximal greediness proved best at capturing the silver market dynamics. The proposed ABM therefore supports the suggested explanations of financial crises or market failures: it indicates that an economic or financial collapse may be stimulated by both irrational and rational decisions, yet irrationality may dominate the market.
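
A schematic stand-in for such a model (not the JCM equations; the demand terms and the role of the greediness weight below are purely illustrative) could be:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_prices(n_agents=200, n_steps=500, greediness=0.0, p0=5.0):
    """Toy agent-based price path: each agent's demand mixes a rational
    mean-reverting term and an irrational trend-following term; the
    greediness weight tilts agents toward the trend."""
    p = np.full(n_steps, p0)
    for t in range(1, n_steps):
        trend = p[t - 1] - p[t - 2] if t > 1 else 0.0
        rational = rng.normal(p0 - p[t - 1], 0.1, n_agents)   # revert
        irrational = rng.normal(trend, 0.1, n_agents)         # follow
        demand = ((1 - greediness) * rational
                  + greediness * irrational).mean()
        p[t] = max(p[t - 1] + 0.5 * demand + rng.normal(0, 0.05), 0.01)
    return p

print(simulate_prices(greediness=0.8)[-5:])
```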

Relevance:

30.00%

Publisher:

Abstract:

Pérez-Castrillo and Wettstein (2002) propose a multi-bidding mechanism to determine a winner from a set of possible projects. The winning project is implemented and its surplus is shared among the agents. In the multi-bidding mechanism each agent announces a vector of bids, one for each possible project, constrained to sum to zero. In addition, each agent chooses a favorite project, which is used as a tie-breaker if several projects receive the same highest aggregate bid. Since more desirable projects receive larger bids, it is natural to consider the multi-bidding mechanism without the announcement of favorite projects. We show that the merits of the multi-bidding mechanism are not robust to this natural simplification. Specifically, a Nash equilibrium exists if and only if there are at least two individually optimal projects and all individually optimal projects are efficient.
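
The winner determination itself is simple to state in code; a sketch follows (with a simplified favorite-count tie-break standing in for the mechanism's actual rule):

```python
import numpy as np

def multibidding_winner(bids, favorites):
    """bids: (n_agents, n_projects) array, each row summing to zero;
    favorites: one favorite project index per agent."""
    assert np.allclose(bids.sum(axis=1), 0.0), "bids must sum to zero"
    aggregate = bids.sum(axis=0)
    tied = np.flatnonzero(np.isclose(aggregate, aggregate.max()))
    if tied.size == 1:
        return int(tied[0])
    # tie-break: most favorite votes among the tied projects
    votes = np.bincount([f for f in favorites if f in tied],
                        minlength=bids.shape[1])
    return int(tied[np.argmax(votes[tied])])

bids = np.array([[1.0, -1.0],
                 [0.5, -0.5]])
print(multibidding_winner(bids, favorites=[0, 0]))  # project 0 wins
```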

Relevance:

30.00%

Publisher:

Abstract:

One of the fastest-expanding areas of computer exploitation is embedded systems, whose prime function is not computing, but which nevertheless require information processing in order to carry out their prime function. Advances in hardware technology have made multi-microprocessor systems a viable alternative to uniprocessor systems in many embedded application areas. This thesis reports the results of investigations carried out on multi-microprocessors oriented towards embedded applications, with a view to enhancing throughput and reliability. An ideal controller for multiprocessor operation is developed which would smooth the sharing of routines and enable more powerful and efficient code/data interchange. Results of performance evaluation are appended. A typical application scenario is presented, which calls for classifying tasks based on characteristic features that were identified. The different classes are introduced along with a partitioned storage scheme, and a theoretical analysis is given. A review of available schemes for reducing disc access time is carried out and a new scheme is presented, which is found to speed up database transactions in embedded systems. The significance of software maintenance and adaptation in such applications is highlighted, and a novel scheme of providing a maintenance folio to system firmware is presented, along with experimental results. Processing reliability can be enhanced if a facility exists to check whether a particular instruction in a stream is appropriate; such a likelihood estimate is more dependable when the instruction set is small. A new organisation is derived to form the foundation for further work, and some early results that will help steer the course of the work are presented.

Relevance:

30.00%

Publisher:

Abstract:

The goal of this work is to develop an Open Agent Architecture for multilingual information retrieval from relational databases. The query for information retrieval can be given in plain Hindi or Malayalam, two prominent regional languages of India. The system supports distributed processing of user requests through collaborating agents. Natural language processing techniques are used to extract meaning from the plain query, and the information is returned to the user in his/her native language. The system architecture is designed in a structured way so that it can be adapted to other regional languages of India.
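
Only the final retrieval step is easy to sketch without the paper's agents; assuming an upstream NLP agent has already mapped a Hindi or Malayalam query to a structured intent, a retrieval agent might build a parameterized SQL query like this (the intent schema and table/column names are hypothetical):

```python
def build_query(intent):
    """intent example: {"entity": "books", "filters": {"author": "..."}}"""
    where = " AND ".join(f"{col} = ?" for col in intent["filters"])
    sql = f"SELECT * FROM {intent['entity']}"
    if where:
        sql += f" WHERE {where}"
    return sql, tuple(intent["filters"].values())

print(build_query({"entity": "books", "filters": {"author": "Premchand"}}))
# ('SELECT * FROM books WHERE author = ?', ('Premchand',))
```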

Relevance:

30.00%

Publisher:

Abstract:

Speaker(s): Prof. David Evans Organiser: Dr Tim Chown Time: 22/05/2014 10:45-11:45 Location: B53/4025 Abstract Secure multi-party computation enables two (or more) participants to reliably compute a function that depends on both of their inputs, without revealing those inputs to the other party or needing to trust any other party. It could enable two people who meet at a conference to learn whom they know in common without revealing any of their other contacts, or allow a pharmaceutical company to determine the correct dosage of a medication based on a patient's genome without compromising the privacy of the patient. A general solution to this problem has been known since Yao's pioneering work in the 1980s, but only recently has it become conceivable to use this approach in practice. Over the past few years, my research group has worked towards making secure computation practical for real applications. In this talk, I'll provide a brief introduction to secure computation protocols, describe the techniques we have developed to design scalable and efficient protocols, and share some recent results on improving efficiency and on how secure computing applications are developed.
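
Yao's protocol itself relies on garbled circuits and is beyond a few lines, but the flavor of computing on hidden inputs can be shown with additive secret sharing, a building block of many MPC protocols (a toy, semi-honest illustration only):

```python
import secrets

P = 2**61 - 1  # public modulus

def share(x):
    """Split x into two additive shares, each uniformly random mod P."""
    r = secrets.randbelow(P)
    return r, (x - r) % P

# Two-party sum: Alice holds a, Bob holds b; each exchanges one share.
a, b = 42, 58
a_keep, a_send = share(a)          # Alice sends a_send to Bob
b_keep, b_send = share(b)          # Bob sends b_send to Alice
s_alice = (a_keep + b_send) % P    # Alice's published partial sum
s_bob = (b_keep + a_send) % P      # Bob's published partial sum
assert (s_alice + s_bob) % P == a + b   # only the sum is revealed
```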

Relevance:

30.00%

Publisher:

Abstract:

Whilst radial basis function (RBF) equalizers have been employed to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizer's generalization capability. In this paper, it is first proposed that the model's generalization capability can be improved by treating the modelling problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets. Then, as a modelling application, a new RBF equalizer learning scheme is introduced based on directional evolutionary MOO (EMOO). Directional EMOO improves the computational efficiency of conventional EMOO, which has been widely applied in solving MOO problems, by explicitly making use of directional information. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good performance not only in explaining the training samples but also in predicting unseen samples.
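
The paper's key modelling move, one objective per training set, is easy to sketch around a Gaussian RBF model (the directional EMOO search operators themselves are not reproduced; this only shows the objective vector such a search would minimize):

```python
import numpy as np

def rbf_predict(X, centers, widths, weights):
    """Gaussian RBF equalizer output for inputs X of shape (n, d)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * widths ** 2)) @ weights

def objective_vector(params, training_sets):
    """One mean-squared-error objective per training set, i.e. the
    multi-objective fitness an EMOO search would minimize."""
    centers, widths, weights = params
    return [np.mean((rbf_predict(X, centers, widths, weights) - y) ** 2)
            for X, y in training_sets]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2)); y = X.sum(axis=1)
params = (rng.normal(size=(4, 2)), np.ones(4), rng.normal(size=4))
print(objective_vector(params, [(X, y)]))
```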