954 results for Emerging Paradigm Shift
Abstract:
In the search for the "vulnerable plaque", which carries a particularly high risk of stroke and myocardial infarction, a paradigm shift is currently taking place. Instead of the classic degree of stenosis, imaging of plaque morphology is gaining in importance. Objective: The aim of this work is to investigate the ability of a modern 16-channel CT scanner to resolve the interior of atherosclerotic carotid plaques and to characterize the halo effect in vivo. Methods: For the study, CT images were acquired from 28 patients with known symptomatic carotid stenosis before vascular surgery and subsequently correlated with the histology of the excised vessel specimens. In this way, the microscopically identified lipid cores could be outlined on the CT images and evaluated with regard to their area and density values. In a further step, two radiologists, blinded to the histological results, independently read the images and marked presumed lipid cores. In both the blinded and the histology-controlled evaluations, plaque types were also assigned according to the AHA classification. A third reading was performed with the aid of software developed by us that color-codes CT images in order to improve the detection of lipid cores. From the color coding, an index value was additionally computed that was intended to allow an objective assignment to the AHA classification. For 6 patients, an additional unenhanced CT scan was acquired and exactly registered to the contrast-enhanced series by multiplanar reconstruction (MPR). This made it possible to quantify and characterize the halo effect, which obscures the plaque components close to the lumen.
Results: While AHA classification by both the human reader and the software algorithm correlated well with histology (type IV/Va: 89%, type Vb: 70%, type Vc: 89%, type VI: 55%), lipid-core detection was insufficient in both cases and inter-reader variability was too large (Cohen's kappa: 18%). Objectifying the AHA classification of plaques by index calculation after color coding appears feasible, though not superior to the human reader. The fibrous cap cannot be delineated, because blooming artifacts from the contrast agent distort its HU values. This halo effect had a median width of 1.1 mm with a standard deviation of 0.38 mm. No dependence on the contrast-agent density in the vessel lumen could be demonstrated. The halo effect decayed at a median rate of -106 HU/mm, with a standard deviation of 33 HU/mm. Conclusion: With respect to the depiction of individual plaque components, CT technology remains inferior to the known capabilities of MRI, particularly regarding the fibrous cap. Its strength so far lies rather in assigning plaques to a coarse classification modeled on that of the AHA. The clinical relevance of this, however, remains to be investigated in larger future studies. The continuing development of computed tomography also gives hope for a higher-resolution depiction of plaque morphology in the future.
Abstract:
The thesis analyzes the representation of the maquis in contemporary Spanish literature, in works written in Castilian and published from 1985 to the present. It is organized into three chapters: the first presents, at a theoretical level, the methodology and tools used in the study, and focuses above all on the attempt to define and catalogue the novels of the maquis, with particular attention to the cultural climate they refer to, introducing the postmodern and neomodern aesthetics and seeking to situate the narrative works that make up the research corpus. The second chapter centers instead on the analysis of the relationship between History and narration: besides addressing the interdisciplinary debate on the connections between History and fiction, it seeks to account for the reflections of this contemporary discussion within the works of the corpus. Finally, the third chapter concerns the analysis of the animal metaphors traceable in the selected maquis novels, concentrating mainly on Luna de lobos by Julio Llamazares and La agonía del búho chico by Justo Vila. The use of this rhetorical figure, found to varying degrees in all the selected narrative works, responds both to a search for verisimilitude and to the modes of representing empirical reality, drawing attention back to the methods of figuration and of access to knowledge of the world.
The proposed aesthetic and narrative paradigm shift under way in contemporary Spanish literature is then tested in the analysis, which investigates whether, and to what extent, the maquis novel takes part in today's literary debate and how much it contributes to the development of that still-emerging aesthetic paradigm, the neomodern one, seeking confirmation in the presence of the themes identified in the theoretical and methodological discussion as basic, structuring traits of the works.
Abstract:
Advances in ICT technologies and falling production costs have led to a remarkable rise in cybercrime. The change, however, has not been merely quantitative: a paradigm shift can be observed in cyber attacks, from fully opportunistic ones, i.e., without a specific target, to targeted attacks aimed at a particular person, company or nation. The aim of my thesis is to analyze models and taxonomies of both attack and defense, and then to evaluate an effective defense strategy against targeted attacks. The work was carried out in a corporate setting as part of an internship. As a starting point, I performed a targeted attack against the company in question to assess the soundness of its defense systems. The attack succeeded, demonstrating the ineffectiveness of modern defense systems. By analyzing why the attack went undetected, I arrived at a defense strategy against targeted attacks framed as a service rather than a product. My proposal is a conceptual framework, called WASTE (Warning Automatic System for Targeted Events), whose purpose is to provide warnings to a team of analysts starting from non-suspicious events, and a business process I named HAZARD (Hacking Approach for Zealot Attack Response and Detection), which models the complete defense service against targeted attacks. Finally, I applied the process within the company to mitigate cyber threats and attacks.
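The warning-from-benign-events idea behind WASTE can be illustrated with a toy correlation rule (our own sketch, not the thesis' implementation; host names, event types and thresholds are hypothetical): individually harmless events are grouped per host, and a warning is emitted only when several distinct event types co-occur within a time window.

```python
from collections import defaultdict

# Toy sketch (not the WASTE implementation): correlate individually
# benign events per host and warn when several distinct event types
# co-occur within a sliding time window.
def correlate(events, window=300, min_distinct=3):
    """events: list of (timestamp, host, event_type) tuples."""
    warnings = []
    by_host = defaultdict(list)
    for ts, host, etype in sorted(events):
        by_host[host].append((ts, etype))
        # keep only events inside the sliding window for this host
        by_host[host] = [(t, e) for t, e in by_host[host] if ts - t <= window]
        distinct = {e for _, e in by_host[host]}
        if len(distinct) >= min_distinct:
            warnings.append((ts, host, sorted(distinct)))
    return warnings

# Hypothetical low-severity events; none is alarming on its own.
events = [
    (0, "srv1", "login_unusual_hour"),
    (60, "srv1", "new_process"),
    (120, "srv2", "login_unusual_hour"),
    (200, "srv1", "outbound_dns_spike"),
]
print(correlate(events))
```

Only `srv1` accumulates three distinct event types inside the window, so a single warning is raised for it; `srv2`'s lone event stays silent.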
Abstract:
Resource management is of paramount importance in network scenarios and is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been kept in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to define more specific and complex tasks that involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees is pushing network programmers to design network protocols that deliver certain performance guarantees. This thesis exploits SDN in conjunction with OpenFlow to manage differentiated network services with a high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between this architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network with differentiated services and stringent QoS requirements.
We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol and can provide optimal results to be implemented on the campus network.
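The control/data-plane split that SDN introduces can be sketched in a few lines (an illustrative toy, not the thesis architecture or the OpenFlow API; rule fields and queue names are hypothetical): the controller installs prioritized match/action rules, and the "switch" merely applies the highest-priority matching rule to each packet.

```python
# Toy model of the SDN idea: the control plane installs rules,
# the data plane only matches packets against them.
class FlowTable:
    def __init__(self):
        self.rules = []  # (priority, match_dict, action)

    def install(self, priority, match, action):
        """Control plane: push a rule down to the data plane."""
        self.rules.append((priority, match, action))
        self.rules.sort(key=lambda r: -r[0])  # highest priority first

    def forward(self, packet):
        """Data plane: apply the first matching rule, else drop."""
        for _, match, action in self.rules:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "drop"

table = FlowTable()
# Real-time traffic is steered to a high-priority queue,
# everything else to a best-effort default queue.
table.install(100, {"ip_proto": "udp", "dst_port": 5004}, "enqueue:q1")
table.install(10, {}, "enqueue:q0")  # empty match = wildcard default

print(table.forward({"ip_proto": "udp", "dst_port": 5004}))  # enqueue:q1
print(table.forward({"ip_proto": "tcp", "dst_port": 80}))    # enqueue:q0
```

The priority-ordered lookup mirrors how differentiated services can be expressed centrally: QoS policy lives in the rules, not in the forwarding device.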
Abstract:
In recent years, Deep Learning techniques have been shown to perform well on a large variety of problems in both Computer Vision and Natural Language Processing, reaching and often surpassing the state of the art on many tasks. The rise of deep learning is also revolutionizing the entire field of Machine Learning and Pattern Recognition, pushing forward the concepts of automatic feature extraction and unsupervised learning in general. However, despite its strong success in both science and business, deep learning has its own limitations. It is often questioned whether such techniques are merely a kind of brute-force statistical approach and whether they can only work in the context of High Performance Computing with vast amounts of data. Another important question is whether they are really biologically inspired, as claimed in certain cases, and whether they can scale well in terms of "intelligence". The dissertation focuses on trying to answer these key questions in the context of Computer Vision and, in particular, Object Recognition, a task that has been heavily revolutionized by recent advances in the field. Practically speaking, these answers are based on an exhaustive comparison of two very different deep learning techniques on the aforementioned task: the Convolutional Neural Network (CNN) and Hierarchical Temporal Memory (HTM). They stand for two different approaches and points of view within the broad umbrella of deep learning and are good choices for understanding and pointing out the strengths and weaknesses of each. The CNN is considered one of the most classic and powerful supervised methods used today in machine learning and pattern recognition, especially in object recognition. CNNs are well received and accepted by the scientific community and are already deployed in large corporations like Google and Facebook for solving face recognition and image auto-tagging problems.
HTM, on the other hand, is a newly emerging paradigm and a new, mainly unsupervised method that is more biologically inspired. It tries to gain insights from the computational neuroscience community in order to incorporate concepts such as time, context and attention, which are typical of the human brain, during the learning process. Ultimately, the thesis aims to show that in certain cases, with a smaller quantity of data, HTM can outperform the CNN.
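The core operation of a CNN, the convolution contrasted above with HTM's approach, can be shown in a minimal, framework-free sketch (the tiny "image" and the kernel are illustrative only): a 3x3 edge-detecting kernel slides over a 2-D grid and produces a feature map.

```python
# Minimal 2-D convolution, the building block of a CNN, in pure Python.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # dot product of the kernel with the window at (i, j)
            acc = 0
            for u in range(kh):
                for v in range(kw):
                    acc += image[i + u][j + v] * kernel[u][v]
            row.append(acc)
        out.append(row)
    return out

# A vertical edge between the dark left half and the bright right half.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]
print(conv2d(image, sobel_x))  # strong response where the edge lies
```

In a real CNN many such learned kernels are stacked and interleaved with nonlinearities and pooling; the sliding dot product itself is exactly what is shown here.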
Abstract:
An imaging biomarker providing an early quantitative metric of clinical treatment response in cancer patients would represent a paradigm shift in cancer care. Current non-image-based clinical outcome metrics include morphological, clinical and laboratory parameters; however, these are obtained relatively late after treatment. Diffusion-weighted MRI (DW-MRI) holds promise for use as a cancer treatment response biomarker, as it is sensitive to macromolecular and microstructural changes which can occur at the cellular level earlier during therapy than anatomical changes. Studies have shown that successful treatment of many tumor types can be detected with DW-MRI as an early increase in apparent diffusion coefficient (ADC) values. Additionally, low pretreatment ADC values of various tumors are often predictive of better outcome. These capabilities, once validated, could provide an important opportunity to individualize therapy, thereby minimizing the unnecessary systemic toxicity associated with ineffective therapies, with the additional advantage of improving overall patient health care and reducing associated costs. In this report, we provide a brief technical overview of DW-MRI acquisition protocols and quantitative image analysis approaches, and review studies which have implemented DW-MRI for the purpose of early prediction of cancer treatment response.
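The ADC mentioned above comes from the standard mono-exponential diffusion model S(b) = S0 * exp(-b * ADC); with two b-values it reduces to a log-ratio, as this small sketch shows (the signal values below are illustrative only, not taken from any study):

```python
import math

# Two-point ADC estimate from the model S(b) = S0 * exp(-b * ADC).
def adc_two_point(s_low, s_high, b_low, b_high):
    """ADC in mm^2/s from signals at two b-values (b in s/mm^2)."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Illustrative signals at b = 0 and b = 1000 s/mm^2.
adc = adc_two_point(1000.0, 350.0, 0.0, 1000.0)
print(f"ADC = {adc:.2e} mm^2/s")
```

Computed voxel-wise, this yields the ADC maps whose early post-treatment increase the review discusses as a response signal.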
Abstract:
Ulcerated diabetic foot is a complex problem. Ischaemia, neuropathy and infection are the three pathological components that lead to diabetic foot complications, and they frequently occur together as an aetiologic triad. Neuropathy and ischaemia are the initiating factors, most often together as neuroischaemia, whereas infection is mostly a consequence. The role of peripheral arterial disease in the diabetic foot has long been underestimated, as typical ischaemic symptoms are less frequent in diabetics with ischaemia than in non-diabetics. Furthermore, the healing of a neuroischaemic ulcer is hampered by microvascular dysfunction. Therefore, the threshold for revascularising neuroischaemic ulcers should be lower than that for purely ischaemic ulcers. Previous guidelines have largely ignored these specific demands related to ulcerated neuroischaemic diabetic feet. Any diabetic foot ulcer should always be considered to have vascular impairment unless proven otherwise. Early referral, non-invasive vascular testing, imaging and intervention are crucial to improve diabetic foot ulcer healing and to prevent amputation. Timing is essential, as the window of opportunity to heal the ulcer and save the leg is easily missed. This chapter underlines the paucity of data on the best way to diagnose and treat these diabetic patients. Most of the studies dealing with neuroischaemic diabetic feet are not comparable in terms of patient populations, interventions or outcomes. Therefore, there is an urgent need for a paradigm shift in diabetic foot care, that is, a new approach to and classification of diabetics with vascular impairment in regard to clinical practice and research. A multidisciplinary approach needs to be implemented systematically, with a vascular surgeon as an integrated member.
New strategies must be developed and implemented for diabetic foot patients with vascular impairment to improve healing, to speed up the healing rate and to avoid amputation, irrespective of the intervention technology chosen. Focused studies on the value of predictive tests, new treatment modalities, and selective and targeted strategies are needed. As specific data on ulcerated neuroischaemic diabetic feet are scarce, recommendations are often of low grade.
Abstract:
Seizures are often the presenting symptom of a cerebral tumor and may precede its diagnosis by many years. The article under evaluation searched two large English registries for patients admitted for new-onset epilepsy. The risk of subsequently being diagnosed with a malignant brain tumor was found to be 26-fold higher compared with controls; it persisted over many years and was accentuated in young patients. Recently, surgical advances have led to a significant decrease in surgical morbidity, making surgery the first treatment option for gliomas, especially low-grade gliomas. This paradigm shift warrants a consistent diagnostic workup (MRI) in patients at risk for low-grade glioma, that is, patients with new-onset epilepsy. The study is discussed in the context of the ongoing debate on neuroimaging after new-onset epilepsy.
Abstract:
Targeting neuroendocrine tumors expressing somatostatin receptor subtypes (sst) with radiolabeled somatostatin agonists is an established diagnostic and therapeutic approach in oncology. While agonists readily internalize into tumor cells, permitting accumulation of radioactivity, radiolabeled antagonists do not, and they have not been considered for tumor targeting. The macrocyclic chelator 1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid (DOTA) was coupled to two potent somatostatin receptor-selective peptide antagonists [NH(2)-CO-c(DCys-Phe-Tyr-DAgl(8)(Me,2-naphthoyl)-Lys-Thr-Phe-Cys)-OH (sst(3)-ODN-8) and a sst(2)-selective antagonist (sst(2)-ANT)], for labeling with (111/nat)In. (111/nat)In-DOTA-sst(3)-ODN-8 and (111/nat)In-DOTA-[4-NO(2)-Phe-c(DCys-Tyr-DTrp-Lys-Thr-Cys)-DTyr-NH(2)] ((111/nat)In-DOTA-sst(2)-ANT) showed high sst(3)- and sst(2)-binding affinity, respectively. They did not trigger sst(3) or sst(2) internalization but prevented agonist-stimulated internalization. (111)In-DOTA-sst(3)-ODN-8 and (111)In-DOTA-sst(2)-ANT were injected intravenously into mice bearing sst(3)- and sst(2)-expressing tumors, and their biodistribution was monitored. In the sst(3)-expressing tumors, strong accumulation of (111)In-DOTA-sst(3)-ODN-8 was observed, peaking at 1 h with 60% injected radioactivity per gram of tissue and remaining at a high level for >72 h. Excess of sst(3)-ODN-8 blocked uptake. As a control, the potent agonist (111)In-DOTA-[1-Nal(3)]-octreotide, with strong sst(3)-binding and internalization properties showed a much lower and shorter-lasting uptake in sst(3)-expressing tumors. Similarly, (111)In-DOTA-sst(2)-ANT was injected into mice bearing sst(2)-expressing tumors. Tumor uptake was considerably higher than with the highly potent sst(2)-selective agonist (111)In-diethylenetriaminepentaacetic acid-[Tyr(3),Thr(8)]-octreotide ((111)In-DTPA-TATE). Scatchard plots showed that antagonists labeled many more sites than agonists. 
Somatostatin antagonist radiotracers therefore are preferable over agonists for the in vivo targeting of sst(3)- or sst(2)-expressing tumors. Antagonist radioligands for other peptide receptors need to be evaluated in nuclear oncology as a result of this paradigm shift.
Abstract:
Factors such as instability and impingement lead to early cartilage damage and osteoarthritis of the hip joint. The surgical outcome of joint-preserving surgery about the hip joint depends on the preoperative quality of joint cartilage. For in vivo evaluation of cartilage quality, different biochemically sensitive magnetic resonance imaging (MRI) procedures have been tested, some of which have the potential of inducing a paradigm shift in the evaluation and treatment of cartilage damage and early osteoarthritis. Instead of reacting to late sequelae in a palliative way, physicians could assess cartilage damage early on, and the treatment intensity could be adequate and based on the disease stage. Furthermore, the efficiency of different therapeutic interventions could be evaluated and monitored. This article reviews the recent application of delayed gadolinium-enhanced MRI of cartilage (dGEMRIC) and discusses its use for assessing cartilage quality in the hip joint. dGEMRIC is more sensitive to early cartilage changes in osteoarthritis than are radiographic measures and might be a helpful tool for assessing cartilage quality.
Abstract:
The Austrian philosopher Ludwig Wittgenstein famously proposed a style of philosophy directed against certain pictures [Bild] that tacitly direct our language and forms of life. His aim was to show the fly the way out of the fly bottle and to fight against the bewitchment of our intelligence by means of language: “A picture held us captive. And we could not get outside it, for it lay in our language and language seemed to repeat it to us inexorably” (Wittgenstein 1953, 115). In this context Wittgenstein is talking of philosophical pictures, deep metaphors that have structured our language, but he also uses the term picture in other contexts (see Owen 2003, 83). I want to appeal to Wittgenstein in my use of the term ideology to refer to the way in which powerful underlying metaphors in neoclassical economics have a strong rhetorical and constitutive force at the level of public policy. Indeed, I am specifically speaking of the notion of ‘the performative’ in Wittgenstein and Austin. The notion of the knowledge economy has a prehistory in Hayek (1937; 1945), who founded the economics of knowledge in the 1930s; in Machlup (1962; 1970), who mapped the emerging employment shift to the US service economy in the early 1960s; and in the sociologists Bell (1973) and Touraine (1974), who began to tease out the consequences of these changes for social structure in the post-industrial society in the early 1970s. The term has since been taken up by economists, sociologists, futurists and policy experts to explain the transition to the so-called ‘new economy’. It is not just a matter of noting these discursive strands in the genealogy of the ‘knowledge economy’ and related or cognate terms. We can also make a number of observations on the basis of this brief analysis.
First, there has been a succession of terms like ‘postindustrial economy’, ‘information economy’, ‘knowledge economy’ and ‘learning economy’, each with a set of related concepts emphasising its social, political, management or educational aspects. Often these literatures do not cross-reference one another and tend to focus on only one aspect of the phenomena, leading to classic dichotomies such as that between economy and society, or knowledge and information. Second, these terms and their family concepts are discursive, historical and ideological products in the sense that they create their own meanings and often lead to constitutive effects at the level of policy. Third, while there is some empirical evidence to support claims concerning these terms, at the level of public policy these claims are empirically underdetermined and contain an integrating, visionary or futures component, which necessarily remains untested and is, perhaps, in principle untestable.
Abstract:
Society wrestles with mass social change congruent with economic globalization and the communications revolution. This change creates new challenges for the social work profession in the areas of social and economic justice. This article analyzes the terminology of the new global era, words that signify a paradigm shift in outlook, most of them a reaction to the new authoritarianism of the age. Globalization, oppression, social exclusion, human rights, harm reduction and restorative justice are the representative terms chosen.
Abstract:
Geometric morphometrics (GMM) methods are very popular in physical anthropology. One disadvantage common to the existing GMM methods is that, despite significant advancements in computed tomography (CT) and magnetic resonance imaging (MRI) technology, these methods still depend on landmarks or features that are either digitized directly from the subject surface or extracted from surface models or outlines derived from a laser surface scan or from a CT or MRI scan. All the remaining image content contained in a CT or MRI scan is ignored by these methods. In this paper, we present a complementary solution called Volumetric Morphometrics (VMM). With VMM, we are aiming for a paradigm shift from the landmarks and surfaces used in existing GMM approaches to the displacements and volumes of the new VMM approaches, taking full advantage of modern CT and MRI technology. Preliminary validation results on ancient human skulls are presented.
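The volume-based measurement idea behind VMM can be illustrated with a toy computation (our own sketch, not the paper's algorithm; the mask and voxel sizes are hypothetical): once a structure is segmented in a CT or MRI volume, every voxel contributes, and its volume is simply voxel count times voxel size.

```python
# Toy volumetric measurement: volume of a segmented structure is the
# number of foreground voxels times the physical size of one voxel.
def structure_volume(mask, voxel_dims_mm):
    """mask: nested lists of 0/1 voxels; voxel_dims_mm: (dz, dy, dx)."""
    count = sum(v for plane in mask for row in plane for v in row)
    dz, dy, dx = voxel_dims_mm
    return count * dz * dy * dx  # volume in mm^3

# A 2 x 2 x 2 volume with 5 foreground voxels, 0.5 mm isotropic voxels.
mask = [
    [[1, 1], [1, 0]],
    [[1, 0], [0, 1]],
]
print(structure_volume(mask, (0.5, 0.5, 0.5)))  # 5 * 0.125 mm^3
```

Unlike a landmark set, this measurement uses every voxel of the scan, which is the contrast with surface-based GMM the abstract draws.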
Abstract:
Information Centric Networking (ICN), as an emerging paradigm for the Future Internet, initially focused rather on bandwidth savings in wired networks, but it may also have significant potential to support communication in mobile wireless networks as well as in opportunistic network scenarios, where end systems have spontaneous but time-limited contact to exchange data. This chapter addresses the reasons why ICN has an important role in mobile and opportunistic networks by identifying several challenges in mobile and opportunistic Information-Centric Networks and discussing appropriate solutions for them. In particular, it discusses the issues of receiver and source mobility. Source mobility needs special attention. Solutions based on routing protocol extensions, indirection, and the separation of name resolution and data transfer are discussed. Moreover, the chapter presents solutions for problems in opportunistic Information-Centric Networks. Among these are mechanisms for efficient content discovery in neighbour nodes, resume mechanisms to recover from intermittent connectivity disruptions, a novel agent delegation mechanism to offload content discovery and delivery to mobile agent nodes, and the exploitation of overhearing to populate the routing tables of mobile nodes. Some preliminary performance evaluation results for these mechanisms are provided.
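The name-based retrieval that makes ICN attractive for opportunistic contacts can be sketched as follows (hypothetical names and API, not a specific ICN protocol): a node asks for content by name, and any encountered node holding a cached copy can answer, leaving a new copy behind for future contacts.

```python
# Toy ICN node: content is addressed by name, not by host, so any
# neighbour with a cached copy can satisfy a request.
class IcnNode:
    def __init__(self):
        self.store = {}  # content store: name -> data

    def publish(self, name, data):
        self.store[name] = data

    def request(self, name, neighbours):
        # serve locally if cached, otherwise probe encountered nodes
        if name in self.store:
            return self.store[name]
        for n in neighbours:
            if name in n.store:
                self.store[name] = n.store[name]  # cache on the way back
                return self.store[name]
        return None  # no contact had the content

a, b = IcnNode(), IcnNode()
b.publish("/video/seg1", b"segment-bytes")
print(a.request("/video/seg1", [b]))  # fetched by name from neighbour b
print("/video/seg1" in a.store)       # and now cached at a as well
```

The opportunistic benefit is visible in the last line: after one time-limited contact, `a` itself becomes a source for the named segment, which is the caching effect the chapter's discovery and resume mechanisms build on.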
Abstract:
The partial shift from patient to model is a reasonable and necessary paradigm shift in surgery, in order to increase patient safety and to adapt to the reduced training time in hospitals and increased quality demands. Since 1991 the Vascular International Foundation and School has carried out many training courses with more than 2,500 participants. The modular training system makes it possible to teach many open vascular and endovascular surgical techniques on lifelike models with a pulsatile circulation. The simulation courses cannot replace training in the operating room but are suitable for supporting the cognitive and associative stages of acquiring motor skills. Scientific evaluation of the courses has continually shown that the training principle established in 1991 can lead to significant learning success. The courses are extremely useful not only for beginners but also for experienced vascular surgeons. They can help to shorten the learning curve, to learn new techniques or to refine previously used techniques at all stages of professional development.
Keywords: Advanced training · Advanced training regulations · Training model · Vascular International · Certification