885 results for Self-help techniques.
Abstract:
Over the last few decades, unprecedented technological growth has been at the center of embedded systems design, with Moore's Law as the leading driver of this trend. Today an ever increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space that must be explored to find the best design has exploded, and hardware designers face the problem of navigating it. Virtual Platforms have long been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis two different research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs), with the goal of increased simulation speed. The term Virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application by hiding the complexity of the underlying hardware, and 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques that aim to mitigate, and overcome where possible, some of the challenges introduced by the many-core design paradigm.
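To make the notion of cycle-accurate many-core simulation concrete, the sketch below is a toy, lockstep simulation loop in Python; it is not the Virtual Platform developed in the thesis, and the ToyCore class and its methods are invented for illustration.

```python
# Minimal sketch of a lockstep, cycle-accurate many-core simulation loop.
# Hypothetical names; real virtual platforms also model pipelines, caches and NoC traffic.

class ToyCore:
    def __init__(self, core_id, program):
        self.core_id = core_id
        self.program = program      # list of per-cycle "work items"
        self.pc = 0                 # index of the next item to execute
        self.retired = 0            # items retired so far

    def step(self):
        """Advance this core by exactly one clock cycle."""
        if self.pc < len(self.program):
            _ = self.program[self.pc]   # execute one item (placeholder)
            self.pc += 1
            self.retired += 1

def simulate(cores, max_cycles):
    for cycle in range(max_cycles):
        for core in cores:              # every core advances by one cycle per global tick
            core.step()
        if all(c.pc >= len(c.program) for c in cores):
            return cycle + 1            # cycles needed until all cores finished
    return max_cycles

cores = [ToyCore(i, ["nop"] * (10 + i)) for i in range(16)]
print("simulated cycles:", simulate(cores, max_cycles=1000))
```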
Abstract:
This dissertation deals with the development and application of an alternative sample introduction technique for liquid samples in mass spectrometry. Although considerable effort has already gone into improving them, the conventional pneumatic nebulizer and spray chamber systems routinely used in trace element analysis with inductively coupled plasma (ICP) exhibit low overall efficiency. Pneumatically generated aerosol is characterized by a broad droplet size distribution, which necessitates a spray chamber to adapt the aerosol characteristics to the operating conditions of the ICP. Generating droplets with a very narrow size distribution, or even monodisperse droplets, could improve the efficiency of sample introduction. One aim of this work is therefore to use droplets generated by thermal inkjet printing for sample introduction in elemental mass spectrometry. In analytical chemistry, thermal inkjet printing has so far been used in surface analysis by TXRF or laser ablation for the targeted, reproducible deposition of droplets on surfaces. To enable continuous droplet generation, an electronic microcontroller was developed that can drive a dosing unit independently of the printer's hardware and software. All parameters relevant to droplet generation (frequency, heating pulse energy) can be adjusted independently of one another. The dosing unit, the "drop-on-demand" aerosol generator (DOD), was mounted on an aerosol transport chamber that carries the generated droplets into the ionization source. In inorganic trace analysis, combining the DOD with an automatic sampler made it possible to investigate 53 elements and to determine the achievable sensitivities as well as, for 15 representative elements, the limits of detection and the background equivalent concentrations. To make these advantages conveniently usable, a coupling of the DOD system with miniaturized flow injection analysis (FIA) and with miniaturized separation techniques such as µHPLC was developed. The flow injection method was validated with a certified reference material; the certified values for vanadium and cadmium were reproduced well. Transient signals could be recorded when the dosing system, in combination with ICP-MS, was coupled to a µHPLC. Modifying the dosing unit for coupling to a continuous sample flow still requires a further reduction of the remaining dead volume. To this end, independence from the commercially available printer cartridges used so far should be pursued by fabricating the dosing unit in-house. The versatility of the dosing system was demonstrated by coupling it to a recently developed atmospheric-pressure ionization method, the "flowing atmospheric-pressure afterglow" desorption/ionization source (FAPA). Direct introduction of liquid samples into this source had not been possible before; only desorption from dried residues or directly from the liquid surface could be performed, and the precision of such analyses is limited by the variable sample position.
With the DOD system, liquid samples can now be introduced directly into the FAPA, which also enables calibration for quantitative analyses of organic compounds. In addition to illegal drugs and their metabolites, over-the-counter pharmaceuticals and an explosive analogue could be detected in correspondingly spiked pure solvent. The same was achieved in urine samples spiked with drugs and drug metabolites. Notably, no sample preparation was required, and no internal or isotopically labelled standards were used to determine the limits of detection (LODs) of the individual species. Nevertheless, the LODs obtained are considerably lower than those achievable with the previous procedure for analyzing liquid samples. To accelerate solvent evaporation compared with the "pin-to-plate" FAPA geometry used so far, an alternative electrode arrangement was developed in which the sample remains in contact with the afterglow zone for a longer time. This glow discharge source is ring-shaped and allows sample introduction via a central gas flow. Because of the ring-shaped discharge, the name "halo-FAPA" (h-FAPA) is used for this discharge geometry. A fundamental physical and spectroscopic characterization showed that it is indeed a FAPA desorption/ionization source.
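As a back-of-the-envelope illustration of the sample consumption of such a drop-on-demand generator, the following Python sketch converts an assumed droplet diameter and generation frequency (illustrative values, not measurements from this work) into an effective liquid uptake rate, which lands orders of magnitude below the uptake of typical pneumatic nebulizers.

```python
import math

# Assumed, illustrative values (not measurements from this work): thermal
# inkjet droplets are typically a few tens of micrometres in diameter.
droplet_diameter_um = 30.0       # droplet diameter in µm
frequency_hz = 100.0             # droplet generation frequency in Hz

droplet_volume_um3 = (math.pi / 6.0) * droplet_diameter_um ** 3
droplet_volume_pl = droplet_volume_um3 / 1000.0          # 1 pL = 1000 µm^3

uptake_ul_per_min = droplet_volume_pl * 1e-6 * frequency_hz * 60.0
print(f"droplet volume ≈ {droplet_volume_pl:.1f} pL")
print(f"sample uptake  ≈ {uptake_ul_per_min:.3f} µL/min")
```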
Abstract:
In this thesis, different complex colloids were prepared by the process of solvent evaporation from emulsion droplets (SEED). The term "complex" is used to include both an addressable functionality and the heterogeneous nature of the colloids.
Firstly, as the SEED process was used throughout the thesis, its mechanism, especially with regard to coalescence, was investigated. A wide variety of techniques was employed to study the coalescence of nanodroplets during the evaporation of the solvent. Techniques such as DLS or FCS turned out not to be suitable for determining droplet coalescence because of their dependence on dilution. Thus, other methods were developed. TEM measurements were conducted on mixed polymeric emulsions, with the results pointing to an absence of coalescence; however, these results were not quantifiable. FRET measurements on mixed polymeric emulsions also indicated an absence of coalescence, but again the results were not quantifiable. The amount of coalescence taking place was then quantified by the application of DC-FCCS. This method also allowed coalescence to be measured in other processes such as miniemulsion polymerization or the polycondensation reaction at the interface of the droplets. Simulations showed that coalescence is not responsible for the usually observed broad size distribution of the produced particles. Therefore, the process itself, especially the emulsification step, needs to be improved to generate monodisperse colloids.
The Janus morphology is probably the best known among the different complex morphologies of nanoparticles. With the help of functional polymers, it was possible to marry click chemistry to Janus particles. A large library of functional polymers was prepared by copolymerization and subsequent post-functionalization or by ATRP. The polymers were then used to generate Janus particles by the SEED process. Both dually functionalized Janus particles and particles with one functionalized face could be obtained. The latter were used for the quantification of functional groups on the surface of the Janus particles. For this, clickable fluorescent dyes were synthesized. The degree of functionality of the polymers was found to be closely mirrored in the degree of functionality of the surface. Thus, the marriage of click chemistry to Janus particles was successful.
Another complex morphology besides Janus particles is the nanocapsule. Stimulus-responsive nanocapsules that show triggered release are a highly sought-after and interesting system, as nanocapsules have promising applications in drug delivery and in self-healing materials. To achieve heterogeneity in the polymer shell, the stimulus-responsive block copolymer PVFc-b-PMMA was employed for the preparation of the capsules. The phase separation of the two blocks in the shell of the capsules led to a patchy morphology. These patches could then be oxidized, resulting in morphology changes. In addition, swelling occurred because of the hydrophobic-to-hydrophilic transition of the patches induced by the oxidation. Due to the swelling, an encapsulated payload could diffuse out of the capsules; hence release was achieved.
The concept of using block copolymers responsive to one stimulus for the preparation of stimulus-responsive capsules was extended to block copolymers responsive to more than one stimulus. Here, a block copolymer responsive to oxidation and a pH change as well as a block copolymer responsive to a pH change and temperature were studied in detail. The release from the nanocapsules could be regulated by tuning the different stimuli. In addition, by encapsulating stimuli-responsive payloads it was possible to selectively release a payload upon one stimulus but not upon the other.
In conclusion, the approaches taken in the course of this thesis demonstrate the broad applicability and usefulness of the SEED process to generate complex colloids. In addition, the experimental techniques established, such as DC-FCCS, will provide further insight into other research areas as well.
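For orientation, the sketch below shows one common way dual-colour fluorescence cross-correlation (DC-FCCS) data are reduced to a single coalescence indicator, the relative cross-correlation amplitude; the amplitudes are invented and the normalization convention is an assumption, not necessarily the analysis used in this work.

```python
# Illustrative DC-FCCS estimate (assumed amplitudes, not data from the thesis):
# the relative cross-correlation q = Gx(0) / min(Gg(0), Gr(0)) is a common proxy
# for the fraction of dual-labelled, i.e. coalesced, droplets.

G_green_0 = 0.052   # autocorrelation amplitude, green-labelled droplets
G_red_0   = 0.048   # autocorrelation amplitude, red-labelled droplets
G_cross_0 = 0.003   # cross-correlation amplitude of the mixed emulsion

q = G_cross_0 / min(G_green_0, G_red_0)
print(f"relative cross-correlation q ≈ {q:.2f}  (q ≈ 0: no coalescence, q ≈ 1: full mixing)")
```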
Abstract:
Lymphedema of the arm is a common complication of breast cancer with symptoms that can persist over long periods of time. For older women (over 50% of breast cancer cases) it means living with the potential long-term complications of persistent lymphedema in conjunction with the common diseases and disabilities of aging throughout survivorship. We identified women ≥65 years diagnosed with primary stage I-IIIA breast cancer. Data were collected over 7 years of follow-up from consenting patients' medical records and telephone interviews. Data collected included self-reported symptoms of persistent lymphedema, breast cancer characteristics, and selected sociodemographic and health-related characteristics. The overall prevalence of symptoms of persistent lymphedema was 36% over 7 years of follow-up. Having stage II or III breast cancer (OR = 1.77, 95% CI: 1.07-2.93) and having a BMI >30 (OR = 3.04, 95% CI: 1.69-5.45) were statistically significant predictors of symptoms of persistent lymphedema. Women ≥80 years were less likely to report symptoms of persistent lymphedema than younger women (OR = 0.44, 95% CI: 0.18-0.95). Women with symptoms of persistent lymphedema consistently reported worse general mental health and physical function. Symptoms of persistent lymphedema were common in this population of older breast cancer survivors and had a noticeable effect on both physical function and general mental health. Our findings provide evidence of the impact of symptoms of persistent lymphedema on the quality of survivorship of older women. Clinical and research efforts focused on risk factors for symptoms of persistent lymphedema in older breast cancer survivors may lead to preventative and therapeutic measures that help maintain their health and well-being over increasing periods of survivorship.
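Odds ratios with 95% confidence intervals of the kind reported above are typically obtained from a logistic regression fit; the sketch below illustrates the computation on synthetic data with assumed effect sizes (it does not use the study's data).

```python
import numpy as np
import statsmodels.api as sm

# Synthetic illustration only: odds ratios for stage II/III and BMI > 30 as
# predictors of persistent-lymphedema symptoms, from a logistic regression.
rng = np.random.default_rng(0)
n = 400
stage_ii_iii = rng.integers(0, 2, n)          # 1 = stage II/III
bmi_over_30  = rng.integers(0, 2, n)          # 1 = BMI > 30
logit = -1.0 + 0.57 * stage_ii_iii + 1.11 * bmi_over_30   # assumed true effects
symptoms = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([stage_ii_iii, bmi_over_30]))
fit = sm.Logit(symptoms, X).fit(disp=0)

odds_ratios = np.exp(fit.params)              # exponentiated coefficients
ci = np.exp(fit.conf_int())                   # 95% confidence intervals
print("OR (stage II/III, BMI>30):", odds_ratios[1:])
print("95% CI:\n", ci[1:])
```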
Abstract:
Self-control is a prerequisite for complex cognitive processes such as cooperation and planning. As such, comparative studies of self-control may help elucidate the evolutionary origin of these capacities. A variety of methods have been developed to test for self-control in non-human primates, all of which include some variation of foregoing an immediate reward in order to gain a more favorable reward. We used a token exchange paradigm to test for self-control in capuchin monkeys (Cebus apella). Animals were trained that particular tokens could be exchanged for food items of different values. To test for self-control, a monkey was provided with a token that was associated with a lower-value food. When the monkey exchanged the token, the experimenter offered a choice between the lower-value food item associated with the token and another token that was associated with a higher-value food. If the monkey chose the token, it could then exchange that token for the higher-value food. Of seven monkeys trained to exchange tokens, five demonstrated that they attributed value to the tokens by differentially selecting tokens for higher-value foods over tokens for lower-value foods. When provided with a choice between a food item and a token for a higher-value food, two monkeys selected the token significantly more often than expected by chance. The ability of capuchin monkeys to forego an immediate food reward and select a token that could then be traded for a more preferred food demonstrates some degree of self-control. Thus, the results suggest that a token exchange paradigm can be a successful technique for assessing self-control in this New World species.
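The statement that a monkey chose the token "significantly more often than expected by chance" is typically supported by a binomial test against a 50% chance level; the sketch below illustrates such a test on assumed counts, not the study's raw data.

```python
from scipy.stats import binomtest

# Illustrative check (assumed counts): did a monkey choose the higher-value
# token more often than the 50% expected by chance?
token_choices = 19   # trials on which the token was chosen over the food item
trials = 24

result = binomtest(token_choices, trials, p=0.5, alternative="greater")
print(f"choice rate = {token_choices / trials:.2f}, one-sided p = {result.pvalue:.4f}")
```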
Abstract:
Mindfulness meditation describes a set of different mental techniques to train attention and awareness. Trait mindfulness and extended mindfulness interventions can benefit self-control. The present study investigated the short-term consequences of mindfulness meditation under conditions of limited self-control resources. Specifically, we hypothesized that a brief period of mindfulness meditation would counteract the deleterious effect that the exertion of self-control has on subsequent self-control performance. Participants who had been depleted of self-control resources by an emotion suppression task showed decrements in self-control performance as compared to participants who had not suppressed emotions. However, participants who had meditated after emotion suppression performed equally well on the subsequent self-control task as participants who had not exerted self-control previously. This finding suggests that a brief period of mindfulness meditation may serve as a quick and efficient strategy to foster self-control under conditions of low resources.
Abstract:
Stimulation of human epileptic tissue can induce rhythmic, self-terminating responses on the EEG or ECoG. These responses play a potentially important role in localising tissue involved in the generation of seizure activity, yet the underlying mechanisms are unknown. However, in vitro evidence suggests that self-terminating oscillations in nervous tissue are underpinned by non-trivial spatio-temporal dynamics in an excitable medium. In this study, we investigate this hypothesis in spatial extensions to a neural mass model for epileptiform dynamics. We demonstrate that spatial extensions of this model in one and two dimensions display propagating travelling waves as well as more complex transient dynamics in response to local perturbations. The neural mass formulation, with local excitatory and inhibitory circuits, allows the direct incorporation of spatially distributed functional heterogeneities into the model. We show that such heterogeneities can lead to prolonged reverberating responses to a single pulse perturbation, depending upon the location at which the stimulus is delivered. This leads to the hypothesis that prolonged rhythmic responses to local stimulation in epileptogenic tissue result from repeated self-excitation of regions of tissue with diminished inhibitory capabilities. Combined with previous models of the dynamics of focal seizures, this macroscopic framework is a first step towards an explicit spatial formulation of the concept of the epileptogenic zone. Ultimately, an improved understanding of the pathophysiologic mechanisms of the epileptogenic zone will help to improve diagnostic and therapeutic measures for treating epilepsy.
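The following sketch is a minimal one-dimensional lattice of coupled excitatory/inhibitory neural masses with a locally weakened inhibitory weight, in the spirit of the spatial extensions described above; the parameters, coupling form, and stimulation protocol are illustrative assumptions rather than the model used in the study.

```python
import numpy as np

# Minimal 1-D sketch of a spatially extended excitatory/inhibitory neural mass
# lattice (illustrative parameters). A brief pulse is applied at one node and
# the transient response can be inspected, e.g. inside vs. outside a patch of
# weakened inhibition.

def sigmoid(x, slope=4.0, thresh=1.0):
    return 1.0 / (1.0 + np.exp(-slope * (x - thresh)))

N, T, dt = 100, 4000, 0.1
tau_e, tau_i = 10.0, 20.0
w_ee, w_ei, w_ie, w_ii, w_lat = 16.0, 12.0, 15.0, 3.0, 4.0

# Spatial heterogeneity: a patch with weakened excitatory drive to inhibition.
w_ie_local = np.full(N, w_ie)
w_ie_local[45:55] = 9.0

E, I = np.zeros(N), np.zeros(N)
trace = np.zeros((T, N))
for t in range(T):
    stim = np.zeros(N)
    if 100 <= t < 120:
        stim[50] = 2.5                                   # local pulse perturbation
    lateral = w_lat * (np.roll(E, 1) + np.roll(E, -1) - 2 * E)   # diffusive coupling
    dE = (-E + sigmoid(w_ee * E - w_ei * I + lateral + stim)) / tau_e
    dI = (-I + sigmoid(w_ie_local * E - w_ii * I)) / tau_i
    E, I = E + dt * dE, I + dt * dI
    trace[t] = E

print(f"peak E inside heterogeneity (node 50): {trace[:, 50].max():.3f}")
print(f"peak E far from heterogeneity (node 5): {trace[:, 5].max():.3f}")
```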
Abstract:
The Simulation Automation Framework for Experiments (SAFE) streamlines the design and execution of experiments with the ns-3 network simulator. SAFE ensures that best practices are followed throughout the workflow of a network simulation study, guaranteeing that results are both credible and reproducible by third parties. Data analysis is a crucial part of this workflow, and one where mistakes are often made. Even when appearing in highly regarded venues, scientific graphics in numerous network simulation publications fail to include graphic titles, units, legends, and confidence intervals. After studying the literature in network simulation methodology and information graphics visualization, I developed a visualization component for SAFE to help users avoid these errors in their scientific workflow. The functionality of this new component includes support for interactive visualization through a web-based interface and for the generation of high-quality static plots that can be included in publications. The overarching goal of my contribution is to help users create graphics that follow best practices in visualization and thereby succeed in conveying the right information about simulation results.
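A minimal example of the kind of plot the visualization component is meant to encourage, with a title, labelled axes with units, a legend, and confidence intervals, might look like the following matplotlib sketch on synthetic data (the metric names and values are invented).

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative plot (synthetic data) following the practices discussed above:
# a title, labelled axes with units, a legend, and confidence intervals.
rng = np.random.default_rng(1)
offered_load = np.linspace(0.1, 1.0, 10)                  # fraction of link capacity
runs = rng.normal(loc=40 + 80 * offered_load**3, scale=5, size=(30, 10))

mean = runs.mean(axis=0)
ci95 = 1.96 * runs.std(axis=0, ddof=1) / np.sqrt(runs.shape[0])

fig, ax = plt.subplots()
ax.plot(offered_load, mean, marker="o", label="mean of 30 replications")
ax.fill_between(offered_load, mean - ci95, mean + ci95, alpha=0.3,
                label="95% confidence interval")
ax.set_title("End-to-end delay vs. offered load (synthetic example)")
ax.set_xlabel("Offered load (fraction of link capacity)")
ax.set_ylabel("Mean end-to-end delay (ms)")
ax.legend()
fig.savefig("delay_vs_load.png", dpi=300)
```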
Abstract:
The purpose of this paper is to examine ways in which pedagogy and instructor gender impact the development of self-regulated learning strategies, as assessed by the Motivated Strategies for Learning Questionnaire (MSLQ), in male and female undergraduate engineering students. Pedagogy was operationalized as two general formats: lecture plus active learning techniques, or problem-based/project-based learning. One hundred seventy-six students from four universities participated in the study. Within-group analyses found significant differences with regard to pedagogy, instructor gender, and student gender on the learning strategies and motivation subscales as operationalized by the MSLQ. Male and female students reported significant post-test differences with regard to the gender of the instructor and the style of pedagogy. The results of this study showed a pattern in which students of both genders responded more positively to instructors of the same gender. The results also suggested that male students responded more positively to project- and problem-based courses, with changes evidenced in motivation strategies and resource management, whereas female students showed decreases in resource management in these two types of courses. Further, female students reported increases in the lecture-with-active-learning courses.
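Pre/post comparisons on MSLQ subscales of the kind reported here are commonly analysed with paired tests; the sketch below runs a paired t-test on synthetic subscale scores for one hypothetical group (the group size, means, and effect are assumptions, not the study's data).

```python
import numpy as np
from scipy.stats import ttest_rel

# Synthetic illustration of a pre/post comparison on one MSLQ subscale
# (7-point scale) for a single student group.
rng = np.random.default_rng(2)
pre  = rng.normal(4.8, 0.9, 44).clip(1, 7)              # pre-test subscale scores
post = (pre + rng.normal(0.3, 0.6, 44)).clip(1, 7)      # post-test subscale scores

t_stat, p_value = ttest_rel(post, pre)
print(f"mean change = {np.mean(post - pre):.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```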
Abstract:
Comments that current proposals for licensure, accreditation, and 3rd-party reimbursement may have several unintended consequences. Until now, discussion has focused on the effects of the proposed regulations on the development of psychology as a profession. Recent proposals, however, may have unexpected adverse consequences in 3 other areas as well: the education of professionals within psychology, the delivery of psychological and other helping services, and the self-definition of the consumer of psychological services. Any changes in licensure, accreditation, and reimbursement require compromises among concerns for the profession, for the consumer, and for psychologists' livelihood.
Abstract:
Recently the issue of radiative corrections to leptogenesis has been raised. Considering the "strong washout" regime, in which OPE techniques permit the setup to be streamlined, we report the thermal self-energy matrix of heavy right-handed neutrinos at NLO (resummed 2-loop level) in Standard Model couplings. The renormalized expression describes flavour transitions and "inclusive" decays of chemically decoupled right-handed neutrinos. Although CP-violation is not addressed, the result may find use in existing leptogenesis frameworks.
Abstract:
CONCLUSION: Our self-developed planning and navigation system has proven its capacity for accurate surgery on the anterior and lateral skull base. With the incorporation of augmented reality, image-guided surgery will evolve into 'information-guided surgery'. OBJECTIVE: Microscopic or endoscopic skull base surgery is technically demanding and its outcome has a great impact on a patient's quality of life. The project aimed to develop and evaluate enabling navigation surgery tools for simulation, planning, training, education, and performance. This clinically applied technological research was complemented by a series of patients (n=406) who underwent anterior and lateral skull base procedures between 1997 and 2006. MATERIALS AND METHODS: Optical tracking technology was used for positional sensing of instruments. A newly designed dynamic reference base with specific registration techniques using a fine needle pointer or ultrasound enables the surgeon to work with a target error of < 1 mm. An automatic registration assessment method, which provides the user with a color-coded fused representation of CT and MR images, indicates to the surgeon the location and extent of registration (in)accuracy. Integration of a small tracker camera mounted directly on the microscope permits an advantageous ergonomic way of working in the operating room. Additionally, guidance information (augmented reality) from multimodal datasets (CT, MRI, angiography) can be overlaid directly onto the surgical microscope view. The virtual simulator, as a training tool in endonasal and otological skull base surgery, provides an understanding of the anatomy as well as preoperative practice using real patient data. RESULTS: Using our navigation system, no major complications occurred, even though the series included difficult skull base procedures. An improved quality of surgical outcome was identified compared with our control group without navigation and compared with the literature. Surgical time was reduced and more minimally invasive approaches were possible. According to the participants' questionnaires, the educational effect of the virtual simulator in our residency program received a high ranking.
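The sub-millimetre target error quoted above depends on point-based rigid registration between image space and patient space; the sketch below illustrates the generic Kabsch/SVD solution and the resulting fiducial registration error on invented fiducial coordinates (it is not the registration pipeline of the described system).

```python
import numpy as np

# Illustrative point-based rigid registration (Kabsch/SVD) of the general kind
# used to map image space to patient space with a tracked pointer; fiducial
# coordinates and noise level are invented for the example.
rng = np.random.default_rng(3)
image_pts = np.array([[10., 0., 0.], [0., 12., 0.], [0., 0., 9.], [8., 8., 8.]])

R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))        # random orthonormal matrix
if np.linalg.det(R_true) < 0:                            # force a proper rotation
    R_true[:, 0] *= -1
patient_pts = (image_pts @ R_true.T + np.array([5., -2., 30.])
               + rng.normal(scale=0.3, size=image_pts.shape))   # localization noise

# Kabsch: centre both point sets, then obtain the best-fit rotation via SVD.
ci_, cp_ = image_pts.mean(axis=0), patient_pts.mean(axis=0)
H = (image_pts - ci_).T @ (patient_pts - cp_)
U, _, Vt = np.linalg.svd(H)
D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
R = Vt.T @ D @ U.T
t = cp_ - R @ ci_

fre = np.sqrt(np.mean(np.sum((image_pts @ R.T + t - patient_pts) ** 2, axis=1)))
print(f"fiducial registration error (RMS) ≈ {fre:.2f} mm")
```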
Abstract:
Multi-input multi-output (MIMO) technology is an emerging solution for high data rate wireless communications. We develop soft-decision based equalization techniques for frequency selective MIMO channels in the quest for low-complexity equalizers with BER performance competitive to that of ML sequence detection. We first propose soft decision equalization (SDE), and demonstrate that decision feedback equalization (DFE) based on soft decisions, expressed via the posterior probabilities associated with feedback symbols, is able to outperform hard-decision DFE, with a low computational cost that is polynomial in the number of symbols to be recovered and linear in the signal constellation size. Building upon the probabilistic data association (PDA) multiuser detector, we present two new MIMO equalization solutions to handle the distinctive channel memory. With their low complexity, simple implementations, and impressive near-optimum performance offered by iterative soft-decision processing, the proposed SDE methods are attractive candidates to deliver efficient reception solutions to practical high-capacity MIMO systems. Motivated by the need for low-complexity receiver processing, we further present an alternative low-complexity soft-decision equalization approach for frequency selective MIMO communication systems. With the help of iterative processing, two detection and estimation schemes based on second-order statistics are harmoniously put together to yield a two-part receiver structure: local multiuser detection (MUD) using soft-decision Probabilistic Data Association (PDA) detection, and dynamic noise-interference tracking using Kalman filtering. The proposed Kalman-PDA detector performs local MUD within a sub-block of the received data instead of over the entire data set, to reduce the computational load. At the same time, all the interference affecting the local sub-block, including both multiple access and inter-symbol interference, is properly modeled as the state vector of a linear system and dynamically tracked by Kalman filtering. Two types of Kalman filters are designed, both of which are able to track a finite impulse response (FIR) MIMO channel of any memory length. The overall algorithms enjoy low complexity that is only polynomial in the number of information-bearing bits to be detected, regardless of the data block size. Furthermore, we introduce two optional performance-enhancing techniques: cross-layer automatic repeat request (ARQ) for uncoded systems and a code-aided method for coded systems. We take Kalman-PDA as an example and show via simulations that both techniques can render error performance that is better than Kalman-PDA alone and competitive to sphere decoding. Finally, we consider the case where channel state information (CSI) is not perfectly known to the receiver and present an iterative channel estimation algorithm. Simulations show that the performance of SDE with channel estimation approaches that of SDE with perfect CSI.
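As a minimal illustration of the Kalman-tracking half of the Kalman-PDA receiver idea, the sketch below tracks a simple first-order (AR(1)) interference state observed in white noise; the state model, parameters, and scalar dimensionality are assumptions made for brevity and do not reproduce the proposed equalizer.

```python
import numpy as np

# Minimal Kalman-filter sketch (not the full Kalman-PDA receiver): the residual
# interference-plus-noise seen by a local detection window is modelled as a
# scalar AR(1) state observed in white noise and tracked recursively.
rng = np.random.default_rng(4)
a, q, r, T = 0.95, 0.1, 0.5, 200          # AR coefficient, process/measurement noise, steps

x = np.zeros(T)                            # true interference state
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.normal()
y = x + np.sqrt(r) * rng.normal(size=T)    # noisy observations

x_hat, P = 0.0, 1.0
estimates = np.zeros(T)
for t in range(T):
    # predict
    x_pred = a * x_hat
    P_pred = a * P * a + q
    # update with the new observation
    K = P_pred / (P_pred + r)              # Kalman gain
    x_hat = x_pred + K * (y[t] - x_pred)
    P = (1 - K) * P_pred
    estimates[t] = x_hat

print(f"RMS tracking error ≈ {np.sqrt(np.mean((estimates - x) ** 2)):.3f}")
```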
Abstract:
Self-stabilization is a property of a distributed system such that, regardless of the legitimacy of its current state, the system behavior shall eventually reach a legitimate state and shall remain legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force or adopt manual approaches not amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics equally exploit the computational power of a single workstation and the available parallelism on computer clusters. We obtain existing and new stabilizing solutions for classical protocols like maximal matching, ring coloring, mutual exclusion, leader election and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., we study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions – verifiable in the local state space of every process – for global deadlock- and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply to rings of arbitrary sizes without worrying about state explosion.
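To give a concrete flavour of global state space exploration for self-stabilization, the sketch below exhaustively checks convergence of Dijkstra's classic K-state token ring on a tiny instance under one deterministic schedule; it illustrates the brute-force baseline only, not the heuristics or the local-reasoning conditions developed in this dissertation.

```python
from itertools import product

# Brute-force exploration on a tiny instance (illustrative only): Dijkstra's
# K-state token ring is run from every possible global state under one simple
# deterministic schedule and checked for convergence to a legitimate state,
# i.e. a state with exactly one privileged process.
N, K = 3, 4                    # ring size and counter range (K >= N)

def privileged(state):
    privs = [0] if state[0] == state[-1] else []
    privs += [i for i in range(1, N) if state[i] != state[i - 1]]
    return privs

def move(state, i):
    s = list(state)
    if i == 0:
        s[0] = (s[0] + 1) % K  # bottom machine increments its counter
    else:
        s[i] = s[i - 1]        # other machines copy their predecessor's value
    return tuple(s)

worst = 0
for start in product(range(K), repeat=N):
    state, steps = start, 0
    while len(privileged(state)) != 1:                 # not yet legitimate
        state = move(state, min(privileged(state)))    # schedule: lowest privileged process
        steps += 1
        assert steps < 1000, "no convergence under this schedule"
    worst = max(worst, steps)

print(f"all {K ** N} initial states converge; worst-case moves under this schedule: {worst}")
```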
Abstract:
Bluetooth wireless technology is a robust short-range communications system designed for low power (10 meter range) and low cost. It operates in the 2.4 GHz Industrial Scientific Medical (ISM) band and employs two techniques for minimizing interference: a frequency hopping scheme, which nominally splits the 2.400-2.485 GHz band into 79 frequency channels, and a time division duplex (TDD) scheme, which is used to switch to a new frequency channel on 625 μs boundaries. During normal operation a Bluetooth device is active on a different frequency channel every 625 μs, thus minimizing the chances of continuous interference impacting the performance of the system. The smallest unit of a Bluetooth network is called a piconet, and can have a maximum of eight nodes. Bluetooth devices must assume one of two roles within a piconet, master or slave, where the master governs quality of service and the frequency hopping schedule within the piconet and the slave follows the master's schedule. A piconet must have a single master and up to 7 active slaves. By allowing devices to hold roles in multiple piconets through time multiplexing, i.e. slave/slave or master/slave, the Bluetooth technology allows multiple piconets to be interconnected into larger networks called scatternets. The Bluetooth technology is explored in the context of enabling ad-hoc networks. The Bluetooth specification provides flexibility in the scatternet formation protocol, outlining only the mechanisms necessary for future protocol implementations. A new protocol for scatternet formation and maintenance, mscat, is presented and its performance is evaluated using a Bluetooth simulator. The free variables manipulated in this study include device activity and the probabilities of devices performing discovery procedures. The relationship between the role a device has in the scatternet and its probability of performing discovery was examined and related to the scatternet topology formed. The results show that mscat creates dense network topologies for networks of 30, 50 and 70 nodes. The mscat protocol results in approximately a 33% increase in slaves per piconet and an approximately 12.5% reduction in average roles per node. For 50-node scenarios, the set of parameters that yields the best outcome is an unconnected node inquiry probability (UP) of 10%, a master node inquiry probability (MP) of 80% and a slave inquiry probability (SP) of 40%. The mscat protocol extends the Bluetooth specification for formation and maintenance of scatternets in an ad-hoc network.
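The two interference-mitigation mechanisms named above, 79 channels and a new channel every 625 μs TDD slot, can be illustrated with the simplified Python sketch below; the pseudo-random channel pick stands in for the real Bluetooth hop-selection kernel (which is derived from the master's clock and address), so it is illustrative only.

```python
import random

# Simplified illustration of Bluetooth frequency hopping and TDD slotting:
# 79 channels in the 2.4 GHz ISM band and a new channel every 625 µs slot.
# NOT the real hop-selection kernel; a pseudo-random pick illustrates the idea.
random.seed(7)
SLOT_US = 625
channels = [2402 + k for k in range(79)]       # channel k -> (2402 + k) MHz, 1 MHz spacing

for slot in range(8):
    freq = random.choice(channels)             # one frequency per 625 µs TDD slot
    direction = "master->slave" if slot % 2 == 0 else "slave->master"
    print(f"t = {slot * SLOT_US:5d} µs  slot {slot}  {freq} MHz  {direction}")
```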