831 results for time-over-threshold


Relevance: 100.00%

Abstract:

Related to a research line of the GDS group at ISOM; see http://www.isom.upm.es/dsemiconductores.php

Relevance: 100.00%

Abstract:

The original contributions of this thesis are novel digital readout architectures for hybrid pixel readout chips. The thesis presents an asynchronous bus-based architecture, a data-node-based column architecture, and a network-based pixel-matrix architecture for data transport. It is shown that the data-node architecture achieves a readout efficiency of 99% at half the output rate of a bus-based system. The network-based solution avoids “broken” columns caused by certain manufacturing errors, and it distributes internal data traffic more evenly across the pixel matrix than column-based architectures, improving efficiency by more than 10% for both uniform and non-uniform hit occupancies. Architectural design was done using transaction-level modeling (TLM) and sequential high-level design techniques to reduce design and simulation time; these techniques made it possible to simulate tens of column and full-chip architectures, with a more than tenfold decrease in run-time compared to a register transfer level (RTL) design technique and a 50% reduction in lines of code (LoC) relative to the RTL description. Two architectures are then demonstrated in two hybrid pixel readout chips. The first chip, Timepix3, was designed for the Medipix3 collaboration. According to measurements, it consumes < 1 W/cm^2 while delivering up to 40 Mhits/s/cm^2 with 10-bit time-over-threshold (ToT) and 18-bit time-of-arrival (ToA) at a 1.5625 ns bin size. The chip uses a token-arbitrated, asynchronous two-phase-handshake column bus for internal data transfer, and it has been successfully used in a multi-chip particle-tracking telescope. The second chip, VeloPix, is a readout chip being designed for the upgrade of the Vertex Locator (VELO) of the LHCb experiment at CERN. Based on simulations, it consumes < 1.5 W/cm^2 while delivering up to 320 Mpackets/s/cm^2, each packet containing up to 8 pixels. VeloPix uses a node-based data fabric to achieve a throughput of 13.3 Mpackets/s from a column to the End of Column (EoC) logic. By combining Monte Carlo physics data with high-level simulations, it has been demonstrated that the architecture meets the requirements of the VELO (260 Mpackets/s/cm^2 at 99% efficiency).
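As a rough illustration of the hit-data encoding described above, the sketch below packs a 10-bit ToT and an 18-bit ToA (1.5625 ns per bin) into one word. The field layout and packing order are assumptions for illustration, not the chip's actual packet format.

```python
# Hypothetical Timepix3-style hit word: 10-bit ToT plus 18-bit ToA
# counted in 1.5625 ns bins (field layout assumed, not the real format).

TOA_LSB_NS = 1.5625          # ToA bin width from the abstract
TOT_BITS, TOA_BITS = 10, 18  # field widths from the abstract

def pack_hit(tot: int, toa: int) -> int:
    """Pack ToT and ToA into one integer word (illustrative layout)."""
    assert 0 <= tot < (1 << TOT_BITS)
    assert 0 <= toa < (1 << TOA_BITS)
    return (tot << TOA_BITS) | toa

def unpack_hit(word: int) -> tuple[int, int]:
    toa = word & ((1 << TOA_BITS) - 1)
    tot = word >> TOA_BITS
    return tot, toa

# An 18-bit ToA at 1.5625 ns per bin rolls over every ~409.6 us:
toa_range_us = (1 << TOA_BITS) * TOA_LSB_NS / 1000.0
```

The roll-over arithmetic (2^18 x 1.5625 ns = 409.6 us) shows why an 18-bit ToA field suffices between coarse timestamps.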

Relevance: 100.00%

Abstract:

The Medipix pixel detectors were developed by the Medipix collaboration and allow real-time imaging. Their active surface of nearly 2 cm² is divided into 65536 pixels of 55 × 55 μm² each. Sixteen of these detectors, Medipix2 devices, are installed in the ATLAS experiment at CERN to measure in real time the radiation fields produced by hadron collisions at the LHC. They will soon be replaced by Timepix detectors, the most recent version of these devices, which can directly measure the energy deposited in each pixel in time-over-threshold (TOT) mode when a particle traverses the semiconductor. To improve the analysis of data collected with these Timepix detectors in ATLAS, a Geant4 simulation project was started by John Idárraga at the Université de Montréal. Within the ATLAS experiment, this simulation can be used together with Athena, the ATLAS analysis framework, and the full simulation of the ATLAS detector. Under their mutual repulsion, the charge carriers created in the semiconductor diffuse towards adjacent pixels, depositing energy in several pixels through charge sharing. An effective model of this lateral diffusion was developed to reproduce the phenomenon without solving a charge-transport differential equation. This model, together with the Timepix TOT mode, which measures the energy deposited in the detector, was included in the simulation in order to adequately reproduce the tracks left by particles in the semiconductor. The detector was first calibrated pixel by pixel using americium and barium sources. The simulation was then validated with measurements of interactions of protons and alpha particles produced at the Tandem van de Graaff generator of the Laboratoire René-J.-A.-Lévesque at the Université de Montréal.
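The effective charge-sharing idea can be illustrated with a minimal sketch: charge deposited near a pixel boundary is split among neighbouring 55 × 55 μm² pixels using a plain 2-D Gaussian spread, with no transport equation solved. The Gaussian width and the separable form are assumptions for illustration, not the thesis' calibrated model.

```python
# Toy lateral-diffusion / charge-sharing model (assumed Gaussian width;
# not the thesis' calibrated model): spread the deposited charge over a
# 3x3 neighbourhood of 55 um pixels instead of solving a transport PDE.
import math

PITCH_UM = 55.0  # Timepix pixel pitch

def shared_fraction(dx_um: float, sigma_um: float) -> float:
    """Fraction of a 1-D Gaussian charge cloud falling in [dx, dx+pitch)."""
    def cdf(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return cdf((dx_um + PITCH_UM) / sigma_um) - cdf(dx_um / sigma_um)

def share_charge(q, x_um, y_um, sigma_um=10.0):
    """Distribute charge q deposited at (x, y), measured from the central
    pixel's lower-left corner, over the 3x3 pixel neighbourhood."""
    out = {}
    for ix in (-1, 0, 1):
        for iy in (-1, 0, 1):
            fx = shared_fraction(ix * PITCH_UM - x_um, sigma_um)
            fy = shared_fraction(iy * PITCH_UM - y_um, sigma_um)
            out[(ix, iy)] = q * fx * fy
    return out

# A hit at the corner shared by four pixels splits the charge four ways:
shares = share_charge(1000.0, x_um=55.0, y_um=55.0)
```

With the assumed 10 μm diffusion width, a hit at a pixel centre keeps almost all charge in one pixel, while a corner hit splits it nearly equally among four pixels, which is the multi-pixel TOT signature the simulation must reproduce.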

Relevance: 100.00%

Abstract:

Future experiments in nuclear and particle physics are moving towards the high-luminosity regime in order to access rare processes. In this framework, particle detectors require high rate capability together with excellent timing resolution for precise event reconstruction. To achieve this, the development of dedicated front-end electronics (FEE) for detectors has become increasingly challenging and expensive. A current trend in R&D is therefore towards flexible FEE that can be easily adapted to a great variety of detectors without impairing the required high performance. This thesis reports on a novel FEE for two different detector types: imaging Cherenkov counters and plastic scintillator arrays. The former requires high sensitivity and precision for the detection of single-photon signals, while the latter is characterized by the slower and larger signals typical of scintillation processes. The FEE design was developed using high-bandwidth preamplifiers and fast discriminators which provide Time-over-Threshold (ToT) information. The use of discriminators allows for low power consumption, minimal dead time and self-triggering capability, all fundamental for high-rate applications. The output signals of the FEE are read out by a high-precision FPGA-based TDC system. A full characterization of the analogue signals under realistic conditions proved that the ToT information can be used in a novel way for charge measurements or walk corrections, thus improving the obtainable timing resolution, and detailed laboratory investigations proved the feasibility of the ToT method. The full readout chain was investigated in test experiments at the Mainz Microtron: counting rates of several MHz per channel were achieved, and a timing resolution of better than 100 ps after ToT-based walk correction was obtained. Applications to fast Time-of-Flight counters and future FEE developments have also been investigated recently.
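A ToT-based walk correction of the kind mentioned above can be sketched minimally: small pulses cross the discriminator threshold later than large ones, so the leading-edge timestamp is corrected using ToT as an amplitude proxy. The power-law functional form and its constants below are assumptions for illustration; the thesis derives the actual correction from calibration data.

```python
# Sketch of a ToT-based time-walk correction (assumed power-law model,
# not the thesis' calibrated correction).

def walk_correction_ps(tot_ns: float, a: float = 500.0, b: float = 0.5) -> float:
    """Assumed walk model: threshold-crossing delay (ps) ~ a / ToT^b."""
    return a / (tot_ns ** b)

def corrected_time_ps(leading_edge_ps: float, tot_ns: float) -> float:
    """Subtract the amplitude-dependent delay from the measured edge time."""
    return leading_edge_ps - walk_correction_ps(tot_ns)

# A small pulse (ToT = 4 ns) receives a larger correction than a large
# one (ToT = 100 ns), flattening the time-vs-amplitude dependence:
small = walk_correction_ps(4.0)    # 250.0 ps
large = walk_correction_ps(100.0)  # 50.0 ps
```

Applying such a correction event by event is what reduces the timing spread from the raw leading-edge value towards the sub-100 ps figure quoted above.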

Relevance: 100.00%

Abstract:

This thesis presents experimental work aimed at deepening the understanding of monolithic detector blocks as an alternative to segmented detectors for Positron Emission Tomography (PET). It covers the development, characterization, commissioning, and evaluation of PET demonstrator prototypes based on monolithic blocks of cerium-doped lutetium yttrium orthosilicate (LYSO:Ce) read out by magnetically compatible sensors, both Avalanche Photodiodes (APDs) and Silicon Photomultipliers (SiPMs). The prototypes implemented with APDs were built to validate the viability of a previously simulated high-sensitivity PET prototype named BrainPET. The work describes and characterizes the integrated front-end electronics used in these prototypes, as well as the readout electronics developed specifically for them, and presents the experimental set-ups used to obtain tomographic PET images and to train the neural-network algorithms that estimate the incidence positions of the γ photons on the surface of the monolithic blocks. With the BrainPET prototype, satisfactory results were obtained for energy resolution (13% FWHM), spatial precision of the monolithic blocks (~2 mm FWHM), and spatial resolution of the PET image (1.5 - 1.7 mm FWHM) at the center of the Field of View (FoV). A resolving power of ~2 mm in the PET image was also demonstrated by simultaneously acquiring images of radioactive sources placed at known separations. However, two important limitations were identified. First, working with a commercial Application Specific Integrated Circuit (ASIC) rather than an in-house design proved inflexible, and modifying the design of such an ASIC is costly. Second, the final characterization of the BrainPET front-end showed a timing resolution with ample room for improvement (~13 ns FWHM). Taking these limitations into account, together with the technological evolution towards SiPM arrays, the knowledge acquired with monolithic blocks was transferred to the newly available sensor technology, and a new front-end strategy was adopted with the FlexToT ASIC, an in-house design based on a Time over Threshold (ToT) measurement scheme in which the duration of the output pulse is proportional to the deposited energy. One of the most attractive features of this scheme is that it directly delivers digital pulses instead of requiring the processing of analog signal amplitudes: Analog-to-Digital Converters (ADCs) are replaced by Time-to-Digital Converters (TDCs), which can be implemented simply in Field Programmable Gate Arrays (FPGAs), reducing power consumption and design complexity. A new SiPM-based FlexToT demonstrator prototype was built to validate the ASIC with monolithic and segmented blocks. The front-end electronics needed to read out the FlexToT ASIC were designed and characterized, evaluating their linearity and dynamic range, their behavior in the presence of noise, and the differential nonlinearity of the TDCs implemented in the FPGA. The electronics can operate at high count rates and discriminate between different scintillators for phoswich applications. The FlexToT ASIC provides excellent coincidence timing resolution for events in the 511 keV photopeak (128 ps FWHM), overcoming the timing limitations of the BrainPET prototype, while the energy resolution obtained with monolithic blocks read out by FlexToT ASICs is 15.4% FWHM at 511 keV. Finally, good results were obtained for PET image quality and for the resolving power of the FlexToT demonstrator, with spatial resolutions around 1.4 mm FWHM at the center of the FoV.
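The ToT read-out principle behind the FlexToT scheme can be sketched as follows: an FPGA TDC timestamps the two edges of the digital output pulse, and the pulse width maps to deposited energy. The TDC bin width and the linear calibration constants below are invented for illustration, not FlexToT's measured values.

```python
# Minimal sketch of a ToT read-out chain (assumed TDC bin width and
# made-up linear calibration; not the FlexToT's actual parameters).

TDC_BIN_PS = 100.0  # assumed TDC bin width

def tot_from_edges(rise_ps: float, fall_ps: float) -> int:
    """ToT in TDC counts from the timestamps of the two pulse edges."""
    return round((fall_ps - rise_ps) / TDC_BIN_PS)

def energy_kev(tot_counts: int, gain: float = 1.2, offset: float = 10.0) -> float:
    """Assumed linear ToT-to-energy calibration."""
    return gain * tot_counts - offset

# A 43.4 ns wide pulse digitizes to 434 counts; with the assumed
# calibration this lands near the 511 keV photopeak:
counts = tot_from_edges(0.0, 43_400.0)
e = energy_kev(counts)
```

The design point is visible in the sketch: only two edge timestamps per channel are digitized, which is why TDCs in an FPGA can replace per-channel ADCs and cut power and complexity.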

Relevance: 90.00%

Abstract:

OBJECTIVE: To assess whether reported morbidity and complaints of lack of time and sleep are associated with the burden of professional work and housework among nurses. METHODS: A cross-sectional exploratory study was carried out among female nurses and nurse assistants (N=206) of a public hospital in Rio de Janeiro, Brazil. Data were collected by means of a questionnaire. Prevalence ratios (PR) and 95% confidence intervals were estimated. RESULTS: Mean professional work and housework time was 40.4 and 31.6 hours/week, respectively. Long professional working time (over 44 hours/week) was associated with mild emotional disorders (PR=1.37; 95% CI: 1.05-1.80) and with complaints of lack of time for resting/leisure (PR=1.61; 95% CI: 1.31-1.97), housework (PR=1.48; 95% CI: 1.12-1.97), and childcare (PR=1.99; 95% CI: 1.51-2.63). Long housework time (over 28 hours/week) was associated with a lower prevalence of lack of time for childcare (PR=0.62; 95% CI: 0.46-0.84). High housework load was associated with lack of personal time and with complaints of varicose veins (PR=1.31; 95% CI: 1.14-1.50 and PR=1.31; 95% CI: 1.08-1.58, respectively). Complaints of varicose veins were also frequent among female nurses with a total workload above 84 hours/week (PR=1.30; 95% CI: 1.05-1.61), although this group showed a lower prevalence of arterial hypertension and recurrent headaches (PR=0.35; 95% CI: 0.15-0.83 and PR=0.53; 95% CI: 0.32-0.89, respectively). CONCLUSIONS: The results suggest that both the professional and the home environment are relevant when evaluating the effect of work overload on nurses' health and on their family and social life. The need for instruments to analyze total workload in female populations is stressed.

Relevance: 90.00%

Abstract:

Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring to detect disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies that were included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements (an incremental cost per quality-adjusted life-year (QALY) of £9,500) and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to the human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders.
It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring given that short stature is not a disease in itself, but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal children to determine sensitivity and specificity of growth monitoring.

Relevance: 90.00%

Abstract:

A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the general circulation model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed, which corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions; distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon for applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
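The two analyses named above can be sketched with standard tools on synthetic data (a Gumbel-distributed series standing in for the WAI/BAI output): block maxima fitted with a GEV, and exceedances over a high threshold fitted with a GPD. Note that scipy's GEV shape parameter `c` has the opposite sign to the usual ξ, so `c > 0` is the bounded Weibull case mentioned in the abstract.

```python
# Block-maximum (GEV) and peak-over-threshold (GPD) fits on synthetic
# data standing in for the WAI/BAI series (block length and threshold
# quantile are illustrative choices).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.gumbel(loc=10.0, scale=2.0, size=365 * 50)  # synthetic daily index

# Block maximum: one maximum per 365-value block, fitted with a GEV.
block_max = x.reshape(50, 365).max(axis=1)
shape_gev, loc_gev, scale_gev = stats.genextreme.fit(block_max)

# Peak over threshold: exceedances over the 0.98 quantile, GPD fit
# with the location fixed at zero (excesses are measured from u).
u = np.quantile(x, 0.98)
exceed = x[x > u] - u
shape_gpd, loc_gpd, scale_gpd = stats.genpareto.fit(exceed, floc=0.0)
```

For Gumbel input both fitted shape parameters should come out near zero; Weibull-type data such as the WAI/BAI extremes would instead yield a positive `c` in scipy's convention.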

Relevance: 90.00%

Abstract:

Possible changes in the frequency and intensity of windstorms under future climate conditions during the 21st century are investigated based on an ECHAM5 GCM multi-scenario ensemble. The intensity of a storm is quantified by the associated estimated loss derived using an empirical model. The geographical focus is ‘Core Europe’, which comprises countries of Western Europe. Possible changes in losses are analysed by comparing ECHAM5 GCM data for recent (20C, 1960 to 2000) and future climate conditions (B1, A1B, A2; 2060 to 2100), each with 3 ensemble members. Changes are quantified using both rank statistics and return periods (RPs) estimated by fitting an extreme value distribution to potential storm losses with the peak-over-threshold method. The estimated losses for ECHAM5 20C and reanalysis events show similar statistical features in terms of return periods. Under future climate conditions, all climate scenarios show an increase in both the frequency and the magnitude of potential losses caused by windstorms for Core Europe. Future losses double the highest ECHAM5 20C loss are identified for some countries. While positive changes in ranking are significant for many countries and multiple scenarios, significantly shorter RPs are mostly found under the A2 scenario for return levels corresponding to losses with return periods of 20 yr or less. The emergence time of the statistically significant changes in loss varies from 2027 to 2100. These results imply an increased risk of occurrence of windstorm-associated losses, which can be largely attributed to changes in the meteorological severity of the events. Additionally, factors such as changes in the cyclone paths and in the location of the wind signatures relative to highly populated areas are also important in explaining the changes in estimated losses.
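Once a GPD has been fitted to losses over a threshold u in such a POT analysis, the mapping between return period and return level can be written down directly. The parameter values below are made up for illustration; λ denotes the mean number of threshold exceedances per year.

```python
# Return level <-> return period for a POT/GPD loss model
# (illustrative parameter values, not fitted to any data).
import math

def gpd_return_level(u: float, sigma: float, xi: float,
                     lam: float, T_years: float) -> float:
    """Loss level exceeded on average once every T years:
    x_T = u + (sigma/xi) * ((lam*T)^xi - 1), exponential limit at xi=0."""
    if abs(xi) < 1e-9:
        return u + sigma * math.log(lam * T_years)
    return u + (sigma / xi) * ((lam * T_years) ** xi - 1.0)

def return_period(u: float, sigma: float, xi: float,
                  lam: float, level: float) -> float:
    """Inverse mapping: return period (years) of a given loss level."""
    if abs(xi) < 1e-9:
        return math.exp((level - u) / sigma) / lam
    return ((level - u) * xi / sigma + 1.0) ** (1.0 / xi) / lam

# 20-yr return level for assumed parameters (u=1, sigma=0.5, xi=0.2,
# three exceedances per year on average):
rl20 = gpd_return_level(1.0, 0.5, 0.2, 3.0, 20.0)
```

A shorter return period for a fixed loss level, as reported under the A2 scenario, corresponds to this curve shifting so that the same level is reached at smaller T.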

Relevance: 90.00%

Abstract:

This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted for several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both annual cycle and linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant, compared to the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with a p-value virtually equal to zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have increased in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
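The stationary part of this POT pipeline (quantile-based threshold, exceedance extraction, GPD fit by ML, and AIC with second-order bias correction, i.e. AICc) can be sketched as follows. Synthetic gamma-distributed values stand in for the rainfall series; the non-stationary models GPD-2 to GPD-4, with covariates in the scale parameter, would need a custom likelihood and are not shown.

```python
# POT + GPD fit with AICc model scoring (the GPD-1 case; synthetic data
# stand in for the Sao Paulo rainfall series).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.6, scale=12.0, size=26_000)  # synthetic daily rain (mm)

p = 0.97
u = np.quantile(rain, p)        # threshold taken as the p quantile
exc = rain[rain > u] - u        # POT sample: excesses over u

# Stationary GPD fit by maximum likelihood, location fixed at zero.
xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

# AIC with second-order bias correction (AICc), k = 2 fitted parameters.
k, n = 2, len(exc)
loglik = np.sum(stats.genpareto.logpdf(exc, xi, loc=0.0, scale=sigma))
aic = 2 * k - 2 * loglik
aicc = aic + 2 * k * (k + 1) / (n - k - 1)
```

Repeating this for each candidate model and comparing the AICc values is what singles out GPD-3 in the paper; with more parameters the bias-correction term penalizes small POT samples more heavily.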

Relevance: 90.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


Relevance: 90.00%

Abstract:

As the performance gap between microprocessors and memory continues to increase, main memory accesses incur long latencies which become a factor limiting system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. Such locality and parallelism have not been thoroughly exploited by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing observed main memory access latency. The proposed bit-reversal address mapping attempts to distribute main memory accesses evenly in the SDRAM address space to enable bank parallelism. As memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into account, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance. The proposed burst scheduling is a novel access reordering mechanism which creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With SPEC CPU2000 benchmarks, bit-reversal reduces execution time by 14% on average over traditional page-interleaving address mapping. Burst scheduling achieves a 15% reduction in execution time over conventional in-order bank scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
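The bit-reversal mapping can be illustrated with a toy address decoder: reversing the low-order block-address bits used as the bank index spreads addresses that differ only in those bits across distant banks. The field widths below (3 bank bits above a 64-byte line offset) are assumptions for illustration, not the thesis' exact mapping.

```python
# Toy bit-reversal address mapping (assumed field widths: 6 offset bits
# for a 64-byte cache line, 3 bank-index bits above them).

def reverse_bits(value: int, width: int) -> int:
    """Reverse the low `width` bits of `value`."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (value & 1)
        value >>= 1
    return out

def bank_of(addr: int, bank_bits: int = 3, offset_bits: int = 6) -> int:
    """Bank index = bit-reversed low bits above the cache-line offset."""
    field = (addr >> offset_bits) & ((1 << bank_bits) - 1)
    return reverse_bits(field, bank_bits)

# Sequential 64-byte lines map to the permuted bank order
# [0, 4, 2, 6, 1, 5, 3, 7] instead of [0, 1, 2, ..., 7]:
banks = [bank_of(line << 6) for line in range(8)]
```

Because bit reversal is a permutation, every address still maps to exactly one bank; what changes is which address bits (and hence which conflict patterns, such as cache-conflict strides) land in the bank field.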

Relevance: 90.00%

Abstract:

Repetitive transcranial magnetic stimulation (rTMS) is a novel research tool in neurology and psychiatry. It is currently being evaluated as a conceivable alternative to electroconvulsive therapy for the treatment of mood disorders. Eight healthy young (age range 21-25 years) right-handed men without sleep complaints participated in the study. Two sessions at a 1-week interval, each consisting of an adaptation night (sham stimulation) and an experimental night (rTMS in the left dorsolateral prefrontal cortex or sham stimulation; crossover design), were scheduled. In each subject, 40 trains of 2-s duration of rTMS (inter-train interval 28 s) were applied at a frequency of 20 Hz (i.e. 1600 pulses per session) and at an intensity of 90% of the motor threshold. Stimulations were scheduled 80 min before lights off. The waking EEG was recorded for 10-min intervals approximately 30 min prior to and after the 20-min stimulations, and polysomnographic recordings were obtained during the subsequent sleep episode (23.00-07.00 h). The power spectra of two referential derivations, as well as of bipolar derivations along the antero-posterior axis over the left and right hemispheres, were analyzed. rTMS induced a small reduction of sleep stage 1 (in min and percentage of total sleep time) over the whole night and a small enhancement of sleep stage 4 during the first non-REM sleep episode. Other sleep variables were not affected. rTMS of the left dorsolateral cortex did not alter the topography of EEG power spectra in waking following stimulation, in the all-night sleep EEG, or during the first non-REM sleep episode. Our results indicate that a single session of rTMS using parameters like those used in depression treatment protocols has no detectable side effects with respect to sleep in young healthy males.

Relevance: 90.00%

Abstract:

This chapter contributes to the anthology on learning to research - researching to learn because it emphasises a need to design curricula that enable living research and on-going researcher development, rather than curricula that restrict student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. In what is referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE as securing future material affluence, rather than study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model: through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week by week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career.
The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for swift progression towards a highly paid job, rather than crafted for the reflexive inquiry that transforms their understanding throughout life. Many universities place a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds the confidence that helps them make connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum if students do not easily make links across time, over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.