984 results for Minimum presence threshold


Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

90.00%

Publisher:

Abstract:

Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Until a few years ago its appearance had only been associated with two- or three-body scattering features (e.g. hail flares and mirror images) involving highly reflective surfaces. Recent atmospheric research aimed at a better understanding of the water cycle and of the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment of high-frequency radars in space. Examples are the TRMM 13.5 GHz, the CloudSat 94 GHz, the upcoming EarthCARE 94 GHz, and the GPM dual 13-35 GHz radars. These systems are able to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution and reduces the size and weight of the radar systems. On the other hand, higher-frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g. raindrops or hail particles), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single scattering approximation may be problematic. Errors will be large when the radiation emitted from the radar still contributes substantially to the received power after interacting more than once with the medium. This is the case when the transport mean free path becomes comparable with the instrument footprint (determined by the antenna beamwidth and the platform altitude). This situation resembles what has already been experienced in lidar observations, but with a predominance of wide-angle versus small-angle scattering events.
At millimeter wavelengths, hydrometeors diffuse radiation rather isotropically compared to the visible or near-infrared region, where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by space-borne millimeter radars. This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher-order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis on situations in which the multiple scattering contributions become comparable with or overwhelm the single scattering signal. We show evidence of multiple scattering effects in airborne and CloudSat observations, i.e. unique signatures which cannot be explained by single scattering theory. Ideas on how to identify and tackle multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined. This work represents a reference guide for studies focused on modeling the radiation transport and on interpreting data from high-frequency space-borne radar systems that probe highly opaque scattering media such as thick ice clouds or precipitating clouds.
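The footprint criterion mentioned above can be illustrated with a rough back-of-the-envelope check. The numbers below (orbit altitude, beamwidth, transport mean free path) are illustrative assumptions, not values from the paper:

```python
import math

def footprint_diameter(altitude_m: float, beamwidth_deg: float) -> float:
    """Ground footprint diameter of a narrow conical radar beam."""
    return 2 * altitude_m * math.tan(math.radians(beamwidth_deg) / 2)

# Assumed CloudSat-like geometry: ~700 km orbit, ~0.12 deg beamwidth.
fp = footprint_diameter(700e3, 0.12)
# Assumed transport mean free path inside a thick, opaque cloud (m).
transport_mfp = 1.0e3
print(f"footprint ~ {fp:.0f} m; wide-angle multiple scattering plausible: "
      f"{transport_mfp <= fp}")
```

When the transport mean free path is comparable to or smaller than the footprint, as in this sketch, the single scattering interpretation of the radar equation becomes questionable.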

Relevance:

90.00%

Publisher:

Abstract:

This study analyzed the influence of recovery phase manipulation after hyperlactemia induction on the lactate minimum intensity during treadmill running. Twelve male runners (24.6 ± 6.3 years; 172 ± 8.0 cm and 62.6 ± 6.1 kg) performed three lactate minimum tests involving passive (LMT(P)) and active recoveries at 30%vVO(2max) (LMT(A30)) and 50%vVO(2max) (LMT(A50)) in the 8-min period following initial sprints. During subsequent graded exercise, lactate minimum speed and VO(2) in LMT(A50) (12.8 ± 1.5 km h(-1) and 40.3 ± 5.1 ml kg(-1) min(-1)) were significantly lower (P < 0.05) than those in LMT(A30) (13.3 ± 1.6 km h(-1) and 42.9 ± 5.3 ml kg(-1) min(-1)) and LMT(P) (13.8 ± 1.6 km h(-1) and 43.6 ± 6.1 ml kg(-1) min(-1)). In addition, lactate minimum speed in LMT(A30) was significantly lower (P < 0.05) than that in LMT(P). These results suggest that lactate minimum intensity is lowered by active recovery after hyperlactemia induction in an intensity-dependent manner compared to passive recovery.
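A minimal sketch of how a lactate minimum speed can be read off the graded phase of such a test: blood lactate is sampled at each stage, and the lactate minimum is the stage with the lowest concentration. The sample values below are invented for illustration, not data from the study:

```python
def lactate_minimum_speed(speeds_kmh, lactate_mm):
    """Speed (km/h) at which the lowest blood-lactate concentration occurs."""
    # Pair each lactate sample with its stage speed and take the minimum
    # by lactate concentration.
    return min(zip(lactate_mm, speeds_kmh))[1]

speeds = [10.0, 11.0, 12.0, 13.0, 14.0, 15.0]   # graded-test stages, km/h
lactate = [6.5, 5.6, 4.9, 4.5, 4.8, 5.7]        # blood lactate, mM
print(lactate_minimum_speed(speeds, lactate))   # 13.0
```

In practice a polynomial fit to the lactate-speed curve is often used instead of the raw minimum sample, which is less sensitive to measurement noise.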

Relevance:

90.00%

Publisher:

Abstract:

Beer bottles are often used in physical disputes. If the bottles break, they may give rise to sharp trauma; if they remain intact, they may cause blunt injuries. In order to investigate whether full or empty standard half-litre beer bottles are sturdier, and whether the energy needed to break them surpasses the minimum fracture threshold of the human skull, we tested the fracture properties of such beer bottles in a drop tower. Full bottles broke at 30 J impact energy, empty bottles at 40 J. These breaking energies surpass the minimum fracture threshold of the human neurocranium. Beer bottles may therefore fracture the human skull and thus serve as dangerous instruments in a physical dispute.
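The reported breaking energies can be translated into equivalent drop heights via E = mgh. The bottle masses below are assumptions for a half-litre bottle (~1.0 kg full, ~0.55 kg empty), not values from the study:

```python
G = 9.81  # gravitational acceleration, m/s^2

def drop_height(energy_j: float, mass_kg: float) -> float:
    """Height h from which a mass must fall so that m*g*h equals energy_j."""
    return energy_j / (mass_kg * G)

# Assumed masses: ~1.0 kg for a full half-litre bottle, ~0.55 kg empty.
print(f"full bottle (30 J):  drop from ~{drop_height(30.0, 1.0):.1f} m")
print(f"empty bottle (40 J): drop from ~{drop_height(40.0, 0.55):.1f} m")
```

Under these assumptions the empty bottle, despite its lower mass, needs the greater drop height because its breaking energy is higher.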

Relevance:

80.00%

Publisher:

Abstract:

While spatial determinants of emmetropization have been examined extensively in animal models and spatial processing of human myopes has also been studied, there have been few studies investigating temporal aspects of emmetropization and temporal processing in human myopia. The influence of temporal light modulation on eye growth and refractive compensation has been observed in animal models and there is evidence of temporal visual processing deficits in individuals with high myopia or other pathologies. Given this, the aims of this work were to examine the relationships between myopia (i.e. degree of myopia and progression status) and temporal visual performance and to consider any temporal processing deficits in terms of the parallel retinocortical pathways. Three psychophysical studies investigating temporal processing performance were conducted in young adult myopes and non-myopes: (1) backward visual masking, (2) dot motion perception and (3) phantom contour. For each experiment there were approximately 30 young emmetropes, 30 low myopes (myopia less than 5 D) and 30 high myopes (5 to 12 D). In the backward visual masking experiment, myopes were also classified according to their progression status (30 stable myopes and 30 progressing myopes). The first study was based on the observation that the visibility of a target is reduced by a second target, termed the mask, presented quickly after the first target. Myopes were more affected by the mask when the task was biased towards the magnocellular pathway; myopes had a 25% mean reduction in performance compared with emmetropes. However, there was no difference in the effect of the mask when the task was biased towards the parvocellular system. For all test conditions, there was no significant correlation between backward visual masking task performance and either the degree of myopia or myopia progression status. 
The dot motion perception study measured detection thresholds for the minimum displacement of moving dots, the maximum displacement of moving dots and the degree of motion coherence required to correctly determine the direction of motion. The visual processing of these tasks is dominated by the magnocellular pathway. Compared with emmetropes, high myopes had a reduced ability to detect the minimum displacement of moving dots for stimuli presented at the fovea (20% higher mean threshold) and possibly at the inferior nasal retina. The minimum displacement threshold was significantly and positively correlated with myopia magnitude and axial length, and significantly and negatively correlated with retinal thickness for the inferior nasal retina. The performance of emmetropes and myopes was similar for all the other dot motion perception tasks. In the phantom contour study, the highest temporal frequency of the flickering phantom pattern at which the contour was visible was determined. Myopes had significantly lower flicker detection limits (21.8 ± 7.1 Hz) than emmetropes (25.6 ± 8.8 Hz) for tasks biased towards the magnocellular pathway, for both high (99%) and low (5%) contrast stimuli. There was no difference in flicker limits for a phantom contour task biased towards the parvocellular pathway. For all phantom contour tasks, there was no significant correlation between flicker detection thresholds and magnitude of myopia. Of the psychophysical temporal tasks studied here, those primarily involving processing by the magnocellular pathway revealed differences in performance between the refractive error groups. While there are a number of interpretations of these data, this suggests that there may be a temporal processing deficit in some myopes that is selective for the magnocellular system. The minimum displacement dot motion perception task appears to be the most sensitive of the tests studied for investigating changes in visual temporal processing in myopia.
Data from the visual masking and phantom contour tasks suggest that the alterations to temporal processing occur at an early stage of myopia development. In addition, the link between increased minimum displacement threshold and decreasing retinal thickness suggests that there is a retinal component to the observed modifications in temporal processing.

Relevance:

80.00%

Publisher:

Abstract:

Australian universities are currently engaging with new governmental policies and regulations that require them to demonstrate enhanced quality and accountability in teaching and research. The development of national academic standards for learning outcomes in higher education is one such instance of this drive for excellence. These discipline-specific standards articulate the minimum, or Threshold Learning Outcomes, to be addressed by higher education institutions so that graduating students can demonstrate their achievement to their institutions, accreditation agencies, and industry recruiters. This impacts not only the design of Engineering courses (with particular emphasis on pedagogy and assessment), but also the preparation of academics to engage with these standards and implement them in their day-to-day teaching practice at a micro level. This imperative for enhanced quality and accountability in teaching is also significant at a meso level, for according to the Australian Bureau of Statistics, about 25 per cent of teachers in Australian universities are aged 55 and above and more than 54 per cent are aged 45 and above (ABS, 2006). A number of institutions have undertaken recruitment drives to regenerate and enrich their academic workforce by appointing capacity-building research professors and increasing the numbers of early- and mid-career academics. This nationally driven agenda for quality and accountability in teaching also permeates the micro level of engineering education, since the demand for enhanced academic standards and learning outcomes requires both a strong advocacy for a shift to an authentic, collaborative, outcomes-focused education and the mechanisms to support academics in transforming their professional thinking and practice.
Outcomes-focused education means giving greater attention to the ways in which the curriculum design, pedagogy, assessment approaches and teaching activities can most effectively make a positive, verifiable difference to students’ learning. Such education is authentic when it is couched firmly in the realities of learning environments, student and academic staff characteristics, and trustworthy educational research. That education will be richer and more efficient when staff work collaboratively, contributing their knowledge, experience and skills to achieve learning outcomes based on agreed objectives. We know that the school or departmental levels of universities are the most effective loci of changes in approaches to teaching and learning practices in higher education (Knight & Trowler, 2000). Heads of Schools are increasingly being entrusted with more responsibilities - in addition to setting strategic directions and managing the operational and sometimes financial aspects of their school, they are also expected to lead the development and delivery of teaching, research and other academic activities. Guiding and mentoring individuals and groups of academics is one critical aspect of the Head of School’s role. Yet they do not always have the resources or support to help them mentor staff, especially the more junior academics. In summary, the international trend in undergraduate engineering course accreditation towards the demonstration of attainment of graduate attributes poses new challenges in addressing academic staff development needs and the assessment of learning. This paper will give some insights into the conceptual design, implementation and empirical effectiveness to date of a Fellow-In-Residence Engagement (FIRE) program. The program is proposed as a model for achieving better engagement of academics with contemporary issues and effectively enhancing their teaching and assessment practices.
It will also report on the program’s collaborative approach to working with Heads of Schools to better support academics, especially early-career ones, by utilizing formal and informal mentoring. Further, the paper will discuss possible factors that may assist the achievement of the intended outcomes of such a model, and will examine its contributions to engendering outcomes-focused thinking in engineering education.

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation in Environment, Health and Safety (Ambiente, Saúde e Segurança).

Relevance:

80.00%

Publisher:

Abstract:

In Wireless Sensor Networks (WSNs), neglecting the effects of varying channel quality can lead to unnecessary wastage of precious battery resources, which in turn can result in the rapid depletion of sensor energy and the partitioning of the network. Fairness is a critical issue when accessing a shared wireless channel, and fair scheduling must be employed to provide a proper flow of information in a WSN. In this paper, we develop a channel-adaptive MAC protocol with a traffic-aware dynamic power management algorithm for efficient packet scheduling and queuing in a sensor network, with the time-varying characteristics of the wireless channel also taken into consideration. The proposed protocol calculates a combined weight value based on the channel state and link quality. Transmission is then allowed only for those nodes with weights greater than a minimum quality threshold; nodes attempting to access the wireless medium with a low weight are allowed to transmit only when their weight becomes high. This results in many poor-quality nodes being deprived of transmission for a considerable amount of time. To avoid buffer overflow and to achieve fairness for the poor-quality nodes, we design a load prediction algorithm. We also design a traffic-aware dynamic power management scheme to minimize energy consumption by continuously turning off the radio interface of all the unnecessary nodes that are not included in the routing path. Simulation results show that our proposed protocol achieves higher throughput and fairness while reducing delay.
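The threshold rule at the core of the protocol can be sketched as follows. This is not the authors' implementation: the combined weight (here a simple product of channel state and link quality) and the threshold value 0.5 are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    channel_state: float  # estimated channel quality, 0..1
    link_quality: float   # estimated link quality, 0..1

def combined_weight(node: Node) -> float:
    # Assumed weighting: a simple product of the two quality estimates.
    return node.channel_state * node.link_quality

def schedule(nodes, threshold=0.5):
    """Return the names of nodes currently allowed to transmit."""
    return [n.name for n in nodes if combined_weight(n) >= threshold]

nodes = [Node("a", 0.9, 0.8), Node("b", 0.4, 0.6), Node("c", 0.95, 0.7)]
print(schedule(nodes))  # ['a', 'c']
```

Node "b" is starved here, which is exactly the fairness problem the paper's load prediction algorithm is designed to address.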

Relevance:

80.00%

Publisher:

Abstract:

A set of standards is proposed for university teaching. Embedding these within the Higher Education Academy UK Professional Standards Framework (UKPSF) would allow a more robust assessment of whether a university teacher has met a minimum acceptable threshold.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to compare the exercise intensity at the lactate minimum (LACmin) with the intensities corresponding to the lactate threshold (LL) and the anaerobic threshold (LAn). Eleven male athletes took part in the study (age 22.5 ± 3.17 years; height 172.3 ± 8.2 cm; weight 66.9 ± 8.2 kg; body fat 9.8 ± 3.4%). On an electromagnetically braked cycle ergometer (Quinton - Corival 400), the subjects performed two tests: 1) a continuous incremental exercise test - initial load of 100 W, with 25 W increments every three min until voluntary exhaustion; and 2) a lactate minimum test - the subjects first pedalled twice at 425 W (~120% of maximum) for 30 seconds, with a one-min interval, in order to induce lactate accumulation. After eight min of passive recovery, the subjects began a continuous incremental test identical to the one described above. The LL and the LAn were identified as the lowest value of the ratio blood lactate (mM) / exercise intensity (W), and as the intensity corresponding to a blood lactate of 3.5 mM, respectively. The LACmin was identified as the intensity corresponding to the lowest lactate concentration during the incremental test. No significant difference was observed between the power at the LL (197.7 ± 20.7 W) and at the LACmin (201.6 ± 13.0 W), both being significantly lower than at the LAn (256.7 ± 33.3 W). Likewise, no significant differences were found for the VO2 (ml.kg-1.min-1) and HR (bpm) obtained at the LL (43.2 ± 5.01; 152.0 ± 13.0) and at the LACmin (42.1 ± 3.9; 159.0 ± 10.0), although both were significantly lower than those obtained at the LAn (52.2 ± 8.2; 174.0 ± 13.0, respectively). It can be concluded that, under the experimental conditions of this study, the LACmin test may underestimate the MSSLAC intensity (estimated indirectly from the LAn), which agrees with other studies that determined the MSSLAC directly. 
Thus, further studies are needed to examine the possible time-dependent component (initial intensity) that may exist in the LACmin protocol.

Relevance:

80.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

80.00%

Publisher:

Abstract:

Observations of cosmic-ray arrival directions made with the Pierre Auger Observatory have previously provided evidence of anisotropy at the 99% CL using the correlation of ultra-high-energy cosmic rays (UHECRs) with objects drawn from the Veron-Cetty Veron catalog. In this paper we report on the use of three catalog-independent methods to search for anisotropy. The 2pt-L, 2pt+ and 3pt methods, each giving a different measure of self-clustering in arrival directions, were tested on mock cosmic ray data sets to study the impacts of sample size and magnetic smearing on their results, accounting for both angular and energy resolutions. If the sources of UHECRs follow the same large-scale structure as ordinary galaxies in the local Universe and if UHECRs are deflected no more than a few degrees, a study of mock maps suggests that these three methods can efficiently respond to the resulting anisotropy with a P-value of 1.0% or smaller with data sets of as few as 100 events. Using data taken from January 1, 2004 to July 31, 2010, we examined the 20, 30, ... , 110 highest-energy events with a corresponding minimum energy threshold of about 49.3 EeV. The minimum P-values found were 13.5% using the 2pt-L method, 1.0% using the 2pt+ method and 1.1% using the 3pt method for the 100 highest-energy events. In view of the multiple (correlated) scans performed on the data set, these catalog-independent methods do not yield strong evidence of anisotropy in the highest-energy cosmic rays.
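The self-clustering idea behind these statistics can be sketched as a basic two-point pair count on arrival directions. The paper's 2pt-L, 2pt+ and 3pt statistics are more elaborate; this sketch, with invented (RA, Dec) directions in radians, only shows the underlying pair-counting step:

```python
import math

def angular_separation(d1, d2):
    """Great-circle angle (radians) between two (ra, dec) directions in radians."""
    ra1, dec1 = d1
    ra2, dec2 = d2
    cosang = (math.sin(dec1) * math.sin(dec2)
              + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # Clamp against floating-point overshoot before acos.
    return math.acos(max(-1.0, min(1.0, cosang)))

def pair_count(directions, max_angle_rad):
    """Number of event pairs separated by less than max_angle_rad."""
    n = 0
    for i in range(len(directions)):
        for j in range(i + 1, len(directions)):
            if angular_separation(directions[i], directions[j]) < max_angle_rad:
                n += 1
    return n

events = [(0.0, 0.0), (0.01, 0.0), (1.0, 0.0)]  # invented directions, radians
print(pair_count(events, 0.1))  # 1
```

An anisotropy test would compare such counts against their distribution for isotropic mock data sets to derive a P-value.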

Relevance:

80.00%

Publisher:

Abstract:

Numerous studies in the scientific literature of recent decades have shown that the global mean temperature has risen since the beginning of the 20th century. The phenomenon has become more evident since the 1980s: each of the last three decades has been warmer than the preceding ones. Europe and the Mediterranean area are among the regions where the warming is most marked, especially for maximum temperatures (which have risen by +0.39 °C per decade since 1951), which have shown larger trends than the minima. The same behaviour has been observed at the national scale (+0.25 °C/dec for the maxima and +0.20 °C/dec for the minima). Alongside the increase in the mean values, an increase (decrease) in extreme warm (cold) events has been observed, studied through indices based on the percentiles of the distributions. The debate remains open on the causes of the changes in extreme events: whether the changes are due solely to a change in the mean values, i.e. a rigid shift of the distribution, or whether part of the signal is due to a change in its shape, with a consequent change in variability. This thesis fits into this context, with the aim of studying the behaviour of daily temperatures in Trentino-Alto Adige since 1926, looking for changes in the mean and in the extremes in two altitude bands. The mean values of the maximum and minimum temperatures showed clear warming over the whole period, especially for the maxima at low altitude (+0.13 ± 0.03 °C/dec), with higher values in spring (+0.22 ± 0.05 °C/dec) and summer (+0.17 ± 0.05 °C/dec). These trends are larger after 1980 and not significant before. 
The number of days with temperatures above and below the thresholds of the most extreme percentiles (estimated over the whole period) shows a clear increase in warm extremes, with the largest values for the maxima at high altitude (up to +26.8% for the 99th percentile), and a decrease in cold extremes (down to -8.5% for the first percentile of the minima at low altitude). Moreover, by estimating the thresholds of a set of percentiles year by year and comparing their trends with that of the median, a non-uniform trend towards higher temperatures was observed for the maxima only, with the lowest (highest) percentiles characterised by smaller (larger) trends than that of the median, suggesting a widening of the PDF.
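The year-by-year percentile-trend analysis can be sketched as follows. The data here are synthetic (a warming mean with a widening spread), not the thesis data, and the simple least-squares slope is an illustrative choice:

```python
import random
import statistics

def percentile(values, p):
    """Percentile by linear interpolation between order statistics (p in 0..100)."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def linear_trend(series):
    """Least-squares slope of a series against its index (units per step)."""
    n = len(series)
    xm = (n - 1) / 2
    ym = statistics.fmean(series)
    num = sum((i - xm) * (y - ym) for i, y in enumerate(series))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

random.seed(0)
years = list(range(1926, 2016))
# Synthetic daily maxima: the mean warms and the spread widens over time,
# mimicking a distribution that both shifts and broadens.
data = {y: [random.gauss(10 + 0.02 * (y - 1926), 3 + 0.01 * (y - 1926))
            for _ in range(365)]
        for y in years}

for p in (1, 50, 99):
    series = [percentile(data[y], p) for y in years]
    print(f"p{p:02d} trend: {linear_trend(series) * 10:+.3f} degC/decade")
```

With a widening distribution, the upper percentiles trend faster than the median and the lower percentiles slower, which is the non-uniform pattern the thesis reports for the maxima.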

Relevance:

80.00%

Publisher:

Abstract:

Introduction: Schizophrenia patients frequently suffer from complex motor abnormalities including fine and gross motor disturbances, abnormal involuntary movements, neurological soft signs and parkinsonism. These symptoms occur early in the course of the disease, continue in chronic patients and may deteriorate with antipsychotic medication. Furthermore, gesture performance is impaired in patients, including the pantomime of tool use. Whether schizophrenia patients show difficulties in actual tool use has not yet been investigated. Human tool use is complex and relies on a network of distinct and distant brain areas. We therefore aimed to test whether schizophrenia patients have difficulties in tool use and to assess associations with structural brain imaging using voxel-based morphometry (VBM) and tract-based spatial statistics (TBSS). Methods: In total, 44 patients with schizophrenia (DSM-5 criteria; 59% men, mean age 38) underwent structural MR imaging and performed the Tool-Use test. The test examines the use of a scoop and a hammer in three conditions: pantomime (without the tool), demonstration (with the tool) and actual use (with a recipient object). T1-weighted images were processed using SPM8 and DTI data using FSL TBSS routines. To assess structural alterations of impaired tool use, we first compared gray matter (GM) volume in VBM and white matter (WM) integrity in TBSS data of patients with and without difficulties in actual tool use. Next we explored correlations between tool use scores and VBM and TBSS data. Group comparisons were family-wise-error corrected for multiple tests. Correlations were uncorrected (p < 0.001) with a minimum cluster threshold of 17 voxels (equivalent to a map-wise false positive rate of alpha < 0.0001 using a Monte Carlo procedure). Results: Tool use was impaired in schizophrenia (43.2% pantomime, 11.6% demonstration, 11.6% use). Impairment was related to reduced GM volume and WM integrity. 
Whole-brain analyses detected an effect in the SMA in the group analysis. Correlations between tool use scores and brain structure revealed alterations in brain areas of the dorso-dorsal pathway (superior occipital gyrus, superior parietal lobule, and dorsal premotor area) and the ventro-dorsal pathway (middle occipital gyrus, inferior parietal lobule) of the action network, as well as the insula and the left hippocampus. Furthermore, significant correlations within connecting fiber tracts - particularly alterations within the bilateral superior and anterior corona radiata as well as the corpus callosum - were associated with tool use performance. Conclusions: Tool use performance was impaired in schizophrenia, and this impairment was associated with reduced GM volume in the action network. Our results are in line with reports of impaired tool use in patients with brain lesions, particularly of the dorso-dorsal and ventro-dorsal streams of the action network. In addition, an effect of tool use on WM integrity was shown within fiber tracts connecting regions important for planning and executing tool use. Furthermore, the hippocampus is part of a brain system responsible for spatial memory and navigation. The results suggest that structural brain alterations in the common praxis network contribute to impaired tool use in schizophrenia.