944 results for LOW-ENERGY LASER
Abstract:
Astrophysics is driven by observations, and in the present era there is a wealth of state-of-the-art ground-based and satellite facilities. The astrophysical spectra emerging from these are of exceptional quality and quantity and cover a broad wavelength range. To meaningfully interpret these spectra, astronomers employ highly complex modelling codes to simulate the astrophysical observations. Important inputs to these codes include atomic data such as excitation rates, photoionization cross sections, oscillator strengths, transition probabilities and energy levels/line wavelengths. Due to the relatively low temperatures associated with many astrophysical plasmas, the accurate determination of electron-impact excitation rates in the low-energy region is essential for generating a reliable spectral synthesis. Hence it is these atomic data, and the main computational methods used to evaluate them, on which we focus in this publication. We consider in particular the complicated open d-shell structures of the Fe-peak ions in low ionization stages. While some of these data can be obtained experimentally, they are usually of insufficient accuracy or limited to a small number of transitions.
Abstract:
Accurate determination of electron excitation rates for the Fe-peak elements is complicated by the presence of an open 3d-shell in the description of the target ion, which can lead to hundreds of target state energy levels. Furthermore, the low energy scattering region is dominated by series of Rydberg resonances, which require a very fine energy mesh for their delineation. These problems have prompted the development of a suite of parallel R-matrix codes. In this work we report recent applications of these codes to the study of electron impact excitation of Ni III and Ni IV.
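The fine-mesh requirement quoted above comes from the way Rydberg resonances crowd together as they approach a threshold. Below is a minimal sketch of that scaling, assuming a simple hydrogenic series with an illustrative residual charge, quantum defect and threshold (not values taken from the Ni III or Ni IV calculations):

```python
# Minimal sketch: positions and spacing of Rydberg resonances below an
# excitation threshold, modelled as a hydrogenic series
#   E_n = E_th - z**2 * Ry / (n - mu)**2.
# The residual charge z, quantum defect mu and threshold E_th are
# illustrative values, not taken from the Ni III / Ni IV work.

RY_EV = 13.6057                 # Rydberg energy in eV
z, mu, e_th = 3.0, 0.5, 10.0    # assumed series parameters

for n in range(5, 20):
    e_n = e_th - z**2 * RY_EV / (n - mu) ** 2
    gap = (e_th - z**2 * RY_EV / (n + 1 - mu) ** 2) - e_n
    print(f"n={n:2d}  E_n={e_n:7.3f} eV  gap to next = {gap * 1000:8.2f} meV")

# The gap falls off roughly as 2 * z**2 * Ry / n**3, so sampling each
# resonance with several mesh points soon requires meV-scale steps --
# the reason a very fine energy mesh is needed in the low-energy region.
```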
Abstract:
There is a significant lack of indoor air quality research in low energy homes. This study compared the indoor air quality of eight newly built case study homes constructed to similar levels of air-tightness and insulation but with two different ventilation strategies (four homes with Mechanical Ventilation with Heat Recovery (MVHR) systems/Code level 4 and four homes naturally ventilated/Code level 3). Indoor air quality measurements were conducted over a 24 h period in the living room and main bedroom of each home during the summer and winter seasons. Simultaneous outside measurements and an occupant diary were also employed during the measurement period. Occupant interviews were conducted to gain information on perceived indoor air quality, occupant behaviour and building-related illnesses. Knowledge of the MVHR system, including ventilation-related behaviour, was also studied. Results suggest indoor air quality problems in both the mechanically ventilated and naturally ventilated homes, with significant issues identified regarding occupant use in the social homes.
Abstract:
We have measured mass spectra of positive ions produced by low-energy electron impact on thymine using a reflectron time-of-flight mass spectrometer. Using computer-controlled data acquisition, mass spectra have been acquired for electron impact energies up to 100 eV in steps of 0.5 eV. Ion yield curves for most of the fragment ions have been determined by fitting groups of adjacent peaks in the mass spectra with sequences of normalized Gaussians. The ion yield curves have been normalized by comparing the sum of the ion yields to the average of calculated total ionization cross sections. Appearance energies have been determined. The nearly equal appearance energies of 83 u and 55 u observed in the present work strongly indicate that near threshold the 55 u ion is formed directly by the breakage of two bonds in the ring, rather than by successive loss of HNCO and CO from the parent ion (for thymine at 126 u: 126 − 43 (HNCO) = 83, and 83 − 28 (CO) = 55). Likewise, 54 u is not formed by CO loss from 82 u. The appearance energies are in a number of cases consistent with the loss of one or more hydrogen atoms from a heavier fragment, but 70 u is not formed by hydrogen loss from 71 u.
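To illustrate the peak-fitting step, the sketch below fits a group of two adjacent mass peaks with a sum of unit-area Gaussians sharing one width, so each fitted amplitude is directly an ion yield. The peak centres, width and synthetic data are assumptions for the example, not the thymine spectra:

```python
# Sketch: fit a group of adjacent mass peaks with a sum of normalized
# (unit-area) Gaussians sharing one width; the fitted amplitudes are
# then the ion yields. Centres, width and data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

CENTRES = (54.0, 55.0)   # fragment masses in u (example values)

def model(m, a1, a2, sigma):
    def g(mu):           # unit-area Gaussian centred at mu
        return np.exp(-0.5 * ((m - mu) / sigma) ** 2) / (
            sigma * np.sqrt(2.0 * np.pi))
    return a1 * g(CENTRES[0]) + a2 * g(CENTRES[1])

m = np.linspace(53.0, 56.0, 200)
rng = np.random.default_rng(0)
data = model(m, 120.0, 300.0, 0.15) + rng.normal(0.0, 2.0, m.size)

(a1, a2, sigma), _ = curve_fit(model, m, data, p0=[100.0, 100.0, 0.2])
print(f"yields: {a1:.1f}, {a2:.1f}  shared width: {sigma:.3f} u")
```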
Abstract:
It is predicted that climate change will result in rising sea levels, more frequent and extreme weather events, hotter and drier summers, and warmer and wetter winters. This will have a significant impact on the design of buildings, how they are kept cool and how they are weathered against more extreme climatic conditions. The residential sector is already a significant environmental burden with high associated operational energy. Climate change, and a growing population requiring residence, has the potential to seriously exacerbate this problem. New paradigms for residential building design are required to enable low-carbon-dioxide operation to mitigate climate change. They must also face the reality of inevitable climate change and adopt climate change adaptation strategies to cope with future scenarios. However, any climate adaptation strategy for dwellings must also be cognisant of changing occupant needs, influenced by ageing populations and new technologies. This paper presents concepts and priorities for changing how society designs residential buildings by designing for adaptation. A case study home is analysed in the context of its stated aims of low energy and adaptability. A post-occupancy evaluation of the house is presented, and future-proofing strategies are evaluated using climate projection data for future climate change scenarios.
Abstract:
Master's dissertation, Oceanography, Faculdade de Ciências do Mar e do Ambiente, Universidade do Algarve, 2007
Abstract:
Flavour effects due to lepton interactions in the early Universe may have played an important role in the generation of the cosmological baryon asymmetry through leptogenesis. If the only source of high-energy CP violation comes from the left-handed leptonic sector, then it is possible to establish a bridge between flavoured leptogenesis and low-energy leptonic CP violation. We explore this connection taking into account our present knowledge of low-energy neutrino parameters and the matter-antimatter asymmetry observed in the Universe. In this framework, we find that leptogenesis favours a hierarchical light neutrino mass spectrum, while for quasi-degenerate and inverted hierarchical neutrino masses there is only a very narrow allowed window. The absolute neutrino mass scale turns out to be m ≲ 0.1 eV.
Abstract:
in RoboCup 2007: Robot Soccer World Cup XI
Abstract:
Cherts from the Middle Devonian Onondaga Formation of the Niagara Peninsula in Southern Ontario and Western New York State can now be distinguished from those of the Early Devonian Bois Blanc Formation of the same area based on differences in petrology, acritarchs, spores, and "Preservation Ratio" values. The finely crystalline carbonate sediments of the Bois Blanc Formation were deposited under shallow, low energy conditions characterised by the acritarchs Leiofusa bacillum and L. minuta and a high relative abundance of the spore Apiculiretusispora minor. The medium-crystalline and bioclastic carbonate sediments of the Onondaga Formation were deposited under shallow, high energy conditions, except for the finely crystalline lagoonal sediments of the Clarence Member, which is characterised by the acritarchs Leiofusa navicula, L. sp. B, and L. tomaculata. The author has subdivided and correlated the Clarence Member of the Onondaga Formation using the "Preservation Ratio" values derived from the palynomorphs contained in the cherts. Clarence Member cherts were used by the Archaic people of the Niagara Peninsula for chipped-stone tools. The source area for the chert is considered to be the cobble beach deposits along the north shore of Lake Erie from Port Maitland to Nanticoke.
Abstract:
A system comprising a Martin-Puplett type polarizing interferometer and a helium-3 cryostat was developed to study the transmission of materials in the very-far-infrared region of the spectrum. This region is of significant interest due to the low-energy excitations which many materials exhibit. The experimental transmission spectrum contains information concerning the optical properties of the material. The set-up of this system is described in detail, along with the adaptations and improvements which have been made to ensure the best results. Transmission experiments carried out with this new set-up on two different varieties of materials, superconducting thin films of lead and biological proteins, are discussed. Several thin films of lead deposited on fused silica quartz substrates were studied. From the ratio of the transmission in the superconducting state to that in the normal state, the superconducting energy gap was determined to be approximately 25 cm⁻¹, which corresponds to 2Δ/kBTc ≈ 5, in agreement with literature data. Furthermore, in agreement with theoretical predictions, the maximum in the transmission ratio was observed to increase as the film thickness was increased. These results verify the system's ability to accurately measure the optical properties of thin low-Tc superconducting films. Transmission measurements were carried out on double-deionized water and on a variety of different concentrations by weight of the globular protein Bovine Serum Albumin in the sol, gel and crystalline forms. The results of the water study agree well with literature values and thus further illustrate the reproducibility of the system. The results of the protein experiments, although preliminary, indicate that as the concentration increases the samples become more transparent. Some weak structure in the frequency-dependent absorption coefficient, which is more prominent in crystalline samples, may be due to low-frequency vibrations of the protein molecules.
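As a quick consistency check of the quoted ratio: taking the measured gap 2Δ ≈ 25 cm⁻¹ from the abstract and assuming the bulk critical temperature of lead, Tc = 7.2 K:

```python
# Consistency check of the quoted gap ratio for the lead films.
# 2*Delta = 25 cm^-1 is the measured value from the abstract;
# Tc = 7.2 K is the bulk critical temperature of lead (assumed
# to hold for the films).
K_B = 0.695035     # Boltzmann constant in cm^-1 per K

two_delta = 25.0   # measured gap 2*Delta, cm^-1
t_c = 7.2          # bulk Tc of Pb, K

ratio = two_delta / (K_B * t_c)
print(f"2*Delta/(k_B*T_c) = {ratio:.2f}")   # ~5.0, above the BCS
# weak-coupling value of 3.53, as expected for strongly coupled Pb.
```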
Abstract:
Polyglutamine is a naturally occurring peptide found within several proteins in neuronal cells of the brain, and its aggregation has been implicated in several neurodegenerative diseases, including Huntington's disease. The resulting aggregates have been demonstrated to possess β-sheet structure, and aggregation has been shown to start with a single misfolded peptide. The current project sought to computationally examine the structural tendencies of three mutant polyglutamine peptides that were studied experimentally and found to aggregate with varying efficiencies. Low-energy structures were generated for each peptide by simulated annealing and were analyzed quantitatively by various geometry- and energy-based methods. According to the results, the experimentally observed inhibition of aggregation appears to be due to localized conformational restraint placed on the peptide backbone by inserted prolines, which in turn confines the peptide to native coil structure, discouraging the transition towards the β-sheet structure required for aggregation. Such knowledge could prove quite useful to the design of future treatments for Huntington's and other related diseases.
Abstract:
The prediction of a protein's conformation helps in understanding its exhibited functions, and allows for modeling and for the possible synthesis of the studied protein. Our research is focused on a sub-problem of protein folding known as side-chain packing, whose computational complexity has been proven to be NP-hard. The motivation behind our study is to offer the scientific community a means of obtaining faster conformation approximations for small to large proteins than currently available methods. As the size of proteins increases, current techniques become unusable due to the exponential nature of the problem. We investigated the capabilities of a hybrid genetic algorithm / simulated annealing technique to predict the low-energy conformational states of various-sized proteins and to generate statistical distributions of the studied proteins' molecular ensembles for pKa predictions. Our algorithm produced errors relative to experimental results within acceptable margins and offered considerable speed-up, depending on the protein and on the resolution of the rotameric states used.
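A minimal sketch of a hybrid genetic algorithm / simulated annealing search of this general kind, on a toy discrete rotamer problem; the energy function, problem sizes and schedule parameters are stand-ins, not the force field or settings used in the thesis:

```python
# Sketch of a hybrid GA / simulated-annealing search on a toy
# side-chain packing problem: each residue chooses one discrete
# rotamer, and the energy below is a stand-in clash count.
import math
import random

N_RES, N_ROT = 20, 3     # residues and rotamers per residue (toy sizes)

def energy(conf):
    # toy energy: count "clashes" between neighbouring residues
    return sum(conf[i] == conf[i + 1] for i in range(N_RES - 1))

def anneal(conf, t0=2.0, cooling=0.95, steps=200):
    """Simulated annealing used as the GA's local-refinement step."""
    t, cur = t0, list(conf)
    for _ in range(steps):
        trial = list(cur)
        trial[random.randrange(N_RES)] = random.randrange(N_ROT)
        d = energy(trial) - energy(cur)
        if d <= 0 or random.random() < math.exp(-d / t):
            cur = trial          # Metropolis acceptance
        t *= cooling
    return cur

def crossover(a, b):
    cut = random.randrange(1, N_RES)
    return a[:cut] + b[cut:]

pop = [[random.randrange(N_ROT) for _ in range(N_RES)] for _ in range(20)]
for _ in range(30):                       # generations
    pop.sort(key=energy)
    elite = pop[:10]                      # keep the fittest half
    pop = elite + [anneal(crossover(random.choice(elite),
                                    random.choice(elite)))
                   for _ in range(10)]
print("best energy found:", energy(min(pop, key=energy)))
```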
Abstract:
Previous research has found that victims of crime tend to exhibit asynchronous movement (e.g. Grayson & Stein, 1981), and the fact that victims display different body language suggests that they may be sending inadvertent signals of their own vulnerability (e.g. Murzynski & Degelman, 1996). Body language has also been linked with self-identification as a victim (Wheeler et al., 2009), and self-identification has been found to act as a proxy for more severe victimization (Baumer, 2002) and greater fear of crime (Greenberg & Beach, 2004). The first prediction in the present study, then, was that self-perceived vulnerability would be correlated with body language, while the number of previous victimizations may or may not show the same relationship. Findings from the present study indicate that self-perceived vulnerability exhibits a positive correlation with the body language cues that approaches significance, r(10) = .45, p = .07, one-tailed. Different types of victimization, however, were not significantly correlated with these cues. A second goal of the study was to examine the relationship between psychopathic traits and accuracy in judgments of vulnerability. Seventy male participants rated the vulnerability of 12 female targets, filmed walking down a hallway, who had provided self-ratings of vulnerability. Individuals scoring higher on Factor 2 and total psychopathy were significantly less discrepant from target self-ratings of vulnerability, r(64) = -.39, p < .001 and r(64) = -.29, p < .01, respectively. The final purpose of this study was to determine which body language cues were most salient to raters when making judgments of vulnerability. Participants rated the apparent vulnerability of a target in 7 video clips portraying each body language cue in isolation and a natural walk. Results of repeated-measures analyses indicate that the videos rated as most vulnerable to victimization were those displaying low energy and lack of synchrony, followed by wide stride, short stride, and stiff knees, while the video displaying neck stiffness did not receive significantly different ratings from the model's natural walk. Replication with a larger sample size is necessary to increase confidence in the findings and their implications.
Abstract:
Knee dislocation, although very rare, remains a devastating injury because it leads to multiple complications owing to the complex nature of the trauma associated with this lesion. Dislocation can result from high- or low-energy trauma. The severe injuries to the ligaments and associated structures give knee dislocation a high potential for functional impairment. Conservative treatment, formerly considered acceptable, is now reserved for a very small percentage of patients. Surgical reconstruction is now advocated in most cases, but many options exist and the optimal surgical treatment remains controversial. Some surgeons recommend complete reconstruction of all damaged ligaments as early as possible, while others, fearing the development of arthrofibrosis, limit immediate surgical intervention to reconstruction of the posterior cruciate ligament and the posterolateral corner. Because of the multiple structures damaged in a knee dislocation, surgeons commonly combine autografts and allografts to complete the ligament reconstruction. Complications associated with graft harvesting, graft elongation, early graft weakening and the risks of disease transmission have led surgeons to seek alternative options. The use of synthetic materials to replace the injured ligament was proposed in the 1980s. After an initial wave of enthusiasm, disappointing long-term results and high failure rates reduced its popularity. Since then, a new generation of artificial ligaments has emerged, and among them the Ligament Advanced Reinforced System (LARS) has shown promising results. It has recently been used in isolated reconstructions of the anterior cruciate ligament and the posterior cruciate ligament, for which it has shown good short- and medium-term results. The purpose of this retrospective study was to evaluate knee function and stability after acute dislocation following reconstruction of the cruciate ligaments with the LARS artificial ligament. This study evaluated 71 patients with a knee dislocation who underwent surgical reconstruction of the anterior and posterior cruciate ligaments using the LARS ligament. After surgery, the same intensive rehabilitation protocol was followed for all patients, with progressive weight-bearing permitted after a period of about 6 weeks during which muscle strength and dynamic stability were re-established. The assessment tools were the Lysholm score, the International Knee Documentation Committee form, the Short Form-36, clinical knee stability tests, joint range of motion measured with a goniometer, and Telos stress radiography at 30° and 90° of knee flexion. The same assessment protocol was applied to the contralateral knee for comparison. The subjective and objective results of this study are satisfactory and suggest that combined repair and reconstruction with LARS ligaments is a valid alternative for the treatment of acute knee dislocations.
These results demonstrate that these interventions produce lasting improvements in function and reveal the long-term durability of LARS artificial ligaments. Patients become both more autonomous and more satisfied over time, even though knee dislocation is considered catastrophic at the moment it occurs. Randomized prospective studies are now needed to compare graft selection and the timing of surgical reconstruction.
Abstract:
The Medipix pixel detectors were developed by the Medipix collaboration and allow real-time imaging. Their active surface of nearly 2 cm² is divided into 65536 pixels of 55 × 55 μm² each. Sixteen of these detectors, Medipix2 devices, are installed in the ATLAS experiment at CERN to measure in real time the radiation fields produced by hadron collisions at the LHC. They will soon be replaced by Timepix detectors, the most recent version, which can directly measure the energy deposited in each pixel in time-over-threshold (TOT) mode when a particle passes through the semiconductor. To improve the analysis of the data collected with these Timepix detectors in ATLAS, a Geant4 simulation project was initiated by John Idárraga at the Université de Montréal. Within the ATLAS experiment, this simulation can be used together with Athena, the ATLAS analysis framework, and the full simulation of the ATLAS detector. Under their own mutual repulsion, the charge carriers created in the semiconductor diffuse towards adjacent pixels, depositing energy in several pixels through charge sharing. An effective model of this lateral diffusion was developed to reproduce the phenomenon without solving a charge-transport differential equation. This model, together with the TOT mode of the Timepix, which measures the energy deposited in the detector, was included in the simulation to adequately reproduce the tracks left by particles in the semiconductor. The detector was first calibrated pixel by pixel using americium and barium sources. The simulation was then validated with measurements of interactions of protons and α particles produced at the Tandem van de Graaff generator of the Laboratoire René-J.-A.-Lévesque at the Université de Montréal.
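A minimal sketch of an effective charge-sharing model in the spirit described above: rather than solving a charge-transport equation, the deposited energy is spread over the 3×3 neighbourhood of the hit pixel with a Gaussian of fixed width. The 55 μm pitch matches the Medipix/Timepix value; the spread SIGMA is an assumed stand-in, not the fitted parameter of the thesis:

```python
# Sketch of an effective lateral charge-sharing model: the energy of a
# single hit is split among the 3x3 pixels around it by integrating a
# Gaussian over each pixel area. SIGMA is an assumed effective spread.
import math

PITCH = 55.0    # pixel pitch in um (Medipix/Timepix)
SIGMA = 15.0    # effective lateral spread in um (assumed)

def phi(x):     # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def share(e_dep, x0, y0):
    """Energy e_dep deposited at (x0, y0) inside the central pixel
    (coordinates in um from its lower-left corner), split over the
    3x3 surrounding pixels by integrating the Gaussian over each."""
    out = {}
    for i in (-1, 0, 1):
        for j in (-1, 0, 1):
            fx = phi(((i + 1) * PITCH - x0) / SIGMA) - phi((i * PITCH - x0) / SIGMA)
            fy = phi(((j + 1) * PITCH - y0) / SIGMA) - phi((j * PITCH - y0) / SIGMA)
            out[(i, j)] = e_dep * fx * fy
    return out

# A hit near a pixel corner shares its energy over four pixels:
for pixel, e in sorted(share(60.0, 50.0, 50.0).items()):
    if e > 0.01:
        print(pixel, f"{e:.1f} keV")
```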