846 results for "Using an harmonic instrument"
Abstract:
IntraCavity Laser Absorption Spectroscopy (ICLAS) is a high-resolution, high-sensitivity spectroscopic method capable of measuring line positions, linewidths, lineshapes, and absolute line intensities with a sensitivity that far exceeds that of a traditional multiple-pass absorption cell or Fourier transform spectrometer. From the fundamental knowledge obtained through these measurements, information about the underlying spectroscopy, dynamics, and kinetics of the species interrogated can be derived. The construction of an ICLA spectrometer will be detailed, and the measurements utilizing ICLAS will be discussed, as well as the theory of operation and modifications of the experimental apparatus. Results include: i) line intensities and collision-broadening coefficients of the A band of oxygen and previously unobserved high-J rotational transitions of the A band, hot-band transitions, and transitions of isotopically substituted species; ii) high-resolution (0.013 cm⁻¹) spectra of the second overtone of the OH stretch of trans-nitrous acid recorded between 10,230 and 10,350 cm⁻¹. The spectra were analyzed to yield a complete set of rotational parameters and an absolute band intensity, and two groups of anharmonic perturbations were observed and analyzed. These findings are discussed in the context of the contribution of overtone-mediated processes to OH radical production in the lower atmosphere.
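As general background on the theory of operation mentioned in this abstract (and not specific to this spectrometer), the sensitivity of ICLAS is usually summarised by equating the effective absorption path length to the distance light travels during the laser generation time, scaled by the fraction of the cavity filled by the absorber:

```latex
% Standard ICLAS sensitivity relation (general background, not this apparatus's
% specific parameters): an intracavity absorber filling a fraction \ell/L of the
% cavity acts over an effective path set by the generation time t_g.
L_{\mathrm{eff}} \;=\; \frac{\ell}{L}\, c\, t_g ,
\qquad
\frac{I(\nu, t_g)}{I_0(\nu, t_g)} \;=\; \exp\!\bigl[-\kappa(\nu)\, L_{\mathrm{eff}}\bigr]
```

Generation times of a few milliseconds thus correspond to effective path lengths of hundreds of kilometres, which is the origin of the sensitivity advantage over multipass cells and Fourier transform instruments.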
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, an extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for the interconnection with neighbor domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in the queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority class) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
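As a rough sketch of the ingress admission decision described above (the interface, class labels, and rates below are hypothetical, not taken from the paper):

```python
class IngressAdmissionControl:
    """Toy ingress-side admission control driven by egress throughput feedback."""

    def __init__(self):
        self.available_throughput = 0.0   # latest feedback from the egress (bit/s)
        self.reserved = 0.0               # sum of minimums of already admitted flows (bit/s)

    def update_feedback(self, measured_available_bps):
        """Record the available path throughput reported back by the egress."""
        self.available_throughput = measured_available_bps

    def on_new_flow(self, guaranteed_min_bps):
        """Admit the flow into the GMTS if its minimum fits, else give best effort."""
        if self.reserved + guaranteed_min_bps <= self.available_throughput:
            self.reserved += guaranteed_min_bps
            return "GMTS"         # packets marked with the lowest discarding priority
        return "BEST_EFFORT"      # packets treated as best effort


ac = IngressAdmissionControl()
ac.update_feedback(10e6)          # egress reports 10 Mb/s available on the path
print(ac.on_new_flow(2e6))        # -> GMTS
print(ac.on_new_flow(9e6))        # -> BEST_EFFORT (2 + 9 Mb/s exceeds 10 Mb/s)
```

The essential point is that the decision uses only the egress-measured available throughput fed back to the ingress, so the core of the network remains stateless.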
Abstract:
This paper presents the design and implementation of a mission control system (MCS) for an autonomous underwater vehicle (AUV) based on Petri nets. In the proposed approach, Petri nets are used both to specify and to execute the desired autonomous vehicle mission. The mission is easily described using an imperative programming language called mission control language (MCL) that formally describes the mission execution thread. A mission control language compiler (MCL-C), able to automatically translate the MCL into a Petri net, is described, and a real-time Petri net player that allows the resulting Petri net to be executed onboard an AUV is also presented.
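To make the execution model concrete, the following is a minimal, self-contained sketch of Petri-net firing semantics; it is a toy, not the paper's MCL-C compiler or real-time player, and the places and transitions are invented:

```python
# A transition is enabled when every input place holds at least one token;
# firing it removes a token from each input place and adds one to each output place.

from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: list      # names of input places
    outputs: list     # names of output places

@dataclass
class PetriNet:
    marking: dict                               # place name -> token count
    transitions: list = field(default_factory=list)

    def enabled(self, t):
        return all(self.marking.get(p, 0) > 0 for p in t.inputs)

    def fire(self, t):
        assert self.enabled(t), f"{t.name} is not enabled"
        for p in t.inputs:
            self.marking[p] -= 1
        for p in t.outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

    def run(self):
        """Fire enabled transitions until none remain (a toy mission thread)."""
        progress = True
        while progress:
            progress = False
            for t in self.transitions:
                if self.enabled(t):
                    self.fire(t)
                    print(f"fired {t.name}, marking = {self.marking}")
                    progress = True

# Toy mission thread: descend, then survey, then surface.
net = PetriNet(
    marking={"at_surface": 1},
    transitions=[
        Transition("descend", ["at_surface"], ["at_depth"]),
        Transition("survey",  ["at_depth"],   ["survey_done"]),
        Transition("surface", ["survey_done"], ["mission_complete"]),
    ],
)
net.run()
```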
Abstract:
This paper presents the assessment that inhabitants of several Colombian cities made of the conditions that contribute to the livability of public space. Seven hundred and forty people, inhabitants of Yopal, Villavicencio, Valledupar, Popayán, Pereira, Pasto, Neiva, Montería, Medellín, Fusagasugá, Cúcuta, Cartagena, Cali and Bogotá, participated in the study. The assessment of the conditions that contribute to the livability of public space was carried out using an instrument composed of 48 items that asked about the level of contribution that different conditions can have on the quality of public space, on a five-point scale ranging from "Does not contribute at all" (-2) to "Contributes significantly" (+2). The results show the conditions that most affect the livability of public space in Colombia, as well as the differences between cities according to the assessment made by participants of the general state of public space in their cities. Multidimensional analysis (SSA) reveals a structure that reflects the function that public space plays in people's assessment of Colombian cities. The implications of the findings for urban planning and management are discussed, and the instrument designed is proposed as a tool to assess the quality of urban public space.
Abstract:
Human butyrylcholinesterase (BChE; EC 3.1.1.8) is a polymorphic enzyme synthesized in the liver and in adipose tissue, widely distributed throughout the body and responsible for hydrolyzing some choline esters such as procaine, aliphatic esters such as acetylsalicylic acid, drugs such as methylprednisolone, mivacurium and succinylcholine, and drugs of use and/or abuse such as heroin and cocaine. It is encoded by the BCHE gene (OMIM 177400), for which more than 100 variants have been identified, some not fully studied, in addition to the most frequent form, called usual or wild type. Different polymorphisms of the BCHE gene have been associated with the synthesis of enzymes with varying levels of catalytic activity. The molecular bases of some of these genetic variants have been reported, among them the Atypical (A), fluoride-resistant types 1 and 2 (F-1 and F-2), silent (S), Kalow (K), James (J) and Hammersmith (H) variants. In this study, the validated Lifetime Severity Index for Cocaine Use Disorder (LSI-C) instrument was administered to a group of patients to assess the severity of "cocaine" use over the lifetime. In addition, Single Nucleotide Polymorphisms (SNPs) in the BCHE gene known to be responsible for adverse reactions in "cocaine"-using patients were determined by sequencing the gene, and the effect of the SNPs on protein function and structure was predicted using bioinformatics tools. The LSI-C instrument yielded results in four dimensions: lifetime use, recent use, psychological dependence, and attempts to quit. Molecular analysis revealed two non-synonymous coding SNPs (cSNPs) in 27.3% of the sample, c.293A>G (p.Asp98Gly) and c.1699G>A (p.Ala567Thr), located in exons 2 and 4, which correspond functionally to the Atypical (A) variant [dbSNP: rs1799807] and the Kalow (K) variant [dbSNP: rs1803274] of the BChE enzyme, respectively. In silico prediction studies classified the p.Asp98Gly SNP as pathogenic, whereas the p.Ala567Thr SNP showed neutral behavior. Analysis of the results supports the existence of a relationship between polymorphisms or genetic variants responsible for low catalytic activity and/or low plasma concentration of the BChE enzyme and some of the adverse reactions occurring in cocaine-using patients.
Abstract:
Despite the modernization of technological means and learning processes, mathematics in Brazilian public schools remains difficult to teach and to learn; methodological innovation that fosters the conditions needed for students to appropriate knowledge is lacking. This research, on the continuing education of mathematics teachers in elementary school (Cycle I) and innovation in pedagogical practice through music in the teaching of fractions, proposes the use of music as an innovative methodological teaching resource for fractions, with the aim of replacing expository lessons and mechanical exercises with enjoyable, meaningful experiences that help form a critical, participative subject. It presents the assessment mechanisms of Brazilian educational policy as well as the nine-year elementary school system. It highlights methodological innovation as a necessity in the continuing education of the multi-subject teacher who is not a mathematics specialist. The research is a qualitative case study and considers the historical process of society and of the subject in order to understand the role of the school and the teacher and the specificities of the teaching and learning process. The results show the need for higher education institutions to review how they train professionals, so that teachers develop a questioning stance toward their own practice and are able to foster the same attitude in their students. This study contributes to the learning of fractions by avoiding expository lessons and mechanical exercises through a continuing-education proposal that uses music as an instrument for teaching fractions, developed by the researcher during the action research process, in addition to promoting debate in the participating schools about innovations in their Pedagogical Projects.
Abstract:
An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degree by 0.5 degree latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in preference to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors using a hierarchical block matching scheme. Examples are shown of the various stages in the process. Also shown are examples of the usefulness of this type of data in GCM validation.
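A highly simplified sketch of the zenith-angle weighting idea (the cos² weight, grid handling, and function below are assumptions for illustration, not the ISCCP gridding algorithm itself):

```python
# Weighted averaging of pixel brightness temperatures onto a regular 0.5-degree
# lat-lon grid, downweighting limb pixels relative to near-nadir pixels.

import numpy as np

def grid_brightness_temps(lats, lons, temps, zenith_angles, res=0.5):
    """Zenith-angle-weighted average of pixel temperatures on a lat-lon grid."""
    nlat, nlon = int(180 / res), int(360 / res)
    weighted_sum = np.zeros((nlat, nlon))
    weight_sum = np.zeros((nlat, nlon))

    weights = np.cos(np.radians(zenith_angles)) ** 2   # assumed limb downweighting

    rows = ((lats + 90.0) / res).astype(int).clip(0, nlat - 1)
    cols = ((lons + 180.0) / res).astype(int).clip(0, nlon - 1)

    np.add.at(weighted_sum, (rows, cols), weights * temps)
    np.add.at(weight_sum, (rows, cols), weights)

    with np.errstate(invalid="ignore"):
        return np.where(weight_sum > 0, weighted_sum / weight_sum, np.nan)

# Example: three pixels in the same grid cell, one near the limb.
lats = np.array([10.2, 10.3, 10.2])
lons = np.array([20.1, 20.1, 20.2])
temps = np.array([285.0, 287.0, 260.0])
zen = np.array([10.0, 15.0, 75.0])        # the 75-degree pixel gets little weight
grid = grid_brightness_temps(lats, lons, temps, zen)
```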
Abstract:
Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP-Department of Energy (DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences in the spatial distribution of mean intensities show that the ECMWF reanalysis is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere, with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15. Also shown are results based on using the same techniques to diagnose the tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.
Abstract:
Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are not uniformly specified or defined across national, sectoral or organisational borders, leading to an opaque competency-description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain, which enables automated reasoning engines to be built that, by utilising the interrelations between entities, can make "intelligent" choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed that compare and contrast different competency descriptions on the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints of competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made, which other frameworks can reference. This research has shown that competencies can be divided up into "knowledge", "skills" and what we call "others". An ontology has been created based on this, with a simple structure of different "kinds" of "knowledges" and "skills" using semantic interrelations to define the basic semantic structure of the ontology. A prototype tool for skill gap analysis has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by using an ontologically based inference engine, which is able to list the closest fit and possible proficiency gaps.
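A toy sketch of the kind of skill gap analysis described above (the competency names, kinds, and proficiency scale are invented; this is not the TRACE ontology or the prototype tool):

```python
# The "kind" labels mirror the knowledge / skills / others split described above.
KINDS = {
    "python_programming": "skill",
    "data_modelling": "skill",
    "ontology_design": "knowledge",
}

def skill_gap(personal, desired):
    """List competencies in the desired profile not (sufficiently) covered by the personal one."""
    gaps = {}
    for name, required in desired.items():
        have = personal.get(name, 0)
        if have < required:
            gaps[name] = {"kind": KINDS.get(name, "other"), "shortfall": required - have}
    return gaps

# Profiles map competency name -> proficiency level on an assumed 0-5 scale.
person = {"python_programming": 3, "data_modelling": 2}
job = {"python_programming": 3, "data_modelling": 4, "ontology_design": 2}
print(skill_gap(person, job))
# {'data_modelling': {'kind': 'skill', 'shortfall': 2},
#  'ontology_design': {'kind': 'knowledge', 'shortfall': 2}}
```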
Abstract:
Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high resolution TerraSAR-X data to detect flooded regions in urban areas is described. An important application for this would be the calibration and validation of the flood extent predicted by an urban flood inundation model. To date, research on such models has been hampered by lack of suitable distributed validation data. The study uses a 3m resolution TerraSAR-X image of a 1-in-150 year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with airborne LiDAR data to estimate regions of the TerraSAR-X image in which water would not be visible due to radar shadow or layover caused by buildings and taller vegetation, and these regions were masked out in the flood detection process. A semi-automatic algorithm for the detection of floodwater was developed, based on a hybrid approach. Flooding in rural areas adjacent to the urban areas was detected using an active contour model (snake) region-growing algorithm seeded using the un-flooded river channel network, which was applied to the TerraSAR-X image fused with the LiDAR DTM to ensure the smooth variation of heights along the reach. A simpler region-growing approach was used in the urban areas, which was initialized using knowledge of the flood waterline in the rural areas. Seed pixels having low backscatter were identified in the urban areas using supervised classification based on training areas for water taken from the rural flood, and non-water taken from the higher urban areas. Seed pixels were required to have heights less than a spatially-varying height threshold determined from nearby rural waterline heights. Seed pixels were clustered into urban flood regions based on their close proximity, rather than requiring that all pixels in the region should have low backscatter. This approach was taken because it appeared that urban water backscatter values were corrupted in some pixels, perhaps due to contributions from side-lobes of strong reflectors nearby. The TerraSAR-X urban flood extent was validated using the flood extent visible in the aerial photos. It turned out that 76% of the urban water pixels visible to TerraSAR-X were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19% respectively. These findings indicate that TerraSAR-X is capable of providing useful data for the calibration and validation of urban flood inundation models.
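A simplified sketch of seeded region growing in the spirit of the hybrid approach described above (the thresholds, array names, and connectivity rule are invented for illustration; the paper's algorithm is considerably more involved):

```python
# Grow a flood mask from seed pixels: accept 4-connected neighbours whose SAR
# backscatter is low (water-like) and whose LiDAR height lies below a waterline
# threshold, mimicking the low-backscatter / height-constrained seeding idea.

import numpy as np
from collections import deque

def grow_flood(backscatter, height, seeds, sigma0_max=-12.0, height_max=12.5):
    """Return a boolean flood mask grown outward from the given seed pixels."""
    flooded = np.zeros(backscatter.shape, dtype=bool)
    queue = deque(seeds)
    while queue:
        r, c = queue.popleft()
        if flooded[r, c]:
            continue
        flooded[r, c] = True
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < backscatter.shape[0] and 0 <= nc < backscatter.shape[1]:
                if (not flooded[nr, nc]
                        and backscatter[nr, nc] <= sigma0_max
                        and height[nr, nc] <= height_max):
                    queue.append((nr, nc))
    return flooded

# Tiny synthetic example: a low-backscatter, low-lying patch grows from one seed.
sigma0 = np.full((5, 5), -5.0)
sigma0[1:4, 1:4] = -15.0
dtm = np.full((5, 5), 11.0)
mask = grow_flood(sigma0, dtm, seeds=[(2, 2)])
```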
Abstract:
We investigated diurnal nitrate (NO3-) concentration variability in the San Joaquin River using an in situ optical NO3- sensor and discrete sampling during a 5-day summer period characterized by high algal productivity. Dual NO3- isotopes (δ15N-NO3 and δ18O-NO3) and dissolved oxygen isotopes (δ18O-DO) were measured over 2 days to assess NO3- sources and biogeochemical controls over diurnal time-scales. Concerted temporal patterns of dissolved oxygen (DO) concentrations and δ18O-DO were consistent with photosynthesis, respiration and atmospheric O2 exchange, providing evidence of diurnal biological processes independent of river discharge. Surface water NO3- concentrations varied by up to 22% over a single diurnal cycle and up to 31% over the 5-day study, but did not reveal concerted diurnal patterns at a frequency comparable to DO concentrations. The decoupling of δ15N-NO3 and δ18O-NO3 isotopes suggests that algal assimilation and denitrification are not major processes controlling diurnal NO3- variability in the San Joaquin River during the study. The lack of a clear explanation for NO3- variability likely reflects a combination of riverine biological processes and time-varying physical transport of NO3- from upstream agricultural drains to the mainstem San Joaquin River. The application of an in situ optical NO3- sensor along with discrete samples provides a view into the fine temporal structure of hydrochemical data and may allow for greater accuracy in pollution assessment.
Abstract:
A method for in situ detection of atmospheric turbulence has been developed using an inexpensive sensor carried within a conventional meteorological radiosonde. The sensor, a Hall-effect magnetometer, was used to monitor the terrestrial magnetic field. Rapid time-scale (10 s or less) fluctuations in the magnetic field measurement were related to the motion of the radiosonde, which was strongly influenced by atmospheric turbulence. Comparison with cloud radar measurements showed turbulence in regions where rapid time-scale magnetic fluctuations occurred. Reliable measurements were obtained between the surface and the stratosphere.
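An illustrative sketch of how rapid time-scale fluctuations might be flagged from a magnetometer record (the window length, threshold heuristic, and synthetic data are assumptions, not the paper's processing):

```python
# Flag samples whose short-window variability of the measured magnetic field is
# anomalously high, reflecting radiosonde motion driven by turbulence.

import numpy as np

def turbulence_flags(b_field, sample_rate_hz=1.0, window_s=10.0, threshold=None):
    """Boolean mask of samples whose ~10 s rolling standard deviation is high."""
    window = max(1, int(window_s * sample_rate_hz))
    padded = np.concatenate([np.full(window - 1, b_field[0]), b_field])
    windows = np.lib.stride_tricks.sliding_window_view(padded, window)
    rolling_std = windows.std(axis=1)
    if threshold is None:
        threshold = 3.0 * np.median(rolling_std)   # assumed heuristic
    return rolling_std > threshold

# Synthetic record: quiet field with a burst of rapid fluctuations in the middle.
rng = np.random.default_rng(0)
b = np.full(600, 48000.0) + rng.normal(0, 1, 600)   # nT, 1 Hz sampling
b[250:350] += rng.normal(0, 50, 100)                # simulated turbulent segment
flags = turbulence_flags(b)
```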
Abstract:
The mathematical difficulties which can arise in the force constant refinement procedure for calculating force constants and normal co-ordinates are described and discussed. The method has been applied to the methyl fluoride molecule, using an electronic computer. The best values of the twelve force constants in the most general harmonic potential field were obtained to fit twenty-two independently observed experimental data, these being the six vibration frequencies, three Coriolis zeta constants and two centrifugal stretching constants D_J and D_JK, for both CH3F and CD3F. The calculations have been repeated both with and without anharmonicity corrections to the vibration frequencies. All the experimental data were weighted according to the reliability of the observations, and the corresponding standard errors and correlation coefficients of the force constants have been deduced. The final force constants are discussed briefly, and compared with previous treatments, particularly with a recent Urey-Bradley treatment for this molecule.
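As a generic, schematic statement of the weighted least-squares refinement that underlies such a fit (not reproduced from the paper):

```latex
% Schematic weighted least-squares force constant refinement (illustrative).
% y_i: the 22 observed data (frequencies, zeta constants, D_J, D_JK),
% f_i(\mathbf{F}): their values computed from the 12 force constants \mathbf{F},
% w_i: weights reflecting the reliability of each observation.
\begin{aligned}
  \chi^2(\mathbf{F}) &= \sum_{i=1}^{22} w_i \,\bigl[y_i - f_i(\mathbf{F})\bigr]^2,
  & \mathbf{F}^{\ast} &= \arg\min_{\mathbf{F}} \chi^2(\mathbf{F}),\\[4pt]
  \operatorname{cov}(\mathbf{F}^{\ast}) &\approx \frac{\chi^2(\mathbf{F}^{\ast})}{n-m}\,
  \bigl(\mathbf{J}^{\mathsf T}\mathbf{W}\,\mathbf{J}\bigr)^{-1},
  \qquad J_{ij} = \frac{\partial f_i}{\partial F_j}.
\end{aligned}
```

Here n = 22 observations and m = 12 force constants; standard errors are the square roots of the diagonal elements of the covariance matrix, and correlation coefficients follow from normalising its off-diagonal elements.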
Abstract:
We previously demonstrated that a dry, room temperature stable formulation of a live bacterial vaccine was highly susceptible to bile, and suggested that this will lead to significant loss of viability of any live bacterial formulation released into the intestine using an enteric coating or capsule. We found that bile and acid tolerance is very rapidly recovered after rehydration with buffer or water, raising the possibility that rehydration in the absence of bile prior to release into the intestine might solve the problem of bile toxicity to dried cells. We describe here a novel formulation that combines extensively studied bile acid adsorbent resins with the dried bacteria, to temporarily adsorb bile acids and allow rehydration and recovery of bile resistance of bacteria in the intestine before release. Tablets containing the bile acid adsorbent cholestyramine release 250-fold more live bacteria when dissolved in a bile solution, compared to control tablets without cholestyramine or with a control resin that does not bind bile acids. We propose that a simple enteric-coated oral dosage form containing bile acid adsorbent resins will allow improved live bacterial delivery to the intestine via the oral route, a major step towards room temperature stable, easily administered and distributed vaccine pills and other bacterial therapeutics.
Abstract:
The vibrations and tunnelling motion of malonaldehyde have been studied in their full dimensionality using an internal coordinate path Hamiltonian. In this representation there is one large-amplitude internal coordinate s and 3N - 7 (= 20) normal coordinates Q which are orthogonal to the large-amplitude motion at all points. It is crucial that a high-accuracy potential energy surface is used in order to obtain a good representation of the tunnelling motion; we use a Møller-Plesset (MP2) surface. Our methodology is variational, that is, we diagonalize a sufficiently large matrix in order to obtain the required vibrational levels, so an exact representation of the kinetic energy operator is used. In a harmonic valley representation (s, Q) complete convergence of the normal coordinate motions and the internal coordinate motions has been obtained; for the anharmonic valley, in which we use two- and three-body terms in the surface (s, Q(1), Q(2)), we also obtain complete convergence. Our final computed stretching fundamentals are deficient because our potential energy surface is truncated at quartic terms in the normal coordinates, but our lower fundamentals are good.
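A schematic form of the harmonic-valley expansion implied by the (s, Q) representation (illustrative only; the paper's Hamiltonian uses an exact kinetic energy operator and, in the anharmonic case, two- and three-body terms in the surface):

```latex
% Harmonic-valley expansion of the potential about the large-amplitude
% coordinate s, with s-dependent frequencies for the 3N-7 orthogonal modes.
V(s, \mathbf{Q}) \;\approx\; V_0(s) \;+\; \tfrac{1}{2}\sum_{k=1}^{3N-7} \omega_k^2(s)\, Q_k^2
```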