844 results for curriculum-based measurement
Abstract:
We present cross-validation of remote sensing measurements of methane profiles in the Canadian high Arctic. Accurate and precise measurements of methane are essential to understand quantitatively its role in the climate system and in global change. Here, we show a cross-validation between three datasets: two from spaceborne instruments and one from a ground-based instrument. All are Fourier Transform Spectrometers (FTSs). We consider the Canadian SCISAT Atmospheric Chemistry Experiment (ACE)-FTS, a solar occultation infrared spectrometer operating since 2004, and the thermal infrared band of the Japanese Greenhouse Gases Observing Satellite (GOSAT) Thermal And Near infrared Sensor for carbon Observation (TANSO)-FTS, a nadir/off-nadir scanning FTS instrument operating at solar and terrestrial infrared wavelengths since 2009. The ground-based instrument is a Bruker 125HR Fourier Transform Infrared (FTIR) spectrometer, measuring mid-infrared solar absorption spectra at the Polar Environment Atmospheric Research Laboratory (PEARL) Ridge Lab at Eureka, Nunavut (80° N, 86° W) since 2006. For each pair of instruments, measurements are collocated within 500 km and 24 h. An additional criterion based on potential vorticity values was found not to significantly affect differences between measurements. Profiles are regridded to a common vertical grid for each comparison set. To account for differing vertical resolutions, ACE-FTS measurements are smoothed to the resolution of either PEARL-FTS or TANSO-FTS, and PEARL-FTS measurements are smoothed to the TANSO-FTS resolution. Differences for each pair are examined in terms of profiles and partial columns. During the period considered, the number of collocations for each pair is large enough to obtain a good sample size (from several hundred to tens of thousands depending on the pair and configuration). Considering full profiles, the degrees of freedom for signal (DOFS) are between 0.2 and 0.7 for TANSO-FTS and between 1.5 and 3 for PEARL-FTS, while ACE-FTS has considerably more information (roughly one degree of freedom per altitude level). We take partial columns between roughly 5 and 30 km for the ACE-FTS–PEARL-FTS comparison, and between 5 and 10 km for the other pairs. The DOFS for the partial columns are between 1.2 and 2 for PEARL-FTS collocated with ACE-FTS, and between 0.1 and 0.5 for PEARL-FTS collocated with TANSO-FTS or for TANSO-FTS collocated with either other instrument, while ACE-FTS has much higher information content. For all pairs, the partial column differences are within ± 3 × 10^22 molecules cm−2. Expressed as median ± median absolute deviation (in absolute or relative terms), these differences are (0.11 ± 9.60) × 10^20 molecules cm−2 (0.012 ± 1.018 %) for TANSO-FTS–PEARL-FTS, (−2.6 ± 2.6) × 10^21 molecules cm−2 (−1.6 ± 1.6 %) for ACE-FTS–PEARL-FTS, and (7.4 ± 6.0) × 10^20 molecules cm−2 (0.78 ± 0.64 %) for TANSO-FTS–ACE-FTS. The differences for ACE-FTS–PEARL-FTS and TANSO-FTS–PEARL-FTS partial columns decrease significantly as a function of PEARL partial columns, whereas the range of partial column values for TANSO-FTS–ACE-FTS collocations is too small to draw any conclusion on its dependence on ACE-FTS partial columns.
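To make the collocation criterion and the median ± median-absolute-deviation statistics above concrete, a minimal Python sketch follows; the record layout, function names, and everything except the 500 km / 24 h thresholds are illustrative assumptions, not the processing code used in the study.

# Minimal sketch of collocation within 500 km / 24 h and median +/- MAD statistics.
# Record fields ("time", "lat", "lon", "pcol") are hypothetical illustrations.
from datetime import timedelta
from math import radians, sin, cos, asin, sqrt

import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def collocate(set_a, set_b, max_km=500.0, max_hours=24.0):
    """Pair measurements from two instruments that lie within 500 km and 24 h."""
    pairs = []
    for a in set_a:                      # "time" values are datetime objects
        for b in set_b:
            close_in_time = abs(a["time"] - b["time"]) <= timedelta(hours=max_hours)
            close_in_space = haversine_km(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_km
            if close_in_time and close_in_space:
                pairs.append((a, b))
    return pairs

def median_mad(differences):
    """Summarize partial-column differences as median and median absolute deviation."""
    diffs = np.asarray(differences)
    med = np.median(diffs)
    mad = np.median(np.abs(diffs - med))
    return med, mad

# Example usage (hypothetical record lists):
# pairs = collocate(tanso_records, pearl_records)
# med, mad = median_mad([a["pcol"] - b["pcol"] for a, b in pairs])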
An LDA and probability-based classifier for the diagnosis of Alzheimer's Disease from structural MRI
Abstract:
In this paper a custom classification algorithm based on linear discriminant analysis and probability-based weights is implemented and applied to hippocampus measurements from structural magnetic resonance images of healthy subjects and Alzheimer's Disease sufferers, with the aim of diagnosing them as accurately as possible. The classifier works by classifying each measurement of hippocampal volume as healthy-control-sized or Alzheimer's-Disease-sized; these new features are then weighted and used to classify the subject as a healthy control or as suffering from Alzheimer's Disease. The preliminary results reach an accuracy of 85.8 %, which is similar to state-of-the-art methods such as a Naive Bayes classifier and a Support Vector Machine. An advantage of the method proposed in this paper over the aforementioned state-of-the-art classifiers is the descriptive ability of the classifications it produces. The descriptive model can be of great help in aiding a doctor in the diagnosis of Alzheimer's Disease, or even in furthering the understanding of how Alzheimer's Disease affects the hippocampus.
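A minimal Python sketch of the two-stage idea described above (per-measurement classification followed by probability-weighted voting) is given below; the scikit-learn per-measurement LDAs, the weighting rule, and the data layout are assumptions for illustration, not the authors' implementation.

# Sketch: label each hippocampal measurement as HC-sized or AD-sized with a
# univariate LDA, then combine the labels into a subject-level diagnosis using
# the predicted class probabilities as weights. All names are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fit_per_measurement_ldas(X_train, y_train):
    """Fit one LDA per hippocampal measurement (one per column of X_train)."""
    models = []
    for j in range(X_train.shape[1]):
        lda = LinearDiscriminantAnalysis()
        lda.fit(X_train[:, [j]], y_train)          # y: 0 = healthy control, 1 = AD
        models.append(lda)
    return models

def predict_subject(models, x):
    """Weight each measurement's vote by its predicted class probability."""
    votes, weights = [], []
    for j, lda in enumerate(models):
        proba = lda.predict_proba(x[[j]].reshape(1, -1))[0]
        label = int(np.argmax(proba))              # AD-sized (1) or HC-sized (0)
        votes.append(label)
        weights.append(proba[label])               # confidence used as weight
    score = np.average(votes, weights=weights)     # weighted fraction of AD votes
    return int(score >= 0.5)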
Abstract:
Ruminant husbandry is a major source of anthropogenic greenhouse gases (GHG). Filling knowledge gaps and providing expert recommendations are important for defining future research priorities, improving methodologies and establishing science-based GHG mitigation solutions for government and non-governmental organisations, advisory/extension networks, and the ruminant livestock sector. The objective of this review is to summarize the published literature to provide a detailed assessment of the methodologies currently in use for measuring enteric methane (CH4) emission from individual animals under specific conditions, and to give recommendations regarding their application. The methods described include respiration chambers and enclosures, the sulphur hexafluoride (SF6) tracer technique, and techniques based on short-term measurements of gas concentrations in samples of exhaled air. The latter include automated head chambers (e.g. the GreenFeed system), the use of carbon dioxide (CO2) as a marker, and (handheld) laser CH4 detection. Each of the techniques is compared and assessed on its capabilities and limitations, followed by methodology recommendations. It is concluded that there is no ‘one size fits all’ method for measuring CH4 emission by individual animals. Ultimately, the decision as to which method to use should be based on the experimental objectives and resources available. However, the need for high-throughput methodology, e.g. for screening large numbers of animals for genomic studies, does not justify the use of methods that are inaccurate. All CH4 measurement techniques are subject to experimental variation and random errors. Many sources of variation must be considered when measuring CH4 concentration in exhaled air samples without a quantitative, or at least regular, collection rate, or without use of a marker to indicate (or adjust for) the proportion of exhaled CH4 sampled. Consideration of the number and timing of measurements relative to diurnal patterns of CH4 emission and respiratory exchange is important, as is consideration of feeding patterns, associated patterns of rumen fermentation rate, and other aspects of animal behaviour. Regardless of the method chosen, appropriate calibrations and recovery tests are required for both method establishment and routine operation. Successful and correct use of the methods requires careful attention to detail, rigour, and routine self-assessment of the quality of the data they provide.
Abstract:
Measurements of down-welling microwave radiation from raining clouds performed with the Advanced Microwave Radiometer for Rain Identification (ADMIRARI) at 10.7, 21.0, and 36.5 GHz during the Global Precipitation Measurement Ground Validation "Cloud processes of the main precipitation systems in Brazil: A contribution to cloud resolving modeling and to the Global Precipitation Measurement" (CHUVA) campaign held in Brazil in March 2010 represent a unique test bed for understanding three-dimensional (3D) effects in microwave radiative transfer processes. While the necessity of accounting for geometric effects is trivial given the slant observation geometry (ADMIRARI was pointing at a fixed 30° elevation angle), the polarization signal (i.e., the difference between the vertical and horizontal brightness temperatures) shows ubiquitous positive values at both 21.0 and 36.5 GHz in coincidence with high brightness temperatures. This signature is a genuine and unique microwave signature of radiation side leakage, which cannot be explained in a 1D radiative transfer framework but requires the inclusion of three-dimensional scattering effects. We demonstrate these effects and their interdependencies by analyzing two campaign case studies and by exploiting a sophisticated 3D radiative transfer model suited for dichroic media such as precipitating clouds.
Abstract:
This paper presents the use of a multiprocessor architecture to improve the performance of tomographic image reconstruction. Image reconstruction in computed tomography (CT) is an intensive task for single-processor systems. We investigate the suitability of filtered image reconstruction based on DSPs organized for parallel processing and compare it with an implementation based on the Message Passing Interface (MPI) library. The experimental results show that the speedups observed for both platforms increased with image resolution. In addition, the execution-time to communication-time ratios (Rt/Rc) as a function of sample size showed narrower variation for the DSP platform than for the MPI platform, which indicates its better performance for parallel image reconstruction.
Measurement of the energy spectrum of cosmic rays above 10^18 eV using the Pierre Auger Observatory
Abstract:
We report a measurement of the flux of cosmic rays with unprecedented precision and statistics using the Pierre Auger Observatory. Based on fluorescence observations in coincidence with at least one surface detector, we derive a spectrum for energies above 10^18 eV. We also update the previously published energy spectrum obtained with the surface detector array. The two spectra are combined, addressing the systematic uncertainties and, in particular, the influence of the energy resolution on the spectral shape. The spectrum can be described by a broken power law E^−γ with index γ = 3.3 below the ankle, which is measured at log10(E_ankle/eV) = 18.6. Above the ankle the spectrum is described by a power law with index 2.6, followed by a flux suppression above about log10(E/eV) = 19.5, detected with high statistical significance.
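Spelled out, the broken power-law description above takes roughly the schematic form below; the suppression factor f_supp(E) is an illustrative placeholder, not a quantity reported in the paper.

\[
J(E) \;\propto\;
\begin{cases}
E^{-3.3}, & \log_{10}(E/\mathrm{eV}) < 18.6 \quad \text{(below the ankle)}\\[2pt]
E^{-2.6}, & 18.6 \le \log_{10}(E/\mathrm{eV}) \lesssim 19.5\\[2pt]
E^{-2.6}\, f_{\mathrm{supp}}(E), & \log_{10}(E/\mathrm{eV}) \gtrsim 19.5 \quad \text{(flux suppression)}
\end{cases}
\]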
Abstract:
This thesis evaluates different sites for a weather measurement system and a suitable PV simulation for the University of Surabaya (UBAYA) in Indonesia (Java). The weather station is able to monitor all common weather phenomena, including solar insolation. It is planned to use the data for scientific and educational purposes in the renewable energy studies. During evaluation and installation it became apparent that official specifications from global meteorological organizations could not be met for some sensors because of the conditions on the UBAYA campus. After setting up the hardware, the weather at the site was monitored for a period of time. A comparison with different official sources from ground-based and satellite-based measurements showed differences in wind and solar radiation. In some cases the monthly average solar insolation deviated by 42 % for the satellite-based measurements; for the ground-based measurements the deviation was less than 10 %. The average wind speed differed by 33 % from a source that evaluated the wind power in Surabaya. The wind direction shows instabilities towards the east compared with data from the local weather station at the airport. PSET has the chance to obtain investment to investigate photovoltaics on its own roof. Several simulations show a suitable roof orientation and the yearly and monthly outputs. With a 7.7 kWpeak PV installation using the latest crystalline technology on the market, 8.82 MWh/year could be achieved with weather data from 2012. Thin-film technology could increase this value to 9.13 MWh/year. The roofs have enough area to install PV; however, the low price of electricity in Indonesia makes it not worthwhile to feed the energy into the public grid.
Abstract:
The focus of this article is on the relations between classroom interaction, curricular knowledge and student engagement in diverse classrooms. It is based on a study with an ethnographic perspective in which two primary school classes in Sweden were followed for three years. The analysis draws on Halliday's Systemic Functional Linguistics. The results indicate that language use in the classrooms is at a basic everyday level and that high teacher control results in low-demanding tasks and low engagement among students. Interaction in the classrooms mainly consists of short talk-turns with fragmented language, frequent repairs and interruptions, while writing and reading consist of single words and short sentences. Although the classroom atmosphere is friendly and inclusive, second-language students are denied necessary opportunities to develop curricular knowledge and Swedish at the advanced level, which they will need higher up in the school system. The restricted curriculum that these students are offered in school thus limits their opportunities for school success. I therefore argue for a more reflective and critical approach to language use in classrooms.
Abstract:
GPS tracking of mobile objects provides spatial and temporal data for a broad range of applications, including traffic management and control, and transportation routing and planning. Previous transport research has focused on GPS tracking data as an appealing alternative to travel diaries. Moreover, GPS-based data are gradually becoming a cornerstone for real-time traffic management. Tracking data of vehicles from GPS devices are, however, susceptible to measurement errors, a neglected issue in transport research. By conducting a randomized experiment, we assess the reliability of GPS-based traffic data on geographical position, velocity, and altitude for three types of vehicles: bike, car, and bus. We find the geographical positioning reliable, but with an error greater than postulated by the manufacturer and a non-negligible risk of aberrant positioning. Velocity is slightly underestimated, whereas altitude measurements are unreliable.
Abstract:
The accurate measurement of a vehicle's velocity is an essential feature in adaptive vehicle-activated sign systems. Since the velocities of the vehicles are acquired from a continuous-wave Doppler radar, data collection becomes challenging. Data accuracy is sensitive to the calibration of the radar on the road, yet clear methodologies for in-field calibration have not been carefully established. The signs are often installed by subjective judgment, which results in measurement errors. This paper develops a calibration method based on mining the collected data and matching individual vehicles travelling between two radars. The data were prepared in two ways: cleaning and reconstructing. The results showed that the proposed correction factor derived from the cleaned data corresponded well with the experimental factor obtained on site. In addition, the proposed factor showed superior performance compared with the one derived from the reconstructed data.
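As a rough illustration of the matching-based calibration idea above, here is a minimal Python sketch; the matching window, the record layout, and the median-ratio correction factor are assumptions for illustration, not the procedure developed in the paper.

# Sketch: match individual vehicles seen first at radar A and later at radar B
# by a plausible travel-time gap, then derive a speed-correction factor as the
# median ratio of the two radars' speeds. All names are hypothetical.
import numpy as np

def match_vehicles(radar_a, radar_b, min_gap_s=5.0, max_gap_s=30.0):
    """Pair detections whose time gap is consistent with travel between the radars."""
    pairs = []
    for a in radar_a:                    # each record: {"t": seconds, "v": km/h}
        candidates = [b for b in radar_b if min_gap_s <= b["t"] - a["t"] <= max_gap_s]
        if candidates:
            b = min(candidates, key=lambda b: b["t"] - a["t"])   # earliest plausible match
            pairs.append((a, b))
    return pairs

def correction_factor(pairs):
    """Median ratio between the two radars' speeds over all matched vehicles."""
    ratios = [b["v"] / a["v"] for a, b in pairs if a["v"] > 0]
    return float(np.median(ratios))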
Abstract:
The increasing availability of social statistics in Latin America opens new possibilities in terms of accountability and incentive mechanisms for policy makers. This paper addresses these issues within the institutional context of the Brazilian educational system. We build a theoretical model based on the theory of incentives to analyze the role of the recently launched Basic Education Development Index (Ideb) in the provision of incentives at the sub-national level. The first result is to demonstrate that an education target system has the potential to improve the allocation of resources to education through conditional transfers to municipalities and schools. Second, we analyze the local government's decision about how to allocate its education budget when seeking to accomplish the different objectives contemplated by the index, which involves the interaction between its two components, average proficiency and the passing rate. We also discuss policy issues concerning the implementation of the synthetic education index in light of this model, arguing that there is room for improving the Ideb's methodology itself. In addition, we analyze the desirable properties of an ideal education index and argue in favor of an ex-post relative learning evaluation system for different municipalities (schools) based on the value added across different grades.
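Schematically, the interaction between the two components can be read as a product, as sketched below; this is a simplified reading for illustration, not the official Ideb formula with its full normalization details.

\[
\text{Ideb} \;\approx\; N \times P, \qquad N \in [0, 10]\ \text{(standardized average proficiency)}, \quad P \in (0, 1]\ \text{(passing-rate / flow indicator)},
\]

so the same budget can raise the index either through learning gains (N) or through higher passing rates (P), which is precisely the allocation trade-off the model examines.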
Abstract:
In the modern Knowledge Economy, in the Big Data Era, correctly understanding the use and management of Information and Communication Technology (ICT), grounded in the academic field of Information Systems (IS) studies, becomes increasingly relevant and strategic for organizations that intend to remain in business, be able to meet new demands (internal and external), and face the complex changes in market competition. This research uses stages-of-growth theory, grounded in Richard L. Nolan's studies from the 1970s. The academic literature related to stages-of-growth models and the context of the IS field provides the conceptual basis of this study. The research identifies a model, with its constructs related to the growth stages of organizational ICT/IS initiatives, starting from Nolan's second-level benchmark variables, and proposes its operationalization through the creation and development of a scale. Exploratory and descriptive in character, the research makes a theoretical contribution to the stages-of-growth paradigm by adding a new growth process to its conceptual structure. As a result, in addition to a bilingual scale instrument (Portuguese and English), recommendations and rules for applying a survey-type research instrument are provided for the continuation of this study. As a general implication of this research, it is expected that its use and application in measuring the stage level of ICT/IS in organizations can help two profiles of individuals: academics who study this topic, as well as professionals seeking answers for their practical actions in the organizations where they work.
Abstract:
Reviewing the definition and measurement of speculative bubbles in the context of contagion, this paper analyses the DotCom bubble in American and European equity markets using the dynamic conditional correlation (DCC) model proposed by Engle and Sheppard (2001), on the one hand as an econometric explanation and on the other hand drawing on behavioural finance as a psychological explanation. Contagion is defined in this context as a statistical break in the computed DCCs, as measured by shifts in their means and medians. Although it is surprising that contagion is lower during price bubbles, the main finding indicates the presence of contagion across the different indices on these two continents and demonstrates the presence of structural changes during the financial crisis.
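A minimal Python sketch of the mean/median-shift check described above is given below; it assumes the DCC series has already been estimated elsewhere (e.g. with an Engle and Sheppard DCC-GARCH implementation), and the function name and test choice are illustrative assumptions.

# Sketch: compare the mean and median of an already-estimated dynamic
# conditional correlation series before and after a candidate break date.
import numpy as np
from scipy.stats import mannwhitneyu

def dcc_break_summary(dcc_series, break_index):
    """Shift in mean/median of the DCC series around a candidate structural break."""
    pre = np.asarray(dcc_series[:break_index])
    post = np.asarray(dcc_series[break_index:])
    stat, p_value = mannwhitneyu(pre, post, alternative="two-sided")
    return {
        "mean_shift": float(post.mean() - pre.mean()),
        "median_shift": float(np.median(post) - np.median(pre)),
        "mannwhitney_p": float(p_value),   # small p suggests a distributional break
    }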
Abstract:
The paper aims to show how curricular complexity tends to be depleted by the use of digital platforms based on the SCORM (Sharable Content Object Reference Model) standard. The standard was created with the main purpose of recycling content, which is supposed to be independent both of the learning context and of the supporting technology, itself deemed to be neutral, all surrounded by a rhetoric of innovation and "pedagogical" innovation. The starting point of the discussion is García Perez's model of Traditional Didactics, used as a simple tool to show, almost graphically, that even an old didactic model is far richer in complexity than the linearity, most of the time in disguise but still visible under a not very sophisticated critical lens, of the human-(reusable)-content interaction that underlies the SCORM standard. The paper also addresses some of the more common deliberate mix-ups related to these digital platforms, such as learning versus teaching, content versus learning objects, and systems of automatic teaching versus learning management systems.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)