883 results for non separable data
Abstract:
We investigate the effect of education Conditional Cash Transfer programs (CCTs) on teenage pregnancy. Our main concern is with how the size and sign of the effect may depend on the design of the program. Using a simple model we show that an education CCT that conditions renewal on school performance reduces teenage pregnancy, whereas a program that does not condition on school performance can increase it. Then, using an original database, we estimate the causal impact on teenage pregnancy of two education CCTs implemented in Bogotá (Subsidio Educativo, SE, and Familias en Acción, FA); the two programs differ chiefly in whether school success is a condition for renewal. We show that SE has a negative average effect on teenage pregnancy while FA has a null average effect. We also find that SE has either a null or a negative effect for adolescents in all age and grade groups, while FA has positive, null, or negative effects for adolescents in different age and grade groups. Since SE conditions renewal on school success and FA does not, we argue that the empirical results are consistent with the predictions of our model and that conditioning renewal of the subsidy on school success crucially determines the effect of the subsidy on teenage pregnancy.
Abstract:
This thesis studies how to estimate the distribution of regionalized variables whose sample space and scale admit a Euclidean space structure. We apply the principle of working in coordinates: we choose an orthonormal basis, carry out the statistics on the coordinates of the data, and apply the output to the basis in order to recover a result in the original space. Applied to regionalized variables, this yields a single consistent approach that generalizes the well-known properties of kriging techniques to several sample spaces: real, positive, or compositional data (vectors of positive components with constant sum) are treated as particular cases. In this way, linear geostatistics is generalized, and solutions are offered to well-known problems of nonlinear geostatistics, by adapting the measure and the representativeness criteria (i.e., means) to the data at hand. The estimator for positive data coincides with a weighted geometric mean, equivalent to estimating the median, without any of the problems of classical lognormal kriging. The compositional case offers equivalent solutions, and in addition allows the estimation of multinomial probability vectors. With a preliminary Bayesian step, kriging of compositions also becomes a consistent alternative to indicator kriging. The latter technique is used to estimate probability functions of arbitrary variables, but it often yields negative estimates, which the proposed alternative avoids. The usefulness of this set of techniques is demonstrated by studying ammonia pollution at an automatic water-quality monitoring station in the Tordera basin, concluding that only with the proposed techniques can one detect the moments at which ammonium is transformed into ammonia at concentrations above the legal limit.
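As a minimal illustration (not the thesis's own code) of the working-in-coordinates idea for positive data: ordinary kriging is applied to the logarithms of the observations and the result is mapped back with the exponential, so the predictor is a weighted geometric mean of the data, i.e. a median-type estimate. The exponential covariance model, its parameters, and the four sample points are invented for the example.

```python
import numpy as np

def ok_weights(coords, target, sill=1.0, rng_par=10.0):
    """Ordinary kriging weights under an (assumed) exponential covariance model."""
    cov = lambda h: sill * np.exp(-h / rng_par)
    n = len(coords)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=1))
    return np.linalg.solve(A, b)[:n]          # weights sum to one

# Illustrative positive observations (e.g. concentrations) at four locations.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.5, 1.5]])
z = np.array([0.8, 1.2, 0.5, 2.0])

w = ok_weights(coords, np.array([0.7, 0.6]))

# Working in coordinates: krige log(z) and map back through exp(); the resulting
# predictor is a weighted geometric mean of the data (a median-type estimate),
# with no lognormal back-transform correction needed.
estimate = np.exp(w @ np.log(z))
print(estimate, np.prod(z ** w))              # identical expressions
```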
Abstract:
This study examined culturally and linguistically diverse families with deaf and hard of hearing children. A literature review covered the rate of immigration to the United States, English-speaking parents of children who are deaf and hard of hearing, bilingual education, and the obstacles bilingual parents of children who are deaf and hard of hearing may face. The data obtained were used to compile a list of resources, available in languages other than English, for parents of children who are deaf and hard of hearing, in order to assist these families.
Abstract:
The object of analysis in the present text is the issue of operational control and data retention in Poland. The analysis of this issue follows from a critical stance taken by NGOs and state institutions on the scope of operational control wielded by the Polish police and special services – it concerns, in particular, the employment of “itemized phone bills and the so-called phone tapping.” Besides the quantitative analysis of operational control and the scope of data retention, the text features the conclusions of the Human Rights Defender referred to the Constitutional Tribunal in 2011. It must be noted that the main problems concerned with the employment of operational control and data retention are caused by: (1) a lack of specification of technical means which can be used by individual services; (2) a lack of specification of what kind of information and evidence is in question; (3) an open catalogue of information and evidence which can be clandestinely acquired in an operational mode. Furthermore, with regard to the access granted to teleinformation data by the Telecommunications Act, attention should be drawn to a wide array of data submitted to particular services. Also, the text draws on the so-called open interviews conducted mainly with former police officers with a view to pointing to some non-formal reasons for “phone tapping” in Poland. This comes in the form of a summary.
Abstract:
Populations of Lesser Scaup (Aythya affinis) have declined markedly in North America since the early 1980s. When considering alternatives for achieving population recovery, it would be useful to understand how the rate of population growth is functionally related to the underlying vital rates and which vital rates affect population growth rate the most if changed (which need not be those that influenced historical population declines). To establish a more quantitative basis for learning about life history and population dynamics of Lesser Scaup, we summarized published and unpublished estimates of vital rates recorded between 1934 and 2005, and developed matrix life-cycle models with these data for females breeding in the boreal forest, prairie-parklands, and both regions combined. We then used perturbation analysis to evaluate the effect of changes in a variety of vital-rate statistics on finite population growth rate and abundance. Similar to Greater Scaup (Aythya marila), our modeled population growth rate for Lesser Scaup was most sensitive to unit and proportional change in adult female survival during the breeding and non-breeding seasons, but much less so to changes in fecundity parameters. Interestingly, population growth rate was also highly sensitive to unit and proportional changes in the mean of nesting success, duckling survival, and juvenile survival. Given the small samples of data for key aspects of the Lesser Scaup life cycle, we recommend additional research on vital rates that demonstrate a strong effect on population growth and size (e.g., adult survival probabilities). Our life-cycle models should be tested and regularly updated in the future to simultaneously guide science and management of Lesser Scaup populations in an adaptive context.
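A hedged sketch of the kind of matrix life-cycle model and perturbation analysis the abstract describes, using a two-stage (juvenile/adult) female-only matrix with placeholder vital rates rather than the paper's estimates; it computes the finite growth rate and the sensitivity and elasticity of that rate to each matrix entry.

```python
import numpy as np

# Illustrative (not the paper's) female-only two-stage model: fecundity F folds
# together nest success, ducklings per female, and duckling survival, while S_j
# and S_a are placeholder juvenile and adult annual survival probabilities.
S_j, S_a, F = 0.35, 0.70, 0.45
A = np.array([[F,   F  ],    # recruits produced per juvenile and per adult female
              [S_j, S_a]])   # survival into / within the adult class

vals, W = np.linalg.eig(A)
k = np.argmax(vals.real)
lam = vals[k].real                              # finite rate of population growth
w = np.abs(W[:, k].real)                        # stable stage distribution (right eigenvector)

vals_t, V = np.linalg.eig(A.T)                  # A and A.T share the dominant eigenvalue
v = np.abs(V[:, np.argmax(vals_t.real)].real)   # reproductive values (left eigenvector)

sens = np.outer(v, w) / (v @ w)                 # sensitivity: d(lambda) / d(a_ij)
elas = sens * A / lam                           # elasticity: proportional sensitivity
print(f"lambda = {lam:.3f}")
print("sensitivities:\n", sens.round(3))
print("elasticities:\n", elas.round(3))
```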
Abstract:
Two wavelet-based control variable transform schemes are described and are used to model some important features of forecast error statistics for use in variational data assimilation. The first is a conventional wavelet scheme and the other is an approximation of it. Their ability to capture the position- and scale-dependent aspects of covariance structures is tested in a two-dimensional latitude-height context. This is done by comparing the covariance structures implied by the wavelet schemes with those found from the explicit forecast error covariance matrix, and with a non-wavelet-based covariance scheme used currently in an operational assimilation scheme. Qualitatively, the wavelet-based schemes show potential for modeling forecast error statistics well without giving preference to either position- or scale-dependent aspects. The degree of spectral representation can be controlled by changing the number of spectral bands in the schemes, and the least number of bands that achieves adequate results is found for the model domain used. Evidence is found of a trade-off between the localization of features in positional and spectral spaces when the number of bands is changed. By examining implied covariance diagnostics, the wavelet-based schemes are found, on the whole, to give results that are closer to diagnostics found from the explicit matrix than from the non-wavelet scheme. Even though the nature of the covariances has the right qualities in spectral space, variances are found to be too low at some wavenumbers and vertical correlation length scales are found to be too long at most scales. The wavelet schemes are found to be good at resolving variations in position- and scale-dependent horizontal length scales, although the length scales reproduced are usually too short. The second of the wavelet-based schemes is often found to be better than the first in some important respects, but, unlike the first, it has no exact inverse transform.
Abstract:
Bayesian inference has been used to determine rigorous estimates of hydroxyl radical concentrations ([OH]) and air mass dilution rates (K) averaged following air masses between linked observations of nonmethane hydrocarbons (NMHCs) spanning the North Atlantic during the Intercontinental Transport and Chemical Transformation (ITCT)-Lagrangian-2K4 experiment. The Bayesian technique obtains a refined (posterior) distribution of a parameter given data related to the parameter through a model and prior beliefs about the parameter distribution. Here, the model describes hydrocarbon loss through OH reaction and mixing with a background concentration at rate K. The Lagrangian experiment provides direct observations of hydrocarbons at two time points, removing assumptions regarding composition or sources upstream of a single observation. The estimates are sharpened by using many hydrocarbons with different reactivities and accounting for their variability and measurement uncertainty. A novel technique is used to construct prior background distributions of many species, described by the variation of a single parameter. This exploits the high correlation of species, related by the first principal component of many NMHC samples. The Bayesian method obtains posterior estimates of [OH] and K following each air mass. Median values are typically between 0.5 and 2.0 × 10⁶ molecules cm⁻³, but are elevated to between 2.5 and 3.5 × 10⁶ molecules cm⁻³ in low-level pollution. A comparison of estimates from absolute NMHC concentrations and from NMHC ratios assuming zero background (the "photochemical clock" method) shows similar distributions but reveals a systematic high bias in the estimates from ratios. Estimates of K are ∼0.1 day⁻¹ but show more sensitivity to the prior distribution assumed.
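A minimal sketch of the kind of Bayesian update involved, under simplifying assumptions: each hydrocarbon decays by OH reaction and relaxes toward an assumed background at dilution rate K between the two linked samples, and a grid posterior over [OH] and K is formed from the mismatch with the downstream observations. The rate constants, concentrations, backgrounds, uncertainties and flat priors are illustrative, not the campaign's values or the paper's informative priors.

```python
import numpy as np

# Illustrative OH rate constants (cm^3 molecule^-1 s^-1) and made-up upstream (c0),
# downstream (c1) and background (c_bg) mixing ratios for four NMHCs (pptv).
k_oh = np.array([2.4e-13, 1.1e-12, 2.3e-12, 8.7e-12])
c0   = np.array([1200.0, 450.0, 180.0, 60.0])
c1   = np.array([1080.0, 360.0, 130.0, 30.0])
c_bg = np.array([ 900.0, 250.0,  80.0, 15.0])
sigma = 0.10 * c1                          # ~10 % measurement uncertainty (assumed)
dt = 2.0 * 86400.0                         # 2 days between the linked samples (s)

def predict(oh, K):
    """Analytic solution of dC/dt = -k_oh*[OH]*C - K*(C - c_bg) after time dt."""
    a = k_oh * oh + K
    c_eq = K * c_bg / a
    return c_eq + (c0 - c_eq) * np.exp(-a * dt)

# Grid posterior with flat priors over plausible ranges (the paper instead uses
# informative priors, including a one-parameter prior on the background species).
oh_grid = np.linspace(0.1e6, 5e6, 150)             # molecules cm^-3
K_grid  = np.linspace(0.0, 0.5, 150) / 86400.0     # 0-0.5 day^-1, converted to s^-1
log_post = np.array([[-0.5 * np.sum(((c1 - predict(oh, K)) / sigma) ** 2)
                      for K in K_grid] for oh in oh_grid])

post = np.exp(log_post - log_post.max())
post /= post.sum()
oh_marginal = post.sum(axis=1)
print("posterior mean [OH]: %.2f x 1e6 molecules cm-3"
      % (np.sum(oh_marginal * oh_grid) / 1e6))
```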
Abstract:
Satellite-based rainfall monitoring is widely used for climatological studies because of its full global coverage, but it is also of great importance for operational purposes, especially in areas such as Africa where there is a lack of ground-based rainfall data. Satellite rainfall estimates have enormous potential benefits as input to hydrological and agricultural models because of their real-time availability, low cost and full spatial coverage. One issue that needs to be addressed is the uncertainty in these estimates. This is particularly important in assessing the likely errors in the output of non-linear models (rainfall-runoff or crop yield) which use the rainfall estimates, aggregated over an area, as input. Correct assessment of the uncertainty in the rainfall is non-trivial as it must take account of:
• the difference in spatial support between the satellite information and the independent data used for calibration
• uncertainties in the independent calibration data
• the non-Gaussian distribution of rainfall amount
• the spatial intermittency of rainfall
• the spatial correlation of the rainfall field
This paper describes a method for estimating the uncertainty in satellite-based rainfall values that takes account of these factors. The method involves firstly a stochastic calibration, which completely describes the probability of rainfall occurrence and the pdf of rainfall amount for a given satellite value, and secondly the generation of an ensemble of rainfall fields based on the stochastic calibration but with the correct spatial correlation structure within each ensemble member. This is achieved by the use of geostatistical sequential simulation. The ensemble generated in this way may be used to estimate uncertainty at larger spatial scales. A case study of daily rainfall monitoring in the Gambia, West Africa, for the purpose of crop yield forecasting is presented to illustrate the method.
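A toy sketch of the two-step idea, with assumed rather than fitted ingredients: (i) a stochastic "calibration" gives each pixel a rain probability and a lognormal amount distribution as a function of the satellite value; (ii) spatially correlated ensemble members are generated by drawing a correlated Gaussian field, here via the Cholesky factor of an exponential correlation model as a simple stand-in for geostatistical sequential simulation, and mapping it through those distributions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy 20 x 20 field of a satellite proxy (e.g. cold-cloud duration); values invented.
n = 20
x, y = np.meshgrid(np.arange(n), np.arange(n))
sat = np.clip(10.0 - 0.4 * np.hypot(x - 10, y - 8), 0.0, None)

# (i) Stochastic calibration (assumed functional forms, not a fitted model):
# rain probability and lognormal amount parameters for each satellite value.
p_rain = 1.0 / (1.0 + np.exp(-(sat - 4.0)))
mu_log, sigma_log = 0.3 * sat, 0.8

# (ii) Spatial correlation via the Cholesky factor of an exponential correlation
# matrix (a simple stand-in for sequential simulation).
pts = np.column_stack([x.ravel(), y.ravel()])
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
L = np.linalg.cholesky(np.exp(-dist / 5.0) + 1e-6 * np.eye(n * n))

def member():
    g = (L @ rng.standard_normal(n * n)).reshape(n, n)   # correlated N(0,1) field
    u = norm.cdf(g)                                       # correlated uniforms
    rain = u < p_rain                                     # rainfall occurrence
    # Rescale the same uniform to (0,1) conditional on rain and use it as the
    # amount quantile, so occurrence and amount stay spatially coherent.
    q = np.clip(np.where(rain, u / np.clip(p_rain, 1e-9, None), 0.5), 1e-6, 1 - 1e-6)
    return np.where(rain, np.exp(mu_log + sigma_log * norm.ppf(q)), 0.0)

ensemble = np.stack([member() for _ in range(50)])
areal = ensemble.mean(axis=(1, 2))                        # area-average rainfall per member
print("areal rainfall: mean %.2f, 5-95%% range %s"
      % (areal.mean(), np.percentile(areal, [5, 95]).round(2)))
```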
Abstract:
Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high resolution TerraSAR-X data to detect flooded regions in urban areas is described. An important application for this would be the calibration and validation of the flood extent predicted by an urban flood inundation model. To date, research on such models has been hampered by a lack of suitable distributed validation data. The study uses a 3 m resolution TerraSAR-X image of a 1-in-150-year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with airborne LiDAR data to estimate regions of the TerraSAR-X image in which water would not be visible due to radar shadow or layover caused by buildings and taller vegetation, and these regions were masked out in the flood detection process. A semi-automatic algorithm for the detection of floodwater was developed, based on a hybrid approach. Flooding in rural areas adjacent to the urban areas was detected using an active contour model (snake) region-growing algorithm seeded using the un-flooded river channel network, which was applied to the TerraSAR-X image fused with the LiDAR DTM to ensure the smooth variation of heights along the reach. A simpler region-growing approach was used in the urban areas, which was initialized using knowledge of the flood waterline in the rural areas. Seed pixels having low backscatter were identified in the urban areas using supervised classification based on training areas for water taken from the rural flood, and non-water taken from the higher urban areas. Seed pixels were required to have heights less than a spatially varying height threshold determined from nearby rural waterline heights. Seed pixels were clustered into urban flood regions based on their close proximity, rather than requiring that all pixels in the region should have low backscatter. This approach was taken because it appeared that urban water backscatter values were corrupted in some pixels, perhaps due to contributions from side-lobes of strong reflectors nearby. The TerraSAR-X urban flood extent was validated using the flood extent visible in the aerial photos: 76% of the urban water pixels visible to TerraSAR-X were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19%, respectively. These findings indicate that TerraSAR-X is capable of providing useful data for the calibration and validation of urban flood inundation models.
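A highly simplified sketch of the urban seeding and clustering step on synthetic arrays: pixels that are dark, lie below a waterline height threshold (constant here rather than spatially varying) and are outside the shadow/layover mask become seeds, and seeds are clustered into flood regions by proximity using binary dilation and connected-component labelling. The thresholds and data are invented, not the study's values.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
shape = (100, 100)

# Synthetic stand-ins for the real inputs (SAR backscatter in dB, LiDAR heights in m,
# and a shadow/layover mask from the SAR simulator); all values are illustrative.
backscatter = rng.normal(-8.0, 3.0, shape)
heights = np.linspace(9.0, 14.0, shape[0])[:, None] + np.zeros(shape[1])
shadow_layover = rng.random(shape) < 0.05

DB_THRESHOLD = -12.0   # "low backscatter" cut-off (would come from supervised classification)
WATERLINE = 10.5       # nearby rural waterline height (constant here, not spatially varying)

# Seed pixels: dark, low-lying, and visible to the radar.
seeds = (backscatter < DB_THRESHOLD) & (heights < WATERLINE) & ~shadow_layover

# Cluster seeds into flood regions by proximity: dilation bridges small gaps, so a
# region does not require every member pixel to have low backscatter.
grown = ndimage.binary_dilation(seeds, iterations=2)
labels, n_regions = ndimage.label(grown)
sizes = np.bincount(labels.ravel())                 # sizes[0] is the background count
keep = np.flatnonzero(sizes >= 20)
flood = np.isin(labels, keep[keep != 0])            # drop tiny clusters and background
print(f"{n_regions} candidate regions, flooded fraction = {flood.mean():.3f}")
```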
Abstract:
The problems of inverting experimental information obtained from vibration-rotation spectroscopy to determine the potential energy surface of a molecule are discussed, both in relation to semi-rigid molecules like HCN, NO2, H2CO, etc., and in relation to non-rigid or floppy molecules with large amplitude vibrations like HCNO, C3O2, and small ring molecules. Although standard methods exist for making the necessary calculations in the former case, they are complex, and they require an abundance of precise data on the spectrum that is rarely available. In the case of floppy molecules there are often data available over many excited states of the large amplitude vibration, but there are difficulties in knowing the precise form of the large amplitude coordinate(s), and in allowing for the vibrational averaging effects of the other modes. In both cases difficulties arise from the curvilinear nature of the vibrational paths which are not adequately handled by our present theories.
Abstract:
Fixed transaction costs that prohibit exchange engender bias in supply analysis due to censoring of the sample observations. The associated bias in conventional regression procedures applied to censored data, and the construction of robust methods for mitigating that bias, have been preoccupations of applied economists since Tobin [Econometrica 26 (1958) 24]. This literature assumes that the true point of censoring in the data is zero; when this is not the case, a bias is imparted to parameter estimates of the censored regression model. We conjecture that this bias can be significant; affirm this from experiments; and suggest techniques for mitigating it using Bayesian procedures. The bias-mitigating procedures are based on modifications of the key step that facilitates Bayesian estimation of the censored regression model; are easy to implement; work well in both small and large samples; and lead to significantly improved inference in the censored regression model. These findings are important in light of the widespread use of the zero-censored Tobit regression, and we investigate their consequences using data on milk-market participation in the Ethiopian highlands.
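A hedged sketch of the data-augmentation step that underlies Bayesian estimation of the censored regression model: within a Gibbs sampler, latent values for censored observations are drawn from a normal truncated above at the censoring point. Here the censoring point is treated as known, and the contrast between assuming censoring at zero and using the true nonzero threshold illustrates the bias the paper is concerned with; the paper's own modification of this step for an unknown threshold is not reproduced. Data, priors and sampler settings are illustrative.

```python
import numpy as np
from scipy.stats import truncnorm, invgamma

rng = np.random.default_rng(2)

# Simulate a censored regression whose true censoring point c is NOT zero.
n, beta_true, sigma_true, c = 500, np.array([1.0, 2.0]), 1.0, 1.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y_star = X @ beta_true + sigma_true * rng.normal(size=n)
y = np.maximum(y_star, c)              # observed data pile up at c
censored = y <= c                      # the piled-up observations

def gibbs(censor_point, draws=1000):
    """Data-augmentation Gibbs sampler for a Tobit-type model (vague priors)."""
    beta, sig2, z, keep = np.zeros(2), 1.0, y.copy(), []
    for it in range(draws):
        # 1) impute latent y* for censored cases from a normal truncated above
        #    at the assumed censoring point
        mu = X[censored] @ beta
        upper = (censor_point - mu) / np.sqrt(sig2)
        z[censored] = truncnorm.rvs(-np.inf, upper, loc=mu, scale=np.sqrt(sig2),
                                    random_state=rng)
        # 2) draw beta | z, sigma^2  (conjugate normal, near-flat prior)
        V = np.linalg.inv(X.T @ X / sig2 + 1e-6 * np.eye(2))
        beta = rng.multivariate_normal(V @ (X.T @ z) / sig2, V)
        # 3) draw sigma^2 | z, beta  (inverse gamma)
        resid = z - X @ beta
        sig2 = invgamma.rvs(n / 2.0, scale=resid @ resid / 2.0, random_state=rng)
        if it >= draws // 2:
            keep.append(beta)
    return np.mean(keep, axis=0)

print("assuming censoring at zero :", gibbs(0.0).round(2))   # biased: true c is 1.5
print("using the true threshold   :", gibbs(c).round(2))     # close to (1.0, 2.0)
```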
Abstract:
The findings from a study measuring consumer acceptance of genetically modified (GM) foods are presented. The empirical data were collected in an experimental market, an approach used extensively in experimental economics for measuring the monetary value of goods. The approach has several advantages over standard approaches used in sensory and marketing research (e.g., surveys and focus groups) because of its non-hypothetical nature and the realism introduced by using real goods, real money, and market discipline. In each of three US locations, we elicited the monetary compensation consumers required to consume a GM food. Providing positive information about the benefits of GM food production, in some cases, reduced the level of monetary compensation demanded to consume the GM food.
Abstract:
This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p0. Next, the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored.
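A rough sketch of the binary, single-stream case only, with invented thresholds: for a candidate sample size, the design is judged adequate if the trial is likely to end with a high posterior probability either that the success rate exceeds p0 or that it falls short of p0 plus the clinically relevant difference. The Beta prior, probability cut-offs and assurance target are illustrative and are not the paper's formulation.

```python
import numpy as np
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(3)

p0, delta = 0.5, 0.15       # reference success rate and clinically relevant improvement
a0, b0 = 1.0, 1.0           # Beta(1, 1) 'non-informative' prior
CONVINCING = 0.95           # posterior probability needed for a conclusive trial

def assurance(n, p_true, sims=2000):
    """Chance that a trial of size n ends with convincing evidence either way."""
    x = rng.binomial(n, p_true, size=sims)
    a, b = a0 + x, b0 + n - x
    p_better = 1.0 - beta_dist.cdf(p0, a, b)          # P(p > p0 | data)
    p_not_enough = beta_dist.cdf(p0 + delta, a, b)    # P(p < p0 + delta | data)
    return np.mean(np.maximum(p_better, p_not_enough) >= CONVINCING)

# Smallest n (on a coarse grid) that is likely to be conclusive both when the
# treatment truly works (p = p0 + delta) and when it truly does not (p = p0).
for n in range(20, 401, 20):
    worst = min(assurance(n, p0 + delta), assurance(n, p0))
    if worst >= 0.8:
        print(f"smallest adequate n on this grid: {n}, assurance = {worst:.3f}")
        break
```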
Abstract:
Avian genomes are small and streamlined compared with those of other amniotes by virtue of having fewer repetitive elements and less non-coding DNA(1,2). This condition has been suggested to represent a key adaptation for flight in birds, by reducing the metabolic costs associated with having large genome and cell sizes(3,4). However, the evolution of genome architecture in birds, or any other lineage, is difficult to study because genomic information is often absent for long-extinct relatives. Here we use a novel Bayesian comparative method to show that bone-cell size correlates well with genome size in extant vertebrates, and hence use this relationship to estimate the genome sizes of 31 species of extinct dinosaur, including several species of extinct birds. Our results indicate that the small genomes typically associated with avian flight evolved in the saurischian dinosaur lineage between 230 and 250 million years ago, long before this lineage gave rise to the first birds. By comparison, ornithischian dinosaurs are inferred to have had much larger genomes, which were probably typical for ancestral Dinosauria. Using comparative genomic data, we estimate that genome-wide interspersed mobile elements, a class of repetitive DNA, comprised 5-12% of the total genome size in the saurischian dinosaur lineage, but 7-19% of the total genome size in ornithischian dinosaurs, suggesting that repetitive elements became less active in the saurischian lineage. These genomic characteristics should be added to the list of attributes previously considered avian but now thought to have arisen in non-avian dinosaurs, such as feathers(5), pulmonary innovations(6), and parental care and nesting.
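As a rough illustration of the prediction step only: an ordinary least-squares regression of log genome size on log bone-cell (osteocyte lacuna) size, with a prediction interval for a fossil measurement. The data are invented and the method ignores phylogeny, unlike the paper's Bayesian comparative approach.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Made-up training data: log10 lacuna size (x) vs log10 genome size (y) for extant
# vertebrates; the paper instead fits a phylogenetically informed Bayesian model.
x = rng.normal(2.0, 0.3, 40)
y = 0.9 * x - 1.3 + rng.normal(0.0, 0.1, 40)

slope, intercept, r, p, se = stats.linregress(x, y)

def predict_with_interval(x_new, level=0.95):
    """OLS point prediction and prediction interval for a fossil's log genome size."""
    n = len(x)
    resid = y - (intercept + slope * x)
    s = np.sqrt(resid @ resid / (n - 2))
    se_pred = s * np.sqrt(1 + 1 / n + (x_new - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
    t = stats.t.ppf(0.5 + level / 2, n - 2)
    yhat = intercept + slope * x_new
    return yhat, yhat - t * se_pred, yhat + t * se_pred

# Hypothetical fossil lacuna measurement: infer its genome size with uncertainty.
print([round(v, 3) for v in predict_with_interval(1.8)])
```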