980 results for Data manipulation
Abstract:
This paper assesses whether eligibility for conditional cash transfer programs has been manipulated, as well as the impact of this phenomenon on time allocation within households. To perform this analysis, we use data from the 2006 PNAD (Brazilian national household survey) and investigate eligibility manipulation for the Bolsa Família (Family Stipend) program during this period. The program assists families with a monthly per capita income of around R$120.00 (US$60.00). By applying the tests developed by McCrary (2008), we find suggestive evidence that individuals manipulate their income by voluntarily reducing their labor supply in order to become eligible for the program. Moreover, the reduction in labor supply is greater among women, especially single or divorced mothers. This evidence raises some concern about the unintended consequences of the eligibility criteria used by Bolsa Família, as well as the program's impact on individuals living in extreme poverty.
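The McCrary (2008) approach tests for a discontinuity in the density of the running variable (here, per capita income) at the eligibility cutoff. A minimal sketch of such a density-discontinuity check, assuming binned data and a triangular kernel (an illustrative simplification, not the paper's actual implementation):

```python
import numpy as np

def mccrary_gap(income, cutoff, bin_width=5.0, bandwidth=30.0):
    """Rough sketch of a McCrary-style density discontinuity check.

    Bins the running variable, fits a weighted linear regression to the
    bin densities on each side of the cutoff (triangular kernel), and
    returns the log difference of the two fitted densities at the cutoff.
    A large positive gap suggests bunching just below the cutoff.
    """
    edges = np.arange(income.min(), income.max() + bin_width, bin_width)
    counts, edges = np.histogram(income, bins=edges)
    centers = (edges[:-1] + edges[1:]) / 2
    density = counts / (len(income) * bin_width)

    def local_fit(side):
        mask = (centers < cutoff) if side == "left" else (centers >= cutoff)
        x, y = centers[mask], density[mask]
        w = np.maximum(0.0, 1.0 - np.abs(x - cutoff) / bandwidth)  # triangle kernel
        keep = w > 0
        coef = np.polyfit(x[keep] - cutoff, y[keep], 1, w=w[keep])
        return np.polyval(coef, 0.0)  # fitted density at the cutoff

    return np.log(local_fit("left")) - np.log(local_fit("right"))
```

With simulated incomes that bunch just below an eligibility cutoff, the estimated log gap is clearly positive, while a smooth density yields a gap near zero.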
Abstract:
Theories can be produced by individuals seeking a good reputation for knowledge. Hence, a significant question is how to test theories while anticipating that they might have been produced by (potentially uninformed) experts who prefer that their theories not be rejected. If a theory that predicts exactly like the data-generating process is not rejected with high probability, the test is said not to reject the truth. On the other hand, if a false expert with no knowledge of the data-generating process can strategically select theories that will not be rejected, then the test can be ignorantly passed. Such tests have limited use because they cannot feasibly dismiss completely uninformed experts. Many tests proposed in the literature (e.g., calibration tests) can be ignorantly passed. Dekel and Feinberg (2006) introduced a class of tests that seemingly have some power to dismiss uninformed experts. We show that some tests from their class can also be ignorantly passed. One of those tests, however, does not reject the truth and cannot be ignorantly passed; thus, this empirical test can dismiss false experts. We also show that a false reputation of knowledge can be strategically sustained for an arbitrary, but given, number of periods, no matter which test is used (provided that it does not reject the truth). However, false experts can be discredited, even with bounded data sets, if the domain of permissible theories is mildly restricted.
Abstract:
The aim of this study was to evaluate whether digitized images obtained from occlusal radiographs taken with an under- or overdose of radiation could be improved with the aid of image-editing software. Thirteen occlusal radiographs of a dry skull were taken using 13 different exposure times. The radiographs were digitized and then manipulated with the image-editing program. A total of 143 evaluations were performed by specialists in dental radiology, who classified the radiographs as appropriate or not appropriate for interpretation. The Z test was used for statistical analysis of the data, and the results showed that it is possible to manipulate digitized radiographic images taken with 75% of the ideal exposure time and make them suitable for interpretation and diagnosis. Conversely, it was concluded that the overexposed images, 57.50% above the standard exposure time, were inadequate.
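The Z test mentioned above, when used to compare the proportion of radiographs judged adequate under two exposure conditions, can be sketched as follows (an illustrative two-proportion form; the abstract does not specify the exact formulation used):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z test: returns the z statistic and two-sided p-value.

    x1/n1 and x2/n2 are the counts of 'adequate' ratings out of the
    evaluations made under each exposure condition.
    """
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided
    return z, p_value
```

For example, 9 of 11 images rated adequate in one condition versus 2 of 11 in another yields z near 3 and a p-value well below 0.01.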
Abstract:
This paper reports the use of chromatographic profiles of volatiles to determine disease markers in plants - in this case, leaves of Eucalyptus globulus infected by the necrotrophic fungus Teratosphaeria nubilosa. The volatile fraction was isolated by headspace solid-phase microextraction (HS-SPME) and analyzed by comprehensive two-dimensional gas chromatography-fast quadrupole mass spectrometry (GC×GC-qMS). For the correlation between the metabolic profile described by the chromatograms and the presence of the infection, unfolded partial least squares discriminant analysis (U-PLS-DA) with orthogonal signal correction (OSC) was employed. The proposed method was shown to be independent of factors such as the age of the harvested plants. Manipulation of the resulting mathematical model also yielded graphic representations similar to real chromatograms, which allowed the tentative identification of more than 40 compounds potentially useful as disease biomarkers for this plant/pathogen pair. The proposed methodology can be considered highly reliable, since the diagnosis is based on the whole chromatographic profile rather than on the detection of a single analyte. © 2013 Elsevier B.V.
Abstract:
Background: The relationship between normal and tangential force components (grip force, GF, and load force, LF, respectively) acting on the digit-object interface during object manipulation reveals neural mechanisms involved in movement control. Here, we examined whether the type of feedback provided to participants during exertion of LF would influence GF-LF coordination and task performance. Methods: Sixteen young (24.7 ± 3.8 years old) volunteers isometrically exerted a continuously sinusoidal FZ (vertical component of LF) by pulling a fixed instrumented handle up and relaxing under two feedback conditions: targeting and tracking. In the targeting condition, the FZ exertion range was defined by horizontal lines representing the upper (10 N) and lower (1 N) targets, with frequency (0.77 or 1.53 Hz) dictated by a metronome. In the tracking condition, a sinusoidal template set at the same frequencies and range was presented, and participants had to superpose their exerted FZ on it. Task performance was assessed by absolute errors at peaks (AEPeak) and valleys (AEValley), and GF-LF coordination by GF-LF ratios, maximum cross-correlation coefficients (rmax), and time lags. Results: The results revealed no effect of feedback and no feedback-by-frequency interaction on any variable. AEPeak and GF-LF ratio were higher, and rmax lower, at 1.53 Hz than at 0.77 Hz. Conclusion: These findings indicate that the type of feedback does not influence task performance or GF-LF coordination. We therefore recommend the use of tracking tasks when assessing GF-LF coordination during isometric LF exertion on externally fixed instrumented handles, because they are easier to understand and provide additional indices (e.g., RMSE) of voluntary force control. © 2013 Pedão et al.; licensee BioMed Central Ltd.
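The coordination measures above (rmax and time lag) come from the cross-correlation of the two force signals. A minimal sketch, assuming z-scored signals and NumPy's lag convention (illustrative, not the authors' analysis code):

```python
import numpy as np

def gf_lf_coupling(gf, lf, fs):
    """Maximum cross-correlation coefficient (rmax) and its time lag.

    Both signals are z-scored and cross-correlated over all lags; the
    peak coefficient and the corresponding lag in seconds are returned.
    The lag sign follows np.correlate(gf, lf, mode="full").
    """
    gf = (np.asarray(gf) - np.mean(gf)) / np.std(gf)
    lf = (np.asarray(lf) - np.mean(lf)) / np.std(lf)
    n = len(gf)
    r = np.correlate(gf, lf, mode="full") / n   # normalized coefficients
    lags = np.arange(-(n - 1), n)               # lag in samples
    k = int(np.argmax(r))
    return r[k], lags[k] / fs
```

For two identical sinusoids the function returns rmax of 1 at zero lag, as expected.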
Abstract:
We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate the abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method for making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework for assessing the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. A simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
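For contrast with the model-based approach, the conventional line-transect density estimator can be sketched as follows, assuming an untruncated half-normal detection function g(d) = exp(-d²/2σ²) (an illustrative baseline, not the paper's spatial point-process model):

```python
import math

def halfnormal_density(distances, total_line_length):
    """Conventional distance-sampling density estimate (hedged sketch).

    For untruncated perpendicular distances and a half-normal detection
    function, sigma's MLE is sqrt(mean(d^2)); the effective strip
    half-width is then sigma * sqrt(pi/2), and the density estimate is
    n / (2 * L * esw), with L the total transect length.
    """
    n = len(distances)
    sigma = math.sqrt(sum(d * d for d in distances) / n)
    esw = sigma * math.sqrt(math.pi / 2)  # effective strip half-width
    return n / (2 * total_line_length * esw)
```

Simulating animals placed uniformly in a strip and detected with half-normal probability recovers the true density closely.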
Abstract:
The role of the substantia nigra pars reticulata (SNPr) and superior colliculus (SC) network in rat strains susceptible to audiogenic seizures still remains underexplored in epileptology. In a previous study from our laboratory, the GABAergic drugs bicuculline (BIC) and muscimol (MUS) were microinjected into the deep layers of either the anterior SC (aSC) or the posterior SC (pSC) in animals of the Wistar audiogenic rat (WAR) strain submitted to acoustic stimulation, with simultaneous electroencephalographic (EEG) recording of the aSC, pSC, SNPr and striatum. Only MUS microinjected into the pSC blocked audiogenic seizures. In the present study, we expanded upon these previous results using the retrograde tracer Fluorogold (FG), microinjected into the aSC and pSC, in conjunction with quantitative EEG analysis (wavelet transform), in the search for mechanisms associated with the susceptibility of this inbred strain to acoustic stimulation. Our hypothesis was that the WAR strain would have different connectivity between specific subareas of the superior colliculus and the SNPr compared with resistant Wistar animals, and that these connections would lead to altered behavior of this network during audiogenic seizures. Wavelet analysis showed that the only treatment with an anticonvulsant effect was MUS microinjected into the pSC region, and this treatment induced a sustained oscillation in the theta band only in the SNPr and the pSC. These data suggest that in WAR animals there are at least two subcortical loops and that the one involved in audiogenic seizure susceptibility appears to be the pSC-SNPr circuit. We also found that WARs presented an increased number of FG+ projections from the posterior SNPr to both the aSC and pSC (primarily to the pSC), with both acting as proconvulsant nuclei, compared with Wistar rats.
We conclude that these two different subcortical loops within the basal ganglia are probably a consequence of the WAR genetic background. (C) 2012 Elsevier Inc. All rights reserved.
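The theta-band finding above rests on quantifying band-limited power in the EEG. A crude FFT-based stand-in for that kind of analysis (illustrative only: the study used wavelet transforms, and 4-8 Hz is assumed here as a common theta definition for rat EEG):

```python
import numpy as np

def band_power(signal, fs, f_lo=4.0, f_hi=8.0):
    """Fraction of spectral power within a frequency band.

    Computes a one-sided periodogram via the real FFT and returns the
    share of total power that falls between f_lo and f_hi (Hz), e.g.
    the theta band of an EEG segment sampled at fs Hz.
    """
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].sum() / psd.sum()
```

A sustained 6 Hz oscillation concentrates nearly all its power in the 4-8 Hz band, whereas a 20 Hz signal contributes almost none.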
Abstract:
Secondary metabolites play an important role in plant protection against biotic and abiotic stress. In Populus, phenolic glycosides (PGs) and condensed tannins (CTs) are two such groups of compounds derived from the common phenylpropanoid pathway. The basal levels and inducibility of PGs and CTs depend on genetic as well as environmental factors, such as soil nitrogen (N) level. Carbohydrate allocation, transport and sink strength also affect PG and CT levels. A negative correlation between the levels of PGs and CTs has been observed in several studies; however, the molecular mechanism underlying this relationship is not known. We used a cell culture system to investigate the negative correlation between PGs and CTs. Under normal culture conditions, neither salicin nor higher-order PGs accumulated in cell cultures. Several factors, such as hormones, light, organelles and precursors, are discussed in the context of the aspen suspension cells' inability to synthesize PGs. Salicin and its isomer, isosalicin, were detected in cell cultures fed with salicyl alcohol, salicylaldehyde and helicin. At higher levels (5 mM) of salicyl alcohol feeding, accumulation of salicins led to reduced CT production in the cells. Based on metabolic and gene expression data, the CT reduction in salicin-accumulating cells is partly a result of regulatory changes at the transcriptional level affecting carbon partitioning between growth processes and phenylpropanoid CT biosynthesis. Based on molecular studies, the glycosyltransferases GT1-2 and GT1-246 may function in the glycosylation of simple phenolics, such as salicyl alcohol, in cell cultures. The uptake of such glycosides into the vacuole may be mediated to some extent by the tonoplast-localized multidrug-resistance-associated protein transporters PtMRP1 and PtMRP6. In Populus, sucrose is the common transported carbohydrate, and its transport is possibly regulated by sucrose transporters (SUTs). SUTs are also capable of transporting simple PGs, such as salicin.
Therefore, we characterized the SUT gene family in Populus and investigated, by transgenic analysis, the possible role of the most abundantly expressed member, PtSUT4, in PG-CT homeostasis using plants grown under varying nitrogen regimes. PtSUT4 transgenic plants were phenotypically similar to wild-type plants, except that the leaf area-to-stem volume ratio was higher in the transgenic plants. In SUT4 transgenics, levels of non-structural carbohydrates, such as sucrose and starch, were altered in mature leaves. The levels of PGs and CTs were lower in green tissues of transgenic plants under N-replete conditions but higher under N-depleted conditions, compared with the levels in wild-type plants. Based on our results, SUT4 partly regulates N-level-dependent PG-CT homeostasis through differential carbohydrate allocation.
Abstract:
The prevalence of obesity has continued to rise over the last several decades in the United States, leading to overall increases in risk for chronic diseases, including many types of cancer. In contrast, reduction in energy consumption via calorie restriction (CR) has been shown to be a potent inhibitor of carcinogenesis across a broad range of species and tumor types. Previous data have demonstrated differential signaling through Akt and mTOR via IGF-1R and other growth factor receptors across the diet-induced obesity (DIO)/CR spectrum. Furthermore, mTORC1 is known to be regulated directly by nutrient availability, supporting its role in the link between epithelial carcinogenesis and diet-induced obesity. In an effort to better understand the importance of mTORC1 in the context of both positive and negative energy balance during epithelial carcinogenesis, we employed specific pharmacological inhibitors, rapamycin (an mTORC1 inhibitor) and metformin (an AMPK activator), to target mTORC1 or various components of this pathway during skin tumor promotion. Two-stage skin carcinogenesis studies demonstrated that mTORC1 inhibition via rapamycin, metformin or combination treatments greatly inhibited skin tumor development in normal, overweight and obese mice. Furthermore, the mechanisms by which these chemopreventive agents may exert their anti-tumor effects were explored. The effect of these compounds on the epidermal proliferative response was analyzed, and drastic decreases in epidermal hyperproliferation and hyperplasia were found. Rapamycin also inhibited dermal inflammatory cell infiltration in a dose-dependent manner. Both compounds also blocked or attenuated TPA-induced signaling through epidermal mTORC1 as well as several downstream targets. In addition, inhibition of this pathway by metformin appeared to be, at least in part, dependent on AMPK activation in the skin.
Overall, the data indicate that pharmacological strategies targeting this pathway offset the tumor-enhancing effects of DIO and may serve as possible CR mimetics. They suggest that mTORC1 contributes significantly to the process of skin tumor promotion, specifically during dietary energy balance effects. Exploiting the mechanistic information underlying dietary energy balance-responsive pathways will help translate decades of research into effective strategies for the prevention of epithelial carcinogenesis.
Abstract:
The growing field of ocean acidification research is concerned with the investigation of organism responses to increasing pCO2 values. One important approach in this context is culture work using seawater with adjusted CO2 levels. As aqueous pCO2 is difficult to measure directly in small-scale experiments, it is generally calculated from two other measured parameters of the carbonate system (often AT, CT or pH). Unfortunately, the overall uncertainties of measured and subsequently calculated values are often unknown. Especially under high pCO2, this can become a severe problem with respect to the interpretation of physiological and ecological data. In the few datasets from ocean acidification research where all three of these parameters were measured, pCO2 values calculated from AT and CT are typically about 30% lower (i.e. ~300 µatm at a target pCO2 of 1000 µatm) than those calculated from AT and pH or CT and pH. This study presents and discusses these discrepancies as well as likely consequences for the ocean acidification community. Until this problem is solved, one has to consider that calculated parameters of the carbonate system (e.g. pCO2, calcite saturation state) may not be comparable between studies, and that this may have important implications for the interpretation of CO2 perturbation experiments.
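The discrepancy above arises within the standard carbonate speciation calculation. A minimal sketch of deriving pCO2 from CT and pH, using rough illustrative equilibrium constants for 25 °C, S = 35 seawater (magnitudes assumed for illustration only; real work should use a vetted tool such as CO2SYS with carefully chosen constants):

```python
def pco2_from_ct_ph(ct, ph, k0=2.8e-2, k1=1.4e-6, k2=1.1e-9):
    """pCO2 (µatm) from total DIC (mol/kg) and pH: a hedged sketch.

    Uses the standard speciation CO2* = CT / (1 + K1/[H] + K1*K2/[H]^2)
    and Henry's law, pCO2 = CO2*/K0. K0 (mol/kg/atm), K1 and K2 are
    illustrative 25 degC, S=35 magnitudes, not vetted constants.
    """
    h = 10.0 ** (-ph)                                   # [H+] in mol/kg
    co2_star = ct / (1.0 + k1 / h + k1 * k2 / h ** 2)   # dissolved CO2
    return co2_star / k0 * 1e6                          # atm -> µatm
```

With CT around 2000 µmol/kg and pH 8.1, this yields a pCO2 in the few-hundred-µatm range, and lowering pH raises the computed pCO2, as expected.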
Abstract:
Two 7-day mesocosm experiments were conducted in October 2012 at the Instituto Nacional de Desenvolvimento das Pescas (INDP), Mindelo, Cape Verde. Surface water was collected at night before the start of the respective experiment with RV Islândia south of São Vicente (16°44.4'N, 25°09.4'W) and transported to shore in four 600 L food-safe intermediate bulk containers. Sixteen mesocosm bags were distributed among four flow-through water baths and shaded with blue, transparent lids to approximately 20% of surface irradiation. The mesocosm bags were filled from the containers by gravity, using a submerged hose to minimize bubbles. The exact volume inside each bag was calculated by adding 1.5 mmol silicate and measuring the resulting silicate concentration; volumes ranged from 105.5 to 145 L. The experimental manipulation comprised the addition of different amounts of inorganic N and P. In the first experiment, the P supply was varied at constant N supply in thirteen of the sixteen units, while in the second experiment the N supply was varied at constant P supply in twelve of the sixteen units. In addition, "corner points" were chosen that were repeated in both experiments. Four corner points should have been repeated, but setting the nutrient levels in one mesocosm was not successful, so this mesocosm was instead set at the center-point conditions. Experimental treatments were evenly distributed among the four water baths. Initial sampling of the mesocosms on day 1 of each run was conducted between 9:45 and 11:30. After nutrient manipulation, sampling was conducted daily between 09:00 and 10:30 on days 2 to 8.
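The volume determination described above is simple tracer arithmetic: the volume equals the amount of silicate added divided by the concentration rise it produces. A minimal sketch (units assumed: mmol added, µmol/L measured concentrations):

```python
def mesocosm_volume(added_mmol, conc_after_umol_l, conc_before_umol_l=0.0):
    """Bag volume (L) from a known tracer spike.

    V = n_added / delta_C: adding a known amount of silicate and
    measuring the resulting concentration rise gives the enclosed
    volume. Inputs: spike in mmol, concentrations in µmol/L.
    """
    delta = conc_after_umol_l - conc_before_umol_l  # concentration rise, µmol/L
    return added_mmol * 1000.0 / delta              # mmol -> µmol, then /(...)
```

For example, a 1.5 mmol spike that raises the silicate concentration by 12.5 µmol/L implies a 120 L bag, within the 105.5-145 L range reported above.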
Abstract:
The ongoing oceanic uptake of anthropogenic carbon dioxide (CO2) is significantly altering the carbonate chemistry of seawater, a phenomenon referred to as ocean acidification. Experimental manipulations have been increasingly used to gauge how continued ocean acidification will potentially impact marine ecosystems and their associated biogeochemical cycles in the future; however, results amongst studies, particularly those performed on natural communities, are highly variable, which may reflect community- or environment-specific responses or inconsistencies in experimental approach. To investigate the potential for identifying more generic responses and achieving greater experimental reproducibility, we devised and implemented a series (n = 8) of short-term (2-4 days), multi-level (>=4 conditions) carbonate chemistry/nutrient manipulation experiments on a range of natural microbial communities sampled in Northwest European shelf seas. Carbonate chemistry manipulations and the resulting biological responses were found to be highly reproducible within individual experiments and, to a lesser extent, between geographically separated experiments. Statistically robust, reproducible physiological responses of phytoplankton to increasing pCO2, characterised by a suppression of net growth for small-sized cells (<10 µm), were observed in the majority of the experiments, irrespective of natural or manipulated nutrient status. The remaining between-experiment variability was potentially linked to initial community structure and/or other site-specific environmental factors. Analysis of carbon cycling within the experiments revealed the expected increased sensitivity of carbonate chemistry to biological processes at higher pCO2 and hence lower buffer capacity. The results thus emphasise how biogeochemical feedbacks may be altered in the future ocean.
Abstract:
The manipulation and handling of an ever-increasing volume of data by current data-intensive applications require novel techniques for efficient data management. Despite recent advances in every aspect of data management (storage, access, querying, analysis, mining), future applications are expected to scale to even higher degrees, not only in terms of the volume of data handled but also in terms of users and resources, often making use of multiple pre-existing, autonomous, distributed or heterogeneous resources.
Abstract:
Light Detection and Ranging (LIDAR) provides spatial data in point-cloud images with high horizontal and vertical resolution, and is increasingly being used in a number of applications and disciplines, which have concentrated on exploiting and manipulating the data using mainly its three-dimensional nature. Bathymetric LIDAR systems and data are focused mainly on mapping depths in shallow, clear waters with a high degree of accuracy. Additionally, the backscattering produced by the different materials distributed over the bottom surface means that the returned intensity signal contains important information about the reflection properties of these materials. Conveniently processing these values using a Simplified Radiative Transfer Model allows the identification of different sea-bottom types. This paper presents an original method for classifying the sea bottom by processing information extracted from images generated from LIDAR data. The results are validated using a vector database containing benthic information derived from marine surveys.
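A common first step in intensity-based bottom classification is compensating each return for water-column attenuation so that bottoms at different depths become comparable. A minimal sketch, assuming a simple two-way exponential attenuation model (a hypothetical simplification, not the paper's Simplified Radiative Transfer Model; the attenuation coefficient k is an assumed parameter):

```python
import math

def bottom_reflectance(intensity, depth, k=0.1):
    """Depth-corrected bottom return (hedged sketch).

    Compensates the measured intensity for two-way exponential
    water-column attenuation, I_bottom = I_measured * exp(2*k*depth),
    where k is an assumed diffuse attenuation coefficient (1/m) and
    depth is in meters.
    """
    return intensity * math.exp(2.0 * k * depth)
```

Corrected values can then be grouped (e.g., by thresholding or clustering) into bottom classes, since materials with similar reflectance yield similar corrected intensities regardless of depth.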