Abstract:
In the present study, different aerial parts from twelve Amazonian plant species found in the National Institute for Amazon Research's (INPA's) Adolpho Ducke Forest Reserve (in Manaus, Amazonas, Brazil) were collected. Separate portions of dried, ground plant materials were extracted with water (by infusion), methanol and chloroform (by continuous liquid-solid extraction), and solvents were removed first by rotary evaporation and finally by freeze-drying, which yielded a total of seventy-one freeze-dried extracts for evaluation. These extracts were evaluated initially at concentrations of 500 and 100 µg/mL for in vitro hemolytic activity and in vitro inhibition of platelet aggregation in human blood, respectively. Sixteen extracts (23% of all extracts tested, 42% of all plant species), representing the following plants: Chaunochiton kappleri (Olacaceae), Diclinanona calycina (Annonaceae), Paypayrola grandiflora (Violaceae), Pleurisanthes parviflora (Icacinaceae), and Sarcaulus brasiliensis (Sapotaceae), exhibited significant inhibitory activity towards human platelet aggregation. A group of extracts with antiplatelet aggregation activity and no in vitro hemolytic activity has therefore been identified. Three extracts (4%), all derived from Elaeoluma nuda (Sapotaceae), exhibited hemolytic activity. None of the plant species in this study has a known use in traditional medicine. These data therefore serve as a baseline of the antiplatelet and hemolytic activities (and potential usefulness) of non-medicinal plants from the Amazon forest. In general, these are the first data on hemolytic and platelet-aggregation-inhibitory activity for the genera these plant species represent.
Abstract:
Tau-mediated neurodegeneration is a central event in Alzheimer's disease (AD) and other tauopathies. Consistent with suggestions that lifetime stress may be a clinically-relevant precipitant of AD pathology, we previously showed that stress triggers tau hyperphosphorylation and accumulation; however, little is known about the etiopathogenic interaction of chronic stress with other AD risk factors, such as sex and aging. This study focused on how these various factors converge on the cellular mechanisms underlying tau aggregation in the hippocampus of chronically stressed male and female (middle-aged and old) mice expressing the most commonly found disease-associated Tau mutation in humans, P301L-Tau. We report that environmental stress triggers memory impairments in female, but not male, P301L-Tau transgenic mice. Furthermore, stress elevates levels of caspase-3-truncated tau and insoluble tau aggregates exclusively in the female hippocampus while it also alters the expression of the molecular chaperones Hsp90, Hsp70, and Hsp105, thus favoring accumulation of tau aggregates. Our findings provide new insights into the molecular mechanisms through which clinically-relevant precipitating factors contribute to the pathophysiology of AD. Our data point to the exquisite sensitivity of the female hippocampus to stress-triggered tau pathology.
Abstract:
OBJECTIVE: Studies have demonstrated that methylxanthines, such as caffeine, are antagonists of the A1 and A2 adenosine receptors found in the brain, heart, lungs, peripheral vessels, and platelets. Considering the high consumption of products containing caffeine in Brazil and throughout the rest of the world, the authors proposed to observe the effects of this substance on blood pressure and platelet aggregation. METHODS: Thirteen young adults, ranging from 21 to 27 years of age, participated in this study. Each individual took 750 mg/day of caffeine (250 mg three times daily) over a period of seven days. The effects on blood pressure were analyzed through the pressor test with handgrip, and platelet aggregation was analyzed using adenosine diphosphate, collagen, and adrenaline. RESULTS: Diastolic pressure showed a significant increase 24 hours after the first intake (p<0.05). This effect, however, disappeared on the subsequent days. The platelet aggregation tests did not reveal statistically significant alterations at any time during the study. CONCLUSION: The data suggest that caffeine increases diastolic blood pressure at the beginning of caffeine intake; this hypertensive effect disappears with chronic use. The absence of alterations in platelet aggregation indicates the need for larger randomized studies.
Abstract:
This paper assesses empirically the importance of size discrimination and disaggregate data for deciding where to locate a start-up concern. We compare three econometric specifications using Catalan data: a multinomial logit with 4 and 41 alternatives (provinces and comarques, respectively) in which firm size is the main covariate; a conditional logit with 4 and 41 alternatives including attributes of the sites as well as size-site interactions; and a Poisson model on the comarques and the full spatial choice set (942 municipalities) with site-specific variables. Our results suggest that if these two issues are ignored, conclusions may be misleading. We provide evidence that large and small firms behave differently and conclude that Catalan firms tend to choose between comarques rather than between municipalities. Moreover, labour-intensive firms seem more likely to be located in the city of Barcelona. Keywords: Catalonia, industrial location, multinomial response model. JEL: C250, E30, R00, R12
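As a minimal sketch of the discrete-choice machinery behind such location models (the attribute values and coefficients below are hypothetical, not taken from the Catalan data), a conditional logit assigns each site a choice probability via a softmax over site attributes:

```python
import numpy as np

def conditional_logit_probs(X, beta):
    """Choice probabilities P(site j) = exp(x_j' beta) / sum_k exp(x_k' beta)
    for one firm choosing among J sites with attribute matrix X (J x K)."""
    u = X @ beta           # deterministic utility of each site
    u -= u.max()           # subtract the max to stabilize the softmax
    e = np.exp(u)
    return e / e.sum()

# hypothetical example: 4 sites, 2 attributes (e.g. land price, market access)
X = np.array([[1.0, 0.2],
              [0.5, 0.8],
              [0.3, 0.5],
              [0.9, 0.1]])
beta = np.array([-1.0, 2.0])   # illustrative coefficients
p = conditional_logit_probs(X, beta)
print(p)                       # probabilities over the 4 sites, summing to 1
```

The Poisson specification on the full choice set is closely related: with site-specific covariates only, its likelihood is proportional to that of this conditional logit.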
Abstract:
We analyze the two-dimensional parabolic-elliptic Patlak-Keller-Segel model in the whole Euclidean space R². Under the hypotheses of integrable initial data with finite second moment and entropy, we first show local-in-time existence for any mass of "free-energy solutions", namely weak solutions with some free-energy estimates. We also prove that the solution exists as long as the entropy is controlled from above. The main result of the paper is the global existence of free-energy solutions with initial data as before for the critical mass 8π/χ. Actually, we prove that solutions blow up as a Dirac delta at the center of mass as t→∞, keeping their second moment constant at all times. Furthermore, all moments larger than 2 blow up as t→∞ if initially bounded.
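For reference, the parabolic-elliptic Patlak-Keller-Segel system discussed above is usually written as follows (a standard statement from the PKS literature, not quoted verbatim from the paper):

```latex
\begin{align*}
  \partial_t \rho &= \Delta \rho - \chi\, \nabla\!\cdot(\rho\, \nabla c),
    && x \in \mathbb{R}^2,\ t > 0, \\
  c(t,x) &= -\frac{1}{2\pi} \int_{\mathbb{R}^2} \log|x-y|\, \rho(t,y)\, dy,
    && \text{so that } -\Delta c = \rho,
\end{align*}
```

with conserved mass \(M = \int_{\mathbb{R}^2} \rho_0\, dx\) and free energy

```latex
F[\rho](t) = \int_{\mathbb{R}^2} \rho \log \rho \, dx
           - \frac{\chi}{2} \int_{\mathbb{R}^2} \rho\, c \, dx ,
```

whose dissipation along solutions gives the "free energy estimates" mentioned above. The dichotomy is governed by \(M\): global existence for \(M < 8\pi/\chi\), finite-time blow-up for \(M > 8\pi/\chi\), with the paper treating the borderline case \(M = 8\pi/\chi\).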
Abstract:
It has recently been emphasized that, if individuals have heterogeneous dynamics, estimates of shock persistence based on aggregate data are significantly higher than those derived from their disaggregate counterparts. However, a careful examination of the implications of this statement for the various tools routinely employed to measure persistence is missing in the literature. This paper formally examines this issue. We consider a disaggregate linear model with heterogeneous dynamics and compare the values of several measures of persistence across aggregation levels. Interestingly, we show that the average persistence of aggregate shocks, as measured by the impulse response function (IRF) of the aggregate model or by the average of the individual IRFs, is identical at all horizons. This result remains true even in situations where the units are (short-memory) stationary but the aggregate process is long-memory or even nonstationary. In contrast, other popular persistence measures, such as the sum of the autoregressive coefficients or the largest autoregressive root, tend to be higher the higher the aggregation level. We argue, however, that this should be seen more as an undesirable property of these measures than as evidence of different average persistence across aggregation levels. The results are illustrated in an application using U.S. inflation data.
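The divergence between the average of individual IRFs and coefficient-based persistence measures can be sketched in a small simulation (all parameter values below are illustrative assumptions, not from the paper): with heterogeneous independent AR(1) units, the lag-1 autocorrelation of the aggregate, a stand-in for a fitted aggregate AR coefficient, is a variance-weighted mean of the individual coefficients that overweights the most persistent units, so it exceeds the simple average of the individual IRFs at horizon one.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 100_000
rho = rng.uniform(0.0, 0.95, N)        # heterogeneous AR(1) coefficients

# simulate N independent AR(1) units and sum them into an aggregate series
eps = rng.standard_normal((T, N))
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = rho * x[t - 1] + eps[t]
agg = x.sum(axis=1)

# average of the individual IRFs at horizon 1: mean of rho_i
avg_irf = np.mean(rho)

# persistence read off the aggregate: its lag-1 autocorrelation, which is
# (analytically) a 1/(1 - rho_i^2)-weighted mean of the rho_i, hence larger
rho_hat = np.corrcoef(agg[1:], agg[:-1])[0, 1]

print(avg_irf, rho_hat)   # rho_hat noticeably exceeds avg_irf
```

This is exactly the paper's point: the coefficient-style measure rises with aggregation even though the average IRF does not.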
Abstract:
Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction. Keywords: industrial location, count data models, spatial statistics. JEL classification: C25, C52, R11, R30
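A minimal sketch of what "calculating the geographical scope" of a covariate means (coordinates and covariate values below are hypothetical): instead of truncating a variable at a site's administrative boundary, aggregate it over all sites within a chosen distance band.

```python
import numpy as np

def spatial_scope(coords, z, d):
    """For each site i, sum covariate z over all sites within distance d of i,
    rather than cutting it off at the site's own administrative boundary."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))     # pairwise distance matrix
    return (z[None, :] * (dist <= d)).sum(axis=1)

# hypothetical coordinates (km) and, say, employment counts for 5 municipalities
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.2, 5.1]])
z = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

near = spatial_scope(coords, z, 0.1)   # d ~ 0: each site keeps only its own value
wide = spatial_scope(coords, z, 2.0)   # wider scope pools nearby municipalities
print(near, wide)
```

Varying `d` over a grid, as the paper does with a range of distances, shows how sensitive the estimated location determinants are to the assumed geographical scope.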
Abstract:
In the economic literature, information deficiencies and computational complexities have traditionally been solved through the aggregation of agents and institutions. In input-output modelling, researchers have been interested in the aggregation problem since the beginning of the 1950s. Extending the conventional input-output aggregation approach to social accounting matrix (SAM) models may help to identify the effects caused by the information problems and data deficiencies that usually appear in the SAM framework. This paper develops the theory of aggregation and applies it to the social accounting matrix model of multipliers. First, we define the concept of linear aggregation in a SAM database context. Second, we define the aggregated partitioned matrices of multipliers that are characteristic of the SAM approach. Third, we extend the analysis to other related concepts, such as aggregation bias and consistency in aggregation. Finally, we provide an illustrative example that shows the effects of aggregating a social accounting matrix model.
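The notions of consistency and aggregation bias can be illustrated numerically (the 3-sector coefficient matrix and demand vectors below are hypothetical, not from the paper): aggregation with base-year weights reproduces the disaggregate multiplier solution exactly at the base-year final demand, but a different demand vector changes the within-group mix and produces a nonzero bias.

```python
import numpy as np

# hypothetical 3-sector coefficient matrix and base-year final demand
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.30],
              [0.05, 0.25, 0.10]])
f = np.array([100.0, 50.0, 50.0])

x = np.linalg.solve(np.eye(3) - A, f)     # disaggregate multiplier solution

# aggregate sectors 2 and 3: S sums them, W distributes by base-year shares
S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0]])
w = x / (S.T @ (S @ x))                   # each sector's share of its group
W = S.T * w[:, None]
A_agg = S @ A @ W                         # aggregated coefficient matrix

# consistency: at the base-year demand the aggregated model is exact
xa = np.linalg.solve(np.eye(2) - A_agg, S @ f)

# aggregation bias: a different demand vector breaks the within-group mix
f2 = np.array([100.0, 120.0, 10.0])
x2 = np.linalg.solve(np.eye(3) - A, f2)
xa2 = np.linalg.solve(np.eye(2) - A_agg, S @ f2)
bias = xa2 - S @ x2                       # nonzero in general
print(bias)
```

The same mechanics carry over from this input-output sketch to the partitioned SAM multiplier matrices the paper develops.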
Abstract:
Simpson's paradox, also known as the amalgamation or aggregation paradox, appears when dealing with proportions. Proportions are by construction parts of a whole, which can be interpreted as compositions assuming they only carry relative information. The Aitchison inner product space structure of the simplex, the sample space of compositions, explains the appearance of the paradox, given that amalgamation is a nonlinear operation within that structure. Here we propose to use balances, which are specific elements of this structure, to analyse situations where the paradox might appear. With the proposed approach we obtain that the centre of the tables analysed is a natural way to compare them, which avoids by construction the possibility of a paradox. Key words: Aitchison geometry, geometric mean, orthogonal projection
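A minimal numerical instance of the paradox (counts are illustrative, in the style of classic textbook examples): one treatment has the higher success proportion within each stratum, yet the lower proportion after the strata are amalgamated.

```python
# hypothetical success/total counts in two strata, chosen so that stratum
# sizes differ sharply between the two treatments
a1, n1 = 81, 87      # treatment A, stratum 1
a2, n2 = 192, 263    # treatment A, stratum 2
b1, m1 = 234, 270    # treatment B, stratum 1
b2, m2 = 55, 80      # treatment B, stratum 2

# A has the higher success proportion within each stratum...
within = (a1 / n1 > b1 / m1) and (a2 / n2 > b2 / m2)
# ...yet after amalgamation B looks better: the paradox
amalgamated = (a1 + a2) / (n1 + n2) < (b1 + b2) / (m1 + m2)
print(within, amalgamated)   # True True
```

This is exactly the nonlinearity of amalgamation in the simplex that the balance-based approach above is designed to sidestep.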
Abstract:
Foreign trade statistics are the main data source for the study of international trade. However, their accuracy has been under suspicion since Morgenstern published his famous work in 1963. Federico and Tena (1991) have taken up the question again, arguing that the statistics can be useful at an adequate level of aggregation. But the geographical assignment problem remains unsolved. This article focuses on the spatial variable through an analysis of the reliability of international textile trade data for 1913. A geographical bias arises between export and import series, but, given its small quantitative magnitude, it can be considered negligible on an international scale.
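The reliability check behind such studies is a mirror comparison (all figures below are hypothetical, for illustration only): one country's reported exports to each partner are set against the partners' reported imports from it, and the discrepancy is measured per partner and at the aggregate level.

```python
# hypothetical mirror comparison: country A's reported exports to each partner
# vs. the partners' reported imports from A (arbitrary value units)
exports_reported = {"UK": 120.0, "France": 80.0, "Germany": 60.0}
imports_mirrored = {"UK": 131.0, "France": 86.0, "Germany": 66.0}

total_x = sum(exports_reported.values())
total_m = sum(imports_mirrored.values())

# per-partner geographical bias vs. aggregate-level bias
partner_bias = {c: (imports_mirrored[c] - exports_reported[c]) / exports_reported[c]
                for c in exports_reported}
aggregate_bias = (total_m - total_x) / total_x
print(partner_bias, aggregate_bias)
```

Partner-level gaps can be sizeable (misassigned transit trade, CIF/FOB valuation differences) while the aggregate discrepancy stays small, which is the sense in which the bias is negligible at the international scale.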
Abstract:
SUMMARY: We present a tool designed for visualization of large-scale genetic and genomic data, exemplified by results from genome-wide association studies. This software provides an integrated framework to facilitate the interpretation of SNP association studies in a genomic context. Gene annotations can be retrieved from Ensembl, linkage disequilibrium data downloaded from HapMap, and custom data imported in BED or WIG format. AssociationViewer integrates functionalities that enable the aggregation or intersection of data tracks. It implements an efficient cache system and allows the display of several very large-scale genomic datasets. AVAILABILITY: The Java code for AssociationViewer is distributed under the GNU General Public License and has been tested on Microsoft Windows XP, Mac OS X and GNU/Linux operating systems. It is available from the SourceForge repository, which also includes Java Web Start files, documentation and example data files.
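Track intersection of the kind mentioned above boils down to interval intersection; here is a minimal, self-contained sketch (not AssociationViewer's actual Java implementation) using half-open intervals in the style of BED coordinates:

```python
def intersect_tracks(track_a, track_b):
    """Intersect two sorted lists of half-open intervals (start, end)
    on one chromosome, via a single sweep over both lists."""
    out, i, j = [], 0, 0
    while i < len(track_a) and j < len(track_b):
        s = max(track_a[i][0], track_b[j][0])
        e = min(track_a[i][1], track_b[j][1])
        if s < e:                     # non-empty overlap
            out.append((s, e))
        # advance whichever interval ends first
        if track_a[i][1] <= track_b[j][1]:
            i += 1
        else:
            j += 1
    return out

# e.g. SNP association regions vs. gene annotations (positions hypothetical)
a = [(100, 200), (300, 400), (500, 600)]
b = [(150, 350), (390, 520)]
print(intersect_tracks(a, b))  # [(150, 200), (300, 350), (390, 400), (500, 520)]
```

Aggregation of tracks is the analogous union/merge sweep over the same sorted interval lists.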
Abstract:
BACKGROUND: Patients who have acute coronary syndromes with or without ST-segment elevation have high rates of major vascular events. We evaluated the efficacy of early clopidogrel administration (300 mg, <24 hours) when given with aspirin in such patients. METHODS: We included 30,243 patients who had an acute coronary syndrome with or without ST-segment elevation. Data on early clopidogrel administration were available for 24,463 (81%). Some 15,525 (51%) of the total cohort were administered clopidogrel within 24 h of admission. RESULTS: In-hospital death occurred in 2.9% of the patients in the early clopidogrel group treated with primary percutaneous coronary intervention (PCI) and in 11.4% of the patients in the group without primary PCI and without early clopidogrel. The unadjusted clopidogrel odds ratio (OR) for mortality was 0.31 (95% confidence interval 0.27-0.34; p<0.001). The incidence of major adverse cardiac events (MACE) was 4.1% in the early clopidogrel group treated with primary PCI and 13.5% in the group without primary PCI and without early clopidogrel (OR 0.35, 95% confidence interval 0.32-0.39, p<0.001). Early clopidogrel administration and PCI were the only treatments lowering mortality in multivariate analysis. CONCLUSIONS: The early administration of the antiplatelet agent clopidogrel in patients with acute coronary syndromes with or without ST-segment elevation has a beneficial effect on mortality and major adverse cardiac events. The lower mortality rate and incidence of MACE emerged with a combination of primary PCI and early clopidogrel administration.
Abstract:
It is essential for organizations to compress detailed sets of information into more comprehensive sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
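The chapter 2 argument, that only means add up, can be made concrete with right-skewed cost distributions (the lognormal components below are a hypothetical example, not data from the thesis): summing per-component modes or medians understates the expected total, while summing means does not.

```python
import math

# hypothetical project with 5 lognormal cost components; for LogNormal(mu, sigma):
# mode = exp(mu - sigma**2)  <  median = exp(mu)  <  mean = exp(mu + sigma**2 / 2)
components = [(math.log(100), 0.6)] * 5   # mu chosen so each median cost is 100

modes   = sum(math.exp(mu - s ** 2)     for mu, s in components)
medians = sum(math.exp(mu)              for mu, s in components)
means   = sum(math.exp(mu + s ** 2 / 2) for mu, s in components)

# only the summed means equal the expected total cost (linearity of expectation);
# summed modes and medians systematically understate it for skewed distributions
print(round(modes), round(medians), round(means))
```

The gap between the first and last figures is precisely the underestimation bias the experiments document when participants anchor on the most likely (modal) value.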
Abstract:
Laser diffraction (LD) provides detailed analysis of particle size distribution. Its application to testing the stability of soil aggregates can assist studies on the aggregation of soils with contrasting electrochemical properties. The objectives of the present work were: (a) to propose a protocol for using LD to study soil aggregation; and (b) to study the aggregation of an Acrisol under the influence of different doses and forms of lime. Samples were collected in 2005 from a Brazilian Acrisol that in 1994 had received 0.0, 2.0, 8.5 and 17.0 Mg ha-1 of lime, either left on the soil surface or incorporated. Aggregates from 4.76 to 8.00 mm in diameter were studied using the traditional wet-sieving method proposed by Kemper & Chepil (1965), while aggregates from 1.00 to 2.00 mm were studied using a CILAS® laser diffractometer that distinguishes particles ranging from 0.04 to 2,500.00 μm. LD readings were made after six consecutive pre-treatments using different agitation times, a chemical dispersion agent and ultrasound. The Mean Weighted Diameter (MWD) and the Aggregate Stability Index (ASI) calculated using the traditional method did not discriminate between the treatments. LD, however, produced detailed data on soil aggregation, resulting in aggregate stability indices that are linearly related to the doses of lime applied (MWD: R² = 0.986; ASI: R² = 0.876). It may be concluded that electrochemical changes in the Brazilian Acrisol resulting from incorporated lime affect the stability of aggregates, with stability increasing with increased doses of lime.
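As a minimal sketch of the MWD index mentioned above (size classes and mass fractions below are hypothetical, not the study's data), MWD is the mass-fraction-weighted mean of the size-class diameters obtained from sieving or LD:

```python
def mean_weighted_diameter(class_mean_diam_mm, mass_fractions):
    """MWD = sum(x_i * w_i), where x_i is the mean diameter of size class i
    and w_i the mass fraction of aggregates retained in that class."""
    assert abs(sum(mass_fractions) - 1.0) < 1e-9   # fractions must sum to 1
    return sum(x * w for x, w in zip(class_mean_diam_mm, mass_fractions))

# hypothetical size classes (mm) and retained fractions for one sample
x = [6.38, 3.38, 1.5, 0.5, 0.1]      # class mean diameters
w = [0.30, 0.25, 0.20, 0.15, 0.10]   # mass fractions
print(mean_weighted_diameter(x, w))  # MWD in mm
```

A larger MWD indicates that more of the sample mass survives in large, stable aggregates, which is why the index scales with the lime doses in the study.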