990 results for Historical-deductive method
Abstract:
We developed a semiquantitative job exposure matrix (JEM) for workers exposed to polychlorinated biphenyls (PCBs) at a capacitor manufacturing plant from 1946 to 1977. In a recently updated mortality study, mortality from prostate and stomach cancer increased with increasing levels of cumulative exposure estimated with this JEM (trend p values = 0.003 and 0.04, respectively). Capacitor manufacturing began with winding bales of foil and paper film, which were placed in a metal capacitor box (pre-assembly) and then put into a vacuum chamber for flood-filling (impregnation) with dielectric fluid (PCBs). Capacitors dripping with PCB residues were then transported to sealing stations where ports were soldered shut before degreasing, leak testing, and painting. Using a systematic approach, all 509 unique jobs identified in the work histories were rated on predetermined process- and plant-specific exposure determinants, then categorized, based on the jobs' similarities (combinations of exposure determinants), into 35 job exposure categories. The job exposure categories were ranked and then assigned a qualitative PCB exposure rating (baseline, low, medium, and high) for inhalation and dermal intensity. Differences among categories in other chemical exposures (solvents, etc.) prevented further combining of categories. The mean of all available PCB concentrations (1975 and 1977) for jobs within each intensity rating was taken as a representative value for that intensity level. Inhalation (in micrograms per cubic meter) and dermal (unitless) exposures were regarded as equally important. Intensity was frequency-adjusted for jobs with continuous or intermittent PCB exposures. Era-modifying factors were applied to the earlier time periods (1946-1974) because exposures were considered to have been greater than in later eras (1975-1977).
Such interpolations, extrapolations, and modifying factors may introduce non-differential misclassification; however, we believe our rigorous method minimized misclassification, as shown by the significant exposure-response trends in the epidemiologic analysis.
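The cumulative-exposure construction described above (a qualitative intensity rating, frequency-adjusted and scaled by an era-modifying factor, summed over a work history) can be sketched as follows. All numeric intensity values, frequency adjustments, era factors, and the example history are illustrative assumptions, not the study's actual ratings.

```python
# Hedged sketch: cumulative PCB exposure from a semiquantitative JEM.
# The intensity scale and modifying factors below are invented for
# illustration; they are not the values used in the study.

INTENSITY = {"baseline": 0.0, "low": 1.0, "medium": 5.0, "high": 10.0}

def job_exposure(rating, years, frequency=1.0, era_factor=1.0):
    """Contribution of one job: intensity x frequency x era factor x duration."""
    return INTENSITY[rating] * frequency * era_factor * years

def cumulative_exposure(work_history):
    """Sum job contributions over a worker's history.

    work_history: list of (rating, years, frequency, era_factor) tuples.
    """
    return sum(job_exposure(*job) for job in work_history)

history = [
    ("high", 5, 1.0, 1.5),    # e.g. an impregnation job in the pre-1975 era
    ("medium", 3, 0.5, 1.0),  # e.g. an intermittent sealing job, post-1975
]
print(cumulative_exposure(history))  # 10*1.0*1.5*5 + 5*0.5*1.0*3 = 82.5
```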
Abstract:
Santiago Ramón y Cajal developed a great body of scientific research during the last decade of the 19th century, mainly between 1888 and 1892, when he published more than 30 manuscripts. The neuronal theory, the structure of dendrites and spines, and fine microscopic descriptions of numerous neural circuits are among these studies. In addition, Ramón y Cajal described numerous cell types (neuronal and glial) during this time using the 'reazione nera', or Golgi method. Among these neurons were the special cells of the molecular layer of the neocortex. These cells were also termed Cajal cells or Retzius cells by other colleagues; today they are known as Cajal-Retzius cells. Since the earliest description, several biological aspects of these fascinating cells have been analyzed (e.g., cell morphology, physiological properties, origin and cellular fate, putative function during cortical development). In this review we summarize, on a temporal basis, the emerging knowledge concerning this cell population, with specific attention to the pioneering studies of Santiago Ramón y Cajal.
Abstract:
Sediments can serve as natural archives to reconstruct the history of pollutant inputs into coastal areas. This is important for improving management strategies and evaluating the success of pollution control measures. In this work, the vertical distribution of organochlorine pesticides (DDTs, Lindane, HCB, Heptachlor, Aldrin and Mirex) was determined in a sediment core collected from the Gulf of Batabanó, Cuba, which was dated using the (210)Pb dating method and validated with the (239,240)Pu fallout peak. Results showed significant changes in sediment accumulation during the last 40 years: recent mass accumulation rates (0.321 g cm(-2) yr(-1)) are double those estimated before 1970 (0.15 g cm(-2) yr(-1)). This change matches closely the land use change in the region (intense deforestation and regulation of the Colon River in the late 1970s). Among the pesticides, only DDT isomers, Lindane and HCB were detected, ranging from 0.029 to 0.374 ng g(-1) dw for DDTs, from <0.006 to 0.05 ng g(-1) dw for Lindane, and from <0.04 to 0.134 ng g(-1) dw for HCB. Heptachlor, Aldrin and Mirex were below the detection limits (∼0.003 ng g(-1)), indicating that these compounds had limited application in the Coloma watershed. Pesticide contamination has been evident since the 1970s. The DDT and HCB records showed that management strategies, namely the banning of organochlorine contaminants, led to a concentration decline. However, Lindane, which was restricted in 1990, can still be found in the watershed. According to NOAA guidelines, the pesticide concentrations encountered in these sediments are low and unlikely to have an adverse effect on sediment-dwelling organisms.
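Dating a core with (210)Pb rests on the radioactive decay law: a layer's age follows from how much its excess (210)Pb activity has decayed relative to a reference activity. A minimal sketch under a constant-initial-activity assumption; the activity values are invented, not the Gulf of Batabanó data:

```python
# Hedged sketch of a 210Pb layer-age calculation (constant initial
# activity assumed). Real core dating uses more elaborate models
# (e.g. constant rate of supply) and independent validation such as
# the 239,240Pu fallout peak mentioned in the abstract.
import math

PB210_HALF_LIFE = 22.3  # years
LAM = math.log(2) / PB210_HALF_LIFE  # decay constant (1/yr)

def layer_age(a0, a):
    """Age (years) of a layer whose excess 210Pb activity decayed
    from a0 to a, via a = a0 * exp(-LAM * t)."""
    return math.log(a0 / a) / LAM

# Illustrative activities in Bq/kg:
surface_activity = 100.0
deep_layer_activity = 28.8
print(round(layer_age(surface_activity, deep_layer_activity)))  # ~40 years
```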
Abstract:
The theoretical part of this study compares different Value at Risk models. Based on that comparison, one model was chosen for the empirical part, which focused on determining whether the model measures market risk accurately. The purpose of this study was to test whether Volatility-weighted Historical Simulation is accurate in measuring market risk and what improvements it brings to market risk measurement compared to traditional Historical Simulation. The volatility-weighted method of Hull and White (1998) was chosen in order to improve the traditional method's capability to measure market risk. We found that results based on Historical Simulation depend on the chosen time period, the confidence level, and how samples are weighted. The findings of this study are that the chosen method cannot be considered fully reliable in measuring market risk, because the backtesting results vary over the study's time period.
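Hull and White's volatility weighting rescales each historical return by the ratio of current volatility to the volatility prevailing when the return was observed, and then reads VaR off the empirical quantile of the adjusted sample. A minimal sketch with invented returns and a fixed volatility series; the thesis may estimate volatilities differently (e.g. EWMA or GARCH):

```python
# Hedged sketch of volatility-weighted Historical Simulation VaR
# (Hull & White, 1998). Inputs are illustrative, not the study's data.

def vol_weighted_var(returns, vols, current_vol, confidence=0.95):
    """Scale each past return by current_vol / vol at that time, then
    return VaR as the (1 - confidence) empirical quantile of the scaled
    series, reported as a positive loss."""
    scaled = sorted(r * current_vol / v for r, v in zip(returns, vols))
    idx = int((1 - confidence) * len(scaled))
    return -scaled[idx]

returns = [-0.02, 0.01, -0.03, 0.015, -0.01, 0.02, -0.025, 0.005]
vols = [0.01] * len(returns)  # flat past volatility for illustration
print(vol_weighted_var(returns, vols, current_vol=0.02))  # 0.06
```

With current volatility at twice its historical level, every scenario loss doubles, so the VaR estimate reacts immediately to the volatility regime, which plain Historical Simulation cannot do.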
Abstract:
Ever since the inception of economics over two hundred years ago, the tools at the discipline's disposal have grown more and more sophisticated. This book provides a historical introduction to the methodology of economics through the eyes of economists. The story begins with John Stuart Mill's seminal essay from 1836 on the definition and method of political economy, which is then followed by an examination of how the actual practices of economists changed over time to such an extent that they not only altered their methods of enquiry, but also their self-perception as economists. Beginning as intellectuals and journalists operating to a large extent in the public sphere, they then transformed into experts who developed their tools of research increasingly behind the scenes. No longer did they try to influence policy agendas through public discourse; rather they targeted policymakers directly and with instruments that showed them as independent and objective policy advisors, the tools of the trade changing all the while. In order to shed light on this evolution of economic methodology, this book takes carefully selected snapshots from the discipline's history. It tracks the process of development through the nineteenth and twentieth centuries, analysing the growth of empirical and mathematical modelling. It also looks at the emergence of the experiment in economics, in addition to the similarities and differences between modelling and experimentation. This book will be relevant reading for students and academics in the fields of economic methodology, history of economics, and history and philosophy of the social sciences.
Abstract:
There is an intense debate on the convenience of moving from historical cost (HC) toward the fair value (FV) principle. The debate and academic research are usually concerned with financial instruments, but the IAS 41 requirement of fair valuation for biological assets brings the issue into the agricultural domain. This paper performs an empirical study with a sample of Spanish farms valuing biological assets at HC and a sample applying FV, finding no significant differences between the two valuation methods in assessing future cash flows. However, most tests reveal more predictive power of future earnings under fair valuation of biological assets, which is not explained by differences in the volatility of earnings and profitability. The study also evidences the existence of flawed HC accounting practices for biological assets in agriculture, which suggests scarce information content of this valuation method in the predominant small business units existing in the agricultural sector in advanced Western countries.
Abstract:
This paper presents empirical research comparing the accounting difficulties that arise from the use of two valuation methods for biological assets, fair value (FV) and historical cost (HC) accounting, in the agricultural sector. It also compares how reliable each valuation method is in the decision-making process of agents within the sector. By conducting an experiment with students, farmers, and accountants operating in the agricultural sector, we find that they have more difficulties, make larger miscalculations and make poorer judgements with HC accounting than with FV accounting. In-depth interviews uncover flawed accounting practices in the agricultural sector in Spain in order to meet HC accounting requirements. Given the complexities of cost calculation for biological assets and the predominance of small family business units in advanced Western countries, the study concludes that accounting can be more easily applied in the agricultural sector under FV than HC accounting, and that HC conveys a less accurate grasp of the real situation of a farm.
Abstract:
The Mann–Kendall non-parametric test was employed for observational trend detection in monthly, seasonal and annual precipitation of five meteorological subdivisions of Central Northeast India (CNE India) for different 30-year normal periods (NP), viz. 1889–1918 (NP1), 1919–1948 (NP2), 1949–1978 (NP3) and 1979–2008 (NP4). The trends of maximum and minimum temperatures were also investigated. The slopes of the trend lines were determined using the method of least-square linear fitting. Morlet wavelet analysis was applied to monthly rainfall during June–September, total rainfall during the monsoon season and annual rainfall to determine periodicities and to test their significance using the power spectrum method. The inferences drawn from the analyses will be helpful to policy managers, planners and agricultural scientists in working out irrigation and water management options under various possible climatic eventualities for the region. The long-term (1889–2008) mean annual rainfall of CNE India is 1,195.1 mm with a standard deviation of 134.1 mm and a coefficient of variation of 11%. There is a significant decreasing trend of 4.6 mm/year for Jharkhand and 3.2 mm/year for CNE India. Since rice is the important kharif crop (May–October) in this region, the decreasing trend of rainfall during the month of July may delay/affect the transplanting/vegetative phase of the crop, and assured irrigation is very much needed to tackle drought situations. During the month of December, all the meteorological subdivisions except Jharkhand show a significant decreasing trend of rainfall during the recent normal period NP4. The decrease of rainfall during December may hamper the sowing of wheat, which is the important rabi crop (November–March) in most parts of this region.
Maximum temperature shows a significant rising trend of 0.008°C/year (at the 0.01 level) during the monsoon season and 0.014°C/year (at the 0.01 level) during the post-monsoon season over the period 1914–2003. The annual maximum temperature also shows a significant increasing trend of 0.008°C/year (at the 0.01 level) during the same period. Minimum temperature shows a significant rising trend of 0.012°C/year (at the 0.01 level) during the post-monsoon season and a significant falling trend of 0.002°C/year (at the 0.05 level) during the monsoon season. A significant 4–8 year peak periodicity band has been noticed during September over Western UP, and a 30–34 year periodicity has been observed during July over the Bihar subdivision. However, as far as CNE India is concerned, no significant periodicity has been noticed in any of the time series.
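The Mann–Kendall statistic used throughout the study can be computed directly from a time series: S counts concordant minus discordant pairs, and a normal approximation gives the significance. A minimal sketch using the tie-free variance formula and an invented series:

```python
# Hedged sketch of the Mann-Kendall trend test (no tie correction).
# Published analyses typically also apply a tie-adjusted variance and
# pair the test with Sen's slope or least-squares fitting for magnitude.
import math

def mann_kendall(series):
    """Return (S, Z): S is the pairwise sign statistic; Z is its
    continuity-corrected normal approximation (ties ignored)."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly decreasing series attains the minimum S = -n(n-1)/2:
s, z = mann_kendall([5, 4, 3, 2, 1])
print(s, z)  # S = -10; Z < -1.96, a significant decreasing trend at 5%
```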
Abstract:
This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p(0). Next the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright (C) 2007 John Wiley & Sons, Ltd.
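As the abstract notes, with 'non-informative' priors the Bayesian criterion reproduces familiar frequentist answers. For the known-variance two-arm normal case, a sketch of the standard frequentist sample-size formula that serves as that point of comparison (the formula here is the textbook one, shown for illustration rather than taken from the paper):

```python
# Hedged sketch: classical per-arm sample size for comparing two normal
# means with known common variance sigma, one-sided level alpha, and
# power 1 - beta to detect a treatment advantage delta.
import math
from statistics import NormalDist

def n_per_arm(sigma, delta, alpha=0.05, beta=0.10):
    """Smallest n per group satisfying n >= 2 * (sigma*(z_a + z_b)/delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha)  # one-sided critical value
    z_b = NormalDist().inv_cdf(1 - beta)   # power requirement
    return math.ceil(2 * (sigma * (z_a + z_b) / delta) ** 2)

# Detecting a half-standard-deviation advantage with 90% power:
print(n_per_arm(sigma=1.0, delta=0.5))  # 69 per arm
```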
Abstract:
Widespread reports of low pollination rates suggest a recent anthropogenic decline in pollination that could threaten natural and agricultural ecosystems. Nevertheless, unequivocal evidence for a decline in pollination over time has remained elusive because it was not possible to determine historical pollination rates. Here we demonstrate a widely applicable method for reconstructing historical pollination rates, thus allowing comparison with contemporary rates from the same sites. We focused on the relationship between the oil-collecting bee Rediviva peringueyi (Melittidae) and the guild of oil-secreting orchid species (Coryciinae) that depends on it for pollination. The guild is distributed across the highly transformed and fragmented lowlands of the Cape Region of South Africa. We show that rehydrated herbarium specimens of Pterygodium catholicum, the most abundant member of the guild, contain a record of past pollinator activity in the form of pollinarium removal rates. Analysis of a pollination time series showed a recent decline in pollination on Signal Hill, a small urban conservation area. The same herbaria contain historical species occurrence data. We analyzed this data and found that there has been a contemporaneous shift in orchid guild composition in urban areas due to the local extirpation of the non-clonal species, consistent with their greater dependence on seeds and pollination for population persistence.
Abstract:
This article presents a reinterpretation of James Harrington's writings. It takes issue with J. G. A. Pocock's reading, which treats him as importing into England a Machiavellian ‘language of political thought’. This reading is the basis of Pocock's stress on the republicanism of eighteenth-century opposition values. Harrington's writings were in fact a most implausible channel for such ideas. His outlook owed much to Stoicism. Unlike the Florentine, he admired the contemplative life; was sympathetic to commerce; and was relaxed about the threat of ‘corruption’ (a concept that he did not understand). These views can be associated with his apparent aims: the preservation of a national church with a salaried but politically impotent clergy; and the restoration of the royalist gentry to a leading role in English politics. Pocock's hypothesis is shown to be conditioned by his method; its weaknesses reflect some difficulties inherent in the notion of ‘languages of thought’.
Assessment of the Wind Gust Estimate Method in mesoscale modelling of storm events over West Germany
Abstract:
A physically based gust parameterisation is added to the atmospheric mesoscale model FOOT3DK to estimate wind gusts associated with storms over West Germany. The gust parameterisation follows the Wind Gust Estimate (WGE) method and its functionality is verified in this study. The method assumes that gusts occurring at the surface are induced by turbulent eddies in the planetary boundary layer, deflecting air parcels from higher levels down to the surface under suitable conditions. Model simulations are performed with horizontal resolutions of 20 km and 5 km. Ten historical storm events of different characteristics and intensities are chosen in order to include a wide range of typical storms affecting Central Europe. All simulated storms occurred between 1990 and 1998. The accuracy of the method is assessed objectively by validating the simulated wind gusts against data from 16 synoptic stations by means of “quality parameters”. Concerning these parameters, the temporal and spatial evolution of the simulated gusts is well reproduced. Simulated values for low altitude stations agree particularly well with the measured gusts. For orographically exposed locations, the gust speeds are partly underestimated. The absolute maximum gusts lie in most cases within the bounding interval given by the WGE method. Focussing on individual storms, the performance of the method is better for intense and large storms than for weaker ones. Particularly for weaker storms, the gusts are typically overestimated. The results for the sample of ten storms document that the method is generally applicable with the mesoscale model FOOT3DK for mid-latitude winter storms, even in areas with complex orography.
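The WGE selection rule described above can be illustrated schematically: a parcel from a boundary-layer level is assumed able to reach the surface when the mean turbulent kinetic energy below it exceeds the buoyant energy opposing the descent, and the gust estimate is the largest wind speed among such levels. The energy criterion follows our reading of the WGE approach, and the profile values are invented:

```python
# Hedged sketch of the WGE downward-deflection criterion. The numbers
# are an invented profile, not FOOT3DK output; a real implementation
# works on full model columns and also derives bounding intervals.

def wge_gust(levels):
    """levels: list of (wind_speed m/s, mean_tke J/kg, buoyant_energy J/kg)
    ordered from the surface upward. A level contributes if turbulence can
    overcome buoyancy; the gust is the largest contributing wind speed."""
    candidates = [speed for speed, tke, buoyancy in levels if tke >= buoyancy]
    return max(candidates) if candidates else levels[0][0]

profile = [
    (12.0, 5.0, 0.0),   # near-surface level, always reachable
    (18.0, 4.0, 2.5),   # enough TKE -> parcel can be brought down
    (25.0, 3.0, 6.0),   # buoyancy wins -> this level cannot contribute
]
print(wge_gust(profile))  # 18.0 m/s
```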