895 results for Straight and Reverse Problems of Data Uncertainty
Abstract:
Although according to Angélil-Carter (2002: 2) ‘plagiarism is a modern Western concept which arose with the introduction of copyright laws in the Eighteenth century’, its avoidance is now a basic plank of respectable academic scholarship. Student plagiarism is currently a hot topic, at least for those who teach and study in British and American universities. There are companies selling both off-the-shelf and written-to-order term papers, and others, like Turnitin.com, offering an electronic detection service. Recently an Australian Rector was dismissed for persistent plagiarism earlier in his career, and most Anglo-American universities have warnings against and definitions of plagiarism on their websites – indeed, Pennycook notes that in the mid-90s Stanford University's documents about plagiarism were reproduced by the University of Oregon apparently without attribution, and suggests, whimsically, that there is 'one set of standards for the guardians of truth and knowledge and another for those seeking entry' (1996: 213) (example and quote taken from Pecorari, 2002, p. 29).
Abstract:
Tihomir Trifonov, Tsvetanka Georgieva-Trifonova - This article presents the bgBell/OLAP system for warehousing and online analytical processing of data on unique Bulgarian bells. The implemented system makes it possible to generate summary reports and to analyse various characteristics of the bells in order to extract previously unknown and potentially useful information.
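The abstract gives no implementation details; as a minimal illustration of the kind of OLAP-style roll-up such a system produces, the following Python sketch aggregates hypothetical bell records. The column names and values are assumptions for illustration, not bgBell/OLAP's actual schema.

```python
import pandas as pd

# Hypothetical bell records; fields and values are illustrative,
# not taken from the bgBell/OLAP system.
bells = pd.DataFrame({
    "region":         ["Veliko Tarnovo", "Veliko Tarnovo", "Plovdiv", "Plovdiv"],
    "century":        [18, 19, 18, 19],
    "mass_kg":        [120.0, 450.0, 95.5, 310.0],
    "fundamental_hz": [410.0, 220.0, 515.0, 260.0],
})

# OLAP-style roll-up: summary statistics per (region, century) cell.
cube = bells.pivot_table(
    index="region", columns="century",
    values=["mass_kg", "fundamental_hz"], aggfunc=["mean", "count"],
)
print(cube)
```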
Abstract:
The official cooperation between Hungarian and Belarusian geography began to take shape on a sunny afternoon in June 2010, four years ago, in the Minsk building of the Geographic Faculty of the Belarusian State University. There we reviewed the potential framework of cooperation with Professor Ekaterina Antipova. It was supported by the academician Károly Kocsis, member of the Hungarian Academy of Sciences and director of the Geographical Research Institute, and we also won the support of the dean Ivan Pirozhnik and the academician Vladimir Loginov, from the Belarusian State University and the National Academy of Sciences of Belarus, respectively. This informal cooperation became official in the autumn of 2010 within the framework of the Academic Mobility Agreement Project between the Hungarian and Belarusian academies of sciences. Since then several publications about Hungary and Belarus have appeared in the geographic journals of both countries; however, this is the first significant, long-awaited joint publication. Besides project-based cooperation such as EastMig (www.eastmig.mtafki.hu) and ReSEP-CEE (www.mtafki.hu/ReSEP_CEE_Be.html), both supported by the Visegrad Fund, a lively student exchange programme was also launched in the autumn of 2010 between the Geographic Research Institute of the Hungarian Academy of Sciences and the Geographic Faculty of the Belarusian State University under the scholarship programme of the Visegrad Fund. Later the Department of Economic Geography of the Corvinus University of Budapest, headed by István Tózsa, also became an active partner in the cooperation. The publishing expenses of this book are likewise fully financed by the Department of Economic Geography.
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans is still relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
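As an illustration of the kind of bookkeeping such a compilation enables (tallying archived data sets by hemisphere, measured process, and study design), the sketch below summarises hypothetical metadata with pandas. The field names and values are assumptions, not PANGAEA's actual schema.

```python
import pandas as pd

# Hypothetical metadata for archived data sets; columns and values
# are illustrative assumptions, not PANGAEA's actual schema.
datasets = pd.DataFrame({
    "hemisphere":     ["N", "N", "S", "N", "polar"],
    "process":        ["calcification", "primary production",
                       "calcification", "photosynthesis", "growth"],
    "multifactorial": [True, False, True, True, False],
    "year_added":     [2009, 2011, 2013, 2014, 2012],
})

# Composition of the compilation: counts by hemisphere and by process.
print(datasets.groupby("hemisphere").size())
print(datasets.groupby("process").size())

# Share of data sets added after 2010 that come from multifactorial studies.
post2010 = datasets[datasets["year_added"] > 2010]
print(post2010["multifactorial"].mean())
```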
Abstract:
Judith Tsouvalis mounts a lively and interesting critique of the post-foundational Left’s theorisations by marshalling Latourian insights into the possibilities for a more grounded, pragmatic and concrete approach to political action. Tsouvalis takes Latour’s appropriation of John Dewey’s philosophical pragmatism (classically stated in Dewey’s 1927 [1954] work, The Public and Its Problems) to argue that problems enable Dingpolitik – object- or problem-oriented politics – through assembling concrete plural publics around matters of shared concern and contestation. She counterposes this pragmatic politics of concern, through which new communities of understanding are formed, to the abstract and ‘anthropomorphic’ critiques of the ‘post-political condition’, which offer little in the way of constructive engagement in the collective making of a better world.
Abstract:
The effects of data uncertainty on real-time decision-making can be reduced by predicting early revisions to US GDP growth. We show that survey forecasts efficiently anticipate the first-revised estimate of GDP, but that forecasting models incorporating monthly economic indicators and daily equity returns provide superior forecasts of the second-revised estimate. We consider the implications of these findings for analyses of the impact of surprises in GDP revision announcements on equity markets, and for analyses of the impact of anticipated future revisions on announcement-day returns.
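The abstract does not specify the model form. A minimal sketch of the kind of revision-forecasting regression it describes (a second-revised GDP estimate regressed on the survey forecast, a monthly indicator, and equity returns) might look like the following; all variable names and the synthetic data are illustrative placeholders, not the paper's actual specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 120  # synthetic quarterly sample; placeholder, not real GDP data

survey = rng.normal(2.5, 1.0, n)       # survey forecast of GDP growth
indicator = rng.normal(0.0, 1.0, n)    # monthly economic indicator
equity_ret = rng.normal(0.0, 2.0, n)   # quarterly sum of daily equity returns
# Synthetic "second-revised" growth: survey plus information in indicators.
second_rev = survey + 0.3 * indicator + 0.1 * equity_ret + rng.normal(0, 0.5, n)

X = np.column_stack([survey, indicator, equity_ret])
model = LinearRegression().fit(X, second_rev)

# Nonzero weights on the indicator and equity returns would suggest the
# survey alone does not efficiently anticipate the second-revised estimate.
print(dict(zip(["survey", "indicator", "equity_ret"], model.coef_.round(2))))
```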
Abstract:
The design of satisfactory supporting and expansion devices for highway bridges is a problem which has concerned bridge design engineers for many years. The problems associated with these devices have been emphasized by the large number of short-span bridges required by the current expanded highway program of expressways and interstate highways. The initial objectives of this investigation were: (1) to review and make a field study of devices used for the support of bridge superstructures and for the provision of floor expansion; (2) to analyze the forces or factors which influence the design and behavior of supporting devices and floor expansion systems; and (3) to ascertain the need for future research, particularly on the problems of obtaining more economical and efficient supporting and expansion devices and determining the maximum allowable distance between such devices. The experimental portion was conducted to evaluate one of the possible simple and economical solutions to the problems observed in the initial portion. The investigation reported herein is divided into four major parts or phases: (1) a review of the literature; (2) a survey by questionnaire of the design practice of a number of state highway departments and consulting firms; (3) field observation of existing bridges; and (4) an experimental comparison of the dynamic behavior of rigid and elastomeric bearings.
Abstract:
The rapid expansion of the TMT sector in the late 1990s and the more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or to estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is to analyse the expected cash flows of a data centre over the life cycle of the building, with corporate bond yields used as a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
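A minimal sketch of the pricing logic described, discounting a data centre's lease income at the yield of a comparable traded corporate bond rather than at a subjectively chosen rate, is shown below. It assumes a flat annual lease over the building's remaining life and a single bond-yield proxy; all figures are illustrative.

```python
# Minimal sketch of the 'law of one price' approach: value lease income
# by discounting at the yield of a traded corporate bond with comparable
# timing and risk. All inputs below are illustrative assumptions.

def present_value(cash_flows, discount_rate):
    """Discount a list of annual cash flows (years 1, 2, ...) to today."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

annual_lease = 1_200_000   # lease income per year (illustrative)
remaining_life = 20        # building life cycle in years (illustrative)
bond_yield = 0.055         # yield on a comparable corporate bond (proxy rate)

value = present_value([annual_lease] * remaining_life, bond_yield)
print(f"Appraised value of lease income: {value:,.0f}")
```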
Abstract:
Data on antimicrobial use play a key role in the development of policies for the containment of antimicrobial resistance. On-farm data could provide a detailed overview of antimicrobial use, but technical and methodological aspects of data collection and interpretation, as well as data quality, need to be further assessed. The aims of this study were (1) to quantify antimicrobial use in the study population using different units of measurement and to contrast the results obtained, (2) to evaluate the data quality of farm records on antimicrobial use, and (3) to compare the data quality of different recording systems. During one year, data on antimicrobial use were collected from 97 dairy farms. Antimicrobial consumption was quantified using: (1) the incidence density of antimicrobial treatments; (2) the weight of active substance; (3) the used daily dose and (4) the used course dose for antimicrobials for intestinal, intrauterine and systemic use; and (5) the used unit dose for antimicrobials for intramammary use. Data quality was evaluated by describing the completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. The relative consumption of antimicrobials depended on the unit of measurement: used doses reflected treatment intensity better than the weight of active substance. The use of antimicrobials classified as high priority was low, although under- and overdosing were frequently observed. Electronic recording systems allowed better traceability of the animals treated. Recording the drug name or dosage often resulted in incomplete or inaccurate information. Veterinarians tended to record more drugs than farmers. The integration of veterinarian and farm data would improve data quality.
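The contrast between weight-based and dose-based units can be made concrete with a small sketch. Assuming, as one common formulation in veterinary pharmacoepidemiology, that a used daily dose is the administered amount divided by treatment days and animal weight, the following compares the two units for hypothetical treatment records; the field names, figures and standard cow weight are all assumptions, not the study's definitions.

```python
import pandas as pd

# Hypothetical treatment records; fields and values are illustrative.
records = pd.DataFrame({
    "drug":             ["amoxicillin", "tylosin"],
    "amount_mg":        [60_000, 9_000],  # total active substance administered
    "treatment_days":   [5, 3],
    "animal_weight_kg": [600, 600],       # assumed standard dairy cow weight
})

# Weight-based unit: total mg of active substance (ignores potency).
records["weight_mg"] = records["amount_mg"]

# Dose-based unit: used daily dose in mg per kg per day (one common
# formulation; the study's exact definitions may differ).
records["udd_mg_per_kg_day"] = (
    records["amount_mg"]
    / (records["treatment_days"] * records["animal_weight_kg"])
)

# A drug can dominate by weight while implying a very different
# treatment intensity, which is why the unit of measurement matters.
print(records[["drug", "weight_mg", "udd_mg_per_kg_day"]])
```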
Abstract:
This commentary is based on a general concern regarding the low level of self-criticism (self-evaluation) in the interpretation of molecular pharmacological data published in ethnopharmacology-related journals. Reports on potentially new lead structures or pharmacological effects of medicinal plant extracts are mushrooming. At the same time, nonsense in bioassays is an increasing phenomenon in herbal medicine research. Just because a dataset is reproducible does not mean that it is meaningful. Currently, there are thousands of claims of pharmacological effects of medicinal plants and natural products. It is argued that claims to knowledge in ethnopharmacology, as in the exact sciences, should be rationally criticized if they have empirical content, as is the case with biochemical and pharmacological analyses. Here the major problem is the misemployment of the concentration-effect paradigm and the overinterpretation of data obtained in vitro. Given the almost exponential increase in scientific papers published, it may be the moment to adopt a falsificationist methodology.
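The concentration-effect paradigm the commentary refers to is typically formalised with a sigmoidal model such as the Hill equation. As a minimal, hedged illustration of what a defensible analysis involves (fitting the full curve and inspecting the estimated EC50 rather than reporting an effect at a single high concentration), here is a sketch with synthetic data; it is not the commentary's own analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, emax, ec50, n):
    """Hill concentration-effect model: E = Emax * c^n / (EC50^n + c^n)."""
    return emax * c**n / (ec50**n + c**n)

# Synthetic concentration-response data (micromolar); illustrative only.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
effect = (np.array([2.0, 5.0, 14.0, 33.0, 58.0, 78.0, 88.0])
          + np.random.default_rng(1).normal(0, 2, 7))

params, _ = curve_fit(hill, conc, effect, p0=[100.0, 10.0, 1.0])
emax, ec50, n = params

# An EC50 in the high micromolar range is a warning sign: such
# concentrations are rarely reached in vivo, so claiming a therapeutic
# effect from this curve alone would overinterpret the in vitro data.
print(f"Emax={emax:.1f}, EC50={ec50:.1f} uM, Hill coefficient={n:.2f}")
```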
Abstract:
Printed by order of the Massachusetts General Court.
Abstract:
Hearings held January 13-April 3, 1958
Abstract:
"Twenty years of city planning progress in the United States [by] John Nolen": 19th, p. 1-44.
Abstract:
Data mining can be defined as the extraction of implicit, previously unknown, and potentially useful information from data. Numerous researchers have been developing security technology and exploring new methods to detect cyber-attacks with the DARPA 1998 dataset for intrusion detection and the modified versions of this dataset, KDDCup99 and NSL-KDD, but until now no one has examined the performance of the Top 10 data mining algorithms selected by experts in data mining. The classification learning algorithms compared in this thesis are C4.5, CART, k-NN and Naïve Bayes. The performance of these algorithms is compared in terms of accuracy, error rate and average cost on modified versions of the NSL-KDD train and test datasets, where the instances are classified into normal and four cyber-attack categories: DoS, Probing, R2L and U2R. Additionally, the most important features for detecting cyber-attacks, both overall and within each category, are evaluated with Weka's Attribute Evaluator and ranked according to Information Gain. The results show that the classification algorithm with the best performance on the dataset is the k-NN algorithm. The most important features for detecting cyber-attacks are basic features such as the duration of a network connection in seconds, the protocol used for the connection, the network service used, the normal or error status of the connection, and the number of data bytes sent. The most important features for detecting DoS, Probing and R2L attacks are basic features and the least important are content features, whereas for U2R attacks the content features are the most important.
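A minimal sketch of the evaluation pipeline the thesis describes (a k-NN classifier plus an information-gain-style feature ranking) could look like the following. The feature names mirror NSL-KDD basic features mentioned in the abstract, but the toy data and the scikit-learn stand-ins for Weka's tools are assumptions, not the thesis's actual setup.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.feature_selection import mutual_info_classif

# Toy stand-in for NSL-KDD basic features; a real experiment would load
# the actual train/test splits. Values here are random placeholders.
rng = np.random.default_rng(0)
feature_names = ["duration", "protocol_type", "service", "flag", "src_bytes"]
X_train = rng.random((200, len(feature_names)))
y_train = rng.integers(0, 5, 200)   # 0 = normal, 1-4 = DoS/Probing/R2L/U2R
X_test = rng.random((50, len(feature_names)))
y_test = rng.integers(0, 5, 50)

# k-NN, the best performer reported in the thesis.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("accuracy:", knn.score(X_test, y_test))

# Information-gain-style ranking; mutual information is scikit-learn's
# rough analogue of Weka's InfoGainAttributeEval.
scores = mutual_info_classif(X_train, y_train, random_state=0)
for name, s in sorted(zip(feature_names, scores), key=lambda t: -t[1]):
    print(f"{name}: {s:.3f}")
```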