983 results for paradigms


Relevance:

20.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. In the early 1950s the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781 he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a 1774 memoir that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples: the sample should be a miniature of the population. This idea still prevails. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to treat population parameters as constants; Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's model makes no assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory of double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data-collection costs were acceptable and which provided approximately equal interviewer workloads, in addition to sufficient accuracy in estimation.
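
In modern notation, the sample-size determination that such a plan implies can be sketched via the CLT approximation to the binomial. The symbols below (margin of error e, confidence quantile z, proportion p) are a present-day reconstruction for illustration, not Laplace's own formulation:

```latex
% By the CLT, the sample proportion \hat{p} from n Bernoulli trials is
% approximately N\!\big(p,\, p(1-p)/n\big). Requiring
% P\big(|\hat{p}-p| \le e\big) \ge 1-\alpha gives the sample size
\[
  n \;\ge\; \frac{z_{\alpha/2}^{2}\, p(1-p)}{e^{2}},
\]
% and since p(1-p) \le 1/4, the conservative worst-case choice is
\[
  n \;\ge\; \frac{z_{\alpha/2}^{2}}{4e^{2}}.
\]
```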

Relevance:

20.00%

Publisher:

Abstract:

The subject matter and methodology of biblical scholarship have expanded immensely during the last few decades. The traditional text-, literary-, source- and form-critical approaches, labeled "historical-critical scholarship", have faced the challenge of the social sciences. Various new literary, synchronic readings, sometimes characterized with the vague term postmodernism, have in turn challenged historical-critical and social-scientific approaches. Widened limits and diverging methodologies have caused a sense of crisis in biblical criticism. This metatheoretical thesis attempts to bridge the gap between philosophical discussion about the basis of biblical criticism and practical academic biblical scholarship. The study attempts to trace the epistemological changes that have produced the wealth of methods and results within biblical criticism. The account of the cult reform of King Josiah of Judah as reported in 2 Kings 22:1–23:30 serves as the case study because of its importance for critical study of the Hebrew Bible. Various scholarly approaches to 2 Kings 22:1–23:30 are experimentally arranged around four methodological positions: text, author, reader, and context. The heuristic model is a tentative application of Oliver Jahraus's model of four paradigms in literary theory. The study argues for six theses: 1) Our knowledge of the world is constructed, fallible, and theory-laden. 2) Methodological plurality is the necessary result of changes in epistemology and culture in general. 3) Oliver Jahraus's four methodological positions in regard to literature are also an applicable model within biblical criticism for comprehending the methodological plurality surrounding the study of the Hebrew Bible. 4) Underlying the methodological discourse in biblical criticism is the epistemological tension between the natural sciences and the humanities. 5) Biblical scholars should reconsider and analyze in detail concepts such as author and editor to overcome the dichotomy between the Göttingen and Cross schools. 6) To say something about the historicity of 2 Kings 22:1–23:30 one must bring together disparate elements from various disciplines and, finally, admit that though it may be possible to draw some permanent results, our conclusions often remain provisional.

Relevance:

20.00%

Publisher:

Abstract:

Code Division Multiple Access (CDMA) techniques have so far been applied to LAN problems by many investigators. An analytical study is presented of well-known algorithms for generating the orthogonal codes used in FO-CDMA systems, such as prime, quasi-prime, Optical Orthogonal, and matrix codes. Algorithms for OOCs such as the Greedy, Modified Greedy, and Accelerated Greedy algorithms are implemented, and many speed-up enhancements for these algorithms are suggested. A novel Synthetic Algorithm based on Difference Sets (SADS) is also proposed, and investigations are made to vectorise/parallelise SADS so that the source code can run on parallel machines. A new matrix for code families of OOCs with different seed code-words but the same (n, w, λ) set is formulated.
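
As an illustration of the basic greedy idea only (not the Modified or Accelerated variants, nor SADS), a minimal Python sketch of brute-force greedy construction of an (n, w, λ) optical orthogonal code might look as follows. The function names and the exhaustive enumeration are illustrative assumptions, practical only for small n:

```python
from itertools import combinations
from collections import Counter

def auto_diffs(block, n):
    """Multiset of nonzero cyclic differences within one codeword's positions;
    the count of difference d equals the autocorrelation at shift d."""
    return Counter((a - b) % n for a in block for b in block if a != b)

def cross_diffs(a, b, n):
    """Multiset of cyclic differences between two codewords; the count of
    difference t equals the cross-correlation at shift t."""
    return Counter((x - y) % n for x in a for y in b)

def greedy_ooc(n, w, lam):
    """Greedily accept weight-w codewords (sets of pulse positions) whose
    auto- and cross-correlations never exceed lam."""
    code = []
    for block in combinations(range(n), w):
        if any(c > lam for c in auto_diffs(block, n).values()):
            continue  # autocorrelation constraint violated
        if all(all(c <= lam for c in cross_diffs(block, other, n).values())
               for other in code):
            code.append(block)
    return code

# Example: a (13, 3, 1) code; each codeword is a tuple of pulse positions.
print(greedy_ooc(13, 3, 1))
```

The accelerated variants discussed in the abstract prune this search; the sketch above is only the naive baseline that scans all C(n, w) candidate blocks.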

Relevance:

20.00%

Publisher:

Abstract:

In this work we explore the application of wireless sensor technologies for the benefit of small and marginal farmers in semi-arid regions. The focus of this paper is to discuss the merits and demerits of data-gathering and relay paradigms that collect localized data over a wide area. The data gathered include soil moisture, temperature, pressure, rainfall, and humidity. The challenge to technology intervention arises mainly for two reasons: (a) farmers are generally interested in crop yield specific to their own piece of land, because soil texture can vary rapidly over small regions; and (b) due to high run-off, soil moisture retention can vary from region to region depending on the topology of the farm. Both reasons alter the needs drastically. Additionally, small and marginal farms can be sandwiched between rich farmlands. The village has very little access to grid power: power cuts can extend up to 12 hours in a day and up to 3 or 4 days during some months of the year. In this paper we discuss three technology paradigms for data relaying: Wi-Fi (Wireless Fidelity), GPRS (General Packet Radio Service), and DTN (Delay and Disruption Tolerant Networking). We detail the merits and demerits of each of these solutions and provide our final recommendations. The project site is the village of Chennakesavapura in the state of Karnataka, India.
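
To make the DTN option concrete: under intermittent power and connectivity, a node buffers readings locally and forwards the backlog whenever a link becomes available. The sketch below is a hypothetical illustration of that store-and-forward pattern, not the paper's implementation; all names are invented:

```python
import time
from collections import deque

class DtnSensorNode:
    """Toy store-and-forward relay: readings are queued locally and the
    backlog is flushed whenever connectivity returns (the DTN pattern)."""

    def __init__(self, max_backlog=10_000):
        self.backlog = deque(maxlen=max_backlog)  # oldest readings drop first

    def record(self, reading):
        """Called on every sensor sample, regardless of connectivity."""
        self.backlog.append((time.time(), reading))

    def on_contact(self, send):
        """Called when a relay link is up; `send` transmits one bundle and
        returns True on success. Returns the number of readings delivered."""
        delivered = 0
        while self.backlog:
            bundle = self.backlog[0]
            if not send(bundle):      # link failed mid-transfer
                break                 # keep the bundle for the next contact
            self.backlog.popleft()    # delete only after confirmed delivery
            delivered += 1
        return delivered

# Usage: queue readings offline, then flush during a brief contact window.
node = DtnSensorNode()
node.record({"soil_moisture": 0.18, "temp_c": 31.2})
print(node.on_contact(lambda bundle: True))  # stand-in for a real radio send
```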

Relevance:

20.00%

Publisher:

Abstract:

Fusion of multi-sensor imaging data enables a synergetic interpretation of the complementary information obtained by sensors of different spectral ranges. Multi-sensor data of diverse spectral, spatial, and temporal resolutions require advanced numerical techniques for analysis and interpretation. This paper reviews ten advanced pixel-based image fusion techniques: Component substitution (COS), Local mean and variance matching, Modified IHS (Intensity Hue Saturation), Fast Fourier Transform-enhanced IHS, Laplacian Pyramid, Local regression, Smoothing filter (SF), Sparkle, SVHC, and Synthetic Variable Ratio. These techniques were tested on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution). Evaluation of the fused results through various accuracy measures revealed that the SF and COS methods produce images closest to what the corresponding multispectral sensor would observe at the highest resolution level (1 m).
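
For orientation, the simplest member of the IHS family can be written in a few lines: inject into each multispectral band the spatial detail the panchromatic band adds over the multispectral intensity. This is a generic fast-IHS sketch, not the modified variant the paper evaluates, and it assumes the MS image has already been resampled to the 1 m pan grid:

```python
import numpy as np

def fast_ihs_fuse(ms, pan):
    """Additive fast-IHS pansharpening.
    ms:  (H, W, B) multispectral image, already upsampled to the pan grid.
    pan: (H, W) panchromatic band at full resolution."""
    intensity = ms.mean(axis=2)           # crude intensity: band average
    detail = pan - intensity              # high-resolution spatial detail
    return ms + detail[:, :, np.newaxis]  # add the same detail to every band

# Usage with random stand-in data shaped like 4-band MS plus a 1 m pan band.
ms = np.random.rand(256, 256, 4)
pan = np.random.rand(256, 256)
print(fast_ihs_fuse(ms, pan).shape)  # (256, 256, 4)
```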

Relevance:

20.00%

Publisher:

Abstract:

The goal of this study was to examine the effect of maternal iron deficiency on the developing hippocampus, in order to define a developmental window for this effect and to see whether iron deficiency causes changes in glucocorticoid levels. The study was carried out using pre-natal, post-natal, and pre + post-natal iron deficiency paradigms. Iron-deficient pregnant dams and their pups displayed elevated corticosterone which, in turn, differentially affected glucocorticoid receptor (GR) expression in the CA1 and the dentate gyrus. Brain-Derived Neurotrophic Factor (BDNF) was reduced in the hippocampi of pups following elevated corticosterone levels. Reduced neurogenesis at P7 was seen in pups born to iron-deficient mothers, and these pups had reduced numbers of hippocampal pyramidal and granule cells as adults. Hippocampal subdivision volumes were also altered. The structural and molecular defects in the pups were correlated with radial arm maze performance; reference memory function was especially affected. Pups from dams that were iron deficient throughout pregnancy and lactation displayed the complete spectrum of defects, while pups from dams that were iron deficient only during pregnancy or only during lactation displayed subsets of defects. These findings show that maternal iron deficiency is associated with altered levels of corticosterone and GR expression, and with spatial memory deficits in the pups. (C) 2013 Elsevier Inc. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The world is in the midst of a biodiversity crisis, threatening essential goods and services on which humanity depends. While there is an urgent need globally for biodiversity research, growing obstacles are severely limiting biodiversity research throughout the developing world, particularly in Southeast Asia. Facilities, funding, and expertise are often limited throughout this region, reducing the capacity for local biodiversity research. Although western scientists generally have more expertise and capacity, international research has sometimes been exploitative "parachute science," creating a culture of suspicion and mistrust. These issues, combined with misplaced fears of biopiracy, have resulted in severe roadblocks to biodiversity research in the very countries that need it the most. Here, we present an overview of challenges to biodiversity research and case studies that provide productive models for advancing biodiversity research in developing countries. Key to success is integration of research and education, a model that fosters sustained collaboration by focusing on the process of conducting biodiversity research as well as research results. This model simultaneously expands biodiversity research capacity while building trust across national borders. It is critical that developing countries enact policies that protect their biodiversity capital without shutting down international and local biodiversity research that is essential to achieve the long-term sustainability of biodiversity, promoting food security and economic development.

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

Previous studies of the effects of ketamine, an N-methyl-D-aspartate (NMDA) antagonist, on memory in animals have been limited to sub-anesthetic doses given prior to training. We evaluated the effects of post-training anesthetic doses of ketamine to se

Relevance:

20.00%

Publisher:

Abstract:

Tucker

Relevance:

20.00%

Publisher:

Abstract:

Recent developments in microfabrication and nanotechnology will enable the inexpensive manufacturing of massive numbers of tiny computing elements with sensors and actuators. New programming paradigms are required to obtain organized, coherent behavior from the cooperation of large numbers of unreliable processing elements interconnected in unknown, irregular, and possibly time-varying ways. Amorphous computing is the study of developing and programming such ultrascale computing environments. This paper presents an approach to programming an amorphous computer by spontaneously organizing an unstructured collection of processing elements into cooperative groups and hierarchies. It introduces a structure called an AC hierarchy, which logically organizes processors into groups at different levels of granularity. The AC hierarchy simplifies programming of an amorphous computer through new language abstractions, facilitates the design of efficient and robust algorithms, and simplifies the analysis of their performance. Several example applications that benefit greatly from the AC hierarchy are presented, and three algorithms for constructing multiple levels of the hierarchy from an unstructured collection of processors are introduced.
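
As a rough intuition for level-by-level grouping, the toy, centralized simulation below elects random leaders within a radius and then repeats the process on the leaders with a larger radius, in the spirit of amorphous-computing "clubs" clustering. It is not one of the paper's three algorithms, and all names are illustrative:

```python
import random

def dist(a, b):
    """Euclidean distance between two 2-D points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def cluster(points, radius, rng):
    """One level of grouping: leaders are elected in random wake-up order,
    and every unassigned point within `radius` of a new leader joins it."""
    member_of, leaders = {}, []
    order = list(range(len(points)))
    rng.shuffle(order)  # random order stands in for the nodes' random timers
    for i in order:
        if i in member_of:
            continue            # already recruited by an earlier leader
        leaders.append(i)       # i heard no leader in time, so it becomes one
        member_of[i] = i
        for j in range(len(points)):
            if j not in member_of and dist(points[i], points[j]) <= radius:
                member_of[j] = i
    return leaders, member_of

def build_hierarchy(points, radii, seed=0):
    """Repeat the grouping with growing radii: the leaders of one level act
    as the nodes of the next, producing coarser groups level by level."""
    rng = random.Random(seed)
    idx = list(range(len(points)))   # global ids of this level's nodes
    levels = []
    for r in radii:
        pts = [points[i] for i in idx]
        leaders, member_of = cluster(pts, r, rng)
        levels.append({idx[j]: idx[member_of[j]] for j in range(len(pts))})
        idx = [idx[i] for i in leaders]
    return levels

# 200 random nodes grouped at three increasingly coarse levels.
rng = random.Random(42)
points = [(rng.random(), rng.random()) for _ in range(200)]
for lvl, mapping in enumerate(build_hierarchy(points, [0.1, 0.3, 0.9])):
    print(f"level {lvl}: {len(set(mapping.values()))} groups")
```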

Relevance:

20.00%

Publisher:

Abstract:

Grattan, J. (2005). Pollution and paradigms: lessons from Icelandic volcanism for continental flood basalt studies. Lithos, 79, 343–353.

Relevance:

20.00%

Publisher:

Abstract:

Semiconductor nanowires are pseudo-1-D structures in which the semiconducting material is confined to less than 100 nm in two dimensions. Semiconductor nanowires have a vast range of potential applications, including electronic (logic devices, diodes), photonic (lasers, photodetectors), biological (sensors, drug delivery), energy (batteries, solar cells, thermoelectric generators), and magnetic (spintronic, memory) devices. Semiconductor nanowires can be fabricated by a range of methods, which can be categorised into one of two paradigms: bottom-up or top-down. Bottom-up processes can be defined as those in which structures are assembled from their sub-components in an additive fashion; top-down fabrication strategies use sculpting or etching to carve structures from a larger piece of material in a subtractive fashion. This seminar will detail a number of novel routes to fabricating semiconductor nanowires within both paradigms. Firstly, a novel bottom-up route to fabricate Ge nanowires with controlled diameter distributions in the sub-20 nm regime will be described; this route achieves nanowire synthesis and diameter control in the absence of a foreign seed metal catalyst. Additionally, a top-down route to nanowire-array fabrication will be detailed, outlining the importance of surface chemistry in high-resolution electron beam lithography (EBL) using hydrogen silsesquioxane (HSQ) on Ge and Bi2Se3 surfaces. Finally, a process will be described for the directed self-assembly of a diblock copolymer (PS-b-PDMS) using an EBL-defined template. This section will also detail a route toward selective template-sidewall wetting by either block of the PS-b-PDMS system, through tailored functionalisation of the template and substrate surfaces.