575 results for scatter hoarding
Abstract:
The title of the study is "Toxicology Literature: An Informetric Analysis". In the field of Toxicology, interdisciplinary research has resulted in 'information fragmentation' of the basic subject into environmental, medical and economic toxicology. The interest in collaborative research led to the transdisciplinary growth of Toxicology, which ultimately resulted in the scatter of its literature. For the purpose of the present study, Toxicology is defined as the physical and chemical aspects of all poisons affecting the environmental, economic and medical aspects of human life. Informetrics is "the use and development of a variety of measures to study and analyse several properties of information in general and documents in particular." The present study sheds light on the main fields of Toxicology research as well as the important primary journals through which results are published. The authorship pattern, subject-wise scatter, country-wise and language-wise distribution, growth pattern, self-citation, and bibliographic coupling of the journals were studied. The study will be of great use in formulating the acquisition policy of documents in a library, and is also useful in identifying obsolete journals so that they can be discarded from the collection.
Abstract:
At intermediate depths of the Arabian Sea, the circulation and water characteristics are largely influenced by high-salinity waters from the north and low-salinity waters from south of the equator. The interaction of these waters, which differ greatly in their characteristics, is less well understood than that in the upper layers. An understanding of the nature of the intermediate waters is of vital importance not only because of the unusual characteristics of the waters but also because of the influx of different water masses from the neighbouring Red Sea and Persian Gulf. Hence, the present investigation studies the water characteristics and current structure of the intermediate waters of the Arabian Sea through the distribution of water properties on the isanosteric surfaces of 100, 80, 60 and 40 cl/t, vertical sections, and scatter diagrams. An attempt is also made to present the potential vorticity between different steric levels to understand the circulation and mixing processes. Data collected during and subsequent to the International Indian Ocean Expedition (IIOE) are used for this study. The thesis has been divided into six chapters with further subdivisions.
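For reference (a hedged sketch, not necessarily the exact definition used in the thesis), the potential vorticity of a layer bounded by two steric surfaces is commonly approximated by its planetary part,

\[
q \;\approx\; \frac{f}{\rho}\,\frac{\Delta\rho}{\Delta z},
\]

where f is the Coriolis parameter, Δρ the density difference across the layer and Δz its thickness; evaluating q between successive steric levels is one way to diagnose the circulation and mixing processes mentioned above.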
Abstract:
The thesis is divided into six chapters, with further subdivisions. Chapter one has two sections: section one deals with a general introduction, and section two with the material and treatment of data for the present investigation. The second chapter concerns the distribution of oxyty in the oxygen minimum layer and its topography during the southwest and northeast monsoons. The distribution of oxyty at the various isanosteric surfaces within which the oxygen minimum layer lies during the southwest and northeast monsoons, and their topographies, forms chapter three. In the fourth chapter the flow pattern and its influence on the oxygen minimum layer are discussed. The fifth chapter presents the scatter diagrams of oxyty against temperature at the various isanosteric surfaces. The sixth chapter summarises the results of the investigation and presents the conclusions drawn therefrom.
Abstract:
Knowledge discovery in databases is the non-trivial process of identifying valid, novel, potentially useful and ultimately understandable patterns in data. The term data mining refers to the process of performing exploratory analysis on the data and building a model from it. To infer patterns from data, data mining involves different approaches such as association rule mining, classification techniques and clustering techniques. Among the many data mining techniques, clustering plays a major role, since it helps to group related data for assessing properties and drawing conclusions. Most clustering algorithms act on a dataset with a uniform format, since the similarity or dissimilarity between data points is a significant factor in finding the clusters. If a dataset consists of mixed attributes, i.e. a combination of numerical and categorical variables, a preferred approach is to convert the different formats into a uniform one. The research study explores various techniques to convert mixed data sets to a numerical equivalent, so that statistical and similar algorithms can be applied; a minimal sketch of this idea is given below. The results of clustering mixed-category data after conversion to a numeric data type are demonstrated using a crime data set. The thesis also proposes an extension of a well-known algorithm for handling mixed data types to deal with data sets having only categorical data. The proposed conversion has been validated on a data set corresponding to breast cancer. Moreover, another issue with the clustering process is the visualization of output. Different geometric techniques, such as scatter plots or projection plots, are available, but none of them displays a result projecting the whole database; rather, they demonstrate attribute pair-wise analysis.
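As an illustration only (the thesis's own conversion scheme and the crime data set are not reproduced here), the following hedged Python sketch shows one common way to bring mixed attributes to a uniform numeric format before clustering: one-hot encoding for categorical columns, standardisation for numeric ones, then k-means. The column names and values are hypothetical.

```python
# Hypothetical sketch: convert a mixed numeric/categorical table to a
# purely numeric matrix, then cluster it with k-means.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.cluster import KMeans

# Made-up data set with one numeric and one categorical attribute.
df = pd.DataFrame({
    "incidents": [12, 45, 7, 33, 28],          # numeric attribute
    "area_type": ["urban", "urban", "rural",   # categorical attribute
                  "suburban", "rural"],
})

# Scale numeric columns and one-hot encode categorical columns.
to_numeric = ColumnTransformer([
    ("num", StandardScaler(), ["incidents"]),
    ("cat", OneHotEncoder(), ["area_type"]),
])
X = to_numeric.fit_transform(df)

# Cluster the uniform numeric representation.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```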
Abstract:
Efficient optic disc segmentation is an important task in automated retinal screening. Optic disc detection is fundamental as a medical reference and is important for retinal image analysis applications. The most difficult problem in optic disc extraction is locating the region of interest, and it is a time-consuming task. This paper tries to overcome this barrier by presenting an automated method for optic disc boundary extraction using Fuzzy C-Means clustering combined with thresholding. The discs determined by the new method agree relatively well with those determined by experts. The method has been validated on a data set of 110 colour fundus images from the DRION database, and has obtained promising results. The performance of the system is evaluated using the differences in the horizontal and vertical diameters between the obtained disc boundary and the ground truth provided by two expert ophthalmologists. For the 25 test images selected from the 110 colour fundus images, the Pearson correlations of the ground-truth diameters with the diameters detected by the new method are 0.946 and 0.958, and 0.94 and 0.974, respectively. The scatter plot shows that the ground-truth and detected diameters have a high positive correlation. This computerised analysis of the optic disc is very useful for the diagnosis of retinal diseases.
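The validation step described here (correlating detected diameters with expert ground truth) can be sketched as follows; this is a hedged, hypothetical example, not the paper's code, and the diameter values are made up.

```python
# Hypothetical sketch of the validation step: correlate detected optic-disc
# diameters with expert ground-truth diameters.
import numpy as np
from scipy.stats import pearsonr

# Made-up horizontal diameters (pixels) for a handful of test images.
ground_truth_horizontal = np.array([182, 175, 190, 168, 177])
detected_horizontal     = np.array([180, 171, 193, 165, 179])

r, p_value = pearsonr(ground_truth_horizontal, detected_horizontal)
print(f"Pearson r = {r:.3f} (p = {p_value:.3g})")
```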
Abstract:
One of the interesting consequences of Einstein's General Theory of Relativity is its black hole solutions. Until the observation made by Hawking in the 1970s, it was believed that black holes are perfectly black. The General Theory of Relativity says that black holes are objects which absorb both matter and radiation crossing the event horizon. The event horizon is a surface through which even light is unable to escape; it acts as a one-sided membrane that allows the passage of particles in only one direction, towards the centre of the black hole. All the particles absorbed by a black hole increase its mass, and thus the size of the event horizon also increases. Hawking showed in the 1970s that, when quantum mechanical laws are applied to black holes, they are not perfectly black but can emit radiation. Thus a black hole can have a temperature, known as the Hawking temperature. In the thesis we have studied some aspects of black holes in f(R) theory of gravity and in Einstein's General Theory of Relativity. The scattering of a scalar field in the background space-time of the extended black hole, studied in the first chapter, shows that this black hole scatters scalar waves and has a scattering cross section; applying the tunneling mechanism, we have obtained the Hawking temperature of this black hole. In the following chapter we have investigated the quasinormal properties of the extended black hole. We have studied electromagnetic and scalar perturbations in this space-time and find that the black hole frequencies are complex and show exponential damping, indicating that the black hole is stable against these perturbations. In the present study we show that black holes not only exist in modified gravities but also have properties similar to those of black hole space-times in General Relativity. 2+1-dimensional, or three-dimensional, black holes are simplified examples of the more complicated four-dimensional black holes, and are therefore known as toy models for the four-dimensional black holes of General Relativity. We have studied some properties of these black holes in the Einstein model (General Theory of Relativity). A three-dimensional black hole known as the MSW black hole is taken for our study. The thermodynamics and spectroscopy of the MSW black hole are studied; the area spectrum, which is equispaced, is obtained, and different thermodynamic properties are examined. The Dirac perturbation of this three-dimensional black hole is studied and the resulting quasinormal spectrum is obtained. The different quasinormal frequencies are tabulated, and these values show an exponential damping of oscillations, indicating that the black hole is stable against the massless Dirac perturbation. In General Relativity almost all solutions contain singularities; the cosmological solution and the different black hole solutions of Einstein's field equations contain singularities. Regular black hole solutions are those which are solutions of Einstein's equations and have no singularity at the origin; they possess an event horizon but no central singularity. Such a solution was first put forward by Bardeen, and Hayward proposed a similar regular black hole solution. We have studied the thermodynamics and spectroscopy of Hayward regular black holes and have obtained the different thermodynamic properties and the area spectrum, which is a function of the horizon radius. The entropy-heat capacity curve has a discontinuity at some value of entropy, indicating a phase transition.
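For context, the textbook Hawking temperature of a Schwarzschild black hole of mass M is

\[
T_H \;=\; \frac{\hbar c^{3}}{8\pi G M k_{B}},
\]

and tunneling calculations of the kind referred to above reduce to this expression in the Schwarzschild limit; the f(R)-extended and MSW results obtained in the thesis are not reproduced here.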
Abstract:
Sediment composition is mainly controlled by the nature of the source rock(s) and by the chemical (weathering) and physical processes (mechanical crushing, abrasion, hydrodynamic sorting) acting during alteration and transport. Although the factors controlling these processes are conceptually well understood, detailed quantifications of the compositional changes induced by a single process are rare, as are examples where the effects of several processes can be distinguished. The present study was designed to characterize the role of mechanical crushing and sorting in the absence of chemical weathering. Twenty sediment samples were taken from Alpine glaciers that erode almost pure granitoid lithologies. For each sample, 11 grain-size fractions from granules to clay (ø grades <-1 to >9) were separated, and each fraction was analysed for its chemical composition. The presence of clear steps in the box-plots of all parts (in adequate ilr and clr scales) against ø is assumed to be explained by the typical crystal size ranges of the relevant mineral phases. These scatter plots and the biplot suggest splitting the full grain-size range into three groups: coarser than ø=4 (comparatively rich in SiO2, Na2O, K2O and Al2O3, and dominated by “felsic” minerals like quartz and feldspar), finer than ø=8 (comparatively rich in TiO2, MnO, MgO and Fe2O3, mostly related to “mafic” sheet silicates like biotite and chlorite), and intermediate grain sizes (4≤ø<8; comparatively rich in P2O5 and CaO, related to apatite and some feldspar). To further test the absence of chemical weathering, the observed compositions were regressed against three explanatory variables: a trend on grain size in the ø scale, a step function for ø≥4, and another for ø≥8. The original hypothesis was that the trend could be identified with weathering effects, whereas each step function would highlight those minerals with the largest characteristic size at its lower end. The results suggest that this assumption is reasonable for the step functions, but that, besides weathering, other factors (the different mechanical behaviour of the minerals) also make an important contribution to the trend. Key words: sediment, geochemistry, grain size, regression, step function
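As a hedged sketch of the model form described above (the study's actual design matrix and coordinates are not reproduced here), the centred log-ratio transform and a regression combining a grain-size trend with two step functions can be written as

\[
\operatorname{clr}(\mathbf{x}) = \Bigl(\ln\frac{x_1}{g(\mathbf{x})},\ldots,\ln\frac{x_D}{g(\mathbf{x})}\Bigr),
\qquad g(\mathbf{x}) = \Bigl(\prod_{i=1}^{D} x_i\Bigr)^{1/D},
\]
\[
y(\phi) \;=\; \beta_0 + \beta_1\,\phi + \beta_2\,\mathbf{1}\{\phi \ge 4\} + \beta_3\,\mathbf{1}\{\phi \ge 8\} + \varepsilon,
\]

where y is a clr (or ilr) coordinate of the composition, φ is grain size in ø units, and 1{·} is the indicator (step) function.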
Abstract:
In CoDaWork’05 we presented an application of discriminant function analysis (DFA) to four different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation from the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration, as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set, or onto missing samples between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This new methodology provides an alternative, more acceptable approach for geologists, as their focus is on matching the unknown tephra with the relevant eruptive events rather than on estimating its age. Key words: Tephrochronology; Segmented regression
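A hedged, self-contained sketch of a simple segmented (piecewise-linear) regression of the kind mentioned, with a single assumed breakpoint and made-up data, could look like this (it is not the authors' model):

```python
# Hypothetical sketch: ordinary least squares fit of a piecewise-linear
# (segmented) model y = b0 + b1*x + b2*max(x - c, 0) with a known breakpoint c.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 60)
c = 5.0                                    # assumed breakpoint
y_true = 1.0 + 0.5 * x + 2.0 * np.maximum(x - c, 0.0)
y = y_true + rng.normal(scale=0.3, size=x.size)

# Design matrix: intercept, linear trend, hinge term at the breakpoint.
X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0.0)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", coef)
```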
Abstract:
In this video, taken in front of the Parthenon at the Acropolis in Athens, Su White interviews Les Carr about why he asserts that there is a moral duty for teachers who create educational content to put that content in the public domain, rather than hoarding it in their institution.
Abstract:
Jaundice is a frequent cause of hospitalisation in neonates. The main objective was to establish the correlation between transcutaneous bilirubinometry and serum bilirubin, and its usefulness for identifying the need for phototherapy. Materials and methods: This is a descriptive correlation study. It included neonates who underwent both transcutaneous bilirubinometry and serum bilirubin measurement. Pearson correlation coefficients were calculated for the chest and the forehead. The variability of the differences between transcutaneous levels and serum bilirubin was assessed with a Bland-Altman plot. Test performance measures, sensitivity and specificity, were obtained. Results: 88 neonates were included, with 112 transcutaneous bilirubinometry measurements (median on the chest: 6.0 mg/dl; on the forehead: 5.9 mg/dl). 95 serum bilirubin samples were taken (median: 7.2 mg/dl). The Pearson correlation coefficients were 0.83 (chest) and 0.93 (forehead). Using established cut-off points to determine the need for phototherapy, a sensitivity of 93% and specificity of 86% (chest) and a sensitivity of 94% and specificity of 83% (forehead) were obtained. Using the 75th percentile of the Bhutani nomogram, a sensitivity of 83% and specificity of 98% were obtained for transcutaneous bilirubinometry. Discussion: Transcutaneous measurements correlated well with blood measurements, and the sensitivities obtained were satisfactory. Conclusion: Transcutaneous measurements would have detected the majority of infants requiring phototherapy, confirming their usefulness as a non-invasive method in clinical practice.
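As a purely illustrative, hypothetical sketch of the performance calculation described (the study's data and cut-off are not reproduced), sensitivity and specificity against a phototherapy threshold can be computed as:

```python
# Hypothetical sketch: sensitivity/specificity of a transcutaneous bilirubin
# reading against a serum-bilirubin phototherapy threshold.
import numpy as np

threshold_mg_dl = 12.0                                   # assumed cut-off
serum = np.array([6.8, 13.5, 9.1, 14.2, 11.0, 15.8])     # made-up values
transcutaneous = np.array([6.5, 11.4, 9.7, 13.6, 12.3, 14.9])

needs_phototherapy = serum >= threshold_mg_dl            # reference standard
flagged = transcutaneous >= threshold_mg_dl              # screening test

tp = np.sum(flagged & needs_phototherapy)
tn = np.sum(~flagged & ~needs_phototherapy)
fp = np.sum(flagged & ~needs_phototherapy)
fn = np.sum(~flagged & needs_phototherapy)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```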
Abstract:
This paper proposes two types of contracts for interbank loans so that banks can smooth their liquidity shocks through the interbank market. In particular, it studies the situation in which banks with liquidity shortfalls but low credit risk leave the market because the interest rate is high relative to their alternative source of funding. Information asymmetry about credit risk prevents banks with liquidity surpluses from adjusting the interest rate to the risk of their counterparty. Given this, two contracts for interbank loans are designed that differ in the interest rates charged: whenever a bank posts a deposit, it can obtain liquidity at low interest rates; otherwise, the rate will be higher.
Abstract:
We present an analysis of trace gas correlations in the lowermost stratosphere. In-situ aircraft measurements of CO, N2O, NOy and O3, obtained during the STREAM 1997 winter campaign, have been used to investigate the role of cross-tropopause mass exchange in tracer-tracer relations. At altitudes several kilometers above the local tropopause, undisturbed stratospheric air was found with NOy/NOy* ratios close to unity, NOy/O3 about 0.003–0.006 and CO mixing ratios as low as 20 ppbv (NOy* is a proxy for total reactive nitrogen derived from NOy–N2O relations measured in the stratosphere). Mixing of tropospheric air into the lowermost stratosphere has been identified by enhanced ratios of NOy/NOy* and NOy/O3, and from scatter plots of CO versus O3. The enhanced NOy/O3 ratio in the lowermost stratospheric mixing zone points to a reduced efficiency of O3 formation from aircraft NOx emissions.
Abstract:
The problem of modeling solar energetic particle (SEP) events is important to both space weather research and forecasting, and yet it has seen relatively little progress. Most important SEP events are associated with coronal mass ejections (CMEs) that drive coronal and interplanetary shocks. These shocks can continuously produce accelerated particles from the ambient medium to well beyond 1 AU. This paper describes an effort to model real SEP events using a Center for Integrated Space Weather Modeling (CISM) MHD solar wind simulation including a cone model of CMEs to initiate the related shocks. In addition to providing observation-inspired shock geometry and characteristics, this MHD simulation describes the time-dependent observer field line connections to the shock source. As a first approximation, we assume a shock jump-parameterized source strength and spectrum, and that scatter-free transport occurs outside of the shock source, thus emphasizing the role the shock evolution plays in determining the modeled SEP event profile. Three halo CME events on May 12, 1997, November 4, 1997 and December 13, 2006 are used to test the modeling approach. While challenges arise in the identification and characterization of the shocks in the MHD model results, this approach illustrates the importance to SEP event modeling of globally simulating the underlying heliospheric event. The results also suggest the potential utility of such a model for forecasting and for interpretation of separated multipoint measurements such as those expected from the STEREO mission.
Abstract:
We investigate the “flux excess” effect, whereby open solar flux estimates from spacecraft increase with increasing heliocentric distance. We analyze the kinematic effect on these open solar flux estimates of large-scale longitudinal structure in the solar wind flow, with particular emphasis on correcting estimates made using data from near-Earth satellites. We show that scatter, but no net bias, is introduced by the kinematic “bunching effect” on sampling and that this is true for both compression and rarefaction regions. The observed flux excesses, as a function of heliocentric distance, are shown to be consistent with open solar flux estimates from solar magnetograms made using the potential field source surface method and are well explained by the kinematic effect of solar wind speed variations on the frozen-in heliospheric field. Applying this kinematic correction to the Omni-2 interplanetary data set shows that the open solar flux at solar minimum fell from an annual mean of 3.82 × 1016 Wb in 1987 to close to half that value (1.98 × 1016 Wb) in 2007, making the fall in the minimum value over the last two solar cycles considerably faster than the rise inferred from geomagnetic activity observations over four solar cycles in the first half of the 20th century.
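For reference (a hedged sketch of the standard convention, not a formula quoted from the paper), near-Earth open solar flux estimates of this kind are usually formed from the mean absolute radial field as

\[
F_S \;\approx\; \tfrac{1}{2}\,4\pi r_1^{2}\,\langle |B_r| \rangle ,
\]

with r_1 = 1 AU; the kinematic correction discussed above amounts to removing the excess in ⟨|B_r|⟩ produced by longitudinal solar wind speed structure before applying this estimate.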
Abstract:
We propose a mechanism to explain suggested links between seismic activity and ionospheric changes detected overhead. Specifically, we explain changes in the natural extremely low-frequency (ELF) radio noise recently observed in the topside ionosphere aboard the DEMETER satellite at night, before major earthquakes. Our mechanism utilises increased electrical conductivity of surface layer air before a major earthquake, which reduces the surface-ionosphere electrical resistance. This increases the vertical fair weather current, and (to maintain continuity of electron flow) lowers the ionosphere. Magnitudes of crucial parameters are estimated and found to be consistent with observations. Natural variability in ionospheric and atmospheric electrical properties is evaluated, and may be overcome using a hybrid detection approach. Suggested experiments to investigate the mechanism involve measuring the cut-off frequency of ELF “tweeks”, the amplitude and phase of very low frequency radio waves in the Earth–ionosphere waveguide, or medium frequency radar, incoherent scatter or rocket studies of the lower ionospheric electron density.
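For context, the proposed ELF “tweek” diagnostic rests on the waveguide cut-off: in an idealised Earth-ionosphere waveguide of effective height h, the nth-order mode has a cut-off frequency of approximately

\[
f_{c,n} \;\approx\; \frac{n c}{2 h},
\]

so a lowering of the nighttime ionosphere (a smaller h) raises the observed first-order tweek cut-off, which sits near 1.8 kHz for h ≈ 85 km.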