941 results for Method of extraction


Relevância:

100.00%

Publicador:

Resumo:

Landscape geochemical investigations were conducted on portions of a natural uniform landscape in southern Norway. The work consisted of collecting both soil profile samples and spruce twigs for the analysis of twelve chemical elements: cobalt, copper, nickel, lead, zinc, manganese, magnesium, iron, calcium, sodium, potassium and aluminum, determined by atomic absorption analysis using standardized extraction techniques for both organic and inorganic materials. Two "landscape traverses" were chosen for a comparative study of the effects of varying landscape parameters upon the trace element distribution patterns. The object of this study was to test this method of investigation and the concept of an ideal uniform landscape under Norwegian conditions. A "control traverse" was established to represent uniform landscape conditions typical of the study area and was used to determine "normal" or average trace element distribution patterns. A "signal traverse" was selected nearby over an area of lead mineralization where the depth to bedrock is very small. The signal traverse provided an area of landscape conditions similar to those of the control traverse but with significant differences in bedrock configuration and composition. The study was also to determine the effect of the bedrock mineralization upon the distribution patterns of the twelve chemical elements within the major components of the two landscape traverses (i.e. soil profiles and tree branches). The lead distribution within the soils of the signal traverse showed localized accumulations of lead within the overburden, with maximum values occurring within the organic A horizon of soil profile #10. Above-average concentrations of lead were common within the signal traverse; however, the other elements studied were not significantly different from the averages determined throughout the soils of the control traverse.
The spruce twig samples did not show corresponding accumulations of lead near the soil lead anomaly, which is attributable to the very localized nature of the lead dispersion pattern within the soils. This approach to the study of the geochemistry of a natural landscape was effective in establishing: a) average or "normal" trace element distribution patterns, b) local variations in the landscape morphology, and c) the effect of unusually high lead concentrations upon the geochemistry of the landscape (i.e. within the soil profiles and tree branches). This type of study provides the basis for further, more intensive studies and serves only as a first approximation of the behaviour of elements within a natural landscape.


We offer an axiomatization of the serial cost-sharing method of Friedman and Moulin (1999). The key property in our axiom system is Group Demand Monotonicity, which asks that when a group of agents raise their demands, not all of them should pay less.
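For context, the serial idea in the homogeneous-good case can be sketched as follows. This is the classic Moulin–Shenker serial formula, which the Friedman–Moulin method generalises; the cost function and demand profile below are invented for illustration.

```python
# Serial cost sharing (homogeneous good): agent i pays the incremental cost
# of serving everyone up to i's demand level, split among those still served.
def serial_shares(demands, cost):
    """Cost shares for demands q_1 <= ... <= q_n under cost function C."""
    q = sorted(demands)
    n = len(q)
    shares, paid, prev_sigma = [], 0.0, 0.0
    for j, qj in enumerate(q, start=1):
        sigma = sum(q[: j - 1]) + (n - j + 1) * qj  # everyone capped at q_j
        paid += (cost(sigma) - cost(prev_sigma)) / (n - j + 1)
        shares.append(paid)
        prev_sigma = sigma
    return shares

shares = serial_shares([1, 2, 3], cost=lambda x: x * x)
print(shares)  # [3.0, 11.0, 22.0]
```

Note that the shares sum to the total cost C(6) = 36 and increase with demand; raising a group's demands raises at least one member's bill, in the spirit of Group Demand Monotonicity.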


Situated within the fields of Computer-Assisted Text Reading and Analysis (LATAO), Electronic Document Management (GÉD), information visualization and, in part, anthropology, this exploratory research experiments with a descriptive text-mining methodology for thematically mapping a corpus of anthropological texts. More specifically, we test the method of agglomerative hierarchical clustering (CHA) to extract and analyze the themes found in the abstracts of master's theses and doctoral dissertations awarded from 1985 to 2009 (1,240 abstracts) by the anthropology departments of the Université de Montréal and Université Laval, as well as by the history department of the Université Laval (for abstracts in archaeology and ethnology). In the first part of the thesis we present our theoretical framework: we explain what text mining is, its origins, its applications and its methodological steps, and we conclude with a review of the principal publications. The second part is devoted to the methodological framework, covering the stages through which the project was conducted: data collection, linguistic filtering and automatic clustering, to name only a few. Finally, in the last part we present the results of our research, focusing in particular on two experiments. We also discuss thematic navigation and conceptual approaches to theme identification, for example the culture/biology dichotomy in anthropology. We conclude with the limitations of this project and avenues of interest for future research.
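The clustering step at the heart of this methodology can be sketched as follows: a minimal agglomerative hierarchical clustering (CHA) run on a toy corpus. The four "abstracts" and the bag-of-words weighting are invented for illustration and stand in for the linguistically filtered, 1,240-abstract corpus.

```python
# Agglomerative hierarchical clustering of documents on bag-of-words vectors.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

docs = [
    "ritual kinship village ritual",          # toy ethnology abstracts
    "kinship village ceremony",
    "pottery stratigraphy excavation",        # toy archaeology abstracts
    "excavation pottery sherd stratigraphy",
]

# Term-frequency matrix over the corpus vocabulary.
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)
# L2-normalise rows so Euclidean distance tracks cosine dissimilarity.
X /= np.linalg.norm(X, axis=1, keepdims=True)

Z = linkage(X, method="ward")                 # bottom-up agglomeration
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

Cutting the dendrogram at two clusters groups the two ethnology-like documents together and the two archaeology-like documents together, which is the thematic-mapping behaviour the methodology relies on.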


The standard models for statistical signal extraction assume that the signal and noise are generated by linear Gaussian processes. The optimum filter weights for those models are derived by the method of minimum mean square error. In the present work we study the properties of signal extraction models under the assumption that the signal and noise are generated by symmetric stable processes. The optimum filter is obtained by the method of minimum dispersion. The performance of the new filter is compared with that of its Gaussian counterpart by simulation.
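The failure mode that motivates minimum dispersion can be illustrated as follows. This is not the paper's filter itself, only a sketch (with an invented constant signal) of why mean-square-error estimators break down under symmetric stable noise with α < 2, where the variance is infinite.

```python
# Cauchy noise (alpha = 1 symmetric stable) added to a constant signal:
# the sample mean is erratic while the sample median stays near the signal.
import numpy as np

rng = np.random.default_rng(0)
signal = 5.0
x = signal + rng.standard_cauchy(10_000)  # heavy-tailed stable noise

mean_est = x.mean()        # variance-based estimate: unreliable here
median_est = np.median(x)  # robust location estimate
print(f"mean: {mean_est:.2f}  median: {median_est:.2f}")
```

A minimum-dispersion design replaces the (infinite) variance with the dispersion parameter of the stable law as the quantity being minimised.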


In the medical field, microwaves play a larger role in treatment than in diagnosis. For the detection of diseases by microwave methods, it is essential to know the dielectric properties of biological materials. In the present study, a cavity perturbation technique was employed to determine the dielectric properties of such materials. Rectangular cavity resonators were used to measure the complex permittivity of human bile, bile stones, gastric juice and saliva. The measurements were carried out in the S and J bands. It is observed that normal and infected bile have different dielectric constants and loss tangents. The dielectric constants of infected bile and gastric juice vary from patient to patient. Detection and extraction of bile stones, with a possible method of treatment, are also discussed.
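The calculation behind such measurements can be sketched with the standard first-order cavity-perturbation expressions for a small sample placed at an electric-field maximum of a rectangular cavity; all numerical values below are invented for illustration, not measured data from this study.

```python
# First-order cavity perturbation: permittivity from the shift in resonant
# frequency and the change in quality factor when a small sample is inserted.
def permittivity(f0, fs, q0, qs, vc, vs):
    """Complex relative permittivity (real part, imaginary part).

    f0, q0: resonant frequency (Hz) and Q of the empty cavity
    fs, qs: the same with the sample inserted
    vc, vs: cavity and sample volumes (same units)
    """
    eps_real = 1.0 + (vc / (2.0 * vs)) * (f0 - fs) / fs
    eps_imag = (vc / (4.0 * vs)) * (1.0 / qs - 1.0 / q0)
    return eps_real, eps_imag

# Invented S-band numbers: a 10 MHz downward shift and a drop in Q.
er, ei = permittivity(f0=2.44e9, fs=2.43e9, q0=3000.0, qs=2000.0,
                      vc=1.0e-4, vs=1.0e-8)
loss_tangent = ei / er
print(er, ei, loss_tangent)
```

The real part follows the frequency shift and the imaginary part (hence the loss tangent) follows the degradation in Q, which is how differing loss tangents between normal and infected bile would show up.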


While channel coding is a standard method of improving a system’s energy efficiency in digital communications, its practice does not extend to high-speed links. Increasing demands on network speed are placing a large burden on the energy efficiency of high-speed links and make the benefit of channel coding for these systems a timely subject. The low error rates of interest and the presence of residual intersymbol interference (ISI) caused by hardware constraints impede the analysis and simulation of coded high-speed links. Focusing on the residual ISI and combined noise as the dominant error mechanisms, this paper analyses error correlation through the concepts of error region, channel signature, and correlation distance. This framework provides a deeper insight into joint error behaviours in high-speed links, extends the range of statistical simulation for coded high-speed links, and provides a case against the use of biased Monte Carlo methods in this setting.


Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution; nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use.
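The time-stratified design named above has a simple referent-selection rule that can be sketched directly: control days are all other days in the same calendar month that fall on the same day of the week as the event day. The example date is invented.

```python
# Time-stratified referent selection for a case-crossover design.
from calendar import monthrange
from datetime import date

def time_stratified_referents(event_day: date) -> list[date]:
    """All same-weekday days in the event's month, excluding the event day."""
    days_in_month = monthrange(event_day.year, event_day.month)[1]
    return [
        d
        for i in range(1, days_in_month + 1)
        if (d := date(event_day.year, event_day.month, i)) != event_day
        and d.weekday() == event_day.weekday()
    ]

refs = time_stratified_referents(date(2005, 7, 13))  # a Wednesday
print(refs)  # the other Wednesdays of July 2005
```

Matching on weekday within a fixed stratum (the month) is what controls for day-of-week patterns and long-term trends in the exposure series.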


The time-of-detection method for aural avian point counts is a new method of estimating abundance that allows for uncertain probability of detection. The method has been specifically designed to allow for variation in the singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording, for each bird, the subintervals in which it sings. The method can be viewed as generating data equivalent to closed capture–recapture information. The method differs from the distance and multiple-observer methods in that it does not require all the birds to sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated following a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused both by the very low detection probabilities of all distant individuals and by individuals with low singing rates also having very low detection probabilities.


Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining this information with environmental models will be discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with a dense and accurate floodplain topography together with vegetation heights for parameterisation of model friction. The vegetation height data can be used to specify a friction factor at each node of a model’s finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model’s finite element mesh to reflect floodplain features such as hedges and trees, which have different frictional properties to their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data. The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections.
The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level, knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, and a high-level processing stage then improves the network using domain knowledge. The approach adopted at the low level uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges to form channels. The higher-level processing includes a channel repair mechanism.
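The low-level step can be sketched as follows: multi-scale edge detection on a synthetic DEM containing a single idealised tidal channel (a flat surface with a 2 m deep depression), whose banks show up as a pair of anti-parallel edge responses. The DEM, scales, and channel geometry are invented for illustration.

```python
# Multi-scale edge detection on a synthetic DEM: a channel's two banks
# produce the strongest gradient responses on either side of the channel.
import numpy as np
from scipy.ndimage import gaussian_gradient_magnitude

dem = np.zeros((64, 128))
dem[:, 40:60] -= 2.0  # a 2 m deep, 20-cell-wide channel in a flat surface

# Edge strength at several scales; a real channel responds across scales,
# so we keep the maximum response per cell.
edges = np.maximum.reduce(
    [gaussian_gradient_magnitude(dem, sigma=s) for s in (1, 2, 4)]
)

row = edges[32]                       # one cross-section of the DEM
left_bank = int(row[:50].argmax())    # strongest response in each half
right_bank = int(50 + row[50:].argmax())
print(left_bank, right_bank)
```

Pairing such adjacent anti-parallel edges into channel fragments is the next step the text describes, before the knowledge-based repair stage.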


The soil fauna is often a neglected group in many large-scale studies of farmland biodiversity due to difficulties in extracting organisms efficiently from the soil. This study assesses the relative efficiency of the simple and cheap sampling method of handsorting against Berlese-Tullgren funnel and Winkler apparatus extraction. Soil cores were taken from grassy arable field margins and wheat fields in Cambridgeshire, UK, and the efficiencies of the three methods in assessing the abundances and species densities of soil macroinvertebrates were compared. Handsorting was in most cases as efficient at extracting the majority of the soil macrofauna as the Berlese-Tullgren funnel and Winkler bag methods, although it underestimated the species densities of the woodlice and adult beetles. There were no obvious biases among the three methods for the particular vegetation types sampled, and no significant differences in the size distributions of the earthworms and beetles. Proportionally fewer damaged earthworms were recorded in larger (25 x 25 cm) soil cores than in smaller ones (15 x 15 cm). Handsorting has many benefits, including targeted extraction, minimum disturbance to the habitat and shorter sampling periods, and may be the most appropriate method for studies of farmland biodiversity when a high number of soil cores needs to be sampled. (C) 2008 Elsevier Masson SAS. All rights reserved.


LDL oxidation may be important in atherosclerosis. Extensive oxidation of LDL by copper induces increased uptake by macrophages, but results in decomposition of hydroperoxides, making it more difficult to investigate the effects of hydroperoxides in oxidised LDL on cell function. We describe here a simple method of oxidising LDL by dialysis against copper ions at 4 °C, which inhibits the decomposition of hydroperoxides, and allows the production of LDL rich in hydroperoxides (626 ± 98 nmol/mg LDL protein) but low in oxysterols (3 ± 1 nmol 7-ketocholesterol/mg LDL protein), whilst allowing sufficient modification (2.6 ± 0.5 relative electrophoretic mobility) for rapid uptake by macrophages (5.49 ± 0.75 µg I-125-labelled hydroperoxide-rich LDL vs. 0.46 ± 0.04 µg protein/mg cell protein in 18 h for native LDL). By dialysing under the same conditions but at 37 °C, the hydroperoxides are decomposed extensively and the LDL becomes rich in oxysterols. This novel method of oxidising LDL with high yield to either a hydroperoxide- or oxysterol-rich form by simply altering the temperature of dialysis may provide a useful tool for determining the effects of these different oxidation products on cell function. (C) 2007 Elsevier Ireland Ltd. All rights reserved.


This paper examines the interaction of spatial and dynamic aspects of resource extraction from forests by local people. Highly cyclical and varied across space and time, the patterns of resource extraction resulting from the spatial–temporal model bear little resemblance to the patterns drawn from focusing either on spatial or temporal aspects of extraction alone. Ignoring this variability inaccurately depicts villagers’ dependence on different parts of the forest and could result in inappropriate policies. Similarly, the spatial links in extraction decisions imply that policies imposed in one area can have unintended consequences in other areas. Combining the spatial–temporal model with a measure of success in community forest management—the ability to avoid open-access resource degradation—characterizes the impact of incomplete property rights on patterns of resource extraction and stocks.


The adaptive thermal comfort theory considers people as active rather than passive recipients in response to ambient physical thermal stimuli, in contrast with the conventional, heat-balance-based thermal comfort theory. Occupants actively interact with the environments they occupy by utilizing physiological, behavioural and psychological adaptations to achieve ‘real world’ thermal comfort. This paper introduces a method of quantifying the physiological, behavioural and psychological portions of the adaptation process by using the analytic hierarchy process (AHP), based on case studies conducted in the UK and China. In addition to the three categories of adaptation, which are viewed as criteria, six possible alternatives are considered: physiological indices/health status, the indoor environment, the outdoor environment, personal physical factors, environmental control and thermal expectation. With the AHP technique, all the above-mentioned criteria, factors and corresponding elements are arranged in a hierarchy tree and quantified by using a series of pair-wise judgements. A sensitivity analysis is carried out to improve the quality of these results. The proposed quantitative weighting method provides researchers with opportunities to better understand the adaptive mechanisms and reveal the significance of each category for the achievement of adaptive thermal comfort.
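The AHP step described above can be sketched as follows: priority weights for the three adaptation categories are derived from a pair-wise comparison matrix via its principal eigenvector, with a consistency check on the judgements. The judgement values below are invented for illustration, not taken from the case studies.

```python
# AHP: priority weights and consistency ratio from a pair-wise matrix.
import numpy as np

# a[i, j] = how much more important criterion i is than criterion j
# (order: physiological, behavioural, psychological); a[j, i] = 1 / a[i, j].
A = np.array([
    [1.0, 2.0, 4.0],
    [1/2, 1.0, 2.0],
    [1/4, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue (lambda_max)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()               # normalised priorities

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # Saaty's random index for n = 3
print(weights, cr)
```

This example matrix is perfectly consistent (each entry equals the ratio of the corresponding weights), so the consistency ratio is zero; in practice CR below about 0.1 is taken as acceptable.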


Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual’s dietary intake.
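The Bland–Altman agreement analysis used above can be sketched as follows: the mean difference (bias) between the two methods and its 95% limits of agreement. The paired energy intakes (MJ/day) below are invented for illustration, not data from the study.

```python
# Bland-Altman analysis: bias and 95% limits of agreement between methods.
import numpy as np

nana  = np.array([7.8, 8.9, 6.5, 9.2, 7.1, 8.0, 6.9, 9.5])  # method A
diary = np.array([8.1, 9.0, 6.9, 9.1, 7.6, 8.4, 7.0, 9.9])  # method B

diff = nana - diary
bias = diff.mean()                          # systematic difference
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias {bias:.2f} MJ/day, LoA {loa[0]:.2f} to {loa[1]:.2f}")
```

A small negative bias in this invented example mirrors the study's finding that the NANA method tends to record marginally lower energy intakes than the food diary.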


Reconstructions of salinity are used to diagnose changes in the hydrological cycle and ocean circulation. A widely used method of determining past salinity uses oxygen isotope (δ18Ow) residuals after the extraction of the global ice volume and temperature components. This method relies on a constant relationship between δ18Ow and salinity throughout time. Here we use the isotope-enabled fully coupled General Circulation Model (GCM) HadCM3 to test the application of spatially and time-independent relationships in the reconstruction of past ocean salinity. Simulations of the Late Holocene (LH), Last Glacial Maximum (LGM), and Last Interglacial (LIG) climates are performed and benchmarked against existing compilations of stable oxygen isotopes in carbonates (δ18Oc), which primarily reflect δ18Ow and temperature. We find that HadCM3 produces an accurate representation of the surface ocean δ18Oc distribution for the LH and LGM. Our simulations show considerable variability in spatial and temporal δ18Ow-salinity relationships. Spatial gradients are generally shallower but within ∼50% of the actual simulated LH-to-LGM and LH-to-LIG temporal gradients, and temporal gradients calculated from multi-decadal variability are generally shallower than both spatial and actual simulated gradients. The largest sources of uncertainty in salinity reconstructions are found to be caused by changes in regional freshwater budgets, ocean circulation, and sea ice regimes. These can cause errors in salinity estimates exceeding 4 psu. Our results suggest that paleosalinity reconstructions in the South Atlantic, Indian and Tropical Pacific Oceans should be most robust, since these regions exhibit relatively constant δ18Ow-salinity relationships across spatial and temporal scales. The largest uncertainties will affect North Atlantic and high-latitude paleosalinity reconstructions.
Finally, the results show that it is difficult to generate reliable salinity estimates for regions of dynamic oceanography, such as the North Atlantic, without additional constraints.
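The reconstruction step being tested can be sketched as follows: once the ice-volume and temperature components have been removed from δ18Oc, the residual δ18Ow is mapped to salinity through an assumed linear regional relationship. The slope, intercept and input value below are invented for illustration; the paper's point is precisely that such constants vary regionally and through time.

```python
# Salinity from a d18Ow residual under an assumed linear relationship
# d18Ow = a * S + b, inverted as S = (d18Ow - b) / a.
def salinity_from_d18ow(d18ow_residual, slope=0.55, intercept=-18.5):
    """Salinity (psu) from a d18Ow residual (permil).

    slope (permil/psu) and intercept (permil) are the regionally calibrated
    constants whose assumed time-independence the study shows can break down.
    """
    return (d18ow_residual - intercept) / slope

s = salinity_from_d18ow(0.75)
print(round(s, 2))  # 35.0
```

An error of a few tenths of a permil in the residual, or a shift in the regional slope between climate states, propagates directly into the multi-psu salinity errors the simulations identify.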