947 results for methodology of indexation


Relevance:

100.00%

Publisher:

Abstract:

Based on current data and experience, the joint working group of the European Society of Minimally Invasive Neurological Therapy (ESMINT) and the European Society of Neuroradiology (ESNR) makes suggestions on the design and conduct of trials aimed at investigating the therapeutic effects of mechanical thrombectomy (MT). We anticipate that this roadmap will facilitate the setting up and conduct of successful trials in close collaboration with our neighbouring disciplines.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To assess the methodology of meta-analyses published in leading general and specialist medical journals over a 10-year period. STUDY DESIGN AND SETTING: Volumes 1993-2002 of four general medicine journals and four specialist journals were searched by hand for meta-analyses including at least five controlled trials. Characteristics were assessed using a standardized questionnaire. RESULTS: A total of 272 meta-analyses, which included a median of 11 trials (range 5-195), were assessed. Most (81%) were published in general medicine journals. The median (range) number of databases searched increased from 1 (1-9) in 1993/1994 to 3.5 (1-21) in 2001/2002, P<0.0001. The proportion of meta-analyses including searches by hand (10% in 1993/1994, 25% in 2001/2002, P=0.005), searches of the grey literature (29%, 51%, P=0.010 by chi-square test), and of trial registers (10%, 32%, P=0.025) also increased. Assessments of the quality of trials also became more common (45%, 70%, P=0.008), including whether allocation of patients to treatment groups had been concealed (24%, 60%, P=0.001). The methodological and reporting quality was consistently higher in general medicine journals than in specialist journals. CONCLUSION: Many meta-analyses published in leading journals have important methodological limitations. The situation has improved in recent years, but considerable room for further improvement remains.
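The trend comparisons reported in this abstract (e.g., hand searches rising from 10% to 25%, P=0.005) rest on simple 2x2 chi-square tests of proportions. A minimal sketch of such a test follows; the counts are hypothetical, since the abstract reports only proportions, not per-period denominators.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value (df=1) for a 2x2 table
    [[a, b], [c, d]], without continuity correction.
    For df=1, P(X > x) = erfc(sqrt(x/2))."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical counts: 4/40 meta-analyses with hand searches in 1993/1994
# vs 10/40 in 2001/2002 (the real denominators are not given in the abstract).
stat, p = chi2_2x2(4, 36, 10, 30)
```

The df=1 p-value uses the identity that a chi-square variable with one degree of freedom is the square of a standard normal, so its survival function is `erfc(sqrt(x/2))`.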

Relevance:

100.00%

Publisher:

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance:

100.00%

Publisher:

Abstract:

Technological and environmental problems related to ore processing are a serious limitation for the sustainable development of mineral resources, particularly for countries and companies rich in ores but with little access to sophisticated technology, e.g. in Latin America. Digital image analysis (DIA) can provide a simple, inexpensive and broadly applicable methodology to assess these problems, but the methodology has to be carefully defined to produce reproducible and relevant information.

Relevance:

100.00%

Publisher:

Abstract:

Vernacular architecture has demonstrated its environmental adaptation through empirical development and improvement by generations of user-builders. Nowadays, the sustainability of vernacular architecture is the aim of several research projects, in which the same method should be applied so that results are comparable. Hence, we propose a research method combining several steps. Vernacular architecture is analyzed through geographical, lithological, economic, cultural and social influences, as well as materials and constructive systems. All this information is then combined with the natural landscape (topography and vegetation) and climatic data (temperature, winds, rain and sun exposure). In addition, the use of bioclimatic charts, such as Olgyay's or Givoni's, reveals the necessities and strategies in urban and building design. These needs are met in vernacular architecture by different energy-conservation mechanisms, some of which are illustrated with examples in this paper.
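The bioclimatic-chart step described above can be caricatured in code: given a temperature and relative humidity, decide whether the point falls in the comfort zone or suggests a design strategy. The boundaries below are illustrative approximations only, not the actual Olgyay or Givoni chart limits.

```python
def bioclimatic_strategy(temp_c, rel_humidity):
    """Rough comfort-zone check inspired by bioclimatic charts.
    The numeric boundaries are illustrative approximations, not the
    real Olgyay or Givoni limits."""
    if 21.0 <= temp_c <= 27.0 and 20.0 <= rel_humidity <= 75.0:
        return "comfort"
    if temp_c < 21.0:
        return "passive solar heating / internal gains"
    if rel_humidity > 75.0:
        return "ventilation / dehumidification"
    return "shading / evaporative cooling / thermal mass"
```

A real study would read the strategy regions directly off the published charts rather than from a hard-coded band like this.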

Relevance:

100.00%

Publisher:

Abstract:

The main problem in studying vertical drainage from the moisture distribution of a vertisol profile is finding suitable methods for these procedures. Our aim was to design a digital image processing and analysis methodology to characterize the moisture content distribution of a vertisol profile. In this research, twelve soil pits were excavated on a bare Mazic Pellic Vertisol: six of them on May 13, 2011, and the rest on May 19, 2011, after a moderate rainfall event. Digital RGB images were taken from each vertisol pit using a Kodak camera at a size of 1600x945 pixels. Each soil image was processed to homogenize brightness, and then a spatial filter with several window sizes was applied to select the optimum one. Each RGB image was split into its color channels, and the best maximum and minimum thresholds were selected for each channel to obtain a digital binary pattern. This pattern was analyzed by estimating two fractal scaling exponents: the box-counting dimension (D_BC) and the interface fractal dimension (D). In addition, three pre-fractal scaling coefficients were determined at maximum resolution: the total number of boxes intercepting the foreground pattern (A), the fractal lacunarity (Λ1) and the Shannon entropy (S1). For all the images processed, the 9x9 spatial filter was the optimum based on entropy, cluster and histogram criteria. Thresholds for each color were selected based on bimodal histograms.
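The box-counting dimension D_BC mentioned in this abstract can be estimated from a binary pattern by counting occupied boxes at several scales and fitting the slope of log N(s) against log(1/s). A minimal sketch follows (the brightness homogenization, filtering, thresholding, lacunarity and entropy steps are omitted, and this is a generic estimator, not the paper's exact implementation):

```python
import math

def box_counting_dimension(pattern, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a binary pattern
    (a list of rows of 0/1) by counting boxes of side s that contain
    at least one foreground pixel, then fitting log N(s) vs log(1/s)
    by least squares. Assumes the pattern is non-empty."""
    rows, cols = len(pattern), len(pattern[0])
    xs, ys = [], []
    for s in sizes:
        count = 0
        for i in range(0, rows, s):
            for j in range(0, cols, s):
                if any(pattern[r][c]
                       for r in range(i, min(i + s, rows))
                       for c in range(j, min(j + s, cols))):
                    count += 1
        xs.append(math.log(1.0 / s))
        ys.append(math.log(count))
    # least-squares slope of log N(s) against log(1/s)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A fully filled pattern yields a slope of exactly 2 (a plane-filling set), and a single foreground pixel yields 0, which is a quick sanity check for the estimator.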

Relevance:

100.00%

Publisher:

Abstract:

Fresnel lenses used as primary optics in concentrating photovoltaic modules may show warping produced during lens manufacturing or module assembly (e.g., stress during molding or weight load) or caused by stress during operation (e.g., mismatch of thermal expansion between different materials). To quantify this problem, a simple method called the "checkerboard method" is presented. The proposed method identifies shape errors on the front surface of primary lenses by analyzing the Fresnel reflections. This paper also quantifies the effects these curvatures have on the optical performance of the lenses and on the electrical performance of concentrating modules incorporating them. The method can be used for quality control of Fresnel lenses in high-volume production scenarios.
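As a toy illustration of the idea behind the checkerboard method (not the paper's actual metric), one could compare the ideal positions of checkerboard corners with their positions as observed in the reflection off the lens front surface, and summarize the distortion as an RMS displacement; a warped surface moves the reflected corners away from the ideal grid.

```python
import math

def warp_rms(ideal, observed):
    """RMS displacement (in pixels) between ideal checkerboard corner
    positions and their observed reflections, as lists of (x, y) pairs.
    Larger values suggest more front-surface warping. This is a toy
    proxy for the checkerboard method, not the paper's metric."""
    sq = [(ix - ox) ** 2 + (iy - oy) ** 2
          for (ix, iy), (ox, oy) in zip(ideal, observed)]
    return math.sqrt(sum(sq) / len(sq))
```

For example, a uniform (3, 4)-pixel shift of every corner gives an RMS of exactly 5 pixels; a perfect reflection gives 0.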