944 results for critical appraisal
Abstract:
Background: Mitochondrial DNA (mtDNA) is being analyzed by an increasing number of laboratories in order to investigate its potential role as an active marker of tumorigenesis in various types of cancer. Here we question the conclusions drawn in most of these investigations, especially those published in high-ranking cancer research journals, given the evidence that a significant number of these medical mtDNA studies are based on obviously flawed sequencing results. Methods and Findings: In our analyses, we take a phylogenetic approach and employ thorough database searches, which together have proven successful for detecting erroneous sequences in the fields of human population genetics and forensics. Apart from conceptual problems concerning the interpretation of mtDNA variation in tumorigenesis, in most cases blocks of seemingly somatic mutations clearly point to contamination or sample mix-up and, therefore, have nothing to do with tumorigenesis. Conclusion: The role of mitochondria in tumorigenesis remains unclarified. Our findings of laboratory errors in many contributions likely represent only the tip of the iceberg, since most published studies do not provide the raw sequence data for inspection, thus hindering a posteriori evaluation of the results. There is no precedent for such a concatenation of errors and misconceptions affecting a whole subfield of medical research.
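The detection logic summarized above can be illustrated with a minimal sketch: if a block of reported "somatic" mutations coincides with a haplogroup-defining motif, contamination or sample mix-up is the more parsimonious explanation than tumour-specific change. This is not the authors' pipeline; the motif table and variant names below are hypothetical placeholders, not real haplogroup definitions.

```python
# Minimal sketch (not the authors' pipeline): flag a block of reported
# "somatic" mtDNA mutations that overlaps a haplogroup-defining motif,
# which would suggest contamination or sample mix-up rather than a
# tumour-specific change. The motif table below is hypothetical.

HAPLOGROUP_MOTIFS = {
    "hypothetical_hg_A": {"T1234C", "A5678G", "C9012T"},
    "hypothetical_hg_B": {"G3456A", "T7890C"},
}

def flag_suspect_blocks(somatic_calls: set[str], min_overlap: int = 2) -> list[str]:
    """Return haplogroups whose defining variants overlap the 'somatic' calls."""
    suspects = []
    for hg, motif in HAPLOGROUP_MOTIFS.items():
        overlap = somatic_calls & motif
        if len(overlap) >= min_overlap:
            suspects.append(f"{hg}: shares {sorted(overlap)}")
    return suspects

if __name__ == "__main__":
    reported = {"T1234C", "A5678G", "A9999G"}   # hypothetical tumour-vs-normal calls
    for hit in flag_suspect_blocks(reported):
        print("possible contamination/mix-up ->", hit)
```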
Wetlands and riparian zones as buffers and critical habitats for biotic communities in Lake Victoria
Abstract:
Despite their ecological and socio-economic importance, Lake Victoria's adjoining "swamps" and the lake interface are among the least investigated parts of the lake. The term "swamps", commonly equated with "wastelands", and the difficult working environment these areas present in comparison to open water are major factors in the low level of attention accorded to shoreline wetlands. Moreover, definitions of wetlands highlighted for example in the Ramsar Convention as "areas of marsh, fen, peatland or water, whether natural or artificial, permanent or temporary, with water that is static or flowing, fresh or brackish, or salt, including areas of marine water, the depth of which does not exceed six metres" (Ramsar, 1971) were designed to protect birds (waterfowl) of international importance. The Ramsar definition, which also includes oceans, has until recently been of limited use for Lake Victoria, because it does not fully recognise wetlands in relation to other public concerns such as water quality, biodiversity and the fisheries that are of higher socio-economic priority than waterfowl. Prior to 1992, fishery research on Lake Victoria included studies of inshore shallow habitats of the lake without specific reference to distance or the type of vegetation at the shore. Results of these studies also conveniently relied heavily on trawl and gill net data from the 5-10 m depth zones as the defining boundary of shallow inshore habitats. In Lake Victoria, such a depth range can be at least one kilometre from the lake interface, and by the 10 m depth contour habitats are in the sub-littoral range. Findings from these studies could thus not be used to make direct inferences about the then assumed importance of Lake Victoria wetlands in general.
A critical review of Glucose biosensors based on Carbon nanomaterials: Carbon nanotubes and graphene
Abstract:
There has been an explosion of research into the physical and chemical properties of carbon-based nanomaterials since the discovery of carbon nanotubes (CNTs) by Iijima in 1991. Carbon nanomaterials offer unique advantages in several areas, such as a high surface-to-volume ratio, high electrical conductivity, chemical stability and strong mechanical strength, and are thus frequently being incorporated into sensing elements. Carbon nanomaterial-based sensors generally have higher sensitivities and lower detection limits than conventional ones. In this review, a brief history of glucose biosensors is first presented. Carbon nanotube-based and graphene-based biosensors are introduced in Sections 3 and 4, respectively, which cover synthesis methods, up-to-date sensing approaches and non-enzymatic hybrid sensors. Finally, we briefly outline the current status and future directions for carbon nanomaterials in the sensing area. © 2012 by the authors; licensee MDPI, Basel, Switzerland.
Abstract:
This paper presents the modeling of second-generation (2G) high-temperature superconducting (HTS) pancake coils using the finite element method. The axisymmetric model can be used to calculate the current and magnetic field distribution inside the coil. The anisotropic characteristics of 2G tapes are included in the model by direct interpolation. The model is validated by comparison with experimental results. We use the model to study the critical currents of 2G coils and find that 100 μV/m is too high a criterion for determining the long-term operating current of the coils, because the innermost turns of a coil will, due to the effect of the local magnetic field, reach their critical current much earlier than the outer turns. Our modeling shows that an average voltage criterion of 20 μV/m over the coil corresponds to the point at which the electric field in the innermost turns exceeds 100 μV/m. We therefore suggest 20 μV/m as the critical current criterion for the HTS coil. The influence of the background field on the coil critical current is also studied in the paper. © 2012 American Institute of Physics.
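To make the coil-level criterion concrete, here is a minimal sketch, not the paper's finite element model: a per-turn power-law E-I relation with hypothetical turn critical currents (lowest at the innermost turn, mimicking the local-field penalty), a bisection for the coil current at which the turn-averaged electric field reaches 20 μV/m, and a check of the innermost turn against the conventional 100 μV/m criterion. All numbers are assumptions for illustration.

```python
# Minimal sketch, not the paper's FEM model: per-turn E-I power law
# E(I) = Ec * (I / Ic_turn)**n, with hypothetical turn critical currents
# reduced toward the innermost turns by the local magnetic field.
import numpy as np

EC = 100e-6          # conventional criterion, V/m (1 uV/cm)
N_INDEX = 25         # assumed power-law index
# Hypothetical per-turn critical currents (A), lowest at the innermost turn
ic_turns = np.linspace(60.0, 110.0, 40)

def e_field(i_coil: float) -> np.ndarray:
    """Electric field (V/m) in every turn at coil current i_coil."""
    return EC * (i_coil / ic_turns) ** N_INDEX

def current_for_average(e_target: float) -> float:
    """Bisect for the coil current giving a turn-averaged field of e_target."""
    lo, hi = 0.0, float(ic_turns.max())
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if e_field(mid).mean() < e_target:
            lo = mid
        else:
            hi = mid
    return hi

i_op = current_for_average(20e-6)
inner = e_field(i_op)[0] * 1e6
print(f"coil current at 20 uV/m average: {i_op:.1f} A")
print(f"innermost-turn field there: {inner:.0f} uV/m "
      f"({'exceeds' if inner > 100 else 'within'} the 100 uV/m criterion)")
```

With a steep n-value the innermost turn is already well past 100 μV/m by the time the coil average reaches 20 μV/m, which is the behaviour the stricter coil-level criterion described in the abstract is meant to capture.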
Abstract:
Lifetimes of excited states in 128Ce were measured using the recoil distance Doppler-shift (RDDS) and the Doppler-shift attenuation (DSAM) methods. The experiments were performed at the Wright Nuclear Structure Laboratory of Yale University. Excited states of 128Ce were populated in the 100Mo(32Si,4n) reaction at 120 MeV and the nuclear γ decay was measured with an array of eight Clover detectors positioned at forward and backward angles. The deduced yrast transition strengths together with the energies of the levels within the ground-state (gs) band of 128Ce are in agreement with the predicted values for the X(5) critical point symmetry. Thus, we suggest 128Ce as a benchmark X(5) nucleus in the mass A ≈ 130 region. © World Scientific Publishing Company.
Abstract:
Applications of high-temperature superconductors (HTS) have become increasingly popular since the new superconducting materials were discovered. This paper presents a new high-precision digital lock-in measurement technique for measuring the critical current and AC loss of second-generation (2G) HTS tape. Using a lock-in amplifier and a nanovoltmeter, we can resolve signals at the nanovolt level, while a specially designed compensation coil allows us to cancel out the inductive component by adjusting the coil inductance. Furthermore, a finer correction of the inductive component can be achieved by adjusting the reference phase of the lock-in amplifier. The critical current and AC loss measurement algorithms and the hardware layout are described and analyzed, and results for both numerical and experimental data at a variety of frequencies are presented. © 2008 SICE.
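The phase-sensitive detection at the heart of such a measurement can be sketched in a few lines. This is an illustration only, not the authors' hardware or algorithm: the tape voltage contains a small in-phase, resistive (loss) component buried under a much larger quadrature, inductive component; multiplying by in-phase and quadrature references and averaging separates the two, and rotating the reference phase then lets any residual inductive part be nulled. The frequency, sample rate and amplitudes below are hypothetical.

```python
# Minimal sketch of phase-sensitive (lock-in) detection: recover a small
# resistive (loss) voltage buried under a large inductive voltage.
import numpy as np

f = 50.0                       # assumed transport-current frequency, Hz
fs = 100_000.0                 # assumed sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)

V_RES, V_IND = 2e-7, 5e-5      # hypothetical resistive and inductive amplitudes, V
signal = (V_RES * np.sin(2 * np.pi * f * t)             # loss part, in phase with current
          + V_IND * np.cos(2 * np.pi * f * t)            # inductive part, 90 deg ahead
          + 1e-6 * np.random.default_rng(0).normal(size=t.size))  # measurement noise

def lock_in(sig, phase_deg=0.0):
    """Return (in-phase, quadrature) amplitudes for a reference at phase_deg."""
    ph = np.deg2rad(phase_deg)
    ref_i = np.sin(2 * np.pi * f * t + ph)
    ref_q = np.cos(2 * np.pi * f * t + ph)
    return 2 * np.mean(sig * ref_i), 2 * np.mean(sig * ref_q)

x, y = lock_in(signal)          # small phase offsets here mimic the fine correction step
print(f"in-phase (loss) ~ {x:.2e} V, quadrature (inductive) ~ {y:.2e} V")
```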
Abstract:
The physical meaning and methods of determining loudness were reviewed. Loudness is a psychoacoustic metric which closely corresponds to the perceived intensity of a sound stimulus. It can be determined by graphical procedures, numerical methods, or commercial software. These methods typically require consideration of the 1/3 octave band spectrum of the sound of interest. The sounds considered in this paper are a 1 kHz tone and pink noise. The loudness of these sounds was calculated in eight ways using different combinations of input data and calculation methods. All the methods considered are based on Zwicker loudness. It was determined that, of the combinations considered, only the commercial software dBSonic and the loudness calculation procedure detailed in DIN 45631 using 1/3 octave band levels filtered according to ANSI S1.11-1986 gave the correct values of loudness for a 1 kHz tone. Comparing the results between the sources also demonstrated the difference between sound pressure level and loudness. It was apparent that the calculation and filtering methods must be considered together, as a given calculation will produce different results for different 1/3 octave band input. In the literature reviewed, no reference provided a guide to the selection of the type of filtering that should be used in conjunction with the loudness computation method.
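For the 1 kHz tone, the reference values against which such implementations can be judged follow from the standard phon-sone relation: at 1 kHz the loudness level in phon equals the SPL in dB, and above 40 phon loudness doubles for every 10 dB increase. A minimal sketch of that sanity check (not the paper's calculation procedure):

```python
# Reference loudness of a 1 kHz pure tone: above 40 phon,
# N = 2**((L - 40) / 10) sone, with L (phon) equal to the SPL in dB at 1 kHz.
def sones_1khz(spl_db: float) -> float:
    """Loudness in sone of a 1 kHz tone at spl_db (valid above ~40 dB SPL)."""
    return 2.0 ** ((spl_db - 40.0) / 10.0)

for spl in (40, 60, 80, 100):
    print(f"1 kHz tone at {spl} dB SPL -> {sones_1khz(spl):.1f} sone")
```

A correct 1/3 octave band implementation (e.g. DIN 45631 with appropriate filtering) should reproduce these values for the tone.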
Abstract:
The physical meaning and calculation procedures for determining loudness were critically analyzed. Four noise sources were used in comparing the software packages dBFA and dBSonic, which were used in the investigation, to a public-domain code. The purpose of the comparison was to evaluate the validity of the results obtained and to gain an idea of the shortcomings of the relevant standards. The loudness results computed by the various methods used in the study were compared. Two basic sources of input data, a sound level meter (SLM) and a 01dB data acquisition system (DAQ), were available for the comparison. The SLM directly gave 1/3 octave band levels, while the data from the DAQ had to be filtered to give these levels. Five processing methods, including a Visual Basic (VB) program and a VB program adapted from dBFA, were used for the study. It was found that the calculation of loudness from 1/3 octave band levels cannot be separated from the filtering process.
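The filtering step that this conclusion refers to can be sketched as follows. This is an illustration only, not the dBFA/dBSonic or ANSI S1.11 filter implementation: the raw DAQ time signal is band-passed into 1/3 octave bands, and each band level then serves as input to a DIN 45631 / Zwicker loudness computation. The sample rate, filter order and stand-in signal are assumptions.

```python
# Minimal sketch of 1/3-octave band filtering of a time signal
# (illustrative Butterworth band-pass filters, not an ANSI S1.11 filter bank).
import numpy as np
from scipy.signal import butter, sosfilt

fs = 48_000                                   # assumed sample rate, Hz
rng = np.random.default_rng(0)
x = rng.normal(size=fs)                       # 1 s of white noise as a stand-in signal

# Base-2 1/3-octave centre frequencies around 1 kHz (subset for brevity)
centres = 1000.0 * 2.0 ** (np.arange(-6, 7) / 3.0)

def third_octave_levels(signal, fs, centres, p_ref=20e-6):
    levels = []
    for fc in centres:
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)     # band edge frequencies
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos, signal)
        levels.append(20 * np.log10(np.sqrt(np.mean(band ** 2)) / p_ref))
    return levels

for fc, level in zip(centres, third_octave_levels(x, fs, centres)):
    print(f"{fc:7.1f} Hz band: {level:6.1f} dB")
```

Because the band levels depend on the filter shapes, the same loudness calculation fed by different filter banks will give different results, which is the coupling between calculation and filtering the study points out.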
Abstract:
Building on recent developments in mixed methods, we discuss the methodological implications of critical realism (CR) and explore how these can guide dynamic mixed-methods research design in information systems. Specifically, we examine the core ontological assumptions of CR in order to gain some perspective on key epistemological issues such as causation and validity, and we illustrate how these shape our logic of inference in the research process through what is known as retroduction. We demonstrate the value of a CR-led mixed-methods research approach by drawing on a study that examines the impact of ICT adoption in the financial services sector. In doing so, we provide insight into the interplay between qualitative and quantitative methods and the particular value of applying mixed methods guided by CR methodological principles. Our positioning of demi-regularities within the process of retroduction contributes a distinctive development in this regard. We argue that such a research design enables us to better address issues of validity and the development of more robust meta-inferences.